I wonder if my system is good or bad. My server needs 0.1kWh.
kWh is a unit of energy, not power
Wasn’t that stated as the usage during November? 60kWh for November. Seems logical to me.
Edit: forget it, he’s saying his server needs 0.1kWh which is bonkers ofc
Only one person here has posted their usage for November. The OP has not talked about November or any timeframe.
I was really confused by that, and by the choice of unit not simply being W (even 0.1 kW would be pretty weird).
Wh shouldn’t even exist tbh, we should use Joules, less confusing
At least in the US, the electric company charges in kWh, computer parts are advertised in terms of watts, and batteries tend to be in amp hours, which is easy to convert to watt hours.
Joules just overcomplicates things.
Wow, the US education system must be improved. 1J is 3600Wh. That’s literraly the same thing, but the name is less confusing because people tend to confuse W and Wh
Wow, the US education system must be improved.
I pay my electric bill by the kWh too, and I don’t live in the US. When it comes to household and EV energy consumption, kWh is the unit of choice.
1J is 3600Wh.
No, if you’re going to lecture people on this, at least be right about facts. 1W is 1J/s. So multiply by an hour and you get 1Wh = 3600J
That’s literraly the same thing,
It’s not literally the same thing. The two units are linearly proportional to each other, but they’re not the same. If they were the same, then this discussion would be rather silly.
but the name is less confusing because people tend to confuse W and Wh
Finally, something I can agree with. But that’s only because physics is so undervalued in most educational systems.
Do you regularly divide/multiply by 3600? That’s not something I typically do in my head, and there’s no reason to do it when everything is denominated in watts. What exactly is the benefit?
I did a physics degree and am comfortable with Joules, but in the context of electricity bills, kWh makes more sense.
All appliances are advertised in terms of their Watt power draw, so estimating their daily impact on my bill is as simple as multiplying their kW draw by the number of hours in a day I expect to run the thing (multiplied by the cost per kWh by the utility company of course).
Watt hours makes sense to me. A watt hour is just a watt draw that runs for an hour, it’s right in the name.
Maybe you’ve just whooooshed me or something, I’ve never looked into Joules or why they’re better/worse.
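The estimate described above (watt draw × hours × price per kWh) can be sketched in a few lines of Python; the rate, wattage, and hours below are made-up example numbers, not anyone’s actual bill:

```python
# Rough daily-cost estimate for an appliance, per the comment above.
def daily_cost(watts, hours_per_day, price_per_kwh):
    kwh = watts / 1000 * hours_per_day  # W -> kW, then multiply by run time
    return kwh * price_per_kwh

# e.g. a 60 W NAS running 24/7 at a hypothetical $0.13/kWh:
print(round(daily_cost(60, 24, 0.13), 4))  # ~0.1872 dollars/day
```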
Joules (J) are the official unit of energy. 1W=1J/s. That means 1Wh=3600J or that 1J is kinda like “1 Watt second”. You’re right that Wh is easier since everything is rated in Watts and it would be insane to measure energy consumption by seconds. Imagine getting your electric bill and it says you’ve used 3,157,200,000J.
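To make the conversion in that explainer concrete, here’s a tiny sketch turning the scary joule figure back into kWh (the function name is just for illustration):

```python
# 1 W = 1 J/s, so 1 Wh = 3600 J and 1 kWh = 3,600,000 J.
JOULES_PER_WH = 3600

def joules_to_kwh(joules):
    return joules / JOULES_PER_WH / 1000

# The electric-bill example from the comment above:
print(joules_to_kwh(3_157_200_000))  # 877.0 kWh
```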
Thanks for the explainer, that makes a lot of sense.
3,157,200,000J
Or just 3.1572GJ.
Which apparently is how this Canadian natural gas company bills its customers: https://www.fortisbc.com/about-us/facilities-operations-and-energy-information/how-gas-is-measured
Mate, kWh is a measure of electricity volume, like gallons is to liquid. Also, “100 watt hours” would be a much more sensible way to say the same thing. What you’ve said in the title is like saying your server uses 1 gallon of water. It’s meaningless without a unit of time. Watts is a measure of current flow (pun intended), similar to a measurement like gallons per minute.
For example, if your server uses 100 watts for an hour it has used 100 watt hours of electricity. If your server uses 100 watts for 100 hours it has used 10,000 watt hours of electricity, aka 10kWh.
My NAS uses about 60 watts at idle, and near 100w when it’s working on something. I use an old laptop for a plex server, it probably uses like 50 watts at idle and like 150 or 200 when streaming a 4k movie, I haven’t checked tbh. I did just acquire a BEEFY network switch that’s going to use 120 watts 24/7 though, so that’ll hurt the pocket book for sure. Soon all of my servers should be in the same place, with that network switch, so I’ll know exactly how much power it’s using.
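The energy = power × time rule from the comment above, as a quick sketch (the NAS numbers are the ones mentioned, used here just as examples):

```python
# Energy (Wh) is simply power (W) multiplied by time (h).
def watt_hours(watts, hours):
    return watts * hours

print(watt_hours(100, 1))           # 100 Wh
print(watt_hours(100, 100) / 1000)  # 10.0 kWh
print(watt_hours(60, 24) / 1000)    # a 60 W NAS idling all day: 1.44 kWh
```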
Do you mean 0.1kWh per hour, so 0.1kW or 100W?
My N100 server needs about 11W.
To my understanding 0.1kWh means 0.1 kW per hour.
It’s the other way around. 0.1 kWh means 0.1 kW times 1 h. So if your device draws 0.1 kW (100 W) of power for an hour, it consumes 0.1 kWh of energy. If your factory draws 360,000 W for a second, it consumes the same 0.1 kWh of energy.
Thank you for explaining it.
My computer uses 1kwh per hour.
It does not yet make sense to me. It just feels wrong. I understand that you may normalize 4W in 15 minutes to 16Wh because it would use 16W per hour if it would run that long.
Why can’t you simply assume that I mean 1kWh per hour when I say 1kWh? And not 1kWh per 15 minutes.
kWh is a unit of energy consumed. It doesn’t say anything about time and you can’t assume any time period. That wouldn’t make any sense. If you want to say how much power a device uses, just state how many watts (W) it draws.
Thanks!
A watt is 1 Joule per second (1 J/s), i.e. every second, your device draws 1 Joule of energy. This energy per unit of time is called “power” and is a rate of energy transfer.
A watt-hour is (1 J/s) * (1 hr)
This can be rewritten as (3600 J/hr) * (1 hr). The “per hour” and “hour” cancel themselves out which makes 1 watt-hour equal to 3600 Joules.
1 kWh is 3,600 kJ or 3.6 MJ
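The unit cancellation above can be double-checked numerically; a minimal sketch:

```python
# 1 Wh = (1 J/s) * (3600 s) = 3600 J, and 1 kWh = 1000 Wh = 3.6 MJ.
SECONDS_PER_HOUR = 3600

wh_in_joules = 1 * SECONDS_PER_HOUR      # one watt sustained for one hour
kwh_in_joules = 1000 * wh_in_joules      # one kilowatt for one hour

print(wh_in_joules, kwh_in_joules)  # 3600 3600000
```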
0.1kWh per hour can be written as 0.1kWh/h, which is the same as 0.1kW.
Thanks. Hence, in the future I can say that it uses 0.1kW?
If this was over an hour, yes. Though you’d typically state it as 100W ;)
Yes. Or 100W.
The N100 is such a little powerhouse and I’m sad they haven’t managed to produce anything better. All of the “upgrades” are either not enough of an upgrade for the money, or just more power hungry.
You might have your units confused.
0.1kWh over how much time? Per day? Per hour? Per week?
Watt-hours refer to the total energy used to do something, from a starting point to an ending point. It makes no sense to say that a device needs a certain amount of Wh, unless you’re talking about something like charging a battery to full.
Power being used by a device (like a computer) is just watts.
Think of the difference between speed and distance. Watts is how fast power is being used, watt-hours is how much has been used, or will be used.
If you have a 500 watt PC, for example, it uses 500Wh, per hour. Or 12kWh in a day.
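The arithmetic in that example, spelled out as a quick sanity check (assuming a constant 500 W draw, which as noted below a real PC wouldn’t sustain):

```python
# A constant 500 W draw, accumulated over time.
watts = 500
wh_per_hour = watts * 1           # 500 Wh each hour
kwh_per_day = watts * 24 / 1000   # 12.0 kWh per day

print(wh_per_hour, kwh_per_day)  # 500 12.0
```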
kWh is the stupidest unit ever. 1 kWh = 1000 J/s × (60 × 60) s = 3.6×10⁶ J, so 0.1 kWh = 360 kJ.
If you have a 500 watt PC, for example, it uses 500Wh, per hour. Or 12kWh in a day.
A maximum of 500 watts. Fortunately your PC doesn’t actually max out your PSU or your system would crash.
I forgive 'em cuz watt hours are a disgusting unit in general
| idea | what | unit |
|---|---|---|
| speed | change in position over time | meters per second, m/s |
| acceleration | change in speed over time | meters per second, per second, m/s/s = m/s² |
| force | acceleration applied to each unit of mass | kg * m/s² |
| work | force applied along a distance, which transfers energy | kg * m/s² * m = kg * m²/s² |
| power | work over time | kg * m²/s³ |
| energy expenditure | power level during units of time | (kg * m²/s³) * s = kg * m²/s² |

Power over time, × time, is just energy! kWh are just joules (J) with extra steps! Screw kWh, I will die on this hill!!! Raaah
Power over time could be interpreted as power/time. Power x time isn’t power, it’s energy (=== work). But otherwise I’m with you. Joules or gtfo.
Whoops, typo! Fixed c:
Could be worse, could be BTU. And some people still use tons (of heating/cooling).
For two servers (one with a lot of spinning rust), two switches, and a few other miscellaneous network appliances, my server rack averages around 600-650W. During periods of high demand (nightly backups, for instance), that can peak at around 750W.
My server with 8 hard drives uses about 60 watts and goes up to around 80 under heavy load. The firewall, switch, access points and modem use another 50-60 watts.
I really need to upgrade my server and firewall to something about 10 years newer; it would reduce my power consumption quite a bit and give me a lot more runtime on UPS.
I’m idling at 120W with eight drives, but I’m currently looking into how to lower it.
Idle: 30 Watts
Starting all docker containers after reboot: 140 Watts
It needs around 28 kWh per month.
AiBot post. Fuck this shit.
Can you please explain?
Mine runs at about 120 watts per hour.
Please. Watt is an SI unit of power, equivalent to Joule per second. Watt-hour is a non-SI unit of energy (1 Wh = 3600 J). Learn the difference and use it correctly.
My 10 year old ITX NAS build with 4 HDDs used 40W at idle. Just upgraded to an Aoostart WTR Pro with the same 4 HDDs, uses 28W at idle. My power bill currently averages around US$0.13/kWh.
My whole setup, including 2 Pis and one fully specced out AM4 system with 100TB of drives, an Intel Arc, and 4x 32GB ECC RAM, uses between 280W and 420W. I live in Germany and pay 25ct per kWh, and my whole apartment uses 600W at any given time and approximately 15kWh per day 😭
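The numbers in that comment can be sanity-checked in a couple of lines (assuming the stated 600 W average apartment draw and 25 ct/kWh):

```python
# 600 W sustained for a day, priced at 0.25 EUR/kWh.
avg_watts = 600
kwh_per_day = avg_watts * 24 / 1000          # 14.4 kWh, close to the ~15 quoted
cost_per_day_eur = kwh_per_day * 0.25

print(kwh_per_day, round(cost_per_day_eur, 2))  # 14.4 3.6
```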
Last I checked with a kill-a-watt, I was drawing an average of 2.5kWh after a week of monitoring my whole rack. That was about three years ago, and the following was running in my rack:
- r610 dual 1kw PSU
- homebuilt server Gigabyte 750w PSU
- homebuilt Asus gaming rig 650w PSU
- homebuilt Asus retro(xp) gaming/testing rig 350w PSU
- HP laptop as dev env/warmsite ~ 200w PSU
- Amcrest NVR 80w (I guess?)
- HP T610 65w PSU
- Terramaster F5-422 90w PSU
- TP-Link TL-SG2424P 180w PSU
- Brocade ICX6610-48P-E dual 1kw PSU
- Misc routers, rpis, poe aps, modems(cable & 5G) ~ 700w combined (cameras not included, brocade powers them directly)
I also have two battery systems split between high priority and low priority infrastructure.
I was drawing an average of 2.5kWh after a week of monitoring my whole rack
That doesn’t seem right; that’s only ~15W averaged over the week. Each one of those systems alone will exceed that at idle running 24/7. I’d expect 1-2 orders of magnitude more.
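Working out the average draw implied by the quoted figure (2.5 kWh total over one week):

```python
# Average power = energy / time: spread 2.5 kWh over a full week.
kwh = 2.5
hours_in_week = 7 * 24                 # 168 h
avg_watts = kwh * 1000 / hours_in_week

print(round(avg_watts, 1))  # ~14.9 W, implausibly low for that hardware list
```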
IDK, after a week of runtime it told me 2.5kwh average. could be average per hour?
Highest power bill I ever saw was summer of 2022: $1800. Temps outside were into the 110-120 range and it was the hottest ever here.
maybe I’ll hook it back up, but I’ve got different (newer) hardware now.
Ugh, I need to get off my ass and install a rack and some fiber drops to finalize my network buildout.
45 to 55 watts.
But I make use of it for backup and firewall. No cloud shit.
17W for an N100 system with 4 HDD’s
Which HDDs? That’s really good.
Seagate Ironwolf “ST4000VN006”
I do have some issues with read speeds but that’s probably networking related or due to using RAID5.
That’s pretty low with 4 HDD’s. One of my servers uses 30 watts; half of that is from the 2 HDD’s in it.
@meldrik @qaz I’ve got a bunch of older, smaller drives, and as they fail I’m slowly transitioning to much more efficient (and larger) HGST helium drives. I don’t have measurements, but anecdotally a dual-drive USB dock with crappy 1.5A power adapter (so 18W) couldn’t handle spinning up two older drives but could handle two HGST drives.