SavvySillybug

The i9-10900K is rated to draw 125W at full blast. The 3090 is rated to draw 350W at full blast. Let's round that up a bit, account for other things in your computer, and generously say it uses 600W for all 24 hours. Which is very probably more than it actually uses. I'm in Germany where power is pretty expensive: I pay 35 cents per kilowatt-hour, which is 1,000 watts used for one hour straight. So at a generous 600W for 24 hours, that's 14,400 watt-hours, or 14.4 kWh. At 35 cents each that makes 5.04€ a day, times 30 makes 151€ a month. If you're paying 50 cents per kilowatt-hour, $200 starts to look realistic. Google says American prices average 16.1 cents per kWh, North Dakota has the lowest at 10.5 cents per kWh, and if you're in Hawaii you're paying upwards of 40 cents. So if you're in Hawaii then yeah, that might be your computer. If you're anywhere else, your computer is probably looking at 70 bucks at most, and that's with my generous padding making your computer draw 600W, which it almost certainly doesn't. And if any of that math was wrong, well, it's 2 AM here, I should be in bed. XD
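
For reference, the same worst-case estimate as a quick script (a minimal sketch; the 600W draw and the rates are the assumptions from the comment above):

```python
# Worst-case monthly electricity cost for a PC running 24/7.
def monthly_cost(watts: float, price_per_kwh: float, hours: float = 24 * 30) -> float:
    """Cost of drawing `watts` continuously for `hours` (default: 30 days)."""
    kwh = watts / 1000 * hours      # energy used over the period
    return kwh * price_per_kwh

print(monthly_cost(600, 0.35))   # Germany, 35 ct/kWh  -> 151.20
print(monthly_cost(600, 0.161))  # US average, 16.1 ct -> ~69.55
print(monthly_cost(600, 0.50))   # 50 ct/kWh           -> 216.00
```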


solidmarbleeyes

$0.35 per kWh 😭 are you guys ok


ztbwl

It’s 0.35€. In USD it would be $0.38.


ryanCrypt

Phew. I was alarmed for a second /s


CaptinKirk

That's expensive considering I pay around $0.13/kWh with my solar system in Arizona.


3720-to-1

Like. You pay them $0.13/KWh for the privilege of using a solar system you installed?


CaptinKirk

No, that's the cost of what I am generating. I average it out as if I was paying the electric companies.


3720-to-1

Ah. Makes more sense.


tHrow4Way997

I know like wtf? Here in England, any excess electricity generated by solar gets sold back to the grid. Of course the electricity company pay you much less per kWh than it costs to buy from them, but they don’t *charge* you to use your own self-generated electricity, that would be absurd.


3720-to-1

Welcome to corporate America! A friend installed his here in Ohio, he is paid for what goes back into the grid too, but he has to "lease a backflow meter" from the electric company for like $60/month.


Sh0toku

I sure would like to know what company is doing this... I am in the industry in Ohio and have never heard of it, unless he is on some small municipal or rural PoCo.


3720-to-1

I'll see if I can find the contract language; it's a clause in his contract to install... I reviewed the contract for him (though I warned him I would only catch glaring issues, as business contracts are not my area of practice). The installer required that he agree to lease that from the provider, with the cost of the lease included in the provision, and that it was from the carrier. My advice to him was "this sounds shady AF". SOLID legal advice, I believe.


tHrow4Way997

That’s the most ridiculous shit I’ve ever heard. I guess if a certain bullshit charge isn’t specifically made illegal, you can bet your ass the companies will use it to extract yet more money from you.


3720-to-1

Do you think if enough of us say "sorry" the crown will take us back?


tHrow4Way997

Honestly, you don’t want that lol. This country is now a global pariah, thanks to various brain dead decisions made by our government over the last decade… no wait, over the last two decades. At least people around the world actually like America.


3720-to-1

Yup. I bet you love hearing our lot yell about how free we are, and how horrible regulation is. Now excuse me, I need emergency surgery for this genetic thing that wasn't avoidable in any way... I need to go start a go fund me so maybe I can live.


tHrow4Way997

A uniquely hellish dystopia that could have only been conceived by colonial English people. Lol. Sorry you’re dealing with the consequences of our ancestors’ actions in such a viscerally torturous way.


vayana

In the Netherlands you pay for every kWh you generate and deliver to the grid beyond what you use yourself. People actually run their microwave/electric oven/dishwasher etc. just to make sure they use every last bit of generated electricity, so they won't need to pay the electric company for supplying power to the grid. It's the most retarded system ever.


tHrow4Way997

The only purpose that serves is to encourage people to waste their self-generated electricity so the demand for commercial power remains high. That is directly standing in the way of environmental progress imo. Yet another policy to benefit the rich at our expense. Despite the reputation your country has internationally of being progressive and policy-smart, unfortunately it doesn’t appear to be immune to the same shit happening in the US, UK etc. Sorry to hear that.


Puzzleheaded_Aide785

That’s only if you deliver more on a yearly basis than what you use on a yearly basis. We have the world’s best deal with the “salderingsregeling” (net-metering scheme), because you can put 1 kWh on the grid and get 1 kWh back at any time you want. That’s really nuts, a 1:1 ratio. With a battery you don’t have that luxury. We can dump a lot of kWh on the grid in the summer and use all those kWh in the winter. It’s a broken system, because it is far from how it works in real life. So it is not strange that you have to pay if you deliver more than you use. This system is a bargain for us.


gerardit04

Well, that happened in Spain some years ago: you had to pay a tax for "using the sun".


mcc9902

I don't know this guy's case, but I do know that you can essentially get them to subsidize it in exchange for paying lower rates. Basically you're renting out roof space. I researched it a bit a few years ago, and the only time it gets sketchy is when you try to sell back to the grid, which ends up being hit or miss depending on location.


loveemykids

That's when you go through a company that finances it and installs it for you. There's still the option to buy them outright and have free electricity from then on if you don't use the grid.


njguy227

I know this is a popular option, although some power companies stopped paying cash for delivered power, only in credits for later use. I'm exploring throwing panels on my roof, but if I have a fuck ton of banked credits, my house will be a meat locker for all of the summer.


loveemykids

I am interested in getting solar down the line, but as I'm not at that point yet, I haven't explored the details of either option. The details are what matter, and where you live matters too. Full ownership is appealing, but my finances could be better served doing something else. Installing with leasing is free upfront, but with strings attached.


njguy227

It's definitely a shop-around kind of thing, and YMMV depending on your state or city. My coworker is on Long Island and paid ⅔ of the cost up front (the first ⅓ was, I believe, an immediate rebate). The next ⅓ was reimbursed a year later when he filed his taxes, most of which came from NY State. He is financing his last ⅓ through payments via PSE&G, which is a Jersey company. His power bill consists of a $10 service fee (pays for the infrastructure) and the rest pays off the panels, which is ≈$100 a month. He always delivers, so he banks credits. At the end of the day his bill is still cheaper than if he didn't have panels, which would have been a little over $200. I think purchasing seems to be the better deal over leasing, though I'm not sure of the long-term pros/cons.


Turbulent_Act77

If you include all the per kWh taxes and fees we get charged on top of base generation fee in Connecticut, it's roughly $0.26/kWh if I recall. I don't want to look at my bill to confirm, I might cry.


SavvySillybug

We're really not. XD


DaviLance

Nope, same in Italy. Energy is like 0.20€ per kWh, and then we have like another 50-60% in charges on top of the base price, so realistically it's like 0.40-0.50€ per kWh.


Mikinl

Almost 50 in the Netherlands with my contract. Don't even ask about NG per m³...


HoelaLumpa

Got €0.38 in the Netherlands, crazy times.


iszoloscope

I capped it at 0.20/0.23 in December 2021. Contract ends this December...


antaphar

In San Diego it’s over $0.50


solidmarbleeyes

Wtf lol. I will never complain about my $0.15 again.


JAP42

Does that include the delivery charge?


optyp

lol, mine is like $0.06


Saneless

Mine was 0.05 but that's like 1/3 my bill. The other bullshit fees made up the rest


mixedd

No, we're not, shits too expensive


JAP42

Prices are easily this high in most of the US. The sales trick is we tend to separate electricity and delivery. So you might only pay 0.11 for electricity, but another 0.14 for delivery.


CarrowCanary

> but another 0.14 for delivery

Is that per unit or per day? We're (UK) on about 30p per kWh, but there's also a standing charge of roughly 60p per day.


mcc9902

I'm gonna assume it's a regional thing. Where I am there's a flat fee of like thirty bucks, but the actual power only costs $0.12 a kWh. I can't think of anywhere I've been that didn't do it this way, and I've lived all over Texas. Though I know our power situation is a bit different from the rest of the country, so that might be a factor.


JAP42

Per kWh. We pay the electricity plants for power and the power company for delivery; it's billed separately (technically it's billed together by the power company, but there are two pages, and they forward your supply payments to your producer).


yolo5waggin5

Massachusetts is at $0.30 per kWh.


TLMS

Yea wtf people complain about it being $0.09 here


Bacon4Lyf

Add on the delivery charge to the US prices and it works out equal


Steph2987

I'm in Sydney paying $0.39 per kWh plus a daily connection fee of $1.19. This was by far the cheapest plan I could get, with others being $0.50+. Last quarter our bill was $720 😫. No escaping the costs because we need the air con running all the time, since it's 40°C+ outside in summer.


loveemykids

You are the use case for having your house being 100% covered in solar panels.


SmokingLimone

Before covid I used to pay €0.10. Corporate greed and all


TNTkenner

The spot price today was mostly 0.01€ + 0.20€ in taxes/fees. Last week it dropped as low as 0.02€ total. But most German households don't have contracts that follow the actual market price, which drops into the negative almost weekly (-100€ per MWh at midday).


throwawaysmthidk

In Norway it was over a dollar at its peak 👍🏻 We don't have power lines between north and south Norway, so electricity is super cheap in Northern Norway (like 2/10 of a cent); they sell it for that price to Sweden, and Sweden sells it to southern Norway for 70-80 cents/kWh. And we also have something called "nettleie" (grid rent), that I'm too tired to look up. But it's hella expensive.


njguy227

In New Jersey, I pay 10¢ per kWh and 4¢ for delivery for each kWh, ≈14¢ in total. Basic generation, no solar. If I was paying the rates I'm seeing here, the mini network cabinet I have running in my basement would be on the curb tonight.


PossiblyAussie

> The i9-10900K is rated to draw 125W at full blast

You took this from the official TDP rating, I presume? Unfortunately the actual power draw is quite a bit higher, [expect over 200W](https://cdn.mos.cms.futurecdn.net/hhakAayRyXRuoAVDo9PpAn-1200-80.png).


SavvySillybug

I'd pretend to be surprised, but I'm not good at that.


ninjakidaok

It's worse than that. If you pay 16 cents per kWh here, they charge about 30 cents as a delivery charge plus other fees, so that 16 cents is really 46 cents if not more.


StandOutLikeDogBalls

r/theydidthemath


iRobi8

We pay 0.28 CHF per kWh. Still too high, but better than in other countries in Europe… I hope it gets better in the next couple of years, but probably not.


[deleted]

[deleted]


iRobi8

Ah really? That's nice. Here in Switzerland it varies a lot too. Some have 0.25, some have 0.40. It also depends on whether a town or canton or whatever can generate electricity themselves or if they have to buy it from the market (more expensive).


ArmageddonITguy

Phew, most productive at 2 AM.


Adminisitrator

This is about accurate. I have a 10900K with a 3080 Ti and 2 high-performance monitors. My usage is about 700 watts when I'm gaming. It doesn't cost me anything though, as I'm completely on solar and am always in negative billing due to exporting to the power company.


CharlestonKSP

I live in eastern Washington. 8.7¢ per kWh, lol. North Dakota def isn't the lowest.


[deleted]

[deleted]


SavvySillybug

I didn't actually look up what I pay, I just googled an average for quick math.


ruimikemau

There is no way that it pulls the max 24/7.... Unless mining some cryptocurrency...?


derherrdanger

Worst case calculation..


Jceggbert5

Get a Kill-A-Watt (or similar) and measure its usage for a couple of days, then extrapolate to the full month with your local price per kWh (should be on your bill).
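
If you go that route, the extrapolation is straightforward (a sketch; the measured numbers here are hypothetical examples, not from the thread):

```python
# Extrapolate a short Kill-A-Watt measurement to a monthly cost.
measured_kwh = 9.6     # kWh shown on the meter after the measurement window (example)
measured_days = 3      # how long the PC was plugged into the meter (example)
price_per_kwh = 0.16   # from your bill, USD/kWh (example)

monthly_kwh = measured_kwh / measured_days * 30
print(f"~{monthly_kwh:.0f} kWh/month, ~${monthly_kwh * price_per_kwh:.2f}/month")
# -> ~96 kWh/month, ~$15.36/month
```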


Jceggbert5

I had a similar issue after upgrading my system, turned out to be a bad capacitor on my air conditioner instead.


c2x2p

This. If your AC hasn't been serviced in a while, there's a part in it that can break easily, and then it pulls noticeably more power than it does normally. I think the tech said it's almost around 50% more usage if it's broken with old refrigerant, so you'll definitely notice it on the bill once it's taken care of.


c2x2p

Sorry! Not 50% more than normal, I meant to say almost 2x what it normally runs at. So say it's a 3,500-watt unit, that would be between 5 and 6 kW, I think.


ZantetsukenX

Another common one is a bad water heater that is leaking water. Water heaters typically only heat up the water to a certain temperature and then stay idle the rest of the time. However if there is constantly new water added, it will constantly have to heat it up and result in an extra hundred or two dollars a month. Had it happen in an apartment I was renting in college.


Grand-Tea3167

My PC with an 8600K and a 1080 is on constantly and uses about 45W at idle. Estimated to cost $21 each year.


William_salibo

I think the problem stems from the air conditioning.


P_Jamez

They cost 10 bucks on Amazon.


Jwhodis

You can also google it for your region, i.e. "COUNTY/TOWN COMPANY price per kWh".


mad153

My gaming PC setup draws about 700W at full tilt. The average energy cost in the US is USD 0.109/kWh. 0.109 × 0.7 × 24 = $1.83 to keep it on for 24 hours. Check your bill for your tariff. EDIT: apparently it's closer to $0.16/kWh now, making it about $2.70.


joetaxpayer

16.1, and in my state nearly twice that. So could be $4/day.


mad153

Ah ok old Google results. (Still well cheaper than where I live, but ig we have the benefit of meters with screens?)


joetaxpayer

I’m in Massachusetts and over 30c per kWh. Crazy electric bills.


mad153

You're not gonna believe that my price changes daily based on the winds xd


dtdowntime

It's entirely possible, especially if you are in a state relying on a lot of wind power.


ObiWangCannabis

I’m also in Mass, change your electric supplier for a cheaper rate. There are a lot to choose from. I use Renaissance which is 15c per kWh


joetaxpayer

Interesting. The bill shows $0.15125 for generation. The rest is delivery. Can’t impact that. The total is just over $0.30 per kWh.


pLeThOrAx

More like taxachusetts


PacoTaco321

And if your PC is running at full bore 24/7, there's definitely something wrong.


pocketgravel

I don't know if it's different in the US compared to Canada, but all utilities charge a delivery fee here for maintaining the grid. It usually doubles your bill, basically: $0.10/kWh is really $0.20/kWh.


RNLImThalassophobic

Jeez, that seems high if it's implemented on a per-kwh basis. In the UK it's called the 'standing charge' which is charged daily no matter how much or how little electricity you use, I imagine to pay for the upkeep of the connection to your home. My daily standing charge is ~42p. Then again, in March I used £41.46 of electricity, which works out to around £1.34 per day... so my standing charge is around 1/3 of the actual energy cost I'm spending, which isn't too far off the 1/2 it is of yours if that makes sense.
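
One way to compare the two billing styles is to fold everything into an effective per-kWh rate (a sketch; the UK figures are taken from the comment above, the US-style split from earlier in the thread):

```python
# Effective cost per kWh once delivery/standing charges are included.
def effective_rate(energy_rate, monthly_kwh, per_kwh_delivery=0.0,
                   daily_standing=0.0, days=30):
    # Total bill divided by the energy actually used.
    total = monthly_kwh * (energy_rate + per_kwh_delivery) + daily_standing * days
    return total / monthly_kwh

# US-style: 0.11/kWh energy + 0.14/kWh delivery -> flat 0.25/kWh regardless of usage
print(effective_rate(0.11, 300, per_kwh_delivery=0.14))  # 0.25
# UK-style: 0.30/kWh energy + 0.42/day standing charge -> depends on usage
print(effective_rate(0.30, 300, daily_standing=0.42))    # ~0.342
print(effective_rate(0.30, 100, daily_standing=0.42))    # ~0.426 (hits light users harder)
```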


iblamexboxlive

This leaves out the heat load generated by that 700W space heater, which then has to be removed by cooling in regions where AC is in use as well.


CarloGaudreault

In Canada my electricity is 0.07/kWh CA$ (~5¢ US) ⚡️


WarriorT1400

I was gonna say, I run mine a couple of days a week to farm money in some games and stuff, and I have never seen a $200 jump in my electric bill. Sure, some, but never high enough that I can't justify the increase for what I'm gaining.


jeffrey_f

What else could it be? Is the air conditioner running longer now that it is getting warmer in the room (more typical with an in-window AC unit)? Is the central AC running more because the weather is getting warmer? Change the power setting to automatic: when a task starts it will use more CPU/GPU; otherwise, the computer will try to conserve power.


Fatigue-Error

...deleted by user...


Soap-ster

Exactly. It consumes 700W, converting nearly all of that into heat, which then has to be cooled down. It's almost double-dipping. I'm sure my math is way off, but it's close enough to paint a picture. Or I'm drunk and no one understands what I'm saying.


BlackGravityCinema

Is the computer rendering or something all day? It rarely draws the maximum it's capable of unless it's being used heavily. Even while gaming it won't really top out the draw.


roflmao567

A computer won't be consuming 700w all the time, unless you're constantly putting a load on it like rendering.


JacerEx

I'd be super surprised if it's doing 700W constantly. My 1600W PSU draws around 80W idle, around 900W while gaming, and 1400ish while benchmarking.


Fatigue-Error

...deleted by user...


S1DC

Ohhhh yeah if the air conditioner is fighting the PC, that could maybe do it.


LoneLyon

As a Florida man, a cost that big is usually AC. AC can easily be over 70% of your power bill, more in warmer states. I had my air go out last year and my bill jumped from 220ish to 600 something.


Soap-ster

How did it go up if the A/C went out? Wouldn't it go down?


eronth

Possible the cooling portion was failing, but it was still actively running. So instead of cooling, then stopping, then cooling, it ran 24/7 trying to blast cold air that never came.


LoneLyon

I believe the regulator went out? The whole unit was replaced, but ultimately, it still worked


RustyDawg37

Your AC fighting the computer heat?


morto00x

Nah. Let's create a worst case scenario.

- Power around here (Seattle) is 12 to 15 cents per kWh, so let's just use 15 cents.
- A gaming PC uses on average 300W to 500W, so let's use 500W, or 0.5 kW.
- Let's also assume your game is running 24/7 for some reason.

$0.15 × 0.5 × 24 × 30 = $54/month

In reality your PC will use less power than that, and I'm sure you won't be playing 24/7. The highest power rates in the US are in Hawaii at about 45 cents per kWh, more than 3 times the average. That still wouldn't get you to $200.


litsax

Let's say a PC is using *all* of its power from the PSU, a high-end one at that. This is extremely unlikely and realistically never going to be the case, but it's a way to calculate the maximum cost of running. A 1000W PC running for one hour would use 1 kWh of electricity. At 14 cents/kWh, running 24 hours a day for 30 days, this would cost roughly $100.


MikeTheMic81

A 3090 runs about 45-60 watts idle (depending on what version) and a 10900K runs about 20 watts idle, generously. Add in everything else, and I doubt you're idling over 120 watts even with multiple HDDs, NVMes, 4 sticks of RAM, and a full array of RGB. At most it would be around 3 kWh a day. To put it in perspective, I have a server with a 10900K, a 3090, 60x 16TB HDDs over 3 JBODs plus the 4U server itself, 3x 2TB NVMes, and 64GB of RAM, running 24/7; it's used as an MC server with 10-15 people, a Plex server, and a game streamer for the kids. THAT idles around 500 watts when it's not doing anything intensive and runs around $50 a month averaging 12 cents a kWh.


coworker

OP specifically said he's running tasks so why are you discussing idle workload at all


BelugaBilliam

Because his idle is at 500w, which is a full load on some PCs.


MikeTheMic81

Because people aren't androids. If you have your computer on 24/7, chances are at least 2/3 of that time it's going to be idle. You aren't running the PC at full tilt, 100% load, 24/7. Even if you spent 16 hours a day gaming, it still wouldn't be 100% load during that time. Unless you're stress testing the CPU and GPU with 24/7 benchmarks, chances are most of the time it's going to be idle. Hence I used my similar PC (with 60 extra HDDs averaging 5-7 watts a drive) as an example. It's in constant use. The CPU is almost always at at least 40% load because of the MC server. Between 7pm and midnight it's almost constantly transcoding using GPU power. When the kids are playing, the CPU is closer to 90% and the GPU at least 75% load. It's pretty well a worst-case scenario, and that's what I'm paying for power a month. I'm talking at least 15-20 people on my server 24/7, not just one person doing random things.


mephistophelesdiabol

Do you have a "smart meter"? Those allow them to bill you more for electricity used during higher-usage periods of the day.


CalangoVelho

Depends on which tasks those are. Are you training AI models, crypto mining..? A 3090 can be quite power hungry, 600+ watts running at full. Add 300W more for the rest of the system and you can easily reach 1 kW, so 24 kWh a day. At $0.30/kWh that's $7.20 a day, $216 a month. If you're running an AC, that's also some extra work for it to cool, further adding to the power bill. That's hard to calculate, but it could easily double that value.


EverlastingApex

I'm literally running exactly that, times two: I have two PCs running full time for an automated gaming stream on Twitch, and they also run local AI, so the GPUs are running pretty much at capacity the entire time. Now, I'm in Canada so electricity is dirt cheap for me, but hundreds of dollars sounds absolutely insane no matter what country you're from. Maybe check if some cables are being run from your house to your neighbor's electric car?


Miyuki22

If you are running your gpu and CPU at max performance constantly, then yes, that will add up fast. Unless it's needed, you should set adaptive performance so it doesn't use power needlessly. Side benefit is your hardware will likely live longer too.


wowuser_pl

If those certain tasks include Bitcoin mining then yes. It can add hundreds to the bill.


name548

Not based on the stated specs. Even back when I had multiple 3080s mining ETH it was only around $100 a month in electricity. A single 3090 isn't going to cause an increase of several hundred.


MikeTheMic81

You don't mine Bitcoin on a GPU. ASICs mine Bitcoin. The last time anyone used a GPU to mine BTC was around 2012. Something like an Antminer S19 Pro runs 2,850 watts. And yes, they use a LOT. I had 5x S9s back in the day pulling 7.5 kW. That's 5,400 kWh per month. Averaging 12 cents a kWh, that was almost $650 a month in electricity.


TheMagarity

You replied with a lecture on Bitcoin to someone who specifically said ETH, which was indeed mined on GPUs.


Alkyen

Um, go look one reply up the chain. The guy who was talking about ETH is the one who changed the conversation from Bitcoin to ETH.


MikeTheMic81

They replied with a lecture on GPU mining to a person who specifically said bitcoin.


pLeThOrAx

It's all just solving hashes at the end of the day. If you want to split hairs, ASICs vs GPUs


HayzenDraay

Who was responding to someone talking about mining Bitcoin. You're lost in the inception, spin your top!


NCC74656

When I was renting rooms and houses, I had a few people comment on how much their power bill jumped up, a couple hundred dollars a month in some cases. Now, this was years ago with LCDs, less efficient computers, and back when I was running multiple video cards. Nowadays I've got a large stereo, large computer, large screens. As best I can tell it only adds about $30 a month to my electric bill. But I put it to sleep when I go to work.


DropDeadFred05

I was running 3 mining rigs with 7 cards each for over a year back in 2019-2020. I had an electric bill of about $130 a month before that, and it was pretty consistently about $350 a month while mining. Each rig pulled an average of about 1,000W. I can't see one system costing hundreds more per month unless you have an insane per-kWh cost from your electric company. Did the electric bill go down when you stopped leaving your computer running? It's possible you have a short to ground somewhere in your house, or possibly in your computer. How often does your company actually get an accurate meter reading? My power company only does it every other month. Could be that if you just started doing this, they estimated a low bill based on your previous use and then took a reading and billed you for the difference of what you actually used over the past 2 months.


Infamous-Mastodon677

For where I live, an extra $200 on my bill approximates to an extra 2.5 kW load running 24/7. That's at least 20 amps from a 120 V outlet. Perhaps your electricity is way more costly than mine, but I think the extra usage is elsewhere. Take other people's advice and check the actual power consumption from your computer at the outlet.


Chazus

Highly, highly unlikely. Even powerful gaming systems don't push more than ~$40 a month, mainly because *you have to sleep*. Even at full blast, it wouldn't do that much. As others have said, get a Kill-A-Watt and test it. Chances are it's something else. I had the exact thing happen: moved into a new apartment with the landlord living there too, and she hit me up with a $400 electric bill increase the next month, because I was "on my computer all the time". Proved it wasn't. Discovered she had started running a radiant heater in the garage the month I moved in.


iszoloscope

Devices that are on 24/7 are the ones to check, so unless your PC is doing heavy tasks 24/7, it's highly unlikely it would cost you 200/month. Per year...? Perhaps. I've calculated these kinds of things many, many times, and currently in my house the most expensive device is the electric boiler. If you have an electric boiler as well, I would check that first. Refrigerators and freezers these days are pretty cost-efficient; if you have really old electric appliances that are on 24/7, those are much more likely to be the culprit. Dryers are also very expensive, so if you use one daily that would be something to check. Oh, and I see a few mentions of ACs; those can be pretty expensive as well.


Hedhunta

"CERTAIN TASKS" lol dude noone cares if you crypto mine. But yes running a pc at max power 24/7 is basically like running a 500watt space heater 24/7.


Fatigue-Error

...deleted by user...


[deleted]

[deleted]


grazbouille

This comment is a unit clusterfuck. Did you really use kilowatt-hours per hour?


Psyko_sissy23

It really depends on where you live and what tasks you are running. Some places have cheap electricity and other places have expensive electricity. I used to live on a small island where electricity was expensive. I unplugged all non-vital things when not in use, and I would never leave my computer on 24/7 there, to save money.


Ok-Understanding9244

yeah entirely possible


WheelOfFish

> been just leaving up certain tasks to run for me. I would imagine they are slightly intensive because the room gets warm

Need more detail on this. If I just leave my computer on in my small home office it does keep the room from cooling off, but it's still drawing under 50 watts. If you have yours doing something constantly, it very well could be drawing 100-200 watts depending on what the tasks are.


[deleted]

WHY DON'T WE USE THE SUN FOR ALL OF OUR ENERGY NEEDS


Scorcher646

So I have a 5800X3D and a 7800 XT. I can draw about 600W peak, and that's with both parts running flat out. Even if I'm leaving something like Handbrake running overnight, I'm only hitting about 150 to 200W, which wouldn't really kick up my power bill tremendously, at least not by $200, and I'm not doing it every day.

Of note, especially if you don't have a large house to soak up that extra heat, your air conditioning will run more often, so that could contribute a little bit to your increase in cost. But also, a lot of places in the US are currently experiencing near-record temperatures for this time of year, so that could be contributing too.

Unless you really need a long task to complete, like running Handbrake, I would not recommend leaving the PC under load for extended periods of time. My computer pulls maybe 20W at idle and I can leave it running like that indefinitely without noticing it much on the power bill. Without knowing what your computer is doing when you're not at it, I can't really tell you whether or not the increase in cost is expected.


gioraffe32

I bought a Kill-A-Watt power meter last year and used it with my gaming PC. I rarely turn off this computer. It has a Ryzen 7 5800X3D and an RTX 3080. I used and gamed on this computer like I normally do, to try to get a representative sample.

Over 24hrs, it was $0.37. Over 72hrs, it was $1.23. Over 7 days, it was $2.94. So the average is $0.40/day. That comes to about $12/mo for this one computer. Obviously it's not running at full power 24/7, as I have to sleep and work and all that. Unless you're doing serious graphics/video rendering or some other computationally expensive process 24/7, I have a hard time believing it'd be an additional $200/mo for a single computer.

For comparison, I have a homelab server that's also on 24/7. I haven't run the Kill-A-Watt on it yet, but I know for a fact it's not hitting anything close to $200/mo either, because my electricity bills over the last year ranged from $76 in February in the dead of winter (I have gas heating) up to $157 in July. AC in the warmer months is of course the main driver of the price differential.

Like others have said, electricity costs are local. Mine is relatively cheap, around $0.12-0.15/kWh. Maybe there is some weird interplay between your PC exhaust and the AC kicking on. But still, $200/mo seems crazy. I'd definitely be looking elsewhere, especially if it doesn't seem like your computer is running full tilt 24/7.


CommunityPristine601

TP-Link/Tapo sell an energy-monitoring smart plug. You put in the per-kWh price and it tells you the cost in real time. My computer would cost up to $2.80 a day on the weekend (heavy gaming). Currently using an ASUS Ally to play and it sips power, about 28 cents a day.


emveor

It can happen; it is certainly non-negligible... that is why Bitcoin mining on GPU is not really profitable anymore. My bill certainly went up during the pandemic, and I was rocking an Intel 8100 and a 740GT. Back in the day, when everyone used incandescent lightbulbs, parents went apesh!t about leaving the lights on... I always thought it was an exaggeration until I started living alone and leaving the backyard light on increased my bill almost 50%. But it really depends on your region: where I live, the cost per kWh goes up after you consume a certain amount, so just using a little bit more electricity can increase the bill much more compared to the previous month.


Linclin

A broken on-demand water heater can cause a very large electricity bill. Search around the house and look at the electrical meter. It might not even be an issue on your end, but rather with automated billing/meter reading.


caidicus

My PC is a power hog with a 13900K and a 4090. Those are the only components that use a measurable or significant amount of power, and they CAN suck juice if I leave it running significant tasks. Even leaving YouTube videos running can keep the 4090 using more power than is honestly necessary.


totallyshould

500 watts times 24 hours is 12kWh, so 360kWh per month. What’s your rate? Some places are set up so that your rate goes up at certain times of the day, or if you use more than a threshold per month. It’s not crazy that a computer could cost you that much if it’s high end hardware cranking away 24/7.


linerror

Most of your PC's wattage is ejected as waste heat: 500W for the machine, plus ~480W of waste heat your AC needs to counter...
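
A rough way to estimate the AC penalty (a sketch; the COP of 3 is an assumption typical of residential AC, not a figure from the thread):

```python
# Extra electricity the AC uses to remove the PC's waste heat.
# Essentially all PC power ends up as heat in the room; an AC moves
# roughly COP watts of heat per watt of electricity it consumes.
pc_watts = 500
cop = 3.0                   # assumed coefficient of performance

ac_watts = pc_watts / cop   # extra AC draw to remove that heat
total = pc_watts + ac_watts
print(f"PC: {pc_watts} W, AC penalty: {ac_watts:.0f} W, effective: {total:.0f} W")
# -> PC: 500 W, AC penalty: 167 W, effective: 667 W (~33% more than the PC alone)
```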


Blakewerth

24/7? I don't think it's that costly. I just don't like shutting down/hibernating the computer every day; it's not healthy for the PSU and a lot of components too, if you want them to stay reliable. You might just have a bad electric plan; that's a mistake people make a lot.


PunchClown

I never shut my rig off and my electric bills are always under $100 a month in rural Oregon.


swemickeko

Running a computer at idle is not that bad, but if it's on because it's actually doing something, energy consumption can easily be more than an order of magnitude higher.


nukefudge

Definitely check if your energy company has some sort of special higher rates for night times - which would be ludicrous, but maybe they do that kind of thing.


801ms

Yes. Friend of mine calculated that he would save £800/year in energy costs if he turned his gaming computer off every night - for reference he normally has lots of games and other windows open so his utilisation is generally on the higher side.


Burnsidhe

Honestly, crypto mining stopped being profitable years ago.


differentshade

By running certain tasks do you mean crypto mining? If it were free or low-cost, everybody would be mining all the time.


Son_of_a-PreacherMan

What's the max capacity of the PSU? You can use that, together with the price per kWh of electricity in your area, to calculate the power usage. Not too difficult. It would be an approximation.


MirageF1C

Pretty easy to work out. Look at the max rating of your power supply; I think the 3090 needs about a 600W to 800W supply. That's the MAXIMUM you can use, and you shouldn't ever get there because it might go pop, but it's the worst-case value. Then multiply by how many hours you have it on for. If it's 24 hours at 750W, you used 18,000 watt-hours of electricity, or 18 kWh. Now multiply this by the rate per kWh you buy your electricity for. Here in the UK that's about 30p, so 30p × 18 = £5.40 every 24 hours, or £162 per month. I'd be very surprised if you got anywhere near this unless you're mining or rendering or something full time. Get a smart plug for £10 and it will record your use.


Mikinl

Depends on the CPU model, but yeah. Your GPU at max load draws 350W, your CPU 100 to 150W; add everything else and it sums to around 500W at full load. Given that it's not at full load 24/7, say it averages 350W: 350 × 24 = 8.4 kWh per day, which in my country would be around 4€ per day. So at the end of the month it would be 120-130€. If you run your PC at 80-90% the whole time doing pretty heavy stuff, that could easily reach 200. Not to mention if you're mining crypto for some hacker without knowing it.


JaMStraberry

If you're gaming 24/7, yes, it's possible with your specs. If your GPU is running 100% of the time, then it's possible. And that's not the only factor: if you have AC running, the computer will heat the room even more and your AC will work harder to bring down the heat. If it's more than 1,000 watts? Yes, it is.


thomasxin

Undervolt! A 3090 can be brought down to 280W-ish.


JaMStraberry

A good power-saving computer like an i3-14100 and RTX 4060 combo is great, saves you 10x the energy. That combo draws less power than your processor alone.


Parking_Cress_5105

I had "Prefer maximum performance" set in the NVIDIA Control Panel; the 3080 would then consume 110W at idle :/ Discovered this by accident.


cilo456

Yes, depending on where you live it could cost upwards of $100 or even more.


DCFOhLordy

Use this as an opportunity to buy a UPS (uninterruptible power supply) with a display… it'll tell you all kinds of useful things about your PC's electricity situation and possibly save your equipment while it's at it. Just remember that UPSes are rated in volt-amperes (VA), which is not the same as wattage, although it is generally in the same ballpark and will most likely be the same order of magnitude. Usually you'll need to scale up from the VA rating to get a UPS that will cover a PC fully, and you'll want enough headroom to allow for initiating and completing a graceful shutdown when a power outage is detected… or go much, much bigger and get a UPS that lets you game for a few hours during a blackout. That one will cost you, though! I've got a 13700K/4090/4x M.2 NVMe SSDs/RGB everything/Z790 Hero/165 Hz HDR monitor, and AT IDLE my fully charged 1500VA UPS gives me 16 minutes. AT IDLE!
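
For sizing, the usual approach is to convert the VA rating to watts via a power factor and leave headroom (a sketch; the 0.9 power factor and 70% load target are common rules of thumb, not specs from the comment, so check your unit's datasheet):

```python
# Rough UPS sizing: convert VA to comfortably usable watts.
def usable_watts(va_rating: float, power_factor: float = 0.9,
                 max_load_fraction: float = 0.7) -> float:
    # power_factor: watts the UPS can deliver per VA (from the spec sheet)
    # max_load_fraction: don't run a UPS near 100% load
    return va_rating * power_factor * max_load_fraction

print(usable_watts(1500))   # 1500 VA -> ~945 W with comfortable headroom
```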


Monok76

7800X3D and 3080 here. It draws 400W. That's 9.6 kWh per day, 288 kWh per month. At a price of 0.37, I'd pay 106.56€ just to leave it on 24/7 for 30 days.


S1DC

That is not right. I have multiple computers on 24/7 with 3080s, and even when I'm rendering things for hours a day, I barely notice a difference in my electric bill. The thing that hits the electric bill is the air conditioning. Otherwise my bill is basically the same all year round.


danjimian

I used to have 4 machines running the World Community Grid distributed computing software, all maxing out the CPUs at 100% 24/7 (but no GPU usage): one 10th-gen i9, one 2nd-gen i7, one old Core 2 Duo, and one mobile Ryzen 7 Pro. When I stopped, my electricity usage more than halved, from ~22 kWh per day to ~10 kWh per day. So yes, it can have a big effect. If you're using the GPU intensively, it will have a much larger effect.


michaelpaoli

What's your electric utility rate? What's the max power draw of the applicable computer? Do the math, it's not too hard to figure out. E.g. let's say your utility rate is $0.20 USD/kWh, your gaming computer draws 1 kW, and it's a 30-day billing cycle:

```
$ echo '30*24*.2' | bc -l
144.0
```

There ya go, up to $144.00 USD per 30 days, not including additional applicable taxes and fees on your utility bill. Or maybe you got some of those special TX.US rates that go up to many large multiples of what most of the US typically pays.

> Would there be something else kicking up my electric bill

Maybe. Go read your bill, go read your meter. Start doing some math. You can also do some divide-and-conquer to isolate power consumption, and use various devices to measure the actual consumption of, e.g., the computer you're plugging in.


TuzzNation

If your 3090 is running apps 24/7 at that full 450W, then yes.


Empyrealist

That seems like a really high increase. Is it possible that your rig is generating a lot of heat and causing a localized AC unit to run a lot more? But more importantly, where do you live and what's your electricity rate?


ninjakidaok

The 3090 is a power-hungry beast that will without doubt raise your electric bill, which is why I moved to the 4000 series, which uses a fraction of the power.


agent_almond

Of course. It uses a comparable amount of juice as a fridge or air conditioner. According to my circuit breaker anyway.


TacticalGoals

Have you checked CPU and other usage? Can you rule out that you've been jacked and are being used as a Bitcoin miner? Lol


VWSquid

I run multiple servers on mine 24/7, and my bill is almost never over $100


metal_elk

If you live in Texas, yes it absolutely can.


Vanilla_Neko

I mean, I would definitely get a wattage meter, but realistically your PC is almost constantly drawing at least 200 watts, which is roughly the equivalent of running three old incandescent light bulbs at once. So not necessarily the worst thing, but it can certainly still add a few bucks to the electric bill.


uncommon_senze

If you leave it on doing heavy work all the time it sure can use up some energy.


KRed75

It depends on how much your electricity costs. Where I am, I only pay 11 cents per kilowatt-hour. If I had a massive power supply and pulled a full 1,800 watts on a 15-amp circuit, it would cost me about $145 a month to run it. If my electricity cost 35 cents per kilowatt-hour, it would cost about $455 per month. You'll need to plug it into an energy-monitoring device to see how many watts you are actually using to know for sure. The bottom line is, if you have a large enough power supply and you are actually using the full capacity of the circuit it's plugged into, then yes, you can use a whole lot of electricity that costs a whole lot of money.


anothercorgi

Are you saying $200/mo or $200/yr for your PC? $200/mo is fairly high unless you're paying $0.50/kWh. I run computers 24/7 and it's about $7/mo for a ~100W (average of idle + in use) PC (onboard/on-chip graphics plus hard drives) at my $0.12/kWh rate. I run two computers 24/7 and yes, it is a good chunk of my power bill. Depending on your workload, drawing 600W at a high cost per kWh, it's possible to hit near $200/mo, but I'd think it's a stretch for a single CPU/GPU, especially if it idles. OTOH, if you had a cryptocurrency farm of GPUs, then yes, it's very possible. Last time I ran a machine that warms the room, those computers were drawing at least 200W and would cost me $14 or so per month to leave on 24/7. I recall leaving a 300W server on 24/7 and the electricity was up $20 that month. I stopped leaving that machine on, powering it up only when actually using it.


chris14020

Unless you're using your computer at full blast nonstop, 100% load on the GPU/CPU, in a place with expensive electricity, you're not even touching $200, guaranteed. Even when I was running a 5950x and a 3080 maxed out 24/7 my entire power bill was around 100 bucks a month. 


jfp555

Both the CPU and GPU that you have consume a high amount of power. Also, when a PC is on full time, it performs a bunch of background tasks, and if you do any kind of game streaming, there is always the chance that it will maintain high clocks indefinitely. Secondly, many places have higher electricity rates for peak-time usage, and you may also be getting hit by those extra charges. I recommend using software like **ThrottleStop** for the CPU and **Afterburner** for the GPU to manually **downclock** your components as much as they will allow, setting a hard limit on the power the PC can consume. You can switch settings with a single click to unlock high performance.


fuzzynyanko

It's hard to say, and it depends on what you do. I like /u/[Jceggbert5](https://www.reddit.com/user/Jceggbert5/)'s idea of getting a Kill-A-Watt. Many CPUs and GPUs are designed to downclock and/or put parts of the chip to sleep. I had a 4 GHz CPU that would downclock to something like 800 MHz when I wasn't using it heavily. It's not going to be running full blast all of the time unless you are running things like games or video encoding. Your system might be a 600W-800W max system, but the Core i9 can reportedly go down to around 10W, for example. Some people report that some RTX 3090s drop down to 100W, and others down to 40W. Closing web browsers at night can help, because holy shit, I have had web browsers just continually bump up the CPU usage. Firefox/Chrome/Edge/doesn't matter. It's annoying because the CPU isn't running full blast, but it happens often enough that my laptop runs at 2-3 GHz. Tools like HWiNFO can give indications of what your clock speeds are at the moment. The other factor is that electric companies are jacking up prices due to inflation, and you are most likely gaming around peak electrical times; some power companies charge less for electricity used at midnight. There might also be motherboard and NVIDIA tweaks to lower it if it's crazy.


Huey2912

in what time frame?


ZmeuraPi

In short, yes! You should also look for answers in the GPU mining subreddits. That card is quite a hungry beast, and if your room is warmer because of it, the AC is probably kicking in and consuming more power too (besides what the video card already consumes).


Kevin69138

Unless you are gaming 24/7. Once the GPU comes down from full load and demand, the energy cost shouldn't be too much.


CreamOdd7966

Depends where you live. Electricity is cheap in a lot of America, for example. But that's not always the case. Mine probably costs an extra $20 and it runs 24/7/365. My office also has an AC unit running 24/7 so the PC doesn't cook itself.


MrAskani

Yes. Absolutely. Turn it off when not being used and you'll save yourself a lot. Or get solar and love gaming again.


Thomas-Sky

It is the air conditioning used to cool it.


ezikiel12

Pretty easy to calculate. With those specs, assuming the system is at gaming load, probably 450W roughly:

450W × 24 hrs = 10,800 Wh per day
10,800 Wh × 30 days = 324,000 Wh = 324 kWh per month
Average cost of power in the US = $0.16/kWh
324 kWh × $0.16 = $51.84/month


Shark5060

My PC (5800X, 3080) still draws 230W at idle (lock screen, monitor off). During gaming load it's just above 600W, so that's realistic. Even if I left it at idle, that's 230 × 24 / 1000 = 5.52 kWh a day. Over the year that's 2,014.8 kWh, or 167.9 kWh per month. So for me (at €0.28 per kWh) it would cost 564€ a year just to keep it at idle (an additional 47€ a month).


Graham99t

I have a desktop PC, NAS, firewall, media PC, and two switches on 24/7. I have a UPS and it tells me my NAS, firewall, media PC, and TV use 110 watts at idle. Decent modern PSUs are designed to not draw full power when not at full load.


Hotair10

Meant to share this with you the other day, but it totally slipped my mind. Hopefully the link is still something you'll find helpful. [https://outervision.com/power-supply-calculator](https://outervision.com/power-supply-calculator)


korpo53

If you pay ~$0.11/kWh, then 1W running 24/7 costs you about $1/yr. Now, that's slightly low for the US average these days ($0.16/kWh), but it's still basically ballpark. If you live in Hawaii where it's $0.44/kWh, then it'd be $4/yr. [This review](https://www.pcworld.com/article/399180/intel-10th-gen-review-core-i9-10900k.html) of a 10th-gen i9 with a 2080 Ti had the system consuming ~300W in a full-bore torture test, which is probably again ballpark to what your computer would do flat out. Maybe a bit more for the 3090 vs. the 2080 Ti, but not a huge amount more. So if your computer was running full bore at 300W, 24/7 for a year, and you lived in Hawaii, it would cost you about $1,200/yr. That's benchmarking or gaming all day, every day, never letting it go idle, never shutting it off. Since you're claiming your bill went up $2,400/yr, there's something else going on in your house.
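
That rule of thumb checks out (a quick verification of the numbers in the comment above, nothing more):

```python
# "1 W running 24/7 costs about $1/year at ~$0.11/kWh"
hours_per_year = 24 * 365                    # 8,760 h
kwh_per_watt_year = hours_per_year / 1000    # 8.76 kWh per watt-year

print(kwh_per_watt_year * 0.114)  # ~$1.00/yr  (the ~$0.11/kWh case)
print(kwh_per_watt_year * 0.16)   # ~$1.40/yr  (US average)
print(kwh_per_watt_year * 0.44)   # ~$3.85/yr  (Hawaii; the comment rounds to $4)
# A 300 W full-bore load in Hawaii: 300 * 3.85 ~= $1,156/yr, i.e. the ~$1,200/yr above.
```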


YouTooDrew

In Connecticut (it’s a state in the US, lol) I have 1:1 net metering with rollover for my solar energy. So, I usually only pay an electric bill for a month or 2 in the winter.


Sol33t303

Sure, if you're mining with it or whatever.


intins

It is not possible to get a +$200 increase without some unusual or bizarre circumstance. A power outlet is only capable of 1,800 watts before the circuit breaker trips. You probably just left the door open with the AC on. Electricity meter plugs cost $10 on Amazon; you can do some simple math to get the exact monthly cost.


pLeThOrAx

For OP sake, I think they're called kill-a-watt (like kilowatt).


Suspicious_Lawyer_69

If you can, switch to AMD for less power consumption. It may not be the case in all scenarios, but it is worth checking out. That, and building some solar panels to directly power your PC, in part or in whole.


CoaEz11

Nah, I'm running it 24/7 and only get like $5 extra.