Computers & Electricity

 
 
gollum
 
Reply Sun 2 Apr, 2006 07:33 am
How much does operating a personal computer for an hour increase a person's electric bill?

 
Phoenix32890
 
Reply Sun 2 Apr, 2006 07:39 am
Check this out:

http://www.csgnetwork.com/elecenergycalcs.html
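For a quick back-of-the-envelope answer to the original question, the arithmetic the calculator performs is simple: energy in kilowatt-hours is watts times hours divided by 1000, and the added cost is kilowatt-hours times your utility's rate. Here is a minimal sketch, assuming a desktop drawing about 150 W at the wall and an illustrative rate of $0.10 per kWh; both numbers are assumptions, so substitute your own machine's draw and the rate from your own bill.

    # Rough electricity-cost estimate for running a PC for some number of hours.
    # The wattage and the rate below are assumptions, not measurements.
    watts = 150.0          # assumed average draw at the wall, in watts
    hours = 1.0            # hours of use
    rate_per_kwh = 0.10    # assumed utility rate, dollars per kilowatt-hour

    kwh = watts * hours / 1000.0    # energy used, in kilowatt-hours
    cost = kwh * rate_per_kwh       # the increase on the electric bill
    print(f"{kwh:.3f} kWh, roughly ${cost:.4f}")

With those assumed figures, an hour of use adds on the order of a cent or two to the bill.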
 
USAFHokie80
 
Reply Sat 15 Apr, 2006 06:34 pm
Every computer is different. It depends on the output of the power supply in the computer; ratings range from roughly 200 watts to 1 kilowatt.
 
markr
 
Reply Sun 16 Apr, 2006 11:50 am
The output rating of the power supply is an upper bound. Actual consumption may be much less.
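To see how much the nameplate rating can overstate the bill, compare the cost computed from the rating with the cost at a more typical draw. The 450 W rating and the 120 W figure in this sketch are assumptions chosen for illustration, not measurements of any particular machine.

    # The power-supply rating is an upper bound; actual draw is usually far lower.
    # Both wattages and the rate here are illustrative assumptions.
    rate_per_kwh = 0.10        # assumed dollars per kilowatt-hour
    rated_watts = 450.0        # hypothetical nameplate rating of the power supply
    measured_watts = 120.0     # hypothetical draw measured at the wall outlet

    for label, watts in (("rated", rated_watts), ("measured", measured_watts)):
        cost_per_hour = watts / 1000.0 * rate_per_kwh
        print(f"{label}: {watts:.0f} W -> about ${cost_per_hour:.3f} per hour")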
 
timberlandko
 
Reply Sun 16 Apr, 2006 12:16 pm
Just a real-world, "the way things work" note here: the energy required to start a computer from the power-off state, especially one with a conventional CRT monitor, is roughly equivalent to what the system consumes over several hours of idle standby, and is in fact greater than the energy needed to operate the system normally for a considerable period. For a system used in the typical manner, accessed one or more times a day for a few hours at a stretch, the most energy-efficient approach is to configure it to power off its monitor after a period of non-use and then enter full standby some time after the automatic monitor shutdown.

My 'puters generally are on 24/7/365, configured to power down their monitors after a relatively brief period of non-use and then enter full standby after a slightly longer one; they are shut down or powered off only as required for maintenance, physical internal reconfiguration, or repair (and I should add that actual repair is not a particularly common happenstance). Among my machines are some of mid-'90s vintage, and even the oldest of them have fared fine under the "always on" regime.
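As a rough way to compare the usage patterns described above, the sketch below tallies one day's energy for a machine left fully on around the clock versus one that drops the monitor and then the whole system into standby after idle periods. All of the wattages and hours are assumptions chosen for illustration; they are not measurements of any particular setup.

    # Compare one day's energy for two usage patterns (all numbers are assumptions).
    active_w, idle_w, standby_w = 150.0, 90.0, 5.0   # assumed draw in each state, watts

    # Pattern A: machine and monitor stay fully on for all 24 hours (4 h active, 20 h idle).
    always_on_kwh = (4 * active_w + 20 * idle_w) / 1000.0

    # Pattern B: 4 h active, monitor off soon after use (2 h idle), then full standby (18 h).
    managed_kwh = (4 * active_w + 2 * idle_w + 18 * standby_w) / 1000.0

    print(f"always on: {always_on_kwh:.2f} kWh/day, managed: {managed_kwh:.2f} kWh/day")

With those assumed figures the managed configuration uses well under half the daily energy, which is the point of the monitor-off-then-standby setup.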
 
gollum
 
Reply Sun 16 Apr, 2006 06:17 pm
timberlandko--
Thank you, that is informative.

You speak of a conventional CRT monitor. Do flat screen monitors use less power?
 
timberlandko
 
Reply Sun 16 Apr, 2006 06:52 pm
Depending on the technology and display type, a flat panel may consume somewhere between a third and half as much power as a conventional CRT monitor in normal operation, and will pump considerably less heat into the room.

At present, flat-panel displays cannot match the video performance that CRTs, particularly the better ones, can achieve; but if out-and-out video performance isn't an issue for you, you may find a flat panel does everything you want it to do.

On the other hand, if you've spent big bucks for a higher-end video card, or especially if you've ganged a pair of top-flight video cards, using a flat panel is akin to putting a race car on the track with worn, undersized street tires.

In most real-world situations, the typical user probably will find a flat panel satisfactory, though cutting-edge gamers and hard-core graphics geeks still prefer high-performance CRTs.
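To put the "a third to a half" figure in perspective, here is a hedged sketch of the yearly cost difference, assuming a CRT drawing about 75 W, an LCD drawing about 30 W, four hours of use a day, and an assumed rate of $0.10 per kWh; substitute your own monitor's specifications and your own rate.

    # Estimate the yearly cost difference between a CRT and an LCD monitor.
    # The wattages, hours, and rate are assumptions for illustration only.
    crt_w, lcd_w = 75.0, 30.0      # assumed power draw in watts
    hours_per_day = 4.0
    rate_per_kwh = 0.10            # assumed dollars per kilowatt-hour

    def yearly_cost(watts):
        kwh_per_year = watts * hours_per_day * 365 / 1000.0
        return kwh_per_year * rate_per_kwh

    print(f"CRT: ${yearly_cost(crt_w):.2f}/yr, LCD: ${yearly_cost(lcd_w):.2f}/yr")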
 
markr
 
Reply Fri 5 May, 2006 12:25 am
LCDs now have response times that are faster than the human eye can perceive.
 
 
