Energy Saving (or not!)

peejay

New Member
Messages
23
Reaction score
0
Points
0
A long time ago in a galaxy far far away.

Back in the mists of time, as these words were scrolling up cinema screens across the world, boffins were working hard on creating machines - vastly complex technological feats of the modern age that had the staggering ability to add 1 and 1 together (and, with alarming reliability, get the answer 2 - or 10 in binary - every time).

These fantastic creations could take strings of 0’s and 1’s, and perform staggering tasks with them (so long as each staggering task could be broken down sufficiently to adding 0 and 0, 1 and 0, or 1 and 1).

Standards were established to make life easier (like ASCII), and suddenly these strings of binary digits could be manipulated to form text - then, by virtue of 0’s and 1’s in precisely the right timing, these letters could be displayed on a screen.

To actually achieve anything useful, careful planning was required to make machines that could work through millions of digits a second, and suddenly, the age of the home computer was upon us.

Skip forwards to today. We now enjoy machines with the uncanny ability to do precisely the same thing (ie add 1 and 1 and get 10), but to perform the task at a speed that makes the early home computers look like nothing more than an abacus. However, the basic task being performed is identical.

I remember receiving my first computer for Christmas 1982. It was a 16K Sinclair Spectrum, which was where my adoration of Sinclair design and products all began. It was a modern marvel, with colour, graphics, sound (of a fashion!) and had amazing processing power, with ample memory for most tasks. In the box was the computer, the manuals (back when the manuals were actually filled with useful information), the Horizons starter tape, tape leads, and a 9V transformer. All of this processing power could get all the electricity it required from a single small PSU. There were no hulking heatsinks and no noisy fans - this small black box provided all the power needed.

Skip forwards, once again, to today. I am sitting in front of a very average PC with regard to power - it's an Athlon 3000 complete with a gig of RAM, a couple of hard drives, and other assorted gubbins. The case contains no fewer than six fans in all (3 case fans, a PSU fan, a CPU fan and a GPU fan), all whirring away. And making noise. And using electricity.

The PSU is a 500W affair - a far cry from the 7.2W transformer that the original Speccy required. While this does give me room to manoeuvre on the power front, I feel sure that my system would eat a 300W supply for breakfast.

Now then - basic laws of physics - energy cannot be created or destroyed, only converted from one form to another. Given that all the computer is doing is moving and adding 0's and 1's, the energy used should be next to none (since the energy used in storing a 1 dissipates when it flips back to a 0 anyway). So what is happening to all the electrical energy my computer is consuming? Simple - it is being converted into sound (electrical hum and fan noise) and heat. A lot of heat - hence the need for all the cooling fans.
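For the curious, the conventional explanation for where the heat comes from is CMOS switching: every transistor toggle charges and discharges a tiny capacitance, dissipating roughly P ≈ α·C·V²·f. A rough back-of-envelope sketch - all figures below are illustrative assumptions, not measurements of any real chip:

```python
# Back-of-envelope CMOS dynamic power: P = alpha * C * V^2 * f
# All numbers here are illustrative assumptions, not real chip specs.

def dynamic_power(switched_capacitance_f, voltage_v, frequency_hz, activity=0.2):
    """Dynamic switching power in watts (activity = fraction of gates toggling)."""
    return activity * switched_capacitance_f * voltage_v ** 2 * frequency_hz

# A hypothetical desktop CPU: ~100 nF effective switched capacitance,
# 1.4 V core voltage, 2 GHz clock.
desktop = dynamic_power(100e-9, 1.4, 2.0e9)

# The same hypothetical design scaled for mobile: lower voltage, lower clock.
laptop = dynamic_power(100e-9, 1.0, 1.2e9)

print(f"desktop ~{desktop:.0f} W, laptop ~{laptop:.0f} W")
```

The voltage-squared term is why the laptop figure drops so sharply: shaving the core voltage buys far more than shaving the clock.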

But hang on - where did things go wrong? After all, a laptop doesn't need anywhere near that much power (or your battery wouldn't last long enough to get past the POST screen!) and certainly doesn't require fans to the extent that its drives feel like they are living in a wind tunnel. Yet these miniaturised systems can perform just as well as their bigger desktop brothers, with, in essence, the same components inside them. They also manage it without the luxury of a case full of fresh air for cooling - so something must be radically different.

In my opinion, just as software has got sloppy in its design, so has the hardware it runs on. I remember reading an article many moons ago that claimed the increase in heat was due to chip manufacturers using finer and finer tracks ("wires" to those not in the know) in order to cram more and more into them. Yet doesn't the fact that a laptop is miniaturised even more dispel this theory?

As much as I despise all of the people that ram the global warming arguments down our throats, I would love their pressure to make computer component manufacturers take a good look at their products and start to come up with ways to make them run more efficiently. It would make a refreshing change to go into the local electrical store and be able to buy a PC with an energy efficiency rating, much as you can with other electrical products.

Perhaps when that day comes, we can look forward to ditching the mass of fans, the huge power supplies, and the spiralling electricity bill.

And maybe then I can listen to my MP3s without the constant steady hum of half a dozen fans in the background.
 

Smith6612

I ate all of the x10Pizza
Community Support
Messages
6,518
Reaction score
48
Points
48
Well, I can completely agree with you on your big paragraph. Otherwise, there's most likely no way we can get computers to run better but use less power, since the number of transistors in a processor has always been increasing. Desktops and laptops do get the same performance, I agree, but when you're a gamer you need the power/wattage, since laptops - even the gaming ones - aren't as strong as a desktop, for power and heat reasons. For example, my gaming PC, hand built by me, is a pretty high end system that I upgrade on a regular basis. I have a 1,200 watt power supply installed in it, mainly because of the video cards (3 GeForce 8800 Ultra cards in SLI, soon to be replaced by GTX 280s in a week) and how they suck up power, and all of my hard drives. But basically, you're also right about the energy part, where it cannot be created or destroyed; it's just there. However, like the power plants, they're limited on resources. The recent particle accelerator in Spain might allow us to create nuclear fission, which from what I'm hearing would let us produce power indefinitely, but there's still some work to do to achieve that.
 

charithjperera

New Member
Messages
14
Reaction score
0
Points
0
Lol, the particle accelerator is in Switzerland and France, and it has nothing to do with the nuclear fusion that is meant to provide unlimited energy (ITER is the research fusion reactor).

Anyway, even if we do get virtually unlimited power from fusion, just imagine how wasteful we will become. All energy efficiency regulations will go down the drain and the only thing that will matter is how fast or powerful devices are.

We as humans only act on things when they are an immediate issue. Cars were really inefficient and we wanted the largest trucks we could get while fuel was cheap and seemed unlimited; now we are designing cars like the VW 1L that do 100 km per litre.
In the same way, with fusion we will start burning through all our water and not give a stuff... When the day finally comes that we start to run out, we will really be messed up if we haven't taken over any other planets by then.

This sort of thing makes me question what humans are trying to achieve by advancing technologically.
 

vol7ron

New Member
Messages
434
Reaction score
0
Points
0
Power isn't the only struggle; there's also the ability to reduce friction via lubricants or better designed surfaces. I also think the original post forgot that some of the energy is lost as vibrations in the case.

Nevertheless, there is no way any laptop is capable of outperforming my PC. Second, you are right that laptops are still powerful enough for the home user and basic tasks, but they do run on a battery that holds a lot of stored energy - and the average battery only lasts 1-2 hours, even with the energy-conservation hardware (SpeedStep, etc). Though you forgot to mention that there is hardware in the desktop market that tries to reduce heat and energy - typically built around micro-ATX form-factor motherboards.

Also, I agree with you: the smaller a chipset/circuit board gets, the less energy should be required, and therefore less heat. I'm not sure what you read, but it's possible you misread it.

As for particle accelerators, we had one at my college - nothing better to smash an atom with. Also, nuclear fission plants are no new thing; perhaps you meant some other way of extracting energy, using fusion.

Finally, as for your comments on the 500W vs 7.2W: the power draw isn't constant. If you're doing intensive tasks, your laptop burns through power fast. Power goes to the RAM/CPU/GPU/hard drives/fans/mobo, and high-performance parts require a steady, strong voltage to stay stable. Depending on your setup you may not need that 500W; many people don't. It's not just the wattage that's important, it's the amperage - the +/-3.3/5/12V rails combined with the wattage they can deliver. If a 2kW PSU only had 3.3V rails, what good would it be?
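The rail point is just watts = volts × amps, per rail. A quick sketch - the rail currents below are made-up examples, not any real PSU's spec sheet:

```python
# Per-rail power: watts = volts * amps. A PSU's headline wattage only
# helps if the amps are on the rails your components actually draw from.
# Rail currents below are illustrative, not from a real spec sheet.

rails = {  # volts -> max amps
    3.3: 30.0,
    5.0: 28.0,
    12.0: 34.0,
}

per_rail_watts = {volts: volts * amps for volts, amps in rails.items()}
total = sum(per_rail_watts.values())

for volts, watts in per_rail_watts.items():
    print(f"{volts:>4}V rail: {watts:.0f} W")
print(f"combined: {total:.0f} W")
```

Since modern CPUs and GPUs draw mostly from the 12V rail, a supply with generous 3.3V/5V amps but a weak 12V rail can't feed them - which is exactly the "2kW PSU with only one rail voltage" point above.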
 

peejay

New Member
Messages
23
Reaction score
0
Points
0
Ah, no no no - my point is that the hardware being currently produced seems amazingly inefficient. Okay, so there are a handful of people who have desktop machines that a lappy can't match, but they are in the minority. Let's take a step back in time for a moment .....

Imagine a Pentium machine (the original Pentium, from way back in 1995) - any modern laptop will make it look silly. Take that same Pentium machine and give it the same amount of cooling you find in a laptop (generally a small fan that is active less than 50% of the time, even when maxing it out). No PSU fan, no GPU fan - that's pretty much where an average lappy lies. How long do you think that desktop machine would last before burning out? I reckon only a few minutes, at best. And all that heat is simply wasted energy.

Even for those of you with high powered machines, I can guarantee that in a couple of years, the average lappy will be able to keep up with them - and without all the cooling you have in place in your desktop machines now. Fancy trying to run your desktop without all the fans? I wouldn't recommend it!
 

vol7ron

New Member
Messages
434
Reaction score
0
Points
0
Inefficient for its size? I agree with that. But the custom, enthusiast market is big. People want to mod their computers to not only perform the best, but also look a certain way. This requires cases big enough to accommodate many different types of heat sinks and the ability to change components.

Laptops aren't really customizable after market. Sure you might be able to upgrade your RAM and maybe your vid card, but not much after that. As for power, you're right. Smaller form factor generally requires less juice. But the size of a laptop requires many proprietary parts that would be very expensive to the home user and many of those parts are integrated onto the mobo. You definitely could not run a server-database on a laptop. Laptops are meant for the end-user, whereas desktops these days are what servers used to be less than 10 years ago.

The technology is already out there such that we almost don't need fans at all. I mean, just think about an iPod. You can get an 80GB iPod relatively cheap, and it has an operating system and a graphics engine. Take a step up and you have PDAs or even smart phones. With hard drives embracing SSD technology, we're approaching a computer with no moving parts - aside from the CD/DVD/Blu-ray drive.

Essentially, you have transistors, capacitors, diodes/resistors, and fibreglass, with a constant voltage running along metal connectors. That should be pretty damn efficient; we just need better heat control, as you alluded to.

I stand by my assertion that many people overbuy on the power supply; this is because most home builders don't take advantage of the space in their case. They have bays for multiple hard drives but only buy 1 or 2. They have slots for 2 video cards but only use 1. They have slots for multiple PCI devices (RAID controllers, USB ports, fans, TV tuners, sound cards, etc), but many don't even use a PCI slot.

What companies should really be doing is selling power meters and teaching people how to work out the suggested power for their board, given their hardware, so they know what PSU to buy. Think of that - you plug a device into your USB port and it tells you how much power each part is consuming. Or even better, the mobo manufacturer has drop-downs that tell you the minimum PSU for your listed components.
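That drop-down idea is easy to sketch: sum a typical draw per component and add headroom. The wattage table below is made of rough illustrative figures, not any manufacturer's data:

```python
# Sketch of a "minimum PSU" calculator: sum typical component draw,
# add headroom, round up. All wattages are rough illustrative figures.

TYPICAL_DRAW_W = {
    "cpu_quad": 95,
    "gpu_highend": 170,
    "hdd": 10,
    "optical": 20,
    "ram_stick": 3,
    "mobo_fans_misc": 50,
}

def minimum_psu(components, headroom=1.3):
    """Recommended PSU wattage: total draw plus headroom, rounded up to 50 W."""
    load = sum(TYPICAL_DRAW_W[c] for c in components)
    needed = load * headroom
    return int(-(-needed // 50) * 50)  # ceiling to the next 50 W step

build = ["cpu_quad", "gpu_highend", "hdd", "hdd", "ram_stick",
         "ram_stick", "optical", "mobo_fans_misc"]
print(minimum_psu(build))  # 500
```

Under these assumed figures, a fairly typical single-GPU build lands around a 500 W supply - which is roughly the point made above about people overbuying kilowatt units.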
 

Spartan Erik

Retired
Messages
6,764
Reaction score
0
Points
0
Even for those of you with high powered machines, I can guarantee that in a couple of years, the average lappy will be able to keep up with them - and without all the cooling you have in place in your desktop machines now.

Well of course..

and in that same number of years, the new desktops will outperform the laptops that outperformed the desktops in the previous generation.

Can't defy physics, folks (yet, anyway). We've done all that we can with transistor technology. Other than making them more efficient, we're eventually going to have to move on to a whole new technology entirely.
 

Smith6612

I ate all of the x10Pizza
Community Support
Messages
6,518
Reaction score
48
Points
48
First of all, my 486 box has no fans or heat sinks on the CPU whatsoever, and I can run it maxed out for days and it won't overheat. It only has one fan cooling the whole box, and that is the PSU fan. This 486 box is going to be 20 years old in a couple of years, and it still runs fine.

As for my gaming computer, however, that thing is a power sucker: three GeForce cards in SLI, a quad core CPU, lots of LED lighting, 3 high-RPM 1TB drives, 3GB of Kingston's HyperX RAM, two LCD monitors, a combo of fan sizes, and an optical drive/burner - I need a big supply. Of course, when I overclock the machine, which I do rarely, it uses more power than it normally does.

So yeah, while hardware today is efficient, there's no stopping how much power it's going to use if you want speed. If you want something that uses very little power, get a laptop which is going to be slower in specs to compensate for heat and power usage, or buy yourself a low end budget PC.

Also, all physics work and video encoding/decoding should be directed to the video cards. Video cards use more power but do the job a lot better than the CPU can, even if you have a quad-core/octa-core machine. In Crysis, for instance, since the game can hardly push my quad core system past 30% usage even with physics going like crazy, setting the cpu_physics value to 0 in the game sends the physics to my GPUs, which seriously improves the performance - and of course the framerates, since the program isn't killing its own CPU allowance doing mass physics. Also, CUDA is nice for video as well.
 

fempower

New Member
Messages
145
Reaction score
0
Points
0
You said it. There's no need for all this excess power usage, especially for people who don't do much with their PCs.
 

peejay

New Member
Messages
23
Reaction score
0
Points
0
As for my gaming computer, however, that thing is a power sucker: three GeForce cards in SLI, a quad core CPU, lots of LED lighting, 3 high-RPM 1TB drives, 3GB of Kingston's HyperX RAM, two LCD monitors, a combo of fan sizes, and an optical drive/burner - I need a big supply. Of course, when I overclock the machine, which I do rarely, it uses more power than it normally does.

Ah, but that highlights the point perfectly, showing just how inefficient the hardware of today is. Let me clarify:-

Going back to the principle that energy cannot be created or destroyed, and given that your quad core CPU is doing EXACTLY the same thing that any other CPU would be doing (even if it is doing it quicker), why would it need more power? Granted, the LED lighting uses power, since the electrical energy is converted to light energy, but I see nothing else that warrants wasting electricity. The hard drives will use more power than slower drives (as electrical energy is converted to kinetic energy), but the storage capacity is irrelevant. The RAM - given the conservation of energy principle - should use no more power than any other RAM; the speed of it is irrelevant. Plus, as you have stated, when you overclock it, it uses more power - why? Simple, really: the basic inefficiency means the energy is being wasted as heat, which is why you need a shedload of fans to prevent the machine becoming a molten puddle of silicon.
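There is a conventional answer to the overclocking question: dynamic power scales roughly with frequency times voltage squared, and a higher clock usually needs a voltage bump to stay stable, so power climbs faster than speed does. A tiny sketch with illustrative numbers (not from any particular CPU):

```python
# Why overclocking costs disproportionate power: P scales ~ f * V^2.
# Scaling factors below are illustrative, not from a real CPU.

def relative_power(freq_scale, volt_scale):
    """Power relative to stock, given frequency and voltage scaling factors."""
    return freq_scale * volt_scale ** 2

# A 20% clock bump alone costs 20% more power.
print(relative_power(1.2, 1.0))  # 1.2

# The same bump plus the ~10% voltage increase it typically needs:
# roughly 45% more power for 20% more speed.
print(relative_power(1.2, 1.1))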

Of course, disregarding the fans, and with the exception of moving parts (the hard drive / optical drive when in use / floppy drive - remember them? ;)) and any light sources, the computer is merely taking electrical energy and rearranging it. A perfect system should be able to do this with no heat loss or electrical hum, giving an overall electrical usage of practically zero (however, there is no such thing as a perfect system, even using superconductors).

But, if a laptop can do what a desktop machine can do and generate a fraction of the heat (which is why one small fan can keep the heat under control), then doesn't that suggest that there is something fundamentally wrong with the component design of the desktop machine?
 

Domenico

Member
Messages
117
Reaction score
0
Points
16
Maybe there will be a trend where the average CPU takes more and more power. That would explain why it takes so much to run a Terminator's CPU...

Anyway, over the years companies have been making faster and better computer components, but at the same time ones that run more efficiently. There's a need for the newest and fastest in desktop computers, and a need for the best that takes the least power in laptops and mobile devices.

Over the years there's been a trend that mobile components are equal to their desktop counterparts from a couple of years back.

---

I don't feel everyone's pain in terms of how much you hate wasting electricity, because I live in a place where electricity is really cheap.

Buy a cooling system if you don't like the fans' hums. Most tower cases are compatible.
 

peejay

New Member
Messages
23
Reaction score
0
Points
0
Buy a cooling system if you don't like the fans' hums.

While that is a solution, it does not fix the problem, which is the wasted energy. To put it another way, it is like having a car that does 30 miles to the gallon when you are perfectly well aware that the same (or a similar) car can do 500 miles to the gallon, and rather than making an issue of it, circumventing the problem by buying your own oil refinery.
 

Smith6612

I ate all of the x10Pizza
Community Support
Messages
6,518
Reaction score
48
Points
48
Ah, but that highlights the point perfectly, showing just how inefficient the hardware of today is. Let me clarify:-

Going back to the principle that energy cannot be created or destroyed, and given that your quad core CPU is doing EXACTLY the same thing that any other CPU would be doing (even if it is doing it quicker), why would it need more power? Granted, the LED lighting uses power, since the electrical energy is converted to light energy, but I see nothing else that warrants wasting electricity. The hard drives will use more power than slower drives (as electrical energy is converted to kinetic energy), but the storage capacity is irrelevant. The RAM - given the conservation of energy principle - should use no more power than any other RAM; the speed of it is irrelevant. Plus, as you have stated, when you overclock it, it uses more power - why? Simple, really: the basic inefficiency means the energy is being wasted as heat, which is why you need a shedload of fans to prevent the machine becoming a molten puddle of silicon.

Of course, disregarding the fans, and with the exception of moving parts (the hard drive / optical drive when in use / floppy drive - remember them? ;)) and any light sources, the computer is merely taking electrical energy and rearranging it. A perfect system should be able to do this with no heat loss or electrical hum, giving an overall electrical usage of practically zero (however, there is no such thing as a perfect system, even using superconductors).

But, if a laptop can do what a desktop machine can do and generate a fraction of the heat (which is why one small fan can keep the heat under control), then doesn't that suggest that there is something fundamentally wrong with the component design of the desktop machine?

It might, but don't forget that laptop hardware is made to be mobile. If you were to even try to put desktop grade components in a laptop, I can bet you there is an 80% chance that your battery won't last any more than half an hour using gaming hardware. Mobile components are basically the same as in PCs, but the clock speeds and even the video cards are traditionally half. Take a gaming laptop and a gaming desktop that are nominally equivalent to each other. If you look closely, the laptop will have slower clocked RAM, and not only that, the video card will basically be a stripped down version of the desktop card, to compensate for the heat and power that laptops can't deal with in large amounts. Take the nVidia GeForce 7900GS and the GeForce 7950: the 7900GS is a PCI-E desktop card, and the 7950 is the mobile version of the 7900 built for laptops, both of the same chipset. Looking at the nVidia website, the 7900GS clearly beats the 7950 in both speed and power (core clocks, clock speed, RAM clock, shaders, stream processors, etc) - again to compensate for heat and power usage.

Also look at the processor speeds of laptops vs. desktops. Laptops can only go so high, again to cut down on power usage and heat generation. Personally, I have never seen a laptop processor go above 2.8GHz, and for dual cores, never any above 2GHz per core. I have never seen any quad cores in laptops either. The reason? As a chip becomes more powerful, it has more components in it. While even my quad core uses the same amount of power as, say, a Pentium 4 or a low end dual core, my chip produces more heat, and its draw varies under load the same way an older CPU's would. And of course, if I want to turn up the clock speed - overclocking - I have to feed it more power to compensate, and because of the increased power, yeah, my components are going to start running a little warmer.

Now, I don't know how you're thinking; you might be thinking about basic/power-user use of desktops and laptops. And sure, a laptop will do the job and use less power, but you are sacrificing overall performance. I'm looking at this from a gamer's point of view, since I use the PC for gaming, and even if my desktop is overpowered and a heat generator, I would send any laptop to oblivion with some of my games and uses, simply because laptops use less power and have generally half the performance of a desktop. My desktop will simply overpower a laptop because there is more room for components, and it doesn't have to worry about drawing from a battery and being mobile.
 