Thanks for proving my point again; what's the point of power if it's not utilized 100%?
I'm not trying to prove your point, honestly. As we've all said before, consoles each have a fixed hardware specification, and these days they pretty much all use more than one CPU core, just like modern PCs. And just like on the PC, a console's power isn't fully utilized 100% of the time; that's the nature of any computer program. Take the PS3 for example. Ever since the Cell processor came out and the PS3 went on sale, developers have had a tough time figuring out how to fully exploit the console's potential, especially that of the Cell, and they're still having issues pulling everything out of it. On the Xbox 360, which Microsoft pushes heavily to developers, games offload different tasks such as physics, AI, and the main game loop onto each of the three processor cores, which is essentially multithreading. Now, to my point about developer laziness vs. money: because the Xbox has been such a huge-selling console, developers have more incentive to take that extra step and code proper multithreading support that actually works, since they know that the better the game is designed, the more money they can rake in.
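Just to illustrate what I mean by offloading subsystems onto cores, here's only a rough C++ sketch with made-up subsystem names (not how any actual 360 title is written); the idea looks something like this:

    // Rough sketch only: each big subsystem gets its own worker thread,
    // roughly the way 360 titles split physics, AI, and the main game
    // loop across the three cores. Subsystem names are made up.
    #include <atomic>
    #include <thread>

    std::atomic<bool> running{true};

    void physicsLoop() { while (running) { /* step collisions, rigid bodies */ } }
    void aiLoop()      { while (running) { /* update pathfinding, behaviors  */ } }

    void gameLoop() {
        for (int frame = 0; frame < 1000 && running; ++frame) {
            /* read input, update game state, submit rendering */
        }
        running = false;  // tell the worker threads to wind down
    }

    int main() {
        std::thread physics(physicsLoop);  // could be pinned to core 1
        std::thread ai(aiLoop);            // could be pinned to core 2
        gameLoop();                        // main thread stays put
        physics.join();
        ai.join();
    }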
On the PS3, it's not that the developers are lazy. It's the console's architecture that makes games harder to code for, yet they've still managed to tap quite a bit of the PS3's potential. Since the Xbox 360 has been out for some time and is basically built from PC hardware (some of it custom, such as its ATi GPU), it's essentially easy for a developer to add multithreading support; it's OS + known hardware, not new hardware + new OS like the PS3. So essentially, the performance argument comes down to developers' lack of care for multithreading support on the PC. As I've said before, just slapping on "dual core support" isn't enough for a game, as it barely does a thing, if anything, even with full processor affinity set and a high-end dual-core, quad-core, or i7 chip. If developers can multithread Xbox 360 games and work with whatever the PS3 has, PCs should be just as easy to multithread despite the differences in hardware and software. Developers could support chips as far back as the Pentium 4 HT, which at least presents two logical cores to the OS.
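And to be clear on what I mean by processor affinity: it only tells the OS which core(s) a thread is allowed to run on; it doesn't magically make a single-threaded game use two cores. A rough Windows-only sketch (the core number is just an example I picked):

    // Rough sketch, Windows only: pin the calling thread to core 1.
    // The affinity mask is a bit field: bit 0 = core 0, bit 1 = core 1, etc.
    #include <windows.h>

    int main() {
        DWORD_PTR mask = DWORD_PTR(1) << 1;               // core 1 only (example)
        SetThreadAffinityMask(GetCurrentThread(), mask);
        // ...the game's physics or AI loop would run here...
        return 0;
    }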
So yes, it's pure laziness on the developers' part when PC games don't perform how they should; multithreading makes a huge difference in the framerates of games. But might I also mention, console games typically:
1: Are more optimized
2: Have slightly degraded texture quality
3: Have NO antialiasing
4: Have fairly good anisotropic filtering
5: Use less RAM
The benefits that a PC has over a console are:
1: Modding
2: That untapped power (developer laziness for the most part)
3: The ability to upgrade it cheaply
4: The huge workforce behind a PC
5: Broader support for HID devices without hacking or modding
6: Your OS of choice, as well as better support for downloadable content (if you don't believe me, I'll PM you something to try installing on a console that has The Orange Box)
7: The ability for the user to troubleshoot every aspect of it, from hardware to software issues, as well as "debrick" it and replace/upgrade hardware as needed
And looking back at one of your older posts, you said that the PSU kills your hardware when it dies. You should know the same thing happens with consoles. Unlike a pre-built PC or a console, you control what goes into your own PC; with a console you may never know whether the PSU is a cheapo, a really good unit, or one that's highly inefficient. If you're buying POS power supplies that are known to fry the motherboard every time they croak, or buying from budget OEM builders who put in cheapo hardware (eMachines, anyone?), then I can see why you said that. However, I've had a couple of very durable power supplies fry on me due to brownouts, and not once has any system I've worked on or owned had a component fry. One system of mine that's 11 years old had this happen, and it still runs great to this day, acting as a Linux server/router. All it took was a replacement PSU and it was all set to go, and I can get the parts overnighted as well.
For sending back dead/blown parts that are still under warranty and weren't broken by things like overclocking, a simple RMA is all that's needed, and I can have a new part within 2-3 days tops; no waiting weeks to ship a console off, waiting for it to be repaired, and then being at the mercy of the repair center to ship it back whenever they decide.
I'm not a console hater, but read my previous posts once more; I have enough ground on my end to back up my opinion that "PCs are better for gaming." As for price, sure, consoles are convenient and tend to be cheaper for the gaming value, but once you factor in that HDTV plus overpriced HDMI cables, you're already talking a full-blown mid-range gaming rig in terms of price. Consoles are generally cheaper, though, because they're subsidized by advertisements (ever see the Xbox Live Dashboard? Yes, there are ads, and Netflix is a player in it). PCs from OEMs, however, aren't really subsidized by the crapware that goes into them; you're paying for the hardware + labor + shipping to get the PC to you, and the trialware/crapware revenue goes to the OEM to advertise their machines and so forth.