Social Question


What ever happened to Moore's law? Are computers really improving that much?

Asked by talljasperman (21863points) August 29th, 2010

Based on Moore's Law, computers are getting exponentially better. http://en.wikipedia.org/wiki/Moore's_law

I bought a Core 2 Duo… a 3.00 GHz computer with 4 GB of RAM and 640 GB of hard drive space, two years ago, for $1,350… Now, looking again at computer sale sites, I am not impressed… everything is the same except the computer is 3.33 GHz and runs Windows 7, for $2,500.
Are PCs not getting better anymore? When will we get better computers? Or are we at the max processor speed? If not, what is the max? Where can I get info on new computer developments?


16 Answers

FireMadeFlesh:

Wikipedia (http://en.wikipedia.org/wiki/File:Transistor_Count_and_Moore's_Law_-_2008.svg) says it still holds true. However, in a lecture I attended, Lawrence Krauss said that if it holds true for another 42 years (40 now), then all the energy in the observable universe will have been used up. Expect it to break down any time now.

jerv:

Speed isn’t the best measure of a computer’s power. My Gateway cost less than half what your Core 2 Duo did and has a slightly lower clock speed, but it will drink your rig’s milkshake.

A Core i7 may top out at 3.33 GHz without overclocking, but even the lowest desktop i7 is twice as powerful as my i3, with the high-end ones being closer to five times as powerful.

You don’t need clock speed to be fast at computing, and Moore’s Law is safe for at least another decade.

I almost forgot… CPU benchmarks. The Core i7-980X is nasty-fast and unlocked for overclocking fun. And if you think that’s great, wait until you see what they replace it with next year :D

timtrueman:

The real issue is that computing power has been becoming more and more parallel: instead of one processor running faster and faster, we’re getting more processors, and those processors are more efficient (for example, they have single-cycle instructions for specialized tasks like encoding and decoding that used to take many cycles). The problem is that much of today’s software is still written to run on one processor, and many tasks can’t be broken up into multiple pieces that can be done at the same time. There’s a whole host of problems with scaling up the number of concurrent processes. If one process needs to open a file and write to it, it locks that resource up, and all the other processes usually sit around idle, waiting for their turn.
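The lock-contention point above can be sketched in a few lines of Python. This is a minimal illustration, not anyone's real code: the shared list stands in for the file, and every worker must take the same lock before writing, so the writes happen one at a time no matter how many threads you add.

```python
import threading

lock = threading.Lock()
log = []  # stands in for the shared file every process wants to write to

def worker(worker_id, n_writes):
    for i in range(n_writes):
        with lock:               # every other worker sits idle here
            log.append((worker_id, i))

threads = [threading.Thread(target=worker, args=(w, 100)) for w in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(len(log))  # all 400 writes happen, but strictly one at a time
```

Adding more threads here buys nothing, because the lock serializes the only work being done; that is exactly why some tasks see no benefit from extra processors.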

The bottleneck preventing us from feeling the full effects of Moore’s law is software that cannot take advantage of its hardware, and perhaps even the nature of the tasks we perform not being suited to the design of the hardware. I think hardware will continue to run ahead of software until either humans stop writing software (because AI is doing it, which is a long way off, if ever) or Santa brings me those space Lego sets I always wanted.

jerv:

@timtrueman I am not 100% sure of that. You are correct if we are talking only about Windoze-based PCs, but most of the fastest computers in the world are clusters with multiple CPUs (and by multiple, I mean a hell of a lot more than four), and most of them run an OS that automatically and inherently takes advantage of every clock cycle it can get, no matter how many CPUs.

That miraculous piece of software that can do multi-threading like nobody’s business? Linux!

timtrueman:

@jerv I’m talking about application software. Most server tasks are highly parallel; serving up web pages to tons of people can be easily broken up. Day-to-day applications you use on your home computer aren’t as nice. Sure, there is some parallelization, but if you doubled the number of processors on your computer you wouldn’t see any real performance gain once you have more than two. Of course I agree that operating systems are decent at multitasking. It’s the nature of the processes running within the OS that have trouble being written to take advantage of all the hardware available to them.
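The limit described here is Amdahl's law, though nobody in the thread names it: if only part of a program can be parallelized, the serial remainder caps the speedup. A quick sketch, assuming (hypothetically) that half the work is parallelizable:

```python
def speedup(parallel_fraction, n_processors):
    """Overall speedup by Amdahl's law: the serial part never gets faster."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_processors)

# With 50% of the work serial, more cores quickly stop helping:
for n in (1, 2, 4, 8, 16):
    print(n, round(speedup(0.5, n), 2))
```

With half the work serial, 16 processors give under a 1.9x speedup, and even infinitely many processors can never exceed 2x, which matches the "no real gain past two" observation above.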

jerv:

@timtrueman Ah. I was looking at it from a different angle, specifically, number-crunching.

timtrueman:

@jerv Two perspectives are always better than one—way to tag-team this question! :)

shpadoinkle_sue:

Moore’s Law is too expensive for me. I get a new computer about once every 5 or 6 years.

FireMadeFlesh:

@py_sue That’s a good way to do things, I think. If you get a relatively good one each time, it stays fast enough for that long, and after six years you can get a computer that is eight times as fast for the same money (by Moore’s Law).
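The "eight times as fast" figure follows from the arithmetic of Moore's Law if you assume the commonly quoted two-year doubling period (the exact period is a perennial argument; 18 months is also quoted, which would give 16x):

```python
def moores_law_factor(years, doubling_period_years=2):
    """Performance multiplier after `years`, doubling every period."""
    return 2 ** (years / doubling_period_years)

print(moores_law_factor(6))  # 8.0: six years = three doublings
```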

shpadoinkle_sue:

@FireMadeFlesh Yup. Be good to your electronics and they’ll last you a long time.

Odysseus:

Moore’s law still applies; it’s just that we are not allowed to get our hands on it yet. Google search: quantum computing.

Ben_Dover:

Why not just replace the hard drive every three or four years?

shpadoinkle_sue:

@Ben_Dover Me? I know next to nothing about the inside of computers. I’m not one who needs the new things all the time. It’s also a matter of money. I don’t really spend money on computers. But thanks for the thought. :)

FireMadeFlesh:

@Ben_Dover I need new hard drives more often than that. Hard drives have relatively little to do with speed, though, which is what Moore’s Law is talking about. Processors, motherboards, and in some cases RAM must be bought as a set, since each motherboard can only take a half-dozen or so different processors, so when you want a faster computer you invariably need a new one (or even more expensive water-cooling gear if you want to overclock).

jerv:

@Ben_Dover Because the hardware requirements for software change, and after 5 years, the only way your computer will be even remotely viable is if you spent $$$$$$$$ to get top-tier stuff.

I had a 3.4 GHz P4 with a primo graphics card. Two years later, it was mid-range. After 4 years, it was bottom-rung. After 6, it couldn’t run certain programs.

The computer I replaced it with is three times as fast and powerful, and cost less in its entirety than the graphics card alone from that P4 did.
