Question on Moore's Law?

Anonymous A

New member
Hi, can someone help me with this?

If a program running on a new computer takes 1 minute to solve a problem today, how long would it take on a computer that was purchased new 6 years ago? Hint: use Moore's Law.

a) 8 minutes
b) 6 minutes
c) 4 minutes
d) 2 minutes
e) 16 minutes

The answer is e), but I don't know why. I've tried working backwards but I get 32 minutes instead. I know that Moore's Law states that computing power "DOUBLES every year".

So, if it takes 1 minute to solve the problem today, then 6 years ago it would have taken:

1 × 2 × 2 × 2 × 2 × 2 = 32 minutes

That's how I attempted it, but it's wrong. Can someone please explain it to me?
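As a sanity check on that attempt, here is a minimal sketch (assuming the question means a machine bought 6 years ago and taking "doubles every year" literally): six elapsed years would actually be six doublings, 2^6 = 64 minutes, while the five factors of 2 above give 2^5 = 32, and neither figure appears among the answer choices, which hints that the yearly doubling assumption is what's off.

```python
# Sketch: how long the same 1-minute job would take on a machine bought
# `years` ago, if computing power doubled once per elapsed year (assumed).
years = 6            # assumed gap from the question
time_today_min = 1   # runtime on today's machine

slowdown = 2 ** years                          # one doubling per year -> 2**6 = 64
print(f"{time_today_min * slowdown} minutes")  # 64, not an answer choice
```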
Oh sorry, I thought it doubled every year. So if it doubles every 18 months, would the answer be a) 8 minutes then?

If I solve the question using 24 months I get 16 minutes, but how do I know whether I should use 18 or 24 months??
Never mind, I did it the wrong way round: using 18 months does get me 16 minutes, which is the right answer. Thanks, guys!
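For anyone who wants to verify that final step, here is a minimal sketch (assuming the 6-year gap from the question and treating Moore's Law as a clean exponential): 6 years is 72 months, so an 18-month doubling period gives 72 / 18 = 4 doublings and 2^4 = 16 minutes (answer e), while a 24-month period gives 3 doublings and 8 minutes (answer a).

```python
# Sketch: runtime of the same job on a machine bought 72 months (6 years)
# ago, for the two candidate doubling periods discussed above.
months_elapsed = 72
time_today_min = 1

for doubling_period in (18, 24):                  # months per doubling
    doublings = months_elapsed / doubling_period  # 4 or 3 doublings
    old_time = time_today_min * 2 ** doublings
    print(f"Doubling every {doubling_period} months: {old_time:g} minutes")

# Doubling every 18 months: 16 minutes   <- answer e)
# Doubling every 24 months: 8 minutes    <- answer a)
```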
 
Correct me if I am wrong, but doesn't Moore's Law say power doubles every 18 months?
Edit:
I have heard 18 months, but Wikipedia says:
Moore's law describes a long-term trend in the history of computing hardware. Since the invention of the integrated circuit in 1958, the number of transistors that can be placed inexpensively on an integrated circuit has increased exponentially, doubling approximately every two years.
 