Using Hubble's law to determine the age of the universe - I can't understand the necessary...


Vatu

Guest
...assumption? Hubble's law says that the farther away a galaxy is, the faster it is moving away from us. As I understand it, we can take the distance of a galaxy and divide it by its velocity to estimate the age of the universe. This requires the assumption that the galaxy has always been moving at the same velocity.
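For example (a rough back-of-the-envelope sketch; the value H0 ≈ 70 km/s/Mpc below is my own assumed figure, not something quoted in this thread), dividing any galaxy's distance by its velocity amounts to computing 1/H0:

Python:
# Naive age estimate: t = d / v = 1 / H0, i.e. the "Hubble time".
# Assumed value: H0 ~ 70 km/s per Mpc (illustrative only).
KM_PER_MPC = 3.0857e19      # kilometres in one megaparsec
SECONDS_PER_GYR = 3.156e16  # seconds in one billion years

H0 = 70.0                        # km/s per Mpc
H0_per_second = H0 / KM_PER_MPC  # same rate expressed in 1/s
hubble_time_gyr = 1.0 / H0_per_second / SECONDS_PER_GYR
print(f"Hubble time ~ {hubble_time_gyr:.1f} Gyr")  # roughly 14 Gyr

That ~14 Gyr figure only means "age of the universe" if the velocity really has been constant, which is exactly the assumption I am asking about.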

This is where I get lost. Doesn't this assumption go against Hubble's law itself? When the galaxy was closer, it would have been moving at a slower velocity, so the assumption seems very problematic to me.

My question in a nutshell is: how can we assume that the velocity of receding galaxies has always been constant?
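To make the assumption concrete, here is a toy "coasting" picture of what I mean (all the numbers are made up for illustration): every galaxy keeps its velocity forever and starts from the same point at t = 0, and I just print v/d at a few snapshots.

Python:
# Toy "coasting" model: each galaxy keeps a constant velocity.
# At any snapshot, d = v * t, so the ratio v/d is the same (1/t) for
# every galaxy, even though no individual velocity ever changes.
velocities = [100.0, 200.0, 300.0]   # arbitrary units, one value per galaxy

for t in (1.0, 2.0, 5.0):            # three snapshots in time
    for v in velocities:
        d = v * t                    # distance travelled since t = 0
        print(f"t={t}: v={v}, d={d}, v/d={v / d:.2f}")

If I read this right, the ratio in this toy model is 1/t: it falls as the universe ages, even though each individual galaxy's velocity stays fixed.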
Thanks for the answer, Mo Fayed.

I am still confused, though. The acceleration I am referring to isn't from a force acting on the galaxy, but from the expansion of space itself. In the raisin-bread analogy, all the raisins are accelerating away from any given raisin, not because a force is acting on them but because the bread itself is expanding. I'm not saying you are wrong; I'm sure you understand it better than I do. I am just trying to understand this confusing idea.
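Here is the one-dimensional version of the picture I have in mind (the raisin positions and the growth rate are made-up numbers, and which raisin counts as "home" is arbitrary):

Python:
# Raisin-bread sketch in one dimension: the loaf stretches uniformly, so
# every raisin sees every other raisin receding with a speed proportional
# to its distance, no matter which raisin is chosen as the observer.
raisins = [0.0, 1.0, 2.0, 5.0]   # positions along the loaf, arbitrary units
growth_rate = 0.1                # fractional stretch of the loaf per unit time

home = raisins[1]                # pick any raisin as "home"
for r in raisins:
    distance = r - home                  # signed separation from home
    speed = growth_rate * distance       # uniform stretching: v = H * d
    print(f"raisin at {r}: separation {distance:+.1f}, speed {speed:+.2f}")

The sign only says which direction along the loaf; every separation grows, and the farthest raisins' separations grow fastest.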
Thanks, μηδεƦ ßετα τεςτιηg.

I think that was a good answer, but now I feel even more confused.
One more thing: wouldn't that unknown force simply be gravity?
 
To change the velocity of a speeding galaxy would require a force acting on it. There doesn't seem to be an obvious one, but some recent observations have suggested that there is, in fact, an unknown force acting to slow down galaxies. Nobody has yet explained what it might be or how it might work.
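A standard textbook illustration of this point (the particular model and numbers here are my own addition, not part of the observations mentioned above): if such a decelerating influence dominated, the true age would be shorter than the naive distance-divided-by-velocity estimate. For a matter-dominated model, $a(t) \propto t^{2/3}$, so $H = \dot a / a = 2/(3t)$ and the age comes out as $t_0 = 2/(3H_0) \approx 9.3$ Gyr (taking $H_0 \approx 70$ km/s/Mpc).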
 