I am buying a car from a dealership and was told that it still has the original factory warranty, and judging from the mileage and the year, it should. However, a friend just told me that when a car is sold to a dealer, the original factory warranty often ends. Does the factory warranty transfer to the new owner regardless of whether the car is purchased from a dealership or from a private individual?