Your friend drives a car down the street in front of your house, and you use your watch to measure how fast he's going. You start timing as he passes a point a block away and stop as he passes in front of you, counting 30 seconds for him to cover the 740 feet. The time on your watch is uncertain by ±0.6 seconds, and the distance is known only to ±2 feet. What is the standard error in your measurement of the car's velocity? Give your answer to two decimal places.
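
A minimal sketch of the calculation in Python, assuming the distance and time errors are independent, so that for a quotient v = d/t the relative errors add in quadrature (standard error propagation):

```python
import math

# Measured values and their uncertainties (assumed independent)
d, sigma_d = 740.0, 2.0   # distance in feet, +/- 2 ft
t, sigma_t = 30.0, 0.6    # time in seconds, +/- 0.6 s

v = d / t  # velocity in ft/s

# For v = d/t with independent errors:
#   sigma_v / v = sqrt((sigma_d / d)^2 + (sigma_t / t)^2)
sigma_v = v * math.sqrt((sigma_d / d) ** 2 + (sigma_t / t) ** 2)

print(f"v = {v:.2f} ft/s, standard error = {sigma_v:.2f} ft/s")
# Prints: v = 24.67 ft/s, standard error = 0.50 ft/s
```

Note that the timing uncertainty dominates: the relative error in time (0.6/30 = 2%) is much larger than in distance (2/740 ≈ 0.27%), so the distance uncertainty contributes almost nothing to the result.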