An object slides off a frictionless horizontal table of height h with an initial speed v_0 and lands a horizontal distance R from the base of the table. In terms of R, how far would the object land from the base of a table twice as high?
I tried to use the quadratic kinematic equation, but I didn't know how to set situation one equal to situation two. I also didn't know what to solve for or how to get R into the equation.
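One way to frame it (a sketch of the standard projectile-motion setup, not the only route): the vertical drop fixes the time of flight, and the horizontal motion at constant speed $v_0$ turns that time into the range. Writing both pieces out,

$$h = \tfrac{1}{2} g t^2 \quad\Rightarrow\quad t = \sqrt{\frac{2h}{g}}, \qquad R = v_0 t = v_0 \sqrt{\frac{2h}{g}}.$$

With the table twice as high, the same expression with $2h$ in place of $h$ gives

$$R' = v_0 \sqrt{\frac{2(2h)}{g}} = \sqrt{2}\; v_0 \sqrt{\frac{2h}{g}} = \sqrt{2}\, R.$$

So R enters once both situations are written with the same $v_0$; taking the ratio $R'/R$ cancels everything except the factor of $\sqrt{2}$.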