A ball is gently dropped from a height of 20 m. If its velocity increases uniformly at the rate of 10 m s^{−2}, with what velocity will it strike the ground? After what time will it strike the ground?


#### Solution

Distance covered by the ball, *s* = 20 m

Acceleration, *a* = 10 m s^{−2}

Initially, velocity, *u* = 0 (since the ball was initially at rest)

Final velocity of the ball with which it strikes the ground, *v*

According to the third equation of motion:

*v*^{2} = *u*^{2} + 2 *as*

*v*^{2} = 0 + 2 (10) (20) = 400 m^{2}/s^{2}

*v* = √400 = 20 m/s

According to the first equation of motion:

*v* = *u* + *at*

The time *t* taken by the ball to strike the ground is obtained from:

20 = 0 + 10 (*t*)

*t* = 2 s

Hence, the ball strikes the ground after 2 s with a velocity of 20 m/s.
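The two steps above can be checked numerically. A minimal sketch in Python (variable names are illustrative, not from the source):

```python
import math

# Given values from the problem
s = 20.0   # distance fallen, in m
u = 0.0    # initial velocity, in m/s (ball dropped from rest)
a = 10.0   # uniform acceleration, in m/s^2

# Third equation of motion: v^2 = u^2 + 2as
v = math.sqrt(u**2 + 2 * a * s)

# First equation of motion: v = u + at, rearranged for t
t = (v - u) / a

print(f"Final velocity: {v:.0f} m/s")        # 20 m/s
print(f"Time to strike ground: {t:.0f} s")   # 2 s
```

The same rearrangement, t = (v − u)/a, works for any uniformly accelerated motion once v is known.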

Concept: Equations of Motion by Graphical Method - Derivation of Velocity - Time Relation by Graphical Method
