A diver initially moving horizontally with speed v dives off the edge of a vertical cliff and lands in the water a distance d from the base of the cliff. How far from the base of the cliff would the diver have landed if the diver initially had been moving horizontally with speed 2v?

A. d
B. √2 d
C. 2d
D. 4d
E. It cannot be determined unless the height of the cliff is known


Answer:

Option C is correct

Explanation:

"The time is determined by the vertical distance. The formula is sqr(2d/a) = t. There still is no acceleration in the horizontal direction."

For the first dive:

horizontal distance = d

t = √(2h/g)

horizontal speed = v

For the second dive:

horizontal distance = d₁ = ?

t = √(2h/g)

horizontal speed = 2v

Since the height is the same, the fall times are the same, so equate them:

t = d/v = d₁/(2v)

2v·d = v·d₁ (cross-multiplying; the v's cancel)

d₁ = 2d.

You go twice as far as you did before.

Option C is correct
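
As a quick numerical check, here is a minimal Python sketch of the same reasoning (the cliff height h and speed v below are arbitrary placeholder values; only the ratio of the two landing distances matters):

```python
import math

g = 9.8   # acceleration due to gravity, m/s^2
h = 30.0  # arbitrary cliff height, m (placeholder value)
v = 5.0   # arbitrary initial horizontal speed, m/s (placeholder value)

# Fall time depends only on the vertical drop: t = sqrt(2h/g)
t = math.sqrt(2 * h / g)

# Horizontal distance = horizontal speed * fall time
d1 = v * t        # landing distance at speed v
d2 = (2 * v) * t  # landing distance at speed 2v

print(d2 / d1)  # prints 2.0 -- doubling the speed doubles the distance
```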

If the diver initially had been moving horizontally with speed 2v, the horizontal distance would be 2d.

The given parameters:

horizontal velocity of the diver = v

horizontal distance of the diver = d

The distance traveled horizontally when the diver is moving at 2v is calculated as follows:

The initial horizontal distance is given as:

d = vt

The fall time t is set by the height of the cliff alone, so it is unchanged when the speed is doubled. With speed 2v:

d₂ = (2v)t = 2(vt) = 2d

Thus, if the diver initially had been moving horizontally with speed 2v, the horizontal distance would be 2d.
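
The same point rules out option E: the factor of 2 holds no matter how tall the cliff is. A small sketch checking this (the heights in the list are arbitrary placeholder values):

```python
import math

g = 9.8  # acceleration due to gravity, m/s^2
v = 5.0  # arbitrary initial horizontal speed, m/s (placeholder value)

# The ratio d2/d1 comes out the same for every cliff height, so the
# answer does not require knowing the height.
for h in (5.0, 20.0, 80.0):      # arbitrary cliff heights, m
    t = math.sqrt(2 * h / g)     # fall time for this height
    d1, d2 = v * t, 2 * v * t    # landing distances at v and 2v
    print(f"h = {h:5.1f} m  ->  d2/d1 = {d2 / d1}")  # always 2.0
```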

Learn more here: https://brainly.com/question/24527971