Answer:
About 0.39 seconds
Step-by-step explanation:
Speed = distance/time, so time = distance/speed.
Distance = 60 ft 6 in = 60.5 ft
Speed = 105 mph = 105 × 5280 ft per hour = 554400 ft per hour = 154 ft per second
Time = 60.5 / 154 ≈ 0.39 seconds
Answer:
The ball will take 0.39 seconds to travel from the pitcher to the batter.
Step-by-step explanation:
Distance between the pitcher's mound and home plate is 60 feet and 6 inches.
60 feet and 6 inches = (60 + 0.5) feet
= 60.5 feet
The fastest recorded pitch is 105 miles per hour.
Since 1 mile = 5280 feet
Then 105 miles = 105 × 5280 feet
= 554400 feet,
so the speed of the pitch is 554400 feet per hour.
Now we know Speed = [tex]\frac{\text{Distance}}{\text{Time}}[/tex]
Time = [tex]\frac{\text{Distance}}{\text{Speed}}[/tex]
= [tex]\frac{60.5}{554400}[/tex]
= [tex]1.09\times 10^{-4}[/tex] hours
Or [tex]1.09\times 10^{-4}\times 3600[/tex] seconds
= 0.39 seconds
Therefore, the ball will take 0.39 seconds to travel from the pitcher to the batter.
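As a cross-check, here is a minimal Python sketch of the same unit conversion and division (the variable names are illustrative, not from the original answer):

# Time for a 105 mph pitch to travel 60 ft 6 in (60.5 ft)
distance_ft = 60 + 6 / 12          # 60 feet 6 inches = 60.5 feet
speed_mph = 105                    # pitch speed in miles per hour
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600
speed_ft_per_s = speed_mph * FEET_PER_MILE / SECONDS_PER_HOUR  # 154 ft per second
time_s = distance_ft / speed_ft_per_s
print(f"{time_s:.2f} seconds")     # prints 0.39 seconds

Converting the speed to feet per second first avoids the tiny intermediate value in hours; either route gives the same ≈0.39 seconds.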