Q:

Radio signals travel at a rate of 3 × 10^8 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of the Earth if the satellite is orbiting at a height of 3.6 × 10^7 meters?

A.) 8.3 seconds
B.) 1.2 × 10^–1 seconds
C.) 1.08 × 10^16 seconds
D.) 10.8 × 10^15 seconds

Accepted Solution

A:
In this problem we are given the speed of the signal and the distance it travels, and we are asked to find the travel time. In this case, the expression to apply is
time = distance / speed

= (3.6 × 10⁷ m) / (3 × 10⁸ m/s) 

= (3.6 / 3) × 10⁷⁻⁸ s 

= 1.2 × 10⁻¹ s 

t = 0.12 s, so the answer is B.
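As a quick sanity check, the same arithmetic can be done in a few lines of Python (variable names here are just illustrative):

```python
# Signal speed (m/s) and satellite altitude (m), taken from the problem.
speed = 3e8        # 3 × 10^8 m/s
distance = 3.6e7   # 3.6 × 10^7 m

# time = distance / speed
t = distance / speed
print(t)  # 0.12 seconds, i.e. 1.2 × 10^-1 s (choice B)
```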