A radar station detects an incoming missile. At first contact, the missile is found to be a distance d = 57.5 miles away from the radar dish, at an angle of θ = 30.0° from the horizon. After 2.50 seconds, the missile is detected a distance d = 17.5 miles away from the radar dish, at an angle of θ = 70.0° from the horizon.

(a) Calculate the average velocity of the missile during this time in Cartesian form (x and y components). Use a coordinate system where the +x direction is to the right, the +y direction is up, and the origin is at the location of the radar dish. Give your answer in units of miles per second.
In Cartesian coordinates, taking the missile to lie on the +x side of the dish (toward the right), its position at the first instant is
x₁ = d₁ cos θ₁ = (57.5 mi) cos 30.0° = 49.8 mi,  y₁ = d₁ sin θ₁ = (57.5 mi) sin 30.0° = 28.8 mi.

The position of the missile after 2.50 seconds is
x₂ = d₂ cos θ₂ = (17.5 mi) cos 70.0° = 5.99 mi,  y₂ = d₂ sin θ₂ = (17.5 mi) sin 70.0° = 16.4 mi.

The average velocity is the displacement divided by the elapsed time:
v_avg,x = (x₂ − x₁)/Δt = (5.99 mi − 49.8 mi)/2.50 s ≈ −17.5 mi/s,
v_avg,y = (y₂ − y₁)/Δt = (16.4 mi − 28.8 mi)/2.50 s ≈ −4.92 mi/s.
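The arithmetic above can be checked with a short script. This is only a minimal sketch of the polar-to-Cartesian conversion and the average-velocity calculation; it assumes, as in the worked steps, that the missile sits on the +x side of the dish (if the original figure places it to the left, the x components change sign). The variable names are illustrative, not part of the original problem.

```python
import math

# Known data from the problem statement
d1, theta1 = 57.5, math.radians(30.0)   # first detection: distance in miles, angle above horizon
d2, theta2 = 17.5, math.radians(70.0)   # second detection, 2.50 s later
dt = 2.50                               # elapsed time in seconds

# Convert polar (d, theta) to Cartesian components (assumes missile is on the +x side)
x1, y1 = d1 * math.cos(theta1), d1 * math.sin(theta1)
x2, y2 = d2 * math.cos(theta2), d2 * math.sin(theta2)

# Average velocity = displacement / elapsed time
vx = (x2 - x1) / dt
vy = (y2 - y1) / dt

print(f"r1 = ({x1:.1f}, {y1:.1f}) mi")
print(f"r2 = ({x2:.2f}, {y2:.1f}) mi")
print(f"v_avg = ({vx:.1f}, {vy:.2f}) mi/s")   # roughly (-17.5, -4.92) mi/s
```

Both components come out negative, which is consistent with an incoming missile: it is moving toward the dish (in the −x direction) and descending (in the −y direction).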