Question
As we read in the book, a star that appears to be 1 magnitude brighter will have approximately 2.5 times as much flux hitting an observer's detector, telescope, or eye (e.g., a star with an apparent magnitude of 4 delivers approximately 2.5 times more flux to the detector than a star with an apparent magnitude of 5). With this in mind, what is the approximate ratio of the flux hitting a detector from a star with an apparent magnitude of 7 compared to a star with an apparent magnitude of 13? (Hint: remember that magnitudes follow a logarithmic scale, not a linear one.)
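The key step the hint points at: because each step of 1 magnitude multiplies the flux by the same factor (about 2.512, since by definition 5 magnitudes correspond to exactly a factor of 100), a 6-magnitude difference multiplies the flux by that factor six times rather than adding six times 2.5. A minimal Python sketch of the computation, assuming the standard relation F1/F2 = 100^((m2 - m1)/5); the variable names are illustrative:

```python
# Flux ratio between two stars from their apparent magnitudes,
# using the standard relation: a 5-magnitude difference is
# exactly a factor of 100 in flux, so F1/F2 = 100**((m2 - m1) / 5).
m_bright = 7    # apparent magnitude of the brighter star
m_faint = 13    # apparent magnitude of the fainter star

ratio = 100 ** ((m_faint - m_bright) / 5)
print(f"Flux ratio (bright/faint): {ratio:.1f}")
```

Running this gives a ratio of roughly 251, i.e., 2.512 raised to the 6th power, so the magnitude-7 star delivers about 250 times more flux to the detector than the magnitude-13 star.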