Problem 4.3.90


Astronomers measure the apparent brightness of a star by a unit called the apparent magnitude. This unit was created in the second century B.C., when the Greek astronomer Hipparchus classified the relative brightness of several stars. In his list, he assigned the number 1 to the stars that appeared to be the brightest (Sirius, Vega, and Deneb); these are first-magnitude stars. Hipparchus assigned the number 2 to all the stars in the Big Dipper; these are second-magnitude stars. The following table shows the relationship between a star's brightness relative to a first-magnitude star and the star's apparent magnitude. Notice from the table that a first-magnitude star appears to be about 2.51 times as bright as a second-magnitude star.

Apparent magnitude    Brightness relative to a first-magnitude star
1                     1
2                     1/2.51 ≈ 0.398
3                     1/2.51^2 ≈ 0.159
4                     1/2.51^3 ≈ 0.0632
5                     1/2.51^4 ≈ 0.0252
6                     1/2.51^5 ≈ 0.0100

The following logarithmic function gives the apparent magnitude M(x) of a star as a function of its brightness x:

M(x) = -2.51 log x + 1,   0 < x ≤ 1

a. Use M(x) to find the apparent magnitude of a star that is … as bright as a first-magnitude star. Round to the nearest hundredth.

b. Find the apparent magnitude of a star that is … as bright as a first-magnitude star. Round to the nearest hundredth.

c. Which star appears brighter: a star with an apparent magnitude of 12 or a star with an apparent magnitude of 15?

d. Is M an increasing function or a decreasing function?
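
As a quick check on the formula, here is a minimal sketch in Python; the helper name apparent_magnitude is hypothetical, and the input 1/2.51 is only an illustrative brightness, since the specific fractions for parts a and b are not given above.

import math

def apparent_magnitude(x: float) -> float:
    # M(x) = -2.51 log x + 1, defined for brightness 0 < x <= 1
    if not 0 < x <= 1:
        raise ValueError("brightness x must satisfy 0 < x <= 1")
    return -2.51 * math.log10(x) + 1

# Illustrative input (not the value from part a or b): a star 1/2.51 as bright
# as a first-magnitude star lands near magnitude 2, matching the table above.
print(round(apparent_magnitude(1 / 2.51), 2))  # -> 2.0

# Part c: M decreases as brightness x increases, so a smaller magnitude means
# a brighter star; apparent magnitude 12 appears brighter than 15.
# Part d: log10 is increasing and its coefficient (-2.51) is negative,
# so M is a decreasing function on (0, 1].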
