From Slate:
Japan's unfolding nuclear disaster has introduced Americans to the confusing practice of measuring radiation exposure. According to some stories, the water near the No. 2 reactor at Fukushima has a radioactivity level of 1,000 millisieverts per hour. But other articles describe radiation levels in terms of millirem per year. And a few sources have referred to exposure in terms of millirad or nanogray per hour. Why don't all radiation experts just use the same unit?

Because some people are afraid to switch to the metric system. As with distance, weight, and temperature, doses of radiation can be expressed in either SI units (sieverts) or U.S. customary units (rem). U.S. scientists and engineers in most fields had switched to metric units by 1964, when the National Bureau of Standards (now the National Institute of Standards and Technology) officially adopted the international system. But nuclear physicists never made the full switcheroo. That's because a wholesale change in measurement could lead to mistakes, at least during the transition, and even a small mistake can be very dangerous when it comes to radiation exposure. (There is a historical argument for being cautious: in 1999, NASA lost contact with the Mars Climate Orbiter because of a mix-up between metric and customary units.) On the basis of this concern, the U.S. Nuclear Regulatory Commission still requires plants to report radiation releases in rem, while the rest of the world uses sieverts. For the record, one rem is equivalent to one-hundredth of a sievert.
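The conversion itself is simple arithmetic: 1 sievert = 100 rem (and likewise 1 gray = 100 rad). Here's a minimal Python sketch, with function names of my own invention purely for illustration, that applies that factor to the dose rate reported above:

    def sieverts_to_rem(sieverts):
        # Standard conversion factor: 1 Sv = 100 rem
        return sieverts * 100.0

    def rem_to_sieverts(rem):
        # Inverse: 1 rem = 0.01 Sv
        return rem / 100.0

    # Example: the 1,000 millisieverts/hour reported near the No. 2 reactor
    dose_rate_msv_per_hr = 1000.0
    dose_rate_sv_per_hr = dose_rate_msv_per_hr / 1000.0  # 1 Sv/h
    print(sieverts_to_rem(dose_rate_sv_per_hr), "rem per hour")  # prints 100.0

In other words, the 1,000 mSv/h figure from the news reports is 100 rem per hour in the units the NRC still requires.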