The linear theory used to extrapolate the cancer risk of radon exposure from the high levels where direct data are available down to the low levels encountered in homes is tested by comparing lung cancer rates, m, with average radon levels, r, in numerous U.S. states and counties. It is shown that most problems normally associated with ecological studies do not apply here. The data show a very strong tendency for lung cancer rates, corrected for smoking prevalence, S, to decrease with increasing r, in sharp contrast to the increase predicted by the theory. Even a perfect negative r-S correlation could not explain this discrepancy, and the actual r-S correlations are only a few percent. Several other possible explanations for the discrepancy are explored, but none can reduce it by more than about 25%.
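The core of the test described above is an ecological regression: after removing the component of lung cancer mortality attributable to smoking prevalence, one asks whether the residual rate rises with radon level (as linear extrapolation predicts) or falls (as the data show). The following is a minimal sketch of that procedure on entirely synthetic county-level data; the variable names m, r, and S follow the abstract, but the numbers, the linear form of the smoking correction, and the data-generating assumptions are illustrative only, not the paper's actual method or dataset.

```python
import numpy as np

# Synthetic county-level data (illustrative only, not the study's dataset):
# r = mean home radon level, S = smoking prevalence, m = lung cancer rate.
rng = np.random.default_rng(0)
n = 50
r = rng.uniform(0.5, 4.0, n)          # radon levels, arbitrary units
S = rng.uniform(0.15, 0.35, n)        # smoking prevalence (fraction)
# Generate mortality with a strong positive smoking effect and, to mimic
# the reported ecological finding, a negative radon slope plus noise.
m = 10.0 + 200.0 * S - 2.0 * r + rng.normal(0.0, 1.0, n)

# "Correct" m for smoking by regressing out S with ordinary least squares,
# then examine how the residuals trend with r. This linear adjustment is a
# simplified stand-in for the paper's smoking correction.
A = np.column_stack([np.ones(n), S])
coef, *_ = np.linalg.lstsq(A, m, rcond=None)
m_corrected = m - A @ coef

slope, intercept = np.polyfit(r, m_corrected, 1)
print(f"slope of smoking-corrected rate vs radon: {slope:.2f}")
```

On this synthetic data the fitted slope comes out negative, reproducing the qualitative pattern the abstract describes; the abstract's further point is that no plausible r-S correlation in the real data can convert such a negative slope into the positive one the linear theory predicts.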