Purpose: Bowtie filters are commonly employed in CT scanners to reduce radiation dose by equalizing intensity across detector elements in the presence of patient anatomy. This filtration modifies several x-ray beam properties (effective energy, flux, first- and second-order statistics), making them non-uniform across the fan-beam field of view. Because these phenomena are not usually included in analyses of CT performance, this presentation quantifies the influence of bowtie filters on sinogram measurements and demonstrates their effect in reconstructed images.

Method and Materials: A model developed for energy-integrating x-ray detectors, extended to include bowtie filters, was used to compute signal means, variances, and beam quality for a realistic scanner configuration. Experimental measurements were acquired with cylinder phantoms placed off-isocenter, allowing the objects to sample different portions of the fan beam. Using measured bowtie filter profiles and clinical CT patient scan data, simulated dose-reduction images were generated to show the visual effects. This approach was used to study a novel dose-reduction method, region-of-interest (ROI) scanning, in which full intensity is applied only to a local volume of interest while surrounding tissue receives a significantly lower dose.

Results: The dominant effect of the bowtie filter is to increase noise in the periphery of the image field. Variation of the effective energy introduces a small nonlinearity, which can be effectively corrected through calibration. Differences in second-order statistics are at the threshold of observer detection. ROI scanning achieves very good local image quality while reducing dose to surrounding tissue.

Conclusion: When properly implemented, bowtie filters reduce patient exposure with minimal image degradation. Their impact on CT signals is significant and must be accounted for in accurate modeling of the scanning process.
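The abstract does not give the detector model's equations. As a rough illustration of why a thicker bowtie path at the fan-beam periphery both hardens the beam and raises relative noise, the sketch below propagates a hypothetical x-ray spectrum (all numbers invented for illustration, not taken from the presentation) through a filter via Beer-Lambert attenuation, and applies the standard compound-Poisson relations for an energy-integrating detector: for Poisson counts with mean λ(E), the integrated signal S = Σ E·n(E) has mean Σ E·λ(E) and variance Σ E²·λ(E).

```python
import math

# Hypothetical 4-bin spectrum: (energy in keV, incident photon count).
# Purely illustrative values, not measured data.
SPECTRUM = [(40, 1.0e5), (60, 2.0e5), (80, 1.5e5), (100, 0.5e5)]

def mu_filter(energy_kev):
    """Crude aluminum-like linear attenuation coefficient (1/cm).
    Rough E^-3 falloff; illustrative only."""
    return 1.25 * (40.0 / energy_kev) ** 3

def detector_stats(bowtie_thickness_cm):
    """Mean signal, signal variance, and mean transmitted energy for an
    energy-integrating detector behind a bowtie path of given thickness."""
    mean = var = counts = 0.0
    for e, n0 in SPECTRUM:
        lam = n0 * math.exp(-mu_filter(e) * bowtie_thickness_cm)  # Beer-Lambert
        mean += e * lam          # E[S]   = sum E * lambda(E)
        var += e * e * lam       # Var[S] = sum E^2 * lambda(E)
        counts += lam
    return mean, var, mean / counts  # mean energy as a beam-hardening proxy

# Thin filter path at isocenter vs. thick path at the fan-beam edge.
m_c, v_c, e_c = detector_stats(0.2)
m_e, v_e, e_e = detector_stats(2.0)
```

Under these assumed numbers, the thicker peripheral path yields a higher mean transmitted energy (beam hardening, i.e. the effective-energy shift the abstract says calibration can correct) and a larger relative noise sqrt(Var)/Mean, matching the reported dominant effect of increased noise at the periphery.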