Purpose: The acquisition of multiple images carrying varied yet complementary information is rapidly becoming the norm in radiotherapy treatment planning. Besides CT data sets, PET and/or MRI or MRS images are also being used to aid in the definition of the target volume (or normal structures) for treatment optimization. We have been developing new methods to integrate the available imaging information (anatomical and/or physiological) for concurrent target registration and segmentation. Toward this goal, we have investigated several clustering and active-contour methods for simultaneous 2D/3D segmentation/registration of multi-modality images consisting of combinations of PET, CT, and MRI data sets. In this work, we present a phantom validation study of this approach.

Method and Materials: A commercial anthropomorphic head phantom of average human size was used. Targets consisting of plastic spheres and rods were placed throughout the cranium section of the phantom. Tap water was used for CT imaging; for MRI and PET imaging, the water inside the phantom was doped with CuNO3 and 18F-FDG, respectively. The cold-spot spheres (four spheres, each 25.4 mm in diameter) were taken as the segmentation targets, and the rods were used as landmarks to assist in alignment. A geometric deformable model known as the multi-valued level set (MVLS) method served as our computational vehicle.

Results: The MVLS algorithm achieved 90% segmentation accuracy and less than 2% volume error when integrating all three modalities, compared with 74% segmentation accuracy and 4.4% volume error when using CT alone.

Conclusion: We have validated a semi-automatic method for integrating information from different imaging modalities. Our phantom study demonstrates the feasibility of the proposed image-integration approach.
This approach could potentially provide radiologists/oncologists with reliable and efficient tools to simultaneously analyze multi-modality data across different cancer sites. This work was supported by grant ACS-IRG-58-010-50.
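To illustrate the flavor of a multi-valued level set evolution, the sketch below shows a single explicit update step for a geodesic-style level set whose speed combines edge information from several co-registered image channels. This is a minimal illustration of the general idea, not the authors' implementation: the edge indicator, weights, and balloon term are standard textbook choices, and the function and parameter names are hypothetical.

```python
# Illustrative multi-channel level-set step (NOT the published MVLS code):
# the contour speed is a weighted sum of per-modality edge indicators,
# and the contour evolves under curvature plus a balloon force.
import numpy as np

def edge_indicator(img, eps=1e-8):
    """g = 1 / (1 + |grad I|^2): small near edges, near 1 in flat regions."""
    gy, gx = np.gradient(img.astype(float))
    return 1.0 / (1.0 + gx**2 + gy**2 + eps)

def curvature(phi, eps=1e-8):
    """Mean curvature of the level sets of phi (divergence of grad phi / |grad phi|)."""
    py, px = np.gradient(phi)
    norm = np.sqrt(px**2 + py**2) + eps
    nxx = np.gradient(px / norm, axis=1)
    nyy = np.gradient(py / norm, axis=0)
    return nxx + nyy

def multichannel_step(phi, images, weights, dt=0.1, c=1.0):
    """One explicit evolution step: speed = sum_i w_i * g(I_i),
    force = (curvature + balloon term c), advected along |grad phi|."""
    g = sum(w * edge_indicator(im) for w, im in zip(weights, images))
    py, px = np.gradient(phi)
    grad_norm = np.sqrt(px**2 + py**2)
    return phi + dt * g * (curvature(phi) + c) * grad_norm
```

In practice the weights would balance the relative reliability of each modality (e.g., CT edges versus PET uptake gradients), and the step would be iterated to convergence from an initial signed-distance contour.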