We present a new non-uniform sampling method for the accurate estimation of mutual information in multi-modal rigid registration of brain images. Most density estimators used to compute mutual information incorrectly assume that the intensity of each voxel is independent of its neighborhood. Our method uses the 3D Fast Discrete Curvelet Transform to reduce the interdependency of the sampled voxels by selecting voxels that are less dependent on their neighborhoods, yielding a more accurate estimate of the mutual information and, in turn, a more accurate registration. The main advantages of our method over other non-uniform sampling schemes are that: (1) it estimates the image statistics more accurately with fewer samples; (2) it is less sensitive to variability in the shapes, orientations, and sizes of anatomical structures; and (3) it yields more accurate registration results. Extensive evaluation on 1000 synthetic registrations between T1- and T2-weighted clinical MRI images, and on 20 real clinical registrations of brain CT images to Proton Density (PD), T1-, and T2-weighted MRI images from the public RIRE database, shows the effectiveness of our method. Our method has the lowest mean registration errors recorded to date for CT-MR image registration on the RIRE website among methods tested on more than five datasets. These results indicate that our sampling scheme can be used to achieve the more accurate multi-modal registration required for image-guided therapy and surgery.
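
To make the sampling idea concrete, below is a minimal sketch of mutual information estimated from a non-uniform voxel sample. It uses only NumPy/SciPy: since a 3D Fast Discrete Curvelet Transform is not available there, a Laplacian-of-Gaussian magnitude stands in as a hypothetical saliency map guiding the sampling, and the names `saliency_sample` and `mi_from_samples` are illustrative, not from the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

def saliency_sample(fixed, n_samples, rng):
    """Draw voxel indices with probability proportional to a saliency map.

    The LoG magnitude below is a stand-in for the curvelet coefficient
    magnitudes the paper uses to find voxels less dependent on their
    neighborhoods (an assumption for this sketch).
    """
    sal = np.abs(gaussian_laplace(fixed.astype(float), sigma=1.0)).ravel()
    p = sal / sal.sum()
    return rng.choice(fixed.size, size=n_samples, replace=False, p=p)

def mi_from_samples(fixed, moving, idx, bins=32):
    """Mutual information from a joint histogram of the sampled intensities."""
    f = fixed.ravel()[idx]
    m = moving.ravel()[idx]
    joint, _, _ = np.histogram2d(f, m, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of fixed image
    py = pxy.sum(axis=0, keepdims=True)   # marginal of moving image
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# Placeholder volumes; real use would load co-registered T1/T2 (or CT/MR) data.
rng = np.random.default_rng(0)
fixed = rng.random((64, 64, 64))
moving = fixed + 0.1 * rng.random((64, 64, 64))
idx = saliency_sample(fixed, n_samples=5000, rng=rng)
print(mi_from_samples(fixed, moving, idx))
```

In a registration loop, `mi_from_samples` would be evaluated at candidate rigid transforms of the moving image and maximized; the non-uniform sample is what lets a small `n_samples` still give a stable estimate.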