High Signal-to-Noise Ratio
Signal-to-Noise Ratio, or SNR, is defined as:
$$\mathrm{SNR} = \frac{\mu_{\text{signal}}}{\sigma_{\text{signal}}}$$

where $\mu_{\text{signal}}$ is the mean signal and $\sigma_{\text{signal}}$ is its standard deviation.
The signal in the image is how "bright" the image is at each voxel, or volumetric pixel (i.e. a pixel with depth). This signal, the grayscale value in the image, is directly related to the attenuation of X-rays at that specific volume in the sample: a denser or higher atomic number composition attenuates more X-rays and appears "brighter" in the microCT image. The standard deviation of the signal is a measure of the noise in the signal. Therefore, the signal-to-noise ratio is essentially the signal divided by the noise, as the name implies.
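As a minimal sketch of how this definition is applied in practice, the snippet below estimates SNR from a nominally uniform region of a reconstructed volume. The use of NumPy, the `region_snr` helper, and the synthetic volume standing in for real microCT data are all assumptions for illustration, not part of the original text.

```python
import numpy as np

def region_snr(volume, region_slices):
    """Estimate SNR from a nominally uniform region of a reconstructed volume.

    volume        : 3D numpy array of grayscale (attenuation) values
    region_slices : tuple of slices selecting a homogeneous region,
                    e.g. a patch of background or a single-phase material
    """
    region = volume[region_slices].astype(np.float64)
    signal = region.mean()        # mean grayscale value ("brightness")
    noise = region.std(ddof=1)    # standard deviation of the signal
    return signal / noise

# Hypothetical example: a synthetic 100^3-voxel volume with Gaussian noise
rng = np.random.default_rng(0)
volume = 1000.0 + 50.0 * rng.standard_normal((100, 100, 100))
patch = (slice(40, 60), slice(40, 60), slice(40, 60))
print(f"SNR ~ {region_snr(volume, patch):.1f}")  # roughly 1000 / 50 = 20
```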
To improve SNR, one needs to either increase the signal in the image or decrease the noise. A number of scanning parameters can be changed to accomplish this, but usually at a cost to spatial or temporal resolution. Unlike spatial or temporal resolution, SNR can also be improved after the image is collected with artifact-correction and denoising filters. Caution must be exercised when using these options, though, because they can fundamentally alter the data or bias later analysis. When possible, it is usually best to optimize SNR with scanning parameters rather than relying entirely on image-processing algorithms.
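To illustrate this trade-off, here is a hedged sketch of post-acquisition denoising using a median filter from SciPy's `ndimage` module on a synthetic noisy slice. The choice of filter, the synthetic data, and the SNR values produced are illustrative assumptions, not a recommendation from the source.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
# Hypothetical noisy slice: a uniform phase of value 1000 with sigma = 50 noise
noisy = 1000.0 + 50.0 * rng.standard_normal((256, 256))

def snr(img):
    # Mean signal divided by its standard deviation (the noise)
    return img.mean() / img.std(ddof=1)

# A small median filter suppresses random noise, but the same smoothing
# also blurs genuine fine features in real data.
denoised = ndimage.median_filter(noisy, size=3)

print(f"SNR before filtering: {snr(noisy):.1f}")
print(f"SNR after  filtering: {snr(denoised):.1f}")
```

The filter raises the measured SNR in a uniform region, but because it alters the underlying voxel values it can bias subsequent analysis, which is why optimizing scanning parameters first is preferred.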