Log R deviation
The Log R Deviation (also called the Log R Ratio standard deviation) quantifies the variability of the signal intensity across the SNP markers on an array, i.e., the sample's noise level.
Log R deviation is one of the key metrics used to determine array sample quality, alongside call rate.
Lower values indicate more consistent signal intensities. A high Log R Deviation can indicate a poor-quality sample or potential issues with CNV calling.
The value is displayed to three decimal places.
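As a minimal sketch of how this metric is typically computed, the snippet below takes the standard deviation of per-SNP Log R Ratio values for one sample. The values are illustrative, not real data, and the variable names are assumptions for this example.

```python
import numpy as np

# Hypothetical Log R Ratio values for a handful of SNP markers
# from one array sample (illustrative only).
lrr = np.array([0.02, -0.05, 0.11, -0.08, 0.03, 0.00, -0.04, 0.06])

# Log R Deviation: the standard deviation of the per-SNP Log R Ratios.
log_r_deviation = np.std(lrr)

# Shown to three decimal places, matching the display convention above.
print(f"{log_r_deviation:.3f}")
```

A lower result indicates more consistent signal intensities across markers; an unusually high result flags a noisy sample.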