It depends on how it's been measured. Have a look at this:
Contrast ratio is the difference between the brightest and darkest parts of an image, measured in discrete steps, at any given moment. Generally, the higher the contrast ratio, the more realistic the image is. Contrast ratios for plasma displays are often advertised as high as 20,000:1. On the surface, this is a significant advantage of plasma over other display technologies.

Although there are no industry-wide guidelines for reporting contrast ratio, most manufacturers follow either the ANSI standard or perform a full-on-full-off test. The ANSI standard uses a checkered test pattern whereby the darkest blacks and the lightest whites are simultaneously measured, yielding the most accurate "real-world" ratings. In contrast, a full-on-full-off test measures the ratio using a pure black screen and a pure white screen, which gives higher values but does not represent a typical viewing scenario. Manufacturers can further artificially improve the reported contrast ratio by increasing the contrast and brightness settings to achieve the highest test values. However, a contrast ratio generated by this method is misleading, as content would be essentially unwatchable at such settings.
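To see why the two methods give such different numbers, here's a minimal sketch of both calculations. All luminance readings are made-up illustrative values (not from any real display measurement): the ANSI figure averages white and black checkerboard patches measured simultaneously, while the full-on/full-off figure divides one all-white reading by one all-black reading taken separately, where nothing bright on screen raises the black floor.

```python
# Hypothetical luminance readings in cd/m^2 (illustrative values only).

# ANSI method: white and black patches of a checkerboard pattern,
# measured at the same time, then averaged per color.
ansi_white_patches = [210.0, 205.0, 208.0, 212.0, 207.0, 209.0, 211.0, 206.0]
ansi_black_patches = [0.40, 0.42, 0.38, 0.41, 0.39, 0.40, 0.43, 0.37]

ansi_ratio = (sum(ansi_white_patches) / len(ansi_white_patches)) / (
    sum(ansi_black_patches) / len(ansi_black_patches)
)

# Full-on/full-off method: one all-white screen, then one all-black screen.
# The all-black reading is far darker because no lit areas leak light into it.
full_on = 215.0   # all-white screen
full_off = 0.02   # all-black screen

on_off_ratio = full_on / full_off

print(f"ANSI contrast ratio:    {ansi_ratio:.0f}:1")   # roughly 521:1
print(f"Full-on/full-off ratio: {on_off_ratio:.0f}:1") # 10750:1
```

The same hypothetical panel reports roughly 500:1 one way and over 10,000:1 the other, which is why spec-sheet numbers are meaningless without knowing the measurement method.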
Plasma is often cited as having better black levels (and contrast ratios), although both plasma and LCD have their own technological challenges. Each cell on a plasma display has to be precharged before it is due to be illuminated (otherwise the cell would not respond quickly enough), and this precharging means the cells cannot achieve a true black. Some manufacturers have worked hard to reduce the precharge and the associated background glow, to the point where black levels on modern plasmas are starting to rival those of CRTs. With LCD technology, black pixels are generated by a light-polarization method, and the panel is unable to completely block the underlying backlight.