
Thursday, October 21, 2004
NeuNeo HVD108 DVD Player - Page 3


Entering The 'High-Definition-Television Information Maze'...

How does digital television (DTV) differ from the traditional, analog TV, with which we all grew up?
Most of us have never questioned the technology hidden behind the TV picture tube; nor given any thought to how our favorite TV programs show up in our living rooms. Without getting into a lot of "hi-tech-stuff," (at least we'll try not to) let's start by looking at the TV set we've all been familiar with for the past 50+ years.

Standard-analog or NTSC TV (named for the National Television Standards Committee) transmits the picture as a raw analog waveform, with no coded information to protect it. A major problem with analog signals is that many things can interfere between the transmitter and your TV set, distorting the picture you see. From a plane flying far overhead, to hilly terrain, tall buildings and even atmospheric conditions - all are obstacles that can disrupt the signal.

Also, the strength of the analog signal is critical; a weaker signal can cause "snowy" and distorted pictures - something with which those of us who remember "pre-cable/satellite" are too familiar.

The Digital Difference

Digital TV signals are made up of coded instructions - the same 'bits' of ones and zeros that make your computer work, and give life to your CDs and DVDs - which are transmitted to your digital receiver (aka tuner, decoder or set-top box, "STB"), which in turn deciphers the code.
Your receiver isn't concerned with the signal strength, or what conditions exist between you and the transmitter. As long as the signal gets to the receiver, it can read the code and reproduce a near-perfect picture.

Ghostly Images...

A distinct advantage of digital broadcasting is that bad reception becomes a thing of the past. One reason cable TV caught on is that it delivers clear TV pictures regardless of the viewer's location; viewers don't have to keep adjusting an antenna in an attempt to tune in a clear picture.

DTV eliminates "snow" and "ghosting" caused by the weak signals from distant or blocked transmitting towers. If your analog television set is not receiving a strong, undistorted signal from the tower, you will not get a perfectly clear picture.

Both digital and analog television signals get weaker the farther they travel away from the transmitting tower. On an analog TV, the picture slowly deteriorates from bad to worse for more distant receivers. However, the picture on a digital set will stay perfect until the signal becomes too weak for the receiver to distinguish between a (1) and a (0), at which point the image disappears completely.

You could compare this to sending Morse Code. As long as the person at the other end can make out the dots and dashes being transmitted they will be able to read the message. Once they lose the distinction between a dot and a dash they lose the message. Digital TV acts the same way; instead of sending dots and dashes, it sends millions of (1's) and (0's) every second. The bottom line ... you either receive a 100% perfect quality image, or nothing at all.
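The all-or-nothing behavior the Morse-code analogy describes - often called the "digital cliff" - can be sketched in a few lines of Python. (The 0.2 decode threshold is an invented figure for illustration, not a real receiver specification.)

```python
def analog_quality(signal_strength):
    """Analog picture quality fades smoothly as the signal weakens."""
    return max(0.0, min(1.0, signal_strength))

def digital_quality(signal_strength, threshold=0.2):
    """Digital is all-or-nothing: a perfect picture while the 1s and 0s
    can still be told apart, a blank screen below that point.
    (The threshold value here is invented for illustration.)"""
    return 1.0 if signal_strength >= threshold else 0.0

for s in (1.0, 0.5, 0.25, 0.1):
    print(f"strength {s}: analog {analog_quality(s):.2f}, "
          f"digital {digital_quality(s):.2f}")
```

Analog degrades gradually with the signal; digital stays perfect until the threshold, then drops to nothing.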

What this means to you, the digital, or high-definition-television viewer, is that you will never get a "bad" picture. Either you have a picture or you don't. However, if you are receiving over-the-air or "off-air" (OTA) broadcasts, it is crucial for the antenna to be accurately directed towards the signal source - the transmitting tower of the station you are watching.

Note: The focus, up to this point, has been on DTV - digital television. So what is HDTV - High-Definition-Television? And what is the difference between DTV and HDTV?

Add an "H" to 'DTV' and... WOW! You WILL know the difference!

Okay, we stated that DTV differs from NTSC-analog TV in the technology used to transmit the signal, plus some of the reasons why digital is better. High-Definition-Television IS digital television...
But it's IMPORTANT to understand that NOT all Digital TV is High-Definition-Television!
The HDTV formats are just a few of the (18) ATSC-approved formats that comprise Digital TV.

In High-Definition-Television, the picture displayed on your television screen is digitally transmitted, but it must also meet the "ATSC" (Advanced Television Systems Committee) standards for high definition in order for it to be "true" HDTV.
(Note: Unfortunately, you may confront misinformation claiming that any 'digital TV' is 'high-definition television'... NOT true! Or you may be told that as long as the TV meets one or two criteria it's the same as HDTV... again NOT true!
Remember - "Trust, BUT VERIFY!" Be sure that any TV set you are considering is 'true' High-Definition-Television.)

How Do You Know It's HD - TV?

A 'Bit' of Basic Technology...

(Only a "wee-bit" - Promise! But after all, we are discussing Digital-High-Definition-Television; possibly the most significant 'leap forward' in consumer-technology to impact our society in more than a century!)

The image you see on your television screen is comprised of a series of horizontal lines. An electron gun 'shoots' beams of electrons that strike a layer of phosphor on the inside surface of the picture tube, causing it to glow. These glowing lines create the image displayed on your TV screen. How the lines are formatted, what resolution is used and which standards are met determine the type of television picture you are receiving.


Basically, TV 'resolution' refers to how many horizontal lines are displayed on your TV screen. (Although it is the horizontal lines that are counted, this is usually referred to as "vertical resolution" because the lines are counted from top to bottom - or vertically).

Note: Resolution is sometimes expressed as the total 'pixel' count, which is a product of the number of lines and number of pixels per line - we will cover 'pixels' in more detail later on.
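As a rough illustration of resolution expressed as a total pixel count, here is a quick Python sketch. (The grid sizes are the commonly published figures for each format, not values taken from this article.)

```python
# Commonly published pixel grids for each format - resolution expressed
# as the product of line count and pixels per line:
formats = {
    "480i/480p (SDTV)": (640, 480),    # 4:3, square-pixel equivalent
    "720p (HDTV)":      (1280, 720),
    "1080i (HDTV)":     (1920, 1080),
}
for name, (width, height) in formats.items():
    print(f"{name}: {width} x {height} = {width * height:,} pixels")
```

The jump from roughly 300 thousand pixels to over 2 million is the heart of the "HD difference" discussed below.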

HDTV Resolution compared to NTSC-Analog

Interlaced and Progressive Scan

There are two methods that can be used to display the lines on the screen - either "Interlaced" or "Progressive Scan." The NTSC adopted the interlaced standard as a form of compression that achieves higher resolution using less costly circuitry. The NTSC-analog standard is "525 lines, interlaced, at 30 fps" (frames per second), which may be written 525-i/30 fps. However, only 480 of those lines make up the visible image; the remaining lines carry picture-synchronization information and are never seen. For this reason, the stated resolution usually refers only to the visible lines, i.e. 480-i/30 fps.

In the interlaced method, the 480 lines are created in two fields (phases). The scan rate for these (2) fields is 60 Hz (60 times per second). In phase #1, the first 1/60th of a second, 240 lines (the odd-numbered lines - 1, 3, 5, etc.) are scanned onto the tube. In the second 1/60th of a second (phase #2), the remaining 240 even-numbered lines are scanned. Thus each field of 240 lines is scanned 30 times a second, producing one complete frame (30) times per second (1/60 second x 2 fields = 2/60 second = 1 complete frame, 30 times per second). It's the total lines per image that indicates the resolution of the system, i.e. 525i or 480i.
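The field-and-frame arithmetic above can be checked with a few lines of Python:

```python
SCAN_RATE_HZ = 60        # each field takes 1/60 of a second
LINES_PER_FIELD = 240    # odd lines in one pass, even lines in the other
FIELDS_PER_FRAME = 2

visible_lines = LINES_PER_FIELD * FIELDS_PER_FRAME   # 480 lines per image
frame_rate = SCAN_RATE_HZ / FIELDS_PER_FRAME         # 30 complete frames/s
print(visible_lines, frame_rate)   # prints: 480 30.0
```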

Other consumer formats offer lower effective resolutions: VHS players come in with a poor showing of only about 240 lines, while DVDs deliver the full 480 visible lines.

Progressive Scan System

Digital TV also has formats that use the interlaced system; however, DTV adds another system, called "Progressive Scan." A progressive system scans the total number of lines 60 times a second, rather than half at a time as in interlacing. This means the complete image is displayed on your TV screen twice as often, which produces smoother motion, fewer motion artifacts and none of the visible "flicker." A progressive-scan system with 480 lines of resolution is written "480p."
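A minimal sketch of the interlaced-versus-progressive difference, counting complete images per second at the same 60 Hz scan rate:

```python
def images_per_second(scan_rate_hz, interlaced):
    # an interlaced set needs two passes (two fields) per complete image;
    # a progressive set paints every line on every pass
    return scan_rate_hz // 2 if interlaced else scan_rate_hz

print(images_per_second(60, interlaced=True))    # 480i: 30 complete images/s
print(images_per_second(60, interlaced=False))   # 480p: 60 complete images/s
```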

High-Definition-Television - vs. - Standard Definition Television

As already stated, the ATSC has assigned (18) formats to Digital TV. At the current time, in HDTV we are primarily concerned with just two: 1080-i and 720-p. "True" High-Definition-Television may have, either 1080 interlaced lines, or 720 progressive-scanned lines. (Higher resolutions may be introduced in the future; for instance 1080-p, which is not currently used because of high manufacturing costs).

Digital broadcasts in 480-i or 480-p are classified as "SDTV" (Standard Definition). SDTV has a sharper, crisper picture than NTSC-analog TV because the transmitted signal is digital. SDTV can be either (480i) or (480p), but is more often 480p. On smaller (direct-view) TV sets, 480p is noticeably better than analog 480-i, but on the much larger projection sets, SDTV cannot compare to High-Definition-Television's 720p or 1080i formats.

Note: 1080i displays more lines and thus delivers more information. This produces better "spatial resolution" - producing sharper pictures when the image is "still" or has little motion. Manufacturers have generally preferred the Interlaced format because more lines of resolution can be delivered with less bandwidth, resulting in lower costs.

Many viewers, including those in the computer world, prefer the 720p format because its full-frame progressive scanning reproduces fast-moving action and graphics without blurring the image. Thus, 720p is said to have better "temporal" resolution. Incidentally, if you have a computer system - doesn't everyone? - your monitor uses progressive scan.

Both sides in this (i/p) debate are dug-in, with ardent supporters and aggressive detractors abounding on each side. But which is the better system is a subjective determination. You might say "it's in the eye of the beholder."

Each television network has individually selected the particular DTV format it uses for transmitting High-Definition-Television broadcasts. For instance, ABC uses 720p, while CBS transmits in 1080i, and FOX has elected to transmit only SDTV (480-p). The important point for you, regardless of which High-Definition-Television system you buy, is to be sure it is capable of "up-converting" or "down-converting," enabling you to view all transmitted signals in your set's designated (native) format.

Picture Pixels

High-Definition-Television displays pictures that contain significantly more detail, resulting in much 'crisper' pictures. Images viewed on TV screens are made up of small picture elements known as 'pixels.' Each of these pixels is made up of three closely spaced 'dots' of color - red, blue and green.
Combined together on the TV's phosphor screen, and viewed from a distance, the colors are seen as one. The phosphor at each dot emits light in direct proportion to the intensity of the electron beam that strikes it as it scans across the screen.

On traditional NTSC TVs, 256 levels of intensity are possible for each of the three colors. The result is a range of about 16.8 million colors for each pixel. The pixels in the analog system are slightly taller than they are wide. Get up close to an analog screen - especially the larger projection sets - and you can easily see the red, blue and green rectangles. This is one reason distortion is sometimes visible on traditional NTSC TVs.
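The 16.8 million figure is easy to verify: three colors with 256 intensity levels each give 256 cubed combinations.

```python
levels_per_color = 256                 # intensity steps for red, green and blue
total_colors = levels_per_color ** 3
print(f"{total_colors:,}")             # prints: 16,777,216
```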

The pixels in HDTV sets are square; they are also smaller, and spaced closer together. There can be (4 1/2) HDTV pixels in the same space that a single NTSC pixel requires. The result is that High-Definition-Television can display at least 4.5 times more detail than NTSC-analog TV.

What Is "Aspect Ratio"?

...And Why Is "Wide-Screen" TV The DTV Standard?

DTV sets are sold in two 'Aspect Ratios.' Aspect Ratio refers to the ratio between the horizontal (width) measurement and the vertical (height) measurement of the screen. This ratio is also used in reference to how the picture is transmitted and displayed on the screen.

The two aspect ratios used in DTV are (4:3) and (16:9). That is, (4) units wide by (3) units high, and (16) units wide by (9) units high respectively.
Your NTSC-analog television has an aspect ratio of (4:3); the screen appears almost 'square' because it has just slightly more width than height. For instance, a (4') wide screen would have a height of (3').

Digital Television's (16:9) 'wide-screen' is approximately (1/3) larger than a comparable (4:3) set. As a comparison - if you have a "wide-screen' set measuring (16) units wide by (9) units high, then a comparable (4:3) set would measure (12) units wide by (9) units high.
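The "one-third larger" comparison can be checked with a little arithmetic, using the same unit measurements as above:

```python
height = 9
wide_screen_area = 16 * height     # a 16:9 screen: 144 square units
standard_area = 12 * height        # a 4:3 screen of the same height: 108 square units
extra = wide_screen_area / standard_area - 1
print(f"{extra:.0%}")              # prints: 33%
```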

The ATSC adopted the 16:9 "wide-screen" aspect ratio as the standard for Digital-HDTV, because significantly more information can be displayed on the screen. And Wide-screen DTV/HDTV sets appear more 'rectangular' than the familiar 'square' shape of your NTSC-analog set.
(Don't confuse this with screen "size," which is the screen's diagonal measurement)

Why "Wide-Screen"?...
(Some background, and a little evolution...)

Believe it or not, the 4:3 aspect ratio was originally developed by W.K.L. Dickson in 1889, while working at the famed Thomas Edison laboratory. Running experiments with the Kinetoscope (motion-picture) system, he made the decision to create his film frames (1") wide and (3/4") high. This ratio soon became the standard of the film industry, and in 1941, when the NTSC proposed the standards for broadcast television, it had no reason not to adopt the same 4:3 ratio used by the film industry.

In the 1950s, Hollywood found it needed to give the public a specific reason to buy movie tickets, when it was easier for them to sit at home and enjoy free television. Besides trying innovations like "3-D," studios experimented with the aspect ratio; "CinemaScope" was one of the early wide-screen formats that can still be seen today.

The reasoning that led to wide-screen formats is simply that the wider view is closer to the human field of vision. Because the viewer is visually drawn more into the action with wide-screen, the enjoyment level is enhanced.

Our vision is optimized within a 30-degree field of view. We see details best within the center area of this field, while our peripheral vision is better at detecting motion. Beyond 30 degrees there is no visible benefit.

The familiar 4:3 ratio allows us only a 10-degree field of vision. In the theater, 'wide-screen' formats were easy to reproduce by using more or less of the area projected on the screen, as needed. However, as movies were displayed on TV screens, and later made into videos, the aspect ratio became more complicated. Initially, movies were 'cropped' to 'fit' 4:3 analog-TV sets.

This is accomplished by a process called "pan and scan," which involves moving the 4:3 viewing area back and forth, to center the scene on the primary action. While pan and scan is okay if nothing is occurring in the peripheral areas, often, important information in these areas is cut off. In addition, pan and scan may not give the viewer the same "feel" that the original film had, because the scene is not actually seen as the movie director intended.
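The pan-and-scan process can be sketched as a simple windowing calculation. (This is a hypothetical helper for illustration; the function name and the idea of a single "action center" per scene are assumptions, not how studio equipment actually works.)

```python
def pan_and_scan(film_width, film_height, action_center_x):
    """Pick the 4:3 window that keeps the main action on screen,
    clamped so the window never slides off the edge of the frame."""
    window_w = film_height * 4 / 3                       # width of a 4:3 crop
    left = action_center_x - window_w / 2                # center on the action
    left = max(0.0, min(left, film_width - window_w))    # stay inside the frame
    return left, left + window_w

# a CinemaScope-style frame 2.35 units wide and 1 unit high,
# with the action centered near the left edge at x = 0.5:
print(pan_and_scan(2.35, 1.0, 0.5))
```

Everything outside the returned window - roughly 43% of this frame - is simply cut off, which is exactly the loss the article describes.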

To enable movies to be viewed in their original wide-screen aspect, the 'letter-box' process was developed. With 'letter-boxing,' the picture's height is reduced, thus allowing the full width of the image to fit the TV screen. This enables you to see the entire scene the way it was filmed.

However, reducing the image height requires removing some information that leaves a portion of the vertical area 'blank' - this appears as a black 'bar' on the screen. The image is displayed in the center of the screen, with the blank area divided into two, horizontal, black 'bars' across the top and bottom of the screen. These bars increase or decrease, as the aspect ratio changes. However, 'letter-boxed' movies, originally filmed in extra-wide format, can be especially troublesome when viewed on smaller TV screens, due to the extremely reduced viewing area.

'Letter-box' or 'Window-box'?

It's important to understand that the digital signal can be transmitted in either the (4:3) or (16:9) aspect ratio. (Although it's alleged that all DTV will be broadcast in (16:9) at some future date)
Your DTV will be able to display both aspect ratios - regardless of which ratio is 'native' to your set. When you watch a program that is transmitted in (4:3) aspect ratio, on a (16:9) screen, the image will be "window-boxed" - centered on the screen with vertical black bars ('gray' on some models) on both sides. When you watch a (16:9) program on a (4:3) screen, it will appear "letter-boxed" (previously described) with horizontal bars across the top and bottom of the screen.
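The sizes of the letter-box and window-box bars follow directly from the aspect-ratio arithmetic. A small Python sketch (the function names are illustrative):

```python
def letterbox_bar_height(screen_w, screen_h, image_w=16, image_h=9):
    """Total height of the black bars when a wider image is letter-boxed."""
    return screen_h - screen_w * image_h / image_w

def windowbox_bar_width(screen_w, screen_h, image_w=4, image_h=3):
    """Total width of the side bars when a narrower image is window-boxed."""
    return screen_w - screen_h * image_w / image_h

# a 16:9 program on a 4:3 set measuring 16 x 12 units:
print(letterbox_bar_height(16, 12))   # prints: 3.0  (split top and bottom)
# a 4:3 program on a 16:9 set measuring 16 x 9 units:
print(windowbox_bar_width(16, 9))     # prints: 4.0  (split left and right)
```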

"Used under permission granted by - copyright 2004"
