I'm sure there are better explanations for these various things, but this is what I've learned about the topics. Enjoy! 
1080i vs 720p:
There is a real difference between 1080i and 720p.
P = progressive: the whole picture is drawn in a single pass, which makes motion look smoother. The trade-off with 720p is resolution (the number of pixels on screen): 1280 x 720. Good, but not great.
I = interlaced: the image is drawn in two halves, alternating lines, but so fast that you normally can't tell. Even though interlacing is the inferior technique, 1080i has more pixels, 1920 x 1080, so fine detail usually looks sharper; it's just not as smooth in motion.
For example, primetime broadcast television is almost always in 1080i. That's a big part of why it looks so much better on a screen whose native resolution is 1920 x 1080. It's the same idea as an LCD monitor: try a resolution other than the native one, and it still looks good, but not as sharp. Televisions scale non-native signals better than most monitors do, but the principle is similar.
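If it helps to picture interlacing, here's a minimal sketch (my own illustration, not anything a TV literally runs) of "weaving" two half-height fields back into one full 1080-line frame:

```python
import numpy as np

# A 1080i frame arrives as two "fields": one carrying the even-numbered
# scan lines and one carrying the odd-numbered lines, each 1920 x 540.
HEIGHT, WIDTH = 1080, 1920

even_field = np.random.rand(HEIGHT // 2, WIDTH)  # lines 0, 2, 4, ...
odd_field = np.random.rand(HEIGHT // 2, WIDTH)   # lines 1, 3, 5, ...

# "Weave" deinterlacing: interleave the two fields into a full frame.
frame = np.empty((HEIGHT, WIDTH))
frame[0::2] = even_field
frame[1::2] = odd_field

# A 720p frame, by contrast, arrives whole in one pass.
progressive = np.random.rand(720, 1280)

print(frame.shape, progressive.shape)  # (1080, 1920) (720, 1280)
```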
Source Does Make a Difference:
Also, the source makes a huge difference. A Blu-ray (1080p) downscaled to 720p and watched on your TV or computer, for example, is almost always going to look better than a broadcast that's sent in 720p. There's more information to start with, and less of it is lost along the way. In reality, broadcast television, cable, etc. aren't pristine 720p or 1080i: the stream is heavily compressed to fit the broadcaster's bandwidth, and the transmission path adds its own degradation. Some loss is inevitable. If you had a fiber-optic cable running directly from the source to your house, there would be a lot less of it, but that's not the case for most people: fiber typically runs to a neighborhood junction box, and a copper cable runs from there to your house. Metal wiring picks up line noise; shielding helps, but only so much.
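Rough numbers make the point. A back-of-the-envelope sketch (the bitrates are my own illustrative assumptions, not official specs):

```python
# Pixels per frame for each format.
pixels_1080 = 1920 * 1080  # 2,073,600
pixels_720 = 1280 * 720    # 921,600

print(pixels_1080 / pixels_720)  # ~2.25x more source detail to downscale from

# Illustrative bitrates (my assumptions): a Blu-ray video stream can run
# roughly 25-40 Mbps, while a broadcast HD channel is often squeezed
# well below that to fit the available bandwidth.
bluray_mbps = 35
broadcast_mbps = 12
print(bluray_mbps / broadcast_mbps)  # ~3x more data per second from disc
```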
Checking the Signal:
If you want to know what the signal is, just look at your box. I have Comcast, and it tells me whether a channel is coming in at 720p, 480i, or 1080i. 480i is the non-HD channels. Even that beats the old analog numbers: VHS managed around 240 lines and analog broadcast around 330. LaserDiscs were about 425 lines of resolution. DVD is 480i or 480p, depending on whether you have a progressive-scan DVD player. Today, of course, Blu-ray players upscale DVDs to 1080p, which is a great example of what I mean about the quality of the source. A DVD can't magically become true 1080p, but it can look a lot smoother than plain 480p because of better upscaling technology.
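Upscaling is just interpolation: it can make the picture smoother, but it can't invent detail that was never recorded. A minimal sketch of the idea (nearest-neighbor, the crudest method; real players use much fancier filters):

```python
import numpy as np

def upscale_nearest(frame: np.ndarray, new_h: int, new_w: int) -> np.ndarray:
    """Nearest-neighbor upscale: every output pixel just copies the
    closest source pixel, so no new detail is created."""
    h, w = frame.shape
    rows = np.arange(new_h) * h // new_h
    cols = np.arange(new_w) * w // new_w
    return frame[rows][:, cols]

dvd_frame = np.random.rand(480, 720)             # a 480p DVD-ish frame
upscaled = upscale_nearest(dvd_frame, 1080, 1920)
print(upscaled.shape)  # (1080, 1920) -- bigger, but same information
```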
3D Channels:
Comcast 3D channels are 897 and 898. I have the sports package, and 898 (ESPN 3D) is not a part of it. Bummer. However, they do have free 3D stuff in On Demand. Naturally, your television or monitor has to be 3D capable or it won't work. In other words, it has to be able to deliver two separate images, one per eye, to get the 3D effect. Sometimes the broadcast packs them side by side; some 3D broadcasts use top-and-bottom packing. My television will auto-detect 3D and switch over; otherwise, I set it to manual and it tells me when to hit the 3D button. It can also convert any 2D image to 3D, but that's nowhere near as good as a true 3D source. Still, if the image is sharp, a 2D-to-3D conversion will have more depth than a regular 2D image. BTW, my glasses are active (battery-powered shutter glasses). I think active is superior to passive (unpowered polarized glasses), unless you're talking about IMAX 3D, and even that isn't as uniform as active 3D, IMHO.
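For a concrete picture of what "side-by-side" and "top-and-bottom" mean, here's a minimal sketch (my own illustration) of how a set might split a packed 3D frame into the two eye views:

```python
import numpy as np

def split_3d_frame(frame: np.ndarray, packing: str):
    """Split a packed 3D frame into (left_eye, right_eye) views."""
    h, w = frame.shape[:2]
    if packing == "side-by-side":
        # Each eye gets half the horizontal resolution.
        return frame[:, : w // 2], frame[:, w // 2 :]
    if packing == "top-and-bottom":
        # Each eye gets half the vertical resolution.
        return frame[: h // 2], frame[h // 2 :]
    raise ValueError(f"unknown packing: {packing}")

packed = np.random.rand(1080, 1920)
left, right = split_3d_frame(packed, "side-by-side")
print(left.shape, right.shape)  # (1080, 960) (1080, 960)
```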
Smooth-Motion / Motion Compensation:
Basically, at 120Hz, smooth-motion takes one real frame, has a processor in the television guess what should come between it and the next real frame, and inserts that computer-generated frame in between. At 240Hz, even more generated frames get inserted between the real ones. Needless to say, it can make the picture look really weird when things move across the screen. Plasmas, with their 600Hz sub-field drive, use their own method, but it's chasing the same goal, and that extra refresh speed does help keep moving images free of distortion. In fact, plasma televisions consistently rank at or near the top when it comes to image quality. My Panasonic 55" flagship plasma looks amazing with pretty much anything I throw at it.
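The simplest possible version of frame interpolation is a straight blend between two real frames. Real TVs do motion-vector estimation, which is far more involved; this sketch (my own toy version) just shows the basic idea of manufacturing in-between frames:

```python
import numpy as np

def interpolate_frames(a: np.ndarray, b: np.ndarray, n: int):
    """Generate n in-between frames by linearly blending two real
    frames. (Real TVs estimate motion vectors instead of blending,
    but the goal -- filling the gap with guessed frames -- is the same.)"""
    return [a + (b - a) * (i / (n + 1)) for i in range(1, n + 1)]

frame1 = np.random.rand(1080, 1920)
frame2 = np.random.rand(1080, 1920)

# 60 fps source on a 120Hz panel: one guessed frame per real frame.
inserted = interpolate_frames(frame1, frame2, n=1)
print(len(inserted))  # 1
```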
Personal Preference:
I turn off smooth-motion and the like because I think it makes movies look like someone filmed them with a home DV camera. Plus, it tends to make me dizzy if I watch it for too long. Your eyes naturally blur images (motion blur) when you turn your head quickly, so a fast-moving television image will always look weird when it doesn't blur the way your brain expects. Hollywood also adds motion blur deliberately as an effect, so on-screen motion looks more natural and movie-like, and it can give editors a nice transition between cuts.
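That added blur is essentially an average over time, like a camera shutter staying open while something moves. A minimal sketch (my own illustration) of faking motion blur by averaging consecutive frames:

```python
import numpy as np

def motion_blur(frames: list[np.ndarray]) -> np.ndarray:
    """Fake motion blur by averaging consecutive frames, the way a
    slow shutter smears anything that moves during the exposure."""
    return np.mean(frames, axis=0)

# A bright dot sliding across an otherwise black strip of frames.
frames = []
for x in (0, 1, 2, 3):
    f = np.zeros((1, 8))
    f[0, x] = 1.0
    frames.append(f)

print(motion_blur(frames))  # the moving dot becomes a smear of 0.25s
```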
