04-10-2018, 11:44 PM  #76
All Darc
I never said it (video compression) was related to the LCD panel. I was talking about two separate issues: the video display and the video signal.

The decoding ability of players and some TVs also counts. For example, the Samsung 4K TV here, playing from a pen drive, renders really bad gradients: a 1 GB file for just 90 seconds of video, shot with the camera held steady, and it still produces banding and artefacts. That makes me suspect the 4K TV's decoding processing is garbage too.
But some freak technicians will blame just the decoding ability, even when the compression is so extreme that no decoding system on Earth could make it look decent.
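
To put a number on that, here is a rough back-of-envelope sketch in Python (the 1 GB and 90 seconds are the figures above; I am assuming the whole file is video and ignoring container and audio overhead):

Code:
# Average bitrate of a ~1 GB file holding 90 s of video.
file_size_bytes = 1_000_000_000   # ~1 GB, figure from the post
duration_s = 90                   # figure from the post

avg_mbps = file_size_bytes * 8 / duration_s / 1_000_000
print(f"average bitrate: {avg_mbps:.0f} Mbit/s")  # ~89 Mbit/s

# For scale: UHD Blu-ray video tops out around 100 Mbit/s and 4K
# streaming usually runs ~15-25 Mbit/s, so ~89 Mbit/s on a static shot
# should not band unless the encode or the TV's decoder is at fault.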

And I could also talk about video cameras, because a given number of pixels on a camera sensor does not necessarily mean the camera is really good at capturing detail; that also depends on the kind of sensor it uses. Some 4K cameras are garbage, not really better than pristine uncompressed HD.

Depending on the type of sensor, you need a 4K sensor to get truly full 2K data, or an 8K sensor to get truly full 4K data, because a Bayer sensor captures only one color per photosite and interpolates the rest.
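
For instance, assuming a standard RGGB Bayer mosaic (my own toy illustration in Python, not any specific camera's pipeline), you can see the 2:1 rule by binning each 2x2 block into one full-color pixel with no interpolation at all:

Code:
import numpy as np

# Toy RGGB Bayer mosaic: each photosite records ONE color sample.
# Binning every 2x2 block into a single RGB pixel needs no demosaic
# interpolation, which is the sense in which a "4K" mosaic only
# yields ~2K of true full-color data.

def bayer_to_true_rgb(mosaic):
    """(2H, 2W) raw RGGB samples -> (H, W, 3) full-color pixels."""
    r = mosaic[0::2, 0::2]                               # red photosites
    g = (mosaic[0::2, 1::2] + mosaic[1::2, 0::2]) / 2.0  # two greens, averaged
    b = mosaic[1::2, 1::2]                               # blue photosites
    return np.stack([r, g, b], axis=-1)

raw = np.random.rand(2 * 1080, 2 * 1920)  # hypothetical "4K" RGGB raw frame
rgb = bayer_to_true_rgb(raw)
print(rgb.shape)  # (1080, 1920, 3): true 2K RGB from a 4K-photosite sensor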

Quote:
Originally Posted by jamesqf
But all of that has exactly nothing to do with LCD vs CRT displays. If you show an uncompressed video source on an LCD, most if not all of those artifacts would go away. If you show a compressed source on a CRT (assuming you could find one with the right resolution & form factor), you would see the compression-related artifacts on it.

So which UFO did you come on, then?
