DreamLoveBelieve
I've always thought that, in some instances anyway, some 720p and 1080p sources looked/sounded the same to me; now I know I'm not crazy!
It really has more to do with the 'quality' of the display unit, not the encoding or transmission. Being a very early adopter, neither of my displays does 1080P, and only one does 720P (they both do 1080i, though). NONE of the LCD displays I see at the local stores come close to the quality of the 10-year-old displays I own... but then, they ARE broadcast-grade displays.
I've downloaded several movies in both 720P and 1080P, and I can't see any difference either. But a real comparison will have to wait until the next 'upgrade' go-around with a top-notch 1080P plasma (most probably a Panasonic, as it has the best reviews, now that Pioneer has dropped production of plasmas and sold the plant to Panasonic), and that puts it in the 2011 time frame (unless the economy collapses yet again).
The thing a lot of people forget about is the audio. LOTS of x264 recodes carry full-bitrate DTS (1.5+ Mb/s) down-converted from the HD audio (Dolby TrueHD or DTS-HD), and for most people that's almost overkill, since there's the option during recoding to do the audio as half-bitrate (768 Kb/s) DTS instead; I've only seen one example of that. That's a LOT of bits, and one has to remember the audio bitrate is constant, not variable like the video.
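To put rough numbers on that, here's a minimal sketch (Python; the 2-hour runtime and 8 GB recode size are just assumptions for illustration) of how much space a constant-bitrate DTS track takes at full rate versus half rate:

```python
# Rough, hypothetical calculation: how much of a recode the constant-bitrate
# DTS audio track occupies, comparing full-rate (1509 kbit/s) vs half-rate
# (768 kbit/s). The 2-hour runtime and 8 GB target size are assumed values.

def audio_track_size_gb(bitrate_kbps: float, runtime_minutes: float) -> float:
    """Size of a constant-bitrate audio track in gigabytes (GB = 10^9 bytes)."""
    bits = bitrate_kbps * 1000 * runtime_minutes * 60
    return bits / 8 / 1e9

runtime_minutes = 120     # assumed movie length
target_file_gb = 8.0      # assumed total recode size

for rate in (1509, 768):  # full-rate vs half-rate DTS core bitrates
    size = audio_track_size_gb(rate, runtime_minutes)
    share = size / target_file_gb * 100
    print(f"{rate} kbit/s DTS: {size:.2f} GB "
          f"({share:.0f}% of an {target_file_gb:.0f} GB file)")
```

Run on those assumed numbers, the full-rate track alone comes out around 1.4 GB over two hours, roughly double what the half-rate option would use, and that's space the video can't have.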
But until I get set up and do some 'real' testing, it's up in the air. And a LOT of the Blu-ray discs are pretty poor quality, if the 'high-end' magazines I read are correct. Again, the studios are following the same track with HD as they did with standard DVDs some 10 years ago... lots of poor-quality mastering out there.