# HDMI 2.0 CEDIA Webinar



## Scott Wilkinson


Last week, I attended a webinar hosted by CEDIA (Custom Electronic Design & Installation Association) called "HDMI 2.0: A Look Into the Standard." The presenters were Steve Venuti, President of HDMI Licensing, LLC; Jeff Park, Technical Specification Manager of HDMI Licensing, LLC; and Michael Heiss, CE industry consultant and jolly-good CEDIA Fellow as well as chair of the CEDIA Technology Council.

 

I didn't learn much that I didn't already know, but it was a good reminder that the version number doesn't mean much other than defining a list of possible features that manufacturers might or might not implement. That's why HDMI Licensing wants companies to indicate which HDMI features they have included in their products rather than simply touting "HDMI 2.0." That number refers only to the specification that defines which features are supported, not which must be implemented.

 

HDMI 2.0 ups the maximum bandwidth from 10.2 gigabits per second to 18 Gbps, which can be carried on existing high-speed HDMI-certified cables. However, extenders, boosters, and any other electronics in the HDMI signal chain—including RedMere booster chips and HDBaseT—probably can't support that bitrate without a hardware upgrade.

 

The increase in bandwidth is made possible by a new, more efficient signaling method. Even better, the interface uses the previous signaling method for traffic below 10.2 Gbps, then kicks in the new signaling above that, which means it's completely backward compatible with HDMI 1.4 devices.

 

New features supported by HDMI 2.0 include the ability to transmit 4K video at 50 and 60 frames per second (with some limitations, which I'll get to in a moment) and up to 32 channels of audio with a sample rate up to 1536 kHz. Also, new commands have been added to CEC (Consumer Electronics Control, the ability to control multiple connected devices from one remote), and all commands must be implemented rather than being optional as in previous versions—a welcome requirement even if it flies in the face of HDMI's otherwise feature-optional paradigm. Other features include support for the Rec.2020 color space, dual viewing (two programs displayed on the same TV and isolated for each viewer with glasses, much like 3D), multi-stream audio, dynamic auto lip-sync, and the 21:9 aspect ratio.



*HDMI 2.0 adds many new features to the HDMI spec. (Graphic from HDMI Licensing, LLC)*

 

As I said earlier, HDMI 2.0 can handle 4K/UHD at 50 and 60 frames per second, but there are some limitations—in particular, in the bit depth and level of color subsampling it can convey. For those who are unfamiliar with color subsampling, it's a form of data compression in which some color samples are discarded from a component-video signal and reconstructed by the display. It's specified as a series of three numbers—the most common schemes are 4:4:4, 4:2:2, and 4:2:0. Because color subsampling applies to component-video signals, the first number refers to the luma (brightness) samples, while the second and third numbers refer to the chroma (color-difference) samples.

 

With 4:4:4, no color pixels are discarded, while 4:2:2 discards half the color pixels, and 4:2:0 discards 75% of the color pixels, which reduces storage and transmission-bandwidth requirements. However, the less color subsampling that is used, the better the image quality, especially in terms of clean transitions between colors. Amazingly, Blu-ray uses 4:2:0 and still manages to achieve great picture quality.
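To make the savings concrete, here's a tiny sketch (my own illustration, not from the webinar) that computes the average number of samples per pixel under the usual J:a:b interpretation, where the ratios describe a 4-pixel-wide, 2-row block:

```python
# Average samples per pixel for common chroma-subsampling schemes.
# J:a:b describes a J-wide, 2-row block: J luma samples per row,
# 'a' chroma samples in the first row, 'b' in the second (x2 for Cb and Cr).

def samples_per_pixel(j, a, b):
    luma = 1.0                      # every pixel keeps its luma sample
    chroma = 2 * (a + b) / (2 * j)  # Cb + Cr samples averaged over the block
    return luma + chroma

for name, (j, a, b) in {"4:4:4": (4, 4, 4),
                        "4:2:2": (4, 2, 2),
                        "4:2:0": (4, 2, 0)}.items():
    spp = samples_per_pixel(j, a, b)
    print(f"{name}: {spp} samples/pixel, "
          f"{100 * (1 - spp / 3):.0f}% less data than 4:4:4")
```

That works out to a 33% saving for 4:2:2 and 50% for 4:2:0 relative to 4:4:4, which is exactly why distribution formats like Blu-ray favor 4:2:0.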

 

Using 4:2:0 color subsampling, HDMI 2.0 can convey 4K/UHD at 50/60 fps with up to 16 bits of resolution per color. This provides tremendous dynamic range—far more than the current HD system, which uses 8-bit resolution. If the color subsampling is 4:2:2, HDMI 2.0 can accommodate up to 12 bits of resolution for 4K/UHD at 50/60 fps. And at 4:4:4, HDMI 2.0 is limited to 8 bits for 4K/UHD at 50/60 fps. This presents a conundrum for video-content creators and consumers, who want the best possible specs all around.
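For anyone who wants to see where figures like 8.91 Gbps come from, here's a rough back-of-the-envelope sketch in Python. It assumes the standard CTA-861 2160p timing (4400 x 2250 total pixels including blanking), TMDS 8b/10b encoding (10 bits on the wire per 8-bit symbol on each of 3 channels), 4:2:0 halving the TMDS clock, and 4:2:2 riding HDMI's fixed 12-bit container at the full clock. These are simplifications, but they reproduce the published numbers:

```python
# Back-of-the-envelope HDMI TMDS bandwidth, including blanking and 8b/10b.

def tmds_gbps(h_total, v_total, fps, bits=8, subsampling="4:4:4"):
    clock = h_total * v_total * fps    # pixel clock in Hz, blanking included
    if subsampling == "4:2:0":
        clock /= 2                     # 4:2:0 runs the TMDS clock at half rate
    if subsampling != "4:2:2":
        clock *= bits / 8              # deeper color scales the clock...
    # ...except 4:2:2, which always uses a 12-bit container at 1x clock
    return clock * 3 * 10 / 1e9        # 3 channels x 10 bits per symbol

print(tmds_gbps(4400, 2250, 60, 8,  "4:2:0"))  # ~8.91, fits HDMI 1.4's 10.2
print(tmds_gbps(4400, 2250, 60, 12, "4:2:2"))  # ~17.82, needs HDMI 2.0
print(tmds_gbps(4400, 2250, 60, 8,  "4:4:4"))  # ~17.82, needs HDMI 2.0
print(tmds_gbps(4400, 2250, 60, 12, "4:4:4"))  # ~26.73, over the 18 Gbps cap
```

That last line is why 4K/60 at 4:4:4 is stuck at 8 bits under HDMI 2.0.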

 



*As more data is transmitted, the bandwidth requirements increase. Notice how much bandwidth is required for 8K (4320/60p) at 4:4:4 with 12-bit resolution—far more than HDMI 2.0 can support! (Graphic from HDMI Licensing, LLC)*

 

I suspect—hope, actually—that the UHD system will settle on 4:2:2 at 12-bit resolution, but that is far from certain at this point. A resolution greater than 8 bits is critical to support a higher dynamic range without visible banding, which is even more important than the increased number of pixels in my opinion. And less-aggressive color subsampling will yield sharper transitions between colors.
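To put numbers on the banding point: each extra bit doubles the number of gradations available per color component. A quick illustration:

```python
# Distinct levels per color component at common bit depths; more levels
# mean finer steps between shades, hence less visible banding in HDR ramps.
levels = {bits: 2 ** bits for bits in (8, 10, 12, 16)}
for bits, n in levels.items():
    print(f"{bits}-bit: {n:,} levels per component")
```

Going from 8-bit (256 levels) to 10- or 12-bit (1,024 or 4,096 levels) is what makes wide-dynamic-range gradients look smooth instead of stair-stepped.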

 

HDMI 2.0 also supports the Rec.2020 specification, which includes a much wider color gamut than the current Rec.709. This allows content and displays to accurately reproduce many more colors than today's Blu-rays and HDTVs.

 



*Rec.2020 specifies a much larger color gamut than the current standard of Rec.709. (Graphic from HDMI Licensing, LLC)*

 

Many people ask me about alternatives to HDMI—in particular, DisplayPort. As you can see in the following table, DisplayPort 1.2 does offer a somewhat higher overall bandwidth than HDMI 2.0, and much higher Ethernet bandwidth. It also transmits some power and USB communications. DisplayPort is common in the world of computers, but HDMI is so entrenched in the consumer-electronics industry that I doubt it will ever be replaced by DisplayPort. HDBaseT also carries power and USB along with HDMI signals, but its overall bandwidth is the same as HDMI 1.4 until its hardware is upgraded.

 



*DisplayPort 1.2 offers a bit more overall bandwidth, but HDMI is too entrenched in the CE industry to be supplanted. (Graphic from HDMI Licensing, LLC)*

 

The bottom line is that the term "HDMI 2.0" means next to nothing when trying to figure out the specific capabilities of a particular piece of gear. It's up to consumers to discover which features a manufacturer has included in its products, which can be added in a firmware update, and which will never be implemented. Hopefully, manufacturers will start explicitly listing the features they include in each product, making it easier for consumers to select the gear that's right for them.

 



----------



## fierce_gt

some of that sounds better than expected, for once.


but I'm curious, when you're saying the features are optional, are you talking in regards to the displays and sources, not the cables right?


I mean, I'm a little confused with what is actually changing. sounds like for the shorter cable lengths, the cables don't actually need to be changed, but I'm sure they will be relabelled as '2.0' now that there is a standard for it. so for cables, they would be able to support ALL those features listed, correct?


and then, the displays, like has always been the case, would need to specify what features they support. just like current displays may have HDMI 1.4 inputs, but not be 3D capable. and the same thing for sources, just because it has an HDMI 2.0 output, that only means it will pass 18 Gbps, but the features will still be independently listed. has this not always been the case? or are you saying a display/source may list HDMI 2.0 but not support 18 Gbps? and they can claim that simply because an HDMI 2.0 cable can be plugged into it?


the other thing that interested me was the support for 21:9. After seeing the possibilities of a CIH system with a projector, I find it hard to find a downside to using a wider display and doing the same thing. well, the downside being how difficult it is to get content that's not broadcast in 4:3 around here, but hopefully that would 'have' to change


----------



## BrolicBeast

An absolutely outstanding article Scott. I've been enlightened quite a bit.


----------



## Scott Wilkinson




> Quote:
> Originally Posted by *fierce_gt*  /t/1523994/hdmi-2-0-cedia-webinar#post_24521175
> 
> 
> some of that sounds better than expected, for once.
> 
> 
> but I'm curious, when you're saying the features are optional, are you talking in regards to the displays and sources, not the cables right?
> 
> 
> I mean, I'm a little confused with what is actually changing. sounds like for the shorter cable lengths, the cables don't actually need to be changed, but I'm sure they will be relabelled as '2,0' now that there is a standard for it. so for cables, they would be able to support ALL those features listed, correct?
> 
> 
> and then, the displays, like has always been, would need to specify what features they support. just like current displays may have hdmi1.4 inputs, but not be 3d capable. and the same thing for sources, just because it has an hdmi 2,0 output, that only means it will pass 18gbps, but the features will still be independently listed. has this not always been the case? or are you saying a display/source may list hdmi2.0 but not support 18gbps? and they can claim that simply because an hdmi 2.0 cable can be plugged into it?
> 
> 
> the other thing that interested me was the support for 21:9. After seeing the possibilities of a CIH system with a projector, I find it hard to find a downside to using a wider display and doing the same thing. well, the downside being how difficult is it to get content that's not broadcast in 4:3 around here, but hopefully that would 'have' to change


The cables only convey data; they don't care what those data are. HDMI-certified high-speed cables can convey 18 Gbps, no matter what specific features are included in that datastream.

 

Some manufacturers claim that certain 2013 products have "HDMI 2.0" with a bandwidth of 10.2 Gbps, and that's true in the narrow sense that they could handle 2160p/60 at 4:2:0 with 8-bit resolution, which requires a bandwidth of only 8.91 Gbps. That's technically HDMI 2.0, but those products can't handle higher specs, so calling them "HDMI 2.0" is a bit misleading in my book. Upping the bandwidth to 18 Gbps requires new hardware, which wasn't even available last year.


----------



## dew42

Was there any talk of 3D? 1080p 3D with 48/60 fps or 2160p 3D with 24/30 fps? I think I'd prefer HFR over 4K for 3D.


----------



## Dan Hitchman




> Quote:
> Originally Posted by *dew42*  /t/1523994/hdmi-2-0-cedia-webinar#post_24521435
> 
> 
> Was there any talk of 3D? 1080p 3D with 48/60 fps or 2160p 3D with 24/30 fps? I think I'd prefer HFR over 4K for 3D.



I'd rather have 12 bit 4:2:2 and a wider color gamut and object audio than 3D.


----------



## iatacs19

I am liking the new CEC-Extension being a requirement. It's about damn time we can use any remote for any/all devices.


----------



## dew42




> Quote:
> Originally Posted by *Dan Hitchman*  /t/1523994/hdmi-2-0-cedia-webinar#post_24521491
> 
> 
> I'd rather have 12 bit 4:2:2 and a wider color gamut and object audio than 3D.



I'd hope the bandwidth 3D adds would have no impact on 2D in the HDMI spec. I'd like 12 bit 4:2:2 for 2D and 3D. The compromise I suggested to give up 4K to get HFR would be for 3D only.


Looking at the numbers above; 1080/60p 4:4:4 16 bit uses 8.91 Gbps. Times two for 3D would be just under the 18 Gbps limit.


----------



## golem

Ah, so when the Sony X950B only lists 3840x2160/60p (YCbCr 4:2:0 8-bit), it means it probably wasn't built using the newer, faster-bandwidth hardware? That's disappointing.


----------



## Matthias Hutter

I hope more TVs offer DisplayPort 1.2/1.3 connectors, for some 4k/60p 16bit 4:4:4 goodness.

But the need for copy protection forced them to use HDMI


----------



## blah450

^^^^^^^This...and is anyone else ready to have more secure connection points at terminals rather than just the slip-in-style of HDMI?


----------



## Phrehdd

Pity about DisplayPort. Perhaps Apple will enter the fray with the option for DisplayPort, given its new dealings with Comcast and its potential entrance into 'media displays' (TVs, as it were). Apple is perhaps the only company that could bend the trend of moving forward with HDMI 2.0. Then again, I don't want to be victim to Apple's Henry Ford mentality.


----------



## KidHorn

I'm confused about the 3D comments. Why would 3D require more bandwidth than 2D? Don't 3D and 2D use the same number of pixels?


----------



## MikeyD360

So anyone who wasted $20K on Sony's 4K projector, or LG's 84" UHD TV, now has an expensive paperweight, given that the HDMI 2.0 standard didn't exist when these were released... hands up, who saw that coming...


----------



## keb33509




> Quote:
> Originally Posted by *KidHorn*  /t/1523994/hdmi-2-0-cedia-webinar#post_24522101
> 
> 
> I'm confused about the 3D comments. Why would 3D require more bandwidth than 2D? Don't 3D and 2D use the same number of pixels?



Twice as many, seeing as it processes two video streams at once.


----------



## KidHorn




> Quote:
> Originally Posted by *keb33509*  /t/1523994/hdmi-2-0-cedia-webinar#post_24522289
> 
> 
> Twice as many, seeing as it processed two video streams at once.



So if you have a 4k TV, 2D renders 2160 lines but 3D renders 4320 lines. How is that possible?


----------



## keb33509

Granted, this is an HDMI 2.0 discussion, but you never specified 4K. 4K 3D, if/when it becomes available, would render two 2160p images at a time. 1080p 3D renders two full 1080p images at once. It still increases the required bandwidth regardless of whether the 3D is 1080p or 2160p. There are two simultaneous video feeds being supplied with 3D Blu-ray.


It doesn't increase the pixel count per frame; it only increases the amount of bandwidth that has to be processed at once.


----------



## dew42

We need HEVC over HDMI.


----------



## hidefpaul

"The bottom line is that the term "HDMI 2.0" means next to nothing when trying to figure out the specific capabilities of a particular piece of gear. It's up to consumers to discover which features a manufacturer has included in its products, which can be added in a firmware update, and which will never be implemented. Hopefully, manufacturers will start explicitly listing the features they include in each product, making it easier for consumers to select the gear that's right for them."


Thanks again Scott, as always, a great report.


Now, let me just start by saying that I COMPLETELY DISAGREE with the HDMI Organization letting the product manufacturers decide which parts of the HDMI 2.0 spec they will offer in their products. I believe that ALL HDMI 2.0 products should adhere to the COMPLETE SPEC, with ALL FEATURES IMPLEMENTED, PERIOD!


This entire HDMI 1.0 - 2.0 cabling situation (a mess, as I call it) can be confusing to even some AVS enthusiasts, let alone the average Joe consumer out there; imagine the confusion they will run into.

Another issue I have is that the HDMI Org should NOT ALLOW manufacturers to implement specs via firmware upgrades..... you could be waiting months or even a year for one. (Pioneer BDP-51 DTS-MA firmware, anyone? Almost a one-year wait.)

Personally, I think the HDMI association is putting way too much faith in the manufacturers. Either that, or they are trying to please them, given the mess the HDMI Org has put this entire industry in.


To the HDMI ORGANIZATION - DO IT RIGHT OR DON'T DO IT AT ALL!!!


Paul


----------



## bluewaves




> Quote:
> Originally Posted by *MikeyD360*  /t/1523994/hdmi-2-0-cedia-webinar#post_24522165
> 
> 
> So anyone who wasted $20K on Sony's 4K projector, or LG's 84" UHD TV now has an expensive paperweight given HDMI2.0 standards didnt exist when these were released... hands up who saw that coming...



Never buy the first gen of any product


----------



## BNestico

Seeing that I was still in high school when HD was becoming prevalent and was more worried about chasing tail than tv resolution, was there this much confusion/disagreement about SD vs HD as there currently is between HD and UHD?


----------



## leevit




> Quote:
> Originally Posted by *MikeyD360*  /t/1523994/hdmi-2-0-cedia-webinar#post_24522165
> 
> 
> So anyone who wasted $20K on Sony's 4K projector, or LG's 84" UHD TV now has an expensive paperweight given HDMI2.0 standards didnt exist when these were released... hands up who saw that coming...


The very good reason to just wait it out, and not buy the 1st gen of new tech.!


----------



## vaktmestern

Don't trust Sony's HDMI 2.0 software upgrade.


----------



## docevil

I'd like to know about the backwards compatibility aspect... The overwhelming majority of film is 24fps with no sign of change in the foreseeable future.


What would happen in the scenario where an HDMI 2.0 2160/30p 12bit 4:2:2 source is plugged into a UHD display that is HDMI 1.4b (which supports 2160/30p 12bit 4:2:2)?


----------



## BrolicBeast




> Quote:
> Originally Posted by *BNestico*  /t/1523994/hdmi-2-0-cedia-webinar#post_24522801
> 
> 
> Seeing that I was still in high school when HD was becoming prevalent and was more worried about chasing tail than tv resolution, was there this much confusion/disagreement about SD vs HD as there currently is between HD and UHD?



Not as much confusion at all. Everyone, no matter who you asked, agreed to the superiority of HD (which was much simpler back then. No features--just a better picture, and--of course-- the option to use Component cables ).


----------



## Scott Wilkinson




> Quote:
> Originally Posted by *golem*  /t/1523994/hdmi-2-0-cedia-webinar#post_24521853
> 
> 
> Ah so when the Sony X950b only lists 3840x2160/60p (YCbCr 4:2:0 8bit), it means it probably wasnt built using the newer faster bandwidth hardware? Thats disappointing.


That is correct.


----------



## ssb201

While I am all for HDMI 2.0 finally coming out (I myself have waited for it to be released before upgrading my gear), there does seem to be a great deal of confusion:


1) For the most part, HDMI 1.4b is more than sufficient for 4K (2160p) content. When it comes just to video, unless you want 3D at 2160p or 4:4:4 color at greater than 8 bits, there is no need to move to HDMI 2.0. Movies do not require frame rates higher than 24 or 30 frames per second. The only use for the higher frame rates offered by HDMI 2.0 is gaming, and no consoles support the standard, so that leaves PCs only. Eventually we will have 4K Blu-ray or some other 3D standard that will require the bandwidth, but not right now. I do wonder: is it possible to update firmware on HDMI 1.4 devices to support 1080p at 48 fps?


2) Just about every cable works with every version of the standard. I have never met an HDMI cable under 10 feet that would not work, and the same should hold with 2.0. People who are confused by cabling are either dealing with long runs or are bamboozled by the marketing and labeling.


----------



## JohnAV




> Quote:
> Originally Posted by *ssb201*  /t/1523994/hdmi-2-0-cedia-webinar#post_24523800
> 
> 
> 1) For the most part HDMI 1.4b is more than sufficient 4K (2160p content). When it comes just to video, unless you want 3D @ 2160p or 4:4:4 color at greater than 8 bit there is no need to move to HDMI 2.0. Movies do not require higher frame rates than 24 or 30 frames per second. The only use for the higher frame rates offered by HDMI 2.0 is gaming and no consoles support the standard so that leaves PCs only. Eventually we will have 4K Blu-ray or some other standard 3D which will require the bandwidth, but not right now. I do wonder- is it possible to update firmware on HDMI 1.4 devices to support 1080 at 48p?


Along that line of discussion, my concern is with the so-called HDCP 2.2 associated with HDMI 2.0 and backwards compatibility with existing 4K-ready equipment. We had two years of gear that can easily pass 4K at 24/30 Hz; now, because HDMI.org wants to revise HDCP, we might have a situation where future media might not be backwards compatible with current 4K-ready gear, because that group insists on a new HDCP flag associated only with HDMI 2.0 devices. Nice, huh?


----------



## AV_Integrated

Nobody, at all, seems to address the audio aspect.


While the majority of people will just hook up to their TV 6 feet away, it is the more complex setups that have been completely screwed over by HDMI for years.


A/V receivers do NOT support digital audio output to zone 2. There are some limited exceptions to this, but the rule is that if you have 10 sources hooked up via HDMI, and you want those sources available to a second zone, such as outside, or in another room, you must hook up analog audio to get the stereo sound in those spaces. Despite this, A/V receivers have been dropping their analog audio inputs, components have been dropping their analog audio outputs, and consumers are left out in the cold.


The hope would be that as a STANDARD, HDMI 2.0 will feature full surround sound and a separate stereo audio mix which is fed across the same HDMI cable. Any multi-zone A/V receivers will be able to pull the stereo feed off any HDMI connection at any time, and still use surround sound for the local feed.


Better yet, receivers with a second HDMI output for zone 2, would be able to feed zone 1 with surround sound, and zone 2 with stereo embedded on HDMI.


They don't talk about it, and they don't work hard on it.


As for HDMI CEC - it hasn't worked yet, and I have no belief that it will anytime soon.


Finally, ARC seems to always require CEC, which is worthless if CEC isn't actually working properly, like it isn't right now. HDMI CEC should be one of the coolest features of HDMI, and it is a complete failure IMO. Soundbars should use it, A/V receivers should use it, and ARC should just be something those products can request at any point, with or without CEC for the rest of the functionality.


Ah well, nobody is willing to tell anyone what is really going on, so consumers will remain confused, and installers will make promises that can't be fulfilled. Business as usual.


----------



## jabba359




> Quote:
> Originally Posted by *KidHorn*  /t/1523994/hdmi-2-0-cedia-webinar#post_24522101
> 
> 
> I'm confused about the 3D comments. Why would 3D require more bandwidth than 2D? Don't 3D and 2D use the same number of pixels?





> Quote:
> Originally Posted by *keb33509*  /t/1523994/hdmi-2-0-cedia-webinar#post_24522289
> 
> 
> Twice as many, seeing as it processed two video streams at once.





> Quote:
> Originally Posted by *KidHorn*  /t/1523994/hdmi-2-0-cedia-webinar#post_24522402
> 
> 
> So if you have a 4k TV, 2D renders 2160 lines but 3D renders 4320 lines. How is that possible?



First off, your math is wrong. Doubling your lines actually quadruples your image resolution due to the lines doubling in both width _and_ height.


But that's not quite how it works anyway, as the "twice as many" statement comes from the quantity of frames per second transmitted by 3D, not from an increase in resolution.


2D -> 24 frames per second total. (3840x2160=8,294,400 pixels per frame x 24 frames every second=199,065,600 pixels every second)

3D -> 24 frames per second to the left eye + 24 frames per second to the right eye = 48 frames per second total, which is twice as many frames as 2D. (3840x2160=8,294,400 pixels per frame x 24 frames every second=199,065,600 pixels every second _per eye_. Multiply that by two eyes and it equals a total count of 398,131,200 pixels every second that it's transmitting! That's a lot of pixels!)


The same holds true for 30 frames per second video: 30 FPS in 2D, or 30 (left eye) plus 30 right eye in 3D.
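If it helps, the arithmetic above sketches out in a few lines (plain pixel counting, nothing HDMI-specific):

```python
# Frame-packed 3D carries a full-resolution frame per eye, so the pixel
# rate doubles even though each displayed frame has the same pixel count.
w, h, fps = 3840, 2160, 24
pixels_per_frame = w * h                    # 8,294,400
pixels_per_sec_2d = pixels_per_frame * fps  # one stream
pixels_per_sec_3d = pixels_per_sec_2d * 2   # left eye + right eye
print(pixels_per_frame, pixels_per_sec_2d, pixels_per_sec_3d)
```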


I hope this was more helpful than confusing!


----------



## MikeyD360




> Quote:
> Originally Posted by *BNestico*  /t/1523994/hdmi-2-0-cedia-webinar#post_24522801
> 
> 
> Seeing that I was still in high school when HD was becoming prevalent and was more worried about chasing tail than tv resolution, was there this much confusion/disagreement about SD vs HD as there currently is between HD and UHD?



It was a bit of a mess and still is - there was "HD Ready," "HD Compatible," and then "Full HD," and it took a while for it to be clear whether you were buying a TV that was 720p, 1080i, or 1080p. Not to mention the minefield of whether an HD TV actually had an HD tuner and could receive HD DTV signals or just accepted an HD signal over HDMI.

Unfortunately, this seems to be an industry that thrives on keeping the consumer as confused as possible, so that they can shove some buzzwords like HDMI 2.0 or "4 million terahertz sub-field warp generator" in your face and convince you to drop way too much cash on a soon-to-be-obsolete product.


----------



## Reddig

What exactly is "dynamic auto lip-sync" and is it a feature or something that happens on its own or a possible option that will be found in AVR's and such with HDMI 2.0?


----------



## Otto Pylot

Sounds like some sort of automatic lip-syncing option that would have nothing to do with HDMI 2.0. Lip-syncing functionality has been around for awhile.


----------



## Scott Wilkinson




> Quote:
> Originally Posted by *dew42*  /t/1523994/hdmi-2-0-cedia-webinar#post_24521435
> 
> 
> Was there any talk of 3D? 1080p 3D with 48/60 fps or 2160p 3D with 24/30 fps? I think I'd prefer HFR over 4K for 3D.


Nothing more specific than HDMI 2.0 can handle any and all forms of 3D that don't exceed a bandwidth of 18 Gbps. The PowerPoint slide on 3D says, "Supports any combination of 3D technique and 4K resolution and frame rates, up to the bandwidth limits (up to 18 Gbps). One example: 4K @ 60 Hz, 8-bit color 4:2:0, full side-by-side 3D." Of course, side-by-side normally means half horizontal resolution in each eye, but I'm not sure if "full side-by-side" means full res in each eye, nor am I sure if 60 Hz means 30 or 60 Hz per eye. I'll see what I can find out about this.


----------



## barrelbelly

Scott:


This is an inspired piece of writing. It answered every single question I had, and it educated me well beyond the scope of my understanding. Thank you very much! The only question I really have is this: does the new higher HDMI standard basically handcuff us to 4K as the video resolution standard for the foreseeable future? Because, unless I'm reading the top-end thresholds incorrectly, it seems to make any upward bump to 8K or higher irrelevant in CE applications. PCs can always go the DisplayPort route for higher-end monitors, but CE panels seem somewhat gimped for future improvements. Am I reading the limitations correctly?


----------



## ssb201




> Quote:
> Originally Posted by *JohnAV*  /t/1523994/hdmi-2-0-cedia-webinar#post_24523848
> 
> 
> Along that line of discussion, my concern is with so called HDCP 2.2 associated with HDMI 2.0 and backwards compatibility with existing 4K ready equipment. We had two years of gear that can easily pass 4K at 24/30 Hz, now because the HDMI.org is wanting to revise HDCP we might have a situation where future media might not be backwards compatible with current 4K ready gear because that group insists on a new HDCP flag only associated with HDMI 2.0 devices. Nice huh?



Good point. Unfortunately, moving to HDCP 2.2 was inevitable with the vulnerability found in HDCP 2. http://blog.cryptographyengineering.com/2012/08/reposted-cryptanalysis-of-hdcp-v2.html I have some hope that firmware may allow current devices to be updated to support the new copy protection (since there is no change in algorithms, only authentication protocols) but how this may impact the ecosystem of intermediate devices- switches, scalers, etc. is more problematic.


----------



## ssb201




> Quote:
> Originally Posted by *barrelbelly*  /t/1523994/hdmi-2-0-cedia-webinar/30#post_24524671
> 
> 
> Scott:
> 
> 
> This is an inspired piece of writing. It answered every single question I had. And it educated me well beyond the scope of my understanding. Thank you very much! The only question I really have is this: Does the new higher HDMI standard basically handcuff us to 4k as the video resolution standard for the foreseeable future? Because, unless I'm reading the top end thresholds incorrectly, it seems to make any upward bump to 8k or higher irrelevant in CE applications. PC can always go the DisplayPort route for higher end PC monitors. But CE panels seem gimped somewhat for future improvements. Am I reading the limitations correctly.



You got it. To do 8K, HDMI 2.x would need to double the bandwidth again. This is what DisplayPort 1.3 has done to support 8K, 4K@120Hz, or dual 4K displays.


----------



## Scott Wilkinson




> Quote:
> Originally Posted by *Reddig*  /t/1523994/hdmi-2-0-cedia-webinar/30#post_24524260
> 
> 
> What exactly is "dynamic auto lip-sync" and is it a feature or something that happens on its own or a possible option that will be found in AVR's and such with HDMI 2.0?


It's a feature that must be implemented in products such as AVRs and TVs. Video is typically delayed with respect to the audio because of the time it takes to process video, and that delay in the TV is communicated to the appropriate devices via EDID (Extended Display Identification Data) and CEC; from the PowerPoint slides on this subject, it seems that CEC takes precedence over EDID. Compatible devices receive this info and delay the audio accordingly. But again, the devices must implement this feature for it to work.


----------



## Reddig




> Quote:
> Originally Posted by *Scott Wilkinson*  /t/1523994/hdmi-2-0-cedia-webinar/30#post_24524708
> 
> 
> 
> It's a feature that must be implemented in products such as AVRs and TVs. Video is typically delayed with respect to the audio because of the time it takes to process video, and that delay in the TV is communicated to the appropriate devices via EDID (Extended Display ID) and CEC; from the PowerPoint slides on this subject, it seems that CEC takes precedence over EDID. Compatible devices receive this info and delay the audio accordingly. But again, the devices must implement this feature for it to work.



Thank you, Scott. I'd love to have this feature, as I always find myself having to adjust the delay on my processor when I play a 3D BD compared to standard BDs and DVDs, where no delay is needed. I'd love to not have to adjust it each time I watch 3D and then have to switch it back. I'm very sensitive to dialogue not matching the lips, whereas my wife won't notice as easily.


----------



## Geoff D

Thanks for the update but there's not a great deal of new info, not to me anyway, because my 4K TV should still be able to accept 2160p24 with better sampling and bit depth. Might need a 4K BD player with two HDMI outs to split off the 4K video and lossless audio (because having both will certainly overwhelm the 1.4 bandwidth), but hopefully that'll be part of the player spec from the beginning.


If/when movies are made at 2160p60 and we actually get something like 2160p60 10-bit 4:2:2 TV broadcasts, then I'll start crying into my cereal. But not yet.


----------



## Scott Wilkinson




> Quote:
> Originally Posted by *Geoff D*  /t/1523994/hdmi-2-0-cedia-webinar/30#post_24525360
> 
> 
> Thanks for the update but there's not a great deal of new info, not to me anyway, because my 4K TV should still be able to accept 2160p24 with better sampling and bit depth.


Are you sure about that? As far as I know, many current 4K TVs are 8-bit panels, and few can accept 4:2:2, much less 4:4:4.


----------



## dew42




> Quote:
> Originally Posted by *Scott Wilkinson*  /t/1523994/hdmi-2-0-cedia-webinar/30#post_24524664
> 
> 
> 
> Nothing more specific than HDMI 2.0 can handle any and all forms of 3D that don't exceed a bandwidth of 18 Gbps...



Thanks!


----------



## Geoff D




> Quote:
> Originally Posted by *Scott Wilkinson*  /t/1523994/hdmi-2-0-cedia-webinar/30#post_24525949
> 
> 
> Are you sure about that? As far as I know, many current 4K TVs are 8-bit panels, and few can accept 4:2:2, much less 4:4:4.



My set uses an 8-bit + Hi-FRC panel, which can in theory reproduce 10-bit colour. In any case, it happily accepts 'upscaled' 12-bit 4:4:4 colour @ 1080p from my Blu-ray player, so why would it not accept a 2160p24 signal with similarly increased sampling and bit depth? Yes, 12-bit 4:4:4 is obviously out of the question due to the 10.2Gb/s HDMI bandwidth restriction on my Sony set, but if they go with 12-bit 4:2:2 as you suspect/hope, then that's under the limit so it should work.


I fully understand that accepting a certain signal is one thing; whether it actually displays every facet of that increased bit depth and colour sampling is quite another, and I suspect I won't have a leg to stand on (though I'm praying I can at least get 10-bit video). But it'll be fun finding out! Now all we need's a deck of cards...
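Geoff D's arithmetic can be sanity-checked with a simplified bandwidth model. This is a sketch only: it assumes standard CEA-861 blanking totals and HDMI's 10/8 TMDS coding overhead, and it treats 4:2:2 as a plain 2-components-per-pixel payload, which glosses over how HDMI actually packs 4:2:2 pixels.

```python
# Simplified HDMI TMDS bandwidth estimate: pixel clock (including
# blanking) times bits per pixel, times 10/8 for 8b/10b TMDS coding.
# Chroma subsampling modeled as components per pixel:
# 4:4:4 -> 3, 4:2:2 -> 2, 4:2:0 -> 1.5.

def tmds_gbps(h_total, v_total, fps, bit_depth, chroma):
    components = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    pixel_clock = h_total * v_total * fps            # Hz, blanking included
    payload = pixel_clock * bit_depth * components   # video bits per second
    return payload * 10 / 8 / 1e9                    # TMDS-coded Gbps

# 2160p24 uses 5500 x 2250 total pixels (a 297 MHz pixel clock).
print(tmds_gbps(5500, 2250, 24, 12, "4:2:2"))   # ~8.91 Gbps, under 10.2
```

By the same model, 12-bit 4:4:4 at 2160p24 lands around 13.4 Gbps, which is why it busts the 10.2 Gbps ceiling while 12-bit 4:2:2 squeaks under it.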


----------



## sanderdvd

I see some replies about the Sony 4K projectors becoming obsolete. Well, not true at all. My Sony VW1100 has HDMI 2.0. It has the 'slower' 10 Gbps chip, but for Blu-ray content (3840x2160 @ 23.976fps in 4:2:2, 16-bit) this will be fast enough. I don't see 60fps Blu-rays coming for at least 5 years. But I DO see 3840x2160 @ 23.976 4:2:2 16-bit movies coming within a year from now.


----------



## tomj2617


Does anyone know if the 2014 Pioneer VSX-1124K has most of the best HDMI 2.0 specs? Thanks


----------



## Ben Withrow




> Quote:
> Originally Posted by *MikeyD360*  /t/1523994/hdmi-2-0-cedia-webinar#post_24522165
> 
> 
> So anyone who wasted $20K on Sony's 4K projector, or LG's 84" UHD TV now has an expensive paperweight given HDMI2.0 standards didnt exist when these were released... hands up who saw that coming...



I'm not sure about the LG TV, but Sony is upgrading my 4K VW1000 to a VW1100, which will support HDMI 2.0 up to at least 10.2G. I just wanted to give Sony props for keeping a promise to us early adopters. The 10.2G chip should be able to handle whatever the 4K Blu-ray spec throws at us. Many times early adopters are hosed, but Sony really stepped up to the plate on this one. My apologies to those who relish the pains that early adopters often feel.


By the way, you should see how good that projector makes HD look on a big screen.


----------



## sanderdvd




> Quote:
> Originally Posted by *Ben Withrow*  /t/1523994/hdmi-2-0-cedia-webinar/0_100#post_24527298
> 
> 
> I'm not sure about the LG TV, but Sony is upgrading my 4K VW1000 to a VW1100, which will support HDMI 2.0 up to at least 10.2G. I just wanted to give Sony props for keeping a promise to us early adopters. The 10.2G chip should be able to handle whatever the 4K Blu-ray spec throws at us. Many times early adopters are hosed, but Sony really stepped up to the plate on this one. My apologies to those who relish the pains that early adopters often feel.
> 
> By the way, you should see how good that projector makes HD look on a big screen.



HD content with my VW1000 in combination with madVR JINC3 upscaling algorithm looks stunning.


----------



## Geoff D




> Quote:
> Originally Posted by *sanderdvd*  /t/1523994/hdmi-2-0-cedia-webinar/30#post_24527056
> 
> 
> I see some replies about the Sony 4K projectors becoming obsolete. Well, not true at all. My Sony VW1100 has HDMI 2.0. It has the 'slower' 10 Gbps chip, but for Blu-ray content (3840x2160 @ 23.976fps in 4:2:2, 16-bit) this will be fast enough. I don't see 60fps Blu-rays coming for at least 5 years. But I DO see 3840x2160 @ 23.976 4:2:2 16-bit movies coming within a year from now.



The funny thing is, support for 12-bit 4:2:2 2160p at up to 30Hz is actually part of the existing 1.4b HDMI spec, which is why I'm positive that our Sony gear will be able to handle most 24p 4K signals with or without this 2.0 firmware upgrade. To be honest, that's all I care about because movies aren't going to be shot in 50/60p any time soon, and eventual 4K TV broadcasts @ 50/60i/p will likely use lower bit depth and colour sampling anyway.


The colour space issue is a different kettle of fish, and should they choose Rec.2020 or even YDzDx it remains to be seen how they'll deal with transmuting that colour information into accurate Rec.709 signals that our 'regular' displays can recognise. But I do wonder whether the powers-that-be are so wary of all this stuff not working with existing tech that they go for the lowest common 4K denominator (that'd be me, then) first, delivering 4K Lite, and then produce some HDR/HFR/WCG extensions to the tech at a later date.


----------



## sanderdvd




> Quote:
> Originally Posted by *Geoff D*  /t/1523994/hdmi-2-0-cedia-webinar/0_100#post_24527597
> 
> 
> The funny thing is, support for 12-bit 4:2:2 2160p at up to 30Hz is actually part of the existing 1.4b HDMI spec, which is why I'm positive that our Sony gear will be able to handle most 24p 4K signals with or without this 2.0 firmware upgrade. To be honest, that's all I care about because movies aren't going to be shot in 50/60p any time soon, and eventual 4K TV broadcasts @ 50/60i/p will likely use lower bit depth and colour sampling anyway.
> 
> 
> The colour space issue is a different kettle of fish, and should they choose Rec.2020 or even YDzDx it remains to be seen how they'll deal with transmuting that colour information into accurate Rec.709 signals that our 'regular' displays can recognise. But I do wonder whether the powers-that-be are so wary of all this stuff not working with existing tech that they go for the lowest common 4K denominator (that'd be me, then) first, delivering 4K Lite, and then produce some HDR/HFR/WCG extensions to the tech at a later date.



And what you are saying here is the EXACT reason why I'm not upgrading my VW1000 to the VW1100 yet.


----------



## pkeegan

Will it support Dolby Vision?


----------



## sdurani




> Quote:
> Originally Posted by *pkeegan*  /t/1523994/hdmi-2-0-cedia-webinar/30#post_24527915
> 
> 
> Will it support Dolby Vision?


Current HDMI will support Dolby Vision; HDMI 2.0 isn't required for that.


----------



## Geoff D




> Quote:
> Originally Posted by *sanderdvd*  /t/1523994/hdmi-2-0-cedia-webinar/30#post_24527622
> 
> 
> And what you are saying here is the EXACT reason why I'm not upgrading my VW1000 to the VW1100 yet.



Indeed. Heck, the whole way that colour, brightness, gamma etc is mastered & displayed in the home could be about to undergo a radical overhaul. So, having jumped on the 4K bandwagon at the earliest possible opportunity - silly, I know, but my X9 is a bloody good TV regardless, which is why I bought it - it seems even sillier to start chasing after singular upgrades like proper HDMI 2.0 or whatever, when an entire tranche of new display tech that makes EVERY current TV obsolete might be with us sooner rather than later.


I'm 'a sit tight and see what happens.


----------



## Colm




> Quote:
> Originally Posted by *Scott Wilkinson*  /t/1523994/hdmi-2-0-cedia-webinar#post_24521072
> 
> 
> HDMI 2.0 also supports the Rec.2020 specification, which includes a much wider color gamut than the current Rec.709. This allows content and displays to accurately reproduce many more colors than today's Blu-rays and HDTVs.


No, for a given bit depth, it will accurately reproduce the same number of colors as any other gamut. There is just a broader range of colors. On average, more colors will map to a given number. That allows reproduction of some colors that are not possible now, but some colors that are accurately reproduced now will not be as accurately reproduced with the new gamut. This is an easy change for HDMI because all it requires is a change in EDID. But it is moot until content and hardware support it. And that isn't a given, at least in the sense of broad support.
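Colm's point is easy to make concrete: the count of distinct code values is a function of bit depth alone, so Rec.2020 spreads the same number of codes over a wider range of colors. A minimal sketch:

```python
# The number of representable colors depends only on bit depth; the
# gamut (Rec.709 vs Rec.2020) only changes WHICH colors those code
# values map to, not how many there are.

def color_count(bits_per_component):
    return (2 ** bits_per_component) ** 3  # three components: R, G, B

print(color_count(8))   # 16,777,216 codes, whether Rec.709 or Rec.2020
print(color_count(10))  # 1,073,741,824: finer steps, same principle
```

With a wider gamut at the same depth, each code step covers more perceptual ground, which is exactly why some colors reproduced accurately today would be quantized more coarsely under Rec.2020.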


----------



## helvetica bold

I wonder what HDMI hardware is in the PS4 and Xbox One: can they support the new 18 Gbps bandwidth? Also, I just downloaded an HDMI 2.0 firmware update for my Sony 1040 receiver.

Does the 2.0 spec benefit us 1080p users at all? Sony stated the update improves PQ...


----------



## darkangelism

Since most movies are only 24/30fps, could it handle 2160p/30 4:4:4 12-bit?


----------



## Otto Pylot




> Quote:
> Originally Posted by *helvetica bold*  /t/1523994/hdmi-2-0-cedia-webinar/40_20#post_24528683
> 
> 
> I wonder what HDMI hardware is in the PS4 and Xbox One, can they support the new 18 Gbps bandwidth? Also i just downloaded a HDMI 2.0 firmware update for my Sony 1040 receiver.
> 
> Does the 2.0 spec benefit us 1080p users at all? Sony stated the update improves PQ...



If only your receiver is HDMI 2.0 compliant, that won't help you much if your other devices aren't 2.0 compliant as well. Or at least I can't see how it would. Does the firmware upgrade to the complete 2.0 spec, or is it just clock speed?


----------



## helvetica bold

I believe my W9 also got the 2.0 update. Here's what my Sony receiver's update covers:


STR-DN1040 AV Receiver Firmware Upgrade


Release Date 3/19/2014

Version s9327.1090.0

File Size 1 Bytes


This firmware upgrade provides the following benefits:

Improvements over version s9327.1074.0:

Improved image quality to the TV connected via HDMI.

Improved quality of GUI for the Front High left "Level" indication

Improved quality of the Center Lift Up (speaker setting) parameter backup

Improved connectivity quality via DLNA® devices

Improved quality for playing non-compressing FLAC content in USB devices


Provides HDMI® 2.0 compatibility for the following 4K/60p signals:

3840x2160p @ 59.94/60Hz (YCbCr 4:2:0 / 8bit only)

3840x2160p @ 50Hz (YCbCr 4:2:0 / 8bit only)

4096x2160p @ 59.94/60Hz (YCbCr 4:2:0 / 8bit only)

4096x2160p @ 50Hz (YCbCr 4:2:0 / 8bit only)


Benefits provided by previous updates and included in version s9327.1090.0

Improved seek feature via DLNA devices

Improved iPhone®/iPod® device operation from the AV receiver

Improved Party Mode feature

Improved GUI (Graphical User Interface) for the WPA/WPA2 settings


----------



## Otto Pylot

Ok. Looks like the update is not fully compliant but certainly better than 1.4b. No CEC Extensions, dual-view, multi-stream audio, 21:9 aspect or dynamic auto lip-sync. Resolutions are 8-bit so the bandwidth is 8.91 Gbps.


----------



## eaamon

lets see i just updated everything to 1.4b......

i'll wait until they get to HDMI 2.4b before i update again.


----------



## Otto Pylot

Yeah. I sure wouldn't buy any "HDMI 2.0" devices now until the dust settles. It seems that the term "HDMI 2.0" is about as accurate and clear as "4K" is right now.


----------



## Utopianemo




> Quote:
> Originally Posted by *dew42*  /t/1523994/hdmi-2-0-cedia-webinar#post_24521760
> 
> 
> I'd hope the bandwidth 3D adds would have no impact on 2D in the HDMI spec. I'd like 12 bit 4:2:2 for 2D and 3D. The compromise I suggested to give up 4K to get HFR would be for 3D only.
> 
> 
> Looking at the numbers above; 1080/60p 4:4:4 16 bit uses 8.91 Gbps. Times two for 3D would be just under the 18 Gbps limit.



Sure, if you didn't need audio, that would be just about enough.
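The quoted figures check out under a simplified model, assuming the standard 2200 x 1125 CEA-861 timing for 1080p60 and a naive doubling for frame-packed 3D that ignores the extra blanking and the audio/data-island overhead alluded to above:

```python
# Reproducing dew42's numbers: 1080p60 uses a 148.5 MHz pixel clock
# (2200 x 1125 total pixels, blanking included). Deep-color 16-bit
# 4:4:4 means 48 bits/pixel; TMDS 8b/10b coding adds a 10/8 overhead.

pixel_clock = 2200 * 1125 * 60             # 148.5 MHz
one_eye = pixel_clock * 48 * 10 / 8 / 1e9  # Gbps for a single 2D stream
both_eyes = one_eye * 2                    # naive frame-packed 3D doubling

print(f"2D: {one_eye:.2f} Gbps")    # 8.91
print(f"3D: {both_eyes:.2f} Gbps")  # 17.82, just under the 18 Gbps cap
```

Real frame-packed 3D carries extra blanking between the two eye images, so in practice the headroom would be even tighter than this back-of-the-envelope doubling suggests.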


----------



## Utopianemo




> Quote:
> Originally Posted by *MikeyD360*  /t/1523994/hdmi-2-0-cedia-webinar#post_24522165
> 
> 
> So anyone who wasted $20K on Sony's 4K projector, or LG's 84" UHD TV now has an expensive paperweight given HDMI2.0 standards didnt exist when these were released... hands up who saw that coming...



*Hand shoots up*


Anybody who is a longtime listener to Scott's podcast saw it coming. I've been locked in a dilemma for months now because my next upgrade needs to be my AVR. I was set on a Pio Elite, but even with 4K upscaling/passthrough, I knew (from HT Geeks) that the insiders at CEDIA, HDMI, and so on were pushing for higher color bit depth and other things. So it didn't make sense to drop a large chunk of change for the state of the art when the state was going to change 6 months down the line.


That's changed somewhat now that Onkyo, Pioneer, and to a lesser extent Emotiva are coming out with AVRs or processors that are "HDMI 2.0 compliant". Since the 2.0 spec is primarily a list of features, HDMI is strenuously discouraging manufacturers from using the 2.0 moniker (unless their devices comply with every feature on the list). That's why Emotiva dropped the 2.0 from their feature list. Given that Onkyo and Pioneer are still openly touting that their products feature HDMI 2.0, I am hopeful this means they'll support Rec.2020 and the higher bit depth... but it's not likely.


----------



## Utopianemo




> Quote:
> Originally Posted by *golem*  /t/1523994/hdmi-2-0-cedia-webinar#post_24521853
> 
> 
> Ah so when the Sony X950b only lists 3840x2160/60p (YCbCr 4:2:0 8bit), it means it probably wasnt built using the newer faster bandwidth hardware? Thats disappointing.


The faster bandwidth hardware didn't exist at the time. Scott interviewed Joe Kane a few months ago, as well as one of the guys who was involved with the 2.0 spec (name escapes me), and in one of those two interviews it was mentioned that the hardware was just becoming available to the manufacturers. So it's really new.


----------



## Utopianemo




> Quote:
> Originally Posted by *Geoff D*  /t/1523994/hdmi-2-0-cedia-webinar/30#post_24527597
> 
> 
> The funny thing is, support for 12-bit 4:2:2 2160p at up to 30Hz is actually part of the existing 1.4b HDMI spec, which is why I'm positive that our Sony gear will be able to handle most 24p 4K signals with or without this 2.0 firmware upgrade. To be honest, that's all I care about because movies aren't going to be shot in 50/60p any time soon, and eventual 4K TV broadcasts @ 50/60i/p will likely use lower bit depth and colour sampling anyway.
> 
> 
> The colour space issue is a different kettle of fish, and should they choose Rec.2020 or even YDzDx it remains to be seen how they'll deal with transmuting that colour information into accurate Rec.709 signals that our 'regular' displays can recognise. But I do wonder whether the powers-that-be are so wary of all this stuff not working with existing tech that they go for the lowest common 4K denominator (that'd be me, then) first, delivering 4K Lite, and then produce some HDR/HFR/WCG extensions to the tech at a later date.



Oh, c'mon, you're just posting so you can keep putting that "u" in color!


----------



## Utopianemo




> Quote:
> Originally Posted by *hidefpaul*  /t/1523994/hdmi-2-0-cedia-webinar#post_24522610
> 
> 
> Now, let me just start by saying that I COMPLETELY DISAGREE with the HDMI Organization, letting the product manufactures provide the consumer with which parts of the HDMI 2.0 spec they will be offering in their product. I believe that ALL HDMI 2.0 products should adhere to the COMPLETE SPEC, with ALL FEATURES IMPLEMENTED, PERIOD!




If they were to follow your suggestion, then all TVs would have to be 21x9 from here on out.


----------



## Geoff D




> Quote:
> Originally Posted by *Utopianemo*  /t/1523994/hdmi-2-0-cedia-webinar/60#post_24531605
> 
> 
> Oh, c'mon, you're just posting so you can keep putting that "u" in color!



Colour, rumour, favour, it's good to get a "u" in there sometimes.


----------



## hidefpaul




> Quote:
> Originally Posted by *JohnAV*  /t/1523994/hdmi-2-0-cedia-webinar#post_24523848
> 
> 
> Along that line of discussion, my concern is with so called HDCP 2.2 associated with HDMI 2.0 and backwards compatibility with existing 4K ready equipment. We had two years of gear that can easily pass 4K @ 24/30 Hz, now because the HDMI.org is wanting to revise HDCP we might have a situation where future media might not be backwards compatible with current 4K ready gear because that group insists on a new HDCP flag only associated with HDMI 2.0 devices. Nice huh?



Thanks JohnAV, I did not realize that was the case. What I wonder is who is letting these guys (HDMI.org) get away with this crap!!! Is there not a governing body that can do something here? UNBELIEVABLE!!


Paul


----------



## Geoff D

That's a good point about HDCP. Existing 4K passthrough kit will be **** outta luck when it comes to 4K BD (or whatever else the industry comes up with for 4K distribution), because the lack of HDCP 2.2 will give you a blank screen. I suppose we should be thankful that there have only been a couple of years of '4K compatible' amps being available, so such kit isn't too deeply entrenched in people's homes. I upgraded my amp a couple of years ago to one that could pass 3D, and I'm not even thinking about changing it again for 4K until these things are ironed out.


----------



## Dan Hitchman

It again looks like the manufacturers are so eager to get something... anything... into the marketplace to bump up sales that they always trip over their own feet.


For one thing, the HDMI board was too quick to come up with a 4K "solution" so that they would continue to be the connection of choice when there was obviously no set of specs ready to go for UHD media.


Sony and others were desperate to the point that they put out ultra-expensive TVs that weren't even close to those potential specs, with no way to really upgrade them.


The BDA has been dragging their feet and may use discs that really aren't going to show UHD video in its best light, or come out with specs that stick with dumbed-down versions of UHD. The MPEG committee just added Rec. 2020 support to H.265. And on and on.


It's like they don't communicate with each other at all and we always end up with this kind of mess.


----------



## Mark12547




> Quote:
> Originally Posted by *Utopianemo*  /t/1523994/hdmi-2-0-cedia-webinar/60#post_24531630
> 
> 
> 
> 
> 
> Quote:
> Originally Posted by *hidefpaul*  /t/1523994/hdmi-2-0-cedia-webinar#post_24522610
> 
> 
> Now, let me just start by saying that I COMPLETELY DISAGREE with the HDMI Organization, letting the product manufactures provide the consumer with which parts of the HDMI 2.0 spec they will be offering in their product. I believe that ALL HDMI 2.0 products should adhere to the COMPLETE SPEC, with ALL FEATURES IMPLEMENTED, PERIOD!
> 
> 
> 
> 
> 
> If they were to follow your suggestion, then all TVs would have to be 21x9 from here on out.

 

I don't see 21x9 TVs as a consequence of forcing all HDMI 2.0 TVs to abide by all specs of HDMI 2.0, just that they would have to be able to accept a 21x9 video format and do something reasonable if the TV has a different aspect ratio, such as doing its own letterboxing or allowing the consumer to zoom in (effectively doing "center cut"). Conceptually, I don't see that as significantly different from how most 16x9 TVs currently handle 4:3 signals by adding pillar bars (or possibly other options like stretch or zoom to fill the screen).

 

I suspect there will be lots of confusion if manufacturers pick and choose which 2.0 features to implement. Today there are enough problems with manufacturers not stating what video formats their TVs or streaming devices handle. I still come across occasional messages from owners of older HDTVs that can't accept signals that some newer streaming devices generate: typically an older HDTV that accepts 1080i but cannot handle either 720p or 1080p, connected to a streaming device that outputs 720p or 1080p but not 1080i. Fortunately, I am seeing fewer such messages than I used to.


----------



## Digitally challe




> Quote:
> Originally Posted by *Matthias Hutter*  /t/1523994/hdmi-2-0-cedia-webinar/0_100#post_24521969
> 
> 
> I hope more TVs offer DisplayPort 1.2/1.3 connectors, for some 4k/60p 16bit 4:4:4 goodness.
> 
> But the need for copy protection forced them to use HDMI





> Quote:
> Originally Posted by *hidefpaul*  /t/1523994/hdmi-2-0-cedia-webinar/0_100#post_24522610
> 
> 
> "The bottom line is that the term "HDMI 2.0" means next to nothing when trying to figure out the specific capabilities of a particular piece of gear. It's up to consumers to discover which features a manufacturer has included in its products, which can be added in a firmware update, and which will never be implemented. Hopefully, manufacturers will start explicitly listing the features they include in each product, making it easier for consumers to select the gear that's right for them."
> 
> 
> Thanks again Scott, as always, a great report.
> 
> 
> Now, let me just start by saying that I COMPLETELY DISAGREE with the HDMI Organization, letting the product manufactures provide the consumer with which parts of the HDMI 2.0 spec they will be offering in their product. I believe that ALL HDMI 2.0 products should adhere to the COMPLETE SPEC, with ALL FEATURES IMPLEMENTED, PERIOD!
> 
> 
> This entire HDMI cabling 1.0 - 2.0 (mess as I call it), can be confusing to even some AVS enthusiast, let alone the common Joe consumer out there, imagine the confusion they will run into.
> 
> Another issue I have is that, the HDMI Org should NOT ALLOW manufactures to implement specs OVER FW up-grades..... you could be waiting months or even a year for a FW. (Pioneer BDP-51 - DTSMA FW anyone- almost 1 year wait for FW)
> 
> Personally the HDMI association is putting way too much faith in the manufactures. Either that, or they are trying to please them given the mess the HDMI Org has put this entire industry in.
> 
> 
> To the HDMI ORGANIZATION - DO IT RIGHT OR DON"T DO IT AT ALL!!!
> 
> 
> Paul





> Quote:
> Originally Posted by *iatacs19*  /t/1523994/hdmi-2-0-cedia-webinar/0_100#post_24521538
> 
> 
> I am liking the new CEC-Extension being a requirement. It's about damn time we can use any remote for any/all devices.





> Quote:
> Originally Posted by *Dan Hitchman*  /t/1523994/hdmi-2-0-cedia-webinar/0_100#post_24521491
> 
> 
> I'd rather have 12 bit 4:2:2 and a wider color gamut and object audio than 3D.





> Quote:
> Originally Posted by *dew42*  /t/1523994/hdmi-2-0-cedia-webinar/0_100#post_24522563
> 
> 
> We need HEVC over HDMI.



Amen to all the above.


----------



## HockeyoAJB




> Quote:
> Originally Posted by *dew42*  /t/1523994/hdmi-2-0-cedia-webinar#post_24522563
> 
> 
> We need HEVC over HDMI.


I know what HEVC is, but what exactly is HEVC over HDMI?  AFAIK, all you need to do in order to be able to utilize HEVC is to start encoding content using an HEVC encoder (this is done by the content providers whether it be an optical disc format or a video stream) and then play back the content with a device that has an HEVC decoder in it.  I don't think the HDMI cables care what type of encoding scheme was used on the content, so long as the cable can handle the required bandwidth of the AV signal.  Also, usually, the encoded stream would never be passed over an HDMI cable since it would typically be decoded at the source (optical disc player/streaming device) and then passed over the HDMI cable as an uncompressed AV signal.  I'm aware that many players can bitstream encoded audio over SPDIF/HDMI, but I've not heard of any that bitstream encoded video.

 

Am I missing something?


----------



## pettern




> Quote:
> Originally Posted by *HockeyoAJB*  /t/1523994/hdmi-2-0-cedia-webinar/60#post_24533808
> 
> 
> 
> I know what HEVC is, but what exactly is HEVC over HDMI?  AFAIK, all you need to do in order to be able to utilize HEVC is to start encoding content using an HEVC encoder (this is done by the content providers whether it be an optical disc format or a video stream) and then play back the content with a device that has an HEVC decoder in it.  I don't think the HDMI cables care what type of encoding scheme was used on the content, so long as the cable can handle the required bandwidth of the AV signal.  Also, usually, the encoded stream would never be passed over an HDMI cable since it would typically be decoded at the source (optical disc player/streaming device) and then passed over the HDMI cable as an uncompressed AV signal.  I'm aware that many players can bitstream encoded audio over SPDIF/HDMI, but I've not heard of any that bitstream encoded video.
> 
> 
> Am I missing something?


You're missing the joke


----------



## tleavit

So to be clear. I have two great 7 year old (using hdmi 1.3) 40 and 50 foot Monoprice HDMI cables going to my projector. Reading here, the cable will need to be replaced with a new one? I still have 2 x cat5e cable run but have never used them.


----------



## img eL




> Quote:
> Originally Posted by *tleavit*  /t/1523994/hdmi-2-0-cedia-webinar/60#post_24535199
> 
> 
> Reading here, the cable will need to be replaced with a new one?



I think we're cool with just High Speed HDMI cables for the time being.


----------



## img eL

From the hdmi.org FAQ:

*Does HDMI 2.0 require new connectors?*


No, HDMI 2.0 uses the existing connectors.

*Does HDMI 2.0 require new cables?*


No, HDMI 2.0 features will work with existing HDMI cables. Higher bandwidth features, such as 4K@50/60 (2160p) video formats, will require existing High Speed HDMI cables (Category 2 cables).


----------



## dew42




> Quote:
> Originally Posted by *HockeyoAJB*  /t/1523994/hdmi-2-0-cedia-webinar/60#post_24533808
> 
> 
> 
> I know what HEVC is, but what exactly is HEVC over HDMI?  ...



Rather than decoding the compressed data in the player, it would be decoded in the TV. Since TVs already have video decoders for internet streaming, adding HEVC, if it's not already there, may not be that big of a deal. The result would be more data pushed through the HDMI connection without hitting the 18 Gbps limit. HDMI would not need to do anything other than provide a protocol so the player could ask the TV if it could handle HEVC.


I have no idea if this is being considered or not. It just occurred to me as a way to overcome the bandwidth limit.
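For a sense of scale, here is the rough compression ratio such a scheme would exploit. The HEVC bitrate below is an illustrative guess, not a spec number:

```python
# Illustrative comparison of a compressed HEVC stream vs. the
# uncompressed HDMI link it would otherwise need. Both figures are
# ballpark assumptions, not values from any specification.

hevc_mbps = 100           # assumed UHD-disc-class HEVC bitrate
link_gbps = 18            # HDMI 2.0's maximum link rate

ratio = link_gbps * 1000 / hevc_mbps
print(f"~{ratio:.0f}:1")  # the link carries ~180x the compressed bitrate
```

That gap is why sending the compressed stream sidesteps the bandwidth ceiling entirely; the catch, as noted below in the thread, is where the decode happens and how the content is protected in transit.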


----------



## Otto Pylot




> Quote:
> Originally Posted by *tleavit*  /t/1523994/hdmi-2-0-cedia-webinar/60_20#post_24535199
> 
> 
> So to be clear. I have two great 7 year old (using hdmi 1.3) 40 and 50 foot Monoprice HDMI cables going to my projector. Reading here, the cable will need to be replaced with a new one? I still have 2 x cat5e cable run but have never used them.



For a run that long, I'd use your CAT-5e cables or pull CAT-6 and use those. High Speed HDMI, as most of you know, is only certifiable for up to 25'. That doesn't mean they won't work at longer lengths, depending on the wire gauge, but with UHD/4K on the horizon, HDMI 2.0 coming, etc., it might be a good idea to look at what cable options are available and upgrade to avoid any possible headaches later on.


----------



## tleavit

If I try my 40' cable, what would be the obvious markers of failure? Since it's data, I would guess dropouts and errors on the signal reported on the devices? I'm concerned because when I built my media room, I couldn't run conduits to my projector, so my cables are set hard in my ceiling, and to get a new cable up there I'm going to have to get creative (and ugly, like running it up the ceiling). I of course knew it would bite me in the buttocks later. But then again, I didn't expect it to run the technology so well even 7 years after the install either.


----------



## Otto Pylot

Off the top of my head I'd say "sparkles" (little flashes of light on the screen), pixelating, etc. If the wire gauge is thick enough you may be ok, for now.


----------



## ssb201




> Quote:
> Originally Posted by *dew42*  /t/1523994/hdmi-2-0-cedia-webinar/60#post_24535908
> 
> 
> Rather than decoding the compressed data in the player it would be decoded in the TV. Since TVs have video decoders for internet streaming adding HEVC, if its not already there, may not be that big of a deal. The result would be more data pushed through the HDMI connection without hitting the 18Gbps limit. HDMI would not need to do anything other than provide a protocol so the player could ask the TV if it could handle HEVC.
> 
> 
> I have no idea if this is being considered or not. It just occurred to me as a way to overcome the bandwidth limit.



Pretty much all of the announced UHD TVs will support HEVC (h.265) decoding. While the TMDS signaling of HDMI could not be used for such functionality, it is possible to send the compressed content over the HEC channel. There are a few problems with this:


1) HEC (HDMI Ethernet Channel) is currently limited to 100 Mbit/s. While this is fine for Blu-ray or Internet-streamed h.265, it would be quite limiting for future content.

2) Blu-Ray and other content distribution methods require HDCP (High-bandwidth Digital Content Protection), and HDCP protects the TMDS channel, not the HEC.


It is an interesting idea. Going back to the HiFi days of separates, where the Blu-Ray drive becomes the "transport" and nothing more. I certainly could foresee a day where a very basic disc reader transfers the stored data over gigabit Ethernet to the TV, which decodes the content, displays it, and sends the audio to a receiver using ARC (Audio Return Channel). There would have to be a content protection mechanism for protecting the data communication between the drive and the display.


In the long run (8K and up displays) something like this may become inevitable. The amount of bandwidth needed for pixel by pixel transmission from device to device will become so high that it simply cannot be done unless it is within the device or over fiber. This is the same issue that is beginning to pop up with PCs and the limitations of PCI Express. There are only so many lanes and such a high frequency that can be run across the traces.


----------



## TorTorden

So just hoping to confirm a suspicion,


A manufacturer could implement the new CEC extensions and, say, Dual View, and not support 4K@60, but still stamp the box with HDMI 2.0?


----------



## HockeyoAJB




> Quote:
> Originally Posted by *TorTorden*  /t/1523994/hdmi-2-0-cedia-webinar/60#post_24539687
> 
> 
> So just hoping to confirm a suspicion,
> 
> 
> A manufacturer could implement the new CEC extensions and, say, Dual View, and not support 4K@60, but still stamp the box with HDMI 2.0?


Technically, manufacturers aren't supposed to be stamping "HDMI 2.0" on their boxes or in their product specs.  The HDMI organization has been pushing manufacturers to list each feature individually.  One way that manufacturers are trying to clarify it (while still attempting to draw attention to the HDMI 2.0 features that they do support) is to list the newest features (the ones that are in the HDMI 2.0 spec) at the top of the feature list with the "HDMI 2.0" moniker either next to the feature or as a header above a group of HDMI 2.0 features.

 

The HDMI organization's main goal is to establish the standard for the interface between consumer electronic devices.  While there needs to be a list of features in mind for determining what the minimum bandwidth requirement should be, it is not in HDMI's scope to dictate what features *must* be included in consumer electronic devices.


----------



## Otto Pylot

^^^^^ Agree. Another, imo, deceptive tactic that mfrs are using is stating that their products "support HDMI 2.0". To some, that implies HDMI 2.0, period. Sony's HDMI 2.0 firmware update, for example, increases clock speed to support some of the new resolutions that are covered in HDMI 2.0, but it is not fully HDMI 2.0 compliant. So without the consumer knowing what the new specs are, and what's included in whatever "HDMI 2.0" device they are purchasing, they may still be disappointed once the dust settles and they have issues that they thought would have been eliminated with the new standard.


One has to look very closely at any product that has HDMI 2.0 anywhere on the box or product literature to see exactly what they are getting with HDMI 2.0. Unfortunately, a lot of folks don't want to take the time to do their homework and jump at the new marketing spin thinking they are getting, or will get, the entire new spec. Still too early for fully compliant HDMI 2.0 hardware.


----------



## bungi43




> Quote:
> Originally Posted by *Utopianemo*  /t/1523994/hdmi-2-0-cedia-webinar/60#post_24531453
> 
> 
> *Hand shoots up*
> 
> 
> Anybody who is a longtime listener to Scott's podcast saw it coming. I've been locked in a dilemma for months now because my next upgrade needs to be my AVR. I was set on a Pio Elite, but even with 4K upscaling/passthrough, I knew(from HT Geeks) that the insiders at CEDIA, HDMI, and so on were pushing for higher color bit depth and other things. So it didn't make sense to drop a large chunk of change for the state-of-the-art when the state was going to change 6 months down the line.
> 
> 
> That's changed somewhat now that Onkyo, Pioneer, and to a lesser extent Emotiva are coming out with AVRs or processors that are "HDMI 2.0 compliant". Since the 2.0 spec is primarily a list of features, HDMI is strenuously discouraging manufacturers from using the 2.0 moniker(unless their devices comply with every feature on the list). That's why Emotiva dropped the 2.0 from their feature list. Given that Onkyo and Pioneer are still openly touting that their products feature HDMI 2.0, I am hopeful this means they'll support REC2020 and the higher bit depth.......but it's not likely.



I'm in the same boat as you. I need to upgrade, and my AVR is next on the list. Problem for me is, my video equipment is a bit old, and I'm not in a huge hurry to update it, so I'm not entirely sure much of this applies to me anyway. I don't have a 3D TV, I don't have a UHD TV, and I don't intend to buy either anytime soon. I was fortunate enough to basically walk into a final-model Panny plasma, and the other stuff hasn't jumped out at me yet (on top of the negative reviews a lot of it has received)...so I'm trying to figure out whether HDMI 2.0 would be important enough for me to wait much longer to upgrade my AVR.


----------



## TorTorden




> Quote:
> Originally Posted by *bungi43*  /t/1523994/hdmi-2-0-cedia-webinar/60#post_24543213
> 
> 
> I'm in the same boat as you. I need to upgrade, and my AVR is next on the list. Problem for me is, my video equipment is a bit old, and I'm not in a huge hurry to update it, so I'm not entirely sure much of this applies to me anyway. I don't have a 3D TV, I don't have a UHD TV, and I don't intend to buy either anytime soon. I was fortunate enough to basically walk into a final-model Panny plasma, and the other stuff hasn't jumped out at me yet (on top of the negative reviews a lot of it has received)...so I'm trying to figure out whether HDMI 2.0 would be important enough for me to wait much longer to upgrade my AVR.



Personally, if I needed/wanted to upgrade my AVR now, I would look for a used one and come back when things settle. I'm absolutely sure my 2-year-old AVR has a few more years in it, but there's always someone who has to have the latest, so you can often find good used deals (barring scams, of course).


If a 4K disc player is released, I would also guess it would offer an HDMI split output, so we can bypass the AVR for video as many have done for 3D for so long.


----------



## chavel

Is there any possibility of linking a DisplayPort computer to an HDMI 2.0 4K display with an adapter, now or in the future?


----------



## Otto Pylot

^^^^ I hope so. Maybe DisplayPort will become another input in the future.


----------



## pettern




> Quote:
> Originally Posted by *chavel*  /t/1523994/hdmi-2-0-cedia-webinar/60#post_24544627
> 
> 
> Is there any possibility of linking a DisplayPort computer to an HDMI 2.0 4K display with an adapter, now or in the future?



Use a DisplayPort-to-HDMI cable?


----------



## chavel




> Quote:
> Originally Posted by *pettern*  /t/1523994/hdmi-2-0-cedia-webinar/60#post_24545085
> 
> 
> Use a DisplayPort-to-HDMI cable?



I should have said "at 3840 x 2160, 60Hz." This would be for non-HDCP computer use on a 4K display, such as photography. At some point, HDMI 2.0 should find its way into video cards and laptop computers.


----------



## HDTVAV




> Quote:
> Originally Posted by *golem*  /t/1523994/hdmi-2-0-cedia-webinar#post_24521853
> 
> 
> Ah so when the Sony X950b only lists 3840x2160/60p (YCbCr 4:2:0 8bit), it means it probably wasnt built using the newer faster bandwidth hardware? Thats disappointing.


 

 


> Quote:
> Originally Posted by *Scott Wilkinson*  /t/1523994/hdmi-2-0-cedia-webinar#post_24523552
> 
> 
> 
> That is correct.


 

Great...

 

So not even the upcoming 2014 Sony models have the faster bandwidth hardware in them...

 

How can they seriously state that they are HDMI 2.0 then?

 

That is just plain WRONG!

 

So what, wait for the 2015 models?


----------



## Utopianemo




> Quote:
> Originally Posted by *Mark12547*  /t/1523994/hdmi-2-0-cedia-webinar/60#post_24533070
> 
> 
> I don't see 21x9 TVs as a consequence of forcing all HDMI 2.0 TVs abide by all specs of HDMI 2.0, just that they would have to be able to accept a 21x9 video format and do something reasonable if the TV has a different aspect ratio, such as doing its own letterboxing or allowing the consumer to zoom in (effectively doing "center cut"). Conceptually I don't see that as significantly different from how most 16x9 TVs currently handle 4:3 signals by adding pillar bars (or possible other options like stretch or zoom to fill the screen).


Good point.


----------



## Utopianemo




> Quote:
> Originally Posted by *HDTVAV*  /t/1523994/hdmi-2-0-cedia-webinar/90#post_24550003
> 
> 
> 
> 
> Great...
> 
> 
> So not even the upcoming 2014 Sony models have the faster bandwidth hardware in them...
> 
> 
> How can they seriously state that they are HDMI 2.0 then?
> 
> 
> That is just plain WRONG!
> 
> 
> So what, wait for the 2015 models?


I don't think the 2014 Sony models state they are HDMI 2.0.....

As was mentioned before, the higher-end Sony models seem to have some firmware upgradability, so those guys aren't totally out in the cold. I wouldn't get too worked up about the whole thing. This always happens with a new standard; there are always bumps that need to be smoothed out. Don't be an early adopter and you'll be just fine.


----------



## Utopianemo




> Quote:
> Originally Posted by *Geoff D*  /t/1523994/hdmi-2-0-cedia-webinar/60#post_24531798
> 
> 
> Colour, rumour, favour, it's good to get a "u" in there sometimes.


Yeah, that and adding extra letters to non-ferrous metals, like alumin*i*um and copppppppper.


----------



## HDTVAV




> Quote:
> Originally Posted by *Utopianemo*  /t/1523994/hdmi-2-0-cedia-webinar/90#post_24550064
> 
> 
> 
> I don't think the 2014 Sony models state they are HDMI 2.0.....


 

Directly from Sony's press release a few days ago...

 

*All Sony 4K displays currently support the features defined in HDMI 2.0.*


----------



## Utopianemo




> Quote:
> Originally Posted by *HDTVAV*  /t/1523994/hdmi-2-0-cedia-webinar/90#post_24550095
> 
> 
> Directly from Sony's press release a few days ago...
> 
> *All Sony 4K displays currently support the features defined in HDMI 2.0.*



Is the faster hardware you're asking about included in the spec? It's a rhetorical question. If Sony says their displays support the features defined in HDMI 2.0, then they do. I'm only speculating, but I would guess Sony put extra-beefy guts in their new displays knowing 1) the spec was still in flux as the displays were being developed and 2) they needed to get the jump on the market. Since it's been mentioned elsewhere in this thread that their 4K displays have some degree of firmware upgradeability, I'm guessing they tried to anticipate where the HDMI spec was heading and outfit the TVs with enough brawn to accommodate the changes. Again, I'm only speculating, but the people here who actually have Sony 4K TVs and projectors don't seem to be too worried about it. It will all be okay.


----------



## HDTVAV




> Quote:
> Originally Posted by *Utopianemo*  /t/1523994/hdmi-2-0-cedia-webinar/90#post_24550147
> 
> 
> 
> Is the faster hardware you're asking about included in the spec? It's a rhetorical question. If Sony says their displays support the features defined in HDMI 2.0, then they do. I'm only speculating, but I would guess Sony put extra beefy guts in their new displays knowing 1) the spec was still in flux as the displays were being developed and 2) they needed to get the jump on the market. Since it's been mentioned elsewhere in this thread that their 4K displays have some degree of firmware upgradeability, I'm guessing they tried to anticipate where the HDMI spec was heading and outfit the tvs with enough brawn to accomodate the changes. Again, I'm only speculating, but the people here who actually have Sony 4K tvs and projectors don't seem to be too worried about it. It will all be okay.


 

Maybe you are not reading it right... 









 

With all due respect - it says - currently...

 

*All Sony 4K displays currently support the features defined in HDMI 2.0.*

 

It doesn't say may be available with a future firmware upgrade - it says they have *the features* - currently.

 

And as far as your assertion:

 

*"If Sony says their displays support the features defined in HDMI 2.0, then they do."*

 

Do you know something the rest of us don't? LOL


----------



## Utopianemo

I may be reading it wrong.


Scott said,

"*As I said earlier, HDMI 2.0 can handle 4K/UHD at 50 and 60 frames per second, but there are some limitations—in particular, in the bit depth and level of color subsampling it can convey.*"


You quoted golem saying "Ah so when the Sony X950b only lists 3840x2160/60p (YCbCr 4:2:0 8bit), it means it probably wasnt built using the newer faster bandwidth hardware? Thats disappointing."


and then followed that up by asking how Sony can claim to be compatible with HDMI 2.0. They can, because the HDMI 2.0 spec doesn't say the device has to display *BOTH* 3840x2160/60p AND a higher bit depth simultaneously. It merely states that those display parameters are part of the spec. The Sony displays CURRENTLY can do 4K/60fps at a low bit depth. Displaying a higher bit depth is a separate feature that isn't required to be done at a high frame rate. Golem's frustration is that the faster hardware needed to display 4K/60fps at a higher bit depth isn't available, but doing all that at the same time isn't part of the HDMI 2.0 spec.


Wikipedia gives a good run-down of the HDMI Spec here: http://en.wikipedia.org/wiki/HDMI#Version_2.0


----------



## HDTVAV




> Quote:
> Originally Posted by *Utopianemo*  /t/1523994/hdmi-2-0-cedia-webinar/90#post_24550185
> 
> 
> I may be reading it wrong.
> 
> 
> Scott said,
> 
> "*As I said earlier, HDMI 2.0 can handle 4K/UHD at 50 and 60 frames per second, but there are some limitations—in particular, in the bit depth and level of color subsampling it can convey.*"
> 
> 
> You quoted golem saying "Ah so when the Sony X950b only lists 3840x2160/60p (YCbCr 4:2:0 8bit), it means it probably wasnt built using the newer faster bandwidth hardware? Thats disappointing."
> 
> 
> and then followed that up by asking how Sony can claim to be compatible with HDMI 2.0. They can, because the HDMI 2.0 spec doesn't say the device has to display *BOTH* 3840x2160/60p AND a higher bit depth simultaneously. It merely states that those display parameters are part of the spec. The Sony displays CURRENTLY can do 4K/60FPS at a low bit depth. Displaying a higher bit depth is a separate feature that isn't required to be done at a high frame rate. Golem's frustration is that the faster hardware needed to display 4K/60fps at a higher bit depth isn't available, but doing all that at the same time isn't a part of the HDMI 2.0 spec.
> 
> 
> Wikipedia gives a good run-down of the HDMI Spec here: http://en.wikipedia.org/wiki/HDMI#Version_2.0


 

OK, now you are just being silly by trying to be right - instead of dealing with the issues... 









 

Here are the HDMI 2.0 specs directly from HDMI.org - 

 

*Does HDMI 2.0 support BT.2020 (rec.2020) colorimetry?*

 

*Yes. HDMI 2.0 includes support for BT.2020 Colorimetry with 10 or more bits of color depth. *

 

*Video Formats defined in BT.2020 and supported by HDMI 2.0 specification: *

 

*– 2160p, 10/12 bits, 24/25/30Hz, RGB/4:2:2/4:4:4 *

*– 2160p, 10/12 bits, 50/60Hz, 4:2:0/4:2:2*

 


And

 

*What are the 4K formats supported by HDMI 2.0?*

 

 
*[Table from HDMI.org listing the 4K formats supported by HDMI 2.0: each resolution/frame-rate row (24/25/30Hz and 50/60Hz) against 8-, 10-, 12-, and 16-bit color depths, showing which of the RGB, 4:4:4, 4:2:2, and 4:2:0 samplings are supported. **Bold** entries are new with HDMI 2.0.]*

 

And yet, according to the Sony specs on their website, MOST of the above is not supported on either the 2013 or 2014 models...

 

The TVs only support 4:2:0 8-bit, and not the 10-bit, 12-bit, 16-bit, etc. that HDMI 2.0 does.

 

Or am I missing something here...?


----------



## kristoffer77

Movies in 24FPS 4K with 4:4:4 16 Bit RGB color would be


----------



## Otto Pylot

I think the point being missed here is that the Sony HDMI 2.0 upgrade increases the bandwidth to 8.91Gbps (or thereabouts), which is part of the HDMI 2.0 spec, but it's not the entire bandwidth capability or feature set described in the HDMI 2.0 specifications. So, in a sense, it's "HDMI 2.0 Lite," as someone else said. HDMI 2.0? Yes. Fully compliant HDMI 2.0? No. It will be interesting to see how this plays out for the early adopters of "HDMI 2.0" TVs this year as opposed to what is available with next year's models (and other devices). Too early to use HDMI 2.0 as a purchasing decision.


----------



## HDTVAV




> Quote:
> Originally Posted by *Otto Pylot*  /t/1523994/hdmi-2-0-cedia-webinar/90#post_24551193
> 
> 
> I think the point that is being missed here is that the Sony HDMI 2.0 upgrade increases the bandwidth to 8.91Gbps (or thereabouts) which is part of the HDMI 2.0 spec, but it's not the entire bandwidth capability or feature set that is described in the HDMI 2.0 specifications. So, in a sense, it's "HDMI 2.0 Lite" as someone else said. HDMI 2.0? Yes. Fully compliant HDMI 2.0? No. It will be interesting to see how this plays out for the early adopters of "HDMI 2.0" tv's this year as opposed to what is available with next year's models (and other devices). Too early to use HDMI 2.0 as purchasing decision.


 

Well, the HDMI 1.4 spec already allows bandwidth of up to 10.2Gbps - so I'm not sure how Sony's HDMI 2.0 "increases" the bandwidth to 8.91Gbps...?

 

And stating that what Sony has is HDMI 2.0 would be like some manufacturer stating that their product is HDMI 1.4 - even though it doesn't support 3D...

 

I guess the entire point is - manufacturers should not be allowed to use the HDMI 2.0 moniker - unless their product is capable of all of the HDMI 2.0 specifications.

 

Or, in the alternative - the HDMI consortium should come up with a newer designation - either HDMI 2.0, 2.1, 2.2, etc.... Or HDMI 2.0 "lite"... or something.

 

The consumer is the one being misled here - and in a VERY big way.


----------



## Geoff D

Sony are not increasing the bandwidth at all. What Sony have done is make their 10.2Gb/s HDMI 1.4b equipment compatible with 2.0 specs that fall within the bandwidth limit.


----------



## Otto Pylot

That does make more sense when I went back and looked at the recent Sony HDMI 2.0 upgrade for 4k/60p, 8-bit only. I still think it's misleading though for mfrs to be allowed to state it in such a way that the consumer thinks they're getting the new HDMI 2.0 specs without an explanation of what it actually means at this point in time.


----------



## safe91




> Quote:
> Originally Posted by *HDTVAV*  /t/1523994/hdmi-2-0-cedia-webinar/90#post_24550201
> 
> 
> OK, now you are just being silly by trying to be right - instead of dealing with the issues...
> 
> Here are the HDMI 2.0 specs directly from HDMI.org -
> 
> *Does HDMI 2.0 support BT.2020 (rec.2020) colorimetry?*
> 
> *Yes. HDMI 2.0 includes support for BT.2020 Colorimetry with 10 or more bits of color depth.*
> 
> *Video Formats defined in BT.2020 and supported by HDMI 2.0 specification:*
> 
> *– 2160p, 10/12 bits, 24/25/30Hz, RGB/4:2:2/4:4:4*
> 
> *– 2160p, 10/12 bits, 50/60Hz, 4:2:0/4:2:2*
> 
> *[Table of 4K formats supported by HDMI 2.0 snipped]*
> 
> When, according to the Sony specs on their website, MOST of the above is not supported on either the 2013 or 2014 models...
> 
> The TVs only support 4:2:0 8-bit, and not the 10-bit, 12-bit, 16-bit, etc. that HDMI 2.0 does.
> 
> Or am I missing something here...?



Hi,


another interesting thread, but let me add a few points of precision:


1/ Two closely related worlds are writing the specifications for the video formats to come, and each of these worlds has first worked to reconcile economic and technical constraints with current and future realities:

- Television (broadband, broadcasting) is driven by the ITU, EBU, SMPTE, DVB, CEA, ...

- Movies on physical storage formats (Blu-ray, 4K Blu-ray) are driven (currently) by the BDA and the Hollywood majors (with the basic recommendations of the ITU and SMPTE).


2/ Some technologies are common and others are specific to each of these two worlds:

- Signal transport (HDMI), encoders/decoders (HEVC, AVC), some color standards (Rec.709, Rec.2020), and some frequencies are common.

- Content protection (HDCP), some frequencies, and aspect ratios are specific to each world.


3/ To be implemented in both worlds, HDMI must support the features of both.


4/ Rec.2020 was selected for UHD-1 and UHD-2 (4K and 8K in the TV world):

- It will work with a color depth of 10 or 12 bits (8-bit color depth is not compliant because it doesn't carry enough information, though more than 12 bits could work with an addendum to the Rec.2020 specification).

- Rec.2020 is supported by HDMI 2.0 and the HEVC/AVC codecs.

- Rec.2020 allows the use of RGB or YUV (4:4:4, 4:2:2, AND 4:2:0) at any frequency...


5/ Can we say that, because HDMI 2.0 supports Rec.2020, and because Rec.2020 allows 4:4:4, 4:2:2, and 4:2:0, HDMI 2.0 will support every kind of video signal format? Theoretically yes... but because it depends on which frequencies are typically used in each world, the answer is more complicated.

- Most TVs natively have 8-bit decoding; there are very few models capable of decoding 10 bits.

- Proportionally more video projectors are 10-bit capable, relative to the number of models out.

- 4:2:0 at 8 bits is the only video format accepted by the BDA (and also the majority format in the TV world).

- 24Hz is not frequently used in the TV world; 50/60Hz is not frequently used for movie formats.


6/ Do the A/B/C HDMI levels (introduced by Sony) exist in the HDMI specifications? NO WAY!

It's just a clever (and marketing-driven) way for Sony to "occupy the land" with an uninterrupted line of UHD models on the market while waiting for UHD/60Hz chipsets to be available in mass production.


Here is a comparison between HDMI's capabilities and what Sony chose to implement in its HDMI levels:











Obviously, not all video formats are used in both worlds, but this gives a global view of what is transportable and what is not (even if some video formats will never be implemented).


7/ The HDMI consortium uses the TMDS "protocol" for video signal transport over HDMI cables:

- 3 wire pairs inside the HDMI cable are dedicated to the video signal.

- They are synchronized at a specific speed (called the TMDS frequency) depending on the HDMI release (340MHz since HDMI 1.4, 600MHz since HDMI 2.0).

- On every clock cycle, the 3 TMDS channels each send 10 bits (at the same time), which explains why, with the TMDS protocol, there is no difference in bandwidth consumption between an 8-bit and a 10-bit video signal. With an 8-bit video signal, the 10-bit transmission is completed with 2 extra bits (for transmission control) to preserve synchronization (1 clock cycle = 1 pixel sent).

- So, on every clock cycle a complete pixel is sent (RGB/YUV are coded with 3 components). That's why the TMDS frequency is also quoted as the number of pixels sent per second.
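A back-of-the-envelope sketch of the TMDS arithmetic above, using only the figures from this post (3 data channels, 10 bits per channel per clock, 340MHz vs. 600MHz clocks):

```python
# Raw TMDS bandwidth = clock rate x 3 data channels x 10 bits per channel per clock.
def tmds_raw_gbps(tmds_clock_hz: float) -> float:
    return tmds_clock_hz * 3 * 10 / 1e9

# HDMI 1.4 runs the TMDS clock at up to 340MHz, HDMI 2.0 at up to 600MHz.
hdmi_1_4 = tmds_raw_gbps(340e6)  # 10.2 Gbps
hdmi_2_0 = tmds_raw_gbps(600e6)  # 18.0 Gbps

# Since one complete pixel is sent per clock cycle, the TMDS clock is also
# the pixel rate: 600MHz = 600 million pixels per second.
print(hdmi_1_4, hdmi_2_0)
```

Those two figures match the 10.2Gbps and 18Gbps numbers quoted throughout this thread.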


8/ How does the industry see the future of these technologies in the consumer sector (TV world)?


DigitalEurope has published a roadmap that indicates many things:











• The roadmap is divided into two periods: 2013 to mid-2016 is reserved for the transition to UHD and UHD-1; mid-2016 to mid-2019 for the transition to UHD-2. Note that not every aspect of the future standards the industry will be able to deploy is known yet (color space, frequency, color depth, audio format).

• End 2013: general availability of components for encoding/decoding an HEVC UHD signal at 30Hz max.

• Mid 2014: general availability of components for encoding/decoding an HEVC UHD signal at 60Hz max; HDMI 2.0 starts to become standard on TV screens.

• Mid 2015: HEVC codec evolution, evolution of the HDMI standard, live UHD video production environments operational for satellite and DTT.

• End 2015: a UHD 4K successor to Blu-ray (display frequency unknown) with compatible discs; appearance of 8K TV screens with a new HDMI standard.

• Mid 2016: a new general-purpose connection in addition to HDMI; general deployment of UHD-1 satellite receiver decoders compatible with the future standard.

• End 2016: general availability of components for encoding/decoding an HEVC UHD signal at 120Hz max.

• Early 2018: appearance of 4K IDTV screens.

• Mid 2018: evolution of the HDMI standard for UHD-2.

• Mid 2019: a new general-purpose connection in addition to HDMI to support UHD-2 data rates; deployment begins.


DVB holds broadly the same view:











• Phase 1: decoders available in 2014/15 will mainly be limited to a 60p frame rate and a color depth of 10-bit max. The typical UHD signal profile that broadcasters will be able to stream in 2014/2015 would be 2160p50/60 (or 1080p100/120), 10 bits, Rec.709, with "dynamic range" treatment. 5.1 audio tracks will be recommended at this stage.

• Phase 2: decoders available in 2017/18 will accept frequencies up to 120p and a color depth of 12 bits. The typical UHD signal profile that broadcasters will be able to stream in 2017/2018 would be 2160p100/120, 12 bits, Rec.2020, with "dynamic range" treatment. 5.1 audio will be generalized and extended with a new broadcast audio format (to be defined by the ITU), allowing discrete or hybrid soundtracks to be added and a move toward 7.1 or 9.1.

• Phase 3: the arrival of UHD-2 in 2020, with 33 million pixels per image. The typical UHD signal profile that broadcasters will be able to stream from 2020 would be 4320p100/120, 12 bits, Rec.2020, with "dynamic range" treatment.


Regards,


Safe.


----------



## HDTVAV


With all due respect safe91 - I have no idea what you were trying to "say"...

 

Care to give us a few-sentence synopsis...?


----------



## safe91




> Quote:
> Originally Posted by *HDTVAV*  /t/1523994/hdmi-2-0-cedia-webinar/90#post_24551539
> 
> 
> With all due respect safe91 - I have no idea what you were trying to "say"...
> 
> 
> Care to give us a few-sentence synopsis...?



Sorry for this massive post, but many things are connected, and working on one specific piece doesn't give the global view needed.

I wasn't trying to write my opinion; I was trying to explain known facts.

I hope those who read attentively to the end have understood what I was referring to.



Regards.


----------



## HockeyoAJB




> Quote:
> Originally Posted by *HDTVAV*  /t/1523994/hdmi-2-0-cedia-webinar/90#post_24551539
> 
> 
> 
> With all due respect safe91 - I have no idea what you were trying to "say"...
> 
> 
> 
> Care to give us a few-sentence synopsis...?


Most of the information in the final two charts deals with the standard for broadcast UHDTV (satellite, cable, and possibly OTA?) and when they expect to implement certain improvements in Europe.  The US will likely trail Europe by as much as 6 months.  Japan and China intend to skip 4K and go straight to 8K in the next couple of years.

 

I guess the bottom line is that you should understand that HDMI 2.0 and many of the standards that the industry is developing now are based on what they hope to accomplish in the coming years.  The standards come first (initially as a basic outline).  Then equipment starts to be made that can do some (but not all) of what is listed in the standards.  As technology advances, more and more of what is listed in those standards comes to fruition.  Meanwhile, they continue to evaluate where the industry is and where it hopes to go, revising the standards as needed/wanted.

 

It is/was a mistake to try and use the HDMI moniker as some sort of certification of what features a particular consumer electronic device has or some kind of grading system for said equipment.  That's not its purpose.  Bear in mind that some of the features in the HDMI 2.0 specification such as ITU BT.2020 (rec. 2020) color space can't currently be implemented because the display technology to accomplish it doesn't even exist yet.  Others, such as HDR (high dynamic range) are only in the beginning stages of implementation as there isn't a universal standard for how to capture, deliver, and display content that contains HDR information at every step of the chain from the production studio to people's homes, yet.  Dolby Vision is one possible solution for HDR, but it hasn't been accepted as *the* solution yet.


----------



## safe91




> Quote:
> Originally Posted by *HockeyoAJB*  /t/1523994/hdmi-2-0-cedia-webinar/90#post_24551989
> 
> 
> 
> Most of the information in the final two charts deals with the standard for broadcast UHDTV (satellite, cable, and possibly OTA?) and when they expect to implement certain improvements in Europe.  The US will likely trail Europe by as much as 6 months.  Japan and China intend to skip 4K and go straight to 8K in the next couple of years.
> 
> 
> I guess the bottom line is that you should understand that HDMI 2.0 and many of the standards that the industry is developing now are based on what they hope to accomplish in the coming years.  The standards come first (initially as a basic outline).  Then equipment starts to be made that can do some (but not all) of what is listed in the standards.  As technology advances, more and more of what is listed in those standards comes to fruition.  Meanwhile, they continue to evaluate where the industry is and where it hopes to go, revising the standards as needed/wanted.
> 
> 
> It is/was a mistake to try and use the HDMI moniker as some sort of certification of what features a particular consumer electronic device has or some kind of grading system for said equipment.  That's not its purpose.  Bear in mind that some of the features in the HDMI 2.0 specification such as ITU BT.2020 (rec. 2020) color space can't currently be implemented because the display technology to accomplish it doesn't even exist yet.  Others, such as HDR (high dynamic range) are only in the beginning stages of implementation as there isn't a universal standard for how to capture, deliver, and display content that contains HDR information at every step of the chain from the production studio to people's homes, yet.  Dolby Vision is one possible solution for HDR, but it hasn't been accepted as _*the*_ solution yet.



Thanks HockeyoAJB for this perfect synthesis.


While the future is known for the TV world, we currently have no information at all from the other world (UHD/4K movies on physical storage).

Whatever video signal characteristics are chosen, be sure that HDMI will support them, and if not, we'll get a new 2.x release. In other words, the HDMI consortium doesn't dictate what the next video formats will be, but it will support them.


Sony can implement what they want in their products and call it HDMI 2.0 compliant with different levels, but the real full capacity of HDMI 2.0 will not be exploited this year...


Regards.


----------



## HDTVAV


Exactly...

 

However, the average, and even the more sophisticated consumer - does not know this.

 

When you see something has HDMI 2.0 - you expect, and reasonably so, that it has FULL HDMI 2.0 - and not 1 of the 10 specs, or some sort of subset of HDMI 2.0.

 

Again, it is VERY misleading - and it is done knowing it is misleading.


----------



## HDTVAV


So when should we expect arrival of FULL HDMI 2.0 in products...?

 

Obviously not in 2014...

 

Next year's models - 2015?

 

Hopefully not 2016...?


----------



## safe91




> Quote:
> Originally Posted by *HDTVAV*  /t/1523994/hdmi-2-0-cedia-webinar/90#post_24552173
> 
> 
> Exactly...
> 
> 
> However, the average, and even the more sophisticated consumer - does not know this.
> 
> 
> When you see something has HDMI 2.0 - you expect, and reasonably so, that it has FULL HDMI 2.0 - and not 1 of the 10 specs, or some sort of subset of HDMI 2.0.
> 
> 
> Again, it is VERY misleading - and it is done knowing it is misleading.



That's why people look at forums like AVS, right? To get real and clear news.


- Consumers should avoid associating HDMI releases with HDMI cables (there is no correspondence at all) and learn how to find what fits their needs... sellers continue misleading consumers.

- Early adopters usually know this information (not all of them, but a big part).


----------



## Otto Pylot

I think 2016 is reasonable for an almost fully compliant line of HDMI 2.0 devices. However, that means you'll need to update not only your TV but your receiver, Blu-ray player, etc etc etc.


----------



## safe91




> Quote:
> Originally Posted by *HDTVAV*  /t/1523994/hdmi-2-0-cedia-webinar/90#post_24552179
> 
> 
> So when should we expect arrival of FULL HDMI 2.0 in products...?
> 
> 
> Obviously not in 2014...
> 
> 
> Next year's models - 2015?
> 
> 
> Hopefully not 2016...?




The end of 2015 coincides with the arrival of 4K Blu-ray and the commercialization of definitive UHD products (first phase). We can expect that by then everything (full HDMI 2.0, HDR, HEVC) will be out on the market.


For my part, I hope for 10-bit color depth as the minimum standard in both worlds. 4:2:2 could also be a reasonable choice (more practical for the next Blu-ray generation than for the satellite and DTT world).


Rec.2020 could maybe be expected for 2017; difficult to say.


----------



## HDTVAV




> Quote:
> Originally Posted by *Otto Pylot*  /t/1523994/hdmi-2-0-cedia-webinar/90#post_24552235
> 
> 
> I think 2016 is reasonable for an almost fully compliant line of HDMI 2.0 devices. However that means you'll need to update not only your tv but your receiver, blu-ray player, etc etc etc.


 

 


> Quote:
> Originally Posted by *safe91*  /t/1523994/hdmi-2-0-cedia-webinar/90#post_24552280
> 
> 
> 
> The end of 2015 coincides with the arrival of 4K Blu-ray and the commercialization of definitive UHD products (first phase). We can expect that by then everything (full HDMI 2.0, HDR, HEVC) will be out on the market.
> 
> 
> For my own, i hope 10bits color depth as minimum standard on both world. 4:2:2 could be a reasonable choice also ( more practical in next BluRay generation than in satellite and TNT world ).
> 
> 
> Rec.2020 could maybe expected for 2017, difficult to say.


 

So, in other words - the 2016 models...?


----------



## safe91

Yes !


----------



## HockeyoAJB




> Quote:
> Originally Posted by *HDTVAV*  /t/1523994/hdmi-2-0-cedia-webinar/90#post_24552294
> 
> 
> So, in other words - the 2016 models...?


By which point HDMI 2.x or even 3.0 will be released as we begin the transition to 8K and whatever other features the industry deems worthy to keep consumers spending money on the next great thing.  However, the content that makes 4K equipment worthwhile will finally be available at that point and it would be a few years before 8K content would be readily available, so you could probably justify the expense at that point.  The only problem is you would need to replace all of your equipment within a fairly short window (2016 to 2019) in order to get the full benefit of UHD 4K before you have to start worrying about some of it becoming obsolete.


----------



## Otto Pylot

^^^^ then there's the even more difficult task of the dreaded "WAF" (Wife Acceptance Factor)


----------



## Scott Wilkinson




> Quote:
> Originally Posted by *Geoff D*  /t/1523994/hdmi-2-0-cedia-webinar/90#post_24551381
> 
> 
> Sony are not increasing the bandwidth at all. What Sony have done is make their 10.2Gb/s HDMI 1.4b equipment compatible with 2.0 specs that fall within the bandwidth limit.


Correct. And the only 2.0 feature that falls within the 10.2 Gbps bandwidth limit is 2160p/60 at 4:2:0 8-bit. All other features supported by HDMI 2.0 require more bandwidth than 10.2 Gbps, which means they require a hardware upgrade. Still, having this one feature allows manufacturers to claim they have HDMI 2.0, though I think that's pretty misleading.
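As a rough sanity check on those numbers, the link rate for each format can be estimated from the pixel clock. This is a sketch only: the 4400 x 2250 total timing for 2160p and the flat 10-bits-per-symbol, 3-channel TMDS model are assumptions, and real-world link overhead differs slightly.

```python
# Estimate aggregate HDMI TMDS rate from the video timing.
# Assumption: simplified model -- 3 data channels, 10 bits on the
# wire per 8-bit symbol, CTA-861 total timing of 4400 x 2250 for 2160p.

def tmds_gbps(h_total, v_total, fps, clock_factor=1.0):
    """clock_factor: 1.0 for 8-bit 4:4:4, 0.5 for 8-bit 4:2:0
    (HDMI carries 4:2:0 at half the effective pixel clock)."""
    pixel_clock = h_total * v_total * fps * clock_factor
    return pixel_clock * 10 * 3 / 1e9

full = tmds_gbps(4400, 2250, 60)       # 2160p/60 8-bit 4:4:4
sub = tmds_gbps(4400, 2250, 60, 0.5)   # 2160p/60 8-bit 4:2:0

print(f"4:4:4 -> {full:.2f} Gbps")  # 17.82: needs the new 18 Gbps link
print(f"4:2:0 -> {sub:.2f} Gbps")   # 8.91: squeaks under the 10.2 Gbps cap
```

Which is why 2160p/60 at 4:2:0 8-bit is the one headline feature that older silicon can plausibly be firmware-flashed to accept.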


----------



## Otto Pylot

Agreed. And I think that's the whole point of this HDMI 2.0 discussion. I fear there are going to be a lot of disappointed people who buy HDMI 2.0 TVs this year thinking they have the full specs.


----------



## Scott Wilkinson




> Quote:
> Originally Posted by *Otto Pylot*  /t/1523994/hdmi-2-0-cedia-webinar/90#post_24562543
> 
> 
> Agreed. And I think that's the whole point of this HDMI 2.0 discussion. I fear there are going to be a lot of disappointed people who buy HDMI 2.0 TVs this year thinking they have the full specs.


Yep, I'm afraid so. That's why I attended the webinar and wrote this report, to help people understand what's really going on with HDMI 2.0.


----------



## Otto Pylot




> Quote:
> Originally Posted by *Scott Wilkinson*  /t/1523994/hdmi-2-0-cedia-webinar/120_20#post_24562631
> 
> 
> 
> Yep, I'm afraid so. That's why I attended the webinar and wrote this report, to help people understand what's really going on with HDMI 2.0.



...and I'd like to thank you for attending and reporting. It was very well written and informative.


----------



## Geoff D




> Quote:
> Originally Posted by *Otto Pylot*  /t/1523994/hdmi-2-0-cedia-webinar/90#post_24562543
> 
> 
> Agreed. And I think that's the whole point of this HDMI 2.0 discussion. I fear there are going to be a lot of disappointed people who buy HDMI 2.0 TVs this year thinking they have the full specs.



Indeed. And on that point, I'm puzzled as to how Panasonic's HDMI 2.0 implementation in their 2013 1st gen 4K sets seems to be regarded as the real deal while everyone else is having to wait until mid-to-late 2014 for legit 2.0 hardware. Heck, because Sony have already got their 2014 4K sets on the production lines, it means that there won't be any genuine 2.0 silicon in their sets until the 2015 range comes out!


----------



## pettern

As long as the 2014-models handle the increased speed (through a beefed up HDMI controller chip), the rest can likely be handled with firmware upgrades, as it appears that it's mostly a software problem.


----------



## Geoff D

Yeah, I suppose that's what Panasonic has done, they over-specced the HDMI processing hardware and then fiddled the software to the appropriate degree once the official 2.0 features were announced. But Sony are on the HDMI Forum's board of directors just like Panasonic, and it baffles me that they didn't come up with something similar, given that they obviously MUST have known that features could be added via firmware just as long as the bandwidth was there. Instead they decided on the easiest way out, a firmware upgrade to unlock the bare minimum of HDMI 2.0 support using existing silicon. Typical ass-backwards Sony thinking.


----------



## Joe Fernand

CEDIA 2013 in Denver hit you with huge 'We are first with HDMI 2.0' banners as soon as you walked in the door – quickly followed by quiet mutterings of 'in-life upgrades, external converters, plug-in boards, it'll be alright on the night' – every 'fudge' you can imagine!


Joe


----------



## pettern




> Quote:
> Originally Posted by *Geoff D*  /t/1523994/hdmi-2-0-cedia-webinar/120#post_24563811
> 
> 
> Yeah, I suppose that's what Panasonic has done, they over-specced the HDMI processing hardware and then fiddled the software to the appropriate degree once the official 2.0 features were announced. But Sony are on the HDMI Forum's board of directors just like Panasonic, and it baffles me that they didn't come up with something similar, given that they obviously MUST have known that features could be added via firmware just as long as the bandwidth was there. Instead they decided on the easiest way out, a firmware upgrade to unlock the bare minimum of HDMI 2.0 support using existing silicon. Typical ass-backwards Sony thinking.



This is likely what Sony is also doing, according to http://hdguru.com/hdmi-2-0-what-you-need-to-know/ , which speculates they are using a chip that can be clocked higher to handle the bandwidth requirements. See also http://www.cnet.com/news/hdmi-2-0-upgrade-path-where-do-the-manufacturers-stand/ , where the TVs that do require a hardware update are listed. In that context, it does sound like Sony will just upclock the HDMI hardware to deal with the increased bandwidth requirements and handle the rest in software. I got the Samsung 64F9005 as it's the only TV with a guaranteed upgrade path, though.


----------



## Otto Pylot

That's sort of what I heard late last year: that Sony's HDMI 2.0 "upgrade" was going to be an increase in clock speed. I just assumed that the first upgrade, recently released, was the clock increase, but apparently it was not. They were just pushing the 1.4 resolution into the 2.0 range while still keeping within the bandwidth of 1.4. I guess if that works OK, the next step would be the clocking with any associated software upgrade. I'm still a little leery about upgrading the firmware multiple times to get as close to full HDMI 2.0 compliance as possible. That path to "full compliance" seems open to all kinds of potential issues, as opposed to having it just built into the hardware to begin with.


----------



## Geoff D




> Quote:
> Originally Posted by *pettern*  /t/1523994/hdmi-2-0-cedia-webinar/120#post_24564037
> 
> 
> This is likely what Sony is also doing, according to http://hdguru.com/hdmi-2-0-what-you-need-to-know/ , which speculates they are using a chip that can be clocked higher to handle the bandwidth requirements. See also http://www.cnet.com/news/hdmi-2-0-upgrade-path-where-do-the-manufacturers-stand/ , where the TVs that do require a hardware update are listed. In that context, it does sound like Sony will just upclock the HDMI hardware to deal with the increased bandwidth requirements and handle the rest in software. I got the Samsung 64F9005 as it's the only TV with a guaranteed upgrade path, though.





> Quote:
> Originally Posted by *Otto Pylot*  /t/1523994/hdmi-2-0-cedia-webinar/120#post_24564122
> 
> 
> That's sort of what I heard late last year: that Sony's HDMI 2.0 "upgrade" was going to be an increase in clock speed. I just assumed that the first upgrade, recently released, was the clock increase, but apparently it was not. They were just pushing the 1.4 resolution into the 2.0 range while still keeping within the bandwidth of 1.4. I guess if that works OK, the next step would be the clocking with any associated software upgrade. I'm still a little leery about upgrading the firmware multiple times to get as close to full HDMI 2.0 compliance as possible. That path to "full compliance" seems open to all kinds of potential issues, as opposed to having it just built into the hardware to begin with.



You can't magically add bandwidth if it's restricted in the hardware itself. "Overclock" the processor all you want, but what good does it do if the data coming in is physically bottlenecked by the silicon that's in front of it? Some MFs appear to have put in beefier hardware in anticipation of this, yes, but what Sony has done is make the regulation 1.4b chip compatible with whatever 2.0 specifications fall under the 10.2Gb/s bandwidth limit; that is to say, not many.


As for the calculations in the table at "hdguru", they seem to be wildly optimistic to me, not least because they include "32 channels of 24-bit, 96 kHz lossless audio" too! The table in the first page of this thread paints a very different picture, and comes from the HDMI Licensing group themselves, so I know which one I believe.
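For what it's worth, whether or not a table includes the audio barely changes the totals. A back-of-the-envelope figure for the extreme case mentioned (32 channels of 24-bit / 96 kHz LPCM; raw payload only, ignoring packet overhead):

```python
# Even the maximal audio case -- 32 channels of 24-bit / 96 kHz LPCM --
# is a rounding error next to the video payload on an 18 Gbps link.
channels, bit_depth, sample_rate = 32, 24, 96_000
audio_bps = channels * bit_depth * sample_rate

print(audio_bps / 1e6)         # 73.728 Mbps
print(audio_bps / 18e9 * 100)  # ~0.41% of an 18 Gbps link
```

So the real disagreement between the tables has to be about the video formats, not the audio.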


----------



## Otto Pylot




> Quote:
> Originally Posted by *Geoff D*  /t/1523994/hdmi-2-0-cedia-webinar/120_20#post_24565860
> 
> 
> 
> You can't magically add bandwidth if it's restricted in the hardware itself. "Overclock" the processor all you want, but what good does it do if the data coming in is physically bottlenecked by the silicon that's in front of it? Some MFs appear to have put in beefier hardware in anticipation of this, yes, but what Sony has done is make the regulation 1.4b chip compatible with whatever 2.0 specifications fall under the 10.2Gb/s bandwidth limit; that is to say, not many.
> 
> 
> As for the calculations in the table at "hdguru", they seem to be wildly optimistic to me, not least because they include "32 channels of 24-bit, 96 kHz lossless audio" too! The table in the first page of this thread paints a very different picture, and comes from the HDMI Licensing group themselves, so I know which one I believe.



I absolutely agree with you. The more I read about this, the more it seems like a "bait and switch" scam by Sony or any other mfr who says their 2014 TVs are HDMI 2.0. It still amazes me that they can get away with it, but then again, my guess is that the majority of the buying public doesn't take the time to research their purchases carefully or bother to stop by AVS and ask. They just swallow the marketing spin hook, line, and sinker.


----------



## dsinger




> Quote:
> Originally Posted by *Geoff D*  /t/1523994/hdmi-2-0-cedia-webinar/120#post_24563811
> 
> 
> Yeah, I suppose that's what Panasonic has done, they over-specced the HDMI processing hardware and then fiddled the software to the appropriate degree once the official 2.0 features were announced. But Sony are on the HDMI Forum's board of directors just like Panasonic, and it baffles me that they didn't come up with something similar, given that they obviously MUST have known that features could be added via firmware just as long as the bandwidth was there. Instead they decided on the easiest way out, a firmware upgrade to unlock the bare minimum of HDMI 2.0 support using existing silicon. Typical ass-backwards Sony thinking.



I don't have a link, but Panasonic claims to manufacture their own HDMI 2.0 chips. This came out about the same time they released their first 4K set last year. It's still the only full 2.0 set as far as I can determine.


----------



## Geoff D

Ah, that explains it. Shame that the other manufacturers couldn't have bought Panasonic's chips, but I'd imagine that they weren't up for sale!


----------



## khoop

I just bought the Sony XBR-55X850A, and after reading this whole thread: can I simply upgrade my components, like you would a PC, in order to get full HDMI 2.0, or did I just kinda waste $2,400?

I had a Samsung UN55D7000, but I got brainwashed, you could say, into buying this Sony.

Either way, I think this TV is way better than my old D7000, but I could have gotten a bigger 1080p TV for what I paid for this one.


----------



## MrSmartyAss

At CES 2014 in January, Silicon Image announced a new full-featured HDMI 2.0 chip for TV manufacturers and set-top box makers. It is supposed to be sampling to manufacturers now.


The dual-mode Sil9777 port processor supports:

• 4K Ultra HD resolution @ 50/60Hz as outlined in the HDMI 2.0 specification

• HDCP 2.2 premium content protection

• MHL versions 1.0, 2.0 and 3.0, with video input resolution up to 4K


----------



## Otto Pylot

Good to know, as long as it meets all the other specifications outlined in HDMI 2.0 besides resolution and HDCP. At least the chipset is available for mfrs to incorporate if they choose to do so.


----------



## HDTVAV




> Quote:
> Originally Posted by *MrSmartyAss*  /t/1523994/hdmi-2-0-cedia-webinar/120#post_24573848
> 
> 
> At CES 2014 in January, Silicon Image announced a new full-featured HDMI 2.0 chip for TV manufacturers and set-top box makers. It is supposed to be sampling to manufacturers now.
> 
> 
> The dual-mode Sil9777 port processor supports:
> 
> • 4K Ultra HD resolution @ 50/60Hz as outlined in the HDMI 2.0 specification
> 
> • HDCP 2.2 premium content protection
> 
> • MHL versions 1.0, 2.0 and 3.0, with video input resolution up to 4K


 

Isn't that what they already have now...?


----------



## HockeyoAJB




> Quote:
> Originally Posted by *HDTVAV*  /t/1523994/hdmi-2-0-cedia-webinar/120#post_24577047
> 
> 
> Isn't that what they already have now...?


Current displays and AVRs that use the 10.2 Gbps chips top out at UHD (3840x2160) @ 60 fps with 8-bit color and 4:2:0 subsampling.  The new chips should support higher bit depth color (10-bit, 12-bit, etc.) and better chroma subsampling (4:2:2 and 4:4:4) even at 4K/60.
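The subsampling step is where most of the payload difference comes from. A simple sample-count sketch (assumption: raw uncompressed bits per pixel, ignoring link-level packing):

```python
# Average samples carried per pixel for each chroma mode:
# 4:4:4 keeps all three (Y, Cb, Cr); 4:2:2 halves the chroma
# horizontally; 4:2:0 halves it both ways (1 Cb + 1 Cr per 4 pixels).
SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def bits_per_pixel(mode, bit_depth):
    return SAMPLES_PER_PIXEL[mode] * bit_depth

print(bits_per_pixel("4:2:0", 8))   # 12.0 -- what current chips carry at 4K60
print(bits_per_pixel("4:2:2", 10))  # 20.0
print(bits_per_pixel("4:4:4", 8))   # 24.0 -- double the 4:2:0 payload
```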

 

MHL 3.0 support is also new, though it might be possible to update existing displays and AVRs to support some of the features of MHL 3.0 via firmware update, as they have done with some of the features mentioned in the HDMI 2.0 spec.  All displays and AVRs released in 2013 only supported MHL 2.0 at the time of their release (if they supported MHL at all).

 

The new chips should also have the bandwidth capacity to support every other feature mentioned in the HDMI 2.0 spec.  However, many of those features have yet to be implemented in consumer electronics devices, and bandwidth was not the only issue holding some of them back.  Hardware, software, format, and encoding-scheme technologies still need to be settled upon in some cases.  HDMI cables and chips are only part of the equation.


----------



## Dan Hitchman




> Quote:
> Originally Posted by *HockeyoAJB*  /t/1523994/hdmi-2-0-cedia-webinar/120#post_24577748
> 
> 
> 
> Current displays and AVRs that use the 10.2 Gbps chips top out at UHD (3840x2160) @ 60 fps with 8-bit color and 4:2:0 subsampling.  The new chips should support higher bit depth color (10-bit, 12-bit, etc.) and better chroma subsampling (4:2:2 and 4:4:4) even at 4K/60.
> 
> 
> MHL 3.0 support is also new, though it might be possible to update existing displays and AVRs to support some of the features of MHL 3.0 via firmware update, as they have done with some of the features mentioned in the HDMI 2.0 spec.  All displays and AVRs released in 2013 only supported MHL 2.0 at the time of their release (if they supported MHL at all).
> 
> 
> The new chips should also have the bandwidth capacity to support every other feature mentioned in the HDMI 2.0 spec.  However, many of those features have yet to be implemented in consumer electronic devices and bandwidth was not the only issue preventing some of them from being implemented.  Hardware, software, format, and encoding scheme technologies still need to be settled upon in some cases.  HDMI cables & chips are only a part of the equation.



Early adopters... buyer beware. I don't think I'll be getting any UHD devices until a couple years in. The ground is shifting too much as it is.


----------



## TMcG




> Quote:
> Originally Posted by *AV_Integrated*  /t/1523994/hdmi-2-0-cedia-webinar#post_24523906
> 
> 
> Nobody, at all, seems to address the audio aspect.... it is the more complex setups that have been completely screwed over by HDMI for years.
> 
> *The hope would be that as a STANDARD, HDMI 2.0 will feature full surround sound and a separate stereo audio mix which is fed across the same HDMI cable.* Any multi-zone A/V receivers will be able to pull the stereo feed off any HDMI connection at any time, and still use surround sound for the local feed.



QUOTED FOR TRUTH!!!! Do you hear us, HDMI Licensing, LLC??


I'd also like to throw in problems with EDID, HDCP, and DRM as they pertain to distributed video over HDMI, especially with HDMI matrix switches... but each of those could be a separate thread unto itself.


----------



## Dan Hitchman




> Quote:
> Originally Posted by *TMcG*  /t/1523994/hdmi-2-0-cedia-webinar/120#post_24581894
> 
> 
> QUOTED FOR TRUTH!!!! Do you hear us, HDMI Licensing, LLC??
> 
> 
> I'd also like to throw in problems with EDID, HDCP and DRM as it pertains to distributed video over HDMI, especially with HDMI matrix switches...but each of those could be separate threads unto themselves.



Yes! HDMI better address these big issues, and pronto!


----------



## Wryker

Excellent write-up. I remember watching an episode of Revision3 on my TiVo where 4:2:2 and the like were briefly explained, and I was totally confused, but your post has made it clear.


----------



## Tesla1856




> Quote:
> Originally Posted by *AV_Integrated*  /t/1523994/hdmi-2-0-cedia-webinar#post_24523906
> 
> 
> A/V receivers do NOT support digital audio output to zone 2. There are some limited exceptions to this, but the rule is that if you have 10 sources hooked up via HDMI, and you want those sources available to a second zone, such as outside, or in another room, you must hook up analog audio to get the stereo sound in those spaces. Despite this, A/V receivers have been dropping their analog audio inputs, components have been dropping their analog audio outputs, and consumers are left out in the cold.
> 
> 
> The hope would be that as a STANDARD, HDMI 2.0 will feature full surround sound and a separate stereo audio mix which is fed across the same HDMI cable. Any multi-zone A/V receivers will be able to pull the stereo feed off any HDMI connection at any time, and still use surround sound for the local feed.
> 
> 
> Better yet, receivers with a second HDMI output for zone 2, would be able to feed zone 1 with surround sound, and zone 2 with stereo embedded on HDMI.


This is all very interesting ... HT-Tech moving forward (just like computers).

 

But your Zone-2 comments especially caught my attention. I finally hooked up my Zone-2 and while I embraced HDMI a while back, now I get to reinstall my RCA cables and try to get more sources to Zone-2. *I agree ... it all seems a step backwards.*

 

My Onkyo isn't that old (but non-network) so no easy digital media to Zone-2. I've put my energy into HT-PC via HDMI because I thought that was where things were going. Sound is in HDMI signal supplied by video card. The normal sound-card (with analog output support) isn't even being used.

 

To keep this thread on topic we can't chat here ... but do you have a thread started about analog Zone-2?


----------



## steve1971

Another great article, Scott. It seems to me that this whole HDMI 2.0 thing is a jumbled mess that needs to be ironed out and better explained by the manufacturers. But again, thanks for posting the article.


----------



## 6athome




> Quote:
> Originally Posted by *Matthias Hutter*  /t/1523994/hdmi-2-0-cedia-webinar#post_24521969
> 
> 
> I hope more TVs offer DisplayPort 1.2/1.3 connectors, for some 4k/60p 16bit 4:4:4 goodness.
> 
> But the need for copy protection forced them to use HDMI


Panasonic offers DisplayPort on their 4K TVs. I have a CableCARD computer that plays TV and records HBO and Starz for playback through HDMI. My question is: if I try the same thing with DisplayPort, will it not work?


THANKS AGAIN SCOTT!


----------



## HockeyoAJB




> Quote:
> Originally Posted by *6athome*  /t/1523994/hdmi-2-0-cedia-webinar/120#post_24595912
> 
> 
> 
> Panasonic offers DisplayPort on their 4K TVs. I have a CableCARD computer that plays TV and records HBO and Starz for playback through HDMI. My question is: if I try the same thing with DisplayPort, will it not work?
> 
> 
> THANKS AGAIN SCOTT!


 

There are multiple factors at play in your scenario, so it would be best to address each of them separately.

 

First, your cable company does not broadcast in 4K, let alone 4K/60p 16-bit 4:4:4, so your source material is going to be 1080p at best.  So, unless your plan is to use playback software on your computer that does 4K upscaling (such as PowerDVD 13), the signal from your PC to the display will still be 1080p at best.  In this case, there is no benefit to using DisplayPort, as HDMI can easily handle a 1080p signal.

 

Second, even if you did want your playback software to do the upscaling, the result would still not be 4K/60p 16-bit 4:4:4, as nobody currently broadcasts 16-bit color at any resolution.  Most broadcasts are still 8-bit color and 4:2:0, even at 1080p.  So, your upscaled signal would be 4K/60p 8-bit 4:2:0 for some content and 4K/24p 8-bit 4:2:0 for the rest.  Both of these can be carried by a high-speed HDMI cable as well.  The 24 fps stuff would be accepted by any 4K display that has an HDMI port.  The 60 fps stuff would require a 4K display that has at least the bare minimum support for HDMI 2.0.  Even 4K displays with the older 10.2 Gbps chips can handle this with a firmware update.  So, again, no need to use DisplayPort, as HDMI is currently capable of this.

 

Third, in regards to your question of whether or not you *could* use DisplayPort for either of the above, this depends on what type of copy protection the content you want to display has.  Based on the fact that you are able to record the content using a CableCARD tuner in your current setup, it is clear that the content is not flagged as "copy never" by your cable provider.  Some PVR software (e.g., Windows Media Center) will allow you to record content that your cable provider has flagged as "copy once".  If the content you are recording is flagged as "copy once", it is unlikely that you will be able to use DisplayPort to feed the signal to your display, as the software will not recognize the DisplayPort output as HDCP compliant.  You would have to use HDMI or DVI in this situation.  However, if the content is flagged as "copy freely", then you should be able to use DisplayPort if you wish.  Again, there would be no benefit to doing so, but you would have the option.

 

The only real benefit of putting DisplayPort on 4K displays at present is for games or other graphics-related content generated by the PC, or for professional video editing.  Truth be told, I don't think DisplayPort is used by many professionals either.  Production companies tend to use SDI (Serial Digital Interface: coax cable with BNC connectors), fiber optic, or HDMI for the connection between the camera and the rest of their hardware (distribution decks, VTRs, capture cards, encoders, etc.).  And most professional video editing is done on Macs, which use Thunderbolt/Thunderbolt 2 for connecting to displays.

 

I don't really see DisplayPort becoming much more common than it already is, either.  It hasn't been embraced by the consumer electronics industry, despite the fact that it has some advantages over current HDMI.  It seems that the plan is to focus on improving HDMI rather than switch to another type of connection.  And it isn't being used by Hollywood either, as Thunderbolt technology is already well ahead of DisplayPort.  In the last year, even PC manufacturers like Asus and Gigabyte have started making motherboards with Thunderbolt ports, so it is no longer restricted to Macs.  My guess is that Thunderbolt will replace DisplayPort on graphics cards within the next few years.  So, you will have HDMI on the consumer electronics side, Thunderbolt/SDI/fiber on the professional side, and a combination of the two in the PC/Mac world.


----------



## Dark Matter




> Quote:
> Originally Posted by *KidHorn*  /t/1523994/hdmi-2-0-cedia-webinar#post_24522402
> 
> 
> So if you have a 4k TV, 2D renders 2160 lines but 3D renders 4320 lines. How is that possible?



You don't get double the resolution per se; you get double the frame rate, because each frame has to be produced at a slightly different angle for each eye to provide the "3D effect". Because you're delivering double the frame rate for 3D, you're transmitting twice the amount of data in the same time frame as the same movie in 2D, so a 24p movie requires 48 frames per second, 24 per eye. Make sense?
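Put as arithmetic, a trivial sketch of the same point:

```python
# Frame-packed 3D doubles the frames delivered, not the panel's
# native line count: each eye gets its own full-resolution frame.
film_fps, eyes = 24, 2
delivered_fps = film_fps * eyes
print(delivered_fps)             # 48 frames per second on the link
print(delivered_fps / film_fps)  # 2.0x the data of the 2D version
```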


----------



## SeeNoWeevil


So if you have an AVR equipped with HDMI 1.4 (an Onkyo 828, say) that's listed as capable of 2160p30, that means it handles at least 8.91 Gb/s, right? Which means *in theory* it could be software-updated to support 2160p60 4:2:0 8-bit?

 

Are manufacturers (Onkyo, for example) actually putting the *same* HDMI chips in different ranges but badging the newer ones as HDMI 2.0?
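The 8.91 Gbps figure checks out under the usual simplified TMDS model (an assumption, not the chip's actual rating): 2160p30 at 8-bit 4:4:4 and 2160p60 at 8-bit 4:2:0 both run at an effective 297 MHz pixel clock, so in raw bandwidth terms the "in theory" part holds. Whether the firmware and HDCP side cooperate is another matter.

```python
# 2160p/30 8-bit 4:4:4 with CTA-861 total timing of 4400 x 2250
# (assumed), 3 TMDS channels x 10 bits per symbol:
h_total, v_total, fps = 4400, 2250, 30
pixel_clock = h_total * v_total * fps   # 297 MHz
gbps_30p = pixel_clock * 10 * 3 / 1e9

print(round(gbps_30p, 2))  # 8.91 -- the same rate 2160p/60 4:2:0 8-bit
# needs, since HDMI carries 4:2:0 at a half-rate effective clock
```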


----------



## 6athome

Thanks HockeyoAJB !

I used to play ice hockey; I just watch it now. I am a fan of FALD because I watch a lot of ice hockey!

The computer I am using is an HP9300T with a CableLabs DRM-certified BIOS; content can only be played back on the HP9300T itself, and with Xbox Media Center on another TV. Looks like Thunderbolt is going to be on my next computer.


----------



## Otto Pylot




> Quote:
> Originally Posted by *SeeNoWeevil*  /t/1523994/hdmi-2-0-cedia-webinar/140_20#post_24598729
> 
> 
> So if you have an AVR equipped with HDMI 1.4 (an Onkyo 828, say) that's listed as capable of 2160p30, that means it handles at least 8.91 Gb/s, right? Which means _in theory_ it could be software-updated to support 2160p60 4:2:0 8-bit?
> 
> 
> Are manufacturers (Onkyo, for example) actually putting the _same_ HDMI chips in different ranges but badging the newer ones as HDMI 2.0?



"In theory", anything is possible. Realistically, it probably won't happen. The chipsets have to be designed from the start to take the speed bump necessary for 2160p/60 8-bit resolution, which is that gray area between HDMI 1.4b and 2.0. That's what Sony did and called it HDMI 2.0. Other mfrs may follow suit (it's hard to find out who's doing what) or just wait till the chipsets are available. I would say that if your AVR was never advertised as "HDMI 2.0 Ready", it can't be upgraded to anything more than what it is now. This is going to be the big problem. You can buy a TV now that is HDMI 2.0 ready to a certain extent, but your other devices (Blu-ray player, STBs, etc.) aren't, so eventually one will have to replace everything if all of your devices are to have HDMI 2.0 capability. And unless the various mfrs list what options are available, you could end up with different devices supporting different subsets of the HDMI 2.0 specs.


----------



## Geoff D

From the looks of it, Sony deliberately gave themselves a 'back door' through which they could fiddle with the HDMI firmware, but I doubt that too many others were thinking along those lines. Has any manufacturer other than Sony used the same method?


Still, some mfs might not even bother for the simple reason that all Sony are doing is adding the bare minimum of HDMI 2.0 compatibility.


----------



## Otto Pylot

^^^^ I haven't heard of any mfrs offering "upgradeable to HDMI 2.0" TVs other than Sony. I know Samsung is coming out with their One Connect add-on (for about $400!) that is supposed to have HDMI 2.0, but I don't believe Samsung has listed which of the HDMI 2.0 options it will have. I don't even know if it contains the new HDMI 2.0 chipsets that Panasonic is rumored to have designed and built. The same is true for the new Pioneer HDMI 2.0 receivers. It would be so much easier if mfrs listed what feature sets their "HDMI 2.0" devices contain. HDMI 1.4b had Deep Color, Ethernet, etc. that never came to pass, so I don't see why the mfrs don't list the options their HDMI 2.0 devices have, regardless of whether there's a consumer device that can currently use them or not.


----------



## SeeNoWeevil

Can HDCP 2.2 be added to chips without it or does it need the new keys embedded during manufacture?


----------



## Geoff D

Sony's earliest builds of their 4K TVs needed a hardware upgrade to support HDCP 2.2, so I would say no.


----------



## Otto Pylot

This has been stated many times before, but if you really NEED a new TV now, there are quite a few 4K sets that are very nice and will certainly serve you well. However, if you want to get in on the latest upcoming technology, such as fully compliant HDMI 2.0 (at least with higher bandwidth than the 10.2 Gbps max of 1.4b) and 10-bit/12-bit panels, I'd wait another year or so. Remember, you will probably have to upgrade your other devices eventually as well for full compatibility. The mfrs are really being cagey now about what their HDMI 2.0 TVs are capable of, now and in the future (firmware upgrades), so I would research carefully.


----------



## LDizzle




> Quote:
> Originally Posted by *Dan Hitchman*  /t/1523994/hdmi-2-0-cedia-webinar#post_24521491
> 
> 
> I'd rather have 12 bit 4:2:2 and a wider color gamut and object audio than 3D.


I'd rather have it all!


----------



## jim brice




> Quote:
> Originally Posted by *MikeyD360*  /t/1523994/hdmi-2-0-cedia-webinar#post_24522165
> 
> 
> So anyone who wasted $20K on Sony's 4K projector, or LG's 84" UHD TV, now has an expensive paperweight, given that HDMI 2.0 standards didn't exist when these were released... hands up, who saw that coming...



Yup, HDMI 2.0 is a big reason why I still haven't jumped on the 4K bandwagon. Correct me if I'm wrong, but isn't the Vizio Y series one of the first to come with HDMI 2.0 as a standard port?


Or do Sony and other companies have newer 4K models with 2.0 now? Anyway, I feel as if the Vizio Y will most likely be the first 4K set I try, given that the 50" will be $999.99, I believe.


Until then I can just worry about flawed Sony models, cough cough (KDL-48W600B). Not saying that the Vizio won't be flawed, because I know it will have its flaws.


----------



## Otto Pylot




> Quote:
> Originally Posted by *jim brice*  /t/1523994/hdmi-2-0-cedia-webinar/140_20#post_24620355
> 
> 
> Yup, HDMI 2.0 is a big reason why I still haven't jumped on the 4K bandwagon. Correct me if I'm wrong, but isn't the Vizio Y series one of the first to come with HDMI 2.0 as a standard port?
> 
> 
> Or do Sony and other companies have newer 4K models with 2.0 now? Anyway, I feel as if the Vizio Y will most likely be the first 4K set I try, given that the 50" will be $999.99, I believe.
> 
> 
> Until then I can just worry about flawed Sony models, cough cough (KDL-48W600B). Not saying that the Vizio won't be flawed, because I know it will have its flaws.



Isn't the Sony advertised as having an HDMI 2.0 input? However Vizio spins their HDMI 2.0, what remains to be seen is what their HDMI 2.0 input supports. Personally, I steer clear of Vizio based on past experiences. Again, just advertising a TV with HDMI 2.0 means nothing unless they list what aspects of HDMI 2.0 they currently support. Unless the models released later this year have the new chipsets, my guess is that they will be what one poster called "HDMI 2.0 Lite". This is going to get interesting as we get closer to Black Friday and the holiday season. I bet there will be lots of dealer-specific models advertising HDMI 2.0, at great prices, to the unsuspecting consumer.


----------



## MrEastSide

*sigh* HDMI, the most clunky, poorly thought-out interface known in consumer electronics. Bring on more consumer confusion, handshake issues, and hardware problems... Introducing HDMI 2.0, whose features 0.01% of consumers will actually utilize.


----------



## Joe Fernand

_'Introducing HDMI 2.0, which 0.01% of consumers will actually utilize features for.'_ - though they will have to have it on their new $2 Full HD, 4K Soundbar


Joe


----------



## HockeyoAJB




> Quote:
> Originally Posted by *Otto Pylot*  /t/1523994/hdmi-2-0-cedia-webinar/150#post_24602262
> 
> 
> ^^^^ I haven't heard of any mfrs offering "upgradeable to HDMI 2.0" tv's other than Sony. I know Samsung is coming out with their One Connect add-on (for about $400!) that is supposed to have HDMI 2.0 but I don't believe Samsung has listed which of the HDMI 2.0 options it will have. Don't even know if it contains the new HDMI 2.0 chip sets that Panasonic is rumored to have designed and built. Same is true for the new Pioneer HDMI 2.0 receivers. It would be so much easier if mfrs listed what feature sets their "HDMI 2.0" devices contain. HDMI 1.4b had Deep Color, ethernet, etc that never came to pass so I don't see why the mfrs don't list the options their HDMI 2.0 devices have regardless of whether there's a consumer device that can currently use it or not.


 

Regarding the Samsung One Connect box, how future proof can this really be?  I mean, the display itself is still limited to what was in it when it was built, so you can't magically make an edge-lit display turn into a FALD display by changing out the One Connect box.  Same goes for the maximum dynamic range and color gamut that the display can show. Remember, what it *can* show and what it *does* show due to the limitations of current signals are two different things.  The only thing that replacing the One Connect box can do is allow you to upgrade the connection hardware to allow an improved signal to reach the display, and perhaps add more processing power or newer decoding chips if they put them in the box.  Bear in mind that the One Connect box is still connected to the display via some sort of cable.  That cable (like any other cable that has ever existed) must also have a maximum bandwidth.  One would assume that the bandwidth of this cable is greater than 18 Gbps (otherwise it would be obsolete the second they issue HDMI 2.X with a spec'd bandwidth greater than the current HDMI 2.0's), but by how much?  It is unlikely that this cable could be upgraded without also upgrading the port it plugs into on the display.  So, eventually replacing the box alone will no longer be enough to enable whatever the latest features are.  And even before you reach that point, the upgradeability of the PQ is still limited to what is in the box vs. what is in the display.

 

Not saying that it is a bad idea or that it won't be useful in some regards.  Just don't be fooled into thinking that it is some kind of magical solution that will allow you to keep the same display for 5 years and yet have that display be equal to what the competition is offering 5 years down the road.


----------



## Otto Pylot




> Quote:
> Originally Posted by *HockeyoAJB*  /t/1523994/hdmi-2-0-cedia-webinar/140_20#post_24621579
> 
> 
> Regarding the Samsung One Connect box, how future proof can this really be?  I mean, the display itself is still limited to what was in it when it was built, so you can't magically make an edge-lit display turn into a FALD display by changing out the One Connect box.  Same goes for the maximum dynamic range and color gamut that the display can show. Remember, what it _can_ show and what it _does_ show due to the limitations of current signals are two different things.  The only thing that replacing the One Connect box can do is allow you to upgrade the connection hardware to allow an improved signal to reach the display and perhaps add more processing power or newer decoding chips if they put them in the box.  Bear in mind that the One Connect box is still connected to the display via. some sort of cable.  That cable (like any other cable that has ever existed) must also have a maximum bandwidth.  One would assume that the bandwidth of this cable is greater than 18 Gbps (otherwise it would be obsolete the second they issue HDMI 2.X with a spec'd bandwidth of greater than the current HDMI 2.0), but by how much?  It is unlikely that this cable could be upgraded without also upgrading the port it plugs into on the display.  So, eventually replacing the box alone will no longer be enough to enable whatever the latest features are.  And even before you reach that point, the upgradeability of the PQ is still limited to what is in the box vs. what is in the display.
> 
> 
> Not saying that it is a bad idea or that it won't be useful in some regards.  Just don't be fooled into thinking that it is some kind of magical solution that will allow you to keep the same display for 5 years and yet have that display be equal to what the competition is offering 5 years down the road.



I agree with what you are saying. I just wanted to point out what Samsung is doing, which some feel will keep their tv's at close to current technology for the next few years. It just doesn't make much sense to me to spend $2000 on a new tv and then have to spend another $400 to "upgrade" it which may only have a limited useful time frame of maybe a year at best.


----------



## mobius




> Quote:
> Originally Posted by *blah450*  /t/1523994/hdmi-2-0-cedia-webinar/0_100#post_24521976
> 
> 
> ^^^^^^^This...and is anyone else ready to have more secure connection points at terminals rather than just the slip-in-style of HDMI?



Hell yes! That's one of the biggest failings of HDMI (with unreliable data signaling being number one). Consumers SHOULD NOT have the issues we have trying to pipe HD signals to more than one display. Changing sources on my matrix switch should not be a hit-or-miss experience with any HDMI-enabled device.


----------



## sanderdvd

I own a Sony VPL-VW1000ES projector and I'm considering the HDMI 2.0 upgrade (VW1000 => VW1100) for it. Do you guys think it makes any sense to do this upgrade?


----------



## HockeyoAJB




> Quote:
> Originally Posted by *sanderdvd*  /t/1523994/hdmi-2-0-cedia-webinar/150#post_24633860
> 
> 
> I own a Sony VPL-VW1000ES projector and I m considering the HDMI 2.0 upgrade (VW1000 => VW1100) for it. You guys think it makes any sense to do this upgrade?


 

Is this really an HDMI 2.0 upgrade, or is it an HDCP 2.2 upgrade?  With the 1st gen Sony 4K displays, the hardware upgrade was simply to make the display HDCP 2.2 compliant.  Without it, content that was encrypted using HDCP 2.2 could not be viewed on the display.  All those displays needed to become "HDMI 2.0 compliant" (meaning able to accept 4K/60p 8-bit 4:2:0 content, in this case) was a firmware update.

 

If the upgrade from 1000 to 1100 is merely to add 4K/60p compatibility and it costs a decent chunk of money then I would be less inclined to bother since 4K/60p content is so rare (as in virtually non-existent unless you create it yourself).  On the other hand, if it is to make the projector HDCP 2.2 compliant then I would be more inclined to do it since a good portion of the 4K/24p and 4K/30p content that exists currently uses HDCP 2.2 encryption.


----------



## Otto Pylot




> Quote:
> Originally Posted by *sanderdvd*  /t/1523994/hdmi-2-0-cedia-webinar/160_20#post_24633860
> 
> 
> I own a Sony VPL-VW1000ES projector and I m considering the HDMI 2.0 upgrade (VW1000 => VW1100) for it. You guys think it makes any sense to do this upgrade?



Depends on what the upgrade actually offers in the way of HDMI 2.0.


----------



## mo949

Upgrading the connectors to HDMI 2.0 isn't going to magically enhance the tv's ability to display a wider range of colors. I'd save the upgrade price for the complete package. It's the color expansion that's going to net the biggest benefits in the near future, IMO.


----------



## Otto Pylot

+1


----------



## sanderdvd

The VW1000>VW1100 upgrade includes the HDCP 2.2 upgrade, so does this mean it is worth it?


----------



## HockeyoAJB




> Quote:
> Originally Posted by *sanderdvd*  /t/1523994/hdmi-2-0-cedia-webinar/150#post_24637650
> 
> 
> The VW1000>VW1100 upgrade includes the HDCP 2.2 upgrade so this means it is worth it?



I think it is required if you want to use the Sony 4K player to watch native 4K content. So far, I think Sony is the only company to make the move from HDCP 1.3 (which was cracked long ago) to HDCP 2.2 for 4K content. Obviously, YouTube doesn't use it and I don't think Netflix does for House of Cards. However, it seems that the industry is planning to use HDCP 2.2 for UHD content going forward. It will likely be enforced in BD4K if/when that becomes a reality.


----------



## RobAC




> Quote:
> Originally Posted by *safe91*  /t/1523994/hdmi-2-0-cedia-webinar/90#post_24551768
> 
> 
> Sorry for this massive post but many things are connected, and work on a specific one don't allow the global view needed.
> 
> I was not trying to write my thinking but i was trying to explain known facts.
> 
> I hope those who have read attentively to the end, has certainly understand what I was referring.
> 
> Regards.



Thanks for the very enlightening timeline charts and post.

Yours and Scott's (and everyone's here) really help to explain this HDMI confusion.


Rob


----------



## Otto Pylot




> Quote:
> Originally Posted by *sanderdvd*  /t/1523994/hdmi-2-0-cedia-webinar/160_20#post_24637650
> 
> 
> The VW1000>VW1100 upgrade includes the HDCP 2.2 upgrade so this means it is worth it?



Does it include 2160/60p at 10-bit or 12-bit color depth? Does it include color sampling at higher than 4:2:0? What about Rec. 2020 specs or CEC Extensions? Can the new tv handle those video specs? HDCP 2.2 is just one aspect of HDMI 2.0. If you plan on keeping your new tv (potential new tv) for a few years I'd wait unless you absolutely need to buy one now or the new tv is capable of those resolutions, color sampling, etc. now.


----------



## HockeyoAJB




> Quote:
> Originally Posted by *Otto Pylot*  /t/1523994/hdmi-2-0-cedia-webinar/150#post_24638651
> 
> 
> 
> Does it include 2160/60p at 10-bit or 12-bit color depth? Does it include color sampling at higher than 4:2:0? What about Rec. 2020 specs or CEC Extensions? Can the new tv handle those video specs? HDCP 2.2 is just one aspect of HDMI 2.0. If you plan on keeping your new tv (potential new tv) for a few years I'd wait unless you absolutely need to buy one now or the new tv is capable of those resolutions, color sampling, etc. now.


 

Just to clarify, Sanderdvd already owns the Sony VPL-VW1000ES (a $25,000 4K projector) and is asking if it is worth paying $2,500 to have a Sony tech upgrade the board to make the projector HDCP 2.2 compliant and also add support for 4K/60p content, giving it the same capabilities as the newer VPL-VW1100ES (a $28,000 4K projector) which already supports both out of the box.

 

To answer some of your questions, no it will not do 2160/60p 10-bit or 12-bit.  It still uses the 10.2 Gbps chip, so it tops out at DCI 4K (4096x2160) @ 60 fps 8-bit 4:2:0.  It can do higher bit depths and better color subsampling at 24 fps and at lower resolutions, though.  It would need the forthcoming 18 Gbps chips in order to be able to receive a 2160/60p 10-bit or 12-bit signal w/ 4:2:2 or 4:4:4 subsampling.  These are not available yet.
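As a back-of-the-envelope check on those chip limits, here is a rough sketch of the bandwidth arithmetic (my own illustration, not from any HDMI spec document; it assumes the CTA-861 total timing for 4K/60 of 4400 x 2250 pixels including blanking, and the 10/8 overhead of TMDS 8b/10b line coding):

```python
# Rough HDMI TMDS bit-rate estimate: total pixels per frame (including
# blanking) x frames per second x bits per pixel, with the 10/8 factor
# for TMDS 8b/10b line coding. Timing numbers assume CTA-861 4K/60.

def tmds_gbps(h_total, v_total, fps, bits_per_pixel):
    return h_total * v_total * fps * bits_per_pixel * 10 / 8 / 1e9

# Bits per pixel: luma depth plus two chroma components scaled by the
# subsampling fraction (1/4 for 4:2:0, 1 for 4:4:4).
FORMATS = {
    "8-bit 4:2:0":  8 * (1 + 2 * 0.25),   # 12 bpp
    "8-bit 4:4:4":  8 * (1 + 2 * 1.0),    # 24 bpp
    "10-bit 4:4:4": 10 * (1 + 2 * 1.0),   # 30 bpp
}

for name, bpp in FORMATS.items():
    print(f"4K/60 {name}: {tmds_gbps(4400, 2250, 60, bpp):.1f} Gbps")
```

That works out to roughly 8.9 Gbps for 4K/60 8-bit 4:2:0 (inside the 10.2 Gbps chips), roughly 17.8 Gbps for 8-bit 4:4:4 (just inside 18 Gbps), and over 22 Gbps for 10-bit 4:4:4, which is beyond even HDMI 2.0.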

 

Nothing fully supports Rec. 2020 currently.  However, out of the box, both projectors have better color space support than most displays.  They support not only Rec. 709 and xvYCC (extended color gamut used in Sony's "Mastered in 4K" titles), but also the DCI and Adobe RGB color spaces.

 

Also, they boast a dynamic contrast ratio of 1,000,000:1, so they have the capability to produce a high dynamic range.  However, they do not currently support Dolby Vision, so they do not have a way to read HDR encoded content.  Instead, they use their own algorithms to "stretch" the dynamic range of the source.  This can be adjusted.

 

For more information, see the following thread...

http://www.avsforum.com/t/1359018/sony-vpl-vw1000


----------



## Otto Pylot

^^^ thanks for the clarification.


----------



## sanderdvd




> Quote:
> Originally Posted by *HockeyoAJB*  /t/1523994/hdmi-2-0-cedia-webinar/100_100#post_24639267
> 
> 
> Just to clarify, Sanderdvd already owns the Sony VPL-VW1000ES (a $25,000 4K projector) and is asking if it is worth paying $2,500 to have a Sony tech upgrade the board to make the projector HDCP 2.2 compliant and also add support for 4K/60p content, giving it the same capabilities as the newer VPL-WV1100ES (a $28,000 4K projector) which already supports both out of the box.
> 
> 
> To answer some of your questions, no it will not do 2160/60p 10-bit or 12-bit.  It still uses the 10.2 Gbps chip, so it tops out at DCI 4K (4096x2160) @ 60 fps 8-bit 4:2:0.  It can do higher bit rates and better color subsampling at 24 fps and at lower resolutions, though.  It would need the forthcoming 18 Gbps chips in order to be able to receive a 2160/60p 10-bit or 12-bit signal w/ 4:2:2 or 4:4:4 subsampling.  These are not available, yet.
> 
> 
> Nothing fully supports rec 2020, currently.  However, out of the box, both projectors have better color space support than most displays.  They support not only rec. 709 and xvYCC (extended color gamut used in Sony's "Mastered in 4K" titles), but also the DCI and Adobe RGB color spaces.
> 
> 
> Also, they boast a dynamic contrast ratio of 1,000,000:1, so have the capability to produce a high dynamic range.  However, they do not currently support Dolby Vision, so they do not have a way to read HDR encoded content.  Instead, they use their own algorithms to "stretch" the dynamic range of the source.  This can be adjusted.
> 
> 
> For more information, see the following thread...
> http://www.avsforum.com/t/1359018/sony-vpl-vw1000


Thanks for this helpful information.

I indeed own a VW1000, and looking at the information you provided, I think it is a smart choice to do the upgrade now.


----------



## travisrps


Hey Scott, is there a benefit in buying an HDMI 2.0 cable for a 1080p set?


----------



## Otto Pylot

^^^^ there's no such thing as an HDMI 2.0 cable. Any certified high speed HDMI cable will work for HDMI 1.4b or 2.0.


----------



## TMcG




> Quote:
> Originally Posted by *Otto Pylot*  /t/1523994/hdmi-2-0-cedia-webinar/150#post_24645677
> 
> 
> ^^^^ there's no such thing as an HDMI 2.0 cable. Any certified high speed HDMI cable will work for HDMI 1.4b or 2.0.


***


*Except when you need an HDMI cable that is certified to handle 18 Gbit/s, well beyond the current 10.2 Gbit/s max throughput


----------



## Otto Pylot




> Quote:
> Originally Posted by *TMcG*  /t/1523994/hdmi-2-0-cedia-webinar/160_20#post_24646130
> 
> ***
> 
> 
> *Except when you need a HDMI cable that is certified to handle 18 Gbit/s, well beyond the current 10.2 Gbit/s max throughput



From Scott Wilkinson's post:


HDMI 2.0 ups the maximum bandwidth from 10.2 gigabits per second to 18 Gbps, which can be carried on existing high-speed HDMI-certified cables.


----------



## TMcG

Uhhh....as I said and as you quoted...._certified_ to handle the higher bandwidth. Only a handful of cables I have seen can carry that kind of bandwidth, and they are all under 15 feet; this is the longest I have seen: http://www.monoprice.com/Product?c_id=102&cp_id=10255&cs_id=1025508&p_id=10767&seq=1&format=2


----------



## Joe Fernand

Certified High Speed and 'engineered to deliver' are not one and the same thing!


Careful use of wording in the link you provide.


Joe


----------



## TMcG




> Quote:
> Originally Posted by *Joe Fernand*  /t/1523994/hdmi-2-0-cedia-webinar/150#post_24648398
> 
> 
> Certified High Speed and 'engineered to deliver' are not one and the same thing!
> 
> 
> Careful use of wording in the link you provide.
> 
> 
> Joe



From Monoprice:


> Quote:
> have been engineered to deliver _*at least the full 18Gbps*_ data required by the new HDMI spec.



Barring third party independent testing, I am confident Monoprice is not marketing snake oil when it comes to their data transmission rates.


My point above was that while any HDMI cable of any speed will work with any HDMI hardware version, when you need the increased bandwidth for higher frame rates or deeper color bit depths, most of today's cables rated for the full data bandwidth of HDMI 1.4 won't cut it. You will have to upgrade to a newer cable that can handle the full HDMI 2.0 bandwidth to get the additional features HDMI 2.0 provides, simple as that.


----------



## Otto Pylot

Then by all means go ahead and buy what you want if you think you're going to get a cable that will get you HDMI 2.0 performance when ready. I hope all of your hardware is fully HDMI 2.0 compliant as well.


----------



## Colm




> Quote:
> Originally Posted by *TMcG*  /t/1523994/hdmi-2-0-cedia-webinar/180#post_24648924
> 
> 
> My point above was that while any HDMI cable of any speed will work with any HDMI hardware version, when you need the increased bandwidth for higher frame rate or deeper bit rate color, most of today's cables rated for the full data bandwidth of HDMI 1.4 won't cut it. You will have to upgrade to a newer cable that can handle the full HDMI 2.0 bandwidth to get the additional features HDMI 2.0 provides, simple as that.


It is not clear that is the case. AFAIK there are only class 1 and class 2 cables (standard and high speed), even under HDMI 2.0. IOW, they haven't added a new certification for higher speed cables. The HDMI licensing folks even say that a high speed (passive 10.2 Gbps) cable will work. And it will to some distance; just what that distance is has yet to be seen. The signaling method has been tweaked a bit for the higher speeds. Maybe a cable design that is good to 25 feet at 10.2 Gbps will be good for 15 feet at 18 Gbps, maybe more, maybe less. Maybe if you can get away with a 28 AWG cable now, you will have to upgrade to a 24 AWG cable. Folks with long cables may have a problem. The percentage who will have problems is anybody's guess. But there is an HDBaseT chipset due out this year that should do the job for any reasonable length.


I think the 18 Gbps claim for the monoprice Redmere cable is causing a bit of confusion. The 18 Gbps seems to be simply monoprice's claim, not a certification from an ATC. The claim is understandable. They have to differentiate between their older Redmere cables with the 10.2 Gbps chips in them and the new ones with the 18 Gbps chips even though both are high speed cables because the older cables will not work at the higher speeds at any length. But that says nothing about the capabilities of existing passive high speed cables at the higher speeds possible under HDMI 2.0. BTW I will take monoprice's claims on the Redmere cables at face value for now, but you might want to take anything that monoprice says with a grain of salt. Check out the thread on their "12 AWG" speaker wire.


FWIW the only HDMI 2.0 features that might require an upgrade are those that push the bit rate beyond the limit of the existing cable. Other features will work just fine on existing cables. But I think you know that.


----------



## TMcG

In the end it will be quite a while before any of us will technically **need** an HDMI cable with more than 10.2 Gbps bandwidth for all practical purposes. And to your point, I am sure there is great variation in actual data throughput between different cables at different lengths. I think the secret for high data transmission rates over longer cable runs (to the projector or a distant 4K TV through an HDMI matrix switch, for example) will be a balun-based system and not a long HDMI wire. However (and as you would expect at this early stage), all current balun systems, even HDBaseT, cannot transmit 18 Gbps data rates over any distance. I'm sure the products will come in due time.


----------



## IRJ

Anyone got a source for a list of existing receivers which are compatible with HDMI 2.0 standards?


----------



## Otto Pylot




> Quote:
> Originally Posted by *IRJ*  /t/1523994/hdmi-2-0-cedia-webinar/180_20#post_24658404
> 
> 
> Anyone got a source for a list of existing receivers with are compatible with HDMI 2.0 standards?



Sony, Onkyo, and Pioneer have released, or will shortly release, new receivers that are supposed to be HDMI 2.0 compatible. However, what remains to be seen is how many protocols of the HDMI 2.0 standard they will support and whether they will use the new HDMI 2.0 chipsets, or do some voodoo firmware/software upgrade for limited compatibility like Sony did with their new HDMI 2.0 tv's.


----------



## HockeyoAJB




> Quote:
> Originally Posted by *IRJ*  /t/1523994/hdmi-2-0-cedia-webinar/180#post_24658404
> 
> 
> Anyone got a source for a list of existing receivers with are compatible with HDMI 2.0 standards?


 

I put this list together using only info from the manufacturer's official websites and the owners manuals for each model.

 

2013 models

 

Sony STR DN840 - Up to 4K/60p 8-bit 4:2:0 (after firmware update) 10.2 Gbps, HDCP 1.3

Sony STR DN1040 - Up to 4K/60p 8-bit 4:2:0 (after firmware update) 10.2 Gbps, HDCP 1.3

Sony STR DA1800ES - Up to 4K/60p 8-bit 4:2:0 (after firmware update) 10.2 Gbps, HDCP 1.3

Sony STR DA2800ES - Up to 4K/60p 8-bit 4:2:0 (after firmware update) 10.2 Gbps, HDCP 1.3

Sony STR DA5800ES - Up to 4K/60p 8-bit 4:2:0 (after firmware update) 10.2 Gbps, HDCP 1.3

 

2014 models

 

Onkyo TX-NR535 - Up to 4K/60p 8-bit 4:2:0 (No mention of being able to upgrade to a higher bit depth or better color sub-sampling at present.) ? Gbps, HDCP 1.3?

Onkyo TX-NR636 - Up to 4K/60p 8-bit 4:2:0 (No mention of being able to upgrade to a higher bit depth or better color sub-sampling at present.) ? Gbps, HDCP 2.2

Onkyo TX-NR737 - Up to 4K/60p 8-bit 4:2:0 (No mention of being able to upgrade to a higher bit depth or better color sub-sampling at present.) ? Gbps, HDCP 2.2

Onkyo TX-NR838 - Up to 4K/60p 8-bit 4:2:0 (No mention of being able to upgrade to a higher bit depth or better color sub-sampling at present.) ? Gbps, HDCP 2.2

 

Pioneer VSX-824 - Up to 4K/60p 8-bit 4:4:4* or 12-bit 4:2:2 (* Future firmware update required for 4:4:4) 18 Gbps, HDCP 1.3?

Pioneer VSX-1024 - Up to 4K/60p 8-bit 4:4:4* or 12-bit 4:2:2 (* Future firmware update required for 4:4:4) 18 Gbps, HDCP 1.3?

Pioneer VSX-1124 - Up to 4K/60p 8-bit 4:4:4* or 12-bit 4:2:2 (* Future firmware update required for 4:4:4) 18 Gbps, HDCP 1.3?

Pioneer Elite VSX-44 - Up to 4K/60p 8-bit 4:4:4* or 12-bit 4:2:2 (* Future firmware update required for 4:4:4) 18 Gbps, HDCP 1.3?

Pioneer Elite VSX-80 - Up to 4K/60p 8-bit 4:4:4* or 12-bit 4:2:2 (* Future firmware update required for 4:4:4) 18 Gbps, HDCP 1.3?

 

Sony STR DN550 - Up to 4K/60p 8-bit 4:2:0 (No mention of being able to upgrade to a higher bit depth or better color sub-sampling at present.) ? Gbps, HDCP 1.3?

Sony STR DN750 - Up to 4K/60p 8-bit 4:2:0 (No mention of being able to upgrade to a higher bit depth or better color sub-sampling at present.) ? Gbps, HDCP 1.3?

Sony STR DN850 - Up to 4K/60p 8-bit 4:2:0 (No mention of being able to upgrade to a higher bit depth or better color sub-sampling at present.) ? Gbps, HDCP 1.3?

Sony STR DN1050 - Up to 4K/60p 8-bit 4:2:0 (No mention of being able to upgrade to a higher bit depth or better color sub-sampling at present.) ? Gbps, HDCP 1.3?

 

Yamaha RX-V677 - Up to 4K/60p 8-bit 4:2:0 (No mention of being able to upgrade to a higher bit depth or better color sub-sampling at present.) ? Gbps, HDCP 1.3?

Yamaha RX-V777BT - Up to 4K/60p 8-bit 4:2:0 (No mention of being able to upgrade to a higher bit depth or better color sub-sampling at present.) ? Gbps, HDCP 1.3?

 

Notes:

 

1) Denon & Marantz have not released info on their upcoming 2014 models yet.

2) Sony is the only manufacturer to have already provided firmware updates for their 2013 models enabling 4K/60p passthru on the models listed.  It appears that none of the other manufacturers will be updating their 2013 models.

3) Pioneer is the only manufacturer that has actually provided the bandwidth of the chips in their 2014 models.  They say that they are capable of 18 Gbps so, at least bandwidth-wise, meet the full HDMI 2.0 spec.  They have already announced that they will be providing a future firmware update to add 4:4:4 color sub-sampling.  They make no mention of what their current color bit-depth or sub-sampling is, nor have they said whether the bit-depth can be increased by a future firmware update.

4) Onkyo is the only manufacturer to list HDCP 2.2 compliance on their TX-NR636, 737, & 838 models.  The Onkyo TX-NR535 does not mention HDCP 2.2, so it is assumed that it will only support HDCP 1.3 and below.

5) Yamaha has also announced the release of three lower-end 2014 models: RX-V377, 477, & 577.  All three cap out at 4K/30p.

6) Higher-end 2014 models have not been announced by any of the manufacturers yet.


----------



## Otto Pylot

Nice list. Glad you took the time to compile it. I think Pioneer btw is the only mfr so far who is making their own HDMI 2.0 chipsets so I would expect them at some time to be closer to full compliance than the other mfrs. Now, if the rest of the devices (tv's, stb's, etc) can get the new chipsets.......


----------



## 6athome

Which is better for picture quality? At 4:2:2, HDMI 2.0 can accommodate up to 12 bits of resolution for 4K/UHD at 50/60 fps, while at 4:4:4 it is limited to 8 bits.

Will 4:4:4 at 8-bit give a better picture, or will 4:2:2 with a 10-bit panel?

In the case of a new 4K tv, should one look for a 10-bit panel which can handle 4:2:2, or go with an 8-bit panel with 4:4:4?


----------



## blee0120

What do you guys think about the audio side of HDMI 2.0? I am in need of a new AVR and do not want to spend money on the first available AVR that's HDMI 2.0, since it might not have the full specs needed down the road for 4K BD. With advances in UHD audio, would it be a safer bet going with an amp? I'm using an Oppo, so my AVR is just for audio. I know resale value is much better with an amp than an AVR.


----------



## Dan Hitchman




> Quote:
> Originally Posted by *blee0120*  /t/1523994/hdmi-2-0-cedia-webinar/180#post_24669982
> 
> 
> What do you guys think about the audio side of HDMI 2.0? I am in need of a new AVR and do not want to spend money on the first available AVR that's hdmi 2.0. Since, they might not have the full specs later on down the road for 4K BD. With advances in UHD audio, would it be a safer bet going with an amp? I'm using an Oppo, so my AVR is just for audio. I know resale value is much better with an amp than an AVR.



If you're wanting to have the audio side ready for UHD, I would wait until receivers or pre-amps come out with object audio decoding. Don't just get the first thing that gets released in the next few months. Just because it has HDMI 2.0, the product may not have the full audio/video capabilities of the new UHD media. They haven't even locked down the specs yet.


----------



## HockeyoAJB




> Quote:
> Originally Posted by *6athome*  /t/1523994/hdmi-2-0-cedia-webinar/180#post_24669640
> 
> 
> Which is better for picture quality! 4:2:2, HDMI 2.0 can accommodate up to 12 bits of resolution for 4K/UHD at 50/60 fps. And at 4:4:4, HDMI 2.0 is limited to 8 bits?
> 
> Will 444 at 8 bit give a better picture or will 422 with a 10 bit panel?
> 
> In the case of a new 4K tv should one look for a 10 bit panel which can handle 422 or go with an 8 bit panel with 444?


 

Correct me if I'm wrong, but I believe that a 10 or 12-bit panel can accept and display an 8-bit 4:4:4 signal provided the pipeline between the source and display can support the bandwidth.  Giving it a 4:4:4 signal actually means the TV has to do less work, since it doesn't have to fill in the color information for as many pixels; less color information was discarded to begin with.  On the other hand, an 8-bit panel can never display anything higher than 8-bit color no matter how low you take the resolution.  So, from this perspective, the 10 or 12-bit panel will always have greater potential than an 8-bit panel.

 

However, a 10 or 12-bit panel also costs more than an 8-bit panel, all other things being equal.  So, if you want to know whether or not the extra money for the 10 or 12-bit panel is worth it then you would have to know if any of the content you view on that display will ever be encoded with a bit depth of greater than 8.  To know this for certain, you will have to wait until the UHD specs are finalized.  It appears that they are at least entertaining the idea of going to 10 or 12-bit video in the next few years for UHD content.
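One way to frame the 8-bit 4:4:4 vs. higher-depth 4:2:2 question is to count the bits of picture information each combination carries per pixel. This is my own simplification (it ignores, for instance, how HDMI actually packs 4:2:2 on the wire), but it shows the tradeoff:

```python
# Bits of picture information per pixel: one luma sample per pixel plus
# two chroma components scaled by the subsampling fraction.
CHROMA_FRACTION = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}

def bits_per_pixel(bit_depth, subsampling):
    return bit_depth * (1 + 2 * CHROMA_FRACTION[subsampling])

print(bits_per_pixel(8, "4:4:4"))   # 24.0 -> full chroma, coarser steps
print(bits_per_pixel(10, "4:2:2"))  # 20.0 -> finer steps, half the chroma
print(bits_per_pixel(12, "4:2:2"))  # 24.0 -> same payload as 8-bit 4:4:4
```

So 8-bit 4:4:4 and 12-bit 4:2:2 spend the same raw payload differently, one on chroma resolution and the other on gradation, which is why neither is better for all content.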


----------



## blee0120




> Quote:
> Originally Posted by *Dan Hitchman*  /t/1523994/hdmi-2-0-cedia-webinar/160_40#post_24670259
> 
> 
> If you're wanting to have the audio side ready for UHD, I would wait until receivers or pre-amp's come out with object audio decoding. Don't just get the first thing that gets released in the next few months. Just because it has HDMI 2.0, the product may not have the full audio/video capabilities of the new UHD media. They haven't even locked down the specs. yet.



I will use an Oppo as my pre-amp anyway, but I have a broken AVR at the moment. I'm just going to go with an amp and hope I can use it when a 4K Oppo comes out.


----------



## dabotsonline




> Quote:
> Originally Posted by *HockeyoAJB*  /t/1523994/hdmi-2-0-cedia-webinar/180#post_24659109
> 
> 2014 models
> 
> 
> Pioneer VSX-824 - Up to 4K/60p ?-bit 4:?:? at release (Pioneer claims that it will be upgradable to 4:4:4 via. a future firmware update.  No mention of bit depth.)  18 Gbps, HDCP 1.3?
> 
> Pioneer VSX-1024 - Up to 4K/60p ?-bit 4:?:? at release (Pioneer claims that it will be upgradable to 4:4:4 via. a future firmware update.  No mention of bit depth.)  18 Gbps, HDCP 1.3?
> 
> Pioneer VSX-1124 - Up to 4K/60p ?-bit 4:?:? at release (Pioneer claims that it will be upgradable to 4:4:4 via. a future firmware update.  No mention of bit depth.)  18 Gbps, HDCP 1.3?
> 
> Pioneer Elite VSX-44 - Up to 4K/60p ?-bit 4:?:? at release (Pioneer claims that it will be upgradable to 4:4:4 via. a future firmware update.  No mention of bit depth.)  18 Gbps, HDCP 1.3?
> 
> Pioneer Elite VSX-80 - Up to 4K/60p ?-bit 4:?:? at release (Pioneer claims that it will be upgradable to 4:4:4 via. a future firmware update.  No mention of bit depth.)  18 Gbps, HDCP 1.3?





> Quote:
> Ultra HD (4K) Upscaling and Pass-Through (4K/60p/4:4:4/24-bit*, 4K/24p/4:4:4/36-bit, 4K/60p/4:2:2/36-bit)
> 
> 
> * 4:4:4 color support requires future firmware update


 http://www.pioneerelectronics.com/ephox/StaticFiles/Manuals/Home/VSX-1124-K%20Single%20Sheet_v3.pdf


----------



## Dan Hitchman




> Quote:
> Originally Posted by *blee0120*  /t/1523994/hdmi-2-0-cedia-webinar/180#post_24670302
> 
> 
> I will use an Oppo as my pre amp anyways. But I have a broken avr at the moment. I'm just going to go with an amp and hope I can use it when a 4k Oppo comes out



I don't see how you can plug a player directly into a power amp without some sort of pre-amp in between. Most players only have line-level output signal strength.


----------



## blee0120

A lot of people use their Oppo 105 as a pre-amp, and some even use the 103.


----------



## HockeyoAJB




> Quote:
> Originally Posted by *dabotsonline*  /t/1523994/hdmi-2-0-cedia-webinar/180#post_24670307
> 
> 
> 
> http://www.pioneerelectronics.com/ephox/StaticFiles/Manuals/Home/VSX-1124-K%20Single%20Sheet_v3.pdf


 

Thanks.  I updated my list above to reflect this.  Just so nobody is confused: when a manufacturer quotes 24- or 36-bit color, that is the total number of bits across all three primaries.  For the purposes of the above list, all values will be expressed as bits per primary, since that is how they are most commonly listed.  In other words, if a manufacturer says 24 bits, that translates to 8 bits per color (8 x 3 = 24), and if they say 36 bits, that translates to 12 bits per color (12 x 3 = 36).
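For anyone sanity-checking a spec sheet, that conversion can be sketched as a tiny Python helper (the function name is mine, not from any vendor tool):

```python
# Illustrative helper: convert a manufacturer's "total" color-depth figure
# (bits summed across all three primaries) into bits per primary.
def bits_per_primary(total_bits: int) -> int:
    # 24-bit total -> 8 bits per primary; 36-bit total -> 12
    return total_bits // 3

print(bits_per_primary(24))  # 8
print(bits_per_primary(36))  # 12
print(bits_per_primary(30))  # 10 (the "Deep Color" case)
```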


----------



## Dan Hitchman




> Quote:
> Originally Posted by *blee0120*  /t/1523994/hdmi-2-0-cedia-webinar/180#post_24670361
> 
> 
> A lot use there Oppo 105 as a pre amp and even some with the 103



What amp are you wanting to get? Multi-channel?


----------



## blee0120




> Quote:
> Originally Posted by *Dan Hitchman*  /t/1523994/hdmi-2-0-cedia-webinar/180#post_24670707
> 
> 
> What amp are you wanting to get? Multi-channel?



A 7-channel amp. I have amps for my 3 subs already. I actually used PA speakers with my Oppo, and it didn't sound too bad.


----------



## dabotsonline




> Quote:
> Originally Posted by *HockeyoAJB*  /t/1523994/hdmi-2-0-cedia-webinar/180#post_24670695
> 
> 
> Thanks.  I updated my list above to reflect this.  Just so nobody is confused, when a manufacturer quotes 24 or 32 bit color, that is the total # of bits for all 3 primaries.  For the purposes of the above list, all values will be expressed in terms of number of bits per primary, since that is how they are most commonly listed.  In other words, if a manufacturer says 24 bits, that translates to 8 bits per color (8 x 3 = 24) and if they say 36 bits, that translates to 12 bits per color (12 x 3 = 36).


Cheers. You might also want to point out that the $600 Pioneer VSX-1124 and $700 Pioneer Elite VSX-80 have two, rather than one, HDMI 2.0 18 Gbps outputs. I don't think many AV receivers feature this, but it will be useful for those who own two displays. (Indeed, I'd like to see more budget screens along the lines of the Seiki range, with no onboard tuners, processing, or upscaling whatsoever, like a monitor, so that it could all be left to the AVR and its input devices. It'd be even better if they featured no speakers and only a single HDMI 2.0 input, so the price could be driven down further; the money could instead go toward a 10-bit or even 12-bit panel with minimal motion blur and input lag. There are a couple of proposals in the Ultra-D thread from people who could use the second HDMI 2.0 output on the above AVRs for a second, dedicated glasses-free 3D screen; if my proposed bare-bones screen were 2D only, like the Seiki, that would drive the price down further still.)


I'm looking forward to seeing the other manufacturers' AVRs and also preamps. Several of the latter feature XLR outputs but, as far as I'm aware, only the $1100 Denon DN-500AV preamp also features XLR inputs (useful for DJ mixers with XLR outputs, such as the $1000 Denon DN-X1600, which also features a built-in soundcard), so I'm interested to see the HDMI 2.0 successor to that. Incidentally, D&M Holdings recently sold Denon Professional, Marantz Professional, and Denon DJ to inMusic Brands (which already includes Akai Professional, Alesis, Alto Professional, ION Audio, Numark, M-Audio, AIR Music Technology, and Mixmeister), so products on the D&M roadmap will presumably come to market later this year, whereas products developed under inMusic will presumably come to market in 2016.


----------



## canton160

So... HDMI 1.4-class 4K/60p (4:2:0) is now called HDMI 2.0 compliant without a hardware upgrade?


Or will it be called HDMI 1.4 with support for 4K/60p (4:2:0)? Some manufacturers have already put their specs out: "2160p/60Hz (HDMI 1.4)", see the Lumagen website.


Or the Oppo firmware:
_Added support for the "4K@50Hz/60Hz" resolution to the BDP-103D/105D (HDMI 1 only, and in YCbCr 4:2:0 format only)._


----------



## Otto Pylot

Those specs are in the "gray area": even though they stay within the HDMI 1.4a maximum bandwidth (10.2 Gbps), they fall under the HDMI 2.0 umbrella because of backward compatibility. It is definitely shady to call it 2.0 without some sort of qualification, but the manufacturers are allowed to get away with it.
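The "gray area" is easy to see with back-of-the-envelope math. As a rough sketch (my own simplification: it ignores blanking intervals and audio, and assumes the usual 8b/10b TMDS coding overhead), 4K/60 8-bit 4:2:0 slips under the 10.2 Gbps HDMI 1.4 ceiling, while 8-bit 4:4:4 does not:

```python
# Rough TMDS bandwidth estimate (ignores blanking intervals and aux data;
# assumes 8b/10b coding, which adds 25% overhead on the pixel payload).
def tmds_gbps(width, height, fps, bits_per_primary, samples_per_pixel):
    payload = width * height * fps * bits_per_primary * samples_per_pixel
    return payload * 10 / 8 / 1e9

# 4:2:0 averages 1.5 samples per pixel; 4:4:4 carries 3.
print(round(tmds_gbps(3840, 2160, 60, 8, 1.5), 2))  # 7.46  -> fits in 10.2 Gbps
print(round(tmds_gbps(3840, 2160, 60, 8, 3.0), 2))  # 14.93 -> needs HDMI 2.0's 18 Gbps
```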


----------



## helvetica bold

Which TVs have 12-bit panels? Geez, I just found out my Sony W9 is 10-bit with 12-bit processing.





----------



## blee0120

Does this mean that current 4K GPUs will be able to output games at 4K/60Hz?


----------



## MikeyD360




> Quote:
> Originally Posted by *blee0120*  /t/1523994/hdmi-2-0-cedia-webinar/180#post_24708753
> 
> 
> Do this mean that current 4k GPUs will be able to output games at 4k60hz?



Currently, not at any meaningful quality. Given that some next-gen console titles still can't run at 1080p/60, quadrupling the pixels isn't going to be easy. But I imagine AMD and nVidia are working hard.


----------



## shae

Will HDMI 2.0 TVs be more likely to (officially) accept 120Hz input from computers, or at the very least more than 60Hz?


----------



## BlueChris


Hi guys... my 2nd post here. I came across this thread seeking crucial information about HDMI 2.0, which Scott explained so well.

 

I'm waiting for an LG 55UB850V (4K, 60p, HEVC) to arrive in a week... According to the manual, its HDMI 2.0 port is 4K/60p 10-bit, but nothing is said there about 4:2:0 or 4:2:2. I emailed LG customer support asking for clarification and am waiting for their answer, but the more I read, the more I lean toward canceling the order and waiting some months for things to settle down.

The problem is that my old LG fell and was destroyed (don't ask how), so I need a TV ASAP, and this TV's specs and picture quality seem pretty good for the price (1700€ in Greece, with an LG BD740 4K player as a bonus).

 

So I'm a bit stuck...


----------



## Otto Pylot

Sounds like you NEED a TV now, so I'd say go for it. It's a good TV, so you should be happy with it. I personally don't think the TV manufacturers are going to be completely transparent about what their HDMI 2.0 TVs support until late this year or next year. Once the chips are readily available, I think we'll see actual features listed instead of "HDMI 2.0 Compliant," which is very vague. There's probably going to be a huge push as it gets closer to Black Friday (Week), with all kinds of marketing spin that will probably confuse matters even worse. TV manufacturers are going to want to clear out the "HDMI 2.0 Compliant" inventory, or whatever they end up calling it, because the true HDMI 2.0 sets (with embedded chipsets) will be coming.


----------



## thebland

Is it true that HDMI 1.4 can carry 8 channels of multi-channel audio but HDMI 2.0 can carry up to 24?


----------



## sdurani

HDMI 2.0 can carry 32 channels of PCM audio.


----------



## thebland




> Quote:
> Originally Posted by *sdurani*  /t/1523994/hdmi-2-0-cedia-webinar/180#post_24767745
> 
> 
> HDMI 2.0 can carry 32 channels of PCM audio.



Thanks, Sanjay! And the limit on HDMI 1.4?


----------



## HockeyoAJB




> Quote:
> Originally Posted by *thebland*  /t/1523994/hdmi-2-0-cedia-webinar/210#post_24767758
> 
> 
> 
> Thanks, Sanjay! And the limit on HDMI 1.4?


 

From http://www.audioholics.com/hdtv-formats/understanding-difference-hdmi-versions

 

HDMI 1.0 and higher = up to 8 channels of 24-bit/192 kHz audio (PCM)

HDMI 1.1 and higher = support for DVD Audio hi-res format

HDMI 1.2 and higher = up to 8 channels of DSD (SACD) audio

HDMI 1.3 and higher = supported output of up to 8 channels of native Dolby TrueHD and DTS-HD Master Audio for external decoding by AVR.

HDMI 2.0 and higher = up to 32 audio channels & up to 1536 kHz audio sampling rate
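To put the audio numbers above in perspective, here's a quick sketch of the raw PCM payloads involved (my own arithmetic; the 24-bit sample depth in the HDMI 2.0 case is an assumption, since the list above doesn't state it):

```python
# Raw (uncompressed) PCM payload for the per-version maximums listed above.
# Bit depth for the HDMI 2.0 row is an assumption, not from the spec table.
def pcm_bitrate_mbps(channels, sample_rate_hz, bits_per_sample):
    return channels * sample_rate_hz * bits_per_sample / 1e6

print(pcm_bitrate_mbps(8, 192_000, 24))     # 36.864   (HDMI 1.0+ maximum)
print(pcm_bitrate_mbps(32, 1_536_000, 24))  # 1179.648 (HDMI 2.0 maximum, assumed 24-bit)
```

Even the HDMI 2.0 worst case is barely more than a gigabit per second, a small slice of the 18 Gbps link.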


----------



## sdurani




> Quote:
> Originally Posted by *thebland*  /t/1523994/hdmi-2-0-cedia-webinar/210#post_24767758
> 
> 
> And the limit on HDMI 1.4?


8 channels of LPCM.


However, HDMI 1.3 and HDMI 1.4 can pass encoded bitstreams without knowing what's in the bitstream. So if you have, for example, a Dolby Atmos soundtrack with 9-channel beds and various audio objects, data-packed using Dolby TrueHD, then the player will see the TrueHD flag and let it through.


Those 8-channel or 32-channel limitations are for PCM audio.


----------



## HockeyoAJB




> Quote:
> Originally Posted by *sdurani*  /t/1523994/hdmi-2-0-cedia-webinar/210#post_24767956
> 
> 
> 
> 8 channels of LPCM.
> 
> 
> However, HDMI 1.3 and HDMI 1.4 can pass encoded bitstreams without knowing what's in the bitstream. So if you have, for example, a Dolby Atmos soundtrack with 9 channel beds and various audio objects that have been data packed using Dolby TrueHD, then the player will see the TrueHD flag and let it through.
> 
> 
> Those 8-channel or 32-channel limitations are for PCM audio.


 

True.  Though I think that any AVR or pre/pro that can decode Dolby Atmos would use HDMI 2.0 or later hardware anyway.  Also, any movies that feature a Dolby Atmos soundtrack will likely also be 4K resolution, which means they will require a new player, also using HDMI 2.0 hardware.  If an audio disc is ever released that uses a Dolby TrueHD or DTS-HD Master Audio encoding with more than 8 channels, then you could potentially play it on a current HDMI 1.3/1.4-compliant Blu-ray player, provided your AVR or pre/pro can decode more than 8 channels.  Otherwise, you would need to encode the audio in a compatible format yourself.


----------



## sdurani




> Quote:
> Originally Posted by *HockeyoAJB*  /t/1523994/hdmi-2-0-cedia-webinar/210#post_24768043
> 
> 
> I think that any AVR or pre/pro that can decode Dobly Atmos would use HDMI 2.0 or higher hardware anyway.


But the players don't need to. If a 7-year-old BD player can transmit an Atmos bitstream, then it will make for easier adoption by consumers.


> Quote:
> Originally Posted by *HockeyoAJB*  /t/1523994/hdmi-2-0-cedia-webinar/210#post_24768043
> 
> 
> Also, it is likely that any movies that feature a Dolby Atmos soundtrack will likely also be 4K resolution, which means it will require a new player, also using HDMI 2.0 hardware.


I don't see why Dolby would wait for some over-the-horizon 4K disc just to release Atmos, considering that video resolution has no bearing on object-based audio.


Being a public company, Dolby has a fiduciary responsibility to its shareholders. I'm guessing the price tag for developing Atmos was significant, so they would want a return on investment ASAP. The moment Atmos decoders hit the consumer marketplace, I would be encouraging studios to get their Atmos mixes out on BD, DVD, streaming, cable, satellite, broadcast, etc.


HDMI 2.0 is already in this year's crop of AVRs, so there is no question that it will also be in upcoming object-aware AVRs. My point was simply that it is not required in a player in order to pass an Atmos soundtrack. The channel limitations being discussed have to do with LPCM, not bitstream.


----------



## HockeyoAJB




> Quote:
> Originally Posted by *sdurani*  /t/1523994/hdmi-2-0-cedia-webinar/210#post_24768257
> 
> 
> 
> But the players don't need to. IF a 7-year-old BD player can transmit an Atmos bitstream, then it will make for easier adoption by consumers.
> 
> I don't see why Dolby would wait for some over-the-horizon 4K disc just to release Atmos, considering that video resolution has no bearing object based audio.
> 
> 
> Being a public company, Dolby has a fiduciary responsibility to its shareholders. I'm guessing the price tag for developing Atmos was significant, so they would want a return on investment ASAP. The moment Atmos decoders hit the consumer marketplace, I would be encouraging studios to get their Atmos mixes out on BD, DVD, streaming, cable, satellite, broadcast, etc.
> 
> 
> HDMI 2.0 is already in this year's crop of AVRs, so there is no question that it will also be in upcoming object-aware AVRs. My point was simply that it is not required in a player in order to pass an Atmos soundtrack. The channel limitations being discussed have to do with LPCM, not bitstream.


 

Yeah.  It will likely depend on what distribution methods exist for 4K content when Dolby Atmos-compatible AVRs hit the market.  If 4K is still limited to streaming only, then perhaps Dolby could persuade the studios to include Atmos soundtracks on Blu-rays.  However, if a new physical format is on the horizon, then CE manufacturers (some of which have close ties to the studios) might push the studios to make it an exclusive feature of the new hardware/media in order to help drive sales of the new products.


----------



## Dan Hitchman

Dolby Atmos for the home is probably still in the developmental stages, or Dolby would be shouting it from the rooftops. DTS claims their consumer-oriented DTS-MDA object format (called DTS-UHD) is ready for prime time, though I haven't heard any details about its capabilities. SMPTE wants any and all object audio formats to play nicely together (I would assume at least on the speaker mapping/rendering side of the equation), so that may take a bit of doing.


It may be that standard Blu-ray discs would not have the extra bandwidth and storage space to handle a high quality object track (that isn't really dumbed down) plus 1080p video. Advanced UHD media and HDMI 2.0 may be the only way to deliver everything we want.


----------



## sdurani




> Quote:
> Originally Posted by *HockeyoAJB*  /t/1523994/hdmi-2-0-cedia-webinar/210#post_24768595
> 
> 
> If 4K is still limited to streaming only then perhaps Dolby could persuade the studios to include Atmos soundtracks on Blu-Rays.


I'm still not understanding your rationale for delaying the release. Why couldn't Atmos soundtracks be streamed like discrete multi-channel is these days? (This would be in addition to putting lossless versions of those soundtracks on BD.)


----------



## TMcG




> Quote:
> Originally Posted by *HockeyoAJB*  /t/1523994/hdmi-2-0-cedia-webinar/210#post_24768595
> 
> 
> However, if a new physical format is on the horizon then CE manufacturers (some of which have close ties to the studios) might push the studios to make it an exclusive feature of the new hardware/media in order to help generate sales of the new products.



New 4K Blu-ray replication machine: http://www.cnet.com/news/100gb-discs-point-to-4k-blu-ray/ 


Ironically, there seem to be an equal number of industry news articles trumpeting the inevitable arrival of 4K Blu-ray in the near future as there are declaring 4K Blu-ray "dead tech walking".


Although 1080p streaming picture quality is generally very good on traditional small screens, it's still a very noticeable downgrade from Blu-ray on a large projection system, never mind the constant battle with network availability. Given the enormous increase in data for 4K vs. 1080p, especially with full-bit-rate color and audio, I can't see how 4K could succeed on anything but physical media for the near future, unless you have a 4K device willing to buffer far longer than you'd ever want to wait before the media begins playing at full resolution.
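The scale of that data increase is easy to illustrate. As a rough sketch (my own numbers, comparing uncompressed pixel payloads rather than encoded bitrates, so treat it purely as a scale argument), full 4K/60 10-bit 4:4:4 carries 25 times the raw pixel data of Blu-ray's 1080p/24 8-bit 4:2:0:

```python
# Uncompressed pixel-payload comparison (illustration only; real delivery
# bitrates depend heavily on the codec used).
def pixel_rate_bps(width, height, fps, bits_per_primary, samples_per_pixel):
    return width * height * fps * bits_per_primary * samples_per_pixel

uhd = pixel_rate_bps(3840, 2160, 60, 10, 3.0)  # 4K/60 10-bit 4:4:4
hd  = pixel_rate_bps(1920, 1080, 24, 8, 1.5)   # 1080p/24 8-bit 4:2:0 (Blu-ray)
print(uhd / hd)  # 25.0
```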


And to me there is inherent risk in going with streaming-only media, as ISPs are constantly looking for ways to throttle the bandwidth of high-demand users and manipulate download speeds... or happily charge you extra fees for a higher level of service. If I've already purchased a 4K digital download of a movie, for example, how many times must I pay for the bandwidth again and again to stream it to my display? Plus, I just don't like relying on a pay service as a necessary precursor to watching media I already own. Just my two cents, of course.


----------



## HockeyoAJB




> Quote:
> Originally Posted by *sdurani*  /t/1523994/hdmi-2-0-cedia-webinar/210#post_24768846
> 
> 
> I'm still not understanding your rationale for delaying the release. Why couldn't Atmos soundtracks be streamed like discrete multi-channel is these days? (This would be in addition to putting lossless versions of those soundtracks on BD.)



I hadn't considered the possibility that they would do lossy versions of object-based audio. I was assuming it would all be lossless and would be marketed as ultra-premium audio. I figured that when it became available, it would be included only with the highest-quality content (either Blu-ray or its replacement). Once that was done, they could finally upgrade the audio for broadcast and rentals to 7.1 lossless.


However, your idea of doing both lossless and lossy versions of object-based audio for new content would work better. It still allows for a distinction to be made between premium and standard content, while providing incentive to purchase new 4K and 1080p equipment. And, eventually they could stop doing channel-based mixes, as everyone would be expected to have a system that supports object-based audio.


----------



## sdurani




> Quote:
> Originally Posted by *HockeyoAJB*  /t/1523994/hdmi-2-0-cedia-webinar/210#post_24769115
> 
> 
> I was assuming that it would all be lossless and would be marketed as ultra-premium audio.


Guess I'm guilty of assuming the opposite, since everything (music and movies, audio and video) has lossy versions delivered to consumers. With the popularity of iTunes and Netflix, I took it for granted that object-based soundtracks would reach their widest audiences in lossy form. Just as folks don't need HDMI 2.0 on their media streamers to watch 'House of Cards' streamed in 4K, likewise they shouldn't need it to bitstream an Atmos soundtrack.


----------



## BlueChris




> Quote:
> Originally Posted by *Otto Pylot*  /t/1523994/hdmi-2-0-cedia-webinar/180#post_24763750
> 
> 
> Sounds like you NEED a tv now so I'd say go for it. It's a good tv so you should be happy with it. I personally don't think that the tv mfrs are going to be completely transparent in what their HDMI 2.0 tv's have in the way of compliance until late this year or next year. Once the chips are readily available I think we'll see more features listed etc instead of "HDMI 2.0 Compliant" which is very vague. There's probably going to be a huge push as it gets closer to "Black Friday (Week)" with all kinds of marketing spin which will probably confuse matters even worse. TV mfrs are going to want to be getting rid of the "HDMI 2.0 Compliant" tv's inventory, or what ever they will be calling them because the true HDMI 2.0 sets (with embedded chipsets) will be coming.


 

Thanks mate, I'm expecting the TV to arrive on June 5th, and I will provide info if needed.

 

Point is, I emailed LG asking for the exact specs of the TV's HDMI 2.0 port, its color depth capabilities, and so on... and they replied telling me to look up the HDMI 2.0 specs on Wikipedia, and that the TV complies with everything!!??

 

Well, this is too good to be true, so I'm holding my horses, but I'm also keeping their answer, so that in 1-2 years, if I plug something in and it won't play, they will have it back.


----------



## Otto Pylot




> Quote:
> Originally Posted by *BlueChris*  /t/1523994/hdmi-2-0-cedia-webinar/220_20#post_24770517
> 
> 
> Thx m8 im w8ting the TV at 5 of June to arrive and i will provide info if needed..
> 
> 
> Point is i mailed LG asking them the exact specs of the HDMI-2.0 that the TV has and what abilities as matter color depth etc.. they reply to me to see in the wikipedia the specs of the HDMI-2.0 and that the TV complies with everything!!??
> 
> 
> Well this is too good to be true and i hold my horses but i keep also the answer so in 1-2 years if i plug something and it wont play then they will have it back



Any customer support group that refers a customer back to a Wikipedia article as a reference for their product is not to be trusted. Obviously the support person did not understand your question, or doesn't really know anything about the new LG, and just directed you to a definition of HDMI 2.0. LG, and other manufacturers, need to tell the customer, preferably in writing in their product literature, which parts of the HDMI 2.0 spec their devices support.


----------



## HockeyoAJB




> Quote:
> Originally Posted by *Otto Pylot*  /t/1523994/hdmi-2-0-cedia-webinar/210#post_24771361
> 
> 
> 
> Any customer support group that refers a customer back to a Wikipedia article as a reference for their product is not to be trusted. Obviously the support person did not understand your question, or really knows anything about the new LG, and just directed you to a definition of HDMI 2.0. LG, and other mfrs, need to tell the customer, preferably in writing in their product literature, which protocols of the HDMI 2.0 spec their device or devices support.


+1

 

Also, the wiki page doesn't really give you a clear picture of the different combinations of resolution, frame rate, bit depth, and color subsampling supported by HDMI 2.0.  For example, it shows support for 4:2:0, 4:2:2, and 4:4:4, but no indication of what bit depths are available for each of these and at what frame rates.  For 4Kp50/60 8-bit video, HDMI 2.0 supports RGB, 4:2:0, and 4:4:4.  For 4Kp50/60 10-bit & 16-bit video, HDMI 2.0 only supports 4:2:0.  For 4Kp50/60 12-bit video, HDMI 2.0 supports both 4:2:2 and 4:2:0.  Change the frame rate to 24/25/30 and the color space and color sub-sampling support is different.
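Bandwidth explains most, though not all, of that table, since the spec only enumerates specific combinations. As a sketch of the bandwidth side alone (same simplifications as before: my own arithmetic, ignoring blanking intervals and assuming 8b/10b TMDS overhead), here is which 4Kp50/60 combinations stay under the 18 Gbps ceiling:

```python
# Which 4K/60 bit-depth / chroma-subsampling combos fit under 18 Gbps TMDS?
# Bandwidth check only: fitting is necessary but not sufficient, because the
# spec lists specific formats (e.g. 10-bit 4:2:2 fits yet isn't listed).
SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def fits_18gbps(bits_per_primary, subsampling, w=3840, h=2160, fps=60):
    payload = w * h * fps * bits_per_primary * SAMPLES_PER_PIXEL[subsampling]
    return payload * 10 / 8 <= 18e9  # 8b/10b coding overhead

for bits in (8, 10, 12):
    for fmt in ("4:4:4", "4:2:2", "4:2:0"):
        verdict = "fits" if fits_18gbps(bits, fmt) else "over 18 Gbps"
        print(f"{bits}-bit {fmt}: {verdict}")
```

This reproduces the headline cases: 8-bit 4:4:4 and 12-bit 4:2:2 fit, while 10-bit and 12-bit 4:4:4 blow past 18 Gbps at 4K/60.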

 

At the very least, they could have referred you to the Overview of HDMI 2.0 in the Knowledge Base on HDMI.org's official website...

http://www.hdmi.org/learningcenter/kb.aspx#119

 

But, even then they would still be misleading you in all likelihood.  I very much doubt that the TV could pass 32 channel audio out to another device and it certainly can't play it as 32 discrete channels.


----------



## Otto Pylot

The audio portion of HDMI 2.0 is one of those specs that will probably never make it to consumer devices, at least not for quite some time, sort of like Deep Color and Ethernet with 1.4. I'm more concerned with picture quality/bit depth, CEC extensions, etc. But yeah, the smoke and mirrors around HDMI 2.0 compatibility is going to get worse as we get closer to the holidays. I'm just waiting for the Black Friday (Week) claims, just to watch the feeding frenzy.


Oh, and don't forget those "HDMI 2.0" cables that will be advertised as well. On sale at a special price, no less.


----------



## SoundChex




> Quote:
> Originally Posted by *HockeyoAJB*  /t/1523994/hdmi-2-0-cedia-webinar/210#post_24771536
> 
> 
> I very much doubt that the TV could pass 32 channel audio out to another device and it certainly can't play it as 32 discrete channels.


*ETRI* is currently working with *NHK* _on wavefront synthesis research_, and with *LG* _on pre-production development_, to deliver a consumer version of that technology using both *top and bottom of display* soundbars. (_And I saw another ETRI "design" which included a *third* soundbar located behind the audience!_)

 

 


From the graphics, it's hard to tell the point in the process at which the broadcast|disc|etc bitstream is decoded|rendered into multichannel PCM for subsequent delivery to the _3Daudio_ soundbar system over HDMI 2.0.


One option for home playback of a *Dolby Atmos* movie could be to first render the movie into a (_nominal_) *Hamasaki 22.2* mix (_in a manner similar to that in which the 5.1 and 7.1 theatrical mixes are generated from the Atmos object code_) and subsequently play that 22.2 audio back through a *SoundWindow*. _How acceptable might that be to the average home audience? Hard to tell without production hardware to experience._


----------

