We can already do 8K resolution. We still haven’t gotten to a point where the average broadcast is anything more than 720p or 1080p.
It’s the reason I never bought a 4K or 8K TV. Sure, I have a new TV, but the only thing that’s 8K is the ads forced into the TV’s OS.
And that’s why I don’t see the benefit of a new HDMI spec. It’s just going to support more protocols and make TVs do more things we don’t want. It’s going to put DRM in the cable. It’s going to enable unskippable ads and all this shit that nobody wants or needs, but ooooOOOOooooo!!! Look! It’s new tech, so everybody gotta have it!!!
But how’s it going to improve your visual experience if the content isn’t any higher resolution to begin with?
No, but it’s the only reason for HDMI (along with AV receivers): it’s all the manufacturers support, so all of us home theater nerds who do care about this stuff have no real choice but to keep up with the HDMI world. Yes, you can set up a media server that streams 4K video, but you’re not going to find a DisplayPort 4K UHD player, or a 7.2 AVR that plugs into your 77” OLED and supports all of your game consoles. HDMI is just the unfortunate reality there.
That said, the tech that actually takes advantage of new cable specs tends to lag significantly, and new gaming consoles that support HDMI 2.2 will likely do so in a limited (ultimately disappointing) capacity for years, just like with previous versions.
(Also, the top comment in this thread doesn’t really reflect modern reality for most people I know. Most people use their TV to stream at 1080p–4K, not to watch broadcast TV. In that case, sure, get a $60 720p LCD or whatever; HDMI specs won’t matter to that kind of viewer. Still, subscription streaming quality definitely doesn’t take full advantage of your expensive shiny new TV the way physical media, or a media server, might, but that’s another conversation.)
I’d like to be able to run my three 4K monitors without compression via a single port, and still have the option for one of them to run at a high refresh rate. Things like spreadsheets benefit far more from increased resolution than shows do anyway, imo.
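For context, a quick back-of-the-envelope check of whether three uncompressed 4K streams fit on one DisplayPort 2.1 link. This is a sketch under my own assumptions (two monitors at 4K60 8-bit, one at 4K144 10-bit, and a ~77.37 Gbit/s effective UHBR20 payload rate); real link timings add blanking overhead on top of the active-pixel rate:

```python
# Back-of-the-envelope: do three uncompressed 4K streams fit on one DP 2.1 link?
# Assumptions (mine, not the commenter's): two monitors at 4K60 8-bit RGB,
# one at 4K144 10-bit RGB, DP 2.1 UHBR20 effective payload ~77.37 Gbit/s
# after 128b/132b encoding. Active pixels only; blanking overhead ignored.

def stream_gbps(width, height, hz, bits_per_channel):
    """Uncompressed RGB data rate in Gbit/s (3 color channels, active pixels)."""
    return width * height * hz * bits_per_channel * 3 / 1e9

total = 2 * stream_gbps(3840, 2160, 60, 8) + stream_gbps(3840, 2160, 144, 10)
UHBR20_EFFECTIVE = 77.37  # Gbit/s, assumption

print(f"total ~{total:.1f} Gbit/s; fits uncompressed: {total < UHBR20_EFFECTIVE}")
```

Under these assumptions the three streams sum to roughly 60 Gbit/s, so they do fit uncompressed, though only on the newest link rates.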
On HDMI 2.1, you can do 8K at 30 fps before you have to compress the stream with DSC, which is “visually lossless”, i.e., actually lossy. We don’t even have 5K at 120 fps or 4K at 240 fps without compression, and those are common refresh rates for gaming. So you could say that the highest resolution that supports all use cases without compromise is 1440p. That’s definitely not enough even by today’s standards. I think you’re underestimating the time it takes for standards to reach widespread adoption. The average viewer won’t have access to it until the technology is cheap and enough time has passed for at least a few hundred million units to reach the market. If you think you’ll be using it in 2030 or later for your “average broadcast”, then it needs to be designed today.
Of course HDMI is shit, for the reasons you mention. DisplayPort is better, but it’s not an open standard either, and it supports DRM as well. But designing for higher bandwidth is necessary, and it has nothing to do with HDMI’s problems.
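The bandwidth claims above can be sanity-checked with simple arithmetic. A sketch under my assumptions (~42.6 Gbit/s effective HDMI 2.1 FRL payload after 16b/18b encoding, RGB, 10-bit color for the gaming modes, active pixels only, blanking ignored):

```python
# Sanity check of the uncompressed-bandwidth claims above.
# Assumptions: HDMI 2.1 FRL effective payload ~42.6 Gbit/s (48 Gbit/s raw
# minus 16b/18b encoding overhead), RGB, active pixels only (no blanking).

def stream_gbps(width, height, hz, bits_per_channel):
    """Uncompressed RGB data rate in Gbit/s (3 color channels)."""
    return width * height * hz * bits_per_channel * 3 / 1e9

HDMI_2_1_EFFECTIVE = 42.6  # Gbit/s, approximate

modes = {
    "8K30, 8-bit":   stream_gbps(7680, 4320, 30, 8),    # ~23.9
    "4K120, 10-bit": stream_gbps(3840, 2160, 120, 10),  # ~29.9
    "5K120, 10-bit": stream_gbps(5120, 2880, 120, 10),  # ~53.1
    "4K240, 10-bit": stream_gbps(3840, 2160, 240, 10),  # ~59.7
}

for name, gbps in modes.items():
    verdict = "fits uncompressed" if gbps < HDMI_2_1_EFFECTIVE else "needs DSC"
    print(f"{name}: ~{gbps:.1f} Gbit/s -> {verdict}")
```

Consistent with the comment: 8K30 and 4K120 squeeze through uncompressed, while 5K120 and 4K240 exceed the link rate and fall back to DSC.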
Yes. It’s nice to have crisp fonts, but how realistic do the pictures on my screen really need to be?
Life is finite. I have my dog, family members, some need for rest and food and cleaning and music, forgotten chores, and a girl I’d like to talk to (likely it’s either my imagination and she doesn’t like me that way, or the ship has sailed, but still, to hope is to live).
Pictures on an ordinary 1920x1080 display are already as good as anything people of the 1950s could have dreamed of seeing on paper. I’m not judging those who want more, but it seems like pressure for the sake of it. More GHz, more pixels, bigger something.
I hope that pressure for “more” will change to pressure for “better” some day.
No thank you. I’d like to pass on this.
do you really think the only use for a screen is television? boy, let me introduce you to the world of personal computing, it’s all the rage
You mean there’s other uses besides the weather channel?
It might be handy for huge billboard displays and the like
Get better content.