I just upgraded my older Roku streaming device to a new "Roku Streaming Stick 4K". The previous one was 1080p. We've had a 70" 4K television for a couple of years now.
I knew that going 4K would gain me nothing; the human eye just cannot make use of that kind of resolution from our viewing distance on the couch, 13 feet from the screen. It's barely worth moving from 720p to 1080p at that distance. We bought the 4K TV because if you wanted a TV with the newer nice features, you simply had to buy one that was 4K. You could still find 1080p sets at the time, but they didn't have the best features, and they were only a little less expensive than a 4K set. So there was really no reason NOT to get a 4K set.
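As a rough sanity check on that claim, here's a back-of-the-envelope calculation (my 70" screen and 13-foot couch distance are plugged in; the ~60 pixels-per-degree cutoff is the commonly quoted limit for 20/20 vision):

```python
import math

# How many pixels per degree does each resolution deliver on a
# 70" 16:9 screen viewed from 13 feet?
DIAGONAL_IN = 70
DISTANCE_IN = 13 * 12  # 13 feet, in inches

# Screen width from the diagonal of a 16:9 panel
width_in = DIAGONAL_IN * 16 / math.hypot(16, 9)

# Horizontal field of view the screen occupies, in degrees
fov_deg = 2 * math.degrees(math.atan((width_in / 2) / DISTANCE_IN))

for name, h_pixels in [("720p", 1280), ("1080p", 1920), ("4K", 3840)]:
    print(f"{name:>5}: {h_pixels / fov_deg:5.1f} pixels per degree")

# 20/20 vision resolves roughly 1 arcminute, i.e. about 60 pixels per
# degree. At this distance 1080p already exceeds that (~87), so the
# extra pixels of 4K (~174) are wasted on my eyes.
```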
So why did I upgrade my Roku? Because of HDR support. (It's also faster than my previous one and has a much better remote.)
My old Roku didn't have HDR support. Unlike the 4K resolution, which is basically useless to me, HDR is a definite bonus. You can see the difference between HDR and non-HDR from any viewing distance. It spreads out the dynamic range ("HDR" stands for "High Dynamic Range"), opening up the shadows and toning down the burned-out highlights.
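To make "spreads out the dynamic range" a bit more concrete, here's a hedged sketch: it compares where different scene brightnesses land in a traditional SDR signal versus the PQ transfer function that HDR10 is built on (the 100-nit SDR peak and simple 1/2.2 gamma are simplifying assumptions):

```python
# Where do scene luminances land in an SDR signal (clipped at a
# ~100-nit reference peak, simple 1/2.2 gamma) versus an HDR10/PQ
# signal (absolute scale, up to 10,000 nits)?

SDR_PEAK_NITS = 100.0

def sdr_signal(nits):
    # Anything brighter than the SDR peak just clips to full white.
    return min(nits / SDR_PEAK_NITS, 1.0) ** (1 / 2.2)

def pq_signal(nits):
    # SMPTE ST 2084 (PQ) inverse EOTF, normalized to 10,000 nits.
    m1, m2 = 0.1593017578125, 78.84375
    c1, c2, c3 = 0.8359375, 18.8515625, 18.6875
    y = min(nits / 10000.0, 1.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

for nits in (0.1, 1, 10, 100, 500, 1000, 4000):
    print(f"{nits:7.1f} nits  SDR={sdr_signal(nits):.3f}  PQ={pq_signal(nits):.3f}")

# Everything above ~100 nits is "burned out" (1.000) in SDR, while the
# PQ signal still has headroom left for bright highlights.
```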
Note that it's not just the Roku that needs to support HDR. The TV needs to support it (most do these days), and every component between the Roku and the TV needs to support it - and that includes your HDMI cables. HDMI cable support pretty much means "buy a good quality cable, not an el-cheapo brand". Furthermore, the video source has to use HDR, and it has to use a flavor of HDR that your components support. In other words, you are nowhere near guaranteed HDR viewing just because you buy a device with "HDR" on its box.

In my case, I plugged the new Roku directly into my TV's HDMI port, so there are no middleman components or cables to get in the way. I have set up a "TV centric" system instead of an "A/V receiver centric" system. In other words, all of my video components - Roku, cable box, etc. - connect to the TV, and the TV then connects to the A/V receiver over a digital audio connection; switching between devices means using the TV's controls. In an A/V receiver centric setup, all your devices plug into the receiver, the receiver plugs into the TV (video connection), and you use the receiver's controls to switch between devices. An A/V receiver centric system puts the receiver and its cables in the chain of things that need to support HDR; in a TV centric setup, the streaming device plugs directly into the TV, so there is no receiver or extra cable in the chain.
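If you want to double-check what the Roku itself reports, Roku devices answer simple HTTP queries on your home network (the "External Control Protocol"). A minimal sketch, assuming 192.168.1.50 is your Roku's LAN address (substitute your own; the exact fields in the response vary by model and firmware):

```python
import urllib.request

# Roku's External Control Protocol listens on port 8060; the
# /query/device-info endpoint returns an XML description of the device.
# Look through the output for the display/HDR-related fields your
# model reports.
ROKU_IP = "192.168.1.50"  # assumption: replace with your Roku's LAN address

url = f"http://{ROKU_IP}:8060/query/device-info"
with urllib.request.urlopen(url, timeout=5) as resp:
    print(resp.read().decode("utf-8"))
```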
Anyway, once you've got things set up so that HDR has a ghost of a chance of working, your next step is to find things to watch that are encoded with HDR. But wait, there are different flavors of HDR. The most common, and the one that is meant when you just see "HDR" on a label, is actually "HDR10". Another flavor is "Dolby Vision". Another is "HDR10+". And there are still others. Everything I have seen that supports one of the more exotic flavors, like Dolby Vision or HDR10+, also supports HDR10, so that's good. Dolby Vision requires a licensing fee, and that has caused some manufacturers to shun it, opting for HDR10+ (which is free) instead. HDR10+ is roughly equivalent to Dolby Vision in what it can accomplish, which is a step up from vanilla HDR10. Dolby Vision greatly increases the bandwidth you need to carry the video, so you are more likely to need high-end HDMI cables, and you will need a faster internet connection to boot. HDR10 and HDR10+ do not require much of a bandwidth increase compared to non-HDR content.
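For a feel of why deeper color depth (which Dolby Vision can use) eats HDMI bandwidth, here's a back-of-the-envelope calculation for a 4K/60 picture. It counts only the active pixels, ignoring blanking intervals and link-encoding overhead, so real HDMI link rates are higher still:

```python
# Raw data rate for 4K @ 60 fps, full RGB, at different bit depths.
# Active pixels only -- blanking and HDMI encoding overhead add more.
WIDTH, HEIGHT, FPS, CHANNELS = 3840, 2160, 60, 3

# 8-bit is typical SDR, 10-bit covers HDR10/HDR10+, Dolby Vision
# supports up to 12-bit.
for bits in (8, 10, 12):
    gbps = WIDTH * HEIGHT * FPS * CHANNELS * bits / 1e9
    print(f"{bits:2d} bits per channel: ~{gbps:.1f} Gbit/s")

# ~11.9, ~14.9 and ~17.9 Gbit/s respectively -- the jump in bit depth
# is exactly the kind of headroom a cheap HDMI cable may not have.
```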
Additional notes:
HDR does NOT require 4K. You can have 1080p HDR as well. For most people - who don't sit 4 feet in front of their 70" TV - HDR will give you much more video bling-bling than 4K (and you don't have to worry about viewing distance). The downside is that 4K is almost ubiquitous now, whereas HDR still feels half-magic when it comes to getting everything working together. BTW, one of the reasons we got the 4K TV years ago was its HDR support; I didn't see that on the 1080p TVs I could find. But back then there wasn't any HDR content to speak of, so it was more of a future-proofing decision to go with the 4K set.
Some content providers charge extra for an account that supports 4K/HDR. Netflix is one such provider. On the other hand, Amazon Prime Video does not charge extra for 4K/HDR. So this is something you need to look into as well when searching for content.
Some TVs (my higher-end Samsung being an example) do not enable HDR by default. You have to dig around in the settings and manually enable HDR for each specific HDMI input you want to use it on.
What was the point of this post? I don't know. I guess I just felt like documenting some of the stuff I have learned over the years - and more recently - so that others might get a better feel for things in case they didn't already know all this.