Hi folks,
I posted this in the Philips OLED forum but it’s pretty quiet in there so thought I’d try in here. I’ll also post in the ATV 4K forum in case someone in there has seen this issue.
Apologies for the long post.
Last December, I bought a Philips OLED 754 when the price dropped to under a grand. It has Philips' own Saphi OS, which has been a bit buggy, but by and by I've sorted the small issues out and the TV is perfectly usable. And generally the picture is good.
Except for Dolby Vision when using my Apple TV 4K (and HDR to a lesser extent).
If I have the Apple TV set to output DV, I get awful banding / moire at times in content. This happens on films I've bought from iTunes, and I've seen it in Netflix and Disney Plus too. I've also seen it on the ATV home screen.
Some further info.
1) It's very noticeable in lighter colours such as smoke, steam, skies etc. In my iTunes-bought copy of Blade Runner 2049, the scene near the beginning where the spinner lands looks awful.
2) I see it to a lesser extent in Netflix and Disney Plus content. Interestingly, the TV's inbuilt Netflix app shows no issues with the same content.
3) It's not the HDMI cable. I've tried various cables and currently use a 48Gbps 2.1 cable!
4) It's not my network: the ATV is hard wired to a 1Gb port on the ASUS GT-AX11000 router and t'internet is approx 70Mbps.
5) I see the same thing on the ATV home screen, for example the graduated colour surrounding the little album cover when I put a tune on.
6) The Apple TV was replaced about 6 weeks ago as it borfed after a firmware update so it’s brand new.
7) It’s not the lower bandwidth being served up by Apple etc due to Coronavirus. It was happening way before then.
8) I can see it to a much lesser degree if I set the ATV to HDR. But it’s nowhere near as bad. I don’t see it in SDR.
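As an aside on point 1, smoke, steam and skies are exactly where quantisation shows up, because a near-flat gradient only has a handful of distinct code values to work with in an 8-bit signal. A rough sketch of the arithmetic (just an illustration with numpy, not anything from the TV itself), counting how many distinct steps survive across a 4K-wide ramp at each bit depth:

```python
import numpy as np

# Smooth horizontal luminance ramp across a 4K-wide frame (values 0.0-1.0).
ramp = np.linspace(0.0, 1.0, 3840)

def distinct_levels(signal, bits):
    """Quantise the signal to `bits` bits and count the distinct codes used."""
    max_code = (1 << bits) - 1
    return len(np.unique(np.round(signal * max_code)))

# An 8-bit container only has 256 codes to span the ramp, so each visible
# band is ~15 pixels wide; 10-bit and 12-bit smooth those steps right out.
for bits in (8, 10, 12):
    print(bits, distinct_levels(ramp, bits))
```

That's why the banding is so much worse when the signal ends up in an 8-bit container, and why I barely see it in the 10/12-bit HDR modes.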
And this is where it gets interesting and I think is what the problem is!
A) With the ATV set to Dolby Vision, the signal info on the TV shows an 8-bit picture, even though the TV thinks it's receiving DV.
B) Setting the ATV to HDR with 4:2:2, the same info shows a 12-bit picture; HDR 4:2:0 shows a 10-bit picture, and SDR shows an 8-bit picture, which is what I'd expect. Obviously setting to HDR then shows HDR10+ only.
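For what it's worth, the HDR/SDR readouts in B) line up with how HDMI packs each format (as I understand it, HDMI always carries 4:2:2 in a 12-bit container, which would explain the 12-bit reading). A back-of-envelope sketch, assuming 4K60 output and counting raw pixel data only, no blanking or encoding overhead:

```python
# Rough data rates for the modes the TV reports, assuming 3840x2160 at
# 60 fps. Samples per pixel: 4:4:4/RGB = 3, 4:2:2 = 2 (Y plus shared
# chroma), 4:2:0 = 1.5. These are illustrative figures, not measurements.
W, H, FPS = 3840, 2160, 60

def gbps(bits_per_component, samples_per_pixel):
    """Raw video data rate in Gbit/s for a given pixel packing."""
    return W * H * FPS * bits_per_component * samples_per_pixel / 1e9

modes = {
    "SDR RGB 8-bit":    gbps(8, 3.0),
    "HDR 4:2:2 12-bit": gbps(12, 2.0),
    "HDR 4:2:0 10-bit": gbps(10, 1.5),
}

for name, rate in modes.items():
    print(f"{name}: {rate:.1f} Gbit/s")
```

All of those sit comfortably under the roughly 18 Gbit/s an HDMI 2.0 link can carry, so bandwidth alone shouldn't be forcing anything down to 8-bit.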
I’ve reset the TV multiple times and it doesn’t matter what the ATV match frame rate / picture settings are set to either.
It’s driving me mad. Anyone any ideas?
Pics attached showing what I’m seeing.
Kind regards
Rob
Attachments
8FA305DC-854A-4414-9A0E-49E96AF047CE.jpeg (211.9 KB)
3204B6C9-07E7-4104-AD4A-1FD03F59C1F4.jpeg (225.1 KB)
CF8A9D84-08D6-49A7-9B9F-991A7456AD25.jpeg (140.2 KB)
BF23E79E-3C12-4BD6-92A9-9EF6F8FEDCD6.jpeg (198.4 KB)
11BEC7F3-5F3E-4328-9798-A47C047CFB52.jpeg (130.2 KB)
C9765988-DED0-4E14-A0EE-2314C1F70E14.jpeg (124.5 KB)