Panasonic GZ950 - gaming

I’ve been hovering over the buy it now button for the GZ950 for months now and still undecided...

Everyone says the set is great movie-wise but not for gaming due to the lack of HDMI 2.1, but my question, which I can't find an answer to in searches, is: just how bad is the set for PS4 and gaming?!

Are the issues, whatever they are, that bad? What are people's first-hand experiences?
 
I game on my GZ950 almost every night on my PS4 without any issues whatsoever and love it !

While the measured response time isn't quite as low as the LG C9's according to some reviews, it's still very low; it's in the same ball-park and seems plenty fast enough for me. Other manufacturers' sets can be a lot worse, IIRC. The only caveat to this is that I'm not sure whether a super-fast response time is important in the games I play (GTA V, RDR2, GT Sport etc) so YMMV if you're a hardcore COD player, for example :D

For me, the risk of screen burn (pixel wear) is more of a potential issue than the response time, so I tend to turn off HUDs where I can (and it doesn't affect gameplay) and I also use the lowest Luminance setting I can get away with, without it affecting the experience. It depends on what kind of gamer you are I guess, but if you like playing games like FIFA or Fortnite for many hours a day (with their bright, static HUDs) on a vivid or very bright TV picture mode, then you might want to consider an LCD instead. For info - with the precautions I take, there is currently no sign of any premature pixel wear or retention at this stage (the set is 8 months old).

I don't blame you for 'hovering', as I did exactly the same last year for exactly the same reason ! My previous plasma failed at the start of last year so I was desperate to replace the small LCD loaner set I'd borrowed. Despite that, I nearly held off buying a GZ950 due to their lack of HDMI 2.1 features because I wanted to buy one of their 2020 sets which I presumed would include them, so I'm glad I didn't wait !

Yes, the lack of HDMI 2.1 on the GZs is disappointing but how soon (if at all) the next gen consoles make use of all the HDMI 2.1 features seems to be open to debate. Many people seem to think that 4K gaming @ 120Hz is just a pipe-dream :D

I do intend on buying a PS5 at some point so only time will tell as to whether I'll miss the lack of HDMI 2.1 in the future, as I do tend to keep my TVs for a long time. Panasonic are obviously willing to take the gamble that they're not currently needed - either that, or they're intending to drip-feed 2.1 features each year to entice you to upgrade ;)

The reviews say gamers who demand HDMI 2.1 should go for the LG C9 because this is the only obvious choice at this stage, but that's more of a comment about future-proofing and isn't the same as saying the GZ950 is not great for gaming right now because (IMO at least) it is ! :smashin:
 
I was between the LG C9 and the GZ950. It's the family TV and replaced a Panasonic plasma. My son games and I do every now and again. I wasn't bothered about HDMI 2.1; I was using a 13-year-old plasma with the PS4 and it was fine. We used the PS4 on the GZ950 for a couple of weeks, but having moved the plasma upstairs, he has moved the PS4 with it so he can play as much as he likes and I don't have to worry about the new TV. I think, as already mentioned, if you play different games and don't sit on Fortnite all day, it will be fine and won't get screen burn.
 
From what I've read, HDMI 2.1 only comes into play with the next gen of consoles; it doesn't have any effect on the current or previous gen. Even then, I'm not sure the PS5 and Xbox Series X will take full advantage of the new features, at least not to begin with.
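For what it's worth, a quick back-of-the-envelope sum shows why 4K @ 120Hz is an HDMI 2.1 thing at all. This is only a rough sketch assuming 8-bit RGB and ignoring blanking intervals and link encoding overhead (real requirements are higher), but the shape of it holds:

```python
# Back-of-the-envelope uncompressed video bandwidth.
# Ignores blanking intervals and link encoding overhead,
# so real-world requirements are somewhat higher.
def bandwidth_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

print(bandwidth_gbps(3840, 2160, 60, 24))   # ~11.9 Gbps -> within HDMI 2.0's 18 Gbps
print(bandwidth_gbps(3840, 2160, 120, 24))  # ~23.9 Gbps -> needs HDMI 2.1's 48 Gbps
```

So 4K60 squeezes through an HDMI 2.0 link, but doubling the refresh rate (never mind going to 10-bit HDR) pushes you past what 2.0 can carry.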
 
The gaming-oriented HDMI 2.1 features will have their uses for some, no doubt, but people exaggerate their importance, especially when they argue that any TV that doesn't support these features is "bad" for gaming!

For the current and older generations of consoles, HDMI 2.1 is largely irrelevant. And even for future games, it's not like you can't have a very nice experience gaming on a TV without, for example, VRR. It obviously also depends on the genres you're most into. Speaking for myself, I'm mainly into JRPGs and such, so I don't expect HDMI 2.1 to be a big deal for me at all, especially since I tend to focus mainly on games from the '90s-'00s.
 
I game on mine with a PS4. I use the Professional 1 picture setting and try to make sure the set catches its 4-hour panel refresh by turning it off.
I have not changed the picture settings like the other poster. We watch plenty of varied content, so I am not worried about image retention/burn. I love gaming on it, especially in HDR.
 
I have one and use it for gaming across PS4, Xbox One and Switch. The picture is superb, and although the measured response (22ms) isn't as quick as what the LG sets are capable of (13ms), it's not something anyone but the most hardcore twitch gamers would notice. For example, I play CoD casually on mine most days and it feels totally responsive.
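To put those two figures in context, here's a rough sum, taking the 22ms/13ms review numbers above at face value and assuming a 60fps game:

```python
# How much of a 60fps frame each set's measured lag represents.
# The 22ms / 13ms figures are the review numbers quoted above.
frame_time_ms = 1000 / 60  # ~16.7ms per frame at 60fps

for tv, lag_ms in [("GZ950", 22), ("LG C9", 13)]:
    print(f"{tv}: {lag_ms}ms = {lag_ms / frame_time_ms:.1f} frames of lag")
# GZ950: ~1.3 frames, C9: ~0.8 frames -- roughly half a frame apart.
```

Half a frame at 60fps is the kind of gap only the most competitive players are likely to feel.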

In real world terms, how important HDMI 2.1 will be in the next few years is a bit up in the air at the moment in my opinion. I was between the GZ950 and the C9, and eventually chose the GZ950 as it was reputed to have better out of the box calibration and accuracy (especially important to me as I will not be getting a pro calibration). It also didn't have that annoying looking Wiimote style remote. I accepted the lack of HDMI 2.1 as the trade-off for this.

This is because, while I do play a fair number of games, I'm not totally convinced HDMI 2.1 is going to be a big deal for a while yet. The main things it offers are higher framerates (up to 120fps at 4K) and VRR.

In the case of the former, this will apply to an incredibly tiny percentage of games on PS5 and XSX. My guess is you will probably be able to count them on one hand. Most games will probably target 30fps, with a few hitting 60, and a very very small number rising above that. That's because 120fps will eat an insane amount of compute power, and will only benefit the comparatively tiny number of people who have HDMI 2.1 TVs.
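To see why, consider the per-frame time budget the renderer gets at each target. This is simple arithmetic, nothing console-specific:

```python
# Rendering time budget per frame at each target framerate.
for fps in (30, 60, 120):
    print(f"{fps}fps -> {1000 / fps:.1f}ms per frame")
# 30fps -> 33.3ms, 60fps -> 16.7ms, 120fps -> 8.3ms: a 120fps target
# gives the GPU a quarter of the time per frame that a 30fps target
# does, before resolution or effects even enter into it.
```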

VRR is the feature that is much more likely to be practically useful on a day to day level. But I question the current implementation of this - on the latest LG sets, it still only supports a narrow range of framerates between about 45 and 60 fps (might have that slightly wrong but it's around there). Anything below that, you won't get the effect of VRR. Most games are below that.
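As a rough illustration of why that window matters: the 45-60fps range is just my recollection above, and the frame-doubling fallback (often called low framerate compensation) is something only some VRR implementations do, so treat all of this as a sketch with assumed numbers rather than a description of any particular set:

```python
# Illustrative sketch only: what a display with a hypothetical
# 45-60Hz VRR window might do at various game framerates.
VRR_MIN, VRR_MAX = 45, 60  # assumed window, per the (possibly wrong) figures above

def vrr_behaviour(fps):
    if VRR_MIN <= fps <= VRR_MAX:
        return "refresh tracks the game -> smooth, no tearing"
    if fps < VRR_MIN:
        return "below the window -> fixed refresh (judder), unless the set doubles frames"
    return "above the window -> capped at the panel's maximum refresh"

for fps in (30, 50, 58, 90):
    print(f"{fps}fps: {vrr_behaviour(fps)}")
```

A game that dips to 30fps, which is most of them, falls straight out of a window like that.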

We've also already got VRR implemented on the current Xbox One X, but based on anecdotal reports online and from people I know with compatible HDMI 2.1 TVs, it really doesn't seem like it's making a whole lot of difference at the moment.

In my humble opinion, the real benefits of VRR will kick in a few years from now, once TVs support a much wider range of framerates for it and more HDMI 2.1 TVs are in homes. This will incentivise more developers to make use of it, and to make better use of it.

With all that in mind, it didn't seem worth it to me to take a potential trade-off in accuracy and out-of-the-box calibration (which will affect everything I look at on the screen), for the currently slightly nebulous benefits of HDMI 2.1 (which will only affect certain games).

That was my rationale for selecting the GZ950, and I hope it helps you :)
 
I was concerned about 2.1, but after doing lots and lots of research I actually think most of it is overplayed and will not be a major factor, certainly for the next few years. I doubt any mainstream title on either new console will do 4K/120. Things like VRR/eARC would be nice to have, but in reality I think it is overblown and potentially a bit of marketing from LG.
 
Are the HDMI 2.1 and VRR options in the menu fake then?
 
What settings and picture mode do you use for gaming?
And are you using full RGB or limited (if anyone is on PC)?
 
From what I've read, the bottom line is that 120Hz is overkill. I know this will upset some, but it gets to a point where the human eye cannot tell the difference from 4K to 8K, and 60Hz to 120Hz is the same really. I can tell a lot between 1080p and 4K, but even then many people can't or don't care. I notice a difference between 30fps and 60fps; I can live with 30fps at 4K resolution over 60fps at 1080p, but 4K 60fps is probably all you will ever need out of a TV. After that, the eye can barely tell the difference; of course somebody will, and no doubt that's the sort of person all these gimmicks are aimed at to get these products sold. I believe around 75fps is the max the human eye can register, but who's counting, ha ha. Should have kept 3D, and I hope they bring it back.
 
Not really, man. My son, same as me, can feel the difference between 120Hz and 60Hz even on the desktop, just moving the mouse! He can't sit at my desktop PC on my 60Hz monitor any more since he got a 144Hz one!
Most people will notice it; maybe if you're a casual console gamer and you've played your whole life at 30fps you won't. :)
I will agree that the difference is smaller than between 30/60fps, but it's surely there!

Also, still wondering about my earlier question: what settings and picture mode do you use for gaming, and are you on full RGB or limited (for anyone on PC)?

Anyone using this TV for PC gaming too?
 
Yes, there will be anomalies, but my age is irrelevant; humans haven't evolved that much since the caveman, so they would see the same as we do now. Just human science and fact. Sure, spend enough time on a certain screen and your brain will adapt. I tried watching Will Smith's new film running at 60Hz and it felt so unnatural, and 120Hz does too, actually. Yes, smooth, but unnatural. Most people that care to bother will notice; they have to, or they would be throwing their money away. It's just a science-based fact that anything above 75fps is not possible unless you're the bionic man (that's a cyborg, to the younger generation).
 
Man, just play any game on PC at 60fps and then at 120/144; you will see the difference instantly.
If people couldn't see the difference, there would be no 144/240/even 360Hz monitors, because no one would buy them.
 
Yeah, I get that and respect it too. My next purchase, in about 7 years, will be 8K 120Hz, but from what I've seen, 4K 60Hz will be fine to get the best balance between graphical fidelity and a steady framerate for my Series X.
 
