
NeoGAF OLED owners thread

rofif

Can’t Git Gud
So, I've recently been launching some SDR-only games on PS5 (Bloodborne, Alan Wake, Tormented Souls).
I've experimented a bit with "HDR ALWAYS ON" and it's not a good idea. Lights are more vibrant, but highlights are crushed and shadow areas are elevated.
So "HDR on when supported" is still the correct choice.

Next (lg c1 48"), Sharpness 0, temp warm50, game optimizer mode, no boost mode in game optimizer menu and all game optimizer settings at default. No AI crap. Contrast and brightness at default 85 and 50.
Input label set as PC mode so peak brightness and all that crap is locked. It would be disabled anyway.

The two important settings are OLED Pixel Brightness and Gamma. I use 50 pixel brightness and gamma 2.2. I see plenty of people recommending BT.1886, but I think that's wrong? It's too dark, and SDR games are mastered for 2.2.
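To put rough numbers on that, here's a minimal sketch (assuming a near-zero black level, which is where BT.1886 effectively collapses into a plain 2.4 power curve; the 100-nit peak is just a placeholder):

```python
# Rough comparison of gamma 2.2 vs BT.1886 on a display with ~0 nit black.
# Illustrative only; real calibration measures the actual panel.

def gamma_22(v, peak=100.0):
    """Pure power-law 2.2 EOTF: signal (0-1) -> luminance in nits."""
    return peak * v ** 2.2

def bt1886(v, peak=100.0, black=0.0, gamma=2.4):
    """ITU-R BT.1886 EOTF; with black at ~0 it reduces to a 2.4 power law."""
    lw, lb = peak ** (1 / gamma), black ** (1 / gamma)
    a = (lw - lb) ** gamma
    b = lb / (lw - lb)
    return a * max(v + b, 0.0) ** gamma

for v in (0.1, 0.2, 0.5):
    print(f"signal {v:.1f}: gamma 2.2 = {gamma_22(v):5.2f} nits, BT.1886 = {bt1886(v):5.2f} nits")
# Shadows land noticeably lower under BT.1886 on a zero-black display, which is
# why SDR games mastered for gamma 2.2 can look too dark / crushed with it.
```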
 
If you use HGIG tone mapping and have calibrated the PS5, SDR should look the same whether the PS5 outputs SDR or HDR. This is how it appears on my C9 and C1. I use default game picture settings (other than enabling HGIG).

Arguments have been made that running SDR content in an HDR container on the LG OLEDs is actually more accurate, due to the wide gamut and HDR accuracy of these TVs.
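For what it's worth, the same SDR luminance can be carried in either container; the usual example is SDR reference white, often quoted around 203 nits (that figure is an assumption here, I don't know exactly what level the PS5 maps it to). A quick sketch of where that lands in the PQ signal:

```python
# Encode an assumed SDR reference white (~203 nits) as a PQ (SMPTE ST 2084)
# signal value. Whether SDR-in-HDR actually looks identical then comes down to
# how accurately the TV tracks the PQ curve vs its SDR gamma.

def pq_inverse_eotf(nits):
    """Luminance in nits -> PQ signal (0-1), per SMPTE ST 2084."""
    m1, m2 = 0.1593017578125, 78.84375
    c1, c2, c3 = 0.8359375, 18.8515625, 18.6875
    y = nits / 10000.0
    return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

print(round(pq_inverse_eotf(203), 3))   # ~0.58 -- SDR white sits well below PQ's top end
```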
 
Last edited:
IMG-20231021-204603.jpg

IMG-20231020-232434.jpg

IMG-20231020-231533.jpg


Many people don't appreciate how ambilight enhances the atmosphere while playing.
🤷🏼‍♂️
 
Last edited:

King Dazzar

Member
IMG-20231021-204603.jpg

IMG-20231020-232434.jpg

IMG-20231020-231533.jpg


Many people don't appreciate how ambilight enhances the atmosphere while playing.
🤷🏼‍♂️
I used to use it a lot on my Philips OLED when it was my main TV. It was great at doing a few things for me - reducing eye strain, making the 65" feel like it was bigger than it was and adding some more impact to the HDR. I found you had to set it to not be too aggressive in response time so that it added rather than annoyed. I set it on natural iirc.

I did try the Hue box on my 85" LCD I upgraded to, but it's just not as good as having the fully integrated onboard lighting that comes with the Philips TVs. Also, my LCD throws out so much light now (3k nits) that it bounces off the room's walls anyway in a dark environment, lessening the benefit.
 
Last edited:
I have the LG C9 for my PS5, Apple TV, and Nintendo Switch. I have the QD-OLED itch and will buy something this week to replace this C9. Best Buy has the Sony A95K for half the price of the newer A95L, and it is hard to resist. For the exact same price, I could also get the Samsung S90C. I have tried Samsung over the past few decades and always hated it. They especially seem to not give a shit about non-native picture quality. I also don’t want to spend days fucking around with settings. Both of my LGs required one setting change (HGIG) and the rest works perfectly.

Which one, my dudes? Sony or Samsung?
 
Last edited:

mitchman

Gold Member
I have the LG C9 for my PS5, Apple TV, and Nintendo Switch. I have the QD-OLED itch and will buy something this week to replace this C9. Best Buy has the Sony A95K for half the price of the newer A95L, and it is hard to resist. For the exact same price, I could also get the Samsung S90C. I have tried Samsung over the past few decades and always hated it. They especially seem to not give a shit about non-native picture quality. I also don’t want to spend days fucking around with settings. Both of my LGs required one setting change (HGIG) and the rest works perfectly.

Which one, my dudes? Sony or Samsung?
Do you want more than 1* HDMI 2.1 port? Then go with Samsung.

* (2 usable if you're not using eARC to a receiver or other sound system.)
 

King Dazzar

Member
I have the LG C9 for my PS5, Apple TV, and Nintendo Switch. I have the QD-OLED itch and will buy something this week to replace this C9. Best Buy has the Sony A95K for half the price of the newer A95L, and it is hard to resist. For the exact same price, I could also get the Samsung S90C. I have tried Samsung over the past few decades and always hated it. They especially seem to not give a shit about non-native picture quality. I also don’t want to spend days fucking around with settings. Both of my LGs required one setting change (HGIG) and the rest works perfectly.

Which one, my dudes? Sony or Samsung?
I had lots of issues with an 85" Samsung QN95A and its One Connect box. Problematic software too. I'd only buy from somewhere that I knew would have hassle-free returns. Also, the hardware panels used in the Samsungs vary dependent on region, so it's difficult to advise. But in the UK I loved the 90C I viewed, though I'd be trying to avoid the One Connect box (which you can do in the UK, as both use the same panel).

The Sony A95K is great in the 85" size, but not so good in the 65" and 75" sizes. The A95L is better, but not twice-the-price better. If 85", I'd go A95K for the price. Sony's game mode generally looks superb with no local dimming compromise and has excellent colour gradation (no posterization or colour banding in skies etc.). But with both the Sony and the Samsung, VRR on LCD will compromise the picture quality slightly.
 

Stafford

Member
Vincent from HDTVTest has an interesting video.



I guess on my Sony A95K this would mean tone mapping set to off instead of gradation preferred or brightness preferred. I have to say that with several games now (XSX) I have noticed it looks significantly better with it off. The problem it does bring is that dark areas get a bit too bright, a bit too raised I guess.
 

King Dazzar

Member
Vincent from HDTVTest has an interesting video.



I guess on my Sony A95K this would mean tone mapping set to off instead of gradation preferred or brightness preferred. I have to say that with several games now (XSX) I have noticed it looks significantly better with it off. The problem it does bring is that dark areas get a bit too bright, a bit too raised I guess.

Usually, turning off tone mapping on Sony TVs under-tracks the EOTF. Gradation Preferred tracks correctly and is the recommended HDR tone mapping setting for HDR10. DV is the only HDR format where you should turn it off. For normal HDR gaming, you can turn it off to set HDR luminance on the calibration slides and then turn it back to Gradation Preferred to track the EOTF correctly. Tone mapping on Sony TVs can work similarly to HGIG on Gradation Preferred, as long as you send it content at the panel's correct peak luminance. E.g. for my TV, which can hit 2950 nits, I set games to output at that and use Gradation Preferred, and the TV won't do any additional tone mapping on top. However, if I were to send it 3600 nits on Gradation Preferred, the TV will then do additional tone mapping.

Hope I explained that well. TL;DR: always use Gradation Preferred for HDR10. You can use no tone mapping to set the calibration slides in games, but always set it back to Gradation Preferred once done.
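Not Sony's actual processing, just a toy sketch of the principle described above: if the game is told the panel's real peak, the TV has nothing extra to do; if the content claims a higher peak than the panel, the TV has to compress highlights (the 75% knee here is purely an assumption for illustration):

```python
def display_luminance(scene_nits, content_peak, panel_peak=2950.0):
    """Toy model: pass through when content fits the panel, else roll off highlights."""
    if content_peak <= panel_peak:
        # Content already fits the panel: no additional tone mapping needed.
        return scene_nits
    knee = 0.75 * panel_peak                     # hypothetical roll-off start
    if scene_nits <= knee:
        return scene_nits
    t = (scene_nits - knee) / (content_peak - knee)
    return knee + t * (panel_peak - knee)

print(display_luminance(2800, content_peak=2950))   # 2800.0 -- untouched
print(display_luminance(2800, content_peak=3600))   # ~2525  -- highlights compressed
```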
 
Last edited:

Stafford

Member
Usually, turning off tone mapping on Sony TVs under-tracks the EOTF. Gradation Preferred tracks correctly and is the recommended HDR tone mapping setting for HDR10. DV is the only HDR format where you should turn it off. For normal HDR gaming, you can turn it off to set HDR luminance on the calibration slides and then turn it back to Gradation Preferred to track the EOTF correctly. Tone mapping on Sony TVs can work similarly to HGIG on Gradation Preferred, as long as you send it content at the panel's correct peak luminance. E.g. for my TV, which can hit 2950 nits, I set games to output at that and use Gradation Preferred, and the TV won't do any additional tone mapping on top. However, if I were to send it 3600 nits on Gradation Preferred, the TV will then do additional tone mapping.

Hope I explained that well. TL;DR: always use Gradation Preferred for HDR10. You can use no tone mapping to set the calibration slides in games, but always set it back to Gradation Preferred once done.

Thanks for the reply!

Yeah, I don't use DV much; it's barely a handful of games on Xbox that truly make use of it, and even then I don't think the difference is worth it. When I choose DV on the TV I don't think I get VRR. It's rather odd how this works on this particular TV.

Many games don't have HDR options and rely on the system HDR settings. On the Xbox I have the first pattern all the way to the left, and the other two patterns are at 1000, I believe that's the max peak brightness the A95K does. So that should be fine, right?

Ever since I got this TV, which was in July of this year, I started with RDR2 and have been using Brightness Preferred, but if Gradation is what I should use, I'll change to that. In some games the HDR is just very bad though, and maybe in those cases I can play with tone mapping off.

So, as for setting HDR correctly: I could set tone mapping to off when I calibrate the system HDR settings, but once that's done I should put it back to Gradation Preferred again? I'll probably be clicking right in those HDR patterns way more times before the pattern vanishes.

One game that sadly has trash HDR is Starfield (well, except for light sources, because they are super bright compared to how dim they look in SDR), and so far no tone mapping has looked better to me than with it on.
 

King Dazzar

Member
Thanks for the reply!

Many games don't have HDR options and rely on the system HDR settings. On the Xbox I have the first pattern all the way to the left, and the other two patterns are at 1000, I believe that's the max peak brightness the A95K does. So that should be fine, right?

So, as for setting HDR correctly: I could set tone mapping to off when I calibrate the system HDR settings, but once that's done I should put it back to Gradation Preferred again?
Glad to help. 😊 Yes to both of your questions.

If you find doing something else with your settings brings you enjoyment, then there's nothing wrong with that either. Brightness Preferred simply elevates luminance above the EOTF tracking, which may help if you want boosted luminance. For me, I can't see why you'd want to use no tone mapping over Gradation Preferred. But my panel is a different tech from yours, even if the basic setting principles are the same.
 
I honestly have no clue when it comes to HGiG vs DTM on these OLEDs. My previous Panasonic was much simpler in that it would tone map depending on the contrast setting (dropping to 90 from 100 would have whites clip at 5000 instead of 1000) with no dynamic malarkey. All I know is Windows' detailed display information says HDR400 if I use HGiG and Dolby Vision certified if I don't 🤷‍♂️ Luckily 90% of my HDR/DV viewing on the TV is UHD Blu-ray, so HDR gaming is an afterthought.

Do Xbox games that support Dolby Vision also support it on PC? Only ever seem to get HDR10 from the PC, even with the Dolby stuff installed.
 

Stafford

Member
Glad to help. 😊 Yes to both of your questions.

If you find doing something else with your settings brings you enjoyment, then there's nothing wrong with that either. Brightness Preferred simply elevates luminance above the EOTF tracking, which may help if you want boosted luminance. For me, I can't see why you'd want to use no tone mapping over Gradation Preferred. But my panel is a different tech from yours, even if the basic setting principles are the same.

For Starfield, tone mapping off looked better to me, but that's probably because the HDR implementation is crap in that game to begin with. I do like really bright things, such as light bulbs shining very bright, but I get that with Gradation or Brightness Preferred too. I only used Brightness Preferred in RDR2 because that too still has semi-broken HDR.

I should try Gradation Preferred with a game that has proper HDR, such as Psychonauts 2, which I believe has fine HDR.
 
On the a95k, tests show that it hard clips at 500nits with tone mapping off. Do not use that setting, ever.

I just got the a95k last week, replacing my LG C9. Sold my C9, so the Sony ends up only being $1000 CAD.

The idea that Sony handles sub-4K content the best is the truth. My Nintendo Switch is like a whole new console now. Stunning. I never liked how it looked on my LG.
 

King Dazzar

Member
For Starfield, tone mapping off looked better to me, but that's probably because the HDR implementation is crap in that game to begin with. I do like really bright things, such as light bulbs shining very bright, but I get that with Gradation or Brightness Preferred too. I only used Brightness Preferred in RDR2 because that too still has semi-broken HDR.

I should try Gradation Preferred with a game that has proper HDR, such as Psychonauts 2, which I believe has fine HDR.
Yep, Starfield's HDR implementation is awful. It's limited to a 400 nit peak iirc.... or it is on console at least anyway.
 
Last edited:

King Dazzar

Member
On the a95k, tests show that it hard clips at 500nits with tone mapping off. Do not use that setting, ever.

I just got the a95k last week, replacing my LG C9. Sold my C9, so the Sony ends up only being $1000 CAD.

The idea that Sony handles sub-4K content the best is the truth. My Nintendo Switch is like a whole new console now. Stunning. I never liked how it looked on my LG.
A dash of Reality Creation can be excellent too on native 4k content.
 
Last edited:

Stafford

Member
On the a95k, tests show that it hard clips at 500nits with tone mapping off. Do not use that setting, ever.

I just got the a95k last week, replacing my LG C9. Sold my C9, so the Sony ends up only being $1000 CAD.

The idea that Sony handles sub-4K content the best is the truth. My Nintendo Switch is like a whole new console now. Stunning. I never liked how it looked on my LG.
Sorry if it's a dumb question, but what does hard clip exactly mean?
Yep, Starfield's HDR implementation is awful. It's limited to a 400 nit peak iirc.... or it is on console at least anyway.
It's such a damn shame and they sure are taking their sweet time fixing it. Assuming they are fixing it. I don't like it; most XGS games have fine HDR, do better!!!! Especially for a sci-fi game, HDR needs to be proper.
I dash of Reality Creation can be excellent too on native 4k content.

On XSX I have Reality Creation at default, which I believe is on and at 30 or so. What do you mean with one dash? I thought of maybe pushing it to 50, but maybe not, because it could make aliasing (if present in games) worse, right?
 
Sorry if it's a dumb question, but what does hard clip exactly mean?
To make a long story short: Despite the Sony hardware being able to display up to around 1000 nits, if you disable tone mapping on this TV, the software bumps the maximum down to 500.

It’s even worse than it sounds, because it isn’t an adjusted curve to 500. It is a normal slope, then suddenly goes flat at 500 (hard clipping).
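A toy illustration of that difference (the numbers here are made up for the example, not measurements of the A95K):

```python
def hard_clip(nits, ceiling=500.0):
    """Track the signal 1:1, then go flat: everything above the ceiling looks identical."""
    return min(nits, ceiling)

def roll_off(nits, knee=400.0, ceiling=1000.0, content_peak=4000.0):
    """Tone-mapped alternative: bend the curve above a knee so bright detail stays distinguishable."""
    if nits <= knee:
        return nits
    t = (nits - knee) / (content_peak - knee)
    return knee + t * (ceiling - knee)

for n in (300, 600, 1500, 4000):
    print(n, hard_clip(n), round(roll_off(n)))
# 600, 1500 and 4000 nit highlights all come out at 500 with the hard clip,
# but land at different levels with the roll-off.
```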
 

King Dazzar

Member
On XSX I have Reality Creation at default, which I believe is on and at 30 or so. What do you mean with one dash? I thought of maybe pushing it to 50, but maybe not, because it could make aliasing (if present in games) worse, right?
Sorry, typo: "A dash" instead of "I dash". I just meant that if you add a little RC it can help extract even more detail. I'm on an 8K set, so it'll potentially be different from yours. But generally, I can get away with RC up to 25 max, though usually 10 to 20 is good. If in doubt just use Auto; I believe it equates to somewhere between 10 and 15 manual.

As for XSX HDR, yep, some of their first-party games haven't even used HDR. Let's hope they get better. They gave us Auto HDR, and Gears 5 etc. are superb in HDR, so let's hope.
To make a long story short: Despite the Sony hardware being able to display up to around 1000 nits, if you disable tone mapping on this TV, the software bumps the maximum down to 500.

It’s even worse than it sounds, because it isn’t an adjusted curve to 500. It is a normal slope, then suddenly goes flat at 500 (hard clipping).
That's not good. It'll make it even more difficult to calibrate some games with the in-game sliders etc. Hope they fix it for you guys.
 
Last edited:

MoreJRPG

Suffers from extreme PDS
A dash of Reality Creation can be excellent too on native 4k content.
+1 on Reality Creation. I set it to 10-15 max on my A80J and it does wonders. It adds just the right amount of crispness without going overboard and looking over-processed.
 
Last edited:

Kerotan

Member
I see my dream TV will now only cost €4,271.08.

65" QD OLED from Sony. Not in the market for a new TV for probably 8 years, so by then these should be mainstream and probably even 8K. Wasn't there talk about high-end 8K TV production becoming cheaper to mass produce than 4K TVs?

What happened to the TV thread on neogaf btw? Can't find it anymore.
 

Stafford

Member
Decided to watch the movie Warcraft on Prime Video. Had never seen it before. Prime labels it as HDR. However when I play the movie it's in SDR.

Other content such as The Boys plays fine in HDR. If the content isn't in Dolby Vision, shouldn't the TV then use HDR10? Or is this simply a matter of Prime mislabeling stuff?

Sony A95K QD-OLED using the Prime Video app.
 
Last edited:
Decided to watch the movie Warcraft on Prime Video. Had never seen it before. Prime labels it as HDR. However when I play the movie it's in SDR.

Other content such as The Boys plays fine in HDR. If the content isn't in Dolby Vision, shouldn't the TV then use HDR10? Or is this simply a matter of Prime mislabeling stuff?

Sony A95K QD-OLED using the Prime Video app.

Prime app is garbage for that. It's the same on my Philips. The HDR10+ stuff seems to play fine but none of the regular HDR movies/shows I've tried switches to HDR on my TV. Disney+ & Plex don't have that issue and play all HDR content fine.
 

Spyxos

Member
I could get this OLED for 500 euros. Is it any good? Or better to avoid?

LG OLED48A29LA OLED TV (flat, 48 inch / 121 cm, UHD 4K, Smart TV, webOS 22 with LG ThinQ)

 

Stafford

Member
Prime app is garbage for that. It's the same on my Philips. The HDR10+ stuff seems to play fine but none of the regular HDR movies/shows I've tried switches to HDR on my TV. Disney+ & Plex don't have that issue and play all HDR content fine.

Yep, same here. Never any issues with Netflix either regarding HDR. It was a few years ago that I saw a movie labeled as HDR and it wasn't. How have they not fixed this by now?
 
Last edited:

Soodanim

Gold Member
I've spent most of my day umming and ahhing over the LG C3 but in 40 or 48" (space restrictions). The reason being that below 55" the TVs get dimmer, and I can't find out exactly how much dimmer.

Plus, because I'm on a 1660ti I don't even have a HDMI 2.1 output to make use of the VRR 120hz, so it seems like a bad match all round.

Maybe I should just stick with my LCD until I build a new PC.
 
because it's distracting. I want oled to blend into the darkness of my room
It blends in automatically, because when the surroundings are dark the Ambilight is turned off.

But this argument always comes from people who have no idea or who don't even have the necessary device.
 
Last edited:

rofif

Can’t Git Gud
I've spent most of my day umming and ahhing over the LG C3 but in 40 or 48" (space restrictions). The reason being that below 55" the TVs get dimmer, and I can't find out exactly how much dimmer.

Plus, because I'm on a 1660ti I don't even have a HDMI 2.1 output to make use of the VRR 120hz, so it seems like a bad match all round.

Maybe I should just stick with my LCD until I build a new PC.
There are 42" and 48" models.
Do you want to put it on a desk, or somewhere else in the living room?

I wouldn't worry about it being too dark no matter the size. The 42" is the "worst" one because it tops out at 650-700 nits, while the 48" hits about 800 nits.
The 48" models of the LG C1/C2/C3 use the EVO panel (although not the full version like the 55"+ models), and the 42" C2 used the older panel; for the C3 I am not sure myself.
I have a 48" C1 with the EVO panel, which is lucky since it was a lottery with the C1.

To help you understand how bright it gets, here is me playing Uncharted 4 with HDR in the middle of the night. See how much it lights up the room:


And here is lost planet 1 in sdr on 360 - just so you can see a game with bright, white snow in sdr.
 
I've spent most of my day umming and ahhing over the LG C3 but in 40 or 48" (space restrictions). The reason being that below 55" the TVs get dimmer, and I can't find out exactly how much dimmer.

Plus, because I'm on a 1660ti I don't even have a HDMI 2.1 output to make use of the VRR 120hz, so it seems like a bad match all round.

Maybe I should just stick with my LCD until I build a new PC.
You should also consider the Philips

42/48OLED808/12 4K​


If it's available in your country.
 

Soodanim

Gold Member
There are 42" and 48" models.
Do you want to put it on a desk, or somewhere else in the living room?

I wouldn't worry about it being too dark no matter the size. The 42" is the "worst" one because it tops out at 650-700 nits, while the 48" hits about 800 nits.
The 48" models of the LG C1/C2/C3 use the EVO panel (although not the full version like the 55"+ models), and the 42" C2 used the older panel; for the C3 I am not sure myself.
I have a 48" C1 with the EVO panel, which is lucky since it was a lottery with the C1.

To help you understand how bright it gets, here is me playing Uncharted 4 with HDR in the middle of the night. See how much it lights up the room:


And here is lost planet 1 in sdr on 360 - just so you can see a game with bright, white snow in sdr.

Thank you for sharing those, I appreciate it.

It would be a wall-mounted situation across the room, and it can't be too big because at 55" it would interfere with my monitors and a wall.

Another concern is that because it's 99% used as a monitor, it inevitably has a browser window up, and as much as I like the GAF logo I don't want it burned in.

On a side note, I'm glad that TVs still have Optical ports. I was worried I would run into trouble connecting my speakers, but there seems to be nothing to worry about.
You should also consider the Philips

42/48OLED808/12 4K​


If it's available in your country.
I do love the idea of Ambilight, and I even use a shader that creates an on-screen ambilight around non-16:9 content in RetroArch. But right now it's the same price as the C3 which seems to be universally regarded as the best value for this size. Anything you can say to swing it in Philips' favour?
 
Thank you for sharing those, I appreciate it.

It would be a wall-mounted situation across the room, and it can't be too big because at 55" it would interfere with my monitors and a wall.

Another concern is that because it's 99% used as a monitor, it inevitably has a browser window up, and as much as I like the GAF logo I don't want it burned in.

On a side note, I'm glad that TVs still have Optical ports. I was worried I would run into trouble connecting my speakers, but there seems to be nothing to worry about.

I do love the idea of Ambilight, and I even use a shader that creates an on-screen ambilight around non-16:9 content in RetroArch. But right now it's the same price as the C3 which seems to be universally regarded as the best value for this size. Anything you can say to swing it in Philips' favour?
The only things I know about the Philips: it has a swivel stand, better speakers, and it still has a headphone jack. Downside: only two HDMI 2.1 ports.

I couldn't game without Ambilight anymore.
But if you want the best value, go with the C3. The problem is that Philips releases their TV lineup later, and it's no wonder that their TVs are more expensive.
 
Last edited:

rofif

Can’t Git Gud
Thank you for sharing those, I appreciate it.

It would be a wall-mounted situation across the room, and it can't be too big because at 55" it would interfere with my monitors and a wall.

Another concern is that because it's 99% used as a monitor, it inevitably has a browser window up, and as much as I like the GAF logo I don't want it burned in.

On a side note, I'm glad that TVs still have Optical ports. I was worried I would run into trouble connecting my speakers, but there seems to be nothing to worry about.

I do love the idea of Ambilight, and I even use a shader that creates an on-screen ambilight around non-16:9 content in RetroArch. But right now it's the same price as the C3 which seems to be universally regarded as the best value for this size. Anything you can say to swing it in Philips' favour?
7000+ hours on mine as a monitor. No issues.
Watch this. Alienware monitors seem to be prone to burn-in so far; LG TVs are good.
 

Soodanim

Gold Member
7000+ hours on mine as a monitor. No issues.
Watch this. Alienware monitors seem to be prone to burn-in so far; LG TVs are good.

Can you speak to the VRR performance of your C1 and maybe later models? Last I heard there were some flickering issues, something to do with gamma levels.
 

Bojji

Member
Can you speak to the VRR performance of your C1 and maybe later models? Last I heard there were some flickering issues, something to do with gamma levels.

This happens on many TVs (including the newest models), even some non-OLEDs.

LG has a special setting to reduce this "effect" (it also adds a nice black level increase in washed-out games).

In reality, with the PC I rarely see it, mostly on some loading screens when the framerate is all over the place (3, 5, 10, 20, 60, 100, 200, etc.).

It's basically a non-issue, and I was afraid of it before buying an OLED (a B2 in my case).
 
Last edited:

Soodanim

Gold Member
This happens on many TVs (including the newest models), even some non-OLEDs.

LG has a special setting to reduce this "effect" (it also adds a nice black level increase in washed-out games).

In reality, with the PC I rarely see it, mostly on some loading screens when the framerate is all over the place (3, 5, 10, 20, 60, 100, 200, etc.).

It's basically a non-issue, and I was afraid of it before buying an OLED (a B2 in my case).
That's great to hear, thanks.

I'm going to try and see one in person tomorrow before I make up my mind.



Anyone have any thoughts about this? This channel seems to be the only one that makes videos singling out the LG C3/G3 for Game Mode issues; it's hard to find that elsewhere.
 

rofif

Can’t Git Gud
Can you speak to the VRR performance of your C1 and maybe later models? Last I heard there were some flickering issues, something to do with gamma levels.
Ah, that's true. Anything that runs below 120fps in 120Hz mode certainly lowers gamma (makes darks more grey).
Normally it is not an issue and nothing you can notice, except:
-big fps fluctuations, like on loading screens.
-Windows UI in badly made apps like the Xbox Game Pass app


-When switching from 120 to 60 and to 30fps. I've recorded a small vid to show it:


But I still use it in every game. It's not an issue 97% of the time
 
Last edited:

Soodanim

Gold Member
Ah, that's true. Anything that runs below 120fps in 120Hz mode certainly lowers gamma (makes darks more grey).
Normally it is not an issue and nothing you can notice, except:
-big fps fluctuations, like on loading screens.
-Windows UI in badly made apps like the Xbox Game Pass app


-When switching from 120 to 60 and to 30fps. I've recorded a small vid to show it:


But I still use it in every game. It's not an issue 97% of the time

Thanks, I appreciate it.

It looks like it will be more noticeable in Windows than in games, as outside of rare exceptions games will stay a lot more consistent. Unless I recreate a moment in Red Faction Guerrilla where I made it drop from 144 to 30. Not even G-Sync helps when the drop is that bad, and a bit of gamma flicker would probably be the least of your problems if that's happening frequently.

Did you find a workaround for frame-limited software? Disabling VRR on a program-by-program basis, etc.?
 
Last edited:

rofif

Can’t Git Gud
Thanks, I appreciate it.

It looks like it will be more noticeable in Windows than in games, as outside of rare exceptions games will stay a lot more consistent. Unless I recreate a moment in Red Faction Guerrilla where I made it drop from 144 to 30. Not even G-Sync helps when the drop is that bad, and a bit of gamma flicker would probably be the least of your problems if that's happening frequently.

Did you find a workaround for frame-limited software? Disabling VRR on a program-by-program basis, etc.?
If a game is locked to 60, it's fine. VRR locks itself to 60 and there is no flicker. I love Souls games and they are all 60.
 

Shtef

Member
As a new LG C3 owner I need to ask a few questions.

- That game widget is useless, right? If the Xbox is set to 120Hz it will not show the correct fps when playing a game; for example, I was playing Elden Ring and it was showing around 118 fps.

- I have read that I should set Filmmaker Mode for all cases: movies, games... What about PC gaming? Is there any specific setting to enable?

- Why is AirPlay so bad? I tried streaming a movie from my Mac and the sound would stop every 5 seconds.

- Can the remote be used to control the PC like with a mouse? I noticed I can control the Xbox, but I'm not sure about the PC.

- One last thing: I have an Xbox and a PC (RTX 3060 Ti) hooked up to the TV. Do I enable both G-Sync and FreeSync at the same time? Will my TV switch to the correct VRR format depending on the input source?
 
Last edited:

Bojji

Member
As a new LG C3 owner I need to ask a few questions.

- That game widget is useless, right? If the Xbox is set to 120Hz it will not show the correct fps when playing a game; for example, I was playing Elden Ring and it was showing around 118 fps.

- I have read that I should set Filmmaker Mode for all cases: movies, games... What about PC gaming? Is there any specific setting to enable?

- Why is AirPlay so bad? I tried streaming a movie from my Mac and the sound would stop every 5 seconds.

- Can the remote be used to control the PC like with a mouse? I noticed I can control the Xbox, but I'm not sure about the PC.

- One last thing: I have an Xbox and a PC (RTX 3060 Ti) hooked up to the TV. Do I enable both G-Sync and FreeSync at the same time? Will my TV switch to the correct VRR format depending on the input source?

The Xbox is doubling the refresh rate with 120Hz output; that's actually a good thing, as drops are less noticeable that way. If you set the console to 60Hz you will see all the framerate drops in ER. On PC it shows the full scale.
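A rough sketch of the arithmetic (assuming the console simply repeats each rendered frame on the 120Hz output, which is my reading of the behaviour described above):

```python
# The widget counts screen refreshes rather than unique game frames, so a
# ~59 fps game presented twice per frame reads as ~118.
game_fps = 59
repeats = 2                        # each rendered frame shown twice at 120Hz
print(game_fps * repeats)          # 118 -- roughly what the widget reports
```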

I don't care about FM mode; I use Game mode with settings set to my liking for the PS5, and (game) PC mode for the PC (same settings as for the PS5).

As far as I know you can't use the remote for that.

You can disable FreeSync for the Nvidia GPU input and leave it on for the Xbox input.
 

Stafford

Member
Sorry, typo: "A dash" instead of "I dash". I just meant that if you add a little RC it can help extract even more detail. I'm on an 8K set, so it'll potentially be different from yours. But generally, I can get away with RC up to 25 max, though usually 10 to 20 is good. If in doubt just use Auto; I believe it equates to somewhere between 10 and 15 manual.

As for XSX HDR, yep, some of their first-party games haven't even used HDR. Let's hope they get better. They gave us Auto HDR, and Gears 5 etc. are superb in HDR, so let's hope.

That's not good. It'll make it even more difficult to calibrate some games with the in-game sliders etc. Hope they fix it for you guys.

Alright. I finally took the time to do this. Had not been gaming lately.

On the Sony A95K I put HDR Tone Mapping from gradation preferred to off. I then go into the HDR game calibration settings on Xbox. First pattern I put it all the way to the left, which is minimum luminance. It says this :

MinTML 0.000
MaxTML 500
MaxFFTML 500
Then for the first maximum luminance pattern I go all the way to the left and then it only takes 4 dpad clicks to the right for the pattern to be gone. It says this:
MinTML 0.000
MaxTML 500
MaxFFTML 500

The same for the second max luminance pattern.

It then shows the calibrated image which is super bright and pretty. When I then put the TV back to gradation preferred it looks super dim on this picture. But that's OK? Also, shouldn't the MaxTML at least be 1000?
 

Stafford

Member
On the a95k, tests show that it hard clips at 500nits with tone mapping off. Do not use that setting, ever.

I just got the a95k last week, replacing my LG C9. Sold my C9, so the Sony ends up only being $1000 CAD.

The idea that Sony handles sub-4K content the best is the truth. My Nintendo Switch is like a whole new console now. Stunning. I never liked how it looked on my LG.

Oh boy, now I'm confused.

I should never put HDR tone mapping to off, not even briefly for the calibration on Xbox? You say I should keep it at Gradation Preferred and then just set the patterns on the Xbox as it tells me to? So all the way to the left for min luminance, and for the two max luminance patterns I should set them until the pattern is gone? Which is at 1000 here.
 

King Dazzar

Member
Alright. I finally took the time to do this. Had not been gaming lately.

On the Sony A95K I put HDR Tone Mapping from gradation preferred to off. I then go into the HDR game calibration settings on Xbox. First pattern I put it all the way to the left, which is minimum luminance. It says this :

MinTML 0.000
MaxTML 500
MaxFFTML 500
Then for the first maximum luminance pattern I go all the way to the left and then it only takes 4 dpad clicks to the right for the pattern to be gone. It says this:
MinTML 0.000
MaxTML 500
MaxFFTML 500

The same for the second max luminance pattern.

It then shows the calibrated image which is super bright and pretty. When I then put the TV back to gradation preferred it looks super dim on this picture. But that's OK? Also, shouldn't the MaxTML at least be 1000?
So I believe the 500 nits is a bug with your TV when disabling tone mapping, posted about further up by another owner. So what I would advise is leaving your tone mapping at Gradation Preferred, and then just setting the values of MaxTML and MaxFFTML to the known 10% window peak luminance of your set. I believe your TV hits up to circa 950 nits on a 10% window, so I'd set yours to 1000 nits.

Hope that helps.
 