
retro consoles and 240p / 480p resolution - sometimes less is more

Hi friends, I know everyone here wants to play games at the highest resolution possible, because only a higher resolution can show off the amount of detail in modern games. I, however, still love gaming on older consoles, especially 5th and 6th gen, because I think the amount of history and influential titles that came to those consoles trumps even modern platforms. There's a problem, though: these old games were designed with equally old 320x240 / 640x480 resolutions in mind. On a CRT, even such low resolutions looked very good, so at the time I could just enjoy playing these games without noticing any big problems with image quality. Times have changed, however, and now everyone plays on high-resolution screens (like 1920x1080, 1440p and 4K).

Here's how the PSX game MGS1 looks on a modern screen: 320x240 upscaled to 1440p. The pixelation is extreme, and this is not how people back in the day played (and remembered) this game.


MGS-320x240.png
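If you want to reproduce this kind of raw upscale yourself, here's a minimal sketch in Python (assuming Pillow is installed; "mgs_320x240.png" is just my placeholder name for any 320x240 frame grab, not a file from this post):

```python
# Integer nearest-neighbour upscale: every source pixel becomes a 6x6 block,
# which is exactly the "extreme pixelation" shown above.
from PIL import Image

src = Image.open("mgs_320x240.png").convert("RGB")   # hypothetical 320x240 grab
scale = 1440 // src.height                           # 1440 // 240 = 6
up = src.resize((src.width * scale, src.height * scale), Image.NEAREST)
up.save("mgs_1440p_raw.png")                         # raw, blocky 1920x1440 image
```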



On CRT TVs the "pixels" were blurred, and at the same time games appeared MORE DETAILED, because the blur and the black scanlines made the brain guess and fill in how details should look. People were literally using their imagination when playing these old games. Below is a great comparison:


Clipboard02.png
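For anyone curious how much of that "reconstruction" is just blur plus dark line gaps, here's a toy approximation in Python. This is emphatically not one of the real shaders shown in this thread, just the basic idea, and it assumes numpy and Pillow plus the raw upscale from the previous sketch:

```python
# Toy CRT softening: blur the blocky upscale slightly, then darken the bottom
# half of every fat 6-row "scanline" to fake the gaps between beam lines.
import numpy as np
from PIL import Image, ImageFilter

img = Image.open("mgs_1440p_raw.png").convert("RGB")
img = img.filter(ImageFilter.GaussianBlur(1.2))      # soften the pixel edges
a = np.asarray(img).astype(np.float32)

scale = 6                                            # 240 source lines -> 1440 rows
for row in range(a.shape[0]):
    if row % scale >= scale // 2:                    # lower half of each source line
        a[row] *= 0.55                               # darken it: a crude scanline

Image.fromarray(a.clip(0, 255).astype(np.uint8)).save("mgs_crt_toy.png")
```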





Maybe some people like pixelated and blurry images, but personally I'm not a fan of that look, so for the last decade I've been using emulators, because they let you increase the native resolution and get a sharp image without pixelation. There's one problem, though. You see friends, games from the 5th-gen era also had poor asset quality (low polycounts and low-resolution textures), so when you run these games at a higher resolution (on an emulator, or in a PC port) those low-quality assets are ten times more noticeable, because a higher resolution lets you see not only more detail but also more imperfections. For example, in Gran Turismo 1 running on the emulator at 1440p, I can see very glaring LOD transitions between the cars just in front of me and the cars a little further away. Something like that is very distracting.

16x-internal-res.png



There's a solution to this problem, however. You see friends, sometimes less is more (although women will probably not agree with me ;P), so let's use the 320x240 internal resolution, but with a CRT filter / shader on top, to emulate the CRT look on a modern screen.


CyberLab CRT shader



Here are my tweaks to the shader parameters, as the default colors were washed out.
Preset: MBZ__0__Smooth-Advance_Full_Reflections / 1440p / CyberLab_Aperture_Grill_IV_OLED

In CRT "Brightness & Gamma" section
-gama in 2.05
-gama out 2.4
-post CRT brightness 2.0

In "Brightness Settings" section
-magic glow 0.4
-bloom strength 40
-gamma Correct 0

In "Digital Controls" section
-brightness 0.15

In "Signal Color Space"section
-Display Color Space 0
-Signal Type 0
-Gamut Compression 0

In "Guest Color Tweaks" section
-Saturation Adjustment wp_saturation 1.05
-Brigthness Adjustment 1.08

In "G-sharp Resampler" section
-G-Sharp ON 1
-Sharpness Definition 0.50

CyberLab CRT shader

mgs-1.png


And now a photo from a real SD CRT. Please ignore the blue color tint (my phone camera just can't capture what my eyes see in real life) and just focus on the details. I think this CyberLab shader totally nailed what my eyes see on a real CRT :).


20240106-130206.jpg



CRT-Royale shader in RetroArch. I recommend opening this image in a new tab and viewing it at 1:1, because otherwise the scanlines create a moire pattern (a scaling defect).

MGS-320x240-royal.png


mame_hlsl CRT shader
MGS-320x240-mamehsl.png



These are only SDR CRT filters at 1440p (a 4K screen is recommended for better CRT aperture-grille emulation), but they still make a big difference. The image is less bright (because of the scanlines; you can turn up your own screen's brightness to compensate), but the pixelation is also much less noticeable and looks similar to a real CRT. At the same time, the whole image appears more detailed thanks to the scanlines.
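The brightness loss is easy to see in numbers: a pure aperture-grille mask lights only one of the R/G/B phosphor columns per pixel, so at 100% opacity roughly a third of the light survives, which is also why HDR headroom helps so much. Here's a toy version of the mask (my sketch, not any of the shaders above; continues from the earlier snippets):

```python
# Toy aperture grille: column 0 passes only red, column 1 only green,
# column 2 only blue, then the pattern repeats. At full opacity the image
# keeps roughly 1/3 of its light, hence the dimming discussed above.
import numpy as np
from PIL import Image

a = np.asarray(Image.open("mgs_crt_toy.png").convert("RGB")).astype(np.float32)
mask = np.zeros((a.shape[1], 3), dtype=np.float32)
mask[0::3, 0] = 1.0                                  # red phosphor columns
mask[1::3, 1] = 1.0                                  # green phosphor columns
mask[2::3, 2] = 1.0                                  # blue phosphor columns
a *= mask[np.newaxis, :, :]                          # apply mask at 100% opacity
Image.fromarray(a.clip(0, 255).astype(np.uint8)).save("mgs_grille_toy.png")
```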

Now let's take a look at a very interesting "TV-flickering" CRT shader (from the KAIO CRT shader pack). Unlike the previous shaders it has no visible scanlines, just a CRT slot mask, but the results are still very pleasing to the eye.

retroarch-2024-01-08-22-00-14-125.png



I absolutely love this look, so I recommend that people who want the best results on an SDR display try the KAIO shaders. Here's the link to this shader pack:


All shaders in this KAIO pack look interesting, but I personally like the "TV-flickering" shader the most, because it has the perfect amount of blur. I do use some tweaks to the shader parameters, though, to make it look even more pleasing to my eye:
- Disable FXAA
- In the "Hi resolution scanlines handling" section, I set flicker power to 0.15 to make it look comparable to my SD CRT (you can turn it off completely if you don't like this interlacing-simulation effect)
- In the "low level phosphor grid" section, I set overmask to 1.10 to make the image a little brighter
- In the "curvature border" section, I turn down warp X / Y
- If you want a sharper image, you can also disable "deconvergence", because it blurs pixels

SDR shaders can't emulate a CRT aperture grille perfectly, though. Only HDR shaders (especially at 4K) allow you to have a CRT mask at 100% opacity, so please keep in mind that what you're seeing in my screenshots still doesn't look as good as it can. I wish I could show you how much of a difference the HDR shaders make, but on an SDR screen it's impossible to show that difference, I'm afraid.

Edit: I have found an HDR video on YT that shows very well what a difference CRT shaders can make.



You can find CRT HDR shaders in RetroArch, or use expensive upscalers / converters like the RetroTink 4K if you want to play on real hardware and get that CRT/PVM look on your modern screen. More about the RetroTink 4K and CRT shaders in the link below.




And what about increasing the internal resolution just a little bit (2x instead of 16x) and using CRT shaders in the emulator on top of that?

MGS1 PSX at 2x internal resolution (640x480), without CRT shaders.


MGS-640x480.png



The image is certainly less pixelated than the standard 320x240, but the pixelation is still very noticeable, so I personally wouldn't want to play like this. Now let's use the mame_hlsl CRT shader.


MGS-640x480-MAMEHSL.png


or the CRT TV-flickering shader from the KAIO shader pack

kaio.png



A 640x480 image upscaled to 1440p no longer looks pixelated thanks to the CRT shaders. I can enjoy playing at 640x480 again like in the old days; in fact, it's even better. The resolution is still low (640x480), so the textures aren't as stretched as they would be at native 1440p, and LOD transitions are also far less noticeable than at native 1440p. This is now my preferred way of playing old games, a fusion of modern and old screen technologies (I have never seen MGS look this good, even on a normal CRT). SDR shaders can do the job, but I really recommend HDR shaders, because they can accurately emulate the CRT aperture grille without making the image look too dark.


Alundra PSX, standard upscaling

retroarch-2024-01-08-18-26-44-736.png


KAIO shader

retroarch-2024-01-08-18-27-35-468.png


Metal Slug 3 NeoGeo native 240p upscaled to 1440p


retroarch-2024-01-09-15-02-01-006.png


KAIO shader


retroarch-2024-01-09-15-01-38-170.png


How it looks on my GT60 plasma

20240109-025726.jpg


Shinobi 3 Sega Genesis

1.png


KAIO shader

2.png


KAIO shader on plasma TV

20240109-024159.jpg


Dino Crisis 2 PSX, RAW PIXELS when upscaled to 1440p


dc2-raw.png



CyberLab Shader


dc2-cyberlab.png


Cyberlab Shader + bezel

dc2-cyberlab-bezel.png



And a comparison with the PC version + AI-upscaled textures.


dc2-PC.png



Dylan's character model looks like a doll in the PC version at high resolution, while at 240p he looks more like a human.


a2.png


Clipboard01.png


a3.png



But what if someone wants to use CRT shaders in other emulators? You can still use ReShade and download the CRT-Guest-ReShade shader.

PCSX2 emulator, CMC Rally 3 - 511x416 upscaled to 1440p using integer scaling + the bilinear-sharp method (similar to nearest neighbour) + the CRT_guest_HD shader

My settings for the shader parameters:
- Resolution X, Y: enter your native resolution
- Preprocessor Definitions: enter your native resolution
- Bloom Strength 0.15
- CRT mask 10 for 480i/480p games, and mask 8 for 720p games (X360/PS3)



cmc4.png



At 512x412 the image looks sharp and detailed on my PC monitor, thanks to the CRT phosphor mask, which makes the big square pixels in this game look round and pleasing to the eye.

My observations:
- Bilinear upscaling makes 240p / 480p content look extremely blurry.
- Nearest-neighbour upscaling makes the upscaled image perfectly sharp, but also extremely pixelated, so we go from one extreme to the other (see the sketch after this list).
- CRT shaders can eliminate the pixelation of nearest-neighbour upscaling, resulting in a perfectly balanced image (sharp and clean, similar to a CRT).
- The only downside is the reduced brightness due to the CRT mask, but you can compensate for this by simply increasing your monitor's brightness.
- HDR shaders give much better results than SDR shaders, because they allow a CRT mask at 100% opacity, and because an HDR image can be extremely bright there's no need to artificially boost brightness (with bloom etc.) as in SDR shaders, so colors and gamma look correct.
- SDR shaders with some color and gamma tweaks can still give very good results compared to normal upscaling, but you may sometimes notice color / gamma shifts as a result of that correction, because SDR shaders boost these settings.
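For reference, here's how I understand the "bilinear sharp" scaling mentioned in the PCSX2 section - a sketch under my own assumptions, not the emulator's actual code: prescale by the largest integer factor with nearest neighbour, then do a single bilinear pass to the final size.

```python
# Sharp bilinear: crisp pixel edges from the integer prescale, no shimmer at
# non-integer ratios thanks to the final bilinear pass.
from PIL import Image

def sharp_bilinear(src: Image.Image, target_w: int, target_h: int) -> Image.Image:
    k = max(1, min(target_w // src.width, target_h // src.height))
    pre = src.resize((src.width * k, src.height * k), Image.NEAREST)
    return pre.resize((target_w, target_h), Image.BILINEAR)

# Hypothetical usage: a 320x240 grab filling the 4:3 area of a 1080p screen.
out = sharp_bilinear(Image.open("mgs_320x240.png").convert("RGB"), 1440, 1080)
out.save("mgs_sharp_bilinear.png")
```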
 

Holammer

Member
I played some Wizardry (PS2) recently. There's a lot of low-resolution artwork in the game for portraits and locations that looks really bad on a modern display. Apply some "CRT image reconstruction" via CRT-Royale and it looks much better.
Just look at the barrels in the bottom-right corner, for example: they are sloppily painted, without any serious retouching, because the display will handle the rest and blend it all together.

406W9iP.jpg
 
I played some Wizardry (PS2) recently. There's a lot of low-resolution artwork in the game for portraits and locations that looks really bad on a modern display. Apply some "CRT image reconstruction" via CRT-Royale and it looks much better.
Just look at the barrels in the bottom-right corner, for example: they are sloppily painted, without any serious retouching, because the display will handle the rest and blend it all together.

406W9iP.jpg

You're absolutely right that a lot of old games had 2D artwork / backgrounds, and that's another reason why I don't want to simply increase the internal resolution: it makes the 2D artwork look worse (more pixelated and blurry at the same time).

For example, Resident Evil 1 on the Nintendo GameCube is one of my favorite games on that system. It has pre-rendered 2D backgrounds plus 3D characters. The problem is that when running this game on an emulator there's a stark difference between the perfectly sharp character models and the blurry backgrounds. Even when playing the PC version with AI 4K-upscaled textures I see this problem (although to a much smaller extent). Thanks to the HDR PVM shader in RetroArch, even the GC version of RE1 looks amazing - I can no longer see a difference between the blurry backgrounds and the detailed character models. I can see a similar effect on my old SD CRT, but this PVM shader in RetroArch blends 2D / 3D objects with much better clarity, as the image is still reasonably sharp, whereas on my SD CRT the blur is so strong that it's difficult to read certain small letters / notes within the game. I have never seen RE1 look this good, not even on PC with mods.

As for your own comparison, it shows the big difference CRT-Royale can make, but it also shows the limitations of SDR shaders. Only HDR shaders can recreate the aperture-grille mask without dimming the entire image as much, and without any shift in colour hue. However, you can try to compensate for the brightness loss by increasing the brightness of your monitor, and the shaders also have options for color tweaks.
 

rofif

Can’t Git Gud
MGS1 had no filtering whatsoever on PS1?
It was just a pixelated 320x240 image without any AA? The CRT TV just did all the work?

Obviously integer scaling it to 4K/1440p is not correct. It was never meant to be this pixelated.
But the default bilinear stretch to your TV's resolution does "smooth" some stuff out. Not like a CRT, but maybe closer than raw pixels.

Some CRT filters could work, but not on the 4K image - on the raw image, integer scaled to your TV's resolution.
 
That's why I keep the old trusty Philips 27-inch hooked up. Great for my GameCube, Wii, PS2, 360, and Genesis.

Edit: Awesomely, it is one of the few standard-def CRTs with component input I have ever come across.
In my country, HD CRT TVs with component inputs and 480p/720p support were extremely rare, so for many years I could only dream of owning one 😁. That's probably why I'm so impressed with the PVM HDR shaders: I can finally see how good these old games can look without the excessive blurriness of standard SD CRTs.
 

Holammer

Member
MGS1 had no filtering whatsoever on PS1?
It was just a pixelated 320x240 image without any AA? The CRT TV just did all the work?

Obviously integer scaling it to 4K/1440p is not correct. It was never meant to be this pixelated.
But the default bilinear stretch to your TV's resolution does "smooth" some stuff out. Not like a CRT, but maybe closer than raw pixels.

Some CRT filters could work, but not on the 4K image - on the raw image, integer scaled to your TV's resolution.
The PS1 didn't even have perspective-correct texture mapping, which explains the infamous texture warping. Only the N64 had bilinear filtering.

Those are some horrible shaders.


I'm getting a 4K OLED later this year so I can install that. At 1440p (or lower) you have to settle for a "good enough" solution.
 
MGS1 had no filtering whatsoever on PS1?
It was just a pixelated 320x240 image without any AA? The CRT TV just did all the work?

Obviously integer scaling it to 4K/1440p is not correct. It was never meant to be this pixelated.
But the default bilinear stretch to your TV's resolution does "smooth" some stuff out. Not like a CRT, but maybe closer than raw pixels.

Some CRT filters could work, but not on the 4K image - on the raw image, integer scaled to your TV's resolution.
Yes, there was no texture filtering in MGS1; the CRT TV did all the work (although you could force texture filtering on the PS2). I always use integer scaling in RetroArch - it's a must when you use CRT filters.

Those are some horrible shaders.


I have read about this shader pack, but for some strange reason I could not get it to work on my PC. Maybe with this video tutorial I will finally be able to install it successfully.

BTW, does this pack maybe also have HDR CRT shaders?
 

Sojiro

Member
That's why I keep the old trusty Philips 27-inch hooked up. Great for my GameCube, Wii, PS2, 360, and Genesis.

Edit: Awesomely, it is one of the few standard-def CRTs with component input I have ever come across.
Yep, I keep a CRT in my bedroom with my PS2, GC and N64 connected to it. It's also one I found with component input, which can be harder to find; the only thing I would kill for is if it had VGA as well, for the Dreamcast.
 

Soodanim

Gold Member
I've done the same with PS1 games - slightly upres'd but still using a shader. It's a great look.

I'm not always about ultra accuracy, as there are some things that don't deserve to be preserved. PS1 texture warping, for example.

I used to be a big fan of Mega Bezel Death To Pixels, and they're still good. But these days I've become more fond of shaders that have some blur to them, even though once upon a time I couldn't stand them. MD/Gen games look great, and even PS1 games at low/native res get a nice look to them.

In particular, I've grown very fond of the Koko AIO shader pack for RetroArch. Here's a screenshot I just took of Shinobi (click to expand, or right-click and view in a new tab):
Shinobi-III-Return-of-the-Ninja-Master-USA-240104-194617.png

The specific shader is tv-PAL-my-old, but with some tweaks to tone down the blur and white glow a bit and disable a spotlight in the corner. I'm also a big fan of the Ambilight-esque reactive colours beyond the reflective bezel.

Truth be told, if I had a 4K HDR OLED I would be trying out the new shaders that use the TV's technology, rather than computing power, to recreate a CRT image.
 

ZoukGalaxy

Member
It's not really about "less is more"; it's more that "CRT rendering, or a CRT screen, is better when the game was created in that era on a CRT monitor", that's it ¯\_(ツ)_/¯
 
I've done the same with PS1 games - slightly upres'd but still using a shader. It's a great look.

I'm not always about ultra accuracy, as there are some things that don't deserve to be preserved. PS1 texture warping, for example.

I used to be a big fan of Mega Bezel Death To Pixels, and they're still good. But these days I've become more fond of shaders that have some blur to them, even though once upon a time I couldn't stand them. MD/Gen games look great, and even PS1 games at low/native res get a nice look to them.

In particular, I've grown very fond of the Koko AIO shader pack for RetroArch. Here's a screenshot I just took of Shinobi (click to expand, or right-click and view in a new tab):
Shinobi-III-Return-of-the-Ninja-Master-USA-240104-194617.png

The specific shader is tv-PAL-my-old, but with some tweaks to tone down the blur and white glow a bit and disable a spotlight in the corner. I'm also a big fan of the Ambilight-esque reactive colours beyond the reflective bezel.

Truth be told, if I had a 4K HDR OLED I would be trying out the new shaders that use the TV's technology, rather than computing power, to recreate a CRT image.

Agreed, I have noticed this as well. 240p 2D action games with sprites (like Shinobi, Metal Slug, Cadillacs and Dinosaurs) can benefit from some additional blur. For these games I use a PVM mask with only 300 lines instead of the default 600, and on top of that I also change the deconvergence settings to add some extra blur.

I know that eventually all my CRT TVs will stop working, so having a good alternative (CRT shaders) lets me stop worrying. I must say I have a lot of respect for the talented people who have created these CRT shaders, because what they've done is truly incredible. The PVM HDR shaders especially are impressive. I don't know how to record HDR footage on my PC, but this video shows very well what I'm seeing. This YT video should only be viewed on an HDR-capable screen.

 

Crayon

Member
I very rarely emulate. I want a CRT but can't justify the space. I use a RetroTink 2X (cheap) and its scanline option is good enough for me. Maybe I could go for one of the better ones, but retro is like 5% of my play time, and most of that has been on handhelds for the last 2 years.
 

Minsc

Gold Member
Sometimes, even more than resolution, going back to EGA makes a compelling argument that less is more.

If you compare a lot of the EGA artwork in The Secret of Monkey Island to the VGA artwork in Monkey Island 2, you can make a pretty compelling argument that the work done on MI1 with 16 colors is better than MI2 - the way they convey so much with so little, using dithering techniques etc.
 
Yep, I keep a CRT in my bedroom that has my PS2, GC and N64 connected to it. Is also is one I found with component which can be harder to find, the only thing I would kill for is if it had VGA as well for the Dreamcast.
But you still have component input. You can buy a lagless VGA-to-component converter - it just performs math on the analog signal without doing any digital processing.


VGA is theoretically better because it can send 0-255 RGB, but to my knowledge 6th-gen consoles were limited to 16-235 RGB, so the image quality should be very similar, if not identical.
 

dave_d

Member
People think I'm crazy but I would pay good money for a new large display CRT TV. I don't care how heavy it is, I don't care how big it is, it's an appliance to me, I would pay for it to be in my house regardless of size.
So basically you want something like a Sony Picturescope projection TV?
 
Shaders are for chumps.
I'll take my scanlines extra silky via a PC monitor at 3200x224p at 120FPS

image.png

image.png

image.png


Of course, I've also got a standard CRT TV next to it for when I'm in the mood.
I got a VGA-to-component converter for it.

I've had several CRT monitors in the past and loved every one of them. But no matter how much we loved CRTs, they don't last forever, and the picture unfortunately gets worse over time. The last CRT monitor I bought, in 2013, was already a bit worn out (the colors and contrast weren't so vibrant anymore).

When all the CRT TVs are gone someday, at least we will have a good alternative. The SDR CRT/PVM shaders don't allow an aperture-grille mask at 100% opacity, but the HDR shaders do, and the results are very convincing. Personally, I even prefer how the CRT/PVM HDR shaders look, because I get a MUCH bigger screen size with even better static image quality (the image is brighter, and I can easily adjust sharpness and pixelation to my liking). Only the motion clarity isn't as good on modern screens, but I bet something like a 120Hz OLED + BFI will get close, not to mention 240Hz OLEDs.
 

Dr.D00p

Member
I remember when MAME used HLSL shaders for the very first time. It was revolutionary, but I hate to think how many hours of my life I've wasted going down the rabbit hole of trying to perfect the 'CRT look'. Certainly more time than I've spent playing the damn games, that's for sure 😁
 

NeoIkaruGAF

Gold Member
Raw pixels on games made in the CRT era are a sin. Plain and simple. This is truth. It's futile to argue against it. There are plenty of images showing the stark difference between the same image on an LED screen vs a CRT. The effect can be literally night and day.

Even modern pixel-art 2D games look better on a CRT. I've connected my Switch to a CRT via a cheap HDMI-to-SCART converter bought on Amazon, and Celeste looks unquestionably better than on a flat screen with no shader. Even Mario Kart 8 looks great.

Even if you can emulate the CRT look as accurately as possible, there’s still the problem of motion resolution. I’m not sure sample-and-hold will ever get there, save for an implementation of BFI that current TVs are very far from achieving.

Anyway, even mediocre shaders like those on the Classic Mini consoles or Nintendo's online service are better than raw pixels. Scanline options are the lazy way out - the CRT look is so much more than scanlines, which were much less visible on PAL screens than on NTSC, and on consumer TVs vs high-res computer monitors.
 
What works pretty well, for some reason, is upscaling the image to about 4 times the original resolution without altering the aspect ratio, then lowering the output resolution to double the original with a blurring filter plus a color-boosting filter. It usually smooths out the jagged edges, reduces shimmering, and increases the fidelity of the image. The only real limitation is the texturing on low-polygon models in motion. It still looks better than on a CRT in the case of games like Silent Hill (PS1).
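If I've understood that recipe correctly, it would look something like this in Python - my sketch with guessed filter strengths, assuming Pillow; "sh_frame.png" is a hypothetical frame grab, not a file from this post:

```python
# 4x nearest-neighbour up, slight blur to smooth jaggies, then down to 2x of
# the original with a mild colour boost - the pipeline described above.
from PIL import Image, ImageEnhance, ImageFilter

src = Image.open("sh_frame.png").convert("RGB")      # hypothetical 320x240 grab
big = src.resize((src.width * 4, src.height * 4), Image.NEAREST)
big = big.filter(ImageFilter.GaussianBlur(1.5))      # smooth the jagged edges
out = big.resize((src.width * 2, src.height * 2), Image.LANCZOS)
out = ImageEnhance.Color(out).enhance(1.15)          # the colour-boosting step
out.save("sh_2x_smoothed.png")
```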
 
Well, not exactly - you are still missing CRT motion clarity, which you can't really get on modern displays.
As someone who has owned several CRTs, I'm well aware that the motion quality of a CRT is second to none. But I'm a practical person, so if modern technology gets me close, I won't get hung up on a few minor discrepancies.

vnb8Wos.png



My old Panasonic GT60 plasma has 4ms of persistence blur, and that's already good enough for me. If anything else can get close to 4ms of persistence blur, I'll be happy. The 240Hz OLEDs can do that, and so can 120Hz OLEDs + BFI.

There are also Nvidia ULMB monitors with 2ms of persistence blur, and I wonder if the human eye can even see the difference between 2ms and 1ms of persistence blur.
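The rough math behind those persistence numbers (my own back-of-the-envelope figures, using the usual rule of thumb that perceived smear is roughly panning speed times persistence):

```python
# At a 960 px/s pan, 4 ms of persistence smears ~3.8 px; the jump from
# 2 ms down to 1 ms only saves about one pixel of smear.
for persistence_ms in (16.7, 4.0, 2.0, 1.0):   # 60Hz sample-and-hold, plasma, ULMB...
    blur_px = 960 * (persistence_ms / 1000)
    print(f"{persistence_ms:>4} ms persistence -> ~{blur_px:.1f} px of smear")
```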
 

Soodanim

Gold Member
As someone who has owned several CRTs, I'm well aware that the motion quality of a CRT is second to none. But I'm a practical person, so if modern technology gets me close, I won't get hung up on a few minor discrepancies.

vnb8Wos.png



My old Panasonic GT60 plasma has 4ms of persistence blur, and that's already good enough for me. If anything else can get close to 4ms of persistence blur, I'll be happy. The 240Hz OLEDs can do that, and so can 120Hz OLEDs + BFI.

There are also Nvidia ULMB monitors with 2ms of persistence blur, and I wonder if the human eye can even see the difference between 2ms and 1ms of persistence blur.
When I got my first G-Sync monitor I did a playthrough of HL2 at a locked 120Hz with ULMB, and it was incredible. If it were brighter it would be damn near perfect. It's a shame it hasn't gained more popularity, but I guess VRR is an easier target than BFI because it doesn't require a locked high FPS. It should really be the goal for high-performance setups though, especially with the recent focus on brighter displays.
 

Killer8

Member
I feel that whichever CRT shader you settle on will really depend on the experience you had growing up. I gamed on a '90s Sony Trinitron, so a scanline-laden "PVM" look is totally alien to me. Some people even want to emulate a more blurred and washed-out composite-cable look, because it's what they remember. I have that Trinitron set up next to me right now, and if you use RGB cables the image is SHARP, with no horizontal scanlines whatsoever.
 
I feel that whichever CRT shader you settle on will really depend on the experience you had growing up. I gamed on a '90s Sony Trinitron, so a scanline-laden "PVM" look is totally alien to me. Some people even want to emulate a more blurred and washed-out composite-cable look, because it's what they remember. I have that Trinitron set up next to me right now, and if you use RGB cables the image is SHARP, with no horizontal scanlines whatsoever.
On my small 21" philips SD CRT from 2000 the scanlines aren't that noticeable from the normal viewing distance, however I can still see them from up close, at least in 240p content.

My S21 Ultra camera isn't very good (compared to my mirrorless camera), but even a phone photo can show the scanlines very clearly.

MGS1 running on the PS2, 240p, SCART RGB

20240106-130206.jpg


Metal Slug on the PS2, 240p
20240106-120911.jpg


So scanlines were always there at 240p, even on the SD CRT.

At 480i, however, I can barely see scanlines even at close range, just as you describe.

Metal Slug 3, PS2 480i
20240106-124335.jpg
 
I would never play my old retro consoles on anything other than a CRT (with proper cables for RGB output). Everything else looks like shit IMO, certainly upscaled on a big flat-screen TV.
 
I would never play my old retro consoles on anything other than a CRT (with proper cables for RGB output). Everything else looks like shit IMO, certainly upscaled on a big flat-screen TV.
There's absolutely no perceptible difference (at least in a static image) between a real CRT TV and CRT HDR shaders. Dude, I'm telling you, HDR technology combined with 4K can emulate a CRT aperture grille perfectly, and you can adjust the amount of scanlines and blur to match your own SD CRT. Everyone who has seen HDR shaders in action on a good HDR display will agree with me.

Screenshot-20240106-132756-You-Tube-2.jpg


I have read many comments like this. People who have seen HDR shaders in action don't want to keep their CRTs.

SDR shaders can't emulate the CRT aperture grille as perfectly (because SDR doesn't allow a mask at 100% opacity), but I can't show you HDR shaders in SDR screenshots.
 
There's absolutely no perceptible difference (at least in a static image) between a real CRT TV and CRT HDR shaders. Dude, I'm telling you, HDR technology combined with 4K can emulate a CRT aperture grille perfectly, and you can adjust the amount of scanlines and blur to match your own SD CRT. Everyone who has seen HDR shaders in action on a good HDR display will agree with me.

Screenshot-20240106-132756-You-Tube-2.jpg


I have read many comments like this. People who have seen HDR shaders in action don't want to keep their CRTs.

SDR shaders can't emulate the CRT aperture grille as perfectly (because SDR doesn't allow a mask at 100% opacity), but I can't show you HDR shaders in SDR screenshots.
What.. I.. I don't even know what to reply to this. It's about the whole experience - pulling out that big fucker, even though it's only a 21-inch TV screen, and sitting at a 40 cm distance.

And still getting a fantastic image thanks to the RGB output, ahhh.

It's about keeping the same setting and feelings from when I was 10 years old. That's simply not possible with a 65-inch OLED, no matter what HDR shaders you provide me with. That's just not a pleasant way for me to play old retro games, personally. Tell me what you want, but it's not going to be the same thing. To me, a 240p image was never supposed to be upscaled that much.
 
This represents perfectly how I played on my (PAL) CRT TVs back then; I never had scanlines.
Now I wonder if an 8K TV would have enough pixel density to emulate a CRT down to the pixel level, if that makes sense to say.
4K is sufficient for 300-line CRT/PVM emulation, but for perfect 600-line PAL emulation you would need an 8K display.

1080p on the left, a real 600-line PVM in the middle, and 4K on the right.

Screenshot-20240106-134820-You-Tube.jpg


However, if you are not looking at the image through a zoom lens, even 4K will give you that scanline-free look if you just change the shader settings to 600-line emulation.
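The arithmetic behind that (my numbers, not from the video) is simply how many panel rows each emulated scanline gets:

```python
# 4K gives 7.2 rows per scanline at 300 lines (plenty for a beam profile)
# but only 3.6 rows at 600 lines - hence the case for 8K PAL emulation.
for name, height in (("1080p", 1080), ("1440p", 1440), ("4K", 2160), ("8K", 4320)):
    for lines in (300, 600):
        print(f"{name}: {height / lines:.1f} rows per scanline at {lines} lines")
```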
 
What.. I.. I don't even know what to reply to this. It's about the whole experience - pulling out that big fucker, even though it's only a 21-inch TV screen, and sitting at a 40 cm distance.

And still getting a fantastic image thanks to the RGB output, ahhh.

It's about keeping the same setting and feelings from when I was 10 years old. That's simply not possible with a 65-inch OLED, no matter what HDR shaders you provide me with. That's just not a pleasant way for me to play old retro games, personally. Tell me what you want, but it's not going to be the same thing. To me, a 240p image was never supposed to be upscaled that much.



This guy has a 65" OLED. Watch his video and listen to what he says when he compares his OLED to PVM.

Screenshot-20240106-140141-You-Tube.jpg


If you want a screen size comparable to a 21-inch CRT TV, you should use a 27-inch panel, not a 65-inch one.

And still getting a fantastic image thanks to the RGB output, ahhh.
The 240p / 480i image looks good on a small SD CRT, but not fantastic, as the SD CRT blurs the image too much (details are not well defined). If you use SCART RGB, the image will be even less defined, because the SD CRT will not apply any post-processing (sharpening filters, etc.) to the SCART RGB input. For this reason, I have always preferred to use the S-Video signal on my SD CRT, because I get a much sharper image that way. Only CRT/PVM monitors, or CRT/PVM shaders, can get you an image from 240p/480i well defined enough to describe as fantastic.
 


This guy has a 65" OLED. Watch his video and listen to what he says when he compares his OLED to PVM.

Screenshot-20240106-140141-You-Tube.jpg


If you want a screen size comparable to a 21-inch CRT TV, you should use a 27-inch panel, not a 65-inch one.


The 240p / 480i image looks good on a small SD CRT, but not fantastic, as the SD CRT blurs the image too much (details are not well defined). If you use SCART RGB, the image will be even less defined, because the SD CRT will not apply any post-processing (sharpening filters, etc.) to the SCART RGB input. For this reason, I have always preferred to use the S-Video signal on my SD CRT, because I get a much sharper image that way. Only CRT/PVM monitors, or CRT/PVM shaders, can get you an image from 240p/480i well defined enough to describe as fantastic.

These sure are opinions that one can have; I simply don't agree 🙂 You play games however you want to, and I'll do the same.
 

cireza

Member
The 240p / 480i image looks good on a small SD CRT, but not fantastic, as the SD CRT blurs the image too much (details are not well defined). If you use SCART RGB, the image will be even less defined, because the SD CRT will not apply any post-processing (sharpening filters, etc.) to the SCART RGB input. For this reason, I have always preferred to use the S-Video signal on my SD CRT, because I get a much sharper image that way. Only CRT/PVM monitors, or CRT/PVM shaders, can get you an image from 240p/480i well defined enough to describe as fantastic.


Get an 8K TV, pile on a million shaders, interpolate hundreds of frames, insert black frames on top of that, get a super dark picture (but who cares), and probably ghosting too, and sometimes stuttering. Does all this processing add input lag? Let's hope not. And don't forget your personal nuclear power plant to run all this. And then, maybe, if you believe hard enough, you will get that CRT experience on that other technology.
 
What works pretty well, for some reason, is upscaling the image to about 4 times the original resolution without altering the aspect ratio, then lowering the output resolution to double the original with a blurring filter plus a color-boosting filter. It usually smooths out the jagged edges, reduces shimmering, and increases the fidelity of the image. The only real limitation is the texturing on low-polygon models in motion. It still looks better than on a CRT in the case of games like Silent Hill (PS1).
Here is an off-screen CRT picture and a RetroArch Beetle HW (with upscaling, blurring, and color shaders) screenshot of Tekken 2 for reference:

m3DerV4.jpg


FhN7KyP.png
 

Dacvak

No one shall be brought before our LORD David Bowie without the true and secret knowledge of the Photoshop. For in that time, so shall He appear.
Hell yeah, OP. Going back to CRT has been incredible for anything 6th gen and earlier. Despite having a PVM with RGB/YPbPr, I sometimes hook my consoles up via composite, just for that extra blurry (yet somehow crispy) look.

Ironically, it was RetroArch's super-advanced CRT and Mega Bezel shaders that got me to fall back in love with the CRT look, which led to me getting a real PVM. I can never go back; it's just so nice.

Recently, Analogue added Trinitron filters to the Pocket, and even those look great. Long live CRTs!
 