
The HD revolution was a bit of a scam for gamers

oji-san

Banned
I don't quite agree, but my best years of gaming were when I got my first HD Ready TV and an Xbox 360, and a year later a PS3. I stayed with that setup for 7-8 years and it was the most perfect gaming I ever had. Not once did I change a setting on my TV in all those years. Unlike when I got my 4K LG TV and later the LG C9: those are great for anything at 60fps, but lower than that? All kinds of ghosting, or whatever it's called. I'm not sure if it's the motion handling on LG TVs, and I was stupid enough to buy another LG after I wasn't satisfied with the first one, since I was sure an acclaimed model like the C9 wouldn't be problematic in any way.
Also, funny thing: before, 30fps wouldn't bother me, and now I can't stand it.
So yeah, no, but like you said, the simpler days were better for me. Not that I'm not enjoying things now, but back then I enjoyed everything without the limits of today.
 

DonkeyPunchJr

World’s Biggest Weeb
Hell no. Playing Kameo and Project Gotham 3 on launch day on my 1080i 30” TV felt like such a huge leap at the time.

And long overdue at that. PC gamers had been playing at 1024x768, 1280x1024, even 1600x1200 for several years before that. It’s not like HD resolutions were some crazy new thing that the world (and the tech) weren’t quite ready for.
 

Scotty W

Banned
One could argue that all revolutions in gaming were largely scams. Think of the transition from 2D to 3D: almost all of those early 3D games are unplayable today, and more importantly, look at the way the hype basically handicapped 2D games for years afterward.
 

Drew1440

Member
I wonder if that generation could have benefited from iterative consoles, with the regular 360 and PS3 aiming for 480p SD and the Pro/S versions aiming for 720p using more powerful graphics hardware, though from looking at past threads it seems 512MB of RAM was the main limitation of those consoles.
 

Neo_GAF

Banned
Yeah no. CRT meant 50 Hz in PAL territories. The move to 720p was the right call.
Wrong. 60Hz was available on TVs produced after 1995. You were able to play games at 60Hz; the DC, PS2 and GC sometimes had the option, while the OG Xbox had 480i and 60Hz ticked from the get-go.
I remember that my parents' TV was able to play imported US games in colour, while my TV would only output the music and no video signal.
 

K.N.W.

Member
I'm not completely sure about this, because the higher resolution at the time meant you could see more detail much more easily on HD screens; that's why we got the games we got. On the other side, lower-resolution games probably would have left way more room for additional and better visual effects, while still not looking completely clear detail-wise at that lower resolution.
For framerates, I think games would have performed better just from the lower resolution, but framerate is still up to developers, and we still got many high-framerate games anyway.
As for developers, their lives would have been much easier developing games with lower-resolution assets and having fewer headaches about performance; many gaming studios would have survived that era much more easily.... but HD gaming forced a lot of people to buy new, powerful PCs, thus financing the computing world and basically helping the world become what it is today (we depend on computers to do basically anything).....

So I want to say that the HD era was a sweet waste of pixels.
 

NeoIkaruGAF

Gold Member
One could argue that all revolutions in gaming were largely scams. Think of the transition from 2D to 3D: almost all of those early 3D games are unplayable today, and more importantly, look at the way the hype basically handicapped 2D games for years afterward.
 

treemk

Banned
OP is wrong for many reasons. In terms of purely "that looks pretty", sure, CRTs can be great, but the actual detail resolution is still ass. You're not just sacrificing resolution, but also draw distance, because of how tiny, blurry, or completely invisible distant objects will be. I was already playing Halo 2 on a 720p LCD because it was so much cleaner and easier to see what was going on; sure, there is a tradeoff where you see more jaggies because everything is clearer. Then by the time you get into the 360/PS3 era, CRTs were well on their way out and people were not going to hang onto them just because of consoles. Ultimately the devs decide how they want to prioritize resolution and framerate, which is how it should be, because some games are fine at 30fps and some games need more. Keep in mind that the "HD revolution" was entirely console-focused, with consoles catching up to PC, where 1024x768 and higher resolutions were already common.
 

Synless

Member
I’ve been playing on everything under the sun and honestly, if you are playing on a native-resolution set it looks sharp as fuck. Disagree, don’t care. My eyes see what they see.
 

Danknugz

Member
Well, as a PC gamer, we had HD resolutions going as far back as the early 2000s and in some cases the late 90s.

edit: I'd also like to add that even though PCs enjoyed higher resolutions back then, similar to today, most of the best games (with the exception of Quake/Counter-Strike) were on console, with shoddy ports. I say similar because there's been a huge commotion about PC performance on ports lately, but back then it was a whole other animal; you'd be lucky to get a port to run as well on PC at all compared to how it ran on console. For the most part it was just a happy convenience for gamers that PCs had higher resolutions, but that was not the intention of the hardware, which was aimed more at desktop publishing.

Today's closest equivalent is obviously the 100 TFLOPS 4090 vs 10-15 TFLOPS on console; no clue how long it will take consoles to get there, but just like back then, consoles have the better libraries. Today it's slightly different due to VR and the fact that the top-end GPUs are primarily intended for gaming, so PC doesn't have as much of an excuse now, one could argue.
 

PSlayer

Member
I do feel like we're reaching diminishing returns when it comes to resolution. Most people are still playing at 1080p/1440p on PC, which has much better hardware.

The worst part is that, as a TV manufacturer, Sony has a perverse incentive to push for 8K instead of something like 1440p/120fps or 4K/60fps next cycle.

IMO the PS5's target should be 1440p/60fps, and 1080p/30fps for ray tracing.
 
I still remember the first time I saw my Xbox 360 hooked up to an LCD flat screen.

Having, like most people, only ever played on a regular CRT set up to that point, the new technology of HD screens took gaming to new levels and I couldn't believe how different games looked on my new 26in flat screen.

It was like a new console generation in itself.

Gears of War 2 was the first game I tried and it looked incredible.
I'd had a similar experience previously when I got a PS3 for Christmas the year before and played it on a 42-inch plasma.
For me, the move to HD was a revelation and probably the single biggest leap in the user experience I've seen in 40 years of playing games.

The move to 4K in recent times doesn't have anywhere near the generational leap, for me personally, that the move to high definition did back then.

It was a game changer.
 

Pimpbaa

Member
I do feel like we're reaching diminishing returns when it comes to resolution. Most people are still playing at 1080p/1440p on PC, which has much better hardware.

The worst part is that, as a TV manufacturer, Sony has a perverse incentive to push for 8K instead of something like 1440p/120fps or 4K/60fps next cycle.

IMO the PS5's target should be 1440p/60fps, and 1080p/30fps for ray tracing.

1080p/30fps makes no sense. Games on current hardware can easily do higher resolutions at 30fps with RT than performance modes that typically target 1440p. If a game was doing 30fps with RT and taxing the hardware enough that it needed to be 1080p, you sure as hell ain’t getting anything near 1440p at 60fps.
 

01011001

Banned
I do feel like we're reaching diminishing returns when it comes to resolution. Most people are still playing at 1080p/1440p on PC, which has much better hardware.

it all depends on screen size and viewing distance.

1080p on a 55" TV would look like ass, simply because the low number of pixels results in a more noticeable grid. Just sitting in front of a really big 1080p TV with a screen-filling solid color, like an all-white or all-red screen, will instantly reveal the low resolution, because you'll easily be able to make out each individual pixel.
even if you watch or use 1080p media on a decently sized (50"+) 4K TV, it will look better than it would if the TV were only 1080p at the same size.

not only that: a higher screen resolution also means that scaling artifacts get less and less noticeable because of the far higher pixel density.
if you scale an image unevenly on a 1080p TV you will get really ugly and obvious scaling artifacts like double-width pixels.
on a 4K TV these are also there, but because it has 4x as many pixels (2x per axis), a double-width pixel / uneven pixel scaling is a much smaller and less noticeable issue.

it's all about pixel density.
that's also one of the reasons Analogue used a 1600x1440 screen for their Analogue Pocket. absolute overkill at first glance, but this pixel density allows them to have a good-looking image across multiple emulated handheld systems.
the GBA image, for example, doesn't scale evenly on its screen and has double-width pixels here and there, but it's not a big issue due to the extreme pixel density on such a small screen.
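To put rough numbers on the uneven-scaling point (a quick sketch of my own, using a hypothetical 900-line source image, not anything measured from a real TV's scaler):

```python
# Why uneven nearest-neighbour scaling is more visible on a 1080p panel than
# on a 4K panel of the same size. Hypothetical example: a 1600x900 render.
from collections import Counter

def row_heights(src_h: int, panel_h: int) -> list:
    """For nearest-neighbour scaling, count how tall (in panel pixels) each source row ends up."""
    owner = [src_h * y // panel_h for y in range(panel_h)]  # which source row each panel row samples
    return sorted(Counter(Counter(owner).values()).items())

for panel in (1080, 2160):
    print(panel, row_heights(900, panel))

# 1080 -> [(1, 720), (2, 180)]  some rows end up literally twice as tall as their neighbours
# 2160 -> [(2, 540), (3, 360)]  2 px vs 3 px rows, and each pixel is half the size, so far less visible
```

Same non-integer ratio in both cases, but the relative jump between neighbouring rows shrinks as pixel density goes up, which is the whole point.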
 
There's no diminishing returns on resolution.

I've noticed on many subjects, not just this one, that people confuse a lack of mainstream affordability with diminishing returns.

It makes sense that people still buy 1080p monitors, because they are cheaper and sometimes given away with PCs.
 

StreetsofBeige

Gold Member
At some point the diminishing returns theory will come true. TVs seem to tap out around 70". Yeah, some are bigger, but most people buy 50-65" TVs. People's homes are only so big. I don't think there are many people willing to buy a 100" TV even if it were cheap.

We are now at 4K. I think there are some TVs at 8K. It gets to the point where our human eyes can only tell so much clarity at a given screen size. Who knows, maybe at 8K or 16K it taps out.
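For what it's worth, here is the usual back-of-the-envelope version of that (my own sketch, assuming the commonly cited ~1 arcminute per pixel figure for 20/20 vision, which is an approximation rather than a hard limit):

```python
# Rough acuity sketch: distance beyond which a 20/20 eye can no longer resolve
# individual pixels on a 16:9 panel. Assumes ~1 arcminute per pixel.
import math

def resolve_limit_feet(diagonal_in: float, horiz_px: int) -> float:
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # screen width from the diagonal (16:9 panel)
    pixel_in = width_in / horiz_px                     # pixel pitch in inches
    one_arcmin = math.radians(1 / 60)                  # ~0.000291 rad per pixel at the threshold
    return pixel_in / one_arcmin / 12                  # small-angle approximation, converted to feet

for px, label in ((1920, "1080p"), (3840, "4K"), (7680, "8K")):
    print(label, round(resolve_limit_feet(65, px), 1), "ft")

# 1080p ~8.5 ft, 4K ~4.2 ft, 8K ~2.1 ft on a 65" set: past those distances the
# extra pixels fall below the acuity threshold, which is where the "taps out" feeling comes from.
```

Real perception is messier than one number (contrast, motion and aliasing all matter), but it gives a sense of why 8K on a living-room TV is a hard sell.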
 
I'm surprised they managed to get 720p with less than 512MB of RAM. They needed some sort of FSR/DLSS to upscale to Full HD and we would have been good. The constraints of the past kind of made developers into what they are today.
 

SageX85

Neo Member
I'm not sure if any games didn't support progressive, but I can't remember hitting any.

And even in 480p, Wii games still looked like trash compared to the 360 and PS3.

I also love the people wishing for 480 on the 360, I guess they never tried playing Dead Rising on a standard def CRT and were unable to read any in-game text.

Most people didn't have TVs that could do over 480i until they upgraded to HDTVs. The people wishing for the 360 and PS3 to top out there should try playing at those resolutions. You could lock the consoles to that output and games ran better, while looking like trash.

Small font isn't a problem of resolution but of bad design. It might blow your mind, but you could have a 4K game with big, legible text.

I see many people here who don't understand anything at all and just go for some random high numbers and the latest tech, just because it's the latest tech, like the techbros they are.

CRTs had ~50 years at SD resolution in retail. LCDs have only been around for about 17 years and haven't taken advantage of anything.
-HD (720p) wasn't even properly used: panels were at best 768p, and there were others with anamorphic pixels and even lower resolutions. Consoles couldn't render at full size, nor do a "double strike" as in 240p, so we got stuck with sub-"720p" at <30ish fps. Content-wise, cable took a long time to do HD broadcasting.
-HD 1080i: the same.
-Full HD 1080p: broadcasting OTA is not possible without compression. The 8th-gen consoles, like the 7th, can't really keep up at that resolution, so we got 900p at <30ish fps.
-QuadHD was skipped on TVs, consoles and broadcasting.
-4K: impossible to broadcast OTA, "HDR" with no real standard followed by any manufacturer, streaming with compression, 8.5th-gen consoles can't do real 4K at <30ish fps, 9th gen kinda the same, with the occasional game that really targets 60fps, and we still have color banding, in 2023.
-8K is the next big thing, and "crappy 4K will look blurry" to the techbros.
-16K must be around the corner, about to be announced.

7th-gen consoles should have aimed at optimized SD, with the extra power used for proper downsampling at a rock-solid 30fps minimum; 8th gen should have aimed at 720p at a rock-solid 30fps minimum; 9th gen at 720p/60fps minimum.

720p should have been mastered, with the bandwidth properly used: true 720p, 10- or 12-bit color depth at a true 120Hz. Consoles would have power to spare, maybe even to downsample from 1440p. Only then should we begin thinking about the need to increase display resolution. But that doesn't sell new TV sets each year, nor mid-gen console "upgrades".
 

SScorpio

Member
CRTs had ~50 years at SD resolution in retail. LCDs have only been around for about 17 years and haven't taken advantage of anything.
-HD (720p) wasn't even properly used: panels were at best 768p, and there were others with anamorphic pixels and even lower resolutions. Consoles couldn't render at full size, nor do a "double strike" as in 240p, so we got stuck with sub-"720p" at <30ish fps. Content-wise, cable took a long time to do HD broadcasting.
-HD 1080i: the same.
-Full HD 1080p: broadcasting OTA is not possible without compression. The 8th-gen consoles, like the 7th, can't really keep up at that resolution, so we got 900p at <30ish fps.
-QuadHD was skipped on TVs, consoles and broadcasting.
-4K: impossible to broadcast OTA, "HDR" with no real standard followed by any manufacturer, streaming with compression, 8.5th-gen consoles can't do real 4K at <30ish fps, 9th gen kinda the same, with the occasional game that really targets 60fps, and we still have color banding, in 2023.
-8K is the next big thing, and "crappy 4K will look blurry" to the techbros.
-16K must be around the corner, about to be announced.
4K OTA is possible along with HDR support, and the first major rollout was in May 2017 in South Korea to have things ready for the 2018 Winter Olympics. https://en.wikipedia.org/wiki/ATSC_3.0

Progressive video supports resolutions up to 3840×2160 progressive scan and supports HEVC Main 10 profile at Level 5.2 Main Tier.
Progressive video supports frame rates up to 120 fps and the Rec. 2020 color space.
Progressive video supports HDR using hybrid log–gamma (HLG) and perceptual quantizer (PQ) transfer functions.

It is available in some parts of the US, but there's no FCC mandate forcing adoption.

As for the hindsight with how old consoles look now? The majority of people were thrilled with how the 360 and PS3 looked. It was a major step up from the previous generation. Did things run stable 60 or even 30? No, but games on even the SNES or NES could slow down to under 15. The 360 and PS3 could run some games at 1080p 60, but devs wanted to show off and have better-looking games.

But I'm sorry to hear about your terrible vision, have you gone to an eye doctor? They might help you be able to see the difference HD brought to the game.
 

Cryio

Member
1440p + FSR2/DLSS2 Quality and 120+ fps is where it's at.

Modern GPUs can barely do that if we add ray tracing on top. And that's for CROSS-GEN titles.

It's going to be a while.
 

cireza

Member
I fondly remember those times when HD TVs were new. The jump from a CRT to a 720p TV was memorable.

I remember unplugging my 360 from my CRT, with its SCART RGB cable, and plugging it into my brand new HD TV through HDMI. Started the console, launched a game (it was Ninety-Nine Nights) and felt like, "Okay, not bad." Then I started moving, and this is where I went, "What the fuck is this shit?" In 5 seconds I had lost half of my sight in each eye. HD TVs have been the stuff of nightmares ever since: the worst blur you have ever seen.

I quickly demonstrated how bad it was by starting Metroid Prime on my GC on the CRT and Prime 3 on the HD TV (both side by side). Held right on the pads and witnessed the difference: the CRT was clear as day, you could distinguish every little thing in the background, no eye strain. The HD TV was like watching the side of the road from a highway. You couldn't see anything.

This is still true today, even though it is not as bad. We also got used to it: we tend not to focus on whatever becomes blurry anymore. Our brains have been trained for this mediocrity.
This is an inherent issue with the technology, so it will never be fixed. That's why I am keeping my CRT. Replayed Metroid Prime 2 last week, by the way. It was awesome. On CRT.
 
My biggest gripe, and it's still true to this day, is the poor motion resolution on LCDs; it hurt my eyes for a while in 2006/7 when I switched. I still remember the pain even in a 60fps game like Ratchet and Clank on PS3: as soon as you move, it's smear city.

I'll always remember just how blurry PS2 games looked upscaled on an HDTV via the PS3. God of War looked crisp and clean on CRT; that was my last game session on CRT before setting up my PS3 and HDTV, and I couldn't believe the blurry, washed-out, smudged mess. CRT, as we all know now, does wonders for pixel art. Non-native images look terrible on LCD.

Not sure I'd go with "scam", as by 2007 it was very easy to buy full 1080p screens; 2005-06 did have some dodgy fake-HD screens for sure. With HD LCDs we certainly had a screen very flawed for motion that we hoped would improve; almost 20 years later we still have the same issues, only slightly improved with plasma/OLED. I have a Panny plasma as well.
 

charles8771

Member
I'm surprised they managed to get 720p with less than 512MB of RAM. They needed some sort of FSR/DLSS to upscale to Full HD and we would have been good. The constraints of the past kind of made developers into what they are today.
The Xbox 360 would have shipped with 256MB if it weren't for Gears of War.
Meanwhile, the PS2 was the leading platform for developers and had 32MB of RAM five years before the Xbox 360, so 512MB was plenty for developers.



 
Fun fact: Gran Turismo 4 on the PS2 could run at 1080i. Some games of the sixth generation had optional HD output. They just did not have the specs to make much use of it even if the hardware supported it.

 

RoboFu

One of the green rats
Bad for console gamers maybe, but good for PC gamers. It made ports run great on low-end systems for a long time.
Though it may have helped lead to all the bad PC ports now. They didn't have to put much effort into the ports for a long time. 😂
 

charles8771

Member
DMC, Tekken, GT3/4, Jak & Daxter, Ratchet & Clank, MGS2, Burnout, Zone of the Enders, Ace Combat, WWE whatever, just to name a few.


Could have sworn it supported progressive scan, since the GC did and it's the same hardware. Did most games run interlaced?







Yeah, you won't talk about the GTAs dropping to 20fps or lower on PS2.
MGS3 targeting 30fps on PS2 even though MGS2 was 60fps.
Half-Life 2 dropping to 10fps on the OG Xbox.
 

ThisIsMyDog

Member
Btw, I think PS6/XNEXT in 2027/8 should stay at 4K (upscaled by something similar to DLSS); then maybe we could have a generational leap. 8K is pointless even on ridiculously big TVs.
 

The Stig

Banned
Fun fact: Gran Turismo 4 on the PS2 could run at 1080i. Some games of the sixth generation had optional HD output. They just did not have the specs to make much use of it even if the hardware supported it.


I remember buying component cables just for this. Also the cable worked on PS3.
 

mrcroket

Member
No way, the Wii was 480p and looked horrible on the TVs of the time. CRTs were already outdated by then, regardless of how much they are valued for retro gaming. For a few months I had my new Xbox 360 hooked up to a CRT TV, and when I bought my first 32" Samsung HD TV it was a super wow moment. So no, 480p was not worth it at all.
 

NinjaBoiX

Member
It’s easier to sell shiny graphics over smooth frame rates, or at least it always was.

I think the tide is turning on that opinion, thank god.
 

charles8771

Member
The real scam in that gen was pure digital.
Losing ownership, one console at a time.


The thing is that there were games that didn't have a digital release.
Like how not a single Bethesda game got a digital release on PS3, only physical.

What happens when a game on PS3/Xbox 360 is designed to stream from the HDD and the optical disc at the same time? It wouldn't get a digital release, in order to help with performance.
 
My first HD TV was a 24" Samsung with a resolution of 1366x768 which I bought to use with my original Xbox and then later with the Xbox 360. I was blown away by the quality of the image even at 720p compared with my GameCube at the time which I was using on a Sony Trinitron TV via RGB SCART.

The problems only came later when I upgraded to a 1080p Sony TV and had to endure poor image quality from my Wii for years even though I was using a component cable. I then came to realise how limited LCD screens could be for older systems.

I don't mind that we now have 4K TVs but I kind of wish the technology allowed for outputting at different resolutions without scaling, like on old PC CRT monitors which would support different resolutions and framerates up to a maximum. This would mean that 480p games would look just as crisp as 1080p and 4K ones since there would be no need to use any kind of upscaling. That said, imagine how massive (and heavy) a 55" TV would be if it was a CRT model!!!
 

ResurrectedContrarian

Suffers with mild autism
Absolutely right that the early generations of flatscreens were simply trash compared to a decent CRT, and yet no one cared because "resolution" became the only selling point. Sure, everything had more pixels, but suddenly every home was infested with horrific color depth, extremely poor blacks, etc.

For so many years, almost everyone who made the HD/flatscreen jump was simply putting something in their living room which was worse than the previous generation of CRTs on every possible metric except pixels, and I'll go to the grave on this point. The obsession with resolution to the detriment of overall quality, response rate (try to beat a CRT's lack of delay... millions of early flatscreen TVs introduced input lag to the world), etc made a whole generation of people blind to everything else, because suddenly only that one metric mattered.

It's a bit like watching people buy the shittiest digital cameras just because the megapixel count on the box was higher, without even understanding lenses or other features.
 

rofif

Can’t Git Gud
I think the jump to the Xbox 360 was incredible. That console felt next-gen. Games looked fkn incredible in 2005. It was hard for me to believe as a PC player at the time.
Gears of War later?! Holy shit.
 

charles8771

Member
Consoles always had limitations, dude.

The PS2, for example, could only run games at 512x448/480i without AA. By 2000, 800x600 was very common among PC users.
By 2004/2005 the PS2 was badly outdated compared to PCs, yet it still got games released.
The PS3 was supposed to launch in 2005, but got delayed a whole year because of the Blu-ray drive.

When you develop a game exclusively for fixed hardware (a console) versus developing a game for 900 possible combinations (PC), it's a very different story.
 

Drew1440

Member
The Xbox 360 would have shipped with 256MB if it weren't for Gears of War.
Meanwhile, the PS2 was the leading platform for developers and had 32MB of RAM five years before the Xbox 360, so 512MB was plenty for developers.




Sony announcing the PS3 with 512MB combined RAM probably had an impact.
 

Xellos

Member
Absolutely right that the early generations of flatscreens were simply trash compared to a decent CRT, and yet no one cared because "resolution" became the only selling point. Sure, everything had more pixels, but suddenly every home was infested with horrific color depth, extremely poor blacks, etc.

For so many years, almost everyone who made the HD/flatscreen jump was simply putting something in their living room which was worse than the previous generation of CRTs on every possible metric except pixels, and I'll go to the grave on this point. The obsession with resolution to the detriment of overall quality, response rate (try to beat a CRT's lack of delay... millions of early flatscreen TVs introduced input lag to the world), etc made a whole generation of people blind to everything else, because suddenly only that one metric mattered.

It's a bit like watching people buy the shittiest digital cameras just because the megapixel count on the box was higher, without even understanding lenses or other features.

So true. I bought a 720p DLP rear-projection TV back in 2003 and it was the worst consumer electronics purchase I have ever made. Terrible buzzing sound from a few months in, and 480i games looked pretty bad, so it kind of ruined the PS2 for me. Always wished I had gone with an HD CRT or a 720p Pioneer plasma. It wasn't until 2011 that I finally got an LCD I liked. Still have that TV; it's built like a tank compared to modern flatscreens, and the picture still looks great.
 
I didn't think I would, but I agree, especially early on. The first HD TVs were awful and looked so much worse than CRTs. The ghosting was terrible. We probably could have gone to enhanced-definition CRTs as the standard instead. A lot of games were below 720p, with many in the 500p-600p region, like Halo 3. Many are commenting that the jump to 720p was worth it, but the funny thing is half the games ran below 720p and at sub-30fps.
 

Hudo

Member
And nowadays it isn't a scam? They charge you $70 for incomplete, bug-ridden messes and then have the gall to ask you for even more money with microtransactions, season passes, battle passes, etc.
 