Should hardware-exclusive games become multiplatform?
Today I thought of addressing a matter I have been seeing online from my favorite community: the fanboys.
Just before I share my thoughts on this, I want to salute the fanboys for being the most hilarious people in the community; nothing gives me more pleasure than to see grown-ups fight over plastic boxes made by corporations. The level of loyalty and devotion to a computer box, to the point of life and death, is mind-boggling and amusing. Don't get me wrong, I don't approve of death threats, verbal abuse, or other forms of abuse, but I chuckle at things like "PlayStation 5 is more powerful than an RTX 3090" or "Series X is the king, it beat PS5 by 15fps" and so on. It makes my day.
Anyway, I think exclusives shouldn't be made multiplatform.
Thanks Insomniac for the assist, but unfortunately that has changed. Anyway, the answer to this is easy for me: if you look at the majority of PC port releases, they have been sad. Yes, they do get patches, but it shouldn't be that way for a PC consumer, especially considering the amount of money you have to invest to start gaming on a PC. Had developers not just been filling up VRAM and demanding high-end cards for their games, but instead carefully finding a way not only to release bug-free games but also to pay special attention to the plethora of GPU hardware out there, I would be all in for day-and-date exclusive releases on all hardware.
Look, you are free to draw your own conclusions, but if you look at the above PC system requirements for A Plague Tale: Requiem, do you think that makes sense at all? Without trying to be sophisticated, first question: are the PlayStation 5 and Series X more powerful than an RTX 2080? Are they even near an RTX 3060? So why do we even need a 3070 to run this game? This is where I can say I would love to drink whatever some of these developers are drinking. This very game runs on consoles that are around the level of an RTX 2060, let alone the Xbox Series S, yet here the consumer is being told we recommend a mid-to-high-end PC GPU, whose VRAM the game will again easily fill up for some weird reason.
Yes, PC game development should certainly differ from console development at a code level due to the different architectures and so forth, but the end goal is still to give everyone a similar experience, with those on better hardware able to scale the game up to their hardware level. Sadly, the reality is that if you have an 8GB GPU, you are kind of out of luck because of your VRAM. The weird part is that even if you have an RTX 3060, which I think is Nvidia's best midrange GPU (I would even say high-end, as it's better than the Series X), VRAM utilization is still weird: some games don't fill it up, yet the textures are at times a bit low quality even though the GPU is more than capable.
Put simply, the Series X and PS5 are 8K capable, so how come PC GPUs that are better are for some reason locked up as 1080p cards, incapable of, or rather falling short of, being recommendable GPUs? Yes, you may argue that they are basically giving a GPU range, but the recommendation is still weird in this case.
So, based on how the majority of PlayStation exclusives and other ports have released onto PC, I am not excited to buy games made for both PC and console. We do have exceptions, like the God of War (2018) port by Jetpack Interactive, but they fail to outweigh the troublesome PC releases.
The Curse of 30fps
If I may digress, there is yet another sad reality we have been pushed into: the curse of 30fps. Look, there has been a ton of reasoning behind why we should get 30fps for certain games, but while I am no game developer, with the coding knowledge I have I still believe, firstly, that you cannot code a game to have the hardware render everything in one go. So while some say RPGs cannot hit 60fps due to CPU limitations, some say it's the GPU, and there are a variety of other interesting reasons, I still believe (as the image above shows) that sometimes these studios simply choose to release a game at a certain resolution and framerate, especially for console players. I find this a bit of a turn-off, as present-day console hardware is still capable of higher fps, even if it means lowering the resolution and all the textures choking our poor console GPUs. PC players will easily understand this, as the majority still play at 1080p 240fps, and 500p 400fps, lol, but yeah. The point is you don't need to crank up resolutions only to sacrifice what matters most to people: gameplay. Then again, if you choose to play on PC, get ready for shader compilation stutters and VRAM bottlenecks.
Subscription Services
Another topic I want to talk about is subscription services. While they have their positives, to me it has reached a point where it's all about bundling a player with a myriad of games they don't even play. It gets sadder for console players, as you have to have at least one subscription plan to enjoy multiplayer and all the cloud services on offer, provided you keep paying for a plan in which you don't even play three quarters of the games offered. Say you don't want sub services on your console: you basically lose access to all multiplayer and other benefits. If you wish to gather the stats on what I have said, you can proceed and waste your time, because you hardly get people excited over playing a Gears of War title within a subscription plan; instead, people look forward to new releases (Spider-Man 2, Starfield, etc.), and streamers play them in order to obtain views.
In conclusion, I could go on and on. I would love games to be available on any hardware, but the problem we have is not the hardware but rather the developers, who seem to give little attention to the hardware in question, focusing instead on releasing a product that has mysteriously passed QA and having the consumer buy it only to patch it later. We live in sad times for gaming, where even if you have the best hardware, you have to fear that it won't matter, because the incoming game may well be a buggy mess at release.
Cheers