Here it is guys, Cyberpunk 2077 in all its glory at maximum settings. It seems the game requires future hardware. You are looking at the modern-day Crysis. I armed myself with an overclocked NVIDIA RTX 3090 and the 16-core, 32-thread Ryzen 9 5950X and still stood no chance 😂 Higher-resolution HDR settings will be available once YouTube has finished processing.
Frankly terrible optimization. For now we're just not ready for RT; it'll take another 2 or 3 generations of cards to reach decent performance. I can't wait to see how it runs on the next-gen consoles, they're going to melt on the spot.
My MSI Gaming X 6 GB works well on a 2560×1080 monitor. Enable triple buffering in the NVIDIA settings to stop screen tearing, and I can play on High at 40 fps.
This is my most disappointing game release in my 25 years of gaming. I built a $2300 PC just for this game and it sucks! Performance is shit on a 3080?? 2700X OC'd to 4.2 GHz, 32 GB of RAM!! The game looks and plays like a game from 8 years ago. The ray tracing is great but the lighting, textures etc. look bad. Combat isn't fun or fluid. I never would've dreamed 2077 would be this mediocre..
+1 Bang4BuckPC Gamer – Thanks for posting the Psycho RT version! WOW, stunning visuals, but [scratching my head] unfortunately combat looks beyond simplistic and linear, so really lame. LOL, and that little peashooter our character is using is so ineffective it seems to be firing foam Nerf bullets. Geez, so many headshots to take out just one bad guy! Still an amazing glimpse into this next-gen game!
Your HDR settings seem washed out on YouTube? Does that normally happen when recording your gameplay, or do you have to turn HDR off to record? I calibrated my HDR between the day and night cycles by observing the sun and moon: I made sure those celestial bodies stayed round and in focus while achieving max optimal nits on my display. It was a pain in the ass to get my HDR dialed in on my ASUS PG27UQ with this game. Thanks for sharing your gameplay experience.
How did you get your 3090 TUF to stay around 50 °C? Are you running the fans at 100%? I've seen other benchmarks where it stayed around 65 °C with fans around 75%. Also, Cyberpunk 2077 seems to fry GPUs on a daily basis haha – I undervolted my MSI 3080 Ventus OC (usually a pretty hot card, but quieter), adjusted the fan curves, and greatly increased the airflow in my case, which got temps down to 60-62 °C in most AAA titles, but Cyberpunk pushes my card to around 68-70 °C, like what the actual fk?! I don't want to imagine how hot it will get with less optimized airflow on stock voltage, or even OC'ed like yours. I'd appreciate any explanation for this!
Edit: My bad, I didn't see that you've got a custom watercooled system … big yikes.
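(For anyone comparing temps like the commenter above: a minimal sketch of how you might log your own numbers. It assumes `nvidia-smi` is installed and parses one line of its documented CSV output from `nvidia-smi --query-gpu=temperature.gpu,power.draw,fan.speed --format=csv,noheader`; the field names and sample values below are just illustrative.)

```python
# Minimal sketch: parse one CSV line produced by
#   nvidia-smi --query-gpu=temperature.gpu,power.draw,fan.speed --format=csv,noheader
# so temps/power/fan can be logged while the game runs.

def parse_gpu_stats(csv_line: str) -> dict:
    """Turn a 'temp, power, fan' CSV line into numeric fields."""
    temp, power, fan = (field.strip() for field in csv_line.split(","))
    return {
        "temp_c": int(temp),                 # e.g. "68"
        "power_w": float(power.split()[0]),  # e.g. "312.45 W"
        "fan_pct": int(fan.rstrip(" %")),    # e.g. "75 %"
    }

# Example with a made-up reading:
sample = "68, 312.45 W, 75 %"
print(parse_gpu_stats(sample))  # {'temp_c': 68, 'power_w': 312.45, 'fan_pct': 75}
```

In practice you'd run `nvidia-smi` with `-l 5` to sample every 5 seconds and feed each line through this.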
The fastest consumer card on the market: **cries in agony**. Yep, even if the game isn't the "game of the decade", it will surely be a benchmark game for GPUs throughout the decade.
Does anybody have the best settings for an RTX 3070 at 1440p, please? It's so blurry with the default settings…
How did you get your FRAPS working when it doesn't work for most people?
I'm getting the exact same fps with a 3080 and a Ryzen 5 3600XT, LOL
Now let's overclock the GPUs and CPUs!!!
Those lens flares in the car are super distracting and totally unrealistic.
my god that gpu power draw though
For those using AMD GPUs… you guys are gonna get a flatline or DOA when playing this game… I feel sad for those who bought a DOA 6900 XT.
How lucky is NVIDIA… they barely pulled this off in time… back then it was Crysis… neither company managed it until years later.
sometimes it looks better than a movie
@0:50 – @1:08 why are there no shadows under any of the NPCs?
11:07 Bruh
You're playing at 26 fps at times and it's smoother-looking than mine at 45 fps, smfh.
I have a ZX Spectrum 1080ti and this game runs amazing once the tape has loaded !
I've got extremely similar hardware (same CPU/GPU/motherboard at least), how does it run with the same settings (with DLSS on quality) at 1440p?
32 fps.
lmao.
Cyber-fail 2077
Me saying wow at how good 4K with ray tracing looks, on my 1080p monitor and GTX 1070.
Jesus, and I was pissed I couldn't get over 40 fps with my 1080 Ti. You're stuck at 35 with a damn 3090!!
I don't know if it's the game lagging or my laptop lagging from trying to play a 4K60 HDR video
Hmmm, the graphics look futuristic but at the same time old, lmao.
So this is how the game is supposed to look. Mine runs like a PowerPoint presentation at 1080p medium on a 1060.
DLSS is smart as hell
imagine without dlss lol
RTX 3090 = $1500+ = 30 fps @ 4k in CP 77 = Uncle Jensen laughing with pockets full of $ (8k gaming!)
f**k my 3090! This game needs the NVIDIA 4000 series to get a smooth 60 fps… f**k man!
OMG, a card for 1500 USD can't handle 50-70 fps at 4K. That's a hard game; better to stick with 1440p or 1080p gaming instead.
Say whatever you want about bugs and stuff, I'm still loving this game.
the game is for 2077 GPUs.
I'm pretty new to the whole PC "max settings ultra hardware" scene, but what exactly is the difference between "dedicated" & "allocated" VRAM?
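(In rough terms: "dedicated" VRAM is the physical memory on the card, while "allocated" VRAM is what the game and driver have reserved, which can exceed the dedicated pool; the overflow is typically backed by much slower shared system memory. A toy sketch of that split, with purely illustrative numbers and no real driver API:)

```python
# Toy sketch (illustrative only, not a real driver API): when a game's
# allocation exceeds the card's dedicated VRAM, the overflow is typically
# backed by shared system memory, which is far slower to access.

def split_allocation(allocated_mb: int, dedicated_mb: int) -> dict:
    """Split a VRAM allocation into the part resident in dedicated
    VRAM and the part spilling into shared system memory."""
    in_dedicated = min(allocated_mb, dedicated_mb)
    return {
        "dedicated_mb": in_dedicated,
        "shared_mb": allocated_mb - in_dedicated,
    }

# e.g. a 10 GB allocation on an 8 GB card spills 2 GB into shared memory
print(split_allocation(10_240, 8_192))  # {'dedicated_mb': 8192, 'shared_mb': 2048}
```

That spill into shared memory is one reason fps can fall off a cliff once a game's allocation passes the card's dedicated capacity.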
3090 RTX but I can't see every grain of dust on the floor.
What do the developers use to test the game?