Cyberpunk 2077 Ray Tracing and DLSS Benchmark, What GPU You Need For 1080p, 1440p, 4K



Read more about Cyberpunk 2077➜ https://cyberpunk2077.mgn.tv

MSI Optix MAG274QRF-QD Gaming Monitor: https://msi.gm/GetOptix

Support us on Patreon: https://www.patreon.com/hardwareunboxed
Join us on Floatplane: https://www.floatplane.com/channel/HardwareUnboxed

GeForce RTX 3070 – https://amzn.to/34wLJTY
GeForce RTX 3080 – https://amzn.to/31K5tBS
GeForce RTX 3090 – https://amzn.to/3ouQ9Tu
Radeon RX 6800 XT – https://amzn.to/2HbNT2r
Radeon RX 5700 XT – https://amzn.to/2Js33xB
Radeon RX 5700 – https://amzn.to/2FYq5v5

Video Index:
00:00 – Welcome back to Hardware Unboxed
02:03 – Quality Comparisons
06:13 – 1440p Performance
09:44 – 4K Performance
10:43 – 1080p Performance
12:19 – DLSS Performance
13:11 – Final Thoughts

Read this feature on TechSpot: https://www.techspot.com/article/2165-cyberpunk-dlss-ray-tracing-performance/


Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed.

Disclosure: We get a small commission when you purchase products via our Amazon links. This doesn’t cost you anything extra; it’s a cut that comes out on Amazon’s end. But please be aware that the Amazon links provided are affiliate links.

FOLLOW US IN THESE PLACES FOR UPDATES
Twitter – http://twitter.com/hardwareunboxed
Facebook – http://facebook.com/hardwareunboxed
Instagram – https://www.instagram.com/hardwareunboxed/

Music By: https://soundcloud.com/lakeyinspired


44 thoughts on “Cyberpunk 2077 Ray Tracing and DLSS Benchmark, What GPU You Need For 1080p, 1440p, 4K”

  1. Thanks to everyone for the support over the last few days. We'll be talking about our recent kerfuffle with Nvidia in a video in a few days, so stay tuned. Until then, you can enjoy Linus blasting Nvidia a new one on the latest WAN Show.

    Reply
  2. Do you agree that Cyberpunk 2077 looks visually better using the high graphics preset compared to ultra? I've noticed that in some games, ultra adds certain features that are kind of an eyesore, and you have to customize the settings yourself to get your preferred look.

    Reply
  3. Will you be doing an optimized settings video for this game? Cyberpunk has a ton of graphics options, and I believe many people would love to optimize their visuals and performance. I absolutely loved the work you did with RDR2; I'm still using those settings to this day!

    Reply
  4. For anyone reading this comment, I'd like to point out that maybe we shouldn't call this game "ray traced". Calling it that is misleading, not on the part of reviewers (who use the name the companies gave the technology), but on the part of the companies producing the game and the hardware. I do rendering, even if only as a hobby, and I've done it since 2008, so more than 10 years, and I'm a little concerned about people thinking this game is ray traced. This game is rasterized, with a small part of the effects (or somewhat more than small, at the ultra ray tracing settings) calculated with ray tracing instead of rasterization.

    If we think this is pure ray tracing, we will stay like this for many, many years, and for every little effect added in RT we will have to buy a new graphics card. Whereas if the companies understand that "we know", maybe they will move better and faster toward pure ray tracing.

    To render a SINGLE FRAME with pure unbiased path tracing (which, as I understand it, is how you produce a truly ray-traced image with current technology), a high-end consumer PC, say a Ryzen 5900X and a 3090, with all the light sources and objects in this game, would take at least minutes using Octane (a very fast GPU path-tracing renderer) or other fast software. FOR A SINGLE FRAME!!

    So is it plausible that a game, even on high-end hardware, manages 40-50 frames per second when dedicated software takes at least minutes for just one? Obviously not.

    Now, it's not that they didn't make an effort, or that the technology is bad, but they shouldn't convince people this is the real thing, when it is maybe 1/1000 of that thing.

    For anyone who has doubts, please check various websites showing a full RT engine/image to understand the difference.

    Reply
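    The back-of-the-envelope ratio in the comment above can be made explicit. This is a rough sketch: the "minutes per frame" figure is the commenter's estimate, not a measured number, and the 45 fps midpoint is picked from their 40-50 fps range.

    ```python
    # Rough arithmetic behind the "1/1000" claim in the comment above.
    # Assumption (from the comment): a full unbiased path-traced frame of a
    # scene this complex takes on the order of minutes in an offline renderer.
    offline_frame_seconds = 2 * 60   # "at least minutes" -> assume 2 minutes
    game_frame_seconds = 1 / 45      # game runs ~40-50 fps with hybrid RT

    ratio = offline_frame_seconds / game_frame_seconds
    print(f"Offline path tracing is roughly {ratio:,.0f}x slower per frame")
    ```

    With these assumed numbers the gap is on the order of thousands, which is the commenter's point: a real-time "ray traced" game can only afford to path-trace a small subset of effects per frame, with the rest rasterized.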
  5. Nvidia is pushing hard for ray tracing, a technique that was never originally intended for real-time applications such as games, but rather for offline rendering. We will reach a point where the ratio between power expenditure and gains (financial or otherwise, from playing games) skyrockets, and we will have to seriously consider environmental factors, at least at the rate hardware has grown power-hungry over the last decade.

    Reply
  6. Great video Tim!!!
    I wonder how big a performance drop I can expect with an RTX 2070 and an R5 3600 vs your i9-10900K at 1440p?
    Anyway, ray tracing seems to be off the table for me in this game for now.
    Nvidia keeps talking about RT being the "future", blah blah blah, but from what I've seen so far, they did a pretty lame job improving RT performance on the RTX 3000 series.
    Seeing even the brand new RTX 3060 Ti and 3070 perform this badly at 1440p with reflections only is really lame and pathetic, even more so when you consider this is with DLSS.

    Reply
  7. I just gave a big wrap to Steve in the comments here: https://youtu.be/wdAMcQgR92k
    But I'm also appreciative of Tim and his awesome reviews. This one's another beauty.
    Hardware Unboxed is, in my mind, the best hardware review outlet available today.
    Tim's review of the LG UltraGear 27GL850 had me scurrying to grab one the day after I read it.
    Keep it up, fellas!

    Reply
  8. I'm a bit pissed with devs making games for "future hardware" instead of making them for, I don't know, current hardware! First it was the new Flight Simulator by Microsoft, and now 2077. I have a 2080 Ti and you're telling me it isn't going to run this thing at 4K above 60 fps? Excuse me? It's a $1,300 GPU, ffs.

    I know I can't run everything on ultra, which is fine, I don't need everything on ultra, but come on already. Enough of this crap.

    Reply
  9. Well, clearly an INCREDIBLY optimized game. -_-
    Considering the inconsistency of the graphics quality, needing a 3080 just to play at 60 FPS at ultra with RTX is ridiculous.
    And that's not even mentioning that you need to FIND one first.

    Reply
  10. Hey guys, do you do benchmarks for PC VR games? I'd be interested in the upcoming Microsoft Flight Simulator VR mode or current games like Star Wars Squadrons and Medal of Honor. I'm thinking of buying one of the new GeForce or Radeon cards and want to see what level of performance I can expect. I know PC VR is still not super mainstream, but a lot of people are getting into it, and there aren't a lot of people doing benchmarks in that space. Keep up the great content, been loving it.

    Reply
  11. I usually upgrade to a new video card for specific games, Witcher 3 and Cyberpunk to name two. At this point there isn't much of a choice for me. I had access to an XFX Merc 6800 and a Gigabyte OC RTX 3070 at retail, and the 3070 was the clear choice for DLSS and RT. That may change six months down the line when AMD catches up. $700 retail for an AIB 6800 is a bit much compared to $570 for the 3070.

    Reply
  12. Guys, stay strong! As a "gamer" I love your work, and to this day I use the settings you provided in the Red Dead video! If you could make one for Cyberpunk, I'd definitely watch it all! Stay strong!!

    Reply
  13. The game runs at 3-20 FPS at 1080p low on my PC (Ryzen 1700 + GTX 1060 6GB). I made a long, comprehensive thread on the CD Projekt Red forum complaining about and criticizing the company, and they reduced my thread to a comment; then, after I complained about that, they banned me until 28 December.
    Just shows how evil game companies have become.

    Reply

Leave a Comment