Read more about Cyberpunk 2077➜ https://cyberpunk2077.mgn.tv
Our Cyberpunk 2077 CPU benchmarks look at AMD Ryzen vs. Intel, including Ryzen 5900X, 5600X, i9-10900K, & 10600K chips and dating back to the R5 2600 and i5-8600K CPUs.
Sponsor: Crucial Ballistix RAM (Amazon: https://geni.us/eykEA)
Watch our Cyberpunk 2077 GPU benchmark here: https://www.youtube.com/watch?v=w4rgB2zb7dg
The Cyberpunk 2077 CPU benchmarks include new chips, like the AMD Ryzen 5 5600X, R9 5900X, Intel i9-10900K, and i5-10600K, and older CPUs, like the AMD R5 3600, R3 3300X, R7 3700X, R5 2600, and R7 2700, alongside the Intel i9-9900K, i7-8700K, and i5-8600K. We tested all of this with the latest #Cyberpunk2077 patch as of 12/13/2020. No additional manual modifications were made for this test – it is representative of the game as shipped. Testing methodology is comparable to our #Cyberpunk GPU benchmarks, so you can check the above-linked video for more information on that. For components, we used the standardized CPU benchmarking methodology we laid out for the second half of 2020. We tested the game at 1080p and 1440p, including the Low, Medium, and High presets (no ray tracing). Of course, as you move up from the lower presets, the GPU becomes more of a bind than the CPU, but it's still useful to see that data to understand at what point the CPU matters less. The biggest takeaway, though, is that frametime performance can be highly variable depending on the game's action, and it only becomes a problem on specific CPUs. As such, you'll need more than just bar charts to really see those issues emerge; we show them in our frametime plots.
The best way to support our work is through our store: https://store.gamersnexus.net/
Like our content? Please consider becoming our Patron to support us: http://www.patreon.com/gamersnexus
TIMESTAMPS
00:00 – Cyberpunk 2077 CPU Benchmarks
01:30 – FPS Alone Not Good Enough for Cyberpunk 2077
03:33 – Objective is Determining AVG Performance
04:35 – “But Why an RTX 3080?”
05:55 – 1080p/Low Full Scaling (Best CPUs for Cyberpunk 2077)
07:50 – Frametimes for i7-9700K (1080p/Low)
09:36 – Critical Frametime Spikes (i5-8600K)
11:05 – AMD Ryzen 3 3300X Frametime Stuttering (“Lag”)
12:00 – 1080p/Medium CPU Benchmarks for Cyberpunk 2077
13:14 – 1440p/Medium CPU Benchmarks (GPU Bind)
15:10 – 1440p/High GPU Bottleneck for Cyberpunk 2077
16:14 – 1080p/High CPU Benchmarks & Bottlenecks
16:55 – Conclusions
** Please like, comment, and subscribe for more! **
Links to Amazon and Newegg are typically monetized on our channel (affiliate links) and may return a commission of sales to us from the retailer. This is unrelated to the product manufacturer. Any advertisements or sponsorships are disclosed within the video (“this video is brought to you by”) and above the fold in the description. We do not ever produce paid content or “sponsored content” (meaning that the content is our idea and is not funded externally aside from whatever ad placement is in the beginning) and we do not ever charge manufacturers for coverage.
Follow us in these locations for more gaming and hardware updates:
t: http://www.twitter.com/gamersnexus
f: http://www.facebook.com/gamersnexus
w: http://www.gamersnexus.net/
Editorial, Testing: Steve Burke
Video: Andrew Coleman
This testing was conducted with the latest patch as of 12/13. Obviously, things may change as the game appears to be getting patches nearly every day, but this should get you started pretty well since it'd theoretically only improve from here. We did not perform additional modifications beyond downloading the latest patches as of December 13th.
We are still working with Eden Reforestation Projects to plant 10 trees per item sold on the GN store! Grab something here: https://store.gamersnexus.net/
Two comments, Steve: 1) I thought each core always allows two threads, so the i5-8600K in the chart at 7:42 looks off. 2) Why didn't you test with the new AMD 5950X?
Dying light 2 will bury this game.
I don't know what location in the game you were at when you tested this, but I'm getting nowhere near your performance with the 3080, and the 3600X seems to be very CPU-bound where I'm at. Were you testing in the Badlands?
Speaking of bottlenecks… do you think the AMD 6900 GPU will be bottlenecked on a Crosshair VIII Hero motherboard?
So graphically challenged just got demonetized.
Spread the word!
Just ordered my 10600K, time to replace my good ol' 4790K. You served me well. The tests here are promising for me, I guess.
Intel forever
How long till the thermals on the Xbox?
Running a 3600 and a 5700 XT, running well above 60 FPS at 1080p ultra with no motion blur.
[email protected] here, no stutters at 1440p mid with 1080ti so far.
What I thought before clicking is exactly what you said in the intro… bugs and optimizations… these tests might just be obsolete after a month lol
The game is so buggy that reloading the same save multiple times has different results almost every time in terms of what happens in the game. I doubt that every test was in very similar conditions because of this.
My i7 2600k stopped working when I was playing this game. I am now replacing my CPU…
Anyone know what CPU would be most comparable to my i5-8400? Got a 3070 but haven't installed it yet. Wondering how it will run this game.
No one cares anymore, Cyberpunk is a terrible game
Complete rubbish. Even if there were no bugs, it's just a crap game.
incredibly disappointed
I thought cyberpunk has real live game bug/lag
Have you guys at GN heard about the PC configuration file edits? There's a config file that contains the values for the amount of RAM and VRAM the game can use. The PC values are the same as the console values regardless of your system build, so for PC it's set to only 3GB of VRAM, and you can change it to whatever your GPU actually has. There is also an edit to the .exe you can do with a hex editor for Ryzen systems, which allegedly enables the use of all threads where the original .exe limits Ryzen chips, and not Intel chips, to physical cores only. I have no idea if these work, but when I tried the config file edit my loading times dropped drastically. In-game FPS did not change for me with the config file edit, but it did improve when I edited my .exe.
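For reference, the .exe edit this commenter describes circulated widely at the time as a one-byte patch. Below is a rough sketch, not an endorsement: the 16-byte pattern is the one reported in community posts and may not exist in other game builds, and `patch_smt` is a hypothetical helper name of mine, not part of any official tool. Always back up the executable before touching it.

```python
# Sketch of the community-reported SMT hex edit for Cyberpunk2077.exe.
# The byte pattern below is the one circulated in community posts at the
# time; it may not match later patches. patch_smt() is a hypothetical
# helper, not an official tool.

def patch_smt(data: bytes) -> bytes:
    """Flip the reported conditional jump (JNZ, 0x75) to an unconditional
    jump (JMP, 0xEB) so the check limiting Ryzen CPUs to physical cores
    is skipped. Raises if the pattern is not found exactly once."""
    old = bytes.fromhex("753033C9B8010000000FA28BC8C1F908")
    new = b"\xEB" + old[1:]
    if data.count(old) != 1:
        raise ValueError("pattern not found exactly once; different game build?")
    return data.replace(old, new)

# Demo on synthetic bytes rather than the real executable:
fake_exe = b"\x90" * 8 + bytes.fromhex("753033C9B8010000000FA28BC8C1F908") + b"\x90" * 8
patched = patch_smt(fake_exe)
assert patched[8] == 0xEB and patched[9:] == fake_exe[9:]
```

On a real install you would read the file with `Path.read_bytes()`, write a backup copy first, and only then write the patched bytes back.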
Why is there such a huge difference between the i7-8700K and i5-10600K? Weird!
O11 mini review on the way?
if you benchmark at night there are less electrons in the atmosphere to interfere with benchmarks.
An 8700K at 4900MHz is just about exactly what's on the charts at 1080p. I'm running it at 4K with a 2080 Ti, using Digital Foundry's settings, RTX off, getting about 90-110 FPS. Runs really well and looks amazing, even with all the bug complaints people have.
Running this at 4K Ultra with a Ryzen 3 3100 (waiting on the 5950X) and an RTX 3090… runs fine, but the framerate does get odd at times.
Going to replace my 8700K with a 5900X when they're available to allow my 3080 to be fully utilized. Hopefully it will be enough to last for the newly released consoles' cycle (6-7 years).
Damn… "Pretty basic stuff again for our core audience, but we are going to reach more people with this, so it's worth repeating that point." ROFL Savage. Most likely unintentional, but hahaha. I do this at work accidentally all the time. LOVED IT!
My 4c/4t i5-6500 is holding back my 5700 XT in a major way while playing CP2077. An i7-10700K and a new motherboard are on the way. I'm glad this video confirmed my purchase. I still put 40+ hours into the game regardless.
Worst thumbnail in your horrible thumbnail history. Well done girl.
why is the 9900k so much below the 10600k and 10900k here?
Benchmark your sweet locks
The game runs great even on an i7-3820 quad-core at 3.6GHz with a 2080 Ti on high settings, even though it's a poorly optimised game.
These benchmark results make no sense.
Think I have a pretty good idea why this game chugs like my guitar amp on many systems. Ever notice the dumb item information tab sticking to the UI until you pick the item up, or quest notifications still being displayed after the quest is over, and stuff like that? There's just a ton of unnecessary objects that remain alive in memory. So maybe they just forgot to kill off all that unnecessary stuff, including everything else. That could push pretty much any thread to its limit, and then it fails to feed the GPU for a few frames – that's probably how you get those 0.1% lows.
My game works really well on Ryzen 5 3600. I'm also very GPU bound with a Vega 56.
Am I the only one who appreciates the civilian AI running when the pedestrian light went red at 16:57? Not all of them do, of course, just like in the real world, where there are self-entitled idiots who have no shame stalling and wasting other people's time.
still no hex fix on this benchmark
Did you try the AMD performance boost fix? According to articles, there's an easy FPS boost from enabling hyperthreading manually or with mods: two simple fixes where people report significant gains. I'd like to see that vs. Intel. If you addressed this inquiry, I do apologize, lol. I skim through to the extensive data, benchmarks, and written reports on your site.
When the "Fuck You Nvidia" Video is coming?
"we will delay the game to make sure it's polished!" also… bugs exist.
It would be cool to see a test with an X79 machine and an i7-3930K to see if there are bottlenecks below 60 FPS (aka upgrade required).
Wondering if you can present the "lag spike" in a bar, like "Consistency %" or "Spike count" or something simplified, to make it easier for others to understand.
Still, either way, another great analysis, far better than what most other reviewers show (just FPS), though I will admit more and more cover at least the 1% lows, which is a nice shift.
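The 1% and 0.1% lows mentioned above, plus the suggested "spike count", can all be derived from a raw frametime log. A minimal sketch, assuming frametimes in milliseconds; note that reviewers define "1% low" differently (this version averages the slowest 1% of frames), and the 2x-median spike threshold is an arbitrary choice of mine, not a GN metric.

```python
# Minimal sketch: turn a frametime log (ms per frame) into bar-chart
# numbers. Definitions vary between reviewers; this averages the
# slowest 1% / 0.1% of frames. The spike threshold (>2x the median
# frametime) is an arbitrary illustrative choice.
from statistics import median

def summarize(frametimes_ms):
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    worst = sorted(frametimes_ms, reverse=True)  # slowest frames first
    k1 = max(1, n // 100)    # slowest 1% of frames
    k01 = max(1, n // 1000)  # slowest 0.1% of frames
    low1 = 1000.0 * k1 / sum(worst[:k1])
    low01 = 1000.0 * k01 / sum(worst[:k01])
    med = median(frametimes_ms)
    spikes = sum(1 for t in frametimes_ms if t > 2 * med)
    return {"avg_fps": avg_fps, "1%_low": low1,
            "0.1%_low": low01, "spike_count": spikes}

# 98 smooth frames at 10 ms plus two spikes at 30 ms and 50 ms:
print(summarize([10.0] * 98 + [30.0, 50.0]))
```

A single 50 ms frame in an otherwise smooth run barely moves the average FPS but dominates the 0.1% low, which is exactly why the video leans on frametime plots rather than averages alone.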
With my 2700X, it's enough to hold a stable 60 FPS without RT, but once I enable RT, the framerate drops even when the GPU is idle. Would love to see some CPU RT tests.
So you're trying to say my 5820k cant run this game?
I was wondering if the 10600K is holding back my 3090 at 3440×1440. The answer is not really, other than the 0.1%/1% lows, which are better on an i9-10900K.
Mkay, something seems to be completely off on my system. It's an R9 5900X on an ASUS Strix X570 board with G.Skill RAM at 3800 (Infinity Fabric at 1900) and a 3080 FE, and it runs pretty well in other heavily CPU-bound games like Flight Simulator 2020. In Cyberpunk though, a CPU benchmark at 720p low gives me average frame rates between 66 FPS in V's apartment and about 57 to 63 while driving on the streets (measured with both RTSS and GeForce Experience). Lowering crowd density changed almost nothing, same with applying the SMT patch for AMD (which is expected at 12 cores).
After lowering my output to 50 Hz, I was able to get mostly locked 50 FPS with Digital Foundry's optimized settings and ultra raytracing in 4k with DLSS Performance Mode, so at least the GPU part of my system seems to scale as it should.
Any ideas how to debug this?
Is the i7-9700KF good?
Ryzen CPUs were nerfed by the compiler, which seemed to target AMD Bulldozer. They would not reach 100% on all cores.