Read more about Cyberpunk 2077➜ https://cyberpunk2077.mgn.tv
A quick video showing what happens when you play Cyberpunk 2077 on my machine learning workstation equipped with an NVIDIA RTX A6000. The frame rate is quite good at 70+ FPS; however, what does the game do with my 24-core Threadripper, 128 GB of system RAM, and the A6000's 48 GB of VRAM?
My computer’s stats:
* AMD Threadripper 3960X 3.8 GHz 24-Core
* NVIDIA RTX A6000
* 128 GB (8 x 16 GB) DDR4-3200 CL16
* Sabrent Rocket 4.0 2 TB M.2-2280 NVME SSD
* 1000 W PSU
My video on the NVIDIA RTX A6000:
https://www.youtube.com/watch?v=85-K7qTSvS8
The NVIDIA RTX A6000:
https://www.nvidia.com/en-us/design-visualization/rtx-a6000/
** Follow Me on Social Media!
GitHub: https://github.com/jeffheaton
Twitter: https://twitter.com/jeffheaton
Instagram: https://www.instagram.com/jeffheatondotcom/
Discord: https://discord.gg/3bjthYv
Patreon: https://www.patreon.com/jeffheaton
Great video, Jeff! Do you have any machine learning / deep learning books that you would recommend?
That's way better than a 3090 jeez
Plz do more gaming benchmarks with it
0:33
Jeff, I can see that you are a very bad driver.
Hi Jeff, Willan here! I have a Ryzen 3600 and an RTX 3070, and everything runs perfectly! I'm a gamer and a data science student.
gpu-z (sensors tab) is pretty good for monitoring gpu under windows
#RealOperatingSystem 🙂
What's the screen resolution?
We know it will run cyberpunk, but how does it compare to the 3090?
Might I recommend some reinforcement learning to train an AI to drive in Cyberpunk 2077 for you, Jeff? Might be more interesting than traditional sandbox reinforcement learning. And yes… I'm just green with jealousy. I can't get my hands on an NVIDIA 30-series for the life of me.
nice video! your production quality has really increased over your past few videos.
Great video! I think it would be good and fun to run YOLO on it.
How would the A6000 compare to two 3090s with NV LINK (for deep learning)?
But will it mine crypto ? 😅
Great Video, maybe next video you can try to mine etherium coins with it
Have you got 3090 benchmarks to compare it to with workstation tasks?
I’m sorry but at 4:41 did you basically just say Windows is not a “real operating system”? 😂 I was like uhm wait what? Did he just slip that in there? Said it so matter of fact too. I like Windows but that was awesome.
We may need to start a WallStreetBets-style reddit for people who go YOLO and buy outrageous desktops with their own cash. That Threadripper plus A6000 combo is crazy insane. Sadly it's probably the only setup available for retail at the moment. We are more likely to win the Powerball lottery than score a 3090 at original MSRP at this point.
I am Ethereum mining on it now, will post an update next week! Subscribe, so you do not miss them. 🙂
For gaming the 3090 is still better. "While the Nvidia RTX A6000 has a slightly better GPU configuration than the GeForce RTX 3090, it uses slower memory and therefore features 768 GB/s of memory bandwidth, which is 18% lower than the consumer graphics card (936 GB/s), so it will not beat the 3090 in gaming. Meanwhile, because the RTX A6000 has 48GB of DRAM onboard, it will perform better in memory-hungry professional workloads."
https://www.tomshardware.com/news/nvidia-rtx-a6000-48gb-benchmarked
What's the resolution?
nice
Did you set the PCIe slot to Gen 4 in the BIOS? It allows the higher PCIe 4.0 bandwidth of 16 GT/s per lane.
Looks like 20 fps
$4,649 launch price. It has the full GA102 (628 mm²) core but lower memory speed: 16 Gbps vs. 19.5 Gbps. In summary, I expect the best-case scenario to be exactly the same performance as the RTX 3090, and the worst-case scenario a little bit slower. I wonder if that ECC memory is better (or worse) for overclocking, though!
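The bandwidth figures quoted in these comments follow directly from the per-pin data rates; a minimal sketch, assuming both cards use GA102's published 384-bit memory bus:

```python
# Memory bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8 bits-per-byte.
BUS_WIDTH_BITS = 384  # GA102 memory bus, per published specs

a6000_gbps = 16.0     # RTX A6000 GDDR6 per-pin rate quoted above
rtx3090_gbps = 19.5   # RTX 3090 GDDR6X per-pin rate quoted above

a6000_bw = a6000_gbps * BUS_WIDTH_BITS / 8      # 768.0 GB/s
rtx3090_bw = rtx3090_gbps * BUS_WIDTH_BITS / 8  # 936.0 GB/s

# Relative deficit of the A6000 vs. the 3090 (~18%, matching the quote above).
deficit = 1 - a6000_bw / rtx3090_bw
print(f"A6000: {a6000_bw:.0f} GB/s, 3090: {rtx3090_bw:.0f} GB/s, gap: {deficit:.0%}")
```

This reproduces the 768 GB/s vs. 936 GB/s numbers and the roughly 18% gap cited in the Tom's Hardware quote.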
That is such a gorgeous card.
If you aren't a gamer, then why do you have a Corsair case?
you need to do 8K demonstrations. That's where the memory really comes into play.