The RTX 4090 bursts onto the scene as the new king of consumer graphics cards. It’s big and beautiful, capable of pushing 4K …
29 Comments
Comments are closed.
I'm watching this with my Nvidia Ray Tracing Texel eXtreme 6069 TI X2
Does anyone want a kidney?
Plz IGN, hire more ppl like him, and less N00Bz like Narz!
Truly makes you feel like an RTX 4090
Can't wait to get one in 2 years after all of the scalpers get tired of reselling it.
Great review, well paced and informative!
It's definitely a video card!! ….
0:52 can do 144Hz at 4K? Whoops! Nope! The display outputs max out at 4K 120Hz …🤣 Nice one, Nvidia!
Ymir: this power is mine!
Imagine testing a graphics card with a test suite built around a 9900K in Q4 2022.
0:43 hahahahahaha. Nope.
He doesn't talk about how much power the card needs; it's a lot more.
An 8? Dude….an 8?!? C'mon….
But can it run Crysis maxed out?
Very expensive for me; I'll keep my 3080 Strix.
LMAO, buying this for Cyberpunk. There isn't a single game out there that's worth spending this kind of money on a graphics card for. Spend more than $400 on a g-card and you're a fool.
Hoping to get a hold of one tomorrow.
“Feels like a graphic card” -IGN
Guys, really, an i9-9900K in 2022 for benchmarks?
No 3D rendering benchmarks? Uh.
Watching at 360p on my outdated mobile, wondering what people go through to get their hands on one.
When a whole new series comes out and you can't even afford a PC that can use the RTX 2000 or 3000.
Imagine spending $1,600 on a GPU just to test out Cyberpunk, lmao.
Better to buy AMD than the RTX 4090 with its cheap DLSS 3 boosting. This is a disaster… hope AMD uses the advantage to dethrone Nvidia this gen.
Waiting for stamp size RTX 8160!
"… without absurdly outrageous prices." Honestly, idk about that 😅
The biggest question is:
Is Elon Musk team Nvidia or AMD when he hops on his PC for a session of Warzone 2?
Finally someone who knows about gpus
We have to rethink graphics processing. 🤒🙄