The RTX 3080 Benchmarks... do they even come close to expectations?
- Published: 15 Sep 2020
- The RTX 3080 Benchmarks are in... so do they even come close to what we expected??
Learn more about the Eclipse p500A and p500A D-RGB cases here!
Non-RGB - phanteks.com/Eclipse-P500A.html
D-RGB - phanteks.com/Eclipse-P500A-DRG...
○○○○○○ Items featured in this video available at Amazon ○○○○○○
► Amazon US - bit.ly/1meybOF
► Amazon UK - amzn.to/Zx813L
► Amazon Canada - amzn.to/1tl6vc6
Get your JayzTwoCents Merch Here! - crowdmade.com/collections/jay...
••• Follow me on your favorite Social Media! •••
Facebook: / jayztwocents
Twitter: / jayztwocents
Instagram: / jayztwocents
SUBSCRIBE! bit.ly/sub2JayzTwoCents
So I said in this video that the FE is $800 it’s actually $700.... facepalm. I’m so used to the $100 premium on the FE that I kept adding it to this generation... SO, all the extra performance comes at 12.5% less money vs 2080. That’s an even bigger oooof for 20 series
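Jay's correction above is easy to sanity-check: the 2080 FE launched at $800 ($700 MSRP plus the $100 Founders Edition premium he mentions), while the 3080 FE is $700. A minimal sketch of that math, using only the prices stated in the comment:

```python
# Sanity check of the pricing correction in the pinned comment:
# 2080 FE launch price vs 3080 FE launch price.
rtx_2080_fe = 800  # $700 MSRP + $100 Founders Edition premium
rtx_3080_fe = 700

savings_pct = (rtx_2080_fe - rtx_3080_fe) / rtx_2080_fe * 100
print(f"The 3080 FE costs {savings_pct:.1f}% less than the 2080 FE")  # 12.5% less
```

Which is exactly the 12.5% figure in the comment: more performance for an eighth less money.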
Nice vid
go to sleep
Its gonna be tough for AMD oof
Nice coc
Obama says that PewDiePie has bigger pp than me. see it on ma channel before it's recommended after 10 years. Let's make this come in next liwiay
2010: my pc has a gpu
2020 : my gpu has a pc
Womp, womp, woooomp!
🤣
2033 my phone has GPU
@@overtoke phone has gpu since 1st color phone, i mean to say changeable GPU
2040: my brain has a gpu
Now all those RUclipsrs can play Among Us at max settings.
Bruh... 😂
AMON
they'll have their free of charge 3090's for that.
But can it run Among Us?!
Smells susy
RUclips is getting real comfortable with these double 15 second ads
Premium is worth it
ADBlock Plus is worth it
@@kushagravashista7133 on the phone?
@@bmw_fantopdrives5501 nah i'm on pc, but if you wanna block youtube mobile app ads you can invest in a raspberry pi zero w($10) and use it as a pi hole adblocker.
RUclips vanced is the way to go
Sooner or later, Jay's gonna mumble Jaytracing.
JTX 3080 ti available soon for Minecraft!
sametaor MineJraft*
Make jaytracing a thing!!
yes
lol
Jay: "If you have a 4770k it's time to move on"
Me and my PC: *awkward eye contact*
DONT!
I switched from a 4790k that I was overclocking to the 8700k a while ago and the difference is massive in 1080p games. Especially games like CSGO where clock speeds dramatically improve not only the FPS but help with framedips and drops due to smokes and whatnot. If you have a microcenter by you where they are doing the bundle deals, it's definitely worth the upgrade as well as getting better and faster ram to work with in the new setups. That being said, you might want to wait until Intel drops their new line (Whenever that is) that has motherboards with PCIE4 so you have the newest and best technology with the new style of GPUS coming out.
My 3570k lol
THIS HURT ME
Bruh I've had a 2080ti since launch but I'm still using an i5 2500k lmaooo
Me: **looks at my 980ti**
980ti: **sweats nervously**
Same here, i'm glad I'm still at 1080p/60FPS target, otherwise the card is going to have a bad time :D
My rx580 has already prepared his testament xD
I am waiting for summer next year to replace him but he has already said it would be fine... that he had a fun 3 years and was tired of working so hard xD xD xD
975
At some point I need to bring this to my 1070 too..
The 980ti is still a pretty dang good card actually.
"How the hell did they do it?"
One aspect not mentioned, to boot: 320W vs 250W. Not only is it a good cooler, it's cooling much better despite a higher thermal output.
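To put that comment in numbers: the jump from the 2080's 250W to the 3080's 320W board power is a sizeable increase in heat the cooler has to dissipate. A quick sketch, using the two TDP figures from the comment above:

```python
# How much more heat does the 3080 FE cooler have to handle vs the 2080 FE?
tdp_2080 = 250  # watts, 2080 FE board power
tdp_3080 = 320  # watts, 3080 FE board power

extra_watts = tdp_3080 - tdp_2080
extra_pct = extra_watts / tdp_2080 * 100
print(f"{extra_watts} W more to dissipate ({extra_pct:.0f}% increase)")  # 70 W more (28% increase)
```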
Me watching this video on my 660Ti.
660Ti: Hey listen buddy, we can talk abo-
When did you buy this ?
Lol my gpu
Riicho Bamin its a joke. Ever heard about them?
🤣 I can't even imagine.
"... we can talk ab *proceeds to crash* "
I can now finally let my 1050 ti retire with full honors
fr fr
Thank God I'm not the only one.
Fr dude xd
Mind selling it to someone using gtx 660? 😅
Muhammad Waqas same here loool
Hey thanks for a great video! Would have been interesting if you had added the previous previous gen gpus as well, gtx 1080ti, so we could see just how big the leap is to the 3000 series.
The benchmark instrumental SLAPPSS! great video, jay!
18:32 “It doesn’t take rocket surgery to understand this”
~ JayzTwoCents 2020
Rocket surgery for Optimus Prime's slightly sprained rocket thrusters lol
I read this comment literally as he said it...spookie
Its a old meme
Timestamp?
SubZero added it
When he said "But speaking of FPS..." I believed he was going to talk about any sponsor just like Linus. Hahaha.
"But speaking of FPS... your FPS will be almost as smooth as this segue to our sponsor..."
@@thekraken8him HAHAHAHA!
@@thekraken8him Now Linus will have to pay to use this one Kappa
No one:
Linus: *LTTStore.com*
Hahaha me too, I was ready to skip it 😅
A good follow up video would be one testing what the minimum CPU is needed that wont bottleneck the GPU of the 3080.
LMAO! Man the way you start out your vids cracks me up!
but can it play solitaire ?
Stfu
that might be a stretch
can it play pinball simulator 2020
How about minecraft
it's not funny anymore
My 1070 watching me seeing this: Hey dude, we can work things out
If you have 1440p or 4k monitor right now, it's already time to dump that GPU. 🤣
Going to stick with mine another generation. Think he can make it lol
Mazlan Mine really does fine for the games I play at 1440p. Most demanding game I’ve played lately was Fallen Order and even then it was still okay.
@@PHOBIAx57x if you're at 1080p you for sure can. I've got a 1070 as well and shes holding up just fine
Fellow 1070 brothers, 1080p and 1440p are our friends, we can keep our card for a few more years
Thanks for helping a guy from Minnesota out with his problems. Vice Grip Garage fan🙂👍
Jay I watch your videos and learn more about computers and how they function, for the first time in my life I’m leaving consoles and going to build a PC. I wanna say thanks for the videos .
I'll see you guys over at Steve's for his remorseless tear down of the card LOL.
Kmm.
Link?
@@jacka6497 ruclips.net/video/oTeXh9x0sUc/видео.html
Steve said "impressive" which means im impressed.
"I'm gonna be a customer just like you, buying a 3090"
Lmao I wish
I must say loved your review. Been following you for a long time. It was nice to hear you say about upgrading other components to not bottleneck this GPU.
I currently am running,
i7 - 4770K on a Asus Maximus Extreme with 2 GTX1080Ti's in SLI
From all my testing I knew I was bottlenecking my system when I run games at 1080p, but not so much at 4K. Hearing you basically confirm my reasoning to upgrade now has me planning that, but we'll see what to upgrade to based on a budget that won't bottleneck this card or the next, depending on availability once the demand cools down.
The music on the benchmarks reel is dope.
12:57 shot
Is that the nano? (love your vids)
Bro I love your vids❤️❤️
And you are funny af
Hi Nano!
Lol what was that about? Editing error?
@@qpSubZeroqp censoring 'shit'
HEY THERE’S STILL TWO MINUTES TIL THE NDA LIFTS GET HIM BOYS
FBI OPEN UP!
WE KNOW WHAT YOU DID
I'm telling Jensen!!
Rishav Chakravarty The room sure isn’t after you start using the card
@@rishavchakravarty3577 i skipped forward, am i going to jail or will he? lmao
I really appreciate the fact that Jay uses Celsius as the main temperature unit, even though he's American and should be talking in freedom units.
Are you going to share that playlist. I have to keep watching the video to hear that banger
got recommended, never clicked so fast
Oh my god, what a motherlode of review videos to go through, my willy is out
@@paullasky6865 ohhh yeaaahh
god damn same i HAVE TO SEE THIS
Same bro
AMD and Nvidia competing fiercely.
*Intel: so boys 14nm is tha tang now days.*
How rude. Its 14nm+++
Intel: Yeee boii we got them 14nm better than 7nm totally I mean its higher
Jeff 'the gamer' Boy its 14+++++++++max
Intel: 14nm is higher than both 7 and 8 so it means it is much better!
14nm +++++++++++++++++++++++
Loved this. Your sense of humor and 'realness' is refreshing. Bravo!
Loved that intro, made me laugh XD
When he mentions your exact cpu as one that needs to be upgraded...
Yeah he called out my 4790K, but that baby has been running faithfully for years. Plus I dont want to buy a new mobo AND DDR4 RAM along with it D:
@@Brayconic That's what u get for waiting too long. Give it a few more years and you can just dump that entire rig :D
@@Brayconic 3700x on sale after the 4000 announcement + an X470 Motherboard and 50 dollar Crucial 16GB 3200 xl16 Ram and you are in a different ballpark :D
@@Brayconic 9700k here and still considering upgrade. lol
meanwhile im here with a r5 1600x
20:01 "How the hell did they do it?"
Area 51 Raid baby! We should raid Area 51 more often to get these kinds of performance spikes.
Lol, 50% in 3 and a half years is not a spike
Do you play eve?
redesign and work on other tech...
@@Max-fw3qy your mother
@@Max-fw3qy stfu dude, or we won't get another performance spike like this one next generation 😡
Ya know what...
"Zotac hottin the bottom."
Fits perfectly with your beats during the benchmark test displays
I have bad anxiety n stress and when I watch your videos it calms me down
"Get as much information as you can..."
Its going to be a long night
does anyone know when *exactly* they are going to release it? at what time?
@@TheGevamadar 6am pst
How did you find out what time it was coming out?
@@lordscoobus4239 r/Nvidia
I dont need sleep O.O I need answers
This will be a nice card to have when it is available after the rush...in 2022
lol. Well then, even the people that thought they got ripped off buying the 2080Ti would have upgraded by then anyway. Maybe everybody wins except those that wanted the next gen right away. ;P
There will be a new generation of cards out by then dumbo.
@@k12rising r/woosh
You will still be able to pay 1,000$ for a 3070! From dodgy websites
@@k12rising It's a joke you dumbo r/whoosh
one of my fav yt rn is u jayztwocenst
I really really liked the music Jay...
I always love how in Jay's videos it can be serious/technical talk but in more layman's terms followed by the occasional jingle keys in my face comedy that makes me laugh then back to serious talk, and i love that about this channel 😂
Uhm jay doesnt do seriously technical really.
@@dralord1307 its a broad general statement hence why I said "layman's terms"
most won't ever realize how difficult it is to pull that off. jay does it so well.
Thanks for the numbers.
I would have LOVED to see the numbers of a 1080 / 1080 Ti in there as a comparison because IMHO a ton of people (me included) jump from 10 to 30.
I opened your video for the 20 generation so here are the numbers:
These numbers are from Jays Video "RTX 2080 and 2080Ti Benchmarked - Is it worth it??". Numbers for 2080 and 1080 are for FE editions to have a rough comparison. :-)
Games not listed here are omitted because they were not tested back then or were not applicable because RTX is not available.
===
=========================
3DMark Time Spy:
3080 - 18125
2080 - 10924
1080 - 7312
===
3DMark Time Spy Extreme:
3080 - 9055
2080 - 4767
1080 - 3285
===
=========================
Unigine Superposition - 4K Optimized:
3080 - 14839
2080 - 8851
1080 - 6518
===
Unigine Superposition - 1080 Extreme:
3080 - 11172
2080 - 6738
1080 - 3911
===
=========================
Metro Last Light 4K:
3080 - 120
2080 - 67
1080 - 47
===
Metro Last Light 1440p:
3080 - 221
2080 - 140
1080 - 103
===
Metro Last Light 1080p:
3080 - 232
2080 - 192
1080 - 164
===
=========================
Rise of the Tomb Raider 4K:
3080 - 113
2080 - 63
1080 - 47
===
Rise of the Tomb Raider 1440p:
3080 - 181
2080 - 113
1080 - 89
===
Rise of the Tomb Raider 1080p:
3080 - 214
2080 - 146
1080 - 129
===
=========================
Shadow of the Tomb Raider 4K:
3080 - 85
2080 - 45
1080 - 32
===
Shadow of the Tomb Raider 1440p:
3080 - 148
2080 - 84
1080 - 61
===
Shadow of the Tomb Raider 1080p:
3080 - 173
2080 - 119
1080 - 93
===
=========================
Far Cry 5 4K:
3080 - 98
2080 - 59
1080 - 38
===
Far Cry 5 1440p:
3080 - 151
2080 - 107
1080 - 84
===
Far Cry 5 1080p:
3080 - 161
2080 - 120
1080 - 112
===
=========================
Ghost Recon Wildlands 4K:
3080 - 79
2080 - 55
1080 - 41
===
Ghost Recon Wildlands 1440p:
3080 - 127
2080 - 89
1080 - 69
===
Ghost Recon Wildlands 1080p:
3080 - 154
2080 - 107
1080 - 93
===
=========================
edit says: cleaned up formatting.
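From the scores listed above you can compute the generational uplift directly. A small sketch, using only the 3DMark Time Spy numbers from the list (3080: 18125, 2080: 10924, 1080: 7312):

```python
# Generational uplift computed from the Time Spy scores in the comment above.
scores = {"RTX 3080": 18125, "RTX 2080": 10924, "GTX 1080": 7312}

def uplift(new, old):
    """Percentage improvement of `new` over `old`."""
    return (new / old - 1) * 100

print(f"3080 vs 2080: +{uplift(scores['RTX 3080'], scores['RTX 2080']):.0f}%")  # ~+66%
print(f"3080 vs 1080: +{uplift(scores['RTX 3080'], scores['GTX 1080']):.0f}%")  # ~+148%
```

So for anyone jumping straight from a 1080, the synthetic-benchmark leap is roughly 2.5x.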
I too will be jumping from the 1080 to the 3080. And the how they did it... stolen Alien tech!! LOL
Well played sir!
this comment is gold
Breh...
Switching from 1070 to 3080
Great video Jay.
I'm really glad you included the 1080p resolution.
I'm kinda curious on the more affordable one: 3070
Only $200 more 🙃
The 3070 is ok but it has 7 flops over the Sd399806 using ATX 4 flipped on usb bios making it 16 better than the MSX 4 so really I would say wait a few decades for a better one as the GSli 9 is coming out in 2069.
@@maxmillion349 What
We'd obviously have to wait for a benchmark video but you could probably surmise it would land right in the middle of the 2080ti and 3080 in terms of performance boost. So probs 10-15fps better than 2080ti
@@maxmillion349 What
"How the Hell did they do it?"
Jensen put on some oven mitts and baked it all in. ))
in 3,5 years 70% perf improving ;)
He used his plethora of spatulas.
Answer: Rocket Surgery
man idk if you read these, but i just wanted to say i love your videos, I watched them back when i was building my Maximus Gene VII 4790K build. you're one of the few PC YouTubers i trust! you give the real scoop, so much experience, keep doing what you're doing!
Even just the few seconds intro deserves a like for the whole video! XD
I was saying the same exact thing: everyone's talking about 4k, but 1440p on AAA titles is looking SWEET. 4k is getting there, but the average FPS and 1%/.1% lows aren't proper yet.
And we still don't have anywhere near a mainstream (meaning whatever the current dominant resolution is on Steam polls) 1080p card that saturates 240 or 360 Hz monitors across 95%+ of all games. Even a mainstream card that does 144 Hz across 95% of games seems years away still.
Yep 2k is the standard now.
@@bassexbn3 For who? The dominant resolution won't be 1440p for at least another 2 years. Right now it's 1080p and it's got a comfortable lead. It's going to be a while until people can afford 1440 monitors and low to mid range Ampere and RDNA 2 cards in large enough numbers to close the gap.
@@Dennzer1 According to anyone with eyes. If you still play on 1080p you are just in the past. It's the most popular option because it's the cheapest but 1080 p is barely playable.
@@Dennzer1 How are TVs going 4k 120Hz, below 7ms and we still don't have some decent 4k monitors.
My 980ti knowing I’m watching this: honestly just get rid of me
My 980 is crying in it's case right now
980Ti Gang
1070 works fine for me, dont understand the pure herd mentality of needing the newest lineup for playing valorant or watching twitch streamers on a second screen.
boyo not everyone plays valorant and watches twitch only. Example: I upgraded both of my computers to a 2060 Super and a 2060 KO Ultra to play VR games. A 1060 was not going to cut it.
Dayum
Am I the poorest around here🤣
Asus 970 strix
Love to see a cast touching on the impact the 30xx cards are having in the on the Monitor/Display space (360Hz, 4k/8k, Size, Latency) and next gen ports HDMI 2.1 and the lack of DP2.0 on the cards. I would love to get in to the 8k space, buts not a cheap leap! :)
I really love your reviews, thanks man.
I liked this simply for the "fight me bro!" at 17:15. I love Jay starting with his fans!
yeah... except I'm pretty sure he's wrong: I've got a dual-CPU system (coolers: 2x Hyper 212) and the second CPU runs circa 10°C hotter than the first. And yeah, the air from the 1st cooler comes out "just slightly warmer", but the difference is pretty big. (I'm talking in airflow order; they are actually labeled in reverse, CPU0 and CPU1.)
That said, if you have an AIO or some overkill cooler that doesn't have problems cooling your CPU, it won't matter that much. Problems arise if your cooler is already at the edge of its cooling capacity, but in that case you'd probably be better off with a different cooler anyway.
Side note: this might be caused by the fact that I'm running the fans pretty slowly; if you strapped Delta fans to the CPU cooler, you probably wouldn't see that big an uptick in temps.
But my system can be pinned at 100% for 5 days or so (CPU Blender rendering) and I need to sleep in the same room... so there is that.
0:04 - when I'm in dark theme but a light theme app opens
Right side of brain: "let's watch some RUclips"
Left side : "You'll go blind..."
Right side: "... I'm fittin' to do it..."
Oh no papi
I like your game benchmark selection. Very helpful.
out of all the techtubers, Jay's team definitively got the most Chill and most Aesthetic Benchmarks presentation.
"it's just a playable benchmark".. Caught me off guard 😂😂
League of legends?
Is the new meme "but can it run Control?!"
@@Shudnawz But can it run Microsoft flight simulator 2020 on maxed graphics?
@@aethelon4144 No. No it can not. :D
"Wax on, wax off" - Mr. Miyagi 1984
"RTX on, RTX off" - JayzTwoCents 2020 3:46
Haha.... I swear as I was reading this he said it. Lol
And you don't even need a cloth to turn RTX on! 😂
Listening to this on my skullCandy crushers ... oh wait this isn’t a music video that beat was tight. Good video jay thanks for the video
great video. thanks for sharing.
The beat during the benchmark showings though....🔥🔥🔥
Str8 Fyre !!
The songs name is way up by nbhd nick. And this is just the instrumental of it
Wait so does this mean I can finally play Ark with more than 12 fps? O_O
Hah! That’s exactly why I’m excited for the 30-series: finally 60 fps 4k Ark
@@mitchell6679 more like 56 fps 2k ark
You play ark? Gross
It's not that hard to run
No. Ark will never achieve more than 12 fps.
Lord Jay was born a ramblin man ~ trying to make a vid and doing the best he can.
Well, a guy came over from VGG to show some support, and this is the latest video of yours. You are fun to watch, however I have no idea what you're talking about, because just turning on a computer is a nightmare for me. Anyway, I gave you a 👍
Lol, 2 minutes before embargo.
Going straight to hell ...... oh wait, its 2020 .....
banned
*Nvidia Lawyers intensifies*
Based
Takes him at least 5 mins to waffle and then the ads... 😂😂😂
4:41 WOAHHHH!! That's so clean how the RTX on numbers slides onto the RTX off XF
I want to see Jay get funky chunky supa fly stoopid fresh dancing to this beat
The intro 🤣🤣🤣 you are amazing
The light on jay's face at the start is actually ray traced lights 😂😂😂 #RTXON
Jaytracing
Yes finally a really good value gpu that I can't afford
Stop complaining
Motherlychild it’s joke chill
i thought that was always. wdym finally
Then just afford it ez
Bro i stg 😂😂
Thanks for the video dude, that was hella insightful! I was wondering if you benchmarked any multiplayer games, and how was the performance there vs the 1000 and 2000 series?
Outstanding methodology - an all time great review
You’ve done me well 970, Time for you to go buddy.
Wait, you burying it?
Even my 980ti is feeling old ):
Colmin thinking traditional Norse funeral.
The 970 has done its job since ‘14. I think you’re right, she’s ready to go.
Me that still has the 630
"How did they do it?" Answer: Alien technology
well it is 2020
Did they come from Venus?
@@playmaker451 that would be inferno
Aliens... (Insert ancient aliens meme guy here)
From the lizard overloards. It's why Jensen wears his leather jacket.
15:27
*stares at my old 1080p from 2010*
Im still on my good old syncmaster from like 2009
same
@JayzTwoCents Hey, great content, love your videos. Any chance you will be doing some benchmarks on the 3090 before launch?
8:53 "AMD and Intel take 10 times more money for one..."
How the times have changed.
Just wanted to express my appreciation of you using celsius scale for temperature.
@Proud Lover Of Cluelessness well,most of the world's population uses Celsius scale
@Proud Lover Of Cluelessness I do think in science they use metric system even in the US.
Proud Lover Of Cluelessness They used metric to get to the moon.....
@@christophmeyer2693 science and professional spaces. It's only for normie tier stuff they put inches/yard/feet. My US pc measures in Celsius by default. Miles are the only measurement that we basically never ever ever use the other measurement for.
Same
"Litten up." "Team Rocket" memes. By Jove, Jay's a secret Pokemon fan.
"Rocket Scie.. Surgery"
Thank you! I've been waiting 5 years with a 980 and a 4790 for the next gen in CPUs and GPUs, and you also recommend the upgrade.
Oof Jay is exhausted, I felt the passion in that "Fight me Bro!"
I think many are also interested how it runs with MS Flight Simulator 2020
Gamers Nexus reviewed it
And LinusTechTips
tastypc showed it
CPU bound mostly, it's Dx 11 right.
Gamer's Nexus determined that it was cpu bound in their methodology update
From all those times being told to wait, it has finally paid off :)
Music / song / track is Way Up by Nbhd Nick. What a song
Jay lying in bed at night: The 2080Ti's a good card dammit
Wife: yes Jay its a great card go to sleep
It really is. As usual Nvidia over hyped the 3080. The end result is that in-game benchmarks OVERALL show the 3080 is 20% faster at 1440p and 30% faster at 4K. For someone who bought a used 2080 Ti for $475, it's hard to justify spending $800 (after taxes) for a 3080, which in my case is almost 70% more money for 20% gain at 1440p. 🤔
@@JCTiggs1 You can't compare full price to a second hand bargain and call it bad value lmao. The 2080TI was $1200.
@@JCTiggs1 Also, 700 dollars, not 800.
@@JCTiggs1 Yeah the 2080ti is now at 475$ after the reveal
Ooo! 2 minutes before embargo! Lawsuit! 😂😂
ikr
@@dfsafadsDW Jay just wanted to the be the very first! Haha
His upload speeds probably was doing better than he thought today lol
These arent benchmarks. He did nothing wrong
@Jacob Zipp very smart😂
5:48 I did, I actually did like the choice of the music. lol
I'm dead just for that intro !!! xDD
"If you're running on a 4770K, yeah, It's time to upgrade" Me on a 2600K: ......
I think he was talking in reference to the two cards
Me on an i7-860... I'm doing the whole upgrade. Only good things on my setup is my monitor, keyboard, mouse, and headset.
I'm still on 4790k lol.
@@rohithkumarsp same here. With a gtx 1080.
Me on a i3 550........
Loving the RTX performance delta overlay on the same graph, good job Jay/Phil!
10:33 Well that didn't age well
#11 on trending!! congrats @JayzTwoCents !
That intro was 10/10 lol I replayed it over 20 times simple but hilarious !
Jayz: "I'm going to be a customer just like you"
Me: waiting for my 1660 super to arrive
Thats offensive
How is it offensif?
At least you've bought it already... I'm still saving up for that one. Probably it's going to happen like... next year.
Damn people really be taking having the opportunity to get a pc for granted tough....... This post was made by depressed 200 dollar xbox gang
Jay: Complains about working a 9 hour day at work times 2
Everyone else: that's my standard working day times 5
What if that just the testing? What about his video and graph edits etc?
i remember my 1st part time job
Lol when the pandemic hit I was working 12 hour days for 6 days a week as a package handler at UPS. Worked the Bulk line. I'm sure Jay can handle running benchmarks for 9 hours lol. Definitely gave me a chuckle
Jays schedule sounds like a vacation
It's my first time watching you ... gotta say, I really like you man
When i get one in 5 years i'm going to use the Y cable so you can see that glowing geforce logo
xD
Jensen Huang has enlighted you my friend
Honestly having a 2080ti evga ftw 3 I bought in June, I play 4K/60hz and I don’t feel like I’m missing out on anything, I guess my next purchase will be 4000 series, GG :D,
Edit - I got amd 3900x
Sorry to hear that bro. See you in 2 years. :)
F though
60hz is a slideshow.
Same but I got 2060S so I didn't oof myself but I'll be getting 4000 aswell or even 3000s super
You are missing 500$ in your pocket
Dat Rocket Surgery Photoshop tho lol
2020: Nvidia RTX 30 Series.
2020: Still using Nvidia GT 210.
I thought I was the only one still using the 210. Worked for 10 years, but need a new system.
its 2020 and i still use my GMA x4500 iGPU which is worse than GT 210
I still use my GT 630.