This chip costs $10k because the heat sink included in the box is a boy who pours liquid nitrogen.
yeah, he comes with all of them
The boy is a clone of der8auer's son.
Dang 😂
AAAAAAAAAAHAHHAHHAHAHHHA
You know it's a powerful chip when it takes only about 3 seconds to complete the render.
Also the presence of Der8auer adds a 10% buff to overclocking.
How many cores does it need to render each box at once?
And imagine: if we want games with realistic graphics, it has to be rendered at least 30 times per second, along with all the changes in the world too.
@@szymufinek5236 That's what the graphics card is for. Those CUDA cores are basically lots of small cores. Except this type of render is raw: it draws everything from scratch instead of having frames already loaded in memory.
That said, how many cores would it need to render that 60 times a second?
@@BlueRice GPUs, even a 4090, also render way too slowly for anyone to make a game with real-world graphics.
Gay
Cinebench is my second favourite game, after Fire Strike.
'member unigine heaven? oohh i member
In a few years we will be measuring the score in fps
Cinebench is just Cinema 4D's render engine. It's a real-world benchmark.
@@martinbishop9042 No it's not; it gives an unfair advantage to x86 over ARM64. Cinebench is one crap benchmark for sure. Once they make it fair, I'll use it.
6 GHz across 96 cores is absolutely mental; dude has a quantum computer.
Dude that render time is insane holy crap.
1600 W is so crazy lol, awesome job!
Hell yeah der8auer in this makes it even more epic
1600W is crazy
Can cook food with it.
That's almost enough to trip an average 15-amp home breaker lmao jesus
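The math behind the breaker joke roughly checks out; here's a quick sanity check, assuming a standard 120 V North American circuit (and ignoring PSU losses, which would push the actual wall draw even higher):

```python
# Does a 1600 W CPU load approach a 15 A household breaker's limit?
# Assumes a 120 V North American circuit; actual wall draw would be
# higher still because of PSU conversion losses.
cpu_watts = 1600
volts = 120

amps = cpu_watts / volts
print(f"{amps:.1f} A of a 15 A breaker")  # ~13.3 A
```

On a 230 V circuit the same load would only draw about 7 A, so the joke is very much a 120 V problem.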
Congratulations! 🎉
Jesus Christ
I actually thought someone other than der8auer achieved the overclock, but of course he's there helping the team, gaw dayum.
That is REAL power!
It would make sense to rent these resources in a cloud-computing configuration. If we physically buy and build a 7995WX Threadripper PC at home, it would be pretty expensive. Cloud computing makes these processors more affordable!
With cloud computing you get like half of a core that's also shared across 12 other VMs.
@@theairaccumulator7144 Not true lol
There are vendors that will let you rent this system bare-metal; I think it's like $60-100 a day, so it's not that cheap.
Renting a new chip like this is going to cost you a fortune over the span of just a few years. Cloud computing makes sense when you want scalability and the ability to handle large computing spikes, not so much when you want to run it continuously.
This chip costs $2-2.50/hr in cloud computing. If you only need the extra power once a week (8-12 hours), it's affordable.
It's a monster. It does in 8-12 hours the same work that would take an i7 2-3 days.
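As a rough illustration of the rent-vs-buy trade-off being argued here, assuming the ~$2.50/hr cloud rate quoted above and a hypothetical $10,000 chip price (ignoring the rest of the workstation, electricity, and depreciation):

```python
# Break-even for renting vs. buying, using figures from the thread:
# ~$2.50/hr cloud rate, ~$10,000 chip price. Purely illustrative.
rate_per_hour = 2.50
purchase_price = 10_000
hours_per_week = 12  # one heavy weekly burst, as described above

breakeven_hours = purchase_price / rate_per_hour
weeks_to_breakeven = breakeven_hours / hours_per_week

print(f"Break-even after {breakeven_hours:.0f} rented hours "
      f"(~{weeks_to_breakeven:.0f} weeks of 12 h bursts)")
```

At one 12-hour burst per week, renting stays cheaper than buying for over six years, which supports both sides of the argument: bursty workloads favor the cloud, continuous use favors buying.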
Absolutely insane
Intel's fastest chip can barely get half this score (~130k), and the real kicker? Intel's chip used over 1800 watts. AMD is SO FAR AHEAD of Intel in HEDT and server.
Intel is way, way behind AMD in the HEDT/workstation market. They will probably launch a 128+ efficiency-core processor just to claim they surpassed AMD on core count, but it won't even have Hyper-Threading and will score way lower than the 7995WX or Epyc Genoa/Bergamo.
Looking at the results, it seems that AMD intentionally set the power and clock margins wide for some reason. Is it because of the defect rate in the process?
I don't think HEDT is a thing anymore. It was a thing back in 2018 with early-gen Threadripper and Intel 10..X, but that segment went away. Now there are just regular desktops and servers. Nobody is spending $10K to get 96 cores on a desktop; that's a server. A $525 14900K is a pretty good "high-end desktop".
@@Tesseramous Plenty of workstation applications benefit from 96 cores. If it were a server, Epyc would make more sense.
@@Tesseramous A "14900K is a pretty good high-end desktop" is nothing for HEDT users, who want to run simulations like CFD, validation, etc. Why comment that "HEDT is not a thing" when you have no idea about the use cases? Cloud servers don't work here because you'd need to upload/download TBs of data for every simulation, wasting a huge amount of time.
How do the materials inside the chip, measured in nanometers, survive 1600 W? I know it's only for a couple of milliseconds, but bro.
As Jeremy Clarkson once said: THE SPEEEEEEEEEEEED!
Incredible scores chaps.💪😎💪
What a powerful CPU.
Yes, I need this for uni... Definitely
loool
Congrats!!
The Threadripper PRO 7995WX is the BEST.
You mean best heater?
Someday in the future people will be laughing at this because they can achieve this with their Ryzen 5 and the stock cooler.
More like 2060😂 or 2100
Can it run Minecraft with 2 mods at the same time? 🤔
You could probably run Minecraft in the L3 cache.
What do you mean, that CPU consumes the same amount of electricity in an hour that my entire house consumes in a month?
Here it's only 1800 watts for a couple of milliseconds; a regular kettle consumes more :)
Amazing
HARD TO BELIEVE, IT'S CRAZY THEY OPENED A PORTAL TO THE MATRIX, IMAGINE THE AI USING THAT POWER 💀
Were the R-DIMMs also overclocked for this? What frequency/timings?
Or just JEDEC spec?
Did they pour liquid nitrogen on it or something? How did they cool a 96-core CPU?
I didn't miss the Cinebench run, right? Like, it totally wasn't that fast, right?
Gaming
My PC takes 10 minutes to render 💀💀💀💀💀💀
This processor's idle power is twice my GPU's full-load power 😢
230W on idle wtf
It's all the onboard hardware, I think.
Well, this is the most powerful monster in the universe and on the planet 😂 This is the god of them all... an i9 is like a 1st-gen i3 beside it 😂😂😂😂 Full power goes to 1600 W and it idles at 230 W. Imagine: an i9-14900K at full load draws 250-400 W (more like the KS, which goes to 400 or more at full load), and this thing idles at nearly 300 😂😂😂😂 Only liquid nitrogen can cool that... that, or a vapor-chamber cooler, a high-pressure liquid loop, or air at 10-20k RPM. I still don't know if there's any cooler other than liquid nitrogen that can cool that beast down 😂 Imagine.
Maybe laser cooling will be for that
very COOL
We now live in a world where Cinebench finishes its run faster than CPU-Z takes to open lol
Roman is on the job... nicely done.
20x stronger than my 5600X
What's your score? I've got 10492 on my OC'd 3600
Gaming benchmarks when?
You don't game on these chips.
6 GHz on every core?
Yes
Can run minesweeper?
Can it run Crysis?
of course not
Where's the EPYC Zyzz music, where's the epic montage?? ☝️
😱
Madam Lisa Su gives life to outstanding babies! 😳💐
eks dee
Imagine how it would do on Windows 7...
My Celeron N4000 is just 6 watts 😂
This monster renders that in 3 seconds, while the i9-14900KS renders it in 20-30 s or more. Plus, this idles at 230 W and goes to 1600 W at full load, while an i9-14900K or KS idles at 40-60 W and goes to 300-400 W 😂😂 An i9-14900KS is like a 1st-gen i3 beside it 😂😂😂😂
But why???
My CPU runs at the same temp, just with the sign inverted lol
Bro, I didn't see the minus sign and I was so confused.
Imagine gaming on this PC, where you'd have to keep refilling and boiling off LN2 just to stabilize the temperature 😂 This thing is just for scores.
220W idle
It draws 200+ W at idle 💀💀💀 My whole laptop uses less than 170 W while playing Cyberpunk 💀💀
whytho
When you have to time the run by frames :-/
1602 W wtf...
An i5-4460: is that all it can do? Is it still okay? Yes.
About as stable as a medical patient
fuck ...
No games
Well, AMD is properly stomping on Intel in performance.
I'm a normie. Wth does any of this mean?
it's fast
very very big cpu very very fast and very consume hungry on power, need very cooling or big boom boom very hot
@@IGamer609 thankie mcspankies
*103°C on a CPU with a Tmax of 100°C*
-103°C
Oof. Terrible score for that many cores.
Intel hits 150K with 56 cores.
Threadripper is so far behind in actual per-core performance.
The highest Intel Xeon run with 56 cores is 132k. That's the same per-core performance as the 64-core Threadripper (151k score).
So no, AMD is not behind.
Typical Intel fan, or you're joking.
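The per-core claim in that reply is easy to verify from the scores quoted in the thread (132k across 56 Xeon cores vs. 151k across 64 Threadripper cores):

```python
# Per-core Cinebench throughput from the scores quoted above.
xeon_per_core = 132_000 / 56
threadripper_per_core = 151_000 / 64

print(f"Xeon: {xeon_per_core:.0f} pts/core, "
      f"Threadripper: {threadripper_per_core:.0f} pts/core")
```

Both land within a fraction of a percent of each other (~2357 vs ~2359 points per core), so the multi-core gap comes almost entirely from core count.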
Man, that thing is a monster. I really wonder about a 2-1.8 nm Ryzen Threadripper CPU packing at least 128-256 cores; that stuff would just decimate V-Ray and Cinebench. 🫣
Amazing.
Gaming