China doesn't want me to have this GPU - Moore Threads MTT S80
- Published: 23 Jul 2024
- Get 69% off any of XSplit’s video tools. Use code LINUS at lmg.gg/XSplit
Create your build at www.buildredux.com/linus
China is tired of the US lording its semiconductor knowledge over them, which is why they've invested over a trillion dollars in tech projects to make the nation more than just a manufacturer: a full-fat chip-maker. Meet the Moore Threads MTT S80, made with Chinese silicon. It isn't AMD, Nvidia, or even Intel behind the chip in this beauty... but how does it perform? Let's find out.
Discuss on the forum: linustechtips.com/topic/14934...
Purchases made through some store links may provide some compensation to Linus Media Group.
► GET MERCH: lttstore.com
► LTX 2023 TICKETS AVAILABLE NOW: lmg.gg/ltx23
► GET EXCLUSIVE CONTENT ON FLOATPLANE: lmg.gg/lttfloatplane
► SPONSORS, AFFILIATES, AND PARTNERS: lmg.gg/partners
► OUR WAN PODCAST GEAR: lmg.gg/wanset
FOLLOW US
---------------------------------------------------
Twitter: / linustech
Facebook: / linustech
Instagram: / linustech
TikTok: / linustech
Twitch: / linustech
MUSIC CREDIT
---------------------------------------------------
Intro: Laszlo - Supernova
Video Link: • [Electro] - Laszlo - S...
iTunes Download Link: itunes.apple.com/us/album/sup...
Artist Link: / laszlomusic
Outro: Approaching Nirvana - Sugar High
Video Link: • Sugar High - Approachi...
Listen on Spotify: spoti.fi/UxWkUw
Artist Link: / approachingnirvana
Intro animation by MBarek Abdelwassaa / mbarek_abdel
Monitor And Keyboard by vadimmihalkevich / CC BY 4.0 geni.us/PgGWp
Mechanical RGB Keyboard by BigBrotherECE / CC BY 4.0 geni.us/mj6pHk4
Mouse Gamer free Model By Oscar Creativo / CC BY 4.0 geni.us/Ps3XfE
CHAPTERS
---------------------------------------------------
0:00 Intro
1:13 Who's Moore Threads?
2:15 Unboxing and First Impressions
5:39 First Boot
8:06 Gaming
15:16 Teardown
16:36 How did they make it?
20:25 Outro
"Took them 2 years to make a GPU and you 3 years to make a screwdriver" - Fuckin gottem.
TBF the LTT screwdriver is great and the GPU is trash even compared to the Arc dumpster fire.
Had me dying laughing
🤣🤣🤣🤣🤣🤣🤣🤣
@@naamadossantossilva4736 Oh yeah? What does that screwdriver cost again?
@@naamadossantossilva4736 tbf a screwdriver is just plastic with a steel rod; a GPU is much more complicated
15:09
Ah, finally a GPU brand that you haven't dropped a card from.
yet
Has he dropped a Matrox card yet? Hmm...
Chances are low, but never zero.
He can't take chances
😂😂🤣
AMD's Polaris and Vega were developed out of their Shanghai office, so China does have enough talent to make the hardware. In fact, any university-level computer architecture course can teach you how to make a beefy GPU. But the secret sauce is the optimization within the drivers. If you can't fully utilize the GPU, the performance/watt is going to suffer. I suspect their drivers don't implement the DirectX APIs 100% at any level, which is why they only support certain games.
It's not about talent; China has a lot of it in the computer science and R&D space, but they need access to parts that take a lot of time to create. AMD and Nvidia are working on projects right now that won't see the light of day for another 2-4 years; it will take them time to create everything. There's a reason Intel has used x86 for decades.
They also need driver support in application and games which requires cooperation on both the developers and the card makers which isn't an option for everyone.
Intel has been working on GPUs on and off for a decade and they still have driver and support issues with games a year after launch.
I believe they legally can't support DirectX. That's Microsoft property, and unless I'm remembering wrong, Microsoft is one of the companies that was banned by China during the trade war. Though I would imagine I'm probably wrong.
@@kelmanl4 Yes, it does take time to come out with the silicon, but drivers imo are the toughest part. Even Nvidia and AMD have to release patches or game-specific fixes and optimizations constantly. DirectX, OpenGL, and Vulkan are not going to be easy to implement from scratch, and imagine doing it across multiple versions.
Intel or AMD could easily drop x86 and go for a RISC ISA. The reason they don't is that there is still huge money to be made in backwards compatibility. Intel has proposed dropping all the legacy 32-bit rings and 16-bit real mode support from future CPUs, though; hardly anyone these days is booting DOS or a 32-bit OS. And x86 is updated every so often with new SIMD instructions; it is still very alive and evolving.
Aha yeah, they don't want you to have it unless your card is full of social credit from Alibaba.
Or steal IP.
The good news is that if Moore Threads can actually become a contender against the heavy-hitting GPU manufacturers, their products could help keep prices down in the high-end graphics card market.
Especially when you consider it's China; their products tend to be insanely competitively priced.
They might only care about the Chinese market, which is why they made a card similar in performance to a 3060. Because 99% of gamers in China only play shitty MOBAs that can run on a 750 Ti
@@st.altair4936 Just imagine that any software you use in the future will need to be censored by the Chinese government before it can be optimized accordingly.
@@DI-ry5hg I've used Chinese phones, none of the memes actually hold up. It's literally just a phone. There's no reason to think their GPUs would be different.
@@DI-ry5hgsomeone will figure out a bypass just like they bypassed nvidia's limited hashrate
I think what the recent newcomers into the gpu space have shown us is how important driver support is.
Of course, Nvidia always wins over AMD with drivers; and btw this is just a 2-year creation. Give them more years and they'll flood the market with lots of good midrange GPUs, like they did with phones.
I was thinking the same thing. No proper drivers for this GPU are available yet, or rather not yet available to consumers. Or maybe the review unit Linus is investigating shipped with drivers that haven't been released to the public yet.
Nvidia has a decades-long policy of tuning for specific games in their drivers, even going as far as baking optimized shaders for specific games into the driver. Must be an absolute nightmare for AMD or Intel, having to try to support all that old spaghetti of legacy optimizations.
This isn't news. Anyone who had an ATI card in the late 90s or early 2000s knew this!!
Not only newcomers... Nvidia's driver still sucks big time on Linux.
Imagine a Chinese secret agent watching Linus drop the GPU and break their hidden spy camera.
ahahahhahahahahahahahahahahahahah bro💀
I’d say they saw it coming.
@@wescrowther655 yea
lmfao
Dunno why the US government keeps emphasising China’s spying on everyone… isn’t this what the US has been doing to rest of the world?😂😂
I'm impressed; I didn't even expect this to work, because I'm pretty sure MTT's first focus isn't gaming.
Honestly the writing has been really good over the past year. I have had so many good chuckles
Finally, a GPU with the power connector in the right position
wdym?
@@NoNameAtAll2 Most (well, maybe all) GPUs have their power connector on top of the card, so unless you vertically mount the GPU, you always end up seeing the PCIe power cables in front of it.
The card in the video has it on the side by the fan, which kinda hides the cable from sight, which is kinda cool
It makes absolutely no difference, the cables are covered up by the side panels anyway and even if you’re running with the sides off for thermals you’re not spending any significant amount of time under your desk staring into the computer.
@@CycahhaCepreebha I think in EVGA's testing they found it affects airflow a little bit. Also it looks cleaner, and on some cards the wires will hit the glass panel. On the original O11 I couldn't close the side panel when I had the EVGA 1080 Ti Hybrid because it hit the power wires.
@@CycahhaCepreebha Bro do you use a fkin Lenovo Thinkcentre from the 2000's ☠️. Almost all modern cases have transparent glass and acrylic side panels in case (pun intended) you didn't know.
Also yeah many people keep their PC's on their desk 🗿
Apparently it will use a standard PowerVR module on Linux, and PowerVR has a Vulkan driver on Linux - there might be some interesting testing to be done by Anthony
That makes sense.
Doesn't China still officially use their own flavor of Red Hat Linux?
@@davidgoodnow269 Isn't Kylin the official name of China's Linux distro?
nope, still garbage full of stolen tech, this is literally a scam to suck up government funding
Including AI using ncnn.
@@davidgoodnow269 they also use Win 10 G, basically a China windows 10 mod without ms acc functionality
Really interesting to get the detailed take on how things actually get built in the real world in the last 4 minutes
Linus described this GPU as a nuclear weapon that every president is after
ik
GPUs will be a vital resource in WWWIII.
@@PointingLasersAtAircraft WW3 will be played in COD lobbies, modernization exists everywhere! Joe will prolly appear with the best of pay to win weapons
@@PointingLasersAtAircraft World Wide Web 3?
@@PointingLasersAtAircraft Tarkov seems to agree
-999999999999999999 social credit
不好了!
How many canadian social credits is that?
@therealfakenews2274 chinese bot replying to 1 year old comments
@@purplebeast8536 canadian thought police?
@purplebeast8536 what?
Yooo, I watch Linus's videos just by seeing his face on the thumbnail 😂; how old the video is doesn't matter 😅. It's super informative and entertaining ❤
I hope they succeed and provide proper competition. Best wishes to them.
Some people in Germany got their hands on MTT S80 GPUs and tested a couple of things for me: unfortunately there is no OpenCL support yet, although MTT advertised it. Drivers are still very early.
If they don't support OpenCL then it's total garbage.
@@decreer4567 this is likely to change with driver updates. OpenCL support is mandatory to have any user base in the compute / data center segment. I'm not surprised there is no support yet; Intel's OpenCL support at Arc launch was abysmal too but has gotten significantly better already. Give them some time.
@@ProjectPhysX The last thing Westerners should do is support Chinese chip companies, no matter how small they may be. Why are Germans so supportive of dictatorships? Didn't y'all learn a lesson from the whole over-dependence on Russian energy thing?
The west has gotta decouple from China. Not buy advanced components from them that they could then use for spying or collecting data from us
@@tylerclayton6081 OpenCL is an open standard; if the hardware supports it, it can run a large variety of software. Open standards are a good thing.
I'm not supporting Chinese chip companies and their dictatorships. But there is no reason to be dismissive of the chinese people either, they are not that different from us. I come from academia and value international collaboration, no matter of nationality. International collaboration/communication solves problems, decoupling through stereotypes and building walls does not.
I get lackluster game support but I was expecting the situation to be far better for compute workloads! I feel sorry for the poor souls who have to use these to work on their research projects!
I like how they list Dwarf Fortress as compatible and supported... Dwarf Fortress doesn't use the GPU; it's entirely CPU bound.
There's a new Dwarf Fortress, in case you missed it! Check it out. :)
It displays stuff on screen, so it uses the GPU. In fact, it uses OpenGL to render the screen. Factorio is 2D, yet uses OpenGL on Linux and DX11 on Windows.
@@luziferius3687 No it doesn't. OP is right; the new release of Dwarf Fortress on Steam is entirely CPU bound, even for rendering graphics. It uses a hardcoded version of OpenGL and CPU clock sync to render the 2D graphics.
So, just like every Chinese product, it's a lie.
@@SvendDesignsSD OK, even if everything in-game is rendered by the CPU, surely the window itself is drawn by your GPU onto your desktop.
Seems it has some driver issues, since going by the specifications it should not have such low performance; and considering the very small support list, they are probably focused on making it run at all rather than maximizing performance.
The coughing reference was just so subtle and on point, well done Linus 😂
I like how this is sponsored by xsplit but they use OBS to test on the GPU
Wait. XSplit is still a thing?
@@AlexDatcoldness Don't you need to pay Xsplit a fee for using it?
@@AlexDatcoldness Why, actually? Genuine question.
@@AlexDatcoldness you should pay for it and support them, then!
@@AlexDatcoldness just use obs. they have updated it like half a year ago. spend 10 minutes on it, earn a lot of money.
To give them credit, "Moore Threads" is a really clever name. lol
I guess you can say Moore's law isn't dead 🤔
I would be interested in a revisit to see if and how the drivers have improved or not
It's a mobile GPU replicated badly to get to desktop tier performance. It will take a while but eventually it should improve.
It does add support for more games every week, but it will still take a long time to compete with even a 1660 Ti. Right now it performs similarly to a 1660 Ti in some games, especially some old or hand-picked ones, but generally speaking it is still not even close, which is pretty sad. The only reason you would buy this card is to support a third party competing with the larger companies; other than that, just go with AMD if you only play games.
”What is the use of a child who has just learned to walk?“
”He will eventually become an adult“
Adam is a man after my own heart. A fan of TF2 who's not blinded by base-game nostalgia and enjoys the absolute chaos.
did u know TF2 will get a major update this year? they announced it.
@@punimarudogaman some hats, emotes, effects and maps? I'm too lazy to check it by myself.
@@punimarudogaman yeah sure buddy whatever you say.... Major update my ass link me the source right now bruh
@@punimarudogaman Stop the cap brudda
i started playing tf2 at a time when cosmetics and weapons were already a thing
the weapons are genuinely a great addition to the base game
I have to imagine its priority is server first, desktop second. I was surprised it didn't seem to allow any real video encode/decode, as that's a giant use case for server GPUs. Though maybe it does, and this is really a case of a locked-down hardware/software package deal; i.e., maybe Chinese YouTube is using footbrake instead of Handbrake.
Almost certainly the media transcode capability is exposed through an SDK (like NVENC) and the enterprises buying these will just implement it. Trying it in OBS is a good test, but if OBS hasn't implemented the SDK, like they have for NVENC, then obviously it won't work.
It's likely good for background rendering, where it doesn't have to display the image it makes but just does the math, which is what the readouts hinted at: low display rate but high processing ability, with system hitches only when new assets are loaded, like server background GPUs used to do in the early 2010s. Those were great cards for the price back then, like $120 for a decent CAD work server, but crap for any 3D gaming or live texture rendering.
Further note: companies would buy such cards for the savings. Rather than 20 workstations costing $15k each, you can have 50 machines worth $2k plus a single finalization system worth $30k in the office that workers upload their final projects to.
It supported PyTorch out of the box, it's literally an AI accelerator in a GPU trenchcoat
It's listed under desktop, not server, on their site.
I recently bought a desktop with a display port I didn't recognise. This video taught me that it is DisplayPort 1.4a. Thanks Linus.
The way I see it, China does have the market to support a new GPU company starting from scratch; they might be able to come up with a brand-new tech tree that performs better and costs less at the same time, just like the other things made in China. I think it's a good thing overall; judging by how Nvidia is doing right now, I'd say the more competitors the better for us gamers. Give them a couple of years and we will see.
Also, just to add a bit: they are licensing the same GPU IP that ImgTech sells to phone makers. They simply don't have the drivers done at all. The GPU is most likely stronger than it shows but can't use the performance due to bad drivers. Moore Threads did the hardware and will worry about software later. However, we shouldn't expect more than 1660-level gaming even when it's fully optimized.
The reality is that this is a CCP company that is going to make chips for MIL/AI use in the long term. While they would probably like a commercially viable product it is unlikely they will be able to do that any time soon.
I am glad Chinese tech companies are making GPUs for the gaming market. This will change the balance built by AMD and NVIDIA. I believe both of them will release more competitive products under the pressure from outside.
@@thinkingcashew6 Do you think every single company in China is controlled by the government? Well, guess what: you are wrong.😢😢😢
I'll never run one, same reason I won't own Lenovo: they are spy tools of the CCP. If you want to claim there's no way China would slip code into the firmware to spy on Westerners, I have a bridge between Tokyo and New York for sale; are you interested in buying it?
Hello mister, I have a question. I recently bought an Intel Arc 750, but I have some issues. I installed the GPU and connected the DisplayPort cable; all good. After I installed the Arc driver and restarted the PC, I don't get a signal anymore. I changed the cable and checked all the DP ports, nothing. Then I connected an HDMI cable and it works, but HDMI drops my AOC monitor from 144 Hz to 70. Maybe you know how to resolve this issue? Thx.
Love how the power port has finally been moved. Now someone just needs to put it on the bottom and cases will be looking much cleaner without those 2 cables jumping over the MB!
They will never do that; it would be such a pain in the ass to deal with.
@@macicoinc9363 How so? Moving a small part on a PCB?
how would you access it on the bottom? it would be completely blocked by the motherboard
@@cgiacona I meant the bottom at the back, clearly; but even still, larger GPUs extend past the MB anyway.
It just depends on how you customise your own PC. I have a 3090 FE; my 12-pin connector looks super clean with a braided 12-pin to dual 8-pin connector. It's not in the way at all.
It's not like they can send a spy balloon to your house with a box that says "give it back"
edit: wow, so many likes
But they can send the virus through the gpu drivers #hidden_backdoor
This is actually why a lot of foreign tech companies are either closing or refusing to expand their offices in China right now. They are worried about hiring local Chinese employees who will then take IP and knowledge and start their own government backed competitors. ASML just sent a delegation of suppliers to countries like India, Vietnam and Indonesia to look for new locations to get local talent and build local factories. There's much less risk for them in these friendly countries with better IP rights.
The card IS the spy...
@@JohnSmith-vn8dm Also, it isn't like any of these countries can set up a local competitor even if the people who previously worked there decided to take the knowledge and IP to a supposed new company
@@JohnSmith-vn8dm So just making a GPU is infringing on Nvidia's IP?
This is actually very very impressive, considering their age in this industry. I am fking blown away.
But the card is crap for how much power it’s drawing. It’s actually horrible
I kinda like the design with the orange glow. I wish the fans were equal size though, and all had RGB rings.
Maybe it's time for you to prepare some PyTorch/TensorFlow benchmarks. That could be more the target application of these cards.
Are there even drivers for the GPU's AI acceleration?
@@MaddTheSane They do make drivers on Linux, and that's their main purpose. Our company is considering using this product for some AI computing, to prepare for the future chip ban that might happen.
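If anyone wants to sketch such a benchmark, the general shape is simple. Below is a minimal pure-Python stand-in for illustration only; the `matmul` workload, `bench` helper, and matrix sizes are made up here, and a real AI benchmark would call something like `torch.matmul` on the card's device instead:

```python
import time

def matmul(a, b):
    # Naive matrix multiply as a stand-in workload; a real benchmark
    # would run the framework's GPU kernel instead.
    n, k, m = len(a), len(b), len(b[0])
    return [[sum(a[i][p] * b[p][j] for p in range(k)) for j in range(m)]
            for i in range(n)]

def bench(fn, *args, iters=5):
    # Time several runs and keep the best, which reduces noise
    # from other processes on the machine.
    best = float("inf")
    for _ in range(iters):
        t0 = time.perf_counter()
        fn(*args)
        best = min(best, time.perf_counter() - t0)
    return best

n = 64
a = [[1.0] * n for _ in range(n)]
b = [[1.0] * n for _ in range(n)]
secs = bench(matmul, a, b)
gflops = 2 * n**3 / secs / 1e9  # a matmul does ~2*n^3 floating-point ops
print(f"{secs * 1e3:.2f} ms/iter, {gflops:.3f} GFLOP/s")
```

Comparing the GFLOP/s figure across devices (and against the vendor's claimed peak) is what would show how much of the S80's silicon the drivers actually expose.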
Hmm, PowerVR is a name in the GPU space I haven't heard in ages. I actually used to have a PowerVR Kyro and Kyro II GPU way back in the early Geforce T&L days. Back then it basically was the "Bruteforce" Hardware T&L Approach on Geforce vs the "efficient" deferred Renderer on Kyro cards.
PowerVR self-yeeted from the desktop space and went on to do a metric ton of smartphone GPUs
@@PicturesqueGames No, they were not making their own chips; they relied on partners to fab them, and one partner (ST) decided to leave the market, leaving VideoLogic (as they were known at the time) in the dust. Fighting ATI and Nvidia was not something they could sustain over time; they were not as big. But PowerVR's original disappearance from the PC market was not VideoLogic's choice to begin with.
WOW VERY DANGEROUS SIR! !! 😠 😠 BUT THIS WHY IM SO LUCKY LIVE IN SUPER INDIA THE CLEANEST COUNTRY IN THE WORLD 🇮🇳🤗 , WE NEVER SCAM! WE GIVE RESPECT TO ALL WOMEN THEY CAN WALK SAFELY ALONE AT NIGHT AND WE HAVE CLEAN FOOD AND TOILET EVERYWHERE 🇮🇳🤗🚽, I KNOW MANY POOR PEOPLE JEALOUS WITH SUPER RICH INDIA 🤗🇮🇳🤗🇮🇳🤗🇮🇳🤗🇮🇳🤗🇮🇳
Think I had a cheetah card, well that's what was on the box at least
I think it would only be fair to run this card with a fully Chinese CPU and motherboard, the highest-end ones available from China, then see what we get.
I like how they managed to get AWESOME cooling, next to zero noise, and a low price. Of course, there are hard restrictions on motherboards, and it refuses to launch games after warning they are not tested and supported, though perhaps they would work...
I actually kinda wonder why both Intel and China didn't concentrate on Vulkan support. Force games to support Vulkan as the main API instead of DX12 and everyone would be happy, with the exception of developers.
dx12 is alot better than vulkan lol
@@keigansabo9330 how?
@@keigansabo9330 as a Nintendo pirate I disagree; with an integrated GPU I can run Scarlet
I like that they use EPS. 400 W through one non-fire-hazardous plug seems only logical.
How many fires started with the 4090 power connectors? Pretty sure it was zero. A lot more AMD cards had vapor chamber issues
@@tylerclayton6081the only melted plugs were caused by user error, installing them incorrectly.
Mine has had zero issues.
@@username8644 No it is a user error. Didn't you watch the GN video on the subject?
@@Ruhrpottpatriot The point is that the connector is badly designed
@@username8644 "They designed a power connector is that very finicky, much more than a regular power connector."
It's literally the same style of power connector just with more pins.
"You should never be able to melt a power connector because it wasn't fully plugged in"
Improperly seated power connectors, whether inside a PC or not, are one of if not the most common causes of residential fires. And that includes stupid NEMA connectors as well as the CEE 7/x plugs.
The problem with the connector, as the GN video showed, is that the tolerances are very tight and it's easy not to push it in as far as it should go, which can be avoided by looking a bit closer and pulling on it to see if the clip has latched.
Tight tolerances are not a bad thing per se, in case of electricity you want to have as little wiggle room as possible, especially if high currents are involved.
If you don't have that you get sparks at the connection point, which increase resistance even further (thus increasing heat) and can lead to the connections even fusing together.
" Grow some balls, use your brain, and start calling out these scummy companies who keep getting away with this crap."
I'm all for more accountability for big tech, but this isn't a case of a scummy company (remember: the fault rate is less than 0.1%) but rather of users not doing their due diligence, ignoring bad cables caused by manufacturing defects, and not adhering to the given standards; that's not NVIDIA's fault.
Seriously Gamers Nexus did a whole series on that topic.
It might take them a decade but once they get halfway decent cards they'll just flood the market.
About time someone puts the Nvidia/AMD duopoly on their toes.
Doubt it. China is known for crap
Try 90 decades.
You know, how long it'll take to get back on their feet after china collapses into civil war(again).
Wouldn't hold my breath for that. Even after massive money injections, IP theft, and poaching key personnel, they have only come so far, not to mention their yields are terrible according to some sources. If they want to flood the market with GPUs, even if they can compete on specs, the Chinese government would need to subsidize them to be remotely price competitive. And as far as China goes, more money means more corruption.
I'm gonna guess that by their 3rd or 4th generation they will have cards comparable to what Nvidia, AMD, and Intel are putting out at the same time. They are getting a share of that $1.4T, and that helps tremendously.
honestly knowing that the boys at home are making progress is enough to make me proud no matter how bad it is
You know it's serious when their tagline is "infinite power to the infinity". This is big. Infinitely big.
TLDR: Effectively, the card's performance varies between a GTX 1030 and a 1660, but it also consumes 250 watts of power (the 1030 consumes 30 watts, I think they said)
Thanks
In the grand scheme of it, the fact someone even made something that works is pretty bananas. I sure as hell can't make a gpu.
@@nexusyang4832 then again, you probably don't have a team of electrical engineers, folks who understand discrete structures very well, and pros in assembly. They do; I'd expect them to have a good working GPU that didn't consume as much as an RTX 3070 while giving 1030 performance
Imagine using more power than a 1080 Ti and delivering the performance of a 1030. Oof, and all that probably after stealing Nvidia's tech.
yes maximum power draw for 1030 is 30 W and 1660 is 120 W
The 3070 is 220 W
9:21 I like how supposedly "China-only" card has FCC and CE logos.
That is probably the Chinese CE logo.
It looks almost the same.
@@venosaur121212 It has both the Conformité Européenne and the FCC logos (or at least the spacing in the CE logo indicates it's the Conformité Européenne one)
Not surprising; most motherboards from Asus, Gigabyte, MSI, etc. are manufactured in China and printed with FCC and CE logos, even some models that are only sold in China.
Linus, what is the app you use for the on-screen FPS and other in-game stats, please?
Lol. I literally had to look up Moore Threads because it's such a hilarious name (i.e., more threats) that I thought you made it up.
You know, speaking of things like PCIe 5 and such: there's a game I play called iRacing that supposedly has a lot of bandwidth-related issues sending data to the GPU, because they send the entire world space every frame. I'm curious what impact bandwidth over the bus actually has for GPUs, particularly in a game that does such a thing, and I wonder if anyone at the lab could provide insight on that.
I feel like it's an under-discussed aspect of gaming as a whole, because we just assume raw GPU power is all that's needed, but a lot of people forget the CPU still has to send instructions and data to the GPU over the motherboard.
So you have a game that is just insanely stupid. It's sad how badly most games, even the big AAA games, are coded; they literally waste a decade of hardware improvements through sheer bad programming.
It was one of the strange things when DX12 was announced: OpenGL already offered low-level access and nearly nobody used it, because it's a lot of work and easy to get wrong. With DX12 and Vulkan the same problems came up. Not only that, the code itself is again mostly so bad that AMD's and Nvidia's drivers have to do the heavy lifting by dynamically reshuffling data, since the code as written would just produce slideshows.
Except an RTX 4090 operates at 98% of its performance in a PCIe gen3 x16 slot compared to a gen4 x16 slot. We're only just now seeing GPUs that are finally outgrowing PCIe gen3. I'm betting it's at least two generations, maybe 3, before a card is knocking on the door of PCIe gen4 x16 limitations.
@@racerex340 This depends on how heavy PCIe bandwidth usage gets in the future with the likes of DirectStorage.
Remember, we're only starting to see PCIe 3.0 show its age after all these years; 4.0 doubles 3.0's bandwidth, which even with DirectStorage could take years to fully saturate.
@@racerex340 But LithiumFox's point is that it's application dependent, and there are some applications where more bandwidth is necessary. One such application might be AI, and workloads that benefit from memory pooling across multiple cards. Nvidia's solution to this has been NVLink, but a higher-bandwidth PCIe connection could be a cheaper alternative. It could also perhaps bring back multi-GPU gaming like SLI without needing special SLI connectors between the GPUs. Anyway, there are definitely situations where this could help.
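For context on the numbers being debated here, the theoretical per-direction PCIe link rate falls out of the per-lane transfer rate and the line-code overhead (8b/10b for gens 1-2, 128b/130b from gen 3 onward, per the PCIe specs). A quick back-of-the-envelope sketch:

```python
# Theoretical per-direction PCIe bandwidth: per-lane transfer rate (GT/s)
# times encoding efficiency, times lane count, divided by 8 bits per byte.
# Gens 1-2 use 8b/10b encoding; gen 3 onward use 128b/130b.
GENS = {
    1: (2.5, 8 / 10),
    2: (5.0, 8 / 10),
    3: (8.0, 128 / 130),
    4: (16.0, 128 / 130),
    5: (32.0, 128 / 130),
}

def pcie_gbps(gen, lanes=16):
    rate, eff = GENS[gen]
    return rate * eff * lanes / 8  # GB/s per direction

for gen in (3, 4, 5):
    print(f"PCIe {gen}.0 x16: {pcie_gbps(gen):.2f} GB/s")
```

This gives roughly 15.8 GB/s for gen3 x16 and double that for gen4 x16, which is why a game streaming its whole world every frame can bump into gen3 long before the GPU itself is the bottleneck.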
15:08 that "2 years for this GPU, 3 years for your screwdriver" is exactly what I'm thinking. The effort to even get a GPU working in whatever way possible is massive. From there you need to work on various features (like Tesselation) and likely revisit the hardware design multiple times to get it right. But 2 years is a pretty short time for the initial product.
By the same token, though: the LTT screwdriver compares favorably to name-brand, high-quality ones. The GPU does not.
People can shit on this product all they want, but 2 years from nothing to an actual product is crazy. If they can keep up the pace, I'm excited to see what they can accomplish in the coming years. It's amazing what state-driven investment in technological innovation can accomplish. It's almost like having a government that invests in its own infrastructure, development, and progress instead of spending all its money on bombs and 800+ foreign military bases while it rots from the inside out is maybe superior. Weird how that works. Probably has nothing to do with how China has managed to lift 850 million of its people out of poverty. 🤔
@@rfouR_4 CCP bot
@@rfouR_4 They literally licensed the IP from another company; it's not like they did all the R&D themselves. They just put the money down to put themselves in a position to make a glorified 1030.
2 years is the time spent on sourcing, sanding and reprinting😂
I wouldn't underestimate them, it's a giant step forward for them
Manufacturing high-end semiconductors is the single most technically challenging industry, and they haven't been in the game for long. The West kinda forced their hand by preventing China from sourcing high-end chips elsewhere, and now I'm worried it may have been too short-sighted.
A great leap forward* for them 😆
@@z_nytrom99lol
I mean, honestly, if anything, banning them from importing certain chips will only increase their investment in domestic manufacturing. Why would you back down when your sole source of something essential to your economy threatens to be cut off?
@@putinslittlehacker4793 true this was a very dumb move from the perspective of the west, it’s not like China is some small undeveloped nation, they would easily be able to develop industry for making high end computer chips if they wanted to 😂
Eh it's great for us consumers that there's more competition.
What if you force-install a Quadro driver? The card looks like an Nvidia Quadro, so if you install that driver, maybe you can run the hardware without any issues
I had an old PowerVR GPU many, many years ago; they used tile-based rendering and avoided rendering anything not visible. Cheap cards that did well price/performance-wise vs the ATI and 3dfx cards at the time. They had a few compatibility issues with newer versions of Windows, and Nvidia did their usual dirty tricks to help torpedo them.
PowerVR divides the scene into tiles and renders in on die memory.
@@atomicskull6405 Every modern GPU is tile based now
Ah, occlusion. Didn't Nvidia use that around the GeForce FX line to prop up its rather questionable performance? Once the "oops, you sure you need 24-bit textures?" trick stopped yielding much more performance, they put a simpler trick in one of the drivers: knowing the path of the camera in the benchmark, they pre-occluded it. That is, they didn't use on-chip/in-driver occlusion but recognized the software and ran pre-scripted blacked-out areas.
Of course, a free-flowing camera POV broke that little lie.
@@scheeseman486 So PowerVR was right then.
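The tile-based rendering the thread above describes can be sketched roughly. This is a minimal illustration of the binning step only, assuming screen-space triangles and an illustrative 32-pixel tile size; real hardware details (tile sizes, hidden-surface removal, on-chip memory) differ per GPU.

```python
# Minimal sketch of tile binning, the core idea behind PowerVR-style
# renderers: the screen is split into small tiles, each triangle is
# assigned ("binned") to the tiles its bounding box overlaps, and each
# tile is then shaded in fast on-chip memory before being written out.
# The triangle format and tile size here are illustrative only.

TILE = 32  # tile edge in pixels (assumed; real GPUs vary)

def bin_triangles(triangles, width, height):
    """Map each screen-space triangle to the set of tiles it may touch."""
    tiles_x = (width + TILE - 1) // TILE
    tiles_y = (height + TILE - 1) // TILE
    bins = {(tx, ty): [] for ty in range(tiles_y) for tx in range(tiles_x)}
    for tri in triangles:
        xs = [v[0] for v in tri]
        ys = [v[1] for v in tri]
        # Clamp the triangle's bounding box to the screen, then find the
        # range of tiles that box overlaps.
        x0 = max(int(min(xs)) // TILE, 0)
        x1 = min(int(max(xs)) // TILE, tiles_x - 1)
        y0 = max(int(min(ys)) // TILE, 0)
        y1 = min(int(max(ys)) // TILE, tiles_y - 1)
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                bins[(tx, ty)].append(tri)
    return bins

# One triangle in the top-left corner of a 128x128 screen lands only
# in tile (0, 0):
tri = [(4, 4), (28, 4), (4, 28)]
bins = bin_triangles([tri], 128, 128)
print(len(bins[(0, 0)]), len(bins[(1, 1)]))
```

The payoff of this scheme is that each tile's depth/color buffer fits in on-die memory, so overdraw never touches external DRAM, which is why it suited cheap cards then and power-constrained mobile GPUs now.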
Hearing linus say "Thanks pal ❤" Was so heartwarming.
The limitation lies in the software, that is, the driver, but they are constantly updating the driver to support more games and deliver better-optimized performance.
The limitation happens because of the corporatist monopoly of Microsoft with DirectX. You need a license to use their shader model versions, and as you know, the US is in a dirty war with China, because they hate Chinese progress and faster development and are trying to hold control of the high-tech industry, even if that means throwing stones onto China's road to slow them down. So these sanctions are designed to keep the US artificially dominant in high-tech electronics, but this will not last forever, and China is becoming much more competitive, self-sufficient and stronger over time. You can see that in their space program. The West no longer laughs at China's space program, with rovers on the Moon and Mars and orbiters around the Moon; not even NASA has done all of that. Today Linus' troll TV laughs at China's video cards and every one of their electronics products, but in the future they will have to just shut up, because the Chinese are a very ambitious nation who always achieve their objectives sooner or later. And don't forget this is only their very first video card, which is far superior to what Nvidia or ATI did on their first try.
@@technoartfest8708 Nah, it has nothing to do with Microsoft monopoly.
Looking at Linus's first statement, it looks like the GPU is still at the tester & reviewer stage, i.e. they only let specific people get it, with only a few games available for testing.
Does the gpu connect to the home wifi network?
Hearing Linus say " let's try DP" that'll scare anyone for life 😂
As a Chinese gamer, I am pretty happy that someone is at least trying to work on a locally manufactured graphics card. Since 2018, the US has been banning foundries around the world from manufacturing chip orders that are from or for mainland China. Additional restrictions were also placed on a few Nvidia GPUs. The very thought of not being able to play future Warhammer games (Space Marine 2, come out already pls) in case of an escalated trade conflict kills me. Moore Threads does a pretty bad job with the S80. However, I guess with a few more years to come, we could get our Intel Arc-ish graphics card. So yeah, I am excited
DX11 and DX12 are a problem; the Linux support is pretty good
Maybe if the CCP wasn't threatening to destroy the US while stealing technologies all over the world and disrupting world peace while also committing genocide, maybe they wouldn't have to scramble to make their own GPU?
Just wait a bit longer; they need to mature the architecture and instruction set first, then we can talk about what comes next
China should be banned from the video game world too. I never liked those scum Chinese players.
This card has a solid foundation; as a first product, it's pretty good
Get the entire setup for this gpu. Who knows, it may run better with all the parts it’s supposed to have.
nice, I hope at some point they sell a RISC-V based SBC with a GPU, like a Jetson Nano (only bigger), for micro AI development at a nice price point. I've built some pretty small full-house automation, NAS, and firewall/router/IDS boxes with full-house AI capability that can do a heck of a lot in smart-house operations
The fact they have a better power connector setup than the Nvidia 4000 series is comical xD
The fact that you have no clue yet are talking big is just sad.
Just a hint for those who will inevitably come crying:
the EPS 8-pin is specified by the same spec as the 12VHPWR.
Agree. The EPS12V power connector is better than the 12VHPWR . We have been using the EPS connector for decades, never seen them melt or catch fire despite seeing older Intel HEDT CPU pulling a crap tonne of power from them.
@@fleurdewin7958 the EPS12V is rated way lower, and the 12VHPWR connector isn't an Nvidia creation, it's a standard connector, yet so many people keep blaming Nvidia for it.
The only ones that have failed have been due to improper installation.
@@oxfordsparky Even better - The EPS was specified by the same company as the ATX standard (from which the 12vhpwr comes).
But people are too ignorant to learn from their mistakes. The connector never was at fault.
@@oxfordsparky Nvidia still came up with it, and helped make it a standard.
Most importantly, their GPUs are based on Imagination Technologies, which has been around since the 90s but simply moved into the mobile (phone) space. So it's not so much a new 4th competitor as a resurrection of one from the 90s
EDIT:
Oh, linus talks about it at the end! All is good
You forgot bout Adreno and Mali...
And VIA Technologies and S3 Graphics are still alive somewhere...
Don't forget about Dreamcast
@@damienkram3379 S3 Graphics is a skeleton crew within VIA that only works with Zhaoxin in China making iGPUs. VIA hasn't funded a new GPU architecture from S3 since ~2012 and is still shipping the same Chrome 600 series cores (2-4 CUs max, DX11.1, OpenGL 4.1, no OpenCL) for a basic integrated option.
That's, I assume, mostly the GPU compute part. I doubt they also made all the other IP from scratch; I wouldn't be surprised if some of it was stolen, like ARM stuff that isn't properly attributed (stuff they got from previous projects made in China where they got the IP legally). Not to mention the EDA tools, there's no way it's all 100% Chinese either.
@@damienkram3379 Adreno is an anagram of Radeon so you can guess where that came from. Mali is an original design by ARM, but its history goes back to 1998
If it is their first product that's actually nice. It could become better really fast after experience.
just limited by their fabrication capabilities.
Not having 7nm or below is going to be a HARD limit.
@@zaxwashere they do have 7nn chip tho
@@didyoumissedmegobareatersk2204 oh neat. I didn't know that. Kinda surprising that they did it without EUV, but we'll see if it hits any sort of production.
I correct my post
Gonna be tough without ~~7nm~~ uhhh
5nm
@@zaxwashere Their 7nm was found on a mining chip. This tells us that it's probably not a mature 7nm node (meaning yields are low and the actual performance isn't there). However, the fact that it exists at all is a big deal, because it's like getting your foot in the door to greater things.
Well, I guess it's time for Linus and Luke to do the Moore Threads challenge!
An 82 Hz refresh might be just about half of the 165 Hz that many monitors use (165/2 = 82.5)
This is obvious, but the whole video is built on making fun of gpu problems.
@@user-qh5uy1bs9r but china bad , hahaha
That power connector on the back edge of the card should come back on all GPUs, like in the PCIe spec reference design...
How does it do for things like gpt-2 XL or stable diffusion?
finally theres gonna be some competition. looking forward to cheaper better products from now on.
Just bought a PC from Build Redux yesterday. Glad to see they are still a sponsor, and even more excited to try my new PC out!
It makes ARC look polished...
Waiting for the 1 month of moore threads challenge :)
ARC has recently become polished now that Intel has fixed their terrible drivers.
They made this in only 2 years? Wtf, that's actually impressive: in such a short period of time they made a working GPU. Given more time to produce GPUs, I guess they have a good chance of becoming compatible with more hardware
If China succeeds, Nvidia and AMD will cut prices significantly. That's great
Hope so, for the consumer market. Nvidia's prices are completely out of whack with what consumers are willing to pay, AMD is no better, and the stock is always out anyway. It's time an actual disruptor came onto the market to break the duopoly.
Maybe the compatibility issues (especially with certain game titles) are a feature, knowing a bit about how much China's GOV loves to censor.
The 2 years isn't that impressive when you remember most products home-grown in China are from stolen tech lol. They don't innovate, they steal and make rip-offs. The ultimate Chinese knock-off!
Lol, they bought the technology from the UK. Imagination is the company that offered the graphics technology for the Sega Dreamcast. Nothing marvellous.
The Imagination Tech offices nearby have gone from a busy 2-3 building campus to one small building with the lights barely on.
I actually really wanted another big gpu competitor considering the gpu market that we have...
Highly unlikely for another decade minimum. Nvidia and AMD have been in the game longer than the rest and know the ins and outs of GPU design. Hence why Intel's cards were laughable when they arrived. Prices only got out of hand with the newest cards because up-to-5-year-old cards are still worthwhile for almost everything. New games are finally starting to push the envelope and ruin performance on older cards, but even then you can just drop some settings and get solid performance again.
They do not make those GPUs to compete with anyone; they are making them so they won't be left without anything if the USA does something horrible again in an attempt to have complete world control.
Agree. But what's holding the market back from having available GPUs is fab capacity for making them and a lack of board partners willing to make graphics cards because of low profit margins.
@@Rabolisk also, even then, nothing will stop woke game studios from releasing games that won't run well even on a 4090, nor on consoles.
@@smittyvanjagermanjenson182 A decade or two is nothing...
In the future can you guys please test compatibility with and performance of distributed computing? Running folding@home would give a good overview of that.
what about linux support? Is there anything going on in term of drivers/modules?
This GPU reminds me of the Huawei P9 phone. Gimicky, buggy, working in the general sense one might want "a phone", then the next year they launched the P10 Plus, which really captured the attention and then one more year after that, the P20 Pro dropped and completely changed the market. If this company follows Huawei's footsteps, Nvidia in fact have to be VERY scared, because they are the ones with the most to lose here. Intel are bound by "we can't use this and that" legal forms, while we all know how well copyright infringement laws work in China.
Interestingly, if you know some law basics, "copyright" is only meaningful when the entity could acquire something but chooses to pirate it instead. Since we sanctioned China, no copyright action can be taken against them here.
For example, your neighbour can stop providing cookies to you, but then they cannot stop you from making your own.
Apparently you don’t know how IP laws work in China 🙄
@@brandonshalley4678 I don't think it works that way, it's very easy for Chinese to get their hands on luxury goods like Gucci or LV as an example. Can you guess where the most fake luxury goods are made? Yeah in China. It doesn't matter if a company is providing the Chinese market with goods or not in China people will produce your goods with your brand and sell it themselves anyway.
I will however like to point out that Chinese authorities take Chinese companies IP and copyright very seriously. Selling fake Nike shoes and clothes on the street is not an issue in China but if you would try to do that with a Chinese brand you'd have the police there pretty quick.
The P9 is actually okay; the P8 and P8 Lite are completely junk. Their 64-bit processor can't even catch up to the taillights of a 32-bit MT6592 from years earlier.
Moving the VRMs to the top of the card is kind of a simple idea to help vent the hot air directly out the top instead of stuffing them in the middle of the board where they can heat-soak more easily
VRMs are easy to cool, but placing them along the top stretches the trace lengths badly and provides asymmetric delivery to the die, leading to worse voltage drop-off, or spikes to compensate
I think we definitely underappreciate how well some of our stuff works together.
PCs in general are extremely underappreciated. The tech is absolutely insane.
True. This video is nostalgic. I got the same vibe about 10 years ago when China got banned from the International Space Station. The best way to stop China's progress is to ban them and then mock them; it clearly worked
@@reuven2010 Wdym?
@@warymane6969 😂😂 they made a 7nm chip
THAT'S CALLED P-R-O-T-O-C-O-L-S
You need to update the GPU driver, as the driver's notification said.
We actually need this card. A cheap GPU out of China that threatens the GPU cartel's market share is exactly what we need. If they fix their drivers, it could be viable for 1080p as long as they keep the price down.
US never ending ban game won't let it happen. Your blood is for them to suck, not others.
12:18
"Yeah, let's try DP" 🤣🤣🤣
Sebastian, Linus - Mar/11/2023
Whoever edited the intro is based and an absolute mad lad, let me shake his hand😭😭
Exactly! Surprised more people didn't notice
7:41 the, "you do?" 😂
Does the card not have the rear exhaust that most air-cooled GPUs have?
Heard pretty much the same thing 15 years ago about Chinese cell phones.
Looking forward to the next 15 years.
I think the biggest difference is there wasn't a burgeoning economic/tech war brewing at the time. Who knows how this will pan out; I'm only saying there's a bit of a different context here.
@@mobiusflammel9372 hmmmmm, how about ISS
I would really like to see more content with that card
Would like to know if that thing can be used more effectively under Linux.
It would be cool to have a revisit sometime this year. Has it had driver updates? Are there new cards in the pipeline?
Moore Threads MTT S4000
That gpu shroud design looks clean AF
Editor! Don't you think for a second that the 11 at 8:00 would've gone unnoticed. I have a faint memory I've heard another AoE2 taunt some years ago.
The subtitle team: a "segue" is an uninterrupted transition from one scene to another, while a "segway" (@00:56) is a personal transportation vehicle.
Otherwise, fun video!
Awesome! Now do the Erying boards and chips!
Interesting and reminds me of the nineties when mobos and video cards were released with drivers that had not 'matured' (not forgetting games that ran like snot until they were patched numerous times)
if it's an AI-oriented card, you could test it by running a basic training job on a ready-made base: put it through 10-20 epochs, see how long it takes, and compare against CPU performance and some entry-level GPU like the 1660 in this video
I was kinda interested, so I looked around - couldn't find anything. According to some people who played around with it, it's quite difficult to get the Linux software for those cards (UnixCloud has some pretty old drivers for previous MTT GPUs on their website, but not for the S80/ S3000), and I'm not sure anyone managed to get access to any Torch builds for MTT GPUs. And no code has been submitted upstream at all as far as I can tell.
You'd probably need to do it in Linux though. This card is not mainly built for Windows.
How? Does torch, tf or any other ml API support these cards?
@@fcukcensorship783 the screenshot from their website they showed said it supports pytorch.
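The benchmark suggested above (fixed epochs, timed per epoch, same script rerun per device) can be sketched without any GPU framework at all. This is a hedged stand-in: the tiny pure-Python linear-regression "model" below is illustrative, not the real workload; on actual hardware you would substitute a framework model moved to the card's device, assuming its PyTorch build works as advertised.

```python
# Rough sketch of an epoch-timing benchmark: run a fixed number of
# training epochs on a known synthetic workload and record per-epoch
# wall time, so the same harness can be compared across backends.
import random
import time

def run_epochs(epochs=20, n=1000, lr=0.5, seed=0):
    """Fit y = 3x + noise with 1-parameter gradient descent; return
    the learned weight and the average wall time per epoch."""
    random.seed(seed)
    xs = [random.random() for _ in range(n)]
    ys = [3.0 * x + random.gauss(0, 0.01) for x in xs]
    w = 0.0
    times = []
    for _ in range(epochs):
        t0 = time.perf_counter()
        # Mean-squared-error gradient with respect to w over the batch.
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / n
        w -= lr * grad
        times.append(time.perf_counter() - t0)
    return w, sum(times) / len(times)

w, avg = run_epochs()
print(f"avg epoch time: {avg * 1000:.3f} ms, learned w ~ {w:.2f}")
```

For a real comparison you would keep the harness identical and swap only the compute backend, then report average epoch time for CPU, a baseline card like the 1660, and the S80.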
@LinusTechTips Network Chuck just had a test on Nvidia's DPU, wonder when you can have your hands on it and apply it in your servers
it's good to see GPUs from more brands
That part at 0:30 had me dying
Try to test this card in Handbrake with AV1 and VP9 encoder/decoder and compare with other well known GFX cards.
Along with the Ai cores, it also comes with latest version of the spy cores.
Hopefully this spurs more advancement since they have competition
I'm curious to see if Linux has any support for this card, or if there are out-of-tree drivers.
Additionally, I believe Factorio uses Direct3D by default on Windows. You can, however, force it to use OpenGL by changing a setting in its config.ini. At the very least, this card has to support OpenGL, right?
right?
Maybe? It's China, you never know until you dig.
I think someone in the comments said it didn't but I could be mistaken
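Flipping a renderer setting in an INI-style config like the one mentioned above is straightforward to script. Note: the section and key names below (`graphics` / `force-opengl`) are assumptions for illustration only; check Factorio's actual config.ini and documentation for the real option name before relying on this.

```python
# Hedged sketch: toggle a hypothetical renderer option in an INI-style
# config using Python's stdlib configparser. The "graphics" section and
# "force-opengl" key are illustrative assumptions, not confirmed names.
import configparser
import io

sample = """\
[graphics]
force-opengl=false
"""

cfg = configparser.ConfigParser()
cfg.read_string(sample)
cfg["graphics"]["force-opengl"] = "true"  # request the OpenGL backend

out = io.StringIO()
cfg.write(out)
print(out.getvalue())
```

In practice you would `cfg.read()` the real file path and write it back in place, after backing it up.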
I once had a GT 1030 in my system and upgraded to a 1660 Super. I can confirm that this comparison is about right. The 1030 simultaneously *doesn't exactly run like garbage* and *will bottleneck you hard.*
yeah, iirc it was replaced with a GTX 970; everything ran well and smooth, I just never really benchmarked it, but I gamed on it a lot. My first Witcher 3 playthrough was also on that PC
It was an MSI Tiger card; one fan stopped spinning after 2 years, and the MSI board also failed, so I replaced it with a Gigabyte board
1030 are awesome for low power high end mame stations
The 1030 is a card to drop in pre-built systems. Buying one for a custom ATX system is moronic. A used GPU would be a far better value.
Yeah, gt 1030s are going for $200 to $250 aud new here.
Can get a used 980ti for the same price 😂
anyone know what keyboard linus is using?
Lmao a little lab leak cough in the beginning got me