New GPUs are Bad??... "F**k it, I'll Do it Myself."

  • Published: Nov 24, 2024

Comments • 994

  • @sanishnaik2040
    @sanishnaik2040 7 месяцев назад +3207

    Im from nvidia. This guy has to be stopped.

    • @AbbasDalal1000
      @AbbasDalal1000 7 месяцев назад +843

      Im from this guy nvidia has to be stopped

    • @KianFloppa
      @KianFloppa 7 месяцев назад +103

      amd better

    • @MostWanedManformRockport
      @MostWanedManformRockport 7 месяцев назад +60

      Nice and AMD has to be stopped

    • @danieltoth9742
      @danieltoth9742 7 месяцев назад +234

      I'm from stop. Nvidia and AMD needs to be this guy'd.

    • @udaysingh9_11
      @udaysingh9_11 7 месяцев назад +116

      I'm in this guy, he wants me not to stop.

  • @ClamChowder95
    @ClamChowder95 7 месяцев назад +1655

    I think he is underselling his work. This could be a stepping stone for future open source GPUs. He should be incredibly proud of what he did all by himself.

    • @k_kubes
      @k_kubes 7 месяцев назад +140

      This feels awfully similar to how Linux started: basically someone toying around with a concept, not expecting the project to become that big of a deal even after going open source, and it ended up exploding in popularity

    • @jasonstephens6109
      @jasonstephens6109 7 месяцев назад +76

      Yeah, this guy is laying the groundwork for something with huge potential. Even if an open source GPU never competes with the big boys, it could pioneer niche features that later become mainstream.
      There's also the potential for custom instruction sets that give new life to defunct GPUs

    • @jnharton
      @jnharton 7 месяцев назад +20

      @@k_kubes Linux was more than just "toying around", but definitely a personal project that was never originally intended to go public, let alone become a mainstream OS.
      I actually saw some files once that were supposedly part/all of Linux (maybe just the kernel) prior to version 1.0.

    • @harrydean9723
      @harrydean9723 7 месяцев назад +3

      this should be a thing

    • @xavierrodriguez2463
      @xavierrodriguez2463 7 месяцев назад +9

      Open source GPU to go with RISC-V

  • @AnimeGIFfy
    @AnimeGIFfy 7 месяцев назад +1227

    open source software + open source hardware. this needs to happen if you want anything good to happen in your life.

    • @DengueBurger
      @DengueBurger 7 месяцев назад +35

      At-cost open-source GPUs when

    • @DengueBurger
      @DengueBurger 7 месяцев назад +46

      The Linux of GPUs

    • @TheDoomsdayzoner
      @TheDoomsdayzoner 7 месяцев назад +32

      That's how we evolved as species. When "secret" sciences like Algebra and Geometry became available for everyone. When "secret" tech became available for everyone like household appliances.

    • @wwrye929
      @wwrye929 7 месяцев назад

      Making the hardware would be pricey, and that would make it harder for game makers

    • @AnimeGIFfy
      @AnimeGIFfy 7 месяцев назад +9

      @@wwrye929 im not talking about everyone making their own hardware from scratch

  • @SIedgeHammer83
    @SIedgeHammer83 7 месяцев назад +745

    Voodoo and Kyro GPUs need to make a comeback.

    • @bryndal36
      @bryndal36 7 месяцев назад +21

      Imagine this gpu being able to use a glidewrapper.

    • @xgamer25125
      @xgamer25125 7 месяцев назад +92

      wasn't 3dfx (maker of the Voodoos) bought and absorbed into Nvidia..?

    • @RAgames-mc3kp
      @RAgames-mc3kp 7 месяцев назад

      Yes i think ​@@xgamer25125

    • @gh975223
      @gh975223 7 месяцев назад +42

      @@xgamer25125 it was

    • @itsnot1673
      @itsnot1673 7 месяцев назад +10

      And matrox
      maybe even trident

  • @Karti200
    @Karti200 7 месяцев назад +876

    I saw the news day one when it came out - and it just pissed me off how many out of touch people there were about it…
    Like… People literally roasted the creator of it because "it is too weak"... like cmon, what the heck is wrong with some people
    This is literally an openware / shareware version of a GPU made by a community - this is an amazing milestone if you ask me

    • @elysian3623
      @elysian3623 7 месяцев назад +97

      Let's be real, consumers themselves have no idea what stuff is, what is good for them or how stuff works, they just consume, they're easily fooled into consuming stuff they don't need as well, 90% of Nvidia cards have probably never been used for a cuda work load but they were sold based on their dominance in certain tasks.
      I still live in hopes that AMD makes their software stack fully open and somebody comes along with a working prototype of something absolutely game changing and they work with them to actually use their combined technology to advance GPU's, currently the stagnation in GPU performance is because of die shrink and ramming things like AI acceleration into cards that really just need to be affordable and play games well.

    • @Dingbat1967
      @Dingbat1967 7 месяцев назад

      The average lambda person is an idiot. That's pretty much why. The intertubes just made it more obvious.

    • @TechBuild
      @TechBuild 7 месяцев назад +37

      People who have some idea how GPUs work and their differences from CPUs will easily understand the work this person has done. It is a phenomenal task of making a GPU yourself which does 3D rendering, even a basic one, so well. In the CPU space, there are lots of architectures available to build upon but not in the GPU space.

    • @tsl_finesse_6025
      @tsl_finesse_6025 7 месяцев назад +31

      People don't understand: Nvidia has around 30k employees, while he did one full project all by himself 😬😬. Bro is fire and a genius 💪🏾

    • @rextrowbridge8386
      @rextrowbridge8386 7 месяцев назад +10

      Because they are ignorant of how hard it is to make a gpu from the ground up.

  • @ThreaT650
    @ThreaT650 7 месяцев назад +53

    Respect for putting me onto the FuryGPU, this is dope! That thing is performing around the level of an old Radeon 8500 or something! Impressive!

  • @grgn5412
    @grgn5412 7 месяцев назад +277

    This is WAY better than you may think.
    By simply converting his FPGA design into an ASIC, you get roughly a 10x performance increase.
    ASICs are expensive (from one to a few million dollars to make the mask that allows mass production), and FPGAs have long been used to prototype them: the languages used to design both are the same (VHDL or Verilog), so this conversion is very common in the industry (it happened for crypto mining, for instance).
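    To put rough numbers on the FPGA-vs-ASIC trade-off described above, here is a minimal break-even sketch in Python. Only the mask-cost range comes from the comment; the per-unit costs are illustrative assumptions.

      # Rough break-even point for switching from FPGA boards to an ASIC run.
      # Figures below are assumptions for illustration, not quotes from the video.
      mask_cost = 2_000_000      # one-time ASIC mask/NRE cost in USD ("one to a few million")
      asic_unit_cost = 25        # assumed per-chip cost once the mask exists
      fpga_unit_cost = 400       # assumed cost of a comparable FPGA module

      # The ASIC pays off once the per-unit savings cover the mask cost.
      break_even_units = mask_cost / (fpga_unit_cost - asic_unit_cost)
      print(f"ASIC breaks even after ~{break_even_units:,.0f} units")

    With these assumed numbers the crossover sits around a few thousand units, which is why hobby and prototype designs tend to stay on FPGAs.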

    • @SUCHMISH
      @SUCHMISH 7 месяцев назад +5

      I looked the chips up, and the good news is that you can buy them in bulk for a low price used... Only problem is that they are used... But I feel like this idea has some merit to it!!

    • @aymenninja8120
      @aymenninja8120 7 месяцев назад +4

      I saw a video of a guy who made his own ASIC, and he didn't sound super rich; I think the technology for making ASICs is getting more affordable.

    • @Endgame901
      @Endgame901 7 месяцев назад +12

      @@aymenninja8120 He didn't really make his own ASIC, he basically got in on a group buy for space on the silicon for an ASIC. Still pretty dope, but not quite the same.

    • @aymenninja8120
      @aymenninja8120 7 месяцев назад

      @@Endgame901 and that group is not a big corporation or anything; my point is that it's now possible to make ASICs for clients other than big companies, if I got things right.

    • @Endgame901
      @Endgame901 7 месяцев назад +1

      @@aymenninja8120 you're not wrong, per se, but an ASIC like tinytapeout isn't really in the same scope as this, even if you purchased every single "block" of silicon space.
      The type of chip you'd need _would_ cost Big Company money.

  • @POLARTTYRTM
    @POLARTTYRTM 7 месяцев назад +533

    The guy who made the gpu is the real "he's good with computers" guy.

    • @thevoidteb1285
      @thevoidteb1285 7 месяцев назад +2

      Did you listen to him in this video? Hes a novice.

    • @Whatisitlol
      @Whatisitlol 7 месяцев назад +7

      Why the heck am i experiencing a deja-vu
      This comment
      This reply above me
      I am getting crazy

    • @POLARTTYRTM
      @POLARTTYRTM 7 месяцев назад

      @@Whatisitlol normal, we all have that from time to time.

    • @0x1EGEN
      @0x1EGEN 7 месяцев назад +22

      @@thevoidteb1285 Being a novice with HDL programming and writing windows kernel drivers does not mean he's not good with computers. That's like saying I'm not good with music because I never played the Pikasso guitar...

    • @yoshi596
      @yoshi596 7 месяцев назад +15

      @@thevoidteb1285 Oh is that so? Then go ahead and do it better than him. Notify me when you upload a video about your custom made GPU, I'll wait.

  • @crumbman
    @crumbman 7 месяцев назад +197

    Really interesting video.
    As an electrical engineer working with FPGAs, I can assure you it's a heck of a lot of (probably Verilog) code to write to get this thing to work as it's supposed to. The biggest issue with doing this on an FPGA is that they run at really low clock speeds (typically ~100 MHz, max ~250 MHz). So you can't really gain speed just by increasing the clock speed (like NV and AMD have been doing more aggressively recently).
    Props to this man
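    Those clock numbers still go a long way for a Quake-era workload. A back-of-envelope fill-rate check, assuming one pixel written per clock (an assumption, not something stated in the video):

      # Rough fill-rate estimate at typical FPGA clock speeds (assumed 1 pixel/clock,
      # ignoring overdraw, texturing stalls and memory latency).
      clock_hz = 100e6                 # ~100 MHz, low end of the range mentioned above
      pixels_per_frame = 640 * 480     # assumed Quake-era resolution
      target_fps = 60

      required = pixels_per_frame * target_fps     # ~18.4 Mpixels/s needed
      available = clock_hz                         # ~100 Mpixels/s at 1 px/clock
      print(f"need {required / 1e6:.1f} Mpix/s, have ~{available / 1e6:.0f} Mpix/s")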

    • @kesslerdupont6023
      @kesslerdupont6023 7 месяцев назад +2

      Are consumer FPGAs big enough to scale to entire architectures or are they typically cut-down and more of an evaluation tool?

    • @noth606
      @noth606 7 месяцев назад +14

      @@kesslerdupont6023 Well, you wouldn't be able to make a full-size flagship FPGA-based GPU that competes with the big boys, if that's what you mean. What I think this is, without looking deep into it, is a rendering pipeline mostly from scratch; my guess is that it's OpenGL-based, given that Quake was chosen to test it, so it's unlikely to support anything approaching DirectX 10 type stuff in terms of GPU functions, because that's at least one if not multiple orders of magnitude more complex.
      So, it's definitely impressive work, but I don't think Nvidia or AMD are shaking in their boots.
      I'd guesstimate, based on what I know off the top of my head, that you'd need probably a few FPGAs, like 3-6, maybe more, to build up a full DX10/11 type unit with enough ROPs, shaders etc. to do something useful with, including all the jank you have to have around it like memory management, things to handle texture/geometry/shader-code RAM, plus output handling. It kinda depends on how 'strict' of a model you aim for, really, because to a point you can choose to do a lot on the host system, or not. The more you do on the host in code, the less dedicated hardware/FPGA space you need.
      It could be that this is a DX10+ model project just not completed far enough to currently run more than basic stuff equivalent to OpenGL. I hope so.

    • @myne00
      @myne00 7 месяцев назад +7

      @kesslerdupont6023 They are absolutely used by Intel and AMD to test design changes. They probably have really big ones that can do an entire (single) core.
      Nvidia probably uses them too, but it would most likely be a very cut-down GPU.
      They would absolutely perform horribly, but it's about comparing different hardware approaches and validation.
      E.g. if FPU design X does 100 flops and design Z does 102 flops, you have your architectural comparison.
      Then you run through a bunch of tests to validate the results. Don't want a chip that gives incorrect answers.
      FPGAs are used in the real world in applications like telecommunications signal processing, where a new technique could be released every year or so.
      I'm not aware of many other real world applications aside from the "MiSTer", which is mostly used to emulate old game consoles.

    • @kesslerdupont6023
      @kesslerdupont6023 7 месяцев назад

      @@noth606 Thanks for the helpful comment. I don't know much about DirectX but maybe I should look more into it.

    • @kesslerdupont6023
      @kesslerdupont6023 7 месяцев назад

      @@myne00 thanks for the info. During validation is it normal to have to fuzz the chip across all possible combinations of input/output or are there shortcuts to validation?

  • @POLARTTYRTM
    @POLARTTYRTM 7 месяцев назад +119

    I've had people telling me that the guy's GPU is not impressive at all because writing drivers, APIs, and all that, including the card, is the "bare minimum" they would expect from any software engineer apprentice or professional, yet I don't see new cards coming out every week made by software engineers.

    • @taxa1569
      @taxa1569 7 месяцев назад +25

      DO NOT believe them. Doing from scratch something that massive companies have spent the past 20 years iterating on over and over is like saying the person who added the sauce to a meal someone else has been preparing for 3 hours is now the chef, and these 'chefs' can ALL do what this guy did. Except he prepared the whole meal and THEY just put on the sauce.
      The bare minimum was in fact coming into the GPU development space and adding on to the well prepared, already existing meal.

    • @POLARTTYRTM
      @POLARTTYRTM 7 месяцев назад +18

      @@taxa1569 they gave every excuse possible, that the guy just messed with an FPGA and sourced the other parts... I was like in my mind... tf do it then, I want to see if it's that easy you can do it, so? Anyone can if they are given the tools.

    • @aliendroid8174
      @aliendroid8174 5 месяцев назад +6

      Lol most software engineers don't know anything besides react

    • @ryanc3011
      @ryanc3011 4 месяца назад +5

      @@taxa1569 I changed the oil on my car but I could build a whole car from scratch, pfft easy

    • @monad_tcp
      @monad_tcp 3 месяца назад +2

      Writing the API for a graphics card is not the bare minimum, it's the thing that does the work of making the game run.
      Most of what you are paying for in a GPU is not the hardware, but the driver.
      The driver is the expensive part because of all the testing that goes into ensuring games actually work. Heck, ever wonder why GPU drivers are that big? Really, 350 MB of binary executables, what's in there? They have lots of copies of the actual driver optimized for specific games; that's why Nvidia has market dominance, it is the driver.
      Ever wonder why they don't go open source with their drivers? (No, what they open-sourced on Linux was the compute driver, not the 3D graphics driver, and even if they open-source a 3D graphics driver, it won't be a DirectX one, that's for sure; it will be the slow, poor, buggy OpenGL for the almost no games that run on Linux.)

  • @baget6593
    @baget6593 7 месяцев назад +106

    I dont care if fury gpu succeeds, i just want nvidia to lose

    • @theturtlerguy1236
      @theturtlerguy1236 4 месяца назад +5

      Real

    • @Rudraiya
      @Rudraiya 4 месяца назад +4

      Dude, Nvidia built their GPUs from scratch too, their hard work paid off

    • @monad_tcp
      @monad_tcp 3 месяца назад +2

      @@Rudraiya They didn't build it from scratch, they made it out of parts in the inventory of ST Microelectronics, and their 3rd generation of the product, which made their name, was made out of parts for the Sega Dreamcast GPU that didn't work out.
      They were working for other companies, making the parts that went into GPUs, before they were able to actually create the thing themselves.
      They didn't make it all alone out of nowhere; sure it was hard work, and it paid off, but it had to start humble, like this guy with his fixed-pipeline GPU that's already better than the NV1 or even the Riva TNT.
      That makes it even more amazing if you consider that he didn't start from ST Microelectronics' portfolio, it was really from scratch.

  • @toblobs
    @toblobs 7 месяцев назад +215

    I can only imagine what Dylan could do with Nvidia's budget with his talent

    • @kaptenhiu5623
      @kaptenhiu5623 7 месяцев назад +43

      Becomes a trillionaire and dominate the AI market like NVIDIA does rn?

    • @blanketmobbypants
      @blanketmobbypants 7 месяцев назад +3

      But I would definitely buy it

    • @josephdias5859
      @josephdias5859 7 месяцев назад +6

      or just enough of a budget to produce cards to play all the 90s games and early 2000s games

    • @ThreaT650
      @ThreaT650 7 месяцев назад +2

      And a team of engineers, YUP!

    • @ThreaT650
      @ThreaT650 7 месяцев назад

      Seems to have his head screwed on straight too. Would be great at the consumer level.

  • @lineandaction
    @lineandaction 7 месяцев назад +183

    We need open source gpu

    • @namebutworse
      @namebutworse 7 месяцев назад +9

      Im getting flash backs the moment i hear "source"
      (I am Team Fortress 2 player)

    • @thatonegoogleuser4144
      @thatonegoogleuser4144 5 месяцев назад +3

      ​@@namebutworse I can hear that sound bro

    • @sklungofunk
      @sklungofunk 5 месяцев назад

      if feel you there

    • @dracoborne2648
      @dracoborne2648 3 месяца назад

      Communist

  • @liutaurasleonavicius2680
    @liutaurasleonavicius2680 7 месяцев назад +78

    there was also one person who literally combined an Nvidia and an AMD GPU and was able to use DLSS and AFMF together, this must be dark magic

    • @シミズルリ
      @シミズルリ 7 месяцев назад +15

      No magic, you literally just put 2 cards into 2 slots on your motherboard and they work🤣

    • @callyral
      @callyral 7 месяцев назад

      ​@@シミズルリIt just working makes it seem more like magic

    • @shiro3146
      @shiro3146 7 месяцев назад +15

      @@シミズルリ i dont think it was as easy as that bruh

    • @sirseven3
      @sirseven3 7 месяцев назад +3

      ​@@shiro3146it totally can work just like that. Just as you can have 2 NIC's operating at the same time. The main thing to worry about is the drivers for the card. You won't be able to pool the memory space as NVLINK but you do get an additional processor. Utilizing profile inspector you technically could get it to work as NVLINK but it takes manual config. I've ran different ram types with xmp and sometimes I got bluescreens but it was semi stable with a solid overclocking of said RAM

    • @granatengeorg
      @granatengeorg 6 месяцев назад +7

      I do the same in blender, rendering on an rx while using optix denoising on my gtx, all together in realtime in the viewport. Was also quite surprised that it just worked lol.

  • @tech6294
    @tech6294 7 месяцев назад +69

    8:57 The VRAM GDDR chip is to the lower left of the fan cooler. That mini storage card might be BIOS? Not sure lol. Great video! ;)

    • @jashaswimalyaacharjee9585
      @jashaswimalyaacharjee9585 7 месяцев назад +10

      Yup, I guess he is hot loading the vBIOS via SD Card.

    • @dbarrie
      @dbarrie 7 месяцев назад +14

      That’s the DisplayPort splitter, which takes the DP signal from the FPGA and splits it into DP/HDMI to supply the outputs. All of the (slow, DDR3!) RAM on the device is part of the Kria SoM, underneath the fan!
      SD card is there to update the firmware when I’m not running with the card hooked up to the dev machine!

    • @greyhope-loveridge6126
      @greyhope-loveridge6126 7 месяцев назад +3

      @@dbarrie That's a really good way of updating the firmware on the fly - I'm amazed you got this running!
      DDR3 isn't actually the worst memory you could've used, and maybe you could use some older, broken GPUs and de-solder some GDDR6 or GDDR5 to transplant while you get stuff working too? I am fascinated to see how far this comes.

    • @Vifnis
      @Vifnis 7 месяцев назад +1

      @@greyhope-loveridge6126 I doubt this would even work since GPUs aren't out-of-the-box FPGAs... might need to check JEDEC standards first to even see if that's possible, and iirc they only started with GDDR4 and up, wasn't everything before that 666MHz DDR3 VRAM?

    • @proudyy
      @proudyy 7 месяцев назад

      @@greyhope-loveridge6126 Definitely, he has to keep going. The amount of potential in this is crazy. And even though he said in this video that the goal never was competition, he could still become a competitor in the future :P
      Or even found a company which competes in the future, whatever...

  • @oglothenerd
    @oglothenerd 7 месяцев назад +32

    Someday we will have open source GPU instruction sets. I know it. The community always wins.

    • @fatemanasrin7579
      @fatemanasrin7579 7 месяцев назад +1

      And there'll be 9 year old spoiled kids who will buy these thinking they're smart and break them or set them on fire..

    • @xeschire706
      @xeschire706 6 месяцев назад +4

      Or we can just take either risc-v, or a custom & extended version of the 6502 that supports a 64 bit architecture, nyuzi, or even the miaow isa's, & modify & optimize them for efficient graphics processing, for use in our custom, open source GPU's instead, which I think would be a far better route in my opinion.

    • @oglothenerd
      @oglothenerd 6 месяцев назад +1

      @@xeschire706 I like that idea.

  • @oktc68
    @oktc68 7 месяцев назад +12

    This is the most interesting PC oriented video I've seen for ages. Nice1 Vex, nice change of pace.

  • @vishalkumar-dr8wq
    @vishalkumar-dr8wq 7 месяцев назад +25

    It's amazing what he was able to achieve. I used FPGAs during my undergraduate studies, and what he achieved takes an amazing amount of skill and work.

    • @jnharton
      @jnharton 7 месяцев назад +2

      True, but don't discount his 20-30 years of writing code for software rendering.
      Not only does that mean he has a considerable foundation in understanding what the hardware needed to be capable of, it also meant he could write his own kernel driver and an interface API to make his GPU usable under a modern Windows OS!

    • @EulianDax
      @EulianDax 7 месяцев назад +1

      ​@@jnharton what part of op's comment discounts the guy's decades of work?

    • @mrnorthz9373
      @mrnorthz9373 7 месяцев назад +1

      @@EulianDax I don't think he means the comment discounted his skill; he means it for anyone who may think this is a miracle or something unexpected from a guy of this caliber

  • @CipherDiaz
    @CipherDiaz 7 месяцев назад +7

    An FPGA is *NOT* in a Raspberry Pi. They are quite expensive. Also, programming these FPGAs is nowhere near similar to C/C++ or anything like that; you are basically writing the logic for how electricity flows between components. An FPGA also has a clock rate, and the higher the rate, the more these things cost. So for him to get, say, 150 fps, he would most likely need a fast onboard clock. And most likely more gates to work with, since the only way to truly optimize anything on an FPGA is by heavy use of tables. Which might not apply to a GPU, since it's basically moving a ton of data around memory as quickly as possible.
    But yeah, awesome project!

  • @CombatMedic1O
    @CombatMedic1O 3 месяца назад +5

    Basically, we as humans have no open source GPU tech. Sad day for us. Imagine how much faster we could progress if we did. Having basically 3 companies in the entire world owning this important information is wild and almost monopolistic. Patents take way too long to expire in terms of computer years. They should have to reveal source code after 10 years.

  • @StuffIThink
    @StuffIThink 7 месяцев назад +7

    I've been watching him try to make this thing work forever; super cool to see someone else giving him some exposure.

  • @jungle2460
    @jungle2460 7 месяцев назад +39

    If that SD card actually turns out to be the VRAM, that's genius. I'd love to be able to swap SD cards to upgrade VRAM

    • @HamguyBacon
      @HamguyBacon 7 месяцев назад +29

      SD cards are not fast enough to be vram, thats just the bios.

    • @barela3018
      @barela3018 7 месяцев назад +9

      Way too slow. RAM and SSDs are made for different purposes; that's why there is a loading time before games: the SSD loads data into RAM, and then, when the GPU needs it, data is pulled from what's been loaded.

    • @aliendroid8174
      @aliendroid8174 5 месяцев назад +7

      Sd cards are very slow even for storage drives. Vram can do around 100gb per second whilst fast sd cards can do around 100mb per second. So it's about 3 orders of magnitude of a difference but beyond that there's all sorts of stuff like latency, endurance and random read write performance all of which sd cards will be terrible at

    • @HamguyBacon
      @HamguyBacon 5 месяцев назад

      @@aliendroid8174 newer sd cards can hold 128TB and do 1gb per second.

  • @PeterPauls
    @PeterPauls 7 месяцев назад +6

    My first GPU was a 3Dfx Voodoo 3 3000 and 3Dfx made the first GPU available for the masses (AFAIK) and they disappeared around 2000-2002.

    • @SJ-co6nk
      @SJ-co6nk 3 месяца назад

      Technically 3dfx never made a GPU. The first GPU was the GeForce; it contained hardware T&L, which is what set it apart from just a 3D accelerator card.
      Trying to figure out if this card actually has hardware T&L or not.

  • @maxcarter5922
    @maxcarter5922 7 месяцев назад +26

    What a great contribution! Crowdsource this guy?

    • @arenzricodexd4409
      @arenzricodexd4409 7 месяцев назад

      To play Quake at 60FPS?

    • @mrnorthz9373
      @mrnorthz9373 7 месяцев назад +15

      ​@@arenzricodexd4409quake at 60 fps today, cyberpunk at 60 fps tomorrow.

    • @scudsturm1
      @scudsturm1 6 месяцев назад +2

      @@arenzricodexd4409 dont complain if u cant build a gpu yourself and write the driver yourself

    • @arenzricodexd4409
      @arenzricodexd4409 6 месяцев назад +2

      @@scudsturm1 nah, this guy does it for fun. Crowdsource? That's an attempt to take away his passion for this.

  • @Drunken_Hamster
    @Drunken_Hamster 7 месяцев назад +5

    The future where I can piece together and upgrade my GPU like I can the rest of my system would be lit. NGL I'd love for the GPU scene to instead have motherboard chip slots similar to the CPU socket, with their own section for memory, special compute units (to improve ray tracing or AI separately from rasterized processing), and output pathways so you never have to worry about finding a card with the types and quantities of outputs that you want.
    It'd also make cooling simpler and likely more compact, kinda like how it is for CPUs with semi-universal setups that only require certain amounts of height as the sole variable. And it'd DEFINITELY make liquid cooling more accessible, not that I want to do that as much as I once used to.

  • @tropixi5336
    @tropixi5336 7 месяцев назад +123

    "YoU DiDnT mEnTiOn InTeL"

    • @AryanBajpai_108
      @AryanBajpai_108 7 месяцев назад +20

      He did 😂

    • @Prince-ox5im
      @Prince-ox5im 7 месяцев назад +16

      ​@@neon_archhe's mocking someone's comment not saying it

    • @ranjitmandal1612
      @ranjitmandal1612 7 месяцев назад

      😂

    • @tropixi5336
      @tropixi5336 7 месяцев назад +1

      @@AryanBajpai_108 im talking about the start where he said "NVidia and amd are top contenders" ....

    • @urnoob5528
      @urnoob5528 7 месяцев назад +1

      @@ranjitmandal1612 smh

  • @sturmim
    @sturmim 7 месяцев назад +8

    He could solder on some old GDDR6 chips from broken or old GPUs. Would like to see that.

    • @kesslerdupont6023
      @kesslerdupont6023 7 месяцев назад +5

      It may be good enough to just put some DDR5 on there depending on what speed the GPU is currently using.

    • @jcoyplays
      @jcoyplays 7 месяцев назад +2

      He could've used DDR3 and been on par/overkill. (800-2133 MT/s, or about 400-1066 MHz, which would match/exceed the FPGA clock speed)

    • @kesslerdupont6023
      @kesslerdupont6023 7 месяцев назад

      @@jcoyplays yeah that is true

  • @miguelcollado5438
    @miguelcollado5438 7 месяцев назад +18

    Real3D, Mellanox, Realtek made decent GPU's in the 90's as well... but they have eventually all been absorbed by the same 3 major brands in the 2000's...
    Dylan Barrie deserves our community's full support for his work.

  • @cybernit3
    @cybernit3 7 месяцев назад +11

    The biggest hurdle to making a GPU is that you need lots of money to make the GPU chip if it's an ASIC, but later on the ASIC chip would be cheaper than using an FPGA. I wish they could make high performance FPGAs that are cheap, not so expensive. I have to give this FuryGPU guy some credit for making it; this could lead to something decent in the future or inspire future GPU designers. There is also the Vampire Amiga, which made an extension of the AGA Amiga graphics chipset.

    • @Tyrian3k
      @Tyrian3k 7 месяцев назад +2

      It simply can't be as cheap as a chip that is tailor made for the specific desired purpose.
      It's like wanting a van that can perform like an F1 car without it ending up costing more than the F1 car.

  • @BOZ_11
    @BOZ_11 7 месяцев назад +35

    Fury?? ATI Technologies 'bout to make a complaint

    • @core36
      @core36 7 месяцев назад +10

      I don’t think ATI is going to make any complaints anytime soon

    • @BOZ_11
      @BOZ_11 7 месяцев назад +2

      @@core36 so i see sarcasm isn't your strongest suit

    • @urnoob5528
      @urnoob5528 7 месяцев назад +5

      @@BOZ_11 tell that to urself

    • @MarioSantoro-ig5qh
      @MarioSantoro-ig5qh 7 месяцев назад

      Unless he starts selling them they probably wont do anything.

    • @MarioSantoro-ig5qh
      @MarioSantoro-ig5qh 7 месяцев назад

      Unless he starts selling them and he makes a ton of money off them. Its unlikely they will do anything.

  • @CrowandTalbot
    @CrowandTalbot 6 месяцев назад +3

    wasn't Quake the game that used to make or break computer builds back in the day? and his GPU handles it gorgeously? that's enough for me to know he's onto something

  • @UltraVegito-1995
    @UltraVegito-1995 7 месяцев назад +57

    *If only Moore Threads became a successful AI GPU maker in China, causing them to neglect Nvidia or AMD....*

    • @arthurwintersight7868
      @arthurwintersight7868 7 месяцев назад +9

      I just want to see more actual competition, to drive down prices.

    • @zerocal76
      @zerocal76 7 месяцев назад +4

      😅😅 You must know very little about China to make a comment like that. The last thing anyone in the world wants is a gov like China's becoming completely tech-independent, especially in the hardware acceleration & AI space!

    • @arthurwintersight7868
      @arthurwintersight7868 7 месяцев назад +5

      @@zerocal76 - China is highly likely to implode under their own pressure at some point. Especially if their shoddy construction work at the Three Gorges ends up being as bad as people think. In the meantime they can drive down GPU and NAND prices.

    • @arenzricodexd4409
      @arenzricodexd4409 7 месяцев назад

      Still doesn't make them ignore AI.

    • @hey01e5
      @hey01e5 7 месяцев назад +1

      unfortunately, if moore threads became competitive they'd get sanctioned for "national security reasons", leaving us westerners stuck with nvidia and AMD who will just price gouge the GPUs

  • @AleksanderFimreite
    @AleksanderFimreite 7 месяцев назад +1

    I would assume the rates displayed around 5:30 indicate the speed at which the internal game updates (ticks) are running.
    One render seems to take around 25 - 45 ms to draw, and another 5 - 10 ms to clear the data for the next render. This indicates a total range of 30 - 55 ms per update.
    The formula for the rate per second is (1000 ms / x), which gives roughly 33 down to 18 updates per second. That seems consistent with how choppy the enemies move around.
    Camera motion is much smoother than their movements. Despite this, I'm also impressed by the efforts of individuals trying to tackle such a daring project.
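    The same conversion in a couple of lines of Python, using the frame-time range from the comment above (the split between draw and clear time is the commenter's estimate, not a measured figure):

      # Convert per-frame times (ms) into frames per second.
      def fps_from_frametime_ms(frametime_ms: float) -> float:
          return 1000.0 / frametime_ms

      # 25-45 ms draw plus 5-10 ms clear => roughly 30-55 ms per update
      for total_ms in (30, 55):
          print(f"{total_ms} ms/frame -> {fps_from_frametime_ms(total_ms):.1f} fps")
      # prints 33.3 fps and 18.2 fps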

  • @mustafa._.9371
    @mustafa._.9371 4 месяца назад +4

    0:56 You vs. the guy she tells you not to worry about.

    • @fredstead5652
      @fredstead5652 Месяц назад

      Sounds like you're winning. No one likes AMD.

  • @dustindalke7979
    @dustindalke7979 2 месяца назад +2

    He died next week by self inflicted high projectile wounds to the back of the noggin

  • @Hemeltijd
    @Hemeltijd 7 месяцев назад +3

    This is so informative and cool. If you find any more topics like this, can you make more videos on them?

  • @Ele20002
    @Ele20002 6 месяцев назад

    I love projects like this. Making a driver fully compatible with windows, and actually creating all the required ports to connect via PCIe is insanely impressive. Using an existing graphics API would be even more impressive, but modern APIs are so complex these days it'd be a hell of a task for one person, so I can understand skipping that step.
    It'd really be great to get more GPU designs into the open though. GPU architecture isn't really shared in that much detail - everyone does their own thing, so you can only take inspiration from the higher level concepts.
    There's a good reason to hide GPU ISAs behind the driver though - a lot of optimisations can be enabled by the compiler and features integrated into each new version that'd otherwise need convincing developers to add support into their game for.
    Breaking into the GPU space in performance is also difficult because so many optimisations are made specifically targeted at a certain GPU, forcing newcomers to support hardware acceleration of that feature as well to not fall behind in benchmarks, even if there's another way to achieve the same effect that's more efficient on their hardware.

  • @rainbye4291
    @rainbye4291 7 месяцев назад +10

    My man just did the unthinkable. Great effort for making a gpu this good ALONE.

  • @baidajel
    @baidajel 16 дней назад +1

    you don't have to keep saying "I know it's not better than Nvidia, but one person did this, it's insane" - we know it's insane, we are just as shocked as you, if not more.

  • @pauloisip3458
    @pauloisip3458 7 месяцев назад +9

    I can see this guy becoming successful unless nvidia makes a move on the guy

    • @AssassinIsAfk
      @AssassinIsAfk 7 месяцев назад +2

      One of 2 things will happen:
      1) Nvidia becomes Nintendo/Sony and sends a cease and desist
      2) They send him an opportunity to work for them to fix their budget cards, or AMD/Intel send him an opportunity to work for them.

    • @NicCrimson
      @NicCrimson 6 месяцев назад

      @@AssassinIsAfk A cease and desist for what?

    • @AssassinIsAfk
      @AssassinIsAfk 6 месяцев назад

      @@NicCrimson I don't think you understand the joke

  • @jameshadaway8621
    @jameshadaway8621 7 месяцев назад +1

    Great video. I remember wanting 3dfx cards in the 90s and always wanted to work in IT; it's good people can build their own cards as a hobby.

  • @Desaved
    @Desaved 7 месяцев назад +14

    We're at the point where GPUs are more expensive than the entire rest of the computer!

  • @Alexruja3227
    @Alexruja3227 7 месяцев назад +7

    Literally became the nr 1 viewer

  • @tony_T_
    @tony_T_ 5 месяцев назад

    Imagine a GPU that's developed by the community. There are so many geniuses just out there who I bet would love to work on this. This could actually go so far.

  • @gamma2816
    @gamma2816 7 месяцев назад +7

    Future prediction:
    1. Hardware will become open source but lackluster but you can now build individual pieces of your PC just like you could the PC itself before.
    2. At first the trend will be for people in the know and won't affect the market.
    3. Some content creator, say Linus, will build an insane GPU and CPU and people copy it.
    4. Now the open source hardware is so adept that it's actually a market threat for Nvidia and AMD.
    5. Building becomes more streamlined like PC building and you can now buy parts that just click in place, building your own chips like legos, again much like PC building.
    6. Mainstream GPUs and CPUs cannot compete as, much like with mods for games, the vanilla experience just can't compete with what Johnny cooked up in his mom's basement.
    7. For Nvidia and AMD to compete they now have to adapt so either they or some up and comer OR Intel will start making licensed parts for projects like this and will promise, "XYZ if you buy XYZ from X company and not Y or Z!" And it will be up to the individuals to check which combination is best for performance just adding another customisation choice for PC gamers.
    8. A huge wave of insane complaints online about performance because optimising for this many combinations of parts on PC is impossible for developers.
    9. Developers are forced to focus on specific gear so they make games optimised for specific systems, this way the top company will buy performance from devs making them optimise for their parts.
    10. Now open source is once again dead because "Why buy your own gear with bad performance when you can buy a full Nvidia board with optimisation lol!"
    11. Back to square one as people are forced to buy full boards or brand items, aka a premade GPU, aka what we have today.
    12. 🤷‍♂️🤣

    • @powercore9277
      @powercore9277 7 месяцев назад +3

      really doubt it bc have you seen the machines that intel uses to make their cpus? they need to use machines made by one specific dutch brand ASML and those machines are not cheap

    • @gamma2816
      @gamma2816 7 месяцев назад

      @@powercore9277 Well I agree, probably absolutely not in the near future, but given that tech evolves and a lot of "too expensive for civil use" things in the past are now cheaply available, well relatively anyway. There was a time when cars were not an everyman's thing but here we are, so near future, absolutely not, but who knows later down the line. 😝
      But then again we are at a little stop in evolution tech wise as transistors have reached their size limit in how small they can be if I understood it right. But if quantum tech is invented to a civilian usage then maybe we'll continue, but I know nothing about this, it's just what's currently suggested. But hey, if their theories are true and quantum computers reach civilians in gaming then ping will be a thing of the past as all will have instantaneous server reach SUPPOSEDLY.

    • @sergemarlon
      @sergemarlon 7 месяцев назад +1

      I didn't see you mention AI. You pointed out the step in which human developers will fail and it seems like you don't think that's the perfect role for AI to fill.

    • @gamma2816
      @gamma2816 7 месяцев назад

      @@sergemarlon Very true! I guess I just hope not. 😅 I love AI tech but it freaks me out, it's like staring the thing that will end you in the face. AI is great, but terrifying, so I don't want to think about it too much. Guess it's much like the nuke, great in power and as a scientific marvel that ended wars, but terrifying when you think about it for too long. 😅
      But you're right, AI could probably handle it, but an AI built to build more machines that we don't understand, hence why we need them to do it, is a little uncomfortable for me. 😝

    • @jamesspencer1997
      @jamesspencer1997 3 месяца назад +1

      When you talk about MODS are you talking about paid DLC vs what some fan creates? I honestly have always been amazed how fans have actually breathed new life into even old games with the mods they have made. We're talking mainstream game studios with budgets in the millions making a DLC, and still Johnny made something far more spectacular, and he would be happy if someone bought him a cup of coffee or some hot pockets. I think it has to do with love.

  • @sklungofunk
    @sklungofunk 5 месяцев назад

    well it's clearly not meant to be competitive as a GPU, but the fact that a whole GPU (and maybe other PC parts in the future) can be open source just feels like a revolution to me. I mean, if you look at the costs of production and the costs of running a business with shipping costs etc., like if you were to buy something like this on Amazon or something, we could cut out the royalty costs since this would be open source. And if people like this hero were to build other open source PC parts in the future, we could build a whole PC without royalty add-ons, and that would let us buy far less costly hardware (that doesn't cost less because of sketchy ways the owners came to own it) to build far less costly PCs, clearly at the cost of performance, for DIY home servers or NASes, because NASes or home servers usually don't need much performance; maybe a couple of parts could come from more performant sellers to raise the trustworthiness of the machine a little, but this whole concept seems like a deal to me

  • @flakes369
    @flakes369 7 месяцев назад +4

    TempleOS energy

  • @rtchau4566
    @rtchau4566 4 месяца назад

    "You might hear that the audio is kinda crackly, I don't think it's running perfectly..."
    I was there, Gandalf. I was there 28 years ago when Quake first came out. That's what it sounds like, more or less.

  • @ohnoitsaninja
    @ohnoitsaninja 7 месяцев назад +4

    It's not hard to make a graphics card if we abandon our current software library.
    It's very hard to make a graphics card that's compatible and performant on every version of OpenGL, DirectX and Vulkan that has ever come out.

  • @EmeraldOtringal
    @EmeraldOtringal Месяц назад

    Just to give a bit more clarity: the FPGA is something like a homogenous array of logic gates (hence the GA from the name), which is like a huge field of general purpose boxes (the logic gates = simple circuits made out of few transistors that can do basic logic operations like NOT, AND, OR etc.). So you go into this huge field of gates that are all the same, and you now start programming "some" of them, so that you basically "carve" your circuit in a logical fashion just like you would on a Minecraft level. Or like you do with data in general (you have a huge array of cells able to store a 1 or a 0, but initially they're all the same, then by making "some" of them 1 and "some" of them 0, you now have meaningful information). So, because he cannot manufacture his own silicon die chip, he takes one that's already built but not initialized with anything, so he can use software to enable/disable his own custom paths inside the silicon.
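    In real FPGAs the repeated cell is usually a small look-up table (LUT) rather than a literal gate, but the "carving a circuit by filling in configuration data" idea is the same. A toy sketch of one such cell in Python (the names here are illustrative, not from the video):

      # Conceptual model of a single FPGA logic cell: a 2-input look-up table.
      # "Programming" the FPGA means filling tables like this (plus the routing
      # between cells); the silicon itself never changes.
      def make_lut2(truth_table):
          """truth_table maps (a, b) -> output bit for every input combination."""
          return lambda a, b: truth_table[(a, b)]

      # The same generic cell becomes an AND gate or an XOR gate purely through data.
      and_gate = make_lut2({(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1})
      xor_gate = make_lut2({(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0})
      print(and_gate(1, 1), xor_gate(1, 0))   # -> 1 1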

  • @TriPBOOMER
    @TriPBOOMER 7 месяцев назад +5

    the 4060 is not the same as the 3060: it has fewer of ALL the cores and processors, less memory bandwidth, and less RAM (8 GB vs 12 GB). Statistically the 3060 is a better card in every way, and with a little OC, which the 3060 is more than happy to give, the generational gains get lost; other than frame gen & DLSS 3, the 3060 is the better card

    • @darthpotwet2668
      @darthpotwet2668 7 месяцев назад +1

      Fg and dlss3 are the same thing?

    • @RogueSamuri7676
      @RogueSamuri7676 7 месяцев назад

      Well considering u can't get a 16gb 3060 but u can get a 16gb 4060 Ti, it does make a difference; not only that, but the Asus ProArt 4060 Ti 16gb is a little faster than the 3060, tho I don't see much of a difference in the speed, nor dlss2 vs dlss3, not much of a difference there either

    • @TriPBOOMER
      @TriPBOOMER 7 месяцев назад

      ​@@RogueSamuri7676 Yes 3060 has only got 12 gb, which is plenty for the resolutions its aimed at I have never ran out and never seen it swapping with the system ram in 1080p and some 1440p, both resolutions are fine with 12 gb & the 3060 isn't trying to be a 4k card so 4k is irrelevant, infact its not really a 1440p card but it can do a bit, my card runs an OC of 20% on the die and just over a 1ghz on the Vram, its keeps up with the 4060, beating it in few 1080p spots in the bench numbers and game runs I can compare to online, oh and cards claim to fame, my 3060 out benched a pro w6800 in Cinebench 2024 Gpu render, and 3rd fastest UK 3060 in port royal 🙌😎🤣and yes the 4060 Ti is faster than a 3060 but in the same way a 3060 Ti is faster than a 3060 lol

    • @TriPBOOMER
      @TriPBOOMER 7 месяцев назад

      @@darthpotwet2668 yes they are my bad, didn't mean the & or to type them that way around eg. dlss3 frame gen, lol brain fart while my kids were pecking my head lol

  • @PhoenixKeebs
    @PhoenixKeebs 7 месяцев назад +2

    Unlike the other companies, Intel actually has a chance, since they have billions of dollars to spend on their GPUs and a large team to pump out drivers. I think by Celestial they will be able to keep up with the other 2 companies. For now we should "just let them cook."

  • @mloclam6917
    @mloclam6917 7 месяцев назад +4

    But will it run Crysis

    • @imafirenmehlazer1
      @imafirenmehlazer1 7 месяцев назад

      "Only gamers will get that one"-jensen swoosh

  • @BloodravenRivers
    @BloodravenRivers 2 месяца назад

    i am astounded that he managed to do that. it means that the bar is only gonna get higher. it may not be earth shattering, but every earth shattering development usually starts with the pioneers, the pathfinders. he's laid the groundwork for future enthusiasts and small devs to make, maybe not earth shattering, but market busting developments.

  • @benjamingavrilis71
    @benjamingavrilis71 7 месяцев назад +13

    I'm gay

  • @grandmasterautistwizard4291
    @grandmasterautistwizard4291 6 месяцев назад

    Shout out to this guy. The moment it's viable for the everyday fella, the opensource community is gonna go fucking crazy with it.

  • @R3TR0J4N
    @R3TR0J4N 7 месяцев назад +1

    unironically the Chinese market has been a blessing for budget builders, notably the e-sports-aimed GPUs with the "SP"s, especially when the brand they represent is a sister brand like Palit (Taiwan) and Inno3D

  • @iRubisco
    @iRubisco 3 месяца назад

    Honestly, the possibility of an open source GPU and the information we need to accomplish it is worthy of any award! Amazing video ☺️

  • @noth606
    @noth606 7 месяцев назад +1

    A Raspberry Pi has zero to do with an FPGA, just a relatively important point; Pis use off-the-shelf chips: slap them on a custom board with some RAM and shizzle, connectors etc. and call it a day. The FuryGPU is a significantly more involved project than that. I could throw together a RaspPi sort of equivalent thing in a couple of weeks for the hardware, a month or two for the adaptations of the codebase to the board, assuming the chosen CPU is reasonably supported. It would take me years to fart out something approaching the FuryGPU if I'm lucky and have loads of resources. I am not bullshitting, I have done sort of similar things to the RaspPi, but different purpose and lower power. I ended up not proceeding beyond making a first series of fully fabbed and tested boards in the end, but that didn't have anything to do with the hardware or software, but with me splitting up with my then GF, having an argument with my business partner at the same time, having to move, and within less than a year finding a new GF, deciding to get married and getting married. New wife soon got pregnant and priorities shifted from fugging around with tech to different things. 10 yrs ago.

  • @tukebox-mf9bo
    @tukebox-mf9bo 6 месяцев назад

    we learn how to program FPGA boards like the MAX 10 in school in Vienna, Austria; it's such a complicated process, props to the guy

  • @FlockersDesign
    @FlockersDesign 7 месяцев назад +2

    If his frametime is around 38 ms, that means it's running below 30 FPS.
    60 FPS is a frametime of around 16 ms.
    And yes, before someone asks, this has been my job as an environment/lighting artist in the game industry for 12 years

    • @diaman_d
      @diaman_d 7 месяцев назад

      16.6 ms to be precise

  • @mtcoiner7994
    @mtcoiner7994 3 месяца назад

    This type of stuff is very interesting. It always blows my mind when I see people upgrading GPU memory modules. Makes you wonder why the manufacturers wouldn't have maxed out the memory potential in the first place.

  • @dienand_gaming
    @dienand_gaming 3 месяца назад +1

    Love how he straight up says writing the driver was the hardest part 😂

  • @kaimanic1406
    @kaimanic1406 7 месяцев назад +1

    I can't even imagine to build my own GPU. This guy is amazing!

  • @emiljagnic
    @emiljagnic 6 месяцев назад

    Awesome, thank you for reporting about this!

  • @rosergio7227
    @rosergio7227 7 месяцев назад +1

    An Open source GPU... No way!

  • @audiocrush
    @audiocrush 4 месяца назад

    I'm seriously impressed by this dedication and this awesome result!
    But in response to some comments down below: Just to give a little perspective here.
    An FPGA powerful enough to get you even remotely close to a regular run of the mill modern low end/mid range GPU will run you anywhere north of 50k
    not even considering the horrendous amount of support components needed to be able to run one of these behemoths.
    Also, there are probably technical limitations on how the FPGA can be switched to represent the circuitry of a full-blown GPU. Maybe you could use multiple FPGAs to represent the functional blocks of one GPU, not sure though. Something like a single modern open source reverse-engineered CUDA core with nothing else might be all such a thing could run.
    I'm not an expert, but I can't see even the most powerful general purpose FPGAs we currently have being any faster than maybe an old GeForce 8800 GTS or something.
    I'm not saying open source GPUs can't be done with an fpga.
    My point is that they will never be fast enough to be useful besides proving a point.
    And if it is to be an open source GPU project, it has to be something as flexible as an FPGA otherwise the community can't do the many many development testing release cycles as is common with other software projects.
    You can't tape out new silicon just because someone wants another change/improvement because there would be literally hundreds of people working on that (like if you could compare it to an effort like developing the linux kernel that is)

  • @dennisestenson7820
    @dennisestenson7820 7 месяцев назад

    About 10-15 years ago I worked on a product that used an FPGA to generate video output. It's impressive, but definitely not unheard of.

  • @Skullkid16945
    @Skullkid16945 7 месяцев назад

    I hope there is an open source hardware boom soon, or at the least more big names getting involved in finding ways to make things like this more available for the open source community as a whole.

  • @LuGaKi
    @LuGaKi 5 месяцев назад +1

    "imagine having a pre-build GPU"

  • @c.n.crowther438
    @c.n.crowther438 7 месяцев назад

    I will be following Dylan Barrie's work with great interest.

  • @guarand6329
    @guarand6329 7 месяцев назад

    I bet if he took the transistor design and converted that to dedicated silicon vs the fpga, it would run faster.
    Pretty cool that he created a gpu design, also wrote the driver, and it's working.

  • @bananaman8693
    @bananaman8693 7 месяцев назад +2

    This man will become head designer at NVIDIA bro trust me

  • @MyouKyuubi
    @MyouKyuubi 7 месяцев назад +2

    the GT 1030 is an absolute gigachad of a card, no joke... It's the best graphics card you can get that can get by purely off of passive cooling (Radiator with no fan)! So it's brilliant for like tiny, almost pocket-sized "portable stationary" PC builds. :)

    • @vasoconvict
      @vasoconvict 7 месяцев назад +1

      It absolutely is a joke. Mini pcs can run off integrated graphics that beat it to the ground, intel and some nvidia card makers are making tiny gpus that are still quiet as passive cooling is only a good idea if you have something extremely underpowered, dust, or you cant bear 20db of noise. Its about time the gt 1030 dies.

    • @MyouKyuubi
      @MyouKyuubi 7 месяцев назад

      @@vasoconvict Integrated graphics can't run entirely off of passive cooling though... They need fans blowing air on them. :P

    • @vasoconvict
      @vasoconvict 7 месяцев назад

      @@MyouKyuubi Passive heatsinks made out of copper..

    • @MyouKyuubi
      @MyouKyuubi 7 месяцев назад +1

      @@vasoconvict nah, dude, integrated graphics uses CPU... there's absolutely no way you can use passively cooled integrated graphics without overheating the CPU playing something like Half Life Source. :P
      You're gonna need a MASSIVE heatsink at the very least in order for passive cooling to work, at which point, we're no longer in the pocket-sized scale of computers.
      Get real, bro.

  • @m7amed-roshdy
    @m7amed-roshdy 7 месяцев назад +1

    tbh i have a feeling if this goes public it would compete with intel.

  • @Mio96O-O
    @Mio96O-O 7 месяцев назад +1

    It would probably be cheaper and more fun if there were like 15 companies that could make their own CPUs and GPUs instead of 2 or 3 companies monopolizing the market

  • @chokeizm
    @chokeizm Месяц назад

    And this project will most likely get him his next job. I just hope he carries on with his personal projects ❤

  • @gamekiller9053
    @gamekiller9053 7 месяцев назад +1

    That open-source gpu is not meant to be anything but a toy

  • @jasont80
    @jasont80 6 месяцев назад

    He's basically using a single-board computer to render graphics in software. It will never be close to a modern GPU, but this level of tinkering is amazing. Love it!

  • @Gamesational1
    @Gamesational1 5 месяцев назад +1

    Cool! Now we need Open source PhotoLithography devices.

  • @poohbear4702
    @poohbear4702 7 месяцев назад

    I've been wondering for a few years whether someone would do this. Very cool!

  • @adebolaadeola
    @adebolaadeola 3 месяца назад +1

    the sound is fine - that was the sound back then

  • @marisbarkans9251
    @marisbarkans9251 6 месяцев назад

    now I finally understand how people heard and saw me when I was younger. You CAN make a computer just on test boards with wires and ICs. You can make a GPU from the same. You can re-solder memory, CPUs, GPUs and make frankenstein cards. You can power-mod GPUs. You can LN2 overclock. You can do all sorts of things, but it's all for fun and learning. Even upgraded cooling solutions are usually useless unless your card has very shit cooling. The most useful thing I found was modding laptop heatsinks, if you didn't care about the looks or noise. All of this has been done already. I would say it would be more interesting to combine GPUs and write drivers to get them working than to make your own; in the end you don't really make your own anyway, because you take a universal chip. So why not use old AMD or Nvidia cards to swap GPU dies and RAM etc.?

  • @Revoku
    @Revoku 7 месяцев назад

    the CPU/GPU on a Raspberry Pi is an ARM CPU plus whatever internal GPU; both have fixed instructions/pathways. An FPGA is a chip where you can program the gates/instructions/pathways yourself:
    you run code that changes the configuration of the chip

  • @zawadlttv
    @zawadlttv 7 месяцев назад +2

    the SD card probably holds the programming of the chip; probably the easiest way to update it

  • @crazyidiot5309
    @crazyidiot5309 7 месяцев назад

    LOVE the Meze's you're running.

  • @imSkrap
    @imSkrap 2 месяца назад

    Crazy, one thing that’s bothered me so much about GPU’s is how disgustingly HUGE they are getting… like isn’t the modern idea of innovation to make stuff better in smaller packages?

    • @samuelsmith9582
      @samuelsmith9582 Месяц назад

      Moore's law is dead buddy. We've reached a point where it's nearly impossible to make smaller chips because the gates lose electrons iirc.

  • @Ninetails94
    @Ninetails94 7 месяцев назад

    it wouldn't be too hard to make the GPU itself, the self-written code is the hardest part,
    so the fact that some dude made a GPU from scratch is pretty neat; hope someday we could get custom GPUs that outperform the major players.

  • @aggressivefox454
    @aggressivefox454 6 месяцев назад

    For how customizable and "freeing" the PC market is, with a wide variety of interchangeable parts, operating systems, etc., I would have expected there to be less of a monopoly on GPUs. I always got the impression they were just mini computers, so I kind of thought you might be able to easily build them yourself (granted, not as easily as a PC). Open source and custom built GPUs would be awesome to see though. I'd love to have a multi GPU setup that I can build myself

  • @hithmedia
    @hithmedia Месяц назад

    Now when we start building our pcs, we’re gonna build our GPUs with upgradable VRAM (one could wish)
    This is super sick, i love that it’s open source too

  • @bubmario
    @bubmario 7 месяцев назад

    I think what is important about something like this is that it can open gateways which are normally locked and up to the vendor to approve. If there is something that can't be done on an Nvidia or AMD GPU, maybe this card can be a solution to that in years' time. Open source stuff is really important for that.

  • @rBennich
    @rBennich 3 месяца назад

    This would be so cool to use as a direct gpu "emulator" in conjunction with virtual pc's for retro gaming. Like 86box, where you choose your retro parts at will, and with this card, it would reprogram the FPGA on the go, while you have it outputted to a secondary CRT monitor, for instance, and use a shortcut command to switch keyboard and mouse focus between the host and virtual pc. One virtual GPU to rule them all.

  • @cryptogenik
    @cryptogenik 3 месяца назад +1

    FPGA will not compete with dedicated silicon, ever.

    • @mrnorthz9373
      @mrnorthz9373 2 месяца назад +1

      What is your point? Everyone knows FPGAs aren't for mass-produced products. He is not trying to sell anything. Because he's doing it as a fun project, he is using an FPGA

  • @GHOULZ1500
    @GHOULZ1500 10 дней назад

    0:26 audio is probably how it just sounded in these old games

  • @tristan6509
    @tristan6509 6 месяцев назад

    you should take a look at the MiSTer, they not only emulate the GPU, but also CPU, RAM, firmware, etc of old consoles with an FPGA. it's some wild stuff.

  • @reinaweis
    @reinaweis 3 месяца назад

    If I had to guess, the barrel jack is to power the card when not in a system, and the sd card is to load on new firmware. This is in line with other FPGA's.

  • @Md.ImranHossain-uq4gu
    @Md.ImranHossain-uq4gu 5 месяцев назад +1

    How can I replace the M.2 slot with a Thunderbolt port in a laptop in order to add an external GPU?

  • @DaemonForce
    @DaemonForce 7 месяцев назад

    Generation lockout isn't just a thing between CN cards and random mobos. There are a LOT of modern day mobos that flat out HATE anything 3080Ti. It's a nice fun game of Russian Roulette slotting one of these chonkers in and powering it on wondering if your combo is one of these deadends.

  • @Sunowaddle
    @Sunowaddle 4 месяца назад

    Insane, never knew how extremely impressive GPU's were before this video, now I have a newfound respect for these things haha

  • @AiyanaArna
    @AiyanaArna 3 месяца назад

    your creativity blows me away!