The FUTURE of GPUs: PCM

  • Published: 4 Oct 2024
  • Urcdkeys.Com 25% code: C25 【March Madness Sale】
    Win10 pro key($15):biitt.ly/pP7RN
    Win10 home key($14):biitt.ly/nOmyP
    Win11 pro key($21):biitt.ly/f3ojw
    office2019 pro key($50):biitt.ly/7lzGn
    office2021 pro key($84):biitt.ly/DToFr
    MS SQL Server 2019 Standard 2 Core CD Key Global($93):biitt.ly/oUjiR
    Support me on Patreon: / coreteks
    Buy a mug: teespring.com/...
    My channel on Odysee: odysee.com/@co...
    I now stream at:​​
    / coreteks_youtube
    Follow me on Twitter: / coreteks
    And Instagram: / hellocoreteks
    Relevant links:
    www.nature.com...
    www.deepdyve.c...
    www.freepatent...
    www.freepatent...
    Footage from various sources, including official YouTube channels from AMD, Intel, NVidia, Samsung, etc., as well as other creators, is used for educational purposes in a transformative manner. If you'd like to be credited, please contact me.
    #ai #nvidia #amd

Comments • 141

  • @6Ligma
    @6Ligma 6 months ago +177

    Crank up the subwoofers bois, Coreteks published a new video

    • @JoshuaFlower-bl3ey
      @JoshuaFlower-bl3ey 6 months ago +1

      I knowwwwww

    • @InquilineKea
      @InquilineKea 6 months ago +4

      IT'S BEEN QUITE A WHILE

    • @cedricdellafaille1361
      @cedricdellafaille1361 6 months ago

      Owww yeahhhhhhhhh

    • @hiXhaX-YT
      @hiXhaX-YT 6 months ago +6

      His voice is pretty off this time

    • @ADB-zf5zr
      @ADB-zf5zr 6 months ago +2

      I stopped watching them because I can't stand the bot voiceover, but this video caught my attention. It's a shame, really, that I downvoted it; if they used a human for the voiceover it would get an upvote and I would still be a subscriber. Their choice, their reward.

  • @Kratoseum
    @Kratoseum 6 months ago +78

    These dives into current limitations, ongoing research and future potential are when I think this channel is at its best! Thanks for the hard work.

    • @timginter146
      @timginter146 6 months ago +3

      I thought the same - I missed these ~20-minute Coreteks videos focused on technology, history, data and facts. Great video, great to see them back!

  • @Vatharian
    @Vatharian 6 months ago +15

    1. Jim Keller knows what he is doing.
    2. Cerebras is a third way: its fabric is so wide that it can offload memory.
    3. PCM requires huge temperature swings. Optane had to heat single-bit cells to 400 °C; its limited lifespan came specifically from cracking issues.
    4. Stacking requires the ability to move all of the compute layer's heat load through all of the memory layers. 3D V-Cache showed this precisely: every layer literally robs the underlying die of TDP (rough numbers sketched below). :C
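
    A back-of-the-envelope illustration of point 4, with assumed round numbers (they are this sketch's assumptions, not from the video or the comment): stacked dies add thermal resistance in series between the compute hotspot and the cooler,

    $$T_j = T_{amb} + P \cdot (R_{sink} + n \cdot R_{layer})$$

    Assuming a 150 W compute die, $R_{sink} = 0.2$ K/W and $R_{layer} = 0.05$ K/W per stacked memory die: $n = 0$ gives $T_j = 25 + 150(0.2) = 55$ °C, while $n = 4$ gives $25 + 150(0.4) = 85$ °C. Under those assumptions, each stacked layer costs roughly 7.5 °C of thermal headroom, which the compute die must pay back out of its TDP budget.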

  • @Sythemn
    @Sythemn 6 months ago +21

    I've been following alternative memory types since 2005 and continue to be frustrated that the closest any of them came to useful commercialization was Intel's proprietary, half-a**'d Optane (PCM) experiment. MRAM is finding industrial embedded uses at stupidly inflated prices but doesn't really seem to be in any hurry. I hadn't seen FeRAM mentioned in years.
    Glad to see the big guys are finally revisiting these.

    • @seylaw
      @seylaw 6 months ago

      I am also following the news on new memory types. Nantero also had a cool Hot Chips presentation some years ago, but it was never heard of again.

    • @thecat6159
      @thecat6159 5 months ago

      That's because most of the claims are based on exaggerated hype and highly selective research that isn't representative of real-world situations.
      Nothing exemplifies this more than the huge number of startups that have attempted to commercialize non-volatile memory over the previous two decades, with none of them ever producing a viable product that can compete with current legacy memory technologies.

  • @Buddy308
    @Buddy308 6 months ago +9

    At the very end of this video, Coreteks states that there is no other channel doing this level of research. That's exactly what I was thinking for the last ten minutes before that statement. He and his videos are unique in this area of technology reporting. Even though much of what he presents is over my head, I'm compelled to donate in order to keep the channel viable. I hope others feel the same.

    • @kelvinnkat
      @kelvinnkat 6 months ago +2

      Asianometry is comparable. Not quite as in-depth, but close enough to be comparable at least.

    • @Buddy308
      @Buddy308 6 months ago +1

      @@kelvinnkat
      I'm on it. Thanks

  • @reinerfranke5436
    @reinerfranke5436 6 months ago +8

    Stacking scales down power density for compute. If the sweet spot is less compute and more efficient data fetching, then memory with local compute is the right way.

  • @abcqer555
    @abcqer555 6 months ago +14

    Awesome video!

  • @mikebruzzone9570
    @mikebruzzone9570 6 months ago +7

    Ferromagnetic and other forms of resistive RAM (spin torque) are more likely than PCM, which has its own system heat issues that can flip crystalline bits. Intel has shown they can essentially stack mylar cells with Optane, but not profitably. Ultimately it all comes down to cell size, and stacked SRAM is very cost-effective, albeit, as noted here, power-consuming. The stability and endurance of solid-state magnetics, spin and tunneling will win, and they are winning now in industrial embedded and aeronautics, where they are much more stable. The intermediate step is to place SRAM-laden FPGAs with various processing elements onto a GPU, but ultimately the trend is toward solid-state memories; a return to solid-state memories, actually. Do your research. mb

  • @vitormoreno1244
    @vitormoreno1244 6 months ago +13

    I always wondered why FeRAM didn't catch on; I guess it's a Cypress scale problem. But the tech is awesome: it is very low power and has no write delay, writing at bus speed. I've used it in my projects since 2020.
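
    A minimal sketch of that "no write delay" property, assuming a Fujitsu MB85RS-series SPI FeRAM wired to a Raspberry Pi and driven through the spidev library (the part, wiring and opcodes are this sketch's assumptions, not the commenter's): unlike flash or EEPROM, there is no write-in-progress polling; a write is finished as soon as the bus stops clocking.

        # Assumed part: MB85RS64-style SPI FRAM; verify opcodes against the datasheet.
        import spidev

        WREN, WRITE, READ = 0x06, 0x02, 0x03    # write-enable / write / read opcodes

        spi = spidev.SpiDev()
        spi.open(0, 0)                          # bus 0, chip-select 0 (wiring assumption)
        spi.max_speed_hz = 10_000_000

        def fram_write(addr: int, data: bytes) -> None:
            spi.xfer2([WREN])                   # latch the write-enable bit
            spi.xfer2([WRITE, (addr >> 8) & 0xFF, addr & 0xFF] + list(data))
            # No status polling here: FeRAM commits at bus speed, whereas flash or
            # EEPROM would need a "wait until write-complete" loop at this point.

        def fram_read(addr: int, n: int) -> bytes:
            resp = spi.xfer2([READ, (addr >> 8) & 0xFF, addr & 0xFF] + [0] * n)
            return bytes(resp[3:])              # drop the 3 echoed command/address bytes

        fram_write(0x0000, b"hello")
        assert fram_read(0x0000, 5) == b"hello"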

  • @4.0.4
    @4.0.4 6 months ago +2

    There is another reason for CUDA's dominance: AMD's incompetence. Consider the case of the genius hacker George Hotz (iOS jailbreaks, reverse engineering the PlayStation 3) developing an AI box for the home (TinyBox) and giving up on AMD after offering the olive branch of fixing their bad drivers for them (if AMD open sourced them).

  • @CreepToeJoe
    @CreepToeJoe 6 months ago +4

    It's always exciting when humanity pushes the limits even further! Thank you for making this video and keeping us up to date! 🙂

  • @Ivan-pr7ku
    @Ivan-pr7ku 6 months ago +3

    Nvidia switched from graphics-first designs way back in 2006 when they released their first unified-shader GPU with CUDA support. Since then, GPU architectures have evolved into parallel compute machines with some attached graphics functionality. The same process for AMD began with GCN, and today the company has two distinct architectures -- CDNA for pure (enterprise) compute and RDNA with balanced graphics and compute features. Anyway, in the near future even graphics rendering will be mostly driven through compute (incl. RT) and AI inference, and less by classic raster shading... if we are to believe Nvidia, at least.

  • @Phil-D83
    @Phil-D83 6 months ago +4

    Intel Optane was basically phase-change memory

  • @Zujanbre
    @Zujanbre 6 months ago +4

    Germanium, antimony and tellurium. Was the AI that made the discovery called SHODAN?

  • @RubiconV
    @RubiconV 4 months ago +1

    How do you never take a breath and keep talking so fast for 15 minutes? Amazing.

  • @gethinfiltrator6700
    @gethinfiltrator6700 6 months ago +3

    time: 18:46
    The word "AI": 25 times

  • @AnkushNarula
    @AnkushNarula 6 months ago +3

    Great coverage - thanks! PCM reminds me of HP's "memristor" announcement from 2008. Do you have knowledge of it, or maybe an inside scoop on the progress? It would make for an interesting technical video!

  • @keyboard_toucher
      @keyboard_toucher 6 months ago +2

    12:01 latency increased--not decreased--by 3% on average.

  • @Austin1990
    @Austin1990 6 months ago +1

    Chiplet GPUs could only go so far without significant on-chiplet memory. So, we will see if they pull it off.

  • @florin604
    @florin604 6 months ago +1

    This guy describes Intel's old faithful Optane as a revolution... amazing

  • @aalhard
    @aalhard 6 months ago

    The return of the Transputer😊😊🎉. Makes you wonder what might have been...

  • @_DarkEmperor
    @_DarkEmperor 6 months ago +3

    Stacking memory on top of a chip performing computation is not such a great idea unless it is a low-voltage, low-power chip.
    The proper way of doing things is to stack the compute chip on top of the memory; that way you can put cooling directly on top of the compute chip.

    • @roqeyt3566
      @roqeyt3566 6 months ago

      That's how Adamantine does it, right?

  • @seylaw
    @seylaw 6 months ago +2

    @coreteks What about Samsung's and SK Hynix's concepts of in-memory computing? Samsung already had a prototype with an AMD GPU and gave a presentation at Hot Chips 2023 about it.

  • @covert0overt_810
    @covert0overt_810 6 months ago +4

    yes… but can it run Crysis?

  • @ATestamentToTech
    @ATestamentToTech 6 months ago +2

    I really hope AMD wins this race. With the patents filed they obviously have a roadmap in place. The next decade is going to change the world as we know it. Great video

    • @mikelay5360
      @mikelay5360 6 months ago +2

      😂😂 ohh dear

  • @pf100andahalf
    @pf100andahalf 6 months ago +4

    Excellent video.

  • @ntal5859
    @ntal5859 6 months ago

    Legend has it the leather jacket was stolen off Arnie, and he (Jensen) really is a T-800 sent to advance Skynet and prepare us meatbags for our AI overlord... All hail Skynet.

  • @--waffle-
    @--waffle- 6 months ago +2

    where are the sick forest graphics at the start from?

  • @axl1002
    @axl1002 6 months ago +1

    I was going to say "...but Jim Keller said"🤣🤣🤣

  • @ninthburn1199
    @ninthburn1199 6 months ago

    Interesting deep dive! Thanks for sharing it with us

  • @selohcin
    @selohcin 6 months ago

    Those Iranian researchers are incredible. I really hope Nvidia (or AMD?) pays them a lot of money to join their staff and integrate this technology into their products.

  • @korinogaro
    @korinogaro 6 months ago +25

    I don't think this dude has any valid "future of" video. Like, where is our RISC-V revolution?

    • @pf100andahalf
      @pf100andahalf 6 months ago +1

      "Better late than never" seems to be an appropriate thing for me to say.

    • @korinogaro
      @korinogaro 6 months ago +1

      @@pf100andahalf True, but by the same virtue I can predict that anything will revolutionize anything, and as long as it doesn't vanish it stays in a state of "but it's getting there". Maybe I should make a video about the glass substrates for CPUs that Intel is working on, build hype around it, and then forget to say that Intel predicts they need at least 10 more years.

    • @korinogaro
      @korinogaro 6 months ago +1

      BTW, take his "2024 is Intel's ALL-IN year" and "AMD should be WORRIED". Dude made short-term predictions about the CURRENT year a MONTH AGO and so far is completely in the wrong. So far Intel is trying to sell the 14900KS for sick money, and all leaks show that their CPUs this year will be kinda shit. The most revolutionary thing they've done so far is announce a change in naming scheme.

    • @nossy232323
      @nossy232323 6 months ago

      @@BlackLixt That's what she said!

    • @mattBLACKpunk
      @mattBLACKpunk 6 months ago +2

      His ARM video, arguably

  • @IraQNid
    @IraQNid 6 months ago +2

    Graphics processing units weren't meant for gaming. They were meant to provide better-looking visuals for whatever you were doing with your computer, and to take over workload from the general-purpose CPU/FPU combo chips so that those could operate more effectively.

  • @Integr8d
    @Integr8d 6 months ago +1

    Bottom heat sink is the key

  • @davidswanson9269
    @davidswanson9269 17 days ago

    PCM has write-endurance issues as well and will eventually burn out and fail.

  • @profounddamas
    @profounddamas 6 months ago

    Wasn't that a fail with those small companies Intel and Micron using 3D XPoint?

  • @GeorgeD1
    @GeorgeD1 6 months ago +3

    Celso's voice is extra husky today. :D

    • @ADB-zf5zr
      @ADB-zf5zr 6 months ago +1

      Husky bot!

  • @alpha007org
    @alpha007org 6 months ago +1

    Isn't packaging currently the bottleneck at TSMC?

  • @davidlazarus67
    @davidlazarus67 6 months ago +2

    China has a huge lead in phase-change memory. It won't be available to Western countries for some time. Nvidia's valuation is based on AI, where it lags behind China. That's a big bubble just waiting to burst.

    • @jackinthebox301
      @jackinthebox301 6 months ago +1

      Go away Chinese bot.

    • @ofon2000
      @ofon2000 6 months ago +1

      I was wondering the same thing... two-word name with two random numbers and a pro-China comment with awkward grammar @@jackinthebox301

    • @jackinthebox301
      @jackinthebox301 6 months ago +3

      @@ofon2000 The only phase change that China has better than the US is their concrete's natural ability to phase change to rubble.

    • @BoraHorzaGobuchul
      @BoraHorzaGobuchul 6 months ago

      Please tell us more entertaining stuff, I laughed so hard...

    • @ofon2000
      @ofon2000 6 months ago

      @@jackinthebox301 Yeah, there's tons of Chinese stuff that is good value, but a lot more that seems like a good deal until you realize it breaks so fast that it's garbage value.

  • @platin2148
    @platin2148 6 months ago

    Doesn't that mean you also have to clean that memory, at least in part?
    So when do we see Intel adding it stacked on top?

  • @IraQNid
    @IraQNid 6 months ago +1

    SSDs will lose data if they are left without power for too long.

  • @noanyobiseniss7462
    @noanyobiseniss7462 6 months ago +1

    That's a layout, not a block diagram.

  • @XxXnonameAsDXxX
    @XxXnonameAsDXxX 6 months ago +1

    The only future I see is an $8000 projector which is mandatory for gaming.

  • @nintendobrad3946
    @nintendobrad3946 6 months ago +1

    I'd like to see this make it into the PS6.

    • @HoneyTwee
      @HoneyTwee 5 months ago

      Would be far too expensive and far too soon.
      The PS6 could be as close as 2 years away.
      If we're unlikely to see this in Nvidia enterprise cards by then, we're not going to see it in a $500 console by then.
      PS6 Pro or PS7, sure, if this tech scales well and isn't a dud.

  • @Boorock70
    @Boorock70 6 months ago +2

    So, will AMD be able to reduce the insane power consumption of its GPUs at video playback?
    Last gen's 6800 (and up) RDNA 2 GPUs and the new 7000 (RDNA 3) series have serious power issues at video playback (YouTube, Netflix, VLC, etc.).
    Check out the doubled video-playback consumption of the 6700 XT (20 W) vs the 7700 XT (42 W) at TechPowerUp and ComputerBase.
    40+ W is stupidly HIGH and meaningless...
    AMD needs to solve that video-playback power issue at least in the next 8000 series (or do they even care?).
    * Emphasis on "video playback", as most people confuse it with idle, web or gaming consumption. They are very different things.
    AMD still didn't solve video-playback consumption; the previous fix was for idle consumption only.
    PS: The RX 7900 XTX is still the record holder with 67 W, but the 7900 GRE is getting close with 62 W while watching YouTube!

  • @cracklingice
    @cracklingice 6 months ago +1

    Intel no longer had a fab to build Optane in because they gave up their half of the joint venture and the other company decided to sell it.

  • @igormarcos687
    @igormarcos687 6 months ago +4

    The description, filled with affiliate links, shows the decadence of the channel

    • @XxXnonameAsDXxX
      @XxXnonameAsDXxX 6 months ago

      Projector-gate, never forget

    • @TheChkgrniv
      @TheChkgrniv 6 months ago

      I would gladly watch your channel. I am sure that you can take your time and effort and totally give it away for free.

    • @HoneyTwee
      @HoneyTwee 5 months ago

      @@XxXnonameAsDXxX What exactly is projector-gate?
      He reviewed an $8000 projector to look at what the future of display technology could look like.
      Just because you can't afford it now, and he is looking at it not from a value perspective, doesn't mean there isn't value in analyzing what $8000 tech looks like today, because in 5 years that could be $2000 tech, and in another 5, $700 tech.
      What's the problem?
      I genuinely could be missing something scummy he did, but being sponsored to talk about an extremely expensive product that almost nobody can afford anyway isn't really scummy on its own.

  • @El.Duder-ino
    @El.Duder-ino 4 months ago

    Without any doubt memory needs to catch up with chip logic, as it was left far, far behind for decades, and cache-like SRAM is neither sufficient nor economically viable to be that next step, even if it will most likely rely on 3D stacking similar to AMD's V-Cache.
    For a moment the memristor looked like the solution and holy grail of compute memory, but it might actually be PCM. Let's see; thanks for the vid!

  • @hambotech9954
    @hambotech9954 6 months ago

    Bro's voice almost blew my speakers 💀

  • @newerstillimproved
    @newerstillimproved 6 months ago +5

    So Optane will come back

  • @Lex90909
    @Lex90909 6 months ago

    awesome video! thanks

  • @VincentPandian-z9b
    @VincentPandian-z9b 6 months ago

    Interesting stuff. Well researched and informative.

  • @dosmastrify
    @dosmastrify 6 months ago

    4:25 This guy definitely just said ass ram

  • @mr.electronx9036
    @mr.electronx9036 6 months ago

    Chips on glass will change everything

  • @cdurkinz
    @cdurkinz 6 months ago

    15:53 wait is this the first iteration of AI improving itself? 😂

  • @Dmwntkp99
    @Dmwntkp99 6 months ago

    Hopefully Jensen won't buy them out if they become a threat.

  • @mmmuck
    @mmmuck 6 months ago

    curious if it's still worth buying Nvidia stock to hold for a decade or more?

  • @gamingtemplar9893
    @gamingtemplar9893 6 months ago

    Awesome, new episode.

  • @mikemj8204
    @mikemj8204 6 months ago

    Great job thank you.

  • @johnbeer4963
    @johnbeer4963 6 months ago +1

    Phase change memory.... So Optane.

  • @soothsayer5743
    @soothsayer5743 4 months ago

    You have a certain workload to get through as cheaply as possible. Could you sell more AI/GPU units because they're cheaper, or because the units are so fast that people want more? Think better graphics? (I'm still using 1080p screens; they're good enough: cheap.) Ideally I want quiet, fast enough for my games/web browsing, cheap, compact (netbook-size) and not power hungry (long-lasting small battery). Imagine if I could have an Xbox/PS5 in my small netbook (sub-13-inch and less than a kg) and it's silent! No fan!!!! GPUs are fast enough and good enough, but noisy! I would definitely go for a silent, very low-power option, but cheap... This is the world I'm looking at it from. My question to the brains out there: what points is this PCM targeting?

  • @effexon
    @effexon 6 months ago

    TLDW after 3.5 minutes: Nvidia is gonna put HBM and other high-end, very fast memory in professional GPGPUs. Gaming GPUs get GDDR7 so as not to give regular people mining GPUs that are too cheap.

  • @noobgamer4709
    @noobgamer4709 6 months ago

    Coreteks, can you put out a video on the GTC NVIDIA super GPU MCM? I want to know your thoughts on Nvidia's MCM.

  • @esra_erimez
    @esra_erimez 6 months ago +4

    Jim Keller is a living legend

  • @--waffle-
    @--waffle- 6 months ago

    When is Nvidia going to launch their CPU to the general public? Will it EVER come? All I want is a Grace Hopper-like CPU & GPU in one small(ish) convenient box, a 'console'-like PC. A 5090 that I just plug into my monitor. No wires everywhere, doesn't take up half my room. Or even AMD; they already make APUs.

  • @l-cornelius-dol
    @l-cornelius-dol 6 months ago

    Just can't wait until I need a $2000 GPU _and_ a $2000 AIU to play the latest titles. 😑

  • @PointingLasersAtAircraft
    @PointingLasersAtAircraft 6 months ago

    We need through-die heat pipes.

  • @kaisersolo76
    @kaisersolo76 6 months ago

    great stuff.

  • @mcmalden
    @mcmalden 6 months ago +2

    What a bunch of incoherent rambling about compute architectures, with 3D integration and different memory technologies somehow mixed in.
    The majority of power is dissipated on the GPU die; therefore you cannot just stack insulating components on top of it. DRAM refresh power has little to do with this, and PCM won't fix it. The Iranian paper appears to envision sandwiched cooling, which is a completely different point altogether.

  • @phaedrussocrates7636
    @phaedrussocrates7636 6 months ago

    Thank you

  • @kozmizm
    @kozmizm 6 months ago

    "We're not a hardware company, we're a software company." BS! You are both!

  •  6 months ago

    what about Groq's LPUs?

  • @scottpar28
    @scottpar28 6 months ago

    How about Micron? They did this

  • @shk0014
    @shk0014 6 months ago +3

    Hello reader, your mom.

  • @Raphy_Afk
    @Raphy_Afk 6 months ago +3

    Never underestimate the ability of AMD to disappoint

  • @esra_erimez
    @esra_erimez 6 months ago +6

    I like nachos

  • @SP95
    @SP95 6 months ago

    Good news

  • @Peter-uf4yn
    @Peter-uf4yn 6 months ago

    haha.. he said "ass ram"

  • @ahmedp8009
    @ahmedp8009 6 months ago

    Well Done Iran!

  • @lasagnadipalude8939
    @lasagnadipalude8939 6 months ago +1

    Underappreciated fact: the material was discovered by AI. We are really at the start of a science fiction novel

    • @christophermullins7163
      @christophermullins7163 5 months ago +1

      Yes we are. Anything you want... any medication... technology... etc. The AI will learn all of humanity's capabilities and production technologies and recommend the best way forward, advancing everything more efficiently and economically than we could without AI. Being born into the beginning of this information revolution feels like the most remarkable thing about our existence. I am so fascinated by the future of technology and AI; it seems like we are living through a sci-fi novel for sure. Breakthroughs in all fields will flood in over the coming decades and we get to watch them unfold like... a sci-fi novel, for lack of better words. What a strange and fascinating life humans are living through. The sad thing is most people are oblivious to it and are not interested in the technology that will change humanity forever. I am obsessed with it.

    • @lasagnadipalude8939
      @lasagnadipalude8939 5 months ago

      @@christophermullins7163 That's totally how I feel

  • @jflgaray
    @jflgaray 6 months ago

    This PCM hype again???!!! This tech has a good track record of NOT successfully winning the market.

  •  6 months ago

    lol boom:) Just staggering!!!

  • @nsf001-3
    @nsf001-3 6 months ago

    No, PCM means Pulse Code Modulation. Quit mucking up my search engine results with BS tactics like this

  • @gummywurms226
    @gummywurms226 6 months ago

    During AMD's AI presentation last year, one of AMD's partners said that they had exceeded the capability of CUDA with regard to AI. I'm surprised that nobody picked up on that little tidbit. Without CUDA, Nvidia is nothing.

  • @tristankordek
    @tristankordek 6 months ago

    👍

  • @mattbegley1345
    @mattbegley1345 6 months ago

    The future of GPUs is SPAM... There are so many new videos every day about Nvidia that the HYPE has turned into SPAM.
    Stop spamming the board!

    • @Coreteks
      @Coreteks  6 months ago

      @mattbegley1345 My apologies, I'm deleting my channel right now!

  • @gsestream
    @gsestream 6 months ago

    Focus on fixing current stuff; don't even go to new stuff until even the current hardware and software is fixed. Fully. Well, if you are in a hurry, you produce nothing. Literally. Fix it.

  • @Zorro33313
    @Zorro33313 6 months ago

    inb4 GPUs getting L1, L2, L3... oh wait, so it's basically a CPU with no SMT now. Kind of big.LITTLE or whatever. Intel is on its way to reinventing the GPU. Or the APU. Same shit. CPUs are becoming more GPU-like, GPUs are becoming more CPU-like. LMAO.

  • @nossy232323
    @nossy232323 6 months ago

    I wonder if Coreteks was drunk when he made this video?

    • @ofon2000
      @ofon2000 6 months ago

      why are you saying that?

    • @nossy232323
      @nossy232323 6 months ago

      @@ofon2000 His voice sounds strange in this video.

  • @ThaboW11
    @ThaboW11 6 months ago

    10:21 Iran... the Western media is indeed biased as far as reporting the 'positive' aspects of Iranian tech, while the Eastern media is also complicit, for there was no elaborate mention of what I'm about to say. The 1023 recent alleged attack on an American base in Jordan by an Iranian military drone masquerading as an American military one is a case in point; apparently one of their key scientists went as far as to claim back in 2011 that their nation had 'been offered' technologies (he claimed it was by IDs) that enabled them to create a tractor beam, much like in the Star Trek/Wars films, with which they captured an American drone. There's a video done on this by the YouTube channel End Time Productions.
    There's more to this place than what the news implies.

  • @nsf001-3
    @nsf001-3 6 months ago

    Can't wait for the "AI" fad to be over with. Really makes you beg for the tessellation meme again at this point

  • @DudeGamer
    @DudeGamer 6 months ago +1

    First

  • @gregandark8571
    @gregandark8571 6 months ago +2

    Unsubscribed; a lot of nonsense and wrong information.

  • @DoesNotInhale
    @DoesNotInhale 6 months ago +2

    Coreteks is such a joke on YouTube. Whatever topic he discusses, I literally go to Moore's Law is Dead or any other channel and can be guaranteed that I will not only be given actually accurate information, I won't have to listen to a limey, smug mouth-breather make horrendously bad hot takes and predictions that never come true while he fantasizes about AMD becoming "competitive" in our lifetime. Thanks, Coreteks, for keeping me up to date on topics I know you can't handle with your Celeron-tier dysgenic grey matter.

    • @BIG_HAMZ
      @BIG_HAMZ 6 months ago +5

      Coreteks isn't a leaker; he provides a unique, speculative look at what he thinks the future of technology might be. I enjoy his videos and don't expect them to be like MLD. I understand that he has some hot takes sometimes, but that's the point: it's about starting a discussion for us enthusiasts.

    • @adiffkindofswag1148
      @adiffkindofswag1148 6 months ago +3

      You lost all credibility the moment you mentioned Moore's Law is Dumb. 🤡

    • @waynnewilliams5588
      @waynnewilliams5588 6 months ago +1

      AMD becoming "competitive" in our lifetime. lol no chance

  • @godofwinetits3826
    @godofwinetits3826 6 months ago

    with AI who needs quantum computers?

  • @_vofy
    @_vofy 6 months ago

    Awesome video!