Intel's Tom Petersen Talks Arc Updates & GPU Drivers | The Full Nerd Special Edition

  • Published: Aug 22, 2024
  • Join The Full Nerd gang as they talk about the latest PC hardware topics. In this episode the gang is joined by special guest Tom Petersen, Intel Fellow, to talk about Arc updates, the work Intel is doing on GPU drivers, and much more. And of course we answer your questions live!
    Watch our recent Intel Arc testing: • Newest Driver Battle: ...
    Buy The Full Nerd merch: crowdmade.com/...
    Check out the audio version of the podcast on iTunes, Spotify, Pocket Casts and more so you can listen on the go and be sure to subscribe so you don't miss the latest live episode!
    Join the PC related discussions and ask us questions on Discord: / discord
    Follow the crew on Twitter: @GordonUng @BradChacos @MorphingBall @KeithPlaysPC @AdamPMurray
    Follow PCWorld for all things PC!
    --------------------------------
    SUBSCRIBE: www.youtube.com...
    TWITCH: / pcworldus
    TWITTER: / pcworld
    #intel #gpu #interview

Comments • 169

  • @ProjectPhysX
    @ProjectPhysX a year ago +30

    40:49 For the past 3 months, my A750 has been a dead brick in a box, because OpenCL drivers were totally broken. In Windows it would just crash, and in Linux it showed up with only 256MB VRAM.
    Yesterday, Linux kernel 6.2 came out, and like a miracle, the dead brick has turned into a working GPU. No issues with OpenCL anymore. Very glad that Intel's driver team is making progress! At this point, I can finally say that Arc is very good value.
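
    (Side note, not from the video: a quick way to check how much memory the OpenCL runtime actually exposes is a short pyopencl query - a minimal sketch, assuming the pyopencl package and a working OpenCL driver are installed.)

    ```python
    # Minimal sketch: list every OpenCL device and the VRAM its driver reports.
    # A healthy A750 should show roughly 8 GiB here, not the 256 MiB bug above.
    import pyopencl as cl

    for platform in cl.get_platforms():
        for dev in platform.get_devices():
            mem_mib = dev.global_mem_size / (1024 ** 2)
            print(f"{platform.name} / {dev.name}: {mem_mib:.0f} MiB")
    ```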

    • @heinkeld
      @heinkeld a year ago +1

      They're moving quickly.

  • @thescryingpool
    @thescryingpool a year ago +61

    My Arc A770 just got delivered earlier today and I love it so far. Upgraded from a GTX 970.

    • @mofstar8683
      @mofstar8683 a year ago +9

      A750 from a 1660 - having a great time too

    • @Hauntcade
      @Hauntcade a year ago +8

      Upgraded from an RTX 2070 Super to an A770

    • @jazzochannel
      @jazzochannel a year ago +3

      makes no sense for me with 2080 ti :(

    • @GameCookerUSRocks
      @GameCookerUSRocks a year ago +3

      @@Hauntcade lol. A 2070 Super is still a great card. Did you have a specific need to upgrade to an Arc? That doesn't really make any sense in a regular gaming scenario.

    • @GameCookerUSRocks
      @GameCookerUSRocks a year ago +5

      @@jazzochannel Why are you sad? An RTX 2080 Ti is still a wonderful card. I have one. And the RTX features on it are more relevant now than ever. Frame rates are good at native, and kicking on DLSS is just putting the cherry on top.

  • @MrTobiKay
    @MrTobiKay a year ago +21

    08:00 how great Arc cards are
    14:00 drivers have gotten better
    18:00 the one guy's experience living with the arc card day-to-day
    20:00 Questions from show hosts to Tom Peterson about Arc
    28:00 Future of Intel GPUs (and CPUs - Meteor Lake)
    36:00 What did Intel learn from their first foray into GPUs?
    44:00 XeSS/DLSS
    48:00 ChatGPT & AI
    57:00 question about DX11 improvements (we'll get back to you on that)
    58:00 High-idle Power situation.
    01:03:00 disparity between 1080p performance and 1440p performance
    01:08:00 potential for performance improvement in the future
    01:22:00 Direct Storage
    01:26:00 Fan Control & VR Support
    01:32:00 Performance, Pricing, Alchemist, Battlemage, Celestial, etc
    01:42:00 Stock Availability
    01:46:00 Power Cable Connectors

  • @ovemalmstrom7428
    @ovemalmstrom7428 a year ago +13

    Come on, Tom and Intel! It's time to go $299 on the competition with the A770 16GB! That would be a home run!

  • @81Treez
    @81Treez a year ago +30

    Tom is such a great dude. I love that Intel lets this happen. Great interview.

  • @jarnhand266
    @jarnhand266 a year ago +7

    Just ordered an Acer Predator BiFrost Arc A770 OC 16GB.

    • @ovemalmstrom7428
      @ovemalmstrom7428 a year ago +2

      That's a beast of an Arc card!

    • @zivzulander
      @zivzulander a year ago +2

      I have one. Good card so far. Was worried the hybrid design (centrifugal aka blower fan + axial fan) might make it louder, but it's actually fairly quiet. Also just a nice looking and feeling card overall - a more premium aesthetic than the price point would indicate.

    • @ovemalmstrom7428
      @ovemalmstrom7428 a year ago +1

      @@zivzulander A guy in another forum who bought the BiFrost told me he had overclocked it and it can run really fast.

  • @zlungbutterz3307
    @zlungbutterz3307 a year ago +26

    I so badly want ARC to be good.

    • @ovemalmstrom7428
      @ovemalmstrom7428 a year ago +3

      Me too! Come on, Intel and Tom, go $299 on the competition with the A770 16GB!

    • @jorgedelarosa7463
      @jorgedelarosa7463 a year ago

      I am seriously thinking about buying an A770 (upgrading from a 1050 Ti), but I am just afraid it will suck.

    • @ovemalmstrom7428
      @ovemalmstrom7428 a year ago +2

      @@jorgedelarosa7463 Of the overwhelming majority of people I have talked to who have bought the Arc A750 or A770, almost none have been disappointed with their purchase. Just the opposite: they are very satisfied with the performance and stability now.
      I don't think you have to worry.

    • @VintageCR
      @VintageCR a year ago +1

      Battlemage will be even better, according to recent "rumors" - believe them or not, that's up to you.
      But their top-end Battlemage (B990) is supposedly going to rival the 4080 in performance, with launch pricing around the $500-$550 mark.
      Would you buy a 4080 for roughly $500?

  • @maag78
    @maag78 a year ago +11

    I'm loving my A770; it is running everything on high settings at 1440p, and XeSS is amazing.

    • @roastinpeace2320
      @roastinpeace2320 a year ago +1

      How does it run games from around 2006 and older (if you have tried any)? I know people are praising it for delivering great performance in recent games, but tbh old games are an important factor for many people too, and if it can't run them well, it's better to grab something less attractive knowing it can run everything.

    • @AshenTechDotCom
      @AshenTechDotCom a year ago +2

      @@roastinpeace2320 Outside a handful of old titles still using anti-cheats that haven't been updated in forever: if it works with DXVK, it runs great most of the time. You may need to make a config file to enable async shading (sketched below), but it really does wonders. I was using DXVK before Arc was even a thing, because so many old titles run so much better with it.
      Far Cry 1's frametimes are amusing without DXVK - on AMD and Nvidia too; with DXVK the frametimes are great, and the minimums are above my 164.09 Hz refresh rate, maxed out at 1440p with "HDR" enabled.
      Hell, it runs Glide emulators better than Nvidia or AMD seem to, from our testing :D
      DXVK is great to have around for all vendors - more so for Arc - but my friend group has been using it since before Arc or even DG1 was a rumor :)
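
      (For reference: the async-shading toggle mentioned above is a dxvk.conf setting, but it comes from the community dxvk-async fork rather than upstream DXVK - a minimal sketch, assuming that fork is installed.)

      ```
      # dxvk.conf, placed next to the game's executable
      # dxvk.enableAsync is only honored by the community dxvk-async fork,
      # not by upstream DXVK builds.
      dxvk.enableAsync = True
      ```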

  • @zodwraith5745
    @zodwraith5745 a year ago +27

    I hope Intel keeps it up with allowing access to personnel that _aren't_ PR. Treat your enthusiast customers as enthusiasts before customers and realize they aren't dumb.

    • @CuttinInIdaho
      @CuttinInIdaho a year ago +4

      I am not a fan of that Ryan guy on the Intel videos with Tom.

    • @POVwithRC
      @POVwithRC a year ago +3

      He sort of IS PR. Yeah his title is not that thing, but walking like a duck and quacking like a duck...

    • @JS-be8ll
      @JS-be8ll a year ago +3

      This was mostly PR :)

    • @zodwraith5745
      @zodwraith5745 a year ago +4

      @@POVwithRC While technically it _IS_ PR, it's not the PR that we've become accustomed to that's really just marketing and damage control. There is such a thing as _good_ PR and that's when it's informative and willing to answer questions.
      Way more respectful of the consumer's intelligence than I've seen from AMD, Nvidia, or Apple in a long time.

    • @maxwelldecataldo5116
      @maxwelldecataldo5116 a year ago +2

      This guy is PR. He is 0% an engineer, he is marketing

  • @redslate
    @redslate a year ago +9

    Really like this guy. Hope Intel keeps him as ARC's lead.

  • @bignishspcinsights1762
    @bignishspcinsights1762 a year ago +7

    Gotta love Tom. A very clever individual who can enjoy and deliver fun banter, but can also talk deeply about technology and draw the line where needed with good humour. Thanks PCWorld for a great post. I bought an A770 at launch because I was interested in Intel's take on discrete graphics, and I have not been disappointed. The only criticism from me is the high power draw and VRAM speed at idle, but everything else has been great. I have not regretted my purchase at all, and will buy Intel graphics again with their next-generation card.

  • @Wild_Cat
    @Wild_Cat a year ago +5

    Arc FTW!

  • @RonMizman
    @RonMizman a year ago +9

    8800GT and the red PCB ATI 9700 Pro.
    So beautiful. I know many of you who were around were blown away by these 2 legends as well.

    • @AshenTechDotCom
      @AshenTechDotCom a year ago +1

      I loved my 9500 Pro that unlocked to full 9700 Pro specs and clocked higher than my buddy's official 9800 Pro card. Fun times... I was so glad when it turned out to unlock fully. It needed a better cooler, but that was easy to find back then :)

    • @RonMizman
      @RonMizman a year ago

      @@AshenTechDotCom Heh, can't believe you had one of the 9500s that unlocked. I was too chicken, so I got the boxed 9700 Pro version. Good times - epic hardware.

    • @AshenTechDotCom
      @AshenTechDotCom a year ago +1

      @@RonMizman A large % did unlock; they were binned to fill a market before the true 9500 (the 9550) came out. Of the 12 a buddy bought over that time, 3 didn't unlock; the rest unlocked and overclocked once you stuck a Vantec copper chipset cooler (or better) on them. We made a good bit selling confirmed, tested ones via local shops, including some flashed for Mac and hard-modded - we more than doubled our money on those and split the profits, heh.
      Good times. The closest thing I've seen in recent years is the Arc cards: fun, with some bugs thrown in to give you something to chew on :P

    • @Dsntmtter2ME
      @Dsntmtter2ME a year ago +1

      8800 GT was perhaps the best video card ever made. Running Crysis on high at 1280x1024 for $250? Dream come true.

    • @mutawi3i
      @mutawi3i a year ago

      I voltmodded my 9500 non-Pro to a 9700 Pro with a custom BIOS flash that opened 24 pipelines, which was amazing. In 2008.

  • @SirCrest
    @SirCrest a year ago +10

    Tom is always such a great guest.

  • @evan9536
    @evan9536 a year ago +7

    The A770 is already a monster for me. I sold my 3070 PC for an all-Intel build with an A770 and it’s knocking the 3070 out of the park. It’s getting higher FPS and performs better in Fusion360 CAD design. It’s pretty even in my Data Science studies, however.

    • @ovemalmstrom7428
      @ovemalmstrom7428 a year ago

      That is interesting! I think you're the first person saying it beats the 3070; hope you're right!

    • @evan9536
      @evan9536 a year ago +2

      @@ovemalmstrom7428 It's actually very impressive with XeSS and Deep Link technology. In fact, it's outperforming 3090s in Hogwarts Legacy (seriously, look at some of the testing videos and compare them; it's insane!). The hardware for the card is there; it just needs the drivers to catch up.

  • @Retro-Memory
    @Retro-Memory a year ago +19

    This guy is good, great show.

    • @jazzochannel
      @jazzochannel a year ago

      just how good is he? did you taste his vanilla salty balls?

  • @JS-be8ll
    @JS-be8ll a year ago +2

    Huge respect for just straight-up saying: don't buy this for VR, yet...
    Then loss of respect for defending stupid GPU prices.

    • @ovemalmstrom7428
      @ovemalmstrom7428 a year ago +1

      Agreed, he could have handled that question a lot better. Actually, he should have announced $299 for the A770 16GB. ;)

  • @yamilabugattas3895
    @yamilabugattas3895 a year ago +5

    I imagine Battlemage will change things, they'll probably grab some market share with it. I hope that time comes soon, since the GPU market seems awful right now at the midrange.

    • @slumy8195
      @slumy8195 a year ago

      Yeah, we can't expect a brand-new car company to be as good as Toyota or Honda with their first car. Everybody starts from the bottom.

  • @orl2222
    @orl2222 a year ago +4

    I just built a Ryzen 7 5700G on a Gigabyte B450 Vision board. I bought the A380 on a whim, but I will return it because I will upgrade to the Arc A750. Not a gamer, but I'm impressed by the encoding ability! Funny, but my build is really half and half, because the board also has an Intel Thunderbolt 3 controller chip on it.

  • @jazzochannel
    @jazzochannel a year ago +4

    What a great show. Makes me think of the 90s, when I was a kid and radio was a thing (I guess it still is for people who drive cars). Banter 9/10 (could have more), content 7/10 (I'm a fuller nerd), production 10/10. I only recently discovered this show, like 8 months ago. Enjoying (almost) every episode since then!

  • @danytoob
    @danytoob a year ago +4

    Intel Graphics Command Center? Sorry I missed this stream for my question - why isn't Intel Graphics Command Center included in Arc Control? I had to search at launch (with no success) until I accidentally stumbled onto it. Critical for my use case, and invaluable to all, I would think.

  • @tech360gamer
    @tech360gamer a year ago +3

    Thank you, Keith 👍👍. I so wanted to know about VR and Arc. I like how honest Tom is, and I will surely buy Arc when VR is sorted. Thanks Tom!

  • @lutzchen9887
    @lutzchen9887 a year ago +2

    Currently running an A770 LE in my secondary PC rig. I've had no major problems. I love it so far.

  • @adrianconstantin1132
    @adrianconstantin1132 a year ago +3

    Looking forward to Arc Battlemage, but before Battlemage I think there is still work to do on the Alchemist drivers.
    And for the duration of this interview I kept looking at that Threadripper or Epyc in the background that was just staring at me :)

  • @nunyobiznez875
    @nunyobiznez875 a year ago +3

    "19 by 10" is how you would describe an aspect ratio, not a resolution. It would not only be very imprecise, but it would create confusion, precisely for the previously stated reason; it sounds too much like an aspect ratio, "16 by 10" or "16 by 9", etc.

  • @supermarvelero
    @supermarvelero a year ago +3

    Intel ARC 💙

  • @EnochGitongaKimathi
    @EnochGitongaKimathi a year ago +2

    How much of a threat are the ARM, Apple, and Qualcomm GPU efforts? The Qualcomm Adreno 740 at 8W is performing at the level of a desktop GTX 1050 at 75W.
    If you compare smartphone sales to PC sales, which has the better market-share forecast?
    I can't wait to compare the Meteor Lake Arc iGPU to Apple Silicon.
    I'm thinking about the recent growth in handheld gaming and the future of the Steam Deck. Eventually you will be able to dock your smartphone and use it as a gaming desktop PC with a keyboard, mouse, or other PC gaming rigs.

  • @nonflyingfinn2173
    @nonflyingfinn2173 a year ago +2

    I liked how upfront he was about VR, unlike AMD with the "hey, there's USB-C" - afterwards the forums were full of people whose 1200-euro cards didn't work well in VR. It's a niche use case, I know, but it would be nice not to have to dole out the moolah just for the green company.

  • @KushiKush
    @KushiKush a year ago

    Screw you, the first 2 minutes of this show made me fall in love with you guys' setup. Oh man, that random banter at the start - you damn well know I'm going to watch the whole two hours now. GJ, nice VOD.

  • @linndoria
    @linndoria a year ago +3

    Looking forward to the next WHQL driver; I hope it covers more DX11 games while implementing more of DXVK 2.x with HDR for DX10 and 11.

  • @GameCookerUSRocks
    @GameCookerUSRocks a year ago +1

    It would have been nice if you all had clarified with Tom what he meant when he was commenting about DLSS and tensor cores. There was an inference there, but he's not the only one recently who has said it's not necessary to use AI in a scenario like DLSS.
    My next question is: do you all think that FSR and XeSS will ever look as good as DLSS? And don't say they look as good as DLSS now. They don't.
    In fact, DLSS in some cases makes low resolution look better; FSR makes it look worse. Therefore, I don't see how a statement like Tom's is justifiable.
    Everything can't be mitigated simply through software means. I think a hybrid method is a much smarter method. I'm not a programmer or engineer, but it just makes sense to me.

  • @Eternalduoae
    @Eternalduoae a year ago +2

    OMG, Gordon is SO wrong about 19x10... I was there back in the day - it's not primordial, it's just backwards!
    480p
    1080p
    etc.

  • @musemooch
    @musemooch a year ago +1

    10:59 had me checking my USB devices lol

  • @medievalfoot
    @medievalfoot a year ago +1

    I hope they come out with something in the '80' class next generation.

  • @fairycat
    @fairycat a year ago +1

    Awesome segment. Thanks for the entertainment. I hope Intel establishes its place in the GPU market as soon as possible.

  • @oldmanwithers4565
    @oldmanwithers4565 a year ago +3

    The only thing making me hold off buying one is the idle power draw.

    • @ovemalmstrom7428
      @ovemalmstrom7428 a year ago +1

      They have a fix for it, but I don't think it works with all motherboards; hopefully they come up with a more solid solution.

    • @ayokmschangedhisname4393
      @ayokmschangedhisname4393 a year ago

      @@ovemalmstrom7428 If it doesn't work, you contact your mobo manufacturer and they send you a custom BIOS.

  • @ole7736
    @ole7736 a year ago +1

    This great interview would really benefit from chapter marks.

  • @hughJ
    @hughJ a year ago

    The only time I've ever purchased one of those double-edged razors was for delidding an Athlon 64. Still have the remaining razors in a drawer somewhere.

  • @TedTabaka
    @TedTabaka 8 months ago

    Intel needs to position the next version of Arc on performance, value, low cost, and support to gain proper market share. They need to be the cost leader to hook gamers and others, and then grow into the market.
    I also think DX9 support is crucial to bring in the right market.

  • @RetroTinkerer
    @RetroTinkerer a year ago +2

    So Tom is asking for a modern version of AGP Pro? Yeah, why not - it would be cool, though I'm not sure how a low-to-mid-tier motherboard would be designed to handle 400W+ for just one PCIe card.

    • @AshenTechDotCom
      @AshenTechDotCom a year ago +2

      I have seen some industrial systems like this, using pins rather than a typical card slot (think old Macs rather than modern systems); the power was drawn through the PCB, pretty much direct lines through multiple layers of the PCB to the board's power connector. I mean, it's possible...
      I would rather see video cards with their own laptop-brick-style power plugs on the back so they can power themselves (something some rare server/workstation cards actually did). It would be smarter IMHO, and would also help justify the huge coolers and the pricing on the top-tier cards this would be useful for.
      These new cards and their power limits/draws just make me cringe... not as hard as Fermi did, but pretty hard.

    • @RetroTinkerer
      @RetroTinkerer a year ago

      @@AshenTechDotCom I watched a new video from @der8auer where he tested a Mac Pro dual-chip AMD GPU on a standard x86 system, and he had to solder ground and 12V cables to a connector just after the x16 slot. Apparently that thing draws 400+ watts through that slotted connector, so maybe Tom's idea is not far-fetched at all!

    • @stefanl5183
      @stefanl5183 a year ago +1

      "Yeah why not"
      For many reasons it would be a totally bad idea.
      1) You'd need new motherboards designed with new slots.
      2) What about people who buy motherboards with multiple x16 slots? Would every slot have the extra connector, or just one? If only one or a few, then you'd be limited to using those, which negates the whole point of having multiple x16 slots.
      3) Routing the power through the motherboard is a really bad idea. It complicates the design of the PCB, and motherboard PCBs are already very complex. Second, you're forcing the PCB to handle many more amps of current, and it already handles a lot. And third, that current isn't magically generated by the motherboard out of thin air. It ultimately comes from the PSU - and how does it get there? Through connectors from the PSU. If the motherboard has to pull more power through its connection to the power supply so that it can feed newly designed "PCIe Pro" slots, then it needs additional power connectors from the power supply. Modern motherboards already have to do this to power the VRMs for high-wattage CPUs; that's why they have additional 4-pin and 8-pin EPS 12V connectors. If we're going to need even more cables and connectors from the power supply to the motherboard to do this, why not just let those cables go straight to the GPU, which is where the power is going anyway? That would make more sense, because extra PCB traces and extra connectors in the power path only add resistance, which means more losses from I²R heating (rough numbers below). On top of the extra losses you get extra heat and more potential points of failure. Not to mention that a graphics card failure that drew too much current could destroy your motherboard in addition to your video card and PSU. And on boards with multiple PCIe x16 slots, which are typically designed for multiple GPUs, the problem would be multiplied.
      Anyway, it's a totally bad idea! It's much better to do it the way we do it now, with a cable running straight from the power supply to the graphics card. That is more efficient, with fewer losses and fewer points of failure, and it doesn't require redesigned motherboards, slots, and cards. Routing the GPU's power through the motherboard and the slot isn't just a solution looking for a problem; it would create new problems that are totally unnecessary.
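
      (For rough scale on the I²R point above - illustrative numbers, not from the thread: a 450 W card on a 12 V rail draws 37.5 A, so even 5 mΩ of extra trace and connector resistance in the path would dissipate)

      ```latex
      % Illustrative only: assumed 450 W card, 12 V rail, 5 mOhm of added resistance
      I = \frac{P}{V} = \frac{450\,\mathrm{W}}{12\,\mathrm{V}} = 37.5\,\mathrm{A},
      \qquad
      P_{\mathrm{loss}} = I^2 R = (37.5\,\mathrm{A})^2 \times 0.005\,\Omega \approx 7\,\mathrm{W}
      ```

      (about 7 W of waste heat dumped into the motherboard PCB itself, which is exactly the loss and failure-point concern raised above.)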

  • @microsoftsarker
    @microsoftsarker 8 months ago

    Thank you very much. It helped me a lot.

  • @user-wq9mw2xz3j
    @user-wq9mw2xz3j a year ago +3

    I actually bought an Arc! Didn't think I would!
    I still think I overpaid at $300 second-hand for an A770, especially with that bad cooler, but I'm a bit excited to try it out, really.
    Can hopefully sell it on for about the same price later :)

    • @bronsondixon4747
      @bronsondixon4747 a year ago +3

      The A770 LE cooler isn't bad. It keeps temps in the high 60s / low 70s.

  • @thesupremeginge
    @thesupremeginge a year ago +1

    19x10 is just simply wrong. There is no video editing software that lets you set a project to that; it's 1080p or 4K. You won't see it under settings in any games either.

    • @ionelroza
      @ionelroza a year ago

      If you were using PCs in the 2000s, then you would know. That's why it was funny - it was a joke about that era too.

  • @dyson9422
    @dyson9422 a year ago +1

    Is Intel Arc good for OpenCL or OpenACC?

  • @ole7736
    @ole7736 a year ago

    If AMD brought the anticipated driver improvements for RDNA3, there would be a lot of revisits all across the reviewer space for sure.

  • @ole7736
    @ole7736 a year ago

    Re DirectStorage in applications: video data might be less compressible than game textures.

  • @karlsonkab51
    @karlsonkab51 a year ago

    Will the A750 and 770 work reasonably well with the Xeon E5-2600 series? (No gaming, but I use Topaz.)

  • @mikesunboxing
    @mikesunboxing a year ago +1

    Would love to buy an A750 here in the UK for RRP, but most UK stockists are either greedy or shady.

  • @InstantCasette
    @InstantCasette a year ago

    Having trouble finding a half-height Arc that's actually being sold. (Plex)

  • @user-jk4qw5wg8i
    @user-jk4qw5wg8i 9 months ago

    Thanks, it worked.

  • @user-zz2xg5oc6v
    @user-zz2xg5oc6v 10 months ago

    You were gone for a long time; we'll try it out now. Thanks!

  • @alexs.2069
    @alexs.2069 a year ago

    The A750 is a really good product except for power consumption, which is a real problem here in the EU (e.g. the Netherlands and Germany, where power costs between 40 and 80 cents/kWh).
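
    (To put rough numbers on that - a sketch only; the 40 W idle figure is an assumption based on Arc's widely reported high-idle behavior, and actual draw varies by setup.)

    ```python
    # Back-of-envelope yearly cost of a GPU's idle draw at EU power prices.
    idle_watts = 40               # assumed Arc high-idle draw, not a measurement
    eur_per_kwh = 0.40            # low end of the 40-80 cent/kWh range quoted above
    kwh_per_year = idle_watts * 24 * 365 / 1000
    print(f"{kwh_per_year:.0f} kWh/year -> ~{kwh_per_year * eur_per_kwh:.0f} EUR/year at idle")
    # -> 350 kWh/year -> ~140 EUR/year at idle
    ```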

    • @MissMan666
      @MissMan666 a year ago

      It also has TONS of issues in many games; Arc is really the worst choice of GPU. No wonder Intel has lost money on this project so far - a massive under-delivery according to their own goals.

  • @wargamingrefugee9065
    @wargamingrefugee9065 a year ago +1

    Blender now supports Arc GPUs.

  • @gibdi0n
    @gibdi0n a year ago +1

    Arc was an utter failure; the card simply did not work. I will not buy a car that I cannot drive out of the showroom... Now things are getting better. It is almost as good as a 3060, if you disregard the driver issues...

  • @PCSklejka
    @PCSklejka a year ago

    I like watching Tom, and I hope the best for Intel 💪🏻💪🏻

  • @fktpro215
    @fktpro215 a year ago

    Does this graphics card support FreeSync monitors?

  • @RobertNaik
    @RobertNaik a year ago +1

    6 blade razor.

  • @nunyobiznez875
    @nunyobiznez875 a year ago

    It's been well established in the courts that APIs are not copyrightable. So I've never understood why AMD and/or Intel don't just release their own CUDA implementations. CUDA garden problem, solved. While I love the open nature of OpenCL, there's no denying that CUDA is a much nicer, cleaner, and more logical API. I also don't see anything stopping Intel or AMD from embracing it, except maybe their pride, or perhaps patents, though that seems unlikely for an API.

    • @AshenTechDotCom
      @AshenTechDotCom a year ago

      For AMD, it's sort of a pride thing, I think; they never adopt anything Nvidia puts out, even when given the option to do so. Like with CUDA... they won't. Hell, NV hired the guy who got PhysX working on AMD cards to keep it from being released to the public; the dude admitted as much when pressed on why he never released it - got paid not to, and hired to ensure it wouldn't come out.
      What I find funny is, if Intel really built a straight-up CUDA pack for their cards, it might force AMD to do the same.
      I also see a future where AMD and Nvidia bake DXVK into their drivers like Intel has been doing, as it would remove a huge chunk of legacy code they have to maintain for legacy-title support and allow them to drop DX9 hardware support from their designs like Intel did.

  • @phoenixrising4995
    @phoenixrising4995 a year ago

    With Intel's Linux driver history, I'm wondering how the Xe driver is going to go.

  • @silasdhayanand8708
    @silasdhayanand8708 a year ago

    Will Arc ever reach its full potential?

  • @kyleking7060
    @kyleking7060 a year ago

    Haha, I've been in this industry for 20 years and I've just now changed my nomenclature: 19by10, 25by14... seriously, wow, I love it. This might save me several hours of my life, lol.
    Also, I love the A750 at $250 for a "good" time at a fair price point on open source. As a Linux daily driver I'm glad I'm not stuck using only AMD, which still feels like second class and has since ATI. It's really a PR issue, because some of their hardware is technically great, but public opinion tends to drive many markets these days, and more competition can't hurt.

  • @koyoshima45
    @koyoshima45 a year ago

    Hardware-wise, the A770 16GB has better specs than an RX 6800; they only need to work on the drivers!

  • @WarbossRB
    @WarbossRB a year ago

    I might buy one as a collector's piece, seeing how they won't release a standalone Battlemage or a later discrete card ever again.

  • @user-jv8ly3jm5t
    @user-jv8ly3jm5t a year ago

    Nice

  • @AhmaedMohammed-ss9zi
    @AhmaedMohammed-ss9zi 8 months ago

    Samsung galaxy M12 battery charger

  • @Vegemeister1
    @Vegemeister1 a year ago

    He shied away from giving any credit to DXVK at least twice, and one of those times he talked about Intel developers working when they should be sleeping. What about the open-source developers who have been working on this for the past 5 years? And then he blows off the question about Linux support?
    --my-next-gpu-wont-be-intel

  • @vin.k.k
    @vin.k.k a year ago

    So the high idle power is a design flaw...

  • @doculab3d
    @doculab3d a year ago

    11:45 "In time... our margins will improve." Thanks for the heads-up, preparing the lube now.

  • @-INFERNUS-
    @-INFERNUS- a year ago +2

    Will Intel release any new cards this year? I haven't watched the stream yet. Don't tell me the A770 is the only card this year? 😡😑

    • @evan9536
      @evan9536 a year ago +1

      I believe their Battlemage series is expected at the end of 2023/early 2024.

    • @-INFERNUS-
      @-INFERNUS- a year ago

      @@evan9536 OK, at least that's something. I was hoping for earlier, like April/May. Oh well.

  • @jelipebands1700
    @jelipebands1700 a year ago

    Yes, Nvidia wants to be like Apple 🍎. Remember GameWorks and GPP... Nvidia fighting Sony over the PS3 chips, and Apple over whose fault the faulty chips were. There is a reason why no one wants to work with Nvidia. I'm not bashing Nvidia; they are a business and they want control, just like Apple. That's why they wanted to buy Arm.

  • @OrjonZ
    @OrjonZ a year ago +2

    $250 for the A750, yet you can get a 6600 XT, which beats even the A770, for $257. It's not cheap enough, sorry. You can get a 6700 XT for the same price as the A770 and blow past its performance.

    • @alexs.2069
      @alexs.2069 a year ago

      Here in the EU you can get either an A750 or a 6600 non-XT for the same price; the A750 is the stronger part here for newer games.

    • @eagle_rb_mmoomin_418
      @eagle_rb_mmoomin_418 a year ago +2

      The issue is that XeSS is nearer to DLSS than FSR, and the Intel cards can run both. Also, Intel's RT performance is a LOT better than AMD's, at least at the level of the RTX 3000 series. More games are using RT now, and smart upscaling is more and more prevalent, so going forward the 6700 XT is going to struggle more and more. I just wouldn't buy a new RDNA2 card now unless it's at a huge discount; I'd wait for the mid-range RDNA3 cards, as those at least should have baseline decent RT performance somewhere between Nvidia's 3000 and 4000 series. You might not care about RT, but more and more developers are putting it in their games, and next-gen titles are replacing the last-gen ones.

  • @RayanMADAO
    @RayanMADAO a year ago +1

    Arc is still a giant waste of money; paying 50-100 dollars extra for a 3060 or a 6600 is still a way better option. No matter how many improvements they've made, they STILL have a lot of issues compared to established Nvidia and AMD drivers. It's just that now Arc cards are not embarrassingly bad like at launch, which is a weird thing to celebrate.

    • @noway8662
      @noway8662 a year ago

      The RX 6600 costs less than or the same as the A750, not more. I guess you mean the 6650 XT?

    • @RayanMADAO
      @RayanMADAO a year ago

      @@noway8662 Whatever - the point is that Arc is not cheap enough, considering how immature it is, to be worth considering.

    • @redslate
      @redslate a year ago

      The scenario is dynamic. The point being emphasized is that the Arc team intends to provide superior support for its products, and, to their credit, they've been proving this point beyond question.
      It is not unreasonable to think that the A770 will match or exceed the stability of the 3060. It's enough to consider cost vs. opportunity cost.

  • @RayanMADAO
    @RayanMADAO a year ago

    Wow, a PR person staying for 2 hours.

  • @ojassarup258
    @ojassarup258 a year ago

    Ehh, the A750 is not mind-blowing at $250 given the 6650 XT is about the same. Really, the A750 should be at $180 and the A770 at $250-ish or lower. The mid-range market is not great at the moment.

  • @Shonkuk1
    @Shonkuk1 a year ago +2

    Hey Keith, can you please learn to actually pay attention when you are on The Full Nerd? Go watch the video and look at yourself, then look at everyone else: you're just off in space, dreaming about sheep and biting your nails. Nod your head a little, pay attention, and stop reading whatever you're reading.

  • @JS-be8ll
    @JS-be8ll a year ago

    Lol, I love how this guy basically pretended Intel invented tiles, potentially more powerful APUs, etc., as if AMD hasn't been doing this for years.

    • @hughJ
      @hughJ a year ago +2

      Both multi-die packages and integrated graphics have been around for decades at this point. The difference is the nitty-gritty details of the implementation.

    • @JS-be8ll
      @JS-be8ll a year ago

      @@hughJ Sure, but AMD has nailed it, and this guy is talking like Intel just invented it.

    • @zivzulander
      @zivzulander a year ago +1

      It didn't sound like he said they invented it; that's just what their new direction is. It's unrealistic to expect Intel to come out and thank a competitor for laying any groundwork, though I'm sure if asked point-blank Tom would acknowledge AMD's role in the space, much like they would also (maybe begrudgingly) acknowledge ARM pioneering big.LITTLE design. All these tech companies build on each other's innovations, though. AMD doesn't need to thank Intel for x86, and Intel doesn't need to thank AMD for x64. Everyone just keeps moving and hopefully keeps innovating.

  • @sacamentobob
    @sacamentobob a year ago +3

    Intel Arc again? This is getting boring.

    • @ovemalmstrom7428
      @ovemalmstrom7428 a year ago +6

      Why? The only thing we hear about on the net is Nvidia, Nvidia, Nvidia, and some AMD. It's nice to have some Arc chat now and then.

    • @zivzulander
      @zivzulander a year ago +5

      It's a new entrant into the dedicated GPU space. If anything, the past few years of AMD and Nvidia not releasing anything good in the sub-$300 space were boring.

    • @sacamentobob
      @sacamentobob a year ago

      @@zivzulander New entrant? Lol. They've had embedded GPUs for a long, long time now, and this is arguably their third go at a discrete GPU.

    • @whosehandle
      @whosehandle a year ago

      @@sacamentobob Yes, those are the same things.

    • @redslate
      @redslate a year ago

      ARC's at the bleeding edge...

  • @marcasswellbmd6922
    @marcasswellbmd6922 a year ago +3

    Are we still talking about these crappy GPUs? OMG, just take them off the shelves or have a fire sale, but stop embarrassing yourselves, Intel. No, but seriously: if these cards had come out 3 years ago during the GPU shortage, more people would have bought them, and all the bugs would already be fixed. They are OK now, I guess - they're at a 6600 XT level. Better luck next gen...

    • @SebLukaGaming
      @SebLukaGaming a year ago +2

      You're so ignorant.

    • @marcasswellbmd6922
      @marcasswellbmd6922 a year ago

      @@SebLukaGaming What are you so hurt about? I was kidding, and they did turn out to be OK. But they are on the level of a 6600 XT, it's true... They are 3 years too late.

    • @dano1307
      @dano1307 a year ago +1

      @@marcasswellbmd6922 Gotta start somewhere. Even a company as big as Intel can't release a 3080-equivalent card on their first go.

    • @TechnessCorner
      @TechnessCorner a year ago +4

      Joking or not, your writing does not read as sarcasm.
      I just bought one for machine learning, and it plays ball with a 3070 and a 6700 in terms of 1440p gaming performance.
      On the other end, the 16GB of GDDR6 memory at this price showcases how greedy the other (seemingly beloved) brands are, drastically inflating the price of a card depending on its memory loadout (even though memory is cheap nowadays) - when the extra RAM is more likely to future-proof a gaming card than a clock boost is.
      ALSO @MarcasswellbMD, proofread your writing instead of editing it 3 times over as comments come in. That practice can be considered dodgy. Peace.

    • @TechnessCorner
      @TechnessCorner a year ago +3

      @@marcasswellbmd6922 The 6600 XT is over two years old as well. Don't be ignorant in your writing; it showcases you as an Nvidia or AMD shill in the GPU space. And in your case, don't close yourself off to other possibilities, as any extra competition is a blessing for us consumers. Market-segment-wise, it was unnecessary to compete at the top end when a logical starting segment was available: the mid-range/low end, where prices really should never have climbed that much. If they grab this market, their market share would far exceed what Intel could have achieved by only coming out with a 4090 equivalent. PEACE.

  • @thesupremeginge
    @thesupremeginge a year ago

    Intel Arc graphics
    Waste of precious resources
    No match for nature