Intel's GPU software needs some serious work!

  • Published: 6 Oct 2022
  • Overclocking GPUs has been a hobby of mine for nearly two decades, but Intel ARC just isn't worth even trying... here's why...
    Learn all about KIOXIA's lineup of consumer and enterprise-grade NVMe SSDs and celebrate their 35th anniversary of NAND flash by heading to www.kioxia.com/en-us/top.html
    Get your JayzTwoCents Merch Here! - www.jayztwocents.com
    ○○○○○○ Items featured in this video available at Amazon ○○○○○○
    ► Amazon US - bit.ly/1meybOF
    ► Amazon UK - amzn.to/Zx813L
    ► Amazon Canada - amzn.to/1tl6vc6
    ••• Follow me on your favorite Social Media! •••
    Facebook: / jayztwocents
    Twitter: / jayztwocents
    Instagram: / jayztwocents
    SUBSCRIBE! bit.ly/sub2JayzTwoCents
  • Science

Comments • 727

  • @coffeejn
    @coffeejn Год назад +566

    We need Intel to fix the ARC drivers before worrying about OC on these cards.

    • @BruceCarbonLakeriver
      @BruceCarbonLakeriver Год назад +6

      true that!

    • @davidescobar9309
      @davidescobar9309 Год назад +34

      Totally agree. Also, most midrange players won't really bother overclocking their cards. I believe most people looking at these cards are more interested in price vs. raw performance.

    • @Rspsand07
      @Rspsand07 Год назад +9

      @@davidescobar9309 Nobody should be looking at this card unless they like tinkering and are a heavy enthusiast, which is definitely who'll OC. If you want budget, Facebook Marketplace is best; I've seen $220 6700 XTs, $240 3070s, $280 6800s, and $400-450 6800 XTs and 3080s. Otherwise, a 6600/6600 XT/6650 XT if you absolutely want new.

    • @gharm9129
      @gharm9129 Год назад +31

      @@Rspsand07 No one should buy used miner cards. Let them rot for screwing with the supply and hoarding GPUs.

    • @Rspsand07
      @Rspsand07 Год назад +11

      @@gharm9129 That's a cut-off-your-nose-to-spite-your-face move. The only ones who win are Nvidia/AMD. If you're on a budget, you absolutely should consider them. Especially from brands with a transferable warranty; the latest gen will have 2-3 years of warranty left, a huge discount, and no taxes.

  • @0Camus0
    @0Camus0 Год назад +219

    Hey Jay, the first slider is the frequency offset; you are moving the entire V/F curve up. The second slider is how far along the V/F curve the frequency/voltage can go, which is why you see frequency changes on the voltage slider.
    Try maxing the frequency offset first, then play with the voltage. Tom Petersen was able to get 2.7 GHz from this GPU in real time.
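
A minimal numeric sketch of the two-slider model described above, using made-up V/F points rather than Intel's actual tables: a frequency offset shifts every point of the voltage/frequency curve up, while the second control caps how far along the curve the GPU is allowed to boost, which is why moving it also changes the peak clock you observe.

```python
# Illustrative only: hypothetical V/F points, not Intel's real tables.
# Baseline curve as (voltage_mV, frequency_MHz) pairs, low to high.
VF_CURVE = [(850, 2000), (950, 2200), (1050, 2400), (1150, 2500)]

def apply_offset(curve, offset_mhz):
    """Frequency-offset slider: every point of the V/F curve moves up by the offset."""
    return [(v, f + offset_mhz) for v, f in curve]

def clamp_max_point(curve, max_voltage_mv):
    """Second slider: limits how far along the curve the GPU may travel,
    so changing it also changes the peak frequency you see."""
    return [(v, f) for v, f in curve if v <= max_voltage_mv]

if __name__ == "__main__":
    tuned = clamp_max_point(apply_offset(VF_CURVE, 100), 1050)
    peak_v, peak_f = tuned[-1]
    print(f"Peak operating point: {peak_f} MHz at {peak_v} mV")  # 2500 MHz at 1050 mV
```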

  • @DanielFrost79
    @DanielFrost79 Год назад +544

    For their first GPU in 20 years to be released this 'average' is not bad.
    Now we only need a new driver update.
    I'll bet they've upped the driver performance for the release.

    • @will3641
      @will3641 Год назад +33

      Intel: what?! I heard nothing, our product is great already.

    • @bagelbytes69420
      @bagelbytes69420 Год назад +33

      I do like that RT features are very strong on these cards, especially compared to AMD. It will REALLY force AMD to either optimize for RT, or pump up the performance big time with future endeavors. Competition is king, especially with how shady Nvidia has been with pricing and treatment of AIB partners. *cough EVGA cough*

    • @kuikyroje
      @kuikyroje Год назад +15

      Especially at that price point! Makes me look like a turkey fucker for buying a 3080 Ti at 2000 AUD!!!

    • @arftrooper44
      @arftrooper44 Год назад +1

      @@bagelbytes69420 well... it's just 'E' now, no more 'VGA'

    • @gusfring96
      @gusfring96 Год назад +12

      And the hardware is actually around 3070-level based on tests in a lot of games, just bottlenecked by driver performance. If AMD solved it, so can Intel. A $329 potential 3070 is far, far from bad.

  • @TheTechAdmin
    @TheTechAdmin Год назад +15

    2:37 Once a programmer at MSI Afterburner gets ahold of a card with the software, they will be able to grab the data points from _this_ software. And update MSI Afterburner to support ARC.

  • @tadasnever
    @tadasnever Год назад +13

    It's kind of expected, since its specs on paper are way higher than the RTX 3060, but in reality most of the time it barely keeps up with the RTX 3060. In short: the drivers suck, but they can be improved over time, which keeps me optimistic about Intel GPUs.

  • @sinom
    @sinom Год назад +15

    AMD ReLive is their recording software. The interface is called Adrenalin.

    • @JirayD
      @JirayD Год назад

      Yeah, it's been called Adrenalin since 2017. Before that it was Crimson.

  • @Mass1veGam3r
    @Mass1veGam3r Год назад +58

    Tbh these cards aren't great, but they ain't bad; they're quite good for a first try and the target (mid-range GPU market). I hope people give them a chance, because we need more GPUs on the market, and the more players competing, the more competitive pricing we will get among all GPU makers. We need to break the duopoly if we want to keep getting GPUs without the need to sell a kidney.

    • @forog1
      @forog1 Год назад +4

      Yeah, we do need another GPU brand, and Intel is our best bet. There won't be another new GPU brand, sadly, as it's a lot for a new company to get started developing a GPU architecture; it would take years to finish and another few years and billions to release lol

    • @lasthopelost9090
      @lasthopelost9090 Год назад +4

      @XvX well yeah, they need people to use them; a sample size of a few hundred won't compare to maybe millions of users and all that data, and it could be a lot worse

    • @AntonMochalin
      @AntonMochalin Год назад +1

      @XvX what do you mean "resizable bar is a big red flag"? Either your system supports it and then you get the performance or it doesn't and then you simply shouldn't buy these cards. Poor DX11 support is a much worse problem for people like me but if one only plans to play newer games (there's a lot of people who only play Fortnite or Overwatch or Call of Duty etc) then these cards are a very good option.

    • @AntonMochalin
      @AntonMochalin Год назад

      @XvX ReBAR is just required if you want to use Arc; why would one want to turn it off? One just needs to make sure their CPU and motherboard support it.

    • @AntonMochalin
      @AntonMochalin Год назад +1

      @XvX yes, sure, you shouldn't use Arc in older systems which don't support ReBAR, just like you shouldn't try to install DX12 games on Windows 7. Other issues are driver issues (except probably DX11 support) and will be sorted out. The cards aren't even sold in stores yet. I've seen glitches a lot on Radeon cards a long time after release, and still there were enthusiasts who bought Radeon for some reason. If you want a more stable system, buy Nvidia. I wouldn't expect this Arc generation to have any noticeable market share even if there were no issues at all, but I guess in 6-12 months it will be quite a safe bet and a good bargain for some categories of consumers.

  • @TheXenetiX
    @TheXenetiX Год назад

    Great vid buddy! Keep up the good work!

  • @saffaboy2007
    @saffaboy2007 Год назад

    I love that Jay's titles aren't clickbait.

  • @acidcharon
    @acidcharon Год назад +10

    It would be hilarious if EVGA went full Intel GPUs and we even got a Kingpin Intel GPU.

    • @quantum5661
      @quantum5661 Год назад +2

      honestly I'd love it regardless of whose cards EVGA is making, I hate the idea of them leaving gaming entirely

    • @jay_tarantula899
      @jay_tarantula899 Год назад +1

      🤣 You didn't hear? They don't make ANY GPUs anymore.

  •  Год назад +26

    Was there no memory OC option?
    That would be nice, and it would also give a clue as to how limited the card is on memory bandwidth, if at all.

    • @tristanweide
      @tristanweide Год назад

      This card has a massive bus as well so memory bandwidth shouldn't be much of an issue. It has the same bus width as 3070/ti if I recall correctly.

    • @christophermullins7163
      @christophermullins7163 Год назад +3

      @@tristanweide That doesn't mean it isn't memory starved. Different architectures use bandwidth differently. Also... 3070s have much faster VRAM.
      256-bit is far from massive. There were lots of 384- and 512-bit cards in the past.

    •  Год назад +1

      @@christophermullins7163 Yeah, and you could even get a GTX 260 216SP card with a 448-bit bus for €139 back in the day (retail, not used) and SLI two of them without breaking the bank.
      It's a bit annoying, the price of cards today, knowing they could do 512-bit buses at a decent price if they wanted.

  • @limit4099
    @limit4099 Год назад

    I really like this set. I'll have to go back and watch your build videos because I'd like my room to look similar 👍

  • @Orain88
    @Orain88 Год назад

    That GPU Performance Boost slider seems to be like an undervolting tuner, except I don't see where you can drop the mV below its base voltage. I guess it would behave like an undervolt if you drop the wattage limit down while turning the GPU boost slider up? Is there an option for a negative voltage offset?
    I wish I had one of these to tinker with, but I don't really need a new GPU right now.

  • @landemossel
    @landemossel Год назад +1

    Someone I know told me that the GPU Performance Boost slider isn't any kind of MHz boost but an overall performance boost that (if possible) adjusts fan speed, temp limit, etc. That would explain the lower average clock speed at higher values: it's lowering the clock speed to keep the temperature under a certain threshold.

  • @adhdengineer
    @adhdengineer Год назад +13

    The card does look pretty in the chassis tho.

    • @anthonybarton1591
      @anthonybarton1591 Год назад

      I was pretty skeptical on the looks but it does look really nice in there.

  • @Geordie8t4DMZ
    @Geordie8t4DMZ Год назад

    Love the videos. Can I ask, what is that monitor please?

  • @SirB3ast
    @SirB3ast Год назад +4

    He talks about the card and I'm here like eyes locked onto the monitor. That's a really cool monitor. lol

  • @ClaireReed
    @ClaireReed Год назад

    The Performance Tuning configuration menu gives control over the V/F curve (GPU Performance Boost), voltage offset, power limit, and temperature limits.

  • @norbertsantos1133
    @norbertsantos1133 Год назад

    I’m more interested with the monitor you are using? Whats the brand / model? TY

  • @freeaccount6770
    @freeaccount6770 Год назад +4

    C'mon guys, ya gotta wait for the 770K "unlocked" Edition.

  • @leotoro51
    @leotoro51 Год назад

    Did you try the Fan Control application you recommended some time ago to change the Intel GPU fan RPM?

  • @PStateSkate
    @PStateSkate Год назад +12

    It would be interesting to see if minor undervolting could help bridge that gap with the wattage maxed out. Seems like the overhead from the stock voltage is a serious limiting factor despite a few degrees of temperature headroom to work with. Unfortunately that isn't an option in the current version of the OC software (or luckily lol)

  • @Joeyzoom
    @Joeyzoom Год назад +2

    Well at least the competition is there. They'll improve over time and bring innovation along with the other two brands. Looking forward to the future!

  • @jaanu2222
    @jaanu2222 Год назад +22

    It's a start for Intel. They've never made a discrete GPU before; people should calm themselves and have faith. They'll improve. It's essential to have a third company in the competition to get Nvidia and AMD on the right track so they don't get crazy with the pricing.

    • @MuShinnen
      @MuShinnen Год назад +2

      Yes they have. They stopped making them, but this is not their first foray into making DGPUs.

    • @fajaradi1223
      @fajaradi1223 Год назад

      Cough ...
      Cough ...
      Project Larrabee

    • @jaanu2222
      @jaanu2222 Год назад

      @@fajaradi1223 How many GPUs did they sell back then?

  • @SasquatchsCousin33
    @SasquatchsCousin33 Год назад

    Did the top slider affect memory speed?

  • @TestDriveInline5
    @TestDriveInline5 Год назад +1

    At 15:14 did I see the clock spiking to 3267 MHz? It's gotta be magic in there. The ghosts within the machine 😅

  • @cinoaz1243
    @cinoaz1243 Год назад +45

    Arc is just finding its footing. This is a first gen card folks and they are doing very well when compared with other mid-grade cards. I will buy one, not that I need it, but, to help support a third option in the GPU market. Thanks Intel Graphics, I'll continue to root for ya.

    • @SimonBauer7
      @SimonBauer7 Год назад +5

      EXACTLY. people fail to comprehend this

    • @DanielFrost79
      @DanielFrost79 Год назад +1

      Same here. But i need to upgrade my CPU first. (Still on my 8700)

    • @DanielFrost79
      @DanielFrost79 Год назад

      @@SimonBauer7 Yeah, I'm just ignoring the ignorant millennials because they just don't see the big picture here.

    • @craig9365
      @craig9365 Год назад +5

      @@SimonBauer7 Not really. They fail to accept what dealing with a first-gen product means. Not everyone wants to be a beta tester and deal with these issues, some of which may have little to no info on what's causing them or how to fix them. It's YOU who doesn't understand that Intel still has work to do, on top of the very obvious problem of the card being basically terrible for playing anything that isn't a new game.

    • @DanielFrost79
      @DanielFrost79 Год назад

      @@whalehunter2407 Have you ever even heard of "Beta Drivers"?... Probably not.

  • @FuckYoutubeHandles
    @FuckYoutubeHandles Год назад

    I'm digging the giant TV setup

  • @bekkerthesokuangeldragon68
    @bekkerthesokuangeldragon68 Год назад

    How can I download/install the Intel ARC software shown in the video?

  • @Velerios
    @Velerios Год назад

    Did you try to type in the wattage limit? Maybe the slider does NOT allow more than 228W, but if you type it in it might work. There are some cases where you can actually go over the slider.

  • @foxykaruptsock3213
    @foxykaruptsock3213 Год назад

    Did you ever hover the mouse over the '?' at the end of the lines? ...just in case it was a context-sensitive hover help/info button?

  • @einarabelc5
    @einarabelc5 Год назад

    What's your TV exactly, Jay? I'd like to get something like that for DCS. Thanks!!
    Hey, when is MRAM going to come out?

  • @geoderek7185
    @geoderek7185 Год назад

    what monitor are you using in this video?

  • @dustinhipskind7665
    @dustinhipskind7665 Год назад +4

    Could always resistor mod the A770 to see where you could take it.

  • @allmybasketsinoneegg
    @allmybasketsinoneegg Год назад

    Did you test if FanControl can handle the fans?

  • @user-bh5pf2py1l
    @user-bh5pf2py1l Год назад

    What is the monitor you're using here?

  • @HANK_SCORPI0
    @HANK_SCORPI0 Год назад

    @jayzTwoCents when you getting a Samsung Odyssey ARC monitor?

  • @yashkhd1100
    @yashkhd1100 Год назад +16

    Looking at the transistor count and process node, there is a very high probability that in a year or two, as Intel optimizes and stabilizes their driver stack, they will extract excellent performance from ARC. Being a software dev, I'm very much aware of how complex a graphics driver stack can be. I would definitely give Intel credit for bringing at least this level of stability to a brand-new product which is directly trying to bite into 3070 territory. It doesn't matter how much money you throw at driver development teams; it's going to take time to stabilize. This is a time problem, not a money problem.

  • @Krutonium
    @Krutonium Год назад +36

    Jay! Mod it so it doesn't know how much power it's pulling, and then put it on Water for more thermal headroom and see how far it can go!

    • @thelonelytimbit
      @thelonelytimbit Год назад +2

      How, and with what waterblock?

    • @tippyc2
      @tippyc2 Год назад +4

      @@thelonelytimbit How: shunt mod
      Waterblock: could be any CPU block, but they're gonna need a modified or custom mounting bracket

    • @420rvidxr
      @420rvidxr Год назад +1

      @@tippyc2 Uhhh, the chip isn't in a standard form factor, not even centered on the cooler's mounting holes; it's all proprietary.

    • @thelonelytimbit
      @thelonelytimbit Год назад +2

      @@tippyc2 shunt mods require a fair bit of specific board knowledge that simply doesn't exist yet for Intel. On AMD and Nvidia it's a lot easier because we have previous generations and established board design choices to go off of. I don't think Jay has the resources to be investigating shunt mods and designing cooler mounting brackets, they aren't LTT.

    • @tippyc2
      @tippyc2 Год назад +1

      @@420rvidxr Yeah, I watched GN's teardown too. Still, any generic CPU cooler and a custom mount would do it. I get that practically nobody in the tech space other than der8auer understands fabrication well enough to do it, but that doesn't mean it can't be done. It doesn't need to be pretty, it doesn't need to be perfectly centered over the chip, and it doesn't need to cool the RAM or VRMs.

  • @jerrodpettway
    @jerrodpettway Год назад

    What monitor are you using

  • @THEGEEK2001
    @THEGEEK2001 Год назад

    Is the A770 worth it for older Haswell CPU setups?

  • @rogerblack3338
    @rogerblack3338 Год назад

    Couldn't you use that third-party fan controller you did a video on?

  • @3dprintedodubunga405
    @3dprintedodubunga405 Год назад

    Nvidia does also allow wattage-based power adjustments, but only through nvidia-smi.

  • @MadClowdz
    @MadClowdz Год назад

    I want to see availability after initial launch first.

  • @ThereWasNoFreeName
    @ThereWasNoFreeName Год назад

    7:52 It was 13MHz on my 1080, and on the 3080 Ti it's 15MHz. Is it 7MHz for the 40-series?

  • @johnervin1062
    @johnervin1062 Год назад

    Does anyone know what display that is?

  • @johannestwickler242
    @johannestwickler242 Год назад

    Hey Jay, got a question. I got DDR5-6000 CL32 memory from Corsair, but when I turn on the XMP profile my PC restarts and doesn't apply it; it says 4800MHz, but it's 6000. Do you know why the XMP profile restarts my PC and doesn't stick? (12700K, 32GB DDR5-6000 CL36, Gigabyte Z690 UD AX DDR5 motherboard, RTX 3090, 1250W Cooler Master PSU) Do you know what the problem might be?

  • @teardowndan5364
    @teardowndan5364 Год назад +1

    The mystery slider appears to be a % boost: 20-24MHz (~1%) per '1' of the unitless slider when none of the other limits are being hit first.
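
A quick back-of-envelope check of the "% boost" reading above. The stock boost clocks used here are assumed values for illustration only; at 1% per slider unit they land right in the observed 20-24 MHz window.

```python
# Sanity check of the "~1% per slider unit" estimate above.
# The stock boost clocks below are assumed values for illustration only.
for stock_mhz in (2100, 2200, 2400):
    per_unit = stock_mhz * 0.01  # 1% of the assumed stock boost clock
    print(f"stock {stock_mhz} MHz -> ~{per_unit:.0f} MHz per slider unit")
# stock 2100 MHz -> ~21 MHz per slider unit
# stock 2200 MHz -> ~22 MHz per slider unit
# stock 2400 MHz -> ~24 MHz per slider unit
```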

  • @renchesandsords
    @renchesandsords Год назад

    Does Arc use the Nvidia or AMD power-sensing method? If it's Nvidia's, we all know what you can do...

  • @kyoudaiken
    @kyoudaiken Год назад

    To me the topmost slider with the unknown unit looks to be clock bins.

  • @dhoop4482
    @dhoop4482 Год назад +41

    I understand Arc has a ways to go but it will be interesting to see what changes are made with both the "LE" cards (wish they could have come up with something better than Limited Edition/hope they change it in the future) and also how the AIB cards perform early next year.
    It seems like Jay was having some fun with overclocking the card with the understanding that it's Intel's first attempt and knowing improvements will be made if they want to compete. From everything I've seen, Tom Petersen seems like he genuinely cares about making Intel's graphics cards a viable third option, and I hope he succeeds since the more competition, the better for all of us.

    • @patx35
      @patx35 Год назад +2

      LE is Intel's naming for "Founders Edition", except that the GPU performance is identical to partner models. It's also only available as the 16GB model.

    • @brandonokeeffe1193
      @brandonokeeffe1193 Год назад

      Two things that made me think it was the LE he was OCing: GPU RAM usage being over 8000MB at 49% utilization,
      and at around the 0:57 mark he gives the system specs, where he says it is the LE card.

  • @TheSmileyTek
    @TheSmileyTek Год назад

    Wonder if NZXT CAM can recognize it. I actually use CAM over Afterburner. No particular reason I use CAM, I just do. Works fine for manual GPU OCing. Shunt mods? 🤔

  • @matiko13
    @matiko13 Год назад

    10:30 --> You could try clicking the '?' icon; it might show some more information about what the settings do in the Intel software.

  • @j340_official
    @j340_official Год назад

    Try the overclock in other apps, like 3DMark and other games, and also at 1080p. Are the other apps/games also insensitive to the overclock?

  • @sidewinderr
    @sidewinderr Год назад

    what monitor is that Jay!!!!

  • @linndoria
    @linndoria Год назад

    Jay, will you be doing an Arc follow up around March/April 2023 to see how the latest drivers by then have improved the overall impressions of the A750/A770? It would be about 6 months since it launched.

  • @YAAMW
    @YAAMW Год назад

    1:24 And a giant glass panel choking the airflow in the front

  • @snake4game
    @snake4game Год назад

    15:15 Is that a bug, or what happened to make it go to 3267MHz? :o

  • @DJH316007
    @DJH316007 Год назад

    What does the ? next to GPU Performance Boost say?

  • @electricdragon9366
    @electricdragon9366 Год назад

    Is this why the driver for Xe-LP got pulled from the download server a few days ago? to fix driver bugs?
    (Xe-LP is the GPU in Tiger & Alder Lake mobile CPUs, and it's fairly good)

  • @bustymclovin
    @bustymclovin Год назад +20

    Oh man, it is SOOOO much nicer when you do a video with monitor footage on a giant screen like you did today. Please keep using it. Just gotta keep the camera from moving so much. 😜

  • @gustavo_vanni
    @gustavo_vanni Год назад

    Why such a tiny monitor Jay?
    I'm barely able to see the screen!!

  • @christhorney
    @christhorney Год назад

    I wonder how a light underclock would make it perform at a lower wattage.

  • @ashnocrash7439
    @ashnocrash7439 Год назад

    Do you think a 750 psu would be enough to overclock a 3070 founders?

  • @freekvanemaus
    @freekvanemaus Год назад

    Fun fact: I installed the ARC drivers just to see if that control panel would work for me, and it did. I'm running an ultra-slim laptop with Intel Iris Xe graphics, and the normal Graphics Command Center kinda sucks; it doesn't really give you anything to adjust. Of course this iGPU does not allow for any tweaking, but I still get that nice overlay with some telemetry I did not have otherwise, like voltage (up to a whopping 900mV) and GPU clock speed (smashing 1300MHz).
    I did have to re-install the Xe drivers afterwards, but I still have the ARC software even if it's just a nice gimmick.
    I'm planning to build a gaming PC over the next couple of months and your content has been super helpful. I'm also considering trying an Intel GPU for my very first build, if they become available here for a reasonable price that is. Keep up the great work!

  • @Michael.Vatter
    @Michael.Vatter Год назад

    Can you tell me something? Do the Intel GPUs have something like AMD FreeSync or NVIDIA G-Sync? I would buy one, but without anything like this it makes no sense for me and my G-Sync monitor...

  • @Oxedizer
    @Oxedizer Год назад

    For the ones at the back of the class... what monitor is that, bud?

  • @saleh3521
    @saleh3521 Год назад +1

    7:23 Could someone explain this? I get that more power means more clock. But how does the clock go down on its own to stop the power going over the limit? Shouldn't it be basic: you need X power to reach Y clock; if you have the power you get the clock, and if you don't have the power you don't get the clock. But how does the clock lower itself so the power doesn't go over the limit?
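
One way to picture the behavior being asked about: modern GPUs run a power-management loop that continuously measures board power and steps the clock (and the voltage that goes with it) down a bin whenever the limit would be exceeded, rather than holding a fixed clock. A toy sketch with made-up constants, not Arc's actual firmware:

```python
# Toy model of a GPU power-limit governor (made-up constants, not Arc's firmware).
POWER_LIMIT_W = 228
CLOCK_STEP_MHZ = 25

def estimated_power(clock_mhz, load):
    """Crude stand-in for measured board power: grows with clock and load."""
    return load * (0.05 * clock_mhz + 1e-5 * clock_mhz ** 2)

def governor_step(clock_mhz, load):
    """One iteration: drop a clock bin if over the limit, climb one if there's headroom."""
    if estimated_power(clock_mhz, load) > POWER_LIMIT_W:
        return clock_mhz - CLOCK_STEP_MHZ
    return clock_mhz + CLOCK_STEP_MHZ

if __name__ == "__main__":
    clock, history = 2400, []
    for _ in range(30):                # under a heavy, steady load...
        clock = governor_step(clock, load=1.0)
        history.append(clock)
    print(history[-4:])                # ...the clock settles into a small oscillation right at the power limit
```

In other words, Y clock does not demand a fixed X watts up front; the governor keeps trading clock and voltage against measured power, so the clock sags exactly as much as needed to stay under the cap.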

  • @2528drevas
    @2528drevas Год назад

    I ran the Shadow of the Tomb Raider benchmark with FidelityFX and XeSS on my XFX 6700 XT, because unfortunately FSR 2.0 isn't supported in the benchmark. XeSS provided a substantial boost. I want to see how it would match up to FSR 2.0 though.

    • @RobBCactive
      @RobBCactive Год назад

      Strange, I ran XeSS on an RX 6700 XT at 1440p and it was ~5% slower than native, whereas FSR was ~5% faster at quality (80% resolution).
      Others reported similar findings.

  • @drunksupportcharacter
    @drunksupportcharacter Год назад +3

    I think Jay is probably one of the few out there that understands this product. It's not gonna make waves out of the box, but it has the potential to, and that is what's good about it; it's a toy for the ones who want to tinker with things right now.

  • @UnBoundBeatz
    @UnBoundBeatz Год назад

    I would be curious to see how it performs under XOC. Power shunt mods? LN2?

  • @pooourya
    @pooourya Год назад

    Can you put timestamps on your vids? Thanks❤️

  • @Dobrufusnoretro
    @Dobrufusnoretro Год назад

    In Europe I still don't see any Intel GPUs in stores... are they selling already in America?

  • @Veptis
    @Veptis Год назад

    I think it just has a more aggressive v/f curve if you move that slider. Any extreme OC on those cards yet?

    • @EbonySaints
      @EbonySaints Год назад

      Late reply, but ScatterBencher OC'd the A380 to 3100 MHz about three months ago. Shamino has also just released a tool that allows for locked frequency/voltage overclocking on Arc.
      Word of warning though, the tool is in a very beta state. Setting the voltage on my ASRock A380 to anything leaves it at 1.15V. The power limit either obeys the 66W cap set by Intel or completely disregards it and goes for the any% WR to 100°C. All the while, the PL1 limit is hit and the clocks drop to 2150 MHz if I go above 2500 MHz, and I freak out about a potential house fire in the making. And there's no way to switch between locked and offset overclocking, so you'll have to restart your computer to do so.
      If you have more experience/time/resources for overclocking, then you can take a stab at it. I've managed to get the A380 a score of 7610 in Superposition 1080p Medium using the Intel OC tool (58% offset and +20 mV for about ~2650 MHz), but it was pretty much unstable there. Anything else, other than Cyberpunk oddly enough, didn't play nice with a >50% offset either. Furmark in particular limits me to about a 34% offset. I'm not going to be a trailblazer in setting my computer on fire with Arc. I'm too broke to buy another GPU right now.

  • @TheTechAdmin
    @TheTechAdmin Год назад +3

    Is EVGA going to be an Intel ARC partner?! That would be such an awesome slap in the face for Nvidia.

    • @ColdRunnerGWN
      @ColdRunnerGWN Год назад

      It's likely that EVGA has a non-compete clause in their agreement, and if that's the case then it will be a while before they could even think of doing it.

    • @terribleatgames-rippedoff
      @terribleatgames-rippedoff Год назад

      @@ColdRunnerGWN Actually, they don't have such a clause. There's a GN video about that; it's one of the ones after the initial reveal video.

    • @terribleatgames-rippedoff
      @terribleatgames-rippedoff Год назад +1

      EVGA knows better than to join Intel ARC's clown fiesta, but they are really missing a huge opportunity by not wanting to go with AMD Radeon. If you are EVGA and you'd rather downsize and restructure to focus on other product segments than jump into bed with AMD on favourable terms, then that says a lot about EVGA's CEO.

    • @ColdRunnerGWN
      @ColdRunnerGWN Год назад

      @@terribleatgames-rippedoff Thanks for the info. I didn't watch the whole video, so I missed that. I'm surprised that they didn't, as a lot of companies don't want you just jumping ship like that.

  • @WillFuI
    @WillFuI Год назад

    I wonder if an undervolt would help.

  • @BeeWaifu
    @BeeWaifu Год назад

    There a reason you're using a wall-sized monitor?

  • @UnfilteredNonsense
    @UnfilteredNonsense Год назад +7

    ok now that is what i call a monitor

    • @ron200088
      @ron200088 Год назад

      Exactly my thoughts! And Jay's a big guy, and he looks small compared to that behemoth. I bet he got it to compensate for the "small" 4090.

    • @Xilent1
      @Xilent1 Год назад

      How big is that monitor anyway?

  • @banisherblade
    @banisherblade Год назад

    Afterburner doesn't work; what about Precision X or other OC software?

  • @harbscantina
    @harbscantina Год назад

    What is that monitor on your desk?! IT'S HUGE!

  • @mccloud_1999
    @mccloud_1999 Год назад

    Hey Jay, would you ever sell a 2080 Ti? I've been trying to get back into gaming, so I bought an EVGA 1070 Ti, but I'm running everything with a 500W PSU and an AMD FX-8350. I believe my CPU is on its deathbed. It's running at 100% as soon as I start the PC, and it's starting to get extremely hot during idle or gameplay; it gets around 165-175°F. I'm planning on getting a new system with an Intel i7-7700K, but I want to start streaming as well. I don't want to run my 1070 Ti into the ground with what I want to do.

  • @LeoWasHere1
    @LeoWasHere1 Год назад

    Please make an up-to-date video on Intel Arc and talk about what's changed and what's not.

  • @kianmoiny7860
    @kianmoiny7860 Год назад

    can it not be undervolted?

  • @xthemarket
    @xthemarket Год назад

    Nice to see another man using a 55"+ as a monitor 💪
    I'm using an LG C1 55"

  • @robertpearson8546
    @robertpearson8546 Год назад +2

    In 2011 Ćuk came up with a better VRM design, the 2-phase Ćuk-buck2 VRM. But the graphics card manufacturers (and the motherboard manufacturers) prefer the 1920 design (60% efficiency) to the 2011 design (99.5% efficiency). The newer design is also cheaper, smaller, has less ripple, and has a better transient response. Plus, in the 1920 design, if any FET shorts, the GPU voltage goes to 12 V.

  • @winners160
    @winners160 Год назад

    I have a 3080 FE and an Elgato Mk.2 capture card.
    Can this Intel Arc encode for my game capture card using the Elgato 4K utility app?

    • @winners160
      @winners160 Год назад

      I want my 3080 FE dedicated to just playing games.

  • @ARealPain
    @ARealPain Год назад

    That GPU do kinda be lookin fancy with that little RGB stripe.

  • @Shijayy
    @Shijayy Год назад +2

    Why don't I have the performance tuning thing?

  • @Kromunos
    @Kromunos Год назад +8

    I'd still support ARC if I can. Sure, they fall short compared to both AMD and NVIDIA, but they're just starting out. Also, it's good to have a third option when buying GPUs in the future.

  • @bagelbytes69420
    @bagelbytes69420 Год назад +1

    The power limit on these seems to be the main issue. I would say that undervolting is likely the way forward with these first-party cards, similar to how my MSI 3080 is hard-locked to 330W max when other ones have a higher limit. If I want the most performance out of my 330W, I change the voltage curve in MSI Afterburner in order to clock up to ~1900-2000MHz. Simply adding more to the GPU slider and pumping up the voltage increases heat and gives me lower clocks. I think if these cards can be made to work with Afterburner, and have editable voltage curves, this should be what you all test next.

    • @namegoeshere197
      @namegoeshere197 Год назад

      Just a shame that Nvidia also took 100% user voltage control away from us a few years back :(

    • @fnorgen
      @fnorgen Год назад

      These modern cards have so little power headroom that tuning for power efficiency often becomes the most important thing. Personally, I've become really fond of playing voltage limbo and even slight underclocking after I realized I could get my card running cool and whisper quiet at 85% power, with only a minuscule performance sacrifice. Easily worth it in most games.
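
The thread above boils down to dynamic power scaling roughly with voltage squared times frequency, so shaving voltage at a given clock frees a disproportionate share of a hard power cap. A back-of-envelope sketch with illustrative numbers only (the constant is chosen just so the totals land near the 330 W limit mentioned above, and static/leakage power is ignored):

```python
# Back-of-envelope illustration of why undervolting helps under a hard power cap.
# Illustrative numbers only; k is picked so results land near the 330 W limit
# mentioned in the comment above, and static/leakage power is ignored.

def dynamic_power(voltage_v, freq_mhz, k=0.16):
    """Simplified dynamic power model: P ~ k * V^2 * f."""
    return k * voltage_v ** 2 * freq_mhz

stock = dynamic_power(1.05, 1900)        # stock point on the V/F curve
undervolted = dynamic_power(0.95, 1900)  # same clock, ~100 mV less

print(f"stock:       {stock:.0f} W")        # ~335 W
print(f"undervolted: {undervolted:.0f} W")  # ~274 W
print(f"freed up:    {stock - undervolted:.0f} W of budget for higher clocks")
```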

  • @cheekychillipepper
    @cheekychillipepper Год назад

    How do you know it will be passed on to AIBs?

  • @WhiskeyInspekta
    @WhiskeyInspekta Год назад

    The Acer BiFrost OC software lets you change fan speed, but doesn't let you change voltage.

  • @shiftyidog-
    @shiftyidog- Год назад

    Does it undervolt?

  • @novelsandcrumbs3558
    @novelsandcrumbs3558 Год назад

    Should try a Xeon E5, because mine works. Confused too, by the way.

  • @Nitricta
    @Nitricta Год назад +1

    I really hoped that Intel would come out with a lot of VRAM, but this is quite good as well.

  • @ancient1der
    @ancient1der Год назад +2

    Were you considering doing a test of the AV1 encoding capabilities? I saw that even the A380 was impressive with it; it would be interesting to see how the 750 and 770 perform.

    • @shiftyidog-
      @shiftyidog- Год назад +1

      Both cards have the same onboard encoder controller, I believe, so they should be the same.

    • @shiftyidog-
      @shiftyidog- Год назад +1

      380 and 770/750 I mean

    • @albaniqn
      @albaniqn Год назад +1

      @@shiftyidog- It would be viable to have the A380 solely for encoding footage and leave whatever other graphics card you have completely unaffected.

  • @RubyRoks
    @RubyRoks Год назад +1

    They have the outline for a lot of great things; they just need to focus on the execution. And fan control. Focus on giving the user fan control.
    I feel like custom BIOSes are gonna be really powerful on this first run of Intel cards.

  • @666thsense
    @666thsense Год назад

    that's gotta be the 48" right?

  • @johnb3170
    @johnb3170 Год назад

    Ugh, I forgot, what's the monitor he is running in this video again?

  • @monster4play
    @monster4play Год назад +2

    Did you try Intel XTU for overclocking? Since XTU is an Intel tool, you may have better luck...