Intel's GPU software needs some serious work!
- Published: Oct 6, 2022
- Overclocking GPUs has been a hobby of mine for nearly two decades; however, Intel ARC just isn't worth even trying... here's why...
Learn all about KIOXIA's lineup of consumer and enterprise-grade NVMe SSDs and celebrate their 35th anniversary of NAND flash by heading to www.kioxia.com/en-us/top.html
Get your JayzTwoCents Merch Here! - www.jayztwocents.com
○○○○○○ Items featured in this video available at Amazon ○○○○○○
► Amazon US - bit.ly/1meybOF
► Amazon UK - amzn.to/Zx813L
► Amazon Canada - amzn.to/1tl6vc6
••• Follow me on your favorite Social Media! •••
Facebook: / jayztwocents
Twitter: / jayztwocents
Instagram: / jayztwocents
SUBSCRIBE! bit.ly/sub2JayzTwoCents
We need Intel to fix the ARC drivers before worrying about OC on these cards.
true that!
Totally agree. Also, most midrange players won't really bother overclocking their cards. I believe most people who are looking at these cards are more interested in price versus raw performance.
@@davidescobar9309 Nobody should be looking at this card unless they like tinkering and are a heavy enthusiast, which is definitely who'll OC. If you want budget, Facebook is best, where I've seen $220 6700 XTs, $240 3070s, $280 6800s, and $400-450 6800 XTs and 3080s; otherwise a 6600/6600 XT/6650 XT if you absolutely want new.
@@Rspsand07 No one should buy used miner cards. Let them rot for screwing with the supply and hoarding gpus.
@@gharm9129 That's one of those spite-yourself moves. The only one who wins is Nvidia/AMD. If you're on a budget, you absolutely should buy used. Especially from brands with a transferable warranty and the latest gen, which will have 2-3 years of warranty left, a huge discount, and no taxes.
Hey Jay, the first slider is the frequency offset: you are moving the entire V/F curve up. The second slider is how far along the V/F curve the frequency/voltage can go; that's why you see frequency changes on the voltage slider.
Try maxing the frequency offset first, then play with the voltage. Tom Petersen was able to get 2.7 GHz from this GPU in real time.
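If that reading of the sliders is right, they can be sketched as a toy model: the first shifts the whole V/F curve up by a fixed frequency offset, while the second caps how far along the curve the GPU may travel. A minimal illustration (all curve points, names, and numbers are invented for the sketch; this is not Intel's actual implementation):

```python
# Toy model of a GPU voltage/frequency (V/F) curve and the two sliders.
# All numbers are made up for illustration.

# Stock V/F curve: (voltage in mV, frequency in MHz) points.
vf_curve = [(850, 2000), (950, 2200), (1050, 2400), (1150, 2600)]

def apply_sliders(curve, freq_offset_mhz, max_voltage_mv):
    """Frequency offset shifts the whole curve up; the voltage slider
    caps how far along the curve the GPU is allowed to travel."""
    shifted = [(v, f + freq_offset_mhz) for v, f in curve]
    return [(v, f) for v, f in shifted if v <= max_voltage_mv]

# Max out the frequency offset first, then open up the voltage limit:
tuned = apply_sliders(vf_curve, freq_offset_mhz=100, max_voltage_mv=1050)
print(tuned)                      # [(850, 2100), (950, 2300), (1050, 2500)]
print(max(f for _, f in tuned))   # 2500
```

In this model, raising the voltage cap unlocks higher points on the already-shifted curve, which would match seeing frequency changes when moving the "voltage" slider.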
For their first GPU in 20 years to come out this 'average' is not bad.
Now we only need a new driver update.
I'll bet they'll have upped the driver performance by release.
Intel: what?! I heard nothing, our product is great already.
I do like that RT features are very strong on these cards, especially compared to AMD. It will REALLY force AMD to either optimize for RT or pump up the performance big time with future endeavors. Competition is king, especially with how shady Nvidia has been with pricing and treatment of AIB partners. *cough EVGA cough*
Especially at that price point! Makes me look like a turkey fucker for buying a 3080TI @ 2000 AUD!!!
@@bagelbytes69420 well... just "E" now, no more "VGA"
And the hardware is actually around 3070-level based on tests across a lot of games, just bottlenecked by driver performance. If AMD solved it, so can Intel. A $329 potential 3070 is far, far from bad.
2:37 Once a programmer at MSI Afterburner gets ahold of a card with the software, they will be able to grab the data points from _this_ software and update MSI Afterburner to support ARC.
It's kind of expected, since its specs on paper are way higher than the RTX 3060, but in reality most of the time it barely keeps up with the RTX 3060. In short: the drivers suck, but they can be improved over time, which keeps me optimistic about Intel GPUs.
AMD ReLive is their recording software. The interface is called Adrenalin.
Yeah, it's been called Adrenalin since 2017. Before that it was Crimson.
Tbh these cards aren't great, but they ain't bad; they're quite good for a first try and for the target (mid-range GPU market). I hope people give it a chance, because we need more GPUs on the market, and the more players competing, the more competitive pricing we will get among all GPU makers. We need to break the duopoly if we want to keep getting GPUs without the need to sell a kidney.
Yeah, we do need another GPU brand, and Intel is our best bet. There won't be another GPU brand sadly, as it's a lot for a new company to get started developing a new GPU architecture, which would take years to finish and another few years and billions to release lol
@XvX Well yeah, they need people to use them; a sample size of a few hundred won't compare to maybe millions of users and all that data. And it could be a lot worse.
@XvX what do you mean "resizable bar is a big red flag"? Either your system supports it and then you get the performance or it doesn't and then you simply shouldn't buy these cards. Poor DX11 support is a much worse problem for people like me but if one only plans to play newer games (there's a lot of people who only play Fortnite or Overwatch or Call of Duty etc) then these cards are a very good option.
@XvX Rebar is just required if you want to use Arc, why would one want to turn it off? One just needs to make sure their CPU and MB support it.
@XvX Yes, sure, you shouldn't use Arc in older systems which don't support ReBAR, just like you shouldn't try to install DX12 games on Windows 7. The other issues are driver issues (except probably DX11 support) and will be sorted out. The cards aren't even sold in stores yet. I've seen glitches a lot on Radeon cards a long time after release, and still there were enthusiasts who bought Radeon for some reason. If you want a more stable system, buy Nvidia. I wouldn't expect this Arc generation to have any noticeable market share even if there were no issues at all, but I guess in 6-12 months it will be quite a safe bet and a good bargain for some categories of consumers.
Great vid buddy! Keep up the good work!
I love that Jay's titles aren't clickbait.
It would be hilarious if EVGA goes full Intel GPUs and we even get a KingPin intel GPU
honestly I'd love it regardless of whose cards EVGA is making, I hate the idea of them leaving gaming entirely
🤣 u didn't hear? They don't make ANY GPUs anymore.
Was there no memory OC option?
That would be nice and it would also give a clue to how limited the card is on memory bandwidth if at all.
This card has a massive bus as well so memory bandwidth shouldn't be much of an issue. It has the same bus width as 3070/ti if I recall correctly.
@@tristanweide that doesn't mean it isn't memory starved. Different architecture uses bandwidth differently. Also.. 3070s have much faster vram.
256bit is far from massive. Lots of 384 and 512 bit cards in the past.
@@christophermullins7163 Yeah, and you could even get a GTX 260 216SP card with a 448-bit bus for 139€ back in the day (retail, not used) and SLI two of them without breaking the bank.
It's a bit annoying, the price of cards today, knowing they could do 512-bit buses if they wanted, and at a decent price.
I really like this set. I'll have to go back and watch your build videos, because I'd like my room to be similar 👍
That GPU performance boost slider seems to be like an undervolting tuner, except I don't see where you can drop the mV below its base voltage. I guess it would behave like an undervolt if you drop the wattage limit down while turning the GPU boost slider up? Is there an option to negatively offset the voltage?
I wish I had one of these to tinker with but I don't really need a new gpu right now.
Someone I know told me that the GPU performance boost slider isn't any kind of MHz boost but an overall performance boost using (if possible) fan speed, temp limit, etc. That would explain the lower average clock speed at higher values: it's lowering the clock speed to keep the temp under a certain threshold.
The card does look pretty in the chassis tho.
I was pretty skeptical on the looks but it does look really nice in there.
Love the videos. Can I ask, what is that monitor please?
He talks about the card and I'm here like eyes locked onto the monitor. That's a really cool monitor. lol
The Performance Tuning configuration menu gives control over the v/f curve (GPU Performance Boost), voltage offset, power limit, and temperature limits
I’m more interested with the monitor you are using? Whats the brand / model? TY
C'mon guys, ya gotta wait for the 770K "unlocked" Edition.
Did you try the Fan Control application that you recommended some time ago to change the Intel GPU fan RPM?
It would be interesting to see if minor undervolting could help bridge that gap with the wattage maxed out. Seems like the overhead from the stock voltage is a serious limiting factor despite a few degrees of temperature headroom to work with. Unfortunately that isn’t an option in the current version of the oc software (or luckily lol)
Well at least the competition is there. They'll improve over time and bring innovation along with the other two brands. Looking forward to the future!
It's a start for Intel; they've never made a discrete GPU before. People should calm themselves and have faith; they'll improve. It's essential to have a third company for competition, to get Nvidia and AMD on the right track so they don't get crazy with the pricing.
Yes they have. They stopped making them, but this is not their first foray into making DGPUs.
Cough ...
Cough ...
Project Larrabee
@@fajaradi1223 how many GPUs did they sell back then?
Did the top slider affect memory speed?
At 15:14 did I see the clock spiking to 3267 MHz? It's gotta be magic in there. The ghosts within the machine 😅
Arc is just finding its footing. This is a first gen card folks and they are doing very well when compared with other mid-grade cards. I will buy one, not that I need it, but, to help support a third option in the GPU market. Thanks Intel Graphics, I'll continue to root for ya.
EXACTLY. people fail to comprehend this
Same here. But i need to upgrade my CPU first. (Still on my 8700)
@@SimonBauer7 Yeah, I'm just ignoring the ignorant millennials because they just don't see the big picture here.
@@SimonBauer7 Not really. It's that not everyone wants to be a beta tester and deal with these issues, some of which may have little to no info on what's causing them or how to fix them. It's YOU who doesn't understand that Intel still has work to do, on top of the very obvious problem of the card being basically terrible for playing anything that isn't a new game.
@@whalehunter2407 Have you ever even heard of "Beta Drivers"?... Probably not.
I'm diggin the giant tv setup
How can i install/download the intel ARC software like from the video?
Did you try to type in the wattage limit? Maybe the slider does NOT allow more than 228W, but if you type it in it might work. There are some cases where you can actually go over the slider.
Did you ever hover the mouse over the ' ? ' at the end of the lines ? ....just in case it was a context sensitive hover Help info button ?
What's your TV exactly Jay. I'd like to get something like that for DCS. Thanks!!
Hey, when is M-RAm going to come out..
what monitor are you using in this video?
Could always resistor mod the A770 to see where you could take it.
Did you test if FanControl can handle the fans?
What is the monitor you're using here?
@jayzTwoCents when you getting a Samsung Odyssey ARC monitor?
By looking at the transistor count and process node, there is a very high probability that in a year or two, as Intel optimizes and stabilizes their driver stack, they will extract excellent performance from ARC. Being a software dev, I'm very much aware of how complex a graphics driver stack can be. I would definitely give Intel credit for bringing at least this level of stability to a brand-new product which is directly trying to bite into 3070 territory. It doesn't matter how much money you throw at driver development teams; it's gonna take time to stabilize. This is a time problem, not a money problem.
Jay! Mod it so it doesn't know how much power it's pulling, and then put it on Water for more thermal headroom and see how far it can go!
How, and with what waterblock?
@@thelonelytimbit How: shunt mod
Waterblock: could be any CPU block, but they're gonna need a modified or custom mounting bracket
@@tippyc2 uhhh, the chip isn't a standard form factor, and it's not even centered on the cooler's mounting holes; it's all proprietary
@@tippyc2 shunt mods require a fair bit of specific board knowledge that simply doesn't exist yet for Intel. On AMD and Nvidia it's a lot easier because we have previous generations and established board design choices to go off of. I don't think Jay has the resources to be investigating shunt mods and designing cooler mounting brackets, they aren't LTT.
@@420rvidxr Yeah, I watched GN's teardown too. Still, any generic CPU cooler and a custom mount would do it. I get that practically nobody in the tech space other than der8auer understands fabrication enough to do it, but that doesn't mean it can't be done. It doesn't need to be pretty, it doesn't need to be perfectly centered over the chip, it doesn't need to cool the RAM or VRMs.
What monitor are you using
is a770 worth it for older Haswell cpu setups?
Couldn't you use that 3rd party fan controller you did a video on ?
Nvidia also allows wattage-based power adjustments, but only through nvidia-smi.
I want to see availability after initial launch first.
7:52 It was 13MHz on my 1080 and on 3080Ti it's 15MHz. Is 7MHz for 40x0 series?
Does anyone know what display that is?
Hey Jay, got a question. I got DDR5 6000MHz CL32 memory from Corsair, but when I turn on the XMP profile, my PC restarts and doesn't apply XMP; it says 4800MHz, but it's 6000. Do you know why the XMP profile restarts my PC and doesn't turn on? (12700K, 32GB 6000MHz DDR5 CL36, Gigabyte Z690 UD AX DDR5 MB, RTX 3090, 1250W Cooler Master PSU)
The mystery slider appears to be a % boost - 20-24MHz (~1%) per '1' of the unit-less slider when none of the other limits are being hit first.
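That reading is easy to sanity-check: at roughly 1% per slider unit, the observed 20-24 MHz step implies a base clock somewhere around 2.0-2.4 GHz, which lines up with the A770's stock boost clock. A quick back-of-the-envelope check (the 2100 MHz base clock is an assumption for illustration):

```python
# Sanity check: does ~1% per slider unit explain the observed 20-24 MHz steps?
STOCK_BOOST_MHZ = 2100  # assumed A770 boost clock, illustration only

def mhz_per_slider_unit(percent_per_unit, base_mhz=STOCK_BOOST_MHZ):
    """MHz gained per slider unit if each unit is a fixed % of base clock."""
    return base_mhz * percent_per_unit / 100

step = mhz_per_slider_unit(1)
print(step)  # 21.0 -> inside the observed 20-24 MHz per-unit range
```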
does arc use the nvidia or amd power sense method?, if it's nvidia, we all know what you can do...
To me the topmost slider with the unknown unit looks to be clock bins.
I understand Arc has a ways to go but it will be interesting to see what changes are made with both the "LE" cards (wish they could have come up with something better than Limited Edition/hope they change it in the future) and also how the AIB cards perform early next year.
It seems like Jay was having some fun with overclocking the card, with the understanding that it's Intel's first attempt and knowing improvements will be made if they want to compete. From everything I've seen, Tom Petersen seems like he genuinely cares about making Intel's graphics cards a viable third option, and I hope he succeeds, since the more competition, the better for all of us.
LE is Intel's naming for "Founders edition". Except that the GPU performance is identical to partner models. Also only available as the 16GB model.
Two things made me think it was the LE he was OCing: GPU RAM usage being over 8000MB at 49% utilisation.
Also, at the 0:57 mark he gives the system specs, where he says it is the LE card.
Wonder if NZXT CAM can recognize it. I actually use CAM over Afterburner. No particular reason I use CAM, I just do. Works fine for manual GPU OCing. Shunt mods? 🤔
10:30 --> Try clicking the "?" icons; they might show more information about what the settings do in the Intel software.
Try the overclock in other apps, like 3DMark and other games, and also at 1080p. Are the other apps/games also insensitive to the overclock?
what monitor is that Jay!!!!
Jay, will you be doing an Arc follow up around March/April 2023 to see how the latest drivers by then have improved the overall impressions of the A750/A770? It would be about 6 months since it launched.
1:24 And a giant glass panel choking the airflow in the front
15:15 Is that a bug, or what happened to make it go to 3267MHz? :o
What does the ? next to GPU Performance Boost say?
Is this why the driver for Xe-LP got pulled from the download server a few days ago? To fix driver bugs?
(Xe-LP is the GPU in Tiger Lake & Alder Lake mobile CPUs, and it's fairly good)
Oh man, it is SOOOO much nicer when you do a video with monitor footage on a giant screen like you did today. Please keep using it. Just gotta keep the camera from moving so much. 😜
Why such a tiny monitor Jay?
I'm barely able to see the screen!!
I wonder how a light underclock would make it perform at a lower wattage
Do you think a 750 psu would be enough to overclock a 3070 founders?
Fun fact: I installed the ARC drivers just to see if that control panel would work for me, and it did. I'm running an ultra-slim laptop with Intel Iris Xe graphics, and the normal Graphics Command Center kinda sucks; it doesn't really give you anything to adjust. Of course this iGPU does not allow for any tweaking, but I still get that nice overlay with some telemetry I did not have otherwise, like voltage (up to a whopping 900mV) and GPU clock speed (smashing 1300MHz).
I did have to re-install the Xe drivers afterwards but I still have the ARC software even if it's just a nice gimmick.
I'm planning to build a gaming PC over the next couple of months and your content has been super helpful. I'm also considering trying an intel GPU for my very first build, if they become available here for a reasonable price that is. Keep up the great work!
Can you tell me something? Do the Intel GPUs have something like AMD FreeSync or NVIDIA G-Sync? I would buy one, but without anything like that it makes no sense for me and my G-Sync monitor...
For the ones at the back of the class... what monitor is that, bud?
7:23 Could someone explain this? I get that more power means more clock. But how does the clock go down on its own to stop the power going over the limit? Shouldn't it be basic: you need X power to reach Y clock; if you have the power you get the clock, and if you don't have the power you don't get the clock. But how does the clock lower itself so the power doesn't go over the limit?
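What the card is doing is dynamic voltage/frequency scaling: it continuously estimates its own power draw and steps down its V/F curve whenever the draw would exceed the limit, since dynamic power scales roughly with frequency times voltage squared. A simplified governor sketch (all points and constants are invented for illustration; real firmware reacts in microseconds and is far more sophisticated):

```python
# Simplified power-limit governor: the card picks the highest V/F point
# whose estimated power draw fits under the configured limit.
# Power model: P ~ K * frequency * voltage^2. Numbers are illustrative only.

VF_POINTS = [  # (frequency MHz, voltage V), highest to lowest
    (2400, 1.10), (2300, 1.05), (2200, 1.00), (2100, 0.95), (2000, 0.90),
]
K = 0.086  # fitted so the top point draws just under 250 W in this toy model

def power_watts(freq_mhz, volts):
    return K * freq_mhz * volts ** 2

def governed_clock(power_limit_w):
    """Highest clock whose estimated draw stays under the power limit."""
    for freq, volts in VF_POINTS:
        if power_watts(freq, volts) <= power_limit_w:
            return freq
    return VF_POINTS[-1][0]  # floor: lowest point on the curve

print(governed_clock(250))  # 2400 -> enough budget for the top clock
print(governed_clock(210))  # 2200 -> clock drops to stay under the limit
```

So the "X power for Y clock" intuition is right, but it runs in reverse: the firmware works backwards from the power budget to the clock it can afford.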
I ran the Shadow of the Tomb Raider Benchmark with FidelityFX and XeSS, because unfortunately FSR 2.0 isn't supported in the benchmark, on my XFX 6700xt. XeSS provided a substantial boost. I want to see how it would match FSR 2.0 though.
Strange, I ran XeSS on an RX 6700xt with 1440p and it was -5% slower than native, where FSR was +5% faster with quality 80% resolution.
Others reported similar findings.
I think Jay is probably one of the few out there that understands this product. It's not gonna make waves out of the box, but it has the potential to, and that's what's good about it; it's a toy for the ones who want to tinker with things right now.
I would be curious to see how it performs under XOC. Power shunt mods? LN2?
With drivers on this state?
Can you put timeline on your vids? Thanks❤️
In Europe I still don't see any Intel GPUs in stores... are they already selling in America?
I think it just has a more aggressive v/f curve if you move that slider. Any extreme OC on those cards yet?
Late reply, but ScatterBencher OC'ed the A380 to 3100 MHz about three months ago. Shamino has also just released a tool that allows for locked frequency/voltage overclocking on Arc.
Word of warning though, the tool is in a very beta state. Setting the voltage on my ASRock A380 to anything leaves it at 1.15V. The power limit either obeys the 66W cap set by Intel or completely disregards it and goes for the any% WR to 100°C. All the while, the PL1 limit is hit and the clocks drop to 2150 MHz if I go above 2500 MHz, and I freak out about a potential house fire in the making. And there's no way to switch between locked and offset overclocking, so you'll have to restart your computer to do so.
If you have more experience/time/resources for overclocking, then you can take a stab at it. I've managed to get the A380 a score of 7610 in Superposition 1080p Medium using the Intel OC tool (58% offset and +20 mV for about ~2650 MHz), but it was pretty much unstable there. Anything else, other than Cyberpunk oddly enough, didn't play nice with a >50% offset either. Furmark in particular limits me to about a 34% offset. I'm not going to be a trailblazer in setting my computer on fire with Arc. I'm too broke to buy another GPU right now.
Is EVGA going to be an Intel ARC partner?! That would be such an awesome slap in the face for Nvidia.
It's likely that EVGA has a non-compete clause in their agreement, and if that's the case then it will be a while before they could even think of doing it.
@@ColdRunnerGWN Actually they don't have such clause. There's a GN video about that. It's one of those after the initial reveal video.
EVGA knows better than joining Intel ARC's clownfiesta, but they are really missing a huge opportunity by not wanting to go with AMD Radeon. If you are EVGA and would rather downsize and restructure to focus on other product segments than jump into bed with AMD on favourable terms, then that says a lot about EVGA's CEO.
@@terribleatgames-rippedoff Thanks for the info. I didn't watch the whole video, so I missed that. I'm surprised that they didn't, as a lot of companies don't want you just jumping ship like that.
I wonder if an undervolt would help
There a reason you're using a wall-sized monitor?
ok now that is what i call a monitor
Exactly my thoughts! And Jay's a big guy, and he looks small compared to that behemoth. I bet he got it to compensate for the "small" 4090.
How big is that monitor anyway?
Afterburner doesn't work; what about Precision X or other OC software?
What is that monitor on your desk?! IT'S HUGE!
Hey Jay, would you ever sell a 2080 Ti? I've been trying to get back into gaming, so I bought an EVGA 1070 Ti, but I'm running everything with a 500W PSU and an AMD FX-8350. I believe my CPU is on its deathbed. It's running at 100% as soon as I start the PC, and it's starting to get extremely hot during idle or gameplay, around 165-175 F. I'm planning on getting a new system with an Intel i7-7700K, but I want to start streaming as well. I don't want to run my 1070 Ti into the ground with what I want to do.
please make a video of Intel arc in current date and talk about what's changed and what's not
can it not be undervolted?
Nice to see another man using a 55”+ as a monitor 💪
I’m using an LG C1 55”
In 2011 Ćuk came up with a better VRM design, the 2-phase Ćuk-buck2 VRM. But the graphics card manufacturers (and the motherboard manufacturers) prefer the 1920 design (60% efficiency) to the 2011 design (99.5% efficiency). The newer design is also cheaper, smaller, has less ripple, and has a better transient response. Plus, in the 1920 design, if any FET shorts, the GPU voltage goes to 12 V.
I have a 3080 FE and an Elgato Mk.2 capture card.
Can this Intel Arc encode for my game capture card using the Elgato 4K utility app?
I want my 3080 FE dedicated to just playing games.
That GPU do kinda be lookin fancy with that little RGB stripe.
Why don't I have the performance tuning thing?
same
I'd still support ARC if I can. Sure, it lags behind both AMD and NVIDIA, but they're just starting out. Also, it's good to have a third option when buying GPUs in the future.
The power limit on these seems to be the main issue. I would say that undervolting is likely the way forward with these 1st party cards, similar to how my MSI 3080 is hard locked to 330W max, when other ones have a higher limit. If I want the most performance out my 330W, I change the voltage curve in MSI Afterburner in order to clock up to ~1900-2000MHz. Simply adding more to the GPU slider, and pumping up voltage increases heat, and gives me lower clocks. I think if these cards can be made to work with Afterburner, and have editable voltage curves, this should be what you all test next.
Just a shame that Nvidia also took 100% user volt control away from us a few years back :(
These modern cards have so little power headroom that tuning for power efficiency often becomes the most important. Personally I've become really fond of playing voltage limbo and even slight underclocking after I realized that I could get my card running cool and whisper quiet at 85% power, with only a miniscule performance sacrifice. Easily worth it in most games.
how do you know it will be passed to AIB's
The Acer BiFrost OC software lets you change fan speed, but doesn’t let you change voltage.
Does it undervolt?
Should try a Xeon E5, because mine works. Confused too, by the way.
I really hoped that Intel would come out with a lot of VRAM, but this is quite good as well.
Were you considering doing a test of the AV1 encoding capabilities? I saw that even the A380 was impressive with it, would be interesting to see how the 750 and 770 perform.
Both cards have the same onboard encoder controller I believe so they should be the same
380 and 770/750 I mean
@@shiftyidog- it would be viable to have the 380 solely for encoding footage, leaving whatever other graphics card you have completely unaffected
They have the outline for a lot of great things, they just need to focus on the execution. And fan control. Focus on giving the user fan control.
I feel like custom BIOS is gonna be really powerful on this first run of Intel cards
that's gotta be the 48" right?
Ugh, I forgot: what's the monitor he's running in this video again?
Did you try Intel XTU for overclocking? Since XTU is an Intel tool, you may have better luck...