The Ultimate "Should You Buy An Intel Arc GPU?" Video! | HD 2560x1440

  • Published: 10 Oct 2024

Comments • 194

  • @mutawi3i • 1 year ago +24

    Arc is a good card. As a former diehard overclocker and now a parent of 2 little kids, I like the A750 for 280 euros. The A770 is similar in performance with double the RAM. Nice video, m8. Thank you.

    • @gamesushi • 1 year ago +6

      Thank you! I agree, for the performance you get at such a low cost, it's mind-boggling. I'll be picking up a Battlemage GPU when it launches. Very excited about it.

    • @mutawi3i • 1 year ago +3

      @@gamesushi Yeah bro, I get u. There is something about it that gives me old vibes of the Voodoo 3 and TNT 2 era. If u fix a black screen issue it's so satisfying. Tip: if u change ur wallpaper to a solid color, any color, it will decrease ur idle draw by some wattage.
      Greetsz
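
      For anyone who wants to script that wallpaper tip, here is a minimal sketch, assuming Windows with Python's standard ctypes/winreg modules; the helper name and the black default are illustrative, not something from the video:

        # Sketch: switch the Windows desktop to a solid color, which some
        # Arc owners report shaves a few watts off idle GPU power.
        import ctypes
        import winreg

        SPI_SETDESKWALLPAPER = 20
        SPIF_UPDATEINIFILE = 0x01
        SPIF_SENDCHANGE = 0x02
        COLOR_BACKGROUND = 1  # system color index for the desktop

        def set_solid_background(rgb=(0, 0, 0)):
            r, g, b = rgb
            # Persist the background color (space-separated "R G B").
            with winreg.OpenKey(winreg.HKEY_CURRENT_USER,
                                r"Control Panel\Colors", 0,
                                winreg.KEY_SET_VALUE) as key:
                winreg.SetValueEx(key, "Background", 0, winreg.REG_SZ,
                                  "%d %d %d" % rgb)
            # Apply the color immediately (COLORREF is 0x00BBGGRR).
            ctypes.windll.user32.SetSysColors(
                1, (ctypes.c_int * 1)(COLOR_BACKGROUND),
                (ctypes.c_uint * 1)(r | (g << 8) | (b << 16)))
            # Clear the wallpaper so the solid color actually shows.
            ctypes.windll.user32.SystemParametersInfoW(
                SPI_SETDESKWALLPAPER, 0, "",
                SPIF_UPDATEINIFILE | SPIF_SENDCHANGE)

        if __name__ == "__main__":
            set_solid_background()  # plain black desktop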

    • @gamesushi • 1 year ago +2

      @@mutawi3i I didn't know that! Going to pin this comment, as anyone wanting to decrease idle wattage will want to see this. Thanks! :D

    • @NamTran-xc2ip • 1 year ago

      With that price I would just get the RX 6700.

    • @gamesushi • 1 year ago +1

      @@NamTran-xc2ip Due to price drops I actually picked up a 6800 XT, which I will talk about in my next video coming out tomorrow. :)

  • @soldiersvejk2053 • 1 year ago +21

    On spending $200 on an Arc A750: consider it as having spent $100 on an Intel collector's item that is actually useful, and another $100 on a discount coupon for any future Nvidia/AMD graphics card purchase. If Intel stops making graphics cards, your Arc will become a real collectible and you can sell it to some random retro-tech YouTuber a few years later for $300. If Intel keeps making graphics cards because of the support, you won't be charged $499 for a mid-range xx60-tier graphics card by Nvidia/AMD. Looks like a win-win situation to me.

    • @gamesushi • 1 year ago +2

      That's one way to put it! All in all though, I do feel like the current Alchemist line of Arc cards is a good value. Specifically, I feel like the A770, A750, and A380 each have their own unique purpose and niche beyond just gaming, which is something I will actually cover in my next video. Thank you for stopping by and commenting. I really appreciate it!

    • @fujitsubo3323 • 1 year ago +4

      I don't think they will ever become a rare collector's item. Even if Intel stops selling GPUs, they still pumped out enough cards that they'll hardly be rare. It's like calling the GameCube rare.

    • @Mr-Clark • 1 year ago

      @@fujitsubo3323 You ever see how much a good-condition GameCube sells for?

  • @casualpcreviews • 1 year ago +14

    Great video. I am very happy with the A770. It is in my benchmark and secondary work PC. I honestly have not run into too many issues, mainly with Dead Space, Jedi Survivor, Aragami 2, and BF2042.

    • @gamesushi • 1 year ago +5

      Thank you! I also love my A770. Upgrading from a GTX 970 to this, and now being able to game at 1440p high/ultra settings depending on the game, and sometimes even high settings at 5120x1440, is just insane.

    • @Mr_Smarty_Pants • 1 year ago +2

      You use the A770 successfully with Dead Space 2023? I'm thinking of buying this card.

    • @gamesushi • 1 year ago +2

      @@Mr_Smarty_Pants I have not tried it yet, though I heard performance was recently fixed. GraphicArc has a video on this called "Intel FINALLY Improved Dead Space Remake Performance on Intel Arc GPUs" if you want to check it out.

    • @Mr_Smarty_Pants • 1 year ago +1

      @@gamesushi Thx I appreciate it, I'll look for that video.

    • @casualpcreviews • 1 year ago +3

      @@Mr_Smarty_Pants There have been huge improvements with Dead Space. Before, it was not even playable. I uploaded some recent videos that include Dead Space. @GraphicArc has videos that show before and after it became playable.

  • @tauheedulali2652 • 1 year ago +7

    Resizable BAR is also listed as AMD Smart Access Memory or Clever Access Memory on some motherboards, if it isn't showing up under that name.
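
    If you're on Linux and want to sanity-check that the big BAR actually took effect, here is a minimal sketch, assuming lspci is installed; the device matching and parsing details are illustrative assumptions:

      # Sketch: list BAR sizes for Intel display adapters from `lspci -vv`.
      # With Resizable BAR active, one memory region should span the whole
      # VRAM (e.g. size=16G) instead of a small 256M window.
      import re
      import subprocess

      def arc_bar_sizes():
          out = subprocess.run(["lspci", "-vv"], capture_output=True,
                               text=True, check=True).stdout
          # lspci separates devices with blank lines.
          for block in out.split("\n\n"):
              first = block.splitlines()[0] if block else ""
              if "VGA" in first and "Intel" in first:
                  sizes = re.findall(r"Region \d+: Memory.*\[size=(\w+)\]",
                                     block)
                  print(first)
                  print("  BAR sizes:", sizes or "not visible (try sudo)")

      if __name__ == "__main__":
          arc_bar_sizes()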

    • @gamesushi • 1 year ago +3

      Thank you for adding this info to the conversation! Really good info to know. Also thank you for watching I really appreciate it. :)

  • @pregorygeck6605 • 1 year ago +2

    I have both the 750 LE and 770 LE. Bought them new. The 750 was £150 from FB marketplace. Picking up my official Intel Arc neon logo. Can't wait. 😀

    • @gamesushi • 1 year ago

      I like the neon sign, it's pretty sweet.

  • @wjack4728 • 1 year ago +6

    Great video! I've had an Arc A750 for about a month, and had many problems at first. I would get black screens whenever I used the HDMI port, so I went to DisplayPort. I also had a severe problem with crashing when the screen would wake from sleep. I did the DDU thing to uninstall my AMD graphics drivers before installing the Arc A750, but still had problems. I also could not get Arc Control to install no matter what I did. I went ahead and bit the bullet and reinstalled Windows 10, and all problems disappeared. It's running great now, and Arc Control installs fine. I'll definitely try the idle power solution you talked about. Thanks for the info on the Arc GPUs.

    • @gamesushi • 1 year ago

      Hello and thank you!! I'm very glad to hear you sorted out your issues, especially the waking-from-sleep one, which can be incredibly frustrating to deal with. It's interesting that Arc Control wouldn't install for you at first. I have heard of a couple of users having that issue, though in their case I believe it was due to an old Nvidia driver that they never uninstalled, and using DDU fixed it; it sounds like you already tried that, though. I'm happy to hear reinstalling Windows did the trick!

    • @wjack4728 • 1 year ago +2

      Just did the idle power solution you were talking about, and it lowered my idle power from 41 watts to 6-14 watts. Thank you, sir! I have an HP 24" 75 Hz office monitor.

    • @gamesushi • 1 year ago +2

      @@wjack4728 Hooray! Glad it worked! :D

  • @Obie327 • 1 year ago +3

    Very informative, thanks for the detailed thoughts and the heads-up on possible quirks. I've owned the Intel Arc A770 LE 16GB since mid-March (Ryzen 7 5800X/B550 Plus). I personally haven't had too much trouble besides getting ReBar to enable through the app when I first got the GPU. Arc recently received an update for fan speed control. I really enjoyed your content, and thanks again, GameSushi.

    • @gamesushi • 1 year ago +1

      Oh nice, we have very similar system specs, except I have the 5600X. That update was pretty nice, wasn't it? I did play around with the fan control a bit, not too much, but it came in handy when I had to rebuild The Last of Us shaders lol. Instead of watching the card climb to 90C, it stays in the 70s with how I set my curve. I'm glad you enjoyed it and, more importantly, thank you for watching! For real, I really appreciate it. :D

    • @Obie327 • 1 year ago +2

      @@gamesushi Most welcome! Yes, it was nice getting these new features. I was kinda worried about heat with only the auto setting available (84C). I was wondering about the power limit option in the settings. It goes to 228 watts when slid all the way to the right. What do you set yours at? Mine is at 220, I think. I'm very pleased with my purchase over the 7-year-old Pascal before it. Happy computing! Peace!

    • @gamesushi • 1 year ago +2

      I normally just leave mine on default settings, so my slider is usually at 190, mainly because setting the wattage slider to 228 made my card hit 90C while playing games, especially The Last of Us. Now that we can change fan control I may use the tuning sliders more. The setting I use sometimes is top slider at 20, next one at a 5 offset, wattage to 228, temp to 90C, but now I also set the fan curve a bit aggressively, so that at around 70C the fans are running around 55-65% and increase with temp from there. That's pretty loud, but it keeps the card cooler; due to the noise I don't do it very often lol. All in all, not a super aggressive tune, but I haven't seen much of an fps increase from tuning it higher than that. Honestly, I haven't used it much since fan control was released. Like I said, I normally just stick to default settings.

  • @sidburn2385 • 1 year ago +5

    Thanks so much for this deep dive on Arc. Sadly the bigger YT channels just haven't covered this card enough, which is a shame. I'm on the Gigabyte B550 Vision, which is nice, currently still on a 5700G. I'm running Linux as well, so thanks for the Reddit tip. I will also use Win11, but mainly Linux. 👍

    • @gamesushi • 1 year ago

      Thank you, I appreciate the kind words. Awesome, we both have the same motherboard! How do you like the 5700G? I used to have a 5600G, and it was nice to have in the event I wanted to troubleshoot without a GPU plugged in. I only upgraded to the 5600X for the bigger cache size, and it was on sale at the time.

    • @sidburn2385 • 1 year ago +1

      So far the 5700G has been great, although I haven't thrown any games at it. Primarily I work on server hardware; that's one reason for the Vision board, it's capable of ECC RAM, but I have G.Skill B-die for overclocking purposes. After over 10 years of absence I'm returning to PC gaming. I might still stick it out until Battlemage is around and keep my 5700G until then. My new build isn't complete yet; I'm placing my custom reservoir order over the weekend. This time around I'll be using a 5" prefilter in the Aqualis ECO 450 ml. My open-bench Raijintek Enyo is packed with four 480 rads and one MoRa 360. The reservoir will be placed at the highest point on top, where otherwise a rad would go. My monitor upgrade might be a 32" Samsung G6. My audio system will be a Motu M4 paired with Q Acoustics 3020i. Until then, eyes open for deals.

  • @westwonic • 1 year ago +1

    Many thanks, my Arc A770 LE arrives on Monday next week. U have answered many of my queries. Good job, excellent presentation.

    • @gamesushi • 1 year ago

      Thank you! I appreciate the kind words and I'm glad you were able to get some information out of it. :)

  • @MrMunkyMeat • 1 year ago +5

    I have been using the A380 as a second, dedicated encoder card. My main is an RX 6900 XT. For this purpose, I have found it to be exceptional. I admit that I have not used it as a "main" card, so I cannot speak to game performance. If you already have a decent card but just want the AV1 encoder, you cannot beat it as a drop-in card.

    • @gamesushi • 1 year ago +3

      Earlier in the year I was on the fence about getting a 6700 XT or 6800, but the onboard encoder was a deal-breaker for me. I wanted AV1; I would have also settled for NVENC, but Nvidia's prices are insane. I know there are people who use the AMF encoder for capture and streaming in OBS, but it isn't up to par with what Intel or Nvidia offer. So that's awesome you are using the A380 to encode, that's a really, really good idea. You, sir, are a gentleman and a scholar.

    • @MrMunkyMeat • 1 year ago +3

      @@gamesushi It just seemed like a no-brainer. There are a lot of people who bought cards in the coof times for way more than they should have. And this current gen of GPUs isn't that huge of a leap from the last, with the exception of the encoder. So, a $130 drop-in encoder? Yes I will! The hard part is ensuring you have a motherboard that supports at least 3.0 x16 on the second PCIe slot. I am planning on building another system with a 13th-gen i9 in the next few months to really test the encoder.

  • @TimLongson • 1 year ago +6

    The best budget graphics card for gaming is the Intel Arc A770 16GB version - it's the cheapest way to get that vital 16GB of VRAM that you NEED for the latest games in 2023. AND you get AMAZING hardware for that price, so as newer and newer drivers come out it will just get more and more powerful; great future-proofing.
    There are plenty of current games that require MORE than 8GB of VRAM for even MEDIUM settings with ray tracing, like:
    Resident Evil 4,
    Hogwarts Legacy,
    The Callisto Protocol,
    A Plague Tale: Requiem,
    etc.
    And if you want to play these games at 1440p ultimate settings, or 4K, then 16GB is the entry point for acceptable new GPUs. Game developers are saying that new games, even later THIS year, will need at least 12GB to run even 1080p at just medium settings, which is entry level without any future-proofing, so a poor investment.

    • @gamesushi • 1 year ago +3

      Very well said! I agree VRAM is becoming more and more of an issue. I feel like anything less than 8GB in 2023 and beyond is going to start feeling pretty sad when it comes to newer AAA titles. The only way around it would be to stay at 1080p, which is probably fine for most people, so at 1080p 8GB cards should be viable for the next 2-3 years as long as settings are at medium or high, with some compromises here and there. Again, this would only be on new AAA titles. I think the A750 is still a really solid option at 8GB, especially at only $250. The video encoder on these Arc cards is miles better than what AMD offers and just as good as Nvidia's NVENC, if not better. That's my opinion though. I just can't say enough good things about these Arc cards. They aren't perfect, but they are undeniably really, really good alternatives in the current GPU market. I can't wait for Battlemage!

    • @TimLongson • 1 year ago +2

      @@gamesushi We're nearly halfway through 2023. It's been a long time since 1080p gaming was "cutting edge", and about 10 years since 1080p gaming on anything bigger than a 14-inch display was considered good. If you spend as much on JUST a GPU as on an entire gaming console, you should be able to manage at least 4K at medium settings and get 60fps (not including frame generation smoothing) for that kind of money. On displays bigger than a laptop you want to be aiming for at least 1440p. The Arc A770 16GB very recently got yet another huge performance boost with a new driver, and in most DX12 games the performance of the Arc A770 16GB is sufficient for decent 1440p@90fps or even 4K@60fps with slightly reduced details, not everything on 'ultra' but mostly medium to high, also with FSR/XeSS enabled where available.

    • @gamesushi • 1 year ago +2

      Lol, I remember 14-inch displays; my first LCD was about 14 inches and I thought it was the business. I agree 1440p is where it's at in terms of graphical fidelity while still giving good performance on a lot of different mid-tier hardware configs. With that said, given that 1080p is still the most widely used gaming resolution, I feel it's safe to say that 2-3 years from now it will likely still be around and probably still the most common resolution. You said it, the A770 got a big boost with this new driver, and I thoroughly believe we haven't seen it hit its final form yet. It's a solid 1440p GPU, and in some games it's a solid 4K experience too, obviously not all games at 4K though. The A770 really surprised me with Atomic Heart: at 5120x1440 Ultra with FSR Quality I get 120-130 fps. That's kind of crazy for only $350.

  • @TechGuyBeau • 1 year ago +5

    Tis a good video my dude. An asset to the community.

    • @gamesushi • 1 year ago

      Thank you! :D

    • @Blindmanex • 1 year ago +2

      I think it's fair to say it takes a special breed to buy Arc. If you just want a GPU that's plug and play, this isn't it. Save your time, buy a different GPU; spend a little more and it's fine.
      I would make a video out of my response, but out of respect I am just going to dump my thoughts here.
      Yeah, I own an A770 and was directed here from the Reddit post.
      Yeah, sure, I could have dropped $600 or even $1k on a graphics card, but I chose the A770 because I am a masochist. Sure, you want to YOLO it, then do it…
      In the end you have to ask: is this worth your time?

    • @gamesushi • 1 year ago +3

      @@Blindmanex I think it's certainly worth the time for the tech-savvy or those that don't mind troubleshooting. For those less comfortable troubleshooting, or with less familiarity around hardware, probably not. Everyone is different; that's why I wanted to make sure to cover some big negatives and not only focus on positives. I'm happy with my purchase, though I know I'm not everybody. Needless to say, I will be buying Battlemage when it comes out.

  • @gareth4348 • 1 year ago +2

    Hoping that a lot of these issues can be fixed in later generations, along with better performance, because it would be sick to have one of these as my primary GPU.

  • @mikevanvolkenburgh5348 • 1 year ago +1

    I got an Arc A770 16GB in Feb '23 and love it, it works great. I run an ASUS TUF Gaming VG27VH1B 27" curved monitor, 1080p Full HD, 165 Hz, with no problems.

    • @gamesushi • 1 year ago

      Awesome! Glad you aren't having any issues! Thank you for commenting and sharing that.

  • @billchildress9756 • 1 year ago +1

    I bought an A770 16GB in January and it has been getting better with each driver update! I've been running it on an AM4 system with a 5800X3D, and I'm saving it for a planned Intel build later.

    • @gamesushi • 1 year ago

      Oh nice, that's the CPU I want to eventually upgrade to; for now the 5600X does okay. Awesome, I wish you good luck on your build. The A770 is going to look great in your system, it's one of the best-looking cards to have come out in a long time. Then again, I really like the look of my old EVGA GTX 970 SSC, such a clean and subtle look; the A770 reminds me a lot of it, except with really well done RGB lol.

    • @billchildress9756 • 1 year ago +1

      @@gamesushi That CPU is the first new one I bought in 30 years. It came out last April, and back then I could afford to do that. I got an MSI X570 motherboard off eBay about 2 months earlier for 100 bucks. I dropped the CPU in it and the damn thing wouldn't POST until I updated the BIOS. It does not OC; it doesn't need to. And now you might find one at a better price than what I gave for mine. I think it's better than the 5950X myself. 3600 RAM is the best for it too; Corsair Vengeance memory was my choice. Every review seems to say the same, and I agree that it's better than the 7800X3D. AM5 seems to have too many issues.

    • @gamesushi • 1 year ago

      @@billchildress9756 Wow, very interesting! I had to flash my B550 before I could use my original 5600G also. 100 bucks for a motherboard is a pretty good deal. Yeah, I'm in no hurry to get on AM5. There were bound to be issues given that it's AMD's first LGA design. The 5800X3D will be my next upgrade. :)

    • @kapilr.4768 • 1 year ago +1

      That 5800X3D is a beast. Upgrade to something like Intel 15th or 16th gen when good DDR5 kits are available.

  • @dmitriyp7701 • 1 year ago +1

    Thank you for such a detailed review! I also have the A770 LE.

    • @gamesushi • 1 year ago +2

      Oh nice! The A770 LE is the best looking model IMO.

  • @98LuckyLuk • 1 year ago +2

    My idle power draw went down by connecting one of my two screens directly to the motherboard. This only works if you have an iGPU, but it lowered my idle GPU power from 45W to 25W.

    • @gamesushi • 1 year ago

      That's a good workaround for those with an iGPU. Thank you for sharing! :D

    • @shargakun • 5 months ago

      Yes, I can confirm that. I have an A750 and the iGPU on an i5-12400, and while reading the comments here and writing this, power consumption is 3 W. Using only one monitor, a 165 Hz 1440p ultrawide, with the DisplayPort cable plugged into the motherboard.

  • @saschapurner9579 • 1 year ago +1

    I got 2x A750 and 2x A770. I have been a computer user for 25 years now. Love those cards. Yesterday I bought a used computer to drive them, with a Z170 and a 6700K. Yes, I think I am able to activate ReBar on that old beauty too. ;-)

    • @gamesushi • 1 year ago +1

      That's awesome! You're probably a lot more familiar with the Arc GPUs than I am then, since you own 4 of them! Very cool. I'm not an expert on Intel motherboards, but I do believe MSI/ASUS added ReBar support on certain older boards. If you do give it a try on the old system, let us know how that works out! I do like the 6700K, it's a pretty solid CPU. My son has a 4790 (non-K version) in an old Dell OptiPlex system that we turned into a gaming PC, and it works great. I appreciate you checking out the video. Thank you!!!

    • @saschapurner9579 • 1 year ago +2

      @@gamesushi I took a closer look at your channel and I think I will support you now, from my special-case point of view. If I learn some things about gaming on those chips with Linux games only, I'll ask you how to share my data with you so we can provide it to the community. Salve, aka Alucian

    • @gamesushi • 1 year ago

      Wow thank you Sascha! That would be wonderful! Thank you so much! I also really appreciate the support! :D

  • @galacticzombie1942 • 1 year ago +1

    Thanks for this informative video. I look forward to watching more of your stuff. I am curious about the Arc cards. It really seems like it's coming along. Windows performance seems a bit further ahead than Linux, and I'm on Linux, so I'm just waiting and keeping tabs on it to see how things go.

    • @gamesushi • 1 year ago +1

      Hi! Thank you, I really appreciate you. Arc is definitely coming along. I think when Battlemage comes out things will be a lot more stable across the board, for both Windows and Linux users. Though I feel like the first run of Alchemist cards will still probably have many of the same power-saving/wake-from-sleep issues, due to how they designed it, but hey, not bad for a first try all things considered. What's important to me is that Intel is listening to its users; on the official Intel forum, when someone posts an issue they really go the extra mile to help troubleshoot. Literally on every post related to Arc I see this, and the same can't be said for Nvidia or AMD. To me, that level of support goes a long way.

  • @grraf1 • 1 year ago +1

    Using the A750 on an Arch Linux machine with no issues and excellent performance... Most that struggle with issues under Linux do so because they are not using the kernel 6.x branch, where we got official full support for it.
    PS: I'm using HDMI, never had any issues...

    • @gamesushi • 1 year ago

      Thanks for sharing this Grigore. I'm glad you haven't had any issues with HDMI or any issues at all. I really appreciate you for watching the video and for commenting.

  • @858Markus • 1 year ago +1

    On the idle power draw: I have a 75 Hz monitor set at 75 Hz, and the power draw is 8-9 watts at idle.
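
    If you want to log idle draw yourself on Linux rather than eyeballing a monitoring tool, here is a minimal sketch, assuming a kernel new enough (roughly 6.2+) to expose the i915 hwmon energy counter; the hwmon lookup and the 10-second loop are illustrative assumptions:

      # Sketch: derive the card's average power from the hwmon energy
      # counter (microjoules), printed once per second.
      import glob
      import time

      def find_energy_file():
          for name_file in glob.glob("/sys/class/hwmon/hwmon*/name"):
              with open(name_file) as f:
                  if f.read().strip() == "i915":
                      return name_file.replace("name", "energy1_input")
          raise RuntimeError("no i915 hwmon energy counter found")

      def watch(seconds=10):
          path = find_energy_file()
          with open(path) as f:
              last = int(f.read())
          for _ in range(seconds):
              time.sleep(1)
              with open(path) as f:
                  now = int(f.read())
              print(f"{(now - last) / 1e6:.1f} W")  # uJ per second -> watts
              last = now

      if __name__ == "__main__":
          watch()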

    • @gamesushi • 1 year ago

      That is awesome! I think someone else also commented and said they too get the power savings at 75 Hz. What resolution are you running, by chance, and what size is your monitor?

    • @858Markus • 1 year ago +1

      @@gamesushi Monitor: Philips 246E, 75 Hz at 1080p; i5-13600KF, Gigabyte Z790 D board, F4c BIOS, 32GB G.Skill Aegis 3200MHz RAM, Arc A770 LE 16GB (non-OC), 4369 driver, but it worked with previous drivers too.

  • @wille84fin • 1 year ago +2

    Interesting. I've had some (wake-from-sleep) black screen issues myself, though I'm using only a single 38" ultrawide (3840x1600) 144 Hz monitor in 120 Hz 10-bit color mode (HDR). It stopped when I stopped using sleep mode. I should check whether it's still a thing with the latest beta drivers. NVMe drives boot so fast that disabling sleep didn't ruin my day; it might for someone else though. My system is a 12900KF, A770 LE 16GB, Z690-i, 64GB DDR5 5600, 2x2TB Kingston Fury Renegade NVMe, 1000W PSU, Dell AW3821DW, and Windows 11 Pro.
    Btw, your monitor is a super ultrawide, I think.

    • @gamesushi • 1 year ago +1

      Thank you for sharing your experience and the great advice. I will check to make sure I have sleep mode disabled; I could have sworn I did, but I'll double-check when I get home. It would be awesome if that fixed my black screen issue. Ah yes, you're right, I goofed and just referred to it as ultrawide.

  • @h.barkas1571 • 1 year ago +1

    What would put me off is the fact that the shroud on the Intel cards is glued to the backplate, making a teardown difficult. Cards made by AIBs don't seem to suffer from this issue.

    • @gamesushi • 1 year ago

      Very true. I have seen some people watercool their A770s, and I imagine taking off the shroud was a nightmare. On the flip side, though, they are very stiff cards. Even though they aren't huge (more like GTX 970 size), they are surprisingly on the heavy side, yet regardless of the weight there is no GPU sag. I imagine this has to do with the very rigid design, partly the shroud being glued on the way it is, as well as the general construction.

  • @idocare6538 • 1 year ago +1

    Thank you very much for this! It was very helpful as I shop for a card.
    I am curious if there is a reason you keep name-dropping Google over other search engines? They weren't the first, last, or best search engine, so I am curious.

    • @gamesushi • 1 year ago

      I only recommended it since most people know what Google is. Back in the late 90s and early 2000s I used Mamma.

  • @wjack4728 • 9 months ago +1

    Just thought I'd better warn people that DOSBox no longer works with any Intel Arc card since driver 4676 about 4 months ago. From what I've read from Intel, I'm not sure if they will ever fix that.

  • @GS-kh5se • 1 year ago +2

    I had a black screen over HDMI when Windows booted. A new HDMI 2.1 cable fixed the issue.

    • @gamesushi • 1 year ago

      Awesome! I'm glad that fixed it for you!

  • @noanyobiseniss7462 • 1 year ago +4

    My first powerhouse GPU was a VLB Mach32 4MB.
    Before that, I go back to upgrading my mono to CGA.
    Bang for the buck is what matters, so Intel has a chance here.
    I owned and installed many i740 GPUs back in the day and thought Intel dropped the ball by leaving the market back then.

  • @HopelessReminder • 1 year ago +1

    I was having black screen issues where the screens would flicker on and off and restart my apps; after I set up Resizable BAR it has no issues. Well, Darktide still crashes all the time and is pretty much unplayable, but that is one game.

    • @gamesushi • 1 year ago

      Oh good, glad you got it working! Yeah, Darktide performance on my A770 is rough; I have to turn settings way down for it to be stable, but then it doesn't look very good.

    • @HopelessReminder • 1 year ago

      @@gamesushi It actually doesn't matter what graphics settings I have it at.
      It sometimes makes it to the lobby, and even less often makes it to a mission, then crashes out and just starts crashing every time I make it into the lobby of the Mourningstar.
      With how great everything else runs, I think it is just a Darktide issue and not much to do with the A770 itself.

    • @gamesushi • 1 year ago +1

      @@HopelessReminder Oh, you know what? Now that you mention it, I saw a post about this on r/IntelArc linking to a post at Fatshark. This is a new thing that started affecting Arc users. Fatshark has identified the issue and gave a loose ETA of 4 weeks for a fix, due to available resources. Sorry I forgot about this; the last time I tried Darktide was April, and this post's existence slipped my mind. Here is the post if you wanna read it: forums.fatsharkgames.com/t/intel-arc-a770-crashing-in-lobby-when-moving-edited/81428/5

  • @DevilbyMoonlight • 1 year ago +3

    Hmmm, I have been looking at snagging a 16GB one of these... but you get a high power draw with monitors above 60 Hz. So can the monitors run @60 in a separate power configuration, swapping between configs as needed? This is how I actually run my machines, kinda like switching from Ultimate Performance to max power saving... Failing that, a bat file invoking QRes might do the trick...
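
    The QRes idea works; for the script-inclined, here is a minimal sketch of the same refresh-rate swap without QRes, assuming Windows and Python's ctypes. The 60 Hz target and the primary-display-only scope are illustrative assumptions:

      # Sketch: change the primary display's refresh rate via the Win32
      # ChangeDisplaySettingsW API. Run once with 60, again with 144, etc.
      import ctypes

      ENUM_CURRENT_SETTINGS = -1
      DM_DISPLAYFREQUENCY = 0x400000
      CDS_UPDATEREGISTRY = 0x01

      class DEVMODE(ctypes.Structure):
          _fields_ = [
              ("dmDeviceName", ctypes.c_wchar * 32),
              ("dmSpecVersion", ctypes.c_ushort),
              ("dmDriverVersion", ctypes.c_ushort),
              ("dmSize", ctypes.c_ushort),
              ("dmDriverExtra", ctypes.c_ushort),
              ("dmFields", ctypes.c_ulong),
              ("dmUnion", ctypes.c_byte * 16),   # position/orientation union
              ("dmColor", ctypes.c_short),
              ("dmDuplex", ctypes.c_short),
              ("dmYResolution", ctypes.c_short),
              ("dmTTOption", ctypes.c_short),
              ("dmCollate", ctypes.c_short),
              ("dmFormName", ctypes.c_wchar * 32),
              ("dmLogPixels", ctypes.c_ushort),
              ("dmBitsPerPel", ctypes.c_ulong),
              ("dmPelsWidth", ctypes.c_ulong),
              ("dmPelsHeight", ctypes.c_ulong),
              ("dmDisplayFlags", ctypes.c_ulong),
              ("dmDisplayFrequency", ctypes.c_ulong),
          ]

      def set_refresh(hz):
          dm = DEVMODE()
          dm.dmSize = ctypes.sizeof(DEVMODE)
          # Read the current mode, then change only the frequency field.
          ctypes.windll.user32.EnumDisplaySettingsW(
              None, ENUM_CURRENT_SETTINGS, ctypes.byref(dm))
          dm.dmDisplayFrequency = hz
          dm.dmFields = DM_DISPLAYFREQUENCY
          # Returns 0 (DISP_CHANGE_SUCCESSFUL) on success.
          return ctypes.windll.user32.ChangeDisplaySettingsW(
              ctypes.byref(dm), CDS_UPDATEREGISTRY)

      if __name__ == "__main__":
          set_refresh(60)  # drop to 60 Hz for lower idle draw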

    • @gamesushi • 1 year ago +1

      Yep, some patience may be required if by chance the power draw fix doesn't work on your system, or at the very least it can be something you learn to live with. It's strange, because power draw while gaming isn't terribly high, especially compared to other cards in the same performance bracket; it's just when the dang thing idles lol.

  • @bigplaycrow3693 • 1 year ago +1

    The power draw of the 770 made me return the card and go with a 6650 XT for my 2nd rig, which at idle is pulling just 2 W on a 144 Hz monitor. Good performance in games, BUT the 8GB 770 still had worse performance in games than this 6650 XT, and at a higher cost. I bet Intel will deal with its issues, and I cheer for them; I would love to see Battlemage do a good job, which I bet it will in games, but the major concern for me is that power draw at idle. Good video tho!

    • @gamesushi • 1 year ago

      I hear ya, that's one of the reasons I stopped using Wallpaper Engine; I just can't justify a 60 W power draw when I'm not even using the thing and it's idling. I've seen many Arc users really try to like the card, and only after living with it for a while do they realize it's not for them. One game I did not count on my A770 not playing well was The Old Republic; it's a stuttering mess even on low or medium settings, which is crazy, because that's a game I had a decent experience with on my old GTX 970. I think you will be much happier with the 6650 XT. My last AMD card was a Radeon 7970, so it's been a while since I went team red, but the prices right now are insane, it's super tempting. Thank you for checking out the video and commenting, I really, really appreciate it.

  • @Jwhipification • 1 year ago +1

    Hey bud thanks for the breakdown.

    • @gamesushi • 1 year ago

      No problem! Thank you for watching! I really appreciate it. :)

  • @barcelona2170 • 1 year ago +2

    Thank you for this. I'm going to be building my first PC and have had friends who've been in the game for a while yell (sarcasm) at me to stay away. I was intrigued by these cards on paper. This video changed my mind. I'm by no means an expert, but I am comfortable taking my time and making sure I do it right. Isn't that part of building a PC anyhow? Anyways, thanks again.

    • @gamesushi • 1 year ago

      No problem, my friend. I'm glad you were able to get what you needed out of the video and that it helped you make a decision. Very true, and PC building is pretty fun! It feels rewarding when you put together your own system. :)

    • @barcelona2170 • 1 year ago +1

      @@gamesushi Of course. I might as well since the new Nvidia offerings seem lackluster in comparison to the potential of these cards should they continue to improve. Worst case I just buy another card later.

  • @bryanwages3518 • 1 year ago +1

    I will say, any AM4 motherboard that isn't from a prebuilt (Dell, HP, etc.) will support ReBar. I have an X370 Taichi and I run a Ryzen 5800X, and it supports ReBar.

  • @ovemalmstrom7428 • 1 year ago +1

    Great info, thanks!

    • @gamesushi • 1 year ago +1

      You're welcome! Thank YOU for watching, I really appreciate it. :)

  • @nodak911 • 1 year ago +1

    Have a 770 LE running 2 LG gaming monitors: one 32" 1440p 166 Hz via DP and one 24" 1080p 144 Hz via HDMI, and no problems as of right now.

    • @gamesushi • 1 year ago

      I'm glad to hear that you aren't having any issues! Thank you for sharing this; it's good to see that someone is able to run two high-refresh-rate monitors like that and it works. It gives me hope for my setup too lol.

  • @ShashankVermaa • 1 year ago +1

    Insta-subscribe, man! This video is a masterpiece.

    • @gamesushi • 1 year ago +1

      Lol thank you for the kind words, happy to have you! :D

  • @juanme555 • 1 year ago +1

    GameSushi, I have a super weird question for you regarding Intel Arc: do the Alchemist GPUs support interlaced resolutions??? Also, what do you think of my use case? It's super niche, but I daily drive a CRT monitor. I have modern displays too, but for my desktop computer I prefer using just 1 display, which is the Samsung SyncMaster 997MB.
    Nvidia and AMD no longer support interlaced resolutions, and I need interlaced scan to better take advantage of my CRT monitor.
    I love OLED, and I'm saving to buy a huge 4K OLED TV, but I don't want to use a flat panel as my desktop computer monitor. I don't want an OLED monitor, I want to use CRT monitors.
    Does Arc work well with HDMI-to-VGA adapters??? Will it display custom resolutions made for interlaced scan?
    I'm stuck with a 980 Ti because it's the last GPU that supported interlaced, and it's like 20% weaker than an RTX 3050; it's eventually not going to be supported. I really don't want to use liquid-crystal displays, please don't take offense, I just like how the picture on a CRT looks. I'm evaluating buying an A770 or just waiting for Battlemage.

    • @gamesushi • 1 year ago

      Hello, good sir! That's a super interesting question, and it piqued my interest. While I don't have a CRT I can test with, according to a thread in an Intel community forum where someone asked if Intel Arc supports interlaced resolutions, specifically 1080i, this is what an Intel employee had to say: "We would like to let you know that the Intel Arc Graphics supports interlaced mode as long as the display's panel supports it. However, it is important to mention that we can't check with the display manufacturer for further compatibility information." So the short answer is that it looks like the Arc Alchemist GPUs can support interlaced, though how to hook the Arc GPU up to a CRT was not confirmed. I would imagine HDMI to VGA should work, though like I said, I don't have a CRT to test with. Here is the thread I found if you want to see for yourself: community.intel.com/t5/Intel-ARC-Graphics/interlaced-resolutions/td-p/1464707
      P.S. In my experience, anyone using a CRT in 2023 for gaming is a big-brain individual who knows what they want. I salute you!
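
      For the Linux-curious, a custom interlaced mode can be defined by hand. Here is a minimal sketch driving xrandr from Python, where the classic 1024x768i (87 Hz) modeline timings and the "HDMI-1" output name are illustrative assumptions, not something validated on Arc:

        # Sketch: register and apply a custom interlaced mode via xrandr.
        import subprocess

        MODE = "1024x768i"
        # Dotclock (MHz), horizontal and vertical timings, sync polarity,
        # and the Interlace flag that makes the mode interlaced.
        TIMINGS = ["44.90", "1024", "1032", "1208", "1264",
                   "768", "768", "776", "817",
                   "+hsync", "+vsync", "Interlace"]
        OUTPUT = "HDMI-1"

        for cmd in [["xrandr", "--newmode", MODE] + TIMINGS,
                    ["xrandr", "--addmode", OUTPUT, MODE],
                    ["xrandr", "--output", OUTPUT, "--mode", MODE]]:
            subprocess.run(cmd, check=True)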

  • @graphicarc • 1 year ago +2

    Now try CRU at 141 Hz and save some power at a high refresh rate :) Is your LE A770 also clocking to just about 2750 when OC'd? "Will not work if you have a monitor set at a higher refresh rate than 70 Hz" is not quite true; it just saves less power over 60 fps. It won't work if you have dual displays :) If you measure the PCIe power output directly, well, that is another story.

    • @gamesushi • 1 year ago +2

      Wow, thank you for commenting! I watch your channel all the time! I have my A770 LE at normal default settings. Thank you for the info. I've had a different experience with my ultrawide, which has a built-in dock and KVM, so it could be an outlier here (Philips Brilliance 499P9H). I was previously running the Philips at 70 Hz (as a single monitor); idle was around 15-20ish watts with spikes up to 30, so the Intel fix worked somewhat. I then set it to 80 Hz to test, and idle went to 30-40 watts, so basically the Intel fix stopped working, again still in a single-monitor setup. This is around the time I decided to stop caring about power draw, which is good, because as soon as I introduced the 2nd monitor at 144 Hz it started idling at 50 watts. The Intel power fix worked best for me when I had dual 24-inch NEC displays at 60 Hz; even with both monitors I idled as low as 3 to 15 watts, with spikes up to 25ish. So I can confirm, at least in my case, that the sweet spot seems to be under 70 Hz, especially with office monitors that have a max refresh of 60 Hz.

  • @HopelessReminder • 1 year ago +1

    Mine tends to idle at 44 watts with a triple-monitor setup: two 1080p 22-inch and one 75-inch 4K.

  • @theanglerfish • 1 year ago +1

    I have exactly the same card and never thought about the ASRock models; I will go with the original again (yep, I will buy a second one). Also, for computing/science stuff the performance is another tier up; it's not a $350 card, it's like a $500-600 or even more expensive card. Btw, I am a PC builder too, so hello buddy.

    • @gamesushi • 1 year ago +1

      Oh cool! I went with the ASRock because it's a 330 mm card as opposed to the XFX Merc's 340 mm; I was able to keep my front case fans in place with the ASRock, but would have had to take them out for the Merc to fit.

    • @theanglerfish • 1 year ago +1

      @@gamesushi Of course, that makes sense, it's a bit bigger and, well... too flashy for some, but I also need a showcase PC, so that was the main reason. As a second card I will go for Intel's design because it's smaller and also has twice the VRAM. All of the A770 models are good; only the ASRock may feel a bit "cheap" in hand, but that does not matter... So Intel really shines with this architecture, their first real attempt since Larrabee (which didn't succeed), but that's a sweet name, I think. But I digress, so thank you for this video and cheers to the silicon. Btw, subscribed.

    • @gamesushi • 1 year ago

      Another place where Intel shines is editing video in DaVinci Resolve. Scrubbing through the timeline was seamless and effortless on my A770; now that I am using the 6800 XT, I notice performance is lacking in that aspect. It's really hard to beat Arc's ability to do a bit of everything, not just gaming, especially at this price point. I think I will keep the A770 but put it in an editing rig, and use the 6800 XT as a daily driver/gaming rig. Thank you so much for subbing, I really appreciate you!!!

  • @MaxMustermann-yj1wz • 1 year ago +2

    Jumped right to the conclusion. Unlucky me.
    There was none 😢

    • @gamesushi • 1 year ago +2

      Sorry, this was more of a "be aware of these things and make your own decision" kind of thing. I do think it's a good card, and I show off some of what the card can do in the intro. I really wanted to go over things like ReBar/CSM, as that's something anyone who wants to buy an Arc GPU should be aware of, and then I focused on 3 common cons I see people having issues with on the Intel Arc subreddit. Weird format, but I see enough questions about these things that I wanted to make a video about it.

  • @csh9853 • 1 year ago +1

    Mine doesn't ask to run at start. It just runs on its own.

    • @gamesushi • 1 year ago +1

      Maybe Intel fixed it then. Before it would require Admin action to either Accept or Deny. Glad to hear it doesn't ask you, this is positive news!

  • @DeathStriker88 • 1 year ago +1

    Should I buy an A770 16GB LE? I will be doing video editing for YouTube, playing indie games, playing visual novels, playing a little bit of older games, emulating PS3, and live streaming.

    • @gamesushi • 1 year ago +1

      As long as you can enable ReBar and disable CSM, I think you should, yes. It performs great in DaVinci Resolve, which used to crash for me all the time when I was using a GTX 970. Now it's buttery smooth, no issues, hiccups, or crashes. Also, in my opinion Intel has the best video encoder right now, so streaming and capturing gameplay with the A770 should work great too. As for old games, that might be hit or miss depending on whether the game is DX8, DX9, or even DX11. Depending on the game, it might run great or run poorly with stuttering. Now, every older game I have tried has been great, but that's just been my experience. To be specific, I'm talking about The Darkness II, FFXI, XCOM 2, Fallout 4, Mass Effect: Andromeda, and Hellblade with DX11. I have heard some of the older Assassin's Creed games don't run well on Arc, but some do. So if you primarily play older games, you have to be okay with some of them working and some of them not so much. Thankfully Intel is doing great on keeping up with optimization by releasing new drivers 2-3 times every month. Overall I think the 16GB LE is a great price, and for what you get it's such a good deal.

    • @DeathStriker88 • 1 year ago +1

      @@gamesushi Thanks for such helpful info. By old games I meant Microsoft Motocross Madness, The Witcher, and Road Rash, and yes, I also use DaVinci Resolve for editing. I do know how to disable CSM, enable ReBar, and do other tweaks in UEFI, so that's not a problem. In fact, this PC is going to be my life's first ever PC. I had used only 2 i3 laptops till now.

    • @gamesushi • 1 year ago +1

      @@DeathStriker88 Oh awesome, congratulations on your first ever PC then! Well, FFXI is a DX8 game, so I would imagine the first Witcher should work. Don't know about the others, but I think the A770 is worth it, especially since you are able to enable ReBar.

    • @DeathStriker88 • 1 year ago +1

      @@gamesushi Motocross Madness needs the DirectSound 7.0 API 😂 and Road Rash is a 1995 title, so I think the requirements of both could be the same?

    • @gamesushi • 1 year ago

      Wow, that sounds very old lol. I would wager it probably won't work for those titles, but it's worth a shot. If it works out, let me know. :)

  • @bgeraldc • 1 year ago +1

    I play at 1080p and have a Ryzen 5 5600 with a B550 mobo. My 3060 Ti 8GB is showing some age. I play games mostly, and I play VR sometimes. Do you think the 770 16GB would be a decent card until the next-gen Intel cards hit? Thanks

    • @gamesushi • 1 year ago +1

      VR support is lacking on the Arc cards. Some Arc users with lots of patience and tech-savviness have gotten VR working for certain titles, but it's currently unsupported by Intel, so the experience is mostly unreliable at the moment. If you have a 3060 Ti, I would advise just sticking with it until Battlemage comes out or until you can find a good deal on something else. I say that because even though the A770 has 16GB of VRAM, its performance is on par with the 3060 non-Ti in most titles at 1080p. Sure, sometimes it is on par with the 3060 Ti at 1080p, depending on the game, and it even starts to pull away from the Ti a little bit as you crank the resolution, but not by any substantial amount unless the game is very well optimized and also supports XeSS. The only major advantage I see over the 3060 Ti is that the A770 has an AV1 encoder; for capturing gameplay in OBS and working with AV1 footage in DaVinci Resolve, it's the best-performing GPU I have used for those two specific applications. The experience is a dream; it performs superbly in that respect. Super impressed. Like I said though, if you are just gaming at 1080p, I think you are better off with the 3060 Ti.

    • @bgeraldc • 1 year ago +1

      @@gamesushi Thank you. I really appreciate you answering my question.

  • @leinadreign3510 • 1 year ago +1

    You shouldn't buy one if you have a 9th-generation Intel CPU and an older board.
    My screen stayed black and the diagnostic LED for a VGA problem came on.
    The BIOS was up to date, Resizable BAR was set on, and it wasn't working.
    Obviously the board had worked with a GPU before.
    Had to send it back, sadly.

  • @fujitsubo3323 • 1 year ago +1

    I've owned an Arc A770 LE since day 1, plus many other team green and team red GPUs. I daily drive the Arc in an office PC, not doing a lot of gaming. I've played the odd game on it here and there, but not many. The ones I did try had a lot of bugs just getting the drivers to let me launch the games. Each driver update has fixed bugs and added new bugs. If I'm being honest, if I had my time over again, for the money I paid for the Arc A770 I should have just jammed a 3060 Ti in the system.

    • @gamesushi • 1 year ago

      Yeah, it's not a perfect card by any means, and for a lot of people not worth the trouble. I recently picked up a 6800 XT and will be taking a break from my A770 for a while, but I'll explain why in my next video.

  • @gaganplaysgames564 • 1 year ago +1

    Can you please help me decide between the Arc 770 and the 3060 12GB? I am really confused; I saw so many videos about gameplay, and it gets really confusing. On paper Arc looks amazing.
    Btw, I built my system 3 months ago with a new AMD Ryzen 5 7600, 32GB of RAM, and a Gigabyte B650M motherboard. I still need a GPU.
    Will Arc perform better with an Intel CPU, and will I have any compatibility issues with an AMD CPU? What about ReBar support with my current build?
    Does it perform decently in Blender?

    • @gamesushi • 1 year ago

      Arc performs well with both Intel and AMD CPUs, so you won't have any issues with either CPU combo, and your 7600 supports ReBar, which is really the most important thing. Arc would also work well with the components you mentioned. I can't speak on Blender, I don't use it, though I've heard performance is decent. I like it because it performs really well in DaVinci Resolve, and the AV1 encoder works wonders for capturing gameplay too. With that said, I believe the 3060 12GB works just as well in DaVinci, and in most workflow apps like Blender it probably comes out a little ahead of the A770, but still comparable and maybe not enough to really notice. If you only care about new DX12 and Vulkan gaming titles, also want to do Blender stuff, and want the 16GB of VRAM, the A770 is a great choice. However, if you want to play older titles, DX11 and older, I would NOT recommend the A770. So really, anything the A770 does, the 3060 12GB can do as well. At 1080p both cards will trade blows with each other, though the A770 will begin to outshine the 3060 12GB at higher resolutions, as the faster clock speed and extra VRAM allow it to pull ahead on newer titles at 1440p and sometimes even 4K. That really depends on how well the game is optimized, but when it happens it's a wonderful thing. The only real downside to the A770 is consistency; like I said, for older titles it's probably not the best choice. Also, if you are going to use Linux, it's probably not going to work for a lot of stuff without some effort. Oh, and as of now Arc does not have VR support. Some people have gotten it to work for certain titles, but even then it's wonky.

  • @pokemonexplorer1101 • 1 year ago +1

    Can I use an Arc A380 with a Ryzen 3200G? My mobo is an ASRock B450M and it supports ReBar, but I don't know about my processor. I am a budget user, sir. Please help me with my question.

    • @gamesushi • 1 year ago

      Hi! Unfortunately the 3200G does not support ReBar. Luckily AM4 CPU prices are pretty good at the moment. If you're on a budget, you can get a Ryzen 5500 for just under $100 right now. If you can spare $30-$40 more, you could pick up a 5600 or 5600X. Not sure what prices are where you live, but here in the USA they can be found at that price.

  • @daviddesrosiers1946 • 1 year ago +1

    I'm building an editing rig for photo and 4K video, nothing fancy, no 3D rendering or Blender nonsense. Looking to pair it with a 12900K and DDR5. Curious how the Acer BiFrost A770 would work for that application?

    • @gamesushi • 1 year ago +1

      For photo it should work no problem. For 4K video it should work too, though you probably won't be able to scrub through footage as fast or as seamlessly as with an RTX 4090. Using DaVinci and working with 1440p footage, I haven't had any issues and the experience has been smooth. DaVinci used to constantly crash when I was still using my GTX 970, and videos render a lot faster now too. Then again, I am comparing to a 970 lol. The BiFrost A770 has a slight overclock, so I think you would be just fine.

    • @daviddesrosiers1946 • 1 year ago +3

      @@gamesushi That's my thinking too. Plus, with it linking to the iGPU and working in tandem there, and the 256-bit bus, I should be in a reasonably good spot. Kinda fed up with the antics of both teams red and green. Team blue is offering a not bad, if not great (for now), alternative, future support gods willing.

    • @gamesushi • 1 year ago +2

      @@daviddesrosiers1946 Hear, hear! When I was looking for a GPU upgrade in early February of this year, prices were still high for 30-series cards across the board and I didn't want to buy used. I could have gone AMD, but I have had a couple of AMD issues in the past and just wanted to experience something new, so I tried my luck with Intel Alchemist. It's been an experience for sure, but gaming has been pleasant enough, and that's what I mainly wanted to do. I might buy the 4060 Ti 16GB model when it comes out and do some benches against the A770; I dunno, we will see. I am buying Battlemage for sure though.

  • @l3lue7hunder12 • 1 year ago +2

    Dude!
    I respect your experience and in-depth knowledge, as well as what you are trying to achieve here, so I am really sorry for saying this, but: that video needs removing and a serious overhaul before putting it up again, because that's how confusing and dangerous it is.
    As it is, only somebody with solid PC hardware setup knowledge can make use of it. As for anybody else…
    You don't have a structure and no script, you seemingly contradict yourself, recommending and warning against an Intel Arc in quick succession - something you even know, because you say how sorry you are that this may be confusing.
    Then you repeat the very same details in multiple variants, to the extent that only an IT professional even knows that you haven't changed the topic, and then you go straight for the throat by jumping into the BIOS.
    1. First, you need to know if you even WANT an Intel Arc
    2. Then you need to check if your system even supports it (CPU and mobo for Resizable BAR, length, width, power consumption, power connectors, …)
    3. Then you need to check if you are dealing with a legacy install or UEFI (= if switching off CSM is even an option)
    4. Then you need to tell people what the BIOS is
    5. Why they may NOT want to go there
    6. How to go there
    7. And DON'T have anybody change settings that can brick their system without warning
    8. And certainly not in this confusing a fashion.
    9. And not without having told your audience how to react to worst-case scenarios, like our classic JP-1.
    Fact is: on many systems, and most AMD Ryzen ones, if you enable Resizable BAR while CSM is enabled, you brick the entire thing to the extent that you need to reset the BIOS via jumper.
    Also, please don't doodle about in your BIOS for ages while holding your camera/smartphone in front of it. Better to take a quick picture, or at least be quick about it.
    Also, just for reference:
    Your video holds about 7 minutes' worth of actual content, but it's over 34 minutes long. So while it isn't exactly your fault, you still should be aware that this not only means it holds about 80% filler, but you are also exceeding the limits of both psychological factors and simple mental capacity.
    A rough orientation:
    2.3 seconds - something is perceived as "taking time"
    5 seconds - unrest settles in and attention wanes
    8 seconds - attention lost
    2 minutes - interest lost
    9 minutes - attention span exceeded
    This means you need to draw attention within 2.3 seconds of the video, create an interest or arc of suspense within the first 8 seconds, and then keep the ball rolling by re-triggering the interest about every 2 minutes by adding anecdotes, jokes, or simply changing the topic, and if you reach the 9-minute mark you should stop and put the rest in a separate video - why do you think YouTube videos are supposed to be 10 minutes or less?
    As for the Intel Arc A770:
    You never summarized what works and what doesn't. For instance, under Windows most DX12 titles work fine and with very competitive performance, at around RTX 2070 S / RTX 3060 / RX 6700 levels, showing solid ray tracing performance and even excelling at higher resolutions.
    It's only DX9-DX11 titles where performance or support is questionable, and you should use at least 2K resolutions upwards to really profit from the hardware.
    Linux, on the other hand, has had native Intel Arc support since kernel 6.2, which works fine with anything but Vulkan and video encoding like AV1. Using Vulkan will put you at about 50% of the possible performance, and features required for DX12 emulation under Proton are missing altogether, causing severe issues with many current Windows runtime titles under Steam.
    Essentially, it's a very solid first-generation product, but it will probably be another half a year until support and drivers have caught up enough that it can be recommended without reservation.
    Until then, as you mentioned a great many times, it will depend on your intended use. I personally can't recommend it right now and actually switched back to my old GPU in order to play Hogwarts Legacy under Linux, but as soon as a new kernel is out fixing those Vulkan issues, I'll certainly give it another go.
    I hope you'll see this as the well-meant feedback it is supposed to be.
    Have a good one.

    • @gamesushi • 1 year ago +2

      Wow, thank you for this well-thought-out response. I read every word and really appreciate the detail you put into it. Did you by chance see the on-screen prompt in the CSM portion that essentially says "if CSM is enabled already, and you disable it and save and exit, it won't find your hard drive"? Paraphrasing of course, but that's what the text says. I also have this layered over my explanation of the behavior the BIOS will exhibit in that case, in that it will try to boot but just restart; so in order to fix this, either Windows needs to be installed with CSM disabled, or the drive converted from MBR to GPT. I realize that might have been easy to miss due to pacing issues. In any case, I'm sure this won't satisfy all of the issues you brought up, and for that I apologize. I'm sure you can also tell I'm not a YouTube professional; honestly, this is the first video I have ever done that has gained so many views and so much traction. While only about 6 minutes of view time might seem terrible, compared to the 12 or so seconds of view time on every other video I've done besides these Arc videos (this being the best-performing one), 6 minutes is a big milestone for me. I'm still figuring out a lot as I go along. I will make sure to think more carefully about format in the future. If you watch any of my other videos, I hope you will feel just as welcome to post feedback, should you want to. I really appreciate it.

    • @l3lue7hunder12 • 1 year ago +2

      @@gamesushi Thanx. And no, I didn't miss that prompt or your boot-loop description, but it is no clear warning, and honestly it sounds like "not a big deal". Also, CSM is the Compatibility Support Module, which most notably offers the "legacy boot option" as opposed to "UEFI boot", which is what people know and recognize it by. GPT support, or most notably discontinued MBR support, is only an aspect of UEFI and the deeper technical side of things. You mixed the solution to the problem, the partition format conversion, with what the problem could be in the first place.
      And yes, your video is well received because it addresses many issues you had and therefore is great for many who are considering going Intel Arc. I like it too; I just stumbled over the presentation/structure of the video. 😉

    • @darkoz1692
      @darkoz1692  A year ago +2

      ​@@l3lue7hunder12 - I look forward to your upcoming video showing how it should be done.

  • @RamtinSP
    @RamtinSP  A year ago +1

    I need help!!! I just had a Ryzen 3 3100, an MSI B450M motherboard, and a GTX 1650, and changed to the same CPU with an ROG B550-F motherboard and an Intel Arc A770. But when I play games I get low FPS, 100% CPU usage, and low GPU usage. Please, what should I do? Also, I haven't yet reinstalled Windows since the last motherboard.

    • @gamesushi
      @gamesushi  A year ago

      Sorry to hear you're having issues. From a quick Google search I didn't see any information about whether the Ryzen 3 3100 supports ReBar; it might not, and that may be your issue. Since you have a B550 board, just make sure ReBar is enabled and CSM is disabled. Beyond that, make sure your CPU supports ReBar; if the 3100 doesn't, you can pick up a Ryzen 5600 for about 120-130 bucks right now.
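      If you'd rather confirm ReBar is actually active than guess, GPU-Z shows it on Windows, and from a Linux live USB you can parse lspci. A rough sketch, assuming recent pciutils and root privileges (non-root runs hide some capability blocks):

      import subprocess

      # GPUs exposing ReBar print a "Physical Resizable BAR" capability in
      # 'lspci -vv'; the "current size" lines show whether the full VRAM
      # window is mapped (e.g. 16GB on an A770 with ReBar enabled).
      out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout

      in_gpu = False
      for line in out.splitlines():
          if line and not line[0].isspace():
              # Device headers are unindented; their details are indented below.
              in_gpu = "VGA compatible controller" in line or "3D controller" in line
          elif in_gpu and ("Resizable BAR" in line or "current size" in line):
              print(line.strip())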

    • @RamtinSP
      @RamtinSP  A year ago

      @@gamesushi It is supported, actually; they brought it to 2nd and 3rd gen CPUs. But I reinstalled Windows, because I had just moved everything over without wiping the hard drive. I do want to buy a new CPU; the problem is I don't live in the UK or USA where it costs that much, and in my country it's way more expensive.

    • @gamesushi
      @gamesushi  A year ago

      @@RamtinSP Oh okay, I see. Try using DDU to remove any old Nvidia drivers that may still be on the system. GraphicArc has a good video about how to use DDU to completely remove old drivers and then install the Arc drivers, though keep in mind there are newer Arc drivers now than in the video, since it's a couple of months old. The video is titled "Intel Arc Control Panel giving you problems? Driver crashing? Install drivers this way to fix stuff." by the channel GraphicArc.

    • @RamtinSP
      @RamtinSP  A year ago +1

      @@gamesushi There isn't any problem with the Intel control panel; the card is even detected. It might be caused by my previous GTX 1650 driver being stuck in the system, because I removed the GPU before I could uninstall it. I went to Control Panel and uninstalled it there, but it still looks like there's no load on the Arc GPU, even though the fan speed goes up when I get into games.

    • @gamesushi
      @gamesushi  A year ago +1

      @@RamtinSP I know, it's just the title of the video. So although your issue doesn't have anything to do with Arc Control, check out the part that talks about DDU and reinstalling the Arc GPU driver.

  • @westwonic
    @westwonic  A year ago +1

    My first GPU was a 3dfx Voodoo card

    • @gamesushi
      @gamesushi  A year ago

      Wow! I remember the box art for that card; it had the blue guy on the front lol.

  • @AI.Musixia
    @AI.Musixia  9 months ago +1

    I think I'll buy an Intel GPU once the kernel's built-in drivers are better...
    My first GPU was a Riva TNT2 Ultra I bought in 1999. Flashbacks to my 12-year-old self... Good buy, bro. Never buy an Inno3D iChill GTX 980 Ti X3 Ultra, it's not worth the money, younger me; buy the XPS laptop instead, it's better. Trust me, I am you but older!!!!

  • @KingArthusSs
    @KingArthusSs  A year ago +1

    Arc is a well-priced card, but my 7-year-old 1080 Ti (500€) is about as fast as a new A770 (250€)

    • @gamesushi
      @gamesushi  A year ago +1

      Yep, the 1080 Ti was sooo good that it let many happy owners skip new GPU generations for YEARS. Nvidia, who at this point puts money first, won't ever make the mistake of building another GPU that good again. lol

  • @justingtr7335
    @justingtr7335  A year ago +1

    Can I use this card with my old i5-6500 (6th gen) processor, 16GB RAM, and a Samsung 500GB SSD??

    • @gamesushi
      @gamesushi  A year ago +1

      I would say maybe, but even if your system recognized the Arc card and let you install drivers, the performance would not be good and I would not recommend it, mainly because your 6th gen CPU does not support ReBar, and likely any motherboard that fits it does not either. In your case I would recommend an RX 6700 XT or an Nvidia 3060 (12GB if you can find one), as the prices are very comparable and performance is just as good. If you already have a 3060 or 6700 XT (or a similar performing GPU), then I recommend saving up a little and moving to a DDR5 platform, whether it's an Intel or AMD CPU socket; that would set you up for future GPU upgrades.

    • @justingtr7335
      @justingtr7335  A year ago +1

      I need the GPU mainly for video editing; maybe AMD isn't a good choice for this! Also, in my country the 3060 is almost $100 more than the A750, which is why I chose the A750 (budget facts)... but I'll upgrade the CPU in the future, thanks @@gamesushi

    • @gamesushi
      @gamesushi  A year ago

      @@justingtr7335 Hmm, it depends on which video editing program you're using and whether you need a GPU with an AV1 encoder. If you need AV1 and would like to go non-Nvidia, I actually think the RX 7600 is pretty decent. For games it's about similar performance, and it should let you work with AV1 footage in the timeline in Resolve. I have a 6800 XT that I used for video editing, and yeah, AV1 content isn't seamless, but non-AV1 works just fine. Also, with the release of ROCm for Windows, an AMD GPU can now handle the kind of compute work that used to need CUDA (through ROCm's HIP layer, not CUDA itself). So again, the 6700 XT is a great option. ROCm for Windows works on AMD RDNA2 and RDNA3 GPUs.
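      To be clear about what "CUDA on AMD" means in practice: ROCm doesn't run CUDA itself, it exposes a CUDA-like API called HIP, and frameworks built on it reuse the familiar CUDA entry points. A minimal sketch of what that looks like, assuming a ROCm-enabled PyTorch build (which at the time of writing generally meant Linux):

      import torch

      # On ROCm builds of PyTorch, the HIP backend is exposed through the
      # familiar torch.cuda API, so CUDA-targeted scripts run unmodified.
      if torch.cuda.is_available():
          device = torch.device("cuda")              # maps to HIP on ROCm builds
          print("GPU:", torch.cuda.get_device_name(0))
          print("HIP version:", torch.version.hip)   # None on genuine CUDA builds
          x = torch.randn(1024, 1024, device=device)
          print("Matmul OK:", (x @ x).shape)
      else:
          print("No ROCm/CUDA-capable GPU visible to PyTorch.")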

    • @justingtr7335
      @justingtr7335  A year ago

      Mostly MP4 with DaVinci. What about the RX 6600 / 6600 XT / 2060 / 2060 Super (budget again)? The RX 7600 is expensive -_- @@gamesushi

    • @gamesushi
      @gamesushi  A year ago

      Ah okay, to play it safe the 2060 sounds good. There are some "new" models on Amazon for a decent price; just make sure to check out the reviews first, but if I had your budget that's the route I would take. If you are tech savvy you could get a 6600 XT and then download ROCm for Windows, which would let the GPU handle CUDA-style workloads through HIP. It doesn't really require much tech savviness, but I say that because it involves downloading the ROCm software and making sure the GPU compute settings in your video editor point at it. Scrubbing through MP4 footage on either GPU (2060 or 6600 XT) should be seamless.

  • @Clav_
    @Clav_  A year ago

    Can anyone tell me why, after uninstalling and reinstalling the drivers many times, even on a clean reinstall the Arc software still refuses to even open? The drivers were great for months, but now it's completely broken...

    • @gamesushi
      @gamesushi  A year ago +1

      Have you tried DDU? GraphicArc has a good video on how to use DDU to uninstall drivers. It should only take about 15 minutes and may be more thorough than relying on the Clean Install function of the Arc driver installation process. Here is the DDU software: www.guru3d.com/files-details/display-driver-uninstaller-download.html

    • @Clav_
      @Clav_  A year ago +1

      @@gamesushi I actually did try DDU, but I did figure out why it was all buggy. I thought I could use the Virtual Desktop streamer to play in VR, but I didn't check whether the card was compatible with the Quest 2, which it is not. Turns out, if you have any other display adapters in your Device Manager that are not being used, it will obliterate the functionality of the Arc interface. Virtual Desktop was one of those devices. Thanks for trying to help; I hope anyone who runs into the same issue comes across this, though it's unlikely.

    • @gamesushi
      @gamesushi  A year ago +1

      @@Clav_ Oh okay, thank you for sharing this. Yes, VR doesn't work on Arc at the moment. It is on Intel's list of things to do, but they just haven't gotten to it yet. I'll cross my fingers it's sooner rather than later.

  • @shadow_liquidity
    @shadow_liquidity  A year ago +1

    Hi, I have an 8th gen i5-8500. Would that work with this card? I currently have a GTX 960

    • @gamesushi
      @gamesushi  A year ago +1

      Hi! Unfortunately you would need a 10th generation Intel CPU or newer, and the i5-8500 is an 8th gen CPU. I would recommend either a 3060/3060 Ti, or a 6600 XT, 6700, or 6700 XT, since those are all around similar price points, give or take a little. I used to have a GTX 970 prior to the A770, and believe me, any of the cards I recommended will make your system feel ultra fast and way, way more powerful compared to your GTX 960.

    • @shadow_liquidity
      @shadow_liquidity  A year ago

      @@gamesushi Thank you, I appreciate your response

  • @gozutheDJ
    @gozutheDJ  A year ago +1

    Prodeus gameplay W

    • @gamesushi
      @gamesushi  A year ago

      LOL yeah I dig it. The sound effects and music hit just as hard as the visuals.

  • @gamesushi
    @gamesushi  A year ago +1

    0:00 Intro
    0:26 Why you should buy it
    2:15 About me and why you might care what I have to say
    3:46 Why you want to see this video
    4:31 ReBar and why you NEED it
    6:43 How to toggle ReBar
    8:10 CSM, how to disable, what it is
    10:39 Your decision about ReBar and CSM
    12:05 Start of Cons section
    12:32 Con #1 Power Draw
    14:31 How to fix Power Draw - Windows Setting
    15:18 How to fix Power Draw - BIOS Setting
    16:58 Final words about Power Draw fix
    18:11 When this fix worked for me and when it stopped working
    20:47 Con #2 Black screens
    24:48 My Black Screen issue
    28:02 Final words on Black Screens
    30:47 Con #3 Arc Control Admin Permission At Startup
    32:03 How to fix
    33:22 Outro - Thanks for watching!

  • @denisboisvert69
    @denisboisvert69  A year ago +1

    It's Task Manager, not taskbar manager ;P

    • @gamesushi
      @gamesushi  A year ago

      Lol, I didn't even realize I said taskbar manager. Thanks!

  • @Leptospirosi
    @Leptospirosi  A year ago +1

    The power draw of the Arc GPU is really what drove me off from buying one: at the same current price a 6700 XT is definitely faster, for slightly less power.
    Nvidia, as much as I dislike the brand's policies, is even better and more efficient.
    With current energy prices outside North America, and the climate problems, any brand that doesn't feel ashamed of wasting 50-100 watts on YouTube or other menial activities should be punished.
    Both Arc and the 7000-series GPUs have been sitting on their thumbs with this power draw problem, doing little to nothing, and for me that's a no-no

    • @gamesushi
      @gamesushi  A year ago

      Yeah I agree with everything you just said. Drawing that much power at idle feels pretty wasteful. I do suspect Intel will have power draw fixed when Battlemage comes out. Until then, power conscious gamers should probably go with something else. The 6700XT is an EXCELLENT value right now.

    • @Leptospirosi
      @Leptospirosi  A year ago +1

      @@gamesushi I hate that AMD has the same problems with high-refresh monitors and insane power draw on the 7900 series. I heard it can be mitigated by manually dropping the refresh rate a couple of steps, say from 144 Hz to 142 Hz. Have you ever tried that on Arc?

    • @gamesushi
      @gamesushi  A year ago +1

      @@Leptospirosi Yeah, I can attest to high power draw on my 6800 XT also; it idles at 40W, though I get WAY more performance out of the 6800 XT, and that's still less idle power draw than the Arc card, if only by a little. Anyway, on the Arc I did try turning the refresh rate down, but I personally found that anything over 70 Hz on my monitors/system would not give me the savings from Intel's power draw fix. The only time it worked was with two 24-inch monitors at 60 Hz, or one single ultrawide at 70 Hz; when I put that same ultrawide at 79/80 Hz, no power savings, and even at 75/74 Hz, no power savings.

  • @ALAMIN4000
    @ALAMIN4000  A year ago +1

    Arc A770 16GB or RTX 3060 Ti?

    • @gamesushi
      @gamesushi  A year ago

      It depends. Are you cool with troubleshooting issues, or do you prefer a safer route? If you want the safer route, go with the 3060 Ti. If you are okay troubleshooting stuff, the A770 is a good value.

    • @ALAMIN4000
      @ALAMIN4000  A year ago +1

      @@gamesushi My only doubt about the RTX 3060 Ti is that it has just 8GB of VRAM..! I want to play games at 1440p, and you know 8GB of VRAM is too low for the days ahead..!

    • @gamesushi
      @gamesushi  A year ago

      @@ALAMIN4000 The A770 does not have as good game performance. I say this as someone who used the card and really wanted to like it. If you look at Tom's Hardware's GPU hierarchy chart, the A770 loses to the 4060 non-Ti at 1080p ultra and 1440p ultra, and that is an 8GB card. Unfortunately, the way the A770 is architected, it doesn't matter that it has 16GB of VRAM, and I don't think drivers will save it. It still loses to the 3060 Ti too.
      www.tomshardware.com/reviews/gpu-hierarchy,4388.html

    • @ALAMIN4000
      @ALAMIN4000  A year ago +1

      @@gamesushi What about productivity? Is it any good?

    • @gamesushi
      @gamesushi  A year ago

      @@ALAMIN4000 I think it can be good for productivity; it just depends on what kind of app you are using. For video editing in DaVinci Resolve it was better than my 6800 XT. But now that AMD has released ROCm for Windows, the 6800 XT performs just as well, so the one thing I found the A770 worked better on is no longer the case. If you go the A770 route, maybe you will find that the apps you work in run great on it, but you will eventually find something that doesn't. That's why I said the 3060 Ti is the safer choice.

  • @Saabjock
    @Saabjock  A year ago +1

    Great video.
    I've had great success with Arc... then again, all the things you've mentioned I did early on.
    I even use WMR.
    The only issue I had early on with the card was that it would crash when trying to install drivers from the .exe file.
    I was forced to use 7-Zip to extract the file, then manually install the driver from Device Manager.
    I have not had a single in-game crash to date.
    ruclips.net/video/w8NAYiJxw20/видео.htmlsi=6Ky7AxwF65dHpRY3

    • @gamesushi
      @gamesushi  11 months ago

      Oh good, glad your Arc hasn't given you any major problems. Interesting that you were able to fix your driver installation issue by installing through Device Manager. Is your motherboard Intel-based, by chance? I've seen only a couple of other people have issues installing drivers; a couple of rounds of DDU seemed to do the trick, but that seems like more work and luck than what you did. While AMD is a separate can of worms, I feel like my AM4 board never had any of the Intel-board issues, like the iGPU acting up or drivers failing to install. Though maybe someone out there with an AMD board has had driver installation issues too; it's just not something I remember seeing.

  • @magnusnilsson9792
    @magnusnilsson9792  A year ago

    What does this mean? "Google it." x10. Yes, we know that Google exists.
    Please either explain it, or just don't mention it.

    • @gamesushi
      @gamesushi  A year ago

      Did you see the part where I explained it and showed the settings to toggle? I only mentioned the Google stuff at the end of that bit. Some people wouldn't understand those terms even if I read out the definitions. I wanted to stick to very simple terms, skipping as much tech jargon as possible, showing what to toggle, and at the end telling people what to Google if they want more info about said jargon. If you have ever worked as a PC tech, you will know that you literally have to spell everything out for some people, so piling on jargon won't help them. It seems like you are more advanced than that, which is good; more power to you. I remember one time a user called from the airport because their work iPhone wasn't working. After nothing they said made any sense to me and 10 minutes of troubleshooting, they said, "Omg, sorry, this is my iPod, not my iPhone. I thought it was my work phone this whole time."