Intel ARC A310 vs Nvidia GTX 1630 - Can The Cheapest ARC Card Beat The Cheapest GTX?

  • Published: 5 Jul 2024
  • How does an entry-level ARC graphics card compare to an entry-level GTX GPU? Let's find out.
    0:00 Intro and Test Setup
    1:14 Alan Wake 2
    2:04 Cyberpunk 2077
    3:02 Fallout 4
    4:00 Forza Horizon 5
    5:18 Red Dead Redemption 2
    5:56 Starfield
    6:49 The Witcher 3
    7:41 Final Thoughts
    Thanks for watching :)

Comments • 420

  • @theformerkaiser9391
    @theformerkaiser9391 A month ago +416

    The fact that the 1630 requires external power, yet the 1650 doesn't, is hilarious

    • @bjarne431
      @bjarne431 A month ago +32

      Lazy design

    • @theformerkaiser9391
      @theformerkaiser9391 A month ago +52

      @@bjarne431 And the best part is that the 1630 has the same 75-watt TDP as the 1650

    • @Tynx006
      @Tynx006 A month ago +9

      @@theformerkaiser9391 Really? What a joke XD

    • @thisfeatureisbad
      @thisfeatureisbad A month ago +62

      The GTX 1630 shouldn't have existed in the first place. It's just an underclocked 1050 Ti or a 1030 on steroids. In short, just e-waste.

    • @klocugh12
      @klocugh12 A month ago +25

      @@theformerkaiser9391 Probably lower-quality chips that required higher voltages to reach the clocks, and hence consume more power. I'd be fine with this if they cut the price in half.
      As it is, it's a disgrace.

  • @R3AL-AIM
    @R3AL-AIM A month ago +408

    I had no idea the 1630 was a thing, and it's kind of wild that it requires external power...

    • @dasauto7346
      @dasauto7346 A month ago +38

      I don't think all models of it do. It's supposed to sip about as much power as a 1030, so I'm not sure why they bother adding a 6-pin.

    • @Simtatic
      @Simtatic A month ago +37

      @@dasauto7346 Probably some overclocked version; GPU manufacturers nowadays overclock anything for the sake of being able to label it "OC".

    • @anasevi9456
      @anasevi9456 A month ago +20

      The fact that it is slower than the cheapest, bottom-of-the-barrel ARC card even in older games with garbage DX9-era-derived DX11 game engines like Fallout 4's is hilarious.

    • @masterace9543
      @masterace9543 A month ago +4

      I didn't realize there were cards out there that don't require external power until recently 😅

    • @kimnice
      @kimnice A month ago +5

      The GTX 1630 has a TDP of 75W, so technically it doesn't need external power... but without it, it would really stress the PCIe connector.

  • @Wild_Cat
    @Wild_Cat A month ago +227

    Thanks for the Intel Arc content, keep it up. They are more interesting than what AMD and Nvidia have. I'd like to see the A580, the A770 8GB and more reviews from you.

    • @asfiraihan2073
      @asfiraihan2073 A month ago +7

      Intel is a joke

    • @_Lassic_
      @_Lassic_ A month ago +85

      @@asfiraihan2073 And Nvidia's pricing isn't? Like the other dude said, Nvidia is basically Apple at this point.

    • @RandomGaminginHD
      @RandomGaminginHD A month ago +68

      Yeah, hope to check them both out soon

    • @asfiraihan2073
      @asfiraihan2073 A month ago +5

      @@_Lassic_ No one should buy budget GPUs from Nvidia. Only high-end GPUs make sense when opting for Nvidia. AMD kinda follows the middle path. All in all, Intel should just go back to making their CPUs.

    • @Carrington1961
      @Carrington1961 A month ago

      @@asfiraihan2073 Nah, more competition = better prices

  • @bouldaa
    @bouldaa A month ago +66

    When I see PC parts in the garden, I just know that it's you

  • @RuruFIN
    @RuruFIN A month ago +69

    The 1630 is a legit successor to the 1030 DDR4 if you ask me.

    • @madelaki
      @madelaki A month ago +13

      More like a legit successor to the 710

    • @harlequintheserpent7016
      @harlequintheserpent7016 A month ago

      @@madelaki I'm just wondering if it rocks that exact same 710 chip as the 1030 xD

    • @mlghassan
      @mlghassan A month ago +15

      I like the emphasis on the "DDR4"

    • @zangetsu6638
      @zangetsu6638 A month ago

      @@madelaki hahahaha

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat A month ago

      Except it consumes more power lol

  • @CGE10
    @CGE10 A month ago +44

    Intel managing to beat Nvidia on power efficiency this early on is pretty amazing. Now if only they could scale this up to 4080 levels, Nvidia would be in big trouble.

    • @Aliyah_666
      @Aliyah_666 A month ago +6

      You aren't wrong; if they can knock Battlemage out of the park it will spell trouble for Nvidia's bread-and-butter area. The xx80 series has always been really important.

    • @blizyon30fps86
      @blizyon30fps86 A month ago

      That's just not true, the 40 series destroys Intel in power efficiency when it comes to the 70 series and 60 series

    • @CGE10
      @CGE10 A month ago +3

      @@blizyon30fps86 Too bad Nvidia does not have anything in the 40 series to compete with this.

    • @blizyon30fps86
      @blizyon30fps86 A month ago

      @@CGE10 The 4060 low profile is a thing, you know that, right?

    • @CGE10
      @CGE10 A month ago +6

      @@blizyon30fps86 You know the 4060 is nearly 4x the price; they do not compare... idk what you're trying to argue. I'm running a 3060, so it's not like I'm a fanboy or something.

  • @Absolutely_Allen
    @Absolutely_Allen A month ago +22

    I gamed on an Intel Arc A750 for a bit and was blown away by how good it was, since I got it new for $180 USD

  • @BeejNet
    @BeejNet 26 days ago +2

    Easily the best tech YouTuber who stays humble and unpretentious and, so far, has not sold out to any sponsorships and/or merch. Not that I don't wish RGHD any success, but bro, wish you all the best and at the same time hope you carry on keeping it real.

  • @Napert
    @Napert A month ago +72

    40w max while gaming...
    meanwhile my 3060 Ti draws 30w at goddamn idle

    • @chubbykun
      @chubbykun A month ago +5

      RX 7900 XTX, 90-100W idle here xD

    • @IntelArcTesting
      @IntelArcTesting A month ago +6

      My A750 and A770 consume 40w at idle; it's not just an Nvidia thing.

    • @82_930
      @82_930 A month ago +8

      @@chubbykun Mine only pulls 5-6w at idle 💀 update your drivers

    • @johndavis29209
      @johndavis29209 A month ago +4

      My 7800 XT runs at like 22w idle, what the hell are you doing to your system?

    • @xnutzii
      @xnutzii A month ago +1

      @@chubbykun Mine draws like 12w, update your drivers dude

  • @grimmpickins2559
    @grimmpickins2559 A month ago +8

    The weirdest part about the A310 is that the A380 seems to be about the same price, especially since I see much less of the A310 in the US. I ran the A380 with way-overkill specs for its little self for a while (ok, ok, so the Pentium G7400 isn't overkill - but the DDR5-6000 RAM was, LOL). The A310 is actually quite similar, with less VRAM - though I rotate my side-gaming PC components regularly and it's been a few months... I hit act three in BG3 on that system, and I just couldn't take it...

  • @itz_me_kratos
    @itz_me_kratos A month ago +8

    I never felt proud of my RX 6500M until this video

  • @samjackson7701
    @samjackson7701 A month ago +1

    Perfect timing seeing this video now -- I just got back home from the local Micro Center where I bought an Intel Arc A310! I got the Sparkle "ECO" model, and it is going to be used in a Jellyfin media server. Can't wait to play around with the AV1 options, it looks fascinating! Glad to see you are covering the Arc series.

  • @tryllejens
    @tryllejens A month ago +4

    Very nice! It would be interesting to see how big the difference is going from an A310 to an A380, considering they are nearly the same price.

  • @Wushu-viking
    @Wushu-viking A month ago +4

    Thanks for the upload. The ARC A310 looks very interesting! Especially in SFF/low-profile form it seems a very good option as a simple drop-in upgrade for an SFF office PC. You're also fine using this GPU with their typical 200/250W PSUs.
    But for a new build it doesn't make sense. You can get an AMD APU that performs not far off. The Ryzen 7 5700G has become quite cheap these days, and with some OC and some good DDR4-3600 (OC to 4000 is optimal), its iGPU performance is not far off the GTX 1630

  • @CleverFauxFox
    @CleverFauxFox A month ago +3

    Pretty cool to see these two in a video at all. Hardly anyone pays much mind to the Arc series still, and seeing how... well, efficient they are, and running fairly okay, makes me feel they're likely underestimated for those just trying to get into the PC space, even more so for basic titles like CS:GO/League while sometimes messing around on light titles such as Moonlighter or Wizards of Legend. Kinda something I could see as a "kid's first PC", but maybe I'm just projecting my first steps, affording only an FX-6300 with an R9 270. Just whatever could get me in and playing something.

  • @musguelha14
    @musguelha14 A month ago +2

    For everyone talking about the colours and contrast in the games: that's not the GPU's fault, it's simply a configuration difference. They can look exactly the same.
    Just by looking at it, I'm pretty sure Nvidia is outputting RGB limited (16-235) and Intel is outputting RGB full (0-255). The capture card seems to be expecting RGB full, and that's why the Nvidia footage looks bad - the black and white points are wrong, so the image doesn't have enough contrast.
    This is very simple to fix in the drivers.
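    The limited-to-full range expansion the commenter is describing can be sketched numerically. This is a minimal illustration of the 16-235 vs 0-255 mismatch, not the code any particular driver or capture card runs:

    ```python
    def limited_to_full(y: int) -> int:
        """Expand a limited-range (16-235) 8-bit video level to full range (0-255).

        Values outside 16-235 are clamped first, which is exactly why a
        limited-range signal shown on a device expecting full range never
        reaches true black (0) or peak white (255): the darkest and brightest
        levels simply aren't there, so the image looks grey and washed out.
        """
        y = max(16, min(235, y))            # clamp to the legal limited range
        return round((y - 16) * 255 / 219)  # 219 = 235 - 16 usable steps

    # Limited-range "black" (16) expands to true black, 235 to peak white:
    assert limited_to_full(16) == 0
    assert limited_to_full(235) == 255
    assert limited_to_full(126) == 128  # mid-grey stays roughly mid-grey
    ```

    Run the other way (displaying limited-range output without expansion), level 16 is rendered as a dark grey instead of black, which matches the raised black level visible in the Nvidia footage.
    
    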

  • @dcikaruga
    @dcikaruga A month ago +62

    1650 vs A380 with the latest drivers next, please.

    • @malik-mahdi
      @malik-mahdi A month ago +7

      It's about neck and neck, but the Arc wins in VRAM-heavy games like the Resident Evil 4 remake, which, depending on the settings, needs more than 4 GB of VRAM

    • @roqeyt3566
      @roqeyt3566 A month ago +3

      Yes please
      Or make it 1650 vs A380 vs 1060

    • @JJARCHIE
      @JJARCHIE A month ago +2

      @@roqeyt3566 The 1060 wins by a landslide, but the 3-gig variant might be fairer

    • @metu201
      @metu201 A month ago +1

      @@JJARCHIE I still use the 3GB 1060, and for what I'm playing and using it for it has served me well. I will only upgrade when I start playing more demanding titles. I really hope he does make that video; I'd like to see how it does.

    • @JJARCHIE
      @JJARCHIE A month ago +1

      @@metu201 It doesn't matter much. As long as anything needs more than 4 GB, it will chug

  • @sir.fender6034
    @sir.fender6034 A month ago +14

    Team Blue for the win! Nice video 🖖

  • @SupremeMasterr
    @SupremeMasterr A month ago +6

    Whenever I see a benchmark video from zworm or randomgaming, I always mix them up, thinking they are the same person

  • @LurkingLarper
    @LurkingLarper A month ago +2

    Would be sexy if Intel manages to greatly improve upon this kind of power envelope with the Battlemage GPUs. One can only hope that they step up and don't leave gamers to the mercy of Nvidia and AMD.

  • @lozerone6940
    @lozerone6940 A month ago +12

    This is why I love this channel: you have the craziest ideas for tests that I've seen on YouTube

  • @mayatheshy8919
    @mayatheshy8919 A month ago +1

    It would be nice if you could post the full setup configuration details in the description, as a reference for while we watch

  • @zackvyle
    @zackvyle A month ago +5

    The thumbnail and title remind me of a JJK meme:
    The Cheapest of Today vs The Cheapest in History
    (something like that)

  • @mikatorkkeli4932
    @mikatorkkeli4932 A month ago +3

    Would you recommend any of those Intel Arc cards for gaming? They seem to do quite well in the videos for the price, compared to Nvidia.

  • @giserson2
    @giserson2 A month ago +2

    Would be interesting to see how they compare to the RX 6400, as that's about the same price. It would also be interesting to see the 6GB A380

  • @siddhantyadav450
    @siddhantyadav450 A month ago +2

    Why didn't you include the RX 6400? Any particular reason, or did you just not have it in stock?? Great vid as always!

  • @robh9075
    @robh9075 A month ago +4

    Man, that was closer than it should have been, really; looks like the Intel Arc drivers still need a LOT of work.

  • @gamersworld4176
    @gamersworld4176 A month ago +3

    @RGHD Can you do a short video comparing these two cards in esports titles like Fortnite, Warzone, CS2, Valorant, Apex Legends, The Finals and XDefiant? Because I think if you are considering buying these cards in 2024, it's probably best to stick with esports titles rather than trying to play AAA games, since those won't be that pleasant of an experience, and as time goes on it will only get worse. So sticking with esports titles is the best choice. Maybe they can be a cheap option for people looking for a GPU that can do well in esports titles for quite some time.

  • @luheartswarm4573
    @luheartswarm4573 A month ago +2

    Arc cards are not something I'd switch my AMD graphics card for yet, but they're getting better and better! It's great to have extra competition!

  • @drewnewby
    @drewnewby A month ago +4

    So close, but honestly, Intel's drivers have really improved, given the numbers you're showing today.

  • @BlueEyedVibeChecker
    @BlueEyedVibeChecker A month ago

    One thing I noticed is that the Arc card has higher contrast and more saturation than the GTX 1630.
    I noticed something similar with my ASRock A380 when switching from my 1050 too.
    It's like a side-by-side of the differences between an LCD and an OLED display; for this alone the Arc wins for me. I'm a sucker for colourful games.

  • @LukeTheJoker
    @LukeTheJoker A month ago +1

    Great comparison!
    Looks like the slightly faster GTX 1650 might be a close competitor for the A310, especially with both requiring no added power connector.

    • @purehollow4460
      @purehollow4460 A month ago +2

      There's up to a 15 fps difference between the two in games like Horizon and Hogwarts

  • @iNubpwn3r
    @iNubpwn3r A month ago

    Arc supports DX12 Ultimate, so there is that. I'm looking forward to seeing that ReBAR on/off comparison video, because on my Nvidia card it doesn't seem to be doing anything.

  • @MrSamadolfo
    @MrSamadolfo A month ago

    Thanks for the comparisons, buddy, I really appreciate it. Do you happen to have on a bookshelf a permanent roster lineup of older gaming rigs just for testing? Because when it comes to any bus-powered video card out in the wild, it would really help to test the cards and compare them in older rigs to see what the major losses are when they're running with a slower CPU, slower motherboard slots, and slower RAM. For a lineup I would like to suggest: one LGA 775, one first-gen Intel, one second-gen Intel, one Haswell 4th, one Skylake 6th, one Coffee Lake 8th, one FX-8320 AM3+, and one first-gen Ryzen 7. For a permanent roster lineup of bus-powered video cards I suggest: GTX 750 Ti, 1030 GDDR5, 1050 Ti, 1650, 3050 6GB; bus-powered R7, R9, RX 550, RX 460, RX 6400; and all Arc cards that are bus-powered. I'm only guessing, but I imagine the Arc cards will have the largest performance loss of any bus-powered video card on the market when you plug them into an old rig such as a first-gen Intel or FX-8320. The information is vitally important for people interested in turning an old OEM prebuilt into a gaming PC, since they need to know which of the bus-powered cards run better in older systems. Thanks again and have a good weekend. 😇🙏

  • @IndellableHatesHandles
    @IndellableHatesHandles A month ago +2

    The used market is way too good right now to buy these things. Unless you need a low-profile graphics card, you might as well get a used card like the 1660 Ti for the same price.

  • @lexluthermiester
    @lexluthermiester A month ago

    @RandomGaminginHD
    1080p is simply asking a bit much of these cards. They are better suited to 768p/720p resolutions. These are perfectly acceptable resolutions and will allow for an increase in settings quality while still gaining frame rate.

  • @janwitkowsky8787
    @janwitkowsky8787 A month ago

    Now I'm curious.
    Initial benchmarks of the A310 placed it below the 1630.
    The 1630 has often been pegged as a smidge better than the 1050.
    Now that the A310 is arguably better than the 1630, how does it stack up against the 1050 and 1050 Ti?

  • @rem_0
    @rem_0 A month ago +15

    I didn't even know the GTX 1630 existed

    • @RandomGaminginHD
      @RandomGaminginHD A month ago +14

      Yeah, it had a fairly quiet launch and then just disappeared haha

  • @FankyRonald
    @FankyRonald A month ago +2

    Thank you for the video... still, I want to compare the no-pin 1650 with the A310... still hunting a no-pin 1650 for my OptiPlex

    • @akiwiwithaface8911
      @akiwiwithaface8911 A month ago +1

      At that point you might want to find a better power supply and a more powerful card

    • @FankyRonald
      @FankyRonald A month ago +3

      @@akiwiwithaface8911 Actually, I've got the Fractal Ion+ 760W Platinum in my custom build with an i7-7700K, and yes, I'm still looking for the no-pin 1650. Why? Because the OptiPlex's PSU isn't "upgradeable"~

    • @deanchur
      @deanchur A month ago

      @@FankyRonald Look for "MSI 1650 4GB LP 75W" on your shopping site of choice; I've got it in my OptiPlex 7050 with a RAM bump to 16GB, and it runs Deus Ex MD at 1080p full very well.

  • @b4ttlemast0r
    @b4ttlemast0r A month ago +1

    The official TDP of the 1630 is 75W (which is as much as a PCIe slot can provide), and it seems like only certain models of the 1630, like the one you have here, require a power connector, likely for overclocking headroom (it's factory overclocked by 30 MHz, absolute gamechanger 😂). In this tier of graphics cards, requiring a power connector or not, and the difference between using 40W and using 75W+, is actually a big deal.
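    As a rough illustration of why a 40W vs 75W load draw matters at this tier, here is a back-of-the-envelope running-cost sketch. The gaming hours and electricity price are assumed example figures, not from the video:

    ```python
    def annual_energy_cost(watts: float, hours_per_day: float, price_per_kwh: float) -> float:
        """Yearly electricity cost of a component drawing `watts` while in use."""
        kwh_per_year = watts / 1000 * hours_per_day * 365
        return kwh_per_year * price_per_kwh

    # Assumed figures: 2 hours of gaming per day at an illustrative €0.30/kWh.
    a310_cost = annual_energy_cost(40, 2, 0.30)   # roughly €8.8 a year
    gtx1630_cost = annual_energy_cost(75, 2, 0.30)  # roughly €16.4 a year
    print(f"A310 ≈ €{a310_cost:.2f}/yr, GTX 1630 ≈ €{gtx1630_cost:.2f}/yr")
    ```

    The absolute amounts are small, but in this price bracket the gap compounds with the cost of the card itself, and it is also the difference between fitting inside the slot's 75W budget and needing a 6-pin.
    
    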

  • @sleepybunnynorbun1039
    @sleepybunnynorbun1039 A month ago +3

    Tbh the only thing I'm wondering about the A310 is how it compares to AMD Ryzen integrated graphics. From what I've experienced so far, the only cards worth getting are the A770 and A750, for the low cost relative to the 12GB and 16GB of VRAM.

    • @DigitalJedi
      @DigitalJedi A month ago

      The A750 is 8GB, not 12GB.

  • @Therealpro2
    @Therealpro2 A month ago

    So the VRAM figure on Arc cards being higher than the actual physical VRAM means that part of the system memory was used to hold textures etc., and when they were required, the Arc card had direct access to system RAM (because of ReBAR).
    On a side note: the A310 is an excellent card if you're only looking to use Intel's QSV hardware encoder and decoder. The A310 has the same encoder/decoder silicon as the A770

  • @CzlowiekDrzewo
    @CzlowiekDrzewo A month ago

    Full range vs. limited range?

  • @pantheratora
    @pantheratora A month ago +5

    Anyone else notice the difference in color temps? I wonder why that is. The Intel Arc is clearly warmer, showing more reds and oranges. I wonder if that's some sort of auto HDR. I don't have Arc, so I'm just curious.

    • @zangetsu6638
      @zangetsu6638 A month ago +1

      Yes, the Intel has better contrast and colors for sure.

    • @tim3172
      @tim3172 A month ago

      It just looks like one is set to RGB and one is set to YUV, based on the raised black level of the Nvidia card.
      LONG STORY SHORT: in YUV limited, you miss out on the bottom (darkest) 16 luminance levels for compatibility reasons.
      Because the image can't produce the darkest shades of black, it looks gray and a bit dull.
      RGB or YUV full will always offer all of the luminance levels.
      If he set the Nvidia card to RGB or full range, the output would look much closer.

  • @ProtonD1200
    @ProtonD1200 A month ago

    From Denmark.
    Is there something I have overlooked?
    What is the price difference between the two cards? I mean, you only mention the price of the ARC A310

  • @kingeling
    @kingeling 19 days ago

    Any idea how the A310 can utilize more VRAM than its buffer?

  • @sirrgb2070
    @sirrgb2070 A month ago +1

    Is it just me, or does the Arc A310 footage look more vibrant than the 1630's? Was it recorded with OBS, a capture card, or driver-side (or is it just an editing quirk)?

  • @zangetsu6638
    @zangetsu6638 A month ago

    yoooo try it on Hellblade II !!

  • @chipwitgaming
    @chipwitgaming A month ago

    Also, an important point could be that the ARC A310 also comes in a low-profile, single-slot model (not the model you had).

  • @guitarcat01
    @guitarcat01 A month ago

    ARC's VRAM usage figure accounts for the borrowed system RAM as well

  • @theanglerfish
    @theanglerfish A month ago +5

    In Cyberpunk the colors on Arc are more vibrant - ok, not only in CP

    • @Tundreq
      @Tundreq A month ago +1

      All the games are more vibrant and less washed out. And the question now is, which one is what the game makers intended?

    • @musguelha14
      @musguelha14 A month ago +1

      @@Tundreq The output on the Nvidia card is configured wrong.

  • @gotmog13
    @gotmog13 A month ago

    Maybe you can try to push the A310 with some more power. If it drew 60W, I'm sure the performance would be better. That Sparkle cooler should be enough to keep it cool.

  • @FilthEffect
    @FilthEffect A month ago +2

    Can we get some Sparkle Genie low-profile A380 testing?

  • @GuancheTF95
    @GuancheTF95 A month ago +1

    More shadows on the Arc????

  • @TrusteftTech
    @TrusteftTech A month ago +1

    I would never buy an Intel ARC GPU, but oh man, they are gorgeous with that blue color!

    • @SirVellen
      @SirVellen A month ago

      Why never? They are very competitive in the budget market and support newer technologies. I'm waiting for Battlemage to buy one

    • @TrusteftTech
      @TrusteftTech A month ago

      @@SirVellen Based on people who buy them:
      1. The drivers suck, as they always have for their iGPUs.
      2. The demand for Resizable BAR (which isn't as big a deal now that my PC is dead).
      3. The terrible support for much older games, and I'm guessing general software from the past.
      4. Their driver support window for their iGPUs is very, very short. OK, part of point 1, but whatever. I still can't forgive them that the last driver update for my iGPU was released less than two years after the release of the CPU. Unacceptable.
      5. I have no desktop PC alive.
      6. I have no money to get a PC.
      As long as even one of these stands, I am not getting an Intel GPU.

    • @user-jm6qf9hd9j
      @user-jm6qf9hd9j A month ago +1

      @@TrusteftTech Exactly! I can't understand why Intel is still refusing to add native support for older DirectX versions, or at least add native support for both DirectX 9c and DirectX 11; these two APIs are still used pretty frequently in many new AAA titles and small indie games on both the Epic Games Store and Steam.

    • @TrusteftTech
      @TrusteftTech A month ago

      @@user-jm6qf9hd9j It's up to them to be as stupid as they want. All I know is I will not touch their products as long as they continue like this.

  • @trevorbarney1796
    @trevorbarney1796 A month ago +4

    This is great! Keep covering these Arc cards. You're an amazing advocate for budget gaming and can also be a voice for what these cards are actually like in the present tense.
    So many of the other tech tubers don't cover these anymore... or only periodically circle back to them.

  • @degenerate_kun_69
    @degenerate_kun_69 A month ago

    I think they should introduce ARC cards as laptop dGPUs. They would work great if they can run on such low power and get respectable fps/watt

    • @ryanspencer6778
      @ryanspencer6778 A month ago +2

      They do have Arc for laptops. I have one. It's not horrible, only a bit slower than the A310, but Intel's new iGPUs are around the same performance while using much less power. I don't think Intel will be making any more laptop dGPUs until/unless Arc gets adopted by gamers and put into gaming laptops.

  • @theodentherenewed4785
    @theodentherenewed4785 A month ago +10

    It's so interesting that the colours on the Arc card look darker and the image shows more contrast. Also, there is a difference in textures in Alan Wake 2, as you mentioned. The settings and all the hardware bar the GPU are the same, yet we get a different view; it shows how important the drivers are. Frankly speaking, if I were buying anything below the price of an RX 6600, I would risk getting a used card over these 2.

    • @najeebshah.
      @najeebshah. A month ago +3

      An A770 goes for around 200 these days and beats the daylights out of the 6600

    • @vespa7961
      @vespa7961 A month ago +1

      @@najeebshah. A GTX 770? Lol, a 6600 is similar to a 2060 or 1080, humble yourself

    • @vespa7961
      @vespa7961 A month ago +1

      @@najeebshah. The Arc A770 maybe, but the RX 6600 uses WAY less power, like half

    • @theodentherenewed4785
      @theodentherenewed4785 A month ago

      @@najeebshah. I agree, absolutely; the prices of the Arc A750 and A770 were quite variable (I see the A750 much cheaper, but it's only slightly slower, so a better price-to-performance ratio). These days, the 2 top Arc cards are a good option too.

    • @HandleIsNewAndBad
      @HandleIsNewAndBad A month ago

      @@vespa7961 The RX 6600 always reports 100W of power usage, but in reality it's more around the specified TDP (120+W)

  • @user-jm6qf9hd9j
    @user-jm6qf9hd9j A month ago

    There is actually one massive problem with the availability of these two GPUs.
    Since the GTX 1630's release, one of its massive advantages, besides being able to run on older PCs without ReBAR support, is basically that the GTX 1630 is available from pretty much any major PC store and can be bought in ALMOST ANY COUNTRY around the world!
    For comparison, for unknown reasons, the ARC A310 does NOT SEEM to be available in either Europe (especially East or Central Europe), Australia, Oceania, Africa, or Central or South America! I performed a search for this Intel entry-level GPU and... guess what - all the shops in Google Search's results offering the ARC A310 are ONLY and ONLY from either North America or Asia! If you're from my region - East Europe and the Balkans, like Bulgaria (where I live), Romania, Greece, Serbia, North Macedonia, Montenegro, Croatia, Turkey or Albania, or Central Europe, like Poland, Slovenia, Slovakia, Hungary or the Czech Republic - there seems to be FLAT-OUT NO WAY to get the ARC A310 there. Of the two GPUs tested in this video, only the GTX 1630 seems to be available in these regions, and the 1630 is also widely available in most other parts of the world! So, good luck getting an A310 from a country outside North America or Asia!
    It's a shame, because on recent-gen PCs the A310 definitely performs better than the 1630, which on the other hand is also better for older-gen PCs...

  • @doansumnulu6447
    @doansumnulu6447 A month ago

    What about the 6500 XT or 6400? Why are they outside the range of this comparison? I don't know the prices of these cards around the world, but they're way cheaper than these two in Turkey, and they perform way better, I think

  • @bigupbassline58
    @bigupbassline58 A month ago

    I don't think the 1630 needs a 6-pin power connector.
    Many GPU manufacturers put those on sub-75-watt cards to reuse the same PCB for different cards and to make them look more powerful than they really are. Leaving it unplugged won't cause any issues, and there probably isn't even any power going to the card through it, since all of it is coming from the slot

  • @mbsfaridi
    @mbsfaridi A month ago

    Why no mention of prices?

  • @tobo6162
    @tobo6162 A month ago +6

    Love your videos 👍

  • @luisfelipegonzalez4423
    @luisfelipegonzalez4423 A month ago

    Arc A310 vs RX 6400??

  • @randyharrigan4790
    @randyharrigan4790 A month ago

    Perhaps the reason the Arc card goes over 4GB is that it uses a small bit of system memory to add a bit of performance when it's pushed 🤷‍♂️ just a thought, maybe I'm wrong.

  • @ogrejd
    @ogrejd A month ago +28

    No AMD RX 6300? Aww! :P

    • @Alexandru_T
      @Alexandru_T A month ago +1

      The Radeon RX 6400 outperforms the Arc A310 by at least 10%. But the Arc A310 has a setting that lifts the power limit to 75W, with a corresponding increase in performance.

    • @capblack7367
      @capblack7367 A month ago

      The RX 6400 is equal to a GTX 1650, so the 6400 stomps all over the A310

    • @ogrejd
      @ogrejd A month ago

      @@capblack7367 Note that I said RX 6300, not 6400. It had a video here a couple of months back.

    • @ogrejd
      @ogrejd A month ago

      @@Alexandru_T Yes, I know; that's why I said RX 6300 and not 6400, a card that had a video here a couple of months ago.

  • @Juurus
    @Juurus A month ago

    Got a deal on a used Steam Deck + dock + 1TB SD card for only 300€. One of these GPUs alone could have been 1/3 of that budget.

  • @oliverhoschi6135
    @oliverhoschi6135 A month ago

    Convinced me to buy a used 1650... dirt cheap, faster, and no need for external power. Still a good budget card for 720p (newer games) and 1080p in older games.

  • @pablo_p_art
    @pablo_p_art A month ago

    Didn't know the GTX 1630 requires external power... that's wild... I'm wondering how those two would do compared to an AMD APU (5700G or maybe an 8000-series Ryzen).

  • @ShinichiKudoQatnip
    @ShinichiKudoQatnip A month ago

    I wonder how these fare against the 780m

  • @kumin0312
    @kumin0312 A month ago +3

    Being more powerful while consuming less power, way to go Arc...

    • @betag24cn
      @betag24cn A month ago

      Now play older titles and try to praise the POS; everything is emulated via DX12

    • @ryanspencer6778
      @ryanspencer6778 A month ago +1

      @@betag24cn This is misinformation. They switched to a proper DX9 implementation months ago.

    • @betag24cn
      @betag24cn A month ago

      @@ryanspencer6778 Not misinformation; it is still emulated, just better. Since no one cares, no one has made a proper review.
      Intel remains a bad project and a bad purchase if you don't want to convert video; and if you do, it is still problematic

  • @Doobie3010
    @Doobie3010 A month ago

    The RX 6400 is also a 4GB contender at a lower price point - AND it's powered from the card slot!

  • @xnrg1968
    @xnrg1968 A month ago

    Gracias for the content. Do you think your 1630 will work without the PCIe power connector plugged in? On TechPowerUp the spec for your model is still 75W. Most models do not require a PCIe connector and are still 75W as well. If it doesn't, get a version without the PCIe connector and do a side-by-side.

  • @dualpapayas
    @dualpapayas A month ago

    The little AV1 encoder that could.

  • @giserson2
    @giserson2 A month ago

    It seems ARC has come quite far in terms of drivers, seeing as it matched or beat the 1630 in all the games tested. Quite consistent performance compared to how it used to be.

  • @michaellegg9381
    @michaellegg9381 Месяц назад +1

    The colour between the two cards is wild!! The GTX 1630 is dull and has a hazy look while the ARC A310 is a clear picture and the colour is way deeper and richer.. the intel card is easily the best choice for the performance and power consumption and for the colour palate and don't have that hazy look to it!! NVIDIA needs to get a better eye for haze and colour depth

    • @JJARCHIE
      @JJARCHIE Месяц назад +1

      I cant tell if youre serious about the color part or being sarcastic

    • @michaellegg9381
      @michaellegg9381 Месяц назад +1

      @@JJARCHIE nope I'm serious!! The NVIDIA GTX 1630 has a haze look compared to the ARC A310.. the ARC also has a way richer colour palate. The first test game the GTX had better textures as the ARC was missing some fine detail of the characters shirt 👕 but that was only in that first test game which is weird because it should have been the same.. but even that test the colour is way richer than the GTX and unless he was testing both systems at the same time using different monitors to account for the colour difference and the faint haze the I'm serious that the ARC cards colour fidelity is way better than the GTX 1630. I even went back and paused the video in each test in a few different spots and it's consistent throughout the entire video. The ARC cards colour is way better it's richer and looks better not to mention the better performance and better power consumption.

    • @JJARCHIE
      @JJARCHIE 1 month ago +1

      @@michaellegg9381 God, I gotta be honest, all a GPU does is churn out pixels; it doesn't affect colors. You can ask experts or Reddit if you don't think so.

    • @michaellegg9381
      @michaellegg9381 1 month ago +1

      @@JJARCHIE I know that's how it's meant to be, yes! But put your glasses on and actually look: it's very, very different from one side to the other. The Intel card's colour is much deeper and richer than the GTX card's, and the GTX card also has a hazy or pale look to it. It could be a defect in the card or the driver, but there is a very big difference between the two cards' colour.

    • @arenzricodexd4409
      @arenzricodexd4409 1 month ago

      Most likely a software capture issue. More often than not, when you put them side by side on a real monitor, the colour will look just the same.

  • @silveredbullet802
    @silveredbullet802 1 month ago

    Looks like Intel figured out the hardware part, just catching up with software right now. Who knows how powerful their other cards are if optimized well

  • @Evgeniy_prosto
    @Evgeniy_prosto 1 month ago

    Where the hell is the RX 6400???

  • @gamin546
    @gamin546 1 month ago

    Should've included the AMD RX 6400 too for a full comparison between NVIDIA, AMD, and Intel

  • @CorrichetiLagga
    @CorrichetiLagga 1 month ago

    Come on arc!!

  • @pawnslinger1
    @pawnslinger1 1 month ago +2

    Just from what you showed, I like the ARC card better - I mean the graphics looked better to my eye. But I could be biased, I have really bad eyes.

    • @zangetsu6638
      @zangetsu6638 1 month ago +1

      If even your bad eyes can see the difference, then imagine how good the ARC looks with 20/20 vision!

    • @musguelha14
      @musguelha14 1 month ago

      The games are running at the same settings, so they should look the same. You're just seeing higher contrast in the Intel window, but that's down to some configuration difference.

    • @pawnslinger1
      @pawnslinger1 1 month ago +1

      @@musguelha14 No, they don't look the same to me. They may be the same, but TO ME they don't look the same. To me the ARC output has better color rendition and more dynamic range... the GTX output TO ME looks fuzzier and somewhat muddy. I do think they are very close, but I prefer the appearance that the ARC produces. Not a right or wrong issue, just what my eye likes. A very subjective thing.

    • @musguelha14
      @musguelha14 1 month ago

      @@pawnslinger1 No, it is a literal right or wrong. It looks worse to you because it's wrong. The capture card he's using is expecting full RGB 0-255 and it's getting 16-235, probably because the Nvidia driver defaults to limited range when outputting over HDMI at an HDTV resolution.
      It should be 0-255, and that's why it looks muddy and low-contrast. Higher contrast also affects your perception of sharpness.

    • @pawnslinger1
      @pawnslinger1 1 month ago +1

      @@musguelha14 Yeah, of course, what you say is correct. However, it is still MY FEELING - and MY FEELINGs are, simply put... MY FEELINGs. As such they are never right nor wrong. Simply MY DAMN FEELINGs. Also, you forgot to mention the possible effects of YouTube compression and the quality of my playback of the video... all these factors can and probably do affect how I FEEL about the comparison. In short, I find it extremely irritating when someone tries to correct how I FEEL about something. I cannot and will not be corrected about my FEELINGS. And you are correct about how YOU FEEL... your feelings are yours... as such those feelings are neither correct nor incorrect. They are just how you personally feel. I would defend your right to feel as you wish... I would appreciate the same courtesy.
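
The levels mismatch described in this thread (limited "TV" range vs full "PC" range) can be sketched as a quick calculation. This is a minimal illustration only; the function name is ours, not from any capture tool mentioned here:

```python
# Limited ("TV") range puts black at code 16 and white at 235.
# If a capture device expects full range (0-255), limited-range
# input looks washed out: blacks become dark grey, whites dim.
# Expanding limited back to full range is a simple linear remap.

def limited_to_full(v: int) -> int:
    """Map a limited-range (16-235) value to full range (0-255)."""
    expanded = round((v - 16) * 255 / (235 - 16))
    return max(0, min(255, expanded))  # clamp anything out of range

print(limited_to_full(16))   # limited black -> 0
print(limited_to_full(235))  # limited white -> 255
print(limited_to_full(128))  # mid grey -> 130
```

If the expansion never happens, contrast is compressed by roughly 219/255 (about 14%), which fits the "muddy, low-contrast" look described above.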

  • @JamesSmith-sw3nk
    @JamesSmith-sw3nk 1 month ago

    ARC A310 vs GTX 1060 6gb?

  • @icarus-ht8xv
    @icarus-ht8xv 1 month ago +2

    first!!...greetings 😁

  • @LiquidSnake690
    @LiquidSnake690 1 month ago

    If he can, he should compare the GTX 1630 and Arc A310 to the RX 6300, since that is the slowest GPU in AMD's RX 6000 lineup.

  • @jamielm
    @jamielm 1 month ago

    Probably should have mentioned the price of the 1630 after the A310's to help with the comparison.

  • @mynickisalreadytaken
    @mynickisalreadytaken 1 month ago +1

    As a long-time viewer I have to ask: are you a (medical) smoker by any chance? Your way of speaking, slightly gliding over the beginnings of words sometimes, reminds me of myself as a long-time smoker. :)
    Have a nice weekend. Your videos are part of my every day. I watch them to calm down while looking at older or used hardware. Something that calms me.
    Greetings from northern Germany.

    • @MysteryD
      @MysteryD 1 month ago +1

      What does smoking weed have to do with the manner in which he speaks? I've been smoking for 25 years and I don't assume that because I prefer ankle socks, that everyone who prefers ankle socks smokes weed.

    • @mynickisalreadytaken
      @mynickisalreadytaken 1 month ago

      @@MysteryD Well, I did not say that it is something everyone experiences in general. It is just something that is not unknown or really weird for some people who consume on a daily basis for years. Nothing special about it. It is something I see in myself, my wife, friends and other people talking about it, so I do not know why you make such a weird claim. Good for you then, I guess. It is not that it is making my life any worse, or that someone speaks like a stupid Hollywood stoner from a movie. These are minor nuances in speaking, and I observe them in people smoking weed long-term who often have ADHD. Like myself. And my wife.

    • @MysteryD
      @MysteryD 1 month ago

      @@mynickisalreadytaken If ADHD was the reason you suspected a similar approach to speech, then why bring up cannabis? I get it, you were hoping he was a stoner too, so you'd have even more reason to like his character. That's OK, I know where you were coming from.
      P.S. Sorry about the THC% thing in Germany. In Pennsylvania, USA, I get 30-35% THC flower and 85-90% THC vape cartridges.

    • @mynickisalreadytaken
      @mynickisalreadytaken 1 month ago

      @@MysteryD No. :D I was not hoping for anything; I was just really interested in it. I brought up cannabis because, as I said, it was the reason my talking changed after a few years, and the reason for it is not ADHD alone. It is the combination with cannabis, but I did not want to bring up ADHD, because it was not needed. Until I had to. And I am not the only one. Do you think I am 15 years old and think weed is cool? I use it as medication (not only, of course) and I am going more towards 40 than 20. I prefer more balanced strains in my everyday use.
      Moreover, I am not really that jealous about 90% THC vapes or 35% (lol) THC buds. ~35% THC in a flower is not what I am looking for in weed. Our cannabis law is utterly shit and far from being great or helpful for most people here. Sometimes even worse. Still, it is a beginning.

  • @unnamed715
    @unnamed715 1 month ago

    Legit had no idea a 1630 even existed 😂😅

  • @polarvortex6601
    @polarvortex6601 1 month ago

    The Intel cards require ReBAR for optimal performance. I wish more youtubers would state this before doing any tests. You are safer with AMD or Nvidia, though.
    If the Intel cards didn't require ReBAR I would have opted to buy one of those instead of a 3060, but sadly my 9th gen system doesn't have ReBAR, thus Intel has lost a customer with people like me.

  • @waynetuttle6872
    @waynetuttle6872 1 month ago

    The reason the Intel offering is so much less crappy is that it is on a 6nm process, whereas the Nvidia offering looks like they had to dust off an old fab on a deserted island somewhere to make it; the 1630 is made on an ancient 12nm node. That also explains why the Intel one is so much more efficient.

  • @davidmccarthy6390
    @davidmccarthy6390 1 month ago

    This video shows just how diabolical the 1630 really was...and is. It couldn't even keep up with the worst Arc GPU, and it cost more in my region.

  • @BlackLionPT
    @BlackLionPT 1 month ago +2

    Hello! Any hope for a video comparing a very old system, say a 2nd gen i7, vs a current gen one, both in games and in "general usage feel"? Would be interesting to see if there's any difference between systems over 10 years apart in, say, productivity, browsing and of course games :) Anyway, great to see you following Arc's progress so far!

  • @GeForce210
    @GeForce210 1 month ago +2

    GT 710 2GB vs HD 630

  • @dredd2063
    @dredd2063 1 month ago

    You should also compare these two to AMD's RX 6400

  • @gureguru4694
    @gureguru4694 1 month ago

    The 1630 is still a better buy for its target market of prehistoric PCs that don't have ReBAR. It's sad that someone will end up putting an A310 in his/her i7 3770 PC and wondering why it's SOO SLOWWWW in games.

  • @L3iL0
    @L3iL0 1 month ago

    Never heard of the GTX 1630 😮
    So what's the difference between this and the GTX 1650?

    • @Retro-Iron11
      @Retro-Iron11 1 month ago +12

      20

    • @toddsmithselbow1732
      @toddsmithselbow1732 1 month ago

      Performance.

    • @jameslake7775
      @jameslake7775 1 month ago +1

      The GTX 1650 has 896 CUDA cores at ~1.4 GHz and uses a 128-bit memory bus. The GTX 1630 has 512 CUDA cores at 1.7 GHz with a 64-bit memory bus, and is additionally restricted to PCIe x8 at most.

    • @L3iL0
      @L3iL0 1 month ago

      @@Retro-Iron11 🤣🤣🤣
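
The spec gap quoted in this thread can be turned into a rough on-paper estimate of peak FP32 throughput (cores x 2 FMA ops x clock), using the commenter's approximate figures. This is a back-of-envelope sketch only, and it ignores the 1630's halved memory bus, which hurts it further in practice:

```python
# Back-of-envelope peak FP32 throughput from the core counts and
# clocks quoted above (approximate figures, not measurements).
# Each CUDA core can do 2 FP32 ops per cycle (fused multiply-add).

def peak_fp32_tflops(cuda_cores: int, clock_ghz: float) -> float:
    return cuda_cores * 2 * clock_ghz / 1000.0

gtx_1650 = peak_fp32_tflops(896, 1.4)  # ~2.51 TFLOPS
gtx_1630 = peak_fp32_tflops(512, 1.7)  # ~1.74 TFLOPS
print(f"1650: {gtx_1650:.2f} TFLOPS, 1630: {gtx_1630:.2f} TFLOPS")
print(f"on-paper ratio: {gtx_1650 / gtx_1630:.2f}x")  # ~1.44x
```

So even before the memory-bus handicap, the 1630 starts out with well under three quarters of the 1650's raw compute.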

  • @SatsJava
    @SatsJava 1 month ago

    Look at the video:
    the Arc (left) side looks sharper

  • @AbbasDalal1000
    @AbbasDalal1000 1 month ago

    Please also add the Radeon 780M

  • @floppydisk4500
    @floppydisk4500 1 month ago

    Where's the RX 6500? (I can usually find the 6400 and 6500 for the same price)

  • @daras-
    @daras- 1 month ago +2

    How does it work with an older PC?
    A 4th-6th gen i5?

    • @jqwright28
      @jqwright28 1 month ago +3

      Arc will work on older gens, but without Resizable BAR you will lose a lot of performance and get poor 1% lows in games.

    • @user-jm6qf9hd9j
      @user-jm6qf9hd9j 1 month ago +1

      I think the Nvidia GTX 1630 would be the winner if going for older, cheaper workstations with multicore 8th, 9th or 10th Gen Xeon CPUs, where ReBAR cannot be enabled in any way for the Intel ARC GPU.

    • @andyshtroymish4997
      @andyshtroymish4997 1 month ago

      @@user-jm6qf9hd9j Necrophiliac here. Tbh I'd rather look at the used market, as there are quite a few options out there for a hundred, even in third-world or overpriced and unhealthy Middle East markets. Here in Israel you can snatch a 1066/5600 XT for that money.

  • @JeffreyJohnsonC
    @JeffreyJohnsonC 1 month ago

    To me, the ARC colors, contrast, and perceived sharpness are better.