Intel ARC A310 vs Nvidia GTX 1630 - Can The Cheapest ARC Card Beat The Cheapest GTX?
- Published: 5 Jul 2024
- How does an entry-level ARC graphics card compare to an entry-level GTX GPU? Let's find out.
0:00 Intro and Test Setup
1:14 Alan Wake 2
2:04 Cyberpunk 2077
3:02 Fallout 4
4:00 Forza Horizon 5
5:18 Red Dead Redemption 2
5:56 Starfield
6:49 The Witcher 3
7:41 Final Thoughts
Thanks for watching :) - Science
The fact that the 1630 requires external power, and yet the 1650 doesn’t is hilarious
Lazy design
@@bjarne431 and the best part is that the 1630 has the same 75 watt TDP as the 1650
@@theformerkaiser9391 really? what a joke XD
The GTX 1630 shouldn't have existed in the first place. It's just an underclocked 1050 Ti or a 1030 on steroids. In short, just e-waste.
@@theformerkaiser9391 Probably lower quality chips that required higher voltages to reach the clocks, and hence consume more power. I'd be fine with this, if they cut the price in half.
As it is, it's a disgrace.
I had no idea the 1630 was a thing and kind of wild it requires external power...
I don't think all models of it do. It's supposed to sip about as much as a 1030, so I'm not sure why bother adding a 6-pin.
@@dasauto7346 Probably some overclocked version, GPU manufacturers nowadays overclock anything for the sake of being able to label it "OC".
The fact that it's slower than the cheapest, bottom-of-the-barrel Arc card even in older games with garbage DX9-era-derived DX11 engines like Fallout 4 is hilarious.
I didn't realize there are cards out there that didn't require external power until recently 😅
GTX 1630 has TDP of 75W so technically it doesn't need external power.. but without it it would really stress the PCIe connector.
Thanks for the Intel Arc content, keep it up. They are more interesting than what AMD and Nvidia have. I'd like to see the A580, A770 8GB and more reviews from you.
Intel is a joke
@@asfiraihan2073 And Nvidia's pricing isn't? Like the other dude said, Nvidia is basically Apple at this point.
Yeah hope to check them both out soon
@@_Lassic_ no one should buy budget GPUs from Nvidia. Only high-end GPUs make sense when opting for Nvidia. AMD kinda follows the middle path. All in all, Intel should just go back to making their CPUs.
Nah, more competition = better prices @@asfiraihan2073
When I see pc parts in the garden I just know that it's you
Haha me too 😂😂😂
1630 is a legit successor of the 1030 DDR4 if you ask me.
more like a legit successor of the 710
@@madelaki I'm just wondering if it rocks that exact same 710 chip as the 1030 xD
I like the emphasis on the "DDR4"
@@madelaki hahahaha
Except it consumes more power lol
Intel managing to beat Nvidia with power efficiency so early on is pretty amazing. Now if only they could scale this up to 4080 levels, Nvidia would be in big trouble.
You aren't wrong, if they can knock battlemage out of the park it will spell trouble for Nvidia's bread and butter area. The xx80 series has always been really important.
That's just not true; the 40-series destroys Intel in power efficiency when it comes to the 70-series and 60-series
@@blizyon30fps86 Too bad Nvidia doesn't have anything in the 40-series to compete with this.
@@CGE10 4060 low profile is a thing you know that right?
@@blizyon30fps86 you know the 4060 is nearly 4x the price, they don't compare... idk what you're trying to argue, I'm running a 3060 so it's not like I'm a fanboy or something
I gamed on an Intel Arc A750 for a bit and was blown away by how good it was, since I got it new for $180 USD
Easily the best tech YouTuber who stays humble, unpretentious and so far has not sold out to any sponsorships and/or merch. Not that I don't wish RGHD any success but bro, wish you all the best but at the same time, hope you carry on keeping it real.
40w max while gaming...
meanwhile my 3060 ti draws 30w at goddamn idle
rx 7900xtx 90-100W idle here xD
My A750 and A770 consume 40w during idle, it's not an Nvidia thing.
@@chubbykun mine only pulls 5-6w on idle💀 update your drivers
My 7800xt runs at like 22w idle, what the hell are you doing to your system.
@@chubbykun Mine draws like 12w, update your drivers dude
The weirdest part of the A310 is that the A380 seems to be about the same price, especially since I see much less of the A310 in the US. I ran the A380 with way overkill specs for its little self for a while (ok, ok, so the Pentium G7400 isn't overkill - but the DDR5 6000 MHz RAM was, LOL). The A310 is actually quite similar, with less VRAM - though I rotate my side-gaming PC components regularly and it's been a few months... I hit act three in BG3 on that system, and I just couldn't take it...
I never felt proud of my RX 6500M, until this video
Perfect timing seeing this video now -- I just got back home from the local Micro Center where I bought an Intel Arc A310! I got the Sparkle "ECO" model, and it is going to be used in a Jellyfin media server. Can't wait to play around with the AV1 options, it looks fascinating! Glad to see you are covering the Arc series.
Very nice! It would be interesting to see how big the difference is going from an A310 to an A380, considering they are nearly the same price.
Thanks for the upload. The Arc A310 looks very interesting! Especially in SFF/low profile it seems like a very good option as a simple drop-in upgrade for an SFF office PC. Also you're fine using this GPU with their typical 200/250W PSU.
But for a new build, it doesn't make sense. You can get an AMD APU that performs not far off. The Ryzen 7 5700G has become quite cheap these days, and with some OC and some good DDR4-3600 (OC to 4000 is optimal), its iGPU performance is not far off the GTX 1630
Pretty cool to see these two in a video at all. I still hardly ever pay much mind to the Arc series, and seeing how efficient they are while running games fairly okay makes me feel they're likely underestimated by those just trying to get into the PC space, even more so for basic titles like CS:GO/League while sometimes messing around on light titles such as Moonlighter or Wizard of Legend. Kinda something I could see as a "kid's first PC", but maybe I'm just projecting my own first steps, affording only an FX-6300 with an R9 270. Just what could get me in and play something.
For everyone talking about the colours and contrast in the game, that's not the GPU's fault and simply a configuration difference. They can look exactly the same.
Just by looking at it, I'm pretty sure Nvidia is outputting RGB Limited (16-235) and Intel is outputting RGB full (0-255). The capture card seems to be expecting RGB full and that's why the Nvidia footage looks bad - black and white point is wrong and so the image doesn't have enough contrast.
This is very simple to fix in the drivers.
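For anyone curious what that fix amounts to mathematically, limited-range levels are just linearly stretched to full range. A rough sketch in Python (the function name and rounding choice are illustrative, not from the video):

```python
def limited_to_full(y):
    """Expand a limited/video-range level (16-235) to full range (0-255).

    Limited-range black is 16 and white is 235; if a capture card expects
    full range and this stretch is never applied, blacks stay at 16 and
    the image looks grey and low-contrast.
    """
    return max(0, min(255, round((y - 16) * 255 / 219)))

print(limited_to_full(16))   # 0   (limited black -> true black)
print(limited_to_full(235))  # 255 (limited white -> true white)
```

The driver setting just tells the GPU which convention to output, so no pixel data is actually lost once both ends agree.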
1650 vs A380 with latest drivers next please.
it's about neck and neck, but the Arc wins in VRAM-heavy games like the Resident Evil 4 remake, which needs more than 4 GB of VRAM depending on the settings
Yes please
Or make it 1650 v a380 v 1060
@@roqeyt3566 1060 wins by a slide, but the 3gig variant might be fairer
@@JJARCHIE I still use the 3gb 1060 and for what I'm playing and using it for it served me well, I will only upgrade when I start playing more demanding titles, I hope really he does make that video I'd like to see how it does.
@@metu201 it doesn't matter much. As long as anything needs more than 4GB, it will chug
Team Blue for the win! Nice video 🖖
Whenever I see a benchmark video from zworm or randomgaming, I always mix them up thinking they are the same person
Would be sexy if Intel manages to greatly improve upon this kind of power envelope with Battlemage GPUs. One can only hope that they step up and don't leave gamers to the mercy of Nvidia and AMD
That's why I love this channel, you have the craziest test ideas I've seen on YouTube
It would be nice if you could post the full setup configuration details in the description as a reference when we listen
The thumbnail and Title reminds me of a JJK meme
The Cheapest of Today vs The Cheapest in History
(something like that)
Would you recommend any of those Intel Arc cards for gaming? They seem to do quite well in the videos for the price compared to Nvidia.
Would be interesting to see how they compare to the RX 6400 as that's about the same price. Also would be interesting to see the 6GB A380
why didnt u include the rx 6400? any particular reasons or u just didnt have it in stock?? great vid as always!
Man, that was closer than it should have been really, looks like the Intel Arc drivers still need a LOT of work.
@RGHD can you do a short video comparing these two cards in esports titles like Fortnite, Warzone, CS2, Valorant, Apex Legends, The Finals and XDefiant? Because if you're thinking about buying these cards in 2024, it's probably best to stick to esports titles rather than trying to play AAA games, since those won't be that pleasant an experience and as time goes on it will only get worse. Sticking with esports titles is the best choice. Maybe they can be a cheap option for people looking for a GPU that can do well in esports titles for quite some time.
Arc cards are not something I'd switch my AMD graphics card for yet, but they're getting better and better! It's great to have extra competition!
So close, but honestly Intel's drivers have really improved, given the numbers you're showing today.
One thing I noticed is that the Arc card has a higher contrast and more saturation than the GTX 1630.
I noticed similar with my Asrock A380 when switching from my 1050 too.
It's like a side by side of the differences between an LCD and an OLED display, for this alone the Arc wins for me. I'm a sucker for colourful games.
Great comparison!
Looks like the slightly faster GTX 1650 might be a close competitor for the A310, especially with both requiring no added power connector.
There's up to a 15 fps difference between the two in games like Horizon and Hogwarts
Arc supports DX12 Ultimate, so there is that. I'm looking forward to seeing that ReBAR on/off comparison video, because on my Nvidia card it doesn't seem to be doing anything.
Thanks for the comparisons buddy, I really appreciate it. Do you happen to have a permanent roster of older gaming rigs on a shelf just for testing? For any bus-powered video card out in the wild, it would really help to test and compare them in older rigs to see what the major losses are when running with a slower CPU, slower motherboard slots and slower RAM. For a lineup I would suggest: one LGA 775, one 1st gen Intel, one 2nd gen Intel, one 4th gen Haswell, one 6th gen Skylake, one 8th gen Coffee Lake, one FX-8320 AM3+, and one 1st gen Ryzen 7. For a permanent roster of bus-powered video cards I suggest the GTX 750 Ti, 1030 DDR5, 1050 Ti, 1650, 3050 6GB; the bus-powered R7/R9, RX 550, RX 460, RX 6400, and all Arc cards that are bus-powered. I'm only guessing, but I imagine the Arc cards will have the largest performance loss of any bus-powered video card on the market when you plug them into an old rig such as a 1st gen Intel or FX-8320. This information is vitally important for people interested in turning an old OEM prebuilt into a gaming PC; they need to know which of the bus-powered cards run better in older systems. Thanks again and have a good weekend. 😇🙏
The used market is way too good right now to buy these things. Unless you need a low-profile graphics card you might as well get a used card like the 1660 ti for the same price.
@RandomGaminginHD
1080p is simply asking a bit much from these cards. They are better suited to 768p/720p. Those are perfectly acceptable resolutions and will allow for an increase in settings quality while still gaining frame rate.
Now I'm curious.
Initial benchmarks of the a310 placed it below the 1630.
1630 has often been pegged as a smidge better than the 1050.
Now that a310 is arguably better than the 1630, how does it stack up against the 1050 and 1050ti?
I didn't even know the GTX 1630 existed
Yeah it had a fairly quiet launch and then just disappeared haha
Thank you for the video... still, I want to see the no-connector 1650 compared with the A310... still hunting for a no-connector 1650 for my OptiPlex
At that point you might want to find a better power supply and a more powerful card
@@akiwiwithaface8911 actually, I've got a Fractal Ion+ 760W Platinum in my custom build with an i7-7700K, & yes, I'm still looking for the no-connector 1650. Why? Because the OptiPlex's PSU isn't "upgradeable"~
@@FankyRonald Look for "MSI 1650 4GB LP 75W" on your shopping site of choice; I've got it in my Optiplex 7050 with a RAM bump to 16GB and it runs Deus Ex MD at 1080p full very well.
The official TDP of the 1630 is 75W (which is as much as a PCIe slot can provide) and it seems like only certain models of the 1630 like this one you have here require a power connector, likely for overclocking headroom (it's factory overclocked by 30 MHz, absolute gamechanger 😂). In this tier of graphics cards, requiring a power connector or not, and the difference between using 40W and using 75W+ is actually a big deal.
Tbh the only thing I'm wondering about the A310 is how it compares to AMD Ryzen integrated graphics. From what I've experienced so far, the only cards worth getting are the A770 and A750, for the low cost relative to the 12GB and 16GB of VRAM.
The A750 is 8GB, not 12GB.
So the VRAM figure on Arc cards being higher than the actual physical VRAM means that part of system memory was used to hold textures etc., and when they were required, the Arc card had direct access to system RAM (because of ReBAR)
On a side note: A310 is an excellent card if you're only looking to use the Intel's QSV hardware encoder and decoder. The A310 has the same silicon for encoder / decoder as A770
Full range vs. limited range?
Anyone else notice the difference in color temps? I wonder why that is. The Intel Arc is clearly warmer, seeing more reds and oranges. I wonder if that's some sort of auto HDR. I don't have Arc, so I'm just curious.
yes, the intel has better contrast and colors for sure.
It just looks like one is set to RGB and one is set to YUV, based on the raised black level of the Nvidia card.
LONG STORY SHORT: in YUV limited, you miss out on the bottom (darkest) 16 luminance levels for compatibility reasons.
Because the image can't produce the darkest shades of black, it makes it look gray and a bit dull.
RGB or YUV Full will always offer all of the luminance levels.
If he set the Nvidia card to either RGB/Full, the output would look much closer.
From Denmark,
Is there something I have overlooked?
What is the price difference between the two cards? I mean, you only mention the price of the Arc A310
Any idea how the A310 can utilize more VRAM than its buffer?
Is it just me or does the Arc A310 footage look more vibrant than the 1630's? Is it recorded with OBS, a capture card or driver-side (or is it just an editing quirk)?
I noticed that too
yoooo try it on Hellblade II !!
Also, what could be an important point is that the ARC A310 also comes in (not the model you had) a model that is Low Profile, single lane.
Arc's VRAM usage accounts for the borrowed system RAM as well
In Cyberpunk the colors on Arc are more vibrant... ok, not only Cyberpunk
All the games are more vibrant and less washed out. And the question now is, what is intended by the game maker?
@@Tundreq the output on the Nvidia card is configured wrong.
Maybe you can try to push the A310 with some more power. If it drew 60W, I'm sure the performance would have been better. That Sparkle cooler should be enough to keep it cool.
Can we get some sparkle genie low profile a380 testing?
more shadows in the arc????
I would never buy an Intel ARC GPU, but oh man they are gorgeous with that blue color!
Why never? They are very competitive in the budget market and support newer technologies. I'm waiting for Battlemage to buy one
@@SirVellen Based on people who buy them.
1. The drivers suck, as they always sucked for their iGPUs.
2. The requirement for Resizable BAR (which isn't as big a deal now that my PC is dead)
3. The terrible support for much older games, and I'm guessing general software from the past.
4. Their driver support window for their iGPUs is very, very short. OK, part of point 1, but whatever. I still can't forgive them that the last driver update for my iGPU was released less than two years after the release of the CPU. Unacceptable.
5. I have no desktop PC alive.
6. I have no money to get a PC.
As long as even one of these stands, I am not getting an Intel GPU.
@@TrusteftTech Exactly! I can't understand why Intel is still refusing to add native support for older DirectX versions, or least add native for both DirectX 9c and DirectX 11 - these two APIs are still used pretty frequently on many both new AAA titles and small indie games on both Epic Games Store and Steam.
@@user-jm6qf9hd9j It's up to them to be as stupid as they want. All I know is I will not touch their products as long as they continue like this.
This is great! Keep covering these Arc cards. You're an amazing advocate for budget gaming and can also be a voice for what these cards are actually like in the present tense.
So many of the other tech tubers don't cover these anymore... or only periodically circle back to them.
I think they should introduce the Arc cards as laptop dGPUs. It would work great if they can run on such low power and get a respectable fps/watt
They do have Arc for laptops. I have one. It's not horrible, only a bit slower than the A310, but Intel's new iGPUs are around the same performance while using much less power. I don't think Intel will be making any more laptop dGPUs until/unless Arc gets adopted by gamers and put into gaming laptops.
It's so interesting that colours on the Arc card look darker and the image shows more contrast. Also, there is a difference in textures in Alan Wake 2 as you mentioned. The settings and all hardware bar the GPU is the same, yet we get a different view, it shows how important the drivers are. Frankly speaking, if I was buying anything below the price of RX 6600, I would risk getting a used card over these 2.
An A770 goes for around 200 these days and beats the daylights out of the 6600
@najeebshah. A gtx 770? Lol a 6600 is similar to a 2060 or 1080, humble yourself
@@najeebshah. Arc A770 maybe, but the RX 6600 uses WAY less power, like half
@@najeebshah. I agree, absolutely. The prices of the Arc A750 and A770 were quite variable (I see the A750 much cheaper, but it's only slightly slower, so a better price-to-performance ratio). These days, the two top Arc cards are a good option too.
@@vespa7961 The RX6600 always reports 100w of power usage but in reality, it's more around the specified TDP (120+W)
There's actually one massive problem with the availability of these two GPUs.
Since the GTX 1630's release, one of its massive advantages, besides being able to run on older PCs without ReBAR support, is basically that the GTX 1630 is available and can be obtained from pretty much any major PC store in ALMOST ANY COUNTRY around the world!
For comparison, for unknown reasons the Arc A310 does NOT SEEM to be available in Europe (especially Eastern or Central Europe), Australia, Oceania, Africa, or Central or South America! I searched for this entry-level Intel GPU and... guess what - all the shops in Google Search's results offering the Arc A310 are ONLY from either North America or Asia! If you're from my region - Eastern Europe and the Balkans, like Bulgaria (where I live), Romania, Greece, Serbia, North Macedonia, Montenegro, Croatia, Turkey or Albania, or Central Europe, like Poland, Slovenia, Slovakia, Hungary or the Czech Republic - there seems to be FLAT-OUT NO WAY to get the A310. Of the two GPUs tested in this video, only the GTX 1630 seems to be available in these regions, and the 1630 is also widely available in most other parts of the world! So, good luck getting an A310 from a country outside North America or Asia!
It's a shame, because the A310 on recent-gen PCs definitely performs better than the 1630, which on the other hand is also better for old-gen PCs...
What about the 6500 XT or 6400? Why are they out of range for this comparison? I don't know the prices of these cards around the world, but they're way cheaper than these two in Turkey and perform way better, I think
I don't think the 1630 needs a 6-pin power connector.
Many GPU manufacturers put those on sub-75-watt cards to reuse the same PCB across different cards and to make them look more powerful than they really are. Leaving it unplugged won't cause any issues, and there probably isn't even any power going through it, since all of it is coming from the slot.
Why no mention of prices?
Love your Videos 👍
Thanks!
Arc 310 vs rx 6400??
Perhaps the reason the arc card goes over 4gb is it uses a small bit of system memory to add a bit of performance when it's pushed 🤷♂️ just a thought maybe i'm wrong.
No AMD RX 6300? Aww! :P
The Radeon RX 6400 outperforms the Arc A310 by at least 10%. But the Arc A310 has a setting that lifts the power limit to 75W, with a corresponding increase in performance.
The RX 6400 is equal to a GTX 1650, so the 6400 stomps all over the A310
@@capblack7367 Note that I said RX 6300, not 6400. It had a video here a couple of months back.
@@Alexandru_T Yes, I know, that's why I said RX 6300 and not 6400, a card that had a video here a couple of months ago.
Got a deal on a used Steam Deck + Dock + 1tb sd-card for only 300€. One of these GPUs could have been 1/3 of the budget already.
Convinced me to buy a used 1650... dirt cheap, faster and no need for external power. Still a good budget card for 720p (newer games) and 1080p in older games.
Didn't know the GTX 1630 requires external power... that's wild... I'm wondering how those two would do compared to an AMD APU (5700G or maybe some 8000-series Ryzen).
I wonder how these fare against the 780m
More powerful while consuming less power, way to go Arc...
Now play older titles and try to praise the POS; everything is emulated via DX12
@@betag24cn this is misinformation. They switched to a proper DX9 implementation months ago.
@@ryanspencer6778 not misinformation, it is still emulated, just better. Since no one cares, no one has made a proper review.
Intel remains a bad project and a bad purchase if you don't want to convert video; and if you do, it's still problematic
The RX 6400 is also a 4GB contender at a lower price point, AND it's powered from the card slot!
Gracias for the content. Do you think your 1630 will work without the PCIe power connector plugged in? On TechPowerUp the spec for your model is still 75W. Most models don't require a PCIe connector and are still 75W as well. If it doesn't work, get a version without the PCIe connector and do a side by side.
The little AV1 encoder that could.
It seems ARC has come quite far in terms of their drivers seeing as how it matched or beat the 1630 in all games tested. Quite consistent performance compared to how it used to be.
The colour difference between the two cards is wild!! The GTX 1630 is dull and has a hazy look while the Arc A310 is a clear picture and the colour is way deeper and richer.. the Intel card is easily the best choice for the performance and power consumption, and for the colour palette it doesn't have that hazy look to it!! Nvidia needs to get a better eye for haze and colour depth
I can't tell if you're serious about the color part or being sarcastic
@@JJARCHIE nope, I'm serious!! The Nvidia GTX 1630 has a hazy look compared to the Arc A310.. the Arc also has a way richer colour palette. In the first test game the GTX had better textures, as the Arc was missing some fine detail on the character's shirt 👕, but that was only in that first test game, which is weird because it should have been the same.. but even in that test the colour is way richer than on the GTX, and unless he was testing both systems at the same time on different monitors (which would account for the colour difference and the faint haze), I'm serious that the Arc card's colour fidelity is way better than the GTX 1630's. I even went back and paused the video in a few different spots in each test and it's consistent throughout the entire video. The Arc card's colour is way better, it's richer and looks better, not to mention the better performance and power consumption.
@@michaellegg9381 god, I gotta be honest, all a GPU does is churn out pixels, it doesn't affect colors, you can ask experts or Reddit if you don't think so
@@JJARCHIE I know that's how it's meant to be, yes! But put ya glasses on and actually really look, it's very, very different from one side to the other. The Intel card is much better, with deeper and richer colour than the GTX card. The GTX card also has a hazy or pale look to it. It could be a defect in the card or the driver, but there is a very big difference between the two cards' colour.
@@michaellegg9381 most likely a software capture issue. Most often, when you put them side by side on a real monitor, the colors will look just the same.
Looks like Intel figured out the hardware part, just catching up with software right now. Who knows how powerful their other cards are if optimized well
Where the hell is the RX 6400???
Should've included the AMD RX 6400 too for a full comparison between NVIDIA, AMD, and Intel
Come on arc!!
Just from what you showed, I like the ARC card better - I mean the graphics looked better to my eye. But I could be biased, I have really bad eyes.
if even your bad eyes can see the difference, then imagine how good the ARC looks in 20/20 vision!
Games are running at the same settings so they should look the same. You're just seeing higher contrast in the Intel window, but that's because of some different configuration.
@@musguelha14 No, they don't look the same to me. They may be the same, but TO ME they don't look the same. To me the Arc output has better color rendition and more dynamic range... the GTX output TO ME looks fuzzier and somewhat muddy. I do think they are very close, but I prefer the appearance the Arc produces. Not a right or wrong issue, just what my eye likes. A very subjective thing.
@@pawnslinger1 no, it is a literal right or wrong. It looks worse to you because it's wrong. The capture card he's using is expecting full RGB 0-255 and it's getting 16-235, probably because the Nvidia driver defaults to that when outputting through HDMI into a HDTV resolution.
It should be 0-255 and that's why it looks muddy and low contrast. Higher contrast also affects your perception of sharpness.
@@musguelha14 Yeah, of course, what you say is correct. However, it is still MY FEELING - and MY FEELINGS are, simply put... MY FEELINGS. As such they are never right nor wrong. Simply MY DAMN FEELINGS. Also, you forgot to mention the possible effects of YouTube compression and the quality of my playback of the video... all these factors can and probably do affect how I FEEL about the comparison. In short, I find it extremely irritating when someone tries to correct how I FEEL about something. I cannot and will not be corrected about my FEELINGS. And you are correct about how YOU FEEL... your feelings are yours... as such those feelings are neither correct nor incorrect. They are just how you personally feel. I would defend your right to feel as you wish... I would appreciate the same courtesy.
ARC A310 vs GTX 1060 6gb?
first!!...greetings 😁
If he can he should compare the GTX 1630 and Arc 310 to the RX 6300 since that is the slowest gpu in AMD's RX 6000 lineup.
probably should have mentioned the price of the 1630 after the 310 to help with comparison
As a long time Viewer i have to ask. Are you a (medical) Smoker by any chance? Your way of speaking and slightly walking over beginnings of words sometimes reminds me of myself as a long time smoker. :)
Have a nice weekend. Your videos are part of my every day. I watch them to calm down and look at older or used hardware. Something that calms me.
Greetings from north Germany.
What does smoking weed have to do with the manner in which he speaks? I've been smoking for 25 years and I don't assume that because I prefer ankle socks, that everyone who prefers ankle socks smokes weed.
@@MysteryD Well, I did not say it is something everyone experiences in general. It is just something that is not unknown or really weird for some people that consume on a daily basis for years. Nothing special about it. It is something I see in myself, my wife, friends and other people talking about it. So I do not know why you make such a weird claim. Good for you then, I guess. It is not that it is making my life any worse or that someone speaks like a stupid Hollywood stoner from a movie. These are minor nuances in speaking and I observe them in people smoking weed long-term that often have ADHD. Like myself. And my wife.
@@mynickisalreadytaken if ADHD was the reason you suspected a similar approach to speech, then why bring up cannabis? I get it, you were hoping he was a stoner too, so you'd have even more reason to like his character. That's OK. I know where you were coming from.
Ps
Sorry about the thc% thing in Germany. In Pennsylvania USA I get 30-35% thc flower and 85-90% thc vape cartridges.
@@MysteryD No. :D I was not hoping for anything. I was just really interested in it. I brought up cannabis because, as I said, it was the reason my speech changed after a few years. And the reason for it is not ADHD alone. It is the combination with cannabis, but I did not want to bring up ADHD, because it was not needed. Until I had to. And I am not the only one. Do you think I am 15 years old and think weed is cool? I use it as medication (not only, of course) and I am going more towards 40 than 20. I prefer more balanced strains in my everyday use.
Moreover, I am not really that jealous about 90% THC vapes or 35% (lol) THC buds. ~35% THC in a flower is not what I am looking for in weed. Our cannabis law is utterly shit and far from being great or helpful for most people here. Sometimes even worse. Still, it is a beginning.
Legit had no idea a 1630 even existed 😂😅
The Intel cards require ReBAR for optimal performance. I wish more YouTubers would state this before doing any tests. You are safer with AMD or Nvidia though.
If the Intel cards didn't require ReBAR I would have opted to buy one of those instead of a 3060, but sadly my 9th gen system doesn't have ReBAR, thus Intel has lost a customer with people like me
The reason the Intel offering is so much less crappy is that it's on a 6nm process, whereas the Nvidia offering looks like they had to dust off an old fab on a deserted island to make it: the 1630 is made on an ancient 12nm node. That also explains why the Intel one is so much more efficient
This video shows just how diabolical the 1630 really was...and is. It couldn't even keep up with the worst Arc GPU, and it cost more in my region.
Hello! Any hope for a video comparing a very old system, say 2nd gen i7 VS current gen both in games and "general usage feel"? Would be interesting to see if there's any difference between systems over 10 years apart in say, productivity, browsing and of course, games :) Anyways great to see you following arc's progress so far!
gt 710 2gb vs hd 630
You should also compare these two to AMD's RX 6400
The 1630 is still a better buy for its target market of prehistoric PCs that don't have ReBAR. It's sad that someone will end up putting an A310 in their i7-3770 PC and wondering why it's SOO SLOWWWW in games.
Never heard of the GTX 1630😮
So what's the difference between this and GTX 1650?
20
Performance.
The GTX 1650 has 896 CUDA cores at ~1.4 GHz and uses a 128-bit memory bus. The GTX 1630 has 512 CUDA cores at 1.7 GHz with a 64-bit memory bus, and is additionally restricted to PCIe x8 at most.
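Those bus widths also drive the memory bandwidth gap. As a back-of-the-envelope sketch (the per-pin data rates below are the commonly reported figures, roughly 12 Gbps GDDR6 for the 1630 and 8 Gbps GDDR5 for the original 1650, so treat them as approximate):

```python
def mem_bandwidth_gb_s(bus_width_bits, data_rate_gbps_per_pin):
    # Total bandwidth = (bus width in bytes) x (per-pin data rate)
    return bus_width_bits / 8 * data_rate_gbps_per_pin

print(mem_bandwidth_gb_s(64, 12))   # GTX 1630: 96.0 GB/s
print(mem_bandwidth_gb_s(128, 8))   # GTX 1650: 128.0 GB/s
```

So even with faster memory chips, the 1630's halved bus leaves it with noticeably less bandwidth than the 1650.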
@@Retro-Iron11 🤣🤣🤣
Look at the video
Arc side (left) looks sharper
please also add radeon 780m
Where's the rx 6500? (I can usually find the 6400 and 6500 for the same price)
How does it work with an older PC?
A 4th-6th gen i5?
Arc will work on older gens, but without resizeable bar you will lose a lot of performance and get poor 1% lows in games.
I think the Nvidia GTX 1630 would be the winner if going for older, cheaper workstations with multicore 8th, 9th or 10th gen Xeon CPUs, where ReBAR cannot be enabled in any way for the Intel Arc GPU.
@@user-jm6qf9hd9j necrophiliac here. Tbh I'd rather look at the used market, as there are quite a few options out there for a hundred, even in third-world poophole/overpriced and unhealthy Middle Eastern markets. Here in Israel you can snatch a 1066/5600 XT for that money.
To me, the ARC colors, contrast, and perceived sharpness are better.