Try comparing it to older GPUs that are still available in stock as well. For example, RTX 3060 offers better performance in creative workloads than RTX 4060, so there's no point in comparing all these cards to 4060 when 3060 is still available in stock, and often, at a cheaper price.
This generation of cards is 'meh'. I chose a 3060 Ti, long after the 4xxx models came out.
AMD seems to be not much better. If I was buying now, I would probably choose an RX 6700 XT over an RX 7700 XT.
If I was thinking of energy efficiency or TDP values (like building in a cramped ITX case) I might side with the newer generation. If I wasn't building for myself, I might value the warranty offering of a brand-new video card.
That’s exactly what I bought, because it’s still the best Nvidia card for the money. ALL 40 series cards were made as a money grab. Wait until the 50 series and hope for better. I think Intel will force a price drop on things like the 40 series.
@@thehimself4056
20 series, meh.
30 series, yeah!
40 series, meh.
Seems like every other generation is the way to go.
I’m waiting on the 50 series for that reason. If they fall on their face I’ll buy AMD instead.
@@greenman8 good point, the 3060 12GB sat in the middle with faster bandwidth!
The intro is insane! great job on the video!
Yes, quite the jazzed up intro. I wonder if he used AI to create it? 😮
It's called a mint editor! ;)
@@theTechNotice Is the 3050 laptop 4GB enough for architecture and 3D renderings or is getting 3050 6GB or 4050 more worth it?
Is mint editor a style of editing or an AI software?
To be honest, it's not. It's actually not good. I recommend you go with a PC that has the 3070 Ti or 3080, or just go with the iMac M1 2020 laptop, or go for the iMac M3. I promise you are just better off going with that if you want a better experience and a better editing process without any lagging or anything. @@jrmyg1621
OMG THIS VIDEO CAME AT THE RIGHT TIME!!! I was thinking about buying the 7600 XT for work and now I have changed my mind. Thank you kind man for helping me spend my money better.
Are you going for Arc Alchemist then?
16gb in the 7600xt sounds nice, but then you realize it's capped by the 128 bit bus
so in the end it barely helps the performance at all
The Intel Arc A770 is very interesting. Now hear me out: yes, the drivers still need improving by a lot in some games, but it has massive potential and it performs like a 4060 Ti with the updated drivers. It also has 16GB VRAM just like the 7600 XT, but with a 256-bit bus instead of the 128-bit on the AMD GPU. It also performs better in productivity work, so yeah, that sums up the Arc GPU.
The GPU has a large amount of onboard cache which significantly reduces the performance hit from having a smaller bus.
@waitwhat1320 Yeah, if it was at 192-bit it would be a BEAST of a card, but since it is not, the RX 6700 XT 12GB is still king. For me, if I had the money I would buy the RX 7800 XT 16GB. That card simply offers a lot for the price and it's an amazing 1440p card.
@waitwhat1320 Having onboard cache (what AMD's marketing calls infinity cache) makes a huge difference. It's what allowed them to compete with nvidia on the last generation using only GDDR6 and smaller buses instead of GDDR6X and larger buses. In fact nvidia copied AMD with the 4000 series and added cache as well, which is why they also now have smaller buses on everything but the 4090.
@@Son37Lumiere maybe at lower resolutions, but as you go to much higher res its effectiveness also goes down. This is something AMD themselves admit as one of the issues with Infinity Cache. That's why back in the RDNA2 generation AMD said it's something they still need to look at, whether IC will be worth it going forward, because like it or not that cache also takes die space that would probably be better given to other features like more shader count.
AMD GPUs are still very niche outside just pure gaming. Good value, but shaky support in productivity tasks. If I hear the words "productivity" or "content creation", the choice automatically becomes Nvidia to ensure proper support for anything you'll want to throw at it. The 4070 Ti Super is looking like an especially solid option for that use specifically rn.
It all depends on what you plan to use it for. AMD works very well with most media use, particularly Davinci Resolve, as well as most applications that support OpenCL. Any application that is designed around nvidia's proprietary formats like CUDA or OptiX (which includes most rendering software) has much better performance with nvidia.
Hey there.. I just wanted to say Thank You for doing videos focusing on content creation vs just gaming. I still haven't jumped over to premiere or resolve as I'm happy with PowerDirector365 for now, and it only uses a fraction of the resources necessary to run the other apps. I've been looking at the 7600xt, and it certainly has my interest. I won't use it in my primary rig as that has a 4070TI, but as I tinker with other gear and want to try a lower-cost 16GB card, I'll probably pick one up.
Anyway, thanks again.
Cheers
Rick
Excellent review. One thing I would love to see is a bake off of cards like these for live streaming, using OBS, vMix and possibly StreamLabs OBS. I have an encoder machine, 14700k and Sparkle A770, and it is quiet, easy on power draw, and lets me send the output of my production machine to it, via NDI, and just stream. I could squeeze a bit more efficiency out with a 4070 TI or super, but would rather stick with my cheap solution. Thoughts on doing an episode on streaming, as that's creative too!
Adding the Arc A750 in here would've been so great. Can't wait for the re-testing; maybe adding some older cards like the RX 6xxx, RTX 3xxx and RTX 2xxx series would be great too, there are noteworthy cards in those lines.
It's actually quite surprising to see Intel do better than AMD in blender. I thought it would be the other way round
I know right...
Intel Arc has quite solid RT performance, that's why it performs well with RT engines like Cycles.
Raja, who spearheaded this design, is compute-focused first and everything else second, which is why Arc actually does well in productivity despite the drivers still being behind. Raja did the same thing with the Vega GPU and Fury; both were compute monsters at the time, even more so than Nvidia.
Blender has no real support for AMD....yet. AMD is working to change that.
I am so happy Intel seems to be sticking with it, and doing well. (They had a tough start, I was worried they would quit.)
If I didn't need an NVIDIA GPU for data science and machine learning (it's a software requirement), I would definitely be putting an Intel Arc GPU in my computer.
If that's CUDA, there's an AMD translator called ZLUDA.
I did about a year ago, I'm quite happy with it, would recommend :)
@@JohnDoe-ip3oq nah, most of the time these translation layers don't work. Tried it for my friend as he hates Nvidia, but he had to go with the 4060.
Bro buys an RTX 4090 with a motherboard that doesn't support the potential of his gpu and the CPU. Plus, PSU. AI learning. Since when? Don't forget it's a software requirement. 😮. I love stable overclocking. 😊.
Thanks for the informative video. :) One question, though: which Blender version were you using for the benchmark and was the AMD card tested with HIP-RT or just normal HIP? In my testing with 3.6 there was a noticeable improvement in AMD performance when using HIP-RT but the 3.6 benchmarks (at least as of October) weren’t enabling that.
The 7600 XT and Arc A770 definitely presented some good use cases for their price points. The 4060 8GB still fails to make a good case for itself compared to the 3060 12GB with more VRAM and memory bandwidth.
I'm stuck, guys. I do YouTube and I am stuck between getting the Intel Arc A770 & the Nvidia 4070 Super! I can do both within my $1350 budget, but if I can save money I will if it's only a slight increase in performance.
A770
@@theTechNotice not going to lie bro, probably going to do it & go for the Arc A770 just to get my gf off my butt for spending so much🤣 I just don't want to have to buy a new GPU every other year lol. Thx for the response btw!
@@BumpkinBros If you know your way around a PC you should get A770. It runs a bit toasty but definitely delivers. Reply to this comment if you need tips about Arc Alchemist.
@@lordshitpost31 oh I definitely considered it, and the GPU was the last purchase because I so heavily debated going Intel. Honestly, depending on how things go for the channel, I would love to get one and test it out for a period of time, because those 16GB of VRAM and the continued improvements definitely make it a great option
Just add the RTX 3060 12GB for $230, it can go head to head with these cards
I have the 7600 XT running alongside a 7600 CPU. The new drivers for the XT have really sped it up, it runs like a top now. Thought I had bought a pup, but it's very good and runs cool; build quality of the Gigabyte version is nice.
I was thinking about building a pc with the same specs
Can you tell me if it's good for editing and gaming
And does it lags in any specific work?
@@alex0O0O It's a good basic PC, rock solid performance. Updated the BIOS once since I built it. Not done any editing on it, but I would imagine getting the best CPU you can afford; mine was built to game on and take over laptop duties from an old Windows 10 machine. It's air cooled and does the job. Don't be sucked into the latest CPU from AMD; from the numerous reliable sources it's all a bit smoke and mirrors for improved performance.
So would a 3090 be better to have for 3D rendering vs the 7900 XT? My current setup is a 7950X still using a 3060, but I'm finding it takes a huge amount of time to render my models. I have 128GB of DDR5-6000 RAM. I was considering the 7900 XT because of the VRAM, or should I go for a CrossFire setup, or just go grab a 4090? But that card scares me because of the power plug issues. I am not using Blender; I am using 3ds Max, Bryce 7 and MatterControl. Any suggestions would be much appreciated.
Oh yeah 3090 would be much better!
And I wouldn't worry about the 4090 plug if you wanna go for that :)
@@theTechNotice ok thanks
amd cards are really weak in 3d. 4060ti beats 7900xtx in redshift.
by a hundred miles
Your videos have always been very informative. I was surprised you didn't go more in depth on the subject of gaming with some at-the-wall power draw test results, because we all know that besides the 4060, power draw is hardly ever even close to what it says on the box with the rest of those cards. Those two plugs on the 7600 XT say it all.
Actually, we can say that the Arc A770 is $260, since you get one for $310 and you get a free game with it, like a guy I know did (he got Assassin's Creed Mirage). So is it really bad? (I also see them very often at a $20-30 discount.)
Thank you for yet another fantastic video! I just have a quick question: Am I correct in assuming that you utilized Deeplink (the Arc 770 and the 12900K working together) in your benchmarks, and that your results are reflective of this setup?
Point.. because for the same card comparison results are different from this channel... 🤔
Can't believe how close the RX 7600 non-XT is to the RTX 4060 in video editing. In my country there is about a 55-60 USD difference between them; is the power usage worth that extra cash?
How does this stack up against the 3060 & 3060 Ti?
You should mention the cost of running a tower with these cards per year & how that costs more than the card itself. If you have a desktop with a beefy GPU & you game for 3 hours a day, that will cost you approx £328 a year in electricity in the UK. But people run their machines a lot longer than 3 hours per day!
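For anyone who wants to sanity-check that figure, here is a minimal back-of-the-envelope sketch in Python; the wattage and tariff numbers are my own illustrative assumptions, not figures from the video or the comment. At a typical ~400 W whole-system draw the yearly cost comes out far lower, and the quoted £328 only appears if the rig pulls roughly 1 kW while gaming:

```python
# Rough yearly electricity cost for a gaming PC. All inputs are illustrative assumptions.
def yearly_cost(system_watts, hours_per_day, price_per_kwh):
    kwh_per_year = system_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

print(f"£{yearly_cost(400, 3, 0.29):.0f} per year")   # 400 W, 3 h/day, £0.29/kWh -> ~£127
print(f"£{yearly_cost(1000, 3, 0.30):.0f} per year")  # a ~1 kW rig is needed to reach ~£328
```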
OMG, the 7600 XT's bandwidth increased by 32GB per second compared to my GTX 1070 launched 6 years ago. We must buy this modern GPU; this GPU must be added to the must-buy list.
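That 32 GB/s gap does check out if you plug the commonly quoted spec-sheet numbers into the raw-bandwidth formula (per-pin data rate times bus width). A quick sketch; the data rates and bus widths below are assumptions taken from spec sheets, and the calculation ignores the large on-die caches discussed further up:

```python
# Raw memory bandwidth = per-pin data rate (Gbit/s) * bus width (bits) / 8 bits per byte.
def bandwidth_gbs(data_rate_gbps, bus_bits):
    return data_rate_gbps * bus_bits / 8

cards = {
    "GTX 1070 (8 Gbps GDDR5, 256-bit)":    (8.0, 256),   # -> 256 GB/s
    "RTX 4060 (17 Gbps GDDR6, 128-bit)":   (17.0, 128),  # -> 272 GB/s
    "RX 7600 XT (18 Gbps GDDR6, 128-bit)": (18.0, 128),  # -> 288 GB/s
    "Arc A770 16GB (17.5 Gbps, 256-bit)":  (17.5, 256),  # -> 560 GB/s
}
for name, (rate, bus) in cards.items():
    print(f"{name}: {bandwidth_gbs(rate, bus):.0f} GB/s")
```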
For Unreal Engine 5 which is better?
The 4060 because of Nvidia gpu bias within the engine in stock configuration, and then the 7600XT in non-RT scenarios that get ram heavy (but will still be quite slow in terms of core performance, so basically it's just meeting bare minimum pretty much). The A770... Is just not able to keep up as well, mostly due to instability and when it does work, it's not a particularly _great_ showing.
Grab 3060 12 GB for now... Cheap and more efficient...
I wonder, when you need a PC for engineering design, should you look for more of a "creator" PC or a gaming PC? Workstations are sometimes extremely expensive (I know quality costs).
I suggest a "creator PC", but gaming PCs would be fine if your workload isn't highly CPU intensive. What specific programs do engineers use though?
@@jrmyg1621 Thank you for the answer. These are some examples: lighter - AutoCAD; a bit "heavier" - Civil 3D, Revit, Inventor, ArchiCAD, Siemens NX; and another group - Pix4D, Metashape. I know that some use the CPU and some OpenCL, but most of them list "NVIDIA Quadro or AMD FirePro" in their specifications.
@@Karol-ip6vx a workstation PC would be very effective for your use case, but if it's too expensive, a decent creator PC would be fine
check the build guides in the description of the video
If you are on a budget and you use Adobe, get as many ProArt components as you can; they come with 3 months of CC each 😅
I got the 7600xt for 288 new for my first gaming PC, pretty excited!
I just reinstalled my ASRock Phantom Gaming Arc A770 16GB card in my Ryzen 7 5800X PC, which I also reinstalled Windows 10 22H2 on, so that everything was clean with no previous graphics card drivers on it. Then I installed Halo MCC and Halo Infinite, played through every campaign, and all came back with at least 100 FPS, with 165 FPS being the highest as my monitor is rated for that speed, and I turned on V-Sync in all my games. Heaven Benchmark even returned a score of 209 avg FPS and 362 max FPS. Can't wait for Battlemage so I can build a Ryzen 7 7800X3D PC.
I built a 7500F / 32GB 6000MHz / A770 system and, coming from a 5600XT 5700XT build, it's very snappy and fast!
Look how efficient that 4060 is in power AND performance. Impressive. Those cores are super efficient for older boards/power supplies... and it actually has V-Ray and DLSS.
Thanks for the comparison. Would love to see some metrics with Topaz Video AI for upscaling.
It's interesting this price range and product range for creators.
"Best GPU for a creator"
Between these 4, my first choice would be the Arc A770 solely due to Intel QuickSync and the AV1 codec. It also has decent gaming performance so its a good all-round card.
7600 also has AV1 codec
@@nalinipravapanda6816 true, but it only has 8GB of VRAM compared to 16GB on the A770, so 1440p performance is worse
@@flameelectromidis yeah right, but would a creator use an entry-level card at 1440p, whether it's gaming or creating?
@@nalinipravapanda6816 What if it's a gamer with a YouTube channel and a 1440p monitor?
@@flameelectromidis then you shouldn't even think about it
Its gaming performance is about equal to the RTX 3060 12GB, and everyone knows about Intel's gaming performance and driver compatibility with games
You are going to get very low fps, and high fps is essential for a YouTuber
Is there any positive to having an Intel Arc GPU paired with a 14700k? I have a 3080 and it does fine, but the VRAM is a bottleneck sometimes. Just looking to see if there's a good upgrade (not 4090) that wouldn't break the bank. I'm mostly using DaVinci Resolve.
Wait for battlemage
I have no issues with my A770, it is a great card. I upgraded to 1440p monitors, which prompted me to NEED to upgrade my OG 2060 6GB. If I had a 3080... I would sit on it till Battlemage and the 50xx series are announced. If Battlemage brings 4070 Ti performance for under $500, I will be happy to upgrade again.
@shuntao3475 I'm waiting to upgrade my 2080. I'm hoping Battlemage is good performance-wise.
What I have currently is great; it's unfortunate that the things I worked on most are all unusable, which is another unrewarding feeling.
Comparing it to whatever someone else is doing doesn't make sense and is ridiculous in every aspect.
Is this Arc A770 the 8GB variant or the 16GB one?
I do 3d renderings all day, everyday as well as a little photo editing. Do I buy the 4060 8gb or a 3060 12gb? 🤔
Which one did you buy? Reply pls 😊
if you do tables in XLS, could you use the data bars to visualize values in cells? This would help to understand numbers quickly. Otherwise I love your videos. thank you
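If anyone wants those data bars added programmatically rather than by hand, here is a minimal openpyxl sketch; the card names and scores are made-up placeholders, and it assumes a plain .xlsx workbook:

```python
from openpyxl import Workbook
from openpyxl.formatting.rule import DataBarRule

wb = Workbook()
ws = wb.active
ws.append(["GPU", "Score"])
for row in [("Card A", 100), ("Card B", 92), ("Card C", 88)]:  # placeholder data
    ws.append(row)

# Data bars over the score column, scaled from the smallest to the largest value.
rule = DataBarRule(start_type="min", end_type="max", color="638EC6", showValue=True)
ws.conditional_formatting.add("B2:B4", rule)
wb.save("gpu_scores.xlsx")
```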
In India this Arc A770 is available at $480.
Should we buy it?
I got the same issue... In my country it's at the same price... But despite the software & drivers it's the best of all
Excellent work. Informative and useful. Something I didn't catch: you repeatedly mentioned the benefits of 16 GB VRAM for the AMD card, but does that A770 have 8 GB or 16 GB? And if it has only 8 GB, perhaps you should test the 16 GB version.
A770 I tested has 16GB ;)
Why is EVGA the only company that writes the full name of the GPU on the GPU, like for example RTX 3080? All the other companies just write GEFORCE RTX.
The Arc A770 is only $289 at NewEgg today.
If only that was consistent :(
$269 on amazon for Sparkle Titan OC 16GB version
I got it for $230
I'm hoping they continue the driver support for Intel Arc cards; hopefully Battlemage comes out and there are more choices for the consumer.
I need a card for 4K 3D rendering in Maya Arnold and Unreal Engine 5.3, so please help me brother 😢😢😢 which one should I choose? 😢😢😢
I have a Ryzen 5 1600 (PCIe 3.0). My GTX 1070 stopped working. Not replacing my CPU anytime soon. Which GPU should I get? I play games as well as edit wedding videos and photos. My choices are the RTX 3060 12GB, RTX 4060, RX 6700 XT and RX 7600. I have a 1000W power supply. I need something that won't drop its performance on a PCIe Gen 3 motherboard, and I'm trying to avoid any CPU/GPU bottleneck.
Rx 6700xt. 12gb vram>>>>>>
@@TheAtomoh yes, it was on my mind. But I think it might cause a bit of a CPU bottleneck, around 10-15%. Nonetheless, the RX 6700 XT is a powerful GPU with 12GB VRAM. It will chew through everything thrown at it, both games and productivity apps.
vega 64 dirt cheap on ebay
Speaking of Blender, I have done a cost vs. benchmark score analysis. I was really surprised by how well the 4060 and 4060 Ti performed in it. They provide more than 80th-percentile performance for close to 20th-percentile price. Again, this is only for Blender.
The 3060 is not much worse and has more VRAM. Considering the fact that Blender straight up crashes when you run out of VRAM, 8GB cards are a no-go. Imagine leaving a 500-frame animation rendering overnight, only to discover in the morning that it crashed on the 10th frame.
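A minimal sketch of the kind of cost-vs-score comparison described just above; the prices and benchmark numbers are placeholders, not real results, so plug in current prices and Blender Open Data median scores yourself:

```python
# Value-for-money = benchmark score per dollar. Placeholder numbers only.
cards = {
    "Card A": {"price_usd": 300, "blender_score": 3000},
    "Card B": {"price_usd": 500, "blender_score": 3600},
}
for name, c in cards.items():
    print(f"{name}: {c['blender_score'] / c['price_usd']:.1f} points per dollar")
```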
Can you make a video on how to combine the Arc A770 with an RX 7600 XT, or even a PC with 2 Arc A770 GPUs? I'm contemplating making use of 2 GPUs instead of buying a $1000 RTX 4080.
How do these compare to a RTX 3070?
Nvidia had a monopoly with CUDA, but now there's a translator for AMD called ZLUDA which is open source. AMD is supporting ROCm more, but there are enough options available that you can't say AMD is unusable without being a liar.
ZLUDA isn't developed anymore, unfortunately.
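Side note on the translation-layer discussion: a quick, hedged way to check which GPU backend a PyTorch install actually sees, since ROCm builds of PyTorch reuse the torch.cuda namespace (this doesn't assume anything about ZLUDA itself):

```python
import torch

# Works on both NVIDIA (CUDA) and AMD (ROCm/HIP) builds of PyTorch.
print("GPU available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))
    print("CUDA version:", torch.version.cuda)                  # set on NVIDIA builds
    print("HIP version:", getattr(torch.version, "hip", None))  # set on ROCm builds
```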
My friend is selling me his brand new XTX 7600 for $170, is that a good deal? Should I get it or just buy a 4060 for double the price?
When you say the 4060 is best for 3D, does that mean someone who uses SolidWorks to create 3D models and assemblies for 3D printing is best suited to this card? 80% SolidWorks, 10% video editing, 10% gaming.
How about Stable Diffusion, which one do you recommend?
The 6700 XT is cheaper, down to $250, and performs better than all of the above. But if the A770 drivers were in better shape and allowed full use of the hardware, it would be in the 6800 XT range.
Finally, someone is including the Intel GPUs.
How does this card compare with the ASUS ProArt GeForce RTX™ 4070 Ti Super OC Edition Graphics Card ???
how much perfomance is expected to be improved by Arc 770 in the new benchmark?
id say 10% to 20% maybe even 30%
what about the unreal engine 5? how do they compete with each other in game development?
got it dont buy the 4060
If Intel dropped the Arc A770 price below $300, let's say almost consistently, it would be the best for creators, because at that price range you're competing with those two, not to mention the last-gen cards from both, namely the 3060 12GB and 6700 XT 12GB.
Didn't you recommend the 3060 for strict Photoshop/LR Classic users? That thing is about $345 for the Zotac (I plan on getting this with my next i9 14700K Z790 build). So for Photoshop/LR, and 5-15% video editing, the A770 or the 3060 or???
Maybe others besides MinnetLP... you guys and gals might know?
You might not even need a GPU for photoshop and lightroom. Integrated graphics could be enough. Even for medium video editing I can get away with the intel iGPU in premiere (I have a 13600k). Do your build and get the GPU later if you need it. Either one is a good option.
@@_mxvega Thanks for the input. My raw files are usually 100MB at a single layer, I process at full resolution and often get past 10 layers. I do high resolution outputs in forensic biotech imagery, so it's demanding, and the last thing I want and dread is waiting for pixels to finish moving when I do a smudge move back to back, or other edits. Another thing I hate is in LRc, when hovering over the thumbnail images on the bottom in process mode, I have to wait before the name changes to the image my mouse is hovering over. I don't know what memory or processor is responsible for this, but it's so annoying to wait for these simple things. I have around 90GB of RAM, but at the lowest speed as it's mixed, and I'm running the 6850K processor. What taxes me most is likely my browser tabs. I regularly have about 50 of them open constantly. And I don't know if I will change this way of working. I love jumping back and forth to look things up, and listen to YouTube or music, and see my emails, and eBay, and Amazon, and Newegg, or whatever I am researching.
@@philindeblanc That's pretty different from what I do. But I think the CPU is the most important for these things, specifically single-core performance. I don't think there will be much gained by getting a more expensive GPU. In terms of RAM you should probably get 2x32 GB so you have the option to upgrade to 4x32 GB. That's what I did, but I've never seen anything close to 64 GB being used. I think you'll be pretty happy with your build though.
@@_mxvega Ya, I agree on the CPU. Contemplating between the 14700 or 14900, but looking at the numbers, I don't see a reason not to get a 13700. Files can be in the 1+GB range when working on them, so I might get 2x48 GB.
A present phenomenon: for GPU power consumption Nvidia to AMD is what AMD is to Intel in CPU - cool, gentle sippers.
Thank you so much ❤❤
sadly, as a blender user, I wish amd cards were good enough, but at the moment, i'd be better off getting an rtx 2070 or an rtx 2060 12gb than buying the amd 7600xt
I have ordered the 4060 Ti, is it kinda better than all the GPUs you benchmarked?
The 16GB version would have been. Good luck with upcoming Capcom titles such as Dragon's Dogma II, etc. with only 8GB VRAM.
yeah i ordered the 16 gb. thank god, thanks for telling me@@alchemira
How good is the 8500G for creators?
Asus AMD Radeon RX7600 V2 OC, 8GB GDDR6 -369,69 €
ASRock AMD Radeon RX7600XT Challenger OC, 16GB GDDR6 -449,00 €
Acer Predator BiFrost Intel Arc A770, 16384 MB GDDR6 -465,30 €
ZOTAC GeForce RTX 4060 Solo, 8GB GDDR6 -456,87 €
You're getting scammed if you pay 4060ti price for a 4060. Look for better prices.
@@SaltyMaud
These are prices from a store near me
Is a video for the 4070 Ti super coming soon?
out already! :)
@theTechNotice Can you link me the video? I can only see the 4070ti not the Ti SUPER 16Gb
I wanna buy a 4060 (I don't have a big budget for a PC), and a friend of mine got a Phantom Gaming A770 and wants to sell it to me for $350 CAD, like new, or I could go with a 4060 for $480 CAD. Still can't make a choice between them, any help?
I'd prob take the Arc as it has double the VRAM and might end up much better in the long run with updates, and it's still a good GPU for most titles right now. But I would first look for compatibility issues if you have a favorite game you want to make sure works with the Arc.
@@yugdesiral ty for the answer. I like RE4 Remake and the Dead Space remake a lot, and the next ones I want to do are Baldur's Gate 3 or The Last of Us remake.
I chose the A770, good choice I made.
AMD is making strides... Working on an i7 12700K and can't decide which GPU to pair it with, AMD or Intel? I mainly use FL Studio, Premiere and Photoshop/Capture One. Might just skip the GPU for now and slap 64GB of RAM in it. Have you tried out the new 8000 series APUs from AMD?
Not yet :)
I put an A750 card in my Dad's computer for $170 (Christmas sale) and it runs his CAD and SolidWorks wonderfully. My daughter and I both have A770s. I have NO regrets. My daughter and I have had our cards for over a year, Grandpa's for 8 weeks. The reason Grandpa got the A750 was price. This generation sucks, and we saved the money to see what Battlemage or the 50xx series will bring to the table.
@@shuntao3475 that sounds cool. I will definitely keep an eye open for the A770. Hopefully I can snag a good deal.
I use Vegas Pro 18 and do basic editing. Nothing too heavy. I currently only use my 14900K for it. Rendering a 4K video with just the iGPU is a beast. My setup currently has 64GB of DDR5-6000. Waiting on what Battlemage brings to the table before deciding on a GPU.
I wonder if AMD's GPUs are stable in DaVinci... also there is a rumor that rendered video quality is not on Nvidia's level??
I would go with Nvidia on anything non game related. I can't believe someone would recommend AMD or Intel for non-gaming workloads; only Nvidia can be used outside of gaming in practice. I bought my RTX 3060 mainly for Blender, but I can do Stable Diffusion now too... just because Nvidia GPUs are the most versatile. So if you intend to run more than the products benchmarked in that particular video, it's better to choose Nvidia, because it works in everything. By the way, Stable Diffusion can enhance your final rendering, add details and make the image better (so it can be used for easy postprocessing). Just a light touch from Stable Diffusion and your 3D rendering goes from just "good" to "perfect"... it can add detail which is not there and is hard to add otherwise.
@@Slav4o911 I agree, thinking to buy 4070 ti super..16Gb, dual encoders
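For anyone curious what that "light touch" pass looks like in practice, here is a minimal img2img sketch with the diffusers library; the model id, prompt and strength are illustrative assumptions, not a recipe from the commenter:

```python
import torch
from diffusers import StableDiffusionImg2ImgPipeline
from PIL import Image

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # assumed checkpoint; any SD 1.x model works
    torch_dtype=torch.float16,
).to("cuda")

render = Image.open("my_render.png").convert("RGB")  # the finished 3D render
result = pipe(
    prompt="photorealistic, detailed materials, natural lighting",
    image=render,
    strength=0.25,       # low strength keeps the composition and only adds fine detail
    guidance_scale=7.0,
).images[0]
result.save("my_render_enhanced.png")
```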
With the AMD card, the iGPU on the Intel chip will still work, right? Oh, tricky one then, it's got to be this or Arc!
Yes, Quicksync would still work :)
@@theTechNotice Have you tried using the iGPU on AMD's 7000 series CPUs to see if it can be used concurrently with a dedicated GPU like quicksync?
Would you chose a 3090 or a 4070ti super for UE video making?
4070ti super
And if supported on your motherboard, you can go for two 4070 Supers.
I have a Ryzen 5 5600G, should I buy this GPU instead of the RTX 3060? Will it bottleneck?
No. You can even pair it with a 4060 Ti.
ZLUDA is coming and will destroy Nvidia cards in Blender and make CUDA work on any AMD card.
It will make CUDA work on AMD cards, but it doesn't mean those applications will run as fast and stably as they do on Nvidia hardware. If that were the case, AMD would put 100% of its effort into something like that rather than into HIP and ROCm.
I'll just wait for Battlemage; pretty sure Intel is just not gonna price it high. They want to take market share.
I have an AMD 5800X and a GTX 1070 and I'm planning to upgrade my GPU; suggest a good card for video editing.
Just bought a 4070 for less than 475. Used a GTX 1080 before. Somehow I still believe you can't go wrong with CUDA if you operate on Windows.
It would be great if someone made a Blender productivity benchmark and not just rendering. It should be possible as an addon to Blender. Modeling tasks, animating tasks, what deformers are loading more cores or frequencies of CPU/GPU etc.
Whenever the Intel A770 is called better in his videos, he is showing H.264 codec performance, and when other cards are called better the H.264 reference is not there... Why? Why isn't the criteria the same every time?!
14700k with arc a770 vs 14700k with 7600 xt for davinci resolve (free version)
My Canon R6 m2 shoots in H.265 (4k)
Hey bro, which GPU should I get for gaming, editing, streaming and content creation? Please suggest one of these 4 GPUs: 1. RTX 3060 12GB, 2. Intel Arc A750, 3. RTX 4060, 4. Intel Arc A770 16GB...?
Did u get ur answer
The Arc A750 is the lowest of those cards. I'm also confused between the other three: 3060 vs 4060 vs Intel Arc A770 😢😢
@@PradeepKumar-tq4ev The Arc A750 is the best value; the A770 is just 14% faster than the A750 but more expensive. The 3060 12GB has no AV1 encoder or decoder, and the 4060 only shows about 16-19 more fps than the A750 in games while costing more. The A750 is an all-rounder and within budget, and fps is just a number anyway; 60fps is fine for any game and AAA titles play smoothly, but the A750 gives 100+ fps in games and 300-400+ fps in games like Valorant and Fortnite. Now about VRAM: it only has 8GB, but that's also just a number, and most games are optimised to run in 8GB. The A750 has 8GB of VRAM but a 256-bit memory interface, which the 4060 doesn't have, and even the RTX 3060 only has a 192-bit memory bus. So the A750 is better in price-to-performance than the 4060 and 3060, and the drivers keep improving. At launch it didn't perform well, but now the drivers have improved and it's a very powerful GPU for content creation and 1080p/1440p gaming, and it will keep getting better. And bro, I don't know who plays games in 4K these days 😂, 1080p or 1440p gives the best gameplay experience, and even esports tournaments are played at 1080p. Just go for the Arc A750, it's the best all-rounder. I suggest you watch the Hardware Unboxed video on YouTube, just search "Intel Arc A750 vs. GeForce RTX 4060", and once you see that video you'll know about this GPU and definitely buy the A750. Go for the A750 ❤️🔥
@@PankajTayade thanks 🙏 a lot for the detailed answer. But I'm mainly into 4K editing stuff rather than gaming... so which is best for 4K editing? If you know the answer pls reply, thanks 🙏👍
@@PradeepKumar-tq4ev yeah bro, go for the Arc A750, it's the best for 4K editing and content creation, and also better than the 4060 and 3060 in editing. Go for the A750; you can even do 8K editing on this GPU, bro.
Howdy. I have watched most of your Arc videos. I heard somebody talking about some new AI creation tools that can be used on Arc cards; let us know what that is. I am not an AI fan, but I do like animation and voice-over animation. I just wonder if those tools will work with that. I just built an Intel 12th gen i7 / Arc A770 16GB rig. All my creative software is on my AMD Ryzen 5900X RTX 4070 system. I would like to know what will work on this rig other than Adobe, because I refuse to use it.
I bought an Intel Arc A750 card for photo editing. Programs such as Affinity and ON1 photo RAW 2024 do not work on this card, they turn off without any messages. Intel drivers are not developed and will not be for a long time, the hardware itself will not change anything. I support Intel, but I was forced to return the Intel card. Now I am looking for something from AMD.
Opinion on a used 6800 for 390 euros? Worth it? Mainly for Blender, AutoCAD and Adobe tools, secondarily gaming.
Don't go AMD if you're using Blender or any 3D workloads
Can you elaborate more? There are not many videos and forums I can find on that topic.
Note: I'm a beginner when it comes to 3D modeling, and the reason I'm jumping from the R7 360 is because my models got a little bit of improvement and I want to do heavier stuff @@7eave1
Windows 12 requires 40 TOPS of AI capability. Sir, GPUs also have TOPS: how many TOPS does the 4060 8GB have? And also the 4070, 4080, Intel Arc, RX 7900 etc., how many TOPS?
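For context, the headline TOPS figure is essentially matrix/tensor units times INT8 ops per unit per clock times boost clock. A tiny sketch of that arithmetic; the inputs are placeholders, not real specs for any of the cards asked about, so check each vendor's spec sheet for the published values:

```python
def peak_tops(num_units, int8_ops_per_unit_per_clock, boost_clock_ghz):
    # ops per second = units * ops-per-unit-per-clock * clock; divide by 1e12 for "tera"
    return num_units * int8_ops_per_unit_per_clock * boost_clock_ghz * 1e9 / 1e12

# Placeholder inputs purely to show the shape of the calculation:
print(f"{peak_tops(100, 512, 2.5):.0f} TOPS")  # 100 units * 512 ops * 2.5 GHz = 128 TOPS
```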
The cards to get, price-wise, are last-gen cards regardless of how you look at it; at least that way you're getting more for your money, be it last-gen new or last-gen used.
I hope Intel can stay in this game because what they are providing on their first try is seriously impressive
I love your videos
Has anyone else noticed that the Intel Arc GPU has better detail in games, or is it just me?
I would never recommend amd for non-gaming.
Just go with Nvidia and have peace of mind.
an unreal engine 5 benchmark would be awesome.
I'm curious how the A770 performs in Unreal.
Thanks!
Please add AI/LLM calculations to your benchmarks. 😊
what is MSRP?
Manufacturer's suggested retail price
thank you @@minimaann
What I don't understand is: why does the RX 7700 XT have 12GB of VRAM?
I mean, the RX 7600 XT should be 12GB and still on 128-bit, and the RX 7700 XT should be 16GB on 192-bit.
I think they made this for miners, they're just not saying it.
I'll choose the RX 6700 XT over this.
Hello, greetings from South America.
I have a budget of $1k to 1.5k.
I have two questions.
1. Could you suggest a configuration for a wedding photographer: importing, masking and exporting 3k+ photos in Lightroom, Photoshop AI, and also Premiere Pro wedding video (rendering 1 to 2.5 hours of film per event, between Full HD and 4K)?
2. In the country where I am there is no possibility to return parts, and I would like to ask if you could show in some video the import/export speeds you get in Lightroom, and how long it takes to render a 2-hour 4K movie in Premiere Pro.
And sorry if my questions are too many or too focused on my work.
Jorge
I saw budget in the title and missed the 300 on the thumbnail and in the full title itself. Personally I consider budget to be less than $100, below $300 to be mid-range, and above that to be expensive.
The video itself is good, just not budget priced.
What you call budget are 2015 prices; it's 2024, everything has gone up.
Should have added the 4060 Ti, since that sells for a minimum of $370.
Intel GPUs seem like editing beasts, especially if you don't have an Intel CPU, as those are expensive and run too hot.
Windows will upgrade, for free, from 95 to 11 (and likely 12/13/14, or until Microsoft stops making Windows) for people in Europe, as the whole binding-the-license-to-the-motherboard scheme is illegal.
I really don't understand why the cheapest GPUs use PCI Express x8. Wtf? All GPUs should have PCIe x16.
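The x8 complaint matters most on older boards: per-lane throughput roughly doubles each PCIe generation, so a Gen4 x8 link is about equal to a Gen3 x16 link, but drop that same x8 card into a Gen3 slot and you get half. A quick sketch using the usual approximate effective per-lane rates (assumed figures, ignoring protocol overhead details):

```python
GB_PER_LANE = {"PCIe 3.0": 0.985, "PCIe 4.0": 1.969}  # approx effective GB/s per lane

for gen, per_lane in GB_PER_LANE.items():
    for lanes in (8, 16):
        print(f"{gen} x{lanes}: ~{per_lane * lanes:.1f} GB/s")
```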
Nvidia: let's try to innovate and provide customers with features that will actually work. AMD: after multiple failed attempts to copy Nvidia, throw more VRAM at customers and hope they don't notice the subpar product.
Pro creators only use Quadro cards and not these ones anyway, and for amateur or occasional creators something cheaper or older will be fine too.
I got the a770 16gb vram for 300
The biggest loser of all was the 4060 Ti 16GB. Because most retailers still sell it around 480-500 dollars, it is completely destroyed by the RX 7800 XT in terms of value, and that card is now also cannibalized by Nvidia's own 4070 and 4070 Super. And for people who want a card on a budget, the 7600 XT and Arc A770 emerge as the more sensible options instead of adding 100-150 dollars for what is essentially a 360-dollar card with 16GB of VRAM.
If Nvidia managed to lower the price of the 4060 Ti to around $340 for the 8GB and $395 for the 16GB, I think Nvidia would be the winner. But as of now, it's just an utter waste of money both for productivity and gaming, and it's anti-consumer at best (since Nvidia is basically telling budget people to f*ck off and spend more on at minimum a 70-series card).
I paid $200 for my TUF 3060 Ti with warranty, I'm good lol. My card outshines these.
thanks bud that helps..