Intel Arc A770 Rendering Speed in Mid 2024 | Blender 4.1 and New Drivers

  • Published: 26 Oct 2024

Comments • 91

  • @Bholu-y1d
    @Bholu-y1d 27 days ago +1

    That's exactly what I was looking for to pick an upgrade card for Blender. God bless you, fella. Blender benchmarks surely help somewhat, but they are not a direct index of render time. Keep increasing your Excel sheet's length for the greater benefit of the many in pursuit of a precise answer 👍

    • @ContradictionDesign
      @ContradictionDesign  26 days ago +1

      Hey! Glad this was helpful. I am eagerly awaiting whatever new GPUs are around the corner. Thanks for your comment!

  • @MokenaRay
    @MokenaRay 5 months ago +7

    Can't wait for the Battlemage cards to release

  • @andrewr1355
    @andrewr1355 5 months ago +5

    Looking forward to seeing Intel Battlemage step up to the plate. I'm not thrilled with the state of NVIDIA as a company (though their cards are phenomenal), and AMD is still struggling with driver and software issues.

    • @ContradictionDesign
      @ContradictionDesign  5 months ago

      Yes, I am too. Intel has a great chance to win people over. I really did like this A770. If it had just a bit lower TDP, it would be quite good as far as I am concerned.

  • @Frickolas
    @Frickolas 5 months ago +2

    The Lone Monk scene has a lot more going on in the compositor than other scenes, which can run on the CPU or GPU with either OpenCL or OpenGL depending on settings and version. Something worth looking into: maybe Intel's OpenCL/GL performance is better than other cards'?

    • @ContradictionDesign
      @ContradictionDesign  5 months ago +1

      Oh, that is very interesting! I have never played with those settings at all, so I did not know this! I would not be surprised if Intel was really good at specific things like that, though they don't advertise any of it, haha. A very interesting theory.

  • @moravianlion3108
    @moravianlion3108 4 months ago +2

    I appreciate you getting results for the 7900 XTX. I actually fit into the category of people who would buy it for Blender too, among other things.
    If you would like me to test something else for you at some point, PM me.
    My PC also has a 7950X, 192 GB RAM, and a 4 TB NVMe in it. Should be ready for anything you'll throw at it, within consumer-grade reasons, haha!

    • @ContradictionDesign
      @ContradictionDesign  4 months ago

      Nice setup! AM5 has some quirks so far, but is very fast overall for sure.
      If you have any interest in testing Cinebench, I would love to see those results, out of curiosity.
      And I think the 7900 XTX is a decent value for a 24 GB GPU, especially when gaming is involved.

    • @danp2306
      @danp2306 3 months ago +1

      @@ContradictionDesign Everything out there shows the 7900/RDNA3 cards delivering mediocre performance in Blender and most productivity software. Personally, I think claims of 'good performance' are either lies or fabrications. It's only good for CUDA 'copying', i.e. ZLUDA.

    • @ContradictionDesign
      @ContradictionDesign  3 months ago

      RDNA3 GPUs have worse rendering speed per dollar than the Nvidia 40 series for sure. The value argument for them is that people who dabble in Blender a little, but mostly game on their PC, can still use Blender just fine. So AMD is good for games, meh for Blender, relatively.

    • @danp2306
      @danp2306 2 months ago +1

      @@ContradictionDesign I've been told that argument before. But the fact remains: the only people who told me that the 7900 XTX is good in Blender (and that HIP-RT works, according to them) are the guy who replied to you, 'moravianlion', and your friend (assuming his results are accurate). If one only dabbles in Blender and games, then why not just use a 4070 Ti Super or a used 4080? They are competitive in games and have better performance in all productivity use. I want AMD to be better, but they seem to focus on gaming; their GPUs should be a lot cheaper, then.

    • @ContradictionDesign
      @ContradictionDesign  2 months ago

      @@danp2306 The best argument for the 7900 XTX in productivity is that you get 24 GB VRAM for $1k. The only way to do that with Nvidia is for at least $1.6k with the 4090. So at the end of the day, people will buy AMD if they lean that way anyway.
      But most people don't really need to optimize hardware performance. If you aren't a business contending with profit margin concerns, then your hardware is not too critical. Could one thing be faster than another? Yeah. For gamers and dabblers, though, there is not really any loss of income from rendering or gaming a little slower.
      I do agree that AMD GPUs could be a little cheaper, and I would buy them in droves if they were. I think they will focus on low and mid-tier cards for the next set of GPUs, and those might fill this need. We will see in 4-6 months, I expect.
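For anyone who wants to check the math in that VRAM-value argument, here is a quick sketch in Python using the prices quoted in this thread (roughly $1,000 for the 7900 XTX and $1,600 for the 4090; these are the commenters' street-price figures, not official MSRPs):

```python
# Dollars per GB of VRAM, using the rough prices quoted in this thread.
cards = {
    "RX 7900 XTX": {"price_usd": 1000, "vram_gb": 24},
    "RTX 4090": {"price_usd": 1600, "vram_gb": 24},
}

for name, spec in cards.items():
    dollars_per_gb = spec["price_usd"] / spec["vram_gb"]
    print(f"{name}: ${dollars_per_gb:.2f} per GB of VRAM")
```

At these prices the XTX works out to about $41.67/GB versus about $66.67/GB for the 4090, which is the whole argument in two numbers.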

  • @mashun-o4d
    @mashun-o4d 1 month ago +2

    I am planning to get this card for around $270-300 in India during a sale, for video editing with Premiere, DaVinci, After Effects, and some VFX. Would you go for this card instead of an RTX 3060 or 4060? Or go for something more expensive for professional workloads? The budget is tight and 300 is the max I can go!

    • @colibri_studios
      @colibri_studios 1 month ago +2

      Hey, greetings from Russia. I'm also trying to decide if I should get the Arc A770 or the 4060. From what I read, the 3060 is not a good option anymore since that generation is slowly vanishing. The 4060 has way lower power consumption and in general is optimized way better.
      Talking about Premiere Pro, they say version 24 was optimized well for Intel and works/renders way faster. Also, Intel supports the AV1 codec, which is better than H.264.
      Talking about DaVinci, it doesn't seem to be optimized for Intel, so the 4060 would probably show better results, but 16 GB VRAM is a big advantage for DaVinci. In jobs like noise reduction, the more VRAM the better (that's why many people preferred the 3060 12 GB over the 4060 8 GB). In 3D design programs the 4060 also shows better results (but newer versions will probably show better compatibility).
      Intel seems kind of raw to me, and I'm not sure I want to get into all the trouble with drivers when I can just work on a silent, cool, and problem-free GPU. So I'm still thinking...

    • @ContradictionDesign
      @ContradictionDesign  1 month ago +2

      Yes, software support must be your first checkpoint. I know Intel is still not compatible with some software. I do like the Arc A770; it has a great value proposition. The 4060 would be great if not for the 8 GB VRAM. It is a really hard decision at that price point for sure. I will try to help how I can, but they really just have the VRAM paywalled at that price, unfortunately.

    • @ContradictionDesign
      @ContradictionDesign  1 month ago +2

      If your software supports the Arc, it is a good value. But I do not use enough different software to know for sure, especially with editing software and After Effects. Their drivers are much better now, but some programs are not supported at all. For $300 it is tough, because the VRAM is basically paywalled. Nvidia knows you want 16 GB, and they want you to spend the extra $100 to get it.

    • @mashun-o4d
      @mashun-o4d 1 month ago

      @@ContradictionDesign thanks

  • @jaakanshorter
    @jaakanshorter 5 months ago +1

    Did you try out the Open Image Denoise GPU option in Blender 4.1?

    • @ContradictionDesign
      @ContradictionDesign  5 months ago

      You know, I will test it out specifically. I use it for renders all the time with other GPUs too. But it will be interesting to pay more attention to it than usual.

  • @KEKEBONO
    @KEKEBONO 5 months ago +1

    I wonder what the increase in performance of the new Intel cards will be.

    • @yuan.pingchen3056
      @yuan.pingchen3056 5 months ago +1

      No Battlemage performance data has ever leaked.

    • @ContradictionDesign
      @ContradictionDesign  5 months ago

      Hopefully the new Intel GPUs will be a huge surprise. This one is not even that bad, if not for the power draw. We shall see!

    • @ContradictionDesign
      @ContradictionDesign  5 months ago

      Well maybe it will be a big, good surprise for everyone!

    • @silentdoormat
      @silentdoormat 5 months ago +1

      @@ContradictionDesign Not really, though,
      since the 6800 XT and A770 have identical die sizes.
      But A770 gaming performance does seem low.

    • @ContradictionDesign
      @ContradictionDesign  5 months ago

      True, but the A770 is quite cheap, relatively. I got mine new last year for under $300. It is an OK value, but you are right, not quite what it should be.
      And yeah, Intel's gaming performance was a bit behind in this generation, but I have not tested for gaming performance here on the channel. So keep in mind my impressions of GPUs are nearly entirely 3D render related.

  • @berl1n26
    @berl1n26 2 months ago +1

    Can you help me figure something out? I have an RTX 4070, which has 12 GB VRAM of course. I am working on a big project in Blender, and the problem right now is that I am really low on VRAM. With my current budget I am looking to buy an RX 7900 XTX with 24 GB. I don't know if that is a good choice or not, because I am worried the 7900 XTX is not worth the upgrade, or will have many problems with 3D, like some information I have found suggests.
    Many thanks ❤

    • @ContradictionDesign
      @ContradictionDesign  2 months ago +1

      For many apps, the 7900 XTX will work just fine. It is the cheapest way to get 24 GB VRAM without getting old pro cards or a used 3090. It will compete with or beat the 3090 in rendering speed as well.
      I can't speak to much other software, but Blender works fine with current AMD GPUs. I have heard Unreal Engine makes better use of them as well. Video editors should work fine with them, but I have not tested AMD cards for video editing.
      Hope this helps!

  • @harmonbrentdm
    @harmonbrentdm 5 months ago +1

    Did you run Cinebench as well?

    • @ContradictionDesign
      @ContradictionDesign  5 months ago +1

      Unfortunately, Intel discrete GPUs do not appear to be supported in Cinebench 2024 at this time. So that is annoying. I just went and tried to confirm.

  • @fardinzaman570
    @fardinzaman570 4 months ago +1

    Should I go for the A770 or the RTX 3060?
    My main purpose is 3D animation (only Blender), Premiere Pro, and After Effects.

    • @ContradictionDesign
      @ContradictionDesign  4 months ago +1

      I think your biggest concern will be A770 support in Premiere and After Effects. I use DaVinci myself, so I do not have any testing or experience with Premiere. I would suggest looking up A770 support for those applications.
      You have seen the Blender speed comparison, so you have that. The A770 does also have the extra VRAM, which can be really nice.
      Also, does your PC support Resizable BAR? If not, the A770 will suffer a major speed loss.

    • @fardinzaman570
      @fardinzaman570 4 months ago +1

      @@ContradictionDesign I am thinking of pairing my 13600K (I already have this CPU) with the Arc A770. For the 13600K, obviously I have to get a Z-series motherboard, so I think I will have Resizable BAR support.

    • @ContradictionDesign
      @ContradictionDesign  4 months ago

      @@fardinzaman570 OK, yeah, you should have ReBAR support. As long as your apps have good compatibility with the A770, it may be a good fit for you. The A770 also has AV1 encoding support, which the RTX 3060 does not have.

    • @W5529RobloxStudios
      @W5529RobloxStudios 2 months ago

      @@fardinzaman570 You'll do well with editing because Intel has an AV1 encoder, but probably wait for Battlemage.

    • @ContradictionDesign
      @ContradictionDesign  1 month ago

      This is probably a great answer right now. Hoping Battlemage is just fantastic. The A770 is decent, but I have not tried editing with it yet.

  • @danilshcherbakov940
    @danilshcherbakov940 5 months ago +1

    I heard at the end of the video that you're getting into server CPUs. Those wouldn't happen to be Intel Xeon Scalable, LGA 3647 socket?
    I got an old Lenovo P720. I'm an unsuccessful programmer, the current market is really hard, so I got it to learn how to make virtual machines and Kubernetes clusters and stuff like that. You need a lot of RAM and cores for that.
    Trying out the computer in Blender, I'm kind of disappointed. I'm running it on Ubuntu 20.04, and I have a Xeon 6138, 20 cores, 2.0 GHz. It's definitely faster at rendering than my Windows laptop's Ryzen 5 5600H (4.2 GHz, 6 cores), but not twice as fast. The Xeon seems to be about 5 times slower than my 2060 12 GB. At least these Lenovo P720s are surprisingly quiet.
    I am excited for the 50 series, though. Going to save up a pretty penny to get a 5080. 'Till then I guess I'll be doing modeling and single-frame renders.

    • @ContradictionDesign
      @ContradictionDesign  5 months ago +1

      First, sorry to hear about work. I hope it goes the right way for you soon.
      I just got my first two servers this week: a Dell R930 and an R730XD. The R730XD is for storage; the R930 is for CPU/GPU rendering and general learning. Once I know what I am doing, I will design my stack. The R930 I got has quad 24-core E7-8890 v4 CPUs, for 96 cores total, and 256 GB RAM. I will benchmark these two machines soon, and have plenty of server/render farm content from it.
      I am also excited for the 50 series. The RTX 5090 should come with 1.5 or 2 TB/s of memory bandwidth and 50% more cores, at higher IPC and clock speed. Should be really crazy to see what it can do. Also, they have teased a "hardware denoiser", so I will be watching the fine print to see what that could be.
      I look forward to the server stuff. Maybe when I get educated a little bit here soon, we can learn from each other!

    • @danilshcherbakov940
      @danilshcherbakov940 5 months ago +1

      @@ContradictionDesign Thanks, I hope work goes well for me too! lol. I'm excited to hear about your server setup, seems like your rendering server will be a very powerful computer. I have heard of companies buying servers, loading them up with GPUs and then making virtual machines with virtual GPUs from it. Presumably employees SSH into their virtual machine instance from a laptop and then use GPU power from anywhere in the world with an internet connection. It could be an interesting setup to try.
      I've heard people say that certain things can give diminishing returns, rendering with 4k samples, or caustic lights. Maybe it's 'diminishing returns' but only on older GPUs. What I really like about Blender is that it has a culture of making tutorials, I'd love to make one someday. I like your channel, it's very helpful to see how different hardware affects rendering, thank you for going to all this effort

    • @ContradictionDesign
      @ContradictionDesign  5 months ago +1

      @@danilshcherbakov940 Well, I am glad you found me. Yes, I think remote GPUs are a cool idea. I definitely need to get fiber Internet before I try what I really want to try, which is renting out workstations over the Internet. People could just sign in and run Blender or whatever they want on a VM. That, and just normal rendering for clients.
      Should be a lot of fun. I got my Linux installer media set up; now I just need some time to get the thing configured.

  • @itsadi2059
    @itsadi2059 3 months ago +1

    Hey, how are you doing? I am thinking of starting with Unreal Engine 5. Can you tell me whether the RX 7800 XT or the 4070 Super is better?

    • @ContradictionDesign
      @ContradictionDesign  2 months ago

      Hey! I am good. Hope you are too!
      I do not have test data for Unreal Engine, but people have told me that AMD is more competitive in Unreal. I cannot back that claim up, though.
      This is something I would like to test more, but I need to learn Unreal first, haha.

    • @itsadi2059
      @itsadi2059 2 months ago +1

      @@ContradictionDesign Thanks for telling me. By the way, why are you not uploading videos?

    • @ContradictionDesign
      @ContradictionDesign  2 months ago

      @@itsadi2059 I am really busy at work, and have been playing more video games than usual. I will possibly start posting on my gaming channel soon btw.
      But basically just really busy and I have not been prioritizing making videos for a bit.
      I will get back in the swing soon, I am just trying to balance many things now.

    • @itsadi2059
      @itsadi2059 2 months ago +1

      @@ContradictionDesign Ohh, I hope you start your gaming channel soon

    • @ContradictionDesign
      @ContradictionDesign  2 months ago

      It exists, but I have not posted in years haha. I want to start streaming on there when I am between videos over here.

  • @danp2306
    @danp2306 5 months ago +1

    When are you going to test a 7900 XTX so you can verify your friend's results? ;-) Also, I would suggest an extensive comparison of Nvidia vs AMD - e.g. CUDA + OptiX vs ZLUDA, HIP, HIP-RT - so people can see the improvements or regressions (whatever they may be). IMHO, the 7900 XT and XTX are the only AMD cards worth considering for Blender, and even then I'm not sure they are worthwhile, given that the 4070 series - all of them - are probably better performers. But there are some situations in which people will pick an AMD GPU: if they also game, if they use Linux, or if they want more VRAM - since the Nvidia GPUs with the max VRAM in consumer cards are the 3090s and 4090s, right? For the above, I'd only compare a 4070 Ti Super and a 7900 XTX, to make it simpler, and those GPUs *supposedly* are close in price?

    • @ContradictionDesign
      @ContradictionDesign  5 months ago

      Well, the 7900 XTX is costly enough that I don't really want to buy it. My uses are only for rendering, so it is likely a worse deal for me than other options. It is better for people who also game on it, since the 7900 XTX is great for gaming. If I can work something out, I will get one. Not sure when.
      As for the history of the render engine/API changes, that could be a really fun test! I can go back to Blender 2.79 and work my way up to 4.1. Thank you for that idea!
      Yes, the 3090 and the 4090 are the only 24 GB consumer GPUs from Nvidia currently. There are pro options, but they are way too expensive for my budget. So that is one point in favor of the 7900 XTX, but the speed being below what I would hope makes it a hard buy for 3D uses only.

    • @ContradictionDesign
      @ContradictionDesign  5 months ago

      You know what, though, I am wondering if the 7900 GRE is a better value. You can get those for $550, which would make them a relatively cheap 16 GB card. And they are just over half the price of the XTX.

    • @danp2306
      @danp2306 5 months ago +1

      @@ContradictionDesign Perhaps, but research that card's specs. It might be too close to a 7800 XT, and thus the performance results might reflect that if you test it. *Edit: I guess I should describe it another way: it's a 'cut-down' 7900 XT. Anyway, look into it. If it's only slightly better than the 7800 XT score you got, you might not be too impressed. :)

    • @ContradictionDesign
      @ContradictionDesign  5 months ago

      @@danp2306 Ohhhh, that is a really good point! I will watch for a good way to get a 7900 XTX, and we'll see what happens. I have a couple of servers coming soon, so I will need to get those going. Then I'll look for the next ideas. Thanks for your comments!

    • @danp2306
      @danp2306 5 months ago +1

      @@ContradictionDesign Just a suggestion: you might even be able to find them cheaper on the used market? IMHO, it's the only AMD GPU worth considering for Blender - at least until the 8800 XT is released?

  • @dimitrovich702
    @dimitrovich702 2 months ago +1

    Here's another one 0_0 I came from the RTX 3060 video. Very interesting data comparison.

  • @gl1tch133
    @gl1tch133 4 months ago +1

    So it's not worth it? Even when using an Intel CPU with Deep Link tech plus dual Arc A770s?
    I would really like to know, because I'm still a bit sceptical when it comes to 3D usage with AMD products. I also avoid NVIDIA because of their shitty price/performance behavior.
    Lastly, I'm about to rewipe my entire setup in order to teach myself some PC-related jobs and learn some new stuff, especially using my hobby as a beneficial opportunity with UE5, Blender, FL Studio, Adobe, and so on.

    • @ContradictionDesign
      @ContradictionDesign  4 months ago +1

      I am not able to test Intel deep link currently, since I do not own any of the supported CPUs. I imagine you would get some extra performance with it in many apps.
      You need to enable Resizable BAR too, or the ARC GPU will run slower than it should.

    • @moravianlion3108
      @moravianlion3108 4 months ago +2

      It always comes down to what you're willing to pay for.
      I'm running a 7900 XTX with many non-gaming apps and I'm very pleased. I assume it will be a similar case with all the 7-series cards.
      There are plenty of professional apps that have AMD covered. And worst-case scenario, just patch them with ZLUDA.
      In your case, it'd probably make more sense to go with Intel. I hope they've fixed all their kinks already.

  • @antonioluciano9693
    @antonioluciano9693 3 months ago +1

    Hi, can Blender detect two Arc A770 cards?

    • @ContradictionDesign
      @ContradictionDesign  3 months ago +1

      It should absolutely have no problem seeing both, but I can't prove it for sure.

    • @antonioluciano9693
      @antonioluciano9693 3 months ago +1

      @@ContradictionDesign Thanks :D

    • @W5529RobloxStudios
      @W5529RobloxStudios 2 months ago

      @@antonioluciano9693 Don't do that; it will only use one, as nothing optimizes for GPUs to work together.

    • @antonioluciano9693
      @antonioluciano9693 2 months ago +1

      @@W5529RobloxStudios But Blender can use multiple graphics cards

    • @ContradictionDesign
      @ContradictionDesign  1 month ago +1

      Blender will use multiple GPUs, but they are not used efficiently together in one Blender instance. You get full speed if each GPU runs in a separate instance. It is annoying, but that's just how it is for now. And sorry for the late reply, but YouTube Studio is not helpful with telling me about new replies to replies.
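The separate-instance workaround can be scripted: split the frame range into one chunk per GPU and launch a headless Blender process for each. A minimal sketch of that idea in Python (`scene.blend` and the frame numbers are hypothetical; `-b`, `-s`, `-e`, and `-a` are Blender's standard CLI flags, and `--cycles-device` after the `--` separator is Cycles' documented way to pick a compute backend). Note this sketch only builds the commands; actually pinning each instance to a different physical GPU would still need per-instance device preferences:

```python
# Split an inclusive frame range into contiguous chunks, one per GPU,
# then build one headless Blender render command per chunk.
def split_frames(start, end, n_gpus):
    """Return n_gpus (first, last) frame pairs covering [start, end]."""
    total = end - start + 1
    base, extra = divmod(total, n_gpus)
    chunks, cursor = [], start
    for i in range(n_gpus):
        size = base + (1 if i < extra else 0)  # spread the remainder
        chunks.append((cursor, cursor + size - 1))
        cursor += size
    return chunks

def blender_commands(blend_file, start, end, n_gpus, device="ONEAPI"):
    """One 'blender -b ... -a' command line per frame chunk."""
    return [
        f"blender -b {blend_file} -s {s} -e {e} -a -- --cycles-device {device}"
        for s, e in split_frames(start, end, n_gpus)
    ]

for cmd in blender_commands("scene.blend", 1, 100, 2):
    print(cmd)
```

Running each printed command in its own shell (e.g. backgrounded with `&`) gives every GPU its own Blender instance, which is the full-speed setup described above.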

  • @KhalidEnterprise
    @KhalidEnterprise 3 months ago +1

    Which one would you prefer, the A770 or the 3060? (for Blender)

    • @ContradictionDesign
      @ContradictionDesign  3 months ago

      @@KhalidEnterprise Well, they are often pretty close in speed, I think. So if you can deal with the higher power draw, the A770, because it has more VRAM. Just make sure you have Resizable BAR support on your PC.

  • @Tamiui
    @Tamiui 3 months ago +1

    Can you test the RX 7900 GRE?

    • @ContradictionDesign
      @ContradictionDesign  3 months ago

      I already have results for a 7800 XT, and the 7900 GRE is only a little faster. The problem is that it is not worth me buying one, because I don't have uses for these cards after I run the tests. The AMD GPUs are slower at the same price vs Nvidia, for Blender at least. And since I don't need a GPU for gaming, the AMD mid-range models are not super worthwhile for me. Plus, new generations are launching this winter, so I'd like to prepare for those.
      If I had a buyer lined up to take it from me after I test it, I could maybe do it. Like, some of my friends will need upgrades eventually. They might help. But I don't want to take losses on these, since I don't make much from YouTube yet.
      I would be more likely to test a 7900 XTX, but I already have some friends helping me with those results.
      Hope that all made sense.

    • @Tamiui
      @Tamiui 3 months ago +1

      @@ContradictionDesign Sure, I appreciate all your efforts and understand your perspective. However, is it worth spending an additional $200 (in my country) to get the 4070 Ti Super instead of the 7900 GRE? Is the extra amount worth it, especially since I am a beginner?

    • @ContradictionDesign
      @ContradictionDesign  3 months ago

      @@Tamiui Ohh, I see. The 4070 Ti Super will be quite a bit faster for 3D work than the 7900 GRE, which is closer to a 7800 XT. But for a beginner, render speed may not be a huge factor for you yet. If you really expect to do a lot of rendering soon, it can save significant time to invest more in the GPU. The 4070 Ti Super is much faster for sure.
      However, keep in mind that you could also just render overnight, and there are plenty of hours when you don't use your PC. So for final renders you need lots of speed, but only every so often.
      It is really up to you, based on your financial situation. I think the GRE is a good choice because of its 16 GB VRAM. That is a good amount for 3D work. I also really like the 4070, but 12 GB is a bit more restrictive for scene complexity.

    • @Tamiui
      @Tamiui 3 months ago +1

      @@ContradictionDesign Thank you 🙏🙏 I hope that God will guide you to Islam, my brother, because I truly wish you all the best, and I will pray to God that you find always the best in your life 💕💕💕

    • @ContradictionDesign
      @ContradictionDesign  3 months ago

      Thank you! That is a very kind gesture and I appreciate your prayers for me.

  • @TortolaBeachBum
    @TortolaBeachBum 3 months ago +1

    Do you earn Render crypto tokens?

    • @ContradictionDesign
      @ContradictionDesign  3 months ago

      No, but I wish I had filled out the form to join a couple of years ago.