Thanks for the tip, can't wait to test this out!
For anyone who has an Intel CPU with integrated graphics but the Intel Quick Sync option doesn't show up - one possibility is that you don't have the integrated graphics enabled; it's often disabled by default if you have a separate GPU. You can enable it in the BIOS, but the exact menus depend on the motherboard you're using - a YouTube search for "enable Quick Sync" should tell you everything you need to know.
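If you want to sanity-check that Quick Sync is actually visible to applications after flipping the BIOS setting, one option (my own suggestion, not something from the video) is to ask ffmpeg which hardware accelerators it sees. A minimal Python sketch, assuming ffmpeg is installed and on your PATH:

# Rough check that an Intel Quick Sync (qsv) accelerator is visible to ffmpeg.
# Assumes ffmpeg is installed and on the PATH; a "qsv" entry usually means the
# iGPU and its driver are active and exposed to applications.
import subprocess

out = subprocess.run(
    ["ffmpeg", "-hide_banner", "-hwaccels"],
    capture_output=True, text=True, check=True
).stdout

if "qsv" in out.split():
    print("Quick Sync (qsv) hardware acceleration is available.")
else:
    print("No qsv entry found - check the BIOS iGPU setting and the Intel graphics drivers.")

This only tells you the accelerator is exposed at the system level; whether Resolve uses it still depends on the Studio/free version and the decode options discussed here.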
Thank you! You are a life-saver. I have a 13600K and I searched for the Intel option in DaVinci but didn't see it, so I thought maybe my iGPU was dead. Lol
Your tip is worth its weight in gold, thank you very much. The export in particular runs about twice as fast on my MSI laptop with Intel 12700H.
Best regards, Karl
Surely these options are only available on DaVinci Resolve Studio, not on DaVinci Resolve which is the free version.
Yep
@@theTechNotice If I enable this, will the Nvidia GPU still be doing effects and color grading? I have an 11400H; it supports Quick Sync, but I don't know if it's better than my 3060.
@@theTechNotice Should have led with this in the video. :(
Dammit, I figured out my problem! lmao
@@theTechNotice mention it in the video then
Could you test this on an AMD system with an Nvidia GPU but with an Intel Arc 380 GPU for decode?
Thanks for the recommendation. I’ll try it.
Do you know if these settings impact AV1 encoding/exporting (positively or negatively)?
In the next video, could you please describe the differences in decode/encode codec support between the Intel UHD 750 and UHD 770?
Just installed DaVinci Resolve 19 (FREE), and the program doesn't even open - it directs me straight into the settings. My screen does not show those breakdowns, so where can I find this? I am not a full-on professional editor; I just need to do very basic things for Instagram reels and social media for my fashion business.
My spec:
Intel i7-8550U CPU / RAM 16GB / Intel UHD Graphics 620
Same issue for me, with the same spec.
My drivers / graphics cards are all updated already via the "Intel Driver and Support Assistant" program, and DaVinci Resolve 19 doesn't even open - it also directs me straight into the settings. My screen does not show the "Decode Options" > "Decode H.264/H.265 using hardware acceleration" > "Intel Quick Sync" breakdowns that everyone says you must turn on in order to use this video editing software. So where can I find or activate this?
I see several people referring to the BIOS setting change from an old post... no idea how to find that and edit it. Which setting specifically needs to be changed in order for us Windows users with Intel HD graphics to use this software?
Any honorable mentions for software that is just as good, free, and can handle these types of specs?
Funny...I don't have those options....my integrated graphics IS enabled, and both show up in device manager, but I don't see hardware acceleration options or quicksync....I'm running the free version.....is that only in the Studio version?
Hi, do a 2023 setup tour and an "all the gear I am using right now" video. All the love and respect, man.
My audio is no longer in sync with the video when I enable iGPU-only decoding. If both Nvidia and Quick Sync are enabled, everything is OK.
My laptop has an i9-13900H with an RTX 3050 6GB laptop GPU. This is very beneficial
I have a question about buying two new upgrade items for my dying i5 6500 Skylake build.
I'm looking at a 12400 or 13400.
For the mobo, I want to get an Asus Prime H670 or Asus Prime H770.
I have no other budget, so I will keep using the same 32GB of DDR4-2133 (2400MHz with the XMP profile).
Please give me your experienced advice, master 🤔
It's strange, I have an "Intel i5-13500" processor and there is no "Intel Quick Sync" option in the settings. What could be the problem?
This is an issue that you had helped me with some time back and I appreciate that what you've suggested worked for me.
Two questions: First, is this only if you have Resolve Studio (I have the free one), and second, is this for all Intel processors or just the newer generations (I have a Core i7-8700K on my desktop and a Core i7-9750H on my laptop).
Nice! You answered some of my questions.
What about timeline performance with Intel graphics cards + an Intel CPU?
To my knowledge, the current Intel GPUs have a much slower video encoding engine than their CPUs. I don't know if it's the same for playback (decoding).
Thank you for the info 💚
Does that mean those laptops with, say, a 13900H and without a dedicated GPU would be good enough for editing? Have you tried that?
Otherwise, maybe it's not bad to get the Arc 380 for those who don't have a 12th/13th gen Intel CPU?
thanks for the video!
you are talking about timeline performance. does this include vfx and color grading? what about rendering? would you use the nvidia card to render the final output?
Nvidia to render out.
Not sure if the setting works for Fusion and the color page, but you can always enable caching for both.
Is this still good info? Just bought a 13900K and I have a 7800xt so I'm curious.
I’m wondering the same thing. I have a 14700K and Nvidia 3080.
Test and you'll know.
If the software uses the GPU in playback, then yes, change as suggested in the video.
New side-monitors? How about new setup review, or studio tour? 🙂
Thanks for very informative video! Great work :)
Thanks!
This is so helpful! Thank you so much. How much better are 12th and 13th gen than my 9900K in Resolve? I don't know much about encoders and decoders, so I don't know how good they are on my CPU. Thanks!
I don't have that option within the decode options. Currently, I am trying to figure out an intel hardware acceleration error I have on startup. That might have something to do with why I don't have that option. All my drivers are up to date and still, the error persists. I can't find any settings to allow me to fix this. I will keep searching for answers that may explain this.
In DaVinci Resolve 18 this option to select NVIDIA or Intel was gone. I use an old NVIDIA GPU and a 12th gen Intel i5, and it works fine with 1080p and with Mavic 4K footage.
Only available in the Studio version of Resolve.
@@suniphalderofficial2 I have Studio 18 and I don't see Intel Quick Sync.
@@SaroukosG Enable VT-d / iGPU multi-monitor in the BIOS and install the Intel graphics drivers.
@@SaroukosG Same
I have a 9900k with 2080ti and 32 gb of ram. Are 12th and 13th gen chips way better than my 9th gen for encoding and decoding and overall experience? I do 1080p and 4k editing for someone and don’t mind upgrading.
Random question: how good is the GTX 1660 Ti compared to the RX 6600 in DaVinci Resolve? Some said you don't really need CUDA cores in DaVinci unless you're using Premiere - is that true?
Hi bro, I am from India. Please answer me: which air cooler supports the Intel Core i9-13900K processor?
Is there a follow-up video on this with the release of 18.5? Does DR know which to use, Quick Sync or NVENC?
So far I’m finding that in 18.5.1 I have to use the options to force Resolve to use QuickSync for decoding if there is an Nvidia card present. With both selected, it keeps trying to use Nvidia.
I haven’t tested with AMD or using proxy or optimized files - only straight from camera h.264 files so far.
My decode options look nothing like that. There is a checkbox that says "Use GPU for Blackmagic RAW decode" and others.
Can you do this on a laptop as well? I have an 11400H CPU with an RTX 3050. Thinking of buying Resolve Studio.
Do a search for your CPU on Intel's website and see what codecs it supports. But any Intel CPU should, except those whose name ends in an F, if I remember correctly. Those don't have an iGPU.
Is the Intel 13500 a good budget CPU for editing? Love your videos!
Yes
Thank you for not being a DaVinci fanboy and saying it's my computer's fault, when this computer cost 20 grand and nothing has ever caused a CPU spike EVER. Thank you for offering real solutions. I am about to switch back to Premiere if it doesn't stop. I haven't uploaded in 3 days because of this trash program. They actually banned me from the Discord because I told them it's DaVinci's fault. Craziness. P.S. This worked - I didn't have the Intel option but I unchecked Nvidia. Thank you.
That's unfortunate, man. I've had the same experience as you with Adobe Premiere. That's why I switched to Resolve Studio - I hated losing progress in Adobe. Maybe it's due to Windows or an older NVIDIA GPU, idk. I have a 13th gen iGPU with a 4090 and no crashing on my end on 18.6.
I'm using OBS gaming footage that's 2K at h.264, but in the task manager the iGPU is never in use or decoding video at all, no matter what options I set. I have the Studio version of DaVinci, an RTX 4090, an i9 12900K and 164GB of RAM. I cannot figure out what is going on and there's nothing on Google about it - please can you help me? The Intel GPU is greyed out in the video decoding options; I can only select the RTX 4090. Next to the iGPU in the decode options it says the iGPU is being used discretely, or something along those lines. My video is super choppy even with proxies and quarter res. Please help me.
Is the best video editing kit 13th gen Intel + an Arc A770 graphics card? Are you planning to test the Arc A770?
@TECH_NOTICE What did I win?
I have an i5 13600K but I have no Intel Quick Sync option. What's the fix?
If I'm not mistaken, the 11th gen iGPU HW enc/dec engines are identical to the ones in 12th/13th gen CPUs.
11th gen has UHD 750 graphics, almost the same as the 12400 (UHD 730); it has one video decoder. CPUs from the 12500 and higher have UHD 770, which has two video decoders. So UHD 770 is a bit faster than the 750.
@@Anderson_LS yes 100% you are right
I tried this and it really helped with h.265 files, but at the same time I lost a lot of speed when it comes to h.264 files. Playback of h.264 files stutters even when the files are placed on a FireCuda 530 disk. But with this box checked, full HD h.264 files play smoothly even when they are on a NAS, which is so much slower than the FireCuda.
I have an NVIDIA GeForce RTX 3060 and I don't have the option to change the setting like you do - is this because I'm using the free version? It won't let me select NVIDIA as my rendering encoder, only H.264/H.265.
The options are different for Free and Studio. He was using Studio so some of the options he had are missing in your version.
Good info! Thanks!
Will this even work with a 13600KF? Ie, no built in GPU? I'm looking at a build for resolve right now and considering saving the money as I'll have a pretty powerful GPU.
No, the media engines are in the igpu. Did you end up buying the F variant?
I see on your previous video you said enable both Nvidia and Intel but on this video you say only enable intel. Is this now the correct way to go?
Have you found out? If both are checked, the correct behaviour would be for the software to use whichever is better.
thanks for the info
Why is mine not showing Intel Quick Sync? I'm using an i7 13700K.
Should I build with the Ryzen 9 7950X or would Resolve do playback better on a comparable Intel CPU?
Intel for sure.
I have 12600k and 3060 ti. I don't see that second option for using iGPU, only nVidia.
I had iGPU disabled in the bios. Now it works.
Is it a good idea to build a computer for the DaVinci free version using a processor with integrated graphics and buying more RAM, such as 32GB, without a separate dedicated graphics card? Is an Intel processor with integrated graphics enough for stable operation? My needs are small video forms: up to 20-minute FHD movies or sometimes 4K. Thx.
Free DR takes very little advantage of the GPU, although some say the support will be better in DR 19.
Hi all.
I would like to see a real comparison of the AMD RX 7900 XTX 24GB vs the RTX 4080 - wouldn't you?
It has more memory and does better in tests. It would be especially interesting with the i5 13600 processor, versus all the 11 and 12 series processors. Now that would be a money saver. But who would go for that?
Or?
The RX 7900 XTX 24GB is faster in DaVinci Resolve, whereas the RTX 4080 is faster in Adobe Premiere.
If I do that, will DaVinci only use the iGPU? Or is the dGPU still being used for something else?
Have you found out?
Recently I bought Intel Core i5-13600K. It works with MSI B660M Mortar mobo. Which BIOS options are recommended to be enabled/disabled for the cpu to be as cool as possible, while staying fast?
Enable igpu from bios
I was using Premiere and I noticed it does switch to the best choice, which was a relief, as I just want it to work without having to go into settings, opening/closing them depending on what I am doing.
Hello, I need help with my Intel i9 13900K. I'm new as both a PC and a DaVinci Resolve user, and every time I play footage my Intel processor hits 100% as soon as I hit play. Please, I don't know what I am doing wrong.
If your PC runs fine outside DaVinci, it's probably your video clip that's messed up.
MAYBE MENTION AT THE START OF THE VIDEO THAT IT'S ONLY FOR THE STUDIO VERSION
Tech Notice, one question... 3060 12GB vs 3060 Ti 8GB in DaVinci? Which one has better rendering performance? I have an Intel 13700K CPU.
The 3060 Ti will be better even with less VRAM - I mean, marginally better.
@@TechLifewithMPS so vram is less important?
It depends on your project size/footage. For my projects with 8K h.264/h.265 footage, the cards with 10GB or less of VRAM run into issues. The cards with more than that are showing over 9GB of VRAM usage almost as soon as I start working with the footage. So that’s why I still use a 3060 12GB on my less powerful system.
So if you are working at 4K or lower, the Ti might be better. But honestly I’d seriously look into an Arc A770 16GB or 16GB AMD cards like the 6800 XT or 7800 XT. I’ve been testing all of them against the 4070 this week and the results have been very interesting.
On my oldest system (with no iGPU), I swapped in a 4070 and a 7800 XT for the 3060. In PugetBench, the 7800 XT came out with the highest overall score (both standard and extended), but there was a split in terms of which parts of the program were handled better by the 7800 XT vs. the 4070. The 7800 XT won 4K Media, 79 to 77. The 4070 won 8K Media, 73 to 64, and GPU Effects, 112 to 98. But then the 7800 XT won Fusion, 290 to 244.
The A770 has been really good with decoding footage, even without the aid of the 13700K iGPU, too. I haven’t tested it together with the iGPU yet.
@@perlichtman1562 in the end I bought 4090...
I literally switched to resolve days ago so this is timed perfectly. I’m using Intel 12900 cause I don’t overclock. My GPU is RTX 3090 Evangelion edition 😎
Flex 🔥
i got a 4090 mate
@@Edwardosalamanca i got a fighter jet sir
@@afti03I got nuclear fusion 😂
OK, DaVinci Studio 18.1.2 on Linux Mint 21.1 with an Intel 12700, 64GB DDR4, 1TB SSD, NVIDIA RTX 3060 12GB.
Rendering a 4K clip with Fusion elements and some effects:
57 secs with NVIDIA ON
1:03 min with NVIDIA OFF
Just to clarify: when you say “Nvidia off” do you mean “Nvidia off so nothing is checked” or “Nvidia is off but Quicksync is still checked”?
I think it was to turn the GPU OFF (so Nvidia off, nothing else checked) @@perlichtman1562
This will cause some out-of-memory problems when you are using heavy grading nodes. Try putting the Neat denoise plugin on and you will see. Sometimes I can't even export the project, or I can but the final rendered file has glitches. I need to tick only Nvidia and then it never has an error when exporting. Talking about 4K footage. I've tried so many times...
how much RAM are you running and what GPU?
@@theTechNotice 32GB 6000MHz. 3080 Ti ROG OC.
Could this be where the Intel Arc GPUs shine? I'm on the fence about adding one to run alongside a 3090 Ti.
To my knowledge, the current Intel GPUs have a much slower video encoding engine than their CPUs. I don't know if it's the same for playback (decoding).
wow thanks 4 knowledge
That's why after a fresh reinstall my playback became laggy. The setting changed back to using Nvidia.
I didn't find the decode setting showing on my PCs: an i9 12900 with an RTX A4000, and also not on an i5 13500 with a Quadro K2200. The iGPU is enabled.
Have you got the free or paid version of Resolve?
But why not have both checked, so the software can use both or choose one over the other when needed?
He already explained that several times in the video...
@@akyhne I'm from Mongolia and English is not my native language, so sorry that I didn't hear what I asked about being mentioned several times in the video. If you were Mongolian you would have helped by giving a timestamp so I could listen to it again, but it seems it's each country with its own people - you seem to just scold me. I'm not brilliant at tech, so that's why I asked. I will delete my silly question, but this software should be able to choose one over the other for decoding.
@@AexoeroV The CPU is more powerful than any GPU for playback. It also supports more codecs.
Therefore, there's no reason to have both checked.
The software cannot know how much "energy" the CPU or the GPU uses at playback, therefore it doesn't always make the correct decision. That's me saying that, not the guy in the video.
So if Premiere Pro can figure it out, it's because it's built into the software - a kind of matrix of CPUs and GPUs, where the software simply checks: "oh, you have an Intel Core i9 13900K and an RTX 3060, so therefore I should choose to use the CPU for playback".
This can easily get complex, so Blackmagic, who make DaVinci Resolve, chose not to.
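Just to illustrate the idea (this is purely hypothetical - neither Resolve nor Premiere is documented to work exactly like this): such a "matrix" could be nothing more than a small lookup table mapping detected hardware to a preferred decode device. A minimal Python sketch, with made-up hardware names and choices:

# Hypothetical CPU/GPU decode-preference matrix, for illustration only.
# The entries and the picks are assumptions, not how any real NLE behaves.
DECODE_PREFERENCE = {
    ("i9-13900K", "RTX 3060"): "igpu",  # Quick Sync covers more codecs on this combo
    ("i9-13900K", "RTX 4090"): "dgpu",  # stronger NVDEC, so lean on the discrete GPU
    ("i5-12400",  "RTX 3060"): "igpu",
}

def pick_decoder(cpu: str, gpu: str) -> str:
    # Fall back to the iGPU when the exact combination isn't in the table.
    return DECODE_PREFERENCE.get((cpu, gpu), "igpu")

print(pick_decoder("i9-13900K", "RTX 3060"))  # -> igpu

The point is just that keeping such a table accurate across every CPU/GPU pairing and codec is a lot of maintenance, which is presumably why the user is given the checkboxes instead.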
i got to do it for the Beformance💯💯💯💯💯💯
Hmm, if I use NR, Magic Mask, or something like that, does my GPU still handle them?
Yes, it's only for decoding, i.e. playback of footage; everything else will still be on the GPU.
@@theTechNotice thanks for answer bro
@@theTechNotice But why not have both checked, so the software can use both or choose one over the other when needed?
What is the time difference between the two options?
Time difference? There's no time difference. It's about having smooth playback on the timeline.
What if you don't have that option? I have a 12th gen Core i9, so it should be there.
Enable iGPU from BIOS!
Is an i7-6700K good enough for CPU editing?
I had few issues with my i7 4790K until it died a year ago. If you do have issues, just use proxies.
I have an i5 12400; there is no Quick Sync option showing in the DaVinci decode options.
Is the iGPU enabled in bios?
I have a 12400F - does it work?
Why doesn't my DaVinci pop up with that option? I have a 13th gen.
Studio or free version?
@@theTechNotice Free. I guess you have to have the Studio version.
No Intel Quick Sync on an Intel system with no iGPU...
Does this work for the free version?
No.
How about Intel Arc?
To my knowledge, the current Intel GPUs have a much slower video encoding engine than their CPUs. I don't know if it's the same for playback (decoding).
@@akyhne This same channel said that the Intel Arcs have strong encoders that can even match higher-end Nvidia cards, though obviously with less GPU muscle.
In DaVinci Resolve I see only "Use GPU for Blackmagic RAW decode".
i7-13700, RTX 4070 Ti, and it's the free version.
On Windows the Free version doesn’t support a lot of things that the Studio version does in this area:
- no support for multiple GPUs (or combining iGPU and discrete GPU)
- no hardware acceleration for h.264 and h.265 so you don’t have access to the same preferences
- no support for h.264 10-bit 4:2:2 footage (shows up as audio only)
On Mac some of that changes, but on Windows you really need Studio to get a fast experience working with h.264/h.265 footage. If you are using the free version, you may want to convert your footage to the DNxHR format, since without the h.264/h.265 GPU acceleration, CPU playback is a bit easier with DNxHR.
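For anyone on the free version who wants to try that DNxHR route, here is a rough conversion sketch of my own (not from the video) using ffmpeg from Python. It assumes ffmpeg is installed and on your PATH; the filenames are examples, and dnxhr_hq with yuv422p is just one common profile choice - pick whatever matches your source footage:

# Convert an h.264/h.265 clip to DNxHR HQ so it plays back easily without GPU decode.
# Assumes ffmpeg is on the PATH; input/output names are hypothetical examples.
import subprocess

src = "clip_h264.mp4"    # example input file
dst = "clip_dnxhr.mov"   # DNxHR is usually wrapped in a .mov container

subprocess.run([
    "ffmpeg", "-i", src,
    "-c:v", "dnxhd", "-profile:v", "dnxhr_hq",  # DNxHR HQ via ffmpeg's dnxhd encoder
    "-pix_fmt", "yuv422p",                      # 8-bit 4:2:2, typical for the HQ profile
    "-c:a", "pcm_s16le",                        # uncompressed audio for editing
    dst,
], check=True)

Expect the DNxHR files to be much larger than the camera originals - that is the trade-off for lighter CPU playback.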
Some say the free version will have more GPU support in DR 19.
This isn't true for me with a 4090/13900K - turning off NVIDIA means it will use the CPU and the Intel GPU instead. If I disable my 4090 and then watch the task monitor, it will still use the Intel GPU AND the Nvidia GPU simultaneously. This is with Sony 8K H265 Log media, at least. It is very dependent on the codec and on what Resolve activity you are doing (edit/colour/FX) which combination of HW acceleration it decides to use. Best to leave them all turned on; you're just stressing your CPU more if you turn it off.
Try h.265 4:2:2 10-bit footage and you'll see ;)
@@theTechNotice Yep, that's what I was using: 8K H265 4:2:2 at max bitrate. Resolve 18.1.2 and the latest Studio drivers. I'll try 4K footage, but with 8K it's best to leave NVIDIA ticked.
@@denizahmet2299 Well, your Resolve is working properly and used both GPUs, because Nvidia does not support 10-bit H.265 4:2:2 decoding.
Great info, thanks for that. I have an Intel 12600K and an RTX 3060 Ti, and legit Studio 19; I have to say it works better now. But there is another problem: Sony and DJI footage works great, but Canon footage also plays and then at one moment the image freezes. All footage is 8-bit h264. Are there any other codecs to install for DaVinci Resolve or Windows?
The "decode h264 using hw acc" doesnt exist for me lmao what
You are still saying HS264, which is confusing. It's H.264. LOL
0:36
Intel 4th 💀
First
Second