This is what I have been waiting for. I am a photographer, and you saved me money on a GPU; an 8% improvement is almost nothing for me, so I will just upgrade my CPU when a new series comes out, 13th gen or Ryzen 7000. Thank you, love from Macedonia.
Thank you for this video. After searching across many social media sites and not finding the conclusive answer I was looking for regarding my slow Adobe AI Denoise runtime of 180 sec/3 min (estimated at 2.5 min, but it always takes 3 min), I decided to do some testing myself. Here are the results on my old Intel i7-6700K with 32GB RAM, a 1TB M.2 SSD, and an Asus Z170-A motherboard, using an ISO 12,800 Sony 60MP compressed RAW file:
GTX 1060 6GB GDDR5: 180 sec (3 min)
RTX 3060 12GB GDDR6: 33 sec
RTX 3060 Ti 8GB GDDR6X: 26 sec
RTX 3070 8GB GDDR6: 24 sec
A full Disk Cleanup was performed before and after each video card installation, and each figure is the average of five tests. Though the RTX 3070 8GB GDDR6 was only 2 seconds faster, the loading and preview calculation times were also a bit faster per image.
I hope someone will benefit from my findings.
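For anyone who wants to reproduce this kind of averaged-run comparison, the procedure is simply to time the same job several times and average the results. Here is a minimal sketch in Python; the `denoise_image` call and file name are hypothetical placeholders for whatever batch operation is being measured:

```python
import statistics
import time

def time_runs(task, runs=5):
    """Time a callable `runs` times; return (mean, stdev) in seconds."""
    durations = []
    for _ in range(runs):
        start = time.perf_counter()
        task()  # the batch job being measured, e.g. denoising one RAW file
        durations.append(time.perf_counter() - start)
    return statistics.mean(durations), statistics.stdev(durations)

# Hypothetical usage: wrap whatever operation you are benchmarking.
# mean_s, spread_s = time_runs(lambda: denoise_image("ISO12800_60MP.ARW"))
# print(f"avg {mean_s:.0f}s +/- {spread_s:.1f}s over 5 runs")
```

Averaging five runs, as the commenter did, smooths out caching and thermal variation between attempts.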
Thank you so much!!!!! Very good. It was very useful to me. I had the same doubt about the GPU's impact on AI Denoise above all. The difference between the GTX (which is what I have) and the RTX (which is what I intend to change to) is impressive. Greetings!
Did you get any crashes or freezes when using LR or PS, especially when using any of the AI features? I use an i7-6700K as well.
@digitchannel8689 Not once.
Maybe you will never read this comment since it's an "old" video, but since I assembled a new machine with a 13600K + 3060, I ran some tests in Lightroom with and without GPU acceleration. Well... I can testify that I gained 11% performance with the GPU activated: an export of 1500 full-size JPEGs took 1 min 30 sec less. Now, if it were only for this, I'd say who cares; I don't think a 10% performance gain is worth the 345 euros I paid for the GPU. But, and it's a BIG but, during export and import, CPU usage with GPU acceleration active was only 45-50%, while without GPU acceleration CPU usage was 100%. That means that with a decent GPU you can import and export faster, and all the while you can also do something else, since your CPU is not at 100% usage.
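To log the CPU-usage difference described above, one option is to sample system-wide utilization while an export runs. A rough sketch using the third-party psutil library follows; the 90-second duration and 1-second interval are arbitrary choices for illustration:

```python
import time
import psutil  # third-party: pip install psutil

def sample_cpu(duration_s=90, interval_s=1.0):
    """Print system-wide CPU utilization once per interval while an export runs."""
    samples = []
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        pct = psutil.cpu_percent(interval=interval_s)  # blocks for interval_s
        samples.append(pct)
        print(f"CPU: {pct:5.1f}%")
    print(f"average: {sum(samples) / len(samples):.1f}% over {len(samples)} samples")

if __name__ == "__main__":
    # Start the Lightroom export, then run this script alongside it.
    sample_cpu(duration_s=90)
```

Run the export twice, once with GPU acceleration enabled and once without, and compare the average utilization the script reports.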
You should run the benchmark (PugetBench for Lightroom Classic) and report back your findings. I'd love to learn what your "active" score is, as I'm considering building the same CPU+GPU combo (plus 64GB of DDR5 7200 RAM) and don't care about import/export performance.
That was an eye-opening video. I suspected that was the case, but to actually see it proven was a disappointing discovery: all that money on a high-end GPU only to get an 8% improvement. It would be great to see this type of research for most of the other creator apps. I use Affinity Photo and Capture One, and I suspect they really don't utilize all the GPU and CPU horsepower available to us. I'm also interested in knowing which creator apps use the multiple cores on the latest CPUs. Thanks for the excellent content.
Not every user has the i9-12900K CPU. It would be awesome to see what kind of performance gains GPU acceleration delivers when paired with a lower-end CPU (i5-12400?) and a mid-range GPU like the 3060...
A 3060 is anything but mediocre for Lightroom. That particular combo would work really well.
With the new noise reduction feature in LrC, the scenario has changed, because it is very GPU intensive. I would like to see (if possible) an update covering this feature; I'd like to know whether it's worth making the GPU change. Screen resolution is also an important factor.
Do you know whether the GPU helps/makes a big difference with the Generative AI fill in Photoshop?
I've recently done an experiment measuring the speed of the new AI-powered Denoise function in Adobe Camera Raw: I opened 10 compressed RAW files from my Sony a7R IV (60 megapixels and roughly 60 megabytes each). The photos were shot at dawn at ISO 800, so they were moderately noisy. I set 50% Denoise. On my Windows 10 desktop with a good old i7-3770 coupled with an RX 580, it took 20 minutes to finish the task (around 2 min per photo). After replacing the RX 580 with a more powerful GTX 1080 Ti, the speed increased exactly twofold (60 sec per photo). It took around 75 sec per photo on my Acer laptop with a Ryzen 5 5600U and mobile RTX 3050.
I also tried the same task on my friend's PC, which doesn't have a dedicated GPU (it has integrated Radeon 7 graphics). The speed slowed down to... 30 minutes (!) per photo.
Right? This video missed the function the GPU helps the most with: AI Denoise.
Thank you so much for this video. After suggesting the idea for it, I was not expecting you to actually do it. I have a 5900X and an old GTX 970. I was looking to buy a 3070 because I also want to buy a Dell 4021QW, and the 970 won't support its native resolution. After this video I think the best option for me is a 3050. Thank you again, great job and keep up the good work!
I'd suggest that an actual test of a Lightroom operation would produce more meaningful results than the benchmark test we see here. Thanks for making the video and for pointing out that setting in Preferences.
Puget is pronounced "pew-jet." Puget Sound is a place in Washington State, USA, for which the company is apparently named, as they are located there.
The gains in export speed from GPU acceleration aren't huge; on the other hand, you don't need a higher-end GPU to get the acceleration. The gain isn't so much that your photos export faster, but that your computer runs much quieter AND lets you do other things, rather than having all the cores in your CPU tied up doing exports!
Great video, bro. I'd like to see Agisoft Metashape benchmarks. It's a really cool application to benchmark, but there aren't many tests on YouTube or other websites.
Just the video I was looking for, as I am about to build a new PC for Lightroom, Photoshop, other digital photo editing programs, and basic video editing. I am going with the Intel Core i7-13700K, a Z790 motherboard, DDR5, etc. for some future-proofing, and I was stuck between an Asus RTX 3060 OC 12GB and an Asus RTX 3080 or 4080, which felt like overkill. I can fortunately afford them, but the market manipulation and price gouging on these newer higher-end cards is, in my opinion, outrageous. The 3060 is available for $350 on "sale" brand new, so I am going with that. I have watched dozens of your videos on PC builds, GPUs, and more. Your videos are among the best, most useful, and most relevant! Thanks, and please keep making more.
And how is it going with AI Denoise in LrC?
Can I suggest that you run some sort of survey to find out what type of video editing software your viewers actually use.
I suspect that many of your viewers use alternatives to Adobe Premiere Pro or DaVinci Resolve.
Personally I use PowerDirector and I love it for its capabilities and above all its ease of use.
I have, actually; scroll down a bit on the Community tab on my channel ;)
I'm thinking about a 13600K + 32/64GB without a discrete GPU for a dedicated "budget" Lightroom PC for a buddy of mine.
Do you think that's the best option to keep costs as low as possible (it's a hobby, not a job) while still getting good performance for today and beyond?
Thanks for the video!
Here's the REAL question: do Adobe employees even use Lightroom? That would say everything in my book...
I mean, basically you could have a $25,000 computer setup and it's still going to be a $hit show, because Adobe has inefficient code and lackluster developers.
Best part: they don't "have" to care, given their oligopoly status, along with record quarterly earnings.
If the A750 and the RTX 3050 are the same price, which one is the better choice, considering CUDA cores, driver optimization, future support, etc.? I will do graphic design using Photoshop, Illustrator, Maya, Blender, 3ds Max...
And which will be better for the Arc A750: GUNNIR or Intel?
Thanks.
I'd go for the A750.
Question:
I have been using Photoshop, and in one of my projects I have a lot of layers. When I resize an object, it is very slow and stutters.
Is this because of the GPU???
I have a 1070 (my weakest component), 128GB of 3600MHz RAM, and a 5950X on an Asus Dark Hero board.
I feel like this system should be snappy with whatever I throw at it.
Any suggestions?
Same here.
What video card would you buy for Lr/Ps? I don't have any. As prices are OK now, it's time to upgrade from my 5700G to a 5900X, and I'll need a GPU. Nothing too expensive. What about the GeForce 2000 series? Are they worth it? Thank you.
Dude... you totally missed the acceleration of the animations when applying effects in LR! How could you miss this? It's the main feature in my opinion... it basically makes LR usable... This is just typical dumb benchmarking that tells half the story. Thumbs down from me :(
Been thinking of reserving my brother's 6500 XT for mild gaming and then editing/rendering using QSV.
A 12100 should be a decent stopgap.
Might as well upgrade to a 10-core 13400F once I get myself a 3050.
Thanks for this. I have a six-year-old laptop with a GTX 1070 and recently got a pop-up from Adobe telling me I might have a problem with crashes. I never had a problem, but after this warning LR started to hang every now and again, so I updated the drivers per their directions and the problem stopped. A big BUT: I also use DaVinci Resolve, and it stopped working, so I had to go back to the generic drivers. I'm now stuck between a rock and a hard place.
According to you, an RX 6600 will work just fine for photos, and an RX 6600 is a lot cheaper than an RTX 3080. I live in Las Vegas, where it is not uncommon for it to be 30°C indoors. How much CPU can I run at that heat? I am running an i5, but is there any problem with using an i7 or i9?
For graphics and rendering, I think a Quadro-series card would be better and much cheaper, but the 3080 is also great and has a lot of cores.
Literally what I have: a 12th-gen i9 and a 3080 12GB. When I use PS, no problem, but LR is at 98% usage non-stop?!?!?!
Why?
You’re not alone - 5950x, 128gb ram, 3080 slightly overclocked, 3x2tb samsung 980 pros = still a crawl when batch processing edits with ai masks and export of 750+ files for time lapse work 🤮🤮🤮🤮🤮🤮
Thank you 🙏❤️😃
I have a question: does Adobe Premiere use the CPU or the GPU for rendering and exporting, or can it use both?
Hi, is the new Denoise in Lightroom GPU-hungry? I am only using an Nvidia Quadro K2000 2GB, and it's giving me speed problems; Denoise takes so long, and I have no idea if the bottleneck is the CPU, GPU, or RAM. Would an Nvidia Quadro K2100 4GB make a big difference for me? Please help. Regards, Richard from the New Forest.
OK, I finally have a great reason to replace my GTX 1050 Ti. I will likely get an RTX 3060.
For some reason, when I use my 3080 10GB card, it says AI Denoise should take 20 minutes or more. Not sure why.
No wonder my graphics card only runs at around 1-10% when using LR CC.
GPU: Nvidia GTX 1070.
When I do heavy editing, Lightroom dies on me. Threadripper 1950X, RTX 2080 8GB, 64GB RAM, M.2 SSD.
I found my RX 580 did a poorer job at exporting than my CPU, an R5 3600, lol. However, the GPU was totally quiet.
Can anyone with a Ryzen 7 7840HS CPU and RTX 4060 GPU (Legion Slim 5) confirm whether Lightroom runs without bugs?
I am reading about some issues from mid-2023; people say that masks and Denoise make the system crash.
What about Photoshop?
We will need dedicated TPU cards for Lightroom and AI editing in the future. This is very disappointing.
I would like to see some work scores from a machine with both a GPU and an iGPU vs. a machine with a GPU and no iGPU.
Are you going to do a Legion 5i Pro 16 12700H video production benchmark?
Yes
I don't play games, only content creation, mainly images. A roughly 10% improvement from a discrete graphics card over integrated graphics is really not justified, in my opinion; a basic RTX 3060 costs >US$550.
Will a 6700 XT give good results in your tests?
Nvidia is generally better with Adobe software, but it will be fine.
@froznfire9531 Okay, thanks bro 😊
Imagine living at 48° without AC 💀
Currently my ambient temperature is 27° without AC. Temps don't cross 30-32° where I live 🙃
It would be nice if this worked with my RTX 3070, but I have 100% RAM usage 😢
Is there a way I could win one of your gaming PCs? I've been using mine since 2017 and have been struggling to upgrade. I wish you could see this. I would really appreciate it.
I think I am the second viewer.
STOP WITH THE FREAKING BENCHMARKS! FFS! Why are you so against real-world tests that people (creators) can actually relate to???