Matt, you're doing great with your videos - to the point, at the functional level. My gripe is with Adobe, which has been slacking for the 12 years I've been paying them a subscription.
Their generative AI, running in their cloud and forcing an upload of my images, seems like slacking as well.
My camera recognizes faces, eyes, animals, etc. Imagine I shoot a still of the Milky Way with a 20-minute exposure: in the time my camera produces that one still, its electronic viewfinder, refreshing at 120 FPS, has executed 144,000 viewfinder updates and AI runs from a 45MP sensor. It doesn't need the cloud for that. And let's remind ourselves what "real time" means in programming: code has to finish in a predetermined amount of time. Every one of these 144,000 EVF frames and AI runs must execute in no more than 1/120 of a second - probably half that - while applying debayering/demosaicking and data compression to each update. And the camera is 100% mobile tech running off a small battery.
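A quick back-of-the-envelope check of that claim, as a minimal Python sketch (the exposure time and refresh rate are the figures quoted above, not measured values):

```python
# Arithmetic behind the "144,000 viewfinder updates" claim.
exposure_s = 20 * 60            # 20-minute exposure, in seconds
evf_refresh_hz = 120            # electronic viewfinder refresh rate

frames = exposure_s * evf_refresh_hz      # EVF updates during one exposure
budget_ms = 1000 / evf_refresh_hz         # hard real-time budget per frame

print(frames)      # 144000
print(budget_ms)   # ~8.3 ms to debayer, run AI and refresh the display
```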
W.r.t. digital photography, Adobe is a major player in the "Bayer social contract": we shoot "colour" images with a colour-blind sensor that sees the humanly visible spectrum of light (and generally a bit more), adapted with filters so each sensor cell sees just one colour, which is what gets recorded in the raw file. Adobe's part of the social contract - thoughts go out to a channel with the word conspiracy in it - is to invent the missing colours for us.
With a filter grid aligned to the photosites (the actual sensors in the sensor), each photosite only sees one spectral colour band. The raw file, while "positive" rather than negative, only has monochromatic (mono = single; chromatic = colour) data elements, and at some point after the raw image was shot, each monochromatic data element must have its missing colours added. The 14 raw bits are in one colour: either red, or green, or blue. Camera Raw turns that into RGB for Lightroom Classic [1]. AFAIK, Camera Raw hands a 16-bits-per-channel RGB image in memory to Lightroom Classic or Photoshop, in the ProPhoto colour space. The 14 bits per single-colour data element have then become 48 bits per pixel, which Lightroom must reduce to 21 or 24 bits per pixel or your monitor cannot render it. Photoshop can upsample the 16 to 32 bits per channel, so you have 96 bits per pixel. Great - but maybe not a priority for most users. Maybe an indication of leadership that has no clue what it is leading. But then, some say Photoshop's rendition of the 96 bits per pixel looks better to them on their 24-bits-per-pixel monitor. Bias?
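To keep that bit-depth bookkeeping straight, here is a minimal sketch of the arithmetic in Python (the 14/16/8/32-bit figures are the ones quoted above; actual pipelines vary per camera and per application):

```python
# Bit-depth bookkeeping for the Bayer -> RGB hand-off described above.
raw_bits_per_photosite = 14            # one colour (R, G or B) per photosite
acr_bits_per_channel   = 16            # what Camera Raw reportedly hands over
channels               = 3             # R, G, B after demosaicking

bpp_acr     = acr_bits_per_channel * channels   # 48 bits per pixel in memory
bpp_monitor = 8 * channels                      # 24 bits per pixel on a typical display
bpp_ps_32   = 32 * channels                     # 96 bits per pixel in Photoshop's 32-bit mode

print(bpp_acr, bpp_monitor, bpp_ps_32)          # 48 24 96
```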
The raw Bayer file is 100% colour noise and luminance noise, and most noise in our RGB images is residual Bayer noise resulting from inadequate raw processing (AKA raw conversion). The old algorithmic AI in raw processing generates digital artefacts: moiré gets dealt with, and bleeding of colours across borders between differently coloured blobs is managed reasonably well, but crinkly lines may be visible when you pixel-peep or print large. That's very bad AI, running locally.
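To see how crude the "invention of missing colours" can be, here is a deliberately naive demosaic sketch (a hypothetical helper, assuming an RGGB Bayer pattern and NumPy). It replicates one sample of each colour across every 2x2 tile - exactly the kind of shortcut that produces the colour bleeding and crinkly edges mentioned above; real converters interpolate far more carefully:

```python
import numpy as np

def naive_demosaic_rggb(mosaic: np.ndarray) -> np.ndarray:
    """Crudest possible demosaic of an even-sized 2D RGGB Bayer mosaic.

    One R, one G and one B sample per 2x2 tile is replicated across the
    whole tile, "inventing" the two colours each photosite never measured.
    """
    h, w = mosaic.shape
    rgb = np.zeros((h, w, 3), dtype=mosaic.dtype)
    r = mosaic[0::2, 0::2]          # red photosites
    g = mosaic[0::2, 1::2]          # one of the two green photosites per tile
    b = mosaic[1::2, 1::2]          # blue photosites
    for c, plane in enumerate((r, g, b)):
        rgb[:, :, c] = np.repeat(np.repeat(plane, 2, axis=0), 2, axis=1)
    return rgb

bayer = np.random.randint(0, 2**14, size=(4, 4), dtype=np.uint16)  # fake 14-bit raw
print(naive_demosaic_rggb(bayer).shape)   # (4, 4, 3): full RGB from single-colour samples
```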
Adobe must have felt some market pressure from Topaz and DxO in recent years, because they finally worked on better removal of residual Bayer noise and digital artefacts.
While the influencer universe raps about "colour noise", "luminance noise", "not enough photons", quantum physics, and blames the "sensor", such influencers give the likes of Adobe an alibi to do nothing in innovation. Instead, Adobe raised the L in their P&L by developing new versions - new code bases - of what already exists (mobile versions), wasting your and my subscription money.
The cloning to fix things in Adobe Camera Raw (that is, the Develop module in Lightroom Classic) is inadequate for cases that are a bit more difficult than small pimples. Now they offer an innovation that, however, is (a) done in the cloud and (b) may at some point cost additional money. The learning-based AI works well, but the division-of-labour "model" vacuums. Learning AI implies a learning phase, in which training examples are used and the results are assessed (by humans, probably), followed by an operational phase in which learning is usually switched off. Here, users may be (ab)used as trainers for AI that keeps learning - and the AI may at some point decide to use fragments of your images to repair someone else's generative AI repairs. I might agree with this model if Adobe were transparent about it. And no - as long as it is done in the cloud, I won't.
Earnings per share (EPS) for 2012: US$1.66. EPS for 2023: US$11.82 [www.macrotrends.net/stocks/charts/ADBE/adobe/eps-earnings-per-share-diluted]
As users, we are treated as milk (cash) cows, not as stakeholders who are as important as shareholders. 440 million shares versus 220 million users, with 30 million cloud subscribers.
[1] Lightroom Classic (LrC) provides its own UI on top of Camera Raw (ACR), while Photoshop (Ps) runs ACR as what looks like a spawned stand-alone daughter process. If some function of ACR is not usable in LrC, then LrC's UI overlay is incomplete. ACR runs as a daughter process to Ps in its native UI, with a direct memory connection between the two. You can also run ACR stand-alone by launching it from Adobe Bridge (Br): start Bridge, go to an image file you want to process in ACR, right-click on the item, and click "Open in Camera Raw".
Your edits are stored in an ACR-specific sidecar file, which LrC ignores when you create a new catalogue. IMO LrC should treat an ACR sidecar as a virtual copy - but that could be scary for the Mudbricks (programmers at/for a firm with a synonymous name).
Useful video. Thanks for clarifying the workflow for accessing the plugins from Lr Desktop, which I was finding difficult to get working; but your point is well made to still go via Ps to the plugins as I will be doing.
Excellent, really useful. Can I remove power lines with Lightroom Classic (without resorting to Photoshop)?
Hi. There are remove features in LR but they're not automatic.
I recently updated my macOS to Ventura 13.7.1 to be able to update to the newest versions of LR and PS, and since then LR Classic seems to be running slow at times. For example, the sliders are choppy and slow to react, making editing times longer than usual. I am using a 2017 iMac and have around 275GB of 1TB free, and before the updates things ran fine as long as I didn't drop much below 20% memory available.
I am wondering if maybe my iMac is just too old to properly handle the recent updates, or if maybe there is a setting in LR that needs to be changed to support large files.
Any help would be appreciated.
Hey Matt! Do you know why they removed the opacity slider from the removal section (for both cloning and healing) in Lightroom Classic? I tried to ask in the Adobe forum, but it kept erroring me out. Thanks! Always love your videos!
Hi. I still see the Opacity adjustment when I use the tool. Thanks.
@@MattKloskowski thank you! I guess my question on the Adobe forum did go through. I wasn't making my selection first, which is why the opacity slider wasn't showing up.
I am still using LR Classic for free, along with a few plugins, and I am quite happy with my output. It was always about getting a well-exposed, well-composed shot anyway, with minimal editing.
Do you have any tips on dodge-and-burn tutorials in LrC?
Hi Matt, I am asking you this question as you are the only one I trust. I followed you and now edit in Lr desktop, but with the new update - I don't know what happened, maybe some bug - I can't reach the top bar, can't change from cloud to local or local to cloud, and my adjustments are not visible at the top, meaning I can't see my edits at all. Please help me. I am not sure if it is a preference issue or a bug?
Hi. I wish I could help but it sounds like your issues are not tutorial related but more Adobe support issues so you'd have to contact them.
@@MattKloskowski Thanks, I will. Still appreciate your video.
OK, so this is Lightroom Classic, not Lightroom?
Both. If you watch the video you’ll see both are covered
I assume each time Lightroom decides to use generative fill, your balance goes down by one credit, making a generation after each brush stroke pretty expensive. I am sure Lightroom will try to use generative fill often. Going forward, Adobe has no incentive to update the monthly cost of the plan if they can just charge based on AI usage.
Yes, Gen Fill will use up your credits - not each brush stroke, but each time you click Apply. If you don't want to use it, you have the freedom to not click, but stop with the conspiracy theories already. Have you seen or heard of one person that has been overcharged for this service as you're suggesting? Please point me to it, because in 1 year I haven't.
My plan has only 100 credits per month (basic Photoshop and Lightroom 20GB). I was not sure if the brush stroke was using AI, but I suppose it makes sense that it only sends the photo once. I tend to be very careful with Gen Fill as I don't have a lot of credits, but for those with the full Creative Cloud it should not be an issue. It's nice that generative fill is in there as an option. I do hope they move some of this simple AI gen fill to the local machine for small changes, so it needs fewer cloud services. Thanks for the clarification.
I wish it was that easy to rename one file. LOL