How Wildlife Photographers Are Getting Topaz AI Wrong

  • Published: 25 May 2021
  • I was doubtful that Topaz AI really worked for wildlife photography as well as everyone claimed, until I tried it! HOWEVER, from my experience with it so far, I've noticed some things that some wildlife photographers have been getting wrong in their photos. Check out this video to see the ways I've learned to hack Topaz AI and use it to its full potential.
    Check Out My Affiliate Link Below! Use Promo Code JNEIPP15 for 15% off.
    topazlabs.com/shop/ref/997/
    --Follow Me--
    Online: jeremyneipp.com
    Instagram: / jeremyneipp
    --My Gear List--
    (Support Me With Amazon Affiliate Links//Helps Me Without Costing You a Penny More)
    Panasonic S1R - amzn.to/2VOPUX0
    Sigma 150-600mm, 5.6-6.3 Contemporary Lens - amzn.to/3aQhGGL
    Canon EF 16-35mm, 4.0 USM Lens - amzn.to/31HbboR
    Sigma 24-70mm, 2.8 L Mount Lens - amzn.to/3jqy6d2
    Sigma 18-35mm, 1.8 Art Lens - amzn.to/3d1QOoH
    Panasonic G9 - amzn.to/2xlWXwM
    Panasonic Lumix G Lens 25mm, 1.7 - amzn.to/3d6gtNg
    My Camera Backpack - amzn.to/2TkQedV
    Camo Scarf - LOOGU Camo Scarf - amzn.to/3aVmEC6
    My Camo Balaclava - amzn.to/3mnHilI
    My Camo Gloves - amzn.to/3dDOvtR
    this is why I do wildlife photography, join a wildlife photographer on the hunt for the perfect shot, bird photography in the forest, 4 day solo winter camping and wildlife photography, 1 day solo camping and wildlife photography on an island, bird photography | photographing shorebirds, wildlife photography for beginners, 10 tips for improving your wildlife photography, sigma 150-600, 10 amazing wildlife photography tips, waited 10 days for this moment, what's ruining your bird photography, the 10 rules of bird photography, prowess of the heron
  • Animals

Comments • 65

  • @OwenEDell
    @OwenEDell 2 years ago +1

    This is great. Straight up info, no filler. I just subscribed. Thanks!

    • @JeremyNeipp
      @JeremyNeipp  2 years ago +1

      That’s what I’m trying for! Hope you continue to enjoy the stuff!

  • @BucksSavage
    @BucksSavage 3 years ago +6

    I see you are using .jpeg files. What about RAW files?

  • @evangelostsakiris1067
    @evangelostsakiris1067 3 years ago +4

    The Denoise slider is way up; I mean, at the value you put in, 79, it's no surprise the photo comes out creamy. I use it at 15 max, or put it on auto as a starting point.

    • @JeremyNeipp
      @JeremyNeipp  3 years ago

      Most definitely! The problem is that the Denoise slider also adjusts what gets registered as noise to be targeted, so you can't just turn down the Denoise level without also changing where the denoising is applied... that's why I kept it high in those examples; otherwise it wouldn't have targeted the right areas. I've been learning that a few others have experienced this problem as well, so it's definitely a universal thing.

  • @ChrisBilcliff
    @ChrisBilcliff 3 years ago +5

    Great video. I found the music in the background while you were talking a bit distracting, but maybe that's just me?

    • @JeremyNeipp
      @JeremyNeipp  3 years ago

      Thanks man! I'll keep that in mind.

  • @kennyc4472
    @kennyc4472 3 years ago +3

    I shoot with a 21 MP camera. When I crop my image I always go to Gigapixel first, because my ISO normally doesn't go over 800, then Sharpen AI next, but with the new update it's very slow, so it's frustrating to use. Is 20 MP considered low resolution? Thanks.

    • @JeremyNeipp
      @JeremyNeipp  3 years ago +1

      Hey Kenny! I would argue that if you don't crop in, 20 MP is not low resolution; however, once you begin to crop in like you said, you lose detail and resolution pretty fast. For me, I consider anything under 16 MP low resolution, so if you're shooting at 20 MP and cropping, you get below 16 pretty fast.
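
A quick back-of-the-envelope sketch of that crop math: resolution falls with the square of the linear crop, which is why a 20 MP frame drops under 16 MP after even a modest crop. The 20 MP starting point comes from the exchange above; the crop fractions below are arbitrary examples.

```python
# Rough arithmetic: how cropping reduces effective resolution.
# The 20 MP starting point is from the comment above; crop fractions are illustrative.

def cropped_megapixels(start_mp: float, keep_fraction: float) -> float:
    """Megapixels left after keeping `keep_fraction` of each image dimension."""
    return start_mp * keep_fraction ** 2

for keep in (1.0, 0.9, 0.8, 0.7, 0.6, 0.5):
    print(f"keep {keep:.0%} of each side: {cropped_megapixels(20, keep):.1f} MP")

# keep 100% of each side: 20.0 MP
# keep 90% of each side: 16.2 MP
# keep 80% of each side: 12.8 MP
# keep 70% of each side: 9.8 MP
# keep 60% of each side: 7.2 MP
# keep 50% of each side: 5.0 MP
```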

  • @Andy_Thomas
    @Andy_Thomas 3 years ago +1

    Great video.
    1. Could you use a mask to apply denoise to one part of the picture and sharpen to another (e.g. denoise the background and sharpen the subject)?
    2. I got the Canon R6 despite the fact that 20MP is too limiting when cropping. Do you think Gigapixel would help here?

    • @JeremyNeipp
      @JeremyNeipp  3 years ago +1

      Hey Andy! Thanks! Yes, you can use masking to affect only certain parts of the image, and I would think Gigapixel is helpful for your 20 MP cropped images, because when you're cropping in you probably start getting to 12 MP and lower, and I think that's when Gigapixel begins to be helpful. Hope that helps!
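
To make the masking idea concrete outside of Topaz, here is a generic Python/Pillow sketch that treats the subject and background differently and blends them with a mask. Gaussian blur and an unsharp mask are only stand-ins for denoising and sharpening, the file names are hypothetical, and nothing here reproduces Topaz's actual masking workflow.

```python
from PIL import Image, ImageFilter

# Generic mask-based blending: smooth ("denoise") the background,
# boost ("sharpen") the subject, then combine the two with a mask.
# Blur and unsharp mask are stand-ins; Topaz's AI models are not used here.

img = Image.open("heron.jpg").convert("RGB")       # hypothetical input image
mask = Image.open("heron_mask.png").convert("L")   # white = subject, black = background

background = img.filter(ImageFilter.GaussianBlur(radius=1.5))          # noise smoothed
subject = img.filter(ImageFilter.UnsharpMask(radius=2, percent=120))   # detail boosted

# composite() takes pixels from the first image where the mask is white
# and from the second image where the mask is black.
blended = Image.composite(subject, background, mask)
blended.save("heron_blended.jpg")
```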

  • @KurtisPape
    @KurtisPape 2 years ago

    I got the new version of Denoise and found I was getting artifacts in the out-of-focus areas, so I went back to an old version and it doesn't get those clumpy artifacts; on rare occasions it can still get them. Also, if I change the settings to use my dedicated graphics card, for some reason I get the clumps just like the new version, but the files process instantly, so I de-select that and make my CPU do the work, which is much slower: instead of instant it's usually 7 minutes for a 61 MP image. Also, 50% of the time I don't apply denoise to the bird, so I use a mask to deselect the bird.

  • @narutodayo
    @narutodayo 3 years ago +1

    Some very interesting suggestions here, thanks for sharing.

    • @JeremyNeipp
      @JeremyNeipp  3 years ago

      Of course! Yea I was surprised by my findings too but they still are true to this day for me. Stay encouraged!

  • @KGsPhotography
    @KGsPhotography 3 years ago +1

    Well explained Jeremy. Totally agree, Sharpen is always my first port of call, and I find I use DeNoise less and less these days. I've only ever used a trial of Gigapixel, but it seemed quite good, especially for cropped images. Cheers, Keith

    • @JeremyNeipp
      @JeremyNeipp  3 years ago +1

      Thanks Keith! Yea it's weird, but it seems as if Denoise is becoming less and less useful for a lot of people. Sharpen is definitely a great tool, and Gigapixel as well, but it's really only for people who struggle with lower resolutions for the most part. Cheers!

  • @KenToney
    @KenToney 3 years ago +1

    If I have an ISO 10,000 photo, should I try Sharpen and use the denoise slider inside it? This is different from my workflow and I want to try this for sure.

    • @JeremyNeipp
      @JeremyNeipp  3 years ago

      Most definitely! That's what I would recommend, and really what I was discovering prior to this video. I found I often liked the Sharpen AI denoise better than the Denoise AI denoise.

  • @MrBakravitz
    @MrBakravitz 1 year ago +1

    I wonder if the updated versions of Topaz Labs Sharpen, Denoise, Gigapixel, and even Photo AI have the same issues, or whether they have been addressed by Topaz Labs.

    • @JeremyNeipp
      @JeremyNeipp  1 year ago

      They definitely have since this video was made 2 years ago! I'd highly recommend checking out my video on Topaz Photo AI I made on my side channel a few months ago.
      ruclips.net/video/uko8Ne-xuqM/видео.html

  • @russellwebb3672
    @russellwebb3672 3 years ago +9

    Hi, just curious: you seem to crop your image(s) before sending them to Denoise AI (maybe I am seeing it wrong?). If so, that is the wrong approach IMO; sending an uncropped image gives the program more to gather info from and does seem to give better results at output. I also wonder why Topaz does not give the option in Denoise AI to automatically select the object in an image as you can in Sharpen AI. Russ.

    • @JeremyNeipp
      @JeremyNeipp  3 years ago

      Hey Russ! That's interesting; I haven't tested the difference between cropping before or after, but theoretically it should be the same, correct? A crop isn't changing the data within the space you crop to, so the program should recognize it identically to the pre-cropped image? But maybe I'm incorrect; it would be interesting to test this.

    • @jpdj2715
      @jpdj2715 3 years ago +1

      @@JeremyNeipp - true: "a crop isn't changing the data within the space you crop to". Also true: "an uncropped image gives the program more to gather info from". The last sentence concludes, "The moral of these facts for you is, whatever you do, denoise then crop or crop then denoise, if you feel the result should be better, then swap to try the other approach as well." The rest of this essay explains why.
      The problem underlying "denoise" is called raw processing. Your sensor is colorblind and analog. You get "color" because the camera manufacturer puts color filters over the real sensors, called photosites. These filters are monochrome and per cell. Then there's analog to digital (AD) conversion in some circuit that may be bundled with, or stacked on, the sensor chip, and that AD has its camera manufacturer firmware dictating how an analog "real and extreme precision (*)" value is converted into a digital integer number. And here is where you have your "14 bits" (or 10, 12 or 16). So your raw file has photosite data in the bits precision that you set in the firmware. On your display you have pixels that are RGB, and in order to give a good image these pixels need RGB values (**) - and these are not in your raw file, but must be "invented" ("faked"?). Imagine you shoot a plane that has one half black and the other white, and there's a tiny red dot in the middle of the black and one in the white. Imagine the red dot in the white is recorded by a photocell that has a green filter, and the red in the black is recorded by a photosite that has a blue filter. The sensor knows nothing. The AD firmware converts an analog brightness level for each photosite into a number, and that's what is in the raw file - with the color filter label, maybe. Now "raw processing" (****) needs to convert that data, and this is done by looking at the data of photocell [x,y] and its neighbours. Doing this mathematically precise and repeatable wild-assed guessing in the basic form will cause cross bleeding along the line between the black and white areas: in the black area you'll see cloudiness from the white, and in the white area along the border you'll see grey contamination from the black region nearby. You can imagine that (a) the two red dots will not necessarily return as red and (b) they will also have 'resonance' from the algorithm in their surrounding pixels. Which is to say that most color noise in digital pixel representation of our photographs is an artifact from raw processing. This first phase is called deBayerization, as the color filter grid over the sensor generally follows the idea of Mr. Bayer (***). Because of artifacts from that, it is followed by another algorithm called "demosaicking". So if deBayerization creates Moiré, then demosaicking should find and remove it. We, humans, easily recognize Moiré, but "color noise" makes us blame the camera or lack of photons at the quantum physics level - bull excrement. Yes, "noise" exists at the photosite level too, but relative to what happens in raw processing it is marginal.
      The problem in writing the demosaicking algorithm is that it can only be helped with AI-type analysis, like "convolution", which was invented in that AI branch called "computer vision" about 30 years ago. This helps detect edges between blobs, and these blobs can be used in pattern matching so as to recognize what is in the picture. The more sophisticated this is done, i.e. the more sophisticated the AI, the better the raw processing will be. Or, detection of blobs might reduce cross bleeding over edges between blobs. This has all sorts of problems, like how you treat blurry out-of-focus transitions. Deeper knowledge in, or available to, the algorithm may give better raw processing, but it may invent things that weren't there. And if the latter is perceived as much better than plain deBayerization, then we will be happy, even with this illusion.
      Your point about "the data within the space you crop to" is in here somewhere. The deBayerization looks at the neighbors of each [x,y] photosite and may go as far as 10 rows and columns from [x,y]. But at the edge of the sensor, there are no neighbors. Which means we either need a separate edge-deBayerization algorithm, or the sensor should have additional rows and columns that have their data in the raw file, but will never become visible to us after raw processing.
      And it is also in the "AI" that might be helped by the presence of more context for its object recognition analysis. Here, the question is, if the AI recognizes a bird and its feathers, or a marten and its fur, or a human and their eyes, pupil and eyelashes. Basically, is it intelligent or artificial. The moral of these facts for you is, whatever you do, denoise then crop or crop then denoise, if you feel the result should be better, then swap to try the other approach as well.
      (*) the precision of the analog cells in a sensor may depend on voltage regulation by the camera.
      (**) when raw processing converts the data from a 14 bit raw - monochrome - photosite into an RGB pixel, then the best cameras (or the best raw processing?) give us a 27-bit RGB color space. Which means the 14-bit R value is now squeezed into the 9 bits it gets out of the 27 that is 9+9+9. It also means that all 27 have been guesstimated.
      (***) en.wikipedia.org/wiki/Bayer_filter illustrates that the color filter array is organized in quartets of R,G,G,B filters. Shooting 14 bits means you get 14,14,14,14 from each quartet. deBayerization turns these into RGB,RGB,RGB,RGB with 27,27,27,27 bits. Testing a camera and comparing them is basically reviewing "raw processing". Writing as if you can test a sensor, not a camera, has helped sensor manufacturers a lot, but nothing is farther from the truth and nothing is less valid.
      (****) raw processing happens e.g. in Lightroom when you open your raw file, but is done in camera by its firmware when you shoot JPEG or MPEG. At the moment you convert raw to pixels, you, or the camera, determine how this is done, and in the process you lose data quality.
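
As a purely illustrative aside on the deBayerization described above: the sketch below implements the most naive form, bilinear interpolation over an RGGB mosaic, and shows the colour cross-bleed along a hard black/white edge that the comment describes. Real raw converters (camera firmware, Lightroom, Topaz) use far more sophisticated, edge-aware algorithms; nothing here reproduces any vendor's actual pipeline, and the function name and test values are made up for the example.

```python
import numpy as np

def bilinear_debayer(mosaic):
    """Naive bilinear deBayerization of an RGGB mosaic.

    mosaic: 2-D array of photosite brightness values laid out as
        R G R G ...
        G B G B ...
    Returns an (H, W, 3) float RGB image in which two of every three
    channel values per pixel are interpolated ("guessed") from neighbours.
    """
    h, w = mosaic.shape
    rgb = np.zeros((h, w, 3), dtype=np.float64)

    # Which photosite carries which colour filter.
    r_mask = np.zeros((h, w), dtype=bool); r_mask[0::2, 0::2] = True
    b_mask = np.zeros((h, w), dtype=bool); b_mask[1::2, 1::2] = True
    g_mask = ~(r_mask | b_mask)

    # Scatter each photosite value into its own colour plane.
    rgb[..., 0][r_mask] = mosaic[r_mask]
    rgb[..., 1][g_mask] = mosaic[g_mask]
    rgb[..., 2][b_mask] = mosaic[b_mask]

    # Fill the holes in each plane with the average of the known 3x3 neighbours.
    for c, mask in ((0, r_mask), (1, g_mask), (2, b_mask)):
        plane = rgb[..., c]
        known = mask.astype(np.float64)
        pv, pk = np.pad(plane, 1), np.pad(known, 1)
        num = sum(pv[i:i + h, j:j + w] for i in range(3) for j in range(3))
        den = sum(pk[i:i + h, j:j + w] for i in range(3) for j in range(3))
        plane[~mask] = (num / np.maximum(den, 1))[~mask]
    return rgb

# A hard black/white edge: the interpolated channels pick up values from
# the other side of the edge, so pixels along the border come out tinted.
# That cross-bleed is what later reads as "colour noise".
mosaic = np.zeros((8, 8)); mosaic[:, 4:] = 1.0
print(bilinear_debayer(mosaic)[3, 2:6])
```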

    • @JeremyNeipp
      @JeremyNeipp  3 years ago

      Hey there! Thanks so much for your response; honestly a much more educated perspective than I know or can even understand at the moment. But I learned a lot from what you shared. I can definitely agree with the moral of the facts: trying the other approach is always important and crucial for understanding and learning what it is we are creating. Thanks for putting the time into this response! I think it gives a very deep dive into questions some people may have.

    • @narutodayo
      @narutodayo 3 years ago

      That's interesting, I actually recall hearing the reverse, that you want to feed it an image at final dimensions, i.e. post-crop if you're cropping. The reason is that the algorithm is calculating its output based on how many pixels it will display the image at and optimizes its results for that resolution specifically.

    • @jpdj2715
      @jpdj2715 3 years ago

      @@narutodayo - basically we don't know the "algorithm" that, say, Lr uses. We do know - "Preferences" cause differences - that Lr bases its previews on horizontal display resolution. That's not the same as cropping. Note that a crop definition in Lr changes nothing, except a bit of preview. When you "export" it will make its final deBayerized and demosaicked version of your raw file based on your correction settings, and adapt to the format it is told to generate. If there is some form of "computer vision" in there, at that moment, then my guess is that the algorithm looks beyond the crop borders for its image recognition. The edge/blob detection of first-stage computer vision is important at every magnification of our images because of anti-aliasing (AA) and the prevention of cross bleeding. To speed processing up, programmers may cut corners. If you generate a low-res image, then less certainty may be needed, or if edge/blob identification runs into an uncertainty within crop borders, it might look outside them. We do not precisely know and, in the case of Lr, it's the intellectual property - trade secret - of the Mudbricks. The same applies to deBayerization followed by demosaicking. Imagine a deBayerization that has the edge/blob detection integrated and potentially a second stage of image recognition which identifies sets of blobs as objects like "human", "tree", "leaf", etc., then this might prevent a lot of demosaicking from being needed. But not all. It gets more finicky at the level of color noise following from pixel-level uncertainty of what the real original colors could have been (this is a form of color noise that has nothing to do with the sensor and the camera/firmware operating that sensor, but purely follows from the monochrome Bayer filtering of photosites). We can only think aloud and guess - like deBayerization: get an acceptable general notion but totally miss a tiny detail.

  • @marknathan7744
    @marknathan7744 3 years ago +1

    Good job, thanks for sharing.

    • @JeremyNeipp
      @JeremyNeipp  3 years ago

      Thanks Mark! Glad you found it helpful.

  • @andycoleman2708
    @andycoleman2708 2 years ago

    Have you compared Adobe LR and Photoshop "super resolution" versus Gigapixel?

    • @JeremyNeipp
      @JeremyNeipp  2 years ago

      I have not! I have heard things about it though. Would be interesting to do a test!

  • @methodical100
    @methodical100 3 years ago +1

    I have noticed in my experience that if you have a lower-resolution camera and think you may blow an image up to a bigger size, I would use Gigapixel first. The only reason I found this out is because I had a picture someone liked and wanted blown up; the size of the image didn't allow for what the person wanted, so I bought Gigapixel and blew up the edited JPEG to the size they wanted, and it looked like someone had placed an artistic filter over the picture; it kind of looked like a painting. So what I did was enlarge the original image first with Gigapixel, so I was working on a higher-resolution image to start with, then cropped and re-edited. It worked out really well.

    • @JeremyNeipp
      @JeremyNeipp  3 years ago

      Most definitely, I've discovered the same thing in my work. Thanks for sharing your experience as well!

  • @chris_harry
    @chris_harry 2 years ago

    Great video. Niche should be pronounced "Neesh".
    Topaz Denoise works well on auto settings. The default is often only around 18 percent, so your examples of 70 percent denoising seem very extreme and will always affect detail and realism.
    I think Denoise is great at auto levels, and you could even turn down sharpening and add a bit of Recover Original Detail.
    Colour noise reduction at 10-20 percent also helps in Denoise for chromatic aberration. I think you could even use Denoise with sharpen off and then mask/sharpen in LR and PS, although I like the sharpening in Denoise as well, but you sometimes need to go lower than auto settings for sharpening.
    PS has different sharpening modes which I find are not too bad either. LR also has decent sharpening with the right masking applied.
    With Denoise, I also find that AI raw is not always the best option. I have found each different mode performed better for certain images. Auto mode works pretty well in Denoise and you can often turn down the sharpen tool, as well, then do minor sharpen tweaks in LR and/or PS.
    Converting the RAW file into a DNG file after applying Denoise can also allow for further sharpening in Adobe Camera Raw.

  • @JaminTaylor
    @JaminTaylor 2 years ago +1

    I use sharpen ai for my noise reduction

    • @JeremyNeipp
      @JeremyNeipp  2 years ago +1

      Yea I find it often more effective! Interesting you can relate

  • @wildlifevlogs_
    @wildlifevlogs_ 3 years ago

    Nice video

  • @SassePhoto
    @SassePhoto 2 years ago +1

    I use sharpen ai mainly and get superb results for my eagle 🦅 images. It's all I need

    • @JeremyNeipp
      @JeremyNeipp  2 years ago

      It’s a great program! Glad you’ve enjoyed your experience too

  • @freetibet1000
    @freetibet1000 2 years ago +1

    I noticed that all your examples were JPG images. In my experience I will get vastly better results if I use Topaz AI on raw files. So much more latitude to work with, if you will. Topaz accepts raw files directly into the algorithms without having to “pre-bake” them in Lr or Capture One. When done, you have the option to export them in DNG (raw) format for further development in your preferred raw editor. If you're shooting JPG instead of raw, I strongly encourage you to reconsider that strategy! I couldn't help but notice that you have a highly efficient Mac, so your workflow wouldn't be slowed down on your workstation by editing raw instead of JPG either.

    • @JeremyNeipp
      @JeremyNeipp  2 years ago

      Thanks for the input! Yea, I've definitely changed some aspects of how I work the programs since I first purchased them; this original video was made around the time when I had just purchased them and wanted to share with everyone else.

    • @freetibet1000
      @freetibet1000 2 years ago +1

      @@JeremyNeipp Yea, I realize now that your video has been on YT for a while. Good luck with your photography.

  • @user-oe5jl2br6u
    @user-oe5jl2br6u 3 months ago +1

    Can Topaz operate as a standalone?

    • @JeremyNeipp
      @JeremyNeipp  3 months ago

      Yes! That’s how I mostly use it

    • @user-oe5jl2br6u
      @user-oe5jl2br6u 3 months ago

      @@JeremyNeipp So if I didn't have Photoshop or LR, can I use it? Does it have basic editing tools as well? Sorry, newbie here!

    • @JeremyNeipp
      @JeremyNeipp  3 months ago

      You don't need other editing tools for it to function, but it doesn't do the same things as Photoshop or LR, so no basic exposure adjustments, color adjustments, etc. @@user-oe5jl2br6u

  • @scottwedell5072
    @scottwedell5072 3 years ago +2

    I agree. I noticed that denoise used to be so much better when it first came out, but after a few updates it's definitely lost its magic.

    • @JeremyNeipp
      @JeremyNeipp  3 years ago

      That's interesting! Since I've only been one month into it, I haven't seen it go through those stages, but it's interesting that you've noticed that over time.

    • @scottwedell5072
      @scottwedell5072 3 years ago +1

      Could be just me; I was definitely blown away, like how you were saying you were with Sharpen. Could be I'm just expecting too much from it.

    • @JeremyNeipp
      @JeremyNeipp  3 years ago

      Sharpen really is amazing!! A much more useful tool for wildlife photographers than Denoise, in my opinion.

    • @scottwedell5072
      @scottwedell5072 3 years ago +1

      @@JeremyNeipp and you convinced me to give gigapixel a try.

    • @JeremyNeipp
      @JeremyNeipp  3 years ago

      You should do it! It's a great program.

  • @mahakalax108
    @mahakalax108 2 years ago +1

    SUB

  • @bigfootmm
    @bigfootmm 2 years ago

    Thanks for the research. I did get tired of hearing "you guys" over and over and over.