LUMINANCE - Do You Really Need It?

  • Published: 31 May 2024
  • We all take it for granted that for doing great Astrophotography you need LUMINANCE. But does all this additional work really pay off, and especially, when does it make sense to use LUM and when not?
    Video by @lukomatico: • LRGB vs RGB - Which is...
    ---------------------
    Join my Patreon for cutting-edge news about astrophotography software and equipment, early access without commercials, and tons of supporting documents: www.patreon.com/user?u=83665984
    If you buy any equipment, you may consider these two shops; by using the links below you support the channel:
    Agena Astro: agenaastro.com/?rfsn=7982331....
    High Point Scientific: www.highpointscientific.com/?...
    #astrophotography
    ------------------------------
    Music credits:
    ORBITAL_StriKe by B E T T O G H | / bettogh
    bettogh.bandcamp.com | open.spotify.com/artist/3zDlq...
    Music promoted by www.free-stock-music.com
    Creative Commons Attribution 3.0 Unported License
    creativecommons.org/licenses/...

Comments • 27

  • @charliemiller3884
    @charliemiller3884 8 months ago +5

    After 10 years of shooting mono LRGB-SHO frames, I have converted to only using an OSC camera plus filter wheel with UV/IR and Optolong L-eXtreme filters. This provides excellent RGB imaging and narrowband imaging with less imaging time and less processing time.

  • @OigresZevahc
    @OigresZevahc 8 months ago +5

    Thank you very much for all you do for us, Sasha!

  • @bobc3144L
    @bobc3144L 8 months ago +3

    Outstanding explanation! Thank you.

  • @pcboreland1
    @pcboreland1 7 months ago +3

    As a British English speaker, it's loo-minance. English is so messed up! Great video, awesome!

  • @davewilton6021
    @davewilton6021 8 months ago +4

    Synthetic luminance can be very useful for SHO images and RGB images where you don't have real luminance subframes. After stretching, you create the synthetic lum and apply all your sharpening to that. Then you apply convolution (not deconvolution) to the color image. This blurs out the color noise. Then you recombine the images, with the sharpened synthetic lum as the new luminance channel. This sharpens the structure without sharpening the color noise. It doesn't increase your integration time, but it results in a better image without adding much complexity to the workflow.

    • @viewintospace
      @viewintospace  8 months ago

      Great input Dave! You describe here nicely what is preached by those promoting synthetic LUM. The issue is, the only thing you can state as an advantage of doing this is blurring out color noise. And if THAT is really the only tangible effect synth LUM has (and I would not know of any other), then I know a MUCH faster way of achieving that.
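
The recombination step @davewilton6021 describes above can be sketched roughly as follows. This is a minimal numpy/scipy illustration under assumptions of my own (a stretched RGB float image in [0, 1], a simple unsharp mask standing in for the sharpening, and a Gaussian blur standing in for the convolution); it is not anyone's actual PixInsight workflow, and the function and parameter names are made up.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def synthetic_lum_recombine(rgb, blur_sigma=1.5, sharpen_amount=0.6):
    """rgb: stretched image, float array of shape (H, W, 3) in [0, 1]."""
    # 1. Build a synthetic luminance as the mean of the color channels.
    lum = rgb.mean(axis=2)

    # 2. Sharpen only the luminance (an unsharp mask stands in for
    #    whatever sharpening tool you actually use).
    lum_sharp = np.clip(
        lum + sharpen_amount * (lum - gaussian_filter(lum, blur_sigma)), 0, 1
    )

    # 3. Convolve (blur) the color image to suppress chrominance noise.
    rgb_soft = gaussian_filter(rgb, sigma=(blur_sigma, blur_sigma, 0))

    # 4. Recombine: rescale the blurred color so its brightness comes
    #    from the sharpened synthetic luminance.
    ratio = lum_sharp / np.maximum(rgb_soft.mean(axis=2), 1e-6)
    return np.clip(rgb_soft * ratio[..., None], 0, 1)
```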

  • @lukomatico
    @lukomatico 8 months ago +3

    Hey Sascha! Very interesting video mate, well done! There are so many facets to this particular question that I take my hat off to anyone tackling the subject haha! :-D
    Thanks for the mention by the way! I'm glad my old vid was some use :-)
    Clear skies!

    • @viewintospace
      @viewintospace  8 months ago

      And thanks for inspiring this video with yours! I think it's great how we can build on each other's work and push the thought process further, one video at a time...

  • @MrPedalpaddle
    @MrPedalpaddle 8 months ago +4

    The argument for synthetic luminance with narrowband would come from those who stretch and colorize each channel before combining - e.g., Steve @EnteringintoSpace would then add convolution to the colored channels to remove noise, then restore the structure lost through the convolution with a synthetic luminance. Not sure offhand if @paulyman also does this.

    • @PaulymanAstro
      @PaulymanAstro 8 months ago +2

      I do. Exactly as you described. Though, as Sascha says, I do think carefully about how I do it and whether I do it. Sometimes I use the Ha data, sometimes I create a synthetic lum by integrating multiple channels if I feel they add structure. RGB stretching is to me 90% about maintaining good colour contrast; synthetic luminance to me is all about maximising contrast and sharpness as well as highlighting interesting structures.

    • @MrPedalpaddle
      @MrPedalpaddle 8 months ago +1

      Thanks very much for the comment. I’m finding your tutorials very helpful. I hope you can update your Foraxx script to play with the new PI version. Cheers!
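
Building a synthetic luminance by integrating multiple channels, as @PaulymanAstro mentions above, might look something like the sketch below. The channel names, weights, and the choice of a weighted average (rather than, say, a per-pixel maximum) are assumptions for illustration only, not anyone's published workflow.

```python
import numpy as np

def synthetic_lum(channels, weights=None):
    """channels: list of stretched 2-D float masters (e.g. Ha, OIII, SII)."""
    stack = np.stack(channels, axis=0)
    if weights is None:
        weights = np.ones(len(channels))
    # A weighted average lets the channels with the most structure dominate
    # while keeping the noise of the weaker channels down.
    return np.average(stack, axis=0, weights=weights)

# Hypothetical usage with stretched masters ha, oiii, sii:
# lum = synthetic_lum([ha, oiii, sii], weights=[0.6, 0.3, 0.1])
```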

  • @darkrangersinc
    @darkrangersinc 8 months ago +4

    Great video and explanation! Never been a huge fan of Synthetic L or Ha doubling as a Luminance layer; I would rather just add more actual data. But I think you did a nice job highlighting when it can make sense to use a Luminance layer.

  • @pcboreland1
    @pcboreland1 7 months ago +2

    You're on the right track, I think, with IR. Perhaps a blend of the two. This is what a number of people doing lucky DSO imaging have been doing for some time.

  • @starpartyguy5605
    @starpartyguy5605 7 months ago +1

    For many years, going back to the early 2000s, I shot long lum and short color using very small (compared to today) cameras: ST7, ST8, STF-8300. This year I moved to the QHY268M with 50 mm filters. I'm using a C9.25 on a G11 Gemini 2. I switched from Maxim to NINA along with all the extra stuff to learn, including PixInsight. So learning curve, culture shock... I got my Optec Lepus f/6.3 focal reducer configured with a special spacer that Optec made for me. Now I shoot 3-minute subs and no luminance. Images seem OK so far. But wow, so much new stuff to learn!

  • @paulbenoit249
    @paulbenoit249 8 months ago +3

    Great video... this is why I am planning to keep using my color camera to capture the color, and the equivalent mono is on its way to shoot luminance only (or Ha in some cases), to try and get the same results as shooting fully mono LRGB, but with the benefit of no filters, filter wheel, ...

  • @larryfine4719
    @larryfine4719 3 months ago +1

    Ah, lots of things make sense here. While using RGB for luminance is not technically a bad idea, LRGB is definitely more efficient in terms of imaging time :-)

  • @dbakker7219
    @dbakker7219 8 months ago +3

    Hi Sasha, very good explanations! Thank you. I experimented with IR too, and another reason for less detail in your Andromeda is that IR has a longer wavelength and thus always a lower resolution than visible light in our amateur scopes. Also, using a refractor for IR does not work well (I think you used your FRA 400?) as the glass messes with your IR signal and diminishes its strength. I always use a reflector for IR imaging: no glass, no correctors or glass in between. I pick up more of the smaller galaxies/clusters, but resolution is less.

    • @viewintospace
      @viewintospace  8 months ago

      That is really helpful - thanks!!!!

  • @BruceMallett
    @BruceMallett 8 months ago +3

    Somewhere around 3:30 you say that luminance provides the detail and contrast. I've read this claim elsewhere and that it is sufficient to shoot RGB at a lower res (say bin2x2) if you keep the luminance at full res (bin1x1). Do you do this? It would save a lot of session time, would it not?

    • @viewintospace
      @viewintospace  8 months ago

      I don't do this, but yes, it makes sense to me and should work fine. I also heard of people who shoot the Lum with a mono cam and the RGB with an OSC cam. Also a way to save time.
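
The time-saving idea in this exchange (RGB binned 2x2, luminance at full resolution) could be combined roughly as below. A hedged sketch only: the array shapes, the bilinear upsampling, and the simple luminance transfer are assumptions of mine; real tools handle registration and color far more carefully.

```python
import numpy as np
from scipy.ndimage import zoom

def combine_full_res_lum_with_binned_rgb(lum_full, rgb_bin2):
    """lum_full: (H, W) float in [0, 1]; rgb_bin2: (H//2, W//2, 3) float."""
    # Upsample the binned color data to the luminance resolution.
    rgb_up = zoom(rgb_bin2, (2, 2, 1), order=1)
    rgb_up = rgb_up[: lum_full.shape[0], : lum_full.shape[1], :]

    # Let the sharp L channel set the brightness; the color only tints it.
    ratio = lum_full / np.maximum(rgb_up.mean(axis=2), 1e-6)
    return np.clip(rgb_up * ratio[..., None], 0, 1)
```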

  • @davecurtis8833
    @davecurtis8833 5 months ago +1

    Great video. Pretty much matches my experiences. For a very bright nebula with bright stars like M42, would you use Lum as well as RGB?

    • @viewintospace
      @viewintospace  5 months ago

      If you shoot RGB and not Narrowband, then Lum should be used.

  • @Phenolisothiocyanate
    @Phenolisothiocyanate 3 months ago

    One thing that confuses me about luminance is: If the color data is good enough to assign a value to a pixel then why do you need lum? Conversely, if color data isn't good enough then won't lum just bring out noisy colors?

    • @viewintospace
      @viewintospace  3 months ago +1

      There is no such thing as color data; it is simply light that passes a filter. What matters is the signal-to-noise ratio: where is there light at all (and how much), and where is it dark. Once I know that, I only need to know how to color it, and that is easier. In very dark areas, even if a color signal is there, it will still be black, so no real issue.
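
A toy numerical illustration of the point above: the luminance decides where anything is visible at all, so noisy color in genuinely dark areas stays essentially black. The numbers below are invented for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(0)
lum = np.array([0.002, 0.4])            # a dark background pixel, a bright nebula pixel
ratios = rng.uniform(0.5, 1.5, (2, 3))  # noisy R:G:B ratios for each pixel
ratios /= ratios.mean(axis=1, keepdims=True)

rgb = lum[:, None] * ratios  # color scaled by luminance
print(rgb[0])  # dark pixel: all channels near zero, the color noise is invisible
print(rgb[1])  # bright pixel: the color actually shows
```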