How much exposure time do I need? Use my free SNR app!

  • Published: 21 Nov 2024

Comments • 116

  • @deepskydetail
    @deepskydetail  1 year ago +6

    Here is the URL to the SNR planning app: deepskydetail.shinyapps.io/Calculate_SNR/

  • @philleng480
    @philleng480 1 month ago

    Interesting! Will look forward to using this. Thanks

  • @monkeypuzzlefarm
    @monkeypuzzlefarm 1 year ago +3

    Fantastic video! I can't believe how quickly you whipped up that app!

    • @deepskydetail
      @deepskydetail  1 year ago

      Thanks! This app wasn't too difficult because the math was already in the original spreadsheet. It was just a matter of turning it into simple code. Plus, the R packages that I use to make the application make coding everything pretty simple.

  • @ByquPL
    @ByquPL 1 year ago +3

    I subscribed to your channel faster than one can say "WOW man, this is superb!"

  • @mikehardy8247
    @mikehardy8247 1 year ago +1

    This should prove very useful to a newbie, who probably oversamples.

  • @Graeme_Lastname
    @Graeme_Lastname 1 year ago +1

    I have just recently "discovered" your channel. I'm very interested in this and all that goes with it. Looking forward to the next one. Thanks m8. 😃 🇦🇺 🖖

  • @yervantparnagian5999
    @yervantparnagian5999 7 months ago +1

    Very interesting. Thank you for your work. I suspect, seeing as most everyone uses some type of filter, the hours of integration/exposure will go up exponentially.

  • @mohicanspap
    @mohicanspap 1 year ago +4

    I hope this tool gets more developed over time. Such a great thing to have, especially for beginners or even professionals. Also, with the camera I have I don't use bias frames but dark flats. Hopefully there will be an option for those in the future. Great work!!!

    • @deepskydetail
      @deepskydetail  1 year ago +2

      You should be able to just take some bias frames to get the calculations. I only use dark flats too, but taking a few bias frames is easy and the numbers should be valid for a while.

    • @mohicanspap
      @mohicanspap 1 year ago

      @@deepskydetail Good to know! Thanks for getting back 😃

  • @AntonioPena1
    @AntonioPena1 1 year ago +1

    James, after watching your video I'm definitely looking to use your application. Thanks for reviving Craig's work. Good job!

  • @anata5127
    @anata5127 1 year ago +1

    Superb! I will try this. Thanks for your efforts.

  • @JohnICGomes
    @JohnICGomes 2 months ago +1

    Great video and awesome web app. Would be great to make this into an iOS app.

  • @stefan_astro
    @stefan_astro 1 year ago +3

    Really cool app! Thank you for creating it. I will try it out myself as well!

  • @yougoattube
    @yougoattube 1 year ago +2

    Hats off to you, sir! I've made a small donation to help the cause along, and to encourage others who might want to contribute to the astrophotography/astronomy community. Many thanks! PS - I also subscribed to your channel.

  • @marvinwhisman3333
    @marvinwhisman3333 9 months ago

    Great video. Thanks so much for your work and contributions to the community.

  • @viewintospace
    @viewintospace 1 year ago +1

    Thanks a lot! Amazing video and amazing app!!! Really cool stuff!

  • @mehensley1234
    @mehensley1234 3 months ago +1

    Thanks for the informative video. How would you gather the mean ADU value in PixInsight for a selected area? I know how to collect the mean value for the entire frame, but I need help finding how to do it for a designated area. Dynamic Crop, maybe?

    • @deepskydetail
      @deepskydetail  3 months ago

      Unfortunately, I'm not sure how to get the statistics in PixInsight. Sorry about that!

  • @TheBillyclash
    @TheBillyclash 1 year ago +1

    Bravo, you rock!

  • @HyperX07
    @HyperX07 7 months ago +1

    Thank you, bro, for your hard work

  • @DrillingDataSystems
    @DrillingDataSystems 9 months ago +1

    Great work! Thanks..

  • @andiheimann
    @andiheimann 1 year ago +1

    awesome work!

  • @Nabby13
    @Nabby13 1 year ago +1

    Well done! One question: how is quantum efficiency of a camera taken into account? The shot noise originates from Poisson distribution of photons. Are you assuming that the quantum efficiency is something like 85% and then taking that into account?

    • @deepskydetail
      @deepskydetail  1 year ago +2

      I don't think quantum efficiency is needed to get the signal. Why? Because the total signal is being calculated from the light frame itself. If QE goes up for a camera, then you'll see a brighter image in your light frame (i.e., the ADU will be higher). In other words, the app will adjust the signal accordingly, based on the two samples you took from the light frame.
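
      A minimal R sketch of that logic, with hypothetical ADU values (not the app's actual code):

        # Hypothetical sample means read off a light frame in Siril (ADU).
        # QE is already "baked in": a higher-QE camera just reports higher ADU.
        adu_dso_plus_sky <- 1500   # mean ADU of a selection on the target
        adu_sky          <- 1200   # mean ADU of a nearby background selection
        gain_e_per_adu   <- 0.9    # camera gain from the spec sheet (e-/ADU)

        signal_e <- (adu_dso_plus_sky - adu_sky) * gain_e_per_adu
        signal_e   # 270 electrons, with no QE term anywhere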

  • @chrisschomburg4936
    @chrisschomburg4936 1 year ago +1

    Nice tool and a nicely designed video. Well done. Maybe you want to check your exposures: the graph of your camera showed 1.9e read noise, not 0.9e.

    • @deepskydetail
      @deepskydetail  1 year ago +1

      Thanks, but don't I have 2 for my read noise in the video? 0.9 is the gain value in electrons per ADU.

    • @chrisschomburg4936
      @chrisschomburg4936 1 year ago +1

      ​@@deepskydetail aaaaah, got it. 😅

  • @Ronbo765
    @Ronbo765 1 year ago +1

    Thank you!

    • @deepskydetail
      @deepskydetail  1 year ago

      You're welcome! And thanks for watching!

  • @DKelly350
    @DKelly350 1 year ago +1

    Thanks, this sounds like a great tool. Do you know how I would relate camera gain (I assume you are using an Astro-camera) to ISO on a one-shot-color camera?

    • @deepskydetail
      @deepskydetail  1 year ago

      I'm going to have a video on it. It will probably come out today or tomorrow.

  • @pipercherokee8598
    @pipercherokee8598 1 year ago

    Excellent work. One thing I can't quite get my head around: no matter what numbers I plug in based on my subs and biases, the app always suggests something on the order of hundreds or even thousands of hours, even to get an SNR of 75 or so. I haven't seen anything in my frames to suggest they are overly noisy; what might I be missing? I am also using Siril stats to get the values. I am happy to take 10 or even 20 hours' worth over a few nights, but not hundreds :). Thanks again, this will be very valuable once I get it figured out!

    • @deepskydetail
      @deepskydetail  1 year ago

      Thanks, do you mind sharing the numbers you're putting into the app?

    • @pipercherokee8598
      @pipercherokee8598 1 year ago +1

      @@deepskydetail SO sorry I missed your reply here - I actually did get it figured out (though I don't remember what the solution was at the moment). Much appreciated.

  • @pcboreland1
    @pcboreland1 1 year ago +1

    Fantastic work! What do you mean by bias signal? I do not take bias frames with a CMOS camera. There are darks, lights, flats, and flat darks. Please say more about getting the value needed to enter into your bias input.

    • @bbasiaga
      @bbasiaga 1 year ago +3

      A bias frame is the same ISO as your flats/darks, with the shortest possible exposure. It contains basically only read noise. It's an alternative to a flat dark for calibrating noise out of your flats.

    • @deepskydetail
      @deepskydetail  1 year ago

      Yes, I would take a very short exposure with the same ISO as your dark and light to get the bias signal.

    • @pcboreland1
      @pcboreland1 1 year ago

      @@deepskydetail Thanks. I have been testing your very nicely built app; some parameters, like read noise, seem to have little to no impact on the curves. Neither does the bias signal. I know you based this on a spreadsheet's set of equations; what are your thoughts on this?

    • @deepskydetail
      @deepskydetail  1 year ago

      @@pcboreland1 I've gotten similar questions on this. Basically, read noise should affect the curve, but it might not have a huge effect (especially if you have a lot of light pollution).
      The bias signal should have an effect, but the dark signal shouldn't. Basically, the skyglow noise estimate takes into account the dark signal (skyglow - dark), and the total dark signal also takes into account the dark signal (i.e., dark - bias). So... in the end those two things cancel each other out in the math, so the dark signal is kind of redundant. But they might eventually be relevant if I ever include more information in the app.
      The biggest impact is probably going to come from the target signal and the skyglow.
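
      In sketch form (R, with hypothetical ADU values), the cancellation looks like this:

        g <- 0.9                                            # e-/ADU (hypothetical)
        adu_sky <- 1200; adu_dark <- 510; adu_bias <- 500   # hypothetical means

        skyglow_e <- (adu_sky  - adu_dark) * g   # skyglow estimate subtracts dark
        dark_e    <- (adu_dark - adu_bias) * g   # dark estimate subtracts bias

        skyglow_e + dark_e          # 630
        (adu_sky - adu_bias) * g    # 630 -- adu_dark has cancelled out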

    • @dimitriospetikopoulos2420
      @dimitriospetikopoulos2420 5 months ago

      I was wondering if any of this applies to OSC DSLR cameras.

  • @samwarfelphotos
    @samwarfelphotos 1 year ago

    This is super cool! I use a Nikon D5300 to image; where can I find those camera stats for it, since I don't use an astro cam that gives them to you on its website?

    • @deepskydetail
      @deepskydetail  1 year ago +1

      You should check out this video I made for DSLRs: ruclips.net/video/oBpAkHim7dY/видео.html
      It takes a bit of calculation to get the RN and gain for DSLRs, but it is doable.
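
      For reference, one standard way to measure these yourself is the photon transfer method; this is a sketch of the general technique (an assumption, not a transcript of the linked video), with hypothetical frame statistics:

        mean_flat     <- 20000   # mean ADU of a flat, after subtracting the bias level
        var_flat_diff <- 18000   # variance of (flat1 - flat2), two matched flats
        sd_bias_diff  <- 11.3    # std dev of (bias1 - bias2), two bias frames

        # Differencing two frames removes fixed-pattern noise but doubles the
        # random variance, hence the factor of 2 and the sqrt(2).
        gain_e_per_adu <- 2 * mean_flat / var_flat_diff             # ~2.2 e-/ADU
        read_noise_e   <- gain_e_per_adu * sd_bias_diff / sqrt(2)   # ~18 e-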

    • @samwarfelphotos
      @samwarfelphotos 1 year ago

      @@deepskydetail Thank you!

  • @Thommy78
    @Thommy78 1 year ago

    Great work, awesome video. Thank you very much!

  • @galactus012345
    @galactus012345 1 year ago +1

    Thank you. Very interesting tool; I am using it. We can see from the graph that it can go beyond 100.
    So does that mean that at some point adding frames no longer increases the SNR? What to think about people adding 100H?

    • @deepskydetail
      @deepskydetail  1 year ago

      Thanks! I'm not sure exactly what the question is. What do you mean by "adding 100H"? Thanks :)

    • @--Adrian--
      @--Adrian-- 1 year ago

      No, gathering more exposure time will always increase the SNR

    • @galactus012345
      @galactus012345 1 year ago

      @@deepskydetail Sorry for the typo, I meant 100 subs. Isn't an SNR of 100 a limit? Are we increasing the quality by going over SNR 100?

    • @deepskydetail
      @deepskydetail  1 year ago

      @@galactus012345 That's a good question, and I should have been clearer in the video. You can theoretically have infinite SNR if you have infinite integration time. It's a ratio, not a percentage, so it doesn't stop at 100.
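
      The standard stacking relationship behind this, as a quick sketch (the single-sub SNR here is hypothetical):

        snr_single <- 0.8                   # hypothetical SNR of one sub
        n_subs     <- c(1, 100, 400, 1600)
        snr_single * sqrt(n_subs)           # 0.8  8  16  32 -- no ceiling at 100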

  • @HaakonRasmussen-r8o
    @HaakonRasmussen-r8o 8 months ago +1

    Thank you for the videos and apps! It is most appreciated. What makes me very sad is to see the exposure time I need to get an SNR of 50 under Bortle 5 skies (mag 20): 144 hours when I use 60s frames. But the exposure time depends a lot on how I frame when I get the statistics. I used a small galaxy, NGC 3184, and when I only framed a part of the galaxy I got an exposure time of 27 hours; when I framed in more than the galaxy, I got an exposure time of 144 hours! So what do you think regarding how to obtain statistics from light frames? I used Siril, by the way.

    • @deepskydetail
      @deepskydetail  8 months ago +1

      Thanks! Wrt your questions, you really need to get a good sample of the sky background for things to give you an accurate measurement. Take the statistics of the sky background as close to the galaxy as you can, making sure you don't include the galaxy in it. Also, the part of the galaxy you take a sample of matters: the resulting SNR in the app is only for the sample that you took. So, if you took a sample from the outer arms of the galaxy, you'll need more imaging time to get the same SNR compared to the brighter core. Hope that makes sense.
      Also make sure you're using an "average" sub frame: one that isn't your best, but not your worst either. These things are important! I hope that answered the question.

    • @HaakonRasmussen-r8o
      @HaakonRasmussen-r8o 8 months ago +1

      Thanks for your reply! I have tried to do as you described, and I find that the results make more sense. 23.45 hours under Bortle 5 skies for SNR 50 makes more sense to me. With the 10 hours of integration time I've got, I have to use binning to achieve a fairly acceptable result. Thank you again! Question: Is it possible to measure the SNR directly in a stacked FITS file? Sometimes I wish I could do that, to be able to compare stacks with different exposure times (60s vs 180s), with the same total time for both.

    • @deepskydetail
      @deepskydetail  8 months ago +1

      @@HaakonRasmussen-r8o So, I think different stacking software will attempt to measure the SNR of a stacked image. But I'm not sure how well they work, and different software produces different results. I've measured the SNR in a few different videos myself (see my Bortle analysis, Ha filter analysis, or SNR and stacking analysis videos). I'm thinking of making an app that can do that with help from Siril, using registered sub frames. But, as far as I know, you need to measure the SNR as you stack.

    • @HaakonRasmussen-r8o
      @HaakonRasmussen-r8o 8 months ago

      Thank you, @@deepskydetail!

  • @xtctaz
    @xtctaz 1 year ago

    Very cool. Is there a way to skip bias frames? I have a DWARFLAB dwarf2 smart telescope, and normally most users of this scope do not use bias or flats.

    • @deepskydetail
      @deepskydetail  1 year ago

      I think you'd need bias for accurate results. But they're really easy to take. The dwarf2 might have a function to let you do that (but I'm not sure). If not, I'd email the developers to get that functionality :)

  • @felipeleon_astrofotos
    @felipeleon_astrofotos 1 year ago +1

    Hello, @deepskydetail. I've been studying this stuff recently from the articles of Craig Stark, and have been using the spreadsheet as well (the same one that appears at 1:40 in your video). I have a doubt; I hope you can help me see where I'm wrong. If I enter into the spreadsheet the very same data you show in your Excel (which is loaded automatically when opening the web app), the target SNR for a single image is 0.794, and then, to get to an SNR of 5, I'd need 40 images (this last calculation is also made by the spreadsheet itself in the Stacking sheet). The problem is that this is not the value the web app gives (it says I only need 33 subs). Where is the difference coming from?
    Thanks in advance. By the way, the web app is really amazing and helpful.

    • @deepskydetail
      @deepskydetail  1 year ago +1

      Thanks for looking into this!! Just fixed it based on your comment! I double-checked, and it seems like there was a mistake in the code when calculating the signal in electrons. It was throwing off the SNR of a single sub exposure a bit, which of course has compounding effects!

    • @felipeleon_astrofotos
      @felipeleon_astrofotos 1 year ago +1

      ​@@deepskydetail Hey! Thanks for considering my comment. And as I said, it's a really helpful tool. Thanks for your work! (BTW, now the numbers match)

  • @pcboreland1
    @pcboreland1 1 year ago +1

    Hi! Changing the camera read noise value or changing the dark signal value appears to have no impact on the SNR curve. Is this expected behavior? In fact I can set both values to zero!

    • @deepskydetail
      @deepskydetail  1 year ago

      Hi, I just checked, and
      (1) changing the read noise value does change the SNR curve for me. Try increasing the read noise to something like 500 to see if it changes. It did for me.
      (2) After double-checking, it is normal that changing the dark signal value does not change the curve. Basically, the skyglow noise estimate takes into account the dark signal (skyglow - dark), and the total dark signal also takes into account the dark signal (i.e., dark - bias). So... in the end those two things cancel each other out in the math, so the dark signal is kind of redundant. But they might eventually be relevant if I ever include more information in the app.

  • @malcolqwe2
    @malcolqwe2 3 months ago +1

    I'm totally confused. I don't know what ratio is "good," so how am I supposed to pick one?

    • @deepskydetail
      @deepskydetail  3 months ago

      You can check out this video I made on it :)
      ruclips.net/video/eRKk3lNyXO8/видео.html

  • @Xanthus723
    @Xanthus723 1 year ago

    I must be doing something wrong. If I'm using PixInsight to grab the ADU, I make a preview of the faint and dark areas and then select the Statistics tool, like you've got there in Siril.
    Problem is, the numbers seem way off. I have the 16-bit selection of the dropdown selected, since that's what my 2600MM is. I tried normalized and 14-bit, since the numbers were closer to yours. Neither worked.
    The outputs I got were as follows. Input 1: 3078, 2: 1909, 3: 12, 4: 10.9. The camera stuff: 1: 0.21, 2: 1.5. Desired SNR: 90. Exposure duration: 90.
    Halp!

    • @deepskydetail
      @deepskydetail  1 year ago

      It looks to me like the ADU values for the light frame might be off (too big of a difference, maybe?). What is the target you are shooting, and what is the light pollution in your area?

    • @Xanthus723
      @Xanthus723 1 year ago

      @@deepskydetail m101 and bottle 4

    • @Xanthus723
      @Xanthus723 1 year ago

      Bortle*

    • @deepskydetail
      @deepskydetail  1 year ago

      I'm having a bit of trouble figuring out which numbers go where. Is this correct?
      DSO + Skyglow Signal: 3078
      Skyglow background: 1909
      Dark Signal: 12
      Bias Signal: 10.9
      Camera Gain: 21
      Camera's Read Noise: 1.5
      If I put those numbers in, I get 9.7 hours to get an SNR of 90. But I think the dark and bias signal values may be off. Are those numbers in ADU? I would expect those to be a bit larger. And the camera gain should be lower (probably around 1.0 or so if you are at unity gain).
      Also, the difference between the DSO+Skyglow and Skyglow numbers seems a bit too large. I'd double-check those and make sure you're getting an output in ADU and not some other unit.

  • @mikhailkochiev4583
    @mikhailkochiev4583 2 months ago

    How do I deal with a Canon DSLR? The dark frame is scaled in the camera to the same mean value as the bias...

    • @deepskydetail
      @deepskydetail  2 months ago

      Clarifying question: does this mean that when you take a bias frame, it has the same ADU as a dark frame?

  • @Groeko
    @Groeko 1 year ago

    How do I calculate the needed exposures per filter for mono cameras? If I use L-RGB filters, do I simply divide the number by 4?

    • @deepskydetail
      @deepskydetail  1 year ago +1

      I think most of your data will be from the L filter, so I would use that and be a bit conservative in the estimate. If you do want to include the RGB, I think you'd want to use a weighted mean depending on the percentage of frames that will be L vs. RGB.
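
      For example, such a weighted mean might look like this in R (all numbers hypothetical):

        # Hypothetical session: 60% of subs through L, 40% through R/G/B combined
        signal_L   <- 300    # mean target ADU through the L filter (hypothetical)
        signal_RGB <- 120    # mean target ADU through a color filter (hypothetical)
        0.6 * signal_L + 0.4 * signal_RGB   # 228 ADU -- the value to enter in the app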

  • @TRUSTKOREA
    @TRUSTKOREA 2 months ago

    Siril's statistics show a 4-digit mean number, but in PixInsight it shows something like 00.000. Does anybody know how to change this to a 4-digit number like Siril's, to apply to this app?

    • @deepskydetail
      @deepskydetail  1 month ago

      On the stats window, there is a checkbox that says "Normalized Real [0, 1]". Is that checked or unchecked?

  • @PeterK6502
    @PeterK6502 8 months ago

    I tried the link to calculate the SNR, but the "Camera Gain (e-/ADU)" does not make any sense to me. e-/ADU stands for "electrons per Analog-to-Digital Unit," which implies more sensitivity for lower values. The lowest read noise for my sensor is at gain 450, where the e-/ADU value is near 0 according to my camera's specifications. But when I use 0 in your app, I need infinite exposures. This does not make sense to me, because it does not match my observations.

    • @deepskydetail
      @deepskydetail  8 months ago

      1) Camera gain is not read noise. e-/ADU basically means: each time a photon hits your sensor, how many electrons will it send to the processor? If it is 0, then you won't record any photons (as measured electrons). Higher numbers mean more sensitivity.
      2) People generally use unity gain, or close to it. It just means that if a photon hits your sensor, it will record it as one electron. Higher gain values generally mean more shot noise.

    • @PeterK6502
      @PeterK6502 8 months ago

      @@deepskydetail You misunderstood me. I know that camera gain is not read noise, but I have a camera with the IMX585 sensor, and according to the specs, when a high gain of 450 is selected, the e-/ADU is near 0. According to your explanation and app, it will then NOT record any photons; however, this setting is MAX gain, and therefore it records the maximum number of photons possible. When I select a gain of 0, the e-/ADU is about 12. According to your explanation, this is the most sensitive setting; however, it is the LEAST sensitive setting. Your explanation only makes sense if you talk about ADU/e- and not e-/ADU.
      Higher gain gives lower read noise for the IMX585 (not higher). This sensor has a read noise of 12 when gain 0 is used, a read noise of 4 at unity gain, and a read noise of 1 at gain 450. Therefore, you have the lowest read noise when the highest gain is selected.
      That is why I use a maximum gain of 450 for very faint objects, to have the most sensitivity (e-/ADU near 0) with the least read noise.
      See specs: astronomy-imaging-camera.com/product/asi585mc/

    • @PeterK6502
      @PeterK6502 8 months ago

      @@deepskydetail I try to reply, but RUclips's bot removes my reply instantly. I will try to reply in several posts.

    • @deepskydetail
      @deepskydetail  8 months ago

      Oh, I see what you're saying now. I explained it poorly, and have updated my previous answer a bit. A better explanation is that e-/ADU is how many electrons represent 1 ADU. If it is 0.5, then each intensity step (Analog-to-Digital Unit) represents 0.5 electrons. When gain increases, fewer recorded electrons make the image brighter. When e-/ADU is below one, you aren't actually improving SNR, because the signal is being artificially amplified. In the extreme case, when e-/ADU is zero, your camera will be saturated with a zero-second exposure (i.e., you'll get a completely white image).
      Example: If e-/ADU is 0.5, then if two photons are captured and converted perfectly to electrons, the ADU would increase by 4. But the true signal (2) is unchanged.
      With that said, you shouldn't use the read noise to guide your gain settings. A good rule of thumb is to use unity gain. Why? Read noise isn't as important as shot noise and dynamic range. Let's say your e-/ADU is still 0.5 and, like before, you capture two photons. Higher gain is not magic: the signal is still 2. But the shot noise (not the read noise) has increased. Instead of being sqrt(2), as at unity gain (1 e-/ADU), you have a shot noise of sqrt(4) = 2.
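
      The conversion in that example, as a tiny R sketch (illustrative names, not from the app):

        adu_from_electrons <- function(e, gain_e_per_adu) e / gain_e_per_adu
        adu_from_electrons(2, 0.5)   # 4 ADU recorded; the true signal is still 2 e-
        adu_from_electrons(2, 1.0)   # 2 ADU at unity gain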

    • @PeterK6502
      @PeterK6502 8 months ago

      @@deepskydetail
      The reason everyone uses "unity gain" is because everyone uses unity gain.
      This, in my opinion, is more of a copy behavior than an actual good reason.
      Because I work with a Dobsonian on an equatorial platform, I want to minimize the length of sub-frames to minimize tracking errors.
      That's why I've done some research into "deep sky lucky imaging," where they work with sub-frame lengths as short as 4 seconds.
      An explanation can be found here (in French): ruclips.net/video/0lKCB0q0jV0/видео.html
      The section about gain settings starts at 19:30. There, the gain is set slightly below the maximum value to minimize read noise (in other words, not unity gain).
      This is exactly what I do as well; I set the gain to almost the maximum value and take many short exposures.
      I came to your page precisely because I want to know how short I can make my exposures without read noise becoming the dominant factor.
      But I could not extract this information with your application.
      Currently, I take a few thousand sub-exposures of 15 seconds each per object, but I want to reduce the sub-exposure to below 10 seconds.
      Your previous explanation about the conversion process is correct; higher gain settings above unity gain do not make the camera more sensitive, but they do reduce the read noise.

  • @andrewoler1
    @andrewoler1 8 months ago

    I'm trying to get an estimate from my subs, but when I load them into Siril, I get separate numbers for red, green, and blue. Which color am I supposed to use? They're unbalanced, so I get different estimates for each: 27 hr, 5 hr, and 8.5 hr of exposure using the red, green, and blue mean values, respectively. Any ideas on the best way to handle this? Thanks!

    • @deepskydetail
      @deepskydetail  8 months ago

      Good question. Are you using RGB filters with a monochrome camera, or a one-shot color camera? If the former, I would use the RGB compositing tool, do a quick color balance, and then use the RGB tab to find the DSO and sky background values. Then put that into the app and multiply the sub length by three. If you're using a one-shot color camera, then use the RGB tab. Hopefully I understood the question.

    • @andrewoler1
      @andrewoler1 8 months ago

      @@deepskydetail Thanks for the reply. I'm using a one-shot color camera. I save the subs as FITS files using SharpCap, then load them into Siril. By default it loads them as black and white, but when I get statistics it breaks them down by R, G, and B separately. If I debayer upon opening, it shows tabs for R, G, B, and RGB, but even with the RGB tab, when I get statistics on a selection, it splits them into R, G, and B channels.

    • @andrewoler1
      @andrewoler1 8 months ago

      @@deepskydetail I do notice there's a box in the top right corner of the Statistics window that says "Per CFA channel," which is checked, but I'm unable to uncheck it. Maybe that would convert them to a single number if I could uncheck it. Not sure why it won't let me uncheck the box…

    • @andrewoler1
      @andrewoler1 8 months ago +1

      Okay, I figured it out. I can load an image and NOT debayer it, then get stats on a selection, and it will let me uncheck "Per CFA channel" and report a single value as in your video.

    • @deepskydetail
      @deepskydetail  8 months ago

      @@andrewoler1 Glad you figured it out! :)

  • @dadwhitsett
    @dadwhitsett 1 year ago

    According to this, then, you will significantly shorten the total integration time by using short-duration light frames, correct?? I calculate that if I use 36 seconds per light frame I can reach SNR > 100 after 7.9 hours, but with a 180-second duration it would take 40 hours???? Really?

    • @deepskydetail
      @deepskydetail  1 year ago

      Thanks for the comment. What numbers are you putting into the app to get such different results? The app tries to estimate noise and target signal on an electrons-per-second basis, and I haven't seen that sort of discrepancy with what I have tested.
      For example, in my light-polluted area, to get an SNR of 30 on M78 with 300-second exposure frames, I would need about 15.5 hours of data (300s x 186 exposures = 15.5 hours). When I use the slider to estimate the SNR with 60-second exposures, it tells me I still need 15.55 hours of exposure time (60s x 933 exposures). If I move the slider up to 5x, I still get 15.83 hours of exposure time (1500s x 38 exposures).
      If you could comment on the parameters you're using to get both results, I'd appreciate it. Thanks!

  • @the.Potocky
    @the.Potocky 1 year ago

    You will need 30753 exposures to get an SNR > 100.1 at 180 seconds (1537.65 hours) 0.570814666845778
    What am I doing wrong?
    Lights: 513 and 508
    Bias: 501.8
    Dark: 502.5
    ZWO ASI 6200MM Pro
    Gain: 100
    Exp: 180 sec
    😢😢😢😢 PLS HELP 🤦‍♂

    • @deepskydetail
      @deepskydetail  1 year ago

      1/2
      OK, let me see if I understand this:
      DSO + SkyGlow Signal: 513
      SkyGlow Background: 508
      Dark Signal: 502.5
      Bias Signal: 501.8
      Camera Gain (e-/ADU): 0.25 (at 100 regular gain, this is what I have for the ZWO 6200mm)
      Camera's Read Noise (Electrons): 1.5 (again, at 100 regular gain)
      Desired SNR: 100
      I get that you will need 1600 hours. That might be correct for 100 SNR. Notice that your DSO + Skyglow signal is only 5 more than your Skyglow background signal.
      That means the part of the image you are using to calculate it is really faint. Or you have a lot of light pollution.
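
      Plugging those numbers into the usual per-sub SNR formula roughly reproduces the app's output (a sketch in R; the app's exact noise terms may differ slightly):

        g  <- 0.25   # e-/ADU at gain 100, from the list above
        rn <- 1.5    # read noise in electrons

        target_e <- (513 - 508)   * g   # 1.25 e- of target signal per 180s sub
        sky_e    <- (508 - 502.5) * g   # ~1.4 e- of skyglow

        target_e / sqrt(target_e + sky_e + rn^2)   # ~0.57 SNR per sub
        (100 / 0.57)^2                             # ~31000 subs for SNR 100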

    • @deepskydetail
      @deepskydetail  1 year ago

      2/2
      Which target are you trying to image, and what part of the target did you select? Also, what is your light pollution like?

    • @the.Potocky
      @the.Potocky 1 year ago

      @@deepskydetail Hi! That's right. The photo we used for the reading was taken in Bortle 2, high in the mountains!

    • @the.Potocky
      @the.Potocky 1 year ago

      @@deepskydetail The Spaghetti Nebula, Bortle 2.
      I have a mono camera; which filter is best to take a photo with?
      Thank you!

    • @deepskydetail
      @deepskydetail  1 year ago

      Do you mind sharing the telescope and FOV you used? I think Ha and OIII would be the go-to filters for that target.
      The Spaghetti Nebula is pretty faint to begin with. If you are using a faint part of the nebula for your input, that might be the reason, especially if you aren't using filters at all. Maybe use a brighter part? Also, you're in Bortle 2, so you might want to use longer exposures. Noise will hardly go up, but your signal should improve (try 300s or even 600s if your mount can handle it).
      It should definitely be possible in Bortle 2, though! Nico Carver could do it in pretty light-polluted skies (www.nebulaphotos.com/sharpless/sh2-240/ ). Try a light frame using an Ha filter and see what you get. Let me know what the results are :)

  • @sHuRuLuNi
    @sHuRuLuNi 1 year ago

    who takes 15s subs ...

    • @michalringes4386
      @michalringes4386 1 year ago +4

      Pro advice here: do the shortest possible subs that are sky-limited and take as many as possible, discarding any that are not perfect (guiding errors, worse seeing, and such). It will improve your photos dramatically. You only want to go to longer subs for extremely faint objects where there is practically no signal at all in a single sub. What matters is total integration time, and when you have subs that are sky-limited you won't gain much by doing longer subs, but you can lose a lot: it can be pretty difficult to get perfect guiding for a longer time, and your subs will be blurrier if the seeing is not super stable. With shorter subs you can be very selective about what you use in the end; discarding a 30s sub is much easier than discarding a 10-minute one.
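
      A common rule of thumb for "sky-limited" (an assumption about the usual criterion, not something stated above) is that the sky signal per sub should swamp the read noise, e.g. sky electrons >= 10 x RN^2. In R, with hypothetical numbers:

        rn         <- 1.5   # read noise in electrons (hypothetical camera)
        sky_rate_e <- 2.0   # skyglow electrons per pixel per second (hypothetical sky)

        # Sub length beyond which read noise adds little to the total noise
        10 * rn^2 / sky_rate_e   # 11.25 s -- longer subs gain little in SNR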

  • @astroattorney
    @astroattorney 1 year ago +1

    SharpCap Smart Histogram Brain