Wow. This is the most comprehensive workflow ever.
All I need now is a good set of lights and calibration, and to follow your 10 rules.
Thank you so much. Ben.
Thank you for being clear, direct, and uncompromising!
Such a useful video, thank you Sascha. I had never realised that BlurXT comes before NoiseXT :) but it makes complete sense now that you've explained it
Thanks for all your help Sascha!
Excellent instructions. Thank you.
Very clear and logical explanation. Thank you.
Thanks, Sascha. Interestingly, your workflow is more or less exactly what I have arrived at after several years of processing. Two stages where I have occasionally departed from the flow are: 1. Using BXT prior to SPCC on ‘correct only’. As I recall, this was recently discussed by Russ Croman and Adam Block because it might improve colour calibration. I can confirm that it doesn’t hurt, anyway. You can follow up with a full BXT after SPCC. 2. When an object lies within a dense star field, it sometimes helps to do DBE after separating the stars and prior to the stretch. It’s easier to see and avoid faint nebulosity in the absence of stars. Regards, Andrew.
Thanks - Nr. 1 I also covered in the BXT2 video (though I was not aware of it before I read the BXT2 manual), but Nr. 2 is something new to me and a really great tip - thanks!!!
Thank you Sascha. I have been waiting for a video like this.
Hi. I'm a PixInsight beginner. This was a very helpful video for me. I've subscribed to your channel. Thanks a lot!
Another great video, Sascha, with complete, concise information.
Thank you for this!
Brilliant! Thanks for organizing my mind 😊
Concise & Clear! Very helpful
Thank you! So useful and clears up so many questions 👌
Well, that was straightforward and brilliantly clear. Well done indeed!
That was my intention - happy it worked 😊
Very good explanation of all the steps. Thank you. I agree with all of them except for one of those 0.5% exceptions. I found while processing the Seahorse nebula, that if I ran NoiseX before stretching, NoiseX confused the dark nebula with noise and destroyed it. Performing StarX, GHS, and then NoiseX worked much, much better. This was on roughly 19h of data from a 9.25 Hyperstar.
Exactly, a very good example of such an exception - great that you found the optimal way to deal with this situation!!!
Excellent video Sascha
Thank you for creating 👏🏻👏🏻👍🏻🌌🔭
Excellent presentation. Fortunately, this is exactly the sequence of my workflow.
Very nice video. I am a complete beginner with PixInsight. Thank you very much 🙂
Excellent video. Thank you for laying it out so clearly. Also thank you for all the extra info you share on your Patreon. It’s great!
Hi Sascha,
I was sure there was a correct sequence to follow in Pix, to have a more scientific rather than a botched processing approach, as there are dozens of workflows on RUclips; now, thanks to you, I have confirmation that it was true, thank you!
Your videos are not only interesting but above all they are presented with extreme clarity.
I take this opportunity to ask you for clarification regarding the formula to enter into PixelMath to recombine the stars: is the "~" symbol in the formula ~((~starless)*(~stars)) correct?
Thank you, I wish you clear skies, see you in the next video.
Ciao
Roberto
Yes, it's correct. If you want to know what the symbol means and what the whole formula does, have a look at my PixelMath video: ruclips.net/video/KcJEa739LE8/видео.html
@@viewintospace 🙏 I’ll do it! thanks 😉
Thanks for an excellent overview. I think you recommend dithering elsewhere, so it would be good to know where you resample in the workflow - presumably after BXT. As was said below, I am sometimes happier with NXT after stretching. I mostly use the L-eXtreme filter for nebulae and find SPCC does not work as well with that. I use combine(stars, starless, op_screen()), which I believe is the same as your formula, though I am not sure. I also stretch stars and starless separately before putting them back together. You and others have persuaded me to use DBE instead of ABE and GHS instead of STF, at least for the starless image.
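Since the "~" formula and op_screen() keep coming up: in PixelMath, ~x is simply the inversion 1 - x (for data normalized to [0, 1]), so ~((~starless)*(~stars)) = 1 - (1 - starless)*(1 - stars), which is the classic "screen" blend. So yes, combine(stars, starless, op_screen()) should give the same result. Here is a quick NumPy sketch (not PixInsight code, just a numerical check) to convince yourself:

```python
import numpy as np

# Toy "images" normalized to [0, 1]; in PixelMath, ~x means 1 - x.
rng = np.random.default_rng(0)
starless = rng.random((4, 4))
stars = rng.random((4, 4))

# The recombination formula from the video: ~((~starless) * (~stars))
pixelmath_style = 1 - (1 - starless) * (1 - stars)

# The classic "screen" blend: a + b - a*b
screen_blend = starless + stars - starless * stars

# The two expressions are algebraically identical
assert np.allclose(pixelmath_style, screen_blend)
print("screen blend and ~((~starless)*(~stars)) agree")
```

A nice property of the screen blend is that it never exceeds 1 for inputs in [0, 1], so the stars are added back without clipping or darkening the starless background.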
Amazing video, Sascha. Easy as 1-2-3, as you say 😂 and very helpful. Thanks for this 🤘🏻😎
Great video! One of the best workflow discussions I’ve seen.👏🍻
Thanks Sascha
Excellent!
Very nice. I have noticed color mottling at times when doing NoiseX in the linear phase. Never an issue using NoiseX after the stretch.
That is not color mottling, that is just a display error. Go to Image -> Screen Transfer Functions -> Use 24-bit STF LUTs and this issue is solved. This option does not stay selected after closing PixInsight, so it has to be enabled every time you have this issue.
@@viewintospace To make it stick, see Global Preferences > Miscellaneous Image Window Settings
As ever, an excellent video. Only one thing has puzzled me as a relatively new PixInsight user: once you have cropped the image at stage 2, when you come to launch SPCC it will not work without first running ImageSolver. I often wonder whether, by doing so, the later runs of BlurXT and NoiseXT are compromised in performance. Have you a view?
Hello!
Nice video, very clear.
Just one thing: I ran some tests and it seems I get better results when using StarXTerminator before BlurXTerminator.
If I do it the other way around, a lot of weird artefacts appear after denoising. I'm not sure where they come from (it seems I'm the only one having the issue), but when you think about it, why would you denoise the stars? They have more than enough signal-to-noise ratio IMO.
Both tests were run on drizzled linear data, after the following steps: crop/background neutralization, color calibration, BlurX.
Great information in this video, but is it an actual workflow suggestion? 🤔
No, it isn't. But whatever workflow you choose, ensure that these 10 rules are not violated, and if they are, correct this part.
@@viewintospace Thanks for the quick reply 😎
Thank you
Thank you 👍
Wow. Great video. I learned a few things. What stretch should I use on the stars and which one on the main image?
For the main image GHS; for stars, first a slight stretch with ArcSinh (to push the colors), then GHS for the rest. I once did a dedicated star processing tutorial, you might want to have a look: ruclips.net/video/uw_88wD2NWM/видео.html
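For anyone wondering why ArcSinh "pushes the colors": an arcsinh-type stretch computes one stretch factor from a luminance estimate and applies the same factor to R, G and B, so the channel ratios (and hence the star colours) survive the stretch. Below is a minimal Python sketch of the generic formula only; it is not the exact PixInsight ArcsinhStretch implementation, and the helper name is just for illustration:

```python
import numpy as np

def arcsinh_stretch(rgb, k=50.0, eps=1e-6):
    """Generic arcsinh stretch sketch (illustrative helper, not PixInsight code).

    One stretch factor is derived from a luminance estimate and applied
    equally to all three channels, so R:G:B ratios are preserved while
    faint signal is lifted.
    """
    lum = rgb.mean(axis=-1, keepdims=True)               # simple luminance proxy
    stretched_lum = np.arcsinh(k * lum) / np.arcsinh(k)  # y = asinh(k*x) / asinh(k)
    scale = stretched_lum / np.maximum(lum, eps)         # same factor for R, G and B
    return np.clip(rgb * scale, 0.0, 1.0)

# A faint reddish star pixel keeps its colour ratio after the stretch
pixel = np.array([[0.020, 0.010, 0.008]])
print(arcsinh_stretch(pixel))   # roughly [0.20, 0.10, 0.08]
```

A stretch applied to each channel independently compresses the bright channels more than the faint ones, which is why star cores tend to drift towards white without this trick.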
Thanks Sascha. Great video. Is cosmetic correction part of WBPP and where can I find it? And is gradient removal the same as Dynamic Background Extraction?
For Cosmetic Correction, have a look at this video: PIXINSIGHT - AS EASY AS 1-2-3 - Part 1 - Introduction & Stacking with WBPP
ruclips.net/video/9XgklwAICkM/видео.html Gradient removal is the same as DBE.
@@viewintospace thank you
Excellent video. You definitely do see lots of folks not adhering to these, especially the STF usage for stretching. Appreciate it.
Thank you for clarifying many hints you can find on the internet!
There is one step I am missing (or just didn't get it): at which point in your workflow do you do the ChannelCombination?
Good question! I did not want to go into this in the video so as not to make things unnecessarily complex. But as you ask.... 😁 Given that BXT prefers color pictures, the BEST point is right after gradient removal. You would also do BXT on the Lum picture and then do the LUM / Color combination in the non-linear stage. Now in some cases - like the Foraxx script by Paulyman Astro - you need stretched individual channels. Here I would only run BXT on the LUM pic.
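For readers new to the LUM/colour combination step: the basic idea is that the colour ratios come from the RGB image while the brightness comes from the (usually deeper and sharper) luminance. The snippet below is a deliberately simplified, pure-NumPy sketch of that luminance-transfer idea with an illustrative helper name; PixInsight's LRGBCombination does this properly in a CIE colour space with additional options, and both images are assumed to be already stretched and registered:

```python
import numpy as np

def lrgb_combine_sketch(rgb, lum, eps=1e-6):
    """Toy luminance-transfer sketch (not PixInsight's LRGBCombination).

    Keep the per-pixel colour ratios of the RGB image, but rescale each
    pixel so its brightness matches the separate luminance image.
    """
    rgb_lum = rgb.mean(axis=-1, keepdims=True)          # brightness of the colour image
    scale = lum[..., None] / np.maximum(rgb_lum, eps)   # per-pixel rescaling factor
    return np.clip(rgb * scale, 0.0, 1.0)

# Example with already-stretched (non-linear) data in [0, 1]
rng = np.random.default_rng(2)
rgb = rng.random((8, 8, 3))   # colour image
lum = rng.random((8, 8))      # luminance image
print(lrgb_combine_sketch(rgb, lum).shape)   # (8, 8, 3)
```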
Hi Sascha
Could you please put together a small overview of most processes with a brief description alongside?
Maybe also with a note on which parts are linear and which are not, unstretched/stretched, etc.
That would help a beginner like me a lot. Thank you!
On the one hand, you will see in my tutorials how the processes are interconnected. I also have a lot of videos covering individual processes. If you would like workflows laying these out on paper, you will find quite a lot on my Patreon channel.
Dammit, I thought I finally had a good workflow. Some more tweaks required. Not sure about step #1 though - doesn't WBPP do all of that for you? And I will have to take the GHS plunge soon.
Cosmetic Correction is a part of WBPP - you can see here how to enable it: ruclips.net/video/9XgklwAICkM/видео.html
Excellent. Except that I didn't understand how to do the star recombination. Where do you enter this formula, please?
In PixelMath - just label one image "stars" and the other "starless" and enter the formula right in the PixelMath process - issue solved! You might also have a look at my dedicated star processing tutorial: ruclips.net/video/uw_88wD2NWM/видео.html
Okay… a question! I heard somewhere that you should not use NXT until after sharpening, which normally comes after stretching. Someone said that what appears to be noise may actually contribute some useful data when sharpening. What do you think? False? 🤪
By sharpening, BXT was most likely meant, and yes, as stated, NXT should be done after BXT. But for traditional sharpening like UnsharpMask, you first remove the noise, as otherwise you enhance/sharpen the noise.
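To see why denoising has to come before a classic unsharp mask: the sharpened image is the original plus an amplified copy of its high-frequency residual (original minus blurred), and pixel-to-pixel noise lives exactly in that residual. A small generic sketch (using SciPy for the Gaussian blur; this is the textbook unsharp mask, not the PixInsight UnsharpMask process itself):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(img, sigma=2.0, amount=1.5):
    """Classic unsharp mask: add back an amplified high-frequency residual."""
    blurred = gaussian_filter(img, sigma)
    return img + amount * (img - blurred)

# A featureless background with read-noise-like fluctuations
rng = np.random.default_rng(1)
noisy = 0.2 + rng.normal(0.0, 0.01, (64, 64))

print("noise std before:", noisy.std())
print("noise std after :", unsharp_mask(noisy).std())  # noticeably larger
```

Running NXT (or any denoiser) first removes most of that residual noise, so the unsharp mask then amplifies real structure instead.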
Great video Sascha, I am new to Pix and your tutorials are a great help! I have a question: since I don't have StarXTerminator, I am using StarNet for star removal in the linear phase. What is the best/correct way to stretch the stars-only image prior to recombination with the stretched starless image, so that when combined, they look sharp and not bloated? Thank you and clear skies!
First of all, use BXT to get great, round, not bloated stars. And then follow this tutorial: ruclips.net/video/uw_88wD2NWM/видео.html
@@viewintospace Thank you very much, I will check it out.
Hi Sascha. What is, according to this sequence, the best way to stretch the linear "stars" image before star recombination?
ArcSinh -> GHS -> Curves -> Star Reduction (if needed) -> Combine
@@viewintospace thanks. Using ArcSinh is making the stars quite red
@@willemwitteveen8374 Have a look at this tutorial: Pixinsight Star Processing Tutorial - let them shine!
ruclips.net/video/uw_88wD2NWM/видео.html
I agree with 70% of what you laid out here. First, though, I want to say something about relying on the inventor of a tool to be the ultimate authority on the usage of a tool. If I could go back in time and talk to the person who invented the paint brush, I might be able to learn a great deal about the intent of the paint brush and how the inventor thought it should be used. But I certainly wouldn't feel constrained by any limits to paint-brush usage the inventor might want to impose. Once I have a canvas and paint, it's no longer the paint-brush inventor's business what I do with the paint brush. Having said that, this is a bit more technical, so the inventor/curator's comments carry more weight, but they still are not absolutely definitive.
In any case, here's where we diverge:
1. SXT usage. This may seem like heresy, but it doesn't actually conflict with any usage constraints I've heard Russ Croman describe.
In my own process, the first thing I do after SPCC (if I use SPCC--see below) is create a clone of the image that I will use to create my stars-only image. On the stars-only image, I will apply BXT with ZERO non-stellar deconvolution and very little star and halo reduction. Then comes a pass of NXT with very little, if any, sharpening. Then a hand stretch just using HT. The penultimate step is to then run SXT in the stretched phase with screening turned on to remove everything except the stars. Finally, maybe a little saturation to taste. (Note that this also mostly applies to the RGB stars image when working on a narrowband image, except there's no reason to clone in that case.)
For the main image, the first thing I do after SPCC is create a good-sized preview encompassing the center of the image. I then run the PSFImage script to determine the point-spread function for the image (averaging the high and low numbers). This is straight from Russ Croman. Then I run SXT to remove the stars from the image. I don't need to create a stars-only image from this because I've already finished with my stars, so I have the star-creation parameter turned off. At this point, I run BXT and turn the auto-PSF function off and enter the PSF directly that I determined in previous steps. I also set the star and halo reduction parameters to 0 because they are not needed. I determine the amount of sharpening by running it against previews--generally setting it lower than the default of 0.90 so as to avoid creating worms or other features that aren't actually in the data.
2. NXT usage. Where we differ here is that I run NXT both before and after stretch. Don't quote me on this, but I believe Russ Croman actually recommends this. I know for sure that Ron Brecher does. So right after BXT, I run NXT--also fine-tuning the amount of sharpening to avoid manufacturing phantom detail. Now comes stretch. I'm not a big fan of GHS, so I stretch this the same way as I do the stars: by hand with HT. If the image is narrowband, it may be time for an application of SCNR (see #3). Then it's on to the second application of NXT. After applying NXT a second time, I can run HT again and reset the black and midpoints of the stretch because the second pass of NXT usually has a dramatic effect on where those should be.
3. SCNR usage. I don't know about anyone else, but when I stretch a narrowband image, it's usually awash with green Ha signal. Part of that is because I can't use SPCC on a narrowband image. It's sometimes possible to correct for this in the linear phase by using LinearFit, but that doesn't always produce reasonable results. So some amount of SCNR is required to create a good balance between the Ha and the other signals.
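For anyone wondering what the "average neutral" protection in SCNR actually does: as I understand it, green is limited towards the average of red and blue, so colour-balanced pixels are left alone while a green cast is pulled back. A rough sketch of that idea (illustrative helper only, not SCNR's exact implementation):

```python
import numpy as np

def scnr_average_neutral_sketch(rgb, amount=1.0):
    """Rough sketch of the 'average neutral' idea behind SCNR-style green removal.

    Green is limited towards the mean of red and blue; neutral pixels
    are untouched, green-dominated pixels are pulled back.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    g_limited = np.minimum(g, 0.5 * (r + b))
    out = rgb.copy()
    out[..., 1] = (1.0 - amount) * g + amount * g_limited
    return out

# A green-cast pixel gets pulled back; a neutral grey pixel is untouched
pixels = np.array([[[0.20, 0.60, 0.25],
                    [0.40, 0.40, 0.40]]])
print(scnr_average_neutral_sketch(pixels))
```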
Other than those things, I totally agree with you and appreciate your putting this out and answering some of the more disputed questions. What I describe above works for me. Your mileage may vary. :)
Thanks for the extensive comment - very much appreciated!
I think all that you are saying makes perfect sense. But it is obviously quite an advanced level that you refer to here, and I tried to keep it basic - so I aimed for standard situations.
One recommendation: even if you are "not a fan" of GHS - watch the recent tutorial by Adam Block and give it another try - it is by far the better stretching tool once you get used to it!!!!
@@viewintospace maybe I’m just not a fan of my poor usage of GHS. I love Adam’s work. Surely one of the best astrophotographers on the planet. But he puts me to sleep. I’ll try to watch his GHS video again.
@@elithic The GHS tutorial by Paulyman Astro is a good alternative to Adam's - only about half the length and also brilliantly made!
The key to Adam's videos is to set the playback speed to 1.5x. He is so articulate and well-spoken that it sounds normal then 😉
I assume if you are processing monochrome images, combination is after gradient removal and before SPCC (while still in the linear stage)?
Exactly - obviously, if you do SHO there is no SPCC, but in any case the combination comes before BXT. And the Lum would be treated separately with BXT, and the LUM combination with the color image then happens in the non-linear phase.
Can’t you use SPCC also for narrowband images? The UI contains a field for the wavelength of each filter.
Not sure.
@@videozeugs You can do SPCC for narrowband - but only if you stick with HOS or HOO; with SHO or other false colors it will not work. It is stated like this in the documentation.
When must I use SPCC with monochrome data?
Do the R-G-B combination at the very beginning of the workflow, and then you can do SPCC.
Beautiful! Thank you, you deserve a little time out for a delicious Rösti....
There is only one golden rule: quality of pictures is defined by quality of data.
Once upon a time..... it was like that 😁 Today, I agree the quality of the data is still important, but good processing is just as important. If we have the same quality of data - good or bad - and the data is processed once optimally, and once only with the basic steps or even with some applied in the wrong way - can we agree that the picture which was processed optimally will look better?
@@viewintospace
You are right; nowadays, totally crap data can be made presentable.
Nevertheless, I see quality data right away and recognize cheesy pictures.
Bottom line: your skill is good enough to get pictures as good as they can be, if there is quality data.
Are we on the way to getting a TEC140 or TOA130? 😀