Very good job, great. There is a good alternative from Adam Block: Identifying Asteroids in PixInsight and the Minor Planet Center, where you can easily load some data files into PI and then annotate your image directly, seeing (maybe) thousands of these small objects with their names. It is a much shorter way. Best, Thomas
You are absolutely correct. Within the last year or so there have been alternative approaches that have surfaced as this has become more popular.
I've been taking pretty pictures of the night sky for quite some time, and now I think it is time to try a little more challenging assignment such as hunting for NEOs. Thanks for the video. Very nice!
Any luck with this yet?
Excellent tutorial. Thank you.
Very interesting tutorial. Thank you so much.
I think the identification of the small asteroid could also be done with the plate solve script in PixInsight and then with the annotate script.
Should work too, I guess.
CS Christian
You are right. There are a couple of different ways to do this, some of which are actually quite new and came after this was originally worked out. I love hearing from others though. If anyone can ever show me an easier way to do something... I'll always listen!!! :)
Impressive work... Maybe a quicker way to identify your second and unexpected target: SAOImageDS9. Once you have your FITS image WCS-calibrated with Astrometry.net, open it in SAOImageDS9; the menu Analysis/Catalogs/Database/SkyBot will (hopefully...) display the locations of the minor bodies in the field for the given DATE-OBS.
Very cool. I’ll try to give this a shot next time!
Nice video, I learned a couple of things from this. One somewhat silly question remains open for me: how did you add the pointers and names of the asteroids that you show in the last seconds of the video? It looks like a very manual task.
Ya... post-production video editing. There are a couple of tools that can do this in your images via plate solving, but I needed something more dynamic for the video.
The question I have, Chad: is it possible, without having a permanent observatory number, to still submit my findings for any new asteroids to the Minor Planet Center?
It seems that you do need an observatory code, but they do make accommodations and will assign one upon a valid submission. That said, the requirements for a valid submission seem pretty extensive and include a near-Earth object along with six planetary bodies in a two-night period.
12:19 The animation reduces noise via the averaging that human vision performs: a single frame of a movie is pretty grainy, but it goes unnoticed in the cinema.
Fascinating stuff, and I am wondering what to do with my finished 20" telescope. Is this imaging with your small refractor?
Sorry, I've since seen that the gear used is an 8" SCT.
I've done this with my smaller refractor as well, but you need enough movement in the FOV for it to be noticeable, so a wider field often needs more frames over more time. It works, though.
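The noise-averaging effect the commenter describes can be illustrated numerically: averaging N independent noisy frames reduces the noise standard deviation by roughly the square root of N, which is part of why an animation (or a stack) looks cleaner than any single frame. A minimal pure-Python sketch with simulated pixel values (the numbers here are made up for illustration, not real image data):

```python
import random
import statistics

def simulate_pixel(true_value, noise_sigma, n_frames, trials=2000):
    """Estimate the noise (std dev) left after averaging n_frames
    exposures of a single pixel with Gaussian read/shot noise."""
    rng = random.Random(42)  # fixed seed so the result is repeatable
    averages = []
    for _ in range(trials):
        frames = [true_value + rng.gauss(0, noise_sigma) for _ in range(n_frames)]
        averages.append(sum(frames) / n_frames)
    return statistics.stdev(averages)

single = simulate_pixel(100.0, 10.0, 1)
stacked = simulate_pixel(100.0, 10.0, 16)
print(f"1 frame:   noise ~ {single:.1f}")   # close to 10
print(f"16 frames: noise ~ {stacked:.1f}")  # close to 10/sqrt(16) = 2.5
```

The same square-root law is why a wider field, where the asteroid crosses fewer pixels per hour, can afford the extra frames it needs: each additional frame still buys a little more smoothness.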
Hello, sorry if I ask you a lot of questions, but I'm curious. Congratulations on the video, first of all. My question is this: if you or someone else hadn't found anything about the small asteroid on JPL, is there a procedure to report it to the Minor Planet Center for checking? Thank you so much for everything 😊
If you are asking about reporting, there are multiple ways to do this. One example is following the guidelines online at iau dot org.
Could you do a video about your 40 core Xeon computer that you use for PixInsight processing? I am very interested in optimizing a new computer for PixInsight. And could you talk about the benefits of a Xeon vs a HEDT vs main stream systems for PI? Thx.
If I were to build a new system, I likely wouldn't go Xeon. This was a used, decommissioned server from a data center; it was just something I could get my hands on inexpensively. It's a little bulky, but running 40 threads is really nice. Getting Windows 10 to run on it was a bit tricky, but once I got all the drivers in place it stayed happy. :-)

What I will say is that, with PixInsight as an example, some processes take advantage of the threads and absolutely fly. Others don't, and then they crawl in comparison to even a single quad-core i7. Dropping a video card in it and taking advantage of CUDA and TensorFlow is one of the bigger benefits, but that's unrelated to the main processing. I would say, though, that if I only had the ability to run 20 threads, I'd be better off with an i7 (or similar) box in the long run.

But here is the takeaway: if you can get your hands on a decommissioned server on eBay, and you are comfortable loading Windows 10 on a stubborn box which may require drivers to be available at boot, it may be worth a shot, especially for the price.
What image scale range would be recommended for asteroid hunting? 😊🔭
Getting in closer (a finer image scale) is easier because you don't need as many hours of continued exposures to see the movement. You can do it at a shorter focal length, but you will need more hours of data to see it move at that scale. If using a shorter focal length, don't feel like you need continuous images: you could take an exposure, wait 5 minutes, take another, and so on. You can still animate this but have far fewer images to process for the same effect.
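The trade-off in that answer can be made concrete with a little arithmetic: the time needed to see an asteroid move by a given number of pixels is its angular displacement divided by its apparent rate. A small sketch; the 30"/hour rate is just an illustrative value for a typical main-belt asteroid, and the image scales are hypothetical, not taken from the video:

```python
def hours_to_move(pixels, rate_arcsec_per_hr, scale_arcsec_per_px):
    """Hours of elapsed time for an object moving at the given angular
    rate to shift by `pixels` on a sensor with the given image scale."""
    return pixels * scale_arcsec_per_px / rate_arcsec_per_hr

RATE = 30.0  # assumed apparent motion, arcsec/hour

# Long focal length, fine image scale (0.8"/px): 10 px of motion comes fast.
print(hours_to_move(10, RATE, 0.8))  # ~0.27 h, about 16 minutes

# Short focal length, coarse image scale (3.5"/px): same 10 px takes longer.
print(hours_to_move(10, RATE, 3.5))  # ~1.17 h, about 70 minutes
```

This is why the wider field needs more hours but not more frames: a few well-spaced exposures across that longer span capture the same pixel displacement.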