Huge Telescope. Tiny Sensor. Why?
This video is sponsored by Squarespace. Tonight I'm using my biggest telescope and my smallest sensor. But why? Because I'm going after a deep sky object that is tiny in terms of apparent diameter. Tonight I'm going to attempt to photograph the bright core of the Cat's Eye Nebula in the constellation Draco, and this is by far the smallest deep sky object I've ever attempted. So I'm going to throw out all the usual methods I normally use for imaging and processing, try something completely different and completely new to me, and bring you along for the journey, while also explaining a little about why I'm doing this and what I'm doing. My name is Nico and you are watching Nebula Photos. [music]
Hey, so if you're new to the channel: as the channel name Nebula Photos suggests, what I'm interested in as a photographer are nebulae. These are colorful gas clouds that are, in my opinion, some of the most beautiful natural objects in the universe, and it's amazing to me how accessible they are for backyard photographers. This video is not where I'd start for learning how to capture nebulae yourself; I have much better videos for complete beginners that go start to finish, and I have a playlist for those that I'll link. What I'm going to be doing in this video is more of an advanced technique with advanced gear, called lucky imaging. My reason for documenting it is that there's been some interest in my Patreon community, so much interest that this past month we've been working on a lucky imaging challenge together. That's something we do every month: take on a new imaging challenge over on the Patreon side of the Discord. I think that's a great way to stay sharp as an astrophotographer, and it's something I love about the hobby, constantly challenging myself with new ideas and new techniques.

So what is lucky imaging? Before I explain it, I have to take a few steps back and start with the term angular resolution. Resolution just means the smallest detail you can resolve. For example, if I put this small bottle of Noodler's ink a few hundred feet away and photograph it with a telescope, we can still resolve very small details; you can even see that it says "Made in USA" on the label. If I instead use a wide-angle camera lens, my resolving power goes way down, and now I can't even really make out that there's a bottle of ink on the rock, because the resolution decreased. That's the trade-off for getting a much wider field of view: you won't be able to resolve very small details when you zoom in on the photo.

Angular resolution is just a way of measuring detail by describing the apparent diameter of an object in degrees of arc. If you imagine measuring from horizon to horizon, that would be 180 degrees of arc, so half a sphere. You can break a single degree of arc into 60 parts, which we call arcminutes, and if you break each of those arcminutes into 60 parts, we call those arcseconds. You can then describe the apparent diameter of anything in your field of view using this angular measure.

Now let's get to the telescope. If you've heard the term "x arcseconds per pixel" (one arcsecond per pixel is a common recommendation for deep sky imaging), that's describing the theoretical angular resolution of a telescope plus a camera: with this scope and this camera, what is the smallest detail that can be resolved by a pixel on the camera sensor? So if you say one arcsecond per pixel, you're saying that theoretically you could make out a detail that's only one arcsecond across. Unfortunately, pixel scale is just one part of true resolution, because there are other physical limits on how much detail can actually be resolved. There are two main ones we can briefly discuss. The seeing limit is caused by air turbulence in the Earth's atmosphere. The diffraction limit just comes down to physics and your telescope size; you'll also see it called the Dawes limit or the Rayleigh criterion, and it's entirely determined by the aperture of your telescope, how big it is. So if you want to resolve really small details and you have very steady skies, you'll need a bigger telescope, because otherwise the diffraction patterns of the light sources in your photos are going to blur together and you won't be able to make out the smallest details.
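If you want to put numbers on those limits for your own setup, here's a minimal sketch. The formulas are the standard ones (206.265 times pixel size over focal length for pixel scale, and roughly 116/D and 138/D arcseconds for the Dawes and Rayleigh limits at visible wavelengths), but the values plugged in at the bottom are just placeholders, not any specific rig from this video.

```python
# Quick calculator for pixel scale and diffraction limits (standard formulas;
# the numbers at the bottom are placeholder values, not a specific rig).

def pixel_scale_arcsec(pixel_size_um: float, focal_length_mm: float) -> float:
    """Theoretical image scale in arcseconds per pixel."""
    return 206.265 * pixel_size_um / focal_length_mm

def dawes_limit_arcsec(aperture_mm: float) -> float:
    """Empirical Dawes limit for splitting close double stars."""
    return 116.0 / aperture_mm

def rayleigh_limit_arcsec(aperture_mm: float, wavelength_nm: float = 550.0) -> float:
    """Rayleigh criterion: 1.22 * lambda / D, converted to arcseconds."""
    return 1.22 * (wavelength_nm * 1e-9) / (aperture_mm * 1e-3) * 206265.0

if __name__ == "__main__":
    print(pixel_scale_arcsec(3.76, 530))   # ~1.46 "/px
    print(dawes_limit_arcsec(80))          # ~1.45 "
    print(rayleigh_limit_arcsec(80))       # ~1.73 "
```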
But the limit that affects way more of us, in most places on Earth, is the limit from your sky conditions: your seeing. When you look up at the night sky it looks transparent, like there's nothing between us and the stars, but the Earth's atmosphere is still there. What we see as a blue sky during the day is still up there at night; it's just not illuminated by sunlight, and it can still cause a bunch of issues for us astrophotographers. The two main ones: it can be lit up by artificial light shining up onto it, which is called light pollution, and that gets much worse when there's more water vapor or smoke in the sky; when it gets worse we call it poor transparency. When the sky isn't very transparent, it looks like there's a thick layer of light between us and the night sky.

More important to today's discussion, though, is seeing. This has to do with the different layers of atmosphere between us and space, which are all moving in different directions and at different speeds. That's turbulence, and this turbulence causes a lot of blur in our images. You can see it if you look at the Moon at high power; you can actually watch the turbulence in the air, and on a really steady night you get a much clearer view. What professional astronomers do is build their observatories in places where the seeing is typically excellent. The best places on Earth for this are mountaintops at very high elevation, where you can get above a lot of the poor seeing, and near oceans, where you can have inversion layers and laminar flow, terms that basically mean the air currents are very smooth and lined up.

Most of us don't have the luxury of building our observatories on mountaintops, but the good news is there's a technique called lucky imaging that can help quite a bit with poor seeing. If you're a planetary imager, you've probably already come across it. The idea is that if you shoot high-speed video, like 60 or even 100 frames per second, there will be moments where all of the turbulence in the air lines up to give you a nice clear view of space. If you're taking thousands and thousands of frames, you may get hundreds where you're lucky and at least some part of the planet's or the Moon's surface shows a good amount of detail. When you stack all these good frames together, you end up with a much sharper image of the planet, the Sun, or the Moon than your seeing conditions would normally allow; that's why they say it's like beating the seeing. And the faster you can make the frame rate, the luckier you'll get, because you have more chances to get lucky.

This doesn't translate to deep sky imaging very well, at least for most objects, because most objects are too dim to shoot at a high frame rate, or with a very short exposure. For instance, a short exposure for deep sky might be something like 30 seconds, but that's still way too long to take advantage of lucky imaging, because the air currents just move too fast, so the blur will be baked in at 30 seconds.
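To get a feel for the numbers involved, here's a small back-of-the-envelope sketch: how many frames a session yields at a given exposure length, how many survive a given keep-percentage, and the rough square-root improvement in signal-to-noise you'd expect from stacking them. The session length, per-frame overhead, and keep fraction below are made-up illustration values, not measurements from this project.

```python
import math

# Back-of-the-envelope lucky-imaging frame budget.
# All inputs are illustrative placeholders, not values from this session.
exposure_s = 1.0        # length of each sub-exposure
overhead_s = 0.1        # assumed per-frame download/save overhead
session_hours = 2.0     # time spent capturing
keep_fraction = 0.30    # fraction of sharpest frames kept for stacking

frames_captured = int(session_hours * 3600 / (exposure_s + overhead_s))
frames_stacked = int(frames_captured * keep_fraction)

# For uncorrelated noise, SNR improves roughly with the square root of the
# number of stacked frames, relative to a single sub-exposure.
snr_gain = math.sqrt(frames_stacked)

print(f"captured ~{frames_captured} frames, stacked ~{frames_stacked}, "
      f"~{snr_gain:.0f}x the SNR of a single frame")
```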
So then the question becomes: how short do we have to make the exposures to get the advantage of lucky imaging? That's the million-dollar question. I'm sure to some degree it depends on your local sky conditions and things like that, but I've looked at a lot of people who have been successfully doing some kind of lucky imaging for deep sky, and it seems like one second might be a good target. This also might depend a little on focal ratio, and it definitely depends on how bright the object you're shooting is, so don't take that one second as the be-all and end-all, but that's what I'm going to be using tonight at f/5.6 on the Cat's Eye Nebula. I chose the Cat's Eye Nebula because it has a very cool shape and it's very well positioned in my sky right now. It's in the constellation Draco, which is easy to photograph if you have a good view to the north, which I do from the backyard observatory I'm sitting in right now; the north is nice and clear and I don't have any trees.

That's basically the theory part of lucky imaging, so let's move on to the gear. The kind of telescope you want to use is your biggest, fastest scope. If you have, say, a motorized 16-inch f/4 Dobsonian, that would be perfect for this. For me, the biggest, fastest scope that was practical to use, because I can track with it and everything, is this Askar 185 APO with the f/5.6 reducer. For the camera you basically want a unicorn: a camera with very low noise but also very small pixels. Those two things often don't go together, but this one, the ATR585C from ToupTek, which they sent over for review, seems to fit the bill. It has 2.9-micron pixels, which are pretty small, and the read noise is also very low. From using it I will say the fixed pattern noise is higher than you'd get from larger-sensor cameras, but there are always going to be trade-offs like that, and for the most part it doesn't matter too much for this kind of application.

The other thing that's useful about this camera is that it has a small sensor: about 11 mm across by 6 mm high, at 4K resolution. For context, a full-frame sensor is usually 36 mm across by 24 mm high, and bigger sensors come with bigger images in terms of file size, often 50 to 90 megabytes for a full frame while this is 8 megabytes, and when you're registering and analyzing thousands of images, that's very important. It also may matter when it comes to capture: if it takes time for your computer to download a large image and start the next one, or if it runs out of buffer, that can really drag you down with lucky imaging. You want something that basically works in real time, constantly taking images. If you don't have a small-sensor camera like this, I've heard (though I haven't tested it) that you may be able to use the region of interest (ROI) feature in your capture software, so that it only reads out a small cropped portion of the sensor, basically acting like a smaller sensor. I don't know if there are any downsides to that since I haven't tried it, but if someone else has, let me know in the comments. I'm going to be using this camera, and I think it should work well; I think this is a good sensor size for this application.
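For a rough sense of scale with this combination, here are the same formulas from earlier applied to this rig. The aperture, focal length, pixel size, and sensor dimensions below are assumptions based on the published specs as I understand them, so double-check them against your own gear.

```python
# Rough numbers for this rig. Aperture, focal length, pixel size, and sensor
# dimensions are assumptions based on published specs; verify for yourself.
aperture_mm = 185.0
focal_length_mm = 185.0 * 5.6           # ~1036 mm with the f/5.6 reducer
pixel_um = 2.9
sensor_px = (3856, 2180)                # approx. 4K-class sensor, ~11.2 x 6.3 mm

scale = 206.265 * pixel_um / focal_length_mm           # ~0.58 "/px
dawes = 116.0 / aperture_mm                            # ~0.63 "
fov_arcmin = tuple(n * scale / 60 for n in sensor_px)  # ~37' x 21'

print(f"pixel scale ~{scale:.2f} arcsec/px, Dawes limit ~{dawes:.2f} arcsec, "
      f"field of view ~{fov_arcmin[0]:.0f}' x {fov_arcmin[1]:.0f}'")
```

With those assumed numbers, the pixel scale is actually a bit finer than the diffraction limit and far finer than typical seeing of a few arcseconds, which is exactly the situation where lucky imaging can pay off.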
Okay, next up: the actual capture process. The way I did this is, after downloading the ToupTek driver, I set it up in Sequence Generator Pro, where I already have plate solving and all of that good stuff configured, so I could easily focus and center on the object using Sequence Generator Pro, which I'm already familiar with. For this step you could use any control program you like that's connected to your mount and camera: EKOS, NINA, APT, any of them. Then, with the object centered and the mount tracking it and keeping it centered in frame, I switch to SharpCap, a program that has some nice advantages for lucky imaging. The main one is that it can capture basically in real time and store all of the frames it's capturing into a single SER (S-E-R) video file, so you end up with one efficient file containing thousands of raw images, already optimized for this kind of work. It just makes things a little easier, and I found I was able to be pretty efficient that way: I gathered a bit over 5,000 frames at 1 second each in a couple of hours, and the resulting SER video file came in at about 80 GB.

That's what I'm going to try processing with Siril, and I'm using Siril for a couple of reasons. First, it's very fast at registering and stacking, which makes experimenting less painful. Second, unlike PixInsight and some other programs, it can work with these SER sequences very efficiently, so you don't have to use other programs to convert anything. Now, I will admit there are other, more complicated methods for this kind of lucky imaging processing that may result in better final images. I haven't had time to investigate them all, but two popular programs for lucky imaging stacking in general that also seem to support deep sky are AutoStakkert, which I have used before, and AstroSurface, which I have not, but I hope to sometime in the future. Anyway, I'll be using Siril; it's cross-platform, donation-supported, and free to download.

Before I show the processing, let me take a quick moment to share a bit about today's sponsor, which is Squarespace. Squarespace makes it easy to make a website with their guided design system called Squarespace Blueprint. You can choose from many professional templates and then style them however you'd like with the easy-to-use interface I'm showing here. There are a bunch of other features built in; for example, if you want to sell products or prints of your work, you can set up an online store, and Squarespace now has flexible payments including PayPal, Apple Pay, credit cards, Afterpay, and Clearpay, so your customers can pay however works best for them. So whether you need something simple, just an online portfolio, or something a lot more complex for your website, Squarespace has you covered. Right now you can get a free trial by heading to squarespace.com/nebulaphotos, and when you're ready to make a purchase of hosting or a domain, you can get 10% off with code nebulaphotos.
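Back to the data for a moment: that roughly 80 GB figure is easy to sanity-check, because a SER file is essentially the raw frames laid end to end, so its size is close to frame count times width times height times bytes per pixel. The sensor dimensions here are the same assumptions as before, and 16-bit raw Bayer frames are assumed.

```python
# Rough SER file size estimate: frames laid end to end, ignoring the small header.
# Sensor dimensions and 16-bit raw Bayer output are assumptions, as above.
width, height = 3856, 2180
bytes_per_pixel = 2          # 16-bit raw Bayer data
frames = 5000

size_gb = width * height * bytes_per_pixel * frames / 1e9
print(f"~{size_gb:.0f} GB")  # ~84 GB, in line with the ~80 GB file from this session
```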
Okay, here we are on the computer. As you can see, I'm using an external SSD over USB-C with plenty of free space, over 2 terabytes, because we're going to need a lot of free space for this process. All I have right now is a folder containing the SER file we captured with SharpCap; it's about 80 GB. The first thing we're going to do here in Siril is set the working directory: just click the home button and set it to where the SER file lives on the external SSD. Of course you could do this on an internal SSD as well, you'd just need enough space, which I don't have.

Now we're going to work through these tabs along the top, starting with Conversion. We're not going to use Calibration for this, because even though I took calibration frames for 1 second, I'm not sure they'd work too well. We're also not going to use the Plot feature, but we'll use all the other tabs. In the Conversion tab, go down below the empty source area to the little plus icon (if you hover over it, it says "add files to convert") and choose your SER file in the lucky folder. I'll click Add, and then we can give this sequence a name; I'm just going to call it NGC 6543, which is the catalog designation for the Cat's Eye Nebula. Then I'm going to change this setting right here from FITS images to SER sequence. I have tried using FITS images, and the limitation is that you're limited to stacking only about 2,048 frames, whereas with a SER sequence you can stack as many as you want, as many as your computer can handle, so I'm going to leave it on SER sequence.

Then, and this is very important, we need to click the Debayer toggle right here, because these are color images from the 585C. If you were shooting mono you of course wouldn't check that, but if you're shooting color, make sure to click Debayer, and then check your debayer settings up in the hamburger menu under Preferences. They should be set so Siril automatically finds the pattern in the metadata, but check them, and if you have any problems you can always uncheck that option and choose the Bayer pattern manually. I'm going to leave it on automatic because I think that's going to work just fine, but I know that if I did run into any problems, and by problems I mean it throws an error, can't figure out the pattern, or needs some kind of manual intervention, I could come back here and choose the debayer pattern by hand.

All right, with those settings in place, I'll click Convert and it will start going. The speed of any of these steps will depend on your computer; I'm using an AMD Ryzen 5950X processor, which is the most important thing here in terms of how fast it can chunk through this data and spit it out. You can sort of see the speed at which it's going, and we have over 5,000 frames, so I'm going to speed this part of the video up.
Okay, that's done in terms of creating the sequence. Now we can click on the Sequence tab up here, and it loads the first image in the preview over here. I'm going to click on the visualization modes; right now it's on Linear, which is why we're not seeing anything except maybe a couple of little star cores. If we try AutoStretch, I think that's going to look pretty ugly. It actually doesn't look too bad, but you can see that the Cat's Eye Nebula core, which is what we're interested in, is completely blown out, so that's not as useful. I'm going to click on Asinh instead, and this is looking a lot better; let me zoom in here with Ctrl+scroll.

Okay, that's an example of a single one-second picture, and it doesn't look too good. We can sort of see a very blurry star in the center of the planetary nebula, and maybe a tiny bit of detail, but it looks pretty blurry. What we can do next is open the frame list over here in the Sequence tab, which lets us quickly look at any frame in the sequence. I can open a new frame at random just by clicking on it in the frame list, and I can see already, just clicking on a random one, that it looks a little better than the first. This one now looks quite a bit blurrier, I don't like that one as much, and that one looks terrible: you can see it's really blurred, the star looks wonky over here, and we don't really see any detail on the planetary nebula, so that's obviously a bad one. Okay, this one looks quite a bit better; we're seeing more detail and the shape is better. And this one is better still: it looks a lot sharper, we're not seeing as much blur, and the white dwarf in the center of the planetary nebula is nicely defined. Once you've found a frame that you think looks pretty sharp, you can set it as the reference image up here, which is helpful for registration.

With that set, we can go to the Registration tab, and I'm going to use Global star alignment. I've tried some of the different methods, and I think global star alignment is going to work well here based on the number of stars in the picture; there are plenty of stars to work with. We're going to register all images from the sequence and leave the other settings alone. We're not going to drizzle, because we didn't dither, so leave that off, and then click Go register. You can see it shows you the stars it's using for registration, and there it goes. Again, because my computer is pretty fast this process will go by fairly quickly, but I'll still speed up the video because it takes several minutes.

You can see that after finishing registration it picked a different reference image, based on which image was sharpest. That's fine; it would have been rare for me to have picked the sharpest image just by clicking around, and it can use that one in stacking. Now I'm going to change the stacking method to average stacking with rejection and leave the other settings alone, except for the image filtering criterion, which I'm going to change to weighted FWHM. If you hover over this, it will describe each option, and I want weighted FWHM because it's going to give us the sharpest kind of result, which is what I'm really going for here. In addition to choosing weighted FWHM, I'm going to change the percentage of frames used, and for this I'm going to run an experiment where I stack several times with different percentages of frames. I'm going to start with 15%, which you can see here is only 761 images of the 5,000 total. After this is done I'll up that to 30%, then 50%, then 70%, and then we can do a quick comparison at the end to see if we can spot any difference between these different amounts of images stacked.
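To make the idea of "keep the best X%" concrete, here's a toy sketch of best-frame selection in Python. This is not what Siril does internally (Siril measures star FWHM across each frame and weights frames accordingly); it just ranks frames by a generic sharpness proxy, the variance of the Laplacian, and keeps the top fraction. NumPy and OpenCV are the only dependencies, and the synthetic frames stand in for real data.

```python
import numpy as np
import cv2

def sharpness(frame: np.ndarray) -> float:
    """Generic sharpness proxy: variance of the Laplacian (higher = sharper).
    A stand-in for Siril's star-based weighted-FWHM measurement."""
    return cv2.Laplacian(frame.astype(np.float32), cv2.CV_32F).var()

def select_best(frames: list[np.ndarray], keep_fraction: float) -> list[np.ndarray]:
    """Rank frames by sharpness and keep the top fraction, mirroring the
    'best X% of images' filtering used in the video."""
    ranked = sorted(frames, key=sharpness, reverse=True)
    n_keep = max(1, int(len(ranked) * keep_fraction))
    return ranked[:n_keep]

# Synthetic stand-in data: random noise frames instead of the real SER sequence.
rng = np.random.default_rng(0)
frames = [rng.normal(100, 10, (128, 128)) for _ in range(50)]
best = select_best(frames, keep_fraction=0.30)
stacked = np.mean(best, axis=0)   # simple average stack of the selected frames
print(len(best), stacked.shape)
```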
Okay, I'll speed up this whole process of generating the images and doing a basic process on them so we can get to that comparison. For those interested, I kept the processing very simple: on each of the image stacks I ran the same five steps in Siril with identical settings, to keep this comparison as fair as possible.

To give you an idea of the full 4K frame, this is shown at 1:1 pixels, because it's a 4K video and a 4K image. So this is the processed 4K image, and you can see that even with one-second exposures, plenty of stars come out in a stack, and the core is nicely defined and bright; there's no real issue with exposure. I think I could go down to half a second, which I'll talk about in a moment.

Now let's go to the comparison. Here, at that same 100% zoom level, are the different stacks: 15%, 30%, 50%, and 70%. At 100% zoom I cannot see any difference between them, so that tells you something; maybe this object is too small to really show much difference. At 500% zoom I start to see a little difference on my nice monitor without any compression: we're losing some of the finest details and contrast in the 50% and 70% stacks, and they're preserved better in the 15% and 30% stacks. I'm sure a lot of people watching this on YouTube, with video compression and everything, won't be able to see any difference; it's quite possible that even looking at it on my monitor you'd ask what difference I'm even seeing. So I'll say this experiment wasn't fully successful, in the sense that keeping only the best frames didn't make a huge difference for the lucky imaging, but I have a few ideas of what to try next in terms of further experiments, so let me show you those.

One thing I'm very interested in is shorter frames. Now that I know one second works, it would be cool to take a lot more at half a second, because at one second there was plenty of light to work with, so I think I could go down to half a second and maybe get an even better result in terms of the lucky imaging. Another thing I'm thinking about is taking the best frames from several nights, increasing the pool of data: maybe trying to get 20,000 frames rather than 5,000, which I could perhaps do over two nights with half-second frames. Another thing I want to try is comparing Siril, AutoStakkert, and AstroSurface, and maybe others if people have suggestions, for this kind of stacking and quality estimation. And I'm open to other suggestions for further exploration of this topic, so if there's anything I'm not thinking of, or you've tried deep sky lucky imaging and done something different than I have here, I'd love to know what else I should be considering.

I just realized there's one other next step I forgot to put on my slide. This is what the Cat's Eye Nebula looks like with just lucky imaging, but there's actually an extended outer shell that you can capture with traditional imaging. So I did do a little experiment with that: this is with 5-minute exposures and a dual narrowband filter, and then I combined that with my lucky imaging result to resolve the core within it. This could be much better, because it's all from a single night, just about two hours of data on the core and two hours of data with the dual narrowband filter at five-minute subs. So this is something I want to continue developing, figuring out the best way to combine the short exposures with the long exposures, and maybe I'll do a video on that if I figure out something good.
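For anyone who wants to experiment with that combination themselves, here's one minimal way it could be approached: take the long-exposure narrowband image, mask out its saturated core, and feather in the scaled lucky-imaging stack. This is just a sketch of one possible approach under stated assumptions, not the method used for the image shown; the file names are hypothetical, the threshold is something to tune by eye, and it assumes both images are already registered to the same pixel grid.

```python
import numpy as np
from astropy.io import fits
from scipy.ndimage import gaussian_filter

# Hypothetical, pre-registered inputs on the same pixel grid.
shell = fits.getdata("longexp_shell.fits").astype(np.float32)  # 5-min narrowband stack
core  = fits.getdata("lucky_core.fits").astype(np.float32)     # 1-s lucky-imaging stack

# Mask the region where the long-exposure core is blown out (threshold is a guess
# to tune by eye), then feather the edge so the seam isn't visible.
saturated = shell > 0.9 * shell.max()
mask = gaussian_filter(saturated.astype(np.float32), sigma=8)
mask = np.clip(mask / mask.max(), 0.0, 1.0)

# Crude brightness matching: scale the core so its unsaturated surroundings
# roughly match the shell image within the feathered transition zone.
zone = (mask > 0.05) & (mask < 0.5)
scale = np.median(shell[zone]) / np.median(core[zone])

blended = (1 - mask) * shell + mask * (core * scale)
fits.writeto("combined.fits", blended, overwrite=True)
```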
You are now seeing the names of everyone who supports this channel through my Patreon campaign. The Nebula Photos Patreon is the primary source of income for this channel, and I now do this full time, so I can't thank my generous Patreon members enough for the support. If you're interested in joining, it starts at just $1 a month, and every tier gets access to my Patreon Discord channels, which include the monthly imaging challenge; we also meet monthly over Zoom. There are higher tiers with other perks, like ad-free videos starting at $7 a month, and of course you also get direct messaging support with me. So whether you're just starting out and looking for advice on what gear to get, or you have years of experience, I think there's a lot that I offer through Patreon that will make it worth it to you. When you're ready to join, head to patreon.com/nebulaphotos.