Showing posts with label Image Processing. Show all posts

Wednesday, July 3, 2024

Matching the 'Look' Using CDFs and Final Tweaks...

As an exercise, I was interested to see how closely an example Seestar S50 observation image could be processed to take on the 'look' of a target image.

As before, a target image was loaded and its histogram applied to the Seestar S50 observation image data via a CDF (Cumulative Distribution Function), as shown below.

Histogram Matching Using CDFs
The target image 'look' is shown below in more detail...
Target Image for Matching to a 'Look'
It can be seen that the resulting matched image has the general look of the target image - but appears significantly de-saturated. To see if it is just a matter of de-saturation, the resulting matched image was imported into GIMP and adjustments made to both saturation and colour temperature.

Before adjustment in GIMP...
Resulting Matched Image Before Adjustment in GIMP
... and after adjustment in GIMP...
Resulting Matched Image After adjustment in GIMP
Although the resulting matched observation image has a similar 'look' to the target image - it required manual adjustment in GIMP. Experiments will be done to determine whether examining other statistics will allow that adjustment to be automated.

Friday, June 28, 2024

Using CDFs to Histogram Match to an Existing 'Look'...

First Attempt

In the previous post - which described automatic stretching via matching (using Cumulative Distribution Functions - CDFs) to mathematically generated red, green and blue channel histograms - mention was made of perhaps using existing good examples of the same object to provide a set of red, green and blue histograms.

I am interested in how far the post-processing of Seestar S50 images can be automated. Accordingly, some code was written to implement the histogram matching function using good example images, as shown below.
Histogram Matching to an Existing Good Example Image - Before Matching
PLEASE NOTE: the above application is just for experimental purposes and is not suitable for other use. Do not ask for a copy - unless you like being offended by a non-response :-)

In this application the FITS observation file that is to be 'stretched' by histogram matching is loaded into the large image box. In the above picture the original linear image is shown. Then an example of an existing good stretched image of the same subject is loaded (the small image box top-left). Typically these might be images from the 'net from professional or amateur sources.

The application then calculates histograms of both the observation image file and the example stretched image file. After calculating CDFs from each histogram, the original linear image is re-mapped such that its histogram matches the good example image histogram.

The result of that process is shown below.
Histogram Matching to an Existing Good Example Image - After Matching

If we compare this result with the result from the previous post - where the matching was done to a mathematically generated histogram (via equations) as shown below (ignore the rotation) - we can see that the above result has taken on the 'look' of the example image and is a better result as a consequence.
Comparison Result from Using a Mathematically Generated Histogram for Matching
If we choose a different example image which has a different 'look' and repeat the process we get the result as shown below.
Histogram Matching to a Different Existing Example Image (more blue)
Note the difference between the two example images in both cases (the small images top-left in the application). The second example has more of a blue-ish tinge than the first - so the result has the same elevated blue-ish tinge.

A further example uses an example image which has a completely different 'look'...
Histogram Matching to Radically Different 'Look' Example Image

In this example image there is more red and green and so the result of histogram matching takes on that 'look'. Another feature to note is that, for this M42 image, the detail around the Trapezium in the result image matches the detail in the example image. Where the Trapezium detail is prominent in the example image, likewise it is prominent in the result. So - not only is the colouring of the example image adopted, but also the stretch curve shape.
I am pretty pleased with this result - which, once again, is better than I expected. Some notes...

  • In the above results absolutely no manual tweaking is done. Just load in the observation file and the example image file and hit 'GO'. So - the above results - as far as processing is concerned - are obtained 100 % automatically.
  • No spatial information is transferred from the example image (i.e., no matching of individual pixels is done). The spatial information is lost in the histogram calculation. It is simply the statistics (histogram and CDF) which are matched.
  • The example images - used to match the observation image histogram to - are typically generated by a different camera (sensor), different post-processing applications, etc, and so the matching cannot be exact.
  • Likewise - the exposure times for the example images are likely to be much longer (or the result of a larger aperture lens) and so the signal-to-noise ratios for the Seestar S50 images to be matched are likely to be much lower. Some compensation for this effect might be possible.
Further experimentation will be conducted to explore how far this technique can be extended.

Friday, June 21, 2024

Histogram Matching Using CDFs...

NOTE: the following histogram matching method is almost certainly not novel in the field of astrophotography post-processing given the amount of development effort put into processing applications. It is already used - for example, in medical imaging where images taken at different times and exposure conditions (leading to differences in contrast, brightness, etc) need to be compared to track changes.

It occurred to me (almost certainly not the first) that a similar histogram matching approach as used in medical imaging might be useful in order to circumvent the laborious manipulation normally associated with 'stretching' an astronomical image. Accordingly, I wrote some code to implement a histogram matching function.

Histogram Matching Application
PLEASE NOTE: the above application is just for experimental purposes and is not suitable for other use. Do not ask for a copy - unless you like being offended by a non-response :-)

The application allowed the generation of target histograms with various shapes in terms of rise-times and delays. Two examples are given below...



Using Cumulative Distribution Functions to Match Histograms

The process is as follows...
  1. Calculate histogram of original linear image. The data is processed in 16-bit unsigned values - so there are 65536 values in the histogram.
  2. Calculate CDF of the linear histogram - also with 65536 values.
  3. Calculate CDF of the generated target histogram (as displayed as examples above).
  4. Create a re-mapping table by stepping through the linear image CDF values (for levels 0 - 65535) and reading out the CDF value. Then scan through the CDF of the generated target and find the nearest value to the linear CDF value. The index (0 - 65535) becomes the remapped value.
  5. For every pixel in the linear image data, look up the entry in the re-mapping table which corresponds to its value and place in a matched image data set.
  6. Display the matched image data.
No optimisation for speed, nor determination of the most appropriate generated target histogram, has been done - just 'in principle' experiments.
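The six steps above can be sketched as follows. The author's application is not public (and is written in C#), so this is an illustrative Python re-implementation; the function names and the single forward scan over the target CDF are my own choices, not the original code.

```python
# Sketch of CDF-based histogram matching for 16-bit unsigned data.
LEVELS = 65536  # step 1: 65536 possible values

def histogram(pixels, levels=LEVELS):
    hist = [0] * levels
    for v in pixels:
        hist[v] += 1
    return hist

def cdf(hist):
    # steps 2 and 3: running sum of normalised counts
    total = sum(hist)
    running, out = 0, []
    for count in hist:
        running += count
        out.append(running / total)
    return out

def remap_table(source_cdf, target_cdf):
    # step 4: for each source level, find the target level whose CDF value
    # is nearest; both CDFs are monotonic, so one forward scan suffices.
    table, j = [], 0
    for value in source_cdf:
        while j + 1 < len(target_cdf) and target_cdf[j] < value:
            j += 1
        # step back if the previous level was actually closer
        if j > 0 and value - target_cdf[j - 1] < target_cdf[j] - value:
            table.append(j - 1)
        else:
            table.append(j)
    return table

def match(pixels, target_hist):
    # steps 5: look up every pixel in the re-mapping table
    table = remap_table(cdf(histogram(pixels)), cdf(target_hist))
    return [table[v] for v in pixels]
```

As in the post, no attempt at speed optimisation is made here; a real implementation would vectorise the per-pixel lookup.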

The result of matching the original linear image data (M42: 2.5 minutes integration, Seestar S50) histogram to the left-most example given above is shown below.
Result of Histogram Matching Using CDF of Generated Target Histogram
Comparing the generated target histogram against the histogram matched linear image shows a close match.

Target Histogram
Linear Data Histogram After Matching

Of note in the 'histogram matched' image is that the low-level nebulosity is visible at the same time as the high level detail (around the Trapezium) is preserved. This is a surprisingly good result. On the downside is the washed-out look in terms of colour. Just why is unknown at the time of writing.

Certainly this is a vast improvement over the previous first attempts at re-mapping using the CDF directly.

Some manipulation in GIMP results in the following image...
Result of Histogram Matched Image Processed by GIMP
I am pretty pleased with this result.

It may be possible to avoid trying to find the best generated target histogram by analysing good example images of targets and calculating the target histogram directly from those good example images. Perhaps a library of target histograms could be built making post-processing a simple exercise of auto stretching via CDFs with final tweaks in external programs such as GIMP as in the above example.

Interesting...

Wednesday, June 19, 2024

Histogram Matching...

As part of my experiments in astrophotography post-processing I noticed that - after applying a suitable non-linear stretch to the linear image data - the histograms of the stretched image had very similar shapes. As an example the histograms for an image of M42 are shown below.

M42 : Processed (Stretched) Image
The histograms for the luminance and the red, green and blue channels have been separately normalised to better show the similarity in shape. The relative amplitudes of the red, green and blue channels are shown below in the composite histogram plot.

Luminance

Red Channel

Green Channel

Blue Channel

Red, Green and Blue Channels Composite Plot


The original linear image is shown below, where the pixel values are cramped down at the bottom end, as shown in its luminance histogram.
M42: Original Unprocessed Linear Image
Original Linear Image Luminance Histogram
Implementing some way of modifying the data in the original linear image such that its histogram matches the shape and position of a stretched image histogram would be interesting. One way to do that is via "Histogram Matching" - the implementation of which will be described in the next post.

Sunday, June 2, 2024

One-Shot-Colour (OSC) and Debayering...

The sensor in the Seestar S50 is a 'one-shot-colour' (OSC) sensor. Unlike a monochrome sensor - where each pixel in the sensor is a light bucket over a wide range of colours in the spectrum - the OSC sensor has colour filters (red, green and blue) placed over a group of 4 pixels in a 2x2 Bayer matrix. The pattern of filters can vary from sensor to sensor - but in the Seestar S50 the order is GRBG starting from the top-left pixel and moving right - then the second row left-most moving right.

Seestar S50 Bayer Matrix

Note that there are two green pixels in the group of four, with the remaining two red and blue. A number of characteristics arise from this.

  • In an image of dimensions W x H, there are (W x H)/2 'green' pixels and (W x H)/4 'red' and 'blue' real physical pixels.
  • With three colours in RGB colour space, this number needs to be 'rounded' up to 4 in order to form a symmetrical repeating pattern.
  • Green is 'doubled-up' because the history of OSC sensors is in normal photography and the eye is most sensitive to green.
  • In order to provide a value of all three colours for every pixel, a de-bayering (de-mosaicing) algorithm is used. This is a process of generating values from adjacent pixels. There are different de-bayering algorithms and different Bayer matrix ordering. Therefore - when exporting non-debayered image data the Bayer matrix pattern must accompany the data.
In summary - for every 'real' red or 'blue' pixel there are two 'green' pixels. That is, the 'real' resolution for green is twice that of red or blue. This can be seen in the image data.
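The pixel counts above can be illustrated with a small sketch that splits a GRBG mosaic into its colour planes. This is an illustrative Python helper (the blog's own code is C#), with my own function name:

```python
# Split a GRBG Bayer mosaic into red, green and blue samples.
# Pattern per 2x2 cell (as in the Seestar S50): G R / B G.
def split_grbg(mosaic):
    """mosaic: list of rows, each a list of raw sensor values."""
    red, green, blue = [], [], []
    for r, row in enumerate(mosaic):
        for c, v in enumerate(row):
            pos = (r % 2, c % 2)
            if pos in ((0, 0), (1, 1)):
                green.append(v)   # G at top-left and bottom-right of cell
            elif pos == (0, 1):
                red.append(v)     # R at top-right of cell
            else:
                blue.append(v)    # B at bottom-left of cell
    return red, green, blue
```

For a W x H mosaic this yields (W x H)/2 green samples and (W x H)/4 each of red and blue, as stated above.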

To illustrate that I've taken a small tile of an image (outlined in red) from the Seestar S50 and examined it in detail. The data is from an image which has been 'stretched' using a CDF (cumulative distribution function) as described here.
The tile shown below is 25 x 25 pixels and is taken from an area of noisy data. The image data is displayed starting with the RGB composite, then the red, green and blue component magnitudes in row/column order.

What is immediately obvious is that the spatial resolution is much better in the Green channel. The Red and Blue channels are more 'blobby'. In other words - there is more high frequency data (detail) in the Green channel versus the Red and Blue channels.

It can be seen that there are areas in the red and blue data where the level is 'black'. In the corresponding areas in the green data there are fine features. When combined into the RGB composite this is the source of the 'green noise'.

It is interesting to see the effects of 'modulating' the red and blue data with the higher resolution green data. The results are shown below.

Here more detail has been added to the red and blue channels by multiplying their original values by the green channel data. The effect on the original image is shown below.
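The 'modulation' described above can be sketched as below. This is my guess at the operation - a per-pixel multiply with the green value normalised to full scale - rather than the author's exact code; the function name and normalisation are assumptions.

```python
# Impress the green channel's finer detail on the red and blue channels
# by multiplying each red/blue value by its (normalised) green value.
def modulate_by_green(red, green, blue, full_scale=65535):
    mod_r = [min(full_scale, round(r * g / full_scale))
             for r, g in zip(red, green)]
    mod_b = [min(full_scale, round(b * g / full_scale))
             for b, g in zip(blue, green)]
    return mod_r, mod_b
```

Note that wherever green is dark the red and blue values are pulled down too, which is consistent with the colour-balance corruption observed above.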

There is significantly more 'sharpness' in the detail - albeit with corruption of the colour balance. Nonetheless I find this an intriguing result.

Thursday, May 30, 2024

Minimalist Post-Processing...

Background

Having started learning about astrophotography at the beginning of 2024, it's been a bit of a wander through the various astrophotography 'rooms' trying different things. Each 'room' had its own attractions - but there are a great number of rooms. There is also a lot of 'advice' as to how, what and where astrophotography should be done. It is easy to be deterred from picking your own 'niche'.

A comment often heard from some quarters is 'why do astrophotography?'. After all, much better images - captured by professional telescopes - could be downloaded from the 'net. That's true - but that is missing the most important point (at least for me) - which is - those images are not my images. The wonderment that the photons from some object millions of light-years away have been captured in my backyard telescope and converted to pixels in my image is missing from a downloaded image.

I'm gradually finding my own 'niche' guided by some parameters...

  • I know next to nothing about the night sky.
  • Viewing through an eyepiece is not an option due to eyesight and posture issues.
  • Polar aligning an equatorial mount is not an option due to eyesight, posture and light pollution (SCP doesn't have a Polaris). I tried and could never do it.
  • Spending hours on post-processing with clever - but complicated - post-processing applications is not my idea of fun.
  • My sky view is limited - the home block is almost covered by tall eucalypts - and so about half a dozen different sites on the block are needed to get some coverage. Therefore, a permanent installation is not practicable.
So - for my situation the Seestar S50 is the best telescope option. It makes the time taken to find and observe targets amazingly short. In fact, within 5 minutes of walking out the door with the Seestar S50 you can be acquiring data.

Post-Processing Pain


Spending hours post-processing the data to get the 'best' picture (the definition of which is not agreed upon anyway) - for me - slowly drains the wonderment and fun out of the exercise.

To this end I have spent a lot of time writing code to analyse the FITS format files output from the Seestar S50 for the purpose of perhaps coming up with a way to automatically post-process the data into an acceptable image. This exercise was largely successful - but still requires too much manual intervention.

Seestar S50 to the Rescue


One of the fantastic aspects of the Seestar S50 is the regular updates to the phone application and onboard Seestar S50 firmware. There are so many enhancements that cool new features/functions are easily overlooked. One of the cool new functions I overlooked is called 'Deep Sky Stacker'. Previously, the exposures were stacked on the fly while the observation was in progress, producing a single stacked FITS file and a 'thin' JPG (really just a thumbnail). The functionality has now been extended such that - if the option is checked in 'Advanced Feature' - all the single RAW FITS exposure files are saved as well. In Deep Sky Stacker all - or a subset - of those exposure files can be selected and stacked. Maybe there are satellite trails on some of them - those exposures can then be excluded from the stack. You can even stack images of the same object taken on different nights - provided the hour angles are roughly the same.

In addition, the stacked image can be edited in-situ, with controls for denoising, brightness, contrast and saturation. The images can be 'downloaded' immediately to the phone - or exported to Google Drive or Dropbox (though exporting is delayed until the phone is disconnected from the Seestar S50 and normal WiFi access is restored).

Below is an image of the Orion Nebula which has been processed entirely with the Seestar S50 phone app and the Seestar S50 firmware. I'm pretty happy with the result.

Orion Nebula (M42) Processed Entirely with the Seestar S50 Phone App and Onboard Firmware

I will explore this functionality further.

Tuesday, May 21, 2024

Using the CDF in RGB Colour Space...

Examination of the CDF in the graph below of a simulated astronomical image reveals that the shape of the CDF closely resembles the shape of typical non-linear 'stretch' curves used in astronomical image processing applications. It's an interesting exercise to use the CDF to remap the magnitude values according to the CDF.

In this example the magnitude values range from 0 - 100 (X-axis). Taking a magnitude value of 10 on the X-axis (10 % of full magnitude) and travelling up from 0 on the Y-axis - we cross the CDF at a value of about 0.6 (right Y-axis). Scaling this w.r.t. the full magnitude of 100 gives a value of 60. So - using the CDF we have remapped a magnitude value of 10 to a value of 60. This has the effect of spreading the low level detail further up the magnitude scale, lifting it up out of the dark. Of course, this is a crude example - but it nonetheless illustrates the principle.
PDF and CDF of a Simulated Astronomical Image
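The remapping just described - reading each pixel's CDF value and scaling it back to the full magnitude range - is essentially histogram equalisation. A minimal illustrative sketch (my own Python, not the blog's C# application), using the 0 - 100 magnitude range of the example:

```python
# Remap each pixel value to CDF(value) scaled to the full magnitude range.
def cdf_stretch(pixels, levels=101):
    hist = [0] * levels
    for v in pixels:
        hist[v] += 1
    total = len(pixels)
    cdf, running = [], 0
    for count in hist:
        running += count
        cdf.append(running / total)
    # e.g. a value whose CDF is 0.6 is remapped to 0.6 * 100 = 60
    return [round(cdf[v] * (levels - 1)) for v in pixels]
```

With data clustered at the low end, low values are pushed well up the scale, exactly as in the worked example above.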
The effects of using this simple 'stretch' algorithm on three astronomical image types are shown below - where individual CDFs are generated for each of the three RGB channels. The underlying data hidden in the linear view are revealed. The flaws in this simple stretch method are discussed.

First, the Horsehead Nebula (IC 434) - a somewhat faint nebula near a very bright star...
IC434 (Horsehead Nebula) with Simple CDF Stretch
The observation time was a short 12 minutes and so the nebula is faint and close to the noise floor. The faint details are brought up out of darkness - but at the expense of the bright parts - which are 'flattened' and devoid of detail. The stars appear bloated. The image also appears to be de-saturated in the bright parts.

The luminance histogram after this stretch indicates that using the CDF is a form of histogram equalisation - where values clustered down at the bottom of the magnitude scale are promoted up the scale. This is generally what is needed - but it is obvious that this simple method is not even close to ideal.

Second, NGC 4216 - a galaxy (with supernova) in a sparse star field with low nebulosity...
NGC 4216 with Simple CDF Stretch
Once again the bright parts are 'flattened' and devoid of detail. The stars again appear bloated. The image also again appears to be de-saturated in the bright parts.

The luminance histogram after this stretch indicates a form of histogram equalisation - where values clustered down at the bottom of the magnitude scale are promoted up the scale.

Third, the Orion nebula (M42) - a bright nebula with faint details covering a wide dynamic range...
Orion Nebula (M42) with Simple CDF Stretch
Once again the bright parts are 'flattened' and devoid of detail. The stars again appear bloated. The image also again appears to be de-saturated in the bright parts. The bright parts of the nebula cover a significant portion of the image.
Due to the predominance of bright nebulosity in this data, the histogram is closer to what is required. However, the bright parts still have little detail and are desaturated.

Another issue with this simple method is that the quantisation of the data is emphasised, as shown in the histogram for just one colour - red.
The deviation from a more continuous curve arises because, when the re-mapping of values is done using the CDF, an increment of 1 in the input range can resolve to many steps in the output range. For example, using the CDF of the M42 image, an input value of 600 is re-mapped to 4837, while incrementing the input value by 1 to 601 re-maps it to 5648. That means output values in the range 4838 to 5647 are not possible. This is not good, as effectively many bits of amplitude resolution have been lost over the majority of the amplitude range - losing subtle detail.

Ways of reducing these flaws need to be found - if possible...

Monday, May 20, 2024

Histograms, PDFs, CDFs...

Histograms 

One of the most important tools for analysing astronomical images is the histogram. The histogram of the unprocessed astronomical linear image data shows clearly that the bulk of the pixels are cramped down at the bottom end of the magnitude scale as shown previously.

Example Astronomical Image Luminance Histogram
Also it was shown that the target for processing the image should spread the data values such that the bulk of the values lies around 20 % - 25 % of the scale, as shown below in the histogram of the 'everyday' image.
Everyday Image Luminance Histogram
This is done by 'stretching' the data, where the data values down near the bottom are increased, while at the same time the bright values near the top are left as they are. The stretching is done by some chosen non-linear function - but what and how?

Recapping on the previous post concerning the form of the image data contained in stacked FITS files from the Seestar S50 - the data covers the range of an unsigned short integer (16-bits), that is, 0 - 65535. The histogram could have a bin size of 256 (as the histograms above have) - giving 256 separate bins. However, as the original linear data has been shown to be cramped down to the bottom few % of the magnitude, it is better to have a bin size of the value resolution, i.e., 1. Therefore, there are 65536 bins.

To create the histogram then entails creating an array of 65536 elements, each holding an integer count. The array type needs to be an unsigned 32-bit integer (range 0 - 4,294,967,295) to ensure there is enough range for counts in an image of 1080 x 1920 pixels (if all the pixels were full white, the count in one bin would be 1080 x 1920 = 2,073,600).

For a Seestar S50 stacked FITS file, therefore, you would only need 21 bits unsigned (or 22 bits signed). However, the next step up after 16-bits is 32-bits. Of course, 32-bit signed integers could be used, as halving the positive range down to 2,147,483,647 still provides ample headroom - but personally (as all values will be positive) I prefer to use unsigned integers as that gives a clue in the code to the nature (i.e., all positive) of the data contained therein.

Building the histogram is simply a matter of reading each pixel in the image (three passes as there are three colours) and incrementing the value in the array element indexed by the value as read.

PDF - (Probability Density Function)

Not to be confused with the PDF document format, the Probability Density Function is a way of normalising the information contained in the image histogram. In the histogram, the Y-axis is the count of the number of pixels which have the value of the bin index. The scale of that count depends on the size of the image - a larger image will tend to have a higher count in each of the bins compared to a smaller image. By dividing the count in each bin by the total number of pixels in the image we get a normalised PDF with a Y-axis range of 0.0 to 1.0. Now for each magnitude value (X-axis) we get a probability of that value occurring in the image.

The use of the term 'probability' might seem a bit pretentious given the PDF just described is just a normalised count - probability is usually reserved for some random value. However, the term is justified. Imagine standing on a pixel somewhere in the image. The PDF gives the probability that any random pixel any distance away will be a certain value. As any random pixel must have a value in the magnitude range, the total of the individual probabilities for all magnitude values must equal 1.0.

Note that the shape of the PDF is the same as the histogram - just the Y-axis scaling has been normalised. Note also that the maximum Y-axis value will lie somewhere below 1.0 - unless all pixels have the same magnitude value. The maximum Y-axis value therefore gives an indication of how clustered the values are. Comparing the two histograms above - the astronomical image has data that is 'clustered' at the bottom end, and so the maximum Y-axis value in the PDF would be close to 1.0. The everyday image - with magnitude values spread over a wider range - would have a maximum Y-axis value much lower.

The minimum value in the data can be found by traversing the PDF starting from the 0 magnitude value and finding the first magnitude value with a PDF > 0.0. The maximum value can be found by traversing the PDF starting from the top magnitude value (65535) downwards and finding the first magnitude value with a PDF > 0.0.

CDF - (Cumulative Distribution Function)

The CDF is an integration of the PDF. The CDF for each magnitude value is the sum of the probabilities of that magnitude and all magnitudes below. This can be done empirically by doing a running sum of probabilities starting from the lowest magnitude. The Y values for the CDF always range between 0.0 and 1.0 - irrespective of the histogram - and so give a means to compare characteristics of different images. The rather busy graph below shows the PDFs for a typical astronomical image and an everyday image (light red and light green bar charts respectively). The PDF for the astronomical image (light red bar chart) shows values are clustered down the bottom end. Its CDF (dark red) rises steeply at first, but then flattens off. The PDF for an everyday image (light green bar chart) shows values are more evenly spread across the range. Its CDF (dark green) rises almost linearly. Note that a CDF can only keep the same value or increase progressing from left to right across the graph. That is, it is monotonically non-decreasing.

PDFs and CDFs of Astronomical and Everyday Images
The upshot of the plotting of the CDFs is that we would want to somehow process the astronomical image data to have a CDF closer to the everyday image.  How that is achieved is an interesting problem.

One aspect to consider here is that astronomical images typically have PDFs in which the values are significantly more clustered at the low end than even the example light red PDFA in the above graph.  This means the CDFs will have an even steeper rise (reaches near 1.0 more quickly) than the dark red CDFA curve above. As key differences between such astronomical images lie in the low magnitude clusters, that area of the graphs needs to be zoomed into to reveal differences.
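The histogram, PDF and CDF constructions described in this post can be sketched for one colour channel of 16-bit data as follows (an illustrative Python version - the blog's own code is C#, and the function names here are mine):

```python
LEVELS = 65536  # one bin per possible 16-bit value

def build_histogram(channel):
    # one counter per bin; Python ints comfortably cover the
    # 32-bit count range discussed above
    hist = [0] * LEVELS
    for value in channel:
        hist[value] += 1
    return hist

def to_pdf(hist):
    # normalise counts by the total pixel count: Y range 0.0 - 1.0
    total = sum(hist)
    return [count / total for count in hist]

def to_cdf(pdf):
    # running sum of probabilities: monotonically non-decreasing,
    # ending at 1.0
    running, out = 0.0, []
    for p in pdf:
        running += p
        out.append(running)
    return out

def data_range(pdf):
    # min: first level with PDF > 0 scanning up;
    # max: first level with PDF > 0 scanning down from 65535
    lo = next(i for i, p in enumerate(pdf) if p > 0)
    hi = next(i for i in range(LEVELS - 1, -1, -1) if pdf[i] > 0)
    return lo, hi
```

For a full RGB image these functions would simply be applied three times, once per colour channel.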

Sunday, May 19, 2024

Seestar S50 FITS Output Images...

 When examining and experimenting with Seestar S50 FITS format image files, it is useful to dive down into some of the characteristics of those files. Of importance is the numerical type of the data. Some of the characteristics of a stacked Seestar S50 FITS format file can be summarised as follows:

  • FITS format - this commonly used data format in astronomy stands for 'Flexible Image Transport System'
  • Data organised in rows and columns. Care is needed as applications differ in the order of row/column read-out - which can result in a 'flipped' image.
  • Information concerning details of the observation, equipment, etc, are contained in a header. Header overhead for a stacked Seestar S50 FITS file is 8,640 bytes.
  • Data is stored in three 2D tables corresponding to the 1080 by 1920 pixels in the image - one table for each of the red, green and blue components of RGB data forming a 3D array.
  • Data values are stored as a 16-bit signed integer, commonly referred to as a short in programming languages which have fixed-bit integer types. Another common reference is int16. The range of values for a short is -32,768 to 32,767.
  • As the magnitude of the optical data is always positive (i.e., minimum value = 0), an offset value of 32,768 needs to be added in applications reading the data such that -32,768 = 0. This converts the data range to a 0 to 65,535 magnitude range - in programming languages commonly called an unsigned short (ushort).
  • The FITS header specifies the 32,768 offset in a text field called 'BZERO'. Associated with that text field is one called 'BSCALE' - which has a value of 1 in all stacked files examined. In terms of C# code, the conversion from the short values read from the file into ushort magnitude values is done using this relation...
    ushort theValue = (ushort)(bZero + bScale * shortDataArray[colour, row, column]);

It should be noted that the data in the single stacked FITS file from the Seestar S50 has been de-bayered (de-mosaiced), a correction made for field rotation, stacked and then converted into 16-bit RGB data. The text field 'BAYERPAT' with a value 'GRBG' found in the stacked FITS file header is not needed as the data has already been de-bayered internally in the Seestar S50.
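The short-to-ushort conversion in the C# snippet above can be checked with a few lines of Python (an illustrative sketch of the same arithmetic, not part of the blog's application):

```python
# Convert a signed 16-bit FITS value to an unsigned 0 - 65535 magnitude
# using the header's BZERO and BSCALE (32768 and 1 for Seestar S50 files).
def to_magnitude(raw_short, bzero=32768, bscale=1):
    return bzero + bscale * raw_short
```

FITS libraries such as astropy typically apply this BZERO/BSCALE scaling automatically on read, so the manual conversion is only needed when parsing the data block yourself.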

The individual sub-frame files, which can also be downloaded for stacking in external applications, are the raw data from the Bayer matrix in the sensor - which needs de-bayering. Therefore, the text field 'BAYERPAT' with a value 'GRBG' found in the sub-frame FITS file header is needed by any external program. The processing and creation of a single RGB image from this sub-frame FITS file data is not trivial as is evidenced by different external programs not agreeing on what Bayer pattern should be applied. In some applications it is necessary to select a different Bayer matrix pattern to what is specified in the FITS file due to that application reversing the read order of the row data - which flips the image and changes the effective Bayer matrix pattern. Due to this complexity, activity here - as far as analysis and processing is concerned - is restricted to stacked FITS files.

NOTE: My initial understanding of how data was stored in the sub-frame files was incorrect. I went down a path assuming the data was stored as RGB as is done in the stacked files. This led to all manner of confusion on my part w.r.t. how many bits were allocated to each of R, G and B. Fortunately there was help to be found on the Seestar S50 (Official ZWO Group) Facebook group, which educated me about de-bayering, etc. Thanks Chris G. !!!

Saturday, May 18, 2024

The General Nature of Astronomical Images...

Astronomical images are acquired in basically the same way as 'everyday' images. While the hardware may be optimised for the special characteristics of an astronomical image, the capture process remains one of capturing photons and turning their energy into a chemical change (in the case of film - rare these days) or, more commonly, an electrical signal via a digital sensor chip. The digital capture device consists of a lens of a certain aperture and focal length coupled with a sensor. While the technology for capturing 'everyday' images and astronomical images is largely the same, there is a significant difference in the nature of the subject being observed.

For 'everyday' images, in the vast majority of cases, there is no lack of photons impinging on the sensor across the whole image. A typical 'everyday' image is shown below.

An 'Everyday' Image
The brightness of this image data across the image can be represented by a luminance histogram as shown below.

Everyday Image Luminance Histogram
This histogram plots the count (vertical axis) of pixels in the image which have a certain luminance value (horizontal axis: dark to light, left to right). There are three peaks - the left-most peak, near 0 luminance, is formed by the 'black' pixels, while the right-most peak, near maximum brightness, is formed by the bright spots in the image. The bulk of the pixels have luminances in-between these two extremes, forming a broad third peak near the 20% luminance point. While there are peaks and troughs in this histogram it could be said, generally speaking, that the pixels are reasonably spread across the luminance range - as you would expect from inspection of the image above.

In contrast, note the image below - which is a Seestar S50 output image of the Orion Nebula. This is one of the brightest nebulae - and yet only the brightest part is visible along with some bright stars.

Unprocessed Seestar S50 Output Image of the Orion nebula (M42)
The brightness of this astronomical image can again be plotted as a histogram, shown below.
Example Astronomical Image Luminance Histogram
In this histogram there is just one visible peak - right near the lowest luminance. The bright parts of the nebula and the bright stars are to the right - but are not visible in the histogram because almost all the pixels are grouped into the very large spike near zero luminance. This is the problem with astronomical images in general: the display of the image requires that the bright stars are not saturated, whilst the fainter details (nebulosity and faint stars) need to be promoted up the brightness scale so as to become visible.

A simple linear addition/multiplication of the brightness is not possible, as this would saturate the bright parts. Some sort of non-linear transformation needs to be done - one where the fainter details are brightened but the brighter parts are not saturated. An additional issue is that the faint details to be brightened natively lie at the very bottom of the luminance scale. This area is also populated with system noise. The shorter the exposure for a given set of optics, the closer to the system noise the desired faint details will be. It becomes a challenge to distinguish which part of a pixel value is due to noise and which part is real 'signal'.

The successful extraction of the fainter details of the image is a fine balance between promoting those faint details without promoting the system noise into view. In practice - unless it is acceptable to lose detail in the data - some level of noise will be left in the image. How much is acceptable is a subjective judgement.
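As an illustration of such a non-linear transformation, the sketch below uses an asinh stretch - one common choice for astronomical data, not necessarily the method used in the applications described here. Faint values are lifted strongly while values near full scale are barely changed, so highlights are not driven into saturation.

```python
# Sketch: an asinh stretch, one common non-linear transform for
# astronomical image data (an illustrative choice, not necessarily
# the stretch used by the applications in this blog).

import math

def asinh_stretch(x, a=0.02):
    """Map a normalised pixel value x in [0, 1] through an asinh curve.

    Smaller 'a' gives a more aggressive lift of the faint end while
    x = 1.0 always maps to exactly 1.0 (no highlight clipping).
    """
    return math.asinh(x / a) / math.asinh(1.0 / a)

for x in (0.001, 0.01, 0.1, 1.0):
    print(f"{x:5.3f} -> {asinh_stretch(x):.3f}")
# 0.001 -> 0.011   (faintest values boosted roughly tenfold)
# 0.010 -> 0.104
# 0.100 -> 0.502
# 1.000 -> 1.000   (full scale is preserved, not saturated)
```

Note that the same curve that lifts faint nebulosity also lifts any system noise sitting at the bottom of the scale, which is exactly the balance discussed above.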

The Orion Nebula image can be 'stretched' as shown below. Here can be seen the brighter parts (with little noise) along with the fainter parts descending down to the 'salt-and-pepper' system noise level.
Example Astronomical Image - Stretched Non-Linearly
The corresponding luminance histogram is shown below...
Luminance Histogram of Stretched Example Astronomical Image
Note that this histogram looks similar to that of the 'everyday' image above - ignoring the peaks at minimum and maximum luminance. The bulk of the pixels now have a luminance of around 25% of full scale. Note also that the detail in the bright centre of the nebula has been washed out due to compression at the top end of the brightness scale. A different allocation of the limited dynamic range can restore the detail in the bright area to some extent - but at the cost of some loss of detail in the faint nebulosity areas. Getting that balance right is a challenge.

Wednesday, May 15, 2024

AstroSmartProc (ASP)...

Initial attempts were made to automate the first few stages of processing of a FITS format file from the Seestar S50. The application coded was called 'AstroSmartProc' (ASP) - short for Astronomy Smart Telescope Processor - and the GUI is shown below.

AstroSmartProc (ASP) GUI
This application automated the non-linear stretch and black pointing of Seestar S50 image data and performed to an acceptable degree.
 
Seestar S50 Capture of M42 - Processed by ASP 

ASP utilised the usual processes for its automation tasks - but I became intrigued to see if some alternate method or methods could be developed which might perform better. To this end a second application was put into development. The function of this application was limited to analysing the data and testing various algorithms.

This second application is named 'Seestar Image Viewer' (SIV).

Development of this application is ongoing at the time of writing.

Thursday, May 2, 2024

Flipping Hell...

 As a newcomer to astrophotography, the question of which is the right way up for images arises. A moment's thought provides the answer that there's no 'right way up' in space. Indeed an observer in the northern hemisphere - say at latitude 45 degrees North - looking South and imaging an object at declination 0 degrees on the meridian will have the positive declination direction towards the 'top' of the image. Conversely, an observer in the southern hemisphere - say at latitude 45 degrees South - looking North and imaging the same object at declination 0 degrees on the meridian will have the positive declination direction towards the 'bottom' of the image. At first it might be thought that swapping over top and bottom by a 'flip' would make the view the same - but instead a rotation is needed. This involves a 'mirror' in addition to the 'flip'.

Warning: It's best not to use the entirely logical term 'mirror' as it risks descending into a useless discussion about terminology. Instead use 'flip horizontal' and 'flip vertical' - or better still 'flip left-right' and 'flip top-bottom' respectively.

One could avoid the mention of 'flip' completely, as the only relevant transformation is 'rotation' - except for the observation that viewing the same FITS file in different applications reveals that some applications appear to not 'rotate' the image, but perform a single 'flip'. This led to considerable confusion on my part.
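The distinction can be made concrete with a small sketch: a 180-degree rotation is a flip left-right followed by a flip top-bottom, whereas a single flip on its own is a mirror - which is what reverses the apparent spin direction of a spiral.

```python
# Sketch: rotation versus flip, on a tiny 2x2 'image' stored as
# nested lists. A 180-degree rotation is flip left-right plus flip
# top-bottom; a single flip is a mirror.

def flip_lr(img):
    """Flip left-right: reverse each row."""
    return [row[::-1] for row in img]

def flip_tb(img):
    """Flip top-bottom: reverse the row order."""
    return img[::-1]

def rotate_180(img):
    """A 180-degree rotation is both flips applied together."""
    return flip_tb(flip_lr(img))

img = [
    [1, 2],
    [3, 4],
]

print(rotate_180(img))  # [[4, 3], [2, 1]] - rotated, spin direction preserved
print(flip_lr(img))     # [[2, 1], [4, 3]] - mirrored, spin direction reversed
```

An application that performs only `flip_tb` (by reading the file's rows in the opposite order) therefore produces a mirror image, not a rotated one - the source of the confusion described above.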

The problem is that this error can go unnoticed in an image where there are no clues as to the correct view 'on the sky'. An example of this lack of clues would be a star field. Moving up the range of objects whose details give an increasing level of 'clues', the easiest objects in which to identify orientation are spiral galaxies. In the representation of a spiral galaxy directly below (actually a Catherine wheel firework), the direction of spin is easily seen to be anti-clockwise. Rotating this image naturally retains the direction of spin.

Catherine Wheel - Anti-clockwise Spin
However, if the image is 'flipped' instead (in this case left and right are swapped) as shown below, the spin direction is now clockwise. That is, the view which would be seen from behind the Catherine wheel.

Catherine Wheel - Flipped Horizontal - Clockwise Spin
In writing my own applications to analyse and/or process Seestar S50 FITS files, the order of reading the data from the file determines whether the image is 'flipped' or not. This was checked empirically by doing a test run and comparing the image with an image from a professional source. It was found that the order of the data in one dimension needed to be reversed. Of course, once the correct order is determined via one FITS image file from the Seestar S50, the same holds true for any Seestar S50 FITS file. The correct order for FITS files from other sources (e.g., Dwarf Lab II) needs to be determined separately. And it's worth repeating - care needs to be taken when using other applications.