Saturday 9 May 2020

Re-discover the Night Sky #3: hot, stuck & dead pixels

Camera image sensors may have faults, which can result in incorrect pixel values.

Welcome to the world of hot pixels, stuck pixels & dead pixels!

My initial night-sky experiments involved a Raspberry Pi camera, which was a useful platform to play with (...and I already had a packaged system) but it had a number of limitations. Primarily, the longest shutter speed is about 6 seconds, and I have a few 'light leaks' on my hacked lens mount which degrade the images.

So I played around with my old Pentax K110d for a while (which I triggered with an NE555 timer circuit) before moving on to my Olympus OM-D E-M10.

The M10 has timelapse modes built in, so it's just a case of setting the exposure and timing interval. This allows me to use a 30s shutter speed, so I'm more likely to 'catch a falling star' or a satellite.

However, I initially found that the camera would not take a second shot until 30s after the first shot had finished, so stars were still captured as broken/dashed lines, which is not what I wanted. This was fixed by turning off the long exposure Noise Reduction, but then I discovered another issue...

pixel problems


Camera image sensors can suffer from 3 basic problems:-
  • Dead pixels; a pixel gives no output, so it is always black
  • Stuck pixels; a pixel always produces maximum output, so it is always red, blue, green or white
  • Hot pixels; a pixel produces unexpectedly high output during long exposures, showing up as red, blue, green or white dots
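If you want to see where your suspect pixels are without firing up Gimp, a few lines of Python (Pillow + NumPy) will flag them from a lens-cap frame. This is just a rough sketch, not part of my workflow; the file name and threshold values are placeholders:

```python
# A minimal sketch of flagging suspect pixels in a lens-cap 'dark' frame.
# The file name and thresholds below are placeholder values for illustration.
from PIL import Image
import numpy as np

dark = np.asarray(Image.open("dark_30s_iso200.jpg").convert("L"), dtype=np.uint8)

# With the lens cap on, every pixel should be near black. Anything bright is
# a hot pixel (or, if pinned at full scale, possibly a stuck pixel).
hot = np.argwhere(dark > 64)        # noticeably brighter than the background
stuck = np.argwhere(dark >= 250)    # at (or near) maximum output

print(f"{len(hot)} hot pixel(s), {len(stuck)} possibly stuck pixel(s)")
for y, x in hot[:20]:               # list the first few locations
    print(f"  hot pixel at x={x}, y={y}, value={dark[y, x]}")
```

Dead pixels are harder to spot this way, since a black pixel in a dark frame is exactly what you expect; you would need a bright, evenly lit frame to find those.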
With Noise Reduction turned off, I can re-trigger the camera to take the next shot after a short delay of (say) 5 seconds. But when the images are stacked and processed in Gimp, I get fixed points of light in addition to the star trails.

20 photos taken on Olympus M10 + 14mm lens, ISO 200, f7.1, 30s shutter @ 5s interval
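The stacking step itself is nothing magic. I do it in Gimp, but a per-pixel maximum ('lighten only') blend is a common way to build star trails from a sequence, and a rough Python equivalent might look like this (the file pattern is a placeholder, not my actual file naming):

```python
# A rough sketch of 'lighten'/maximum stacking of a folder of JPEGs with
# Pillow and NumPy. The file pattern is a placeholder.
import glob
import numpy as np
from PIL import Image

stack = None
for path in sorted(glob.glob("startrail_*.jpg")):
    frame = np.asarray(Image.open(path).convert("RGB"), dtype=np.uint8)
    # keep the brightest value seen so far at each pixel
    stack = frame if stack is None else np.maximum(stack, frame)

if stack is not None:
    Image.fromarray(stack).save("stacked.jpg")
```

Because the blend keeps the brightest value at every pixel, any hot pixel in any single frame survives into the final stack, which is exactly why the fixed dots show up.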


These points or dots are hot pixels. When the long exposure Noise Reduction feature is turned on, the camera basically takes two 30s exposures: the 1st with the shutter open and the 2nd with it closed. By subtracting the 2nd (dark) frame from the 1st, the hot pixel dots are simply removed.

You can do the same thing in Gimp by taking two long exposure photos: the 1st of the night sky, the 2nd with the lens cap fitted. Load them as layers into Gimp and select the 'Subtraction' layer mode.
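The same subtraction is easy to script if you have a lot of frames to clean up. A minimal sketch with Pillow and NumPy (file names are placeholders, and both frames need to be the same size):

```python
# A minimal sketch of dark-frame subtraction outside Gimp.
# File names are placeholders; both frames must be the same size.
import numpy as np
from PIL import Image

light = np.asarray(Image.open("sky_30s.jpg").convert("RGB"), dtype=np.int16)
dark = np.asarray(Image.open("lenscap_30s.jpg").convert("RGB"), dtype=np.int16)

# Subtract the lens-cap frame and clip negative values back to black,
# which removes the hot pixel dots but leaves the stars alone.
clean = np.clip(light - dark, 0, 255).astype(np.uint8)
Image.fromarray(clean).save("sky_30s_clean.jpg")
```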

These hot pixels seem to 'move around'. So you could take one reference shot with the lens cap fitted and use it to subtract the hot pixels from a group of images, but I don't know how long the reference image would stay valid. The hot pixel pattern is certainly not valid from one night to the next, but it may be OK for (say) 30 minutes.

Anyway, I can live with these static dots when I'm just searching for flying objects. And if I had a special image I wanted to publish, the dots could easily be edited out!



While taking the photo sequence above, I noticed a satellite passing overhead, but thought it was just to the left of the camera's field of view. However, when I processed the 20 images again using my script with the threshold set right down at 6, the satellite just appears about a quarter of the way up the left-hand side (look for the X).


Even when I view only the single image that contains this satellite, I still need to bring the threshold down to 6 or 7 to see it, which illustrates how faint a moving object is when captured with a 30s exposure. It probably indicates that I should take more images with a shorter exposure, so I will experiment with 10s, 15s & 20s shutter speeds.
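For anyone wondering what 'setting the threshold down to 6' amounts to: the idea is simply to keep only the pixels that are a few levels above pure black, so even a very faint trail stands out. This isn't my actual script, just a rough illustration of the principle (the file name and threshold are placeholders):

```python
# A rough illustration of a very low threshold: every pixel brighter than the
# threshold is pushed to white, so a faint satellite trail a few levels above
# black becomes visible. File name and threshold value are placeholders.
import numpy as np
from PIL import Image

THRESHOLD = 6   # on a 0-255 scale, '6' is barely above pure black

grey = np.asarray(Image.open("frame_with_satellite.jpg").convert("L"))
mask = (grey > THRESHOLD).astype(np.uint8) * 255
Image.fromarray(mask).save("thresholded.png")
```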
