There’s no argument that digital photography has become a standard part of modern life, with smaller, more compact cameras that deliver amazing picture quality entering the market every day. But what makes this possible? The secret is in the image sensor. This episode of Techquickie with Linus Sebastian provides a quick, to-the-point explanation of image sensors:
What Are Image Sensors?
Image sensors artificially mimic the transduction process of a biological eye. OK, wait, you say, what is transduction? It’s the action or process of converting energy into another form.
So, in photography, image sensors let us capture high-quality images and video directly in digital form, ready for immediate use.
Here’s Sebastian’s condensed explanation:
In the human eye, rod and cone receptors work in combination with ganglion cells to convert photons into an electrochemical signal which the occipital lobe in your brain can then process.
In the case of an image sensor, photons are captured as charged electrons in silicon and converted to a voltage value through the use of capacitors and amplifiers, then translated into digital code which can be processed by a computer.
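That capture chain — photons to electrons, electrons to a voltage, voltage to a digital code — can be sketched in a few lines. This is a toy model with made-up illustrative numbers (quantum efficiency, gain, full-scale voltage), not the specs of any real sensor:

```python
def photons_to_digital(photon_count, quantum_efficiency=0.5,
                       volts_per_electron=2e-6, gain=400.0, adc_bits=12):
    """Toy model of a sensor pixel: photons -> electrons -> voltage -> code."""
    electrons = photon_count * quantum_efficiency       # photons captured as charge in silicon
    voltage = electrons * volts_per_electron * gain     # capacitor + amplifier stage
    max_code = 2 ** adc_bits - 1
    # Analogue-to-digital conversion, assuming a 1 V full-scale input,
    # clipped at the brightest representable value.
    return min(int(voltage * max_code), max_code)
```

Doubling the light roughly doubles the output code until the pixel saturates at the ADC's maximum value, which is why blown-out highlights lose all detail.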
While this can be implemented in different ways, most sensors operate along broadly similar lines. The big difference that separates them is how they process the captured signal. There are two mature, widely available types of sensor: CCD (Charge-Coupled Device) and CMOS (Complementary Metal-Oxide-Semiconductor).
As Sebastian explains, CCD sensors work by registering photons in silicon that contains a grid array of pixels. After electron charges are captured in this pixel array, they are shifted from the bottom to the top of the grid into a serial shift register and pushed out a single charge at a time, converted into an analogue voltage, and then turned into digital code by an analogue-to-digital converter.
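The bucket-brigade readout described above can be modelled in a few lines. This is a simplified sketch of the charge-shifting order only (no voltages or noise), assuming the bottom row is clocked into the serial register first:

```python
def ccd_readout(pixel_charges):
    """Toy CCD readout: return charges in the order a CCD clocks them out.

    pixel_charges is a 2-D list representing the sensor grid. Each pass,
    the bottom row shifts into the serial register, which then pushes out
    one charge at a time.
    """
    readout = []
    grid = [row[:] for row in pixel_charges]        # copy so the caller's grid survives
    while grid:
        serial_register = grid.pop()                # bottom row enters the serial register
        while serial_register:
            readout.append(serial_register.pop(0))  # one charge out at a time
    return readout

# A 2x2 grid [[1, 2], [3, 4]] reads out as [3, 4, 1, 2]:
# the bottom row leaves first, pixel by pixel.
```

The strictly serial loop is the point: every single pixel passes through the same register and the same output amplifier, which is where both the high power draw and the low noise of CCDs come from.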
Since these sensors process charges one at a time across the lines of an array, the system needs quite a bit of power to function. But it also produces less noise, because every pixel passes through the same small number of voltage amplifiers. For this reason, CCD sensors are better suited to certain types of photography, such as aerial or astronomical work.
CMOS sensors are what you will typically find in consumer-grade products. The difference is that instead of shuffling electron charges along an array to be converted at the end, CMOS sensors add extra circuitry to each pixel, allowing each one to process its own signal and send the result directly down the line to the processor.
This eliminates the serial readout bottleneck, which reduces power usage and increases processing speed.
CCD and CMOS Image Sensors Compared
CCD:
- More expensive
- Use more power
- Less digital noise
- Better in low light

CMOS:
- Less expensive
- Use less power
- More digital noise
- Rolling shutter issues
That was a nice, compact explanation and comparison of CCD and CMOS image sensors. We hope it cleared a few things up for you and helped you understand how your camera works just a little bit more.