How the Basic Science We Learned at School Can Improve Our Photography

Understanding the basic physics of light will help us achieve the best photographic results. Here are some simplified explanations on how the science we learned at school can influence what we do with cameras.

There’s a lot more to light than meets the eye. Okay, so that’s an awful pun, but it is both metaphorically and literally true. Light is a complex topic that we don’t need to understand in order to live our daily lives. More light hits your retina than the retina can convert into nerve impulses to send to your brain, and even less of the light landing on your camera sensor gets turned into an electronic signal and transmitted to the camera’s processor.

In other words, our eyes can’t see all the light there is; much of it is invisible to us. Even so, they see more than our cameras do.

Take Control of Your Camera’s Dynamic Range

On a sunny day, you can see detail in the bright highlights of the sky as well as in the shadows of objects on the ground. Your camera, in a single exposure, can’t capture as wide a range of tones. Modern sensor technology is much improved, though, and as you can see in the image above, sensors can now capture detail in both the highlights and the darker areas. Not many years ago, I would have had to bracket the exposures of that scene and then combine them into a high dynamic range (HDR) image to be able to see the top of the lighthouse when pointing the camera directly at the sunrise.

As we age, we gradually lose the ability to perceive a wide range of tones, but a healthy eye should be able to see between 18 and 20 stops between black and white. Top-of-the-range cameras are capable of 12 to 15 stops, although a new sensor released this year boasts 24.6 stops. Don’t worry too much about those numbers, though; your camera will still take fabulous pictures.
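Those stop counts are easier to grasp as contrast ratios, since each stop is a doubling of light. A quick sketch (the stop values are the approximate figures quoted above):

```python
# Each stop doubles the amount of light, so a dynamic range of
# n stops corresponds to a contrast ratio of 2**n : 1.
def contrast_ratio(stops: float) -> float:
    return 2.0 ** stops

for label, stops in [("human eye (approx.)", 20),
                     ("high-end camera", 14),
                     ("new sensor claim", 24.6)]:
    print(f"{label}: {stops} stops = roughly {contrast_ratio(stops):,.0f}:1")
```

So 20 stops is a contrast ratio of about a million to one, while 14 stops is around sixteen thousand to one, which is why a single exposure can still clip a sunrise.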

What Is the Importance of the Visible Spectrum?

The visible spectrum is only a small fraction of the whole electromagnetic spectrum. I find it quite incredible that our eyes can differentiate the colors of the spectrum within a range that’s only 320 nanometers wide. We can distinguish all seven colors of white light, plus their combinations.

We are lucky that the majority of photons hitting the Earth fall within the 380–700 nanometer range. This is a result of our planet’s location in the Goldilocks zone: exactly the right distance from the right kind of star. In addition, our atmosphere contains an ozone layer that prevents most of the harmful ultraviolet light from reaching us. We wouldn’t last very long if we were bathed in more energetic radiation such as UV or gamma rays. Conversely, if the wavelengths we saw were much longer, our eyes would need to be much larger, and it would be difficult for us to thread a needle.

What Is the Impact of Light Bending on Our Images?

As the Earth spins and dawn arrives, we see the sun appear before it’s physically above the horizon. That’s because, as with light passing through a prism or raindrops, light bends when it enters the atmosphere. This refraction lets us see slightly beyond the horizon. It’s the same effect you see when you put a spoon into a glass of water and the spoon appears to bend.

White light is made up of seven colors: red, orange, yellow, green, blue, indigo, and violet. These are the colors of the rainbow, and of the cover of the Pink Floyd album The Dark Side of the Moon.

In a medium such as glass, each color travels at a different speed because each wavelength is different, and because the colors bend by different amounts, white light is split into its components. Red light is slowed the least and refracted the least; at the opposite end of the spectrum, violet light is slowed the most and refracted the most. This splitting of light is called dispersion: the greater the refraction, the greater the dispersion. Diamonds are highly refractive and dispersive, which is why they flash with a rainbow of colors.
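The bending follows Snell’s law, n1·sin(θ1) = n2·sin(θ2). A small sketch of the effect, using illustrative refractive indices of roughly 1.514 for red and 1.532 for violet in crown glass (approximate textbook values, not measurements):

```python
import math

def refraction_angle(theta_in_deg: float, n1: float, n2: float) -> float:
    """Snell's law: n1*sin(t1) = n2*sin(t2); returns t2 in degrees."""
    sin_t2 = n1 * math.sin(math.radians(theta_in_deg)) / n2
    return math.degrees(math.asin(sin_t2))

theta = 45.0  # angle of incidence, going from air into glass
for colour, n in [("red", 1.514), ("violet", 1.532)]:  # approx. crown glass
    print(f"{colour}: refracted to {refraction_angle(theta, 1.000, n):.2f} degrees")
```

Violet’s higher index gives it a smaller refraction angle, i.e., it bends more than red, which is exactly the separation that produces dispersion.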

In photography, this splitting of light is not desired; rainbows are the last thing we want from a lens. It shows up as color fringing around high-contrast edges and is a common fault with cheap lenses. A perfect lens would have no such aberrations: all wavelengths would converge at a single point on the sensor. Lens manufacturers use multiple glass elements to bring the different wavelengths back together, and modern professional lenses show little or no visible fringing.

Light can also bend when it strikes an edge, an effect called diffraction. Imagine water ripples hitting an obstacle: they bend around the obstruction. It’s the same with light.

Why We Avoid Small Apertures

On a bright, sunny day, stand outside and observe your shadow. You’ll notice that the outer edges aren’t as dark as the center. Light bending around you causes that lighter edge, which is known as the penumbra; the darkest part is called the umbra. The farther you are from the light source, the larger the umbra is in comparison to the penumbra, and the sharper your shadow appears. This is worth taking into consideration when using studio lighting and flash.
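For studio work you can estimate that soft edge with simple similar-triangles geometry. A rough sketch (the softbox size and distances are made-up example numbers):

```python
# Penumbra width from similar triangles: a light source of diameter D
# at distance L from the subject casts a shadow on a wall d behind the
# subject with a soft edge roughly D * d / L wide. (Simplified model.)
def penumbra_width(source_diameter_m: float,
                   source_distance_m: float,
                   wall_distance_m: float) -> float:
    return source_diameter_m * wall_distance_m / source_distance_m

# A 60 cm softbox 2 m from the subject, wall 1 m behind the subject:
print(penumbra_width(0.6, 2.0, 1.0))  # about 0.3 m: a soft shadow edge
# The same softbox moved back to 6 m:
print(penumbra_width(0.6, 6.0, 1.0))  # about 0.1 m: a harder edge
```

Moving the light farther away (or making it smaller) shrinks the penumbra, which is why a distant bare flash gives hard shadows and a close softbox gives soft ones.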

The same light-bending occurs as photons pass the edges of the aperture blades inside a lens. The smaller the aperture, the more prominent this diffraction becomes, because the proportion of diffracted to un-diffracted light is higher at small apertures.

That’s why photographers avoid very small apertures: the increased proportion of diffracted light makes images less sharp.
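You can put a number on this with the standard Airy disk formula (diameter of the first dark ring is about 2.44 × wavelength × f-number). A sketch, assuming green light at 550 nm and an illustrative 4 µm pixel pitch:

```python
def airy_disk_diameter_um(f_number: float, wavelength_nm: float = 550.0) -> float:
    """Approximate Airy disk diameter (to the first dark ring), in micrometres."""
    return 2.44 * (wavelength_nm / 1000.0) * f_number

PIXEL_PITCH_UM = 4.0  # illustrative pitch for a high-resolution sensor
for f in (4, 8, 16, 22):
    d = airy_disk_diameter_um(f)
    note = "diffraction softening likely" if d > 2 * PIXEL_PITCH_UM else "fine"
    print(f"f/{f}: Airy disk approx. {d:.1f} um ({note})")
```

By f/16 the blur disk spans several pixels on such a sensor, which matches the practical advice to stay away from the smallest apertures unless you need the depth of field.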

Why the Sky Is Blue

This bouncing, properly called scattering, occurs not only when light hits boundaries but also when it encounters particles in the air. Blue light has a shorter wavelength and is therefore scattered more readily than red. This is why the sky is blue during the day.
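The strength of this Rayleigh scattering goes as one over the fourth power of the wavelength, so the effect is easy to quantify. A sketch using representative wavelengths of 450 nm for blue and 650 nm for red:

```python
def rayleigh_ratio(lambda_a_nm: float, lambda_b_nm: float) -> float:
    """Relative Rayleigh scattering strength: proportional to 1/wavelength**4."""
    return (lambda_b_nm / lambda_a_nm) ** 4

# Blue (~450 nm) versus red (~650 nm):
print(f"Blue is scattered about {rayleigh_ratio(450, 650):.1f}x more than red")
```

The ratio works out to a little over four, which is why the scattered skylight overhead looks strongly blue.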

When we look towards the horizon, the sky can seem whiter. This is because the blue light is scattered again and again on its longer path through the atmosphere, diluting the blueness. Furthermore, looking obliquely through the atmosphere, the extra particles scatter the other colors too, and there’s also light reflected from the planet’s surface affecting it.

These factors mix the wavelengths back together and produce white light. One exception is over the ocean, where blue light reflected from the water back into the atmosphere turns the low sky blue again.

How to Use a CPL Filter

The scattered skylight is polarized: instead of oscillating in random directions, the blue light waves oscillate predominantly in one plane, and this polarization is strongest at 90 degrees to the sun. If you attach a circular polarizing filter (CPL), you can cut much of that scattered light when shooting at right angles to the sun, making the sky look darker.

Polarizing filters are also great for removing reflections from the water’s surface, allowing you to see more clearly what lies beneath, because the reflection is polarized. You can likewise use one to remove glare from damp leaves in autumn, allowing their colors to become more vibrant.

The Physics Behind Those Glorious Sunsets

As the sun descends, its light must travel through more and more atmosphere to reach you. The lower atmosphere is full of dust and water vapor, which scatters away even more of the blue light, leaving us with the reds, oranges, and yellows.

Warm Colors Are Not Warm at All

When I refer to warm colors, I mean those that are psychologically warm. We tend to think of reds, yellows, and oranges as warm colors while blues and greens are cool. In physics, it is the opposite. Imagine a blacksmith heating a piece of metal: it starts out glowing red, then turns yellow, and as it gets hotter still, bluish-white. The gas torch used by welders, hot enough to melt steel, burns blue.
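Wien’s displacement law makes this concrete: the peak wavelength a glowing body emits is inversely proportional to its temperature. A sketch (the temperatures are illustrative round figures, not measurements):

```python
WIEN_B = 2.898e-3  # Wien's displacement constant, metre-kelvins

def peak_wavelength_nm(temp_k: float) -> float:
    """Peak emission wavelength of a black body at temp_k, in nanometres."""
    return WIEN_B / temp_k * 1e9

for label, t in [("red-hot metal", 1300),
                 ("daylight sun", 5800),
                 ("blue-white flame", 9000)]:
    print(f"{label} ({t} K): peak around {peak_wavelength_nm(t):.0f} nm")
```

The sun at roughly 5,800 K peaks near 500 nm in the green-blue, while red-hot metal at about 1,300 K peaks deep in the infrared, with only the red tail of its emission visible. Hotter really does mean bluer.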

There’s a Reason Your Camera Uses Red, Green, and Blue

It’s a common question: why do photographic sensors and computer monitors use red, green, and blue to reproduce colors, instead of red, orange, yellow, green, blue, indigo, and violet? It is the difference between science and engineering. The simplest engineering solution for producing white is to combine just those three colors; it would be difficult and expensive to combine all seven colors of the spectrum on a computer screen or camera sensor.

As with everything else in photography, there are compromises. The three primary colors (red, green, and blue) used by your camera and computer screen cannot reproduce the full range of colors our eyes can see.

Even that isn’t as simple as it first seems, because the color palette varies from device to device. The most accurate version is the virtual image: what your camera records when shooting raw and what your editing software understands your image to be. There is a more limited version that appears on your computer screen, or in the camera if it’s a JPEG, and the version you can print is different again.

We want these three versions to match, and we use color management to achieve this. Color management is the process of defining the maximum and minimum values for red, green, and blue. Whole books have been written about it, and there’s far too much to include in this article, but if you don’t shoot raw, you should at least set your camera, printer, and screen to the same color space.

The most commonly used of these color spaces is sRGB. Adobe RGB, which has more colors and is used for high-end printing, is also a common profile, and ProPhoto RGB offers an even wider range of colors, wider than most printers can reproduce. But the world has changed: the printers I now use have color profiles specific to each print type and paper, and these provide the highest color accuracy.

For most photographers, it is sufficient to remember to use a color space no larger than your device’s gamut.

What Is the Difference Between Subtractive and Additive Light?

When we mix projected lights, red, green, and blue are the primary colors, and they produce the secondary colors: magenta, cyan, and yellow. This light is additive, so mixing green and red produces yellow, and a huge range of colors can be created by mixing the colored lights in different proportions.
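Additive mixing is just per-channel addition, which a few lines can demonstrate (using simple 0–255 RGB triples as the model):

```python
def mix_additive(*lights):
    """Additive mixing of RGB lights: channel values add, clipped to 255."""
    return tuple(min(255, sum(light[i] for light in lights)) for i in range(3))

RED, GREEN, BLUE = (255, 0, 0), (0, 255, 0), (0, 0, 255)
print(mix_additive(RED, GREEN))        # (255, 255, 0) -> yellow
print(mix_additive(RED, BLUE))         # (255, 0, 255) -> magenta
print(mix_additive(RED, GREEN, BLUE))  # (255, 255, 255) -> white
```

All three primaries at full strength add up to white, exactly as on a screen or sensor.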

When we use printer inks to remove colors from white, it’s a different story. Inks absorb some light and reflect the rest. Mixing cyan, magenta, and yellow inks in different proportions, plus black, gives us a different range of colors. Color managers call this range the gamut.
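Subtractive mixing works the other way: each ink filters the light, so reflectances multiply rather than add. A sketch using idealised inks (the reflectance triples are simplified assumptions, real inks are messier):

```python
def mix_subtractive(*inks):
    """Subtractive mixing: each ink filters the light, so per-channel
    reflectances (0.0-1.0 for R, G, B) multiply together."""
    reflectance = [1.0, 1.0, 1.0]  # start from white paper
    for ink in inks:
        reflectance = [r * i for r, i in zip(reflectance, ink)]
    return tuple(round(255 * r) for r in reflectance)

# Idealised ink reflectances: cyan absorbs red, magenta absorbs green,
# yellow absorbs blue.
CYAN, MAGENTA, YELLOW = (0, 1, 1), (1, 0, 1), (1, 1, 0)
print(mix_subtractive(CYAN, YELLOW))           # (0, 255, 0) -> green
print(mix_subtractive(CYAN, MAGENTA, YELLOW))  # (0, 0, 0)   -> black
```

All three idealised inks together absorb everything, which is why CMY mixes toward black (and why printers add a real black ink, K, for dense shadows).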

By using a single color space, we ensure that we only use colors where the two gamuts intersect. We would get strange results if we tried to display or print colors outside the capabilities of the screen or printer.

You Can’t Control the Color of Everything

In a similar vein, you’ll find controls on your screen to adjust, at a minimum, brightness and contrast, and so will everyone else. Calibrating your screen is an important step in ensuring your prints’ colors and tones match what you see. Of course, when you share an image online, most other people won’t have calibrated screens, so your images could appear darker or brighter, more or less saturated, and with different contrast. There is nothing you can do about that. But if you plan to print your photos and want accurate results, or if other photographers will be sharing photos with you, calibrating your monitor is essential.

Read More

This article is, of course, just scratching the surface, and there is plenty more to be learned about each topic I’ve covered. Fstoppers has over 33,000 articles that contain a wealth of knowledge, and many of them go deeper into the topics I’ve briefly discussed here.
