A Brief History of Microscopes and Microscopy


Podcast Transcript

Ever since humans could see, we’ve been able to look up at the night sky and see things light-years away.

However, for almost that entire time, we had no idea that right in front of us, there was another world so small that we couldn’t see. 

That world was first unveiled in the 17th century, and since then, we have developed the ability to see ever smaller things. 

Learn more about the history of microscopes and microscopy on this episode of Everything Everywhere Daily.

Human eyesight is designed to observe things at a human scale. These could be things we hold in our hands or larger things in the landscape, such as trees, buildings, or mountains. 

If you look closely at one of your fingers, depending on your eyesight, of course, you can probably discern the individual ridges in your fingerprints. You can see the width of a hair, especially if it contrasts with the background you see it against. 

Roughly speaking, the smallest thing we can see with our naked eyes is approximately one-tenth of a millimeter. In theory, we could see an even smaller object if it emits light, but most objects of that size do not emit light. 

The study of objects too small to be seen with the naked eye is known as microscopy. Even though people didn’t know it at the time, microscopy goes back a long way.

The earliest known object which may have been used for magnification was known as the Nimrud Lens. It was discovered in 1850 in the Nimrud Palace in modern-day Iraq and dates back to the Assyrian Empire in the 8th century BC.

The lens is nothing more than a circular piece of quartz that provides a crude 3x magnification. It does appear to have been ground by hand, but we don’t know exactly what it was used for. While it does magnify, it might have been used to start fires using the sun, which would have been a more practical use at the time for such a device. 

The ability of quartz crystal to magnify and bend light has been known for thousands of years. However, a practical application didn’t emerge until the 9th century. An Islamic scholar named Abbas ibn Firnas is credited with the development of the reading stone. 

A reading stone was nothing more than a polished dome-shaped rock crystal or a glass sphere cut in half that could be held over text to make it easier to read. They became popular in Europe around the year 1000, and you can actually still buy them today. If you search on Amazon for “dome magnifier,” you will find what is basically a modern-day reading stone. 

There are also claims, but so far no archeological evidence, that simple one-lens magnifying glasses might have been used in the 10th century in China. 

Reading stones only lasted until the 14th century, when they were replaced by spectacles. Spectacles improved on reading stones and spurred the development of lenses and the lens industry. 

The first spectacles and lens-grinding industry developed in the late 13th century in Venice and Florence. 

These early spectacles were just dual magnifying glasses that sat on your face. They were a one-size-fits-all product, regardless of your vision. 

Single-lens magnification improved as the art of lens grinding developed over the centuries. 

Eventually, it dawned on someone: what if you magnified something that was already magnified? In other words, what if you put two lenses in a row so that one lens magnified what the other lens magnified?

An instrument with lenses placed in series became known as a compound microscope. 

It isn’t known who developed the compound microscope. Credit is sometimes given to the Dutch father/son team of lens makers Hans and Zacharias Janssen, who would have invented it in 1590. However, Zacharias only claimed to have invented the microscope decades after they became popular, and much about his story doesn’t add up.

Another person who is credited is Galileo, who, in 1610, found that turning his telescope around allowed him to view extremely small objects. In 1624, Galileo submitted a proposal for a compound microscope to the Accademia dei Lincei. 

The word microscope was coined by Giovanni Faber, who named Galileo’s submission. He combined the Greek words mikron, meaning “small,” and skopein, meaning “to see.”

Regardless of who first invented it, by 1630, microscopes were regularly appearing as products from lens makers, along with simple telescopes. 

The first cells were observed in 1665 by the English scientist Robert Hooke, who saw the cellular structure of cork under a microscope. He also coined the term “cell.”

For the most part, in the early 17th century, microscopes were novelty items. You could look at a flea or a fly close up, and that was about it.

The person who popularized the use of microscopes in biology and who developed the science of microscopy was the Dutch scientist Antonie van Leeuwenhoek. 

Van Leeuwenhoek, who did much of his research in the late 17th and early 18th centuries, wasn’t just an observer; he also made the finest microscopes in the world at that time. He actually used a simple one-lens microscope, not a compound microscope, as many people assume. He was able to achieve 270x magnification with his handmade devices. 

For the first time in history, he observed living single-cell organisms. He dubbed these small creatures “animalcules.”

He was the first person to see red blood cells. He observed individual sperm cells and made a huge step toward unlocking how reproduction works.

He looked at drops of water from ponds and rivers and discovered an entire ecosystem that lived in the drop of water. 

Oddly enough, microscope technology stagnated for about 150 years. Problems with compound microscopes prevented them from achieving the levels of resolution that van Leeuwenhoek was able to achieve with his simple one-lens microscopes.

Lenses did improve over this time, but there was an issue with chromatic aberration, which occurs when different colors of light do not focus at the same point. 

In 1824, English optician Joseph Jackson Lister developed the achromatic lens. 

Advancements were made that allowed for finer focusing. The American John Leonard Riddell invented the first binocular microscope, which allowed viewing with both eyes through a single objective lens.

Henry Crouch and Charles Chevalier developed the oil immersion technique, which allowed for even greater resolution of images. 

The oil immersion technique involves placing a drop of specialized immersion oil between the microscope objective lens, which is the bottom lens in a microscope, and the specimen. This oil has a refractive index closely matching that of glass, minimizing light refraction and increasing the numerical aperture, which enhances resolution and clarity.
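The effect described above can be sketched with the standard Abbe diffraction formula, d = λ / (2·NA), where the numerical aperture NA = n·sin(θ) depends on the refractive index n of the medium between the objective and the specimen. The wavelength and aperture half-angle below are illustrative assumptions, not figures from this episode.

```python
import math

def abbe_limit_nm(wavelength_nm: float, n: float, half_angle_deg: float) -> float:
    """Smallest resolvable feature size, in nanometers.

    Abbe limit: d = wavelength / (2 * NA), with NA = n * sin(theta).
    """
    na = n * math.sin(math.radians(half_angle_deg))
    return wavelength_nm / (2 * na)

# Green light (~550 nm) and a wide-aperture objective (72-degree half-angle).
dry = abbe_limit_nm(550, 1.00, 72)   # air between the lens and the specimen
oil = abbe_limit_nm(550, 1.52, 72)   # immersion oil matching glass (n ≈ 1.52)

print(f"dry: {dry:.0f} nm, oil immersion: {oil:.0f} nm")
```

Because the oil raises the refractive index from about 1.0 to about 1.52, the numerical aperture rises by the same factor, and the resolvable feature size shrinks proportionally.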

In 1893, the German scientist August Köhler developed Köhler illumination, which allows for even illumination of a subject without the bulb’s filament being visible in the image.

Optical microscopes are an important part of any medical lab and are still used extensively today. 

However, there was a problem. Optical microscopes were limited by the wavelength of the light used for observation. Tests done in the late 19th century used ultraviolet light for microscopy. It doubled the resolution of optical microscopes, but it required quartz optics instead of glass and was extremely expensive and impractical. 

It was thought that microscopes might be limited to only viewing things with a resolution of one micron, or one-millionth of a meter, or one-thousandth of a millimeter. 

However, advances in quantum physics showed that electrons have a wavelength, just like light. The German physicist Ernst Ruska conceived the idea of using electrons instead of light for microscopy, as electrons have much shorter wavelengths.
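Just how much shorter those wavelengths are can be estimated with the de Broglie relation λ = h / √(2·mₑ·e·V), a non-relativistic approximation for an electron accelerated through a voltage V. The constants are standard physics; the 100 kV example voltage is an illustrative assumption, typical of a transmission electron microscope.

```python
import math

H = 6.626e-34    # Planck's constant, J*s
M_E = 9.109e-31  # electron rest mass, kg
E = 1.602e-19    # elementary charge, C

def electron_wavelength_pm(volts: float) -> float:
    """De Broglie wavelength in picometers (non-relativistic approximation)."""
    wavelength_m = H / math.sqrt(2 * M_E * E * volts)
    return wavelength_m * 1e12

# An electron accelerated through 100 kV has a wavelength of roughly 3.9 pm,
# more than 100,000 times shorter than the ~550,000 pm of green light.
print(f"{electron_wavelength_pm(100_000):.1f} pm")
```

Higher accelerating voltages give shorter wavelengths, which is one reason electron microscopes can resolve so much finer detail than any optical instrument.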

In 1931, Ruska, along with Max Knoll, built the first electron microscope, which used a magnetic coil to focus electron beams. Subsequent versions of their electron microscope surpassed the resolution of optical microscopes by 1933. This type of electron microscope is known as a Transmission Electron Microscope, or TEM.

Ruska was awarded the Nobel Prize in Physics in 1986, but Knoll passed away in 1969 and was not eligible. 

Their work resulted in the first commercial electron microscope sold in 1939 by the Siemens corporation. 

The 1940s saw improvements in the design of the Transmission Electron Microscope, including better electromagnetic lenses and better electron beams, which allowed for resolutions down to 2.4 angstroms, or 0.24 nanometers.

Transmission Electron Microscopes were a huge improvement over optical microscopes, but they still had limitations. Because they shoot a beam of electrons through a sample, they have a very limited depth of field. They are good for looking at the internal structure of samples, but not the surface. 

Many of the problems of Transmission Electron Microscopes were solved with the development of Scanning Electron Microscopes. 

A Scanning Electron Microscope uses a focused beam of electrons to scan the surface of a specimen. It produces high-resolution, three-dimensional images of the specimen’s surface, making it valuable for examining surface topography in various scientific and industrial applications.

The Scanning Electron Microscope was theorized in the 1930s, but the first practical one was developed by a team led by the British-American physicist Albert Crewe at Argonne National Laboratory in Illinois.

One downside of a Scanning Electron Microscope is that it requires coating non-conductive specimens with a conducting layer, whereas a Transmission Electron Microscope does not.

If you see a detailed image of something very small with high resolution that looks like it was taken with a camera, it was probably imaged with a scanning electron microscope.

These are not the only types of advanced non-optical microscopes. Field ion microscopes, developed in the 1950s, produced the first images of individual atoms. When combined with a mass spectrometer, the result was what became known as an atom probe. 

The scanning tunneling microscope was developed in the 1980s. It operates by measuring the quantum tunneling of electrons between a sharp metal tip and a surface, allowing researchers to visualize and manipulate nanoscale structures with extraordinary precision.

The inventors of the scanning tunneling microscope, Gerd Binnig and Heinrich Rohrer, also shared the 1986 Nobel Prize in Physics. 

X-ray microscopes have also been developed. X-rays have a wavelength between that of visible light and that of electrons. 

Cryogenic electron microscopy was developed, which allows for imaging of biomolecular structures at near-atomic resolution. The 2017 Nobel Prize in Chemistry was awarded for this technique. 

One of the highest-resolution images of atoms in a crystal was taken in 2018 by a team from Cornell University. They used a technique called ptychography, which doesn’t even use a lens. Rather, a computer reconstructs an image from the scattering of electrons. 

Currently, the most powerful microscope in the world is the TEAM 0.5 microscope at the National Center for Electron Microscopy at Lawrence Berkeley National Laboratory. TEAM stands for Transmission Electron Aberration-Corrected Microscope.

It has the ability to resolve images down to half an angstrom, or about half the width of a hydrogen atom. 

Microscopy has come a long way in just the last 150 years. We’ve gone from observing individual cells to individual atoms. Microscopes have gotten better and cheaper. You can now purchase a digital microscope with a thousand-fold magnification that will display on your smartphone for under $30. 

Microscopes are used in a variety of disciplines today, including forensic science, evaluating gemstones, diagnosing diseases, and analyzing fractures in metal.

Much of what we know about biology, chemistry, and atoms comes from the field of microscopy.

The Executive Producer of Everything Everywhere Daily is Charles Daniel.

The associate producers are Peter Bennett and Cameron Kieffer.

It is something I don’t like doing, but sometimes I have to do it. In my episode on the Roman Empire, I said that the Emperor in 284 who ushered in the Dominate was Domitian. It was, of course, Diocletian. 

Domitian was a horrible emperor who ruled from the years 81 to 96, well before Diocletian came along. 

Mea culpa, mea culpa, mea maxima culpa.