U of T physicists discovered a way to increase the resolution of microscopes and telescopes
University of Toronto researchers have found a way to increase the resolution of microscopes and telescopes beyond long-accepted limitations by tapping into previously neglected properties of light.
The method allows observers to distinguish very small or distant objects that are so close together they normally meld into a single blur.
The research appears in the journal Physical Review Letters.
Telescopes and microscopes are great for observing lone subjects, but the laws of physics cause light to spread out, or diffract, as it travels. With an object like a binary star, two stars that are close together may appear at a distance as a single blurry dot, and their individual information is irrevocably lost.
Part of the problem is a limitation known as the Rayleigh Criterion.
More than 100 years ago, British physicist John William Strutt, better known as Lord Rayleigh, established the minimum distance between objects necessary for a telescope to pick out each one individually. The Rayleigh Criterion has stood as an inherent limitation of the field of optics ever since.
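For context, the criterion for a circular aperture is usually written in the standard textbook form below; the formula is added here for reference and does not appear in the article itself. The smallest resolvable angular separation grows with the wavelength of the light and shrinks as the aperture gets larger.

```latex
% Standard Rayleigh Criterion for a circular aperture (textbook form, added for context):
% \theta_{\min} is the minimum resolvable angular separation, \lambda the wavelength
% of the light, and D the diameter of the telescope or microscope aperture.
\theta_{\min} \approx 1.22\,\frac{\lambda}{D}
```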
Telescopes, though, register only light's intensity, or brightness. Light has other properties that now appear to allow observers to circumvent the Rayleigh Criterion.
"To beat Rayleigh's curse, you have to do something clever," says Professor Aephraim Steinberg, a physicist at U of T's Centre for Quantum Information and Quantum Control and a senior fellow in the quantum information science program at the Canadian Institute for Advanced Research.
"We measured another property of light called phase. And phase gives you just as much information about sources that are very close together as it does about those with large separations."
Light travels in waves, and all waves have a phase. Phase refers to the location of a wave's crests and troughs. Even when a pair of close-together light sources blurs into a single blob, information about their individual wave phases remains intact. You just have to know how to look for it.
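In standard notation (added here for illustration rather than quoted from the researchers), a simple wave can be written with its phase as an explicit offset, which fixes where the crests and troughs sit:

```latex
% A simple wave in standard notation (illustrative, not drawn from the article):
% A is the amplitude, k the wavenumber, \omega the angular frequency, and
% \varphi the phase, which shifts the positions of the crests and troughs.
E(x, t) = A \cos(k x - \omega t + \varphi)
```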
This realization was published by National University of Singapore researchers Mankei Tsang, Ranjith Nair, and Xiao-Ming Lu last year in Physical Review X. Researchers like Steinberg and his team immediately set about devising a variety of ways to put it into practice.
"We tried to come up with the simplest thing you could possibly do," Steinberg says. "To play with the phase, you have to slow a wave down, and light is actually easy to slow down."
His team, including PhD students Edwin (Weng Kian) Tham and Hugo Ferretti, split test images in half. Light from each half passed through glass of a different thickness, which slowed the waves for different amounts of time, changing their respective phases. When the beams recombined, they created distinct interference patterns that told researchers whether the original image contained one object or two at resolutions well beyond the Rayleigh Criterion.
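The sketch below is a rough numerical toy model of the general idea, not a reproduction of the team's optical setup: it assumes a Gaussian stand-in for the blur (the point-spread function) and an illustrative phase-flip-and-project measurement, with the names psf_amplitude, dark_port_power, and the separation d invented for the example. It shows that two sources separated by a small fraction of the blur width produce an intensity image nearly identical to that of a single source, while a phase-sensitive interference signal cleanly distinguishes the two cases.

```python
# Toy numerical sketch of the general idea (illustrative only; not the team's
# apparatus, and psf_amplitude / dark_port_power / d are invented for this example):
# two sources closer than the blur width give an intensity image almost identical
# to one source, but a phase-sensitive projection separates the cases.
import numpy as np

x = np.linspace(-5.0, 5.0, 4001)   # image-plane coordinate (arbitrary units)
dx = x[1] - x[0]
sigma = 1.0                        # width of the Gaussian stand-in for the blur
d = 0.2 * sigma                    # source separation, well below the blur width

def psf_amplitude(center):
    """Gaussian amplitude point-spread function centered at `center`, unit power."""
    g = np.exp(-(x - center) ** 2 / (4.0 * sigma ** 2))
    return g / np.sqrt(np.sum(g ** 2) * dx)

# Intensity-only images (incoherent sources, so intensities add).
one_source = psf_amplitude(0.0) ** 2
two_sources = 0.5 * (psf_amplitude(-d / 2) ** 2 + psf_amplitude(+d / 2) ** 2)
print(f"max intensity difference: {np.max(np.abs(one_source - two_sources)):.1e}")

# Phase-sensitive signal: flip the sign of the field on one half of the image
# plane (a pi phase shift), then project onto the centered blur profile.
# A single centered source cancels exactly; two offset sources leak power through.
flip = np.sign(x)
reference = psf_amplitude(0.0)

def dark_port_power(centers):
    amps = (np.sum(flip * psf_amplitude(c) * reference) * dx for c in centers)
    return sum(a ** 2 for a in amps) / len(centers)

print(f"phase-sensitive signal, one source : {dark_port_power([0.0]):.1e}")            # ~ 0
print(f"phase-sensitive signal, two sources: {dark_port_power([-d / 2, d / 2]):.1e}")  # > 0
```

In this toy model, the sign flip plays the role of the relative delay the glass introduces: a single centered source cancels exactly at the projected output, while two offset sources do not, no matter how small their separation.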
So far, Steinberg's team has tested the method only in artificial situations involving highly restrictive parameters.
"I want to be cautious: these are early stages," Steinberg says. "In our laboratory experiments, we knew we just had one spot or two, and we could assume they had the same intensity. That's not necessarily the case in the real world. But people are already taking these ideas and looking at what happens when you relax those assumptions."
The advance has potential applications both in observing the cosmos and in microscopy, where the method could be used to study bonded molecules and other tiny, tightly packed structures.
Regardless of how much phase measurements ultimately improve imaging resolution, Steinberg says the experiment's true value is in shaking up physicists' conception of where information actually is.
Steinberg's day job is in quantum physics; this experiment was a departure for him. He says work in the quantum realm provided key philosophical insights about information itself that helped him beat Rayleigh's curse.
"When we measure quantum states, you have something called the Uncertainty Principle, which says you can look at position or velocity, but not both," he says. "You have to choose what you measure. Now we're learning that imaging is more like quantum mechanics than we realized. When you only measure intensity, you've made a choice, and you've thrown out information. What you learn depends on where you look."
Support for the research was provided by the Natural Sciences and Engineering Research Council of Canada, the Canadian Institute for Advanced Research, and Northrop Grumman Aerospace Systems NG Next.