The representation of the lion at the Make in India show at the Hannover Messe in 2015 was a spectacular display using augmented reality technology. This and allied methods have changed the character of teleconferencing, advertising and entertainment. They use computer generated audio and video effects that recreate the environment of distant places. They also relay images of the reactions of participants so that they can respond to each other as if they were face to face.
The reality these systems create on a screen, however, is not true 3D, even though views from all angles are readily available and the images are of high quality. Objects can be rotated and the viewer can move behind corners, but the screen itself looks the same from whichever angle it is seen. Imaging that creates true 3D, where each eye, without the use of special glasses, sees a distinct image and the mind perceives depth, has been possible only with the hologram, which works with the help of lasers.
The trouble with holography, however, is that holograms are static pictures that take time to create. This makes holography unsuitable for important applications, such as transmitting images of medical or surgical procedures, which need to show moving pictures.
Professor Nasser Peyghambarian and his group at the University of Arizona published, in the journal Nature, a method of creating a fresh hologram every two seconds. This may have been the first step in speeding up the process enough for true 3D, moving representation.
Ritesh Agarwal and others at the University of Pennsylvania now report in Nano Letters, the journal of the American Chemical Society, a medium that switches between three hologram images on being stretched. This could help design new displays and speed up display or transmission.
An improved method of display, currently in use, is virtual reality, where the user wears a headset to experience real depth and intense “immersion” in the target locale. The headset’s goggles present separate images to each eye, and the images may be real-life video or computer-processed animation.
The image has depth, like a real-life view, but is still not true 3D, as one cannot move around to get a view from a different angle. Virtual reality has found success in personal entertainment but not in practical use, as it does not allow users to communicate, or to participate in the action they are viewing.
Another development is augmented reality, which is not real 3D, but different views of real objects, or computer creations, which are projected on transparent screens. With special optical and sound effects, and the images being rapidly refreshed by computers, the impression is created of the viewer being surrounded by objects, movement, buildings or a landscape.
This was the technique used to create the illusion of a lion from India walking among the audience, and then morphing into a mechanical assembly of machine parts, at the opening ceremony of the Hannover Messe. The same class of techniques also allowed Prime Minister Narendra Modi to present himself during his election campaign, apparently in person, before dozens of far-flung audiences at the same time.
In the field of engineering too, 3D software takes a set of 2D engineering drawings and creates a 3D image of the finished product, be it a building, bridge or tower. The software then allows the image to be rotated and turned around, to provide multiple views.
The viewer can also zoom in to any specific part, for a closer look, and so on. The software even allows the user to make changes in the 3D view, so that corresponding changes are carried out in all the related 2D drawings, for use by engineers.
True 3D, an image that looks like a real set of objects with depth, so that when a viewer moves her head one object moves in front of or behind another, is, however, created only with the help of the hologram.
The hologram works by capturing not the image of an object, as received at a sensor or set of sensors, but the wave front of the light waves that emerge from the object. If the same wave front is then recreated, there is no way any sensor, even a pair of eyes, can tell that what it sees is not the real object itself.
Recording the wave front, however, is easier said than done, because the light that falls on things is a mixture of light waves that are in all stages of wave motion and there is no single wave front to capture. This problem can be overcome by illuminating the scene to be captured with laser light.
While the waves of the laser are synchronised, or “in step”, there is still the problem of capturing a wave front. This is overcome by bringing in two sets of light waves, those that come directly from the source laser and the waves reflected from the objects, to fall on a light-sensitive screen.
There is thus the interaction of two wave fronts that arrive at the screen. At each different point on the screen, the two wave fronts would either add, to get stronger, or cancel, to get weaker. The screen is hence covered with a pattern of dark and bright portions, like a bar code, and this distribution captures the relationship of the illuminating light and the light that has been reflected by the objects illuminated.
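The addition and cancellation described above can be sketched numerically. The snippet below is a minimal illustration, not the method of either research group: it adds a tilted reference wave and a wave from a single point object at each position on a screen, and shows that the resulting intensity swings between near-total darkness and full brightness, the fringe pattern that a holographic plate records. The wavelength, tilt angle and distances are invented for the example.

```python
# Illustrative sketch: two coherent waves meeting on a screen produce
# bright and dark fringes. All numbers below are assumed for illustration.
import cmath, math

WAVELENGTH = 633e-9            # a typical red laser line, in metres (assumed)
K = 2 * math.pi / WAVELENGTH   # wave number

def intensity_at(x):
    """Intensity at position x (metres) on the screen, where a plane
    reference wave, tilted by 1 degree, meets the wave from a point
    object placed 0.1 m behind the screen."""
    reference = cmath.exp(1j * K * x * math.sin(math.radians(1)))
    obj = cmath.exp(1j * K * math.sqrt(x**2 + 0.1**2))
    return abs(reference + obj) ** 2   # the waves add first, then we square

# Sample 2 mm of the screen at 1-micron steps: the two unit-amplitude
# waves cancel (intensity near 0) at dark fringes and reinforce
# (intensity near 4) at bright ones.
samples = [intensity_at(i * 1e-6) for i in range(2000)]
print(round(min(samples), 2), round(max(samples), 2))
```

The alternation between near-zero and near-maximum intensity is the “bar code” of dark and bright portions the article describes, and it is this pattern, not an ordinary photograph, that encodes the object’s wave front.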
Now, if this screen, with its pattern of dark and bright parts, is illuminated again by a beam from the same laser, the wave front that emerges reproduces the one that originally played upon the screen. A person who looks at the screen thus sees the same, original wave front, and hence the same objects as before. And as the pattern on the screen, and hence the wave front, does not depend on where the viewer is located, any viewer would see the original objects as if they were physically there.
The problem, however, is that it takes time to print the pattern, which is the hologram, on the screen, and the pattern is static. To show motion, a new hologram needs to be created every sixteenth of a second, and the holograms run through at the same rate, so that the eyes see continuous motion.
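The gap between what the text asks for and what has been demonstrated can be checked with simple arithmetic, using only the two figures quoted above: sixteen holograms per second for smooth motion, against the Arizona group’s one hologram every two seconds.

```python
# Arithmetic check on the figures quoted in the text.
frames_per_second = 16        # rate the text gives for continuous motion
arizona_interval = 2.0        # seconds per hologram in the Nature report

required_interval = 1 / frames_per_second     # 0.0625 s per hologram
speed_up_needed = arizona_interval / required_interval
print(speed_up_needed)   # → 32.0
```

In other words, hologram creation would have to become about thirty-two times faster than the two-second result for holographic video at the rate the text describes.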
The writer can be contacted at response@simplescience.in