Stream Of Panoramic Audio
Since 26 June, 2012 7:16
Last update: 11 April, 2017
In real life, the perceived locations of sound sources are rarely static. When listeners turn their heads to the left or right in a real-world environment, the flow of sound from their viewpoint changes instantly. Is it possible to transmit that kind of sensation online?
Our answer is: 'Yes, it is.'
On this site, we introduce our technology for the online transmission of spatial audio information. The technology is named 'SOPA', which stands for 'Stream Of Panoramic Audio', and its features can be summarized as follows.
A unique feature of this technology is that it allows for interactive manipulation of panning.
In real-world situations, the locations of sound sources vary according to the movement of the listener's head. If a listener who is directly facing a sound source, for instance a loudspeaker, turns his or her head to the left or right, the sound no longer arrives from the front. Head movements can therefore be interpreted as movements of the world from the listener's viewpoint. This never occurs when listening to music through headphones, since the headphones rotate along with the head.
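This geometry can be sketched in a few lines of code. The sketch below is illustrative only (the class and method names are ours, not part of SOPA): it computes the direction of a source relative to the listener as the source's world azimuth minus the head's yaw, wrapped into (-180°, 180°].

```java
public class RelativeAzimuth {
    // Wrap an angle in degrees into the range (-180, 180].
    public static double wrap(double deg) {
        double a = deg % 360.0;
        if (a <= -180.0) a += 360.0;
        if (a > 180.0) a -= 360.0;
        return a;
    }

    // Source direction as heard by the listener:
    // world azimuth of the source minus the yaw of the head.
    public static double relativeAzimuth(double sourceAzimuthDeg, double headYawDeg) {
        return wrap(sourceAzimuthDeg - headYawDeg);
    }

    public static void main(String[] args) {
        // Turning the head 90 degrees toward a frontal source shifts the
        // source 90 degrees to the other side from the listener's viewpoint.
        System.out.println(relativeAzimuth(0.0, 90.0));   // prints -90.0
        System.out.println(relativeAzimuth(270.0, 45.0)); // prints -135.0
    }
}
```

Turning the head is thus equivalent to rotating the whole sound field in the opposite direction, which is exactly the 'movement of the world' described above.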
SOPA makes it possible to control the panning of the sound during its reproduction.
SOPA data contain not only an audio data stream but also its spatial information. The spatial information can be used to produce directionality, so that the listener can pick up sounds from any desired direction while suppressing sounds from other directions.
The listener can control the target direction while listening to the sound.
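Since the SOPA format leaves reproduction open to developers, the following is only one possible sketch of such directional selection, under the assumption that each sound component is tagged with an azimuth (all names here are hypothetical): a cardioid-style gain passes components near the target direction and attenuates components arriving from elsewhere.

```java
public class DirectionalGain {
    // Cardioid-style gain: 1.0 for a component in the target direction,
    // falling smoothly to 0.0 for a component in the opposite direction.
    public static double gain(double componentAzimuthDeg, double targetAzimuthDeg) {
        double diff = Math.toRadians(componentAzimuthDeg - targetAzimuthDeg);
        return 0.5 * (1.0 + Math.cos(diff));
    }

    public static void main(String[] args) {
        System.out.println(gain(0.0, 0.0));   // prints 1.0 (on target)
        System.out.println(gain(180.0, 0.0)); // prints 0.0 (opposite side)
    }
}
```

Because the target azimuth is just a parameter of the gain function, it can be changed at any moment during playback, which is what allows the listener to steer the 'listening direction' interactively.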
Spatial audio information captured by the miniature head simulator system can be encoded into a custom data format. Although its size is equivalent to that of a conventional stereo WAV file, a SOPA file contains not only the acoustic data stream but also three-dimensional spatial information.
SOPA data can be decoded using only simple trigonometric functions and the Fast Fourier Transform. Since this process requires few computing resources, a SOPA file can be reproduced not only on desktop computers but also on mobile devices.
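The actual SOPA decoder is not reproduced here, but the kind of trigonometry involved can be illustrated with a standard equal-power pan law, assuming (hypothetically) that each component of the spectrum carries an azimuth to be rendered into two output channels:

```java
public class EqualPowerPan {
    // Equal-power pan law: map an azimuth (-90 deg = full left,
    // +90 deg = full right) to left/right gains whose squares sum to 1,
    // so the perceived loudness stays constant across the pan range.
    public static double[] panGains(double azimuthDeg) {
        // Clamp to the frontal half-plane for this simple sketch.
        double az = Math.max(-90.0, Math.min(90.0, azimuthDeg));
        double theta = Math.toRadians(az + 90.0) / 2.0; // 0 .. pi/2
        return new double[] { Math.cos(theta), Math.sin(theta) };
    }

    public static void main(String[] args) {
        double[] center = panGains(0.0);
        // At center, both channels receive 1/sqrt(2) of the signal.
        System.out.printf("L=%.4f R=%.4f%n", center[0], center[1]);
    }
}
```

Only a cosine and a sine per component are needed, which is why this kind of decoding imposes so little computational load.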
By using the SOPA format, spatial audio information can be streamed to browsers, smartphones, and tablets at a low bit rate.
A SOPA file contains only a monaural audio data stream and the directional information. How to reproduce the data is not specified anywhere in the SOPA file; how to reconstruct the space acoustically from a SOPA file is left open to developers.
Although all demonstrations presented on this site require stereo headphones, spatial audio information can also be reconstructed from a SOPA file for multi-channel loudspeaker systems.
As an example, we constructed a 6-channel loudspeaker system for reproducing SOPA data, illustrated in the figure below. The system is named the 'Panoramic Sound Spot System.'
Fig. 6-channel Panoramic Sound Spot System
Although this 6-channel loudspeaker system is rather small and intended for personal use, it could be expanded for theatrical performances.
A Java program for the 6-channel Panoramic Sound Spot System 'pss6tr' is available. You can download a jar file from pss6tr.jar (SOPA archives).
A Java program for the 4-channel Panoramic Sound Spot System 'pss4tr' is also available. You can download a jar file from pss4tr.jar (SOPA archives).
Here, we provide samples demonstrating the capabilities of SOPA.
Stereo headphones are required for the demonstrations.
The demo can be started by clicking the thumbnail below.
Once the image is loaded and the page says "Click to start" or "Tap to start," you can start reproducing the sound by clicking (or tapping) on the image.
If the instruction "Tap on the image" appears, tap the screen and the browser will start loading data. When the necessary data have been loaded, the page will say "Tap to start," and you can start reproducing panoramic sound by tapping the image.
During reproduction, the panning of the sound changes according to the motion of the mouse pointer.
If you are using a mobile device, pan and tilt can be controlled by rotating the device.
Thanks to Three.js (http://threejs.org/)