Innovative System Architecture for Spatial Volumetric Acoustic Seeing

Document Type

Conference Proceeding

Publication Date

6-15-2009

Abstract

Situational awareness is a critical issue for modern battlefield and security systems; improving it increases human performance and efficiency. Multiple research and development efforts are based on omni-directional (fish-eye) electro-optical and other-frequency sensor fusion systems that implement head-mounted visualization. However, the efficiency of these systems is limited by the perceptual limitations of the human eye-brain system. Humans naturally perceive the situation in front of them, but interpreting omni-directional visual scenes increases the user's mental workload, increasing fatigue and disorientation and requiring more effort for object recognition. Reducing this workload by making the perception of rear scenes intuitive is especially important in battlefield situations where a combatant can be attacked from multiple directions. This paper describes an experimental model of the system fusion architecture of Visual Acoustic Seeing (VAS) for representing a spatial geometric 3D model in the form of 3D volumetric sound. Current research in auralization points to the possibility of identifying sound direction. However, complete spatial perception requires identifying both the direction of and the distance to an object through volumetric sound; we initially assume that the distance can be encoded by the sound frequency. The chain object features ⇒ sensor ⇒ 3D geometric model ⇒ auralization constitutes Volumetric Acoustic Seeing (VAS). The paper describes the VAS experimental research on representing and perceiving spatial information by means of human hearing cues in more detail. © 2009 SPIE.
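As a minimal illustration of the distance-encoded-by-frequency assumption in the abstract, the sketch below renders a stereo tone whose pitch encodes an object's distance and whose interaural level difference encodes its azimuth. All parameters (sample rate, frequency range, maximum distance, constant-power panning for direction) are hypothetical choices for illustration and are not specified in the paper:

```python
import math

SAMPLE_RATE = 16_000             # Hz (assumed)
F_NEAR, F_FAR = 2_000.0, 200.0   # assumed mapping: nearer objects sound higher-pitched
D_MAX = 50.0                     # assumed maximum encoded distance, metres

def distance_to_frequency(distance_m: float) -> float:
    """Linearly map distance in [0, D_MAX] to a tone frequency."""
    d = max(0.0, min(distance_m, D_MAX)) / D_MAX
    return F_NEAR + d * (F_FAR - F_NEAR)

def render_cue(azimuth_rad: float, distance_m: float, duration_s: float = 0.1):
    """Render one stereo cue: frequency encodes distance, interaural
    level difference (constant-power pan) encodes azimuth.
    azimuth_rad runs from -pi/2 (hard left) to +pi/2 (hard right)."""
    freq = distance_to_frequency(distance_m)
    pan = (azimuth_rad + math.pi / 2) / math.pi      # 0 .. 1
    gain_l = math.cos(pan * math.pi / 2)
    gain_r = math.sin(pan * math.pi / 2)
    n = int(SAMPLE_RATE * duration_s)
    left, right = [], []
    for i in range(n):
        s = math.sin(2 * math.pi * freq * i / SAMPLE_RATE)
        left.append(gain_l * s)
        right.append(gain_r * s)
    return left, right
```

A full VAS system would instead drive an HRTF-based auralization engine from the 3D geometric model; this sketch only demonstrates the frequency-for-distance encoding idea.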

Publication Title

Proceedings of SPIE - The International Society for Optical Engineering
