Developing multisource information fusion and interactive systems is important for applications in virtual reality (VR) and augmented reality (AR). In particular, real-time understanding of human interaction intentions from sensor information has become a major technology trend in VR and AR systems.
VR provides completely virtual information, mainly to create a sense of immersion, whereas AR superimposes virtual scenes on the real environment and can therefore convey richer real-world information. In both VR and AR information-collection systems, the various sensors acting as interactive interfaces are key components.
At present, mainstream human-machine interaction technologies can be categorized into eye-movement tracking, mark-point tracking, optical-sensor, and tactile-sensor interaction systems.
Eye-movement tracking simulates real-world visual interaction, enabling menu operations by capturing changes in the eyeballs and surrounding features, or by projecting infrared light onto the iris to extract features. In mark-point tracking, the mark-point information is stored in advance, and image recognition is used to locate and identify the mark so that animation can be combined with the display. Optical and tactile sensors operate by translating optical signals (images) or mechanical signals into electrical signals. Compared with the first three technologies, tactile sensors offer high sensitivity, good dynamic performance, small size, and wearability, which is especially valuable for perceiving intuitive human commands in AR applications.
Along with the development of AR, portable, low-power, and low-cost tactile sensors have become the main trend for interactive tools. However, in AR applications such as virtual assembly, games, and AR home design, traditional tactile sensors are complex, bulky, and inconvenient.
Over the years, major research efforts have focused on motion-sensing mechanisms, both optical and mechanical, to optimize interactive systems. Mechanical sensing mechanisms can directly convert a mechanical trigger into an electrical signal for detecting motion, vibration, and physical touch. It remains difficult, however, to implement a lightweight, simple, and easy-to-use interactive system. Another common limitation is that most of these sensors require external power sources, which challenges their longevity and mobility.
Recently, self-powered sensors based on triboelectric nanogenerators (TENGs) have been extensively investigated and developed. By harvesting mechanical energy from the working environment, TENG-based sensors can operate independently of an external power source.
In particular, TENGs for tactile sensing have received increasing research interest in the last few years. TENGs can achieve high output power density, enabling diverse applications of tactile sensors in wireless systems, portable electronics, biomedical microsystems, and self-powered nanosensors. By transforming mechanical energy into electrical energy, self-powered TENG-based tactile sensors generate a voltage, current, or charge that indicates the properties of the applied force, i.e., its normal and shear components.
The device developed toward this goal is a self-powered, triboelectric-based virtual reality 3D-control sensor (VR-3D-CS) that couples contact electrification with electrostatic induction. The VR-3D-CS has a novel structure of two opposing touch spheres and separated sensing electrodes for sensing and controlling 3D force information. By moving the two spheres in the same or opposite directions, 3D space coordinates (X, Y, Z and θX, θY, θZ) can be detected and controlled. This is the first time six-axis directions in 3D space have been detected by a triboelectric mechanism.
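One way to picture how two touch points can yield six axes is a common-mode/differential decomposition: same-direction motion of the spheres is read as translation, opposite-direction motion as rotation. The sketch below illustrates only this idea; the function name and the averaging/differencing scheme are illustrative assumptions, not the authors' actual decoding algorithm.

```python
import numpy as np

def decode_6dof(left, right):
    """Combine the 3D force vectors sensed at the two touch spheres
    into a six-axis command (illustrative scheme, not the paper's).

    left, right: 3-element sequences (Fx, Fy, Fz) from each sphere.
    Common-mode motion -> translation (X, Y, Z);
    differential motion -> rotation (thetaX, thetaY, thetaZ).
    """
    left = np.asarray(left, dtype=float)
    right = np.asarray(right, dtype=float)
    translation = (left + right) / 2.0
    rotation = (left - right) / 2.0
    return translation, rotation

# Both spheres pushed along +X -> pure translation along X
t, r = decode_6dof([1, 0, 0], [1, 0, 0])
```

Pushing the spheres in opposite directions along an axis would instead produce a zero translation and a nonzero rotation component under this scheme.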
The device consists of two symmetric structures, each with a top semi-sphere acting as the touch point for operation, a bottom semi-sphere made of a galinstan-polydimethylsiloxane (PDMS) mixture serving as the positive triboelectric layer, and four separated electrodes covered with polytetrafluoroethylene (PTFE), the negative triboelectric layer. The galinstan-PDMS mixture and PTFE are chosen as the contacting layers because of their large work-function difference.
During characterization, a general force applied to the top semi-sphere produces normal and shear components simultaneously. The normal component presses the bottom semi-sphere into contact with the PTFE, generating output on the sensing electrodes. The shear component then compresses and extends the PDMS supporting membrane along the shear direction, generating output on the electrodes in that direction. When the external force is removed, the PDMS supporting membrane pushes the sphere back to its starting position. By analyzing the output voltage values on the four electrodes, the direction of the shear force can be determined.
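The shear-direction readout described above can be pictured as a differential comparison of opposing electrodes. The following minimal sketch assumes a hypothetical electrode layout at +X, -X, +Y, -Y and a linear relation between shear and voltage; it is not the paper's calibration procedure.

```python
import math

def shear_direction_deg(v_xpos, v_xneg, v_ypos, v_yneg):
    """Estimate the in-plane shear direction from the voltages on four
    electrodes at +X, -X, +Y, -Y (hypothetical layout and scaling).

    The electrode along the shear direction produces the larger output,
    so the differential signals give the in-plane angle.
    """
    dx = v_xpos - v_xneg
    dy = v_ypos - v_yneg
    return math.degrees(math.atan2(dy, dx)) % 360.0

# Shear mostly along +Y: the +Y electrode dominates -> angle near 90 degrees
angle = shear_direction_deg(0.1, 0.1, 2.0, 0.1)
```

With such a differential scheme, common-mode signals (e.g., from the normal force pressing all four electrodes equally) cancel out of the direction estimate.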
For normal-force detection, the sensor shows a linear range from 0 to 18 N, with open-circuit voltage, output-charge, and short-circuit current sensitivities of 3.6 V N⁻¹, 1.1 nC N⁻¹, and 11.1 nA N⁻¹, respectively. For shear-force detection, it can resolve the shear direction in steps as fine as 15°. The symmetric sensor modules are designed with 2 touch points and 8 sensing electrodes, and the capability to detect normal and shear forces is calibrated from the voltage values of the 8 electrodes. Combining the 8 components yields the space vector (X, Y, Z and θX, θY, θZ), so the sensor realizes 3D attitude control of an object in virtual space.
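Given the reported linear open-circuit voltage sensitivity of 3.6 V N⁻¹ over 0-18 N, a normal-force readout reduces to a division by the sensitivity. A minimal sketch, assuming ideal linearity and adding a clamp to the calibrated range (the clamp is our assumption, not from the paper):

```python
# Reported calibration values for the VR-3D-CS normal-force channel
V_SENSITIVITY = 3.6        # open-circuit voltage sensitivity, V per newton
FORCE_RANGE = (0.0, 18.0)  # linear detection range, N

def normal_force_from_voltage(v_oc):
    """Convert an open-circuit voltage reading (V) into a normal force (N),
    clamped to the calibrated linear range."""
    force = v_oc / V_SENSITIVITY
    return max(FORCE_RANGE[0], min(force, FORCE_RANGE[1]))

# e.g. a 36 V reading corresponds to roughly 10 N
f = normal_force_from_voltage(36.0)
```

The charge (1.1 nC N⁻¹) and current (11.1 nA N⁻¹) channels would follow the same pattern with their respective sensitivities.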
The VR-3D-CS is successfully demonstrated as an interactive interface in a virtual-assembly AR application. Given its self-powered mechanism, cost effectiveness, and easy implementation, the VR-3D-CS shows great potential for batteryless AR interfaces, robotics, and energy-saving applications.
For any inquiry, please contact Prof. Chengkuo Lee at the Center for Intelligent Sensors and MEMS, National University of Singapore (E-mail: elelc@nus.edu.sg).
These findings are described in the article entitled "Novel augmented reality interface using self-powered triboelectric based virtual reality 3D-control sensor," recently published in the journal Nano Energy. This work was conducted by Tao Chen, Mingyue Zhao, Qiongfeng Shi, Zhan Yang, Huicong Liu, Lining Sun, Jianyong Ouyang, and Chengkuo Lee from the National University of Singapore.