ISO/IEC 18038:2020 — Information technology — Computer graphics, image processing and environmental representation — Sensor representation in mixed and augmented reality
4 Concepts
4.1 Overview
This clause describes the concepts of sensor-based mixed and augmented reality, including definitions, objectives, sensor types, physical sensor representation, system functions for mixed and augmented reality (MAR), MAR objects, the MAR scene graph, and the MAR world. A mixed-reality world consists of a 3D virtual world and real-world sensors represented as 3D objects with their physical properties. As a simple example, the conceptual scene of a mixed-reality world is shown in Figure 1. It displays a heritage site represented by a 3D virtual world containing a global navigation satellite system (GNSS) sensor and a CCTV sensor. The virtual world models a real heritage location in a city, and the character represents a tourist. GNSS information is displayed for the tourist, and a real CCTV device is placed at its real physical location at the heritage site.
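The pairing of a 3D virtual world with sensors embedded at their physical poses can be illustrated with a minimal data-structure sketch. The following Python classes and field names are illustrative assumptions, not the normative terms or schema defined by this document; they only show a 3D virtual world holding sensor objects such as the GNSS and CCTV devices of Figure 1.

```python
# A minimal sketch (not the ISO/IEC 18038 schema) of a mixed-reality world that
# pairs a 3D virtual world with physical sensors represented as 3D objects.
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class SensorObject:
    """A real-world sensor embedded in the virtual scene at its physical pose."""
    name: str
    sensor_type: str                      # e.g. "GNSS", "CCTV"
    position: Tuple[float, float, float]  # location in scene coordinates
    orientation: Tuple[float, float, float, float]  # axis-angle rotation
    visible: bool = True                  # visual representation is optional


@dataclass
class MARWorld:
    """A 3D virtual world plus the sensors registered into it."""
    virtual_world: str                    # reference to the 3D model, e.g. a scene file
    sensors: List[SensorObject] = field(default_factory=list)


# The heritage-site example of Figure 1: a GNSS sensor attached to the tourist
# character and a CCTV device placed at its real physical location.
heritage_scene = MARWorld(
    virtual_world="heritage_site.x3d",
    sensors=[
        SensorObject("tourist_gnss", "GNSS", (0.0, 1.7, 0.0), (0, 1, 0, 0)),
        SensorObject("gate_cctv", "CCTV", (12.5, 4.0, -3.2), (0, 1, 0, 1.57)),
    ],
)
```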
Once real physical sensors are integrated into a 3D virtual world, their physical properties can be represented precisely in the virtual world. Sensor-based mixed reality is obtained by this convergence of the 3D virtual world with physical sensors in the real world. For sensor-based mixed reality, sensors in a 3D virtual world are defined, and their information should be transferable between applications, and between a virtual world and the real world. This document is intended to define how AR/MR application data are exchanged in heterogeneous computing environments, and how physical sensors can be managed and controlled, with their physical properties, in a 3D virtual world.
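One way such exchange can work is sketched below, assuming the SensorObject and MARWorld classes from the earlier sketch. The JSON layout shown is an assumption chosen for illustration of transferring sensor information between applications; it is not the exchange format specified by this document.

```python
# A minimal sketch of exchanging sensor state between applications in
# heterogeneous environments by serializing it to a neutral text format.
import json
from dataclasses import asdict


def export_sensors(world: MARWorld) -> str:
    """Serialize every sensor in the MAR world to a JSON document."""
    return json.dumps({"sensors": [asdict(s) for s in world.sensors]}, indent=2)


def import_sensors(world: MARWorld, payload: str) -> None:
    """Merge sensor records received from another application into the world."""
    for record in json.loads(payload)["sensors"]:
        world.sensors.append(SensorObject(**record))


print(export_sensors(heritage_scene))
```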
Physical sensors in the real world are many and varied [11][14][18]. In order to control them in a 3D scene, physical sensors are classified according to their information types and functions. Types of physical sensor devices include acoustic and sound; automotive and transportation; chemical; electric and magnetic; environment and weather; flow and fluid; radiation and particle; navigation; position and angle; speed and acceleration; optical and light; pressure, force and density; thermal and temperature; proximity and presence; and video.

Each sensor is represented as a physical device in a 3D scene, visually and/or functionally, depending on the application and the type of sensor. Figure 2 shows an indoor scene and an outdoor scene, each containing many physical sensors; each scene represents a corresponding real world. The information and function of each sensor can be represented in the scenes. MAR scenes with physical sensors can therefore be used to represent and simulate the functions of the sensors and to manage the sensors through the 3D scene. Such scenes can also be used for facility management in the real world [3].

This document focuses on how to represent physical sensors in a 3D scene, what to represent about each physical sensor, what each sensor can do, and why each sensor needs this specification. When a physical sensor is represented in a 3D scene, its visual appearance in the scene is optional, depending on the type of sensor and the application. The precise location and orientation of a sensor should be representable, and the units for each sensor should be specified. A 3D scene should be able to change in response to the function of a physical sensor and be simulated accordingly. Such representation is needed so that a 3D scene can manage and control various physical sensors, for information services or security purposes.
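The sketch below, again assuming the classes from the earlier sketches, illustrates these ideas in code: sensors grouped into categories by information type, each carrying a declared unit and a pose, and able to drive a change in the 3D scene when a reading arrives. The enumeration values and the update hook are illustrative assumptions, not definitions from this document.

```python
# A minimal sketch of sensor classification plus a sensor that changes the scene.
from enum import Enum, auto


class SensorCategory(Enum):
    ACOUSTIC_SOUND = auto()
    ENVIRONMENT_WEATHER = auto()
    NAVIGATION = auto()
    POSITION_ANGLE = auto()
    OPTICAL_LIGHT = auto()
    THERMAL_TEMPERATURE = auto()
    PROXIMITY_PRESENCE = auto()
    VIDEO = auto()
    # ... the remaining categories listed in the text follow the same pattern


class TemperatureSensor(SensorObject):
    """A thermal sensor whose reading, in a declared unit, updates the scene."""

    def __init__(self, name, position, orientation, unit="celsius"):
        super().__init__(name, "thermal", position, orientation, visible=False)
        self.category = SensorCategory.THERMAL_TEMPERATURE
        self.unit = unit

    def on_reading(self, value: float) -> None:
        # Example of a sensor's function changing the 3D scene: the marker for
        # this sensor is drawn only while the reading exceeds a threshold, so
        # the simulated scene reflects the sensor's current state.
        self.visible = value > 30.0
```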