Symposium on Mobile Graphics and Interactive Applications

Title
Mobile-based Streaming System for Omnidirectional Contents
Date
Tuesday, 03 November
Time
12:00 - 13:30
Location
Kobe Int’l Conference Center, Room 402, Level 4

Mobile-based Streaming System for Omnidirectional Contents


Many types of display system have been developed to convey a high sense of presence. However, such systems often require large installation spaces and tend to be expensive. Mobile devices such as smartphones and tablets, on the other hand, are now widespread, so it may be possible to build an immersive display system on mobile devices that users can experience at any time and in any place. In this context, we previously developed a prototype spatial audiovisual display system using multiple mobile devices; however, it could not yet share a remote audiovisual space in real time. In this study, we therefore implement a spatial streaming system based on our audiovisual display system.
Our audiovisual display system consists of mobile devices connected over a wireless LAN in a server-client configuration: iPads display the visual information and iPod touches play the sound. In addition to the audiovisual display system, the streaming system comprises an omnidirectional camera (Ladybug 2), two stereo microphones, and a streaming server. The streaming server captures omnidirectional images from the Ladybug 2 and sound data from the two stereo microphones, and broadcasts the captured video and sound, together with a playlist file, using the HTTP Live Streaming protocol.
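The playlist-based broadcast step can be illustrated with a minimal HTTP Live Streaming media playlist. The sketch below is not the authors' implementation; the segment names and durations are hypothetical, and only the playlist tags shown are standard HLS.

```python
# Minimal sketch of building an HLS media playlist (.m3u8) such as the
# one the streaming server broadcasts alongside the media segments.
# Segment names and durations here are hypothetical examples.

def build_hls_playlist(segments, target_duration=10):
    """Build an HLS media playlist from (filename, duration) pairs.

    Clients fetch this playlist first, then download the listed
    segments in order.
    """
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{target_duration}",
        "#EXT-X-MEDIA-SEQUENCE:0",
    ]
    for name, duration in segments:
        lines.append(f"#EXTINF:{duration:.1f},")  # segment duration in seconds
        lines.append(name)                        # segment URI
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    print(build_hls_playlist([("seg0.ts", 9.9), ("seg1.ts", 10.0)]))
```

In a live setting the server would regenerate this playlist as new segments are captured, which is what lets clients keep pulling fresh video and sound data over plain HTTP.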
After the streaming server has prepared the video and sound data, the visual server first downloads the playlist and then the segments it lists. The visual server separates the visual and sound data and transfers the sound data to the sound clients, which parse the sound format and play the segments one by one. For display, the visual server generates a virtual sphere model and repeatedly maps each omnidirectional image onto the inside of the sphere.
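The sphere-mapping step can be sketched as the standard equirectangular-to-sphere texture mapping. The code below is a geometric illustration under that assumption, not the authors' renderer: it generates unit-sphere vertices together with the (u, v) coordinates at which each vertex samples the omnidirectional image.

```python
import math

def sphere_vertex_uv(theta, phi):
    """Map spherical angles to a unit-sphere vertex and its (u, v)
    texture coordinate in an equirectangular omnidirectional image.

    theta: longitude in [0, 2*pi); phi: latitude in [-pi/2, pi/2].
    """
    x = math.cos(phi) * math.sin(theta)
    y = math.sin(phi)
    z = math.cos(phi) * math.cos(theta)
    u = theta / (2 * math.pi)   # horizontal position across the image
    v = 0.5 - phi / math.pi     # vertical position (v=0 at the north pole)
    return (x, y, z), (u, v)

def build_sphere_mesh(n_lat=16, n_lon=32):
    """Generate the vertex grid onto whose inside the image is mapped."""
    mesh = []
    for i in range(n_lat + 1):
        phi = -math.pi / 2 + math.pi * i / n_lat
        row = [sphere_vertex_uv(2 * math.pi * j / n_lon, phi)
               for j in range(n_lon + 1)]
        mesh.append(row)
    return mesh
```

Placing the virtual camera at the sphere's center and re-uploading each decoded frame as the sphere's texture is what produces the surrounding omnidirectional view on the displays.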
With the coordinated processes of the distributing server, the visual server, and sound clients, the streaming system is realized.

Presenter(s)

Masanori Hironishi, Kanazawa Institute of Technology
