Live 3D Scene Capture for Virtual Teleportation

URL: https://doi.org/10.1145/3560905.3568086

BibTeX entry:

@inproceedings{2022-Jin-sensys,
  author = "Jin, Tao and Dasari, Mallesham and Smith, Connor and Apicharttrisorn, Kittipat and Rowe, Anthony and Seshan, Srinivasan",
  title = "Live 3D Scene Capture for Virtual Teleportation",
  year = "2022",
  isbn = "9781450398862",
  publisher = "Association for Computing Machinery",
  address = "New York, NY, USA",
  url = "https://doi.org/10.1145/3560905.3568086",
  doi = "10.1145/3560905.3568086",
  abstract = "It has long been a goal of immersive telepresence to capture and stream 3D spaces such that a remote viewer can watch from any location or angle within the scene. This demonstration presents Mosaic, a new distributed 3D scene capture system that uses textured mesh data representation for streaming a 3D volumetric video of a space to remote viewers. Compared to more common point cloud based methods, we show that textured mesh data requires less bandwidth and yields the same visual quality. However, textured mesh reconstruction is compute and memory intensive, mesh simplification is not easily parallelizable, and texture maps lacks spatial and temporal coherence. Mosaic tackles these challenges by examining each computational stage and determines how they can be efficiently distributed across multiple compute nodes to reduce overall latency, minimize bandwidth, and maintain quality. We then provide an end-to-end latency and bandwidth breakdown that can be used to target future acceleration work.",
  booktitle = "ACM Conference on Embedded Networked Sensor Systems (SenSys)",
  pages = "774--775",
  numpages = "2",
  location = "Boston, Massachusetts",
  series = "SenSys '22",
  month = "November"
}

This post is licensed under CC BY 4.0 by the author.