Presenting Holographic Studio, a volumetric video capture and production system for performance capture, viewable from any perspective in virtual environments. This technology finds natural applications not only in the entertainment industry, such as video games and remote concerts, but also in sports, fashion, culture, remote communication and education, and military and medical training, among others.
Holographic Studio, or HoloStudio for short, was born as a joint initiative between the University of Aveiro and VR360 and constitutes the work of my MSc thesis in Informatics Engineering (2020/2021). This dissertation aims to reduce the gap between this technology, typically characterized by expensive hardware, scarce availability and high usage costs, and the small and medium-sized companies that would benefit from the added immersion it brings to their content.
The proposed infrastructure resorts to RGB-D sensors to capture depth data and integrates 3D data-processing algorithms that clean and replicate live-action performances for viewing in immersive virtual spaces. The implemented architecture provides a solid, scalable and replicable basis for future innovations in the photorealism of captures.
The selected RGB-D camera was the Azure Kinect, synchronized through a combination of a daisy-chain hardware connection and a custom software algorithm for refinement. Calibration is done through a semi-automatic process: manual key-point selection for global alignment, followed by ICP for refinement.
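The ICP refinement step can be illustrated with a minimal point-to-point ICP in NumPy. This is a simplified stand-in, not the exact variant used in HoloStudio, and the function names and parameters are illustrative:

```python
import numpy as np

def best_fit_transform(A, B):
    """Least-squares rigid transform (R, t) mapping point set A onto B via SVD."""
    cA, cB = A.mean(axis=0), B.mean(axis=0)
    H = (A - cA).T @ (B - cB)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cB - R @ cA
    return R, t

def icp(source, target, iters=30):
    """Minimal point-to-point ICP: nearest-neighbour matching + SVD update."""
    src = source.copy()
    for _ in range(iters):
        # brute-force nearest neighbour (fine for small clouds)
        d = np.linalg.norm(src[:, None, :] - target[None, :, :], axis=2)
        matched = target[d.argmin(axis=1)]
        R, t = best_fit_transform(src, matched)
        src = src @ R.T + t
    return src
```

In practice a point-to-plane variant with a k-d tree for correspondence search converges faster; the SVD-based update above is the textbook core of the algorithm.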
Camera placement characteristics, including height, angle, orientation and distance, were all studied to optimize results. The final capture studio, with custom lighting conditions, is highly portable and replicable.
Source videos from all RGB-D cameras are fed through a multi-step processing pipeline that includes:
- Input Validation
- Image Processing
- Point-Cloud Alignment & Processing
- Surface Reconstruction
- Texturing
- Exporting
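The stages above can be sketched as a simple stage-chaining skeleton. The names and placeholder bodies below are hypothetical; the real system plugs its own algorithms (validation checks, depth filters, ICP, Poisson reconstruction, texture mapping, exporters) into each slot:

```python
from typing import Any, Callable, List, Tuple

Stage = Callable[[Any], Any]

def make_pipeline() -> List[Tuple[str, Stage]]:
    """Placeholder pipeline mirroring the stage list above."""
    identity: Stage = lambda frame: frame
    return [
        ("input validation", identity),
        ("image processing", identity),
        ("point-cloud alignment & processing", identity),
        ("surface reconstruction", identity),
        ("texturing", identity),
        ("exporting", identity),
    ]

def run_pipeline(frame: Any, stages: List[Tuple[str, Stage]]):
    """Run a frame through every stage in order; return the result and the
    trace of executed stage names."""
    trace = []
    for name, stage in stages:
        frame = stage(frame)
        trace.append(name)
    return frame, trace
```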
Image processing, using flying-pixel removal and maximum-range filtering (left) and histogram-matching and color-transfer algorithms (right), applied to a preview capture:
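The depth-side filters can be sketched in NumPy. The 4-neighbour rule and the thresholds below are illustrative assumptions, not the exact criteria used in the thesis:

```python
import numpy as np

def filter_depth(depth, max_range=3.0, jump=0.1):
    """Zero out depth pixels beyond max_range (metres) and 'flying pixels'
    whose depth differs from every 4-neighbour by more than `jump`."""
    d = depth.copy()
    d[d > max_range] = 0.0
    # absolute differences to the four axis-aligned neighbours (inf at borders)
    diffs = np.full(d.shape + (4,), np.inf)
    diffs[1:, :, 0] = np.abs(d[1:, :] - d[:-1, :])
    diffs[:-1, :, 1] = np.abs(d[:-1, :] - d[1:, :])
    diffs[:, 1:, 2] = np.abs(d[:, 1:] - d[:, :-1])
    diffs[:, :-1, 3] = np.abs(d[:, :-1] - d[:, 1:])
    # a pixel with no neighbour within `jump` is a flying pixel
    d[diffs.min(axis=2) > jump] = 0.0
    return d
```

Flying pixels are the spurious samples a time-of-flight sensor produces along depth discontinuities; the color-side processing (histogram matching and color transfer between cameras) is a separate step not sketched here.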
Point-cloud alignment (left) and segmentation (right) using custom algorithms and standard processes (e.g. DBSCAN clustering), applied to a preview capture:
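The DBSCAN step can be illustrated with a minimal implementation that separates the performer's cluster from stray background points. This is a didactic sketch; a production system would use a library implementation with spatial indexing:

```python
import numpy as np

def dbscan(points, eps=0.1, min_pts=5):
    """Minimal DBSCAN over an (n, 3) point array; returns one label per
    point, with -1 marking noise."""
    n = len(points)
    dist = np.linalg.norm(points[:, None] - points[None, :], axis=2)
    neighbours = [np.flatnonzero(dist[i] <= eps) for i in range(n)]
    labels = np.full(n, -1)
    cluster = 0
    for i in range(n):
        # only unvisited core points seed a new cluster
        if labels[i] != -1 or len(neighbours[i]) < min_pts:
            continue
        labels[i] = cluster
        stack = list(neighbours[i])
        while stack:
            j = stack.pop()
            if labels[j] == -1:
                labels[j] = cluster
                if len(neighbours[j]) >= min_pts:  # expand through core points
                    stack.extend(neighbours[j])
        cluster += 1
    return labels
```

After clustering, keeping only the largest (or central) cluster segments the subject from the rest of the scene.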
Poisson surface reconstruction, applied to a preview capture:
Visualization is done through a custom-developed Unity plugin, capable of real-time rendering of volumetric captures produced with HoloStudio.
The full thesis is available in Portuguese at the University of Aveiro's Institutional Repository and on my ResearchGate profile. A short presentation, written in English, is available in PowerPoint format here. A promotional video is also available on YouTube.
If you find this work useful, please cite:
@mastersthesis{pires2021holostudio,
  author = {Pires, Filipe},
  year   = {2021},
  month  = {07},
  title  = {HoloStudio: Volumetric Video Production Tool for Immersive Applications (MSc. Thesis Informatics Engineering, University of Aveiro)}
}

I am the author of all work presented here, from the proprietary source code and dedicated infrastructure setup to the documentation, thesis and promotional video. Hardware used during preliminary testing was offered by the research institutions IEETA, IT and INESCTEC.
This work could not have been done without the guidance of professors Paulo Dias and Miguel Oliveira. Special thanks to Marcelo Silva and the entire VR360 team.
For further information, feel free to reach out.