Title: Deep Light Direction Reconstruction from single RGB images
Authors: Miller, Markus
Nischwitz, Alfred
Westermann, Rüdiger
Citation: WSCG 2021: Full Papers Proceedings: 29th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision, p. 31-40.
Issue Date: 2021
Publisher: Václav Skala - UNION Agency
Document type: conference paper (conferenceObject)
URI: http://hdl.handle.net/11025/45007
ISBN: 978-80-86943-34-3
ISSN: 2464-4617; 2464-4625 (CD/DVD)
Keywords (Czech): světlo;zdroj;směr;odhad;rekonstrukce;RGB;hluboké učení
Keywords (English): light;source;direction;estimation;reconstruction;RGB;deep learning
Abstract (English): In augmented reality applications, consistent illumination between virtual and real objects is important for creating an immersive user experience. Consistent illumination can be achieved by an appropriate parameterisation of the virtual illumination model that is consistent with real-world lighting conditions. In this study, we developed a method to reconstruct the general light direction from red-green-blue (RGB) images of real-world scenes using a modified VGG-16 neural network. We reconstructed the general light direction as azimuth and elevation angles. To avoid inaccurate results caused by coordinate uncertainty occurring at steep elevation angles, we further introduced stereographically projected coordinates. Unlike recent deep-learning-based approaches for reconstructing the light source direction, our approach does not require depth information and thus does not rely on special red-green-blue-depth (RGB-D) images as input.
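
The abstract describes the approach only at a high level; the following is a minimal, hypothetical sketch (Python with PyTorch/torchvision) of the two ideas it mentions: replacing the VGG-16 classification head with a two-value regression head, and mapping an (azimuth, elevation) light direction to stereographically projected coordinates to avoid the azimuth ambiguity at steep elevations. All function names, the choice of projection pole, and the head layout are assumptions made for illustration, not the configuration used in the paper itself.

import math

import torch
import torch.nn as nn
from torchvision import models


def stereographic_from_angles(azimuth, elevation):
    # Convert a light direction given as azimuth/elevation (radians) to a
    # unit vector, then project it from the south pole (0, 0, -1) onto the
    # equatorial plane. Assumed convention; the paper's may differ.
    x = math.cos(elevation) * math.cos(azimuth)
    y = math.cos(elevation) * math.sin(azimuth)
    z = math.sin(elevation)
    return x / (1.0 + z), y / (1.0 + z)


def build_light_direction_regressor():
    # VGG-16 backbone (random weights by default); the final 1000-way
    # classifier layer is swapped for a 2-output regression layer.
    net = models.vgg16()
    net.classifier[6] = nn.Linear(4096, 2)
    return net


model = build_light_direction_regressor()
dummy = torch.randn(1, 3, 224, 224)           # one RGB image at VGG-16 input size
print(model(dummy).shape)                     # torch.Size([1, 2])
print(stereographic_from_angles(math.radians(45.0), math.radians(60.0)))

Trained with a standard regression loss on the two projected coordinates, such a head sidesteps the wrap-around and pole problems of regressing azimuth and elevation directly; again, this is only an illustration of the idea, not the authors' implementation.
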
Rights: © Václav Skala - UNION Agency
Appears in Collections: WSCG 2021: Full Papers Proceedings

Files in This Item:
H59.pdf (Full text), 7.5 MB, Adobe PDF


