Title: Memory-Friendly Deep Mesh Registration
Authors: Le Clerc, François
Sun, Hao
Citation: WSCG 2020: full papers proceedings: 28th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision, p. 1-10.
Issue Date: 2020
Publisher: Václav Skala - UNION Agency
Document type: conferenceObject (conference paper)
URI: http://wscg.zcu.cz/WSCG2020/2020-CSRN-3001.pdf
http://hdl.handle.net/11025/38445
ISBN: 978-80-86943-35-0
ISSN: 2464-4617 (print)
2464-4625 (CD-ROM)
Keywords: geometric deep learning;convolutional neural networks;shape matching;3D mesh
Abstract: Processing 3D meshes using convolutional neural networks requires convolutions to operate on features sampled on non-Euclidean manifolds. To this end, spatial-domain approaches applicable to meshes with different topologies locally map feature values in vertex neighborhoods to Euclidean 'patches' that provide consistent inputs to the convolution filters around all mesh vertices. This generalization of the convolution operator significantly increases the memory footprint of convolutional layers and sets a practical limit on network depth for the available GPU hardware. We propose a memory-optimized convolution scheme that mitigates this issue and allows more convolutional layers to be included in a network for a given memory budget. An experimental evaluation of mesh registration accuracy on datasets of human face and body scans shows that deeper networks bring substantial performance improvements and demonstrates the benefits of our scheme. Our results outperform the state of the art.
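
The abstract gives no implementation details, so the following is only a minimal sketch of a generic patch-based spatial mesh convolution of the kind it describes: neighborhood features are gathered into fixed-size Euclidean 'patches' and a shared filter is applied per patch position. All names here (patch_conv, patch_idx, patch_w) are hypothetical, the patch operator (sample indices and interpolation weights) is assumed to be precomputed from the mesh, and this is not the paper's memory-optimized scheme.

```python
import numpy as np

def patch_conv(features, patch_idx, patch_w, weights, bias):
    """Generic patch-based spatial mesh convolution (illustrative sketch).

    features : (V, C_in)        per-vertex input features
    patch_idx: (V, K) int       indices of the K patch samples around each vertex
    patch_w  : (V, K)           interpolation weights of the patch samples
    weights  : (K, C_in, C_out) one learned filter matrix per patch position
    bias     : (C_out,)
    returns  : (V, C_out)       per-vertex output features
    """
    # Gather neighborhood features into Euclidean 'patches': (V, K, C_in).
    patches = features[patch_idx] * patch_w[..., None]
    # Apply one filter matrix per patch position and sum over the patch.
    return np.einsum('vki,kio->vo', patches, weights) + bias

# Toy usage with random data (shapes only).
V, K, C_in, C_out = 100, 9, 16, 32
rng = np.random.default_rng(0)
F = rng.standard_normal((V, C_in))
idx = rng.integers(0, V, size=(V, K))
w = rng.random((V, K)); w /= w.sum(axis=1, keepdims=True)
W = rng.standard_normal((K, C_in, C_out)) * 0.1
out = patch_conv(F, idx, w, W, np.zeros(C_out))  # -> (100, 32)
```

Note that the intermediate patches tensor has shape (V, K, C_in), K times the size of the feature map itself, and one such tensor is materialized per convolutional layer; this is the kind of memory overhead that limits network depth on a GPU and that the paper's scheme is designed to mitigate.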
Rights: © Václav Skala - UNION Agency
Appears in Collections: WSCG 2020: Full Papers Proceedings

Files in This Item:
File: E29.pdf (Full text, 5,59 MB, Adobe PDF)


