Title: Patch-Trees for Fast Level-of-Detail Synthesis
Authors: Birkholz, Hermann
Citation: WSCG '2007: Short Communications Proceedings: The 15th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision 2007 in co-operation with EUROGRAPHICS: University of West Bohemia, Plzen, Czech Republic, January 29 – February 1, 2007, p. 1-8.
Issue Date: 2007
Publisher: Václav Skala - UNION Agency
Document type: conference paper (conferenceObject)
URI: http://wscg.zcu.cz/wscg2007/Papers_2007/short/!WSCG2007_Short_Proceedings_Final-Part_1.zip
http://hdl.handle.net/11025/11139
ISBN: 978-80-86943-02-2
Keywords: level of detail; mesh synthesis
Abstract: This paper describes a procedure that synthesizes Level-of-Detail (LoD) meshes from a tree of mesh patches. The patch tree stores the surface of the original mesh at different detail levels: the leaf patches represent the original detail, while lower levels of the tree represent the geometry of their child nodes with less detail. Such patch trees have a coarser granularity than basic approaches like edge collapse, because only complete patches can switch their detail level, rather than pairs of triangles. On the other hand, they make better use of the graphics hardware, which can render preloaded patches very fast. The main problem of such a patch-based LoD approach is joining patches of different resolutions into a smooth mesh. This is solved by using different versions of the patch borders, chosen according to the detail level of the neighboring patches.
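To illustrate the idea described in the abstract, the following is a minimal C++ sketch of how a patch tree with per-neighbor border variants might be organized. It is not the paper's implementation; all type and function names (Patch, selectPatches, borderIndices, and so on) are illustrative assumptions.

```cpp
// Sketch (not the paper's code): a patch tree whose leaves hold the original
// geometry and whose inner nodes hold coarser versions of their children's
// surface. Each patch also stores several precomputed border strips, one per
// possible detail level of the neighboring patch, so that patches selected at
// different levels can still be stitched into a closed mesh.

#include <cstdint>
#include <map>
#include <memory>
#include <vector>

struct Vertex { float x, y, z; };

struct Patch {
    int level = 0;                                   // 0 = root (coarsest); leaves carry original detail
    std::vector<Vertex>   vertices;                  // interior geometry of this patch
    std::vector<uint32_t> indices;                   // triangle indices into 'vertices'
    // One border strip per neighbor detail level, so this patch can connect
    // seamlessly to a neighbor rendered at 'neighborLevel'.
    std::map<int, std::vector<uint32_t>> borderIndices;
    std::vector<std::unique_ptr<Patch>> children;    // empty for leaf patches
    float error = 0.0f;                              // simplification error of this node

    bool isLeaf() const { return children.empty(); }
};

// Collect the patches to render: descend while a node's error exceeds the
// (view-dependent) tolerance; otherwise render the whole patch as-is.
void selectPatches(const Patch& node, float tolerance,
                   std::vector<const Patch*>& out)
{
    if (node.isLeaf() || node.error <= tolerance) {
        out.push_back(&node);
        return;
    }
    for (const auto& child : node.children)
        selectPatches(*child, tolerance, out);
}

int main() {
    // Tiny two-level tree: a coarse root with one detailed leaf child.
    Patch root;
    root.error = 1.0f;
    auto leaf = std::make_unique<Patch>();
    leaf->level = 1;
    root.children.push_back(std::move(leaf));

    std::vector<const Patch*> visible;
    selectPatches(root, 0.5f, visible);   // root's error is too high, so the leaf is selected
    return visible.size() == 1 ? 0 : 1;
}
```

At draw time, each selected patch would emit its interior triangles plus the border strip matching each neighbor's chosen level (e.g. `patch->borderIndices.at(neighborLevel)`), which is how patches of different resolutions join without cracks.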
Rights: © Václav Skala - UNION Agency
Appears in Collections:WSCG '2007: Short Communications Proceedings

Files in This Item:
File                      Description    Size       Format
Birkholz.pdf              Full text      954.41 kB  Adobe PDF
Birkholz_prezentace.pdf   Presentation   2.24 MB    Adobe PDF

