Title: Rendering techniques for hardware-accelerated image-based CSG
Authors: Kirsch, Florian
Döllner, Jürgen
Citation: Journal of WSCG. 2004, vol. 12, no. 1-3, p. 269-276.
Issue Date: 2004
Publisher: UNION Agency
Document type: article
URI: http://wscg.zcu.cz/wscg2004/Papers_2004_Full/M11.pdf
http://hdl.handle.net/11025/1729
ISSN: 1213-6972
Keywords: constructive solid geometry; rendering algorithms; solid modelling
Abstract: Image-based CSG rendering algorithms for standard graphics hardware rely on multipass rendering that includes reading and writing large amounts of pixel data from and to the frame buffer. Since the performance of this data path has hardly improved in recent years, we describe new implementation techniques that use modern graphics hardware efficiently. 1) The render-to-texture ability is used to temporarily store shape visibility, avoiding the expensive copy of z-buffer content to external memory. Shape visibility is encoded discretely instead of using depth values; hence, in contrast to previously described methods, the technique is not susceptible to artifacts. 2) We present an image-based technique for calculating the depth complexity of a CSG shape that avoids reading and analyzing pixel data from the frame buffer. Both techniques optimize several CSG rendering algorithms, namely the Goldfeather algorithm, the layered Goldfeather algorithm, and the Sequenced Convex Subtraction (SCS) algorithm. This way, these image-based CSG algorithms now run accelerated by graphics hardware and therefore represent a significant step towards real-time image-based CSG rendering for complex models.
Rights: © UNION Agency
Appears in Collections:Volume 12, number 1-3 (2004)
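
The first technique described in the abstract, storing discretely encoded shape visibility via render-to-texture instead of copying z-buffer data, can be illustrated with a short sketch. The following C++/OpenGL fragment is hypothetical and not code from the paper: the framebuffer-object setup, the stencil-based parity counting, and all names (renderShapeVisibility, Shape, displayList, visibilityFBO, visibilityTex) are assumptions made for illustration only, and the authors' actual implementation may differ.

// Hypothetical sketch: keep per-shape visibility in a GPU texture via an FBO
// instead of reading the z-buffer back to external memory.
#include <GL/glew.h>
#include <vector>

struct Shape { GLuint displayList; };   // hypothetical handle to a shape's geometry

void renderShapeVisibility(const std::vector<Shape>& shapes,
                           GLuint visibilityFBO, GLuint visibilityTex,
                           int width, int height)
{
    // Render into an off-screen texture (render-to-texture), so visibility
    // information never leaves graphics memory.
    glBindFramebuffer(GL_FRAMEBUFFER, visibilityFBO);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, visibilityTex, 0);
    glViewport(0, 0, width, height);
    glClearColor(0.f, 0.f, 0.f, 0.f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);

    glEnable(GL_STENCIL_TEST);
    for (size_t i = 0; i < shapes.size(); ++i) {
        // Pass 1: count surface crossings per pixel in the stencil buffer
        // (parity test); visibility is encoded discretely, not as depth values.
        glStencilFunc(GL_ALWAYS, 0, ~0u);
        glStencilOp(GL_KEEP, GL_KEEP, GL_INVERT);
        glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
        glCallList(shapes[i].displayList);

        // Pass 2: write color only where the parity test marks the shape
        // as visible; the result stays on the GPU as a texture.
        glStencilFunc(GL_EQUAL, 1, 1);
        glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
        glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
        glCallList(shapes[i].displayList);
    }
    glDisable(GL_STENCIL_TEST);
    glBindFramebuffer(GL_FRAMEBUFFER, 0);  // back to the default frame buffer
}

The point this sketch tries to capture, as in the abstract, is that shape visibility remains in texture memory on the graphics card rather than being copied from the frame buffer, which is the data path the paper identifies as the bottleneck.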

Files in This Item:
M11.pdf (664.61 kB, Adobe PDF)
