Show simple item record

dc.rights.license  CC-BY-NC-ND
dc.contributor.advisor  Tan, R.T.
dc.contributor.author  Kol, T.R.
dc.date.accessioned  2013-11-21T18:02:11Z
dc.date.available  2013-11-21
dc.date.available  2013-11-21T18:02:11Z
dc.date.issued  2013
dc.identifier.uri  https://studenttheses.uu.nl/handle/20.500.12932/15361
dc.description.abstract  In the quest for realistic rendering of outdoor scenes, clouds form a crucial component. The sky is a prominent feature of many such scenes, and the air is seldom devoid of clouds. Their great variety in shape, size and color poses a difficult problem in the pursuit of convincing visualization. Additionally, the complexity of light behavior in participating media like clouds, as well as their sheer size, often gives photo-realistic approaches exceedingly high computational costs. This thesis presents a method for rendering realistic, animated cumuliform clouds through a phenomenological approach. Taking advantage of the parallel processing power of the GPU, the cloud appearance is computed per pixel, resulting in real-time full-screen visualization even on low-end GPUs. The pixel color is determined by three components. First, the opacity is computed from a cloud mesh consisting of a set of ellipsoids, upon which a fractal noise texture is superimposed to generate detailed cloud features. Second, the single scattering component is calculated analytically using the same triangle mesh and a discretized Mie phase function. Finally, the multiple scattering component is obtained from a phenomenological model using trigonometric and logarithmic formulas whose coefficients depend on the view and sun angles and on the cloud thickness at the current pixel. The parameters for this model are based on the work of Bouthors et al., who studied light transport in plane-parallel slabs. We apply this model in a more robust manner to find the best-fitting plane-parallel slab, providing a better and faster alternative to the original approach. The clouds are animated at no extra cost by translating them across the scene, which simulates turbulence because the noise texture is sampled using the world coordinates of each pixel. Additionally, cloud formation and dissipation are simulated by modifying the water droplet density.
dc.description.sponsorship  Utrecht University
dc.format.extent  24172310 bytes
dc.format.mimetype  application/pdf
dc.language.iso  en_US
dc.title  Real-time Cloud Rendering on the GPU
dc.type.content  Master Thesis
dc.rights.accessrights  Open Access
dc.subject.keywords  computer graphics, cloud rendering, light scattering, GPU programming
dc.subject.courseuu  Game and Media Technology
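
The abstract above decomposes the per-pixel cloud color into three terms: opacity, single scattering, and multiple scattering. The following is a minimal CPU-side C++ sketch of how such terms might be combined for one pixel. The Beer-Lambert opacity, the Henyey-Greenstein phase function, and the logarithmic thickness fit are illustrative placeholders, not the thesis' actual formulas, its discretized Mie phase function, or the coefficients derived from Bouthors et al.

// Minimal sketch of the per-pixel composition outlined in the abstract.
// All structures, coefficients and formulas here are hypothetical
// placeholders; the thesis' actual model uses an ellipsoid cloud mesh,
// a discretized Mie phase function and fitted plane-parallel-slab data.
#include <cmath>
#include <cstdio>

constexpr double kPi = 3.14159265358979323846;

struct PixelInput {
    double density;     // accumulated droplet density along the view ray
    double thickness;   // cloud thickness at this pixel (metres)
    double cosViewSun;  // cosine of the angle between view and sun directions
};

// Opacity from accumulated density (Beer-Lambert style falloff).
double opacity(const PixelInput& p) {
    const double extinction = 0.05;  // placeholder extinction coefficient
    return 1.0 - std::exp(-extinction * p.density * p.thickness);
}

// Single scattering: a Henyey-Greenstein phase function stands in for the
// discretized Mie phase function used in the thesis.
double singleScattering(const PixelInput& p) {
    const double g = 0.85;  // forward-scattering asymmetry (placeholder)
    const double denom = 1.0 + g * g - 2.0 * g * p.cosViewSun;
    return (1.0 - g * g) / (4.0 * kPi * std::pow(denom, 1.5));
}

// Multiple scattering: an invented smooth fit in thickness and angle,
// standing in for the plane-parallel-slab coefficients.
double multipleScattering(const PixelInput& p) {
    const double t = std::log(1.0 + p.thickness);   // logarithmic thickness term
    const double a = 0.5 * (1.0 + p.cosViewSun);    // angular term in [0, 1]
    return 0.3 * t * (0.6 + 0.4 * a);               // placeholder coefficients
}

int main() {
    const PixelInput p{0.8, 300.0, 0.4};
    const double radiance = singleScattering(p) + multipleScattering(p);
    std::printf("opacity=%.3f radiance=%.3f\n", opacity(p), radiance);
    return 0;
}

In the actual method this composition would run per pixel in a fragment shader, with the density and thickness derived from the ellipsoid mesh and superimposed fractal noise rather than passed in as constants.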

