Physically-based modelling and rendering of 3D point-cloud data
Supervisor: Dr Miles Hansard
Research group(s): Cognitive Science, Robotics Engineering
3D data from depth-cameras and scanners is typically obtained in the form of semi-organized point clouds. This means that there are adjacency relationships between visual rays from the viewpoint of each scan, and that the relative positions and orientations of the scans have been estimated. Large-scale point clouds, however, are often incomplete and inconsistent, owing to occlusions, overlaps, and viewpoint constraints. This project will investigate new geometric representations for point cloud data, with an emphasis on physically-based modelling and rendering. In particular, the problems of surface interpolation, re-sampling, physical consistency, and animation will be investigated. Strong programming skills and a background in computer science, physics, or applied mathematics are required. There is scope for collaboration with the QMUL School of Geography on this project.
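To illustrate the kind of data involved, the sketch below back-projects an organized depth map into a world-frame point cloud. The intrinsic matrix, scan pose, and synthetic depth values are illustrative assumptions, not part of the project brief; the point is that the pixel grid supplies adjacency between neighbouring rays, while the estimated scan pose places each cloud in a common coordinate frame.

```python
# Minimal sketch (assumed intrinsics K and scan pose R, t): back-project an
# organized H x W depth map into world coordinates. The pixel grid provides
# the adjacency structure that makes the cloud "semi-organized".
import numpy as np

def backproject(depth, K, R, t):
    """Lift an H x W depth map to an (H, W, 3) world-frame point cloud.

    depth : (H, W) depths along the optical axis (0 marks missing data).
    K     : (3, 3) camera intrinsic matrix.
    R, t  : rotation (3, 3) and translation (3,) mapping camera to world.
    """
    H, W = depth.shape
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).astype(float)  # homogeneous pixels
    rays = pix @ np.linalg.inv(K).T        # back-projected rays in the camera frame
    cam = rays * depth[..., None]          # scale each ray by its measured depth
    world = cam @ R.T + t                  # rigid transform into the world frame
    world[depth == 0] = np.nan             # flag missing measurements
    return world

if __name__ == "__main__":
    # Hypothetical intrinsics and identity pose, with a synthetic planar scene.
    K = np.array([[525.0, 0.0, 319.5],
                  [0.0, 525.0, 239.5],
                  [0.0, 0.0, 1.0]])
    R, t = np.eye(3), np.zeros(3)
    depth = np.full((480, 640), 2.0)
    cloud = backproject(depth, K, R, t)
    print(cloud.shape)  # (480, 640, 3): the grid adjacency is preserved
```

Because the output keeps the image grid layout, neighbouring samples can be connected directly for surface interpolation or re-sampling; merging several such clouds, each with its own estimated pose, is where the incompleteness and inconsistency described above arise.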