New year, new PhD opportunity
CAMERA associate Wenbin Li is inviting applications for the following PhD: Real2Sim: an end-to-end pipeline for uncontrolled scene digital twin.
This PhD project is part of MyWorld Bristol+Bath, funded through the Strength in Places Fund. Launched in April 2021, MyWorld is a five-year programme, the flagship for the UK’s creative technology sector, and is part of a UK-wide exploration into devolved research and development funding (UKRI video). Led by the University of Bristol, MyWorld will position the South West as an international trailblazer in screen-based media. This £46m programme brings together 30 partners from Bristol and Bath’s creative technologies sector and world-leading academic institutions to create a unique cross-sector consortium. MyWorld will forge dynamic collaborations to progress technological innovation, deliver creative excellence, establish and operate state-of-the-art facilities, offer skills training and drive inward investment, raising the region’s profile on the global stage.
The University of Bath is inviting applications for the following PhD project commencing in October 2022 and supervised by Dr Wenbin Li in the Department of Computer Science.
The goal of this project is to develop an end-to-end pipeline that captures visual assets from real-world environments and turns them into high-quality virtual scenes, followed by a computational solution that generates diverse synthesised scenarios with precise ground-truth labels.
Modern machine learning systems can learn remarkably complex tasks given large amounts of data and the associated ground-truth labels. Simulation in a virtual environment is attractive because it is low-cost and generates ‘perfect’ annotations automatically, and much recent machine learning research relies on high-quality simulation. The challenge with simulated training is that it does not fully capture real-world effects: models trained only on synthetic data often fail to generalise to real-world scenarios, because there is typically a discrepancy between synthesised virtual scenes and real environments in terms of important visual and physical properties. This difficulty is the so-called ‘Reality Gap’.
In this project, we aim to bridge the ‘Reality Gap’ by creating digital twins of real-world scenes. We would like to capture representative real-world effects and digitise them, together with their variations, into a controlled virtual environment. These tasks require the system to have a certain level of understanding of its real-world surroundings, and the ability to generate varied virtual scenes. The candidate will focus on developing a system that can:
- capture high-quality real-world properties and estimate scene geometry using multi-sensor fusion, e.g. light-field, event, depth and 3D LiDAR sensors;
- create editable 3D content using the geometric and semantic information obtained from a scene;
- synthesise varied virtual scenes, allowing controlled randomisation and combination of captured real-world features.
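The last step is commonly realised as domain randomisation over a pool of captured assets: each synthetic scene samples a subset of assets and randomises their poses and the lighting, so ground-truth labels come for free. A minimal sketch of the idea, in which all class names, parameter names and value ranges are illustrative assumptions rather than part of the project:

```python
import random
from dataclasses import dataclass, field

@dataclass
class SceneAsset:
    """A captured real-world element (in practice a mesh plus materials)."""
    name: str

@dataclass
class SyntheticScene:
    assets: list                  # subset of captured assets placed in the scene
    light_intensity: float        # randomised illumination level
    poses: dict = field(default_factory=dict)  # asset name -> (x, y, z) position

def randomise_scene(asset_pool, rng, max_assets=5):
    """Sample one synthetic scene by randomising asset selection, pose and lighting."""
    k = rng.randint(1, min(max_assets, len(asset_pool)))
    chosen = rng.sample(asset_pool, k=k)
    scene = SyntheticScene(assets=chosen, light_intensity=rng.uniform(0.2, 1.5))
    for asset in chosen:
        # place each asset at a random position within a 4 m cube
        scene.poses[asset.name] = tuple(rng.uniform(-2.0, 2.0) for _ in range(3))
    return scene

# Usage: generate 100 randomised scenes from a pool of 10 captured assets
pool = [SceneAsset(f"object_{i}") for i in range(10)]
rng = random.Random(42)
scenes = [randomise_scene(pool, rng) for _ in range(100)]
```

A real pipeline would of course randomise far richer properties (materials, camera trajectories, backgrounds), but the structure — a fixed asset pool plus a sampler over scene parameters — is the same.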
In particular, the project is expected to deliver a solution for creating a digital twin from any uncontrolled real-world surroundings. This should comprise sensor fusion, visual capture and 3D reconstruction of a real-world scene under a series of real-world challenges, e.g. dynamic objects, illumination changes and large textureless regions. The captured visual information is then turned into virtual content elements that can be combined to generate varied synthesised scenes; the key challenge will lie in automatic 3D object manipulation and arrangement. The successful candidate will work closely with experts from CAMERA and the Department of Computer Science, as well as collaborators from the University of Bristol and our project partners.