Digitise your dog into a computer game


Researchers from the University of Bath have developed motion capture technology that can digitise your dog without a motion capture suit, using only one camera.

The software could be used for a wide range of purposes, from helping vets diagnose lameness and monitoring recovery of their canine patients, to entertainment applications such as making it easier to put digital representations of dogs into movies and video games.

Motion capture technology is widely used in the entertainment industry, where actors wear a suit dotted with white markers which are then precisely tracked in 3D space by multiple cameras taking images from different angles. Movement data can then be transferred onto a digital character for use in films or computer games.
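
To illustrate the geometry behind this, the sketch below triangulates a single marker’s 3D position from two calibrated camera views using the standard Direct Linear Transform. The camera matrices and pixel coordinates are illustrative placeholders, not values from the Bath system.

    import numpy as np

    def triangulate(proj_mats, pixels):
        # Recover a 3D point from its 2D projections in several cameras.
        # proj_mats: list of 3x4 camera projection matrices
        # pixels:    list of (u, v) marker positions in each image
        rows = []
        for P, (u, v) in zip(proj_mats, pixels):
            # Each view contributes two linear constraints on the
            # homogeneous point X (from u = P0.X/P2.X and v = P1.X/P2.X).
            rows.append(u * P[2] - P[0])
            rows.append(v * P[2] - P[1])
        A = np.stack(rows)
        # Best homogeneous solution: the right singular vector with the
        # smallest singular value.
        X = np.linalg.svd(A)[2][-1]
        return X[:3] / X[3]  # dehomogenise

    # Two hypothetical cameras 0.5 m apart, sharing the same intrinsics.
    K = np.array([[600.0, 0.0, 320.0], [0.0, 600.0, 240.0], [0.0, 0.0, 1.0]])
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])

    # Project a known marker position into both views, then recover it.
    X_true = np.array([0.2, 0.1, 2.0, 1.0])
    obs = [(P @ X_true)[:2] / (P @ X_true)[2] for P in (P1, P2)]
    print(triangulate([P1, P2], obs))  # ~ [0.2, 0.1, 2.0]

With a single ordinary camera, each marker constrains only a viewing ray rather than a point, which is why conventional systems need several synchronised views; the depth channel described below removes that requirement.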

Similar technology is also used by biomechanics experts to track the movement of elite athletes during training, or to monitor patients’ rehabilitation from injuries. However, these technologies – particularly when applied to animals – require expensive equipment and dozens of markers to be attached.

Computer scientists from CAMERA, the University of Bath’s motion capture research centre, digitised the movement of 14 different breeds of dog, from lanky lurchers to squat pugs, all residents of the local Bath Cats and Dogs Home (BCDH).

Wearing special doggie motion capture suits fitted with markers, the dogs were filmed doing a range of movements as part of their enrichment activities, under the supervision of their BCDH handlers.

The researchers used these data to create a computer model that can accurately predict and replicate the poses of dogs filmed without motion capture suits. The model allows the 3D shape and movement of new dogs to be captured without markers or expensive equipment, using just a single RGBD camera. Whereas normal digital cameras record the red, green and blue (RGB) colour of each pixel in the image, RGBD cameras also record each pixel’s distance (depth) from the camera.
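
As a rough sketch of why depth helps, the snippet below back-projects each pixel of a depth image into a 3D point using the pinhole camera model. The intrinsics and frame size are made-up placeholders, not details of the camera used in the study.

    import numpy as np

    def backproject(depth, fx, fy, cx, cy):
        # Turn a depth image (metres per pixel) into an H x W x 3 map of
        # 3D points in the camera frame, via the pinhole model:
        #   X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy,  Z = depth.
        h, w = depth.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        x = (u - cx) * depth / fx
        y = (v - cy) * depth / fy
        return np.stack([x, y, depth], axis=-1)

    # Toy 640x480 RGBD frame: colour plus per-pixel distance (placeholders).
    rgb = np.zeros((480, 640, 3), dtype=np.uint8)
    depth = np.full((480, 640), 2.0)  # every pixel 2 m from the camera
    pts = backproject(depth, fx=580.0, fy=580.0, cx=320.0, cy=240.0)

    # One coloured 3D point per pixel: single camera, full 3D shape.
    cloud = np.concatenate([pts.reshape(-1, 3), rgb.reshape(-1, 3)], axis=1)
    print(cloud.shape)  # (307200, 6): x, y, z, r, g, b

Each pixel thus becomes a coloured 3D point, so a single RGBD camera captures surface shape directly, with no markers or multi-camera rig.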

PhD researcher Sinéad Kearney said: “This is the first time RGBD images have been used to track the motion of dogs using a single camera, which is much more affordable than traditional motion capture systems that require multiple cameras.

“This technology allows us to study the movement of animals, which is useful for applications such as detecting lameness in a dog and measuring its recovery over time.

“For the entertainment industry, our research can help produce more authentic movement of virtual animals in films and video games. Dog owners could also use it to make a 3D digital representation of their pet on their computer, which is a lot of fun!”

The team presented their research at one of the world’s leading AI conferences, the CVPR (Computer Vision and Pattern Recognition) conference, on 17 and 18 June.

The team has also started testing their method on computer-generated images of other four-legged animals, including horses, cats, lions and gorillas, with some promising results. In future they aim to extend their animal dataset to make the results more accurate; they will also make the dataset available for non-commercial use by others.

Professor Darren Cosker, Director of CAMERA, said: “While there is a great deal of research on automatic analysis of human motion without markers, the animal kingdom is often overlooked.

“Our research is a step towards building accurate 3D models of animal motion along with technologies that allow us to very easily measure their movement. This has many exciting applications across a range of areas – from veterinary science to video games.”

Sinéad Kearney talks about the research

Notes

Dataset: https://github.com/CAMERA-Bath/RGBD-Dog

Paper: https://researchportal.bath.ac.uk/en/publications/rgbd-dog-predicting-canine-pose-from-rgbd-sensors

Workshop: dynavis.github.io/2020/

Presentation: https://www.youtube.com/watch?v=EYJ9pB2CIG0

Video: https://www.youtube.com/watch?v=FGFI7S0fmHc&feature=youtu.be

This work was supported by the Centre for the Analysis of Motion, Entertainment Research and Applications (EP/M023281/1), the EPSRC Centre for Doctoral Training in Digital Entertainment (EP/L016540/1) and the Settlement Research Fund (1.190058.01) of the Ulsan National Institute of Science & Technology.

For further information, please contact Vicky Just in the University of Bath Press Office

+44 (0)7966 341 357

[email protected]
