Helping computers see in 3D

Together, academics and industry are developing increasingly sophisticated 3D mobile applications.

One of the greatest challenges facing the robotics industry is developing technologies that can give products such as drones, autonomous vehicles and even vacuum cleaners a competitive edge.

Research carried out since 2005 has played a key role in meeting that challenge. In particular, work by Dr Andrew Calway and Dr Walterio Mayol-Cuevas, from the Department of Computer Science, has led to the development of camera-based systems that offer a high degree of accuracy, flexibility and practicality in ‘live’ 3D situations.

How a computer can 'see'

These systems are based on Simultaneous Localisation and Mapping (SLAM), an algorithm that tracks the position of a moving camera whilst simultaneously building a computerised map of the surrounding scene. This rapid 3D visual mapping improves the functionality of devices such as wearable computers and automated vehicles.
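
To make the idea concrete, the sketch below shows a minimal tracking-and-mapping loop in Python using OpenCV feature matching and two-view geometry: each new frame is matched against the previous one to estimate how the camera has moved, and the matched features are triangulated into new 3D map points. It is an illustrative sketch only, not the Bristol system; the intrinsics matrix K and the video file are placeholder assumptions.

```python
# Minimal sketch of the visual-SLAM idea: track camera motion from frame to
# frame and grow a 3D point map as the camera moves. Illustrative only --
# not the Bristol system. K (camera intrinsics) and the video path are
# placeholder assumptions.
import cv2
import numpy as np

K = np.array([[700.0, 0.0, 320.0],   # assumed focal length and principal point
              [0.0, 700.0, 240.0],
              [0.0, 0.0, 1.0]])

orb = cv2.ORB_create(2000)                            # feature detector/descriptor
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def track_and_map(prev_gray, curr_gray):
    """Estimate relative camera motion and triangulate new map points."""
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    matches = matcher.match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Localisation: relative camera pose from the matched features.
    E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
    _, R, t, inliers = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)

    # Mapping: triangulate the inlier matches into 3D scene points.
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    good = inliers.ravel() > 0
    pts4d = cv2.triangulatePoints(P1, P2, pts1[good].T, pts2[good].T)
    points3d = (pts4d[:3] / pts4d[3]).T               # homogeneous -> Euclidean
    return R, t, points3d

# Usage sketch: feed consecutive video frames through the tracker.
cap = cv2.VideoCapture("handheld_camera.mp4")         # hypothetical input video
ok, prev = cap.read()
world_map, pose = [], np.eye(4)
while ok:
    ok, curr = cap.read()
    if not ok:
        break
    R, t, new_points = track_and_map(cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY),
                                     cv2.cvtColor(curr, cv2.COLOR_BGR2GRAY))
    step = np.eye(4)
    step[:3, :3], step[:3, 3:] = R, t
    pose = pose @ np.linalg.inv(step)                 # accumulate camera trajectory
    world_map.extend(new_points)                      # grow the scene map
    prev = curr
```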

Global use in robotics

In 2008, Bristol’s research into visual SLAM technology attracted the attention of US company Applied Sciences Laboratory (ASL), a market leader in wearable eye tracking technology.

“Industry is always looking for something to give them the edge. That’s where academic research comes in,” says Calway, who, with Mayol-Cuevas, helped ASL to develop GazeMap, the world’s first automated method for analysing data captured by wearable technology in commercial and military environments.
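
As a rough illustration of the kind of analysis such a tool automates, the sketch below casts a 2D gaze point recorded by a head-mounted eye tracker into the 3D map recovered by SLAM, returning the mapped point the wearer is most likely fixating. All names and interfaces here are hypothetical assumptions for illustration, not the GazeMap implementation.

```python
# Hypothetical sketch: locate what a wearer is looking at by casting the
# 2D gaze point into the 3D map built by SLAM. Names and interfaces are
# illustrative assumptions, not the GazeMap implementation.
import numpy as np

def gaze_to_map_point(gaze_px, K, cam_R, cam_t, map_points):
    """Return the mapped 3D point closest to the wearer's gaze ray.

    gaze_px      : (u, v) gaze position in image pixels from the eye tracker
    K            : 3x3 camera intrinsics
    cam_R, cam_t : camera pose (world -> camera rotation and translation)
    map_points   : Nx3 array of 3D points from the SLAM map (world frame)
    """
    # Back-project the gaze pixel to a viewing ray in camera coordinates.
    ray_cam = np.linalg.inv(K) @ np.array([gaze_px[0], gaze_px[1], 1.0])
    # Rotate the ray into the world frame; the camera centre is -R^T t.
    ray_world = cam_R.T @ ray_cam
    ray_world /= np.linalg.norm(ray_world)
    centre = -cam_R.T @ cam_t

    # Distance of each map point from the gaze ray; the nearest point is
    # taken as the fixated part of the scene.
    rel = map_points - centre
    along = rel @ ray_world
    perp = rel - np.outer(along, ray_world)
    dists = np.linalg.norm(perp, axis=1)
    dists[along < 0] = np.inf                 # ignore points behind the camera
    return map_points[np.argmin(dists)]

# Example with made-up numbers: identity camera pose, a few map points.
K = np.array([[700.0, 0, 320], [0, 700.0, 240], [0, 0, 1]])
pts = np.array([[0.0, 0.0, 2.0], [0.5, 0.1, 3.0], [-1.0, 0.2, 4.0]])
print(gaze_to_map_point((320, 240), K, np.eye(3), np.zeros(3), pts))
```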

Ongoing investigations in this area, led by the department, have subsequently attracted investment from other key players in the global robotics industry.

Innovate UK, the UK government’s innovation agency, approached Calway and Mayol-Cuevas to help explore how semi-autonomous flying robots might be used to reduce the risks and costs involved in inspections carried out by the energy sector. Computer vision algorithms developed at Bristol have also been used by Samsung for its humanoid robot, Roboray.

Co-ordinating national services

Visual SLAM technology also forms the core of the £1.5m ViewNet project, aimed at helping the UK emergency services coordinate their responses by rapidly searching and mapping an area. Co-funders include Toshiba, Ordnance Survey, STMicroelectronics and the National Physical Laboratory.

Other potential applications include using SLAM technology to help improve the spatial awareness of stroke patients, thanks to a project funded by the UK Occupational Therapy Research Foundation.

Research licences for the University’s software have also been issued to research centres in the UK, Japan and Mexico.

Limitless applications

The list of applications, like the future of SLAM, is limitless, says Mayol-Cuevas: “From a research point of view, understanding what things are and what happens where is the next big challenge to enable many applications in smart sensors and devices.”

