Self-driving vehicles such as cars, ships and drones offer the potential for reduced costs, lower environmental impact and fewer accidents. Now a new open dataset from researchers at Chalmers University of Technology, Sweden, sets a new standard for evaluating the perception algorithms of such vehicles and for developing autonomous transport systems on roads, on water and in the air.
For self-driving vehicles to work, they need to interpret and understand their surroundings. To achieve this, they use cameras, sensors, radar and other equipment to ‘see’ their environment. This form of artificial perception allows them to adapt their speed and steering, much as human drivers react to changing conditions around them. In recent years, researchers and companies around the world have competed over which software algorithms provide the best artificial perception. To do so, they use huge datasets containing recorded sequences from traffic environments and other situations. These datasets are used to verify that the algorithms work as well as possible and interpret situations correctly.
Open data for researchers and specialists
Now Chalmers University of Technology, Sweden, in collaboration with the University of Gothenburg, RISE (Research Institutes of Sweden) and the Swedish Maritime Administration, is launching a new open dataset called Reeds, available to researchers and industry worldwide.
The dataset provides recordings of the test vehicle’s surroundings of the highest quality and accuracy. To create the most challenging conditions possible – and thus push the software algorithms harder – the researchers chose to use a boat, whose movements relative to the surroundings are more complex than those of vehicles on land. This makes Reeds the first marine dataset of its kind.
Ola Benderius, Associate Professor at the Department of Mechanics and Maritime Sciences at Chalmers University of Technology, is leading the project. He hopes the dataset will represent a breakthrough for more accurate verification to increase the quality of artificial perception.
“The goal is to set a standard for the development and evaluation of tomorrow's fully autonomous systems. With Reeds, we are creating a dataset of the highest possible quality, one that offers great social benefit and safer systems.”
The dataset has been developed using an advanced research boat that travels predetermined routes around western Sweden, under different weather and light conditions. The tours will continue for another three years and the dataset will thus grow over time. The boat is equipped with highly advanced cameras, laser scanners, radar, motion sensors and positioning systems, to create a comprehensive picture of the environment around the craft.
Image: The boat's logging computers which handle and save all the large data streams from the boat's sensors.
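The article does not publish the logging software itself, but the core requirement it describes – several high-rate sensor streams stamped against a common clock so they can later be aligned – can be sketched minimally. All names below (`SensorRecord`, `SensorLog` and the sensor labels) are our own illustrative choices, not the project's:

```python
from dataclasses import dataclass
import bisect

@dataclass(frozen=True)
class SensorRecord:
    timestamp_ns: int   # common clock, e.g. GNSS-disciplined time
    sensor: str         # "camera0", "lidar", "radar", "imu", "gnss"
    payload: bytes      # raw sample as captured

class SensorLog:
    """Keeps one timestamp-ordered stream per sensor so streams can be aligned."""
    def __init__(self):
        self.streams = {}

    def append(self, rec: SensorRecord):
        # Records are assumed to arrive in time order per sensor.
        self.streams.setdefault(rec.sensor, []).append(rec)

    def nearest(self, sensor: str, t_ns: int) -> SensorRecord:
        """Return the record from `sensor` closest in time to t_ns."""
        stream = self.streams[sensor]
        times = [r.timestamp_ns for r in stream]
        i = bisect.bisect_left(times, t_ns)
        candidates = stream[max(0, i - 1):i + 1]
        return min(candidates, key=lambda r: abs(r.timestamp_ns - t_ns))
```

With a structure like this, a camera frame at time t can be paired with the temporally closest radar sweep or IMU sample, which is the precondition for any multi-sensor evaluation.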
The highest technical standards to open doors to advanced AI
The camera system on the boat uses the latest camera technology, generating 6 gigabytes of image data per second. A 1.5-hour trip thus provides 16 terabytes of image data – significantly more than competing datasets have offered so far – and far better conditions for future verification of artificial perception.
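As a back-of-the-envelope illustration of these volumes (our own arithmetic, not from the project): a sustained 6 GB/s over 1.5 hours would amount to roughly 32 TB, so the stated 16 TB per trip suggests that not every byte of peak output is written to disk – compression or a lower sustained rate, which is an assumption on our part.

```python
# Back-of-the-envelope data volumes (illustrative; figures from the article).
raw_rate_gb_s = 6.0            # stated camera output, GB per second
trip_s = 1.5 * 3600            # a 1.5-hour trip, in seconds
stated_total_tb = 16.0         # stated image-data volume per trip

raw_total_tb = raw_rate_gb_s * trip_s / 1000        # if 6 GB/s were sustained
avg_written_gb_s = stated_total_tb * 1000 / trip_s  # implied average stored rate

print(f"raw output if sustained: {raw_total_tb:.1f} TB per trip")   # 32.4 TB
print(f"implied average rate written: {avg_written_gb_s:.1f} GB/s") # 3.0 GB/s
```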
“Our system is of a very high technical standard. It allows for a more detailed verification and comparison between different software algorithms for artificial perception – a crucial foundation for AI,” says Ola Benderius.
During the project, Reeds has been tested and further developed by other researchers at Chalmers, as well as specially invited international researchers. They have worked with automatic recognition and classification of other vessels, measuring their own ship's movements based on camera data, 3D modeling of the environment and AI-based removal of water droplets from camera lenses.
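None of these algorithms are published in the article itself. As a toy illustration of the ego-motion idea – recovering a vessel's planar motion from matched image features in two consecutive frames – here is a minimal 2-D rigid-motion estimate (a Kabsch-style closed form; the function name and setup are hypothetical):

```python
import math

def estimate_motion_2d(pts_a, pts_b):
    """Estimate the rotation angle and translation mapping matched feature
    points pts_a (frame k) onto pts_b (frame k+1)."""
    n = len(pts_a)
    cax = sum(p[0] for p in pts_a) / n; cay = sum(p[1] for p in pts_a) / n
    cbx = sum(p[0] for p in pts_b) / n; cby = sum(p[1] for p in pts_b) / n
    # Accumulate the 2x2 cross-covariance of the centered point sets.
    sxx = sxy = syx = syy = 0.0
    for (ax, ay), (bx, by) in zip(pts_a, pts_b):
        ax -= cax; ay -= cay; bx -= cbx; by -= cby
        sxx += ax * bx; sxy += ax * by
        syx += ay * bx; syy += ay * by
    # Closed-form optimal rotation in the 2-D case.
    theta = math.atan2(sxy - syx, sxx + syy)
    # Translation maps the rotated centroid of A onto the centroid of B.
    tx = cbx - (cax * math.cos(theta) - cay * math.sin(theta))
    ty = cby - (cax * math.sin(theta) + cay * math.cos(theta))
    return theta, (tx, ty)
```

Real visual odometry on the boat would of course also have to find and match the features, handle noise and outliers, and work in 3-D; this sketch only shows the geometric core of the estimation step.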
Reeds contributes to both cooperation and competition
Reeds also provides the conditions for fair comparisons between different researchers' software algorithms. A researcher uploads their software to Reeds’ cloud service, where evaluation on the data and comparison with other groups' software take place completely automatically. The results of the comparisons are published openly, so anyone can see which researchers around the world have developed the best methods of artificial perception in different areas. Large amounts of raw data will thus accumulate over time and be analysed continuously and automatically in the cloud service. Reeds’ cloud service thereby supports both collaboration and competition between research groups, meaning that over time artificial perception will improve for all types of self-driving systems.
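The article does not describe the cloud service's internals. A minimal sketch of the idea – submissions are scored automatically against held-back ground truth and ranked on an open leaderboard – might look like this (all names and the accuracy metric are hypothetical):

```python
def evaluate(submission, samples):
    """Score a submitted perception function against hidden ground truth.
    `samples` is a list of (input, expected_label) pairs."""
    correct = sum(1 for x, y in samples if submission(x) == y)
    return correct / len(samples)

class Leaderboard:
    def __init__(self, samples):
        self._samples = samples   # ground truth, never shown to submitters
        self._scores = {}

    def submit(self, team, fn):
        # Each upload is evaluated automatically on the hidden samples.
        self._scores[team] = evaluate(fn, self._samples)

    def ranking(self):
        # Published openly: best score first.
        return sorted(self._scores.items(), key=lambda kv: -kv[1])
```

Keeping the ground truth on the server side is what makes the comparison fair: every group is scored on exactly the same hidden data, and none can tune against it.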
Image: The three GNSS antennas that measure the boat's position and heading with very high accuracy. Also visible is the radar, which provides a complete radar view around the entire boat.
More about the research project
The project began in 2020 and has been run by Chalmers University of Technology in collaboration with the University of Gothenburg, RISE and the Swedish Maritime Administration. The Swedish Transport Administration is funding the project.
Chalmers University of Technology in Gothenburg, Sweden, conducts research and education in technology and natural sciences at a high international level. The university has 3,100 employees and 10,000 students, and offers education in engineering, science, shipping and architecture.
With scientific excellence as a basis, Chalmers promotes knowledge and technical solutions for a sustainable world. Through global commitment and entrepreneurship, we foster an innovative spirit, in close collaboration with wider society. The EU’s biggest research initiative – the Graphene Flagship – is coordinated by Chalmers. We are also leading the development of a Swedish quantum computer.
Chalmers was founded in 1829 and has the same motto today as it did then: Avancez – forward.