This project aims to improve the applicability and performance of multifunctional Remotely Operated Vehicles (ROVs) for the
inspection, monitoring and survey of underwater environments and man-made installations. In particular, the project addresses the following activities:
- Surveying and monitoring of underwater structures, such as oil rigs, pipes, shipwrecks, and other submerged structures in general;
- Inspection of underwater installations, such as dams, dikes, docks, tanks, canals and tunnels (e.g., in hydro- and thermo-electric power plants), in order to detect causes of danger or damage;
- Surveillance for detection of oil and gas leakage from pipes, submerged tanks and wrecks;
- Control of unmanned industrial operations (e.g., pipe spool metrology).
At the current state of the art, these problems are either solved through the intervention of divers, who are consequently exposed to high risks, or addressed with insufficient reliability, because of the poor accuracy and quality of the
feedback provided by an ROV's on-board sensors.
The goal of the project will be achieved by developing a novel integrated sensorial system, which will exploit both optical and acoustical data to provide the ROV pilot with an
augmented representation of the scene (augmented reality). The system will also be able to automatically synthesise a 3D model of the scene from sensorial data (model acquisition). Moreover, a unified man-machine interface
will be provided to control an ROV as well as the sensors mounted on board.
Therefore, the main innovation
of the ARROV project lies in the use of a compact set of imaging sensors (optical and acoustic cameras) aimed at the accurate 3D reconstruction of the surrounding environment, and in the effective and efficient
visualisation of a scene to the vehicle operator through the use of augmented reality (mixed real and synthetic images).
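The core of such an augmented-reality overlay is projecting points of a synthetic 3D model (e.g., one reconstructed from acoustic data) into the optical camera image, so synthetic geometry can be drawn over the live video. The following is a minimal sketch of that idea using a simple pinhole camera model; the function name, intrinsic parameters, and sample points are illustrative assumptions, not the ARROV system's actual design.

```python
def project_point(p, fx=800.0, fy=800.0, cx=320.0, cy=240.0):
    """Pinhole projection of a camera-frame 3D point (x, y, z), in metres,
    to pixel coordinates. fx/fy are focal lengths in pixels; (cx, cy) is
    the principal point (all hypothetical values here)."""
    x, y, z = p
    if z <= 0:
        return None  # point is behind the camera; it cannot be drawn
    return (fx * x / z + cx, fy * y / z + cy)

# A hypothetical reconstructed pipe section, expressed in the camera frame.
model_points = [(0.0, 0.0, 2.0), (0.5, 0.0, 2.5), (1.0, 0.0, 3.0)]

# Pixel positions at which the synthetic model would be overlaid on video.
overlay = [project_point(p) for p in model_points]
```

In a real system the model points would first be transformed from the world frame into the camera frame using the vehicle's estimated pose, which is exactly where accurate 3D reconstruction and localisation become critical.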
- Augmented reality visualisation, together with a unified control interface, will significantly reduce the cognitive load on tele-operators by providing cues that help prevent them from becoming lost and disoriented.
- Accurate 3D interpretation of data and automatic fault detection (e.g., leak detection) will significantly improve the surveillance and control of unmanned operations.