Project status: Completed
Partners
ANSYS Germany GmbH / AVL List / AstaZero AB / BWV / Basemark Oy / Brightway Vision Ltd / CARISSMA Institute of Automated Driving / FIFTY2 Technology GmbH / Institut für Lasertechnologien in der Medizin und Meßtechnik an der Universität Ulm / Magna Electronics Sweden AB / Meluta Oy / Mercedes-Benz AG / OQmented GmbH / Patria Land Oy / Robert Bosch GmbH / Technical University of Applied Sciences / Torc Canada / Torc Europe GmbH / Unikie Oy / University of Stuttgart / VTT Technical Research Centre of Finland Ltd / Veoneer Sweden AB / ams-OSRAM AG
Project leader(s)
Dr. Werner Ritter

Key project dates
June 2021 - December 2024

Artificial Intelligence enhancing vehicle vision in low visibility conditions [AI-SEE]
In the rapidly evolving field of Automated Driving Systems (ADS), achieving reliable vehicle operation in adverse weather and low-visibility conditions represents one of the most pressing challenges. Today's ADS technologies remain largely dependent on optimal weather for safe functioning, which restricts operational availability and limits customer acceptance. The project set out to build a novel, robust sensing system supported by Artificial Intelligence (AI) that enables automated travel in varied traffic, lighting and weather conditions.
Achievements and results of the project
Key results include the development of a high-resolution, multimodal sensor suite, AI-driven real-time sensor adaptation, and advanced sensor fusion techniques. These innovations enhance obstacle detection, even in low-visibility conditions, and enable safer, continuous operation. Additionally, a comprehensive simulation environment has been created to accelerate system validation, reducing the need for extensive real-world testing. One of the most groundbreaking aspects of AI-SEE's robust perception system is its ability to detect and identify lost cargo and small obstacles, issues that present considerable risk on the road. This includes recognizing both common and more severe hazards, from scattered pallets and loose tires to fallen motorcycles. AI-SEE's innovative solution provides the world's first robust approach to detecting these threats at distances of up to 200 meters, even in low-visibility conditions such as at night or in adverse weather.
Background, objectives of the project and challenges
Connected and Automated Driving (CAD) has become a megatrend in the digitalisation of society and the economy. CAD has the potential to drastically influence mobility systems in terms of road safety, traffic efficiency, and environmental footprint. Personal point-to-point transportation without congestion, reduced cost and better quality in collective transport especially in rural areas, and attractive first- and last-mile services are seen as realistic impacts. Automated vehicle technology has therefore drawn billions of euros in investment, spectacular demonstrations and publicity. There have been great advances towards automation, with new vehicles increasingly equipped with Advanced Driver Assistance Systems (ADAS). However, in this rapidly changing landscape, overcoming extreme weather conditions remains a significant challenge. The Operational Design Domain (ODD) sets the limits for the operation of automated vehicles; the benefits of Automated Driving (AD) will therefore be marginal if the ODD restricts the use of AD to good weather conditions in daylight. The key to market introduction is an all-conditions-capable AD system that can ensure safe travel. With this overarching vision, the AI-SEE project was initiated to lay the groundwork for automated driving in all environmental and lighting conditions and to establish the industry standard for high-level (SAE L4) automation.

Copyright AI-SEE Consortium
AI-SEE aspires to set the industry benchmark for high-level (SAE L4) autonomy by laying the foundation for automated driving in all environmental and lighting conditions. The project's overall goal was to develop a novel, all-weather multi-sensor perception system supported by Artificial Intelligence (AI), enabling automated travel in all visibility and weather conditions. This goal was broken down into the following specific objectives related to new hardware and AI:
- High resolution adaptive all-weather sensor suite with novel sensors
- AI platform for predictive detection of prevailing environmental conditions including signal enhancement and sensor adaptation
- Smart fusion to create a 24/365 adaptive all-weather robust perception system
- Novel simulation path that allows realistic simulation of adverse weather near the sensor, in order to adapt and test the system on both real and artificially generated road scenes (a minimal sketch of such weather synthesis follows after this list)
- System validation plan and driving test campaigns
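
To make the simulation objective concrete, here is a minimal sketch of adverse-weather synthesis using the classic Koschmieder atmospheric-scattering model. It is illustrative only, not the project's simulation toolchain; the function name, extinction coefficient and airlight value are assumptions.

```python
import numpy as np

def add_fog(image, depth_m, beta=0.05, airlight=0.8):
    # Koschmieder model: observed = clear * t + airlight * (1 - t),
    # with per-pixel transmission t = exp(-beta * depth).
    t = np.exp(-beta * depth_m)[..., None]   # shape (H, W, 1)
    return image * t + airlight * (1.0 - t)

# Toy usage: a random 4x4 RGB "clear" scene with depths from 10 m to 80 m.
rng = np.random.default_rng(0)
clear = rng.uniform(0.0, 1.0, size=(4, 4, 3))      # clear image in [0, 1]
depth = np.linspace(10.0, 80.0, 16).reshape(4, 4)  # metres to each pixel
foggy = add_fog(clear, depth, beta=0.05)           # raise beta for denser fog
```

Physically faithful simulation near the sensor, as pursued in AI-SEE, also has to model the sensor's own response, e.g. backscatter into a gated camera or LiDAR returns from fog droplets, which goes well beyond this image-space approximation.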
Technological achievements
The progress beyond the state of the art and the resulting achievements are organized according to the main objectives:
- 24/365 high-resolution adaptive all-weather sensor suite
PolLiDAR: the PolLiDAR unit integrates polarized imaging and LiDAR into a single hardware solution, combining colour, polarization, and 3D depth data for seamless sensor fusion. The system moves beyond the limitations of traditional LiDAR, which only captures distance, and exploits the polarization state to enhance reconstruction accuracy in terms of distance, surface-normal, and material-property estimation.
4D MIMO Radar: the radar's innovative waveguide-based antenna design minimizes signal loss and maximizes antenna compactness, enabling 30 transmit and 40 receive channels for a fivefold improvement in angular resolution compared to the state of the art. Benchmarking against key corner cases highlights its superior detection range and resolution, ensuring reliable performance beyond 350 meters, even in adverse weather conditions.
Gated Camera: leveraging advanced gating technology and a deep neural network, the GatedCam produces high-resolution depth maps that outperform any existing hardware solution in spatial resolution while retaining sufficient depth resolution. By capturing light only from specific depth ranges, the GatedCam significantly reduces noise from weather particles, enhancing the clarity of captured images.
SWIR LiDAR: based on a large 2D Ge-on-Si SPAD array with integrated receiver read-out and a MEMS scanning architecture. AI-SEE explored a path towards greatly reducing the cost of LiDAR systems by establishing the fundamentals of Ge-on-Si SPAD array research.
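
Two back-of-envelope calculations illustrate the physics behind these sensors, under idealised assumptions (negligible laser pulse width; a uniform half-wavelength-spaced linear virtual array, which real 4D radars only approximate). The function names and numbers are illustrative, not device specifications.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def gate_depth_slice(delay_ns, gate_ns):
    # Range gating: fire the laser at t=0 and open the shutter from
    # `delay` to `delay + gate`; only light reflected within a bounded
    # depth slice is integrated, so nearby fog/snow backscatter is rejected.
    r_min = C * delay_ns * 1e-9 / 2.0
    r_max = C * (delay_ns + gate_ns) * 1e-9 / 2.0
    return r_min, r_max

print(gate_depth_slice(1000.0, 200.0))  # ~ (149.9 m, 179.9 m) depth slice

# MIMO radar: Tx * Rx channels act as one virtual array. For an idealised
# half-wavelength-spaced linear array, angular resolution ~ 2/N radians.
n_virtual = 30 * 40                      # 1200 virtual channels
print(math.degrees(2.0 / n_virtual))     # ~ 0.095 degrees at broadside
```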
- Artificial Intelligence (AI) platform for predictive detection of prevailing environmental conditions, including signal enhancement and sensor adaptation
The 24/365 sensor suite has been equipped with novel signal-enhancement methods based on compressive sensing employing the latest neural networks, and with sensor adaptation via semantic feedback. With compressive sensing, compression of the input signal can be implemented already at the individual hardware level. The main outcome is that AI-SEE has achieved improvements of up to 17% in conventional sensor technology through the use of novel deep neural networks in adverse weather conditions (rain, fog, snow). This benchmark enhances driving safety even with conventional (already installed) sensors. Furthermore, AI-SEE has made significant progress in the automatic recognition of prevailing bad-weather conditions from a moving vehicle using the gated camera, and substantial strides in the automatic adaptation of sensor parameters to these conditions.
- Smart fusion to create the 24/365 adaptive all-weather robust perception system
The AI-SEE Sensor Fusion Platform is a groundbreaking solution designed to provide seamless perception in all weather conditions. By integrating multiple sensor modalities, the platform delivers robust, long-range object detection and precise depth estimation, specifically tackling the lost-cargo problem: the challenge of detecting small, non-traversable obstacles (such as tires and pallets) at distances of up to 150 meters, even in low-visibility conditions such as nighttime or adverse weather. The platform integrates two innovative sensor fusion approaches that build on each other to overcome key limitations of individual sensors while improving performance across a range of scenarios.
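
As a schematic of the feature-level fusion idea (a minimal sketch, not the consortium's actual attentive-blending or SAMFusion implementation; the module name and tensor shapes are assumptions), the PyTorch snippet below learns per-pixel weights that decide how much each modality contributes to the fused feature map:

```python
import torch
import torch.nn as nn

class ModalityWeightedFusion(nn.Module):
    """Fuse per-modality feature maps with learned per-pixel weights."""
    def __init__(self, channels, num_modalities=2):
        super().__init__()
        # A 1x1 conv scores each modality at every pixel; softmax normalises.
        self.score = nn.Conv2d(channels * num_modalities, num_modalities, 1)

    def forward(self, feats):
        # feats: list of [B, C, H, W] tensors, one per modality.
        weights = torch.softmax(self.score(torch.cat(feats, dim=1)), dim=1)
        fused = torch.zeros_like(feats[0])
        for i, f in enumerate(feats):
            fused = fused + weights[:, i:i + 1] * f  # broadcast over channels
        return fused

# Toy usage: fuse gated-camera and lidar features of matching shape.
fusion = ModalityWeightedFusion(channels=64)
cam, lidar = torch.randn(1, 64, 32, 32), torch.randn(1, 64, 32, 32)
fused = fusion([cam, lidar])   # [1, 64, 32, 32]
```

When one modality degrades (e.g. the RGB camera in fog), such learned weights can shift contribution towards the modalities that remain informative, which is the core intuition behind weather-robust fusion.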

Figure: qualitative detection results. The gated camera yields beneficial low-light detection capability, and the example detections from the proposed approach in night-time, snowy and foggy conditions are achieved through attentive blending of features and multimodal querying. Ground-truth bounding boxes are shown in red, predictions in green.
Copyright Torc Robotics
- Demonstrator and system validation testing campaigns
The AI-SEE all-weather multi-sensor perception system has been integrated into the test and demonstration vehicles. Extensive validation campaigns have been conducted throughout the project, both in controlled environments and on public roads. These campaigns demonstrated the efficiency and robustness of the AI-SEE perception system, proving its capability to function under all lighting and weather conditions.
Copyright Mercedes-Benz
Market Potential
The AI-SEE project stands as a transformative initiative with profound impacts for the project partners and beyond, particularly in addressing the severe technological barriers posed by inclement weather, which hinder the market introduction of automated vehicles. According to UN Regulation No. 157 (UNECE R157), an automated vehicle is permitted to drive at speeds up to 130 km/h only if it can reliably detect non-traversable obstacles (height >= 15 cm, e.g. tires, pallets, or a lying motorcyclist) at a distance of 150 meters. This problem had so far been solved only in daylight, clear-weather conditions. AI-SEE has developed the world's first solution to the lost-cargo problem at night and in inclement weather. This is a giant step towards SAE L3 market deployment, and it also increases the safety of existing ADAS systems. AI-SEE is poised to make a significant impact across multiple dimensions. Economically, it enhances competitiveness and fosters growth; environmentally, it promotes cleaner mobility; in the market, it opens new opportunities and accelerates adoption; and societally, it enhances safety, mobility, and quality of life.
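
To see why roughly 150 m of reliable sight distance matters at 130 km/h, consider a back-of-envelope stopping-distance estimate. The reaction time and deceleration below are illustrative assumptions, not values from the regulation.

```python
def required_sight_distance(speed_kmh, t_react_s, decel_ms2):
    # Distance covered during the system's reaction time plus the
    # distance needed to brake to a standstill: v*t + v^2 / (2a).
    v = speed_kmh / 3.6
    return v * t_react_s + v * v / (2.0 * decel_ms2)

# 130 km/h, 1 s system latency, 6 m/s^2 sustained braking -> ~144.8 m,
# i.e. nearly the full 150 m detection range is consumed before standstill.
print(round(required_sight_distance(130.0, 1.0, 6.0), 1))
```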
Societal & Economic Impact
Societal impact: Through the development of advanced sensor and software technologies, the project aims to reduce accidents and fatalities, significantly enhancing public safety. The reliable operation of automated vehicles in various weather conditions ensures greater mobility for all, including the elderly and disabled, improving their quality of life. Additionally, the reduction in traffic congestion due to better AD systems will result in smoother commutes and less time wasted on the road, contributing to overall societal well-being.
Environmental impact: By enhancing the reliability and performance of sensors and software under adverse weather conditions, automated vehicles can operate more efficiently and with reduced emissions. Additionally, optimized driving patterns and smoother traffic flow resulting from advanced automation will further decrease energy consumption and environmental impact.
Economic & market impact: The collaboration between partners in the early stages of development will foster innovation, leading to job creation and bolstering the economic health of the automotive and electronics sectors. This advancement is expected to shorten the time to market for these technologies, promoting economic growth through the creation of new market opportunities and the enhancement of existing ones.
Patents, Standardisation, Publications
— 3 patents
— 23 publications
— Contributing expertise to IEEE and ISO standards discussions
Future Developments
- Specialized sensors are the foundation for all-weather automated driving. Further development of high-resolution sensors like gated cameras, PolLiDAR, and MIMO Radar, specifically optimized to perform reliably in low-visibility conditions (e.g., darkness, fog, rain, snow), is fundamental for weather-resilient perception systems. Gated cameras use near-infrared (NIR) illumination, delivering detailed images even in low visibility. PolLiDAR (polarization combined with LiDAR) enhances 3D depth perception and material recognition, critical for distinguishing road surfaces and obstacles. MIMO Radar's high angular resolution and long detection range provide reliable object tracking in extreme weather. Advancements in self-supervised depth estimation and neural-network integration into depth mapping set new standards for weather-agnostic perception.
- Real-time AI-driven sensor signal adaptation and enhancement is key to maintaining accuracy in changing weather. Adaptive AI algorithms like Weather-Net detect current weather conditions and adjust sensor parameters instantly, optimizing performance and reducing false detections in challenging environments (a minimal sketch of this adaptation loop follows after this list). Additionally, methods such as self-supervised depth estimation and de-scattering algorithms refine sensor outputs by reducing noise and improving clarity in low-visibility scenarios.
- Individual sensors have limitations, so multimodal sensor fusion is essential for achieving reliable performance. By integrating data from various sensor types, including LiDAR, Radar, gated NIR cameras, and RGB cameras, fusion systems effectively compensate for the weaknesses of each sensor, providing the comprehensive and accurate perception crucial for safe operation in complex and adverse conditions. Techniques like Cross-Spectral Gated-RGB Depth Estimation and SAMFusion need further development to achieve robustness and enhance detection capabilities.
- Simulation environments reduce the need for extensive, costly and risky real-world testing. By creating realistic virtual scenarios, simulations enable efficient training and validation of automated driving systems. These environments generate high-quality synthetic data mimicking challenging weather scenarios, which aids in training and validating the AI models. This approach accelerates development, reduces safety risks, and allows testing in extreme conditions that would be difficult or dangerous to replicate in the real world.
- A holistic approach that integrates sensors, simulation, and testing is crucial for system development. Depending solely on field data is insufficient, especially in extreme or dangerous conditions. Bridging real-world data collection with advanced simulations and controlled testing enables comprehensive system validation in challenging scenarios.
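
The sketch below illustrates the adaptation loop described in the second point above: a small classifier estimates the prevailing weather, and the result selects a sensor parameter preset. It is a stand-in in the spirit of Weather-Net, not the project's model; the class list, network architecture, and preset values are invented for illustration.

```python
import torch
import torch.nn as nn

WEATHER_CLASSES = ["clear", "rain", "fog", "snow"]

# Hypothetical per-weather presets for a gated camera (illustrative values).
GATE_PRESETS = {
    "clear": {"gate_delay_ns": 400, "gate_width_ns": 240, "laser_pulses": 1},
    "rain":  {"gate_delay_ns": 300, "gate_width_ns": 180, "laser_pulses": 2},
    "fog":   {"gate_delay_ns": 200, "gate_width_ns": 120, "laser_pulses": 4},
    "snow":  {"gate_delay_ns": 250, "gate_width_ns": 150, "laser_pulses": 3},
}

class TinyWeatherNet(nn.Module):
    """Stand-in classifier: camera frame -> weather-class logits."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, 2, 1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, 2, 1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, len(WEATHER_CLASSES))

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

def adapt_sensor(logits):
    # Pick the parameter preset for the most likely weather condition.
    cls = WEATHER_CLASSES[int(logits.argmax(dim=1)[0])]
    return cls, GATE_PRESETS[cls]

net = TinyWeatherNet()
frame = torch.randn(1, 3, 128, 128)        # stand-in camera frame
weather, preset = adapt_sensor(net(frame))
print(weather, preset)
```

In a real deployment the loop would run continuously, re-estimating the weather and re-tuning each sensor so that perception quality degrades gracefully as conditions change.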