Modelling, Simulation and Testing of automotive perception sensors

Sim4CAMSens is a CCAV-funded project working on methods to quantify and simulate camera, radar and LiDAR sensor performance under all conditions.

The Effect of Snow on LiDAR Performance


To navigate our surroundings – whether in a vehicle or otherwise – we use our senses to perceive the environment. For vehicles to operate with high levels of automation, we must substitute these senses with a new method of perceiving the world around us, and doing so has the potential to increase road safety by reducing the number of collisions [1]. To this end, automated vehicles are equipped with a sensor suite containing perception sensors such as cameras, radars and LiDARs (Light Detection and Ranging). The LiDAR sensor is a key component of this suite and is the focus of this blog. LiDAR sensors work by emitting beams of infrared light into the environment; each beam reflects off objects and returns to the sensor. The information contained in the returned beam can be used to calculate the distance to the object, which, combined with knowledge of the beam's emission angles, can be converted to a point in space. By emitting thousands of beams at different horizontal and vertical angles over a short period of time, these points can be combined to form a “point cloud” that gives an “image” of the surrounding environment (see Video [Cross-reference]).
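The geometry described above can be sketched in a few lines of Python. This is a minimal illustration of the time-of-flight and angle-to-point conversion, not the processing used in the project; the function names and angle conventions (azimuth in the horizontal plane, elevation from the horizontal) are our own assumptions.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_to_range(round_trip_time_s):
    """One-way range from the beam's round-trip time of flight:
    the light travels out and back, so distance = c * t / 2."""
    return C * round_trip_time_s / 2.0

def beam_to_point(range_m, azimuth_deg, elevation_deg):
    """Convert a single return (range + emission angles) to a Cartesian
    point via the usual spherical-to-Cartesian relations."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

# A return arriving ~66.7 ns after emission lies roughly 10 m away:
rng_m = tof_to_range(66.7e-9)
print(beam_to_point(rng_m, azimuth_deg=30.0, elevation_deg=-5.0))
```

Repeating this conversion for every beam fired during one sweep yields the point cloud described above.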

Video: Cross reference

To ensure automotive functions are reliable, it is important that LiDAR sensors can operate not only in sunny conditions with good visibility, but also in poor weather such as during a snowfall event. To investigate how LiDAR sensors perform when the weather turns frosty, the Sim4CAMSens project has collected and analysed camera, radar and LiDAR data during snowfall events. The methodology for data collection is presented in a separate Sim4CAMSens blog [2]. In this blog post we present and discuss our initial analysis of the LiDAR data, which was collected over 30 seconds in February 2024 during a heavy snowfall event.

Picture: View of the test site with the sensor housing in the foreground and the targets across the range on a cold frosty morning.

Effect on Range

Figure 1 shows the range values for a LiDAR sensor, averaged across 100 frames, displayed as a normal distribution over the averaged point cloud for clear weather and heavy snow. Both the mean and standard deviation of the range reduce in snowy weather compared with the clear-weather baseline. Since the environment has not changed, this is evidence that the maximum range of the LiDAR sensor reduces in snow and that more points are returned at shorter range than in clear weather. Video (cross-reference) shows the point cloud visually: many reflections occur from snowflakes, producing additional points within the point cloud. The reduction in mean range in heavy snow provides evidence that some object points are replaced by these “snow points” in each LiDAR point cloud. Snow data were collected using a disdrometer, and snowfall was classified as heavy if the equivalent rain rate exceeded 10 mm/h.
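The summary statistics behind a plot like Figure 1 are straightforward to compute once the per-frame ranges are pooled. The sketch below uses synthetic stand-in data: the distribution parameters are invented purely to mirror the qualitative trend (shorter mean range and smaller spread in snow), not the project's measurements.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for range values (metres) pooled across 100 point-cloud
# frames; loc/scale are illustrative, not measured values.
clear_ranges = rng.normal(loc=40.0, scale=12.0, size=100_000)
snow_ranges = rng.normal(loc=30.0, scale=9.0, size=100_000)

# Mean and standard deviation parameterise the fitted normal curves.
for label, ranges in (("clear", clear_ranges), ("heavy snow", snow_ranges)):
    print(f"{label}: mean = {ranges.mean():.1f} m, std = {ranges.std():.1f} m")
```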

Figure 1: Average range values across 100 point cloud frames displayed as a normal distribution of the points in the averaged point cloud.

Effect on Intensity

Figure 2 shows the intensity values for a LiDAR sensor, averaged across 100 frames, displayed as a normal distribution over the averaged point cloud for clear weather and heavy snow. The mean and standard deviation of the intensity values reduce during heavy snow, suggesting that snow points return lower intensity than the object points they replace, that object points return a lower intensity because of the snowfall, or a combination of the two. The lower mean intensity suggests that returns from the environment are weaker, and the lower standard deviation suggests less variation in intensity across the point cloud. The former could reduce the probability of detecting low-reflectivity objects, and the latter could make object classification using LiDAR more difficult.
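The practical consequence of a weaker intensity distribution can be illustrated with a simple detection threshold: fewer returns clear the threshold when the mean intensity drops. The numbers below are invented for illustration; the 0–255 intensity scale and the threshold value are our assumptions, not the project's data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical intensity distributions (arbitrary 0-255 units) matching
# the qualitative trend in Figure 2: lower mean and lower spread in snow.
clear_i = rng.normal(120.0, 40.0, 100_000)
snow_i = rng.normal(80.0, 25.0, 100_000)

threshold = 60.0  # illustrative minimum intensity for a valid detection
for label, intensities in (("clear", clear_i), ("heavy snow", snow_i)):
    frac = (intensities > threshold).mean()
    print(f"{label}: {frac:.0%} of returns above threshold")
```

Under these made-up parameters, the snow case loses a noticeably larger share of returns to the threshold, which is the mechanism behind the reduced detection probability for low-reflectivity objects mentioned above.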

Figure 2: Average intensity values across 100 point cloud frames displayed as a normal distribution of the points in the averaged point cloud.

Effect of Wavelength

When it comes to LiDARs operating in snowy conditions, the choice of wavelength could play a crucial role in performance. Our recent field experiments with 905 nm and 1550 nm systems during snowfall reveal notable differences in their behaviour. Since water absorbs light at 905 nm less strongly than at 1550 nm, wavelength has the potential to affect the range and intensity distributions in heavy snowfall, rain and fog [3, 4]. Figure 3 and Figure 4 compare the distributions for 905 nm and 1550 nm LiDARs in heavy snow and clear weather. Our findings appear to show a stronger effect from the heavy snow on the 1550 nm LiDAR than on the 905 nm LiDAR. These findings highlight the importance of considering environmental factors, particularly snowfall, when designing and selecting LiDAR devices.
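The wavelength dependence follows from the absorption spectrum of water [3]. As a rough order-of-magnitude sketch, the Beer–Lambert law T = exp(−αd) gives the fraction of light transmitted through a water path of length d. The absorption coefficients below are approximate literature values for liquid water, and the water path length is purely illustrative:

```python
import math

# Approximate absorption coefficients of liquid water in 1/m
# (order of magnitude only; see [3] for the full spectrum).
ALPHA = {"905 nm": 7.0, "1550 nm": 1_000.0}

d = 1e-4  # illustrative water path of 0.1 mm, e.g. through snowflakes

for wavelength, alpha in ALPHA.items():
    # Beer-Lambert law: fraction transmitted through an absorbing medium.
    transmission = math.exp(-alpha * d)
    print(f"{wavelength}: {transmission:.1%} transmitted")
```

Even over such a short water path, the 1550 nm beam loses a far larger fraction of its energy than the 905 nm beam, consistent with the stronger snow effect we observed for the 1550 nm LiDAR.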

Figure 3: Comparison of effect of heavy snow on range for (left) 905nm and (right) 1550nm LiDARs.

Figure 4: Comparison of the effect of heavy snow on intensity for (left) 905nm and (right) 1550nm LiDARs.


Conclusions

The results presented demonstrate a clear effect of snowfall on the performance of the LiDAR sensor. Detectable range decreases, and more points are observed closer to the sensor as a result of reflections from snowflakes. The intensity of the returns reduces, with a lower mean and less variation in values. The impact appears to be larger on 1550 nm LiDARs than on 905 nm LiDARs. This provides evidence that the effect of snowfall on LiDAR performance must be investigated further and considered during the design and implementation of sensing and perception systems.


References

[1] The Royal Society for the Prevention of Accidents, “Road Safety Factsheet,” 2021. [Online]. [Accessed 4th July 2024]


[2] Sim4CAMSens, “Modelling, Simulation and Testing of automotive perception sensors,” [Online]. Available: [Accessed 4th July 2024]


[3] M. Chaplin, “Water Absorption Spectrum,” 2000. [Online]. Available: [Accessed 4th July 2024]


[4] J. Wojtanowski, M. Zygmunt, M. Kaszczuk, Z. Mierczyk and M. Muzal, “Comparison of 905 nm and 1550 nm semiconductor laser rangefinders' performance deterioration due to adverse environmental conditions,” Opto-Electronics Review, June 2014.


Please get in touch if you have any questions or a topic in mind that you would like us to write about. You can submit your questions / topics via: Tech Blog Questions / Topic Suggestion
