ADAS optics for safety

ADAS Optics – Not Just LiDAR

By Olov von Hofsten
I have previously written some articles about LiDAR. It is what I have worked with the most, and it is an interesting technology. But ADAS (Advanced Driver Assistance Systems) is not just LiDAR. In fact, LiDAR has not yet had the impact on ADAS that I expected (although I still think it will be a key sensor in the future, and there are clear signs of this). So, in fairness, I thought it would be good to discuss some other parts of ADAS with interesting optics.

CAMERAS

Cameras are still the backbone of ADAS. A new car today at ADAS level L2+ (roughly lane-centering and lane-changing functionality) uses 8 to 12 cameras: surround-view, front-facing long-range, rear-view, side mirrors, and sometimes a separate camera for traffic sign recognition.

The optical requirements for these cameras vary enormously. They are usually driven by Euro NCAP ratings, which now test AEB at highway speeds [1]. For the system to have enough time to detect, react and brake, this translates to pedestrian detection at over 100 meters, which demands high MTF out to the edge of the field. A surround-view camera needs a field of view of 190° or more, where distortion management becomes tricky. And a rear-view camera for automated parking needs excellent low-light performance with minimal motion blur. There is also a need for high dynamic range and stray light suppression.
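To get a feel for what "detection at over 100 meters" implies for the lens, here is a back-of-the-envelope sketch. All the numbers (pedestrian height, pixels needed by the detector, pixel pitch) are illustrative assumptions, not requirements from any specific program:

```python
def focal_length_for_detection(target_size_m, range_m, pixels_on_target, pixel_pitch_um):
    """Focal length needed so a target spans a given number of pixels at range.

    Small-angle approximation: angle subtended = size / range.
    """
    angle_rad = target_size_m / range_m       # angle subtended by the target
    ifov_rad = angle_rad / pixels_on_target   # required angle per pixel
    return (pixel_pitch_um * 1e-3) / ifov_rad  # focal length in mm

# Illustrative numbers: 1.7 m tall pedestrian at 100 m, 40 pixels on target,
# 2.1 um pixel pitch (a common automotive sensor pitch)
f = focal_length_for_detection(1.7, 100.0, 40, 2.1)
print(f"Required focal length: {f:.1f} mm")
```

With these assumptions the lens needs roughly a 5 mm focal length; the more pixels the detection algorithm needs on the target, the longer the lens and the narrower the field, which is exactly why the front long-range camera is a separate unit.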

What makes automotive camera optics particularly difficult is the environment. The temperature range alone — from -40°C to +85°C, sometimes up to +105°C near the windshield — creates significant challenges for athermalization (since the cameras are fixed focus). A lens system that delivers sharp images at room temperature can shift focus by hundreds of micrometres at the thermal extremes, making material selection and housing design just as important as the optical prescription itself. Add to this vibration, humidity, contamination from road spray and various chemical solvents (if the camera is not behind the windshield), and the need to meet automotive qualification requirements, and you start to understand why automotive camera lens design is a speciality in itself. At Eclipse we have worked with this since 2015, see e.g. Magna.
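The thermal focus shift can be estimated with a simple thin-lens model: the focal length drifts via the thermo-optic coefficient dn/dT and the expansion of the element, while the housing expansion partly compensates. The material values below (a polymer singlet in an aluminium barrel) are illustrative assumptions, not data from a real design:

```python
def focus_shift_um(f_mm, dT_K, dn_dT, n, alpha_lens, alpha_housing):
    """Approximate thermal defocus of a fixed-focus singlet in a metal barrel."""
    # Change in lens focal length: radii expansion plus thermo-optic term
    df_lens = f_mm * dT_K * (alpha_lens - dn_dT / (n - 1.0))
    # The housing grows too, moving the image plane with it
    d_housing = f_mm * dT_K * alpha_housing
    return (df_lens - d_housing) * 1000.0  # mm -> um

# Illustrative: 6 mm polymer lens (dn/dT = -1e-4/K, n = 1.53, alpha = 60 ppm/K)
# in an aluminium barrel (alpha = 23 ppm/K), +60 K temperature excursion
shift = focus_shift_um(6.0, 60.0, dn_dT=-1.0e-4, n=1.53,
                       alpha_lens=60e-6, alpha_housing=23e-6)
print(f"Defocus: {shift:.0f} um")
```

Even this single polymer element drifts by ~80 µm over a 60 K excursion, far more than a typical depth of focus, which is why athermalization drives the glass/polymer mix and the barrel material choice.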

IMAGE QUALITY TRADEOFFS

Working with automotive optics, it is clear that many things need to be considered when it comes to camera image quality. Some things are non-negotiable; we are, after all, dealing with lives. Others are legitimate trade-offs: for example, spending some more time optimizing a 6-lens design instead of using a 7-lens design, and athermalizing with as few glass lenses as possible, can save a huge amount of cost without compromising quality.

It is also important to know the limits of optical design. It may not be possible to meet all the requirements of large field of view, high resolution and low f-number at the same time. Knowing these limitations is important and saves time and cost.
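One such hard limit is diffraction. As a quick sanity check (with an assumed 2.1 µm pixel and 550 nm light), the f-number at which the diffraction cutoff frequency drops to the sensor's Nyquist frequency marks where stopping down further can no longer be rescued by better lens design:

```python
def max_f_number(pixel_pitch_um, wavelength_um=0.55):
    """F-number at which the diffraction cutoff (1 / (lambda * F#)) equals
    the sensor Nyquist frequency (1 / (2 * pitch)). Above this, the lens
    cannot deliver contrast at the pixel level no matter how good it is."""
    return 2.0 * pixel_pitch_um / wavelength_um

# Illustrative: 2.1 um pixels, 550 nm light
fmax = max_f_number(2.1)
print(f"Diffraction-limited ceiling: f/{fmax:.1f}")
```

So a 2.1 µm sensor leaves headroom down to about f/7.6, but a smaller pixel pitch quickly forces faster optics, which in turn fights the wide-field and cost requirements.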

Another huge challenge is the wide dynamic range that a camera needs. Driving from a sunny day into a tunnel and failing to see a broken-down car can be lethal. High dynamic range and good low-light sensitivity (where Euro NCAP tests are performed) pull the design in opposite directions, so this is a clear trade-off. Stray light management for scratched windshields, or multilayered windshields for HUD, also places difficult demands on camera image quality.
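A rough sense of the numbers, using illustrative luminance values for a sunlit road surface and a dark tunnel interior (automotive HDR sensors are often specified well above this, on the order of 120–140 dB, to also cover intra-scene contrast such as headlights at night):

```python
import math

def scene_dynamic_range_db(l_max_cd_m2, l_min_cd_m2):
    """Dynamic range between brightest and darkest scene patches, in dB
    (20*log10, the convention used for image sensor dynamic range)."""
    return 20.0 * math.log10(l_max_cd_m2 / l_min_cd_m2)

# Illustrative: sunlit road ~1e4 cd/m2 vs shadowed tunnel interior ~0.1 cd/m2
dr = scene_dynamic_range_db(1e4, 0.1)
print(f"Scene dynamic range: {dr:.0f} dB")
```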

Optimizing the ISP pipeline is equally important, and at Eclipse we have experience in this area as well. The right choices vary with the application, and it can be very valuable to simulate them. Let us know if you want to know more about this.

IN-CABIN SENSING

Perhaps the fastest-growing area of automotive optics right now is not outside the car but inside it. The EU General Safety Regulation (GSR2) [2] is making advanced driver monitoring systems (DMS) mandatory in all new vehicle types; from 2026, all new vehicles will need camera-based eye tracking (ADDW – advanced driver distraction warning). This means every new car sold in Europe will need an optical system that can track the driver's gaze, head position, and drowsiness level. The US does not yet have a firm timeline, but OEMs are still implementing it for European compliance (and for China, where it is already required).
The typical DMS uses a 940 nm NIR illumination source (usually VCSELs) and a camera with a bandpass filter matched to that wavelength. The optical design challenges here are different from exterior cameras:

    • Field of view: Needs to cover the full range of driver head positions, typically 80° to 110° diagonal.
    • Working distance: Short. Roughly 500 mm to 1,000 mm from camera to driver’s face.
    • Illumination uniformity: Critical. The VCSEL illuminator needs to provide homogenous NIR illumination across the driver’s face without hot spots that could affect the classification algorithm.
    • Eye safety: Operating at 940 nm close to human eyes means strict compliance with IEC 62471.
    • Sunlight rejection: The bandpass filter must reject ambient sunlight outside the VCSEL wavelength range, while transmitting maximal signal from the VCSEL source over temperature, angles and for the range of other tolerances.
    • Sensor QE: Traditionally low at 940 nm, but in recent years there has been a substantial improvement in efficiency, from ~12% to ~36%. These improvements come from thicker silicon in the photodiodes, back-side illumination and reflecting layers that increase the optical path length [3].
    • Global shutter: Critical for pupil movement detection and many other safety-critical features. Adds cost and limits other sensor features.
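The sunlight-rejection bullet hides a classic design tension: an interference bandpass filter blue-shifts with angle of incidence, so a wide field of view forces a wider passband and therefore more ambient light on the sensor. A quick sketch of the standard shift formula, where the effective index n_eff ≈ 2.0 is an assumed, typical value for hard-coated NIR filters:

```python
import math

def filter_center_shift_nm(lambda0_nm, aoi_deg, n_eff=2.0):
    """Blue-shift of an interference bandpass filter's centre wavelength with
    angle of incidence: lambda(theta) = lambda0 * sqrt(1 - (sin(theta)/n_eff)**2)."""
    s = math.sin(math.radians(aoi_deg)) / n_eff
    return lambda0_nm * (1.0 - math.sqrt(1.0 - s * s))

# Illustrative: 940 nm filter, 25 deg chief-ray angle at the edge of the field
shift = filter_center_shift_nm(940.0, 25.0)
print(f"Centre wavelength blue-shift: {shift:.1f} nm")
```

A ~20 nm shift at the field edge means the passband must be widened by at least that much (on top of VCSEL wavelength drift over temperature) just to keep the signal, which directly reduces how much sunlight the filter can reject.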

Monitoring does not only apply to the driver. Occupant monitoring systems (OMS) extend the concept to the full cabin — detecting whether a child has been left in a rear seat, for example. This requires wider-angle optics, sometimes multiple cameras, and for some use cases, time-of-flight depth sensing.
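For the time-of-flight option, a common choice in cabins is indirect (phase-measuring) ToF, where distance is recovered from the phase shift of an amplitude-modulated NIR signal. A minimal sketch of the distance and ambiguity-range relations, with an illustrative 20 MHz modulation frequency:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def itof_distance_m(phase_rad, f_mod_hz):
    """Indirect ToF: distance from the measured phase shift of an
    amplitude-modulated illumination signal."""
    return C * phase_rad / (4.0 * math.pi * f_mod_hz)

# The measurement is unambiguous only up to c / (2 * f_mod).
# Illustrative: 20 MHz modulation gives plenty of range for a cabin.
amb = C / (2.0 * 20e6)
print(f"Ambiguity range: {amb:.2f} m")
```

A lower modulation frequency extends the unambiguous range but degrades depth resolution, so cabin-sized ranges allow comparatively relaxed modulation frequencies.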

An interesting point is that these techniques have been around for a long time: structured light, depth from defocus and stereo vision. The application is new, but the real challenge is meeting all the requirements that come with the automotive environment (see above).

AR HEAD-UP DISPLAYS

Another area that is rapidly maturing is the augmented reality head-up display (AR HUD). Having worked with this, I find it really exciting! The "standard" projection distance of about 2 m is increasing to 10 m (matching our HUD project 😊), and overlaid augmented information is coming (which is the part that makes this an ADAS component and not just a cool gimmick).

The optical challenge is considerable. You need a wide field of view, a large eye box so the image remains visible regardless of driver height, and a virtual image distance that matches the road geometry. All of this while dealing with a curved, non-uniform windshield that introduces astigmatism and distortion. There is also the difficult and unusual concept of dipvergence: the difference in height of the projected image as seen by the two eyes. If this is slightly off, we notice it and get annoyed. Augmented information also requires pupil tracking so the projection can be overlaid on reality, but this is already in place for ADDW.
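Dipvergence is easy to quantify with a small-angle approximation. The 10 mm offset below is an illustrative number, not a spec; published comfort limits for vertical disparity are typically on the order of a milliradian or less, which shows how tight the alignment budget is:

```python
def dipvergence_mrad(height_mismatch_mm, virtual_image_distance_m):
    """Vertical disparity (dipvergence) between the two eyes' views of the
    virtual image, small-angle approximation: angle = offset / distance."""
    return (height_mismatch_mm / 1000.0) / virtual_image_distance_m * 1000.0

# Illustrative: 10 mm vertical offset of the virtual image across the eye box,
# at a 10 m virtual image distance
d = dipvergence_mrad(10.0, 10.0)
print(f"Dipvergence: {d:.1f} mrad")
```

Note that moving the virtual image from 2 m out to 10 m relaxes the tolerance on a given height mismatch fivefold, one of the quieter benefits of longer projection distances.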

The optical solution involves conic surfaces and sometimes additional freeform design parameters to meet the requirements above. These can be difficult to tolerance, and especially to measure and manufacture. It is important to know what can and cannot be done, and to know the right manufacturers.
Hopefully we will soon experience the HUD that we have seen for years in various PowerPoints: road-overlaid information! Thank you, optics.
Read about our HUD here.

Figure 1: Typical PowerPoint presentation of an automotive HUD: Soon in your car?

FUTURE DIRECTION

Optics is getting increasingly advanced in our cars. The number of optical sensors per vehicle is increasing, and the optical complexity of each sensor is growing. Soon, solid-state LiDAR will be standard, which will reduce the mechanical complexity on the LiDAR side. For cameras, HUDs, and in-cabin sensing, we will continue to see development.

For optical designers, this means automotive is becoming one of the most demanding and rewarding application areas. The volumes drive manufacturing innovation, the safety requirements drive quality standards, and the multi-sensor architecture drives system-level optical thinking. The industry has seen some problems in recent years compared to the heyday of the early 2020s, but innovation must still be maintained.

At Eclipse, we welcome this development – since we love optics! And even if projects start with a small design, they can quickly expand into thermal analysis, stray light simulation, tolerance stack-ups for the full sensor module, and image quality optimization.

If you are working on an ADAS optical system and want to discuss your design challenges — whether it is a camera module, an illumination system, a DMS, or a full sensor cluster — feel free to reach out. We enjoy these problems!

References

  1. Euro NCAP Test protocols: https://www.euroncap.com/protocols/
  2. EU Delegated Regulation 2023/2590 (ADDW technical requirements), https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=PI_COM:Ares(2021)1075107
  3. Nyxel NIR (Near Infrared) Pixel Technology: https://www.ovt.com/technologies/nyxel-technology/