Imec demonstrates a camera-integrated solution that allows Si-based CMOS sensors to detect short-wave infrared wavelengths that are normally out of reach for them because of the physics of silicon. This could open the way to extended functionality in augmented reality, machine vision, autonomous vehicles and more.
Thanks to a cross-departmental effort involving material development, semiconductor processing, system-level design and more, imec has realized important breakthroughs in the ability of silicon-based CMOS imagers to detect short-wave infrared (SWIR) wavelengths above one micrometer. Such wavelengths (e.g. the 1450nm and 1550nm bands) are important for applications such as computer vision in mobile devices. Yet, due to fundamental material limitations, these wavelengths are normally invisible to Si-based devices. Conventional approaches using III-V materials (e.g. InGaAs) can overcome this detection barrier, but are not available at an acceptable price point for consumer devices. Thanks to thin-film photodetector (TFPD) technology, imec has now developed an end-to-end solution that enables Si-based infrared CMOS sensors at the price level of conventional CMOS imagers. Initial results have been demonstrated up to the system level via a camera-integrated solution. These results are part of an overall effort that has been running for several years and involves multiple research teams from various imec departments and locations, as well as its partners. Pawel Malinowski, program manager user interfaces & imagers at imec, Pierre Boulenc, team leader pixel devices at imec, and David Cheyns, team leader thin-film technologies at imec, give more details on the research, its results and future developments.
Your smartphone camera is also a projector
Before diving into the technological details, let’s first take a short high-level side-step to how your smartphone camera works. What is obvious is that it can detect and (thanks to the flashlight) emit visible light.
What might be less obvious to some is that it is also active in the non-visible light spectrum, not only as a detector, but also as a projector. For applications such as face recognition, your smartphone camera emits a grid of infrared dots and captures the reflections coming from your face. For those who want proof of these IR capabilities: aim your (TV) remote at your smartphone camera while pressing one of its buttons. You should see the camera detecting light that is not visible to your eyes. This is the infrared signal from your remote, and it also gives you a small trick to check whether your remote’s batteries are dead…
Typically, cameras that unlock your smartphone via face recognition are tuned to the near-infrared (NIR) spectrum, more specifically the 940nm wavelength. This light is absorbed by water, for example in the earth’s atmosphere, which means you get very little background noise or interference from sunlight. Even though the human eye cannot consciously ‘see’ 940nm NIR, it is still sensitive to this wavelength. This imposes limits on the power that can be used to emit these signals, and therefore on the useful application distance and on efficiency in broad daylight. Face recognition works at maximum arm’s length; more advanced applications such as 3D scanning of objects and spaces do not function optimally at larger distances.
Conventional Silicon CMOS is blind above 1 micrometer
For these scanning and sensing applications, a more suitable operational range is the short-wave infrared (SWIR) spectrum. Here, the 1450nm band offers similar advantages due to water absorption, yet fewer power restrictions, because the human eye is several orders of magnitude less sensitive at this wavelength. The SWIR spectrum also contains the 1550nm band. In complete contrast to 940nm and 1450nm, water is transparent at this wavelength, which in turn allows you to look through mist, clouds, smoke and water vapor. These wavelengths make it possible to design devices that reach longer distances and have increased sensing capabilities, for example in (autonomous) vehicles flying through clouds or driving in bad weather conditions.
Unfortunately for image-sensor and application developers, silicon photodiodes cannot detect light with wavelengths above one micrometer: photons in this spectrum carry less energy than silicon’s bandgap, so the material is transparent to them. Because of this physical limitation, conventional CMOS imagers are blind to SWIR radiation.
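This cutoff can be checked with a back-of-the-envelope calculation. A minimal sketch, using textbook values (the ~1.12 eV room-temperature bandgap of silicon and the photon-energy relation E = hc/λ) rather than figures from this article:

```python
# A photon can only excite an electron across silicon's bandgap
# (~1.12 eV at room temperature) if its energy exceeds that gap.
# Using E[eV] = h*c / lambda = 1239.84 eV*nm / lambda[nm]:

HC_EV_NM = 1239.84        # h*c expressed in eV*nm
SI_BANDGAP_EV = 1.12      # silicon bandgap at room temperature

cutoff_nm = HC_EV_NM / SI_BANDGAP_EV
print(f"Si cutoff wavelength: {cutoff_nm:.0f} nm")  # ~1107 nm, i.e. ~1.1 um

for wavelength_nm in (940, 1450, 1550):
    energy_ev = HC_EV_NM / wavelength_nm
    detectable = energy_ev > SI_BANDGAP_EV
    print(f"{wavelength_nm} nm -> {energy_ev:.2f} eV, "
          f"detectable by Si: {detectable}")
```

The 940nm NIR band used for face recognition sits just above silicon’s bandgap and is detectable; the 1450nm and 1550nm SWIR bands fall below it, which is exactly why a different absorber material is needed.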
One well-known way to solve this is to use semiconductors in which electrons can be excited by lower-energy photons, for example sensors based on InGaAs or other III-V materials. While these technologies are already quite mature, they lack the production throughput needed for consumer applications, and their system integration is not straightforward. This makes them too expensive for mass manufacturing.
Thin films enable affordable detection of 1450nm and 1550nm short-wave infrared
Imec has therefore come up with a solution that does allow CMOS-based SWIR detection, at the cost level of silicon processing. The key enabler is the thin-film photodetector (TFPD): a multilayer stack of a few hundred nanometers overall thickness, with one of the layers sensitive to IR. By post-processing these stacks onto a Si-CMOS readout circuit, imec combines the best of both worlds: infrared detection via a CMOS-compatible process flow.
Regarding materials suitable for this thin film, imec is investigating multiple options, ranging from polymer- and small-molecule organic materials to inorganic colloidal quantum-dot layers. The latter are currently the most promising because of the tunable, low-energy bandgap inherent to quantum dots. So far, imec has built most of its successful prototypes and demonstrators with PbS quantum-dot materials. The quantities of lead used remain well within the legal restrictions of the EU RoHS directive and other regulations, making them suitable for production. Completely lead-free alternatives are on imec’s roadmap and are being investigated as well.
Progressive application roadmap
Schematic illustration of the design choices targeting imec’s progressive application roadmap. Left: basic IR-detector; Middle: IR detection integrated in visible-light imager; Right: multispectral IR detection thanks to tunable TFPD layers.
In view of future applications and product integration, imec follows a stepwise approach. First, monochrome infrared imagers based on a single TFPD stack are created and integrated as a separate die/functionality at the system level. This first implementation is the simplest, as it uses a plain, unpatterned layer of the thin-film photodetector stack. In this scenario, all pixels have the same absorption spectrum, unless specific filters are used. A potential application is wavelength extension of the face scanner in smartphone cameras, allowing a move to the 1450nm spectrum without adding too much cost or complexity at the system level. Especially for augmented reality, this could become a valuable option to enable room-size scanning and applications.
In a second implementation, imec targets TFPD stacks monolithically integrated into the RGB pixel composition on the CMOS imager itself. In this design, an infrared subpixel can be added next to the conventional red, green and blue photodiodes. This means a separate sensor for IR detection would not be required, reducing both the system footprint and power consumption. It would also add an extra layer of information to visible-light cameras. Think, for example, of very affordable cameras with depth-sensing capability.
The third implementation builds further on the monolithic pixelated design concept, combining multiple TFPD stacks with different active materials. Such a configuration would enable pixel-level multispectral sensors in the NIR and SWIR ranges, at a very compact form factor and a price point in the range of silicon image sensors. Application potential lies in autonomous vehicles needing long-range scanning capabilities (enabled by 1450nm-sensitive TFPDs) as well as visibility in bad weather or low-light conditions (enabled by 1550nm-sensitive TFPDs). Another example is material sorting, where tuning pixels to characteristic wavelengths would add material-determination capabilities (e.g. discrimination of vegetation vs. buildings, or real vs. fake plants).
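To make the material-sorting idea concrete, here is a hypothetical sketch of how two SWIR subpixel readings could be combined into a simple discriminator. The band choice follows the article (water-rich materials such as living plants absorb strongly around 1450nm), but the function, threshold and sample values are illustrative assumptions, not imec’s actual processing:

```python
# Hypothetical material discrimination from two SWIR subpixel responses.
# Assumes each pixel reports a response in a 1450 nm band and a 1550 nm
# band; all numeric values below are made up for illustration.

def normalized_difference(band_a: float, band_b: float) -> float:
    """Classic normalized-difference index, bounded in [-1, 1]."""
    return (band_a - band_b) / (band_a + band_b)

# Water-rich materials (e.g. a living plant) absorb strongly at 1450 nm,
# so their 1450 nm response drops relative to 1550 nm; a dry plastic
# "fake plant" reflects both bands similarly.
pixels = {
    "real plant": (0.12, 0.55),   # (response_1450, response_1550)
    "fake plant": (0.48, 0.52),
}

for label, (r1450, r1550) in pixels.items():
    index = normalized_difference(r1550, r1450)
    print(f"{label}: water-absorption index = {index:+.2f}")
```

A high index flags strong 1450nm absorption relative to 1550nm, i.e. a water-containing material; real multispectral pipelines would of course calibrate per-band gain and illumination first.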
Camera demonstrates expertise from processing up to system level
With this research, imec is pioneering at the intersection of the IR and imager worlds: two domains that, in terms of conferences and publications, currently mingle only rarely. Moreover, to bring these designs and theoretical concepts to industry-compatible technology, imec’s expertise spans the entire spectrum from materials design up to system integration.
For the first concept – the monochromatic IR sensor – imec has successfully built a complete end-to-end prototype integrated into a camera. The flow starts with 200mm readout-circuit (ROIC) wafer processing in the foundry. Post-processing and TFPD integration (at die or wafer level) were executed in the imec fab, as were chip packaging and buildup of the camera module. For the two monolithic designs, which are still at an earlier stage of research, the ambitions and roadmap are similar.
Ready for transfer to industry and R&D partnerships
The current status of the technology is the accumulation of several years of R&D supported by various teams and partners within and outside imec. For example, part of the research was executed with the support of the VLAIO SBO project “MIRIS”, and developments happen in close partnership with the image-sensor industry.
Want to know more?
- On this webpage you can find more information about this research.
- Read the press release “Imec reports monolithic thin-film image sensor for the SWIR range with record pixel density” via the direct link or via this page.
Paweł E. Malinowski was born in Lodz, Poland. He received the M.Sc.Eng. degree in electronics and telecommunications (thesis on the design of radiation-tolerant integrated circuits) from the Lodz University of Technology, Poland, in 2006. In 2011, he received the Ph.D. degree in electrical engineering from KU Leuven, Belgium (thesis on III-nitride-based imagers for space applications). Paweł has been working at imec since 2005, and in the Large Area Electronics Department since 2011. Currently, he is Program Manager “User Interfaces & Imagers” and focuses on thin-film image sensors and high-resolution OLED displays. He coordinates activities related to applications of thin-film electronics technology in future devices, prepares go-to-market scenarios and sets the technology roadmaps. Paweł has coauthored more than 40 publications and submitted 4 patent applications. He was a recipient of the International Display Workshops Best Paper Award in 2014 and the SID Display Week Distinguished Paper Award in 2018. Paweł is a member of IEEE, SID and SPIE and has served on the ODI committee for IEDM since 2018.
David Cheyns received his master’s and Ph.D. degrees in electrical engineering in 2003 and 2008, respectively, from the Katholieke Universiteit Leuven, Belgium. He has over 15 years of experience in thin-film, large-area technologies, and has co-authored over 60 publications and 5 patents. He was a co-organizer of SPIE Europe and eMRS, and a member of the scientific advisory board of EU PVSEC. He has supervised more than 20 master’s and PhD students, and is an assessor of 5 PhD topics. He pioneered the work on organic tandem solar cells, perovskite materials, OLEDs and thin-film photovoltaic modules at imec. Presently, he is principal scientist and team leader “Future Interactive Thin-film Technology”. In this role, he is responsible for the development of next-generation large-area sensors and actuators for applications such as infrared imaging, gesture recognition, medical imaging, haptic feedback and microfluidics.
Pierre Boulenc was born in Grenoble, France, on September 20, 1978. He received a Master’s degree in solid-state physics from Université Paris VII (2003) and a PhD degree in materials science from the Université des Sciences et Technologies de Lille (2007), France. From 2006 to 2013, he was responsible for TCAD simulations for CMOS image sensors and bipolar transistors, as well as process and device modeling calibration, at STMicroelectronics Crolles (France). In 2013, Pierre joined imec in Leuven (Belgium) as an R&D engineer. His work has focused on CCD-in-CMOS pixel design and test, as well as the exploration of novel pixel architectures by means of TCAD simulations. He now leads the Pixel Devices team at imec.
18 October 2019