Given the limitations of the human eye, sometimes applications require more advanced imaging. For example, an autonomous vehicle needs as much visual information about its surroundings as possible to assist in avoiding a collision.

One way to see beyond the human eye's capabilities is polarimetric imaging, which considers the polarization of light in every pixel of an image to create a fuller view of an object than a traditional camera can capture.

To understand polarized light, it's first essential to understand that light is made of electromagnetic waves, and ordinary light contains waves oscillating in many geometric planes. Polarized light, which can be isolated by restricting light through such means as a polarizing filter on a camera or polarized sunglasses, either oscillates in a single plane, as linearly polarized light, or rotates clockwise or counterclockwise as it travels, as circularly or elliptically polarized light.
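For readers who want a more concrete picture, the polarization state of light at each pixel is commonly summarized by the four Stokes parameters, which is what "full-Stokes" imaging, mentioned below, refers to. The following is a minimal sketch, not part of the researchers' work, showing how the Stokes parameters could be computed from six hypothetical intensity measurements taken through ideal polarization filters.

```python
import numpy as np

def stokes_from_intensities(i_0, i_90, i_45, i_135, i_rcp, i_lcp):
    """Compute per-pixel Stokes parameters from six intensity measurements.

    i_0, i_90, i_45, i_135 : intensities behind linear polarizers oriented
        at 0, 90, 45 and 135 degrees (hypothetical inputs for illustration).
    i_rcp, i_lcp : intensities behind right- and left-handed circular
        polarizers.
    """
    s0 = i_0 + i_90      # total intensity
    s1 = i_0 - i_90      # horizontal vs. vertical linear polarization
    s2 = i_45 - i_135    # +45 vs. -45 degree linear polarization
    s3 = i_rcp - i_lcp   # right- vs. left-handed circular polarization
    return np.stack([s0, s1, s2, s3], axis=0)

# Example: a single fully right-circularly polarized "pixel"
pixel = stokes_from_intensities(
    i_0=np.array(0.5), i_90=np.array(0.5),
    i_45=np.array(0.5), i_135=np.array(0.5),
    i_rcp=np.array(1.0), i_lcp=np.array(0.0),
)
print(pixel)  # [1. 0. 0. 1.]
```

A sensor that recovers all four of these parameters at every pixel captures the complete polarization state, including circular polarization that simpler polarization cameras miss.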

Polarimetric imaging can reveal information a normal photo or the human eye may miss by providing additional views that increase the contrast of fine details, for example exposing cracks in manufactured building materials before they are used in construction and cause a catastrophic collapse.

However, conventional polarimetric imaging systems are typically bulky and prone to inaccuracy because of how they are built. They rely on many moving parts that slow the imaging process, which introduces errors when objects in the sensor's view move.

Yu Yao, an associate professor of electrical engineering in the Ira A. Fulton Schools of Engineering at Arizona State University, sought to solve these issues through years of research, culminating in the invention of a chip-integrated metasurface full-Stokes polarimetric imaging sensor, which she and her team have dubbed MetPolarIm.
