Metasurface-based augmented reality headset equipped with an eye movement monitoring system

In this paper, a metasurface-based waveguide display equipped with an eye movement monitoring system is presented. In the suggested device, the functions of the eye movement monitoring system and the AR display are completely independent of each other and are implemented in two separate sections operating at wavelengths of 775 nm and 635 nm, respectively. Next, in order to investigate the effects of the waveguide shape on FOV and efficiency, a multifunctional display system comprising a single rectangular waveguide with two polarization-sensitive channels is designed to operate as an AR display and an eye movement monitoring system simultaneously at visible and IR wavelengths, respectively. In both devices, eye movements can be monitored over the range of −24° to 24°, and the digital images are repositioned in the user's FOV according to the horizontal eye position. Although the first suggested system is heavier than the second, its FOV is more than twice that of the second system. The results indicate that metasurface-based waveguide technology can be considered an appropriate platform for developing wearable eye movement monitoring systems.


Introduction
An augmented reality head-mounted display offers a novel experience in which the real-world environment is modified through the superimposition of virtual-world information (audio, video, and text). In augmented reality, part of the information perceived by the user exists in the real world, while another part is created by the computer. Nowadays, AR technology is considered a key asset in myriad applications in people's lives. Education and training, medical assistance, entertainment, the tourism industry, and repair and maintenance are just some examples of applications that have been explored with AR technology (Desselle et al. 2020; Surale et al. 2014; Gerup et al. 2020). To improve the quality of AR HMDs, different approaches have been suggested, including waveguide displays, freeform technology, reflective systems, retinal projection, and near-eye projection (Hua and Huang 2020; Jeong et al. 2020; Wei et al. 2019; Chen et al. 2017; Cheng et al. 2014; Utsugi et al. 2020; Zhang and Fang 2019). Significant progress has been made in recent years to promote and improve AR technology. However, a more immersive sense, smoother interaction with the digital world, and a comfortable experience are still among the main hurdles that need to be overcome. Eye tracking is an emerging technology that has largely overcome these hurdles. In contrast to older AR systems, in which interaction and rendering were based on the user's head pose, in newer versions the eye-tracking system aligns the visual content with the real world according to the movements of the pupil (Beach et al. 1998; Chamberlain 2007).
Several approaches are used in designing eye-tracking systems. In the first approach, pupil movements are measured with contact lenses (Khaldi et al. 2020). In the second method, two electrodes are attached around the eye, and the difference in electrical potential caused by the movements of the cornea and pupil produces an electrical signal called the electro-oculogram (EOG); by analyzing the EOG, pupil movements can be detected (Zao et al. 2016). The third technique is the video-based eye tracker, in which a camera records the movements of the pupil while the eye is viewing images. In this method, repeated irradiation of the eye, mainly with infrared rays, produces successive reflections from the pupil. After the information extracted from the changes in these successive reflections is analyzed by the processors, the pupil movements are predicted. This type of tracking system is commonly used to measure eye gaze. High safety, low price, and ease of use in many applications are the advantages of this optical method, which have made it the standard eye-tracking method among those mentioned (Hua et al. 2006; Zdarsky et al. 2021).
The idea of integrating eye-tracker systems and HMDs was introduced many years ago. In principle, these techniques can be divided into two general categories: the functional integration and the full integration approaches (Zdarsky et al. 2021; Cognolato et al. 2018; Suguru et al. 2020). In the first approach, the eye tracker and HMD are assembled together as two independent instruments. The disadvantages of this method are the complexity and the challenges it imposes on the AR system in terms of portability, compactness, and alignment. Assembling a commercial eye tracker with the VR4 HMD by the ISCAN Company is one example of the functional approach (Duchowski 1998). In the second approach, the eye tracker and HMD are fully integrated, and their combination can be regarded as a single instrument. For example, Hong Hua introduced augmented reality glasses equipped with an eye tracker in which hot mirrors transmit images to the eye in the augmented reality section at visible wavelengths and also track pupil movements at infrared wavelengths in order to transmit the information received from the eye to the sensors (Hua et al. 2006).
In this paper, another optical part is added to the previous design (Afra et al. 2021) in order to equip the proposed metasurface-based waveguide display system with eye-gaze-direction monitoring technology. The suggested system comprises two completely distinct sections: the first consists of a right-trapezoid waveguide and two couplers acting as an AR system at a wavelength of 635 nm, while the second includes a right-trapezoid waveguide and two couplers operating as an eye-tracker system at a wavelength of 775 nm. The unconventional shape of the waveguide in the two sections not only reduces the weight of the device but also makes it possible to realize a wide FOV in the AR section and a high-quality image in the eye-tracker system. In what follows, we study the effects of changing the shape of the waveguide on the weight, diffraction efficiency, uniformity, FOV, and the quality of the image received by the detector in the eye-tracker section. For this purpose, instead of implementing two separate waveguides, a multifunctional display system comprising a single rectangular waveguide with two linear-polarization-sensitive channels is designed to operate as an AR display and an eye movement monitoring system simultaneously at visible and IR wavelengths, respectively. The finite-difference time-domain (FDTD) technique is utilized in all simulations (Afra et al. 2021; Forouzmand and Mosallaei 2017).

Method of designing display system
For designing the shape of the waveguide in this approach, we have used the idea suggested in (Afra et al. 2021), where the design procedure of the waveguide and metasurface couplers is extensively explained. The schematic of the proposed metasurface-based waveguide display system is shown in Fig. 1. As seen, to equip the device with an eye tracker, another optical part is added to the design previously suggested for the AR system in (Afra et al. 2021). As explained there, the task of the AR part is to transfer virtual images from the image source to the viewer's eye through the input and output couplers, MGI1 and MGO1, under TE polarization at a wavelength of 635 nm. The second part consists of an IR light source, a trapezoid waveguide with a refractive index of 1.7 and a leg angle of 36°, and two meta-gratings that act as input and output couplers. In this path, the IR light reflected from the eye is coupled in by the input coupler MGI2 under TE polarization, travels across the waveguide, is coupled out by MGO2, and reaches the detector at different angles. The range of angles for rays trapped in the waveguide is determined according to Snell's law, so the minimum allowed angle is 36° and the maximum angle is 80° in both waveguides. Also, the main orders in the first and second channels are −1 and +1, respectively, and the other orders should be suppressed. The structures of all couplers consist of periodic arrays of high-contrast crystalline silicon (c-Si) elliptical-shaped nanoelements. The values of the refractive index and extinction coefficient of c-Si at wavelengths of 635 nm and 775 nm are 3.8737 + 0.018j and 3.714 + 0.008j, respectively. It is necessary to mention that, in order to steer light directly to the eye, the waveguides are rotated by 36°.
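The 36° lower bound on guided angles follows directly from the total internal reflection condition at the waveguide–air interface. A minimal sketch of this check, assuming only the refractive index n = 1.7 given in the text (the 80° upper bound is a design choice of the paper, not a TIR limit):

```python
import math

def tir_critical_angle_deg(n_waveguide: float, n_outside: float = 1.0) -> float:
    """Minimum internal angle (measured from the surface normal) for total
    internal reflection at a waveguide/air interface: sin(theta_c) = n_out / n_wg."""
    return math.degrees(math.asin(n_outside / n_waveguide))

theta_c = tir_critical_angle_deg(1.7)
print(f"critical angle = {theta_c:.1f} deg")  # ~36.0 deg, matching the stated minimum guided angle

def is_guided(theta_deg: float, n_waveguide: float = 1.7, theta_max: float = 80.0) -> bool:
    """A ray is trapped if it exceeds the TIR critical angle and stays below
    the practical upper bound used in the paper (80 deg)."""
    return tir_critical_angle_deg(n_waveguide) <= theta_deg <= theta_max
```

The calculation confirms that arcsin(1/1.7) ≈ 36°, i.e. the leg angle of the trapezoid waveguide coincides with the TIR critical angle of the n = 1.7 material.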
The simulation region is surrounded by Bloch boundaries in the X- and Y-axis directions and by perfectly matched layers in the Z-axis direction (Afra et al. 2021). The periods in the Y and X directions for the input and output couplers in the AR part are 186 nm and 218 nm, as in (Afra et al. 2021). In the proposed eye-tracker system, the suggested structures for the MGI2 and MGO2 gratings consist of periodic arrays of three elliptical c-Si nanoelements with the same thicknesses and 120° phase shifts, whose radii R_x increase gradually in the X direction while their radii R_y are fixed. The periods in the Y and X directions for the input and output couplers are 92 nm and 134 nm, and the thickness of the elements in both couplers is 605 nm.

Results and discussion
In this part, the simulation results of the gratings in the two systems under TE polarization at wavelengths of 635 nm and 775 nm are presented. In (Afra et al. 2021), the variation of the diffraction angle and diffraction efficiency in both couplers is studied. As seen in Figs. 7 and 8 of (Afra et al. 2021), the diffraction efficiencies of both couplers in the AR system at a wavelength of 635 nm are rather uniform in the range of 0° to 44°, which makes them appropriate for realizing a clear image in the AR section. In this section, the simulation results of the input and output couplers MGI2 and MGO2 under TE polarization at a wavelength of 775 nm are discussed. In the eye-tracker system, to monitor eye movements horizontally, the reflected infrared light at different angles is coupled in by the input coupler MGI2, diffracts, propagates through the waveguide under the TIR condition, reaches MGO2, and after diffraction is received by a detector. In the simulation of the input coupler, the range of incident reflected IR light on MGI2 is from 0° to 24°. The same coupling-efficiency values are achieved for incidence in the negative direction, in the range of −24° to 0°, when the radii of the elliptical elements gradually increase in the positive direction of the X-axis. Therefore, we only investigate the diffraction efficiency of the grating MGI2 for positive incidence angles. Figure 4a presents the simulated diffraction angles of MGI2 in the eye-tracker system.
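The expected in-coupling behavior can be sketched with the planar grating equation. This is only an illustrative sketch: the effective grating period LAMBDA_EFF_NM below is an assumed value chosen to reproduce the simulated angle range, not a parameter reported in the paper, since the metasurface acts through a supercell phase gradient rather than a simple ruled grating.

```python
import math

N_WG = 1.7             # waveguide refractive index (from the text)
WAVELENGTH_NM = 775.0  # eye-tracker channel wavelength
LAMBDA_EFF_NM = 557.0  # ASSUMED effective in-coupler period (illustrative fit only)

def incoupled_angle_deg(theta_in_deg: float) -> float:
    """First-order grating equation at the air/metasurface interface, with the
    diffracted order folded toward the opposite side of the surface normal:
    n_wg * sin(theta_d) = lambda / Lambda_eff - sin(theta_in)."""
    s = (WAVELENGTH_NM / LAMBDA_EFF_NM - math.sin(math.radians(theta_in_deg))) / N_WG
    return math.degrees(math.asin(s))

for theta_in in (0.0, 12.0, 24.0):
    print(theta_in, round(incoupled_angle_deg(theta_in), 1))
# Incidence of 0-24 deg maps to guided angles from about 55 deg down to about 35 deg,
# consistent with the simulated 55-36 deg range reported for MGI2.
```

With this assumed effective period, normal incidence couples in near 55° and 24° incidence near 36°, matching the trend of Fig. 4a; the small residual deviation reflects the dispersion of the actual metasurface.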
As seen, the diffraction angle of order +1 in the eye-tracker system changes from 55° to 36°. Due to the limitation of the TIR condition, order 0 cannot travel along the waveguide. Moreover, the diagram of the diffraction efficiency is presented in Fig. 4b. Although the diffraction efficiency is not uniform, it is sufficient for steering the light reflected from the eye to the output coupler and tracking eye movements. In the next step, the rays diffracted by MGI2 propagate in the waveguide under the TIR condition and are coupled into MGO2 under TE polarization. As in the previous part, due to the change of the waveguide shape, the range of incidence angles on the output coupler MGO2 shifts from −55° to −36° to −19° to 0°. By displacing the elliptical elements (1 and 3) in the MGO2 coupler and imposing TE-polarized light in the positive direction (0° to 75°), we achieve the same results. As seen in Fig. 5b, the variation of the diffraction efficiency of MGO2 in this range is negligible, which leads to a good quality of the images recorded by the detector. Figure 6 shows the beam intensities in all orders of the output coupler for incidence angles from −19° to 0° (−19°, −15°, −10°, 0°), which are matched with the changes of the diffraction efficiencies and diffraction angles in Fig. 5.

Study of the effect of changing the shape of the waveguide on the main parameters of the proposed system

In this part, the effects of changing the shape of the waveguide to a rectangle on the FOV, diffraction efficiency, and weight of the system are studied. For this purpose, we consider a common waveguide for the AR section and the eye-tracker system. The desired waveguide display system is presented in Fig. 7. The suggested system consists of a waveguide and four couplers with two independent optical channels, in order to perform different functions simultaneously in a single waveguide at wavelengths of 635 nm and 775 nm.
The first optical path is regarded as a channel with two polarization-sensitive metasurface gratings, MGI1 and MGO1, which transfer virtual images from an image source to the viewer's eye under TE polarization at a wavelength of 635 nm. In this path, after collimated light strikes the input coupler MGI1, the diffracted light travels through the waveguide and reaches the output coupler MGO1. In the second optical path, IR light reflected from the eye hits the metasurface input coupler MGI2 under TM polarization, travels across the waveguide, is diffracted by the metasurface polarization grating MGO2, and after diffraction is received by a detector at a wavelength of 775 nm. As seen, the two gratings MGO1 and MGI2 are placed in two separate rows. Accordingly, in order to design TM-polarization-dependent gratings in the TM channel, the radii of the grating elements in the eye-tracker system increase in the Y direction while their sizes are fixed in the X direction. To achieve the desired performance, the following conditions should be considered:
- Propagation of light in the waveguide according to the total internal reflection (TIR) law
- Regarding only one order as the desired one in each optical channel, to eliminate ghost stray light
- Avoiding optical interference between the two channels and the different gratings
- Minimizing the reflection effects in all the proposed gratings
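The first two conditions above can be checked together: for a candidate grating period, each diffraction order either propagates inside the guided-angle window or must be suppressed. A hedged sketch of such an order filter, assuming a simple planar grating equation; the 600 nm period in the usage example is a hypothetical value for illustration, not one of the paper's design parameters:

```python
import math

def allowed_orders(theta_in_deg, wavelength_nm, period_nm, n_wg=1.7,
                   theta_min=36.0, theta_max=80.0, orders=(-2, -1, 0, 1, 2)):
    """Return the diffraction orders that both propagate inside the waveguide
    (non-evanescent) and fall within the TIR-guided window [theta_min, theta_max]."""
    guided = []
    for m in orders:
        s = (math.sin(math.radians(theta_in_deg)) + m * wavelength_nm / period_nm) / n_wg
        if abs(s) > 1.0:          # evanescent order: does not propagate at all
            continue
        theta_d = abs(math.degrees(math.asin(s)))
        if theta_min <= theta_d <= theta_max:
            guided.append(m)
    return guided

# With a hypothetical 600 nm period at 635 nm and normal incidence,
# only the +/-1 orders are guided; order 0 simply exits the waveguide.
print(allowed_orders(0.0, 635.0, 600.0))  # -> [-1, 1]
```

In the proposed couplers the asymmetric supercell then suppresses one of the two remaining orders, leaving a single main order (−1 or +1) per channel, as the design conditions require.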

Simulation results of AR system and eye tracker in a rectangular waveguide
In this part, the simulation results of the AR system and the eye tracker in the common waveguide shown in Fig. 7 are studied. In designing the couplers, we implemented the grating structures of the previous part. There is no change in the simulation results of the input coupler at a wavelength of 635 nm. However, conforming to the TIR law, the range of incidence angles on the output coupler is from 36° to 80°. Compared to the FOV calculated from the maximum and minimum values of the diffraction angles of the output coupler demonstrated in Fig. 8a of (Afra et al. 2021), the FOV is reduced almost by half and reaches 35°. Moreover, as seen in Fig. 8b of (Afra et al. 2021), due to the sensitivity of the diffraction efficiency to the incidence angle, the diffraction efficiency is not uniform in this range; it decreases with a steep slope and reaches 0.15 at an incidence angle of 80°, which strongly affects the quality of the image in the AR channel. In designing the metasurface gratings in the TM channel, we also used the structures of the eye-tracker section in the previous part. The simulation results of the couplers in the TM channel are discussed in Figs. 8 and 9.
According to Fig. 8b, although the efficiency of MGI2 in the main order −1 is not uniform in the range of −24° to 0°, it is sufficient for gathering information about eye movements in the eye-tracker section. The destructive effects of the other orders are insignificant, either due to their low diffraction efficiency or their failure to meet the TIR law. As demonstrated in Fig. 9a, for incidence in the range of 36° to 55°, the diffraction angles of the output coupler MGO2 in the order −1 vary from 6° to 28°. Moreover, the diffraction efficiency of MGO2 can be studied in Fig. 9b. As shown, the diffraction efficiency in the incidence-angle range of 36° to 55° in the order −1 under TM polarization is nearly fixed at around 0.6, and the efficiencies of the secondary orders (0, +1) are negligible.
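The out-coupling step can be sketched with the order −1 grating equation at the waveguide/air interface. As before, this is only an illustrative sketch: the effective period LAMBDA_EFF_NM is an assumed value fitted to the reported angle range, not a parameter given in the paper.

```python
import math

N_WG = 1.7             # waveguide refractive index (from the text)
WAVELENGTH_NM = 775.0  # eye-tracker channel wavelength
LAMBDA_EFF_NM = 866.0  # ASSUMED effective out-coupler period (illustrative fit only)

def outcoupled_angle_deg(theta_wg_deg: float) -> float:
    """Order -1 grating equation at the waveguide/air interface:
    sin(theta_out) = n_wg * sin(theta_wg) - lambda / Lambda_eff."""
    s = N_WG * math.sin(math.radians(theta_wg_deg)) - WAVELENGTH_NM / LAMBDA_EFF_NM
    return math.degrees(math.asin(s))

for theta in (36.0, 45.0, 55.0):
    print(theta, round(outcoupled_angle_deg(theta), 1))
# Guided angles of 36-55 deg leave the waveguide at roughly 6-30 deg,
# close to the 6-28 deg span reported for MGO2 in Fig. 9a.
```

The residual mismatch of a degree or two at the upper end is expected, since the actual metasurface response is dispersive rather than an ideal planar grating.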

Study of the reflections and interferences effects in metasurface gratings in the common waveguide
In order to create a clear image and detect eye movements correctly, the effects of reflections and light interference in the suggested waveguide display should be minimized. These effects are as follows. The gratings MGO1 and MGI2 should operate independently of each other at wavelengths of 635 nm and 775 nm in order to prevent light interference between the two gratings. Hence, the transmission of the TE/TM gratings should be close to zero under TM/TE polarization. Figure 10 shows the variation of the diffraction angle and diffraction efficiency of MGI2 under TE-polarized light.
As shown in Fig. 10b, the transmissions of the −1 and 0 orders over the range of incidence angles on the input coupler MGI2 at a wavelength of 635 nm are insignificant. Although the diffraction efficiency in the order +1 is rather high, we should mention that, due to the nonfulfillment of the TIR condition, it cannot travel along the waveguide. Also, the variation of the diffraction efficiency of the output coupler MGO1 under TM polarization is presented in Fig. 11a. Since the diffraction efficiencies in all orders are negligible, the variation of the diffraction angles is not shown.
MGO2 should be sensitive only to TM polarization, to prevent any reaction to the TE-polarized light from MGI1 and MGO1. According to the negligible efficiencies of all orders of the grating MGO2 under TE polarization, shown in Fig. 11b, it can be concluded that the output grating MGO2 is insensitive to TE polarization. As infrared light is invisible, the reflections of MGI2 and MGO2 at a wavelength of 775 nm under TM polarization are ineffective in creating stray light.

Conclusion
In this study, we present and design a metasurface-based waveguide display equipped with an eye movement monitoring system using the FDTD method. Two distinct trapezoid waveguides are designed to operate as the AR display and the eye tracker, and we achieve a relatively uniform diffraction efficiency across an 80° FOV in the AR section. Next, the performance of the suggested system in terms of FOV, diffraction efficiency, and weight is compared to that of another system whose waveguide shape is changed to a rectangle. In the second system, two polarization-dependent channels are designed in a single rectangular waveguide so that it acts as an AR system and an eye tracker simultaneously. Its diffraction efficiency is not uniform, and the obtained FOV of 35° is less than half the field of view of the previous approach. Both suggested systems can follow eye movements horizontally in the range of −24° to 24°. An easier fabrication process, compactness, and low weight are other features of the proposed systems. Both systems could be extended to full-color displays, and higher diffraction efficiencies could be realized by using other materials and different metasurface structures.