Gaze Estimation Method based on an Aspherical Model of the Cornea: Surface of Revolution about the Optical Axis of the Eye

Takashi Nagamatsu (Kobe University, nagamatu@kobe-u.ac.jp)
Yukina Iwamoto (Kobe University, 0667286w@stu.kobe-u.ac.jp)
Junzo Kamahara (Kobe University, kamahara@maritime.kobe-u.ac.jp)
Naoki Tanaka (Kobe University, ntanaka@maritime.kobe-u.ac.jp)
Michiya Yamamoto (Kwansei Gakuin University, michiya.yamamoto@kwansei.ac.jp)

Abstract

A novel gaze estimation method based on an aspherical model of the cornea is proposed in this paper. The model is a surface of revolution about the optical axis of the eye. The calculation method is explained on the basis of the model. A prototype system for estimating the point of gaze (POG) has been developed using this method. The proposed method has been found to be more accurate than the gaze estimation method based on a spherical model of the cornea.

CR Categories: H.5.2 [Information Interfaces and Presentation]: User Interfaces—Ergonomics; I.4.9 [Image Processing and Computer Vision]: Applications

Keywords: gaze tracking, calibration-free, eye movement, eye model

1 Introduction

The use of a physical model of the eye for remote gaze estimation has gained considerable importance in recent times because this technique does not require a large number of calibration points. Most of these studies use a spherical model of the cornea [Shih and Liu 2004; Guestrin and Eizenman 2007]. However, Shih et al. and Guestrin et al. pointed out that a spherical model may not be suitable for modeling the boundary region of the cornea.

In this paper, we propose a novel physical model of the cornea, which is a surface of revolution about the optical axis of the eye, and an estimation method based on the model.

2 Gaze estimation based on spherical model of cornea

In our previous studies, we proposed systems for estimating the point of gaze (POG) on a computer display based on the spherical model of the cornea shown in Figure 1 [Nagamatsu et al. 2008a; Nagamatsu et al. 2008b].

Figure 1: Eye model with spherical model of the cornea (showing the rotation center, the center of corneal curvature A, the center of the pupil B, the fovea, the optical and visual axes, the offset angles α and β, the corneal radius R, and the distance K between the centers of corneal curvature and the pupil).

Figure 2 shows the evaluation results of the system that used two cameras and two light sources; the evaluation involved three subjects (a, b, c) [Nagamatsu et al. 2008b]. The system had low accuracy in estimating the POG around the top left and right corners of the display. This could be because the boundary region of the modeled cornea was not actually spherical, or because light from an LED was reflected from the sclera, which was not modeled.

Figure 2: Evaluation results of our previously proposed system based on the spherical model of the cornea, in the display coordinate system (axes in pixels, 1280 × 1024 display; subjects a, b, c).

3 Novel aspherical model of cornea: surface of revolution about optical axis of eye

3.1 Novel aspherical model

We propose an aspherical model of the cornea for remote gaze estimation. This model is a surface of revolution about the optical axis of the eye, as shown in Figure 3. In our model, the boundary between the cornea and the sclera can be joined smoothly.

Figure 3: Surface of revolution about the optical axis of the eye.

3.2 Determination of optical axis of eye on the basis of novel model

We use a special arrangement of cameras and light sources to determine the optical axis of the eye on the basis of the novel model. We use two cameras with a light source attached to each camera. The position of each light source is supposed to be the same as the nodal point of its camera.

Figure 4 shows a cross section of the eyeball. A is the center of corneal curvature near the optical axis. Lj and Cj denote the position of light source j and the nodal point of camera j, respectively; Cj is assumed to be the same as Lj. The value of Cj (= Lj) is determined by calibrating the camera beforehand.

Figure 4: Cross section of the eyeball showing the center of corneal curvature, the center of the pupil, the position of the light source, and the nodal point of the camera.
A ray originating from the center of the pupil, B, is refracted at a point B''j on the corneal surface, passes through the nodal point of camera j, Cj, and intersects the camera image plane at a point B'j.

A ray from Lj is reflected at a point Pj on the corneal surface back along its incident path. It passes through Cj and intersects the camera image plane at a point P'j. If the cornea were perfectly spherical, the line connecting Lj and P'j would pass through A, and A could be determined by using the two cameras. However, the position of A cannot be estimated accurately when light is reflected from an aspherical part of the surface of the eye.

Because we use the model of a surface of revolution about the optical axis of the eye, the ray from Lj is reflected from the surface of the eye back within the plane that includes the optical axis of the eye. Therefore, A, B, B''j, B'j, Cj, Lj, Pj, P'j, and the optical axis of the eye are coplanar. The normal vector of a plane that includes the optical axis is {(Cj − B'j) × (P'j − Cj)}, and the plane is expressed as

    {(Cj − B'j) × (P'j − Cj)} · (X − Cj) = 0,    (1)

where X (= (x, y, z)^T) is a point on the plane. We obtain two such planes when we use two cameras (j = 0, 1). The optical axis of the eye can be determined from the intersection of the two planes. The two planes must not be coplanar (i.e., the optical axis of the eye must not be coplanar with the nodal points of both cameras). Irrespective of the part of the surface of the eye (central region of the corneal surface, boundary region of the cornea, scleral region, etc.) from which the light is reflected, the optical axis of the eye can be determined mathematically. In practice, however, the scleral surface is not smooth, which makes it difficult to estimate the glint positions there.
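Equation (1) and the plane intersection lend themselves to a compact linear-algebra implementation. The sketch below is our own minimal illustration, not the authors' code: the function names are invented, NumPy is assumed, and the nodal points Cj, pupil images B'j, and Purkinje images P'j are assumed to already be expressed in a common world coordinate system obtained from camera calibration.

```python
import numpy as np

def plane_normal(c_j, b_img_j, p_img_j):
    """Unit normal of the plane in Eq. (1): {(Cj - B'j) x (P'j - Cj)}."""
    n = np.cross(c_j - b_img_j, p_img_j - c_j)
    return n / np.linalg.norm(n)

def optical_axis_from_two_planes(c0, n0, c1, n1):
    """Intersect the two planes of Eq. (1), one per camera (j = 0, 1).

    Each plane passes through the nodal point cj and has normal nj.
    Returns (a point on the optical axis, unit direction of the axis).
    """
    d = np.cross(n0, n1)                 # the axis lies in both planes
    d = d / np.linalg.norm(d)
    # One point on the line: solve n0.x = n0.c0, n1.x = n1.c1, d.x = 0.
    M = np.vstack([n0, n1, d])
    b = np.array([n0 @ c0, n1 @ c1, 0.0])
    x0 = np.linalg.solve(M, b)
    return x0, d
```

If the two plane normals are nearly parallel (the degenerate configuration noted above), the cross product n0 × n1 approaches zero and the solve becomes ill-conditioned, so a practical system would check this before trusting the result.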
By using this method, only the optical axis of the eye can be determined. However, it is necessary to determine A in order to determine the visual axis of the eye, because we have assumed that the visual axis and the optical axis of the eye intersect at A. The visual axis of the eye is described in parametric form as X = A + tc, where c is the unit direction vector of the visual axis of the eye, which can be estimated by the method described in Section 5.2. An estimation error of A leads to a parallel shift of the visual axis of the eye. Therefore, as the distance between the user and the gazed object increases, the estimation error of the POG on the object, in terms of view angle, decreases. The intersection of the two lines X = Cj + tj(Cj − B'j), j = 0, 1, gives an approximation of A; the approximation error is less than approximately 7.8 mm (the average value of the radius of the cornea).

However, when the distance between the user and the object is small, the estimation error of the POG caused by the estimation error of A is relatively large in terms of view angle. In order to solve this problem, we propose a method for estimating A and determining the visual axis of the eye accurately.

4 Estimation of user dependent parameters (user calibration)

We have to estimate the following user dependent parameters: the radius of corneal curvature near the optical axis of the eye, R; the distance between the centers of corneal curvature and the pupil, K; and the offset between the optical and visual axes, α and β. In order to estimate these parameters, the user is instructed to gaze at a single point (calibration point) whose position is known. The position of the calibration point is selected such that the light from the camera is reflected from the part of the corneal surface that can be approximated as a sphere. It is also assumed that the pupil is observed through a part of the corneal surface that can be approximated as a sphere. Therefore, the refraction at the corneal surface where the pupil is observed by the cameras can be determined on the basis of the spherical model of the cornea.

4.1 Estimation of radius of corneal curvature on the basis of spherical model of cornea

When a user gazes at an object near the camera in the user-calibration process, light is reflected from the spherical part of the corneal surface. Hence, we can use the spherical model of the cornea in this case.

We estimate the position of the center of corneal curvature, A. Figure 5 shows a cross section of the cornea including the center of corneal curvature, A; the position of light source i, Li; the position of light source j, Lj; the nodal point of camera i, Ci; and the nodal point of camera j, Cj. The positions of Ci (= Li) and Cj (= Lj) are known. A ray from Li reflected from the corneal surface returns to Ci and reaches P'ii. The extension of this path includes A, because the corneal surface is assumed to be a sphere. Similarly, the line connecting Cj and P'jj includes A. Therefore, A can be estimated from the intersection of the two lines given below:

    X = Ci + tii(Ci − P'ii),    (2)
    X = Cj + tjj(Cj − P'jj),    (3)

where tii and tjj are parameters.

Figure 5: Cross section of the cornea showing the center of corneal curvature, the positions of the light sources, and the nodal points of the cameras.

A ray from Li is reflected at a point Pji on the corneal surface such that the reflected ray passes through Cj and intersects the camera image plane at a point P'ji. Similarly, a ray from Lj is reflected at a point Pij on the corneal surface such that the reflected ray passes through Ci and intersects the camera image plane at a point P'ij. In order to estimate the radius of the cornea, we estimate the reflection point Pji (= Pij), that is, the intersection of the following lines:

    X = Ci + tij(Ci − P'ij),    (4)
    X = Cj + tji(Cj − P'ji),    (5)

where tij and tji are parameters. Therefore, the radius of corneal curvature, R, is determined as R = ||Pji − A||.
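Equations (2)-(5), and later Eq. (9), all amount to intersecting two 3D lines; with measurement noise the lines do not meet exactly, so a least-squares midpoint is a common substitute. Below is a minimal sketch under that assumption; the helper name is ours and NumPy is assumed.

```python
import numpy as np

def closest_point_of_two_lines(p0, d0, p1, d1):
    """Approximate intersection of X = p0 + t0*d0 and X = p1 + t1*d1.

    Returns the midpoint of the shortest segment joining the two lines,
    which equals the true intersection when the lines actually meet.
    """
    d0 = d0 / np.linalg.norm(d0)
    d1 = d1 / np.linalg.norm(d1)
    # Least-squares solve for the line parameters of the two closest points.
    M = np.stack([d0, -d1], axis=1)                  # shape (3, 2)
    t, *_ = np.linalg.lstsq(M, p1 - p0, rcond=None)
    return 0.5 * ((p0 + t[0] * d0) + (p1 + t[1] * d1))

# Eqs. (2)-(3): A from the two glint rays, with P_ii, P_jj the glint images:
#   A = closest_point_of_two_lines(C_i, C_i - P_ii, C_j, C_j - P_jj)
# Eqs. (4)-(5): the reflection point P_ji the same way, then R = ||P_ji - A||.
```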
4.2 Estimation of distance between centers of corneal curvature and pupil

As shown in Figure 6, a ray originating from the center of the pupil, B, is refracted at a point B''j on the corneal surface, passes through the nodal point of camera j, Cj, and intersects the camera image plane at a point B'j. B''j can be determined by solving the equations given below:

    X = Cj + tj(Cj − B'j),    (6)
    R = ||X − A||.    (7)

These equations may have two solutions; we select the one closer to Cj.

Figure 6: Refraction at the corneal surface.

The refracted vector tj at B''j shown in Figure 6 can be obtained by using Snell's law as follows:

    tj = {−ρ(nj · vj) − sqrt(1 − ρ²(1 − (nj · vj)²))} nj + ρ vj,    (8)

where the incident vector vj = (Cj − B'j)/||Cj − B'j||, the normal vector at the point of refraction nj = (B''j − A)/||B''j − A||, and ρ = n1/n2 (n1: refractive index of air ≈ 1; n2: effective refractive index ≈ 1.3375).

The center of the pupil, B, can be determined from the intersection of the two rays from the two cameras, as follows:

    X = B''j + sj tj    (j = 0, 1),    (9)

where sj is a parameter. Therefore, the distance between the centers of corneal curvature and the pupil, K, is determined as K = ||B − A||.
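The refraction step of Eqs. (6)-(8) can be sketched as follows. This is again our own illustration rather than the authors' implementation: refraction_point implements the ray-sphere intersection of Eqs. (6)-(7), refracted_vector implements Eq. (8), the function names are invented, and the ratio rho = n1/n2 uses the indices given in the text.

```python
import numpy as np

RHO = 1.0 / 1.3375   # rho = n1/n2: air (~1) over the effective index 1.3375

def refraction_point(c_j, b_img_j, A, R):
    """Eqs. (6)-(7): intersect the ray X = Cj + t(Cj - B'j) with the sphere
    of radius R centred at A, keeping the solution closer to Cj."""
    d = c_j - b_img_j
    d = d / np.linalg.norm(d)
    oc = c_j - A
    b = 2.0 * (d @ oc)
    c = oc @ oc - R * R
    disc = b * b - 4.0 * c
    if disc < 0.0:
        raise ValueError("ray misses the corneal sphere")
    t = (-b - np.sqrt(disc)) / 2.0          # nearer of the two intersections
    return c_j + t * d

def refracted_vector(v_j, n_j, rho=RHO):
    """Eq. (8): refracted direction tj at B''j (vector form of Snell's law).

    v_j : unit incident vector (Cj - B'j)/||Cj - B'j||, pointing into the eye
    n_j : unit outward normal (B''j - A)/||B''j - A||  (so n_j . v_j < 0)
    """
    ndotv = n_j @ v_j
    root = np.sqrt(1.0 - rho * rho * (1.0 - ndotv * ndotv))
    return (-rho * ndotv - root) * n_j + rho * v_j
```

B then follows from Eq. (9) by intersecting the two refracted rays X = B''j + sj tj, for example with the closest_point_of_two_lines sketch above, and K = ||B − A||.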
4.3 Estimation of offset between optical and visual axes

The offset between the optical and visual axes is expressed by two parameters, e.g., horizontal and vertical angles. For the case of a user gazing at a known position, the offset between the optical and visual axes is calculated by the method described in Nagamatsu et al. [2008b].

5 Estimation of visual axis of eye after user calibration

After the user calibration, the user moves his/her eyes freely. The optical axis of the eye can be calculated by the method described in Section 3. R, K, and the offset between the optical and visual axes of the eye are known from the user calibration. The position of the center of corneal curvature, A, and the unit direction vector along the visual axis of the eye, c, are required for the calculation of the visual axis of the eye.

5.1 Estimation of center of corneal curvature

We suppose that the part of the corneal surface where the pupil is observed can be approximated as a spherical surface. The algorithm for searching for the position of A is as follows (a code sketch follows the list):

1) Set the position of A on the optical axis; select the position that is nearest to the intersection of the two lines X = Cj + tj(Cj − B'j) (j = 0, 1).

2) Calculate B''j and tj by using Equations 6, 7, and 8, where R is known from the user calibration.

3) Calculate B, the position of the center of the pupil, from the intersection of the two lines described by Equation 9.

4) Calculate the distance between B and A, and compare it with the K that was estimated during the user calibration.

5) Shift the position of A toward the rotation center of the eye along the optical axis of the eye and repeat steps 2-4 to determine the accurate position of A. It is sufficient to search for the position of A over a length of 10 mm, because the average radius of the cornea is approximately 7.8 mm. The search is finished when ||B − A|| = K.
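The search above is a one-dimensional scan of candidate positions of A along the optical axis. The sketch below is one possible reading of steps 1-5, reusing the hypothetical helpers from the earlier sketches (refraction_point, refracted_vector, closest_point_of_two_lines); the 0.05 mm step size is our own choice, not a value from the paper.

```python
import numpy as np

def estimate_corneal_center(A0, d, cams, R, K, depth_mm=10.0, step_mm=0.05):
    """Search for A along the optical axis (Section 5.1).

    A0   : initial guess from the two-line intersection (step 1)
    d    : unit optical-axis direction, pointing toward the rotation centre
    cams : two (Cj, B'j) pairs - nodal point and pupil image point per camera
    R, K : user-calibrated corneal radius and corneal-centre/pupil distance
    """
    best_A, best_err = A0, np.inf
    for s in np.arange(0.0, depth_mm, step_mm):      # step 5: shift A
        A = A0 + s * d
        try:
            rays = []
            for c_j, b_img_j in cams:                # step 2: Eqs. (6)-(8)
                B2 = refraction_point(c_j, b_img_j, A, R)
                v = (c_j - b_img_j) / np.linalg.norm(c_j - b_img_j)
                n = (B2 - A) / np.linalg.norm(B2 - A)
                rays.append((B2, refracted_vector(v, n)))
        except ValueError:
            continue                                 # ray missed the sphere
        (p0, t0), (p1, t1) = rays
        B = closest_point_of_two_lines(p0, t0, p1, t1)   # step 3: Eq. (9)
        err = abs(np.linalg.norm(B - A) - K)         # step 4: compare with K
        if err < best_err:
            best_A, best_err = A, err
    return best_A                                    # ||B - A|| closest to K
```

Once A is known, the POG computation of Section 5.2 reduces to intersecting the visual axis X = A + tc with the gazed object (for a flat display, a standard ray-plane intersection).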
5.2 Estimation of visual axis of eye and POG

The unit direction vector of the visual axis of the eye, c, is determined from the unit direction vector of the optical axis of the eye, d, and the offset between the optical and visual axes of the eye by using the method described in Nagamatsu et al. [2008b].

The intersection point between the visual axis of the eye (X = A + tc) and the object gives the POG.

6 Implementation

A prototype system for the estimation of the POG on a display has been implemented, as shown in Figure 7. The system consists of two synchronized monochrome IEEE 1394 digital cameras (Firefly MV, Point Grey Research Inc.), a 17-inch LCD, and a Windows-based PC (Windows XP). The software was developed using OpenCV 1.0 [Intel]. Each camera is equipped with a 1/3-inch CMOS image sensor whose resolution is 752 × 480 pixels. A 35-mm lens and an IR filter are attached to each camera. Two infrared LEDs are attached to each camera such that the midpoint of the two LEDs coincides with the nodal point of the camera. The cameras are positioned under the display. The intrinsic parameters of the cameras are determined before setting up the system.

Figure 7: Prototype system.

The evaluation of the prototype system in a laboratory involved an adult subject who does not wear glasses or contact lenses. The subject's right eye was approximately 500 mm from the display. She was asked to stare at 25 points on the display, and more than 10 data points were recorded for each point.

In order to confirm the effectiveness of our method, we compared it with the method described in Sections 3.2 and 3.3 of Chen et al. [2008], in which the optical axis is determined as the line connecting the corneal center and the virtual pupil on the basis of the spherical model of the cornea.

Figure 8 shows the evaluation results. The crosses and triangles indicate the POG obtained by our method and Chen's method, respectively. Our method appears to be more accurate than Chen's method in determining the POG at the top left and right corners of the display. The estimated R and K were 8.04 mm and 4.43 mm, respectively.

Figure 8: Comparison of our method and Chen's method in the display coordinate system (axes in pixels, 1280 × 1024 display).

7 Conclusion

We proposed a novel physical model of the eye for remote gaze tracking. The model is a surface of revolution about the optical axis of the eye. We derived the mathematical expressions for estimating the POG on the basis of the model. We evaluated a prototype system developed on the basis of our method and found that the system could be used to estimate the POG over the entire computer display.

References

CHEN, J., TONG, Y., GRAY, W., AND JI, Q. 2008. A robust 3D eye gaze tracking system using noise reduction. In Proceedings of the 2008 Symposium on Eye Tracking Research & Applications, 189–196.

GUESTRIN, E. D., AND EIZENMAN, M. 2007. Remote point-of-gaze estimation with free head movements requiring a single-point calibration. In Proceedings of the 29th Annual International Conference of the IEEE EMBS, 4556–4560.

INTEL. Open Source Computer Vision Library. http://sourceforge.net/projects/opencvlibrary/.

NAGAMATSU, T., KAMAHARA, J., IKO, T., AND TANAKA, N. 2008a. One-point calibration gaze tracking based on eyeball kinematics using stereo cameras. In Proceedings of the 2008 Symposium on Eye Tracking Research & Applications, 95–98.

NAGAMATSU, T., KAMAHARA, J., AND TANAKA, N. 2008b. 3D gaze tracking with easy calibration using stereo cameras for robot and human communication. In Proceedings of IEEE RO-MAN 2008, 59–64.

SHIH, S.-W., AND LIU, J. 2004. A novel approach to 3-D gaze tracking using stereo cameras. IEEE Transactions on Systems, Man, and Cybernetics, Part B 34, 1, 234–245.
