ZHOU Sha, NIU Jiqiang, XU Feng, PAN Xiaofang, ZHEN Wenjie, QIAN Haoyue. Estimating Gaze Directions for Pedestrian Navigation[J]. Geomatics and Information Science of Wuhan University, 2021, 46(5): 700-705,735. DOI: 10.13203/j.whugis20200465

Estimating Gaze Directions for Pedestrian Navigation

  •   Objectives  Accurately capturing the gaze directions of pedestrians can effectively improve the efficiency and safety of pedestrian navigation. Traditional gaze-direction estimation methods are somewhat invasive in many practical applications and poorly adapted to different users and head-pose variations, and the limited portability of the required devices makes them unsuitable for pedestrian navigation. Accordingly, we propose a model that estimates the gaze directions of pedestrians using smart glasses.
      Methods  Based on scale-invariant feature transform (SIFT) features, we first measure the degree of similarity between gaze photos and street view images. We then propose a method for estimating gaze directions when the positions of pedestrians and street view images are disjoint. When they overlap, we develop another estimation method that accounts for the position error of pedestrians. Building on these, we establish a gaze-direction estimation model that considers the positional relationship between pedestrians and street view images. Finally, we select two real-world scenes and verify the reliability of the proposed gaze-direction estimation model through simulation experiments.
      Results  The results show that: (1) The estimation errors of the proposed model (i.e., the gaze-direction estimation model considering the positional relationship) are significantly lower than those of the model that ignores the positional relationship. (2) In the same scene, the estimation error does not increase with the test distance. Moreover, the average estimation accuracy of our model is comparable to that of an estimation method based on depth cameras.
      Conclusions  (1) The proposed gaze-direction estimation model is significantly superior to the model that ignores the positional relationship. (2) Pedestrian position variations have little impact on the estimation accuracy of our model within the same scene. Hence, the proposed model is suitable for pedestrian navigation, and portable smart glasses can be used to estimate gaze directions.