LDR     02261nas a2200217 4500
008      2018 d
260     $c 12/2018
653 10  $a Interaction
653 10  $a Interfaces
653 10  $a Gesture Recognition
653 10  $a Virtual Reality
100 1   $a S Ullah
700 1   $a M Raees
245 00  $a EVEN-VE: Eyes Visibility Based Egocentric Navigation for Virtual Environments
856     $u http://www.ijimai.org/journal/sites/default/files/files/2018/08/ijimai_5_3_15_pdf_15422.pdf
300     $a 141-151
490 0   $v 5
520     $a Navigation is one of the 3D interactions often needed to interact with a synthetic world. The latest advancements in image processing have made gesture-based interaction with a virtual world possible. However, a 3D virtual world responds to a user's gesture far faster than the gesture itself can be posed. To bring faster and more natural postures into the realm of the Virtual Environment (VE), this paper presents a novel eyes-based interaction technique for navigation and panning. Dynamic wavering and positioning of the eyes are interpreted by the system as interaction instructions. Opening the eyes after they have been closed for a distinct time threshold activates forward or backward navigation. Panning over the xy-plane is performed with two-degrees-of-freedom head gestures (rolling and pitching). The proposed technique was implemented in a case-study project, EWI (Eyes Wavering based Interaction). In EWI, real-time detection and tracking of the eyes are performed by OpenCV libraries at the back end, and dynamic mapping is performed in OpenGL to interactively follow the trajectory of both eyes. The technique was evaluated in two separate sessions by a total of 28 users to assess the accuracy, speed, and suitability of the system in Virtual Reality (VR). Using an ordinary camera, an average accuracy of 91% was achieved. Assessment with a high-quality camera showed that both the accuracy and the navigation speed of the system could be raised further. Results of the unbiased statistical evaluations demonstrate the applicability of the system in the emerging domains of virtual and augmented reality.
022     $a 1989-1660
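The abstract's open-after-closed trigger can be sketched as a small state machine. This is a minimal illustration, not the paper's implementation: the `BlinkTrigger` class, its `threshold` value, and the per-frame `eyes_visible` boolean (which in EWI would come from an eye detector such as OpenCV's Haar cascades) are all assumptions for the sake of the example.

```python
class BlinkTrigger:
    """Illustrative sketch of the abstract's navigation trigger: fires
    when the eyes reopen after being closed for at least `threshold`
    seconds. The 0.5 s default is a hypothetical value, not from the paper."""

    def __init__(self, threshold=0.5):
        self.threshold = threshold
        self.closed_since = None  # timestamp when the eyes disappeared

    def update(self, eyes_visible, t):
        """Feed one frame's detection result at time t (seconds).
        Returns True only on the frame where the gesture completes."""
        if not eyes_visible:
            if self.closed_since is None:
                self.closed_since = t  # eyes just closed; start timing
            return False
        # Eyes are visible again: fire if they were closed long enough.
        fired = (self.closed_since is not None
                 and t - self.closed_since >= self.threshold)
        self.closed_since = None
        return fired
```

A caller would invoke `update()` once per camera frame and, on a `True` result, toggle forward or backward navigation; keeping the detector and the trigger logic separate makes the time threshold easy to tune per user.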