
WebGazer.js

Democratizing Webcam Eye Tracking on the Browser


WebGazer.js is an eye tracking library that uses common webcams to infer the eye-gaze locations of web visitors on a page in real time. The eye tracking model it contains self-calibrates by watching web visitors interact with the web page and trains a mapping between the features of the eye and positions on the screen. WebGazer.js is written entirely in JavaScript and, with only a few lines of code, can be integrated into any website that wishes to better understand its visitors and transform its user experience. WebGazer.js runs entirely in the client browser, so no video data needs to be sent to a server, and it requires the user's consent to access their webcam.


Real time gaze prediction on most major browsers

No special hardware - WebGazer.js uses common webcams

Self-calibration from clicks and cursor movements

Easy to integrate with a few lines of JavaScript

Swappable components for eye detection

Multiple gaze prediction models


Usage

To use WebGazer.js you need to add the webgazer.js file as a script in your website:
<!-- WebGazer.js library -->
<script src="webgazer.js" type="text/javascript"></script>

Be aware that when doing local development you may need to run a simple local HTTP server that supports the HTTPS protocol, since browsers only grant webcam access in a secure context.

Once the script is included, the webgazer object is introduced into the global namespace. webgazer has methods for controlling the operation of WebGazer.js, allowing us to start and stop it, add callbacks, or swap out modules. The two most important methods on webgazer are webgazer.begin() and webgazer.setGazeListener(). webgazer.begin() starts the data collection that enables the predictions, so it's important to call it early on. Once webgazer.begin() has been called, WebGazer.js is ready to start giving predictions. webgazer.setGazeListener() is a convenient way to access these predictions: it invokes a callback you provide every few milliseconds with the current gaze location of the user. If you don't need constant access to this data stream, you may alternatively call webgazer.getCurrentPrediction(), which will give you a prediction at the moment it is called.


webgazer.setGazeListener(function(data, elapsedTime) {
    if (data == null) {
        return;
    }
    var xprediction = data.x; //these x coordinates are relative to the viewport
    var yprediction = data.y; //these y coordinates are relative to the viewport
    console.log(elapsedTime); //elapsed time is based on time since begin was called
}).begin();
                    

Here is the alternative method, where you request a gaze prediction only when you need it.


var prediction = webgazer.getCurrentPrediction();
if (prediction) {
    var x = prediction.x;
    var y = prediction.y;
}
                    

Advanced Usage

There are several features that WebGazer.js enables beyond the example shown so far.

Saving Data Between Sessions

WebGazer.js can save and restore the training data between browser sessions by storing it in localStorage. This happens automatically when end() is called. If you want each user session to be independent, make sure that you do not call the end() function.

webgazer.end()
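
For example, here is a minimal sketch that persists the trained model when the visitor leaves the page, using only the documented end() call; the beforeunload handler is our own suggestion, not something WebGazer.js sets up for you:

// Persist the training data to localStorage when the visitor leaves the page,
// so the next session can restore it. end() also stops WebGazer.js.
window.addEventListener('beforeunload', function() {
    webgazer.end();
});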

Changing the Regression and Tracker Modules in Use

At the heart of WebGazer.js are the tracker and regression modules. The tracker module controls how eyes are detected and the regression module determines how the regression model is learned and how predictions are made based on the eye patches extracted from the tracker module. These modules can be swapped in and out at any time. We hope that this will make it easy to extend and adapt WebGazer.js and welcome any developers that want to contribute.

WebGazer.js requires a bounding box around the pixels in the webcam video feed that correspond to the detected eyes of the user. Currently we include three external libraries that implement different Computer Vision algorithms to detect the face and eyes.

webgazer.setTracker("clmtrackr"); //set a tracker module
webgazer.addTrackerModule("newTracker", NewTrackerConstructor); //add a new tracker module

Here are all the external tracker modules that come by default with WebGazer.js. Let us know if you want to introduce your own facial feature detection library.

webgazer.setRegression("ridge"); //set a regression module
webgazer.addRegressionModule("newReg", NewRegConstructor); //add a new regression module

Here are all the regression modules that come by default with WebGazer.js; a rough sketch of a custom module follows the list. Let us know if you would like to introduce different modules - just keep in mind that they should be able to produce predictions very fast.

  • ridge - a simple ridge regression model mapping pixels from the detected eyes to locations on the screen.
  • weightedRidge - a weighted ridge regression model in which the newest user interactions contribute more to the model.
  • threadedRidge - a faster implementation of ridge regression that uses threads.
  • linear - a simple linear regression model mapping pixels from the detected eyes to locations on the screen.
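
As a rough illustration of what a custom module registered through addRegressionModule() might look like, the sketch below assumes modules expose addData(eyes, screenPos, type) and predict(eyes) methods similar to the bundled ridge module; consult the bundled regression sources for the exact interface before relying on it.

// A hypothetical regression module that simply predicts the mean of all
// screen positions seen so far. The addData/predict interface is assumed
// to mirror the bundled ridge module; check the source for exact signatures.
function MeanReg() {
    this.screenXs = [];
    this.screenYs = [];
}

// Store one training sample: an eye patch paired with the [x, y] screen
// position of the click or cursor movement that produced it.
MeanReg.prototype.addData = function(eyes, screenPos, type) {
    if (!eyes) { return; }
    this.screenXs.push(screenPos[0]);
    this.screenYs.push(screenPos[1]);
};

// Predict a gaze location; here simply the mean of all observed positions.
MeanReg.prototype.predict = function(eyes) {
    if (this.screenXs.length === 0) { return null; }
    var mean = function(a) { return a.reduce(function(s, v) { return s + v; }, 0) / a.length; };
    return { x: mean(this.screenXs), y: mean(this.screenYs) };
};

webgazer.addRegressionModule("meanReg", MeanReg); //register the hypothetical module
webgazer.setRegression("meanReg"); //and switch to it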

Pause and Resume

It may be necessary to pause the data collection and predictions of WebGazer.js for performance reasons.


webgazer.pause(); //WebGazer.js is now paused, no data will be collected and the gaze callback will not be executed
webgazer.resume(); //data collection resumes, gaze callback will be called again
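
For example, here is a small sketch that pauses gaze tracking while the tab is hidden and resumes it when the tab becomes visible again, using the standard Page Visibility API; the visibilitychange handler is our own suggestion:

// Pause WebGazer.js while the tab is in the background and resume it when the
// tab becomes visible again, to avoid processing frames nobody is looking at.
document.addEventListener('visibilitychange', function() {
    if (document.hidden) {
        webgazer.pause();
    } else {
        webgazer.resume();
    }
});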
                    

Util and Params

We provide some useful functions and objects in webgazer.util. The webgazer.params object also contains some useful parameters to tweak to control video fidelity (trades off speed and accuracy) and sample rate for mouse movements.


webgazer.util.bound(prediction);
prediction.x; //now always in the bounds of the viewport
prediction.y; //now always in the bounds of the viewport
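
As an illustration, a bounded prediction can be used to position an element on the page. The gazeDot element below is hypothetical - your page would need to create and absolutely position it - and the polling interval is arbitrary:

// A small sketch: clamp the current prediction to the viewport and move a
// hypothetical "gazeDot" element (created by your page, not by WebGazer.js).
var dot = document.getElementById('gazeDot'); //assumed to exist and be absolutely positioned
setInterval(function() {
    var prediction = webgazer.getCurrentPrediction();
    if (!prediction) { return; }
    webgazer.util.bound(prediction); //clamp to the viewport, as shown above
    dot.style.left = prediction.x + 'px';
    dot.style.top = prediction.y + 'px';
}, 100);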
                    

Support

WebGazer.js uses the getUserMedia/Stream API to get access to the webcam. The following browsers are currently supported:

  • Google Chrome 53+
  • Microsoft Edge 12+
  • Mozilla Firefox 42+
  • Opera 40+
  • Safari 11+


Download Instructions

Download


Dataset

A webcam video dataset for training and evaluating eye tracking models. Please see the documentation and download link.

Build from Source

The GitHub repository can be found at https://github.com/brownhci/WebGazer.
# Ensure NodeJS is downloaded: https://nodejs.org/en/download/
npm install -g grunt-cli
git clone https://github.com/brownhci/WebGazer.git
cd WebGazer
npm install
# Run grunt to build the webgazer.js file in the build directory
grunt
                    

Examples

Empty Webpage Demo

WebGazer.js on an Empty Webpage with calibration

See how easy it is to integrate WebGazer.js on any webpage. With just a few clicks you will get real-time predictions. Follow the popup instructions to click through 9 calibration points on the screen whilst looking at the cursor.

Collision demo

Ball Collision Game

Move the orange ball with your eyes and create collisions with the blue balls. Train WebGazer.js by clicking in various locations within the screen, while looking at your cursor.

SearchGazer

WebGazer on Search Engines

We have created SearchGazer.js, a library that incorporates WebGazer in Search Engines such as Bing and Google.


Publications

If you use WebGazer.js please cite the following paper:

@inproceedings{papoutsaki2016webgazer,
author = {Alexandra Papoutsaki and Patsorn Sangkloy and James Laskey and Nediyana Daskalova and Jeff Huang and James Hays},
title = {WebGazer: Scalable Webcam Eye Tracking Using User Interactions},
booktitle = {Proceedings of the 25th International Joint Conference on Artificial Intelligence (IJCAI)},
pages = {3839--3845},
year = {2016},
organization={AAAI}
}

If you use SearchGazer.js please cite the following paper:

@inproceedings{papoutsaki2017searchgazer,
author = {Alexandra Papoutsaki and James Laskey and Jeff Huang},
title = {SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search},
booktitle = {Proceedings of the ACM SIGIR Conference on Human Information Interaction \& Retrieval (CHIIR)},
year = {2017},
organization={ACM}
}

For the WebGazer webcam dataset and findings about gaze behavior during typing:

@inproceedings{papoutsaki2018eye,
title={The eye of the typer: a benchmark and analysis of gaze behavior during typing},
author={Papoutsaki, Alexandra and Gokaslan, Aaron and Tompkin, James and He, Yuze and Huang, Jeff},
booktitle={Proceedings of the 2018 ACM Symposium on Eye Tracking Research \& Applications},
pages={16},
year={2018},
organization={ACM}
}


Press

Online discussions in:


Who We Are

Alexandra Papoutsaki

Assistant Professor of Computer Science at Pomona College.

Aaron Gokaslan

Undergraduate student at Brown University.

Ida De Smet

Undergraduate student at the University of Auckland.

Jack Wong

Undergraduate student at the University of Auckland.

James Tompkin

Assistant Professor of Computer Science at Brown University.

Jeff Huang

Assistant Professor of Computer Science at Brown University.

Other Collaborators

Acknowledgements

WebGazer is developed based on research done at Brown University. The calibration example was developed as part of a course project aimed at improving the feedback of WebGazer, proposed by Dr. Gerald Weber and his team, Dr. Clemens Zeidler and Kai-Cheung Leung.

This research is supported by NSF grants IIS-1464061, IIS-1552663, a Seed Award from the Center for Vision Research at Brown University, and the Brown University Salomon Award.


License

Copyright (C) 2019 Brown HCI Group
Licensed under GPLv3.