Masterclass Follow-up

We recently organized an eye tracking workshop featuring talks on eye tracking technology, methods, and research, including some really nice presentations by Tim Smith & Ignace Hooge. As part of the workshop I gave this presentation on the basics of eye movement segmentation, with a focus on velocity threshold methods. Additionally, after talking with some of the attendees, I realized there were a couple more additions the Tobii Calibration Psychtoolbox tool needed, so here is the link to an updated version.
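To give a flavor of the velocity threshold idea from the presentation, here is a minimal sketch of I-VT-style classification: compute sample-to-sample gaze velocity and label samples above a threshold as saccades, the rest as fixations. This is an illustrative toy (Python rather than Matlab, synthetic data, and the 30 deg/s threshold is just a common ballpark), not the code from the talk.

```python
import numpy as np

def ivt_classify(x, y, t, threshold_deg_s=30.0):
    """Label gaze samples 'fix' or 'sac' using a simple velocity threshold (I-VT).

    x, y: gaze angles in degrees; t: timestamps in seconds.
    Returns one label per input sample.
    """
    # Point-to-point angular velocity in deg/s
    vel = np.sqrt(np.diff(x) ** 2 + np.diff(y) ** 2) / np.diff(t)
    # Pad the first sample so the output length matches the input
    vel = np.concatenate(([vel[0]], vel))
    return np.where(vel > threshold_deg_s, "sac", "fix")

# Synthetic trace at 100 Hz: fixation, a fast 10-degree jump, fixation again
t = np.arange(9) * 0.01
x = np.array([0, 0, 0, 0, 5, 10, 10, 10, 10], dtype=float)
y = np.zeros_like(x)
labels = ivt_classify(x, y, t)
```

Real implementations add smoothing of the velocity signal and minimum-duration rules to merge spurious short events, which this sketch omits.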


Psychophysics toolbox and Tobii eye trackers

My colleague Estefania Dominguez has some really great software to help with getting up and running with Tobii’s eye trackers using Matlab & the Psychophysics Toolbox (and also some details on syncing equipment, in particular EEG).

The Tobii 3.0 SDK includes a good base for programming in C++, Python, .NET, and Matlab. However, it does not include any examples of running eye tracking calibrations or streaming gaze data from Matlab together with the Psychophysics Toolbox 3.0. Since this combination is fairly popular among vision & behavioral researchers using Matlab, Estefania made some helpful tools to share. Since joining Tobii, I’ve modified her tools and have them here for download:

Tobii Eye Tracker SDK and Psychophysics Toolbox Integration

I’ve fixed a couple of small visualization bugs, modified the visuals a bit (adding more verbose output, plus a direct display of, and return values for, the mean & standard deviation of the calibration error at each point), and altered some of the code structure to make it a little cleaner to read (at least for me).
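The per-point error summary amounts to taking the Euclidean distance between each recorded gaze sample and its calibration target, then averaging. A small sketch of that computation (Python rather than the tool's Matlab, with a made-up data layout and normalized 0–1 screen coordinates assumed):

```python
import numpy as np

# Hypothetical data: for each calibration target, the gaze samples
# recorded while the participant looked at it (normalized screen coords).
points = {
    (0.5, 0.5): np.array([[0.49, 0.51], [0.51, 0.50], [0.50, 0.49]]),
    (0.1, 0.1): np.array([[0.12, 0.09], [0.11, 0.11]]),
}

def calibration_error_stats(points):
    """Return {target: (mean_error, std_error)} using Euclidean distances."""
    stats = {}
    for target, samples in points.items():
        err = np.linalg.norm(samples - np.array(target), axis=1)
        stats[target] = (err.mean(), err.std())
    return stats
```

Reporting both the mean and the spread per point is useful because a point can have a small average error but noisy samples (or vice versa), which suggests different fixes (re-calibrating that point vs. checking tracking quality).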

Vintage eye tracking tutorial & haptic simulation

This is a bit dated (circa 2003?), but here is a tutorial we put together back at the University of Rochester for using a head-mounted and an HMD-mounted ASL eye tracker. It gives some good tips for those systems & covers the basics of eye tracking in general.

Bonus: Here is another tutorial on the Phantom haptic feedback system (i.e. robot arms that give force feedback so it feels like you are touching virtual objects).

Both of these pages are from the Hayhoe & Ballard VR Lab at UT. Unfortunately, the site is out of date; they now have a totally new lab space with a more modern setup. Still, it covers much of the equipment I worked with between 2001 and 2012.


I am a scientist with an interest in studying human vision. I study how we move our eyes to pick up and store visual information during everyday life. I am currently working as a post-doctoral researcher with Tobii Technology AB as an experienced researcher within the LanPercept initial training network, part of the European Union’s Marie Curie Actions. In my current position, I am focused on using gaze-contingent eye tracking technology to study interactions between vision and language. Beyond this research, I am also interested in psychology, neuroscience, computer vision, and artificial intelligence/machine learning. I’ll be using this space to share notes on prior research as well as some recent insights and tips from my time working at Tobii.