Video Annotation Revisited

We have some students doing a term project using the SMI eye tracking glasses. They need to manually annotate the eye tracking data and stimuli, but we have more students than copies of the SMI BeGaze software, so we tried out some of the annotation tools I’ve mentioned previously. Unfortunately, while recently demoing the RIT Code app to some students, it turned out that with movies using newer codecs the application is painfully slow when moving through video frames. The software works well with older codecs (e.g. MPEG-2), but it is showing its age, having been created 10+ years ago with the older QuickTime framework; I will need to look into whether it can be updated. In the meantime, one of the students found the Anvil Video Annotation Tool.

Video codecs and cross-platform/application compatibility can drive you nuts; I messed around way too much today just to get Anvil to work. The problem is that Anvil supports only a very particular list of ‘older’ codecs. I am also not sure how well maintained the software is, since the links to the demo movies for testing Anvil were broken. My movies from the SMI tracker use the Xvid codec in an .avi container, which Anvil does not support. To get something Anvil-compatible I tried a few things (VirtualDub, MPEG Streamclip, HandBrake, and ffmpeg), running into some problems with each.

Ultimately, I found that a combination of HandBrake (which can open these .avi files but can’t export to the old codecs) and MPEG Streamclip (which can’t open the .avi files but can export to the old formats) will work.

Make sure to install HandBrake (https://handbrake.fr/) and MPEG Streamclip (http://www.squared5.com/).

First, use HandBrake to open the SMI tracker .avi files and convert them to .mp4, with the video encoder set to H.264 and the framerate set to ‘Same as source’.
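If you would rather script this step, HandBrake also ships a command-line version that should do the equivalent conversion. A sketch, with placeholder filenames; check the flags against your HandBrakeCLI version:

    # Convert the SMI .avi to an H.264 .mp4, keeping the source framerate (variable)
    HandBrakeCLI -i smi_recording.avi -o smi_recording.mp4 -e x264 --vfr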

Now you should be able to open this new video in MPEG Streamclip. Choose File > Export to QuickTime, and then pick the compression in the dropdown box. H.261 and H.263 work, but you can also try the others listed here: http://www.anvil-software.org/#codecs

The only problem is that MPEG Streamclip converts to these formats really slowly; it seems to take about as long as the video itself, so a 10 minute video means a 10 minute wait (at least on my 2014 MacBook Pro). It might be worth trying alternatives for better speed/quality, as your mileage may vary from mine; one option is sketched below.
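For example, ffmpeg can encode H.263 directly and would skip MPEG Streamclip entirely. A minimal sketch, untested with Anvil; the filenames are placeholders, and note that ffmpeg’s H.263 encoder only accepts a few fixed frame sizes (e.g. CIF, 352x288):

    # Re-encode the HandBrake output as H.263 at CIF resolution (352x288),
    # one of the few frame sizes the H.263 encoder accepts; -an drops the audio track.
    ffmpeg -i smi_recording.mp4 -c:v h263 -s 352x288 -r 25 -an smi_recording_h263.avi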

Eye Tracking Annotation Tools

I recently attended a workshop in Aarhus discussing Methods in Mobile Eye Tracking. There were several good talks and discussions, including one on annotation tools for mobile video data. While most eye tracking manufacturers provide software tools with their hardware, there are also open-source, hardware-agnostic tools from academia that can provide some features lacking in the commercial software. Here is a short list of such tools:

The first two are Mac-specific; ELAN supports Windows, Linux & Mac.

Video Coder (Foulsham Lab, University of Essex)

RIT Code (Pelz Lab, Rochester Institute of Technology)

ELAN (Max Planck Institute for Psycholinguistics, The Language Archive, Nijmegen, The Netherlands)

*Update August 8 2016*

mediaBlix VideoGazer – I happened to see this commercial tool from mediaBlix.

*Update Jan 31, 2018*

Anvil Video Annotation – ‘ANVIL is a free video annotation tool, developed by Michael Kipp.’

 

Psychophysics Toolbox and Tobii eye trackers

My colleague Estefania Dominguez has some really great software to help with getting up and running with Tobii’s eye trackers using Matlab & the Psychophysics Toolbox (and also some details on syncing equipment, in particular EEG).

The Tobii 3.0 SDK includes a good base for programming in C++, Python, .NET and Matlab. However, the SDK does not have any examples of running eye tracking calibrations or streaming data in Matlab together with the Psychophysics Toolbox 3.0. Since this combination is fairly popular among vision & behavioral researchers using Matlab, Estefania made some helpful tools to share. Since joining Tobii, I’ve modified her tools and have them here for download:

Tobii Eye Tracker SDK and Psychophysics Toolbox Integration
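To give a flavor of how the pieces fit together, here is a rough Matlab sketch of presenting calibration points with the Psychophysics Toolbox while the Tobii SDK collects calibration data. The tetio_* calls follow the Tobii Analytics SDK 3.0 Matlab binding as I remember it, and the tracker ID is made up; treat the names and signatures as assumptions and check them against your SDK version (the actual download does all of this properly):

    % Sketch: PTB draws calibration dots while the Tobii SDK calibrates.
    % NOTE: tetio_* names/signatures are assumptions; verify against your SDK.
    tetio_init();
    tetio_connectTracker('TX120-XXX-XXXXXXXX');   % placeholder tracker ID
    [win, rect] = Screen('OpenWindow', 0, 0);     % full-screen black window
    points = [0.1 0.1; 0.9 0.1; 0.5 0.5; 0.1 0.9; 0.9 0.9];  % normalized coords
    tetio_startCalib();
    for i = 1:size(points, 1)
        xy = points(i, :) .* rect(3:4);           % convert to pixel coords
        Screen('DrawDots', win, xy', 20, [255 0 0], [], 2);
        Screen('Flip', win);
        WaitSecs(1);                              % let the eye land on the dot
        tetio_addCalibPoint(points(i, 1), points(i, 2));
        WaitSecs(0.5);
    end
    tetio_computeCalib();
    tetio_stopCalib();
    Screen('CloseAll');
    tetio_disconnectTracker();
    tetio_cleanUp();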

I’ve fixed a couple of small visualization bugs, modified the visuals a bit (adding more verbose output, plus an on-screen display of, and return values for, the mean & standard deviation of the calibration error at each point), and restructured the code a little to make it cleaner to read (at least for me). The gist of that per-point error summary is sketched below.
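Here is a minimal Matlab sketch of that kind of per-point error summary; the variable names and the inline sample data are my own illustration, not the code from the actual download:

    % Per-point calibration error summary (illustrative sketch with fake data).
    % targets: one row per calibration point (normalized screen coordinates).
    % gaze:    gaze samples as [x, y, targetIndex].
    targets = [0.5 0.5; 0.1 0.1; 0.9 0.9];
    gaze    = [0.52 0.49 1; 0.48 0.51 1; 0.12 0.09 2; ...
               0.11 0.12 2; 0.88 0.91 3; 0.92 0.90 3];
    for i = 1:size(targets, 1)
        samples = gaze(gaze(:, 3) == i, 1:2);     % samples taken at point i
        % Euclidean distance of each sample from its target
        d = sqrt(sum((samples - repmat(targets(i, :), size(samples, 1), 1)).^2, 2));
        fprintf('Point %d: mean error %.3f, std %.3f\n', i, mean(d), std(d));
    end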

Vintage eye tracking tutorial & haptic simulation

This is a bit dated (circa 2003?), but here is a tutorial we put together back at the University of Rochester for using head-mounted and HMD-mounted ASL eye trackers. It gives some good tips for those systems & covers the basics of eye tracking in general.

Bonus: here is another tutorial, on the Phantom haptic feedback system (i.e., robot arms that give force feedback so it feels like you are touching virtual objects).

Both of these pages are from the Hayhoe & Ballard VR Lab at UT. Unfortunately, the site is out of date; they actually have a totally new lab space with a more modern setup. Still, this covers much of the equipment I worked with between 2001 and 2012.

Welcome!

I am a scientist with an interest in studying human vision. I study how we move our eyes to pick up and store visual information during everyday life. I am currently working as a post-doctoral researcher with Tobii Technology AB, as an experienced researcher within the LanPercept initial training network, part of the European Union’s Marie Curie Actions. In my current position, I am focused on using gaze-contingent eye tracking technology to study interactions between vision and language. In addition to this research, I am also interested in psychology, neuroscience, computer vision, and artificial intelligence/machine learning. I’ll be using this space to share notes on prior research as well as some recent insights and tips from my time working at Tobii.