Lack of clarity in Matlab

An odd bug (or missing feature?) I found in MATLAB produces error messages that are largely unhelpful for tracking down why your code isn’t running. I ran into this a few months ago and solved it, but had completely forgotten the fix by the time it happened again today, so I hope this writeup helps.

I was running a program that needed resample.m, part of MathWorks’ own toolbox code. Running resample with proper inputs returned the message:

>Undefined function 'resample' for input arguments of type 'double'.

That’s strange. If you type exist('resample') it reports that the function does in fact exist. If you type edit resample.m a file comes up… what’s going on?! A file search also reveals that the MATLAB toolboxes contain at least 6 different copies of a file called resample.m (another source of confusion, but that’s another story). Another useful function is which, which tracks down the path MATLAB uses when invoking a function. And here we get:

> which resample
C:\Program Files\MATLAB\R2015a\toolbox\signal\signal\resample.m % Has no license

I’m on a network license of MATLAB shared with coworkers! So there you have it: we had exceeded our number of seats. Totally baffling when you first try to diagnose it, though. (A quicker check, as far as I can tell, is license('checkout','Signal_Toolbox'), which returns 0 when no seat is available.) MathWorks really should provide a nice verbose error when you try to use a function that does exist but can’t be executed because there is no valid license for it.

Eye Tracking Annotation Tools

I recently attended a workshop in Aarhus on Methods in Mobile Eye Tracking. There were several good talks and discussions, including one on annotation tools for mobile video data. While most eye tracking manufacturers provide software tools with their hardware, there are also open-source, hardware-agnostic tools from academia that provide some features lacking in the commercial software. Here is a short list of such tools:

The first two are Mac-specific; ELAN supports Windows, Linux & Mac.

Video Coder (Foulsham Lab, University of Essex)

RIT Code (Pelz Lab, Rochester Institute of Technology)

ELAN (Max Planck Institute for Psycholinguistics, The Language Archive, Nijmegen, The Netherlands)

*Update August 8 2016*

mediaBlix VideoGazer – I happened to see this commercial tool from mediaBlix

*Update Jan 31, 2018*

Anvil Video Annotation – ‘ANVIL is a free video annotation tool, developed by Michael Kipp.’


PsychoPy and the Tobii Python SDK

While the Tobii SDK does not directly support PsychoPy, PsychoPy has example code that should work with Tobii trackers (and several other eye trackers). However, getting the example code to run is not always easy, and some problems can be difficult to troubleshoot if you aren’t familiar with Python or PsychoPy. [Edit: Please note all of this advice pertains to PsychoPy 1.81.03 and may not work with later versions of PsychoPy.]

My coworker Burcin Dede and I have been documenting some common problems with getting PsychoPy’s demo code to run with the Tobii tracker; some important notes we have made are listed below. You can find a code example within the PsychoPy GUI under the ‘demos’ drop-down menu, in the ioHub section: Demos>ioHub>gcCursor.

In fact, PsychoPy ships with a huge number of demos that are well worth examining: many techniques for psychophysics experiments are shown off and can save you a lot of work! If you have any problems running the demo, double-check the following things:

1. Make sure that the SDK is properly set up with PyGTK and that the Python path is correctly set (SDK developer guide, pg. 62)

2. Make sure that the sample code in the Tobii SDK runs in Python; these examples are located in: C:\(sdk install location)\TobiiAnalyticsSDK-3.0.83-Win\tobii-analytics-sdk-3.0.83-win-Win32\Python27\Samples

3. You may need to try running PsychoPy as an admin

4. Be sure to comment out the portions of the config.yaml file that reference any trackers that aren’t Tobii, and to uncomment any sections that refer to the Tobii tracker.

5. The configuration file written by default specifies a non-Tobii eye tracker, so you must change it. The iohub.config.yaml file and the source code for the demo can be found in your PsychoPy install directory (it will look something like): C:\(psychopy install location)\PsychoPy2\Lib\site-packages\PsychoPy-1.80.03-py2.7.egg\psychopy-1.80.03\demos\coder\iohub\eyetracking\gcCursor

6. Download and install the latest version of the avbin library

7. If you have more than one Tobii tracker, make sure only the one you are using is plugged in. By default the PsychoPy demo requests the first tracker it finds and will behave unpredictably if it sees multiple trackers. You have to edit the program manually if you want to use a specific tracker ID.
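As an illustration of steps 4 and 5 above, the tracker-selection part of the iohub configuration file looks roughly like the following. The exact key names vary between iohub versions, so treat this as a sketch of what to comment/uncomment rather than a drop-in file:

```yaml
monitor_devices:
    # Comment out any non-Tobii tracker entries, for example:
    #- eyetracker.hw.sr_research.eyelink.EyeTracker:
    #    name: tracker
    #    ...
    # Uncomment the Tobii section so iohub loads the Tobii interface:
    - eyetracker.hw.tobii.EyeTracker:
        name: tracker
        enable: True
        model_name:       # leave blank to accept any model
        serial_number:    # leave blank to use the first tracker found
```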

**Update Sept 9 2016**

For those of you who prefer a different IDE, I’ve found Python(x,y) to be a pretty good MATLAB (my usual platform) replacement. It’s a package that bundles many useful libraries (e.g. NumPy, Matplotlib, SciPy, and many more) plus several useful Python development apps, and does a lot of the tedious housekeeping for you in one install. Python(x,y) comes bundled with Spyder, which is a very nice IDE. However, if you are trying to run PsychoPy code from Spyder, it’s not obvious at first how to make this happen (i.e. how to add the right directories to the environment). Fortunately, Jonas Lindelov has a nice blog covering several aspects of Python, including this very issue. I’ve reprinted part of his solution below (note: this GUI solution may cause a library conflict with the IPython console; if so, try the import version instead):

On Windows, Spyder comes pre-packaged in the python(x,y) package, among others. Install that. You then need to make Spyder “aware” of PsychoPy by doing the following:

  1. Open Spyder –> Tools –> PYTHONPATH manager
  2. Add the following two paths (psychopy usually installs to C:\Program Files (x86)\PsychoPy2 on Windows) and do NOT click “synchronize”.
  3. Now open Spyder again and you should be able to run a small test script that imports psychopy without errors.

Now go to “Tools –> Preferences –> Run” and tick “Execute in a dedicated Python interpreter” to make it easy to re-run the experiment without errors caused by Python shutting down when the psychopy module is unloaded.

If you end up with IPython conflicts like I did, use the import version instead: put lines like the following at the top of your scripts (the paths are examples for a default PsychoPy2 install, matching the egg layout above; adjust them to your own machine):

import sys
sys.path.append(r'C:\Program Files (x86)\PsychoPy2\Lib\site-packages')
sys.path.append(r'C:\Program Files (x86)\PsychoPy2\Lib\site-packages\PsychoPy-1.80.03-py2.7.egg')


Masterclass Follow-up

We recently organized an eye tracking workshop that featured talks on eye tracking technology, methods and research, including some really nice presentations by Tim Smith & Ignace Hooge. As part of the workshop I gave this presentation on the basics of eye movement segmentation, with a focus on velocity threshold methods. Additionally, I realized from talking with some of the attendees that I needed to make a couple more additions to the Tobii Calibration Psychtoolbox tool, so here is the link to an updated version.
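For anyone who wants the gist of the velocity threshold idea from the presentation in code, here is a minimal sketch of I-VT classification: compute a point-to-point velocity for each gaze sample and label intervals above the threshold as saccades, the rest as fixations. The function name, the synthetic data, and the 30 deg/s default threshold are my illustrative choices, not taken from the presentation:

```python
import math

def classify_ivt(x_deg, y_deg, t_s, threshold=30.0):
    """Label each inter-sample interval as 'saccade' or 'fixation'.

    x_deg, y_deg: gaze position in degrees of visual angle
    t_s: sample timestamps in seconds
    threshold: velocity threshold in degrees per second
    """
    labels = []
    for i in range(1, len(t_s)):
        dt = t_s[i] - t_s[i - 1]
        amplitude = math.hypot(x_deg[i] - x_deg[i - 1],
                               y_deg[i] - y_deg[i - 1])
        velocity = amplitude / dt  # deg/s
        labels.append('saccade' if velocity > threshold else 'fixation')
    return labels

# Synthetic 100 Hz data: small drift, a large jump (a saccade), then drift again.
t = [0.00, 0.01, 0.02, 0.03]
x = [0.0, 0.05, 5.0, 5.05]
y = [0.0, 0.00, 0.0, 0.00]
print(classify_ivt(x, y, t))  # ['fixation', 'saccade', 'fixation']
```

Real implementations also filter noise and merge or discard very short events; this only shows the thresholding step itself.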

Psychophysics toolbox and Tobii eye trackers

My colleague Estefania Dominguez has some really great software to help with getting up and running with Tobii’s eye trackers using MATLAB & the Psychophysics Toolbox (along with some details on syncing equipment, in particular EEG).

The Tobii 3.0 SDK includes a good base for programming in C++, Python, .NET and MATLAB. However, the SDK does not include any examples of running eye tracking calibrations or streaming data in MATLAB together with Psychophysics Toolbox 3.0. Since this combination is fairly popular among vision & behavioral researchers using MATLAB, Estefania made some helpful tools to share. Since joining Tobii, I’ve modified her tools and have them here for download:

Tobii Eye Tracker SDK and Psychophysics Toolbox Integration

I’ve fixed a couple of small visualization bugs, modified the visuals a bit (adding more verbose output and a direct display of, and return values for, the mean & standard deviation of calibration error at each point), and altered the code structure a little to make it cleaner to read (at least for me).
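The MATLAB tools themselves are in the download above; as a language-agnostic illustration of the per-point error statistics they report, the computation amounts to something like the following sketch. The function name and the data layout (a list of gaze estimates per calibration target) are hypothetical, not the actual interface of the tools:

```python
import math
import statistics

def calibration_error_stats(target, gaze_samples):
    """Mean and standard deviation of Euclidean error for one calibration point.

    target: (x, y) position of the calibration target
    gaze_samples: iterable of (x, y) gaze estimates recorded at that target
    """
    errors = [math.hypot(gx - target[0], gy - target[1])
              for gx, gy in gaze_samples]
    return statistics.mean(errors), statistics.stdev(errors)

# Example: a target at screen center with three gaze estimates around it.
mean_err, std_err = calibration_error_stats(
    (0.5, 0.5), [(0.52, 0.5), (0.48, 0.5), (0.5, 0.53)])
print(round(mean_err, 3))  # 0.023
```

In practice you would run this per calibration point and flag points whose mean or spread is too large to accept the calibration.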

Vintage eye tracking tutorial & haptic simulation

This is a bit dated (circa 2003?), but here is a tutorial we put together back at the University of Rochester on using head-mounted and HMD-mounted ASL eye trackers. It gives some good tips for those systems & covers the basics of eye tracking in general.

Bonus: Here is another on the Phantom haptic feedback system (i.e. robot arms that give force feedback so it feels like you are touching virtual objects).

Both of these pages are from the Hayhoe & Ballard VR Lab at UT. Unfortunately, the site is out of date; they now have a totally new lab space with a more modern setup, but this covers much of the equipment I worked with between 2001 and 2012.