Got back last week from a very nice ECVP in Berlin. Lots of interesting research; I particularly liked the seminar on retinal physiology and single-cone stimulation – lots of great techniques are required to do those sorts of things. On topics closer to my work in eye movements, I came upon a few things worth highlighting. Tobii had a really nice demo of eye tracking in a Vive. I had seen early versions, but the final production model is really impressive – having done eye tracking in VR since 2001, it was really nice to see such a seamless integration. With SMI having been bought out by Apple, this appears to be the best solution for vision research in VR I’ve seen (at least from my quick evaluation using the demos).
Acuity VR had a really nice demo of their analysis tool for VR, something that Tobii doesn’t address at the moment. Acuity’s tool does a 3D reconstruction with some nice visualisations and AOIs mapped to objects in the scene. Everything they showed looked familiar: the tools we developed in the Hayhoe & Ballard lab did the same thing, but they were tuned to each experiment and weren’t easy to adapt. I’m not sure how they will deal with complex 3D assets – we used to have problems with models that had multiple inseparable parts, which made AOI selection difficult without drawing bounding boxes. I’m curious to see how this software addresses that.
Blickshift is a company with an interesting tool set for analysing eye movements and other sensor streams, with the capability to observe many data streams across participants simultaneously.
Lastly, not from ECVP but from ECEM, relayed via a colleague: a recent addition to the manual coding software I’ve discussed before is GazeCode. I haven’t tried it yet, but it seems promising if you use Pupil Labs or Tobii mobile trackers and have a copy of MATLAB.
Also here’s a copy of our poster: Predicting Eye-Head Coordination While Looking or Pointing