Computer vision: Taylor Swift saliency mapping

In cognitive neuroscience, we’re interested in what guides human attention. We distinguish between influences from high-level cognition (e.g. current goals) and low-level visual features. There are highly sophisticated models, known as saliency models, of how visual features such as intensity, colour, and movement guide human attention. Computerised implementations of these models allow computers to mimic human eye movements. It turns out that Taylor Swift’s amazing videos are an excellent example!
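To get a rough feel for what such a model computes, here is a minimal sketch of a bottom-up saliency map using OpenCV’s spectral-residual detector. This is not necessarily the model used in the post; it assumes the opencv-contrib-python package is installed, and the filename is just a placeholder.

# Minimal sketch: compute a bottom-up saliency map for a single video frame.
# Assumes opencv-contrib-python is installed; "frame.png" is a placeholder filename.
import cv2

frame = cv2.imread("frame.png")

# Spectral-residual saliency: a simple, fast static saliency model.
detector = cv2.saliency.StaticSaliencySpectralResidual_create()
success, saliency_map = detector.computeSaliency(frame)

if success:
    # Scale the [0, 1] float map to 8-bit so it can be saved or displayed.
    saliency_map = (saliency_map * 255).astype("uint8")
    cv2.imwrite("frame_saliency.png", saliency_map)

Bright regions in the resulting map mark the image locations that the model predicts will draw the eye.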

Continue reading

Python wrapper for Gazepoint’s OpenGaze API

Gazepoint's GP3 eye tracker.

Gazepoint is a relatively small player in the eye-tracking market. They sell two devices: the 60 Hz GP3 at $695, and the 150 Hz GP3 HD at $1995 (both prices exclude VAT and shipping). Because of its relatively low price, the basic GP3 is an appealing model for researchers on a budget. As of today, PyGaze supports Gazepoint’s trackers through their OpenGaze API. Download the new code from GitHub, and have fun!
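As a rough illustration, the snippet below sketches a minimal PyGaze recording session with the new back-end. It assumes the tracker-type string is 'opengaze', and that a GP3 is connected with the Gazepoint Control software running.

# Minimal sketch of a PyGaze recording session with a Gazepoint tracker.
# Assumes the OpenGaze back-end is selected with trackertype='opengaze'
# and that Gazepoint Control is running on its default host and port.
from pygaze.display import Display
from pygaze.eyetracker import EyeTracker

disp = Display()
tracker = EyeTracker(disp, trackertype='opengaze')

tracker.calibrate()           # run the calibration routine
tracker.start_recording()
tracker.log("trial start")    # write a message to the data file
x, y = tracker.sample()       # most recent gaze position in display coordinates
tracker.stop_recording()

tracker.close()
disp.close()

Because PyGaze exposes the same EyeTracker interface for every supported device, existing experiments should work with a Gazepoint tracker by changing only the tracker type.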

Continue reading

Tutorial: creating a Twitterbot

Although it sounds like a lot of effort, creating a Twitter bot is actually really easy! This tutorial, along with some simple tools, can help you create Twitter bots that respond when they see certain phrases, or that periodically post a tweet. The bots generate their tweets with Markov chains, which produce text that looks superficially plausible but is actually quite nonsensical. You can make the bots read your favourite texts, and they will produce new random text in the same style!
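To give a sense of how the text generation works, here is a minimal Markov-chain sketch in Python, separate from the tutorial’s own tools; the corpus filename is just a placeholder.

import random
from collections import defaultdict

def build_chain(text, order=2):
    # Map each sequence of `order` words to the list of words that follow it.
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, length=30):
    # Start from a random state and walk the chain until `length` words are produced.
    state = random.choice(list(chain.keys()))
    output = list(state)
    while len(output) < length:
        followers = chain.get(state)
        if not followers:
            break
        output.append(random.choice(followers))
        state = tuple(output[-len(state):])
    return " ".join(output)

# The corpus filename below is a placeholder; point it at any plain-text file.
corpus = open("favourite_text.txt", encoding="utf-8").read()
print(generate(build_chain(corpus, order=2), length=40))

Feeding the chain a larger corpus, or raising the order, makes the output read more like the source text, at the cost of more verbatim repetition.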

Continue reading