Webcam Eye Tracker

Building your own eye tracker for dirt cheap. How hard can it be? Turns out the basics are surprisingly simple! On this page, I’ll try to keep you posted on how the project advances. If you want to play around with the source code, feel free to grab it off GitHub.

First steps

11 October 2013

This video shows the result of my first attempt at pupil tracking. A lot of work needs to be done before this is actually useable, but I’m already quite happy with the results. The software is, of course, based on PyGaze. It uses the new webcam library. Image analysis is done on the fly, using PyGame. The webcam I use is a Trust Cuby model (retail price: 15 Euro), from which I removed the infrared filter. All regular lights in my office were off during tracking; I used a bunch of infrared LEDs to illuminate my face.

Progress

25 November 2013

I’ve added two new videos showing the current state of the project. As you can see, I made quite a bit of progress! A Graphical User Interface (GUI) has been added, so users can set their system up more easily (previously, this was done via code and keyboard shortcuts). Furthermore, the pupil’s edges (indicated by the green rectangle) are now detected, which means the software can now be used for pupillometry (the science of measuring pupil size). The pupil centre is still indicated by the red dot, but there is now the option to only look for a potential pupil inside a user-defined enclosure. This is useful, because pupil detection is based on a rather simple principle: “find the largest dark bit in the picture; this must be the pupil!”. As you can see, my eyebrows and hair are pretty dark as well, and are therefore often falsely recognized as a pupil. The blue rectangle you see in the screen that says “select pupil and set pupil detection bounds” is the enclosure outside of which no pupil detection is attempted. As you can see in the video, this pupil enclosure moves along with the pupil, so you don’t have to worry about moving your head. Note that the lighting conditions were pretty normal in the current videos: I was simply sitting in my office, with all the lights and two monitors on, without the infrared LEDs that I used for the previous video.

Software explained

My software works in a relatively straightforward way. Every image that the webcam produces (30 per second!) is analyzed to find the dark bits in the image. This makes sense, because your pupil usually is one of the darkest parts of an image of your face (go check for yourself by looking through your Facebook profile pics). Of course, how dark exactly your pupil is can differ depending on the environment you are in. Therefore, you will have to tell my software what ‘dark’ actually is. You do so by setting a threshold value: a single number below which everything is considered ‘dark’. If you are now wondering “how can any part of an image be ‘lower than a single number’?”, you’re on the right track!

A computer does not see a picture the way we humans do. Quite frankly, it doesn’t really ‘see’ the picture at all! To your computer, your profile pic is simply a collection of pixels, the tiniest parts of an image. If you zoom in real close, you should be able to see them in any image: small squares, each consisting of only a single colour. And to a computer, a colour isn’t a colour in the way we perceive colours: it is three numbers, one value for the amount of red, one for the amount of green, and one for the amount of blue. In our case, these numbers range from 0 to 255, where 0 means ‘none of this colour at all, please’ and 255 means ‘maximal colour!’. To give you some examples: (0,0,0) is black (no colour at all) and (255,255,255) is white (maximal colour for all). By now, you might be able to guess what the brightest red is to a computer: (255,0,0).
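Tying this back to the threshold idea from above, here is a toy Python sketch (plain Python, not taken from the actual software; `is_dark` is a name I made up for illustration):

```python
# To a computer, a colour is three numbers: (red, green, blue),
# each running from 0 ('none of this colour') to 255 ('maximal colour').
black = (0, 0, 0)
white = (255, 255, 255)
brightest_red = (255, 0, 0)

def is_dark(pixel, threshold):
    """A pixel only counts as 'dark' if ALL three values are below threshold."""
    return all(value < threshold for value in pixel)

print(is_dark(black, 50))          # True
print(is_dark(white, 50))          # False
print(is_dark(brightest_red, 50))  # False: the red value is 255
```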

Now that you know how a computer reads out the webcam’s images, it’s time to return to pupil detection. As I mentioned before, you can select a single threshold value for what my software must consider ‘dark’ (and therefore potentially the pupil!). My software then looks at every single pixel in the image, and checks the values for red, green and blue against that threshold. Only if ALL three values for a pixel are below the threshold is that pixel considered ‘dark’. In the movies below, you can see a blue-black screen, where all the dark parts of the webcam’s video stream are black and all other pixels are blue. As you can see, the pupil is one of the darkest parts in the video stream. In essence, this is what we want. But what about all the other dark parts of the image? We don’t want the software to mistake my eyebrows for my pupil!
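The blue-black preview screen can be mimicked in a few lines of NumPy. This is a sketch of the principle only; the real software does its thresholding through PyGame rather than NumPy, and `dark_mask` and `preview` are names I made up:

```python
import numpy as np

def dark_mask(rgb_image, threshold):
    """True wherever ALL three colour channels fall below the threshold.

    rgb_image has shape (height, width, 3), with values from 0 to 255.
    """
    return np.all(rgb_image < threshold, axis=2)

def preview(rgb_image, threshold):
    """Paint dark pixels black and everything else blue, like the preview."""
    out = np.zeros_like(rgb_image)
    out[:, :] = (0, 0, 255)                           # blue everywhere...
    out[dark_mask(rgb_image, threshold)] = (0, 0, 0)  # ...except the dark bits
    return out

# A 1x3 test image: a black pixel, a white pixel, and a dark grey pixel.
img = np.array([[[0, 0, 0], [255, 255, 255], [40, 40, 40]]], dtype=np.uint8)
print(dark_mask(img, 50))  # [[ True False  True]]
```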

The easiest way to prevent incorrect pupil detection is to specify where in each image my software is allowed to look. Basically, it needs to know where the pupil is, and how big the area around the pupil is in which it may search. The simplest way to achieve this is to tell my software directly where your pupil is. If you’re thinking “Wait, hang on, Mr. Genius. Your software is supposed to tell me where the pupil is, not the other way around!”, you might have a point. Luckily, you only have to tell my software where your pupil is once. After that, it’s perfectly capable of telling you where your pupil is. Alternatively, I could try to write some sophisticated face detection algorithm that finds your face in an image, and then knows where to look for your pupil. This, however, has the disadvantage that an entire face would have to be present in the image, which is not necessarily the case (see the videos under First steps and Progress). On top of this, even the world’s best programmer wouldn’t be able to write face detection software that surpasses your ability to recognize an eye in an image, because you’re just so darn good at it! Therefore, I chose to let you tell the software where your pupil is, by clicking on it with the mouse.

After indicating the pupil location, you can increase or reduce the size of the ‘pupil bounding rect’, the enclosure outside of which my software ignores everything. You can set its limits to anything (and you can even deactivate it), but I’d recommend a bounding rect that encloses the entire eye, and maybe even a bit around it. The larger the bounding rect, the higher the risk of false pupil detection; but the smaller the bounding rect, the higher the risk of losing the pupil if you move too fast. After setting the rect, you can test whether your settings are good by moving and gazing around a bit, and you can adjust the threshold if needed (or go back to any earlier step). Please see the videos above for some nice demos!
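Putting the pieces together, the bounding-rect logic might be sketched like this (again a NumPy illustration of the principle, not the actual PyGame implementation; the function names are mine):

```python
import numpy as np

def find_pupil(gray, rect, threshold):
    """Centre of the dark pixels inside the bounding rect.

    gray: a 2D array of brightness values (0-255).
    rect: (x, y, w, h), the enclosure outside of which everything is ignored.
    Returns the pupil centre in full-image coordinates, or None if lost.
    """
    x, y, w, h = rect
    region = gray[y:y + h, x:x + w]
    rows, cols = np.nonzero(region < threshold)
    if len(rows) == 0:
        return None  # no dark pixels: rect too small, or threshold too strict
    return (x + int(cols.mean()), y + int(rows.mean()))

def recenter(rect, centre):
    """Move the rect along with the pupil, so head movement is tolerated."""
    x, y, w, h = rect
    cx, cy = centre
    return (cx - w // 2, cy - h // 2, w, h)

# A bright 20x20 frame with a 3x3 dark 'pupil' centred on (9, 9).
frame = np.full((20, 20), 200, dtype=np.uint8)
frame[8:11, 8:11] = 10
centre = find_pupil(frame, (4, 4, 12, 12), 50)
print(centre)                            # (9, 9)
print(recenter((4, 4, 12, 12), centre))  # (3, 3, 12, 12)
```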

87 Comments:

  1. I need Python OpenCV2 code for gaze tracking…

  2. Sir, I’m working on pupil detection and found your code helpful, but I have no idea about the Python language, so could you kindly tell me the procedure to run this code?

    • Assuming you’re on windows, the easiest way is to download the following things:

      1) Anaconda, version 2.7 (a Python distribution that includes a lot of cool stuff, including the Spyder editor)
      2) PyGame (choose the version for Python 2.7)
      3) The webcam-eyetracker source code (click on the ‘DOWNLOAD ZIP’ button)

      Now run the Anaconda installer, and then run the PyGame installer. Afterwards, unzip the webcam-eyetracker source code archive. You can open ‘GUItest.py’ in the Spyder editor, and run it from there. A better alternative is to create a batch file, in which you write: "C:\Anaconda\python.exe" "GUItest.py". Then run that batch file.

      PS: To create a batch file, create a new and empty plain text document, and change the .txt extension to .bat. Then right-click the batch file, and choose ‘Edit’.

  3. Hi Edwin,
    I’m trying to run your software on Windows, but I get “Error: Cannot set capture resolution.” at line 52 of the file “_camera_vidcapture.py”, which is a file in the pygame folder… I tried commenting out this line; the result was interesting, because I was able to reach your Welcome page in the pygame window, and when I press a key my webcam turns on and the software opens the ActiveMovie window! However, after less than a second, the image stops as python.exe stops working…
    If you have any idea what could be the source of the problem, I’ll take it. I read somewhere that pygame only handles the camera on Linux, but that’s likely out of date.

    • I’ve been trying to figure out what the problem is, more precisely.
      I found out that the software stops on line 865 of camtracker.py:
      pygame.transform.threshold(thimg, image, self.settings['pupilcol'], th, self.settings['nonthresholdcol'], 1)

      • In my case the software also stops at that particular line. I get the following error:

        TypeError: must be pygame.Surface, not None.

        I checked some of the variables that are used in the pygame.transform.threshold() function, and it seems that the image variable (passed as a parameter) is the problem. It is defined as img = self.get_snapshot() at the end of the file. However, the get_snapshot() function tries to return a self.cam.get_image(), which returns None on my pc.

        Then, on stackoverflow – http://stackoverflow.com/questions/25711028/typeerror-must-be-pygame-surface-not-none – I found something that turned out to be the solution for me.

        I added pygame.Surface((640,480)) as a parameter to get_image() in the get_snapshot() function, so:

        def get_snapshot(self):
            """Returns a snapshot, without doing any processing

            arguments
            None

            keyword arguments
            None

            returns
            snapshot  --  a pygame.surface.Surface instance,
                          containing a snapshot taken with the webcam
            """
            pygame_surface = pygame.Surface((640,480))
            return self.cam.get_image(pygame_surface)

        After this, everything worked according to plan.

  4. Hi Edwin,
    I’m trying to run the code on my Raspberry Pi, but I’m unable to track the pupil efficiently. Can you suggest a webcam for the Raspberry Pi, or tell me which one you are using?
    Thank you

    • Hi,

      Sounds like a cool project! On a Pi, you could use the Pi NoIR camera. You will have to change the code, as that one is not compatible with PyGame. The first webcam I used was a Trust Cuby (at the time, it was 15 Euros). Since then, I have also used others. They were all super cheap webcams, from which I manually removed the infrared filter (you don’t want to do that with an expensive webcam, as it mucks up your regular image quite badly).

      Good luck!

    • Hi Adeeth,
      Did you succeed in running this code without any errors?

    • Were you ever able to use the NoIR camera? I’m trying to do the same, but I’m having trouble changing the code to get the camera image.

  5. Hi there,
    I need help running this code. Could you tell me the procedure to run it
    properly? Because when I open GUItest.py…

  6. Hi Edwin Dalmaijer,
    Is the webcam eye tracker GUI-only? Is it possible to get the eye’s location in x and y from the command line?
    Can I use the code without executing GUItest.py? If yes, how can I do it?

    • Yes, you can use it without the GUI. In fact, the GUI setup is only there to return a ‘calibrated’ (set pupil threshold) tracker instance. Look up the underlying code on GitHub, and it should be self-evident :)

  7. Hi Edwin Dalmaijer,
    thanks for your project! Can I run this code with the Windows Surface Pro webcam?

    I get this:
    Exception: Error in camtracker: PyGame could not be imported and initialized! :(

    PyGame and Anaconda are installed on my PC.

    thanks

  8. ABHISHEK SACHAN

    Hello
    How can I run your PyOpenCV code? Which files do I need to run, and which dependencies do I need to install?

  9. I want to make an eye tracking device. The camera you mentioned, the Trust Cuby webcam, is not available in the market right now, so which camera can I use as a replacement that will work well for tracking the pupil?

  10. Hi Edwin, I have a problem.
    You can help me?
    I installed everything as you explained, but after starting the application, when the program calls the webcam, I get an error message.

    “This application has requested the runtime to terminate it in an unusual way”.

    I can’t see in which part of the program the error occurs.

    What could it be?

    Thank you!

  11. rajarajan elango

    I am getting the error “VideoCapture module not found” while running GUItest.py on Linux (Ubuntu). How do I install that library on Ubuntu? Kindly help. I installed PIL but I’m having the same error.

  12. just a beginner question, what is the difference between openCV and PyGaze?

    I have a saved mp4 video of the eye/pupil, and I want to track the movement.

    I guess that I don’t need face recognition (that seems to be a feature of OpenCV)?

    • PyGaze is a software library that you can use to create experiments for psychological research. It allows you to display things on the monitor, to interface with external devices (keyboards, mice, joysticks, EEG equipment, etc.), and also to interface with existing eye trackers (EyeLink, EyeTribe, GazePoint, SMI, and Tobii). See here for the source code: https://github.com/esdalmaijer/PyGaze

      However, what you’re likely referring to is my webcam eye tracker. This allows you to track pupils and glints in a webcam stream. The old codebase used PyGame for this, but the newer version uses OpenCV (including some of its face detection functionality). For the source code, see here: https://github.com/esdalmaijer/webcam-eyetracker

      For your purpose, I think you might want to use my OpenCV implementation. This offers a generic class that handles the tracking, and all you have to do to allow it to process your videos rather than a webcam stream is to slightly adjust the existing code. Basic steps:

      1) Inherit the generic class, EyeTracker, in a child class for your video analysis.
      2) Use my webcam tracker as inspiration. https://github.com/esdalmaijer/webcam-eyetracker/blob/master/PyOpenCV/pygazetracker/webcam.py
      3) Use my image implementation as inspiration: https://github.com/esdalmaijer/webcam-eyetracker/blob/master/PyOpenCV/pygazetracker/images.py
      4) Write your own class for your videos. Make sure it inherits the generic EyeTracker class, and that you define a ‘connect’, a ‘_get_frame’, and a ‘_close’ function.

  13. Hi.

    Thank you for all the work you have done here.

    I would like to run this on a raspberry pi 3 using the raspicam module. Is there an easy way to change the input to raspicam?

  14. I installed everything on a Raspberry Pi. When I try to run the OpenCV webcam-eyetracker example.py file, I get the following error.

    Traceback (most recent call last):
    File "/home/pi/webcam-eyetracker-master/PyOpenCV/example.py", line 9, in
    from pygazetracker.webcam import WebCamTracker
    File "/home/pi/webcam-eyetracker-master/PyOpenCV/pygazetracker/__init__.py", line 22, in
    _DIR = os.path.abspath(os.path.dirname(__file__)).decode(u'utf-8')
    AttributeError: 'str' object has no attribute 'decode'
    >>>

    Any idea what is wrong here?

  15. Is a webcam with IR LEDs harmful for the eyes? I’m asking because I am using an A4Tech 333 model.

    • It can definitely be harmful at high intensities, as it can heat up your retina (which is bad, m’kay?). That’s unlikely to occur in consumer devices, although I’m not familiar with the model you mention. You could opt for being careful, and not use it close to your eyes.

  16. Hi Edwin,
    I’m trying to make the OpenCV version of the webcam-eyetracker work on my computer (by running “python example.py” from the command console), but I cannot. I think I downloaded all the dependencies that you indicate (Python-OpenCV (cv2), NumPy, SciPy, and MatPlotLib), and when I run it in the console I get: “VIDIOC_QUERYMENU: Invalid argument”.
    If you could walk me through the steps one by one, or help me find where my mistake lies, I would appreciate it.
    Best regards.

  17. Hi Edwin,
    I have a question regarding the pupil position field in the program: what exactly does it represent? Is it a position on the screen?

  18. Hi Edwin,
    I have an error when I try to run the program; this is the error:
    Traceback (most recent call last):
    File “C:\Users\usuario1\Anaconda2\lib\multiprocessing\queues.py”, line 277, in _feed
    send(obj)
    IOError: [Errno 232] Se está cerrando la canalización (the pipe is being closed)
    Elapsed time: 3400.000 ms

    I don’t know what I might have wrong…
    Could you help me?
    Thank you

  19. Hi Edwin,

    Both your sample codes aren’t working for me, so I gather that PyGame isn’t working correctly on their end.

    I am working with OpenCV3 and had to make minor changes to the sample code to make it work, but now I have “Assertion failed” errors everywhere. Your code isn’t easy to understand from the outside, and I can’t follow what you’re doing well enough to see where those errors come from.

    Could you give me a hint please ?

    (The main line I changed is : flags=cv2.CASCADE_SCALE_IMAGE instead of flags=cv2.cv.CV_HAAR_SCALE_IMAGE)

    Thanks,
    Amelie.

  20. Hi Edwin,

    Thanks for the good post info, and your patience answering all these questions :-).

    I’m curious, though (I haven’t dug into the project yet): it seems like the project is pupil tracking, but I was wondering what you’ve used it for. I’m curious about building a super-cheap eye tracker for website heatmap analysis. I’ve got no relevant experience, but just assumed that high-tech hardware would be required.

    Does pupil tracking translate into tracking the users gaze (easily)? And are commercial webcams sufficiently capable?

    For my project I am considering using an Xbox One Kinect, as it comes with an IR mode, and 60 FPS at a relatively low resolution (640×480). In your experience, will that be sufficient to give mostly-reliable gaze tracking output?

    • >> I’ve got no relevant experience

      I have plenty of software dev, just none in pupil tracking exactly :-)

    • Hi Stephen,

      With my OpenCV-based pupil tracker, you can do two things: 1) Track the pupil centre, and 2) Track the centre of the corneal reflection (CR). The difference between these two is linearly related to the gaze position. Hence, if you consecutively show e.g. 9 points on the screen while you record the pupil-CR difference, you should be able to figure out the intercept and slope of the linear function that translates between pupil-CR and gaze position. That is: gaze_x = intercept + (pupil_x – CR_x) * slope, and the same for gaze_y.

      I haven’t used these for any serious projects, as I am lucky enough to have access to more expensive hardware. In general, you should probably be able to make out at what quadrant people are looking. Anything beyond that is a bonus 😉

      Cheers,
      Edwin
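The linear mapping described above can be estimated with a few lines of NumPy. This is a sketch with made-up calibration numbers, not part of the webcam-eyetracker codebase; `fit_axis` and the data values are illustrative assumptions:

```python
import numpy as np

def fit_axis(pupil_minus_cr, gaze):
    """Least-squares fit of: gaze = intercept + pupil_minus_cr * slope."""
    slope, intercept = np.polyfit(pupil_minus_cr, gaze, 1)
    return intercept, slope

# Made-up calibration data: three known on-screen x positions, and the
# pupil-CR differences (in webcam pixels) recorded while looking at them.
gaze_x = np.array([100.0, 500.0, 900.0])
diff_x = np.array([-2.0, 0.0, 2.0])

intercept, slope = fit_axis(diff_x, gaze_x)
print(round(intercept, 6), round(slope, 6))  # 500.0 200.0

# Converting a new pupil-CR measurement to an on-screen coordinate:
print(round(intercept + slope * 1.0, 6))  # 700.0
```

The same fit would then be run separately for the vertical axis.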

  21. I am looking for a device which can help my son, who has cerebral palsy, to type or assist in communication using an AAC.

    I want a mechanism whereby he can use his eye movements for typing or using an AAC, since he does not have control of his hands, and the only input we can use from him is his head or eye movements.

    I would need your inputs, since you are into eye tracking.
    Are you aware of any similar things?

    Thanks in advance.

  22. Hi Edwin,

    Thank you for the code. I have a question: why did you remove the infrared filter?

    • Normally infrared light is filtered out, so that the resulting spectrum is closer to human vision. In this case the infrared bit of the spectrum contains a lot of valuable information, in particular when the webcam is used in conjunction with a source of infrared light (I used a battery of LEDs).

  23. Dear Edwin
    Thank you so much for your effort. You really help young researchers studying neuromarketing. I am a young Research Assistant. I wish PyGaze were more user-friendly; I have no experience with, and no time to learn more about, Python.

    I developed a head-mounted, infrared-illuminated webcam eye tracker by hacking a PS3 Eye cam. It works perfectly. I am looking for good software that is capable of accurate eye tracking and analysis.

    Can I use one of your programs? I use Windows 10; could you please tell me which software I can use, and what the download steps are, in order to achieve accurate tracking and analysis using my IR eye-tracking webcam?

  24. I just want to do pupil tracking from a video I have. It is a college project, and I do not know which code to use from these codes. Please help me.

  25. Dear Muhammed,
    Hello from Turkey.
    I have to do the same thing… I just want to do pupil tracking from a video and analyse the data (heat maps, focus maps, etc.). If you find any solution, please let me know at ilkersahin1985@hotmail.com.

  26. I am struggling to get the code to work. When running it in cmd with Python 3, with all the modules installed, it gives me:
    left = facecrop[y:y+h, x:x+w]
    TypeError: slice indices must be integers or None or have an __index__ method
    I have tried wrapping the indices in int() and many other things.
    I also tried Python 2, but it gave me a separate issue:
    [ WARN:0] terminating async callback
    Both Python versions are in their own separate environments, and both worked a bit (the light on the camera turned on, but nothing happened).
    I also tried the debugger, but I can’t seem to fully understand it with pdb.
    Running python -m pdb myscript.py gave me:
    -> if __name__ == u’__main__':
    (Pdb)
    At this point, some help would be kindly appreciated (I’m using Windows 10).
    Thanks in advance

  27. First of all, thanks Edwin, for sharing the resources with us

    I’m working with the OpenCV-based version. I’ve done all the necessary setup; when I run example.py with PyCharm, the camera lights up and then turns off. There is no image, but the program appears to be working without error.
    Can you guess why this problem occurs?

    • Hi Kenan,

      It seems clear that the camera turns off, and the feed is inaccessible. Annoyingly, OpenCV doesn’t throw any informative errors in this case, and instead of a new frame just returns None. The problem usually is with your OpenCV installation, and/or its dependencies. (Can it find ffmpeg? Are the correct codecs installed? Etc.)

      Good luck!

  28. Hello Edwin;
    Thank you for your quick reply.
    Yes, I set up ffmpeg. I use a Logitech C920 webcam; its video encoder is Cisco H264.
    ---------------------------
    Python: 2.7.15
    OpenCV: pyopencv fails during installation.
    ---------------------------
    “(Webcam) C:\Windows\system32> pip install pyopencv==2.1.0.wr1.0.0
    DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won’t be maintained after that date. A future version of pip will drop support for Python 2.7.
    Collecting pyopencv==2.1.0.wr1.0.0
      Using cached https://files.pythonhosted.org/packages/48/c9/c04aa0afd670664133440bafd5ea7b7a8e6e7f1a6ee27ba5718881e642d5/pyopencv-2.1.0.wr1.0.0.tar.gz
        Complete output from command python setup.py egg_info:
        (…before you can run ‘setup.py’, copy file ‘config_example_win32.py’ or ‘config_example_linux.py’ to file ‘config.py’, then run ‘setup.py’ again…)

        ----------------------------------------
    Command “python setup.py egg_info” failed with error code -1 in c:\users\ibm\appdata\local\temp\pip-install-ekozsis\pyopencv\

    (Webcam) C:\Windows\system32> ”
    -------------------------------------------------

    So I installed opencv-python==3.4.5.20. It installed successfully. But when I run example.py, the program runs but the calibration screen does not appear.
    The program keeps running, and only the message
     [WARN: 0] terminating async callback
    comes up.
    --------------------------------------------
    As for the PyGame-based version: I have solved its earlier problems and the program runs, but this time the calibrate screen says to press a key, and pressing any key freezes it. I can’t get to the calibration screen.

    Thank you in advance, Mr. Edwin.
    You’re great ..

  29. Is there any chance that you still have the code for the first iteration of your pupil tracker? Basically, the code from the first video that you uploaded to this page. I’m working on a research project for college, and I think it could be very useful.

  30. Hi Edwin ,
    I have tried both PyGame and PyOpenCV folder but faced problem in both of it.

    When I run GUItest.py in the PyGame folder it shows error as below:

    File "C:\Users\liyin\Anaconda2\lib\site-packages\VideoCapture\__init__.py", line 154, in getImage
    'RGB', (width, height), buffer, 'raw', 'BGR', 0, -1)
    File "C:\Users\liyin\Anaconda2\lib\site-packages\PIL\Image.py", line 2053, in fromstring
    "Please call frombytes() instead.")
    Exception: fromstring() has been removed. Please call frombytes() instead.

    I have tried to change fromstring into frombytes, but then there is another error:
    File "camtracker.py", line 862, in get_snapshot
    data = image.tostring()
    File "C:\Users\liyin\Anaconda2\lib\site-packages\PIL\Image.py", line 686, in tostring
    "Please call tobytes() instead.")
    Exception: tostring() has been removed. Please call tobytes() instead.

    And when I changed tostring into tobytes, there was a runtime error.

    I have looked it up on the internet, and some say that it may be a Pillow version problem, but even when I change to version 2.9.0 or 3.1.0, both give errors.

    For PyOpenCV, when I run example.py the webcam on my laptop lights up and then turns off, but nothing is shown. It appears to be working without error.

    I’m using Anaconda2 with Python 2.7 32-bit; the OpenCV I downloaded through “pip install opencv-python”. I have downloaded ffmpeg, but I don’t know how to set it up.

  31. Hi Edwin,

    I’m currently doing my master’s project, which requires recording gaze data from users. I only have coding experience in Python, so your work turned out to be the best choice for me. I tried your code and it works fine, but it seems only the pupil data can be recorded and saved. I was wondering if I can get the raw gaze coordinates as well? Thanks in advance!!

    Best,
    Agnes

    • Hi Agnes,

      You’ll need to compute a conversion between the webcam-image coordinates and gaze coordinates. The usual process for this is by running a calibration. This entails showing points on a screen, and asking a participant to look at them. Because the points are known, you can use this to fit a function that transforms webcam-image coordinates into gaze coordinates. The usual approach is to compute the difference between pupil and glint (within the webcam image) along the horizontal and vertical axis; this should linearly relate to the on-screen gaze coordinates. Another approach is to just fit a polynomial, using pupil coordinates as predictors, and gaze coordinates as outcomes.

      Good luck! :)
      Edwin

  32. Hi. We’re currently working on making RealEye.io webcam eye-tracking available from PyGaze.
    I’m hoping we will make it publicly available soon :)

  33. I have a problem with the webcam and with importing pygame… it does not seem to work. Help, please?

  34. Hi Edwin,

    Thank you for all your work on the eye tracker. I am working on my master’s project and found that PyGaze will be of great use, especially the webcam tracker. I ran the OpenCV version, since I use a Mac and PyGame does not seem to be supported there. I ran the example.py file. I see the camera turning on and off; however, I do not see anything else on the screen (for example, no outputs or recorded videos). The program also does not seem to terminate.

    Is this the expected behavior or am I missing something here?

    Thank You!

  35. Hello Edwin,
    Do you remember how exactly you removed the IR filter from the Cuby cam?
    The IR filter seems to face outward, from the looks of it (a light red tint and a flat glass surface are visible).
    Usually the filter sits on the inside, facing the sensor, making it easy to remove.
    A hint would be appreciated.
    Thanks, Jorg

  36. from __init__ import _message

    ImportError: cannot import name '_message' from '__init__' (C:\Users\tulip\anaconda3\lib\site-packages\IPython\extensions\__init__.py)
    I receive this error; could you help me? Also, when I want to run some programs, the program wants to load pygazetracker, but I cannot find a program with that name; I see just a folder containing __init__, images, generic, and webcam.

  37. Pingback: Webcam eye tracker – Quantitative Exploration of Development (Q.E.D.)

  38. Pingback: PyGaze Analyser – Quantitative Exploration of Development (Q.E.D.)

  39. Can we download it on a MacBook Air? I need to track the gaze of small children with cerebral visual impairment and poor attention.

  40. Have you heard of anyone else who has carried forward the work of integrating webcam support into PyGaze as Damian said he was doing? I was assigned to a masters project team that needs gaze tracking data on fixations and saccades, but no one on our team has any experience with the specialized hardware.

  41. Hi Edwin, really nice work!
    Do you think it’s possible to measure pupil dilation using a common webcam?
    Under the right lighting conditions, of course.

    • Only if the camera is really close to the face. The further away your camera is, the smaller the pupil in the image. It quickly becomes just a few pixels wide, which is nowhere near enough to reliably measure subtle dilations or constrictions. (And, indeed, you’d need optimal light conditions. Beware the glare! Screens, lights, windows, etc. can reflect on the cornea, which obscures the pupil.)

  42. Hey, I have a special request. I am not a programmer, but one of my friends has a severe illness and is only able to use her eyes (open and close, move up and down, and blink).
    I want to help her communicate in a better way based on eye gaze detection. For this, I guess I need a free open-source eye gaze detection algorithm, a camera, a screen, probably an Arduino or Raspberry Pi, a slide or PPT with boxes (letters, numbers, and common words) that can be looked at and chosen, and an algorithm that can be controlled via the eyes to move along this slide of boxes and select the focused box (letter, etc.).
    Is there anyone who could give me some hints, recommendations, and tips on how to realize this without programming skills? Any Git repository, online instructions, or comparable…

    Thanks a lot already for you support
    Best regards
