Interestingly enough, we came up with the concept for this project during a meetup about interactive art installations. The idea of eye tracking came up, and we discussed what we could discover with the technology. So we set out to figure out how to run a study with free and open-source tools. We ended up needing:

A webcam to look at the user's eyes.

ITU Gaze Tracker to calibrate and interpret that data. (However, their website is now down, so I'm not sure how viable this is as part of the toolchain now.)

OGAMA (Open Gaze and Mouse Analysis) to conduct the study and to display and record the data.

A custom chin rest that we fabricated, with a mouse pad for cushioning.

We threw together a workstation for about $400 (the laptop and webcam were the main costs) and started collecting data.

We showed surgical residents and surgeons with over 7 years of experience a series of 30 radiograph images and asked them to rate the deformity in each from 0 to 3 in severity. Experts tended to lock onto areas for longer and relied more on peripheral vision for diagnosis. Novices searched the image by moving their focus around more and tended to rank the deformity as less severe. Our main goal was to demonstrate, as a proof of concept, that this kind of data collection can be done cost-effectively and that there's a lot to learn from it. We put together a video to further explain the setup, process, and findings if you'd like to learn more!
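To give a sense of how "locking onto areas for longer" gets quantified, here's a minimal sketch of a dispersion-threshold fixation detector (the classic I-DT approach) over raw gaze samples. This is purely illustrative: the function names, thresholds, and sampling interval are our assumptions, not the actual pipeline OGAMA uses internally.

```python
def _dispersion(points):
    # Dispersion = horizontal spread + vertical spread of the window (pixels).
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(samples, max_dispersion=30.0, min_duration_ms=100,
                     sample_interval_ms=33):
    """samples: list of (x, y) gaze points, evenly sampled in time.
    Returns a list of (start_index, end_index, duration_ms) fixations.
    Thresholds here are illustrative guesses, not tuned values."""
    min_samples = max(1, min_duration_ms // sample_interval_ms)
    fixations = []
    i = 0
    while i + min_samples <= len(samples):
        j = i + min_samples
        if _dispersion(samples[i:j]) <= max_dispersion:
            # Grow the window while the gaze stays within the dispersion limit.
            while j < len(samples) and _dispersion(samples[i:j + 1]) <= max_dispersion:
                j += 1
            fixations.append((i, j - 1, (j - i) * sample_interval_ms))
            i = j
        else:
            i += 1
    return fixations
```

Under this kind of analysis, "expert" traces would produce fewer, longer fixations, while "novice" traces scanning around the image would produce many short ones.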

It was a fun project despite a huge number of roadblocks and setbacks with the setup, calibration, and data manipulation. In the end, we came out of it with some really interesting research that demonstrates yet again how awesome it is to have a diverse community of experts and all the tools they need in one place. Support your local hackerspace/makerspace!