ET-SAM

Zero-shot evaluation of the Segment Anything Model on eye tracking images in a VR setup.

This repository contains the code for evaluating the Segment Anything Model (SAM) on eye tracking images, accompanying the long paper: Maquiling, V., Byrne, S.A., Nyström, M., Kasneci, E., & Niehorster, D.C. (accepted). Zero-Shot Segmentation of Eye Features Using the Segment Anything Model.
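
As context for how such a zero-shot evaluation can be run, the sketch below uses the segment-anything package's SamPredictor with a single point prompt on an eye image. The checkpoint path, image file, and prompt coordinates are placeholder values for illustration only; they are not the exact configuration used in the paper or in this repository's scripts.

```python
import cv2
import numpy as np
from segment_anything import sam_model_registry, SamPredictor

# Load a pretrained SAM checkpoint (model type and path are placeholders).
sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
predictor = SamPredictor(sam)

# Read an eye tracking image; the predictor expects an RGB uint8 array.
image = cv2.cvtColor(cv2.imread("eye_image.png"), cv2.COLOR_BGR2RGB)
predictor.set_image(image)

# Prompt SAM with a single foreground point (label 1) placed on the feature
# of interest, e.g. a hypothetical pupil location, without any fine-tuning.
point_coords = np.array([[320, 240]])
point_labels = np.array([1])
masks, scores, _ = predictor.predict(
    point_coords=point_coords,
    point_labels=point_labels,
    multimask_output=True,
)

# Keep the highest-scoring candidate mask as the predicted segmentation.
best_mask = masks[np.argmax(scores)]
```

The predicted mask can then be compared against a ground-truth annotation (e.g. via IoU) to quantify zero-shot performance.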

When using the code or model in this repository in your work, please cite Maquiling & Byrne et al.

For more information or questions, e-mail virmarie.maquiling@tum.de or sean.byrne@imtlucca.it. The latest version of this repository is available at https://github.com/vbmaq/ET-SAM.
