A modern, modular, and accessible Python library for expressive movement analysis — bridging research, health, and the arts
PyEyesWeb is a research toolkit for extracting quantitative features from human movement data. It builds on the Expressive Gesture Analysis library of EyesWeb, bringing expressive movement analysis into Python as a core aim of the project.
The library provides computational methods to analyze different qualities of movement, supporting applications in research, health, and the arts.
It is designed to facilitate adoption in artificial intelligence and machine learning pipelines, while also enabling seamless integration with creative and interactive platforms such as TouchDesigner, Unity, and Max/MSP.
pip install pyeyesweb
A minimal example of extracting movement features with PyEyesWeb:
from pyeyesweb.data_models import SlidingWindow
from pyeyesweb.mid_level import Smoothness

# Movement smoothness analysis at a 50 Hz sampling rate
smoothness = Smoothness(rate_hz=50.0)
window = SlidingWindow(max_length=100, n_columns=1)

# `motion_data` is a float holding a single sample of motion data
# (e.g., the x coordinate of the left hand at time t).
window.append([motion_data])

sparc, jerk = smoothness(window)
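The `sparc` value returned above is a spectral arc length smoothness measure, and `jerk` a jerk-based one. To give an intuition for what a spectral arc length computation involves, here is a self-contained NumPy sketch; this is an illustration of the general technique, not PyEyesWeb's actual implementation, and the function name, parameters, and defaults are assumptions:

```python
import numpy as np

def spectral_arc_length(speed, fs, pad_level=4, max_freq=10.0, amp_threshold=0.05):
    """Arc length of the normalized magnitude spectrum of a speed profile.

    Values are negative; smoother movements score closer to zero.
    """
    # Zero-pad the FFT for a finer frequency grid.
    n_fft = int(2 ** (np.ceil(np.log2(len(speed))) + pad_level))
    freqs = np.arange(n_fft) * fs / n_fft
    mag = np.abs(np.fft.fft(speed, n_fft))
    mag /= mag.max()  # normalize the spectrum to [0, 1]

    # Keep only the low-frequency band where voluntary movement lives.
    band = freqs <= max_freq
    f_sel, m_sel = freqs[band], mag[band]

    # Trim leading/trailing components below the amplitude threshold.
    above = np.nonzero(m_sel >= amp_threshold)[0]
    f_sel = f_sel[above[0]:above[-1] + 1]
    m_sel = m_sel[above[0]:above[-1] + 1]

    # Negated arc length of the (frequency-normalized) spectrum curve.
    df = np.diff(f_sel) / (f_sel[-1] - f_sel[0])
    dm = np.diff(m_sel)
    return -np.sum(np.sqrt(df ** 2 + dm ** 2))

fs = 50.0
t = np.arange(0, 2, 1 / fs)
smooth = np.exp(-0.5 * ((t - 1.0) / 0.25) ** 2)     # bell-shaped speed profile
jerky = smooth + 0.1 * np.sin(2 * np.pi * 8.0 * t)  # same profile with 8 Hz tremor
```

Running this, the tremor adds extra peaks to the spectrum, so the jerky profile's arc length is larger in magnitude and its score more negative than the smooth profile's.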
Tip
For more advanced and complete use cases, see the Documentation and the examples folder.
Comprehensive documentation for PyEyesWeb is available online and includes tutorials, API references, and the theoretical and scientific background of the implemented metrics:
- Getting Started: step-by-step guide to installation and basic usage.
- API Reference: technical descriptions of modules, classes, and functions.
- Theoretical Foundation: background on the scientific principles and research behind the metrics.
If you encounter issues or have questions about PyEyesWeb, you can get help through the following channels:
- GitHub Issues: report bugs, request features, or ask technical questions on the PyEyesWeb GitHub Issues page.
- Discussions / Q&A: participate in conversations or seek advice in GitHub Discussions.
- Email: reach out to the maintainers at [email protected] for direct support or collaboration inquiries.
Please provide a clear description, a minimal reproducible example, and version information when submitting issues; this helps us respond faster.
PyEyesWeb is under active development, and several features are planned for upcoming releases:
- Expanded feature extraction: additional movement expressivity metrics (examples of the features to expect are given in the related conceptual layer guide).
- Improved examples and tutorials: more interactive Jupyter notebooks and example datasets to facilitate learning and adoption.
- Cross-platform compatibility: streamlined integration with creative and interactive platforms (e.g., TouchDesigner plugin, Unity, Max/MSP).
Future development priorities may evolve based on user feedback and research needs. Users are encouraged to suggest features or improvements via GitHub Issues.
Contributions to PyEyesWeb are welcome! Whether it's reporting bugs, adding features, improving documentation, or providing examples, your help is appreciated.
- Fork the repository and create a branch for your feature or bug fix:
git checkout -b feature/your-feature-name
- Set up the development environment by installing your local clone in editable mode with the development extras:
pip install -e ".[dev]"
- Make your changes, ensuring code quality and adherence to the project's coding standards.
- Submit a pull request to the main branch with a clear description of your changes.
- Engage in code reviews and address any feedback provided by maintainers.
PyEyesWeb is developed by InfoMus Lab – Casa Paganini, University of Genoa, with the partial support of the EU ICT STARTS Resilence Project.
MIT License