A research toolkit for extracting quantitative features from human movement data, offering computational methods to analyze movement qualities like smoothness, bilateral symmetry, contraction/expansion patterns, and synchronization. Built for researchers in motor control, biomechanics, and movement disorders.

PyEyesWeb

Expressive movement analysis toolkit

A modern, modular, and accessible Python library for expressive movement analysis — bridging research, health, and the arts


PyEyesWeb is a research toolkit for extracting quantitative features from human movement data.
It builds on the Expressive Gesture Analysis library of EyesWeb, with the core aim of bringing expressive movement analysis into Python. The library provides computational methods to analyze different qualities of movement, supporting applications in research, health, and the arts.
It is designed to facilitate adoption in artificial intelligence and machine learning pipelines, while also enabling seamless integration with creative and interactive platforms such as TouchDesigner, Unity, and Max/MSP.
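Among the movement qualities the library targets is bilateral symmetry. For orientation only (this is not PyEyesWeb's implementation, and the function name is hypothetical), a classic formulation from gait analysis, the Robinson symmetry index, can be sketched as:

```python
import numpy as np

def symmetry_index(left, right):
    """Robinson symmetry index: 0 for perfectly symmetric sides,
    growing (in percent) as the left/right difference increases."""
    left_mean = float(np.mean(left))
    right_mean = float(np.mean(right))
    return 100.0 * abs(left_mean - right_mean) / (0.5 * (left_mean + right_mean))
```

For example, feeding it left and right step lengths (in meters) gives an asymmetry percentage that is easy to compare across sessions or participants.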

Installation

pip install pyeyesweb

Usage

A minimal example of extracting movement features with PyEyesWeb:

```python
from pyeyesweb.data_models import SlidingWindow
from pyeyesweb.mid_level import Smoothness

# Movement smoothness analysis
smoothness = Smoothness(rate_hz=50.0)
window = SlidingWindow(max_length=100, n_columns=1)

# `motion_data` is a float representing a single sample of motion data
# (e.g., the x coordinate of the left hand at time t).
window.append([motion_data])

sparc, jerk = smoothness(window)
```
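The `sparc` value above refers to spectral arc length, a standard smoothness measure from the motor control literature (Balasubramanian et al.). As background only, and independent of PyEyesWeb's internal implementation, the published metric can be sketched in plain NumPy roughly as follows; the function name and default parameters here are illustrative:

```python
import numpy as np

def spectral_arc_length(speed, fs, padlevel=4, fc=10.0, amp_th=0.05):
    """Sketch of the SPARC smoothness metric: the arc length of the
    normalized magnitude spectrum of a speed profile. Values are
    negative; closer to zero means a smoother movement."""
    # Zero-pad the FFT for finer frequency resolution
    nfft = int(2 ** (np.ceil(np.log2(len(speed))) + padlevel))
    freqs = np.arange(nfft) * (fs / nfft)
    mag = np.abs(np.fft.fft(speed, nfft))

    # Keep frequencies below the cutoff and normalize to the spectral peak
    keep = freqs <= fc
    f_sel, m_sel = freqs[keep], mag[keep]
    m_sel = m_sel / m_sel.max()

    # Trim to the band where the spectrum exceeds the amplitude threshold
    above = np.nonzero(m_sel >= amp_th)[0]
    f_sel = f_sel[above[0]:above[-1] + 1] / fc  # frequency axis scaled to [0, 1]
    m_sel = m_sel[above[0]:above[-1] + 1]

    # Arc length of the (frequency, magnitude) curve, negated
    return -np.sum(np.sqrt(np.diff(f_sel) ** 2 + np.diff(m_sel) ** 2))
```

A smooth, bell-shaped speed profile keeps its spectral energy at low frequencies and yields a SPARC close to zero, while superimposed oscillations add arc length and push the value more negative.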

Tip

For more advanced and complete use cases, see the Documentation and the examples folder.
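Synchronization between body parts, or between two movers, is another quality in this family of metrics. As a library-independent sketch (the function and its conventions below are illustrative, not part of PyEyesWeb's API), the lag at which two movement signals are maximally correlated can be estimated with normalized cross-correlation:

```python
import numpy as np

def peak_sync_lag(a, b, fs):
    """Estimate the time lag (in seconds) at which two movement
    signals are maximally correlated, plus the peak correlation."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    # Normalize so the peak of the cross-correlation is a correlation coefficient
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    xcorr = np.correlate(a, b, mode="full")  # lags run from -(N-1) to N-1
    lag_samples = int(np.argmax(xcorr)) - (len(b) - 1)
    return lag_samples / fs, float(xcorr.max())
```

For two well-correlated signals the peak value approaches 1, and the sign of the lag indicates which signal leads.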

Documentation

Comprehensive documentation for PyEyesWeb is available online, including tutorials, API references, and the theoretical and scientific background of the implemented metrics.

Support

If you encounter issues or have questions about PyEyesWeb, you can get help by opening an issue on GitHub.

Please provide clear descriptions, minimal reproducible examples, and version information when submitting issues; this helps us respond faster.

Roadmap

PyEyesWeb is under active development, and several features are planned for upcoming releases:

  • Expanded feature extraction: more movement expressivity metrics (see the related conceptual layer guide for examples of planned features).
  • Improved examples and tutorials: more interactive Jupyter notebooks and example datasets to facilitate learning and adoption.
  • Cross-platform compatibility: streamlined integration with creative and interactive platforms (e.g., TouchDesigner plugin, Unity, Max/MSP).

Future development priorities may evolve based on user feedback and research needs. Users are encouraged to suggest features or improvements via GitHub Issues.

Contributing

Contributions to PyEyesWeb are welcome! Whether it's reporting bugs, adding features, improving documentation, or providing examples, your help is appreciated.

How to Contribute

  1. Fork and clone the repository, then create a branch for your feature or bug fix:
    git checkout -b feature/your-feature-name
  2. Set up the development environment from your local clone:
    pip install -e ".[dev]"
  3. Make your changes, ensuring code quality and adherence to the project's coding standards.
  4. Open a pull request against the main branch, with a clear description of your changes.
  5. Engage in code review and address any feedback provided by maintainers.

Authors & Acknowledgments

PyEyesWeb is developed by InfoMus Lab – Casa Paganini, University of Genoa, with the partial support of the EU ICT STARTS Resilence Project.



License

MIT License
