Motion Capture Datasets
The MoDa 2.0 web-based frontend is available at http://movehub.omid.al/. MoDa also provides access to the training datasets collected as part of my PhD research:
Mova: Movement Analytics Platform
A prototype of Mova is available here. With this prototype, you can choose a sample movement, add features, and visualize the data. You can also watch a tutorial video here.
PyMO: A library for using motion capture data for machine learning
The source code for PyMO is available at PyMO's GitHub page.
AffectNet
- Training Data: Affect-Expressive Movements Dataset
- Output Videos
- Source Code
WalkNet
GrooveNet - Preliminary Model
Supplementary materials for our submission to the Workshop on Machine Learning for Creativity at the 23rd ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD 2017), Halifax, Nova Scotia, Canada.
- Training Data: GrooveDB
- Sample Outputs: https://omid.al/groovenet-material-ml4c
- Source Code
GrooveNet 2.0
- Training Data: GrooveDB
- Sample Outputs
- Demo of GrooveNet Real-Time Engine
- List of Machine Learning Experiments
- Source Code