Visualizing the learning process of artificial neural networks
A custom Keras layer for Mixture Density outputs
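A mixture-density output layer emits the parameters of a Gaussian mixture (mixing weights, means, and standard deviations) rather than a point prediction. The following is a minimal sketch of that idea as a Keras layer subclass; the class and attribute names are illustrative assumptions, not the project's actual API.

```python
import numpy as np
import tensorflow as tf

class MixtureDensityOutput(tf.keras.layers.Layer):
    """Illustrative layer emitting parameters for a mixture of K
    univariate Gaussians: weights (pi), means (mu), std devs (sigma)."""

    def __init__(self, num_components, **kwargs):
        super().__init__(**kwargs)
        self.num_components = num_components
        self.pi_dense = tf.keras.layers.Dense(num_components)
        self.mu_dense = tf.keras.layers.Dense(num_components)
        self.sigma_dense = tf.keras.layers.Dense(num_components)

    def call(self, inputs):
        pi = tf.nn.softmax(self.pi_dense(inputs), axis=-1)  # weights sum to 1
        mu = self.mu_dense(inputs)                          # unconstrained means
        sigma = tf.nn.softplus(self.sigma_dense(inputs))    # positive std devs
        return tf.concat([pi, mu, sigma], axis=-1)
```

The softmax and softplus activations enforce the constraints the mixture parameters must satisfy; training would pair this with a negative log-likelihood loss over the mixture.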
Implementations of RBMs and their variants in TensorFlow
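RBMs are typically trained with contrastive divergence. As a point of reference, here is a minimal NumPy sketch of a single CD-1 update for a binary RBM; it is an assumption-laden illustration of the algorithm, not the repository's TensorFlow code.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, W, b_v, b_h, lr=0.01):
    """One CD-1 step for a binary RBM on a batch v0 (illustrative names)."""
    # Positive phase: hidden probabilities and a binary sample given the data.
    ph0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: reconstruct the visibles, then hidden probs again.
    pv1 = sigmoid(h0 @ W.T + b_v)
    ph1 = sigmoid(pv1 @ W + b_h)
    # Update from the difference of data and reconstruction correlations.
    W += lr * (v0.T @ ph0 - pv1.T @ ph1) / v0.shape[0]
    b_v += lr * (v0 - pv1).mean(axis=0)
    b_h += lr * (ph0 - ph1).mean(axis=0)
    return W, b_v, b_h
```

Variants such as conditional and factored conditional RBMs extend this update with context-dependent biases and factored weight matrices.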
WalkNet is a neural-network-based interactive movement controller that handles navigation and modulates the agent's affective state and movement signature.
AffectNet is a generative model of affect-expressive movements, based on the factored conditional RBM (FCRBM). The affective qualities are represented along the valence and arousal dimensions.
Mova is a web-based platform for visualization and analysis of human movement data, based on D3.js.
MocapJS is a motion capture library for Three.js. It is designed for applications such as web-based mocap players, streaming movement data, and virtual reality scenarios.
A deep learning model trained to generate novel, beat-synchronous dance movements for a given song.
This project investigates what a deep neural network trained on human movement data learns about movement.
A visual exploration of the spatial patterns in the endings of city names in Iran. Adaptation of the -ach, -ingen, -zell project for Iran.
A small C++ program to convert motion capture marker data in C3D files to CSV format.
AIMS is a framework for agent interaction simulations, developed in Java.