AIST++ API

This repo contains starter code for using the AIST++ dataset. To download the dataset or explore details of this dataset, please go to our dataset website.

Installation

The code has been tested with python>=3.7. You can install the dependencies and this package by running:

pip install -r requirements.txt
python setup.py install

If you would like to visualize the annotations using this API, you also need to make sure ffmpeg is installed on your machine.

How to use

We provide demo code for loading and visualizing AIST++ annotations. Note that the AIST++ annotations and videos, as well as the SMPL model (for SMPL visualization only), are required to run the demo code.

The directory structure of the data is expected to be:


<ANNOTATIONS_DIR>
├── motions/
├── keypoints2d/
├── keypoints3d/
├── splits/
├── cameras/
└── ignore_list.txt

<VIDEO_DIR>
└── *.mp4

<SMPL_DIR>
├── SMPL_MALE.pkl
└── SMPL_FEMALE.pkl
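
If you want to sanity-check the annotation files before running the demos, loading them directly is straightforward. Below is a minimal sketch for inspecting one SMPL-format motion file; the sequence name shown is an assumed example, and the printed keys are simply whatever your copy of the data contains.

import os
import pickle

import numpy as np

# Assumed paths and file naming; adjust to your local copy of the data.
anno_dir = "<ANNOTATIONS_DIR>"
seq_name = "gWA_sFM_cAll_d27_mWA2_ch21"  # assumed naming of one motion file

with open(os.path.join(anno_dir, "motions", seq_name + ".pkl"), "rb") as f:
    motion = pickle.load(f)

# Print the stored arrays and their shapes (e.g. per-frame SMPL poses and the
# root trajectory); verify the exact keys on your own data.
for key, value in motion.items():
    print(key, np.asarray(value).shape)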

Visualize 2D keypoints annotation

The command below will plot the 2D keypoints onto the raw video and save the result to the directory ./visualization/.

python demos/run_vis.py \
  --anno_dir <ANNOTATIONS_DIR> \
  --video_dir <VIDEO_DIR> \
  --save_dir ./visualization/ \
  --video_name gWA_sFM_c01_d27_mWA2_ch21 \
  --mode 2D
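
Under the hood, the 2D mode boils down to drawing each frame's COCO keypoints onto the matching video frame. The sketch below illustrates that idea with OpenCV; the file names and the (num_frames, 17, 3) layout with an (x, y, score) triple per joint are assumptions for illustration, not the exact format used by run_vis.py.

import cv2
import numpy as np

def draw_keypoints2d(frame, keypoints, score_thresh=0.3):
    # Draw one frame's keypoints, assumed to be rows of (x, y, score).
    for x, y, score in keypoints:
        if score >= score_thresh:
            cv2.circle(frame, (int(x), int(y)), 4, (0, 255, 0), thickness=-1)
    return frame

cap = cv2.VideoCapture("gWA_sFM_c01_d27_mWA2_ch21.mp4")  # raw video
keypoints2d = np.load("keypoints2d.npy")                 # assumed (num_frames, 17, 3)

writer, frame_idx = None, 0
while cap.isOpened():
    ok, frame = cap.read()
    if not ok or frame_idx >= len(keypoints2d):
        break
    frame = draw_keypoints2d(frame, keypoints2d[frame_idx])
    if writer is None:
        h, w = frame.shape[:2]
        writer = cv2.VideoWriter("out.mp4", cv2.VideoWriter_fourcc(*"mp4v"), 30, (w, h))
    writer.write(frame)
    frame_idx += 1
cap.release()
if writer is not None:
    writer.release()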

Visualize 3D keypoints annotation

The command below will project the 3D keypoints onto the raw video using the camera parameters, and save the result to the directory ./visualization/.

python demos/run_vis.py \
  --anno_dir <ANNOTATIONS_DIR> \
  --video_dir <VIDEO_DIR> \
  --save_dir ./visualization/ \
  --video_name gWA_sFM_c01_d27_mWA2_ch21 \
  --mode 3D
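
The 3D mode relies on standard pinhole projection: each 3D joint is transformed into the camera frame and projected with the intrinsics. The sketch below shows that math in isolation; the actual camera parameters live in <ANNOTATIONS_DIR>/cameras/, and the placeholder numbers here are made up for illustration.

import cv2
import numpy as np

# Placeholder camera parameters for a single view.
rvec = np.zeros(3)                     # rotation as a Rodrigues vector
tvec = np.zeros(3)                     # translation
K = np.array([[1500.0, 0.0, 960.0],
              [0.0, 1500.0, 540.0],
              [0.0, 0.0, 1.0]])        # intrinsic matrix
dist = np.zeros(5)                     # distortion coefficients

keypoints3d = np.random.rand(17, 3)    # stand-in for one frame of 3D keypoints

# Project the 3D joints into pixel coordinates for this camera.
points2d, _ = cv2.projectPoints(keypoints3d, rvec, tvec, K, dist)
print(points2d.reshape(-1, 2))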

Visualize the SMPL joints annotation

The command below will first calculate the SMPL joint locations from our motion annotations (joint rotations and root trajectories), then project them onto the raw video and plot them. The results will be saved to the directory ./visualization/.

python demos/run_vis.py \
  --anno_dir <ANNOTATIONS_DIR> \
  --video_dir <VIDEO_DIR> \
  --smpl_dir <SMPL_DIR> \
  --save_dir ./visualization/ \
  --video_name gWA_sFM_c01_d27_mWA2_ch21 \
  --mode SMPL
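
Computing the joint locations from the motion annotations is essentially a forward pass of the SMPL model with the stored joint rotations, root trajectory and scaling. The sketch below uses the third-party smplx package for that purpose; the key names and the scaling convention are assumptions and may differ from what run_vis.py does internally.

import pickle

import numpy as np
import smplx
import torch

with open("<ANNOTATIONS_DIR>/motions/gWA_sFM_cAll_d27_mWA2_ch21.pkl", "rb") as f:
    motion = pickle.load(f)

poses = torch.from_numpy(motion["smpl_poses"]).float()    # assumed (N, 72) axis-angle
transl = torch.from_numpy(motion["smpl_trans"]).float()   # assumed (N, 3) root trajectory
scaling = float(np.asarray(motion["smpl_scaling"]).reshape(-1)[0])  # assumed global scale

smpl = smplx.SMPL(model_path="<SMPL_DIR>", gender="MALE", batch_size=poses.shape[0])
output = smpl(
    global_orient=poses[:, :3],
    body_pose=poses[:, 3:],
    transl=transl / scaling,   # assumption: trajectory stored in scaled units
)
joints = output.joints.detach().numpy()  # (N, num_joints, 3) joint locations
print(joints.shape)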

Multi-view 3D keypoints and motion reconstruction

This repo also provides code we used for constructing this dataset from the multi-view AIST Dance Video Database. The construction pipeline starts with frame-by-frame 2D keypoint detection and manual camera estimation. Then triangulation and bundle adjustment are applied to optimize the camera parameters as well as the 3D keypoints. Finally we sequentially fit the SMPL model to 3D keypoints to get a motion sequence represented using joint angles and a root trajectory. The following figure shows our pipeline overview.

[Figure: AIST++ construction pipeline overview.]

The annotations in AIST++ are in COCO format for 2D & 3D keypoints, and in SMPL format for the human motion annotations. They are designed to serve general research purposes. However, in some cases you might need the data in a different format (e.g., OpenPose / AlphaPose keypoint formats, or the STAR human motion format). With the code we provide, it should be easy to construct your own version of AIST++ with your own keypoint detector or human model definition.

Step 1. Assuming you have your own 2D keypoint detection results stored in <KEYPOINTS_DIR>, you can start by preprocessing the keypoints into the .pkl format that we support. The code we used at this step is as follows, but you might need to modify the script run_preprocessing.py to make it compatible with your own data.

python processing/run_preprocessing.py \
  --keypoints_dir <KEYPOINTS_DIR> \
  --save_dir <ANNOTATIONS_DIR>/keypoints2d/
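
As a rough illustration of what the preprocessing amounts to, the sketch below groups per-view detections into one pickle per sequence. The input layout (one .npy per video), the camera-name parsing, and the output dictionary key are all assumptions about a hypothetical detector's output; the format actually expected downstream is defined by run_preprocessing.py.

import glob
import os
import pickle
import re

import numpy as np

keypoints_dir = "<KEYPOINTS_DIR>"
save_dir = "<ANNOTATIONS_DIR>/keypoints2d/"
os.makedirs(save_dir, exist_ok=True)

# Group detections of the same sequence across the camera views c01..c09.
sequences = {}
for path in glob.glob(os.path.join(keypoints_dir, "*.npy")):
    video_name = os.path.splitext(os.path.basename(path))[0]
    match = re.search(r"_c(\d\d)_", video_name)
    view = int(match.group(1))
    seq_name = video_name.replace(match.group(0), "_cAll_")
    sequences.setdefault(seq_name, {})[view] = np.load(path)  # assumed (num_frames, 17, 3)

for seq_name, views in sequences.items():
    stacked = np.stack([views[v] for v in sorted(views)], axis=0)  # (num_views, num_frames, 17, 3)
    with open(os.path.join(save_dir, seq_name + ".pkl"), "wb") as f:
        pickle.dump({"keypoints2d": stacked}, f)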

Step 2. Then you can estimate the camera parameters using your 2D keypoints. This step is optional, as you can still use our camera parameter estimates, which are quite accurate. At this step, you will need the <ANNOTATIONS_DIR>/cameras/mapping.txt file, which stores the mapping from videos to the different environment settings.

# If you would like to estimate your own camera parameters:
python processing/run_estimate_camera.py \
  --anno_dir <ANNOTATIONS_DIR> \
  --save_dir <ANNOTATIONS_DIR>/cameras/
# Or you can skip this step by just using our camera parameter estimates.

Step 3. The next step is to perform 3D keypoint reconstruction from the multi-view 2D keypoints and camera parameters. You can just run:

python processing/run_estimate_keypoints.py \
  --anno_dir <ANNOTATIONS_DIR> \
  --save_dir <ANNOTATIONS_DIR>/keypoints3d/
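
Conceptually, this step recovers each joint by triangulating its 2D observations across the camera views. For reference, the snippet below shows the standard direct linear transform (DLT) for a single point; it is not the repo's implementation, and it assumes you have already assembled 3x4 projection matrices from the Step 2 camera estimates.

import numpy as np

def triangulate_point(proj_mats, points2d):
    # Triangulate one 3D point from its 2D observations via the DLT.
    rows = []
    for P, (x, y) in zip(proj_mats, points2d):
        rows.append(x * P[2] - P[0])
        rows.append(y * P[2] - P[1])
    # The solution is the right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(np.stack(rows))
    point_h = vt[-1]
    return point_h[:3] / point_h[3]

# Toy example: two synthetic cameras observing the point (0, 0, 5).
K = np.array([[1000.0, 0.0, 500.0], [0.0, 1000.0, 500.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
pt = np.array([0.0, 0.0, 5.0, 1.0])
obs = [(P @ pt)[:2] / (P @ pt)[2] for P in (P1, P2)]
print(triangulate_point([P1, P2], obs))  # ~ [0, 0, 5]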

Step 4. Finally, we can estimate SMPL-format human motion data by fitting the SMPL model to the 3D keypoints. If you would like to use another human model such as STAR, you will need to make some modifications to the script run_estimate_smpl.py. The following command runs the SMPL fitting.

python processing/run_estimate_smpl.py \
  --anno_dir <ANNOTATIONS_DIR> \
  --smpl_dir <SMPL_DIR> \
  --save_dir <ANNOTATIONS_DIR>/motions/
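
At its core, such a fitting procedure is a per-sequence optimization that minimizes the distance between the SMPL joints and the reconstructed 3D keypoints. Below is a heavily simplified PyTorch sketch of that idea using the smplx package; the joint correspondences, the single optimization stage, and the file names are assumptions, and the real run_estimate_smpl.py is considerably more involved.

import numpy as np
import smplx
import torch

# Stand-in for one sequence of reconstructed 3D keypoints in COCO order.
keypoints3d = torch.from_numpy(np.load("keypoints3d.npy")).float()  # assumed (N, 17, 3)
num_frames = keypoints3d.shape[0]

smpl = smplx.SMPL(model_path="<SMPL_DIR>", gender="MALE", batch_size=num_frames)

# Optimize global orientation, body pose and root translation per frame.
global_orient = torch.zeros(num_frames, 3, requires_grad=True)
body_pose = torch.zeros(num_frames, 69, requires_grad=True)
transl = keypoints3d.mean(dim=1).clone().detach().requires_grad_(True)

# Hypothetical mapping of a few SMPL joints to COCO keypoints (wrists and ankles);
# the actual script uses a full joint regressor.
smpl_to_coco = {20: 9, 21: 10, 7: 15, 8: 16}

optimizer = torch.optim.Adam([global_orient, body_pose, transl], lr=0.01)
for step in range(200):
    optimizer.zero_grad()
    output = smpl(global_orient=global_orient, body_pose=body_pose, transl=transl)
    loss = sum(
        ((output.joints[:, s] - keypoints3d[:, c]) ** 2).mean()
        for s, c in smpl_to_coco.items()
    )
    loss.backward()
    optimizer.step()
print("final fitting loss:", float(loss))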

Note that this step will take several days to process the entire dataset if your machine has only one GPU. In practice, we ran this step on a cluster, but we are only able to provide the single-threaded version here.

MISC.

  • COCO-format keypoint definition:
[
"nose", 
"left_eye", "right_eye", "left_ear", "right_ear", "left_shoulder","right_shoulder", 
"left_elbow", "right_elbow", "left_wrist", "right_wrist", "left_hip", "right_hip", 
"left_knee", "right_knee", "left_ankle", "right_ankle"
]
  • SMPL-format body joint definition:
[
"root", 
"left_hip", "left_knee", "left_foot", "left_toe", 
"right_hip", "right_knee", "right_foot", "right_toe",
"waist", "spine", "chest", "neck", "head", 
"left_in_shoulder", "left_shoulder", "left_elbow", "left_wrist",
"right_in_shoulder", "right_shoulder", "right_elbow", "right_wrist"
]
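
If you need to index keypoint or pose arrays by joint name, a small lookup built from the definitions above does the job:

COCO_JOINT_NAMES = [
    "nose",
    "left_eye", "right_eye", "left_ear", "right_ear",
    "left_shoulder", "right_shoulder",
    "left_elbow", "right_elbow", "left_wrist", "right_wrist",
    "left_hip", "right_hip", "left_knee", "right_knee",
    "left_ankle", "right_ankle",
]
SMPL_JOINT_NAMES = [
    "root",
    "left_hip", "left_knee", "left_foot", "left_toe",
    "right_hip", "right_knee", "right_foot", "right_toe",
    "waist", "spine", "chest", "neck", "head",
    "left_in_shoulder", "left_shoulder", "left_elbow", "left_wrist",
    "right_in_shoulder", "right_shoulder", "right_elbow", "right_wrist",
]

coco_index = {name: i for i, name in enumerate(COCO_JOINT_NAMES)}
smpl_index = {name: i for i, name in enumerate(SMPL_JOINT_NAMES)}
print(coco_index["left_wrist"], smpl_index["left_wrist"])  # 9 and 17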