AlphaPose

Model description

AlphaPose is an accurate multi-person pose estimator, and the first open-source system to achieve 70+ mAP (75 mAP) on the COCO dataset and 80+ mAP (82.1 mAP) on the MPII dataset. To match poses that correspond to the same person across frames, it also provides an efficient online pose tracker called Pose Flow, the first open-source online pose tracker to achieve both 60+ mAP (66.5 mAP) and 50+ MOTA (58.3 MOTA) on the PoseTrack Challenge dataset.

Step 1: Installation

# install libGL
yum install mesa-libGL

# install zlib
wget http://www.zlib.net/fossils/zlib-1.2.9.tar.gz
tar xvf zlib-1.2.9.tar.gz
cd zlib-1.2.9/
./configure && make install
cd ..
rm -rf zlib-1.2.9.tar.gz zlib-1.2.9/

# install requirements
pip3 install seaborn pandas pycocotools matplotlib
pip3 install easydict tensorboardX opencv-python
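
A quick, optional sanity check that the dependencies installed above import cleanly (cv2 is the import name of the opencv-python package):

# verify the Python requirements can be imported
python3 -c "import seaborn, pandas, pycocotools, matplotlib, easydict, tensorboardX, cv2; print('ok')"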

Step 2: Preparing datasets

Visit the COCO official website and select the COCO dataset release you want to download.

Taking the coco2017 dataset as an example, replace /path/to/coco2017 with your actual COCO path in the training step below; the unzipped dataset directory structure should look like:

coco2017
├── annotations
│   ├── instances_train2017.json
│   ├── instances_val2017.json
│   └── ...
├── train2017
│   ├── 000000000009.jpg
│   ├── 000000000025.jpg
│   └── ...
├── val2017
│   ├── 000000000139.jpg
│   ├── 000000000285.jpg
│   └── ...
├── train2017.txt
├── val2017.txt
└── ...
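
If you prefer the command line, the 2017 images and annotations can be fetched with the standard COCO download links (note that the train2017.txt/val2017.txt lists shown above are not part of the official archives):

# download and unpack coco2017 from the official COCO mirrors
mkdir -p /path/to/coco2017 && cd /path/to/coco2017
wget http://images.cocodataset.org/zips/train2017.zip
wget http://images.cocodataset.org/zips/val2017.zip
wget http://images.cocodataset.org/annotations/annotations_trainval2017.zip
unzip train2017.zip && unzip val2017.zip && unzip annotations_trainval2017.zip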

Step 3: Training

# create soft link to coco
mkdir -p /home/datasets/cv/
ln -s /path/to/coco2017 /home/datasets/cv/coco
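
# sanity check: the link should resolve to the annotations directory
ls /home/datasets/cv/coco/annotations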

# train with the ResNet-50 256x192 configuration on COCO
bash ./scripts/trainval/train.sh ./configs/coco/resnet/256x192_res50_lr1e-3_1x.yaml 1
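
Since tensorboardX is among the installed requirements, the training run will typically write TensorBoard event files; the log directory below is a placeholder, so point --logdir at wherever your run writes its events.

# view training curves (log path is hypothetical; adjust to your run's output directory)
pip3 install tensorboard
tensorboard --logdir ./exp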

Results

GPUs        Training speed  ACC
BI-V100 x8  1.71 s/it       0.8429

Reference

AlphaPose: https://github.com/MVIG-SJTU/AlphaPose