---
license: other
license_name: sla0044
license_link: >-
  https://github.com/STMicroelectronics/stm32aimodelzoo/pose_estimation/LICENSE.md
pipeline_tag: keypoint-detection
---
# MoveNet quantized

## **Use case** : `Pose estimation`

# Model description

MoveNet is a single-pose estimation model targeted at real-time processing, implemented in TensorFlow.

The model is quantized to int8 using the TensorFlow Lite converter.
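As a quick illustration of what int8 quantization means here, the sketch below (plain NumPy, not the converter itself) applies the affine mapping q = round(x / scale) + zero_point that TensorFlow Lite uses for int8 tensors. The scale and zero-point values are hypothetical, chosen for activations observed in [-1, 1]; in practice the converter calibrates them from a representative dataset.

```python
import numpy as np

def quantize_int8(x, scale, zero_point):
    """Affine int8 quantization: q = round(x / scale) + zero_point."""
    q = np.round(x / scale) + zero_point
    return np.clip(q, -128, 127).astype(np.int8)

def dequantize_int8(q, scale, zero_point):
    """Recover approximate float values from the int8 codes."""
    return (q.astype(np.float32) - zero_point) * scale

# Hypothetical calibration result for a float range of [-1, 1]:
scale, zero_point = 1.0 / 127, 0
x = np.array([-1.0, 0.0, 0.25, 1.0], dtype=np.float32)
q = quantize_int8(x, scale, zero_point)       # int8 codes stored in the model
x_hat = dequantize_int8(q, scale, zero_point) # values the int8 kernels represent
```

The round-trip error is bounded by half a quantization step (scale / 2), which is the accuracy cost traded for 4x smaller weights and integer-only kernels.
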
## Network information

| Network information | Value |
|---------------------|-------|
| Framework | TensorFlow Lite |
| Quantization | int8 |
| Provenance | https://www.kaggle.com/models/google/movenet |
| Paper | https://storage.googleapis.com/movenet/MoveNet.SinglePose%20Model%20Card.pdf |
## Networks inputs / outputs

For an image resolution of NxM and K keypoints to detect:

- For heatmaps models

| Input Shape | Description |
| ----- | ----------- |
| (1, N, M, 3) | Single NxM RGB image with UINT8 values between 0 and 255 |

| Output Shape | Description |
| ----- | ----------- |
| (1, W, H, K) | FLOAT values, where WxH is the resolution of the output heatmaps and K is the number of keypoints |

- For the other models

| Input Shape | Description |
| ----- | ----------- |
| (1, N, M, 3) | Single NxM RGB image with UINT8 values between 0 and 255 |

| Output Shape | Description |
| ----- | ----------- |
| (1, Kx3) | FLOAT values, where the Kx3 values are the (x, y, conf) triplets for each of the K keypoints |
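For the heatmaps variants, keypoints can be recovered by taking the per-channel argmax of the output tensor. The sketch below is a minimal NumPy post-processing pass (not the model zoo's own decoder); it assumes the (1, W, H, K) layout above and returns normalized (x, y, conf) triplets, matching the non-heatmap output convention.

```python
import numpy as np

def decode_heatmaps(heatmaps):
    """Turn a (1, W, H, K) heatmap tensor into K (x, y, conf) triplets.

    x and y are normalized to [0, 1] so they can be scaled back to the
    original NxM input resolution.
    """
    hm = heatmaps[0]                       # (W, H, K)
    W, H, K = hm.shape
    flat = hm.reshape(-1, K)               # (W*H, K): one column per keypoint
    best = flat.argmax(axis=0)             # flat index of the peak per keypoint
    conf = flat[best, np.arange(K)]        # peak value serves as confidence
    rows, cols = np.unravel_index(best, (W, H))
    x = cols / (H - 1)
    y = rows / (W - 1)
    return np.stack([x, y, conf], axis=1)  # (K, 3)
```

A production decoder may refine the argmax with a weighted average over neighbouring cells for sub-cell precision, but the peak lookup above is the core of the operation.
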
## Recommended Platforms

| Platform | Supported | Recommended |
|----------|-----------|-------------|
| STM32L0  | []        | []          |
| STM32L4  | []        | []          |
| STM32U5  | []        | []          |
| STM32H7  | []        | []          |
| STM32MP1 | [x]       | []          |
| STM32MP2 | [x]       | [x]         |
| STM32N6  | [x]       | [x]         |
# Performances

## Metrics

Measurements are made with the default STM32Cube.AI configuration, with the input / output allocated option enabled.
### Reference **NPU** memory footprint based on COCO Person dataset (see Accuracy for details on dataset)

| Model | Dataset | Format | Resolution | Series | Internal RAM (KiB) | External RAM (KiB) | Weights Flash (KiB) | STM32Cube.AI version | STEdgeAI Core version |
|-------|---------|--------|------------|--------|--------------------|--------------------|---------------------|----------------------|-----------------------|
| [ST MoveNet Lightning heatmaps](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/ST_pretrainedmodel_public_dataset/custom_dataset_person_13kpts/st_movenet_lightning_heatmaps_192/st_movenet_lightning_heatmaps_192_int8_pc.tflite) | COCO-Person | Int8 | 192x192x3 | STM32N6 | 1674 | 0.0 | 3036.17 | 10.0.0 | 2.0.0 |
| [MoveNet Lightning heatmaps](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/Public_pretrainedmodel_custom_dataset/custom_dataset_person_17kpts/movenet_lightning_heatmaps_192/movenet_lightning_heatmaps_192_int8_pc.tflite) | COCO-Person | Int8 | 192x192x3 | STM32N6 | 1674 | 0.0 | 3036.41 | 10.0.0 | 2.0.0 |
| [MoveNet Lightning heatmaps](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/Public_pretrainedmodel_custom_dataset/custom_dataset_person_17kpts/movenet_lightning_heatmaps_224/movenet_lightning_heatmaps_224_int8_pc.tflite) | COCO-Person | Int8 | 224x224x3 | STM32N6 | 2058 | 0.0 | 3088.56 | 10.0.0 | 2.0.0 |
| [MoveNet Lightning heatmaps](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/Public_pretrainedmodel_custom_dataset/custom_dataset_person_17kpts/movenet_lightning_heatmaps_256/movenet_lightning_heatmaps_256_int8_pc.tflite) | COCO-Person | Int8 | 256x256x3 | STM32N6 | 2360 | 0.0 | 3141.36 | 10.0.0 | 2.0.0 |
### Reference **NPU** inference time based on COCO Person dataset (see Accuracy for details on dataset)

| Model | Dataset | Format | Resolution | Board | Execution Engine | Inference time (ms) | Inf / sec | STM32Cube.AI version | STEdgeAI Core version |
|-------|---------|--------|------------|-------|------------------|---------------------|-----------|----------------------|-----------------------|
| [ST MoveNet Lightning heatmaps](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/ST_pretrainedmodel_public_dataset/custom_dataset_person_13kpts/st_movenet_lightning_heatmaps_192/st_movenet_lightning_heatmaps_192_int8_pc.tflite) | COCO-Person | Int8 | 192x192x3 | STM32N6570-DK | NPU/MCU | 18.44 | 54.23 | 10.0.0 | 2.0.0 |
| [MoveNet Lightning heatmaps](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/Public_pretrainedmodel_custom_dataset/custom_dataset_person_17kpts/movenet_lightning_heatmaps_192/movenet_lightning_heatmaps_192_int8_pc.tflite) | COCO-Person | Int8 | 192x192x3 | STM32N6570-DK | NPU/MCU | 18.49 | 54.08 | 10.0.0 | 2.0.0 |
| [MoveNet Lightning heatmaps](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/Public_pretrainedmodel_custom_dataset/custom_dataset_person_17kpts/movenet_lightning_heatmaps_224/movenet_lightning_heatmaps_224_int8_pc.tflite) | COCO-Person | Int8 | 224x224x3 | STM32N6570-DK | NPU/MCU | 22.33 | 44.78 | 10.0.0 | 2.0.0 |
| [MoveNet Lightning heatmaps](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/Public_pretrainedmodel_custom_dataset/custom_dataset_person_17kpts/movenet_lightning_heatmaps_256/movenet_lightning_heatmaps_256_int8_pc.tflite) | COCO-Person | Int8 | 256x256x3 | STM32N6570-DK | NPU/MCU | 27.01 | 37.03 | 10.0.0 | 2.0.0 |
### Reference **MPU** inference time based on COCO Person dataset (see Accuracy for details on dataset)

| Model | Format | Resolution | Quantization | Board | Execution Engine | Frequency | Inference time (ms) | %NPU | %GPU | %CPU | X-LINUX-AI version | Framework |
|-------|--------|------------|--------------|-------|------------------|-----------|---------------------|------|------|------|--------------------|-----------|
| [ST MoveNet Lightning heatmaps](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/ST_pretrainedmodel_public_dataset/custom_dataset_person_13kpts/st_movenet_lightning_heatmaps_192/st_movenet_lightning_heatmaps_192_int8_pc.tflite) | Int8 | 192x192x3 | per-channel** | STM32MP257F-DK2 | NPU/GPU | 800 MHz | 58.02 ms | 3.75 | 96.25 | 0 | v5.0.0 | OpenVX |
| [ST MoveNet Lightning heatmaps](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/ST_pretrainedmodel_public_dataset/custom_dataset_person_13kpts/st_movenet_lightning_heatmaps_192/st_movenet_lightning_heatmaps_192_int8_pt.tflite) | Int8 | 192x192x3 | per-tensor | STM32MP257F-DK2 | NPU/GPU | 800 MHz | 7.93 ms | 84.89 | 15.11 | 0 | v5.0.0 | OpenVX |
| [MoveNet Lightning heatmaps](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/Public_pretrainedmodel_custom_dataset/custom_dataset_person_17kpts/movenet_lightning_heatmaps_192/movenet_lightning_heatmaps_192_int8_pc.tflite) | Int8 | 192x192x3 | per-channel** | STM32MP257F-DK2 | NPU/GPU | 800 MHz | 58.17 ms | 3.80 | 96.20 | 0 | v5.0.0 | OpenVX |
| [MoveNet Lightning heatmaps](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/Public_pretrainedmodel_custom_dataset/custom_dataset_person_17kpts/movenet_lightning_heatmaps_192/movenet_lightning_heatmaps_192_int8_pt.tflite) | Int8 | 192x192x3 | per-tensor | STM32MP257F-DK2 | NPU/GPU | 800 MHz | 8.00 ms | 86.48 | 13.52 | 0 | v5.0.0 | OpenVX |
| [MoveNet Lightning heatmaps](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/Public_pretrainedmodel_custom_dataset/custom_dataset_person_17kpts/movenet_lightning_heatmaps_224/movenet_lightning_heatmaps_224_int8_pc.tflite) | Int8 | 224x224x3 | per-channel** | STM32MP257F-DK2 | NPU/GPU | 800 MHz | 81.65 ms | 2.77 | 97.23 | 0 | v5.0.0 | OpenVX |
| [MoveNet Lightning heatmaps](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/Public_pretrainedmodel_custom_dataset/custom_dataset_person_17kpts/movenet_lightning_heatmaps_224/movenet_lightning_heatmaps_224_int8_pt.tflite) | Int8 | 224x224x3 | per-tensor | STM32MP257F-DK2 | NPU/GPU | 800 MHz | 11.55 ms | 87.04 | 12.96 | 0 | v5.0.0 | OpenVX |
| [MoveNet Lightning heatmaps](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/Public_pretrainedmodel_custom_dataset/custom_dataset_person_17kpts/movenet_lightning_heatmaps_256/movenet_lightning_heatmaps_256_int8_pc.tflite) | Int8 | 256x256x3 | per-channel** | STM32MP257F-DK2 | NPU/GPU | 800 MHz | 70.57 ms | 3.74 | 96.26 | 0 | v5.0.0 | OpenVX |
| [MoveNet Lightning heatmaps](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/Public_pretrainedmodel_custom_dataset/custom_dataset_person_17kpts/movenet_lightning_heatmaps_256/movenet_lightning_heatmaps_256_int8_pt.tflite) | Int8 | 256x256x3 | per-tensor | STM32MP257F-DK2 | NPU/GPU | 800 MHz | 12.90 ms | 86.33 | 13.67 | 0 | v5.0.0 | OpenVX |
| [MoveNet Lightning](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/Public_pretrainedmodel_custom_dataset/custom_dataset_person_17kpts/movenet_lightning_192/movenet_singlepose_lightning_192_int8.tflite) | Int8 | 192x192x3 | per-channel** | STM32MP257F-DK2 | NPU/GPU | 800 MHz | 66.97 ms | 6.72 | 93.28 | 0 | v5.0.0 | OpenVX |
| [MoveNet Thunder](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/Public_pretrainedmodel_custom_dataset/custom_dataset_person_17kpts/movenet_thunder_256/movenet_singlepose_thunder_256_int8.tflite) | Int8 | 256x256x3 | per-channel** | STM32MP257F-DK2 | NPU/GPU | 800 MHz | 187.1 ms | 3.96 | 96.04 | 0 | v5.0.0 | OpenVX |

** **To get the most out of the MP25 NPU hardware acceleration, please use per-tensor quantization.**
### OKS on COCO Person dataset

Dataset details: [link](https://cocodataset.org/#download), License [CC BY 4.0](https://creativecommons.org/licenses/by/4.0/legalcode), Quotation [[1]](#1), Number of classes: 80, Number of images: 118,287

| Model | Format | Resolution | OKS |
|-------|--------|------------|-----|
| [ST MoveNet Lightning heatmaps per-channel](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/ST_pretrainedmodel_public_dataset/custom_dataset_person_13kpts/st_movenet_lightning_heatmaps_192/st_movenet_lightning_heatmaps_192_int8_pc.tflite) | Int8 | 192x192x3 | *52.1 % |
| [ST MoveNet Lightning heatmaps per-tensor](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/ST_pretrainedmodel_public_dataset/custom_dataset_person_13kpts/st_movenet_lightning_heatmaps_192/st_movenet_lightning_heatmaps_192_int8_pt.tflite) | Int8 | 192x192x3 | *39.31 % |
| [MoveNet Lightning heatmaps per-channel](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/Public_pretrainedmodel_custom_dataset/custom_dataset_person_17kpts/movenet_lightning_heatmaps_192/movenet_lightning_heatmaps_192_int8_pc.tflite) | Int8 | 192x192x3 | 54.01 % |
| [MoveNet Lightning heatmaps per-tensor](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/Public_pretrainedmodel_custom_dataset/custom_dataset_person_17kpts/movenet_lightning_heatmaps_192/movenet_lightning_heatmaps_192_int8_pt.tflite) | Int8 | 192x192x3 | 48.49 % |
| [MoveNet Lightning heatmaps per-channel](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/Public_pretrainedmodel_custom_dataset/custom_dataset_person_17kpts/movenet_lightning_heatmaps_224/movenet_lightning_heatmaps_224_int8_pc.tflite) | Int8 | 224x224x3 | 57.07 % |
| [MoveNet Lightning heatmaps per-tensor](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/Public_pretrainedmodel_custom_dataset/custom_dataset_person_17kpts/movenet_lightning_heatmaps_224/movenet_lightning_heatmaps_224_int8_pt.tflite) | Int8 | 224x224x3 | 50.93 % |
| [MoveNet Lightning heatmaps per-channel](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/Public_pretrainedmodel_custom_dataset/custom_dataset_person_17kpts/movenet_lightning_heatmaps_256/movenet_lightning_heatmaps_256_int8_pc.tflite) | Int8 | 256x256x3 | 58.58 % |
| [MoveNet Lightning heatmaps per-tensor](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/Public_pretrainedmodel_custom_dataset/custom_dataset_person_17kpts/movenet_lightning_heatmaps_256/movenet_lightning_heatmaps_256_int8_pt.tflite) | Int8 | 256x256x3 | 52.86 % |
| [MoveNet Lightning](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/Public_pretrainedmodel_custom_dataset/custom_dataset_person_17kpts/movenet_lightning_192/movenet_singlepose_lightning_192_int8.tflite) | Int8 | 192x192x3 | 54.12 % |
| [MoveNet Thunder](https://github.com/STMicroelectronics/stm32ai-modelzoo/pose_estimation/movenet/Public_pretrainedmodel_custom_dataset/custom_dataset_person_17kpts/movenet_thunder_256/movenet_singlepose_thunder_256_int8.tflite) | Int8 | 256x256x3 | 64.43 % |

\* keypoints = 13
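OKS (Object Keypoint Similarity) plays the role that IoU plays in object detection: each predicted keypoint contributes exp(-d² / (2·s²·k²)), where d is its distance to the ground truth, s² the object's area, and k a per-keypoint constant from the COCO evaluation protocol. A minimal NumPy version is sketched below; the inputs are hypothetical, and the real per-keypoint k values come from the COCO evaluation toolkit.

```python
import numpy as np

def oks(pred, gt, visibility, area, k):
    """Object Keypoint Similarity, averaged over labeled keypoints.

    pred, gt:   (K, 2) arrays of (x, y) coordinates
    visibility: (K,) array, v > 0 marks labeled ground-truth keypoints
    area:       ground-truth object area (s**2 in the COCO formula)
    k:          (K,) per-keypoint falloff constants
    """
    d2 = np.sum((pred - gt) ** 2, axis=1)            # squared distances
    e = d2 / (2.0 * area * k ** 2 + np.spacing(1))   # normalized error
    labeled = visibility > 0
    return float(np.mean(np.exp(-e)[labeled]))
```

A perfect prediction scores 1.0, and the score decays smoothly as keypoints drift, faster for keypoints with small k (e.g. eyes) than for loosely localized ones (e.g. hips).
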
## Integration in a simple example and other services support

Please refer to the stm32ai-modelzoo-services GitHub [here](https://github.com/STMicroelectronics/stm32ai-modelzoo-services).

# References

<a id="1">[1]</a>
“Microsoft COCO: Common Objects in Context”. [Online]. Available: https://cocodataset.org/#download.

@article{DBLP:journals/corr/LinMBHPRDZ14,
  author    = {Tsung{-}Yi Lin and
               Michael Maire and
               Serge J. Belongie and
               Lubomir D. Bourdev and
               Ross B. Girshick and
               James Hays and
               Pietro Perona and
               Deva Ramanan and
               Piotr Doll{\'{a}}r and
               C. Lawrence Zitnick},
  title     = {Microsoft {COCO:} Common Objects in Context},
  journal   = {CoRR},
  volume    = {abs/1405.0312},
  year      = {2014},
  url       = {http://arxiv.org/abs/1405.0312},
  archivePrefix = {arXiv},
  eprint    = {1405.0312},
  timestamp = {Mon, 13 Aug 2018 16:48:13 +0200},
  biburl    = {https://dblp.org/rec/bib/journals/corr/LinMBHPRDZ14},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}