---
license: other
license_name: aplux-model-farm-license
license_link: https://aiot.aidlux.com/api/v1/files/license/model_farm_license_en.pdf
pipeline_tag: image-segmentation
tags:
- AIoT
- QNN
---

![](https://aiot.aidlux.com/_next/image?url=%2Fapi%2Fv1%2Ffiles%2Fmodel%2Fcover%2F20250403015534_%25E5%259B%25BE1%402x(3).png&w=640&q=75)

## PIDNet: Semantic Segmentation

PIDNet, proposed in 2023, is a real-time semantic segmentation network that carries the Proportional-Integral-Derivative (PID) control analogy into its architecture and feature fusion. It consists of three parallel branches: a proportional branch that captures local detail, an integral branch that aggregates global context, and a derivative branch that detects boundary variation. These branches are fused adaptively, in a manner analogous to a PID controller, to balance multi-scale features for both accuracy and speed. With its lightweight design, PIDNet exceeds 80% mIoU on Cityscapes with its largest variant while maintaining real-time inference speeds, outperforming models such as BiSeNet. It is particularly strong at segmenting small objects and fine edges in complex scenes, making it well suited to autonomous driving, robotics, and industrial inspection, where real-time processing and precision are both critical.
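
The toy PyTorch module below is a minimal sketch of the three-branch idea described above (detail, context, and boundary paths with a boundary-guided fusion). It is illustrative only: the layer choices, class name `ToyPIDHead`, and channel sizes are placeholders, not the authors' architecture.

```python
# Illustrative only: a toy three-branch layout in the spirit of PIDNet's
# P (detail), I (context), and D (boundary) branches.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyPIDHead(nn.Module):
    def __init__(self, in_ch=64, mid_ch=64, num_classes=19):
        super().__init__()
        # P branch: keeps full resolution to preserve local detail
        self.p_branch = nn.Sequential(
            nn.Conv2d(in_ch, mid_ch, 3, padding=1), nn.BatchNorm2d(mid_ch), nn.ReLU(inplace=True))
        # I branch: strided conv aggregates wider context at lower resolution
        self.i_branch = nn.Sequential(
            nn.Conv2d(in_ch, mid_ch, 3, stride=4, padding=1), nn.BatchNorm2d(mid_ch), nn.ReLU(inplace=True))
        # D branch: a thin branch intended to respond to boundaries/edges
        self.d_branch = nn.Sequential(
            nn.Conv2d(in_ch, mid_ch, 3, padding=1), nn.BatchNorm2d(mid_ch), nn.ReLU(inplace=True))
        # Fusion: the boundary branch gates how much detail vs. context to keep
        self.gate = nn.Conv2d(mid_ch, 1, 1)
        self.classifier = nn.Conv2d(mid_ch, num_classes, 1)

    def forward(self, x):
        p = self.p_branch(x)
        i = F.interpolate(self.i_branch(x), size=p.shape[-2:], mode="bilinear", align_corners=False)
        d = self.d_branch(x)
        alpha = torch.sigmoid(self.gate(d))    # boundary-derived weight in [0, 1]
        fused = alpha * p + (1.0 - alpha) * i  # detail near edges, context elsewhere
        return self.classifier(fused)

logits = ToyPIDHead()(torch.randn(1, 64, 128, 256))
print(logits.shape)  # torch.Size([1, 19, 128, 256])
```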

### Source model

- Input shape: 1x3x1024x2048
- Number of parameters: 7.62M
- Model size: 29.14 MB
- Output shape: 1x19x128x256

The source model can be found [here](https://github.com/XuJiacong/PIDNet); a minimal I/O sketch for the shapes listed above follows below.
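
The snippet below sketches the input/output contract listed above, assuming an ONNX export of the source model saved as `pidnet.onnx` (a placeholder path). The ImageNet mean/std normalization is a common-practice assumption and should be checked against the source repository's preprocessing.

```python
# Hypothetical I/O sketch: run a dummy 1024x2048 frame through an assumed
# ONNX export of the source model and inspect the output shape.
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("pidnet.onnx")  # placeholder path
input_name = sess.get_inputs()[0].name

# Dummy RGB frame at the expected Cityscapes resolution (H=1024, W=2048)
img = np.random.randint(0, 256, (1024, 2048, 3), dtype=np.uint8)

# HWC uint8 -> normalized NCHW float32, shape (1, 3, 1024, 2048)
x = img.astype(np.float32) / 255.0
x = (x - np.array([0.485, 0.456, 0.406], dtype=np.float32)) \
    / np.array([0.229, 0.224, 0.225], dtype=np.float32)
x = x.transpose(2, 0, 1)[None, ...].astype(np.float32)

# Output logits: (1, 19, 128, 256), one channel per Cityscapes class
logits = sess.run(None, {input_name: x})[0]
print(logits.shape)

# Per-pixel class indices at the output resolution; upsample to 1024x2048 as needed
pred = logits.argmax(axis=1)  # (1, 128, 256)
```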

## Performance Reference

Please search for the model by its name in [Model Farm](https://aiot.aidlux.com/en/models)

## Inference & Model Conversion

Please search for the model by its name in [Model Farm](https://aiot.aidlux.com/en/models)

## License

- Source Model: [MIT](https://github.com/XuJiacong/PIDNet/blob/main/LICENSE)

- Deployable Model: [APLUX-MODEL-FARM-LICENSE](https://aiot.aidlux.com/api/v1/files/license/model_farm_license_en.pdf)