This doc introduces how to convert your PyTorch model into ONNX and how to run an ONNXRuntime demo to verify the conversion.
cd <ByteTrack_HOME>
python3 tools/export_onnx.py --output-name bytetrack_s.onnx -f exps/example/mot/yolox_s_mix_det.py -c pretrained/bytetrack_s_mot17.pth.tar
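After exporting, you can quickly sanity-check the resulting file before running the full demo. The sketch below is a minimal, assumed workflow (not part of the repo's scripts): the file name `bytetrack_s.onnx` matches the export command above, while the input name `images` and the 608x1088 input size are assumptions for the small model and may differ for your exp file.

```python
# Minimal sketch: verify the exported ONNX graph and run one forward pass.
# Assumptions: input name "images", input shape (1, 3, 608, 1088).
import numpy as np
import onnx
import onnxruntime

model_path = "bytetrack_s.onnx"

# Structural check of the exported graph.
onnx.checker.check_model(onnx.load(model_path))

# Run a single forward pass on random data to confirm the graph executes.
session = onnxruntime.InferenceSession(model_path)
input_meta = session.get_inputs()[0]
print("input:", input_meta.name, input_meta.shape)  # expected: "images", [1, 3, 608, 1088]

dummy = np.random.rand(1, 3, 608, 1088).astype(np.float32)
outputs = session.run(None, {input_meta.name: dummy})
print("output shape:", outputs[0].shape)
```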
You can run the ONNX demo at about 16 FPS on a 96-core Intel® Xeon® Platinum 8163 CPU @ 2.50GHz:
cd <ByteTrack_HOME>/deploy/ONNXRuntime
python3 onnx_inference.py
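For each frame, the demo script preprocesses the image, runs the ONNXRuntime session, decodes the raw detections, and passes them to the BYTE tracker. The sketch below illustrates only the preprocessing and session call, under assumptions not confirmed by this doc: YOLOX-style ratio-preserving resize with pad value 114, ImageNet mean/std normalization, a 608x1088 input size for the small model, and a hypothetical test image `demo.jpg`.

```python
# Minimal per-frame sketch (assumed preprocessing, not the repo's exact code).
import cv2
import numpy as np
import onnxruntime


def preprocess(img, input_size=(608, 1088)):
    """Resize keeping aspect ratio, pad to input_size, normalize, return NCHW tensor."""
    padded = np.ones((input_size[0], input_size[1], 3), dtype=np.float32) * 114.0
    ratio = min(input_size[0] / img.shape[0], input_size[1] / img.shape[1])
    resized = cv2.resize(
        img,
        (int(img.shape[1] * ratio), int(img.shape[0] * ratio)),
        interpolation=cv2.INTER_LINEAR,
    ).astype(np.float32)
    padded[: resized.shape[0], : resized.shape[1]] = resized

    # Assumed normalization: scale to [0, 1], then ImageNet mean/std.
    padded /= 255.0
    padded -= np.array([0.485, 0.456, 0.406], dtype=np.float32)
    padded /= np.array([0.229, 0.224, 0.225], dtype=np.float32)
    return padded.transpose(2, 0, 1)[None], ratio  # NCHW batch of 1


session = onnxruntime.InferenceSession("bytetrack_s.onnx")
frame = cv2.imread("demo.jpg")  # hypothetical test image
blob, ratio = preprocess(frame)
raw_output = session.run(None, {session.get_inputs()[0].name: blob})[0]
# onnx_inference.py then decodes raw_output into boxes and scores and feeds them to BYTETracker.
print("raw output shape:", raw_output.shape)
```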