YOLOv8-DeepSORT: A High-Performance Framework for Real-Time Multi-Object Tracking with Attention and Adaptive Optimization
Keywords:
YOLOv8, DeepSORT, Object Tracking, MOTA, Real-time Performance, Computer Vision, Deep Learning, Multi-object Tracking
Abstract
The integration of YOLOv8 and DeepSORT has significantly advanced real-time multi-object tracking in computer vision, delivering a robust solution for dynamic video analysis. This study comprehensively evaluates the YOLOv8-DeepSORT pipeline, combining YOLOv8's high-accuracy detection with DeepSORT's efficient identity association to achieve precise and consistent tracking. Key contributions include domain-specific fine-tuning of YOLOv8, optimization through model pruning and quantization, and seamless integration with DeepSORT's deep appearance descriptors and Kalman filtering. The system was rigorously tested on the MOT20 benchmark, achieving a Multiple Object Tracking Accuracy (MOTA) of 78.2%, precision of 83.5%, recall of 81.0%, and a mean Intersection over Union (IoU) of 0.74, demonstrating strong detection and tracking performance. The framework preserved identities reliably across frames, with only 19 ID switches and a false positive rate (FPR) of 4.8%. Real-time deployment on a GTX 1660 Ti reached 28.6 frames per second (FPS), confirming the system's suitability for latency-sensitive applications. The study highlights practical implementations in traffic monitoring, industrial automation, retail analytics, and surveillance, showcasing the pipeline's adaptability to diverse scenarios. Challenges such as computational efficiency for edge deployment, occlusion handling in crowded environments, and ethical considerations in surveillance applications are critically analyzed. Optimization techniques, including adaptive tracking and multimodal integration, are proposed to address current limitations. By synthesizing experimental results and real-world case studies, this work provides a detailed assessment of the YOLOv8-DeepSORT framework, emphasizing its balance of accuracy, speed, and scalability. The findings serve as a valuable reference for researchers and practitioners aiming to deploy efficient object tracking systems in resource-constrained environments.
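To make the detection-plus-association pipeline described above concrete, the sketch below shows one common way to wire a YOLOv8 detector to a DeepSORT tracker frame by frame. It is a minimal illustration under stated assumptions, not the paper's implementation: it assumes the open-source ultralytics and deep-sort-realtime Python packages, a pretrained yolov8n.pt checkpoint, and a placeholder input.mp4 video, none of which are specified in the abstract.

```python
# Minimal sketch of a YOLOv8 + DeepSORT tracking loop.
# Assumes the `ultralytics` and `deep-sort-realtime` packages; the paper's
# exact implementation (fine-tuned weights, pruning, quantization) may differ.
import cv2
from ultralytics import YOLO
from deep_sort_realtime.deepsort_tracker import DeepSort

detector = YOLO("yolov8n.pt")        # placeholder YOLOv8 checkpoint
tracker = DeepSort(max_age=30)       # Kalman filter + appearance-based association

cap = cv2.VideoCapture("input.mp4")  # hypothetical input video
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break

    # 1) Detect objects in the current frame with YOLOv8.
    result = detector(frame, verbose=False)[0]
    detections = []
    for box in result.boxes:
        x1, y1, x2, y2 = box.xyxy[0].tolist()
        conf = float(box.conf[0])
        cls = int(box.cls[0])
        # DeepSORT expects ([left, top, width, height], confidence, class).
        detections.append(([x1, y1, x2 - x1, y2 - y1], conf, cls))

    # 2) Associate detections with existing tracks (motion + appearance cues).
    tracks = tracker.update_tracks(detections, frame=frame)

    # 3) Draw confirmed tracks with persistent IDs.
    for track in tracks:
        if not track.is_confirmed():
            continue
        l, t, r, b = map(int, track.to_ltrb())
        cv2.rectangle(frame, (l, t), (r, b), (0, 255, 0), 2)
        cv2.putText(frame, f"ID {track.track_id}", (l, t - 5),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 1)

    cv2.imshow("YOLOv8-DeepSORT", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

In this arrangement, YOLOv8 supplies per-frame bounding boxes while DeepSORT maintains identities through Kalman-filter motion prediction and deep appearance descriptors, which is the division of labor the abstract describes.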
Copyright (c) 2025 Godfrey Perfectson Oise, Nkem Belinda Unuigbokhai, Chioma Julia Onwuzo, Onyemaechi Clement Nwabuokei, Prosper Otega Ejenarhome, Onoriode Michael Atake, Sofiat Kehinde Bakare (Author)

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.