HW/SW Co-design and Prototyping Approach for Embedded Smart Camera: ADAS Case Study


  • B. Senouci Graduate Engineering School, ECE-Paris, INSEEC-U Research Center, Paris, France
  • H. Rouis Graduate Engineering School, ECE-Paris, INSEEC-U Research Center, Paris, France
  • Q. Cabanes Graduate Engineering School, ECE-Paris, INSEEC-U Research Center, Paris, France
  • A.C. Ramdan University of Versailles Saint-Quentin-en-Yvelines, LISV Laboratory, France
  • D.S. Han Kyungpook National University, South Korea


ADAS, Embedded Architecture, FPGA-based Design, Hardware Accelerators, High-Level Synthesis, HW/SW Co-design, Machine Learning, Real-Time OS, Smart Cars


In 1968, Volkswagen integrated an electronic fuel-injection control circuit, nicknamed the “Little Black Box”, which is considered the first embedded system in the automotive industry. Today, automobile manufacturers integrate several embedded systems into every new vehicle model. Behind these automotive electronic systems lies a sophisticated Hardware/Software (HW/SW) architecture built on heterogeneous components and multiple CPUs. Increasingly, such systems are oriented toward vision-based processing using tiny embedded smart cameras, and meeting real-time constraints in these vision-based systems is one of the most challenging issues, especially in automotive applications. On the design side, one effective solution adopted by embedded-system designers is to associate CPUs and hardware accelerators in the same design, reducing the computational burden on the CPU and speeding up data processing. In this paper, we present a hardware platform-based design approach for fast embedded smart Advanced Driver Assistance System (ADAS) design and prototyping, as an alternative to the purely simulation-based technique, which is time-consuming. Based on a multi-CPU/FPGA platform, we introduce a new methodology/flow to design the different HW and SW parts of the ADAS. We then share our experience in designing and prototyping a vision-based smart embedded system, an ADAS that helps increase driver safety. We present a real HW/SW prototype of the vision ADAS based on a Zynq FPGA. The system detects the fatigue/drowsiness state of the driver by monitoring eye closure and generates a real-time alert. A new hardware skin-segmentation step to locate the eyes/face is proposed.
Our approach migrates the skin-segmentation step from the processing system (SW) to the programmable logic (HW), taking advantage of a High-Level Synthesis (HLS) tool flow to accelerate the implementation and prototyping of the vision-based ADAS on a hardware platform.
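The skin-segmentation step described above amounts to a per-pixel color test, which is what makes it a good candidate for programmable logic. As a hedged sketch only: the abstract does not give the actual IP or its thresholds, so the example below assumes a classic YCbCr threshold rule (the widely used Chai-Ngan Cb/Cr ranges) in a fixed-point, divider-free form that an HLS tool could map to hardware.

```c
#include <stdint.h>

/* Illustrative skin-pixel classifier, NOT the paper's actual IP.
 * Assumption: skin is detected by thresholding Cb/Cr (Chai-Ngan ranges
 * 77 <= Cb <= 127, 133 <= Cr <= 173). */
int is_skin(uint8_t r, uint8_t g, uint8_t b)
{
    /* Fixed-point (Q8) RGB -> Cb/Cr, HLS-friendly (no floats, no division):
     * Cb = 128 - 0.169*R - 0.331*G + 0.500*B
     * Cr = 128 + 0.500*R - 0.419*G - 0.081*B
     * The 32768 offset folds in the +128 bias so the sum stays non-negative
     * before the shift. */
    int cb = (32768 + (-43 * r - 85 * g + 128 * b)) >> 8;
    int cr = (32768 + (128 * r - 107 * g - 21 * b)) >> 8;
    return (cb >= 77 && cb <= 127) && (cr >= 133 && cr <= 173);
}
```

In a streaming hardware design, a function of this shape would be applied to each pixel of the camera stream per clock cycle, producing a binary skin mask used to locate the face/eye region for the downstream eye-closure analysis.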


“How VW Fuel Injector Works: A Mini-Computer Aids Economy, Cuts Pollution,” Chicago Tribune, Feb. 25, 1968.

A. Eskandarian and A. Mortazavi, “Evaluation of smart algorithm for commercial vehicle driver drowsiness detection,” Proceedings of the 2007 IEEE Intelligent Vehicles Symposium, pp. 553–559, 2007.


P. U. Lima, A. Ahmad, A. Dias, A. G. S. Conceição, A. P. Moreira, E. Silva, L. Almeida, L. Oliveira, and T. P. Nascimento, “Formation control driven by cooperative object tracking,” Robotics and Autonomous Systems, vol. 63, part 1, pp. 68–79, Jan. 2015. DOI: http://dx.doi.org/10.1016/j.robot.2014.08.018


“Autonomous car,” Wikipedia. [Online]. Available: http://en.wikipedia.org/wiki/Autonomous_car

S. Russel, “DARPA Grand Challenge Winner: Stanley the Robot!” Popular Mechanics, Jan. 2006. [Online]. Available: http://www.popularmechanics.com/technology/robots/a393/2169012/

T. C. Frankel, “What it feels like to drive a Tesla on autopilot,” The Washington Post, Feb. 2016. [Online]. Available: https://www.washingtonpost.com/news/the-switch/wp/2016/02/01/what-it-feels-like-to-drive-a-tesla-on-autopilot/


E. Nurvitadhi, S. Subhaschandra, G. Boudoukh, G. Venkatesh, J. Sim, D. Marr, R. Huang, J. Ong Gee Hock, Y. T. Liew, K. Srivatsan, and D. Moss, “Can FPGAs Beat GPUs in Accelerating Next-Generation Deep Neural Networks?,” Proceedings of the 2017 ACM/SIGDA International Symposium on Field-Programmable Gate Arrays (FPGA ’17), ACM Press, Monterey, California, USA, pp. 5–14, 2017. https://doi.org/10.1145/3020078.3021740

C.-T. Lin, R.-C. Wu, S.-F. Liang, W.-H. Chao, Y.-J. Chen, and T.-P. Jung, “EEG-Based Drowsiness Estimation for Safety Driving Using Independent Component Analysis,” IEEE Transactions on Circuits and Systems—I: Regular Papers, vol. 52, no. 12, Dec. 2005.


R. Sayed and A. Eskandarian, “Unobtrusive drowsiness detection by neural network learning of driver steering,” Proceedings of the Institution of Mechanical Engineers. Part D, Journal of Automobile Engineering, vol. 215, pp. 969–975, 2001.

Y. Lin, H. Leng, G. Yang, et al., “An intelligent noninvasive sensor for driver pulse wave measurement,” IEEE Sensors Journal, vol. 7, pp. 790–799, 2007.

D. Tran et al., “A Driver Assistance Framework based on Driver Drowsiness Detection,” The 6th Annual IEEE International Conference on Cyber Technology in Automation, Control and Intelligent Systems, Chengdu, China, June 19–22, 2016.

R. Wang, Y. Wang, and C. Luo, “EEG-based real-time drowsiness detection using Hilbert-Huang transform,” in Intelligent Human-Machine Systems and Cybernetics (IHMSC), 2015 7th International Conference on, vol. 1. IEEE, 2015, pp. 195–198.

A. Tayyaba, M. Arfan Jaffar, M. Ramzan, and M. Anwar Mirza, “Automatic Fatigue Detection of Drivers through Yawning Analysis,” 2009.

E. Murphy-Chutorian, A. Doshi, and M. Trivedi, “Head Pose Estimation for Driver Assistance Systems: A Robust Algorithm and Experimental Evaluation,” 2007.

B. Senouci, H. Rouis, D. S. Han, and E. Bourennane, “A Hardware Skin-Segmentation IP for Vision Based Smart ADAS Through an FPGA Prototyping,” 9th IEEE International Conference on Ubiquitous and Future Networks (ICUFN), The 5th International Workshop on Intelligent Vehicles, Milan, Italy, July 2017.

B. Senouci, I. Charfi, B. Heyrman, J. Dubois, and J. Miteran, “Fast prototyping of a SoC-based smart camera: a real-time fall detection case study,” Journal of Real-Time Image Processing, 2015.

G. P. Stein, E. Rushinek, G. Hayun, and A. Shashua, “A computer vision system on a chip: a case study from the automotive domain,” IEEE Conference on Computer Vision and Pattern Recognition (CVPRW’05), p. 130, June 2005.

Gururaj P. et al., “An Analysis of Skin Pixel Detection using Different Skin Color Extraction Techniques,” International Journal of Computer Applications (0975-8887), vol. 54, no. 17, September 2012.

C. Claus, W. Stechele, and A. Herkersdorf, “AutoVision – a run-time reconfigurable MPSoC architecture for future driver assistance systems,” Information Technology, vol. 49, no. 3, pp. 181–187, 2007.

W. Shi, M. B. Alawieh, X. Li, and H. Yu, “Algorithm and hardware implementation for visual perception system in autonomous vehicle: A survey,” Integration, vol. 59, pp. 148–156, 2017. https://doi.org/10.1016/j.vlsi.2017.07.007

B. Falsafi, B. Dally, D. Singh, D. Chiou, J. J. Yi, and R. Sendag, “FPGAs versus GPUs in Data Centers,” IEEE Micro, vol. 37, pp. 60–72, 2017. https://doi.org/10.1109/MM.2017.19






How to Cite

Senouci, B., Rouis, H., Cabanes, Q., Ramdan, A., & Han, D. (2019). HW/SW Co-design and Prototyping Approach for Embedded Smart Camera: ADAS Case Study. Journal of Telecommunication, Electronic and Computer Engineering (JTEC), 11(4), 31–40. Retrieved from https://jtec.utem.edu.my/jtec/article/view/5313