Abstract

This article discusses the use of humanoid robots in daily life and their ability to perform tasks that humans may find unpleasant. The study introduces the Voice Controlled Humanoid Robot (VCHR), a mobile robot that can be controlled by specific voice commands. The Google Voice API processes the spoken commands, which are converted to text and sent to an Arduino NodeMCU controller to carry out the corresponding operations. The VCHR app and the robot are linked over Bluetooth, and the robot is equipped with SONAR sensors for obstacle detection and a camera for live video streaming. The VCHR system can perform around 20 distinct tasks, including speech-emotion recognition. Temperature sensor readings from the VCHR system are sent to the IoT cloud service ThingSpeak for analysis. Experimental results demonstrate the system's performance in movement, speech-emotion recognition, and sensor data processing.
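To illustrate the command pipeline the abstract describes (recognized speech arriving as text over Bluetooth, motion control, and SONAR-based obstacle checks), the following Arduino-style C++ sketch is a minimal example under stated assumptions; it is not the authors' implementation. The pin assignments, the command words "forward" and "stop", the HC-05 Bluetooth module bridged to the hardware serial port, and the HC-SR04-style ultrasonic sensor are all illustrative choices.

```cpp
// Minimal sketch (assumptions: HC-05 Bluetooth module on the hardware
// serial port, HC-SR04-style ultrasonic sensor on TRIG_PIN/ECHO_PIN,
// and a single forward-drive pin on a motor driver).
const int TRIG_PIN  = 12;
const int ECHO_PIN  = 14;
const int MOTOR_FWD = 5;

long readDistanceCm() {
  // Send a 10 us trigger pulse and time the echo to estimate distance.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000UL);  // 30 ms timeout
  return duration / 58;                              // approx. distance in cm
}

void setup() {
  Serial.begin(9600);             // recognized phrases arrive as text over Bluetooth serial
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  pinMode(MOTOR_FWD, OUTPUT);
}

void loop() {
  if (Serial.available()) {
    String cmd = Serial.readStringUntil('\n');  // one recognized phrase per line
    cmd.trim();
    if (cmd == "forward" && readDistanceCm() > 20) {
      digitalWrite(MOTOR_FWD, HIGH);            // move only if the path is clear
    } else if (cmd == "stop") {
      digitalWrite(MOTOR_FWD, LOW);
    }
  }
  if (readDistanceCm() <= 20) {
    digitalWrite(MOTOR_FWD, LOW);               // stop when an obstacle is close
  }
}
```

The ThingSpeak step can likewise be sketched with the service's documented update endpoint (`/update?api_key=...&field1=...`). The sketch below assumes an ESP8266-class NodeMCU, user-supplied Wi-Fi credentials and a ThingSpeak write API key, and a placeholder temperature value in place of a real sensor read.

```cpp
// Minimal sketch (assumptions: ESP8266-class NodeMCU; the Wi-Fi
// credentials, write API key, and temperature value are placeholders).
#include <ESP8266WiFi.h>

const char* WIFI_SSID = "your-ssid";        // assumption: user-supplied
const char* WIFI_PASS = "your-password";    // assumption: user-supplied
const char* API_KEY   = "YOUR_WRITE_KEY";   // assumption: ThingSpeak write key

void setup() {
  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) delay(500);
}

void loop() {
  float temperatureC = 25.0;                 // placeholder for a real sensor reading
  WiFiClient client;
  if (client.connect("api.thingspeak.com", 80)) {
    // ThingSpeak's documented update endpoint: /update?api_key=...&field1=...
    client.print(String("GET /update?api_key=") + API_KEY +
                 "&field1=" + temperatureC + " HTTP/1.1\r\n" +
                 "Host: api.thingspeak.com\r\nConnection: close\r\n\r\n");
    client.stop();
  }
  delay(20000);  // ThingSpeak's free tier enforces roughly 15 s between updates
}
```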

Author information

Corresponding author

Correspondence to Bisma Naeem.

Ethics declarations

Conflict of interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Ethical approval

All experimental protocols were approved by Information Technology University, Lahore. All methods were carried out in accordance with the relevant guidelines and regulations.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article

Cite this article

Naeem, B., Kareem, W., Saeed-Ul-Hassan et al. Voice controlled humanoid robot. Int J Intell Robot Appl 8, 61–75 (2024). https://doi.org/10.1007/s41315-023-00304-z
