Comparison of Two Objects Classification Techniques using Hidden Markov Models and Convolutional Neural Networks
- Authors: Sarmiento C., Savage J.
- Affiliations: National Autonomous University of Mexico (UNAM)
- Issue: Vol. 19, No. 6 (2020)
- Pages: 1222-1254
- Section: Artificial intelligence, knowledge and data engineering
- URL: https://ogarev-online.ru/2713-3192/article/view/266291
- DOI: https://doi.org/10.15622/ia.2020.19.6.4
- ID: 266291
Abstract
This paper presents a comparison between discrete Hidden Markov Models and Convolutional Neural Networks for the image classification task. By fragmenting an image into sections, we obtain vectors that represent visual features locally; by traversing those sections in a fixed spatial order, the image can be represented as a sequence of vectors. Using clustering techniques, we derive an alphabet from these vectors, and the resulting symbol sequences are used to build a statistical model that represents a class of images. Hidden Markov Models, combined with quantization methods, can handle noise and distortions in observations for computer vision problems such as the classification of images with lighting and perspective changes. We tested architectures based on three, six, and nine hidden states, favoring detection speed and low memory usage, as well as two types of ensemble models. We evaluated the precision of the proposed methods on a public-domain data set, obtaining results competitive with fine-tuned Convolutional Neural Networks while using significantly fewer computing resources. This is of interest for mobile robots whose onboard computers have limited battery life but which must still be able to detect and add new objects to their classification systems.
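The pipeline described in the abstract (fixed grid of image patches, local feature vectors, a clustered codebook, symbol sequences, and one discrete HMM per class compared by log-likelihood) can be sketched as below. This is a minimal illustration under stated assumptions, not the authors' implementation: the libraries (scikit-learn, hmmlearn with `CategoricalHMM`), the mean-colour patch features, and all parameter values (grid size, a 64-symbol alphabet, 3 hidden states) are choices made here for brevity.

```python
# Minimal sketch of the grid-patch / vector-quantization / discrete-HMM pipeline
# outlined in the abstract. Feature choice, library choice (scikit-learn,
# hmmlearn) and every parameter value are illustrative assumptions, not the
# authors' implementation.
import numpy as np
from sklearn.cluster import MiniBatchKMeans      # web-scale k-means variant
from hmmlearn.hmm import CategoricalHMM          # discrete-observation HMM

def patch_features(image, grid=8):
    """Split an (H, W, 3) image into a fixed grid and return one mean-colour
    vector per patch, read in a fixed raster (left-to-right, top-down) order."""
    h, w = image.shape[0] // grid, image.shape[1] // grid
    feats = [image[r*h:(r+1)*h, c*w:(c+1)*w].reshape(-1, 3).mean(axis=0)
             for r in range(grid) for c in range(grid)]
    return np.asarray(feats)                     # shape: (grid*grid, 3)

def build_codebook(train_images, n_symbols=64):
    """Cluster all patch vectors from the training set into an alphabet."""
    all_feats = np.vstack([patch_features(im) for im in train_images])
    return MiniBatchKMeans(n_clusters=n_symbols, random_state=0).fit(all_feats)

def to_symbols(image, codebook):
    """Quantize one image's patch vectors into a column of symbol indices."""
    return codebook.predict(patch_features(image)).reshape(-1, 1)

def train_class_hmm(images, codebook, n_states=3, n_symbols=64):
    """Fit one discrete HMM on the symbol sequences of a single object class."""
    seqs = [to_symbols(im, codebook) for im in images]
    hmm = CategoricalHMM(n_components=n_states, n_features=n_symbols,
                         n_iter=100, random_state=0)
    return hmm.fit(np.concatenate(seqs), lengths=[len(s) for s in seqs])

def classify(image, codebook, class_hmms):
    """Pick the class whose HMM assigns the highest log-likelihood."""
    seq = to_symbols(image, codebook)
    return max(class_hmms, key=lambda c: class_hmms[c].score(seq))
```

A bank of such per-class models (the `class_hmms` dictionary above) is what makes it cheap to add a new object category at run time: only one small HMM has to be trained, which is the property the abstract highlights for battery-limited mobile robots.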
About the authors
C. Sarmiento
National Autonomous University of Mexico (UNAM)
Corresponding author.
Email: ing.adriansarmiento@comunidad.unam.mx
Circuito Exterior S/N, Ciudad Universitaria
J. Savage
National Autonomous University of Mexico (UNAM)
Email: robotssavage@gmail.com
Circuito Exterior S/N, Ciudad Universitaria
