INFORMATION-THEORETIC BOUNDS TO ACCURACY FOR BIOMETRIC IDENTIFICATION IN METRIC SPACES OF DATA REPRESENTATIONS
- Authors: Lange A.M., Lange M.M., Paramonov S.V.
- Affiliation: Federal Research Center “Computer Science and Control” RAS
- Issue: No. 6 (2025)
- Pages: 146–154
- Section: Pattern Recognition and Image Processing
- URL: https://ogarev-online.ru/0002-3388/article/view/360471
- DOI: https://doi.org/10.7868/S3034543X25060149
- ID: 360471
About the authors
A. Lange
Federal Research Center “Computer Science and Control” RAS
Email: lange_am@mail.ru
Moscow, Russia
M. Lange
Federal Research Center “Computer Science and Control” RAS
Email: lange_mm@mail.ru
Moscow, Russia
S. Paramonov
Federal Research Center “Computer Science and Control” RAS
Email: psypobox@gmail.com
Moscow, Russia
References
- Gallager R.G. Information Theory and Reliable Communication. N.Y.: Wiley and Sons, 1968. 588 p.
- Dobrushin R.L., Tsybakov B.S. Information Transmission with Additional Noise // IRE Transactions on Information Theory. 1962. V. 8(5). P. 293–304. doi: 10.1109/TIT.1962.1057738.
- Sethi I.K., Sarvarayudu G.P.R. Hierarchical Classifier Design Using Mutual Information // IEEE Transactions on Pattern Analysis and Machine Intelligence. 1982. V. 4(4). P. 441–445.
- Rigau J., Feixas M., Sbert M. An Information Theoretic Framework for Image Segmentation // Intern. Conf. on Image Processing (ICIP '04). Singapore: IEEE, 2004. doi: 10.1109/ICIP.2004.1419518.
- Lange M.M., Lange A.M. Information Theoretic Lower Bounds to Error Probability for the Models of Noisy Discrete Source Coding and Object Classification // Pattern Recognition and Image Analysis. 2022. V. 32(3). P. 570–574. doi: 10.1134/S105466182203021X.
- Lange M.M., Lange A.M. Information-Theoretic Bounds to Accuracy of Message Coding and Pattern Recognition over Data Ensembles // Computer Optics. 2024. V. 48(3). P. 460–470. doi: 10.18287/2412-6179-CO-1362. (in Russian)
- Brown G., Pocock A., Zhao M.J., Luján M. Conditional Likelihood Maximization: A Unifying Framework for Information Theoretic Feature Selection // J. Machine Learning Research. 2012. V. 13(8). P. 27–66.
- Fleuret F. Fast Binary Feature Selection with Conditional Mutual Information // J. Machine Learning Research. 2004. V. 5. P. 1531–1555.
- Lange A.M., Lange M.M., Paramonov S.V. On the Relation between Mutual Information and Error Probability in the Data Classification Problem // Computational Mathematics and Mathematical Physics. 2021. V. 61(7). P. 1129–1205. doi: 10.31857/S0044466921070115. (in Russian)
- Duin R.P.W., de Ridder D., Tax D.M.J. Experiments with a Featureless Approach to Pattern Recognition // Pattern Recognition Letters. 1997. V. 18. P. 1159–1166.
- Dvoenko S.D. Recovering Missing Values of Paired Comparisons // Pattern Recognition and Image Analysis. 2022. V. 32(3). P. 522–527. doi: 10.1134/S1054661822030099.
- Sueño H.T., Gerardo B.D., Medina R.P. Multi-class Document Classification Using Support Vector Machine (SVM) Based on Improved Naive Bayes Vectorization Technique // Intern. J. Advanced Trends in Computer Science and Engineering. 2020. V. 9(3). P. 3937–3944. doi: 10.30534/ijatcse/2020/216932020.
- Xu X., Huang S.L., Zheng L., Wornell G.W. An Information Theoretic Interpretation to Deep Neural Networks // Entropy. 2022. V. 24(1). P. 135. doi: 10.3390/e24010135.
- Cover T.M., Thomas J.A. Elements of Information Theory. 2nd ed. N.Y.: Wiley and Sons, 2006. 748 p.
- Distance Matrices for Face Dataset. Available: https://github.com/lange-am/tree_distance_matricies/blob/main/treedist_faces_euclidean2.txt
- Distance Matrices for Signature Dataset. Available: https://github.com/lange-am/tree_distance_matricies/blob/main/treedist_signs_euclidean2.txt
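The two repository files above publish precomputed pairwise distance matrices for the face and signature datasets. A minimal loading sketch follows, under the assumption (not stated in this record) that each `.txt` file contains a square, whitespace-separated matrix of pairwise squared Euclidean distances; the inline `sample` stands in for a downloaded file such as `treedist_faces_euclidean2.txt`.

```python
# Hedged sketch: parse a whitespace-separated distance matrix and
# sanity-check the properties a distance matrix must have.
# The file layout (square, whitespace-separated) is an assumption.
import io
import numpy as np

def load_distance_matrix(text_stream):
    """Read a square distance matrix and verify symmetry and zero diagonal."""
    d = np.loadtxt(text_stream)
    assert d.ndim == 2 and d.shape[0] == d.shape[1], "matrix must be square"
    assert np.allclose(d, d.T), "distance matrix must be symmetric"
    assert np.allclose(np.diag(d), 0.0), "self-distances must be zero"
    return d

# Tiny inline example in place of an actual downloaded matrix file:
sample = io.StringIO("0 1 4\n1 0 9\n4 9 0\n")
D = load_distance_matrix(sample)
print(D.shape)  # (3, 3)
```

The checks mirror what one would expect of squared-metric data; if the real files use a different delimiter or encode only the upper triangle, the `np.loadtxt` call would need adjusting accordingly.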