Far East Journal of Experimental and Theoretical Artificial Intelligence
Volume 2, Issue 1, Pages 1 - 24
(August 2008)
THE BOUNDS ON THE RATE OF UNIFORM CONVERGENCE OF LEARNING PROCESS BASED ON COMPLEX RANDOM SAMPLES
Zhi-Ming Zhang (P. R. China), Witold Pedrycz (P. R. China), Ming-Hu Ha (P. R. China) and Da-Zeng Tian (P. R. China)
Abstract: Statistical learning theory is commonly regarded as a sound framework for handling a variety of learning problems in the presence of small-size data samples, and it has become a rapidly progressing research area in machine learning. The theory is based on real random samples and as such is not equipped to deal with statistical learning problems involving complex random samples, which may be encountered in real-world scenarios. This paper explores statistical learning theory based on complex random samples and in this sense generalizes the existing fundamentals. First, the definitions of complex random variable, primary norm, and linear functional are introduced. Second, the complex empirical risk functional, the complex expected risk functional, and the complex empirical risk minimization principle are defined. Third, the concepts of annealed entropy, growth function, and VC dimension of complex measurable functions are proposed, and some of their important properties are proved. Finally, on the basis of these definitions and derived properties, bounds on the rate of uniform convergence of the learning process based on complex random samples are constructed.
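For context, in the classical (real-valued) setting the empirical risk minimization principle replaces the expected risk by its sample average; the paper's complex-sample definitions generalize this scheme. A minimal sketch of the standard formulation (the notation below is the conventional one and is not taken from the paper itself):

```latex
% Classical statistical learning setup over real random samples:
% expected risk and its empirical counterpart on l i.i.d. samples z_1, ..., z_l
R(\alpha) = \int Q(z,\alpha)\,\mathrm{d}F(z),
\qquad
R_{\mathrm{emp}}(\alpha) = \frac{1}{l}\sum_{i=1}^{l} Q(z_i,\alpha).
% The ERM principle selects the parameter \alpha minimizing R_emp(\alpha);
% uniform-convergence bounds of the kind studied here control
%   P\bigl\{ \sup_{\alpha} \lvert R(\alpha) - R_{\mathrm{emp}}(\alpha) \rvert > \varepsilon \bigr\}
% in terms of capacity measures such as the annealed entropy, the growth
% function, and the VC dimension of the function class.
```

In the complex-sample generalization described in the abstract, the loss functions are complex measurable functions, so a notion of magnitude (the paper's "primary norm") is needed before risks can be compared and minimized.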
Keywords and phrases: complex random variable, complex empirical risk minimization principle, VC dimension, bounds on the rate of uniform convergence.
Communicated by Shun-Feng Su