TY - JOUR
AB - Purpose – Bounds on the rate of convergence of learning processes based on random samples and probability are one of the essential components of statistical learning theory (SLT). The constructive distribution-independent bounds on generalization are the cornerstone of constructing support vector machines. Random sets and set-valued probability are important extensions of random variables and probability, respectively. The paper aims to address these issues. Design/methodology/approach – In this study, the bounds on the rate of convergence of learning processes based on random sets and set-valued probability are discussed. First, the Hoeffding inequality is enhanced based on random sets, and then, making use of the key theorem, the non-constructive distribution-dependent bounds of learning machines based on random sets in set-valued probability space are revisited. Second, some properties of random sets and set-valued probability are discussed. Findings – In the sequel, the concepts of the annealed entropy, the growth function, and the VC dimension of a set of random sets are presented. Finally, the paper establishes the VC dimension theory of SLT based on random sets and set-valued probability, and then develops the constructive distribution-independent bounds on the rate of uniform convergence of learning processes. It shows that such bounds are important to the analysis of the generalization abilities of learning machines. Originality/value – SLT is currently considered one of the fundamental theories of small-sample statistical learning.
VL - 40
IS - 9/10
SN - 0368-492X
DO - 10.1108/03684921111169486
UR - https://doi.org/10.1108/03684921111169486
AU - Ha, Minghu
AU - Chen, Jiqiang
AU - Pedrycz, Witold
AU - Sun, Lu
PY - 2011
Y1 - 2011/01/01
TI - Bounds on the rate of convergence of learning processes based on random sets and set-valued probability
T2 - Kybernetes
PB - Emerald Group Publishing Limited
SP - 1459
EP - 1485
Y2 - 2024/09/18
ER -