Among these approaches, the first is an alternative way to handle various kinds of complex noise, but it requires that every individual be evaluated with the same, sufficiently large sample size to obtain an approximate solution, which makes it computationally expensive. The second can theoretically guarantee that the solution of an approximation model approaches the true solution to any desired precision, but it rests on a substantial body of mathematical theory. For example, after an exhaustive study of the relation between the solutions of single-objective chance-constrained programming and those of its sample average approximation model, Luedtke et al. [21] derived a sample-size bound ensuring that any δ-optimal solution of the approximation model corresponds to an ε-optimal solution of the true problem. The third, which improves on the former two, lets sample sizes depend on the quality of individuals, but deciding the total sampling budget allocated to given individuals is difficult. The fourth, which performs best among the four, is a noise-handling approach that adaptively adjusts the sample sizes of random factors: a high-quality individual is assigned a large sample size, whereas a low-quality one receives only a small sample size. This greatly reduces computational cost and helps to find the optimal solution quickly, so it has become increasingly popular in the context of stochastic optimization.
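The fourth approach can be illustrated with a minimal sketch. The rank-based allocation rule below is only one possible realization of quality-dependent sampling, and all names (`adaptive_sample_size`, `noisy_fitness`, the bounds `n_min` and `n_max`) are hypothetical, not taken from the works surveyed here:

```python
import random

def adaptive_sample_size(quality_rank, pop_size, n_min=5, n_max=50):
    """Map an individual's quality rank (0 = best) to a sample size:
    better-ranked individuals get more noisy re-evaluations."""
    frac = 1.0 - quality_rank / max(pop_size - 1, 1)  # 1.0 for best, 0.0 for worst
    return n_min + round(frac * (n_max - n_min))

def noisy_fitness(x, sigma=0.3):
    # toy noisy objective: minimize x^2 under additive Gaussian noise
    return x * x + random.gauss(0.0, sigma)

def evaluate(population):
    # first pass: cheap fitness estimate using the minimum sample size
    rough = [sum(noisy_fitness(x) for _ in range(5)) / 5 for x in population]
    order = sorted(range(len(population)), key=lambda i: rough[i])
    # second pass: re-evaluate with rank-dependent sample sizes, so
    # promising individuals are estimated more accurately
    estimates = [0.0] * len(population)
    for rank, i in enumerate(order):
        n = adaptive_sample_size(rank, len(population))
        estimates[i] = sum(noisy_fitness(population[i]) for _ in range(n)) / n
    return estimates

random.seed(0)
pop = [random.uniform(-2, 2) for _ in range(8)]
est = evaluate(pop)
```

Under this rule the best-ranked individual receives `n_max` evaluations and the worst receives `n_min`, so most of the sampling budget is spent where accurate fitness estimates matter most.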