The bootstrap is a computer-based resampling method that can provide good approximations to the finite-sample distribution of a given statistic. This talk investigates several ways of combining the empirical bootstrap with stochastic gradient descent (SGD) for minimizing the empirical risk over a Hilbert space, from the viewpoint of algorithmic stability and statistical robustness. Two of these approaches are based on averages and are studied theoretically. A third type of bootstrap SGD is proposed to show that purely distribution-free pointwise confidence intervals and pointwise tolerance intervals for the conditional median function can be constructed.
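To make the general idea concrete, the following is a minimal sketch of empirical bootstrap SGD, not the specific procedures analyzed in the talk: SGD is rerun on bootstrap resamples of the data, and pointwise percentile intervals are formed from the resulting predictions. The linear least-squares model, step size, and interval construction here are illustrative assumptions chosen for simplicity.

```python
import numpy as np

rng = np.random.default_rng(0)

def sgd_linear(X, y, lr=0.05, epochs=100):
    # Plain SGD for least-squares linear regression (illustrative choice;
    # the talk concerns empirical risk minimization over a Hilbert space).
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            grad = (X[i] @ w - y[i]) * X[i]
            w -= lr * grad
    return w

def bootstrap_sgd_interval(X, y, x_new, B=100, alpha=0.05):
    # Empirical bootstrap: rerun SGD on B resamples drawn with replacement,
    # then form a pointwise percentile interval for the prediction at x_new.
    n = len(y)
    preds = np.empty(B)
    for b in range(B):
        idx = rng.integers(0, n, size=n)      # bootstrap resample
        w_b = sgd_linear(X[idx], y[idx])
        preds[b] = x_new @ w_b
    lo, hi = np.quantile(preds, [alpha / 2, 1 - alpha / 2])
    return lo, hi

# Toy data: y = 2 * x + noise, with an intercept column.
n = 200
X = np.column_stack([np.ones(n), rng.uniform(-1, 1, n)])
y = X @ np.array([0.0, 2.0]) + rng.normal(0, 0.1, n)
lo, hi = bootstrap_sgd_interval(X, y, np.array([1.0, 0.5]))
print(f"pointwise interval at x = 0.5: [{lo:.3f}, {hi:.3f}]")
```

A distribution-free interval for the conditional median, as in the talk, would replace the squared loss with the pinball (absolute) loss; the percentile construction over bootstrap runs stays the same.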
