I found the definition:
Bagging is using the same training algorithm for every predictor, but training each one on a
different random subset of the training set.
When sampling is performed with replacement, this method is
called bagging (short for bootstrap aggregating).
When sampling is performed without replacement, it is called pasting.
What is "replacement" in this context?
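For context, here is a minimal sketch (my own, not from the book) of how I currently picture the two sampling modes, using NumPy; the toy array and subset size are just illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
training_set = np.arange(10)  # toy training set with 10 instances

# Bagging: sample WITH replacement, so the same instance can be drawn more than once
bag_subset = rng.choice(training_set, size=7, replace=True)

# Pasting: sample WITHOUT replacement, so every drawn instance is unique
paste_subset = rng.choice(training_set, size=7, replace=False)

print("bagging subset:", bag_subset)    # duplicates are possible
print("pasting subset:", paste_subset)  # all values are distinct
```

Is that roughly the right picture, or does "replacement" mean something else here?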