In machine learning, an epoch is one complete pass of the entire training dataset through the learning algorithm. The number of epochs is a hyperparameter that controls how long the model trains. Because the full dataset is usually too large to process at once, it is typically split into smaller subsets called batches, which are fed to the model one at a time; this splitting is known as batching. An epoch is complete when every batch, and therefore every training sample, has been passed through the model once. Too few epochs of training can result in underfitting, while too many epochs can result in overfitting.
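
The epoch/batch relationship described above can be sketched as a plain Python loop. This is a minimal illustration, not a real training routine: the dataset is a toy list of 10 samples, the batch size of 4 is arbitrary, and `train_step` is a hypothetical placeholder for one gradient update.

```python
dataset = list(range(10))   # toy dataset of 10 training samples
batch_size = 4              # each batch holds up to 4 samples
num_epochs = 3              # number of full passes over the dataset

def train_step(batch):
    # Placeholder for one parameter update on a single batch;
    # here it just reports how many samples the batch contained.
    return len(batch)

for epoch in range(num_epochs):
    # One epoch: walk the dataset in batch-size chunks.
    batches = [dataset[i:i + batch_size]
               for i in range(0, len(dataset), batch_size)]
    samples_seen = sum(train_step(b) for b in batches)
    # Within one epoch, every sample is visited exactly once.
    assert samples_seen == len(dataset)
    print(f"epoch {epoch}: {len(batches)} batches, {samples_seen} samples")
```

With 10 samples and a batch size of 4, each epoch consists of 3 batches (sizes 4, 4, and 2), and running 3 epochs means the model sees every sample three times in total.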