batch
Full-Batch Training
loop through all training examples to compute each gradient update (one update per pass over the data)
Batch Training
loop through N examples at a time, one gradient update per chunk of N
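A minimal sketch of the two loops, assuming a toy NumPy linear-regression setup (X, y, w, lr, and N below are illustrative, not from the source):

import numpy as np

X = np.random.randn(1000, 5)          # toy features (illustrative)
y = np.random.randn(1000)             # toy targets (illustrative)
w = np.zeros(5)                       # linear-model weights
lr, N = 0.01, 100                     # learning rate, batch size

# full-batch: one update per pass, gradient computed over all examples
grad = X.T @ (X @ w - y) / len(X)
w -= lr * grad

# batch training: several updates per pass, gradient over N examples each
for start in range(0, len(X), N):
    Xb, yb = X[start:start + N], y[start:start + N]
    grad = Xb.T @ (Xb @ w - yb) / len(Xb)
    w -= lr * grad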
why use batches?
the full training set is often too large to fit into memory all at once
mini-batch
a minibatch is a small, randomly selected subset of the data used to compute each gradient update
minibatch size (e.g., 32, 64, or 128)
from torch.utils.data import DataLoader

dataloader = DataLoader(dataset, batch_size=10, shuffle=True)
for epoch in range(100):                  # number of epochs
    for inputs, targets in dataloader:    # iterate over mini-batches of 10
        ...                               # forward pass, loss, backward, optimizer step
DL book p.270
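A fuller, runnable sketch of the loop above, assuming toy data wrapped in a TensorDataset and a small linear model (the data, model, and hyperparameters are illustrative, not from the source):

import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# toy data: 1000 examples, 5 features (illustrative)
X = torch.randn(1000, 5)
y = torch.randn(1000, 1)
dataset = TensorDataset(X, y)
dataloader = DataLoader(dataset, batch_size=10, shuffle=True)

model = nn.Linear(5, 1)
loss_fn = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

for epoch in range(100):                  # number of epochs
    for inputs, targets in dataloader:    # iterate over mini-batches
        optimizer.zero_grad()             # clear gradients from the previous step
        loss = loss_fn(model(inputs), targets)
        loss.backward()                   # gradients from this mini-batch only
        optimizer.step()                  # one parameter update per mini-batch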
SGD vs mini-batch SGD vs GD?
SGD: 1 randomly chosen example per update
mini-batch SGD: N examples per update
(full-batch) GD: all examples per update
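With a DataLoader, all three are just different batch_size choices (a sketch, reusing the dataset assumed above):

from torch.utils.data import DataLoader

# SGD: 1 random example per update
sgd_loader = DataLoader(dataset, batch_size=1, shuffle=True)

# mini-batch SGD: N examples per update (e.g., N = 32)
minibatch_loader = DataLoader(dataset, batch_size=32, shuffle=True)

# full-batch GD: all examples per update
full_batch_loader = DataLoader(dataset, batch_size=len(dataset), shuffle=False)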