From the relevant Keras documentation:
> `steps_per_epoch`: Integer. Total number of steps (batches of samples) to yield from `generator` before declaring one epoch finished and starting the next epoch. It should typically be equal to the number of samples of your dataset divided by the batch size. Optional for `Sequence`: if unspecified, will use the `len(generator)` as a number of steps.
So your assumptions are correct.
If you want to use all of your data in each epoch, you should choose a `batch_size` and `steps_per_epoch` that multiply together to give your total number of samples.
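As a quick sanity check (with hypothetical numbers), you can compute `steps_per_epoch` directly from the dataset size and your chosen batch size:

```python
# Hypothetical dataset size and batch size, chosen so they divide evenly.
total_samples = 1000
batch_size = 25

# batch_size * steps_per_epoch == total_samples, so every sample
# is seen exactly once per epoch.
steps_per_epoch = total_samples // batch_size
print(steps_per_epoch)  # 40
```

If `total_samples` is not evenly divisible by `batch_size`, integer division silently drops the final partial batch, so you may want `math.ceil` instead.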
Usually your resources will decide this for you: if memory is an issue, you have to reduce `batch_size` until a batch fits on your GPU, for example.
In your case, I would probably set `batch_size` to the desired amount, then let Keras work out `steps_per_epoch` for you! Only change it if you really want the model to not use all the data in each epoch (which actually bends the definition of the word "epoch").
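To illustrate what "letting Keras work it out" means, here is a pure-Python sketch (no TensorFlow import; `ToySequence` is a made-up stand-in) of the role `__len__` plays on a `keras.utils.Sequence`: when `steps_per_epoch` is unspecified, Keras uses `len(generator)` as the number of steps per epoch.

```python
import math

class ToySequence:
    """Toy stand-in for a keras.utils.Sequence subclass."""

    def __init__(self, n_samples, batch_size):
        self.n_samples = n_samples
        self.batch_size = batch_size

    def __len__(self):
        # Number of batches per epoch; ceil so a final partial
        # batch still counts as one step.
        return math.ceil(self.n_samples / self.batch_size)

seq = ToySequence(n_samples=1003, batch_size=25)
print(len(seq))  # Keras would use this as steps_per_epoch -> 41
```

So as long as your `Sequence`'s `__len__` covers the whole dataset, every epoch sees all of your data without you setting `steps_per_epoch` by hand.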