Thanks to the pretrained GPT-2 model, it is now possible to generate meaningful sequences of words, with or without a prefix. However, a sentence should end with proper ending punctuation (., !, or ?). I am wondering how to generate a sentence (with a proper ending) of length N.
One possible approach is post-processing: generate many sequences and keep only the ones that serve the purpose. However, that could be a daunting thing to use in any pipeline.
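To make the post-processing idea concrete, here is a minimal sketch of the filtering step. The helper `filter_candidates` is hypothetical, and it assumes length is counted in whitespace-separated words (one could count tokens instead); the candidates themselves would come from something like `model.generate(..., num_return_sequences=K)` in the `transformers` library.

```python
def filter_candidates(candidates, n_words):
    """Keep only candidates that are exactly n_words long
    and end with sentence-final punctuation (., !, or ?)."""
    kept = []
    for text in candidates:
        stripped = text.rstrip()
        if len(stripped.split()) == n_words and stripped.endswith((".", "!", "?")):
            kept.append(text)
    return kept

# Toy candidates standing in for sampled GPT-2 outputs:
candidates = [
    "The cat sat on the mat.",   # 6 words, proper ending -> kept
    "The cat sat on the",        # no proper ending -> dropped
    "A very long sentence that keeps going on and on.",  # wrong length -> dropped
]
print(filter_candidates(candidates, 6))
```

As the post notes, the drawback is throwing away most samples; the filter only becomes cheap if the sampler is already biased toward the right length.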
Is there any suggestion, perhaps a secondary algorithm, for tuning the hyper-parameters so that the model produces sentences of the desired length with higher probability?