Text Synth
Text completion using the GPT-2 language model, a neural network with up to 1.5 billion parameters. Type some text and let the neural network complete it. Each try returns a different, randomly chosen completion. The same model can also be used to compress text messages.
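The compression idea can be sketched as follows, under the assumption of a generic arithmetic-coding setup (the function names and the toy uniform model below are hypothetical illustrations, not Text Synth's actual code): a language model that assigns probability p to the next token lets an arithmetic coder encode that token in about -log2 p bits, so the better the model predicts the text, the shorter the compressed message.

    import math

    def ideal_compressed_bits(tokens, prob_of_next):
        """Sum of -log2 p(token | context) over the message: the size in
        bits an ideal arithmetic coder driven by the model would need."""
        total = 0.0
        for i, tok in enumerate(tokens):
            p = prob_of_next(tokens[:i], tok)  # model's probability for tok
            total += -math.log2(p)
        return total

    # Toy stand-in for a language model: a uniform distribution over a
    # 4-symbol alphabet (a real system would query GPT-2 here instead).
    def uniform_model(context, token):
        return 1.0 / 4

    msg = list("abca")
    print(ideal_compressed_bits(msg, uniform_model))  # 8.0 bits (2 bits/symbol)

With a uniform model every symbol costs 2 bits; a model that predicts the text well assigns higher probabilities and drives the cost per token well below that.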
The completion is controlled by the following parameters:

Model: the GPT-2 model size to use (the largest has 1.5 billion parameters).
top-k: sample only among the k most probable next tokens.
top-p: nucleus sampling; sample among the smallest set of tokens whose cumulative probability reaches p.
temperature: scales the logits before the softmax; lower values make the output more predictable.
seed: random seed for sampling; reusing the same seed reproduces the same completion.
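As a rough illustration of how these parameters interact during one decoding step, here is a minimal Python sketch (the function sample_token and its defaults are assumptions for illustration, not Text Synth's actual implementation):

    import numpy as np

    def sample_token(logits, top_k=40, top_p=0.9, temperature=1.0, seed=None):
        rng = np.random.default_rng(seed)      # fixed seed -> reproducible completion
        logits = np.asarray(logits, dtype=np.float64) / temperature  # temperature scaling
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()                   # softmax over the vocabulary

        order = np.argsort(probs)[::-1]        # tokens by decreasing probability
        order = order[:top_k]                  # top-k: keep the k most likely tokens

        cumulative = np.cumsum(probs[order])   # top-p: smallest prefix covering p mass
        cutoff = np.searchsorted(cumulative, top_p) + 1
        order = order[:cutoff]

        kept = probs[order] / probs[order].sum()  # renormalize the surviving tokens
        return int(rng.choice(order, p=kept))

    # Example with a tiny 5-token vocabulary:
    print(sample_token([2.0, 1.0, 0.5, 0.1, -1.0], top_k=3, top_p=0.9, seed=42))

Applying the top-k cut before the top-p cut, as here, is one common ordering; either filter alone already removes the low-probability tail that tends to produce incoherent completions.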
© 2019-2020 Fabrice Bellard