
Add GPT-2 text generation example #228

Merged
robertknight merged 2 commits into main from gpt2-demo on Jun 4, 2024
Conversation

robertknight
Owner

Add a text generation / completion example using GPT-2. This also includes the start of some utilities to implement the generation loop for auto-regressive transformer models.

The initial version uses simple greedy sampling from the output logits. As a result, completions are prone to being repetitive. I plan to replace this with top-K sampling soon.
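
For context, the generation loop boils down to something like the sketch below. This is only a rough illustration of greedy decoding; the `Model` trait and its `next_token_logits` method are placeholders for the actual model invocation and are not part of the rten API.

```rust
// Minimal sketch of a greedy decoding loop for an auto-regressive model.
// `Model` and `next_token_logits` are illustrative stand-ins, not the
// actual rten / rten-examples API.
trait Model {
    /// Return logits over the vocabulary for the next token, given the
    /// token ids generated so far.
    fn next_token_logits(&self, tokens: &[u32]) -> Vec<f32>;
}

/// Extend `prompt` by up to `max_new_tokens` tokens, stopping early if the
/// model emits `eos_token`.
fn generate_greedy(
    model: &dyn Model,
    prompt: &[u32],
    max_new_tokens: usize,
    eos_token: u32,
) -> Vec<u32> {
    let mut tokens = prompt.to_vec();
    for _ in 0..max_new_tokens {
        let logits = model.next_token_logits(&tokens);

        // Greedy sampling: always pick the highest-scoring token. This is
        // deterministic, which is why completions tend to become repetitive.
        let (next, _) = logits
            .iter()
            .copied()
            .enumerate()
            .max_by(|(_, a), (_, b)| a.total_cmp(b))
            .expect("logits should be non-empty");
        let next = next as u32;

        if next == eos_token {
            break;
        }
        tokens.push(next);
    }
    tokens
}
```

Top-K sampling would replace the `max_by` step with a random draw from the K highest-scoring tokens, which breaks the repetition seen with greedy decoding.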

Example usage:

$ cargo run -r -p rten-examples --bin gpt2 gpt2_medium_onnx/model.rten gpt2_medium_onnx/tokenizer.json "Once fine day, in the middle of"

Output:

Once fine day, in the middle of the night, I was sitting in my room, watching TV, when I heard a knock on the door. I opened it and there was a man

robertknight merged commit 07b62d4 into main on Jun 4, 2024
2 checks passed
robertknight deleted the gpt2-demo branch on Jun 4, 2024 at 06:32