diff --git a/lit_nlp/examples/blank_slate_demo.py b/lit_nlp/examples/blank_slate_demo.py
index a4c66c4d..c82ffde1 100644
--- a/lit_nlp/examples/blank_slate_demo.py
+++ b/lit_nlp/examples/blank_slate_demo.py
@@ -6,12 +6,6 @@
 - classification model on MultiNLI, with the MultiNLI dataset.
 - TensorFlow Keras model for penguin classification, with the Penguin tabular
   dataset from TFDS.
-- T5 models using HuggingFace Transformers and Keras, with the English CNNDM
-  summarization dataset and the WMT '14 machine-translation dataset.
-- BERT (bert-*) as a masked language model and GPT-2 (gpt2* or distilgpt2) as a
-  left-to-right language model, with the Stanford Sentiment Treebank dataset,
-  the IMDB reviews dataset, Billion Word Benchmark (lm1b) dataset and the option
-  to load sentences from a flat text file.
 
 To run:
   python -m lit_nlp.examples.blank_slate_demo --port=5432
diff --git a/lit_nlp/examples/prompt_debugging/transformers_lms.py b/lit_nlp/examples/prompt_debugging/transformers_lms.py
index 1ab121f6..6ba20581 100644
--- a/lit_nlp/examples/prompt_debugging/transformers_lms.py
+++ b/lit_nlp/examples/prompt_debugging/transformers_lms.py
@@ -1,6 +1,6 @@
 """Wrapper for HuggingFace models in LIT.
 
-Includes BERT masked LM, GPT-2, and T5.
+Supported models include Gemma, GPT-2, Llama, Mistral, etc.
 
 This wrapper loads a model into memory and implements the a number of helper
 functions to predict a batch of examples and extract information such as