Rework model docstrings for progressive disclosure of complexity #867
Also, we are getting a lot of contributor interest right now! Let's all try to...
Thanks!
I can take this up for RoBERTa. Thanks.
Thanks for this issue & guideline. I want to work with the f_net model.
@mattdangerw I'd like to work with the Distil_bert model. Thanks.
@mattdangerw I am working on albert.
I shall take this up for
I am working with GPT2.
@mattdangerw I'd like to take up opt.
I will take XLMRoberta instead of GPT-2 for now, as it is not exported to the keras_nlp library, which I need in Colab to test docstring snippets. I will continue my work on GPT-2 after installing dependencies on my local machine soon.
I am taking up t5.
I am taking this up for
@mattdangerw & @chenmoneygithub RoBERTa is still pending.
Yeah, sorry, this issue might keep getting marked as closed due to things like
Also thanks everyone for the work on this issue! |
We need to simplify our model docstrings so they are more easily understandable, and we can start by porting the changes in #843 to all other models.
Let's do this one model at a time, to keep things granular.
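To illustrate the target style, here is a minimal sketch of a docstring organized for progressive disclosure: a one-line summary and the simplest usage example come first, with advanced options deferred to later sections. The class and method names below are illustrative stand-ins, not the actual KerasNLP API; see #843 for the real reference change.

```python
# Hypothetical sketch of the progressive-disclosure docstring style.
# `ExampleClassifier` and its methods are placeholders, not KerasNLP APIs.

class ExampleClassifier:
    """An end-to-end text classifier.

    The one-line summary and the simplest workflow appear first, so a
    new reader sees the easy path before any advanced configuration.

    Examples:

    Basic usage with raw strings:

        classifier = ExampleClassifier.from_preset("example_base")
        classifier.predict(["What an amazing movie!"])

    Advanced usage (custom preprocessing, low-level configuration) is
    documented last, disclosing complexity progressively.
    """

    @classmethod
    def from_preset(cls, preset):
        # Placeholder standing in for real preset/weight loading.
        return cls()

    def predict(self, inputs):
        # Placeholder returning one dummy score per input string.
        return [0.0 for _ in inputs]


classifier = ExampleClassifier.from_preset("example_base")
print(classifier.predict(["What an amazing movie!"]))  # one score per input
```

The key design choice is ordering: every section a beginner needs sits above every section only an expert needs, so the docstring reads top-to-bottom in increasing complexity.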
Checklist: