Rework model docstrings for progressive disclosure of complexity #867

Closed
3 tasks
mattdangerw opened this issue Mar 17, 2023 · 14 comments · Fixed by #881, #893, #903 or #996
Labels
stat:contributions welcome

Comments

@mattdangerw
Member

We need to simplify our model docstrings so they are more easily understandable. We can start by porting the changes in #843 to all other models.

Let's do this one model at a time, to keep things granular.

Checklist:

  • Make sure to update any "custom vocabulary" examples to match the model's actual vocabulary type and special token requirements (these vary per model).
  • Test out all docstring snippets! (A rough sketch of the target docstring shape is included after this checklist.)
  • Make sure to follow our code style guidelines regarding indentation, etc.
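For anyone picking up a model, here is a rough sketch of the "progressive disclosure" ordering that #843 established: the simplest, highest-level usage comes first, followed by snippets that expose progressively more control. The preset name and constructor arguments below are only illustrative and will differ per model, so double-check them against the model you are updating.

```python
import keras_nlp

# Highest level first: a task model from a preset, ready to run on raw strings.
classifier = keras_nlp.models.BertClassifier.from_preset(
    "bert_base_en_uncased",
    num_classes=2,
)
classifier.predict(["The quick brown fox jumped."])

# Then progressively lower-level usage, e.g. skipping the built-in
# preprocessing so callers can supply pre-tokenized inputs themselves.
classifier = keras_nlp.models.BertClassifier.from_preset(
    "bert_base_en_uncased",
    num_classes=2,
    preprocessor=None,
)
```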
@mattdangerw mattdangerw added the stat:contributions welcome label on Mar 17, 2023
@mattdangerw
Member Author

mattdangerw commented Mar 17, 2023

Also, we are getting a lot of contributor interest right now! Let's all try to:

  • Use this issue to communicate.
  • Only work on a single model at a time, e.g. wait until you have a PR ready for one model before starting on the next.
  • Be patient with each other as we all self-organize here :)

Thanks!

@mattdangerw mattdangerw changed the title Rework model docstring for progressive disclosure of complexity Rework model docstrings for progressive disclosure of complexity Mar 17, 2023
@shivance
Collaborator

I can take this up for RoBERTa. Thanks!

@ADITYADAS1999
Contributor

Thanks for this issue & the guidelines. I'd like to work on the f_net model.

@Cyber-Machine
Contributor

@mattdangerw I'd like to work on the distil_bert model. Thanks!

@soma2000-lang
Contributor

soma2000-lang commented Mar 18, 2023

@mattdangerw I am working on albert

@TheAthleticCoder
Contributor

TheAthleticCoder commented Mar 18, 2023

I shall take this up for deberta_v3 then

@abuelnasr0
Contributor

abuelnasr0 commented Mar 18, 2023

I am working on GPT-2.

@Warlord-K
Contributor

@mattdangerw I'd like to take up opt.

@abuelnasr0
Contributor

abuelnasr0 commented Mar 19, 2023

I will take XLMRoberta instead of GPT-2 for now, since GPT-2 is not yet exported to the keras_nlp library, so its docstring snippets can't be tested in Colab. I will continue my work on GPT-2 once I have the dependencies installed on my local machine.

@soma2000-lang
Contributor

soma2000-lang commented Mar 20, 2023

I am taking up t5

@susnato
Contributor

susnato commented Mar 20, 2023

I am taking this up for Whisper.

@shivance
Collaborator

@mattdangerw & @chenmoneygithub RoBERTa is still pending.
@abuelnasr0 is a new contributor interested in contributing to KerasNLP. Since this is mostly a good first issue, it could be a good start for him.
@abuelnasr0, you can start working on RoBERTa for this issue. Good luck!

@mattdangerw
Member Author

Yeah, sorry, this issue might keep getting marked as closed due to things like "Partially fixes #867" in PR descriptions. Will keep reopening it for now.

@mattdangerw
Member Author

Also thanks everyone for the work on this issue!
