some accelerator enhancements #366

Closed
ZhiyuanChen opened this issue May 14, 2022 · 12 comments

Labels
feature request Request for a new feature to be added to Accelerate

Comments

@ZhiyuanChen
Contributor

ZhiyuanChen commented May 14, 2022

Hi,

My main code base has been based on accelerate for a while now, and it works great! Thank you guys so much!

There are a few things that I think are common needs.

The first is a .distributed attribute, which is basically .num_processes > 1. I mean, it's not a lot of work, but .... I'm an extremely lazy person.

def distributed(self):
    """
    Whether we are running in distributed mode (i.e. with more than one process).
    """
    return self.num_processes > 1
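
For context, this is the kind of call site where I keep re-deriving it by hand; the snippet below is just illustrative, assuming accelerator is an Accelerator instance and dataset is whatever torch Dataset the run uses:

from torch.utils.data import DistributedSampler, RandomSampler

# Pick a sampler based on whether the run is actually distributed.
if accelerator.distributed:
    sampler = DistributedSampler(dataset)
else:
    sampler = RandomSampler(dataset)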

Besides, these two decorators are often used in my code when saving models and downloading datasets; I think they could be useful.

from functools import wraps


def on_main_process(func):
    """
    Run func only on the main process (or always, when not in distributed mode).
    """
    @wraps(func)
    def wrapper(self, *args, **kwargs):
        if self.is_main_process or not self.distributed:
            return func(self, *args, **kwargs)
    return wrapper


def on_local_main_process(func):
    """
    Run func only on the local main process (or always, when not in distributed mode).
    """
    @wraps(func)
    def wrapper(self, *args, **kwargs):
        if self.is_local_main_process or not self.distributed:
            return func(self, *args, **kwargs)
    return wrapper
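
For illustration, here is roughly how I end up using them; the subclass and method names below are just placeholders from my own code, and they assume the proposed .distributed property exists on the accelerator:

import torch
from accelerate import Accelerator

class MyAccelerator(Accelerator):
    @on_main_process
    def save_model(self, model, path):
        # Only the global main process writes the checkpoint to shared storage.
        torch.save(self.unwrap_model(model).state_dict(), path)

    @on_local_main_process
    def download_dataset(self, url, path):
        # One process per node downloads, so every node gets its own local copy.
        torch.hub.download_url_to_file(url, path)
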
@muellerzr
Collaborator

Nice idea! @sgugger WDYT of instead having this wrap around AcceleratorState directly, rather than having this be attached to Accelerator?

@sgugger
Collaborator

sgugger commented May 16, 2022

I don't mind having wrappers on the Accelerator for those. As long as the naming is clear, it's not a bad thing to have multiple properties/methods/context managers for different use cases :-)

@ZhiyuanChen
Contributor Author

ZhiyuanChen commented May 17, 2022

I do think distributed should be in AcceleratorState, but for the decorators, I think it's better to put them in the Accelerator.

In this case, should distributed be a property or a variable?

A property gives better support for elastic training, while a variable is more consistent with the existing design.

@sgugger
Collaborator

sgugger commented May 17, 2022

There is no way to set it and make it work (you can't suddenly switch the training to distributed mode), so it should be a property.

@ZhiyuanChen
Contributor Author

> There is no way to set it and make it work (you can't suddenly switch the training to distributed mode), so it should be a property.

Ohh, by property I meant an @property-decorated function, and by variable I meant a plain value. So I assume a variable is better.

Though since PyTorch now supports elastic training, maybe other values (e.g. num_processes) should be turned into properties as well?

@sgugger
Collaborator

sgugger commented May 19, 2022

I understood you properly, and I still stand by the fact that it should be a property, decorated with @property, and not a variable.
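
Roughly, something along these lines on the Accelerator; the wiring through AcceleratorState below is only a sketch, not the actual implementation:

from accelerate.state import AcceleratorState

class Accelerator:
    def __init__(self, **kwargs):
        self.state = AcceleratorState()

    @property
    def num_processes(self):
        # Read through to the state on every access instead of caching a value,
        # so it stays correct if the world size changes (e.g. elastic training).
        return self.state.num_processes

    @property
    def distributed(self):
        # Derived on access, never stored, so it cannot go stale.
        return self.num_processes > 1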

ZhiyuanChen added a commit to ZhiyuanChen/accelerate that referenced this issue May 20, 2022
@ZhiyuanChen
Contributor Author

Should we make other variables (e.g. num_processes) properties too?

@muellerzr
Collaborator

@ZhiyuanChen I'd say it couldn't hurt I think!

@ZhiyuanChen
Contributor Author

ZhiyuanChen commented May 30, 2022

Hi, sorry for this late reply.... but do you mean that changing them doesn't hurt, or that not changing them doesn't hurt... @muellerzr?

@muellerzr
Collaborator

Changing it! 😄

@ZhiyuanChen
Contributor Author

> Changing it! 😄

Haha will do

muellerzr added the feature request label Jun 4, 2022
ZhiyuanChen added a commit to ZhiyuanChen/accelerate that referenced this issue Jul 5, 2022
@ZhiyuanChen
Contributor Author

I believe everything has been merged now, so I'm gonna close this issue.

Thanks again for this great work!
