Improve setup guide #31
Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
Force-pushed from 5192cab to 1a26031 (Compare)
LGTM!
setup.py (Outdated)

    "accelerate>=0.26.0",
    "datasets",
Why do we need accelerate and datasets as required dependencies?
@echarlaix Yeah, they're required when running certain models, as reported by users in #29 (comment).
I want to simplify the UX so that the most common deps are installed by default.
I'd be in favor of keeping the number of required dependencies as low as possible (only what's necessary). Could you expand on why datasets needs to be added? Also, I don't think accelerate is mandatory: we could, for example, check whether accelerate is available and set low_cpu_mem_usage accordingly when loading the model (https://github.com/huggingface/transformers/blob/v4.49.0/src/transformers/modeling_utils.py#L3612). Wdyt?
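The suggested approach can be sketched roughly as follows, assuming a plain import-availability check (the helper and kwargs-builder names here are hypothetical; transformers performs a similar check internally at the linked line):

```python
# Sketch of the suggestion above: treat `accelerate` as optional and only
# request low_cpu_mem_usage when it is actually installed.
import importlib.util


def is_accelerate_available() -> bool:
    # find_spec returns None when the package cannot be imported.
    return importlib.util.find_spec("accelerate") is not None


def model_loading_kwargs() -> dict:
    # low_cpu_mem_usage in transformers relies on accelerate, so only
    # enable it when the dependency is present.
    if is_accelerate_available():
        return {"low_cpu_mem_usage": True}
    return {}
```

The kwargs would then be spread into the `from_pretrained` call, so a missing accelerate degrades gracefully instead of raising an ImportError at load time.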
I don't want to block this PR, so I reverted it in 7768d35. Would you mind opening a new PR for this, so we can merge this one asap and continue the discussion there?
Also cc @michaelbenayoun, who will likely review the second PR.