
[FEATURE REQUEST] Can Merlion use GPU to train models? #76

Open
adminblackjack opened this issue Feb 27, 2022 · 2 comments

Comments


adminblackjack commented Feb 27, 2022

Can Merlion use GPU to train models?

aadyotb (Contributor) commented Mar 29, 2022

Sorry for the delayed response. Currently, this is on a per-model basis. Some of the deep learning models have a cuda / use_cuda / gpu option which allows them to run on GPU. However, it's currently not done in a unified way. We do plan to make this interface more consistent in the future.
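To illustrate the per-model pattern described above: a model-specific flag like `cuda` / `use_cuda` / `gpu` typically just selects a device string, falling back to CPU when no GPU is available. The sketch below is a generic, hypothetical illustration of that pattern, not Merlion's actual API (the exact option name and behavior vary by model, as noted).

```python
# Hypothetical sketch of a per-model GPU flag (not Merlion's actual API).
# Deep learning models that accept a cuda/use_cuda/gpu option generally
# resolve it to a device like this before moving tensors/weights.

def resolve_device(use_cuda: bool, cuda_available: bool) -> str:
    """Pick a device string from a user flag and hardware availability."""
    if use_cuda and cuda_available:
        return "cuda"
    # Fall back to CPU when the user opts out or no GPU is present.
    return "cpu"

# With a GPU present the flag is honored; otherwise we fall back to CPU.
print(resolve_device(use_cuda=True, cuda_available=True))   # cuda
print(resolve_device(use_cuda=True, cuda_available=False))  # cpu
print(resolve_device(use_cuda=False, cuda_available=True))  # cpu
```

In a PyTorch-backed model, `cuda_available` would typically come from `torch.cuda.is_available()`, and the resolved device would be passed to `model.to(device)`.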

adminblackjack (Author) commented
> Sorry for the delayed response. Currently, this is on a per-model basis. Some of the deep learning models have a cuda / use_cuda / gpu option which allows them to run on GPU. However, it's currently not done in a unified way. We do plan to make this interface more consistent in the future.

Thank you! I'm looking forward to it!
