Distributed Training: Train BART/T5 for Summarization using πŸ€— Transformers and Amazon SageMaker
In case you missed it: on March 25th we announced a collaboration with Amazon SageMaker to make it easier to create State-of-the-Art Machine Learning models, and ship cutting-edge NLP features faster.

Together with the SageMaker team, we built πŸ€— Transformers-optimized Deep Learning Containers to accelerate the training of Transformers-based models. Thanks, AWS friends! πŸ€— πŸš€

With the new HuggingFace estimator in the SageMaker Python SDK, you can start training with a single line of code.
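To give a concrete sense of what this looks like, here is a minimal sketch of launching a training job with the HuggingFace estimator. The entry-point script, source directory, IAM role, instance type, framework versions, and hyperparameters below are placeholders you would adapt to your own setup.

```python
from sagemaker.huggingface import HuggingFace

# Hyperparameters are passed to your training script as command-line arguments
hyperparameters = {
    "epochs": 3,
    "model_name_or_path": "facebook/bart-large-cnn",  # placeholder model
}

# Estimator configuration -- role, versions, and script names are illustrative
huggingface_estimator = HuggingFace(
    entry_point="train.py",            # your training script
    source_dir="./scripts",            # directory containing the script
    instance_type="ml.p3.2xlarge",     # example GPU instance
    instance_count=1,
    role="<your-sagemaker-execution-role>",
    transformers_version="4.6",        # assumed; use a supported version combination
    pytorch_version="1.7",
    py_version="py36",
    hyperparameters=hyperparameters,
)

# The "single line of code" that starts training on SageMaker
huggingface_estimator.fit({"train": "s3://<your-bucket>/train"})
```

Calling `fit()` uploads your script, provisions the requested instances with the πŸ€— Transformers Deep Learning Container, and runs the training job with the data channels you pass in.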