🤗 Transformers provides APIs to quickly download and use pretrained models on a given text, fine-tune them on your own datasets, and then share them with the community on our model hub. You can test most of our models directly on their pages from the model hub. Beyond text, these models cover audio, for tasks like speech recognition and audio classification, and can also perform tasks on several modalities combined, such as table question answering, optical character recognition, information extraction from scanned documents, video classification, and visual question answering.

🤗 Transformers is backed by the three most popular deep learning libraries, Jax, PyTorch and TensorFlow, with seamless integration between them. It's straightforward to train your models with one before loading them for inference with the other. At the same time, each Python module defining an architecture is fully standalone and can be modified to enable quick research experiments.
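As a minimal sketch of downloading and using a pretrained model on a given text, the `pipeline` API is the simplest entry point (this assumes `transformers` and a backend such as PyTorch are installed; the first call downloads a default model from the model hub):

```python
from transformers import pipeline

# Build a sentiment-analysis pipeline; on first use this downloads a
# default pretrained model and tokenizer from the model hub.
classifier = pipeline("sentiment-analysis")

# Run inference on a given text; the result is a list of dicts,
# each with a predicted 'label' and a confidence 'score'.
result = classifier("We are very happy to show you the Transformers library.")
print(result)
```

The same `pipeline` function accepts other task names (for example `"automatic-speech-recognition"` for audio), and a specific model from the hub can be selected with the `model` argument instead of relying on the task's default.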