Instructions for using datasetsANDmodels/purpose-extraction with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use datasetsANDmodels/purpose-extraction with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("datasetsANDmodels/purpose-extraction")
model = AutoModelForSeq2SeqLM.from_pretrained("datasetsANDmodels/purpose-extraction")
```
- Notebooks
- Google Colab
- Kaggle
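
Once the tokenizer and model are loaded, you can run generation directly instead of going through a pipeline. A minimal sketch, assuming the standard Hugging Face seq2seq interface; the helper name `extract_purpose` and the `max_new_tokens` default are ours, not part of the model card:

```python
def extract_purpose(tokenizer, model, text, max_new_tokens=32):
    """Encode `text`, generate with the seq2seq model, and decode the result."""
    # Tokenize into PyTorch tensors expected by `generate`
    inputs = tokenizer(text, return_tensors="pt")
    # Generate the purpose label token ids
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode the first (and only) sequence back to text
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

# Real usage (downloads the checkpoint on first call):
# from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
# tokenizer = AutoTokenizer.from_pretrained("datasetsANDmodels/purpose-extraction")
# model = AutoModelForSeq2SeqLM.from_pretrained("datasetsANDmodels/purpose-extraction")
# print(extract_purpose(tokenizer, model, "I am calling to invite you to the meeting."))
```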
Example with the pipeline API:

```python
from transformers import pipeline

extractor = pipeline("text2text-generation", model="datasetsANDmodels/purpose-extraction")
intent = "I am calling to invite you to the meeting."
label = extractor(intent)[0]["generated_text"]
if label == "":
    label = "No purpose detected"
print(label)
```
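
The snippet above labels a single utterance. A small sketch of the same empty-output fallback applied to a batch; the helper name `label_purposes` is ours, and `extractor` can be the real pipeline (which accepts a list of inputs) or any callable with the same return shape:

```python
def label_purposes(extractor, texts, fallback="No purpose detected"):
    """Run a text2text pipeline over several utterances, replacing empty
    generations with a fallback label."""
    results = extractor(texts)
    # An empty generated_text means no purpose was detected
    return [r["generated_text"] or fallback for r in results]

# Real usage:
# from transformers import pipeline
# extractor = pipeline("text2text-generation", model="datasetsANDmodels/purpose-extraction")
# print(label_purposes(extractor, ["I am calling to invite you to the meeting.", "Uh, hello?"]))
```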