Instructions to use WebOrganizer/TopicClassifier with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
  - Transformers
- Notebooks
  - Google Colab
  - Kaggle

How to use WebOrganizer/TopicClassifier with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="WebOrganizer/TopicClassifier", trust_remote_code=True)

# Load model directly
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained("WebOrganizer/TopicClassifier", trust_remote_code=True, dtype="auto")
```
[bugfix] Initialize attention bias on the same device as Query/Key/Value
#1
by kenneth-doh - opened
The attention bias passed to xformers is currently initialized on the default device rather than on the device of the Q/K/V tensors.
In a multi-GPU environment, this causes the following error:
```
Error: Attention bias and Query/Key/Value should be on the same device
query.device: cuda:6
attn_bias : cuda:0
```
This PR resolves the above error by initializing the attention bias on the same device as the Query/Key/Value tensors.
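As a minimal sketch of the pattern this fix applies (the function name and shapes below are illustrative, not the model's actual code), the bias tensor should be allocated on the Q/K/V tensors' device instead of the default device:

```python
import torch


def build_attn_bias(query: torch.Tensor, seq_len: int) -> torch.Tensor:
    """Illustrative helper: allocate an additive attention bias on the
    same device (and dtype) as the query tensor, so xformers does not
    raise a device-mismatch error in multi-GPU setups."""
    # Before the fix: torch.zeros(seq_len, seq_len) lands on the default
    # device (e.g. cuda:0), mismatching a query that lives on cuda:6.
    return torch.zeros(seq_len, seq_len, device=query.device, dtype=query.dtype)
```

With this pattern, tensor-parallel or `device_map`-sharded runs keep the bias co-located with the attention inputs regardless of which GPU a given layer is placed on.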
Note: the same error occurred in vLLM and was resolved by the following PR:
https://github.com/vllm-project/vllm/pull/13468