Instructions to use jfkback/hypencoder.4_layer with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use jfkback/hypencoder.4_layer with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="jfkback/hypencoder.4_layer")
```

```python
# Load model directly
from transformers import HypencoderDualEncoder

model = HypencoderDualEncoder.from_pretrained("jfkback/hypencoder.4_layer", dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle
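The feature-extraction pipeline above returns one embedding per input token, so a downstream application typically pools them into a single vector before comparing texts. As a minimal sketch of that post-processing step (the arrays here are fabricated stand-ins for the pipeline's output, not real model embeddings):

```python
import numpy as np

# Hypothetical embeddings shaped like a feature-extraction pipeline's output:
# one (num_tokens, hidden_size) array per input text. Random values are used
# purely to illustrate the pooling and scoring steps.
rng = np.random.default_rng(0)
query_tokens = rng.normal(size=(5, 8))   # stands in for pipe("a query")[0]
doc_tokens = rng.normal(size=(12, 8))    # stands in for pipe("a document")[0]

def mean_pool(token_embeddings: np.ndarray) -> np.ndarray:
    """Average token embeddings into a single fixed-size vector."""
    return token_embeddings.mean(axis=0)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two pooled vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

score = cosine_similarity(mean_pool(query_tokens), mean_pool(doc_tokens))
print(score)  # a value in [-1, 1]
```

Note that Hypencoder's intended retrieval scoring uses the model's own dual-encoder machinery; mean pooling plus cosine similarity is only a generic illustration of how raw token embeddings are usually reduced to a score.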
Add license and pipeline tag (#1)
opened by nielsr (HF Staff)
README.md CHANGED

```diff
@@ -1,11 +1,13 @@
 ---
-
+base_model:
+- google-bert/bert-base-uncased
 datasets:
 - microsoft/ms_marco
 language:
 - en
-
-
+library_name: transformers
+license: apache-2.0
+pipeline_tag: feature-extraction
 ---
 
 # Model Card
```
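For reference, the complete metadata block that results from this change (reconstructed directly from the diff above, with no fields beyond those shown) would be:

```yaml
---
base_model:
- google-bert/bert-base-uncased
datasets:
- microsoft/ms_marco
language:
- en
library_name: transformers
license: apache-2.0
pipeline_tag: feature-extraction
---
```

The added `library_name`, `license`, and `pipeline_tag` fields are what enable the Hub's "Use this model" widget, license badge, and task filtering for this repository.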