Added short README
README.md CHANGED
@@ -1,3 +1,9 @@
 ---
 license: apache-2.0
 ---
+
+# switch-large-128_qmoe
+
+This is the [google/switch-large-128](https://huggingface.co/google/switch-large-128) model quantized with the QMoE framework to ternary precision and stored in the custom further compressed QMoE format.
+
+Please see the [QMoE repository](https://github.com/IST-DASLab/qmoe) for how to use this model.
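As a complement to the pointer above, here is a minimal, hypothetical sketch of fetching this compressed checkpoint from the Hub along with the unchanged base tokenizer. The repo id `IST-DASLab/switch-large-128_qmoe` is an assumption, and actual decompression and ternary inference require the custom code in the QMoE repository, which is not reproduced here.

```python
# Minimal sketch: download the QMoE-compressed checkpoint files and the
# tokenizer of the base model. The Hub repo id below is an assumption; the
# decompression and ternary inference kernels live in the QMoE repository
# (https://github.com/IST-DASLab/qmoe) and are not shown here.
from huggingface_hub import snapshot_download
from transformers import AutoTokenizer

# Assumed Hub repo id for this model card.
checkpoint_dir = snapshot_download(repo_id="IST-DASLab/switch-large-128_qmoe")

# The tokenizer is unchanged from the original Switch Transformer.
tokenizer = AutoTokenizer.from_pretrained("google/switch-large-128")

print("Compressed QMoE checkpoint downloaded to:", checkpoint_dir)
# From here, follow the QMoE repository instructions to load and run the
# ternary-precision model from its custom compressed storage format.
```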