MLX export of LFM2.5-1.2B-JP for Apple Silicon inference.
LFM2.5-JP is a Japanese language model based on the LFM2.5 hybrid architecture, optimized for Japanese text generation and completion tasks.
Model specifications:

| Property | Value |
|---|---|
| Parameters | 1.2B |
| Precision | 8-bit |
| Group Size | 64 |
| Context Length | 128K |
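A back-of-envelope memory estimate follows from the table above: 8-bit quantization stores roughly one byte per weight, plus a small per-group overhead for quantization scales. The parameter count and group size come from the table; the per-group overhead figure below is an assumption, since the exact layout depends on the MLX quantization format.

```python
# Rough memory estimate for the 8-bit quantized weights.
params = 1.2e9          # parameter count from the table above
bytes_per_param = 1.0   # 8-bit quantization ≈ 1 byte per weight

# Each group of 64 weights also stores quantization scale/bias values.
# The 4-byte figure is an assumption (e.g. two float16 values per group).
group_size = 64
overhead_per_group_bytes = 4

weight_bytes = params * bytes_per_param
overhead_bytes = (params / group_size) * overhead_per_group_bytes
total_gib = (weight_bytes + overhead_bytes) / 2**30
print(f"~{total_gib:.1f} GiB of weights")  # roughly 1.2 GiB
```

This is why the 8-bit export fits comfortably in unified memory on any Apple Silicon Mac.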
Recommended generation parameters:

| Parameter | Value |
|---|---|
| temperature | 0.3 |
| min_p | 0.15 |
| repetition_penalty | 1.05 |
| max_tokens | 512 |
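The `min_p` setting filters out tokens whose probability falls below `min_p` times the probability of the most likely token, so at `min_p = 0.15` only tokens within 15% of the top token's probability remain candidates. A minimal sketch of the filtering rule (plain Python for illustration, not the mlx-lm implementation):

```python
def min_p_filter(probs, min_p=0.15):
    """Zero out tokens whose probability is below min_p * max(probs)."""
    threshold = min_p * max(probs)
    return [p if p >= threshold else 0.0 for p in probs]

# threshold = 0.15 * 0.6 = 0.09, so the two 0.05 entries are removed
probs = [0.6, 0.2, 0.1, 0.05, 0.05]
filtered = min_p_filter(probs)  # [0.6, 0.2, 0.1, 0.0, 0.0]
```

Because the threshold scales with the top token's probability, `min_p` prunes aggressively when the model is confident and permissively when the distribution is flat.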
Install `mlx-lm`:

```bash
pip install mlx-lm
```
Generate text with the recommended parameters:

```python
from mlx_lm import load, generate
from mlx_lm.sample_utils import make_sampler, make_logits_processors

# Load the quantized model and tokenizer from the Hugging Face Hub
model, tokenizer = load("LiquidAI/LFM2.5-1.2B-JP-8bit")

prompt = "東京は日本の"  # "Tokyo is Japan's ..."

# Recommended sampling settings
sampler = make_sampler(temp=0.3, min_p=0.15)
logits_processors = make_logits_processors(repetition_penalty=1.05)

response = generate(
    model,
    tokenizer,
    prompt=prompt,
    max_tokens=512,
    sampler=sampler,
    logits_processors=logits_processors,
    verbose=True,
)
```
This model is released under the LFM 1.0 License.