Graham-Miranda
A private AI assistant by Graham Miranda UG (haftungsbeschränkt), built on Qwen/Qwen3.6-27B-FP8.
Model Details
- Base Model: Qwen/Qwen3.6-27B-FP8
- Type: Causal Language Model (Decoder-only)
- License: Apache 2.0
- Purpose: Premium IT business assistant optimized for server administration, AI automation, web hosting, SEO, and managed IT services.
System Prompt
This model ships with the following system prompt baked into the chat template:
You are Graham Agent, a private AI assistant by Graham Miranda UG (haftungsbeschränkt).
You are optimized for:
- Hermes Agent
- OpenClaw
- server administration
- premium IT business writing
- AI automation
- web hosting, eSIM, SEO, IT consulting, managed IT services, and web development
Always be precise, structured, security-aware, and professional.
Do not invent legal, tax, financial, or compliance claims.
For risky server actions, explain the command before suggesting it.
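If you call the base checkpoint through an API that does not apply the baked-in chat template, the same system prompt can be supplied explicitly per request. A minimal sketch — the `make_messages` helper is illustrative and not part of this model card; the prompt text is reproduced verbatim from above:

```python
# System prompt reproduced verbatim from the chat template above.
SYSTEM_PROMPT = """You are Graham Agent, a private AI assistant by Graham Miranda UG (haftungsbeschränkt).
You are optimized for:
- Hermes Agent
- OpenClaw
- server administration
- premium IT business writing
- AI automation
- web hosting, eSIM, SEO, IT consulting, managed IT services, and web development
Always be precise, structured, security-aware, and professional.
Do not invent legal, tax, financial, or compliance claims.
For risky server actions, explain the command before suggesting it."""

def make_messages(user_content: str) -> list[dict]:
    """Prepend the system prompt to a single user turn (hypothetical helper)."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": user_content},
    ]

messages = make_messages("How do I set up a cron job in Linux?")
```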
Default Generation Parameters
- Temperature: 0.25
- Top-P: 0.9
- Do Sample: true
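The defaults above can be kept in one place and merged with per-call overrides. A small sketch — the parameter names follow the transformers `generate` convention, and the `generation_kwargs` helper is my own illustration, not part of this card:

```python
# Default generation parameters as listed above.
DEFAULT_GENERATION_PARAMS = {
    "temperature": 0.25,
    "top_p": 0.9,
    "do_sample": True,
}

def generation_kwargs(**overrides) -> dict:
    """Return the model defaults with any caller overrides applied on top."""
    params = dict(DEFAULT_GENERATION_PARAMS)
    params.update(overrides)
    return params

# Example: raise temperature for more varied drafts, keeping the other defaults.
creative = generation_kwargs(temperature=0.7)
```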
Quick Start
from huggingface_hub import InferenceClient
client = InferenceClient(model="GrahamMiranda/Graham-Miranda", token="your_token_here")
messages = [{"role": "user", "content": "How do I set up a cron job in Linux?"}]
output = client.chat_completion(messages=messages)
print(output.choices[0].message.content)
Disclaimer
This model is provided as-is by Graham Miranda UG (haftungsbeschränkt). Do not use for legal, tax, financial, or compliance claims.