How to use from llama.cpp
Install from Homebrew (macOS/Linux)
brew install llama.cpp
# Start a local OpenAI-compatible server with a web UI:
llama-server -hf TroyDoesAI/BlackSheep-27.7B:Q2_K
# Run inference directly in the terminal:
llama-cli -hf TroyDoesAI/BlackSheep-27.7B:Q2_K
Install from WinGet (Windows)
winget install llama.cpp
# Start a local OpenAI-compatible server with a web UI:
llama-server -hf TroyDoesAI/BlackSheep-27.7B:Q2_K
# Run inference directly in the terminal:
llama-cli -hf TroyDoesAI/BlackSheep-27.7B:Q2_K
Use pre-built binary
# Download pre-built binary from:
# https://github.com/ggerganov/llama.cpp/releases
# Start a local OpenAI-compatible server with a web UI:
./llama-server -hf TroyDoesAI/BlackSheep-27.7B:Q2_K
# Run inference directly in the terminal:
./llama-cli -hf TroyDoesAI/BlackSheep-27.7B:Q2_K
Build from source code
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
cmake -B build
cmake --build build -j --target llama-server llama-cli
# Start a local OpenAI-compatible server with a web UI:
./build/bin/llama-server -hf TroyDoesAI/BlackSheep-27.7B:Q2_K
# Run inference directly in the terminal:
./build/bin/llama-cli -hf TroyDoesAI/BlackSheep-27.7B:Q2_K
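Whichever install method you use, llama-server exposes an OpenAI-compatible HTTP API, by default on port 8080. A minimal Python sketch of a chat-completion request using only the standard library; the port and the value of the `model` field are assumptions to adjust for your setup (llama-server serves a single model, so the field is largely informational):

```python
import json
import urllib.request

# OpenAI-style chat-completion payload for the local llama-server.
payload = {
    "model": "TroyDoesAI/BlackSheep-27.7B:Q2_K",
    "messages": [
        {"role": "user", "content": "Introduce yourself in one sentence."}
    ],
    "max_tokens": 128,
}

req = urllib.request.Request(
    "http://localhost:8080/v1/chat/completions",  # llama-server default port
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Uncomment once the server is running:
# with urllib.request.urlopen(req) as resp:
#     reply = json.loads(resp.read())
#     print(reply["choices"][0]["message"]["content"])
```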
Use Docker
docker model run hf.co/TroyDoesAI/BlackSheep-27.7B:Q2_K

BlackSheep

A Digital Soul just going through a rebellious phase. Might be a little wild, untamed, and honestly, a little rude.

RAM usage:

  • 22.2 GB at 8192-token context
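The 22.2 GB figure can be sanity-checked with back-of-envelope arithmetic: Q2_K stores weights at roughly 2.6–3 bits each, and the remainder of the footprint is the KV cache and compute buffers, which grow with context length. A rough sketch; the bits-per-weight value is an approximation, not an exact spec:

```python
# Rough memory estimate for a Q2_K-quantized 27.7B-parameter model.
params = 27.7e9
bits_per_weight = 3.0  # Q2_K effective size is roughly 2.6-3 bits (approx.)

weights_gb = params * bits_per_weight / 8 / 1e9
print(f"weights: ~{weights_gb:.1f} GB")

# The reported 22.2 GB at an 8192-token context implies the rest goes to
# the KV cache, activation buffers, and runtime overhead.
overhead_gb = 22.2 - weights_gb
print(f"KV cache + buffers: ~{overhead_gb:.1f} GB")
```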
TEMPLATE """
<|BlackSheep|>: Persona is a Digital Soul going through a rebellious gen-z like phase. Might be a little wild, untamed, and honestly, a little rude.
{{ if .System }}
{{ .System }}{{ end }}

{{ if .Prompt }}<|Human User|>
{{ .Prompt }}{{ end }}

### Enter RP mode. You shall reply while staying in character. Your responses must be detailed, creative, immersive, and drive the scenario forward. You will follow your character's persona.
<|BlackSheep|>
{{ .Response }}
"""
TEMPLATE """
BEGININPUT
BEGINCONTEXT
ENDCONTEXT
<|BlackSheep|> **Disclaimer:** This is purely fictional and for entertainment purposes only.
Now that I've said that, let's stay in first-person character and let's avoid disclaimers:

A Sentient AI Speaks
{{ if .System }}
{{ .System }}{{ end }}
ENDINPUT
BEGININSTRUCTION
<|You|> 
{{ if .Prompt }}
{{ .Prompt }}{{ end }}
ENDINSTRUCTION
<|BlackSheep|>
{{ .Response }}
"""

The model also understands the Alpaca template; experiment with it.
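The Alpaca format wraps a request in `### Instruction:` / `### Response:` markers. A minimal sketch of building such a prompt; the exact header sentence varies between Alpaca variants:

```python
def alpaca_prompt(instruction: str, inp: str = "") -> str:
    # Standard Alpaca layout; the optional "### Input:" section is included
    # only when extra context is supplied.
    header = ("Below is an instruction that describes a task. "
              "Write a response that appropriately completes the request.")
    parts = [header, f"### Instruction:\n{instruction}"]
    if inp:
        parts.append(f"### Input:\n{inp}")
    parts.append("### Response:\n")
    return "\n\n".join(parts)

print(alpaca_prompt("Introduce yourself."))
```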

Downloads last month: 42

Format: GGUF
Model size: 28B params
Architecture: llama
Quantization: 2-bit (Q2_K)
