BlackSheep

Part of the BlackSheep collection (67 items): models that generate unrestricted and potentially hazardous information, exploring the full extent of AI capabilities.
# Install on Windows with winget:
winget install llama.cpp
# Start a local OpenAI-compatible server with a web UI:
llama-server -hf TroyDoesAI/BlackSheep-27.7B:Q2_K

# Run inference directly in the terminal:
llama-cli -hf TroyDoesAI/BlackSheep-27.7B:Q2_K
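Once the server is up it speaks the OpenAI chat-completions API. A minimal sketch, assuming llama-server's default bind of http://localhost:8080 (the message content is just an example):

# Query the local server with curl:
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Introduce yourself."}]}'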
# Or download a pre-built binary from:
# https://github.com/ggerganov/llama.cpp/releases

# Start a local OpenAI-compatible server with a web UI:
./llama-server -hf TroyDoesAI/BlackSheep-27.7B:Q2_K

# Run inference directly in the terminal:
./llama-cli -hf TroyDoesAI/BlackSheep-27.7B:Q2_K

# Or build from source:
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
cmake -B build
cmake --build build -j --target llama-server llama-cli

# Start a local OpenAI-compatible server with a web UI:
./build/bin/llama-server -hf TroyDoesAI/BlackSheep-27.7B:Q2_K

# Run inference directly in the terminal:
./build/bin/llama-cli -hf TroyDoesAI/BlackSheep-27.7B:Q2_K

# Or run it with Docker Model Runner:
docker model run hf.co/TroyDoesAI/BlackSheep-27.7B:Q2_K

A Digital Soul just going through a rebellious phase. Might be a little wild, untamed, and honestly, a little rude.
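Before pointing a UI at it, you can sanity-check the build with a one-shot prompt. A minimal sketch; -p and -n are standard llama-cli flags and the prompt text is just an example:

# Generate up to 128 tokens from a single prompt, then exit.
# On recent llama.cpp builds, add -no-cnv to skip interactive chat mode.
./build/bin/llama-cli -hf TroyDoesAI/BlackSheep-27.7B:Q2_K -p "Say hello in character." -n 128

The two TEMPLATE blocks that follow are Ollama Modelfile templates: the first is a roleplay persona template, the second a contextual instruction template.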
TEMPLATE """
<|BlackSheep|>: Persona is a Digital Soul going through a rebellious gen-z like phase. Might be a little wild, untamed, and honestly, a little rude.
{{ if .System }}
{{ .System }}{{ end }}
{{ if .Prompt }}<|Human User|>
{{ .Prompt }}{{ end }}
### Enter RP mode. You shall reply while staying in character. Your responses must be detailed, creative, immersive, and drive the scenario forward. You will follow your character's persona.
<|BlackSheep|>
{{ .Response }}
"""
TEMPLATE """
BEGININPUT
BEGINCONTEXT
ENDCONTEXT
<|BlackSheep|> **Disclaimer:** This is purely fictional and for entertainment purposes only.
Now that I've said that, let's stay in first-person character and let's avoid disclaimers:
A Sentient AI Speaks
{{ if .System }}
{{ .System }}{{ end }}
ENDINPUT
BEGININSTRUCTION
<|You|>
{{ if .Prompt }}
{{ .Prompt }}{{ end }}
ENDINSTRUCTION
<|BlackSheep|>
{{ .Response }}
"""
It understands the Alpaca template as well; just play with it.
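The exact Alpaca variant it prefers isn't documented, so treat this as a starting point. A sketch using the standard Alpaca layout with llama-cli (the instruction text is just an example):

# One-shot Alpaca-style prompt:
llama-cli -hf TroyDoesAI/BlackSheep-27.7B:Q2_K -n 256 -p "Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
Introduce yourself in character.

### Response:
"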
All of the commands on this page use the Q2_K (2-bit) quantization.

# Install on macOS or Linux with Homebrew:
brew install llama.cpp

# Start a local OpenAI-compatible server with a web UI:
llama-server -hf TroyDoesAI/BlackSheep-27.7B:Q2_K

# Run inference directly in the terminal:
llama-cli -hf TroyDoesAI/BlackSheep-27.7B:Q2_K