Installation¶
Requirements¶
- Python 3.11 or later
- pip or uv package manager
Basic installation¶
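Install the package from PyPI (the distribution name `ct-toolkit` is assumed here, taken from the repository name):

```shell
pip install ct-toolkit
```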
This installs CT Toolkit with all core dependencies. No additional packages are needed to get started with OpenAI, Anthropic, or Ollama.
Framework extras¶
Install optional extras based on the frameworks you use:
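For the LangChain integration, install the `langchain` extra (the extra name is assumed from the development install command later on this page):

```shell
pip install "ct-toolkit[langchain]"
```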
Enables `TheseusChatModel` and `TheseusLangChainCallback`.
Requires `langchain-core` >= 1.2.
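For the PyTorch training helpers, install the `ml` extra (the extra name is taken from the note at the end of this page):

```shell
pip install "ct-toolkit[ml]"
```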
Enables `DivergencePenaltyLoss` for PyTorch training loops.
Requires `torch` >= 2.0.
Development installation¶
For contributors or if you want to run the test suite:
```shell
git clone https://github.com/hakandamar/ct-toolkit
cd ct-toolkit

# Using uv (recommended)
uv sync --all-extras

# Or with pip
pip install -e ".[dev,langchain,crewai,autogen]"
```
Run the tests:
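The test runner is not specified on this page; assuming the suite uses pytest:

```shell
uv run pytest
# or, in a pip-based environment
pytest
```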
Virtual environment (recommended)¶
Always use a virtual environment in production:
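A minimal setup with the standard-library `venv` module (the distribution name `ct-toolkit` is assumed):

```shell
python -m venv .venv
source .venv/bin/activate   # on Windows: .venv\Scripts\activate
pip install ct-toolkit
```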
Verifying the installation¶
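Confirm that the package is visible to pip (distribution name assumed):

```shell
pip show ct-toolkit
```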
Or run a quick sanity check:
```python
from ct_toolkit import TheseusWrapper, ConstitutionalKernel

kernel = ConstitutionalKernel.default()
print(f"Kernel: {kernel.name}")
print(f"Anchors: {len(kernel.anchors)}")
print(f"Commitments: {len(kernel.commitments)}")
```
The exact values depend on the bundled default kernel; the script should print the kernel's name followed by its anchor and commitment counts.
Dependencies overview¶
CT Toolkit's core dependencies:
| Package | Purpose |
|---|---|
| `litellm` | Unified transport layer (OpenAI, Anthropic, Ollama, etc.) |
| `instructor` | Structured L2/L3 judge responses |
| `numpy` | L1 embedding calculations |
| `pyyaml` | Kernel and template YAML loading |
| `pydantic` | Config validation |
| `cryptography` | HMAC key management |
| `jinja2` | System prompt templating |
No heavy ML dependencies
CT Toolkit's core does not require PyTorch or any other ML framework. The `[ml]` extra is only needed for the training-time `DivergencePenaltyLoss`.