
Build a Secure, Memory-Powered Cipher Workflow with Dynamic LLM Switching

Step-by-step guide to building a secure, memory-enabled Cipher workflow that dynamically selects an LLM provider and exposes an API for integration. It includes Python helpers to manage keys, generate cipher.yml, store and retrieve memories, and run Cipher in API mode.

Securely capture API keys

Capture the Gemini API key with Python's getpass module so it isn't hard-coded in the notebook or echoed in its output. The snippet below stores it in the GEMINI_API_KEY environment variable.

import os, getpass
os.environ["GEMINI_API_KEY"] = getpass.getpass("Enter your Gemini API key: ").strip()
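The same pattern extends to the other providers that choose_llm() (defined next) looks for. A small optional sketch, assuming you may also want OPENAI_API_KEY or ANTHROPIC_API_KEY available, reusing the os and getpass imports from above:

# Optional: capture any other provider keys the same way; skip the ones you don't use.
for var in ("OPENAI_API_KEY", "ANTHROPIC_API_KEY"):
    if not os.getenv(var):
        value = getpass.getpass(f"Enter {var} (or press Enter to skip): ").strip()
        if value:
            os.environ[var] = value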

Dynamic LLM selection

Define choose_llm() to pick OpenAI, Gemini, or Anthropic automatically based on which API key is present in the environment. This lets the same workflow run across different accounts or deployment targets without changing code.

import subprocess, tempfile, pathlib, textwrap, time, requests, shlex
 
 
def choose_llm():
   if os.getenv("OPENAI_API_KEY"):
       return "openai", "gpt-4o-mini", "OPENAI_API_KEY"
   if os.getenv("GEMINI_API_KEY"):
       return "gemini", "gemini-2.5-flash", "GEMINI_API_KEY"
   if os.getenv("ANTHROPIC_API_KEY"):
       return "anthropic", "claude-3-5-haiku-20241022", "ANTHROPIC_API_KEY"
   raise RuntimeError("Set one API key before running.")
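Calling the helper on its own is a quick way to confirm which backend the rest of the workflow will use:

# Example: report which provider was detected from the environment.
provider, model, key_env = choose_llm()
print(f"Selected {provider} / {model} (key read from ${key_env})")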

Shell command helper

Use a small helper to run shell commands from Python, print stdout/stderr for visibility, and fail loudly when something goes wrong. This makes the automation easier to follow in notebook output or CI logs.

def run(cmd, check=True, env=None):
   print("▸", cmd)
   p = subprocess.run(cmd, shell=True, text=True, capture_output=True, env=env)
   if p.stdout: print(p.stdout)
   if p.stderr: print(p.stderr)
   if check and p.returncode != 0:
       raise RuntimeError(f"Command failed: {cmd}")
   return p

Install Node.js and Cipher CLI

Ensure the environment has Node.js, npm, and the Cipher CLI installed globally before attempting any Cipher operations.

def ensure_node_and_cipher():
   run("sudo apt-get update -y && sudo apt-get install -y nodejs npm", check=False)
   run("npm install -g @byterover/cipher")
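As an optional sanity check, the run() helper defined above can confirm the binaries landed on PATH; these are standard commands, nothing Cipher-specific:

# Non-fatal checks: versions and install location are printed for visibility.
run("node --version", check=False)
run("npm --version", check=False)
run("which cipher", check=False)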

Generate cipher.yml with a memory-enabled agent

Programmatically create a cipher.yml inside a memAgent folder. The configuration sets the chosen provider and model, references the API key via an environment variable, adds a system prompt that frames the agent as having long-term memory, disables embeddings, and registers a filesystem MCP server.

def write_cipher_yml(workdir, provider, model, key_env):
   cfg = """
llm:
 provider: {provider}
 model: {model}
 apiKey: ${key_env}
systemPrompt:
 enabled: true
 content: |
   You are an AI programming assistant with long-term memory of prior decisions.
embedding:
 disabled: true
mcpServers:
 filesystem:
   type: stdio
   command: npx
   args: ['-y','@modelcontextprotocol/server-filesystem','.']
""".format(provider=provider, model=model, key_env=key_env)
 
 
   (workdir / "memAgent").mkdir(parents=True, exist_ok=True)
   (workdir / "memAgent" / "cipher.yml").write_text(cfg.strip() + "\n")
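As a concrete example, if GEMINI_API_KEY is the key that was found, the llm block of the generated memAgent/cipher.yml renders to the equivalent of the snippet below; note that ${key_env} becomes a plain $GEMINI_API_KEY reference, so the secret itself is never written to disk and Cipher reads it from the environment instead.

llm:
  provider: gemini
  model: gemini-2.5-flash
  apiKey: $GEMINI_API_KEY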

Run a single Cipher command

Use cipher_once() to run an ad-hoc Cipher CLI instruction from Python, capture its output, and return it for programmatic use.

def cipher_once(text, env=None, cwd=None):
   cmd = f'cipher {shlex.quote(text)}'
   p = subprocess.run(cmd, shell=True, text=True, capture_output=True, env=env, cwd=cwd)
   print("Cipher says:\n", p.stdout or p.stderr)
   return p.stdout.strip() or p.stderr.strip()

Start Cipher in API mode

Launch Cipher as a subprocess in API mode and poll the /health endpoint until it responds. This ensures the API is ready for external clients.

def start_api(env, cwd):
   proc = subprocess.Popen("cipher --mode api", shell=True, env=env, cwd=cwd,
                           stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=True)
   for _ in range(30):
       try:
           r = requests.get("http://127.0.0.1:3000/health", timeout=2)
           if r.ok:
               print("API /health:", r.text)
               break
       except requests.RequestException:
           pass
       time.sleep(1)
   return proc

Orchestrating the demo

The main() function ties everything together: it selects the LLM provider and model, installs dependencies, writes the cipher.yml, stores project decisions into Cipher's memory, queries those memories back, and briefly runs the API server.

def main():
   provider, model, key_env = choose_llm()
   ensure_node_and_cipher()
   workdir = pathlib.Path(tempfile.mkdtemp(prefix="cipher_demo_"))
   write_cipher_yml(workdir, provider, model, key_env)
   env = os.environ.copy()
 
 
   cipher_once("Store decision: use pydantic for config validation; pytest fixtures for testing.", env, str(workdir))
   cipher_once("Remember: follow conventional commits; enforce black + isort in CI.", env, str(workdir))
 
 
   cipher_once("What did we standardize for config validation and Python formatting?", env, str(workdir))
 
 
   api_proc = start_api(env, str(workdir))
   time.sleep(3)
   api_proc.terminate()
 
 
if __name__ == "__main__":
   main()

Practical benefits

This compact workflow secures API keys, automatically chooses an available LLM provider, provisions a persistent-memory Cipher agent, and exposes an API endpoint for integrations. It's lightweight, reproducible in notebooks or CI, and helps teams capture and query project decisions programmatically.
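As a minimal integration sketch, an external client only needs the base URL that start_api() polls above; the example assumes a cipher --mode api process is still running locally on port 3000 and reuses the same /health endpoint.

import requests

# Assumes a `cipher --mode api` process (e.g. started via start_api()) is running.
resp = requests.get("http://127.0.0.1:3000/health", timeout=5)
print("Cipher API health:", resp.status_code, resp.text)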
