system capabilities

This page lets you view and control the app's capabilities: the powers available to it.

Zzz's goal is to connect your intent to the world through machines, so it runs in many contexts, and you can do whatever you wish with whatever capabilities are on hand.

This is a work in progress. The idea is to put handles on the system for transparency and control.

🜢 backend

backend not checked
http://localhost:8999

The Zzz backend provides local system access (such as to your filesystem), handles API requests to AI providers, and enables other capabilities that would otherwise be unavailable to the app running in the browser. It's built with Hono, a JS server framework that aligns with web standards, and SvelteKit.

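The "backend not checked" status above comes from probing the backend over HTTP. A minimal sketch of such a check, assuming a hypothetical /api/ping endpoint (not Zzz's actual API); only the localhost:8999 URL is from this page:

```typescript
// Minimal sketch of a backend availability check. The /api/ping endpoint
// is an assumption for illustration; only the URL is from this page.
const BACKEND_URL = "http://localhost:8999";

type Backend_Status = "not checked" | "available" | "unavailable";

// Pure interpretation of a check result, kept separate so the status
// labeling is testable without a running backend.
const interpret_check = (ok: boolean | null): Backend_Status =>
	ok === null ? "not checked" : ok ? "available" : "unavailable";

// Usage (requires the backend to be running, so not executed here):
// const ok = await fetch(BACKEND_URL + "/api/ping").then((r) => r.ok).catch(() => false);
// console.log(interpret_check(ok));
```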

backend filesystem

filesystem not initialized

This is the backend's filesystem directory. For security, filesystem operations are scoped to this directory and symlinks are not followed. It defaults to .zzz in the backend's current working directory. To change it, set the .env variable PUBLIC_ZZZ_CACHE_DIR. Configure at your own risk.
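The scoping rule described above amounts to a path check: resolve the requested path against the root directory and reject anything that escapes it. This is an illustrative sketch, not Zzz's implementation, and it omits the symlink refusal (which requires filesystem calls like fs.lstat):

```typescript
import { resolve, sep } from "node:path";

// Illustrative sketch of directory scoping: a requested path is allowed
// only if it resolves to the root directory itself or somewhere inside it.
// A real check would additionally lstat each path component to refuse symlinks.
const is_in_scope = (root: string, requested: string): boolean => {
	const resolved_root = resolve(root);
	const resolved_path = resolve(resolved_root, requested);
	return (
		resolved_path === resolved_root ||
		resolved_path.startsWith(resolved_root + sep)
	);
};
```

So a request for `notes/a.txt` stays in scope, while `../escape.txt` resolves outside the root and is rejected.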

backend websocket

WebSockets are an optional transport that's enabled and preferred by default. Zzz currently relies on WebSockets for pushed updates like filesystem changes; SSE will be supported as an alternative in the future.
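The connection readout below (connected for, last send, last receive) amounts to simple timestamp bookkeeping around the socket's events. A minimal sketch with illustrative names, assuming millisecond timestamps:

```typescript
// Illustrative bookkeeping for a socket connection's status readout.
// Timestamps are plain millisecond numbers (e.g. from Date.now()).
class Socket_Stats {
	connected_at: number | null = null;
	last_send: number | null = null;
	last_receive: number | null = null;

	on_open(now: number): void {
		this.connected_at = now;
	}
	on_send(now: number): void {
		this.last_send = now;
	}
	on_receive(now: number): void {
		this.last_receive = now;
	}
	// How long the socket has been connected, or null if disconnected.
	connected_for(now: number): number | null {
		return this.connected_at === null ? null : now - this.connected_at;
	}
}
```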

websocket disconnected  
connected for: -
connected:
last send:
last receive:

Ollama

ollama not checked
backend unavailable

Ollama (ollama.com, GitHub) is a local model server that forks llama.cpp. It's one of Zzz's first integrations, and the plan is to support many other local LLM backends (input/feedback is welcome). See also the missing provider page.

Full controls are on the missing provider page.
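An availability check like the one behind "ollama not checked" can query Ollama's /api/tags endpoint, which lists locally installed models. The endpoint and default port 11434 are Ollama's documented defaults; the helper names here are illustrative, not Zzz's actual code:

```typescript
// Sketch of an Ollama availability check. GET /api/tags on port 11434 is
// Ollama's documented API for listing local models; helper names are
// illustrative.
interface Ollama_Tags_Response {
	models: Array<{name: string}>;
}

// Pure parsing step, separated so it can be exercised without a server.
const parse_model_names = (body: string): Array<string> =>
	(JSON.parse(body) as Ollama_Tags_Response).models.map((m) => m.name);

// Usage (requires a running Ollama server, so not executed here):
// const res = await fetch("http://localhost:11434/api/tags");
// console.log(parse_model_names(await res.text()));
```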

Claude

ChatGPT

Gemini

system

@ryanatkn/zzz@0.0.1

DEV: false

github.com/ryanatkn/zzz

/about

/settings