system capabilities


This page lets you view and control the app's capabilities, the powers available to it.

Zzz's goal is to connect your intent to the world through machines, so it runs in many contexts, and you can do whatever you wish with whatever capabilities are on hand.

This is a work in progress. The idea is to put handles on the system for transparency and control.

🜢 backend

backend not checked
http://localhost:8999

The Zzz backend provides local system access (like to your filesystem), handles API requests to AI providers, and enables other capabilities that would otherwise be unavailable to the app running in the browser. It's made with Hono[🡵], a JS server framework that aligns with web standards, and SvelteKit[🡵].
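The status line above shows the backend as "not checked" until a ping succeeds or fails. A minimal sketch of that three-way state, with hypothetical names (this is illustrative, not Zzz's actual API; only the URL comes from the page):

```typescript
// Hypothetical sketch: classify backend reachability the way the status
// line above suggests. Names are illustrative, not Zzz's actual API.
type BackendStatus = 'not checked' | 'available' | 'unavailable';

interface PingResult {
	checked: boolean; // has a ping been attempted yet?
	ok?: boolean; // did the ping get a 2xx response?
	latencyMs?: number; // round-trip time of the last ping
}

const BACKEND_URL = 'http://localhost:8999'; // from the page above

function backendStatus(result: PingResult): BackendStatus {
	if (!result.checked) return 'not checked';
	return result.ok ? 'available' : 'unavailable';
}
```

For example, `backendStatus({checked: false})` yields `'not checked'`, matching the initial state shown above.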


backend filesystem

filesystem not initialized

The backend's filesystem access is scoped for security. Symlinks are not followed. Configure with PUBLIC_ZZZ_DIR and PUBLIC_ZZZ_SCOPED_DIRS.
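A sketch of what scoping a path to allowed directories looks like, assuming POSIX-style paths. The env var names come from the page above; the checking logic is illustrative, not Zzz's actual implementation:

```typescript
import {resolve, sep} from 'node:path';

// Hypothetical scoped-path check: true only if `candidate` resolves to a
// path inside one of the scoped directories. (Symlink resolution is
// skipped here; as noted above, Zzz also refuses to follow symlinks.)
function isScoped(candidate: string, scopedDirs: string[]): boolean {
	const resolved = resolve(candidate);
	return scopedDirs.some((dir) => {
		const base = resolve(dir);
		return resolved === base || resolved.startsWith(base + sep);
	});
}
```

Resolving before comparing is what defeats `..` traversal: `isScoped('/home/user/zzz/../secret', ['/home/user/zzz'])` is false because the candidate resolves to `/home/user/secret`.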

backend websocket

WebSockets are an optional transport, preferred by default when available. Zzz currently relies on WebSockets for pushed updates like filesystem changes -- SSE will be supported as an option in the future.
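Since the connection can drop (as the "disconnected" state below shows), a reconnecting client typically backs off between attempts. Zzz's actual reconnect policy isn't documented on this page; this is the common exponential-backoff-with-cap pattern as a sketch:

```typescript
// Hypothetical reconnect backoff: doubles the delay each attempt,
// capped so a long outage doesn't grow the wait unboundedly.
// attempt 0 -> 500ms, 1 -> 1000ms, 2 -> 2000ms, ... capped at 30s.
function reconnectDelayMs(attempt: number, baseMs = 500, maxMs = 30_000): number {
	return Math.min(baseMs * 2 ** attempt, maxMs);
}
```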

websocket disconnected

  • connected for: -
  • connected:
  • last send:
  • last receive:

Ollama

ollama not checked
backend unavailable
Ollama (ollama.com[🡵], GitHub[🡵]) is a local model server built on llama.cpp[🡵]. It's one of Zzz's first integrations, and the plan is to support many other local LLM backends (input/feedback is welcome).

Full controls are on the missing provider page.
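For context on what an Ollama integration talks to: Ollama listens on http://localhost:11434 by default and exposes an HTTP API where POST /api/generate takes a JSON body with `model`, `prompt`, and `stream`. Actually sending the request requires a running Ollama instance, so this sketch only builds it; the model name is an example:

```typescript
// Sketch of a request to a local Ollama server's generate endpoint.
// Building the Request is pure; fetching it needs Ollama running.
const OLLAMA_URL = 'http://localhost:11434'; // Ollama's default address

function generateRequest(model: string, prompt: string): Request {
	return new Request(`${OLLAMA_URL}/api/generate`, {
		method: 'POST',
		headers: {'content-type': 'application/json'},
		body: JSON.stringify({model, prompt, stream: false}),
	});
}

// Usage (needs a running server):
//   const res = await fetch(generateRequest('llama3.2', 'hello'));
//   const {response} = await res.json();
```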

Claude

ChatGPT

Gemini

system

@fuzdev/zzz@0.0.1

DEV: false

github.com/fuzdev/zzz[🡵]

/about

/settings