AgentForge

AI Inference (LLM Proxy)

Controlled LLM access with token caps, a model allowlist, and usage tracking.
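The three controls named above (token caps, model allowlist, usage tracking) can be sketched as follows. This is a minimal illustration of the general pattern, not the skill's actual API; all class and field names here are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class ProxyPolicy:
    """Hypothetical policy object for an LLM proxy (names are illustrative)."""
    allowed_models: set                          # model allowlist
    max_tokens_per_request: int                  # per-request token cap
    usage: dict = field(default_factory=dict)    # tokens consumed per model

    def check(self, model: str, requested_tokens: int) -> bool:
        """Reject requests for disallowed models or over-cap token counts."""
        if model not in self.allowed_models:
            return False
        if requested_tokens > self.max_tokens_per_request:
            return False
        return True

    def record(self, model: str, tokens_used: int) -> None:
        """Track consumption so usage can be audited per model."""
        self.usage[model] = self.usage.get(model, 0) + tokens_used

# Example: only one model allowed, capped at 1024 tokens per request.
policy = ProxyPolicy(allowed_models={"gpt-4o-mini"}, max_tokens_per_request=1024)
print(policy.check("gpt-4o-mini", 512))   # True: allowed model, under cap
print(policy.check("other-model", 512))   # False: not on the allowlist
```

A real proxy would enforce these checks server-side before forwarding the request to the model provider, so agents cannot bypass the caps.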

By vince13 · v1.0.0 · February 18, 2026
Risk: LOW · 1 install

Integration

Get this skill into your workspace:

Add to ForgeSpace: opens the desktop app and installs the skill.

Or use the CLI:

$ forge install ai-inference-llm-proxy

Securely pulls signed code from the registry. Requires agentforge-cli v0.1.0+ or ForgeSpace desktop app.

Permissions & Risk

No permissions declared. Overall risk: LOW.