
Forge Pool API

The Forge Pool API is the public execution surface of the Planetary Kernel.

It exposes one canonical contract, with deterministic replay, for submitting distributed workloads across Forge Pool infrastructure.

Through one stable surface, the system can resolve many workload families:

  • Monte Carlo and probabilistic simulation
  • ensemble execution and reducer-driven confidence workflows
  • graph and traversal workloads
  • search and retrieval execution
  • media transformation and analysis
  • tensor and dense numeric workloads

This keeps the external API stable while the Kernel expands internally through primitive families, profiles, adapters, and orchestration layers.

What This API Represents

The Forge Pool API is not a traditional job execution interface.

It is a control surface for probabilistic, distributed execution across a heterogeneous planetary compute network.

Instead of submitting fixed, fully specified jobs, clients declare execution intent, defined by primitives and execution profiles, which the system expands, distributes, and resolves dynamically.

Execution Model

Every request submitted to Forge follows a structured lifecycle:

  1. Declaration: the client defines a primitive and an associated execution profile.

  2. Expansion: the system expands the request into multiple execution paths (e.g. simulations, scenarios, variations).

  3. Distribution: work is scheduled across available nodes via the scheduler and network layer.

  4. Execution: agents execute assigned workloads independently.

  5. Aggregation: results are collected and merged into a final output.

  6. Resolution: the system returns aggregated results, not individual execution outputs.
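The six steps above can be sketched end to end. This is a toy, single-process illustration, not the Kernel's implementation; all names and the per-path computation are invented for the sketch:

```python
import random
from statistics import mean

def execute_lifecycle(profile_seed: int, n_paths: int) -> dict:
    """Toy sketch of the Forge lifecycle: declare/expand ->
    distribute/execute -> aggregate/resolve."""
    # 1-2. Declaration + Expansion: one request becomes many execution paths.
    rng = random.Random(profile_seed)      # seeded, so expansion is replayable
    paths = [rng.random() for _ in range(n_paths)]

    # 3-4. Distribution + Execution: each path runs independently
    # (a trivial per-path computation stands in for an agent here).
    shard_outputs = [p * 2 for p in paths]

    # 5-6. Aggregation + Resolution: only the reduced result is returned,
    # never the individual execution outputs.
    return {"result": mean(shard_outputs), "paths": n_paths}

out = execute_lifecycle(profile_seed=7, n_paths=1000)
```

Because expansion is seeded, running the same declaration twice yields the same reduced result, which is the property the later replay discipline relies on.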


In 60 Seconds

Forge Pool API means:

  • one public execution surface
  • one canonical execution envelope
  • one deterministic Kernel doctrine
  • replayable and auditable results

You send:

  • a canonical execution request

Forge returns:

  • reduced output
  • execution metadata
  • replay references
  • optional artifacts
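A response carrying those four parts might look like the following. The field names are illustrative assumptions, chosen only to mirror the list above; the actual schema is not specified here:

```python
# Hypothetical Forge response shape; every key name is an assumption
# mirroring the four documented parts of the response.
response = {
    "output": {"value": 0.042},              # reduced output
    "metadata": {"shards": 64, "op": "mc"},  # execution metadata
    "replay": {"seed": 12345},               # replay references
    "artifacts": [],                         # optional artifacts
}

def has_forge_shape(resp: dict) -> bool:
    # A client-side sanity check: the three mandatory parts are present.
    return {"output", "metadata", "replay"} <= resp.keys()
```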

Canonical Public Entry Point

All public distributed workloads are submitted through a single endpoint:

```http
POST /api/v0/ops/execute
```

Workload selection happens inside the request envelope:

```json
{
  "op": {
    "name": "mc",
    "version": 1,
    "profile": "insurance.loss.v1"
  }
}
```
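In client code, the same envelope might be built and serialized like this. Only the `op` block is documented above; any additional payload fields would be profile-specific and are deliberately omitted:

```python
import json

# Canonical envelope from the example above. Only "op" is documented;
# profile-specific payload fields are not shown.
envelope = {
    "op": {
        "name": "mc",
        "version": 1,
        "profile": "insurance.loss.v1",
    }
}

body = json.dumps(envelope)
# A client would POST `body` to /api/v0/ops/execute.
```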

This means Forge Pool does not expose a route-per-workload model for distributed execution.

Instead, the API surface remains stable while the Kernel resolves:

  • primitive family
  • family version
  • profile semantics
  • seed behavior
  • policy behavior
  • verification discipline
  • replay metadata
  • artifact production
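One way to picture that resolution step (purely illustrative; the Kernel's internals are not documented here) is a registry keyed by the `op` name and version, with the profile passed through to the resolved primitive family:

```python
# Hypothetical registry: (name, version) -> handler for a primitive family.
REGISTRY = {
    ("mc", 1): lambda profile: f"monte-carlo under profile {profile}",
    ("graph", 1): lambda profile: f"graph traversal under profile {profile}",
}

def resolve(op: dict) -> str:
    # The stable envelope selects the family; the surface never changes
    # when new (name, version) entries are added.
    handler = REGISTRY[(op["name"], op["version"])]
    return handler(op["profile"])

plan = resolve({"name": "mc", "version": 1, "profile": "insurance.loss.v1"})
```

Adding a workload family extends the registry, not the public route table, which is why the API can stay on a single endpoint.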

Why the API Is Structured This Way

This documentation does not describe a pile of unrelated endpoints.

It describes a planetary execution system.

The API therefore centers on:

  • one canonical execution contract
  • deterministic replay discipline
  • auditable artifact generation
  • horizontal sharding across heterogeneous agents
  • extensibility through primitive families and profiles

This gives developers, operators, and enterprise buyers one clean mental model:

one public surface, many primitives, one Kernel doctrine


Public Runtime Position

The public API sits above the runtime core, but remains governed by the same execution doctrine.

High-level path:

```text
Client
  ↓
Web Core
  ↓
Hub
  ↓
Agents
  ↓
Primitives + Profiles
  ↓
Hub Reduction + Verification
  ↓
Web Core Response
```

Responsibilities are separated intentionally:

  • Web Core handles authentication, tenancy, public API lifecycle, persistence, and billing context
  • Hub handles planning, sharding, verification, aggregation, and replay discipline
  • Agents execute deterministic shard workloads
  • Primitives + Profiles define canonical execution semantics
  • Adapters / Studio / Labs may shape requests or compose workflows, but do not replace Kernel execution truth
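The Hub's aggregation and verification responsibilities can be sketched as a pure merge over independent shard outputs. This is a minimal illustration under invented field names, not the Kernel's actual reducer or verification discipline:

```python
def reduce_shards(shard_outputs: list[dict]) -> dict:
    """Merge independent shard results into one aggregate.

    Each shard reports a partial sum and a sample count; the reducer
    combines them without ever exposing per-shard outputs.
    """
    # Toy verification step: every shard must report actual work.
    assert all(s["n"] > 0 for s in shard_outputs), "empty shard detected"

    total = sum(s["sum"] for s in shard_outputs)
    count = sum(s["n"] for s in shard_outputs)
    return {"mean": total / count, "n": count}

agg = reduce_shards([{"sum": 10.0, "n": 4}, {"sum": 6.0, "n": 4}])
```

Because the reducer depends only on shard outputs, the same shards always merge to the same aggregate, which is what makes the final result replayable.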


Mental Model for New Readers

If you are evaluating Forge Pool for the first time, the simplest correct model is:

  1. the API accepts one canonical execution envelope
  2. the Kernel determines how that workload is executed
  3. the workload is planned, sharded, verified, reduced, and recorded
  4. the response includes domain output plus replay-grade execution metadata

That is the foundation that allows Forge Pool to scale from one simulation to a planetary execution layer for uncertainty.
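The replay-grade metadata in point 4 implies a concrete property: re-running the same envelope with the same recorded seed reproduces the same reduced output. A minimal sketch of that property, with all names invented for illustration:

```python
import random

def run(seed: int, n: int) -> float:
    # Deterministic expansion: same seed -> same paths -> same reduction.
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(n)) / n

first = run(seed=42, n=500)
replayed = run(seed=42, n=500)   # "replay" from the recorded seed
assert first == replayed
```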


Why Not Traditional Compute APIs?

Traditional systems execute single deterministic workloads.

Forge executes distributed probabilistic workloads:

  • Multiple execution paths instead of one
  • Aggregated results instead of single outputs
  • Dynamic scheduling instead of fixed allocation

This enables simulation, uncertainty modeling, and large-scale probabilistic computation at planetary scale.

Final Note

The public API is not separate from the system.

It is the controlled public entrance into the system.

That is why the contract is stable, why the runtime is deterministic, and why replay is treated as part of execution rather than as an afterthought.