Layers › Partner API › Getting started

Sandbox & test keys

What the `_test` key segment routes to today, what it mocks, what it doesn't, and how to write a CI smoke test that doesn't burn credits.


Partner keys come in two flavors: `lp_live_...` and `lp_test_...`. The env segment is parsed, stored, and logged — but it does not route to a separate sandbox backend today. A test key hits the same database, the same workflows, and the same platform integrations as a live key. Treat it as an isolation hint, not an isolation boundary.

A future release will route test keys to sandboxed back-ends. Today, test keys and live keys are indistinguishable once authenticated.

What the `_test` segment does today

The key parser reads the env segment and surfaces it on the verification result, but nothing downstream branches on it — key verification doesn't change its query, its scope check, or its rate-limit bucket based on env. So:

  • An `lp_test_...` key creating content bills your org wallet exactly like a live key.
  • An `lp_test_...` key starting an App Store ingest calls the real store scrapers exactly like a live key.
  • An `lp_test_...` key kicking off an `ingest_github` workflow spins up a real sandboxed agent and opens a real PR against the target repo.
  • An `lp_test_...` key publishing a scheduled post posts to the real platform through whatever social account you attached.

The key prefix is still useful: it's the one-glance signal you're not in production, it's safe to grep for, and it keeps test and live keys isolated by `(api_key_id, endpoint)` in the rate-limit and idempotency tables.
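That parse-but-don't-branch behavior can be mimicked client-side, for example to assert that CI is actually holding a test key before a suite runs. A minimal sketch under assumptions — the regex and function name are illustrative, not the real parser; only the `lp_live_` / `lp_test_` prefixes come from this page:

```python
import re

# Illustrative key-prefix parser. Only the lp_live_/lp_test_ prefixes are
# documented; the secret alphabet here is an assumption.
KEY_RE = re.compile(r"^lp_(?P<env>live|test)_(?P<secret>\S+)$")

def parse_env(key: str) -> str:
    """Return 'live' or 'test'. Remember: this is a hint, not a boundary."""
    m = KEY_RE.match(key)
    if m is None:
        raise ValueError("not a partner key")
    return m.group("env")

# A CI harness might guard itself with:
#   assert parse_env(os.environ["LAYERS_API_KEY"]) == "test"
```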

What's actually safe to test without side effects

The endpoints below are read-only or no-op on external systems. Hammer these freely:

  • `GET /v1/whoami` — no writes, no external calls.
  • `GET /v1/projects`, `GET /v1/projects/:id` — DB reads.
  • `GET /v1/jobs/:jobId` — DB reads.
  • `GET /v1/content/:id`, `GET /v1/content/:id/assets/:assetId` — DB reads; asset URLs are signed but fetching them doesn't re-render.
  • Every `GET /v1/projects/:id/metrics/*` — DB reads against cached metric tables.
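The read-only list above can be encoded as a small guard for a test harness, so a suite refuses to fire anything with side effects. A sketch — the helper and its patterns are illustrative, derived from the endpoints listed, not part of the API:

```python
import re

# Side-effect-free partner endpoints, transcribed from the list above.
SAFE_READS = [
    r"/v1/whoami",
    r"/v1/projects(/[^/]+)?",
    r"/v1/jobs/[^/]+",
    r"/v1/content/[^/]+(/assets/[^/]+)?",
    r"/v1/projects/[^/]+/metrics/.+",
]

def is_side_effect_free(method: str, path: str) -> bool:
    """True only for the GETs documented as safe to hammer."""
    if method != "GET":
        return False
    return any(re.fullmatch(p, path) for p in SAFE_READS)
```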

Writes that are safe because they're self-contained:

  • `POST /v1/projects`, `PATCH /v1/projects/:id`, `POST /v1/projects/:id/archive` — project CRUD, no external calls on create.
  • `POST /v1/projects/:id/influencers` — creates an influencer row; no media generation until you actually request content.
  • `POST /v1/webhook-endpoints` + `POST /v1/webhook-endpoints/:id/test` — a `test.ping` fires at your URL; nothing else.

What costs money or hits external systems

| Endpoint | External cost |
| --- | --- |
| `POST /v1/projects/:id/content` | Credit spend (generation). Real models. Real seconds. |
| `POST /v1/content/:id/regenerate` | Same. |
| `POST /v1/projects/:id/ingest/github` | Sandboxed agent minutes + a PR on the target repo. |
| `POST /v1/projects/:id/ingest/appstore` | App Store / Play Store scraper. |
| `POST /v1/projects/:id/ingest/website` | Website scrape spend. |
| `POST /v1/content/:id/publish`, or `POST /v1/content/:id/schedule` then publish | Posts to real Instagram / TikTok / YouTube. |
| Anything ads-related | Meta / TikTok / Apple ad spend against the real wallet. |

If you need to exercise any of these in CI, use a throwaway project and cancel the job before it completes — see the CI recipe below.

Credits and the plan gate

Every partner request passes through a plan-gate check. It reads the organization's current plan and asserts the plan's features include `api_access`. No matching plan row → `402 PAYMENT_REQUIRED`.

There's no special "test credit allowance" — your org has one wallet, shared by live and test keys. If you're on a partner plan with API access enabled, your test keys work. If you're not, neither do your live keys.

If a fresh partner sandbox org 402s on every call, the fix is to seat the org on a partner-tier plan with API access enabled. That's an ops action — email your Layers contact rather than try to create it via the API.
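A client can turn that failure mode into an actionable error instead of a generic HTTP exception. A hedged sketch — the 402 status and `PAYMENT_REQUIRED` code come from this page, but the error-body shape (`code` field) and helper name are assumptions:

```python
class PlanGateError(RuntimeError):
    """Raised when the org's plan lacks api_access — an ops fix, not a retry."""

def check_plan_gate(status: int, body: dict) -> None:
    # Assumed body shape: {"code": "PAYMENT_REQUIRED", ...}
    if status == 402 and body.get("code") == "PAYMENT_REQUIRED":
        raise PlanGateError(
            "org plan lacks api_access; ask your Layers contact to seat the "
            "org on a partner-tier plan (affects live and test keys alike)"
        )
```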

Minimum-viable CI smoke test

One read call to confirm auth, one long-running job start to confirm mutations work, and a cancel to avoid burning credits. No polling to completion, no real publishing.

smoke.ts
import { randomUUID } from "node:crypto";

const BASE = "https://api.layers.com";
const KEY = process.env.LAYERS_API_KEY!; // lp_test_... in CI

async function headers(extra: Record<string, string> = {}) {
  return {
    "X-Api-Key": KEY,
    "Content-Type": "application/json",
    ...extra,
  };
}

async function smoke() {
  // 1. whoami — confirms the key authenticates and has api_access.
  const who = await fetch(`${BASE}/v1/whoami`, { headers: await headers() });
  if (who.status !== 200) throw new Error(`whoami ${who.status}`);

  // 2. Create a throwaway project. customerExternalId keeps it idempotent
  //    across CI runs — PATCH on re-run instead of creating dupes.
  const projRes = await fetch(`${BASE}/v1/projects`, {
    method: "POST",
    headers: await headers({ "Idempotency-Key": randomUUID() }),
    body: JSON.stringify({
      name: "ci-smoke",
      customerExternalId: `ci-${process.env.GITHUB_RUN_ID ?? Date.now()}`,
      timezone: "UTC",
      primaryLanguage: "en",
    }),
  });
  if (!projRes.ok) throw new Error(`create project ${projRes.status}`);
  const proj = await projRes.json();

  // 3. Kick off a long-running job — assert the 202 shape, don't wait.
  const ingest = await fetch(`${BASE}/v1/projects/${proj.id}/ingest/appstore`, {
    method: "POST",
    headers: await headers({ "Idempotency-Key": randomUUID() }),
    body: JSON.stringify({ iosBundleId: "com.example.doesnotexist" }),
  });
  if (ingest.status !== 202) throw new Error(`ingest ${ingest.status}`);
  const { jobId } = await ingest.json();

  // 4. Cancel immediately — no scrape spend, no credit spend.
  const cancel = await fetch(`${BASE}/v1/jobs/${jobId}/cancel`, {
    method: "POST",
    headers: await headers({ "Idempotency-Key": randomUUID() }),
  });
  if (![200, 202, 409].includes(cancel.status)) {
    throw new Error(`cancel ${cancel.status}`);
  }

  console.log("smoke ok", { projectId: proj.id, jobId });
}

smoke().catch((e) => {
  console.error(e);
  process.exit(1);
});
smoke.py
import os, sys, time, uuid, requests

BASE = "https://api.layers.com"
KEY = os.environ["LAYERS_API_KEY"]  # lp_test_... in CI

def H(**extra):
    return {"X-Api-Key": KEY, "Content-Type": "application/json", **extra}

def smoke():
    # 1. whoami
    r = requests.get(f"{BASE}/v1/whoami", headers=H())
    assert r.status_code == 200, f"whoami {r.status_code}"

    # 2. throwaway project — customerExternalId keeps it stable per CI run
    run_id = os.environ.get("GITHUB_RUN_ID", str(int(time.time())))
    proj_res = requests.post(
        f"{BASE}/v1/projects",
        headers=H(**{"Idempotency-Key": str(uuid.uuid4())}),
        json={
            "name": "ci-smoke",
            "customerExternalId": f"ci-{run_id}",
            "timezone": "UTC",
            "primaryLanguage": "en",
        },
    )
    assert proj_res.ok, f"create project {proj_res.status_code}"
    proj = proj_res.json()

    # 3. kick off a long-running job; assert 202, don't wait
    ingest = requests.post(
        f"{BASE}/v1/projects/{proj['id']}/ingest/appstore",
        headers=H(**{"Idempotency-Key": str(uuid.uuid4())}),
        json={"iosBundleId": "com.example.doesnotexist"},
    )
    assert ingest.status_code == 202, f"ingest {ingest.status_code}"
    job_id = ingest.json()["jobId"]

    # 4. cancel immediately
    cancel = requests.post(
        f"{BASE}/v1/jobs/{job_id}/cancel",
        headers=H(**{"Idempotency-Key": str(uuid.uuid4())}),
    )
    assert cancel.status_code in (200, 202, 409), f"cancel {cancel.status_code}"

    print("smoke ok", {"projectId": proj["id"], "jobId": job_id})

if __name__ == "__main__":
    try:
        smoke()
    except AssertionError as e:
        print(e, file=sys.stderr)
        sys.exit(1)

Why this shape:

  • customerExternalId keyed to the CI run keeps the test idempotent — re-running against the same run ID hits the existing project; Idempotency-Key replays the response.
  • Cancel before the scrape finishes — App Store scrapes start within a second of the 202 but the actual network fetch takes longer. Canceling immediately typically prevents the outbound request entirely.
  • Accept 409 on cancel — if the job did finish in the window between start and cancel, the cancel returns 409 CONFLICT with reason: "already_terminal". That's still a passing smoke.
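Those cancel semantics can be folded into a tiny response interpreter so the smoke's pass/fail logic lives in one place. A sketch — the 200/202/409 statuses and the `already_terminal` reason come from this page; the helper name and return values are illustrative:

```python
def cancel_outcome(status: int, body: dict) -> str:
    # 200/202: the cancel landed before the job finished.
    if status in (200, 202):
        return "canceled"
    # 409 CONFLICT with reason "already_terminal": the job won the race.
    # Still a passing smoke — just worth logging.
    if status == 409 and body.get("reason") == "already_terminal":
        return "already_finished"
    raise RuntimeError(f"unexpected cancel response: {status}")
```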

Total spend per run: zero credits, at most one partial scraper invocation.
