Build with ThreadSync
Developer Hub
Three API surfaces. One developer experience. Governed AI access, sandboxed execution, and enterprise integration — all with enterprise-grade security.
Three Developer Paths
Choose the API surface that fits your use case — or combine all three for a complete AI-powered integration stack.
LLM Gateway API
Unified /v1/chat/completions endpoint across 5 providers and 43+ models. Auto-routing, manual model pinning, per-request cost tracking, and browser-safe PKCE auth.
Magic Runtime
Contract-driven controller deployment with input/output schemas, hot-reload, sandboxed execution, and full observability via Prometheus metrics, JSON logs, and tracing.
Explore Magic Runtime
ThreadSync Core API
REST API with OAuth 2.0 / API key auth, versioned endpoints, unified data models, 200+ connectors, webhooks, and event-driven sync across enterprise systems.
Explore Core API
LLM Gateway API
One endpoint. Five providers. Forty-three models. Governed AI access for every team.
Unified Completions Endpoint
Single /v1/chat/completions endpoint compatible with the OpenAI SDK format. Drop-in replacement for any existing integration.
Auto-Routing or Model Pinning
Let the gateway choose the best model for cost/quality, or pin a specific model like claude-sonnet-4-20250514 or gpt-4o.
5 Providers, 43+ Models
Anthropic, OpenAI, Google, Mistral, and Groq — all through one authenticated endpoint with unified response schemas.
Browser-Safe PKCE Auth
OAuth 2.0 PKCE flow for frontend apps. No secrets in the browser. Secure token exchange with automatic refresh.
Idempotent Requests
Send an Idempotency-Key header to safely retry requests. Conversation memory stores thread context across calls.
Per-Request Cost Tracking
Every response includes token counts and cost breakdowns. Aggregate by user, team, or project for budget governance.
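The idempotent-retry behavior described above can be sketched client-side: generate one Idempotency-Key per logical request and reuse it on every retry, so the gateway can deduplicate. A minimal Python sketch — the `send` callable stands in for any HTTP transport; only the `Idempotency-Key` header name comes from the feature description above:

```python
import uuid

def post_with_retry(send, payload, max_attempts=3):
    """Retry a request safely by reusing one Idempotency-Key.

    `send(headers, payload)` is any transport callable that raises on
    transient failure; because the key is identical across attempts,
    a retry cannot double-execute the request server-side.
    """
    headers = {"Idempotency-Key": str(uuid.uuid4())}
    for attempt in range(max_attempts):
        try:
            return send(headers, payload)
        except ConnectionError:
            if attempt == max_attempts - 1:
                raise

# Example with a flaky transport: fails once, then succeeds.
calls = []
def flaky_send(headers, payload):
    calls.append(headers["Idempotency-Key"])
    if len(calls) == 1:
        raise ConnectionError("transient network error")
    return {"ok": True}

result = post_with_retry(flaky_send, {"messages": []})
assert result == {"ok": True}
assert calls[0] == calls[1]  # same key reused on the retry
```

The key point is that the key is minted once, outside the retry loop; minting a fresh key per attempt would defeat deduplication.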
Quick Start: Chat Completion
# Send a chat completion through LLM Gateway
curl -X POST https://llmgateway.threadsync.io/v1/chat/completions \
  -H "Authorization: Bearer YOUR_GATEWAY_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet-4-20250514",
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "Explain API gateways in one paragraph."}
    ],
    "max_tokens": 256
  }'

# Response includes token usage and cost breakdown:
# {
#   "id": "chatcmpl-abc123",
#   "model": "claude-sonnet-4-20250514",
#   "choices": [{ "message": { "content": "..." } }],
#   "usage": { "prompt_tokens": 24, "completion_tokens": 128,
#              "cost_usd": 0.00082 }
# }
from openai import OpenAI

# Point the OpenAI SDK at LLM Gateway
client = OpenAI(
    base_url="https://llmgateway.threadsync.io/v1",
    api_key="YOUR_GATEWAY_TOKEN"
)

# Send a chat completion — works with any supported model
response = client.chat.completions.create(
    model="claude-sonnet-4-20250514",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain API gateways in one paragraph."}
    ],
    max_tokens=256
)
print(response.choices[0].message.content)
import OpenAI from 'openai';

// Point the OpenAI SDK at LLM Gateway
const client = new OpenAI({
  baseURL: 'https://llmgateway.threadsync.io/v1',
  apiKey: 'YOUR_GATEWAY_TOKEN'
});

// Send a chat completion — works with any supported model
const response = await client.chat.completions.create({
  model: 'claude-sonnet-4-20250514',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Explain API gateways in one paragraph.' }
  ],
  max_tokens: 256
});

console.log(response.choices[0].message.content);
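The per-request `usage` block shown in the curl response above can be aggregated client-side for budget governance. A minimal sketch assuming the response shape shown there (`usage.cost_usd`); the per-user grouping key is illustrative:

```python
from collections import defaultdict

def aggregate_costs(responses):
    """Sum gateway-reported usage.cost_usd per user from response payloads."""
    totals = defaultdict(float)
    for user, resp in responses:
        totals[user] += resp["usage"]["cost_usd"]
    return dict(totals)

# Two users, three responses shaped like the curl example above.
responses = [
    ("alice", {"usage": {"prompt_tokens": 24, "completion_tokens": 128, "cost_usd": 0.00082}}),
    ("alice", {"usage": {"prompt_tokens": 10, "completion_tokens": 40, "cost_usd": 0.00031}}),
    ("bob",   {"usage": {"prompt_tokens": 5,  "completion_tokens": 20, "cost_usd": 0.00012}}),
]
totals = aggregate_costs(responses)
assert round(totals["alice"], 5) == 0.00113
assert round(totals["bob"], 5) == 0.00012
```

The same pattern extends to team or project keys; in production you would roll these sums into whatever budget store you already use.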
Magic Runtime
Deploy AI-generated controllers with enterprise-grade safety. Default-deny security, contract-driven execution, and full observability — built for production.
Controller Contracts
Every controller declares its inputs, outputs, capabilities, and resource limits. The contract is the single source of truth — no surprises in production.
CLI Scaffolding
Bootstrap new controllers in seconds. magic scaffold --name my-agent generates contract, handler, tests, and Dockerfile.
Hot-Reload Deployment
Ship new controllers without restarts. Validate, deploy, and execute — all through the CLI or API. Zero-downtime updates by default.
Default-Deny Security
HTTP egress, database access, and secrets are blocked unless explicitly declared and scoped per-controller. Sandboxed execution with syscall restrictions.
Prometheus + JSON Logs
Structured JSON logs, Prometheus metrics, and distributed tracing out of the box. Every request traced by request_id.
AI-Native Workflow
LLM context packs and scaffolding so AI assistants generate conformant controllers. Built for the AI-first development loop.
Example: Controller Contract
// contract.json — declares what the controller can do
{
  "name": "sentiment-analyzer",
  "version": "1.0.0",
  "input_schema": {
    "type": "object",
    "properties": {
      "text": { "type": "string", "maxLength": 5000 }
    },
    "required": ["text"]
  },
  "output_schema": {
    "type": "object",
    "properties": {
      "sentiment": { "type": "string", "enum": ["positive", "negative", "neutral"] },
      "confidence": { "type": "number", "minimum": 0, "maximum": 1 }
    }
  },
  "capabilities": ["llm:chat"],
  "limits": {
    "timeout_ms": 10000,
    "memory_mb": 256
  }
}
# handler.py — the controller logic
from magic_runtime import Controller, Request, Response

app = Controller("sentiment-analyzer")

@app.handle
async def analyze(req: Request) -> Response:
    # Use LLM Gateway through the runtime SDK
    result = await req.llm.chat(
        model="claude-sonnet-4-20250514",
        messages=[{
            "role": "user",
            "content": f"Classify sentiment: {req.input['text']}"
        }]
    )
    return Response({
        "sentiment": result.parsed["sentiment"],
        "confidence": result.parsed["confidence"]
    })
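Because the contract is the single source of truth, input validation can happen before the handler ever runs. A hand-rolled sketch covering only the two constraints used in the example `input_schema` (required fields and string `maxLength`) — a real deployment would use a full JSON Schema validator rather than this minimal checker:

```python
def validate_input(schema, payload):
    """Check a payload against a contract's input_schema.

    Deliberately minimal: handles only the `required` and `maxLength`
    constraints that appear in the example contract above.
    """
    errors = []
    for field in schema.get("required", []):
        if field not in payload:
            errors.append(f"missing required field: {field}")
    for field, rules in schema.get("properties", {}).items():
        if field in payload and "maxLength" in rules:
            if len(payload[field]) > rules["maxLength"]:
                errors.append(f"{field} exceeds maxLength {rules['maxLength']}")
    return errors

# The input_schema from contract.json above.
schema = {
    "type": "object",
    "properties": {"text": {"type": "string", "maxLength": 5000}},
    "required": ["text"],
}
assert validate_input(schema, {"text": "great product"}) == []
assert validate_input(schema, {}) == ["missing required field: text"]
assert validate_input(schema, {"text": "x" * 6000}) == ["text exceeds maxLength 5000"]
```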
ThreadSync Core API
Enterprise data integration. Connect any system in minutes, not months.
Zero Config Integrations
Connect to Salesforce, SAP, or Workday in 3 lines of code. No middleware, no XML, no headaches.
Event Streaming
Poll sync status in real-time via REST. Webhook push delivery coming Q2 2026 for zero-polling event-driven architectures.
Enterprise Security
SOC 2 aligned, end-to-end encryption, and RBAC built-in. Pass your security review on day one.
Version Control
Full API versioning with 24-month deprecation windows. Your integrations won't break unexpectedly.
Built-in Observability
Distributed tracing, metrics, and logs out of the box. Debug issues in seconds, not hours.
Developer Support
Dedicated Slack channel, office hours, and engineers who actually understand your code.
Authentication
Three ways to authenticate with the ThreadSync Core API.
Sandbox API Key
For development and testing. Request a key from the dashboard — use it directly as a Bearer token.
Authorization: Bearer sk_test_...
Client Credentials
For production service-to-service. Exchange client_id + client_secret via POST /v1/auth/token for a short-lived JWT.
Authorization: Bearer eyJhbG...
Provider OAuth ROADMAP
For connecting Salesforce, SAP, Workday, etc. ThreadSync handles the OAuth flow — you provide the callback URL. In sandbox, connections return simulated "connected" status.
POST /v1/connections/oauth/start
The THREADSYNC_API_TOKEN environment variable used in SDK examples accepts either an API key or a client credentials JWT.
Core API Quick Start Guide
From zero to your first integration in under 5 minutes.
SDKs in preview on GitHub.
# 1. Authenticate and get your token
curl -X POST https://api.threadsync.io/v1/auth/token \
  -H "Content-Type: application/json" \
  -d '{"client_id": "YOUR_CLIENT_ID", "client_secret": "YOUR_SECRET"}'

# 2. Connect to Salesforce (source)
curl -X POST https://api.threadsync.io/v1/connections \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"provider": "salesforce"}'

# 3. Connect to Snowflake (destination)
curl -X POST https://api.threadsync.io/v1/connections \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"provider": "snowflake"}'

# 4. Sync contacts to your data warehouse
#    Use the connection IDs returned from steps 2 and 3
curl -X POST https://api.threadsync.io/v1/syncs \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "source": {"connection": "SF_CONNECTION_ID", "object": "Contact"},
    "destination": {"connection": "SNOWFLAKE_CONNECTION_ID", "table": "contacts"},
    "schedule": "realtime"
  }'
import { ThreadSync } from '@threadsync/sdk';

// Initialize the client
const ts = new ThreadSync({
  bearerToken: process.env.THREADSYNC_API_TOKEN
});

async function main() {
  // Connect to Salesforce
  const sf = await ts.connections.create('salesforce');

  // Connect to your data warehouse
  const dest = await ts.connections.create('snowflake');

  // Sync contacts in real-time
  const sync = await ts.sync.create({
    source: { connection: sf.id, object: 'Contact' },
    destination: { connection: dest.id, table: 'contacts' },
    schedule: 'realtime'
  });

  // Check sync status
  const status = await ts.sync.get(sync.id);
  console.log(`Synced ${status.recordsSynced} records`);
}

main();
import os

from threadsync import ThreadSync

# Initialize the client
ts = ThreadSync(bearer_token=os.environ["THREADSYNC_API_TOKEN"])

# Connect to Salesforce
sf = ts.connections.create(provider="salesforce")

# Connect to your data warehouse
dest = ts.connections.create(provider="snowflake")

# Create a real-time sync
sync = ts.sync.create(
    source={"connection": sf.id, "object": "Contact"},
    destination={"connection": dest.id, "table": "contacts"},
    schedule="realtime",
)

# Check sync status
status = ts.sync.get(sync.id)
print(f"Synced {status.records_synced} records")
package main

import (
	"fmt"
	"log"
	"os"

	"github.com/threadsync-infrastructure/threadsync-go"
)

func main() {
	// Initialize the client
	client := threadsync.New(os.Getenv("THREADSYNC_API_TOKEN"))

	// Connect to Salesforce
	sf, err := client.Connections.Create("salesforce", nil)
	if err != nil {
		log.Fatal(err)
	}

	// Connect to your data warehouse
	dest, err := client.Connections.Create("snowflake", nil)
	if err != nil {
		log.Fatal(err)
	}

	// Create a real-time sync
	sync, err := client.Sync.Create(&threadsync.SyncConfig{
		Source:      threadsync.Endpoint{Connection: sf.ID, Object: "Contact"},
		Destination: threadsync.Endpoint{Connection: dest.ID, Table: "contacts"},
		Schedule:    "realtime",
	})
	if err != nil {
		log.Fatal(err)
	}

	fmt.Printf("Sync created: %s\n", sync.ID)
}
import io.threadsync.sdk.ThreadSync;
import io.threadsync.sdk.models.*;

public class QuickStart {
    public static void main(String[] args) {
        // Initialize the client
        ThreadSync ts = ThreadSync.builder()
            .bearerToken(System.getenv("THREADSYNC_API_TOKEN"))
            .build();

        // Connect to Salesforce
        Connection sf = ts.connections().create("salesforce");

        // Connect to your data warehouse
        Connection dest = ts.connections().create("snowflake");

        // Create a real-time sync
        Sync sync = ts.sync().create(
            SyncConfig.builder()
                .source(new Endpoint(sf.getId(), "Contact"))
                .destination(new Endpoint(dest.getId(), "contacts"))
                .schedule("realtime")
                .build()
        );

        System.out.printf("Sync created: %s%n", sync.getId());
    }
}
using ThreadSync.Sdk;
using ThreadSync.Sdk.Models;

// Initialize the client
var ts = new ThreadSyncClient(
    bearerToken: Environment.GetEnvironmentVariable("THREADSYNC_API_TOKEN")
);

// Connect to Salesforce
var sf = await ts.Connections.CreateAsync("salesforce");

// Connect to your data warehouse
var dest = await ts.Connections.CreateAsync("snowflake");

// Create a real-time sync
var sync = await ts.Sync.CreateAsync(new SyncConfig
{
    Source = new Endpoint { Connection = sf.Id, Object = "Contact" },
    Destination = new Endpoint { Connection = dest.Id, Table = "contacts" },
    Schedule = "realtime"
});

Console.WriteLine($"Sync created: {sync.Id}");
require 'threadsync'

# Initialize the client
ts = ThreadSync::Client.new(
  bearer_token: ENV['THREADSYNC_API_TOKEN']
)

# Connect to Salesforce
sf = ts.connections.create(provider: 'salesforce')

# Connect to your data warehouse
dest = ts.connections.create(provider: 'snowflake')

# Create a real-time sync
sync = ts.sync.create(
  source: { connection: sf.id, object: 'Contact' },
  destination: { connection: dest.id, table: 'contacts' },
  schedule: 'realtime'
)

# Check sync status
status = ts.sync.get(sync.id)
puts "Synced #{status.records_synced} records"
<?php
use ThreadSync\Client;
use ThreadSync\Models\SyncConfig;

// Initialize the client
$ts = new Client(
    bearerToken: getenv('THREADSYNC_API_TOKEN')
);

// Connect to Salesforce
$sf = $ts->connections->create('salesforce');

// Connect to your data warehouse
$dest = $ts->connections->create('snowflake');

// Create a real-time sync
$sync = $ts->sync->create(new SyncConfig(
    source: ['connection' => $sf->id, 'object' => 'Contact'],
    destination: ['connection' => $dest->id, 'table' => 'contacts'],
    schedule: 'realtime'
));

echo "Sync created: {$sync->id}\n";
use std::env;

use threadsync::{Client, Endpoint, SyncConfig};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Initialize the client
    let ts = Client::new(&env::var("THREADSYNC_API_TOKEN")?);

    // Connect to Salesforce
    let sf = ts.connections().create("salesforce").await?;

    // Connect to your data warehouse
    let dest = ts.connections().create("snowflake").await?;

    // Create a real-time sync
    let sync = ts.sync().create(SyncConfig {
        source: Endpoint { connection: sf.id.clone(), object: "Contact".into() },
        destination: Endpoint { connection: dest.id.clone(), table: "contacts".into() },
        schedule: "realtime".into(),
    }).await?;

    println!("Sync created: {}", sync.id);
    Ok(())
}
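Until webhook push delivery ships, sync status is polled via REST, as noted in the feature list above. A minimal poll-until-done helper — `fetch_status` stands in for a call like `ts.sync.get(sync.id)`, and the `"running"`/`"completed"` status values are illustrative assumptions, not documented states:

```python
import time

def wait_for_sync(fetch_status, timeout_s=60.0, interval_s=0.0):
    """Poll a sync until it leaves the running state or times out.

    `fetch_status()` stands in for an SDK call such as ts.sync.get(id).
    The "running" status value is an assumption for this sketch; check
    the API reference for the real state machine.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        status = fetch_status()
        if status["status"] != "running":
            return status
        time.sleep(interval_s)
    raise TimeoutError("sync did not finish in time")

# Fake status source: running twice, then completed.
states = iter([
    {"status": "running"},
    {"status": "running"},
    {"status": "completed", "records_synced": 1200},
])
final = wait_for_sync(lambda: next(states))
assert final["status"] == "completed"
assert final["records_synced"] == 1200
```

A real caller would pass a nonzero `interval_s` to avoid hammering the API; injecting the fetch callable keeps the loop testable without network access.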
200+ Enterprise Connectors
Pre-built integrations for the systems you already use. New connectors added monthly.
Core API Reference
RESTful API with predictable resource-oriented URLs and standard HTTP response codes.
Authentication
POST /v1/auth/token
Exchange client credentials for a short-lived JWT. Send client_id and client_secret in the request body. No authentication header required.
Health
GET /v1/health
Health check endpoint. Returns API status, version, and environment. No authentication required.
Connections
POST /v1/connections
Create a new connection to an external system. Only provider is required; name is optional.
GET /v1/connections
List all connections with health status and last sync timestamps. Sandbox: health status and timestamps are simulated.
GET /v1/connections/:id
Get a single connection by ID, including provider, status, and timestamps. Sandbox: status is simulated.
DELETE /v1/connections/:id
Safely remove a connection. Associated syncs are paused automatically.
Syncs
POST /v1/syncs
Create a sync job between source and destination. Supports real-time, scheduled, or manual triggers.
GET /v1/syncs
List all sync jobs with current status, schedule, and record counts. Sandbox: record counts are simulated.
GET /v1/syncs/:id
Get sync details including status, records processed, errors, and throughput metrics. Sandbox: metrics are simulated.
POST /v1/syncs/:id/trigger
Manually trigger a sync job. Useful for testing or on-demand refreshes. Sandbox: completes instantly with simulated results.
Webhooks ROADMAP — Q2 2026
POST /v1/webhooks
Subscribe to events like sync.completed, connection.failed, or data.changed.
GET /v1/webhooks/:id/deliveries
View webhook delivery history with payloads and response codes for debugging.
Official SDKs
Open source SDKs for every major language. All v0.1.0 preview — install from GitHub. Contributions welcome.
Architecture & Security for Developers
How the APIs connect, and the security model that protects every request.
How the APIs Connect
OAuth 2.0 + PKCE
Server-side client credentials for backend services. Browser-safe PKCE flow for frontend apps. No secrets exposed to the client.
API Key Scoping
Scope keys to specific endpoints, rate tiers, and environments. Rotate keys without downtime via dual-key support.
Audit Logging
Every API call logged with timestamp, identity, action, and resource. Exportable to your SIEM via webhook or pull.
TLS 1.3 Everywhere
All traffic encrypted in transit. Certificate pinning available for enterprise deployments. No plaintext, ever.
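The browser-safe PKCE flow described above starts with a code verifier and its S256 challenge, which any client can generate with the standard library alone (per RFC 7636):

```python
import base64
import hashlib
import secrets

def make_pkce_pair():
    """Generate a PKCE code_verifier and its S256 code_challenge (RFC 7636).

    The verifier is 32 random bytes, base64url-encoded without padding;
    the challenge is the base64url-encoded SHA-256 digest of the verifier.
    """
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    digest = hashlib.sha256(verifier.encode()).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge

verifier, challenge = make_pkce_pair()
assert len(verifier) == 43           # 32 bytes -> 43 unpadded base64url chars
assert "=" not in challenge          # padding must be stripped per the spec
```

The client sends the challenge with the authorization request and the verifier with the token exchange; no secret ever reaches the browser.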
Rate Limits & Tiers
Generous limits that scale with your needs. No surprise throttling.
| Tier | Requests/min | Connections | Active Syncs | Support |
|---|---|---|---|---|
| Starter | 1,000 | 5 | 3 | |
| Professional | 10,000 | 25 | 25 | Priority |
| Enterprise | Unlimited | Unlimited | Unlimited | Dedicated |

Sandbox environment: 60 requests/min across all tiers. Production limits apply after launch.
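When a client does hit its tier limit, the conventional response is to back off and retry. A minimal exponential-backoff sketch — the HTTP 429 status code is the standard rate-limit signal, not something this page specifies, and `send`/`sleep` are injected so the policy is testable without a live endpoint:

```python
def request_with_backoff(send, sleep, max_attempts=4, base_delay=0.5):
    """Retry on HTTP 429 with exponential backoff.

    `send()` returns a (status_code, body) pair; `sleep` is injected so
    the policy is testable. The delay doubles after each throttled attempt.
    """
    for attempt in range(max_attempts):
        status, body = send()
        if status != 429:
            return status, body
        if attempt < max_attempts - 1:
            sleep(base_delay * (2 ** attempt))
    return status, body

# Fake endpoint: throttled twice, then succeeds.
responses = iter([(429, None), (429, None), (200, {"ok": True})])
delays = []
status, body = request_with_backoff(lambda: next(responses), delays.append)
assert status == 200 and body == {"ok": True}
assert delays == [0.5, 1.0]  # delay doubled between throttled attempts
```

Pair this with the Idempotency-Key pattern from the gateway section so retried writes stay safe.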
Ready to Build?
Start with the API that fits your use case. Sandbox access within 24 hours.
Questions? Email us at support@threadsync.io
