Upload Pipeline

Upload data with a single call. The SDK selects providers and handles multi-copy replication automatically:

```ts
import { Synapse } from "@filoz/synapse-sdk"
import { privateKeyToAccount } from "viem/accounts"

const synapse = Synapse.create({ account: privateKeyToAccount('0x...'), source: 'my-app' })

const data = new Uint8Array([1, 2, 3, 4, 5])
const { pieceCid, size, complete, copies, failedAttempts } = await synapse.storage.upload(data)

console.log("PieceCID:", pieceCid.toString())
console.log("Size:", size, "bytes")
console.log("Stored on", copies.length, "providers")
for (const copy of copies) {
  console.log(`  Provider ${copy.providerId}: role=${copy.role}, dataSet=${copy.dataSetId}`)
}
if (!complete) {
  console.warn("Some copies failed:", failedAttempts)
}
```

The result contains:

- `complete` - `true` when all requested copies were stored and committed on-chain. This is the primary field to check.
- `requestedCopies` - the number of copies that were requested (default: 2)
- `pieceCid` - the content address of your data, used for downloads
- `size` - size of the uploaded data in bytes
- `copies` - array of successful copies, each with `providerId`, `dataSetId`, `pieceId`, `role` (`'primary'` or `'secondary'`), `retrievalUrl`, and `isNewDataSet`
- `failedAttempts` - providers that were tried but did not produce a copy. The SDK retries failed secondaries with alternate providers, so a non-empty array often just means a provider was swapped out. These entries are diagnostic; check `complete` for the actual outcome.
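The field list above can be modeled as a rough TypeScript sketch. The shapes below are inferred from this page, not copied from the SDK's own declarations, so treat the exact types as assumptions (`pieceCid` is a `PieceLink` in the SDK; a plain string stands in here):

```typescript
// Rough sketch of the upload result, inferred from the field list above.
// The SDK's real UploadResult / CopyResult / FailedAttempt types may differ.
type CopyRole = "primary" | "secondary"

interface CopySketch {
  providerId: bigint
  dataSetId: bigint
  pieceId: bigint
  role: CopyRole
  retrievalUrl: string
  isNewDataSet: boolean
}

interface FailedAttemptSketch {
  providerId: bigint
  role: CopyRole
  error: string
}

interface UploadResultSketch {
  complete: boolean
  requestedCopies: number
  pieceCid: string // PieceLink in the SDK; a string stands in here
  size: number
  copies: CopySketch[]
  failedAttempts: FailedAttemptSketch[]
}

// Summarize an outcome the way this page recommends: check `complete` first.
function summarize(r: UploadResultSketch): string {
  return r.complete
    ? `all ${r.requestedCopies} copies committed`
    : `partial: ${r.copies.length}/${r.requestedCopies} copies committed`
}
```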

Attach metadata to organize uploads. The SDK reuses existing data sets when metadata matches, avoiding duplicate payment rails:

```ts
import { Synapse } from "@filoz/synapse-sdk"
import { privateKeyToAccount } from "viem/accounts"

const synapse = Synapse.create({ account: privateKeyToAccount('0x...'), source: 'my-app' })

const data = new TextEncoder().encode("Hello, Filecoin!")
const result = await synapse.storage.upload(data, {
  metadata: {
    Application: "My DApp",
    Version: "1.0.0",
    Category: "Documents",
  },
  pieceMetadata: {
    filename: "hello.txt",
    contentType: "text/plain",
  },
})

console.log("Uploaded:", result.pieceCid.toString())
```

Subsequent uploads with the same metadata reuse the same data sets and payment rails.

Adjust the number of copies for your durability requirements:

```ts
import { Synapse } from "@filoz/synapse-sdk"
import { privateKeyToAccount } from "viem/accounts"

const synapse = Synapse.create({ account: privateKeyToAccount("0x..."), source: 'my-app' })
const data = new Uint8Array(256)

// Store 3 copies for higher redundancy
const result3 = await synapse.storage.upload(data, { copies: 3 })
console.log("3 copies:", result3.copies.length)

// Store a single copy when redundancy isn't needed
const result1 = await synapse.storage.upload(data, { copies: 1 })
console.log("1 copy:", result1.copies.length)
```

The default is 2 copies. The first copy is stored on an endorsed provider (high trust, curated), and secondary copies are pulled via SP-to-SP transfer from approved providers.

Track the lifecycle of a multi-copy upload with callbacks:

```ts
import { Synapse } from "@filoz/synapse-sdk"
import { privateKeyToAccount } from "viem/accounts"

const synapse = Synapse.create({ account: privateKeyToAccount("0x..."), source: 'my-app' })
const data = new Uint8Array(1024) // 1KB of data

const result = await synapse.storage.upload(data, {
  callbacks: {
    onStored: (providerId, pieceCid) => {
      console.log(`Data stored on provider ${providerId}`)
    },
    onCopyComplete: (providerId, pieceCid) => {
      console.log(`Secondary copy complete on provider ${providerId}`)
    },
    onCopyFailed: (providerId, pieceCid, error) => {
      console.warn(`Copy failed on provider ${providerId}:`, error.message)
    },
    onPullProgress: (providerId, pieceCid, status) => {
      console.log(`Pull to provider ${providerId}: ${status}`)
    },
    onPiecesAdded: (txHash, providerId, pieces) => {
      console.log(`On-chain commit submitted: ${txHash}`)
    },
    onPiecesConfirmed: (dataSetId, providerId, pieces) => {
      console.log(`Confirmed on-chain: dataSet=${dataSetId}, provider=${providerId}`)
    },
    onProgress: (bytesUploaded) => {
      console.log(`Uploaded ${bytesUploaded} bytes`)
    },
  },
})
```

Callback lifecycle:

  1. `onProgress` - fires during upload to the primary provider
  2. `onStored` - primary upload complete, piece parked on the SP
  3. `onPullProgress` - SP-to-SP transfer status for secondaries
  4. `onCopyComplete` / `onCopyFailed` - secondary pull result
  5. `onPiecesAdded` - commit transaction submitted
  6. `onPiecesConfirmed` - commit confirmed on-chain

`upload()` is designed around partial success rather than atomicity: it commits whatever succeeded instead of throwing away successful work. The return value is therefore the primary interface for understanding what happened.

`upload()` only throws in these cases:

| Error | What happened | What to do |
| --- | --- | --- |
| `StoreError` | Primary upload failed | Retry the upload |
| `CommitError` | Data is stored on providers but all on-chain commits failed | Use split operations to retry `commit()` without re-uploading |
| Selection error | No endorsed provider available or reachable | Check provider health / network |
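The dispatch logic implied by the table can be sketched as follows. The stand-in error classes below only illustrate the pattern; in real code, verify the SDK's actual exported error names and import them from the package rather than redefining them:

```typescript
// Stand-in error classes for illustration only; the real StoreError and
// CommitError (names taken from the table above) would come from the SDK.
class StoreError extends Error {}
class CommitError extends Error {}

// Map a thrown error to the recovery action the table recommends.
function recoveryAction(err: unknown): "retry-upload" | "retry-commit" | "check-providers" {
  if (err instanceof StoreError) return "retry-upload"   // primary upload failed
  if (err instanceof CommitError) return "retry-commit"  // data parked on SPs; retry commit() only
  return "check-providers"                               // selection / network issue
}
```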

If `upload()` returns without throwing, at least one copy is committed on-chain, but the result may contain fewer copies than requested. Every entry in `copies[]` represents a committed on-chain data set that the user is now paying for.

```ts
import { Synapse } from "@filoz/synapse-sdk"
import { privateKeyToAccount } from "viem/accounts"

const synapse = Synapse.create({ account: privateKeyToAccount("0x..."), source: 'my-app' })
const data = new Uint8Array(256)

const result = await synapse.storage.upload(data, { copies: 2 })

// Check overall success: complete === true means all requested copies succeeded
if (!result.complete) {
  console.warn(`Only ${result.copies.length}/${result.requestedCopies} copies succeeded`)
  for (const attempt of result.failedAttempts) {
    console.warn(`  Provider ${attempt.providerId} (${attempt.role}): ${attempt.error}`)
  }
}

// Every copy is committed and being paid for
for (const copy of result.copies) {
  console.log(`Provider ${copy.providerId}, dataset ${copy.dataSetId}, piece ${copy.pieceId}`)
}
```

For auto-selected providers (no explicit `providerIds` or `dataSetIds`), the SDK automatically retries failed secondaries with alternate providers up to 5 times. If you explicitly specify providers, the SDK respects your choice and does not retry.

| | `upload()` | Split Operations |
| --- | --- | --- |
| Control | Automatic | Manual per-phase |
| Error recovery | Re-upload on commit failure | Retry commit without re-upload |
| Batch files | One call per file | Store many, commit in batch |
| Wallet prompts | Managed internally | Control via `presignForCommit()` |
| Best for | Most use cases | Production pipelines, custom UX |

Every upload goes through three phases:

```
store --> pull --> commit
  |        |         |
  |        |         +-- On-chain: create dataset, add piece, start payments
  |        +-- SP-to-SP: secondary provider fetches from primary
  +-- Upload: bytes sent to one provider (no on-chain state yet)
```

- `store`: upload bytes to a single SP. Returns `{ pieceCid, size }`. The piece is “parked” on the SP but not yet on-chain, and is subject to garbage collection if not committed.
- `pull`: SP-to-SP transfer. The destination SP fetches the piece from a source SP. No client bandwidth is used.
- `commit`: submit an on-chain transaction to add the piece to a data set. Creates the data set and payment rail if needed.

Upload data to a provider without committing on-chain:

```ts
const contexts = await synapse.storage.createContexts({
  copies: 2,
})
const [primary, secondary] = contexts

const { pieceCid, size } = await primary.store(data, {
  pieceCid: preCalculatedCid,     // skip expensive PieceCID (hash digest) calculation (optional)
  signal: abortController.signal, // cancellation (optional)
  onProgress: (bytes) => {        // progress callback (optional)
    console.log(`Uploaded ${bytes} bytes`)
  },
})
console.log(`Stored: ${pieceCid}, ${size} bytes`)
```

`store()` accepts `Uint8Array` or `ReadableStream<Uint8Array>`. Use streaming for large files to minimize memory usage.
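A stream can come from any web-platform source. The sketch below builds one from a `Blob` and drains it just to show the plumbing; `primary` is the storage context from the example above and is not defined here, so the actual SDK call is left as a comment:

```typescript
// Count the bytes flowing through a ReadableStream<Uint8Array>, chunk by chunk.
async function byteLength(stream: ReadableStream<Uint8Array>): Promise<number> {
  const reader = stream.getReader()
  let total = 0
  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    total += value.byteLength
  }
  return total
}

// In a browser, File/Blob already expose .stream(); a Blob stands in for a
// large file here, so nothing is buffered as one giant Uint8Array.
const big = new Blob([new Uint8Array(1024 * 1024)]) // 1 MiB placeholder payload

// With the SDK, the stream would be passed straight to store():
//   const { pieceCid, size } = await primary.store(big.stream())

console.log(`payload carries ${await byteLength(big.stream())} bytes`)
```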

After `store()` completes, the piece is parked on the SP and can be:

- retrieved via the context’s `getPieceUrl(pieceCid)`
- pulled to other providers via `pull()`
- committed on-chain via `commit()`

Request a secondary provider to fetch pieces from the primary:

```ts
// Pre-sign to avoid double wallet prompts during pull + commit
const extraData = await secondary.presignForCommit([{ pieceCid }])

const pullResult = await secondary.pull({
  pieces: [pieceCid],
  from: (cid) => primary.getPieceUrl(cid), // source URL builder (or URL string)
  extraData,                               // pre-signed auth (optional, reused for commit)
  signal: abortController.signal,          // cancellation (optional)
  onProgress: (cid, status) => {           // status callback (optional)
    console.log(`${cid}: ${status}`)
  },
})

if (pullResult.status !== "complete") {
  for (const piece of pullResult.pieces) {
    if (piece.status === "failed") {
      console.error(`Failed to pull ${piece.pieceCid}`)
    }
  }
}
```

The `from` parameter accepts either a URL string (the base service URL) or a function that returns a piece URL for a given `PieceCID`.

Pre-signing: `presignForCommit()` creates an EIP-712 signature that can be reused for both `pull()` and `commit()`, which avoids prompting the wallet twice. Pass the same `extraData` to both calls.

Add pieces to an on-chain data set. Creates the data set and payment rail if one doesn’t exist:

```ts
// Commit on both providers
const [primaryCommit, secondaryCommit] = await Promise.allSettled([
  primary.commit({
    pieces: [{ pieceCid, pieceMetadata: { filename: "doc.pdf" } }],
    onSubmitted: (txHash) => {
      console.log(`Transaction submitted: ${txHash}`)
    },
  }),
  secondary.commit({
    pieces: [{ pieceCid, pieceMetadata: { filename: "doc.pdf" } }],
    extraData, // pre-signed auth from presignForCommit() (optional)
    onSubmitted: (txHash) => {
      console.log(`Transaction submitted: ${txHash}`)
    },
  }),
])

if (primaryCommit.status === "fulfilled") {
  console.log(`Primary: dataSet=${primaryCommit.value.dataSetId}`)
}
if (secondaryCommit.status === "fulfilled") {
  console.log(`Secondary: dataSet=${secondaryCommit.value.dataSetId}`)
}
```

The result:

- `txHash` - the transaction hash
- `pieceIds` - assigned piece IDs (one per input piece)
- `dataSetId` - the data set ID (may be newly created)
- `isNewDataSet` - whether a new data set was created
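Since the assigned piece IDs are positional (one per input piece), pairing them back to the CIDs you committed is straightforward. A hedged sketch with a stand-in result type, field names taken from the list above:

```typescript
// Stand-in for the SDK's commit result; field names follow the list above,
// but the SDK's actual CommitResult type may differ.
interface CommitResultSketch {
  txHash: `0x${string}`
  pieceIds: bigint[] // one per input piece, in input order
  dataSetId: bigint
  isNewDataSet: boolean
}

// Pair each committed pieceCid with its assigned on-chain piece ID.
function pairPieceIds(cids: string[], result: CommitResultSketch): Array<[string, bigint]> {
  if (cids.length !== result.pieceIds.length) {
    throw new Error("expected one piece ID per input piece")
  }
  return cids.map((cid, i) => [cid, result.pieceIds[i]] as [string, bigint])
}
```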

Upload multiple files to 2 providers with full error handling:

```ts
import { Synapse, type PieceCID } from "@filoz/synapse-sdk"
import { privateKeyToAccount } from "viem/accounts"

const synapse = Synapse.create({ account: privateKeyToAccount("0x.."), source: "my-app" })

const files = [
  new TextEncoder().encode("File 1 content..."),
  new TextEncoder().encode("File 2 content..."),
  new TextEncoder().encode("File 3 content..."),
]

// Create contexts for 2 providers
const [primary, secondary] = await synapse.storage.createContexts({
  copies: 2,
  metadata: { source: "batch-upload" },
})

// Store all files on primary (note: these could be done in parallel w/ Promise.all)
const stored: { pieceCid: PieceCID; size: number }[] = []
for (const file of files) {
  const result = await primary.store(file)
  stored.push(result)
  console.log(`Stored ${result.pieceCid}`)
}

// Pre-sign for all pieces on secondary
const pieceCids = stored.map(s => s.pieceCid)
const extraData = await secondary.presignForCommit(pieceCids.map(pieceCid => ({ pieceCid })))
```
}[]

Calls a defined callback function on each element of an array, and returns an array that contains the results.

@paramcallbackfn A function that accepts up to three arguments. The map method calls the callbackfn function one time for each element in the array.

@paramthisArg An object to which the this keyword can refer in the callbackfn function. If thisArg is omitted, undefined is used as the this value.

map
(
cid: PieceLink
cid
=> ({
pieceCid: PieceLink
pieceCid
:
cid: PieceLink
cid
}))
)
// Pull all pieces to secondary
const
const pullResult: PullResult
pullResult
= await
const secondary: StorageContext
secondary
.
StorageContext.pull(options: PullOptions): Promise<PullResult>
pull
({
PullOptions.pieces: PieceLink[]
pieces
:
const pieceCids: PieceLink[]
pieceCids
,
PullOptions.from: PullSource
from
: (
cid: PieceLink
cid
) =>
const primary: StorageContext
primary
.
StorageContext.getPieceUrl(pieceCid: PieceCID): string
getPieceUrl
(
cid: PieceLink
cid
),
PullOptions.extraData?: `0x${string}` | undefined
extraData
,
})
// Commit on both providers
const [
const primaryCommit: PromiseSettledResult<CommitResult>
primaryCommit
,
const secondaryCommit: PromiseSettledResult<CommitResult>
secondaryCommit
] = await
var Promise: PromiseConstructor

Represents the completion of an asynchronous operation

Promise
.
PromiseConstructor.allSettled<[Promise<CommitResult>, Promise<CommitResult>]>(values: [Promise<CommitResult>, Promise<CommitResult>]): Promise<[PromiseSettledResult<CommitResult>, PromiseSettledResult<CommitResult>]> (+1 overload)

Creates a Promise that is resolved with an array of results when all of the provided Promises resolve or reject.

@paramvalues An array of Promises.

@returnsA new Promise.

allSettled
([
const primary: StorageContext
primary
.
StorageContext.commit(options: CommitOptions): Promise<CommitResult>
commit
({
CommitOptions.pieces: {
pieceCid: PieceCID;
pieceMetadata?: MetadataObject;
}[]
pieces
:
const pieceCids: PieceLink[]
pieceCids
.
Array<PieceLink>.map<{
pieceCid: PieceLink;
}>(callbackfn: (value: PieceLink, index: number, array: PieceLink[]) => {
pieceCid: PieceLink;
}, thisArg?: any): {
pieceCid: PieceLink;
}[]

Calls a defined callback function on each element of an array, and returns an array that contains the results.

@paramcallbackfn A function that accepts up to three arguments. The map method calls the callbackfn function one time for each element in the array.

@paramthisArg An object to which the this keyword can refer in the callbackfn function. If thisArg is omitted, undefined is used as the this value.

map
(
cid: PieceLink
cid
=> ({
pieceCid: PieceLink
pieceCid
:
cid: PieceLink
cid
})) }),
const pullResult: PullResult
pullResult
.
PullResult.status: "complete" | "failed"
status
=== "complete"
?
const secondary: StorageContext
secondary
.
StorageContext.commit(options: CommitOptions): Promise<CommitResult>
commit
({
CommitOptions.pieces: {
pieceCid: PieceCID;
pieceMetadata?: MetadataObject;
}[]
pieces
:
const pieceCids: PieceLink[]
pieceCids
.
Array<PieceLink>.map<{
pieceCid: PieceLink;
}>(callbackfn: (value: PieceLink, index: number, array: PieceLink[]) => {
pieceCid: PieceLink;
}, thisArg?: any): {
pieceCid: PieceLink;
}[]

Calls a defined callback function on each element of an array, and returns an array that contains the results.

@paramcallbackfn A function that accepts up to three arguments. The map method calls the callbackfn function one time for each element in the array.

@paramthisArg An object to which the this keyword can refer in the callbackfn function. If thisArg is omitted, undefined is used as the this value.

map
(
cid: PieceLink
cid
=> ({
pieceCid: PieceLink
pieceCid
:
cid: PieceLink
cid
})),
CommitOptions.extraData?: `0x${string}` | undefined
extraData
})
:
var Promise: PromiseConstructor

Represents the completion of an asynchronous operation

Promise
.
PromiseConstructor.reject<never>(reason?: any): Promise<never>

Creates a new rejected promise for the provided reason.

@paramreason The reason the promise was rejected.

@returnsA new rejected Promise.

reject
(new
var Error: ErrorConstructor
new (message?: string, options?: ErrorOptions) => Error (+1 overload)
Error
("Pull failed, skipping secondary commit")), // not advised!
])
if (
const primaryCommit: PromiseSettledResult<CommitResult>
primaryCommit
.
status: "rejected" | "fulfilled"
status
=== "fulfilled") {
var console: Console
console
.
Console.log(...data: any[]): void

The console.log() static method outputs a message to the console.

MDN Reference

log
(`Primary: dataSet=${
const primaryCommit: PromiseFulfilledResult<CommitResult>
primaryCommit
.
PromiseFulfilledResult<CommitResult>.value: CommitResult
value
.
CommitResult.dataSetId: bigint
dataSetId
}`)
}
if (
const secondaryCommit: PromiseSettledResult<CommitResult>
secondaryCommit
.
status: "rejected" | "fulfilled"
status
=== "fulfilled") {
var console: Console
console
.
Console.log(...data: any[]): void

The console.log() static method outputs a message to the console.

MDN Reference

log
(`Secondary: dataSet=${
const secondaryCommit: PromiseFulfilledResult<CommitResult>
secondaryCommit
.
PromiseFulfilledResult<CommitResult>.value: CommitResult
value
.
CommitResult.dataSetId: bigint
dataSetId
}`)
}

Each phase’s errors are independent. Failures don’t cascade, and you can retry at any level:

| Phase  | Failure                     | Data state                  | Recovery                                       |
| ------ | --------------------------- | --------------------------- | ---------------------------------------------- |
| store  | Upload/network error        | No data on SP               | Retry store() with same or different context   |
| pull   | SP-to-SP transfer failed    | Data on primary only        | Retry pull(), try different secondary, or skip |
| commit | On-chain transaction failed | Data on SP but not on-chain | Retry commit() (no re-upload needed)           |

The key advantage of split operations: if commit fails, data is already stored on the SP. You can retry commit() without re-uploading the data. With the high-level upload(), a CommitError would require re-uploading.
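Because the data is already on the SP, retrying a failed commit is just a matter of re-sending the transaction. A minimal sketch of how that retry could be wrapped; the `retry` helper, its parameters, and the backoff schedule are illustrative, not part of the SDK:

```typescript
// Illustrative retry helper with exponential backoff -- not part of
// @filoz/synapse-sdk. Repeats only the failed async operation.
async function retry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 1_000
): Promise<T> {
  let lastError: unknown
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn()
    } catch (err) {
      lastError = err
      // Back off before the next attempt: baseDelayMs, 2x, 4x, ...
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** i))
    }
  }
  throw lastError
}

// Usage sketch: only the on-chain commit is repeated; the pieces stay
// on the SP, so no data is re-uploaded.
// const commitResult = await retry(() =>
//   secondary.commit({
//     pieces: pieceCids.map((cid) => ({ pieceCid: cid })),
//     extraData,
//   })
// )
```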

  • Storage Operations - Data set management, retrieval, downloads, and lifecycle operations.

  • Storage Costs - Calculate your monthly costs and understand funding requirements.

  • Synapse Core - Use the core library directly for maximum control over provider selection, uploads, and SP-to-SP transfers.