attestAI

const result = await fidemark.attestAI({
  content,     // string | Uint8Array: the AI output
  modelId,     // string: model identifier
  provider,    // string: provider name
  prompt,      // optional, hashed by the SDK to bytes32
  promptHash,  // optional, pre-computed bytes32 hex (mutually exclusive with prompt)
  parameters,  // optional, object or string
  refUID,      // optional, parent attestation UID
});

Returns the same shape as attestHuman: the attestation UID, the transaction hash, and a verify URL.

| Field | Type | Notes |
| --- | --- | --- |
| content | bytes / string | The AI output. Hashed with SHA-256. |
| modelId | string | Use the canonical model ID, e.g. claude-sonnet-4-6, gpt-4o, llama-3.1-70b. |
| provider | string | E.g. anthropic, openai, google, local. |
| prompt | bytes / string | Optional plaintext prompt. The SDK hashes it to bytes32; the prompt itself never goes on chain. |
| promptHash | string | Pre-computed prompt hash (0x + 32 bytes hex). For pipelines that hash off-band. Mutually exclusive with prompt. |
| parameters | object / string | Generation parameters. Objects are JSON-stringified by the SDK. |
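The serialization rule for parameters can be sketched as follows. This is a hedged sketch, not the SDK's actual implementation: it assumes the SDK does a plain JSON.stringify on object values and passes strings through unchanged.

```typescript
// Hypothetical sketch of parameter normalization (assumption: the SDK
// JSON-stringifies object-valued parameters and passes strings through as-is).
function normalizeParameters(params: object | string): string {
  return typeof params === "string" ? params : JSON.stringify(params);
}
```

Under this assumption, `normalizeParameters({ temperature: 0.7 })` produces the string `{"temperature":0.7}`, so passing an equivalent pre-serialized string yields the same attested value.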

If neither prompt nor promptHash is supplied, promptHash is recorded as zero: useful for agent flows where there’s no single discrete prompt.

The schema stores promptHash (32 bytes), not the prompt itself. This proves which prompt produced the output without disclosing it:

  • An enterprise can attest to its outputs without leaking proprietary prompt logic.
  • A researcher can later prove they used a specific prompt by revealing it and matching the hash.
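
The reveal-and-match flow can be sketched with Node's built-in crypto module. The exact hashing convention here (SHA-256 over UTF-8 bytes, 0x-prefixed hex) is an assumption mirroring the SHA-256 content hashing described above, not a documented guarantee of the SDK:

```typescript
import { createHash } from "node:crypto";

// Hash a plaintext prompt to a bytes32 hex string (assumption: SHA-256
// over UTF-8 bytes, 0x-prefixed, matching the SDK's content hashing).
function hashPrompt(prompt: string): string {
  return "0x" + createHash("sha256").update(prompt, "utf8").digest("hex");
}

// Reveal-and-verify: recompute the hash of a disclosed prompt and compare
// it to the promptHash recorded in the attestation.
function verifyRevealedPrompt(revealed: string, attestedHash: string): boolean {
  return hashPrompt(revealed) === attestedHash.toLowerCase();
}
```

The prompt stays private until its author chooses to reveal it; anyone can then recompute the hash and check it against the on-chain value.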

A complete, copy-pasteable, runnable example. See Configuration for every constructor option.

import { Fidemark, getNetwork } from "@fidemark/sdk";

const fidemark = new Fidemark({
  network: getNetwork("base-sepolia"),
  privateKey: process.env.PRIVATE_KEY,
});

const ai = await fidemark.attestAI({
  content: response.text,
  modelId: "claude-sonnet-4-6",
  provider: "anthropic",
  prompt: userPrompt,
  parameters: { temperature: 0.7, maxTokens: 4096 },
});
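
For pipelines that hash off-band, a hedged sketch of the promptHash path: compute the bytes32 hash yourself and pass promptHash instead of prompt, so the plaintext never reaches the SDK. The hashing convention (SHA-256 over UTF-8 bytes) is an assumption, and AttestAIClient is a hypothetical structural type standing in for the Fidemark instance, used here to keep the sketch self-contained:

```typescript
import { createHash } from "node:crypto";

// Off-band hashing (assumption: SHA-256 over UTF-8 bytes, 0x-prefixed hex).
function hashPromptOffBand(prompt: string): string {
  return "0x" + createHash("sha256").update(prompt, "utf8").digest("hex");
}

// Hypothetical structural type covering only the one call we make,
// so this sketch does not depend on importing the SDK.
interface AttestAIClient {
  attestAI(input: {
    content: string;
    modelId: string;
    provider: string;
    promptHash: string;
  }): Promise<unknown>;
}

// Attest with a pre-computed hash; the plaintext prompt never leaves this scope.
async function attestWithPrecomputedHash(
  client: AttestAIClient,
  content: string,
  prompt: string,
) {
  return client.attestAI({
    content,
    modelId: "claude-sonnet-4-6",
    provider: "anthropic",
    promptHash: hashPromptOffBand(prompt),
  });
}
```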