# attestAI
## Signature

```ts
const result = await fidemark.attestAI({
  content,     // string | Uint8Array: the AI output
  modelId,     // string: model identifier
  provider,    // string: provider name
  prompt,      // optional, hashed by the SDK to bytes32
  promptHash,  // optional, pre-computed bytes32 hex (mutually exclusive with prompt)
  parameters,  // optional, object or string
  refUID,      // optional, parent attestation UID
});
```

```python
result = fidemark.attest_ai(AttestAIInput(
    content=...,
    model_id="...",
    provider="...",
    prompt=None,       # str | bytes (hashed for you)
    prompt_hash=None,  # pre-computed 0x... 32-byte hex (mutually exclusive)
    parameters=None,   # dict or str
    ref_uid=None,
))
```

```go
res, err := client.AttestAI(ctx, fidemark.AttestAIInput{
    Content:    []byte(...),
    ModelID:    "...",
    Provider:   "...",
    Prompt:     []byte("..."),  // optional, hashed for you
    PromptHash: "",             // optional pre-computed hex (mutually exclusive)
    Parameters: `{"temperature":0.7}`,
    RefUID:     "",
})
```

Returns the same UID + tx hash + verify URL shape as `attestHuman`.
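As an illustrative sketch only: the field names below (`uid`, `txHash`, `verifyUrl`) are hypothetical placeholders inferred from "UID + tx hash + verify URL" above, not the confirmed SDK API; check the `attestHuman` reference for the real names.

```ts
// Hypothetical result shape; field names are placeholders, not confirmed API.
interface AttestResult {
  uid: string;       // attestation UID
  txHash: string;    // hash of the transaction that recorded the attestation
  verifyUrl: string; // link where anyone can check the attestation
}
```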
## Inputs

| Field | Type | Notes |
|---|---|---|
| `content` | bytes / string | The AI output. Hashed with SHA-256. |
| `modelId` | string | Use the canonical model ID, e.g. `claude-sonnet-4-6`, `gpt-4o`, `llama-3.1-70b`. |
| `provider` | string | E.g. `anthropic`, `openai`, `google`, `local`. |
| `prompt` | bytes / string | Optional plaintext prompt. The SDK hashes it to bytes32; the prompt itself never goes on chain. |
| `promptHash` | string | Pre-computed prompt hash (`0x` + 32 bytes hex). For pipelines that hash off-band. Mutually exclusive with `prompt`. |
| `parameters` | object / string | Generation parameters. Objects are JSON-stringified by the SDK. |
If neither `prompt` nor `promptHash` is supplied, `promptHash` is recorded as zero: useful for agent flows where there is no single discrete prompt.
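For pipelines that hash off-band, here is a minimal TypeScript sketch of computing `promptHash` yourself. It assumes prompts are hashed with SHA-256, the same function documented for `content` above; confirm that against your SDK version before relying on it. `secretPrompt` and the shorthand fields are placeholders.

```ts
import { createHash } from "node:crypto";

// Assumption: prompts use SHA-256, like `content` above. Verify against
// your SDK version before hashing off-band.
function hashPrompt(prompt: string): string {
  const digest = createHash("sha256").update(prompt, "utf8").digest("hex");
  return `0x${digest}`; // 0x + 32 bytes of hex, the documented promptHash format
}

// Pass promptHash instead of prompt; the two are mutually exclusive.
const ai = await fidemark.attestAI({
  content,
  modelId,
  provider,
  promptHash: hashPrompt(secretPrompt), // the plaintext never leaves your pipeline
});
```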
## Why hash the prompt?

The schema stores `promptHash` (32 bytes), not the prompt itself. This proves which prompt produced the output without disclosing it:
- An enterprise can attest to its outputs without leaking proprietary prompt logic.
- A researcher can later prove they used a specific prompt by revealing it and matching the hash (see the sketch below).
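A minimal sketch of that reveal-and-match check, again assuming SHA-256 prompt hashing (see the note above). How you read the attested `promptHash` back from the chain is SDK-specific and not shown here.

```ts
import { createHash } from "node:crypto";

// Returns true if the revealed prompt hashes to the attested promptHash.
// Assumption: SHA-256, as in the sketch above.
function matchesPromptHash(revealedPrompt: string, attestedPromptHash: string): boolean {
  const digest = "0x" + createHash("sha256").update(revealedPrompt, "utf8").digest("hex");
  return digest.toLowerCase() === attestedPromptHash.toLowerCase();
}
```

Anyone holding the revealed prompt can run this check against the on-chain record.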
## Example

Complete, copy-pasteable, and runnable. See Configuration for every constructor option.
```ts
import { Fidemark, getNetwork } from "@fidemark/sdk";

const fidemark = new Fidemark({
  network: getNetwork("base-sepolia"),
  privateKey: process.env.PRIVATE_KEY,
});

const ai = await fidemark.attestAI({
  content: response.text,
  modelId: "claude-sonnet-4-6",
  provider: "anthropic",
  prompt: userPrompt,
  parameters: { temperature: 0.7, maxTokens: 4096 },
});
```

```python
import os

from fidemark import AttestAIInput, Fidemark, get_network

fidemark = Fidemark(
    network=get_network("base-sepolia"),
    private_key=os.environ["PRIVATE_KEY"],
)

ai = fidemark.attest_ai(AttestAIInput(
    content=response_text,
    model_id="claude-sonnet-4-6",
    provider="anthropic",
    prompt=user_prompt,
    parameters={"temperature": 0.7, "maxTokens": 4096},
))
```

```go
import (
    "context"
    "log"
    "os"

    "github.com/fidemark/sdk-go/fidemark"
)

network, _ := fidemark.GetNetwork("base-sepolia")
client, err := fidemark.New(fidemark.Config{
    Network:    network,
    PrivateKey: os.Getenv("PRIVATE_KEY"),
})
if err != nil {
    log.Fatal(err)
}

ai, err := client.AttestAI(context.Background(), fidemark.AttestAIInput{
    Content:    []byte(responseText),
    ModelID:    "claude-sonnet-4-6",
    Provider:   "anthropic",
    Prompt:     []byte(userPrompt),
    Parameters: `{"temperature":0.7,"maxTokens":4096}`,
})
```