Custom Connections Guide | Import Any CSV Into Clams

Every exchange exports CSVs differently — different column names, different date formats, different ways of encoding amounts. The Custom connection lets you define a JSON mapping that tells Clams how to read your columns, then reuse it for every future import.

There are two ways to get started:

  • Use an LLM: paste the prompt below into ChatGPT, Claude, or any LLM along with a sample of your CSV. It will generate a mapping.json that you save and use with the Clams CLI. This is the fastest path.
  • Do it yourself: follow the step-by-step walkthrough below to understand the mapping format and build one by hand.

LLM Quickstart

Copy the prompt below and paste it into ChatGPT, Claude, or whichever LLM you prefer. Along with the prompt, include a sample of your CSV; the header row and a few representative rows are enough. You don't need to share your full transaction history to build a mapping.

Important: Make sure your sample includes at least one row for every transaction type in your CSV (e.g. deposits, withdrawals, buys, sells). The LLM needs to see each type to generate a pattern for it.

Privacy note: If you'd rather not send any real data to a cloud LLM, use a local model via Ollama or replace the sample rows with fake data that has the same column structure.

The LLM will read your CSV sample, fetch the mapping spec from clams.tech, and generate a complete mapping.json. Save the output to a file, then follow the CLI commands printed after the prompt to create the connection and import your data.

I need you to build a mapping.json file for importing a CSV into Clams
using a Custom connection.

First, read the CSV sample I've provided. Look at the headers and rows
to understand the structure.

Then fetch the CSV mapping spec at this URL:
https://clams.tech/docs/connections/csv-mapping.md

That spec is the complete reference for building a mapping.json file. It
covers the expression DSL, all canonical types and their fields, dialect
options, amount encoding, and error handling. Read it carefully before
writing the mapping.

Build a mapping.json that wraps everything in a top-level "csv_mapping"
key, like: { "csv_mapping": { "dialect": {...}, "row_patterns": [...] } }

The mapping should:
- Set the correct dialect (delimiter, has_header: true)
- Have a row_pattern for each distinct transaction type in the CSV.
  If the same type appears with different assets (e.g. BTC deposits
  and USD deposits), create separate patterns for each because they
  need different amount parsers
- Use "when" expressions to classify rows. Use "and" to combine
  conditions when a single column isn't enough (e.g. type + currency)
- Map columns to the correct canonical fields for each pattern
- Include btc_destination for BTC deposit rows (required: must be a
  valid Bitcoin address, BOLT11 invoice, or 64-hex payment hash)
- Handle optional fields properly (wrap in "some", use "empty_to_none"
  for optional strings, use "none" when absent)
- Parse amounts correctly: use parse_decimal_sats_from_btc for BTC
  values, parse_decimal_sats for fiat, to_decimal_sats_from_msat for
  millisatoshis
- Parse timestamps with parse_timestamp (or parse_rfc3339 /
  parse_timestamp_format if the format is specific)

The five canonical types are: deposit, withdrawal, trade, pay, invoice.
Supported assets: BTC, USD, EUR, JPY, GBP, CNY, AUD, CAD, CHF, HKD,
NZD, SEK, KRW, SGD, MXN, INR, BRL, NOK, ZAR, DKK.

Output only the complete mapping.json. No explanation needed.

Save the output as mapping.json, then run:

clams connections create \
  --label "<connection label>" \
  --kind Custom \
  --configuration-file mapping.json

clams connections import \
  --label "<connection label>" \
  --input-format csv \
  --file <path to csv>

clams journals process

If the import fails, Clams returns an error with the row number and reason. Paste the error message back into your LLM chat and ask it to fix the mapping. It already has the full context from your conversation. Save the updated mapping, then re-run the import. Re-imports are safe since rows are deduplicated by their canonical ID.


Contents:
What You'll Need
The Example CSV
Step 1: Understand the CSV Mapping
Step 2: Build the Mapping
Step 3: Create the Connection
Step 4: Import Your CSV
Step 5: Process and Verify
Pro Tips
Need Help?

What You'll Need

  • Clams CLI installed (download here)
  • A CSV export from your exchange, wallet, or other data source
  • A text editor for writing the JSON mapping file

The Example CSV

Throughout this guide we'll use an export from a fictional "Acme Exchange" that includes deposits, withdrawals, and trades:

transaction_id,date,type,from_amount,from_currency,to_amount,to_currency,fee,fee_currency,destination,notes
TX-1000,2025-01-01T00:00:00Z,deposit,,,150000.00,USD,0.00,USD,,Initial USD funding
TX-1001,2025-01-15T09:30:00Z,deposit,,,0.50000000,BTC,0.00000000,BTC,bc1qar0srrr7xfkvy5l643lydnw9re59gtzzwf5mdq,Deposit from cold storage
TX-1002,2025-02-01T14:22:00Z,buy,24250.00,USD,0.25000000,BTC,2.50,USD,,
TX-1003,2025-03-10T11:00:00Z,sell,0.10000000,BTC,9500.00,USD,1.75,USD,,
TX-1004,2025-04-05T16:45:00Z,withdrawal,0.30000000,BTC,,,0.00010000,BTC,bc1qw508d6qejxtdg4y5r3zarvary0c5xw7kv8f3t4,Withdraw to hardware wallet
TX-1005,2025-05-20T08:15:00Z,buy,98500.00,USD,1.00000000,BTC,5.00,USD,,
TX-1006,2025-06-12T19:00:00Z,deposit,,,0.75000000,BTC,0.00000000,BTC,bc1qw508d6qejxtdg4y5r3zarvary0c5xw7kv8f3t4,Second deposit

Save this as acme-transactions.csv. The columns are:

  • transaction_id: unique ID from the exchange
  • date: timestamp in ISO 8601 / RFC 3339 format
  • type: one of deposit, withdrawal, buy, sell
  • from_amount (optional): source amount (fiat for buys, BTC for sells/withdrawals)
  • from_currency (optional): source asset code
  • to_amount (optional): destination amount (BTC for buys/deposits)
  • to_currency (optional): destination asset code
  • fee: fee amount
  • fee_currency: fee asset code
  • destination (optional): Bitcoin address (deposits and withdrawals)
  • notes (optional): free-text description

Step 1: Understand the CSV Mapping

A CSV mapping is a JSON configuration that tells Clams how to read your CSV. The configuration file has a top-level csv_mapping key containing two sections:

  1. dialect: how to parse the CSV file (delimiter, headers, etc.)
  2. row_patterns: rules that classify each row and map columns to canonical fields

Dialect

The dialect tells Clams the basics of your CSV format:

{
  "delimiter": ",",
  "has_header": true
}
  • delimiter: the column separator ("," for CSV, "\t" for TSV)
  • has_header: must be true (headerless CSVs are not supported)
  • quote: optional, defaults to double-quote
  • encoding: optional, must be utf8 when present
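These options map onto standard CSV parsing behavior. As a rough sketch of what the dialect controls, here is the equivalent read using Python's csv module (illustrative only, not Clams internals):

```python
import csv
import io

# Dialect options from the mapping, mirrored as Python values (illustrative).
dialect = {"delimiter": ",", "has_header": True}

sample = "transaction_id,date,type\nTX-1000,2025-01-01T00:00:00Z,deposit\n"

reader = csv.reader(io.StringIO(sample), delimiter=dialect["delimiter"])
rows = list(reader)

# has_header: true means the first row supplies the column names
# that "col" expressions refer to.
header, data = rows[0], rows[1:]
records = [dict(zip(header, row)) for row in data]
```

Switching the delimiter to "\t" is all it takes to read a TSV export the same way.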

Row Patterns

Each row pattern classifies a row and maps its columns to canonical fields. Patterns are evaluated in order; the first matching when condition wins. Rows that don't match any pattern are skipped.

A row pattern has this structure:

{
  "name": "deposits",
  "canonical_type": "deposit",
  "when": { ... },
  "fields": { ... }
}
  • name: a unique label for this pattern
  • canonical_type: one of the five canonical types (see below)
  • when: a boolean expression that classifies the row
  • fields: expressions that map CSV columns to canonical fields
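The evaluation order matters. The first-match-wins logic can be sketched in plain Python (the pattern names follow this guide's example; the lambda predicates are stand-ins for real when expressions):

```python
# First-match-wins classification, sketched as (name, predicate) pairs.
patterns = [
    ("btc_deposits", lambda r: r["type"] == "deposit" and r["to_currency"] == "BTC"),
    ("usd_deposits", lambda r: r["type"] == "deposit" and r["to_currency"] == "USD"),
    ("withdrawals", lambda r: r["type"] == "withdrawal"),
]

def classify(row):
    for name, when in patterns:
        if when(row):
            return name  # first matching pattern wins
    return None  # unmatched rows are skipped
```

Because more specific conditions should win, put narrower patterns (type + currency) before broader ones (type only).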

The Five Canonical Types

Every transaction imported into Clams maps to one of five canonical types:

  • deposit: funds arriving (exchange deposit, wallet receive). Required: canonical_id, timestamp, amount, asset, btc_destination (for BTC deposits)
  • withdrawal: funds leaving (exchange withdrawal, wallet send). Required: canonical_id, timestamp, amount, asset
  • trade: asset exchange (BTC/USD buy or sell). Required: canonical_id, timestamp, from_asset, from_amount, to_asset, to_amount
  • pay: Lightning payment sent (BTC only). Required: canonical_id, timestamp, amount, fee
  • invoice: Lightning payment received (BTC only). Required: canonical_id, timestamp, amount

Each type also supports optional fields like fee, btc_destination, btc_txid, and description. Supported assets: BTC plus 19 fiat currencies (USD, EUR, JPY, GBP, CNY, AUD, CAD, CHF, HKD, NZD, SEK, KRW, SGD, MXN, INR, BRL, NOK, ZAR, DKK). Altcoins aren't directly supported — for BTC/altcoin trades, convert to the fiat equivalent and note the original details. See the full CSV mapping spec for every field.

The Expression DSL

Field values aren't plain column names — they're expressions that can transform and validate data during import. The most common operations:

  • col / col_opt: read a required or optional column
  • trim, lower, concat: string operations
  • parse_decimal_sats, parse_decimal_sats_from_btc, to_decimal_sats_from_msat: amount parsing
  • parse_timestamp, parse_rfc3339, parse_timestamp_format: timestamp parsing
  • eq, and, or: comparisons for when conditions
  • some, none, empty_to_none, if: optional field handling
  • lit: literal values (string, bool, i128)

Step 2 below demonstrates these in context. For the full expression reference, see the CSV mapping spec.
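To build intuition for how these nested JSON expressions compose, here is a toy evaluator for a small subset of the DSL (col, trim, lower, lit, eq, and). It mirrors the semantics described above but is not the real implementation:

```python
def eval_expr(expr, row):
    """Toy evaluator for a small subset of the expression DSL. Illustrative only."""
    if "col" in expr:
        return row[expr["col"]]          # read a required column
    if "trim" in expr:
        return eval_expr(expr["trim"], row).strip()
    if "lower" in expr:
        return eval_expr(expr["lower"], row).lower()
    if "lit" in expr:
        return expr["lit"]["value"]      # literal value
    if "eq" in expr:
        return eval_expr(expr["eq"]["a"], row) == eval_expr(expr["eq"]["b"], row)
    if "and" in expr:
        return all(eval_expr(e, row) for e in expr["and"])
    raise ValueError(f"unknown operator: {list(expr)}")

# The same shape as the "when" conditions built in Step 2:
when = {
    "and": [
        {"eq": {"a": {"lower": {"trim": {"col": "type"}}},
                "b": {"lit": {"type": "string", "value": "deposit"}}}},
        {"eq": {"a": {"lower": {"trim": {"col": "to_currency"}}},
                "b": {"lit": {"type": "string", "value": "btc"}}}},
    ]
}
```

Note how trim and lower wrap col so that " Deposit " still matches "deposit"; the patterns in Step 2 use the same normalization.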

Step 2: Build the Mapping

Now let's build the complete mapping for our Acme Exchange CSV. We need five row patterns: BTC deposits, USD deposits, withdrawals, buys, and sells. Because our CSV has deposits in two different assets (BTC and USD), we split them into separate patterns — each asset needs different amount parsing.

Pattern 1: BTC Deposits

BTC deposits match rows where type = "deposit" and to_currency = "BTC". Matching on type alone isn't enough because USD deposits need different amount parsing, so we use and to also match the currency:

{
  "name": "btc_deposits",
  "canonical_type": "deposit",
  "when": {
    "and": [
      {
        "eq": {
          "a": { "lower": { "trim": { "col": "type" } } },
          "b": { "lit": { "type": "string", "value": "deposit" } }
        }
      },
      {
        "eq": {
          "a": { "lower": { "trim": { "col": "to_currency" } } },
          "b": { "lit": { "type": "string", "value": "btc" } }
        }
      }
    ]
  },
  "fields": {
    "canonical_id": { "trim": { "col": "transaction_id" } },
    "timestamp": { "parse_timestamp": { "trim": { "col": "date" } } },
    "amount": { "parse_decimal_sats_from_btc": { "trim": { "col": "to_amount" } } },
    "asset": { "trim": { "col": "to_currency" } },
    "fee": {
      "some": { "parse_decimal_sats_from_btc": { "trim": { "col": "fee" } } }
    },
    "btc_destination": { "empty_to_none": { "trim": { "col": "destination" } } },
    "description": { "empty_to_none": { "trim": { "col": "notes" } } }
  }
}

Key decisions:

  • when uses and to match on both the type and currency columns
  • amount uses parse_decimal_sats_from_btc because BTC values have up to 8 decimal places
  • fee is wrapped in some because it's an optional field (the fee is also in BTC here, so we use the same parser)
  • btc_destination is required for BTC deposits (must be a valid Bitcoin address, BOLT11 invoice, or 64-hex payment hash)
  • description uses empty_to_none so empty notes become None instead of empty strings

Pattern 2: USD Deposits

USD deposits match rows where type = "deposit" and to_currency = "USD". The key difference from BTC deposits is amount parsing — fiat amounts use parse_decimal_sats instead of parse_decimal_sats_from_btc:

{
  "name": "usd_deposits",
  "canonical_type": "deposit",
  "when": {
    "and": [
      {
        "eq": {
          "a": { "lower": { "trim": { "col": "type" } } },
          "b": { "lit": { "type": "string", "value": "deposit" } }
        }
      },
      {
        "eq": {
          "a": { "lower": { "trim": { "col": "to_currency" } } },
          "b": { "lit": { "type": "string", "value": "usd" } }
        }
      }
    ]
  },
  "fields": {
    "canonical_id": { "trim": { "col": "transaction_id" } },
    "timestamp": { "parse_timestamp": { "trim": { "col": "date" } } },
    "amount": { "parse_decimal_sats": { "trim": { "col": "to_amount" } } },
    "asset": { "trim": { "col": "to_currency" } },
    "fee": {
      "some": { "parse_decimal_sats": { "trim": { "col": "fee" } } }
    },
    "btc_destination": { "none": { "type": "string" } },
    "description": { "empty_to_none": { "trim": { "col": "notes" } } }
  }
}

Key differences from BTC deposits:

  • amount and fee use parse_decimal_sats (fiat parser) instead of parse_decimal_sats_from_btc
  • btc_destination uses none since USD deposits have no Bitcoin address

Pattern 3: Withdrawals

Withdrawals have type = "withdrawal". The amount field for withdrawals represents the net value (after fees):

{
  "name": "withdrawals",
  "canonical_type": "withdrawal",
  "when": {
    "eq": {
      "a": { "lower": { "trim": { "col": "type" } } },
      "b": { "lit": { "type": "string", "value": "withdrawal" } }
    }
  },
  "fields": {
    "canonical_id": { "trim": { "col": "transaction_id" } },
    "timestamp": { "parse_timestamp": { "trim": { "col": "date" } } },
    "amount": { "parse_decimal_sats_from_btc": { "trim": { "col": "from_amount" } } },
    "asset": { "trim": { "col": "from_currency" } },
    "fee": {
      "some": { "parse_decimal_sats_from_btc": { "trim": { "col": "fee" } } }
    },
    "btc_destination": { "empty_to_none": { "trim": { "col": "destination" } } },
    "description": { "empty_to_none": { "trim": { "col": "notes" } } }
  }
}

Pattern 4: Buys (USD to BTC)

A "buy" is a trade where you spend fiat and receive BTC:

{
  "name": "buys",
  "canonical_type": "trade",
  "when": {
    "eq": {
      "a": { "lower": { "trim": { "col": "type" } } },
      "b": { "lit": { "type": "string", "value": "buy" } }
    }
  },
  "fields": {
    "canonical_id": { "trim": { "col": "transaction_id" } },
    "timestamp": { "parse_timestamp": { "trim": { "col": "date" } } },
    "from_asset": { "trim": { "col": "from_currency" } },
    "from_amount": { "parse_decimal_sats": { "trim": { "col": "from_amount" } } },
    "to_asset": { "trim": { "col": "to_currency" } },
    "to_amount": { "parse_decimal_sats_from_btc": { "trim": { "col": "to_amount" } } },
    "fee_asset": { "some": { "trim": { "col": "fee_currency" } } },
    "fee_amount": { "some": { "parse_decimal_sats": { "trim": { "col": "fee" } } } }
  }
}

Note: from_amount (USD) uses parse_decimal_sats for fiat amounts, while to_amount (BTC) uses parse_decimal_sats_from_btc.

Pattern 5: Sells (BTC to USD)

A "sell" is a trade where you spend BTC and receive fiat:

{
  "name": "sells",
  "canonical_type": "trade",
  "when": {
    "eq": {
      "a": { "lower": { "trim": { "col": "type" } } },
      "b": { "lit": { "type": "string", "value": "sell" } }
    }
  },
  "fields": {
    "canonical_id": { "trim": { "col": "transaction_id" } },
    "timestamp": { "parse_timestamp": { "trim": { "col": "date" } } },
    "from_asset": { "trim": { "col": "from_currency" } },
    "from_amount": { "parse_decimal_sats_from_btc": { "trim": { "col": "from_amount" } } },
    "to_asset": { "trim": { "col": "to_currency" } },
    "to_amount": { "parse_decimal_sats": { "trim": { "col": "to_amount" } } },
    "fee_asset": { "some": { "trim": { "col": "fee_currency" } } },
    "fee_amount": { "some": { "parse_decimal_sats": { "trim": { "col": "fee" } } } }
  }
}

The Complete Mapping

Here's the full mapping.json combining all five patterns. Note the top-level csv_mapping key that wraps the dialect and row patterns:

{
  "csv_mapping": {
    "dialect": {
      "delimiter": ",",
      "has_header": true
    },
    "row_patterns": [
      {
        "name": "btc_deposits",
        "canonical_type": "deposit",
        "when": {
          "and": [
            {
              "eq": {
                "a": { "lower": { "trim": { "col": "type" } } },
                "b": { "lit": { "type": "string", "value": "deposit" } }
              }
            },
            {
              "eq": {
                "a": { "lower": { "trim": { "col": "to_currency" } } },
                "b": { "lit": { "type": "string", "value": "btc" } }
              }
            }
          ]
        },
        "fields": {
          "canonical_id": { "trim": { "col": "transaction_id" } },
          "timestamp": { "parse_timestamp": { "trim": { "col": "date" } } },
          "amount": { "parse_decimal_sats_from_btc": { "trim": { "col": "to_amount" } } },
          "asset": { "trim": { "col": "to_currency" } },
          "fee": {
            "some": { "parse_decimal_sats_from_btc": { "trim": { "col": "fee" } } }
          },
          "btc_destination": { "empty_to_none": { "trim": { "col": "destination" } } },
          "description": { "empty_to_none": { "trim": { "col": "notes" } } }
        }
      },
      {
        "name": "usd_deposits",
        "canonical_type": "deposit",
        "when": {
          "and": [
            {
              "eq": {
                "a": { "lower": { "trim": { "col": "type" } } },
                "b": { "lit": { "type": "string", "value": "deposit" } }
              }
            },
            {
              "eq": {
                "a": { "lower": { "trim": { "col": "to_currency" } } },
                "b": { "lit": { "type": "string", "value": "usd" } }
              }
            }
          ]
        },
        "fields": {
          "canonical_id": { "trim": { "col": "transaction_id" } },
          "timestamp": { "parse_timestamp": { "trim": { "col": "date" } } },
          "amount": { "parse_decimal_sats": { "trim": { "col": "to_amount" } } },
          "asset": { "trim": { "col": "to_currency" } },
          "fee": {
            "some": { "parse_decimal_sats": { "trim": { "col": "fee" } } }
          },
          "btc_destination": { "none": { "type": "string" } },
          "description": { "empty_to_none": { "trim": { "col": "notes" } } }
        }
      },
      {
        "name": "withdrawals",
        "canonical_type": "withdrawal",
        "when": {
          "eq": {
            "a": { "lower": { "trim": { "col": "type" } } },
            "b": { "lit": { "type": "string", "value": "withdrawal" } }
          }
        },
        "fields": {
          "canonical_id": { "trim": { "col": "transaction_id" } },
          "timestamp": { "parse_timestamp": { "trim": { "col": "date" } } },
          "amount": { "parse_decimal_sats_from_btc": { "trim": { "col": "from_amount" } } },
          "asset": { "trim": { "col": "from_currency" } },
          "fee": {
            "some": { "parse_decimal_sats_from_btc": { "trim": { "col": "fee" } } }
          },
          "btc_destination": { "empty_to_none": { "trim": { "col": "destination" } } },
          "description": { "empty_to_none": { "trim": { "col": "notes" } } }
        }
      },
      {
        "name": "buys",
        "canonical_type": "trade",
        "when": {
          "eq": {
            "a": { "lower": { "trim": { "col": "type" } } },
            "b": { "lit": { "type": "string", "value": "buy" } }
          }
        },
        "fields": {
          "canonical_id": { "trim": { "col": "transaction_id" } },
          "timestamp": { "parse_timestamp": { "trim": { "col": "date" } } },
          "from_asset": { "trim": { "col": "from_currency" } },
          "from_amount": { "parse_decimal_sats": { "trim": { "col": "from_amount" } } },
          "to_asset": { "trim": { "col": "to_currency" } },
          "to_amount": { "parse_decimal_sats_from_btc": { "trim": { "col": "to_amount" } } },
          "fee_asset": { "some": { "trim": { "col": "fee_currency" } } },
          "fee_amount": { "some": { "parse_decimal_sats": { "trim": { "col": "fee" } } } }
        }
      },
      {
        "name": "sells",
        "canonical_type": "trade",
        "when": {
          "eq": {
            "a": { "lower": { "trim": { "col": "type" } } },
            "b": { "lit": { "type": "string", "value": "sell" } }
          }
        },
        "fields": {
          "canonical_id": { "trim": { "col": "transaction_id" } },
          "timestamp": { "parse_timestamp": { "trim": { "col": "date" } } },
          "from_asset": { "trim": { "col": "from_currency" } },
          "from_amount": { "parse_decimal_sats_from_btc": { "trim": { "col": "from_amount" } } },
          "to_asset": { "trim": { "col": "to_currency" } },
          "to_amount": { "parse_decimal_sats": { "trim": { "col": "to_amount" } } },
          "fee_asset": { "some": { "trim": { "col": "fee_currency" } } },
          "fee_amount": { "some": { "parse_decimal_sats": { "trim": { "col": "fee" } } } }
        }
      }
    ]
  }
}

Save this as mapping.json.
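Before creating the connection, it's worth a quick sanity check. A minimal structural check in Python (it validates only JSON syntax and the top-level shape shown in this guide, not the full spec; the inline sample stands in for your real file):

```python
import json

# Swap the inline sample for open("mapping.json").read() to check the file on disk.
raw = """
{"csv_mapping": {"dialect": {"delimiter": ",", "has_header": true},
 "row_patterns": [{"name": "btc_deposits"}, {"name": "usd_deposits"}]}}
"""

mapping = json.loads(raw)  # raises json.JSONDecodeError on invalid JSON
csv_mapping = mapping["csv_mapping"]

assert "dialect" in csv_mapping, "missing dialect"
assert isinstance(csv_mapping["row_patterns"], list), "row_patterns must be a list"

names = [p["name"] for p in csv_mapping["row_patterns"]]
assert len(names) == len(set(names)), "pattern names must be unique"
```

This catches the most common mistakes (trailing commas, a missing csv_mapping wrapper, duplicate pattern names) before Clams' own validation runs.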

Step 3: Create the Connection

With your mapping file ready, create the Custom connection:

clams connections create \
  --label "Acme Exchange" \
  --kind Custom \
  --configuration-file mapping.json

This registers the connection and stores the mapping. You only need to do this once. The mapping is reused for every future import from this source.

To verify it was created:

clams connections list

Step 4: Import Your CSV

Now import your transaction data:

clams connections import \
  --label "Acme Exchange" \
  --input-format csv \
  --file acme-transactions.csv

Clams will parse the CSV, evaluate each row against your mapping patterns, and create canonical transaction records. The import is all-or-nothing: if any matched row fails validation, the entire import is rejected with a descriptive error message including the row number.

Rows that don't match any pattern are skipped and logged. If no rows match any pattern, the import fails.

Re-importing: If you later export a fresh CSV from your exchange that includes previously imported rows, Clams handles this gracefully. Rows are deduplicated by their canonical ID, so re-importing the same data is safe (upsert behavior).
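Conceptually, the deduplication behaves like a keyed upsert. A sketch (Clams performs this internally, keyed on the canonical_id your mapping produces; the records here are illustrative):

```python
ledger = {}

def upsert(record):
    # Same canonical_id: overwrite the existing record, never duplicate it.
    ledger[record["canonical_id"]] = record

upsert({"canonical_id": "TX-1000", "amount": "150000.000"})
upsert({"canonical_id": "TX-1001", "amount": "50000000.000"})
upsert({"canonical_id": "TX-1000", "amount": "150000.000"})  # re-import: safe
```

This is why stable, unique IDs from your exchange make the best canonical_id values; a synthetic ID that changes between exports would defeat deduplication.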

Step 5: Process and Verify

After importing, process your journals to update cost basis calculations and reports:

clams journals process

Then check your data:

# View imported deposits, withdrawals, and trades
clams records deposits list
clams records withdrawals list
clams records trades list

# Check your balance sheet
clams reports balance-sheet

Review the imported transactions to confirm everything mapped correctly. In particular, check that:

  • Transaction types (deposit, withdrawal, trade) are classified correctly
  • Amounts and fees look right
  • Timestamps are accurate
  • Descriptions carried over

Pro Tips

Handle Multiple CSV Formats

Some exchanges change their export format over time, or have different formats for different transaction types. You can handle old and new column names in a single mapping using col_opt and coalesce:

{
  "coalesce": [
    { "col_opt": "new_column_name" },
    { "col_opt": "old_column_name" }
  ]
}
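The semantics: col_opt yields nothing for a missing column, and coalesce takes the first value that is present. Sketched in Python (the functions are illustrative mirrors of the DSL operations; the column and row names are made up for this example):

```python
def col_opt(row, name):
    # col_opt: None when the column is absent (illustrative mirror)
    return row.get(name)

def coalesce(*values):
    # coalesce: first present (non-None) value wins (illustrative mirror)
    return next((v for v in values if v is not None), None)

new_format = {"new_column_name": "ADDR-2"}   # row from a newer export
old_format = {"old_column_name": "ADDR-1"}   # row from an older export

newer = coalesce(col_opt(new_format, "new_column_name"),
                 col_opt(new_format, "old_column_name"))
older = coalesce(col_opt(old_format, "new_column_name"),
                 col_opt(old_format, "old_column_name"))
```

Both rows resolve to a value, so one mapping can import exports from before and after the format change.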

Amount Encoding Notes

Clams stores all amounts as decimal_sats, a decimal with exactly 3 fractional digits. Here's how different source formats map:

  • BTC (up to 11 decimals): parse_decimal_sats_from_btc, e.g. "0.50000000"
  • Satoshis (integer): to_decimal_sats_from_msat + mul_i128_const, e.g. "50000000"
  • Millisatoshis (integer): to_decimal_sats_from_msat + parse_i128, e.g. "50000000000"
  • Fiat / generic decimal: parse_decimal_sats, e.g. "24250.00"
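The arithmetic behind these conversions can be reproduced with Python's decimal module. A sketch (the function names mirror the DSL operations, but the implementations are illustrative, assuming decimal_sats means 3 fractional digits, i.e. millisatoshi precision):

```python
from decimal import Decimal

SATS_PER_BTC = Decimal(100_000_000)
MSAT_PER_SAT = Decimal(1000)
Q = Decimal("0.001")  # decimal_sats: exactly 3 fractional digits

def parse_decimal_sats_from_btc(s: str) -> Decimal:
    # "0.50000000" BTC -> 50000000.000 decimal_sats
    return (Decimal(s) * SATS_PER_BTC).quantize(Q)

def to_decimal_sats_from_msat(msat: int) -> Decimal:
    # 50000000000 msat -> 50000000.000 decimal_sats
    return (Decimal(msat) / MSAT_PER_SAT).quantize(Q)

def sats_via_msat(sat: int) -> Decimal:
    # integer sats: scale by 1000 first (the mul_i128_const step), then treat as msat
    return to_decimal_sats_from_msat(sat * 1000)

def parse_decimal_sats(s: str) -> Decimal:
    # "24250.00" -> 24250.000 (fiat / generic decimal)
    return Decimal(s).quantize(Q)
```

Using Decimal rather than float matters here: float would introduce rounding error on values like 0.1 BTC, while Decimal keeps every sat exact.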

Batch Imports

If you have multiple CSV files from the same source (e.g., monthly exports), put them all in a folder and import them at once with --folder:

clams connections import \
  --label "Acme Exchange" \
  --input-format csv \
  --folder /path/to/csv-folder

Deduplication ensures overlapping rows across files aren't double-counted.

Error Handling

If your mapping or CSV has issues, Clams returns descriptive errors:

  • E400-INVALID-CONFIG: your mapping JSON has structural problems (unknown operators, type mismatches, etc.)
  • E400-INVALID-CSV: the CSV data failed validation (missing headers, parse errors, failed assertions). The error includes the 1-based row number.

Read the error message carefully. It tells you exactly what went wrong and where.

Need Help?

Having trouble with your Custom import? Reach out:

  • From the CLI: run clams feedback to submit a bug report or question directly
  • Email: support@clams.tech
  • Full DSL reference: CSV mapping spec