Here we look into a Squid SDK indexer (a squid) that gets the data about swaps made through the Orca Whirlpool exchange program.

Prerequisites: Node.js v20 or newer, Git, Docker.

Download the project

Begin by retrieving the template and installing dependencies:

git clone -b portal-api https://github.com/subsquid-labs/solana-example
cd solana-example
npm i

Interfacing with the Whirlpool program

First, we inspect the data available for indexing.

SQD provides a tool (@subsquid/solana-typegen) that retrieves program IDLs from the chain and generates boilerplate ABI code for data decoding. It supports IDLs produced by the Anchor framework and by Shank.

Generating the ABI code for Whirlpool is done with

npx squid-solana-typegen src/abi whirLbMiicVdio4qvUfM5KAg6Ct8VwpYzGff3uctyCc#whirlpool

Here, src/abi is the destination folder and the whirlpool suffix sets the base name for the generated files.

In src/abi/whirlpool/instructions.ts we find exports of instruction objects, one for every instruction of the Whirlpool program. Here’s the one for the swap instruction:

export const swap = instruction(
  {
    d8: '0xf8c69e91e17587c8',
  },
  {
    tokenProgram: 0,
    tokenAuthority: 1,
    whirlpool: 2,
    tokenOwnerAccountA: 3,
    tokenVaultA: 4,
    tokenOwnerAccountB: 5,
    tokenVaultB: 6,
    tickArray0: 7,
    tickArray1: 8,
    tickArray2: 9,
    oracle: 10,
  },
  struct({
    amount: u64,
    otherAmountThreshold: u64,
    sqrtPriceLimit: u128,
    amountSpecifiedIsInput: bool,
    aToB: bool,
  }),
)

Here, d8 is the eight-byte discriminator that the data of the relevant instruction starts with. In the next section we’ll use this discriminator to request pre-filtered swap data from the Soldexer API.
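
For reference, here is a minimal sketch (using Node’s built-in crypto module) of how an Anchor-style discriminator is derived; the swap instruction lives in the global namespace, and the generated ABI code ships the value precomputed, so you never need to compute it yourself:

import {createHash} from 'crypto'

// An Anchor instruction discriminator is the first eight bytes of
// sha256('<namespace>:<instruction>'); for swap the namespace is 'global'.
const digest = createHash('sha256').update('global:swap').digest()
const d8 = '0x' + digest.subarray(0, 8).toString('hex')

console.log(d8) // 0xf8c69e91e17587c8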

Configuring the data source

A “data source” is a component that defines what data should be retrieved and where to get it. Here’s how we configure one to retrieve the swap instruction data of the Whirlpool program:

src/main.ts
// ...
import {DataSourceBuilder} from '@subsquid/solana-stream'
import * as whirlpool from './abi/whirlpool'

const dataSource = new DataSourceBuilder()
  // An SQD Network Portal URL is required
  .setPortal('https://portal.sqd.dev/datasets/solana-mainnet')
  .setBlockRange({from: 300_000_000})
  .addInstruction({
    // Select instructions that
    where: {
      // were executed by the Whirlpool program, and
      programId: [whirlpool.programId],
      // have the first eight bytes of .data equal to the swap discriminator, and
      d8: [whirlpool.instructions.swap.d8],
      // come from the USDC-SOL pair, and
      ...whirlpool.instructions.swap.accountSelection({
        whirlpool: ['7qbRF6YsyGuLUVs6Y1q64bdVrfe4ZcUUz1JRdoVNUJnm']
      }),
      // were successfully committed
      isCommitted: true
    },
    // For each instruction data item selected above
    // make sure to also include
    include: {
      // inner instructions,
      innerInstructions: true,
      // transaction that executed the instruction,
      transaction: true,
      // all token balance update records of that transaction
      transactionTokenBalances: true,
    }
  })
  // Include the following fields in the fetched data items:
  .setFields({
    block: {
      // timestamps for the block headers
      timestamp: true
    },
    transaction: {
      // signatures for transactions:
      // the first one is used as a tx hash/ID
      signatures: true
    },
    instruction: {
      programId: true,
      accounts: true,
      data: true
    },
    tokenBalance: {
      preAmount: true,
      postAmount: true,
      preOwner: true,
      postOwner: true
    }
  })
  .build()

Here,

  • 'https://portal.sqd.dev/datasets/solana-mainnet' is the address of the public SQD Network portal for Solana mainnet.

  • 300_000_000 is the first Solana slot to be indexed by the processor.

  • The argument of addInstruction() is a set of filters that tells the processor to retrieve all data on Whirlpool program instructions whose discriminator matches that of the swap instruction (for Anchor programs, the first eight bytes of the hash of <namespace>:<instruction>).

    It also instructs the processor to fetch some related data: inner instructions, parent transactions and token balance updates due to these transactions.

    Aside from instructions, it’s also possible to request transactions, execution logs, token balance updates (for SOL and other tokens) and reward records, plus the related data for each of these data item types; see the sketch below.
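
As an illustration, here is a sketch of a request for Whirlpool execution logs together with their parent transactions. The addLog() method and its options are assumed here by analogy with addInstruction(); consult the @subsquid/solana-stream reference for the authoritative set of request methods.

// Hypothetical: request all execution logs emitted by the Whirlpool
// program plus the transactions that produced them. addLog() is assumed
// by analogy with addInstruction().
const logsSource = new DataSourceBuilder()
  .setPortal('https://portal.sqd.dev/datasets/solana-mainnet')
  .setBlockRange({from: 300_000_000})
  .addLog({
    where: {programId: [whirlpool.programId]},
    include: {transaction: true}
  })
  .build()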

Next, we fetch the filtered data from the Soldexer API / SQD Portal, transform it and save the result to our destination of choice, a PostgreSQL database.

Transforming and saving data

The run function from the @subsquid/batch-processor package fetches batches of data from the Soldexer API and runs a transformation function (the batch handler) on each batch. Here’s how it’s called:

import {run} from '@subsquid/batch-processor'

run(dataSource, database, async ctx => {
    // data transformation and persistence code here
})

Here,

  • dataSource is the data source object described in the previous section
  • database is a compatible data sink implementation
  • async ctx => { ... } is the batch handler
  • ctx is a batch context object that exposes the batch data (at ctx.blocks) and the data persistence facilities derived from database (at ctx.store)
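
Before looking at the full batch handler used in this squid, here is a minimal sketch of a handler that simply counts the instruction items delivered in each batch (with dataSource and database as described above):

run(dataSource, database, async ctx => {
  // count the instruction data items delivered with this batch
  let total = 0
  for (let block of ctx.blocks) {
    total += block.instructions.length
  }
  console.log(`got ${ctx.blocks.length} block(s) with ${total} matching instruction(s)`)
})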

And here is how this call looks in our Whirlpool data processor:

src/main.ts
import {run} from '@subsquid/batch-processor'
import {augmentBlock} from '@subsquid/solana-objects'
import {DataSourceBuilder, SolanaRpcClient} from '@subsquid/solana-stream'
import {TypeormDatabase} from '@subsquid/typeorm-store'
import assert from 'assert'
import * as tokenProgram from './abi/token-program'
import * as whirlpool from './abi/whirlpool'
import {Exchange} from './model'

// The data source definition from prev section
const dataSource = ...

// Saving to Postgres via TypeORM
const database = new TypeormDatabase()

run(dataSource, database, async ctx => {
  // Block items that we get from `ctx.blocks` are flat JS objects.
  //
  // We can use the `augmentBlock()` function from `@subsquid/solana-objects`
  // to enrich block items with references to related objects and
  // with convenient getters for derived data (e.g. `Instruction.d8`).
  let blocks = ctx.blocks.map(augmentBlock)

  let exchanges: Exchange[] = []

  for (let block of blocks) {
    for (let ins of block.instructions) {
      // See https://read.cryptodatabytes.com/p/starter-guide-to-solana-data-analysis
      if (ins.programId === whirlpool.programId &&
          ins.d8 === whirlpool.instructions.swap.d8) {

        // A TypeORM object for an `exchange` table row
        let exchange = new Exchange({
          id: ins.id,
          slot: block.header.slot,
          tx: ins.getTransaction().signatures[0],
          timestamp: new Date(block.header.timestamp * 1000)
        })

        // Decoding the inner instructions executed by the Token Program
        assert(ins.inner.length === 2)
        let srcTransfer = tokenProgram.instructions.transfer.decode(ins.inner[0])
        let destTransfer = tokenProgram.instructions.transfer.decode(ins.inner[1])

        // Figuring out the exchange params
        let srcBalance = ins.getTransaction().tokenBalances.find(tb => tb.account === srcTransfer.accounts.source)
        let destBalance = ins.getTransaction().tokenBalances.find(tb => tb.account === destTransfer.accounts.destination)
        let srcMint = ins.getTransaction().tokenBalances.find(tb => tb.account === srcTransfer.accounts.destination)?.preMint
        let destMint = ins.getTransaction().tokenBalances.find(tb => tb.account === destTransfer.accounts.source)?.preMint
        assert(srcMint)
        assert(destMint)

        // Updating the `Exchange` object and storing it in a buffer
        exchange.fromToken = srcMint
        exchange.fromOwner = srcBalance?.preOwner || srcTransfer.accounts.source
        exchange.fromAmount = srcTransfer.data.amount
        exchange.toToken = destMint
        exchange.toOwner = destBalance?.postOwner || destBalance?.preOwner || destTransfer.accounts.destination
        exchange.toAmount = destTransfer.data.amount

        exchanges.push(exchange)
      }
    }
  }

  // Batch inserting the `exchange` table rows
  await ctx.store.insert(exchanges)
})

This code goes through all the instructions in each block, verifies that they are indeed swap instructions of the Whirlpool program and decodes each inner instruction. It then derives the exchange parameters from the decoded inner instructions and the transaction’s token balance records: input and output tokens, source and destination owners, and amounts. The decoding is done with the tokenProgram.instructions.transfer.decode function from the TypeScript ABI provided with the template.
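
For reference, the Exchange class imported from ./model is generated by the Squid SDK tooling from the project’s schema, so its exact shape ships with the template. A hand-written TypeORM equivalent would look roughly like the sketch below; the column types are assumptions inferred from how the fields are used above, not a copy of the generated code.

import {Entity, PrimaryColumn, Column} from 'typeorm'

// A rough hand-written equivalent of the generated Exchange model.
// Column types are assumptions; the generated code also uses value
// transformers to map numeric columns to JS bigint.
@Entity()
export class Exchange {
  constructor(props?: Partial<Exchange>) {
    Object.assign(this, props)
  }

  @PrimaryColumn()
  id!: string

  @Column('int')
  slot!: number

  @Column('text')
  tx!: string

  @Column('timestamp with time zone')
  timestamp!: Date

  @Column('text')
  fromToken!: string

  @Column('text')
  fromOwner!: string

  @Column('numeric')
  fromAmount!: bigint

  @Column('text')
  toToken!: string

  @Column('text')
  toOwner!: string

  @Column('numeric')
  toAmount!: bigint
}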

At this point the squid is ready for its first test run. Execute

npx tsc                               # compile the TypeScript code into lib/
docker compose up -d                  # start a local PostgreSQL container
npx squid-typeorm-migration apply     # create the database tables
node -r dotenv/config lib/main.js     # run the processor

You can verify that the data is being stored in the database by running

docker exec "$(basename "$(pwd)")-db-1" psql -U postgres -c "SELECT * FROM exchange"

The full code can be found in the subsquid-labs/solana-example repository.
