
Quickstart

This page will guide you through the steps to get your first DipDup indexer up and running in a few minutes without getting too deep into the details.

Let's create an indexer for output transactions from a specific address. We will need to set up the indexing environment, configure the indexer, and store the results in a database.

Prerequisites

Here are a few things you need to get started with DipDup:

  • Skills: Basic Python 3 knowledge to implement data handlers.
  • Operating system: Any recent Linux or macOS on an amd64/arm64 platform with Python installed.
  • Python version: Python 3.11 is required for DipDup. You can check your Python version by running python3 --version in your terminal.

Step 1 — Install DipDup

The easiest way to install DipDup as a CLI application is pipx:

pipx install dipdup

If you don't want to deal with tooling, we have a convenient installer script. Run the following command in your terminal:

curl -Lsf https://dipdup.io/install.py | python3

See the installation page for other options.

Step 2 — Create a project

DipDup CLI has a built-in project generator with lots of templates. To create a new project interactively, run the following command:

dipdup new

For educational purposes, we'll create a project from scratch, so choose the [none] network and the demo_blank template.

Follow the instructions; the project will be created in a new directory.
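
If you'd rather skip the interactive prompts, the template name can be passed directly; a minimal sketch, assuming your DipDup version supports the --template flag:

dipdup new --template demo_blank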

Step 3 — Configuration file

The project root directory contains a YAML file named dipdup.yaml. It's the main configuration file of your indexer; available options are described in detail in the configuration reference. For now, just replace its content with the following:

spec_version: 2.0
package: linea

datasources:
  subsquid:
    kind: evm.subsquid
    url: https://v2.archive.subsquid.io/network/linea-mainnet
    node: evm_node

  etherscan:
    kind: abi.etherscan
    url: https://api.lineascan.build/api

  evm_node:
    kind: evm.node
    url: https://linea-mainnet.infura.io/v3
    ws_url: wss://linea-mainnet.infura.io/ws/v3

contracts:
  some_contract:
    kind: evm
    address: 0xa219439258ca9da29e9cc4ce5596924745e12b93
    typename: not_typed

indexes:
  evm_index:
    kind: evm.subsquid.transactions
    datasource: subsquid
    first_level: 4631
    handlers:
      - callback: on_output_transaction
        from: some_contract

database:
  kind: sqlite
  path: data/linea.sqlite
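
The Infura URLs above still need a project ID appended. DipDup config supports environment variable substitution with default values, so one way to keep the key out of the file is a sketch like the following (NODE_API_KEY and the other variable names are assumptions, not part of this guide's config):

datasources:
  evm_node:
    kind: evm.node
    url: ${NODE_URL:-https://linea-mainnet.infura.io/v3}/${NODE_API_KEY:-''}
    ws_url: ${NODE_WS_URL:-wss://linea-mainnet.infura.io/ws/v3}/${NODE_API_KEY:-''}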

Now it's time to generate the directories and files required for the project. Callback stubs, types, and other entities are defined in the configuration file dipdup.yaml; don't worry, in this guide we will only need a few of them. To generate them, run:

dipdup init

You can read more about the structure of the DipDup package here.
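
For reference, a freshly initialized package looks roughly like this (a sketch; the exact set of directories may vary between DipDup versions):

linea/
├── abi/         # contract ABIs used for type generation
├── graphql/     # custom GraphQL queries
├── handlers/    # callback implementations, e.g. on_output_transaction.py
├── hooks/       # lifecycle hooks
├── models/      # ORM models
├── sql/         # custom SQL scripts
└── types/       # generated type classes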

Step 4 — Define models and implement data handlers

In this step, we define the business logic of our application. DipDup supports storing data in SQLite, PostgreSQL, and TimescaleDB databases. We use a custom ORM based on Tortoise ORM as an abstraction layer.
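
For example, switching to PostgreSQL later is just a different database section; a minimal sketch with placeholder credentials (the host, user, and password values here are assumptions, not part of this guide):

database:
  kind: postgres
  host: localhost
  port: 5432
  user: dipdup
  password: changeme
  database: dipdup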

First, you need to define a model class in the models module of your package. Our schema will consist of a single Transaction model:

from dipdup import fields
from dipdup.models import Model


class Transaction(Model):
    hash = fields.TextField(pk=True)
    block_number = fields.IntField()
    from_ = fields.TextField()
    to = fields.TextField(null=True)

    created_at = fields.DatetimeField(auto_now_add=True)

Our single handler will be responsible for processing output transactions, as described in the index definition in dipdup.yaml. dipdup init has already generated a stub for it in the handlers directory; fill it in like this:

from dipdup.context import HandlerContext
from dipdup.models.evm_node import EvmNodeTransactionData
from dipdup.models.evm_subsquid import SubsquidTransactionData

from linea import models as models


async def on_output_transaction(
    ctx: HandlerContext,
    transaction: SubsquidTransactionData | EvmNodeTransactionData,
) -> None:
    await models.Transaction(
        hash=transaction.hash,
        block_number=transaction.block_number,
        from_=transaction.from_,
        to=transaction.to,
    ).save()
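
If a transaction can reach the handler twice (for example, after a restart), inserting by primary key will raise an integrity error. A sketch of an upsert variant using Tortoise's update_or_create, not part of the original guide:

async def on_output_transaction(
    ctx: HandlerContext,
    transaction: SubsquidTransactionData | EvmNodeTransactionData,
) -> None:
    # Insert the row, or update it in place if this hash was already saved
    await models.Transaction.update_or_create(
        hash=transaction.hash,
        defaults={
            'block_number': transaction.block_number,
            'from_': transaction.from_,
            'to': transaction.to,
        },
    )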

Step 5 — Results

Time to run the indexer. Processed data will be written to the SQLite file defined in the configuration:

dipdup run

DipDup will fetch all the historical data and switch to realtime mode. You can check the progress in the logs.

Query the database to see the results:

sqlite3 data/linea.sqlite 'SELECT * FROM transaction LIMIT 10'
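
A quick sanity check is to count the indexed rows:

sqlite3 data/linea.sqlite 'SELECT COUNT(*) FROM transaction'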