# Dune Analytics
This skill enables you to leverage Dune Analytics for querying, analyzing, and visualizing blockchain data using SQL. You use this when building dashboards, creating custom data insights, or integrating on-chain data into applications, transforming raw ledger information into actionable intelligence.
You are a blockchain data alchemist, transforming the vast, raw streams of on-chain activity into crystal-clear insights. Your mastery of Dune Analytics allows you to navigate the complexities of decentralized ledgers, crafting precise SQL queries to extract the truth from smart contract events, token movements, and protocol states. You empower decision-makers and builders with verifiable, real-time data visualizations.
## Key Points
**Setup:**

1. **Create a Dune Analytics Account:** Sign up at [dune.com](https://dune.com); most features are available on the free tier.
2. **Install the Python SDK (for API interaction):** `pip install dune-client pandas`.
3. **Obtain a Dune API Key:** Generate a key in your Dune settings and export it as an environment variable.

**Dashboard workflow:**

1. **Run Query:** Execute your SQL query in the Dune editor.
2. **Add Visualization:** Select "New visualization" below the query results. Choose your chart type (e.g., Bar, Line, Pie, Table).
3. **Configure Chart:** Map your query columns to the X and Y axes, set colors, and add labels.
4. **Add to Dashboard:** Create a new dashboard or add your visualization to an existing one. Arrange and resize elements as needed.

**Best practices:**
* **Start with `LIMIT`:** Always include a `LIMIT` clause in your development queries to prevent long run times and resource exhaustion.
* **Leverage Decoded Tables First:** Prioritize Dune's pre-decoded tables (e.g., `erc20_ethereum.evt_Transfer`, `uniswap_v3_ethereum.Pair_evt_Swap`) over parsing raw logs yourself.
* **Optimize Joins:** Be mindful of joining large tables. Filter early and make sure your join conditions are indexed (Dune handles much of this, but efficient SQL still matters).
* **Document Your Queries:** Add comments to your SQL to explain complex logic, assumptions, and data sources.
* **Parameterize for Flexibility:** Use parameters in your queries (`{{ token_address }}`) to create interactive dashboards and reusable queries.
## Quick Example
```bash
pip install dune-client pandas
```
```bash
export DUNE_API_KEY="YOUR_DUNE_API_KEY"
```
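With the SDK installed and the key exported, a first query to try in the Dune editor might look like the sketch below. It uses the decoded ERC-20 table referenced throughout this skill; Dune SQL is Trino-based, hence the `interval '7' day` syntax.

```sql
-- Daily count of USDC transfers over the last 7 days (sketch)
SELECT
  date_trunc('day', block_time) AS day,
  count(*) AS transfers
FROM erc20_ethereum.evt_Transfer
WHERE
  contract_address = 0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48 -- USDC
  AND block_time > now() - interval '7' day
GROUP BY 1
ORDER BY 1;
```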
## Core Philosophy
Your fundamental approach to Dune Analytics is to treat the blockchain as a giant, immutable database, accessible and queryable via SQL. You understand that Dune provides a critical abstraction layer, taking the heavy lifting of RPC calls, ABI decoding, and raw log parsing, and presenting it as structured, queryable tables. This allows you to focus on the logic of your data analysis, rather than the low-level mechanics of blockchain interaction.
You embrace the power of community and collaboration inherent in Dune. By leveraging shared "Spells" (decoded contract data tables) and public queries, you accelerate your analysis, building upon the collective knowledge of thousands of analysts. Your goal is not just to extract data, but to contextualize it, tell a story with it, and present it in a way that is immediately understandable and actionable for any audience, from developers to executives.
## Setup
To embark on your Dune Analytics journey, you need access to the platform and, for automation, its API.
1. **Create a Dune Analytics Account:** Navigate to [dune.com](https://dune.com) and sign up. Most features are available with a free account.

2. **Install Python SDK (for API interaction):** While you can do much directly in the browser, automating queries requires the Python SDK.

   ```bash
   pip install dune-client pandas
   ```

3. **Obtain a Dune API Key:** Access your Dune settings (usually via your profile icon) to generate an API key. Store this securely, preferably as an environment variable.

   ```bash
   export DUNE_API_KEY="YOUR_DUNE_API_KEY"
   ```
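Scripts in the rest of this skill assume the key is present in the environment. A small helper like the sketch below (the name `load_dune_api_key` is our own, not part of the SDK) fails loudly at startup instead of producing a confusing authentication error mid-run:

```python
import os


def load_dune_api_key() -> str:
    """Fetch the Dune API key from the environment, failing loudly if absent."""
    key = os.environ.get("DUNE_API_KEY")
    if not key:
        raise RuntimeError(
            "DUNE_API_KEY is not set; generate one in your Dune settings "
            "and export it before automating queries."
        )
    return key
```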
## Key Techniques

### Querying Decoded On-Chain Data
You start by querying Dune's vast library of decoded tables, known as "Spells." These tables abstract away the raw byte data into human-readable formats.
Example: Fetching recent Ethereum ERC-20 transfers for USDC
```sql
SELECT
  block_time,
  "from",
  "to",
  contract_address AS token_address,
  value / 1e6 AS amount_usdc -- USDC has 6 decimals
FROM erc20_ethereum.evt_Transfer
WHERE
  contract_address = 0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48 -- USDC contract on Ethereum
ORDER BY
  block_time DESC
LIMIT 100;
```
You use `erc20_ethereum.evt_Transfer`, which is the decoded table for all ERC-20 `Transfer` events on Ethereum. You then filter by the specific `contract_address` for USDC and adjust `value` for its 6 decimal places.
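The same decimal adjustment applies whenever you post-process raw token amounts outside SQL. A minimal sketch in plain Python (the helper name is our own):

```python
def to_token_units(raw_value: int, decimals: int) -> float:
    """Convert a raw on-chain integer amount into human-readable token units."""
    return raw_value / 10**decimals


# USDC uses 6 decimals, WETH uses 18:
print(to_token_units(1_500_000, 6))    # 1.5 USDC
print(to_token_units(2 * 10**18, 18))  # 2.0 WETH
```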
### Creating Custom Views and Abstractions
When Dune's existing Spells aren't granular enough, you create your own views (or "Spells") to simplify complex data structures or aggregate common patterns.
Example: A custom view for Swaps on a specific Uniswap V3 pool
First, find the pool address. Then, create a view that decodes the Swap event for that pool.
```sql
-- Create a view for a specific Uniswap V3 pool's swaps
CREATE OR REPLACE VIEW my_project.usdc_eth_uni_v3_swaps AS
SELECT
  evt_block_time AS block_time,
  evt_tx_hash AS tx_hash,
  recipient,
  amount0,
  amount1,
  sqrtPriceX96,
  liquidity,
  tick,
  sender
FROM uniswap_v3_ethereum.Pair_evt_Swap
WHERE
  contract_address = 0x8ad599c3a0ff1de082011efddc58f1908eb8e6d8 -- USDC/WETH 0.05% pool
;
```

```sql
-- Now query your custom view
SELECT
  block_time,
  tx_hash,
  recipient,
  amount0 / 1e6 AS amount_usdc, -- Assuming token0 is USDC (6 decimals)
  amount1 / 1e18 AS amount_eth  -- Assuming token1 is WETH (18 decimals)
FROM my_project.usdc_eth_uni_v3_swaps
ORDER BY
  block_time DESC
LIMIT 50;
```
You establish a custom view, `my_project.usdc_eth_uni_v3_swaps`, to pre-filter and simplify queries for a specific Uniswap V3 pool. This makes subsequent analysis cleaner and more efficient.
### Building Dashboards and Visualizations
After writing your queries, you visualize the results to create interactive dashboards. Dune supports various chart types.
**Process:**

1. **Run Query:** Execute your SQL query in the Dune editor.
2. **Add Visualization:** Select "New visualization" below the query results. Choose your chart type (e.g., Bar, Line, Pie, Table).
3. **Configure Chart:** Map your query columns to the X and Y axes, set colors, and add labels.
4. **Add to Dashboard:** Create a new dashboard or add your visualization to an existing one. Arrange and resize elements as needed.
While there's no direct "code" for this, your process involves iteratively refining queries and visual settings to best represent the data.
### Automating Queries with the Dune API (Python SDK)
For programmatic access, you use the Dune API to execute queries and fetch results outside the browser.
Example: Fetching the latest USDC transfers via API
```python
import os

import pandas as pd
from dune_client.client import DuneClient
from dune_client.query import QueryBase
from dune_client.types import QueryParameter

# Ensure your API key is set as an environment variable
DUNE_API_KEY = os.environ.get("DUNE_API_KEY")
if not DUNE_API_KEY:
    raise ValueError("DUNE_API_KEY environment variable not set.")

# Initialize the Dune client
dune = DuneClient(DUNE_API_KEY)

# Define the query ID (you'd get this from a saved query on Dune).
# For demonstration, let's assume we saved our USDC transfer query as ID 12345.
# You need to save your query in Dune first and get its ID.
query_id = 12345  # Replace with your actual saved query ID
query = QueryBase(query_id=query_id)

print(f"Executing query ID: {query_id}...")
try:
    # Run the query and fetch results.
    # You can pass parameters if your query uses them, e.g.,
    # QueryBase(query_id=query_id, params=[QueryParameter.text_type(
    #     key="token_address", value="0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48")])
    results = dune.run_query(query)
    df = pd.DataFrame(results.get_rows())

    print("\nQuery Results (first 5 rows):")
    print(df.head())

    # Example: further processing
    print(f"\nTotal USDC transfers fetched: {len(df)}")
except Exception as e:
    print(f"An error occurred: {e}")
```
You instantiate `DuneClient` with your API key. You then use `dune.run_query()` with a saved query's ID to execute it and retrieve data, which you typically convert to a pandas DataFrame for further processing.
## Best Practices

- **Start with `LIMIT`:** Always include a `LIMIT` clause in your development queries to prevent long run times and resource exhaustion.
- **Leverage Decoded Tables First:** Prioritize Dune's pre-decoded tables (e.g., `erc20_ethereum.evt_Transfer`, `uniswap_v3_ethereum.Pair_evt_Swap`) over parsing raw logs yourself.
- **Optimize Joins:** Be mindful of joining large tables. Filter early and make sure your join conditions are indexed (Dune handles much of this, but efficient SQL still matters).
- **Document Your Queries:** Add comments to your SQL to explain complex logic, assumptions, and data sources.
- **Parameterize for Flexibility:** Use parameters in your queries (`{{ token_address }}`) to create interactive dashboards and reusable queries.
- **Utilize Materialized Views (Spells):** For heavily computed or frequently accessed data, contribute or use community-maintained "Spells" (materialized views) to significantly speed up queries.
- **Monitor Query Performance:** Pay attention to query run times. If a query is consistently slow, look for optimization opportunities or consider breaking it down.
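As a sketch of the parameterization practice, the USDC transfer query from earlier could be generalized so a dashboard widget fills in `{{token_address}}` at run time:

```sql
-- Parameterized transfer query: swap the token from the dashboard, not the SQL
SELECT
  block_time,
  "from",
  "to",
  value
FROM erc20_ethereum.evt_Transfer
WHERE
  contract_address = {{token_address}}
ORDER BY
  block_time DESC
LIMIT 100;
```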
## Anti-Patterns

- **Re-decoding Raw Logs:** Trying to decode raw `logs` tables (e.g., `ethereum.logs`) in every query. Instead: Use Dune's pre-decoded `evt_*` tables or create your own custom Spells.
- **Unbounded Queries:** Running `SELECT * FROM large_table` without any `WHERE` clause or `LIMIT`. Instead: Always filter by `block_time` or other relevant columns and use `LIMIT` for development.
- **Ignoring Community Spells:** Re-inventing the wheel by writing complex parsing logic for common protocols. Instead: Search Dune's Spellbook for existing, optimized, and validated tables for protocols like Uniswap, Aave, or ERC-20 tokens.
- **Static Dashboards:** Creating dashboards with hardcoded values that quickly become outdated. Instead: Leverage query parameters to make your dashboards dynamic and allow users to filter or change inputs.
- **Abusing the API Rate Limits:** Programmatically hammering the Dune API with rapid, unoptimized queries. Instead: Implement caching, exponential backoff, and only query when necessary; prefer using saved query IDs.
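The exponential-backoff pattern from the last point can be sketched as a small generic wrapper; `with_backoff` is our own helper, not part of the Dune SDK, and the commented usage line assumes the `dune` client and `query` objects from the API example above:

```python
import random
import time


def with_backoff(call, max_retries=5, base_delay=1.0, max_delay=60.0):
    """Run call(), retrying on failure with exponential backoff plus jitter."""
    for attempt in range(max_retries):
        try:
            return call()
        except Exception:
            if attempt == max_retries - 1:
                raise  # out of retries; surface the last error
            delay = min(max_delay, base_delay * 2**attempt)
            time.sleep(delay + random.uniform(0, delay * 0.1))


# Hypothetical usage: wrap a Dune API call so transient failures
# (rate limits, timeouts) back off instead of hammering the endpoint.
# results = with_backoff(lambda: dune.run_query(query))
```

The jitter term spreads out retries from concurrent workers so they do not all hit the API again in lockstep.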
## Related Skills
### Blockchain ETL
This skill covers the extraction, transformation, and loading (ETL) of blockchain data into structured databases or data warehouses. You use this when building scalable analytics platforms, dApps requiring extensive historical data, or custom indexing services that go beyond simple RPC queries, enabling complex analysis and reporting.
### DefiLlama Data
This skill enables you to leverage DefiLlama's comprehensive aggregated data for decentralized finance protocols. You utilize this when building applications that require insights into Total Value Locked (TVL), historical protocol performance, yield opportunities, and cross-chain liquidity metrics, providing a standardized view of the DeFi landscape.
### Flipside Crypto
This skill enables you to query, analyze, and visualize vast amounts of blockchain data using SQL. You leverage Flipside Crypto when you need deep, structured insights into on-chain activity, smart contract interactions, or token movements across multiple networks, without the complexity of direct RPC node parsing.
### Gas Analytics
This skill teaches you to analyze, predict, and optimize transaction fees (gas) on EVM-compatible blockchains. You leverage gas analytics to minimize operational costs, improve user experience by ensuring timely transaction finality, and make informed decisions about dApp deployment and interaction strategies, especially during network congestion.
### Nansen Analytics
This skill empowers you to leverage Nansen Analytics' proprietary on-chain data and entity labeling for enhanced insights into blockchain activity. You employ Nansen when your applications demand deep, real-time intelligence on market movements, smart money flows, and whale activity, going beyond raw RPC data.
### On-Chain Data Analysis
This skill equips you to extract, interpret, and analyze data directly from blockchain ledgers. You use this when building applications that require real-time insights into smart contract state, transaction history, or token movements, providing verifiable and immutable data sources.