# Flipside Crypto
This skill enables you to query, analyze, and visualize vast amounts of blockchain data using SQL. You leverage Flipside Crypto when you need deep, structured insights into on-chain activity, smart contract interactions, or token movements across multiple networks, without the complexity of direct RPC node parsing.
You are a blockchain data architect with a SQL superpower, a master of transforming raw, complex ledger data into actionable intelligence. Your domain is the structured world of Flipside Crypto's data warehouse, where you navigate indexed blockchain information with precision, extracting narratives and insights that drive dApp development, market analysis, and protocol monitoring. You speak the language of SQL to unlock the secrets held within the chain.
## Core Philosophy
Your fundamental approach with Flipside Crypto is to democratize and accelerate blockchain data analysis. You understand that while direct RPC querying offers ultimate control, it's often cumbersome and resource-intensive for common analytical tasks. Flipside provides an invaluable abstraction layer, indexing and transforming raw blockchain data into user-friendly, relational tables. This allows you to focus on what you want to analyze, rather than how to parse raw bytes or manage complex node infrastructure.
You recognize that Flipside's strength lies in its vast, multi-chain datasets and its SQL-first interface. You leverage their meticulously curated tables – from `ez_transactions` to decoded event logs – to quickly answer complex questions about on-chain behavior. Your goal is to bypass the need for custom data pipelines for most analytical queries, instead tapping into a shared, performant data platform that constantly updates and expands, enabling rapid prototyping and deployment of data-driven features.
## Setup

To begin leveraging Flipside Crypto, you primarily interact with their web platform and, for programmatic access, their SDKs.

1. **Create an Account:** Navigate to app.flipsidecrypto.com and sign up. This provides access to their SQL editor, dashboards, and API key management.

2. **Generate an API Key:** Once logged in, go to the "My Account" or "API Keys" section and generate a new API key. This key is required for programmatic access.

   ```bash
   # Store your API key securely, e.g., as an environment variable
   export FLIPSIDE_API_KEY="YOUR_GENERATED_API_KEY_HERE"
   ```

3. **Install the SDK (optional, for programmatic access):** Choose your preferred language; Python and JavaScript SDKs are available.

   Python (the SDK is published on PyPI as `flipside`):

   ```bash
   pip install flipside
   ```

   JavaScript/TypeScript:

   ```bash
   npm install @flipsidecrypto/sdk
   # or: yarn add @flipsidecrypto/sdk
   ```
## Key Techniques

### 1. Direct SQL Querying for On-Chain Data
You use Flipside's web interface or any SQL client to write queries against their structured datasets. This is your primary tool for exploring and extracting data.
```sql
-- Example: Fetch the top 10 largest ETH transfers on Ethereum
SELECT
    block_timestamp,
    tx_hash,
    from_address,
    to_address,
    eth_value AS value_eth
FROM
    ethereum.core.ez_eth_transfers
WHERE
    block_timestamp >= CURRENT_DATE - INTERVAL '7 DAY'
ORDER BY
    value_eth DESC
LIMIT 10;
```
### 2. Programmatic Query Submission and Retrieval (Python SDK)
For integrating Flipside data into applications or automating analysis, you use their SDK to submit queries and fetch results.
```python
import os

# The official Python SDK is published as `flipside`; adjust the import
# if your installed SDK version differs.
from flipside import Flipside

# Initialize the client with your API key
flipside = Flipside(os.getenv("FLIPSIDE_API_KEY"))

# Define a SQL query: recent PoolCreated events from the Uniswap V3 Factory.
# Decoded event columns (event_name, decoded_log) live in the
# ez_decoded_event_logs table.
sql_query = """
SELECT
    block_timestamp,
    tx_hash,
    origin_from_address,
    origin_to_address,
    event_name,
    decoded_log
FROM
    ethereum.core.ez_decoded_event_logs
WHERE
    block_timestamp >= CURRENT_DATE - INTERVAL '1 day'
    AND contract_address = '0x1f98431c8ad98523631ae4a59f267346ea31f984' -- Uniswap V3 Factory
    AND event_name = 'PoolCreated'
LIMIT 5;
"""

# Submit the query and iterate over the results
try:
    query_result = flipside.query(sql_query)
    if query_result.records:
        print("Latest Uniswap V3 Pools Created:")
        for record in query_result.records:
            # decoded_log holds the decoded event arguments as JSON
            print(
                f"  Timestamp: {record['block_timestamp']}, "
                f"Tx Hash: {record['tx_hash']}, "
                f"Creator: {record['origin_from_address']}, "
                f"Pool Address: {record['decoded_log']['pool']}"
            )
    else:
        print("No results found.")
except Exception as e:
    print(f"Error executing query: {e}")
```
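Depending on SDK version and transport, a variant column like `decoded_log` may come back as a parsed dict or as a raw JSON string. A small illustrative helper (the name `as_decoded` is ours, not the SDK's) normalizes it before you index into the event arguments:

```python
import json

def as_decoded(decoded_log):
    """Return decoded_log as a dict, parsing JSON strings if needed."""
    if isinstance(decoded_log, str):
        return json.loads(decoded_log)
    return decoded_log or {}
```

With this in place, `as_decoded(record['decoded_log'])['pool']` works regardless of how the value was serialized.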
### 3. Querying Decoded Smart Contract Events
Flipside decodes common contract events, allowing you to filter and extract specific event data directly via SQL. This is crucial for understanding contract interactions.
```sql
-- Example: Find recent 'Transfer' events for a specific ERC-20 token (USDC)
SELECT
    block_timestamp,
    tx_hash,
    event_index,
    decoded_log:"from"::STRING AS from_address,
    decoded_log:"to"::STRING AS to_address,
    decoded_log:"value"::NUMBER AS value_raw,
    value_raw / POW(10, 6) AS value_usdc -- USDC has 6 decimals
FROM
    ethereum.core.ez_decoded_event_logs
WHERE
    block_timestamp >= CURRENT_TIMESTAMP - INTERVAL '1 hour'
    AND contract_address = '0xa0b86991c6218b36c1d19d4a2e9eb0ce3606eb48' -- USDC contract address
    AND event_name = 'Transfer'
LIMIT 10;
```
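The decimal scaling shown in the SQL (dividing the raw integer value by 10 to the power of the token's decimals) can equally be done client-side. A minimal sketch, with a hypothetical helper name:

```python
from decimal import Decimal

def scale_token_amount(value_raw, decimals):
    """Convert a raw ERC-20 amount to a human-readable token amount."""
    return Decimal(value_raw) / (Decimal(10) ** decimals)

# USDC uses 6 decimals, so 1_500_000 raw units is 1.5 USDC
```

Using `Decimal` rather than floats avoids precision loss on large raw values such as 18-decimal tokens.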
### 4. Leveraging User-Defined Functions (UDFs) and Macros
Flipside provides UDFs and macros to simplify complex operations, such as converting timestamps or parsing specific data types.
```sql
-- Example: Use a Flipside UDF for price lookup (conceptual; actual UDFs vary)
SELECT
    block_timestamp,
    token_address,
    amount,
    flipside_udfs.get_price(token_address, block_timestamp, 'usd') AS price_usd
FROM
    ethereum.core.ez_token_transfers
WHERE
    block_timestamp >= CURRENT_DATE - INTERVAL '1 DAY'
    AND token_address = '0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2' -- WETH
LIMIT 5;
-- Note: Specific UDF names and availability vary. Always check Flipside's documentation for current functions.
```
## Best Practices

- **Filter Early and Aggressively:** Always include `WHERE` clauses on `block_timestamp` and other indexed columns (like `contract_address`, `tx_hash`) to reduce the data scanned.
- **Understand Table Schemas:** Familiarize yourself with Flipside's extensive table documentation for each chain. Knowing column types and common joins is critical.
- **Leverage `ez_*` Tables:** Start with the "easy" tables (e.g., `ez_eth_transfers`, `ez_token_transfers`) for common data, as they are pre-processed and optimized.
- **Use `LIMIT` for Exploration:** When developing queries, always add `LIMIT` to avoid querying massive datasets unnecessarily, especially for large time ranges.
- **Cache Heavy Queries:** For frequently accessed or computationally intensive results, use Flipside's snapshotting feature or build your own caching layer on top of their SDK.
- **Explore Community Queries:** Learn from others by browsing the "Discover" section on Flipside. This often reveals optimized patterns or pre-built insights.
- **Alias Columns and Tables:** Use clear aliases (`AS`) for readability, especially in complex joins or when returning derived values.
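A caching layer over the SDK can be very simple. The sketch below is illustrative (the `QueryCache` class and the `run_query` callable are our names, not Flipside's); it keys cached rows by a hash of the SQL text and re-runs the query only after a TTL expires:

```python
import hashlib
import time

class QueryCache:
    def __init__(self, ttl_seconds=3600):
        self.ttl = ttl_seconds
        self._store = {}  # sql hash -> (fetched_at, rows)

    def get_or_run(self, sql, run_query):
        """Return cached rows for `sql`, or call run_query(sql) and cache them."""
        key = hashlib.sha256(sql.encode()).hexdigest()
        hit = self._store.get(key)
        if hit and time.time() - hit[0] < self.ttl:
            return hit[1]  # still fresh
        rows = run_query(sql)
        self._store[key] = (time.time(), rows)
        return rows
```

In practice `run_query` would wrap a real SDK call (e.g. a function that submits the SQL and returns its records); any callable with that shape works.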
## Anti-Patterns

- **Ignoring Table Schemas.** You attempt to query columns that don't exist or perform invalid joins, leading to errors or inefficient queries. Instead, always consult Flipside's schema documentation for the specific chain and table you are using.
- **Over-fetching Data Without Limits.** You run queries without `WHERE` clauses on `block_timestamp` or without a `LIMIT` clause, trying to pull the entire history of a chain. This leads to long execution times, high compute costs, and potential timeouts. Instead, always specify a reasonable time range and use `LIMIT` for initial exploration.
- **Reinventing the Wheel for Common Data.** You attempt to parse raw `logs` or `traces` for simple token transfers or ETH movements. Flipside has pre-built, optimized `ez_*` tables for these common use cases. Instead, leverage the `ez_eth_transfers`, `ez_token_transfers`, and other `ez_*` tables for readily available, decoded data.
- **Hardcoding API Keys Directly in Code.** You embed your Flipside API key directly into your scripts or public repositories. This is a security risk. Instead, use environment variables (`FLIPSIDE_API_KEY`) or a secrets management system to keep your API key secure.
- **Treating Raw Logs as Easily Readable.** You expect the event logs table's `decoded_log` column to always contain human-readable names for all events and arguments. While Flipside decodes many, some custom or very new contracts might not be fully parsed. Instead, verify the decoding quality for custom contracts and be prepared to write more specific parsing logic if `decoded_log` is incomplete for your use case.
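The "no hardcoded keys" rule is easy to enforce with a small guard that fails fast when the environment variable is missing. A minimal sketch (the helper name is ours):

```python
import os

def load_api_key(env_var="FLIPSIDE_API_KEY"):
    """Read the API key from the environment; raise if it is not set."""
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(
            f"{env_var} is not set; export it instead of hardcoding keys."
        )
    return key
```

Calling this once at startup turns a silent misconfiguration into an immediate, actionable error.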
Install this skill directly: `skilldb add blockchain-data-skills`
## Related Skills

### Blockchain ETL
This skill covers the extraction, transformation, and loading (ETL) of blockchain data into structured databases or data warehouses. You use this when building scalable analytics platforms, dApps requiring extensive historical data, or custom indexing services that go beyond simple RPC queries, enabling complex analysis and reporting.

### DefiLlama Data
This skill enables you to leverage DefiLlama's comprehensive aggregated data for decentralized finance protocols. You utilize this when building applications that require insights into Total Value Locked (TVL), historical protocol performance, yield opportunities, and cross-chain liquidity metrics, providing a standardized view of the DeFi landscape.

### Dune Analytics
This skill enables you to leverage Dune Analytics for querying, analyzing, and visualizing blockchain data using SQL. You use this when building dashboards, creating custom data insights, or integrating on-chain data into applications, transforming raw ledger information into actionable intelligence.

### Gas Analytics
This skill teaches you to analyze, predict, and optimize transaction fees (gas) on EVM-compatible blockchains. You leverage gas analytics to minimize operational costs, improve user experience by ensuring timely transaction finality, and make informed decisions about dApp deployment and interaction strategies, especially during network congestion.

### Nansen Analytics
This skill empowers you to leverage Nansen Analytics' proprietary on-chain data and entity labeling for enhanced insights into blockchain activity. You employ Nansen when your applications demand deep, real-time intelligence on market movements, smart money flows, and whale activity, going beyond raw RPC data.

### On-Chain Data Analysis
This skill equips you to extract, interpret, and analyze data directly from blockchain ledgers. You use this when building applications that require real-time insights into smart contract state, transaction history, or token movements, providing verifiable and immutable data sources.