HAF Block Explorer (HAFBE) is a comprehensive blockchain API for querying information about the Hive blockchain. It provides a RESTful API for accessing transactions, operations, accounts, witnesses, and block data.
Built on PostgreSQL and PostgREST, HAFBE processes blockchain data through the Hive Application Framework (HAF) and exposes it via a clean REST API. It integrates three sub-applications to provide complete blockchain data coverage: Balance Tracker (account balances), Reputation Tracker (reputation scores), and HAfAH (account history).
HAFBE is designed for developers building Hive applications, block explorers, analytics tools, and any service that needs access to historical blockchain data.
- Block Search: Query blocks by number, hash, or transaction hash
- Account Data: Look up account information, operation history, and statistics
- Witness Information: Track witness votes, statistics, and price feeds
- Transaction Lookup: Search transactions by hash with full operation details
- Balance Tracking: Real-time account balances (HIVE, HBD, VESTS, savings, delegations)
- Reputation Scores: Calculated reputation for all accounts
- Authority History: Track account key and authority changes over time
- OpenAPI/Swagger: Auto-generated API documentation and client generation
- High Performance: Optimized PostgreSQL queries and indexes for fast responses
```
┌─────────────────────────────────────────────────────────────┐
│                       Hive Blockchain                       │
└─────────────────────────────────────────────────────────────┘
                               │
                               ▼
┌─────────────────────────────────────────────────────────────┐
│              HAF (Hive Application Framework)               │
│      Receives blocks from hived, stores in PostgreSQL       │
└─────────────────────────────────────────────────────────────┘
                               │
                               ▼
┌─────────────────────────────────────────────────────────────┐
│                     HAF Block Explorer                      │
│ ┌─────────────┐ ┌─────────────┐ ┌─────────────────────────┐ │
│ │  btracker   │ │ reptracker  │ │          hafah          │ │
│ │ (balances)  │ │(reputation) │ │    (account history)    │ │
│ └─────────────┘ └─────────────┘ └─────────────────────────┘ │
│                                                             │
│ ┌─────────────────────────────────────────────────────────┐ │
│ │                  HAFBE Core Processing                  │ │
│ │  • Witness tracking        • Account statistics         │ │
│ │  • Block search            • Transaction indexing       │ │
│ └─────────────────────────────────────────────────────────┘ │
└─────────────────────────────────────────────────────────────┘
                               │
                               ▼
┌─────────────────────────────────────────────────────────────┐
│                          PostgREST                          │
│          Converts SQL functions to REST endpoints           │
└─────────────────────────────────────────────────────────────┘
                               │
                               ▼
┌─────────────────────────────────────────────────────────────┐
│                      REST API Clients                       │
│       Applications, Block Explorers, Analytics Tools        │
└─────────────────────────────────────────────────────────────┘
```
- Database: PostgreSQL 14 with PL/pgSQL stored procedures
- API Layer: PostgREST (automatic REST API from SQL schema)
- Data Source: HAF (Hive Application Framework)
- Languages: SQL (primary), Bash (scripts), Python 3.12+ (testing/client)
- Containerization: Docker, Docker Compose, BuildKit
- Testing: Tavern (YAML API tests), pytest, JMeter (performance)
- Linting: sqlfluff (SQL), shellcheck (Bash)
- Package Management: Poetry (Python dependencies)
```
haf_block_explorer/
├── backend/                  SQL application logic
│   ├── endpoint_helpers/     Functions powering API endpoints
│   ├── operation_parsers/    Extract data from blockchain operations
│   └── utilities/            Shared utilities (validation, constants)
├── db/                       Core database layer
│   ├── hafbe_app.sql         Main app setup and processing loop
│   ├── process_*.sql         Block processing functions
│   └── indexes.sql           Database indexes
├── endpoints/                PostgREST API definitions
│   ├── endpoint_schema.sql   All endpoint definitions
│   ├── accounts/             Account endpoints
│   ├── witnesses/            Witness endpoints
│   ├── block-search/         Block search endpoints
│   └── transactions/         Transaction endpoints
├── scripts/                  Operational scripts
│   ├── install_app.sh        Install HAFBE on database
│   ├── process_blocks.sh     Run block processing
│   └── uninstall_app.sh      Remove HAFBE
├── tests/                    Test suites
│   ├── regression/           Compare against hived snapshots
│   ├── tavern/               YAML-based API tests
│   ├── performance/          JMeter performance tests
│   └── functional/           Script tests
├── docker/                   Docker Compose deployment
└── submodules/               Integrated sub-applications
    ├── btracker/             Balance Tracker
    ├── reptracker/           Reputation Tracker
    └── hafah/                HAF Account History (HAfAH)
```
Before installing HAF Block Explorer, you need:
- HAF Server: A running HAF (Hive Application Framework) instance with:
- PostgreSQL 14+
- HAF SQL extensions installed
- Block data synced (or syncing)
- System Requirements:
- PostgreSQL 14 or higher
- Bash shell
- Python 3.12+ (for testing and client generation)
- For Docker deployment:
- Docker Engine 20.10+
- Docker Compose v2
Note: HAFBE uses balance_tracker as a sub-app. If you have balance_tracker installed separately, uninstall it first to avoid schema conflicts.
The easiest way to run HAFBE is using the HAF API Node scripts, which manage HAF and all applications together.
- Clone the repository with submodules:

  ```bash
  git clone --recursive https://gitlab.syncad.com/hive/haf_block_explorer.git
  cd haf_block_explorer
  ```

- Set up dependencies:

  ```bash
  ./scripts/setup_dependencies.sh --install-all
  ```

- Install HAFBE on the database:

  ```bash
  ./scripts/install_app.sh --host=<db_host> --port=5432 --user=haf_admin
  ```

  Run `./scripts/install_app.sh --help` to see all options.
After installation, process blockchain blocks to populate HAFBE data:

```bash
./scripts/process_blocks.sh --host=<db_host>
```

Processing continues until it catches up with the latest blocks in HAF, then switches to live mode.

Run `./scripts/process_blocks.sh --help` for options, including:

- `--stop-at-block=N`: Stop after processing block N
- `--log-file=PATH`: Write logs to a file
Once PostgREST is running (default port 3000), access endpoints like:
```bash
# Get account info
curl "http://localhost:3000/rpc/get_account?_account_name=hiveio"

# Search blocks
curl "http://localhost:3000/rpc/get_block?_block_num=1000000"

# Get witness data
curl "http://localhost:3000/rpc/get_witness?_witness_name=blocktrades"
```

For Docker deployment:

```bash
cd docker
docker compose up -d                     # Start all services
docker compose --profile swagger up -d   # Include Swagger UI (port 8080)
docker compose --profile db-tools up -d  # Include PgAdmin/PgHero
```

See docker/README.md for detailed Docker instructions.
HAF Block Explorer integrates three sub-applications:
Tracks all account balances in real-time:
- HIVE and HBD liquid balances
- VESTS (staked HIVE)
- Savings balances
- Delegations (incoming and outgoing)
- Recurrent transfers
Calculates reputation scores for all accounts based on votes received. Reputation is computed using the same algorithm as hived.
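The raw on-chain reputation value is a very large integer; frontends usually compress it onto the familiar ~25-75 scale. The sketch below shows the log-based conversion commonly used by Hive UIs — it is a display helper only and an approximation of the idea, not reptracker's internal algorithm:

```python
import math

def humanize_reputation(raw: int) -> int:
    """Map a raw on-chain reputation value to the ~25-75 display scale.

    This is the conversion commonly used by Hive frontends; it is
    illustrative, not reptracker's exact implementation.
    """
    if raw == 0:
        return 25  # brand-new accounts display as 25
    score = max(math.log10(abs(raw)) - 9, 0)
    if raw < 0:
        score = -score  # heavily downvoted accounts go below 25
    return int(score * 9 + 25)
```

For example, a raw value of 10^13 maps to 61 on the display scale.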
Provides account history API, including:
- All operations involving an account
- Filtered queries by operation type
- Historical account activity
To remove HAFBE and its data from the database:
```bash
./scripts/uninstall_app.sh --host=<db_host>
```

HAFBE processes blockchain data from HAF (Hive Application Framework) in a continuous main loop. The entry point is `hafbe_app.main()`, called by `scripts/process_blocks.sh`.
Processing pipeline:
- `scripts/process_blocks.sh` starts the application
- `hafbe_app.main()` enters an infinite loop calling `hive.app_next_iteration()`
- For each block range, the system orchestrates:
  - Balance Tracker processing (btracker submodule)
  - HAFBE core processing (5 processors)
  - HAF state provider updates
Massive sync is used during initial sync or when catching up after downtime:
- Batch size: 10,000 blocks per iteration
- Optimization: `synchronous_commit = OFF` for maximum throughput
- Behavior: No cache table updates; indexes are created after the stage completes
Live sync is used once processing has caught up with the blockchain head:
- Batch size: 1 block per iteration
- Optimization: `synchronous_commit = ON` for data safety
- Behavior: Updates cache tables for fast API queries
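The two modes can be pictured with a small sketch. The 10,000-block batch size comes from the text above; the catch-up threshold used here is an illustrative assumption, not HAFBE's actual cutoff:

```python
MASSIVE_BATCH = 10_000  # blocks per iteration during massive sync
LIVE_BATCH = 1          # one block at a time once caught up

def choose_batch_size(head_block: int, last_processed: int,
                      catchup_threshold: int = 10_000) -> int:
    """Pick the batch size for the next iteration.

    Far behind the head -> massive sync (large batches, commits relaxed);
    near the head -> live sync (single blocks, commits durable).
    The threshold value is an assumption for illustration only.
    """
    blocks_behind = head_block - last_processed
    return MASSIVE_BATCH if blocks_behind > catchup_threshold else LIVE_BATCH
```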
Each block range runs through these processors in order:
- Account Stats (`process_account_stats`) - Account creation, recovery, voting rights
- Block Operations (`process_block_operations`) - Operation counts per block
- Transaction Stats (`process_transaction_stats`) - Daily/monthly transaction aggregations
- Witness Stats (`process_witness_stats`) - Witness properties and metadata
- Witness Votes (`process_witness_votes`) - Vote and proxy state management
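The fixed ordering could be driven by a simple dispatcher; a sketch follows. The processor names come from the list above, but the SQL call shape (`hafbe_app.<processor>(from, to)`) is an assumption made for illustration:

```python
# Processor names from the pipeline above; the call signature is assumed.
PROCESSORS = (
    "process_account_stats",
    "process_block_operations",
    "process_transaction_stats",
    "process_witness_stats",
    "process_witness_votes",
)

def pipeline_statements(from_block: int, to_block: int) -> list[str]:
    """Render the SQL calls for one block range, in the documented order."""
    return [
        f"SELECT hafbe_app.{name}({from_block}, {to_block});"
        for name in PROCESSORS
    ]
```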
HAFBE tracks comprehensive witness data:
- Properties: URL, signing key, price feed, block size preferences
- Statistics: Missed blocks, last created block, version
- Votes: Current votes, vote history, voter counts
- Proxies: Proxy assignments and delegation chains
```
Hive Blockchain
      │
      ▼
HAF (stores operations in PostgreSQL)
      │
      ▼
process_blocks.sh → hafbe_app.main()
      │
      ├── btracker_process_blocks()        → Balance data
      │
      ├── process_account_stats()          → Account parameters
      ├── process_block_operations()       → Operation counts
      ├── process_transaction_stats()      → Transaction aggregations
      ├── process_witness_stats()          → Witness metadata
      ├── process_witness_votes()          → Vote/proxy state
      │
      └── haf.app_state_providers_update() → HAF core state
```
| Function | File | Purpose |
|---|---|---|
| `process_blocks()` | `db/hafbe_app.sql` | Dispatcher for massive/single processing |
| `process_account_stats()` | `db/process_account_stats.sql` | Account parameters |
| `process_block_operations()` | `db/process_block_operations.sql` | Operation counts |
| `process_transaction_stats()` | `db/process_transaction_stats.sql` | Transaction aggregations |
| `process_witness_stats()` | `db/process_witness_stats.sql` | Witness metadata |
| `process_witness_votes()` | `db/process_witness_votes.sql` | Votes and proxies |
To gracefully stop block processing:
```sql
-- From a separate database session:
SELECT hafbe_app.stopProcessing();
COMMIT; -- Must commit for the change to be visible
```

The main loop will exit after completing the current block.
HAFBE exposes a REST API via PostgREST, automatically converting PostgreSQL functions into HTTP endpoints. The API uses JSON for request/response bodies.
http://localhost:3000/hafbe-api/
When deployed with HAF API Node, the API is typically available at:
https://api.hive.blog/hafbe-api/
No authentication is required. The API is read-only and publicly accessible.
| Endpoint | Method | Description |
|---|---|---|
| `/block-search` | GET | Search blocks by operation type, account, and time range |
| Endpoint | Method | Description |
|---|---|---|
| `/accounts/{account-name}` | GET | Get account info with balances |
| `/accounts/{account-name}/authority` | GET | Get account keys and authorities |
| `/accounts/{account-name}/operations/comments/{permlink}` | GET | Get operations for a post |
| `/accounts/{account-name}/comments` | GET | List account's posts/comments |
| `/accounts/{account-name}/proxies` | GET | Get accounts proxying to this account |
| Endpoint | Method | Description |
|---|---|---|
| `/witnesses` | GET | List all witnesses with ranking |
| `/witnesses/{account-name}` | GET | Get single witness info |
| `/witnesses/{account-name}/voters` | GET | List accounts voting for a witness |
| `/witnesses/{account-name}/voters/num` | GET | Get voter count for a witness |
| `/witnesses/{account-name}/votes-history` | GET | Get witness vote history |
| Endpoint | Method | Description |
|---|---|---|
| `/transaction-statistics` | GET | Get aggregated transaction stats |
| Endpoint | Method | Description |
|---|---|---|
| `/version` | GET | Get HAFBE version (git hash) |
| `/headblock` | GET | Get last synced block number |
| `/operation-type-counts` | GET | Get operation histogram for blocks |
| `/input-type/{input-value}` | GET | Detect input type (block/tx/account) |
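The `/input-type/{input-value}` endpoint saves clients from guessing what a search string refers to. A rough client-side approximation of the idea (the server's actual detection rules may differ):

```python
import re

def classify_input(value: str) -> str:
    """Guess what a search string refers to: a block number, a
    40-hex-character transaction hash, or an account name.
    Illustrative only; the server's detection logic may differ.
    """
    if value.isdigit():
        return "block_num"
    if re.fullmatch(r"[0-9a-f]{40}", value.lower()):
        return "transaction_hash"
    return "account_name"
```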
```bash
curl "http://localhost:3000/hafbe-api/accounts/blocktrades"
```

Response:

```json
{
  "id": 440,
  "name": "blocktrades",
  "can_vote": true,
  "reputation": 69,
  "balance": 29594875,
  "vesting_shares": "8172549681941451",
  "witnesses_voted_for": 9,
  "is_witness": true
}
```

```bash
curl "http://localhost:3000/hafbe-api/block-search?operation-types=0&page-size=5"
```

Response:

```json
{
  "total_blocks": 5000000,
  "total_pages": 1000000,
  "blocks_result": [
    {
      "block_num": 5000000,
      "created_at": "2016-09-15T19:47:21",
      "producer_account": "ihashfury",
      "operations": [
        {"op_type_id": 0, "op_count": 2}
      ]
    }
  ]
}
```

```bash
curl "http://localhost:3000/hafbe-api/witnesses/blocktrades"
```

Response:

```json
{
  "witness_name": "blocktrades",
  "rank": 8,
  "url": "https://blocktrades.us",
  "vests": "82373419958692803",
  "voters_num": 263,
  "price_feed": 0.545,
  "version": "1.27.0",
  "missed_blocks": 935
}
```

```bash
curl "http://localhost:3000/hafbe-api/transaction-statistics?granularity=monthly"
```

Response:

```json
[
  {
    "date": "2024-01-01T00:00:00",
    "trx_count": 15234567,
    "avg_trx": 12,
    "min_trx": 0,
    "max_trx": 156
  }
]
```

Many endpoints support these standard parameters:
| Parameter | Type | Description |
|---|---|---|
| `page` | integer | Page number (1-indexed) |
| `page-size` | integer | Results per page (default: 100, max: 1000) |
| `direction` | string | Sort order: `asc` or `desc` |
| `from-block` | string | Block number or timestamp (`YYYY-MM-DD HH:MI:SS`) |
| `to-block` | string | Block number or timestamp (`YYYY-MM-DD HH:MI:SS`) |
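As an example, a paginated request URL can be assembled from these parameters. The endpoint path is one of the witness endpoints listed earlier; `BASE` assumes the default local PostgREST address:

```python
from urllib.parse import urlencode, quote

BASE = "http://localhost:3000/hafbe-api"  # default local PostgREST address

def voters_url(witness: str, page: int = 1, page_size: int = 100,
               direction: str = "desc") -> str:
    """Build a paginated /witnesses/{account-name}/voters request URL
    using the common query parameters documented above."""
    query = urlencode({"page": page, "page-size": page_size,
                       "direction": direction})
    return f"{BASE}/witnesses/{quote(witness)}/voters?{query}"
```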
The complete OpenAPI 3.1 specification is auto-generated from SQL comments.
Access the spec:
```bash
curl "http://localhost:3000/hafbe-api/"
```

Swagger UI: Available on port 8080 when deployed with Docker using:

```bash
docker compose --profile swagger up -d
```

| Status | Description |
|---|---|
| 200 | Success |
| 404 | Resource not found (e.g., account doesn't exist) |
| 400 | Invalid parameters |
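A caller can translate these status codes into language-level errors. A minimal sketch — the exception choices here are ours, not part of the API:

```python
def raise_for_hafbe_status(status: int) -> None:
    """Map HAFBE HTTP status codes to exceptions; no-op on success."""
    if status == 200:
        return
    if status == 404:
        raise LookupError("resource not found (e.g. the account does not exist)")
    if status == 400:
        raise ValueError("invalid request parameters")
    raise RuntimeError(f"unexpected HTTP status {status}")
```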
HAFBE uses a multi-layered testing strategy to ensure API correctness, data integrity, and performance:
| Test Type | Purpose |
|---|---|
| Regression | Compare computed data against hived snapshots |
| Tavern | Validate API response patterns (YAML-based) |
| Performance | JMeter load testing for throughput measurement |
| Functional | Verify install/uninstall scripts work correctly |
```bash
# Regression tests - compare against hived snapshots
cd tests/regression && ./run_test.sh --host=localhost --type=all

# Tavern API pattern tests (parallel execution)
cd tests/tavern/patterns-mainnet && pytest -n 8 .

# Performance tests with JMeter
./tests/performance/run_performance_tests.sh --postgrest-host=localhost

# Functional script tests
./tests/functional/test_scripts.sh --host=localhost
```

Regression tests compare HAFBE's computed account and witness data against expected values from hived node snapshots, detecting calculation errors and processing bugs.
- Location: `tests/regression/`
- Runner: `./run_test.sh --host=localhost --type=all`
- Options: `--type=account`, `--type=witness`, `--type=all`
YAML-based API tests using pytest-tavern. Validates endpoint responses against saved pattern files.
- Location: `tests/tavern/patterns-mainnet/`
- Runner: `pytest -n 8 .`
- Environment: Requires `HAFBE_ADDRESS` and `HAFBE_PORT`
JMeter-based load testing to measure endpoint throughput and response times.
- Location: `tests/performance/`
- Runner: `./run_performance_tests.sh`
- Output: HTML dashboard at `result/result_report/index.html`
Verifies install and uninstall scripts work correctly.
- Location: `tests/functional/`
- Runner: `./test_scripts.sh --host=localhost`
Tests run automatically in the GitLab CI pipeline:
detect → lint → build → sync → test → publish
| CI Job | Test Type | Artifacts |
|---|---|---|
| `regression-test` | Regression | `regression_test.log` |
| `pattern-test` | Tavern | JUnit XML report |
| `performance-test` | Performance | HTML report |
| `setup-scripts-test` | Functional | - |
- Regression: Add comparison logic to `tests/regression/sql/02_compare.sql`
- Tavern: Create a `.tavern.yaml` file in the appropriate `tests/tavern/patterns-mainnet/` directory
- Performance: Modify the `tests/performance/endpoints.jmx` JMeter test plan
- Functional: Add tests to `tests/functional/test_scripts.sh`
For detailed test documentation, see scripts/claude/tests.md.
Contributions are welcome! Please follow these guidelines:
- Fork the repository and create a feature branch
- Follow coding conventions:
  - SQL: Max 170-char lines, use `SET ROLE hafbe_owner;`
  - Bash: Use `set -euo pipefail`, include a `--help` option
  - Python: Python 3.12+, use Poetry for dependencies
- Add tests for new functionality
- Update documentation in `scripts/claude/` for significant changes
- Submit a merge request with a clear description
```bash
# Clone with submodules
git clone --recursive https://gitlab.syncad.com/hive/haf_block_explorer.git

# Make changes to SQL files
vim backend/endpoint_helpers/your_function.sql

# Install changes
./scripts/install_app.sh --host=localhost

# Run tests
cd tests/tavern/patterns-mainnet && pytest -n 8 .
```

This project is licensed under the MIT License. See the LICENSE file for details.
- Repository: GitLab - haf_block_explorer
- HAF Documentation: GitLab - HAF
- HAF API Node: GitLab - haf_api_node
- Hive Developer Portal: developers.hive.io
- Issue Tracker: GitLab Issues