
Ingestion

The Data Ingestion Module is responsible for reliably connecting to Hyperliquid's data sources and feeding raw or structured data into the Panther processing pipeline.

CORE WebSocket Client

  • Purpose: Connects to Hyperliquid's primary WebSocket API (wss://api.hyperliquid.xyz/ws) to receive real-time data from the CORE perpetuals exchange.

  • Functionality:

    • Establishes and maintains a persistent WebSocket connection.

    • Handles authentication if required by the API.

    • Subscribes to necessary data streams (e.g., l2Book, trades, webData - [Placeholder: List exact streams used]).

    • Receives incoming JSON messages.

    • Parses JSON data.

    • Implements robust error handling (connection drops, malformed messages).

    • Includes automatic reconnection logic with backoff strategies.

    • Publishes received and parsed data to designated Redis channels.

  • Technology: [Placeholder: e.g., Python with websockets library, Node.js with ws library]
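
The client's responsibilities above (persistent connection, stream subscription, JSON parsing, reconnection with backoff) can be sketched in Python with the `websockets` library, one of the placeholder technology options. The subscription frame shape and the example `"ETH"` coin parameter are assumptions to verify against Hyperliquid's API documentation, and the Redis hand-off is abstracted as a `publish` callback:

```python
import asyncio
import json
import random

WS_URL = "wss://api.hyperliquid.xyz/ws"

def subscription_message(stream: str, coin: str) -> str:
    # Assumed Hyperliquid subscription frame for streams like l2Book/trades;
    # confirm the exact shape per stream against the exchange docs.
    return json.dumps({"method": "subscribe",
                       "subscription": {"type": stream, "coin": coin}})

def backoff_delay(attempt: int, base: float = 1.0, cap: float = 60.0) -> float:
    # Exponential backoff with jitter: ~1s, 2s, 4s, ... capped at `cap` seconds.
    return min(cap, base * (2 ** attempt)) * random.uniform(0.5, 1.0)

async def run_client(publish) -> None:
    # `publish` is a callback (e.g. the Message Queue Publisher). `websockets`
    # is imported lazily so the pure helpers above carry no hard dependency.
    import websockets
    attempt = 0
    while True:
        try:
            async with websockets.connect(WS_URL) as ws:
                attempt = 0  # reset backoff after a successful connection
                for stream in ("l2Book", "trades"):
                    await ws.send(subscription_message(stream, "ETH"))
                async for raw in ws:
                    try:
                        publish(json.loads(raw))
                    except json.JSONDecodeError:
                        continue  # skip malformed frames, keep the stream alive
        except (OSError, websockets.exceptions.WebSocketException):
            attempt += 1
            await asyncio.sleep(backoff_delay(attempt))
```

Resetting the attempt counter only after a successful connect keeps the backoff window short for transient drops while still throttling during a sustained outage.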

EVM Data Scraper / Listener

  • Purpose: Interacts with the Hyperliquid EVM layer via RPC endpoints to gather data about DeFi activities, particularly DEX liquidity pools.

  • Functionality:

    • Connects to a reliable Hyperliquid EVM RPC endpoint ([Placeholder: Specify source - e.g., public node, paid provider like Alchemy/Infura, self-hosted]).

    • Periodically queries relevant smart contracts for state information (e.g., pool reserves, total supply of LP tokens) using eth_call.

    • Subscribes to specific contract events (e.g., Swap, Mint, Burn for target DEXs/LPs) using eth_subscribe or fetches logs via eth_getLogs.

    • Requires ABIs (Application Binary Interfaces) for the contracts being monitored.

    • Parses event data and contract query results.

    • Handles RPC errors and node connectivity issues.

    • Publishes relevant EVM data updates (e.g., swaps, liquidity changes, calculated APYs) to designated Redis channels.

  • Technology: [Placeholder: e.g., Python with web3.py, Node.js with ethers.js or web3.js]
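
A minimal sketch of the polling path, assuming Python with `web3.py` (one of the placeholder options). The `Web3` instance is injected so the RPC endpoint stays unspecified, and the `getReserves` accessor with constant-product pricing is a Uniswap-V2-style assumption — the actual DEX contracts on the Hyperliquid EVM (and their ABIs) may differ:

```python
from decimal import Decimal

def pool_price(reserve0: int, reserve1: int,
               decimals0: int, decimals1: int) -> Decimal:
    # Spot price of token0 quoted in token1 for a constant-product (x*y=k)
    # pool, normalising each raw reserve by its token's decimals.
    scaled0 = Decimal(reserve0) / Decimal(10) ** decimals0
    scaled1 = Decimal(reserve1) / Decimal(10) ** decimals1
    return scaled1 / scaled0

def poll_reserves(w3, pool_address: str, pool_abi: list) -> tuple[int, int]:
    # One eth_call round trip via web3.py; requires the pool's ABI.
    # getReserves() is the Uniswap-V2-style signature (reserve0, reserve1, ts).
    pool = w3.eth.contract(address=pool_address, abi=pool_abi)
    reserve0, reserve1, _last_ts = pool.functions.getReserves().call()
    return reserve0, reserve1

def fetch_pool_logs(w3, pool_address: str,
                    from_block: int, to_block: int) -> list:
    # eth_getLogs over a block range for the monitored pool; event decoding
    # (Swap/Mint/Burn) then happens against the same ABI.
    return w3.eth.get_logs({
        "address": pool_address,
        "fromBlock": from_block,
        "toBlock": to_block,
    })
```

Connecting is the usual `w3 = Web3(Web3.HTTPProvider(rpc_url))`, with `rpc_url` filled in once the endpoint placeholder above is decided.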

Message Queue Publisher

  • Purpose: Acts as the interface between the ingestion clients (CORE & EVM) and the Redis message queue.

  • Functionality:

    • Takes parsed data objects from the ingestion clients.

    • Connects to the Redis server.

    • Serializes data objects (e.g., to JSON strings).

    • Publishes serialized data to the appropriate Redis channel using the PUBLISH command.

    • Handles Redis connection errors.

  • Technology: [Placeholder: e.g., redis-py library for Python, ioredis or redis library for Node.js]
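
The publisher can be a thin wrapper over the serialize-then-PUBLISH steps above, sketched here with `redis-py` (one of the placeholder options); the channel name in the usage note is illustrative:

```python
import json

def serialize(payload: dict) -> str:
    # Compact, key-sorted JSON so identical payloads always serialize
    # to identical strings.
    return json.dumps(payload, separators=(",", ":"), sort_keys=True)

class RedisPublisher:
    def __init__(self, host: str = "localhost", port: int = 6379) -> None:
        import redis  # redis-py; imported lazily so serialize() stands alone
        self._redis = redis
        self._client = redis.Redis(host=host, port=port)

    def publish(self, channel: str, payload: dict) -> bool:
        # Issues a Redis PUBLISH; returns False instead of raising so an
        # ingestion loop can log the failure and keep consuming its stream.
        try:
            self._client.publish(channel, serialize(payload))
            return True
        except self._redis.RedisError:
            return False
```

Usage from either ingestion client is then a single call, e.g. `RedisPublisher().publish("core.trades", parsed_msg)`.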
