Professional Python client library for GMGN.ai WebSocket API
GmGnAPI provides real-time access to Solana blockchain data through GMGN.ai's WebSocket API with advanced features including intelligent filtering, data export capabilities, monitoring statistics, and automated alerting.
- Documentation - Complete guides and API reference
- Discord Community - Get help and discuss strategies
- Create GMGN Account - Sign up with our referral link to support the project
- Real-time WebSocket connection to GMGN.ai API
- Multiple data channels: New pools, token launches, pair updates, chain statistics, social info, wallet trades, limit orders
- Automatic reconnection with exponential backoff
- Comprehensive error handling and logging
- Type-safe with full Pydantic model validation
- Async/await support for modern Python applications
- Intelligent Filtering: Advanced token filtering by market cap, liquidity, volume, holder count, exchange, and risk scores
- Data Export: Export to JSON, CSV, or SQLite database with automatic file rotation and compression
- Monitoring & Statistics: Real-time connection metrics, message counts, unique token/pool tracking
- Alert System: Configurable alerts for market conditions with rate limiting and webhook support
- Rate Limiting: Configurable message processing limits to avoid overwhelming downstream consumers
- Queue Management: Buffered message processing with configurable queue sizes
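The reconnection behavior listed above (exponential backoff) can be sketched as follows. This is an illustrative, library-agnostic sketch; the `backoff_delay` and `connect_with_retry` helpers are not part of the gmgnapi API:

```python
import asyncio
import random


def backoff_delay(attempt: int, base: float = 1.0, cap: float = 60.0) -> float:
    # 1s, 2s, 4s, ... capped at `cap`, with up to 10% jitter
    # so many clients do not reconnect in lockstep.
    delay = min(base * 2 ** attempt, cap)
    return delay + random.uniform(0, delay * 0.1)


async def connect_with_retry(connect, max_attempts: int = 8, base: float = 1.0):
    # `connect` is any awaitable factory, e.g. client.connect.
    for attempt in range(max_attempts):
        try:
            return await connect()
        except OSError:
            await asyncio.sleep(backoff_delay(attempt, base=base))
    raise ConnectionError(f"gave up after {max_attempts} attempts")
```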
```bash
pip install gmgnapi
```
```bash
git clone https://github.com/yourusername/gmgnapi.git
cd gmgnapi
pip install -e .
```
Create your GMGN account using our referral link to support the project: Create GMGN Account
- Log in to your GMGN account
- Navigate to Account Settings
- Generate an API token
- Copy and securely store your token
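One way to keep the stored token out of source code is to load it from an environment variable at startup. The helper below is illustrative, and the `access_token` constructor parameter shown in the comment is an assumption:

```python
import os


def load_gmgn_token(var: str = "GMGN_API_TOKEN") -> str:
    # Read the API token from the environment rather than hard-coding it.
    token = os.environ.get(var)
    if not token:
        raise RuntimeError(f"Set {var} before starting the client")
    return token


# Hypothetical usage -- the constructor parameter name is assumed:
# client = GmGnClient(access_token=load_gmgn_token())
```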
Join our Discord server for support and updates: Discord Community
```python
import asyncio

from gmgnapi import GmGnClient


async def on_new_pool(pool_info):
    if pool_info.pools:
        pool = pool_info.pools[0]
        token_info = pool.bti
        if token_info:
            print(f"New pool: {token_info.s} ({token_info.n})")


async def main():
    client = GmGnClient()
    client.on_new_pool(on_new_pool)

    await client.connect()
    await client.subscribe_new_pools()

    # Keep running
    while True:
        await asyncio.sleep(1)


asyncio.run(main())
```
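The `while True: sleep` loop works, but a stop event makes the stream shut down cleanly on Ctrl-C. The wrapper below is one possible sketch and is not part of the gmgnapi API:

```python
import asyncio
import signal
from typing import Optional


async def run_until_interrupt(setup, stop: Optional[asyncio.Event] = None):
    # Run the connect/subscribe steps, then block until the stop event
    # is set (e.g. by SIGINT) instead of spinning in a sleep loop.
    stop = stop or asyncio.Event()
    loop = asyncio.get_running_loop()
    for sig in (signal.SIGINT, signal.SIGTERM):
        try:
            loop.add_signal_handler(sig, stop.set)
        except NotImplementedError:
            pass  # e.g. the Windows Proactor loop has no signal handlers
    await setup()
    await stop.wait()
```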
```python
import asyncio
from decimal import Decimal

from gmgnapi import (
    GmGnEnhancedClient,
    TokenFilter,
    DataExportConfig,
    AlertConfig,
)


async def main():
    # Configure advanced token filtering
    token_filter = TokenFilter(
        min_market_cap=Decimal("50000"),   # $50k minimum market cap
        min_liquidity=Decimal("10000"),    # $10k minimum liquidity
        min_volume_24h=Decimal("5000"),    # $5k minimum daily volume
        min_holder_count=10,               # 10+ holders
        exchanges=["raydium", "orca"],     # Specific exchanges only
        exclude_symbols=["SCAM", "TEST"],  # Exclude potential scams
        max_risk_score=0.7,                # Maximum risk threshold
    )

    # Configure data export
    export_config = DataExportConfig(
        enabled=True,
        format="json",              # "json", "csv", or "database"
        file_path="./gmgn_data",    # Export directory
        max_file_size_mb=50,        # File rotation at 50MB
        rotation_interval_hours=6,  # Rotate every 6 hours
        compress=True,              # Enable compression
    )

    # Configure alerts
    alert_config = AlertConfig(
        enabled=True,
        webhook_url="https://hooks.slack.com/...",  # Optional webhook
        conditions=[
            {
                "type": "high_value_pool",
                "min_market_cap": 100000,
                "description": "Alert for pools > $100k",
            }
        ],
        rate_limit_seconds=300,  # Max 1 alert per 5 minutes
    )

    # Initialize enhanced client
    client = GmGnEnhancedClient(
        token_filter=token_filter,
        export_config=export_config,
        alert_config=alert_config,
        rate_limit=100,  # Max 100 messages/second
    )

    # Event handlers
    async def on_new_pool(pool_info):
        if pool_info.pools:
            pool = pool_info.pools[0]
            token_info = pool.bti
            if token_info:
                print(f"Filtered pool: {token_info.s} - ${token_info.mc:,}")

    async def on_volume_spike(pair_data):
        if pair_data.volume_24h_usd > Decimal("500000"):
            print(f"Volume spike: ${pair_data.volume_24h_usd:,}")

    client.on_new_pool(on_new_pool)
    client.on_pair_update(on_volume_spike)

    # Connect and subscribe
    await client.connect()
    await client.subscribe_all_channels()

    # Monitor and report statistics
    while True:
        await asyncio.sleep(60)
        stats = client.get_monitoring_stats()
        print(
            f"Stats: {stats.total_messages:,} messages, "
            f"{stats.unique_tokens_seen} tokens, "
            f"{stats.unique_pools_seen} pools"
        )


asyncio.run(main())
```
- `new_pools`: New liquidity pool creation events
- `pair_update`: Trading pair price and volume updates
- `token_launch`: New token launch notifications
- `chain_stats`: Blockchain statistics and metrics
- `token_social`: Token social media and community information
- `wallet_trades`: Wallet trading activity and transactions
- `limit_orders`: Limit order updates and fills
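When handling several of these channels at once, incoming messages can be fanned out with a small dispatch table. The router below is an illustrative sketch, not part of the gmgnapi API (the client's `on_*` hooks cover the common cases):

```python
import asyncio
from typing import Awaitable, Callable, Dict, List

Handler = Callable[[dict], Awaitable[None]]


class ChannelRouter:
    # Map channel names (e.g. "new_pools", "pair_update") to async handlers.
    def __init__(self) -> None:
        self._handlers: Dict[str, List[Handler]] = {}

    def on(self, channel: str, handler: Handler) -> None:
        self._handlers.setdefault(channel, []).append(handler)

    async def dispatch(self, message: dict) -> None:
        # Fan a raw message out to every handler registered for its channel.
        for handler in self._handlers.get(message.get("channel", ""), []):
            await handler(message.get("data", {}))
```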
Filter tokens based on various criteria:
```python
TokenFilter(
    min_market_cap=Decimal("10000"),    # Minimum market cap in USD
    max_market_cap=Decimal("1000000"),  # Maximum market cap in USD
    min_liquidity=Decimal("5000"),      # Minimum liquidity in USD
    min_volume_24h=Decimal("1000"),     # Minimum 24h volume in USD
    min_holder_count=10,                # Minimum number of holders
    exchanges=["raydium", "orca"],      # Allowed exchanges
    symbols=["SOL", "USDC"],            # Specific symbols to include
    exclude_symbols=["SCAM", "TEST"],   # Symbols to exclude
    max_risk_score=0.5,                 # Maximum risk score (0-1)
)
```
Configure data export and storage:
```python
DataExportConfig(
    enabled=True,                # Enable/disable export
    format="json",               # "json", "csv", or "database"
    file_path="./exports",       # Export directory
    max_file_size_mb=100,        # File size limit for rotation
    rotation_interval_hours=24,  # Time-based rotation
    compress=True,               # Enable compression
    include_metadata=True,       # Include extra metadata
)
```
Set up alerts and notifications:
```python
AlertConfig(
    enabled=True,                               # Enable/disable alerts
    webhook_url="https://hooks.slack.com/...",  # Webhook URL for notifications
    email="[email protected]",                  # Email for alerts
    conditions=[                                # Custom alert conditions
        {
            "type": "new_pool",
            "min_liquidity": 100000,
            "description": "High liquidity pool alert",
        }
    ],
    rate_limit_seconds=300,  # Minimum time between alerts
)
```
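The `rate_limit_seconds` setting amounts to suppressing any alert that fires too soon after the previous one. A minimal sketch of that logic (the `AlertRateLimiter` class is illustrative, not part of the library):

```python
import time
from typing import Optional


class AlertRateLimiter:
    # Allow at most one alert per `min_interval` seconds,
    # mirroring the rate_limit_seconds setting above.
    def __init__(self, min_interval: float = 300.0) -> None:
        self.min_interval = min_interval
        self._last_fired: Optional[float] = None

    def allow(self, now: Optional[float] = None) -> bool:
        now = time.monotonic() if now is None else now
        if self._last_fired is None or now - self._last_fired >= self.min_interval:
            self._last_fired = now
            return True
        return False
```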
Get real-time monitoring statistics:
```python
stats = client.get_monitoring_stats()
print(f"Total messages: {stats.total_messages:,}")
print(f"Messages per minute: {stats.messages_per_minute:.1f}")
print(f"Unique tokens seen: {stats.unique_tokens_seen}")
print(f"Unique pools seen: {stats.unique_pools_seen}")
print(f"Connection uptime: {stats.connection_uptime:.0f}s")
print(f"Error count: {stats.error_count}")
```
```json
{
  "timestamp": "2024-01-15T10:30:00",
  "channel": "new_pools",
  "data": {
    "c": "sol",
    "p": [{
      "a": "pool_address_here",
      "ba": "base_token_address",
      "qa": "quote_token_address",
      "bti": {
        "s": "TOKEN",
        "n": "Token Name",
        "mc": 150000
      }
    }]
  }
}
```
```csv
timestamp,channel,data
2024-01-15T10:30:00,new_pools,"{""c"":""sol"",""p"":[...]}"
```
Tables: `messages`, `new_pools`, and `trades` with structured data storage.
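An exported database can be queried back with the standard `sqlite3` module. The schema below (a `messages` table with `timestamp`, `channel`, and JSON `data` columns, mirroring the CSV layout) is an assumption about the exporter; the sketch builds an in-memory copy so it is self-contained:

```python
import json
import sqlite3


def pools_from_export(conn: sqlite3.Connection) -> list:
    # Pull every pool out of the stored new_pools messages.
    rows = conn.execute(
        "SELECT data FROM messages WHERE channel = 'new_pools'"
    ).fetchall()
    return [pool for (data,) in rows for pool in json.loads(data)["p"]]


# Self-contained demo with an in-memory database shaped like the
# assumed export schema:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE messages (timestamp TEXT, channel TEXT, data TEXT)")
sample = {"c": "sol", "p": [{"a": "pool_address_here", "bti": {"s": "TOKEN", "mc": 150000}}]}
conn.execute(
    "INSERT INTO messages VALUES (?, ?, ?)",
    ("2024-01-15T10:30:00", "new_pools", json.dumps(sample)),
)
print(pools_from_export(conn)[0]["bti"]["s"])  # TOKEN
```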
GmGnAPI provides comprehensive error handling:
```python
from gmgnapi import (
    GmGnAPIError,
    ConnectionError,
    AuthenticationError,
    SubscriptionError,
    MessageParsingError,
)

try:
    await client.connect()
    await client.subscribe_wallet_trades()
except AuthenticationError:
    print("Invalid access token")
except ConnectionError:
    print("Failed to connect to WebSocket")
except SubscriptionError as e:
    print(f"Subscription failed: {e}")
except GmGnAPIError as e:
    print(f"API error: {e}")
```
The `examples/` directory contains comprehensive examples:

- `basic_usage.py`: Simple connection and data streaming
- `advanced_monitoring.py`: Full-featured monitoring with statistics
- `data_export.py`: Data export in multiple formats
- `filtering_alerts.py`: Advanced filtering and alerting
- `multiple_channels.py`: Subscribe to multiple data channels
Run the test suite:
```bash
# Install test dependencies
pip install -e ".[dev]"

# Run tests
pytest

# Run with coverage
pytest --cov=gmgnapi

# Run specific test file
pytest tests/test_client.py
```
```bash
# Format code
black src/ tests/ examples/

# Sort imports
isort src/ tests/ examples/

# Type checking
mypy src/

# Linting
flake8 src/ tests/
```
```bash
# Install docs dependencies
pip install -e ".[docs]"

# Build docs
cd docs/
make html
```
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Make your changes
- Add tests for new functionality
- Ensure all tests pass (`pytest`)
- Format your code (`black`, `isort`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- GMGN.ai for providing the WebSocket API
- Pydantic for data validation
- websockets for WebSocket client implementation
- Documentation: https://gmgnapi.readthedocs.io/
- PyPI Package: https://pypi.org/project/gmgnapi/
- GitHub Issues: https://github.com/yourusername/gmgnapi/issues
- GMGN.ai: https://gmgn.ai/
Built for speed, designed for reliability, crafted for the Solana ecosystem.