Apache Superset
Connect Parseable to Apache Superset for powerful data exploration, visualization, and dashboarding.
Overview
Integrating Parseable with Apache Superset enables:
- Data Exploration - Explore log data with an intuitive interface
- Rich Visualizations - Create charts, graphs, and maps
- Interactive Dashboards - Build real-time monitoring dashboards
- SQL Lab - Run ad-hoc queries on your log data
Prerequisites
- Apache Superset instance
- Parseable instance with data
- Python 3.11.6 or higher
Installation
Install Parseable SQLAlchemy Driver
Install the Parseable connector for Apache Superset:
# Create and activate a virtual environment (recommended)
python -m venv superset-env
source superset-env/bin/activate # On Windows: superset-env\Scripts\activate
# Install Apache Superset
pip install apache-superset
# Install Parseable SQLAlchemy driver
pip install sqlalchemy-parseable
# Initialize Superset
superset db upgrade
superset fab create-admin
superset init
# Run Superset
superset run -p 8088 --with-threads --reload --debugger
Connection Setup
Add Database Connection
- Log in to Apache Superset at http://localhost:8088
- Go to Data → Databases → + Database
- Select Other as the database type
- Use the following SQLAlchemy URI format:
parseable://username:password@host:port/dataset_name
Example Connection String
parseable://admin:admin@demo.parseable.com:443/ingress-nginx
For local Parseable instance:
parseable://admin:admin@localhost:8000/application-logs
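Optionally, sanity-check the URI with SQLAlchemy before adding it in Superset. This is a minimal sketch assuming the local instance above and an existing application-logs stream; adjust the credentials, host, and stream name to your setup:
# Minimal connection check using the sqlalchemy-parseable driver.
# The URI below matches the local example above; replace it with your own.
from sqlalchemy import create_engine, text

engine = create_engine("parseable://admin:admin@localhost:8000/application-logs")

with engine.connect() as conn:
    # Any simple query confirms the host, port, credentials, and stream name.
    result = conn.execute(text('SELECT COUNT(*) AS events FROM "application-logs"'))
    print(result.fetchone())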
Creating Visualizations
Add Dataset
- Go to Data → Datasets
- Click + Dataset
- Select your Parseable database
- Choose the Parseable dataset (log stream) to use as the table
- Click Add
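If you prefer to script this step, Superset's REST API can register the dataset as well. The sketch below uses the standard /api/v1/security/login and /api/v1/dataset/ endpoints; the credentials, database ID, and table name are placeholders, and some Superset configurations also require a CSRF token for write requests:
# Sketch: register a Parseable dataset through Superset's REST API.
# Placeholders: Superset at localhost:8088, admin/admin credentials,
# database connection ID 1, table (stream) name "application-logs".
import requests

BASE = "http://localhost:8088"

# Log in to obtain a JWT access token.
login = requests.post(
    f"{BASE}/api/v1/security/login",
    json={"username": "admin", "password": "admin", "provider": "db", "refresh": True},
)
token = login.json()["access_token"]

# Create the dataset against the Parseable database connection.
resp = requests.post(
    f"{BASE}/api/v1/dataset/",
    headers={"Authorization": f"Bearer {token}"},
    json={"database": 1, "table_name": "application-logs"},
)
print(resp.status_code, resp.json())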
Create Chart
- Go to Charts → + Chart
- Select your dataset
- Choose a visualization type:
  - Time-series for log trends
  - Bar Chart for categorical data
  - Table for detailed views
  - Big Number for KPIs
Example: Error Rate Over Time
- Create a new chart with Time-series visualization
- Configure:
  - Time Column: p_timestamp
  - Metric: COUNT(*)
  - Filter: level = 'error'
  - Time Grain: hour
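For reference, this chart boils down to an hourly count of error-level events. The sketch below runs a roughly equivalent query through the same SQLAlchemy engine; the level and p_timestamp columns follow the examples in this guide, while DATE_TRUNC support is assumed:
# Rough SQL equivalent of the "Error Rate Over Time" chart configuration.
from sqlalchemy import create_engine, text

engine = create_engine("parseable://admin:admin@localhost:8000/application-logs")

query = text("""
    SELECT DATE_TRUNC('hour', p_timestamp) AS hour_bucket,
           COUNT(*) AS error_count
    FROM "application-logs"
    WHERE level = 'error'
      AND p_timestamp > NOW() - INTERVAL '24 hours'
    GROUP BY DATE_TRUNC('hour', p_timestamp)
    ORDER BY hour_bucket
""")

with engine.connect() as conn:
    for row in conn.execute(query):
        print(row.hour_bucket, row.error_count)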
Building Dashboards
Create Dashboard
- Go to Dashboards → + Dashboard
- Add charts by dragging from the chart list
- Arrange and resize as needed
- Add filters for interactivity
- Save and publish
Dashboard Filters
Add cross-filtering to your dashboard:
- Edit dashboard
- Click Filter icon
- Add filter components:
  - Time range filter
  - Stream selector
  - Log level filter
SQL Lab Queries
Use SQL Lab for ad-hoc analysis:
-- Error count by service
SELECT
service,
COUNT(*) as error_count
FROM "application-logs"
WHERE level = 'error'
AND p_timestamp > NOW() - INTERVAL '24 hours'
GROUP BY service
ORDER BY error_count DESC
LIMIT 10;
-- Response time percentiles
SELECT
percentile_cont(0.50) WITHIN GROUP (ORDER BY response_time) as p50,
percentile_cont(0.95) WITHIN GROUP (ORDER BY response_time) as p95,
percentile_cont(0.99) WITHIN GROUP (ORDER BY response_time) as p99
FROM "api-logs"
WHERE p_timestamp > NOW() - INTERVAL '1 hour';
Best Practices
- Use Caching - Enable query caching for better performance (see the superset_config.py sketch after this list)
- Optimize Queries - Use time filters to limit data scanned
- Create Virtual Datasets - Pre-aggregate data for complex dashboards
- Set Refresh Intervals - Configure appropriate auto-refresh rates
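For the caching point above, here is a minimal superset_config.py sketch, assuming a local Redis at redis://localhost:6379 (Superset loads this file via PYTHONPATH or the SUPERSET_CONFIG_PATH environment variable):
# superset_config.py - sketch of query-result caching with Redis.
# The Redis URLs are assumptions; any Flask-Caching backend works.
CACHE_CONFIG = {
    "CACHE_TYPE": "RedisCache",
    "CACHE_DEFAULT_TIMEOUT": 300,  # seconds
    "CACHE_KEY_PREFIX": "superset_metadata_",
    "CACHE_REDIS_URL": "redis://localhost:6379/0",
}

# Cache for chart data, which is what dashboards actually hit.
DATA_CACHE_CONFIG = {
    "CACHE_TYPE": "RedisCache",
    "CACHE_DEFAULT_TIMEOUT": 300,
    "CACHE_KEY_PREFIX": "superset_data_",
    "CACHE_REDIS_URL": "redis://localhost:6379/1",
}
Dashboard auto-refresh intervals are set per dashboard from the dashboard's edit menu rather than in this file.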
Troubleshooting
Connection Issues
- Verify Parseable is accessible from Superset (see the reachability check after this list)
- Check credentials are correct
- Ensure the Parseable HTTP port (8000 by default, or 443 behind TLS) is exposed and reachable
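A quick way to check reachability from the machine running Superset; this sketch assumes Parseable's liveness endpoint (/api/v1/liveness) and stream-listing endpoint (/api/v1/logstream), so adjust the URL and credentials to your deployment:
# Sketch: confirm the Parseable server is reachable from the Superset host.
import requests

PARSEABLE_URL = "http://localhost:8000"  # or https://your-host:443

# Unauthenticated health check (assumed endpoint: /api/v1/liveness).
print("liveness:", requests.get(f"{PARSEABLE_URL}/api/v1/liveness", timeout=5).status_code)

# Authenticated call to confirm the credentials Superset will use
# (assumed endpoint: /api/v1/logstream lists available streams).
print("streams:", requests.get(f"{PARSEABLE_URL}/api/v1/logstream",
                               auth=("admin", "admin"), timeout=5).status_code)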
Slow Queries
- Add time range filters
- Use LIMIT clauses
- Enable query caching
Next Steps
- Create dashboards in Parseable
- Set up alerts for anomalies
- Explore Grafana integration