

Drizzle-Cube Hono Example with React Dashboard


A complete full-stack analytics application with a Hono backend and a React frontend, built with drizzle-cube. It demonstrates how to create a production-ready semantic layer with type-safe analytics queries and interactive dashboards.

GitHub Repository

  • 🚀 Hono web framework - Fast, lightweight, and built on Web Standards
  • ⚛️ React dashboard - Interactive analytics dashboards with chart editing
  • 🗃️ Drizzle ORM integration - Type-safe database operations with PostgreSQL
  • 📊 Cube.js compatibility - Drop-in replacement for existing Cube.js frontends
  • 🔒 Multi-tenant security - Organization-based data isolation
  • 📈 Real-time analytics - Employee and department analytics with joins
  • 💾 Persistent dashboards - Save and load dashboard configurations
  • 🎯 Type safety - Full TypeScript support from database to frontend
  • ☁️ Cloudflare Workers - Deploy to edge locations globally with Wrangler
  • 🌐 Neon Integration - Auto-detects Neon URLs for serverless PostgreSQL
  • 🤖 AI Assistant - Natural language query generation with Google Gemini

Option A: Using Docker Compose (Recommended)

# Start PostgreSQL with Docker Compose
npm run docker:up
# View logs (optional)
npm run docker:logs

This starts:

  • PostgreSQL on port 54921 (high random port to avoid conflicts)
  • pgAdmin on port 5050 for database administration
Alternatively, start PostgreSQL manually with Docker:
# Start PostgreSQL manually
docker run --name drizzle-cube-postgres \
  -e POSTGRES_DB=drizzle_cube_db \
  -e POSTGRES_USER=drizzle_user \
  -e POSTGRES_PASSWORD=drizzle_pass123 \
  -p 54921:5432 \
  -d postgres:15-alpine

Or, to use an existing PostgreSQL instance, update the DATABASE_URL in your .env file to point to it.

Next, install the dependencies:
# Install server dependencies
npm install
# Install client dependencies
npm run install:client
Configure the environment:
# Copy environment template
cp .env.example .env
# The default settings work with Docker Compose
# Edit .env only if using a different database setup
Run the complete database setup in one step:
# Starts Docker, runs migrations, and seeds data
npm run setup
Or run the steps individually:
# Generate migrations from schema
npm run db:generate
# Run migrations to create tables
npm run db:migrate
# Seed with sample data
npm run db:seed
Start the development servers:
# Start both backend and frontend in watch mode
npm run dev:full
# Or start them separately:
# npm run dev:server # Backend on http://localhost:3001
# npm run dev:client # Frontend on http://localhost:3000

The server exposes these API endpoints:

  • GET /cubejs-api/v1/meta - Get available cubes and schema
  • POST /cubejs-api/v1/load - Execute analytics queries
  • GET /cubejs-api/v1/load?query=… - Execute queries via URL
  • POST /cubejs-api/v1/sql - Generate SQL without execution
  • GET /api/analytics-pages - List all dashboards
  • GET /api/analytics-pages/:id - Get specific dashboard
  • POST /api/analytics-pages - Create new dashboard
  • PUT /api/analytics-pages/:id - Update dashboard
  • DELETE /api/analytics-pages/:id - Delete dashboard
  • POST /api/analytics-pages/create-example - Create example dashboard
  • GET /api/docs - API documentation with examples
  • GET /health - Health check endpoint

Example query: count employees by department:

curl -X POST http://localhost:3001/cubejs-api/v1/load \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-token" \
  -d '{
    "measures": ["Employees.count"],
    "dimensions": ["Departments.name"],
    "cubes": ["Employees", "Departments"]
  }'

Average and total salary by department:

curl -X POST http://localhost:3001/cubejs-api/v1/load \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-token" \
  -d '{
    "measures": ["Employees.avgSalary", "Employees.totalSalary"],
    "dimensions": ["Departments.name"],
    "cubes": ["Employees", "Departments"]
  }'

Active employees by department, using a filter:

curl -X POST http://localhost:3001/cubejs-api/v1/load \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-token" \
  -d '{
    "measures": ["Employees.activeCount"],
    "dimensions": ["Departments.name"],
    "cubes": ["Employees", "Departments"],
    "filters": [{
      "member": "Employees.isActive",
      "operator": "equals",
      "values": [true]
    }]
  }'

Project structure:

examples/hono/
├── client/                  # React dashboard frontend
│   ├── src/
│   │   ├── components/      # React components
│   │   ├── pages/           # Dashboard pages
│   │   ├── hooks/           # Custom React hooks
│   │   └── types/           # TypeScript types
│   ├── package.json         # Frontend dependencies
│   └── vite.config.ts       # Vite configuration
├── src/
│   ├── index.ts             # Server entry point
│   └── analytics-routes.ts  # Dashboard API routes
├── scripts/
│   ├── migrate.ts           # Database migration runner
│   └── seed.ts              # Sample data seeder
├── app.ts                   # Main Hono application
├── schema.ts                # Drizzle database schema
├── cubes.ts                 # Analytics cube definitions
├── drizzle.config.ts        # Drizzle configuration
├── package.json             # Dependencies and scripts
└── README.md                # This file

schema.ts defines the database schema using Drizzle ORM (a rough sketch follows the list below):

  • employees table with salary, department, and organization
  • departments table with budget information
  • analyticsPages table for storing dashboard configurations
  • Proper relations for type inference
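
The actual definitions live in schema.ts; as a rough, non-authoritative sketch of the shape (column names here are assumptions inferred from the description above), it could look something like this:

// Hypothetical sketch only; see schema.ts for the real tables and columns.
import { relations } from 'drizzle-orm'
import { pgTable, serial, text, integer, boolean, numeric, jsonb } from 'drizzle-orm/pg-core'

export const departments = pgTable('departments', {
  id: serial('id').primaryKey(),
  name: text('name').notNull(),
  budget: numeric('budget'),                            // budget information
  organisationId: integer('organisation_id').notNull()  // multi-tenant key
})

export const employees = pgTable('employees', {
  id: serial('id').primaryKey(),
  name: text('name').notNull(),
  salary: numeric('salary'),
  isActive: boolean('is_active').default(true),
  departmentId: integer('department_id').references(() => departments.id),
  organisationId: integer('organisation_id').notNull()
})

export const analyticsPages = pgTable('analytics_pages', {
  id: serial('id').primaryKey(),
  name: text('name').notNull(),
  config: jsonb('config').notNull(),                    // dashboard portlets and layout
  organisationId: integer('organisation_id').notNull()
})

// Relations give Drizzle the type information needed for joins
export const employeesRelations = relations(employees, ({ one }) => ({
  department: one(departments, {
    fields: [employees.departmentId],
    references: [departments.id]
  })
}))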

cubes.ts defines the analytics cubes with type safety:

  • employeesCube - Employee analytics with department joins
  • departmentsCube - Department-level budget analytics
  • Security context integration for multi-tenant isolation

app.ts is the main Hono application (a simplified sketch follows the list below), providing:

  • Drizzle-cube integration
  • Security context extraction
  • CORS configuration
  • Error handling
  • API documentation endpoint
  • Dashboard management API routes
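
A rough sketch of that wiring, with the drizzle-cube route registration omitted because its exact adapter call lives in app.ts, might look like:

import { Hono } from 'hono'
import { cors } from 'hono/cors'

const app = new Hono()

// CORS so the React client on :3000 can call the API on :3001
app.use('*', cors({ origin: 'http://localhost:3000' }))

// Centralised error handling
app.onError((err, c) => {
  console.error(err)
  return c.json({ error: 'Internal server error' }, 500)
})

// Health check endpoint
app.get('/health', (c) => c.json({ status: 'ok' }))

// The cube endpoints (/cubejs-api/v1/*) and the dashboard routes in
// src/analytics-routes.ts are registered here in the real app.ts,
// using the security context described below.

export default app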

The client/ directory contains the React dashboard frontend, with:

  • Interactive analytics dashboards
  • Chart editing and configuration
  • Dashboard CRUD operations
  • Real-time data visualization using drizzle-cube components

This example implements organization-based multi-tenancy (a query-level sketch follows the list below):

  1. Security Context: Extracted from Authorization header
  2. Data Isolation: All queries filtered by organisationId
  3. SQL Injection Protection: Drizzle ORM parameterized queries
  4. Type Safety: Full TypeScript validation
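
As an illustration of points 2 and 3 (not the actual cube code), a Drizzle query scoped to the security context could look like this, assuming the column names from the schema description above:

import { and, eq } from 'drizzle-orm'
import type { NodePgDatabase } from 'drizzle-orm/node-postgres'
import { employees } from './schema'

// The organisationId from the security context becomes a bound parameter,
// so tenant isolation and SQL-injection protection come from the same mechanism.
async function activeEmployeeRows(db: NodePgDatabase, organisationId: number) {
  return db
    .select()
    .from(employees)
    .where(and(
      eq(employees.organisationId, organisationId), // organisation-based isolation
      eq(employees.isActive, true)
    ))
}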

To add new cubes:

  1. Define new tables in schema.ts
  2. Create cube definitions in cubes.ts
  3. Register cubes in app.ts
  4. Run migrations: npm run db:generate && npm run db:migrate

Modify the getSecurityContext function in app.ts to integrate with your authentication system:

async function getSecurityContext(c: any): Promise<SecurityContext> {
  // Your auth logic here
  const user = await validateJWT(c.req.header('Authorization'))
  return {
    organisationId: user.orgId,
    userId: user.id,
    roles: user.roles // Add custom fields
  }
}
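
For example, if the token is a JWT signed with JWT_SECRET, one possible validateJWT using Hono's built-in JWT helper could look like the sketch below; the payload claim names (orgId, sub, roles) are assumptions that must match whatever your auth provider puts in the token:

import { verify } from 'hono/jwt'

async function validateJWT(authorizationHeader?: string) {
  if (!authorizationHeader?.startsWith('Bearer ')) {
    throw new Error('Missing bearer token')
  }
  const token = authorizationHeader.slice('Bearer '.length)
  const payload = await verify(token, process.env.JWT_SECRET!)
  return {
    orgId: payload.orgId as number,        // assumed claim names
    id: payload.sub as string,
    roles: (payload.roles as string[]) ?? []
  }
}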

Environment variables:

  • DATABASE_URL - PostgreSQL connection string
  • PORT - Server port (default: 3001)
  • NODE_ENV - Environment (development/production)
  • JWT_SECRET - JWT signing secret (if using JWT auth)

There are three ways to create dashboards:

  1. Via React UI: Visit http://localhost:3000/dashboards and click “New Dashboard”
  2. Via API: POST to /api/analytics-pages with a dashboard configuration (see the example request below)
  3. Example Dashboard: Click “Create Example” to generate a sample dashboard

Dashboards are stored as JSON configurations with:

{
  "portlets": [
    {
      "id": "unique-id",
      "title": "Chart Title",
      "query": "{\"measures\":[\"Employees.count\"]}",
      "chartType": "pie",
      "chartConfig": { "x": "dimension", "y": ["measure"] },
      "w": 6, "h": 6, "x": 0, "y": 0
    }
  ]
}
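
As a purely hypothetical sketch of option 2 above, a dashboard could be created with fetch like this; the name field and the exact body shape accepted by POST /api/analytics-pages are assumptions, so check src/analytics-routes.ts for the real contract:

// Hypothetical request body: only the portlet shape above is documented,
// the "name" field and overall envelope are assumptions.
const response = await fetch('http://localhost:3001/api/analytics-pages', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: 'Bearer your-token'
  },
  body: JSON.stringify({
    name: 'Employee Overview',
    portlets: [{
      id: 'employees-by-department',
      title: 'Employees by Department',
      query: JSON.stringify({ measures: ['Employees.count'], dimensions: ['Departments.name'] }),
      chartType: 'pie',
      chartConfig: { x: 'Departments.name', y: ['Employees.count'] },
      w: 6, h: 6, x: 0, y: 0
    }]
  })
})
console.log(await response.json())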

Supported chart types:

  • pie - Pie chart
  • bar - Bar chart
  • line - Line chart
  • area - Area chart
  • table - Data table
  • treemap - Tree map

This API is compatible with Cube.js frontends:

Simply point your frontend to http://localhost:3001/cubejs-api/v1 as the API URL.
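
For example, with the standard @cubejs-client/core package, only the apiUrl needs to change; the bearer token is whatever your getSecurityContext implementation expects:

// Minimal sketch using the standard Cube.js JavaScript client.
import cubejs from '@cubejs-client/core'

const cubeApi = cubejs('your-token', {
  apiUrl: 'http://localhost:3001/cubejs-api/v1'
})

const resultSet = await cubeApi.load({
  measures: ['Employees.count'],
  dimensions: ['Departments.name']
})

console.log(resultSet.tablePivot())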

The Hono example includes an AI Assistant that can generate queries from natural language descriptions using Google Gemini.

  1. Get a Google Gemini API key from Google AI Studio

  2. Configure the API key (choose one method):

    Option A: Environment variable (recommended)

    export GEMINI_API_KEY="AIza..."
    npm run dev

    Option B: Pass in request headers

    # The client will store the API key and pass it in headers
    # No server configuration needed

To use the AI Assistant:

  1. Open the Query Builder - Navigate to the Query Builder tab
  2. Click the sparkles icon (✨) - Opens the AI Assistant modal
  3. Enter your API key - If not set via environment variable
  4. Describe your query - Use natural language like:
    • “Show me total employees by department”
    • “Revenue by month for the last year”
    • “Top performing products with sales count”
  5. Get the query - Gemini generates a Cube.js query JSON

The AI features are available via proxy endpoints to avoid CORS issues:

  • GET /api/ai/health - Check AI service status
  • POST /api/ai/generate - Generate content with Gemini

Example:

# Check if AI is configured
curl http://localhost:3001/api/ai/health
# Generate query
curl -X POST http://localhost:3001/api/ai/generate \
  -H "X-API-Key: AIza..." \
  -H "Content-Type: application/json" \
  -d '{
    "text": "Show me employees by department"
  }'

Security notes:

  • API keys are never stored on the client permanently
  • All AI API calls are proxied through the server
  • Rate limiting and API key validation on the server side
  • Client sends API key in request headers only
Useful Docker commands:
# Start services
npm run docker:up
# Stop services
npm run docker:down
# View logs
npm run docker:logs
# Reset everything (removes volumes and data)
npm run docker:reset
# Complete setup from scratch
npm run setup

Connection details:

  • PostgreSQL: localhost:54921

    • Database: drizzle_cube_db
    • User: drizzle_user
    • Password: drizzle_pass123
  • pgAdmin: localhost:5050

    • Email: admin@drizzlecube.local
    • Password: admin123

From your host machine:

psql -h localhost -p 54921 -U drizzle_user -d drizzle_cube_db

From pgAdmin:

  1. Open http://localhost:5050
  2. Login with the credentials above
  3. Add server with:
    • Host: postgres (Docker network name)
    • Port: 5432 (internal port)
    • Database: drizzle_cube_db
    • Username: drizzle_user
    • Password: drizzle_pass123

For production deployment:

  1. Environment: Set NODE_ENV=production
  2. Database: Use managed PostgreSQL (AWS RDS, etc.)
  3. Security: Implement proper JWT validation
  4. Monitoring: Add logging and metrics
  5. Scaling: Use load balancers and connection pooling (see the sketch below)
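
A minimal connection-pooling sketch with pg and Drizzle, assuming node-postgres and a pool size you would tune for your workload:

import { Pool } from 'pg'
import { drizzle } from 'drizzle-orm/node-postgres'

// Bound the number of connections each instance opens against PostgreSQL
const pool = new Pool({
  connectionString: process.env.DATABASE_URL,
  max: 10,                      // assumed cap, tune per instance
  idleTimeoutMillis: 30_000     // recycle idle connections
})

export const db = drizzle(pool)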

Deploy to Cloudflare’s global edge network for automatic scaling and minimal cold starts.

  • Cloudflare account (free tier works)
  • Neon PostgreSQL database for production
To deploy:
# 1. Authenticate with Cloudflare
npm run cf:login
# 2. Set production database URL
npx wrangler secret put DATABASE_URL
# Paste your Neon connection string when prompted
# 3. Deploy to production
npm run deploy
For local development with Wrangler:
# Create local environment file
cp .dev.vars.example .dev.vars
# Edit .dev.vars with your DATABASE_URL
# Start local Wrangler development server
npm run dev:worker

Environment configuration:

  • Local: Use .dev.vars for development
  • Production: Use wrangler secret put for sensitive data
  • Staging: Deploy with npm run deploy:staging

Running on Cloudflare Workers gives you:

  • Global Edge: 200+ locations worldwide
  • Auto Scaling: Scales from 0 to millions of requests
  • Zero Cold Starts: V8 isolates start in <1ms
  • Neon Integration: Built-in database connection pooling
  • Cost Effective: Free tier includes 100,000 requests/day

📖 Complete Guide: See CLOUDFLARE.md for detailed setup instructions.