Deploy Solo-Spark Innovation Platform to Cloudflare's global network with your custom domain brainsait.io. This setup provides:

- Global CDN with edge computing
- KV storage for user data and sessions
- R2 object storage for files and exports
- D1 SQL database for structured data
- Cloudflare Workers for serverless APIs
- Enterprise-grade security and DDoS protection
- Real-time analytics and monitoring
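The resources listed above are wired into the application through bindings in wrangler.toml; a minimal sketch (binding names and IDs are illustrative placeholders, not values from this project):

```toml
# Illustrative wrangler.toml bindings for the resources above
[[kv_namespaces]]
binding = "USER_DATA"
id = "placeholder_kv_id"

[[r2_buckets]]
binding = "USER_FILES"
bucket_name = "solo-spark-user-files"

[[d1_databases]]
binding = "DB"
database_name = "solo-spark-db"
database_id = "placeholder_d1_id"

[[queues.producers]]
binding = "AI_QUEUE"
queue = "ai-processing-queue"
```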
```bash
# Install Wrangler CLI
npm install -g wrangler

# Login to Cloudflare
wrangler login

# Verify authentication
wrangler whoami
```

```bash
# Make script executable and run
chmod +x deploy-cloudflare.sh
./deploy-cloudflare.sh
```

- Add brainsait.io to your Cloudflare account
- Update nameservers to Cloudflare's
- Configure DNS records:

| Type | Name | Target |
|-------|------|--------|
| CNAME | @ | solo-spark.pages.dev |
| CNAME | www | solo-spark.pages.dev |
| CNAME | api | solo-spark.workers.dev |
| CNAME | cdn | solo-spark.pages.dev |

- Enable "Full (strict)" SSL mode
- Turn on "Always Use HTTPS"
- Enable HSTS with max-age 31536000
- Configure SSL/TLS certificates
```bash
# Create production KV namespaces
wrangler kv:namespace create "USER_DATA" --env production
wrangler kv:namespace create "SESSIONS" --env production
wrangler kv:namespace create "CACHE" --env production
wrangler kv:namespace create "ANALYTICS" --env production

# Create preview KV namespaces
wrangler kv:namespace create "USER_DATA" --env production --preview
wrangler kv:namespace create "SESSIONS" --env production --preview
wrangler kv:namespace create "CACHE" --env production --preview
wrangler kv:namespace create "ANALYTICS" --env production --preview
```

```bash
# Create R2 buckets for file storage
wrangler r2 bucket create solo-spark-user-files
wrangler r2 bucket create solo-spark-prototypes
wrangler r2 bucket create solo-spark-exports
wrangler r2 bucket create solo-spark-backups

# Create preview buckets
wrangler r2 bucket create solo-spark-user-files-preview
wrangler r2 bucket create solo-spark-prototypes-preview
wrangler r2 bucket create solo-spark-exports-preview
wrangler r2 bucket create solo-spark-backups-preview
```

```bash
# Create D1 database
wrangler d1 create solo-spark-db

# Initialize schema
wrangler d1 execute solo-spark-db --file=./schema.sql

# Verify database
wrangler d1 execute solo-spark-db --command="SELECT * FROM users LIMIT 1"
```

```bash
# Create queues for background processing
wrangler queues create ai-processing-queue
wrangler queues create email-queue
wrangler queues create analytics-queue
```

```bash
# Set secrets (sensitive data)
wrangler secret put ANTHROPIC_API_KEY
wrangler secret put GITHUB_TOKEN
wrangler secret put STRIPE_SECRET_KEY
wrangler secret put STC_MERCHANT_ID
wrangler secret put MADA_MERCHANT_ID
wrangler secret put CLOUDFLARE_API_TOKEN
```

Non-sensitive environment variables are not set through a wrangler command (wrangler has no `env set` subcommand); declare them in wrangler.toml instead:

```toml
# Environment variables (non-sensitive)
[vars]
ENVIRONMENT = "production"
DOMAIN = "brainsait.io"
DEFAULT_CURRENCY = "SAR"
DEFAULT_LOCALE = "ar-SA"
```

```bash
# Build the application
npm run build

# Deploy to Cloudflare Pages
wrangler pages deploy dist --project-name=solo-spark

# Configure custom domain
wrangler pages domain add brainsait.io --project-name=solo-spark
```

```bash
# Deploy authentication worker
wrangler deploy --env production

# Verify deployment
wrangler tail --env production
```

Update the resource IDs in wrangler.toml with your actual values:
```toml
# Replace with your actual KV namespace IDs
[[kv_namespaces]]
binding = "USER_DATA"
id = "your_actual_kv_id_here"
preview_id = "your_actual_preview_id_here"

# Replace with your actual D1 database ID
[[d1_databases]]
binding = "DB"
database_name = "solo-spark-db"
database_id = "your_actual_d1_id_here"
```

Copy `.env.example` to `.env` and configure:
```bash
# Cloudflare Configuration
CLOUDFLARE_API_TOKEN=your_cloudflare_api_token
CLOUDFLARE_ACCOUNT_ID=your_cloudflare_account_id
CLOUDFLARE_ZONE_ID=your_cloudflare_zone_id

# API Keys
VITE_ANTHROPIC_API_KEY=your_claude_api_key
VITE_GITHUB_TOKEN=your_github_token
VITE_STRIPE_PUBLISHABLE_KEY=your_stripe_key

# Cloudflare Resources
VITE_KV_USER_DATA_ID=your_kv_namespace_id
VITE_R2_USER_FILES_BUCKET=solo-spark-user-files
VITE_D1_DATABASE_ID=your_d1_database_id
```

Enable these features in your Cloudflare dashboard:
Speed:
- Auto Minify: CSS, JavaScript, HTML
- Brotli Compression
- Rocket Loader
- Polish (image optimization)
- Mirage (mobile optimization)
Caching:
- Browser Cache TTL: 4 hours
- Edge Cache TTL: 2 hours
- Development Mode: Off (production)
Network:
- HTTP/2: Enabled
- HTTP/3 (QUIC): Enabled
- 0-RTT Connection Resumption: Enabled
- WebSockets: Enabled
```javascript
// In your Worker: responses fetched from an upstream have immutable
// headers, so clone the response before setting cache directives for
// browsers (Cache-Control) and Cloudflare's edge (CDN-Cache-Control).
export default {
  async fetch(request, env, ctx) {
    const upstream = await handleRequest(request, env);
    const response = new Response(upstream.body, upstream);
    response.headers.set('Cache-Control', 'public, max-age=3600');
    response.headers.set('CDN-Cache-Control', 'public, max-age=7200');
    return response;
  }
};
```

Set up these security rules:
- Rate Limiting:
  - Expression: `(http.request.uri.path contains "/api/auth" and http.request.method eq "POST")`
  - Rate: 5 requests per minute per IP
- Bot Protection:
  - Expression: `(cf.bot_management.score lt 30)`
  - Action: Challenge (Captcha)
- Geo-blocking (optional):
  - Expression: `(ip.geoip.country ne "SA" and ip.geoip.country ne "US" and ip.geoip.country ne "GB")`
  - Action: Challenge
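The dashboard rate-limiting rule above can also be approximated inside a Worker when you need application-level control; a minimal fixed-window sketch backed by a KV namespace (the `kv` binding, key prefix, and limits are illustrative assumptions, and KV counters can undercount under heavy concurrency):

```javascript
// Bucket requests into windowSeconds-wide windows, keyed per IP.
function windowKey(ip, nowMs, windowSeconds = 60) {
  const windowId = Math.floor(nowMs / (windowSeconds * 1000));
  return `rl:${ip}:${windowId}`;
}

// Returns true while the request count for the current window is
// below the limit, false once the limit is reached.
async function isAllowed(kv, ip, limit = 5, windowSeconds = 60, nowMs = Date.now()) {
  const key = windowKey(ip, nowMs, windowSeconds);
  const count = parseInt((await kv.get(key)) || '0', 10);
  if (count >= limit) return false;
  // expirationTtl lets stale window counters expire automatically.
  await kv.put(key, String(count + 1), { expirationTtl: windowSeconds * 2 });
  return true;
}
```

In a Worker you would call `isAllowed(env.SESSIONS, request.headers.get('CF-Connecting-IP'))` and return a 429 response when it yields false.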
Configure these headers in Pages:

```
// _headers file
/*
  X-Frame-Options: DENY
  X-Content-Type-Options: nosniff
  X-XSS-Protection: 1; mode=block
  Referrer-Policy: strict-origin-when-cross-origin
  Permissions-Policy: geolocation=(), microphone=(), camera=()
  Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
```

Enable these analytics features:
- Web Analytics:
  - Real User Monitoring (RUM)
  - Core Web Vitals
  - Custom events tracking
- Security Analytics:
  - Threat monitoring
  - Bot traffic analysis
  - Attack pattern detection
- Performance Analytics:
  - Edge response times
  - Cache hit ratios
  - Bandwidth usage
Track business metrics:
```javascript
// In your Worker: record a custom data point without blocking the response.
export default {
  async fetch(request, env, ctx) {
    // Track custom events via the Analytics Engine binding
    ctx.waitUntil(
      env.ANALYTICS_ENGINE.writeDataPoint({
        blobs: ['user_registration'],
        doubles: [1],
        indexes: [request.cf.colo]
      })
    );
    // The handler must still return a response
    return new Response('ok');
  }
};
```
- DNS not propagating:
  ```bash
  # Check DNS propagation
  dig brainsait.io
  nslookup brainsait.io
  ```
- KV namespace not found:
  ```bash
  # List all KV namespaces
  wrangler kv:namespace list
  ```
- Worker deployment fails:
  ```bash
  # Check wrangler configuration
  wrangler whoami
  wrangler dev --local
  ```
- SSL certificate issues:
  - Verify DNS records are correct
  - Check certificate status in Cloudflare dashboard
  - Try disabling proxy (orange cloud) temporarily
```bash
# Check Worker logs
wrangler tail --env production

# Test KV operations
wrangler kv:key get "test-key" --namespace-id=your_namespace_id

# Test D1 queries
wrangler d1 execute solo-spark-db --command="SELECT COUNT(*) FROM users"

# Test R2 buckets
wrangler r2 object list solo-spark-user-files
```

- Landing page loads correctly
- User registration/login works
- AI assistant responds
- Payment integration functional
- Arabic/English switching works
- Project save/export works
- GitHub integration works
- Mobile responsiveness verified
- PageSpeed Insights score > 90
- Time to First Byte < 200ms
- Largest Contentful Paint < 2.5s
- First Input Delay < 100ms
- Cumulative Layout Shift < 0.1
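The performance thresholds above can be encoded in a small helper so automated checks stay in step with the checklist; a sketch (metric key names are illustrative; TTFB, LCP, and FID are in milliseconds, CLS is unitless, and a value equal to the threshold counts as failing the "<" target):

```javascript
// Checklist targets: TTFB < 200ms, LCP < 2.5s, FID < 100ms, CLS < 0.1.
const THRESHOLDS = {
  ttfb_ms: 200,
  lcp_ms: 2500,
  fid_ms: 100,
  cls: 0.1
};

// Returns the names of measured metrics that meet or exceed a threshold.
function failingMetrics(measured) {
  return Object.keys(THRESHOLDS).filter(
    (name) => measured[name] !== undefined && measured[name] >= THRESHOLDS[name]
  );
}
```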
- SSL certificate valid
- Security headers present
- Rate limiting functional
- Bot protection active
- CORS properly configured
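The "security headers present" check can be partially automated; a sketch (the required-header list mirrors the Pages `_headers` file configured earlier, and header lookup is case-insensitive):

```javascript
// Headers every response should carry, per the _headers configuration.
const REQUIRED_HEADERS = [
  'x-frame-options',
  'x-content-type-options',
  'referrer-policy',
  'permissions-policy',
  'strict-transport-security'
];

// Given a plain object of response headers, return the missing ones.
function missingSecurityHeaders(headers) {
  const present = new Set(Object.keys(headers).map((h) => h.toLowerCase()));
  return REQUIRED_HEADERS.filter((h) => !present.has(h));
}
```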
The existing CI/CD pipeline will work with Cloudflare. Update secrets:

```yaml
# .github/workflows/deploy.yml additions
env:
  CLOUDFLARE_API_TOKEN: ${{ secrets.CLOUDFLARE_API_TOKEN }}
  CLOUDFLARE_ACCOUNT_ID: ${{ secrets.CLOUDFLARE_ACCOUNT_ID }}
```

```bash
# Deploy on git push
git push origin main

# Deploy specific branch to staging
wrangler pages deploy dist --project-name=solo-spark-staging --branch=staging
```

Leverage Cloudflare Workers for:
- Real-time AI processing at the edge
- Personalization based on user location
- A/B testing and feature flags
- Dynamic content optimization
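The A/B testing and feature-flag item above can be sketched as deterministic bucketing, so a given user always sees the same variant without any stored state (the hash function and variant names are illustrative assumptions, not part of this project):

```javascript
// Map a stable user key onto [0, 100) with a simple rolling hash.
function hashToPercent(key) {
  let h = 0;
  for (let i = 0; i < key.length; i++) {
    h = (h * 31 + key.charCodeAt(i)) >>> 0; // keep within 32 bits
  }
  return h % 100;
}

// variants: e.g. [['control', 50], ['experiment', 50]] (weights sum to 100).
function assignVariant(userKey, variants) {
  const p = hashToPercent(userKey);
  let cumulative = 0;
  for (const [name, weight] of variants) {
    cumulative += weight;
    if (p < cumulative) return name;
  }
  return variants[variants.length - 1][0]; // fallback for rounding gaps
}
```

In a Worker this could key off a session cookie or `CF-Connecting-IP`, keeping assignment consistent across edge locations with no KV lookup.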
- Automatic content replication across 300+ cities
- Smart routing to nearest data center
- Regional data compliance (GDPR, data residency)
- Multi-region failover
- Production URL: https://brainsait.io
- Dashboard: https://dash.cloudflare.com
- Analytics: Available in Cloudflare dashboard
- Global Edge: 300+ cities worldwide

Your Solo-Spark platform is now running on Cloudflare's enterprise infrastructure, ready to serve Saudi innovators and scale globally!