SurfGo
A Go-based backend service for aggregating and serving surf forecast data
Overview
SurfGo is a Go-based backend service that aggregates and serves surf forecast data from multiple sources including buoys, tide predictions, and wind forecasts. It provides a unified API for accessing consolidated surf condition information, helping surfers make informed decisions about when and where to surf.
Key Features
- Aggregation of data from multiple sources (buoys, tides, winds)
- RESTful API for accessing surf forecasts
- Clean architecture with separation of concerns
- Scheduled data collection from external sources
- Persistent storage of forecast data
Go vs Python: Why Both?
The Surflie platform uses both Go and Python for different components, leveraging the strengths of each language:
Go Backend
- Core data services - Handles data aggregation, storage, and API endpoints
- Scales easily - Low memory footprint and efficient concurrency
- Fewer runtime issues - Static typing and compile-time checks
- Fast execution - Compiled language with excellent performance
- Simple deployment - Single binary with no dependencies
Python ML Service
- AI/ML capabilities - Rich ecosystem for machine learning
- Rapid prototyping - Faster development for ML features
- OpenAI integration - Better libraries for AI API interactions
- Data science tools - NumPy, Pandas for data manipulation
- FastAPI - Modern, high-performance web framework
"We chose Go for the core backend because it scales easier with fewer runtime issues. The compiled binary is easy to deploy and maintain on our self-hosted server. Python handles the ML components where its rich ecosystem shines."
Go Implementation
```go
// Simplified version of actual production code
package repository

import (
	"context"
	"database/sql"
	"encoding/json"
	"fmt"
	"time"
)

// BuoyRepository handles data access for buoy forecast data
type BuoyRepository struct {
	db *sql.DB
}

// NewBuoyRepository creates a new buoy repository instance
func NewBuoyRepository(db *sql.DB) *BuoyRepository {
	return &BuoyRepository{db: db}
}

// GetLatestBuoyForecast retrieves the most recent buoy forecast for a spot
func (r *BuoyRepository) GetLatestBuoyForecast(ctx context.Context, spotID int, stationID string) (interface{}, error) {
	var data []byte
	var fetchTime time.Time

	query := `SELECT data, fetch_time
	          FROM buoy_forecasts
	          WHERE spot_id = ? AND station_id = ?
	          ORDER BY fetch_time DESC
	          LIMIT 1`

	err := r.db.QueryRowContext(ctx, query, spotID, stationID).Scan(&data, &fetchTime)
	if err != nil {
		if err == sql.ErrNoRows {
			return nil, fmt.Errorf("no forecast data found for spot %d, station %s", spotID, stationID)
		}
		return nil, fmt.Errorf("error querying forecast: %w", err)
	}

	var forecast interface{}
	if err := json.Unmarshal(data, &forecast); err != nil {
		return nil, fmt.Errorf("error parsing forecast data: %w", err)
	}
	return forecast, nil
}
```

Python Implementation
```python
# Simplified version of actual production code
import logging

from fastapi import APIRouter, Depends, HTTPException

# Import services and models
from internal.prediction.service import PredictionService
from internal.buoy_forecast.service import BuoyForecastService
from internal.prediction.models import PredictionResponse

# Import spot data maps
from internal.prediction.spot_data import (
    spot_to_buoy_map,
    spot_to_tide_station_map,
    local_knowledge_base,
)

router = APIRouter()
logger = logging.getLogger(__name__)

# --- Dependency Injectors ---
def get_prediction_service():
    return PredictionService()

def get_buoy_forecast_service():
    return BuoyForecastService()

@router.get("/predict/{spot}", response_model=PredictionResponse)
async def get_surf_prediction(
    spot: str,
    pred_service: PredictionService = Depends(get_prediction_service),
    buoy_service: BuoyForecastService = Depends(get_buoy_forecast_service),
):
    """Generates an AI-powered surf prediction for a specific spot."""
    # Get spot-specific data sources
    buoy_id = spot_to_buoy_map.get(spot)
    knowledge = local_knowledge_base.get(spot, "")

    # Reject spots we have no buoy mapping for
    if buoy_id is None:
        raise HTTPException(status_code=404, detail=f"Unknown spot: {spot}")

    # Fetch buoy data
    buoy_data = await buoy_service.get_buoy_data(buoy_id)

    # Generate AI prediction
    prediction_text, error = await pred_service.generate_surf_prediction(
        spot_name=spot,
        buoy_data=buoy_data,
        local_knowledge=knowledge,
    )

    return PredictionResponse(spot=spot, prediction=prediction_text, error=error)
```

Design & Structure
SurfGo follows a clean architecture approach with strict separation of concerns. The application is divided into distinct layers: API Layer (handling HTTP requests), Service Layer (containing business logic), Repository Layer (managing data access), and Scraper Layer (collecting external data).
Key Components
- API Layer: Handles HTTP requests and responses
- Service Layer: Contains business logic and orchestrates operations
- Repository Layer: Manages data access and persistence
- Scraper Layer: Collects data from external sources on a scheduled basis
- Aggregator: Combines data from multiple sources to provide a unified view
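The layering above can be sketched as constructor injection, with the composition root wiring everything once at startup. The types and spot names below are illustrative stand-ins, not the actual SurfGo code:

```go
package main

import "fmt"

// Repository layer: owns data access. An in-memory map stands in here
// for the real SQLite-backed repository.
type SpotRepository struct {
	spots map[int]string
}

func (r *SpotRepository) SpotName(id int) (string, bool) {
	name, ok := r.spots[id]
	return name, ok
}

// Service layer: business logic. It receives the repository through its
// constructor rather than creating it, so dependencies flow inward.
type SpotService struct {
	repo *SpotRepository
}

func NewSpotService(repo *SpotRepository) *SpotService {
	return &SpotService{repo: repo}
}

func (s *SpotService) Describe(id int) string {
	name, ok := s.repo.SpotName(id)
	if !ok {
		return fmt.Sprintf("spot %d: unknown", id)
	}
	return fmt.Sprintf("spot %d: %s", id, name)
}

func main() {
	// Composition root: wire the layers together once at startup.
	// The API layer would receive the service the same way.
	repo := &SpotRepository{spots: map[int]string{1: "Steamer Lane"}}
	svc := NewSpotService(repo)
	fmt.Println(svc.Describe(1)) // spot 1: Steamer Lane
}
```

Because each layer only sees the layer below through its constructor, a test can swap in a stub repository without touching the service.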
Project Structure
The project is organized into packages that reflect the clean architecture approach. The cmd directory contains application entry points, while the internal directory houses the core business logic and implementation details. This structure ensures that dependencies flow inward, with high-level modules not depending on low-level modules.
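One plausible layout consistent with that description (directory names here are illustrative; the actual tree may differ):

```
surfgo/
├── cmd/
│   └── server/
│       └── main.go        # entry point, wiring, graceful shutdown
├── internal/
│   ├── api/               # HTTP handlers and routing
│   ├── service/           # business logic
│   ├── repository/        # SQLite data access
│   ├── scraper/           # scheduled collection from external sources
│   └── aggregator/        # combines buoy, tide, and wind data
└── go.mod
```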
Database Schema
SurfGo uses SQLite for data storage, with a schema designed to efficiently store and retrieve surf forecast data from multiple sources.
| Table | Description | Relationships |
|---|---|---|
| spots | Surf spot information | Referenced by buoy_forecasts, tide_forecasts, and wind_forecasts via spot_id |
| buoy_forecasts | Buoy forecast data | spot_id → spots.id |
| tide_forecasts | Tide forecast data | spot_id → spots.id |
| wind_forecasts | Wind forecast data | spot_id → spots.id |
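As a rough sketch, the SQLite DDL behind this schema might look like the following. The data, fetch_time, spot_id, and station_id columns come from the repository query shown earlier; any other column names are assumptions:

```sql
CREATE TABLE spots (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL          -- assumed column
);

-- One table per source; the forecast payload is stored as a JSON blob,
-- matching the json.Unmarshal in the repository layer.
CREATE TABLE buoy_forecasts (
    id         INTEGER PRIMARY KEY,
    spot_id    INTEGER NOT NULL REFERENCES spots(id),
    station_id TEXT NOT NULL,
    data       TEXT NOT NULL,   -- JSON payload
    fetch_time TIMESTAMP NOT NULL
);
-- tide_forecasts and wind_forecasts follow the same shape.
```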
Technologies
Backend
- Go 1.23.4 with built-in routing in net/http (no framework needed)
- SQLite for data storage
- Context package for request scoping and cancellation
- Goroutines and channels for concurrency
Patterns
- Repository Pattern for data access abstraction
- Service Layer for business logic encapsulation
- Dependency Injection for component management
- Command-Query Separation for clear operation intent
APIs
- RESTful API design using net/http
- JSON response formatting
- Structured error handling
- Middleware for cross-cutting concerns
Tools
- External APIs for data sources
- Scheduled scraping with configurable intervals
- Graceful shutdown handling
- Comprehensive error logging
API Endpoints
SurfGo provides a comprehensive set of API endpoints for accessing surf-related data:
Spots
| Method | Endpoint | Description |
|---|---|---|
| GET | /api/spots | Get all surf spots |
| GET | /api/spots/{spotID} | Get details for a specific surf spot |
Consolidated Data
| Method | Endpoint | Description |
|---|---|---|
| GET | /api/spot/{spotID} | Get consolidated data for a specific spot (buoy, tide, wind) |
Buoy Data
| Method | Endpoint | Description |
|---|---|---|
| GET | /api/buoy_realtime/{stationID} | Get real-time buoy data for a specific station |
| GET | /api/buoy_forecast/{stationID} | Get buoy forecast data for a specific station |
Other Data
| Method | Endpoint | Description |
|---|---|---|
| GET | /api/tides/{stationID} | Get tide predictions for a specific station |
| GET | /api/winds/ | Get wind forecast by gridpoint |
| GET | /api/predict/{spot} | Generate AI-powered surf prediction for a specific spot |
Future Directions
Roadmap
- Machine learning for surf quality predictions
- Real-time alerts for optimal surf conditions
- User accounts with favorite spots
- Mobile application development
- Integration with weather radar and satellite imagery
Challenges
- Ensuring data freshness from multiple sources
- Handling API rate limits and downtime
- Scaling to support more spots and users
- Optimizing storage for historical data
Opportunities
- Expanding to global surf spot coverage
- Developing partnerships with surf-related businesses
- Creating a community-driven spot rating system
- Integrating with wearable devices for surfers