A 60-minute build from PRD to a deployed dev environment for Google’s Buildathon Southern Africa 2026 (part of Build with AI). The brief was deliberately tight: one hour from a blank repo to a running service on Google Cloud. Sentinel Health placed first.
The problem
In many rural communities, the first person to see a sick child is not a doctor but a Community Health Volunteer (CHV) working with limited tools, poor connectivity, and high-pressure decisions. A child with malaria, pneumonia, dehydration, or severe malnutrition can deteriorate in hours. Sentinel helps the CHV capture symptoms by voice or form, flags danger signs immediately, and uses Gemini for triage support — keeping the human firmly in control. The same intake also feeds a district view where officers can see live patterns, urgent referrals, and early outbreak alerts.
What we built in 60 minutes
- Voice-assisted intake — browser speech recognition or on-device Whisper transcription captures the report; Gemini then extracts structured fields for human review.
- Deterministic danger-sign rules that bypass the model and trigger urgent referral immediately.
- Gemini-assisted triage for non-critical cases (decision support, not diagnosis).
- Offline-style queue and sync so the field reality of patchy connectivity doesn’t block intake.
- District dashboard — live case feed, urgent referrals, and outbreak-threshold alerts.
- BigQuery-backed event storage so the same intake stream powers downstream analytics.
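The safety floor in that list is the deterministic layer. A minimal sketch of what such a pre-model screen can look like — the field names and thresholds below are illustrative assumptions, not Sentinel's actual rule set:

```python
# Deterministic danger-sign screen: runs before any Gemini call and,
# on any hit, short-circuits straight to urgent referral.
# Field names and thresholds are illustrative assumptions only.

URGENT = "URGENT_REFERRAL"
AI_TRIAGE = "AI_TRIAGE"

def screen_danger_signs(intake: dict) -> tuple[str, list[str]]:
    """Return (route, reasons). Pure rules — no model involved."""
    reasons = []
    if intake.get("convulsions"):
        reasons.append("convulsions")
    if intake.get("unable_to_drink"):
        reasons.append("unable to drink or breastfeed")
    if intake.get("resp_rate", 0) >= 50 and intake.get("age_months", 0) < 12:
        reasons.append("fast breathing for age")
    if intake.get("muac_mm") is not None and intake["muac_mm"] < 115:
        reasons.append("severe acute malnutrition (MUAC < 115 mm)")
    return (URGENT, reasons) if reasons else (AI_TRIAGE, reasons)
```

Because the check is a plain function over the intake dict, it runs identically online and offline, and the AI triage path only ever sees cases the rules have already cleared.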
Architecture
Google Cloud services used: Gemini (intake + triage), Cloud Run (frontend + backend, deployed independently), BigQuery (events + outbreak detection), Secret Manager (backend-only Gemini key), Cloud Build + Artifact Registry (source-based deploys), Cloud Logging.
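The outbreak-detection side of that event stream reduces to a sliding-window count per area and symptom, compared against a threshold. An in-memory sketch of the idea — the window length, threshold, and grouping keys are assumptions for illustration, and in production this would be a query over the BigQuery events table rather than a Python loop:

```python
from collections import Counter
from datetime import datetime, timedelta

# Sliding-window outbreak check over an event stream.
# window_days, threshold, and the (area, symptom) grouping are
# illustrative assumptions, not Sentinel's actual parameters.

def outbreak_alerts(events, now, window_days=7, threshold=5):
    """events: iterable of (timestamp, area, symptom) tuples.
    Returns {(area, symptom): count} for pairs at or above threshold
    within the window."""
    cutoff = now - timedelta(days=window_days)
    counts = Counter(
        (area, symptom)
        for ts, area, symptom in events
        if ts >= cutoff
    )
    return {key: n for key, n in counts.items() if n >= threshold}
```

Keeping the alert a pure threshold over counts matches the human-in-the-loop stance below: the dashboard surfaces a signal, and a district officer decides what it means.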
Why Gemini now, MedGemma next
For the buildathon, Gemini was the fastest path to demonstrate the full workflow — voice capture, structured intake, human review, triage support, district visibility. With more time, Sentinel would move toward an on-device-first architecture using MedGemma and Gemma-family models for clinically aligned triage directly on low-cost Android devices: local-language voice intake, multimodal assessment for signs like malnutrition, offline sync with conflict handling, privacy-preserving district analytics, and tighter validation against local protocols.
Human-in-the-loop safety
Sentinel keeps the CHV in control:
- Voice input only prepopulates the form — the CHV reviews and edits before submission.
- Danger-sign detection is deterministic — it does not wait on AI.
- AI output is decision support, not diagnosis.
- District alerts are signals for public-health action, not automated conclusions.
Reflection
The 60-minute constraint forces brutal scoping — you ship what matters or nothing ships. Picking the rule-based danger-sign path as the safety floor and letting Gemini sit on top of it (rather than gating safety on the model) was the call that made the demo defensible. Voice + structured extraction + a real cloud-deployed dashboard was just enough to land the story for the judges.
Repository: github.com/ThamuMnyulwa/buildathon-60min-2026