This is a submission for Weekend Challenge: Earth Day Edition
What I Built
Personal Earth Assistant is a small full-stack app that helps people think about lower-impact everyday choices—commute, home size, diet, flights, and energy habits—without pretending to be a full carbon calculator.
Goals:
- Give users a simple “Private Earth Score” (0–100) from a transparent rubric, not a black box.
- Offer a chat coach that reads their saved profile and can update habits via server-side tools (so the model never needs API keys in the browser).
- Keep identity and data scoped per user with Auth0 and a SQLite store keyed by the Auth0 `sub` claim.
Stack: React (Vite) + Express (TypeScript), Auth0 SPA login + API audience, Gemini or Groq on the server only, SQLite for profile + chat history + score snapshots, streaks, and lightweight badges.
Demo
Video walkthrough:
Code
Personal Earth Assistant
Vite + React SPA with an Express API: Auth0 login (JWT + audience), Gemini on the server only, SQLite profile + chat history keyed by Auth0 sub. Built for the DEV Weekend Challenge: Earth Day Edition.
Quick start (local)
- Auth0
  - Create a Single Page Application client.
  - Create an API with an identifier (e.g. `https://personal-earth-api`); this value is your audience.
  - Under the API, authorize the SPA client if required by your tenant defaults.
  - Application URLs (dev): Callback, Logout, and Allowed Web Origins → `http://localhost:5173`.
- Google AI Studio
  - Create an API key for the Gemini API (server-side only).
- Env files
  - Copy `client/.env.example` → `client/.env` (Auth0 SPA vars).
  - Copy `server/.env.example` → `server/.env` (Auth0 API validation, `GOOGLE_API_KEY` or Groq via `LLM_PROVIDER`, `CLIENT_ORIGIN`, etc.).
- Run
  - `npm install`, then `npm run dev`
  - UI: http://localhost:5173
  - API: http://localhost:3001 (proxied as `/api` from Vite)
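The dev proxy can be sketched as a standard Vite `server.proxy` entry; the repo's actual `vite.config.ts` may differ in details, this is just the idea:

```typescript
// client/vite.config.ts (sketch): forward /api/* to the Express process in dev,
// so the SPA calls relative URLs and CORS never comes into play locally.
import { defineConfig } from "vite";
import react from "@vitejs/plugin-react";

export default defineConfig({
  plugins: [react()],
  server: {
    proxy: {
      // anything under /api goes to the API server on :3001
      "/api": { target: "http://localhost:3001", changeOrigin: true },
    },
  },
});
```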
Production build (single Node process)
Highlights to skim in the repo:
- Auth: `client/src/App.tsx` (Auth0 SPA + audience); `server/src/auth.ts` (JWT verification with JWKS).
- LLM + tools: `server/src/gemini.ts` (Gemini tool calls: `update_user_habits`, `set_weekly_goal`, `note_win`); `server/src/groq.ts` (optional Groq path via `server/src/chatProvider.ts`).
- Persistence: `server/src/db.ts`, `server/src/gamify.ts` (profiles, messages, score history, activity days, badges).
- Streaming chat UI: `client/src/pages/ChatPage.tsx` + `POST /api/chat/stream` on the server.
How I Built It
Frontend is a Vite + React SPA. Users sign in with Auth0; the client requests access tokens with an API audience that matches a Custom API in Auth0 (not the Management API). All `/api/*` calls send `Authorization: Bearer <token>`.
Backend is Express. I verify JWTs with `jose` against Auth0’s JWKS (`AUTH0_DOMAIN` + `AUTH0_AUDIENCE`). Every row in SQLite is tied to `sub`, so users only ever see their own profile and chat.
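Beyond the JWKS signature check that `jose`'s `jwtVerify` performs, the token has to carry the right claims before a request touches the database. A minimal sketch of those checks (the payload shape and function name here are illustrative, not the repo's code):

```typescript
// Claim checks a verified Auth0 access token must pass before any DB work.
// jose's jwtVerify enforces issuer/audience/expiry when configured; this
// spells the same checks out explicitly for clarity.
interface Claims {
  iss?: string;            // issuer: https://<AUTH0_DOMAIN>/
  aud?: string | string[]; // audience(s) granted to the token
  exp?: number;            // expiry, seconds since epoch
  sub?: string;            // stable user id; every SQLite row is keyed by this
}

function checkClaims(c: Claims, domain: string, audience: string, nowSec: number): string {
  if (c.iss !== `https://${domain}/`) throw new Error("bad issuer");
  const auds = Array.isArray(c.aud) ? c.aud : [c.aud];
  if (!auds.includes(audience)) throw new Error("bad audience");
  if (!c.exp || c.exp <= nowSec) throw new Error("token expired");
  if (!c.sub) throw new Error("missing sub");
  return c.sub; // the caller scopes all queries to this id
}
```

Returning `sub` from the auth layer (rather than the whole payload) keeps the rest of the API honest: handlers can only filter by user id.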
Onboarding collects coarse lifestyle fields; the server computes a deterministic score from a documented rubric (`server/src/score.ts`) and stores snapshots over time for charts on the dashboard.
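The "transparent rubric, not a black box" idea can be sketched as a pure function: the field names and weights below are invented for illustration, the real numbers live in `server/src/score.ts`:

```typescript
// Illustrative rubric only: start from 100 and subtract fixed, documented
// penalties per habit. Deterministic, so the same profile always scores the same.
interface Habits {
  commute: "walk_bike" | "transit" | "car";
  diet: "plant_based" | "mixed" | "meat_heavy";
  flightsPerYear: number;
}

function earthScore(h: Habits): number {
  let score = 100;
  score -= { walk_bike: 0, transit: 8, car: 25 }[h.commute];
  score -= { plant_based: 0, mixed: 10, meat_heavy: 22 }[h.diet];
  score -= Math.min(30, h.flightsPerYear * 6); // cap the flight penalty
  return Math.max(0, Math.round(score));       // clamp to 0–100
}
```

Because the function is pure, score snapshots are reproducible: recomputing after a tool call can never disagree with the stored history for the same inputs.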
Chat runs only on the server:
- Google Gemini (default) or Groq (`LLM_PROVIDER=groq` + `GROQ_API_KEY`) via a small provider switch.
- The model can call tools to merge partial profile updates, set a weekly goal, or log a “win”; the server then persists to SQLite and recomputes the score when needed.
- Streaming: SSE from `POST /api/chat/stream`, so the UI shows tokens incrementally (the Groq path completes the turn server-side, then chunks the reply for a smooth SSE experience).
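The "complete, then chunk" Groq path can be sketched with two small helpers; chunk size and the event payload shape are my choices, not necessarily the repo's:

```typescript
// Split a finished reply into small pieces and frame each as an SSE event,
// so the client renders tokens incrementally even when the provider returned
// the whole turn at once.
function chunkReply(text: string, size = 24): string[] {
  const chunks: string[] = [];
  for (let i = 0; i < text.length; i += size) chunks.push(text.slice(i, i + size));
  return chunks;
}

function sseFrame(token: string): string {
  // One `data:` line per event, terminated by a blank line per the SSE spec.
  return `data: ${JSON.stringify({ token })}\n\n`;
}
```

In the route handler each frame would be written to the response (with `Content-Type: text/event-stream`), while the genuinely streaming Gemini path writes frames as the SDK yields them.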
Env loading: `server/src/loadEnv.ts` always loads `server/.env` relative to the package, so `npm run dev` from the repo root still picks up keys (avoiding the common “wrong cwd” issue with dotenv).
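The cwd-independent trick is to resolve the `.env` path from the module's own directory instead of `process.cwd()`. The repo uses dotenv; the hand-rolled parser below just makes the sketch self-contained, and the function names are illustrative:

```typescript
// Resolve .env relative to this file, not the process working directory,
// so starting the server from the repo root still finds server/.env.
import { existsSync, readFileSync } from "node:fs";
import { resolve } from "node:path";

export function parseEnv(text: string): Record<string, string> {
  const vars: Record<string, string> = {};
  for (const line of text.split("\n")) {
    // KEY=VALUE lines only; comments and blanks are skipped.
    const m = line.match(/^\s*([A-Za-z_][A-Za-z0-9_]*)\s*=(.*)$/);
    if (m) vars[m[1]] = m[2].trim();
  }
  return vars;
}

export function loadEnvFrom(moduleDir: string, file = ".env"): Record<string, string> {
  const envPath = resolve(moduleDir, "..", file); // e.g. server/src -> server/.env
  return existsSync(envPath) ? parseEnv(readFileSync(envPath, "utf8")) : {};
}
```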
Interesting tradeoff: I kept the rubric simple and inspectable on purpose—good for a weekend build and for explaining “what the score means” in the UI. The value is in auth + safe LLM integration + durable user state, not in claiming scientific carbon accuracy.
Auth0 for Agents (angle)
This MVP is built around trusted human identity (Auth0 SPA JWT), server-held secrets (Gemini / Groq keys), and tool execution that mutates user state scoped by sub—the same building blocks you need when wiring agents or MCP-style flows: identity, policy, and side effects on the server. A natural next step would be to add Auth0’s agent-specific flows or delegated credentials per their docs and describe that in a follow-up post.
Google Gemini (angle)
I use `@google/generative-ai` on the Node server only, with function calling so the model can persist structured updates instead of only chatting. The client never sees `GOOGLE_API_KEY`. Streaming uses the SDK’s stream API for Gemini so tokens can flow to the browser over SSE while tool rounds stay on the server.
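A function declaration in the shape the SDK accepts (passed via `tools: [{ functionDeclarations: [...] }]` on the model) looks roughly like this; only the tool name comes from the repo, the parameter schema is my guess:

```typescript
// Declaration shape for Gemini function calling. With the SDK you would use
// SchemaType.OBJECT etc.; plain strings here keep the sketch dependency-free.
const updateUserHabits = {
  name: "update_user_habits",
  description: "Merge a partial habit update into the signed-in user's profile.",
  parameters: {
    type: "OBJECT",
    properties: {
      commute:        { type: "STRING", description: "e.g. walk_bike | transit | car" },
      diet:           { type: "STRING" },
      flightsPerYear: { type: "NUMBER" },
    },
    // no `required` list: the server merges whatever fields the model supplies
  },
} as const;
```

When Gemini responds with a `functionCall` naming this tool, the server executes it against SQLite and feeds the result back for the next model turn; the model itself never touches storage.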
Prize Categories
I’m submitting for:
- Best use of Auth0 for Agents — Auth0-backed SPA + API audience, JWT validation on every route, per-user SQLite isolation, and a server-side tool loop suitable for agent-style extensions (identity + secrets + durable actions).
- Best use of Google Gemini — Server-only Gemini with tools, streaming responses over SSE, and profile/habit persistence driven by model tool calls.