Case Study  ·  EdTech Tool

gitch

Designing clarity
into classroom data.

The technology
isn't the hard part.
Interpretation is.

Role
Designer + Developer
Stack
HTML · CSS · JS · Supabase
Status
Live at gitch.org

Scroll to read

Visit gitch.org

Data was being
collected. It wasn't
changing anything.

In my classroom, formative data was being collected. It wasn't changing instruction. We tracked mastery using printed spreadsheets. After each lesson, I would check off boxes for 26 students across multiple checkpoints. Over weeks and months, those sheets stacked up. Fifty pages. Hundreds of marks. No synthesis.

"The system stored data. It didn't interpret it. That gap directly impacted the precision of my teaching."

The questions that actually mattered were impossible to answer:

Who has been steadily declining across the last three weeks?
Are we improving in writing but simultaneously regressing in math?
Which standards consistently break down at higher levels of rigor?

Make intervention
decisions obvious.

The tool needed to work inside real classroom constraints: limited time, limited devices, and constant cognitive load. That meant no onboarding overhead, no spreadsheet setup, and no training required.

01
Reduce friction in daily data entry to near zero
02
Automatically synthesize performance across lessons and time
03
Surface patterns, not just isolated outcomes
04
Make the right next instructional move immediately visible

A formative assessment
platform built for
real classrooms.

Gitch is a personalized formative assessment platform for elementary classrooms. Each teacher signs in to a private database scoped to their own classroom, and data is never shared across users. Autonomy and psychological safety are preserved by design.
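As a rough illustration of that per-teacher isolation, here is a minimal client-side sketch. The table and column names (lessons, teacher_id) are assumptions, not the actual schema, and the real boundary would be enforced server-side by row-level security rather than by the query alone.

// Sketch only: table and column names (lessons, teacher_id) are assumptions.
import { createClient } from '@supabase/supabase-js';

const supabase = createClient('https://<project>.supabase.co', '<anon-key>');

async function loadMyLessons() {
  // Identify the signed-in teacher.
  const { data: { user } } = await supabase.auth.getUser();

  // Ask only for this teacher's lessons; a row-level security policy on the
  // server would enforce the same boundary no matter what the client requests.
  const { data, error } = await supabase
    .from('lessons')
    .select('*')
    .eq('teacher_id', user.id);

  if (error) throw error;
  return data;
}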

Gitch dashboard — lesson bins by subject with achievement percentages
Overview The main dashboard: subjects organized into lesson bins, each showing average class achievement and lesson count at a glance.
1
Lesson-Level Tracking
Teachers create structured lesson trackers aligned to standards and define instructional checkpoints in advance. Mastery data is recorded quickly during or after instruction. No typing, minimal clicking.
2
Real-Time Visualization
Data automatically compiles into subject-level performance summaries, student performance cards with benchmark color-coding, and trend graphs showing achievement across lessons. Instead of flipping through paper, teachers see patterns instantly.
3
Immediate Signal Detection
Color-coded performance bands (Well Below through Well Above Benchmark) make it possible to identify intervention needs in seconds. Trend lines reveal momentum rather than isolated outcomes.
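A minimal sketch of both ends of that loop, reusing the client above. The checkpoint_results table, its columns, and the band cut points are assumptions for illustration, not the product's actual schema or thresholds.

// One tap per student: record a single checkpoint result.
async function recordResult(studentId, checkpointId, mastered) {
  const { error } = await supabase
    .from('checkpoint_results')
    .insert({ student_id: studentId, checkpoint_id: checkpointId, mastered });
  if (error) throw error;
}

// The four bands named above, with illustrative cut points only.
const BANDS = [
  { min: 85, label: 'Well Above Benchmark' },
  { min: 70, label: 'Above Benchmark' },
  { min: 50, label: 'At Benchmark' },
  { min: 0,  label: 'Well Below Benchmark' },
];

// Collapse a student's checkpoint results into a percentage, then a band.
function bandFor(results) {
  if (results.length === 0) return null;
  const mastered = results.filter((r) => r.mastered).length;
  const percent = Math.round((100 * mastered) / results.length);
  return { percent, ...BANDS.find((b) => percent >= b.min) };
}

With a mapping like this, the roster view is just one bandFor call per student.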

What it looks
like in use.

Student Performance Overview with benchmark color bands
Student Performance The Student Performance Overview: every student color-coded by benchmark band. At a glance: who needs support, who's on track, who's excelling.
Class Roster with benchmark-coded student cards
Class Roster The full class roster with benchmark color-coding. Click any student to drill into their individual progress graph.
Individual student drill-down — Student 1 at 83%, Above Benchmark
Student Drill-Down Individual student view: overall benchmark status, checkpoint progress bar, and subject-level breakdown in one screen.
Standards breakdown showing CCSS performance by domain
Standards View Standards-level breakdown: which CCSS domains are achieved, which aren't, with checkpoint counts per standard.
Student benchmark overview — color-coded performance bands
Performance Bands The full performance band legend: Well Below → At → Above → Well Above Benchmark. Color makes triage instantaneous.

Three decisions
that defined
the product.

Signal over density
Early prototypes included deeper analytics layers and additional metrics. Teachers ignored them. The final version strips everything back to clear subject summaries, clear student performance bands, and one primary longitudinal trend graph. Clarity consistently outperformed complexity in every iteration.
Relational architecture, not a digital spreadsheet
This wasn't a flat data problem. Students, lessons, checkpoints, and standards required relational mapping to make longitudinal tracking possible. The backend was structured to support growth analysis across dimensions, not isolated daily records. That architectural decision is what makes pattern detection possible at all; a query sketch follows below.
Designed for cognitive load
Teachers enter data between dismissal and planning. They interpret it under pressure, on a phone, in three minutes. The interface minimizes click depth, typing load, and visual clutter. Every primary workflow is one click away. If it slowed teachers down, it was removed.
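To illustrate what that relational structure buys, here is a hypothetical longitudinal query in the same supabase-js style. The nested relationships (checkpoint_results → checkpoints → lessons and standards) are assumptions about the schema, not its actual shape.

// Sketch only: one row per checkpoint result, joined to its lesson and
// standard, in chronological order, i.e. the raw material for a trend line.
async function loadStudentHistory(studentId) {
  const { data, error } = await supabase
    .from('checkpoint_results')
    .select('mastered, recorded_at, checkpoints ( name, lessons ( title, subject ), standards ( code ) )')
    .eq('student_id', studentId)
    .order('recorded_at', { ascending: true });
  if (error) throw error;
  return data;
}

Because lessons, checkpoints, and standards live in separate related tables rather than as columns on a flat sheet, the same records can be re-sliced by student, by standard, or by time without re-entering anything.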

Visibility changes
behavior.

Within the first week of implementation, the lessons didn't change dramatically. The decision-making did.

Instead of reteaching broadly, I could target specific students, specific standards, and specific breakdown points across time. Instruction shifted from reactive to pattern-based.

I can't give you a controlled study. What I can tell you is what changed: I stopped guessing about who needed what. The data told me.

What this project
demonstrates.

Identifying systemic workflow gaps before reaching for a solution
Designing user-centered tools under real time, device, and cognitive constraints
Prototyping and deploying a full-stack product from Figma to production
Iterating based on live user behavior, not assumptions
Building systems that reduce decision fatigue, not add to it

Collecting information is easy. Designing interpretation systems that make it usable: that's where real impact lives.

See it live.
Judge it yourself.