EduAIssist — A Case Study
Automating question generation & answer evaluation for smarter classrooms
TL;DR
We built EduAIssist, an AI-powered education platform that helps teachers generate question papers from study materials and evaluate handwritten answers. Using Next.js, NestJS, Ollama, and ChromaDB, we streamlined exam preparation and grading workflows. Outcome: teachers save hours, students get faster feedback, and institutions improve efficiency.

Problem Overview
Teachers spend enormous amounts of time preparing question papers and grading handwritten answers. Both tasks are tedious, repetitive, and prone to bias or inconsistency. With increasing student numbers, manual methods no longer scale — leading to delays, errors, and frustration. Schools and institutions needed a smarter system to:

- Automate question paper generation from large volumes of study material.
- Evaluate handwritten answers consistently while still allowing teachers to override results.
- Reduce administrative load so educators could focus on teaching instead of paperwork.
EduAIssist set out to solve this by bringing AI into the classroom in a meaningful, workflow-friendly way.
Role & Responsibilities
- Role: Full-stack development team
- Responsibilities:
  - AI integration (Ollama LLMs, embeddings)
  - Frontend (Next.js, TypeScript, Tailwind)
  - Backend (NestJS, APIs, database design)
  - PDF ingestion & semantic search pipeline (sketched below)
  - Authentication & user management
  - QA, deployment, and documentation
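The ingestion pipeline reduces to parse → chunk → embed → store. A minimal sketch, assuming Node 18+ (global `fetch`), `pdf-parse`, the `chromadb` client, and a local Ollama server; the `nomic-embed-text` model, the chunk size, and the `ingestPdf` helper are illustrative, not the production choices:

```typescript
// Sketch: ingest a PDF into ChromaDB for semantic search.
import { readFile } from "node:fs/promises";
import pdf from "pdf-parse";
import { ChromaClient } from "chromadb";

const chroma = new ChromaClient({ path: "http://localhost:8000" });

// Embed text via Ollama's REST embeddings endpoint.
async function embed(text: string): Promise<number[]> {
  const res = await fetch("http://localhost:11434/api/embeddings", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "nomic-embed-text", prompt: text }),
  });
  return (await res.json()).embedding;
}

// Naive fixed-size chunking; the real pipeline would split on headings/sentences.
function chunk(text: string, size = 1000): string[] {
  const out: string[] = [];
  for (let i = 0; i < text.length; i += size) out.push(text.slice(i, i + size));
  return out;
}

export async function ingestPdf(path: string, subject: string): Promise<void> {
  const { text } = await pdf(await readFile(path));       // pdf-parse extraction
  const chunks = chunk(text);
  const collection = await chroma.getOrCreateCollection({ name: "study-material" });
  await collection.add({
    ids: chunks.map((_, i) => `${subject}-${i}`),
    embeddings: await Promise.all(chunks.map(embed)),     // one vector per chunk
    documents: chunks,
    metadatas: chunks.map(() => ({ subject })),
  });
}
```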
Project Context
- Timeline: 10 weeks
- Team size: 4 developers
- Context: Paid client project for EdTech innovation
My Approach
We followed an agile development process with 2-week sprint cycles.

- Discovery & Research: Interviewed teachers, reviewed existing EdTech tools, and mapped workflows.
- Design & Prototyping: Built wireframes of teacher dashboards, exam flows, and AI evaluation screens.
- Implementation: Integrated Ollama for LLM-powered question generation and ChromaDB for semantic search.
- Testing & Iteration: Piloted with sample student data and refined prompts for accuracy.
This iterative loop ensured the final product was teacher-friendly and institution-ready.
Research & Insights
Key Findings from Teacher Interviews:
- Generating balanced question papers takes 4–6 hours per subject.
- Teachers wanted control over AI outputs (edit/override AI decisions).
- Existing EdTech tools often lacked support for handwritten answers.
- Institutions required reporting & analytics at class/subject levels.
Interview Questions We Asked:
- How do you currently prepare question papers?
- What’s the most time-consuming part of grading?
- How do you manage handwritten vs typed submissions?
- What reports/analytics do you wish you had?
User Persona:
- Name: Priya Sharma
- Role: High school science teacher
- Goals: Save time preparing papers, grade faster, get analytics for students.
- Pain Points: Manual grading, repetitive tasks, lack of tools for handwritten answers.

Information Architecture
EduAIssist was structured around these core screens:
- Onboarding & Login (Google/Microsoft OAuth)
- Dashboard (classes, exams, reports overview)
- Question Paper Generator (upload → generate → edit → export)
- Answer Evaluation Tool (upload sheets → AI score → teacher override)
- Study Material Manager (upload books, chapters, PDFs)
- Reports & Analytics (performance by student/class/subject)

Visual Language
We used a calm, professional palette of blue and white — colors that reflect trust, clarity, and focus. Typography combined Inter (modern, clean UI) with subtle weight variations for hierarchy. Visual choices aimed to reduce cognitive load for busy teachers.
Wireframes & Early Ideas
We began with low-fidelity sketches of the dashboard and paper generator. Teachers requested a “preview + edit” flow for generated questions, which we added after testing early wireframes. Another trade-off was balancing AI autonomy with teacher control; we resolved it with override toggles on the evaluation screens.
Designing Solutions
Problem 1: Manual question paper creation is slow.
- Solution:
  - Upload PDFs/books directly.
  - AI (Ollama + ChromaDB) generates questions based on semantic embeddings (sketched below the figure).
  - Teachers can edit or regenerate sections.
  - Export as PDF for printing/distribution.

Figure: Question generation dashboard with preview and edit options.
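The generate step can be wired as retrieval followed by grounded prompting. A sketch under the same assumptions as the ingestion code (local Ollama and ChromaDB over HTTP); `generateQuestions` and the exact prompt wording are illustrative:

```typescript
// Sketch: pull the closest study-material chunks for a topic, then ask the
// LLM for questions grounded in that context.
import { ChromaClient } from "chromadb";

const chroma = new ChromaClient({ path: "http://localhost:8000" });

async function ollama(path: string, body: object): Promise<any> {
  const res = await fetch(`http://localhost:11434${path}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  return res.json();
}

export async function generateQuestions(topic: string, count = 5): Promise<string> {
  // Semantic search: embed the topic and query ChromaDB for nearby chunks.
  const { embedding } = await ollama("/api/embeddings", {
    model: "nomic-embed-text",
    prompt: topic,
  });
  const collection = await chroma.getOrCreateCollection({ name: "study-material" });
  const results = await collection.query({ queryEmbeddings: [embedding], nResults: 4 });
  const context = (results.documents[0] ?? [])
    .filter((d): d is string => d !== null)
    .join("\n---\n");

  // Ground the prompt in retrieved material so questions stay on-syllabus.
  const { response } = await ollama("/api/generate", {
    model: "mistral",
    stream: false,
    prompt:
      `Using only the study material below, write ${count} exam questions ` +
      `of mixed difficulty.\n\nMaterial:\n${context}`,
  });
  return response; // rendered in the preview pane for the teacher to edit
}
```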
Problem 2: Handwritten answers are hard to evaluate fairly.
- Solution:
  - AI-assisted scoring with OCR + embeddings (see the sketch after the figure).
  - Teacher override to correct/adjust scores.
  - Confidence scores shown for transparency.
Figure: Answer evaluation with AI-suggested marks and override options.
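The scoring core can be sketched as an embedding comparison. OCR is assumed done upstream (`ocrText` stands in for its output), and the linear similarity-to-marks mapping below is a deliberately naive stand-in for the real rubric logic:

```typescript
// Sketch: suggest a mark by comparing the OCR'd answer to the model answer
// in embedding space, surfacing the similarity as a confidence value.
async function embed(text: string): Promise<number[]> {
  const res = await fetch("http://localhost:11434/api/embeddings", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "nomic-embed-text", prompt: text }),
  });
  return (await res.json()).embedding;
}

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

export async function suggestScore(ocrText: string, modelAnswer: string, maxMarks: number) {
  const [student, reference] = await Promise.all([embed(ocrText), embed(modelAnswer)]);
  const similarity = cosine(student, reference); // ~1 means semantically close
  return {
    suggested: Math.round(Math.max(0, similarity) * maxMarks), // teacher can override
    confidence: similarity,                                    // surfaced in the UI
  };
}
```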
Problem 3: Institutions lacked visibility into student performance.
- Solution:
  - Auto-generated report cards with detailed analytics (query sketch after the figure).
  - Visual breakdown of subject performance.
  - Export options for administration.

Figure: Performance analytics showing class-wide trends.
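Behind the report screens, the analytics reduce to grouped SQL aggregations. A minimal TypeORM sketch; the `ExamResult` entity and its columns are hypothetical, for illustration only:

```typescript
import { Column, DataSource, Entity, PrimaryGeneratedColumn } from "typeorm";

// Hypothetical result entity; the real schema differs.
@Entity()
class ExamResult {
  @PrimaryGeneratedColumn() id!: number;
  @Column() classId!: string;
  @Column() subject!: string;
  @Column("float") score!: number;
}

// Average and best score per subject for one class, feeding the report screen.
export async function classSubjectReport(db: DataSource, classId: string) {
  return db
    .getRepository(ExamResult)
    .createQueryBuilder("r")
    .select("r.subject", "subject")
    .addSelect("AVG(r.score)", "average")
    .addSelect("MAX(r.score)", "best")
    .where("r.classId = :classId", { classId })
    .groupBy("r.subject")
    .getRawMany();
}
```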
Tech & Implementation
- Backend: NestJS (TypeScript, TypeORM)
- Database: MySQL/PostgreSQL with TypeORM
- Authentication: OAuth (Google/Microsoft) + JWT (guard sketch below)
- AI/LLM: Ollama (Mistral, LLaMA2) for question generation + embeddings
- Vector DB: ChromaDB for semantic search
- File Storage: Firebase Storage
- PDF Processing: pdf-parse for ingestion
- Deployment: Railway-ready configuration
- Scalability: Microservice-ready architecture with modular APIs
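As one concrete slice of the auth stack: after a Google/Microsoft sign-in, the backend issues its own JWT, which a NestJS guard verifies on each request. The sketch below follows the standard NestJS pattern; the guard name and payload shape are illustrative:

```typescript
// Sketch: verify the backend-issued JWT and attach its payload (including
// the role used for teacher/admin portal access) to the request.
import { CanActivate, ExecutionContext, Injectable, UnauthorizedException } from "@nestjs/common";
import { JwtService } from "@nestjs/jwt";
import { Request } from "express";

@Injectable()
export class JwtAuthGuard implements CanActivate {
  constructor(private readonly jwt: JwtService) {}

  async canActivate(ctx: ExecutionContext): Promise<boolean> {
    const req = ctx.switchToHttp().getRequest<Request>();
    const token = req.headers.authorization?.replace("Bearer ", "");
    if (!token) throw new UnauthorizedException();
    try {
      (req as any).user = await this.jwt.verifyAsync(token); // throws if invalid/expired
      return true;
    } catch {
      throw new UnauthorizedException();
    }
  }
}
```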
Real-world Features & Highlights
- AI Question Generation → saves teachers 70% of prep time.
- Handwritten Answer Evaluation → faster grading with teacher override.
- Semantic Search (ChromaDB) → find relevant content instantly.
- Admin & Teacher Portals → role-based access and management.
- Performance Reports → actionable insights for teachers & institutions.

Results & Impact
- 80% reduction in time spent preparing exams.
- 3x faster grading compared to manual evaluation.
- Teachers praised the control over AI outputs (“AI does the heavy lifting, I refine”).
- Institutions gained student performance analytics that weren't possible before.
- Demo available: EduAIssist Live

Challenges & Learnings
- OCR accuracy: Variability in handwriting made evaluation tricky.
- Prompt engineering: Required structured prompts plus fallback logic to maintain output quality (sketched below).
- User adoption: Teachers preferred gradual AI adoption (assistive, not fully autonomous).
- Learned that AI + human override is the sweet spot in EdTech.
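The fallback pattern was essentially "ask for structured output, validate, tighten, retry." A simplified sketch, assuming Ollama's `format: "json"` option; the `Question` shape and retry count are illustrative:

```typescript
// Sketch: request JSON from the model, validate the shape, and retry with a
// stricter prompt when parsing or validation fails.
type Question = { text: string; marks: number };

async function generate(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "mistral", prompt, stream: false, format: "json" }),
  });
  return (await res.json()).response;
}

export async function questionsAsJson(material: string, retries = 2): Promise<Question[]> {
  const base =
    `Return a JSON array of objects with "text" and "marks" fields: ` +
    `5 exam questions based on:\n${material}`;
  let prompt = base;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      const parsed = JSON.parse(await generate(prompt));
      if (Array.isArray(parsed) && parsed.every((q) => typeof q?.text === "string")) {
        return parsed as Question[];
      }
    } catch {
      // invalid JSON; fall through to the stricter retry below
    }
    // Fallback: restate the format requirement more forcefully and retry.
    prompt = "Respond with ONLY a valid JSON array, no prose.\n" + base;
  }
  throw new Error("Model did not return valid JSON after retries");
}
```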
Next Steps
- Expand multi-language support for question generation.
- Add student self-assessment portal.
- Integrate payments & subscriptions for schools.
- Deploy mobile app version for easier access.

Call to Action
If you want to build an AI-powered education platform like EduAIssist, reach out at whiz-cloud.com.