ThinkBot

Designed a persuasive writing plugin that supports critical thinking
Hero image
Overview
Students and professionals often use AI passively, copying outputs without reflection. I co-created ThinkBot to encourage active questioning, leading design and usability testing to shape a lightweight tool that nudges users toward deeper critical thinking without adding friction.
Role
UX Designer
Client
Course project (Persuasive Design, CMU HCII)
Duration
Apr - May 2025
Key Skills
Persuasive Design · Usability Testing · Prototyping · Figma · ChatGPT
The Problem

Combating Cognitive Drift

Students and early-career professionals increasingly treat AI as an “answer machine.” Copy-paste culture encourages System 1 thinking — fast, intuitive, and unchecked — while bypassing the deeper evaluation, reflection, and judgment we need for critical thinking.

Our challenge was to design a companion that interrupts this drift and helps users slow down, question, and think.
Research & Discovery

Finding the Right Nudge

We grounded our work in research on persuasive technology and cognitive science. The key insight: AI is excellent at generating and analyzing content, but rarely supports higher-order skills like evaluation and synthesis.

At first, we imagined a “Socratic GPT” that would constantly ask users questions. But prior studies showed that overly assertive interventions frustrate people. Instead, we pivoted: rather than the AI interrogating the user, we’d empower the user to probe the AI.

That shift — subtle, respectful, and user-initiated — became our north star.

Design & Iteration

When Ambition Meets Overload

Our first prototype was ambitious — packed with colors, scores, links, and prompts — but usability testing revealed that more wasn’t better. Users liked the idea of “dig deeper” nudges, yet too many cues created overload and hesitation. We realized we needed to scope back, focusing only on interventions that guided reflection without demanding extra effort.

 Low-fidelity wireframes with various highlights, pop-ups, and scores that overwhelmed users during testing.
Our first prototype was ambitious, with multiple cues that caused user overload

Simplifying for Focus

We then streamlined the system to two highlight types — orange for objective claims and teal for subjective statements — supported by a short onboarding flow. We also introduced an interaction model inspired by Bloom’s Taxonomy, giving users simple cognitive “lenses” to choose from. Testing showed the experience was clearer, but accessibility gaps (like icons relying too heavily on color) still caused friction.

Mid-fidelity wireframes with orange/teal highlights and a reflection wheel, annotated with user feedback on clarity and accessibility issues.
We simplified the system to address feedback on clarity and accessibility

Refining for Accessibility

In the final iteration, we added clear text labels, improved contrast, and introduced a recap feature to support reflection without slowing the user down. Still, we discovered a subtle problem: the “ask/answer” buttons felt like homework. Even thoughtful nudges can feel like chores if they aren’t natural to the flow — a key insight that shaped our future direction.

High-fidelity mockups with a text-labeled mode selector, recap summaries, and improved accessibility features.
The final design focused on accessibility and a natural user flow, empowering users to probe the AI
The Solution

A Subtle Nudge toward Deeper Thinking

The final version of ThinkBot came together as a set of lightweight features that fit naturally into the AI workflow:

Color-coded highlights

Orange marks objective claims worth fact-checking, while teal flags subjective statements for personal scrutiny.

A screenshot of an AI interface with a paragraph of text. A sentence is highlighted in orange with the label "Objective Information" and another in teal with the label "Subjective Information."

Bloom’s-inspired reflection wheel

Hovering over a highlight opens six cognitive lenses (Remembering → Creating) that prompt deeper engagement.

A set of small white cards showing a circular "Think Modes" wheel. Each card shows one of the six sections — Remembering, Understanding, Applying, Analyzing, Evaluating, and Creating — highlighted in a different color.

Onboarding flow

A short guide explains the highlight system so users know how to interpret cues.

A two-part pop-up window. The left pop-up titled "Evaluating" explains what the feature is and what users can do. The right pop-up prompts the user to either "Ask Your Own" or "Answer Suggested" questions.

Recap summaries

At the end of a session, users receive a concise overview to reinforce reflection without slowing them down.

Two pop-up windows. The left one asks "What's on your mind?" with a question prompt. The right one provides a "Quick Recap" of the user's contributions and a summary of the topic.
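
To make the interaction model concrete, here is a minimal TypeScript sketch of how the two highlight types and the Bloom’s-inspired think modes might be represented in a plugin. ThinkBot itself was prototyped in Figma, so every name here (HighlightKind, ThinkMode, openLens) and all prompt wording are illustrative assumptions, not the shipped design.

// Illustrative sketch only: ThinkBot was prototyped in Figma, so the types,
// function names, and prompt wording below are hypothetical assumptions.

// The two highlight types: orange (objective) and teal (subjective).
type HighlightKind = "objective" | "subjective";

// The six Bloom's-inspired cognitive lenses on the reflection wheel.
type ThinkMode =
  | "Remembering"
  | "Understanding"
  | "Applying"
  | "Analyzing"
  | "Evaluating"
  | "Creating";

interface Highlight {
  kind: HighlightKind;
  text: string; // the AI-generated sentence being flagged
}

// Hypothetical prompt templates: each lens turns a flagged sentence into a
// question the user can choose to explore — or ignore.
const promptTemplates: Record<ThinkMode, (h: Highlight) => string> = {
  Remembering: (h) => `What do you already know about: "${h.text}"?`,
  Understanding: (h) => `Can you restate this in your own words: "${h.text}"?`,
  Applying: (h) => `Where could you apply "${h.text}" in your own work?`,
  Analyzing: (h) => `What assumptions sit behind: "${h.text}"?`,
  Evaluating: (h) =>
    h.kind === "objective"
      ? `How would you fact-check: "${h.text}"?`
      : `Do you agree with: "${h.text}"? Why or why not?`,
  Creating: (h) => `What alternative would you propose to: "${h.text}"?`,
};

// Opening a lens is always user-initiated, mirroring the design decision to
// let the user probe the AI rather than the other way around.
function openLens(mode: ThinkMode, highlight: Highlight): string {
  return promptTemplates[mode](highlight);
}

// Example: an orange (objective) highlight viewed through the Evaluating lens.
console.log(
  openLens("Evaluating", {
    kind: "objective",
    text: "Electric cars always have a lower lifetime carbon footprint.",
  })
);

The property this sketch preserves is that nothing fires automatically: a prompt only appears when the user opens a lens, keeping the nudge optional and user-initiated.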
Impact

Subtle Nudges Can Shift Behavior

The biggest impact of ThinkBot was showing that subtle, user-triggered interventions can change how people interact with AI without breaking their flow. Through testing, we saw that when highlights and prompts were lightweight and optional, users naturally slowed down to reflect instead of just copying. This was a powerful validation that critical thinking can be encouraged not by forcing behavior, but by gently guiding it.

At the same time, we learned the limits — if nudges feel like assignments, they lose their effectiveness. This balance between subtle support and user autonomy became the project’s most important takeaway.

Reflection

Designing for Human-AI Collaboration

Looking back, what makes me proudest is that our design really did meet the challenge we set out to solve: helping people pause, reflect, and not just copy whatever AI gives them. We didn’t create a perfect solution — some ideas still felt like homework — but we proved that small, thoughtful nudges can shift behavior in meaningful ways.

This project also reminded me that good persuasive design doesn’t shout. It respects the user, fits into their flow, and quietly encourages better habits. For me, ThinkBot isn’t just a course project. It’s a glimpse of the kind of human-AI partnership I want to keep designing for — one where technology doesn’t replace our thinking, but helps us think deeper, learn better, and feel more in control.

Other projects


Word Tag Assessments

Word Tag · METALS Capstone · MVP & Game-based Learning
Turned testing into play by designing research-based assessments that kids found engaging and motivating

Lingo Buddy

CMU · Course Project · App UX & GenAI
Designed a chat-based learning app where non-native speakers practice slang directly with AI in context