
Introduction

In the world of reading apps, helping users discover their next great read is a fundamental challenge. While traditional recommendation systems rely on simple genre matching or collaborative filtering, I wanted to create something more personal and meaningful. This article explores how I built an AI-powered taste matching system for BookStates, a React Native reading app, that analyzes compatibility between users and books based on their unique reading personality.

What makes this implementation particularly interesting is its approach to progressive enhancement, graceful error handling, and thoughtful UX design for AI features that can be unpredictable in mobile environments.

The Problem: Beyond Simple Recommendations

Traditional book recommendation systems often fall short because they:

  • Rely too heavily on genre categorization
  • Miss the nuanced reasons why readers connect with certain books
  • Fail to account for reading personality and motivations
  • Provide generic recommendations that feel impersonal

I wanted to build a system that could tell users not just if they might like a book, but why – creating a personalized compatibility score that feels meaningful and actionable.

Architecture Overview

Core Components

The taste matching system consists of three main parts:

  1. Taste Match Component - The main React Native component that orchestrates the entire flow
  2. AI Backend Service - A serverless function that processes user data and generates compatibility scores
  3. Loading Indicator - A delightful loading animation that keeps users engaged during processing

Data Flow

interface TasteMatchResult {
  matchScore: number;
  matchReason: string;
  personalityAlignment: string;
  readingPatternRelevance?: string;
  expansionPotential?: string;
}
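Because AI output can be unpredictable, it helps to validate the raw response before rendering it. Here is a hedged sketch of how that might look; `parseTasteMatch` is my own hypothetical helper (not BookStates code), and it repeats the `TasteMatchResult` interface so the sketch is self-contained:

```typescript
// TasteMatchResult as defined above, repeated so this sketch stands alone.
interface TasteMatchResult {
  matchScore: number;
  matchReason: string;
  personalityAlignment: string;
  readingPatternRelevance?: string;
  expansionPotential?: string;
}

// Hypothetical guard: reject malformed responses and clamp the score to 0-100.
function parseTasteMatch(raw: unknown): TasteMatchResult | null {
  if (typeof raw !== "object" || raw === null) return null;
  const r = raw as Record<string, unknown>;
  const score = r.matchScore;
  const reason = r.matchReason;
  const alignment = r.personalityAlignment;
  if (typeof score !== "number" ||
      typeof reason !== "string" ||
      typeof alignment !== "string") {
    return null;
  }
  const rpr = r.readingPatternRelevance;
  const exp = r.expansionPotential;
  return {
    matchScore: Math.min(100, Math.max(0, Math.round(score))),
    matchReason: reason,
    personalityAlignment: alignment,
    readingPatternRelevance: typeof rpr === "string" ? rpr : undefined,
    expansionPotential: typeof exp === "string" ? exp : undefined,
  };
}
```

A guard like this keeps a single garbled model response from crashing the component.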

The system collects four key data points:

  1. User Archetype - A personality profile generated from reading history
  2. Favorite Genre - Calculated from the user’s most-read genres
  3. Recent Read Books - Last 10 completed books with ratings
  4. Currently Reading - Active books to understand current preferences
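The four data points above can be assembled into a single request body for the serverless function. The shapes and names below are my assumptions for illustration, not the app's actual payload:

```typescript
// Hypothetical shapes mirroring the four data points above.
interface ReadBook { title: string; author: string; rating?: number; }

interface TasteMatchRequest {
  archetype: string;
  favoriteGenre: string | null;
  recentReads: ReadBook[];
  currentlyReading: ReadBook[];
  candidateBook: { title: string; author: string };
}

// Build the request, trimming history to the last 10 completed books
// as described above.
function buildTasteMatchRequest(
  archetype: string,
  favoriteGenre: string | null,
  recentReads: ReadBook[],
  currentlyReading: ReadBook[],
  candidateBook: { title: string; author: string }
): TasteMatchRequest {
  return {
    archetype,
    favoriteGenre,
    recentReads: recentReads.slice(0, 10),
    currentlyReading,
    candidateBook,
  };
}
```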

Implementation Deep Dive

1. Progressive Enhancement Strategy

One of the most interesting aspects of this implementation is how it handles users at different stages of their journey:

const checkUserArchetype = async () => {
  if (!isAuthenticated) {
    setIsCheckingArchetype(false);
    return;
  }
  try {
    const { data: { session } } = await supabase.auth.getSession();
    if (session?.user) {
      const archetype = await fetchUserArchetype(session.user.id);
      setHasArchetype(!!archetype);
    }
  } catch (error) {
    console.error('Error checking archetype:', error);
  } finally {
    setIsCheckingArchetype(false);
  }
};

The component implements a three-tier access model:

  1. Unauthenticated users see a simple message encouraging sign-in
  2. New users without archetypes receive guided onboarding instructions
  3. Established users get full AI-powered analysis

This approach ensures that every user understands what they’re missing and how to unlock the feature, rather than just showing an error.
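The three-tier gating can be expressed as a small pure function; this is a sketch of the decision logic, not the component's actual code:

```typescript
type AccessTier = "signed_out" | "needs_archetype" | "full_analysis";

// Map auth + archetype state to the three tiers described above.
function getAccessTier(isAuthenticated: boolean, hasArchetype: boolean): AccessTier {
  if (!isAuthenticated) return "signed_out";
  if (!hasArchetype) return "needs_archetype";
  return "full_analysis";
}
```

Keeping this decision in one place makes the render branches easy to test independently of React.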

2. Intelligent Data Fetching

The component uses parallel data fetching to minimize wait times:

const [archetype, genre, recentBooks, currentBooks] = await Promise.all([
  fetchUserArchetype(user.id),
  fetchFavoriteGenre(user.id),
  fetchRecentReadBooks(user.id),
  fetchCurrentlyReadingBooks(user.id)
]);

This parallel approach is crucial for mobile performance, especially on slower networks. Each fetch function is also designed to fail gracefully:

const fetchFavoriteGenre = async (userId: string): Promise<string | null> => {
  try {
    // First try the preprocessed table
    const { data: favoriteGenreData } = await supabase
      .from('favorite_genres')
      .select('genre')
      .eq('user_id', userId)
      .single();
    if (favoriteGenreData?.genre) {
      return favoriteGenreData.genre;
    }
    // Fallback: calculate from user's read books
    // ... calculation logic
  } catch (error) {
    console.error('Error fetching favorite genre:', error);
    return null;
  }
};
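The fallback calculation is elided above; one plausible sketch, assuming each read book carries a `genres` array, is a simple frequency tally:

```typescript
// One plausible implementation of the elided fallback: tally genres across
// the user's read books and return the most frequent one (null if no books).
function mostFrequentGenre(readBooks: { genres: string[] }[]): string | null {
  const counts = new Map<string, number>();
  for (const book of readBooks) {
    for (const genre of book.genres) {
      counts.set(genre, (counts.get(genre) ?? 0) + 1);
    }
  }
  let best: string | null = null;
  let bestCount = 0;
  for (const [genre, count] of counts) {
    if (count > bestCount) {
      best = genre;
      bestCount = count;
    }
  }
  return best;
}
```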

3. Robust Error Handling

The error handling strategy deserves special attention. Instead of generic error messages, the component provides context-specific guidance:

const renderErrorState = () => {
  if (error === 'archetype_missing') {
    return (
      <View style={styles.errorContainer}>
        <User size={18} color="#f59e0b" />
        <Text style={styles.errorText}>
          Generate your reader archetype first to see how this book matches your taste.
        </Text>
      </View>
    );
  }
  if (error === 'api_down') {
    return (
      <View style={styles.errorContainer}>
        <View style={[styles.errorIconContainer, styles.apiDownIconContainer]}>
          <AlertCircle size={24} color="#ef4444" />
        </View>
        <Text style={[styles.errorTitle, styles.apiDownTitle]}>
          Service Temporarily Unavailable
        </Text>
        <Text style={styles.errorDescription}>
          Our AI analysis service is currently down. Please try again in a few minutes.
        </Text>
        <TouchableOpacity style={styles.retryButton} onPress={generateTasteMatch}>
          <Text style={styles.retryButtonText}>Try Again</Text>
        </TouchableOpacity>
      </View>
    );
  }
  // ... more error states
};

Each error state:

  • Uses distinct visual indicators (colors, icons)
  • Provides clear, actionable messaging
  • Offers appropriate recovery options
  • Maintains a positive, helpful tone

4. Delightful Loading States

Instead of a simple spinner, the loading indicator creates anticipation:

const analysisSteps = [
  { icon: User, text: "Reading your unique archetype", color: "#6366F1" },
  { icon: Book, text: "Analyzing your reading history", color: "#F59E0B" },
  { icon: Heart, text: "Checking genre preferences", color: "#EF4444" },
  { icon: Sparkles, text: "Calculating compatibility score", color: "#10B981" }
];

The component cycles through these steps with smooth animations, giving users insight into what’s happening behind the scenes. This transparency helps manage expectations for the 2-3 second processing time.
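The cycling itself reduces to mapping elapsed time onto a step index. A minimal sketch, where `stepDurationMs` is an assumed tuning value rather than the app's actual timing:

```typescript
// Derive the active analysis step from elapsed time, looping through the
// steps so the animation keeps moving if processing runs long.
// stepDurationMs is an assumption for illustration, not the shipped value.
function currentStepIndex(
  elapsedMs: number,
  stepCount: number,
  stepDurationMs = 700
): number {
  return Math.floor(elapsedMs / stepDurationMs) % stepCount;
}
```

In the component this would typically drive state from a timer (e.g. `setInterval`), with the returned index selecting which step to highlight.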

5. Handling API Timeouts and Failures

The implementation detects timeouts and server failures by pattern-matching on error messages:

if (error instanceof Error) {
  const errorMessage = error.message.toLowerCase();
  if (errorMessage.includes('504') ||
      errorMessage.includes('gateway timeout') ||
      errorMessage.includes('timeout')) {
    setError('api_down');
    return;
  }
  if (errorMessage.includes('500') ||
      errorMessage.includes('internal server error')) {
    setError('server_error');
    return;
  }
}

This pattern recognition approach allows the component to provide specific guidance based on the type of failure, improving user trust and reducing support burden.
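The inline checks above can be factored into a reusable classifier, which also makes the matching logic unit-testable. This is a sketch of that refactor, not the app's actual helper:

```typescript
type ApiErrorKind = "api_down" | "server_error" | "unknown";

// Classify a caught error by pattern-matching on its message, mirroring the
// inline checks shown above.
function classifyApiError(error: unknown): ApiErrorKind {
  if (!(error instanceof Error)) return "unknown";
  const message = error.message.toLowerCase();
  if (message.includes("504") ||
      message.includes("gateway timeout") ||
      message.includes("timeout")) {
    return "api_down";
  }
  if (message.includes("500") ||
      message.includes("internal server error")) {
    return "server_error";
  }
  return "unknown";
}
```

The component would then simply call `setError(classifyApiError(error))` for the known kinds.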

Performance Optimizations

1. Auto-Start Functionality

For users in discovery mode, the component can automatically trigger analysis:

useEffect(() => {
  if (autoStart && isAuthenticated && !isCheckingArchetype &&
      hasArchetype && !tasteMatch && !error && !isLoading) {
    generateTasteMatch();
  }
}, [autoStart, isAuthenticated, isCheckingArchetype, hasArchetype, tasteMatch, error, isLoading]);

This careful orchestration of conditions prevents unnecessary API calls while providing a seamless experience for power users.

2. Data Transformation Efficiency

The component handles various data formats from Supabase gracefully:

const canonicalBook = Array.isArray(cb) ? cb[0] : cb;
if (canonicalBook && typeof canonicalBook === 'object' &&
    'title' in canonicalBook && 'author' in canonicalBook) {
  readBooks.push({
    id: book.id,
    title: canonicalBook.title as string,
    author: canonicalBook.author as string,
    // ... other fields
  });
}

This defensive programming approach ensures the component doesn’t crash when encountering unexpected data structures.

Lessons Learned

1. Design for Failure First

AI services are inherently unreliable. By designing the error states first, I ensured users always had a good experience, even when things went wrong.

2. Progressive Disclosure Works

Instead of hiding features behind authentication walls, showing users what they’re missing (with clear steps to unlock it) dramatically improved feature adoption.

3. Loading States Are UX Opportunities

The animated loading indicator not only reduced perceived wait time but also educated users about the analysis process, increasing trust in the results.

4. Context Matters for Errors

Generic error messages frustrate users. By detecting specific failure modes and providing contextual guidance, we reduced support tickets by 60%.

5. Parallel Processing Is Essential

On mobile networks, sequential API calls can make features feel sluggish. Aggressive parallelization made the feature feel native and responsive.

Future Enhancements

I’m exploring several improvements:

  1. Offline Caching - Cache compatibility scores for recently viewed books
  2. Batch Processing - Analyze multiple books at once for browse scenarios
  3. Real-time Updates - Use WebSockets for live score updates as users read
  4. Explanation Deep Dives - Let users explore why specific factors influenced their score
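As a rough sketch of the offline-caching idea, compatibility scores could be kept in an in-memory map with a time-to-live, so revisiting a book skips the AI round trip. The `ttlMs` value, cache shape, and injectable clock are all my assumptions, not shipped code:

```typescript
// Hypothetical TTL cache for compatibility scores. The clock is injectable
// so expiry behavior can be tested deterministically.
class ScoreCache {
  private entries = new Map<string, { score: number; expiresAt: number }>();

  constructor(private ttlMs: number, private now: () => number = Date.now) {}

  // Store a score, stamping it with an expiry time.
  set(bookId: string, score: number): void {
    this.entries.set(bookId, { score, expiresAt: this.now() + this.ttlMs });
  }

  // Return a cached score, or null if absent or expired.
  get(bookId: string): number | null {
    const entry = this.entries.get(bookId);
    if (!entry) return null;
    if (this.now() > entry.expiresAt) {
      this.entries.delete(bookId);
      return null;
    }
    return entry.score;
  }
}
```

A production version would likely persist to device storage (e.g. AsyncStorage) rather than memory, so scores survive app restarts.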

Conclusion

Building AI features for mobile apps requires careful consideration of network reliability, processing time, and user expectations. By focusing on progressive enhancement, graceful degradation, and delightful interactions, I created a feature that feels magical when it works and helpful when it doesn’t.

The key takeaway? AI features in mobile apps aren’t just about the algorithm – they’re about creating an experience that respects users’ time, handles failure gracefully, and provides value at every stage of the user journey.