
Building Pullman Bus Tracker: A Modern React Native App with AI Integration
Pullman Bus Tracker is a mobile app built to make transit in Pullman, Washington feel fast, intuitive, and genuinely pleasant to use. Instead of treating this as a utility-only problem, we treated it like a product and UX problem, with performance and reliability as non-negotiables.
The app renders real-time vehicles on a live map, shows arrival times with minimal friction, supports favorites for instant access, and includes an AI chat that understands normal language like “where’s the blue bus?”. This post breaks down the engineering and design decisions behind the feel.
The Problem We Were Solving
Pullman is a small college town where students and residents rely on buses every day. The official app technically worked, but it felt dated, slow, and disconnected from modern mobile UX expectations. We wanted to reduce friction and build something people would open instinctively.
The goals
- Render buses moving on a map in real time
- Make arrival times instantly scannable
- Provide favorites for quick access to stops and routes
- Understand queries like “where’s the blue route?”
Tech Stack
We optimized for developer velocity, platform fidelity, and long-term maintainability. Every dependency had to earn its keep.
- Core: fast iteration, native APIs, type safety
- UI and Motion: design tokens, blur materials, and physics-based animation
- Maps and Location: real-time rendering, geolocation, and tactile feedback
- State: predictable state with persistence
- AI: intent parsing, tool execution, and streamed responses
The Visual Language: Squircles, Blur, and Spring Physics
We wanted the UI to feel unmistakably iOS native. That meant adopting system conventions for corners, materials, and motion, then applying them consistently across the app.
Continuous corners
iOS-style corners are not perfect circles. They are continuous curves. Using borderCurve: 'continuous' and consistent radii across surfaces makes components feel system-level, not custom-painted.
const styles = StyleSheet.create({
  blurContainer: {
    borderRadius: 28,
    borderWidth: 1.5,
    overflow: 'hidden',
    borderCurve: 'continuous',
  },
});
Blur as a material
Blur was treated like a real surface. Callouts and overlays use adaptive tinting so light mode feels airy, and dark mode feels dense and intentional.
<AnimatedBlurView
  intensity={blurIntensity}
  tint={isDark ? 'systemChromeMaterialDark' : 'systemChromeMaterialLight'}
  style={[
    styles.blurContainer,
    { borderColor: isDark ? 'rgba(255,255,255,0.3)' : 'rgba(0,0,0,0.1)' }
  ]}
>
  {/* callout content */}
</AnimatedBlurView>
Physics-based motion
Springs became the default. Shared spring parameters keep the app’s motion consistent across callouts, modals, and micro-interactions. This is the difference between “animated UI” and “responsive UI”.
const springConfig = {
  mass: 1,
  damping: 8,
  stiffness: 200,
  restSpeedThreshold: 0.5,
  overshootClamping: false,
  restDisplacementThreshold: 0.5,
};
Building the Live Map: Real-Time Buses and Blurred Callouts
The live map is the core loop. It had to render smoothly, read clearly, and update frequently without destroying battery.
Blurred callouts with coordinated motion
Callouts animate in using a coordinated opacity, scale, and translation spring. The motion is quick, soft, and consistent across the app.
export default function MapBlurredCallout({
  title,
  description,
  onPress,
  blurIntensity = 60,
  width = 280,
}: MapBlurredCalloutProps) {
  const opacity = useSharedValue(0);
  const scale = useSharedValue(0.8);
  const translateY = useSharedValue(10);

  useEffect(() => {
    opacity.value = withSpring(1, springConfig);
    scale.value = withSpring(1, springConfig);
    translateY.value = withSpring(0, springConfig);
  }, []);
  // ...animated styles and render omitted
}
Polling only when it matters
Live updates only run while the map screen is focused. When the user navigates away, polling stops. That preserves battery and avoids hidden background work.
useFocusEffect(
  useCallback(() => {
    if (!selectedRoute) return;
    fetchLiveBusesForRoute();
    const interval = setInterval(fetchLiveBusesForRoute, 10000);
    return () => clearInterval(interval);
  }, [selectedRoute])
);
Animations That Feel Natural
Animation is a UX tool. We used it to communicate state changes, reinforce hierarchy, and make interactions feel immediate, not to show off.
Typing indicator with a breathing rhythm
The AI typing indicator uses staggered opacity loops to feel alive without being distracting.
const dot1 = useSharedValue(0.3);
const dot2 = useSharedValue(0.3);
const dot3 = useSharedValue(0.3);

useEffect(() => {
  dot1.value = withRepeat(
    withTiming(1, { duration: 600, easing: Easing.inOut(Easing.ease) }),
    -1,
    true
  );
  // Stagger the other two dots, and clear the timers on unmount.
  const t2 = setTimeout(() => {
    dot2.value = withRepeat(/* same animation */);
  }, 200);
  const t3 = setTimeout(() => {
    dot3.value = withRepeat(/* same animation */);
  }, 400);
  return () => {
    clearTimeout(t2);
    clearTimeout(t3);
  };
}, []);
Staggered entrances for hierarchy
Containers animate first, then list items cascade in. It makes dense UI easier to parse and gives the interface a calm, intentional feel.
useEffect(() => {
  containerOpacity.value = withDelay(100, withTiming(1, { duration: 300 }));
  containerTranslateY.value = withDelay(100, withSpring(0, SPRING_CONFIG));

  languageItemAnimations.forEach((animation, index) => {
    animation.opacity.value = withDelay(
      400 + index * 100,
      withTiming(1, { duration: 300 })
    );
    animation.translateY.value = withDelay(
      400 + index * 100,
      withSpring(0, SPRING_CONFIG)
    );
  });
}, []);
The Favorites Experience: Pull-to-Refresh That Feels Physical
The favorites screen uses a custom pull-to-refresh interaction. The indicator tracks your gesture, then locks in with a spring once the threshold is crossed. It turns refresh into a tactile interaction instead of a generic spinner.
Gesture-driven pull indicator
const indicatorStyle = useAnimatedStyle(() => {
  if (isRefreshing.value === 1) {
    return { transform: [{ translateY: 0 }] };
  }
  const translateY = interpolate(
    scrollY.value,
    [-150, -PULL_THRESHOLD, 0],
    [INDICATOR_HEIGHT, 0, -INDICATOR_HEIGHT],
    Extrapolate.CLAMP
  );
  return { transform: [{ translateY }] };
});
Customization: Tiny Features, Big Delight
Users can customize favorite symbols using SF Symbols and a color picker. Every selection triggers a soft haptic and a subtle icon animation. Small feedback loops like this compound into a noticeable sense of quality.
<View style={styles.grid}>
  {SYMBOL_OPTIONS.map(opt => {
    const active = opt.value === favoriteSymbol;
    return (
      <Pressable
        key={opt.value}
        onPress={() => {
          setFavoriteSymbol(opt.value);
          Haptics.impactAsync(Haptics.ImpactFeedbackStyle.Soft);
        }}
        style={[styles.option, { opacity: active ? 1 : 0.65 }]}
      >
        <IconSymbol
          size={40}
          name={opt.value}
          color={active ? favoriteSymbolColor : isDark ? '#f3f4f6' : '#4b5563'}
          animationSpec={{ effect: { type: 'scale' } }}
        />
      </Pressable>
    );
  })}
</View>
The AI That Actually Understands
The AI chat is built like a small system, not a single massive prompt. We structured the pipeline to make behavior reliable and debuggable.
The pipeline
- Message Clarifier: expands short queries into explicit intent
- Router Agent: extracts intent and entities
- Tool Runner: fetches arrivals, vehicles, stops, and routes
- Answer Agent: turns tool output into a readable response
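The four stages above can be sketched as a simple composition. The stage implementations here are stand-ins for illustration, not the app's actual agents; function names and the placeholder coordinates are assumptions.

```typescript
type Intent = { action: string; entity: string };

// Stage 1: clarifier expands terse queries into explicit intent text.
function clarify(message: string): string {
  return message.toLowerCase().replace(/^where'?s/, 'what is the current location of');
}

// Stage 2: router extracts a structured intent from the clarified text.
function route(clarified: string): Intent {
  const entity = clarified.match(/(blue|red|green)\s+(bus|line|route)/)?.[0] ?? 'unknown';
  return { action: clarified.includes('location') ? 'locate_vehicle' : 'arrivals', entity };
}

// Stage 3: tool runner fetches data for the intent (mocked here).
function runTools(_intent: Intent): { lat: number; lon: number } {
  return { lat: 46.73, lon: -117.18 }; // placeholder coordinates
}

// Stage 4: answer agent turns tool output into a readable reply.
function answer(intent: Intent, data: { lat: number; lon: number }): string {
  return `The ${intent.entity} is near ${data.lat}, ${data.lon}.`;
}

function handleMessage(message: string): string {
  const clarified = clarify(message);
  const intent = route(clarified);
  return answer(intent, runTools(intent));
}
```

Keeping each stage a pure function like this is what makes the behavior debuggable: any stage can be tested in isolation with a plain string or object.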
Semantic search with embeddings
Transit names are inconsistent in the real world. Users say “blue bus”, “blue line”, or “route 1”. Embeddings let us match meaning, not exact strings.
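Once route descriptions are embedded, matching reduces to a nearest-neighbor search over vectors. A minimal sketch using cosine similarity (how the embeddings are generated is assumed to happen elsewhere):

```typescript
// Cosine similarity between two embedding vectors of equal length.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Pick the route whose precomputed embedding is closest to the query's.
function bestMatch(
  queryEmbedding: number[],
  routes: { name: string; embedding: number[] }[]
): string {
  let best = routes[0];
  for (const route of routes) {
    if (cosineSimilarity(queryEmbedding, route.embedding) >
        cosineSimilarity(queryEmbedding, best.embedding)) {
      best = route;
    }
  }
  return best.name;
}
```

This is why “blue bus”, “blue line”, and “route 1” can all resolve to the same route: their embeddings land near the same description vector even though the strings differ.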
const identityText = `Route: ${route.name}`;
const coverageText = `This route serves: ${keyStops.join(', ')}.
It connects ${route.stops[0].name} to ${route.stops[route.stops.length - 1].name}.`;
Platform-Native Feel: iOS 26 Plus and Beyond
We did not try to make iOS and Android identical. We aimed for native correctness. On newer iOS versions, native tabs can minimize on scroll and use system animations.
const isIOS26Plus = React.useMemo(() => {
  if (Platform.OS !== 'ios') return false;
  const version = Platform.Version as string;
  const majorVersion = parseInt(version.split('.')[0], 10);
  return majorVersion >= 26;
}, []);

if (Platform.OS === 'ios' && isIOS26Plus) {
  return <NativeTabs minimizeBehavior="onScrollDown">{/* ... */}</NativeTabs>;
}
The Little Details That Matter
Haptic feedback
Haptics are tuned by interaction weight. Small actions get light feedback, selections get a crisp tick, and bigger actions get a stronger impact.
Haptics.impactAsync(Haptics.ImpactFeedbackStyle.Light);  // small actions
Haptics.selectionAsync();                                // selection ticks
Haptics.impactAsync(Haptics.ImpactFeedbackStyle.Medium); // bigger actions
Dark mode everywhere
Dark mode is more than colors. Blur materials, borders, and contrast all adapt so the UI feels intentional, not inverted.
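One way to keep that adaptation consistent is to centralize it in a token function rather than scattering ternaries through components. This sketch reuses the tint and border values shown earlier; the structure and the light text color are illustrative assumptions.

```typescript
type SurfaceTokens = {
  blurTint: 'systemChromeMaterialLight' | 'systemChromeMaterialDark';
  borderColor: string;
  textColor: string;
};

// Dark mode as a coherent token set, not a color inversion.
function surfaceTokens(isDark: boolean): SurfaceTokens {
  return isDark
    ? {
        // Denser material with a brighter hairline border.
        blurTint: 'systemChromeMaterialDark',
        borderColor: 'rgba(255,255,255,0.3)',
        textColor: '#f3f4f6',
      }
    : {
        // Airier material with a faint border.
        blurTint: 'systemChromeMaterialLight',
        borderColor: 'rgba(0,0,0,0.1)',
        textColor: '#4b5563',
      };
}
```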
What We Learned
- Start with the visual language: corners, blur, and motion propagate into every screen.
- Springs beat easing curves for UI: shared spring configs make motion feel consistent and physical.
- Platform differences are a strength: native correctness beats forced consistency.
- AI needs structure: split agents make behavior reliable and easy to debug.
- Details compound: small improvements stack into a premium feel.
Final Thoughts
Building Pullman Bus Tracker taught us that great apps aren’t just about functionality. They’re about feel. The continuous corners, blur materials, spring physics, and haptics are each small on their own, but together they create an experience people enjoy using.
Every decision was intentional. Every interaction was tuned. The result is an app that does not just work. It feels good to use.