Building Pullman Bus Tracker: A Modern React Native App

Sheheryar Pirzada · December 20, 2024

React Native · Expo · AI · Mobile Development · TypeScript · Zustand

"I didn't treat this as a utility problem. I treated it like a product and UX problem, with performance as a non-negotiable."

Pullman Bus Tracker renders real-time vehicles on a live map, shows arrival times with minimal friction, supports favorites for instant access, and includes an AI chat that understands natural language queries like "where's the blue bus?" This post breaks down the engineering and design decisions behind the feel: the corners, the blur, the physics, and the pipeline.

01

The Visual Language

I wanted the UI to feel unmistakably iOS-native. That meant adopting system conventions for corners, materials, and motion, then applying them consistently across every screen.

Corners

Continuous curves

iOS corners are not circles. They're continuous curves. Using borderCurve: 'continuous' makes surfaces feel system-level.
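As a minimal sketch, a shared surface style might carry the continuous curve so every card and callout inherits it. The style is typed locally here so the snippet stands alone; the token values are illustrative, not the app's actual numbers.

```typescript
// Shared surface style using React Native's `borderCurve: 'continuous'`.
// On iOS this draws a continuous (squircle) corner like system surfaces;
// Android ignores the property and falls back to circular corners.
type SurfaceStyle = {
  borderRadius: number;
  borderCurve: "continuous" | "circular";
  overflow: "hidden" | "visible";
};

const cardSurface: SurfaceStyle = {
  borderRadius: 20, // illustrative radius
  borderCurve: "continuous", // the squircle curve described above
  overflow: "hidden", // clip blur/image content to the curved edge
};
```

Spreading one shared object like this into every surface is what keeps the corners consistent across screens.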

Material

Blur as surface

Adaptive tint: light mode feels airy, dark mode dense. systemChromeMaterial adapts automatically.
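One way to sketch the adaptive material is a small helper that returns the props a BlurView would receive per color scheme. The `systemChromeMaterial` tint is from expo-blur; the intensity values here are illustrative assumptions, not the app's tuning.

```typescript
// Adaptive blur-material config in the shape expo-blur's BlurView expects
// (tint + intensity). systemChromeMaterial itself adapts to light/dark;
// this helper only varies intensity per scheme.
type ColorScheme = "light" | "dark";

type BlurConfig = {
  tint: "systemChromeMaterial";
  intensity: number;
};

function blurFor(scheme: ColorScheme): BlurConfig {
  return {
    tint: "systemChromeMaterial",
    // Lighter blur keeps light mode airy; heavier blur keeps dark mode
    // dense, matching the material description above. Values assumed.
    intensity: scheme === "light" ? 60 : 80,
  };
}
```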

Motion

Spring physics

Shared spring configs make motion consistent across callouts, modals, and micro-interactions. Responsive, not just animated.
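A shared config object is enough to get this consistency. The sketch below uses the `{ damping, stiffness, mass }` shape that Reanimated's `withSpring` accepts; the specific values are illustrative, not the app's actual tuning.

```typescript
// Shared spring configs, defined once and reused everywhere so callouts,
// modals, and micro-interactions all settle with the same physics.
const springs = {
  // Snappy response for micro-interactions (buttons, toggles).
  snappy: { damping: 18, stiffness: 250, mass: 1 },
  // Softer settle for callouts and modals.
  gentle: { damping: 22, stiffness: 180, mass: 1 },
} as const;

// Usage inside a Reanimated worklet (sketch):
//   translateY.value = withSpring(0, springs.gentle);
```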

02

Building the Live Map

The core loop. It had to render smoothly, read clearly, and update frequently without destroying battery. Polling only runs while the map screen is focused. Navigate away and it stops.

Blurred callouts animate in with coordinated opacity, scale, and spring translation

useFocusEffect scopes polling to the active screen, no hidden battery drain

Custom pull-to-refresh tracks gesture with interpolation, locks in with a spring
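The focus-scoped polling above can be sketched as a small framework-free poller: start on focus, stop on blur. In the app this lifecycle would live inside `useFocusEffect` (effect body calls `onFocus`, cleanup calls `onBlur`); the class form here just makes the start/stop logic visible. `Fetcher` is a stand-in for the vehicle request.

```typescript
// Poll only while the screen is focused; no requests fire off-screen.
type Fetcher = () => void;

class FocusedPoller {
  private timer: ReturnType<typeof setInterval> | null = null;

  constructor(private fetchVehicles: Fetcher, private intervalMs: number) {}

  // Called from useFocusEffect's effect body.
  onFocus(): void {
    if (this.timer) return;
    this.fetchVehicles(); // immediate refresh on focus
    this.timer = setInterval(this.fetchVehicles, this.intervalMs);
  }

  // Called from useFocusEffect's cleanup; polling stops when navigating away.
  onBlur(): void {
    if (this.timer) {
      clearInterval(this.timer);
      this.timer = null;
    }
  }

  get active(): boolean {
    return this.timer !== null;
  }
}
```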

03

Tech Stack

Every dependency earned its place.

Core: fast iteration, native APIs, type safety
Expo · React Native · TypeScript

UI & Motion: design tokens, blur, physics animation
NativeWind · Reanimated · expo-blur

Maps: real-time rendering and geolocation
rn-maps · expo-location · expo-haptics

State: predictable state with persistence
Zustand · AsyncStorage

AI: intent parsing, tool use, streaming
Apple Intelligence · Vercel AI SDK · Zod
04

The AI That Actually Understands

Built like a small system, not a single massive prompt. Structured into stages so behavior stays reliable and debuggable.

1. Message Clarifier: expands short queries into explicit intent
2. Router Agent: extracts intent and entities
3. Tool Runner: fetches arrivals, vehicles, stops, routes
4. Answer Agent: turns tool output into a readable response
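The four stages compose into one call chain. The sketch below uses stub implementations so the control flow is visible; the function names, shapes, and heuristics are illustrative, not the app's actual agents.

```typescript
// Four-stage pipeline as plain typed functions with stub logic.
type Intent = { route: string | null; kind: "vehicle" | "arrival" | "unknown" };

// Stage 1: expand terse queries into explicit intent text.
const clarify = (q: string): string =>
  q.trim().length < 12 ? `Where is ${q.trim()} right now?` : q;

// Stage 2: extract intent and entities (toy keyword match).
const route = (q: string): Intent =>
  /blue/i.test(q)
    ? { route: "blue", kind: "vehicle" }
    : { route: null, kind: "unknown" };

// Stage 3: fetch data for the resolved intent (stubbed results).
const runTools = (intent: Intent): string[] =>
  intent.route ? [`vehicle:${intent.route}:near campus`] : [];

// Stage 4: turn tool output into a readable answer.
const answer = (results: string[]): string =>
  results.length ? `Found: ${results.join(", ")}` : "I couldn't find that route.";

const reply = answer(runTools(route(clarify("blue bus"))));
```

Because each stage has one input and one output, a bad answer can be traced to the stage that produced it, which is the debuggability the split buys.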

Semantic search with embeddings

Transit names are inconsistent. Users say "blue bus", "blue line", or "route 1". Embeddings match meaning, not exact strings, so queries always land.

Result: "Where's the blue bus?" resolves to the correct route every time, even if the user doesn't know the route number.
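The matching step reduces to nearest-neighbor search by cosine similarity. In the app the vectors would come from an embedding model; the 3-dimensional toy vectors below are assumptions that just demonstrate the ranking.

```typescript
// Cosine similarity between a query vector and precomputed route vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

type RouteEmbedding = { name: string; vector: number[] };

// Return the route whose embedding is most similar to the query's.
function bestRoute(query: number[], routes: RouteEmbedding[]): string {
  return routes.reduce((best, r) =>
    cosine(query, r.vector) > cosine(query, best.vector) ? r : best
  ).name;
}

// Toy data: a "blue bus" query vector lands nearest "Blue Line" even
// though the strings never match exactly.
const routes: RouteEmbedding[] = [
  { name: "Blue Line", vector: [0.9, 0.1, 0.0] },
  { name: "Route 8", vector: [0.1, 0.9, 0.2] },
];
const match = bestRoute([0.85, 0.15, 0.05], routes);
```

Because similarity is computed in meaning-space rather than string-space, "blue bus", "blue line", and "route 1" can all resolve to the same entry.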

05

What I Learned

01. Start with the visual language. Corners, blur, and motion propagate into every screen. Nail the system first, then scale it out.
02. Springs beat easing curves. Shared spring configs make motion feel consistent and physical. The interface responds, it doesn't just move.
03. Platform differences are a strength. Native correctness beats forced consistency. iOS 26 tabs and system materials feel earned, not bolted on.
04. AI needs structure. Split agents make behavior reliable and easy to debug. One giant prompt is a black box; four focused stages are a system.
05. Details compound. Haptic feedback, adaptive blur, staggered animations. Each is small. Together they create something that feels premium.

Final Thoughts

Apps that feel good to use.

Building Pullman Bus Tracker taught me that great apps aren't just about functionality. They're about feel. The continuous corners, blur materials, spring physics, and haptics are each small on their own, but together they create an experience people enjoy using.

Every decision was intentional. Every interaction was tuned. The result is an app that does not just work. It feels good to use.

Pullman Bus Tracker · 2025 · Expo · React Native · TypeScript