AidAlly

AR-Guided Assembly for Humanitarian Aid Devices

Role

AR Developer · Unity Engineer

Stack

Unity · AR Foundation · ARKit · ARCore · Blender · C#

Context & Purpose

The Problem

Field Ready, a humanitarian engineering nonprofit, has successfully addressed logistics and supply chain challenges by creating a catalog of standardized aid device parts and establishing local manufacturers in disaster-affected regions. However, one critical gap remained: assembly.

After disasters, connectivity is down and professional responders are spread thin. Community members must assemble life-saving devices themselves, but existing instructions are text-heavy, non-localized, and require technical training that most users don't have.

The Solution

AidAlly bridges this last-mile gap by using augmented reality to guide users step-by-step through device assembly. The app requires no reading, translation, or technical expertise, making critical aid accessible to anyone, anywhere.

Field Ready's catalog of devices and parts

My Role & Tools

Role: AR Developer

  • Designed and implemented the full AR interaction system: object detection, surface anchoring, and instruction overlays
  • Built the app architecture to run fully offline, optimized for low-resource environments
  • Created the step-by-step assembly guidance system with spatial anchoring and visual feedback

Tools & Technologies

  • Unity (C#) – Core app development and AR logic
  • AR Foundation – Unified AR abstraction across platforms
  • ARKit / ARCore – Object tracking and mobile AR capabilities
  • Unity XR Plug-in Framework – Low-level access to platform tracking subsystems
  • Blender – 3D model preparation and optimization
  • ARKit Object Scanner – Generated .arobject files from physical parts

AR Architecture

AidAlly was built using Unity, AR Foundation, and platform-specific extensions for ARKit (iOS) and ARCore (Android). The app is designed to run entirely offline, with all models, tracking data, and instruction logic packaged locally. This was a critical constraint, as most disaster zones experience power outages and loss of mobile signal.

Core AR System Capabilities

  • Object Recognition: Detect and recognize real-world aid components using object tracking (.arobject files)
  • Surface Anchoring: Anchor instructions to physical surfaces using raycasting and AR Plane Manager
  • Sequential Guidance: Guide users through a sequenced build process via spatial overlays, animations, and highlights
  • Offline Operation: All functionality works without internet connectivity or cloud services

This architecture allows the app to operate reliably in remote or low-resource environments without cloud services or active updates, making it ideal for humanitarian response scenarios.
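The scene wiring for these capabilities can be sketched roughly as follows, assuming AR Foundation 4.x. The component names are AR Foundation's own; the field names and the `OfflineARBootstrap` class are illustrative, not the project's actual code.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative sketch: an offline AR scene with plane detection and
// object tracking, all reference data bundled into the build.
public class OfflineARBootstrap : MonoBehaviour
{
    [SerializeField] ARSession session;                    // drives the ARKit/ARCore session
    [SerializeField] ARPlaneManager planeManager;          // detects flat surfaces for anchoring
    [SerializeField] ARTrackedObjectManager objectManager; // matches scanned .arobject references
    [SerializeField] XRReferenceObjectLibrary partLibrary; // packaged locally -- no network fetch

    void Start()
    {
        // All reference objects ship inside the app binary, so no
        // connectivity is required at runtime.
        objectManager.referenceLibrary = partLibrary;
        planeManager.requestedDetectionMode = PlaneDetectionMode.Horizontal;
    }
}
```

Because the reference library is assigned from a serialized asset rather than fetched remotely, the same build behaves identically with or without a network connection.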

AR Interaction System

Part Detection & Object Recognition

To enable real-world object tracking, I created a custom scanning tool using ARKit to generate .arobject files for each fabricated part. These files encode physical geometry, size, and surface detail, allowing the app to distinguish between similar parts.

In Unity, a detection system loads these reference objects and listens for a successful match via AR Foundation's tracked-object events, backed by the XR Plug-in Framework. When a part is detected:

  • A visual confirmation UI appears to reduce user uncertainty
  • Part alignment and orientation are validated in 3D space
  • The system logs completed detections to unlock the next step in the build process

This ensures that users follow the correct sequence and that the right parts are used at each stage.
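A minimal version of this detection flow might look like the sketch below, using AR Foundation's `ARTrackedObjectManager` events. The commented-out hooks (`ShowConfirmation`, `BuildSequence.MarkDetected`) are hypothetical names standing in for the confirmation UI and step-unlock logic described above.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative sketch: react to newly detected parts and log them
// so the build sequence can unlock the next step.
public class PartDetector : MonoBehaviour
{
    [SerializeField] ARTrackedObjectManager objectManager;

    void OnEnable()  => objectManager.trackedObjectsChanged += OnChanged;
    void OnDisable() => objectManager.trackedObjectsChanged -= OnChanged;

    void OnChanged(ARTrackedObjectsChangedEventArgs args)
    {
        foreach (var tracked in args.added)
        {
            // referenceObject.name comes from the scanned .arobject entry,
            // so similar-looking parts resolve to distinct identifiers.
            string partId = tracked.referenceObject.name;
            Debug.Log($"Detected part: {partId} at {tracked.transform.position}");
            // ShowConfirmation(partId);           // visual confirmation UI (hypothetical)
            // BuildSequence.MarkDetected(partId); // unlocks the next step (hypothetical)
        }
    }
}
```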

Assembly Guide & Instruction Flow

After all required parts are detected, the app transitions to a step-by-step AR assembly mode. Using raycasting, the app identifies a flat surface and anchors the first instruction overlay. Each instruction step is:

  • Represented as a Unity GameObject layer, spatially registered to part geometry
  • Advanced using a simple tap interaction or auto-progression
  • Visualized with arrows, ghosted previews, and animated cues

Each stage is designed to require no text and minimal screen interaction, allowing users to follow along hands-free. The app's state machine ensures that parts are assembled in the correct order, with real-time alignment hints throughout.
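The anchoring and tap-to-advance behavior above can be sketched with AR Foundation's `ARRaycastManager`. This is a simplified single-script stand-in for the app's state machine; the `stepOverlays` array and `InstructionPlacer` class are assumptions for illustration.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative sketch: anchor the first instruction overlay on a detected
// plane, then advance through the remaining steps on each tap.
public class InstructionPlacer : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager;
    [SerializeField] GameObject[] stepOverlays; // one GameObject layer per assembly step
    int currentStep;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began) return;

        if (currentStep == 0)
        {
            // First tap: raycast from screen center against detected planes
            // and spatially register step 1 at the hit pose.
            var screenCenter = new Vector2(Screen.width / 2f, Screen.height / 2f);
            if (raycastManager.Raycast(screenCenter, hits, TrackableType.PlaneWithinPolygon))
            {
                stepOverlays[0].transform.position = hits[0].pose.position;
                stepOverlays[0].SetActive(true);
                currentStep = 1;
            }
        }
        else if (currentStep < stepOverlays.Length)
        {
            // Subsequent taps hide the finished step and show the next,
            // enforcing the correct assembly order.
            stepOverlays[currentStep - 1].SetActive(false);
            stepOverlays[currentStep].SetActive(true);
            currentStep++;
        }
    }
}
```

Keeping the step sequence as an ordered array makes the "correct order" guarantee trivial: a step can only become visible after its predecessor has been dismissed.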

Future Extensions

The next planned feature is AI-driven material substitution. In cases where a required part is missing, the app will allow users to scan a similar object (e.g., a pipe instead of a rod) and use a trained model to evaluate its dimensions and recommend whether it’s a suitable replacement.