Role
AR Developer · Unity Engineer
Stack
Unity · AR Foundation · ARKit · ARCore · Blender · C#
Field Ready, a humanitarian engineering nonprofit, has successfully addressed logistics and supply chain challenges by creating a catalogue of standardized aid device parts and establishing local manufacturers in disaster-affected regions. However, one critical gap remained: assembly.
After disasters, connectivity is down and professional responders are spread thin. Community members must assemble life-saving devices themselves, but existing instructions are text-heavy, non-localized, and require technical training that most users don't have.
AidAlly bridges this final-mile gap by using augmented reality to guide users step-by-step through device assembly. The app requires no reading, translation, or technical expertise, making critical aid accessible to anyone, anywhere.
Field Ready's catalogue of devices and parts
AidAlly was built using Unity, AR Foundation, and platform-specific extensions for ARKit (iOS) and ARCore (Android). The app is designed to run entirely offline, with all models, tracking data, and instruction logic packaged locally. This was a critical constraint, as most disaster zones experience power outages and loss of mobile signal.
This offline-first architecture allows the app to operate reliably in remote or low-resource environments without cloud services or over-the-air updates, making it well suited to humanitarian response scenarios.
To enable real-world object tracking, I created a custom scanning tool using ARKit to generate .arobject files for each fabricated part. These files encode physical geometry, size, and surface detail, allowing the app to distinguish between similar parts.
In Unity, a detection system loads these reference objects and listens for successful matches through the XR Plug-in Framework. When a part is detected:
This ensures that users follow the correct sequence and that the right parts are used at each stage.
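The sequencing rule can be sketched in plain C#. This is a hypothetical, simplified gate, not the app's actual code: in the real app the OnPartDetected call would be driven by AR Foundation's ARTrackedObjectManager.trackedObjectsChanged event, and the part names would come from the matched reference objects.

```csharp
using System.Collections.Generic;

// Enforces the assembly sequence: a detected part is accepted only when
// it is the next one expected. Hypothetical sketch of the ordering rule;
// in the app this would be fed by AR Foundation's object-detection events.
public class PartSequenceGate
{
    private readonly IReadOnlyList<string> expectedOrder;
    private int nextIndex;

    public PartSequenceGate(IReadOnlyList<string> expectedOrder)
    {
        this.expectedOrder = expectedOrder;
        nextIndex = 0;
    }

    public bool AllPartsDetected => nextIndex >= expectedOrder.Count;

    // Returns true if the detected part is the one expected next;
    // false means the app should prompt the user to find the right part.
    public bool OnPartDetected(string partName)
    {
        if (AllPartsDetected || expectedOrder[nextIndex] != partName)
            return false;
        nextIndex++;
        return true;
    }
}
```

Keeping this logic separate from the AR layer also makes the ordering behavior testable without a device.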
After all required parts are detected, the app transitions to a step-by-step AR assembly mode. Using raycasting, the app identifies a flat surface and anchors the first instruction overlay. Each instruction step is:
Each stage is designed to require no text and minimal screen interaction, allowing users to follow along hands-free. The app's state machine ensures that parts are assembled in the correct order, with real-time alignment hints throughout.
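The state machine described above can be sketched as follows. This is an illustrative reduction, not the shipped implementation: the Unity-specific pieces (surface raycasts via ARRaycastManager, overlay prefabs anchored to the hit pose) are represented only by comments, so the phase transitions stand alone as plain C#.

```csharp
// Minimal assembly-mode state machine: the app moves from part scanning
// into step-by-step assembly, advancing one instruction overlay at a time.
// Hypothetical sketch; in the app each step would also carry an AR overlay
// anchored to the flat surface found by raycasting.
public enum AssemblyPhase { Scanning, Assembling, Complete }

public class AssemblyStateMachine
{
    private readonly int stepCount;
    public int CurrentStep { get; private set; }
    public AssemblyPhase Phase { get; private set; }

    public AssemblyStateMachine(int stepCount)
    {
        this.stepCount = stepCount;
        Phase = AssemblyPhase.Scanning;
    }

    // Called once all required parts have been detected.
    public void BeginAssembly()
    {
        if (Phase != AssemblyPhase.Scanning) return;
        Phase = AssemblyPhase.Assembling;
        CurrentStep = 0;
        // Here the app would raycast for a flat surface and anchor
        // the first instruction overlay at the hit pose.
    }

    // Called when the tracked parts for the current step are aligned
    // within tolerance; advances to the next overlay or completes.
    public void OnStepAligned()
    {
        if (Phase != AssemblyPhase.Assembling) return;
        CurrentStep++;
        if (CurrentStep >= stepCount) Phase = AssemblyPhase.Complete;
    }
}
```

Because each transition is gated on the current phase, stray detection events (for example, an alignment callback firing during scanning) cannot skip steps.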
The next planned feature is AI-driven material substitution. In cases where a required part is missing, the app will allow users to scan a similar object (e.g., a pipe instead of a rod) and use a trained model to evaluate its dimensions and recommend whether it’s a suitable replacement.