As autonomous robots become core to industrial operations, managing them through 2D dashboards no longer reflects how they operate in the real world. InOrbit's Space Intelligence platform sits at the intersection of physical space, real-time data, and human decision-making.
As one of three UX/XR design interns, I explored how mixed-reality interaction models could revolutionize robotic operations, prototyping immersive workflows that bring spatial context and clarity back into the user experience.
Skills
Product design
Mixed reality (XR) design
UX heuristic evaluation
UX research
My Role
Researcher
Designer
Developer coordinator
Timeline
2.5 months, Q3 2025
Tools
Figma
Meta Quest 3
ShapesXR
Unity
InOrbit RobOps platform (internal)
Industry
Autonomous Robotic Systems
Mixed Reality (XR) Interfaces
A little bit about InOrbit…
The central nervous system for industrial operations.
The startup is an AI-powered robot operations (RobOps) platform that manages, orchestrates, and optimizes fleets of autonomous robots in warehouses and manufacturing facilities.
Where my work comes in
Translating RobOps complexity into intuitive XR interaction models.
Bridging technical feasibility and business goals with UX expertise.
Scope
A natural evolution as technologies mature
Immersive experiences are on the rise.
As spatial computing, AR, and XR mature, the limits of 2D dashboards become harder to ignore: they favor expert users and create barriers for the growing number of non-expert operators now interacting with these systems. This widens the audience from industrial SaaS power users to a far larger set of end users: operators, supervisors, technicians, and other on-site staff.
End users think spatially, not abstractly
3D lowers the cognitive barrier.
They need visual affordances, spatial context, and direct interactions, not just configuration.
Immersive environments support exploration, rehearsal, and error recovery in a less intimidating way than complex 2D systems.
Tackling a series of ambiguities…
Broad problem space
What started as a wide exploration quickly required prioritization under time constraints, leading us to narrow the opportunity space into a focused set of XR prototypes through cross-functional alignment.
No point of reference
Integrating XR into robotic operations lacked precedent, prompting experimentation without established best practices or metrics.
Unclear technical constraints
The gap between XR design concepts and what was technically achievable within existing tools was not fully known upfront. This required further tool experimentation and API troubleshooting.
2D -> 3D translation
Traditional 2D UI patterns behave differently in 3D space, which pushed us into deep VR/AR research and immersive testing of spatial design patterns; a minimal sketch of one such pattern follows below.
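
To make the 2D → 3D translation concrete, here is a minimal Unity C# sketch of one recurring pattern we had to handle explicitly in the headset: a world-space robot status label that billboards toward the user and refreshes from a fleet backend. The endpoint URL, JSON shape, and field names are hypothetical placeholders for illustration only, not InOrbit's actual API.

    // RobotStatusBillboard.cs
    // A minimal sketch, assuming a hypothetical REST endpoint and response shape.
    // In 2D a status widget is always readable "for free"; in 3D it must be
    // rotated toward the viewer every frame or it becomes unreadable.
    using System;
    using System.Collections;
    using UnityEngine;
    using UnityEngine.Networking;

    public class RobotStatusBillboard : MonoBehaviour
    {
        [Serializable]
        private class RobotStatus          // assumed JSON shape, illustration only
        {
            public string name;
            public string state;           // e.g. "charging", "executing-mission"
            public float battery;          // 0..1
        }

        public string statusUrl = "https://example.invalid/api/robots/amr-01/status"; // hypothetical
        public TextMesh label;             // world-space text attached to the robot model
        public float pollSeconds = 2f;

        private Transform head;            // the user's viewpoint (XR camera)

        private void Start()
        {
            head = Camera.main != null ? Camera.main.transform : null;
            StartCoroutine(PollStatus());
        }

        private void LateUpdate()
        {
            // Billboard: keep the label facing the user.
            if (head != null)
                transform.rotation = Quaternion.LookRotation(transform.position - head.position);
        }

        private IEnumerator PollStatus()
        {
            while (true)
            {
                using (UnityWebRequest req = UnityWebRequest.Get(statusUrl))
                {
                    yield return req.SendWebRequest();
                    if (req.result == UnityWebRequest.Result.Success)
                    {
                        RobotStatus status = JsonUtility.FromJson<RobotStatus>(req.downloadHandler.text);
                        label.text = status.name + "\n" + status.state + "  "
                                   + Mathf.RoundToInt(status.battery * 100f) + "%";
                    }
                    // On failure, keep the last known value instead of flashing errors at the user.
                }
                yield return new WaitForSeconds(pollSeconds);
            }
        }
    }

The billboarding in LateUpdate is exactly the kind of behavior that 2D dashboards get implicitly and 3D interfaces must implement deliberately, which is where many of our 2D patterns broke down.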
Process
Final Design
XR task workflows that support day-to-day robotic operations, ultimately enabling end users to complete complex actions through intuitive, spatial interactions; a wiring sketch for one such interaction follows the workflows below.
Task workflow 1
Task workflow 2
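
For a sense of how a single workflow step like "dispatch this robot" could be wired up in the headset, here is a minimal sketch using Unity's XR Interaction Toolkit (2.x namespaces assumed). The robot ID and the dispatch call are hypothetical placeholders, not InOrbit's real command API.

    // MissionTrigger.cs
    // A minimal sketch: the user points at a robot's hologram with a ray (or
    // touches it directly) and selects it to trigger a dispatch action.
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    [RequireComponent(typeof(XRSimpleInteractable))]
    public class MissionTrigger : MonoBehaviour
    {
        public string robotId = "amr-01";   // illustrative ID

        private void OnEnable()
        {
            // Fires when the user selects this object via ray or direct interaction.
            GetComponent<XRSimpleInteractable>().selectEntered.AddListener(OnSelected);
        }

        private void OnDisable()
        {
            GetComponent<XRSimpleInteractable>().selectEntered.RemoveListener(OnSelected);
        }

        private void OnSelected(SelectEnterEventArgs args)
        {
            Debug.Log("Dispatching " + robotId + "...");
            // In a real build this would call the fleet-management backend.
        }
    }

Binding the action to a standard interactable keeps the workflow consistent across ray, gaze, and direct-touch input, which matters when the same task must work for both expert and non-expert operators.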