HP Sprout 3D Capture — pink pig on the 3D stage
HP Inc. · Hardware + Software · 2015

HP Sprout
3D Capture

Role: UX Design Manager — 3D Capture
Team: HP Sprout Innovation Lab
Platform: Windows · Touch Mat · 3D Stage
Launched: 2015 · Palo Alto, CA

Turning Any Object
Into a 3D Model

HP Sprout was a radical reimagining of the personal computer — an all-in-one with a projector-camera mounted overhead, a large touch-sensitive mat below, and an Intel RealSense depth camera embedded in the unit. 3D Capture was its most ambitious feature: scan real-world objects in 360°, producing detailed textured meshes ready for editing, 3D printing, or sharing to platforms like Sketchfab — all without any prior 3D expertise.

The complete workflow — place object on the blue mat circle, run up to 8 scan cycles, merge captures, fill gaps, export to OBJ or 3MF — had to feel as simple as taking a photograph. There was no roadmap. No existing UX conventions for consumer 3D scanning. We were inventing the interaction model while the hardware was being built around us.
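The last step of that workflow — export to OBJ — targets a deliberately simple plain-text format (3MF, by contrast, is zipped XML and far more involved). As a rough illustration only, not HP's actual export code, a minimal OBJ writer could look like this — `write_obj` and the triangle data are hypothetical:

```python
# Illustrative sketch only — not the Sprout implementation.
# OBJ stores vertices as "v x y z" lines and triangular faces
# as "f i j k" lines with 1-based vertex indices.

def write_obj(path, vertices, faces):
    """Write a triangle mesh to a Wavefront OBJ file."""
    with open(path, "w") as f:
        f.write("# exported mesh\n")
        for x, y, z in vertices:
            f.write(f"v {x} {y} {z}\n")
        for a, b, c in faces:
            # OBJ face indices are 1-based, so shift by one.
            f.write(f"f {a + 1} {b + 1} {c + 1}\n")

# Smoke test: a single triangle.
write_obj("tri.obj", [(0, 0, 0), (1, 0, 0), (0, 1, 0)], [(0, 1, 2)])
```

That simplicity is why OBJ remains the lowest-friction handoff to downstream 3D printing and editing tools.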

360°
Automated object scanning
8
Scan cycles per capture
0
Prior UX conventions to follow
2015
Shipped with Sprout Pro G2

Four Hard Problems
in Uncharted Territory

Every interaction pattern had to be invented — from communicating scan progress to teaching users to reposition objects mid-cycle. There were no precedents to borrow from and no permission to ship later.

A New Input Paradigm
No UX conventions existed for a machine with a projector overhead, touch mat below, and 3D stage in front. Every pattern had to be invented.
No Consumer 3D Patterns
Pro scanning software assumed expert users. Consumer UX assumed simple actions. We were designing in between — powerful enough for pro output, approachable in under 60 seconds.
Hardware-Software Co-Dependency
UX decisions — how many cycles, when to prompt repositioning, how to visualize progress — directly constrained and were constrained by hardware capability.
Consumer Ease, Pro Output
Users ranged from hobbyists to product designers. The same flow had to be approachable for a first-timer and produce mesh-quality output for 3D printing.
Interaction Model

The Dual-Surface
System

The core UX challenge was teaching users the dual-surface interaction model — instructions on the vertical screen above, active workspace on the touch mat below. The overhead projector-camera simultaneously watches the mat and projects guidance onto it, creating a feedback loop entirely new to computing. Neither channel works without the other.

HP Sprout system diagram — screen above, touch mat below, overhead camera unit

Screen above displays the live camera feed and instructions. Touch mat below is the active workspace. The overhead unit projects guidance onto the mat and captures the object simultaneously.

Designing the System
While Building It

My approach began with the physical interaction model — how does a user understand what to do with an object on a rotating stage? Before any screen design, I mapped the full physical-digital loop: place object → scan cycle → reposition → scan again → view result.

01
Map the
Interaction Loop
Every physical action — placing, repositioning, removing — had a corresponding digital state. Mapped the full end-to-end flow before touching UI.
02
Design the
Scan Cycle Model
8 cycles, each capturing hidden surfaces. The pie-chart progress visualization — showing completed segments and directing repositioning — emerged from over a dozen tested alternatives.
03
Prototype with
Hardware Team
Built paper and interactive prototypes alongside mechanical engineering — testing interaction concepts with the physical stage before any working software existed.
04
Manual vs.
Automatic Modes
Two distinct paths, each with its own interaction grammar, onboarding, and error states. The mode selection screen was the most iterated design artifact in the project.
05
Test with Real
Objects & Users
Ran sessions with a wide range of objects. The pink pig became our canonical test object — complex enough to break naive approaches, familiar enough that users immediately understood the goal.
06
Align Cross-
Functional Teams
Facilitated weekly alignment between mechanical engineering, computer vision, and software. Design decisions were the connective tissue holding all three together.
HP Sprout Choose Capture Type — Manual Scan vs Automatic Scan
UI Design · Mode Selection — Two paths, clearly differentiated. Automatic Scan (motorized stage, hands-off) vs. Manual Scan (freeform, user-controlled repositioning). The visual language of each card immediately communicates the nature of the interaction.
The Dual Guidance System

On the Mat.
On the Screen.

On the mat: the scan cycle diagram is projected physically onto the stage — a pie chart telling users which of 8 positions to rotate to next. On the screen: a live camera feed shows how the computer is seeing the object in real time, confirming angle and centering before each scan.

Projected Pie
8 segments projected on the mat. Each represents one capture angle. Arrows direct rotation between cycles.
Live Feed
Screen shows exactly how the computer sees the object. Users verify angle and centering before tapping Scan.
Orange Glow
The stage emits a structured-light orange glow during capture — a physical cue that scanning is in progress.
Motor Sound
The rotating stage's motor sound and rhythm are audible progress — physical cues users learn to read without watching the screen.

Eight Positions,
One Mental Model

8 scan positions projected directly onto the touch mat — each segment represents one scan angle. The arrows tell the user exactly how to rotate the object between cycles. Full blue circle = capture complete.

Scan cycle progress system — 8-stage pie chart with rotation positions and directional arrows

The 8-cycle pie chart — projected physically on the mat and mirrored on-screen. Users always know how many cycles remain and which direction to rotate.
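The geometry behind that guidance is simple: 8 equal segments of 45° each. A hedged sketch (function names are illustrative, not Sprout's) of how the next target angle and remaining cycles might be derived:

```python
# Illustrative sketch of the 8-segment scan cycle geometry.
SEGMENTS = 8
SEGMENT_ANGLE = 360 / SEGMENTS  # 45° covered per scan cycle

def next_target_angle(completed_cycles):
    """Angle the object should face for the next capture."""
    return (completed_cycles * SEGMENT_ANGLE) % 360

def cycles_remaining(completed_cycles):
    """Unfilled segments of the projected pie chart."""
    return SEGMENTS - completed_cycles

# After cycle 3, the arrows direct the user to rotate the
# object to the 135° position, with 5 segments still unfilled.
```

The fixed 45° increment is what lets the projected pie and the on-screen mirror stay in lockstep — both are just rendering the same cycle counter.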

Scan Cycle 3 — screen shows live camera feed as user repositions dinosaur
Close-up — hands repositioning dinosaur on stage following pie chart guidance

Left: screen shows exactly how the computer sees the object — user verifies angle before hitting Scan. Right: hands reposition the object to the angle indicated by the projected pie chart on the mat.

Pink pig on HP scan stage — scan session 1
01
Object placed on stage — cycle 1 in progress
Pink pig scan — repositioning between cycles
02
Mid-scan — repositioning to expose hidden surfaces
Pink pig scan — final cycles completing
03
Final cycles — pie chart nearly full at Cycle 6, stage still rotating
The pig — our canonical test object across all scan research sessions
"The object sitting on the mat is the interface. Everything we designed — the projected blue circle, the orange scan glow, the cycle sounds — was about making the physical object feel like part of the software."
Design Principle — HP Sprout 3D Capture

What We Learned
About 3D Scanning UX

Physical Feedback is UI
The stage's orange glow, motor sound, and rotation are interaction cues as powerful as any on-screen element. We designed for both channels simultaneously.
Progress Needs a Mental Model
"Scanning…" is not enough. Users needed to understand why 8 cycles, what each captures, and what to do between them — not just that something was happening.
Placement is the Hardest Step
Users consistently under-centered objects. The projected blue circle on the touch mat — a physical guide projected from above — reduced placement errors dramatically.
The Object is a Constraint
Light-colored, matte objects scan beautifully. Shiny, transparent, or black surfaces confuse structured light — a UX problem as much as a hardware one.

The Stage,
the Screen, the Object

The motorized rotating stage with structured light projection. The orange glow is a signature physical cue — scanning in progress — while the screen simultaneously shows the pie chart filling up, cycle by cycle.

HP Sprout 3D Capture Stage in use
Pink pig on HP 3D Capture Stage at Cycle 6 of 8

Left: the motorized stage with structured-light orange glow signaling active capture. Right: Cycle 6 of 8 — the scan is nearly complete, screen and stage in the same frame.

Scan Results — From Physical Object to 3D Mesh
Gold teapot 3D mesh result
Antique teapot — complex curved geometry, handle detail preserved
Painted folk art creature 3D mesh result
Folk art figurine — surface texture and paint detail captured
Red pig 3D mesh with surface normal visualization
The pig — surface normal rendering reveals mesh geometry quality
3D scanned Dala horse model projected onto the HP Sprout touch mat

From Physical Object
to Digital Model —
on the Same Surface

Once scanned, the 3D model is projected directly onto the touch mat — the same surface where the physical object sat moments before. Users can rotate, scale, and interact with the digital version using their fingers. The transition from physical to digital happens without ever leaving the workspace.
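Standard two-finger gesture math underlies that rotate-and-scale interaction. A minimal sketch — assumptions, not the Sprout touch stack — of deriving scale and rotation from two moving touch points:

```python
import math

def pinch_transform(p1_start, p2_start, p1_now, p2_now):
    """Derive (scale, rotation in degrees) from two touch points,
    each given as an (x, y) pair at gesture start and now."""
    def vec(a, b):
        return (b[0] - a[0], b[1] - a[1])

    vx0, vy0 = vec(p1_start, p2_start)  # finger vector at gesture start
    vx1, vy1 = vec(p1_now, p2_now)      # finger vector now

    # Scale: ratio of current to initial finger separation.
    scale = math.hypot(vx1, vy1) / math.hypot(vx0, vy0)
    # Rotation: signed angle between the two finger vectors.
    rotation = math.degrees(math.atan2(vy1, vx1) - math.atan2(vy0, vx0))
    return scale, rotation

# Fingers spreading from 100px to 200px apart with no twist
# yields scale 2.0 and rotation 0.0.
```

Because the same mat surface both projected the model and sensed the touches, this transform could be applied to the projected image directly, keeping the model pinned under the user's fingers.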

See It
in Action

A live demonstration of the HP Sprout 3D Capture workflow — from placing an object on the stage through the automated scan cycles to the final 3D mesh result.

HP Sprout 3D Capture — the complete scan-to-mesh workflow in a single take.

What Shipped

First Consumer 3D Scanning PC

HP Sprout 3D Capture shipped with the HP Sprout Pro G2 in 2015 — the first consumer computer with integrated 3D scanning. Output in OBJ and 3MF for major 3D printing workflows.

Foundational UX Patterns

The 8-cycle scan model, dual-mode entry point, and projected stage guidance became foundational patterns for the entire Sprout 3D ecosystem — referenced in HP's spatial computing design guidelines.

Unified Hardware + Design

Demonstrated that hardware and design could ship simultaneously from a unified team — a model that influenced how HP structured subsequent Sprout development cycles.

UX Design Manager,
3D Capture

  • Led UX design for the 3D Capture feature end-to-end
  • Mapped the full physical-digital interaction loop
  • Designed the 8-cycle scan model and pie-chart progress system
  • Defined Manual vs. Automatic mode interaction grammar
  • Prototyped alongside mechanical engineering and CV teams
  • Ran user research sessions with a wide range of test objects
  • Facilitated cross-functional alignment across hardware and software
  • Contributed to HP spatial computing design guidelines

What I'd Do
Differently

I would advocate for earlier longitudinal testing with the physical hardware. Wizard of Oz methods were invaluable early on, but the transition to actual hardware revealed failure modes — especially the projected guide's visibility under ambient light — that we hadn't fully anticipated.

I'd also push harder for a dedicated first-run onboarding experience. The system was genuinely novel and we relied too heavily on in-context guidance. A brief first-time setup flow would have reduced the learning curve without adding friction for returning users.

Looking forward, voice coaching and AI make this process dramatically simpler. Instead of an 8-step manual repositioning cycle, an AI-powered system could analyze the mesh in real time and instruct users — or the stage itself — to capture only the angles that are actually missing. Move the object only when the computer can't see it.
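That idea — capture only what's missing — reduces to a coverage check per angular segment. A toy sketch, where the segment count, point counts, and threshold are all assumptions for illustration:

```python
# Illustrative next-best-view sketch: rescan only thin segments.
SEGMENTS = 8
MIN_POINTS = 500  # assumed minimum captured points per 45° segment

def missing_angles(points_per_segment):
    """Return the start angles of under-covered segments —
    the only positions the user still needs to expose."""
    return [
        i * 360 // SEGMENTS
        for i, count in enumerate(points_per_segment)
        if count < MIN_POINTS
    ]

# Segments 2 and 5 are thin, so only 90° and 225° need a rescan.
coverage = [900, 850, 120, 700, 800, 60, 950, 880]
```

The fixed 8-cycle loop we shipped is the degenerate case of this: rescan everything. An adaptive version would trade predictability for speed — which is itself a UX decision, not just an algorithmic one.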
