Codes by Shrey

Embodied Interface

HapTrek

A haptic hiking and public-safety concept exploring non-visual navigation, fall detection through 3-axis accelerometer sensing, and the application of human body dynamics to product design.

Focus

  • Haptic feedback for navigation
  • Accessible and non-visual wayfinding
  • 3-axis accelerometer sensing
  • Fall detection and safety signaling
  • Embodied product design

Domain

Human factors, hardware, safety

Methods

Haptic mapping, sensing, prototyping

Stack

Arduino, accelerometer, haptics

Lens

Embodied interaction

Problem

Outdoor navigation and safety tools often assume visual attention, phone availability, and low cognitive load. HapTrek explores how tactile feedback can support hikers when visual interfaces are impractical or unsafe.

Concept

The project frames navigation as an embodied interface problem: direction, motion, and safety status can be communicated through vibration patterns and sensor-driven feedback rather than a screen-first workflow.

Safety Layer

Accelerometer data supports fall-detection logic, giving the system a public-safety and emergency-response angle beyond wayfinding alone.
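The fall-detection logic above can be sketched as a two-phase heuristic on accelerometer magnitude: a near-free-fall dip followed, within a short window, by an impact spike. This is a minimal sketch; the thresholds (0.4 g, 2.5 g) and window length are illustrative assumptions, not the project's validated values.

```cpp
#include <cmath>

// Acceleration magnitude in g from 3-axis readings.
double magnitudeG(double ax, double ay, double az) {
    return std::sqrt(ax * ax + ay * ay + az * az);
}

// Assumed thresholds, pending validation against real fall data.
const double FREE_FALL_G = 0.4;  // dip below this suggests free fall
const double IMPACT_G    = 2.5;  // spike above this suggests impact
const int    WINDOW      = 10;   // max samples between dip and spike

// Scan a buffer of magnitude samples (in g) for a dip-then-spike pair.
bool detectFall(const double* mags, int n) {
    for (int i = 0; i < n; ++i) {
        if (mags[i] < FREE_FALL_G) {
            int end = (i + WINDOW < n) ? i + WINDOW : n;
            for (int j = i + 1; j < end; ++j) {
                if (mags[j] > IMPACT_G) return true;
            }
        }
    }
    return false;
}
```

On-device, the same check would run over a rolling buffer of accelerometer samples rather than a complete array.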

Design Value

HapTrek connects human factors, biomechanics, and interface design: the body becomes the medium for information, and the product must respect perception, movement, balance, and terrain.

Product Brief Draft

This page serves as the current HapTrek case study and product brief. A fuller PRD would define target users, vibration language, fall-detection thresholds, environmental constraints, validation protocol, accessibility requirements, and emergency escalation behavior.

Process + Progress

HapTrek is being framed as a professional embodied-interface case study: concept first, then sensor behavior, haptic signal design, safety validation, and accessibility testing.

Concept

Define the hiking safety problem as glance-free navigation and terrain-aware status feedback.

Public-safety and non-visual interaction brief.

Sensing

Use 3-axis accelerometer data to detect motion state and candidate fall events.

Arduino + accelerometer prototype direction.
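One minimal way to turn 3-axis magnitude samples into a motion state plus candidate fall events is a variance split between stationary and moving, with extreme samples escalated for downstream confirmation. All thresholds here are placeholders, not tuned values from the prototype.

```cpp
enum MotionState { STATIONARY, MOVING, CANDIDATE_FALL };

// Classify a window of acceleration magnitudes (in g).
// An extreme sample flags a candidate fall; otherwise the variance
// of the window separates stationary from moving. Thresholds are
// illustrative assumptions.
MotionState classifyWindow(const double* mags, int n) {
    double mean = 0.0;
    for (int i = 0; i < n; ++i) {
        if (mags[i] < 0.4 || mags[i] > 2.5) return CANDIDATE_FALL;
        mean += mags[i];
    }
    mean /= n;
    double var = 0.0;
    for (int i = 0; i < n; ++i) {
        var += (mags[i] - mean) * (mags[i] - mean);
    }
    var /= n;
    return (var > 0.01) ? MOVING : STATIONARY;
}
```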

Haptics

Map directional and safety states into vibration patterns that can be learned without visual attention.

Vibration language still needs validation.
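A candidate vibration vocabulary could be encoded as pulse sequences per cue. The cue names, pulse timings, and the left/right-actuator assumption are all hypothetical; as noted above, the vibration language itself still needs user validation.

```cpp
// One cue = a sequence of (on, off) pulses in milliseconds.
// Patterns are illustrative assumptions, not a validated language.
struct Pulse { int on_ms; int off_ms; };

enum Cue { TURN_LEFT, TURN_RIGHT, ON_TRAIL, OFF_TRAIL, FALL_ALERT };

// Return the pulse pattern for a cue; pattern length goes in *len.
const Pulse* patternFor(Cue cue, int* len) {
    static const Pulse turn[]     = { {100, 100}, {100, 0} };  // two short (left or right actuator)
    static const Pulse onTrail[]  = { {50, 0} };               // single tick
    static const Pulse offTrail[] = { {300, 200}, {300, 0} };  // two long buzzes
    static const Pulse alert[]    = { {500, 200}, {500, 200}, {500, 0} };  // urgent triple
    switch (cue) {
        case TURN_LEFT:
        case TURN_RIGHT: *len = 2; return turn;
        case ON_TRAIL:   *len = 1; return onTrail;
        case OFF_TRAIL:  *len = 2; return offTrail;
        case FALL_ALERT: *len = 3; return alert;
    }
    *len = 0;
    return nullptr;
}
```

Keeping safety cues longer and more repetitive than navigation cues is one way to make them distinguishable without visual attention, but that contrast is exactly what validation would need to confirm.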

Validation

Test false positives, terrain constraints, cognitive load, accessibility, and response behavior.

Next PRD requirement.
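False-positive analysis could start with a simple tally over labeled trials: how often the detector fires when no fall occurred. The `Trial` record and harness below are hypothetical, not the project's validation protocol.

```cpp
// One labeled trial: did the detector fire, and was it a real fall?
struct Trial { bool fired; bool actualFall; };

// Fraction of non-fall trials on which the detector fired.
double falsePositiveRate(const Trial* trials, int n) {
    int nonFalls = 0, falseAlarms = 0;
    for (int i = 0; i < n; ++i) {
        if (!trials[i].actualFall) {
            ++nonFalls;
            if (trials[i].fired) ++falseAlarms;
        }
    }
    return nonFalls ? static_cast<double>(falseAlarms) / nonFalls : 0.0;
}
```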

Technical Skills Demonstrated

Sensor Logic

Accelerometer-based state detection, threshold thinking, and fall-detection requirements.

Embodied UX

Haptic signal mapping, non-visual information design, and movement-aware interaction constraints.

Safety Validation

Accessibility, false-positive analysis, environmental constraints, and emergency escalation planning.