0x00FF // initializing_perception_engine...

Amogh
Bajpai

Engineering machine perception through the lens of human dramaturgy.

Inference // Perception

Pixel
Manifold

Everybody is made of pixels. Real-time bitmapping of facial geometry into interactive particles.

[ Perception_Manual ]

  • 01. Initialize Perception to scan facial landmarks.
  • 02. Hold a blink for 3s to charge the color-shift (the charge persists).
  • 03. Sustain the blink for 6s total to trigger a particle burst.
  • 04. Watch Intensity Meter for high-precision feedback.
Standby
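
The timing behind steps 02–03 can be sketched as a small hold-timer state machine. This is an illustrative assumption, not the page's actual code: it presumes a per-frame blink score in [0, 1] (e.g. derived from MediaPipe's eye-blink blendshapes) and millisecond timestamps; the 0.5 threshold is likewise assumed.

```typescript
type Phase = "idle" | "charging" | "charged" | "burst";

// Hypothetical blink-hold timer: charge at 3s (persists), burst at 6s.
class BlinkHoldTimer {
  private holdStart: number | null = null;
  private charged = false; // color-shift persists once earned (step 02)

  update(blinkScore: number, nowMs: number): Phase {
    const eyesClosed = blinkScore > 0.5; // assumed blink threshold
    if (!eyesClosed) {
      this.holdStart = null; // releasing resets the hold, not the charge
      return this.charged ? "charged" : "idle";
    }
    if (this.holdStart === null) this.holdStart = nowMs;
    const heldMs = nowMs - this.holdStart;
    if (heldMs >= 6000) return "burst"; // step 03
    if (heldMs >= 3000) {
      this.charged = true; // step 02
      return "charged";
    }
    return this.charged ? "charged" : "charging";
  }
}
```

Feeding it one score per video frame keeps the perception code and the dramaturgy of the effect cleanly separated.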

POV_Calibration

Inference // Semantics

Semantic
Stream

Translating the visual world into high-fidelity narrative descriptions.

[ Perception_Manual ]

  • 01. Initialize Stream to start multimodal inference.
  • 02. Present objects or perform actions in the frame.
  • 03. Observe the Semantic Terminal for real-time captions.
  • 04. System identifies actors, objects, and latent intent.
[ SYSTEM_STANDBY ]
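
The capture-to-terminal loop in steps 01–04 can be sketched as below. This is a hedged stand-in, not the production stream: `describeFrame` represents whatever multimodal model actually backs the inference, and is injected here so the loop itself stays model-agnostic.

```typescript
// Hypothetical captioner signature: raw frame bytes in, caption out.
type Captioner = (frame: Uint8Array) => Promise<string>;

// Consume frames, run one inference per frame, and push each caption
// to the terminal callback. Returns the number of frames processed.
async function runSemanticStream(
  frames: AsyncIterable<Uint8Array>,
  describeFrame: Captioner,
  onCaption: (caption: string) => void,
): Promise<number> {
  let count = 0;
  for await (const frame of frames) {
    // A real loop would throttle or debounce instead of captioning
    // every frame; kept simple for illustration.
    onCaption(await describeFrame(frame));
    count++;
  }
  return count;
}
```

Swapping the captioner lets the same loop serve object labels, action descriptions, or latent-intent summaries.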

Infrastructure // Intelligence

Integrity
Sync Engine

Self-healing synchronization loops for high-stakes biometric reporting.

Architected recursive fetching to defeat unstable APIs. Integrated RAG pipelines for behavioral anomaly detection.
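A self-healing fetch loop of this kind is commonly built as retry with exponential backoff; the sketch below is an assumed minimal shape, with `fetchFn`, the attempt count, and the delay schedule all illustrative rather than taken from the production system.

```typescript
// Retry an unstable async call with exponential backoff.
// Delays grow as baseDelayMs * 2^attempt: 200ms, 400ms, 800ms, ...
async function fetchWithRetry<T>(
  fetchFn: () => Promise<T>,
  maxAttempts = 5,
  baseDelayMs = 200,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fetchFn();
    } catch (err) {
      lastError = err;
      const delayMs = baseDelayMs * 2 ** attempt;
      await new Promise(res => setTimeout(res, delayMs));
    }
  }
  // All attempts failed: surface the last error to the caller.
  throw lastError;
}
```

For high-stakes reporting, adding jitter to the delay and a cap on the maximum backoff are the usual next refinements.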

Inference // Precision

Biometric
Alignment

Turning web cameras into high-precision biometric verification sensors.

Real-time image stabilization and contrast enhancement using custom WASM modules.
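The production path runs contrast enhancement (e.g. CLAHE) inside compiled WASM modules; as a simplified stand-in for the idea, here is global histogram equalization in plain TypeScript over an 8-bit grayscale buffer. The function name and shapes are illustrative assumptions.

```typescript
// Global histogram equalization: remap gray levels so the cumulative
// distribution of intensities becomes approximately uniform.
function equalizeHistogram(gray: Uint8Array): Uint8Array {
  const hist = new Array<number>(256).fill(0);
  for (const v of gray) hist[v]++;

  // Cumulative distribution function of the histogram.
  const cdf = new Array<number>(256).fill(0);
  let acc = 0;
  for (let i = 0; i < 256; i++) {
    acc += hist[i];
    cdf[i] = acc;
  }

  const cdfMin = cdf.find(c => c > 0) ?? 0;
  const denom = Math.max(1, gray.length - cdfMin); // guard flat images
  const out = new Uint8Array(gray.length);
  for (let i = 0; i < gray.length; i++) {
    out[i] = Math.round(((cdf[gray[i]] - cdfMin) / denom) * 255);
  }
  return out;
}
```

CLAHE differs by equalizing per-tile with a clip limit, which avoids the noise amplification a global remap can introduce.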

The Dramaturg

I came to engineering through theatre. For years I directed performances, thinking about how bodies move through space, how timing shapes emotion, and how small interactions create meaning on stage.

That way of thinking never left. It simply moved into the systems I build. My discovery of TouchDesigner was the catalyst—it showed me that code could be as plastic and expressive as lighting or movement on a stage.

In theatre, blocking is the choreography of actors within a scene. In software, I think about architecture in a similar way: coordinating agents, managing timing, and shaping how information moves through a system.

Today my work lives at the intersection of machine learning, computer vision, and interactive media. I approach technology as a medium for designing experiences, not just solving problems.

BIOS Diagnostic // System_Stack_v2.0

[01] PERCEPTION

MediaPipe CV

TensorFlow.js

OpenCV.js (WASM)

[02] INFERENCE

RAG Pipelines

LangChain / LLM Ops

Semantic Search

[03] VISUALS

TouchDesigner

GLSL Shaders

Three.js / WebGL

[04] CORE_RECORDS

Spatial Semantics

HCI Blocking

Latent Intent

Logbook // Field_Notes_2026

[ LOG_01_ZEPHYR ]
MAR.2026

Biometric Latency Optimization STABLE

Synthesized recursive synchronization loops to reduce reporting lag in high-stakes biometric environments.

[ LOG_02_MIMIC ]
FEB.2026

Semantic Face Alignment VERIFIED

Mapped 52 blendshape indices to local mesh influences. Integrated CLAHE image enhancement.
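The blendshape-to-mesh mapping above reduces to a weighted sum of per-vertex deltas: deformed = base + Σᵢ wᵢ·Δᵢ. The sketch below assumes that standard formulation; the flattened-array layout and names are illustrative, not the logged implementation.

```typescript
// Apply blendshape coefficients to a base mesh.
// base:    flattened xyz vertex positions
// deltas:  one per-vertex displacement array per blendshape (same length)
// weights: one coefficient in [0, 1] per blendshape
function applyBlendshapes(
  base: Float32Array,
  deltas: Float32Array[],
  weights: number[],
): Float32Array {
  const out = Float32Array.from(base);
  deltas.forEach((delta, i) => {
    const w = weights[i];
    if (w === 0) return; // skip inactive shapes
    for (let j = 0; j < out.length; j++) {
      out[j] += w * delta[j];
    }
  });
  return out;
}
```

With MediaPipe's 52 blendshape coefficients driving `weights` each frame, the same loop animates any mesh that ships matching delta targets.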

[ LOG_03_FLUX ]
JAN.2026

Generative Visual Synthesis EXPERIMENTAL

Explored real-time particle displacement using TouchDesigner and GLSL shaders.