AI EYE: Computer Vision Analytics for Industrial Safety & Productivity




Edge-enabled AI monitoring system designed for real-time risk mitigation, compliance enforcement, and operational optimisation across manufacturing facilities.

Multi-Class PPE Detection Engine

AI EYE's PPE Detection Engine leverages deep-learning object classification to simultaneously identify and verify helmets, safety shoes, gloves, high-visibility jackets, and respiratory masks across every individual within a camera's field of view. Unlike single-class models, this multi-class architecture processes multiple PPE categories per frame, enabling comprehensive compliance verification in a single inference pass.

The system supports multi-person simultaneous tracking, ensuring that even in dense factory floor environments with dozens of workers, each individual's PPE status is independently assessed, logged, and reported. Snapshot-based violation recording captures time-stamped visual evidence the instant a non-compliance event is detected — creating an irrefutable audit trail that meets OSHA, ISO 45001, and internal EHS documentation requirements.

5-Class Detection

Helmet, shoes, gloves, jacket & mask — all in a single inference cycle

Multi-Person Tracking

Simultaneous compliance status for every individual in the frame

Snapshot Violations

Time-stamped visual evidence captured at the moment of non-compliance

Audit-Ready Reports

Compliance export in PDF, CSV & API formats for regulatory submissions
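As a concrete sketch, the single-pass, per-person verification described above reduces to a set difference between required and detected PPE classes. The `PersonDetection` structure, class names, and tracker IDs below are illustrative assumptions, not AI EYE's actual schema:

```python
from dataclasses import dataclass, field

# The five PPE classes verified per person in one inference pass
REQUIRED_PPE = {"helmet", "shoes", "gloves", "jacket", "mask"}

@dataclass
class PersonDetection:
    track_id: int                                # stable ID from the multi-person tracker
    ppe_found: set = field(default_factory=set)  # PPE classes detected on this person

def compliance_report(people):
    """Return the missing PPE classes for each tracked person in one frame."""
    return {p.track_id: sorted(REQUIRED_PPE - p.ppe_found) for p in people}

frame = [
    PersonDetection(7, {"helmet", "shoes", "gloves", "jacket", "mask"}),
    PersonDetection(9, {"helmet", "shoes"}),
]
print(compliance_report(frame))  # {7: [], 9: ['gloves', 'jacket', 'mask']}
```

An empty list means the person is fully compliant; a non-empty list is what would trigger the snapshot capture and audit-log path.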

Unsafe Behaviour Analytics

Beyond PPE compliance, AI EYE's Unsafe Behaviour Analytics module applies pose estimation and activity recognition models to detect high-risk human actions in real time. The running detection model identifies workers exceeding safe movement speeds on the factory floor — a leading indicator of near-miss and collision events. The climbing posture classification algorithm recognises when workers assume elevated or precarious positions without proper fall protection protocols in place.

One of the module's most operationally critical capabilities is machine guard bypass detection. By monitoring the spatial relationship between workers and safety interlocks, the system flags instances where guards are removed or circumvented during active machine operation. All thresholds — from running speed sensitivity to posture angle tolerances — are fully configurable via the admin dashboard, allowing EHS teams to calibrate risk sensitivity to their specific facility layout and operational context.


Running Detection Model

AI-driven speed and gait analysis to identify unsafe movement velocities in restricted zones

Climbing Posture Classification

Skeleton-based pose estimation flags elevated or precarious body positions

Machine Guard Bypass

Spatial inference detects interlock circumvention during active machine operation

Configurable Risk Thresholds

Admin-adjustable sensitivity for speed limits, posture angles & zone boundaries
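The running-detection idea can be sketched as a speed estimate over a short window of tracked positions, compared against a configurable threshold. The threshold value, calibration constant, and function names here are hypothetical, chosen only to illustrate the principle:

```python
RUN_THRESHOLD_M_S = 2.5  # configurable sensitivity, as on the admin dashboard

def estimate_speed(track, fps, metres_per_pixel):
    """Average speed in m/s over a window of per-frame (x, y) pixel positions."""
    if len(track) < 2:
        return 0.0
    # Sum the frame-to-frame pixel displacements along the track
    dist_px = sum(
        ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        for (x1, y1), (x2, y2) in zip(track, track[1:])
    )
    elapsed_s = (len(track) - 1) / fps
    return dist_px * metres_per_pixel / elapsed_s

track = [(100, 200), (130, 200), (160, 200), (190, 200)]  # 30 px per frame
speed = estimate_speed(track, fps=30, metres_per_pixel=0.01)
print(round(speed, 6))            # 9.0
print(speed > RUN_THRESHOLD_M_S)  # True -> flag as running
```

In a deployed system the pixels-to-metres scale would come from floor calibration, and the window length and threshold would be the admin-adjustable parameters described above.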





Proximity & Near-Miss Engine

The Proximity & Near-Miss Engine represents AI EYE's proactive approach to incident prevention. Rather than waiting for an accident to occur, this module performs continuous distance-based safety perimeter mapping, calculating real-time spatial relationships between workers, heavy machinery, restricted areas, and other hazard sources. When a worker breaches a configured proximity threshold, the system logs a near-miss event and triggers an instant control room alert — transforming what would otherwise be an invisible "close call" into a documented, actionable data point.

Zone configurations utilise polygon-based geofencing, enabling facility managers to draw custom-shaped safety perimeters that precisely match their floor layout — far more accurate than simple circular radius zones. Every near-miss event is logged with full metadata: timestamp, camera ID, zone ID, worker position, and proximity distance — creating a rich dataset for trend analysis and root cause investigations.



Perimeter Mapping

Distance-based safety zones calculated continuously in real time

Polygon Geofencing

Custom-shaped zones matching exact facility layouts

Near-Miss Logging

Full metadata capture for every proximity breach event

Instant Alerts

Real-time control room notifications with visual evidence
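At its core, polygon geofencing rests on a point-in-polygon test against the configured perimeter vertices. A minimal ray-casting sketch (the zone coordinates are hypothetical, and a production system would use calibrated floor coordinates):

```python
def point_in_polygon(pt, polygon):
    """Ray-casting test: is point (x, y) inside the polygon vertex list?"""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does the edge cross the horizontal ray extending right from pt?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# A custom-shaped perimeter, unlike a simple circular radius zone
zone = [(0, 0), (10, 0), (12, 6), (4, 9), (0, 5)]
print(point_in_polygon((5, 4), zone))   # True  -> log near-miss, alert control room
print(point_in_polygon((15, 4), zone))  # False -> no breach
```

A breach result would then be written out with the metadata listed above (timestamp, camera ID, zone ID, worker position, proximity distance).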






Forklift Detection & Speed Analytics


Technical Architecture

The Forklift Detection & Speed Analytics module employs a purpose-trained industrial vehicle object classification model capable of distinguishing forklifts, AGVs, pallet jacks, and other material handling equipment from human subjects. Once classified, a proprietary speed estimation algorithm calculates vehicle velocity based on frame-to-frame displacement, calibrated against known floor dimensions.

When a vehicle exceeds the configured speed limit within a restricted zone, the system triggers a speed violation alert — logged with vehicle type, estimated speed, zone ID, and visual snapshot. Over time, the operator behaviour insights dashboard aggregates these events to reveal patterns: which shifts have the most violations, which zones are highest risk, and which operational changes reduce incident rates.

Vehicle Classification

Distinguishes forklifts, AGVs & pallet jacks from human subjects

Speed Estimation

Frame-to-frame displacement calibrated to floor dimensions

Zone Violation Alerts

Real-time alerts for speed breaches in restricted pedestrian zones

Operator Insights

Aggregated behavioural patterns by shift, zone & incident type
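The shift-and-zone aggregation behind the operator insights dashboard can be illustrated with a simple counter over violation metadata. The records and field names below are hypothetical examples, not AI EYE's actual storage format:

```python
from collections import Counter

# Hypothetical speed-violation records as logged by the system
violations = [
    {"shift": "night", "zone": "Z-03", "speed": 3.4, "vehicle": "forklift"},
    {"shift": "night", "zone": "Z-01", "speed": 2.9, "vehicle": "AGV"},
    {"shift": "day",   "zone": "Z-03", "speed": 3.1, "vehicle": "forklift"},
]

def violations_by(events, key):
    """Aggregate violation counts by any metadata field (shift, zone, vehicle)."""
    return Counter(e[key] for e in events)

print(dict(violations_by(violations, "shift")))  # {'night': 2, 'day': 1}
print(dict(violations_by(violations, "zone")))   # {'Z-03': 2, 'Z-01': 1}
```

Grouping the same event stream by different keys is what surfaces the patterns described above: which shifts violate most, and which zones carry the highest risk.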

Human–Vehicle Collision Prevention

AI EYE's Collision Prevention module is the platform's most safety-critical capability — engineered to prevent the highest-severity incident category on any factory floor. The system applies real-time bounding-box tracking to simultaneously monitor the trajectories of both human subjects and industrial vehicles within shared operational zones. An AI-based collision prediction engine continuously calculates projected intersection points based on current speed, heading, and acceleration vectors.

When the algorithm determines that a human–vehicle trajectory convergence falls within a critical time-to-impact threshold, it triggers integrated audio/visual alarm systems — tower lights, sirens, and dashboard notifications — giving both the operator and the pedestrian actionable warning time. Every collision prediction event is stored with full time-stamped metadata, enabling post-incident analysis and regulatory documentation even when the alert successfully prevents contact.
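The trajectory-convergence check can be sketched as a closed-form closest-approach calculation under a constant-velocity assumption. This simplified version is illustrative only (the description above also incorporates acceleration, which this sketch omits):

```python
def time_to_closest_approach(p_h, v_h, p_v, v_v):
    """Time (s) at which a person and a vehicle are closest, given current
    positions (m) and velocities (m/s); 0.0 if they are already diverging."""
    rx, ry = p_v[0] - p_h[0], p_v[1] - p_h[1]  # relative position
    vx, vy = v_v[0] - v_h[0], v_v[1] - v_h[1]  # relative velocity
    v_sq = vx * vx + vy * vy
    if v_sq == 0:
        return 0.0
    t = -(rx * vx + ry * vy) / v_sq
    return max(t, 0.0)

def min_distance(p_h, v_h, p_v, v_v):
    """Closest-approach time and the separation (m) at that moment."""
    t = time_to_closest_approach(p_h, v_h, p_v, v_v)
    dx = (p_v[0] + v_v[0] * t) - (p_h[0] + v_h[0] * t)
    dy = (p_v[1] + v_v[1] * t) - (p_h[1] + v_h[1] * t)
    return t, (dx * dx + dy * dy) ** 0.5

# Forklift at (20, 0) heading toward a stationary worker at (0, 0) at 2 m/s
t, d = min_distance((0, 0), (0, 0), (20, 0), (-2, 0))
print(t, d)  # 10.0 0.0 -> converging; alarm if t falls under the threshold
```

When the projected minimum separation and time-to-impact both fall below configured limits, that is the condition under which the tower lights, sirens, and dashboard notifications described above would fire.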