We sell a computer vision system. We should be clear about what it misses. This is not a hedge or a liability disclaimer — it's a list of actual documented failure modes from our pilot deployments, written by the team that built the platform. A safety manager who deploys this system with an accurate understanding of its limitations will use it better and achieve better safety outcomes than one who believes it solves all detection problems. An oversold system that fails unexpectedly erodes trust and gets turned off. An honestly characterized system gets calibrated, supplemented, and used persistently.
Here's what we've documented, categorized by type.
Camera occlusion
Camera occlusion is the most frequent failure mode and the hardest to engineer around. When a worker is fully behind a piece of equipment, a structural column, a material stack, or another worker, the camera system cannot detect them. This sounds obvious but has non-obvious implications for safety coverage claims.
On the 47-acre Houston pilot site, camera dead zone analysis estimated that at any given moment, 12-18% of the worker population was in positions where camera coverage was incomplete. That percentage varied significantly by time of day and phase of construction: it was lowest during foundation work (open site, good sight lines) and highest during structural steel erection (multiple levels of steel blocking overhead camera views).
BLE sensor coverage fills most of the occlusion gap for zone breach alerts — a worker who's out of camera sight is still detectable by their BLE badge position within the mesh network. But BLE positioning at 2-meter accuracy can't confirm PPE compliance, fall posture, or the detailed behavioral information that camera detection provides. The safety value of the BLE layer in occluded zones is position and movement data only.
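To make the zone-breach logic concrete, here is a minimal sketch of how a breach check on BLE badge position might look. This is an illustration, not our production code: the `Zone` class, the circular zone shape, and the function names are hypothetical, but the key idea is real — because BLE positioning is only accurate to about 2 meters, the check has to inflate the zone boundary by that uncertainty rather than test the raw position.

```python
import math
from dataclasses import dataclass

BLE_ACCURACY_M = 2.0  # horizontal positioning accuracy of the badge mesh


@dataclass
class Zone:
    """A circular restricted zone (real deployments would use polygons)."""
    cx: float       # zone center x, meters
    cy: float       # zone center y, meters
    radius_m: float


def zone_breach(badge_x: float, badge_y: float, zone: Zone) -> bool:
    """Flag a breach when the badge position, inflated by the 2 m
    positioning uncertainty, could overlap the restricted zone."""
    dist = math.hypot(badge_x - zone.cx, badge_y - zone.cy)
    return dist <= zone.radius_m + BLE_ACCURACY_M
```

Note the consequence of the buffer: a worker standing 2.5 meters from a 1-meter zone still triggers the alert, because at 2-meter accuracy the system cannot rule out that they are inside it. That conservatism is the right trade-off for zone breaches, but it is also exactly why BLE cannot confirm fine-grained facts like fall posture.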
Adverse weather
Heavy rain significantly degrades camera-based PPE detection. At rainfall rates above 1 inch per hour — which happens in Houston on approximately 20-25 days per year — camera image quality drops to the point where our PPE detection true positive rate falls to 72-78%. That's the level we observed; we don't publish that number in our standard marketing materials, which is why we're writing it here instead.
Dense fog has a similar effect. Direct sun at low angles (first and last 30 minutes of daylight) creates glare conditions that affect specific camera orientations. Extreme heat shimmer above hot surfaces at low camera angles creates image distortion that increases false positive rates.
Our recommendation for sites with regular adverse weather: supplement camera-based detection with more frequent human walk-throughs during weather events, and configure the alert system to notify supervisors when weather conditions are affecting camera performance. We generate a camera quality score per stream per hour; when the quality score drops below 0.65, a "reduced coverage" notification goes to the site supervisor. That's not a fix — it's transparency about current coverage quality so the supervisor can compensate.
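The thresholding behavior described above is simple enough to sketch. The function below is a hypothetical illustration (the names and the dict-based input are ours, not the platform API); it shows the core decision: each stream's hourly quality score is compared against the 0.65 threshold, and any stream below it is flagged for the supervisor's reduced-coverage notification.

```python
QUALITY_THRESHOLD = 0.65  # per-stream hourly score below this => reduced coverage


def reduced_coverage_streams(hourly_scores: dict[str, float],
                             threshold: float = QUALITY_THRESHOLD) -> list[str]:
    """Return the camera streams whose latest hourly quality score has
    dropped below the threshold, i.e. the streams a supervisor should
    treat as degraded coverage until conditions improve."""
    return sorted(stream for stream, score in hourly_scores.items()
                  if score < threshold)
```

A heavy-rain hour might produce scores like `{"cam-07": 0.58, "cam-12": 0.81}`, in which case only `cam-07` would appear in the notification.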
Novel PPE variants not in training distribution
Our model was trained on 280,000+ labeled construction images covering 14 hard hat color classes and 8 vest styles. When a subcontractor brings PPE that doesn't appear in the training distribution — an unusual hard hat shell shape, a reflective vest with a non-standard reflective strip pattern, or company-branded PPE with colors outside the standard range — detection accuracy drops until the model is updated with examples of the new variant.
During the first week of our Houston pilot, a structural steel subcontractor brought workforce members wearing white hard hats with a non-standard shell profile. Our initial detection model had seen fewer than 200 examples of this exact shell shape, and the true positive rate for that workforce was 81% until we collected 400 site-specific labeled examples and updated the model at week 4. After retraining: 96.2% for that PPE variant.
This is manageable but not invisible. Any deployment on a site that will have diverse subcontractor PPE should plan for a 2-4 week calibration period per new PPE variant that appears in significant volume. Projects where the GC controls PPE procurement and standardizes on a single hard hat and vest supplier have fewer calibration events and sustained higher detection rates.
Behavioral context the system cannot interpret
Computer vision detects what is visible in a frame. It does not understand why a worker is in a particular configuration. A worker kneeling to inspect a surface near an unguarded edge looks, to the camera system, like a worker near an edge fall hazard — because they are. The system will alert. Whether the worker is there for a legitimate inspection activity with appropriate fall protection awareness, or whether they're genuinely at risk, is a judgment call that requires context the camera doesn't have.
Similarly, a worker temporarily removing their hard hat to adjust their sweatband or take a phone call generates a PPE compliance alert. That alert may be technically correct (no hard hat = non-compliance) but contextually irrelevant (30-second break in a low-risk area). Supervisors learn to distinguish these patterns, but new supervisors go through an adjustment period where alert response times are longer because they're evaluating more candidate events.
The system does not detect: worker fatigue, impaired behavior from substances or medications, inattention to work hazards, failure to follow verbal instructions, or the dozens of behavioral factors that contribute to incidents but aren't visible as PPE absence or physical zone violations. A safety monitoring platform complements human supervision; it does not replace the judgment of an experienced safety professional who knows the site and the workforce.
Multi-level tracking at height
Our BLE positioning provides floor-level elevation tracking at approximately 1.5-meter vertical accuracy — sufficient to distinguish between floor levels in standard construction. What it cannot provide is tracking accuracy within an elevated work area: on scaffolding, elevated platforms, or ladder access routes. Workers on a 10-story scaffold system are tracked to within 2 meters horizontally at their elevation level, but the system cannot determine whether a worker is 2 feet from a scaffold edge or 6 feet from a scaffold edge. Scaffold fall protection verification at that resolution requires physical inspection, not remote monitoring.
Response is still required
The system generates alerts. Alerts require human response. If the site supervisor is unavailable, in a meeting, or has their device silenced, alerts pile up unacknowledged. Our alert escalation feature — which re-alerts a backup contact if the primary contact doesn't acknowledge within 90 seconds — addresses the most common failure mode. But escalation requires that a backup contact exist, be available, and be able to respond. No technology system compensates for understaffed supervision.
On the Houston pilot, the median alert acknowledgment time was 38 seconds from first notification. The 95th percentile was 4.2 minutes — most commonly attributable to supervisors who were in areas with poor cellular reception, including interior zones during foundation phases. We've added offline alert queuing so alerts accumulate and notify immediately when connectivity is restored, but the 4-minute gap exists and matters for proximity alerts.
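The offline-queuing behavior can be sketched in a few lines. This is a simplified stand-in for the real mechanism (class and method names are hypothetical): alerts raised while the device is offline accumulate in order, and all of them are delivered the moment connectivity returns — which is also why the 4-minute gap still exists, since queuing preserves alerts but cannot make them arrive sooner.

```python
from collections import deque


class OfflineAlertQueue:
    """Buffer alerts while the supervisor's device has no connectivity,
    then flush the whole backlog when the connection is restored."""

    def __init__(self) -> None:
        self._pending: deque[str] = deque()
        self.online: bool = True

    def push(self, alert_id: str) -> list[str]:
        """Deliver immediately if online; otherwise queue for later."""
        if self.online:
            return [alert_id]
        self._pending.append(alert_id)
        return []

    def reconnect(self) -> list[str]:
        """Connectivity restored: deliver everything that accumulated,
        oldest first, and clear the backlog."""
        self.online = True
        delivered = list(self._pending)
        self._pending.clear()
        return delivered
```

For a proximity alert, the reconnect-time flush is better than losing the alert outright, but it is not a substitute for a supervisor who can be reached in real time.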
What to do with this information
A safety manager who understands these limitations can design around them: increase human inspection frequency during adverse weather, build subcontractor PPE standardization requirements into contracts, assign backup alert recipients for every shift, and plan camera placements with dead zone analysis before installation rather than after.
The gap between what the system detects and what it misses should be mapped against the site's specific hazard profile. For a foundation project with good sight lines, minimal adverse weather, and a single subcontractor, the gap is small. For a multi-level steel erection project with diverse subcontractors and frequent Gulf Coast weather events, the gap is larger and requires more active supplementation with human observation.
We share documented detection rate data from our pilot — including the adverse weather numbers and the novel PPE variant degradation — with all prospective customers during the evaluation process. If a vendor won't share their failure mode data, that's informative. Contact us at contact@hardhatpulse.com and ask for the raw numbers. We'll give them to you.