We are developing a system for inferring safety context on construction sites by fusing data from wearable devices, distributed sensing infrastructure, and video. Wearable sensors stream real-time measurements of hazardous gas concentrations, dust, noise, light quality, precise altitude, and motion to base stations that synchronize the mobile devices, monitor the environment, and capture video. Context mined from these data is used to highlight salient elements in the video stream for monitoring and decision support in a control room. We tested our system in an initial user study on a construction site, instrumenting a small number of steel workers and collecting data. A recently completed hardware revision will be followed by further user testing and interface development.
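As a concrete illustration of the kind of data the system fuses, the minimal sketch below shows one possible message format for a wearable sample streamed to a base station; the field names, units, and JSON serialization are assumptions for illustration only, not the deployed protocol.

```python
import json
import time
from dataclasses import dataclass, asdict

# Hypothetical wearable sample; field names and units are illustrative
# assumptions, not the actual message format used by the system.
@dataclass
class WearableSample:
    worker_id: str
    timestamp: float        # seconds since epoch, synchronized by the base station
    gas_ppm: float          # hazardous gas concentration
    dust_mg_m3: float       # particulate concentration
    noise_db: float         # sound pressure level
    illuminance_lux: float  # light quality
    altitude_m: float       # precise altitude
    accel_g: tuple          # (x, y, z) motion

def encode_sample(sample: WearableSample) -> bytes:
    """Serialize one sample for streaming to a base station."""
    return json.dumps(asdict(sample)).encode("utf-8")

if __name__ == "__main__":
    s = WearableSample("worker-07", time.time(),
                       4.2, 0.8, 87.5, 320.0, 41.3, (0.01, -0.02, 0.98))
    print(encode_sample(s))
```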