dashboard could be manually refreshed to bring in new data every 15 seconds, a level of data freshness generally not possible with other sensors. Both the mine operators (management, industrial hygienist, foreman) and the researchers
had access to the live data and could monitor and download the data at any time. A second approach was developed by NIOSH to streamline the process and to surface valuable information for the mine operators. We wrote a series
of scripts in R that generated weekly automated emails and
PDF visuals of the dust levels that week.
The weekly reports were generated through a custom R
script that calls the monitors through an application pro-
gramming interface (API), downloads the data, appends
it to a locally maintained master file each night, creates
the visuals, and sends a weekly email directly to the mine
industrial hygienist, foreman, and management as well as
to the NIOSH researchers. Because the data is transmitted at such a high frequency, the script outputs three .csv files: the first averages the data by minute, the second averages it by hour, and the third calculates a time-weighted average (TWA) for an 8-hour shift from 8 a.m. to 4 p.m. The TWA
value for each day is then multiplied by the average percent
crystalline silica content seen at this mine in the last three
years to approximate the silica concentration at that specific
location.
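The averaging and TWA steps described above can be sketched as follows. The authors' actual pipeline is written in R and pulls data from the manufacturer's API; this is a minimal stdlib-Python illustration with invented readings, and the silica fraction shown is a placeholder, not the mine's real three-year value.

```python
from collections import defaultdict
from datetime import datetime, timedelta

def average_by(readings, key_fmt):
    """Average (timestamp, pm10) pairs after truncating timestamps to key_fmt."""
    buckets = defaultdict(list)
    for ts, value in readings:
        buckets[ts.strftime(key_fmt)].append(value)
    return {k: sum(v) / len(v) for k, v in sorted(buckets.items())}

def shift_twa(readings, start_hour=8, end_hour=16):
    """TWA over the 8 a.m.-4 p.m. shift; with equal-interval samples this
    reduces to a simple mean of the in-shift readings."""
    shift = [v for ts, v in readings if start_hour <= ts.hour < end_hour]
    return sum(shift) / len(shift) if shift else None

# Hypothetical PM10 readings at the monitors' 15-second reporting interval
start = datetime(2022, 11, 1, 8, 0, 0)
readings = [(start + timedelta(seconds=15 * i), 40.0 + i % 4) for i in range(8)]

by_minute = average_by(readings, "%Y-%m-%d %H:%M")  # first .csv: per-minute means
by_hour = average_by(readings, "%Y-%m-%d %H")       # second .csv: per-hour means
twa = shift_twa(readings)                           # third .csv: 8-hour shift TWA

SILICA_FRACTION = 0.10  # placeholder for the mine's 3-year average silica content
silica_estimate = twa * SILICA_FRACTION             # approximate silica concentration
```

The same idea carries over directly to the R script: group the raw feed by minute and by hour, take means, then scale the shift TWA by the historical silica fraction.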
RESULTS
Correcting the data coming from any type of LCDM can be important given the known limitations of the sensors in these devices, the fact that these devices are not traditionally used in industrial settings, and the possibility that the internal calibration is not set for the specific environment in which a device is being used. For this specific mine and
application, we saw no need to correct the PM10 data that
we were receiving from the monitors after comparing it to
the gravimetric data for respirable dust that was collected
while we were on-site setting up the system. For that reason,
we considered the PM10 data to be an acceptable represen-
tation of respirable dust concentration data. It should be
mentioned that there may be a need to periodically collect
gravimetric samples throughout the use of the LCDMs to
ensure there is no drift in the instruments, although for this
early pilot study, we were unable to collect such data.
One of the main questions that the mining company
was hoping to answer was whether the size of the screens
used to size-select the product impacted the airborne dust
levels. The screen house is a seven-story structure where the
sand is lifted to the top via a bucket lift and moves to the
bottom through a series of screens on each level. With that
in mind, three of the sensors were placed in the screen house
at the mine, all at ground level. This placement was primarily driven by a poor Wi-Fi signal on the upper floors; had this not been a limitation, placing monitors on multiple floors might have led to a more informative data analysis. For the other two sensors, the mine
wanted to better understand the dust levels when the product was transitioning in and out of the dry houses, so we placed one sensor on the ground floor of each drying building. These buildings had relatively open interiors, and the sensors were again placed at ground level due to the availability of Wi-Fi.
We were not given any production data from the mine and
thus were unable to make any correlations between screen
size and dust levels in the screen house. However, through
working closely with the mining company and explaining
how to read and understand the data, the on-site indus-
trial hygienist found that there was a correlation between
screening size and dust levels (data not shown).
One of the most important outcomes of the interac-
tions with the mine is the creation of an automated script
that sends weekly reports to both the mine and the researchers. We found that relying only on the live feed was overly time-consuming and did not provide a representative snapshot of overall dust levels across all monitors. Additionally, the sensor manufacturer's dashboard offered some ability to search through the data, but a user had to know what to look for, and the dashboard was not well suited to observing overall trends. As an example,
Figure 1 is generated using 1 month of data from the screen
1 sensor. The first step taken to clean up the data was to
take the raw data from the sensor and average it by min-
ute and display it as a line chart (Figure 1A). This is the
fundamental building block of the data, and this type of
visualization is routinely accessed when looking for a spe-
cific time of an event (startup, shutdown, or shift change).
While very useful, line charts alone make it hard to observe
longer-term trends, and so through communications with
the mine we began to add heatmaps to the weekly reports
(Figure 1B). The heatmap data is averaged by hour, making
the data much less dense and easier to understand. Both panels in Figure 1 represent the same data, but panel B makes it immediately apparent that there is more red, indicating higher dust levels, on November 1, 2, 29, and 30 than on the other days. While the same information can be discerned from panel A, it is easier and faster to find in panel B.
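The hourly heatmap layout behind panel B amounts to pivoting the hourly averages into a day-by-hour grid, with one row per day and one column per hour, so that high-dust days stand out as bands of color. A minimal Python sketch of that pivot (the authors' reports were generated in R, and the data here is invented):

```python
def heatmap_grid(hourly_averages):
    """Pivot {(date_string, hour): pm10} hourly averages into a day-by-hour
    grid, the layout behind a calendar-style heatmap (days on the y-axis,
    hours on the x-axis). Hours with no data become None."""
    days = sorted({day for day, _ in hourly_averages})
    grid = [[hourly_averages.get((day, hour)) for hour in range(24)]
            for day in days]
    return days, grid

# Hypothetical hourly PM10 averages for two shifts (8 a.m. to 4 p.m.)
hourly = {("2022-11-01", h): 80.0 for h in range(8, 16)}  # a high-dust day
hourly.update({("2022-11-02", h): 20.0 for h in range(8, 16)})  # a low-dust day

days, grid = heatmap_grid(hourly)
# Each row of `grid` is one day; when colored by value, the 80.0 cells
# render as the "more red" band the report readers scan for.
```

From here, any plotting library (e.g. ggplot2's geom_tile in the authors' R workflow) can render the grid directly as a heatmap.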
After using the combination of the heatmaps and the
line charts for a few months, the mine felt they had a bet-
ter understanding of the sensors and wanted an even more
simplified weekly report. To accomplish this, we decided