
Psychological Evaluation of Virtual Reality Applied to Safety Training

04/14/22 Table Read for The Art & State of Safety Journal Club

Excerpts and comments from “A Transferable Psychological Evaluation of Virtual Reality Applied to Safety Training in Chemical Manufacturing”

Published as part of the ACS Chemical Health & Safety joint virtual special issue “Process Safety from Bench to Pilot to Plant” in collaboration with Organic Process Research & Development and Journal of Loss Prevention in the Process Industries.

Matthieu Poyade, Claire Eaglesham,§ Jordan Trench,§ and Marc Reid*

The full paper can be found here: https://pubs.acs.org/doi/abs/10.1021/acs.chas.0c00105

1. Introduction

Safety in Chemical Manufacturing

Recent high-profile accidents—on both research and manufacturing scales—have provided strong drivers for culture change and training improvements. While our focus here is on process-scale chemical manufacturing,[a][b][c][d][e][f][g][h][i][j] similarly severe safety challenges exist on the laboratory scale; such dangers have been extensively reviewed recently. Through consideration of emerging digitalization trends and perennial safety challenges in the chemical sector, we envisaged interactive and immersive virtual reality (VR) as an opportune technology for developing safety training and accident readiness for those working in dangerous chemical environments.

Virtual Reality

VR enables interactive and immersive real-time task simulations across a growing wealth of areas. In higher education, prelab training in VR[k][l][m] has the potential to address such training challenges, giving students multiple attempts to complete core protocols virtually in advance of experimental work and creating the time and space to practice outside of the physical laboratory.

Safety Educational Challenges

Safety education and research have evolved with technology, moving from videos on cassettes to online formats and simulations. However, effective safety education remains a challenge, and very recent work has demonstrated that there must be an active link between pre-laboratory work and laboratory work in order for the advance work to have impact.

Study Aims

The primary question for this study can be framed as follows: When evaluated on a controlled basis, how do two distinct training methods[n][o][p][q][r][s], (1) VR training and (2) traditional slide training[t][u][v] (displayed as a video to ensure consistent delivery of the training), compare for the same safety-critical task?

We describe herein the digital recreation of a hazardous facility using VR to provide immersive and proactive safety training. We use this case to deliver a thorough statistical assessment of the psychological principles of our VR safety training platform versus the traditional non-immersive training (the latter still being the de facto standard for such live industrial settings).

Figure 3. Summarized workflow for safety training case identification and comparative assessment of PowerPoint video versus VR.

2. Methods

After completing their training, participants were required to fill in standardized questionnaires that formally assessed five measures of their training experience.

1. Task-Specific Learning Effect

Participants’ post-training knowledge of the ammonia offload task was assessed in an exam-style test composed of six task-specific open questions.

2. Perception of Learning Confidence

How well participants perform on a training exam and how they feel about the overall learning experience are not the same thing. Participant experiences were assessed through 8 bespoke statements, which were specifically designed for the assessment of both training conditions.

3. Sense of Perceived Presence

“Presence” can be defined as the depth of a user’s imagined sensation of “being there” inside the training media they are interacting with.

4. Usability

From the field of human–computer interaction, the System Usability Scale (SUS) has become the industry standard for assessing a system’s performance and fitness for its intended purpose… A user rates their level of agreement with ten statements on a Likert scale, resulting in a score out of 100 that can be converted to a grade A–F. In our study, the SUS was used to evaluate the subjective usability of our VR training system.

[Figure 5: contextual SUS scores for familiar systems such as Excel, Google, and Amazon][w][x][y]
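For readers unfamiliar with how a SUS questionnaire becomes a single 0–100 score, the sketch below applies the standard SUS scoring rule (odd-numbered items contribute the response minus 1, even-numbered items contribute 5 minus the response, and the total is multiplied by 2.5). The example responses are hypothetical, and the grade note is based on commonly cited SUS benchmark tables rather than anything in the paper beyond its reported 79.6 ≈ "A−".

```python
def sus_score(responses):
    """Compute a standard System Usability Scale score (0-100).

    `responses` is a list of 10 Likert ratings (1 = strongly disagree,
    5 = strongly agree), in questionnaire order. Odd-numbered items are
    positively worded; even-numbered items are negatively worded.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects 10 Likert responses between 1 and 5")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # 0-based index: even index = odd-numbered item
        for i, r in enumerate(responses)
    ]
    return 2.5 * sum(contributions)

# Hypothetical respondent; a score near 80 maps to roughly an "A-" on
# published SUS benchmark tables (consistent with the paper's ~79.6 result).
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 2]))  # 80.0
```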

5. Sentiment Analysis

Transcripts of participant feedback—from both the VR and non-VR safety training groups—were analyzed with the Linguistic Inquiry and Word Count (LIWC, pronounced “Luke”) program. Therein, the unedited and word-ordered text structure (the corpus) was analyzed against the LIWC default dictionary, outputting the percentage of words fitting various psychological descriptors. Most importantly for this study, the percentages of words labeled with positive or negative affect (or emotion) were captured to enable quantifiable comparison between the VR and non-VR feedback transcripts.
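As a rough illustration of the dictionary-based counting that LIWC performs, a minimal sketch is shown below; the word lists and the example sentence are placeholders, and the real LIWC dictionary, its stemming rules, and its category hierarchy are proprietary and far more extensive.

```python
import re

# Placeholder affect word lists; the actual LIWC dictionaries contain
# thousands of entries with stemming and category hierarchies.
POSITIVE = {"love", "nice", "sweet", "good", "great", "enjoyable"}
NEGATIVE = {"hurt", "nasty", "ugly", "bad", "confusing", "frustrating"}

def affect_percentages(corpus: str) -> dict:
    """Return the percentage of words in the corpus matching each affect list."""
    words = re.findall(r"[a-z']+", corpus.lower())
    total = len(words) or 1  # avoid division by zero on an empty transcript
    return {
        "positive_%": 100 * sum(w in POSITIVE for w in words) / total,
        "negative_%": 100 * sum(w in NEGATIVE for w in words) / total,
    }

print(affect_percentages("The VR training was great and enjoyable, though a bit confusing at first."))
```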

3. Results

Safety Training Evaluation

Having created a bespoke VR safety training platform for the GSK ammonia offloading task, we could formally assess the value of this modern training approach versus GSK’s existing training protocols. Crucial to this assessment was bringing together experimental methods, focused on psychological principles, that are not yet commonly applied in chemical health and safety training assessment (Figure 3). All results presented below are summarized in Figure 8 and Table 1.

Figure 8. Summary of the psychological assessment of VR versus non-VR (video slide-based) safety training. (a) Task-specific learning effect. (b) Perception of learning confidence. (c) Assessment of training presence. (d) VR system usability score. In parts b and c, * and ** represent statistically significant results with p < 0.05 and p < 0.001, respectively.

1. Task-Specific Learning Effect (Figure 8a)

Task-specific learning for the ammonia offload task was assessed using a questionnaire[z][aa][ab][ac] built upon official GSK training materials and marking schemes. Overall, test scores from the Control group and the VR group showed no statistical difference between groups. However, there was a tighter distribution[ad] around the mean score for the VR group than for the Control group.
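For illustration only, the snippet below shows the kind of between-group comparison such a result implies (a difference-of-means test plus a look at the spread around the mean). The scores are made up, and the excerpt does not specify which statistical test the authors actually used; Welch’s t-test is shown purely as a stand-in.

```python
import numpy as np
from scipy import stats

# Hypothetical exam scores out of 100 (not the study's data).
control_scores = np.array([55, 62, 71, 78, 80, 84, 66, 90, 48, 73])
vr_scores = np.array([68, 70, 72, 74, 75, 77, 71, 79, 69, 73])

# Difference in means: Welch's two-sided t-test (an assumed stand-in;
# the paper may have used a different test).
t_stat, p_value = stats.ttest_ind(control_scores, vr_scores, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # p >= 0.05 here, i.e. no significant difference

# Spread around the mean: a tighter distribution means a smaller standard deviation.
print(f"Control SD = {control_scores.std(ddof=1):.1f}, VR SD = {vr_scores.std(ddof=1):.1f}")
```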

2. Perception of Learning Confidence (Figure 8b)

Participants’ perceived confidence in having gained new knowledge was assessed using a questionnaire composed of 8 statements, probing multiple aspects of the learning experience… Within a 95% confidence limit, the VR training method was perceived by participants to be significantly more fit for training purpose than video slides. VR also gave participants confidence that they could next perform the safety task alone[ae][af][ag]. Moreover, participants rated VR as having more potential than traditional slides for helping train in other complex tasks and for improving decision-making skills (Figure 8b). Overall, participants from the VR group felt more confident and prepared for on-site training than those from the Control group.[ah]

3. Sense of Perceived Presence (Figure 8c)

The Sense of Presence questionnaire was used to gauge participants’ overall feeling of training involvement across four key dimensions. Results show that participants from the VR group reported experiencing a higher sense of presence than those from the Control group. On the fourth dimension, negative effects, participants from the Control group reported experiencing more negative effects than those from the VR group, but the difference was not statistically significant (Figure 8c). [ai]

4. Usability of the VR Training Platform (Figure 8d)

The System Usability Scale (SUS) was used to assess the effectiveness, intuitiveness, and satisfaction with which participants were able to achieve the task objectives within the VR environment. The average SUS score recorded was 79.559 (∼80, or grade A−), which placed our VR training platform on the edge of the top 10% of SUS scores (see Figure 5 for context). The SUS result indicated an overall excellent experience for participants in the VR group.

[Participants also] disagreed with any notion that the VR experience was too long (1.6 ± 0.7) and did not think it was too short (2.5 ± 1.1). Participants agreed that the simulation was stable and smooth (3.9 ± 1.2) and disagreed that it was in any way jaggy (2.3 ± 0.8). Hand-based interactions with the VR environment were agreed to be relatively intuitive (3.8 ± 1.3), and the head-mounted display was found to provide agreeable comfort for the duration of the training (4.0 ± 0.9).

5. Sentiment Analysis of Participant Feedback (Table 1)

In the final part of our study, we aimed to corroborate the formal statistical analysis against a quantitative analysis of open participant feedback. Using the text-based transcripts from both the Control and VR group participant feedback, the Linguistic Inquiry and Word Count (LIWC) tool provided further insight based on the emotional sentiment hidden in the plain text. VR participants were found to use more positively emotive words (4.2% of the VR training feedback corpus) versus the Control group (2.1% of the video training feedback corpus). More broadly, the VR group displayed a more positive emotive tone and used fewer negatively emotive words than the Control group.

Table 1. LIWC-Enabled Sentiment Analysis of Participant Feedback Transcripts

LIWC variable | brief description | VR group | non-VR group
word count | no. of words in the transcript | 1493 | 984
emotional tone | difference between positive and negative words (<50 = negative) | 83.1 | 24.2
positive emotion | % positive words (e.g., love, nice, sweet) | 4.2% | 2.1%
negative emotion | % negative words (e.g., hurt, nasty, ugly) | 1.1% | 2.2%

4. Discussion

Overall, using our transferable assessment workflow, the statistical survey analysis showed that task-specific learning was equivalent[aj][ak][al][am][an] for the VR and non-VR groups. This suggests that VR training in this specific context is not detrimental to learning and appears to be as effective as the traditional training modality but, crucially, with improved user investment in the training experience. However, the difference in score distributions between the two training modalities suggests that the VR training provided a more consistent experience across participants than watching video slides, though more evaluation would be required to verify this.

In addition, perceived learning confidence and sense of perceived presence were both reported to be significantly better for the VR group than for the non-VR group. The reported differences in perceived learning confidence between the two groups suggest that participants in the VR group, despite having acquired a similar amount of knowledge, felt more assured about the applicability of that knowledge. These findings thus suggest that the VR training was a more engaging and psychologically involving modality, able to increase participants’ confidence in their own learning. [ao][ap][aq]Further research will also aim to explore the applicability and validation of the perceived learning confidence questionnaire introduced in this investigation.

Additionally, VR system usability was quantifiably excellent, according to the SUS score and feedback text sentiment analysis.

Although our experimental data demonstrate the value of the VR modality for health and safety training in chemical manufacturing settings, the sampling, and more particularly the variation in digital literacy[ar] among participants, may be a limitation of the study. Therefore, future research should explore the training validity of the proposed approach with a homogeneously digitally literate cohort of participants to more rigorously measure knowledge development between experimental conditions.

4.1. Implications for Chemical Health and Safety Training

By requiring learners to complete core protocols virtually in advance of real work, VR pretask training has the potential to address issues of complex learning, knowledge retention[as][at][au], training turnover times, and safety culture enhancements. Researchers in the Chemical and Petrochemical Sciences operate across an expansive range of sites, from small laboratories to pilot plants and refineries. Therefore, beyond valuable safety simulations and training exercises, outputs from this work are envisaged to breed applications where remote virtual or augmented assistance can alleviate the significant risks to staff on large-scale manufacturing sites.

4.2. Optimizing Resource-Intensive Laboratory Spaces

As a space category in buildings, chemistry laboratories are significantly more resource-intensive than office or storage spaces. The ability to deliver virtual chemical safety training, as demonstrated herein, could serve toward the consolidation and recategorization of laboratory space, minimizing the utility and space expenditures that threaten sustainability.[av][aw][ax][ay][az][ba][bb] By developing new chemistry VR laboratories, institutions could significantly reduce the high utility bills[bc][bd][be][bf] associated with running physical chemistry laboratories.

4.3. Bridging Chemistry and Psychology

By bringing together psychological and computational assessments of safety training, the workflow applied herein could serve as a blueprint for future developments in this emerging multidisciplinary research domain. Indeed, the need to bring together chemical and psychological skill sets was highlighted in the aforementioned safety review by Trant and Menard.

5. Conclusions

Toward a higher standard of safety training and culture, we have described the end-to-end development of a VR safety training platform deployed in a dangerous chemical manufacturing environment. Using a specific process chemical case study, we have introduced a transferable workflow for the psychological assessment of an advanced training tool versus traditional slide-based safety training. This same workflow could conceivably be applied to training developments beyond safety.

Comparing our VR safety training with GSK’s established training protocols, we found no statistical difference in the task-specific learning[bg] achieved in VR versus traditional slide-based training. However, statistical differences, in favor of VR, were found for participants’ positive perception of learning confidence and for their training presence (or involvement) in what was being taught. In sum, VR training was shown to help participants invest more in their safety training than a more traditional training format did[bh][bi][bj][bk][bl].

Specific to the VR platform itself, the standard System Usability Scale (SUS) found that our development ranked as “A−” (a score of ∼80 out of 100), placing it toward an “excellent” rating and well within the level of acceptance needed to deliver competent training.

Our ongoing research in this space is now extending into related chemical safety application domains.

[a]I would think the expense of VR could be seen as more “worth it” at this scale rather than at the lab scale given how much bigger and scarier emergencies can be (and how you really can’t “recreate” such an emergency in real life without some serious problems).

[b]Additionally, I suspect that in manufacturing there is more incentive to train up workers outside of the lecture/textbook approach. Many people are hands on learners and tend to move into the trades for that reason.

[c]I was also just about to make a comment that the budget for training and purchasing and upkeep for VR equipment is probably more negligible in those environments compared to smaller lab groups

[d]Jessica…You can create very realistic large scale simulations.  For example, I have simulated 50-100 barrel oil spills on water, in rivers, ponds and lakes, with really good results.

[e]Oh – this is a good point. What is not taken into account here is the comfort level people have with different types of learning. It would be interesting to know if Ph.D. level scientists and those who moved into this work through apprenticeship and/or just a BA would’ve felt differently about these training experiences.

[f]Neal – I wasn’t questioning that. I was saying that those things are difficult (or impossible) to recreate in real life – which is why being able to do a simulation would be more attractive for process scale than for lab scale.

[g]The skill level of the participants is not known.  A pilot plant team spans very skilled technicians to PhD-level engineers and other scientists. I do not buy into Ralph’s observation.

[h]Jessica, I disagree.  Simulated or hands-on is really valuable for learning skills that require both conceptual understanding and muscle memory tasks.

[i]I’m not disagreeing. What I am saying is that if you want to teach someone how to clean up a spill, it is a lot easier to spill 50 mL of something in reality and have them practice cleaning it up, than it is to spill 5000 L of something and ask them to practice cleaning it up. Ergo, simulation is going to automatically be more attractive to those who have to deal with much larger quantities.

[j]And the question wasn’t about skill level. It was about comfort with different sorts of learning. You can be incredibly highly skilled and still prefer to learn something hands-on – or prefer for someone to give you a book to read. The educational levels were somewhat being used as proxies for this (i.e. to get a PhD, you better like learning through reading!).

[k]Videos

The following YouTube links provide representative examples of:

i. The ammonia offload training introduction video; https://youtu.be/30SbytSHbrU

ii. The VR training walkthrough; https://youtu.be/DlXu0nTMCPQ

iii. The GSK ammonia tank farm fly-through (i.e. the digital twin used in the VR training);

iv. The video slide training video; https://youtu.be/TZxJDJXVPgM

[l]Awesome – thank you for adding this.

[m]Very helpful. Thank you

[n]I’d be curious to see how a case 3 of video lecture then VR training compares, because this would have a focused educational component and then a focused skill and habit development component.

[o]I’d be interested to see how in person training would compare to these.

[p]This would also open up a can of worms. Is it in-person interactive learning? Is it in-person hands-on learning? Or is it sitting in a classroom watching someone give a PowerPoint in-person learning?

[q]I was imagining hands-on learning or the reality version of the VR training they did, so they could inspect how immersive the training is compared to the real thing. Comparison to an interactive training could also have been interesting.

[r]I was about to add that I feel like comparison to interactive in-person training would’ve been good to see. I tend to think of VR as the same as an in-person training, but just digital.

[s]That’s why I think it could be interesting. They could see if there is in fact a difference between VR and hands-on training. Then if there was none, you would have an argument for a cost saving in space and personnel.

[t]Comparing these two very different methods is problematic. If you are trying to assess the added immersive value of VR then it needed to be compared to another ‘active learning’ method such as computer simulation.

[u]Wouldn’t VR training essentially be a form of computer simulation? I was thinking that the things being compared were 2 trainings that the employees could do “on their own”. At the moment, the training done on their own is typically some sort of recorded slideshow. So they are comparing to something more interactive that is also something that the employee can do “on their own.”

[v]A good comparison could have been VR with the headset and controllers in each hand compared to a keyboard and mouse simulation where you can select certain options. More like Oregon Trail.

[w]I’ve never seen this graphic before, but I love it! Excel is powerful but user-hostile, particularly if you try to share your work with someone else.

Of course, Google and Amazon are cash cows for their platforms, so they have an incentive to optimize System Usability (partially to camouflage their commercial interest in what they are selling).

[x]I found this comparison odd. It is claiming to measure a computer system’s fitness for purpose. Amazon’s purpose for the user is to get people to buy stuff. While it may be complex behind the scenes, it has a very simple and pretty singular purpose. Excel’s purpose for the user is to be able to do loads of different and varied complex tasks. They are really different animals.

[y]Yes, I agree that they are different animals. However, understanding the limits of the two applications requires more IT education than most people get. (Faculty routinely comment that “this generation of students doesn’t know how to use computers”.) As a result, Excel is commonly used for tasks that it is not appropriate for. But you’re correct that Amazon and Google’s missions are much simpler than those Excel is used for.

[z]I’d be very interested to know what would have happened if the learners were asked to perform the offload task.

[aa]Yes! I just saw your comment below. I wonder if they are seeing no difference here because they are testing on a different platform than the one they trained on. It would be interesting to compare written and in-practice results for each group. Maybe the VR group would be worse at the written test but better at physically doing the tasks.

[ab]This could get at my concerns about the Dunning-Kruger effect that I mentioned below as well. Just because someone feels more confident that they can do something doesn’t mean that they are correct about that. It definitely would’ve been helpful to actually have the employees perform the task and see how the two groups compared. Especially since the purpose of training isn’t to pass a test – it is to actually be able to DO the task!

[ac]“Especially since the purpose of training isn’t to pass a test – it is to actually be able to DO the task!” – this

[ad]If I’m understanding the plot correctly, it looks like there is a higher skew in the positive direction for the control group, which is interesting. I.e., lower but also higher scores. It seems to have an evening-out effect, which makes the results of the training more predictable.

[ae]I’m slightly alarmed that VR gives people the confidence to perform the tasks alone when there was no statistical difference in the task-specific learning scores versus the control group. VR seems to give a false sense of confidence.

[af]I had the same reaction to reading this. I don’t believe a false sense of confidence in performing a task alone to be a good thing. Confidence in your understanding is great, but overconfidence in physically performing something can definitely lead to more accidents.

[ag]A comment I’ve seen is that the VR trainees may perform better in doing the action than the control but they only gauged their knowledge in an exam style, not in actually performing the task they are trained to do. But regardless, a false confidence would not be good.

[ah]Wonder if there is concern that this creates a false confidence given the exam scores.

[ai]Wow these are big differences. There is basically no overlap in the error bars except the negative effect dimension.

[aj]If the task-specific learning was equivalent, but one method caused people to express greater confidence than the other, is this necessarily a good thing? Taken to an extreme perhaps, wouldn’t we just be triggering the Dunning-Kruger effect?

[ak]I’d be interested in the value of VR option as refresher training, similar to the way that flight simulators are used for pilots. Sullenberger attributed surviving the bird strike at LaGuardia to lots of simulator time allowing him to react functionally rather than emotionally in the 30 seconds after birds were hit

[al]I had the same comment above. I was wondering if people should be worried about false confidence. But the problem with the assessment was that it was on paper; they may be better at physically doing the tasks or responding in real time, but that was not tested.

[am]Exactly, Kali!

[an]From Ralph’s comment, I think there is a lot of value in VR for training for and simulating emergency situations without any of the risks.

[ao]Again, this triggers the question: should they be more confident in this learning?

[ap]I don’t think the available data let us distinguish between an overconfident test group and an under-confident control group (or both).

[aq]I am curious what the standard in the field is for determining at what point learners are overconfident. For example, are these good scores? Is there a correlation between higher test scores and confidence, or an inverse one indicating false confidence?

[ar]I am curious as well how much training there needed to be to explain how the VR setup worked. That is a huge factor to how easy people will perceive the VR setup to be if someone walks them through it slowly, versus walking up to a training station and being expected to know how to use it.

[as]I suspect that if people are more engaged in the VR training, they would have better retention of the training over time; that would be interesting to explore. If so, that would be a great argument for VR.

[at]Right and if places don’t have the capacity to offer hands on training, an alternative to online-only is VR. Or, in high hazard work where it’s not advisable to train under the circumstances.

[au]I agree that there is a lot of merit for VR in high-hazard or emergency response simulation and training.

[av]I’m personally more interested in the potential to use augmented reality in an empty lab space to train researchers to work with a variety of hazards/operations.

[aw]Right, this whole time I wasn’t thinking about replacing labs I was thinking about adding a desk space for training. I am still curious about the logistics though, in the era of COVID if people will really be comfortable sharing equipment and how it will be maintained, because that all costs money as well.

[ax]This really would open up some interesting arguments. Would learning how to do everything for 4 years in college by VR actually translate to an individual being able to walk into a real lab and work confidently?

[ay]From COVID, some portion of the labs they were able to do virtually – like moving things, etc. From a personal standpoint, I’d say no, and like we talked about here – it would give a false sense of confidence, which could be detrimental. It’s the same way I feel about “virtual fire extinguisher training.” I don’t think it provides any of the necessary exposure.

[az]Amanda, totally agree.  The virtual fire extinguisher training does not provide the feel (heat), smell (fire & agent), or sound of a live fire exercise.

[ba]Oh wait – virtual training and “virtual reality training” are pretty different concepts. I agree that virtual training would never be able to substitute for hands-on experience. However, what VR training has been driving at for years is to try to create such a realistic environment within which to perform the training that it really does “feel like you are there.” I’m not sure how close we are to that. In my experiences with VR, I haven’t seen anything THAT good. But I likely haven’t been exposed to the real cutting edge.

[bb]Jessica, I wouldn’t recommend that AR/VR ever be used as the only training provided,  but I suspect that it could shorten someone’s learning curve.

Amanda and Neal, having done both, I think that both have their advantages. The heat of the fire and kickback from the extinguisher aren’t well-replicated, but I actually think that the virtual extinguisher can do a better job of simulating the difficulty. When I’ve done the hands-on training, you’d succeed in about 1-2 seconds as long as you pointed the extinguisher in the general vicinity of the fire.

[bc]Utility bills for lab spaces are higher than average, but a small fraction of the total costs of labs. At Cornell, energy costs were about 5% of the cost of operating the building when you took the salaries of the people in the building into account. This explains why utility costs are not as compelling to academic administrators as they might seem. However, if there are labor cost savings associated with VR…

[bd]I’d love to see a breakdown of the fixed cost of purchasing the VR equipment, and the incremental cost of developing each new module.

[be]Ralph, it is so interesting to see the numbers; I had no idea it was that small of an amount. I suppose the argument might need to be wasted energy / environmental considerations then, rather than cost savings.

[bf]Yes, that is why I had a job at Cornell – a large part of the campus community was very committed to moving toward carbon neutrality and supported energy conservation projects even with long payback periods (e.g. 7 years).

[bg]It seems like it would be more helpful to have a paper-test and field-test to see if VR helped with physically doing the tasks, since that is the benefit that I see. In addition, looking at retention over time would be important. Otherwise the case is harder to make for VR if it’s not increasing the desired knowledge and skills

[bh]How much of the statistical favoring of VR was due to the VR platform being “cool” and “new” versus the participants’ familiarity with traditional approaches? I do not see any control in the document for this.

[bi]I agree that work as reported seems to be demonstrating that the platform provides an acceptable experience for the learner, but it’s not clear whether the skills are acquired or retained.

[bj]Three of the 4 authors are in the VR academic environment; one is in a chemistry department.  There seems to be a real bias toward showing that VR works in this setting.

[bk]One application I think this could help with is one I experienced today. I had to sit through an Underground Storage Tank training which was similar to the text-heavy PowerPoint in the video. If I had been able to self-direct my tour of the tank and play with the valves and detection systems, I would have been able to complete the regulatory requirements of understanding the system well enough to oversee its operations, but not well enough to have hands-on responsibilities. The hands-on work is left to the contractors who work on the tanks every day.

[bl]We have very good results using simulators in the nuclear power and aviation industries. BUT, there is still significant learning that occurs when things become real.  The most dramatic example is landing high-performance aircraft on carrier decks.  Every pilot agrees that nothing prepares them for the reality of this controlled crash.

Positive Feedback for Improved Safety Inspections

Melinda Box, MEd, presenter
Ciana Paye, &
Maria Gallardo-Williams, PhD 
North Carolina State University

Melinda’s powerpoint can be downloaded from here.

03/17 Table Read for The Art & State of Safety Journal Club

Excerpts from “Positive Feedback via Descriptive Comments for Improved Safety Inspection Responsiveness and Compliance”

The full paper can be found here: https://pubs.acs.org/doi/10.1021/acs.chas.1c00009

Safety is at the core of industrial and academic laboratory operations worldwide and is arguably one of the most important parts of any enterprise since worker protection is key to maintaining on-going activity.28 At the core of these efforts is the establishment of clear expectations regarding acceptable conditions and procedures necessary to ensure protection from accidents and unwanted exposures[a]. To achieve adherence to these expectations, frequent and well-documented inspections are made an integral part of these systems.23 

Consideration of the inspection format is essential to the success of this process.31 Important components to be mindful of include the frequency of inspections, inspector training, degree of recipient involvement, and means of documentation[b][c][d][e]. Within documentation, the form used for inspection, the report generated, and the means of communicating, compiling, and tracking results all deserve scrutiny in order to optimize inspection benefits.27

Within that realm of documentation, inspection practice often depends on the use of checklists, a widespread and standard approach.  Because checklists make content readily accessible and organize knowledge in a way that facilitates systematic evaluation, they are understandably preferred for this application.  In addition, checklists reduce the frequency of omission errors and, while not eliminating variability, do increase consistency in inspection elements[f][g][h][i][j] among evaluators because users are directed to focus on a single item at a time.26 This not only amplifies the reliability of results, but it can also proactively communicate expectations to inspection recipients and thereby facilitate their compliance preceding an inspection.

However, checklists do have limitations.  Most notably, if items on a list cover a large scale and inspection time is limited, reproducibility in recorded findings can be reduced[k]. In addition, individual interpretation and inspector training and preparation can affect inspection results[l][m][n][o][p][q].11 The unfortunate consequence of this variation in thoroughness is that without a note of deficiency there is not an unequivocal indication that an inspection has been done. Instead, if something is recorded as satisfactory, the question remains whether the check was sufficiently thorough or even done at all.  Therefore, in effect, the certainty of what a checklist conveys becomes only negative feedback[r][s][t].

Even with uniformity of user attention and approach, checklists risk producing a counterproductive form of tunnel vision[u] because they can tend to discourage recognition of problematic interactions and interdependencies that may also contribute to unsafe conditions.15 Also, depending on format, a checklist may not provide the information on how to remedy issues nor the ability[v][w] to prioritize among issues in doing follow-up.3 What’s more, within an inspection system, incentive to pursue remedy may only be the anticipation of the next inspection, so self-regulating compliance in between inspections may not be facilitated.[x][y][z][aa][ab][ac]22

Recognition of these limitations necessitates reconsideration of the checklist-only approach, and as part of that reevaluation, it is important to begin with a good foundation.  The first step, therefore, is to establish the goal of the process.  This ensures that the tool is designed to fulfill a purpose that is widely understood and accepted.9 Examples of goals of an environmental health and safety inspection might be to improve safety of surroundings, to increase compliance with institutional requirements, to strengthen preparedness for external inspections, and/or to increase workers’ awareness and understanding of rules and regulations.  In all of these examples, the aim is to either prompt change or strengthen existing favorable conditions.  While checklists provide some guidance for change, they do not bring about that change, [ad][ae]and they are arguably very limited when it comes to definitively conveying what favorable conditions to strengthen.  The inclusion of positive feedback fulfills these particular goals.

A degree of skepticism and reluctance to actively include a record of positive observations[af][ag] in an inspection is understandable, since negative feedback can more readily influence recipients toward adherence to standards and regulations.  Characterized by correction and an implicit call to remedy, it leverages the strong emotional impact of deficiency to encourage limited deviation from what has been done before.19 However, arousal of strong negative emotions, such as discouragement, shame, and disappointment, also neurologically inhibits access to existing neural circuits, thereby invoking cognitive, emotional, and perceptual impairment.[ah][ai][aj]10, 24, 25 In effect, this means that negative feedback can also reduce the comprehension of content and thus possibly run counter to the desired goal of bringing about follow-up and change.2

This skepticism and reluctance may understandably extend to even including positive feedback alongside negative feedback, since affirmative statements do not leave as strong an impression as critical ones.  However, studies have shown that the details of negative comments will not be retained without sufficient accompanying positive comments.[ak][al][am][an][ao][ap][aq]1[ar][as] The importance of this balance has also been shown for workplace team performance.19 The correlation observed between higher team performance and a higher ratio of positive comments in the study by Losada and Heaphy is attributed to an expansive emotional space, opening possibilities for action and creativity.  By contrast, lower-performing teams demonstrated restrictive emotional spaces, as reflected in a low ratio of positive comments.  These spaces were characterized by a lack of mutual support and enthusiasm, as well as an atmosphere of distrust and [at]cynicism.18

The consequence of positive feedback in and of itself also provides a compelling reason to regularly and actively provide it. Beyond increasing comprehension of corrections by offsetting critical feedback, affirmative assessment facilitates change by broadening the array of solutions considered by recipients of the feedback.13 This dynamic occurs because positive feedback adds to an employee’s security and thus amplifies their confidence to build on their existing strengths, empowering them to perform at a higher level[au].7

Principles of management point out that, to achieve high performance, employees need to have tangible examples of right actions to take[av][aw][ax][ay][az], including knowing what current actions to continue doing.14, 21 A significant example of this is the way that Dallas Cowboys football coach Tom Landry turned his new team around.  He put together highlight reels for each player that featured their successes.  That way, they could identify within those clips what they were doing right and focus their efforts on strengthening that.  He recognized that it was not obvious to natural athletes how they achieved high performance, and the same can be true for employees and inspection recipients.7

In addition, behavioral science studies have demonstrated that affirmation builds trust and rapport between the giver and the receiver[ba][bb][bc][bd][be][bf][bg].6 In the context of an evaluation, this added psychological security contributes to employees revealing more about their workplace, which can be an essential component of a thorough and successful inspection.27 Therefore, positive feedback encourages the dialogue needed for conveying adaptive prioritization and practical means of remedy, both of which are often requisite to solving critical safety issues.[bh]

Giving positive feedback also often affirms an individual’s sense of identity in terms of their meaningful contributions and personal efforts. Acknowledging those qualities, therefore, can amplify them.  This connection to personal value can evoke the highest levels of excitement and enthusiasm in a worker, and, in turn, generate an eagerness to perform and fuel energy to take action.8 

Looked at from another perspective, safety inspections can feel personal. Many lab workers spend a significant amount of time in the lab, and as a result, they may experience inspection reports of that setting as a reflection on their performance rather than strictly an objective statement of observable work. Consideration of this affective impact of inspection results is important since, when it comes to learning, attitudes can influence the cognitive process.29  Indeed, this consideration can be pivotal in transforming a teachable moment into an occasion of learning.4 

To elaborate, positive feedback can influence the affective experience in a way that can shift recipients’ receptiveness to correction[bi][bj].  Notice of inspection deficiencies can trigger a sense of vulnerability, need, doubt, and/or uncertainty. At the same time, though, positive feedback can promote a sense of confidence and assurance that is valuable for active construction of new understanding.  Therefore, positive feedback can encourage the transformation of the intense interest that results from correction into the contextualized comprehension necessary for successful follow-up to recorded deficiencies.  

Overall, then, a balance of both positive and negative feedback is crucial to ensuring adherence to regulations and encouraging achievement.[bk]2, 12 Since positive emotion can influence how individuals respond to and take action from feedback, the way that feedback is formatted and delivered can determine whether or not it is effective.20  Hence, rooting an organization in the value of positivity can reap some noteworthy benefits including higher employee performance, increased creativity, and an eagerness in employees to engage.6

To gain these benefits, it is therefore necessary to expand beyond the approach of the checklist alone.16 Principles of evaluation recommend a report that includes both judgmental and descriptive information.  This will provide recipients with the information they seek regarding how well they did and the success of their particular efforts.27 Putting the two together creates a more powerful tool for influence and for catalyzing transformation.

In this paper the authors would like to propose an alternative way to format safety inspection information, in particular, to provide feedback that goes beyond the checklist-only approach. This proposal stems from their experiences of implementing a practice of documenting positive inspection findings within a large academic department. They would like to establish, based on educational and organizational management principles, that this practice played an essential role in the successful outcome of the inspection process in terms of corrections of safety violations, extended compliance, and user satisfaction. [bl][bm][bn][bo]

[a]There is also a legal responsibility of the purchaser of a hazardous chemical (usually the institution, at a researcher’s request) to assure it is used in a safe and healthy manner for a wide variety of stakeholders

[b]I would add what topics will be covered by the inspection to this list. The inspection programs I was involved in/led had a lot of trouble deciding whether the inspections we conducted were for the benefit of the labs, the institution or the regulatory authorities. Each of these stakeholders had a different set of issues they wanted to have oversight over. And the EHS staff had limited time and expertise to address the issues adequately from each of those perspectives.

[c]This is interesting to think about. As LSTs have taken on peer-to-peer inspections, they have been using them as an educational tool for graduate students. I would imagine that, even with the same checklist, what would be emphasized by an LST team versus EHS would end up being quite a bit different as influenced by what each group considered to be the purpose of the inspection activity.

[d]Hmm, well maybe the issue is trying to cram the interests and perspectives of so many different stakeholders into a single, annual, time-limited event 😛

[e]I haven’t really thought critically about who the stakeholders are of an inspection and who they serve. I agree with your point, Jessica, that the EHS and LST inspections would be different and focus on different aspects. I feel inclined to ask my DEHS rep for his inspection checklist and compare it to ours.

[f]This is an aspirational goal for checklists, but it is not often well met. This is because laboratories vary so much in the way they use chemicals that a one-size checklist does not fit all. This is another reason that the narrative approach suggested by this article is so appealing.

[g]We have drafted hazard-specific walkthrough rubrics that help address that issue, but a lot of effort went into drafting them, and you need someone with expertise in that hazard area to properly address them.

[h]Well I do think that, even with the issues that you’ve described, checklists still work towards decreasing variability.

In essence, I think of checklists as directing the conversation and providing a list of things to make sure that you check for (if relevant). Without such a document, the free-form structure WOULD result in more variability and missed topics.

Which is really to say that, I think a hybrid approach would be the best!

[i]I agree that a hybrid approach is ideal, if staff resources permit. The challenge is that EHS staff are responding to a wide variety of lab science issues and have a hard time being confident that they are qualified to raise issues. Moving from major research institutions to a PUI, I finally feel like I have the time to provide support to not only raise concerns but help address them. At Keene State, we do modern science, but not exotic science.

[j]I feel like in the end, it always comes down to “we just don’t have the resources…”

Can we talk about that for a second? HOW DO WE GET MORE RESOURCES FOR EHS  : /

[k]I feel like that would also be the case for a “descriptive comment” inspection if the time is limited. Is there a way that the descriptive method could improve efficiency while still covering all inspection items?

[l]This is something my university struggles with in terms of a checklist. There’s quite a bit of variability between inspections in our labs done by different inspectors. Some inspectors will catch things that others would overlook. Our labs are vastly different but we are all given the same checklist – Our checklist is also extensive and the language used is quite confusing.

[m]I have also seen labs with poor safety habits use this to their advantage as well. I’ve known some people who strategically put a small problem front and center so that they can be criticized for that. Then the inspector feels like they have “done their job” and they don’t go looking for more and miss the much bigger problem(s) just out of sight.

[n]^ That just feels so insidious. I think the idea of the inspector looking for just anything to jot down to show they were looking is not great (and something I’ve run into), but you want it to be because they CAN’T find anything big, not just to hide it.

[o]Amanda, our JST has had seminars to help prepare inspectors and show them what to look for. We also include descriptions of what each inspection score should look like to help improve reproducibility.

[p]We had experience with this issue when we were doing our LSTs Peer Lab Walkthrough! For this event, we have volunteer graduate students serve as judges to walk through volunteering labs to score a rubric. One of the biggest things we’ve had to overhaul over the years is how to train and prepare our volunteers.

So far, we’ve landed on providing training, practice sessions, and using a TEAM of inspectors per lab (rather than just one). These things have definitely made a huge difference, but it’s also really far from addressing the issue (and this is WITH a checklist)

[q]I’d really like to go back to peer-peer walkthroughs and implement some of these ideas. Unfortunately, I don’t think we are there yet in terms of the pandemic for our university and our grad students being okay with this. Brady, did you mean you train the EHS inspectors or for JST walkthroughs? I’ve given advice about seeing some labs that have gone through inspections but a lot of items that were red flags to me (and to the grad students who ended up not pointing it out) were missed / not recorded.

[r]This was really interesting to read, because when I was working with my student safety team to create and design a Peer Lab Walkthrough, this was something we tried hard to get around even though we didn’t name the issue directly. 

We ended up making a rubric (which seems like a form of checklist) to guide the walkthrough and create some uniformity in responding, but we decided to make it be scored 1-5 with a score of *3* representing sufficiency. In this way, the rubric would both include negative AND positive feedback that would go towards their score in the competition.

[s]I think the other thing that is cool about the idea of a rubric, is there might be space for comments, but by already putting everything on a scale, it can give an indication that things are great/just fine without extensive writing!

[t]We also use a rubric but require a photo or description for scores above sufficient to further highlight exemplary safety.

[u]I very much saw this in my graduate school lab when EHS inspections were done. It was an odd experience for me because it made me feel like all of the things I was worried about were normal and not coming up on the radar for EHS inspections.

[v]I think this type of feedback would be really beneficial. From my experience (and hearing others), we do not get any feedback on how to fix issues. You can ask for help to fix the issues, but sometimes the recommendations don’t align with the labs needs / why it is an issue in the first place

[w]This was a major problem in my 1st research lab with the USDA. We had safety professionals who showed up once per year for our annual inspection (they were housed elsewhere). I was told that a setup we had wasn’t acceptable. I explained why we had things set up the way we did, then asked for advice on how to address the safety issues raised. The inspector literally shrugged his shoulders and gruffly said to me “that’s your problem to fix.” So – it didn’t get fixed. My PI had warned me about this attitude (so this wasn’t a one-off), but I was so sure that if we just had a reasonable, respectful conversation….

[x]I think this is a really key point. We had announced inspections and were aware of what was on the checklists. As an LSO, we would, in the week leading up to it, get the lab in tip-top shape. Fortunately, we didn’t always have a ton of stuff to fix in the week leading up to the inspection, but it’s easy to imagine how reducing things to a yes/no checklist can mask long-term problems that are common in between inspections.

[y]My lab is similar to this. When I first joined and went through my first inspection – the week prior was awful trying to correct everything. I started implementing “clean-ups” every quarter because my lab would just go back to how it was after the inspection.

[z]Same

[aa]Our EHS was short-staffed and ended up doing “annual inspections” 3 years apart – once was before I joined my lab, and the next was ~1.5 years in. The pictures of the incorrect things found turned out to be identical to the inspection report from 3 years prior. Not just the same issues, but quite literally the same bottles/equipment.

[ab]Yeah, we had biannual lab cleanups that were all day events, and typically took place a couple months before our inspections; definitely helped us keep things clean.

One other big thing is that when we swapped to Subpart K waste handling (can’t keep waste longer than 12 months), we just started disposing of all waste every six months so that nothing could slip through the cracks before an inspection.

[ac]Jessica, you’re talking about me and I don’t like it XD

[ad]I came to feel that checklists were for things that could be reviewed when no one was in the room and that narrative comments would summarize conversations about inspectors’ observations. These are usually two very different sets of topics.

[ae]We considered doing inspections “off-hours” to focus on the checklist items, until we realized there was no such thing as off hours in most academic labs. Yale EHS found many more EHS problems during 3rd shift inspections than daytime inspections

[af]This also feels like it would be a great teachable moment for those newer to the lab environment. We often think that if no one said anything, I guess it is okay even if we do not understand why it is okay. Elucidating that something is being done right “and this is why” is really helpful to both teach and reinforce good habits.

[ag]I think that relates well to the football example of highlighting the good practices to ensure they are recognized by the player or researcher.

[ah]I noticed that not much was said about complacency, which I think can both be a consequence of an overwhelm of negative feedback and also an issue in and of itself stemming from culture. And I think positive feedback and encouragement could combat both of these contributors to complacency!

[ai]Good point. It could also undermine the positive things that the lab was doing because they didn’t actually realize that the thing they were previously doing was actually positive. So complacency can lead to the loss of things that you were doing right before.

[aj]I have seen situations where labs were forced to abandon positive safety efforts because they were not aligned with institutional or legal expectations. Compliance risk was lowered, but physical risk increased.

[ak]The inclusion of both positive and negative comments shows that the inspector is being more thorough which to me would give it more credibility and lead to greater acceptance and retention of the feedback.

[al]I think they might be describing this phenomenon a few paragraphs down when they talk about trust between the giver and receiver.

[am]This is an interesting perspective. I have found that many PIs don’t show much respect for safety professionals when they deliver bad news. Perhaps the delivery of good news would help to reinforce the authority and knowledge base of the safety professionals – so that when they do have criticism to deliver, it will be taken more seriously.

[an]The ACS Pubs advice to reviewers of manuscripts is to describe the strengths of the paper before raising concerns. As a reviewer, I have found that approach very helpful because it makes me look for the good parts of the article before looking for flaws.

[ao]The request for corrective action should go along with a discussion with the PI of why change is needed and practical ways it might be accomplished.

[ap]I worked in a lab that had serious safety issues when I joined.  Wondering if the positive-negative feedback balance could have made a difference in changing the attitudes of the students and the PI.

[aq]It works well with PIs with an open mind; but some PIs have a sad history around EHS that has burnt them out on trying to work with EHS staff. This is particularly true if the EHS staff has a lot of turnover.

[ar]Sandwich format: the negative (filling) between two positives (bread).

[as]That crossed my mind right away as well.

[at]I wonder how this can contribute to a negative environment about lab-EHS interactions. If there is only negative commentary (or the perception of that) flowing in one direction, it would seem that it would have an impact on that relationship.

[au]I see how providing positive feedback along with the negative feedback could help do away with the feeling the inspector is just out to get you. Instead, they are here to provide help and support.

[av]This makes me consider how many “problem” examples I use in the safety training curriculum.

[aw]This is a good point. From a learning perspective, I think it would be incredibly helpful to see examples of things being done right – especially when what is being shown is a challenge and is in support of research being conducted.

[ax]I third this so hard! It was also something that I would seek out a lot when I was new but had trouble finding resources on. I saw a lot of bad examples—in training videos, in the environments around me—but I was definitely starved for a GOOD example to mimic.

[ay]I read just part of the full paper. It includes examples of positive feedback and an email that was sent. The example helped me get the flavor of what the authors were doing.

[az]This makes a lot of sense to me – especially from a “new lab employee” perspective.

[ba]Referenced in my comment a few paragraphs above.

[bb]I suspect that another factor in trust is the amount of time spent in a common environment. Inspectors don’t have a lot of credibility if they only visit a place annually where a lab worker spends every day.

[bc]A lot of us do not know or recognize our inspectors by name or face (we’ve also had so much turnover in EHS personnel and don’t even have a CHO). I honestly would not be able to tell you who does inspections if not for being on our university’s LST.  This has occurred in our lab (an issue of trust) during a review of our radiation safety inspection. The waste was not tested properly, so we were questioned on our rad waste. It was later found that the inspector unfortunately didn’t perform the correct test for testing the pH. My PI did not take it very well after having to correct them about this.

[bd]I have run into similar issues when inspectors and PIs do not take the time to have a conversation, but connect only by e-mail. Many campuses have inspection tracking apps which make this too easy a temptation for both sides.

[be]Not only a conversation, but inspectors can be actively involved in improving conditions, e.g., supplying signs, PPE, similar SOPs…

[bf]Yes, sometimes, it’s amazing how little money it can take to show a good faith effort from EHS to support their recommendations. Other times, if it is a capital cost issue, EHS is helpless to assist even if there is an immediate concern.

[bg]I have found inspections where EHS staff openly discuss issues and observations as they make the inspection very useful. It gives me the chance to ask the million questions I have about safety in the lab.

[bh]We see this in our JST walkthroughs. We require photos or descriptions of above-acceptable scores and have some open-ended discussion questions at the end to highlight those exemplary safety protocols and to address areas that might have been lacking.

[bi]Through talking with other student reps, we usually never know what we are doing “right” during the inspections. I think sharing positive feedback would help other labs that might have been marked on their inspection checklist for that same item and give them a resource on how to correct issues.

[bj]I think this is an especially important point in an educational environment. Graduate students are there to learn many things, such as how to maintain a lab. It is weird to be treated as if we should already know – and get no feedback about what is being done right and why it is right. I often felt like I was just sort of guessing at things.

[bk]At UVM, we hosted an annual party where we reviewed the most successful lab inspections. This was reasonably well received, particularly by lab staff who were permanent and wanted to know how others in the department were doing on the safety front.

[bl]This can be successful, UNTIL a regulatory inspection happens that finds significant legal concerns related primarily to paperwork issues. Then upper management is likely to say “enough of the nice guy approach – we need to stop the citations.” Been there, done that.

[bm]Fundamentally, I think there need to be two different offices: one to protect the people, and one to protect the institution *huff huff*

[bn]When that occurs, there is a very large amount of friction between those offices. And the Provost ends up being the referee. That is why sometimes there is no follow up to an inspection report that sounds dire.

[bo]But I feel like there is ALREADY friction between these two issues. They’re just not as tangible and don’t get as much attention as they would if you have two offices directly conflicting.

These things WILL conflict sometimes, and I feel like we need a champion for both sides. It’s like having a union. Right now, the institution is the only one with a real hand in the game, so right now that perspective is the one that will always win out. It needs more balance.

CHAS presentations from the Spring 2022 national meeting

Below are PDF versions of presentations from CHAS symposia and from symposia in other divisions at the Spring 2022 meeting.

Value of storytelling to build empathy, awareness, and inclusivity in EHS, Dr. Kali Miller

Inclusive risk assessment: Why and how? Ralph Stuart, CIH, CCHO

Some thoughts about how to safely accommodate people with disabilities in the laboratory, Debbie Decker, CCHO

Job design for the “hidden” disabled professional, Dr. Daniel R. Kuespert, CSP

Discussion of accommodation and advocacy for graduate student researchers, Catherine Wilhelm

Parsing Chemical Safety Information Sources, Ralph Stuart, CIH, CCHO

Pragmatism as a teaching philosophy in the safety sciences

On March 10, Dr. Patricia Shields discussed the article she co-authored with three safety professionals about using “pragmatism” as a teaching philosophy in the safety sciences. Her summary PowerPoint and the comments from the table read of this article are below.

The full paper can be found here: https://www.sciencedirect.com/science/article/pii/S0925753520304926?casa_token=gG7VtvjEqqsAAAAA:Of4B_mGRk-HwwH-q_WQLybg2zDGPtjcYVFCg0ZgnYe5riPefhOJ6nDCGF2YwjMrhSR2wGfIABg

Excerpts from “Pragmatism as a teaching philosophy in the safety sciences: A higher education pedagogy perspective”

03/03 Table Read for The Art & State of Safety Journal Club

Meeting Plan

  • (5 minutes) Jessica to open meeting
  • (15 minutes) All participants read complete document
  • (10 minutes) All participants use “Comments” function to share thoughts
  • (10 minutes) All participants read others’ Comments & respond
  • (10 minutes) All participants return to their own Comments & respond
  • (5 minutes) Jessica announces next week’s plans & closes meeting

  1. Introduction

(FYI, most of the Introduction has been cut)

Elkjaer (2009) has previously alluded to this lack of appreciation and value of pragmatism ‘as a relevant learning theory’ (p. 91) in spite of the growing recognition of its important role in education and teaching (Dewey, 1923, 1938; Garrison and Neiman, 2003; Shields, 2003a; Sharma et al., 2018), scholarship and academic development (Bradley, 2001), academic practice (Shields, 2004; 2006), curriculum (Biesta, 2014) and online learning (Jayanti and Singh, 2009). This article, therefore, addresses this anomaly by arguing for the appropriateness of pragmatism as a highly relevant philosophical cornerstone, especially for safety science educators[a].

2. The Scholarship of Learning and Teaching (SoLT)

(FYI, this section has been cut)

3. Pragmatism as a teaching philosophy

3.1. Teaching philosophies

(FYI, most of this section has been cut)

The research paradigms used extensively in higher education are positivism and interpretivism, and these are often cited by faculty as influencing their teaching philosophy (Cohen et al., 2006). These two are usually associated with quantitative and qualitative research methods respectively, but both prove problematic for the teaching of the safety sciences. First, safety science relies on both quantitative and qualitative methods. Second, neither uses a ‘problem’ orientation in its approach to methods, and safety science is inherently problem and practice oriented and certainly should be with respect to its teaching.[b][c][d]

Third, the mixed methods literature has recognized this drawback and adopted pragmatism as its research paradigm because it takes the research problem as its point of departure (Johnson and Onwuegbuzie, 2004). In contrast to positivism and interpretivism, pragmatism holds the view that the research question that needs to be answered is more important than either the philosophical stance or the methods that support such a stance. Pragmatism is traditionally embraced as the paradigm of mixed methods; hence, it turns the incompatibility theory on its head by combining qualitative and quantitative research approaches, and “offers an immediate and useful middle position philosophically and methodologically; a practical and outcome-oriented method of inquiry that is based on action and leads” (Johnson and Onwuegbuzie, 2004, p. 17). The pluralism of pragmatism allows it to work across and within methodological and theoretical approaches, which for the purpose of the intent of this paper is consistent with a safety science multi-disciplinary approach.

This places practice, where the problem must originate, as an important component of mixed methods. This practice orientation resonates with the goals of learning and teaching in safety science. Therefore, presented here is the philosophy of ‘pragmatism’ which we argue is much better suited for guiding or informing safety science teaching endeavours.

3.2. The foundations of pragmatism

(FYI, this section has been cut)

3.3. Value of pragmatism for the safety sciences

(FYI, this section has been cut)

4. Safety science higher education in Australia

(FYI, this section has been cut)

5. Pragmatism and evidence informed practice (EIP)

Safety science education has traditionally taken an evidence-informed practice (EIP) stance for its teaching practice. Evidence informed practice is not a one-dimensional concept and its definition is still under debate with various academic lenses being applied to the notion of ‘research as evidence’ and how EIP can be measured (Nelson and Campbell, 2017). However, Bryk (2015) is credited with offering up the view that EIP is a “fine-grained practice-relevant knowledge, generated by educators, which can be applied formatively to support professional learning and student achievement” (Nelson and Campbell, p. 129).[e]

This includes the expectation that students will be able to use their theoretical knowledge, gained through their academic studies, including research in the field, and translate this knowledge into practical applications in the real world[f][g][h][i][j][k][l][m][n][o]. There are continued efforts to recognise these Research to Practice (RtP) endeavours; as an example, the Journal of Safety, Health and Environmental Research in 2012 devoted an issue to ‘Bridging the Gap Between Academia and the Safety, Health and Environmental (SH&E) Practitioner’. The issue demonstrated “the vital role of transferring SH&E knowledge and interventions into highly effective prevention practices for improving worker safety and health” (Choi et al., 2012, p.1). In that issue Chen et al. (2012, p. 27) found that the ‘Singapore Workplace Safety and Health Research Agenda: Research-to-Practice’ prioritizes, first, organisational and business aspects of workplace health and safety (WHS) and second, WHS risks and solutions.

Other researchers in that same issue (Loushine, 2012, p. 19) examined ‘The Importance of Scientific Training for Authors of Occupational Safety Publications’ and found that there needs to be “attention on the coordination of research and publication efforts between practitioners and academics/researchers to validate and advance the safety field body of knowledge” (p. 19).

Shields (1998) introduced the notion of ‘classical pragmatism’ as a way to address the academic/practitioner divide in the public administration space. She also notes that the pure EIP approach often contains a lack of congruence between practitioner needs and research[p][q] (Shields, 2006). She identifies theory as a source of tension. Practitioners often see theory as an academic concern divorced from problems faced in their professional world. Here, pragmatism bridges theory and practice because theory[r] is considered a “tool of practice” which can strengthen student/practitioner skills and make academic (process and products) stand up to the light of practice (Shields, 2006, p. 3). The pragmatist philosopher John Dewey used a map metaphor to describe the role of theory, whereby a map is not reality, but it is judged by its ability to help a traveller reach their chosen destination[s] (Dewey, 1938).

This perspective is often demonstrated in the student’s capstone, empirical research project. Using a problematic situation as a starting point, they introduce literature, experience and informed conceptual frameworks as theoretical tools that help align all aspects of a research process (research purpose, question, related literature, method and statistical technique). Thus, student/practitioners/researchers, led by a practical problem, could develop or find a theory by drawing on diverse (pluralistic) literature as well as their experience with the problematic situation. This provisional theory guides choice of methodology, variable measurement, data collection and analysis, which is subsequently shared (participatory) and evaluated. Practical problems are therefore addressed by the student’s conceptual framework, which is considered a tool related to the problem under investigation. This approach thus emphasizes the connective function of theory (Shields, 2006). The use of this pragmatic framework has allowed a bridge between theory and for it to be successfully applied to higher education more broadly (Bachkirova et al., 2017; El-Hani and Mortimer, 2007). Texas State University has embedded a pragmatism informed research methodology in its Master of Public Administration program with success measured in student awards, citations and recognition in policy related publications (Shields et al., 2012).

Therefore, it is proposed that safety science is a discipline which would, and should, also benefit from alignment with philosophical pragmatism. This would represent a much wider stance and a shift from viewing safety science education with merely an EIP lens, where the main consideration for teaching practice is that students are presented with research which provides them the required ‘scientific evidence’ and that the teaching of this research is enough to inform their practice of the discipline [t][u](Hargreaves, 1996, 1997). It should be noted that pragmatism does not abandon evidence, rather it contextualizes it in a problematic situation.

6. The significance of pragmatism as a teaching philosophy

[v]

For pragmatism to penetrate the safety science education field it needs to be relatively easy to apply and transmit. Fortunately, Brendel (2006) has developed a simple four P’s framework, which captures pragmatism’s basic tenets and can easily be applied to teaching (Bruce, 2010). The 4P’s of pragmatism include the notions that education needs to be Practical (scientific inquiry should incorporate practical problem solving), Pluralistic (the study of phenomena should be multi- and inter-disciplinary), Participatory (learning includes diverse perspectives of multiple stakeholders) and Provisional (experience is advanced by flexibility, exploration and revision), as shown in Fig. 2.

The majority of safety science students simultaneously study and work in agencies or organisations as safety professionals. Hence, they appreciate the pragmatic teaching approach whereby teacher, student and external stakeholders influence learning by incorporating multiple perspectives. When teaching is filtered through a pragmatic philosophical lens, students’ learning is framed by their key domain area of interest as well as their professional context and work experience[w][x][y][z][aa][ab]. It encourages them to ‘try on’ their work as experiential[ac][ad][ae] learning, which they can take into and out of the classroom. Flexibility, integration, reflection and critical thinking are nurtured. Pragmatism and the four Ps can facilitate this process.

Ideally, the classroom environment incorporates communities of inquiry where students and teachers work on practical problems applicable to the health and safety domain. The pluralistic, expansive community of inquiry concept incorporates participatory links to the wider public, including industry and workers (Shields, 2003b). The community of inquiry also encourages ongoing experimentation (provisional). The ‘practical problem’ and ‘theory as tool’ orientation provides opportunities to bridge the sometimes rigid dualisms between theory and practice. This teaching lens also incorporates a spirit of critical optimism, which leads to a commitment by the teacher [af]and the higher education institution to continually experiment and work to improve the content delivery and student learning experience (Shields, 2003a).

Pragmatism emphasizes classroom environments which foster transformations in thinking, and these transformations in thinking can often be observed in the quality of students’ final research projects (Shields and Rangarajan, 2013). Most students graduating from postgraduate degrees in the safety sciences are required to produce a major piece of work (thesis) with broad practical value. Ideally they grow and develop useful skills from the learning experience, and the thesis is useful to their employer/organization and has applicability to the wider community in which they work as safety professionals.[ag][ah][ai][aj]

6.1. Pragmatic learning – student success – enhancement to practice

Higher education safety science pedagogy should be embedded in the notion that most of the students who attend come with some depth of practical experience and practical wisdom, and that the academe should treat them as lifelong learners and researchers[ak]. The academe should provide them with tools and skills to be stronger lifelong learners equipped to contribute to safety science practice[al][am][an][ao][ap][aq].

The universities with which the researchers of this article are aligned use pragmatism as a multi/trans-disciplinary approach in order to bridge the gap between academic theory (research) and practice. Whilst two of these universities teach safety science, the third places pragmatism in the public administration domain and has for many years successfully incorporated the use of pragmatism to bridge the gap between academia and practice (Shields and Tajalli, 2006; Foy, 2019).

The value of using pragmatism as a teaching philosophy has been successfully demonstrated to bridge this gap. A snippet of just some of the student feedback on student learning from the use of a pragmatism philosophy of teaching is evidenced below:

Having been a railway man for over thirty years I recognised that a gap needed to be closed in my academic knowledge to advance further in the business and wider industry and the Safety Science courses have provided the vehicle for this to occur. Importantly I have been able to link the learning in these courses and the assignments directly to the activities of my rail organisation. That’s a big selling point in today’s business world. (Safety Science Student, Phil O’Connell)

In 2014, I was promoted to Administrative Division Chief of Safety. On several occasions, I found myself utilizing the skills I learned to help evaluate and improve issues and programs in my fire department. In particular I was able to [ar]use case study research to show that our Safety Division was understaffed. As a result, I successfully increased our numbers of Safety Officers from 5 to 26. I have also used the same techniques to improve our department’s PPE and cancer prevention programs. The greatest challenge, however, came when we had 100 fire fighters exposed to a potentially massive amount of asbestos during a major high rise fire. Our department had never dealt with an exposure of its magnitude. I was able to help our department solve a very difficult problem concerning asbestos and its effect on our PPE. I even received calls from other fire departments who were interested in our method. (Public Administration Student – Brian O’Neill)

These students have gone on to have their research cited and widely acknowledged (O’Connell et al., 2016; O’Connell et al., 2018; O’Neill, 2008), as have many other students under this pragmatic philosophy for learning and teaching.[as]

6.2. Pragmatic learning – student success – theoretical advancement

Whilst the embedding of pragmatism as a teaching philosophy is relatively new for Australian universities teaching in the safety science space, it is well entrenched within the public administration programs at Texas State University. Approximately 60 percent of students in this program work full time in state, local, federal or non-profit organizations. [at][au]Their capstone papers focus on the practical problems of public policy, public administration and nonprofit administration. [av][aw][ax][ay][az][ba]Problems with “disorganised graduate capstone papers with weak literature reviews” (Shields and Rangarajan, 2013, p. 3) pushed the faculty to adopt pragmatism as a teaching framework. This approach enhanced students’ Applied Research Projects (ARP), which have demonstrated remarkable industry, field and community impact (Shields, 1998). [bb]For example, five of the papers won first place in the United States among schools of public affairs and administration. A content analysis of the Texas State University applied research papers (ARPs) revealed that “most of these ARPs are methodical inquiries into problems encountered by practitioners at the workplace. Hence a dynamic interplay of practitioner experience informs public administration research, and rigorous research informs practitioner response to administration/management problems” (Shields et al., 2012, pp. 176–177).

(FYI, paragraph cut)

7. Conclusion

Higher education teachers who have used pragmatism as their teaching philosophy for some time have led the way for an interest in pragmatism as a teaching philosophy to spread and gain momentum into other domains. However, despite this and publications which endorse the use of pragmatism, there still appears to be little understanding of the benefits and rationale for pragmatism to be used as a teaching philosophy over other more established and entrenched research focused philosophies. [bc]Therefore, this paper has tried to distil both an understanding of what pragmatism represents and the ‘how and why’ it should be used more broadly, particularly for safety science educators.

Pragmatism goes beyond what is offered by the more singular notion of evidence-informed practice, especially within the safety sciences’ higher educational programs. Its value in other domains has been well established, particularly where more problem-focused and practical applied applications are required.[bd] Further, significant positive results in students’ research outputs from having a pragmatic research [be]framework are now well demonstrated. Where student work can be used to inform decision making, policy making and problem solving that impacts wider inquiry, its value stands out, as already evidenced in both the public administration space and the safety science space.[bf]

In relation to the safety sciences, the higher educational pedagogist can be confident that the path to pragmatism is a well-worn one, even if it may be unfamiliar to the discipline. It is recommended that teaching practices extend past only valuing the evidence-informed practice stance in order to reduce the theory-to-practice divide. This can be done by incorporating a broader philosophical (4 Ps) pragmatic perspective in order to develop a professional practice community of safety science problem solvers.

Therefore, embracing pragmatism as a teaching philosophy is encouraged in the higher education sector,[bg][bh] and recent acknowledgment and acceptance of this teaching philosophy stance have instilled greater confidence in its recognition and credibility for wider use. Safety science educators can be proud that its adoption as a teaching philosophy is a long-awaited natural development instigated by the early pragmatist forebears who worked in the safety field.

[a]Is safety a science? I can see arguments that it is, but I can also see arguments that it is a cultural education about community expectations for workplace decision-making. (There are many different “communities” potentially included in this concept.)

[b]Would you include constructivism as a different paradigm?

[c]I see interpretivism and constructivism as very similar. The methods literature often treats them as basically the same. In many ways it depends on whether the problem is approached inductively or deductively. Constructivism is often associated with inductive exploratory research.

[d]I wonder if sometimes there is insufficient reflection, which makes constructivism too close to interpretivism.

[e]EIP or EBP (Evidence-Based Practice) has become much more popular in general STEM education in the past 5-10 years, especially as part of the DBER (Discipline-Based Education Research) set of practices.

[f]Previous TA training I received stressed the importance of applying lecture content to new problems to help students learn and retain knowledge. I think that’s a strong benefit of pragmatism.

[g]Again, I’m wondering a bit about the distinction between this and constructivism?

[h]I find that there are many missed opportunities in lecture courses and textbooks to really connect students to the safety aspects of the chemicals being described. For example, with the number of times HF is used as an example in textbook problems, it would be nice to include something about how incredibly hazardous it is to work with!

[i]Just today in an honors general chemistry course we talked about the hazards of perchlorate salts. I was surprised that the textbook was using it as a regular example, along with perchloric acid, without a hint of a discussion about safety…

[j]I believe this can also be applied to “less hazardous” compounds; there is, in my opinion, a huge disconnect between the overall properties of a compound and its hazardous nature. For example, ethyl acetate is commonly used and not extremely hazardous, but just this week I had multiple students ask why they needed to work with it in a hood rather than on their open benchtop.

[k]One of the learning opportunities in pragmatic safety science is uncovering hidden assumptions in standard practices. My “hazmat chemist’s” instincts are very different from the “research chemist’s” instincts about the same chemicals. It takes a lot of practice to go into a conversation about these chemicals with an open mind. (This has come up this week with a clean out of a research lab and very different perceptions of the value and hazards of specific chemical containers.)

[l]It would be really cool to see an organic textbook, for example, that has inset sections on the safety considerations of different reactions. My O chem professor would sometimes highlight reactions that were good on paper and problematic in reality, but it should be a more frequent discussion.

[m]This is something that gets addressed in our organic labs actually. They “design” their own experiment. They’re given a number of chemicals in a list (some are controlled substances, some are very expensive) and are asked to choose which ones they would like to use for their experiment. We then use the groups’ choices to go over both safety and expense aspects and how we can still do our experiment with other chemicals.

[n]That is a great exercise. I especially like how practical and open-ended it is.

[o]Overall, in an organic chemistry course practical knowledge of synthesis is mostly untouched, as many of the classic reactions used to teach the course are fairly complex experimentally. E.g., Sandmeyer reactions are conveniently simple to explain but harder to accomplish in person.

[p]This seems to be an issue across many fields. Oftentimes we see that those performing the practice and those performing the research speak different languages and consider very different things important.

[q]I see this a lot in experimental and computational work. Different languages, different skill sets, and different approaches

[r]Safety science also has this issue internally. There is an interesting paper that was covered in a podcast awhile ago about “reality-based safety science”: https://safetyofwork.com/episodes/ep20-what-is-reality-based-safety-science

[s]I’m thinking about an analogy with computational and experimental chemistry also. I like the “tools of practice” bridge.

[t]How would this compare to case studies?

[u]I would imagine that for Case Studies to become research that someone would have to gather case studies and look for trends. I see Case Studies as an opportunity to share one experience or one set of experiences with the community in the hopes that with enough Case Studies a meaningful research study could be conducted.

[v]Case studies are definitely included in this.  Scientific evidence here would mean that the evidence was collected with a scientific attitude. There is no belief that actual objectivity is possible but something close should be strived for.

[w]Allowing students to pursue their interests is always a benefit while learning. It’s been a struggle to organize researcher safety meetings in a way that engages participants by allowing them to follow their interests, especially with virtual meetings. Has anyone found strategies that facilitate that interest and engagement?

[x]Something I had started to explore just before the lockdown was to try to set up opportunities for grad students to discuss the risk assessments around their own project work. In this way, they could show off their expertise while helping to educate others – and possibly reveal some things that they hadn’t thought about or didn’t know. I really liked how Texas A&M did their Table Discussions in which they invited students who had something in common (i.e. all those who use gloveboxes), presented a Safety Moment about them, then invited students to share their own stories, strategies, and concerns with one another about glovebox usage.

[y]We started doing round tables that discuss safety topics within their own focus areas (inorganic, organic, physical/atmospheric), similar to what Jessica mentioned with gloveboxes, and that’s gotten a lot of response and interest.

[z]Those sound like great ideas. We already have our department research groups divided into hazard classes, so it would be easy to have them meet in those groups. Thanks for the suggestions. I also like the idea of participants presenting to each other instead of a lecture-style event.

[aa]I like this a lot. Is there much faculty involvement?

[ab]We don’t get as much faculty involvement due to their busy schedules. But we have had safety panels with faculty with different safety specialties such as lasers, Schlenk lines, compressed gases, physical hazards, etc.

[ac]Is pragmatism a bridge between theoretical and experiential learning?

[ad]I believe that it is most useful when the bridge runs both ways

[ae]Excellent point. One should inform the other.

[af]Action research is certainly on the continuum of research that can be informed by pragmatism. The pluralism of pragmatism comes to play here.

[ag]Hopefully within the safety sciences this aspiration is realized more often than in other disciplines. Too many times, theses and dissertations get lost in the archives and go unread.

[ah]Again, this is something that makes me think of the ideas behind Action Research. Since it is a research method by which the researcher questions their own practice, the thesis that ultimately comes of it could potentially be of interest to their own employers or teams (even if no one else reads it).

[ai]Safety research tends to be somewhat more read because it is often driven by the need to support a risk decision. But as Covid has shown, this may not improve the quality of the scientific literature that is being read. The rush to publish (no or small amounts of data) has really slowed the understanding of best safety practices.

[aj]I see what you mean, Jessica. Even if the actual manuscript is not disseminated, a researcher self-evaluating their own practice can definitely serve as a self-check where one can see places to improve.

[ak]How would you say the idea of “pragmatism” relates, if at all, to the concept of Action Research?

[al]Would a pragmatic point of view work in beginner safety courses?

[am]I think that the “citizen scientist” movement is an attempt at a pragmatic approach to purer environmental sciences, but I’m not convinced that these kinds of projects improve science literacy. They seem to stop at the crowdsourced data collection phase, and then the professionals interpret the data for the collectors.

[an]This goes back to the expert/novice question. Would a pragmatic approach work for both? I can see the advantage in graduate/postgraduate education. I’m wondering if the knowledge base is broad enough for beginners?

[ao]I agree. You don’t know what you don’t know.

[ap]It is also very frustrating for the beginner to put in a lot of effort collecting data and then be told that that data is fatally flawed for an obscure reason.

[aq]Pragmatism would call on the expert to listen carefully to the novice particularly if the novice is in the world of practice. This is where the participatory nature of pragmatism comes in. Both should have a voice.

[ar]Brian specifically mentions case study as a method he used.

[as]I actually think a manager’s need to solve problems like safety issues at work could be looked at as a mini “applied” case study. The context of the problem shapes the parameters of the case.

[at]Do the students who are not working full time have a good sense of applications? And does it make them feel better prepared for common workplace problems?

[au]I would think that even if they didn’t work full time, they could still pick some sort of problem in the public domain to really seriously do a lot of research on. If nothing else, it could give them a sense of why the problem is so intractable.

[av]My sister was involved in one of these programs after 15 years of experience and she said that the content was marginally interesting, but being able to network with fellow professionals was quite valuable, both for the stories and solutions they shared and for future follow-up to ask questions. That seems like quite a pragmatic aspect of this program.

[aw]I would think that the networking would be part of the purpose – and this is really true for any research program as well. You basically find that small group of people who are really interested in the same problems in which you are interested so that you can all swap stories, publications, and ideas in order to drive all research forward.

[ax]I agree – I think that academic leadership sees this opportunity more clearly than faculty members who are assigned 10 or 15 grad students to mentor at once, though. ACS is providing some education around this opportunity for new faculty, but it’s a challenge to incorporate mentoring skills along with the teaching, research and service duties faculty are handed.

[ay]This is why the Community of Inquiry is so important. Community comes first.  I actually have an article on the community of inquiry if anyone is interested.

[az]Reframing things as a community of scholars is very powerful.

[ba]I’d be glad to include any references that you think would be helpful on the web page for this discussion if you would like to share them. We get about 100 views of these pages after they go up, so the impact is not limited to the attendees at a particular session.

[bb]How long are the courses? One semester? I often find it difficult for students to finish an in-depth lit review in that time frame.

[bc]This link might also be useful.  https://link.springer.com/article/10.1007/s11135-020-01072-9  It deals with deductive exploratory research and covers many of these themes.

[bd]I very much appreciate the use of this pedagogy as it applies to practical content!

[be]I believe the students that give their courses a good faith effort come away with tools to apply to their work.  We look at the research project as a project management challenge and apply project management ideas throughout. This is sometimes the most important lesson, particularly for the pre-service students.

[bf]This is a very important idea. When I pursued my 1st bachelors, in political science, I was incredibly disappointed to find how much research and practice diverged.

[bg]There is an important distinction here between undergrad and graduate students in higher ed. Traditional undergrads tend to be learning more practical skills outside of the curriculum. I wonder what the experience of non-traditional and community college students is in this regard?

[bh]It does seem like this approach lends itself very well to settings where previous or current experience is required.

CHAS at a Glance, Spring 2022

Information about the CHAS program for the upcoming national meeting, March 20-24 in San Diego, is now available. More information about the open business meeting will be added as it becomes available.

The Agenda Book for the Executive Committee meeting is now available below. The meeting will take place on Sunday, March 20, 2022, 10:15-11:30 AM PST. The Zoom virtual link is available in the agenda book.

ACS Webinar: Zebras or Horses?

On February 9, the Division sponsored an ACS webinar, given by Mary Beth Mulcahy. She earned her PhD in physical chemistry from the University of Colorado in Boulder. She is currently a manager in the Global Chemical and Biological Security Program at Sandia National Laboratories in Albuquerque, NM. Dr. Mulcahy and her team work internationally to build and strengthen knowledge for the responsible use of chemicals.

Dr. Mulcahy also serves as the Editor-in-Chief of the American Chemical Society’s ACS Chemical Health & Safety journal, which focuses on publishing high-quality articles of interest to scientists, EH&S professionals, and non-research personnel who manage or work in areas where chemicals are used or hazardous waste is generated.

“Culture eats strategy for breakfast” was a quote shared in the recent ACS webinar, “Zebras or Horses? How a False Sense of Security Can Lead to Lab Accidents.” In her presentation, Mary Beth Mulcahy, editor-in-chief of the journal ACS Chemical Health & Safety and former CSB accident investigator, illustrated this relationship between culture and strategy using CSB findings from accident investigations at ConAgra, Texas Tech, and the Deepwater Horizon. She showed how these accidents came about in part due to the differences between what was written as policies and procedures and what was done in practice. Steering a culture of safety in a fruitful direction can, thus, be achieved by recognizing and accommodating these conflicts between theory and implementation.
Dr. Mulcahy also pointed out that safety culture does not stand apart from an organization’s culture as a whole, that what is valued in a subsection will also be what is valued overall. And while learning from mistakes is essential, as was demonstrated in the webinar, developing a work culture that proactively supports safe practice is equally important.

Admittedly, there is no one size fits all with regard to the practicalities of a successful safety culture, as was pointed out by Terry Mathis in his recent EHS Today article, “The Laboratory for Testing Safety Efforts,” but one universal place to start can be the realm of psychological safety. Defined as “the belief that you won’t be punished or humiliated for speaking up with ideas, questions, concerns, or mistakes” in the recent article from the Center for Creative Leadership, titled “What is Psychological Safety at Work?”, psychological safety facilitates the empowerment and transparency that are the bedrock of a culture of safety. Being able to give and receive feedback, raise concerns, disagree, ask for help and clarification, ask difficult questions, offer solutions, and admit to errors are all essential to maintaining healthy working conditions, both physically and psychologically. Figuring out how to prioritize these will require a custom fit for each organization, but consideration and discussion of the practices is a good place to begin.

The Presentation

You can find the recording of the webinar at https://www.acs.org/content/acs/en/acs-webinars/professional-development/false-security.html

Audience Poll Questions

CHAS Committee Interests Survey

CHAS provides chemical safety professionals with professional development opportunities through work on Divisional activities. These opportunities enable members to stay up to date on emerging technical and cultural aspects of chemical safety while networking with peers nationally and globally. Many have found this work to be an important asset both for themselves and the stakeholders they work with. To get involved in the Division’s work, indicate your interest in specific opportunities in the form below.

CHAS Committee Interest 2022 form fields:

  • Your name
  • Committees I am interested in learning more about
  • Established liaison opportunities I am interested in learning more about
  • Potential liaison opportunities I am interested in learning more about

Chemical Reactivity resources and references

In January 2022, an inquiry to the DCHAS-L asked for suggestions for resources to evaluate reactivity hazards associated with teaching lab organic syntheses or compounds involved in drug manufacturing. Below is a list of the references that were suggested in response to this request. If you know of other resources that should be included or entries that should be updated, let us know in the comments below.

Note: some of these references are free; others require a fee to view. Many academic libraries can provide access to specific entries for these resources.

Workshops

CHAS workshop on chemical reactivity hazards: http://www.dchas.org/workshops

Univ of Rhode Island Explosives Courses
http://energetics.chm.uri.edu/?q=node/95

Computer Applications

AIChE CCPS Chemical Reactivity Worksheet (2019)
https://www.aiche.org/ccps/resources/chemical-reactivity-worksheet

CAMEO chemicals:
https://cameochemicals.noaa.gov

Ongoing Literature

CAS Chemical Safety Library: https://safescience.cas.org

C&EN Safety Letters: http://pubsapp.acs.org/cen/safety/

Organic Syntheses: http://www.orgsyn.org

OPRD ACS journal: https://pubs.acs.org/toc/oprdfk/0/0 

Science of Synthesis at Stanford Library guide to chemical safety resources: https://guides.library.stanford.edu/lab-safety/recognize-hazards/reactive-substances

Chemical Safety Board web site on reactive chemicals: https://www.csb.gov/reactive-hazards/

Chemical Safety Board web site on laboratory safety
https://www.csb.gov/laboratory-safety/

Other Literature Sources

Thermal Safety of Chemical Processes: Risk Assessment and Process Design, Second, Completely Revised and Extended Edition
Francis Stoessel (2020)


Process Safety in the Pharmaceutical Industry—Part I: Thermal and Reaction Hazard Evaluation Processes and Techniques (2020)
https://pubs.acs.org/doi/10.1021/acs.oprd.0c00226

Bretherick’s Handbook of Reactive Chemical Hazards (2017)
https://www.elsevier.com/books/brethericks-handbook-of-reactive-chemical-hazards/urben/978-0-08-100971-0

Wiley’s Guide to Chemical Incompatibilities (2009)
https://www.wiley.com/en-us/Wiley+Guide+to+Chemical+Incompatibilities%2C+2nd+Edition-p-9780471721628

March’s Advanced Organic Chemistry (2006)
https://onlinelibrary.wiley.com/doi/book/10.1002/0470084960

Encyclopedia of Reagents for Organic Synthesis (2001) https://www.wiley.com/en-us/Fieser+and+Fieser%27s+Reagents+for+Organic+Synthesis%2C+Volume+14-p-9780471504009

Fieser and Fieser’s Reagents for Organic Synthesis (1989)
https://www.wiley.com/en-us/Fieser+and+Fieser%27s+Reagents+for+Organic+Synthesis%2C+Volume+14-p-9780471504009

Enhancing Research Productivity Through LSTs

On November 4, 2021, CHAS sponsored an ACS webinar presented by 4 current and recent graduate students about their work with Laboratory Safety Teams (LSTs) and why they took up this challenge. A key reason is that the productivity of their work and the safety of their labs are connected by housekeeping issues they faced in the lab.

The recording of the webinar will be available to ACS members soon, but you can review their presentation file here.

The audience provided many questions and comments to the panel. The questions were discussed in the recording available from ACS Webinars. Some of these issues were:

The Impact of Lab Housekeeping

  • Did you ever see serious accidents because of a lack of housekeeping?
  • An audience member responded: A major lab cleanup in the lab where I was finishing up as a graduate student nearly ended in disaster when a waste bottle EXPLODED. Fortunately, no one was present — everyone had left for dinner. Pieces of broken glass were found at the other end of the lab.

Working with the Administration

  • Have there been any situations where your PI encouraged you to deprioritize safety/housekeeping concerns because they did not put emphasis on it? How would you encourage a researcher who is facing this but interested in LSTs?
  • Have you run into management or leadership that is reluctant to implement changes to safety programs? How did you deal with this when not holding a leadership position?
  • How do you get students involved in lab safety if PIs don’t show interest in the matter?
  • I think a lab safety team of students is great, but I also think a liaison between the research labs and EHS has proven extremely beneficial because, while EHS looks at compliance and waste removal, as chemists we often are a resource for them as well.

Professional Skill Development

  • I have worked on a safety team and found it initially uncomfortable to give feedback to others in regards to housekeeping and safety. How do we support teams so they feel comfortable/empowered to provide feedback to others in their lab?
  • Lab safety is a big priority in industry (as we all know) and experience with lab safety is a GOOD thing to put on your resume. I’m sure comments along these lines helped me get my first industry job.
  • Kudos for all the safety culture building!

LST Strategies

  • Do you think it’s advisable to separate safety leadership in a lab from the responsibilities of a lab manager?
  • What are some strategies for encouraging students to join the LST of their own accord? It seems important that this not necessarily be mandatory, but how do you get people excited about putting more time into something when everyone is typically stretched pretty thin?
  • What fallout has happened, or not, from the fatal lab accident that occurred at UCLA?
  • What hazards do the LST find most frequently?
  • What systematic changes have you seen that are sustainable?
  • What is the gender breakdown of researchers participating in LSTs? As a safety professional I am sensitive to recognizing the majority role women play in participating in “non-promotable” tasks. If a gender discrepancy exists, how can we address it?

Educational Opportunities

  • Hello, great webinar! This semester I am working with small groups of students from different labs (internships and rotations), and I think working on safety is a great topic to consider as part of the learning process. Any recommendations? Greetings from Peru.

  • If we educate students before they come to the lab, will it benefit the LST?

For More Information

New Lab Ventilation Video

Modern laboratory ventilation is a complex topic with many stakeholders and intersecting technologies that impact health, safety and sustainability concerns. This complexity can sometimes create confusion for both lab occupants and support staff in addressing ventilation concerns that arise.

To support effective discussion of this challenge between lab workers, facility staff and environmental health and safety professionals, the ACS Committee on Chemical Safety has partnered with CHAS to develop this video to help both lab occupants and support staff understand their roles and responsibilities in addressing lab ventilation concerns.

We will be providing further materials to support the use of this video over the next month. If you have any suggestions for questions these materials should address or resources we should cite, contact us at membership@dchas.org and we will consider your suggestions as we do this work. Also, if you would like to download a copy of this video, which has an Attribution-NonCommercial Creative Commons license, let us know at this address.

https://vimeo.com/612847280