On June 29, Ralph Stuart presented a webinar for Lab Manager magazine on the topic of Preventing and Managing the Most Likely Lab Accidents. The presentation highlighted a variety of ACS safety resources produced over the past five years and described how they can be used in the context of two historical laboratory incidents, including the death of Dr. Karen Wetterhahn.
This webinar was presented as part of a Division Innovative Project funded by the ACS to understand current practices in laboratory risk assessment and how ACS can support improving those practices. If you are interested in participating in this project, contact Ralph Stuart at email@example.com
On May 12, 2022, three chemists who have built careers in environmental health and safety work discussed how they found their way into this field, described the opportunities and challenges they encountered in making the transition from the laboratory to the safety profession, and offered advice for other chemists considering this career change.
The presentation file they shared is available below and the recording of the webinar is available at the ACS Webinars web site. The webinar included active audience participation, with several members of the audience sharing their own interests and experiences in the field.
In addition to questions and answers from the audience, the webinar included four poll questions about attendees' experience working with EHS professionals in their labs. The responses are outlined in the figures below.
I work as a lab manager at a university that has an EHS department. I earned CHO certification purely for educational purposes. What type of position would I look for to actually use this certification and grow into a safety professional?
I worked in chemistry scale up for 20 years before moving into Safety –
Great to see the early career/late career questions being asked here – I think the presenters are saying this is an any time opportunity when it is right for you.
If your employer doesn’t have an EHS department – do you see this as an opportunity or a problem?
The engineering bias is there when things are written down in my experience too – I don’t think that is a blocker in practice.
Is there an online space where people looking to connect through the comments can do that?
I would love to learn more about how to pursue a career in chem safety after being a CHO for my school and district, and how to find a new position after retiring from teaching Chemistry for 30 years.
In the past I have taken quite a few courses offered by Ken Roy (NSTA consultant in safety) while a teacher in CT, and have helped to write the Chemical Hygiene Plan for our school and district and to introduce a Chemical Hygiene Officer position. I believe our school and district have become safer as a result of the work.
I need to know how to gain certifications that can now support my skills and bring credibility when applying to positions in academia and/or industry.
I need to know what certifications and courses to take, and how to rewrite my resume to best apply for positions. Might someone here be willing to share their experience and mentor me through the transition?
What kinds of opportunities exist in the ESH&Q area for people freshly done with their undergraduate degree, or is a Master’s/PhD required for these positions?
I work at a small, private liberal arts school that doesn’t have an EHS department. :/
I have 20 years industry experience as a formulator. I’m inspired that it’s not too late to switch to a safety career but concerned that my resume will be overlooked as either over- or under-qualified in that field, or both. What training might be available for such a transition without getting in over my head?
When I look at EHS positions, there appear to be far more looking for an engineering background than a chemistry background. Any tips for how to frame transferable skills for people who are in an engineering mindset?
What was that free course mentioned?
Perhaps someone can address an “encore” career transition after retiring from a technical career.
As a chemistry professor at a community college that also does not have an EHS department, I’ve found myself in the role of needing to educate myself on EHS concerns. What additional resources would be useful for those of us who need to help build institutional EHS departments?
At what point should an R&D organization have a dedicated “full-time” Chemical Hygiene Officer? Our site’s current CHO performs their EH&S duties in addition to their full-time R&D position.
Is there any good way of avoiding near misses as far as lab safety is concerned?
And is a lab safety exam necessarily needed in academic institutions for students pursuing a major that involves a wet-lab environment?
I have a bachelor’s in Chemistry and an MBA with a concentration in project management. I used to work in the raw-material lab for AkzoNobel in Brazil, where I was responsible for generating security codes for handling raw materials in manufacturing. Once I moved to the US, I had to change careers, but I am interested in going back into the scientific world and perhaps safety.
How can we start a joint training program, funded by ACS, with academia in Egypt and the Middle East?
Hi, Colin, safety professional in Pharma based in UK – great webinar so far – brilliant presentation from Whitney.
To register for these workshops, click on the workshop description for that workshop. We can also help arrange presentations of these workshops in other venues. If you are interested in arranging any of these trainings for your company or local section meetings, contact us at firstname.lastname@example.org
Sunday, October 9, 2022: Empowering Academic Researchers to Strengthen Safety Culture
If you have any questions about these workshops, contact us at email@example.com or complete the workshop interest form below.
04/14/22 Table Read for The Art & State of Safety Journal Club
Excerpts and comments from “A Transferable Psychological Evaluation of Virtual Reality Applied to Safety Training in Chemical Manufacturing”
Published as part of the ACS Chemical Health & Safety joint virtual special issue “Process Safety from Bench to Pilot to Plant” in collaboration with Organic Process Research & Development and Journal of Loss Prevention in the Process Industries.
Matthieu Poyade, Claire Eaglesham,§ Jordan Trench,§ and Marc Reid*
Recent high-profile accidents—on both research and manufacturing scales—have provided strong drivers for culture change and training improvements. While our focus here is on process-scale chemical manufacturing,[a][b][c][d][e][f][g][h][i][j] similarly severe safety challenges exist on the laboratory scale; such dangers have been extensively reviewed recently. Through consideration of emerging digitalization trends and perennial safety challenges in the chemical sector, we envisaged interactive and immersive virtual reality (VR) as an opportune technology for developing safety training and accident readiness for those working in dangerous chemical environments.
VR enables interactive and immersive real-time task simulations across a growing wealth of areas. In higher education, prelab training in VR has the potential to give students multiple attempts to complete core protocols virtually in advance of experimental work, creating the time and space to practice outside of the physical laboratory.
Safety Educational Challenges
Safety education and research have evolved with technology, moving from videos on cassettes to online formats and simulations. However, the area still presents challenges, and very recent work has demonstrated that there must be an active link between pre-laboratory work and laboratory work in order for the advance work to have impact.
The primary question for this study can be framed as follows: When evaluated on a controlled basis, how do two distinct training methods[n][o][p][q][r][s], (1) VR training and (2) traditional slide-based training[t][u][v] (displayed as a video to ensure consistent delivery of the training), compare for the same safety-critical task?
We describe herein the digital recreation of a hazardous facility using VR to provide immersive and proactive safety training. We use this case to deliver a thorough statistical assessment of the psychological principles of our VR safety training platform versus the traditional non-immersive training (the latter still being the de facto standard for such live industrial settings).
Figure 3. Summarized workflow for safety training case identification and comparative assessment of PowerPoint video versus VR.
After completing their training, participants were required to fill in standardized questionnaires which aimed to formally assess 5 measures of their training experiences.
1. Task-Specific Learning Effect
Participants’ post-training knowledge of the ammonia offload task was assessed in an exam-style test composed of six task-specific open questions.
2. Perception of Learning Confidence
How well participants perform on a training exam and how they feel about the overall learning experience are not the same thing. Participant experiences were assessed through 8 bespoke statements, which were specifically designed for the assessment of both training conditions.
3. Sense of Perceived Presence
“Presence” can be defined as the depth of a user’s imagined sensation of “being there” inside the training media they are interacting with.
From the field of human–computer interaction, the System Usability Scale (SUS) has become the industry standard for assessing a system’s performance and fitness for its intended purpose… Users rate their level of agreement on a Likert scale, yielding a score out of 100 that can be converted to a grade from A to F. In our study, the SUS was used to evaluate the subjective usability of our VR training system.
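For context, the standard SUS arithmetic (Brooke's original scoring) can be sketched as follows; the response values shown are hypothetical, not data from the study:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    In the standard SUS, odd-numbered items are positively worded
    (contribution = response - 1) and even-numbered items are negatively
    worded (contribution = 5 - response). The raw 0-40 total is scaled
    to 0-100 by multiplying by 2.5.
    """
    if len(responses) != 10:
        raise ValueError("SUS uses exactly 10 items")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Hypothetical respondent who strongly agrees with every positive item
# and strongly disagrees with every negative item:
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```

A score near 80, like the 79.6 average reported later in the excerpt, sits in the top band of published SUS benchmarks, which is what the authors' "A−" grade conversion reflects.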
Transcripts of participant feedback—from both the VR and non-VR safety training groups—were used with the Linguistic Inquiry and Word Count (LIWC, pronounced “Luke”) program. Therein, the unedited and word-ordered text structure (the corpus) was analyzed against the LIWC default dictionary, outputting a percentage of words fitting psychological descriptors. Most importantly for this study, the percentage of words labeled with positive or negative affect (or emotion) were captured to enable quantifiable comparison between the VR and non-VR feedback transcripts.
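The word-percentage idea behind the LIWC analysis can be sketched with a toy dictionary; the word lists here are tiny placeholders, since the real LIWC lexicon is proprietary and far larger:

```python
import re

# Hypothetical mini-dictionaries for illustration only; LIWC's actual
# affect categories contain hundreds of entries and word stems.
POSITIVE = {"love", "nice", "sweet", "good", "great"}
NEGATIVE = {"hurt", "nasty", "ugly", "bad", "poor"}

def affect_percentages(corpus):
    """Return (% positive words, % negative words) for a feedback transcript."""
    words = re.findall(r"[a-z']+", corpus.lower())
    if not words:
        return 0.0, 0.0
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return 100 * pos / len(words), 100 * neg / len(words)

# 11 words, 2 positive, 1 negative -> roughly 18% positive, 9% negative
pos, neg = affect_percentages(
    "The VR training was great, really nice, not ugly at all.")
```

The study's comparison (4.2% positive words in the VR feedback corpus versus 2.1% in the video feedback corpus) is exactly this kind of dictionary-matched percentage, computed over each group's full transcript.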
Safety Training Evaluation
Having created a bespoke VR safety training platform for the GSK ammonia offloading task, the value of this modern training approach could be formally assessed versus GSK’s existing training protocols. Crucial to this assessment was the bringing together of experimental methods which focus on psychological principles that are not yet commonly practiced in chemical health and safety training assessment (Figure 3). All results presented below are summarized in Figure 8 and Table 1.
Figure 8. Summary of the psychological assessment of VR versus non-VR (video slide-based) safety training. (a) Task-specific learning effect. (b) Perception of learning confidence. (c) Assessment of training presence. (d) VR system usability score. In parts b and c, * and ** represent statistically significant results with p < 0.05 and p < 0.001, respectively.
1. Task-Specific Learning Effect (Figure 8a)
Task-specific learning for the ammonia offload task was assessed using a questionnaire[z][aa][ab][ac] built upon official GSK training materials and marking schemes. Overall, test scores from the Control group and the VR group showed no statistical difference between groups. However, there was a tighter distribution[ad] around the mean score for the VR group versus the Control group.
2. Perception of Learning Confidence (Figure 8b)
Participants’ perceived confidence in having gained new knowledge was assessed using a questionnaire composed of 8 statements probing multiple aspects of the learning experience… Within a 95% confidence limit, the VR training method was perceived by participants to be significantly more fit for training purposes than video slides. VR also gave participants confidence that they could next perform the safety task alone[ae][af][ag]. Moreover, participants rated VR as having more potential than traditional slides for helping train other complex tasks and improve decision-making skills (Figure 8b). Overall, participants from the VR group felt more confident and prepared for on-site training than those from the Control group.[ah]
3. Sense of Perceived Presence (Figure 8c)
The Sense of Presence questionnaire was used to gauge participants’ overall feeling of training involvement across four key dimensions. Results show that participants from the VR group reported experiencing a higher sense of presence than those from the Control group. On the fourth dimension, negative effects, participants from the Control group reported experiencing more negative effects than those from the VR group, but the difference was not statistically significant (Figure 8c). [ai]
4. Usability of the VR Training Platform (Figure 8d)
The System Usability Scale (SUS) was used to assess the effectiveness, intuitiveness, and satisfaction with which participants were able to achieve the task objectives within the VR environment. The average SUS score recorded was 79.559 (∼80, or grade A−), which placed our VR training platform on the edge of the top 10% of SUS scores (see Figure 5 for context). The SUS result indicated an overall excellent experience for participants in the VR group.
[Participants also] disagreed with any notion that the VR experience was too long (1.6 ± 0.7) and did not think it was too short (2.5 ± 1.1). Participants agreed that the simulation was stable and smooth (3.9 ± 1.2) and disagreed that it was in any way jaggy (2.3 ± 0.8). Hand-based interactions with the VR environment were agreed to be relatively intuitive (3.8 ± 1.3), and the head-mounted display was found to provide agreeable comfort for the duration of the training (4.0 ± 0.9).
5. Sentiment Analysis of Participant Feedback (Table 1)
In the final part of our study, we aimed to corroborate the formal statistical analysis with a quantitative analysis of open participant feedback. Using the text-based transcripts from both the Control and VR group participant feedback, the Linguistic Inquiry and Word Count (LIWC) tool provided further insight based on the emotional sentiment hidden in the plain text. VR participants were found to use more positively emotive words (4.2% of the VR training feedback corpus) versus the Control group (2.1% of the video training feedback corpus). More broadly, the VR group displayed a more positive emotive tone and used fewer negatively emotive words than the Control group.
Table 1. LIWC-Enabled Sentiment Analysis of Participant Feedback Transcripts
The table's columns report, for each group's transcript:
- no. of words in the transcript
- difference between positive and negative words (a score below 50 indicates net-negative tone)
- % positive words (e.g., love, nice, sweet)
- % negative words (e.g., hurt, nasty, ugly)
Overall, using our transferable assessment workflow, the statistical survey analysis showed that task-specific learning was equivalent[aj][ak][al][an] for the VR and non-VR groups. This suggests that VR training in this specific context is not detrimental to learning and appears to be as effective as the traditional training modality but, crucially, with improved user investment in the training experience. Further, the distribution difference between the two training modalities suggests that VR training provided a more consistent experience across participants than watching video slides, though more evaluation would be required to verify this.
In addition, perceived learning confidence and sense of perceived presence were both reported to be significantly better for the VR group than for the non-VR group. The reported differences in perceived learning confidence suggest that participants from the VR group, despite having acquired a similar amount of knowledge, felt more assured about the applicability of that knowledge. These findings thus suggest that VR training is a more engaging and psychologically involving modality, able to increase participants’ confidence in their own learning. [ao][ap][aq]Further research will also aim to explore the applicability and validation of the perceived learning confidence questionnaire introduced in this investigation.
Additionally, VR system usability was quantifiably excellent, according to the SUS score and feedback text sentiment analysis.
Although our experimental data demonstrate the value of the VR modality for health and safety training in chemical manufacturing settings, the sampling, and more particularly the variation in digital literacy[ar] among participants, may be a limitation to the study. Therefore, future research should explore the training validity of the proposed approach involving a homogeneously digitally literate cohort of participants to more rigorously measure knowledge development between experimental conditions.
4.1. Implications for Chemical Health and Safety Training
By requiring learners to complete core protocols virtually in advance of real work, VR pretask training has the potential to address issues of complex learning, knowledge retention[as][at][au], training turnover times, and safety culture enhancements. Researchers in the Chemical and Petrochemical Sciences operate across an expansive range of sites, from small laboratories to pilot plants and refineries. Therefore, beyond valuable safety simulations and training exercises, outputs from this work are envisaged to breed applications where remote virtual or augmented assistance can alleviate the significant risks to staff on large-scale manufacturing sites.
As a space category in buildings, chemistry laboratories are significantly more resource-intensive than office or storage spaces. The ability to deliver virtual chemical safety training, as demonstrated herein, could support the consolidation and recategorization of laboratory space, minimizing the utility and space expenditures that threaten sustainability.[av][aw][ax][ay][az][ba][bb] By developing new Chemistry VR laboratories, the high utility bills[bc][bd][be][bf] associated with running physical chemistry laboratories could potentially be significantly reduced.
4.3. Bridging Chemistry and Psychology
By bringing together psychological and computational assessments of safety training, the workflow applied herein could serve as a blueprint for future developments in this emerging multidisciplinary research domain. Indeed, the need to bring together chemical and psychological skill sets was highlighted in the aforementioned safety review by Trant and Menard.
Toward a higher standard of safety training and culture, we have described the end-to-end development of a VR safety training platform deployed in a dangerous chemical manufacturing environment. Using a specific process chemical case study, we have introduced a transferable workflow for the psychological assessment of an advanced training tool versus traditional slide-based safety training. This same workflow could conceivably be applied to training developments beyond safety.
Comparing our VR safety training versus GSK’s established training protocols, we found no statistical difference in the task-specific learning[bg] achieved in VR versus traditional slide-based training. However, statistical differences, in favor of VR, were found for participants’ positive perception of learning confidence and for their sense of presence (or involvement) in what was being taught. In sum, VR training was shown to help participants invest more in their safety training than a more traditional training setting does[bh].
Specific to the VR platform itself, the standard System Usability Scale (SUS) found that our development ranked as “A–” or 80%, placing it toward an “excellent” rating and well within the level of acceptance to deliver competent training.
Our ongoing research in this space is now extending into related chemical safety application domains.
[a]I would think the expense of VR could be seen as more “worth it” at this scale rather than at the lab scale given how much bigger and scarier emergencies can be (and how you really can’t “recreate” such an emergency in real life without some serious problems).
[b]Additionally, I suspect that in manufacturing there is more incentive to train up workers outside of the lecture/textbook approach. Many people are hands on learners and tend to move into the trades for that reason.
[c]I was also just about to make a comment that the budget for training and purchasing and upkeep for VR equipment is probably more negligible in those environments compared to smaller lab groups
[d]Jessica…You can create very realistic large scale simulations. For example, I have simulated 50-100 barrel oil spills on water, in rivers, ponds and lakes, with really good results.
[e]Oh – this is a good point. What is not taken into account here is the comfort level people have with different types of learning. It would be interesting to know if Ph.D. level scientists and those who moved into this work through apprenticeship and/or just a BA would’ve felt differently about these training experiences.
[f]Neal – I wasn’t questioning that. I was saying that those things are difficult (or impossible) to recreate in real life – which is why being able to do a simulation would be more attractive for process scale than for lab scale.
[g]The skill level of the participants is not known. A pilot plant team spans very skilled technicians to PhD level engineers and other scientists. I do not buy into Ralph’s observation.
[h]Jessica, I disagree. Simulated or hands-on is really valuable for learning skills that require both conceptual understanding and muscle memory tasks.
[i]I’m not disagreeing. What I am saying is that if you want to teach someone how to clean up a spill, it is a lot easier to spill 50 mL of something in reality and have them practice cleaning it up, than it is to spill 5000 L of something and ask them to practice cleaning it up. Ergo, simulation is going to automatically be more attractive to those who have to deal with much larger quantities.
[j]And the question wasn’t about skill level. It was about comfort with different sorts of learning. You can be incredibly highly skilled and still prefer to learn something hands-on – or prefer for someone to give you a book to read. The educational levels were somewhat being used as proxies for this (i.e. to get a PhD, you better like learning through reading!).
[n]I’d be curious to see how a case 3 of video lecture followed by VR training compares, because this would have a focused educational component and then a focused skill and habit development component
[o]I’d be interested to see how in person training would compare to these.
[p]This would also open up a can of worms. Is it in-person interactive learning? Is it in-person hands-on learning? Or is it sitting in a classroom watching someone give a PowerPoint in-person learning?
[q]I was imagining hands-on learning or the reality version of the VR training they did so they could inspect how immersive the training is compared to the real thing. Comparison to an interactive training could also have been interesting.
[r]I was about to add that I feel like comparison to interactive in-person training would’ve been good to see. I tend to think of VR as same as an in-person training but just digital.
[s]Thats why I think it could be interesting. They could see if there is in fact a difference between VR and hands-on training. Then if there was none you would have an argument for a cost saving in space and personnel.
[t]Comparing these two very different methods is problematic. If you are trying to assess the added immersive value of VR then it needed to be compared to another ‘active learning’ method such as computer simulation.
[u]Wouldn’t VR training essentially be a form of computer simulation? I was thinking that the things being compared were 2 trainings that the employees could do “on their own”. At the moment, the training done on their own is typically some sort of recorded slideshow. So they are comparing to something more interactive that is also something that the employee can do “on their own.”
[v]A good comparison could have been VR with the headset and controllers in each hand compared to a keyboard and mouse simulation where you can select certain options. More like Oregon Trail.
[w]I’ve never seen this graphic before, but I love it! Excel is powerful but user-hostile, particularly if you try to share your work with someone else.
Of course, Google and Amazon are cash cows for their platforms, so they have an incentive to optimize System Usability (partially to camouflage their commercial interest in what they are selling).
[x]I found this comparison odd. It is claiming to measure a computer system’s fitness for purpose. Amazon’s purpose for the user is to get people to buy stuff. While it may be complex behind the scenes, it has a very simple and pretty singular purpose. Excel’s purpose for the user is to be able to do loads of different and varied complex tasks. They are really different animals.
[y]Yes, I agree that they are different animals. However understanding the limits of the two applications requires more IT education than most people get. (Faculty routinely comment that “this generation of students doesn’t know how to use computers”.) As a result, Excel is commonly used for tasks that it is not appropriate for. But you’re correct that Amazon and Google’s missions are much simpler than those Excel is used for.
[z]I’d be very interested to know what would have happened if the learners were asked to perform the offload task.
[aa]Yes! I just saw your comment below. I wonder if they are seeing no difference here because they are testing on different platform than they trained. Would be interesting to compare written and in practice results for each group. Maybe the VR group would be worse at the written test but better at physically doing the tasks.
[ab]This could get at my concerns about the Dunning-Kruger effect that I mentioned below as well. Just because someone feels more confident that they can do something, doesn’t mean that they are correct about that. It definitely would’ve been helpful to actually have the employees perform the task and see how the two groups compared. Especially since the purpose of training isn’t to pass a test – it is to actually be able to DO the task!
[ac]“Especially since the purpose of training isn’t to pass a test – it is to actually be able to DO the task!” – this
[ad]If I’m understanding the plot correctly, it looks like there is a higher skew in the positive direction for the control group, which is interesting. I.e. lower but also higher scores. The VR training seems to have an evening-out effect, which makes the results of the training more predictable.
[ae]I’m slightly alarmed that VR gives people the confidence to perform the tasks alone when there was no statistical difference in the task-specific learning scores versus the control group. VR seems to give a false sense of confidence.
[af]I had the same reaction to reading this. I don’t believe a false sense of confidence in performing a task alone to be a good thing. Confidence in your understanding is great, but overconfidence in physically performing something can definitely lead to more accidents.
[ag]A comment I’ve seen is that the VR trainees may perform better in doing the action than the control but they only gauged their knowledge in an exam style, not in actually performing the task they are trained to do. But regardless, a false confidence would not be good.
[ah]Wonder if there is concern that this creates a false confidence given the exam scores.
[ai]Wow these are big differences. There is basically no overlap in the error bars except the negative effect dimension.
[aj]If the task-specific learning was equivalent, but one method caused people to express greater confidence than the other, is this necessarily a good thing? Taken to an extreme perhaps, wouldn’t we just be triggering the Dunning-Kruger effect?
[ak]I’d be interested in the value of VR option as refresher training, similar to the way that flight simulators are used for pilots. Sullenberger attributed surviving the bird strike at LaGuardia to lots of simulator time allowing him to react functionally rather than emotionally in the 30 seconds after birds were hit
[al]I had the same comment above. I was wondering if people should be worried about false confidence. But the problem with the assessment was on paper, they may be better at physically doing the tasks or responding in real time, that was not tested.
[an]From Ralph’s comment, I think there is a lot of value in VR for training for and simulating emergency situations without any of the risks.
[ao]Again, this triggers the question: should they be more confident in this learning?
[ap]I don’t think the available data let us distinguish between an overconfident test group and an under-confident control group (or both).
[aq]I am curious what the standard in the field is for determining at what point learners are overconfident. For example, are these good scores? Is there a correlation between higher test scores and confidence, or an inverse one indicating false confidence?
[ar]I am curious as well how much training there needed to be to explain how the VR setup worked. That is a huge factor to how easy people will perceive the VR setup to be if someone walks them through it slowly, versus walking up to a training station and being expected to know how to use it.
[as]I suspect that if people are more engaged in the VR training that they would have better retention of the training over time and would be interesting to explore. If so, that would be a great argument for VR.
[at]Right and if places don’t have the capacity to offer hands on training, an alternative to online-only is VR. Or, in high hazard work where it’s not advisable to train under the circumstances.
[au]I agree that there is a lot of merit for VR in high-hazard or emergency response simulation and training.
[av]I’m personally more interested in the potential to use augmented reality in an empty lab space to train researchers to work with a variety of hazards/operations.
[aw]Right, this whole time I wasn’t thinking about replacing labs I was thinking about adding a desk space for training. I am still curious about the logistics though, in the era of COVID if people will really be comfortable sharing equipment and how it will be maintained, because that all costs money as well.
[ax]This really would open up some interesting arguments. Would learning how to do everything for 4 years in college by VR actually translate to an individual being able to walk into a real lab and work confidently?
[ay]From COVID, some portion of the labs they were able to do virtually – like moving this etc. From personal standpoint, I’d say no and like we talked about here – it would give a false sense of confidence which could be detrimental. It’s the same way I feel about “virtual fire extinguisher training.” I don’t think it provides any of the necessary exposure.
[az]Amanda, totally agree. The virtual fire extinguisher training does not provide the feel (heat), smell (fire & agent), or sound of a live fire exercise.
[ba]Oh wait – virtual training and “virtual reality training” are pretty different concepts. I agree that virtual training would never be able to substitute for hands-on experience. However, what VR training has been driving at for years is to try to create such a realistic environment within which to perform the training that it really does “feel like you are there.” I’m not sure how close we are to that. In my experiences with VR, I haven’t seen anything THAT good. But I likely haven’t been exposed to the real cutting edge.
[bb]Jessica, I wouldn’t recommend that AR/VR ever be used as the only training provided, but I suspect that it could shorten someone’s learning curve.
Amanda and Neal, having done both, I think that both have their advantages. The heat of the fire and kickback from the extinguisher aren’t well-replicated, but I actually think that the virtual extinguisher can do a better job of simulating the difficulty. When I’ve done the hands-on training, you’d succeed in about 1-2 seconds as long as you pointed the extinguisher in the general vicinity of the fire.
[bc]Utility bills for lab spaces are higher than average, but a small fraction of the total costs of labs. At Cornell, energy costs were about 5% of the cost of operating the building when you took the salaries of the people in the building into account. This explains why utility costs are not as compelling to academic administrators as they might seem. However, if there are labor cost savings associated with VR…
[bd]I’d love to see a breakdown of the fixed cost of purchasing the VR equipment, and the incremental cost of developing each new module.
[be]Ralph, that is so interesting to see the numbers; I had no idea it was that small of an amount. I suppose the argument might need to be wasted energy/environmental considerations then, rather than cost savings.
[bf]Yes, that is why I had a job at Cornell – a large part of the campus community was very committed to moving toward carbon neutrality and supported energy conservation projects even with long payback periods (e.g., 7 years).
[bg]It seems like it would be more helpful to have a paper test and a field test to see if VR helped with physically doing the tasks, since that is the benefit that I see. In addition, looking at retention over time would be important. Otherwise the case for VR is harder to make if it’s not increasing the desired knowledge and skills.
[bh]How much of the statistical favoring of VR was due to the VR platform being “cool” and “new” versus the participants’ familiarity with traditional approaches? I do not see any control in the document for this.
[bi]I agree that work as reported seems to be demonstrating that the platform provides an acceptable experience for the learner, but it’s not clear whether the skills are acquired or retained.
[bj]Three of the four authors are in the VR academic environment; one is in a chemistry department. There seems to be real bias in showing that VR works in this setting.
[bk]One application I think this could help with is one I experienced today. I had to sit through an Underground Storage Tank training which was similar to the text-heavy PowerPoint in the video. If I had been able to self-direct my tour of the tank and play with the valves and detection systems, I would have been able to complete the regulatory requirements of understanding the system well enough to oversee its operations, but not well enough to have hands-on responsibilities. The hands-on work is left to the contractors who work on the tanks every day.
[bl]We have very good results using simulators in the nuclear power and aviation industries. BUT, there is still significant learning that occurs when things become real. The most dramatic example is landing high-performance aircraft on carrier decks. Every pilot agrees that nothing prepares them for the reality of this controlled crash.
Theme 1: Setting Core Values and Leading By Example.
Mission statements. …Mission statements establish the priorities and values of an organization, and can be useful in setting a positive example and signaling to potential new researchers the importance of safety. To examine the current state of public-facing mission statements, we accessed the websites of the top [a][b]50 chemistry graduate programs in the United States and searched any published mission statements for mentions of safety. [c]Although 29 departments had prominently displayed mission statements on their websites, only two (the University of Pennsylvania and UMN) included practicing safe chemistry in their mission statement (~at time of publishing in early 2021).[d][e][f][g]
Regular and consistent discussions about safety. Leaders can demonstrate that safety is a priority by regularly discussing safety in the context of experimental set-up, research presentations, literature presentations, or other student interactions[h][i]. Many potential safety teaching moments occur during routine discussions of the day-to-day lab experience.[j] Additionally, “safety minutes[k][l]”[m][n] have become a popular method in both industry and academia to address safety at the beginning of meetings.
Holding researchers accountable[o][p]. In an academic setting, researchers are still in the trainee stage of their careers. As a result, it is important to hold follow-up discussions on safety to ensure that safe practices are being properly implemented.[q] For example, in the UMN Department of Chemistry a subset of the departmental safety committee performs regular PPE “spot checks,”[r][s] and highlights exemplary behavior through spotlights in departmental media. Additionally, each UMN chemistry graduate student participates in an annual review process that includes laboratory safety as a formal category.[t] Candid discussion and follow-up on safe practices is critical for effective trainee development.
Theme 2: Empowering Researchers to Collaborate and Practice Safe Science.
Within a group: designate safety officers. Empower group members to take charge of safety within a group, both as individuals and through formal appointed roles such as laboratory safety officers (LSOs).[u] LSOs can serve multiple important roles within a research group. First, LSOs can act as a safety role model for peers in the research group, and also as a resource for departmental policy. Further, they act as liaisons to ensure open communication between the PI, the research group, and EHS staff. These types of formal liaisons are critical for fostering collaborations between a PI, EHS, and the researchers actually carrying out labwork. LSOs can also assist [v][w]with routine safety upkeep in a lab, such as managing hazardous waste removal protocols and regularly testing safety equipment such as eyewash stations. For example, in the departments of chemistry at UMN and UNC, LSOs are responsible for periodically flushing eyewash stations and recording the check on nearby signage[x].[y][z][aa][ab] Finally, LSOs are also natural picks for department-level initiatives such as department safety committees or student-led joint safety teams.[ac]
Within a group: work on policies as a collective. Have researchers co-write, edit, and revise standard operating procedures (SOPs).[ad] Many EHS offices have guides and templates for writing SOPs. More details on SOPs will be discussed in the training section of this chapter. Co-writing has the double benefit of acting as a safety teaching moment while also helping researchers feel more engaged and responsible for the safety protocols of the group.[ae][af][ag]
Within a department: establish safety collaborations.[ah] Research groups within departments often have very diverse sets of expertise. These should be leveraged through collaboration to complement “blind spots”[ai] within a particular group to the benefit of all involved—commonly, this is done through departmental safety committees, but alternative (or complementary) models are emerging. An extremely successful and increasingly popular strategy for establishing department-wide safety collaborations is the Joint Safety Team model.
Joint Safety Teams (JSTs). A Joint Safety Team (JST) is a collaborative, graduate student- and postdoc-led initiative with the goal of proliferating a culture of laboratory safety by bridging the gaps between safety administration, departmental administration, and researchers. JSTs are built on the premise that grassroots efforts can empower students to take ownership of safety practices, thus improving safety culture from the ground up. Since its inception in 2012, the first JST at UMN (a joint endeavor between the departments of Chemistry and Chemical Engineering & Materials Science, spurred through collaboration with Dow Chemical) has directly and significantly impacted the adoption of improved safety practices, and also noticeably improved the overall safety culture. Indeed, the energy and enthusiasm of students, which are well-recognized as key drivers in research innovation, can also be a significant driver for improving safety culture.[aj][ak]
…The JST model has several advantages[al]: (1) it spreads the safety burden across a greater number of stakeholders, reducing the workload for any one individual or committee; (2) it provides departmental leadership with better insight into the “on-the-ground” attitudes and behaviors of researchers;[am][an][ao][ap] and (3) it provides students with practical safety leadership opportunities that will be beneficial to their career. In fact, many of the strategies discussed in this chapter can be either traced back to a JST, or could potentially be implemented by a JST.
An inherent challenge with student-led initiatives like JSTs is that the high student turnover in a graduate program necessitates ongoing enthusiastic participation of students after the first generation. In fact, after the first generation of LSOs left the UMN JST upon graduation, there was a temporary lag in enthusiasm and engagement. In order to maintain consistent engagement with the JST, the departmental administration created small salary bonuses for officer-level positions within the JST, as well as a funded TA position. Since spending time on JST activities takes away from potential research or other time, it seems reasonable that students be compensated accordingly when resources allow it…[aq][ar]
Initial Safety Training. Chemical safety training often starts with standardized, institution-wide modules that accompany a comprehensive chemical hygiene plan or laboratory safety manual. These are important resources, but researchers can be overwhelmed by the amount of information[as][at]—particularly if only some components seem directly relevant. There is anecdotal evidence that augmentation with departmental-level training[au][av][aw] initiatives can provide a stronger safety training foundation. For example, several departments have implemented laboratory safety courses. At UNC, a course for all first-year chemistry graduate students was created with the goal of providing additional training tailored to the department’s students. Iteratively refined using student feedback, the course operates via a “flipped classroom” model to maximize engagement. The course combines case study-based instruction, role playing (i.e., “what would you do” scenarios), hands-on activities (e.g., field trips to academic labs), and discussions led by active researchers in chemistry subdisciplines or industrial research.
Continued Safety Training[ax]. Maintaining engagement and critical thinking about safety can help assure that individual researchers continue to use safe practices,[ay] and can strengthen the broader safety culture. Without reinforcement, complacency is to be expected—and this can lead to catastrophe. We believe continued training can go well beyond an annual review of documentation by incorporating aspects of safety training seamlessly into existing frameworks.
Departments can incorporate safety minutes into weekly seminars or meetings, creating dedicated time for safety discussions (e.g., specific hazards, risk assessment, or emergency action plans). In our experience, safety minutes are most effective when they include interactive prompts[az] that encourage researcher participation and discussion. These safety minutes allow researchers to learn from one another and collaborate on safety.
While safety minutes provide continuous exposure and opportunities to practice thinking about safety, they lack critical hands-on (re)learning experiences. Some institutions have implemented annual researcher-driven activities to help address this challenge. For example, researchers at UNC started “safety field days” (harkening back to field days in grade school) designed specifically to offer practice with hands-on techniques in small groups. At UMN, principal investigators participate in an annual SOP “peer review” process, in which two PIs pair up and discuss strengths and potential weaknesses of a given research group’s SOP portfolio.[ba][bb]
Continued training can also come, perhaps paradoxically, when a researcher steps into a mentoring role to train someone else. The act of mentoring a less experienced researcher provides training to the mentee, but also forces the mentor to re-establish a commitment to safe practices and re-learn best practices[bc].[bd] Peer teaching approaches have long been known to improve learning outcomes for both the mentor and the mentee. With regards to safety training, mentors would be re-exposed to a number of resources used in initial safety trainings, such as SOPs, while having to demonstrate mastery over the material through their teaching. Furthermore, providing hands-on instruction would require demonstrating techniques for mentees and subsequently critically assessing and correcting the mentee’s technique. Additionally, mentors would have to engage in critical thinking about safety while answering questions and guiding the mentee’s risk assessments.
Continued training is important for all members of a research group. Seniority does not necessarily convey innate understandings of safety, nor does it exempt one from complacency. For example, incoming postdocs[be] will bring a wealth of research and safety knowledge, but they may not be as intimately familiar with all of the hazards or procedures of their new lab, or they may come from a group or department with a different safety culture. Discussing safety with researchers can be difficult, but is a necessary part of both initial and continued training.
Awareness as it pertains to chemical safety involves building layers of experience and expertise on top of safety training. It is about being mindful and engaged in the lab and being proactive and prepared in the event of an adverse scenario. Awareness is about foreseeing and avoiding accidents before they happen, not just making retroactive changes to prevent them from happening again. There are aspects of awareness that come from experiential learning—knowing the sights and sounds of the lab—while other aspects grow out of more formal training. For example, awareness of the potential hazards associated with particular materials or procedures likely requires some form of risk assessment.
Heightened awareness grows out of strong training and frequent communication, two facets of an environment with a strong culture of safety. Communication with lab mates, supervisors, EHS staff, industry, and the broader community builds awareness at many levels. Awareness is a critical pillar of laboratory safety because it links the deeply personal aspects of being a laboratory researcher with the broader context of safety communication, training, and culture. It also helps address the challenge of implementing a constructive and supportive infrastructure.
Like any experience-based knowledge, safety awareness will vary significantly between individuals. Members of a group likely have had many of the same experiences, and thus often have overlap in their awareness. Effective mentoring can lead to further overlap by sharing awareness of hazards even when the mentee has not directly experienced the situation. Between groups in a department, however, where the techniques and hazards can vary tremendously, there is often little overlap in safety awareness. [bg][bh][bi][bj]In this section, we consider strategies to heighten researcher safety awareness at various levels through resources and tools that allow for intentional sharing of experiential safety knowledge.
Awareness Within an Academic Laboratory[bk]. …Some level of awareness[bl] will come through formal safety training, as was discussed in the preceding section. Our focus here is on heightened awareness through experiential learning, mindset development, and cultivating communication and teaching to share experience between researchers.
We have encountered or utilized several frameworks for building experience. One widely utilized model involves one-on-one mentoring schemes, where a more experienced researcher is paired with a junior researcher. This provides the junior researcher an opportunity to hear about many experiences and, when working together, the more experienced researcher can point out times when heightened awareness is needed. All the while, the junior researcher is running experiments and learning new techniques. There are drawbacks to this method, though. For example, the mentor may not be as experienced in particular techniques or reagents needed for the junior researcher’s project. Or the mentor may not be effective in sharing experience or teaching awareness. Gaps in senior leadership can develop in a group, leaving mentorship voids or leading to knowledge transfer losses. Like the game “telephone” where one person whispers a message to another in a chain, it is easy for the starting message to change gradually with each transfer of knowledge.[bm] This underscores the importance of effective mentoring,[bn] and providing mentors with a supportive environment and training resources such as SOPs and other documentation…
…Another approach involves discussing a hypothetical scenario, rather than waiting to experience it directly in the lab. Safety scenarios are a type of safety minute that provide an opportunity to proactively consider plausible scenarios that could be encountered.[bo][bp] Whereas many groups or departments discuss how to prevent an accident from happening again, hypothetical scenarios provide a chance to think about how to prevent an accident before it happens[bq][br]. Researchers can mediate these scenarios at group meetings. If a researcher is asked to provide a safety scenario every few weeks, they may also spend more time in the lab thinking about possible situations and how to handle them on their own.
Almost all of these methods constitute some form of risk or hazard assessment. As discussed in the training section, formal risk assessment has not traditionally been part of the academic regimen. Students are often surprised [bs]to learn that they perform informal risk assessments constantly, as they survey a procedure, ask lab mates questions about a reagent, or have a mentor check on an apparatus before proceeding. Intuition provides a valuable risk assessment tool, but only when one’s intuition is strong. A balance[bt] of formal risk assessment and informal, experiential, or intuition-based risk assessment is probably ideal for an academic lab.
…Checklists are another useful tool for checking in on safety awareness. [bu][bv][bw]Checklists are critically important in many fields, including aviation and medicine. They provide a rapid visual cue that can be particularly useful in high-stress situations where rapid recall of proper protocol can be compromised, or in high-repetition situations where forgetting a key step could have negative consequences. Inspired by conversations with EHS staff, the Miller lab recently created checklists that cover daily lab closing, emergency shutdowns, glovebox malfunctions, high-pressure reactor status, vacuum line operations, and more. The checklists are not comprehensive, and do not replace in-person training and SOPs, but instead capture the most important aspects that can cause problems or are commonly forgotten.[bx][by] The signage provides a visual reminder of the experiential learning that each researcher has accumulated, and can provide an aid during a stressful moment when recall can break down.
A unifying theme in these approaches is the development of frameworks for gaining experience with strong mentoring and researcher-centric continued education. Communication is also essential, as this enables the shared experiences of a group to be absorbed by each individual…[bz][ca]
[b]I can’t remember exactly what we did, but if we were writing this today, that is where I’d probably start. There is plenty of discussion about how accurate/useful those rankings are, but in this instance, it serves as a good starting point.
[c]If safety is not part of your mission statement or your graduate student handbook, then this could cause issues with any disciplinary actions you may want to take. For instance, we had a graduate student set off the fire alarm twice on purpose in a research-intense building. It was difficult for the department this person was part of to take action.
We have a separate section on safety in our handbook: “Chemical engineering research often involves handling and disposing of hazardous materials. Graduate students must follow safe laboratory practices as well as attend a basic safety training seminar before starting any laboratory work. In order to promote a culture of safety, the department maintains an active Laboratory Safety Committee composed of the department head, faculty, staff, and student members, which meets each semester. Students are expected to be responsive to the safety improvements suggested by the committee, to serve on the committee when asked, and to utilize the committee members as a resource for lab safety.”
[d]This is an interesting perspective on how others have prioritized safety.
[e]I find these sorts of things ring hollow given how little PIs or department leadership seem to know about what is happening in other people’s labs.
[f]I agree, particularly at the university level. However, a number of labs have mission statements, including the Miller lab, that mention safety. I think at that level, it certainly can demonstrate greater intent.
[g]Agreed. Having that language come from the PI is definitely different from having it come from the department or university – especially if the PI also walks the walk.
[h]So important because what is important to the professor becomes important to the student
[i]I definitely agree with this. I have always noticed that our students immediately reflect, pick up on, and look for what they think their professors deem most necessary/important/essential.
[j]I think this is an underappreciated area. These are the conversations that help make safety a part of doing work, not just something bolted on to check a box or provide legal cover.
[l]I would say somewhat? I’ve come to realize that these two terms are often used interchangeably, but can mean very different things. At UNC, what my lab called “safety minutes” would be a dedicated section of group meeting every week where someone would lead a hypothetical scenario or a discussion of how to design a dangerous experiment with cradle-to-grave waste planning. At other places, these can mean things as simple as a static slide before seminar.
[m]The inverse of my comment above. I think these have their place, but if they’re disconnected from what the group’s “really” there to talk about, they can reinforce the idea that safety isn’t a core part of research.
[n]Agreed. I’ve seen some groups implement this by essentially swiping “safety minutes” from someone else. While this could be a way to get started, the items addressed really should be specific to your research group and your lab in order to be meaningful.
[o]Note how all these examples are of the department holding its own researchers accountable, not EHS coming externally to enforce
[p]Yes! It doesn’t fight against the autonomy that’s a core value of academia.
[q]This is a good point. Can’t be a one-and-done although it often feels like it is treated that way.
[r]It seems like this practice would also build awareness and appreciation of the other lab settings and help to foster communication between groups.
[s]I would also hope that it would normalize reminding others about their PPE. I was surprised to find how many faculty members in my department were incredibly uncomfortable with correcting the PPE or behavior of a student from a different lab. It meant effectively that we had extremely variable approaches to PPE and safety throughout our department.
[t]LOVE this idea. I’m guessing it makes the PI reflect on safety of each student as well as start a conversation about what’s going well and ways to improve
[u]We’ve seen dramatic improvement in safety issue resolution once we implemented this kind of program.
[v]Note how the word assist is used. It is important to emphasize that the PI is delegating some of their duties to the LSO and others throughout the lab, but the PI is ultimately responsible for the safety of their researchers.
[w]Really important point. Too often I’ve seen an LSO designated just so the PI can essentially wash their hands of responsibility and just hand everything safety related to the LSO. It is also important for the PI to be prepared to back their LSO if another student doesn’t want to abide by a rule.
[x]Wondering what the risk is of institutions taking advantage of LSO programs by putting tasks on researchers that should really be the responsibility of the facility or EHS
[y]At UMN and UNC, do these tasks/roles contribute towards progress towards degree?
[z]In my experience at UNC serving as an LSO, it does not relate to progressing one’s studies/degree in any way (though there is the time commitment component of the role). At UMN, I know departmentally they have stronger support for their LSOs and requirements but I would not say that serving as an LSO helps/hinders degree progression (outside again of potential time commitments).
[aa]Thanks! I’m glad to hear that it doesn’t seem to hurt, but I think finding ways for it to help could make a huge difference. Thinking back to my own experience, my advisor counseled us to always have that end goal in mind when thinking about how we spent our time. This was in the context of not prioritizing TA duties ahead of research, but it is something that could argue against taking on these sorts of tasks.
[ab]Yeah, I think that is a really important point. If your PI continuously stresses only the results of research/imparts a sense of speed > safety, the students will pick up on that and shift in that direction as a function of their lab culture. So the flipside is that if you can build and sustain a strong culture of safety, it becomes an inherent requirement, not an external check.
[ac]It is important to keep in mind that this work should be considered in relation to other duties and to somehow be equally shared out among lab members. Depending on how this work is distributed, it can become an incredibly time-consuming set of tasks for one person to constantly be handling.
[ad]I have struggled with how to get buy-in for production of written SOPs.
[ae]It also increases the likelihood that the researchers will be able to implement the controls!
[af]Important point. It is often difficult in grad school to admit when you don’t understand or know how to do something. It is critical to make sure that they understand what is expected. I ran a small pilot project in which I found out that all 6 “soon to be defending” folks involved in the pilot had no clue what baffles in a hood were. Our Powerpoint “hoods” training was required every year. Ha!
[ag]In addition, it serves as a review process to catch risks/hazards that the first writer may not have thought of. In industry this is a common practice: multiple people must sign off on a protocol before it is used.
[ah]As outlined in this section, a good idea. We’re also working toward development of collaboration between staff with safety functions. For instance, have building managers from one department involved in inspections of buildings of other departments.
[ai]Also to avoid re-inventing the wheel. If another group has this expertise and has done risk assessments on the procedure you’re doing, better to start there rather than from scratch. You may even identify items for the other group to add.
[aj]Bottom up approach works extremely well when you have departmental support but not so much when the Head of the Dept doesn’t care.
[ak]They also can’t be effective if the concerns that the JST raise aren’t taken seriously by those who can change policies.
[al]An additional advantage is displaying value for performing safety duties, i.e., the culture is developed such that your work is appreciated rather than an annoyance.
[am]I’m curious to hear what discoveries have been made about this.
[an]At UConn, we were trying to use surveys to essentially prove to our department leadership and faculty safety committee that graduate students actually DID want safety trainings – we just wanted them to be more specific to the things that we are actually concerned about in lab (as opposed to the same canned ones that they kept offering). My colleagues have told me that they are actually moving forward with these types of trainings now.
[ao]We also started holding quarterly LSO meetings because we proved to faculty through surveys that graduate students actually did want them (as long as they were designed usefully and addressed real issues in the research labs).
[ap]We work with representatives from various campus entities, which brings us a variety of insights. Yes, focused training is much more valuable, and feels more worthwhile, educating both the trainer and trainee.
[aq]This is an impressive way for department administration to show endorsement and support for safety efforts.
[ar]Another way departments can communicate their commitment towards safety.
[as]We have one of these, too, but we need to move away from it. Not augmentation, but replacement. I don’t know what that looks like yet, but a content dump isn’t it.
[at]Agreed. At UNC we actually have an annual requirement to review the laboratory safety manual with our PI in some sort of setting (the PI has to sign off on forms saying they did it). Obviously, with its length, that isn’t feasible in its entirety, so we’d highlight a few key sections. But yeah, not the most useful document.
[au]I see the CHP or lab safety manual as education/resources and the training as actually practicing behaviors that are expected of researchers, which is critical to actually enforcing policies
[av]Agreed. I always felt any “training” should actually be doing something hands-on. Sitting in a lecture hall watching a Powerpoint should not qualify as “training.”
[aw]Agreed as well Jessica. That is why now all our safety trainings are hands on as you stated. It has worked and come across MUCH better. Even with our facilities and security personnel.
[ax]I really like how continued training is embedded throughout regular day-to-day activities in many cases; this is important. I would add that it is important to have the beginning training available for refresher or reference as needed, but I don’t think it’s worth it to completely retake the online trainings.
[ay]Let me remind everyone that when a senior lab person shows a junior person how to do a procedure, training is occurring. Including safety aspects in the teaching is critical. Capturing this with a note in the junior person’s lab notebook documents it. The UCLA prosecution would not have occurred if the PD had done this with Ms. Sangji.
[az]I’m very curious to hear about examples of this.
[ba]Wow. Getting the PIs to do this would be awesome. I wonder what is the PI incentive. Part of their annual review?
[bc]In recognition of this, our department has put together an online lab research mentoring guide, and we’re looking for ways to disseminate info about it.
[bd]This works both ways; a mentee ending up with a mentor who doesn’t emphasize safety in their work might be communicating that to their mentees as well.
[be]This has been something I’ve been concerned about and am not sure how to address as an embedded safety professional.
[bf]Just a thought – It seems like there is a big overlap between continued training and developing awareness, which makes sense
[bg]In working with graduate students, I have found a really odd understanding of this to be quite common. Many think that they are only responsible for understanding what is going on in their own labs – and not for what may be going on next door.
[bh]So true. This is where EHS could really help departments or buildings define safety better. Most people may not be aware that the floor above them uses pyrophorics, etc.
[bi]I think this speaks to how insular grad school almost forces you to be. You spend so much time deepening your knowledge and understanding of your area of research that you have no time to develop breadth.
[bj]Yeah, these are great points. Anecdotally, at UNC when we started the JST we really struggled to get any engagement whatsoever out of the computational groups, even when their work space is across the hall from heavily synthetic organic chemistry groups. We didn’t really solve this, but I know it was and is something we’re chewing on
[bk]Not mentioned explicitly in this section, but documenting what is learned is critically important. As noted earlier in the paper, students cycle through labs. The knowledge needs to stay there.
[bl]From my perspective, awareness seems to be directly tied to the PI’s priorities, except for the 1 in 20 student.
[bm]It seems like something like this happened in the 2010 Texas Tech explosion.
[bn]This highlights the importance of developing procedures/protocols/SOPs, secondary review, and good training/onboarding practices for specific techniques
[bo]These are always great discussions and fruitful.
[bp]Agreed. Even if you never encounter that scenario in your career, the process of how to think about responding to the unexpected is a generalizable skill.
[bq]I also think it is important to use something like this to help researchers think about how to respond to incidents when they do happen in order to decrease the harm caused by the incident.
[br]This is really great. I think a big part of knowing what to do when a lab incident occurs has a lot to do with thinking about how to respond to the incident before it happens.
[bs]I’m really glad this is included in here. Most of risk assessment is actually very intuitive, but this highlights the importance of going through the process in a formal way. But the term is so unfamiliar to researchers sometimes that it seems unapproachable.
[bt]I’m interested in learning how others judge the way to find this balance.
[bu]These are useful if the student doing the work develops the checklist; otherwise it becomes just a checklist without understanding and thought. I see many students look at these checklists and ignore hazards because they are not on the list.
[bv]Good point. It would likely be a good practice to periodically update these as well – especially encouraging folks to bring in things that they’ve come across that were done poorly or they had to clean up.
[bw]These are great points. The checklists we’ve designed try their best to highlight major hazards, but for brevity it isn’t possible to cover everything. As Jessica pointed out, reviewing them periodically can be a huge boost, in the same way that reviewing and updating SOPs as living documents is important
[bx]This is important – a good checklist is neither an SOP nor a training.
[by]I utilize checklists but sometimes see a form of checklist fatigue – a repeated user thinks they know it and doesn’t bother with the checklist.
So the comment about a GOOD checklist is applicable.
[bz]I’m impressed with the ideas and diversity and discussion of alternatives. It’s inspiring. However, I’m trying to institute a culture of safety where I am, and many of these ideas aren’t possible for me. I’m in chemistry at a 2-year (community) college, and I don’t have TAs, graduate students, etc. We’re not a research institution, which is somewhat of an advantage because our situations are relatively static and theoretically controllable. My other problem is I’m trying to carry the safety culture across the college and to our sister colleges, to departments like maintenance and operations, art, aircraft mechanics, welding, etc.
I would love to see ways to address
1. just a teaching college
2. other processes across campus.
I head a safety committee but am challenged to keep people engaged and aware.
[ca]I’ve found some helpful insights from others about this sort of thing from NAOSMM. They have a listserv and annual conferences where they offer workshops and presentations on safety helpful for non-research institutions like the one you’re describing.
Safety is at the core of industrial and academic laboratory operations worldwide and is arguably one of the most important parts of any enterprise, since worker protection is key to maintaining ongoing activity.28 At the core of these efforts is the establishment of clear expectations regarding acceptable conditions and procedures necessary to ensure protection from accidents and unwanted exposures[a]. To achieve adherence to these expectations, frequent and well-documented inspections are made an integral part of these systems.23
Consideration of the inspection format is essential to the success of this process.31 Important components to be mindful about include frequency of inspections, inspector training, degree of recipient involvement, and means of documentation[b][c][d][e] . Within documentation, the form used for inspection, the report generated, and the means of communicating, compiling, and tracking results all deserve scrutiny in order to optimize inspection benefits.27
Within that realm of documentation, inspection practice often depends on the use of checklists, a widespread and standard approach. Because checklists make content readily accessible and organize knowledge in a way that facilitates systematic evaluation, they are understandably preferred for this application. In addition, checklists reduce the frequency of omission errors and, while not eliminating variability, do increase consistency in inspection elements[f][g][h][i][j] among evaluators because users are directed to focus on a single item at a time.26 This not only amplifies the reliability of results, but it also can proactively communicate expectations to inspection recipients and thereby facilitate their compliance preceding an inspection.
However, checklists do have limitations. Most notably, if items on a list cover a large scale and inspection time is limited, reproducibility in recorded findings can be reduced[k]. In addition, individual interpretation and inspector training and preparation can affect inspection results[l][m][n][o][p][q].11 The unfortunate consequence of this variation in thoroughness is that without a note of deficiency there is not an unequivocal indication that an inspection has been done. Instead, if something is recorded as satisfactory, the question remains whether the check was sufficiently thorough or even done at all. Therefore, in effect, the certainty of what a checklist conveys becomes only negative feedback[r][s][t].
Even with uniformity of user attention and approach, checklists risk producing a counterproductive form of tunnel vision[u] because they can tend to discourage recognition of problematic interactions and interdependencies that may also contribute to unsafe conditions.15 Also, depending on format, a checklist may not provide the information on how to remedy issues nor the ability[v][w] to prioritize among issues in doing follow-up.3 What’s more, within an inspection system, incentive to pursue remedy may only be the anticipation of the next inspection, so self-regulating compliance in between inspections may not be facilitated.[x][y][z][aa][ab][ac]22
Recognition of these limitations necessitates reconsideration of the checklist-only approach, and as part of that reevaluation, it is important to begin with a good foundation. The first step, therefore, is to establish the goal of the process. This ensures that the tool is designed to fulfill a purpose that is widely understood and accepted.9 Examples of goals of an environmental health and safety inspection might be to improve safety of surroundings, to increase compliance with institutional requirements, to strengthen preparedness for external inspections, and/or to increase workers’ awareness and understanding of rules and regulations. In all of these examples, the aim is to either prompt change or strengthen existing favorable conditions. While checklists provide some guidance for change, they do not bring about that change, [ad][ae]and they are arguably very limited when it comes to definitively conveying what favorable conditions to strengthen. The inclusion of positive feedback fulfills these particular goals.
A degree of skepticism and reluctance to actively include a record of positive observations[af][ag] in an inspection, is understandable since negative feedback can more readily influence recipients toward adherence to standards and regulations. Characterized by correction and an implicit call to remedy, it leverages the strong emotional impact of deficiency to encourage limited deviation from what has been done before.19 However, arousal of strong negative emotions, such as discouragement, shame, and disappointment, also neurologically inhibits access to existing neural circuits thereby invoking cognitive, emotional, and perceptual impairment.[ah][ai][aj]10, 24, 25 In effect, this means that negative feedback can also reduce the comprehension of content and thus possibly run counter to the desired goal of bringing about follow up and change.2
This skepticism and reluctance may understandably extend even to including positive feedback alongside negative feedback, since affirmative statements do not leave as strong an impression as critical ones. However, studies have shown that the details of negative comments will not be retained without sufficient accompanying positive comments.[ak][al][am][an][ao][ap][aq]1[ar][as] The importance of this balance has also been shown for workplace team performance.19 The correlation observed between higher team performance and a higher ratio of positive comments in the study by Losada and Heaphy is attributed to an expansive emotional space, opening possibilities for action and creativity. By contrast, lower-performing teams demonstrated restrictive emotional spaces as reflected in a low ratio of positive comments. These spaces were characterized by a lack of mutual support and enthusiasm, as well as an atmosphere of distrust and [at]cynicism.18
The consequence of positive feedback in and of itself also provides compelling reason to regularly and actively provide it. Beyond increasing comprehension of corrections by offsetting critical feedback, affirmative assessment facilitates change by broadening the array of solutions considered by recipients of the feedback.13 This dynamic occurs because positive feedback adds to an employee’s security and thus amplifies their confidence to build on their existing strengths, empowering them to perform at a higher level[au].7
Principles of management point out that to achieve high performance employees need to have tangible examples of right actions to take[av][aw][ax][ay][az], including knowing what current actions to continue doing.14, 21 A significant example of this is the way that Dallas Cowboys football coach, Tom Landry, turned his new team around. He put together highlight reels for each player that featured their successes. That way they could identify within those clips what they were doing right and focus their efforts on strengthening that. He recognized that it was not obvious to natural athletes how they achieved high performance, and the same can be true for employees and inspection recipients.7
In addition, behavioral science studies have demonstrated that affirmation builds trust and rapport between the giver and the receiver[ba][bb][bc][bd][be][bf][bg].6 In the context of an evaluation, this added psychological security contributes to employees revealing more about their workplace, which can be an essential component of a thorough and successful inspection.27 Therefore, positive feedback encourages the dialogue needed for conveying adaptive prioritization and practical means of remedy, both of which are often requisite to solving critical safety issues.[bh]
Giving positive feedback also often affirms an individual’s sense of identity in terms of their meaningful contributions and personal efforts. Acknowledging those qualities, therefore, can amplify them. This connection to personal value can evoke the highest levels of excitement and enthusiasm in a worker, and, in turn, generate an eagerness to perform and fuel energy to take action.8
Looked at from another perspective, safety inspections can feel personal. Many lab workers spend a significant amount of time in the lab, and as a result, they may experience inspection reports of that setting as a reflection on their performance rather than strictly an objective statement of observable work. Consideration of this affective impact of inspection results is important since, when it comes to learning, attitudes can influence the cognitive process.29 Indeed, this consideration can be pivotal in transforming a teachable moment into an occasion of learning.4
To elaborate, positive feedback can influence the affective experience in a way that can shift recipients’ receptiveness to correction[bi][bj]. Notice of inspection deficiencies can trigger a sense of vulnerability, need, doubt, and/or uncertainty. At the same time, though, positive feedback can promote a sense of confidence and assurance that is valuable for active construction of new understanding. Therefore, positive feedback can encourage the transformation of the intense interest that results from correction into the contextualized comprehension necessary for successful follow-up to recorded deficiencies.
Overall, then, a balance of both positive and negative feedback is crucial to ensuring adherence to regulations and encouraging achievement.[bk]2, 12 Since positive emotion can influence how individuals respond to and take action from feedback, the way that feedback is formatted and delivered can determine whether or not it is effective.20 Hence, rooting an organization in the value of positivity can reap some noteworthy benefits including higher employee performance, increased creativity, and an eagerness in employees to engage.6
To gain these benefits, it is, therefore, necessary to expand from the checklist-only approach.16 Principles of evaluation recommend a report that includes both judgmental and descriptive information. This will provide recipients with the information they seek regarding how well they did and the success of their particular efforts.27 Putting the two together creates a more powerful tool for influence and for catalyzing transformation.
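As a rough illustration only (not from the paper; all names here are hypothetical), the “judgmental plus descriptive” report format can be sketched as a simple data structure: each checklist item carries a pass/fail judgment plus optional narrative notes, so the rendered report pairs corrective findings with positive feedback.

```python
from dataclasses import dataclass

@dataclass
class InspectionItem:
    """One checklist item plus the descriptive feedback argued for above."""
    topic: str
    compliant: bool
    positive_note: str = ""    # what the lab is doing well (keep doing it)
    corrective_note: str = ""  # what needs to change, and ideally how

def format_report(lab: str, items: list[InspectionItem]) -> str:
    """Render a report pairing pass/fail status with narrative feedback."""
    lines = [f"Inspection report: {lab}"]
    for item in items:
        status = "OK" if item.compliant else "ACTION NEEDED"
        lines.append(f"- {item.topic}: {status}")
        if item.positive_note:
            lines.append(f"    strength: {item.positive_note}")
        if item.corrective_note:
            lines.append(f"    follow-up: {item.corrective_note}")
    return "\n".join(lines)

report = format_report("Lab 101", [
    InspectionItem("Chemical labeling", True,
                   positive_note="Secondary containers consistently dated and labeled."),
    InspectionItem("Waste accumulation", False,
                   corrective_note="Move satellite waste to the central area; tag bottles."),
])
print(report)
```

The point of the sketch is only that the positive and corrective fields live side by side in one record, so a report cannot be generated as negative feedback alone.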
In this paper, the authors propose an alternative way to format safety inspection information, in particular, to provide feedback that goes beyond the checklist-only approach. This proposal stems from their experience implementing a practice of documenting positive inspection findings within a large academic department. They aim to establish, based on educational and organizational management principles, that this practice played an essential role in the successful outcome of the inspection process in terms of correction of safety violations, extended compliance, and user satisfaction. [bl][bm][bn][bo]
[a]There is also a legal responsibility of the purchaser of a hazardous chemical (usually the institution, at a researcher’s request) to assure it is used in a safe and healthy manner for a wide variety of stakeholders
[b]I would add what topics will be covered by the inspection to this list. The inspection programs I was involved in/led had a lot of trouble deciding whether the inspections we conducted were for the benefit of the labs, the institution or the regulatory authorities. Each of these stakeholders had a different set of issues they wanted to have oversight over. And the EHS staff had limited time and expertise to address the issues adequately from each of those perspectives.
[c]This is interesting to think about. As LSTs have taken on peer-to-peer inspections, they have been using them as an educational tool for graduate students. I would imagine that, even with the same checklist, what would be emphasized by an LST team versus EHS would end up being quite a bit different as influenced by what each group considered to be the purpose of the inspection activity.
[d]Hmm, well maybe the issue is trying to cram the interests and perspectives of so many different stakeholders into a single, annual, time-limited event 😛
[e]I haven’t really thought critically about who the stakeholders are of an inspection and who they serve. I agree with your point, Jessica, that the EHS and LST inspections would be different and focus on different aspects. I feel inclined to ask my DEHS rep for his inspection checklist and compare it to ours.
[f]This is an aspirational goal for checklists, but it is not often well met. This is because laboratories vary so much in the way they use chemicals that a one-size checklist does not fit all. This is another reason that the narrative approach suggested by this article is so appealing.
[g]We have drafted hazard specific walkthrough rubrics that help address that issue but a lot of effort went into drafting them and you need someone with expertise in that hazard area to properly address them.
[h]Well I do think that, even with the issues that you’ve described, checklists still work towards decreasing variability.
In essence, I think of checklists as directing the conversation and providing a list of things to make sure that you check for (if relevant). Without such a document, the free-form structure WOULD result in more variability and missed topics.
Which is really to say that, I think a hybrid approach would be the best!
[i]I agree that a hybrid approach is ideal, if staff resources permit. The challenge is that EHS staff are responding to a wide variety of lab science issues and have a hard time being confident that they are qualified to raise issues. Moving from major research institutions to a PUI, I finally feel like I have the time to provide support to not only raise concerns but help address them. At Keene State, we do modern science, but not exotic science.
[j]I feel like in the end, it always comes down to “we just don’t have the resources…”
Can we talk about that for a second? HOW DO WE GET MORE RESOURCES FOR EHS : /
[k]I feel like that would also be the case for a “descriptive comment” inspection if the time is limited. Is there a way that the descriptive method could improve efficiency while still covering all inspection items?
[l]This is something my university struggles with in terms of a checklist. There’s quite a bit of variability between inspections in our labs done by different inspectors. Some inspectors will catch things that others would overlook. Our labs are vastly different but we are all given the same checklist – Our checklist is also extensive and the language used is quite confusing.
[m]I have also seen labs with poor safety habits use this to their advantage as well. I’ve known some people who strategically put a small problem front and center so that they can be criticized for that. Then the inspector feels like they have “done their job” and they don’t go looking for more and miss the much bigger problem(s) just out of sight.
[n]^ That just feels so insidious. I think the idea of the inspector looking for just anything to jot down to show they were looking is not great (and something I’ve run into), but you want a clean report to be because they CAN’T find anything big, not because it was hidden.
[o]Amanda, our JST has had seminars to help prepare inspectors and show them what to look for. We also include descriptions of what each inspection score should look like to help improve reproducibility.
[p]We had experience with this issue when we were doing our LSTs Peer Lab Walkthrough! For this event, we have volunteer graduate students serve as judges to walk through volunteering labs to score a rubric. One of the biggest things we’ve had to overhaul over the years is how to train and prepare our volunteers.
So far, we’ve landed on providing training, practice sessions, and using a TEAM of inspectors per lab (rather than just one). These things have definitely made a huge difference, but it’s also really far from addressing the issue (and this is WITH a checklist)
[q]I’d really like to go back to peer-peer walkthroughs and implement some of these ideas. Unfortunately, I don’t think we are there yet in terms of the pandemic for our university and our grad students being okay with this. Brady, did you mean you train the EHS inspectors or for JST walkthroughs? I’ve given advice about seeing some labs that have gone through inspections but a lot of items that were red flags to me (and to the grad students who ended up not pointing it out) were missed / not recorded.
[r]This was really interesting to read, because when I was working with my student safety team to create and design a Peer Lab Walkthrough, this was something we tried hard to get around even though we didn’t name the issue directly.
We ended up making a rubric (which seems like a form of checklist) to guide the walkthrough and create some uniformity in responding, but we decided to make it be scored 1-5 with a score of *3* representing sufficiency. In this way, the rubric would both include negative AND positive feedback that would go towards their score in the competition.
[s]I think the other thing that is cool about the idea of a rubric, is there might be space for comments, but by already putting everything on a scale, it can give an indication that things are great/just fine without extensive writing!
[t]We also use a rubric but require a photo or description for scores above sufficient to further highlight exemplary safety.
[u]I very much saw this in my graduate school lab when EHS inspections were done. It was an odd experience for me because it made me feel like all of the things I was worried about were normal and not coming up on the radar for EHS inspections.
[v]I think this type of feedback would be really beneficial. From my experience (and hearing others), we do not get any feedback on how to fix issues. You can ask for help to fix the issues, but sometimes the recommendations don’t align with the labs needs / why it is an issue in the first place
[w]This was a major problem in my 1st research lab with the USDA. We had safety professionals who showed up once per year for our annual inspection (they were housed elsewhere). I was told that a setup we had wasn’t acceptable. I explained why we had things set up the way we did, then asked for advice on how to address the safety issues raised. The inspector literally shrugged his shoulders and gruffly said to me “that’s your problem to fix.” So – it didn’t get fixed. My PI had warned me about this attitude (so this wasn’t a one-off), but I was so sure that if we just had a reasonable, respectful conversation….
[x]I think this is a really key point. We had announced inspections and were aware of what was on the checklists. As an LSO, I would get the lab in tip-top shape in the week leading up to it. Fortunately, we didn’t always have a ton of stuff to fix in the week leading up to the inspection, but it’s easy to imagine that reducing things to a yes/no checklist can mask long-term problems that are common in between inspections.
[y]My lab is similar to this. When I first joined and went through my first inspection – the week prior was awful trying to correct everything. I started implementing “clean-ups” every quarter because my lab would just go back to how it was after the inspection.
[aa]Our EHS was short-staffed and ended up doing “annual inspections” 3 years apart – once was before I joined my lab, and the next was ~1.5 years in. The pictures of the incorrect things found turned out to be identical to the inspection report from 3 years prior. Not just the same issues, but quite literally the same bottles/equipment.
[ab]Yeah, we had biannual lab cleanups that were all day events, and typically took place a couple months before our inspections; definitely helped us keep things clean.
One other big thing is when we swapped to Subpart K waste handling (can’t keep waste longer than 12 months), we just started disposing of all waste every six months so that nothing could slip through the cracks before an inspection.
[ac]Jessica, you’re talking about me and I don’t like it XD
[ad]I came to feel that checklists were for things that could be reviewed when no one was in the room and that narrative comments would summarize conversations about inspectors’ observations. These are usually two very different sets of topics.
[ae]We considered doing inspections “off-hours” to focus on the checklist items, until we realized there was no such thing as off hours in most academic labs. Yale EHS found many more EHS problems during 3rd shift inspections than daytime inspections
[af]This also feels like it would be a great teachable moment for those newer to the lab environment. We often think that if no one said anything, I guess it is okay even if we do not understand why it is okay. Elucidating that something is being done right “and this is why” is really helpful to both teach and reinforce good habits.
[ag]I think that relates well to the football example of highlighting the good practices to ensure they are recognized by the player or researcher.
[ah]I noticed that not much was said about complacency, which I think can both be a consequence of an overwhelm of negative feedback and also an issue in and of itself stemming from culture. And I think positive feedback and encouragement could combat both of these contributors to complacency!
[ai]Good point. It could also undermine the positive things that the lab was doing because they didn’t actually realize that the thing they were previously doing was actually positive. So complacency can lead to the loss of things that you were doing right before.
[aj]I have seen situations where labs were forced to abandon positive safety efforts because they were not aligned with institutional or legal expectations. Compliance risk was lowered, but physical risk increased.
[ak]The inclusion of both positive and negative comments shows that the inspector is being more thorough which to me would give it more credibility and lead to greater acceptance and retention of the feedback.
[al]I think they might be describing this phenomenon a few paragraphs down when they talk about trust between the giver and receiver.
[am]This is an interesting perspective. I have found that many PIs don’t show much respect for safety professionals when they deliver bad news. Perhaps the delivery of good news would help to reinforce the authority and knowledge base of the safety professionals – so that when they do have criticism to deliver, it will be taken more seriously.
[an]The ACS Pubs advice to reviewers of manuscripts is to describe the strengths of the paper before raising concerns. As a reviewer, I have found that approach very helpful because it makes me look for the good parts of the article before looking for flaws.
[ao]The request for corrective action should go along with a discussion with the PI of why change is needed and practical ways it might be accomplished.
[ap]I worked in a lab that had serious safety issues when I joined. Wondering if the positive-negative feedback balance could have made a difference in changing the attitudes of the students and the PI.
[aq]It works well with PIs with an open mind; but some PIs have a sad history around EHS that has burnt them out on trying to work with EHS staff. This is particularly true if the EHS staff has a lot of turnover.
[ar]Sandwich format: the negative (filling) between two positives (bread).
[at]I wonder how this can contribute to a negative environment around lab-EHS interactions? If there is only negative commentary (or the perception of that) flowing in one direction, it would seem that it would have an impact on that relationship.
[au]I see how providing positive feedback along with the negative feedback could help do away with the feeling the inspector is just out to get you. Instead, they are here to provide help and support.
[av]This makes me consider how many “problem” examples I use in the safety training curriculum.
[aw]This is a good point. From a learning perspective, I think it would be incredibly helpful to see examples of things being done right – especially when what is being shown is a challenge and is in support of research being conducted.
[ax]I third this so hard! It was also something that I would seek out a lot when I was new but had trouble finding resources on. I saw a lot of bad examples—in training videos, in the environments around me—but I was definitely starved for a GOOD example to mimic.
[ay]I read just part of the full paper. It includes examples of positive feedback and an email that was sent. The example helped me get the flavor of what the authors were doing.
[az]This makes a lot of sense to me – especially from a “new lab employee” perspective.
[ba]Referenced in my comment a few paragraphs above.
[bb]I suspect that another factor in trust is the amount of time spent in a common environment. Inspectors don’t have a lot of credibility if they only visit a place annually where a lab worker spends every day.
[bc]A lot of us do not know or recognize our inspectors by name or face (we’ve also had so much turnover in EHS personnel and don’t even have a CHO). I honestly would not be able to tell you who does inspections if not for being on our university’s LST. This issue of trust has occurred in our lab during a review of our radiation safety inspection. The waste was not tested properly, so we were questioned on our rad waste. It was later found that the inspector unfortunately hadn’t performed the correct pH test. My PI did not take it very well after having to correct them about this.
[bd]I have run into similar issues when inspectors and PIs do not take the time to have a conversation, but connect only by e-mail. Many campuses have inspection tracking apps which make this too easy a temptation for both sides
[be]Not only a conversation, but inspectors can be actively involved in improving conditions, e.g., supplying signs, PPE, similar SOPs…
[bf]Yes, sometimes it’s amazing how little money it can take to show a good-faith effort from EHS to support their recommendations. Other times, if it is a capital cost issue, EHS is helpless to assist even if there is an immediate concern.
[bg]I have found inspections where the EHS openly discuss issues and observations as they are making the inspection very useful. It gives me the chance to ask the million questions I have about safety in the lab.
[bh]We see this in our JST walkthroughs. We require photos or descriptions of above acceptable scores and have some open ended discussion questions at the end to highlight those exemplary safety protocols and to address areas that might have been lacking.
[bi]Through talking with other students reps, we usually never know what we are doing “right” during the inspections. I think this would benefit in a way that shared positive feedback, would then help the other labs that might have been marked on their inspection checklist for that same item and allow them a resource on how to correct issues.
[bj]I think this is an especially important point in an educational environment. Graduate students are there to learn many things, such as how to maintain a lab. It is weird to be treated as if we should already know – and get no feedback about what is being done right and why it is right. I often felt like I was just sort of guessing at things.
[bk]At UVM, we hosted an annual party where we reviewed the most successful lab inspections. This was reasonably well received, particularly by permanent lab staff who wanted to know how others in the department were doing on the safety front.
[bl]This can be successful, UNTIL a regulatory inspection happens that finds significant legal concerns related primarily to paperwork issues. Then upper management is likely to say “enough of the nice guy approach – we need to stop the citations.” Been there, done that.
[bm]Fundamentally, I think there needs to be two different offices. One to protect the people, and one to protect the institution *huff huff*
[bn]When that occurs, there is a very large amount of friction between those offices. And the Provost ends up being the referee. That is why sometimes there is no follow up to an inspection report that sounds dire.
[bo]But I feel like there is ALREADY friction between these two issues. They’re just not as tangible and don’t get as much attention as they would if you have two offices directly conflicting.
These things WILL conflict sometimes, and I feel like we need a champion for both sides. It’s like having a union. Right now, the institution is the only one with a real hand in the game, so right now that perspective is the one that will always win out. It needs more balance.
On March 10, Dr. Patricia Shields discussed the article she co-authored with three safety professionals about using “pragmatism” as a safety philosophy in the safety sciences. Her summary PowerPoint and the comments from the table read of this article are below.
(15 minutes) All participants read complete document
(10 minutes) All participants use “Comments” function to share thoughts
(10 minutes) All participants read others’ Comments & respond
(10 minutes) All participants return to their own Comments & respond
(5 minutes) Jessica announces next week’s plans & closes meeting
(FYI, most of the Introduction has been cut)
Elkjaer (2009) has previously alluded to this lack of appreciation and value of pragmatism ‘as a relevant learning theory’ (p. 91) in spite of the growing recognition of its important role in education and teaching (Dewey, 1923, 1938; Garrison and Neiman, 2003; Shields, 2003a; Sharma et al., 2018), scholarship and academic development (Bradley, 2001), academic practice (Shields, 2004; 2006), curriculum (Biesta, 2014) and online learning (Jayanti and Singh, 2009). This article, therefore, addresses this anomaly by arguing for the appropriateness of pragmatism as a highly relevant philosophical cornerstone, especially for safety science educators[a].
2. The Scholarship of Learning and Teaching (SoLT)
(FYI, this section has been cut)
3. Pragmatism as a teaching philosophy
3.1. Teaching philosophies
(FYI, most of this section has been cut)
The research paradigms used extensively in higher education are positivism and interpretivism, and these are often cited by faculty as influencing their teaching philosophy (Cohen et al., 2006). These two are usually associated with quantitative and qualitative research methods respectively, but both prove problematic for the teaching of the safety sciences. First, safety science relies on both quantitative and qualitative methods. Second, neither uses a ‘problem’ orientation in its approach to methods, and safety science is inherently problem and practice oriented, and certainly should be with respect to its teaching.[b][c][d]
Third, the mixed methods literature has recognized this drawback and adopted pragmatism as its research paradigm because it takes the research problem as its point of departure (Johnson and Onwuegbuzie, 2004). In contrast to positivism and interpretivism, pragmatism holds the view that the research question that needs to be answered is more important than either the philosophical stance or the methods that support such a stance. Pragmatism is traditionally embraced as the paradigm of mixed methods; hence, it turns the incompatibility theory on its head by combining qualitative and quantitative research approaches, and “offers an immediate and useful middle position philosophically and methodologically; a practical and outcome-oriented method of inquiry that is based on action and leads” (Johnson and Onwuegbuzie, 2004, p. 17). The pluralism of pragmatism allows it to work across and within methodological and theoretical approaches, which for the purpose of this paper is consistent with a safety science multi-disciplinary approach.
This places practice, where the problem must originate, as an important component of mixed methods. This practice orientation resonates with the goals of learning and teaching in safety science. Therefore, presented here is the philosophy of ‘pragmatism’ which we argue is much better suited for guiding or informing safety science teaching endeavours.
3.2. The foundations of pragmatism
(FYI, this section has been cut)
3.3. Value of pragmatism for the safety sciences
(FYI, this section has been cut)
4. Safety science higher education in Australia
(FYI, this section has been cut)
5. Pragmatism and evidence informed practice (EIP)
Safety science education has traditionally taken an evidence-informed practice (EIP) stance for its teaching practice. Evidence informed practice is not a one-dimensional concept and its definition is still under debate, with various academic lenses being applied to the notion of ‘research as evidence’ and how EIP can be measured (Nelson and Campbell, 2017). However, Bryk (2015) is credited with the view that EIP is a “fine-grained practice-relevant knowledge, generated by educators, which can be applied formatively to support professional learning and student achievement” (Nelson and Campbell, 2017, p. 129).[e]
This includes the expectation that students will be able to use their theoretical knowledge, gained through their academic studies, including research in the field, and translate this knowledge into practical applications in the real world[f][g][h][i][j][k][l][m][n][o]. There are continued efforts to recognise these Research to Practice (RtP) endeavours; as an example, the Journal of Safety, Health and Environmental Research in 2012 devoted an issue to ‘Bridging the Gap Between Academia and the Safety, Health and Environmental (SH&E) Practitioner’. The issue demonstrated “the vital role of transferring SH&E knowledge and interventions into highly effective prevention practices for improving worker safety and health” (Choi et al., 2012, p.1). In that issue Chen et al. (2012, p. 27) found that the ‘Singapore Workplace Safety and Health Research Agenda: Research-to-Practice’ prioritizes, first, organisational and business aspects of workplace health and safety (WHS) and second, WHS risks and solutions.
Other researchers in that same issue (Loushine, 2012, p. 19) examined ‘The Importance of Scientific Training for Authors of Occupational Safety Publications’ and found that there needs to be “attention on the coordination of research and publication efforts between practitioners and academics/researchers to validate and advance the safety field body of knowledge” (p. 19).
Shields (1998) introduced the notion of ‘classical pragmatism’ as a way to address the academic/practitioner divide in the public administration space. She also notes that the pure EIP approach often contains a lack of congruence between practitioner needs and research[p][q] (Shields, 2006). She identifies theory as a source of tension. Practitioners often see theory as an academic concern divorced from problems faced in their professional world. Here, pragmatism bridges theory and practice because theory[r] is considered a “tool of practice” which can strengthen student/practitioner skills and make academic (process and products) stand up to the light of practice (Shields, 2006, p. 3). The pragmatist philosopher, John Dewey used a map metaphor to describe the role of theory, whereby a map is not reality, but it is judged by its ability to help a traveller reach their chosen destination[s] (Dewey, 1938).
This perspective is often demonstrated in the student’s capstone, empirical research project. Using a problematic situation as a starting point, they introduce literature, experience and informed conceptual frameworks as theoretical tools that help align all aspects of a research process (research purpose, question, related literature, method and statistical technique). Thus, student/practitioners/researchers, led by a practical problem, could develop or find a theory by drawing on diverse (pluralistic) literature as well as their experience with the problematic situation. This provisional theory guides choice of methodology, variable measurement, data collection and analysis, which is subsequently shared (participatory) and evaluated. Practical problems are therefore addressed by the student’s conceptual framework, which is considered a tool related to the problem under investigation. This approach thus emphasizes the connective function of theory (Shields, 2006). The use of this pragmatic framework has provided a bridge between theory and practice and has been successfully applied to higher education more broadly (Bachkirova et al., 2017; El-Hani and Mortimer, 2007). Texas State University has embedded a pragmatism-informed research methodology in its Master of Public Administration program, with success measured in student awards, citations and recognition in policy related publications (Shields et al., 2012).
Therefore, it is proposed that safety science is a discipline which would, and should, also benefit from alignment with philosophical pragmatism. This would represent a much wider stance and a shift from viewing safety science education with merely an EIP lens, where the main consideration for teaching practice is that students are presented with research which provides them the required ‘scientific evidence’ and that the teaching of this research is enough to inform their practice of the discipline[t][u] (Hargreaves, 1996, 1997). It should be noted that pragmatism does not abandon evidence, rather it contextualizes it in a problematic situation.
6. The significance of pragmatism as a teaching philosophy
For pragmatism to penetrate the safety science education field it needs to be relatively easy to apply and transmit. Fortunately, Brendel (2006) has developed a simple four Ps framework, which captures pragmatism’s basic tenets and can easily be applied to teaching (Bruce, 2010). The four Ps of pragmatism include the notions that education needs to be Practical (scientific inquiry should incorporate practical problem solving), Pluralistic (the study of phenomena should be multi- and inter-disciplinary), Participatory (learning includes diverse perspectives of multiple stakeholders) and Provisional (experience is advanced by flexibility, exploration and revision), as shown in Fig. 2.
The majority of safety science students simultaneously study and work in agencies or organisations as safety professionals. Hence, they appreciate the pragmatic teaching approach whereby teacher, student and external stakeholders influence learning by incorporating multiple perspectives. When teaching is filtered through a pragmatic philosophical lens, students’ learning is framed by their key domain area of interest as well as their professional context and work experience[w][x][y][z][aa][ab]. It encourages them to ‘try on’ their work as experiential learning[ac][ad][ae], which they can take into and out of the classroom. Flexibility, integration, reflection and critical thinking are nurtured. Pragmatism and the four Ps can facilitate this process.
Ideally, the classroom environment incorporates communities of inquiry where students and teachers work on practical problems applicable to the health and safety domain. The pluralistic, expansive community of inquiry concept incorporates participatory links to the wider public, including industry and workers (Shields, 2003b). The community of inquiry also encourages ongoing experimentation (provisional). The ‘practical problem’ and ‘theory as tool’ orientation provides opportunities to bridge the sometimes rigid dualisms between theory and practice. This teaching lens also incorporates a spirit of critical optimism, which leads to a commitment by the teacher[af] and the higher education institution to continually experiment and work to improve the content delivery and student learning experience (Shields, 2003a).
Pragmatism emphasizes classroom environments which foster transformations in thinking, and these transformations in thinking can often be observed in the quality of students’ final research projects (Shields and Rangarajan, 2013). Most students graduating from postgraduate degrees in the safety sciences are required to produce a major piece of work (thesis) with broad practical value. Ideally, they grow and develop useful skills from the learning experience, and the thesis is useful to their employer/organization and has applicability to the wider community in which they work as safety professionals.[ag][ah][ai][aj]
6.1. Pragmatic learning – student success – enhancement to practice
Higher education safety science pedagogy should be embedded in the notion that most of the students who attend come with some depth of practical experience and practical wisdom, and should be treated by the academe as lifelong learners and researchers[ak]. The academe should provide them with tools and skills to be stronger lifelong learners equipped to contribute to safety science practice[al][am][an][ao][ap][aq].
The universities with which the researchers of this article are affiliated use pragmatism as a multi/trans-disciplinary approach in order to bridge the gap between academic theory (research) and practice. Whilst two of these universities teach safety science, the third places pragmatism in the public administration domain and has for many years successfully incorporated the use of pragmatism to bridge the gap between academia and practice (Shields and Tajalli, 2006; Foy, 2019).
The value of using pragmatism as a teaching philosophy in bridging this gap has been successfully demonstrated. A snippet of student feedback on learning under a pragmatist teaching philosophy is given below:
Having been a railway man for over thirty years I recognised that a gap
needed to be closed in my academic knowledge to advance further in the
business and wider industry and the Safety Science courses have provided
the vehicle for this to occur. Importantly I have been able to link the
learning in these courses and the assignments directly to the activities of
my rail organisation. That’s a big selling point in today’s business world.
(Safety Science Student, Phil O’Connell)
In 2014, I was promoted to Administrative Division Chief of Safety. On several occasions, I found myself utilizing the skills I learned to help evaluate and improve issues and programs in my fire department. In particular I was able to[ar] use case study research to show that our Safety
Division was understaffed. As a result, I successfully increased our
numbers of Safety Officers from 5 to 26. I have also used the same
techniques to improve our department’s PPE and cancer prevention
programs. The greatest challenge, however, came when we had 100
firefighters exposed to a potentially massive amount of asbestos during a
major high rise fire. Our department had never dealt with an exposure of
its magnitude. I was able to help our department solve a very difficult
problem concerning asbestos and its effect on our PPE. I even received
calls from other fire departments who were interested in our method.
(Public Administration Student – Brian O’Neill)
These students have gone on to have their research cited and widely acknowledged (O’Connell et al., 2016; O’Connell et al., 2018; O’Neill, 2008), as have many other students under this pragmatic philosophy for learning and teaching.[as]
Whilst the embedding of pragmatism as a teaching philosophy is relatively new for Australian universities teaching in the safety science space, it is well entrenched within the public administration programs at Texas State University. Approximately 60 percent of students in this program work full time in state, local, federal or non-profit organizations. [at][au] Their capstone papers focus on the practical problems of public policy, public administration and nonprofit administration. [av][aw][ax][ay][az][ba] Problems with “disorganised graduate capstone papers with weak literature reviews” (Shields and Rangarajan, 2013, p. 3) pushed the faculty to adopt pragmatism as a teaching framework. This approach enhanced students’ Applied Research Projects (ARPs), which have demonstrated remarkable industry, field and community impact (Shields, 1998). [bb] For example, five of the papers won first place in the United States among schools of public affairs and administration. A content analysis of the Texas State University applied research papers (ARPs) revealed that “most of these ARPs are methodical inquiries into problems encountered by practitioners at the workplace. Hence a dynamic interplay of practitioner experience informs public administration research, and rigorous research informs practitioner response to administration/management problems” (Shields et al., 2012, pp. 176–177).
(FYI, paragraph cut)
Higher education teachers who have used pragmatism as their teaching philosophy for some time have led the way for an interest in pragmatism as a teaching philosophy to spread and gain momentum in other domains. However, despite this, and despite publications which endorse the use of pragmatism, there still appears to be little understanding of the benefits of and rationale for using pragmatism as a teaching philosophy over other more established and entrenched research-focused philosophies.[bc] Therefore, this paper has tried to distil both an understanding of what pragmatism represents and the ‘how and why’ of using it more broadly, particularly for safety science educators.
Pragmatism goes beyond what is offered by the more singular notion of evidence-informed practice, especially within safety sciences higher educational programs. Its value in other domains has been well established, particularly where more problem-focused, practical, applied approaches are required.[bd] Further, significant positive results in students’ research outputs from having a pragmatic research[be] framework are now well demonstrated. Where student work can be used to inform decision making, policy making and problem solving that impacts wider inquiry, its value stands out, as already evidenced in both the public administration space and the safety science space.[bf]
In relation to the safety sciences, the higher educational pedagogist can be confident that the path to pragmatism is well worn, even if it may be unfamiliar to the discipline. It is recommended that teaching practices be extended beyond valuing only the evidence-informed practice stance, to reduce the theory-to-practice divide. This can be done by incorporating a broader philosophical (four Ps) pragmatic perspective in order to develop a professional practice community of safety science problem solvers.
Therefore, embracing pragmatism as a teaching philosophy is encouraged in the higher education sector,[bg][bh] and recent acknowledgment and acceptance of this teaching philosophy stance has instilled greater confidence in its recognition and credibility for wider use. Safety science educators can be proud that its adoption as a teaching philosophy is a long-awaited natural development, instigated by the early pragmatist forebears who worked in the safety field.
[a]Is safety a science? I can see arguments that it is, but I can also see arguments that it is a cultural education about community expectations for workplace decision-making. (There are many different “communities” potentially included in this concept.)
[b]Would you include constructivism as a different paradigm?
[c]I see interpretivism and constructivism as very similar. The methods literature often treats them as basically the same. In many ways it depends on whether the problem is approached inductively or deductively. Constructivism is often associated with inductive exploratory research.
[d]I wonder if sometimes there is insufficiency of reflection to make constructivism too close to interpretivism.
[e]EIP or EBP (Evidence-Based Practice has become much more popular in general STEM education in the past 5-10 years, especially as part of the DBER (Discipline-Based Education Research) set of practices.
[f]Previous TA training I received stressed the importance of applying lecture content to new problems to help students learn and retain knowledge. I think that’s a strong benefit of pragmatism.
[g]Again, I’m wondering a bit of the distinction between this and constructivism?
[h]I find that there are many missed opportunities in lecture courses and textbooks to really connect students to the safety aspects of the chemicals being described. For example, with the number of times HF is used as an example of textbook problems, it would be nice to include something about how incredibly hazardous it is to work with!
[i]Just today in an honors general chemistry course we talked about the hazards of perchlorate salts. I was surprised that the textbook was using it as a regular example, along with perchloric acid, without a hint of a discussion about safety…
[j]I believe this can also be applied to “less hazardous” compounds; in my opinion, there is a huge disconnect between the overall properties of a compound and its hazardous nature. For example, ethyl acetate is commonly used and not extremely hazardous, but just this week I had multiple students ask why they needed to work with it in a hood rather than on their open benchtop.
[k]One of the learning opportunities in pragmatic safety science is uncovering hidden assumptions in standard practices. My “hazmat chemist’s” instincts are very different from the “research chemist’s” instincts about the same chemicals. It takes a lot of practice to go into a conversation about these chemicals with an open mind. (This has come up this week with a clean-out of a research lab and very different perceptions of the value and hazards of specific chemical containers.)
[l]It would be really cool to see an organic textbook, for example, that has inset sections on the safety considerations of different reactions. My O chem professor would sometimes highlight reactions that were good on paper and problematic in reality, but it should be a more frequent discussion.
[m]This is something that gets addressed in our organic labs actually. They “design” their own experiment. They’re given a number of chemicals in a list (some are controlled substances, some are very expensive) and are asked to choose which ones they would like to use for their experiment. We then use their choice from groups to go over both safety aspects and expense aspects and how we can then still do our experiment with other chemicals.
[n]That is a great exercise. I especially like how practical and open-ended it is.
[o]Overall, in an organic chemistry course, practical knowledge of synthesis is mostly untouched, as many of the classic reactions used to teach the course are fairly complex experimentally. E.g., Sandmeyer reactions are conveniently simple to explain but harder to accomplish in person.
[p]This seems to be an issue across many fields. Oftentimes we see that those performing the practice and those performing the research speak different languages and consider very different things important.
[q]I see this a lot in experimental and computational work. Different languages, different skill sets, and different approaches
[r]Safety science also has this issue internally. There is an interesting paper that was covered in a podcast awhile ago about “reality-based safety science”: https://safetyofwork.com/episodes/ep20-what-is-reality-based-safety-science
[s]I’m thinking about an analogy with computational and experimental chemistry also. I like the “tools of practice” bridge.
[u]I would imagine that for Case Studies to become research, someone would have to gather case studies and look for trends. I see Case Studies as an opportunity to share one experience or one set of experiences with the community in the hopes that with enough Case Studies a meaningful research study could be conducted.
[v]Case studies are definitely included in this. Scientific evidence here would mean that the evidence was collected with a scientific attitude. There is no belief that actual objectivity is possible but something close should be strived for.
[w]Allowing students to pursue their interests is always a benefit while learning. It’s been a struggle to organize researcher safety meetings in a way that engages participants by allowing them to follow their interests, especially with virtual meetings. Has anyone found strategies that facilitate that interest and engagement?
[x]Something I had started to explore just before the lockdown was to try to set up opportunities for grad students to discuss the risk assessments around their own project work. In this way, they could show off their expertise while helping to educate others, and possibly reveal some things that they hadn’t thought about or didn’t know. I really liked how Texas A&M did their Table Discussions, in which they invited students who had something in common (e.g. all those who use gloveboxes), presented a Safety Moment about them, then invited students to share their own stories, strategies, and concerns with one another about glovebox usage.
[y]We started doing round tables that would discuss safety topics within their own focus area (inorganic, organic, physical/atmospheric), similar to what Jessica mentioned with gloveboxes, and that’s gotten a lot of response and interest.
[z]Those sound like great ideas. We already have our department research groups divided into hazard classes, so it would be easy to have them meet in those groups. Thanks for the suggestions. I also like the idea of participants presenting to each other instead of a lecture-style event.
[aa]I like this a lot. Is there much faculty involvement?
[ab]We don’t get as much faculty involvement due to their busy schedules. But we have had safety panels with faculty with different safety specialties such as lasers, Schlenk lines, compressed gases, physical hazards, etc.
[ac]Is pragmatism a bridge between theoretical and experiential learning?
[ad]I believe that it is most useful when the bridge runs both ways
[af]Action research is certainly on the continuum of research that can be informed by pragmatism. The pluralism of pragmatism comes to play here.
[ag]Hopefully within the safety sciences this aspiration is realized more often than in other disciplines. Too many times, theses and dissertations get lost in the archives and go unread.
[ah]Again, this is something that makes me think of the ideas behind Action Research. Since it is a research method by which the researcher questions their own practice, the thesis that ultimately comes of it could potentially be of interest to their own employers or teams (even if no one else reads it).
[ai]Safety research tends to be somewhat more read because it is often driven by the need to support a risk decision. But as Covid has shown, this may not improve the quality of the scientific literature that is being read. The rush to publish (no or small amounts of data) has really slowed the understanding of best safety practices
[aj]I see what you mean, Jessica. Even if the actual manuscript is not disseminated, a researcher self-evaluating their own practice can definitely serve as a self-check where one can see places to improve.
[ak]How would you say the idea of “pragmatism” relates, if at all, to the concept of Action Research?
[al]Would a pragmatic point of view work in beginner safety courses?
[am]I think that the “citizen scientist” movement is an attempt at a pragmatic approach to pure environmental science, but I’m not convinced that these kinds of projects improve science literacy. They seem to stop at the crowd-sourced data collection phase, and then the professionals interpret the data for the collectors.
[an]This goes back to the expert/novice question. Would a pragmatic approach work for both? I can see the advantage in graduate/postgraduate education. I’m wondering if the knowledge base is broad enough for beginners?
[ap]It is also very frustrating for the beginner to put in a lot of effort collecting data and then be told that that data is fatally flawed for an obscure reason
[aq]Pragmatism would call on the expert to listen carefully to the novice particularly if the novice is in the world of practice. This is where the participatory nature of pragmatism comes in. Both should have a voice.
[ar]Brian specifically mentions case study as a method he used.
[as]I actually think a manager’s need to solve problems like safety issues at work could be looked at as a mini “applied” case study. The context of the problem shapes the parameters of the case.
[at]Do the students who are not working full time have a good sense of applications? And does it make them feel better prepared for common workplace problems?
[au]I would think that even if they didn’t work full time, they could still pick some sort of problem in the public domain to really seriously do a lot of research on. If nothing else, it could give them a sense of why the problem is so intractable.
[av]My sister was involved in one of these programs after 15 years of experience, and she said that the content was marginally interesting, but being able to network with fellow professionals was quite valuable, both for the stories and solutions they shared and for future follow-up questions. That seems like quite a pragmatic aspect of this program.
[aw]I would think that the networking would be part of the purpose – and this is really true for any research program as well. You basically find that small group of people who are really interested in the same problems in which you are interested so that you can all swap stories, publications, and ideas in order to drive all research forward.
[ax]I agree – I think that academic leadership sees this opportunity more clearly than faculty members who are assigned 10 or 15 grad students to mentor at once, though. ACS is providing some education around this opportunity for new faculty, but it’s a challenge to incorporate mentoring skills along with teaching, research and service duties faculty are handed
[ay]This is why the Community of Inquiry is so important. Community comes first. I actually have an article on the community of inquiry if anyone is interested.
[az]Reframing things as a community of scholars is very powerful.
[ba]I’d be glad to include any references that you think would be helpful on the web page for this discussion if you would like to share them. We get about 100 views of these pages after they go up, so the impact is not limited to the attendees at a particular session
[bb]How long are the courses? One semester? I often find it difficult for students to finish an in-depth lit review in that time frame.
[bc]This link might also be useful. https://link.springer.com/article/10.1007/s11135-020-01072-9 It deals with deductive exploratory research and covers many of these themes.
[bd]I very much appreciate the use of this pedagogy as it applies to practical content!
[be]I believe the students that give their courses a good faith effort come away with tools to apply to their work. We look at the research project as a project management challenge and apply project management ideas throughout. This is sometimes the most important lesson, particularly for the pre-service students.
[bf]This is a very important idea. When I pursued my 1st bachelors, in political science, I was incredibly disappointed to find how much research and practice diverged.
[bg]There is an important distinction here between undergrad and graduate students in higher ed. Traditional undergrads tend to be learning more practical skills outside of the curriculum. I wonder what the experience of non-traditional and community college students is in this regard?
[bh]It does seem like this approach lends itself very well to settings where previous or current experience is required.
The Agenda Book for the Executive Committee meeting is now available below. The meeting will take place on Sunday, March 20, 2022, 10:15-11:30 AM PST. The Zoom virtual link is available in the agenda book.