This award is given to recognize graduate-level academic research faculty who demonstrate outstanding commitment to chemical health and safety in their laboratories.
Award Amount and Recognition
Up to $1,000 to support travel expenses to attend the ACS national meeting and deliver a 15–20 minute presentation at the CHAS Awards Symposium, and/or a grant to be used for safety enhancements in the faculty member's research group.
Engraved plaque bearing the name of the recipient and the sponsor logo, to be presented to the winner at the Awards Symposium.
Award certificate mailed to the university president.
Eligible nominees are faculty members who have responsibility for a graduate-level research laboratory, and who demonstrate values and behaviors consistent with the criteria of this award. The faculty member’s laboratory may be part of any academic department at the institution, provided that chemical use is a significant part of the laboratory’s research. Detailed criteria provided below.
One award is given per year.
Award Criteria
Sets Safety-Compliance Expectations:
Enforces all institutional health and safety practices, protocols, and rules within his or her laboratory space.
Participates in formal laboratory inspections and group safety committee meetings.
Maintains safety in the laboratory by conducting unannounced walk-throughs.
Ensures all members of the research group and visitors (including vendors and contractors) read, understand, agree to follow, and realize the consequences of not following the safety rules.
Actively demonstrates his/her commitment to laboratory health and safety.
Personally provides a “new employee/student safety orientation” for each new member of the research group.
Monitors and Provides Safety Information and Training:
Insists that everyone who works in the lab receives comprehensive, lab-specific safety information and training.
Ensures students and others who work in his or her research lab are educated, informed, and trained in the safety skills they need to conduct research safely and to work independently.
Establishes coaching and mentoring relationships to enable new researchers to receive hands-on training in safety practices from more experienced researchers.
Requires a structured lab orientation, including emergency information and safety rules, for new lab workers.
Annually and as needed, reviews and revises the lab safety manual and chemical hygiene plan (if separate from the lab safety manual).
Models Safe Behaviors:
Serves as a role model by personally exhibiting good safety behavior.
Wears proper lab attire and PPE when entering laboratories or handling research materials.
Completes, at a minimum, the same institutional safety training requirements as lab workers and signs the group’s rules agreement.
Personally proposes new safety initiatives and/or shares safety best practices with the department and/or Health and Safety department.
Actively participates in research group or department safety committees and Joint Safety Teams.
Includes safety information in published research and requires it in their students’ thesis proposals and thesis defenses.
Assesses Hazards and Evaluates Risks:
Ensures that lab members complete hazard analyses prior to conducting experimental procedures.
Identifies the hazards, types of emergencies that could occur, and what needs to be done to be prepared for them.
Implements prudent practices, protective facilities, and PPE needed to minimize risk.
Reviews new laboratory procedures for potential risks.
Requires hazard analyses to be incorporated into lab notebooks prior to an experiment.
Expects hazard analysis to be included in thesis proposals, dissertation proposals, and published research.
Requires hazard analysis to be revisited if an experimental procedure yields unexpected results or if the procedure requires changes before conducting the experiment.
Creates Safety Leaders:
Empowers researchers to assume leadership roles in establishing safety practices within research groups and for entire departments.
Encourages lab members to participate in department safety committees or Joint Safety Teams.
Encourages lab members to propose new safety initiatives and/or share safety best practices with the department and/or Health and Safety department.
Rewards good safety performance.
Promotes Positive Safety Culture:
Takes actions to encourage safety and promote a strong, positive safety culture in the research lab.
Provides instruction and encouragement to lab members to report any incident, no matter how minor, and all injuries.
Fosters a nonthreatening atmosphere for free expression of safety concerns or questions.
Provides ample budgetary support for safety supplies such as PPE and engineering controls.
Recognizes that psychological stress can undermine safety culture and performance.
Provides and encourages lab members to take advantage of resources for stress management.
Encourages open and ongoing dialog about safety.
Requires “safety moments” at laboratory group meetings or otherwise incorporates safety into research discussions.
Encourages and acknowledges lab members for working safely.
Accepts responsibility for safety.
Takes initiative to reduce waste and promote greener, more sustainable research practices in his or her lab.
Required Support for Nomination
The nomination must include a cover letter from the nominator describing why the nominee is deserving of this award. The letter shall describe the nominee's work and how it aligns with the purpose, eligibility, and criteria of the award.
The nomination must also include a letter of support from the institution's Environmental Health and Safety Officer, as well as a letter of support from a senior administrator such as the head of the academic department, the vice provost for research, or the dean of the school or college. In cases when the nominator (author of the cover letter) is a member of the Environmental Health and Safety office or a senior administrator, an additional letter of support from that party is not required.
Individuals writing letters of support are not required to be members of the American Chemical Society or the Division of Chemical Health and Safety. The supporter’s contact information must be included in the letter. The support letter should clearly describe the impact of the nominee’s work and how the individual has met the purpose and criteria of the award.
Site Visit Criteria (if applicable)
A site visit will not be required for this award.
What, if any, involvement will the sponsor have in evaluating nominees?
The sponsor will serve as a reviewer of the nominations for this award. The sponsor’s evaluation will be given no more or less weight than the evaluations of the other members of the Awards Selection Subcommittee.
Eligible Sources of Nominations
Self-nomination
Subordinate (Student/lab member)
Department chair
EH&S Department
Vice Provost for Research or another Senior Administrator
Peer
Additional Information about this Award
This award was proposed in 2018 by James Kaufman, Ph.D. (Founder, Laboratory Safety Institute), and was developed in collaboration with Dr. Kaufman by the CHAS Awards Committee in 2019. The award was first given in 2020. Criteria for the award were derived primarily from the following sources of academic laboratory safety guidelines:
National Research Council. 2014. Safe Science: Promoting a Culture of Safety in Academic Chemical Research. Washington, DC: The National Academies Press. https://doi.org/10.17226/18706.
Laboratory Safety Guidelines: 40 Suggestions for a Safer Lab, James A. Kaufman, Laboratory Safety Institute, https://www.labsafety.org/resource
Previous Winners
2022: Alexander J. M. Miller, Ph.D., University of North Carolina
2021: Ian Albert Tonks, Ph.D., University of Minnesota
2020: Mahesh K. Mahanthappa, Ph.D., University of Minnesota
Division of Chemical Health & Safety Executive Committee Fall 2022 Meeting
Monday, August 22, 2022, 12:00-2:00 PM CDT
Hyde Park B / CC11B meeting room of the Hyatt Regency
Join from PC, Mac, Linux, iOS or Android: https://yale.zoom.us/j/98891321288
Or Telephone: 203-432-9666 or 646-568-7788
Meeting ID: 988 9132 1288
Marta Gmurczyk, Organizer, Presider; Dr. Daniel R. Kuespert, CSP, Organizer, Presider
Safety is a common theme across most chemical disciplines, but hazards deemed common in one field may be opaque to researchers in other fields. Interdisciplinary research brings with it risks associated with several intersecting disciplines, and researchers trained in one specialty will not necessarily be familiar with all the relevant risks. What hazards are recognized in their research field? What hazard control challenges present themselves? Where are the high risks that have been accepted as “normal”? This symposium draws together representatives of many chemical disciplines to share their safety knowledge and challenges.
3737607 – Blueprint thinking to build sustainable solutions on the solid foundation of safety; Kalyani Martinelango, Presenter; Jason Fisk
3743933 – Hazard identification and mitigation in a multidisciplinary industrial research environment, George Athens, Presenter; Jeff Foisel; Steve Horsch
3741150 – Safety in the catalysis research lab, Mark Bachrach, Presenter
3754630 – Risk, safety, and troublesome territoriality: Bridging interdisciplinary divides, John G Palmer, PhD, Presenter; Brenda Palmer
Division of Chemical Health & Safety Awards Symposium
Brandon Chance, Organizer, Presider
3754798 – Interdepartmental initiatives to improving campus chemical safety, Luis Barthel Rosa, Presenter
3738511 – Building and sustaining a culture of safety via ground-up approaches, Quinton Bruch, Presenter
3750224 – Safety net: Lessons in sharing safe laboratory practices, Alexander Miller, Presenter
3740645 – Governing green labs: Assembling safety at the lab bench, Susan Silbey, Presenter
Monday, August 22, 2022, 09:00am – 11:45am CDT
Indicators of Success in Laboratory Safety Cultures
Over the past decade, there has been increasing interest in improving the safety culture of laboratory settings. How do we identify and implement indicators of success of these efforts? When are quantitative cultural measurements available and when do we need to rely on qualitative indicators of movement forward? Both theoretical ideas and concrete examples of the use of this approach are welcome in this symposium.
3745133 – Indicators of success in a safety culture, Ralph Stuart, CIH, CCHO, Presenter
3733703 – What’s in a name? Mapping the variability in lab safety representative positions, Sarah Zinn, Presenter; Imke Schroeder; Dr. Craig Merlic
3735305 – Supporting high school educators with a chemical hygiene officer, Kevin Doyle, Presenter; Yvonne Doyle
3755577 – Factors for improving a laboratory safety coordinator (LSC) program, Kali Miller, Presenter
3741048 – Extraction, purification and bioactivity of policosanols from Cannabis sativa L. Megi Gjicolagj; Virginia Brighenti; Alberto Venturelli; Lisa Anceschi; Caterina Durante; Prof. Federica Pellati, Presenter
3750806 – Cannasulfur compounds: A new paradigm for the chemistry of cannabis, Iain Oswald, Presenter; Marcos Ojeda; Thomas Martin; Twinkle Paryani; Kevin Koby
3744744 – Purple rain: CBD discolouration results from cascading photo-redox reactions, Brodie Thomson, Presenter; Markus Roggen, Presenter; Glenn Sammis
3741211 – Optimization of a new HPLC method for the simultaneous separation of cannabinoids by experimental design techniques, Caterina Durante; Cindy Afezolli; Lisa Anceschi; Virginia Brighenti, Presenter; Federica Pollastro; Prof. Federica Pellati
3738752 – Sustainability in the evolving scenario of cannabinoid bioanalysis: Microsampling for the assessment of Δ8-THC and metabolites, Dr. Michele Protti, Presenter; Marco Cirrincione; Dorota Turonová; Lenka Krčmová; Prof. Laura Mercolini
Blazing Trails: Cannabis Chemistry in Post-secondary Education
Formal post-secondary instruction in the area of cannabis chemistry has emerged across North America in recent years, ranging from entire undergraduate programs to individual courses to specific modules within a course. This session is intended to provide early reporting on the development and progress of education in this field, and to foster a discussion on the directions it is heading.
3739767 – Cannabinoid separation: A new HPLC system suitable for cannabis research in undergraduate laboratories. Alicia Douglas; Maria Swasy; Candice Cashman; Benedict Liu, Presenter
3757535 – Medicinal plant chemistry five years in: A report on the development and status of the first cannabis chemistry undergraduate degree program. Brandon Canfield, Presenter; Lesley Putman
3757906 – Masters in cannabis science and therapeutics at University of Maryland, Baltimore: An overview. Alexandra Harris, Presenter; Dr. Chad R Johnson; Brittany Namaan
3758112 – Olive Harvey College urban agriculture curriculum development…the cannabis path. Akilah Easter, Presenter; Steven Philpott
3744000 – Cannabis biology and chemistry at Colorado State University Pueblo. David Lehmpuhl, Presenter; Chad Kinney; David Dillon; Jeffrey Smith; Jonathan Velasco
3751509 – Fields to findings, from seeds to statistics. Alexander Wilson, Presenter; Maris Cinelli; James DeDecker; Lesley Putman; Ara Kirakosyan; Brandon Canfield
3725733 – Second-year, course-based undergraduate research experience (CURE) in the organic chemistry of cannabinoids. Matthew Mio, Presenter
3742921 – Advancing cannabis/hemp technology platform through an educational-consulting interface (CFS). Jerry King, Presenter
3742395 – Cannabis chemistry in the European scenario of higher education programmes: A diversified network of opportunities. Roberto Mandrioli; Dr. Michele Protti; Stefano Girotti; Camelia Draghici; Fernando Remiao; Ana Morales; Premysl Mladěnka; Luca Ferrari; Prof. Laura Mercolini, Presenter
Latest Developments in Cannabis Science and Sustainability: Characterization of the Cannabis Plant and Modern Extraction Techniques
3752003 – Beyond terpenes: An untargeted approach for characterizing and categorizing cannabis cultivars to derive a robust chemotaxonomy. Aldwin Anterola, Presenter; Laura McGregor; John Abrams
3738042 – Closer look at cannabis: Using cryo-SEM, GC/MS, and HPLC to compare trichome density and trichome ratio to secondary metabolite profiles in Cannabis sativa. Steven Philpott, Presenter
3735582 – Advances in extraction of compounds of value from biomass using hot air as an extraction medium. Steven Bonde, Presenter
3756218 – Beyond THC remediation: Examples of cannabinoid isolations with centrifugal partition chromatography. Manar Alherech, Presenter
There are many different kinds of hazards associated with laboratory work, and sometimes these hazards result in unforeseen safety incidents. No one knows how often such laboratory incidents occur, but over the last several years, high-profile laboratory events have raised public concern about them. This concern was crystallized in the Chemical Safety Board's 2011 report on academic laboratory safety.
In order to address the concerns raised by the CSB, the Division of Chemical Health and Safety of the ACS believes that it is important to take advantage of the learning opportunities such incidents represent. Therefore, we have joined together to develop a form that we believe will be an effective tool for collecting appropriate information to develop “Lessons Learned” from such events.
These incidents can include events that result in injuries, financial or scientific losses, near misses, and safety observations. We collect this information for many reasons:
To avoid having the same incident occur again;
To enhance safety awareness of laboratory workers as they conduct routine work;
To support the lateral thinking required to develop “what if” scenarios when planning laboratory work;
To improve emergency planning for response to laboratory events;
To identify successful health and safety protection measures;
To provide stories that can add interest to safety training efforts; and
To help laboratory safety managers correctly prioritize their concerns.
Currently, a variety of laboratory organizations have developed “lessons learned” programs. Examples of these programs can be found at:
On June 29, Ralph Stuart presented a webinar for Lab Manager magazine on the topic of Preventing and Managing the Most Likely Lab Accidents. This presentation highlighted a variety of ACS safety resources produced over the last 5 years and described how they can be used in the context of two historical laboratory incidents, specifically the death of Dr. Karen Wetterhahn.
A recorded version of the webinar is available for viewing on the Lab Manager website. A PDF version of the presentation is below, as well as the audience poll responses to the questions asked about their lab risk assessment processes.
This webinar was presented as part of the Division's Innovative Project funded by the ACS to understand current practices in laboratory risk assessments and how ACS can support improving those practices. If you are interested in participating in this project, contact Ralph Stuart at membership@dchas.org.
On May 12, 2022, three chemists who have found careers in environmental health and safety work discussed how they found their way into this field, the opportunities and challenges they have found in making the transition from the laboratory to the safety profession, and provided advice for other chemists considering this career transition.
The presentation file they shared is available below and the recording of the webinar is available at the ACS Webinars web site. The webinar included active audience participation, with several members of the audience sharing their own interests and experiences in the field.
In addition to questions and answers from the audience, the webinar included 4 poll questions about the audience's experience working with EHS professionals in their labs. Their responses are outlined in the figures below.
I work as a lab manager at a university that has an EHS department. I earned CHO certification purely for educational purposes. What type of position would I look for to actually use this certification and grow into a safety professional?
I worked in chemistry scale up for 20 years before moving into Safety –
Great to see the early career/late career questions being asked here – I think the presenters are saying this is an any time opportunity when it is right for you.
If your employer doesn’t have an EHS department – do you see this as an opportunity or a problem?
The engineering bias is there when things are written down in my experience too – I don’t think that is a blocker in practice.
Is there an on line space where people looking to connect through the comments can do that?
I would love to learn more about how to pursue a career in chem safety after being a CHO for my school and district, and how to find a new position after retiring from teaching chemistry for 30 years.
In the past I have taken quite a few courses offered by Ken Roy (NSTA consultant in safety) while a teacher in CT, and have helped to write the chemical hygiene plan for our school and district and to introduce a Chemical Hygiene Officer position. I believe our school and district have become safer as a result of this work.
I need to know how to gain certifications that can now support my skills and bring credibility when applying to positions in academia and/or industry.
I need to know what certifications and courses to take, and how to rewrite my resume to best apply for positions. Maybe someone here might be willing to share advice and mentor me through the transition?
What kinds of opportunities exist in the ESH&Q area for people freshly done with their undergraduate degree, or is a Master's/PhD required for these positions?
I work at a small, private liberal arts school that doesn't have an EHS department. :/
I have 20 years of industry experience as a formulator. I'm inspired that it's not too late to switch to a safety career, but concerned that my resume will be overlooked as either over- or under-qualified in that field, or both. What training might be available for such a transition without getting in over my head?
When I look at EHS positions, there appear to be far more looking for an engineering background than a chemistry background. Any tips for how to frame transferable skills for people who are in an engineering mindset?
What was that free course mentioned?
Perhaps someone can address an “encore” career transition after retiring from a technical career.
As a chemistry professor at a community college that also does not have an EHS department, I've found myself in the role of needing to educate myself on EHS concerns. What additional resources would be useful for those of us who need to help build institutional EHS departments?
At what point should an R&D organization have a dedicated “full-time” Chemical Hygiene Officer? Our site’s current CHO performs their EH&S duties in addition to their full-time R&D position.
Is there any great way of avoiding near misses as far as lab safety is concerned?
And is a lab safety exam necessarily needed in academic institutions for students doing a major that involves a wet lab environment?
I have a bachelor's in chemistry and an MBA with a concentration in project management. I used to work in the raw-material lab for AkzoNobel in Brazil, where I was responsible for generating security codes for handling raw materials in manufacturing. Once I moved to the US, I had to change careers, but I am interested in going back into the scientific world and perhaps safety.
How can we start a joint training program funded by ACS with academia in Egypt and the Middle East?
Hi, Colin, safety professional in Pharma based in UK – great webinar so far – brilliant presentation from Whitney.
04/14/22 Table Read for The Art & State of Safety Journal Club
Excerpts and comments from “A Transferable Psychological Evaluation of Virtual Reality Applied to Safety Training in Chemical Manufacturing”
Published as part of the ACS Chemical Health & Safety joint virtual special issue “Process Safety from Bench to Pilot to Plant” in collaboration with Organic Process Research & Development and Journal of Loss Prevention in the Process Industries.
Matthieu Poyade, Claire Eaglesham, Jordan Trench, and Marc Reid
Recent high-profile accidents—on both research and manufacturing scales—have provided strong drivers for culture change and training improvements. While our focus here is on process-scale chemical manufacturing,[a][b][c][d][e][f][g][h][i][j] similarly severe safety challenges exist on the laboratory scale; such dangers have been extensively reviewed recently. Through consideration of the emerging digitalization trends and perennial safety challenges in the chemical sector, we envisaged using interactive and immersive virtual reality (VR) as an opportune technology for developing safety training and accident readiness for those working in dangerous chemical environments.
Virtual Reality
VR enables interactive and immersive real-time task simulations across a growing wealth of areas. In higher education, prelab training in[k][l][m] VR has the potential to address these issues by giving students multiple attempts to complete core protocols virtually in advance of experimental work, creating the time and space to practice outside of the physical laboratory.
Safety Educational Challenges
Safety education and research have evolved with technology, moving from videos on cassettes to online formats and simulations. However, the area is still a challenge, and very recent work has demonstrated that there must be an active link between pre-laboratory work and laboratory work in order for the advance work to have impact.
Study Aims
The primary question for this study can be framed as follows: When evaluated on a controlled basis, how do two distinct training methods[n][o][p][q][r][s], (1) VR training and (2) traditional slide training[t][u][v] (displayed as a video to ensure consistent delivery of the training), compare for the same safety-critical task?
We describe herein the digital recreation of a hazardous facility using VR to provide immersive and proactive safety training. We use this case to deliver a thorough statistical assessment of the psychological principles of our VR safety training platform versus the traditional non-immersive training (the latter still being the de facto standard for such live industrial settings).
Figure 3. Summarized workflow for safety training case identification and comparative assessment of PowerPoint video versus VR.
2. Methods
After completing their training, participants were required to fill in standardized questionnaires which aimed to formally assess 5 measures of their training experiences.
1. Task-Specific Learning Effect
Participants' post-training knowledge of the ammonia offload task was assessed in an exam-style test composed of six task-specific open questions.
2. Perception of Learning Confidence
How well participants perform on a training exam and how they feel about the overall learning experience are not the same thing. Participant experiences were assessed through 8 bespoke statements, which were specifically designed for the assessment of both training conditions.
3. Sense of Perceived Presence
“Presence” can be defined as the depth of a user’s imagined sensation of “being there” inside the training media they are interacting with.
4. Usability
From the field of human–computer interaction, the System Usability Scale (SUS) has become the industry standard for the assessment of system performance and fitness for the intended purpose… A user indicates their level of agreement on a Likert scale, resulting in a score out of 100 that can be converted to a grade A–F. In our study, the SUS was used to evaluate the subjective usability of our VR training system.
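For readers unfamiliar with the SUS arithmetic, the following sketch shows the standard conversion of the ten 1–5 Likert responses into a single 0–100 score (Brooke's original scoring scheme); the example responses are invented for illustration and are not data from this study.

    # Minimal sketch of standard SUS scoring: ten Likert items, each answered
    # 1-5, converted to a 0-100 usability score. Example responses are invented.
    def sus_score(responses):
        """Convert ten 1-5 Likert responses (items 1..10, in order) to 0-100."""
        if len(responses) != 10:
            raise ValueError("SUS requires answers to all ten items")
        raw = 0
        for item, answer in enumerate(responses, start=1):
            if item % 2 == 1:      # odd items are positively worded
                raw += answer - 1
            else:                  # even items are negatively worded
                raw += 5 - answer
        return raw * 2.5           # scale the 0-40 raw total to 0-100

    print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 2]))  # -> 82.5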
Transcripts of participant feedback—from both the VR and non-VR safety training groups—were used with the Linguistic Inquiry and Word Count (LIWC, pronounced “Luke”) program. Therein, the unedited and word-ordered text structure (the corpus) was analyzed against the LIWC default dictionary, outputting a percentage of words fitting psychological descriptors. Most importantly for this study, the percentage of words labeled with positive or negative affect (or emotion) were captured to enable quantifiable comparison between the VR and non-VR feedback transcripts.
3. Results
Safety Training Evaluation
Having created a bespoke VR safety training platform for the GSK ammonia offloading task, the value of this modern training approach could be formally assessed against GSK's existing training protocols. Crucial to this assessment was bringing together experimental methods that focus on psychological principles not yet commonly practiced in chemical health and safety training assessment (Figure 3). All results presented below are summarized in Figure 8 and Table 1.
Figure 8. Summary of the psychological assessment of VR versus non-VR (video slide-based) safety training. (a) Task-specific learning effect. (b) Perception of learning confidence. (c) Assessment of training presence. (d) VR system usability score. In parts b and c, * and ** represent statistically significant results with p < 0.05 and p < 0.001, respectively.
1. Task-Specific Learning Effect (Figure 8a)
Task-specific learning for the ammonia offload task was assessed using a questionnaire[z][aa][ab][ac] built upon official GSK training materials and marking schemes. Overall, test scores from the Control group and the VR group showed no statistical difference between groups. However, there was a tighter distribution[ad] around the mean score for the VR group versus the Control group.
2. Perception of Learning Confidence (Figure 8b)
Participants’ perceived confidence of having gained new knowledge was assessed using a questionnaire composed of 8 statements, probing multiple aspects of the learning experience… Within a 95% confidence limit, the VR training method was perceived by participants to be significantly more fit for training purpose than video slides. VR also gave participants confidence that they could next perform the safety task alone[ae][af][ag]. Moreover, participants rated VR as having more potential than traditional slides for helping train in other complex tasks and to improve decision making skills (Figure 8b). Overall, participants from the VR group felt more confident and prepared for on-site training than those from the Control group.[ah]
3. Sense of Perceived Presence (Figure 8c)
The Sense of Presence questionnaire was used to gauge participants' overall feeling of training involvement across four key dimensions. Results show that participants from the VR group reported experiencing a higher sense of presence than those from the Control group. On the fourth dimension, negative effects, participants from the Control group reported experiencing more negative effects than those from the VR group, but the difference was not statistically significant (Figure 8c). [ai]
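The excerpt reports which comparisons in Figure 8b,c reached significance but does not name the statistical test used. Purely as an illustration of how such a comparison could be run, the sketch below applies an independent two-sample Mann-Whitney U test, a common choice for Likert-type ratings; the rating values are invented for demonstration.

    # Illustrative two-group comparison of hypothetical 1-5 Likert ratings.
    # The paper's actual test and data are not reproduced here.
    from scipy.stats import mannwhitneyu

    vr_ratings      = [5, 4, 5, 4, 4, 5, 3, 5, 4, 4]
    control_ratings = [3, 4, 3, 2, 4, 3, 3, 2, 4, 3]

    stat, p_value = mannwhitneyu(vr_ratings, control_ratings, alternative="two-sided")
    print(f"U = {stat:.1f}, p = {p_value:.4f}")
    # p < 0.05 would correspond to the single-asterisk comparisons in
    # Figure 8b,c; p < 0.001 to the double-asterisk ones.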
4. Usability of the VR Training Platform (Figure 8d)
The System Usability Scale (SUS) was used to assess the effectiveness, intuitiveness, and satisfaction with which participants were able to achieve the task objectives within the VR environment. The average SUS score recorded was 79.559 (∼80, or grade A−), which placed our VR training platform on the edge of the top 10% of SUS scores (see Figure 5 for context). The SUS result indicated an overall excellent experience for participants in the VR group.
[Participants also] disagreed with any notion that the VR experience was too long (1.6 ± 0.7) and did not think it was too short (2.5 ± 1.1). Participants agreed that the simulation was stable and smooth (3.9 ± 1.2) and disagreed that it was in any way jaggy (2.3 ± 0.8). Hand-based interactions with the VR environment were agreed to be relatively intuitive (3.8 ± 1.3), and the head-mounted display was found to provide agreeable comfort for the duration of the training (4.0 ± 0.9).
5. Sentiment Analysis of Participant Feedback (Table 1)
In the final part of our study, we aimed to corroborate the formal statistical analysis against a quantitative analysis of open participant feedback. Using the text-based transcripts from both the Control and VR group participant feedback, the Linguistic Inquiry and Word Count (LIWC) tool provided further insight based on the emotional sentiment hidden in the plain text. VR participants were found to use more positively emotive words (4.2% of the VR training feedback corpus) versus the Control group (2.1% of the video training feedback corpus). More broadly, the VR group displayed a more positive emotive tone and used fewer negatively emotive words than the Control group.
Table 1. LIWC-Enabled Sentiment Analysis of Participant Feedback Transcripts
LIWC variable    | brief description                                                | VR group | non-VR group
word count       | no. of words in the transcript                                    | 1493     | 984
emotional tone   | difference between positive and negative words (<50 = negative)  | 83.1     | 24.2
positive emotion | % positive words (e.g., love, nice, sweet)                        | 4.2%     | 2.1%
negative emotion | % negative words (e.g., hurt, nasty, ugly)                        | 1.1%     | 2.2%
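To make the percentages in Table 1 concrete, the sketch below counts affect words in a feedback transcript and reports them as a share of the total word count. The word lists are small hypothetical stand-ins, not the licensed LIWC dictionary, and real LIWC categorization (stemming, word categories, tone scoring) is considerably more sophisticated.

    # Toy affect-word count in the spirit of LIWC's percentage output.
    import re

    POSITIVE = {"love", "nice", "sweet", "good", "great", "easy", "intuitive"}
    NEGATIVE = {"hurt", "nasty", "ugly", "bad", "confusing", "difficult"}

    def affect_percentages(transcript):
        words = re.findall(r"[a-z']+", transcript.lower())
        total = len(words)
        pos = sum(word in POSITIVE for word in words)
        neg = sum(word in NEGATIVE for word in words)
        return {
            "word count": total,
            "positive emotion %": round(100 * pos / total, 1) if total else 0.0,
            "negative emotion %": round(100 * neg / total, 1) if total else 0.0,
        }

    print(affect_percentages(
        "The VR training felt intuitive and easy, though the headset "
        "was a little confusing at first."
    ))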
4. Discussion
Overall, using our transferable assessment workflow, the statistical survey analysis showed that task-specific learning was equivalent[aj][ak][al][am][an] for VR and non-VR groups. This suggests that the VR training in that specific context is not detrimental to learning and appears to be as effective as the traditional training modality but, crucially, with improved user investment in the training experience. However, the distribution difference between both training modalities suggests that the VR training provided a more consistent experience across participants than watching video slides, but more evaluation would be required to verify this.
In addition, perceived learning confidence and sense of perceived presence were both reported to be significantly better for the VR group than for the non-VR group. The reported differences in perceived learning confidence between participants from both groups suggest that those from the VR group, despite having acquired a similar amount of knowledge, felt more assured about the applicability of that knowledge. These findings thus suggest that the VR training resulted in a more engaging and psychologically involving modality able to increase participants' confidence in their own learning. [ao][ap][aq]Further research will also aim to explore the applicability and validation of the perceived learning confidence questionnaire introduced in this investigation.
Additionally, VR system usability was quantifiably excellent, according to the SUS score and feedback text sentiment analysis.
Although our experimental data demonstrate the value of the VR modality for health and safety training in chemical manufacturing settings, the sampling, and more particularly the variation in digital literacy[ar] among participants, may be a limitation to the study. Therefore, future research should explore the training validity of the proposed approach involving a homogeneously digitally literate cohort of participants to more rigorously measure knowledge development between experimental conditions.
4.1. Implications for Chemical Health and Safety Training
By requiring learners to complete core protocols virtually in advance of real work, VR pretask training has the potential to address issues of complex learning, knowledge retention[as][at][au], training turnover times, and safety culture enhancements. Researchers in the Chemical and Petrochemical Sciences operate across an expansive range of sites, from small laboratories to pilot plants and refineries. Therefore, beyond valuable safety simulations and training exercises, outputs from this work are envisaged to breed applications where remote virtual or augmented assistance can alleviate the significant risks to staff on large-scale manufacturing sites.
As a space category in buildings, chemistry laboratories are significantly more resource-intensive than office or storage spaces. The ability to deliver virtual chemical safety training, as demonstrated herein, could support the consolidation and recategorization of laboratory space, minimizing the utility and space expenditures that threaten sustainability.[av][aw][ax][ay][az][ba][bb] By developing new Chemistry VR laboratories, the high utility bills[bc][bd][be][bf] associated with running physical chemistry laboratories could potentially be significantly reduced.
4.3. Bridging Chemistry and Psychology
By bringing together psychological and computational assessments of safety training, the workflow applied herein could serve as a blueprint for future developments in this emerging multidisciplinary research domain. Indeed, the need to bring together chemical and psychological skill sets was highlighted in the aforementioned safety review by Trant and Menard.
5. Conclusions
Toward a higher standard of safety training and culture, we have described the end-to-end development of a VR safety training platform deployed in a dangerous chemical manufacturing environment. Using a specific process chemical case study, we have introduced a transferable workflow for the psychological assessment of an advanced training tool versus traditional slide-based safety training. This same workflow could conceivably be applied to training developments beyond safety.
Comparing our VR safety training versus GSK’s established training protocols, we found no statistical difference in the task-specific learning[bg] achieved in VR versus traditional slide-based training. However, statistical differences, in favor of VR, were found for participants’ positive perception of learning confidence and in their training presence (or involvement) in what was being taught. In sum, VR training was shown to help participants invest more in their safety training than in a more traditional setting for training[bh][bi][bj][bk][bl].
Specific to the VR platform itself, the standard System Usability Scale (SUS) found that our development ranked as “A–” or 80%, placing it toward an “excellent” rating and well within the level of acceptance to deliver competent training.
Our ongoing research in this space is now extending into related chemical safety application domains.
[a]I would think the expense of VR could be seen as more “worth it” at this scale rather than at the lab scale given how much bigger and scarier emergencies can be (and how you really can’t “recreate” such an emergency in real life without some serious problems).
[b]Additionally, I suspect that in manufacturing there is more incentive to train up workers outside of the lecture/textbook approach. Many people are hands on learners and tend to move into the trades for that reason.
[c]I was also just about to make a comment that the budget for training and purchasing and upkeep for VR equipment is probably more negligible in those environments compared to smaller lab groups
[d]Jessica…You can create very realistic large scale simulations. For example, I have simulated 50-100 barrel oil spills on water, in rivers, ponds and lakes, with really good results.
[e]Oh – this is a good point. What is not taken into account here is the comfort level people have with different types of learning. It would be interesting to know if Ph.D. level scientists and those who moved into this work through apprenticeship and/or just a BA would’ve felt differently about these training experiences.
[f]Neal – I wasn’t questioning that. I was saying that those things are difficult (or impossible) to recreate in real life – which is why being able to do a simulation would be more attractive for process scale than for lab scale.
[g]The skill level of the participants is not known. A pilot plant team spans very skilled technicians to PhD-level engineers and other scientists. I do not buy into Ralph's observation.
[h]Jessica, I disagree. Simulated or hands-on is really valuable for learning skills that require both conceptual understanding and muscle memory tasks.
[i]I'm not disagreeing. What I am saying is that if you want to teach someone how to clean up a spill, it is a lot easier to spill 50 mL of something in reality and have them practice cleaning it up than it is to spill 5000 L of something and ask them to practice cleaning it up. Ergo, simulation is going to automatically be more attractive to those who have to deal with much larger quantities.
[j]And the question wasn’t about skill level. It was about comfort with different sorts of learning. You can be incredibly highly skilled and still prefer to learn something hands-on – or prefer for someone to give you a book to read. The educational levels were somewhat being used as proxies for this (i.e. to get a PhD, you better like learning through reading!).
[n]I'd be curious to see how a case 3 of video lecture followed by VR training compares, because this would have a focused educational component and then a focused skill and habit development component.
[o]I’d be interested to see how in person training would compare to these.
[p]This would also open up a can of worms. Is it in-person interactive learning? Is it in-person hands-on learning? Or is it sitting in a classroom watching someone give a PowerPoint in person?
[q]I was imagining hands-on learning, or the real-world version of the VR training they did, so they could inspect how immersive the training is compared to the real thing. Comparison to an interactive training could also have been interesting.
[r]I was about to add that I feel like comparison to interactive in-person training would've been good to see. I tend to think of VR as the same as an in-person training, just digital.
[s]That's why I think it could be interesting. They could see if there is in fact a difference between VR and hands-on training. Then, if there was none, you would have an argument for cost savings in space and personnel.
[t]Comparing these two very different methods is problematic. If you are trying to assess the added immersive value of VR then it needed to be compared to another ‘active learning’ method such as computer simulation.
[u]Wouldn’t VR training essentially be a form of computer simulation? I was thinking that the things being compared were 2 trainings that the employees could do “on their own”. At the moment, the training done on their own is typically some sort of recorded slideshow. So they are comparing to something more interactive that is also something that the employee can do “on their own.”
[v]A good comparison could have been VR with the headset and controllers in each hand compared to a keyboard and mouse simulation where you can select certain options. More like Oregon Trail.
[w]I’ve never seen this graphic before, but I love it! Excel is powerful but user-hostile, particularly if you try to share your work with someone else.
Of course, Google and Amazon are cash cows for their platforms, so they have an incentive to optimize System Usability (partially to camouflage their commercial interest in what they are selling).
[x]I found this comparison odd. It is claiming to measure a computer system's fitness for purpose. Amazon's purpose for the user is to get people to buy stuff. While it may be complex behind the scenes, it has a very simple and pretty singular purpose. Excel's purpose for the user is to be able to do loads of different and varied complex tasks. They are really different animals.
[y]Yes, I agree that they are different animals. However, understanding the limits of the two applications requires more IT education than most people get. (Faculty routinely comment that "this generation of students doesn't know how to use computers.") As a result, Excel is commonly used for tasks that it is not appropriate for. But you're correct that Amazon and Google's missions are much simpler than those Excel is used for.
[z]I’d be very interested to know what would have happened if the learners were asked to perform the offload task.
[aa]Yes! I just saw your comment below. I wonder if they are seeing no difference here because they are testing on a different platform than the one they trained on. It would be interesting to compare written and in-practice results for each group. Maybe the VR group would be worse at the written test but better at physically doing the tasks.
[ab]This could get at my concerns about the Dunning-Kruger effect that I mentioned below as well. Just because someone feels more confident that they can do something doesn't mean that they are correct about that. It definitely would've been helpful to actually have the employees perform the task and see how the two groups compared. Especially since the purpose of training isn't to pass a test – it is to actually be able to DO the task!
[ac]“Especially since the purpose of training isn’t to pass a test – it is to actually be able to DO the task!” – this
[ad]If I'm understanding the plot correctly, it looks like there is a higher skew in the positive direction for the control group, which is interesting. I.e., lower but also higher scores. The VR training seems to have an evening-out effect, which makes the results of the training more predictable.
[ae]I'm slightly alarmed that VR gives people the confidence to perform the tasks alone when there was no statistical difference in the task-specific learning scores compared with the control group. VR seems to give a false sense of confidence.
[af]I had the same reaction to reading this. I don’t believe a false sense of confidence in performing a task alone to be a good thing. Confidence in your understanding is great, but overconfidence in physically performing something can definitely lead to more accidents.
[ag]A comment I've seen is that the VR trainees may perform better at doing the task than the control group, but they only gauged their knowledge in an exam format, not by actually performing the task they were trained to do. But regardless, a false confidence would not be good.
[ah]Wonder if there is concern that this creates a false confidence given the exam scores.
[ai]Wow these are big differences. There is basically no overlap in the error bars except the negative effect dimension.
[aj]If the task-specific learning was equivalent, but one method caused people to express greater confidence than the other, is this necessarily a good thing? Taken to an extreme perhaps, wouldn’t we just be triggering the Dunning-Kruger effect?
[ak]I’d be interested in the value of VR option as refresher training, similar to the way that flight simulators are used for pilots. Sullenberger attributed surviving the bird strike at LaGuardia to lots of simulator time allowing him to react functionally rather than emotionally in the 30 seconds after birds were hit
[al]I had the same comment above. I was wondering if people should be worried about false confidence. But the assessment was on paper; they may be better at physically doing the tasks or responding in real time, and that was not tested.
[an]From Ralph’s comment, I think there is a lot of value in VR for training for and simulating emergency situations without any of the risks.
[ao]Again, this triggers the question: should they be more confident in this learning?
[ap]I don’t think the available data let us distinguish between an overconfident test group and an under-confident control group (or both).
[aq]I am curious what the standard in the field is for determining at what point learners are overconfident. For example, are these good scores? Is there a correlation between higher test scores and confidence, or an inverse one, meaning false confidence?
[ar]I am curious as well how much training was needed to explain how the VR setup worked. That is a huge factor in how easy people will perceive the VR setup to be if someone walks them through it slowly, versus walking up to a training station and being expected to know how to use it.
[as]I suspect that if people are more engaged in the VR training, they would have better retention of the training over time; that would be interesting to explore. If so, it would be a great argument for VR.
[at]Right, and if places don't have the capacity to offer hands-on training, an alternative to online-only is VR. Or in high-hazard work, where it's not advisable to train under the circumstances.
[au]I agree that there is a lot of merit for VR in high-hazard or emergency response simulation and training.
[av]I’m personally more interested in the potential to use augmented reality in an empty lab space to train researchers to work with a variety of hazards/operations.
[aw]Right, this whole time I wasn’t thinking about replacing labs I was thinking about adding a desk space for training. I am still curious about the logistics though, in the era of COVID if people will really be comfortable sharing equipment and how it will be maintained, because that all costs money as well.
[ax]This really would open up some interesting arguments. Would learning how to do everything for 4 years in college by VR actually translate to an individual being able to walk into a real lab and work confidently?
[ay]During COVID, they were able to do some portion of the labs virtually – like moving things, etc. From a personal standpoint, I'd say no, and like we talked about here – it would give a false sense of confidence, which could be detrimental. It's the same way I feel about "virtual fire extinguisher training." I don't think it provides any of the necessary exposure.
[az]Amanda, totally agree. The virtual fire extinguisher training does not provide the feel (heat), smell (fire & agent), or sound of a live fire exercise.
[ba]Oh wait – virtual training and “virtual reality training” are pretty different concepts. I agree that virtual training would never be able to substitute for hands-on experience. However, what VR training has been driving at for years is to try to create such a realistic environment within which to perform the training that it really does “feel like you are there.” I’m not sure how close we are to that. In my experiences with VR, I haven’t seen anything THAT good. But I likely haven’t been exposed to the real cutting edge.
[bb]Jessica, I wouldn’t recommend that AR/VR ever be used as the only training provided, but I suspect that it could shorten someone’s learning curve.
Amanda and Neal, having done both, I think that both have their advantages. The heat of the fire and kickback from the extinguisher aren’t well-replicated, but I actually think that the virtual extinguisher can do a better job of simulating the difficulty. When I’ve done the hands-on training, you’d succeed in about 1-2 seconds as long as you pointed the extinguisher in the general vicinity of the fire.
[bc]Utility bills for lab spaces are higher than average, but a small fraction of the total costs of labs. At Cornell, energy costs were about 5% of the cost of operating the building when you took the salaries of the people in the building into account. This explains why utility costs are not as compelling to academic administrators as they might seem. However, if there are labor cost savings associated with VR…
[bd]I’d love to see a breakdown of the fixed cost of purchasing the VR equipment, and the incremental cost of developing each new module.
[be]Ralph, it is so interesting to see the numbers; I had no idea it was that small of an amount. I suppose the argument might need to be wasted energy/environmental considerations then, rather than cost savings.
[bf]Yes, that is why I had a job at Cornell – a large part of the campus community was very committed to moving toward carbon neutrality and supported energy conservation projects even with long payback periods (e.g., 7 years).
[bg]It seems like it would be more helpful to have a paper test and a field test to see if VR helped with physically doing the tasks, since that is the benefit that I see. In addition, looking at retention over time would be important. Otherwise the case for VR is harder to make if it's not increasing the desired knowledge and skills.
[bh]How much of the statistical favoring of VR was due to the VR platform being "cool" and "new" versus the participants' familiarity with traditional approaches? I do not see any control in the document for this.
[bi]I agree that the work as reported seems to demonstrate that the platform provides an acceptable experience for the learner, but it's not clear whether the skills are acquired or retained.
[bj]Three of the four authors are in the VR academic environment; one is in a chemistry department. There seems to be a real bias toward showing that VR works in this setting.
[bk]One application I think this could help with is one I experienced today. I had to sit through an Underground Storage Tank training which was similar to the text-heavy PowerPoint in the video. If I had been able to self-direct my tour of the tank and play with the valves and detection systems, I would have been able to complete the regulatory requirements of understanding the system well enough to oversee its operations, but not well enough to have hands-on responsibilities. The hands-on work is left to the contractors who work on the tanks every day.
[bl]We have very good results using simulators in the nuclear power and aviation industries. BUT, there is still significant learning that occurs when things become real. The most dramatic example is landing high-performance aircraft on carrier decks. Every pilot agrees that nothing prepares them for the reality of this controlled crash.
Theme 1: Setting Core Values and Leading By Example.
Mission statements. …Mission statements establish the priorities and values of an organization, and can be useful in setting a positive example and signaling to potential new researchers the importance of safety. To examine the current state of public-facing mission statements, we accessed the websites of the top [a][b]50 chemistry graduate programs in the United States and searched any published mission statements for mentions of safety. [c]Although 29 departments had prominently displayed mission statements on their websites, only two (the University of Pennsylvania and UMN) included practicing safe chemistry in their mission statement (at the time of publishing in early 2021).[d][e][f][g]
Regular and consistent discussions about safety. Leaders can demonstrate that safety is a priority by regularly discussing safety in the context of experimental set-up, research presentations, literature presentations, or other student interactions[h][i]. Many potential safety teaching moments occur during routine discussions of the day-to-day lab experience.[j] Additionally, “safety minutes[k][l]”[m][n] have become a popular method in both industry and academia to address safety at the beginning of meetings.
Holding researchers accountable[o][p]. In an academic setting, researchers are still in the trainee stage of their careers. As a result, it is important to hold follow-up discussions on safety to ensure that safe practices are being properly implemented.[q] For example, in the UMN Department of Chemistry, a subset of the departmental safety committee performs regular PPE "spot checks,"[r][s] and highlights exemplary behavior through spotlights in departmental media. Additionally, each UMN chemistry graduate student participates in an annual review process that includes laboratory safety as a formal category.[t] Candid discussion and follow-up on safe practices are critical for effective trainee development.
Theme 2: Empowering Researchers to Collaborate and Practice Safe Science.
Within a group: designate safety officers. Empower group members to take charge of safety within a group, both as individuals and through formal appointed roles such as laboratory safety officers (LSOs).[u] LSOs can serve multiple important roles within a research group. First, LSOs can act as a safety role model for peers in the research group, and also as a resource for departmental policy. Further, they act as liaisons to ensure open communication between the PI, the research group, and EHS staff. These types of formal liaisons are critical for fostering collaborations between a PI, EHS, and the researchers actually carrying out labwork. LSOs can also assist [v][w]with routine safety upkeep in a lab, such as managing hazardous waste removal protocols and regularly testing safety equipment such as eyewash stations. For example, in the departments of chemistry at UMN and UNC, LSOs are responsible for periodically flushing eyewash stations and recording the check on nearby signage[x].[y][z][aa][ab] Finally, LSOs are also natural picks for department-level initiatives such as department safety committees or student-led joint safety teams.[ac]
Within a group: work on policies as a collective. Have researchers co-write, edit, and revise standard operating procedures (SOPs).[ad] Many EHS offices have guides and templates for writing SOPs. More details on SOPs will be discussed in the training section of this chapter. Co-writing has the double benefit of acting as a safety teaching moment while also helping researchers feel more engaged with and responsible for the safety protocols of the group.[ae][af][ag]
Within a department: establish safety collaborations.[ah] Research groups within departments often have very diverse sets of expertise. These should be leveraged through collaboration to complement “blind spots”[ai] within a particular group to the benefit of all involved—commonly, this is done through departmental safety committees, but alternative (or complementary) models are emerging. An extremely successful and increasingly popular strategy for establishing department-wide safety collaborations is the Joint Safety Team model.
Joint Safety Teams (JSTs). A Joint Safety Team (JST) is a collaborative, graduate student- and postdoc-led initiative with the goal of proliferating a culture of laboratory safety by bridging the gaps between safety administration, departmental administration, and researchers. JSTs are built on the premise that grassroots efforts can empower students to take ownership of safety practices, thus improving safety culture from the ground up. Since its inception in 2012, the first JST at UMN (a joint endeavor between the departments of Chemistry and Chemical Engineering & Materials Science, spurred through collaboration with Dow Chemical) has directly and significantly impacted the adoption of improved safety practices, and also noticeably improved the overall safety culture. Indeed, the energy and enthusiasm of students, which are well-recognized as key drivers in research innovation, can also be a significant driver for improving safety culture.[aj][ak]
…The JST model has several advantages[al]: (1) it spreads the safety burden across a greater number of stakeholders, reducing the workload for any one individual or committee; (2) it provides departmental leadership with better insight into the “on-the-ground” attitudes and behaviors of researchers;[am][an][ao][ap] and (3) it provides students with practical safety leadership opportunities that will be beneficial to their career. In fact, many of the strategies discussed in this chapter can be either traced back to a JST, or could potentially be implemented by a JST.
An inherent challenge with student-led initiatives like JSTs is that the high student turnover in a graduate program requires ongoing, enthusiastic participation from students after the first generation. In fact, after the first generation of LSOs left the UMN JST upon graduation, there was a temporary lag in enthusiasm and engagement. To maintain consistent engagement with the JST, the departmental administration created small salary bonuses for officer-level positions within the JST, as well as a funded TA position. Since time spent on JST activities takes away from research and other duties, it seems reasonable that students be compensated accordingly when resources allow it…[aq][ar]
3. Training
Initial Safety Training. Chemical safety training often starts with standardized, institution-wide modules that accompany a comprehensive chemical hygiene plan or laboratory safety manual. These are important resources, but researchers can be overwhelmed by the amount of information[as][at]—particularly if only some components seem directly relevant. There is anecdotal evidence that augmentation with departmental-level training[au][av][aw] initiatives can provide a stronger safety training foundation. For example, several departments have implemented laboratory safety courses. At UNC, a course for all first-year chemistry graduate students was created with the goal of providing additional training tailored to the department’s students. Iteratively refined using student feedback, the course operates via a “flipped classroom” model to maximize engagement. The course combines case study-based instruction, role playing (i.e., “what would you do” scenarios), hands-on activities (e.g., field trips to academic labs), and discussions led by active researchers in chemistry subdisciplines or industrial research.
Continued Safety Training[ax]. Maintaining engagement and critical thinking about safety can help ensure that individual researchers continue to use safe practices,[ay] and can strengthen the broader safety culture. Without reinforcement, complacency is to be expected—and complacency can lead to catastrophe. We believe continued training can go well beyond an annual review of documentation by weaving aspects of safety training seamlessly into existing frameworks.
Departments can incorporate safety minutes into weekly seminars or meetings, creating dedicated time for safety discussions (e.g., specific hazards, risk assessment, or emergency action plans). In our experience, safety minutes are most effective when they include interactive prompts[az] that encourage researcher participation and discussion. These safety minutes allow researchers to learn from one another and collaborate on safety.
While safety minutes provide continuous exposure and opportunities to practice thinking about safety, they lack critical hands-on (re)learning experiences. Some institutions have implemented annual researcher-driven activities to help address this challenge. For example, researchers at UNC started “safety field days” (harkening back to field days in grade school) designed specifically to offer practice with hands-on techniques in small groups. At UMN, principal investigators participate in an annual SOP “peer review” process, in which two PIs pair up and discuss strengths and potential weaknesses of a given research group’s SOP portfolio.[ba][bb]
Continued training can also come, perhaps paradoxically, when a researcher steps into a mentoring role to train someone else. The act of mentoring a less experienced researcher provides training to the mentee, but it also forces the mentor to re-establish a commitment to safe practices and re-learn best practices[bc].[bd] Peer teaching approaches have long been known to improve learning outcomes for both the mentor and the mentee. With regard to safety training, mentors would be re-exposed to a number of resources used in initial safety trainings, such as SOPs, while having to demonstrate mastery of the material through their teaching. Furthermore, providing hands-on instruction would require demonstrating techniques for mentees and subsequently critically assessing and correcting the mentee’s technique. Additionally, mentors would have to engage in critical thinking about safety while answering questions and guiding the mentee’s risk assessments.
Continued training is important for all members of a research group. Seniority does not necessarily convey an innate understanding of safety, nor does it exempt anyone from complacency. For example, incoming postdocs[be] will bring a wealth of research and safety knowledge, but they may not be as intimately familiar with all of the hazards or procedures of their new lab, or they may come from a group or department with a different safety culture. Discussing safety with researchers can be difficult, but is a necessary part of both initial and continued training.
Awareness as it pertains to chemical safety involves building layers of experience and expertise on top of safety training. It is about being mindful and engaged in the lab and being proactive and prepared in the event of an adverse scenario. Awareness is about foreseeing and avoiding accidents before they happen, not just making retroactive changes to prevent them from happening again. There are aspects of awareness that come from experiential learning—knowing the sights and sounds of the lab—while other aspects grow out of more formal training. For example, awareness of the potential hazards associated with particular materials or procedures likely requires some form of risk assessment.
Heightened awareness grows out of strong training and frequent communication, two facets of an environment with a strong culture of safety. Communication with lab mates, supervisors, EHS staff, industry, and the broader community builds awareness at many levels. Awareness is a critical pillar of laboratory safety because it links the deeply personal aspects of being a laboratory researcher with the broader context of safety communication, training, and culture. It also helps address the challenge of implementing a constructive and supportive infrastructure.
Like any experience-based knowledge, safety awareness will vary significantly between individuals. Members of a group likely have had many of the same experiences, and thus often have overlap in their awareness. Effective mentoring can lead to further overlap by sharing awareness of hazards even when the mentee has not directly experienced the situation. Between groups in a department, however, where the techniques and hazards can vary tremendously, there is often little overlap in safety awareness. [bg][bh][bi][bj]In this section, we consider strategies to heighten researcher safety awareness at various levels through resources and tools that allow for intentional sharing of experiential safety knowledge.
Awareness Within an Academic Laboratory[bk]. …Some level of awareness[bl] will come through formal safety training, as was discussed in the preceding section. Our focus here is on heightened awareness through experiential learning, mindset development, and cultivating communication and teaching to share experience between researchers.
We have encountered or utilized several frameworks for building experience. One widely utilized model involves one-on-one mentoring schemes, where a more experienced researcher is paired with a junior researcher. This provides the junior researcher an opportunity to hear about many experiences and, when working together, the more experienced researcher can point out times when heightened awareness is needed. All the while, the junior researcher is running experiments and learning new techniques. There are drawbacks to this method, though. For example, the mentor may not be as experienced in particular techniques or reagents needed for the junior researcher’s project. Or the mentor may not be effective in sharing experience or teaching awareness. Gaps in senior leadership can develop in a group, leaving mentorship voids or leading to knowledge transfer losses. Like the game “telephone” where one person whispers a message to another in a chain, it is easy for the starting message to change gradually with each transfer of knowledge.[bm] This underscores the importance of effective mentoring,[bn] and providing mentors with a supportive environment and training resources such as SOPs and other documentation…
…Another approach involves discussing a hypothetical scenario, rather than waiting to experience it directly in the lab. Safety scenarios are a type of safety minute that provide an opportunity to proactively consider plausible situations that could be encountered.[bo][bp] Whereas many groups or departments discuss how to prevent an accident from happening again, hypothetical scenarios provide a chance to think about how to prevent an accident before it happens[bq][br]. Researchers can lead these scenarios at group meetings. If a researcher is asked to provide a safety scenario every few weeks, they may also spend more time in the lab thinking about possible situations and how to handle them on their own.
Almost all of these methods constitute some form of risk or hazard assessment. As discussed in the training section, formal risk assessment has not traditionally been part of the academic regimen. Students are often surprised [bs]to learn that they perform informal risk assessments constantly, as they survey a procedure, ask lab mates questions about a reagent, or have a mentor check on an apparatus before proceeding. Intuition provides a valuable risk assessment tool, but only when one’s intuition is strong. A balance[bt] of formal risk assessment and informal, experiential, or intuition-based risk assessment is probably ideal for an academic lab.
…Checklists are another useful tool for checking in on safety awareness. [bu][bv][bw]Checklists are critically important in many fields, including aviation and medicine. They provide a rapid visual cue that can be particularly useful in high-stress situations where rapid recall of proper protocol can be compromised, or in high-repetition situations where forgetting a key step could have negative consequences. Inspired by conversations with EHS staff, the Miller lab recently created checklists that cover daily lab closing, emergency shutdowns, glovebox malfunctions, high-pressure reactor status, vacuum line operations, and more. The checklists are not comprehensive, and do not replace in-person training and SOPs, but instead capture the most important aspects that can cause problems or are commonly forgotten.[bx][by] The posted checklists serve as a visual reminder of the experiential learning that each researcher has accumulated, and can provide an aid during a stressful moment when recall can break down.
A unifying theme in these approaches is the development of frameworks for gaining experience with strong mentoring and researcher-centric continued education. Communication is also essential, as this enables the shared experiences of a group to be absorbed by each individual…[bz][ca]
[b]I can’t remember exactly what we did, but if we were writing this today, that is where I’d probably start. There is plenty of discussion about how accurate/useful those rankings are, but in this instance, it serves as a good starting point.
[c]If safety is not part of your mission statement or part of your graduate student handbook, then this could cause issues with any disciplinary actions you may want to take. For instance, we had a graduate student set off the fire alarm twice on purpose in a research-intense building. It was difficult for the department this person was part of to take action.
We have a separate section on safety in our handbook: “Chemical engineering research often involves handling and disposing of hazardous materials. Graduate students must follow safe laboratory practices as well as attend a basic safety training seminar before starting any laboratory work. In order to promote a culture of safety, the department maintains an active Laboratory Safety Committee composed of the department head, faculty, staff, and student members, which meets each semester. Students are expected to be responsive to the safety improvements suggested by the committee, to serve on the committee when asked, and to utilize the committee members as a resource for lab safety communication.”
[d]This is an interesting perspective on how others have prioritized safety.
[e]I find these sorts of things ring hollow given how little PIs or department leadership seem to know about what is happening in other people’s labs.
[f]I agree, particularly at the university level. However, a number of labs have mission statements, including the Miller lab’s, that mention safety. I think at that level, it certainly can demonstrate greater intent.
[g]Agreed. Having that language come from the PI is definitely different from having it come from the department or university – especially if the PI also walks the walk.
[h]So important because what is important to the professor becomes important to the student
[i]I definitely agree with this. I have always noticed that our students immediately reflect, pick up on, and look for what they think their professors deem most necessary/important/essential.
[j]I think this is an underappreciated area. These are the conversations that help make safety a part of doing work, not just something bolted on to check a box or provide legal cover.
[l]I would say somewhat? I’ve come to realize that these two terms are often used interchangeably, but they can mean very different things. At UNC, what my lab called “safety minutes” would be a dedicated section of group meeting every week where someone would lead a hypothetical scenario or a discussion of how to design a dangerous experiment with cradle-to-grave waste planning. At other places, these can mean things as simple as a static slide before seminar.
[m]The inverse of my comment above. I think these have their place, but if they’re disconnected from what the group’s “really” there to talk about, they can reinforce the idea that safety isn’t a core part of research.
[n]Agreed. I’ve seen some groups implement this by essentially swiping “safety minutes” from someone else. While this could be a way to get started, the items addressed really should be specific to your research group and your lab in order to be meaningful.
[o]Note how all these examples are of the department holding its own researchers accountable, not EHS coming externally to enforce
[p]Yes! It doesn’t fight against the autonomy that’s a core value of academia.
[q]This is a good point. Can’t be a one-and-done although it often feels like it is treated that way.
[r]It seems like this practice would also build awareness and appreciation of the other lab settings and help to foster communication between groups.
[s]I would also hope that it would normalize reminding others about their PPE. I was surprised to find how many faculty members in my department were incredibly uncomfortable with correcting the PPE or behavior of a student from a different lab. It meant effectively that we had extremely variable approaches to PPE and safety throughout our department.
[t]LOVE this idea. I’m guessing it makes the PI reflect on safety of each student as well as start a conversation about what’s going well and ways to improve
[u]We’ve seen dramatic improvement in safety issue resolution once we implemented this kind of program.
[v]Note how the word assist is used. Important to emphasize that the PI is delegating some of their duties to the LSO and throughout the lab but they are ultimately responsible for the safety of their researchers.
[w]Really important point. Too often I’ve seen an LSO designated just so the PI can essentially wash their hands of responsibility and just hand everything safety related to the LSO. It is also important for the PI to be prepared to back their LSO if another student doesn’t want to abide by a rule.
[x]Wondering what the risk is of institutions taking advantage of LSO programs by putting tasks on researchers that should really be the responsibility of the facility or EHS
[y]At UMN and UNC, do these tasks/roles contribute towards progress towards degree?
[z]In my experience at UNC serving as an LSO, it does not relate to progressing one’s studies/degree in any way (though there is the time commitment component of the role). At UMN, I know departmentally they have stronger support for their LSOs and requirements but I would not say that serving as an LSO helps/hinders degree progression (outside again of potential time commitments).
[aa]Thanks! I’m glad to hear that it doesn’t seem to hurt, but I think finding ways for it to help could make a huge difference. Thinking back to my own experience, my advisor counseled us to always have that end goal in mind when thinking about how we spent our time. This was in the context of not prioritizing TA duties ahead of research, but it is something that could argue against taking on these sorts of tasks.
[ab]Yeah, I think that is a really important point. If your PI continuously stresses only the results of research/imparts a sense of speed > safety, the students will pick up on that and shift in that direction as a function of their lab culture. So the flip side is, if you can build and sustain a strong culture of safety, it becomes an inherent requirement, not an external check.
[ac]It is important to keep in mind that this work should be considered in relation to other duties and to somehow be equally shared out among lab members. Depending on how this work is distributed, it can become an incredibly time-consuming set of tasks for one person to constantly be handling.
[ad]I have struggled with how to get buy-in for production of written SOPs.
[ae]It also increases the likelihood that the researchers will be able to implement the controls!
[af]Important point. It is often difficult in grad school to admit when you don’t understand or know how to do something. It is critical to make sure that they understand what is expected. I ran a small pilot project in which I found out that all 6 “soon to be defending” folks involved in the pilot had no clue what baffles in a hood were. Our Powerpoint “hoods” training was required every year. Ha!
[ag]In addition, it serves as a review process to catch risks and hazards that the first writer may not have thought of. In industry it is common practice for multiple people to sign off on a protocol before it is used.
[ah]As outlined in this section, a good idea. We’re also working toward development of collaboration between staff with safety functions. For instance, have building managers from one department involved in inspections of buildings of other departments.
[ai]Also to avoid re-inventing the wheel. If another group has this expertise and has done risk assessments on the procedure you’re doing, better to start there rather than from scratch. You may even identify items for the other group to add.
[aj]A bottom-up approach works extremely well when you have departmental support, but not so much when the head of the department doesn’t care.
[ak]They also can’t be effective if the concerns that the JST raise aren’t taken seriously by those who can change policies.
[al]An additional advantage is displaying value for performing safety duties, i.e., the culture is developed such that your work is appreciated rather than seen as an annoyance.
[am]I’m curious to hear what discoveries have been made about this.
[an]At UConn, we were trying to use surveys to essentially prove to our department leadership and faculty safety committee that graduate students actually DID want safety trainings – we just wanted them to be more specific to the things that we are actually concerned about in lab (as opposed to the same canned ones that they kept offering). My colleagues have told me that they are actually moving forward with these types of trainings now.
[ao]We also started holding quarterly LSO meetings because we proved to faculty through surveys that graduate students actually did want them (as long as they were designed usefully and addressed real issues in the research labs).
[ap]We work with representatives from various campus entities, which brings us a variety of insights. Yes, focused training is much more valuable, and feels more worthwhile, educating both the trainer and trainee.
[aq]This is an impressive way for department administration to show endorsement and support for safety efforts.
[ar]Another way departments can communicate their commitment towards safety.
[as]We have one of these, too, but we need to move away from it. Not augmentation, but replacement. I don’t know what that looks like yet, but a content dump isn’t it.
[at]Agreed. At UNC we actually have an annual requirement to review the laboratory safety manual with our PI in some sort of setting (requires the PI to sign off on some forms saying they did it). Obviously, with the length, that isn’t feasible in its entirety, so we’d highlight a few key sections, but yeah, not the most useful document.
[au]I see the CHP or lab safety manual as education/resources and the training as actually practicing behaviors that are expected of researchers, which is critical to actually enforcing policies
[av]Agreed. I always felt any “training” should actually be doing something hands-on. Sitting in a lecture hall watching a Powerpoint should not qualify as “training.”
[aw]Agreed as well Jessica. That is why now all our safety trainings are hands on as you stated. It has worked and come across MUCH better. Even with our facilities and security personnel.
[ax]I really like how continued training is embedded throughout regular day-to-day activities in many cases; this is important. I would add that it is important to have the beginning training available for refresher or reference as needed, but I don’t think it’s worth it to completely retake the online trainings.
[ay]Let me remind everyone that when a senior lab person shows a junior person how to do a procedure, training is occurring. Including safety aspects in the teaching is critical. Capturing this with a note in the junior person’s lab notebook documents it. The UCLA prosecution would not have occurred if the PD had done this with Ms. Sangji.
[az]I’m very curious to hear about examples of this.
[ba]Wow. Getting the PIs to do this would be awesome. I wonder what is the PI incentive. Part of their annual review?
[bc]In recognition of this our department has put together an on-line lab research mentoring guide and we’re looking for ways to disseminate info about it.
[bd]This works both ways; a mentee ending up with a mentor who doesn’t emphasize safety in their work might be communicating that to their mentees as well.
[be]This has been something I’ve been concerned about and am not sure how to address as an embedded safety professional.
[bf]Just a thought – It seems like there is a big overlap between continued training and developing awareness, which makes sense
[bg]In working with graduate students, I have found a really odd understanding of this to be quite common. Many think that they are only responsible for understanding what is going on in their own labs – and not for what may be going on next door.
[bh]So true. This is where EHS could really help departments or buildings define safety better. Most people may not be aware that the floor above them uses pyrophorics, etc.
[bi]I think this speaks to how insular grad school almost forces you to be. You spend so much time deepening your knowledge and understanding of your area of research that you have no time to develop breadth.
[bj]Yeah, these are great points. Anecdotally, at UNC when we started the JST we really struggled to get any engagement whatsoever out of the computational groups, even when their work space is across the hall from heavily synthetic organic chemistry groups. We didn’t really solve this, but I know it was and is something we’re chewing on
[bk]Not mentioned explicitly in this section, but documenting what is learned is critically important. As noted earlier in the paper, students cycle through labs. The knowledge needs to stay there.
[bl]From my perspective, awareness seems to be directly tied to the PI’s priorities except for the 1 in 20 student.
[bm]It seems like something like this happened in the 2010 Texas Tech explosion.
[bn]This highlights the importance of developing procedures/protocols/SOPs, secondary review, and good training/onboarding practices for specific techniques.
[bo]These are always great discussions and fruitful.
[bp]Agreed. Even if you never encounter that scenario in your career, the process of how to think about responding to the unexpected is a generalizable skill.
[bq]I also think it is important to use something like this to help researchers think about how to respond to incidents when they do happen in order to decrease the harm caused by the incident.
[br]This is really great. I think a big part of knowing what to do when a lab incident occurs has a lot to do with thinking about how to respond to the incident before it happens.
[bs]I’m really glad this is included in here. Most of risk assessment is actually very intuitive but this highlights the importance of going through the process in a formal way. But the term is so unfamiliar to researchers sometimes that it seems unapproachable
[bt]I’m interested in learning how others judge the way to find this balance.
[bu]These are useful if the student doing the work develops the checklist; otherwise it becomes just a checklist without understanding and thought. I see many students look at these checklists and ignore hazards because they are not on or part of the list.
[bv]Good point. It would likely be a good practice to periodically update these as well – especially encouraging folks to bring in things that they’ve come across that were done poorly or they had to clean up.
[bw]These are great points. The checklists we’ve designed try their best to highlight major hazards, but for brevity it isn’t possible to cover everything. As Jessica pointed out, reviewing them periodically can be a huge boost, in the same way that reviewing and updating SOPs as living documents is important.
[bx]This is important – a good checklist is neither an SOP nor a training.
[by]I utilize checklists but sometimes see a form of checklist fatigue – a repeated user thinks they know it and doesn’t bother with the checklist.
So the comment about a GOOD checklist is applicable.
[bz]I’m impressed with the ideas and diversity and discussion of alternatives. It’s inspiring. However, I’m trying to institute a culture of safety where I am, and many of these ideas aren’t possible for me. I’m in chemistry at a 2-year (community) college, and I don’t have TAs, graduate students, etc. We’re not a research institution, which is somewhat of an advantage because our situations are relatively static and theoretically controllable. My other problem is I’m trying to carry the safety culture across the college and to our sister colleges, to departments like maintenance and operations, art, aircraft mechanics, welding, etc.
I would love to see ways to address
1. just a teaching college
2. other processes across campus.
I head a safety committee but am challenged to keep people engaged and aware.
[ca]I’ve found some helpful insights from others about this sort of thing from NAOSMM. They have a listserv and annual conferences where they offer workshops and presentations on safety that are helpful for non-research institutions like what you’re describing.
Safety is at the core of industrial and academic laboratory operations worldwide and is arguably one of the most important parts of any enterprise, since worker protection is key to maintaining ongoing activity.28 Central to these efforts is the establishment of clear expectations regarding acceptable conditions and the procedures necessary to ensure protection from accidents and unwanted exposures[a]. To achieve adherence to these expectations, frequent and well-documented inspections are made an integral part of these systems.23
Consideration of the inspection format is essential to the success of this process.31 Important components to be mindful of include the frequency of inspections, inspector training, degree of recipient involvement, and means of documentation[b][c][d][e]. Within documentation, the form used for the inspection, the report generated, and the means of communicating, compiling, and tracking results all deserve scrutiny in order to optimize inspection benefits.27
Within that realm of documentation, inspection practice often depends on the use of checklists, a widespread and standard approach. Because checklists make content readily accessible and organize knowledge in a way that facilitates systematic evaluation, they are understandably preferred for this application. In addition, checklists reduce the frequency of omission errors and, while not eliminating variability, do increase consistency in inspection elements[f][g][h][i][j] among evaluators because users are directed to focus on a single item at a time.26 This not only amplifies the reliability of results, but it also can proactively communicate expectations to inspection recipients and thereby facilitate their compliance preceding an inspection.
However, checklists do have limitations. Most notably, if items on a list cover a large scope and inspection time is limited, reproducibility in recorded findings can be reduced[k]. In addition, individual interpretation and inspector training and preparation can affect inspection results[l][m][n][o][p][q].11 The unfortunate consequence of this variation in thoroughness is that, without a note of deficiency, there is no unequivocal indication that an inspection has been done. Instead, if something is recorded as satisfactory, the question remains whether the check was sufficiently thorough or even done at all. Therefore, in effect, the only certainty a checklist conveys is negative feedback[r][s][t].
Even with uniformity of user attention and approach, checklists risk producing a counterproductive form of tunnel vision[u] because they tend to discourage recognition of problematic interactions and interdependencies that may also contribute to unsafe conditions.15 Also, depending on format, a checklist may provide neither information on how to remedy issues nor the ability[v][w] to prioritize among issues in doing follow-up.3 What’s more, within an inspection system, the incentive to pursue a remedy may only be the anticipation of the next inspection, so self-regulating compliance in between inspections may not be facilitated.[x][y][z][aa][ab][ac]22
Recognition of these limitations necessitates reconsideration of the checklist-only approach, and as part of that reevaluation it is important to begin with a good foundation. The first step, therefore, is to establish the goal of the process. This ensures that the tool is designed to fulfill a purpose that is widely understood and accepted.9 Examples of goals of an environmental health and safety inspection might be to improve the safety of surroundings, to increase compliance with institutional requirements, to strengthen preparedness for external inspections, and/or to increase workers’ awareness and understanding of rules and regulations. In all of these examples, the aim is to either prompt change or strengthen existing favorable conditions. While checklists provide some guidance for change, they do not bring about that change, [ad][ae]and they are arguably very limited when it comes to definitively conveying which favorable conditions to strengthen. The inclusion of positive feedback helps fulfill these goals.
A degree of skepticism and reluctance to actively include a record of positive observations[af][ag] in an inspection is understandable, since negative feedback can more readily influence recipients toward adherence to standards and regulations. Characterized by correction and an implicit call to remedy, negative feedback leverages the strong emotional impact of deficiency to limit deviation from what has been done before.19 However, arousal of strong negative emotions, such as discouragement, shame, and disappointment, also neurologically inhibits access to existing neural circuits, thereby invoking cognitive, emotional, and perceptual impairment.[ah][ai][aj]10, 24, 25 In effect, this means that negative feedback can also reduce the comprehension of content and thus possibly run counter to the desired goal of bringing about follow-up and change.2
This skepticism and reluctance may understandably extend even to including positive feedback alongside negative feedback, since affirmative statements do not leave as strong an impression as critical ones. However, studies have shown that the details of negative comments will not be retained without sufficient accompanying positive comments.[ak][al][am][an][ao][ap][aq]1[ar][as] The importance of this balance has also been shown for workplace team performance.19 The correlation observed between higher team performance and a higher ratio of positive comments in the study by Losada and Heaphy is attributed to an expansive emotional space, opening possibilities for action and creativity. By contrast, lower-performing teams demonstrated restrictive emotional spaces as reflected in a low ratio of positive comments. These spaces were characterized by a lack of mutual support and enthusiasm, as well as an atmosphere of distrust and [at]cynicism.18
The consequence of positive feedback in and of itself also provides compelling reason to regularly and actively provide it. Beyond increasing comprehension of corrections by offsetting critical feedback, affirmative assessment facilitates change by broadening the array of solutions considered by recipients of the feedback.13 This dynamic occurs because positive feedback adds to an employee’s security and thus amplifies their confidence to build on their existing strengths, empowering them to perform at a higher level[au].7
Principles of management point out that to achieve high performance, employees need tangible examples of right actions to take[av][aw][ax][ay][az], including knowing which current actions to continue doing.14, 21 A significant example of this is the way Dallas Cowboys football coach Tom Landry turned his new team around. He put together highlight reels for each player that featured their successes, so that they could identify within those clips what they were doing right and focus their efforts on strengthening it. He recognized that it was not obvious to natural athletes how they achieved high performance, and the same can be true for employees and inspection recipients.7
In addition, behavioral science studies have demonstrated that affirmation builds trust and rapport between the giver and the receiver[ba][bb][bc][bd][be][bf][bg].6 In the context of an evaluation, this added psychological security contributes to employees revealing more about their workplace, which can be an essential component of a thorough and successful inspection.27 Therefore, positive feedback encourages the dialogue needed for conveying adaptive prioritization and practical means of remedy, both of which are often requisite to solving critical safety issues.[bh]
Giving positive feedback also often affirms an individual’s sense of identity in terms of their meaningful contributions and personal efforts. Acknowledging those qualities, therefore, can amplify them. This connection to personal value can evoke the highest levels of excitement and enthusiasm in a worker, and, in turn, generate an eagerness to perform and fuel energy to take action.8
Looked at from another perspective, safety inspections can feel personal. Many lab workers spend a significant amount of time in the lab, and as a result, they may experience inspection reports of that setting as a reflection on their performance rather than strictly an objective statement of observable work. Consideration of this affective impact of inspection results is important since, when it comes to learning, attitudes can influence the cognitive process.29 Indeed, this consideration can be pivotal in transforming a teachable moment into an occasion of learning.4
To elaborate, positive feedback can influence the affective experience in a way that can shift recipients’ receptiveness to correction[bi][bj]. Notice of inspection deficiencies can trigger a sense of vulnerability, need, doubt, and/or uncertainty. At the same time, though, positive feedback can promote a sense of confidence and assurance that is valuable for active construction of new understanding. Therefore, positive feedback can encourage the transformation of the intense interest that results from correction into the contextualized comprehension necessary for successful follow-up to recorded deficiencies.
Overall, then, a balance of both positive and negative feedback is crucial to ensuring adherence to regulations and encouraging achievement.[bk]2, 12 Since positive emotion can influence how individuals respond to and take action from feedback, the way that feedback is formatted and delivered can determine whether or not it is effective.20 Hence, rooting an organization in the value of positivity can reap some noteworthy benefits including higher employee performance, increased creativity, and an eagerness in employees to engage.6
To gain these benefits, it is therefore necessary to expand from the checklist-alone approach.16 Principles of evaluation recommend a report that includes both judgmental and descriptive information. This provides recipients with the information they seek regarding how well they did and the success of their particular efforts.27 Putting the two together creates a more powerful tool for influence and for catalyzing transformation.
In this paper the authors propose an alternative way to format safety inspection information, in particular, to provide feedback that goes beyond the checklist-only approach. This proposal stems from their experience implementing a practice of documenting positive inspection findings within a large academic department. They would like to establish, based on educational and organizational management principles, that this practice played an essential role in the successful outcome of the inspection process in terms of correction of safety violations, extended compliance, and user satisfaction. [bl][bm][bn][bo]
[a]There is also a legal responsibility of the purchaser of a hazardous chemical (usually the institution, at a researcher’s request) to assure it is used in a safe and healthy manner for a wide variety of stakeholders
[b]I would add what topics will be covered by the inspection to this list. The inspection programs I was involved in/led had a lot of trouble deciding whether the inspections we conducted were for the benefit of the labs, the institution or the regulatory authorities. Each of these stakeholders had a different set of issues they wanted to have oversight over. And the EHS staff had limited time and expertise to address the issues adequately from each of those perspectives.
[c]This is interesting to think about. As LSTs have taken on peer-to-peer inspections, they have been using them as an educational tool for graduate students. I would imagine that, even with the same checklist, what would be emphasized by an LST team versus EHS would end up being quite a bit different as influenced by what each group considered to be the purpose of the inspection activity.
[d]Hmm, well maybe the issue is trying to cram the interests and perspectives of so many different stakeholders into a single, annual, time-limited event 😛
[e]I haven’t really thought critically about who the stakeholders are of an inspection and who they serve. I agree with your point, Jessica, that the EHS and LST inspections would be different and focus on different aspects. I feel inclined to ask my DEHS rep for his inspection checklist and compare it to ours.
[f]This is an aspirational goal for checklists, but it is not often well met. This is because laboratories vary so much in the way they use chemicals that a one-size-fits-all checklist does not work. This is another reason that the narrative approach suggested by this article is so appealing.
[g]We have drafted hazard specific walkthrough rubrics that help address that issue but a lot of effort went into drafting them and you need someone with expertise in that hazard area to properly address them.
[h]Well I do think that, even with the issues that you’ve described, checklists still work towards decreasing variability.
In essence, I think of checklists as directing the conversation and providing a list of things to make sure that you check for (if relevant). Without such a document, the free-form structure WOULD result in more variability and missed topics.
Which is really to say that, I think a hybrid approach would be the best!
[i]I agree that a hybrid approach is ideal, if staff resources permit. The challenge is that EHS staff are responding to a wide variety of lab science issues and have a hard time being confident that they are qualified to raise issues. Moving from major research institutions to a PUI, I finally feel like I have the time to provide support to not only raise concerns but help address them. At Keene State, we do modern science, but not exotic science.
[j]I feel like in the end, it always comes down to “we just don’t have the resources…”
Can we talk about that for a second? HOW DO WE GET MORE RESOURCES FOR EHS : /
[k]I feel like that would also be the case for a “descriptive comment” inspection if the time is limited. Is there a way that the descriptive method could improve efficiency while still covering all inspection items?
[l]This is something my university struggles with in terms of a checklist. There’s quite a bit of variability between inspections in our labs done by different inspectors. Some inspectors will catch things that others would overlook. Our labs are vastly different but we are all given the same checklist – Our checklist is also extensive and the language used is quite confusing.
[m]I have also seen labs with poor safety habits use this to their advantage as well. I’ve known some people who strategically put a small problem front and center so that they can be criticized for that. Then the inspector feels like they have “done their job” and they don’t go looking for more and miss the much bigger problem(s) just out of sight.
[n]^ That just feels so insidious. I think the idea of the inspector looking for just anything to jot down to show they were looking is not great (and something I’ve run into), but you want it to be because they CAN’T find anything big, not just to hide it.
[o]Amanda, our JST has had seminars to help prepare inspectors and show them what to look for. We also include descriptions of what each inspection score should look like to help improve reproducibility.
[p]We had experience with this issue when we were doing our LST’s Peer Lab Walkthrough! For this event, we have volunteer graduate students serve as judges to walk through volunteering labs to score a rubric. One of the biggest things we’ve had to overhaul over the years is how to train and prepare our volunteers.
So far, we’ve landed on providing training, practice sessions, and using a TEAM of inspectors per lab (rather than just one). These things have definitely made a huge difference, but it’s also really far from addressing the issue (and this is WITH a checklist)
[q]I’d really like to go back to peer-peer walkthroughs and implement some of these ideas. Unfortunately, I don’t think we are there yet in terms of the pandemic for our university and our grad students being okay with this. Brady, did you mean you train the EHS inspectors or for JST walkthroughs? I’ve given advice about seeing some labs that have gone through inspections but a lot of items that were red flags to me (and to the grad students who ended up not pointing it out) were missed / not recorded.
[r]This was really interesting to read, because when I was working with my student safety team to create and design a Peer Lab Walkthrough, this was something we tried hard to get around even though we didn’t name the issue directly.
We ended up making a rubric (which seems like a form of checklist) to guide the walkthrough and create some uniformity in responding, but we decided to make it be scored 1-5 with a score of *3* representing sufficiency. In this way, the rubric would both include negative AND positive feedback that would go towards their score in the competition.
[s]I think the other thing that is cool about the idea of a rubric, is there might be space for comments, but by already putting everything on a scale, it can give an indication that things are great/just fine without extensive writing!
[t]We also use a rubric but require a photo or description for scores above sufficient to further highlight exemplary safety.
[u]I very much saw this in my graduate school lab when EHS inspections were done. It was an odd experience for me because it made me feel like all of the things I was worried about were normal and not coming up on the radar for EHS inspections.
[v]I think this type of feedback would be really beneficial. From my experience (and hearing others’), we do not get any feedback on how to fix issues. You can ask for help to fix the issues, but sometimes the recommendations don’t align with the lab’s needs / why it is an issue in the first place.
[w]This was a major problem in my 1st research lab with the USDA. We had safety professionals who showed up once per year for our annual inspection (they were housed elsewhere). I was told that a setup we had wasn’t acceptable. I explained why we had things set up the way we did, then asked for advice on how to address the safety issues raised. The inspector literally shrugged his shoulders and gruffly said to me “that’s your problem to fix.” So – it didn’t get fixed. My PI had warned me about this attitude (so this wasn’t a one-off), but I was so sure that if we just had a reasonable, respectful conversation….
[x]I think this is a really key point. We had announced inspections and were aware of what was on the checklists. As an LSO, in the week leading up to the inspection we would get the lab in tip-top shape. Fortunately, we didn’t always have a ton of stuff to fix in that week, but it’s easy to imagine how reducing things to a yes/no checklist can mask long-term problems that are common in between inspections.
[y]My lab is similar to this. When I first joined and went through my first inspection – the week prior was awful trying to correct everything. I started implementing “clean-ups” every quarter because my lab would just go back to how it was after the inspection.
[aa]Our EHS was short-staffed and ended up doing “annual inspections” 3 years apart – once was before I joined my lab, and the next was ~1.5 years in. The pictures of the incorrect things found turned out to be identical to the inspection report from 3 years prior. Not just the same issues, but quite literally the same bottles/equipment.
[ab]Yeah, we had biannual lab cleanups that were all day events, and typically took place a couple months before our inspections; definitely helped us keep things clean.
One other big thing is that when we swapped to Subpart K waste handling (can’t keep waste longer than 12 months), we just started disposing of all waste every six months so that nothing could slip through the cracks before an inspection.
[ac]Jessica, you’re talking about me and I don’t like it XD
[ad]I came to feel that checklists were for things that could be reviewed when no one was in the room and that narrative comments would summarize conversations about inspectors’ observations. These are usually two very different sets of topics.
[ae]We considered doing inspections “off-hours” to focus on the checklist items, until we realized there was no such thing as off hours in most academic labs. Yale EHS found many more EHS problems during 3rd shift inspections than daytime inspections
[af]This also feels like it would be a great teachable moment for those newer to the lab environment. We often think that if no one said anything, I guess it is okay even if we do not understand why it is okay. Elucidating that something is being done right “and this is why” is really helpful to both teach and reinforce good habits.
[ag]I think that relates well to the football example of highlighting the good practices to ensure they are recognized by the player or researcher.
[ah]I noticed that not much was said about complacency, which I think can be both a consequence of an overwhelming amount of negative feedback and an issue in and of itself stemming from culture. And I think positive feedback and encouragement could combat both of these contributors to complacency!
[ai]Good point. It could also undermine the positive things that the lab was doing because they didn’t actually realize that the thing they were previously doing was actually positive. So complacency can lead to the loss of things that you were doing right before.
[aj]I have seen situations where labs were forced to abandon positive safety efforts because they were not aligned with institutional or legal expectations. Compliance risk was lowered, but physical risk increased.
[ak]The inclusion of both positive and negative comments shows that the inspector is being more thorough which to me would give it more credibility and lead to greater acceptance and retention of the feedback.
[al]I think they might be describing this phenomenon a few paragraphs down when they talk about trust between the giver and receiver.
[am]This is an interesting perspective. I have found that many PIs don’t show much respect for safety professionals when they deliver bad news. Perhaps the delivery of good news would help to reinforce the authority and knowledge base of the safety professionals – so that when they do have criticism to deliver, it will be taken more seriously.
[an]The ACS Pubs advice to reviewers of manuscripts is to describe the strengths of the paper before raising concerns. As a reviewer, I have found that approach very helpful because it makes me look for the good parts of the article before looking for flaws.
[ao]The request for corrective action should go along with a discussion with the PI of why change is needed and practical ways it might be accomplished.
[ap]I worked in a lab that had serious safety issues when I joined. Wondering if the positive-negative feedback balance could have made a difference in changing the attitudes of the students and the PI.
[aq]It works well with PIs with an open mind; but some PIs have a sad history around EHS that has burnt them out on trying to work with EHS staff. This is particularly true if the EHS staff has a lot of turnover.
[ar]Sandwich format: the negative (filling) between two positives (bread).
[at]I wonder how this can contribute to a negative environment about lab-EHS interactions? If there is only negative commentary (or the perception of that) flowing in one direction, would seem that it would have an impact on that relationship
[au]I see how providing positive feedback along with the negative feedback could help do away with the feeling the inspector is just out to get you. Instead, they are here to provide help and support.
[av]This makes me consider how many “problem” examples I use in the safety training curriculum.
[aw]This is a good point. From a learning perspective, I think it would be incredibly helpful to see examples of things being done right – especially when what is being shown is a challenge and is in support of research being conducted.
[ax]I third this so hard! It was also something that I would seek out a lot when I was new but had trouble finding resources on. I saw a lot of bad examples—in training videos, in the environments around me—but I was definitely starved for a GOOD example to mimic.
[ay]I read just part of the full paper. It includes examples of positive feedback and an email that was sent. The example helped me get the flavor of what the authors were doing.
[az]This makes a lot of sense to me – especially from a “new lab employee” perspective.
[ba]Referenced in my comment a few paragraphs above.
[bb]I suspect that another factor in trust is the amount of time spent in a common environment. Inspectors don’t have a lot of credibility if they only visit a place annually where a lab worker spends every day.
[bc]A lot of us do not know or recognize our inspectors by name or face (we’ve also had so much turnover in EHS personnel and don’t even have a CHO). I honestly would not be able to tell you who does inspections if not for being on our university’s LST. This has occurred in our lab (an issue of trust) during a review of our radiation safety inspection. The waste was not tested properly, so we were questioned on our rad waste. It was later found that the inspector unfortunately didn’t perform the correct test for the pH. My PI did not take it very well after having to correct them about this.
[bd]I have run into similar issues when inspectors and PIs do not take the time to have a conversation, but connect only by e-mail. Many campuses have inspection tracking apps which make this too easy a temptation for both sides
[be]Not only a conversation, but inspectors can be actively involved in improving conditions, e.g., supplying signs, PPE, similar SOPs…
[bf]Yes, sometimes it’s amazing how little money it can take to show a good-faith effort from EHS to support their recommendations. Other times, if it is a capital cost issue, EHS is helpless to assist even if there is an immediate concern.
[bg]I have found inspections where the EHS staff openly discuss issues and observations as they are making the inspection very useful. It gives me the chance to ask the million questions I have about safety in the lab.
[bh]We see this in our JST walkthroughs. We require photos or descriptions of above acceptable scores and have some open ended discussion questions at the end to highlight those exemplary safety protocols and to address areas that might have been lacking.
[bi]Through talking with other student reps, we usually never know what we are doing “right” during inspections. I think sharing positive feedback would help other labs that might have been marked on their inspection checklist for that same item, giving them a resource on how to correct issues.
[bj]I think this is an especially important point in an educational environment. Graduate students are there to learn many things, such as how to maintain a lab. It is weird to be treated as if we should already know – and get no feedback about what is being done right and why it is right. I often felt like I was just sort of guessing at things.
[bk]At UVM, we hosted an annual party where we reviewed the most successful lab inspections. This was reasonably well received, particularly by lab staff who were permanent and wanted to know how others in the department were doing on the safety front
[bl]This can be successful, UNTIL a regulatory inspection happens that finds significant legal concerns related primarily to paperwork issues. Then upper management is likely to say “enough of the nice guy approach – we need to stop the citations.” Been there, done that.
[bm]Fundamentally, I think there needs to be two different offices. One to protect the people, and one to protect the institution *huff huff*
[bn]When that occurs, there is a very large amount of friction between those offices. And the Provost ends up being the referee. That is why sometimes there is no follow up to an inspection report that sounds dire.
[bo]But I feel like there is ALREADY friction between these two issues. They’re just not as tangible and don’t get as much attention as they would if you have two offices directly conflicting.
These things WILL conflict sometimes, and I feel like we need a champion for both sides. It’s like having a union. Right now, the institution is the only one with a real hand in the game, so that perspective is the one that will always win out. It needs more balance.