Worked with other universities' EH&S departments; data sent directly to Imke's team; encouraging emails to complete the survey; stopped collecting data when N was representative of the campus; greatest response rate associated with asking researchers in-person to complete the survey
Speak to the relevance of the findings for the challenges of promoting responsibilities associated with dual use research of concern (DURC).
How to train researchers to think about this.
Imke has no focus on DURC, but they did look at attitudes
How accepting are researchers of safety policies?
Could be embedded with an ethical question.
The phrase “safety takes priority” was used here, whereas DuPont states “Safety and productivity are of equal value.”
There is a difference between the rules and the tools; we use the RAMP model to train researchers to think critically about their work; “push” information out based on need, but also provide resources so that there is something there when the researcher “pulls” for information
Should not put safety and productivity at odds; “safety with…” instead of “safety first, then…”; think of the value-add of treating safety and productivity together
Safety and productivity are the same problem framed differently
Reaching for accurate risk perception
Risk perceptions vary from person to person (it’s very personal)
We delude ourselves that there is one “best risk perception”
Who resolves productivity vs safety? Safety should take priority in cases where there is a conflict. This is our ultimate responsibility.
Imke mentioned how big an influence PIs have on research safety in the lab; when a student trains with a PI that maintains a strong safety culture, and then the student graduates and moves on to a place with a weaker safety culture, how does that experience translate?
If a PI emphasizes safety, the lab is much easier to work with for safety professionals; these PIs think about research safety as training their trainees to be safer and better researchers in the greater research community; the emphasis is on professional development, not just about being safe in this particular research lab; also, I am seeing this much more among younger PIs
Agrees with the correlation between PIs and lab safety
Does anyone know if any students trained in these “strong safety culture” labs have been followed into their careers to see how they do?
This would be a fantastic and also very difficult project! What we are seeing is that these “strong safety culture” students are going to companies well-known for their emphasis on strong safety culture
It is very difficult for individuals to sustain their safety culture belief system; in one example, a person maintained a safety standard at one institution, then when they moved to another institution, they abandoned the better safety practice even though they themselves said it was better. When asked why they abandoned it, they said that the safety practice was not required at the new institution, so they didn’t do it. So was this an example of a strong safety culture or an example of more compliance rules creating a safer environment?
Inspiring a proactive, open-ended way of thinking about safety is more difficult than getting someone to use a particular safety device – extra challenging. Two categories of motivation: norms and what is incentivized. Norms are important, but incentives tend to beat norms, as suggested by Imke’s survey results.
Dr. Mulcahy introduced Edgar Schein’s Levels of Culture discussion within the field of Organizational Culture – from which the concept of Safety Culture was initially derived. Schein has a large body of work out there if you are interested in doing a deep dive. As an initial introduction to Schein and his work, I recommend you watch this video of an excerpt of an interview with him from 2016 in which he touches upon the use, and misuse, of the term “culture” and what we should actually be measuring:
The term “Safety Culture” gets thrown around a lot – oftentimes to punish.
Edgar Schein’s levels of culture
Texas Tech had artifacts, but not much of the stuff below the surface
We don’t have a “safety culture” or a “business culture” – we just have A CULTURE. What are we doing? Why are we doing that?
When seeing someone at work looking off into the distance what question do you ask? “What is that person thinking about?” versus “Why is that person wasting time?” How you frame that question says something about the culture in which you reside.
Ask the workers why they are or are not doing something. They may have a really good reason for the “infraction.”
Mining the Diamond: not everything at the bottom tier will lead to a fatality, but is still important to examine.
Many scientists push back on safety advice saying “That’s just not how scientists work” – but sometimes the question is “is the way the scientist is doing it the right way?”
General Conversation
I got into safety because as a researcher I saw these conflicts become an “us versus them” and it stops things. Now on the safety side, I realize that it is important to be vulnerable.
Somebody needs to be the first to lower the barrier and put themselves out there.
I thought that because I had a PhD and I was a researcher that the scientists were going to be more open with me. This was not the case at all.
We want to be heard and understood more than we want to be agreed with.
Diversity of expertise is important to the safety team.
Difficult to get at things when their mission is compliance.
Every time you are dealing with someone on an issue, it is not just that issue – you are also dealing with their past 30 years of experiences they have had with similar issues.
It is important to realize that mistakes aren’t intentional.
How do we embed decision-making opportunities into the flow of a researcher’s research?
Language is important: instead of “this is a problem”, say “this is a learning opportunity” or “look I see an opportunity here.”
On Wednesday, October 28th, Sammye Sigmann of Appalachian State University and Debbie Decker (UC Davis, retired) hosted a CHAS Chat discussion on Chemical Safety: Demonstrations and Outreach Activities.
At this time of year, many groups like to engage in outreach activities. While there are likely to be fewer outreach activities this year, we thought that an informal discussion on demonstration and outreach safety might still be useful for October, when so many of these events take place for Halloween.
Among other topics, they discussed developing risk analyses documentation for a popular demonstration, the Elephant’s Toothpaste.
The Appalachian State JHA for Elephant’s Toothpaste can be downloaded here.
Samuella B. Sigmann, MS, NRCC-CHO; Immediate Past Chair, ACS Division of Chemical Health & Safety, 2020; Senior Lecturer/Safety Committee Chair/Director of Stockroom, Chemistry, Appalachian State University
Led by: Prof. Mahesh K. Mahanthappa, Department of Chemical Engineering & Materials Science, University of Minnesota, Minneapolis, MN; 2020 Laboratory Safety Institute Graduate Research Faculty Safety Award Winner
After Jessica A. Martin reviewed the document “Exploring Definitions of Safety Culture” (see link below), Journal Club participants were asked to spend 5 minutes typing into the chat questions that came to mind when considering these definitions, as well as considering the list of our upcoming discussion leaders. The questions shared are below.
Regardless of your definition, how do you measure “safety culture” with the goal of improving?
Given that incident rates are relatively low and incidents themselves are typically not as dangerous in academia, what would a more positive safety culture look like in academia?
What do we define as the “problems” in academia that make us worry about the status of our Safety Culture?
Do all of the actors identified within an academic safety culture identify the same problems? (i.e. do we all actually agree on what the problems are?)
If culture is a combination of what we do (behaviors) and what we think (beliefs), do we only/mostly care about behaviors in safety culture, since it’s our actions that determine our outcomes (harmed or not)?
In a safety survey, we can identify safety behaviors and awareness on a scale and provide actions. How do we quantify and change negative safety attitudes? What advice can we provide?
Does safety culture really reduce incidents in the research setting? Where is the proof? Is it just an excuse to put every aspect of safety under one umbrella?
To what extent do we need to measure it if we can adequately describe a group’s safety culture from inside and outside observations?
Defining Safety Culture
How do you empower individuals (students, faculty, staff, etc.) to take personal responsibility for safety, while making sure adequate training is provided and demonstrated (best laboratory practices) to others in the lab?
How has the COVID pandemic changed the safety culture of the USA? Have those changes been reflected in your organization?
What other types of culture do we measure in an attempt to change the culture?
What are the boundaries of an organizational culture? Are these the same boundaries as the safety culture of the organization?
How many people does it take to have a culture?
What are other concepts which have undergone a period of disagreement and then been resolved? How did they do that?
How (and to what extent) can organizational culture and institutional management hierarchies positively influence academic laboratory safety culture?
What are the other parent fields and what should we be drawing from them as the chemical health and safety field develops (ex. organizational psychology)?
The questions sent out to everyone to contemplate:
1) What safety-related incident have you experienced that taught you the most about how you approach safety?
IPA + dry ice container exploded – thought it was at room temperature when person put the lid on it, but it wasn’t.
Lesson: Safety is not as straight-forward as you think it is.
Working with a post-doc on LiH reaction. Post-doc told me to quench the reaction with water, so I dumped 100 mL right in and it exploded. Green goo goes everywhere and I was covered in it too. Noticed the goo was cold.
Lesson: First big lesson in explicit communication – we clearly meant 2 different things by “quench.”
Went to undergrad college with no safety personnel. As students, we were isolating DNA using phenol:chloroform extraction. A fellow student dropped a bottle of phenol. The bottle broke and splashed all over her. She went under the shower and technically we responded correctly. However, looking back we did not appreciate at all the seriousness of this incident or how dangerous it was.
Lesson: Educating on safety hazards is just as important as educating on the chemistry.
How management of change is not managed; lots of small incidents in developing SOPs and any processes.
Lesson: As safety professionals, we think about the safety of processes and why we do the things we do, but we don’t necessarily communicate it (or communicate it well).
Sustained an injury with a thin needle that took a core of my skin out of my thumb. Was sent to administrators to deal with paperwork and was informed that because of the particular situation, I actually was able to file a worker’s comp claim, however, this would not have necessarily been true depending on where I was working on campus.
Lesson: Who you are and where you are working determines workers’ comp status!
Within 30 days of starting the job, a mislabeled bottle of biologicals had everybody in a panic; I had not had HAZWOPER training yet; turned out to be spirulina – someone had labeled it “eco” and they thought that it was E. coli; I did not act right away and this was a mistake because it exploded into a nightmare of infighting among a bunch of the faculty and staff over what this stuff was and how it got there.
Lesson: Learned to be as proactive as I can be immediately following a situation – communication is such a big issue.
In an undergrad class, used chlorosulfonic acid for an experiment – the instructor dispensed it; everyone was double-gloved and in lab coats; the instructors thought they had accounted for all hazards; however, they did not say that all of your equipment had to be secure before obtaining your aliquot of the acid; someone’s condenser hoses came out and sprayed water around the hood with the acid sitting in there; somehow it managed to miss the acid! Scary near miss. Back to communication!
Lesson: Even if you think you have covered all of the safety precautions, unexpected things can still happen. It is important to double-check things and communicate effectively.
Became safety officer at an institution when no one really knew what it was; staff member talking to x-ray crystallographer; fumes coming into the hallway; I noticed but I thought maybe it was okay because no one else was reacting; I didn’t feel confident in my position initially so I asked a bunch of questions about the situation and learned a TON about ventilation and the history of the situation; then had to learn to interact with facilities.
Lesson: Fully understanding a situation leads to a much more thorough resolution of a chronic safety issue than “name-and-blame” tactics. Also, realized how many different parts were contributing to one unsafe situation.
Used DMF right at edge of fume hood – after ½ hour decided to stop doing this; later in the day when outdoors, I suddenly couldn’t breathe and fell on the ground; figured out later that this was a common effect of DMF exposure.
Lesson: Learn the hazards of what you are working with. Also understand your protective equipment; the fume hood was being improperly used because it was overcrowded. Dangerous exposures can happen so easily when you don’t understand what you are handling and how to protect yourself from them.
Popular science magazines as kids (11 years old) – would do the experiments; tried to prepare copper nitrate; got copper and nitric acid from a small shop; the mixture produced brown gases that were not mentioned in the procedure. We ran away from it until the gases cleared. We learned to do our experiments outside from that point forward. I had many such incidents growing up and going through my own education.
Lesson: I learned that I could survive the accidents; back then, this was thought of as simply the professional life of a chemist. The UCLA incident really changed how I thought about safety. In the last 10 years, we think more about how to prevent exposure. High levels of exposure are no longer thought of simply as “what it is to be a chemist.”
2) If you had an unlimited budget & unlimited authority, what change impacting laboratory safety would you make to your department/university? (Something reasonably realistic, but beyond what you can do now given $/authority)
More 1-on-1 training with people; you learn by doing; our university hires professors the same month we expect them to start teaching – wish that the onboarding process was longer and more thorough.
Focus on training the PI and changing the culture; graduate students get signals from the PI; safety as part of evaluation process for your career; most PIs don’t know what RAMP stands for!
In Michigan, offer “driving in the snow” classes; teach how to skid correctly, etc. This same idea could be applied to safety training.
Many years ago, I developed a “spill response” training with actual chemical spills: spills of hexane, 50% sodium hydroxide (4 L or 1 gallon), and 98% sulfuric acid; the trainings worked really well. Then several things happened, all having to do with liability, that curtailed this hands-on training; the county department of health walked in when we were down to doing only 100-mL spills – we were told that we couldn’t do this anymore for liability reasons and that we would be punished if we continued. This ended a great training program.
I would like to see some sort of Netflix-style series on chemistry and safety (make it badass); create a space in which PIs can just be PIs so they can focus on training in the lab.
We set up a presentation to actually show the researchers where their waste goes and why it was important to separate and label properly. This seemed to be well-received and was effective in getting better compliance from researchers on how they handled their waste.
We created a presentation in which researchers followed their waste stream; waste was burned right next to one of the poorest neighborhoods in our area; we used this to drive home the importance of minimizing waste production in labs as much as possible.
So much of the responsibility for safety spending falls on individual labs. Labs have very uneven access to money to spend on safety; lab groups literally are impacted healthwise and science-wise by this inequity.
Design a hazard review certification course so that graduate researchers can actually acquire a separate certification for this knowledge.
If develop a hazard review certification course, try to recruit area chemical companies to get involved in the design, and even delivery, of the education. This could help in getting graduate students to see that this sort of knowledge and skill is valued by employers.
UCSF: PI training course; only consider very new faculty members; bias towards the UC system
3) Given that you don’t have unlimited budget & authority, what have you seen to be the most successful safety culture tool in your area?
We did not get to this question in the discussion (although a few things were mentioned in the question above).
What specific aspects of your organizational safety climate are you trying to improve right now?
Publishers enforcing safety information sections in research publications
Uncovering how work is actually being performed in academic research laboratories – it seems too much is either “surprising to find out” or unknown by those who should know since they are the ones designing trainings and policies to address safety matters
Academic researchers being more willing to communicate to safety professionals what the problems are; want the opportunity to address the problem so that this encourages others to share problems
Want to get researchers to be more systematic in use of SOPs – use them and make sure the content is relevant. For risk assessment training, I want to make this useful – not so general that it is useless, but not so detailed that it is overwhelming
Ventilation is a serious problem in our building. An engineering study was done in 2016, but with change in administration of the last few years, fixing this issue has dropped on the priority list – even though students are reporting headaches and nausea when working on the top floor of the building.
Not addressing major safety issues undermines the trust that students have in safety culture and education efforts: if you don’t care enough about safety to fix these problems, then why should I be mindful of my safety practices? (It sets a bad example.)
Dealing with COVID; discovering the resilience of a smaller institution; it is a different thing to deal with because we are considering “community protective equipment”
Due to COVID, we are aiming to create videos to introduce 1st years to the research labs; hope to incorporate discussions about research safety in the labs into these videos – could even be useful post-COVID
In the long term, ask one’s self every day: Have I impacted anyone’s life positively (or negatively)?
Compassion Fatigue: often referred to as “burnout” although this term is controversial.
Hot topic in veterinary sciences and pre-med programs
It is possible that many PIs actually suffer from compassion fatigue which may be why so many come off as “uncaring.”
For graduate students and others working in academia, if I took your work away, would you be the same person?
The Younger Chemists Committee is supporting some programs addressing mental health for graduate students.
Recently, we have identified some “strange crimes” in academic labs that appear to be a result of the boredom of isolation due to COVID. This isolation may be reducing accountability resulting in students engaging in more risk-taking behavior than usual.
Also, many are finding that students who normally function very well are now just not doing their work at all in the lockdown.
Efforts are being made at some schools to reach out to students in order to communicate with them and let them know that someone is thinking about them.
I attended a very small PUI for undergrad – no mental health services, no security. A student wigged out and started yelling things that included talk of wanting to burn down a building. A professor who knew the student, tackled him to the ground and then brought him into his office. The student calmed down & they talked about stuff including ways for him to destress & access mental health services if needed. I thought this event was a glaring example of why we need dedicated mental health services on campus for the unusual student population of the place. However, the conversations among students over the next several weeks were about the lack of security on campus and expressed support for open-carry laws.
There has also been a long debate in some areas of the US about whether or not guns should be allowed to be carried by citizens on campuses.
Is knowing why a safety rule exists important to compliance?
Knowing why a rule exists not only convinces me that following the rule is important but also makes it easier to remember the rule. Example: labelling secondary containers versus labelling waste containers and why these are different.
You can set aside a rule to get a job done faster, but this can backfire if things go wrong. Now you have to figure out why things are going wrong.
In the industrial environment discussed in the paper, it is understandable that workers would be reluctant to discuss safety issues because their job might be on the line. With students, they find it easier to discuss these issues with one another.
I’ve seen students hide broken glassware in educational labs because of a potential $25 fee; I’ve had graduate students tell me that they are afraid to reveal incidents and Near Misses because they are afraid they will “get in trouble” even if they can’t really define who they would be in trouble with or what kind of trouble it would be.
Culture is important to safety conversations and conversations about incidents being acceptable and accepted. People who are “highly reactive” when things do not go as planned can deteriorate a culture; instead incidents should be used as opportunities to be proactive.
I’ve worked hard in my department to build the safety culture; I have worked to build a space that influences graduate students’ behavior – then graduate students have influenced PIs.
How do we study what we are actually talking about?
The ethnographic studies undertaken by Silbey and Huising (see last week’s discussion for Huising’s paper) at MIT are an interesting place to start; they help illuminate the culture, the power centers, and the conflicting work dynamics that shape an academic safety culture.
Asking “safety recalcitrant” PIs about why they are so can be a revelatory experience. Some just won’t talk to you at all, but those who do talk end up describing bad experiences they have had with safety professionals; back in the 1970s and 1980s, the expectations changed fast. All of a sudden there was all of this “safety stuff” just shoved down people’s throats without any critical discussion. This left a bad taste for lots of people. It does seem like the older PIs are the ones who are less willing to engage. The younger PIs seem to be much more open to the conversations.
PIs can be seen as “the workers” or “the operatives” in that sense. They need to be included in the discussion as the frontline just as much as the graduate students.
This paper seems to formalize and put into academic terms what safety professionals have learned about behavioral safety over the years as practitioners.