Tag Archives: journal club

Perspectives on Safety Culture: Safety Journal Club Discussion, OCT 27, 2020

Led by:
Dr. Mary Beth Mulcahy
Editor, ACS Chemical Health & Safety

[office_doc id=4996]

The papers she shared:

Dr. Mulcahy introduced Edgar Schein’s Levels of Culture discussion within the field of Organizational Culture – from which the concept of Safety Culture was initially derived. Schein has a large body of work out there if you are interested in doing a deep dive. As an initial introduction to Schein and his work, I recommend you watch this video of an excerpt of an interview with him from 2016 in which he touches upon the use, and misuse, of the term “culture” and what we should actually be measuring:


Dr. Mary Beth Mulcahy’s introductory talk

  • The term “Safety Culture” gets thrown around a lot – oftentimes to punish.
  • Edgar Schein’s levels of culture
  • Texas Tech had artifacts, but not much of the stuff below the surface
  • We don’t have a “safety culture” or a “business culture” – we just have A CULTURE. What are we doing? Why are we doing that?
  • When seeing someone at work looking off into the distance what question do you ask? “What is that person thinking about?” versus “Why is that person wasting time?” How you frame that question says something about the culture in which you reside.
  • Ask the workers why they are or are not doing something. They may have a really good reason for the “infraction.”
  • Mining the Diamond: not everything at the bottom tier will lead to a fatality, but it is still important to examine.
  • Many scientists push back on safety advice saying “That’s just not how scientists work” – but sometimes the question is “is the way the scientist is doing it the right way?”

General Conversation

  • I got into safety because, as a researcher, I saw these conflicts become an “us versus them” dynamic, and that stops things. Now on the safety side, I realize that it is important to be vulnerable.
  • Somebody needs to be the first to lower the barrier and put ourselves out there.
  • I thought that because I had a PhD and I was a researcher that the scientists were going to be more open with me. This was not the case at all.
  • We want to be heard and understood more than we want to be agreed with.
  • Diversity of expertise is important to the safety team.
  • Difficult to get at things when their mission is compliance.
  • Every time you are dealing with someone on an issue, it is not just that issue – you are also dealing with their past 30 years of experiences they have had with similar issues.
  • It is important to realize that mistakes aren’t intentional.
  • How do we embed decision-making opportunities into the flow of a researcher’s research?
  • Language is important: instead of “this is a problem”, say “this is a learning opportunity” or “look I see an opportunity here.”

By 3:12 PM, we had 26 participants

Making Safety Second Nature in an Academic Lab: Safety Journal Club Discussion, OCT 20, 2020

Led by:
Prof. Mahesh K. Mahanthappa
Department of Chemical Engineering & Materials Science
University of Minnesota, Minneapolis MN
2020 Laboratory Safety Institute Graduate Research Faculty Safety Award Winner

Resources Dr. Mahanthappa highlighted in his talk:

Exploring Definitions of Safety Culture: Safety Journal Club Discussion, OCT 13, 2020

Led by:

Jessica Martin


After Jessica A. Martin reviewed the document “Exploring Definitions of Safety Culture” (see link below), Journal Club participants were asked to spend 5 minutes typing into the chat questions that came to mind when considering these definitions, as well as the list of our upcoming discussion leaders. The questions shared are below.

Measuring Safety Culture

  • Regardless of your definition, how do you measure “safety culture” with the goal of improving?
  • Given that incident rates are relatively low and incidents themselves are typically not as dangerous in academia, what would a more positive safety culture look like in academia?
  • What do we define as the “problems” in academia that make us worry about the status of our Safety Culture?
  • Do all of the actors identified within an academic safety culture identify the same problems? (i.e. do we all actually agree on what the problems are?)
  • If culture is a combination of what we do/behaviors and what we think/believe, for safety culture do we only/mostly care about what people do/behaviors since it’s our actions that impact our outcomes (harmed or not)?
  • In a safety survey, we can identify safety behaviors and awareness on a scale and provide actions. How do we quantify and change negative safety attitudes? What advice can we provide?
  • Does safety culture really reduce incidents in the research setting? Where is the proof? Is it just an excuse to put every aspect of safety under one umbrella?
  • To what extent do we need to measure it if we can adequately describe a group’s safety culture from inside and outside observations?

Defining Safety Culture

  • How do you empower individuals (students, faculty, staff, etc.) to take personal responsibility for safety, while making sure adequate training is provided and demonstrated (best laboratory practices) to others in the lab?
  • How has the COVID epidemic changed the safety culture of the USA? Have those changes been reflected in your organization?
  • What other types of culture do we measure in an attempt to change the culture?
  • What are the boundaries of an organizational culture? Are these the same boundaries as the safety culture of the organization?
  • How many people does it take to have a culture?
  • What are other concepts which have undergone a period of disagreement and then been resolved? How did they do that?
  • How (and to what extent) can organizational culture and institutional management hierarchies positively influence academic laboratory safety culture?
  • What are the other parent fields and what should we be drawing from them as the chemical health and safety field develops (ex. organizational psychology)?

Dreaming Big & Learning Well: Safety Journal Club Discussion, OCT 6, 2020

Led by:

Jessica Martin


The questions sent out to everyone to contemplate:

1) What safety-related incident have you experienced that taught you the most about how you approach safety?

  • IPA + dry ice container exploded – thought it was at room temperature when person put the lid on it, but it wasn’t.
  • Lesson: Safety is not as straight-forward as you think it is.
  • Working with a post-doc on a LiH reaction. The post-doc told me to quench the reaction with water, so I dumped 100 mL right in and it exploded. Green goo went everywhere and I was covered in it too. I noticed the goo was cold.
  • Lesson: First big lesson in explicit communication – we clearly meant 2 different things by “quench.”
  • Went to undergrad college with no safety personnel. As students, we were isolating DNA using phenol:chloroform extraction. A fellow student dropped a bottle of phenol. The bottle broke and splashed all over her. She went under the shower and technically we responded correctly. However, looking back we did not appreciate at all the seriousness of this incident or how dangerous it was.
  • Lesson: Educating on safety hazards is just as important as educating on the chemistry.
  • How management of change is not managed; lots of small incidents in developing SOPs and any processes.
  • Lesson: As safety professionals, we think about the safety of processes and why we do the things we do, but we don’t necessarily communicate it (or communicate it well).
  • Sustained an injury with a thin needle that took a core of my skin out of my thumb. Was sent to administrators to deal with paperwork and was informed that because of the particular situation, I actually was able to file a worker’s comp claim, however, this would not have necessarily been true depending on where I was working on campus.
  • Lesson: Who you are and where you are working determines workers’ comp status!
  • Within 30 days of starting my job, a mislabeled bottle of biologicals had everybody in a panic; I had not had HAZWOPER training yet; it turned out to be spirulina – someone had labeled it “eco” and they thought it was E. coli; I did not act right away and this was a mistake because it exploded into a nightmare of infighting among a bunch of the faculty and staff over what this stuff was and how it got there.
  • Lesson: learned to be proactive as I can be immediately following a situation – communication is such a big issue.
  • In undergrad, used chlorosulfonic acid for an experiment in undergrad class – instructor dispensed it, everyone was double-gloved, in lab coats; instructors thought they had accounted for all hazards; however, they did not say that all of your equipment had to be secure before obtaining your aliquot of the acid; someone’s condenser hoses came out and sprayed water around the hood with the acid sitting in there; somehow it managed to miss the acid! Scary near miss. Back to communication!
  • Lesson: Even if you think you have covered all of the safety precautions, unexpected things can still happen. It is important to double-check things and communicate effectively.
  • Became safety officer at an institution when no one really knew what it was; staff member talking to x-ray crystallographer; fumes coming into the hallway; I noticed but I thought maybe it was okay because no one else was reacting; I didn’t feel confident in my position initially so I asked a bunch of questions about the situation and learned a TON about ventilation and the history of the situation; then had to learn to interact with facilities.
  • Lesson: Fully understanding a situation leads to a much more thorough resolution of a chronic safety issue than “name-and-blame” tactics. Also, realized how many different parts were contributing to one unsafe situation.
  • Used DMF right at edge of fume hood – after ½ hour decided to stop doing this; later in the day when outdoors, I suddenly couldn’t breathe and fell on the ground; figured out later that this was a common effect of DMF exposure.
  • Lesson: Learn the hazards of what you are working with. Also understand your protective equipment; the fume hood was being improperly used because it was overcrowded. Dangerous exposures can happen so easily when you don’t understand what you are handling and how to protect yourself from them.
  • Popular science magazines as kids (11 years old) – would do the experiments; tried to prepare copper nitrate; got copper and nitric acid from a small shop; the mixture produced brown gases that were not mentioned in the procedure. We ran away from it until the gases cleared. We learned to do our experiments outside from that point forward. I had many such incidents growing up and going through my own education.
  • Lesson: I learned that I could survive the accidents; back then, you learned that this was simply the professional life of a chemist. The UCLA incident really changed how I thought about safety. In the last 10 years, we think more about how to prevent exposure. High levels of exposure are no longer thought of simply as “what it is to be a chemist.”

2) If you had an unlimited budget & unlimited authority, what change impacting laboratory safety would you make to your department/university? (Something reasonably realistic, but beyond what you can do now given $/authority)

  • More 1-on-1 training with people; you learn by doing; our university hires professors the same month we expect them to start teaching – wish that the onboarding process was longer and more thorough.
  • Focus on training the PI and changing the culture; graduate students get signals from the PI; safety as part of evaluation process for your career; most PIs don’t know what RAMP stands for!
  • In Michigan, offer “driving in the snow” classes; teach how to skid correctly, etc. This same idea could be applied to safety training.
  • Many years ago, I developed a “spill response” training with actual chemical spills; spill of hexane, 50% sodium hydroxide (4 L or 1 gallon), 98% sulfuric acid; the trainings worked really well; then several things happened that curtailed this hands-on training, all having to do with liability. The county dept of health walked in when we were down to doing only 100-mL spills for this training – we were told that we couldn’t do this anymore due to liability reasons and that we would be punished if we continued. This ended a great training program.
  • I would like to see some sort of Netflix-style series on chemistry and safety (make it badass); create a space in which PIs can just be PIs so they can focus on training in the lab.
  • We set up a presentation to actually show the researchers where their waste goes and why it was important to separate and label properly. This seemed to be well-received and was effective in getting better compliance from researchers on how they handled their waste.
  • We created a presentation in which researchers followed their waste stream; waste was burned right next to one of the poorest neighborhoods in our area; we used this to drive home the importance of minimizing waste production in labs as much as possible.
  • So much of the responsibility for safety spending falls on individual labs. Labs have very uneven access to money to spend on safety; lab groups literally are impacted healthwise and science-wise by this inequity.
  • Design a hazard review certification course so that graduate researchers can actually acquire a separate certification for this knowledge.
  • If develop a hazard review certification course, try to recruit area chemical companies to get involved in the design, and even delivery, of the education. This could help in getting graduate students to see that this sort of knowledge and skill is valued by employers.
  • UCSF: PI training course; only consider very new faculty members; bias towards the UC system

3) Given that you don’t have unlimited budget & authority, what have you seen to be the most successful safety culture tool in your area?

We did not get to this question in the discussion (although a few things were mentioned in the question above).

Improving safety culture through the health and safety organization: A case study: Safety Journal Club Discussion, Sept 29, 2020

Led by:

Kali A. Miller

[office_doc id=4919]


Nielsen, K. J. Improving safety culture through the health and safety organization: A case study. Journal of Safety Research 2014, 48, 7–17. https://www.sciencedirect.com/science/article/pii/S0022437513001552?via%3Dihub

Rae, D.; Provan, D. How do you know if your safety team is a positive influence on your safety climate? Safety of Work: Ep. 3, December 1, 2019. https://safetyofwork.com/episodes/how-do-you-know-if-your-safety-team-is-a-positive-influence-on-your-safety-climate


What specific aspects of your organizational safety climate are you trying to improve right now?

  • Publishers enforcing safety information sections in research publications
  • Uncovering how work is actually being performed in academic research laboratories – it seems too much is either “surprising to find out” or unknown by those who should know since they are the ones designing trainings and policies to address safety matters
  • Academic researchers being more willing to communicate to safety professionals what the problems are; want the opportunity to address the problem so that this encourages others to share problems
  • Want to get researchers to be more systematic in use of SOPs – use them and make sure the content is relevant. For risk assessment training, I want to make this useful – not so general that it is useless, but not so detailed that it is overwhelming
  • Ventilation is a serious problem in our building. An engineering study was done in 2016, but with change in administration of the last few years, fixing this issue has dropped on the priority list – even though students are reporting headaches and nausea when working on the top floor of the building.
  • Not addressing major safety issues undermines trust that students have in safety culture and education efforts: if you don’t care enough about safety to fix these problems, then why should I be mindful of my safety practices? (a bad example to set)
  • Dealing with COVID; discovering the resilience of a smaller institution; it is a different thing to deal with because we are considering “community protective equipment”
  • Due to COVID, we are aiming to create videos to introduce 1st years to the research labs; hope to incorporate discussions about research safety in the labs into these videos – could even be useful post-COVID
  • In the long term, ask one’s self every day: Have I impacted anyone’s life positively (or negatively)?

Compassion Fatigue: Safety Journal Club Discussion, Sept 22, 2020

Led by:

Anthony Appleton
Anthony.Appleton@colostate.edu
Research Safety Culture Coordinator, Colorado State University

[office_doc id=4907]


Compassion Fatigue in Animal Research Webinar by Marian Esvelt, DVM , University of Michigan:


Higher Ed Jobs Article “Overcoming Burnout and Compassion Fatigue in Higher Education” by Justin Zackal:


Journal Article: The prevalence and effect of burnout on graduate healthcare students



  • Compassion Fatigue: often referred to as “burnout” although this term is controversial.
  • Hot topic in veterinary sciences and pre-med programs
  • It is possible that many PIs actually suffer from compassion fatigue which may be why so many come off as “uncaring.”
  • For graduate students and others working in academia, if I took your work away, would you be the same person?
  • The Younger Chemists Committee is supporting some programs addressing mental health for graduate students.
  • Recently, we have identified some “strange crimes” in academic labs that appear to be a result of the boredom of isolation due to COVID. This isolation may be reducing accountability resulting in students engaging in more risk-taking behavior than usual.
  • Also, many are finding that students who normally function very well are now just not doing their work at all in the lockdown.
  • Efforts are being made at some schools to reach out to students in order to communicate with them and let them know that someone is thinking about them.
  • I attended a very small PUI for undergrad – no mental health services, no security. A student wigged out and started yelling things that included talk of wanting to burn down a building. A professor who knew the student tackled him to the ground and then brought him into his office. The student calmed down & they talked about stuff including ways for him to destress & access mental health services if needed. I thought this event was a glaring example of why we need dedicated mental health services on campus for the unusual student population of the place. However, the conversations among students over the next several weeks were about the lack of security on campus and expressed support for open-carry laws.
  • There has also been a long debate in some areas of the US about whether or not guns should be allowed to be carried by citizens on campuses.

Catching them at it: An ethnography of rule violation: Discussion, September 15, 2020

Paper: Iszatt-White, Marian. Catching them at it: An ethnography of rule violation (2007)


Presenter: Sarah Zinn


Is knowing why a safety rule exists important to compliance?

  • Knowing why a rule exists not only convinces me that following the rule is important but also makes it easier to remember the rule. Example: labelling secondary containers versus labelling waste containers and why these are different.
  • You can set aside a rule to get a job done faster, but this can backfire if things go wrong. Now you have to figure out why things are going wrong.
  • In the industrial environment discussed in the paper, it is understandable that workers would be reluctant to discuss safety issues because their job might be on the line. With students, they find it easier to discuss these issues with one another.
  • I’ve seen students hide broken glassware in educational labs because of a potential $25 fee; I’ve had graduate students tell me that they are afraid to reveal incidents and Near Misses because they are afraid they will “get in trouble” even if they can’t really define who they would be in trouble with or what kind of trouble it would be.
  • Culture is important to safety conversations and conversations about incidents being acceptable and accepted. People who are “highly reactive” when things do not go as planned can deteriorate a culture; instead incidents should be used as opportunities to be proactive.
  • I’ve worked hard in my department to build the safety culture; I have worked to build a space that influences graduate students’ behavior – then graduate students have influenced PIs.
  • How do we study what we are actually talking about?
  • The ethnographic studies undertaken by Silbey and Huising (see last week’s discussion for Huising’s paper) at MIT are an interesting place to start; they help illuminate the culture, the power centers, and the conflicting work dynamics that shape an academic safety culture.
  • Asking “safety recalcitrant” PIs about why they are so can be a revelatory experience. Some just won’t talk to you at all, but those who do talk end up describing bad experiences they have had with safety professionals; back in the 1970s and 1980s, the expectations changed fast. All of a sudden there was all of this “safety stuff” just shoved down people’s throats without any critical discussion. This left a bad taste for lots of people. It does seem that it is the older PIs who are less willing to engage. The younger PIs seem to be much more open to the conversations.
  • PIs can be seen as “the workers” or “the operatives” in that sense. They need to be included in the discussion as the frontline just as much as the graduate students.
  • This paper seems to formalize and put into academic terms what safety professionals have learned about behavioral safety over the years as practitioners.

Scut Work and Safety Roles in the Lab: Safety Journal Club, 9/8/2020

Topic/Paper: Scut Work and Safety Roles in the Lab

Do you have an idea for future papers? Contact Jessica Martin jessica.a.martin@uconn.edu If you would like to join us for the Journal Club on Tuesdays at 3 PM EST, please sign up here: https://docs.google.com/forms/d/1sIlFpKHNNxTtXaQ6rEmOvvvZRW1rNLZ8lqfZfJkoACU/edit

The Science of Safety Journal Club met on 9/8/2020 to discuss the paper presented by Ralph Stuart called “To Hive or to Hold? Producing Professional Authority through Scut Work” which came out in the journal Administrative Science Quarterly in 2015.

Presenter: Ralph Stuart

After Ralph Stuart’s presentation (see .pdf), the participants were broken into 2 groups of 7 participants each to discuss.

Group Discussion (Breakout Room 1)

·       EH&S – How can you separate work in this manner? It is hard to imagine that you would not do all four activities in order to offer good customer service, and hard to distinguish between lab research groups.

·       Grad school – students do more of the “dirty work” than PIs

·       Some can be a “main hub” for what work is/can be done

·       EH&S groups struggle when they take a regulatory approach, compared with when they take a hands-on and cooperative approach

·       Two ways into the scut work issue (from the EH&S view):

o   Here is what the reg says and here is what you need to get it done.

o   Hey folks, we have to solve a problem. Here are my ideas, what are your ideas and how can we work together to get it done?

§  Second approach would be more productive.

·       Huising paper: Good in theory, but not in reality.

o   As a lab manager, you want to do, and be able to do, a lot of the work; valuable actions are completing tasks (scut could be task-based)

o   Labels useful for academic discussion, useless when it comes to getting practical things done

Group Discussion (Breakout Room 2)

·       The conclusion of the paper is not at all surprising. One of the problems I’ve seen is that there isn’t enough interface between graduate researchers (the frontline) and safety professionals at my institution. The safety professionals don’t really know what is going on in our labs – and we don’t know enough about what the safety professionals do or can offer to even know what to ask. LSTs can work to bring these two populations together to interface more and bridge this gap.

·       A game of spotting things wrong with a hood set-up was incorporated into a safety training by an LST. Researchers had difficulty finding at least 10 things wrong (in a context where a lot of things were wrong and it shouldn’t have been hard). The safety professionals were also able to see, live, how researchers see and think about these problems; one safety professional expressed surprise at the conversations they were eavesdropping on and saw a lot of utility in learning this.

·       I’ve had experience as a lab technician then on the other side as a lab safety professional and I have seen some of the same issues.

·       I’m at an undergrad institution, so the mix of issues is a little different. We have no overarching EHS department. The further you go up the chain of command, the less knowledgeable/useful people are in terms of laboratory safety.

·       During my time as EHS, I would walk the hallways and read posters from the “hazmat” perspective to gain an idea of what safety challenges could be encountered in these labs. I would try to knock on doors and talk to people in labs, but I often found labs empty or with just one person who only knew about their own work.

·       Cultural differences between researchers and staff are a constant, recurring problem. Safety staff keep regular business hours; if they walk into a lab during those hours trying to find people, maybe they will and maybe they won’t – grad students have TAing responsibilities and a lot of other things happening then, and most research work gets done during non-business hours. That is perhaps the time to cross paths with them. Non-overlapping schedules are a challenge when looking for more creative ways to get engagement between safety professionals and researchers.

·       In the 1980s/1990s, career lab technicians used to exist in large research-intensive departments. These people were around for ~20 years and had historical knowledge and were often responsible for lab safety. These positions seem to have disappeared over the years with budget cuts after the 2008 crash, leaving no “long-term” people housed in the research labs.

·       Bio-oriented departments typically have lab staff; chemistry departments don’t. Arguments exist that if chem labs had lab staff, safety would be better – interesting to consider that these positions used to exist, but no longer do.

·       Trying to drag a grad student group into training (grad students’ time and attention are limited – it is unreasonable to expect them to teach, research, and have a life at the same time) is different from training other professionals who expect training

Groups came back together

·       Two spheres, but they may not know what each other is doing or not many meaningful interactions; do not know what the other does.

·       EHS versus researchers

·       Perception of safety people as compliance rather than as someone who can assist in research

A Manifesto for Reality-Based Safety Science: CHAS Safety Journal Club, 9/1/20

The Science of Safety Journal Club met on 9/1/2020 to discuss the paper A Manifesto for Reality-based Safety Science which came out in the journal Safety Science earlier in 2020.

The first two authors of the paper also host a podcast titled “The Safety of Work” in which they discuss papers from across the field of safety science. You can listen to their discussion of this paper (~1 hour) by clicking on “Ep. 20 What is reality-based safety science?” at

Below are notes of the opening and general discussion that took place during the Journal Club. We’d love to hear your thoughts about this conversation in the comments below! If you would like information about joining the club or presenting a paper there, sign up on our Google form here. Contact Jessica Martin jessica.a.martin@uconn.edu with any questions about this group.

Topic/Paper: A Manifesto for Reality-Based Safety Science

Presenter: Jessica A. Martin

Jessica’s summary of the paper

  • All research programs have a theoretical hard core with a contestable set of auxiliary hypotheses (this statement is based on the ideas in Imre Lakatos’s method of evaluating scientific progress (Lakatos, 1978)).
  • The auxiliary hypotheses link the hard core of theory (e.g. Newton’s three laws of motion) with the observed world by providing ways to measure, test and apply the hard core. The auxiliary hypotheses form a protective belt around the hard core – empirical anomalies are accommodated by adjusting the auxiliary hypotheses rather than by rejecting the hard core.
  • A research program is progressive under 2 conditions:
    1. Each new theory must have greater empirical content than its predecessors,
    2. at least some of this novel content must turn out to be true.
  • There is room for a new theory to make wrong predictions, particularly if these can be explained by adjusting the auxiliary hypotheses. However, once a program bogs down in constant adjustment of auxiliary hypotheses to explain away wrong predictions, at the expense of novel true content, the program has become degenerate.
  • To the extent that safety science makes progress, it does so by adopting and customizing progressive research programs from related fields. The problem is that once those programs become part of safety science, they usually cease making progress. 
  • In other words, in terms of novel and confirmed empirical content, “safety science” is usually where research programs come to die.
  • The Hope:
  • We have stopped growing empirically – the problem we are facing in safety science stems from a lack of evidence production
  • The Professional Safety field consists of a variety of stakeholders
    • Empiricists: all knowledge is based on experience derived from the senses
    • Theorists: concerned with the theoretical aspects of a subject
    • Practitioners: actively engaged in an art, discipline, or profession
  • To address this challenge, theorists, empiricists, and practitioners need to come together.
  • The Manifesto for Reality-based Safety Science: where theory is grounded in rigorous observations of existing practice, and where practice is based on established theory

Safety Science Research Manifesto

 Notes and Ideas from Breakout Room 1

  • How much safety research aligns with chemical education research is amazing! (we had two Chem Ed researchers in our group)
  • Chem Ed Research does do observations of work
  • Seems that leaving out “main fields” is an issue.
  • Accident investigation involves the “Why did the researcher make that choice”
  • “We” might not be the best safety examples
  • “I understand what I am doing…but others do not fully understand the situation I am in”
  • Observing is not grounds for just doing it yourself
  • Get comfortable with where we are at.
  • Trust, communication
  • Cameras for observation? TALK TO YOUR HR DEPT and IRB if research is being published
  • Reviewing near misses/safety concerns is a good idea for safety research. Look at what is reported vs. actual accidents at your institution.
  • Know what is HIPAA protected
  • At CSU…we find what is reported with near misses does not really indicate what accidents have ended up happening
  • Theory vs Practitioner (general sense of conversation)
  • How to share…peer reviewed vs. not peer reviewed
  • Make safety approachable, equitable, and useful

Points from Breakout Room 2 Discussion

  • Is Safety Science actually a science?
  • Safety practice is often driven by learning from bad experiences
  • Are other scientific approaches more appropriate for safety scientists?
  • Susan Silbey (Professor of anthropology at MIT): studying how new EHS system at MIT works using an ethnographic approach
  • When attempting to “do science” on a safety question, it feels like a “chicken or egg” problem. What are you going to measure? Who decides how the work should be done? And if the theory dictates how the work should be done, then you are using the theory to measure itself – so the logic becomes circular. 
  • You can perform a bunch of incorrect behaviors, and still not get injured.
  • Performing safety assessments in academia is the exception, not the rule.
  • There are many more practitioners than researchers in the field, and what is useful to the researchers is not necessarily useful to the practitioners.
  • Tracking “near misses” can be much more informative than tracking accidents; although even here, what constitutes a “near miss” can be extremely unclear.
  • Collecting data on edge events/accidents can be entirely too complex, making it difficult to get good data.
  • How does one share “safety experience” with the wider community without going through the peer review process that looks for statistical analysis, control groups, etc.?

Return to larger group discussion

  • Field is practitioner heavy
  • Safety is driven by individual learning experiences
  • If it’s not being published…then no one is aware…how/where to publish
  • Validating your own theory
  • Is this a science in a proper sense?
    • Should safety science be separated from other sciences?
    • It has already gone its separate way.
    • Cross field interactions.
  • System. Person. System interacting with the Person.
  • You should not divorce from the other fields
  • Be involved with culture; make sure it is not just for white, cis-gender, hetero people…example of reducing sexual harassment in field campaigns
