Category Archives: Reference material

CHAS Fellows Award

Download the Nomination Application Form for this Award Here: [Click to Download CHAS Fellows Nomination Form]

Statement of Award Purpose

Fellows Awards recognize CHAS members in good standing who have provided continuous, active service to CHAS and who have made significant contributions to the Division during their active service.

Award Recognition

Certificate and lapel pin presented at the Awards Symposium during the Fall ACS national meeting.

Description of Eligible Nominees

An eligible nominee must be a current member of the Division who has either been a member for at least fifteen years, or has been a member for at least seven years (continuous membership) and has been active in CHAS activities for at least three years. Active membership is defined below.

CHAS Fellows have shown support for the goals and activities of CHAS and have, through personal effort, helped CHAS achieve those goals. The following activities designate a member as “active”:

  1. Division volunteer service including holding office in the division, chairing division committees, etc.
  2. Organization of symposia, major presentations, or other programming at national/international meetings
  3. Contributions to safety in ACS Publications
  4. Leadership of or other contributions to Division outreach activities

There is no limit to the number of CHAS Fellows named each year.

Eligible Sources of Nominations

  • Self-nomination
  • Any CHAS division member

Roster of CHAS Fellows

(Those listed in bold are also ACS Fellows)

  • Robert Alaimo
  • Peter Ashbrook
  • W. Emmett Barkley
  • Janet Baum
  • Ernest I. Becker
  • Leslie Bretherick
  • Brandon Chance
  • Daniel A. Crowl
  • Debbie Decker
  • Lou DiBerardinis
  • Laurence J. Doemeny
  • Harry J. Elston
  • Howard Fawcett
  • Barbara Foster
  • David Finster
  • Kenneth Fivizzani
  • Bill Galdenzi
  • Lawrence M. Gibbs
  • Marta Gmurczyk
  • Ruth Hathaway
  • Donald Hedberg
  • Dennis C. Hendershot
  • Robert H. Hill Jr.
  • Chris Incarvito
  • Robin Izzo
  • James Kaufman
  • Kimberly B. Jeskie
  • Jyllian Kemsley
  • Sheila Kennedy
  • Warren Kingsley
  • Mary Beth Koza
  • Ken Kretchman
  • Daniel Kuespert
  • Neal Langerman
  • Mark Lassiter
  • Po-Yung Lu
  • Sung Moon
  • John Palmer
  • Joe Pickel
  • Lyle H. Phifer
  • Russ Phifer
  • Stanley Pine
  • Pat Redden
  • Malcolm Renfrew
  • Peter Reinhardt
  • Monona Rossol
  • Eileen Segal
  • Stephen Sichek, Sr.
  • Diane G. Schmidt
  • Sammye Sigmann
  • Mary Ann Solstad
  • Ralph Stuart
  • Ellen Sweet
  • Stephen Szabo
  • Erik A. Talley
  • Robert Toreki
  • George Wahl
  • Douglas B. Walters
  • Stefan Wawzyniecki
  • Elizabeth Weisburger
  • Frankie Wood-Black
  • I. J. Wilk
  • Kenyon D. Yoder
  • Jay Young

CHAS Graduate Student Safety Leadership Award

Download the Nomination Application Form for this Award Here: [Click to Download GRADUATE STUDENT SAFETY LEADERSHIP AWARD in Word format]

Statement of Award Purpose

This award is given to recognize a graduate student researcher or recent graduate (within 3 years of latest degree) who demonstrates outstanding leadership in the area of chemical health and safety in their laboratory, research group, or department.

Each year the award is dedicated to a different historical figure in chemical safety.

Award Amount and Recognition

This award is made possible by the generosity of an anonymous donor.

The award consists of:

  • A $500 honorarium, payable directly to the award recipient
  • A certificate that includes information about that year’s dedicatee
  • An invitation to deliver a 15 – 20-minute presentation at the CHAS Awards Symposium
  • An optional additional $2,000 to support a project that promotes graduate student safety at the recipient’s home institution and/or to cover travel expenses to the CHAS Awards Symposium at an ACS national meeting, as applicable

The recipient of this award is expected to deliver a 15 – 20-minute presentation at the CHAS Awards Symposium at the ACS Fall national meeting in the year that they receive the award. The presentation should describe the work recognized by the award.

Description of Eligible Nominees

Eligible nominees are current master’s or Ph.D. candidates in research fields, or those who have graduated within the past three years, who demonstrate values and behaviors consistent with the criteria of this award. The researcher may be a member of any academic department at their institution, provided that chemical use is a significant part of their research. Membership in ACS or CHAS is not required for this award.

Award Criteria

The primary criterion for this award is demonstrated leadership of specific project(s) that support a proactive safety culture in the laboratory, research group, and/or department where the student’s research or teaching responsibilities take place. Such projects empower their peers and students to address technical and cultural safety concerns related to chemical usage, in either the teaching or research environment. 

Examples of work that call on other safety leadership skills (e.g., participation in extramural safety conferences and organizations, publication of safety-related information in research papers, development of new approaches to safety education in the lab) will support the award application but do not replace the need for a specific example of project-based leadership.

Required support for nomination

  • A cover letter from the nominator describing why the nominee is deserving of this award
  • Two letters of support from the institution’s Environmental Health and Safety office, senior administration, or departmental leadership
  • Descriptions or samples of research safety projects or initiatives led by the nominee

Eligible Sources of Nominations

  • Self-nomination
  • Teammate (fellow student or lab member)
  • Department chair
  • EH&S Department
  • Vice Provost for Research or another Senior Administrator
  • Peer

Additional Information about this Award

This award was proposed in 2020 by an anonymous donor and was developed by the CHAS Awards committee in collaboration with the donor.  The award was first given in 2021.  

Previous Awardees:

2022, given in honor of the Radium Girls

Quinton Bruch, University of North Carolina at Chapel Hill

2021, given in honor of Sheharbano “Sheri” Sangji
  • Graduate Student Team Leaders:
    • Jessica De Young, University of Iowa
    • Alex Leon Ruiz, University of California, Los Angeles
    • Sarah Zinn, University of Chicago
    • Cristan Aviles-Martin, University of Connecticut

SafetyStratus College and University Health and Safety Award

Download the Nomination Application Form for this Award Here: [Click to Download SafetyStratus College and University Nomination Form in Word format]

Statement of Award Purpose

This award is given to recognize the most comprehensive chemical safety programs in higher education (undergraduate study only).

Award Amount and Recognition

  • $1,000 Honorarium
  • Engraved plaque including name of recipient and sponsor logo to be presented to winner at award symposium

This award is made possible by the generosity of SafetyStratus.

The recipient of this award is expected to deliver a 15 – 20-minute presentation at the CHAS Awards Symposium at the ACS Fall national meeting in the year that they receive the award. The presentation may be on any topic related to chemical safety.

Description of Eligible Nominees

The nominee may be a college/university chemistry academic department, an entire campus, or an EH&S office. Joint-nominations including chemistry departments and other offices will also be accepted. Preference will be given to those submissions which include participation from the chemistry department.

Previous recipients of this award will only be considered eligible again 10 years after they receive the award. Their current undergraduate lab safety program must differ significantly from the program as it was when the award was previously received. In this case, supporting documentation must highlight the program improvements since the year that the award was last received.

Detailed award criteria are given below.

One award is given per year.

Award Criteria

1. Chemistry Department’s safety policy statement

2. Chemical Hygiene Plan(s) for instructional laboratories

3. Evidence that safety concepts, such as risk assessment and utilization of chemical-safety information, are included in the teaching curriculum.

Evidence may include the following:

  • Safety rules for students
  • Course syllabus showing covered safety topics
  • Laboratory manuals with safety guidance
  • Examinations or exercises used to teach or reinforce safety concepts
  • Safety course offerings
  • Description(s) of offered seminar(s) on safety topics
  • Results of safety research
  • Other examples as appropriate

4. Description of, or documentation for, chemical waste collection policies and procedures for instructional courses and prep lab

5. Chemical Storage policies

This may include descriptions of:

  • Access control
  • Segregation
  • Protected storage
  • Inventory tracking methods
  • Storage quantity limitations, approvals for ordering new chemicals, etc.

6. Policies relating to the prep-lab space (if applicable), as well as written safety requirements for instructors and teaching assistants working during non-class times.

This may include:

  • Separate Chemical Hygiene Plan for prep lab (or clear inclusion in the department CHP)
  • General policy and procedures for use of and access to the space (e.g., restricted access, a policy against working alone in the lab, etc.)

7. Evidence of waste-minimization and green-chemistry strategies.

This may include:

  • Policies for waste minimization or a list of prohibited materials
  • Description of green practices used in the instructional labs
  • Incorporation of sustainability concepts into the curriculum

8. Evidence of faculty and teaching-assistant safety training and development.

This may include:

  • Description of safety training requirements and policies for individuals overseeing, or directly supervising, laboratory classes
  • Learning objectives for the safety training that describe the skills or knowledge that the faculty and teaching assistants will acquire
  • Copy of safety training materials for training instructors and T.A.s, if applicable
  • Seminars, workshops, production of videotapes, slides, etc.

9. Lab Safety Event (incident and near-miss) reporting.

Documentation may include:

  • Policy for reporting incidents
  • Summary reports and analysis of incidents, injuries, and near misses from the previous 3 years
  • Summary of lessons learned from events and corrective actions taken
  • Example reports with root cause analysis and corrective actions

10. Policy, procedure, and frequency for routine assessments of the laboratory’s physical condition (i.e., audits/inspections). Assessments may be internal (self-inspection) or external (e.g., EH&S department inspections).

Supporting documentation may include:

  • Description of inspection process, frequency, and responsible parties
  • Copy of inspection checklist
  • Metrics or data demonstrating program effectiveness. This could include data regarding frequency of inspections, average number of findings, median time for corrective action, etc.

Required support for nomination

The nomination must include a cover letter from the nominator describing why the nominee is deserving of this award. The letter shall describe the nominee’s work and how it is aligned with the purpose, eligibility, and/or criteria of the award.

Other supporting materials are suggested in the details of the award criteria. The supporting information may be provided as links to applicable webpages on the organization’s website or as attached files. The documents and/or links must be labeled/named using the numbering and descriptions provided in the details of the award criteria. The name of the institution must be included in all file names.

The nomination may also include a letter of support from the institution’s Environmental Health and Safety office, as well as a letter of support from a senior administrator such as the head of the academic department, the vice provost for research, or the dean of the school, although these are not requirements for this award.

Site visit criteria (if applicable)

If a nominee is deemed eligible based on the supporting documentation, a committee member or other ACS delegate will perform a site visit to verify the program.

As part of the site visit, the following laboratory and chemical-use area conditions of the undergraduate laboratory facility may be assessed:

  1. General ventilation
  2. Engineering controls in working order
  3. Housekeeping and general facility condition including chemical storage areas
  4. Adequate student supervision
  5. Security of chemical storage areas and general lab spaces
  6. Emergency irrigation tested and working
  7. Emergency response equipment and supplies stocked and inspected (spill kits, fire extinguishers, etc.)
  8. Personal protective equipment available and in good condition
  9. Posted emergency procedures and contact numbers

Eligible Sources of Nominations

  • Self-nomination
  • Subordinate (Student)
  • Department chair
  • EH&S Department
  • Vice Provost for Research or another Senior Administrator
  • Peer or another ACS member who is familiar with the nominee’s undergraduate program

Previous Winners

2022: University of Nevada, Reno, Environmental Health and Safety Department & Department of Chemistry

2021: C. Eugene Bennett Department of Chemistry, West Virginia University

2020: Massachusetts Institute of Technology Department of Chemistry Undergraduate Teaching Laboratory and Environment, Health & Safety Office

2019: University of Pittsburgh, Department of Chemistry and Department of EH&S

2018:  University of North Carolina at Chapel Hill, Department of Chemistry and Department of Environment, Health and Safety

2017: Department of Chemistry and the Department of Environmental Health and Safety, Stanford University

2016: Duke University

2015: University of Pennsylvania

2014: University of California Davis

2013: North Carolina State University

2012: Wittenberg University, Springfield, Ohio

2011: University of California, San Diego

2010: Princeton University

2009: Wellesley College

2008: Franklin and Marshall College

2007: University of Connecticut

2006: none awarded

2005: Massachusetts Institute of Technology and the University of Nevada-Reno

2004: University of Massachusetts-Boston

2003: none awarded

2002: none awarded

2001: West Virginia University

2000: none awarded

1999: Francis Marion University

1996: Williams College

1995: University of Wisconsin-Madison

1993: College of St. Benedict, St. Joseph, MN, jointly with St. John’s University

1991: Massachusetts Institute of Technology

Laboratory Safety Institute Graduate Research Faculty Safety Award

Download the Nomination Application Form for this Award Here: [Click to Download LSI Grad Research Faculty Safety Nomination Form]

Statement of Award Purpose

This award is given to recognize graduate-level academic research faculty who demonstrate outstanding commitment to chemical health and safety in their laboratories.

Award Amount and Recognition

  • Up to $1,000 to support travel expenses to attend the ACS national meeting and deliver a 15 – 20 min presentation at the CHAS Awards Symposium, and/or a grant to be used for safety enhancements in the faculty member’s research group
  • Engraved plaque including name of recipient and sponsor logo to be presented to winner at award symposium
  • Award certificate mailed to the university president.

This award is made possible by the generosity of The Laboratory Safety Institute.

Description of Eligible Nominees

Eligible nominees are faculty members who have responsibility for a graduate-level research laboratory, and who demonstrate values and behaviors consistent with the criteria of this award. The faculty member’s laboratory may be part of any academic department at the institution, provided that chemical use is a significant part of the laboratory’s research. Detailed criteria provided below.

One award is given per year.

Award Criteria

Sets Safety-Compliance Expectations:

Enforces all institutional health and safety practices, protocols, and rules within his or her laboratory space.

  • Participates in formal laboratory inspections and group safety committee meetings.
  • Maintains safety in the laboratory by conducting unannounced walk-throughs.
  • Ensures all members of the research group and visitors (including vendors and contractors) read, understand, agree to follow, and realize the consequences of not following the safety rules.
  • Actively demonstrates his/her commitment to laboratory health and safety.
  • Personally provides a “new employee/student safety orientation” for each new member of the research group.

Monitors and Provides Safety Information and Training:

Insists that everyone who works in the lab receives comprehensive, lab-specific safety information and training.

  • Ensures students and others who work in his or her research lab are educated, informed, and trained in the safety skills they need to conduct research safely and to work independently.
  • Establishes coaching and mentoring relationships to enable new researchers to receive hands-on training in safety practices from more experienced researchers.
  • Requires a structured lab-orientation, including emergency information and safety rules, for new lab workers.
  • Annually and as needed, reviews and revises the lab safety manual and chemical hygiene plan (if separate from the lab safety manual).

Models Safe Behaviors:

Serves as a role model by personally exhibiting good safety behavior.

  • Wears proper lab attire and PPE when entering laboratories or handling research materials.
  • Completes, at a minimum, the same institutional safety training requirements as lab workers and signs the group’s rules agreement.
  • Personally proposes new safety initiatives and/or shares safety best practices with the department and/or Health and Safety department.
  • Actively participates in research group or department safety committees and Joint Safety Teams.
  • Includes safety information in published research and requires it in their students’ thesis proposals and thesis defenses.

Assesses Hazards and Evaluates Risks:

Ensures that lab members complete hazard analyses prior to conducting experimental procedures.

  • Identifies the hazards, types of emergencies that could occur, and what needs to be done to be prepared for them.
  • Implements prudent practices, protective facilities, and PPE needed to minimize risk
  • Reviews new laboratory procedures for potential risks.
  • Requires hazard analyses to be incorporated into lab notebooks prior to an experiment.
  • Expects hazard analysis to be included in thesis proposals, dissertation proposals, and published research.
  • Requires hazard analysis to be revisited if an experimental procedure yields unexpected results or if the procedure requires changes before conducting the experiment.

Creates Safety Leaders:

Empowers researchers to assume leadership roles in establishing safety practices within research groups and for entire departments.

  • Encourages lab members to participate in department safety committees or Joint Safety Teams.
  • Encourages lab members to propose new safety initiatives and/or share safety best practices with the department and/or Health and Safety department.
  • Rewards good safety performance.

Promotes Positive Safety Culture:

Takes actions to encourage safety and promote a strong, positive safety culture in the research lab.

  • Provides instruction and encouragement to lab members to report all incidents and injuries, no matter how minor.
  • Fosters a nonthreatening atmosphere for free expression of safety concerns or questions.
  • Provides ample budgetary support for safety supplies such as PPE and engineering controls.
  • Recognizes that psychological stress can undermine safety culture and performance.
  • Provides and encourages lab members to take advantage of resources for stress management.
  • Encourages open and ongoing dialog about safety.
  • Requires “safety moments” at laboratory group meetings or otherwise incorporates safety into research discussions.
  • Encourages and acknowledges lab members for working safely.
  • Accepts responsibility for safety.
  • Takes initiative to reduce waste and promote greener, more sustainable research practices in his or her lab.

Required support for nomination

The nomination must include a cover letter from the nominator describing why the nominee is deserving of this award. The letter shall describe the nominee’s work and how it is aligned with the purpose, eligibility, and criteria of the award.

The nomination must also include a letter of support from the institution’s Environmental Health and Safety officer, as well as a letter of support from a senior administrator such as the head of the academic department, the vice provost for research, or the dean of the school or college. In cases when the nominator (author of the cover letter) is a member of the Environmental Health and Safety office or a senior administrator, an additional letter of support from that party is not required.

Individuals writing letters of support are not required to be members of the American Chemical Society or the Division of Chemical Health and Safety. The supporter’s contact information must be included in the letter. The support letter should clearly describe the impact of the nominee’s work and how the individual has met the purpose and criteria of the award.

Site visit criteria (if applicable)

A site visit will not be required for this award.

What, if any, involvement will the sponsor have in evaluating nominees?

The sponsor will serve as a reviewer of the nominations for this award. The sponsor’s evaluation will be given no more or less weight than the evaluations of the other members of the Awards Selection Subcommittee.

Eligible Sources of Nominations

  • Self-nomination
  • Subordinate (Student/lab member)
  • Department chair
  • EH&S Department
  • Vice Provost for Research or another Senior Administrator
  • Peer

Additional Information about this Award

This award was proposed in 2018 by James Kaufman, Ph.D. (Founder, Laboratory Safety Institute), and was developed in collaboration with Dr. Kaufman by the CHAS Awards Committee in 2019. The award was first given in 2020. Criteria for the award were derived primarily from the following sources of academic laboratory safety guidelines:

  1. National Research Council. 2014. Safe Science: Promoting a Culture of Safety in Academic Chemical Research. Washington, DC: The National Academies Press. https://doi.org/10.17226/18706.
  2. ACS Joint Board/Council Committee on Chemical Safety. 2012. Creating Safety Cultures in Academic Institutions: A Report of the Safety Culture Task Force of the ACS Committee on Chemical Safety. Washington, DC: American Chemical Society. https://www.acs.org/content/dam/acsorg/about/governance/committees/chemicalsafety/academic-safety-culture-report.pdf
  3. Laboratory Safety Guidelines: 40 Suggestions for a Safer Lab, James A. Kaufman, Laboratory Safety Institute, https://www.labsafety.org/resource

Previous Winners

2022: Alexander J. M. Miller, Ph.D., University of North Carolina
2021: Ian Albert Tonks, Ph.D., University of Minnesota
2020: Mahesh K. Mahanthappa, Ph.D., University of Minnesota

August 2022 CHAS Technical Program

Sunday, August 21, 2022
08:00am – 11:40am CDT

Safety Across the Chemical Disciplines 

Marta Gmurczyk, Organizer, Presider; Dr. Daniel R. Kuespert, CSP, Organizer, Presider

Safety is a common theme across most chemical disciplines, but hazards deemed common in one field may be opaque to researchers in another. Interdisciplinary research brings with it risks associated with several intersecting disciplines, and researchers trained in one specialty will not necessarily be familiar with all the relevant risks. What hazards are recognized in their research field? What hazard control challenges present themselves? Where are the high risks that have been accepted as “normal”? This symposium draws together representatives of many chemical disciplines to share their safety knowledge and challenges.

3737607 – Blueprint thinking to build sustainable solutions on the solid foundation of safety; Kalyani Martinelango, Presenter; Jason Fisk

3743933 – Hazard identification and mitigation in a multidisciplinary industrial research environment, George Athens, Presenter; Jeff Foisel; Steve Horsch

3741150 – Safety in the catalysis research lab, Mark Bachrach, Presenter

3754630 – Risk, safety, and troublesome territoriality: Bridging interdisciplinary divides, John G Palmer, PhD, Presenter; Brenda Palmer

3751492 – Risk-based safety education fosters sustainable chemistry education, Georgia Arbuckle-Keil, Presenter; David Finster; Ms. Samuella Sigmann, MS, CCHO; Weslene Tallmadge; Rachel Bocwinski; Marta Gmurczyk

02:00pm – 04:05pm CDT

Division of Chemical Health & Safety Awards Symposium  

Brandon Chance, Organizer, Presider

3754798 – Interdepartmental initiatives to improve campus chemical safety, Luis Barthel Rosa, Presenter

3738511 – Building and sustaining a culture of safety via ground-up approaches, Quinton Bruch, Presenter

3750224 – Safety net: Lessons in sharing safe laboratory practices, Alexander Miller, Presenter

3740645 – Governing green labs: Assembling safety at the lab bench, Susan Silbey, Presenter

Monday, August 22, 2022, 09:00am – 11:45am CDT

Indicators of Success in Laboratory Safety Cultures

Over the past decade, there has been increasing interest in improving the safety culture of laboratory settings. How do we identify and implement indicators of success of these efforts? When are quantitative cultural measurements available and when do we need to rely on qualitative indicators of movement forward? Both theoretical ideas and concrete examples of the use of this approach are welcome in this symposium. 

3745133 – Indicators of success in a safety culture, Ralph Stuart, CIH, CCHO, Presenter

3733703 – What’s in a name? Mapping the variability in lab safety representative positions, Sarah Zinn, Presenter; Imke Schroeder; Dr. Craig Merlic

3735305 – Supporting high school educators with a chemical hygiene officer, Kevin Doyle, Presenter; Yvonne Doyle

3755577 – Factors for improving a laboratory safety coordinator (LSC) program, Kali Miller, Presenter

3754791 – Empowering student-led organizations to create effective safety policies, Angie Tse, Presenter

3738023 – Quantitative and qualitative indicators of safety culture evolution by the joint safety team, Demetra Adrahtas, Presenter; Polly Lynch; Sofia Ramirez; Brady Bresnahan; Taysir Bader

Monday, August 22, 2022, 02:00pm – 05:35pm

Division of Chemical Health & Safety General Papers

3752873 – Case studies and chemical safety improvements, Sandra Keyser, Presenter

3737640 – Storytelling is an art in building a “safety first” culture, Irene Cesa, Presenter; Dr. Kenneth P Fivizzani; Michael Koehler

3728705 – Vertical safety engagement through new community connections committee of the UMN joint safety team, Vilma Brandao, Presenter; Zoe Maxwell, Presenter; Jeffrey Buenaflor, Presenter; Gretchen Burke, Presenter; Steven Butler, Presenter; Xin Dong, Presenter; Nyema Harmon, Presenter; Erin Maines, Presenter; Taysir Bader; Brady Bresnahan

3752771 – Health and safety information integration: GHS 2021 version 9 in PubChem, Jian Zhang, Presenter; Evan Bolton

3754174 – Laboratory databases: Applications in safety programming, Magdalena Andrzejewska, Presenter

3748086 – Boundary conditions: Designing and operating laboratory access controls for safety, Joseph Pickel, Presenter

3743418 – Health risk of natural radioactivity and trace metals in shaving powder, Akinsehinwa Akinlua, Presenter

3744635 – Field-portable colorimetric method for the measurement of peracetic acid vapors, Angela Stastny, Presenter; Amos Doepke; Robert Streicher

Tuesday, August 23, 2022, 08:00am – 11:50am

EHS Leadership & Diversity

Mary Beth Koza, Organizer, Presider

3720285 – Remember: No matter where you go, there you are, Ms. Samuella Sigmann, MS, CCHO, Presenter

3758440 – What is your safety role? An introduction to structured safety programs, Mary Heuges, Presenter

3754152 – Safety leadership & organizational design, Mary Beth Koza, Presenter

3742098 – Chemical safety in the ACS Chicago section, Dr. Kenneth P Fivizzani, Presenter

3746562 – Workplace safety needs diversity to ensure that everyone is safe, Frankie Wood-Black, Presenter

3753096 – Improving research safety: Activities of the University of California center for laboratory safety, Imke Schroeder, Presenter

3723285 – Circadian rhythms based system for managing the risks of human factor manifestation, Amir Kuat, Presenter

Tuesday, August 23, 2022, 02:00pm – 05:15pm

Latest Developments in Cannabis Science and Sustainability: Analytics of Various Compounds from the Cannabis Plant

Kyle Boyar, Organizer, Presider; Julia Bramante, Organizer, Presider; Amber Wise, Organizer, Presider

3741048 – Extraction, purification and bioactivity of policosanols from Cannabis sativa L. Megi Gjicolagj; Virginia Brighenti; Alberto Venturelli; Lisa Anceschi; Caterina Durante; Prof. Federica Pellati, Presenter

3750806 – Cannasulfur compounds: A new paradigm for the chemistry of cannabis, Iain Oswald, Presenter; Marcos Ojeda; Thomas Martin; Twinkle Paryani; Kevin Koby

3744744 – Purple rain: CBD discolouration results from cascading photo-redox reactions, Brodie Thomson, Presenter; Markus Roggen, Presenter; Glenn Sammis

3741211 – Optimization of a new HPLC method for the simultaneous separation of cannabinoids by experimental design techniques, Caterina Durante; Cindy Afezolli; Lisa Anceschi; Virginia Brighenti, Presenter; Federica Pollastro; Prof. Federica Pellati

3738752 – Sustainability in the evolving scenario of cannabinoid bioanalysis: Microsampling for the assessment of Δ8-THC and metabolites, Dr. Michele Protti, Presenter; Marco Cirrincione; Dorota Turonová; Lenka Krčmová; Prof. Laura Mercolini

Blazing Trails: Cannabis Chemistry in Post-secondary Education

Wednesday, August 24, 2022, 08:00am – 11:45am

Brandon Canfield, Organizer, Presider; Amber Wise, Organizer

Formal post-secondary instruction in the area of cannabis chemistry has emerged across North America in recent years, ranging from entire undergraduate programs to individual courses to specific modules within a course. This session is intended to provide early reporting on the development and progress of education in this field, and to foster a discussion on the directions it is heading.

3739767 – Cannabinoid separation: A new HPLC system suitable for cannabis research in undergraduate laboratories. Alicia Douglas; Maria Swasy; Candice Cashman; Benedict Liu, Presenter

3757535 – Medicinal plant chemistry five years in: A report on the development and status of the first cannabis chemistry undergraduate degree program. Brandon Canfield, Presenter; Lesley Putman

3757906 – Masters in cannabis science and therapeutics at University of Maryland, Baltimore: An overview. Alexandra Harris, Presenter; Dr. Chad R Johnson; Brittany Namaan

3758112 – Olive Harvey College urban agriculture curriculum development…the cannabis path. Akilah Easter, Presenter; Steven Philpott

3744000 – Cannabis biology and chemistry at Colorado State University Pueblo. David Lehmpuhl, Presenter; Chad Kinney; David Dillon; Jeffrey Smith; Jonathan Velasco

3751509 – Fields to findings, from seeds to statistics. Alexander Wilson, Presenter; Maris Cinelli; James DeDecker; Lesley Putman; Ara Kirakosyan; Brandon Canfield

3725733 – Second-year, course-based undergraduate research experience (CURE) in the organic chemistry of cannabinoids. Matthew Mio, Presenter

3742921 – Advancing cannabis/hemp technology platform through an educational-consulting interface (CFS). Jerry King, Presenter

3742395 – Cannabis chemistry in the European scenario of higher education programmes: A diversified network of opportunities. Roberto Mandrioli; Dr. Michele Protti; Stefano Girotti; Camelia Draghici; Fernando Remiao; Ana Morales; Premysl Mladěnka; Luca Ferrari; Prof. Laura Mercolini, Presenter

Latest Developments in Cannabis Science and Sustainability: Characterization of the Cannabis Plant and Modern Extraction Techniques

Kyle Boyar, Organizer, Presider; Julia Bramante, Organizer, Presider; Amber Wise, Organizer, Presider

3752003 – Beyond terpenes: An untargeted approach for characterizing and categorizing cannabis cultivars to derive a robust chemotaxonomy. Aldwin Anterola, Presenter; Laura McGregor; John Abrams

3738042 – Closer look at cannabis: Using cryo-SEM, GC/MS, and HPLC to compare trichome density and trichome ratio to secondary metabolite profiles in Cannabis sativa. Steven Philpott, Presenter

3735582 – Advances in extraction of compounds of value from biomass using hot air as an extraction medium. Steven Bonde, Presenter

3756218 – Beyond THC remediation: Examples of cannabinoid isolations with centrifugal partition chromatography. Manar Alherech, Presenter

Laboratory Lessons Learned Pages Updated

There are many different kinds of hazards associated with laboratory work, and sometimes these hazards result in unforeseen safety incidents. No one knows how often such laboratory incidents occur, but over the last several years, high-profile laboratory events have raised public concern about them. This concern was crystallized in the Chemical Safety Board’s 2011 report on academic laboratory safety.

In order to address the concerns raised by the CSB, the Division of Chemical Health and Safety of the ACS believes that it is important to take advantage of the learning opportunities such incidents represent. Therefore, we have joined together to develop a form which we believe will be an effective tool for collecting appropriate information to develop “Lessons Learned” from such events.

These incidents can include events which result in injuries, financial or scientific losses, near misses, and safety observations. We collect this information for several reasons:

  • To avoid having the same incident occur again;
  • To enhance safety awareness of laboratory workers as they conduct routine work;
  • To support the lateral thinking required to develop “what if” scenarios when planning laboratory work;
  • To improve emergency planning for response to laboratory events
  • To identify successful health and safety protection measures;
  • To provide stories that can add interest to safety training efforts; and
  • To help laboratory safety managers correctly prioritize their concerns.

Currently, a variety of laboratory organizations have developed “lessons learned” programs. Examples of these programs can be found at:

Lab Manager Webinar on “Preventing and Managing the Most Likely Lab Accidents”

On June 29, Ralph Stuart presented a webinar for Lab Manager magazine on the topic of Preventing and Managing the Most Likely Lab Accidents. This presentation highlighted a variety of ACS safety resources produced over the last 5 years and described how they can be used in the context of two historical laboratory incidents, including the death of Dr. Karen Wetterhahn.

A recorded version of the webinar is available for viewing on the Lab Manager website. A PDF version of the presentation is below, as well as the audience poll responses to the questions asked about their lab risk assessment processes.

This webinar was presented as part of the Division’s Innovative Project, funded by the ACS, to understand current practices in laboratory risk assessments and how ACS can support improving those practices. If you are interested in participating in this project, contact Ralph Stuart at membership@dchas.org.

The Responses to Poll Questions were:

Which of these best describes how often you review your lab’s risks and safety practices?

  • We have regular (weekly or monthly) safety discussions as refreshers for all lab staff: 46 (31%)
  • We rely on consistent use of general best lab safety practices: 45 (31%)
  • We review safety as new people are hired or procedures change: 26 (18%)
  • We review our SOPs for safety concerns annually: 30 (20%)
  • Total responses: 147

What part of your lab safety program do you find most challenging?

  • Planning for Emergencies: 32 (23%)
  • Assessing Risks: 45 (33%)
  • Recognizing Hazards: 18 (13%)
  • Managing Safety to Minimize Risks: 43 (33%)
  • Total responses: 138

What is your primary approach to communicating your lab safety practices to people in your lab?

  • We place notices and signs pointing out specific hazards in the lab: 51 (36%)
  • We rely on paper Standard Operating Procedures and Lab Guidance: 55 (38%)
  • We put alerts in an Electronic Lab Notebook system: 10 (7%)
  • We focus on word of mouth and chemical intuition: 16 (11%)
  • Total responses: 142

Who is involved in developing and reviewing your laboratory risk assessments?

  • The person who writes the SOP for the procedure: 41 (34%)
  • Everyone who handles the chemicals involved in an SOP: 31 (26%)
  • Everyone in the lab, because they could be impacted by a safety incident even though they aren’t conducting the procedure involved: 41 (34%)
  • Our emergency responders, who are expected to provide assistance in case of an incident: 6 (5%)
  • Total responses: 119

New ACS Chemical Safety Videos for College Teaching Labs

The ACS Office of Safety Programs is pleased to announce that it has published a group of 6 safety videos for college teaching labs.

From Chemical Safety Rules to Risk Management
Chemical Safety Information Resources
Assessing Risks in the Chemistry Laboratory
Minimizing Risks in the Chemistry Laboratory 
Minimizing Risks in the Chemistry Laboratory: Techniques
Emergencies in the Chemistry Laboratory

Psychological Evaluation of Virtual Reality Applied to Safety Training

04/14/22 Table Read for The Art & State of Safety Journal Club

Excerpts and comments from “A Transferable Psychological Evaluation of Virtual Reality Applied to Safety Training in Chemical Manufacturing”

Published as part of the ACS Chemical Health & Safety joint virtual special issue “Process Safety from Bench to Pilot to Plant” in collaboration with Organic Process Research & Development and Journal of Loss Prevention in the Process Industries.

Matthieu Poyade, Claire Eaglesham, Jordan Trench, and Marc Reid

The full paper can be found here: https://pubs.acs.org/doi/abs/10.1021/acs.chas.0c00105

1. Introduction

Safety in Chemical Manufacturing

Recent high-profile accidents—on both research and manufacturing scales—have provided strong drivers for culture change and training improvements. While our focus here is on process-scale chemical manufacturing,[a][b][c][d][e][f][g][h][i][j] similarly severe safety challenges exist on the laboratory scale; such dangers have been extensively reviewed recently. Through consideration of the emerging digitalization trends and perennial safety challenges in the chemical sector, we envisaged using interactive and immersive virtual reality (VR) as an opportune technology for developing safety training and accident readiness for those working in dangerous chemical environments.

Virtual Reality

VR enables interactive and immersive real-time task simulations across a growing wealth of areas. In higher education, prelab training in VR[k][l][m] has the potential to address these issues by giving students multiple attempts to complete core protocols virtually in advance of experimental work, creating the time and space to practice outside of the physical laboratory.

Safety Educational Challenges

Safety education and research have evolved with technology, moving from videos on cassettes to online formats and simulations. However, the area is still a challenge, and very recent work has demonstrated that there must be an active link between pre-laboratory work and laboratory work in order for the advance work to have impact.

Study Aims

The primary question for this study can be framed as follows: When evaluated on a controlled basis, how do two distinct training methods[n][o][p][q][r][s], (1) VR training and (2) traditional slide training[t][u][v] (displayed as a video to ensure the consistency of the provision of training), compare for the same safety-critical task?

We describe herein the digital recreation of a hazardous facility using VR to provide immersive and proactive safety training. We use this case to deliver a thorough statistical assessment of the psychological principles of our VR safety training platform versus the traditional non-immersive training (the latter still being the de facto standard for such live industrial settings).

Figure 3. Summarized workflow for safety training case identification and comparative assessment of PowerPoint video versus VR.

2. Methods

After completing their training, participants were required to fill in standardized questionnaires which aimed to formally assess 5 measures of their training experiences.

1. Task-Specific Learning Effect

Participants’ post-training knowledge of the ammonia offload task was assessed in an exam-style test composed of six task-specific open questions.

2. Perception of Learning Confidence

How well participants perform on a training exam and how they feel about the overall learning experience are not the same thing. Participant experiences were assessed through 8 bespoke statements, which were specifically designed for the assessment of both training conditions.

3. Sense of Perceived Presence

“Presence” can be defined as the depth of a user’s imagined sensation of “being there” inside the training media they are interacting with.

4. Usability

From the field of human–computer interaction, the System Usability Scale (SUS) has become the industry standard for assessing system performance and fitness for the intended purpose… A user indicates their level of agreement on a Likert scale, resulting in a score out of 100 that can be converted to a grade A–F. In our study, the SUS was used to evaluate the subjective usability of our VR training system.

[w][x][y]
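
For readers unfamiliar with how a SUS score is computed, the sketch below shows the standard 10-item scoring scheme (odd items scored as response minus 1, even items as 5 minus response, with the raw sum scaled by 2.5). This is a generic Python illustration, not the authors' analysis code, and the example responses are invented.

```python
# Minimal sketch of standard SUS scoring (not the paper's actual analysis code).
# Assumes the usual 10-item questionnaire with 1-5 Likert responses.
def sus_score(responses):
    """responses: list of ten 1-5 Likert ratings, item 1 first."""
    if len(responses) != 10:
        raise ValueError("SUS uses exactly 10 items")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd-numbered (positively worded) items contribute r - 1;
        # even-numbered (negatively worded) items contribute 5 - r.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # scales the 0-40 raw sum to a 0-100 score

# Example: a fairly positive (hypothetical) response set scores 82.5,
# in the same "A-" territory as the ~80 reported in the study.
print(sus_score([5, 2, 4, 1, 4, 2, 5, 2, 4, 2]))
```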

5. Sentiment Analysis

Transcripts of participant feedback—from both the VR and non-VR safety training groups—were used with the Linguistic Inquiry and Word Count (LIWC, pronounced “Luke”) program. Therein, the unedited and word-ordered text structure (the corpus) was analyzed against the LIWC default dictionary, outputting a percentage of words fitting psychological descriptors. Most importantly for this study, the percentage of words labeled with positive or negative affect (or emotion) were captured to enable quantifiable comparison between the VR and non-VR feedback transcripts.
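
LIWC itself is a proprietary program with its own validated dictionary, so the snippet below is only a toy Python illustration of the dictionary-based word-counting idea described above; the word lists and example transcript are invented placeholders, not LIWC's dictionary or output.

```python
# Toy illustration of dictionary-based sentiment word counting, in the spirit of
# LIWC's positive/negative affect categories. Word lists are invented placeholders.
POSITIVE = {"love", "nice", "sweet", "good", "intuitive", "comfortable"}
NEGATIVE = {"hurt", "nasty", "ugly", "confusing", "jaggy"}

def affect_percentages(transcript: str):
    """Return (% positive words, % negative words) for a feedback transcript."""
    words = [w.strip(".,!?;:\"'()").lower() for w in transcript.split()]
    words = [w for w in words if w]
    n = len(words)
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return 100 * pos / n, 100 * neg / n

# Hypothetical feedback sentence: ~18% positive words, ~9% negative words.
print(affect_percentages("The VR training felt intuitive and comfortable, not confusing at all."))
```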

3. Results

Safety Training Evaluation

Having created a bespoke VR safety training platform for the GSK ammonia offloading task, the value of this modern training approach could be formally assessed versus GSK’s existing training protocols. Crucial to this assessment was the bringing together of experimental methods which focus on psychological principles that are not yet commonly practiced in chemical health and safety training assessment (Figure 3). All results presented below are summarized in Figure 8 and Table 1.

Figure 8. Summary of the psychological assessment of VR versus non-VR (video slide-based) safety training. (a) Task-specific learning effect. (b) Perception of learning confidence. (c) Assessment of training presence. (d) VR system usability score. In parts b and c, * and ** represent statistically significant results with p < 0.05 and p < 0.001, respectively.

1. Task-Specific Learning Effect (Figure 8a)

Task-specific learning for the ammonia offload task was assessed using a questionnaire[z][aa][ab][ac] built upon official GSK training materials and marking schemes. Overall, test scores from the Control group and the VR group showed no statistical difference between groups. However, there was a tighter distribution[ad] around the mean score for the VR group than for the Control group.
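
The excerpt does not name the exact statistical test used to compare the groups' scores, but the pattern described (similar means, different spread) can be illustrated with a generic two-sample comparison on hypothetical data, as in the Python sketch below.

```python
# Rough illustration only: the excerpt does not specify the test used in the paper.
# This sketch compares two hypothetical sets of post-training test scores with an
# independent-samples (Welch) t-test and contrasts their spread.
import numpy as np
from scipy import stats

control_scores = np.array([55, 62, 70, 48, 88, 74, 65, 92, 58, 80])  # hypothetical
vr_scores      = np.array([68, 72, 65, 70, 74, 69, 71, 66, 73, 70])  # hypothetical

t_stat, p_value = stats.ttest_ind(control_scores, vr_scores, equal_var=False)
print(f"Control mean {control_scores.mean():.1f} (sd {control_scores.std(ddof=1):.1f})")
print(f"VR mean      {vr_scores.mean():.1f} (sd {vr_scores.std(ddof=1):.1f})")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # p > 0.05 here: no significant mean difference,
                                               # even though the VR spread is much tighter
```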

2. Perception of Learning Confidence (Figure 8b)

Participants’ perceived confidence of having gained new knowledge was assessed using a questionnaire composed of 8 statements, probing multiple aspects of the learning experience… Within a 95% confidence limit, the VR training method was perceived by participants to be significantly more fit for training purpose than video slides. VR also gave participants confidence that they could next perform the safety task alone[ae][af][ag]. Moreover, participants rated VR as having more potential than traditional slides for helping train in other complex tasks and to improve decision making skills (Figure 8b). Overall, participants from the VR group felt more confident and prepared for on-site training than those from the Control group.[ah]

3. Sense of Perceived Presence (Figure 8c)

The Sense of Presence questionnaire was used to gauge participants’ overall feeling of training involvement across four key dimensions. Results show that participants from the VR group reported experiencing a higher sense of presence than those from the Control group. On the fourth dimension, negative effects, participants from the Control group reported experiencing more negative effects than those from the VR group, but the result was not statistically different (Figure 8c). [ai]

4. Usability of the VR Training Platform (Figure 8d)

The System Usability Scale (SUS) was used to assess the effectiveness, intuitiveness, and satisfaction with which participants were able to achieve the task objectives within the VR environment. The average SUS score recorded was 79.559 (∼80, or grade A−), which placed our VR training platform on the edge of the top 10% of SUS scores (see Figure 5 for context). The SUS result indicated an overall excellent experience for participants in the VR group.

[Participants also] disagreed with any notion that the VR experience was too long (1.6 ± 0.7) and did not think it was too short (2.5 ± 1.1). Participants agreed that the simulation was stable and smooth (3.9 ± 1.2) and disagreed that it was in any way jaggy (2.3 ± 0.8). Hand-based interactions with the VR environment were agreed to be relatively intuitive (3.8 ± 1.3), and the head-mounted display was found to provide agreeable comfort for the duration of the training (4.0 ± 0.9).

5. Sentiment Analysis of Participant Feedback (Table 1)

In the final part of our study, we aimed to corroborate the formal statistical analysis against a quantitative analysis of open participant feedback. Using the text-based transcripts from both the Control and VR group participant feedback, the Linguistic Inquiry and Word Count (LIWC) tool provided further insight based on the emotional sentiment hidden in the plain text. VR participants were found to use more positively emotive words (4.2% of the VR training feedback corpus) versus the Control group (2.1% of the video training feedback corpus). More broadly, the VR group displayed a more positive emotive tone and used fewer negatively emotive words than the Control group.

Table 1. LIWC-Enabled Sentiment Analysis of Participant Feedback Transcripts

  • Word count (no. of words in the transcript): VR group 1493; non-VR group 984
  • Emotional tone (difference between positive and negative words; <50 = negative): VR group 83.1; non-VR group 24.2
  • Positive emotion (% positive words, e.g., love, nice, sweet): VR group 4.2%; non-VR group 2.1%
  • Negative emotion (% negative words, e.g., hurt, nasty, ugly): VR group 1.1%; non-VR group 2.2%

4. Discussion

Overall, using our transferable assessment workflow, the statistical survey analysis showed that task-specific learning was equivalent[aj][ak][al][am][an] for VR and non-VR groups. This suggests that the VR training in that specific context is not detrimental to learning and appears to be as effective as the traditional training modality but, crucially, with improved user investment in the training experience. However, the distribution difference between both training modalities suggests that the VR training provided a more consistent experience across participants than watching video slides, but more evaluation would be required to verify this.

In addition, perceived learning confidence and sense of perceived presence were both reported to be significantly better in the VR group than in the non-VR group. The reported differences in perceived learning confidence between participants from the two groups suggest that those from the VR group, despite having acquired a similar amount of knowledge, felt more assured about the applicability of that knowledge. These findings thus suggest that the VR training resulted in a more engaging and psychologically involving modality, able to increase participants’ confidence in their own learning.[ao][ap][aq] Further research will also aim to explore the applicability and validation of the perceived learning confidence questionnaire introduced in this investigation.

Additionally, VR system usability was quantifiably excellent, according to the SUS score and feedback text sentiment analysis.

Although our experimental data demonstrate the value of the VR modality for health and safety training in chemical manufacturing settings, the sampling, and more particularly the variation in digital literacy[ar] among participants, may be a limitation to the study. Therefore, future research should explore the training validity of the proposed approach involving a homogeneously digitally literate cohort of participants to more rigorously measure knowledge development between experimental conditions.

4.1. Implications for Chemical Health and Safety Training

By requiring learners to complete core protocols virtually in advance of real work, VR pretask training has the potential to address issues of complex learning, knowledge retention[as][at][au], training turnover times, and safety culture enhancements. Researchers in the Chemical and Petrochemical Sciences operate across an expansive range of sites, from small laboratories to pilot plants and refineries. Therefore, beyond valuable safety simulations and training exercises, outputs from this work are envisaged to breed applications where remote virtual or augmented assistance can alleviate the significant risks to staff on large-scale manufacturing sites.

4.2. Optimizing Resource-Intensive Laboratory Spaces

As a space category in buildings, chemistry laboratories are significantly more resource-intensive than office or storage spaces. The ability to deliver virtual chemical safety training, as demonstrated herein, could serve toward the consolidation and recategorization of laboratory space, minimizing the utility and space expenditures that threaten sustainability.[av][aw][ax][ay][az][ba][bb] By developing new chemistry VR laboratories, the high utility bills[bc][bd][be][bf] associated with running physical chemistry laboratories could potentially be reduced significantly.

4.3. Bridging Chemistry and Psychology

By bringing together psychological and computational assessments of safety training, the workflow applied herein could serve as a blueprint for future developments in this emerging multidisciplinary research domain. Indeed, the need to bring together chemical and psychological skill sets was highlighted in the aforementioned safety review by Trant and Menard.

5. Conclusions

Toward a higher standard of safety training and culture, we have described the end-to-end development of a VR safety training platform deployed in a dangerous chemical manufacturing environment. Using a specific process chemical case study, we have introduced a transferable workflow for the psychological assessment of an advanced training tool versus traditional slide-based safety training. This same workflow could conceivably be applied to training developments beyond safety.

Comparing our VR safety training versus GSK’s established training protocols, we found no statistical difference in the task-specific learning[bg] achieved in VR versus traditional slide-based training. However, statistical differences, in favor of VR, were found for participants’ positive perception of learning confidence and in their training presence (or involvement) in what was being taught. In sum, VR training was shown to help participants invest more in their safety training than a more traditional training setting does[bh][bi][bj][bk][bl].

Specific to the VR platform itself, the standard System Usability Scale (SUS) found that our development ranked as “A–” or 80%, placing it toward an “excellent” rating and well within the level of acceptance to deliver competent training.

Our ongoing research in this space is now extending into related chemical safety application domains.

[a]I would think the expense of VR could be seen as more “worth it” at this scale rather than at the lab scale given how much bigger and scarier emergencies can be (and how you really can’t “recreate” such an emergency in real life without some serious problems).

[b]Additionally, I suspect that in manufacturing there is more incentive to train up workers outside of the lecture/textbook approach. Many people are hands on learners and tend to move into the trades for that reason.

[c]I was also just about to make a comment that the budget for training and purchasing and upkeep for VR equipment is probably more negligible in those environments compared to smaller lab groups

[d]Jessica…You can create very realistic large scale simulations.  For example, I have simulated 50-100 barrel oil spills on water, in rivers, ponds and lakes, with really good results.

[e]Oh – this is a good point. What is not taken into account here is the comfort level people have with different types of learning. It would be interesting to know if Ph.D. level scientists and those who moved into this work through apprenticeship and/or just a BA would’ve felt differently about these training experiences.

[f]Neal – I wasn’t questioning that. I was saying that those things are difficult (or impossible) to recreate in real life – which is why being able to do a simulation would be more attractive for process scale than for lab scale.

[g]The skill level of the participants is not known. A pilot plant team spans very skilled technicians to PhD-level engineers and other scientists. I do not buy into Ralph’s observation.

[h]Jessica, I disagree.  Simulated or hands-on is really valuable for learning skills that require both conceptual understanding and muscle memory tasks.

[i]I’m not disagreeing. What I am saying is that if you want to teach someone how to clean up a spill, it is a lot easier to spill 50 mL of something in reality and have them practice cleaning it up, than it is to spill 5000 L of something and ask them to practice cleaning it up. Ergo, simulation is going to automatically be more attractive to those who have to deal with much larger quantities.

[j]And the question wasn’t about skill level. It was about comfort with different sorts of learning. You can be incredibly highly skilled and still prefer to learn something hands-on – or prefer for someone to give you a book to read. The educational levels were somewhat being used as proxies for this (i.e. to get a PhD, you better like learning through reading!).

[k]Videos

The following YouTube links provide representative examples of:

i. The ammonia offload training introduction video; https://youtu.be/30SbytSHbrU

ii. The VR training walkthrough; https://youtu.be/DlXu0nTMCPQ

iii. The GSK ammonia tank farm fly-through (i.e., the digital twin used in the VR training);

iv. The video slide training video; https://youtu.be/TZxJDJXVPgM

[l]Awesome – thank you for adding this.

[m]Very helpful. Thank you

[n]I’d be curious to see how a case 3 of video lecture then VR training compares, because this would have a focused educational component then focused skill and habit development component

[o]I’d be interested to see how in person training would compare to these.

[p]This would also open up a can of worms. Is it in-person interactive learning? Is it in-person hands-on learning? Or is it sitting in a classroom watching someone give a PowerPoint in-person learning?

[q]I was imagining hands-on learning or the reality version of the VR training they did, so they could inspect how immersive the training is compared to the real thing. Comparison to an interactive training could also have been interesting.

[r]I was about to add that I feel like comparison to interactive in-person training would’ve been good to see. I tend to think of VR as the same as an in-person training, just digital.

[s]That’s why I think it could be interesting. They could see if there is in fact a difference between VR and hands-on training. Then, if there was none, you would have an argument for cost savings in space and personnel.

[t]Comparing these two very different methods is problematic. If you are trying to assess the added immersive value of VR, it needed to be compared to another ‘active learning’ method such as computer simulation.

[u]Wouldn’t VR training essentially be a form of computer simulation? I was thinking that the things being compared were 2 trainings that the employees could do “on their own”. At the moment, the training done on their own is typically some sort of recorded slideshow. So they are comparing to something more interactive that is also something that the employee can do “on their own.”

[v]A good comparison could have been VR with the headset and controllers in each hand compared to a keyboard and mouse simulation where you can select certain options. More like Oregon Trail.

[w]I’ve never seen this graphic before, but I love it! Excel is powerful but user-hostile, particularly if you try to share your work with someone else.

Of course, Google’s and Amazon’s platforms are cash cows for them, so they have an incentive to optimize System Usability (partially to camouflage their commercial interest in what they are selling).

[x]I found this comparison odd. It is claiming to measure a computer system’s fitness for purpose. Amazon’s purpose for the user is to get people to buy stuff. While it may be complex behind the scenes, it has a very simple and pretty singular purpose. Excel’s purpose for the user is to be able to do loads of different and varied complex tasks. They are really different animals.

[y]Yes, I agree that they are different animals. However, understanding the limits of the two applications requires more IT education than most people get. (Faculty routinely comment that “this generation of students doesn’t know how to use computers”.) As a result, Excel is commonly used for tasks that it is not appropriate for. But you’re correct that Amazon’s and Google’s missions are much simpler than the tasks Excel is used for.

[z]I’d be very interested to know what would have happened if the learners were asked to perform the offload task.

[aa]Yes! I just saw your comment below. I wonder if they are seeing no difference here because they are testing on a different platform than the one they trained on. It would be interesting to compare written and in-practice results for each group. Maybe the VR group would be worse at the written test but better at physically doing the tasks.

[ab]This could get at my concerns about the Dunning-Kruger effect that I mentioned below, as well. Just because someone feels more confident that they can do something doesn’t mean that they are correct about that. It definitely would’ve been helpful to actually have the employees perform the task and see how the two groups compared. Especially since the purpose of training isn’t to pass a test – it is to actually be able to DO the task!

[ac]“Especially since the purpose of training isn’t to pass a test – it is to actually be able to DO the task!” – this

[ad]If I’m understanding the plot correctly, it looks like there is a higher skew in the positive direction for the control group, which is interesting, i.e., lower but also higher scores. The VR training seems to have an evening-out effect, which makes the results of the training more predictable.

[ae]I’m slightly alarmed that VR gives people the confidence to perform the tasks alone when there was no statistical difference in the task-specific learning scores compared to the control group. VR seems to give a false sense of confidence.

[af]I had the same reaction to reading this. I don’t believe a false sense of confidence in performing a task alone is a good thing. Confidence in your understanding is great, but overconfidence in physically performing something can definitely lead to more accidents.

[ag]A comment I’ve seen is that the VR trainees may perform better at doing the action than the control group, but the study only gauged their knowledge in an exam style, not by actually performing the task they are trained to do. But regardless, a false confidence would not be good.

[ah]Wonder if there is concern that this creates a false confidence given the exam scores.

[ai]Wow, these are big differences. There is basically no overlap in the error bars except in the negative effect dimension.

[aj]If the task-specific learning was equivalent, but one method caused people to express greater confidence than the other, is this necessarily a good thing? Taken to an extreme perhaps, wouldn’t we just be triggering the Dunning-Kruger effect?

[ak]I’d be interested in the value of the VR option as refresher training, similar to the way that flight simulators are used for pilots. Sullenberger attributed surviving the bird strike after takeoff from LaGuardia to lots of simulator time allowing him to react functionally rather than emotionally in the 30 seconds after the birds were hit.

[al]I had the same comment above. I was wondering if people should be worried about false confidence. But the problem is that the assessment was on paper; they may be better at physically doing the tasks or responding in real time, and that was not tested.

[am]Exactly, Kali!

[an]From Ralph’s comment, I think there is a lot of value in VR for training for and simulating emergency situations without any of the risks.

[ao]Again, this triggers the question: should they be more confident in this learning?

[ap]I don’t think the available data let us distinguish between an overconfident test group and an under-confident control group (or both).

[aq]I am curious what the standard in the field is for determining at what point learners are overconfident. For example, are these good scores? Is there a correlation between higher test scores and confidence, or an inverse one, meaning false confidence?

[ar]I am curious as well how much training was needed to explain how the VR setup worked. That is a huge factor in how easy people will perceive the VR setup to be if someone walks them through it slowly, versus walking up to a training station and being expected to know how to use it.

[as]I suspect that if people are more engaged in the VR training, they would have better retention of the training over time; that would be interesting to explore. If so, it would be a great argument for VR.

[at]Right, and if places don’t have the capacity to offer hands-on training, an alternative to online-only is VR. Or in high-hazard work where it’s not advisable to train under real conditions.

[au]I agree that there is a lot of merit for VR in high-hazard or emergency response simulation and training.

[av]I’m personally more interested in the potential to use augmented reality in an empty lab space to train researchers to work with a variety of hazards/operations.

[aw]Right, this whole time I wasn’t thinking about replacing labs; I was thinking about adding a desk space for training. I am still curious about the logistics, though: in the era of COVID, will people really be comfortable sharing equipment, and how will it be maintained? That all costs money as well.

[ax]This really would open up some interesting arguments. Would learning how to do everything for 4 years in college by VR actually translate to an individual being able to walk into a real lab and work confidently?

[ay]During COVID, they were able to do some portion of the labs virtually – like moving things, etc. From a personal standpoint, I’d say no, and like we talked about here, it would give a false sense of confidence, which could be detrimental. It’s the same way I feel about “virtual fire extinguisher training.” I don’t think it provides any of the necessary exposure.

[az]Amanda, totally agree. The virtual fire extinguisher training does not provide the feel (heat), smell (fire & agent), or sound of a live fire exercise.

[ba]Oh wait – virtual training and “virtual reality training” are pretty different concepts. I agree that virtual training would never be able to substitute for hands-on experience. However, what VR training has been driving at for years is to try to create such a realistic environment within which to perform the training that it really does “feel like you are there.” I’m not sure how close we are to that. In my experiences with VR, I haven’t seen anything THAT good. But I likely haven’t been exposed to the real cutting edge.

[bb]Jessica, I wouldn’t recommend that AR/VR ever be used as the only training provided,  but I suspect that it could shorten someone’s learning curve.

Amanda and Neal, having done both, I think that both have their advantages. The heat of the fire and kickback from the extinguisher aren’t well-replicated, but I actually think that the virtual extinguisher can do a better job of simulating the difficulty. When I’ve done the hands-on training, you’d succeed in about 1-2 seconds as long as you pointed the extinguisher in the general vicinity of the fire.

[bc]Utility bills for lab spaces are higher than average, but a small fraction of the total costs of labs. At Cornell, energy costs were about 5% of the cost of operating the building when you took the salaries of the people in the building into account. This explains why utility costs are not as compelling to academic administrators as they might seem. However, if there are labor cost savings associated with VR…

[bd]I’d love to see a breakdown of the fixed cost of purchasing the VR equipment, and the incremental cost of developing each new module.

[be]Ralph, it is so interesting to see the numbers; I had no idea it was that small an amount. I suppose the argument might need to be wasted energy/environmental considerations then, rather than cost savings.

[bf]Yes, that is why I had a job at Cornell – a large part of the campus community was very committed to moving toward carbon neutrality and supported energy conservation projects even with long payback periods (e.g. 7 years).

[bg]It seems like it would be more helpful to have both a paper test and a field test to see if VR helped with physically doing the tasks, since that is the benefit that I see. In addition, looking at retention over time would be important. Otherwise the case for VR is harder to make if it’s not increasing the desired knowledge and skills.

[bh]How much of the statistical favoring of VR was due to the VR platform being “cool” and “new” versus the participants’ familiarity with traditional approaches? I do not see any control in the document for this.

[bi]I agree that the work as reported seems to demonstrate that the platform provides an acceptable experience for the learner, but it’s not clear whether the skills are acquired or retained.

[bj]Three of the four authors are in the VR academic environment; one is in a chemistry department. There seems to be a real bias toward showing that VR works in this context.

[bk]One application I think this could help with is one I experienced today. I had to sit through an Underground Storage Tank training which was similar to the text-heavy PowerPoint in the video. If I had been able to self-direct my tour of the tank and play with the valves and detection systems, I would have been able to meet the regulatory requirements of understanding the system well enough to oversee its operations, but not well enough to have hands-on responsibilities. The hands-on work is left to the contractors who work on the tanks every day.

[bl]We have very good results using simulators in the nuclear power and aviation industries. BUT, there is still significant learning that occurs when things become real. The most dramatic example is landing high-performance aircraft on carrier decks. Every pilot agrees that nothing prepares them for the reality of this controlled crash.

Positive Feedback for Improved Safety Inspections

Melinda Box, MEd, presenter
Ciana Paye, &
Maria Gallardo-Williams, PhD 
North Carolina State University

Melinda’s PowerPoint can be downloaded from here.

03/17 Table Read for The Art & State of Safety Journal Club

Excerpts from “Positive Feedback via Descriptive Comments for Improved Safety Inspection Responsiveness and Compliance”

The full paper can be found here: https://pubs.acs.org/doi/10.1021/acs.chas.1c00009

Safety is at the core of industrial and academic laboratory operations worldwide and is arguably one of the most important parts of any enterprise since worker protection is key to maintaining on-going activity.28 At the core of these efforts is the establishment of clear expectations regarding acceptable conditions and procedures necessary to ensure protection from accidents and unwanted exposures[a]. To achieve adherence to these expectations, frequent and well-documented inspections are made an integral part of these systems.23 

Consideration of the inspection format is essential to the success of this process.31 Important components to be mindful about include frequency of inspections, inspector training, degree of recipient involvement, and means of documentation[b][c][d][e]. Within documentation, the form used for inspection, the report generated, and the means of communicating, compiling, and tracking results all deserve scrutiny in order to optimize inspection benefits.27

Within that realm of documentation, inspection practice often depends on the use of checklists, a widespread and standard approach.  Because checklists make content readily accessible and organize knowledge in a way that facilitates systematic evaluation, they are understandably preferred for this application.  In addition, checklists reduce frequency of omission errors and, while not eliminating variability, do increase consistency in inspection elements[f][g][h][i][j] among evaluators because users are directed to focus on a single item at a time.26 This not only amplifies the reliability of results, but it also can proactively communicate expectations to inspection recipients and thereby facilitate their compliance preceding an inspection.

However, checklists do have limitations.  Most notably, if items on a list cover a large scale and inspection time is limited, reproducibility in recorded findings can be reduced[k]. In addition, individual interpretation and inspector training and preparation can affect inspection results[l][m][n][o][p][q].11 The unfortunate consequence of this variation in thoroughness is that without a note of deficiency there is not an unequivocal indication that an inspection has been done. Instead, if something is recorded as satisfactory, the question remains whether the check was sufficiently thorough or even done at all.  Therefore, in effect, the certainty of what a checklist conveys becomes only negative feedback[r][s][t].

Even with uniformity of user attention and approach, checklists risk producing a counterproductive form of tunnel vision[u] because they can tend to discourage recognition of problematic interactions and interdependencies that may also contribute to unsafe conditions.15 Also, depending on format, a checklist may not provide the information on how to remedy issues nor the ability[v][w] to prioritize among issues in doing follow-up.3 What’s more, within an inspection system, incentive to pursue remedy may only be the anticipation of the next inspection, so self-regulating compliance in between inspections may not be facilitated.[x][y][z][aa][ab][ac]22

Recognition of these limitations necessitates reconsideration of the checklist-only approach, and as part of that reevaluation, it is important to begin with a good foundation.  The first step, therefore, is to establish the goal of the process.  This ensures that the tool is designed to fulfill a purpose that is widely understood and accepted.9 Examples of goals of an environmental health and safety inspection might be to improve safety of surroundings, to increase compliance with institutional requirements, to strengthen preparedness for external inspections, and/or to increase workers’ awareness and understanding of rules and regulations.  In all of these examples, the aim is to either prompt change or strengthen existing favorable conditions.  While checklists provide some guidance for change, they do not bring about that change, [ad][ae]and they are arguably very limited when it comes to definitively conveying what favorable conditions to strengthen.  The inclusion of positive feedback fulfills these particular goals.

A degree of skepticism and reluctance to actively include a record of positive observations[af][ag] in an inspection is understandable since negative feedback can more readily influence recipients toward adherence to standards and regulations.  Characterized by correction and an implicit call to remedy, it leverages the strong emotional impact of deficiency to encourage limited deviation from what has been done before.19 However, arousal of strong negative emotions, such as discouragement, shame, and disappointment, also neurologically inhibits access to existing neural circuits thereby invoking cognitive, emotional, and perceptual impairment.[ah][ai][aj]10, 24, 25 In effect, this means that negative feedback can also reduce the comprehension of content and thus possibly run counter to the desired goal of bringing about follow up and change.2

This skepticism and reluctance may understandably extend to even including positive feedback with negative feedback since affirmative statements do not leave as strong of an impression as critical ones.  However, studies have shown that the details of negative comment will not be retained without sufficient accompanying positive comment.[ak][al][am][an][ao][ap][aq]1[ar][as] The importance of this balance has also been shown for workplace team performance.19 The correlation observed between higher team performance and a higher ratio of positive comments in the study by Losada and Heaphy is attributed to an expansive emotional space, opening possibilities for action and creativity.  By contrast, lower performing teams demonstrated restrictive emotional spaces as reflected in a low ratio of positive comments.  These spaces were characterized by a lack of mutual support and enthusiasm, as well as an atmosphere of distrust and [at]cynicism.18

The consequence of positive feedback in and of itself also provides compelling reason to regularly and actively provide it. Beyond increasing comprehension of corrections by offsetting critical feedback, affirmative assessment facilitates change by broadening the array of solutions considered by recipients of the feedback.13 This dynamic occurs because positive feedback adds to employees’ security and thus amplifies their confidence to build on their existing strengths, thus empowering them to perform at a higher level[au].7

Principles of management point out that to achieve high performance employees need to have tangible examples of right actions to take[av][aw][ax][ay][az], including knowing what current actions to continue doing.14, 21 A significant example of this is the way that Dallas Cowboys football coach, Tom Landry, turned his new team around.  He put together highlight reels for each player that featured their successes.  That way they could identify within those clips what they were doing right and focus their efforts on strengthening that.  He recognized that it was not obvious to natural athletes how they achieved high performance, and the same can be true for employees and inspection recipients.7 

In addition, behavioral science studies have demonstrated that affirmation builds trust and rapport between the giver and the receiver[ba][bb][bc][bd][be][bf][bg].6 In the context of an evaluation, this added psychological security contributes to employees revealing more about their workplace which can be an essential component of a thorough and successful inspection.27 Therefore, positive feedback encourages the dialogue needed for conveying adaptive prioritization and practical means of remedy, both of which are often requisite to solving critical safety issues.[bh]

Giving positive feedback also often affirms an individual’s sense of identity in terms of their meaningful contributions and personal efforts. Acknowledging those qualities, therefore, can amplify them.  This connection to personal value can evoke the highest levels of excitement and enthusiasm in a worker, and, in turn, generate an eagerness to perform and fuel energy to take action.8 

Looked at from another perspective, safety inspections can feel personal. Many lab workers spend a significant amount of time in the lab, and as a result, they may experience inspection reports of that setting as a reflection on their performance rather than strictly an objective statement of observable work. Consideration of this affective impact of inspection results is important since, when it comes to learning, attitudes can influence the cognitive process.29  Indeed, this consideration can be pivotal in transforming a teachable moment into an occasion of learning.4 

To elaborate, positive feedback can influence the affective experience in a way that can shift recipients’ receptiveness to correction[bi][bj].  Notice of inspection deficiencies can trigger a sense of vulnerability, need, doubt, and/or uncertainty. At the same time, though, positive feedback can promote a sense of confidence and assurance that is valuable for active construction of new understanding.  Therefore, positive feedback can encourage the transformation of the intense interest that results from correction into the contextualized comprehension necessary for successful follow-up to recorded deficiencies.  

Overall, then, a balance of both positive and negative feedback is crucial to ensuring adherence to regulations and encouraging achievement.[bk]2, 12 Since positive emotion can influence how individuals respond to and take action from feedback, the way that feedback is formatted and delivered can determine whether or not it is effective.20  Hence, rooting an organization in the value of positivity can reap some noteworthy benefits including higher employee performance, increased creativity, and an eagerness in employees to engage.6

To gain these benefits, it is, therefore, necessary to expand from the approach of the checklist alone.16 Principles of evaluation recommend a report that includes both judgmental and descriptive information.  This will provide recipients with the information they seek regarding how well they did and the successfulness of their particular efforts.27 Putting the two together creates a more powerful tool for influence and for catalyzing transformation.

In this paper the authors would like to propose an alternative way to format safety inspection information, in particular, to provide feedback that goes beyond the checklist-only approach. The authors’ submission of this approach stems from their experiences of implementing a practice of documenting positive inspection findings within a large academic department. They would like to establish, based on educational and organizational management principles, that this practice played an essential role in the successful outcome of the inspection process in terms of corrections of safety violations, extended compliance, and user satisfaction. [bl][bm][bn][bo]

[a]There is also a legal responsibility of the purchaser of a hazardous chemical (usually the institution, at a researcher’s request) to assure it is used in a safe and healthy manner for a wide variety of stakeholders

[b]I would add what topics will be covered by the inspection to this list. The inspection programs I was involved in/led had a lot of trouble deciding whether the inspections we conducted were for the benefit of the labs, the institution or the regulatory authorities. Each of these stakeholders had a different set of issues they wanted to have oversight over. And the EHS staff had limited time and expertise to address the issues adequately from each of those perspectives.

[c]This is interesting to think about. As LSTs have taken on peer-to-peer inspections, they have been using them as an educational tool for graduate students. I would imagine that, even with the same checklist, what would be emphasized by an LST team versus EHS would end up being quite a bit different as influenced by what each group considered to be the purpose of the inspection activity.

[d]Hmm, well maybe the issue is trying to cram the interests and perspectives of so many different stakeholders into a single, annual, time-limited event 😛

[e]I haven’t really thought critically about who the stakeholders of an inspection are and whom they serve. I agree with your point, Jessica, that the EHS and LST inspections would be different and focus on different aspects. I feel inclined to ask my DEHS rep for his inspection checklist and compare it to ours.

[f]This is an aspirational goal for checklists, but it is not often well met. This is because laboratories vary so much in the way they use chemicals that a one-size-fits-all checklist does not work. This is another reason that the narrative approach suggested by this article is so appealing.

[g]We have drafted hazard-specific walkthrough rubrics that help address that issue, but a lot of effort went into drafting them, and you need someone with expertise in that hazard area to properly apply them.

[h]Well I do think that, even with the issues that you’ve described, checklists still work towards decreasing variability.

In essence, I think of checklists as directing the conversation and providing a list of things to make sure that you check for (if relevant). Without such a document, the free-form structure WOULD result in more variability and missed topics.

Which is really to say that, I think a hybrid approach would be the best!

[i]I agree that a hybrid approach is ideal, if staff resources permit. The challenge is that EHS staff are responding to a wide variety of lab science issues and have a hard time being confident that they are qualified to raise issues. Moving from major research institutions to a PUI, I finally feel like I have the time to provide support to not only raise concerns but help address them. At Keene State, we do modern science, but not exotic science.

[j]I feel like in the end, it always comes down to “we just don’t have the resources…”

Can we talk about that for a second? HOW DO WE GET MORE RESOURCES FOR EHS  : /

[k]I feel like that would also be the case for a “descriptive comment” inspection if the time is limited. Is there a way that the descriptive method could improve efficiency while still covering all inspection items?

[l]This is something my university struggles with in terms of a checklist. There’s quite a bit of variability between inspections in our labs done by different inspectors. Some inspectors will catch things that others would overlook. Our labs are vastly different but we are all given the same checklist – Our checklist is also extensive and the language used is quite confusing.

[m]I have also seen labs with poor safety habits use this to their advantage. I’ve known some people who strategically put a small problem front and center so that they can be criticized for that. Then the inspector feels like they have “done their job,” doesn’t go looking for more, and misses the much bigger problem(s) just out of sight.

[n]^ That just feels so insidious. I think the idea of the inspector looking for just anything to jot down to show they were looking is not great (and something I’ve run into), but you want it to be because they CAN’T find anything big, not just because it was hidden.

[o]Amanda, our JST has had seminars to help prepare inspectors and show them what to look for. We also include descriptions of what each inspection score should look like to help improve reproducibility.

[p]We had experience with this issue when we were doing our LSTs Peer Lab Walkthrough! For this event, we have volunteer graduate students serve as judges to walk through volunteering labs to score a rubric. One of the biggest things we’ve had to overhaul over the years is how to train and prepare our volunteers.

So far, we’ve landed on providing training, practice sessions, and using a TEAM of inspectors per lab (rather than just one). These things have definitely made a huge difference, but it’s also really far from addressing the issue (and this is WITH a checklist)

[q]I’d really like to go back to peer-to-peer walkthroughs and implement some of these ideas. Unfortunately, I don’t think we are there yet in terms of the pandemic for our university and our grad students being okay with this. Brady, did you mean you train the EHS inspectors or for JST walkthroughs? I’ve given advice after seeing some labs that have gone through inspections, but a lot of items that were red flags to me (and to the grad students, who ended up not pointing them out) were missed / not recorded.

[r]This was really interesting to read, because when I was working with my student safety team to create and design a Peer Lab Walkthrough, this was something we tried hard to get around even though we didn’t name the issue directly. 

We ended up making a rubric (which seems like a form of checklist) to guide the walkthrough and create some uniformity in responding, but we decided to make it be scored 1-5 with a score of *3* representing sufficiency. In this way, the rubric would both include negative AND positive feedback that would go towards their score in the competition.

[s]I think the other thing that is cool about the idea of a rubric is that there might be space for comments, but by already putting everything on a scale, it can give an indication that things are great/just fine without extensive writing!

[t]We also use a rubric but require a photo or description for scores above sufficient to further highlight exemplary safety.

[u]I very much saw this in my graduate school lab when EHS inspections were done. It was an odd experience for me because it made me feel like all of the things I was worried about were normal and not coming up on the radar for EHS inspections.

[v]I think this type of feedback would be really beneficial. From my experience (and hearing others’), we do not get any feedback on how to fix issues. You can ask for help to fix the issues, but sometimes the recommendations don’t align with the lab’s needs / why it is an issue in the first place.

[w]This was a major problem in my 1st research lab with the USDA. We had safety professionals who showed up once per year for our annual inspection (they were housed elsewhere). I was told that a setup we had wasn’t acceptable. I explained why we had things set up the way we did, then asked for advice on how to address the safety issues raised. The inspector literally shrugged his shoulders and gruffly said to me “that’s your problem to fix.” So – it didn’t get fixed. My PI had warned me about this attitude (so this wasn’t a one-off), but I was so sure that if we just had a reasonable, respectful conversation….

[x]I think this is a really key point. We had announced inspections and were aware of what was on the checklists. As an LSO, we would get the lab in tip-top shape in the week leading up to it. Fortunately, we didn’t always have a ton of stuff to fix in that week, but it’s easy to imagine that reducing things to a yes/no checklist can mask long-term problems that are common in between inspections.

[y]My lab is similar to this. When I first joined and went through my first inspection, the week prior was awful, trying to correct everything. I started implementing “clean-ups” every quarter because my lab would just go back to how it was after the inspection.

[z]Same

[aa]Our EHS was short-staffed and ended up doing “annual inspections” 3 years apart – one was before I joined my lab, and the next was ~1.5 years in. The pictures of the incorrect things found turned out to be identical to the inspection report from 3 years prior. Not just the same issues, but quite literally the same bottles/equipment.

[ab]Yeah, we had biannual lab cleanups that were all day events, and typically took place a couple months before our inspections; definitely helped us keep things clean.

One other big thing is that when we swapped to Subpart K waste handling (we can’t keep waste longer than 12 months), we just started disposing of all waste every six months so that nothing could slip through the cracks before an inspection.

[ac]Jessica, you’re talking about me and I don’t like it XD

[ad]I came to feel that checklists were for things that could be reviewed when no one was in the room and that narrative comments would summarize conversations about inspectors’ observations. These are usually two very different sets of topics.

[ae]We considered doing inspections “off-hours” to focus on the checklist items, until we realized there was no such thing as off-hours in most academic labs. Yale EHS found many more EHS problems during third-shift inspections than daytime inspections.

[af]This also feels like it would be a great teachable moment for those newer to the lab environment. We often think that if no one said anything, it must be okay, even if we do not understand why it is okay. Elucidating that something is being done right “and this is why” is really helpful to both teach and reinforce good habits.

[ag]I think that relates well to the football example of highlighting the good practices to ensure they are recognized by the player or researcher.

[ah]I noticed that not much was said about complacency, which I think can be both a consequence of an overload of negative feedback and an issue in and of itself stemming from culture. And I think positive feedback and encouragement could combat both of these contributors to complacency!

[ai]Good point. It could also undermine the positive things that the lab was doing because they didn’t realize that what they were previously doing was actually positive. So complacency can lead to the loss of things that you were doing right before.

[aj]I have seen situations where labs were forced to abandon positive safety efforts because they were not aligned with institutional or legal expectations. Compliance risk was lowered, but physical risk increased.

[ak]The inclusion of both positive and negative comments shows that the inspector is being more thorough which to me would give it more credibility and lead to greater acceptance and retention of the feedback.

[al]I think they might be describing this phenomenon a few paragraphs down when they talk about trust between the giver and receiver.

[am]This is an interesting perspective. I have found that many PIs don’t show much respect for safety professionals when they deliver bad news. Perhaps the delivery of good news would help to reinforce the authority and knowledge base of the safety professionals – so that when they do have criticism to deliver, it will be taken more seriously.

[an]The ACS Pubs advice to reviewers of manuscripts is to describe the strengths of the paper before raising concerns. As a reviewer, I have found that approach very helpful because it makes me look for the good parts of the article before looking for flaws.

[ao]The request for corrective action should go along with a discussion with the PI of why change is needed and practical ways it might be accomplished.

[ap]I worked in a lab that had serious safety issues when I joined. Wondering if the positive-negative feedback balance could have made a difference in changing the attitudes of the students and the PI.

[aq]It works well with PIs with an open mind; but some PIs have a sad history around EHS that has burnt them out on trying to work with EHS staff. This is particularly true if the EHS staff has a lot of turnover.

[ar]Sandwich format: the negative (filling) between two positives (bread).

[as]That crossed my mind right away as well.

[at]I wonder how this can contribute to a negative environment around lab-EHS interactions. If there is only negative commentary (or the perception of that) flowing in one direction, it would seem that it would have an impact on that relationship.

[au]I see how providing positive feedback along with the negative feedback could help do away with the feeling the inspector is just out to get you. Instead, they are here to provide help and support.

[av]This makes me consider how many “problem” examples I use in the safety training curriculum.

[aw]This is a good point. From a learning perspective, I think it would be incredibly helpful to see examples of things being done right – especially when what is being shown is a challenge and is in support of research being conducted.

[ax]I third this so hard! It was also something that I would seek out a lot when I was new but had trouble finding resources on. I saw a lot of bad examples—in training videos, in the environments around me—but I was definitely starved for a GOOD example to mimic.

[ay]I read just part of the full paper. It includes examples of positive feedback and an email that was sent. The examples helped me get the flavor of what the authors were doing.

[az]This makes a lot of sense to me – especially from a “new lab employee” perspective.

[ba]Referenced in my comment a few paragraphs above.

[bb]I suspect that another factor in trust is the amount of time spent in a common environment. Inspectors don’t have a lot of credibility if they only visit annually a place where a lab worker spends every day.

[bc]A lot of us do not know or recognize our inspectors by name or face (we’ve also had so much turnover in EHS personnel and don’t even have a CHO). I honestly would not be able to tell you who does inspections if not for being on our university’s LST. This has occurred in our lab (an issue of trust) during a review of our radiation safety inspection. The waste was not tested properly, so we were questioned on our rad waste. It was later found that the inspector unfortunately didn’t perform the correct test for the pH. My PI did not take it very well after having to correct them about this.

[bd]I have run into similar issues when inspectors and PIs do not take the time to have a conversation, but connect only by e-mail. Many campuses have inspection tracking apps, which make this too easy a temptation for both sides.

[be]Not only a conversation, but inspectors can be actively involved in improving conditions – e.g., supplying signs, PPE, similar SOPs…

[bf]Yes, sometimes it’s amazing how little money it can take to show a good-faith effort from EHS to support their recommendations. Other times, if it is a capital cost issue, EHS is helpless to assist even if there is an immediate concern.

[bg]I have found inspections where the EHS staff openly discuss issues and observations as they are making the inspection very useful. It gives me the chance to ask the million questions I have about safety in the lab.

[bh]We see this in our JST walkthroughs. We require photos or descriptions for above-acceptable scores and have some open-ended discussion questions at the end to highlight those exemplary safety protocols and to address areas that might have been lacking.

[bi]Through talking with other student reps, we usually never know what we are doing “right” during the inspections. I think sharing positive feedback would benefit other labs that might have been marked on their inspection checklist for that same item, giving them a resource on how to correct issues.

[bj]I think this is an especially important point in an educational environment. Graduate students are there to learn many things, such as how to maintain a lab. It is weird to be treated as if we should already know – and get no feedback about what is being done right and why it is right. I often felt like I was just sort of guessing at things.

[bk]At UVM, we hosted an annual party where we reviewed the most successful lab inspections. This was reasonably well received, particularly by lab staff who were permanent and wanted to know how others in the department were doing on the safety front.

[bl]This can be successful, UNTIL a regulatory inspection happens that finds significant legal concerns related primarily to paperwork issues. Then upper management is likely to say “enough of the nice guy approach – we need to stop the citations.” Been there, done that.

[bm]Fundamentally, I think there needs to be two different offices. One to protect the people, and one to protect the institution *huff huff*

[bn]When that occurs, there is a very large amount of friction between those offices. And the Provost ends up being the referee. That is why sometimes there is no follow up to an inspection report that sounds dire.

[bo]But I feel like there is ALREADY friction between these two issues. They’re just not as tangible and don’t get as much attention as they would if you have two offices directly conflicting.

These things WILL conflict sometimes, and I feel like we need a champion for both sides. It’s like having a union. Right now, the institution is the only one with a real hand in the game, so right now that perspective is the one that will always win out. It needs more balance.