October 2021
Volume 16 • Number 9
LabManager.com
BIG DATA REVOLUTIONIZED THE LAB MANAGER'S ROLE—AI IS ABOUT TO DO IT AGAIN
Optimizing Lab Assets
Scalable Lab Monitoring
— TRF and Alpha lasers for speed and improved sensitivity
— Patented Hybrid Technology with independent filter- and monochromator-based optics for performance and flexibility
— Variable bandwidth selection for optimized fluorophore specificity and sensitivity
— Ultra-fast plate processing speeds with multiple PMT detectors
— Microplate incubation to 70 °C and CO2/O2 gas control option
There can only be one Highest-Performance reader...
and it’s BioTek’s Synergy Neo2, the most advanced,
high-performance, high-speed plate reader on the
market today. Designed to meet the sophisticated
needs of laboratories, the fully featured and flexible
Synergy Neo2 offers uncompromising performance for
cell-based and biochemical assays.
To learn more about Neo2, visit www.biotek.com/neo2
www.biotek.com
Optimize every lab.
Every way. Every day.
Maximize lab efficiency with scalable,
end-to-end solutions from Avantor® Services.
Scientific discovery is time-consuming and resource-intensive. Efficiency and speed to market are critical in
advancing your scientific efforts. You need a partner you can trust to deliver scalable solutions to help optimize
your lab operations and advance your research.
From one-stop procurement to inventory management to technical lab support, Avantor Services offers the
deep expertise and global solutions you need to streamline processes in every workflow, create peak efficiencies,
eliminate bottlenecks and free high-value resources through:
— Lab & production services with on- and offsite support for general and technical lab
services, procurement, production and business
process consulting
— Equipment services offering full equipment
life cycle management from identification to
decommissioning, including multivendor technical
services and certification
— Procurement and sourcing services providing
simplified, one-stop procurement process, expert
product advice and responsive fulfillment
— Digital solutions including hardware, software and
advanced analysis solutions for every stage of lab
management
— Clinical services with a team of experts delivering
custom kitting service, biorepository and archiving
services, and equipment and ancillary solutions,
anywhere in the world
— Commercial kitting & services providing global
expertise, logistical support and scalable kitting
solutions
Move research and discovery forward with Avantor
avantorsciences.com/avantor-services
contents
October 2021
LabManager.com
feature
10 The Lab of the Future: Big Data Enables a Big Role for AI
Big data revolutionized the lab manager's role—AI is about to do it again.
Sridhar Iyengar

labs less ordinary
14 The Data Science for Social Good Lab
Researchers strive to improve the world through big data.
Lauren Everett

business management
18 How to Effectively Communicate with Non-Scientists
Learning the “languages” of other departments outside the lab is an important component of business success.
Sherri L. Bassner

asset management
20 Optimizing the Utilization of Lab Assets
Analyzing data to improve the availability and effectiveness of instruments and equipment in the lab.
Scott D. Hanton

leadership & staffing
22 Creating an Inclusive Academic Research Culture
How key lessons from Monopoly can be applied to promote equity and limit hypercompetition in academia.
Ane Metola Martinez, Claire Lyons, Katharina Herzog, Mathew Tata, Natalie von der Lehr, and Shruti Jain

28 Strategic Teams and Lab Culture
A lab’s culture has a strong influence over the success of an organization’s strategic agenda.
Daniel Wolf and Lydia Werth

lab design
32 Designing for the Unknown
Flexible lab design plans accommodate current needs, future possibilities.
Jeffrey Zynda and Adana Johns

36 A Beacon for Life Sciences
University of Michigan BSB wins Excellence in Innovation prize in 2021 Lab Design Excellence Awards.
MaryBeth DiDonna

health & safety
42 Building Habits through Safety Activities
Participation in safety activities from staff and management improves safety culture.
Tabi Thompson
FLEXIBLE.
SENSITIVE.
DEPENDABLE.
CLARIOstar® Plus
The CLARIOstar® Plus multimode microplate reader
streamlines assay development
and validation by combining
monochromator flexibility with
best-in-class sensitivity.
www.bmglabtech.com
©2021 All rights reserved. All logos and trademarks are the property of BMG LABTECH.
· LVF Monochromators™ for highest sensitivity
· EDR technology eliminates the need for gain adjustment
· Dedicated detectors for luminescence and red fluorescence
· Best performance in TRF, TR-FRET, FP and AlphaScreen®
· Atmospheric control with gas ramping function
· Made-in-Germany dependability
laboratory product reports

QUALITY & COMPLIANCE DIGITAL SUMMIT
Lab managers must protect the quality of their data, research, and projects by being familiar with the procedures and regulations under which they are governed. Lab managers can utilize different tools and ideas to make their jobs easier and to help them meet or exceed their required standards. Join Lab Manager's Quality and Compliance Digital Summit Oct. 20-21 as seasoned quality managers discuss the best resources lab managers need to achieve maximum compliance for their labs.
Learn more: summit.labmanager.com/qualityandcompliance

DEPARTMENTS

manager minute
09 Three Keys to More Effective Project Planning in the Lab
Better project planning leads to better outcomes and results.
Scott D. Hanton

industry insights
46 A New Era of Protein Interaction Engineering
New bioengineered protein devices allow near-instant response times, but highlight system needs.
Rachel Brown

ask the expert
50 Ask the Expert
Computational predictions empower drug discovery.
Tanuja Koppal

product focus
52 Electronic Laboratory Notebooks
Semantic enrichment is helping overcome issues associated with vast amounts of unusable ELN data.
Aimee O'Driscoll

54 Lab Monitoring Systems
Scalable laboratory monitoring systems can help minimize risks—but come with challenges.
Aimee O'Driscoll

56 Next-Generation Sequencing
Using NGS technologies to understand the impact of the microbiome on health.
Andy Tay

58 PCR
Leveraging digital PCR to improve detection of viral pathogens in wastewater.
Brandoch Cook

in every issue
29 Infographic
Authenticating your cell lines.

49 The Big Picture
Tackling the topics that matter most to lab managers.

60 Innovations In: Mass Spectrometry
Important advances in mass spectrometry leading up to the 2021 ASMS Conference.

63 Lab Manager Online

Lab Manager® (ISSN: 1931-3810) is published 11 times per year, monthly with combined issues in January/February, by LabX, 1000 N West Street, Suite 1200, Wilmington, Delaware, 19801. USPS 024-188 Periodical Postage Paid at Fulton, MO 65251 and at an additional mailing office. A requester publication, Lab Manager is distributed to qualified subscribers. Non-qualified subscription rates in the U.S. and Canada: $120 per year. All other countries: $180 per year, payable in U.S. funds. Back issues may be purchased at a cost of $15 each in the U.S. and $20 elsewhere. While every attempt is made to ensure the accuracy of the information contained herein, the publisher and its employees cannot accept responsibility for the correctness of information supplied, advertisements, or opinions expressed. ©2013 Lab Manager® by Geocalm Inc. All rights reserved. No part of this publication may be reproduced without permission from the publisher.
WDS Canadian return: 1000 N West Street, Suite 1200, Wilmington, Delaware, 19801.
POSTMASTER: Send address changes to Lab Manager®, PO Box 2015, Skokie, IL 60076.
IF IT USES GAS, IT NEEDS HARRIS.®
+ Full Spectrum of In-Stock and Customized Turnkey Gas Distribution Solutions
+ Gas Regulators, Flowmeters, Automatic Switchover Manifolds and More
+ 24–48 Hour Shipping on Most Models
+ Safe and Reliable
FIND THE PERFECT PRODUCT, NO MATTER THE INDUSTRY
+ PETROCHEMICAL + FORENSIC
+ AVIATION + EMISSIONS MONITORING
+ UNIVERSITIES/RESEARCH + PULP & PAPER
+ MEDICAL + FOOD, WINE, BREWERIES
+ BIOTECH & PHARMA
CONTACT A HARRIS®
SPEC GAS EXPERT
1.800.733.4043 ext. 2
harrisproductsgroup.com
[email protected]
HarrisSpecGas.com
editor’s note
How Does Data Influence
Your Decision-Making?
Data is the backbone of most scientific organizations. Every
laboratory produces some form of data; lab managers rely on data
to make important business decisions; and if used efficiently, data
can offer insight into emerging trends and solutions.
Our cover story for this issue, authored by Sridhar Iyengar, PhD,
discusses the evolution of how data is being used by lab leaders,
and how AI is playing a larger role in collecting, interpreting, and
utilizing data produced in the lab. As the capabilities of big data
continue to expand, so too does the lab manager's role in utilizing
it. Today, managers need the technical expertise to maximize the
value of the data being produced, and the ability to envision what
the lab of the future looks like. As Iyengar states, “Outfitting a lab
to collect data for today’s needs alone is shortsighted. It’s imperative that those seeking to leverage data in the lab consider not
only the expanded data pipeline of today, but the colossal one
of tomorrow.” Iyengar outlines the five stages of laboratory data
maturity and sophistication, from elementary to transformative.
Turn to page 10 to read the full article, and determine where your
lab fits within the five stages.
As mentioned, data is key for managers during many decision-making processes. This is especially true when evaluating the performance of your lab's instruments. Our Asset Management piece (page 20) discusses the importance of using data to optimize lab equipment. "Lab asset optimization involves investigating and monitoring a variety of different data about the instruments and equipment in the lab," writes editorial director Scott Hanton, PhD. As Hanton highlights, a common challenge with data collection and investigation is that it is often done manually. Software and other tools are now available to assist lab managers with this process, and offer more meaningful interpretations of their laboratory asset usage. "By having standardized, meaningful definitions of utilization, deploying the right technology to capture the data that fits that definition across critical workflows, and visualizing the data in a way that can translate to lab insights, lab managers are on their way to understanding current usage of lab instruments," explains Melissa Zeier, enterprise services product manager at Agilent Technologies.
In addition to our data-focused pieces, I also wanted to highlight a new feature we are launching in this issue, "Innovations In." Each issue, our editorial team will highlight some of the latest developments and applications of techniques or products in the lab. For this issue, we discuss innovations in mass spectrometry, detailing three main themes in mass spectrometry that are contributing to new developments in the field and show promise for the future. Turn to page 60 to see what those three trends are.
Lauren Everett
Managing Editor
laboratory products group director
editorial director
art director
Danielle Gibbons
Robert G. Sweeney
[email protected]
[email protected]
creative services director
graphic designer
[email protected]
203.530.3984
Scott Hanton
Alisha Vroom
Trevor Henderson
[email protected]
[email protected]
senior digital content editor
managing editor
Rachel Muenz
Lauren Everett
[email protected]
[email protected]
scientific technical editor
Michelle Dotzert
[email protected]

contributors
Sridhar Iyengar, PhD
Sherri L. Bassner, PhD
Ane Metola Martinez, PhD
Claire Lyons, PhD
Katharina Herzog, PhD
Mathew Tata, PhD
Natalie von der Lehr, PhD
Shruti Jain, PhD
Daniel Wolf
Lydia Werth
Jeffrey Zynda
Adana Johns
Tabi Thompson
Tanuja Koppal, PhD
Aimee O'Driscoll, BSc, MBA
Andy Tay, PhD
Brandoch Cook, PhD
creative services coordinator
Sherri Fraser
[email protected]
sales manager
Reece Alvarez
[email protected]
203.246.7598
senior account managers
Alyssa Moore
Mid-Atlantic, Southeast & International
[email protected]
610.321.2599
Melanie Dunlop
West Coast
[email protected]
888.781.0328 x231
business coordinator
Andrea Cole
[email protected]

account manager
Ryan Bokor
Northeast, Midwest
[email protected]
724.462.8238

scientific writer/coordinator
Rachel Brown

eMarketing coordinator
Laura Quevedo
Published by LabX Media Group
president
Bob Kafato
[email protected]
managing partner
Mario Di Ubaldi
[email protected]
executive vice president
Ken Piech
[email protected]
production manager
Greg Brewer
[email protected]
custom article reprints
The YGS Group
[email protected]
800.290.5460
717.505.9701 x100
subscription customer service
[email protected]
[email protected]
circulation specialist
Matthew Gale
[email protected]
1000 N West Street, Suite 1200
Wilmington, Delaware, 19801
888.781.0328
LabManager.com
manager minute
Three Keys to More Effective
Project Planning in the Lab
by Scott D. Hanton, PhD
Many different things have to go right for large
lab projects to be successfully executed. We
need the right people, with the right training,
using the right instruments. As lab managers, we are
very aware of these priorities, and take significant time
and effort to get them right. However, there are many
other aspects of project planning that require a little
forethought to ensure the delivery of large projects. It
can be helpful to review some of these details with key
team members, and ensure that someone is monitoring
these details. Here are three tips that will help you and
your staff develop a project planning system that will
improve delivery and execution on your projects.
#1 – Ensure consistent sourcing and supply

One of the key impacts of the COVID-19 pandemic was the interruption of sourcing and supply. Even ubiquitous lab equipment and supplies had shortages and backlogs. As you prepare for a new project, build plans for consistent supply for the critical equipment, supplies, consumables, kits, and assays. Talk to your suppliers and negotiate delivery times. For especially critical components of the project, find alternate and emergency suppliers. Choose suppliers based on all the value they provide, not simply on one-time costs. Make sure project leaders in the lab have contingency plans for critical components.

#2 – Ensure service support

Having the right instruments in the lab is only half the battle. The next step to effective project planning is having a strategy to keep all of those instruments and pieces of equipment running consistently during the project. Work with your service providers to have repair and maintenance plans for the course of the project. Negotiate with them for rapid response for critical assets. It might even be worthwhile to build additional redundant capacity for vital systems for some projects. Make sure project leaders have a project continuity plan for each of their critical assets. This can ensure the project continues on time, even if equipment problems occur along the way.

#3 – Ensure milestone and compliance reporting

Many big projects come with important milestones and regulatory oversight. Develop a clear understanding with customers and agencies about what needs to be reported, in what detail, and at what time points during the project. Build the infrastructure that is required to meet those expectations. It is very difficult to catch up on reporting during a large project. To ensure success, it is important to confirm the reporting tools are available and key staff are trained on their use before the project begins.
Thanks for reading. I hope you can use this information. If you have feedback or comments on this set of tips, or suggestions for future Manager Minutes, I'd love to hear from you. Please reach out to me at [email protected]. I'm looking forward to our conversations.
The Lab of the Future: Big Data Enables a Big Role for AI
BIG DATA REVOLUTIONIZED THE LAB MANAGER'S ROLE—AI IS ABOUT TO DO IT AGAIN by Sridhar Iyengar, PhD
Buried in a working paper scribed in October 1980 by sociologist Charles Tilly was a consequential union—a marriage of two words forged out of necessity to describe a concept not yet named, with repercussions not yet imagined.

Such was the first recorded mention of "big data"—a phrase that would swiftly enter the common vernacular and become common practice across industries and geographies.

To understand how data can and will be used to shape decisions in the lab, one must first understand what decisions lab managers need to make in the first place.

A new era of science means new decisions for lab managers

Ten years ago, the titles of those who supported a lab's operations were similar to today—technicians, lab managers, and IT managers. Yet, the responsibilities under their purview and the challenges associated with them have changed drastically in the decade since.

Today's scientists don't just need their equipment to be operational; they need it to be transformational. Researchers now expect their tools to act as both collectors and reporters of data. To empower scientists with the data they require, operations professionals are now presumed to be experts in cloud infrastructure, data security, and encryption. As for the assets under their jurisdiction, many were manufactured before the internet was even established.

"I've definitely seen a paradigm shift," says Russell Lund, facility manager at Zai Lab in Menlo Park, CA. "A lab manager needs to intricately understand every piece of equipment—what each does, why it does it, and what to do if it goes down."

Such growing responsibilities give lab managers a litany of new decisions. How can we draw new insights from old equipment? How is our data encrypted? How do we get data into the right hands with ease and out of the wrong hands with veracity?

As Lund describes, today's lab manager role is highly technical: "I have to ensure that the computer is talking to the machine and the data is being stored properly. That means maintaining frequent contact with techs and even the representatives of the machines themselves."

Luckily, Internet of Things (IoT) technology has enabled the collection of thousands of data points without human involvement. Today, sensors are embedded in new equipment, while legacy assets can be connected to the cloud via inconspicuous and easy-to-install sensors.

Outfitting a lab to collect data for today's needs alone is shortsighted. It's imperative that those seeking to leverage data in the lab consider not only the expanded data pipeline of today, but the colossal one of tomorrow.

Artificial intelligence requires operational excellence

The questions answered with data today will be asked by artificial intelligence (AI) tomorrow. Yesterday's executive hypotheses are today's data-driven plans and tomorrow's fully automated discoveries.

In the not-so-distant future, robotics will handle automation as AI evaluates protocols. Eventually, discovery will require little to no human involvement at all.

But as Lily Huang, associate scientist at Pliant Therapeutics, explains, the initial impact of AI will be a welcomed one: "Many professionals might be worried about robots and AI leading to a high rate of unemployment for manpower-based jobs. I personally think that machines, especially smart machines, will take the boring tasks away from their human counterparts. AI technology has the potential to assist daily operations in the lab as well as facilitate the improvement in various processes. If well designed and executed, AI is able to identify process flaws and procedure redundancy, in addition to catching operational defects and optimization opportunities."

As the tidal wave of data crests, some researchers are still recording measurements on paper, manually transferring their notes to spreadsheets, and individually exporting spreadsheets into databases for processing and storage. If such habits are antiquated today, they'll certainly be detrimental tomorrow.

Any remaining "if it ain't broke" devotion to paper notebooks will break under the weight of a data-hungry, AI-shaped future. But eventually, so will manual collection of any kind.

To be truly transformative, AI requires input from mass quantities of data. Its collection must be copious, reliable, and automatic. Such widespread collection requires universal connection of every asset, every metric, and even the lab environment itself. IoT technology was born for such a time as this. "We were recording data by manually handwriting in a spreadsheet twice per day," explains Joanna Schmidt, lab manager at Ionis Pharmaceuticals. "Since installing an online system, we can see all temperature changes of all freezers over time on one page. I'm moving users to underutilized freezers to help increase the lifespan of the units."
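The kind of always-on monitoring Schmidt describes is, at its core, a small sensor agent pushing timestamped readings to a cloud service. Below is a minimal Python sketch of that pattern, assuming a generic, vendor-agnostic REST ingestion endpoint; the URL, token, freezer ID, and the read_temperature_celsius() stand-in are all hypothetical, not any particular vendor's API.

```python
# Minimal sketch of an IoT-style freezer-temperature logger.
# All endpoint details below are hypothetical placeholders; a real
# deployment would read from the sensor vendor's SDK or a serial bus.
import json
import random
import time
import urllib.request

INGEST_URL = "https://cloud.example.com/api/v1/readings"  # hypothetical
API_TOKEN = "YOUR_TOKEN"                                  # hypothetical
FREEZER_ID = "freezer-07"

def read_temperature_celsius() -> float:
    """Stand-in for a real sensor read; simulates a -80 C freezer."""
    return -80.0 + random.gauss(0, 0.5)

def post_reading(temp_c: float) -> None:
    """Send one timestamped reading to the cloud dashboard."""
    payload = json.dumps({
        "asset_id": FREEZER_ID,
        "metric": "temperature_c",
        "value": round(temp_c, 2),
        "timestamp": time.time(),
    }).encode()
    request = urllib.request.Request(
        INGEST_URL,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_TOKEN}",
        },
    )
    urllib.request.urlopen(request)  # production code would retry on failure

if __name__ == "__main__":
    while True:
        post_reading(read_temperature_celsius())
        time.sleep(300)  # one reading every five minutes
```

Once readings from every freezer land in one stream, the single-page, cross-freezer view Schmidt mentions is simply a query over that stream.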
Some growing pains remain
While most recent laboratory equipment on the market comes with cloud connectivity embedded, vendor-specific solutions solve one problem while creating
another. Data is siloed into superfluous and clunky dashboards, rendering it all but useless to those who need it.
Meanwhile, according to internal research by Elemental Machines, an estimated 300 million assets aren’t yet
connected to anything. Most are fully operational and
widely familiar (balances, centrifuges, freezers, etc.).
But thanks to turnkey IoT sensors and vendor-agnostic cloud solutions, the world’s unconnected assets will
live on and live as one. Rather than being sidelined in
favor of connected equipment, inconspicuous sensors
enable seamless retrofitting in seconds. As such, tomorrow’s connectivity needs can be met while stewarding
yesterday’s investments.
In the lab, data maturity advances in reverse
In most categories, maturity is a quality that comes
effortlessly to the aged and arduously to the young—
not so for data maturity. When it comes to data, today’s
startups spring to life already pushing the boundaries of
its collection, harnessing its insights, and leaning on AI to
make sense of its root causes. Despite their bound booklets
titled “Digitization Strategy” and secured rooms labeled
“Lab of the Future,” titans of industry are challenged with
wriggling their way out of longstanding practices, breaking free of tired infrastructure, and asserting their way to
modern data practices over the objections of sometimes
thousands of internal stakeholders. Inertia is real.
Despite the unequal hurdles presented to startups
and industry leaders, the importance of achieving data
maturity in the lab remains imperative to both. The
organizations who will dominate market share tomorrow
are those who prioritize data today.
Amidst the myriad models and guidelines for data
maturity in other sectors, practical handrails for leveraging data in the lab are few and far between. As such,
the following offers an outline of the five stages of data
sophistication in the lab. Evaluate your organization’s
standing using the information below.
The Five Stages of Laboratory Data Sophistication
Stage 1: Elementary
• Asset data is available but siloed
• Equipment data populates single-asset interfaces
• Some assets remain unconnected
• Accuracy is questioned
• Access is cumbersome
Stage 2: Irregular
• Organizational data strategy plans are forged
but confusing
• Sensors are deployed for complete lab connectivity
• Data is trusted but siloed either by seniority or asset type
• Progress is stunted as data strategy is not fully prioritized
Stage 3: Championed
• Data strategy and vision are formalized, adopted,
and concise
• Lab director champions the use of data and analytics
• Algorithms detect anomalies and trigger alerts (see the sketch after this list)
• A single universal dashboard enables access to all data
anytime, anywhere
• Primary and secondary data are integrated
• Humans remain integral to analysis
Stage 4: Committed
• Lab director and company executives fully buy in to
organization-wide data and analytics strategies
• Data informs business decisions and lab activity alike
• Data and analytics are viewed as a key component in
driving lab innovation
• AI details the root causes of reactions, anomalies, and
errors, and predicts those to come
Stage 5: Transformative
• Data and AI are central to discovery
• Discoveries are fully automated without human
involvement
• Robotics handle automation and AI evaluates protocols and results automatically
• Utilization data informs all purchasing decisions in
the lab and across the organization
• A chief data officer maintains a seat on the board
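As flagged in the Stage 3 list above, the anomaly-detection bullet can be made concrete with a small example. The following Python sketch flags readings that drift far from a rolling baseline; the window size and threshold are illustrative choices, not any vendor's production algorithm.

```python
# Illustrative anomaly detector: rolling z-score over sensor readings.
# Window and threshold are arbitrary example values.
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=48, threshold=3.0):
    """Yield (index, value) for readings far outside the recent baseline."""
    history = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(history) == window:
            mu = mean(history)
            sigma = stdev(history) or 0.01  # floor so a flat baseline still alerts
            if abs(value - mu) / sigma > threshold:
                yield i, value
        history.append(value)

# Example: a -80 C freezer whose door is left open at reading 100.
temps = [-80.0] * 100 + [-60.0] + [-80.0] * 20
print(list(detect_anomalies(temps)))  # -> [(100, -60.0)]
```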
For now, achieving a "transformative" level of data maturity may sound like a lofty goal and a clear competitive advantage. But a day is coming when it will be essential for survival and thus become the status quo.
Thanks to IoT, AI, and organizational prioritization of
data maturity, the lab of the future is coming into focus.
Charles Tilly likely had no idea in 1980 that his casual coinage of "big data" would eventually become sacrosanct. Lab managers had little indication of how quickly
assets would measure themselves. But for anyone willing
to listen, every indication is that AI will fulfill the promises enabled by big data.
For legacy scientific and research enterprises, mature
handling of data will determine whether their reign
continues or ends. For emerging players, data maturity
could be their ticket to disruption. The lab managers
enacting the automation and optimization of data collection within each will maintain a place in history as
the linchpins who enabled discoveries long elusive. The
future of the lab is bright.
Sridhar Iyengar, CEO and founder of Elemental Machines,
is a serial entrepreneur in IoT, medical devices, and wearables.
Iyengar was a founder of Misfit, makers of elegant wearable
products acquired by Fossil in 2015. Prior to Misfit, he founded AgaMatrix, a blood glucose monitoring company based on his PhD research that made the world's first iPhone-connected medical device. Iyengar holds more than 50 US and international
patents and received his PhD from Cambridge University as a
Marshall Scholar.
We love to speed things up.
Productive pipetting from 1 to 384 channels.
We accelerate science together.
To fulfill this mission, we develop the
most precise and easy-to-use pipettes
for labs all over the world.
www.integra-biosciences.com
labs less ordinary
The Data Science for Social Good Lab
RESEARCHERS STRIVE TO IMPROVE THE WORLD
THROUGH BIG DATA by Lauren Everett
After finishing his postdoc studies at the University of Washington, where he was the recipient of
the Moore/Sloan Data Science Fellowship and the WRF Innovation Postdoctoral Fellowship in Data Science, Michael Fire envisioned building a lab "dedicated
to solving real-world problems using open data and data
science tools across different domains, with an emphasis
on making social good,” he explains. After becoming
an assistant professor in the Software and Information
Systems Engineering Department at Ben-Gurion University of the Negev (BGU), Fire “made good” on his vision, and founded the Data Science for Social Good Lab.
“Since then, thanks to data science's growing popularity,
our lab has rapidly grown. The lab currently has over 20
members that conduct active research encompassing a
wide range of research domains,” says Fire.
The team’s work indeed covers a variety of real-world
issues and challenges, including inequality and diversity,
public health, smart cities, and sustainability, to name a few.
Specific projects range from investigating gender biases in
clinical trials, to developing an approach to monitor palm
trees for insect infestation, to analyzing global dietary
habits and the linked economic effects—all of which are
driven by collecting and analyzing big data.
One of the team's most recent endeavors, a collaboration with Dr. Talia Meital Schwartz-Tayri of the university's social work faculty, aims to help at-risk youth
“by utilizing historical data from documented records
of governmental family services to identify factors that
influence their life for the better or worse,” says Fire.
“For example, historical records can be used to better
predict the types of help services that are more likely
to positively impact their lives.” Fire and fellow leaders of the lab, Galit Fuhrmann Alpert, PhD, and Dima
Kagan, a PhD student, explain that the large amounts of
data they collect for their projects can either be structured—meaning large tables with billions of records—
or unstructured—texts, images, and sounds. “To analyze
different data types, we use state-of-the-art data science
tools to manipulate the data and extract information,”
say Fire, Fuhrmann Alpert, and Kagan. "For example,
in our recent work with the Palm Trees Infestation, we
used deep learning algorithms to automatically detect
palm trees in Google Street View Images.”
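The lab's exact pipeline is not reproduced in this article, so the following Python sketch only illustrates the general pattern described: running a convolutional network over street-level images and flagging those that contain palm trees. The ResNet-18 backbone, the weights file, and the binary palm/no-palm labels are hypothetical stand-ins.

```python
# Illustrative sketch of image-based palm tree detection.
# "palm_classifier.pt" and the class labels are hypothetical; the team's
# actual model and training data are not described in this article.
import torch
from PIL import Image
from torchvision import models, transforms

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# A ResNet-18 whose final layer was (hypothetically) retrained on
# two classes: 0 = no palm tree, 1 = palm tree.
model = models.resnet18(num_classes=2)
model.load_state_dict(torch.load("palm_classifier.pt"))
model.eval()

def contains_palm(image_path: str) -> bool:
    """Return True if the classifier scores the image as containing a palm."""
    x = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        logits = model(x)
    return logits.argmax(dim=1).item() == 1
```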
The team can carry out these data-intensive projects
thanks to the lab’s numerous robust high-capacity servers. The lab also has access to university high-performance computing clusters. Amazon Web Services and
Google Cloud are also used when needed.
Free access for all
The Data Science for Social Good Lab openly shares
its data and codes on its website, in an effort to help other researchers across all types of disciplines with their
studies. Examples of public large-scale datasets featured
on the site include online social network datasets, such
as Facebook, Google+, Academia.edu, and Reddit Communities, as well as time series datasets, and the largest
public network evolution dataset, with more than 20,000
networks and a million-plus real-world graphs.
To the layman, the concept of big data can sound
overwhelming. So, in addition to sharing data and codes
with fellow scientists, the lab team also makes it a priority to connect with the general public. “We do our best
to communicate our research to the public by creating
short videos, infographics, and publishing blog posts. By
doing this, we hope to increase awareness of issues that
we believe are important. Our tools and code are highly
transferable to different fields, and we hope will be used
by the broad community,” says the team.
Sharing knowledge with the next generation
While the lab's capabilities to collect, analyze, and share massive amounts of data for social good are impressive enough,
“the human capital is what makes our lab unique,” says Fire.
“We have a fantastic group of researchers from very diverse
backgrounds.” Fire brings his background in computer
science and theoretical mathematics to the team, while
CONCEPTION ■ INTERPRETATION ■ PRODUCTION ■ SATISFACTION
The PLAS■LABS’ 840-Series “Powder Handling Isolator” is a completely
sealed, double HEPA filtered, closed-loop containment isolator. It features
a door interlock system for safety, an anti-static ionizer, and a front
access door.
Observed containment level: <1.0 nanogram per cubic meter.
Anaerobic Chambers
Powder Handling Isolator
The PLAS■LABS’ advantages include:
■ Optically clear one-piece top with rounded
corners and complete 360° visibility.
■ Complete customization is available.
■ Over 50 years of servicing the scientific community.
■ Completely sealed for lower gas consumption.
■ Bright white one-piece bottom with rounded
corners for easier cleaning.
Isolation & Containment
Glove Boxes
■ PLAS■LABS partners with global research facilities and
laboratories to develop the latest products in isolation and
containment glove boxes.
■ 2-YEAR WARRANTY
(not on consumables)
PLAS■LABS®
www.PLAS-LABS.com
[email protected]
Now Shipping
from Europe
HEPA Filtered
PCR Chambers
labs less ordinary
1. The Data Science for Social Good Lab is located in the Carole and Marcus Weinstein Software and Information Systems Engineering and
Cybersecurity Building. Credit: Dani Machlis 2. The Data Science for Social Good Lab's founding team. Front (left to right): Dr. Michael
Fire and Dima Kagan. Back: Aviad Elyashar and Dr. Galit Fuhrmann Alpert. 3. The Software and Information Systems Engineering Department's servers cluster. Left to right: Dima Kagan, Dr. Michael Fire, Dr. Galit Fuhrmann Alpert, and Aviad Elyashar. Credit: Shay Shmueli
Fuhrmann Alpert comes from the field of computational
neuroscience, and Kagan has a strong background in software
engineering. The diversity of expertise enhances the lab’s
skillset and contributes to innovative problem-solving.
As the lab team keeps growing, their ability to reach
and advise students also grows. The lab offers three
avenues for student mentorship for those interested in
big data and relevant fields. Senior-year undergraduate
students can find mentorship among the team for their
engineering and research projects, and PhD and MSc
students can receive help with their data science-related
research as well. Most recently, Fire and the team developed a course that educates students on how to use
state-of-the-art data science tools. “One of the lab's most
important goals is to train students in the field of applied data science. For this purpose, we designed and are
teaching a unique course titled ‘The Art of Analyzing Big
Data—The Data Scientist's Toolbox,’" explains Fire. Currently, about 100 students enroll in the course each year,
but the team hopes to eventually make the course publicly
available to reach thousands more.
Fire, Fuhrmann Alpert, and Kagan are witnessing
first-hand how the field of big data research is gaining
momentum, and will continue to make a positive impact
on society. “Big data and data science are currently changing the world,” says the team. “For example, the Large
Hadron Collider generates petabytes of data. Another
example, the Gaia project, stores a large amount of data to
map our galaxy. In these enormous piles of data, there can
be things that will change the world and make our lives
better, if we ask and study the right questions.”
Lauren Everett, managing editor for Lab Manager, can be
reached at [email protected].
Get Lab Manager on the go.
Download a pdf at LabManager.com
business management
How to Effectively Communicate
with Non-Scientists
LEARNING THE “LANGUAGES” OF OTHER DEPARTMENTS OUTSIDE THE LAB IS AN
IMPORTANT COMPONENT OF BUSINESS SUCCESS by Sherri L. Bassner, PhD
The lab manager has many responsibilities involving the
optimization and effectiveness of the laboratory, and
its contribution to the success of the business. This
holds true for labs that are the focal point of a business,
as well as for labs that support a business with a different
primary offering. While it is tempting to spend all one’s
energy on internal lab issues, lab managers must recognize
the critical importance of developing effective communication techniques and working
relationships with functions
outside of the lab itself. This
article will summarize the
reasons these communications
are important, provide some
methodologies for building
those skills and relationships,
and point out a few pitfalls to
avoid along the way.
Understanding all the
roles of the business
It is human nature to place
more importance on skills and
activities you know well versus
those that you don’t know quite
as intimately. An extension of this mindset is the tendency
of managers of various business functions, including the
lab, to feel that their function is central to the success of the
overall business. The reality, of course, is that all business
functions need to work smoothly and cooperatively for any
business to be most successful. The success of the lab itself
depends upon the flow of support and information from
the other functions, as well as the ability of those functions
to capitalize on the output of the lab. For this to happen to
greatest effect, lab managers must develop skills to effectively communicate with the managers of those other functions, many of whom are not scientists or are scientists who
do not have intimate knowledge of how the lab functions.
The first step to learning how to communicate well
with those outside of the lab is to spend the time to learn
as much as you can about the
other functions in the business. How does sales work?
Manufacturing operations?
Finance? Business management? What are their
goals and objectives? What
headaches do those managers often have to contend
with? What keeps them up at
night? Learn the “language”
that other functions use to
describe their work. Either
ask to spend time with those
managers to truly understand
their function, or look for
mentors or friends who can
educate you. If this seems like a waste of time to you,
ask yourself this question: How frustrated do you get
when a manager from another function minimizes the
challenges of the lab or acts as if the lab output is less
important to the business? If you expect other managers
to understand how the lab works, then you should put in
the effort to understand how their functions work.
Once you’ve gained an understanding of the workings
and challenges of other functions, map the functions and
output of the lab with the goals of the other functions.
How does the output of the lab help sales meet their
goals? What does manufacturing require from the lab for
them to be most successful? How does business management view the contributions of the lab toward meeting
overall business objectives? Once you’ve gained the
ability to “see” the lab through the eyes of the leaders of
the other business functions, you are better positioned to
not only prioritize lab activities that will most effectively
drive the business, but also ensure that those other business leaders see the contributions of the lab as vital to
meeting their own objectives.
A cohesive approach
Let’s take the example of seeking to gain approval for a
large capital expenditure related to a new instrument. It
may be the need to replace an aging tool or the purchase of
an instrument that would provide a new capability. While
it is usually the role of business management to sign off on
these investments, often the input from other functions
will be required. A typical approach to justify the purchase
might be to discuss the technical capabilities of the instrument and how those capabilities fill a critical need within
the lab. The technical aspects are often of most interest to
the lab, but leaders of the other functions need to hear how
the investment will help them meet their own goals, too.
The sales manager wants to hear what market segments she
might lose if that capability goes away, or what new customers she might gain with the new capability. The manufacturing manager wants to hear how this capability will lead
to quicker answers to production problems, more robust
products to begin with, or how loss of this capability would
otherwise negatively impact production processes. The
business and finance managers want to understand how this
investment will return more in profitable sales than the cost
of the investment itself or how loss of that capability would
impact profit margin or overall revenue.
To make these arguments effectively, the lab manager
must have a deep enough understanding of the other
functions to be able to make specific and quantitative
statements that tie this investment to the objectives of
the other functions and the business as a whole—and
to make those arguments in the language of those other
functions, not the language of the lab.
Another example is the cyclic setting of objectives and
the related budgeting process aimed at achieving those
objectives. The lab manager aims to set objectives that
contribute to the overall success of the business and then
requests resources required to meet those objectives.
However, if the lab manager expresses the objectives
only in the terms of the lab (technical objectives, what
skills and capabilities are required, and what do they
cost), the path to approval is a steep one. The challenge
for the lab manager is not to simply connect a lab objective to a business objective, but to specifically describe
how the lab objective will lead to the achievement of
the business objective. To achieve this, the lab manager
needs to explain how the successful completion of the
lab objective enables the successful completion of the
objectives of other functions and the business as a whole.
Key to this process is specificity and quantitation, which
requires the deep understanding of how the other functions achieve their objectives.
Achieving support and appreciation for the lab
If this sounds like a lot of work, it is. It is easy for the
lab manager to cut corners and not do sufficient homework. Don’t assume that the other managers understand
how the lab works, and constantly challenge your own
understanding of other functions. Ask questions. Spell out
acronyms. Work hard to see the business through the eyes
of your management peers and never stop expanding on
that understanding. Be prepared to teach others as often as
necessary so that they gain understanding of the lab. The
investment in mutual education will pay rich dividends in
continued support and appreciation of the lab.
Finally, remember to stay humble. As noted above, it
is easy to fall into the trap of seeing the lab as the critical
function around which the business revolves. Certainly, it
is important, but there would be no business if the other
functions did not work just as hard. All the functions of
the business are interdependent. The reality is that most
managers, of all functions, don’t embrace this concept.
Remember to keep the overall objectives of the business
as your primary touchstone and always discuss the needs
and accomplishments of the lab in the context of those
business drivers. If you, as the lab manager, can do that,
you will always be positioned to best support the lab.
Sherri L. Bassner, PhD, is a retired chemist and manager who
spent 30 years developing new products and services, and then
leading others in those same efforts. She is a part-time leadership
coach and blogs on personal and professional development (among
other topics) at www.sherribassner.com.
asset management
Optimizing the Utilization of Lab Assets
ANALYZING DATA TO IMPROVE THE AVAILABILITY AND EFFECTIVENESS OF
INSTRUMENTS AND EQUIPMENT IN THE LAB by Scott D. Hanton, PhD
Optimizing the usage of equipment and instruments in the lab is a key responsibility for lab
managers. After people and space, working tools in the lab represent the next largest investment for most labs. Providing properly functioning instruments and equipment impacts both the capital budget—
for new investments—and the operational budget—for
repair, maintenance, and calibration.
Optimizing the assets of the lab brings important benefits. Implementing a more efficient approach to optimizing asset management can help to reduce overall costs and
generate greater productivity of the lab. “Optimized lab
assets means you have the most productive mix of assets
that are available, reliable, and performing the right tasks
to meet your business goals,” says Melissa Zeier, enterprise
services product manager at Agilent Technologies. “It is the
responsibility of the lab manager to justify the cost of the
laboratory asset life cycle to the business—from planning,
acquiring, deploying, repairing, maintaining, and disposal
of those assets while balancing risk to benefits.”
Lab asset optimization involves investigating and
monitoring a variety of different data about the instruments and equipment in the lab. Some of the data that
is important to the optimization process includes repair
history, maintenance schedules, calibration requirements, utilization history and expectations, operating
costs, space requirements, asset life span, capital replacements, and disposal options. While labs track some or all
of these data, most have a manual approach to recording
the data, and few are really effective at analyzing these
disparate data sets to optimize the availability, usage,
and costs associated with their assets. “Manual methods
of collecting data in spreadsheets or from logbooks can
be time consuming and inconsistent,” says Zeier. “By
having standardized, meaningful definitions of utilization, deploying the right technology to capture the data
that fits that definition across critical workflows, and
visualizing the data in a way that can translate to lab
insights, lab managers are on their way to understanding current usage of lab instruments.” Having software
designed to capture these data and help analyze it effectively can make a significant difference in the ability
of lab managers to optimize the use of lab assets.
There are options available now from several vendors
that can enable more powerful tools to track and understand the usage and availability of lab assets. According to
Jim Sweeney, senior product manager, PerkinElmer, “One
approach is to have a software solution automatically
capture utilization from the instrument, either through
parsing of an instrument log file, or through a direct connection to the instrument through an application program
interface.” Getting data about the lab instruments directly
removes the need for staff to remember to gather these
data during their already busy schedules, and provides an
electronic record that is easy to share and document.
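As a rough sketch of the log-parsing route Sweeney mentions, the Python below derives utilization from run start/stop events. The log format and the nine-hour working day are hypothetical; real instrument logs differ by vendor.

```python
# Illustrative utilization calculation from a (hypothetical) instrument log.
from datetime import datetime, timedelta

LOG = """\
2021-09-01 08:02, RUN_START
2021-09-01 11:47, RUN_STOP
2021-09-01 13:10, RUN_START
2021-09-01 17:55, RUN_STOP
"""

def utilization(log: str, workday_hours: float = 9.0) -> float:
    """Fraction of the working day the instrument spent running."""
    run_time, started = timedelta(), None
    for line in log.strip().splitlines():
        stamp, event = (field.strip() for field in line.split(","))
        t = datetime.strptime(stamp, "%Y-%m-%d %H:%M")
        if event == "RUN_START":
            started = t
        elif event == "RUN_STOP" and started is not None:
            run_time += t - started
            started = None
    return run_time / timedelta(hours=workday_hours)

print(f"Utilization: {utilization(LOG):.0%}")  # -> Utilization: 94%
```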
When approaching a choice about the various tools now
available to manage these kinds of data, it is important to
understand the scope of the data that the tools measure
and analyze. According to Joachim Lasoen, vice president
of product BINOCS at Bluecrux, both visibility and optimization are critical for making the best use of lab assets.
“Visibility helps lab managers understand instrument
availability around maintenance and calibration schedules
and the test demands on the instruments, and drives a
capacity requirement profile for each instrument,” says Lasoen. Once lab managers have visibility on the data for their
lab, they can begin to optimize the usage of the lab tools.
Lasoen continues, “Two aspects determine the success of
instrument utilization—maximize the white space between
test runs, and maximize the test campaign fill rate.” Lasoen
also adds that some of these tools are now using optimizers
based on business rules and artificial intelligence to address
more sophisticated workflows.
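To make the first of Lasoen's two aspects concrete, here is a toy Python sketch that totals the "white space" (idle gaps) between consecutive runs on one instrument; the schedule data is invented for the example.

```python
# Toy example: total idle time ("white space") between booked test runs.
from datetime import datetime, timedelta

runs = [  # (start, end) of test runs on one instrument; invented data
    (datetime(2021, 10, 4, 8, 0), datetime(2021, 10, 4, 10, 30)),
    (datetime(2021, 10, 4, 12, 0), datetime(2021, 10, 4, 15, 0)),
    (datetime(2021, 10, 4, 15, 15), datetime(2021, 10, 4, 17, 45)),
]

def white_space(schedule):
    """Sum the gaps between consecutive runs, ignoring overlaps."""
    gaps = timedelta()
    for (_, prev_end), (next_start, _) in zip(schedule, schedule[1:]):
        gaps += max(next_start - prev_end, timedelta())
    return gaps

print(white_space(runs))  # -> 1:45:00 of schedulable white space
```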
Before implementing a tool to optimize asset utilization, it
is important to understand the metrics that drive lab performance. “There are concrete steps that lab managers can take
to optimize the usage of lab instruments,” says Zeier. “Start
by prioritizing meaningful, standardized operational data for
lab instruments across critical workflows.” Doing the hard
work of standardizing the lab work process will enable the
asset optimization tool to align the collected and analyzed
data with the key metrics of the lab. “To help optimize lab asset management, managers should establish key performance
indicators (KPIs) and assess them on an ongoing basis. These
KPIs need to reinforce the important metrics for the lab,”
says Carola Schmidt, global director of automated solutions
at PerkinElmer. Using an asset utilization tool will help lab
managers correlate the use of the lab instruments with the
lab metrics. “By keeping a watchful eye on established KPIs,
lab managers can pinpoint areas for improvement or prevent
small issues from becoming bigger problems,” adds Schmidt.
The use of asset management software tools makes this process much more efficient and useful.
Another part of optimizing lab assets is to figure out which
instruments and pieces of equipment are needed in the lab.
Most labs tend to hang on to old or rarely used instruments
because of the challenges in obtaining capital to purchase
new ones. “Users of the lab assets often have an emotional
attachment to some of the instruments,” says Sweeney. However, applying effective metrics and KPIs can overcome those
emotional ties, especially when that data can demonstrate
that the space, time, and effort to keep those instruments
operating is not helping the lab deliver on its mission.
An asset optimization tool can help lab managers perform
a key function of lab management—making good decisions
about the effective use of time, money, and effort to deliver
on the lab’s mission. Zeier reminds us that, “Asset management optimization is central to maximizing the return on the
investments that a lab manager makes in their operations.”
Scott D. Hanton, editorial director for Lab Manager, can be
reached at [email protected].
MINIATURE Solenoid-Operated Pinch & PTFE Isolation Valves
Introducing More Precision Flow Controls
Repeatability of ±0.15 psi!
1 Pinch & Media Isolation Valves
2 Pneumatic Pinch Valves
3 NEW! "Cordis" Electronic Pressure Controls
4 NEW! "Eclipse" Proportional Isolation Valves
5 NEW! "DR-2" Miniature Precision Regulators
6 Electronic Valves Ideal for Oxygen Applications
CINCINNATI • BRUSSELS • SHANGHAI
877-245-6247 • clippard.com
leadership & staffing
Creating an Inclusive Academic
Research Culture
HOW KEY LESSONS FROM MONOPOLY CAN BE APPLIED TO PROMOTE EQUITY
AND LIMIT HYPERCOMPETITION IN ACADEMIA by Drs. Ane Metola Martinez,
Claire Lyons, Katharina Herzog, Mathew Tata, Natalie von der Lehr, and Shruti Jain
The classic game Monopoly is often dictated by a set of
highly variable “house” rules that can lead to heated
words over the consequences of landing on “free
parking." These discrepancies inevitably favor some players over others, yet future games are doomed to failure as
the rulebook remains unexplored, poorly understood, or
simply does not provide regulations for a specific situation. Unfortunately, many careers in the academic sector
share this predicament. Researchers are trained to perform science, but few are taught a strategy for a successful
academic career. Much like the infamous Monopoly game,
the academic career path follows an equally ambiguous
set of rules and requires more than just rolling the dice.
During its workshop series, “Improving Research Culture–Proposals & Discussions from the Science Community”, the National Junior Faculty (NJF) of Sweden
discussed how to instill a more sustainable, creative,
and inclusive environment for early career researchers
(ECRs). Though discussed in the context of Swedish research conditions and politics, young researchers around
the world face similar challenges and seek similar solutions. Just like in Monopoly, the problems that can arise
from poorly defined house rules are universal.
Time to rewrite the rulebook
More and more researchers are seeking the opportunity
to rewrite academia’s ailing rulebook. The sun needs to
set on a research assessment system based primarily on
journal metrics, where “bad winners” survive by targeting
a portfolio of papers, rather than mentoring and supporting people. Researchers are implicitly encouraged to be
22
Lab Manager
October 2021
individualistic, yet the length of academic careers is rapidly
falling even amongst those who publish research articles as
the lead author. Both international and national movements
are raising the idea that embracing a different approach to
research careers can foster a sense of inclusion that provides
a place at the table for all of academia’s players. A more
inclusive environment begins with a proper examination of
the rulebook, so that researchers can eliminate discrimination against different individuals (such as women and
minorities) by endorsing alternative measures of success,
and reevaluating career step time limits.
Valuation practices based on bibliometric measures
and so-called research excellence, combined with
greater competition for funding, have consistently replaced other considerations like novelty, reproducibility,
interdisciplinarity, cooperation, and the societal impact
of research. The hypercompetitive environment exacerbates the leaky pipeline of academia because it fails to
provide support and training to valuable individuals who
enter the game from a disadvantaged position.
To ensure equity, educating ECRs on career expectations
and forthcoming challenges needs to become a requirement
for institutions—so that all players have the same chance
of success. Mentorship and counseling can be important
elements of this education; by recognizing their value, these
activities are elevated from altruistic kindness to criteria
for promotion. This helps keep elusive house rules from excluding aspiring researchers and diminishing equity and inclusion. Another positive change would be to flatten research
group hierarchy by including intermediate career positions
that bring welcome stability and allow scientists to be creative.
Competing or playing together?
Many have experienced a long, contentious
Monopoly game that turned bitter and left us
feeling like quitting. Unfortunately, this description accurately portrays how an increasing number of ECRs perceive an academic
career. In any competitive game where obtaining resources and victories leads to further
resources and victories, failing to score early
makes recovery difficult. The snowball effect—where a successful player starts setting
up hotels while others are still squabbling over
railroads—is more apparent as the number of
players in the game increases.
In the present academic system, where
resources and tenured positions are limited
and not correlated to the increasing supply
of PhDs, those who aren’t effectively educated around the rules
from the start are quickly excluded. Additionally, in the research
game, grant applicants are often disadvantaged by the Matthew
effect, when funding is distributed disproportionately toward
established scientists.
Discussions during the NJF workshops gave rise to a number of
possible solutions. Equitable rules for career development would
encourage playing as teams (cooperation), rather than as individuals (competition), starting with improving the way credit is given
for contributing to publishing scientific data or conducting
projects. ECRs were strongly in favor of separating funding pools and setting caps on what a single group leader
can receive. For effective funding pools, inclusive considerations need to be made to avoid replacing one set of
inequitable house rules with another, such as biological or
academic age as eligibility criteria. Funding through more
stable, institutional, or state funds could help shift away
from a precarious and project-based funding culture, and
randomizing funds at late stages of project evaluation can
help counter bias and diversify the reach of public money.
Constructive feedback from funding agencies and senior
researchers would help those who fail to attract funds and
make the process more pedagogical.

The laboratory is patient care
Your organization strives to provide high quality patient care. CSMLS can help. With CSMLS certification, you know laboratory assistants and technologists have proven their competence to practice within the nationally set standards. Maintain the high standard so your institute and the public are guaranteed effective and accurate laboratory services.
certificationmatters.csmls.org
The game should provide enough incentives for players to feel motivated to continue and enjoy the struggle
of research, even if there are few winners. Not everyone
will win the research grant or academic position of their
dreams, but the joy of playing needs to be enhanced. It may
mean the extension of “winning” to embrace and promote
other scientific careers and pathways outside of academia,
where ECRs can apply their knowledge and experience.
Gaining knowledge by losing
In Monopoly, the winner may not have acquired the most
properties or executed the best strategy. Luck is simply
part of the game. In academic research, winning is represented by gathering groundbreaking data, which leads to
publications and funding. But what happens when your
findings are considered negative, do not show any effect
from an intervention, or go against the current accepted
paradigm? These studies are difficult to report and are
rarely published, leading to both a publication and citation
bias toward positive data. This problem is compounded
when fraudulent results are actively published and cited.
The advantage gained from losing a game or obtaining negative data is increased knowledge; for example, publishing such datasets can save time and resources by steering other researchers away from the same investigations. If used appropriately, negative results can demonstrate a lack of reproducibility and provide tactical value
for future experiments. Sharing data and methods represents a core principle in science. Negative data needs to be
fully understood, more widely accepted, and published.
Journals dedicated solely to negative data are rarely successful—much like playing a game where nobody wins.
The NJF workshop proposed calling all findings ‘data’,
whether they are positive or negative, whilst recognizing
that initial results shape a theory which needs to be tested
over time. ECRs can benefit from the reminder that not
every experiment leads to spectacular findings, much like
Monopoly newcomers won’t win every round.
“The game should provide
enough incentives for players to
feel motivated to continue and
enjoy the struggle of research,
even if there are few winners.”
Alternatively, publications could be reviewed and
approved at the planning stage, before data collection.
PLoS has adopted this method in a preregistration stage
where a research study is assessed based on its rationale
and proposed methods. Upon journal acceptance, authors can rest assured that their results will be published
regardless of the outcome of the study, thus increasing their willingness to publish data of any kind. More
journals can adopt a similar approach to ensure that
any findings are the result of robust methodology and
statistical analysis. Like learning the rules of the game before playing, this gives researchers a sense of inclusion even when the outcome is unknown at the start.
A different way to play
During the workshop series, attendees were unanimous in calling for a change in attitude across academia:
all stakeholders need to be more mindful of how research is performed, what results are sought, and which
professional factors drive the contemporary scientist.
This transformation must also begin with those in positions of power welcoming the perspectives and ideas of
ECRs into academic governance. Only then can we end
the monopolies of hypercompetition and discrimination in the research system, and set the board equitably for current and future generations of players.
Drs. Ane Metola Martinez, Claire Lyons, Katharina Herzog,
Mathew Tata, Natalie von der Lehr, and Shruti Jain are all members of the National Junior Faculty of Sweden.
UNLOCKING THE LAB
OF THE FUTURE
Universal data platforms empower laboratory operations and
bring the lab of the future into focus
Laboratory organizations that prioritize data reap short-term benefits and set the foundation for future success. The laboratory environment contains
a potential wealth of data—including a variety
of environmental parameters, as well as instrument and
equipment metrics. While the evolution of the Internet of
Things (IoT) has made it easier to collect this data, it must
be transformed into actionable insights to benefit laboratory
operations. The Elemental Machines ecosystem collects
environmental, instrument, and equipment data, transforms
it into usable information, and makes it accessible to laboratory
managers. This powerful platform can improve productivity and reproducibility, optimize laboratory operations, and
support the process of data sophistication—bringing the
laboratory of the future into focus.
HARNESS DATA TO DRIVE
LABORATORY OPERATIONS
Each laboratory organization consists of a variety
of different roles, and individuals in every role face a
unique set of challenges. The laboratory manager often
has a diverse range of responsibilities and must balance planning, budgeting, problem-solving, and quality
control, among a myriad of other tasks. Often, laboratory
managers act as firefighters—springing into action to
address problems as they arise. This is an inefficient approach to laboratory operations.
A robust platform can transform a wealth of data into
actionable insights, making it easier to shape strategy,
reduce operational inefficiencies, and maximize ROI. Solutions designed specifically to support laboratory managers
combine alerting and monitoring with asset management,
asset utilization, and quality assurance/quality control
functions to improve laboratory operations.
Asset management functionality supports the entire
team by maximizing uptime, and decreasing costly delays
and lost productivity. Real-time monitoring and alert
functions improve visibility and provide invaluable peace
of mind. Further, from a quality assurance/quality control
perspective, deploying and monitoring autonomous quality checkpoints at consequential steps in manufacturing or
research processes reduces waste and regulatory burden.
Joanna Schmidt, lab manager at Ionis Pharmaceuticals
Inc., shares how the Elemental Machines platform benefited her laboratory. Using the monitoring functionality to
continuously monitor the temperature of her laboratory’s
ULT freezers, she was able to identify a problem and
intervene before valuable samples were lost. “Without the
long-term data, we would not have noticed the freezer’s
bottom freeze point had shifted by almost 10°C over several
months,” she explains. With the platform’s asset utilization
functionality, she was also able to identify overutilized assets, and is “in the process of moving users to underutilized
freezers to help increase the lifespan of the units.”
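The kind of slow drift Schmidt describes is easy to miss in spot checks but simple to catch programmatically. The following Python sketch is a hypothetical illustration of the idea, not the Elemental Machines API: it compares a recent window of logged freezer temperatures against an earlier baseline and flags drift beyond a threshold.

from statistics import mean

# Hypothetical drift check: compare the mean of recent readings
# against an earlier baseline window. Readings are (timestamp, temp_c)
# tuples, assumed sorted oldest to newest.
def detect_drift(readings, window=500, threshold_c=5.0):
    temps = [t for _, t in readings]
    if len(temps) < 2 * window:
        return None  # not enough history to compare
    baseline = mean(temps[:window])   # earliest readings
    recent = mean(temps[-window:])    # latest readings
    drift = recent - baseline
    if abs(drift) >= threshold_c:
        return f"ALERT: temperature drifted {drift:+.1f} C from baseline"
    return None

In Schmidt's case, a shift of almost 10°C over several months would cross a 5°C threshold long before samples were at risk.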
ACHIEVE DATA MATURITY TO
ENSURE FUTURE SUCCESS
By taking immediate action to break free of old infrastructure and longstanding practices that may no longer be effective, and by addressing concerns from staff, laboratories
will be better positioned to achieve data maturity. Working toward achieving the transformational stage of data
maturity—as a new startup or established organization—
will help to ensure a successful laboratory into the future.
Discovery thrives in the transformational stage. At this
point, data and artificial intelligence are central to new
discoveries, many of which are fully automated. Robotics
are used to automate manual tasks, and artificial intelligence is used to evaluate protocols and results automatically. It is also in this stage that utilization data can be
used to inform better purchasing decisions.
However, for artificial intelligence to enable these
outcomes, it requires input from mass quantities of data
pertaining to every asset, metric, and environmental
parameter. IoT technology is essential for this purpose.
A POWERFUL PLATFORM FOR A
CHANGING LANDSCAPE
The Elemental Machines platform drives laboratory
operations and supports data maturity by capturing large
quantities of data and making it usable and accessible to
the right personnel. The platform combines a variety of
sensors, powerful software, a cloud-connected dashboard, and reliable support. All data is consolidated into
a single dashboard that provides monitoring, alerting,
and asset utilization insights. It also facilitates asset management, data management, calibration and maintenance
management, quality assurance, and quality control.
A variety of turnkey IoT sensors and cloud solutions can be combined to create a complete monitoring
solution for the laboratory. Temperature sensors enable
monitoring inside ovens, incubators, freezers, and liquid
nitrogen tanks; ambient sensors can be deployed to monitor humidity, air pressure, and light in the laboratory
or micro-environments like vivariums; and equipment
sensors are available for nearly every asset and metric,
from blood gas analyzers to mass spectrometers and everything in between. All sensors are easily implemented
without hardwiring or cables, connect nearly every asset
regardless of manufacturer or era, and integrate with
existing third-party electronic laboratory notebooks,
enterprise resource planning, and laboratory information
management systems.
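To make consolidated monitoring concrete, consider this minimal Python sketch of per-asset alert rules; the asset names, metrics, and thresholds here are invented for illustration and do not come from the Elemental Machines platform.

# Hypothetical consolidated alerting: each rule names an asset and
# metric with an acceptable range; readings outside their range
# generate alerts for a central dashboard.
RULES = {
    ("ULT-freezer-3", "temperature_c"): (-86.0, -70.0),
    ("incubator-1", "co2_percent"): (4.5, 5.5),
    ("lab-ambient", "humidity_percent"): (30.0, 60.0),
}

def evaluate(readings):
    # readings: list of (asset, metric, value) tuples
    alerts = []
    for asset, metric, value in readings:
        rule = RULES.get((asset, metric))
        if rule is None:
            continue  # unmonitored asset or metric
        low, high = rule
        if not (low <= value <= high):
            alerts.append(f"{asset}: {metric}={value} outside [{low}, {high}]")
    return alerts

print(evaluate([("ULT-freezer-3", "temperature_c", -68.2)]))

In practice, a commercial platform handles ingestion, storage, and notification, but the underlying pattern of rules evaluated against a stream of readings is the same.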
The IoT is changing the laboratory environment,
and requires a powerful platform to transform large
amounts of data into valuable insights. Not only can the
right platform yield immediate benefits for laboratory
operations, it can also support the ever-expanding data
pipeline to ensure long-term success.
The Elemental Machines ecosystem is designed to
monitor the moment and inform the future.
To learn more, visit:
https://elementalmachines.com/
leadership & staffing
we’re headed” in the lab. They define the why, what, and
how of the lab’s work, and the focus of people across and
beyond the network of stakeholders.
Talented individuals and teams serve as active agents
for the lab’s strategic agenda, as integrators and executors,
makers and shakers, scouts and support crews, process
wizards and progress teachers. People assume different
roles, functions, and goals in strategic teams with assignments that are based on specific “talent blocks and beams”
that connect a range of skill sets, surrounded by a series
of experiences, behaviors, and perspectives. These weave
together to influence culture and build strategy.
Everyone in the lab feels the culture, and it prepares people
to engage in the work to be done at every level. Culture
shapes the casting of people in key roles across the lab, and
the cultural agenda advances two practical elements that connect people and strategy in action: foundations (readiness and competence) and expressions (intentions and exchanges).
Foundations
The foundations of the cultural agenda are like DNA.
They drive purpose, meaning, and scope and define
what matters most. Foundations derive from core themes
in strategic focus, value propositions, and value-based
statements of standards and principles. Aspirations are
driven by foundations. They provide touchpoints that reflect the lab’s purpose, vision, and mission. They are the
grounding of the lab, the blueprint for the lab’s offering.
Expressions
Expressions of the cultural agenda are like conversations
that convey what really matters and what effort and impact look like. They define, inform, and shape how people think about their work and their relationships. Expressions are
driven by the morning announcements, planning discussions,
and feedback exchanges that communicate the cultural agenda.
Culture provides the backdrop of attention. Managers use
the cultural agenda to bring attention to key issues, behaviors,
and efforts. They build on themes that support team engagement, learning, and advancement, with clear links to the lab’s
cultural foundations. They reinforce values with examples
and narratives that move people in their everyday thought and
behavior. Culture is active and dynamic; managers work on
and work through the foundations and expressions of culture.
The culture conversation
Many organizations today have “culture decks” that
provide summary references on values, principles, and
norms. These are often supported by graphics and images that contribute to the intended picture of culture
and climate. Culture decks frame expectations and
behaviors. They provide a general look at the aspirations of the lab, and they enable a more specific view of
the language that the managers and strategic teams use
to discuss the work to be done, working together, and
the nature of the road ahead. These serve as statements
that drive everyday conversations, reinforcing the
foundations and expressions, supporting the strategic
agenda for growth, performance, and change.
Culture conversations are sparked by questions like:
• Would you recommend your lab to others for
employment, and why?
• Would you share what success means to different people
across the lab setting, and why?
• Would you share how people work together to achieve
strategic goals of the lab, and why that really matters?
• Would you say that managers, staff, and teams follow the
best intentions of the lab’s cultural agenda on a consistent basis?
• Would you say that management is walking the talk—so
to speak?
• Does your view of the present and future look different
than management's?
Questions like these open conversations that connect staff to the strategy and culture. In the process,
individuals and teams gain permission to engage in
constructive debate about what matters most, and how
to address strategic intentions through teamwork.
Management connections
Building the climate for lab excellence is an essential
task of management. That task percolates, matures, and
integrates through the work of strategic teams. The
efforts of people serving on strategic teams drive the
collective impact that managers promise to stakeholders. Culture is the intersection for people in motion,
making things happen. Culture and strategy together,
blended in the work of strategic teams, is the formula
for excellence.
Daniel Wolf is president of Dewar Sloan, a strategy and development group with extensive ties in lab and technical markets.
He can be reached at [email protected]. Lydia Werth is
a research consultant with Dewar Sloan, focusing on strategic
teams and communication models.
ask the expert
Data Management for the
Entire Product Lifecycle
THE RIGHT LIMS CAN HELP ORGANIZATIONS
IMPROVE THE QUALITY AND SAFETY OF
PRODUCTS, FROM CONCEPT TO CONSUMER
Jacqueline Barberena is the senior director, Global Marketing and Product
Management at STARLIMS.
Q: How does a LIMS (laboratory information
management system) improve day-to-day
operations across an entire enterprise?
A: A LIMS typically extends beyond laboratory data management. It is a comprehensive solution that manages data
and helps with quality, regulatory compliance, and safety
throughout the entire product life cycle. LIMS solutions
can integrate with existing systems, and identify opportunities to improve processes so organizations can bring safe,
high-quality products to market faster.
Q: What are barriers associated with implementing
a LIMS and how can they be overcome?
A: The success of a LIMS project requires the creation of
accurate laboratory workflows within the LIMS, and staff
involvement to ensure everyone understands the benefits
and will work with the system. Overcoming barriers and
successfully implementing and deploying a LIMS requires, among other things:
• A clear understanding of the business and user requirements.
• LIMS workflows that match or improve existing workflows to enhance productivity.
• A clear understanding of how proper data management contributes to organizational success.
Q: What type of infrastructure is needed to
ensure the success of a data management
solution such as a LIMS within an organization?
A: I don’t believe a specific type of infrastructure needs to be
in place for a LIMS to help an organization. Our customers
range from completely paper-based start-ups, to large global
enterprises that are fairly automated but lacking the latest
informatics innovations. They can all benefit from a LIMS if
it is properly implemented with the correct functionality and
requirements to address the needs of the business.
Q: How does “technical debt” occur, and how
does STARLIMS address this problem?
A: “Technical debt” essentially means that as a laboratory’s instruments, platforms, and software are updated and replaced, outdated systems are retired from use yet may still need to be maintained—at a significant cost—to access data held in proprietary formats. New technologies may support digital transformation in the short term, but may result in technical debt five to 10 years later. STARLIMS has taken an evolutionary approach to technological development, so it can grow in parallel with the data requirements and formats of laboratories taking on new analytical technologies and workflows.
Q: What is on the horizon for STARLIMS?
A: Looking ahead, emerging technologies such as IoT, AI, and AR are essential to supporting customers on their journey. For STARLIMS, an example of this is the Digital Assistant, which lets users interact with the STARLIMS HTML5 solution by voice. Customers can speak commands into a microphone, launch applications and KPIs, conduct hands-free workflows while away from their desks, and build their own skills to meet individual business needs. The tool leverages advancements in AI and NLP (natural language processing), allowing the user to interact with the system using only voice. With continual and consistent
product releases and innovations, STARLIMS will stay
ahead of the technology curve.
lab design
Designing for the Unknown
FLEXIBLE LAB DESIGN PLANS ACCOMMODATE CURRENT NEEDS, FUTURE POSSIBILITIES
by Jeffrey Zynda and Adana Johns
What type of research environment is needed to support the development of technologies that will not exist for another decade? This was the
exact challenge faced by the project team that designed the
Integrated Engineering Research Center (IERC) at Fermi
National Accelerator Lab in Batavia, Illinois.
In the world of particle physics, where advanced devices, hardware, software, and technology are developed for multinational projects such as the Deep Underground Neutrino Experiment (DUNE), laboratory planners and designers
don’t have the luxury of asking researchers, “What do you
need?” Often, there are not definitive answers. Researchers may only be able to share a concept for an experiment
that will be funded and designed years down the road—
perhaps as many as 20 years in the future.
“Laboratory planners and designers
don’t have the luxury of asking
researchers, ‘What do you need?’”
Physics and engineering laboratories like IERC often
have unique needs, highly specific to equipment requirements and functional capabilities. With these kinds
of ever-evolving constraints, the design of successful
next-generation facilities demands a change in perspective and approach, starting with a robust and adaptable
framework that can be augmented to meet the specific
needs of today and the speculative needs of the future.
A view into the laboratory space of the Fermilab facility.
Credit: Perkins&Will; Julian Roman and Thang Nguyen
In contrast to life sciences laboratories—where the
range of activities and needs can be reasonably predicted,
and the type of flexibility can be anticipated—project-based physics and engineering laboratory environments
must speculate based on the trajectory of the science. To
deliver a facility that will meet the needs of today’s known
programs and provide future flexibility, one must first
dive into the drivers of science and engineering.
A modular approach
At IERC, this process was based on programming
around the “scientific use-cases” of project types that
were planned on a 20-year horizon. The IERC is the
first building at Fermilab that intentionally brings
together multiple departments and capabilities to foster
a cross-departmental and cross-divisional collaboration
platform. From this perspective, a broad look at current
needs and future potential created a strong set of guiding principles for the design of this unique facility.
Rather than embracing the idea of flexibility—attempting to anticipate every future need and incorporating features that meet these speculative needs—the IERC takes
a markedly different approach in providing an adaptable
framework that removes obstacles to future augmentation
as new research and development needs emerge.
During the course of reviewing more than 400 use-cases
and determining the types of spatial support that projects
might need—such as clean-class requirements, assembly
capabilities, fabrication, equipment, and services—the
design team began to analyze the data to identify commonalities and distinct differences in capabilities. Based on the
spectrum of use-case potential needs, the team developed a
modular approach to space allocation and building systems.
The outcome was a program that included clean-class fabrication space or project labs, core instrumentation and tool
laboratories, and function-specific electronic fabrication
laboratories, as well as adaptable dry-bench lab space for
small electronic design and fabrication.
The project labs were conceptualized as large workshops for the development and testing of novel technologies in support of Fermilab’s initiatives. Individual project labs are anchored by a central support “spine” that
delivers critical service needs such as power, laboratory
gas (nitrogen, CO2, compressed air), fiber-optic data,
and exhaust capabilities for specialty needs. Flanking
either side of the support spine, individual project labs
were designed with three generations of project support
taken into account. Each project lab was designed to be
ISO clean-class capable. Rather than installing expensive
building infrastructure that may never be used, space
was set aside to add air-handling units, ductwork, and
HEPA or ULPA filtration units. This ability to construct
for the known needs of today and in the immediate
future controls cost while providing the means to adapt
the facility to future requirements.
The ground level of the IERC has been designed
to facilitate fabrication for large-scale project needs,
expressed in a series of project laboratories. Overhead
cranes were provided to utilize these spaces for the fabrication, assembly, and movement of large-scale detectors
and vessels, such as dilution refrigerators. Each project
laboratory is designed with utility service panels in the
perimeter walls providing power, data, compressed air,
nitrogen, and system-process chilled water on an 11-ft.
module. Service distribution trenches are provided in each laboratory to enhance delivery to equipment, instrumentation, and workbenches, all while eliminating tripping hazards and allowing end-users to readily access services on an as-needed basis.

The Liquid-Argon Cube Laboratory will test detectors that are destined for the “far site” at the Sanford Underground Research Facility in Lead, South Dakota, as part of the Long-Baseline Neutrino Facility. Credit: Perkins&Will; Julian Roman and Thang Nguyen
First signs of success
While the true test of this level of adaptable design
and planning will come when the IERC opens in 2022,
its approach to adaptability has already been tested
during the design process. After the preliminary design
phase of the project, the project team learned of significant programmatic changes to accommodate new
DUNE-related initiatives and emerging technology
developments—which required little to no re-design
of the building. Some simple re-configuration of utility
requirements and specific instrumentation accommodation met the new program needs, underscoring the value
of this adaptable approach and its potential to support
generations of unknown research needs.
The project team developed a toolset of “core capabilities” to support the development of future projects and
eliminate needless duplication of resources. Programmatically, this was aptly named Shared Core Lab and provides
the ability to support common needs across the spectrum of
project labs, such as coordinate measuring machines, wafer
fabrication tools, and similar commonly shared equipment
assets. To support the widest range of known requirements
today, these tools are housed in an ISO 7 clean-class environment that is future upgradable to ISO 5.
“Researchers may only be able to
share a concept for an experiment
that will be funded and designed
years down the road—perhaps as
many as 20 years in the future.”
Today’s researchers, including those at Fermilab, are
working to deepen humankind’s understanding of the
universe so that scientific discovery can advance our
global society. Buildings like IERC have the potential to
set an example for how architecture—and specifically,
the thoughtful design of physical spaces for scientific inquiry—might support and advance these altruistic goals.
Jeffrey Zynda is principal, Northeast regional practice leader,
science and technology; and Adana Johns is associate principal,
practice leader, science and technology; both with Perkins&Will.
product in action
INTEGRA MINI 96
PORTABLE, PRECISE, AND AFFORDABLE PIPETTING OF 96- AND 384-WELL PLATES
The INTEGRA MINI 96 is an ultra-compact, lightweight, and portable 96-channel pipette, offering high throughput and reproducibility for virtually any microplate-based liquid handling task around your lab.
LIGHTWEIGHT AND PORTABLE DESIGN
The built-in carry handle makes it easy to move anywhere in the lab, including inside laminar flow cabinets. The pipette’s small size makes it easy to have two instruments side-by-side to perform different steps in the same workflow.

TOUCH WHEEL-CONTROLLED GRAPHICAL INTERFACE
The large, easy-to-use touch wheel-controlled graphical interface offers a selection of predefined pipetting tasks, or allows users to develop custom workflows. On-screen tutorials for new users mean no special training is required.
MOTOR-ASSISTED OPERATION
Ensures precise electronic tip loading and ejection, mixing, and dispensing. Every channel is positioned at the same height and angle, and dispenses at the same rate, providing high precision and reproducibility.

RANGE OF VOLUMES
Available in four volume ranges to offer 0.5 to 1250 µl pipetting.
DEVELOPED TO MEET VARIOUS DISPENSING NEEDS
From repeat dispensing and mixing to reservoir-to-plate or plate-to-plate transfers in 96- or 384-well formats. An optional two-position stage and removable second stage provide flexibility in your workflows.
The MINI 96 can transform the productivity of your workflow. Discover the most
affordable 96-channel pipette on the market at www.integra-biosciences.com
To learn more, visit www.integra-biosciences.com | 22 Friars Drive, Hudson, NH 03051 USA
lab design
A Beacon for Life Sciences
UNIVERSITY OF MICHIGAN BSB WINS EXCELLENCE IN INNOVATION PRIZE IN
2021 LAB DESIGN EXCELLENCE AWARDS by MaryBeth DiDonna
The University of Michigan has hosted a biological sciences program since the hiring of its first botany and zoology professor in 1884. The Kraus Natural Sciences Building was later built in 1915, followed by the Ruthven Museums Building in 1928. By the early twenty-first century, however, the university realized that it would
be better served by a modern biological sciences building
that could accommodate fast-paced research, as well as an
up-to-date museum facility able to house a collection that is
surpassed in size only by the Smithsonian Institution.
“We wanted to create a facility that
would attract talented researchers
and students, and allow world-class investigation to occur.”
The resulting building—the Biological Sciences Building & Museum of Natural History—unites the University
of Michigan’s biology departments under one roof, promoting interdepartmental collaboration and opportunities for researchers and the public to interact. The project
broke ground in September 2015, and achieved occupancy
in April 2019. At a cost of $261 million, it offers a highly
visible, spacious home for the school’s natural history
museum, earmarked by a magnificent mastodon display in
the building’s soaring atrium.
Views into the open labs from the atrium and bridge.
Credit: Aislinn Weidele
In recognition of their success in developing a building
that combines science, research, and community-building, Lab Manager has awarded SmithGroup and Ennead
Architects—the architect of record and the design architect, respectively, for the Biological Sciences Building
& Museum of Natural History—with the Excellence
in Innovation prize in the 2021 Lab Design Excellence
Awards. SmithGroup additionally served as the laboratory planner, MEP engineer, structural engineer, civil
engineer, and landscape architect for the project, and
Ennead also acted as the interior designer.
Science showcase
The purpose of the Biological Sciences Building
(BSB) was to transform the way science is conducted
and communicated in the twenty-first century. Susan
Monroe, capital projects manager for the University of
Michigan’s College of Literature, Science, and the Arts,
says that the main goal of this project was to unite the
school’s Molecular, Cellular, and Developmental Biology (MCDB) and Ecology and Evolutionary Biology
(EEB) departments. “We wanted to bring them together
with neuroscience research and with the paleontology
researchers. We wanted to create a facility that would
attract talented researchers and students, and allow
world-class investigation to occur,” says Monroe. “We
wanted a facility that would foster collaboration and
interaction. And we wanted a building that would engage
the broader campus and public community to showcase
science and the exhibit collections in the form of a new
Museum of Natural History.”
The BSB is located on the University of Michigan’s
central campus in Ann Arbor, on the site where North
Hall and the Museum Annex Building once stood. The
BSB serves as a “beacon” for the surrounding Life Sciences Neighborhood, where it acts as a liaison between
the historic academic campus core and the developing
life sciences core, and then continues on to the medical
center campus to the north. The ground level of the BSB
contains a main entrance to the atrium, along with striking exhibits and a café, to draw in the school community.
The plaza-level Science Lawn connects the surrounding Life Sciences Institute and Undergraduate Science
Building, and forms an outdoor gathering area.
The project team decided to avoid the traditional
design of insular blocks that appear in many other
research buildings. Instead, they developed a plan that
includes three articulated towers, joined by large atriums,
to form a collection of research neighborhoods. These
neighborhoods share certain collaborative spaces such as
break rooms, conference rooms, and lounge spaces. The
research neighborhoods also share open labs, flexible
work spaces, lab support spaces, and core facilities such
as imaging, aquatics, and plant growth. However, other
amenities are spread across the three towers, encouraging
research groups to travel to other neighborhoods and interact. The elevators are strategically placed in the center
tower, to create an active zone of circulation. PI office
space was reduced by about a third, in an effort to increase
square footage for open lab and collaboration spaces.
Promoting collaboration
The project team utilized a social network mapping
survey to document existing research collaboration,
as well as desired future research collaboration, to
1. The Biological Sciences Building at dusk is a beacon for scientific discovery. Credit: Bruce Damonte. 2. Biodiversity Lab with transparency to atrium. Credit: Aislinn Weidele. 3. The museum atrium houses remarkable mastodon skeletons, witnessed when you first enter the museum. Credit: Aislinn Weidele. 4. Two-way communication with the Paleontology researchers. Credit: Aislinn Weidele.
understand the depth of connection between building
users and develop the best possible collaborative infrastructure. The planning team identified 21 themes/
sciences that could be co-located in areas by neighborhood. This included 19 wet lab neighborhoods and two
computational neighborhoods, grouped by compatible sciences rather than by department.
“I think one of the fantastic formal recognitions that
the team came to...is that this program could not be
represented monolithically; that it would not lend itself
to the superblock notion of the laboratory. The analysis
showed some fairly discrete neighborhoods that could
be developed, and really begin to organize into research
neighborhoods,” says David Johnson, design and lab liaison, SmithGroup. “[It allows] people to recognize how
engaging and charismatic science is, in that laboratories
like fossil prep really become inside-out opportunities to
engage the user community and to engage the public in
the mission of science at the University of Michigan.”
The university’s Museum of Natural History was
previously housed in the Ruthven Building, but was
“relatively unknown,” says Jarrett Pelletier, project
designer, Ennead. “You'd have to know it was there; you
didn't walk by it and see the museum, you had to sort of
uncover it. While it was beloved, in its previous location
it was hard to spot. The museum wanted to turn that on its head and really make it a centerpiece for
the design of the biological buildings and really celebrate
those two interactions,” he says.
The museum is co-located on three floors of the
BSB alongside research programs, “so there's a sort of
overlap. We chose to maximize all of the contact areas
as much as we could in the design of the museum space,
to encourage collaboration and chance encounters and
really leverage the proximity of those two programs,”
says Pelletier. According to SmithGroup and Ennead,
museum memberships have tripled since the museum
was relocated to the BSB.
Inspired by the sciences
The university’s natural sciences program was an
influence on the design of the BSB, as well as its energy-efficient strategy. Natural systems, native landscape, and
stormwater management strategies are utilized to form
an immersive environment surrounding the building.
The labs embrace natural light to overcome the stereotype of the lab as a dark, dingy place. The labs are also
designed to be free of barriers, to promote the well-being
of the occupants, and to develop a sense of community
via shared bench space. Holistic ventilation has resulted
in lower energy costs, and also means increased occupant ventilation and enhanced indoor air quality. The
BSB has received LEED Gold Certification.
“We did a lot of things to reduce energy usage while
supporting the health and safety of everyone in the
building. The systems design really takes advantage of
transferring and utilizing energy efficiently at every
stage of the building operation through some key design
decisions and strategies,” says Jeff Hausman, principal in
charge, SmithGroup. “First, makeup air is pre-heated with waste heat from the process cooling water loop,
then the amount of air needed is reduced by the use of
low-velocity chilled beams. Then, outside air is used to
cool the highly utilized spaces, such as classrooms, offices, and the graduate student spaces, and then transferred
to the labs as makeup air. Finally, the non-exhausted air
is used to ventilate the linear equipment rooms. We are
wringing every bit of energy out of the airside system to
make the overall building as efficient as possible for an
intensely used lab building. The energy use intensity for
the building is actually below 100 kBTU per square foot,
which is 70 percent better than a benchmark lab building from 2003, and met the AIA 2030 challenge when
the building opened. The building is 30 percent more
efficient than any code building is today.”
Safety initiatives in the BSB include shared yet
separate storage areas for chemicals and biologicals
jointly used among research neighborhoods; fume hoods
(including shared radioisotope fume hoods) that are
housed in alcoves apart from human circulation; and the
location of biosafety cabinets in lab support spaces, apart
from general circulation areas. Each neighborhood in
the BSB is outfitted with lab sinks, eye wash stations, and
emergency showers. Desks are located away from open
labs to reduce exposure. Passenger elevators and circulation areas are situated near faculty offices to avoid the
laboratories, and separate service elevators accommodate
the transport of chemicals, biologicals, and research
specimens from the loading docks to the labs.
“We think the building is really poised to help elevate
science education in the future, and future generations,”
says Pelletier. “And we think that's a great aspect about
bringing all these programs together.”
MaryBeth DiDonna, lab design editor for Lab Manager, can
be reached at [email protected].
IMPLEMENTING A DOCUMENT
MANAGEMENT SYSTEM
Working with a vendor to create a laboratory document control system
Data integrity requires vigorous adherence to protocols and meticulous record-keeping. Hand-typed dates and times no longer satisfy
regulators and accreditation bodies. Digital
options have dramatically evolved into sophisticated
software platforms that track users, dates, procedures,
and more. This means the platform itself does the
record-keeping. Using such software now allows laboratories to meet and exceed regulatory standards.
The current document management landscape is
divided into two separate types of systems. Older systems
have data fields for information entry, such as when files
were reviewed or approved. Manual entry in these older
systems puts your lab at risk of human error, because they
rely on the word of the person who typed in the information. More modern systems directly track when those files
are reviewed and approved. These newer, more sophisticated systems don’t even give users the option to manually
enter data, which ensures built-in data integrity.
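The difference between the two generations of systems comes down to who creates the record. As a rough, hypothetical Python sketch (not any particular vendor's implementation), a modern system stamps the user and time itself at the moment of approval, rather than accepting a typed-in date.

import datetime

# Hypothetical audit trail: the platform records who approved a
# document and when, instead of trusting a hand-typed date.
class Document:
    def __init__(self, name):
        self.name = name
        self.history = []  # append-only audit trail

    def approve(self, user):
        # Timestamp generated by the system, not entered by the user
        self.history.append({
            "action": "approved",
            "user": user,
            "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        })

doc = Document("SOP-042 Document Control Procedure")
doc.approve("j.smith")
print(doc.history)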
Modern systems also allow teams to collaborate directly, automatically keeping track of versions. Although
most documents required for accreditation and validation
don’t change from year to year, it still feels like starting
from scratch when paperwork is due. Document management software creates checklists to help identify what
can be reused to prove compliance. Such platforms can
also assign tasks to individual users, so everyone understands their responsibilities. These features will save
you time and money.
DEVELOPING A LAB DOCUMENT
CONTROL SYSTEM
Build a team
First, identify stakeholders. Define all the users
and assign them roles. Quality management software
provider SoftTech Health suggests labels such as
beneficiaries, drivers, supporters, or champions.
Once the list of stakeholders is complete, choose the
key players. These team members will brainstorm the
requirements of the software and evaluate its delivery.
Define the scope of work
First, define the purpose of the project—include why
the software is needed. Your vendor must understand
what your needs are—time management, cost savings,
or regulatory compliance. Your entire team needs to
understand the “why” for the new system now, not
when they are trying to learn it later.
Use the defined purpose to develop a list of desired
functions and features, but also make a clear list of
project exclusions. A wish list is great, but remove
unrealistic items to avoid disappointment on both sides
of the contract.
Finally, create a list of acceptance criteria. Use questions
with yes/no answers paired to items in the scope of work.
This makes the project clear for both you and your vendor.
Ultimately, it provides you with backup in case those
criteria are not met upon delivery.
Once this work-scope document is complete, circulate it to all team members. Getting buy-in early
smooths the way later and helps your team look forward to the new system.
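One lightweight way to keep acceptance criteria unambiguous is to record each yes/no question against the scope-of-work item it verifies. The Python sketch below is purely illustrative; the item names are invented and this is not a feature of any specific vendor's product.

# Hypothetical acceptance checklist: each criterion pairs a yes/no
# question with the scope-of-work item it verifies.
criteria = [
    {"scope_item": "File migration", "question": "Were all legacy files migrated?", "passed": True},
    {"scope_item": "User migration", "question": "Were all user accounts migrated?", "passed": True},
    {"scope_item": "Training", "question": "Was train-the-trainer training delivered?", "passed": False},
]

failed = [c for c in criteria if not c["passed"]]
for c in failed:
    print(f"NOT MET - {c['scope_item']}: {c['question']}")
print(f"{len(criteria) - len(failed)}/{len(criteria)} acceptance criteria met")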
Managing the project
As mentioned above, get a clear list of the vendor’s
deliverables. If your software deployment falls behind
schedule, you can keep the team and management updated.
Software installations rarely happen overnight.
Tweaks and updates are inevitable. If changes come up,
discuss them with the team and ask for input. Meet the
issues head-on and keep senior management informed.
Training is a huge part of the management of the
project. Deployment day is exciting, but also full of
anxiety. Having excellent training helps ease deployment. Training should include real-world examples.
Ask if the vendor has a train-the-trainer option—one
or two team members receive in-depth training to provide on-site support to the rest of the team.
Completing the project
After deployment, evaluate the project. Ensure the acceptance criteria list is complete and the deliverables satisfied. Sign off on all contracts, and back up the documents and the system. Finally, announce the launch and send thank-you messages.

BEST FEATURES IN A DOCUMENT CONTROL SYSTEM
A key deliverable is to understand the functionality you’re looking for, as well as the minimum functionality required.
For minimum functionality:
• Vendor must migrate all your files into the system
• Vendor must migrate your users into the system
• A clear list of what the vendor requires before the launch
Questions your vendor must answer in writing:
• What training is included, how is it delivered, and for how long?
• What support is available, and what are the response time guarantees?

Today’s sophisticated software systems help labs guarantee data integrity, while ensuring maximum user uptake with a user-friendly interface. For lab managers, it’s key to recognize which systems will safeguard their data integrity by providing automatic tracking rather than manual record-keeping on a computer. By following the simple best practices listed here, your lab can take proactive steps toward a successful software rollout.

To learn more, visit:
https://softtechhealth.com/
health & safety
PARTICIPATION IN SAFETY ACTIVITIES FROM STAFF AND MANAGEMENT
IMPROVES SAFETY CULTURE by Tabi Thompson
Consistent safety activities have the potential to create a stronger safety culture. When lab management fully embraces these tools, the benefits can include decreased injuries, increased morale and productivity, and cost savings. How can safety activities contribute to a
stronger safety culture? Consider this: Major League Baseball
players practice similar drills as Little Leaguers. By turning
the fundamentals into habits, professionals and amateurs alike
can focus on more complex tasks. This lesson applies to all
aspects of life, including a safe laboratory environment.
The following safety activities are options for companies to consider implementing. These activities should
be completed consistently with intention and recorded
for the sake of accountability.
Lab inspections
Lab inspections help provide a baseline for lab cleanliness
and serve to ensure that labs are safe for employees. By regularly inspecting a lab, the staff are encouraged to keep their
space tidy and safe. Most workplaces already perform lab
inspections, but how they are done can be just as important as
simply doing them. Lab cleanliness requires time. Through inspections, lab managers can see first-hand whether their employees require more time to attend to cleaning activities.
An inspection checklist and inspector diversity can further
improve lab inspections. It’s easy to only catch the most obvious offenses and miss smaller problems, which have the potential to become larger over time. Prepared checklists allow
inspectors to inspect labs with greater consistency. Including
a diverse range of employees on the inspection team (i.e.,
employees from all levels of the organization) can ensure that
inspections don’t become routine, and regularly rotating the
members of an inspection team can help spot problems other
inspection teams overlook.
Incident prevention tasks
Incident prevention tasks (IPTs) are performed when an
employee observes an unsafe behavior or condition and addresses or corrects the issue. For example, a common unsafe
condition is water on the floor. If an employee observes this
condition, they should immediately correct the condition
by removing the water from the floor to prevent someone
else from slipping and falling, which is “among the leading [cause] of serious work-related injuries and deaths,”
according to OSHA.1 The key to preventing common—and
sometimes more complex—incidents is to act immediately
when unsafe behaviors or conditions are seen. Incidents
cannot be prevented if employees assume issues will be corrected by someone else. All employees have the responsibility to speak up about, and get involved in, the site’s safety.
Eventually, employees will observe and habitually correct
risks when they routinely perform IPTs.
IPTs should be performed by all employees within an
organization to be effective. Involvement throughout the
organization shows employees that safety is a priority,
and it fosters a cooperative environment. Moreover, employees exposed to greater and more varied risks should perform more IPTs than those with less risk exposure.
Consequently, IPTs may offer a certain level of preemptive action when enough information is recorded.
For instance, if there is a persistence of IPTs concerning
ergonomic issues, the organization can be proactive in
addressing future ergonomic issues by strengthening their
existing practices, policies, or trainings. Also, if there appears
to be a preponderance of issues recorded within IPTs over a
shorter time period, lab managers may choose to focus discussions with their employees on specific hazard categories.
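Spotting a persistent pattern of IPTs presumes the records are tallied somewhere. A minimal, hypothetical Python sketch of that tally (the categories and dates are invented for illustration) might look like this:

from collections import Counter
from datetime import date, timedelta

# Hypothetical IPT log: (date, hazard_category) records
ipt_log = [
    (date(2021, 9, 1), "ergonomics"),
    (date(2021, 9, 8), "slip/trip"),
    (date(2021, 9, 15), "ergonomics"),
    (date(2021, 9, 22), "ergonomics"),
]

def focus_categories(log, today, days=90, min_count=3):
    # Return hazard categories recorded at least min_count times
    # within the last `days` days.
    cutoff = today - timedelta(days=days)
    counts = Counter(cat for d, cat in log if d >= cutoff)
    return [cat for cat, n in counts.items() if n >= min_count]

print(focus_categories(ipt_log, today=date(2021, 10, 1)))  # ['ergonomics']

A recurring category, ergonomics in this example, then becomes the focus of the next safety discussion.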
Activity drift checks
Over time, lab workers can, often unintentionally, start to
perform an activity differently than the written procedure
details. This “drift” from the procedure can introduce hazards
when left unchecked. Activity drift checks (ADCs) provide an
opportunity for lab managers to check in with their employees on activities with documented procedures they regularly
perform. Regular review of how tasks are performed yields several benefits. First, quickly catching unsafe drifts
from the written procedure can prevent injuries from occurring. Second, drifts that result in increased safety or efficiency
of the task may be added to the procedure to document and
share best practices. Finally, the review process presents an
opportunity to improve the lab safety equipment.
ADCs work best when a lab manager observes an
employee physically performing the task. The lab manager should simply observe the employee’s actions while
comparing to the written procedure and make note of
any differences, both safe and unsafe. Once the activity is
complete, both lab manager and employee can discuss potential improvements. ADCs aren’t about assigning blame
or getting anyone in trouble; they’re a learning opportunity and a chance to improve the safety of the lab.
Organizational accountability
Organizations can implement many safety activities,
but they’re meaningless if no one performs them. Enacting a safer work culture to prevent common mistakes or
injuries requires lab managers and site leadership to see
the value in a stronger safety culture. Participation of
everyone—at all levels within the organization—is required. Organizations that recognize and advocate for a work culture where nothing is more important than the safety of their employees and the communities to which they belong empower those employees to act, support, and challenge one another to work safely.
Frequency of safety activities
How often do safety activities need to be done to form
positive workplace habits? Habits are defined as “actions
that are triggered automatically in response to contextual
cues that have been associated with their performance.”2
For example, automatically addressing or correcting an
unsafe situation (action) after seeing the unsafe situation
(contextual cue). “Decades of psychological research”2
shows that repetition of an action in response to a consistent
contextual cue leads to habit formation. Organizations can’t survive on safety alone, though. Consistency of activities is key, so setting a weekly or monthly goal for every employee can build a slow but steady progression of safety habits.
In a jaded world where employees frequently feel expendable, organizations can boost morale by prioritizing
their safety. But improving the safety culture doesn’t just
benefit employees. By effectively implementing safety
activities designed to build positive workplace habits,
organizations can decrease and prevent injuries and
increase productivity, giving way to greater cost savings;
a rare win-win scenario.
DuPont Bradley Curve
In the 1990s, DuPont devised and implemented a model
known as the DuPont Bradley Curve,3 which describes
stages an organization’s safety culture must go through to
progress toward a goal of zero injuries. These stages are:
• Reactive: People don’t take responsibility and believe
accidents will happen.
• Dependent: People view safety as following rules.
Accident rates decrease.
• Independent: People take responsibility and believe
they can make a difference with actions. Accidents
reduce further.
• Interdependent: Teams feel ownership and responsibility for safety culture. They believe zero injuries is
an attainable goal.
When applied in tandem with safety activities designed
to build positive workplace habits, this system provides a
guideline for organizations to follow to develop a stronger
safety culture. A strengthened safety culture saves organizations money by avoiding and preventing injuries; and when
organizations focus on safety, employees feel valued, leading
to lower turnover rates from higher morale and productivity.
However, organizations hoping to achieve the interdependence stage should make a critical distinction.
Believing that zero injuries is attainable through a strong
safety culture is not the same as attaining zero injuries
by underreporting or changing what qualifies as an
injury. A culture born from sweeping incidents “under
the rug” harms employee morale, and will not prevent
injuries or more serious incidents from occurring.
The DuPont Bradley Curve shows the four stages an organization’s safety culture should go through to progress: Reactive, Dependent, Independent, and Interdependent. As the relative safety culture improves, the injury rate decreases. Credit: DuPont Sustainable Solutions
The writer acknowledges Air Products & Chemicals and
Evonik Industries as sources of inspiration for some of the best
practices outlined in this article.
Tabi Thompson is a former bench chemist with a BS in chemistry
from Wittenberg University, who spent more than 15 years working
in a variety of roles in chemical and pharmaceutical industries. Most
recently, she apprenticed as a safety training coordinator at Evonik in
Allentown, PA. There, she influenced the implementation of the Safety
at Evonik program aimed at bolstering the safety culture as well as
managing updates to the safety training program. Currently, she is a
freelance writer, proofreader, and copyeditor in Bethlehem, PA.
References:
1. https://www.osha.gov/walking-working-surfaces
2. Gardner, B., Lally, P., & Wardle, J. (2012). Making health habitual: the psychology of 'habit-formation' and general practice. British Journal of General Practice, 62(605), 664–666. https://doi.org/10.3399/bjgp12X659466
3. https://www.consultdss.com/bradley-curve/
industry insights: synthetic bio
A New Era of Protein
Interaction Engineering
NEW BIOENGINEERED PROTEIN DEVICES ALLOW NEAR-INSTANT RESPONSE TIMES,
BUT HIGHLIGHT SYSTEM NEEDS by Rachel Brown, MSc
Imagine detecting and instantly counteracting an overdose through a micro in vivo device. Or an early detection system that warns of an impending heart attack based on trace biomarkers. Or a pill that can diagnose disease. New synthetic biology research out of MIT paints these seemingly science fiction scenarios as realistic in the not-too-distant future with a new approach to protein switches.
The burgeoning field of synthetic biology has already
gifted us with the incredible. Active applications of
design advances in biocatalysts, protein switches, gene
switches, and genome engineering tools include intracellular biosensor devices. Proof of concept exists for micro-engineered biosensors that can be swallowed and wirelessly relay real-time detection of internal bleeding.
To date, the field has primarily relied on lengthy and
resource-taxing transcription and translation processes
that take minutes, hours, or even days to respond to
inputs. Applications requiring faster, near-instantaneous
responses—like those relating to cell signaling or metabolism—require engineering fast protein-protein circuits
with an eye to systems behaviors. This approach faces
challenges relating to biological complexity and the unknown. How the field of synthetic biology advances from
here depends on how these challenges are addressed.
ENGINEERING A PATH THROUGH
BIOLOGICAL COMPLEXITY
Biological complexity is a tricky beast—one that
stymies progress in synthetic biology. Synthetic biology
is a collision of biology, chemistry, and engineering that
aims to engineer new biological functions and inform
systems biology research.
Fundamental to an engineering approach to programming biological functions and cell behaviors is introducing modularity—breaking down complex processes into
manageable units that can be assembled, reorganized, and
connected to existing endogenous functions. Engineering
relies on predefined materials with predictable behavior,
on-demand procurement, and foundational rules or models
dictating how materials can be combined1. Establishing this
baseline within biological systems is fraught with challenges.
“The issue of biological complexity
is frequently raised when assessing
reproducibility of research.”
Starting from nucleic acid design and regulation
has resulted in impressive strides in the field. Genetic
engineering was in full swing during synthetic biology’s formation and forms the base of early established
abstraction levels: “DNA” manipulation, biological
“parts” like proteins, “devices” assembled from parts, and
“systems” from patterns of devices1. It is—to an extent—
modular in form and allows engineered functions to be
considered separately from endogenous functions. The
wealth of knowledge of regulation networks that has
built up over the decades has allowed for the engineering
of sophisticated networks. However, starting a function
with DNA transcription and translation slows down any
programmed biological response, in part due to competition for shared cellular resources, which limits the
applications of the technology.
Establishing networks composed solely of protein-protein
interactions speeds biological response time, but is difficult, according to Ron Weiss, MIT professor of biological
engineering, electrical engineering, and computer science.
Understanding how to design appropriate chimeric proteins
and reliably predict the upstream and downstream interactions is still developing and will prove a major challenge to
the field moving forward. It highlights the need for thinking
of engineered networks as part of a whole.
Weiss describes a current perspective shift: “I think a lot
of our [early] thinking in synthetic biology, certainly mine,
[was] ‘let's build a circuit and then put it into the cell, but
without [fully considering] endogenous pathways.’” According to Weiss, improved understanding of the systems
context around embedded engineered networks is critical
to success in the field. While conversations on this need are
as old as the field, Weiss draws a distinction between talking about it and investigating it deeply. Once synthetic biology and systems biology are better integrated, he believes
that each area will provide valuable insights into the other,
“ultimately [paying] huge dividends.”
While greater insights into systems biology will aid
synthetic biology in managing complexity as research expands in scope, it’s possible that improved standardization
will reduce it. The issue of biological complexity is frequently raised when assessing reproducibility of research2–4.
But how much variability in results is due to inadequate
standardization of methods, protocols, and measurements?
While in-depth conversations around the need for improved measurements and standardization in biology—and
across the sciences—have been ongoing for years3,5, correction in the research community has lagged2,6,7. Measurements of biological activity, for example, are often relative.
This increases ambiguity in results across the field but
becomes more urgent when involving engineering efforts.
Jacob Beal, a senior scientist at Raytheon BBN Technologies, is part of the wave of researchers pushing for
improved consistency. He believes that a considerable
portion of the variability and unpredictability in results
would be eliminated with improved standardization.
“There's a huge amount of time and energy wasted just
trying to recreate missing information about units, protocols, and designs,” he explains. He’s observed researchers
waste “months or even years debugging tricky biological
issues” that turn out to be instrument or protocol related and that—once fixed—demonstrate “more systematic
and predictable” biology than previously believed. Beal
expects that once enough labs commit to a particular minimum standard of information quality, reduced time and effort costs
will instigate “a vast acceleration of the field.”
A NEW APPROACH
A new paper in Science describes an engineered protein
circuit that uses reversible protein phosphorylation-driven
interactions8. As Deepak Mishra, lead author and MIT
research associate in biological engineering, notes in a
press release, this provides “a methodology for designing
protein interactions that occur at a very fast timescale,
which no one has been able to develop systematically.”
The authors used a combination of endogenous and
exogenous proteins to build complexity in a novel, reversible bistable toggle switch. Taking it a step further, they
demonstrated its ability to control basic cell processes by
tying the switch to cell division, successfully flipping on
and off a yeast cell’s ability to bud by alternately exposing
the yeast to two different chemical signals.
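For readers newer to the concept, bistability is easiest to see in the classic mutual-repression toggle model. The sketch below is a generic textbook formulation, not the phosphorylation circuit from the paper; all parameter values and the pulse-style "chemical signal" are illustrative assumptions. Two components repress each other, the system settles into one of two stable states depending on its history, and a transient input flips it:

```python
def toggle(u, v, alpha=5.0, n=2, dt=0.01, steps=5000, pulse=0.0):
    """Euler-integrate a generic mutual-repression toggle switch.

    du/dt = alpha/(1 + v**n) - u + pulse   (pulse mimics an inducer signal)
    dv/dt = alpha/(1 + u**n) - v
    """
    for _ in range(steps):
        du = alpha / (1 + v**n) - u + pulse
        dv = alpha / (1 + u**n) - v
        u, v = u + du * dt, v + dv * dt
    return u, v

# Identical parameters settle into opposite stable states depending on history:
print(toggle(4.0, 0.1))   # stays in the high-u state
print(toggle(0.1, 4.0))   # stays in the high-v state

# A transient "chemical signal" (a pulse on u) flips the switch, and the
# new state persists after the signal is removed:
u, v = toggle(0.1, 4.0, steps=2000, pulse=6.0)
print(toggle(u, v))       # now resting in the high-u state
```

The same memory-like behavior is what lets the authors' protein-level toggle hold a cell state, such as budding on or off, between signals.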
Using endogenous proteins in the network, the authors
created a more complex toggle than most others to date,
with more dependencies. Weiss, senior author, hopes
this demonstration of incorporating existing biological
systems in device design will contribute to pushing the
boundaries of synthetic biology, particularly when it
comes to building complexity and drawing inspiration
from existing sophisticated networks.
The added layers and complexity in their toggle circuit prompted the authors to search the study organism
for similar endogenous toggle circuits. So far, according
to Weiss, regulatory network discovery has been limited
by the streetlight effect, searching familiar areas for
familiar topologies. The authors instead searched for a
diverse array of—less optimal but potentially more evolutionarily probable—topologies that would achieve the
same result. They found six. “We wouldn’t think to look
for those because they’re not intuitive… This is a new,
engineering-inspired approach to discovering regulatory
networks in biological systems,” says Weiss.
This marks a considerable step forward in designing
protein circuit devices. The authors see immediate use
in developing biosensors for environmental pollutants,
particularly given their high sensitivity to triggers. Future
development of similar custom protein networks opens
the door to a wide range of diagnostic capabilities.
Similar networks can add complexity to the type of
microengineered biosensor mentioned earlier, suggest the
authors. This could allow for detection of multiple biomarkers, such as those associated with cancer, neurodegenerative
disease, inflammation, or infection, and changes in concentration over time9,10. There’s even potential for immediate
intervention. “You could have a situation where the cell
reports that information to an electronic device that would
alert the patient or the doctor, and the electronic device
could also have reservoirs of chemicals that could counteract a shock to the system,” Weiss notes in the press release.
There is more work to be done before realizing these
potentials, both regarding the findings and within the field.
Understanding the modularity of fusion proteins and signal
strength for this sensor is key, says Beal. He also notes the
work that remains to build consistent systematic libraries.
And, of course, a better understanding of circuitry integration is needed to drive smart designs. Still, in a field with immense progress year over year, it seems that when standardization leads to reliable reproducibility, anything’s possible.
REFERENCES:
1. Endy, D. 2005. Foundations for engineering biology. Nature 438, 449–453. DOI: 10.1038/nature04342. https://www.nature.com/articles/nature04342
2. Coxon, C. H., Longstaff, C. & Burns, C. 2019. Applying the science of measurement to biology: Why bother? PLoS Biol. 17, e3000338. DOI: 10.1371/journal.pbio.3000338. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6605671/
3. Vilanova, C. et al. 2015. Standards not that standard. J. Biol. Eng. 9, 15–18. DOI: 10.1186/s13036-015-0017-9. https://jbioleng.biomedcentral.com/articles/10.1186/s13036-015-0017-9
4. Sené, M., Gilmore, I. & Janssen, J. T. 2017. Metrology is key to reproducing results. Nature 547, 397–399. DOI: 10.1038/547397a. https://www.nature.com/articles/547397a
5. Plant, A. L. et al. 2018. How measurement science can improve confidence in research results. PLoS Biol. 16, e2004299. DOI: 10.1371/journal.pbio.2004299. https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.2004299
6. Stark, P. B. 2018. No reproducibility without preproducibility. Nature 557, 613. DOI: 10.1038/d41586-018-05256-0. https://pubmed.ncbi.nlm.nih.gov/29795524/
7. Beal, J. et al. 2020. The long journey towards standards for engineering biosystems. EMBO Rep. 21, e50521. DOI: 10.15252/embr.202050521
8. Mishra, D. et al. 2021. An engineered protein-phosphorylation toggle network with implications for endogenous network discovery. Science 373, eaav0780. DOI: 10.1126/science.aav0780. https://science.sciencemag.org/content/373/6550/eaav0780
9. Gibson, D. G. et al. 2010. Creation of a bacterial cell controlled by a chemically synthesized genome. Science 329, 52–56. DOI: 10.1126/science.1190719. https://science.sciencemag.org/content/329/5987/52
10. Duhkinova, M., Crina, C., Weiss, R. & Siciliano, V. 2020. Engineering intracellular protein sensors in mammalian cells. J. Vis. Exp. 185, e60878. DOI: 10.3791/60878. https://pubmed.ncbi.nlm.nih.gov/32420982/
Rachel Brown, MSc, science writer/coordinator for Lab
Manager, can be reached at [email protected].
the big picture
The BIG Picture
TACKLING THE TOPICS THAT MATTER
MOST TO LAB MANAGERS
Whether creating a new lab facility from scratch or moving your existing lab to a new space, setting up a working laboratory is a complex process. Depending on the
equipment and other needs of your lab, it can take weeks
after the initial move-in day to get fully operational. In
this web series, we offer tips and solutions to some of the
main hurdles of leading a new lab setup.
The Big Picture is a digital series produced by the
Lab Manager editorial team. Each month, the series features
a collection of in-depth articles, expert insight, and helpful
resources, covering a specific industry, trend, or challenge.
To see the setting up a new lab and other Big Picture series,
please go to Lab Manager’s website at
www.labmanager.com/big-picture.
ASK THE EXPERT
COMPUTATIONAL PREDICTIONS
EMPOWER DRUG DISCOVERY
by Tanuja Koppal, PhD
Michelle Arkin, PhD, professor and chair of the Department of Pharmaceutical Chemistry and co-director of the Small Molecule Discovery Center at the University of California, San Francisco, talks to contributing editor Tanuja Koppal, PhD, about the growing applications of artificial intelligence (AI) and machine learning (ML) for automating chemistry, drug target optimization, systems-level modeling, and eventually for predicting if a drug is going to work in the patient. She discusses the vision of ATOM (Accelerating Therapeutics for Opportunities in Medicine), a public-private endeavor that she is working with, to transform drug discovery using computational tools.
Q: Can you share with us the
goals of the ATOM consortium?
A: The vision of the ATOM research
initiative is to use ML and AI to bring together data from public databases and from
pharmaceutical partners to perform multiparameter optimization on a drug target.
Another aspect of the ATOM pipeline is to
do automated experimentation. Nearly five
years ago, the pharmaceutical company
GlaxoSmithKline (GSK) and the national laboratories (Lawrence Livermore,
Oak Ridge, Argonne, and Brookhaven)
started re-envisioning drug discovery as
a computationally driven approach. They
realized that if we are going to do personalized medicine for a patient, we need
to do it much faster, with fewer resources
and a higher success rate. That’s where the
idea of ATOM and using computational
tools along with rapid experimental drug
discovery came from.
Our goal is to start with a drug target and
a set of molecules that impinge on that
target, along with a set of design criteria
for the drug. The AI/ML models use that
information to design new molecules in
silico and virtually assess whether they meet
those design criteria. This is done iteratively
until you get a set of compounds that fits the
criteria well. Laboratory automation then
enables automated synthesis and purification
of those compounds and testing in biological
assays of interest. The goal was to go from
an identified target to a drug worth testing
in animals in about a year. People used to
say that’s crazy, but now they are asking,
“what is it that you are doing differently
from what everyone else is trying to do?”
which shows how fast the field is moving.
Q: How do the experimental
and computational components
work together?
A: There are two kinds of computational
models. Parameter-level models measure
and predict experimental endpoints such
as hERG channel activity, MDCK permeability, and more. There is a lot of data
around those parameters that can be used
to develop AI/ML models. The long-term goal, however, is to use systems-level
computation, where models can predict
a “therapeutic index,” i.e., how safe and
effective a drug is based on its on-target
activity and toxicity, at predicted in vivo
concentrations of the drug. What we can
do right now is parameter-level modeling and some amount of systems-level modeling for pharmacokinetics. However, in the future we are looking to do mostly systems-level modeling. We are also using
transfer learning or matrix learning approaches to see how little data you need
to understand a target based on what you
already know about a related target.
There are two reasons why we do experiments alongside computation. One is
to make and test compounds to validate
predictions and then use the compounds
in “real” biology. The other goal is to
make and test compounds in a chemical
space where none exists. Data obtained
from the new molecules that are designed, made, and tested is fed back into
the model, which continually updates
itself and is self-correcting. We can do
this intense computational work because
we are working in collaboration with
the national laboratories who have the
biggest computers in the country. Human biology is very complex and drug
discovery is a hard problem to tackle. If
we crack biological problems using computational approaches, we can push our
computational capabilities forward.
Q: What has been your
biggest challenge so far?
A: We ran into two main challenges with
our first project. When we started with
data that was collected a long time ago or
over a long period of time, we found that
the new molecules that we designed were
consistent with the old dataset, but the
data itself could not always be reproduced.
Thus, we need ways to demonstrate that
the data sets are robust and experimentally reproducible. Secondly, it can take
several months to source and synthesize
compounds to test. With computational
design, you can have several different
scaffolds that are not related to each other
and making those compounds can take
time. Hence, we needed flexible, robust,
automated chemistry to support the
computational chemistry efforts. These
are both active areas of research.
Q: How is ATOM different
from other public-private
partnerships?
A: There are a few things that make
ATOM different. One is the integration of computational and experimental
data, and the other is the systems-based
modeling. Most companies are working only on parts of the puzzle, such as
finding hits against a particular target
or improving the pharmacokinetics
or therapeutic index of a molecule.
Big companies do most of the work
internally and small companies take on
focused aspects with a vision of doing
more. What it’s going to take is people
sharing data and models, and groups
becoming comfortable with finding
ways to do that. One basic approach is
data sharing with an honest broker who
maintains that data and creates a model
using all the data. Alternatively, each
organization can make models based on
its own data, and the models themselves
can be shared and “federated.” Another
differentiation is that ATOM products
are all open science. The goal is to put
all the models and data in the public domain so people can use the models and
continuously improve them. We intend
to publish all the datasets, and be open
about describing what we are learning,
what works and what doesn’t, and developing best practices. We have more of
an educational and sharing approach.
Q: What are some of the trends
that will likely help improve AI-driven drug discovery?
A: People are developing automated
ways to design chemical synthetic
routes and optimize chemical reactions. Then there is parallel, automated chemistry; the slow step in
automated chemistry is always the
purification. We are also interested
in selecting the initial inputs to the
chemical optimization. DNA encoded
libraries could be an amazing way to
seed our initial design loop. These
libraries include billions of molecules,
and compounds are screened for binding to the target of interest. Machine
learning can use a lot of the screening
data that was previously thrown out
due to its size and noisiness. We can
use this data to design and predict better molecules that can then be tested.
DNA encoded library technology is
rapidly changing because of open-source collaboration with companies.
Crowdsourcing the information helps
advance the field. So, in a way, you are
democratizing DNA encoded library
screening and drug discovery using
computational approaches.
I am excited about AI for academic
drug discovery and chemical biology
(that is, using the compounds as tools
to explore biology). Drug discovery
usually requires lengthy and costly
cycles of making compounds and
testing them. If computational models
in the ATOM pipeline can give us
compounds with much better properties with less chemistry, we can learn
much more biology and get closer to
discovering new drugs.
“Human biology
is very complex and
drug discovery is a hard
problem to tackle.”
Michelle Arkin is professor and
chair of Pharmaceutical Chemistry
at the University of California, San
Francisco, and member of the Joint
Research Committee for the ATOM
Research Initiative. Her lab develops chemical probes and drug leads
for novel targets, with a particular
interest in protein-protein interactions
and protein-degradation networks.
Michelle is co-director of the UCSF
Small Molecule Discovery Center,
president of the board of directors
(BOD) of the Academic Drug Discovery Consortium, member of the
BOD of the Society for Laboratory
Automation and Screening (SLAS),
and co-founder of Ambagon Therapeutics and Elgia Therapeutics. Prior
to UCSF, Michelle was the associate
director of Cell Biology at Sunesis
Pharmaceuticals, where she helped
discover protein-protein interaction
inhibitors for IL-2 and LFA-1 (lifitegrast, marketed by Novartis).
Tanuja Koppal, PhD, is a freelance science
writer and consultant based in New Jersey.
She can be reached at [email protected].
ELECTRONIC
LABORATORY NOTEBOOKS
product focus | electronic laboratory notebooks
SEMANTIC ENRICHMENT IS HELPING OVERCOME ISSUES
ASSOCIATED WITH VAST AMOUNTS OF UNUSABLE ELN DATA
by Aimee O’Driscoll
Prior to the age of digitization, the norm
in the lab was to scribble information in a
notebook before storing said book along
with hundreds or even thousands of others in stacks
of boxes. With this traditional method of recording,
there were obvious issues. Even if the correct
notebook could be located among a sea of boxes,
there’s no guarantee that the recordings would be
legible. Plus, there was no practical way to compile
similar or related data from multiple books.
Enter electronic laboratory notebooks (ELNs). These
make the lives of lab personnel infinitely easier by
providing a tool to record, store, and analyze vast
amounts of data. While ELNs have obvious advantages,
they don’t offer the whole solution. As Gabrielle
Whittick, project leader and consultant, The Pistoia
Alliance, explains: “In theory, ELNs are easy to search
and archive, and users can link samples, experiments,
and results. In reality, this isn’t always the case.”
Whittick goes on to say that because of how
individual researchers work, the range of
nomenclature used, and the variance of structured
and unstructured data, search and retrieval doesn’t
always deliver the most accurate results.
But there are solutions to these problems underway.
Here, we examine the pros and cons of ELNs
more closely and reveal how semantic enrichment
is helping bridge the gap between a slew of
disorganized information and valuable, usable data.
ELNs and their advantages
and drawbacks
ELNs have revolutionized the way in which
laboratories operate. They allow users to input all
data associated with their work, including material
information, experiment equipment and conditions,
and, of course, results. As Whittick notes, “ELNs
are vital to how researchers work today, as a
digital solution to record and store data safely and
securely.” They are also increasingly useful as
collaborative tools, enabling researchers to share
knowledge across organizations and with partners.
Whittick reveals that the early focus of ELNs was to
improve data capture by facilitating the transition from
paper-based notes to digital inputs. Even with this
component, there have been some issues. “Bespoke
ELNs tailored to lab workflows are most useful, but
‘out of the box’ ELNs may not fit how a researcher
works, which limits the benefits,” says Whittick. She
also notes that if an ELN is not platform-agnostic, a
researcher needs to be based in a lab to use it, and can’t
utilize it from home or on the move.
“ELNs are vital to how
researchers work today, as a
digital solution to record and
store data safely and securely.”
To overcome these issues and facilitate the changing
way in which personnel are working, remote
and mobile access to ELNs is necessary. Indeed,
Whittick notes that digital-native researchers
entering the lab in the early days of their career
expect digital solutions to be accessible.
While most of these challenges are readily solved,
recording, storing, and accessing data is only part of
the solution. There is also the issue of the usability
of the data being accessed. With vast amounts
of data input into ELNs, there can be challenges
in compiling and sorting information such that
researchers can easily locate and retrieve the data
points they require. “Some captured experimental
data are therefore locked in ELNs, and rendered
unusable and unsearchable. This results in
duplicated experiments and time spent tracking down and
wrangling data,” explains Whittick.
Another problem arises with non-compatible ELNs. For
example, partner organizations may use different ELN
systems, which can actually end up creating more work for
both parties. A large potential benefit of ELNs is the ability
to collaborate, but this is stifled by issues of inefficient data
extraction and system incompatibility.
How semantic enrichment of ELN
data can help
The Pistoia Alliance is currently working on a large-scale
initiative that is set to overcome many of the challenges
faced by ELN users, dubbed the Semantic Enrichment of
ELN Data (SEED) project. Whittick reveals that semantic
enrichment of data includes enriching free text in ELNs with
metadata for every relevant term from agreed ontologies. “It
also uses dedicated ontologies for improved data management,
incorporating additional data like attributes, mappings, and
annotations,” she explains. “This creates relationships between
ontology classes to help to describe and define them.”
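As a rough illustration of what enrichment adds, the sketch below tags free text with ontology metadata. The mini-ontology, its "BAO:EXAMPLE" identifiers, and the function name are hypothetical placeholders, not actual BioAssay Ontology terms or any SEED deliverable:

```python
import re

# Hypothetical mini-ontology: term -> (identifier, ontology class).
# Real enrichment would draw on agreed ontologies such as BAO.
ONTOLOGY = {
    "permeability": ("BAO:EXAMPLE_0001", "ADME assay"),
    "clearance":    ("BAO:EXAMPLE_0002", "ADME assay"),
    "hepatocyte":   ("BAO:EXAMPLE_0003", "test system"),
}

def enrich(free_text):
    """Attach ontology metadata to recognized terms in ELN free text."""
    annotations = []
    for term, (term_id, term_class) in ONTOLOGY.items():
        for match in re.finditer(term, free_text, re.IGNORECASE):
            annotations.append({
                "term": term,
                "id": term_id,
                "class": term_class,
                "span": match.span(),  # character positions in the entry
            })
    return {"text": free_text, "annotations": annotations}

entry = enrich("Measured clearance in a human hepatocyte suspension.")
print(entry["annotations"])
```

Once every entry carries identifiers like these, search and retrieval no longer depend on each researcher's personal nomenclature.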
The alliance brings together more than a dozen organizations, spanning large pharma companies, technology providers, and academia, to contribute to the project. These include
AstraZeneca, Bayer, Biogen, Bristol Myers Squibb, CDD,
Elsevier, GSK, Linguamatics, Merck, Pfizer, Sanofi, SciBite,
University of Southampton, and Takeda. The first phase of
the project involved the development of new standard assay
ontologies for ADME (absorption, distribution, metabolism,
and excretion), PD (pharmacodynamics), and drug safety,
because there was a gap in existing ontologies. “These have
now been added to BioAssay Ontology (BAO) and are freely
available,” Whittick notes. “As a cross-pharma project team, we
built new standards and added them to the key ontology BAO,
and then used this in the semantic enrichment process.”
The next phase of the SEED project is underway and aims to
continue to make ELN data more searchable and usable. With
metadata assigned to each relevant term, data can become readily
accessible for future analysis. The aim is to develop a set of
standards for ELN data structure across the pharma and life science
industries. Among these, there is advocacy for the alignment with
FAIR principles (findability, accessibility, interoperability, and
reusability) as published in Scientific Data in 2016.
ELNs are incredibly useful tools in today’s laboratories, but
there are barriers to utilizing them to their full potential.
Semantic enrichment is paving the way for users to be able
to more efficiently extract data and enhance collaboration
opportunities. As Whittick puts it: “In short, semantic
enrichment unlocks the value of scientific data currently
‘trapped’ in ELNs.”
Aimee O’Driscoll, BSc, MBA, has a decade of experience as a
development chemist and is a seasoned science writer. She can be reached
at [email protected].
FOR ADDITIONAL RESOURCES ON ELECTRONIC LABORATORY NOTEBOOKS, INCLUDING USEFUL ARTICLES AND A LIST OF
MANUFACTURERS, VISIT WWW.LABMANAGER.COM/ELN
LAB MONITORING SYSTEMS
product focus | lab monitoring systems
SCALABLE LABORATORY MONITORING SYSTEMS CAN
HELP MINIMIZE RISKS—BUT COME WITH CHALLENGES
by Aimee O’Driscoll
In the past, laboratory personnel had to simply trust
that everything would run smoothly in their absence.
Inevitably, things would go wrong, and detectable issues
would be dealt with upon arrival at the lab. This could
result in lost time and wasted resources. There was also
the element of the unknown, as without monitoring you
wouldn’t know, for example, how temperatures or humidity
levels may have fluctuated overnight. Such changes could
lead to sample degradation and unreliable results.
Thankfully, modern laboratories have a wealth of options
when it comes to monitoring systems. Scientists can remotely
monitor experiments, receive real-time notifications, monitor
the performance of instruments and equipment, and more. As
Salvatore Savo, PhD, co-founder, TetraScience, notes, “The
ultimate goal of a lab monitoring system is to provide peace
of mind to lab operators who need them to keep samples safe
and improve operational efficiency.”
Monitoring has now become the norm and helps with not only
ensuring the integrity of samples and products, but also with
remaining compliant with industry standards and regulations.
With huge corporations utilizing laboratory monitoring
systems, this raises the question of how scalable these processes
are. We look at the importance of lab monitoring systems and
the challenges faced in implementing scalable systems.
Scalable lab monitoring systems have
many benefits to offer
Laboratory monitoring systems can have a large impact
on the integrity of work completed and often represent
huge cost savings for organizations. For example, Savo
explains, “By using a remote monitoring system, life
science organizations can prevent significant material
and financial losses that can have a serious impact on
their ability to take a drug to market.”
Joe LaPorte, director of Cold Chain, Projects, and
Regulatory, PHC Corporation, notes that “the largest
value comes from being able to determine when your
critical products fall outside their measured parameters
when nobody is around to witness it.”
There is also the compliance component. Lab monitoring
systems are often vital to proving the integrity of
research, development, and production processes.
“The best systems have audit trail capability to meet
compliance requirements and provide a method for
analysis to help implement best practices,” says LaPorte.
Implementation of scalable systems
presents challenges
While small-scale systems are broadly implemented,
as an increasing number of parameters are measured
across facilities, there is demand for organizations to
implement large-scale solutions. Aside from enabling
comprehensive monitoring, they need to deliver other
features such as synchronization of data, customizable
alerts, and maintenance tracking.
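As a toy illustration of what "customizable alerts" amount to in practice, the sketch below checks the latest readings against user-defined ranges. The rule fields, sensor names, and thresholds are invented for the example and do not reflect any vendor's product:

```python
from dataclasses import dataclass

@dataclass
class AlertRule:
    """One customizable alert: a sensor, its allowed range, and a contact."""
    sensor_id: str
    low: float
    high: float
    notify: str   # e.g., an email address or SMS number

def check(readings, rules):
    """Return alert messages for readings outside their allowed ranges.

    readings: {sensor_id: latest value}; rules: list of AlertRule.
    """
    alerts = []
    for rule in rules:
        value = readings.get(rule.sensor_id)
        if value is not None and not (rule.low <= value <= rule.high):
            alerts.append(f"{rule.sensor_id}={value} outside "
                          f"[{rule.low}, {rule.high}]; notify {rule.notify}")
    return alerts

rules = [AlertRule("freezer_1_temp_C", low=-86.0, high=-70.0, notify="oncall@lab")]
print(check({"freezer_1_temp_C": -62.5}, rules))
```

A production system layers the harder parts on top of this loop: buffering data through connectivity dropouts, keeping audit trails, and escalating when an alert goes unacknowledged.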
These systems are available, but there are some roadblocks.
One of the biggest issues is the large amount of data that must
be transferred and stored. Savo notes that larger systems must
support a nearly unlimited number of data streams. There’s
also the issue of compatibility as so many different types of
equipment and systems must be integrated. To overcome
these challenges, providers offer cloud-based systems that are
instrument agnostic and have superior integration capabilities.
LaPorte discusses reliability issues inherent in some
systems, including dropped signals due to electrical
interference or loss of internet connection, as well as
human error problems such as failure to change or charge
batteries in Wi-Fi systems. These issues are trickier to
fix, but there is an understanding that while no system is
infallible, the benefits certainly outweigh the risks.
Aimee O’Driscoll, BSc, MBA, has a decade of experience as a
development chemist and is a seasoned science writer. She can be
reached at [email protected].
FOR ADDITIONAL RESOURCES ON LAB MONITORING SYSTEMS, INCLUDING USEFUL ARTICLES AND A LIST OF
MANUFACTURERS, VISIT WWW.LABMANAGER.COM/LAB-MONITORING
product in action
Labconco Axiom Type C1 Biosafety Cabinet
SAFE, FLEXIBLE, INTELLIGENT. THE MODERN BSC.
The Axiom Type C1 BSC opens the door of possibilities for all biosafety containment needs you expect from a
modern BSC. Inherent to the Axiom's design is Omni-Flex™, a convertible design with two exhaust modes. The
recirculating Type A mode or fully exhausted Type B mode will address your changing needs and reduce the cost
of operation in the process. The Chem-Zone™ worksurface clearly delineates a total exhaust area for safe chemical
handling within the BSC. Powered by MyLogic™ OS with Constant Airflow Profile™ Technology, the intelligent
Axiom keeps you safe and informed so you can focus on your work.
SAFETY FIRST

The innovative features of the Axiom C1 were designed with safety in mind. In fact, the Axiom C1 is safer than other BSCs—and with its unique features you can be assured that personnel and product protection remain the number-one priority. HEPA supply and exhaust filters are at least 99.99 percent efficient (at 0.3 microns in size), providing a safe environment for both you and your precious samples.

FLEXIBILITY LIKE NO OTHER

The unique design of the Axiom C1 allows for conversion between two different modes of operation—something typically requiring two completely separate BSCs. This one-of-a-kind biosafety cabinet can operate in A mode with recirculating airflow, then switch to B mode for 100 percent venting when your laboratory requirements change. The easy conversion negates the need for another BSC when your work changes.
INTELLIGENT CONTROLS
A bright LCD screen displays MyLogic OS, an easy-to-use interface
that displays filter life, alerts, and alarms right at eye-level within the
cabinet. Behind the scenes, the Axiom’s control system utilizes CAP™
Technology to continuously monitor and adjust enclosure airflow to
keep users safe as conditions change.
To learn more, visit: www.labconco.com/axiom
NEXT-GENERATION
SEQUENCING
product focus | next-generation sequencing
USING NGS TECHNOLOGIES TO UNDERSTAND THE IMPACT
OF THE MICROBIOME ON HEALTH
by Andy Tay, PhD
Technologies for DNA and RNA sequencing are crucial to unlock secrets held within
are crucial to unlock secrets held within
our genome and transcripts that can be used
to better understand biology and impact medicine.
Sanger sequencing has traditionally been used to
sequence oligonucleotide strands one at a time by
capillary electrophoresis. While this method is still
considered the gold standard for analyzing small
numbers of gene targets and samples, next-generation
sequencing (NGS) is quickly overtaking it.
NGS, also known as massively parallel
sequencing, enables the interrogation of larger
numbers of genes with high throughput and low
cost per run. Continual improvements have also
enabled NGS to capture a broader spectrum of
mutations and be more accurate in detecting
variants at low allele frequencies without bias.
NGS workflow
There are a few main companies providing
instruments for NGS, but their workflow is similar.
The first step is library construction where DNA
or complementary DNA (cDNA) generated from RNA is fragmented and ligated to adaptors that have a
unique molecular “barcode.” With this barcode,
each fragment is uniquely labeled, and multiple
samples can be pooled together to save time during
sequencing. The second step is clonal amplification,
where more copies of each DNA fragment are
amplified via polymerase chain reaction. The third
step is sequencing, where most instruments use
optical signals to assess nucleotide incorporation
during DNA synthesis. The final step is analysis.
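To show how the barcodes from library construction pay off at the analysis step, here is a minimal demultiplexing sketch. The barcode sequences, sample names, and fixed 8-base layout are invented for illustration; real pipelines typically also tolerate a mismatch or two in the barcode:

```python
from collections import defaultdict

# Hypothetical 8-base sample barcodes attached during library construction
BARCODES = {"ACGTACGT": "sample_A", "TGCATGCA": "sample_B"}

def demultiplex(reads):
    """Assign pooled reads back to their samples by leading barcode.

    reads: iterable of sequence strings whose first 8 bases are the barcode.
    Returns {sample_name: [insert sequences]}; unmatched reads are dropped.
    """
    by_sample = defaultdict(list)
    for read in reads:
        tag, insert = read[:8], read[8:]
        sample = BARCODES.get(tag)
        if sample is not None:
            by_sample[sample].append(insert)
    return dict(by_sample)

pooled = ["ACGTACGTTTAGGCCA", "TGCATGCAGGATCCTT", "ACGTACGTCCGGAATT"]
print(demultiplex(pooled))
# {'sample_A': ['TTAGGCCA', 'CCGGAATT'], 'sample_B': ['GGATCCTT']}
```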
NGS to boost microbiome research
The microbiome describes the interactions of
microorganisms including bacteria, fungi, and viruses
with their ecological environment, such as a human
body. Such interactions can be further broken down
into commensal, symbiotic, and pathogenic.
A variety of bacteria reside in our guts and influence
our health, such as through adsorption and secretion
of metabolites. NGS has been useful for taxonomical
identification and classification of the microbiome in
different parts of our bodies, including the intestinal
and respiratory tracts. Kasai and colleagues studied
the composition of gut microbiota in Japanese
subjects using NGS technology and found significant differences in bacterial species between obese and non-obese individuals. For instance, there was higher bacterial species diversity in obese individuals.
NGS technologies have also facilitated an extension
of such research to evaluate the effects of probiotics
to modulate perturbed microbiota. Suez and coworkers found that probiotic supplementation might be ineffective, as resident gut bacteria can resist the mucosal presence of probiotic strains. Probiotic supplementation was only useful in
a subset of individuals, suggesting that this is
likely not a universal approach to influence gut
microbiota composition and that personalized
probiotics development should be further studied.
Antibiotics can also significantly alter the microbiome
and play a role in disease onset and progression. Using
NGS technologies, scientists have experimentally
identified bacteria species that are affected by antibiotics
and how their absence correlates with changes in
immunity and disease susceptibility. Specifically,
researchers have found that the use of antibiotics can have a negative impact on gut microbiota by reducing species diversity (causing loss of bacterial ligands recognized by host immune cells), changing metabolic activities, and selecting for antibiotic-resistant organisms that can cause recurrent Clostridioides difficile infections.
Andy Tay, PhD, is a freelance science writer based in Singapore.
FOR ADDITIONAL RESOURCES ON NEXT-GENERATION SEQUENCING, INCLUDING USEFUL ARTICLES AND A LIST OF
MANUFACTURERS, VISIT WWW.LABMANAGER.COM/NGS
how it works
Cryogenic Preservation without Liquid Nitrogen
AIR PHASE CRYOGENIC STORAGE CAN ELIMINATE THE NEED FOR COSTLY, HAZARDOUS LIQUID NITROGEN
Q: Cryogenic preservation of samples and cells can be costly, dangerous, and cumbersome.

A: The handling of LN2 tanks can be challenging and bog down laboratory workflow. Ensuring that your inventory has appropriate temperature preservation at the cryogenic level can be critical. While it does indeed offer excellent long-term storage capability, liquid nitrogen can be dangerous to work with and requires additional investment.
Safe and economical air phase cryogenic storage without
liquid nitrogen
Consider the benefits of mechanical cryopreservation. The PHCbi brand MDF-C2156VANCPA offers tight temperature uniformity at -150°C, +/- 5°C, and can eliminate consumption of
LN2. This small footprint cryopreservation model can accommodate up to 165 two-inch boxes
and leverages standard 220V power requirements. Using this air phase storage system lowers total costs of ownership through reduced energy usage, lowered reliance upon LN2, and
decreased safety concerns.
To learn more, visit: https://www.phchd.com/us/biomedical/preservation/ultra-low-freezers/mdf-c2156vanc
PCR
product focus | pcr
LEVERAGING DIGITAL PCR TO IMPROVE DETECTION OF VIRAL
PATHOGENS IN WASTEWATER
by Brandoch Cook, PhD
Although qPCR sounds quantitative when you
say it (it’s in the name!) and looks quantitative
when you analyze the results, its potential
for accuracy, sensitivity, and dynamic range lags
well behind that of digital PCR (dPCR). There are
several limitations to any traditional PCR platform,
either in endpoint or real-time assays. One example is
that exponential amplification doubles copy numbers with each cycle, limiting resolution to two-fold increments. Another is that target gene expression must be derived by comparison to a standard curve generated by amplifying a reference gene. Two-fold measurements offer a disconnect from reality, and normalizations to housekeeping reference points can assume incorrect kinetics of amplification. Therefore, qPCR is an inadequate technique to achieve some high-sensitivity objectives, such as measuring copy number
variants of disease biomarkers, or identifying rare point
mutations against an overwhelming background of wild-type allelic expression.
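For context, the textbook relative-quantification arithmetic (standard ΔΔCt bookkeeping, stated here as a general reminder rather than anything specific to this article) makes that two-fold granularity explicit. With per-cycle amplification efficiency E:

```latex
N_{C_t} = N_0\,(1+E)^{C_t}, \qquad 0 < E \le 1,
```

and relative expression is reported as

```latex
\text{fold change} = (1+E)^{-\Delta\Delta C_t} \approx 2^{-\Delta\Delta C_t},
\qquad
\Delta\Delta C_t =
\left(C_t^{\mathrm{target}} - C_t^{\mathrm{ref}}\right)_{\mathrm{treated}}
- \left(C_t^{\mathrm{target}} - C_t^{\mathrm{ref}}\right)_{\mathrm{control}},
```

so every estimate is quantized by the assumed doubling and leans on the reference gene behaving as expected.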
Alternatively, dPCR provides a substantial upgrade
by detecting single molecules, quantifying their
abundance, and obviating the need for generation
of standard curves, which can consume excess time
and reagents and introduce amplification biases.
With increased sensitivity and accuracy, dPCR can
additionally be applied to cataloguing water-borne
microbial pathogens refractory to traditional methods,
and therefore accelerate and validate crucial use and
recycling decisions.
Digital PCR: Enhanced sensitivity and
accuracy through partitioning
Work by investigators, including Kary Mullis at
Cetus Corporation, in the early days of PCR enabled
detection and amplification of a single copy of the
β-globin gene. Their approach was enhanced by
Pamela Sykes and colleagues to allow copy number
quantification, in a procedure first referred to as
digital PCR by Bert Vogelstein and Kenneth Kinzler.
The critical step Vogelstein and Kinzler contributed
was using oil emulsion to maximize separation
of individual PCRs into minimized volumes. By
partitioning identical reactions into thousands of
tiny individual microreactors, one can treat them
simultaneously as a population, with a demographic
distribution of on/off (or digital) signals corresponding
to starting material with one or zero target molecules.
Signals in this case consist of emitted fluorescence
via accumulation of a TaqMan-style probe. Although
each individual readout is analogous to what emerges
from a larger single-well qPCR, dPCR has an endpoint
readout, instead of a contemporaneous threshold cycle
that varies between samples.
To generate quantitative data, dPCR employs
several assumptions and statistical conditions: 1)
each partition begins with an equivalent, random
probability of containing target molecules; 2) all
partitions have the same volume; and 3) one can
therefore use binomial probability and Poisson
distribution to extrapolate absolute numbers of
independent “events” occurring at a constant rate
during a fixed period. Because of these relationships,
there is an optimal partition occupancy rate, λ,
that drives considerations, including partition number and
volume that can impact the dynamic range and accuracy of
dPCR. For example, the Wilson method of direct calculation
incorporates the probability that a partition is empty, the
total number of partitions, and a confidence interval of 95
percent. In this algorithm, a λ of 1.6 is optimal for an assay
with 10,000 partitions and corresponds to about 20 percent
vacancy. Optimized thusly, dPCR can detect variations
within a linear range of less than 30 percent, an obvious
improvement over the classic two-fold limitation. However,
deviations toward much lower or higher occupancy can skew
accuracy, and bracketing an assay’s median intrinsic dynamic
range promotes the highest fidelity. One way to ensure this
is to develop dPCR partitioning strategies that allow for
different volumetric ranges across subsets of partitions, with
larger volumes promoting sensitivity, smaller ones enabling
optimal detection limits, and medium volumes for precision.
Finally, the incorporation of microfluidic devices into dPCR
workflows adds an aspect of massively parallel throughput
that can diversify analytic potential and improve accuracy
by reducing pressures on expense and reagent use to allow
unfettered reiteration of technical replicates.
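To make the occupancy arithmetic concrete, here is a minimal sketch of Poisson-based dPCR quantification, including a Wilson score interval on the vacancy rate as described above. The function name, the example counts, and the per-partition volume are illustrative assumptions, not values from any particular instrument:

```python
import math

def dpcr_concentration(total, negative, volume_nl=0.85, z=1.96):
    """Estimate target concentration (copies/nL) from digital PCR counts.

    total: number of analyzable partitions
    negative: partitions with no amplification signal
    volume_nl: assumed per-partition volume in nanoliters (condition 2 above)
    z: normal quantile for the confidence level (1.96 ~ 95 percent)
    """
    p_empty = negative / total              # observed vacancy rate
    lam = -math.log(p_empty)                # Poisson mean copies per partition
    # Wilson score interval on the vacancy proportion
    denom = 1 + z**2 / total
    center = (p_empty + z**2 / (2 * total)) / denom
    half = (z / denom) * math.sqrt(
        p_empty * (1 - p_empty) / total + z**2 / (4 * total**2))
    lam_hi = -math.log(max(center - half, 1e-12))   # fewer empties, more copies
    lam_lo = -math.log(min(center + half, 1 - 1e-12))
    return lam / volume_nl, lam_lo / volume_nl, lam_hi / volume_nl

# About 20 percent vacancy corresponds to lambda = 1.6, matching the
# 10,000-partition example above: e**-1.6 is roughly 0.202.
est, lo, hi = dpcr_concentration(total=10_000, negative=2_019)
print(f"{est:.2f} copies/nL (95% CI {lo:.2f}-{hi:.2f})")
```

Note how the interval tightens with partition count: the same vacancy rate observed over more partitions pins λ down more precisely, which is one reason massively parallel microfluidic formats improve accuracy.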
“While bacterial pathogens grab many
of the headlines, waterborne viruses
can be silent killers because they often
occupy wastewater at levels below
facilitative thresholds of detection.”
Leveraging dPCR to characterize and solve
wastewater problems
In developing nations, there is often economic pressure to
mitigate water waste through reuse. There are analogous
pressures in historically privileged areas newly plagued
by drought or population influx to conserve household,
commercial, and municipal waters downstream of their initial
use. In both cases, there is a quandary over whether these
waters can be recycled safely, particularly for agricultural
purposes. However, policy makers have already implemented
many such programs, with active procedures and regulations
taking place well ahead of a detailed understanding of what’s
in the water before reusing it. While bacterial pathogens
grab many of the headlines, waterborne viruses can be
silent killers because they often occupy wastewater at levels
below facilitative thresholds of detection. Moreover, small
populations are often infected in a localized manner, with
houses, neighborhoods, and cruise ships all serving as
highly variable foci. However, the ubiquity of enteric viral
pathogens is almost prosaic in nature. Norovirus is the
most common source of viral acute gastroenteritis. Human
adenoviruses are omnipresent, non-seasonal, UV-resistant,
and can cause fatal infections in immunocompromised
people, but also respiratory, mucosal, and gastric issues in
otherwise healthy people.
Because qPCR is an insufficient platform to assess enteric
pathogens in wastewater, investigators have developed robust
epidemiological models to derive the statistical likelihood of
and quantity of their presence from overall infection rates,
and to predict whether ultrafiltration, membrane bioreactors,
and other treatments are sufficient before downstream reuse
(the developing consensus: they are not). Recently, dPCR has
begun to serve as empirical validation for these methods, and
subsequently to extend its own legitimacy in ever-improving
waves of sensitivity and accuracy. The incorporation of
array- and microfluidic-based droplets (ddPCR) has allowed
researchers to assess log removal values in ground water
downstream of agricultural runoff, graywater, blackwater,
and mixed wastewater for genogroup I and II noroviruses,
and for various adenoviruses including HAdV41, a common
diarrheal agent and bellwether for water treatment safety.
Commercial manufacture of plate-based dPCR instruments
continues to improve throughput, which will facilitate
making decisive and accurately informed policy changes that
can broadly impact human health.
Brandoch Cook, PhD, is a freelance scientific writer. He can be reached
at: [email protected].
FOR ADDITIONAL RESOURCES ON PCR, INCLUDING USEFUL ARTICLES AND A LIST OF MANUFACTURERS,
VISIT WWW.LABMANAGER.COM/PCR
innovations in: mass spectrometry
Innovations in Mass Spectrometry
IMPORTANT ADVANCES IN MASS
SPECTROMETRY LEADING UP TO THE
2021 ASMS CONFERENCE
by Damon Anderson, PhD, Scott D.
Hanton, PhD, and Rachel Muenz
As the 69th annual conference of the American Society for Mass Spectrometry (ASMS) quickly approaches, we examine recent key developments
and some of the most interesting applications poised to
impact the future of mass spectrometry.
RECENT ADVANCES
Many recent advancements in mass spectrometry
(MS) have centered on the study of proteomics. One
such example comes from the Max Planck Institute of
Biochemistry, where research group leader Jürgen Cox
and his team released a new version of the pioneering and widely used MaxQuant software platform for
analyzing and interpreting data produced from MS-based proteomics research. MaxQuant 2.0 includes “an
improved computational workflow for data-independent
acquisition (DIA) proteomics, called MaxDIA,” states a
recent press release.
The MaxDIA software enables researchers “to apply
algorithms to DDA [data-dependent acquisition] and
DIA data in the same way,” according to the institute.
The software accomplishes this by combining the use of
spectral libraries with machine learning algorithms. By
predicting peptide fragmentation and spectral intensities, more precise spectral libraries are created in silico,
which can then be applied to the data. Spectral libraries
are one of the limiting factors in the application of MS to
proteomics, so this innovation is helping to expand the
scale and scope of how MS can continue to contribute.
Use of this technology will enable researchers to compare data from DDA and DIA more easily, and harness
the power of both techniques to measure and analyze
thousands of proteins with greater breadth than before.
The Max Planck team is already working on further
enhancements for the new software. A paper on MaxDIA
was published in July in Nature Biotechnology.
Another technology gaining significant momentum,
primarily in industry but also in academic circles, is
MS-based multiple attribute monitoring (MAM). The
MAM concept was born from the need to monitor protein production and purification, and limit the inherent
heterogeneity of biologics, such as therapeutic antibodies,
and their impact on quality control. Although MAM is
not a new concept, advancements in MS instrumentation and data analysis solutions have made MS-based
MAM the preferred platform in biologics drug quality
control. Immediate applications include biopharmaceutical discovery and development, although the technique is gaining popularity in research groups studying
post-translational modifications and other potential
modifications of proteins.
“With ENABLE, we are starting with
a successful commercial design
capable of conducting diagnostics
from atmospheric pressure down
to ultra-high vacuum.”
MS instrument providers such as Thermo Fisher
Scientific, SCIEX, Waters, and others now offer complete MAM workflows that are built around their MS
technology platforms. The solutions are intended to
provide comprehensive characterization of proteins and
therapeutics by matching reagents and protocols with
instrument output and software analysis.
Another new area for MS development centers on the
concept of trapped ion mobility MS. The ion mobility mass spectrometry (IM-MS) method combines the
separation of ionized molecules based on their mobility
in a carrier buffer gas, with the high-accuracy resolving
power of mass spectrometry. This enables separation of
both mass and size/shape, which provides even greater
specificity for analyzing and understanding the complex
suite of proteins present in living cells. When combined
with chromatography and powerful analytical software,
the IM-MS technique offers a multi-dimensional approach toward resolving complex samples such as those
in proteomics studies.
Trapped ion mobility spectrometry is a modification that essentially traps the ions during ion mobility
separation, which allows for sequential fragmentation
over a series of timed millisecond scans. Combining
trapped ion mobility with a method termed parallel accumulation-serial fragmentation (PASEF), these
trapped ions can accumulate in parallel, and be released
sequentially. Rather than a quadrupole selecting a single
precursor ion for fragmentation, such as that of a typical
MS/MS experiment, sub-millisecond switching enables
the selection and fragmentation of multiple precursors in
a single 50 ms run.
Such performance can result in thousands of protein
identifications over a short run time using nanogram
amounts of material. These technological developments
have led to significant gains in sequencing speed without a decrease in sensitivity, ideally suited for complex,
high-throughput proteomics.
Significant advancements have been made recently
in trapped ion mobility MS. For instance, at the beginning of June, Bruker Daltonics launched new MS
technology that combines time-of-flight and trapped
ion mobility MS (timsTOF) with liquid chromatography and improved automation software. That
combination will allow for big steps forward in the
efficiency of epiproteomic and proteomic analyses in
labs—along with enabling the rapidly growing field of
single-cell proteomics.
An additional advancement impacting both the
proteomics and metabolomics spaces is spatially resolved MS. Development of multimodal imaging mass
spectrometry is helping researchers reveal more about
the workings of biological systems. Yet another area of
growth is high-throughput planar solid-phase extraction
coupled to high-resolution mass spectrometry, which
is helping streamline screening for antibiotics in foods,
among other applications.
APPLICATIONS
Researchers have recently been applying established
mass spectrometry techniques in many interesting ways.
In a study published in the journal Antiquity in August,
researchers used accelerator mass spectrometry to
analyze 26 human tooth and bone samples from Machu
Picchu, Peru, determining that the site is at least two decades older than textual sources indicate. The finding demonstrates the ability of MS innovations to impact a very wide range of important studies.
In another recent development, MS will have an
impact a little farther from Earth in the Southwest
Research Institute’s (SwRI’s) Environmental Analysis
of the Bounded Lunar Exosphere (ENABLE) project, a
three-year $2.18 million program funded by NASA that
was announced in July. The program aims to bring mass
spectrometry back to the moon, adapting a commercially available mass spectrometer to identify the composition of the lunar surface.
“The last mass spectrometer deployed to the lunar
surface was the Lunar Atmospheric Composition Experiment in December 1972 during the Apollo 17 mission,” says SwRI’s Edward Patrick, ENABLE principal
investigator, in a press release. “With ENABLE, we are
starting with a successful commercial design capable
of conducting diagnostics from atmospheric pressure
down to ultra-high vacuum.” This work demonstrates
continued innovation to adapt existing technology to
solve new problems.

Through the ENABLE project, SwRI is adapting a mass spectrometer to return the technology to useful operations on the lunar surface for the first time in half a century. This image of the Lunar Atmospheric Composition Experiment, deployed by Apollo 17 in 1972, was photographed from the lunar surface by astronaut Harrison Schmitt. Image courtesy of NASA/Schmitt/AS17-134-20499.
MS has also been an important tool in studies of
SARS-CoV-2, the virus responsible for the current COVID-19 pandemic. Recently, in three studies explored
in a Science research article in August, MS helped reveal
how the B.1.427/B.1.429 variant of concern evades the
human immune system. Another study published in Sustainability in July shows how MS could be combined with
machine learning to provide surveillance of airborne
pathogens during the current COVID-19 pandemic
and future ones. “Widespread deployment of such an
MS-based contagion surveillance could help identify hot
zones, create containment perimeters around them, and
assist in preventing the endemic-to-pandemic progression of contagious diseases,” the researchers write.
While these studies used more established MS techniques, there are a number of new technologies emerging around MS-based SARS-CoV-2 and virus testing.
As reported in December 2020, there are two MS-based
SARS-CoV-2 diagnostic tests that have received Emergency Use Authorization from the US Food and Drug
Administration. The MassARRAY SARS-CoV-2 Panel
commercialized by Agena Bioscience and the SARS-CoV-2 MALDI-TOF Assay from Ethos Laboratories
pair RT-PCR with MALDI-TOF MS to detect the
COVID-19-causing virus in samples collected at home
or at a point-of-care location. Many other MS techniques for virus testing are being developed and used at
the research level—certain to be an area of increasing
value moving forward.
“MS could be combined with machine
learning to provide surveillance of
airborne pathogens.”
TRENDS
Three general themes in MS that continue rapid and
important ongoing developments and offer promise
for the future involve MS imaging, single-cell proteomics, and remote monitoring. Further progress in
these areas will continue to bring new insights and
technologies into the hands of researchers to benefit
the general population. With that thought in mind, we
look forward to seeing more of the latest cutting-edge
developments impacting the MS industry at the ASMS
conference this year.
Damon Anderson, PhD, is the technology editor at LabX.com.
He can be reached at [email protected].
Scott D. Hanton, editorial director for Lab Manager, can be
reached at [email protected].
Rachel Muenz, senior digital content editor for Lab Manager,
can be reached at [email protected].
LM ONLINE
What’s new at LabManager.com?
Managing Laboratory Complexity and Data-Driven Operations
The ability of scientific organizations to react swiftly to changing
scientific, financial, and business conditions is of paramount importance in today’s rapidly evolving scientific landscape. The global
pandemic exemplified how rapidly and explosively conditions can
change. The ability to adapt quickly to such monumental shifts is
necessary and achievable by optimizing lab performance using new
digital technologies and expert analysis to enhance the visibility and
utilization of assets. This piece highlights a few key steps to achieve
operational agility. To read the full article, visit:
www.LabManager.com/data-driven-operations
Check out what else LabManager.com
has to offer—from the latest research and
industry news, to product resource guides,
educational webinars, exclusive online
content, and much more.
LabManager.com
More space
for your ideas.
Our innovations help simplify medical progress
so that better therapies will reach patients faster.
What will happen to your ideas?
1,000+ open positions worldwide.
Join our team and elevate your creativity.
www.sartorius.com/morespace