
Tuesday, June 11, 2019

By any measure I am in the latter stages of my career. I have been around a long time and have seen
quite a bit of positive change in the Industrial Hygiene profession. We are evolving from a “pump jockey”
mentality into a much more rigorous, scientifically based vocation. The refinement and enhancement of the
science of sampling statistics by Jerry Lynch, John Mulhausen and others, along
with the pioneering efforts to use Bayesian statistics by Paul Hewett and
others, are prime examples.

At the AIHA
conference in Minnesota last month I attended a presentation by Dr. Jerome Lavoue
on a freely available statistical analysis tool: Expostats. This effort was new to me and it looked like
a great tool. I sent a note to my
friend Tom Armstrong about it. He
responded that he was well aware of this tool and sent me a remarkable slide
deck on statistical analysis that he is working on.

Tom starts off with a 10,000-meter view of IH statistics and
rapidly zooms in to provide useful, spot-on guidance and details on the
current state of the science and where to get more information. Like I said, it is a remarkable set of 30 or
so slides, which Tom has agreed to allow me to send to you if you request
it: mjayjock@gmail.com.

Another highlight of the conference for me was a visit to
Dr. Susan Arnold’s laboratory at the University of Minnesota. It is a great lab with a lot of salient instruments
and a chamber that allows her and her students to conduct controlled exposure
studies. She also told me something
about her IH curriculum, which is very heavy in modeling and science. I am aware of a few other programs along
these lines but I was particularly happy to see Susan doing this. If I were a young person interested in a
top-notch program with the opportunity for hands-on research, I would consider
moving to Minnesota, weather notwithstanding.

Tuesday, April 23, 2019

I have written about Tom previously in this blog but his
latest contribution to the realm of exposure modeling is really quite extraordinary. It is 111 slides in a PDF file that contains the following gifts for anyone willing to view and study them:

Worked examples annotated with Tom’s wonderful
insight and guidance.

Clear explanation of ALL the basic elements of inhalation exposure
modeling.

Specific guidance on the importance of using thermodynamic activity coefficients (ACs).

Numerous references and links to get what you
need to actually do exposure modeling.

A wonderful annotated primer for IH MOD 2.

It is easily equivalent to a multi-day course on the general subject but presented in a way that is relatively easy to follow. This is especially true if you have some background in modeling or you are willing to delve into the AIHA modeling text, Mathematical Models for Estimating Occupational Exposure to Chemicals, 2nd Ed., as a companion resource.

Tom would also love to hear from you. He has been hospitalized twice recently
with a serious illness. The good news is
that he is on the mend and will be out of the hospital soon. The bad news is that he will miss this year’s
AIHA Conference because he needs to undergo further treatment for his
condition. We have been corresponding while he was in the hospital and I recently wrote to him:

"Tell your caretakers that you are a National Treasure and to get on with it!" He wrote back that this brought a smile to his face. The truth is that in the realm of IH he truly is a treasure.

Thursday, March 7, 2019

Dr. Joonas Koivisto and 16 others, including this writer,
have recently authored what I believe is a very important paper: Source
specific exposure and risk assessment for indoor aerosols. It sounds a bit like a paper focused on
aerosol assessment but it is actually a comprehensive look at inhalation
exposure models and the quality of these models for making decisions relative to
chemical regulation and risk assessment. The reality is that aerosols represent the
most challenging scenarios for modeling because of their added properties
compared to gases. If one can accurately
model aerosols, then gases are relatively simple to model.

The publication outlines the current state of the
science and available models. It also
makes a developing case for the use of first principle mathematical mass
balance models versus other types of models (knowledge-based models and
statistical models of exposure determinants), especially for regulatory
decisions such as those mandated by REACh.

The Europeans are much more advanced than the US in
the application of exposure models because they have to be. The REACh regulation requires a risk
assessment for literally thousands of chemicals, and a risk assessment requires
an exposure assessment. There is not nearly
enough measured exposure data available, so they have turned to models. It is clearly evident that the inputs to and databases for the
mathematical mass balance models have not been sufficiently developed, so the
European regulators have turned to knowledge-based and statistical models of
exposure determinants. These models are
more easily applied because the inputs are relatively simple. The paper implies that these models are not performing
up to the task and that there is a real need to develop the input data
necessary to feed the more competent first principle mathematical mass balance models.

The paper points to an earlier paper I did with Tom
Armstrong and Mike Taylor in which we challenged the mass balance two-zone Near-field/Far-field
(NF/FF) model against the Daubert legal criteria, which are widely used by the courts
to assess whether an expert witness's scientific testimony is methodologically
valid. In that paper we concluded that the
NF/FF model fulfills the Daubert criteria and that, when it is used within its stated
limitations, it adequately estimates exposure as applied to legal decisions. The implication is that the models currently
used for making decisions for REACh would, most likely, not pass the Daubert criteria, which require that these models:

1) Are applicable and have been tested.

2) Have been subjected to peer review and are generally
accepted.

3) Have a known and acceptable rate of error.

4) Have maintained standards and controls concerning
their operation.

What Dr. Koivisto and the other authors are asserting in
this paper is somewhat striking; namely, that the currently used REACh models need
to be explicitly challenged by the Daubert (or similar objective) criteria and, if found wanting,
better alternatives should be developed and employed. This would, most likely, result in something this writer has been advocating
for many years; specifically, comprehensive research and compilation of exposure
source databases.

This should be a straightforward, objective scientific
exercise; that is, a technically competent and empowered group of scientists
would set open and objective criteria and test the currently used, regulatory-sanctioned
models against those standards. The reality, as I see it, is that there are
strong vested interests and forces at work in this case that may resist this
sort of effort. Change is never easy
but, hopefully, scientific integrity, good judgment and established facts will
ultimately work to improve the public health, partisan politics notwithstanding.

Wednesday, December 12, 2018

Jeff Burton is a treasure to our profession. He wrote a piece on ventilation earlier this year and published it in the AIHA Synergist. I found it to be incredibly valuable. On the chance that you did not see it, I am reproducing part of it below with his permission. It is a trove of practical advice born from a lifetime of experience and a great resource for any practicing IH. One thing that Jeff did not mention, but that I think is important, is that much of this can be used for exposure modeling input. I am reproducing the first few paragraphs of the article below. If you are a member of AIHA, you can go to the online version in the Synergist to get it in all its glory at:

Six Ways to Approximate Airflow

Every occupational health and safety professional
must be able to evaluate the air the occupants of a space are experiencing to
assess the potential for IAQ problems and their solutions.

Most OHS professionals today are unable to conduct in-depth testing or measurement of HVAC
systems and their airflows. Specialized knowledge of testing, measurement, and
balancing is often required on the complex systems of today.
Industrial hygiene engineers or TAB (testing,
adjusting, and balancing) specialists can be employed to make detailed
measurements. However, an OHS professional can often
gather enough simple information to quickly provide approximate answers to questions about airflow in a space,
regardless of the complexity of the system.

This article
provides guidelines for simple testing, measurements, and approximations an
OHS professional might perform.
These include temperature and humidity; air movement
and distribution, outdoor airflow rates, and air exchange
rates in the occupied space; concentrations of carbon dioxide
in the air; and the effects of wind on the airflow through a building.

The following equipment is
needed to perform the simple tests and measurements described in this article: tape measure, thermometer, psychrometer, smoke tubes, and carbon dioxide monitor.

...
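As a taste of how far these simple measurements can take you, below is a minimal Python sketch of one standard approximation in this territory: estimating the air exchange rate of a space from the decay of carbon dioxide after the occupants leave. The readings and the scenario are hypothetical, and this is a generic tracer-decay calculation, not a reproduction of Jeff's article.

```python
import math

def air_changes_per_hour(c_start, c_end, c_outdoor, elapsed_hours):
    """Estimate the air exchange rate (ACH) from tracer-gas decay.

    Assumes a well-mixed space with no indoor CO2 sources during the
    measurement (e.g., after the occupants leave). Concentrations in ppm.
    """
    return math.log((c_start - c_outdoor) / (c_end - c_outdoor)) / elapsed_hours

# Hypothetical readings: 1200 ppm falling to 700 ppm over one hour,
# with 400 ppm outdoors.
print(f"{air_changes_per_hour(1200, 700, 400, 1.0):.2f} air changes per hour")
```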

The political will in the European Union to enact REACh was
and is extraordinary. The body politic in the EU wants this
regulation and certainly needs it to be effective. It should be clear that it
cannot be effective if the exposure assessment half of the risk equation used
for REACh is faulty. Underestimation of exposure and risk hurts people's health directly;
overestimation hurts people's well-being by unnecessarily hurting the
economy. The use of good modeling tools is critical or REACh, in my opinion,
will ultimately be doomed to fail.

I have always thought that first principle physical-chemical
models (FPModels) are superior to models that are not based on first principles
(NFPModels). Now a thoughtful and talented Danish
researcher (Dr. Antti Joonas Koivisto) is examining and demonstrating, with logic and DATA, exactly
why first principle models are better and, most likely, even necessary to make good regulatory
decisions.

An early question might be: Why develop NFPModels when FPModels are available for development? The easy and probably correct answer: they can be developed relatively quickly and
with less effort and expense. FPModels are available but need to be
parameterized for critical exposure scenarios, and that means research dollars.

NFPModels, for the most part, are based on dimensionless
factors to calculate scores, which are then converted to exposure values. They are conceptual models
that do not have to conform to first principles and are thus (using Joonas' word) somewhat vague.
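To make the score idea concrete, here is a purely hypothetical illustration in Python. It is not the algorithm of Stoffenmanager or any other real tool; every factor name and calibration value in it is invented for the example.

```python
def score_based_exposure(substance_factor, handling_factor, control_factor):
    """Illustrative NFP-style calculation: multiply dimensionless factors
    into a score, then map the score to a concentration through an
    empirical calibration. All names and numbers here are invented."""
    score = substance_factor * handling_factor * control_factor
    mg_per_m3_per_score_unit = 0.1   # hypothetical calibration constant
    return score * mg_per_m3_per_score_unit

# Hypothetical inputs: dusty powder (10), energetic handling (3), no LEV (1.0).
print(score_based_exposure(10, 3, 1.0), "mg/m^3")
```

Note that the multiplied factors carry no units; the physics lives entirely in the empirical calibration step, which is exactly why such models are hard to trace back to first principles.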

While there are other NFPModels, the big hitter in the EU for
modeling exposure via REACh appears to be Stoffenmanager® v.7.1, which as of last month:

· is reportedly validated by 15 scientific studies
based on more than 6,000 measurements.

· has more than 33,000 users, with 50 new users
per week.

· has been used to make over 200,000 regulatory decisions.

It is accepted by the Dutch Labour Inspectorate as a
validated method to evaluate exposure to hazardous substances in the
workplace. More important, the European
Commission officially recognizes Stoffenmanager as an instrument to comply with
the REACh regulation.

Other REACh-recommended NFPModels include:

ECETOC TRA

MEASE

EMKG-EXPO-TOOL

ART

Although somewhat varied in their approach, they all share the same feature: they are
all based on dimensionless factors to calculate scores, which
are then converted to exposure values. They are conceptual models that do not have to conform to
first principles (like the conservation of mass). Thus, they are not scientifically formalized, and that leaves them difficult
to explain.

Dr. Koivisto
asserts, and I agree, that there should be minimum requirements for regulatory
exposure models and that those criteria should be no less than the Daubert
criteria used in US courts for valid scientific testimony. The model criteria:

1) Is applicable and has been tested.

2) Has been subjected to peer review and is
generally accepted.

3) Has a known and acceptable rate of error.

4) Has maintained standards and
controls concerning its operation.

5) Is generally accepted in the relevant
scientific community.

Joonas
goes on to advise that FPModels are superior to the above NFPModels (what he
calls “imaginary” models) because:

• Mass flows are traceable, so the model can be used for environmental, occupational and consumer exposure assessment.

• There are no unit conversions.

• Error analysis can be made separately for emission source, emission controls, and
dispersion.

• There is no need for Tier levels; the Tier level depends
on available information.

• Possible “calibration” is straightforward (e.g., chamber tests).

• In the NF/FF model, the NF volume and air mixing are adjustable according to the source
(free parameterization).
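For readers who have not worked with the two-zone NF/FF model mentioned in that last bullet, here is a minimal Python sketch of its well-known steady-state solution; the emission rate and airflows below are hypothetical.

```python
def nf_ff_steady_state(G, Q, beta):
    """Steady-state concentrations for the two-zone NF/FF model.

    G    : contaminant emission rate (mg/min)
    Q    : room supply/exhaust airflow (m^3/min)
    beta : airflow exchanged between near field and far field (m^3/min)

    Returns (C_near_field, C_far_field) in mg/m^3.
    """
    c_ff = G / Q                # far field sees the whole room airflow
    c_nf = c_ff + G / beta      # near field adds the inter-zone term
    return c_nf, c_ff

# Hypothetical scenario: 100 mg/min source, 30 m^3/min room airflow,
# 5 m^3/min of air exchanged with the near field around the worker.
c_nf, c_ff = nf_ff_steady_state(100.0, 30.0, 5.0)
print(f"NF: {c_nf:.1f} mg/m^3, FF: {c_ff:.1f} mg/m^3")
```

Every quantity in the calculation is a physical mass flow or airflow, which is exactly the traceability Joonas is pointing to.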

Wednesday, September 12, 2018

I have not blogged for quite a while, primarily because in 125 blogs I pretty much exhausted what I wanted to say on various topics. Also, new ideas for blogs from the readers seemed to have dried up. I am, however, moved to post again by the wonderful work of Daniel Drolet and Tom Armstrong on the tool many of us know as IH Mod. For years, they have wanted to combine the power of these deterministic models with the new dimension of stochastic uncertainty modeling (e.g., Monte Carlo simulation). Daniel is a brilliant programmer and he made it happen! It is now available as IH Mod 2.0 and, as usual, it's a free download. Daniel and Tom and all the folks who worked on this have done so without pay for the benefit of the profession. Below is Tom's announcement. I remain open at mjayjock@gmail.com for ideas for future blogs. I do have another blog that will come out soon with goodies from Jeff Burton and the wonderful tools on ventilation he has recently provided to the profession.

IH Mod 2.0 includes the same mathematical models as the still available original IH Mod. IH Mod 2.0 gives the user the choice between running the models in deterministic (point value parameters) or in Monte Carlo simulation mode, with choices of distributions of parameter values. This is right in MS Excel with no other software needed. It requires a desktop install of MS Excel, for Windows or Apple computers. The currently posted version has English, French, Serbo-Croatian and Japanese language options. Spanish, German and Italian will be available soon.

A Support File for IH Mod 2.0 is also available. It includes useful information about IH Mod 2.0, and spreadsheet tabs to estimate liquid spill pool generation rates via the Hummel-Fehrenbacher equation, a units of measure conversion tool, examples of generation rate estimation, a "Bootstrap" procedure tool, a summary of approaches to estimate ALPHA for the exponentially decreasing emission rate models, and some links to other resources. The support file is evolving and will be updated periodically with new information. Check back at the EASC web page (URL above) for updates.
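To illustrate in general terms what the deterministic-versus-Monte Carlo choice buys you, here is a generic Python sketch of the simplest model in the set, the well-mixed room at steady state. This is not IH Mod's actual Excel/VBA code, and the parameter distributions and values are hypothetical.

```python
import random

def well_mixed_steady_state(G, Q):
    """Steady-state concentration (mg/m^3) of a well-mixed room:
    emission rate G (mg/min) over ventilation rate Q (m^3/min)."""
    return G / Q

# Deterministic mode: one point estimate.
print(f"point estimate: {well_mixed_steady_state(100.0, 30.0):.1f} mg/m^3")

# Monte Carlo mode: propagate uncertainty in the inputs.
results = []
for _ in range(10_000):
    G = random.lognormvariate(4.6, 0.3)   # emission rate, median ~100 mg/min
    Q = random.uniform(20.0, 40.0)        # ventilation, m^3/min
    results.append(well_mixed_steady_state(G, Q))

results.sort()
print(f"median: {results[len(results) // 2]:.1f} mg/m^3, "
      f"95th percentile: {results[int(0.95 * len(results))]:.1f} mg/m^3")
```

Instead of a single number, you get a distribution of plausible exposures, and it is the upper percentiles of that distribution that matter for protective decisions.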

Monday, September 26, 2016

I have gotten very few requests for blog topics since
issuing the offer some time ago. One
such request has come from Richard Quenneville who asks how one might model
aerosol or airborne particulate exposure.

Aerosols are certainly different from vapors or gases and
the differences significantly complicate any attempt to model their exposure. Even relatively small aerosol particles
(microns or tenths of microns) are much larger than the individual molecules
that make up a gas or vapor. This gives
them different properties at least in the following areas:

· They are typically more readily electrically charged,
especially if they are generated by sliding along a surface (e.g., dust from transporting powder in a
pneumatic tube). This charge can affect
the size distribution and sampling of the aerosol.

· With or without electrical charge, aerosol
particles are often susceptible to combining with one another in a mechanism
known as agglomeration. This process, of
course, changes the size distribution of the aerosol.

· Most important, because they have much more mass
than vapor molecules, they have a settling velocity which increases with
increasing particle size and this, again, constantly changes the airborne size
distribution of the aerosol with time.

· Because of their mass, airborne particles do NOT always make it into sampling orifices, thus biasing their measurement.

Assuming agglomeration is not happening in a time frame that
is relevant to the potential exposure, one can estimate any time-interval
concentration of any aerosol particle or size range of particles. This is done by taking the average settling
velocity of the particles in that size range and accounting for their loss from
settling. Typically this is done for
particles settling from 2 meters in height to the floor. If one is sure that the breathing zone
remains at, say, 2 meters high, you can calculate the concentration loss from the
horizontal volume at 2 meters height to, say, 1.8 meters. If you do this over small enough time
intervals, you can estimate a time-weighted average of aerosol concentration for
any time period, dependent on the nature of the aerosol source.
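A minimal sketch of that bookkeeping follows, assuming Stokes' law settling and a well-mixed layer with no ongoing generation; the particle size, density and height in the example are hypothetical.

```python
import math

def stokes_settling_velocity(d_um, rho_p=1000.0):
    """Terminal settling velocity (m/s) via Stokes' law.

    d_um  : particle diameter in micrometers
    rho_p : particle density in kg/m^3 (1000 = unit density)
    Reasonable for roughly 1-100 um particles in still air.
    """
    g = 9.81         # gravity, m/s^2
    mu = 1.81e-5     # dynamic viscosity of air, Pa*s
    d = d_um * 1e-6  # diameter in meters
    return rho_p * d ** 2 * g / (18.0 * mu)

def fraction_remaining(d_um, height_m, minutes):
    """Fraction of a well-mixed layer still airborne after settling,
    assuming first-order loss dC/dt = -(v/H)*C and no fresh generation."""
    v = stokes_settling_velocity(d_um)
    return math.exp(-v * minutes * 60.0 / height_m)

# Hypothetical: 10 um unit-density particles, 2 m breathing-zone height.
for t_min in (1, 5, 15, 30):
    print(f"{t_min:>2} min: {fraction_remaining(10.0, 2.0, t_min):.2f} "
          "of initial concentration remains")
```

Averaging those interval values gives the time-weighted average, and repeating the calculation for each particle size bin tracks the changing size distribution over time.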

This brings up another complication of dealing with
aerosols. Compared to vapors, predicting
the “release” or generation rate of particulate into the air is highly
problematic because it depends on many undefined or unmeasured factors such as
inter-particle forces. I have never been
able to use first-principle models to predict this rate. Instead, we have had
success experimentally determining this rate by simulating the mechanism of
generation, measuring the resultant concentrations and back-calculating the
rate of generation. I personally think
this is what needs to happen for the exposure assessment of nanoparticles
released to the air in various scenarios.
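In the simplest case, a steady-state, well-mixed test chamber, the back-calculation is just the one-box mass balance solved for the generation rate; a hypothetical sketch:

```python
def back_calculated_generation_rate(c_measured, Q):
    """Solve the steady-state one-box mass balance C = G/Q for G.

    c_measured : measured steady-state concentration (mg/m^3)
    Q          : chamber ventilation rate (m^3/min)
    Returns the implied generation rate G in mg/min.
    """
    return c_measured * Q

# Hypothetical chamber test: 2.5 mg/m^3 measured at 1.2 m^3/min airflow.
print(back_calculated_generation_rate(2.5, 1.2), "mg/min")
```

A real aerosol test would also need to account for settling and other losses in the chamber; ignoring them makes the recovered generation rate an underestimate.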

Please note, settling is dependent on the particle size
distribution of the generated aerosol. I
have seen situations in plants that were literally “particle fountains,” with
particle size distributions in which a significant portion of the particles were greater
than 100 microns. These particles hit
the floor in a time frame of seconds, which dramatically lowers the total
aerosol mass/volume. Particles on the
other end of the spectrum, e.g., nanoparticles,
are going to essentially remain airborne and not settle at an appreciable rate
in most scenarios.

Finally, aerosol, especially insoluble aerosol, will deposit
in the respiratory tract based on particle size.
At the current time we have some aerosol exposure limits specified in
terms of inhalable and respirable particulate. These are defined mathematically by the
ACGIH, and these algorithms can be applied to the concentrations in the
size intervals above to render the amount of aerosol that might be inhaled
(inhalable mass concentration) or be able to reach the deep pulmonary regions
of the lungs (respirable mass concentration).
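A sketch of those size-selective conventions as I understand them from ISO 7708 / ACGIH: the inhalable fraction follows an exponential form reaching about 50% at 100 µm, and the respirable fraction is the inhalable curve multiplied by the complement of a lognormal with median 4.25 µm and GSD 1.5. Treat the exact constants as something to verify against the current TLV booklet.

```python
import math

def inhalable_fraction(d_um):
    """ACGIH/ISO 7708 inhalable convention (valid for d <= 100 um)."""
    return 0.5 * (1.0 + math.exp(-0.06 * d_um))

def respirable_fraction(d_um):
    """ACGIH/ISO 7708 respirable convention: the inhalable curve times
    the complement of a lognormal CDF (median 4.25 um, GSD 1.5)."""
    z = math.log(d_um / 4.25) / math.log(1.5)
    lognormal_cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return inhalable_fraction(d_um) * (1.0 - lognormal_cdf)

# Weight each size bin's modeled concentration by these fractions to get
# inhalable and respirable mass concentrations.
for d in (1.0, 4.0, 10.0, 25.0):
    print(f"{d:>4} um: inhalable {inhalable_fraction(d):.2f}, "
          f"respirable {respirable_fraction(d):.2f}")
```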

The above analysis sounds daunting mathematically and indeed
it is not simple; however, it is nothing that an Excel spreadsheet cannot
handle with relative ease, given the proper input of scenario-specific
dimensions, generation rate, initial particle size distribution, particle size
interval-specific settling velocity and ACGIH algorithms. Like
all models it is not exact but I believe it is accurate enough to be useful.