Sunday, December 29, 2013

I'm reading an interesting book that deals with the subject of how science gets done, and how it is converted to societal impact (one of my favorite subjects): "The Idea Factory: Bell Labs and the Great Age of Innovation". I've also found one can gain interesting (sometimes provocative) insights on the same subject from the Nobel Banquet speeches of newly-minted Nobel Prize winners. This week I read Dr. Randy W. Schekman's Dec. 10 Nobel Banquet Speech. Dr. Schekman is a co-winner of this year's Nobel Prize in Physiology or Medicine. He delivered a short but thought-provoking speech on the role of government in "managing" science... Schekman quotes from Vannevar Bush's (Bush was the science adviser to Presidents Roosevelt and Truman) 1945 report, "Science, the Endless Frontier":

"Scientific progress on a broad front results from the free play of free intellects, working on subjects of their own choice, in the manner dictated by their curiosity for exploration of the unknown ... Freedom of inquiry must be preserved under any plan for government support of science."

Schekman then goes on to lament the modern tendency for governments to meddle with scientists' exercise of their curiosity and talents:

"... And yet we find a growing tendency for government to want to manage discovery with expansive so-called strategic science initiatives at the expense of the individual creative exercise we celebrate today. Louis Pasteur recognized this tension long before the trend towards managed science. He wrote, 'There does not exist a category of science to which one can give the name applied science. There are sciences and the application of science, bound together as the fruit of the tree which bears it.'"

With all due respect to Schekman, one is left with the impression he believes the more appropriate role of government is simply to spread funding around to a "Priesthood of Scientists". The Priesthood, snug in their laboratory sanctuaries and safe from the buffeting of current human and environmental realities, would deliver a continuing cornucopia of discoveries that would somehow solve society's and the planet's most pressing needs. I guess, in Schekman's mathematics:

Funding + Faith – Oversight = Useful Solutions

REALLY?

Schekman continues by citing Louis Pasteur as an example of someone who recognized the evils of "managed science". While it is certainly true one can identify a plethora of examples in which the results of basic research yield unexpected impacts, the elapsed time between the research and the impact varies wildly. (Brings to mind the old story about the blind hog underneath the acorn tree...)

But a different view was offered back in 1997 by Donald Stokes, in his book, "Pasteur's Quadrant: Basic Science and Technological Innovation". Stokes was himself no lightweight. For eighteen years he was the Dean of Princeton University's Woodrow Wilson School of Public and International Affairs. Among other things, he was a fellow of the American Academy of Arts and Sciences, the National Academy of Public Administration, and the American Association for the Advancement of Science.

Contrary to Schekman's view, Stokes, whose analysis of the interplay between unbridled scientific research and federal public policy spans the period from the late 1800s through the late 1990s, concludes that federal support should be focused on "use-inspired basic research" – research that is related to and focused on delivery of impact and results relevant to today's pressing challenges. (Italicized words are mine, not Stokes'. Read his book for the details...)

There are, of course, different perspectives on the relationship between "discovery science" and "applied research", and their justifications based on "delivered solutions" and "societal impact". And then there's the question of the appropriate roles of the public vs. private sector, and the individual (or "lone wolf") researcher vs. large research organizations. All of this, and much more, can and should influence public policy relative to, and federal funding of, the scientific enterprise.

With all due respect to Dr. Schekman, I lean heavily toward Stokes' view. From my vantage point scientific research (especially in the U.S.) suffers from multiple unhealthy realities and dissonant voices:

A weakening of society's belief in absolute truth and the value of seeking it. The unavoidable result of the weakening of belief in absolute truth is a devaluation of the search for it – a reduction of support for research and pursuit of knowledge. Think about it...

An entitlement mentality on the part of many in the scientific research business. Schekman's comments (to me) hint at this attitude. You can see it manifested in many quarters. One that comes to mind is the aggressive position taken by SOME in the global climate change research community: that we should pour enormous amounts of funding into the research agenda of the global climate change community without regard to requirements for true verification and validation of methods and models against real-world data. (But that's a subject for a future blog.)

Shrinking federal "discretionary" budgets. Scientific research comes after paying the federal debt, entitlement programs, and national defense. (Who can argue with that?)

A distortion of Stokes' definition of "use-inspired basic research" by leaders in the federal research establishment. It is very difficult to reconcile an objective reading of Stokes' definition of use-inspired basic research with some elements of the federal R&D portfolio for the past decade or two. Pasteur wasn't playing around in a sandbox with blind faith that a cascade of useful solutions to pressing problems would somehow magically emerge. He was focused on lines of research relevant to his chosen problem. Things have begun to improve a bit with regard to federal R&D investments over the past few years, but I'm confident an objective review of the federal R&D portfolio would bring to light a plethora of "R&D investments" that are simply impossible to justify based on prudent public policy.

A demand, in some quarters, that every federal research investment must be successful. This risk-averse viewpoint, often touted by those claiming to be caretakers of the American Taxpayer, is misguided and betrays a misunderstanding of how scientific discovery, engineering research, and technology development enterprises work. This relates closely to Schekman's (valid, in my view) lament that many in the government bureaucracy believe discoveries and breakthroughs can be "programmed" and scheduled.

A focus on immediate return on scientific research investment by non-governmental entities. This is (sort of) the opposite view of the entitlement crowd. It is held and practiced by many industrial concerns. "If we can't see a substantial return on our research investment within 2-3 years, we shouldn't be doing it." (I've blogged before about the embarrassingly low levels of research investment by the private sector in the nuclear energy arena.)

So, enough regurgitation of the status quo. What are my proposed solutions?

More about that in an upcoming post! :)

Just Thinking & Happy New Year!

Sherrell

Monday, December 9, 2013

I routinely violate the 1st Rule of Blogging, which states that one must add new content to one's blog frequently and regularly. The optimal interval between entries is supposedly no more than one week. Daily is far better. Really?

Look down below at the date of my last previous posting – August. ¿Qué pasa?

I am not a "professional blogger". I follow "3 Simple Rules" for blogging: (1) I do not blog simply to demonstrate I'm alive; (2) I blog when I have a well-structured, fresh, original, or insightful thought to offer; and (3) I honor my readers and never lose sight of the privilege bestowed on me by those who actually take time away from the other demands of life to read what I have to say.

So why no blog since August?

Well, it's not because I've quit observing and thinking. It's because I've been otherwise preoccupied. Two other endeavors have sequestered my attention during the past several months. First, we've had some serious family health issues this year. They are, thankfully, mostly behind us. But they have deserved and demanded my priority. Second, during the past six months, I've devoted virtually every free moment to (finally) bringing to fruition a vision I've had for well over ten years...

One of the reasons I blog is that I have a passion for communicating technical information to diverse audiences. One of the most disconcerting observations I've made about technical professionals during the course of my career is that some of the finest scientists, engineers, and project professionals lack the training and skills to excel during those critical oral presentations that punctuate careers and business life cycles in scientific, technical, regulatory, and "high-tech" businesses.

This really shouldn't be surprising, because most technical professionals have had no formal training in the craft of preparing and delivering effective technical oral presentations.

For well over ten years, I've had a vision for integrating my (now 35 years of) experience as a technical communicator in the energy and research business, with the best thinking of the cognitive science community, to create a method for preparing and delivering high-consequence oral presentations in high-stress environments (think Regulatory & Safety Reviews, Investor/Sponsor Meetings, Best & Final Proposal Presentations, Major Project Reviews, etc.).

I'm pleased to say the vision is now a reality: announcing the "3C-Oral Presentation" or "3COP" Method. (In case you're wondering, the "3C" stands for CLEAR, CONCISE, and COMPELLING.) During the second half of this year, I put the finishing touches on 3COP, and began conducting 1-day workshops for my clients at Advanced Technology Insights, LLC.

I'm passionate about the 3COP Method, because I know it can transform the communication effectiveness of our technical communities, strengthen relationships, promote broader understanding of complex issues, advance careers, and build business success for its practitioners.

The 3COP Workshop is an immersive training experience designed to transform the oral presentation skills of technical professionals whose success is captive to their ability to effectively communicate complex, controversial, and detailed information in demanding business and tightly-controlled regulatory environments. If you and your company are in the

Saturday, August 24, 2013

One of my favorite topics is the interaction of technology and society (thus the "byline" for this blog)... One of the major channels of this interaction is the higher education enterprise. I've blogged before that I believe the cost of a college degree is, in many cases, outrageously overpriced in both absolute and relative terms. (I say this from the vantage point of one who is a product of the academic enterprise, who has interacted heavily throughout my professional career with the academic enterprise, and is a parent of young adults who have availed themselves of both traditional 4-yr colleges and the 2-year technical schools.)

I'm convinced many (some might even venture to say most) college degrees aren't worth what they cost - if one measures "worth" in terms of the value society places on the degree. My basic metric for the value society places on a degree is society's willingness to pay for the exercise of the knowledge and skills supposedly represented by that degree. (Now of course, the other element of this value proposition is the personal internal fulfillment and satisfaction one gains from the college experience and the knowledge gained therein. But that's not my topic here...) Based on my metric, it's abundantly clear a very large percentage of the college degrees being granted in this country today simply aren't worth their cost.

This morning I read one of the most insightful and damning assessments of this situation I've seen. It's an interview with Richard Vedder, of Ohio University and the Center for College Affordability and Productivity. The interview is documented in an opinion piece by Allysia Finley in the Weekend Edition of the Wall Street Journal. I encourage you to read it. Provocative and thought provoking. Good stuff.

Some have asked about my absence from the blogosphere during the past month... The answer is that I've been very busy with clients. I have also been putting the final touches on a new 1-day workshop I'll be offering in the near future. The workshop, entitled "Mastering High-Stakes Oral Presentations for Scientific, Technical, and Regulatory Professionals," is a 1-day equipping event designed to transform individual and organizational oral presentation effectiveness when and where it matters most. More about this in a future blog...

Saturday, July 27, 2013

Caution: this post is extremely "U.S. - centric". My musings below are focused strictly on the U.S. energy situation. Things look different outside the U.S....

I am a believer in the "portfolio" approach to energy supply. You know – spread your dependency across several supply options. Reduce your vulnerability to problems in any one supply sector. Some are fond of calling this the "all of the above" approach to energy supply. And I'm strongly pro-nuclear energy. I believe it is the only energy source in hand that has the capacity to meet our global energy challenges. But...

According to the U.S. Energy Information Administration (EIA), based on the last 12 months of electricity generation data, the current U.S. generation mix is approximately 39% coal, 29% natural gas, 19% nuclear, 7% conventional hydro, and 6% "other renewables" (primarily solar and wind).

An interesting detail buried in this data is that, despite a very successful industry-wide power uprate program that yielded ~1,950 MWe of new generation capacity between 2007 and 2012 and is expected to add another ~500 MWe between 2013 and 2017, nuclear power's slice of the electrical generation portfolio in the U.S. seems destined to diminish over the coming decade.

I really, really hate to say it, but: Nuclear power is in serious trouble in the U.S.

Why? Three inter-related drivers:

Natural Gas – the new "King of Energy". Barring the unforeseen, it appears the wonder-tech of fracking (thank you, George Mitchell) is destined to deliver natural gas (in the U.S.) at $4 to $6/MMBtu for a long, long time. (Of course, an accident in which fracking is proven to contaminate a major groundwater aquifer, or an explosive growth in construction of U.S. LNG export terminals that allows us to economically export our gas, might change that.) Now don't get me wrong. Cheap natural gas is a good thing. In fact, the combination of reduced energy demand, improved energy utilization efficiency, and the ongoing switch from coal to natural gas-based electricity production enabled the U.S. to achieve a remarkable feat in 2012. According to EIA data, U.S. carbon dioxide emissions in 2012 were almost 12% below our 2005 emissions level. Remarkable! (See here for a penetrating analysis of this achievement.)

Aging Nuclear Fleet – the cost of maintaining 40+ year old nuclear plants is rising and appears destined to continue to do so. The recently-announced retirements of Crystal River (~860 MWe), Kewaunee (~550 MWe), and San Onofre 2 & 3 (~2,150 MWe) are, at the risk of over-simplification, a likely harbinger of things to come. Depending on whom one believes, as many as 10 to 12 additional plants are also under intense financial pressure due to the combined effect of cheap natural gas and plant maintenance costs. Mark Cooper, at the University of Vermont, recently released a particularly interesting analysis. He identifies ten plants he believes are unlikely to weather the financial pressures of today's "Gas is King" environment. It's a sobering picture whether or not one agrees with every element of Cooper's analysis.

High Capital Cost of Nuclear Plant Construction – Five new nuclear units are "under construction" in the U.S.: Watts Bar 2 (@ $4.5B), Vogtle 3 & 4 (currently estimated to be ~$14B to $15B by Georgia Power), and Summer 2 & 3 ($10B+). Given the current market capitalization of the U.S. electrical generating utility industry, this is simply too expensive for all but a few utilities to seriously consider. (Only five U.S. utilities have current market values in excess of $25B: Duke, Southern, Dominion, Exelon, and NextEra Energy.) A choice in plant sizes would help (a la Small Modular Reactors). Regulated markets help by reducing financing risk and bolstering investor confidence. But with the capital cost of combined cycle gas turbine plants hovering around $1,000/kWe (see EIA Report here), and natural gas at anything approaching $4-$5/MMBtu, nuclear isn't going anywhere fast at a buy-in cost of ~$6,000+/kWe. And everyone is watching to see if these new plants can actually be delivered at costs close to their current projected levels.
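The gap between those two overnight costs is worth sketching in arithmetic. Here's a rough, back-of-envelope calculation of the capital component of levelized cost for the two technologies. The overnight costs come from the figures above; the capacity factors and fixed charge rate are my own illustrative placeholders, not sourced figures, so treat the output as a shape, not a forecast:

```python
# Back-of-envelope capital component of levelized cost ($/MWh).
# Overnight costs are from the post; capacity factors and the 10%
# fixed charge rate are illustrative assumptions, not sourced data.

def capital_levelized_cost(overnight_cost_per_kwe, capacity_factor, fixed_charge_rate=0.10):
    """Annualized capital cost per MWh of electricity generated."""
    annual_cost_per_kwe = overnight_cost_per_kwe * fixed_charge_rate  # $/kWe-yr
    annual_mwh_per_kwe = 8760 * capacity_factor / 1000                # MWh/kWe-yr
    return annual_cost_per_kwe / annual_mwh_per_kwe

gas = capital_levelized_cost(1000, capacity_factor=0.60)   # combined cycle gas turbine
nuke = capital_levelized_cost(6000, capacity_factor=0.90)  # new nuclear build

print(f"Gas capital component:     ${gas:.1f}/MWh")
print(f"Nuclear capital component: ${nuke:.1f}/MWh")
```

Even granting nuclear its superior capacity factor, the capital component alone lands several times higher than gas, which is the crux of the "buy-in cost" problem above. Fuel and O&M would shift the totals but not erase a gap that size.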

Possible game changers for nuclear?

I can think of a few:

As previously mentioned, should there be a case in which fracking is shown to pollute a ground water aquifer, a change in regulatory regime would almost certainly lead to a higher cost of natural gas.

Another major accident at a commercial nuclear power plant. Nail in coffin... Game over.

A major expansion in domestic LNG export terminals and the associated LNG supply infrastructure would open international markets for our natural gas and would almost certainly lead to an increase in domestic natural gas prices (presuming production levels did not increase in a manner to off-set the international demand).

The promise of Small Modular Reactors to be more affordable from the capital cost standpoint could prove to be true. (I'm not sure how this happens if no one is ordering them.)

New nuclear reactor technology might dramatically reduce the capital and operating cost of nuclear power plants (I'm not sure how this happens when, in real terms, there's almost no significant investment in game-changing nuclear power technology.)

Regulated electricity markets could expand in the U.S. – lowering investor risk and making large electricity generating capital projects more attractive from the investor standpoint (what are the odds?)

A radically new, more attractive investment model could be developed, in which more investors come together to finance a nuclear power plant and share the risks – similar to the petroleum platform financing model long used in the oil industry. Sounds like a "White Knight" scenario...

Watts Bar 2 and the new Vogtle and Summer plants could come in on schedule and cost. This would bolster industry and investor confidence. But I'm not sure that's a game changer.

Nuclear power could be "socialized" in much the same way other "civil infrastructure" (such as the interstate highway system) has been. (I don't think so...)

It would be fascinating to apply a supercomputer and some game theory analysis to evaluate various scenario combinations. Barring that, and based strictly on the computer between my ears, I'm having trouble coming out of this analysis with a picture that bodes well for nuclear power in the U.S. over the next couple of decades.

Sometimes you just have to stare the fiery dragon in the mouth. Yes, you will probably be burned. But you'll have a much better understanding of the challenges you face... Now where did I put those flame-proof goggles?

Thursday, July 18, 2013

We've been reading a lot recently about the impact of modern communications and surveillance technologies on individual privacy. I've been collecting news accounts from various media sources and have assembled a picture of the implications of technology evolution for individual privacy in the western world (particularly the U.S.).

Imagine a day in which...

Every phone call you make and every email you send can be monitored by governmental authorities in real time...

Every click of your computer mouse, and every location you visit with your internet browser, can be recorded and analyzed not only by governmental authorities, but by private companies as well...

The camera and microphone on your computer and your telephone can be activated without your knowledge by nameless hackers...

If you carry a cell phone, every movement you make during the day can be tracked (with resolution in some cases down to a few feet) by governmental and private concerns...

Every trip you make in your automobile can be tracked by governmental surveillance cameras...

Your medical records (eventually including your personal DNA / genome map) are stored electronically online and can be accessed by anybody who wants them badly enough...

Every click of your TV remote and your detailed TV viewing habits are monitored by your cable TV provider, and the "meta-data" can be, and is, sold to advertisers and content providers...

Your minute-by-minute household electricity usage (and all that can be inferred from it) is known to your electric utility...

You (and unknown hackers) can control major appliances in your home from afar via telephone and internet...

Your buying habits at local retailers can be tracked in detail (do you think those "rewards" come with "no strings attached"?)

This is enough to give pause to even the most ardent technoholic.

Especially when one understands that "day" is TODAY...

Yes... all of the above and more are actually happening TODAY, if the avalanche of news media accounts is to be believed...

We are evolving to a point in which every individual, as we go about our daily lives, is creating a highly intimate, dynamic, and enormously large "life data cloud" or "life data echo" (my terms). Furthermore, unless current trends are halted, much (perhaps most or all) of this data can and will become commoditized, packaged as a data product, and either voluntarily or involuntarily made available to society.

This isn't simply "Big Data". This is "Huge Data".

Most of what we as individuals know, and most of who we are, will become "common property". Our lives will, literally, become an open book for all to read...

What has this to do with honey bees?

I used to keep honey bees. They are wonderful and fascinating creatures. When a honey bee returns to its hive, it brings its cargo of food and collected treasures. But it also does a "data dump". The worker bee conveys to the collective hive all it knows of value about its surroundings – thus allowing the entire hive to benefit from its individual life and effort. Every worker bee is, in essence, both a "sensor / detector" and a source of effort for a collective community of individuals – each born and destined for a specific role in the community.

Bee hives and bee colonies are indeed a source of something wonderful and highly valued by man. But honey bees (1) surrender their personal identities for the "good of the hive"; and (2) when they die, are carried out of the hive and their bodies unceremoniously dropped off the front porch.

This makes me wonder about where we are headed as a society. Aren't we beginning to look more and more like a bee hive?

Just thinking...


Tuesday, June 18, 2013

I've been really busy the past several weeks with the relaunch of Advanced Technology Insights, LLC (www.ATInsightsLLC.com). But, as a quick follow-up to my last posting (Your Brain on Big Data), I wanted to share a confirming tidbit I ran across in the WSJ last Thursday (June 13 issue)...

Tucked away on page B6 that day was a fascinating "CIO Journal" blog entry by Rachael King, entitled, "How Spies May One Day Predict The Future". The content of her posting validates my musing from Post # 83 regarding potential displacement of the need to understand "first principles causality" with what I call "big data correality"...

Within the brief article, Ms. King discusses the potential for using "current data to predict the future." She discusses a declassified project, named "Open Source Indicators,"...

"...One declassified project, Open Source Indicators, reviews a range of publicly available sources, such as Twitter messages, Web queries, oil prices, and daily stock market activity, to gauge the likelihood of certain societal events. The goal is to develop continuously automated systems that can predict when and where a disease outbreak, riot, political crisis or mass violence might occur."
According to Ms. King...

"Already the project has been able to accurately forecast student protests that occurred in Paraguay when the president was impeached, and to predict a Hantavirus outbreak in Argentina last year."

All of this reminds me of the discipline of noise analysis, the use of which allows one to extract meaning (if there's any to be found) from what might otherwise appear to be random, uncorrelated, and non-relevant signals. It's also closely related to regression analysis, in which one estimates the response of a dependent variable (say, stock price) to a set of (hypothesized) correlated parameters (day of the week, P/E ratio, values of key financial indicators, etc.). Given some luck, sufficient time, and enough raw data to use in the curve-fitting algorithm, it's possible to develop good predictive capabilities without truly understanding any of the underlying or fundamental cause-and-effect relationships.
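For readers who like to see the idea in miniature, here's a toy sketch of exactly that kind of purely empirical prediction. Everything in it is fabricated for illustration (the "indicators," the hidden weights, the noise levels); the point is only that ordinary least squares can yield a usable predictor with no causal story whatsoever:

```python
# Toy illustration of prediction-by-curve-fitting with no causal model.
# All data and "indicators" below are fabricated for the example.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "observed" history: 200 days of three arbitrary indicators.
n = 200
features = rng.normal(size=(n, 3))         # e.g. day-of-week index, P/E ratio, some financial indicator
true_weights = np.array([1.5, -2.0, 0.5])  # hidden relationship we pretend not to understand
prices = features @ true_weights + rng.normal(scale=0.1, size=n)

# Ordinary least squares: pure curve fitting, no physics, no "why".
weights, *_ = np.linalg.lstsq(features, prices, rcond=None)

# Predict tomorrow's "price" from tomorrow's indicator values.
tomorrow = np.array([0.2, -1.0, 0.5])
prediction = tomorrow @ weights
print(f"Predicted value: {prediction:.2f}")
```

The fitted weights recover the hidden relationship quite accurately, and the prediction is good, yet nothing in the procedure explains why the indicators move the price. That's the "when and where, but not why" situation the post describes.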

So correlation replaces causality.

It raises the question, "Does one really need to know why, so long as one can predict when and where?"

Some heavy physics, metaphysics, and philosophy knotted up in this one...

Monday, May 6, 2013

I was chatting with a colleague recently about the wonders of technology and the response of the human mind to it. I'm talking long-term response – evolutionary adaptation of the human mind and human behavior to the increasing role of technology in our lives. While there are countless facets of this subject to explore, two I find particularly interesting (and a bit unsettling) are the potential influence of "Search" and "Big Data" on human cognitive capabilities.

The Impact of "Search" on the Evolution of the Human Mind

Most of us who spend time on our computers and the "net" rely heavily on search engines for our productivity. With the "doubling time" of knowledge shrinking to just a few years, what else are we to do? Think of it. Traditional encyclopedias (e.g., The Encyclopedia Britannica and World Book Encyclopedia) are on life support. Out of date before the ink is dry. And how long since you used an "on-line" version of a traditional encyclopedia? Can't remember? Why? Because it's easier to fire up our favorite "Search Engine" and "Google It" – or go to Wikipedia. Knowing facts and information (rapidly evolving knowledge) is less important when Search Engines and Wikipedia are the universal portals to knowledge.

So this leads me to wonder... how long before our brains figure out that retention of facts is less important when there's an (always and instantly accessible) universal conduit for information? When information is instantly at our fingertips, we don't need to remember it. (Many of my friends in the educational business tell me their students' behavior already reflects a devaluation of memory.) Still skeptical? How many phone numbers do you remember now that they can be instantly recalled by your cell phone? And how will our brain function adapt to this phenomenon? Hard to imagine... One could see the human mind evolving to increase its efficiency at assimilating tremendous volumes of data, but not retaining the information for more than very short periods of time (sort of like a college freshman on 5-Hour Energy cramming for tomorrow morning's final exam). Think of it: long-term memory (except how and where to access our Search Engines and Wikipedia) becomes irrelevant! Thus the ability to avoid information overload, filter out noise, and organize and assimilate information will become much more important. This leads me to...

The Impact of "Big Data" on the Evolution of the Human Mind

A second evolutionary driver may well be the fact that insights gained via collection and analysis of "Big Data" may overwhelm the "need" to understand cause and effect relationships in our world. With enough data and computing power, causality can be inferred ("if this happens, then that happens") without the necessity for any real understanding of the underlying fundamental phenomenology and physics of the phenomenon. Institutions and individuals have grown quite wealthy applying this concept to the stock market since the first supercomputer began crunching stock performance data in endless regression and correlation analyses to determine what happens to the Dow Jones Industrial Average when it rains in Tasmania. Similar techniques are being applied today to understand the global evolution of killer viruses and health pandemics. Could we reach the point where simply knowing "A causes B" and the magnitude of that change, is more important than knowing why and how "A causes B"? And if we do reach that point, how will our brains "reprogram" the functions of our cerebral real estate in response? Will mankind lose the ability to understand the physics of causality? Will he care? Will it matter?
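The "rains in Tasmania" quip can be made concrete with a toy example. In the sketch below (all data fabricated), two observable series are secretly driven by a common hidden factor. An analyst who sees only the two observables finds a strong correlation and can fit a usable one-parameter predictor, while remaining entirely ignorant of the mechanism:

```python
# Toy "correlation without causation" example. All series are fabricated;
# a hidden common driver links the two observables, but the analyst
# never sees it and needs no mechanism to predict.
import numpy as np

rng = np.random.default_rng(7)
n = 500

driver = rng.normal(size=n)                                # hidden common factor
rain_in_tasmania = driver + rng.normal(scale=0.3, size=n)  # observable A
dow_change = 2.0 * driver + rng.normal(scale=0.3, size=n)  # observable B

# The "big data" analyst sees only A and B, and finds they move together...
r = np.corrcoef(rain_in_tasmania, dow_change)[0, 1]

# ...and fits a predictor: "A moves this much, so B moves that much."
slope, intercept = np.polyfit(rain_in_tasmania, dow_change, 1)

print(f"correlation: {r:.2f}, fitted slope: {slope:.2f}")
```

The correlation is strong and the fitted slope is a perfectly serviceable forecasting tool, yet A does not cause B at all; both are symptoms of something unobserved. Whether "A predicts B" is good enough, without ever asking why, is exactly the question posed above.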

So much for pondering the inter-relationship of technology and the evolution of mens hominis. Lots of simplifications and assumptions above, but fascinating to ponder...

Saturday, April 27, 2013

There was a fascinating article in today's Wall Street Journal entitled, "The Diploma's Vanishing Value," by Jeffrey Selingo. The article details the fact that many two-year, Associate Degree individuals are beginning their professional careers with salary levels higher than many four-year college graduates. For instance, in my home state of Tennessee, the article points out that "the average first-year salaries of graduates with a two-year degree are $1000 higher than those with a Bachelor's Degree."

The article goes on to say, "Technical degree holders from the state's community colleges often earn more their first year out than those who studied the same field (my emphasis) at a four-year college." As an example, the article cites data for graduates in health professions from Dyersburg State Community College. Turns out, "They not only finish two years earlier than their counterparts at the University of Tennessee, but they also earn $5,300 more, on average, in their first year after graduating."

The phenomenon is not, of course, limited to my home state. Mr. Selingo also relates that in Virginia, graduates with two-year technical degrees from community colleges make $20,000 more in the first year after college than do many Bachelor's Degree graduates from the state's four-year colleges. (Those of you who wish to know more about this phenomenon can go to
http://collegemeasures.org/esm/ for access to a cache of raw data.)

Wow!

I grew up in a world where "higher education" was the key to a better life, at a time when there were no two-year community or technical colleges. I worked my way through seven years of college (with the help of a scholarship or two, a 30-hour-a-week job when I was a freshman, a wonderful Cooperative Engineering (Co-op) program, and a graduate research grant). As the cost of "higher education" has skyrocketed in recent years, it has become more and more difficult for a young person to replicate my college experience.

I know of a situation in which a young person recently graduated with a Bachelor's Degree in economics from one of our nation's premier universities. Despite working for "spending money" through all four years of undergraduate study, this young person graduated with over $100,000 in educational loan debt. Think about that! What can you do with a Bachelor's Degree in economics that will realistically allow you to pay off $100,000 in loan debt before you are 40 years old? I maintain that the educational system failed that young person.

This is the little "dirty secret" of higher education: Society does not reward many, many "professions" and fields of study. It's absolutely wonderful (from the standpoint of personal satisfaction) if one wishes to pursue an undergraduate degree in some field of little commercial value – but we have an obligation to our youth to help them make informed decisions about such things. And we aren't doing it.

Some hard truths:

A four-year college isn't right for everyone

A 4-year Bachelor's Degree in many fields does NOT guarantee a financially-secure future

Speaking strictly in economic terms, many college degrees aren't worth their cost

Our nation desperately needs health technicians, physical therapists, radiological protection technicians, instrument technicians, auto mechanics, plumbers, etc., etc., etc., – and one can make a good living in these careers.

How does this relate to energy and society?

Knowledge discovery and innovation often occur within the hallowed halls of research institutions, national laboratories, and academia. But SOCIETAL IMPACT only occurs when this knowledge and innovation is converted to hardware installed in the field. I'm speaking of power plants, electrical distribution networks, hospitals, telecommunications networks, and so forth. This impact cannot occur without an army of skilled and passionate professionals – many of whom will be trained in our two-year technical and community colleges. And many of whom will go on in life to be entrepreneurs who start their own businesses. When it comes to education, we need it all... Ph.D.s, Master's Degrees (the degree I still consider to be the ideal "do it all" degree), Bachelor's Degrees, and Associate Degrees – to get the job done! And the common thread? STEM – Science, Technology, Engineering, and Math.

So, "hats off" to our nation's two-year colleges and to those who are increasingly seeing them as a pathway to the future. These intrepid souls are proving that "grey collars" often lead to "greenbacks" (as in $$$$$).

Monday, April 22, 2013

I've been thinking recently about the role of scientific and technical knowledge retention and transfer in an increasingly complex and fast-paced world.

My thinking was catalyzed by a chorus of recent news releases and internet postings heralding the "new discovery" by someone that molten salt reactors (MSRs) can transform the world by providing cheap electricity, utilizing thorium, burning plutonium, and destroying actinides and radioactive waste. Without addressing the technical validity of these assertions, let me say I find this "discovery" or more accurately "re-discovery" phenomenon fascinating from the standpoint of knowledge retention and inter-generational knowledge transfer (or lack thereof). I wonder how many times through the ages fire has been "discovered"?

One of the reasons the molten salt reactor example is so interesting is that virtually all of the asserted benefits of molten salt reactors were originally cited in the mid-to-late 1950s (when I was a toddler...). In fact, Briant and Weinberg asserted most of the benefits (including the ability to run on uranium, thorium, and plutonium fuel cycles) in their 1957 paper in Nuclear Science and Engineering. Subsequently, these attributes have been explored on a cyclical basis by a variety of domestic and international entities and collaborations, with the last real flurry of interest in the MSR coming a decade or more ago when there was renewed interest in the potential use of MSRs for radioactive waste transmutation. But enough about MSRs. They are simply the example that triggered this stream of consciousness.

Now back to knowledge retention and transfer...

Throughout my 30+ years in the energy R&D field, I've observed that the "dusting-off" of, or "re-look" at "old technologies" and technical approaches is generally wise whenever one or more of Three Criteria are met:

(1) A scientific & technical discovery has been made (such as the understanding of a fundamental phenomenon) that provides a critical insight previously unknown;

(2) Changes and evolution in base or enabling technology (such as a new material) enable one to do things not previously possible;

(3) Externalities (such as constraints, perceived need & urgency, societal / cultural values, changes in competing technology acceptability, etc.) shift or change in a manner that potentially improves the perceived risk/reward math for the "old technology".
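As a toy illustration (entirely my own invention; the field names, catalog entries, and scoring logic are hypothetical, not from any real assessment process), the Three Criteria can be thought of as a simple screening filter over a catalog of shelved technologies:

```python
# Hypothetical sketch: screen "old technologies" for a re-look whenever
# any one of the Three Criteria is met. All names and data are invented.

def worth_a_relook(tech: dict) -> bool:
    """Flag an old technology for re-examination if any criterion is met."""
    return (
        tech.get("new_fundamental_insight", False)     # Criterion 1: new critical insight
        or tech.get("new_enabling_technology", False)  # Criterion 2: new base/enabling tech
        or tech.get("externalities_shifted", False)    # Criterion 3: risk/reward math changed
    )

shelf = [
    {"name": "molten salt reactor", "externalities_shifted": True},
    {"name": "some other shelved concept"},
]

revisit = [t["name"] for t in shelf if worth_a_relook(t)]
print(revisit)  # prints: ['molten salt reactor']
```

A real "mining" effort would of course need evidence behind each flag, but the structure of the decision is this simple: any one criterion is sufficient to justify a fresh look.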

Just as the biosphere is a preserve or "library" of "solutions to problems" (perhaps to problems or challenges we don't even know we face), our knowledge base of "old technologies" is a library of potential solutions to problems (current and future). But what happens when the "library" is lost? (After all, who knows what was lost when the Library of Alexandria burned?)

I can't help but recall a situation at ORNL some twenty years ago when I inherited the last remaining "gold files" (three file cabinets) of Art (Arthur P.) Fraas, an internationally known energy technology engineer who retired from Oak Ridge in 1976 and passed away in 2011 at the ripe old age of 95. Art was an engineer's engineer – a remarkably gifted and versatile individual. Among other things, he was known in the 1950s-1970s as one of the most innovative engineers at work in the development of both advanced terrestrial and space power reactor concepts. When Art retired in 1976, he transferred what he considered to be his most important personal files, notes, and log books to another engineer, who, in turn, left them in the safekeeping of "management" when he retired. Some time afterwards I was made aware of the files and was asked if I wished to preserve them. Having been told these were "Art's files", I rushed down to the basement of the old Y-12 calutron building where they were being stored (one of the buildings where the uranium for the "Little Boy" atomic bomb of World War II was enriched). With great anticipation I approached the first file cabinet and opened the drawer. It was empty. I opened a second drawer. Nothing but dust. A third drawer creaked as I pulled it open and surprised some cockroaches. And so on with the other two cabinets. It turns out that, with the exception of two binders of old photographs, all of Art's files had been tossed out about a week earlier in an effort to clear the area of "debris and refuse". I can't tell you how many times over the past twenty years I've wondered what was lost. I could relate other similar stories. I guess someday someone will "discover" what we tossed out – or not.

We live in a "throw away" society. And the good news? History teaches us that, given enough time, mankind tends to "rediscover" that which has been lost – or at least fragments of what has been lost. The internet is making it possible to preserve more and more of our society's knowledge legacy. But rather than simply stumbling upon a rediscovery, wouldn't it be wonderful if our "search" capabilities enabled us to stitch together knowledge bases, filtered through the Three Criteria I cited above, to provide society a deliberate and structured approach to re-examining or "mining" historical knowledge and technology bases? Now that would be a "search engine" for the ages!

Sunday, April 14, 2013

A couple of years before I left ORNL (in Sept. 2011), I had the privilege of assembling a small team of very bright engineers to tackle an idea I had for a small, high- to very-high temperature nuclear reactor system that would lend itself to distributed generation of process heat and electricity. The reactor was to be usable both in single-unit applications and in clusters (much like NuScale and mPower small modular light water reactors) to meet higher energy demands. The concept was to integrate the best features, technologies, and system architecture elements from ORNL's historic Molten Salt Reactor Experiment (MSRE) and Molten Salt Breeder Reactor (MSBR) concept, gas-cooled reactor graphite fuel technology, ORNL's (then) recent Advanced High Temperature Reactor (AHTR) fluoride salt-cooled reactor concept, and system topology features from integral fast-spectrum liquid-metal cooled reactors.

Working with a multidisciplinary team of engineers, and with very limited (internal laboratory) funding, our team created a new concept we called SmAHTR (for Small modular Advanced High Temperature Reactor). SmAHTR can be described as an Integral Salt-Cooled Reactor (iSCR). Though we did not know it when we first began work on the SmAHTR concept, we subsequently learned our Russian colleagues at the Kurchatov Institute in Moscow had developed, and published in 2002, a concept for a very small (< 20 MWt), integral, liquid salt-cooled reactor they called MARS. SmAHTR and MARS share some design similarities but have some significant design differences as well.

The basic design requirements for SmAHTR were: 125 MWt power, ~700 ºC
core outlet temperature, and an integral system topology (no coolant
loops). Additionally, the reactor had to be transportable over public
roads with common heavy-transport multi-axle semi-tractor-trailers. The system also
had to be extremely safe and easily refueled and maintained. The inherent safety attributes of the system are a result of its very low (~ atmospheric) operating pressure, forgiving nuclear dynamics, large thermal margins, and the use of a coolant that doesn't chemically react with air or water in highly energetic modes.
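For readers who think in code, the top-level requirements listed above can be restated as a small, checkable data structure. This is purely my own illustration; the field names and the consistency check are not from the SmAHTR design report:

```python
# Hypothetical restatement of SmAHTR's stated top-level requirements.
# Field names and the low-pressure check are my own illustration.

from dataclasses import dataclass

@dataclass
class ReactorRequirements:
    power_mwt: float               # thermal power
    core_outlet_temp_c: float      # approximate core outlet temperature
    integral_topology: bool        # integral system (no coolant loops)
    road_transportable: bool       # movable by multi-axle semi-trailer
    operating_pressure_atm: float  # near-atmospheric for inherent safety

smahtr = ReactorRequirements(
    power_mwt=125,
    core_outlet_temp_c=700,
    integral_topology=True,
    road_transportable=True,
    operating_pressure_atm=1.0,
)

# Very low operating pressure is one of the inherent-safety attributes
# cited above; a requirements tool could enforce it as a simple check.
assert smahtr.operating_pressure_atm <= 2.0
```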

The Small modular Advanced High Temperature Reactor (SmAHTR)

The driving force for SmAHTR was our belief that high-temperature salt-cooled reactors would offer superior economics to gas-cooled reactors; would open new doors to nuclear process heat applications; and could be more quickly developed, demonstrated, and licensed than fluid-fueled molten salt reactors (MSRs). SmAHTR's operating temperature would be limited by current structural material considerations to 700 ºC (probably a little lower in initial implementations), but the basic concept could evolve to much higher temperatures when and if superior compatible structural materials are developed (a long-term proposition). Fluoride salt-cooled reactors share many materials and component technologies with molten salt reactors. Thus, in addition to providing a potentially game-changing nuclear energy system, successful development of SmAHTR would resolve many of the technological challenges faced by MSRs as well. (Those of you who are interested can access the SmAHTR pre-conceptual design report, ORNL/TM-2010/199, here.)

Incidentally... building and working with high performing, innovative
teams was one of the activities I enjoyed most during my years at ORNL.
My experience with the SmAHTR team was a particular joy. Every team
member contributed to the concept. For instance, Jess Gehin suggested
the idea of adapting the old MSBR plate-type
moderator assembly for use as a graphite fuel element. Venu Varma engineered the innovative "bayonet loading" concept we adopted to provide
quick access to the various components that load through the top of the
reactor. I could go on... It's a real thrill to build a great team and be part of its workings...

The Department of Energy and ORNL have done little to move the needle on the SmAHTR concept since I left ORNL. However, ORNL, with funding from the Department of Energy, has moved forward to integrate some of the best features of SmAHTR into the large gigawatt-class AHTR concept, and to continue some critical fluoride salt-cooled reactor technology development. Time will tell whether someone sees sufficient merit in SmAHTR to further mature the concept.

However, others are building upon the approaches we pioneered with SmAHTR. I understand Dr. Per Peterson and his team at UC Berkeley are investigating integral versions of their pebble-bed salt-cooled reactor concept (which is larger than SmAHTR and originally employed coolant loops). And just this week, I learned my colleague Dr. David LeBlanc has gone "back to the future" with our SmAHTR concepts to create an interesting small Integral Molten Salt Reactor (iMSR) concept. The concepts (actually two – a 650 MWt version and a 60 MWt "ultra-small" version) are discussed in an April 12 posting on the Weinberg Foundation's website. While retaining many of the general system architectural and component features we specified in SmAHTR, David has discarded SmAHTR's solid graphite fuel and reverted to a liquid fluoride salt fuel. Since I haven't seen any design details at this point, I'm withholding judgment regarding the engineering viability of the concept. But from the overall philosophical perspective, David's concept appears to be the most innovative and fresh MSR approach I've seen since the heyday of MSR development in the 1960s and early 1970s.

Both MSRs and salt-cooled reactors (integral or otherwise) face many, many challenges in moving from a pre-conceptual design to an actual prototype system. But some very bright and passionate folks are expending considerable energy in that direction.

Sometimes it's enjoyable to consider "the road not taken" in nuclear energy. Alvin Weinberg would be smiling...

Note to cynics: Everyone in the nuclear energy business is familiar with Admiral Rickover's famous comment about "paper
reactors". Though obviously founded in truth, in my view, far too many have too often used his comments as the "nuclear
option" to stymie any serious discussion of real innovation in the nuclear
energy field. So please spare me the comments about paper reactors... Believe me, I know the prospects of developing a new reactor concept in today's environment are remote. And the prospects of developing a high-temperature fluoride salt-fueled or -cooled system are further impeded BOTH by overly pessimistic AND overly optimistic urban myths and legends from the MSRE days. I don't own or wear "rose-colored glasses". The first iSCR or iMSR won't come easily, quickly, or cheaply. But the payoff could be significant if the challenges can be successfully overcome.

Saturday, March 9, 2013

On the eve of the two-year anniversary of the accident at Fukushima Daiichi, the U.S. Nuclear Regulatory Commission (NRC) and the U.S. nuclear power industry are making progress addressing the twelve recommendations of the NRC's Near-Term Task Force (NTTF) Review of Insights From the Fukushima Dai-ichi Accident.

The NRC is currently working to resolve a major question about the continued safe operation of the nation's commercial boiling water reactor (BWR) fleet. The question at hand, driven by NTTF Recommendation 5.1, is "Should hardened, filtered, primary containment venting (FCV) systems be required as backfits to the nation's thirty-one commercial boiling water reactors (BWRs) that are similar to the units at Fukushima Daiichi?" Or, as Shakespeare's Hamlet might put it, "To vent, or not to vent, that is the question..."

Regardless of one's view on the matter, one has to have some sympathy for our colleagues at the NRC.

The NRC staff issued its analysis of the question in SECY-12-0157 (November 2012). Their conclusion was that engineered filtration systems should be required.

On January 15, 2013, the U.S. House of Representatives Committee on Energy and Commerce (dominated by Republicans) wrote a 10-page letter to the NRC saying, in essence, "Slow down. You're moving too fast!" The letter questioned whether the hardened filtered vents are needed, and reminded the NRC that it took some actions in the wake of the accident at TMI that were later judged to be unnecessary or otherwise ill-advised.

Not to be outdone by their House colleagues, on February 20, 2013, the Senate Committee on Environment and Public Works (dominated by Democrats) sent the NRC a letter, saying, in so many words, "You should require the vents. Get on with it, now!"

In the meantime, the industry's position on the hardened vents can be summarized as, "Wait a minute! Not so fast! We're implementing the FLEX strategy. We need to understand all the implications of the FLEX strategy before we require the plants to spend buckets of money installing hardened filtered vents. And oh, by the way, unless one can prevent the containments from failing (thus at least partially bypassing an engineered vent system), money spent on a hardened, filtered vent is money wasted. Far better to put the money into systems that can prevent the containment from failing."

It is important to remember that none of these questions, and none of these positions, are new (see NRC Generic Letter 89-16). It's "deja vu all over again," as Yogi Berra said. These same questions and issues arose back in the early-to-mid 1980s when we were taking the first serious look at BWR severe accidents. I recall two basic viewpoints about filtered containment vents that arose then and are implicit in the dialog today: On the one hand, "Vents are a nod to what you don't know you don't know – a last best safety net (or barrier in the multi-barrier containment concept), and thus filtered vents should be required." The counter argument was, "Yes, we don't know what we don't know, and the unintended consequences of installing a filtered venting system may overwhelm the benefits – so FCVs should NOT be required."

It's fascinating to realize this dilemma has not been resolved despite thirty years of risk-informed regulatory debate.

I recently sat in via telecon on an NRC / Industry public meeting on the matter of whether hardened filtered vents are to be required and how the NRC plans to make that decision. Buried in the dialog is a fascinating techno-philosophical issue that juxtaposes the desire for passivity in our safety systems and the need for reliability and safety system effectiveness. The attraction of a passive system is that it requires no human action and no outside power sources to act. Most passive containment venting concepts I'm aware of employ a vent flow path that, "once open, is always open." This also means once the vent is open (functioning), the filter system becomes vulnerable to any dynamic loads or forces that might be placed on it from explosions or energetic events in the containment. The effectiveness of a passive filtration system after such an event is obviously in question. In a perfect world, one would prefer the ability to remotely or manually open and close ("modulate") a hardened vent line as the accident progresses, thus protecting the vent filtration system from damage and ensuring its effectiveness when it is needed.

But, as anyone who has analyzed the response of BWR reactor buildings during severe accidents knows (and the accident at Fukushima illustrates), the ability of personnel to maneuver through a reactor building to reach equipment requiring manual operation can be severely limited by the environmental conditions created by the accident. And the idea of a completely passive filtered containment venting system that can open and close as needed is a bit like the idea of a "one-ended stick"... hard to conceptualize.

Completely aside from the technical issues involved, the NRC faces the very practical, but still highly philosophical issue of HOW to make the filtered vent decision. There may be a good Ph.D. thesis in decision theory buried in there somewhere!

As I said, regardless of one's position on the matter of filtered containment vents, one has to have some sympathy for our NRC colleagues who are faced with untying this "Gordian knot". Hamlet could relate to their dilemma.

Friday, March 1, 2013

Relative to my last post regarding federally-funded research and development, and with one eye on the upcoming federal budget sequester, I want to relate a funny and thought-provoking story about a conversation I had with a colleague at Oak Ridge National Laboratory some thirty years ago...

One day in the early 1980's, I and a number of other "early-career" guys (yes, all guys) were sitting around the lunch table bemoaning the lack of support for our favorite R&D programs. We swapped stories about the challenges of working within the federal R&D enterprise and the struggles we shared in dealing with volatile funding profiles and Washington politics. Sitting with us (and listening quietly) that day was an older colleague. I will call him "Henry" for the purposes of this story. Henry was about the age I am now (late 50's), and had joined ORNL after a long career in the nuclear power industry. Henry listened for some time to our groaning. Then, with a twinkle in his eye, and a stroke of his graying red beard, he said in a voice salted with wisdom we were yet to acquire, "Never forget, Gentlemen, WE (speaking of those employed in the federal R&D establishment) are the aristocracy of the welfare class!"

Henry continued to explain that, while we complained about our taxes, our salaries were paid and our labs were equipped by everyone's taxes. While we complained about federal handouts to various groups, we were at the "top of the federal food chain" when it came to taxpayer support. Though we bemoaned the inefficiencies of the federal R&D establishment, our laboratory was a major element of that establishment. And, while we groaned about constant bickering between and within the Legislative and Executive Branches over R&D priorities, we all had elected officials to thank for the continuance of our favored research programs.

Henry did not know it then, and I don't recall telling him later, but his comments that day had a profound impact on my thinking and my career. For you see, it was at that instant I first recognized what a privilege it is to work in the federal R&D enterprise.

And it was in that instant I began to understand the responsibility those in the federal sector bear to "give back" to the tax payer some tangible positive IMPACT in exchange for the privilege of having their research supported by the taxpayer... the "Societal Contract" I mentioned in my previous post.

It occurs to me as I write this that today I am almost a perfect complement to the Henry of thirty years ago. I'm in my late fifties, as he was then. I've spent most of my career in the federal research enterprise and have now moved into private enterprise. The vectors of Henry's career and mine have been almost exact opposites in that respect. Yet the wisdom of Henry's words from so long ago still rings in my ears. Amen, Henry! Preach on!

Friday, February 22, 2013

In my previous post I bemoaned the relative lack of focus on applied research in the federal sector.
For most of my career, leaders in the federal research establishment typically described the research and technology development enterprise as
a 1-dimensional vector similar to this:

The primary role of federally-funded research and development (R&D) in the physical sciences – that is, R&D funded with your tax dollars and mine – was to support the extreme left side of the vector – to pursue "big science" or "discovery science." As a result of this thinking, a culture developed in which it became almost taboo for federal R&D dollars to be spent on anything with the potential for near-term applications. (Military and defense R&D, and some biological R&D, were the notable exceptions to this norm.) This behavioral trend was strengthened by criticisms about "corporate welfare" whenever major federal applied research programs were contemplated.

One result of the "vector view" of scientific research and technology development was the emergence of a "valley of death" separating fundamental research and applied research.

New knowledge and infant technologies of a practical variety often died there because they were deemed too mature for the federal sector to invest in their development, and too immature for the private sector to invest in their development. (Recall that Risk Aversion problem?) Having spent over thirty years in the federal research system, I'm all too familiar with the "valley of death" that obstructs the metamorphosis of knowledge into technology and the movement of a new technology from concept to the consumer marketplace.

One of my major issues with the dominant federal "big science" view of R&D in the past twenty years is that it ignores the Societal Contract underlying all federally-funded research.

In my view, that Societal Contract is one in which taxpayers who fund the research have a right to expect a return on their investment. Put another way, in my view those who lead and manage federally-funded research bear a heavy responsibility to POSITIVELY IMPACT SOCIETY in return for society's sponsorship of their research. I can recall multiple instances in my latter years at ORNL in which we came together to answer the question, "What R&D have we done recently that has really impacted the public?" Those were usually awkward and difficult discussions. (But not as awkward and difficult as those times when we asked ourselves the question, "What R&D have we done that has actually solved a problem the average person on the street really cares about?")

Perhaps these standards are too high and too lofty? Is it possible the major problems the seven billion people of Planet Earth face today are so complex they cannot be solved? Or perhaps our problems are so nebulous we cannot know when and if we have solved them?

Or, perhaps, just perhaps, science and technology aren't the solution to our greatest problems and challenges.

Friday, February 1, 2013

Risk aversion is killing us. Or more to the point, it's killing innovation. Everywhere.

I remember the moment Neil Armstrong stepped on the moon. I was a
teenager watching him and Buzz Aldrin on a black and white television in
our living room. Frankly, I can't imagine the U.S. pursuing so bold an
effort as the Apollo program in today's "take no risk" and "protect
everyone against everything" environment – even if such a program was
deemed affordable.

Many governments (including ours) have grown so risk-averse when it comes to research and development, it's becoming almost impossible to pursue the type of bold research required to solve our largest problems or pursue our grandest dreams.

Ask yourself how long it has been since you heard of any research at a federal R&D facility that actually solved a problem that really matters to the average person in your community. You know – the "average person" who's working hard every day to shelter, feed, clothe, and educate their family.

I'm not saying our federal R&D budget isn't bearing fruit. But overall, our federal research portfolios during the past 20-30 years have tended to focus on "basic research" and the pursuit of fundamental discoveries that often have few – if any – practical implications for the foreseeable future. Often this is also research in which failures (conveniently) go unnoticed by the "average person."

I'm all for discovery-oriented research. Really. But I feel the times in which we live should compel us to invest the majority of our tax-funded-research in areas with the potential to IMPACT our daily lives in meaningful, tangible ways. As much as I love physics, chasing the Higgs is not one of these areas. (I suspect my viewpoint stems from the basic motivational differences between a scientist and an engineer.) There are signs of hope, and the last few years have witnessed some improvement in the overall balance of our federal research portfolio (applied vs. basic). In any event, we still have a long way to go to achieve appropriate balance – one that places highest priority on "Pasteur's Quadrant" research (to quote Donald Stokes). (But more about Donald Stokes and "Pasteur's Quadrant" in a future post.)

Having spent over thirty years in the federal R&D environment, I know from experience that talented researchers in the federal R&D establishment struggle under a growing burden of well-intentioned orders and regulations that make it devilishly expensive and extremely time consuming to work with anything carrying the "hazardous" or "risky" label. Try building a laboratory today in a federal research facility to work with liquid metals, molten salts, high voltages, high pressures, high temperatures, or (heaven forbid) anything radioactive. It costs too much. It takes too long. It requires too many approvals.

The picture doesn't look much better in the private sector.

First, many of our energy industries have shockingly-low overall
rates of investment in applied research. For instance, the last number I
heard quoted for the nuclear energy industry was that something under 2% of
annual revenues are reinvested in product development and applied
research. (I hope that's wrong.)

One would think the private sector would reward innovation. Sometimes it does. Apple and Google are household names. (Heck, Google is now a verb!) But dig a little deeper, and things look ugly. I just ran across a fascinating paper by Shai Bernstein, an assistant professor at Stanford's Graduate School of Business. Dr. Bernstein has been studying the impact of the transition from private to public ownership on a business's innovation performance. He did this by comparing the performance (patent citations) of a large number of companies over a three-year period just prior to their initial public offering (IPO) to their performance in the five-year period following the IPO. His analysis found a forty percent drop in innovation novelty in the post-IPO period compared to the pre-IPO period, but little change in the rate at which patents were awarded. His conclusion? "The results suggest that the transition to public equity markets leads firms to reposition their R&D investments toward more conventional projects." The paper goes on to say, "I find that the quality of innovation produced by inventors who remained at the firm declines following the IPO and key inventors are more likely to leave." It appears that once a company goes public, it becomes more focused on bottom-line profits for its investors than on continuing the risky behaviors (read that: bold innovations) that spawned its success in the first place.
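The shape of that pre/post-IPO comparison can be sketched in a few lines of code. To be clear, this is a toy illustration of the general approach described above, not Bernstein's actual method or data; the firm names and citation counts are invented:

```python
# Toy sketch of a pre/post-IPO innovation comparison: for each firm,
# compare average forward citations per patent in the window before the
# IPO with the window after. All data below is invented for illustration.

def avg_citations(patents):
    """Mean forward citations per patent; 0.0 if the firm filed none."""
    return sum(patents) / len(patents) if patents else 0.0

# citations per patent, keyed by firm: (pre-IPO window, post-IPO window)
firms = {
    "firm_a": ([12, 9, 15], [6, 7, 5, 8, 6]),
    "firm_b": ([20, 18], [11, 12, 10, 9]),
}

for name, (pre, post) in firms.items():
    change = (avg_citations(post) - avg_citations(pre)) / avg_citations(pre)
    print(f"{name}: {change:+.0%} change in avg citations per patent")
```

The real study, of course, controls for many confounders (patent age, technology class, market conditions); the point here is only the basic before/after structure of the measurement.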

So where will innovation occur in this poorly-regulated, bottom-line-oriented, better-is-the-enemy-of-good-enough culture we have made for ourselves?

Our best hope may be the person I call "The Maker Next Door." You know. That guy (or gal) with the 3D printer, the Arduino, and the TIG welder. The neighbor whose garage lights seem to stay on late at night after all the other neighbors have gone to bed.

But wait! My garage is calling. The soldering iron is hot, the air compressor is running, and that new dual-head 3D printer just arrived. I'm outa here...

Monday, January 21, 2013

As those of you who've followed this blog for any time know, I've spent most of my career in the nuclear reactor safety and nuclear energy technology development fields. While authoring my recent paper, "The Canary, The Ostrich, and The Black Swan: An Historical Perspective On Our Understanding of BWR Severe Accidents and Their Mitigation," (see Posts # 73 & 74) I had considerable time to reflect on the basic skills, values, and perspectives required of a professional reactor safety analyst.

I consider this a subject of no small importance. The generation of reactor safety specialists who enabled the licensing of most of the existing plants in the U.S., who performed the original Individual Plant Examinations (IPEs) and Individual Plant Examinations for External Events (IPEEEs) in the wake of TMI, and who conducted most of the commercial power plant probabilistic risk assessments (PRAs) performed to date, are either no longer with us or are no longer practicing. Those who remain are slowly "passing the baton" to a new generation of reactor safety specialists who will become the guardians and sherpas of reactor safety for the next thirty years. (Lest this sound like my swan song – no pun intended – I'll hasten to note that I'm one of the youngest of my generation of reactor safety specialists, and I hope to have many productive years left in the profession.) Other than the recent flurry of plant life extension actions, the pace of commercial nuclear reactor safety analysis has slowed considerably during the past 20 years. The nuclear reactor safety business hasn't exactly been seen as an "up and coming" profession and career path for aspiring young engineers.

Thus it is with this as background that I share here what I call my "ethos of nuclear reactor safety" (as I like to believe Alvin Weinberg might have put it)... This ethos comprises four ideals, principles, or attitudes essential to the practice of reactor safety:

An acute awareness of one's responsibility to society. Abundant, reliable, and affordable electricity is the chief technical enabler of the quality of life most of us desire. Nuclear power is the only energy technology available today with a realistic potential to supply abundant electricity to billions of people around the world living with little or no access to it. It is also one of the few technologies which, if implemented poorly, has the potential to prevent our neighbors from ever returning to their communities and homes. These two realities (benefits vs. risk) should provide strong motivation to those who aspire to be a reactor safety specialist;

A chronic sense of uneasiness. This means having a persistent, questioning attitude regarding what we know, what we know we don't know, and what we don't know we don't know – a willingness to challenge the status quo and the establishment when necessary to ensure safety;

A zeal for fundamental understanding. The passion for and skills to integrate experimental data, simulation & analysis results, and operational experience to arrive at a science-based understanding of the facts. And finally,

A scientific and technical humility. One who has a "healthy respect" for the limits of our knowledge and the wisdom to operate within these limits. One who constantly asks oneself, "What if I'm wrong?"

No one ever sat me down and delivered these ideals to me as if they were chiseled in stone. These are the fundamentals I've distilled over the years by observing and learning from those who have gone before me in this
business. And I had never actually attempted to synthesize them myself until recently.

So there you have it: Greene's Ethos Of Nuclear Reactor Safety. Others may articulate it more eloquently or more precisely. But my sincere hope is that the nuclear power industry, those who regulate it, and those who educate and train the professionals who invest themselves and their careers in it, will always be guided by these ideals.

About Me

Sherrell is the President and Founder of Advanced Technology Insights, LLC ("ATI"). ATI provides technology assessment, systems analysis, personal coaching, and organizational training to individuals and organizations seeking to understand, effectively communicate, and benefit from technology and technology evolution in today's fast-paced, complex environment.
During a career spanning over three decades at Oak Ridge National Laboratory (ORNL), Sherrell Greene rose from individual contributor to the position of Director of Nuclear Technology Programs and Director of Research Reactor Development Programs, with leadership responsibilities for a $120M annual research, development, and demonstration program. He is widely acclaimed for his team building, innovation, knowledge organization, oral presentation, and technical communication skills. Sherrell is adept at communicating complex technical issues, and at formulating, packaging, and presenting compelling scientific and technical stories – stories that inform, inspire, and impact where and when it really counts.