There is No Slam Dunk

Every day I’m reminded that there is no slam dunk in business. Everything is hard work and perseverance. Even apparently simple things are fraught with complications and layers of nuance.  The great appeal of gambling that some find so convincing is that complexity and vexing details have been somehow suspended and a path is clear for the slam dunk. Slam dunks do happen I suppose, but over time the slams outnumber the dunks.

In chemical manufacturing, there are no trivial operations. Every step in the manufacturing sequence requires thought and infrastructure. Even filling drums with water and shipping them out has complications: quality control, portion control, container quality, inventory control, purchasing, pallets and dunnage. Then there is the matter of receiving & shipping, accounts payable and receivable, auditing, taxes, sales and marketing, and all of the other overhead that goes with operating an above-the-board business operation. Then there is the matter of managing a staff and all of the HR delights that go along with that.

Now imagine if you were manufacturing hazardous or controlled substances. Suddenly, your staff are partitioned into those who work with hazardous materials and those who do not. Those who do need a steady supply of personal protective equipment (PPE) as well as lots of documented training programs to operate in hazardous environments. They’ll need physical exams, coats, gloves, boots, eye protection, and respirators with annual training. A smart employer will have the piss wagon come by now and then looking for drug use.

Let’s say that you want to replace a process solvent- say, ether with toluene. In order to do this, you’ll have to validate the process in R&D for scale up. The process change will have to go through some kind of stage gate process to validate the benefit of the change. Some process changes must also be approved by the customer. Woe is he who wants to make such a change in the cGMP or military chemicals world. Developing a perpetual motion machine may be easier.

Process changes will alter the material streams in your facility. This may trigger PSM (process safety management) protocols that will have to play out on their own schedule. Or it may trigger changes to environmental permits or push against low volume exemption (LVE) limits under TSCA.

Process changes may also alter the quality or safety margins that you had been relying on without knowing it. This often occurs when a company tries to intensify a process. Suddenly the process is generating more watts per kg of reaction mass than before. Or all of a sudden the reaction mass doesn’t filter well, or the pot residence time during distillation is deleterious at the higher concentration or with the higher boiling composition. All changes have a downside. These are some of them. There is no slam dunk.

Gold Rush Alaska. Getting the pay out of paydirt.

So I’ll come clean. I am a fan of Gold Rush Alaska on the Discovery Channel. The new season has started with some serious twists. What I like about the show is the technical side. The miners are struggling with serious mechanical problems and difficult issues with unit operations in placer mining. This is what made life precarious for the gold rush miners of the 19th century and is certainly what caused many to return home empty handed. 

Getting to the pay streak, conveying the ore from the pit, moving it to the sluicing equipment, and getting the fines to run over the riffles of the sluice properly require a great deal of energy input. The remoteness of the site, the high cost of heavy equipment, and wrestling with faulty equipment all contribute to the difficulty of getting the pay out of paydirt. The mining season is about 100 days in duration. That is 2400 hours. Every hour must be used to maximum effect.

This season there is a bad guy. This guy, Dakota Fred, tips over the apple cart. So, the boys are heading to the Klondike.  But first they need a claim to work in a time of record gold prices and intense activity in the mines. I love the vicarious life.

On the release of hazardous energy

What should you do if a raw material for a process is explosive? Good question. Just because a material has explosive properties does not automatically disqualify it for use. To use it safely you must accumulate some information on the type and magnitude of stimulus that is required to give a hazardous release of energy.

But first, some comments on the release of hazardous energy. Hazardous energy is energy which, if released in an uncontrolled way, can result in harm to people or equipment. This energy may be stored as mechanical strain of the sort found in a compressed spring, as a tank of compressed gas, in the unstable chemical bonds of an explosive material, or as an explosive mixture of air and fuel. A good old fashioned pool fire is a release of hazardous energy as well. The radiant energy from a pool fire can easily and rapidly heat nearby materials past their ignition point.
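To put a number on "stored energy", here is a minimal sketch in Python that applies a Brode-type constant-volume approximation to an ordinary compressed gas cylinder. The cylinder size, pressure, and TNT-equivalence figure are values I picked for illustration, not data from the text.

```python
# Rough estimate of the hazardous energy stored in a compressed gas cylinder.
# Uses the Brode-type approximation E = (P1 - P0) * V / (gamma - 1).
# All numbers are illustrative assumptions.

GAMMA = 1.4          # heat capacity ratio for a diatomic gas such as N2
P1 = 200e5           # cylinder pressure, Pa (200 bar, assumed)
P0 = 1.013e5         # ambient pressure, Pa
V = 0.050            # internal volume, m^3 (a 50 L cylinder, assumed)

energy_J = (P1 - P0) * V / (GAMMA - 1.0)
tnt_equiv_kg = energy_J / 4.6e6   # ~4.6 MJ per kg TNT

print(f"Stored energy ~ {energy_J/1e6:.1f} MJ (~{tnt_equiv_kg:.2f} kg TNT equivalent)")
```

A couple of megajoules sitting quietly against a wall is exactly the kind of hazardous energy that only shows up on the radar when its containment fails.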

Accumulating and applying energy in large quantities is common and actually necessary in many essential activities. In chemical processing, heat energy may be applied to chemical reactions. Commonly, heat is also released from chemical reactions at some level ranging from minimal to large. The rate of heat evolution in common chemical condensation or metathesis reactions can be simply and reliably managed by controlling the rate of addition of reactants where two reactants are necessary.
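A rough way to see why dose control works: for a feed-limited reaction, the heat release rate is approximately the molar heat of reaction times the feed rate, which can be compared directly against the jacket's cooling capacity. The sketch below uses hypothetical numbers of my own, not data from any real process.

```python
# Minimal sketch of dose control for a feed-limited exothermic reaction:
# if the added reactant is consumed as fast as it is fed, the heat release
# rate is simply (molar heat of reaction) x (feed rate). Numbers are assumed.

dH_rxn = 120e3        # exothermic heat of reaction, J/mol (assumed)
feed_rate = 0.5/60.0  # reactant feed rate, mol/s (0.5 mol/min, assumed)
U = 300.0             # overall heat transfer coefficient, W/m^2/K (assumed)
A = 2.0               # jacket heat transfer area, m^2 (assumed)
dT_jacket = 30.0      # batch-to-jacket temperature difference, K (assumed)

q_reaction = dH_rxn * feed_rate     # W released by the chemistry
q_cooling = U * A * dT_jacket       # W the jacket can remove

print(f"Heat release {q_reaction:.0f} W vs. cooling capacity {q_cooling:.0f} W")
# If q_reaction approaches q_cooling, slow the feed; the dosing pump is the
# rate control element.
```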

There are explosive materials and there are explosive conditions. If one places the components of the fire triangle into a confined space, what may have been conditions for simple flammability in open air are now the components for an explosion. Heat and increasing pressure will apply PV work to the containment. In confinement, the initiation of combustion may accelerate to deflagration or detonation. At minimum, the outcome will be an overpressure with possible containment failure. If the contents are capable of accelerating from deflagration to detonation, then loss of containment may involve catastrophic failure of mechanical components.
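For a feel of the magnitude, a constant-volume ideal gas estimate of the pressure rise from confined combustion looks like the sketch below; the flame temperature and mole ratio are generic assumptions on my part, not numbers from the text.

```python
# Back-of-the-envelope overpressure from confined combustion: at roughly
# constant volume, the ideal gas law gives P2/P1 ~ (n2 * T2) / (n1 * T1).
# Flame temperature and mole change are assumed, textbook-scale values.

P1 = 1.0      # initial pressure, bar absolute
T1 = 298.0    # initial temperature, K
T2 = 2300.0   # flame temperature, K (typical order for a hydrocarbon/air mix)
n_ratio = 1.0 # mole change on combustion, assumed ~1 for a lean fuel/air mix

P2 = P1 * n_ratio * T2 / T1
print(f"Peak pressure ~ {P2:.1f} bar abs from a 1 bar start")
# Roughly an 8x pressure rise: far beyond what an atmospheric-rated vessel
# or building enclosure can contain.
```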

Rate control of substances that autodecompose or otherwise break into multiple fragments is a bit more tricky. This is the reaction realm of explosives. The energy output is governed by the mathematics of first order kinetics, at least to some level of approximation. In first order kinetics, the rate of reaction depends on both the rate constant and the concentration of a single reactant. Regarding the control of reactions that are approximately first order in nature, some thought should be given to limiting the reaction mass size to that which is controllable with the available reactor utilities. A determination of the adiabatic ΔT will tell you whether the reaction can self-heat past the boiling point of your solvent system.
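As a worked illustration of the two quantities mentioned above, here is a small sketch with assumed values for the rate constant, concentration, decomposition energy, and heat capacity; none of these numbers come from a real process.

```python
# Two quick calculations to go with the paragraph above, using assumed
# numbers: (1) a first-order heat release rate, and (2) the adiabatic
# temperature rise dT_ad = (decomposition energy per kg) / Cp.

# (1) First-order decomposition: rate = k * C, so the instantaneous heat
#     release scales with how much material is present in the vessel.
k = 1e-5      # first-order rate constant at process temperature, 1/s (assumed)
C = 2.0       # concentration of the energetic species, mol/L (assumed)
V = 4000.0    # reaction mass volume, L (a 4 m^3 batch, assumed)
dH = 250e3    # decomposition energy, J/mol (assumed)
q_watts = k * C * V * dH
print(f"Instantaneous heat release ~ {q_watts/1000:.0f} kW")

# (2) Adiabatic temperature rise for the whole mass.
q_per_kg = 500e3   # total decomposition energy per kg of reaction mass, J/kg (assumed)
Cp = 1800.0        # heat capacity of the mass, J/kg/K (assumed)
dT_ad = q_per_kg / Cp
print(f"Adiabatic dT ~ {dT_ad:.0f} K")
# If T_process + dT_ad overshoots the solvent boiling point, the batch can
# boil or pressurize long before the chemistry is 'finished'.
```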

There is a particular type of explosive behavior called detonation. Detonation is a variety of explosive behavior that is characterized by the generation and propagation of a high velocity shock through a material. A shock is a high velocity compression wave which begins at the point of initiation and propagates throughout the bulk mass.  Because it is a wave, it can be manipulated somewhat. This is the basis for explosive lensing and shaped charges.

Detonable materials may be subject to geometry constraints that limit the propagation of the shock. A cylinder of explosive material may or may not propagate a detonation wave depending on its diameter. Some materials are relatively insensitive to shape and thickness. A film of nitroglycerin will easily propagate a detonation, as will a slender filling of PETN in detcord. But these compounds are for munitions makers, not custom or fine chemical manufacturers. The point is that explosibility and detonability are rather more complex than you might realize. Therefore, it is important to do a variety of tests on a material suspected of explosibility.

A characteristic of high order explosives is the ability to propagate a shock across the bulk of the explosive material. However, this ability may depend upon the geometry of the material, the shock velocity, and the purity of the explosive itself. There are other parameters as well. Marginally detonable materials may fail to sustain a detonation if the shape of the charge provides enough surface area for energy loss. The point is that “explosion” and “detonation” are not quite synonymous, and care must be exercised in their use. The word “detonation” confers attributes that are unique to that phenomenon.

Explosive substances have functional groups that are the locus of their explosibility. A functional group related to the onset of explosive behavior, called an explosiphore (or explosaphore), is needed to give a molecule explosibility beyond the fuel-air variety. Obvious explosiphores include azide, nitro, nitroesters, nitrate salts, perchlorates, fulminates, diazo compounds, peroxides, picrates and styphnates, and hydrazine moieties. Other explosiphores include the hydroxylamino group. HOBt (1-hydroxybenzotriazole), an N-hydroxy triazole, has injured people, destroyed reactors, and caused serious damage to facilities. Hydroxylamine has been the source of a few plant explosions as well. It is possible to run a process for years and never cross the line to runaway.

Let’s go back to the original question of this essay. What do you do if you find that a raw material or a product is explosive? The first thing to do is collect all available information on the properties of the substance. In a business organization, upper management must be engaged immediately, since the handling of such materials involves the assumption of risk profiles beyond those normally expected.

At this point, an evaluation must be made of the value of the product in your business model vs the magnitude of the risk. Dow’s Fire and Explosion Index is one place to start. This methodology attempts to quantify and weight the risks of a particular scenario. A range of numbers is possible, and a ranking of risk magnitude can be obtained from it. It is then possible to compare the risk ranking to a risk policy schedule generated beforehand by management. The intent is to quantify the risk against a scale already settled upon, for easier decision making.
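The "compare the ranking to a pre-agreed policy schedule" step can be as simple as a lookup table. The sketch below is only a shape for that idea; the threshold numbers and actions are hypothetical placeholders of mine, not the actual F&EI degree-of-hazard bands, and a real schedule would come from your own management.

```python
# Sketch of comparing a computed risk index against a management-approved
# risk policy schedule. Thresholds and actions are hypothetical examples.

RISK_POLICY = [
    # (upper bound of index, required action)
    (60,  "proceed with standard controls"),
    (95,  "proceed only with additional engineering controls and review"),
    (130, "requires senior management sign-off and a quantitative study"),
    (float("inf"), "do not proceed at this site"),
]

def policy_action(risk_index: float) -> str:
    """Return the pre-agreed action for a computed fire & explosion index."""
    for upper_bound, action in RISK_POLICY:
        if risk_index <= upper_bound:
            return action
    return RISK_POLICY[-1][1]

print(policy_action(112))  # -> requires senior management sign-off ...
```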

But even before such a risk ranking can be made, it is necessary to understand the type and magnitude of stimulus needed to elicit a release of hazardous energy. A good place to start is with a DSC thermogram and a TGA profile. These are easy and relatively inexpensive. A DSC thermogram will indicate onset temperature and energy release data as a first pass. Low onset temperature and high energy release is least desirable. High onset temperature and low exothermicity is most desirable.
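A first-pass triage along these lines might look like the following sketch. The cutoff values are arbitrary examples I chose for illustration, not screening criteria from any standard; your safety group sets the real ones.

```python
# Crude first-pass triage of DSC screening results, in the spirit of the
# paragraph above. The cutoffs (150 C onset, 300 J/g) are assumed examples.

def dsc_triage(onset_C: float, energy_J_per_g: float) -> str:
    hot = onset_C < 150.0               # "low" onset, assumed cutoff
    energetic = energy_J_per_g > 300.0  # "high" energy release, assumed cutoff
    if hot and energetic:
        return "least desirable: low onset, high energy - escalate immediately"
    if not hot and not energetic:
        return "most desirable: high onset, low energy - proceed to next screen"
    return "mixed result: get TMR/ARC data before any decision"

print(dsc_triage(onset_C=210.0, energy_J_per_g=650.0))
```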

What is more difficult to come to a decision point on is the scenario where there is relatively high temperature onset and high exothermicity.  Inevitably, the argument will be made that operating temperatures will be far below the onset temp and that a hazardous condition may be avoided by simply putting controls on processing temperatures. While there is some truth to this, here is where we find that simple DSC data is inadequate for validating safe operating conditions.

Onset temperatures are not inherent physical properties. Onset temperatures are kinetic epiphenomena that depend on sample quality, the Cp of both the sample and the crucible, and the rate of temperature rise. What is needed, once a high energy release is indicated by the DSC, is a determination of the time to maximum rate (TMR). This can be done with specialized DSC techniques: TMR data may be calculated from 4 DSC scans run at different heating rates (e.g., with AKTS kinetics software), or it may be determined from Accelerating Rate Calorimetry, or ARC testing. ARC testing gives time, temperature, and pressure profiles that DSC cannot give and, in my mind, is the more information-rich choice of the two approaches. ARC also gives an indication of non-classical liquid/vapour behavior that is useful. ARC testing can indicate the generation of non-condensable gases in the decomposition profile, which is good to know.
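For readers who want the shape of the calculation, the classic zero-order adiabatic estimate is TMRad ≈ Cp·R·T0²/(q0·Ea), where q0 is the specific heat release rate at the starting temperature. The sketch below evaluates it with kinetic parameters I made up for illustration; real numbers come from ARC or from kinetic evaluation of multi-rate DSC scans.

```python
# Minimal sketch of the adiabatic time-to-maximum-rate estimate,
# TMRad ~ (Cp * R * T0^2) / (q0 * Ea), valid under zero-order, adiabatic
# assumptions. All kinetic inputs are assumed for illustration.

import math

R = 8.314       # gas constant, J/mol/K
Cp = 1800.0     # specific heat of the reaction mass, J/kg/K (assumed)
Ea = 120e3      # activation energy of the decomposition, J/mol (assumed)
A_q = 1e16      # pre-exponential factor for the specific heat rate, W/kg (assumed)

def q0(T_K: float) -> float:
    """Specific heat release rate at temperature T (Arrhenius form), W/kg."""
    return A_q * math.exp(-Ea / (R * T_K))

def tmr_ad_hours(T_C: float) -> float:
    T = T_C + 273.15
    return Cp * R * T**2 / (q0(T) * Ea) / 3600.0

for temp in (80, 100, 120):
    print(f"TMRad at {temp:3d} C ~ {tmr_ad_hours(temp):8.1f} h")
# The steep temperature dependence is the whole point: a process that looks
# comfortably below 'onset' may have a TMRad of only hours at an upset temperature.
```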

Other tests that indicate sensitivity to stimulus are found in the standard test protocol for DOT classification. Several companies do this testing and rating. Higher levels of testing are applied based on what the lower series tests show. Series 1 and 2 are the minimum that can be done to flesh out the effects of basic stimuli. What you get from the results of Series 1, 2, and 3 is a general indication of explosibility and detonability, as well as sensitivity to impact and friction. In addition, tests for sensitivity to electric discharge and dust explosibility should be performed as well.

The Gap test, Koenen test, and time-pressure test will give a good picture of the ability to detonate, and whether or not any explosibility requires confinement. The Koenen test indicates whether or not extreme heating can cause decomposition to accelerate into an explosion sufficient to fragment a container with a hole in it.

BOM or BAM impact testing will indicate sensitivity to impact stimulus. Friction testing gives threshold data for friction sensitivity.

ESD sensitivity testing gives threshold data for visible effects of static discharge on the test material. Positive results include discoloration, smoking, flame, explosive report, etc.

Once the data is in hand, it is necessary to sift through it and make some determinations. There is rarely a clear line on the ground to indicate what to do. The real question for the company is whether or not the risk of processing the material is worth the reward. Everyone will have an opinion.

The key activity is to consider where in the process an unsafe stimulus may be applied to the material. If it is thermally sensitive in the range of heating utilities, then layers of protection guarding against overheating must be put in place. Layers of protection should include multiple engineering and administrative layers.  Every layer is like a piece of Swiss cheese. The idea is to prevent the holes in the cheese from aligning.
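In numbers, the Swiss cheese picture amounts to multiplying the failure probabilities of independent layers. The probabilities below are invented for illustration; real probability-of-failure-on-demand values come out of a LOPA study, not off the top of my head.

```python
# The 'Swiss cheese' argument in numbers: if the protection layers are
# independent, the chance that every hole lines up is the product of the
# individual failure probabilities. All values below are assumed.

layers = {
    "basic process control / temperature interlock": 1e-1,
    "operator response to high-temperature alarm":   1e-1,
    "independent high-temperature trip":             1e-2,
    "relief device sized for the decomposition":     1e-2,
}

p_all_fail = 1.0
for name, pfd in layers.items():
    p_all_fail *= pfd

print(f"Probability that all layers fail on demand ~ {p_all_fail:.0e}")
# ~1e-6 per demand with these assumed numbers, but only if the layers are
# genuinely independent - which is exactly what the Swiss cheese picture is about.
```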

If the material is impact or friction sensitive, then measures to guard against these stimuli must be put in place. For solids handling, this can be problematic. It might be that preparing the material as a solution is needed for minimum solids handling.

If the material is detonable, then all forms of stimulus must be guarded against unless you have specific knowledge that indicates otherwise. Furthermore, a safety study on storage should be performed. Segregation of explosible or detonable materials in storage will work towards decoupling of energy transfer during an incident. By segregating such materials, it is possible to minimize the adverse effects of fire and explosion on the rest of the facility.

With explosive materials, electrostatic safety is very important. All handling of explosible solids should involve provisions for suppression of static energy. A discharge of static energy in a bulk solid material is a good way to initiate runaway decomposition in an energetic material. This is how a material with a high decomposition temperature by DSC can find sufficient stimulus for an explosion.
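The arithmetic here is simple and sobering: the energy in a spark from a charged operator is ½CV². The body capacitance and voltage in the sketch below are the usual textbook orders of magnitude, and the minimum ignition energy is an assumed example rather than a measured value.

```python
# Why static matters: the energy in a spark from a charged operator is
# E = 1/2 * C * V^2. MIE value below is an assumed example.

C_body = 200e-12    # ~200 pF, typical capacitance of a person
V_body = 10e3       # 10 kV, easily reached walking across a dry floor

spark_energy_mJ = 0.5 * C_body * V_body**2 * 1e3
mie_mJ = 1.0        # assumed minimum ignition energy of a sensitive powder, mJ

print(f"Body discharge ~ {spark_energy_mJ:.0f} mJ vs. assumed MIE of {mie_mJ} mJ")
# A ~10 mJ spark dwarfs the ignition energy of many fine powders, which is
# why grounding, bonding, and inerting show up in every energetic-solids SOP.
```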

Safe practices involving energetic materials require an understanding of the cause and effect of stimulus on the materials themselves. This is of necessity a data and knowledge driven activity. Along with ESD energy, handwaving arguments should also be suppressed.

What is evil?

One of the things that neuroscience is doing today is the mechanistic examination of many conditions that we previously assumed to exist. Like the matter of evil. This concept is deeply embedded in culture, and most of us are born, live, and die in a world where we assume there is a real metaphysical condition called “evil”. Well, except for some carpet walkers in the philosophy department who have actually thought about the problem for a while.

Ron Rosenbaum at Slate has written an interesting article on the question of evil in the age of neuroscience. One thing the concept of evil does is let us mortals off the hook, in a sense, for responsibility for our misdeeds. If people commit atrocities and holocausts because of the intervention of supernatural forces, then individuals and groups are never really the original cause. Humans would only be guilty of propagating meanness or atrocities rather than being the true authors. Seems very convenient.

People naturally saddle up on the notion that the world is a battleground between good and evil.  We think in terms of conquest when opposite forces are perceived. Good and evil constitute a natural dualism for ape brains that seem constructed to find patterns.

Since we started remembering the thoughts behind our thick brows thousands of generations ago, we have been anthropomorphizing our world into a haunt of bipedal spirits who, even in the spirit world, have gender identity and appendages for locomotion. The notion that evil may be manifested in a character isn’t much of a stretch for our brains- maybe it is an inevitability.

What I like about the work described in the Rosenbaum article is that it forces us to reconsider what it means to be “evil”.  Are evil deeds really the expression of some deep malevolence from the underworld? Or, is evil a particular assemblage of thought patterns and a lack of barriers in behavior? Evil is not a philosophy, to be sure, but it can manifest from a brain that is substantially normal in many respects.

Even though evil may be just the work product of highly focused psychopathy, it still serves as a useful descriptor for the outcome of despicable behavior.  No matter what you call it, it is 100 % human.

Refractory Problem

Here is an interesting problem. How do you analyze refractory materials? What if you are making materials that could be used as a crucible raw material? How do you digest refractory materials down to homogeneous solutions that themselves need to be contained in something even more refractory?

Obviously, it is done all of the time. Methods like AA, ICP-OES, ICP-MS, GDMS, etc., are all useful for quantitating elemental composition or, in the case of the mass spectrometric methods, revealing the mass spectra of materials. Of the above list, only GDMS can be applied to solid samples. The AA and ICP methodologies require homogeneous solutions. This can be problematic.

X-ray techniques like XRF and XRD are useful for solids characterization as well. Of these two, only XRF is useful in the absence of distinct crystal phases. XRD detects crystal phases and can be used to good end with the crystallographic databases that are available for the identification of solid substances. In contrast, XRF, X-ray fluorescence, detects elements easily down to sodium, and lighter elements with a bit more difficulty. Hand held XRF units are available, for the price of a low end BMW, that will allow the user to point the business end of the unit at a material and identify the elements present.

A useful company to get to know in this arena is Inorganic Ventures. These folks are extremely knowledgeable and supply stock and custom standards for flame and ICP methods. The trick to the analysis and characterization of refractory metal oxides in the category of RO2, R2O3, and RO is to have reliable standards on hand as well as a choice of fluxes. Fluxing at high temperature is often critical to the digestion of refractory oxides. Fluxes are inorganic salts that may be acidic or basic and may or may not be oxidizing.

If you started out as an organikker like me, there will be a period of slight adjustment to the notions of what are regarded as acids and bases at 1000 °C. A flux is a substance that melts and dissolves an inorganic solid, digesting the material in question. A melt is produced inside a crucible within a muffle furnace. This melt can be poured into a mold to produce a button, or the material may be allowed to solidify in the crucible, followed by aqueous acid dissolution.
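For the bookkeeping side of a fusion prep, a sketch like the one below helps keep the flux ratio and dilution straight before the sample ever sees the instrument. The 10:1 flux-to-sample ratio, the analyte fraction, and the volumes are typical-looking assumptions for illustration, not a validated method.

```python
# Bookkeeping sketch for a fusion-then-dissolution prep ahead of ICP work.
# All masses, ratios, and volumes are assumed example values.

sample_mass_g = 0.100          # refractory oxide taken for fusion (assumed)
flux_ratio = 10.0              # g flux per g sample, e.g. a lithium borate (assumed)
flux_mass_g = sample_mass_g * flux_ratio

dissolution_volume_mL = 100.0  # acid volume used to take up the cooled melt (assumed)
analyte_fraction = 0.05        # element of interest is 5 wt% of the sample (assumed)

conc_mg_per_L = sample_mass_g * analyte_fraction * 1000.0 / (dissolution_volume_mL / 1000.0)
print(f"Weigh {flux_mass_g:.1f} g flux; expected analyte ~ {conc_mg_per_L:.0f} mg/L")
# ~50 mg/L is a workable level for ICP-OES; dilute further as needed and
# matrix-match the calibration standards to the flux salts.
```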

In addition to acidic and basic fluxes, there is the matter of melting temperature and the need for a eutectic mixture. A variety of compositions can be prepared to provide a melt temperature suitable for a particular need.  Volatility may be a problem, requiring adjustment of conditions.

Crazy Time

Work has been a seamless stretch of insane activity 24/7. An extended manic episode of multi-tasking and over-commitment. Nervously, we juggle chainsaws and flaming bowling balls on deck while the bow submarines into the swells. The gales of fortune tear at the spinnaker as every square foot of canvas strains to pull the ship forward.

Coworkers are mind-numb to the incessant demands of a production schedule that is absolutely fault intolerant. I’ve been on a boat in a storm for a baker’s dozen of years. Rogue waves have become the norm. Reach and grasp become disconnected as you struggle to stay on the heaving deck. Yet the captain in the wheelhouse steers the steel boat into the storm again, hoping to drop the net for one more trawl. We lash ourselves to the mast and hope for the best.

Dearly Departed

Our next play, Dearly Departed, is in production. I play a character named Royce. This part has some pretty good lines. I’ve always played some fairly minor characters. The trick is to always do your best no matter what the part. A play is an ethereal being that lives for about 2 hours and then folds into a mere memory.

The job world is sort of like that too. You might find yourself in what appears to be a minor part in a large production. You get upstaged and your lines are walked over by the main characters.  But the main characters are carrying larger risk. If they flub their cues or mangle their lines, the effect is commensurately larger. On some projects you definitely don’t want to be the leading character. What is critically important is that you play your part the best way you can, show up for all of the rehearsals, and most importantly, pay close attention to what the other actors are saying and doing now.  The best actors are always in the moment.

Imagine a Better Microsoft

Imagine this. Imagine having a form of payment that requires the payee to change the manner in which they receive and deposit their payment. Imagine a system in which the currency in circulation is “upgraded” periodically and that within 8 or 10 years, the previous versions are no longer “supported” by the banking system.

Still with me? Let’s continue to imagine.

Now imagine paying Microsoft for their upgraded Office platform with a banking and currency system that changes as described above. Microsoft would have to direct their employees to change out their credit cards, requisition policies, travel policies, and accounting platforms to accommodate external demands just to remain in the game.

Over the last several months I have had to adapt to upgrades to Windows 7 and Office 2007. It is very much like moving the furniture around on a blind person. The features are still there, but access to the various tools and menus is arranged much differently.

So, Microsoft, I have spent considerable time relearning how to use software that I was proficient with in the previous rev. I am not enjoying new capability- only new learning curves. WTF!!!!

Your productivity tools are having a negative effect on my accumulated lifetime productivity.  This is worth something. Where do I send the bill?

Theories X and Y

Just for grins you should look up the Wikipedia page describing management Theory X and Theory Y. Anything look familiar?? This is what B-School faculty do. Which theory do you think Stalin subscribed to? Which theory does your organization follow?  Hey man. Sign me up for an MBA program.

Of course, these are bookend theories. Most organizations are somewhere in between. One organization up in Ft Collins has a slide for employees to zip to the bottom floor. This is Theory E for Elmo.

Thus Spake George

Time for some full frontal iconoclasm.

Going over back issues of C&EN I found an article in the Sept 5th, 2011, issue, p. 14, that struck my interest.  Well, interest is the wrong word. The article opens with George Whitesides saying-

As many as 100,000 new jobs for chemists could be created in the next 20 years if the recommendations of an ACS Presidential Task Force on Innovation in the Chemical Enterprise are carried out, according to task force chair George Whitesides, a chemistry professor at Harvard University.

Other illuminati on the task force include the usual band of chem celebrities.

You know, I find this a little irksome. These oligarchs have been exploiting cheap student and post-doc labor for decades for their own professional gain. Now, after the economy is set to crystallize into a new phase, big prizes sitting on the mantel, they are suddenly showing concern for up-and-coming chemists and the future direction of the profession.

Are they concerned for chemists or is it the continuation of the grant business that they are after? Both are certainly worthy of support. But why do we have a system in place where the boat gets some needed navigation only when the rock stars hold a Farm Aid task force? Duh! Shit man, George and Bobby say we have to do something, so I guess we have to pay attention. These two characters are riding off on their high horses while the rest of us are shoveling out the barn.

These top tier professors sit at the apex of what is in fact an inbred patronage system that is now at risk of coming apart. That’s the issue behind the headlines.