In Part I of this series, I looked at the inherent inaccuracies of the Intox EC/IR II Breathalyzer instrument even as it attempts to measure the alcohol concentration of a known control sample produced in a lab (TL;DR: it tolerates an inaccuracy range of at least .02, which is deeply problematic when that number is relied upon in threshold cases involving a .08 or .09 BAC).
In this second entry in this series, I want to talk about another very specific problem relating to the reliability of the EC/IR II when used as a source of evidence in DWI cases in Durham and across the state of North Carolina. Here, I want to address the reporting protocols of the instrument.
Specifically, the EC/IR II is programmed to require two sequential samples of breath that are within .02 of each other, and if it obtains breath samples that meet those criteria, it is programmed to report the lower of those two results. So what about a case like this:
There are a number of values listed in this sample BAC test ticket, but I want to focus on the lines that read SUB TEST. Those are the breath samples, and in this case there are three of them. The reason the “chemical analyst” (the cop) proceeded to obtain a third sample after the first two is that samples one and two were a full .04 apart, which did not meet the instrument’s requirements for reporting a BAC. The third sample, however, was within .02 of the second, and therefore the lower of those last two samples was reported. When this information is asserted as unassailable fact to a judge or jury in a Driving While Impaired trial, the officer will simply state: “The reported alcohol concentration was .16 grams of alcohol per 210 liters of breath.” What the officer won’t say is that the machine was all over the place, reporting three different numbers from the same individual’s breath in a span of seven minutes.
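For readers who think in code, the reporting logic described above can be sketched in a few lines. To be clear, this is my own illustrative reconstruction of the published protocol, not the instrument’s actual firmware, and the third sub-test value (.17) is a hypothetical stand-in chosen to be within .02 of the .16:

```python
def report_bac(sub_tests, tolerance=0.02):
    """Illustrative sketch of the EC/IR II reporting protocol:
    keep taking sub-tests until two consecutive samples agree
    within the tolerance, then report the LOWER of that pair.
    Returns None if no consecutive pair ever agrees."""
    for first, second in zip(sub_tests, sub_tests[1:]):
        if abs(first - second) <= tolerance:
            return min(first, second)
    return None

# The ticket discussed above: .20, then .16 (a .04 gap, rejected),
# then a third sample assumed here to be .17 (within .02 of the .16).
print(report_bac([0.20, 0.16, 0.17]))  # reports 0.16
```

Notice what the logic throws away: the .20 simply vanishes from the reported result, even though the same machine produced it from the same person minutes earlier.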
We learned in Part I that the EC/IR II is wildly inaccurate even when attempting to measure the alcohol concentration of a control sample engineered in a lab. So how can we rely on it to accurately measure an unknown sample from a human being? The short answer is: we can’t.
But let’s look at what these protocols tell us. In the above example, the instrument produces three different numbers. The first is a .20, and the second is a .16. Note that the .20 is a full 25% higher than the .16. The prosecutor will say “we’re giving the defendant the benefit of the doubt by only reporting the lower number!” But what we’re actually doing is allowing into evidence the report from an instrument that varies wildly from minute to minute, based on the testimony of an “analyst” who typically lacks any understanding of how the instrument arrives at that number. But our courts routinely allow that reported number as sufficient proof of guilt in a criminal DWI case.
To put a finer point on it, what if the instrument reports a .06 and then a .04, in a case in which a driver is accused of violating a .04 alcohol restriction on his license? In that case, the instrument accepts and reports the .04, and the individual loses his license for a full year based on that number. But note that the .06 is a full 50% higher than the .04, after testing the same individual’s breath within a matter of minutes. In that example, the tolerance range of the instrument’s own protocols skyrockets to a full 50%.
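The arithmetic behind both examples is worth making explicit: because the .02 tolerance is a fixed absolute number, the relative spread it permits grows as the readings shrink. A quick check of the two scenarios discussed above (my own illustration, using the numbers from this post):

```python
def percent_higher(high, low):
    """How much higher the larger reading is, relative to the lower
    (reported) one, expressed as a percentage."""
    return round((high - low) / low * 100, 1)

# The .16 ticket above: the discarded .20 vs. the reported .16.
print(percent_higher(0.20, 0.16))  # 25.0

# The .04 license-restriction scenario: .06 vs. the reported .04.
print(percent_higher(0.06, 0.04))  # 50.0
```

Same .02 acceptance window in the second pair (.06 vs. .04 is exactly .02 apart, so the instrument accepts it), yet the relative disagreement between the two readings doubles.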
And we haven’t even gotten to the formula at work within the instrument, and the way it accounts for variables that differ from individual to individual (spoiler alert: it doesn’t. At all.). I’ll get into that in a later post, where I’ll talk about what’s known as Henry’s Law.
★★★
DWI attorney Ben Hiltzheimer is a criminal defense attorney in Durham, North Carolina, who represents individuals charged with DWIs and the full spectrum of misdemeanors and felonies. Contact us for a free, confidential consultation and case evaluation at (919) 899-9404.