Thu 13 Feb, 2020 06:53 pm
I am testing an alcohol detection device for my company. The fuel cell used for alcohol detection is accurate to within ±0.007. To test the device, I have people who have been drinking take a reading from a breathalyzer, which is accurate to within ±0.005, and then breathe into my company's device. My question is: how do I determine whether the new device is giving accurate readings, given that it has an error of ±0.007 and is being compared against a reference device with an error of ±0.005?
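One common way to frame this (a sketch of one approach, not the only valid method): if the two stated tolerances are treated as independent uncertainties, they combine in quadrature, and the root-sum-square gives the tolerance on the *difference* between the two readings. A paired reading whose difference exceeds that combined tolerance would then flag a possible problem with the new device.

```python
import math

# Assumed figures from the question: +/-0.007 for the company's
# fuel-cell device and +/-0.005 for the reference breathalyzer.
device_err = 0.007
reference_err = 0.005

# Root-sum-square of independent uncertainties: the expected
# tolerance on the difference between paired readings.
combined_err = math.sqrt(device_err**2 + reference_err**2)
print(round(combined_err, 4))  # ~0.0086

# Example paired reading: flag if the disagreement exceeds
# the combined tolerance.
device_reading = 0.082
reference_reading = 0.075
within_tolerance = abs(device_reading - reference_reading) <= combined_err
print(within_tolerance)
```

Whether quadrature is appropriate depends on whether the quoted figures are independent random errors (as opposed to, say, worst-case bounds, in which case a simple sum of 0.012 would be the conservative choice).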