Pipette Testing Guide

You have a favorite pipette. Familiarity with the electronic interface makes its use second nature. Maybe your hand fits it just right, or the smoothness of the action makes pipetting easy. Then, one day, your dispensed volume just feels “off”. You pipette the liquid again, and again the volume looks too low or too high. What can you do? Basic pipette testing verifies the accuracy – or inaccuracy – of your equipment.

Fortunately, you can perform basic pipette testing with a few pieces of lab equipment. Let’s get started.


Required Equipment

Collect the following equipment and supplies.

  • A weighing device. Balances offer more precision, but a digital scale can suffice for larger volume samples.
  • A weighing vessel, such as a beaker.
  • A steady and consistent pipette technique.
  • An environment with stable temperature and humidity.
  • A stable surface.
  • Distilled water.

Pipette Testing Preparations

Once you collect the needed items, start by setting up your weighing device on a stable surface in an environment with minimal temperature and humidity variation. If your scale or balance has a leveling bubble and internal calibration, use both now. Finally, set your weighing device to read grams.

With your scale or balance ready, place your weighing vessel on the weighing device and either “tare/zero” the unit or record the weight of the vessel to be subtracted from your overall weight later.

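The subtraction described above amounts to a single line of arithmetic. As a minimal sketch (the weights below are hypothetical examples, rounded to a typical balance resolution):

```python
# Accounting for the vessel without a tare function:
# net weight of water = total weight - weight of the empty vessel.
vessel_g = 12.3456   # weight of the empty weighing vessel, in grams (example value)
total_g = 12.3941    # weight of vessel plus dispensed water, in grams (example value)

net_g = round(total_g - vessel_g, 4)   # water alone, rounded to 0.1 mg
print(net_g)  # 0.0485
```

If your balance has a tare or zero function, it performs this subtraction for you and `net_g` is simply what the display reads.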

Pipette Testing Procedure

With all the preparations made, set the pipette’s volume to the middle of its range. Draw up a sample of water and dispense it into your weighing vessel. The elegance of this procedure lies in the near-perfect conversion of microliters of distilled water to grams of mass. This conversion involves nothing more than moving the decimal three (3) places. To go from grams to microliters, move the decimal to the right. To go from microliters back to grams, move the decimal to the left.

For instance, fifty (50) microliters of water weighs 0.05 grams: a movement of the decimal from 50.0 three places to the left, to 0.05. Similarly, 0.0485 grams of distilled water translates to 48.5 microliters of water: a movement of the decimal from 0.0485 three places to the right, to 48.5.

So, when you verify a pipette with a range of 10-100 microliters at 50 microliters and the balance reads 0.0487 g, your pipette dispensed 48.7 microliters. If your scale or balance does not have a zero or tare function, subtract the weight of the vessel from the total measured weight after dispensing the water. After verification at the mid-range, set the pipette to its lowest and then its highest setting and repeat the procedure at each.
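The conversion and the resulting accuracy check can be sketched in a few lines of Python. The function names are illustrative, and the sketch assumes the density of distilled water is exactly 1 g/mL — the same approximation the procedure itself relies on:

```python
def microliters_from_grams(mass_g):
    """Convert a mass of distilled water (g) to volume (microliters).

    Assumes a density of exactly 1 g/mL, so the conversion is just
    a factor of 1000 (the "move the decimal three places" rule)."""
    return mass_g * 1000.0

def percent_error(expected_ul, measured_ul):
    """Relative error of a dispensed volume, as a percentage."""
    return (measured_ul - expected_ul) / expected_ul * 100.0

# The mid-range check from the text: pipette set to 50 uL, balance reads 0.0487 g.
measured = microliters_from_grams(0.0487)
print(round(measured, 1))                         # 48.7
print(round(percent_error(50.0, measured), 1))    # -2.6
```

In this example the pipette under-delivers by 2.6%, which you can compare against the tolerance your lab’s procedures require.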


While this method does not yield the precision of more in-depth procedures, it provides you with a general idea of pipette performance and accuracy. With that in mind, let’s say you’ve found that your pipette reads too far out of range to be used within your lab. What do you do now? Throw the pipette away and buy new?

Of course not! Send it to the professionals at Precise Technical Solutions. The diagnosis, repair, and calibration of lab instruments such as pipettes forms a core part of our team’s competencies. With a little time in our lab, our technicians will have your favorite pipette operating just like it did the day it was new.

Thermometer Calibration by the Comparison Method

When employing thermometer calibration by the Comparison Method, readings from a thermometer with unknown accuracy are compared to those from a standard device. The standard device is calibrated to meet the quality requirements of the National Institute of Standards and Technology (NIST) or a similar governing body.

Typically, this method of calibration is used for liquid-in-glass thermometers. The technique often applies to standard platinum resistance thermometers (SPRTs) and resistance temperature detectors (RTDs) in industrial equipment as well.

Common Thermometer Types

Two types of liquid-in-glass thermometers exist:

  • Mercury thermometers contain a bulb filled with mercury attached to a narrow tube. Changes in temperature yield changes in the volume of mercury. These small volume changes drive the mercury up the tube or pull it down the tube.
  • Alcohol or spirit thermometers look and act like mercury thermometers. However, they use ethanol instead. Ethanol is less toxic than mercury, and cleans up more easily after breakage because ethanol evaporates quickly. Since ethanol costs less than mercury, replacing broken thermometers also incurs less cost.

The three categories of mercury and spirit thermometers include:

  • Complete Immersion Thermometers, which show temperature correctly only when the entire thermometer is covered by the fluid (gas or liquid) being measured.
  • Total Immersion Thermometers, which show temperature correctly when covered by fluid except for a small portion of the column [6 to 12 mm (0.24 to 0.47 in)].
  • Partial Immersion Thermometers, which show temperature correctly when immersed to a specific depth. A line on the thermometer usually indicates required depth.

Calibration Procedures

Thermometer calibration by the Comparison Method includes the following steps:

  1. Review the device to be calibrated.
  2. Prepare the calibration bath.
  3. Test the device to be calibrated.
  4. Record any difference and reset if possible.

Next, let us take a closer look at the processes involved in each step.

1. Review the device to be calibrated.

  • Document the level of accuracy required for the tasks the device will be used to complete.
  • Look at the device to be calibrated. Ensure that the column and bulb are not cracked and that the scale is legible.
  • Note any identification numbers. These numbers often trace back to the manufacturer’s specifications, which can help in making calibration decisions.

2. Prepare the calibration bath.

  • Set the calibration bath to the desired calibration temperature.
  • Immerse the NIST standard thermometer in the calibration bath at the proper depth.
  • If calibrating more than one device, begin with the device requiring the lowest calibration temperature.
  • Wait until the calibration bath stabilizes at the desired temperature.
    • Use the NIST standard thermometer to measure the temperature.
    • When the temperature remains unchanged for at least three readings, taken thirty (30) seconds apart, the bath has stabilized.
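The stabilization rule above — at least three unchanged readings taken thirty seconds apart — can be sketched as a small helper. The `read_temp` function is a hypothetical stand-in for however your lab reads the standard thermometer (by eye or through an instrument interface):

```python
import time

def bath_is_stable(read_temp, interval_s=30, required=3):
    """Block until `required` consecutive readings, taken
    `interval_s` seconds apart, are unchanged; then return True.

    `read_temp` is a caller-supplied function (hypothetical here)
    that returns the NIST standard thermometer's current reading."""
    last = read_temp()
    unchanged = 1
    while unchanged < required:
        time.sleep(interval_s)
        current = read_temp()
        if current == last:
            unchanged += 1
        else:
            # The bath moved; restart the count from this new reading.
            unchanged = 1
            last = current
    return True
```

For a dry run with `interval_s=0`, feeding the helper a canned sequence of readings such as 25.0, 25.1, 25.1, 25.1 returns True after the third unchanged 25.1. The same criterion applies again in step 3 when the device under test is in the bath.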

3. Test the device to be calibrated.

  • Insert the thermometer to be calibrated into the calibration bath at the proper immersion depth.
  • Allow the thermometer under calibration to reach a stable temperature. The reading is stable when it has not changed for at least three readings, taken thirty (30) seconds apart.

4. Record any difference and reset if possible.

  • Use a magnifying glass to look at the tested device.
  • Identify the difference between the device’s output and the calibration bath.
  • Document the difference.
  • Perform any reset using the manufacturer’s instructions.
  • Repeat steps 2-4 for any remaining devices being calibrated at higher temperatures.
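Step 4 reduces to recording the offset between the device under test and the standard. A minimal sketch, with hypothetical example readings:

```python
# Offset recorded in step 4 (example readings are hypothetical).
standard_c = 100.0   # NIST standard thermometer reading, in deg C
device_c = 100.4     # device under test, in deg C

# Correction to add to future readings from the device under test.
correction_c = round(standard_c - device_c, 2)
print(correction_c)  # -0.4: the device reads 0.4 deg C high
```

If the device cannot be reset, this documented correction is applied to its readings in use.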

Closing Thoughts

Thermometer calibration by the Comparison Method is one of the most widely used techniques. Restaurants employ it via ice bath to ensure proper temperatures for food storage, and home chefs may use it with an oven thermometer to determine the offset of an oven’s settings.

As we’ve seen here, however, laboratory-grade thermometer calibration by the Comparison Method involves a much more rigorous procedure and complies with the standards of the appropriate governing bodies.

Thermometer Calibration by the Fixed Point Method

What is the Fixed Point Method?

The Fixed Point Method of thermometer calibration assures the quality and accuracy of a thermometer’s measurements. Typically, only national metrology laboratories use it. Thermometer calibration by the fixed point method focuses on instruments that must measure accurately to within ±0.001℃.

This method uses the ITS-90 international temperature scale developed in 1990. Based on the thermodynamic or absolute temperature scale, ITS-90 is not truly a scale. It is a set of fixed points that defines an international equipment calibration standard. It helps scientists know that a temperature measured in Asia will be the same as a temperature measured in Europe. Scientists can repeat results regardless of location with properly calibrated equipment.

In creating the ITS-90 standard, a body of scientists selected seventeen fixed points, based on temperatures at which various elements or compounds achieve equilibrium [1].

The ITS-90 temperature scale defines seventeen fixed points to use during thermometer calibration by the fixed point method.

Although Freezing Point and Melting Point are common terms, Triple Point and Vapor Pressure Point might not be. The Triple Point is the temperature of a substance where all the phases of matter – solid, liquid, and gas – exist in equilibrium. The Vapor Pressure Point is the temperature of a substance where the gas and liquid phases are in equilibrium.

How is the Fixed Point Method used in thermometer calibration?

To achieve fixed point calibration, the thermometer to be calibrated is placed in a specially designed flask. This flask maintains a constant temperature through both heating and cooling. The technician selects a limited number of fixed points from the ITS-90 with the goal of using as few as possible. The actual procedure varies depending on the fixed points selected. [2]

The points of the ITS-90 temperature scale have been identified, documented, and globally accepted. As a result, technicians with proper training can reproduce standard calibration conditions.

[1] B.W. Mangum. “International Temperature Scale of 1990 (ITS-90).” John R. Rumble, Editor-in-Chief. CRC Handbook of Chemistry and Physics, 98th Edition, 2017-2018. CRC Press, Boca Raton, FL (USA): 2017. 1-17.

[2] http://www.evitherm.org/files/795/ThermometerCalibrationMethods.pdf, last visited 01/18/2018