Automated calibration of dosimeters used in radiotherapy






Andy L. Romero Acosta, Stefan Gutiérrez Lores1

1Laboratorio Secundario de Calibración Dosimétrica del Centro de Protección e Higiene de las Radiaciones (CPHR)
Calle 20 No 4113 e/ 41 y 47, Playa. La Habana, Cuba




Traceability, accuracy and consistency of radiation measurements are essential in radiation dosimetry, particularly in radiotherapy, where the outcome of treatment depends strongly on the radiation dose delivered to the patient. Calibration of dosimeters for external beam radiotherapy involves charge and current measurements that are often repetitive. These measurements, however, are usually performed with modern electrometers equipped with an RS-232 interface, which enables instrument control from a computer. This paper presents an automated system for the measurements involved in the calibration of dosimeters used in radiotherapy. A software application was developed to acquire the measured charge values, calculate the calibration coefficient and issue the calibration certificate. A primary data report file is filled in and stored on the computer hard disk. The calibration method used was calibration by substitution. With this software tool, better control over the calibration process is achieved and the need for human intervention is reduced. The automated system has been used for the calibration of dosimeters used in radiotherapy at the Cuban Secondary Standard Dosimetry Laboratory of the Center for Radiation Protection and Hygiene.

Key words: calibration, radiotherapy, automation, laboratories, dosimeters.






Introduction

The international measurement system provides the structure necessary to ensure compatibility in the dosimetry of ionizing radiation, by making available to the user community instrument calibrations that are traceable to primary standards. An important element of this structure is the Secondary Standard Dosimetry Laboratory (SSDL) network, whose main role is to provide users with calibrations traceable to the international measurement system, allowing the transfer of calibrations from the primary standards to the user instruments [1]. One of the main objectives of SSDLs is the continuous maintenance and improvement of their calibration capabilities. The Cuban SSDL at the Center for Radiation Protection and Hygiene (CPHR) is no exception: in operation for more than fifteen years, it provides calibration services based on a Quality Management System in accordance with the international standard ISO/IEC 17025, acknowledged by the Euro-Asian Cooperation of National Metrological Institutions (COOMET) and accredited by the National Accreditation Body of the Republic of Cuba (ONARC).

One of the actions taken to improve the calibration and measurement capabilities of the Cuban SSDL was the automation of the measurements performed during the calibration of dosimeters used in radiotherapy in terms of air kerma and absorbed dose to water. This paper presents the experience of the Cuban SSDL in the design and development of a software application, named Univait, which follows the guidelines recommended by the IAEA [2] that have been used for more than ten years for the calibration of these dosimeters at the SSDL. The tool communicates with different models of PTW Unidos electrometers, which are equipped with an RS-232 interface that makes instrument control from a computer feasible.

By using an automated system for the acquisition of the measurements taken during the calibration process, the workload of the laboratory staff is reduced and control of the calibration process is improved. The system also shortens the execution time of the calibration and, consequently, saves energy in this particular task.




Materials and methods

The ionization chambers used were: NE 2561, NE 2571, NE 2581, W30001, W30004, W31002 and TM34001. These chambers were used with the following electrometers: PTW UNIDOS 10001, 10002, UNIDOS E and UNIDOS Webline. The cobalt-60 teletherapy unit used in the calibration of these ionization chambers was a Phoenix-20 unit. Temperature and pressure were measured with a Thommen Climate SW HM30 digital thermometer and a Vaisala digital barometer, respectively (Figure 1). A cubic water phantom with plastic walls and a side length of 30 cm was used for the calibration in terms of absorbed dose to water.

Calibration method

The calibration method used was calibration by substitution [3]. In this method, first the reference dosimeter is placed at the calibration point to determine the reference output rate of the beam through a set of readings; it is then replaced by the dosimeter to be calibrated and a similar set of readings is taken.
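The substitution principle can be sketched numerically. The following is a minimal illustration only; function and variable names are ours, not taken from the Univait application:

```python
def substitution_coefficient(n_ref, m_ref, m_user):
    """Calibration coefficient of the user dosimeter by substitution.

    n_ref  -- calibration coefficient of the reference standard (e.g. Gy/nC)
    m_ref  -- corrected reading of the reference dosimeter (e.g. nC)
    m_user -- corrected reading of the user dosimeter at the same point (nC)
    """
    output = n_ref * m_ref   # beam output determined with the reference standard
    return output / m_user   # coefficient assigned to the user dosimeter


# A user dosimeter that reads less charge than the reference at the same
# point receives a correspondingly larger calibration coefficient.
n_user = substitution_coefficient(n_ref=0.095, m_ref=10.0, m_user=9.8)
```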

The calibration was done under the following reference conditions: temperature of 20 °C, pressure of 101.325 kPa, and a radiation field of 10 cm x 10 cm at a source-to-phantom distance of 80 cm. The reference depth in the phantom was 5 cm. The collimator setting was kept fixed throughout the calibration procedure, as described in TRS-374 [2], TRS-469 [3] and TRS-398 [4]. The SSDL procedures for calibration in terms of air kerma and absorbed dose to water have been validated and follow the steps described in these technical reports. Using the substitution method, the calibration coefficient of an instrument is determined in two steps:

Step 1: Measurements are made with the reference standard dosimeter to determine the output rate $\dot{D}_Q$ of a radiation beam of quality $Q$, with the SSDL reference standard calibrated at the IAEA [5]:

$$\dot{D}_Q = N_{Q,\mathrm{ref}} \, \dot{M}_{\mathrm{ref}} \qquad (1)$$

where $N_{Q,\mathrm{ref}}$ is the calibration coefficient of the SSDL reference standard for the beam quality $Q$ and $\dot{M}_{\mathrm{ref}}$ is the reading of the reference dosimeter corrected for the influence quantities.

Step 2: Measurements are made with the user instrument at the same position as the reference standard in the beam of quality $Q$. The calibration coefficient $N_{Q,\mathrm{user}}$ is determined as the ratio of the output rate $\dot{D}_Q$, determined in step 1, to the mean reading obtained from the instrument to be calibrated, corrected for the influence quantities:

$$N_{Q,\mathrm{user}} = k_s \, \frac{\dot{D}_Q}{\dot{M}_{\mathrm{user}}} \qquad (2)$$

where $k_s$ is a correction for the effect of a change in source position.

$\dot{M}$ is the reading obtained with either the reference dosimeter or the user dosimeter corrected for influence quantities. $\dot{M}_{\mathrm{ref}}$ and $\dot{M}_{\mathrm{user}}$, from equations (1) and (2), are given by:

$$\dot{M} = \bar{M} \, k_{TP} \, k_{\mathrm{pos}} \, k_{\mathrm{stab}} \, k_{\mathrm{oth}} \qquad (3)$$

where $\bar{M}$ is the mean value of the readings taken after the instrument settled, $k_{TP}$ is a factor to correct for departure of air density from reference conditions, $k_{\mathrm{pos}}$ is a factor to correct for deviation of the chamber position from the reference position, $k_{\mathrm{stab}}$ is a factor to correct for the stability of the SSDL reference standard (applied to the reference dosimeter only), and $k_{\mathrm{oth}}$ is a factor including all the corrections whose uncertainties are too small to be individually considered in the uncertainty budget, because they are estimated to be much less than 0.1%. Combining equations (1)-(3), we obtain:

$$N_{Q,\mathrm{user}} = N_{Q,\mathrm{ref}} \, \frac{\bar{M}_{\mathrm{ref}}}{\bar{M}_{\mathrm{user}}} \, k \qquad (4)$$

where $k$ is given by

$$k = k_s \, k_{\mathrm{stab}} \, \frac{k_{TP,\mathrm{ref}} \, k_{\mathrm{pos,ref}} \, k_{\mathrm{oth,ref}}}{k_{TP,\mathrm{user}} \, k_{\mathrm{pos,user}} \, k_{\mathrm{oth,user}}} \qquad (5)$$

Since under our conditions $k_s$, $k_{\mathrm{pos}}$, $k_{\mathrm{stab}}$ and $k_{\mathrm{oth}}$ were negligible (equal to unity) and their uncertainties have been well determined, $k$ reduces to:

$$k = \frac{k_{TP,\mathrm{ref}}}{k_{TP,\mathrm{user}}} \qquad (6)$$
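The air-density correction for a vented ionization chamber, referred to 20 °C and 101.325 kPa, is the standard temperature-pressure factor. A minimal sketch (names ours):

```python
def k_tp(temp_c, pressure_kpa, t0_c=20.0, p0_kpa=101.325):
    """Air-density correction factor for a vented ionization chamber,
    referred to the reference conditions 20 degC and 101.325 kPa."""
    return ((273.15 + temp_c) / (273.15 + t0_c)) * (p0_kpa / pressure_kpa)


# At reference conditions the factor is exactly unity; a warmer, lower-
# pressure day gives a factor slightly above one.
factor = k_tp(22.0, 100.0)
```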

Measurement uncertainty

The method used to evaluate the uncertainty of the calibration coefficient is that outlined in the IAEA publication [5] and the ISO document [6]. The method considers all the quantities that might contribute to the overall uncertainty and neglects those that contribute less than 0.1%. Typical values are then chosen for the uncertainties of the remaining quantities and the overall uncertainty is evaluated.

Table 1 shows a typical uncertainty budget for the calibration of a W30001 chamber against a working standard for absorbed dose to water in ⁶⁰Co gamma radiation. The sources of uncertainty are shown in three groups: factors influencing the working standard, factors influencing the user dosimeter and factors influencing both dosimeters. In the case of the measurement of current and field inhomogeneity, relative standard uncertainties of less than 0.1% were found; these values were nevertheless retained in the table for clarity.
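The quadrature combination used in such an uncertainty budget can be illustrated as follows. The component values below are made up for illustration and are not taken from Table 1:

```python
import math

def combined_relative_uncertainty(components_percent):
    """Relative combined standard uncertainty (%), assuming independent
    components summed in quadrature (GUM approach)."""
    return math.sqrt(sum(u * u for u in components_percent))

def expanded_uncertainty(components_percent, k=2.0):
    """Expanded uncertainty for a coverage factor k (k = 2 in this work)."""
    return k * combined_relative_uncertainty(components_percent)


# Two illustrative components of 0.3% and 0.4% combine to 0.5%,
# giving an expanded (k = 2) uncertainty of 1.0%.
u_expanded = expanded_uncertainty([0.3, 0.4])
```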



Results and discussion

Figures 1 and 2 show the components of the automated system. It is composed of the Vaisala digital barometer, the Thommen digital thermometer, two PTW Unidos electrometers and a computer. The instruments are connected to the computer using RS-232 cables and RS-232-to-USB adapters.

A software tool for the automation of the calibration procedures was developed in LabVIEW [7], a platform and development environment for a visual programming language. The tool follows the steps included in the validated procedures for calibration in terms of air kerma and absorbed dose to water. The application acquires and processes the charge values from the electrometers, writes the final data to a data report file and issues a calibration certificate. The steps related to these tasks are described below.
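As a sketch of the acquisition step, the function below parses a charge/time response line received over the serial link. The line format 'M;<charge_nC>;<time_s>' is invented for illustration and does not reproduce the actual PTW Unidos serial protocol:

```python
def parse_reading(line):
    """Parse a hypothetical electrometer response 'M;<charge_nC>;<time_s>'
    into a (charge, time) tuple of floats."""
    fields = line.strip().split(";")
    if len(fields) != 3 or fields[0] != "M":
        raise ValueError("unexpected response: %r" % line)
    return float(fields[1]), float(fields[2])


charge_nc, time_s = parse_reading("M;1.234;60.0\r\n")
```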

Leakage measurement: The software simplifies setting the time for the determination of the leakage current. It also saves the initial and final charge values, their respective time values and the calculated leakage current. If the value obtained is higher than 0.1% of the measured current, or if it is greater than 10⁻¹⁴ A, the program notifies the user and stops the calibration.
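The leakage criterion described above can be sketched as follows (function names are ours):

```python
def leakage_current(q_initial, q_final, t_initial, t_final):
    """Leakage current (A) from two charge readings (C) and their times (s)."""
    return abs(q_final - q_initial) / (t_final - t_initial)

def leakage_acceptable(i_leak, i_signal, rel_limit=0.001, abs_limit=1e-14):
    """Acceptable only if the leakage is at most 0.1% of the signal current
    AND at most 1e-14 A; otherwise the calibration is stopped."""
    return i_leak <= rel_limit * abs(i_signal) and i_leak <= abs_limit


# 0.6 pC accumulated over 60 s corresponds to a 1e-14 A leakage current.
i_leak = leakage_current(0.0, 6.0e-13, 0.0, 60.0)
```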

Measurement with the working standard chamber: Five charge readings are taken by the application, using an integration time chosen so that the measured charge is at least 1 nC. The temperature and pressure values can be acquired from the Thommen thermometer and the Vaisala barometer by default, or they can be entered by the user. All these values are saved in the data report file. The software calculates the difference between the measured air kerma or absorbed dose to water and the reference value of the quantity, corrected for radioactive decay. If the difference exceeds ±0.5%, a warning is shown; in that case the position of the chamber or the phantom should be checked and the measurement repeated. If the difference persists, the user must stop the calibration and restart it later.
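The decay correction of the reference value and the ±0.5% check can be sketched as follows. The half-life used is the commonly tabulated value for ⁶⁰Co; function names are ours:

```python
import math

CO60_HALF_LIFE_D = 1925.28  # cobalt-60 half-life in days (about 5.27 years)

def decay_corrected(ref_value, elapsed_days, half_life_days=CO60_HALF_LIFE_D):
    """Reference output decayed from its certification date to the
    measurement date."""
    return ref_value * math.exp(-math.log(2.0) * elapsed_days / half_life_days)

def within_half_percent(measured, reference):
    """True if |measured - reference| / reference is at most 0.5%."""
    return abs(measured - reference) / reference <= 0.005


# After one half-life the reference output has halved.
decayed = decay_corrected(1.0, CO60_HALF_LIFE_D)
```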

Calculation of the calibration coefficient: The steps followed with the working standard dosimeter are then repeated with the user dosimeter. The application computes the calibration coefficient for air kerma or absorbed dose to water using the equations given above. The operations carried out with both the working standard and the user dosimeters are then repeated and the calibration coefficient is recalculated by the software. The two calibration coefficients are compared and the difference between them should not exceed 0.5%; otherwise, the charge measurements must be repeated. The reported calibration coefficient is then determined as the average of the two measured values.

Calibration report: The instrument characteristics and all its output data are saved in the calibration data report file, whose fields are the same as those contained in the old calibration report book.

Calibration certificate: The results of the calibration are reported in a calibration certificate. Although the calibration coefficient is the most important parameter, the application includes additional information needed for the correct interpretation of the calibration results.

Quality controls on ionization chambers: It is good practice to perform quality controls on ionization chambers periodically. A portable check source is used for this purpose. The module of the application responsible for this task carries out five measurements of the ionization current, using an integration time set by the user such that at least 1 nC of charge is collected in each measurement. The mean of the measured current values is compared to the reference current, taking into account the radioactive decay and the corrections to the reference conditions of temperature and pressure. All these parameters are saved in the calibration data report file. The difference between the reference and the measured current should not exceed 0.5%; otherwise, the quality control should be repeated and, if the deviation persists, the control should be stopped and the staff should analyze the possible causes of the failure to comply with the criterion.
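The quality-control comparison can be sketched as follows (names ours; the reference current is assumed to be already corrected for decay and air density):

```python
def qc_deviation_percent(currents, reference_current):
    """Percentage difference between the mean of the measured ionization
    currents and the reference current."""
    mean_i = sum(currents) / len(currents)
    return 100.0 * (mean_i - reference_current) / reference_current


# Five identical readings matching the reference give zero deviation;
# readings 1% high give a 1% deviation, which would fail the 0.5% criterion.
dev = qc_deviation_percent([1.01e-12] * 5, 1.0e-12)
```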

User interface

The software tool is composed of three modules (Figure 3): Instrument, Calibration and Quality Control, which are described hereunder.

Instrument: This module enables control of and communication with the electrometer, keeping the same operational functions provided by the physical key panel of the instrument. The graphical interface is identical to the view of the real front panel of the electrometer, which makes its control easier.

Calibration: This is the module that accomplishes the calibration procedure. It is the most important block and its functions were described earlier. Like the instrument module, the calibration module presents an intuitive graphical user interface that enables the user to carry out the steps comprised in the calibration procedure. When the application is launched, the parameters corresponding to the last measurement, e.g., ionization chamber, path of the data report file, quantity in which the calibration will be done (air kerma or absorbed dose to water), are shown on the screen.

Quality control: As its name states, this module performs the quality controls on ionization chambers. As shown in Figure 3, it presents a user interface that makes it possible to specify the chamber model, reference current and date, source identification, temperature, pressure, integration time and number of readings; after execution, it shows the final value of the ionization current and the percentage difference from the reference current.

Comparison of the results obtained with and without the automated system

Table 2 shows the data and calibration results for the user dosimeters calibrated in terms of absorbed dose to water in ⁶⁰Co gamma radiation using the software application. The working standard was an NE 2581 chamber connected to a PTW Unidos 10002 electrometer. Two sets of five readings each were taken for each user dosimeter. The values shown correspond to the mean of the two sets of readings for each instrument. The calibration coefficient ND,w was reported with the expanded uncertainty (k = 2), which was estimated from Table 1. When this result was compared to that of the preceding calibration of the same dosimeter, performed two years before and shown in Table 3, the difference was, in the worst case, on the order of 0.6%, which is within the expected range of variability.

Reduction of the calibration execution time

The developed application shortened the time required for the calibration process. Without the program, the elapsed time corresponding to the setup of the measurement system, leakage measurement, determination of the calibration coefficient, data writing in the report file, issue of the calibration certificate and quality control of the user chamber was about 4 hours 30 minutes; with the software tool, this time was 3 hours 15 minutes. Hence, there was a time reduction of 1 hour 15 minutes, representing a saving of about 28% of the time required without the application.




Conclusions

An automated measurement system was developed for the calibration of dosimeters used in radiotherapy. A software application was designed and implemented in LabVIEW. The application improves the capability of the SSDL to respond to a higher demand for this calibration service, frees the staff from complex and repetitive tasks, and decreases the probability of human errors. The calibration execution time has also been reduced. Comparison of the results obtained with and without the software application shows no significant differences in the calibration coefficients. The automated system has been used for the calibration of dosimeters used in radiotherapy at the Cuban Secondary Standard Dosimetry Laboratory at the CPHR.




References

1. OIML. Secondary Standard Dosimetry Laboratories for the Calibration of Dosimeters used in Radiotherapy. Document OIML D-21. Paris: OIML, 1990.
2. IAEA. Calibration of Dosimeters used in Radiotherapy. Technical Reports Series No. 374. Vienna: IAEA, 1994.
3. IAEA. Calibration of reference dosimeters for external beam radiotherapy. Technical Reports Series No. 469. Vienna: IAEA, 2009.
4. IAEA. Absorbed Dose Determination in External Beam Radiotherapy. Technical Reports Series No. 398. Vienna: IAEA, 2000.
5. IAEA. Measurement Uncertainty: A Practical Guide for Secondary Standard Dosimetry Laboratories. IAEA-TECDOC-1585. Vienna: IAEA, 2008.
6. ISO. Guide to the Expression of Uncertainty in Measurement. Geneva: ISO, 1995.
7. National Instruments. LabVIEW System Design Software 2010. [online software] [accessed: 7 January 2013].


Received: 14 August 2013
Accepted: 24 October 2013