Gold APP Instruments

How to Calibrate Differential Scanning Calorimetry (DSC) and Steps
From: Gold APP Instruments | Published Date: 2026-04-14
There is potential for slight inaccuracy of measurements in all DSC analysers because sensors, no matter how good, are not actually embedded in the sample, and the sensors themselves are also a potential variable. Therefore, in order to ensure accuracy and repeatability of data, a system must be calibrated and checked under the conditions of use. It is important to understand that calibration itself is just a tool or process designed by the manufacturer to adjust the analyser to give a slightly different temperature or energy response. The accuracy and acceptability of the system can only be judged by separately checking the system against accepted standards. A good overview of currently available standards for DSC, with comments on accuracy and procedure, is given in the literature.

 

When to calibrate

When the system is out of accepted specifications! It is important to distinguish between the process of calibration and the process of checking specification. If a system is brand new, has undergone some type of service, or is to be used under new conditions, it may need complete calibration, but a system in regular use should be checked regularly and calibrated only when it is shown to be out of specification. Many systems, particularly in the pharmaceutical industry, are operated under GLP (good laboratory practice) or other regulations and have established guidelines as to when and how often checks should be made.

 

Checking performance

In many industries the frequency and method of checking, together with accepted limits, will be determined by the standard operating procedure (SOP) adopted and may well be carried out on a daily basis. Regular performance checks are common sense: if a system is checked only once every 6 months and then found to be in error, 6 months' work is suspect.

 

The most common procedure is to run an indium standard under the normal test conditions and measure the heat of fusion and melting onset temperature. These values are then compared with literature values and checked against accepted limits. In many industries limits of ±0.5℃ for temperature or 1% for heat of fusion are accepted, though tighter limits of ±0.3℃ and 0.1% may also be adopted. The choice of limits depends on how accurate you need to be. Indium is the easiest standard to use because of its stability and relatively low melting point of 156.6℃, which means it can often be reused, provided it is not heated above 180℃.
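As an illustration, the pass/fail arithmetic of such a check can be sketched in a few lines. The reference values and tolerances below are typical literature figures, not certified data; substitute the certificate values for your own standard and the limits from your SOP.

```python
# Sketch: checking a DSC indium run against acceptance limits.
# Reference values and default tolerances are illustrative only.

INDIUM_ONSET_REF_C = 156.6   # literature melting onset, deg C
INDIUM_DHF_REF_JG = 28.6     # literature heat of fusion, J/g (approximate)

def check_indium(onset_c, dhf_jg, temp_tol_c=0.5, dhf_tol_pct=1.0):
    """Return (passed, temperature error, % heat-of-fusion error) for a check run."""
    temp_err = onset_c - INDIUM_ONSET_REF_C
    dhf_err_pct = 100.0 * (dhf_jg - INDIUM_DHF_REF_JG) / INDIUM_DHF_REF_JG
    passed = abs(temp_err) <= temp_tol_c and abs(dhf_err_pct) <= dhf_tol_pct
    return passed, temp_err, dhf_err_pct

# Example: a run measuring a 156.8 deg C onset and 28.5 J/g
ok, dt, dh = check_indium(156.8, 28.5)
```

If the check fails, the system is recalibrated; if it passes, the result is simply logged as part of the SOP record.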

 

Parameters to be calibrated

Areas subject to calibration are as follows:

(a) Heat flow or energy. This is normally performed using the heat of fusion of a known standard such as indium. As an alternative, heat flow may be calibrated directly using a standard of known specific heat.

(b) Temperature recorded by analyser. This is performed using the melting point of known standards.

(c) Temperature control of the analyser. Sometimes called furnace calibration.

 

Use the same conditions for calibration as will be used for subsequent tests. If a range of different conditions is to be used, then a range of different calibrations must be performed, one for each set of conditions. Some analysers/software make allowance for changes in scan rate, simplifying calibration procedures, but the effectiveness of calibration should in all cases be checked and verified for each set of conditions and scan rates used.

 

Some instruments may require set default conditions to be restored before measurements are made. The manufacturer's instructions for calibration should in all cases be followed, even if they deviate from the general principles outlined here. Use the values supplied with the actual standards in use, since different batches of standards may have different certified values.

 

Heat flow calibration

The y-axis of a DSC trace is in units of heat flow; this needs to be calibrated. This is usually performed via a heat of fusion measurement as shown in Figure 1. The area under the melting curve is used since this reflects the heat flow as a function of temperature or time. Some software packages may also make provision for the y-axis itself to be calibrated and this is done against a heat capacity standard such as sapphire.



Figure 1 Indium run as a check after a calibration procedure has been completed showing the onset calculation for melting point, and area calculation for heat of fusion. This particular curve would benefit from a higher data point collection rate to remove the stepping effect of the data and improve overall accuracy.

 

An appropriate standard is chosen and heated under the same conditions (e.g. scan rate, purge gas and pan type) as the tests to be performed. Measure the heat of fusion; this value is then entered into the calibration section of the software. Indium is probably the best standard to use, but others may be needed in order to check a wider temperature range. The weight of standard should be sufficient to be accurate, typically 5–10 mg, and weighed on a six-figure balance for best accuracy; inaccuracy in the weight measurement will limit the accuracy of heat of fusion measurements.

An inert purge gas should be used for calibration to minimise potential oxidation. Nitrogen can be used even if samples are to be subsequently run in air or oxygen, since all these gases have similar thermal properties and can be used interchangeably as far as calibration is concerned.

A clean melting profile is required, without irregularities such as spikes or a kink in the leading edge of the melt caused by sample collapse and flow. For this reason, the sample should be flattened when placed in the pan; most standard samples are sufficiently soft and may be flattened using the flat part of tweezers before placing in the pan. It is also good practice to take indium data from the reheat after it has been melted once, since this will improve the thermal performance. Indium may be reused provided it has not been overheated; other standards should be discarded after use, though metals such as lead, tin or zinc, which show irregularities on initial heating, may be used on the reheat, since this is better than taking data from a poor trace. In all cases stop the run once melting is complete, and if a standard has been taken well above its melting point do not reuse it, even if the initial heat is unsuitable.
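The heat of fusion entered into the calibration software is simply the baseline-corrected peak area divided by the sample mass. The analyser software performs this integration, but the underlying arithmetic can be sketched as follows (the data arrays and mass here are illustrative):

```python
# Sketch: heat of fusion from a DSC melting peak by trapezoidal
# integration of heat flow over time, with a straight-line baseline
# drawn between the first and last points of the peak.

def heat_of_fusion(time_s, heat_flow_mw, mass_mg):
    """Integrate (heat flow - linear baseline) over time; return J/g."""
    t0, t1 = time_s[0], time_s[-1]
    y0, y1 = heat_flow_mw[0], heat_flow_mw[-1]
    area_mj = 0.0
    for i in range(1, len(time_s)):
        # baseline value at the two ends of this segment
        b_prev = y0 + (y1 - y0) * (time_s[i - 1] - t0) / (t1 - t0)
        b_curr = y0 + (y1 - y0) * (time_s[i] - t0) / (t1 - t0)
        # trapezoidal rule on the baseline-corrected signal (mW * s = mJ)
        area_mj += 0.5 * ((heat_flow_mw[i - 1] - b_prev)
                          + (heat_flow_mw[i] - b_curr)) * (time_s[i] - time_s[i - 1])
    return area_mj / mass_mg  # mJ / mg = J/g

# Example: a synthetic triangular peak over a flat baseline, 10 mg sample
t = [0, 1, 2, 3, 4]                  # time, s
hf = [0.0, 50.0, 100.0, 50.0, 0.0]   # heat flow, mW
dhf = heat_of_fusion(t, hf, 10.0)    # 200 mJ area -> 20 J/g
```

As Figure 1 suggests, a sufficient data collection rate matters: the trapezoidal sum is only as good as the sampling of the peak.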

 

Heat of fusion values are more noticeably affected than temperature when a standard deteriorates. If the y-axis scale is to be calibrated directly then the approach used for Cp measurement should be employed. Values obtained for a Cp standard are then entered against standard values.
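As a sketch of the arithmetic behind such a direct y-axis check, one common form of the ratio method compares the baseline-subtracted heat flow displacement of the sample with that of the Cp standard at the same temperature. The values below are illustrative, not certified sapphire data.

```python
# Sketch of the Cp ratio method:
#   Cp_sample = Cp_std * (y_sample * m_std) / (y_std * m_sample)
# where y is the baseline-subtracted heat flow displacement at a
# given temperature and m is the sample mass.

def cp_by_ratio(cp_standard, y_sample, y_standard, m_sample, m_standard):
    """Heat capacity of a sample from its displacement ratio vs a Cp standard."""
    return cp_standard * (y_sample * m_standard) / (y_standard * m_sample)

# Example with illustrative figures (sapphire Cp near 0.85 J/(g K)
# around 100 deg C is an assumption here, not a certified value):
cp = cp_by_ratio(cp_standard=0.85, y_sample=1.2, y_standard=1.0,
                 m_sample=10.0, m_standard=12.0)
# 0.85 * (1.2 * 12.0) / (1.0 * 10.0) = 1.224 J/(g K)
```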

 

Temperature calibration

A typical temperature onset measurement is also shown in Figure 1. Note that melting point is determined from the onset of melt, not the peak maximum. Normally, at least two standards are needed to adequately calibrate for temperature, and ideally they should span the temperature range of interest, though if working above ambient two widely spaced standards such as indium and lead or zinc are often sufficient. Temperature response is normally linear, so measurements outside the standards' range are normally accurate, but if in doubt then check against a standard. It is the verification process which is the critical aspect of calibration, not the actual procedure used with any specific system. For sub-ambient work a sub-ambient standard is advisable. Organic liquids do not make good reference materials, but they are often the only materials available; obtain as pure a material as possible. Cyclohexane is quite useful, having two transitions: a crystal transition at −87℃ and a melt at 6.5℃. Table 1 shows a list of standards.



Table 1 Commonly used standards and reference materials

 

Use the same conditions as for the subsequent tests that you want to perform. Measure the onset values and enter them into the calibration software. Sample weights are not so critical if only temperature is being measured, and typically a few milligrams should be used. The onset value of the indium melt from the heat of fusion test is usually employed, to save repeating the process.
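The two-point linear correction implied above can be sketched as follows. The indium and zinc reference onsets used here are commonly quoted literature values, included for illustration rather than as certified figures.

```python
# Sketch: a two-point linear temperature correction from two melting
# standards, fitting T_true = a * T_measured + b.

def linear_correction(measured, reference):
    """Return (a, b) such that a * measured + b reproduces the references."""
    (m1, m2), (r1, r2) = measured, reference
    a = (r2 - r1) / (m2 - m1)   # slope between the two standards
    b = r1 - a * m1             # offset fixed by the first standard
    return a, b

# Indium (156.6 deg C) and zinc (419.5 deg C) literature onsets;
# the measured onsets are illustrative.
a, b = linear_correction(measured=(157.1, 420.6), reference=(156.6, 419.5))
corrected = a * 300.0 + b   # correct an uncalibrated reading of 300.0 deg C
```

This is the same interpolation most calibration software applies internally, which is why readings outside the standards' range (an extrapolation of the same line) are usually still accurate.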

 

Temperature control (furnace) calibration

In principle, calibration of the control of the analyser requires the same approach as temperature calibration: the information from temperature control should be compared with that of various standards. However, to save repeating the whole process again software often uses an automatic procedure to match the information already obtained with the control routines. Some parameters for this, e.g. the temperature range, may need to be chosen.

 

Choice of standards

Table 1 lists details of commonly used reference materials. Materials should be used once and then discarded. However, indium is one significant exception to this rule, since it can be reheated many times, provided it is not overheated. Indium fusion values repeat to very high precision (0.1%) provided it is not heated above 180℃. To ensure complete recrystallisation between tests it should be cooled to 50℃ or below after each test. Since indium can be obtained to high purity, is very stable, and has a very useful melting point, it makes an ideal standard for most analysis and can be obtained as a certified reference material (CRM). Other metals listed should be discarded after use. For temperature ranges where no metal transitions are appropriate other materials may be used as reference materials. These may not be certified materials but may be the best available. Use fresh materials of as high purity as possible.

 

Certified reference materials have a certificate giving values obtained after the material has been tested by a range of certified laboratories. This is not necessarily a certificate of purity, but they are regarded as the ultimate in reference materials. They can be obtained from the Laboratory of the Government Chemist (LGC). Other materials may, for example, possess a certificate of high purity allowing use of theoretical melting and heat of fusion values. These types of materials are termed reference materials and are also widely accepted.

 

Factors affecting calibration

A number of factors are known to affect the response of a system and if varied may require different calibration settings. These include the following:

  • Instrument set-up and stability

  • Use of cooling accessories

  • Scan rate

  • Purge gas and flow rate

  • Pan type


Varying any of these factors can affect calibration, though the effects may be more significant for one instrument than another. Initially, ensure that the instrument is set up properly, with all services turned on and stable. Most analysers contain some analogue circuitry which causes slight drift as they warm up, so make sure that instruments have been switched on for a while (typically at least an hour) before calibration. There may be other instrument settings in specific analysers that affect calibration, so users should be well familiar with the analyser being used before proceeding with calibration. Cooling systems can cause temperature values to shift whilst they are cooling down, so make sure that these are switched on and stable. If using different scan rates, particularly the very fast rates employed by fast scan DSC, ensure that the analyser is suitably calibrated for the scan rate of use.

Purge gases such as air, oxygen and nitrogen have similar thermal effects and can be used interchangeably, provided scan rates are not altered. The higher conductivity of helium or the lower conductivity of argon can have very significant effects on calibration, and systems must therefore be calibrated with these gases if they are to be used. Typically, helium is used at low temperatures or with high scan rates, whilst argon may give better performance at high temperatures. Differing pan types generally do not have much effect, but if thermal contact is changed, for example by a differing thickness or material type of the pan base, then calibration can be affected. If in doubt, check.

 

Calibration is a process of check and adjustment. It is not complete until you have checked the analyser afterwards to ensure that the values obtained are acceptable; it is verification that ensures the analyser is in good order, not the calibration procedure itself. A good overview of the calibration procedure, together with the recognised standards available, is given in the literature. Note: always follow the manufacturer's recommended procedures when calibrating a system.