
# Calorimeter

A calorimeter is a device used for calorimetry, the science of measuring the heat of chemical reactions or physical changes as well as heat capacity. The word calorimeter is derived from the Latin word calor, meaning heat. Differential scanning calorimeters, isothermal microcalorimeters, titration calorimeters and accelerated rate calorimeters are among the most common types. A simple calorimeter just consists of a thermometer attached to an insulated container.

To find the enthalpy change per mole of a substance A in a reaction between two liquids A and B, the liquids are added to a calorimeter and the initial and final (after the reaction has finished) temperatures are noted. Multiplying the temperature change by the mass and specific heat capacities of the liquids gives a value for the energy given off during the reaction (assuming the reaction was exothermic). Dividing the energy change by the number of moles of A present gives its enthalpy change of reaction. This method is used primarily in academic teaching because it illustrates the theory of calorimetry. It does not, however, account for heat lost through the container or for the heat capacity of the thermometer and the container itself. The underlying assumption is that objects placed inside the calorimeter transfer their heat to the calorimeter and the liquid, so that the heat absorbed by the calorimeter and the liquid equals the heat given off by the objects.
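The simple-calorimeter calculation above can be sketched in a few lines; the masses, specific heat, temperature rise, and mole count below are illustrative assumptions, not measured data.

```python
# Sketch: molar enthalpy change from a simple (coffee-cup style) calorimeter.
# All numeric inputs are assumed example values.

def enthalpy_per_mole(mass_g, specific_heat_j_per_g_k, delta_t_k, moles):
    """Return the enthalpy change per mole of A (J/mol).

    q = m * c * dT is the heat absorbed by the liquid mixture; for an
    exothermic reaction the enthalpy change of reaction is -q / n(A).
    """
    q = mass_g * specific_heat_j_per_g_k * delta_t_k  # heat released (J)
    return -q / moles                                 # exothermic => negative dH

# Example: 100 g of mixture (c = 4.18 J/(g*K)) warms by 6.5 K
# while 0.050 mol of A reacts.
dH = enthalpy_per_mole(100.0, 4.18, 6.5, 0.050)
```

Note that this neglects exactly the losses discussed above: heat escaping through the container and the heat capacity of the apparatus itself.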

## Types

### Reaction calorimeters

A reaction calorimeter is a calorimeter in which a chemical reaction is initiated within a closed insulated container. Reaction heats are measured and the total heat is obtained by integrating heat flow versus time. This is the standard used in industry to measure heats of reaction, since industrial processes are engineered to run at constant temperatures. Reaction calorimetry can also be used to determine the maximum heat release rate for chemical process engineering and to track the global kinetics of reactions. There are three common methods for measuring heat in a reaction calorimeter:

#### Heat flow calorimetry

The cooling/heating jacket controls the temperature of the process. Heat is measured by monitoring the temperature difference between heat transfer fluid and the process fluid as follows:

$Q = UA\left(T-t\right)$

where

$Q$ = process heating (or cooling) power (W)
$U$ = overall heat transfer coefficient (W/(m²·K))
$A$ = heat transfer area (m²)
$T$ = process temperature (K)
$t$ = jacket temperature (K)

Heat flow calorimetry allows the user to measure heat whilst the process temperature remains under control. It is however a difficult technique to use and not particularly accurate. The value of U has to be predetermined by careful experimentation and any change in product composition, liquid level, process temperature, agitation rate or viscosity will upset the calibration.
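As a sketch, the heat flow relation $Q = UA(T-t)$ can be evaluated directly; the coefficient, area, and temperatures below are assumed example values.

```python
# Sketch of heat flow calorimetry: Q = U * A * (T - t).
# U, A, and the temperatures are illustrative assumptions.

def heat_flow(U, A, T_process, T_jacket):
    """Process heating (or cooling) power in watts.

    U: overall heat transfer coefficient, W/(m^2 K)
    A: heat transfer area, m^2
    """
    return U * A * (T_process - T_jacket)

# Example: U = 300 W/(m^2 K), A = 1.5 m^2, process at 350 K, jacket at 340 K.
Q = heat_flow(300.0, 1.5, 350.0, 340.0)  # positive: heat flows process -> jacket
```

In practice the hard part is not this arithmetic but calibrating $U$, which, as noted above, drifts with composition, fill level, agitation, and viscosity.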

A variation of the 'heat flow' technique is called 'power compensation' calorimetry. This method uses a cooling jacket operating at constant flow and temperature. The process temperature is regulated by adjusting the power of an electrical heater. When the experiment is started, the electrical heat and the cooling power (of the cooling jacket) are in balance. As the process heat load changes, the electrical power is varied to maintain the desired process temperature. The heat liberated or absorbed by the process is determined from the difference between the initial electrical power and the demand for electrical power at the time of measurement. The power compensation method is easier to set up than heat flow calorimetry, but it suffers from similar limitations, since any change in product composition, liquid level, process temperature, agitation rate or viscosity will upset the calibration. The presence of an electrical heating element is also undesirable for process operations.
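The power compensation bookkeeping can be sketched as follows; the baseline and in-reaction heater powers are assumed example values.

```python
# Sketch of power compensation calorimetry: process heat is inferred from the
# change in electrical heater power needed to hold the setpoint temperature.
# The wattages below are illustrative assumptions.

def process_heat(baseline_power_w, current_power_w):
    """Heat released by the process (W).

    When the process releases heat, the controller turns the heater down,
    so the release equals the baseline demand minus the current demand.
    """
    return baseline_power_w - current_power_w

# Example: heater sits at 800 W in balance; during the reaction it drops to 550 W.
q_process = process_heat(800.0, 550.0)  # the process is releasing 250 W
```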

#### Heat balance calorimetry

The cooling/heating jacket controls the temperature of the process. Heat is measured by monitoring the heat gained or lost by the heat transfer fluid as follows:

$Q = m_s C_{ps}\left(T_i - T_o\right)$

where

$Q$ = process heating (or cooling) power (W)

$m_s$ = mass flow of heat transfer fluid (kg/s)

$C_{ps}$ = specific heat of heat transfer fluid (J/(kg·K))

$T_i$ = inlet temperature of heat transfer fluid (K)
$T_o$ = outlet temperature of heat transfer fluid (K)

Heat balance calorimetry is, in principle, the ideal method of measuring heat, since the heat entering and leaving the system through the heating/cooling jacket is measured from the heat transfer fluid (which has known properties). This eliminates most of the calibration problems encountered in heat flow and power compensation calorimetry. Unfortunately, the method does not work well in traditional batch vessels, since the process heat signal is obscured by large heat shifts in the cooling/heating jacket. A recent development, however, is the constant flux cooling/heating jacket. These use variable-geometry cooling jackets and can operate at a substantially constant jacket temperature. Such reaction calorimeters tend to be much simpler to use and are much more tolerant of changes in process conditions (which would affect calibration in heat flow or power compensation calorimeters).
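The heat balance relation $Q = m_s C_{ps}(T_i - T_o)$ can be sketched directly; the flow rate, fluid properties, and temperatures below are assumed example values.

```python
# Sketch of heat balance calorimetry: Q = m_s * C_ps * (T_i - T_o).
# The fluid properties and temperatures are illustrative assumptions.

def heat_balance(m_dot_kg_s, cp_fluid_j_kg_k, T_in_k, T_out_k):
    """Power transferred from the heat transfer fluid to the process (W)."""
    return m_dot_kg_s * cp_fluid_j_kg_k * (T_in_k - T_out_k)

# Example: 0.2 kg/s of water (cp = 4180 J/(kg K)) enters the jacket at 330 K
# and leaves at 328 K, so it has given up heat to the process.
Q = heat_balance(0.2, 4180.0, 330.0, 328.0)
```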

### Bomb calorimeters

A bomb calorimeter is a type of constant-volume calorimeter used to measure the heat of combustion of a particular reaction. Bomb calorimeters have to withstand the large pressure generated inside the calorimeter as the reaction is being measured. Electrical energy is used to ignite the fuel; as the fuel burns, it heats the surrounding air, which expands and escapes through a tube that leads it out of the calorimeter. As the air escapes through the copper tube it also heats the water outside the tube. The temperature change of the water allows the calorie content of the fuel to be calculated.

In more recent calorimeter designs, the whole bomb, pressurized with excess pure oxygen (typically at 20 atm) and containing a known mass of fuel, is submerged in a known volume of water before the charge is (again electrically) ignited. The bomb, with its fuel and oxygen, forms a closed system: no air escapes during the reaction. The energy released by the combustion raises the temperature of the steel bomb, its contents, and the surrounding water jacket. The temperature change in the water is then accurately measured. This temperature rise, along with a bomb factor (which depends on the heat capacity of the metal bomb parts), is used to calculate the energy given out by the fuel burnt. A small correction is made to account for the electrical energy input and the burning fuse. After the temperature rise has been measured, the excess pressure in the bomb is released.
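The data reduction described above can be sketched as follows; the bomb factor, temperature rise, and ignition correction are assumed example values (in practice the bomb factor is found by burning a calibration standard such as benzoic acid).

```python
# Sketch of a bomb calorimeter reduction: energy from the measured temperature
# rise and the bomb factor, with a correction for the ignition energy.
# All numeric inputs are illustrative assumptions.

def combustion_energy(delta_t_k, bomb_factor_j_per_k, ignition_energy_j=0.0):
    """Heat released by the fuel (J).

    bomb_factor_j_per_k is the effective heat capacity of the bomb, its
    fittings, and the water jacket, determined by calibration.
    """
    return delta_t_k * bomb_factor_j_per_k - ignition_energy_j

# Example: a 2.15 K rise with a bomb factor of 10_500 J/K and 50 J from the fuse.
E = combustion_energy(2.15, 10_500.0, 50.0)  # about 22.5 kJ
```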

### Constant-pressure calorimeter

A constant-pressure calorimeter measures the change in enthalpy of a reaction occurring in solution during which the atmospheric pressure remains constant.

An example is a coffee-cup calorimeter, which is constructed from two nested Styrofoam cups with holes through which a thermometer and a stirring rod can be inserted. The inner cup holds the solution in which the reaction occurs, and the outer cup provides insulation. Then

$C_p = \frac{W \, \Delta H}{M \, \Delta T}$

where

$\Delta H$ = enthalpy of solution
$\Delta T$ = change in temperature
$W$ = weight of solute
$M$ = molecular weight of solute
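The coffee-cup relation above can be sketched in code; the solute, its molecular weight, enthalpy of solution, and temperature change below are assumed example values.

```python
# Sketch of the coffee-cup calorimeter relation C_p = (W * dH) / (M * dT).
# The inputs are illustrative assumptions.

def cp_from_solution(weight_g, enthalpy_j_per_mol, molecular_weight_g_per_mol,
                     delta_t_k):
    """Heat capacity computed from the enthalpy of solution, per the formula."""
    return (weight_g * enthalpy_j_per_mol) / (molecular_weight_g_per_mol * delta_t_k)

# Example: 4.0 g of a solute with M = 80 g/mol and an (endothermic) enthalpy
# of solution of +25_700 J/mol produces a 2.3 K temperature change.
Cp = cp_from_solution(4.0, 25_700.0, 80.0, 2.3)
```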

### Differential scanning calorimeter

In a differential scanning calorimeter (DSC), heat flow into a sample—usually contained in a small aluminium capsule or 'pan'—is measured differentially, i.e., by comparing it to the flow into an empty reference pan.

In a heat flux DSC, both pans sit on a small slab of material with a known (calibrated) heat resistance K. The temperature of the calorimeter is raised linearly with time (scanned), i.e., the heating rate dT/dt = β is kept constant. This time linearity requires good design and good (computerized) temperature control. Of course, controlled cooling and isothermal experiments are also possible.

Heat flows into the two pans by conduction. The flow of heat into the sample is larger because of its heat capacity Cp. The difference in flow dq/dt induces a small temperature difference ΔT across the slab. This temperature difference is measured using a thermocouple. The heat capacity can in principle be determined from this signal:

$\Delta T = K \, \frac{dq}{dt} = K C_p \beta$

Note that this formula (equivalent to Newton's law of heat flow) is analogous to, and much older than, Ohm's law of electric flow: ΔV = R dQ/dt = R I.

When suddenly heat is absorbed by the sample (e.g., when the sample melts), the signal will respond and exhibit a peak.

$\frac{dq}{dt} = C_p \beta + f(t,T)$

From the integral of this peak the enthalpy of melting can be determined, and from its onset the melting temperature.
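These two steps, heat capacity from the baseline and enthalpy from the peak integral, can be sketched as follows; the scan rate, baseline, and trace values are synthetic assumptions, and the integration is a simple trapezoid rule.

```python
# Sketch of DSC data reduction using dq/dt = Cp * beta + f(t, T).
# The numbers and the synthetic trace are illustrative assumptions.

def heat_capacity(baseline_heat_flow_w, beta_k_per_s):
    """Cp (J/K) from the baseline heat flow at constant scan rate beta."""
    return baseline_heat_flow_w / beta_k_per_s

def melting_enthalpy(times_s, heat_flow_w, baseline_w):
    """Integrate the excess heat flow over a melting peak (trapezoid rule)."""
    total = 0.0
    for i in range(1, len(times_s)):
        dt = times_s[i] - times_s[i - 1]
        excess = (heat_flow_w[i] - baseline_w) + (heat_flow_w[i - 1] - baseline_w)
        total += 0.5 * excess * dt
    return total  # enthalpy of melting, J

# Example: a 10 K/min scan (beta = 10/60 K/s) with a 5 mW baseline.
Cp = heat_capacity(0.005, 10.0 / 60.0)
```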

Differential scanning calorimetry is a workhorse technique in many fields, particularly in polymer characterization.

A modulated temperature differential scanning calorimeter (MTDSC) is a type of DSC in which a small oscillation is imposed upon the otherwise linear heating rate.

This has a number of advantages. It facilitates the direct measurement of the heat capacity in one measurement, even in (quasi-)isothermal conditions. It permits the simultaneous measurement of heat effects that are reversible and not reversible at the timescale of the oscillation (reversing and non-reversing heat flow, respectively). It increases the sensitivity of the heat capacity measurement, allowing for scans at a slow underlying heating rate.

### Isothermal titration calorimeter

In an isothermal titration calorimeter, the heat of reaction is used to follow a titration experiment. This permits determination of the midpoint (stoichiometry, $N$) of a reaction as well as its enthalpy ($\Delta H$), entropy ($\Delta S$) and, of primary concern, the binding affinity ($K_a$).

The technique is gaining in importance particularly in the field of biochemistry, because it facilitates determination of substrate binding to enzymes. The technique is commonly used in the pharmaceutical industry to characterize potential drug candidates.

### X-ray microcalorimeter

In 1982, a new approach to non-dispersive X-ray spectroscopy, based on the measurement of heat rather than charge, was proposed by Moseley et al. (1984). The detector, an X-ray microcalorimeter, works by sensing the heat pulses generated by X-ray photons when they are absorbed and thermalized. The temperature increase is directly proportional to the photon energy. This invention combines high detector efficiency with high energy resolution, achievable mainly because of the low temperature of operation. Microcalorimeters have a low-heat-capacity mass that absorbs incident X-ray (UV, visible, or near-IR) photons, a weak link to a low-temperature heat sink, which provides the thermal isolation needed for a temperature rise to occur, and a thermometer to measure the change in temperature. Following these ideas, a large development effort started. The first astronomical spacecraft designed, built and launched with onboard cryogenic microcalorimeters was Astro-E2. NASA and ESA both have plans for future missions (Constellation-X and XEUS, respectively) that will use some form of microcalorimeter.
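Since the temperature rise is proportional to the photon energy, a single pulse height follows from $\Delta T = E/C$; the absorber heat capacity and photon energy below are assumed, order-of-magnitude example values.

```python
# Sketch: temperature pulse from one absorbed photon, dT = E / C,
# where C is the absorber heat capacity. Values are illustrative assumptions.

EV_TO_J = 1.602176634e-19  # exact SI conversion, J per eV

def pulse_height_k(photon_energy_ev, heat_capacity_j_per_k):
    """Temperature rise (K) when a photon thermalizes in the absorber."""
    return photon_energy_ev * EV_TO_J / heat_capacity_j_per_k

# Example: a 6 keV X-ray in an absorber with C = 1e-12 J/K (cryogenic operation
# keeps C this small, which is what makes the pulse measurable).
dT = pulse_height_k(6000.0, 1e-12)  # on the order of a millikelvin
```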

### High-energy particle calorimeter

In particle physics, a calorimeter is a component of a detector that measures the energy of entering particles.