Device for measuring heat produced during a mechanical, electrical, or chemical reaction and for calculating the heat capacity of materials. A common design, known as a bomb calorimeter, consists of a reaction chamber surrounded by a liquid that absorbs the heat produced by the reaction. The amount of heat can be determined from the increase in temperature, taking into account the properties of the container and the liquid.
To find the enthalpy change per mole of a substance A in a reaction between two liquids A and B, the liquids are added to a calorimeter and the initial and final (after the reaction has finished) temperatures are noted. Multiplying the temperature change by the mass and specific heat capacity of the liquids gives a value for the energy released during the reaction (assuming the reaction was exothermic). Dividing the energy change by the number of moles of A present gives its enthalpy change of reaction. This method is used primarily in academic teaching because it illustrates the theory of calorimetry. It does not, however, account for heat loss through the container or for the heat capacity of the thermometer and the container itself. The underlying assumption is that an object placed inside the calorimeter transfers its heat to the calorimeter and the liquid, so the heat absorbed by the calorimeter and the liquid is equal to the heat given off by the object.
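The teaching-style calculation described above can be sketched as follows. All numerical values are illustrative assumptions, not data from the text.

```python
# Minimal sketch of the simple calorimetry calculation: q = m * c * dT,
# then divide by moles of A. Values below are illustrative assumptions.

def enthalpy_change_per_mole(mass_g, specific_heat, delta_t, moles_a):
    """Return the molar enthalpy change in J/mol.

    mass_g        : combined mass of the liquids (g)
    specific_heat : specific heat capacity (J/(g*K)), assumed uniform
    delta_t       : temperature rise of the mixture (K)
    moles_a       : moles of substance A that reacted
    """
    q = mass_g * specific_heat * delta_t   # heat released (J)
    return -q / moles_a                    # exothermic => negative enthalpy change

# Example: 100 g of solution (c ~ 4.18 J/(g*K)) warming by 5 K, 0.05 mol of A
dh = enthalpy_change_per_mole(100.0, 4.18, 5.0, 0.05)
```

The sign convention follows thermodynamic practice: heat released by the reaction appears as a negative enthalpy change.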
Heat flow calorimetry
The cooling/heating jacket controls the temperature of the process. Heat is measured by monitoring the temperature difference between the heat transfer fluid and the process fluid: Q = U·A·(Tj − Tr), where U is the overall heat transfer coefficient, A is the heat transfer area, Tj is the jacket temperature, and Tr is the process (reactor) temperature.
Heat flow calorimetry allows the user to measure heat whilst the process temperature remains under control. It is however a difficult technique to use and not particularly accurate. The value of U has to be predetermined by careful experimentation and any change in product composition, liquid level, process temperature, agitation rate or viscosity will upset the calibration.
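A hedged sketch of the heat flow relation Q = U·A·(Tj − Tr). As the text warns, U must be predetermined by careful calibration; all values below are illustrative assumptions.

```python
# Heat flow calorimetry sketch: process heat load from the jacket/process
# temperature difference. U and A come from calibration; numbers are made up.

def heat_flow(u, area, t_jacket, t_process):
    """Process heat load Q = U * A * (T_jacket - T_process), in watts.

    u         : overall heat transfer coefficient (W/(m^2*K)), calibrated
    area      : wetted heat transfer area (m^2)
    t_jacket  : heat transfer fluid temperature (deg C)
    t_process : process fluid temperature (deg C)
    """
    return u * area * (t_jacket - t_process)

# Example: U = 250 W/(m^2*K), A = 0.5 m^2, jacket held 10 K below the process
q = heat_flow(250.0, 0.5, 15.0, 25.0)  # negative => heat removed from process
```

Any change in composition, liquid level, or agitation that alters U or A invalidates the calibrated values, which is the limitation noted above.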
A variation of the 'heat flow' technique is called 'power compensation' calorimetry. This method uses a cooling jacket operating at constant flow and temperature. The process temperature is regulated by adjusting the power of an electrical heater. When the experiment is started, the electrical heat and the cooling power (of the cooling jacket) are in balance. As the process heat load changes, the electrical power is varied to maintain the desired process temperature. The heat liberated or absorbed by the process is determined from the difference between the initial electrical power and the demand for electrical power at the time of measurement. The power compensation method is easier to set up than heat flow calorimetry, but it suffers from similar limitations, since any change in product composition, liquid level, process temperature, agitation rate, or viscosity will upset the calibration. The presence of an electrical heating element is also undesirable for process operations.
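The power compensation bookkeeping described above reduces to a subtraction; the numbers below are illustrative assumptions.

```python
# Power compensation sketch: process heat equals baseline heater power minus
# the heater power currently demanded to hold the setpoint.

def process_heat(baseline_power_w, current_power_w):
    """Heat released by the process (W).

    If the process releases heat, the controller throttles the heater back,
    so baseline - current is positive for an exothermic event.
    """
    return baseline_power_w - current_power_w

# Example: heater started at 150 W, now only 110 W is needed => 40 W exotherm
q = process_heat(150.0, 110.0)
```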
Heat balance calorimetry
The cooling/heating jacket controls the temperature of the process. Heat is measured by monitoring the heat gained or lost by the heat transfer fluid: Q = ṁ·Cp·(Ti − To), where ṁ is the mass flow of the heat transfer fluid, Cp is its specific heat capacity, and Ti and To are its inlet and outlet temperatures.
Heat balance calorimetry is, in principle, the ideal method of measuring heat, since the heat entering and leaving the system through the heating/cooling jacket is measured from the heat transfer fluid (which has known properties). This eliminates most of the calibration problems encountered by heat flow and power compensation calorimetry. Unfortunately, the method does not work well in traditional batch vessels, since the process heat signal is obscured by large heat shifts in the cooling/heating jacket. A recent development, however, is the constant flux cooling/heating jacket. These jackets use variable geometry and can operate at substantially constant temperature. Reaction calorimeters built this way tend to be much simpler to use and are much more tolerant of changes in process conditions (which would affect calibration in heat flow or power compensation calorimeters).
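The heat balance across the jacket can be sketched directly from the fluid's known properties; the flow rate and temperatures below are illustrative assumptions.

```python
# Heat balance sketch: heat picked up by the heat transfer fluid,
# Q = m_dot * Cp * (T_out - T_in). Fluid values here are made up.

def heat_balance(mass_flow_kg_s, cp_fluid, t_in, t_out):
    """Heat gained by the heat transfer fluid (W).

    mass_flow_kg_s : mass flow of heat transfer fluid (kg/s)
    cp_fluid       : specific heat of the fluid (J/(kg*K)), a known property
    t_in, t_out    : jacket inlet and outlet temperatures (deg C)
    """
    return mass_flow_kg_s * cp_fluid * (t_out - t_in)

# Example: 0.2 kg/s of water-like fluid (Cp ~ 4180 J/(kg*K)) warming by 1.5 K,
# i.e. the jacket is removing ~1.25 kW from the process
q = heat_balance(0.2, 4180.0, 20.0, 21.5)
```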
A bomb calorimeter is a type of constant-volume calorimeter used to measure the heat of combustion of a particular reaction. Bomb calorimeters have to withstand the large pressure generated inside the calorimeter while the reaction is being measured. Electrical energy is used to ignite the fuel; as the fuel burns, it heats the surrounding air, which expands and escapes through a tube that leads the air out of the calorimeter. As the air escapes through the copper tube, it also heats the water outside the tube. The temperature change of the water allows the calorie content of the fuel to be calculated.
In more recent calorimeter designs, the whole bomb, pressurized with excess pure oxygen (typically at 20 atm) and containing a known mass of fuel, is submerged under a known volume of water before the charge is (again electrically) ignited. The bomb, with fuel and oxygen, forms a closed system: no air escapes during the reaction. The energy released by the combustion raises the temperature of the steel bomb, its contents, and the surrounding water jacket. The temperature change in the water is then accurately measured. This temperature rise, along with a bomb factor (which depends on the heat capacity of the metal bomb parts), is used to calculate the energy given out by the fuel burnt. A small correction is made to account for the electrical energy input and the burning fuse. After the temperature rise has been measured, the excess pressure in the bomb is released.
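The bomb calorimeter arithmetic described above can be sketched as follows. The bomb factor and the correction terms are illustrative assumptions.

```python
# Bomb calorimeter sketch: gross heat = bomb_factor * dT, then subtract the
# small corrections for the electrical ignition and the burning fuse wire.
# All numbers are illustrative assumptions.

def combustion_energy(bomb_factor_j_per_k, delta_t, electrical_j=0.0, fuse_j=0.0):
    """Energy released by the fuel (J).

    bomb_factor_j_per_k : effective heat capacity of bomb + water jacket (J/K)
    delta_t             : measured temperature rise of the water (K)
    electrical_j        : electrical ignition energy input (J)
    fuse_j              : heat contributed by the burning fuse (J)
    """
    return bomb_factor_j_per_k * delta_t - electrical_j - fuse_j

# Example: 10 kJ/K calorimeter, 2.5 K rise, 50 J ignition, 30 J fuse
e = combustion_energy(10000.0, 2.5, electrical_j=50.0, fuse_j=30.0)
```

Dividing the result by the known mass of fuel would give a specific energy (J/g), which is how fuel calorie content is usually reported.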
An example is a coffee-cup calorimeter, which is constructed from two nested Styrofoam cups with holes through which a thermometer and a stirring rod can be inserted. The inner cup holds the solution in which the reaction occurs, and the outer cup provides insulation.
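For a coffee-cup calorimeter (a constant-pressure device), the heat of reaction is taken up by the solution itself, so q_rxn = −m·c·ΔT. The values below are illustrative assumptions.

```python
# Coffee-cup calorimeter sketch: the solution absorbs the reaction heat,
# so the heat of reaction is the negative of m * c * dT. Numbers are made up.

def reaction_heat(solution_mass_g, specific_heat, delta_t):
    """Heat of reaction (J); negative when the solution warms (exothermic).

    solution_mass_g : mass of the solution in the inner cup (g)
    specific_heat   : specific heat of the solution (J/(g*K))
    delta_t         : temperature change of the solution (K)
    """
    return -solution_mass_g * specific_heat * delta_t

# Example: 50 g of solution (c ~ 4.18 J/(g*K)) warming by 3 K
q = reaction_heat(50.0, 4.18, 3.0)
```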
In a heat flux DSC, both pans sit on a small slab of material with a known (calibrated) heat resistance K. The temperature of the calorimeter is raised linearly with time (scanned), i.e., the heating rate dT/dt = β is kept constant. This time linearity requires good design and good (computerized) temperature control. Of course, controlled cooling and isothermal experiments are also possible.
Heat flows into the two pans by conduction. The flow of heat into the sample is larger because of its heat capacity Cp. The difference in flow dq/dt induces a small temperature difference ΔT across the slab. This temperature difference is measured using a thermocouple. The heat capacity can in principle be determined from this signal: since dq/dt = ΔT/K, we have Cp = (dq/dt)/(dT/dt) = ΔT/(K·β).
When suddenly heat is absorbed by the sample (e.g., when the sample melts), the signal will respond and exhibit a peak.
From the integral of this peak the enthalpy of melting can be determined, and from its onset the melting temperature.
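The peak analysis described above can be sketched numerically: integrating the excess heat flow over time gives the enthalpy of melting, and the point where the signal first rises above the baseline estimates the onset. The synthetic Gaussian peak below is an illustrative assumption, not real DSC data.

```python
import numpy as np

# Synthetic heat-flow trace: flat baseline plus a Gaussian melting endotherm.
# Peak position, width, and height are illustrative assumptions.
t = np.linspace(0.0, 100.0, 2001)                  # time (s)
baseline = 0.002                                   # baseline heat flow (W)
peak = 0.05 * np.exp(-(((t - 50.0) / 5.0) ** 2))   # excess heat flow (W)
signal = baseline + peak

# Enthalpy of melting = area between the signal and the baseline (J),
# computed with the trapezoid rule.
excess = signal - baseline
dh_melt = float(np.sum(0.5 * (excess[1:] + excess[:-1]) * np.diff(t)))

# Rough onset estimate: first time the signal clearly exceeds the baseline
onset_time = float(t[np.argmax(excess > 0.005)])
```

A real instrument would also divide by sample mass to report J/g, and would fit the onset as the intersection of the extrapolated baseline and the peak's leading edge rather than using a simple threshold.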
Differential scanning calorimetry is a workhorse technique in many fields, particularly in polymer characterization.
A modulated temperature differential scanning calorimeter (MTDSC) is a type of DSC in which a small oscillation is imposed upon the otherwise linear heating rate.
This has a number of advantages. It facilitates the direct measurement of the heat capacity in one measurement, even in (quasi-)isothermal conditions. It permits the simultaneous measurement of heat effects that are reversible and not reversible at the timescale of the oscillation (reversing and non-reversing heat flow, respectively). It increases the sensitivity of the heat capacity measurement, allowing for scans at a slow underlying heating rate.
The technique is gaining in importance, particularly in the field of biochemistry, because it facilitates the determination of substrate binding to enzymes. The technique is also commonly used in the pharmaceutical industry to characterize potential drug candidates.
In 1982, a new approach to non-dispersive X-ray spectroscopy, based on the measurement of heat rather than charge, was proposed by Moseley et al. (1984). The detector, an X-ray microcalorimeter, works by sensing the heat pulses generated by X-ray photons when they are absorbed and thermalized. The temperature increase is directly proportional to the photon energy. This invention combines high detector efficiency with high energy resolution, achievable mainly because of the low temperature of operation. Microcalorimeters have a low-heat-capacity mass that absorbs incident X-ray (UV, visible, or near-IR) photons, a weak link to a low-temperature heat sink that provides the thermal isolation needed for a temperature rise to occur, and a thermometer to measure the change in temperature. Following these ideas, a large development effort started. The first astronomical spacecraft designed, built, and launched with on-board cryogenic microcalorimeters was Astro-E2. NASA and ESA both have plans for future missions (Constellation-X and XEUS, respectively) that will use some sort of microcalorimeter.