If market demand grows, capacity utilization will rise. If demand weakens, capacity utilization will slacken. Economists and bankers watch capacity utilization indicators for signs of inflation pressures.
It is believed that when utilization rises above a threshold somewhere between 82% and 85%, price inflation will increase. Excess capacity means that insufficient demand exists to warrant expansion of output.
All else constant, the lower capacity utilization falls (relative to the trend capacity utilization rate), the better the bond market likes it. Reports of strong capacity utilization (above the trend rate) lead to bonds being sold off, as investors expect higher interest rates (which decrease bond prices) to offset the higher expected rate of inflation.
Much statistical and anecdotal evidence shows that many industries in the developed capitalist economies suffer from chronic excess capacity. Critics of market capitalism therefore argue that the system is not as efficient as it may seem, since at least one-fifth more output could be produced and sold if buying power were better distributed. However, a level of utilization somewhat below the maximum prevails, regardless of economic conditions.
In economic statistics, capacity utilization is normally surveyed for goods-producing industries at the plant level. The results are presented as an average percentage rate by industry and economy-wide, where 100% denotes full capacity. This rate is also sometimes called the "operating rate". If the operating rate is low, a situation of "overcapacity", "excess capacity", or "surplus capacity" exists. The observed rates are often turned into indexes.
There has been some debate among economists about the validity of statistical measures of capacity utilization, because much depends on the survey questions asked, and on the valuation principles used to measure output. Also, the efficiency of production may change over time, due to new technologies.
For example, Michael Perelman has argued that the US Federal Reserve Board measure is not very revealing. Prior to the early 1980s, he argues, American business carried a great deal of extra capacity, so that an operating rate close to 80% indicated at the time that capacity constraints were being approached. Since then, firms have scrapped much of their most inefficient capacity. As a result, a capacity utilization rate of 77% today would be roughly equivalent to a historical rate of about 70%.
One of the most used definitions of the "capacity utilization rate" is the ratio of actual output to the potential output. But potential output can be defined in at least two different ways.
One is the "engineering" or "technical" definition, according to which potential output represents the maximum amount of output that can be produced in the short run with the existing stock of capital. Thus, a standard definition of capacity utilization is the (weighted) average of the ratios of firms' actual output to the maximum that could be produced per unit of time with existing plant and equipment (see Johansen 1968). "Output" could be measured in physical units or in market values, but normally it is measured in market values.
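The "engineering" definition above can be sketched as a short calculation; the firm-level figures and index weights here are hypothetical, invented purely for illustration:

```python
# Hypothetical illustration of the "engineering" capacity utilization rate:
# a weighted average of each firm's actual output over the maximum output
# achievable with its existing plant and equipment. All figures are invented.
firms = [
    # (actual output, maximum output, weight in the index)
    (850.0, 1000.0, 0.5),
    (420.0, 600.0, 0.3),
    (180.0, 200.0, 0.2),
]

utilization = sum(weight * (actual / maximum) for actual, maximum, weight in firms)
print(f"Engineering utilization rate: {utilization:.1%}")  # prints 81.5%
```

In practice the weights would reflect each firm's or industry's share of output value, since output is normally measured in market values.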
However, as output increases and well before the absolute physical limit of production is reached, most firms might well experience an increase in the average cost of production (even if there is no change in the level of plant & equipment used). For example, higher average costs can arise, because of the need to operate extra shifts, undertake additional plant maintenance, and so on.
An alternative approach, sometimes called the "economic" utilization rate, is therefore to measure the ratio of actual output to the level of output, beyond which the average cost of production begins to rise. In this case, surveyed firms are asked by how much it would be practicable for them to raise production from existing plant and equipment, without raising unit costs (see Berndt & Morrison 1981). Typically, this measure will yield a rate around 10 percentage points higher than the "engineering" measure, but time series show the same movement over time.
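Under the survey approach just described, the "economic" rate can be derived from a firm's reported headroom; the numbers below are hypothetical:

```python
# Sketch of the "economic" utilization rate: the surveyed firm reports by
# how much it could raise output from existing plant and equipment without
# raising unit costs. Figures are invented for illustration.
actual_output = 850.0
reported_headroom = 0.06  # firm could produce 6% more at constant unit cost

# Output level beyond which average cost would begin to rise
cost_minimizing_capacity = actual_output * (1 + reported_headroom)
economic_rate = actual_output / cost_minimizing_capacity
print(f"Economic utilization rate: {economic_rate:.1%}")  # prints 94.3%
```

Because cost-minimizing capacity is below the absolute physical maximum, this measure comes out higher than the "engineering" rate for the same firm, consistent with the roughly 10-percentage-point gap noted above.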
As a derivative indicator, the percentage "output gap" (%OG) can be measured as actual output (AO) less potential output (PO), divided by potential output, times 100:

%OG = (AO/PO − 1) × 100
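The formula above amounts to a one-line calculation; the output values here are illustrative:

```python
# Output gap as defined above: %OG = (AO/PO - 1) x 100.
# A negative gap indicates slack (actual output below potential).
actual_output = 96.0      # AO, illustrative
potential_output = 100.0  # PO, illustrative

output_gap_pct = (actual_output / potential_output - 1) * 100
print(f"Output gap: {output_gap_pct:.1f}%")  # prints -4.0%
```

Note that the sign convention matters: which definition of potential output is used (engineering or economic) changes the denominator and hence the measured gap.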
In the survey of plant capacity used by the US Federal Reserve Board for the FRB capacity utilization index, firms are asked about "the maximum level of production that this establishment could reasonably expect to attain under normal and realistic operating conditions, fully utilizing the machinery and equipment in place."
By contrast, the Institute for Supply Management (ISM) index asks respondents to measure their current output relative to "normal capacity", and this yields a utilization rate between 4 and 10 percentage points higher than the FRB measure. Again, the time series show more or less the same historical movement.
The average economy-wide capacity utilization rate in the US since 1967 was about 81.6% according to the Federal Reserve measure. The figure for Europe is not much different, for Japan only slightly higher.
The average utilization rate of installed productive capacity in industry, in some major areas of the world, was estimated in 2003/2004 to be as follows (rounded figures):