How Is Iron Converted to Steel?

The element iron is converted to the alloy commonly called steel by controlling the percentage of carbon in the product, typically 0.25 to 1.5 percent of the final alloy by weight. The first economical method for producing steel in large quantities was the Bessemer process, invented in the mid-1850s, which blew a stream of air through the molten metal and enabled better control over the carbon content. In the modern process, known as basic oxygen steelmaking, high-purity oxygen is pumped into the furnace instead, which avoids adding the impurities found in ordinary air.

Iron ore mined from the ground contains impurities that must first be removed by a process called smelting. The ore is heated until it becomes molten, and the impurities separate into a layer called slag that floats on top of the molten iron. In the early steel-making processes, it was discovered that the fuel used in smelting, usually a form of coal called coke, was adding carbon to the molten iron. The connection between the percentage of carbon in the resulting metal and its physical properties was soon made.

What is commonly called steel can refer to a variety of alloy types, depending upon the amount of carbon present in the alloy. The carbon content affects properties such as the alloy's hardness, strength, and malleability. Specialized steel alloys contain other elements as well; stainless steel, for example, contains chromium and nickel, which give the alloy its characteristic resistance to corrosion.
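As a rough illustration of how carbon content maps to alloy type, the sketch below sorts a carbon percentage into common categories of plain carbon steel. The band boundaries and property descriptions are textbook approximations added here for illustration, not figures taken from this article.

```python
def classify_steel(carbon_pct: float) -> str:
    """Classify a plain carbon steel by carbon percentage (by weight).

    Band boundaries are approximate textbook values (an assumption,
    not from the article): real grade standards vary by specification.
    """
    if carbon_pct < 0.25:
        return "low-carbon (mild) steel: soft, malleable, easily formed"
    elif carbon_pct < 0.6:
        return "medium-carbon steel: stronger, common in machine parts"
    elif carbon_pct <= 1.5:
        return "high-carbon steel: very hard, used for tools and blades"
    else:
        return "above ~1.5% carbon, the alloy behaves more like cast iron"

# A 0.1% carbon alloy falls in the mild-steel band; 1.0% is tool-grade.
print(classify_steel(0.1))
print(classify_steel(1.0))
```

A real grading system such as the SAE/AISI designations encodes far more than carbon content, but the simple thresholds above capture the article's point: small changes in carbon percentage shift the alloy between distinctly different materials.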