The development of steel can be traced back 4000 years to the beginning of the Iron Age. Proving to be harder and stronger than bronze, which had previously been the most widely used metal, iron began to displace bronze in weaponry and tools.
For the following few thousand years, however, the quality of iron produced would depend as much on the ores available as on the production methods.
By the 17th century, iron's properties were well understood, but increasing urbanization in Europe demanded a more versatile structural metal. And by the 19th century, the amount of iron being consumed by expanding railroads provided metallurgists with financial incentive to find a solution to iron's brittleness and inefficient production processes.
A major breakthrough came in 1856 when Henry Bessemer developed an effective way to use oxygen to reduce the carbon content in iron. The modern steel industry was born.
Read more about the history of steel below:
Part I: The Era of Iron
Part II: The Bessemer Process and Modern Steelmaking
The Era of Iron
At very high temperatures, iron begins to absorb carbon, which lowers the melting point of the metal, resulting in cast iron (2.5 to 4.5% carbon). The development of blast furnaces, first used by the Chinese in the 6th century BC but more widely used in Europe during the Middle Ages, increased the production of cast iron.
Molten iron run off from the blast furnaces and cooled in a main channel with adjoining moulds came to be known as pig iron, because the large central ingot and the smaller ingots branching from it resembled a sow with suckling piglets.
Cast iron is strong, but its carbon content makes it brittle and therefore less than ideal for working and shaping. As metallurgists came to recognize that this high carbon content was central to the problem of brittleness, they experimented with new methods of reducing it in order to make iron more workable.
The Bessemer Process and Modern Steelmaking
The growth of railroads during the 19th century in both Europe and America put great pressure on the iron industry, which still struggled with inefficient production processes. Steel, meanwhile, remained unproven as a structural metal, and its production was slow and costly. That changed in 1856, when Henry Bessemer devised a more effective way to introduce oxygen into molten iron in order to reduce the carbon content.
Bessemer designed a pear-shaped receptacle - referred to as a 'converter' - in which iron could be heated while oxygen was blown through the molten metal. As the oxygen passed through, it reacted with the carbon, releasing carbon dioxide and producing a purer iron. The technique became known as the Bessemer Process.
The process was fast and inexpensive, removing carbon and silicon from iron in a matter of minutes, but it suffered from being too successful: too much carbon was removed, and too much oxygen remained in the final product. Bessemer ultimately had to repay his investors until he could find a method to restore the carbon content and remove the unwanted oxygen.
At about the same time, the British metallurgist Robert Mushet acquired and began testing a compound of iron, carbon, and manganese known as spiegeleisen. Manganese was known to remove oxygen from molten iron, and the carbon content of the spiegeleisen, if added in the right quantities, would provide the solution to Bessemer's problems. Bessemer began adding it to his conversion process with great success.
Yet one problem still remained: Bessemer had failed to find a way to remove phosphorus - a deleterious impurity that makes steel brittle - from his end product. Consequently, only phosphorus-free ores from Sweden and Wales could be used.