Sunday, 31 May 2015

Iron and Steel, Part 1: Early Times

Iron is a common element in the earth's crust, but it is hardly ever found in its native state, and the difficulties involved in smelting it meant that its use came relatively late. By comparison, there is evidence of copper-working from about 4000 BC. It was perhaps discovered accidentally: Neolithic people made pottery, and maybe found beads of copper formed from stones used to build a kiln. Pure copper is too soft to be very useful, but it was found to become much harder if alloyed with tin (to form bronze), with zinc (to form brass) or with other elements. Many ancient civilizations were based on bronze-making: Egypt, old Babylonia, Minoan Crete and Mycenaean Greece, where Homer’s heroes fought with bronze weapons. The Greek poet Hesiod (8th century BC) knew that a Bronze Age preceded the Iron Age of his own day. There were also Bronze Age empires in China, suggesting that metal-working might have been discovered independently in different parts of the world.

  Iron working began about 1200 BC in what is now Turkey (and perhaps independently in China, where it developed rather differently). Manufactured iron takes one of three forms:
Wrought iron contains only about 0.04% carbon and minute traces of other elements. It is worked by hammering or rolling, and pieces can be heated in a forge and welded together. It rusts only slowly, but in its pure form is rather soft. It is not much used nowadays: things like decorative “wrought iron” gates are really mild steel.  
Cast iron is about 4% carbon. It is cast in a mould, but cannot be forged. It is much harder than wrought iron and is strong in compression (bearing heavy loads from above) but weak in tension (if bent, it tends to break suddenly). It resists rust well, and is therefore found today in things like drain covers.
Steel.   Mild steel is about 0.25% carbon. Steel is much harder than wrought iron, and can be cast or forged and welded. It is prone to rust. Its qualities are improved by alloying it with other metals, e.g. nickel, chromium or tungsten, though even heavily alloyed modern steels remain mostly iron.
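The three forms differ essentially in their carbon content, and the distinction can be sketched as a tiny function. This is only an illustration built from the approximate percentages given above; the boundary values chosen here are assumptions, not metallurgical standards.

```python
# Classify a piece of manufactured iron by its carbon content, using the
# rough figures from the text: wrought iron ~0.04%, mild steel ~0.25%,
# cast iron ~4%. The cut-off points (0.08 and 2.0) are illustrative only.
def iron_form(carbon_pct):
    """Return the traditional name for iron with the given carbon content (%)."""
    if carbon_pct < 0.08:
        return "wrought iron"   # soft, forgeable, weldable
    elif carbon_pct < 2.0:
        return "steel"          # hard, can be cast or forged
    else:
        return "cast iron"      # hard and brittle, cast only

print(iron_form(0.04))  # wrought iron
print(iron_form(0.25))  # steel
print(iron_form(4.0))   # cast iron
```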

This model from the Science Museum illustrates early iron working

A small furnace was constructed of stones and clay, containing iron ore (iron oxide, or more complex compounds) and charcoal, and bellows were then worked for several hours. When the temperature rose above about 800 degrees Celsius, the charcoal burned in the restricted air to produce carbon monoxide, which stripped the oxygen from the iron ore, leaving iron and escaping as carbon dioxide. Because iron ore usually contains various silicates, crushed limestone (calcium carbonate) was also needed as a flux, reacting with these to produce calcium silicate, a shiny mineral commonly known as slag. Eventually the ironmaster would judge it was time to halt the process and open the furnace. If successful, he would find inside a lump of iron, perhaps not much larger than a cricket ball, also containing bits of slag, limestone and charcoal. This would then be delivered to a forge, where a smith would hammer it to drive out the rubbish, ending up with a piece of almost pure wrought iron. This would usually be shaped into bars for convenience, which other smiths would forge into tools, weapons and other useful objects.
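The chemistry just described can be summarised in a few equations. This is a simplified sketch (real bloomery reactions involve further intermediate compounds, and the ore shown here is haematite for illustration):

```latex
% Charcoal burns in a restricted air supply, giving carbon monoxide
2\,\mathrm{C} + \mathrm{O_2} \rightarrow 2\,\mathrm{CO}
% Carbon monoxide reduces the ore to iron, becoming carbon dioxide
\mathrm{Fe_2O_3} + 3\,\mathrm{CO} \rightarrow 2\,\mathrm{Fe} + 3\,\mathrm{CO_2}
% Limestone decomposes in the heat of the furnace...
\mathrm{CaCO_3} \rightarrow \mathrm{CaO} + \mathrm{CO_2}
% ...and the resulting lime fluxes the silicates into slag
\mathrm{CaO} + \mathrm{SiO_2} \rightarrow \mathrm{CaSiO_3}
```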
     For well over 2,000 years, this was the only way of making iron! The Romans, for instance, never discovered any better system. Metals therefore remained expensive, and most tools were still made out of wood whenever possible.

    The coming of iron weapons was apparently linked with enormous changes. Around this time (about 1000 BC) the ancient empires collapsed in the face of massive tribal movements. Mycenaean Greece disappeared, and the Egyptian New Kingdom was overthrown by the invasion of the “sea peoples”, who were possibly our old friends the Philistines. In the Bible we are told that the Philistines controlled the supply of iron, and tried to stop any getting to Israel. The first Iron Age empire was that of the Assyrians.

Wrought iron had only limited uses for weapons or tools: being soft, it tended to blunt or notch easily. But very early on, smiths discovered a way to improve it. If they heated iron above red-heat and then cooled it rapidly (e.g. by plunging it in water) it became much harder. This was called quenching, though smiths often spoke loosely of tempering (strictly, tempering is the gentler reheating that follows). What happened was that tiny quantities of carbon were absorbed in the surface of the iron, creating a thin layer of steel only a few atoms deep. This is now called “case-hardening”. If the iron was then repeatedly heated, hammered out and quenched, a kind of multi-layer sandwich of steel was created, suitable for making a sword. This was a highly skilled process, entirely dependent on the judgement of the smith, who worked with rituals verging on the magical, because no-one knew the chemical processes involved (“temper once in running water, once in dew, once in blood”, for instance). In "Moby Dick", when Captain Ahab forges a weapon to kill the white whale he asks his three harpooners to give their own blood for the final tempering.
    I believe this is the origin of the legends of "magic swords", which were given names and handed down from generation to generation. Probably most steel in the days of primitive technology was of poor quality, but occasionally a smith might produce a blade of top-quality steel, probably as a result of some fortuitous alloy, which would cut through the other weapons and armour of the time. By about the 12th century, as techniques improved, we no longer come across magic swords.
      A slender piece of well-tempered steel, if bent, will spring back to its original shape. Top quality fencing foils can bend almost in a semicircle and spring back; though they will weaken with each bending and will eventually break.

Iron manufacturing was changed for ever with the invention of the blast furnace about 1450 AD, probably initially in Belgium, whence it quickly spread throughout western Europe.
What’s different here? The ingredients are exactly the same (iron ore, charcoal and limestone) but the furnace is much bigger, about 30 feet high, and is a permanent structure. The ingredients are charged from the top, usually by wheelbarrow. The main difference is the much higher temperature, achieved by a far stronger blast of air – hence, blast furnace. Above 1,535 degrees Celsius, the iron melts to a liquid and, being heavier than anything else, accumulates at the bottom of the furnace, where it is held in place by a clay plug, with the slag and any other rubbish floating on top. This is cast iron, with a high carbon content (plus usually a few accidental traces of other elements). When the ironmaster decides the time is right, the clay plug is broken, and the molten iron runs out into a trough of sand, thence into side-troughs. These were thought to resemble lines of pigs feeding at a trough: thus "pig iron".
    A blast furnace could work continuously: fed from the top and the iron tapped off at intervals, usually twice a day, producing in the early days about a ton of pig iron every 24 hours. It would only be stopped (“blown out”) if repairs were needed, or for some other reason. The record for a modern furnace is 38 years of continuous blasting!
     Thus vastly greater quantities of iron were produced than ever before, but there were defects. The first was that such a powerful blast of air could not be achieved by human strength, or even by horses. Only a waterwheel in a strongly flowing river could achieve it. (Perhaps this is why the Romans never invented blast furnaces: not only did they despise practical science, but they relied upon the muscle-power of slaves and barely developed watermills). One consequence was that blast furnaces often had to be “blown out” in summer if water levels dropped too low to power the bellows. The iron industry in the Black Country region near Birmingham was held back by the lack of suitable water-power.
    Secondly, huge quantities of charcoal were needed. To smelt a ton of pig iron required the felling of an acre of hardwood! By Elizabethan times, there were serious concerns about England becoming deforested! The early iron industry was found in places like the Weald of Sussex, with plentiful timber and “hammer-ponds” to provide the power. The obvious answer would be to use coal in the furnaces, but it was found that the iron produced was useless because it was weakened by elements absorbed from the coal: principally sulphur, but also phosphorus. In the 17th century, Dud Dudley in the Black Country claimed to have successfully smelted iron using coal, but if true, his secret died with him.
     Thirdly, cast iron had limited uses. It could be melted and then cast into pots and pans and fire-grates, but industry wanted more wrought iron, and conversion was a difficult process.

Pig iron would be transferred to a furnace called a Finery, where it would be melted and stirred, with air blasted across it to burn off the carbon. It was then placed under a water-powered trip-hammer, which shaped it into a block called a Bloom, driving out any impurities. Next it went to another furnace called a Chafery, where it was reheated and hammered again into a bar. Lastly came a Slitting Mill, where it was rolled out and cut into narrow bars. The whole establishment was known collectively as a Forge.
     All these operations required water-power for bellows and more supplies of charcoal. Because a single water-wheel might not be powerful enough to power all these operations, the different furnaces might be some distance apart, increasing the expense.
   The first region of Britain to develop a modern iron industry was Coalbrookdale in Shropshire. This will be covered in my next essay.
A trip-hammer in the Coalbrookdale museum. The springs are modern!
