Early history of iron deficiency
Author(s) - Poskitt, Elizabeth M. E.
Publication year - 2003
Publication title - British Journal of Haematology
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.907
H-Index - 186
eISSN - 1365-2141
pISSN - 0007-1048
DOI - 10.1046/j.1365-2141.2003.04529.x
Subject(s) - pallor, iron deficiency, medicine, anemia, pediatrics, pathology, physiology
Iron deficiency has been described as ‘probably the most frequent nutritional deficiency in the world’, with perhaps 2 billion individuals across the world suffering its most obvious outcome, iron deficiency anaemia (IDA) (Hallberg et al, 2000). In its more severe forms, IDA leads to significant symptomatology and is accompanied by profound pallor. Yet anaemia as a specific cause of pallor seems rarely to have been discussed in early medical writing, perhaps because both anaemia and pallor were common to the late stages of many serious conditions. Today, signs suggestive of anaemia can be confirmed by the quick, more or less routine measurement of haemoglobin concentration. Without such testing, confirmation of anaemia is impossible and, without sophisticated biochemistry or a stained bone marrow aspirate, confirmation of iron deficiency is likewise impossible. As a consequence, hypochromic microcytic anaemia was only fully recognized in the 1930s as iron deficiency, potentially of dietary origin. The clinical consequences of iron deficiency without anaemia are still being defined. Our discussion will therefore focus on the early developments that led to the recognition of IDA and the realization that disease can result from dietary deficiency of the fourth most abundant element in the Earth's crust.