“Wet nursing,” the practice of having a woman other than a child’s mother provide breast milk to an infant, has been practiced for millennia. Two hundred years ago, wet nursing was common for a variety of reasons. Upper-class families could hire a wet nurse so that the mother could become pregnant again more quickly, ensuring adequate nutrition for the newborn without the decrease in fertility that accompanies breastfeeding. In middle-class families, employing a wet nurse allowed the mother to return to her job in the factory or the field. The practice began to decline in the 1800s in the United States and Europe as animal milks and milk-based infant formulas grew in popularity.
Use of formula increased throughout the first half of the 20th century as formula-feeding was heavily marketed and became the norm, leaving breastfeeding mothers a minority in many industrialized nations. The pendulum began to swing back, however, in the 1960s and 1970s, with increasing numbers of new mothers using their own milk to feed their babies. Currently, approximately 70% of mothers in the United States breastfeed their infants for at least a short period of time.
While numerous studies have shown that breastfeeding is preferable to formula feeding for a number of health reasons, nursing isn’t always possible or practical for every woman. Some women find themselves unable to nurse for a variety of reasons: prior surgery, working outside the home and being away from the baby for extended periods, or having adopted the child rather than being the biological mother. A new Time article notes that one solution to this problem is the resurrection of wet nursing and milk banking in developed countries.