
Researchers Argue Sex and Gender Analysis Improves Science

It has long been understood in scientific circles that unconscious biases, particularly those relating to sex and gender, can undermine the objectivity of scientific findings. While the goal of science is to discover the truth as objectively as possible, scientists are prone to the same unintentional, biased assumptions as anyone else, and the quality of their work can suffer as a result. For instance, the appropriate dosage for a medicine may be determined under the assumption that the patient is male, leading to suboptimal dosage recommendations for women. Similarly, safety equipment can be designed with the physical characteristics of men in mind, to the detriment of the women who use it. And as machine learning technologies advance, engineers are realizing that machine learning programs can pick up on human beings’ unconscious biases and replicate them, perpetuating the problem.

In light of these realizations, much conversation has taken place about how best to correct for sex and gender bias in science. This question is explored in an article published in Nature entitled “Sex and gender analysis improves science and engineering.” The article’s authors argue that taking sex and gender into consideration while conducting science not only benefits less-advantaged individuals by recognizing the institutional challenges they face, but also improves the quality of the science itself, as unconscious biases are identified and corrected. This approach, the authors claim, benefits multiple scientific fields, including medicine, artificial intelligence, and even climatology.


While many consider the terms synonymous, the researchers draw a distinction between sex and gender, defining the former as comprising mainly biological attributes and the latter as “psychological, social and cultural factors that shape attitudes, behaviours, stereotypes, technologies, and knowledge.” The distinction matters because sex and gender interact in complex ways: there are physiological differences between the sexes in the experience of pain, for instance, while gender shapes how patients communicate pain to doctors and researchers. The researchers point out several improvements made in this area over the past several decades; crash test dummies, for example, were originally based on a male physique but now represent more diverse body shapes, allowing engineers to design vehicles that are safe for a larger number of people. However, they also identify areas for future improvement.


In their paper, the scientists focus on the surprising and complicated ways sex and gender manifest across a variety of disciplines, with particular attention to marine science, biomedicine, robotics, and artificial intelligence. The authors discuss how sex impacts science even in non-humans: male and female marine life react differently to changing ocean temperatures, an observation that has generated insights into modelling the effects of climate change more accurately. In human beings, sex differences account for disparities in responses to various medicines, such as vasopressin and cancer immunotherapy, for biological reasons including differences in levels of testosterone and estrogen and in overall body composition.


Perhaps more surprisingly, artificial intelligence is a field in which unconscious biases can make their way into technologies, unintentionally perpetuating cultural biases and stereotypes. For instance, advertising algorithms are more likely to automatically serve ads for high-paying jobs to men than to women, and automatic image captioning algorithms tend to misidentify pictures of men in kitchens as women. As advanced technology continues to influence society, ensuring that artificial intelligence doesn’t perpetuate harmful stereotypes takes on additional importance.
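The mechanism behind this failure mode can be illustrated with a toy model. The sketch below uses entirely hypothetical data (it is not drawn from the studies the article cites): a naive frequency-based "ad-serving" model is fit to a skewed historical log and, because it only learns the historical rates, it reproduces the skew rather than correcting it.

```python
from collections import Counter

# Hypothetical training log of (gender, was_shown_high_paying_ad).
# The skew here is invented for illustration only.
training_log = [
    ("man", True), ("man", True), ("man", True), ("man", False),
    ("woman", True), ("woman", False), ("woman", False), ("woman", False),
]

def fit_serving_rates(log):
    """Estimate P(shown high-paying ad | gender) from the historical log."""
    shown, total = Counter(), Counter()
    for gender, was_shown in log:
        total[gender] += 1
        shown[gender] += was_shown
    return {g: shown[g] / total[g] for g in total}

rates = fit_serving_rates(training_log)
# The fitted model simply inherits the historical skew: men are served the
# ad at a much higher rate than women. Nothing in the training objective
# pushes back against the bias in the data.
print(rates)
```

Real ad-serving systems are far more complex, but the core point carries over: a model optimized purely to match historical data will faithfully reproduce whatever biases that data contains.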

The authors conclude by proposing solutions to many of the problems of sex and gender bias in science that they identify. One suggestion is to foster greater interaction between the scientific community and the humanities, including the social sciences. Interdepartmental conversations of this kind help scientists learn how biases emerge and affect human reasoning, and allow them to incorporate this knowledge into their work. Additionally, the researchers advocate for greater transparency in scientific reporting, with variables relating to sex and gender included in data analyses.
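The kind of disaggregated reporting advocated above can be sketched in a few lines. The numbers here are hypothetical, invented purely to show the analytical point: pooling all participants can average away a sex-specific effect that a by-sex breakdown would reveal.

```python
from statistics import mean

# Hypothetical trial measurements: (sex, response_to_drug).
# Values are illustrative, not real clinical data.
measurements = [
    ("female", 0.9), ("female", 1.1), ("female", 1.0),
    ("male", 0.4), ("male", 0.6), ("male", 0.5),
]

# Pooled analysis: a single average over all participants.
pooled = mean(r for _, r in measurements)

# Disaggregated analysis: group responses by sex, then average each group.
by_sex = {}
for sex, response in measurements:
    by_sex.setdefault(sex, []).append(response)
disaggregated = {sex: mean(vals) for sex, vals in by_sex.items()}

# The pooled average sits near 0.75 and hides a roughly twofold
# difference between the female and male groups.
print(pooled)
print(disaggregated)
```

Reporting both the pooled figure and the per-group figures, as the authors recommend, costs little and makes sex-specific effects visible to readers and reviewers.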