Statistical Categories and Normativity: (Against) the Essentialization of Morality as Quantified Normality
Third-party funded project
Project title Statistical Categories and Normativity: (Against) the Essentialization of Morality as Quantified Normality
Principal Investigator(s) Rost, Michael
Organisation / Research unit Faculty of Medicine,
Department of Public Health / Medizin- und Gesundheitsethik (Reiter-Theil)
Project start 01.12.2019
Probable end 30.11.2020
Status Active

In the 1830s, when the Belgian mathematician Adolphe Quetelet began measuring human properties, no human property or deviation was considered non-normal, pathological, or normatively loaded. Today, almost 200 years later, one can observe a normatively loaded normality. It almost seems as if the source of normativity is a statistical voluntarism, in which obligation derives from authoritative statistics that articulate an ought for the individual. However, any claim about what ought to be that rests exclusively on descriptive premises (e.g. statistical statements) falls victim to the is-ought problem articulated by David Hume. A statistical statement is, in the first place, a descriptive statement and, thus, a descriptive premise about what something is, which on its own does not vindicate a normative conclusion.

In flexible normalism, today’s dominant societal strategy of how normalism unfolds within a society, norms are calculated ex post from statistical data, and an individual may or may not adjust to these norms. Searching for normality and orientation, the individual does not face pre-defined norms but is confronted with crystallized realms of normality. This means that normality is constantly changing. Though norms may have been developed in a flexibly normalistic way, once established they still exert a suction effect, which leads, at the individual level, to internalization of these norms and fear of denormalization and, at the societal level, to homogenization and normalization: the time of normalism has arrived. Flexible normalism has its dangers. Not being considered normal may lead to stigmatization, social exclusion, and pathologization, as well as to real pathologies on the part of the individual.

Thus, the proposed research project aims to (a) promote awareness of the dangers of tying statistics to normativity, and (b) empower individuals to overcome this nexus of statistics and normativity. The study will, first, make people reflect on the term “normal”, on its (mis)use, and on its potential dangers, and, second, shed light on the meaning of the term “normal” in people’s everyday language and on the reasons for its use. Ultimately, the study aims to demonstrate the following choice between Scylla and Charybdis: the term “normal” either refers to all existing human properties and therefore becomes meaningless, or it refers exclusively to a subset of human properties and thereby excludes other properties from being normal, abetting social exclusion, stigmatization, and normalism. The proposed project starts from the conviction that coming to realize the dangers of an essentialization of morality as quantified normality would benefit society as a whole.

Financed by Swiss National Science Foundation (SNSF)
