For centuries, our interest in psychology has driven us to establish a standard for classifying personality traits. It began with Hippocrates and the four temperaments - sanguine, choleric, melancholic, and phlegmatic - an early attempt at systematizing the relationship between a person's internal, fixed physiological state and the outward projection of their underlying, mood-determining dispositions. In fact, the word temperament has its roots in Latin, meaning "proportioned mixture of elements". These elements, known as humours, were thought to determine the nature of an organism - throw off the balance by having too much or too little of one of them, and a person's character would be tilted a certain way.
As history unfolded, people expanded on the concept: the variability of personality must be grounded in some fundamentally different "essences" in people. With time, the articulation of those psychological phenotypes became more precise and less entangled with metaphysical vagueness, until we finally settled on the psychometric model we use today: the Big Five personality traits.
The Big Five constitute a taxonomy of human temperament, based on the most common behavioral characteristics running through the human population. Their methodology rests on a catalog of self-assignable language descriptors (as in "I pay attention to details" or "I am the life of the party") called the Big Five Aspect Scales. While these kinds of questionnaires aren't perfect, the results have proven to be a valuable measuring index for finding common behavioral traits. They can, for instance, predict informative things about an individual, like his or her political orientation, health behavior, and socioeconomic success.
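To make that methodology concrete, here's a minimal sketch of how such a questionnaire is typically scored - Likert responses averaged per trait, with reverse-keyed items flipped. The items and keys below are invented for illustration; the actual Big Five Aspect Scales come with their own item set and scoring key.

```python
# Minimal, illustrative scoring of a Big Five style questionnaire.
# The items and their trait keys are hypothetical, not the real BFAS set.

LIKERT_MAX = 5  # responses range from 1 (strongly disagree) to 5 (strongly agree)

# Each item: (statement, trait it loads on, reverse-keyed?)
ITEMS = [
    ("I pay attention to details.",   "Conscientiousness", False),
    ("I leave my belongings around.", "Conscientiousness", True),
    ("I am the life of the party.",   "Extraversion",      False),
    ("I don't talk a lot.",           "Extraversion",      True),
    ("I have a vivid imagination.",   "Openness",          False),
]

def score(responses):
    """Average responses per trait, flipping reverse-keyed items first."""
    totals, counts = {}, {}
    for (_, trait, reverse), answer in zip(ITEMS, responses):
        value = (LIKERT_MAX + 1 - answer) if reverse else answer
        totals[trait] = totals.get(trait, 0) + value
        counts[trait] = counts.get(trait, 0) + 1
    return {trait: totals[trait] / counts[trait] for trait in totals}

print(score([5, 2, 3, 4, 4]))
# {'Conscientiousness': 4.5, 'Extraversion': 2.5, 'Openness': 4.0}
```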
In theory, devices that allowed for the scanning of various biomarkers could exceed the accuracy of the assessment by automatically controlling for acute fluctuations in physiological factors - like a day's hormonal cycle - which would otherwise bias the outcome in one direction or another. (Think being sleep-deprived or hangry.) While we're already beginning to advance the methodology with genetic testing, for now we remain dependent on an inherently flawed way of testing personality. Not least because the creation of this factor analysis itself may have been heavily biased - who's to say that five is the right number of factors to measure something as dynamic and continuous as personality?
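The worry about the number of factors isn't idle: in practice, the factor count falls out of conventions applied to a correlation matrix of questionnaire items, not out of the data alone. A toy sketch - synthetic, uniformly random answers, so nothing here is real questionnaire data - shows that one popular convention, the Kaiser eigenvalue-greater-than-one rule, will happily "retain" factors from pure noise:

```python
# How "how many factors?" gets decided: eigenvalues of the item correlation
# matrix, filtered by a convention. Data here is pure noise, for illustration.
import numpy as np

rng = np.random.default_rng(0)
answers = rng.integers(1, 6, size=(1000, 25)).astype(float)  # 1000 people, 25 items

corr = np.corrcoef(answers, rowvar=False)           # 25 x 25 item correlations
eigenvalues = np.sort(np.linalg.eigvalsh(corr))[::-1]

# Kaiser criterion: keep factors with eigenvalue > 1 (one common convention).
n_factors = int((eigenvalues > 1.0).sum())
print(f"Factors retained from pure noise: {n_factors}")  # typically around a dozen
```

Different retention rules (scree plots, parallel analysis) give different counts on the same data - which is exactly why "five" is a modeling choice rather than a fact of nature.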
Anyhow - as our intuition already tells us on a visceral level, people's characters are distinguishable from one another through behavioral markers that point to their potential, perspectives, and aspirations. As such, the Big Five serve to make these observations more graspable for the pattern-seeking animals we are. They're a distillation of universal personality traits into higher levels of a structural hierarchy, forming a useful toolbox that allows a better understanding of ourselves, our peers, and our interpersonal dynamics. It's the most comprehensive device we have for exploring the individual as the atom of society, and it thus serves as an integral part of sociological research.
So, what are they?
As the name suggests, the Big Five quantify people's different temperaments along five major dimensions of personality, each hierarchically complemented by two sub-dimensions called facets (sketched as a data structure after the list). In order of the acronym O.C.E.A.N., they are:
- Openness:
  - Openness to Experience
  - Intellect
- Conscientiousness:
  - Industriousness
  - Orderliness
- Extraversion:
  - Enthusiasm
  - Assertiveness
- Agreeableness:
  - Compassion
  - Politeness
- Neuroticism:
  - Withdrawal
  - Volatility
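For readers who think in code, the hierarchy above is simple enough to express directly. A minimal data-structure sketch - the trait and facet names come straight from the list; the scoring helper and its numbers are purely illustrative:

```python
# The Big Five hierarchy (O.C.E.A.N.): each trait splits into two facets.
BIG_FIVE = {
    "Openness":          ("Openness to Experience", "Intellect"),
    "Conscientiousness": ("Industriousness", "Orderliness"),
    "Extraversion":      ("Enthusiasm", "Assertiveness"),
    "Agreeableness":     ("Compassion", "Politeness"),
    "Neuroticism":       ("Withdrawal", "Volatility"),
}

def trait_score(facet_scores, trait):
    """A trait score is simply the mean of its two facet scores."""
    facets = BIG_FIVE[trait]
    return sum(facet_scores[f] for f in facets) / len(facets)

# e.g. trait_score({"Enthusiasm": 3.5, "Assertiveness": 2.0}, "Extraversion") -> 2.75
```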
Notably, these personality metrics aren't mere products of cultural environment. Psychological differences between people are linked to subtle anatomical and endocrine profiles, many of which are genetically hard-coded. As such, each of the traits comes with its own biological underpinnings, creating the corresponding behavioral trends and making personality relatively stable over a lifetime. At the risk of oversimplifying, these underpinnings range from a hyperactive amygdala - the fear-processing center of the brain - to atypically pronounced activity of the prefrontal cortex - the part of the brain responsible for conscious decision-making. Those neurological profiles result in higher levels of neuroticism and conscientiousness, respectively.
Normalcy
Considering that these traits lie on a spectrum, it is possible to transgress the bounds of normalcy by being extraordinarily high or low in any of these factors. Some of the time, this will lead to functional impairment on a behavioral level - at which point we stop calling it personality and start declaring it a mental disorder. For instance: an extremely extroverted person will exhibit behavioral tendencies that make them appear manic. No agreeableness equals no empathy, which makes you a complete psychopath, according to the book. Too much neuroticism and you won't be leaving your house - agoraphobia is knocking on the door. Very high levels of openness lead to making connections between seemingly unrelated things - we call this creativity. Except when too much of it lets you see things that aren't actually there - then it becomes schizophrenia.
As you can see, the transition between a personality quirk and a full-blown derangement of the mind is rather fluid. "Mental disorders are often associated with maladaptive extremes of the Big Five traits," as evolutionary psychologist Geoffrey Miller put it. Very uncontroversial stuff. What is controversial, on the other hand, is the process by which these normalcy-defining bounds get set - a responsibility that lies entirely in the hands of the mental health institutions. While this thought itself is already quite disconcerting, it gets all the more interesting in light of the steep increase of mental disorders in adolescents, for which there are basically two explanations: either we have all suddenly gone mental, or the definition of mental was altered in a way that pathologizes normal human behavior.
I don't want to refute the case for the first possibility, as making it is easy: lack of social connections, the psychological pressure of social media and, perhaps most of all, the failure to raise awareness of the relationship between physiological health and its implications for psychology are all factors that make modern people mentally more distressed than necessary. If you compromise the cornerstones of physical wellbeing, you'll pay the price in the form of irritability, lack of executive control and mood swings. Sadly, this vital information seems as easily dismissed as your average alternative-medicine fad - making every licensed psychiatrist who acts surprised in the face of a depressed first world either a fool or a hypocrite.
At the same time - knowing about the ever-widening spectrum that defines a mental disorder - a motive of willful denormalization of normalcy takes shape. The threshold for a diagnosis gets lowered with each revision of the Diagnostic and Statistical Manual of Mental Disorders, making the new normal a state of disease that spreads from the individual to society as a whole like a psychiatric meme. Normal, correctly applied in a statistical sense, but not in an honest, medically legitimate one. Having already discussed the trend of misconstruing physiologically detrimental states as mental disorders, I'd like to focus on the second possibility this time. Because as it turns out, there are more than enough reasons to disassemble personal weaknesses and repurpose them as signs of an ill mind. Most of them, while seemingly in good faith, don't achieve what they set out to - at least if the goal is treating mental distress, whether or not your personality falls outside the spectrum of the ordinary.
For a start, let's consider that normalcy is broadcast as a set of factors in an individual that allow them to function properly in their environment. But with an environment that imposes certain behavioral standards, being normal doesn't mean what it used to mean. In fact, it's not the norm anymore - it's an ideal.
Culture’s Bias
Viewed through an evolutionary lens, none of the Big Five traits are inherently good or bad: instead, the benefits and hindrances they pose are largely contextual and entirely dependent on their functionality in our ever-changing environment. Being a novelty-seeking individual (trait Openness) can lead to valuable discoveries or obscure ways of getting killed. Being overly cautious (trait Neuroticism) may make you miss out on a reward or keep you out of danger - there are as many good reasons for taking a risk as for avoiding it. Being compliant and empathetic (trait Agreeableness) may help you raise your children but also makes you less assertive when confronted with other humans in a conflict.
A clear pattern emerges of organisms diversifying their approaches to adapting to a dynamic environment: the evolutionary selection process created psychological diversification, leading to the melting pot of personalities we live in today. That said, culture, environment and sex can create certain trends - like the average personality differences between the genders, or Brazilians' tendency to be extroverted. Nor should we leave out the symbiotic relationship between environment and genes.
In any case: societies themselves can be thought of as an environment one needs to adapt to. Consequently, humans have to obey the same filtering mechanisms that evolutionary principles impose on them in any other context. Certain physical traits tend to be advantageous for living and thriving in specific environments - and the same logic applies to psychological features and the societal firmware they interface with. One of the ways this plays out is how developed countries reward conscientiousness - the trait of delayed gratification.
Like any other personality trait, conscientiousness is neither good nor bad. But with our civilizations having reached a certain degree of stability, the trait's associated behavioral characteristics can manifest themselves more positively. If you think back a few millennia, it becomes less clear why humans would benefit from delaying gratification. Directing one's actions toward an uncertain future may not only have failed to pay off; the sacrifice of the present could have impeded the most pragmatic survival principle of them all: enjoy it while you can. A credo which undoubtedly led to a lot of offspring and the accumulation of energy stores in the form of fat.
Ignoring the evolutionary forces of instant gratification, however, would have been a risky strategy in circumstances that didn't allow for executing one's long-term vision - an ironic reversal of what we nowadays consider a safe way to conduct one's behavior. If you forwent pleasure-taking opportunities every time, your genome might go extinct because it puts too much importance on the future, in disregard of the present. It's therefore not unreasonable to theorize that the strong favoring of conscientious people only emerged during the first agricultural revolution, around 10,000 BC. The mechanisms of sexual selection likely caused an accelerated propagation of people high in the traits that allowed them to operate with a longer time frame in mind - as mentioned, a previously impractical feat to pull off. Subsequently, the increase in economic output allowed them to gather resources, trade and establish cruder hierarchies of power - ascension within which demanded a similar work ethic.
So, even though there are perfectly viable reasons to act impulsively, this didn't stop us from assigning people very low in conscientiousness the label of a mental disorder. ADHD describes the inability to keep your focus in the absence of real urgency. While there's certainly a pathological degree of low attention span, this hardly applies to the situations in which suspicions of the disorder first emerge. Not being able to gather up the motivation to read the book you got as a school assignment isn't that surprising, as any reason to do so demands that you discount the present for a future with a potentially larger reward - which, as we've just seen, is not the only evolutionary strategy humans are fond of. Just one of many.
Naturally, ADHD - or, to be more precise, the personality profile described by ADHD - is especially handicapping in modernity, where one is likely to be rewarded for continually working on something for years on end and gets left behind if he or she doesn't. At the same time, the panoply of pleasurable activities at our disposal is as seductive as ever and takes with it everyone who exhibits the slightest signs of impulsivity, levering normal human attention ability into the realm of a disorder. On top of that, one has to wonder why the default position should be one of urgency when confronted with the obligations of modern life - most of which don't show their bad consequences until many years later.
Despite society's denial of Social Darwinist practices, there's no escape from the principles of evolution - society openly rewards people who exhibit certain behavioral traits. Since it operates under an agenda that assumes equal potential in all people, it must respond to an individual's inability to thrive with coherent explanations. The resulting moral dilemma - people who don't conform are going to struggle - is made more palatable by inventing moral justifications (disorders) for altering consciousness to better adapt. It's drugging people under the guise of medicating them. Society preaches the virtues of being yourself while at the same time explaining to you who you ought to be. It's a tendon-ripping case of mental gymnastics.
Doctor’s Bias
Receiving a diagnosis depends not only on one's own subjectivity in laying out the DSM's promiscuous criteria, but also on the readiness with which the psychiatrist agrees with them. Consider that the preselected pool of people wanting to work in the health industry is already tilted toward humans with traits not reflected in their patients: psychiatrists, or doctors in general, are rather agreeable, non-neurotic and conscientious people. They are likely to perceive the world on different wavelengths than the people seeking help. Therefore, an accomplished health care practitioner will have a hard time identifying with a person whose personality and life conditions are the polar opposite of their own - and will be further biased toward diagnosing, because to him or her, the patient's behavior will look abnormal enough through the sheer power of contrast. It looks less normal than it actually is. In the end, though, it's the physician who is not normal.
This makes for a perceptual gap between the consultant and the patient - one that shows whenever anxiety and depression are discussed, since the discussion almost always neglects any real-life justification for those feelings. Someone who's a doctor is just not likely to consider slightly-above-average levels of neuroticism, or lower-than-average levels of conscientiousness, as normal - even though they are.
The Modern Denial of Suffering
Life is pain, highness. Anyone who says differently is selling something.
- William Goldman
While labels of mental disorders and subsequent pharmacological intervention can serve as a way of self-domesticating the human race, that's not all there is to them. The increase in mentally disordered people points to a new way of dealing with the existential condition.
The way this plays out is that, whenever we feel too bad for too long, we are encouraged to believe that we're suffering from a disorder - even though our experience can be considered to lie in the realm of normal emotions. And whenever we seek help to deal with these feelings life can cause, the false allure of modern-day psychiatry tempts us to buy into an idea that is too good to be true: that we are right.
Wait. "Why would we want to be right?" one might ask. Why would we want to believe that our depression is a sign of a faulty brain rather than a normal reaction to life? Because the idea that your suffering might have nothing to do with being you, but with being itself, is exactly the kind of disturbing thought we'd rather sweep under the rug. After all, many of the things that bring us mental distress used to be thought of as qualities of existence itself. But viewing them as features of your own mind allows you to rationalize life's suffering by shifting the blame away from existential conditions and toward something that can allegedly be targeted. The idea that a disorder is responsible for everything wrong with your life - that there's an invisible perpetrator inhabiting your brain - gives you something to aim at, which can be quite the relieving experience. It sends the message that if you only accepted its existence, you'd be on the right path to eliminating it.
In his play "The Cocktail Party", T. S. Eliot captures this human desire in the character of Celia Coplestone. While talking to a psychiatrist at a party, Celia uses the opportunity to seek a quick consultation for her misery:
Well, there are two things I can’t understand,
Which you might consider symptoms. But first I must tell you
That I should really like to think there’s something wrong with me—
Because, if there isn’t, then there’s something wrong,
Or at least, very different from what it seemed to be,
With the world itself—and that’s much more frightening!
That would be terrible. So I’d rather believe
There is something wrong with me, that could be put right.
I’d do anything you told me, to get back to normality
Celia expresses the desire to get back to normality. But what if normality was only a mental construct to begin with? A defense mechanism that broke once she was faced with the more painful aspects of life. Now she's left with the task of rebuilding her mental image of the world - not an easy feat, and one that creates urgency for an easier answer. And there's no shortage of people willing to sell her one.
So, undoubtedly, some credit for the increasing number of mental disorders goes to the human desire to rationalize one's unhappiness and to act against its perceived cause - we like to self-diagnose. Yet one can't blame the human mind without mentioning the facilitating role the Diagnostic and Statistical Manual of Mental Disorders plays in this matter.
The DSM
The DSM's definitions of disorders - resembling the genericness of a tabloid horoscope - illuminated a path to the doctor's office for those who were initially uncertain about their psychological health, with a prescriptivist rhetoric suggesting that every little uncomfortable fact of life is an illness. The variety of disorders directly facilitates the desire to distance ourselves from the pain we experience: it offers everyone a unique platform to express their individual psychological inadequacy and dissatisfaction with life. After all, the word disorder carries a negative connotation. It implies something bad, something out of the ordinary, something other than it should be. The DSM offers you an explanation for whatever misery you're experiencing - tailor-cut to your personality.
In most cases, the multiplicity of mental diseases can't do anything a "General Existential Disorder" couldn't. Yet while the GED would make the diagnostic process arguably easier and more efficient, it doesn't acknowledge one's unique brand of suffering. The increasing compartmentalization of mental distress - creating an ever-growing palette of terms to describe our own shortcomings - is a necessary feature to provide the much-needed aura of psychiatric legitimacy. Depending on your level of cynicism, the labels could also be viewed as bespoke drug ads for every personality demographic. Which is a stroke of genius:
If you're alive, you've got problems. And what every perceived fault in your life will have in common is your fingerprints all over it. Going around and using those fingerprints as evidence that it's only you who has them makes for quite an enticing opportunity for financial enrichment - which is what the pharmaceutical companies are doing.
It seems as if Allen Frances, chair of the task force behind the manual's fourth revision, adopted this view. He has sentenced himself to a life of displaying regret over the lack of engagement he showed in not warning against the disease inflation enabled in part by the DSM - a development that, as he somewhat reluctantly admits, mostly benefits the people selling the antidote. For what it's worth, he largely redeemed himself by writing down his own experiences in a cathartic exercise of a book:
We should have been far more active in educating the field and prospective patients about the risks of overdiagnosis. There should have been prominent cautions in DSM-IV warning about overdiagnosis and providing tips on how to avoid it. We should have organized professional and public conferences and educational campaigns to counteract drug company propaganda. None of this occurred to anyone at the time. No one dreamed that drug company advertising would explode three years after the publication of DSM-IV or that there would be the huge epidemics of ADHD, autism, and bipolar disorder—and therefore no one felt any urgency to prevent them.
Summing up, the shape-shifting properties of general unhappiness play very nicely into the act of diagnosing a mental disorder, since a frustrated person's life can show its mercilessness in every way imaginable - most likely, though, it will be expressed through the behavioral avenues directed by the same thing that creates personality. Personality becomes the scapegoat for your problems, as it is the lens through which you experience life. As such, any kind of misery you experience will be retrofitted to the mind-pathology best matching your character - and which one that's going to be is decided in a 15-minute session of lopsided soul-searching in the psychiatrist's office.
In Filth It Will Be Found
Now, none of this is to say that it isn't your individuality screwing up your life. It's that we are encouraged to view the problems we encounter as the product of a biological error - even though every painful perception can be viewed as a biological error. But until we realize the transhumanists' wet dream of making suffering an artifact of a dystopian past, we need a different strategy. Thus, allow me to put a proposal forward: maybe a lot of anxiety and depression is the result of denying life's challenges rather than confronting them.
Society values emotional safety above everything else - but it also misses how neglecting life's suffering doesn't make it vanish. Rather, it intensifies thoughts of meaninglessness and fear of the future, for there is nothing in the present fulfilling enough to make these thoughts endurable; no feeling of victory that comes from successfully confronting a challenge. After all, we are told: there are no challenges. There's no reason to fear, nor to look into the future any way but optimistically. By pathologizing the experience of the average person, we are told that feeling anything less than content with our lives all the time is a symptom of a deranged mind.
Maybe a mental disorder is a sign of a healthy mind in a sick society. Perhaps, subjected to the story of an easy life that doesn't demand anything from us, we lose our sense of self-development and progress toward a meaningful goal, since everyone gets a trophy - which is, of course, a lie with consequences of ego-shattering degree once one gets a glimpse of reality. Society's effort to paint a picture of an egalitarian, non-threatening, non-competitive world order is not only a statistical impossibility, but has introduced unwelcome repercussions: creating more uncertainty than necessary taints people's perception of their own mental health in a self-fulfilling prophecy.
There's an old Latin maxim - in sterquiliniis invenitur - telling us about the importance of overcoming primal fear responses to transform ourselves into stronger individuals. "In filth it will be found," it says: the challenges we dread most are the ones we need to confront to become better people. Transcending our limitations is what expands them, through the willful exploration of unknown territory. Where it becomes uncomfortable is where the learning begins. When familiarity sets in and mastery is achieved, the cycle repeats. It's the engine driving the feeling of progress, the dynamic revealing itself to us as the feeling of a meaningful experience.
At the same time, what people like to call irrational fear is often not so irrational after all - it's just a higher sensitivity to the mental picture of the worst possible outcome. You don't overcome this hurdle by eliminating it, but by acting in spite of it: acclimating to a stronger mindset by accumulating experiences of repeated exposure to the projection of our inner demons. It's what the highly effective exposure treatment is based upon. Whenever we side with comfort, though, we rob ourselves of the chance to realize our potential and experience satisfaction. We slowly forget how to differentiate a threat from a challenge, denying the subsequent development of the self; we're made to believe in an ideal world, a thought we love to hold on to, for it eases the burden of existence - the harsh reality that life is essentially unfair and demands attention, hard work and the willingness to let go of any preconceptions in order to orient ourselves to newfound circumstances. Fearing life is not the exception; it's the default.
But that’s not what we’re told.
The broadness of the DSM criteria is the assurance that it's not your fault - you're the victim of an illness. It subtly takes away our confidence in our own capability to see life as the challenge it must be framed as in order to succeed in it: your "response ability". The indirect discouragement of this way of thinking mutilates our ability to overcome the things we fear, generating counterproductive psychological effects and leaving behind a person weaker than he or she needed to be. Not only does this way of thinking take away your responsibility for suffering the emotional results your actions brought you, it neglects their existence. All it accomplishes is laying the groundwork for a fragile sense of self - one that is likely to collapse in the face of any physiological or psychological inconvenience too intense to bear, flooding the gates with all the unresolved emotional problems at once - problems whose defense should have been built up through consistent self-development over the years.
The problem isn't the categorization of people's sometimes extreme mental distress. It's that the categories are large enough to allow any of life's tragedies to be dismissed as something akin to a cold virus, robbing us of incentives to make things right on our own - and thus to experience what often cures it most effectively. Institutionalized mental health care makes way for the mutual agreement that life isn't so hard, we can all relax, and if you can't, don't worry, there's something wrong with your brain, but it can be fixed. A suspiciously instantaneous solution to a problem arguably embedded in the structure of nature herself. There's no way to exit the game except death - a rising tide, indicating society's unwillingness to teach people its rules.
The Utility of Psychopharmaca
I'm not trying to discard the utility of pharmaceuticals or any other kind of intervention, and I'm not so old-fashioned as to ignore the materialistic basis of these problems. It's that it has become too easy to offload the responsibility of trying to overcome life's problems onto some mysterious and fuzzily defined biological disadvantage - and thus miss out on the painful but more effective ways of learning to deal with life and the personality through which you experience it.
It's also not to say that people aren't correct in their assumption that their unique biology makes them suffer in ways that correspond to one of the mental disorders. Some of the time, certain personality traits will interfere with the goals one sets for oneself, such that using drugs as a counter-strategy seems like an option worthy of contemplation. That still doesn't make the person disordered in the sense society conveys; rather, it makes these traits aspects of one's character that can be compensated for by means of a drug, in pursuit of goals aligned with societal ideals - or one's own, for that matter. Ironing out such flaws is what life consists of anyhow.
But instead of being framed as the assistive tools for mental wellbeing they should be, these drugs are popularly portrayed as something more akin to a prosthesis - which achieves the opposite of the intended effect: you don't want to take away a person's sense of progressing toward their own betterment by telling them that they're already there as long as they're taking their medication. They will never be there, as there's no there there. Moreover, the magical sales aura of these drugs isn't true to reality - their efficacy often leaves a lot to be desired. Not least because they tend to be prescribed before comparatively benign health-related causes for the disordered state of mind are looked into - causes the drugs still can't easily compensate for.
I can also see another partial explanation for the increased suicide rate among people taking antidepressants: the diagnosis and consecutive pharmacological treatment can lead to an involuntary feedback loop of self-deception, as the belief in the reality of a disordered psyche manifests itself as soon as some of the treatment methods begin to work. However, once the initial placebo effect and mild psychoactive kick-start begin to subside, individuals will be left searching for any indication that they are indeed biologically equipped with a malfunctioning brain, justifying another round-trip to the psychiatrist's office. The DSM - always lowering the threshold that leads them to make an appointment in the first place - creates a constant sense of impairment, reinforcing the belief in the neurochemical explanation for one's dire state of mind. When someone wrongly diagnoses you with cancer, you will find symptoms.
This applies especially to the most neurotic people, as they're the ones most likely to think they are mentally disordered (explaining their disproportionate prevalence in any sample of people who have visited a psychiatrist). Anxiety-predisposed individuals will suffer the most under half-true diagnoses, for their hypochondriac tendencies can become exaggerated: once they are convinced that their brain is broken, and the drugs have stopped working, it's hard not to become hopeless at the thought that even the most sophisticated pharmacological means can't mend their malfunctioning psyche. Life does contain as much suffering as they feared. A soul-crushing experience that could have been dampened by telling them outright that there are no simple solutions, mentally preparing them for the hard road that lies ahead.
There’s a flipside to this coin as well: You don’t need to get your character pathologized in order to take advantage of the practical utility certain pharmaceuticals can provide. Having to obtain the label of a disorder before getting access to these drugs is humiliating and only necessary because society can’t get past its Freudian complex that changing your psyche is unethical. It’s insisting that we all experience life with the same amount of motivation, satisfaction and anxiety, and using drugs only gets a moral pass once you are under that baseline. But are you really depressed just because taking some Prozac improves your life? Do you have an anxiety disorder because you get nervous around new people, and GABAergic drugs take that pain away? Do you have ADHD because Adderall makes you more willing to focus?
Maybe. But in the end, what does it matter? If you get an advantage out of it, go ahead. There's no need to think of yourself as disordered just to benefit from some of these drugs' properties. We drink coffee, we consume alcohol and we smoke cigarettes - all drugs that could be prescribed as antidotes for many of the mental disorders in an alternate universe. Since almost the whole world indulges in at least one of them regularly, one has to ask whether these people would unveil a disordered state of mind once they stopped. Probably - because the drugs gave them an advantage they didn't realize they were benefiting from all along.
Conclusion
Mental health institutions opened the door for explanations of life's intrinsically unfair, dreadful conditions - unsurprisingly, people ran the door in. And since converging on an ideal is impossible, the DSM's creators inevitably found themselves broadening the criteria until they encompassed what would have been considered normal only decades ago.
Yes, language is inherently pragmatic, and becomes more so the closer the world gets, culturally and geodemographically. But when words general enough to describe a common theme of negative affect are turned into diagnostic labels backed by the authority of health institutions, everyone will inevitably begin to contemplate whether his bad mood could be a disease in need of a reckoning whenever life gets too inconvenient.
What might sound like an advisable, prophylactic thing to do poses a non-insignificant risk to the more vulnerable demographics - like the emotionally labile, or children and teenagers - who, convinced that their periodic sadness is of pathological quality, can become subject to potentially harmful pharmacological interventions, resembling the numerous unnecessary operations surgeons perform at the patient's wishes and, seemingly, their own. This overly simplistic way of dealing with the complexity of human psychology can stunt personal development by leaving behind opportunities for character growth. Moreover, it removes any incentive to apply more health-focused strategies to regain a sense of lost mental wellbeing.
Society has started to allow viewing each and every personal weakness or failure in life as attributable to a state of disorder - an illness that can be cured. Naturally, people's proclivities to be lazy, anxious or all over the place are the prime character traits to be transformed into clinical pathologies. These days, such traits work either to the detriment of the individual or simply don't bear any merit - arguably a disadvantage in itself in the highly competitive cultural context of current times. For what it's worth, though, that doesn't make them disorders. Neither do low attractiveness, short stature or a below-average IQ, even though these things certainly matter in terms of personal success - or climbing the socioeconomic ladder.
I get it: it's nice not to be anxious in social situations. It's nice to wake up optimistic. It's nice to be motivated to work on something that pays off only years later. But there's no reason to believe that this is normal. The idea that it is is what's being sold to you, and we buy into it because it's an appealing thought: that all these things would start happening if we only acknowledged that a mental disorder causes our inability to feel that way. The mass appeal of this notion allowed psychiatric drugs to transform from a niche product reserved for people in mental institutions into a mainstream psychoactive - all by moving the goalposts of the normal distribution closer to the middle.
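To see how much work that goalpost-moving does, consider a back-of-the-envelope sketch. It assumes trait scores are normally distributed and the five traits are independent - both idealizations, chosen only to make the arithmetic visible:

```python
# How diagnostic prevalence explodes as the cutoff moves toward the middle.
# Assumptions (idealized): trait scores ~ N(0, 1); five independent traits;
# "pathological" means falling beyond the cutoff in either tail of a trait.
from statistics import NormalDist

z = NormalDist()  # standard normal

for cutoff_sd in (2.5, 2.0, 1.5, 1.0):
    p_trait = 2 * (1 - z.cdf(cutoff_sd))   # either tail of a single trait
    p_any = 1 - (1 - p_trait) ** 5         # beyond the cutoff in >= 1 of 5 traits
    print(f"cutoff {cutoff_sd} SD: {p_trait:5.1%} per trait, "
          f"{p_any:5.1%} 'disordered' overall")

# cutoff 2.5 SD:  1.2% per trait,  6.1% 'disordered' overall
# cutoff 2.0 SD:  4.6% per trait, 20.8% 'disordered' overall
# cutoff 1.5 SD: 13.4% per trait, 51.2% 'disordered' overall
# cutoff 1.0 SD: 31.7% per trait, 85.2% 'disordered' overall
```

Halving the cutoff turns a rare affliction into a majority condition - without anything changing in the population itself.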
The scattering of authorities deciding what is and what isn't disordered doesn't help either: it makes the most corrupt healthcare practitioner the one setting the bar for what is worth prescribing medication for. On top of that, the doctors who make the diagnoses are already a product of sampling bias, and they propagate their own personality standards through the population. People working in the healthcare industry are just not very likely to exhibit the same behavioral tendencies as their troubled patients - the filtering mechanisms their job imposes will only let through people with a certain personality, mostly the exact opposite of their patients'. All of which leads to great amounts of emotional dissonance between them and the people they treat.
What saddens me the most is how socially acceptable it has become to make our personality the scapegoat for our misery whenever we’re in a bad place. Realizing that your life isn’t the way you want it to be is not a sign of a disorder - it’s a reminder that this is how things work.
In light of this, treating our character like an affront to a messianic ideal would make everyone feel inferior. Getting the message that you do actually lack the capacity to make things better - it's not your fault - might be relieving at first, but there's a price to pay: you might never see the better version of yourself that you could have become, because no one told you you were able to when you needed to hear it the most.
Life is a pretty depressing and frustrating affair - a fact we're bound to remember as soon as more than half the population fits the DSM's definition of pathological, successfully inverting the definition of normalcy. At this point, that's only a matter of a few generational reiterations of the manual. If we continue this pace of diagnosing - inventing new criteria as we go along - everyone will soon have a label for their unique mental disorder. The labels are going to become so specialized that they can't be separated into smaller parts; they are going to be individual - as individual as the set of inadequacies each person has to bear.
Wouldn’t it be easier if we just continued using our names?