The Art of Begging

What do you first think of when you hear the word “begging”?

For me, growing up in The Netherlands where there was social security, it used to be the hungry children in developing countries I saw on TV.

It wasn’t until I started travelling abroad that I noticed people at night checking garbage bins for something worthwhile. Of course, it is perfectly possible that this also happened in the big cities in The Netherlands, but I had never seen it and being confronted with it in Athens, Brussels and New York – places I had considered rich and therefore capable of providing their citizens with at least enough to eat and a place to live – shocked me.

Since then I’ve learned that the wealth of a country that calls itself democratic is not a reflection of the wealth of its citizens (its voting population) and I have become somewhat used to seeing people sit on the pavement in a busy shopping area with a sign and a pot asking for money – even though it makes me uncomfortable.

But this post is not intended as a discussion of the rights and wrongs of western economy and politics or the poverty of a large part of the world population.

I want to discuss the act of begging itself.

According to the Merriam-Webster online dictionary, “to beg” means “to ask for as a charity”, “to entreat” (plead urgently) or “to require as necessary or appropriate”.

Considering these definitions, the child begging on the street corner for a bit of money to pay for food, clothing or school books is asking for charity for himself as a necessity – not just in his own eyes, but in those of most people – and this requirement is urgent, so that it can be considered appropriate. Though it makes me sad that it is necessary for many people to beg for their basic needs of life, I have no ethical issue with the act of begging for this purpose.

What I do have an issue with is the begging that is required of western children (often those of well-off parents) in order to help pay for a communal project or a larger charity, in which the only role for the children involved is that of the beggar – not the beneficiary or the organizer. In other words, it is not their project but a project that is imposed upon them and whose details they may not even know or agree with. They are not asking for themselves but for a third party.

Thus, they may be asking for a charity and this may even be an urgent need, but the individual beggar has no necessity, and my issue here is to question its appropriateness.

Now there are different angles from which to approach this topic, and how each person feels about it depends on their personality type.

There are those who consider it a good way to instil in children a sense of community and responsibility and to create awareness of how good they have it compared to those the charity is for – it can be a larger project like “feed the world”, an awareness project to support a minority group, a project for research (cancer charities) or a project that has to do with raising money for a local club or hospital.

The other view – and the population is roughly 50:50 divided according to personality type theory – considers this manner of begging an ethical insult to their autonomy for two reasons: firstly, being forced to participate because it is expected (the sense of making the act of begging into a moral obligation) and secondly, the act of begging itself being presented as a legitimate way to obtain funds rather than working for the money.

Note that (despite my own preference) I do not say that either view – “group-responsibility” or “self-accountability” – is right or wrong, since these differences are directly related to a person’s psychological type and thus to one’s inborn perspective of life.

I have defined “group responsibility” as “the sense of belonging to the community one lives in and the duty of every member to partake in what needs to be done” (Concerto for Mankind: 351). In other words, the community is expected to take priority over the individual; the responsibility to obey the moral values of the group and to do one’s duty comes with being a member of the group, regardless of whether that membership was a choice. Thus being born in a certain country or being legally obliged to attend school makes one a member with obligations.

I have defined “self-accountability” as “the expectation that a person actively chooses his membership in groups and is fully accountable for the obligations that follow from this choice” (Concerto for Mankind: 372). Thus the group is expected to allow for the views of each individual and being born in a place or a legal requirement to attend school does not make one morally responsible for participating in that group since it was not an autonomous decision.

Again, this is an inborn perspective difference that cannot be changed. Western society currently supports the first perspective and thus believes that all citizens should share this view or else be considered undemocratic, selfish, manipulative, psychopathic, antisocial and so on. Today, with the risk of pre-emptive justice based on such labels greater than ever, it is extremely important that people become aware of these different perspectives as being different expressions of normal human psychology.

Allow me some personal examples.

When I was young, all the fifth and sixth graders of both public and religious primary schools were supposed to go door to door to sell “children’s stamps” – specially printed sets of postage stamps and postcards of which the proceeds went to a nationwide charity – once a year. Not only was each child expected to participate, but it was also a competition between schools, with some schools allowing their students to leave fifteen minutes early so as to get a head start.

I absolutely detested that project, because it is not in my nature to ring the doorbell of a stranger to try and sell them something and because asking for money (regardless of whether it is in exchange for goods) is something I don’t like doing. Today, knowing personality type theory, I know why that is and I know that I’m not alone in this view, but when I was ten and eleven I felt forced to go to a few neighbours I knew and have my family buy the rest, to prevent being berated at school by teachers saying I was antisocial, did not care about the poor and was not helping the school win the most points.

Later, when I had my own family in the UK, the US, Australia and New Zealand, I had children ringing my doorbell asking for money (usually in exchange for chocolate bars) for their local scout group, their local netball team or their school. My daughters have occasionally been given boxes of chocolate with the expectation that they sell them – by the same schools that lecture these children about how bad chocolate is for your health.

If the child at the door was one I knew, I’d usually buy something, so as not to burden them with my political views (for that same reason I gave my own children the choice whether they wanted to participate); otherwise I turned those children away, especially those that wore some kind of uniform (scouts or school).

In New Zealand they regularly have high school children (in uniform) standing at street corners or in supermarkets trying to get shoppers to part with their money before buying their groceries – yet those same schools complain that parents are not giving their children proper lunches or enough milk. And as far as I know, scouts are supposed to help people in need, not exploit others by putting children out to beg.

My policy with any charity is that you can put things in my mailbox to make me aware of your existence, but if I want to donate I will come to you. The moment somebody accosts me, I feel that my privacy is invaded and I turn them down on principle, regardless of what the cause is.

If a charity uses school children (during school hours), then in my view they are using class time to teach them how to beg. The key to the art involved is the use of the innocent-looking and the obedient (in uniform), and the objective is to make the public feel guilty or ashamed and so entice them to donate.

This appeal to shame or guilt – if you don’t give, your neighbours may see you – I consider totally unethical. Besides it being emotional blackmail, they are teaching children that it is okay to beg and that peer pressure is an acceptable tactic to force people into compliance.

Even more unethical, in my view, is the begging we are starting to see more and more on “reality TV”, where some so-called charity offers to change somebody’s home (usually with a sob story as an excuse) but without the budget (from the TV producer or network) to actually pull it off, so they invade a local store (usually a small business trying to make ends meet) and literally ask them for free donations of material. The store owner has no choice but to say yes – after all, there is a guy with a camera standing behind them – while the presenter of these programmes is no doubt getting a fat salary out of the deal, as is the producer.

This attitude, in which school children and viewers are made to believe that begging is a moral good, conflates the sad truth that some people do not have enough to eat with the calculated choice of asking for favours through guilt and shame induction.

To me, begging is a desperate last resort for people who have nothing and it should not be made into a moral obligation.

The Information Age and the Persona

I have previously mentioned “the information age” and stressed that we are actually living in an information “avalanche”, which causes many people to pay attention mostly to superficial data and do very little actual background reading. The idea that people are better informed is therefore an unsubstantiated assumption, and, since an uninformed population is at risk of becoming dogmatic, this is a concern.

Today, I want to focus on the positive aspects of “the information age”.

Having grown up in a time when there was no internet, no video, CD or DVD, no home computers, no eBooks and no mobile phones – yes, cars existed already and the dinosaurs were extinct – I am still amazed by the amount of interaction that has become available to people.

If I want to know something, I simply “google” it. When looking to buy a house, I can do all the price evaluations from my home computer and be completely informed. As a writer, I can have my books made into electronic files and send them all over the world. And just as it has become easy for me to publish my books, so other people have found opportunities they would not have had before. Musicians record their work and put it on YouTube; photographers can share their work on Pinterest, as can decorators and designers. Other people create their own online magazines and become editors, or they have their own radio show and do interviews. Anybody with a blog is in effect writing weekly or monthly columns.

And they all help each other. There are websites available, like sourcebottle.com and pitchrate.com, which bring writers (journalists or bloggers) in contact with “experts”. To sign up as an expert is free, and anybody who has some special interest or knowledge can be classed as an expert at something and thus find requests to respond to. To be asked for an article or an interview validates people; it makes them feel worthwhile, and there is nothing more important to a person’s self-esteem than feeling worthwhile.

And this is more than obvious with today’s older generation.

Not too long ago, old people who went to a nursing home were considered no longer useful to society – whether it was said like that or not. Very often they lost hope or their “joie de vivre”, and this had an impact on their lifespan. Today they are finding it back, that sense of being worth something, and it is thanks to the internet. I was jokingly told – even if it is really rather sad – that this is because their children and grandchildren, who cannot be bothered visiting very often, give them laptops as a present.

But it works; suddenly the old man who thought he was useless is giving advice over the web to young people who lack his experience. Nobody asks for his age; nobody thinks they should talk slowly because he is a bit deaf or explain things in a “poor-you” voice because he walks with a stick. Suddenly he is an expert and gets told he is such a great help. Suddenly he is free to choose what name other people use when addressing him, instead of young doctors and nurses using his first name unasked.

The same applies to other people who are isolated – the farmer who doesn’t get to town very often, the mother who is cooped up at home with little kids all day or the child who has trouble finding friends at school and who worries about not being pretty, smart or athletic enough to be liked by the in-group. They now find real friends on the web.

I say “real friends”, because even though many of them are physically distant, those that do later meet up find they already have a strong bond; looks are hardly noticed because they already connected on a deeper level – they found their soul mate.

This deeper level, of course, is their personality type; the only thing that is as deeply rooted in a person as their gender. It has been said that if all temporal, spatial, cultural and social boundaries were removed, the same people would still find each other.

That is because the persona – Jung’s word for the social mask people show the outside world – is removed. All the prejudices that go with age, gender, race and social status or culture (often associated with a person’s real name) are of no importance. People meet on the basis of interest (fandoms, Facebook and LinkedIn groups). All that is important in these places is what people actually say: how they present their thoughts or beliefs and what they know. Different types of people are interested in different topics – yet there is an overlap, which makes for the different opinions within groups. The good thing about social networks is that they are based on equality and not on arbitrary standards.

And not only that, but thanks to the opportunity to write under a penname, people often express what they otherwise would not. Fan fiction websites especially are a fascinating place to study psychology. The topics of the stories give a lot away about what is important to people – things they would never reveal to a psychologist in a survey and probably not even to their best friend.

Other aspects of the information age are just as exciting, like scientists reaching out to the general public (especially in astronomy) and finding that much more information becomes available when any person is allowed to participate on a voluntary basis. The amount of effort people are willing to put into work (vocations) that was previously considered specialist jobs is unlimited if they feel they get a return for it – not in pay but in acknowledgment, because they are following their natural talent. Could a society be founded upon this?

I think it is not only possible, but it is the only possible way forward. Yes, we live in an information age: to the average individual there is much more information available than ever before – that is access to the collective human databank – and more than ever that requires us to be alert to dogma. In fact, the cornerstones of our society (politics, justice and education) need to adapt to the new focus.

It is simply not good enough for schools, for example, to teach the same topics they always have, with the only difference being that kids get to use the computer to do the work. Instead of focusing on the contents of subjects (like geography or science) – facts to which every child today has instant access – schools should help them deal with information itself: focus on how to do research, how to communicate their ideas, how to prioritize and how to summarize, since those are the skills kids are going to need. Thus, schools need to shift their approach from contents to method.

The same applies to the political system; it is useless to have elected politicians chosen by promises (content) made in public places or on TV and written for them by speech writers. Instead they should prove that they are natural leaders by engaging in serious discussions with any person who wants to raise an issue; prove they have insight and are capable of answering questions, prioritizing, making decisions for reasons of necessity (not votes) and dealing with controversy – skills.

And so, distributive justice should be adapted to the new era. First of all, we need to realize that, despite calling ours “the information age”, information has always been at the centre of existence. Information is not something new that we have invented and information is not a possession that one individual (or culture) can hold private. We are so intricately linked to information (as individuals and as a group) that to treat it like a possession is like treating existence itself as a possession; it is like a flea on the top of your head claiming that it owns you.

As discussed above, information is being shared more and more freely, and that is a good thing, as it helps all of humanity toward progress. As I repeatedly mention in my The Music of Life series and explain in my previous post “A Brief Overview of Typology”, information drives the personality type differences. They are not a choice people have or something they learn; they are a reflection of the aspects of information, and without these differences complementing each other – as in each person being tuned into a slightly different aspect – humanity itself could not have evolved to the level of civilization it is at now.

A Dangerous Misconception

This last week has been one of frantic e-mail exchanges and posts made on group sites and pages whose topic is the personality type differences based on Jung, Myers-Briggs or Keirsey (the psychological types) that define and explain everything that motivates and influences every person.

I initiated this action in response to an article found on healthline.com that, in an effort to describe ‘antisocial personality disorder’, referred to some of the psychological types as “abnormal”:

“Every person’s personality is unique. However, social scientists have identified distinct characteristics of personalities that can be assigned to specific categories.

One common way to label personalities is with the Myers-Briggs Personality Type Indicator (MBTI). The MBTI lists 16 types of personalities. Some types are considered abnormal. People who have those types have a mental health condition known as a personality disorder. Antisocial personality disorder (ASPD) is one type of these disorders.

The symptoms of this condition tend to worsen during late teenage years to early twenties. Treatment may help improve symptoms. Symptoms can improve with age for some people, allowing them to feel and act better by the time they reach their forties.”

The article (www.healthline.com/health/antisocial-personality-disorder) displays a gross ignorance of the type differences and of the basic principles of the theory, and it sparked many furious reactions from those perfectly normal types that were in essence being called psychopaths.

My own first response was to write to Healthline and then to alert every psychological-type-friendly Facebook and LinkedIn group to this misconception – the latter generated much support, and many also decided to write to Healthline with the request that they correct their mistake.

Healthline itself responded to me within two days with a link to a scientific study by some members of the University of Colorado (http://www.uccs.edu/Documents/dsegal/An-empirical-investigation-Jungs-types-and-PD-features-JPT-2.pdf) on which the article was based, which implied that they consider the conclusions of the article (since it was based on an academic study) by definition correct. This is itself an assumption, since the writer of the article drew conclusions over and beyond those of the study. Additionally, I found that the study itself has many flaws that make its value questionable.

The article is flawed not only because it shows a complete ignorance with regard to the psychological types and the theory behind them, but also because of the conclusions it draws. It goes on to mention that ASPD is being treated with medication or CBT, which implies that it is not a personality type but a temporary condition (as the types are inborn), and that some types “are considered abnormal”, which implies that either the MBTI makes this distinction or the study does, neither of which is true. By ‘abusing’ the word “personality” (see blog post The Abuse of Personality) the writer confuses inborn dispositions with mental illnesses.

The study on which Healthline says the article is based also rests on many assumptions:

The title of the study is “An Empirical Investigation of Jung’s Psychological Types and Personality Disorder Features”, although its bibliography does not list any of Jung’s works at all and has only the CCP Manual to the use of the Myers-Briggs Indicator, with no reference to any other books that explain the types, nor to Isabel Myers or David Keirsey.

The sample they use for their study consisted of people recruited by psychology students, which implies they already had a certain outcome in mind and certain beliefs with regard to what is accepted as human nature – most academic psychology departments do not acknowledge multiple types of people – which is a result of the “brain story theory” currently accepted in the academic world (see my blog post The Black and White of Grey Matter).

The conclusion that there is a correlation between the characteristics they attribute to “personality disorders” and the MBTI personality trait descriptions is nothing new to any person who understands what the psychological types are about. Jung himself already made a study of psychological types in relation to psychiatric disorders, because that is how he got to the idea in the first place. The difference is that Jung and his followers consider personality differences healthy and acceptable differences in normal people, while the researchers consider these differences unhealthy.

The study makes a caveat that advises further investigation since

“Jungian types were not verified by any instrument other than the MBTI and personality disorders or their features were not verified by any instrument other than the CATI. In addition, both instruments are self-report….”

Of course, self-report is the only possible manner of measuring types, and the mention that personality disorders are based on self-report may be unavoidable for the same reason, at which point it becomes a question of who accepts that what they experience is a sign of abnormality and labels themselves thus, and who chooses to explain what they experience as a type trait.

This ‘choice’ in turn depends on the personality type of the people involved. Some types are keen to trust authority or popular views and other types choose to make their own assessments.

The study also states that “Lifetime prevalence rates for personality disorders in the general population are estimated to range from 8% to 13%… The present sample was expected to have sufficient levels of personality disorder traits or features in order to conduct the present investigation”.

Thus, the study set out with the assumption that the disorders listed in the DSM-IV (because it has an official name) are objective facts rather than a collection of traits that are given a special name when enough people are found who match the description – forgetting that most people who report these descriptions do so because they have learned that the disorders exist as a fact. In other words, the more popular or accepted a ‘disorder’ becomes, the more people will either self-report having it or label others thus, and it is on this, in turn, that the existence of the disorder is based.

The sample used is drawn from the general population (“The present sample is limited by a non-clinical sample of convenience”) and not from clinically ill psychiatric patients, yet the study is happy to label certain types “schizotypal” (a new fancy word for “psychotic”) and “antisocial” (which used to be called “sociopath” or “psychopath”), along with a whole range of descriptions that used to be considered “neurotic”.

However, such descriptions of syndromes used to refer to temporary problems (like depression), yet they are now suddenly framed in terms of “lifetime prevalence”. This confuses a state of unwellness (real depression) with a melancholic disposition – which indeed fits certain types more, even if not the ones the study mentions – and makes the same mistake as the writer of the article mentioned above.

The study mentions that “Psychopathological MBTI poles were clearly more likely to be introversion, intuition, thinking, and perceiving” – which ironically describes, among others, Jung himself, Einstein, Bill Gates, Kant and David Keirsey, and almost any academic who chooses theoretical physics or computer programming, while the ASPD label would include people like Benjamin Franklin, Churchill, Socrates, Oscar Wilde, Mark Twain and Richard Branson.

My point is that the intent of the study was to accept personality disorders as a given and use the MBTI as a way of recognizing them, rather than the other way around – acknowledging the differences and then noting that some types may be more sensitive to certain problems, which is what Jung did. The difference is that the former attitude predetermines that being melancholic, short-tempered, not practical, not inclined to express one’s feelings, and preferring one’s own company is by definition pathological, and that the MBTI is merely an “instrument” that can help find and treat these ‘potential patients or sociopaths’.

Therefore, the most important issue I have with the study is that it claims to ‘know’ personality type theory, yet it totally omits to allow for the personality type influences of the researchers, which shaped the manner in which the sample was collected and the conclusions drawn, as well as for the prevalence of certain types in the population from which any random sample is drawn.

“No matter how open-minded we try to be, our own psychological types influence how positively or negatively we view the traits of others; we are stuck in our own type and nobody can look at other personalities objectively.” (Playing with Natural Talents, Nursery Rhymes and Musical Complement).

Many researchers, especially in academic psychology, are of the J type that measures to one standard and believes that every person should conform to the norm, which leads them to consider other people “abnormal”. The conclusion was thus predetermined, since most academics are of a personality type that prefers to work with categorical standards (as explained in my book Concerto for Mankind).

The other feature most academic psychologists have in common is that they are more likely to be Fs than Ts, while empirical studies tend to attract more Ss, and extraverts are more likely than introverts to go out and interview others and to volunteer for such studies.

Additionally, any statistics based on volunteers or a random sample of the population are bound to be flawed to begin with (as explained in Concerto for Mankind), since certain types are more prevalent in the population.

In short, the study is naïve, unoriginal, inconclusive and based on assumptions.

So, if the correlation is nothing new, then why the anger?

For two reasons: it justifies discrimination and it opens the door to pre-emptive justice.

The study itself is just a study by some people who are only recently waking up to the accuracy of typology. The problem comes with people who write articles for the general public based on such studies and say things like “some of these types are considered abnormal”, as described above, which causes other members of the public, or politicians who have accepted the existence of personality disorders at face value, to conclude that if certain types are more likely to be psychopaths, we should be able to recognize them early and prevent school shootings by removing them.

Though I understand that sentiment, we are now talking pre-emptive justice: the idea that if we medicate or imprison these types before they act we are doing the world a favour.

Hitler (who was an ENFJ, by the way) thought so too: he thought that if most Jews behave in undesirable ways, they should be removed before they could do so.

This is discrimination of the same kind as racism, only unlike skin colour the different personality types are not immediately visible.

Yet it is discrimination of the same kind as is currently exercised in schools, where every child that cannot sit still (due to their absolutely normal personality type) gets drugs like Ritalin and where every child that does not like team sports is labelled “autistic” – without anybody wondering whether maybe humans were not meant to sit still in schools for hours on end without being allowed to exercise their individuality, their autonomy or their lateral thinking.

What is flawed is the school system; what is flawed is the academic investigation that has prejudged certain behaviour as abnormal because it is unlike that of the researchers.

I have yet to get a response from Healthline to my request that they allow me to publish a counter article, hence this blog post.

The Abuse of “Personality”

This is about the word “personality”, not the natural tendencies and gifts people are born with, although those might in some cases also be abused.

What concerns me here is the ease with which the word “personality” is used by medical and other ‘experts’ when they refer to a group of characteristics they detect in people – especially problematic characteristics – for which they cannot give an explanation. This is where we get the recent increase in “personality disorders”.

Throughout history there have been disagreements about whether “personality” only refers to the outward behaviour (from Jung’s “persona”) or whether it includes the inner motivations, emotions and unconscious functioning (psychological type) that cause this outward behaviour. The dictionary describes it as “the complex of characteristics that distinguishes a person” or “the totality of an individual’s behavioural and emotional characteristics”.

Half a century ago, in the west, it was largely accepted that personality formed as a result of the environment (behaviourism) and that children came into the world as “blank slates”. That view has been replaced by the idea that children inherit their disposition in part from their parents (DNA) and that the environment has much less of a role to play, while Jung’s psychological type theory, which is almost a century old, states that most of a person’s personality is predisposed and the environment only influences how happy a person learns to be with his inborn self, depending on how tolerant it is.

In any case, what we refer to when we are talking about somebody’s personality is something that is more or less permanent – not as changeable as moods or a stage of life and not as superficial as habits or manners – and includes their inner motivations and natural tendencies.

Implicit in any of the definitions – since they refer to “distinguishable” and “individual” – is the acknowledgement that not every person has the same personality. For both the behaviourist and the DNA-based view, this means that the number of personalities could be unlimited, while in the Jungian view there are sixteen different types.

So what are personality disorders if inner motivations and tendencies are individual? How do ‘experts’ come to talk about “borderline personality disorder” when they have yet to define the “healthy” personality, since each personality is different – where is the border if there is no territory to circumscribe it?

How do they justify taking a group of behaviours, such as murder and sexual assault, and excuse it as “antisocial personality disorder”?

How do they explain saying that people who present with excessive mood swings have “bipolar personality disorder”? – And let’s face it, most people who today claim to have this disorder, because their doctor or therapist said so, are perfectly normal functioning individuals, who sometimes feel a bit down and at other times a bit happy. I wonder if these doctors or therapists have ever encountered a person with true manic-depressive psychosis, since if they had, they’d not so easily throw those labels around.

And since when do people who develop anxiety due to stress in their life and try to compensate for that with sometimes rigid routines suddenly have “obsessive-compulsive personality disorder”?

Not too long ago, these were called psychiatric disorders, such as psychopathy, psychoses and neuroses, which suggested that a person’s behaviour was ‘abnormal’ due to a certain problem. What this new terminology suggests is that people who do not behave according to the norm have something wrong with the very core of their person.

If people present with a collection of symptoms that are physical, we call it a syndrome and consider it a temporary state of physical unwellbeing. So why do syndromes of behaviour refer to a permanent personality?

Since when is a temporary emotional problem due to stress and the very demanding life style we have imposed upon ourselves suddenly a personality related problem if a personality is largely inborn?

This misuse of the word “personality” destroys the acknowledgment of the healthy and natural differences (called personalities) in people that can explain our different life styles, moods, coping mechanisms and stress responses without forcing everybody who does not fit the template to start swallowing Ritalin or other behavioural drugs in order to be accepted. This is no different than putting non-conformists in mental institutions so they won’t trouble your social order.

I would like to know how psychologists who say that DNA and environment create different personalities, so that “we are all individuals”, justify measuring each person against one psychological standard.

It is so common to meet people who claim having a personality disorder nowadays that the norm has become this tiny little fraction of the population that does not rely on labels to excuse their dissatisfaction with life. In other words the norm has become being “abnormal” – you figure it out.

Midlife Calling?

What does it mean: midlife?

We tend to think of it as the time after the age of forty when we begin to realize that life is not forever. Our children are growing up, our reproductive clock is ticking louder and suddenly we become aware that there are certain things we can no longer do – or are no longer expected to know: “Mum, your idea is old news; everybody who uses the internet knew that already.”

Men may not have the noise of their reproductive clock alarming them but they tend to have other hang-ups, like still being able to perform in bed.

Of course, there is no proof that I am past the halfway point of my life just because I’m in my fifties – if I live to 120, I may still be almost a decade short of my midlife.

On the other hand, my midlife may have come and gone. How would I know?

Some people refer to midlife as the period when a crisis causes them to reassess what is important – the midlife crisis, which usually expresses itself either mentally or physically: some of us are suddenly confronted with a medical condition that needs immediate attention; others suddenly find themselves experiencing emotional problems, such as anxiety and depression.

In reality, it is never either physical or mental but a combination of both.

My own crisis came with high blood pressure. After fifty or so years of never having any medical problems, I was convinced that I was healthy and would forever be so, so I ignored all the symptoms: the scales were wrong and the shortness of breath was just my age. Consequently, it took months of feeling increasingly ill before I actually walked into a doctor’s office.

The doctor was quick to say that some people are simply prone to getting high blood pressure at a certain age and I’d have to take medication for the rest of my life or I could be dead tomorrow.

Of course, it wasn’t my age that caused the high blood pressure; it was the accumulation of twenty or so years of keeping my emotions locked inside. Like so many women who want another child, I had been going between hope and disappointment month after month but kept the desire and the pain inside.

The day after I went to see the doctor and was told it was a blood pressure problem, I woke up suddenly knowing what the answer was: I had to open the taps; I was literally drowning in the emotions that had no way out.

This insight was later confirmed when I read Louise Hay’s book You Can Heal Your Life.

So I ignored the doctor’s threat, sought the help of a homeopath, an aroma-therapist and a reflexologist, and made some changes to my lifestyle. But most of all I shed all the tears that had been bottled up – privately, since I am still an introvert – and now my blood pressure is back to normal.

Sometimes it takes a crisis to wake up to what is happening under the surface; to understand that the body and the mind are intimately connected and that our beliefs direct what we experience; that “midlife” may be just a social construct we have grown up to expect.

What if midlife is not meant to be a crisis but a calling?

Midlife may have been calling me; it was alerting me to the need to make some changes to my life; to explore new possibilities and maybe to forget old regrets and explore a new future – I am now a hypnotherapist.

We talk about “a calling” when we mean a vocation; something in life you are meant to do because it is a natural drive and you can’t stop it.

Like so many people, I never got to follow my natural talent when I was young. Being a writer was not considered a job that put food on the table, and going to university was only for the rich, the smart and the eccentric. The word “philosophy” did not get used in our house, so it never even occurred to me that I could get a degree studying it, and despite having written stories since I learned how to write and spending the bulk of my teenage years at home behind a typewriter, the idea of getting published had never crossed my mind.

It took until midlife for me to actually seriously try to get the stories that were by then taking over all of my cupboard space published.

So, I am now living my ‘calling’. I am publishing my philosophical contemplations in fiction and non-fiction and thereby doing what my nature intended me to do. It helped, of course, that I started to understand what this calling was and personality type theory gave me that understanding.

My advice for anybody struggling with a midlife crisis is to stop fighting with the world and learn to respect and enjoy who you are meant to be.

Don’t call it a midlife crisis; turn it into something positive, something constructive. Consider it a message from your inner self that tells you it is time to pay attention; time to make a change to the focus of your life.

You don’t need to wait for a health crisis. Each of us has dominant functions that are super developed and direct what our natural talents are, but even those who have been lucky enough to have lived according to their natural talents may choose to change their focus and have a go at those activities that belong to their weaker functions. For Ns that may mean exploring one’s artistic possibilities or trying a hand at DIY, music or a new sport. If you don’t have to make a living with it, it can be a wonderful hobby. For Ss, this may be the time to try something a bit more theoretical; start studying something that has always interested you.

In general, midlife is a time to reconsider, rekindle and refocus, but most of all to enjoy. After all, you may have another half a century to go.
