Thursday, November 21, 2019

The Art of The Workaround

I recently pitched over the front of my bicycle and broke my collarbone, which forced me to find ways around the limitations my injury presented.  As soon as I could dress and undress myself, I was back at the gym, having to work around the steel plate and screws in my clavicle.  My rotator cuff hadn’t fared well in the fall either.  Looking around the gym, I saw I wasn’t the only one with a handicap.  Most of the gym rats had wrapped knees and wrists, or wore belts, but like injured athletes everywhere, they didn’t stop training.  If they had to cut back on leg exercises for a while to pamper their bad knees, they concentrated on upper body workouts until they could add some lower body to the routine.  I took to calling my workouts my workarounds.  Because I had to back off my weak spots until they were healed, there were tedious hours of lifting light weights, when I felt I wasn’t making progress, but I couldn’t stop.  I had to baby my injuries until they were better and build up slowly once they were.  The clavicle and shoulder were gradually less painful, and the doctor assured me that the fix was stronger than the original bone, though I wasn’t convinced, now that it had so many screw holes in it.  Anti-inflammatories were also useful.

Having the ability to find a workaround requires both vertical and horizontal thinking, an important skill in a host of occupations, from petty thief to holder of a corner office.  Problems arise in the real world, and conventional wisdom suggests straightforward remedies, which are not always the most helpful.  In order to advance, a society needs to let its past injuries heal in peace without pushing too hard against the pain.  People learn to work around the handicaps they are given.  Some people rail against their limitations and bang their frustrated heads against the wall, others surrender to substances, while a few apply themselves like unheralded Paralympians to show, if only to themselves, that there is always a way around.  Our society often stumbles, but we find ways to move forward, often along a path learned the hard way through false starts.  If we aren’t ready to alter the structure of our society to fix recurring problems, we need to glue together the salvageable elements from the wreckage of the past to build bridges to the future.  Otherwise, we will shake each other to pieces in a never-ending, mutually damaging war.  The body, the soul, society, and even the marketplace run on compromise and innovation, or as I like to call it, the workaround.

An early parallel to the workaround is the jury rig, which brings to mind ship repairs, with sailors lashing a broken boom with rope so the ship can sail home. There is a theory that the jury part of the phrase comes from the French word for day, jour, implying that the repair might only last a day.  One of the weaknesses of this workaround is that putting undue pressure on a fix may cause another system failure. A permanent repair to the ship would require a new piece of uncompromised timber.  

A jury rig, given full rein, can end up as a kludge, defined as an “ill-assorted collection of poorly-matching parts, forming a distressing whole.”  A mechanism like this may continue to function, but it is clumsy and temporary.  The word is related to bodge and fudge, and brings to mind MacGyver, the king of the workaround.

Hackers are in the business of workarounds.  Software developers build systems, and hackers, whether for curiosity, notoriety or profit, delight in finding holes in the system that they can slide into, like cars merging on a freeway.  Before the system is aware, it finds itself serving another master.  Programmers know that to block hackers, they sometimes have to burn down their poorly constructed houses and start again.

In the world of entertainment, Prince changed his name to a symbol because his very name belonged to Warner Brothers.  Television networks routinely bleep words they don’t want viewers to hear, leaving the impression of free speech intact except for the odd forbidden word.  The act of censorship itself is a futile attempt to cover up the truth, but the reality still exists behind the fig leaf.  An iconoclast would destroy the offending statue, but the humanist finds a workaround that saves the entity and appeases the censor.

We see products on supermarket shelves that are designed to imitate original brands and skate close to the wrong side of trademarks and protected designations of origin, which reserve names like champagne and Parmigiano Reggiano for their original producers.  Shady producers sell merchandise with similar names and imitative packaging, though any attempt to sell a McDonald's burger or a puppet mouse with a particular face brings down the legal weight of ferocious brand defenders.  Obvious and clumsy workarounds like these can be easily dislodged.

In Greece, those who build houses often leave reinforcement rods poking out of the roofs to signal that the building is unfinished, because completed buildings are taxed at a higher rate.  To an outside eye, the rusted corner ornaments are a cultural curiosity, but they are visible signs of a broader social breakdown.  The authorities suspect the building may never be finished, but they have no certain knowledge of the owner’s intentions, and cannot prosecute uncertainty.  Until a government realizes the need to rewrite a law, people will find ways around it.  In the same country, cynicism is rampant.  People believe that when the government decides to make a law, they first create an escape window or workaround for themselves and their friends, then build the law around it.
 
Recently, it has come to light that rich parents can buy their children's acceptance into big-name universities even if the applicants don’t have the qualifications for admittance.  Some of those caught were celebrity parents, so the stories had more traction than they might have had in another time and place.  Not long ago, an English aristocrat would have assumed he could purchase a place for his son at Cambridge or Oxford by making a large enough donation.  His son could learn eventually.  That’s what tutors were for.  The workaround practice of buying university places isn’t new, but it came as a shock to some.

Workarounds are used so often in our lives that we hardly recognize them for what they are.  Objects invented to help us, like eyeglasses, started as workarounds.  Somewhere in history, a man noticed that rock crystals of the right shape could become a tool for starting a fire.  The best fire starters had magnifying qualities, which, over a thousand years of refinement, were ground into eyeglasses and eventually contact lenses.  We have adapted materials like quartz, silica and petroleum to make smartphones that outpace the human brain at raw calculation.  The use of these building blocks began as a way to overcome difficulties, like distance, speed, and human frailty.  If one day soon we can’t go outdoors, it won’t take long for tinkerers to adapt virtual reality and drone technology to let our eyes go out to explore while our bodies stay indoors, safe and protected.  Whatever happens to poison our world for human habitation, we will find a way around it.

Some basic social supports, like daycare, began as workarounds.  It was a logical fix in early societies that the duties of motherhood could be shared with a network of sisters and relatives, so that more women were able to participate in activities that benefited the group, like agriculture and food preparation.  The children benefited from having an extended family with its broader range of educational input.

The best of intentions can have unintended consequences.  In Italy, the government passed a bill to protect workers’ rights.  The new strict labour laws applied only to companies with more than fifteen employees.  As a result, many burgeoning businesses limited their growth to avoid being subject to the new rules.  The new law worked well for artisans and family businesses, but economic growth stagnated.  To fill the gap in industry, the government courted multinationals who initially performed well, but were, in turn, subject to the pressures of supply and demand.  When the markets changed, the big companies were as loyal to the country that courted and supported them as a hen is loyal to an egg.  The hosts had been used, but they should have seen it coming.  They had shot themselves in the foot with flawed rules that business found easy to work around.  The original ill-conceived law, a workaround in its own time, had been a detriment to everyone.   

One of the most lucrative markets in modern times cashes in on the problems people have in coping with existence.  Solutions that range from antidepressant medication to wellness marketing are nothing more than fixes to get around our feelings of inadequacy and sadness.  We have turned coping into big business, but the workarounds on offer do not fix the original problem.  If someone suffers trauma, there is no way to reverse the original injury, so we find ways to push it to the back of our minds, but the memory and subsequent pain will never disappear until we learn how to cancel memories, which would not be a good step forward.

There is a wide range of coping techniques involving drugs and therapy. Humour is one method for exorcising pain, perhaps because we can transfer our pain to someone else. Their misfortune is our healing laugh.  When I was young, I often went to the movies on Saturdays with my older sister.  To stop myself from crying during sad passages in the film, I would look over at her, sure that she was well ahead of me in tears, and the sight of water running down her face would make me laugh.  It kept the sadness on the screen from entering my heart.  I didn’t want to be sad, so at that young age, I was already learning workarounds to avoid the embarrassing phenomenon of crying in public.  

When politicians suggest imperfect fixes for long-term problems, they refuse to admit that they are stacking one jury rig on top of another.  Social democrats wish to eliminate the flaws in a system that permits inequality, but are more reluctant than traditional politicians to accept a patchwork of temporary fixes.  Those on the radical left advocate altering the system from the ground up, calling for fundamental change rather than putting more fingers in a failing levee to protect territory that is already underwater.  Those who resist radical change probably know that their willful blindness will come back to bite them.

Temporary fixes and workarounds were never meant to solve problems permanently.  Sooner or later, structural changes need to be made.  When change comes, any workarounds in place become unstable or fail altogether.  Workarounds are brittle constructions.  They are not positive or negative in themselves, but are tools that can generate temporarily beneficial or disastrous results.    

The further we go into our future, the more apparent the effects of stress become.  Stress has always existed, but its force has grown in proportion to our ever-expanding shared knowledge.  It is useful to understand your adversary, but when you know that he has an atomic bomb he can drop on you if he is in a bad mood, it can be stressful.  Probably the best-known and most commonly used stress reliever is religion, followed by alcohol.  People find ingenious ways to cope.  Making beer is an art.

Tobacco, since its worldwide diffusion, has been a method of dealing with stress.  A quiet cigarette is a moment to stop and reflect, and a smoke on the run is for someone who needs a quick nicotine top-up.  We know now what years of using cigarettes as a stress reliever do to the lungs, but if the choice facing soldiers in the 20th-century wars was between going out of their minds with shell shock and having a smoke, the cigarette was the clear one.  As can happen, a particular workaround might be worse than the monster it is trying to avoid.  Alcohol plays a similarly insidious role.

The obvious way to relieve stress isn’t to find new coping mechanisms, but to eliminate the stressor that causes so many to turn to workarounds.  Historically, people have taken the drastic step of leaving home because of wars, natural disasters, religious persecution, or for better economic opportunities.  Whether the reason for flight is violence or hunger, the main driver behind these migrations is always money.  Wars are fought over control of territory because territory generates wealth.  In a new world order, people would not need to move to stay alive, because they and their neighbours would have the same benefits of clean running water, electricity, transport and communications.  The proliferation of mobile phones has all but accomplished the last of these; a video can be posted online from a dot on the map in Africa and be seen immediately by the rest of the world.  Food, water, and employment are taking longer to catch up.  When a man who lives in that dot on the map sees how the rest of the world lives, he wants the same benefits for himself.  Along with the promise of adequate food and productive employment, he also wants healthcare, education, infrastructure and a fair rule of law.  Regardless of what the Bible says, it is natural to want something better than what you already have.  If you have a broken-down, jury-rigged plough, you wish you had a sturdy, unbroken one.  When people want more and can’t have it right away, they become jealous and vindictive and make bad decisions.  To attain the promised land, people who don’t steal from others are forced to work as wage slaves because that is the only road open to them.  They hope it will be a temporary solution, a fix, but they end up spending the rest of their lives within the limited options offered by the workaround.  Is poverty in Peru worse than poverty in the United States?  Climate change will provoke new generations of refugees hoping to change the trajectories of their lives.

Men have tried to construct societies where nobody suffers from want, but it has been demonstrated that mind-numbing uniformity kills initiative.  In the end, these utopias fail because people have a tendency to work around the rules to reinstate a hierarchy of wealth.  People want more than their neighbours have, and are willing to become outlaws if that is what is needed to achieve their objectives. We look for workarounds, honest or dishonest, if we think it will improve our lot.    

The cautionary sting in the tale of workarounds is that we should not depend on them, and if necessary, should consider discarding the entire jury-rigged kludge and building a sound structure from a new set of plans that do not totally revolve around money.  If we ever manage to conquer our petty jealousies, envy and greed, war will be relegated to being an awful curiosity of the past.  

Wednesday, October 02, 2019

Race & Culture

These days, darkening one’s skin to pass for someone of another race is considered disrespectful and wrong-headed.  It's just not done.  Unless the wearers are aboriginal or Mexican themselves, feather headdresses and sombreros worn as costumes have also fallen out of favour.  The collective sliding scale of acceptability has pushed them onto the naughty list.  A web-connected public plebiscite decides what should be relegated to the back drawer of history, and what will be allowed to stand, for the moment.  At one time or another, everything comes up for analysis, and it has always happened that totems of the past are thrown onto the bonfire by the guards of the new revolution.
      
Things that were once out in the open are now taboo, and things that were taboo are now out in the open.  We behave as if prohibition never existed.  Priests pump for peace as if holy wars never happened.  Behaviours that were once commonplace have been ruled out of order.  Pushing these now unacceptable events of history into the closet doesn't cancel them.  No amount of nose holding or looking the other way will make them disappear.  Being aware of the past is necessary for survival.  How many times can we make the same mistake?  The past shouldn’t be something mysterious, unimportant, remote, or forbidden, but humans would rather forget.  Their psyches tell them it is unhealthy to revisit bad experiences too often, but they shouldn’t take the balm of amnesia too far.  When memory is erased, mistakes are repeated, and new strains of bad old ideas come along and take root again.

Toppling statues of icons like Saddam, Stalin, or Nero is a post-conflict knee-jerk reaction to life under a dictator’s heel, but it does not change history.  People have always used what they were given and what they learned from the mistakes of the past in the struggle to survive.  It is easy to ignore history when the greatest personal accomplishment is to stay alive.  Defeated people have no stomach for dredging up the wrongs of the past.  They hope that the present and future will be better, but they ignore that a hungry, intimidated, and uneducated populace is easier to control. The indignant can vent their anger on statues, but it is the ideas, not the statuary, that need addressing.  

Many statues should be consigned to the scrapheap, but pulling them down won’t erase the sins of the past.  The buzz about cancelling offensive images from the culture is a distraction from the real thing.  There is probably laughter, margaritas, and tortillas at a Cinco de Mayo party, but many Americans don’t know that it celebrates a Mexican victory over French troops who, had they prevailed, might have gone on to support the US Confederacy, helping the slave owners defeat Lincoln.  Yet even in Texas, they mindlessly celebrate Cinco de Mayo while protecting statues of Confederate generals.  These contradictions are usually based on mutual ignorance, which keeps people fighting amongst themselves, a useful tactic for those who hold power.

Nobody has put forward a fair and workable political system, but the idea often finishes dead in the water simply because it is a system.  People are not systems.  They are round pegs trying to fit into square holes while experts debate whether it is better to make the holes more round or humans more square.  Meanwhile, to survive, humans find their own ways to adapt and move forward.  Along the way, they may ask themselves if there could be things they are doing now that will be considered unfathomable errors by historians of the future. They drink beverages from open containers when it is known that exposed liquids are bombarded by harmful viruses. They wear eyeglasses, an outdated technology, akin to a pirate having to strap on a wooden leg.  They fail to provide free healthcare, housing, and food to the weakest members of society.  They send soldiers to kill other soldiers over remote pieces of land so they can move forward another square on the chessboard of domination. They fill the atmosphere with poison.  

Looking back and judging things according to today’s standards is like driving down a cul-de-sac and expecting to get somewhere.  It is true that there are some sins too big to wash away, that no matter when or where they were committed, they will always be unforgiven. “Just following orders” is less convincing today than it once was. Concentration camps have always been hell.  Traumatized soldiers were once shot for desertion.  Hungry, thieving children were sent from the UK, over the edge of the earth, to Australia.  

In a tightly organized society, anxiety is a problem.  Presumably, bees and ants don’t feel the frustration of constant collisions with their own kind but experience the event as a bonding ritual.  People look for ways to keep themselves calm in the mêlée.  Tobacco was once the most widespread worldwide remedy for anxiety, but it has become a health taboo.  Its addictiveness, like Coca-Cola’s, was an early experiment in product loyalty.  But smoking, drinking Coke, and chewing tobacco had willing consumers, early adopters of microdosing, playing Alice in Wonderland, a bit more of this and a bit less of that, until they had found the right balance to help them navigate their unintelligible lives.  Most smokers don’t know that in the First World War, soldiers used cigarette smoke to cover up the stench of rotting corpses.  Lung cancer was the least of their worries.

If a long-dead, barely remembered man like Al Jolson were to come back from purgatory, he would be sent straight to hell by the latest cultural posse who would lynch him as if he were the antichrist of blackface.  He would be greeted by howls that his face paint is insensitive, hurtful, and dehumanizing. But it is dehumanizing to forget that Al Jolson was an actor and a singer who was trying to put food on his table.  If he thought that painting his face green and pretending to be a Martian would help him get work, he would probably have done it.  People did not believe he was an actual black man; blackface is never convincing, but he was a good singer, and that’s what mattered to his audience.  He didn’t think he was offending anyone.  When he was at the height of his popularity, the American Civil War had ended fifty years earlier, and though black entertainers were becoming known, they were not allowed the same access to the public as white artists.  Jolson paid homage to his black brothers; nobody saw it as mockery.  He played a character who knew how to tug at the heartstrings, and the audience thought he did a very good job of it, whatever colour he was. Disney probably copied Al Jolson to create Mickey Mouse, but there has been no outcry about mice in blackface.  There may have been black singers who were angry that Jolson was taking work that should have been theirs, and they were correct.  Jolson would probably have said, “Everyone’s gotta eat,” and it would have been left at that.  Worse damage was done to the image of blacks by entertainers like Stepin Fetchit, who did not need to put on blackface.   Like many actors, he discovered that he couldn’t find work unless he played a stereotype.  Some actors with big noses only find roles as greedy Jews or bad guys.  

As a post-war child, I saw some of these early performances repeated on television, though by then there had been some breakthroughs in theatre and cinema, with black roles played by black actors.  Who could imagine “A Raisin in the Sun” presented by any cast except one of colour?  It would not make sense done in whiteface.  Although black actors playing black roles in the 21st century is correct and admirable, in the beginning it was controversial, like the current discussion of handicapped actors playing handicapped characters.

I grew up in a place that never had a black inhabitant until the mid-nineteen sixties, so the earliest impressions I had of black people came from stereotypes like Amos & Andy, and old clips of Bill Robinson teaching Shirley Temple to tap dance.  I didn’t know if the radio actors who played Amos & Andy were white or black.  It wasn’t a question I asked myself.  I was aware that they poked fun at each other and their wives, like Ralph and Ed on The Honeymooners. They could have been the Happy Gang, always good for a smile and a laugh, but I didn’t ask if their characters represented anything.  

With the taboo of blackface, brownface, or any other kind of cultural appropriation, people find other ways to step out of themselves on occasion.  They paint their faces blue, copy sci-fi creatures, and make tails out of pool noodles.  But will some real alien come along and tell them off for being disrespectful?  Children who dress up for Halloween would be mortified to be laughed at.  They are paying homage to their idols, and to them it doesn’t matter if the skin is green, painted like a skeleton, or a pleasing shade of tan like Princess Jasmine.  Blackface in show business may have been a lame imitation, but it was never comical based solely on skin colour.  

In more innocent days, I was friends at school with a skinny native boy with a mop of unwashed hair and dirt-streaked arms.  We played marbles on the pavement around the school building and counted our wins together before the bell sounded to end recess.  One day, I came home with a yellow cat’s-eye cob that my friend had given me.
 “That’s nice,” my mother said.  “Did you win it?” 
“No,” I piped up in my six-year-old voice.  “Fleabag gave it to me.”
“Who?”  She turned to look down at me with a hard stare that made me shrink.
“Fleabag,” I said, unsure of myself.
“That’s not a name,” she said sternly.  “He must have a name.”
“Everyone calls him Fleabag,” I tried to excuse myself.
“Well, you are not to do it just because everyone does.  Find out and use his real name.”
My mother was a nurse and a democratic woman.  She had seen enough sickness and death to know what was good and important and what was wrong.  I was embarrassed by my thoughtlessness, but the event triggered a different and better way of looking at things. I have been allergic to nicknames ever since.

Recently, while researching a story set on the North West Coast of British Columbia, I needed to spin through many reels of microfilm from a small-town newspaper printed in the early twentieth century.  In these photographed broadsheets, I regularly came across evidence of racism that jumped out as being on the wrong side of history but was accepted back then as normal.  The ignorance passed down from one generation to another had prompted the Canadian government to pass race exclusion laws, though not all citizens were convinced.  World news in these old newspapers was surprisingly well covered, with the latest in European battles, troubles in Ireland and Russia, as well as the latest Chaplin film at the Empire theatre, but between the ads for stomach remedies and cigarettes, there was an ongoing litany of small stories about men being killed in fishing, lumbering, railroading, drinking, and fighting.

There was an alcohol prohibition in the province at the time, so the papers reported a constant parade of bootleggers before the judges.   Many of the accused were repeat offenders, bartenders who were only allowed to sell near-beer, workmen caught on a binge, an old widow selling spirits to buy food, and even a few policemen accused of selling contraband. The Chinese community came off very badly in the papers, because the court reports were also full of opium cases.  The accounts gave the impression that all Chinese were dope fiends, a title only slightly less respectable than running a laundry.  A laundry was a place where people took their dirty clothes, even though they were nervous that it could be an inscrutable front for nefarious dealings. Most Chinese had originally been brought to Canada as disposable labour to construct the railway, which was supposed to bring prosperity, and it was assumed that these immigrants would leave when the job was done.  When they wanted to bring their families over, the government put the brakes on and imposed a head tax. 

I read about one or two blacks who ventured north from Seattle on the steamer and ended up in street fights prompted by racist remarks.  Locals fought with outsiders, even though all of them except the natives were outsiders themselves.  There were women of no fixed address who were shown the road out of town when an unseemly disturbance made their profession clear to the court judge.  There were backcountry men who went mad and tortured or killed their families, and there were stopovers by minor royalty.  Breathless reporters gushed over celebrated transcontinental biplane pilots who had touched down just long enough to refuel on their way to Alaska.  Both local articles and items picked up from the worldwide press reeked of such blatant racism that a millennial would choke on his bubble tea. 

The Chinese workers that Canada had used to build the railway had done a good job, but some complained there were too many of them.  Then the government imported eighty thousand young Chinese men, destined to be shipped to the European War to work as sappers.  These men were quarantined and trained at William Head in Victoria.  There were riots and escapes from the harsh conditions.  Politicians wanted a 2% cap on Orientals.  White women were not allowed to work for Chinese employers.  Chinese were required to sit in the balconies of movie theatres.  In the 1920s, the government of Canada passed the Chinese Exclusion Act, which disenfranchised resident Chinese.  Struck from voters' lists, they did not have the right to join professional organizations as doctors, accountants, dentists, or nurses.  Since they weren’t officially recognized and certified, they weren't allowed to work in those fields.
  
During WW1, 8,500 civilians, most of Ukrainian descent, were arrested and held in internment camps across the country, solely because they came from Eastern Europe.  In certain periods, Canada encouraged immigration, but only accepted the right kind of people, mainly Western Europeans, preferably women who could become brides.  Germans and Russians were not welcome.

Then there were the Sikh passengers who arrived in Vancouver on the Komagata Maru.  After two months at anchor in the port, they were sent back to India, where many were arrested and some were shot.  The official word was that “having been accustomed to the conditions of a tropical climate, immigrants of this class are wholly unsuited to this country.”

While researching the Miller Bay Indian Hospital near Prince Rupert, I discovered that the site was originally a farm belonging to a family called Miller or Müller, who were believed to be Swiss, but because they spoke German, their property was confiscated.

I knew there was worse to come in more recent history.  The internment of the Japanese during WW2 had traumatized the parents of some of my schoolmates.  They told me their parents had been lied to and robbed, and never felt safe again in Canada.

In 1936, a Montreal bar’s refusal to serve blacks became a test case; the Supreme Court concluded that it was in the interest of good morals and public order to allow service to be refused to black people.

Canada’s own aboriginal population were herded into residential schools to “civilize” them, and they were not given the right to vote until 1960.  The last racially segregated school in Canada closed in 1983, which brings us close enough to the present day to make it clear that there has always been racism in Canada.

 It used to take an invasion or revolution to shake up the structure of society,  but the ubiquity of the digital revolution has accelerated the exposure and drawn battle lines.  With a sense of history that only stretches back to the last ephemeral trend, new generations might come along and ask, “Who are these guys and why did they do that?”  They will learn that the world is, and always has been, full of good, bad, and questionable characters.  The bad ones are more fascinating, but their stories have already been told, so the sleuths go looking for chinks in the armour of those who have been judged to be good.   They want to stick in their lances to see what spills out while the spectators huddle round pretending to be aghast.   These pokers and prodders are not looking for context, but sound bites, the more shocking the better.  Online scandal-hungry communities attract like-minded moral bankrupts to their flame, until their indignation becomes a hurricane and causes a shift in the current moral compass.  Another figure in history, like the first Prime Minister of Canada, is stripped of his good intentions and pushed naked into the same human swamp the critics inhabit.    

When the ego-inflated, indignant boots of online crusaders march in, schools, streets, and parks are renamed, and statues are pulled down.  This general or that governor had views he shouldn’t have and needs to be struck from the record.  Every person, living or dead, is fair game for the lawnmower of public opinion.  There are reasons that states are not governed by public referendum: people are too easily manipulated.  The title of demon of the month changes as fast as fashion.  As Heidi Klum would say, “One day you’re in. The next day you’re out.”

Collectively, Canada likes to think of itself as a tolerant country, though we are made up of people from every part of the globe who landed on someone else’s native shore and imposed our way of life on them.  We are no different from the tribes from the steppes who swept over Asia, or the Normans who invaded England.

In Canada, the English prevailed, and colonial tactics were adopted to subdue the troublesome natives: selling them alcohol, distributing infected blankets, and stealing their children.  At the beginning of the twentieth century, the imperial machine was at full power.  Canada was sparsely populated and needed people, so various schemes were cooked up to attract the right immigrants.  Unlike the now hollow American boast of “give me your tired, your poor, your huddled masses,” Canada tried to be selective and open its borders to those who might settle successfully and participate in the experiment.  They didn’t want dreamers or idealists, revolutionaries or Bolsheviks; they didn’t want Chinese, Italians, or Slavs who would stay on after their backs were broken.  Canada offered free land as bait, but the conditions were harsh.  There was a high failure rate, and many could not fulfil the conditions to keep the property they had been given, but they were allowed to stay on as wage slaves.

Politicians have always pandered to voters and given voice to xenophobic theories by playing on the insecurities of populations struggling to make a living, who want someone to blame for their condition.  The implication behind these ideological campaigns is that the doors to the country should have been closed behind their ancestors, who were the last of the good immigrants.  

Canadians are fed pablum half-truths about their country and its history, so it is no surprise when a long-forgotten shoot of racism sprouts from the stump of a tree that was supposedly cut down long ago.  We can never remedy the mistakes of the past, but before we trumpet ourselves as a do-gooder immigrant haven that never had a bad thought for anyone, we need to be aware of what we have already done.
 
If others are offended by cultural practices that are no longer acceptable, we need to listen to their reading of the situation, but we should not be too hasty about throwing everything into the fire.  If blackface intends no harm and is not meant as a joke or a mockery, there should be nothing wrong with playing a part that pays homage to another race or culture.

I watch a lot of Italian television.  There is a popular evening program in its ninth season that challenges contestants to imitate popular singers from the past, a mix of Italian, British, and American artists.  These are not parodies, but genuine attempts by the performer to recreate the magic of the original.  It is difficult even for an olive-skinned Italian to be Louis Armstrong, early Michael Jackson, or Donna Summer without some sort of makeup.  If the contestants are from the south of Italy, some need white makeup to pull off a convincing Adele, Mick Jagger, or Taylor Swift.  There have been both tanned and powdery pale versions of Lady Gaga.  The point of the performances is not to make fun of the popular singers, but to be as true to the originals as possible, to find the soul in the song.

There is racism in Italy, as there is in all countries, and some comic sketches that poked fun at ethnicities have recently been censored by the state media.  It could be said that the makers of this content didn’t understand at the time what sin was being committed, but went for the low-comedy, cheap laughs.  However, as they do with food, Italians take music seriously, and musical interpretations are not intended to be disrespectful, hurtful, or insensitive.  Italy has its own painful racist history, and a present situation that finds its shores the principal landing point for African migrants, so it is in the thick of coming to terms with its own multiracial society.  In the 1950s, a man who moved from Sicily to Milan for work was called an immigrant and looked down on as dirty and uneducated.

The US news reported recently that Orange County’s John Wayne Airport should be renamed because the actor made some racist remarks in his time.  Actors are often unreconstructed examples of humanity, and some promote ideas which are questionable at best, but actors, like all of us, are human and sometimes exercise bad judgment.  Perhaps the solution to constant cultural revision is never again to put anyone’s name on a building, a street, a park, or an airport, and to call new buildings A, B, or C.  Even that might be exclusionary to those who don’t use the Western alphabet, so we are reduced to symbols, like illiterate people.

There have been bad players in history, and their errors have been pointed out, but there is a mistaken assumption that everyone in the past should have acted according to our modern standards.  Dredging up forgotten sins and passing judgment on them doesn’t serve the present or the future.  The motto of the Province of Quebec is “Je me souviens,” which translates as “I remember,” and is good advice for the entire country.  We should not forget the past, because it explains how we got to where we are now.  We can never be free from our history, nor should we be.

Wednesday, October 24, 2018

Worlds Collide

When a sperm fertilizes an egg, one world merges into another to create the miracle of a new entity.  But eggs and sperm are not the only organisms with a propensity to merge; all cells have needs and influences.  Viruses, bacteria, and medicines circulate through our bodies, provoking electro-chemical reactions, keeping what is needed and discarding the rest.  If there is an overwhelming invasion, a serious trauma, or disease, and things get out of hand, the body calls in a bigger army of antibodies to keep order.  All cellular encounters have an element of invasion and surrender that results in the establishment of new hierarchies, like animals joining a pack.

When the government of East Germany tottered and the wall came down, the former Soviet appendage collapsed into the West.  Perestroika looked better than another failed five-year plan, and those in the East saw the West with its flashy Mercedes success as rightfully theirs.  They wanted what other Germans had, even though the Westies had worked hard for their elevated standard of living.  The East had stood still in time; chunks fell from buildings, and the wartime infrastructure was crumbling.  When the rudderless government capsized into the West, it was a joyous but painful rebirth.  And like the right sperm meeting the right egg at the right time, the invasion and surrender were difficult but inevitable.  Those in the West weren’t thrilled to be overrun by job-hunting, banana-hungry Easties, but in the end, all Germans were forced to resign themselves to the new reality.

As European countries fight to keep their Union alive, there has never been a longer period of freedom from war and want.  The illusion of wealth and stability looks attractive to those on the outside who have been crippled by war and corruption.  From Africa to Asia, many governments can’t or won’t help their citizens to lead dignified, productive lives.  The old are resigned to stagnation, and the young look North and West, lured by a wonderland of things that appear to be available to everyone.  Unfortunately, many young souls with dreams don’t know that the Mediterranean is not a river, and that they will not be able to swim to the other side.  In North Africa, they are held hostage by traffickers until there is nothing left to squeeze out of them, and are then herded onto sinking boats that are pushed off toward Europe.  Decrepit fishing boats and cheap inflatable rubber craft drain the discontented youth of Africa and the Middle East into Europe.  Some who arrive are disillusioned that they can’t have everything as soon as they hoped, but many make a decent life for themselves, coping with nostalgia and the slings of assimilation like millions of migrants before them.   These days, the current of population flow is directed to the North and West, though someday it may flow in another direction. 

The merging of different worlds happens by osmosis.  There are clashes along the way, resistance and insistence, but the new order becomes a historical fact.  When Cro-Magnon man took over from the Neanderthals, it was a process of assimilation and elimination until modern man prevailed.

If there is ever an arrival from space, the two worlds will fight and make peace and fight again until a hybrid species emerges by incorporating useful qualities from both parties.  Every new generation sheds its parental skin of outmoded preconceptions and accepts the new reality.

The principles of one entity merging into another also occur on a human emotional level.  When people first meet, they use invisible antennae to look for signs of aggression or agreement.  People may become friends and participate in a complex dance of two souls who have not merged on a cellular level.  If the parties are pulled toward a more intimate bond, their exchange of genetic material and strong sense of unity epitomize one world melting into another.  Like a sperm knocking on the shell of an egg, East Germans going West, or Africans going North, the tendency to merge personally and socially dictates the path we need to take if we wish to survive.  Isolation is death.

But Nature has some twisted tricks up her sleeve, because she favours attraction between unequal forces.  As Darwin’s work suggests, the invader doesn’t need to be strong, nor the defender weak.  When worlds collide, the species that proliferates is the most adaptable one.

If two similar worlds meet, they can be like two suns or two male dogs, circling each other at a wary distance, looking for weaknesses to exploit.  If they are equally matched and engage in a fight for dominance, there is the risk that they will destroy each other.  Nature prefers pairs made of disparate elements because their union produces more adaptable offspring than those of identical homogeneous partners.  There is no attraction in sameness; we were not meant to couple with ourselves. 

The universe is full of moons, planets, and suns caught in the orbit of stronger powers, and this configuration makes for both stability and instability.  Being caught in an orbit is a delicate balance: too close and you burn up in the face of the sun; too far away and you grow cold, lose the grip of gravity, and fly off into space.  Our universe is an active environment that is governed by laws, but it is subject to accident and coincidence, a grab bag of factors that can precipitate dramatic change.  The process of one nation merging with another is a minuscule illustration of universal inevitability.  It is not a bad thing, but a necessary process that propels us forward, an inevitable change that many struggle to accept.

Saturday, May 03, 2008

Lingua Franca

Language.  We hear and learn words from a very early age.  We express ourselves by imitation and the slow realization that if we can name something, we have some control over it.  We absorb vocabulary as a useful tool; we don’t think much about the words themselves, but we soon learn how to use them.  Anyone who has been surrounded by a language other than their mother tongue will know the frustration and powerlessness of being without speech.

Words are a passion for me, and this has led me to learn languages other than English.  I studied Latin, French, and Spanish at school and, like every other student, tried to learn the foreign words as if they were new mathematical formulas to master; and like much of the math we study in school, I couldn’t see how these difficult words had anything to do with my real life.  After graduation I started to travel, and soon wished that I had paid more attention to my studies.  It was humbling to discover that a two-year-old in his own country could speak his language better than I could, so I made a serious effort to learn and use more words.

After forcing myself to speak a new language for a while, I found that the easiest words came out of the mouth without thinking, especially words with which we are all familiar. Nobody in North America has to rack his brains to come up with words like “adios, amigo, mañana, rapido, or bueno”. Most of us don’t have to make a complicated excavation of memory to know that “adios” is another way of saying “goodbye”. It is from this point that I begin my theory of languages.

They say children can learn a second language more easily than an adult, that a child is in a more receptive state and can easily absorb new information. Adults tend to sort and categorize new information into manageable compartments.  When people try to remember something, they pull it out of a drawer somewhere at the back of the brain.  This storehouse of information can only be maintained by not mixing up the contents of the labelled packages. Luckily, the brain is more fluid than a filing cabinet, and we have the capacity to merge folders.

At some point in my language studies, having added Greek, Italian and a bit of German to the languages I have lived in, I realized that categorizing a word in another language into the overall box of Non-English Language was a mistake. I began to learn words as if they were a part of my mother tongue. If I learned the word “casa”, I wouldn't think of it as a Spanish word that means house.  I would think of it as just another word that stands for the image of a house.  I tried to eliminate the translation factor with the knowledge that when I see or hear the word “house”, I don’t first think of it as a word, but as any house, the house where I live, the house where I grew up, my dream house, or a composite image of houses. Therefore, when I hear the word “casa,” I skip the step of thinking of it as a Spanish word that means house, but bring up a visual image of a house. If I hear or read the word “spiti,” which means house in Greek, I think of it as just another word in my vocabulary that symbolizes a house. It doesn’t matter what language it is. In this way, I found language learning easier. Now, if I hear a combination of words in another language, I can visualize what is meant without having to translate that phrase into English. I believe that the greatest error in language instruction is to insist on reinforcing the categorization that happens in the adult brain. We shouldn’t study French, but study other ways of saying things, using words that just happen to be French.

This is a simplistic approach, and generally deals with just vocabulary, but the further we delve into any language, we realize that differences in grammar are part of the rhythm and essence of the culture to which the language belongs. Sometimes this requires learning different rules of structure, but these, like any rules of language, are only systems that have been developed to make words into sentences. 

One of the first and most inexplicable pitfalls for an English speaker is understanding gender designations in another language.  Why is the moon feminine in Italian while the sun is masculine?  Again, rules have been proposed, but rules are always broken, so in the end, one is forced to imprint the idea of a feminine Italian moon onto the understanding of an Italian way of being.  We could learn “la luna” by rote, but the knowledge sticks better if we think of the moon in its Italian incarnation as a beautiful, mysterious female form, just the way an Italian might see it.  One French rule of thumb categorizes nouns that are active and can be used by humans as masculine, while feminine nouns are objects that have things done to them.  A table is therefore feminine, while a knife is masculine.  Children learning their own language don’t think about gender; they learn that the correct word for a table is not “table” but “la table,” as if it were one word.  The key is to avoid translation and language separation and to think of all languages as one language.  We humans should develop a richer, multilingual vocabulary to express ourselves.

My Greek teacher often emphasized that you can’t separate language from culture, and the more I know about languages and their cultures, the more I see that this is true.  In the connected world of today, all cultures and languages are beginning to blend.  As we come closer together, we understand each other better and realize that there is only one language, and it is not English or French or Italian or Russian, but a plethora of words which stand for ideas, feelings, objects, hopes, and dreams.  It would be best for us all if we stopped looking at languages that are not our native tongue as “other” and saw them instead as part of the beautiful and broad spectrum of human language.

Saturday, August 04, 2007

Generation Y Work

Following the habit of naming successive generations, Generation Y comes after Generation X and the baby boomers.  Each has its own attitudes, outlook, and morals.  Generation Y has a problem in the workplace.  The economy is good, and jobs are available in a host of occupations, which would leave a Generation Xer, who had only one McJob or another to choose from, jealous.  Generation Y members have benefited from the shortage of workers and are hired for jobs for which they have little skill and are poorly suited.

The generation born in the 1970s has been coddled and rewarded for mediocre behaviour, because while they were growing up, it was considered cruel to hurt anyone’s feelings by judging them on their merits.  All were rewarded equally, leaving those with fewer skills believing that they were just as gifted as the top of the class.  It is wrong to crush self-esteem with unnecessary criticism, but as often happens in America, the baby has been thrown out with the bathwater, and all criticism has become a sin.  Therefore, a generation has grown up and entered the workforce believing that all work is beneath their worth, that they are required to make only a token effort, that they are not rewarded handsomely enough for their lacklustre performance, and that even showing up for work is an imposition on their specialness.  If they are not coddled as they expect to be, they leave, often with no notice or thought for what their sudden departure does to their colleagues.  This tactic works for them as long as jobs are plentiful, but because they have no sense of history, they act as though their actions do not affect anyone else, not even the parents whose home they move back into.  Least of all do they realize that their work history will follow them.

This self-centred attitude, common in the youth of any generation, will be more difficult for the Y’s to overcome, because they are a generation nurtured on the need for environmental cleanup, the rightness of anti-racism, the spread of technology, and other “One World” philosophies which are in direct opposition to their personal attitudes.  These 20-somethings exclude themselves from this One World view through technological devices that don’t require eye contact, smell, touch, or taste.  They want to be paid well so that they can consume the products whose manufacture makes the rich richer and the poor poorer.  They grew up with children of all races, but they wouldn’t marry one.  In the workplace, they look out only for themselves and pursue their advancement with a self-belief that defies proof and borders on the fanatical.  When they don’t get their way, they pout like spoiled children and blame everyone else for how unfairly they are treated.  When they are criticized for rubbing their colleagues the wrong way, they deny that it has anything to do with them.  It’s not their fault, but that of their fossil of an old-school boss who doesn’t understand them.  What they need to learn is that the world is not their indulgent mother, that it has no great love for them, and that, alongside the mass of humanity on this earth, they are nothing more than another grain of sand.  There are others who are willing and adaptable and can easily take their place.

It has been said that the workplace needs the technological skills of Generation Y, since they are the only ones who understand the rapid advances in this field. This is an insult to any person with normal intelligence. Nobody in any situation needs to be at the mercy of a petty tyrant like this, who believes that only he has special powers and cannot be replaced. Technological skills are easy to acquire.  Children can learn them, and so can adults of any age. Generation Y makes the fatal error of believing that they are unique and have some secret knowledge which gives them power and superiority, but in truth, their special status is based on an illusion.

Generation Y, like all other generations before them, will grow out of their bubble, and it will take very little for that to happen.  A few eye-opening realities will come along for which they have no coping skills.  It could be a world economic downturn, or a little more experience of how the rest of the earth’s population survives, for them to wake up from their coddled existence. The great leveller, of course, is time, and in a few short years, they will be forced to deal with their own children, who will ridicule everything that they, as Generation Y, believed. Their arrogance will come back to bite them.

Tuesday, November 07, 2006

Vegetarian

When did primates begin to eat meat? Perhaps after they learned to harness fire and cook meat so it became soft enough for those with less-than-perfect teeth to chew and digest. In some areas of the world, there isn’t much of the vegetable world to choose from. The tribes who lived in the Arctic had a short season to gather fruit, seeds and roots. If there was game all around them, and they observed carnivorous species surviving on meat, their logic must have led them to adapt to a meat-only diet. The sea was a plentiful source of food, and with a hereditary knowledge that humans needed protein and fat to thrive, they added fish to their diet and saved themselves from a strict diet of roots and fruit. 

Meat-eating developed from a need to use what was available to combat hunger.  Vegetarians contend that they are more evolved than these primitive people and don’t need meat.  In richer countries, there is ready access to a variety of food groups, making it possible to live healthily without eating meat, but not all people across the world have the luxury of these choices.  Where is a dweller in the Sahara Desert to find a tomato?  Vegetarians may believe they are superior to meat eaters, but this superiority only applies to those who have a choice.  Having spent many periods on a strictly vegetable diet, I choose to eat meat because it is a ready source of good protein.  Like most humans, I was born with a digestive system that can make good use of meat for energy.  I cannot deny my biology.

Some don’t eat meat because they believe that killing animals is cruel. Depending on how the act is done, this is more or less true. Death for any living thing is tragic but inevitable. Animals die, plants die, and humans die. Humans perceive life on a limited level. Dogs and cats experience the world differently from us, and so does every living thing. I subscribe to the hypothesis that just because we can’t sense something with our limited faculties, it doesn’t mean the thing doesn’t exist. We can carry on munching carrots, deluding ourselves that nothing died and nothing suffered to feed us. But the carrot died; we interrupted its life cycle in harvest, pulling it up in the best of health. I accept that things die so that we can live. This is true whether I eat meat, vegetables, or both.

I silently say grace with every meal to give thanks for everything that gives its life so I can survive. I don’t argue that vegetarians should change their ways, but I believe they are somewhat misguided. However, if being vegetarian keeps these people’s bodies and consciences clean, they are welcome to their folly. In all aspects of life, it is important to remember that whatever we do or eat, it should be in moderation, with an ever-present awareness of what we are doing and why.

Saturday, September 16, 2006

The End of the Age of Plastic

Plastic, as a word, means malleable, but when most of us hear it, we think of a bag, a cup, a container for something, a disposable thing.  Plastic is a petroleum product.  Considering that petroleum is a non-renewable resource, it’s time we envisioned a world without plastic.  What is now manufactured as a commodity as insignificant as paper may, in the future, become a collector’s passion.  Like a grandparent’s remark, “When I was young, we walked everywhere,” our present use of plastic will become a curiosity of a time when everything was made of this inexpensive material.  Will a plastic grocery bag become a museum item?

It is difficult to predict what might replace plastic, but in our commerce-driven world, whatever replaces it must be economically viable. We already use glass and metal, which existed before plastic came along, but they have not yet regained their superior position over inexpensive, oil-based products.  Plastic is considered disposable, and though glass and metal can be recycled, it is easier to discard plastic and be guilt-free.  By the time plastic is a rare commodity, as collectable as Bakelite, our lives will have changed thanks to the effect of scarce petroleum.  Until we sort out power storage systems like batteries, travel using other power sources may be limited to short hops.  Production of what were once inexpensive items for mass consumption will be limited, and we will have to come up with alternative heat and light sources. This slow demise of the petroleum culture will cause a major shift in our lifestyle. Although we won’t return to a savage existence, we will be forced to subsist on a smaller scale, more sustainably. The items we use in everyday life may not only be metal and glass, but also stone, wood and other plant materials.  

The tools we use in our daily lives will always employ basic materials, either renewable or unlimited, but what was once considered unlimited may not always be so.  Passenger pigeons and buffalo were once thought to exist in unlimited numbers.  We presume that light, wind, rock, earth, and water are unlimited resources, but not all of them can be renewed once they are spent.  What grows on earth is considered a renewable resource, but will there be enough organic material to sustain a growing population?

Instead of plastic being the throwaway material, it may return to specialized uses that exploit its namesake quality, malleability.  It is possible that plastic could be used almost exclusively for replicating living things like the human body.  Our technology may advance to reproducing simulations of life from cells of anything that lives or once lived.  Petroleum products, like plastic, will simply become a rare catalyst in the construction of inventions that promise to assist our survival.  It staggers the imagination that the capabilities of a natural gift like petroleum are now squandered on shopping bags and other throwaway products.  Grandparents of the future will speak of plastic pipes, furniture, clothing, toys, tires, and casings for electronic devices.  Today’s rubbish dumps may be mined to recover plastic that has not yet broken down, so it can be repurposed into valuable, scarce commodities.

Monday, July 31, 2006

Situational Awareness

Situational awareness is a term often used in aeronautical and military training to instruct combatants and pilots to be cognizant of what is happening around them.  It is a skill that is used infrequently by many people in their everyday lives, as the common refrain “I never saw it coming” attests.  Many people go about their business with little awareness of where they are and what is in their vicinity.  Situational awareness is like a mother’s claim of having eyes in the back of her head, but it employs more senses than just vision.  When walking down the street, many people unconsciously watch where they put their feet (and some don’t) and subliminally assess anyone they are approaching.  They make a judgement about a person or situation, and adjust their stance on a sliding scale of friend or foe.  This instinct comes from the animal kingdom.  We humans can fine-tune this simple scale into many tones on our way to deciding how to react.  Do we ignore them, make eye contact, cross the street, or stop and speak to them?

Our awareness and reaction are also influenced by our surroundings. Is it day or night? Am I in familiar or strange territory? Are there other people present? What are the cultural habits of the place I am in? We use this skill of situational awareness to pass safely, and to communicate whatever it is that we need or want. This helps keep us safe in our world. 

When one begins driving lessons, an instructor may raise the point of situational awareness, because it is critical to safe driving.  Bad driving is a perfect example of how people aren’t observant, as it results in an inability to judge situations and act appropriately.  The worst case of a driver with a lack of situational awareness is a driver with tunnel vision, who drives straight ahead, looking only in front of him, and not too far ahead at that.  He doesn’t look side to side or use his mirrors, but drives his car as if he has no control except stop and go.  He may be driving in Bangalore, where this is normal, but for clarity’s sake, we’ll stick to Western habits.  The tunnel-vision driver may suffer from compromised motor and observational skills, so that staying inside one lane of traffic pushes him to the maximum of his capabilities.  He fears that if he looks sideways or back, he may lose control of his forward motion, which, in his state of reduced capability, may happen.  Driving is a skill that requires multitasking, but some people find this difficult or impossible.  Apart from some differences in processing speed and capacity to retain information, humans can be trained to multitask.  A new mother learns this from necessity, as children change quickly and require constant attention.  Multitasking while driving means controlling the speed and direction of a vehicle while being fully aware of what is happening on the rest of the road and trying to anticipate what might happen.  Some drivers believe that multitasking while driving is the ability to eat, drink coffee, apply makeup, window shop, and talk on a cell phone, all while changing lanes, gears, and radio stations.  These dangerous habits would be better replaced by thinking about where they want to go on the road and the best way to arrive.  Driving responsibly requires awareness of the other vehicles on the road, whose drivers have their own agenda, which may or may not make sense.

An important point about situational awareness is that those who lack it are a danger not only to themselves but also to others.  What will happen to a child whose mother isn’t aware of the child’s needs?  What would happen in traffic if all drivers thought in only forward mode?  What would happen if we perceived everyone who approached us as an enemy and reacted violently toward them for no reason?

Situational awareness can also be used to maintain our own physical and mental health. A doctor will often tell a patient to pay attention to his own body, repeating this obvious reminder because it is too often ignored. When an obese or thin person looks in the mirror, do they only see what they want to see, or do they see the objective truth about the state of their bodies? If a person experiences constant headaches, do they examine their life and try to discover if the cause is mental, physical, environmental, or do they just take a pill to cover up the pain?

There are three stages of situational awareness: the perception of the situation, the placing of the perceived factors on our own personal scale, and the decisions we make about our actions in this situation, which usually involve projecting any situation into the future.  How will this situation play out? Several factors figure into our ability to react appropriately to any given situation. The first is experience, the second is knowledge, the third is processing velocity, and the fourth is the degree of transparency of any situation.

In the absence of professional counselling, many people are unable to apply the concepts of situational awareness to their own life choices, and many subvert the obvious. We know, from the outside world and from our own experience, that smoking is bad for us, yet we carry on with the addiction despite all the evidence that it is harmful. Overeaters either admit what and how much they eat or delude themselves about it, yet continue to make unhealthy choices. Even in illogical situations like these, situational awareness plays a part. We may consider our life to be valueless, so we eat, drink, and smoke to comfort ourselves while we pass the time. We all die sooner or later, and if the future doesn't look particularly bright, we choose to indulge ourselves along the way. This bleak perception of the future is particularly prevalent in the young. Negation of the future is a common state in adolescents and young people. They don't see themselves as capable of great things, nor their world as heading in the right direction, so why try? Better to put on the blinkers and enjoy the ride, even if it leads to their own destruction. It is particularly damaging when this nihilistic approach is carried into full adulthood. These people may or may not be aware of the state their negative beliefs have brought them to, but willful self-destruction is not a tenet of life – it is anti-life.

Our society doesn’t encourage people to think for themselves, nor to examine the causes of things that happen around them. Governments know that people are more easily controlled if they are accustomed to being told what to do. This creates a world in which people often don’t know how to react to an unfamiliar situation until someone else tells them. People feel comforted when they can easily categorize an event into a box that allows them to assimilate the event, and they feel righteous when that particular box is a widely held belief in their own culture. They feel unified and validated, even if they are mistaken, since they have lost the skill of judging information for themselves. They are not encouraged to be aware, to think for themselves, to act of their own volition, to trust their own reading of a situation, and to act appropriately based on what they know. When people have lost situational awareness, their own survival is at risk. Many people live their lives so entangled in petty dramas that they lose sight of who they are, where they are going, and how to get there. Like the Tarot fool with one foot off a cliff, they don’t realize that their lack of attention to necessary things severely compromises their survival.

Monday, June 26, 2006

The Poverty of Speech

While learning the Greek language, I was surprised that in the small village where I lived, when I asked for the correct word to describe something new, I was given an old word I already knew. If I asked how a hosepipe became detached from a faucet, I would be told that it left the outlet. The word for "leave" has the same meaning it does in English, so it was hardly a precise description, and the word used for the outlet was the same as for an electrical wall receptacle. It was like saying that the hosepipe walked away from the plug. Context is everything; otherwise, the explanation could mean anything. This use of simple words is not because the Greek language isn't as rich as other languages, but because in a village where people only need to converse with their small circle, the same words are recycled to take on multiple meanings. There are complicated, precise terms in any language, but most people have no need of them: they are too difficult to remember, and people assume their neighbours wouldn't understand them, and perhaps in that they are correct. I once commented while in Italy that Italian seemed like an easy language, but was reprimanded by a German speaker, who correctly pointed out that although the language may appear simple at first blush, the more one learns, the more one realizes that correct Italian is as complicated as any other language.

Not only does a language have its idioms and dialects, which are enough to stump any learner, but it also has a plethora of words that are not heard on the street every day. Think of the English we use in daily oral communication as compared to the English in scientific or technical writing. Someone who has studied a language in school would probably have an easier time deciphering technical terms than they would have understanding a grunted, idiomatic, fractured conversation. One can always tell if a non-English speaker has learned the language from lessons or from the street, because their English is more precise, even if no one on the street understands them.

Languages always evolve, but often this is for the worse rather than the better. I see nothing wrong with invented words if they describe something more accurately. Nor is there anything wrong with the habit which has developed in the U.S. in recent years of using nouns as verbs, for example, “How does this impact our budget?” or "Mrs. Smith will chair the meeting." Words change their meanings according to usage. How else did “bad” become good? Technology also introduces new words, which are necessary to describe newly encountered entities.

English has lessened its descriptive power through the tendency to limit vocabulary. A prime candidate in English is the verb “get”. We use it so much that it must be accompanied by a multitude of helper words because, by itself, it means so many things that it means nothing. Try describing what “get” means to a foreigner. Get out, get busy, get by, get over, get through, get down, get back – the list is endless. All of these “get” phrases have better single words to describe the same thing, but we don’t use them. Do we prefer the poorer “get” because “get” is more common, tougher, more street, or is it that in American society, to show any sign of intelligence is considered an elitist weakness? This tendency to simplify, for whatever reason, causes a language to lose words. Who nowadays uses “arise” for getting up? We simply don’t have a word anymore for getting out of bed – the original word has all but disappeared.

Most people who speak only English tend to believe that English is some kind of mother language, which is the best at describing everything. English is, in fact, a great thief of words from other languages, which is one of the reasons it can be so rich. When one learns another language, however, one begins to understand that English is poor at describing many things. An example of this is the word “love”. Other languages have several words for love which describe various states. English speakers blather on about how much they love their car, their dog, or their McDonald’s hamburger, using the identical word “love” for their children or their spouse. The love for children and hamburgers is clearly not the same thing, so why then do we use the same word? Love has become such a catch-all word that, in the end, it means nothing.

The word “know” in English is another example of our laziness. Most other languages have one word for “know” in the sense of understand (Do you know how to ski?) and another word for “know” in the sense of being acquainted with. Not many people use the word “acquainted” anymore and would be thought old-fashioned for doing so, but the word “know” by itself is imprecise.

So many of our words now depend on context for meaning. That is, you can’t understand what they mean unless they are used in a phrase which explains them. This leads to many words which either mean nothing by themselves or are essentially non-words like “get”. If a language fills itself up with non-words which depend on usage for meaning, the language loses much of its beauty, precision, and power. Just as some people have a habit of overusing expletives in conversation, the power of a word is diminished when it is used as a filler, and it lends nothing to the meaning of a particular subject. If we use the “F” word as our only adjective, the shock value disappears.

Years ago, when I emerged from the cinema after seeing “Quest for Fire,” which was scripted with inflected grunts instead of words, I listened to the comments about the film – “Yeah, mmh, huh, uuh, kinda, uhuh, y’know, like, wow!” – and realized that our everyday conversations have not changed much in 10,000 years. We've become one-dimensional guttural animals who disparage nuance and learning.

Sunday, May 14, 2006

Take on Memes

A meme is a unit of cultural transmission. Richard Dawkins coined the term from the Greek “mimeme” (something imitated), shortened to rhyme with “gene”, and it carries echoes of mimetic and memory. Whereas genes are passed on biologically, memes are units of information passed on by imitation and reproduction. Willingly or unwillingly, we absorb memes from the time we are born. Taking on these units of information is as important to our survival as a healthy set of genes might be.

From the start, people with specific skills for childbirth have had this knowledge passed along to them. They weren't born with the information but learned it from other mothers, midwives, or doctors. Beyond the latest technological tools for microsurgery, the fact that a doctor might wear glasses to help him see is itself a product of memes. When someone discovered that a piece of curved rock crystal could magnify things, he transmitted this information to someone else. This knowledge of lenses is only one of the millions of memes that assist us in our daily lives. With poor eyesight, the doctor might never have made it through medical school to go on to save lives. Simple eyeglasses help us see, and so help us learn what cannot be passed on by those closest to us. We speak, we write, we read, we learn, and we ask. Languages are produced by memes. From our family units to our communities, our religions, our inner selves, and our worldview, all of these belief systems are learned by imitation.

Memes are not new, but they have only recently been named. The study of meme dynamics helps us understand ourselves as a species on more than just a biological level. There are many branches of meme theory - meme warfare, memes as parasites, the study of macro memes (religion & theories) and micro memes (words & habits), the brain as a host for memes, the extinction & replication of memes, adaptation of memes by natural selection, and the death of memes. Memes are passed on and caught by word of mouth, by action, by all of our senses. Memes live in us, in the media, and on the internet. It has been said that “a human being is an animal infested by memes”. Humans can be faulty carriers of memes. Computers are better carriers, as they can quickly calculate possible outcomes, but computers, for the moment, lack some of the tools for processing memes, like morality and inspiration.

Memes mutate by re-imagining themselves in light of other memes. Much like our galaxy’s spiral form, memes, when reproduced, are not exact replicas of their seed, but are sown on another level up or down a spiral path of the long human march.

Unfortunately, many people these days consider memes to be a joke, a way of poking fun at a cultural icon.  The importance of the word and what it has done for humanity has been trivialized and discounted.  Without memes, there would be no Internet, yet the online memes that helped to create it have eaten their own mother. 

Saturday, April 29, 2006

Dreaming in Bytes

The brain is like a sponge, absorbing experiences during the day that may be acted on immediately or stored for later. Knowledge is constructed by integrating this new information into a semi-permanent file. When we weigh it against what we already know and find it compatible, we accept it as truth. There is a critical factor in this judgment: skepticism. If something doesn't seem quite right to us, the unresolved experience often comes out in dreams. Our brains have filed these preoccupations into folders where they can be accessed and put forward for resolution at a later date. When an unsettling replay resists classification, the process of dreaming tries to re-enact the scene to understand what occurred, matching it with similar settings, characters, and emotions. If the unresolved event is eventually understood, it can pass into our knowledge base as a cohesive, evidenced fact. If it is not resolved, it is put back into storage so it can be returned to the luggage carousel at a future date and mixed with a different set of suitcases to see if it now makes sense.
 
Some cultures believe that dreams are an alternate reality, and this, to me, is akin to the theory of a parallel universe – neither proved nor disproved. Some believe that knowledge is passed on in dreams, and to some extent this is true, but waking reality plays a larger part in our understanding of the world. Since empirical knowledge is stored in an area that is accessible in dreaming, the mixed salad of our dreams also contributes to our knowledge base. When some undigested experience resurfaces in dreams, it can play out in a way that helps us understand it better. It isn't often that dreams are understood immediately, but the practice of writing them down can reveal their truth long after the fact, even if it only reinforces our emotional state at the time we dreamed them. Often, we don't recognize this state until it has passed and been filed in our memory. The way experiences are processed may have a territorial factor that harks back to the disgraced belief in phrenology, an idea of localized function that has now returned, in respectable form, thanks to the technology of CAT scans and MRIs. When someone says something is “in the back of my mind”, common experience tells us that the brain cells used for storing more permanent knowledge are located deep in the cranial filing cabinet. Frontal brain cells, among other things, control our more immediate facial and lingual responses. When a person says something is “on the tip of my tongue,” it is an indication that the frontal engine is trying to access the dustier reaches of our minds before it can move the information forward.

Filing cabinets of memory can work like a zip file, remaining compressed, occupying little space until a trigger or command asks for an unzip and the file or memory expands. Often, this trigger occurs in dreaming. Memory access during a dream isn’t a perfect search, but our memory searches in waking time aren’t always successful either. When we are awake, an unexpected memory or desire can drift forward at an inappropriate moment. Unrelated events, scenes, and people can populate dreams, sometimes causing perplexing combinations. Often, we wake from dreams asking ourselves, “What was that about?” Unless we can separate extraneous elements from relevant ones, we can’t make sense of our dreams.

Sometimes a dream - more often a nightmare - will wake us. Before we wake, our body will try to react to an event in the dream, and we will kick, move our arms, try to speak, grunt, or shout aloud. Sometimes we enter a half-waking limbo, where we know we are dreaming, wish to wake up but can’t, and consciously try to move or make a noise that will wake us. Informed by the body's need, our knowledge base tells us that we are dreaming as the body struggles to overcome sleep paralysis. Our survival instinct knows that remaining in a panic state for long will be traumatic to the brain, and therefore signals the brain to push us up into consciousness. This struggle can also be thought of as the effort of the frontal brain and the rear brain to communicate. It is now widely accepted that the brain stem at the back of the brain controls motor functions like breathing and heartbeat, and that the survival instinct is based there. Since messages travel across networks of neurons, it takes time to assemble the appropriate files and send them along the circuits from the back to the middle knowledge base to the frontal cortex. It can take a few seconds to wake up from a nightmare. We are, after all, humans, not high-RAM computers.

Generally, our brain tries to do what is best for our body. The body is the vessel for the brain. Included in this instinctual health program are dreams. Our mind tries to digest our experiences in an automatic defrag, which takes place every night. Defrag is short for defragmentation, which attempts to re-file stray bits of information so there are more blocks of stable usable space available for new memories. Unless we defragment our brains for a period every 24 hours, we do not remain effective, rational, or sane in our waking lives.

There are computer programs that suggest the user delete information that is not connected to any usable material or has not been accessed for a long time. Much like a computer, our brains sometimes tell us that certain bytes of information are gone, permanently deleted, but this is not actually the case. Microsoft and other software makers would have us believe that deleted information is unrecoverable, but those who understand computers on more than a superficial level know that much of what was once there is still there. This is also true for the brain. We don’t really forget; we only can’t remember. Sometimes what we thought we had forgotten returns to us at unexpected moments. Sometimes lost memories return in dreams.

Dreams are natural, useful, healthy, elusive things. We benefit by understanding them. The antidote to fear is self-knowledge. The antidote to communal fear lies in understanding our world. Dreams help all of us in all realms.

Thursday, April 27, 2006

The Feelgood Virus Rules the West

In Western society, feeling good has become a guide for how to behave and what to buy. The criterion for whether or not a thing has value is whether it feels good to the person who interacts with it. The corollary is that if something doesn’t feel good, it has no value. In our lives, apparently, we should only keep what makes us feel good. The problem with this belief is that nothing that has advanced the well-being of mankind has ever happened or been invented without some struggle, and struggle doesn’t feel good. Schools often teach that correcting children might stifle and therefore damage their creative spirit, so we must always praise everything a child does. This is not to say that criticism and punishment should flourish, but a balance between praise and criticism helps children understand the reality of the world. Participation trophies are more damaging than helpful.

It may be that the prosperous post-war society in which parents and society indulged their children by giving them “all the things I never had” spawned the “Me” generation. If you indulge a child too much, he will come to expect everything he wants and believe that he is the center of the universe. He is disappointed when his adult experience tells him otherwise. The “Me” generation couldn’t come to terms with the fact that they were not the most important people on earth. Those born after the Second World War were taught to follow their dreams at the expense of everyone and everything else. It's a hard lesson when they realize that following their dreams doesn't necessarily result in success, either financially or in personal fulfillment. Sometimes those dreams are unrealistic and unachievable. A child can dream of being an astronaut, but if he doesn't have the intelligence, physical qualities, and skills to get there, he will be disappointed. An American astronaut is required to be no taller than 190 cm, and chopping off his feet will not get him into space.
  
The isolation of individuals in a “Me” society has been exacerbated by indulgent parents, a tolerant society, the media, and a culture of fear, until almost an entire society is composed of disconnected individuals who substitute their imaginary sense of belonging for the slings and arrows of real life. There are exceptions to this world of isolation, such as participation in team sports, but only for players, since a spectator retains their isolation.  Players are indoctrinated by coaches who tell them the motive for playing is that it “feels good to win”. Pep talks are dominated by the dream of feel-good victories.  If teams or athletes come in second, it is seen as having no value. People can convince themselves that they belong to something important when, as spectators, they join together with other sports fans.  They celebrate this form of virtual belonging because it makes them feel good, but they are not doing anything active.  They count on the actions of their team or sports hero to make them feel good.  By themselves, they contribute nothing except noise. 

Pop music and films are geared to sales and encourage fans to buy products to take home. People still go to concerts, clubs, and movies, but the driving force behind these events is to sell products that let people feel good at home with a replica of the original. Music concerts are loss leaders, intended to sell merchandise. Films may not make the most money in their original cinema runs, but they bank on income from future sales.
 
The experimental drug culture from the ‘60’s onward encouraged drug taking as a way of “feeling good”. Of course, taking a drug is a personal experience as the user is the only one who feels the effects. Observers might see the results, but the experience is personal. No two trips from taking LSD are identical.  Drugs are taken for escape and entertainment. Alcohol is the same.

In less affluent societies, people have more pressing needs than “feeling good,” so the idea that this “feeling” is a reason for making decisions is viewed as an obscene Americanism. Someone who must expend all of his energy to search for food to give him strength for the next day doesn’t jeopardize his life by basing his decisions on what feels good. Most other societies have a much more solid foundation for making decisions, such as whether an act is in harmony with the society they live in, and is harmful or helpful to their community. Western society’s soul has come detached from its moorings, so it searches for something to fill the gaps left by former anchors like religion. Religion has taken a battering from science. Worship may make a believer feel good, but so many people these days are no longer satisfied with the answers offered by religious texts.

If a man in a poor country has scraped together enough money to buy a product, his choices would probably be based on usefulness, reliability and price. He might base his decision on what might help himself or his family, but to choose something because it “feels good” would be the least of his reasons for making a choice. It is true that choosing something for its rightness (usefulness, reliability and price) makes the buyer feel good, but choosing a product based only on its wow factor would be considered a foolish purchase.

The western marketing colossus attempts to create needs where none previously existed by exploiting human characteristics such as pride, envy, and a desire to feel superior. Television, which is watched worldwide, is an ideal medium for insinuating these new needs into every level of society. Feeling good is an easy sell, but the downside to the pitch is that we think we need these things that make us feel good because they cover up the emptiness.  This is not to say that man doesn’t have a desire to feel good, but to believe that this desire is a philosophy, a way of life, or a reason for action, makes for an empty life spent travelling between one indulgence and another.  It is an existence without heart, spirit or soul.

Tuesday, April 25, 2006

Sculpture in Public Places

Sculpture was born from figurines crafted at the dawn of man.  Some tribes practised cave painting, while others made their totems in stone, clay, ivory, and wood.  Egyptians cut obelisks from native rock, Greeks constructed temples with columns and statues, Romans built arches and colossi, and kings erected monuments to their battles and themselves. In the democratic age, unexpected icons like the Eiffel Tower, the Statue of Liberty, and the Sydney Opera House appeared to celebrate industry, science, nationhood, and the arts. Those who live in the countryside have mountains, trees, and the sky for spiritual nourishment, while the citizens of cities can only hope that their overlords allow them an integrated and aesthetic environment that reflects their culture, history, and community.

Large European cities like London, Paris, Rome, and Berlin have sculptures from every era that juxtapose the old and the new.  Rome has the visible bones of the old empire flanked by Fascist parade avenues, Catholic churches, palaces of rich families, and human-scale squares which have room for neo-classical fountains, cafes, souvenir shops, chic boutiques, neon, noise, roaring motorbikes, and all the racket that a living city generates.

Anyone who has ever lived in the country knows the claustrophobia of the city and how it feels to be confined to narrow streets and towered over by buildings. Humans need open space for their souls to breathe. Those who live in large cities are bombarded with sensory input, which is responsible for both the stress and the excitement of city living. Early in the development of American cities, land for squares and public spaces was set aside. As cities grew, it wasn’t practical for the inhabitants to travel long distances to get the open space they craved, so parks were created. Sometimes this land was donated by civic-minded benefactors; in the case of many European parks, ex-royal pleasure grounds were made public. If we accept that we have to live and work in crowded cities to support ourselves and our families, we should have input on what our public spaces will be like. Sculpture is often the last thing to be added to a park because it is expensive and subject to damage by disgruntled residents. A park is a natural home for sculpture, as it is already a beautiful location, so even a ragamuffin piece could look handsome there. Sculptures may become favourites, go unnoticed, or become universally disliked, and as all dictators have observed from their reserved seats in hell, statues can be knocked down and dragged away.

Street settings for sculpture are more problematic, but can have interesting solutions. West Berlin and East Berlin were both rebuilt after the Second World War, and both have done an excellent job of incorporating sculpture into a modern city. West Berlin has a drainage problem – it calls itself the Venice of the North – and supports a network of above-ground water pipes, but sculptural solutions have been found in giant modern pieces such as the Matschinsky-Denninghoffs’ “Berlin,” a loose knot of fluted stainless tubes. It stands in startling contrast to the sad, truncated tower of the Kaiser Wilhelm Church or the mounted statue of Frederick the Great, but the contrasts of history live comfortably together. Wide country streets in North America became main thoroughfares with football-field-sized intersections that would dwarf a Roman fountain, and in a strictly capitalist society, public art doesn't generate revenue, so it ostensibly has no purpose.
 
After the war, East Berlin created its version of Soviet chic, with interminable rectangular blocks of buildings and the windswept spaces between them, which, for some, hold a severe beauty. Yet the official East German artistic vision produced a retro space-age TV tower. From almost anywhere in the city, you can find your direction by it, like navigating by the moon and stars. It fits its situation because it rises from launchpad-sized Alexanderplatz. Appropriate sculpture can add scale, history, mystery, and importance to a place. A sculpture, however, must be chosen for both particularity and universality. It should represent its era and its location, and also have longevity, not only in design, but in the public imagination.

Equipping public spaces to be more livable has a price. Someone has to buy, install, and maintain whatever is installed there. Architects and planners now create spaces in front of large buildings by using corner cutoffs and building setbacks. Depending on the size of the found space, benches and shrubs are possible. Sculpture in these locations occupies less space and is less expensive.

Sculpture in public places democratizes art by bringing it outdoors. Living with art is no longer a preserve of the privileged. Yet in a rather American way, we tend to segregate our duties and pleasures. For open space, city dwellers go to a park; for shopping, a mall; for exercise, a sports complex; for art, a gallery. This fragmented approach makes every facet of every activity suffer by dislocating it from everything else. A sane, healthy living space should be integral to its surroundings.

Public parks are good things; people need them. Sculpture parks, however, are a step backward as they reinforce elitism and segregation. We should surround ourselves with some of the best examples of what artists, sculptors, architects, and town planners can produce. It is possible to be surrounded by beauty and thoughtfulness in the street, the bank, the shopping mall, and in the workplace, without having to make a trip to a gallery.

Old can mix with new, and different interests create diversity. Variety makes a powerful statement. Look at I.M. Pei’s glass and steel pyramid in front of the Louvre, Botero’s chubby bronze characters in a Florentine piazza, Chicago’s reflective Cloud Gate, known affectionately as "The Bean," planted in windswept concrete, Joe Fafard’s circular filigreed iron corral “Mind’s Garden” in a flat Regina field, and the HSBC atrium in Vancouver, which barely contains Alan Storey’s precise monumental motion “Pendulum”. The behind-glass location of the latter solves the problem of vandalism and protects the piece from weather, but should a sculpture be protected from being climbed on and touched? Yet the original of Michelangelo’s David is kept indoors. The Copenhagen harbour mermaid has been damaged at least eight times, but has always been put right. Like painting over graffiti, repairing damage and supporting creative alternatives to youthful self-expression is good policy in maintaining any public space and its sculptures.

Much effort and expense are channelled toward winning garden awards, yet in the Northern Hemisphere, flowers bloom only half the year. The same applies to Northern fountains; water freezes. Sculpture chosen for installation should thrive in all weather. Government and business often overlook the practical and healing role that well-chosen installations have in making a place attractive and memorable. Sculpture is an ideal candidate for lifting a location from banal to sublime.

A wealth of sites for public art exists, but local governments, when deciding how money should be spent, concentrate instead on lighting and the smooth flow of traffic. Citizens' groups, which have a tendency to celebrate themselves by erecting boosterist welcome signs reminiscent of frontier-town timber gateways, could spend the same time, money, and effort installing something that transcends politics and commerce. Well-chosen sculptures that make people contemplate their society and how they fit into it are more valuable than banal beautification projects like painting flowers on crosswalks. A statue of a politician is about as enlightening as a painted flower.
  
These decorations are fads that fade quickly, while a timeless work of art should say something profound to those who see it daily.