It’s been two weeks now, and the entire right-leaning media is stunned – stunned! – that people have so quickly moved on, forgiven and forgotten the scandal that was our Prime Minister darkening his skin for a dress-up party some 20 years ago.
It would appear that the only individuals with any interest in making a big deal out of this were his political foes and the media who support them, as even the average conservative Canadian didn’t see this as much more than a tempest in a teapot, and that includes people of colour who had more reason and justification than anyone to voice their disapproval. The truth is that nobody cares that much about an old case of brownface, as it is called by Canadians, or blackface, as labeled by our neighbours to the south.
All the obvious points have been made – that it was an Asian-nights theme party, that Trudeau was dressed as Aladdin, that it was 2 decades ago, that the moral landscape was different, that he was a drama teacher who went all out for events such as these and mostly, that his policies and decisions as the leader of this country have been, through and through, the very opposite of racist. The sum total of these points should sufficiently demonstrate why this incident needs to be put to rest.
These same people will, of course, insist that the practice would not be acceptable in this day and age. Today’s society, after all, is woke and fully aware of the political incorrectness of such an act. Some even know about Jim Crow and the way blackface was used a hundred and fifty years ago, with its mocking quality and degrading way of presenting African Americans at a time when the N word was still freely used and slavery was still legal. Indeed, if Trudeau were to demonstrate this lack of judgment today, there would not be enough Sorrys in all of Canada to get him off the hook. That, at least, appears to be the consensus.
I’m prepared to take a step in the other direction.
While I agree that the incident was greeted with the proper eye-roll by the majority of the population, I part ways with the idea that it would be disrespectful to engage in this type of activity today.
Bear with me.
In many cases – though certainly not all – brown/blackface is something that is done by Caucasians who simply don’t see colour – in other words the least racist people of all – and only when an occasion warrants it, such as Halloween. These are people who have engaged with friends, co-workers and clients of many different cultural, religious and ethnic backgrounds, and they truly don’t care what you look like and just treat you based on your qualities as a human being.
It’s highly unlikely that a racist would want to dress up as a black person any more than a homophobe would want to dress up as a drag queen. (I’m excluding anyone with the questionable common sense of wanting to dress as a slave.) But if the idea, on Halloween, is to disguise oneself – translation: “look like someone/something else” - then you go all the way. You want to be Jimi Hendrix? Without the darker skin you’re just a hippie. You want to be Aunt Jemima? Without the matching skin colour, you’re just a lady with a towel around your head. You want to be Michael Jackson – ok, bad example. You can be Michael Jackson and look like him regardless. But you get my point. Of course you don’t have to make yourself any darker – every Halloween party includes people having to explain exactly what they are supposed to represent because it just wasn’t obvious – but it should be perfectly OK to match the person or character you are trying to resemble.
The very definition of equality is not treating people differently based on colour. Should certain people be exempt because of a 150 year old practice that bears some resemblance to a joyful and fun party ritual many people engage in today? Halloween, the one occasion that screams “anything goes”, has become so politically-correcticized and regulated – no guns if you’re a cowboy! – that the fun has been completely diluted, on many levels.
Of course this doesn’t mean that no costume is ever in poor taste, but if you want to be Charles Manson or Hitler and that’s what floats your boat, then knock yourself out. We are stuck with our bodies and our looks 364 days out of the year. We should have free rein over whichever costume we choose without worrying about whom we are offending or being made to apologize 20 years later.
Something to consider with Halloween being exactly one month away.
When did you last change your mind? I don’t mean about what you planned to wear on Tuesday, or that time you ordered pizza instead of Chinese. Forget even more serious matters, such as considering a new career path, or cutting ties with certain people in your life.
I mean, rather, about a point of view. A belief. Your perspective. Those things that, although they don’t – or shouldn’t - define you, in essence reflect the person you are.
I can’t say I’ve ever heard anyone fess up to being closed-minded. Being a slob? Sure. Lazy? Often. Having a temper? Yes, even having a temper. But being closed-minded? Nosireebob, nobody wants to be that. When it comes to our tolerance, you would think we are all just a happy bunch of free-spirited hippies who are fully accepting of all human beings and ideas.
There are levels of open-mindedness of course, and, just so we’re clear, “I’m not racist but…” doesn’t fall under any category. It’s what racist people say when they can’t acknowledge being racist, which leaves them in no position to change their minds about it.
Others come with a disclaimer, like the tired and overused “I’ve got nothing against gays, as long as they don’t hit on me.” Yeah, no, that’s not QUITE the open-mindedness we were shooting for, but I guess it’s a step up from shouting slurs through a megaphone at the LGBTQ parade.
So let’s focus on those who are generally quite open and accepting of people from all walks of life, those who won’t treat you differently based on your skin colour, gender, age, culture or sexual orientation. Those who are genuinely fond of human beings and believe that we are all equal. Chances are, even they will admit to reacting defensively when the issue at stake touches on long-held beliefs or world views, with certain subjects hitting a particularly raw nerve. The death penalty, abortion, climate change, euthanasia, gun control, war, immigration, the legalization of drugs are all sensitive issues, and with good reason. We don’t so much want to be right as we need to be right, because, as rational as we think we are, we are deeply emotionally invested in our beliefs. That is why very few discussions involve a factual, objective analysis of the pros and cons of either side, and fewer still result in a re-assessment of one’s convictions.
Much of our inner compass is formed by our early environment on an unconscious level. Our parents tell us, and model for us, what is right and wrong, and often tell us what to think. We are, for the most part, not encouraged to look at various facets of an issue, to question things, to form an outside opinion and arrive at a personal conclusion. We are certainly not encouraged to challenge our parents’ views, which would be seen as a threat, and to run counter to them could result in undesirable consequences, like withdrawal of affection or straight-up punishment. And so we learn that to be loved, we need to conform, and we internalize these viewpoints, and convince ourselves that we have come to these conclusions freely and of our own volition.
With this ingrained code of ethics, we then go into the world instinctively looking to connect with people reflecting similar values, thus finding our tribe, a group of like-minded individuals who make us feel normal and accepted, and sometimes feed our sense of superiority over those who think differently. The weaker the self-esteem, the stronger the dynamic.
Like our parents, we in turn feel threatened by any idea that runs counter to our belief system. We fear being wrong, and therefore rejected and outcast and somehow diminished. Being wrong feels like admitting defeat, while sticking to our guns, insisting on being right, almost becomes a matter of survival, and so we stay the course.
We don’t do that to ourselves when it comes to that career change, or cutting ties to toxic people. There, we can freely, happily, own our decision to make a change, and at no point do we feel that this reflects on us as having been wrong, or bad, or less worthy. We can happily admit that there was a time when that job, or that friendship, had a reason for being, but that, over time, transformations took place that have now brought us to a new place and time, making us seek conditions that are in sync with who we are now.
Yet, when it comes to our beliefs, it is a whole other story. We get defensive and lash out, when in fact we should embrace the challenge. Having someone question where we stand forces us to make a solid case for ourselves. We may come out of it with a renewed sense of conviction or come to realize that a view that had value 25, 10, even 2 years ago, needs to be re-adjusted in light of new information, or a new perspective. But if we are truly confident, if our position is solid, it will withstand the strongest counter arguments, and if it’s not, then it deserves to be dismantled, and not begrudgingly, but with a tipped hat, and congratulations. It’s a battle of wits minus the ego.
We’re not talking about changing your mind like your underwear – same symptoms, different result. But to allow for our mind to be open enough to hear another person out and to give credit where credit is due, to be happy to see our beliefs tested and challenged and find out what we are really made of, and to realize that we simply don’t have a horse in the race – not because we don’t have a horse but because there’s no race – that is good stuff.
So when did I last change my mind?
It was a few months ago. A friend and I were discussing organ donation. As a strong supporter, I was thrilled to hear about countries making the practice mandatory, meaning that rather than consenting to organ donation by signing your card, you actually had to opt out if you did not wish for your organs to be removed. I thought this was terrific, as there is such a lack of donors, and such a dire need for organs. This was for the greater good, after all, how could anyone possibly argue?
And then my friend asked “But why should organ donation be the default?” I wasn’t sure I understood. “I mean, why isn’t the default to leave the body untouched? Why is it incumbent upon the individual to make sure that his or her body is NOT invaded? The default should be the natural state of the dead body. Any amendment to that should need to be consented to.”
And he was right. Mandatory organ donation was essentially forcing someone’s hand – either by making them donate, or by making them have to ensure they don’t.
I’d always been the first one to say that right and wrong were principles that could not be altered according to benefits, or risks, or popularity, and this was no different. So I re-adjusted my position. Not joyfully, I might add. I would much rather that there be an abundance of organs available for anyone who needs one. But that just means that we need to double down on our efforts to educate people, encourage them to sign those cards, help them see the benefit, and take the fear out of them thinking a misdiagnosis might lead to the removal of their organs when they could instead still wake up and return to a full, happy, healthy life. More work, really, but not a lost cause.
Changing one’s mind about one’s beliefs shouldn’t be as simplistic as ordering pizza instead of Chinese, but there is no reason it can’t be just as simple. And it’s quite liberating to be able to say “I guess I never looked at it that way before!” and shrug, and see that life goes on, not only not weakened, but strengthened by a new perspective.
I strongly recommend people try it, every once in a while.
If you live outside Quebec, chances are you have never heard of Bill 21, whose aim is to keep the province, and all people working in its name, secular by forbidding the wearing of any overt visual symbols representing any kind of religion, including jewelry (such as the crucifix), and items that reflect a religion’s tradition (such as the veil or the turban).
If you live in Quebec, you probably can’t stand another discussion on the topic, although, to be honest, it’s not much of a discussion when the two sides just spew their arguments and then take off running.
Both groups have attempted to make theirs a case of ethics and morals. One side claims that it is crucial that all people employed by the province – judges, police, teachers – be free of any visible signs of religious influence, while the other champions freedom of religion as a basic right that trumps, well, pretty much all others.
As if religious neutrality could be attained by the simple removal of a couple of items worn by the person in question. And as if there were not already serious asterisk marks next to all our basic freedoms.
For all their touting that “it’s a question of principle,” both sides are completely and utterly off the mark.
Let’s begin with our freedoms, specifically in regard to employment. We have already accepted a myriad of limits on our attire in the workplace in general, regardless of whether we are employed by the state, a corporation or a private business. Dress codes are commonplace, the strictest version of which is the uniform, with some reflecting the profession (police and firemen and judges), and others the company (Air Canada stewardesses, UPS drivers and the army of McDonald’s employees, e.g.). There are many reasons for uniforms, but what they have in common is that they are non-negotiable. You want to work for us? Here’s what you’re going to wear. If you don’t like it, don’t bother showing up on Monday morning.
Other requirements are specifically determined with security in mind. Hard hats. Steel-toed boots. Gloves and glasses and hazmat suits. Nobody would argue those.
And then there are the no-brainers. These are not specified in writing, and for good reason – because they are, except to the truly delirious and those taking devil’s advocacy a step too far, painfully obvious. The company policy booklet should not need to specify that you can’t show up to your office job in a bikini, nor that it is unacceptable to service the public in your PJs.
So no - we don’t have full-fledged freedom when it comes to what we wear in the workplace. Never did and never will.
But then what about jewelry, or other items that express our individuality, such as tattoos, or piercings? If this law were to go unchallenged, I could technically wear a T-shaped pendant if my first name is Tania, but not be allowed to wear a crucifix. I could wear a charm in the shape of a star as long as it’s not inside a circle and susceptible to being mistaken for a Wiccan symbol. I could wear a nose ring if I’m a punk, but not if I’m a Hindu.
With tattoos, the slope becomes even more slippery. Sometimes it’s a matter of where it is on the body, and whether or not it can be easily covered up, regardless of what it actually represents. Oftentimes, a simple tattoo of a butterfly or a flower will be accepted even if easily visible. But what if the new prospective hire has ink in the very same place as the long-time employee, except this one is not of a sun but a Star of David? What if, pushing it further, it’s a swastika, which was used as a symbol of divinity and spirituality in Indian religions before it got hijacked by the Nazis? What then?
Which leads us back to clothing – not the actual piece of cloth, but any message it might carry. In agreement with the new law, I cannot wear any clothing that represents my religion - therefore my beliefs - but it would be perfectly OK to wear a T-shirt that says “My religion is better than your religion”. Because those are just words, not symbols.
Which in turn leads us back to religious symbols. The proponents of Bill 21 argue that it is critical that no provincial employee show any sign that could be misconstrued as religious bias. But if someone is truly biased, how is removing any visible symbol going to remedy that? You can paint a poisonous apple red all you want, it won’t make it any less poisonous. I’m a big fan of cards on the table, and someone who shows their religious colour is no threat to me. Plus, if I’m the one doing the hiring, I’m looking for the best possible candidate, not the one who best hides what God he or she prays to.
Mostly, it’s difficult to ignore the irony in the fact that this law has been passed, among other reasons, to “free Muslim women from the oppression of wearing the veil”, with the lawmakers somehow completely oblivious to the fact that forcing women to remove this item against their will is really just another form of oppression.
Muslim women of course insist that they wear the veil by choice, and in some, maybe many, maybe even most cases, this is true, at least for those living in the Western world. But it’s extremely difficult to measure the impact of, if not religious, then at the very least cultural traditions that are deeply ingrained, and who knows exactly where that line even gets drawn. But before the self-appointed liberators rejoice, ask yourself if we, the supposed woke, don’t deal with certain cultural pressures ourselves, especially in the case of women. In a business setting, is it not still expected, to some degree, that a female wear heels? Make-up? That her hair be done? We’re not as far apart as we think sometimes.
The worst fallout of this new bill is the moral legitimization of overt racism. By saying we do not accept people expressing their religion via their physical appearance, we have in effect encouraged bigots to voice that rejection out loud, and sometimes even act on it. Mental, emotional and, in extreme cases, physical harm has come to some of those suffering the consequences of this hatred, none more so than Muslims, who have borne the brunt of the discrimination. And let’s not kid ourselves: Muslims are the intended targets of this new legislation, but you can’t just single out one religion, so the hypocrites have officially included them all and now use that “fact” as justification.
But then what is the answer? How do you resolve this?
First, by nixing the bill, as it is, in fact, discriminatory. While it is true that other basic freedoms come with their own fine print (freedom of speech, for example, does not trump the laws on hate speech), this particular bill is all about restricting freedoms of certain individuals while protecting nothing and nobody at all, unless you count the easily offended and those insisting on living inside their bubbles.
And then you apply common sense.
Utopian, I know.
But we can do this. We can collectively be reasonable and fair, and judge each situation individually. We are quite capable of differentiating between an item that poses a security risk in a given setting (a kirpan, for example, for a school teacher) and one that does not (kippah, anyone?). We can easily determine if an item poses a hazard or interferes with the requirements of a certain position (a turban will not fit under the helmet of a motorcycle cop). Those in charge of hiring new blood can surely figure out the best candidate based on qualifications, not appearance, while members of the public will have to trust that these people were hired on their merit and nothing else.
Of course some cases will always be ambiguous, or complex, or sensitive. But we need to let those who form the core of that particular workplace weigh the options and make the most educated decision. Will the outcome always be fair? Probably not. Judgment is part of human nature. There will always be a chance you weren’t hired because you advertised your religious colours. But there’s also a chance that you weren’t hired because you look too much like the interviewer’s ex. You’ll never know for sure.
Regardless - leaving the issue to be handled by individuals who know exactly what is at stake in a particular situation, at least, allows everyone to rise to the challenge, to consider the unique aspects of the case at hand, and ultimately to do the right thing. Collectively.
What IS certain is that enshrining these dos and don’ts into legal guidelines is not only wrong, it is counter-productive, and rather than help create a more open and fair society, it will further divide. The proof is all around us.
I very much wanted to be the Badass. You know – the one from Jen Sincero’s book, “You Are a Badass”, a motivational effort aimed at making you believe that nothing is impossible, that your magical powers are simply waiting to be tapped into and that the life you are dreaming of is completely within your grasp if only you tweak a few things, like your entire mental DNA.
And who can resist getting sucked into the positivity of this can-do approach, of believing we are invincible, capable of achieving the loftiest of goals by putting everything on the line in return for the proverbial pot of gold?
Of course, it’s easy to fall prey to the enchantress whispering in our ear that we are up to the task. After all, when was the last time we were truly encouraged? And by that I don’t mean the tepid I-know-you-can-do-it pats on the back from the co-worker trying to console us after a work meltdown, or strangers on social media deeming our comments worthy of a like-button. We so want to be able to identify with that mountain-moving image of ourselves that we get momentarily swept away, buying into the illusion that the author is addressing us – yes, us! – as if he or she truly knew our innermost selves. The words are so warm, and funny, and down-to-earth that we can almost be forgiven for such a moment of (let’s call it) weakness.
But the truth is we are woefully predictable. Imagine, if you will, a fake personality test inside a focus group with the results assigned to each participant in completely random fashion. As long as the overall feel was positive and not overly specific, with some traits made to sound personal, like they applied only to you - wink-wink - chances are that most people would likely identify with the character description. “You are a generous soul”, “your tastes are quite unique”, “you have a great sense of humour”, “you are trusting despite your initial scepticism”, “you can have trouble getting motivated”, “you feel misunderstood”, etc. One would almost have to be an alien not to feel that at least some of these statements fit in some way, shape or form. And who doesn’t want to believe they have a great sense of humour?
So here you are, reading this book, suddenly feeling very understood, and identifying with all these traits of potential greatness. We so desperately crave validation that we are willing to suspend all rational thought, our good common sense and the right we have earned to claim that we know ourselves better than some stranger who, frankly, has little more on the line than the few dollars he will make off the book you bought and that you will set aside as soon as you’re done with it in order to go back to your daily routine.
That is not to say that these authors are not sincere, or that they are not truly trying to help you get out of your rut. But they are not accountable, and there’s always that disclaimer – the one that implies that you have to want it enough – or else it won’t work, so it’s win-win for them, and status quo for us.
Because the truth is that most of us can’t or won’t be able to live out our ultimate dreams, and it belittles us to have examples waved in our faces of people – oftentimes the authors themselves – who started off with nothing, living out of an unheated one-room apartment and having to choose between breakfast, lunch and supper, yet somehow ended up with a beachfront mansion, a fancy car and a zoo of exotic pets. Of course, we are told that the dream need not be extravagant, and that if all you want is your little cupcake venture, then that’s just as good, but somehow it always comes back to your being able to rake in the millions doing exactly what you love if only you’re willing to go all the way.
So then how do you explain it to the person who has put in the effort, the time, the blood, sweat and tears, and still has nothing to show for it? Do you tell her that the effort, while valiant, fell short of a couple of vials of blood and several beads of perspiration? That she simply didn’t believe in it quite enough? That she gave up too soon, even if she’s been at it for most of her adult life? That she wavered on her commitment when she was unwilling to quit the part-time job that was allowing her to pay the bills while she was working on her dream? That’s a bit of a copout. Is there never a time when one must accept that the dream has run its course and that it’s better to let it go? What about plainly unrealistic dreams, like that of the 5-ft-3 140-pounder who wants to play defense in football? Or people who want what they want for all the wrong reasons, like those who would rather be actors for the fame than for the love of the art? And what about inequality, for that matter? The aspiring doctor from a well-to-do family can easily afford not to work part-time while studying; the one from a low-income background can’t afford NOT to work while studying and may just have to make an excruciating choice.
Can people overcome the odds? Certainly. But many won’t, and it’s unfair to make people believe they can. Despite their best intentions, these types of authors border on disingenuous when they build the reader up and then leave him or her to deal with the fallout once the accumulation of life circumstances renders the wall too high to climb.
We all have baggage. We have responsibilities. We have handicaps. And we don’t live in a bubble – we are accountable to the people around us and we feel the need to respect those contracts of honour. We sacrifice parts of our selves, and try to find a balance between give and take, all of which takes a toll and puts a serious spoke in our wheels at times. Maybe that’s why the message of “you can have it all” – one that basically gives us the green light to act selfishly - is so appealing.
But when all is said and done, I’d much rather read a book that encourages you to give all that you are willing to give within reason, as defined by you, and to do it with reckless joy despite the full awareness that it may not pan out – something that is in no way a reflection of your worth as a human being and certainly not a failure on your part. This frees you from any self-imposed expectations while saving you the regret of not having tried; you will have learned some new lessons, strengthened some mental and emotional muscles and made a ton of discoveries along the way. Not a loss by a long shot.
I’d much prefer a book that encourages you to take a chance, try something different, test the waters of the unfamiliar and see where it goes. It may lead you in a whole new direction, or it may bring you back to your starting point, with a renewed sense of appreciation for what you have. It may work out, or it may not, but that’s the thing – either result is fine. It’s not the obtaining it at all costs that is the goal here, it’s the trusting yourself to try, to be able to handle the outcome, whatever it may be, to be allowed to redefine your priorities should you change your mind, and the faith that you – YOU – know better than anyone who you are, what you want, what you need and how you define success.
You may not have all the answers - not right away, and maybe not for a while. You may not have the perfect plan, and you may not be operating under the most ideal conditions. Life is messy. But the important thing is to start with what you know with absolute certainty, and to do something today toward that goal. And then do it again tomorrow. And the day after that. We may have been misled into thinking that we are one big decision away from an entirely different life – which, really, seems a little daunting - when instead we can break it down to a succession of mini-decisions, none of which need to be perfect or written in stone, and you build it from there.
And in the end, who knows - you may just turn out to be a Badass after all.
To say that many issues have become polarized in today’s culture is an understatement, and we owe this, at least in part, to the deepening wedge between the left and the right, our tribal sense of belonging, and a social media culture that constantly feeds the beast. It seems to require very little to stoke the flames and turn a seemingly harmless discussion into a full-fledged slugfest, and anybody with a horse in the race – or, frankly, just an opinion – is suddenly an expert, no credentials required, or benefit of the doubt allowed. Often, the conversation, drifting ever further from the original subject, spirals out of control in no time at all and takes on a life of its own.
But nowhere is this truer than when the subject matter is vaccines.
Follow any thread on either side of the debate, and it begins derailing about three lines in. The vaccine supporters accuse the anti-vaxxers of being uninformed, dangerous quacks who get their medical information from questionable sources and spread lies without an iota of conscience, while those who question vaccines label all those who support them as sheeple who have blindly accepted the word of Pharma the Almighty and are being duped about the safety of the ingredients injected into their babies and children.
Both sides are so busy shouting each other down and calling each other names – and we all know how conducive this is to a rational discussion - that nobody has really taken the time to establish any common ground as to what specific issues, exactly, are at play. Besides, do those who support vaccines really agree that all vaccines are safe, that all vaccines are necessary? That Big Pharma is beyond reproach, always looking out for the people’s best interest rather than its own pocketbook? That there’s no reason to be cautious? I wouldn’t know, because that conversation has never taken place, but I doubt it. Conversely, do anti-vaxxers say that every vaccine should be rejected? That there’s no wiggle room and that maybe, in some cases, it makes sense to have your child inoculated? Again, we don’t know because we have never made it this far into the conversation.
So let’s begin with the presumption of innocence, and that the industry really does care about having a healthy population and protecting its weakest from any unnecessary and avoidable hazards. There will likely never be a shortage of diseases to treat, and it’s safe to assume that pharma will not go bankrupt due to lack of patients. And while the oft-recurring meme “The pharmaceutical industry does not create cures, it creates clients” is a popular soundbite, it reeks of cynicism and misplaced scorn as we have all been, at one point or another, the grateful recipients of medication that has helped us heal and get back on our feet. It’s just as difficult to imagine that a pediatrician - who has put years of blood, sweat and tears into his or her studies, passing the most gruelling exams, from the first admission test to finally being licensed to practice medicine 10 years later, driven above all by a deep concern for the health of infants - would wilfully ignore any signs pointing to dangers in vaccines or, worse yet, purposely agree to harm children.
Let us also safely assume that parents on both sides of the debate want only the best for their offspring, and that their intent is to protect them by any means possible. Regardless of the actual details of the issues at stake, nobody can accuse the other of acting in bad faith or against their child’s best interests. We have to, at the very least, be able to agree on this, and accept that all parents involved are acting out of concern and love.
Which leaves us with the actual issues.
Diseases eradicated by vaccines
The most common argument in favour of vaccines seems to be that their introduction led to the eradication of certain diseases, such as polio and measles.
The polio vaccine was introduced in 1955, by which time the disease had pretty much run its course. The argument is made that sanitation, clean water systems and plumbing had had a lot to do with this, and that the vaccine arrived at a time when polio had become almost irrelevant as a real threat. The same can be said about pertussis (whooping cough) and diphtheria.
By comparison, diseases such as scarlet fever and typhoid fever saw a parallel decline at the very same time (around 1950) despite the complete absence of any vaccine to counter them, so it doesn’t seem far-fetched to question to what degree, if any, vaccines were responsible for reining in some of those diseases.
The liability of vaccine companies
In 1986, Congress passed the National Childhood Vaccine Injury Act (NCVIA), allowing a party alleging a vaccine-related injury to file a petition for compensation in the Court of Federal Claims. Thus the Vaccine Injury Compensation Program was created, taking the burden of responsibility – and compensation – off the pharmaceutical companies, with taxpayers left holding the bag. 42 U.S. Code § 300aa–22 states: “(1) No vaccine manufacturer shall be liable in a civil action for damages arising from a vaccine-related injury or death associated with the administration of a vaccine after October 1, 1988, if the injury or death resulted from side effects that were unavoidable even though the vaccine was properly prepared and was accompanied by proper directions and warnings.”
Vaccine companies, understandably, did not want to have to deal with legal matters, and one might argue that it was critical to allow them to continue their operations while a separate entity dealt with the fallout. That said, there is no denying that this court was set up, in the words of the US Code itself, specifically to deal with vaccine injuries and deaths, so there is no argument about whether vaccine injuries occur, and $4 billion – billion, with a “B” – in payouts since then confirms that these occurrences are neither random nor exceptional.
The World Health Organization (WHO) itself lists the potential hazards of each vaccine, such as the MMR, so – just to be clear – the dangers may be downplayed or flat-out denied by doctors and the general population, but they are very much on the radar of the companies producing vaccines.
The numbers game
Given that we have to admit that people do suffer injuries and even death as a result of vaccines, the question becomes, crudely: what’s the cost-benefit ratio? Depending on the vaccine, there are only single-digit numbers of deaths per million doses given (officially, anyway). Do vaccine defenders feel that this is just a case of applied math, and that such low numbers of deaths are negligible and thus justifiable? That one death is preferable to potentially many infected children? Do anti-vaxxers state the opposite? And how do we even know what the number of infections would be? We have no way of assessing the probabilities, or the level of impact.
Also worth mentioning is that the vaccine schedule has changed dramatically over the years. Between 1983 and 2017, the number of doses a child receives by age 6 more than doubled, and the number received by age 18 nearly tripled.
Compare that to 5 doses in 1962, and we are not in Kansas anymore.
Are all of today’s vaccines critical and necessary? If supporters agree that they are not necessarily so, does it make sense to take them all anyway, “just in case”? Besides, does anyone really believe that we have reached the end of this chapter? New vaccines are constantly being worked on, including one for the common cold. Is there a line that even vaccine supporters would not cross, a dose that would be considered one too many? Should people be allowed to decide for themselves which vaccines are relevant, and which are not worth the risk? Is it really necessary to inject your newborn, on his first day on earth, with a shot against hepatitis B – a disease you only contract through needles and sex, and whose vaccine wears off by the time your child is old enough to be exposed to needles and sex? What about chicken pox and measles? Were they not, just a short while ago, simply uncomfortable periods children suffered through as a rite of passage? Not to mention that contracting the disease usually affords you lifelong immunity from ever catching it again, compared to the limited shelf life of a vaccine’s protection.
Which brings us to the concept of herd immunity. Considering that most adults today have not had a vaccine or booster shot since turning 18, or 14, or 12, and that the effectiveness of vaccines wears off after a certain number of years, exactly how much herd immunity do we really have?
A lot has been said about the dangers posed by vaccine ingredients. The vaccine companies are quite forthcoming about what they include: aluminum, formaldehyde, thimerosal (mercury, still in flu vaccines), polysorbate 80 and aborted fetal cells, to name a few of the head-scratchers. The industry’s argument is that the amounts are so minimal that it is ridiculous to worry, and that the form in which they are injected differs vastly from the raw chemical element we would find in a lab. But for those of us who are not pharmacists, chemists or scientists in general, it is difficult to understand how, exactly, this is different. And if it is normally dangerous to ingest any of these orally, how can it carry no risk whatsoever to inject them directly into the bloodstream of a tiny human being? If we are advised to control our intake of fish due to mercury, even though our body eventually rids itself of the food, how is it perfectly alright to inject aluminum, which the body doesn’t shed and which usually travels to the brain and settles there? No amount of research is going to help us see clearly on this one. We are being asked to blindly trust that all is in order and not to ask any questions.
Except all is clearly not in order, because otherwise there would be no Vaccine Injury Compensation Program. So if there are risks, why are parents not told about them? When we buy food, we not only have access to the list of ingredients – food companies are required by law to provide it to the consumer. When we pick up our prescribed medication at the pharmacy, we are handed an information sheet listing all the possible side effects of said medication, all the precautions we need to take, all the warning signs we need to look out for.
How are parents not afforded the same consideration with vaccines – allowed to weigh the pros and cons themselves and decide whether they would like to proceed?
Without additional contributing factors, the parallel rise in vaccine doses and in the rate of autism spectrum disorder is insufficient to establish a connection. Likewise, the US’s abysmal child mortality rate, which hovers around 6 deaths per thousand, placing it 28th worldwide, is likely due mostly to the absence of accessible universal health care.
But that doesn’t disprove, or rule out, vaccines as contributing factors to both autism and child mortality, especially considering that a connection has in fact been established – and not just in Dr. Andrew Wakefield’s famously retracted paper, which vaccine supporters hold up as vindication, claiming it was the only study to link vaccines and autism. In reality, there have been numerous studies showing such a link, and to ignore them seems reckless. Examples, which can be found on the NCBI (National Center for Biotechnology Information) government web site, are given at the end of this article.
And finally, there are the parents. Those who know their children better than anyone. Those who watch their progress every day, who can tell you exactly what their child is capable of doing at every stage of its young life. And when a parent sees their child turn from an active, curious, agile little tot to a lethargic, non-reactive lump who can no longer utter words he could say the day before, or walk like he did all week, and that this all happened within 48 hours of a vaccine, then it becomes extremely difficult to chalk it up to sheer coincidence.
A badly needed dialogue
We must get back to the basics of rational discussion and address the issues while steering clear of ad hominem attacks – the modern-day equivalent of “kill the messenger”. If we are going to make any headway, vaccine supporters, and especially those involved in the production, promotion and dispensing of vaccines, need to address the vaccine skeptics’ legitimate points and concerns because, as we have seen, there is no shortage of those. This is not about being right, or feeling morally superior, or about making a point. It’s about having all the information – on both sides – and evaluating the situation objectively, so that we can make informed decisions on behalf of those who matter more than anyone else in the world and who have no voice, and no say, in the matter: our children.
Even with the best of intentions, it may not be enough.
But it would at least be a start.
Annals of Epidemiology, Hepatitis B vaccination of male neonates and autism diagnosis, NHIS 1997–2002. https://www.ncbi.nlm.nih.gov/pubmed/21058170
Toxicology and Applied Pharmacology, Porphyrinuria in childhood autistic disorder: Implications for environmental toxicity. https://www.ncbi.nlm.nih.gov/pubmed/16782144
Journal of Child Neurology, Developmental Regression and Mitochondrial Dysfunction in a Child With Autism. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2536523/
The Neuroscientist, Large Brains in Autism: The Challenge of Pervasive Abnormality. https://www.ncbi.nlm.nih.gov/pubmed/16151044
Journal of Pediatric Neurosciences, Pediatric Autoimmune Encephalitis. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5588635/
Developmental Medicine & Child Neurology, Anti-N-methyl-D-aspartate (NMDA) receptor encephalitis: an unusual cause of autistic regression in a toddler. https://www.ncbi.nlm.nih.gov/pubmed/24092894
Journal of Inorganic Biochemistry, Do aluminum vaccine adjuvants contribute to the rising prevalence of autism? https://www.ncbi.nlm.nih.gov/pubmed/22099159
Surgical Neurology International, Immunoexcitotoxicity as the central mechanism of etiopathology and treatment of autism spectrum disorders: A possible role of fluoride and aluminum. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5909100/
Metabolic Brain Disease, The putative role of environmental aluminium in the development of chronic neuropathology in adults and children. How strong is the evidence and what could be the mechanisms involved? https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5596046/
Translational Psychiatry, Atopic diseases and inflammation of the brain in the pathogenesis of autism spectrum disorders. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4931610/
Journal of Alzheimer’s Disease & Parkinsonism, Natural and Synthetic Neurotoxins in Our Environment: From Alzheimer’s Disease (AD) to Autism Spectrum Disorder (ASD). https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5059837/
Journal of Immunotoxicology, Theoretical aspects of autism: causes – a review. https://www.ncbi.nlm.nih.gov/pubmed/21299355
Sandra is a blogger, life coach and activist.