The Insanity Of Modern Television & Entertainment. What Happened?

Watching my son James grow up has been a blessing in so many ways, one of which has been the opportunity to learn more about myself. Watching him learn how to learn has given me so many insights into my own childhood mentality and how those early experiences continue to affect me today.

Of all the things I have learned in my short time as a father, one sticks out like a sore thumb: the remarkable influence television has on a developing mind. Television is not something I had wanted to introduce into my son’s life until much later on, but I’m almost glad I did, because of what I’ve learned from watching him watch the tube. Of course, he isn’t sitting in front of it for very long, maybe an hour every morning to get through breakfast, but even in that time I’ve observed some things that I think are important to consider.

Live Television

My journey with television also started at a young age. Mind you, I had three television channels while growing up, one of which could only be changed by someone climbing onto the roof of the house and adjusting the direction of the antenna. Even though we only had three channels, I can tell you with certainty that I watched a lot of television. And my tastes were wide-ranging: from Oprah to Star Trek (the original and The Next Generation) to the great soap opera Another World, to Spider-Man, Batman, Teenage Mutant Ninja Turtles, TaleSpin… I grew up in a great time for great television.

The Simpsons was a hit in our household, at least for my father, my sister and me. Looking back with the clarity and understanding that comes with age, I realize now that I only ever watched The Simpsons as a way to connect with my dad and my sister; I didn’t particularly like the show in and of itself. I was my mother’s son while my sister was daddy’s girl, and I often found it difficult to connect with the two of them. Television acted as a neutral medium through which I could bond with them, though I only figured this out later in life, when I began to live and breathe television.

As a self-reflective adult, I know I’ve undertaken a long journey of transcendence, transforming my subordination to authorities into independence and an entrepreneurial spirit. I chose to break the old paradigms of my life and find out what I could really accomplish. For a kid who grew up on TV, a career in television, one that started with me literally living inside a television studio, made perfect sense, particularly if that kid hoped to grow beyond the magic of the screen. I had to understand how what I had watched had been made if I was ever going to get it out of my head.

Bizarre Moments

I remember, on the first day of my first real job in television, at Business News Network, meeting the audio guy who had worked on the original Sesame Street, a show I had watched often. What a bizarre moment that was! Here I am, meeting this slouching, overweight, middle-aged man, an occasional cigarette smoker and a regular complainer about all things audio and personal… and he is basically a parent of mine, however indirectly. I would liken the experience to meeting your childhood superhero in person only to find out that they are a complete asshole. Left in a state of shock, you immediately start to question your entire psychological framework, because the hero you have admired your entire life isn’t really the person you thought they were.

The fastest way to transcend your hero is to find out that they are a villain as well.

Unfortunately, that’s been a common theme throughout my television career, a career filled more often than not with suppressed entrepreneurs and dependent workers, not to mention job-hating borderline psychopaths whose idea of fun consisted of sending letters to serial killers in prison (true story). I have spent a lot of time wondering why a person would intentionally put themselves into a work environment where they get paid to do what they despise and then complain about it for free. It’s completely bizarre. Then one day my sister gave me a gold mine of wisdom that put it all together.

She sent me an old audio file taken from one of my grandfather’s cassette tapes, the ones he had made for us, narrating some of the stories he wrote about his life. I guess at that time in my youth I thought it would be funny to hijack the start of one tape by recording the phrase “You’re a dork. You’re a dweeb! You suck!” I wanted to shock anyone who might listen to the tape in the future, and did so in a tone of voice that could only be described as Bart Simpson-esque. Hearing that blast from my past stunned me. My plan for shock and awe had worked, although I never thought the person being surprised would be me.

Modeling Your Masters

Well, my sister and I had a well-deserved laugh over that. I must admit it’s a really funny audio clip. But it has stuck with me ever since, because it’s not something the person I am today would consider doing; not out of regret, but because, given the choice, I simply would choose not to. I’m a different person now, much more mature in some ways (and much more immature in others). But on that sound file I heard Bart Simpson, and it confirmed a theory I have been confronted with time and time again while working inside the television industry: that it’s called television programming for a reason.

At our most basic level, we are simply slightly intelligent monkeys, and as monkeys, well… monkey see, monkey do. I know that’s tremendously judgmental, but the point stands: we model the behaviors we identify with and are exposed to, and that includes fictional characters. A good example is a friend of mine, who we’ll call Andy. Andy believes he is scattered in many directions and often asks others for their opinion of his behavior. I often get the sense that he knows something is not quite right and wants to help himself, but just doesn’t know what’s wrong or how to fix it. I didn’t know what his problem was either, until another friend showed me a remixed episode of The Fresh Prince of Bel-Air, set to a music track. Watching that remix, a compilation of short clips of all the characters, completed so much of my own puzzle about Andy. I recognized that Andy’s personality is built from the characters on that show: everything from how he talks, walks, moves, sits, behaves and thinks. I confirmed with Andy that he had loved watching that show as a child, and that again confirmed what I’d suspected for some time.

When your parenting is fictional, you’re going to live a fictional life.

The biggest challenge most people face is that they are not living the life they have. They are profoundly disconnected from their own existence and constantly distract themselves with sensory stimulants to avoid admitting the pain of their lifestyle. This in turn blocks them from the opportunity to accept the present as it truly is, and to see through it to the truth of their own magnificence.

Think about this for a moment: if you love your television characters, you’re not going to want to go beyond them, for fear of losing the feel-good chemicals your brain releases while watching them. That in turn creates a fear of your own self, which is, by default, an entity without these fictional personalities you have cherished. It’s exactly like an addiction, where you fear the loss of the substance while simultaneously fearing the version of you that exists without it.

While on that topic…

Children’s shows nowadays look more like LSD trips, with their flying smiling cars and buses and talking sponges… and that’s actually the point. If I have learned one thing from investigating psychedelic medicines, it’s that what you learn (or watch) while in an ‘altered state’ will stay with you in an unaltered state. I learned this professionally from hypnotherapist and seduction expert Ross Jeffries, a master of emotional and perceptual manipulation. Today’s television is literally a non-substance-based psychedelic drug replacement, with a narrative that fits the government guidelines for children’s television, and from my decade working behind the scenes in television I can tell you with certainty that television has a political motive.

I’m not approaching this from a conspiracy theorist’s lens. I have literally been in the meetings where politics decide programming, and while you may not believe it, I don’t think it takes much to recognize what the narrative (or propaganda, if we’re being honest) of children’s television is preparing them for: unquestioning participation in a system which doesn’t work all that well or serve their best interests. This new form of art cuts itself off from independent and inspired thought. It’s up to the visionaries to create the future, and our visionaries are being lost watching the real desperate housewives of fill-in-the-blank, being programmed to think fake, act stupid, and complain a lot about personal drama.

Where has Star Trek gone?

And so I bring this story back to my son James. I grew up on Star Trek. As far as I’m concerned, Gene Roddenberry was an inspired visionary who saw beyond his time. Today we have laptops, cell phones, 3D printers, and a host of other world-changing inventions that can be attributed to his creativity. I don’t see that in television anymore. What I see is desperate housewives and other cocaine-addicted and mentally disturbed celebrities getting famous off the time you spend watching their lives instead of working on your own. What I see is murder being made into entertainment. What I see is rich duck-hunters taking your time and your money without giving you any return on the investment besides a temporary emotional high. What I see now are shows about time-warps, demons and angels, vampires, the power of cops and the government, and an extraordinary display of how unintelligent humans can be. What I see is insanity entertainment, and I believe it’s affecting all of us.

I believe that when Gene Roddenberry died, television died with him. It stopped being a vision, a program of what we can do as a species, and became a program of what we already dislike about our civilization. Put simply, if you keep showing a person the murderous side of our nature as entertainment, why would you expect them to value human life, or their own life, at all? Whatever we repeat, we become good at, and if we keep repeating the wrong actions, we shouldn’t expect to last very long in a world that is constantly striving to challenge us.

James gets his TV fix once in a while, but even on those few occasions I’ve seen a noticeable difference in how little he develops the next day compared to the days when he doesn’t watch television. I’ve seen it in other children as well, and I’ve even seen it in parents who are struggling to raise intelligent kids. Every day, the same shows, the same songs, the same scripts said in a slightly different way. Eventually you need those songs on your phone to keep your kid from throwing a tantrum in a waiting room, just like an addict. Metaphorically, that’s equal to saying, “Watch your show and be quiet while I control your life with a super-stimulating medium made by people who are going to produce whatever they have to in order to survive, not necessarily what you need to see to survive.”

Repetition wires your brain, and that can lead to great things, so long as what is repeated is great. That’s the insanity of entertainment and television as it is today: it’s less about being great and more about turning a profit for the production team. Meanwhile, you look up to your Kim Kardashian and want to be like her and meet her and love her, and when you finally get the chance you get tackled by her security guard. It’s insanity. I mean, just now I looked on Google News for a headline about the Kardashians, and the first headline is “Kardashians Reportedly Hate Fat Brother for Being Fat.”

Honestly, are you getting paid to care about that?

I apologize for this being one of the longest and by far the most ‘charged’ blogs I’ve written, but I feel it must be said. Your time is your own. It’s your life. Use it wisely for your wealth and your health, and if you want to help that along, turn off the TV and turn on your life.

——————————————

Stephan Gardner is a Life Performance, Personal Development & Psychology Specialist who helps people achieve mental well-being through a luminary understanding of human behaviour, emotions and life transformation. A teacher of personal and spiritual development and a dedicated Yoga practitioner, his mission is to inspire you to reach life fulfillment through inspired work, wisdom, and love. www.stephangardner.com


The Anatomy of Conspiracy Theories

Whether you believe in conspiracy theories or not, we can all agree that the use of the term has exploded in media and in conversation. The question is, why? Are we now using the term “Conspiracy Theory” more indiscriminately and on more platforms than before? Are we, as a society, simply becoming unhinged and absurd? Are seemingly nonsensical stories, for some unknown reason, starting to resonate with people? Or are some conventional narratives being challenged because some of these “alternative” explanations are in fact accurate, despite the fact that conventional sources refuse to acknowledge them as even potentially valid? Notice that the last two possibilities are different sides of the same coin. If you think “conspiracy theorists” are unhinged, it is highly likely that they are suspicious of your sanity as well. Both sides insist that they are right and that the other has been hoodwinked. Note that if you choose not to pick a side, you are, by default, allowing the conventional narrative to perpetuate. That is how convention works.

Merriam-Webster defines the term conspiracy theory as “a theory that explains an event or situation as the result of a secret plan by usually powerful people or groups”. The key elements of this definition remain consistent across all authoritative lexicons: the group responsible for an event must be powerful and covert. However, if we refer to the Wikipedia definition as of 11/2018, a new element emerges: “A conspiracy theory is an explanation of an event or situation that invokes a conspiracy—generally one involving an illegal or harmful act supposedly carried out by government or other powerful actors—without credible evidence.”

When an explanation is labeled a “Conspiracy Theory,” by today’s definition, it has no evidence to support it. An explanation with no supporting evidence is a hypothesis, not a “theory.” “Conspiracy Theory,” as the term is used today, is thus an oxymoron. These “Conspiracy Theories” we seem to hear about every day should really be called “Conspiracy Hypotheses.” More concerning is that the “Conspiracy Theory” label marks an explanation as inherently baseless. Given this linguistic construct, where is there room for a conspiracy that is in fact true?

There is also something troubling about using the term “credible” in the definition of conspiracy theory. Legally, evidence that is credible is that which a reasonable person would consider to be true in light of the surrounding circumstances. If evidence suggests an explanation that seems at the surface to be unreasonable, how does a reasonable person avoid automatically labeling the evidence not credible? If we are not careful, the credibility of the explanation and resultant conclusions would then determine the credibility of the evidence that supports it. Is this really so important? Perhaps you are quick to see that with this approach, our understanding of what is true and real can never evolve. If any evidence arose that radically disproved our understanding or eroded our faith in trusted institutions we would automatically discard it as “not credible” and remain entrenched in our accepted paradigm. “Credible” evidence cannot be a necessary requirement of a theory that challenges what is credible to begin with.

To better illustrate this, let us consider an old but very real “conspiracy theory.” About 400 years ago, European civilization was emerging from centuries of scientific and philosophical stagnation known as the dark ages. What more befitting a place for such a renaissance to occur than the center of the universe? You see, the idea that the Earth was one of eight planets revolving around a star orbiting the center of one of hundreds of billions of galaxies would have been absurd in the Europe of the early seventeenth century. Any sane person could see that the Sun and the Moon and every celestial body rise in the East and set in the West. At that time, if someone went about proposing the idea that everything rises and sets because the Earth is spinning, they would have been laughed out of the tavern. Would that person be a conspiracy theorist? They are not proposing that “powerful actors are carrying out a harmful act”; they are merely suggesting an alternative explanation for what is observed. However, the implication of their suggestion seems to incriminate the authority on such matters as either ignorant of the truth or, possibly, the perpetrator of a lie. The possibility of a conspiracy has now been introduced.

Now, let us say that this person claims to have proof of their absurd theory. Would you have taken the time to examine the evidence or would you have been more likely to dismiss them without further consideration? The very idea that they could be right would have been not just silly or heretical, but inconceivable to many, if not all. How could the evidence be credible if it implied something inconceivable? Dismissing their idea would have seemingly been the most logical and, therefore, the smartest thing to do.

When Galileo Galilei appeared in 1610 armed with a rudimentary “telescope,” few would peer into it. He claimed that the refractive properties of the pair of “lenses” would allow you to see things at great distances very clearly. With it one could see Jupiter and its moons revolving around the giant planet just as our moon revolves around Earth. How enchanting! The difficulty would arise when you put the telescope down: your feet would no longer be planted on the previously immovable center of creation. Would you have looked into his telescope? What would have been the harm in taking a peek? Certainly the fear of being proven more gullible than most would have been on your mind. What about the fear that he might be right?

Imagine what must have been going through Galileo’s mind after his monumental discovery. He saw irrefutably that the entire model of the universe had been completely misconceived. One just had to look. Most did not. I can only imagine how hard he must have tried to convince anyone to simply stop, look and listen to what he had discovered. At the time, Galileo was the Chair of Mathematics at the University of Padua and had previously held the same post at the University of Pisa. Despite his bona fides and reputation as a solid contributor to the Italian renaissance, his discovery would likely have died in obscurity if it weren’t for the support of an influential family, the Medicis, who offered Galileo a platform from which he could spread his theory. It was only through allying himself with political power that he was able to slowly generate interest in his heliocentric model of the solar system. His proposition eventually caught the attention of the Catholic church, which initially warned him to desist. Eventually, he was brought to trial before the Roman Inquisition, 23 years after his discovery. At the age of 70, the intrepid mathematician and astronomer was forced to recant his story, and he spent the rest of his years under house arrest.

Did his evidence win people over? It did not. Galileo died under house arrest while Europe continued to slumber under stars that moved around it. By today’s standards, Galileo would have been labeled a Conspiracy Theorist from the day he announced his findings until he was proven right, roughly fifty years after his death. When the principle of gravitational attraction eventually became widely accepted as true, the church had to retract its position, because under Newton’s laws the motions of the stars and planets could no longer be explained by an immovable Earth.

On the other hand, Galileo is credited with being the father not only of observational astronomy, but of the scientific method as well. The scientific method demands that one test an explanation without bias towards an outcome. All data is considered before deductions are made. When all other explanations have been proven wrong, the only explanation remaining becomes a theory. The theory persists as long as all subsequent experiments continue to uphold it. This is how we ultimately know what we know and have an inkling of what we don’t. If I had to choose a posthumous title for myself, “The Father of the Scientific Method” is one I could die with. Galileo earned this honorific not only because he paid for it with his freedom, but because he had the discipline to regard evidence objectively despite how unimaginable its implications were. This is how a body of knowledge expands: by considering the validity of the evidence first, we can accept what was previously unimaginable; otherwise what we know tomorrow will be no different from what we know today.

Not all conspiracy theorists are Galileos. Neither are all conspiracy theories true. However, can we be certain that all of them are false? At their very core, all conspiracy theories directly or indirectly point at a central authority acting covertly, and simultaneously at the media for either missing it or looking the other way. This, of course, is unimaginable, as we all know the government can make mistakes but would never do anything intentionally harmful to its citizens and then hide it. Even if it did, somebody would come forward and the media would let us know about it. This is why such a deception could never occur. The idea that your lover could be in bed with your best friend is inconceivable. Evidence of such a thing would not be credible. Dismissing all conspiracy theories seems logical and, therefore, seems like the smartest thing to do.

In “Sapiens”, Yuval Harari proposes an explanation for why our species, Sapiens, out-fought, out-thought and out-survived all other Homo species on the planet. He suggests that it was our unique ability to describe and communicate situations and events that had no basis in reality which set us apart. In other words, we could tell stories and they could not. By uniting under a common idea, story or even myth, thousands (and now thousands of millions) of Sapiens could come together with a shared purpose, identity or belief system to displace our cousins, who as individuals were sturdier and just as cunning, but not nearly as good at cooperating as we were. This advantage, Harari proposes, not only led our species to eventual supremacy over all others, but also allowed us to form communities, governments and global alliances.

Siding with the majority has served us well, until it hasn’t. One only needs to revisit the history of Galileo and basic astronomy to understand this. In actuality, the first observant minds woke up to the fact that the Earth went around the Sun, and not the other way round, nineteen centuries before Galileo did. The Greek mathematician Aristarchus is thought to have been the first Westerner to place the Sun at the middle of a “solar system,” around 270 BC. A human being traveled to the moon just 360 years after Galileo “discovered” what Aristarchus had shown nearly two millennia before. How many centuries was this journey delayed because an alternative explanation in ancient Greece became a “conspiracy theory” against authority and convention?

This poses an intriguing question. Is there something hardwired in our behavioral patterns that pushes us towards conformist narratives and away from alternative ones at a precognitive level? Does the same tendency that gave rise to our enhanced ability to unite also keep us in “group-think” more than we should be? How do we know we are looking at the world objectively and rejecting alternative belief systems on a purely rational basis? How does one know whether one is biased or not?

One way is to apply the scientific method. The scientific method demands that every possibility, no matter how outlandish, be tested for its veracity and dismissed only when it can be proven wrong. Without this objective pursuit of truth, misconceptions can persist indefinitely, just as the geocentric model of the universe did. Interestingly, Aristarchus was allowed to retain his theory because he lived at a time and place where philosophers, mathematicians and scientists were revered, protected and free to pursue their notions. The freedom ancient Greek society afforded its scientists endured for only a few centuries after Aristarchus lived. In Galileo’s day, the Roman Catholic church had been presiding over such things as facts for well over a thousand years. His incontrovertible proof was suppressed by the power that had the most to lose.

These days, establishing the facts of the matter may not be as easy as we presume. Conspiracy theorists claim to have proof, just as the debunkers do. How do we know that the proof offered on either side is valid? Who has the time to apply the scientific method? It certainly seems safer to go with the conventional narrative, because surely there are more rational minds in a larger group. Though it seems a reasonable approach, it may in fact be where we misstep. By deferring to others, we assume the majority will arrive at the truth eventually. The problem is that those in the majority who are trained to examine evidence objectively often must take a potentially career-ending risk to even investigate an alternative explanation. Why would an organization be willing to invest the resources to redirect its scientific staff to chase down and evaluate evidence that will likely endanger its reputation with the public, without any upside? Thus, conventional narratives survive for another day, or, in the case of an Earth-centered universe, for a couple of thousand years.

Whether or not you are a “conspiracy theorist,” we can all agree that there is a possibility, however slight, that some conventional narratives could be wrong. How would we know? Is there a source that we can trust 100%? Must we rely on our own wits? A short inquiry into this question can be disquieting. Most of us must admit that our understanding of history, science and geopolitics is merely a set of stories we have been told by people, institutions or media that we trust explicitly or implicitly. Because most of us are not authorities on anything, it would be impossible for us to overturn any conventional narrative with an evidentiary argument. Challenging these paradigms is necessarily left to others. Generally speaking, there is no real reason to argue with convention if everything is seemingly unfolding acceptably. But what if you wanted to know for yourself? Is there any way to ever really know the truth without having to have faith in someone or something else?

There may not be. However, it is also naive to believe that if someone, scientist or not, were in possession of evidence that challenged our deepest held beliefs, it would take root in the ethos on its own. Galileo enjoyed unsurpassed credibility as one of Italy’s foremost mathematicians. He also possessed irrefutable, verifiable and reproducible evidence for his revolutionary theory, yet the convention he was challenging did not crumble in the face of his discoveries. History has shown us that it makes no difference how valid a point is; truth emerges only when someone is listening.

So, rather than seeking to independently validate or refute what we are being told, it becomes more productive to ask a different question: How biased is our society by historical standards? How does our society regard alternative theories? Do we let them co-exist with convention as the ancient Greeks did? Do we collectively invest resources to investigate them openly? Or do we dismiss, attack and vilify them as was done in the papal states in Galileo’s time? Which kind of society is more likely to get it right? Which runs the greater risk of being hoodwinked in the long run? Which is more free?


US House of Representatives Investigating if the Government Created Lyme Disease As A Bioweapon

In Brief

  • The Facts: A New Jersey lawmaker suggests the government turned ticks and insects into bioweapons to spread disease, and possibly released them. He is not the only one who believes so.

  • Reflect On: This is not the only example of supposed human experimentation on mass populations by the government.

There are a number of subjects that were once considered ‘conspiracy theories’ which are now no longer in that realm. ‘Conspiracy theories,’ in my opinion, usually arise from credible evidence. The implications, however, are so grand and so mind-altering that many may experience some sort of cognitive dissonance as a result. One of the topics often deemed a ‘conspiracy theory’ is weaponized disease, and the latest example comes from an approved amendment that was proposed by a Republican congressman from New Jersey. His name is Chris Smith, and he instructed the Department of Defense’s Inspector General to conduct a review on whether or not the US “experimented with ticks and insects regarding use as a biological weapon between the years of 1950 and 1975” and “whether any ticks or insects used in such experiment were released outside of any laboratory by accident or experiment design.”

The fact that Smith brought this up shows that any intelligent person who actually looks into this has reason to believe it’s a possibility, yet mainstream media outlets are ridiculing the idea, calling it a conspiracy theory instead of actually addressing the points that caused Smith to demand the review.

The fact that the amendment was approved by a vote in the House speaks volumes. Smith said that the amendment was inspired by “a number of books and articles suggesting that significant research had been done at US government facilities including Fort Detrick, Maryland, and Plum Island, New York, to turn ticks and insects into bioweapons”.

Most people don’t know that the US government has experimented on its own citizens a number of times. All of this is justified for “national security” purposes. National security has always been a term used as an excuse to prolong secrecy, justify the government’s lack of transparency, and create black budget programs that have absolutely no oversight from Congress.

For example, on September 20, 1950, a US Navy ship just off the coast of San Francisco used a giant hose to spray a cloud of microbes into the air and into the city’s famous fog. The military was apparently testing how a biological weapon attack would affect the 800,000 residents of the city. The people of San Francisco had absolutely no idea. The Navy continued the tests for seven days, and multiple people died as a result. It was apparently one of the first large-scale biological weapon trials that would be conducted under a “germ warfare testing program” that went on for 20 years, from 1949 to 1969. The goal “was to deter [the use of biological weapons] against the United States and its allies and to retaliate if deterrence failed,” the government later explained. Then again, that’s if you trust the explanation coming from the government.

This could fall under the category of human subject research. It’s still happening! A dozen classified programs that involved research on human subjects were underway last year at the Department of Energy. Human subject research refers broadly to the collection of scientific data from human subjects. This could involve performing physical procedures on the subjects or simply conducting interviews and having other forms of interaction with them. It could even involve procedures performed on entire populations, apparently without their consent.

Human subjects research erupted into national controversy 25 years ago with reporting by Eileen Welsome of the Albuquerque Tribune on human radiation experiments that had been conducted by the Atomic Energy Commission, many of which were performed without the consent of the subjects. A presidential advisory committee was convened to document the record and to recommend appropriate policy responses.

When it comes to Lyme disease, the Guardian points out that:

A new book published in May by a Stanford University science writer and former Lyme sufferer, Kris Newby, has raised questions about the origins of the disease, which affects 400,000 Americans each year.

The book, Bitten: The Secret History of Lyme Disease and Biological Weapons, cites the Swiss-born discoverer of the Lyme pathogen, Willy Burgdorfer, as saying that the Lyme epidemic was a military experiment that had gone wrong.

Burgdorfer, who died in 2014, worked as a bioweapons researcher for the US military and said he was tasked with breeding fleas, ticks, mosquitoes and other blood-sucking insects, and infecting them with pathogens that cause human diseases.

According to the book, there were programs to drop “weaponised” ticks and other bugs from the air, and that uninfected bugs were released in residential areas in the US to trace how they spread. It suggests that such a scheme could have gone awry and led to the eruption of Lyme disease in the US in the 1960s.

This is concerning. It’s a story that, for some reason, instantly reminded me of the MK-Ultra program, in which human subjects were used for mind control research.

If things like this occurred in the past, it’s hard to understand why someone would deem the possibility of this happening again a ‘conspiracy theory.’ What makes one think this wouldn’t be happening again, especially given the fact that there is sufficient evidence suggesting it is?

Lyme disease is also very strange. If you did get it, you probably wouldn’t know immediately, unless you’re one of the chronic sufferers who have had to visit over 30 doctors to get a proper diagnosis. Lyme disease tests are highly inaccurate, often inconclusive or returning false negatives.

Why? Because this clever bacterium has found a way to dumb down the immune system and white blood cells so that it’s not detectable until treatment is initiated. To diagnose Lyme disease properly you must see a “Lyme Literate MD” (LLMD). However, more and more doctors are turning their backs on patients out of sheer fear of losing their practices. Insurance companies and the CDC will do whatever it takes to stop chronic Lyme disease from being diagnosed, treated, or widely recognized as an increasingly common issue.

You can read more about that here.

The Takeaway

It’s becoming more apparent that our government and our federal health regulatory agencies are extremely corrupt. There are a number of examples throughout history proving this. The fact that something like this doesn’t seem believable to the public is ridiculous, and it further enhances and prolongs the ability of the powerful elite and the government to continue conducting these activities. Awareness is key.


The Medical Journals’ Sell-Out—Getting Paid to Play

[Note: This is Part IX in a series of articles adapted from the second Children’s Health Defense eBook: Conflicts of Interest Undermine Children’s Health. The first eBook, The Sickest Generation: The Facts Behind the Children’s Health Crisis and Why It Needs to End, described how children’s health began to worsen dramatically in the late 1980s following fateful changes in the childhood vaccine schedule.]

The vaccine industry and its government and scientific partners routinely block meaningful science and fabricate misleading studies about vaccines. They could not do so, however, without having enticed medical journals into a mutually beneficial bargain. Pharmaceutical companies supply journals with needed income, and in return, journals play a key role in suppressing studies that raise critical questions about vaccine risks—which would endanger profits.

An exclusive and dependent relationship

Advertising is one of the most obviously beneficial ways that medical journals’ “exclusive and dependent relationship” with the pharmaceutical industry plays out. According to a 2006 analysis in PLOS Medicine, drugs and medical devices are the only products for which medical journals accept advertisements. Studies show that journal advertising generates “the highest return on investment of all promotional strategies employed by pharmaceutical companies.” The pharmaceutical industry puts a particularly “high value on advertising its products in print journals” because journals reach doctors, the “gatekeeper between drug companies and patients.” Almost nine in ten drug advertising dollars are directed at physicians.

In the U.S. in 2012, drug companies spent $24 billion marketing to physicians, with only $3 billion spent on direct-to-consumer advertising. By 2015, however, consumer-targeted advertising had jumped to $5.2 billion, a 60% increase that has reaped bountiful rewards. In 2015, Pfizer’s Prevnar-13 vaccine was the nation’s eighth most heavily advertised drug; after the launch of the intensive advertising campaign, Prevnar “awareness” increased by over 1,500% in eight months, and “44% of targeted consumers were talking to their physicians about getting vaccinated specifically with Prevnar.” Slick ad campaigns have also helped boost uptake of “unpopular” vaccines like Gardasil.

Advertising is such an established part of journals’ modus operandi that high-end journals such as The New England Journal of Medicine (NEJM) boldly invite medical marketers to “make NEJM the cornerstone of their advertising programs,” promising “no greater assurance that your ad will be seen, read, and acted upon.” In addition, medical journals benefit from pharmaceutical companies’ bulk purchases of thousands of journal reprints and industry’s sponsorship of journal subscriptions and journal supplements.

In 2003, an editor at The BMJ wrote about the numerous ways in which drug company advertising can bias medical journals (and the practice of medicine)—all of which still hold true today. For example:

  • Advertising monies enable prestigious journals to get thousands of copies into doctors’ hands for free, which “almost certainly” goes on to affect prescribing.
  • Journals are willing to accept even the most highly misleading advertisements. The FDA has flagged numerous instances of advertising violations, including ads that overstated a drug’s effectiveness or minimized its risks.
  • Journals will guarantee favorable editorial mentions of a product in order to earn a company’s advertising dollars.
  • Journals can earn substantial fees for publishing supplements even when they are written by “paid industry hacks”—and the more favorable the supplement content is to the company that is funding it, the bigger the profit for the journal.

Discussing clinical trials, the BMJ editor added: “Major trials are very good for journals in that doctors around the world want to see them and so are more likely to subscribe to journals that publish them. Such trials also create lots of publicity, and journals like publicity. Finally, companies purchase large numbers of reprints of these trials…and the profit margin to the publisher is huge. These reprints are then used to market the drugs to doctors, and the journal’s name on the reprint is a vital part of that sell.”

Industry-funded bias

According to the Journal of the American Medical Association (JAMA), nearly three-fourths of all funding for clinical trials in the U.S.—presumably including vaccine trials—came from corporate sponsors as of the early 2000s. The pharmaceutical industry’s funding of studies (and investigators) is a factor that helps determine which studies get published, and where. As a Johns Hopkins University researcher has acknowledged, funding can lead to bias—and while the potential exists for governmental or departmental funding to produce bias, “the worst source of bias is industry-funded.”

In 2009, researchers published a systematic review of several hundred influenza vaccine trials. Noting “growing doubts about the validity of the scientific evidence underpinning [influenza vaccine] policy recommendations,” the authors showed that the vaccine-favorable studies were “of significantly lower methodological quality”; however, even these poor-quality studies—when funded by the pharmaceutical industry—got far more attention than equivalent studies not funded by industry. The authors commented:

[Studies] sponsored by industry had greater visibility as they were more likely to be published by high impact factor journals and were likely to be given higher prominence by the international scientific and lay media, despite their apparent equivalent methodological quality and size compared with studies with other funders.

In their discussion, the authors also described how the industry’s vast resources enable lavish and strategic dissemination of favorable results. For example, companies often distribute “expensively bound” abstracts and reprints (translated into various languages) to “decision makers, their advisors, and local researchers,” while also systematically plugging their studies at symposia and conferences.

The World Health Organization’s standards describe reporting of clinical trial results as a “scientific, ethical, and moral responsibility.” However, it appears that as many as half of all clinical trial results go unreported—particularly when their results are negative. A European official involved in drug assessment has described the problem as “widespread,” citing as an example GSK’s suppression of results from four clinical trials for an anti-anxiety drug when those results showed a possible increased risk of suicide in children and adolescents. Experts warn that “unreported studies leave an incomplete and potentially misleading picture of the risks and benefits of treatments.”

Debased and biased results

The “significant association between funding sources and pro-industry conclusions” can play out in many different ways, notably through methodological bias and debasement of study designs and analytic strategies. Bias may be present in the form of inadequate sample sizes, short follow-up periods, inappropriate placebos or comparisons, use of improper surrogate endpoints, unsuitable statistical analyses or “misleading presentation of data.”

Occasionally, high-level journal insiders blow the whistle on the corruption of published science. In a widely circulated quote, Dr. Marcia Angell, former editor-in-chief of NEJM, acknowledged that “It is simply no longer possible to believe much of the clinical research that is published, or to rely on the judgment of trusted physicians or authoritative medical guidelines.” Dr. Angell added that she “[took] no pleasure in this conclusion, which [she] reached slowly and reluctantly” over two decades at the prestigious journal.

Many vaccine studies flagrantly illustrate biases and selective reporting that produce skewed write-ups that are more marketing than science. In formulaic articles that medical journals are only too happy to publish, the conclusion is almost always the same, no matter the vaccine: “We did not identify any new or unexpected safety concerns.” As an example of the use of inappropriate statistical techniques to exaggerate vaccine benefits, an influenza vaccine study reported a “69% efficacy rate” even though the vaccine failed “nearly all who [took] it.” As explained by Dr. David Brownstein, the study’s authors used a technique called relative risk analysis to derive their 69% statistic because it can make “a poorly performing drug or therapy look better than it actually is.” However, the absolute risk difference between the vaccine and the placebo group was 2.27%, meaning that the vaccine “was nearly 98% ineffective in preventing the flu.”
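To make the difference between those two measures concrete, here is a minimal sketch in Python. The attack rates in it are not taken from the study itself; they are hypothetical values back-calculated so that the two figures quoted above (a 69% relative reduction and a 2.27% absolute difference) both fall out of the arithmetic.

    # Relative vs. absolute risk reduction: a minimal sketch.
    # The rates below are HYPOTHETICAL, chosen only so that the two
    # summary figures quoted in the text emerge from the arithmetic.
    placebo_rate = 0.0329  # assumed: ~3.3% of the placebo group caught the flu
    vaccine_rate = 0.0102  # assumed: ~1.0% of the vaccine group caught the flu

    absolute_reduction = placebo_rate - vaccine_rate        # 0.0227 -> 2.27%
    relative_reduction = absolute_reduction / placebo_rate  # ~0.69  -> "69% efficacy"

    print(f"Absolute risk reduction: {absolute_reduction:.2%}")  # 2.27%
    print(f"Relative risk reduction: {relative_reduction:.0%}")  # 69%

Both percentages are computed from the same two numbers; the headline figure simply divides a small difference by an even smaller baseline, which is how a product with a modest absolute benefit can be reported with an impressive-sounding efficacy rate.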

Trusted evidence?

In 2018, the Cochrane Collaboration—which bills its systematic reviews as the international gold standard for high-quality, “trusted” evidence—furnished conclusions about the human papillomavirus (HPV) vaccine that clearly signaled industry bias. In May of that year, Cochrane’s highly favorable review improbably declared the vaccine to have no increased risk of serious adverse effects and judged deaths observed in HPV studies “not to be related to the vaccine.” Cochrane claims to be free of conflicts of interest, but its roster of funders includes national governmental bodies and international organizations pushing for HPV vaccine mandates as well as the Bill & Melinda Gates Foundation and the Robert Wood Johnson Foundation—both of which are staunch funders and supporters of HPV vaccination. The Robert Wood Johnson Foundation’s president is a former top CDC official who served as acting CDC director during the H1N1 “false pandemic” in 2009 that ensured millions in windfall profits for vaccine manufacturers.

Two months after publication of Cochrane’s HPV review, researchers affiliated with the Nordic Cochrane Centre (one of Cochrane’s member centers) published an exhaustive critique, declaring that the reviewers had done an incomplete job and had “ignored important evidence of bias.” The critics itemized numerous methodological and ethical missteps on the part of the Cochrane reviewers, including failure to count nearly half of the eligible HPV vaccine trials, incomplete assessment of serious and systemic adverse events and failure to note that many of the reviewed studies were industry-funded. They also upbraided the Cochrane reviewers for not paying attention to key design flaws in the original clinical trials, including the failure to use true placebos and the use of surrogate outcomes for cervical cancer.

In response to the criticisms, the editor-in-chief of the Cochrane Library initially stated that a team of editors would investigate the claims “as a matter of urgency.” Instead, however, Cochrane’s Governing Board quickly expelled one of the critique’s authors, Danish physician-researcher Peter Gøtzsche, who helped found Cochrane and was the head of the Nordic Cochrane Centre. Gøtzsche has been a vocal critic of Cochrane’s “increasingly commercial business model,” which he suggests is resulting in “stronger and stronger resistance to say anything that could bother pharmaceutical industry interests.” Adding insult to injury, Gøtzsche’s direct employer, the Rigshospitalet hospital in Denmark, then fired Gøtzsche. In response, Dr. Gøtzsche stated, “Firing me sends the unfortunate signal that if your research results are inconvenient and cause public turmoil, or threaten the pharmaceutical industry’s earnings, …you will be sacked.” In March 2019, Gøtzsche launched an independent Institute for Scientific Freedom.

In 2019, the editor-in-chief and research editor of BMJ Evidence Based Medicine—the journal that published the critique of Cochrane’s biased review—jointly defended the critique as having “provoke[d] healthy debate and pose[d] important questions,” affirming the value of publishing articles that “hold organisations to account.” They added that “Academic freedom means communicating ideas, facts and criticism without being censored, targeted or reprimanded” and urged publishers not to “shrink from offering criticisms that may be considered inconvenient.”

The censorship tsunami

Another favored tactic is to keep vaccine-critical studies out of medical journals altogether, either by refusing to publish them (even if peer reviewers recommend their publication) or by concocting excuses to pull articles after publication. In recent years, a number of journals have invented bogus excuses to withdraw or retract articles critical of risky vaccine ingredients, even when written by top international scientists. To cite just three examples:

  • The journal Vaccine withdrew a study that questioned the safety of the aluminum adjuvant used in Gardasil.
  • The journal Science and Engineering Ethics retracted an article that made a case for greater transparency regarding the link between mercury and autism.
  • Pharmacological Research withdrew a published veterinary article that implicated aluminum-containing vaccines in a mystery illness decimating sheep, citing “concerns” from an anonymous reader.

Elsevier, which publishes two of these journals, has a track record of setting up fake journals to market Merck’s drugs, and Springer, which publishes the third journal as well as influential publications like Nature and Scientific American, has been only too willing to accommodate censorship requests. However, even these forms of censorship may soon seem quaint in comparison to the censorship of vaccine-critical information now being implemented across social media and other platforms. This concerted campaign to prevent dissemination of vaccine content that does not toe the party line will make it harder than ever for American families to do their due diligence with regard to vaccine risks and benefits.


Sign up for free news and updates from Robert F. Kennedy, Jr. and the Children’s Health Defense. CHD is planning many strategies, including legal, in an effort to defend the health of our children and obtain justice for those already injured. Your support is essential to CHD’s successful mission.

