
Do Psychedelics Have The Power To Wake Us Up?


Death: The root of all our fears. The underlying reason we hold back, take the safe route, stay in our comfortable abodes, and live platonic lives ruled by our fears. Death is the biggest mystery in our lives with the most speculation about what it entails, and it’s the fact that we can never really solve the mystery until it happens that keeps so many of us paralyzed.


But is death really so terrifying? What if we could glimpse death before it occurred? Would we feel more open to life’s innumerable possibilities? Would we greet the demise of our loved ones with grace and acceptance instead of resentment and anger?

The answer is yes. Because when we’ve seen what lies beyond this realm, the afterlife isn’t nearly as terrifying. When we’ve felt an endless sense of oneness, even if it’s still, on some level, only speculation, most of us find we’re much more comfortable with the idea of our bodies withering away. Such is the cycle of life, though rarely do we humans see it that way.

They say perspective is everything. This is partly why researchers are now using psilocybin mushrooms, LSD, and MDMA as a treatment for people with terminal illness. People who are in pain, depressed, and whose reality is steeped in despair are finding solace by being catapulted into an entirely different perspective.

The anecdotal accounts of perspective-shifting experiences from psychedelic substances are plentiful. But that’s not what I want to focus on here. I want to talk about the ancestral uses of psychedelics, as well as the modern research into the benefits of these substances for healing our negative attitudes towards death.

The Death Of The Ego

It’s important when talking about death to talk about the ego, the ‘I’, the driving force behind the majority of our motivations as inherently selfish humans.


When we take psychedelic substances, we experience a death of the ego. We can see ourselves from a much larger viewpoint and our selfish drives that often lurk in the shadows are brought into the light.

In this way, although eventually we snap out of it and return to our daily lives, we remain forever changed. Perhaps after such an experience with a diminished ego, we find it easier to humble ourselves, to remember that we may be alone in the world yet we’re still all in this together. We stop making ourselves the primary focus and experience the world, our surroundings, and our loved ones in an entirely different way.

For some it’s quite subtle, and for others their entire worldview may be shaken. Either way, it goes to show why we are where we are on a collective level. For thousands of years, our ancestors revered psychedelic substances and viewed the Earth as sacred. Contrast that with a society that condemns plant medicine and jails people for challenging the norm, all the while destroying the planet and embarking on ever more missions of violence and glorification of war, and it’s easy to see how perspective changes everything, and how our egos have gotten out of control.

The Ancient Use Of Plant Medicines

Perhaps the most well-known account of the ancestral use of plant medicines is Terence McKenna’s Food of the Gods. McKenna describes how psilocybin mushrooms were revered by ancient cultures long ago. As far back as 3500 BC, rock paintings on the Tassili Plateau in southern Algeria depict dancing shamans holding mushrooms in the presence of white cattle.

While it’s difficult to know for certain how ancient cultures used these substances during death, it’s quite apparent that they honoured them deeply. On every continent, in every indigenous culture, there’s evidence of mind-altering substances being taken. Even today the use of ayahuasca, peyote, psilocybin, ibogaine, and San Pedro continues in tribal settings around the world.

It’s been speculated that soma, described as a leafless, rootless plant in the Rig Veda, an ancient text written by the Aryans who came from Siberia to India, is in fact the psilocybin mushroom.

“We have drunk the Soma; we have become immortal; we have gone to the light; we have found the gods.” (Rig Veda 8.48.1-15)

Modern Psychedelia & Acceptance Of Death

DMT, psilocybin, LSD, and MDMA have all been studied for their potential to help us better understand and accept our inevitable demise.

And what do we have to lose when a person is already facing certain death? When the worst has already been conquered, we can allow our fears to wither away and begin to try new things. Perhaps this is why we’re starting to see a reemergence of mainstream research into psychedelics for helping the terminally ill cope with death.

This is happening in the U.S., with major discussions taking place even in mainstream media, such as this New Yorker article titled “The Trip Treatment,” which covers the reemergence of psychedelics as therapy for those approaching death. The article focuses on one study in particular, documenting one man’s experience taking psilocybin to ease his fears surrounding his impending death from a terminal illness.

The study involving terminally ill cancer patients is described by researchers as “two treatment sessions, one with the active drug and one with a placebo, along with additional meetings for emotional preparation and supportive counseling. The meetings are designed to insure comfort and safety for participants in the study.”

So far the research has shown promising results in reducing anxiety, depression, and feelings of despair in terminally ill cancer patients.

Trickling behind psilocybin research is LSD, the substance that changed a nation. Before 1966, LSD was being studied intensely for its medicinal benefits. Thousands of research papers were published on LSD, involving over 40,000 participants. The famous author and psychonaut Aldous Huxley asked for LSD as he lay dying of laryngeal cancer.

In The Doors of Perception, Huxley had this to say about the psychedelic experience:

The man who comes back through the Door in the Wall will never be quite the same as the man who went out. He will be wiser but less sure, happier but less self-satisfied, humbler in acknowledging his ignorance yet better equipped to understand the relationship of words to things, of systematic reasoning to the unfathomable mystery which it tries, forever vainly, to comprehend.

LSD is finally creeping back onto the research scene. A small study by Swiss psychiatrist Dr. Peter Gasser tested the effects of LSD combined with talk therapy in 12 terminally ill patients.

In a follow-up one year after the study ended, Gasser reported that his patients’ anxiety had decreased and stayed down until their deaths.

A question apt to come up in this discussion is whether or not we’re simply deluding ourselves with the notions brought to light by psychedelics. The answer is unknowable. But the real question is: Who are we to impose the necessity of scientific proof upon the fact that someone’s life was made better because of an intimate experience they had with plant medicine? Should we all not be allowed the freedom to choose our own experiences, especially when it comes to preparing for death?

MDMA is also being used to help people find acceptance of their terminal diagnoses. A substance that was once used freely in therapeutic settings is only now making a comeback for its medicinal potential.

When we examine mortality, one very important molecule comes to mind: dimethyltryptamine, otherwise known as DMT. Said to be released only when we enter and leave the world, DMT is found throughout the living world, naturally occurring at higher levels in certain frog species and plants. To take DMT is, on some level, to mimic the experience that accompanies death.

Rick Strassman, MD, is famous for his research at the University of New Mexico in the 1990s involving DMT to induce a near-death experience. In a world where even talking about death is taboo, experiencing it before it happens is practically incomprehensible.

Pure DMT is very different from other plant medicines, as the actual psychedelic experience only lasts for, at most, 30 minutes. Yet personal accounts describe no perception of time.

In brews like ayahuasca, where DMT occurs naturally but is combined with a myriad of other compounds, the experience is said to last anywhere from 12 to 24 hours. In a world obsessed with time constraints, this kind of trip is far more daunting than a 30-minute deep dive into one’s psyche.

Strassman’s research was made into a documentary as well as a book, DMT: The Spirit Molecule. There haven’t been any studies on isolated DMT since, even though it causes no side effects aside from the discomfort encountered when we enter a state of no control.

A Good Death

While it may be a slow process, the medical community is beginning to come around to giving these methods a chance. Although nearly every study must be privately funded, organizations such as MAPS (the Multidisciplinary Association for Psychedelic Studies) are making major headway in the midst of endless challenges.

The reason that governments and pharmaceutical companies are hesitant to fund studies of psychedelic substances is twofold. One, they would be forced to acknowledge medicinal uses and would have no choice but to remove these substances from the Schedule 1 drug class, which declares they have ‘no medical uses.’ Two, the pharmaceutical industry, fueled by profit as it is, would be unable to patent these natural substances, nor are they the kind of product people would keep using long term.

This leaves us stuck between a rock and a hard place. Death, as taboo as it may be, needs to be talked about, both in its literal sense and as the spiritual death of the ego. Were we able to tame our egos earlier in life, perhaps we would find more freedom to truly live and break free from our invisible shackles of fear.

Stephen Jenkinson, author of Die Wise: A Manifesto for Sanity and Soul, has worked extensively in the medical community helping dying people and their families. His powerful statement “Not success. Not growth. Not happiness. The cradle of your love of life . . . is death” speaks to our society’s aversion to this topic. We hide from it, yet facing it is incredibly important for infusing deeper meaning into our lives.

Coming to grips with the fact that one day we all pass away is liberating, yet difficult. To die a good death is to leave the world with a certain level of peace and acceptance. What better way to do this than to glimpse the ego’s death and see the world without our culturally imposed filters and unnecessary fears?


The Anatomy of Conspiracy Theories


Whether you believe in conspiracy theories or not, we can all agree that the use of the term has exploded in media and in conversation. The question is, why? Are we now using the term “Conspiracy Theory” more indiscriminately and on more platforms than previously? Are we, as a society, simply becoming unhinged and absurd? Are seemingly nonsensical stories, for some unknown reason, starting to resonate with people? Or are some conventional narratives getting challenged because some of these “alternative” explanations are in fact accurate, despite the fact that conventional sources refuse to acknowledge them as even potentially valid? Notice that the last two possibilities are different sides of the same coin. If you think  “conspiracy theorists” are unhinged, it is highly likely that they are suspicious of your sanity as well. Both sides insist that they are right and that the other has been hoodwinked. Note that if you choose to not pick a side, you are, by default, allowing the conventional narrative to perpetuate. That is how convention works. 

Merriam-Webster defines the term conspiracy theory as “a theory that explains an event or situation as the result of a secret plan by usually powerful people or groups”. The key elements of this definition remain consistent across all authoritative lexicons: the group responsible for an event must be powerful and covert. However, if we refer to the Wikipedia definition as of November 2018, a new element emerges: “A conspiracy theory is an explanation of an event or situation that invokes a conspiracy—generally one involving an illegal or harmful act supposedly carried out by government or other powerful actors—without credible evidence.”

When an explanation is labeled a “Conspiracy Theory,” by today’s definition, it has no evidence to support it. An explanation with no supporting evidence is a hypothesis, not a “theory.” “Conspiracy Theory,” as it is used today, is thus an oxymoron. These “Conspiracy Theories” we seem to hear about every day should really be called “Conspiracy Hypotheses.” More concerning is that the “Conspiracy Theory” label identifies an explanation as inherently baseless. Given this linguistic construct, where is there room for a conspiracy that is in fact true?

There is also something troubling about using the term “credible” in the definition of conspiracy theory. Legally, evidence that is credible is that which a reasonable person would consider to be true in light of the surrounding circumstances. If evidence suggests an explanation that seems at the surface to be unreasonable, how does a reasonable person avoid automatically labeling the evidence not credible? If we are not careful, the credibility of the explanation and resultant conclusions would then determine the credibility of the evidence that supports it. Is this really so important? Perhaps you are quick to see that with this approach, our understanding of what is true and real can never evolve. If any evidence arose that radically disproved our understanding or eroded our faith in trusted institutions we would automatically discard it as “not credible” and remain entrenched in our accepted paradigm. “Credible” evidence cannot be a necessary requirement of a theory that challenges what is credible to begin with.

To better illustrate this, let us consider an old but very real “conspiracy theory.” About 400 years ago, European civilization was emerging from centuries of scientific and philosophical stagnation known as the dark ages. What more befitting a place for such a renaissance to occur than the center of the universe? You see, the idea that the Earth was one of eight planets revolving around a star that is orbiting the center of one of hundreds of billions of galaxies would have been absurd in Europe in the sixteenth century. Any sane person could see that the Sun and the Moon and every celestial body rises in the East and sets in the West. At that time, if someone went about proposing the idea that everything rises and falls because the Earth was spinning, they would have been laughed out of the tavern. Would that person be a conspiracy theorist? They are not proposing that “powerful actors are carrying out a harmful act”; they are merely suggesting an alternative explanation for what is observed. However, the implication of their suggestion seems to incriminate the authority on such matters as ignorant of the truth or, possibly, the perpetrators of a lie. The possibility of a conspiracy has now been introduced.

Now, let us say that this person claims to have proof of their absurd theory. Would you have taken the time to examine the evidence or would you have been more likely to dismiss them without further consideration? The very idea that they could be right would have been not just silly or heretical, but inconceivable to many, if not all. How could the evidence be credible if it implied something inconceivable? Dismissing their idea would have seemingly been the most logical and, therefore, the smartest thing to do.


When Galileo Galilei appeared in 1610 armed with a rudimentary “telescope,” few would peer into it. He claimed that the refractive properties of the pair of “lenses” would allow you to see things at great distances very clearly. With it one could see Jupiter and its moons revolving around the giant planet just as our moon revolves around Earth. How enchanting! The difficulty would arise when you put the telescope down: your feet would no longer be planted on the previously immovable center of creation. Would you have looked into his telescope? What would have been the harm in taking a peek? Certainly the fear of being proven more gullible than most would have been on your mind. What about the fear that he might be right?

Imagine what must have been going through Galileo’s mind after his monumental discovery. He saw irrefutably that the entire model of the universe had been completely misconceived. One just has to look. Most did not. I can only imagine how hard he must have tried to convince anyone to simply stop, look and listen to what he had discovered. At the time, Galileo was the Chair of Mathematics at the University of Padua and had previously held the same post at the University of Pisa. Despite his bona fides and reputation as a solid contributor to the Italian renaissance, his discovery would likely have died in obscurity if it weren’t for the support of an influential family, the Medicis, who offered Galileo a platform from which he could spread his theory. It was only through allying himself with political power that he was able to slowly generate interest in his heliocentric model of the solar system. His proposition eventually caught the attention of the Catholic Church, which initially warned him to desist. Eventually, he was brought to trial by the Roman Inquisition 23 years after his discovery. At the age of 70, the intrepid mathematician and astronomer was forced to recant his claims, and he spent the rest of his years under house arrest.

Did the truth prevail? It did not. Galileo died under house arrest while Europe continued to slumber under stars that moved around it. By today’s standards, Galileo would have been labeled a Conspiracy Theorist from the day he announced his findings until he was proven right roughly fifty years after his death. When the Principle of Gravitational Attraction eventually became widely accepted as true, the church had to retract its position, because under Newton’s laws the motions of the stars and planets could not be explained any other way.

On the other hand, Galileo is credited with being the father of not only observational astronomy, but of the scientific method as well. The scientific method demands that one test an explanation without bias towards an outcome. All data is considered before deductions are made. When all other explanations have been proven wrong, the only explanation remaining becomes a theory. The theory persists as long as all subsequent experiments continue to uphold it. This is how we ultimately know what we know and have an inkling of what we don’t. If I had to choose a posthumous title for myself, “The Father of the Scientific Method” is one I could die with. Galileo is credited with this honorific not only because he risked his freedom for it, but because he had the discipline to regard evidence objectively despite how unimaginable the implications were. This is how a body of knowledge expands. By considering the validity of the evidence first, we can then accept what was previously unimaginable; otherwise, what we know tomorrow will be no different than what we know today.

Not all conspiracy theorists are Galileos. Neither are all conspiracy theories true. However, can we be certain that all of them are false? At their very core, all conspiracy theories directly or indirectly point at a central authority acting covertly and simultaneously at the media for either missing it or looking the other way. This, of course, is unimaginable, as we all know the government can make mistakes but would never do anything intentionally harmful to its citizens and then hide it. Even if it did, somebody would come forward and the media would let us know about it. This is why such a deception could never occur. The idea that your lover could be in bed with your best friend is inconceivable. Evidence of such a thing would not be credible. Dismissing all conspiracy theories seems logical and therefore seems like the smartest thing to do.

In “Sapiens”, Yuval Harari proposes an explanation for why our species, Sapiens, outfought, outthought and outsurvived all other Homo species on the planet. He suggests that it was our unique ability to describe and communicate situations and events that had no basis in reality which set us apart. In other words, we could tell stories and they could not. By uniting under a common idea, story or even myth, thousands (and now thousands of millions) of Sapiens could come together with a shared purpose, identity or belief system to displace our cousins, who as individuals were sturdier and just as cunning but not nearly as good at cooperating as we were. This advantage, Harari proposes, has not only led our species to eventual supremacy over all others, but has also allowed us to form communities, governments and global alliances.

Siding with the majority has served us well–until it hasn’t. One only needs to revisit the history of Galileo and basic astronomy to understand this. In actuality, the first observant minds woke up to the fact that the Earth went around the sun and not the other way round nineteen centuries before Galileo did. The Greek mathematician Aristarchus is thought to be the first Western person to place the Sun in the middle of a “solar system” in 270 BC. A human being traveled to the moon just 360 years after Galileo “discovered” what Aristarchus had shown nearly two millennia before. How many centuries was this journey delayed because an alternative explanation in ancient Greece became a “conspiracy theory” against authority and convention?

This poses an intriguing question. Is there something hardwired in our behavioral patterns that pushes us towards conformist narratives and away from alternative ones at a precognitive level? Is the same tendency that gave rise to our enhanced ability to unite also what keeps us in “group-think” more than we should be? How do we know we are looking at the world objectively and rejecting alternative belief systems on a purely rational basis? How does one know whether one is biased or not?

One way is to apply the scientific method. The scientific method demands that every possibility, no matter how outlandish, be tested for its veracity and dismissed only when it can be proven wrong. Without this objective pursuit of truth, misconceptions can persist indefinitely, just as the geocentric model of the universe did. Interestingly, Aristarchus was allowed to retain his theory because he lived at a time and place where philosophers, mathematicians and scientists were revered, protected and free to pursue their notions. The freedom ancient Greek society afforded its scientists endured for only a few centuries after Aristarchus lived. In Galileo’s day, the Roman Catholic Church had been presiding over such things as facts for well over a thousand years. His incontrovertible proof was suppressed by the power that had the most to lose.

These days, establishing the facts of the matter may not be as easy as we presume. Conspiracy theorists claim to have proof just like the debunkers do. How do we know that the proof offered on either side is valid? Who has the time to apply the scientific method? It certainly seems safer to go with the conventional narrative because surely there are more rational minds in a larger group. Though it seems a reasonable approach, it may in fact be where we misstep. By deferring to others, we assume the majority will arrive at the truth eventually. The problem is that those in the majority who are trained to examine evidence objectively often must take a potentially career-ending risk to even investigate an alternative explanation. Why would an organization be willing to invest the resources to redirect its scientific staff to chase down and evaluate evidence that will likely endanger its reputation with the public, without any upside? Thus, conventional narratives survive for another day, or in the case of an Earth-centered universe, for a couple of thousand years.

Whether or not you are a “conspiracy theorist,” we can all agree that there is a possibility, however slight, that some conventional narratives could be wrong. How would we know? Is there a source that we can trust 100%? Must we rely on our own wits? A short inquiry into this question can be disquieting. Most of us must admit that our understanding of history, science and geopolitics is merely a collection of stories that we have been told by people, institutions or media that we trust explicitly or implicitly. Because most of us are not authorities on anything, it would be impossible to overturn any conventional narrative with an evidentiary argument. Challenging these paradigms is necessarily left to others. Generally speaking, there is no real reason to argue with convention if everything is seemingly unfolding acceptably. But what if you wanted to know for yourself? Is there any way to ever really know the truth without having to have faith in someone or something else?

There may not be. However, it is also naive to believe that if someone, scientist or not, were in possession of evidence that challenged our most deeply held beliefs, it would take root in the ethos on its own. Galileo enjoyed unsurpassed credibility as one of Italy’s foremost mathematicians. He also possessed irrefutable, verifiable and reproducible evidence for his revolutionary theory, yet the convention he was challenging did not crumble because of his discoveries. History has shown us that it makes no difference how valid a point is; truth emerges only when someone is listening.

So, rather than seeking to independently validate or refute what we are being told, it becomes more productive to ask a different question: How biased is our society by historical standards? How does our society regard alternative theories? Do we let them co-exist with convention as the ancient Greeks did? Do we collectively invest resources to investigate them openly? Or do we dismiss, attack and vilify them as was done in the papal states in Galileo’s time? Which kind of society is more likely to get it right? Which runs the greater risk of being hoodwinked in the long run? Which is more free?


US House of Representatives Investigating if the Government Created Lyme Disease As A Bioweapon


In Brief

  • The Facts: A New Jersey lawmaker suggests the government turned ticks and insects into bioweapons to spread disease, and possibly released them. He is not the only one who believes so.

  • Reflect On: This is not the only example of supposed human experimentation on mass populations by the government.

There are a number of subjects that were once considered ‘conspiracy theories,’ which are now no longer in that realm. ‘Conspiracy theories’ usually, in my opinion, arise from credible evidence. The implications, however, are so grand and so mind-altering that many may experience some sort of cognitive dissonance as a result. One of the topics often deemed a ‘conspiracy theory’ is weaponized diseases, and the latest example comes from an approved amendment that was proposed by a Republican congressman from New Jersey. His name is Chris Smith, and he instructed the Department of Defense’s Inspector General to conduct a review of whether the US “experimented with ticks and insects regarding use as a biological weapon between the years of 1950 and 1975” and “whether any ticks or insects used in such experiment were released outside of any laboratory by accident or experiment design.”

The fact that Smith brought this up shows that any intelligent person who actually looks into this has reason to believe it’s a possibility, yet mainstream media outlets are ridiculing the idea, calling it a conspiracy theory instead of actually addressing the points that caused Smith to demand the review.

The fact that the amendment was approved by a vote in the House speaks volumes. Smith said that the amendment was inspired by “a number of books and articles suggesting that significant research had been done at US government facilities including Fort Detrick, Maryland, and Plum Island, New York, to turn ticks and insects into bioweapons”.

Most people don’t know that the US government has experimented on its own citizens a number of times. All of this is justified for “national security” purposes. National security has always been a term used as an excuse to prolong secrecy, justify the government’s lack of transparency, and create black budget programs that have absolutely no oversight from Congress.

For example, on September 20, 1950, a US Navy ship just off the coast of San Francisco used a giant hose to spray a cloud of microbes into the air and into the city’s famous fog. The military was apparently testing how a biological weapon attack would affect the 800,000 residents of the city. The people of San Francisco had absolutely no idea. The Navy continued the tests for seven days, and multiple people died as a result. It was apparently one of the first large-scale biological weapon trials that would be conducted under a “germ warfare testing program” that went on for 20 years, from 1949 to 1969. The goal “was to deter [the use of biological weapons] against the United States and its allies and to retaliate if deterrence failed,” the government later explained. Then again, that’s if you trust the explanation coming from the government.

This could fall under the category of human subject research. It’s still happening! A dozen classified programs that involved research on human subjects were underway last year at the Department of Energy. Human subject research refers broadly to the collection of scientific data from human subjects. This could involve performing physical procedures on the subjects or simply conducting interviews and having other forms of interaction with them. It could even involve procedures performed on entire populations, apparently without their consent.

advertisement - learn more

Human subjects research erupted into national controversy 25 years ago with reporting by Eileen Welsome of the Albuquerque Tribune on human radiation experiments that had been conducted by the Atomic Energy Commission, many of which were performed without the consent of the subjects. A presidential advisory committee was convened to document the record and to recommend appropriate policy responses.

When it comes to Lyme disease, the Guardian points out that:

A new book published in May by a Stanford University science writer and former Lyme sufferer, Kris Newby, has raised questions about the origins of the disease, which affects 400,000 Americans each year.

Bitten: The Secret History of Lyme Disease and Biological Weapons cites the Swiss-born discoverer of the Lyme pathogen, Willy Burgdorfer, as saying that the Lyme epidemic was a military experiment that had gone wrong.

Burgdorfer, who died in 2014, worked as a bioweapons researcher for the US military and said he was tasked with breeding fleas, ticks, mosquitoes and other blood-sucking insects, and infecting them with pathogens that cause human diseases.

According to the book, there were programs to drop “weaponised” ticks and other bugs from the air, and that uninfected bugs were released in residential areas in the US to trace how they spread. It suggests that such a scheme could have gone awry and led to the eruption of Lyme disease in the US in the 1960s.

This is concerning. It’s a story that, for some reason, instantly reminded me of the MKUltra program, where human subjects were used for mind control research.

If things like this occurred in the past, it’s hard to understand why someone would deem the possibility of this happening again a ‘conspiracy theory.’ What makes one think this wouldn’t be happening again, especially given the fact that there is sufficient evidence suggesting it is?

Lyme disease is also very strange. If you did get it, you probably wouldn’t know immediately – unless you’re one of the chronic sufferers who have had to visit over 30 doctors to get a proper diagnosis. Lyme disease tests are highly inaccurate, often inconclusive or returning false negatives.

Why? Because this clever bacterium has found a way to dumb down the immune system and white blood cells so that it’s not detectable until treatment is initiated. To diagnose Lyme disease properly you must see a “Lyme Literate MD (LLMD).” However, more and more doctors are turning their backs on patients due to sheer fear of losing their practices! Insurance companies and the CDC will do whatever it takes to stop Chronic Lyme Disease from being diagnosed, treated, or widely recognized as an increasingly common issue.

You can read more about that here.

The Takeaway

It’s becoming more apparent that our government and our federal health regulatory agencies are extremely corrupt. There are a number of examples to choose from throughout history proving this. The fact that something like this doesn’t seem believable to the public is ridiculous, and it further enhances and prolongs the ability of the powerful elite and the government to continue conducting these activities. Awareness is key.


The Medical Journals’ Sell-Out—Getting Paid to Play


[Note: This is Part IX in a series of articles adapted from the second Children’s Health Defense eBook: Conflicts of Interest Undermine Children’s Health. The first eBook, The Sickest Generation: The Facts Behind the Children’s Health Crisis and Why It Needs to End, described how children’s health began to worsen dramatically in the late 1980s following fateful changes in the childhood vaccine schedule.]

The vaccine industry and its government and scientific partners routinely block meaningful science and fabricate misleading studies about vaccines. They could not do so, however, without having enticed medical journals into a mutually beneficial bargain. Pharmaceutical companies supply journals with needed income, and in return, journals play a key role in suppressing studies that raise critical questions about vaccine risks—which would endanger profits.


An exclusive and dependent relationship

Advertising is one of the most obviously beneficial ways that medical journals’ “exclusive and dependent relationship” with the pharmaceutical industry plays out. According to a 2006 analysis in PLOS Medicine, drugs and medical devices are the only products for which medical journals accept advertisements. Studies show that journal advertising generates “the highest return on investment of all promotional strategies employed by pharmaceutical companies.” The pharmaceutical industry puts a particularly “high value on advertising its products in print journals” because journals reach doctors—the “gatekeeper between drug companies and patients.” Almost nine in ten drug advertising dollars are directed at physicians.

In the U.S. in 2012, drug companies spent $24 billion marketing to physicians, with only $3 billion spent on direct-to-consumer advertising. By 2015, however, consumer-targeted advertising had jumped to $5.2 billion, a 60% increase that has reaped bountiful rewards. In 2015, Pfizer’s Prevnar-13 vaccine was the nation’s eighth most heavily advertised drug; after the launch of the intensive advertising campaign, Prevnar “awareness” increased by over 1,500% in eight months, and “44% of targeted consumers were talking to their physicians about getting vaccinated specifically with Prevnar.” Slick ad campaigns have also helped boost uptake of “unpopular” vaccines like Gardasil.

Advertising is such an established part of journals’ modus operandi that high-end journals such as The New England Journal of Medicine (NEJM) boldly invite medical marketers to “make NEJM the cornerstone of their advertising programs,” promising “no greater assurance that your ad will be seen, read, and acted upon.” In addition, medical journals benefit from pharmaceutical companies’ bulk purchases of thousands of journal reprints and industry’s sponsorship of journal subscriptions and journal supplements.


In 2003, an editor at The BMJ wrote about the numerous ways in which drug company advertising can bias medical journals (and the practice of medicine)—all of which still hold true today. For example:

  • Advertising monies enable prestigious journals to get thousands of copies into doctors’ hands for free, which “almost certainly” goes on to affect prescribing.
  • Journals are willing to accept even the most highly misleading advertisements. The FDA has flagged numerous instances of advertising violations, including ads that overstated a drug’s effectiveness or minimized its risks.
  • Journals will guarantee favorable editorial mentions of a product in order to earn a company’s advertising dollars.
  • Journals can earn substantial fees for publishing supplements even when they are written by “paid industry hacks”—and the more favorable the supplement content is to the company that is funding it, the bigger the profit for the journal.

Discussing clinical trials, the BMJ editor added: “Major trials are very good for journals in that doctors around the world want to see them and so are more likely to subscribe to journals that publish them. Such trials also create lots of publicity, and journals like publicity. Finally, companies purchase large numbers of reprints of these trials…and the profit margin to the publisher is huge. These reprints are then used to market the drugs to doctors, and the journal’s name on the reprint is a vital part of that sell.”


Industry-funded bias

According to the Journal of the American Medical Association (JAMA), nearly three-fourths of all funding for clinical trials in the U.S.—presumably including vaccine trials—came from corporate sponsors as of the early 2000s. The pharmaceutical industry’s funding of studies (and investigators) is a factor that helps determine which studies get published, and where. As a Johns Hopkins University researcher has acknowledged, funding can lead to bias—and while the potential exists for governmental or departmental funding to produce bias, “the worst source of bias is industry-funded.”

In 2009, researchers published a systematic review of several hundred influenza vaccine trials. Noting “growing doubts about the validity of the scientific evidence underpinning [influenza vaccine] policy recommendations,” the authors showed that the vaccine-favorable studies were “of significantly lower methodological quality”; however, even these poor-quality studies—when funded by the pharmaceutical industry—got far more attention than equivalent studies not funded by industry. The authors commented:

[Studies] sponsored by industry had greater visibility as they were more likely to be published by high impact factor journals and were likely to be given higher prominence by the international scientific and lay media, despite their apparent equivalent methodological quality and size compared with studies with other funders.

In their discussion, the authors also described how the industry’s vast resources enable lavish and strategic dissemination of favorable results. For example, companies often distribute “expensively bound” abstracts and reprints (translated into various languages) to “decision makers, their advisors, and local researchers,” while also systematically plugging their studies at symposia and conferences.

The World Health Organization’s standards describe reporting of clinical trial results as a “scientific, ethical, and moral responsibility.” However, it appears that as many as half of all clinical trial results go unreported—particularly when their results are negative. A European official involved in drug assessment has described the problem as “widespread,” citing as an example GSK’s suppression of results from four clinical trials for an anti-anxiety drug when those results showed a possible increased risk of suicide in children and adolescents. Experts warn that “unreported studies leave an incomplete and potentially misleading picture of the risks and benefits of treatments.”


Debased and biased results

The “significant association between funding sources and pro-industry conclusions” can play out in many different ways, notably through methodological bias and debasement of study designs and analytic strategies. Bias may be present in the form of inadequate sample sizes, short follow-up periods, inappropriate placebos or comparisons, use of improper surrogate endpoints, unsuitable statistical analyses or “misleading presentation of data.”

Occasionally, high-level journal insiders blow the whistle on the corruption of published science. In a widely circulated quote, Dr. Marcia Angell, former editor-in-chief of NEJM, acknowledged that “It is simply no longer possible to believe much of the clinical research that is published, or to rely on the judgment of trusted physicians or authoritative medical guidelines.” Dr. Angell added that she “[took] no pleasure in this conclusion, which [she] reached slowly and reluctantly” over two decades at the prestigious journal.

Many vaccine studies flagrantly illustrate biases and selective reporting that produce skewed write-ups that are more marketing than science. In formulaic articles that medical journals are only too happy to publish, the conclusion is almost always the same, no matter the vaccine: “We did not identify any new or unexpected safety concerns.” As an example of the use of inappropriate statistical techniques to exaggerate vaccine benefits, an influenza vaccine study reported a “69% efficacy rate” even though the vaccine failed “nearly all who [took] it.” As explained by Dr. David Brownstein, the study’s authors used a technique called relative risk analysis to derive their 69% statistic because it can make “a poorly performing drug or therapy look better than it actually is.” However, the absolute risk difference between the vaccine and the placebo group was 2.27%, meaning that the vaccine “was nearly 98% ineffective in preventing the flu.”
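To make the relative-versus-absolute distinction concrete, here is a minimal sketch in Python of the arithmetic implied by the two figures quoted above. The 69% relative risk reduction and the 2.27% absolute risk difference come from the article; the per-group flu rates the sketch derives from them are illustrative back-calculations, not numbers reported by the study.

```python
# A minimal sketch (not from the study itself) of the relative-vs-absolute
# risk arithmetic described above. The 69% relative risk reduction and the
# 2.27% absolute risk difference are the figures the article cites; the
# per-group flu rates below are back-calculated from those two numbers
# purely for illustration.

relative_risk_reduction = 0.69     # the reported "69% efficacy rate"
absolute_risk_difference = 0.0227  # placebo flu rate minus vaccine flu rate

# RRR = (p_placebo - p_vaccine) / p_placebo  and  ARD = p_placebo - p_vaccine,
# so the implied group rates are:
p_placebo = absolute_risk_difference / relative_risk_reduction
p_vaccine = p_placebo - absolute_risk_difference

print(f"Implied flu rate, placebo group: {p_placebo:.2%}")   # ~3.29%
print(f"Implied flu rate, vaccine group: {p_vaccine:.2%}")   # ~1.02%

# The same gap framed two ways:
print(f"Relative risk reduction: {(p_placebo - p_vaccine) / p_placebo:.0%}")  # 69%
print(f"Absolute risk reduction: {p_placebo - p_vaccine:.2%}")                # 2.27%
```

Framed in relative terms, the same 2.27-percentage-point gap sounds like 69% protection; framed in absolute terms, it is the basis for the near-98% figure Dr. Brownstein cites.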


Trusted evidence?

In 2018, the Cochrane Collaboration—which bills its systematic reviews as the international gold standard for high-quality, “trusted” evidence—furnished conclusions about the human papillomavirus (HPV) vaccine that clearly signaled industry bias. In May of that year, Cochrane’s highly favorable review improbably declared the vaccine to have no increased risk of serious adverse effects and judged deaths observed in HPV studies “not to be related to the vaccine.” Cochrane claims to be free of conflicts of interest, but its roster of funders includes national governmental bodies and international organizations pushing for HPV vaccine mandates as well as the Bill & Melinda Gates Foundation and the Robert Wood Johnson Foundation—both of which are staunch funders and supporters of HPV vaccination. The Robert Wood Johnson Foundation’s president is a former top CDC official who served as acting CDC director during the H1N1 “false pandemic” in 2009 that ensured millions in windfall profits for vaccine manufacturers.

Two months after publication of Cochrane’s HPV review, researchers affiliated with the Nordic Cochrane Centre (one of Cochrane’s member centers) published an exhaustive critique, declaring that the reviewers had done an incomplete job and had “ignored important evidence of bias.” The critics itemized numerous methodological and ethical missteps on the part of the Cochrane reviewers, including failure to count nearly half of the eligible HPV vaccine trials, incomplete assessment of serious and systemic adverse events and failure to note that many of the reviewed studies were industry-funded. They also upbraided the Cochrane reviewers for not paying attention to key design flaws in the original clinical trials, including the failure to use true placebos and the use of surrogate outcomes for cervical cancer.

In response to the criticisms, the editor-in-chief of the Cochrane Library initially stated that a team of editors would investigate the claims “as a matter of urgency.” Instead, however, Cochrane’s Governing Board quickly expelled one of the critique’s authors, Danish physician-researcher Peter Gøtzsche, who helped found Cochrane and was the head of the Nordic Cochrane Centre. Gøtzsche has been a vocal critic of Cochrane’s “increasingly commercial business model,” which he suggests is resulting in “stronger and stronger resistance to say anything that could bother pharmaceutical industry interests.” Adding insult to injury, Gøtzsche’s direct employer, the Rigshospitalet hospital in Denmark, then fired Gøtzsche. In response, Dr. Gøtzsche stated, “Firing me sends the unfortunate signal that if your research results are inconvenient and cause public turmoil, or threaten the pharmaceutical industry’s earnings, …you will be sacked.” In March 2019, Gøtzsche launched an independent Institute for Scientific Freedom.

In 2019, the editor-in-chief and research editor of BMJ Evidence Based Medicine—the journal that published the critique of Cochrane’s biased review—jointly defended the critique as having “provoke[d] healthy debate and pose[d] important questions,” affirming the value of publishing articles that “hold organisations to account.” They added that “Academic freedom means communicating ideas, facts and criticism without being censored, targeted or reprimanded” and urged publishers not to “shrink from offering criticisms that may be considered inconvenient.”


The censorship tsunami

Another favored tactic is to keep vaccine-critical studies out of medical journals altogether, either by refusing to publish them (even if peer reviewers recommend their publication) or by concocting excuses to pull articles after publication. In recent years, a number of journals have invented bogus excuses to withdraw or retract articles critical of risky vaccine ingredients, even when written by top international scientists. To cite just three examples:

  • The journal Vaccine withdrew a study that questioned the safety of the aluminum adjuvant used in Gardasil.
  • The journal Science and Engineering Ethics retracted an article that made a case for greater transparency regarding the link between mercury and autism.
  • Pharmacological Research withdrew a published veterinary article that implicated aluminum-containing vaccines in a mystery illness decimating sheep, citing “concerns” from an anonymous reader.

Elsevier, which publishes two of these journals, has a track record of setting up fake journals to market Merck’s drugs, and Springer, which publishes the third journal as well as influential publications like Nature and Scientific American, has been only too willing to accommodate censorship requests. However, even these forms of censorship may soon seem quaint in comparison to the censorship of vaccine-critical information now being implemented across social media and other platforms. This concerted campaign to prevent dissemination of vaccine content that does not toe the party line will make it harder than ever for American families to do their due diligence with regard to vaccine risks and benefits.


