

Isaac Newton’s Lost Alchemy Recipe Discovered: Are ‘Magic’ & ‘Superpowers’ Just Science We Have Yet To Understand?

Speaking seriously about either ‘magic’ or ‘superpowers’ will get you branded as a quack by the majority of mainstream scientists today. This is unfortunate for several reasons, most notably for the simple fact that what we perceive to be ‘superpowers’ — phenomena like telepathy, distant healing, psychokinesis, mental control over our own biology, and more — have been tested and researched, and have yielded a number of statistically significant results. The sheer volume of credible research which has been published in various peer-reviewed scientific journals on the subject is actually a bit overwhelming. Those who dismiss findings in this field as pseudoscience do not seem to be doing any research before arriving at this conclusion.


A statistics professor at the University of California, Irvine, for example, published a paper on mind-matter research which demonstrated that the evidence for Extra Sensory Perception (ESP) is significantly stronger than the statistics showing that a daily dose of aspirin helps prevent heart attacks. The study also showed that parapsychological (psi) studies produced stronger results than the studies demonstrating the effectiveness of antiplatelets, a group of medicines that stop blood cells from sticking together and forming blood clots.

There are many examples of strong results in psi research. For a short list of a few (out of many) downloadable peer-reviewed journal articles reporting studies of psychic phenomena, mostly published in the 21st century, you can click HERE.

Another fact you might not be aware of is that most of our pioneering scientists were mystics. Isaac Newton is a great example; most of his published works were classified as occult studies, but this is never mentioned in the mainstream scientific literature.

Isaac Newton And Alchemy

A 17th century document that has been held in a private collection for decades is now in the hands of the Chemical Heritage Foundation, a nonprofit group situated in Philadelphia, Pennsylvania. In February, they purchased the document and are now working on uploading digital images and transcripts to an online database so more people can study Newton’s take on alchemy.

According to science historian William Newman of Indiana University:


While there’s no evidence that Newton actually made sophick mercury, the manuscript will help scholars understand how he interpreted alchemy’s often deeply encoded recipes. The document also underscores the fact that Newton—a father of modern physics and co-discoverer of calculus—was greatly influenced by alchemy and his collaborations with alchemists. (source)

National Geographic goes on to emphasize how “Newton wrote more than one million words about alchemy throughout his life, in the hope of using ancient knowledge to better explain the nature of matter. . . .”

Alchemy is considered to be a philosophical tradition which has been practiced by various cultures throughout human history. Its aim was to purify, mature, and perfect certain materials, and to transmute base metals like lead into precious metals like gold. It’s commonly associated with ‘the Philosopher’s Stone,’ an alchemical substance said to be capable of turning base material into gold.

The document provides details on how to make “sophick mercury,” a substance seen as a main ingredient for the Philosopher’s Stone. The stone in turn could supposedly change base metals like lead into other substances, like gold. Newton copied the recipe by hand from a text by American-born alchemist George Starkey, but there is no evidence that he was actually successful in his experiments.

Again, this type of phenomenon is well documented throughout history, which is why scientists like Isaac Newton were so interested in it. He studied alchemy in depth, and it seems he had no doubts about its merit. Unfortunately, many of his writings on alchemy have been lost, apparently burned in a laboratory accident. Further complicating matters is the fact that much of his work on the subject was actually forbidden at the time, as scientists faced punishment and censure for pursuing occult topics. The English crown even feared the discovery of the Philosopher’s Stone because it would make gold — the substance they used (and still use) to control the monetary system — less valuable. All these facts are well documented. If you want to learn more or to confirm these findings for yourself, a great place to start is a documentary done by NOVA PBS, which you can access here.

Alchemy Is No Joke

Newton co-invented calculus, and is known (obviously) for many other things, but he is one of a long list of scientists throughout history to be heavily interested in what we often consider to be “occult studies.”

However, evidence suggests that alchemy may have more merit than we believe. Even in recent history we’ve seen heavy interest in the subject, with science historians working to decipher alchemical texts. Their task is not an easy one, as alchemists were obsessed with secrecy and would purposefully describe their experiments in figurative language.

Smithsonian Magazine tells us more about modern historical attempts to decipher these texts:

This painstaking process of decoding allowed researchers, for the first time, to attempt ambitious alchemical experiments. Lawrence Principe, a chemist and science historian at Johns Hopkins University, cobbled together obscure texts and scraps of 17th-century laboratory notebooks to reconstruct a recipe to grow a “Philosophers’ Tree” from a seed of gold. Supposedly this tree was a precursor to the more celebrated and elusive Philosopher’s Stone, which would be able to transmute metals into gold. The use of gold to make more gold would have seemed entirely logical to alchemists, Principe explains, like using germs of wheat to grow an entire field of wheat.

Again, there are multiple examples throughout ancient and modern history; even Robert Boyle, one of the 17th-century founders of modern chemistry, scavenged the work of German physician and alchemist Daniel Sennert.

I am going to leave you with this little excerpt from a book titled The Secret Teachings of All Ages, written by Manly P. Hall, a scholar of occult studies and a 33rd-degree Mason:

The alchemical philosophers used the symbols of salt, sulphur, and mercury to represent not only chemicals but the spiritual and invisible principles of God, man, and the universe. The three substances exist in four worlds, with the sum adding up to the sacred number 12. These 12 are the foundation stones of the sacred city. In line with the same idea Pythagoras asserted that the dodecahedron, or twelve-faced symmetrical geometric solid, was the foundation of the universe. May there not be a relation also between this mysterious 3 times 4 and the four parties of three which in the legend of the third degree of Freemasonry go forth to the four angels of the cherubim, the composite creature of four parts?

As one of the great alchemists fittingly observed, man’s quest for gold is often his undoing, for he mistakes the alchemical processes, believing them to be purely material. He does not realize that the Philosopher’s Gold, the Philosopher’s Stone, and the Philosopher’s Medicine exist in each of the four worlds and that the consummation of the experiment cannot be realized until it is successfully carried on in four worlds simultaneously according to one formula. Furthermore, one of the constituents of the alchemical formula exists only within the nature of man himself, without which his chemicals will not combine, and though he spend his life and fortune in chemical experimentation, he will not produce the desired end. The paramount reason why the material scientist is incapable of duplicating the achievements of the mediaeval alchemists – although he follow every step carefully and accurately – is that the subtle element which comes out of the nature of the illuminated and regenerated alchemical philosopher is missing in his experimentation.

This is the strength of all powers. This is a very strong figure, that does positively possess all the powers concealed in Nature, not for destruction but for exaltation and regeneration of matter, in the three departments of nature. With all this thou wilt be able to overcome all things, and to transmute all that is fine and all that is coarse. It will conquer every subtle thing, of course, as it refixes the most subtle Oxygen into its own fiery Nature, and that with more power, penetration and virtue.

The Philosopher’s Stone is really the philosopher’s stone, for philosophy is truly likened to a magic jewel whose touch transmutes base substances into priceless gems like itself. Wisdom is the alchemist’s powder of projection which transforms many thousand times its own weight of gross ignorance into the precious substance of enlightenment.

The Philosopher’s Stone contains all the powers of Nature; it is established by the harmony of the four elements.




The Anatomy of Conspiracy Theories

Whether you believe in conspiracy theories or not, we can all agree that the use of the term has exploded in media and in conversation. The question is, why? Are we now using the term “Conspiracy Theory” more indiscriminately and on more platforms than previously? Are we, as a society, simply becoming unhinged and absurd? Are seemingly nonsensical stories, for some unknown reason, starting to resonate with people? Or are some conventional narratives getting challenged because some of these “alternative” explanations are in fact accurate, despite the fact that conventional sources refuse to acknowledge them as even potentially valid? Notice that the last two possibilities are different sides of the same coin. If you think  “conspiracy theorists” are unhinged, it is highly likely that they are suspicious of your sanity as well. Both sides insist that they are right and that the other has been hoodwinked. Note that if you choose to not pick a side, you are, by default, allowing the conventional narrative to perpetuate. That is how convention works. 

Merriam-Webster defines the term conspiracy theory as “a theory that explains an event or situation as the result of a secret plan by usually powerful people or groups”. The key elements of this definition remain consistent across all authoritative lexicons: the group responsible for an event must be powerful and covert. However, if we refer to the Wikipedia definition as of November 2018, a new element emerges: “A conspiracy theory is an explanation of an event or situation that invokes a conspiracy—generally one involving an illegal or harmful act supposedly carried out by government or other powerful actors—without credible evidence.”

When an explanation is labeled a “Conspiracy Theory,” by today’s definition, it has no evidence to support it. An explanation with no supporting evidence is a hypothesis, not a “theory.” “Conspiracy Theory,” as it is used today, is thus an oxymoron. These “Conspiracy Theories” we seem to hear about every day should really be called “Conspiracy Hypotheses.” More concerning is that the “Conspiracy Theory” label identifies an explanation as inherently baseless. Given this linguistic construct, where is there room for a conspiracy that is in fact true?

There is also something troubling about using the term “credible” in the definition of conspiracy theory. Legally, evidence that is credible is that which a reasonable person would consider to be true in light of the surrounding circumstances. If evidence suggests an explanation that seems at the surface to be unreasonable, how does a reasonable person avoid automatically labeling the evidence not credible? If we are not careful, the credibility of the explanation and resultant conclusions would then determine the credibility of the evidence that supports it. Is this really so important? Perhaps you are quick to see that with this approach, our understanding of what is true and real can never evolve. If any evidence arose that radically disproved our understanding or eroded our faith in trusted institutions we would automatically discard it as “not credible” and remain entrenched in our accepted paradigm. “Credible” evidence cannot be a necessary requirement of a theory that challenges what is credible to begin with.

To better illustrate this, let us consider an old but very real “conspiracy theory.” About 400 years ago, European civilization was emerging from centuries of scientific and philosophical stagnation known as the dark ages. What more befitting a place for such a renaissance to occur than the center of the universe? You see, the idea that the Earth was one of eight planets revolving around a star that is orbiting the center of one of hundreds of billions of galaxies would have been absurd in Europe in the sixteenth century. Any sane person could see that the Sun and the Moon and every celestial body rises in the East and sets in the West. At that time, if someone went about proposing the idea that everything rises and falls because the Earth was spinning, they would have been laughed out of the tavern. Would that person be a conspiracy theorist? They are not proposing that “powerful actors are carrying out a harmful act,” they are merely suggesting an alternative explanation for what is observed. However, the implication of their suggestion seems to incriminate the authority on such matters as ignorant of the truth or, possibly, the perpetrators of a lie. The possibility of a conspiracy has now been introduced.

Now, let us say that this person claims to have proof of their absurd theory. Would you have taken the time to examine the evidence or would you have been more likely to dismiss them without further consideration? The very idea that they could be right would have been not just silly or heretical, but inconceivable to many, if not all. How could the evidence be credible if it implied something inconceivable? Dismissing their idea would have seemingly been the most logical and, therefore, the smartest thing to do.


When Galileo Galilei appeared in 1610 armed with a rudimentary “telescope,” few would peer into it. He claimed that the refractive properties of the pair of “lenses” would allow you to see things at great distances very clearly. With it one could see Jupiter and its moons revolving around the giant planet just as our moon revolves around Earth. How enchanting! The difficulty would arise when you put the telescope down: your feet would no longer be planted on the previously immovable center of creation. Would you have looked into his telescope? What would have been the harm in taking a peek? Certainly the fear of being proven more gullible than most would have been on your mind. What about the fear that he might be right?

Imagine what must have been going through Galileo’s mind after his monumental discovery. He saw irrefutably that the entire model of the universe had been completely misconceived. One just had to look. Most did not. I can only imagine how hard he must have tried to convince anyone to simply stop, look and listen to what he had discovered. At the time, Galileo was the Chair of Mathematics at the University of Padua and had previously held the same post at the University of Pisa. Despite his bona fides and reputation as a solid contributor to the Italian renaissance, his discovery would likely have died in obscurity if it weren’t for the support of an influential family, the Medicis, who offered Galileo a platform from which he could spread his theory. It was only through allying himself with political power that he was able to slowly generate interest in his heliocentric model of the solar system. His proposition eventually caught the attention of the Catholic church, which initially warned him to desist. Eventually, he was brought to trial in the Roman Inquisition 23 years after his discovery. At the age of 70, the intrepid mathematician and astronomer was allowed to return home if he agreed to recant his story. Instead Galileo chose to spend the rest of his years in prison because he believed that would be the only way to get people to open their eyes.

Did it work? It did not. Galileo died incarcerated while Europe continued to slumber under stars that moved around them. By today’s standards, Galileo would have been labeled a Conspiracy Theorist from the day he announced his findings until he was proven right fifty years after his death. When the Principle of Gravitational Attraction eventually became widely accepted as true, the church had to retract its position, because under Newton’s laws the motions of the stars and planets could no longer be explained any other way.

On the other hand, Galileo is credited with being the father of not only observational astronomy, but of the scientific method as well. The scientific method demands that one tests an explanation without bias towards an outcome. All data is considered before deductions are made. When all other explanations have been proven wrong, the only explanation remaining becomes a theory. The theory persists as long as all subsequent experiments continue to uphold it. This is how we ultimately know what we know and have an inkling of what we don’t. If I had to choose a posthumous title for myself, “The Father of the Scientific Method” is one I could die with. Galileo is credited with this honorific not only because he valued it more than his freedom, but because he had the discipline to regard evidence objectively despite how unimaginable the implications were. This is how a body of knowledge expands. By considering the validity of the evidence first, we then can accept what was previously unimaginable, otherwise what we know tomorrow will be no different than what we know today.

Not all conspiracy theorists are Galileos, nor are all conspiracy theories true. However, can we be certain that all of them are false? At their very core, all conspiracy theories directly or indirectly point at a central authority acting covertly and simultaneously at the media for either missing it or looking the other way. This, of course, is unimaginable, as we all know the government can make mistakes but would never do anything intentionally harmful to its citizens and then hide it. Even if they did, somebody would come forward and the media would let us know about it. This is why such a deception could never occur. The idea that your lover could be in bed with your best friend is inconceivable. Evidence of such a thing would not be credible. Dismissing all conspiracy theories seems logical and therefore seems like the smartest thing to do.

In “Sapiens”, Yuval Harari proposes an explanation for why our species, Sapiens, outfought, outthought and outsurvived all other Homo species on the planet. He suggests that it was our unique ability to describe and communicate situations and events that had no basis in reality which set us apart. In other words, we could tell stories and they could not. By uniting under a common idea, story or even myth, thousands (and now thousands of millions) of Sapiens could come together with a shared purpose, identity or belief system to displace our cousins, who were as individuals more sturdy and just as cunning but not nearly as good at cooperating as we were. This advantage, Harari proposes, has not only led our species to eventual supremacy over all others, but has also allowed us to form communities, governments and global alliances.

Siding with the majority has served us well–until it hasn’t. One only needs to revisit the history of Galileo and basic astronomy to understand this. In actuality, the first observant minds woke up to the fact that the Earth went around the Sun, and not the other way round, nineteen centuries before Galileo did. The Greek mathematician Aristarchus is thought to be the first Western person to place the Sun in the middle of a “solar system,” in 270 BC. A human being traveled to the moon just 360 years after Galileo “discovered” what Aristarchus had shown nearly two millennia before. How many centuries was this journey delayed because an alternative explanation in ancient Greece became a “conspiracy theory” against authority and convention?

This poses an intriguing question. Is there something hardwired in our behavioral patterns that pushes us towards conformist narratives and away from alternative ones at a precognitive level? Does the same tendency that gave rise to our enhanced ability to unite also keep us in “group-think” more than we should be? How do we know we are looking at the world objectively and rejecting alternative belief systems on a purely rational basis? How does one know whether one is biased or not?

One way is to apply the scientific method. The scientific method demands that every possibility, no matter how outlandish, is tested for its veracity and dismissed only when it can be proven wrong. Without this objective pursuit of truth, misconceptions can persist indefinitely, just as the geocentric model of the universe did. Interestingly, Aristarchus was allowed to retain his theory because he lived at a time and place where philosophers, mathematicians and scientists were revered, protected and free to pursue their notions. The freedom ancient Greek society afforded its scientists only endured for a few centuries after Aristarchus lived. In Galileo’s day, the Roman Catholic church had been presiding over such things as facts for well over a thousand years. His incontrovertible proof was suppressed by the power that had the most to lose.

These days, establishing the facts of the matter may not be as easy as we presume. Conspiracy theorists claim to have proof just like the debunkers do. How do we know that the proof offered on either side is valid? Who has the time to apply the scientific method? It certainly seems safer to go with the conventional narrative because surely there are more rational minds in a larger group. Though it seems a reasonable approach, it may in fact be where we misstep. By deferring to others, we assume the majority will arrive at the truth eventually. The problem is that those in the majority who are trained to examine evidence objectively often must take a potentially career-ending risk to even investigate an alternative explanation. Why would an organization be willing to invest the resources to redirect their scientific staff to chase down and evaluate evidence that will likely endanger their reputation with the public without any upside? Thus, conventional narratives survive for another day, or in the case of an Earth-centered universe, for a couple of thousand years.

Whether or not you are a “conspiracy theorist,” we can all agree that there is a possibility, however slight, that some conventional narratives could be wrong. How would we know? Is there a source that we can trust 100%? Must we rely on our own wits? A short inquiry into this question can be disquieting. Most of us must admit that our understanding of history, science and geopolitics is merely a collection of stories that we have been told by people, institutions or media that we trust explicitly or implicitly. Because most of us are not authorities on anything, it would be impossible to overturn any conventional narrative with an evidentiary argument. Challenging these paradigms is necessarily left to others. Generally speaking, there is no real reason to argue with convention if everything is seemingly unfolding acceptably. But what if you wanted to know for yourself? Is there any way to ever really know the truth without having to have faith in someone or something else?

There may not be. However, it is also naive to believe that if someone, scientist or not, were in possession of evidence that challenged our most deeply held beliefs, it would take root in the ethos on its own. Galileo enjoyed unsurpassed credibility as one of Italy’s foremost mathematicians. He also possessed irrefutable, verifiable and reproducible evidence for his revolutionary theory, yet the convention he was challenging did not crumble through his discoveries. History has shown us that it makes no difference how valid a point is; truth emerges only when someone is listening.

So, rather than seeking to independently validate or refute what we are being told, it becomes more productive to ask a different question: How biased is our society by historical standards? How does our society regard alternative theories? Do we let them co-exist with convention as the ancient Greeks did? Do we collectively invest resources to investigate them openly? Or do we dismiss, attack and vilify them as was done in the papal states in Galileo’s time? Which kind of society is more likely to get it right? Which runs the greater risk of being hoodwinked in the long run? Which is more free?




US House of Representatives Investigating if the Government Created Lyme Disease As A Bioweapon

In Brief

  • The Facts:

    A New Jersey lawmaker suggests the government turned ticks and insects into bioweapons to spread disease, and possibly released them. He is not the only one who believes so.

  • Reflect On:

    This is not the only example of supposed human experimentation on mass populations by the government.

There are a number of subjects that were once considered ‘conspiracy theories,’ which are now no longer in that realm. ‘Conspiracy theories’ usually, in my opinion, arise from credible evidence. The implications, however, are so grand and so mind-altering that many may experience some sort of cognitive dissonance as a result. One of the topics often deemed a ‘conspiracy theory’ is weaponized diseases, and the latest example comes from an approved amendment that was proposed by a Republican congressman from New Jersey. His name is Chris Smith, and he instructed the Department of Defense’s Inspector General to conduct a review on whether or not the US “experimented with ticks and insects regarding use as a biological weapon between the years of 1950 and 1975” and “whether any ticks or insects used in such experiment were released outside of any laboratory by accident or experiment design.”

The fact that Smith brought this up shows that any intelligent person who actually looks into this has reason to believe it’s a possibility, yet mainstream media outlets are ridiculing the idea, calling it a conspiracy instead of actually bringing up the points that caused Smith to demand the review.

The fact that the amendment was approved by a vote in the House speaks volumes. Smith said that the amendment was inspired by “a number of books and articles suggesting that significant research had been done at US government facilities including Fort Detrick, Maryland, and Plum Island, New York, to turn ticks and insects into bioweapons”.

Most people don’t know that the US government has experimented on its own citizens a number of times. All of this is justified for “national security” purposes. National security has always been a term used as an excuse to prolong secrecy, justify the government’s lack of transparency, and create black budget programs that have absolutely no oversight from Congress.

For example, on September 20, 1950, a US Navy ship just off the coast of San Francisco used a giant hose to spray a cloud of microbes into the air and into the city’s famous fog. The military was apparently testing how a biological weapon attack would affect the 800,000 residents of the city. The people of San Francisco had absolutely no idea. The Navy continued the tests for seven days, and multiple people died as a result. It was apparently one of the first large-scale biological weapon trials that would be conducted under a “germ warfare testing program” that went on for 20 years from 1949 to 1969. The goal “was to deter [the use of biological weapons] against the United States and its allies and to retaliate if deterrence failed,” the government later explained. Then again, that’s if you trust the explanation coming from the government.

This could fall under the category of human subject research. It’s still happening! A dozen classified programs that involved research on human subjects were underway last year at the Department of Energy. Human subject research refers broadly to the collection of scientific data from human subjects. This could involve performing physical procedures on the subjects or simply conducting interviews and having other forms of interaction with them. It could even involve procedures performed on entire populations, apparently without their consent.


Human subjects research erupted into national controversy 25 years ago with reporting by Eileen Welsome of the Albuquerque Tribune on human radiation experiments that had been conducted by the Atomic Energy Commission, many of which were performed without the consent of the subjects. A presidential advisory committee was convened to document the record and to recommend appropriate policy responses.

When it comes to Lyme disease, the Guardian points out that:

A new book published in May by a Stanford University science writer and former Lyme sufferer, Kris Newby, has raised questions about the origins of the disease, which affects 400,000 Americans each year.

Bitten: The Secret History of Lyme Disease and Biological Weapons cites the Swiss-born discoverer of the Lyme pathogen, Willy Burgdorfer, as saying that the Lyme epidemic was a military experiment that had gone wrong.

Burgdorfer, who died in 2014, worked as a bioweapons researcher for the US military and said he was tasked with breeding fleas, ticks, mosquitoes and other blood-sucking insects, and infecting them with pathogens that cause human diseases.

According to the book, there were programs to drop “weaponised” ticks and other bugs from the air, and that uninfected bugs were released in residential areas in the US to trace how they spread. It suggests that such a scheme could have gone awry and led to the eruption of Lyme disease in the US in the 1960s.

This is concerning. It’s a story that, for some reason, instantly reminded me of the MKUltra program, in which human subjects were used for mind control research.

If things like this occurred in the past, it’s hard to understand why someone would deem the possibility of this happening again a ‘conspiracy theory.’ What makes one think this wouldn’t be happening again, especially given the fact that there is sufficient evidence suggesting it is?

Lyme disease is also very strange. If you did get it, you probably wouldn’t know immediately – unless you’re one of the chronic sufferers who have had to visit over 30 doctors to get a proper diagnosis. Lyme disease tests are highly inaccurate, often inconclusive or returning false negatives.

Why? Because this clever bacteria has found a way to dumb down the immune system and white blood cells so that it’s not detectable until treatment is initiated. To diagnose Lyme disease properly you must see a “Lyme Literate MD (LLMD).” However, more and more doctors are turning their backs on patients due to sheer fear of losing their practices! Insurance companies and the CDC will do whatever it takes to stop Chronic Lyme Disease from being diagnosed, treated, or widely recognized as an increasingly common issue.

You can read more about that here.

The Takeaway

It’s becoming more apparent that our government and our federal health regulatory agencies are extremely corrupt. There are a number of examples throughout history proving this. The fact that something like this doesn’t seem believable to the public is ridiculous, and it further enhances and prolongs the ability of the powerful elite and the government to continue conducting these activities. Awareness is key.




The Medical Journals’ Sell-Out—Getting Paid to Play

[Note: This is Part IX in a series of articles adapted from the second Children’s Health Defense eBook: Conflicts of Interest Undermine Children’s Health. The first eBook, The Sickest Generation: The Facts Behind the Children’s Health Crisis and Why It Needs to End, described how children’s health began to worsen dramatically in the late 1980s following fateful changes in the childhood vaccine schedule.]

The vaccine industry and its government and scientific partners routinely block meaningful science and fabricate misleading studies about vaccines. They could not do so, however, without having enticed medical journals into a mutually beneficial bargain. Pharmaceutical companies supply journals with needed income, and in return, journals play a key role in suppressing studies that raise critical questions about vaccine risks—which would endanger profits.


An exclusive and dependent relationship

Advertising is one of the most obviously beneficial ways that medical journals’ “exclusive and dependent relationship” with the pharmaceutical industry plays out. According to a 2006 analysis in PLOS Medicine, drugs and medical devices are the only products for which medical journals accept advertisements. Studies show that journal advertising generates “the highest return on investment of all promotional strategies employed by pharmaceutical companies.” The pharmaceutical industry puts a particularly “high value on advertising its products in print journals” because journals reach doctors—the “gatekeeper between drug companies and patients.” Almost nine in ten drug advertising dollars are directed at physicians.

In the U.S. in 2012, drug companies spent $24 billion marketing to physicians, with only $3 billion spent on direct-to-consumer advertising. By 2015, however, consumer-targeted advertising had jumped to $5.2 billion, a 60% increase that has reaped bountiful rewards. In 2015, Pfizer’s Prevnar-13 vaccine was the nation’s eighth most heavily advertised drug; after the launch of the intensive advertising campaign, Prevnar “awareness” increased by over 1,500% in eight months, and “44% of targeted consumers were talking to their physicians about getting vaccinated specifically with Prevnar.” Slick ad campaigns have also helped boost uptake of “unpopular” vaccines like Gardasil.

Advertising is such an established part of journals’ modus operandi that high-end journals such as The New England Journal of Medicine (NEJM) boldly invite medical marketers to “make NEJM the cornerstone of their advertising programs,” promising “no greater assurance that your ad will be seen, read, and acted upon.” In addition, medical journals benefit from pharmaceutical companies’ bulk purchases of thousands of journal reprints and industry’s sponsorship of journal subscriptions and journal supplements.


In 2003, an editor at The BMJ wrote about the numerous ways in which drug company advertising can bias medical journals (and the practice of medicine)—all of which still hold true today. For example:

  • Advertising monies enable prestigious journals to get thousands of copies into doctors’ hands for free, which “almost certainly” goes on to affect prescribing.
  • Journals are willing to accept even the most highly misleading advertisements. The FDA has flagged numerous instances of advertising violations, including ads that overstated a drug’s effectiveness or minimized its risks.
  • Journals will guarantee favorable editorial mentions of a product in order to earn a company’s advertising dollars.
  • Journals can earn substantial fees for publishing supplements even when they are written by “paid industry hacks”—and the more favorable the supplement content is to the company that is funding it, the bigger the profit for the journal.

Discussing clinical trials, the BMJ editor added: “Major trials are very good for journals in that doctors around the world want to see them and so are more likely to subscribe to journals that publish them. Such trials also create lots of publicity, and journals like publicity. Finally, companies purchase large numbers of reprints of these trials…and the profit margin to the publisher is huge. These reprints are then used to market the drugs to doctors, and the journal’s name on the reprint is a vital part of that sell.”


Industry-funded bias

According to the Journal of the American Medical Association (JAMA), nearly three-fourths of all funding for clinical trials in the U.S.—presumably including vaccine trials—came from corporate sponsors as of the early 2000s. The pharmaceutical industry’s funding of studies (and investigators) is a factor that helps determine which studies get published, and where. As a Johns Hopkins University researcher has acknowledged, funding can lead to bias—and while the potential exists for governmental or departmental funding to produce bias, “the worst source of bias is industry-funded.”

In 2009, researchers published a systematic review of several hundred influenza vaccine trials. Noting “growing doubts about the validity of the scientific evidence underpinning [influenza vaccine] policy recommendations,” the authors showed that the vaccine-favorable studies were “of significantly lower methodological quality”; however, even these poor-quality studies—when funded by the pharmaceutical industry—got far more attention than equivalent studies not funded by industry. The authors commented:

[Studies] sponsored by industry had greater visibility as they were more likely to be published by high impact factor journals and were likely to be given higher prominence by the international scientific and lay media, despite their apparent equivalent methodological quality and size compared with studies with other funders.

In their discussion, the authors also described how the industry’s vast resources enable lavish and strategic dissemination of favorable results. For example, companies often distribute “expensively bound” abstracts and reprints (translated into various languages) to “decision makers, their advisors, and local researchers,” while also systematically plugging their studies at symposia and conferences.

The World Health Organization’s standards describe reporting of clinical trial results as a “scientific, ethical, and moral responsibility.” However, it appears that as many as half of all clinical trial results go unreported—particularly when their results are negative. A European official involved in drug assessment has described the problem as “widespread,” citing as an example GSK’s suppression of results from four clinical trials for an anti-anxiety drug when those results showed a possible increased risk of suicide in children and adolescents. Experts warn that “unreported studies leave an incomplete and potentially misleading picture of the risks and benefits of treatments.”


Debased and biased results

The “significant association between funding sources and pro-industry conclusions” can play out in many different ways, notably through methodological bias and debasement of study designs and analytic strategies. Bias may be present in the form of inadequate sample sizes, short follow-up periods, inappropriate placebos or comparisons, use of improper surrogate endpoints, unsuitable statistical analyses or “misleading presentation of data.”

Occasionally, high-level journal insiders blow the whistle on the corruption of published science. In a widely circulated quote, Dr. Marcia Angell, former editor-in-chief of NEJM, acknowledged that “It is simply no longer possible to believe much of the clinical research that is published, or to rely on the judgment of trusted physicians or authoritative medical guidelines.” Dr. Angell added that she “[took] no pleasure in this conclusion, which [she] reached slowly and reluctantly” over two decades at the prestigious journal.

Many vaccine studies flagrantly illustrate biases and selective reporting that produce skewed write-ups that are more marketing than science. In formulaic articles that medical journals are only too happy to publish, the conclusion is almost always the same, no matter the vaccine: “We did not identify any new or unexpected safety concerns.” As an example of the use of inappropriate statistical techniques to exaggerate vaccine benefits, an influenza vaccine study reported a “69% efficacy rate” even though the vaccine failed “nearly all who [took] it.” As explained by Dr. David Brownstein, the study’s authors used a technique called relative risk analysis to derive their 69% statistic because it can make “a poorly performing drug or therapy look better than it actually is.” However, the absolute risk difference between the vaccine and the placebo group was 2.27%, meaning that the vaccine “was nearly 98% ineffective in preventing the flu.”
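To make the arithmetic behind that example concrete, here is a minimal sketch of how a relative risk reduction and an absolute risk difference can be computed from the same trial data. The attack rates below are hypothetical, chosen only so that the two statistics roughly reproduce the figures quoted above; they are not taken from the study itself.

```python
# Hypothetical attack rates (assumed for illustration, not from the study):
# roughly 3.3% of the placebo group and about 1.0% of the vaccinated group got the flu.
placebo_rate = 0.0330
vaccine_rate = 0.0103

# Absolute risk difference: how much the flu rate actually dropped per person vaccinated.
absolute_risk_difference = placebo_rate - vaccine_rate              # ~0.0227, i.e. ~2.27 points

# Relative risk reduction: the same gap expressed as a share of the placebo rate.
relative_risk_reduction = absolute_risk_difference / placebo_rate   # ~0.69, i.e. a "69% efficacy" headline

print(f"Absolute risk difference: {absolute_risk_difference:.2%}")  # ~2.27%
print(f"Relative risk reduction:  {relative_risk_reduction:.0%}")   # ~69%
```

Both numbers describe the same data: a large-sounding relative figure can coexist with a small absolute difference, which is why the choice of which statistic to report is a presentation decision rather than a neutral one.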


Trusted evidence?

In 2018, the Cochrane Collaboration—which bills its systematic reviews as the international gold standard for high-quality, “trusted” evidence—furnished conclusions about the human papillomavirus (HPV) vaccine that clearly signaled industry bias. In May of that year, Cochrane’s highly favorable review improbably declared the vaccine to have no increased risk of serious adverse effects and judged deaths observed in HPV studies “not to be related to the vaccine.” Cochrane claims to be free of conflicts of interest, but its roster of funders includes national governmental bodies and international organizations pushing for HPV vaccine mandates as well as the Bill & Melinda Gates Foundation and the Robert Wood Johnson Foundation—both of which are staunch funders and supporters of HPV vaccination. The Robert Wood Johnson Foundation’s president is a former top CDC official who served as acting CDC director during the H1N1 “false pandemic” in 2009 that ensured millions in windfall profits for vaccine manufacturers.

Two months after publication of Cochrane’s HPV review, researchers affiliated with the Nordic Cochrane Centre (one of Cochrane’s member centers) published an exhaustive critique, declaring that the reviewers had done an incomplete job and had “ignored important evidence of bias.” The critics itemized numerous methodological and ethical missteps on the part of the Cochrane reviewers, including failure to count nearly half of the eligible HPV vaccine trials, incomplete assessment of serious and systemic adverse events and failure to note that many of the reviewed studies were industry-funded. They also upbraided the Cochrane reviewers for not paying attention to key design flaws in the original clinical trials, including the failure to use true placebos and the use of surrogate outcomes for cervical cancer.

In response to the criticisms, the editor-in-chief of the Cochrane Library initially stated that a team of editors would investigate the claims “as a matter of urgency.” Instead, however, Cochrane’s Governing Board quickly expelled one of the critique’s authors, Danish physician-researcher Peter Gøtzsche, who helped found Cochrane and was the head of the Nordic Cochrane Centre. Gøtzsche has been a vocal critic of Cochrane’s “increasingly commercial business model,” which he suggests is resulting in “stronger and stronger resistance to say anything that could bother pharmaceutical industry interests.” Adding insult to injury, Gøtzsche’s direct employer, the Rigshospitalet hospital in Denmark, then fired Gøtzsche. In response, Dr. Gøtzsche stated, “Firing me sends the unfortunate signal that if your research results are inconvenient and cause public turmoil, or threaten the pharmaceutical industry’s earnings, …you will be sacked.” In March 2019, Gøtzsche launched an independent Institute for Scientific Freedom.

In 2019, the editor-in-chief and research editor of BMJ Evidence Based Medicine—the journal that published the critique of Cochrane’s biased review—jointly defended the critique as having “provoke[d] healthy debate and pose[d] important questions,” affirming the value of publishing articles that “hold organisations to account.” They added that “Academic freedom means communicating ideas, facts and criticism without being censored, targeted or reprimanded” and urged publishers not to “shrink from offering criticisms that may be considered inconvenient.”


The censorship tsunami

Another favored tactic is to keep vaccine-critical studies out of medical journals altogether, either by refusing to publish them (even if peer reviewers recommend their publication) or by concocting excuses to pull articles after publication. In recent years, a number of journals have invented bogus excuses to withdraw or retract articles critical of risky vaccine ingredients, even when written by top international scientists. To cite just three examples:

  • The journal Vaccine withdrew a study that questioned the safety of the aluminum adjuvant used in Gardasil.
  • The journal Science and Engineering Ethics retracted an article that made a case for greater transparency regarding the link between mercury and autism.
  • Pharmacological Research withdrew a published veterinary article that implicated aluminum-containing vaccines in a mystery illness decimating sheep, citing “concerns” from an anonymous reader.

Elsevier, which publishes two of these journals, has a track record of setting up fake journals to market Merck’s drugs, and Springer, which publishes the third journal as well as influential publications like Nature and Scientific American, has been only too willing to accommodate censorship requests. However, even these forms of censorship may soon seem quaint in comparison to the censorship of vaccine-critical information now being implemented across social media and other platforms. This concerted campaign to prevent dissemination of vaccine content that does not toe the party line will make it harder than ever for American families to do their due diligence with regard to vaccine risks and benefits.


Sign up for free news and updates from Robert F. Kennedy, Jr. and the Children’s Health Defense. CHD is planning many strategies, including legal, in an effort to defend the health of our children and obtain justice for those already injured. Your support is essential to CHD’s successful mission.


