
Alternative News

Why We Were Born To Do More Than Just Fit Into The System, Go To School, Work, Pay Bills & Die


When I came across this discussion by Josh Jones, a writer and musician based in Durham, NC, on filmsforaction.org, I couldn’t help but wonder just how many people out there feel the same way about “work” and what we do in exchange for food on the table and a roof over our heads, among other things.


From the day we are born, we are put into school for a couple of decades and told, not taught, how the world works, what path to take, why to follow it, and how to fit in and become a “productive” member of society. This basically means we have to spend a large majority of our lives striving for a degree or a diploma in order to qualify to work long hours and thereby earn the right to live. There are other routes as well, some far more appealing, but they also require us to put in our time.

This sentiment reminds me of a video published by The School of Life (click link to see), which brings to light the fact that no matter how little sleep we get or what problems we are having at home, mental blockages and other things that can arise during the human experience, we are and always have been told that we must be at work on time, ready to go without excuses.

This doesn’t seem normal, or anywhere near natural, yet it’s something we are forced into.

Mental illness is on the rise. Take depression, for example, an issue that now affects more than 15 million adults in America alone. Could the current human experience be one that’s contributing to this rise? Are there more miserable people now because we basically spend our lives doing what we can to survive while ignoring what our hearts want? Are we not giving enough time to our wants and desires beyond the material world, and do we even have time to do so?

Josh sums it up quite well in his first paragraph:


“Why must we all work long hours to earn the right to live? Why must only the wealthy have access to leisure, aesthetic pleasure, self-actualization…? Everyone seems to have an answer, according to their political or theological bent. One economic bogeyman, so-called ‘trickle-down’ economics, or ‘Reaganomics,’ actually pre-dates our 40th president by a few hundred years at least. The notion that we must better ourselves – or simply survive – by toiling to increase the wealth and property of already wealthy men was perhaps first comprehensively articulated in the 18th century doctrine of ‘improvement.’ In order to justify privatizing common land and forcing the peasantry into jobbing for them.”

My favourite part of that excerpt is that he calls attention to the fact that all of us are simply working for a small group of elite people who, through the corporations they run, control almost all aspects of our lives. Their idea of “globalisation,” or a “New World Order,” is one that requires our participation and our consent. This type of system, one in which nearly all of us are economic slaves, is one we’ve become accustomed to.

A great quote comes to mind here:

“Humans are so strange. We can climb mountains, explore the deepest oceans and travel to space. But for some reason we can’t move past this idea that we need political overlords who tell us what we can and can’t do with our own lives.” –Unknown

While we blindly continue to follow others, the world has been experiencing something it never really has before: a massive paradigm shift, a shift in the way we view, feel, and perceive our world and the current human experience. Not everybody is happy, and how could they be? Living on a planet where you die if you cannot pay for your life, we watch our passions and our hearts’ desires slowly drift out of sight, unless we do something about it.

While we’ve remained complacent and simply accepted the human experience for what it is, those who created our current economic model continue to destroy our planet, with no regard for preserving its integrity or the life on it. At the same time, large amounts of information are kept from us; all we know of our world is what’s given to us by the same people who designed this life for us: the corporate mainstream media.

Information alone is a threat to so many corporate interests.

This shift has come as a result of new information that’s now hitting the eyes and minds of millions, if not billions. This became evident when alternative media sites that cover global corporate corruption, as well as new discoveries in various fields that are ignored by the mainstream, like new energy, started to receive up to a billion views per year. Furthermore, whistleblowers like Edward Snowden and organizations like Wikileaks have also helped out hugely.

That all stopped when some of these sites, like CE, were labelled “fake news.” An ironic title coming from the mainstream media, isn’t it? They even appointed those they deemed fit to determine what’s real and what isn’t, and started a massive campaign to censor information that does not come from mainstream news networks.

There is a lot more to the world than what we are presented with. Busy with our 9-to-5s and trying to survive, many people still can’t be bothered to look. When presented with information that’s outside the box, it’s common for cognitive dissonance to set in.

What’s most frustrating about the current human experience is that it doesn’t have to be this way. This is where Buckminster Fuller comes in. Fuller, one of the most creative and interesting minds in modern history, once said that “one in ten thousand of us can make a technological breakthrough capable of supporting all the rest. The youth of today are absolutely right in recognizing this nonsense of earning a living.”

This is something we at CE are well aware of. We’ve personally come across technologies that could revolutionize the planet. Although much depends on the consciousness operating behind a technology, it exists. Our entire planet could, in a modern way, go completely off the grid. There are so many wonderful creations and ideas out there that make a utopian society possible that most people have a hard time believing it. The idea that we don’t really have to work to live a good life on this planet is still impossible for most to imagine, because we’ve been indoctrinated to believe that the current world economic model and globalization are the only way for humanity to move forward, when they’re doing the exact opposite.

In my opinion, food, clothing, shelter and more should not require little pieces of paper, along with bits of our soul, to receive. A human experience that utilizes all of our developments instead of concealing them, one in which our leaders look out for humanity and the best interests of our planet instead of following the orders of their financial masters, is desperately needed. Michael Jackson’s famous line, “they don’t really care about us,” rings true, but not for everyone.

Along with this consciousness shift, this realization that the wool has been pulled over our eyes, comes the fact that consciousness interacts with our physical, material world in ways that are not yet understood. That is an encouraging thought, given how much humanity’s thinking has shifted around concepts that would not have fit the frame a decade ago.

I won’t go into any specific examples. I’ll let you ponder how a utopian society would work, or how all of our needs could easily be provided for. Scarcity is something that doesn’t have to exist, and neither does supply and demand. These were all created by what’s known today as “the 1 percent.” The system was designed to benefit them, not us. Something new needs to be created: a new way of life that requires the complete shutdown and replacement of our current economic model. Just as Fuller said:

“You never change things by fighting the existing reality. To change something, build a new model that makes the existing model obsolete.” 

Fuller did not believe that we need wage-earning jobs to live, and he argued that when we do hold them, we cannot pursue our passions and interests unless they are for monetary gain. That’s an interesting thought, since when we grow up there are only so many “careers” to choose from. Is this simply the illusion of free will? Our paths are already set, the options are limited, and the entire purpose of being “educated” or, as I like to call it, indoctrinated, is to make money. Do we really love what we do? Or do we just tell ourselves that? Can we even determine or identify our passions, wants, and needs in this world? Or are all of our wants, needs, desires, and passions given to us by the corporate world in the form of mass media, advertising, and marketing? Why are so many of us acquiring the same material things, yet never questioning the human experience? Have we become too comfortable? Change is never easy, and it is always greeted by ridicule. This is exactly what the human race is going through: we are recognizing the need to change and are now on that path.

In a New York Times column on Bertrand Russell’s 1932 essay “In Praise of Idleness,” Gary Gutting writes, “For most of us, a paying job is still utterly essential — as masses of unemployed people know all too well. But in our economic system, most of us inevitably see our work as a means to something else: it makes a living, but it doesn’t make a life.” Russell, a prominent British philosopher, mathematician, historian, writer and political activist, agreed, stating that “immense harm is caused by the belief that work is virtuous.”

Jones puts it well:

“In far too many cases in fact, the work we must do to survive robs us of the ability to live by ruining our health, consuming all our precious time, and degrading our environment. In his essay, Russell argues that ‘there is far too much work done in the world, that immense harm is caused by the belief that work is virtuous, and that what needs to be preached in modern industrial countries is quite different from what has always been preached.’”

I agree. We do tend to glorify the idea of “hard work” as something to be proud of, without ever really taking a step back and looking at this human experience through an observer’s lens.

Russell referred to this type of existence as a “slave state” operated by “those who give orders.” He calls this politics, which demands, he writes, no real “knowledge of the subjects as to which advice is given,” only the ability to manipulate: “the art of persuasive speaking and writing.” This reminds me of the Sophists of ancient Greece, who used their intelligence and their way with words to make life difficult for people.

“What is work? Work is of two kinds: first, altering the position of matter at or near the earth’s surface relatively to other such matter; second, telling other people to do so. The first kind is unpleasant and ill paid; the second is pleasant and highly paid…in a world where no one is compelled to work more than four hours a day, every person possessed of scientific curiosity will be able to indulge it, and every painter will be able to paint without starving, however excellent his pictures may be. Young writers will not be obliged to draw attention to themselves by sensational pot-boilers, with a view to acquiring the economic independence for monumental works, for which, when the time at last comes, they will have lost the taste and capacity.” (source)

Such manipulation has been talked about for decades:

“The conscious and intelligent manipulation of the organized habits and opinions of the masses is an important element in democratic society. Those who manipulate this unseen mechanism of society constitute an invisible government which is the true ruling power of our country. We are governed, our minds are molded, our tastes formed, our ideas suggested, largely by men we have never heard of.” – Edward Bernays (“the father of public relations”), Propaganda, 1928

So, you see? Our lack of questioning, and our complacency, have led to an interesting experience, one in which many now desire change.

Can We Really Do What We Love In This Type of Human Experience?

So, is it possible to do what we love? Well, it might be a task even to figure that out, given that our choices and paths in life are handed to us. Furthermore, we have to pay rent, put food in our stomachs, and provide ourselves with the basic necessities; even individuals with full-time jobs are struggling to do this. These jobs take up 8-10 hours of our lives every single day, so if you believe you cannot pursue your passions, you are in the company of many, including the two brilliant minds quoted above.

Pursuing something you love, something you are extremely passionate about, won’t always bring monetary gain. But we still have time to pursue the things we love, as opposed to spending that time going to bars or partying with friends. No matter how many excuses you have, if you love something there is always time to pursue it, though try telling that to someone who just came home from a hard day’s work with no mental or physical energy left.

I am a big believer in the power of manifestation, meaning that one can manifest experiences into their life with a shift in consciousness. Sure, the current human experience is a very hard one. It’s not easy, and for a soul to thrive here means it is very strong, especially if it will not quit in its pursuit of the call of its heart. That being said, what happens if you let the fear go and just start doing what you love, as much as you can? What if you take that road and, doing so without worry, things work out for you? I believe that if we want something badly enough, through the power of consciousness, we can manifest our own human experience, especially if it is rooted in the desire to do good for all. Based on the science, history, philosophy and, most of all, my intuition, this is something I firmly believe.

I’ve been part of the CE team for several years now, and before that, it’s what I dreamed about. Being part of a team, having a platform to share information that we’ll never see in the mainstream media, and being in a position to bring new ideas and information to the world is all I wanted to do. I wanted it so badly that it’s what I did during school and while working another job. I was always engaged in my passions, yet always heartbroken that I could not go through life solely pursuing what my heart beats for. But look at me now – I’m doing it.

I had a tough experience waking up to facts I was once unaware of, and on top of that was the normal human experience that just wasn’t resonating with me. What helped me manifest my experience?

The first thing was changing my perspective of the human experience. Instead of seeing it as a slave-like system, and labelling it as that, I chose to view it as an experience. I believe that this short lifetime is not our only one, and that this is my opportunity to “play” within the human experience. I looked at it as a challenge, and an opportunity to overcome many obstacles.

This helped my outlook on life big time, and instead of taking on a victim role where I felt hopeless and unable to change anything, the very perception of me looking at life as an opportunity is what helped me.

Life is too short not to put effort into pursuing what your heart beats for. Yeah, it’s not easy, I know, and it’s not hard to see why so many people believe it’s downright impossible when we have so many other duties to tend to.

Personally, I never perceived it as impossible. I was willing to die, go homeless, whatever; there was no fear in me. We even have modern-day science confirming that factors associated with consciousness, like thoughts, feelings and emotions, can actually affect our physical, material world.

If you believe it’s possible, it is. If you don’t, it isn’t. The last thing I would say to you is that it’s not going to be easy, and will provide your life with a number of challenges/opportunities for growth. The joy lies within the journey itself, not within the ends.

Just Imagine

Just imagine if human beings created an experience where all of our needs were provided for. As mentioned above, we have more than enough potential to do so. What would we do with our time? It’s simple: we would explore, contemplate and discover. After all, that’s our natural state from birth, until we are told how the world works and what we are to do in it.

We’ve been brainwashed for so long and taken so far out of our natural state that it’s really time to create a human experience that resonates with all of us and with what we and all life are meant to do, and that’s thrive.

You Can Help Stop The 5G Infrastructure

We plan to investigate the telecom industry and its ties to politics, and to expose its efforts to push 5G while ignoring the dangers and skipping proper safety testing, but we can't do it without your support.

We've launched a funding campaign to fuel our efforts on this matter as we are confident we can make a difference and have a strong plan to get it done.

Check out our plan and join our campaign here.



The Anatomy of Conspiracy Theories


Whether you believe in conspiracy theories or not, we can all agree that the use of the term has exploded in media and in conversation. The question is, why? Are we now using the term “Conspiracy Theory” more indiscriminately and on more platforms than previously? Are we, as a society, simply becoming unhinged and absurd? Are seemingly nonsensical stories, for some unknown reason, starting to resonate with people? Or are some conventional narratives getting challenged because some of these “alternative” explanations are in fact accurate, despite the fact that conventional sources refuse to acknowledge them as even potentially valid? Notice that the last two possibilities are different sides of the same coin. If you think  “conspiracy theorists” are unhinged, it is highly likely that they are suspicious of your sanity as well. Both sides insist that they are right and that the other has been hoodwinked. Note that if you choose to not pick a side, you are, by default, allowing the conventional narrative to perpetuate. That is how convention works. 

Merriam-Webster defines the term conspiracy theory as “a theory that explains an event or situation as the result of a secret plan by usually powerful people or groups”. The key elements of this definition remain consistent across all authoritative lexicons: the group responsible for an event must be powerful and covert. However, if we refer to the Wikipedia definition as of 11/2018 a new element emerges: “A conspiracy theory is an explanation of an event or situation that invokes a conspiracy—generally one involving an illegal or harmful act supposedly carried out by government or other powerful actors—without credible evidence.”

When an explanation is labeled a “Conspiracy Theory,” by today’s definition, it has no evidence to support it. An explanation with no supporting evidence is a hypothesis, not a “theory.” “Conspiracy Theory,” as it is used today, is thus an oxymoron. These “Conspiracy Theories” we seem to hear about every day should really be called “Conspiracy Hypotheses.” More concerning is that the “Conspiracy Theory” label identifies an explanation as inherently baseless. Given this linguistic construct, where is there room for a conspiracy that is in fact true?

There is also something troubling about using the term “credible” in the definition of conspiracy theory. Legally, evidence that is credible is that which a reasonable person would consider to be true in light of the surrounding circumstances. If evidence suggests an explanation that seems at the surface to be unreasonable, how does a reasonable person avoid automatically labeling the evidence not credible? If we are not careful, the credibility of the explanation and resultant conclusions would then determine the credibility of the evidence that supports it. Is this really so important? Perhaps you are quick to see that with this approach, our understanding of what is true and real can never evolve. If any evidence arose that radically disproved our understanding or eroded our faith in trusted institutions we would automatically discard it as “not credible” and remain entrenched in our accepted paradigm. “Credible” evidence cannot be a necessary requirement of a theory that challenges what is credible to begin with.

To better illustrate this, let us consider an old but very real “conspiracy theory.” About 400 years ago, European civilization was emerging from centuries of scientific and philosophical stagnation known as the dark ages. What more befitting a place for such a renaissance to occur than the center of the universe? You see, the idea that the Earth was one of eight planets revolving around a star that is orbiting the center of one of hundreds of billions of galaxies would have been absurd in Europe in the sixteenth century. Any sane person could see that the Sun and the Moon and every celestial body rises in the East and sets in the West. At that time, if someone went about proposing the idea that everything rises and falls because the Earth was spinning, they would have been laughed out of the tavern. Would that person be a conspiracy theorist? They are not proposing that “powerful actors are carrying out a harmful act,” they are merely suggesting an alternative explanation for what is observed. However, the implication of their suggestion seems to incriminate the authority on such matters as ignorant of the truth or, possibly, the perpetrators of a lie. The possibility of a conspiracy has now been introduced.

Now, let us say that this person claims to have proof of their absurd theory. Would you have taken the time to examine the evidence or would you have been more likely to dismiss them without further consideration? The very idea that they could be right would have been not just silly or heretical, but inconceivable to many, if not all. How could the evidence be credible if it implied something inconceivable? Dismissing their idea would have seemingly been the most logical and, therefore, the smartest thing to do.


When Galileo Galilei appeared in 1610 armed with a rudimentary “telescope,” few would peer into it. He claimed that the refractive properties of the pair of “lenses” would allow you to see things at great distances very clearly. With it one could see Jupiter and its moons revolving around the giant planet just as our moon revolves around Earth. How enchanting! The difficulty would arise when you put the telescope down: your feet would no longer be planted on the previously immovable center of creation. Would you have looked into his telescope? What would have been the harm in taking a peek? Certainly the fear of being proven more gullible than most would have been on your mind. What about the fear that he might be right?

Imagine what must have been going through Galileo’s mind after his monumental discovery. He saw irrefutably that the entire model of the universe had been completely misconceived. One just has to look. Most did not. I can only imagine how hard he must have tried to convince anyone to simply stop, look and listen to what he had discovered. At the time, Galileo was the Chair of Mathematics at the University of Padua and had previously held the same post at the University of Pisa. Despite his bona fides and reputation as a solid contributor to the Italian renaissance, his discovery would likely have died in obscurity were it not for the support of an influential family, the Medicis, who offered Galileo a platform from which he could spread his theory. It was only through allying himself with political power that he was able to slowly generate interest in his heliocentric model of the solar system. His proposition eventually caught the attention of the Catholic church, which initially warned him to desist. Eventually, he was brought to trial in the Roman Inquisition 23 years after his discovery. At the age of 70, the intrepid mathematician and astronomer was forced to recant his story, and he spent the rest of his years under house arrest.

Did it work? It did not. Galileo died under house arrest while Europe continued to slumber under stars that moved around it. By today’s standards, Galileo would have been labeled a Conspiracy Theorist from the day he announced his findings until he was proven right some fifty years after his death, when the principle of gravitational attraction became widely accepted as true and the church had to retract its position, because under Newton’s laws the motions of the stars and planets could not be explained any other way.

On the other hand, Galileo is credited with being the father not only of observational astronomy, but of the scientific method as well. The scientific method demands that one test an explanation without bias towards an outcome. All data is considered before deductions are made. When all other explanations have been proven wrong, the only explanation remaining becomes a theory. The theory persists as long as all subsequent experiments continue to uphold it. This is how we ultimately know what we know and have an inkling of what we don’t. If I had to choose a posthumous title for myself, “The Father of the Scientific Method” is one I could die with. Galileo earned this honorific because he had the discipline to regard evidence objectively despite how unimaginable the implications were. This is how a body of knowledge expands: by considering the validity of the evidence first, we can accept what was previously unimaginable; otherwise what we know tomorrow will be no different from what we know today.

Not all conspiracy theorists are Galileos, and not all conspiracy theories are true. However, can we be certain that all of them are false? At their very core, all conspiracy theories directly or indirectly point at a central authority acting covertly, and simultaneously at the media for either missing it or looking the other way. This, of course, is unimaginable, as we all know the government can make mistakes but would never do anything intentionally harmful to its citizens and then hide it. Even if it did, somebody would come forward and the media would let us know about it. This is why such a deception could never occur. The idea that your lover could be in bed with your best friend is inconceivable. Evidence of such a thing would not be credible. Dismissing all conspiracy theories seems logical and therefore seems like the smartest thing to do.

In “Sapiens,” Yuval Noah Harari proposes an explanation for why our species, Sapiens, outfought, outthought and outsurvived every other Homo species on the planet. He suggests that it was our unique ability to describe and communicate situations and events that had no basis in reality which set us apart. In other words, we could tell stories and they could not. By uniting under a common idea, story or even myth, thousands (and now thousands of millions) of Sapiens could come together with a shared purpose, identity or belief system to displace our cousins, who as individuals were sturdier and just as cunning, but not nearly as good at cooperating as we were. This advantage, Harari proposes, has not only led our species to eventual supremacy over all others, but has also allowed us to form communities, governments and global alliances.

Siding with the majority has served us well–until it hasn’t. One only needs to revisit the history of Galileo and basic astronomy to understand this. In actuality, observant minds woke up to the fact that the Earth went around the Sun, and not the other way round, nineteen centuries before Galileo did. The Greek mathematician Aristarchus is thought to be the first Western person to place the Sun in the middle of a “solar system,” around 270 BC. A human being traveled to the moon just 360 years after Galileo “discovered” what Aristarchus had shown nearly two millennia before. How many centuries was this journey delayed because an alternative explanation in ancient Greece became a “conspiracy theory” against authority and convention?

This poses an intriguing question. Is there something hardwired in our behavioral patterns that pushes us towards conformist narratives and away from alternative ones at a precognitive level? Is it this tendency, the same one that gave rise to our enhanced ability to unite, that keeps us in “group-think” more than we should be? How do we know we are looking at the world objectively and rejecting alternative belief systems on a purely rational basis? How does one know whether one is biased or not?

One way is to apply the scientific method. The scientific method demands that every possibility, no matter how outlandish, is tested for its veracity and dismissed only when it can be proven wrong. Without this objective pursuit of truth, misconceptions can persist indefinitely, just as the geocentric model of the universe did. Interestingly, Aristarchus was allowed to retain his theory because he lived at a time and place where philosophers, mathematicians and scientists were revered, protected and free to pursue their notions. The freedom ancient Greek society afforded its scientists endured for only a few centuries after Aristarchus lived. In Galileo’s day, the Roman Catholic church had been presiding over such things as facts for well over a thousand years. His incontrovertible proof was suppressed by the power that had the most to lose.

These days, establishing the facts of the matter may not be as easy as we presume. Conspiracy theorists claim to have proof just like the debunkers do. How do we know that the proof offered on either side is valid? Who has the time to apply the scientific method? It certainly seems safer to go with the conventional narrative because surely there are more rational minds in a larger group. Though it seems a reasonable approach, it may be in fact where we misstep. By deferring to others, we assume the majority will arrive at the truth eventually. The problem is that those in the majority who are trained to examine evidence objectively often must take a potentially career-ending risk to even investigate an alternative explanation. Why would an organization be willing to invest the resources to redirect their scientific staff to chase down and evaluate evidence that will likely endanger their reputation with the public without any upside? Thus, conventional narratives survive for another day, or in the case of an Earth-centered universe, for a couple of thousand years.

Whether or not you are a “conspiracy theorist,” we can all agree that there is a possibility, however slight, that some conventional narratives could be wrong. How would we know? Is there a source that we can trust 100%? Must we rely on our own wits? A short inquiry into this question can be disquieting. Most of us must admit that our understanding of history, science and geopolitics is merely a set of stories we have been told by people, institutions or media that we trust explicitly or implicitly. Because most of us are not authorities on anything, it would be impossible for us to overturn any conventional narrative with an evidentiary argument. Challenging these paradigms is necessarily left to others. Generally speaking, there is no real reason to argue with convention if everything is seemingly unfolding acceptably. But what if you wanted to know for yourself? Is there any way to ever really know the truth without having to have faith in someone or something else?

There may not be. However, it is also naive to believe that if someone, scientist or not, were in possession of evidence that challenged our most deeply held beliefs, it would take root in the ethos on its own. Galileo enjoyed unsurpassed credibility as one of Italy’s foremost mathematicians. He also possessed irrefutable, verifiable, and reproducible evidence for his revolutionary theory, yet the convention he was challenging did not crumble because of his discoveries. History has shown us that it makes no difference how valid a point is; truth emerges only when someone is listening.

So, rather than seeking to independently validate or refute what we are being told, it becomes more productive to ask a different question: How biased is our society by historical standards? How does our society regard alternative theories? Do we let them co-exist with convention as the ancient Greeks did? Do we collectively invest resources to investigate them openly? Or do we dismiss, attack and vilify them as was done in the papal states in Galileo’s time? Which kind of society is more likely to get it right? Which runs the greater risk of being hoodwinked in the long run? Which is more free?

You Can Help Stop The 5G Infrastructure

We plan to investigate the telecom industry, its ties to politics, and its efforts to push 5G without proper safety testing while ignoring the dangers, but we can't do it without your support.

We've launched a funding campaign to fuel our efforts on this matter as we are confident we can make a difference and have a strong plan to get it done.

Check out our plan and join our campaign here.

Alternative News

US House of Representatives Investigating if the Government Created Lyme Disease As A Bioweapon

In Brief

  • The Facts:

    A New Jersey lawmaker suggests the government turned ticks and insects into bioweapons to spread disease, and possibly released them. He is not the only one who believes so.

  • Reflect On:

This is not the only example of supposed human experimentation on mass populations by the government.

There are a number of subjects that were once considered ‘conspiracy theories’ but are now no longer in that realm. ‘Conspiracy theories,’ in my opinion, usually arise from credible evidence. The implications, however, are so grand and so mind-altering that many people may experience some sort of cognitive dissonance as a result. One of the topics often deemed a ‘conspiracy theory’ is weaponized disease, and the latest example comes from an approved amendment proposed by a Republican congressman from New Jersey. His name is Chris Smith, and he instructed the Department of Defense’s Inspector General to conduct a review of whether or not the US “experimented with ticks and insects regarding use as a biological weapon between the years of 1950 and 1975” and “whether any ticks or insects used in such experiment were released outside of any laboratory by accident or experiment design.”

The fact that Smith brought this up shows that any intelligent person who actually looks into this has reason to believe it’s a possibility, yet mainstream media outlets are ridiculing the idea, calling it a conspiracy instead of actually bringing up the points that caused Smith to demand the review.

The fact that the amendment was approved by a vote in the House speaks volumes. Smith said that the amendment was inspired by “a number of books and articles suggesting that significant research had been done at US government facilities including Fort Detrick, Maryland, and Plum Island, New York, to turn ticks and insects into bioweapons”.

Most people don’t know that the US government has experimented on its own citizens a number of times. All of this is justified for “national security” purposes. National security has always been a term used as an excuse to prolong secrecy, justify the government’s lack of transparency, and create black budget programs that have absolutely no oversight from Congress.

For example, on September 20, 1950, a US Navy ship just off the coast of San Francisco used a giant hose to spray a cloud of microbes into the air and into the city’s famous fog. The military was apparently testing how a biological weapon attack would affect the 800,000 residents of the city. The people of San Francisco had absolutely no idea. The Navy continued the tests for seven days, and multiple people died as a result. It was apparently one of the first large-scale biological weapon trials conducted under a “germ warfare testing program” that ran for 20 years, from 1949 to 1969. The goal “was to deter [the use of biological weapons] against the United States and its allies and to retaliate if deterrence failed,” the government later explained. Then again, that’s if you trust the explanation coming from the government.

This could fall under the category of human subject research. It’s still happening! A dozen classified programs that involved research on human subjects were underway last year at the Department of Energy. Human subject research refers broadly to the collection of scientific data from human subjects. This could involve performing physical procedures on the subjects or simply conducting interviews and having other forms of interaction with them. It could even involve procedures performed on entire populations, apparently without their consent.


Human subjects research erupted into national controversy 25 years ago with reporting by Eileen Welsome of the Albuquerque Tribune on human radiation experiments that had been conducted by the Atomic Energy Commission, many of which were performed without the consent of the subjects. A presidential advisory committee was convened to document the record and to recommend appropriate policy responses.

When it comes to Lyme disease, the Guardian points out that:

A new book published in May by a Stanford University science writer and former Lyme sufferer, Kris Newby, has raised questions about the origins of the disease, which affects 400,000 Americans each year.

Bitten: The Secret History of Lyme Disease and Biological Weapons, cites the Swiss-born discoverer of the Lyme pathogen, Willy Burgdorfer, as saying that the Lyme epidemic was a military experiment that had gone wrong.

Burgdorfer, who died in 2014, worked as a bioweapons researcher for the US military and said he was tasked with breeding fleas, ticks, mosquitoes and other blood-sucking insects, and infecting them with pathogens that cause human diseases.

According to the book, there were programs to drop “weaponised” ticks and other bugs from the air, and that uninfected bugs were released in residential areas in the US to trace how they spread. It suggests that such a scheme could have gone awry and led to the eruption of Lyme disease in the US in the 1960s.

This is concerning. It’s a story that, for some reason, instantly reminded me of the MKUltra program, in which human subjects were used for mind control research.

If things like this occurred in the past, it’s hard to understand why someone would deem the possibility of this happening again a ‘conspiracy theory.’ What makes one think this wouldn’t be happening again, especially given the fact that there is sufficient evidence suggesting it is?

Lyme disease is also very strange. If you did get it, you probably wouldn’t know immediately – unless you’re one of the chronic sufferers who have had to visit over 30 doctors to get a proper diagnosis. Lyme disease tests are highly inaccurate, often inconclusive or returning false negatives.

Why? Because this clever bacterium has found a way to dumb down the immune system and white blood cells so that it’s not detectable until treatment is initiated. To diagnose Lyme disease properly you must see a “Lyme Literate MD (LLMD).” However, more and more doctors are turning their backs on patients out of sheer fear of losing their practices! Insurance companies and the CDC will do whatever it takes to stop Chronic Lyme Disease from being diagnosed, treated, or widely recognized as an increasingly common issue.

You can read more about that here.

The Takeaway

It’s becoming more apparent that our government, as well as our federal health regulatory agencies, is extremely corrupt. There are a number of examples to choose from throughout history proving this. The fact that something like this doesn’t seem believable to the public is ridiculous, and that disbelief only prolongs the ability of the powerful elite and the government to continue conducting these activities. Awareness is key.


Alternative News

The Medical Journals’ Sell-Out—Getting Paid to Play

[Note: This is Part IX in a series of articles adapted from the second Children’s Health Defense eBook: Conflicts of Interest Undermine Children’s Health. The first eBook, The Sickest Generation: The Facts Behind the Children’s Health Crisis and Why It Needs to End, described how children’s health began to worsen dramatically in the late 1980s following fateful changes in the childhood vaccine schedule.]

The vaccine industry and its government and scientific partners routinely block meaningful science and fabricate misleading studies about vaccines. They could not do so, however, without having enticed medical journals into a mutually beneficial bargain. Pharmaceutical companies supply journals with needed income, and in return, journals play a key role in suppressing studies that raise critical questions about vaccine risks—which would endanger profits.
An exclusive and dependent relationship

Advertising is one of the most obviously beneficial ways that medical journals’ “exclusive and dependent relationship” with the pharmaceutical industry plays out. According to a 2006 analysis in PLOS Medicine, drugs and medical devices are the only products for which medical journals accept advertisements. Studies show that journal advertising generates “the highest return on investment of all promotional strategies employed by pharmaceutical companies.” The pharmaceutical industry puts a particularly “high value on advertising its products in print journals” because journals reach doctors—the “gatekeeper between drug companies and patients.” Almost nine in ten drug advertising dollars are directed at physicians.

In the U.S. in 2012, drug companies spent $24 billion marketing to physicians, with only $3 billion spent on direct-to-consumer advertising. By 2015, however, consumer-targeted advertising had jumped to $5.2 billion, an increase of more than 70% that has reaped bountiful rewards. In 2015, Pfizer’s Prevnar-13 vaccine was the nation’s eighth most heavily advertised drug; after the launch of the intensive advertising campaign, Prevnar “awareness” increased by over 1,500% in eight months, and “44% of targeted consumers were talking to their physicians about getting vaccinated specifically with Prevnar.” Slick ad campaigns have also helped boost uptake of “unpopular” vaccines like Gardasil.

Advertising is such an established part of journals’ modus operandi that high-end journals such as The New England Journal of Medicine (NEJM) boldly invite medical marketers to “make NEJM the cornerstone of their advertising programs,” promising “no greater assurance that your ad will be seen, read, and acted upon.” In addition, medical journals benefit from pharmaceutical companies’ bulk purchases of thousands of journal reprints and industry’s sponsorship of journal subscriptions and journal supplements.


In 2003, an editor at The BMJ wrote about the numerous ways in which drug company advertising can bias medical journals (and the practice of medicine)—all of which still hold true today. For example:

  • Advertising monies enable prestigious journals to get thousands of copies into doctors’ hands for free, which “almost certainly” goes on to affect prescribing.
  • Journals are willing to accept even the most highly misleading advertisements. The FDA has flagged numerous instances of advertising violations, including ads that overstated a drug’s effectiveness or minimized its risks.
  • Journals will guarantee favorable editorial mentions of a product in order to earn a company’s advertising dollars.
  • Journals can earn substantial fees for publishing supplements even when they are written by “paid industry hacks”—and the more favorable the supplement content is to the company that is funding it, the bigger the profit for the journal.

Discussing clinical trials, the BMJ editor added: “Major trials are very good for journals in that doctors around the world want to see them and so are more likely to subscribe to journals that publish them. Such trials also create lots of publicity, and journals like publicity. Finally, companies purchase large numbers of reprints of these trials…and the profit margin to the publisher is huge. These reprints are then used to market the drugs to doctors, and the journal’s name on the reprint is a vital part of that sell.”

Industry-funded bias

According to the Journal of the American Medical Association (JAMA), nearly three-fourths of all funding for clinical trials in the U.S.—presumably including vaccine trials—came from corporate sponsors as of the early 2000s. The pharmaceutical industry’s funding of studies (and investigators) is a factor that helps determine which studies get published, and where. As a Johns Hopkins University researcher has acknowledged, funding can lead to bias—and while the potential exists for governmental or departmental funding to produce bias, “the worst source of bias is industry-funded.”

In 2009, researchers published a systematic review of several hundred influenza vaccine trials. Noting “growing doubts about the validity of the scientific evidence underpinning [influenza vaccine] policy recommendations,” the authors showed that the vaccine-favorable studies were “of significantly lower methodological quality”; however, even these poor-quality studies—when funded by the pharmaceutical industry—got far more attention than equivalent studies not funded by industry. The authors commented:

[Studies] sponsored by industry had greater visibility as they were more likely to be published by high impact factor journals and were likely to be given higher prominence by the international scientific and lay media, despite their apparent equivalent methodological quality and size compared with studies with other funders.

In their discussion, the authors also described how the industry’s vast resources enable lavish and strategic dissemination of favorable results. For example, companies often distribute “expensively bound” abstracts and reprints (translated into various languages) to “decision makers, their advisors, and local researchers,” while also systematically plugging their studies at symposia and conferences.

The World Health Organization’s standards describe reporting of clinical trial results as a “scientific, ethical, and moral responsibility.” However, it appears that as many as half of all clinical trial results go unreported—particularly when their results are negative. A European official involved in drug assessment has described the problem as “widespread,” citing as an example GSK’s suppression of results from four clinical trials for an anti-anxiety drug when those results showed a possible increased risk of suicide in children and adolescents. Experts warn that “unreported studies leave an incomplete and potentially misleading picture of the risks and benefits of treatments.”

Debased and biased results

The “significant association between funding sources and pro-industry conclusions” can play out in many different ways, notably through methodological bias and debasement of study designs and analytic strategies. Bias may be present in the form of inadequate sample sizes, short follow-up periods, inappropriate placebos or comparisons, use of improper surrogate endpoints, unsuitable statistical analyses or “misleading presentation of data.”

Occasionally, high-level journal insiders blow the whistle on the corruption of published science. In a widely circulated quote, Dr. Marcia Angell, former editor-in-chief of NEJM, acknowledged that “It is simply no longer possible to believe much of the clinical research that is published, or to rely on the judgment of trusted physicians or authoritative medical guidelines.” Dr. Angell added that she “[took] no pleasure in this conclusion, which [she] reached slowly and reluctantly” over two decades at the prestigious journal.

Many vaccine studies flagrantly illustrate biases and selective reporting that produce skewed write-ups that are more marketing than science. In formulaic articles that medical journals are only too happy to publish, the conclusion is almost always the same, no matter the vaccine: “We did not identify any new or unexpected safety concerns.” As an example of the use of inappropriate statistical techniques to exaggerate vaccine benefits, an influenza vaccine study reported a “69% efficacy rate” even though the vaccine failed “nearly all who [took] it.” As explained by Dr. David Brownstein, the study’s authors used a technique called relative risk analysis to derive their 69% statistic because it can make “a poorly performing drug or therapy look better than it actually is.” However, the absolute risk difference between the vaccine and the placebo group was 2.27%, meaning that the vaccine “was nearly 98% ineffective in preventing the flu.”
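The arithmetic behind that relative-versus-absolute distinction is easy to check for yourself. Below is a minimal Python sketch using hypothetical attack rates chosen only to reproduce the figures quoted above (roughly 3.3% of the placebo group and 1.0% of the vaccinated group contracting the flu); the actual rates reported in the study may differ.

```python
# Illustrative only: how a "69% efficacy" headline (relative risk reduction)
# can coexist with a small absolute risk difference. These attack rates are
# hypothetical, chosen to reproduce the article's numbers.

placebo_attack_rate = 0.0329   # assumed: ~3.3% of unvaccinated got the flu
vaccine_attack_rate = 0.0102   # assumed: ~1.0% of vaccinated got the flu

# Relative risk reduction: the headline "efficacy" figure.
rrr = 1 - vaccine_attack_rate / placebo_attack_rate

# Absolute risk reduction: the difference an individual actually experiences.
arr = placebo_attack_rate - vaccine_attack_rate

print(f"Relative risk reduction: {rrr:.0%}")   # -> 69%
print(f"Absolute risk reduction: {arr:.2%}")   # -> 2.27%
```

The same small absolute difference (about 2.3 percentage points) can thus be framed either as "69% effective" or as leaving roughly 98% of recipients no better off, which is the point Dr. Brownstein was making.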

Trusted evidence?

In 2018, the Cochrane Collaboration—which bills its systematic reviews as the international gold standard for high-quality, “trusted” evidence—furnished conclusions about the human papillomavirus (HPV) vaccine that clearly signaled industry bias. In May of that year, Cochrane’s highly favorable review improbably declared the vaccine to have no increased risk of serious adverse effects and judged deaths observed in HPV studies “not to be related to the vaccine.” Cochrane claims to be free of conflicts of interest, but its roster of funders includes national governmental bodies and international organizations pushing for HPV vaccine mandates as well as the Bill & Melinda Gates Foundation and the Robert Wood Johnson Foundation—both of which are staunch funders and supporters of HPV vaccination. The Robert Wood Johnson Foundation’s president is a former top CDC official who served as acting CDC director during the H1N1 “false pandemic” in 2009 that ensured millions in windfall profits for vaccine manufacturers.

Two months after publication of Cochrane’s HPV review, researchers affiliated with the Nordic Cochrane Centre (one of Cochrane’s member centers) published an exhaustive critique, declaring that the reviewers had done an incomplete job and had “ignored important evidence of bias.” The critics itemized numerous methodological and ethical missteps on the part of the Cochrane reviewers, including failure to count nearly half of the eligible HPV vaccine trials, incomplete assessment of serious and systemic adverse events and failure to note that many of the reviewed studies were industry-funded. They also upbraided the Cochrane reviewers for not paying attention to key design flaws in the original clinical trials, including the failure to use true placebos and the use of surrogate outcomes for cervical cancer.

In response to the criticisms, the editor-in-chief of the Cochrane Library initially stated that a team of editors would investigate the claims “as a matter of urgency.” Instead, however, Cochrane’s Governing Board quickly expelled one of the critique’s authors, Danish physician-researcher Peter Gøtzsche, who helped found Cochrane and was the head of the Nordic Cochrane Centre. Gøtzsche has been a vocal critic of Cochrane’s “increasingly commercial business model,” which he suggests is resulting in “stronger and stronger resistance to say anything that could bother pharmaceutical industry interests.” Adding insult to injury, Gøtzsche’s direct employer, the Rigshospitalet hospital in Denmark, then fired Gøtzsche. In response, Dr. Gøtzsche stated, “Firing me sends the unfortunate signal that if your research results are inconvenient and cause public turmoil, or threaten the pharmaceutical industry’s earnings, …you will be sacked.” In March 2019, Gøtzsche launched an independent Institute for Scientific Freedom.

In 2019, the editor-in-chief and research editor of BMJ Evidence Based Medicine—the journal that published the critique of Cochrane’s biased review—jointly defended the critique as having “provoke[d] healthy debate and pose[d] important questions,” affirming the value of publishing articles that “hold organisations to account.” They added that “Academic freedom means communicating ideas, facts and criticism without being censored, targeted or reprimanded” and urged publishers not to “shrink from offering criticisms that may be considered inconvenient.”

The censorship tsunami

Another favored tactic is to keep vaccine-critical studies out of medical journals altogether, either by refusing to publish them (even if peer reviewers recommend their publication) or by concocting excuses to pull articles after publication. In recent years, a number of journals have invented bogus excuses to withdraw or retract articles critical of risky vaccine ingredients, even when written by top international scientists. To cite just three examples:

  • The journal Vaccine withdrew a study that questioned the safety of the aluminum adjuvant used in Gardasil.
  • The journal Science and Engineering Ethics retracted an article that made a case for greater transparency regarding the link between mercury and autism.
  • Pharmacological Research withdrew a published veterinary article that implicated aluminum-containing vaccines in a mystery illness decimating sheep, citing “concerns” from an anonymous reader.

Elsevier, which publishes two of these journals, has a track record of setting up fake journals to market Merck’s drugs, and Springer, which publishes the third journal as well as influential publications like Nature and Scientific American, has been only too willing to accommodate censorship requests. However, even these forms of censorship may soon seem quaint in comparison to the censorship of vaccine-critical information now being implemented across social media and other platforms. This concerted campaign to prevent dissemination of vaccine content that does not toe the party line will make it harder than ever for American families to do their due diligence with regard to vaccine risks and benefits.


Sign up for free news and updates from Robert F. Kennedy, Jr. and the Children’s Health Defense. CHD is planning many strategies, including legal, in an effort to defend the health of our children and obtain justice for those already injured. Your support is essential to CHD’s successful mission.

