Pioneer1 Posted July 10, 2022

There's something I've been noticing about White society, and especially the "alternative" and "conspiracy" communities in White society, for some time now. I've looked at this closely, and it appears that the ONLY time they go against the grain, against the establishment, and reveal something "new" is AFTER a Black person reveals it.

What do I mean? I'll give a few examples.

Take Troy's thread on fake news and how so many people NOW acknowledge that the news is fake and manipulative. Well, prior to the 60s you didn't hear any such talk. Back then and before, it was assumed that everything reported in the mainstream media was "gospel truth." It wasn't until Black activists came along and pointed out the lies in the mainstream media, how it was racist and biased and trying to manipulate the public, THEN all of a sudden different White people popped up claiming not to trust the "mainstream" media and started doing so-called exposés on them. Now they're bashing the mainstream media left and right when THEY are the ones guilty of putting out the false propaganda in the first place.

For centuries, White society taught that Adam and Eve were the first people and that humanity is no more than 6,000 years old. There is almost no record of them teaching anything else before the 1930s. Then AFTER the Nation of Islam came along and said that man is TRILLIONS of years old and so is this planet, THEN all of a sudden you had all of these different "alternative religions" popping up after the 40s and 50s claiming to have Divine revelations that Earth is millions of years old and man is much older. They started talking about ancient civilizations and who built the pyramids. But BEFORE the 30s they taught that the Hebrews built the pyramids; that was the standard mainstream belief.

Yet another example is how for centuries it was thought that the Western diet was the "best" diet.
Steak and potatoes will make you strong. Bacon and eggs are good for you and will have you working hard all day. Those were supposed to be the healthiest foods to make the healthiest people, according to Western tradition.

Then Black leaders as far back as the 40s and 50s were saying that the Western diet was no good, how it's unnatural and full of toxins, poisons, and preservatives, and how we need to go back to a more natural, vegetarian-based diet without so much meat and preservatives. THEN all of a sudden you started seeing these White gurus popping up with their "alternative health" books claiming we need to eat more grains, more vegetables, no more meat, etc.

I can go on and on in terms of subjects like religion, the Bible, history, etc., but I think yall get the point. It seems that the ONLY time White folks as a group are actually willing to tell the truth is AFTER we find out and expose it. Then they'll acknowledge it and put SOME of it (not the entire thing) out as if they are the ones exposing it.