Now that I’ve gotten older, I often contemplate how the United States of America is known for several things. These days a reality show gone bad sits at the top of the list, but there is more. While many of its redeeming qualities may be semi-decent and borderline good-ish, surprisingly, quite a few aren’t. Perhaps the most consequential, and ugliest, characteristic of this country is its love for racism.
To be fair, I don’t think all the wypipo in this country are racist, nor do I think they are all platinum members of white supremacy. However, I will say this: if a blood clot in one of your fingers goes untreated, chances are you’re going to lose the hand. That’s what racism has become in America. An untreated blood clot that has run rampant, rotting the entire hand, while America refuses to amputate out of fear of losing something it needs to function.
Racism has become so ingrained and embedded within America’s core that learning to ignore the symptoms and live with the pain has become its way of life. It is a growing tumor this country has been ignoring for centuries.
If you don’t believe me, follow along and I’ll give you ten instances, though there are a billion more, that prove America is still racist AF.