Lord, I know this is going to ruffle some feathers, but I don't have posting privileges on clashtalk and I really, really need to get this off my chest.

I currently live outside of the US, and in my almost 10 years abroad I have slowly developed a strong distrust of white women, specifically white Americans. Perhaps because other countries do not have the same history regarding slavery, I have no problems with white women from any other English-speaking country (sometimes the Brits, but not really). But the white American women I have come across have consistently done or said things that make me keep my distance. Examples include but are not limited to:

  • “They’re nice to me in the immigration office because I’m white.” This bitch literally said this to me, an obviously Black woman.
  • Using my race to make a point in an argument
  • Getting mad that I got my work visa before they did and throwing a tantrum about it.
  • Trying to get me to move from my seat in a bar so that her friends could sit down.

I have friends who have developed the same distrust of white Americans, and they also keep their distance. My assumption is that America has bred this sense of entitlement in many of them. But when they go to other countries (I’m speaking specifically about rich Western countries), their whiteness is no longer exalted. If you’re not French, Danish, Spanish, German, whatever, you do not get the same level of respect that is reserved for the citizens of that country, and it pisses them off. My experience has shown that many of these countries treat Black and White Americans relatively equally, as non-citizens. And when I receive treatment that is better than what they expect for themselves, they fall back on good old American racism as a way of improving their self-image. And I am so.fucking.tired.of.it.

Has anyone else experienced something similar?