I guess my question would be: what is it that you care about? You care that everybody admits that America is racist? Not that there is racism in America, mind you (as I said, I think almost everybody would concede that), but you want people to say America is racist? Why, especially given that you're white? What do you get out of that? What do you think black people get?
The examples I gave are of racism in America; I could find similar examples in almost every country on Earth. So again, why do you think it's any kind of priority to go that extra step and define America itself as racist? I am absolutely certain that every one of the people I wrote about would rather have their situation improved than debate whether America is racist.
Perhaps this is brutally honest, but I couldn't care less if some people aren't in a place where they can put their own ego aside and work to help those less fortunate than they are. And sadly, that includes some black people.
Most of the black people I see whining about microaggressions on Medium have lives that are sooo much better than those of people who live in formerly redlined communities, or who are in jail for minor offenses as casualties of the war on drugs, or who get hassled by the police in their neighbourhoods every day. They complain because an old lady didn't smile at them or some such crap. It's so narcissistic.
So no, I'm not interested in "meeting people where they're at" when they aren't interested in anyone but themselves. They were never going to say anything useful anyway, so again, who cares?