This right here is, I think, what a lot of white people miss. We're so invested in avoiding the *label* "racist" that we don't put in the work. As if being seen as racist is a death sentence and something we cannot (read: will not) fix. I've seen a lot of Black writers suggest that we shouldn't call racist people racist, because then they clam up and get defensive. And ... I more or less agree, but I still think it's important to call out racism, just in a way that helps people understand they can change themselves for the better.
I'm about to read the other comments here, and I'm dreading that I'll likely see a lot of justifications and self-exceptions. I wish white people would accept that we were raised within an unequal system that unfairly privileges us, and that we do have to actively work to include "the other" — but that work will ultimately benefit not only "the other" but us as well.
Thank you for putting yourself out here.