I want to be up front that I consider myself very attuned to issues of racial disparities. I work in public health to reduce health disparities among disadvantaged populations (often black/Latino), have been involved in Black Lives Matter, etc. - I don't want anyone to think that what I'm about to say comes from a place of invalidating that racial disparities exist.

But dear lord, this is absolutely abysmal journalism. You base your finding of racial bias on a data set where race is specified for only 198 out of 763 alerts, and then go on to say that the alerts are disproportionate relative to BART's ridership. You do realize that you are missing race data for roughly 75% of the alerts? You don't actually know the racial composition of your full data set, so you simply cannot state, with any degree of certainty, that the findings do or don't represent BART's passengers.
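To make the point concrete, here is a minimal sensitivity check. The 198/763 split comes from the article; the count of 140 black-identified subjects among the 198 is a hypothetical stand-in, since the actual breakdown isn't given here. The exercise shows that with 565 alerts unclassified, the true proportion is bounded only very loosely:

```python
# Sensitivity bounds: with race unreported for 565 of 763 alerts,
# the share of alerts involving black riders across the FULL data
# set is bounded, not known.
reported = 198            # alerts where race was specified
unreported = 763 - 198    # 565 alerts with no race given
black_in_reported = 140   # HYPOTHETICAL count among the 198

# Best case: none of the unreported alerts involve black riders.
low = black_in_reported / 763
# Worst case: all of them do.
high = (black_in_reported + unreported) / 763

print(f"True proportion could be anywhere from {low:.0%} to {high:.0%}")
```

Any claimed disproportion relative to ridership has to land somewhere inside that enormous interval, which is exactly why the conclusion is unsupported.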

Not only do you have insufficient data to make this claim, but your sample itself could be biased. There's a well-documented sociological phenomenon in which a person's race is mentioned in an account only when that person is non-white: the average person telling a story about a white person will often omit that the person is white, whereas if the person is non-white that detail is cited much more often. Because of this phenomenon, your 198 race-specified accounts may over-represent situations where the person is black, while the other 565 may skew toward situations where the person is white.
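A quick simulation illustrates how this reporting bias would distort the race-specified subset. Every rate below is an assumption chosen purely for illustration, not a claim about the actual data:

```python
import random

random.seed(0)

# ASSUMED values for illustration only:
true_share_black = 0.30    # hypothetical true share of alert subjects
p_report_if_black = 0.60   # race mentioned 60% of the time
p_report_if_white = 0.15   # race mentioned 15% of the time

# Simulate 763 alerts with the assumed true composition.
alerts = ['black' if random.random() < true_share_black else 'white'
          for _ in range(763)]

# Race is recorded only when the reporter chooses to mention it,
# at different rates depending on the subject's race.
race_specified = [a for a in alerts
                  if random.random() < (p_report_if_black if a == 'black'
                                        else p_report_if_white)]

share_in_specified = race_specified.count('black') / len(race_specified)
print(f"True share: {true_share_black:.0%}, "
      f"share among race-specified alerts: {share_in_specified:.0%}")
```

Under these assumed reporting rates, the race-specified subset shows a share of black subjects far above the true 30%, even though nothing about the underlying incidents changed. That is the kind of distortion the article never rules out.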

Do I think it's possible that an app could magnify our own well-documented social biases? Absolutely. But you're drawing conclusions here that aren't supported, and reporting them as fact. There has to be a higher standard than this for your journalism.