The creators of these original shows were white (and mostly male). The top-billed actors on these shows were white as well. It’s not hard to see the potential downsides of this in terms of who is getting opportunities now.
How much lived experience should a person have before tackling certain TV or movie projects? I’m talking about shows and films that confront stories, maybe controversial stories, about communities long pushed to the margins by Hollywood. How much — from the inside — does a writer or director or producer need to know and understand, in their bones, about a place or a culture in order to portray and explore it with nuance and intelligence?