If I asked a human to draw a woman with a phone in her hand explaining something to a man, they would attempt to draw exactly that. If the artist is white, I'd expect the drawn skin color to be white; same with any other human, biased toward their own environment unless they happen to lean toward other skin colors. Skin color by itself isn't the problem with Stable Diffusion.
Stable Diffusion will instead draw a man with a phone explaining something to a woman. WTF? Same with an African doctor treating European or Asian patients: sometimes it does the reverse, or a mixture, i.e. Asian (yes, that includes Indian, ya East Asian-biased f*cks) and European doctors with an African patient. You can be as specific as you want and it'll still be stupid about it. A negative prompt can help with specificity (see the sketch below), but the output is still biased. This was the whole hullabaloo about ethical AI.
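For illustration, here's roughly what "being as specific as you can" looks like in practice; a minimal sketch assuming the Hugging Face diffusers library, with the checkpoint ID and prompt wording being my own example choices:

```python
import torch
from diffusers import StableDiffusionPipeline

# Example checkpoint; any Stable Diffusion checkpoint loads the same way.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Spell out the roles in the prompt, and push the unwanted swap into
# the negative prompt. Even with both, role/gender swaps still happen.
image = pipe(
    prompt="a woman holding a phone, explaining something to a man",
    negative_prompt="man holding phone, woman listening",
    num_inference_steps=30,
).images[0]
image.save("woman_explaining.png")
```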
Namely, the training data, and thus the models, being heavily biased. If I give a generic prompt, then a thing should be picked from a list of random things. If skin color isn't specified, a random skin color and shade should be picked. Same with the look of a man or a woman, where random features that depict a man or a woman are picked.
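You can approximate that "pick at random" behavior yourself by sampling unspecified attributes before prompting, instead of letting the model's training bias fill them in. A toy sketch; the attribute lists and wording are entirely my own assumptions:

```python
import random

# Hypothetical attribute lists; extend or reword however you like.
SKIN_TONES = ["pale", "fair", "olive", "tan", "brown", "dark brown"]
GENDERS = ["man", "woman"]

def randomized_prompt() -> str:
    # Sample each unspecified attribute uniformly at random, so the
    # prompt itself carries the choice rather than the model's bias.
    doctor_tone = random.choice(SKIN_TONES)
    patient_tone = random.choice(SKIN_TONES)
    doctor = random.choice(GENDERS)
    patient = random.choice(GENDERS)
    return (f"a {doctor_tone}-skinned {doctor} doctor treating "
            f"a {patient_tone}-skinned {patient} patient")

print(randomized_prompt())
```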
There are no general features, only stereotypes. Yes, Stable Diffusion is stereotypical. That's not to say it isn't useful in other ways.