Following Bill Cosby’s release, wellness advocates push for Consent Culture
Rape culture is a culture in which sexual violence is treated as the norm, blame for sexual assault is projected onto victims' behavior, appearance, or profession, and perpetrators are exempted or excused from accountability for the violence they've inflicted and the harm they've caused. Educators for Consent Culture has been working to raise awareness about consent culture.