Was it something your parents said? A personal experience? You're not sure why, but for some reason it's a bad word? I'd like your personal (and serious) account of what gave you a negative feeling (if any) toward this term. I'm looking for specific memories or incidents that shaped why you view the term feminism in a negative or positive light. Thanks!