Feminism (noun): the radical idea that women are people.