Feminism was once defined as the "qualities of females" and for a time meant simply a belief in equality between the sexes. From these fairly benign beginnings, modern-day feminism has morphed into something quite ghastly and destructive. Feminist leaders blame the 'patriarchy' for all the ills of society and, in their mad quest for power, hope to ignite a social revolution and restructure society from the ground up. Their tactics, to name just a few, include perpetuating gender myths, bemoaning a so-called 'rape culture' and 'toxic masculinity', naming and shaming men through the #MeToo movement, and tearing down the traditional family structure, all under the guise of empowering women.
But what will be the cost? Can normal relations between men and women survive this onslaught? Do we really want a world overrun with hysterical harpies and emasculated men?