Feminism in the United States

Feminism aims to define, establish, and defend equal political, economic, cultural, and social rights for women. It has had a significant influence on American politics.