I encountered a self-professed feminist not long ago. He put forth the theory that, because society is not open to women in terms of power and authority, women have been returning to more submissive roles, such as domesticity and family. I had a few issues with this concept. The first was that he couldn't provide anything to back up such a sweeping claim. The conversation lasted well over an hour, and I couldn't make any progress in getting him to point to sources. Without any evidence, this view seems to me to be quite sexist: an assumption that women are inherently weak and will give up and fall back on traditional roles.
Also, viewing domestic roles as necessarily submissive (and necessarily negative) irks me. I have no problem with women who want to take a submissive role in their relationships, and I don't see that as something feminism can't support. Everyone should have a choice and be happy with that choice. That is what feminism is to me.
Also, generalisations can be made about anyone regardless of gender: some men are rapists, some women are bimbos, etc. Identifying SOME women falling back on traditional roles as a trend strikes me as deeply disturbing, and not entirely relevant either.
Maybe I have this all confused, though, and my fellow debater simply wasn't skilled in communication. Can anyone shed some light on this very confusing issue, or provide some personal insight?