    feministstudent@bbs
    Jul 25, 2023, 13:30

    Hi, I'm 15 and I have been a feminist my whole life. I was taught that feminism is about letting women do whatever they want. I know the question seems stupid, but I just need absolute closure on the topic.

    I'm in an all-girls school, and an overwhelming majority of the people here say that women are always right and men's concerns are not important. A lot of my feminist friends say it's okay to have sex with many men and to never listen to men because all men are born evil, and that they are taught to oppress and beat women.

    However, whenever I hang out with my older brother - he sometimes brings me out with his friends, most of whom go to an all-boys school - nothing my feminist classmates say comes even close to being true.

    These boys do not talk about beating or hitting girls or women. They are always saying things like, "When I grow up, I want to eat a lot of food and I'll bring all of you with me." Some of them even say that when they have a girlfriend, it would be nice to go out together with all of us.

    I asked them if they like to beat girls, and all of them looked at me, confused. One of them said, "What for?" I said, "I don't know... it makes you happy?" They looked even more confused, but one of them said, "We protect girls. If we really enjoyed beating girls, why would you still be here having pizza with us?"

    I felt embarrassed and apologized.

    My brother said, "And if anyone were to beat you, I am here to protect you." His best friend said, "All of us are friends. We don't care if you're a girl or a boy; we can always play and eat together like we are now."

    I realized that the things the feminists at school say are not true. But growing up in a school full of girls and feminists made me believe whatever they said, and I thought it was true.

    Feminism even taught me that men are always wrong and that women know and understand men and boys 100%, but it seems like they don't know or understand boys and men at all. I see people talking about men and boys in ways that are totally untrue, fully made-up stories. Some of them overthink a situation and play the victim. Some of my friends even think of making females the dominant gender. Most of my feminist friends only want to treat men like slaves - their boyfriend's money is all their money, but their own money is theirs alone. They say men are useless and don't know how to do housework, but from what I see of my parents, they do the housework together, depending on who is better at which task. Because feminism seems to be a very sexist, misandrist, and self-centered mentality, I am thinking of leaving feminism.

    I even asked my ex-feminist friend, and she said that feminists are bad. When I asked her why, she said:
    - Feminism projects a gynocentric and narcissistic mentality.
    - Feminists are entitled and have hypocritical attitudes.
    - Feminists use passive-aggressive and manipulative behavior in social, emotional, and financial contexts.
    - Feminists love making false accusations against boys and men and gaslighting them.
    - Feminists pass on toxic, misandrist, sexist advice from other women.
    - Feminists adopt the belief of "just believing women" without unbiased investigation.
    - Feminism fails to hold women accountable for their actions while demanding true equality.
    ...and many more.