Friendly advice...
- Dec 6, 2016
- 2 min read
I was having an interesting conversation with a friend when I was asked, "Do you like to be dominated?" The question was in reference to the bedroom, but it definitely got me thinking... Do women want to be dominated? Do they want to be controlled? My opinion... Every woman wants to be dominated. Every woman wants a man who can tame her without her feeling she has lost herself. Every woman wants a man she can trust so deeply that she'd put her being in his hands, trusting that he'll take care of it. Every woman deserves to have that, and it's her true natural desire.
I am not talking about being abused, controlled, or emotionally manipulated. That's not dominance; that's abuse. I'm talking about allowing a man to see the true transparency of everything that you are. Allowing another being to see what lies within you: no secrets, no hidden feelings. Just open and exposed.
What do you think? Do women need men to dominate them? Or are we really independent creatures who don't care to have that relationship with our significant other? How do you feel?

I believe that the first time a female learns to love and trust in a male-dominant relationship is with her father. If that relationship is not formed well, she constantly looks for that security and relationship in every male she meets. I believe this is the foundation that allows a girl to grow into a woman; this basis affects the woman she will be. I often feel that men don't know how imperative their role is in their daughters' lives. They may sometimes feel that their daughters, being female, need their mothers more, but nothing could be further from the truth. I encourage every man to learn to be a father. You may not have fathered a child yet, or maybe you have; either way, your ancestors fathered a great nation. Women need their fathers. Let me know what you think, or share any experience that could elaborate on the topic.