Women are too often taught from a young age that sex is sometimes painful, as if pain were an inevitable part of intimacy and the sexual experience. Think about nearly every movie you've seen in which a woman has sex for the first time: the wincing look on her face as her partner (usually a man) tries to penetrate. While pain with sex is fairly common among women, it is absolutely not normal.
