You guise, I may have saved a life yesterday. Just kidding. Well, sort of. My coworker came over to my desk with an unusually somber look on her face, and asked about my experiences with bladder infections. I've had tons, so I was happy to help. I asked her the usual questions, and she said her lower back felt tender. I did the responsible thing and told her to go see a doctor.

This morning she had blood in her pee. Yay, kidney infection! She ended up getting an online diagnosis, since we work late and most clinics are closed by the time we get off. She's on antibiotics and lots of water now.

It got me thinking, though. In health class we focused on all the horrible, terrible diseases we were definitely going to catch from having sex and eventually die from, unless we were married. But I don't remember ever once being told about bladder infections. How was this never mentioned? Why do we only find out about them when we can't stop feeling like we have to pee and end up peeing blood? I only learned about bladder infections when my cousin got one along with a yeast infection, and her mom had to pass the story along to my mom, because that's what happens to naughty girls who have pre-marital sex.

Shouldn't this be part of regular sex education? Can't we start removing some of the mythology surrounding women's genitals?

Except vagina dentata. That one's too funny; we should totally perpetuate that myth. If you're stupid enough to believe it, you don't deserve sex.