Don't parents talk to their children about sex anymore, at all? My Mom was far from being an "open book" when it came to sex, but I remember well the talk she gave me when I was about 10 as to where babies come from. And starting in middle school, we learned about ovulation, how babies are conceived, sexually transmitted diseases, pregnancy, childbirth... aren't they teaching any of that in school these days?
Another thing that shocks me is the number of teenagers I see who are trying to have a baby with their boyfriends - with little thought as to how hard and expensive it is to raise a child. I have always loved children, but the thought of having one before I was an adult NEVER crossed my mind!
What do you think has caused this societal shift in thinking?
Online Parent Support