There's a shitstorm brewing on this week's discussion boards for my psychology class. We're talking about cultural attitudes toward punishing children, and it's turned into a whole lot of "I was spanked as a kid and I'm fine, I spank my kids and they're fine" repeated over and over. "Kids these days are such brats because their parents don't smack 'em, you know." "As long as you aren't beating them up or leaving marks or hitting their faces, it's totally okay."

Note: I'm paraphrasing here. You don't want to see the exact quotes.

I do not have kids, so I carefully stated that these are only my opinions and that I'd never tell anyone how to raise their kid as long as they stay within the confines of the law. I'm really trying not to start fights and to keep this an intelligent, thoughtful discussion.


However, I don't think it's ever right to hit a child. So to avoid getting into an argument with my classmates, I posted about how cultural attitudes shift and how we seem to be moving away from corporal punishment as a culture (I cited links showing that corporal punishment in schools is generally on the decline and that more and more states are making it illegal).

This did not avoid the argument as I had hoped.

Anyone have any links or anecdotes or anything relevant to this discussion? I feel like I'm getting wailed on from several different directions, and I don't even know where to start responding.