Talkin' trash to the garbage around me.

25 March, 2006

Am I the one who should be teaching women about their bodies?

Well, I'm finishing up the grading for the term, my first term teaching an introductory women's studies course, and as I read the exams, I'm a little disappointed to find that most of my students didn't "get it." That is, I don't think they really grasped the power dynamics that buttress all of the different and interrelated forms of inequality. In the best tradition of self-reflexivity, I have to ask myself why that is. The simple answer? I held back.

I suppose there are a number of reasons for this. I approached the course this term as a job I was just trying to muddle through, with very little passion for the whole project. For a course that relies upon fostering self-discovery as a pedagogical tool, I suppose it's a little problematic if the guide for the whole affair is a bit detached from the process.

I also think I was somewhat cowed by the political environment in which "we" academics (I'll include myself for another few months) are teaching. I was overly careful not to offend students who may have been a bit more conservative, and that might have blunted my ability to get them to really think about power.

Both of these things are fixable. I'm actually looking forward to next term - probably my last term of teaching ever! - and to organizing the class, rather than just teaching at it.

That doesn't deal with all of the issues, though. Frankly, it's hard for me to square helping young people realize that there are structures out there which put them at a significant disadvantage to white, heterosexual, middle-class men with the fact that I myself am a white, heterosexual, middle-class man. I mean, we won't even discuss the absurdity of me trying to sort out all of the myths about the female orgasm in a class on sexuality - while I might have some, erm... technical understanding in that respect, hell if I know what's actually going on. But who am I to tell them that they are being made to feel bad about their bodies by mass culture, and that they shouldn't buy into it, even though it's true? Why should they believe me? Some have tried to tell me that maybe they'll take it more seriously since it's coming from a man, which, really, is even more fucked up. Again: why should they believe me? What it boils down to is that I'm up there teaching feminist principles in which I honestly believe (even if I fall short in their actual implementation), but I feel like a giant fraud who's going to be found out. It's a very bizarre place to be, and I definitely didn't have to deal with any of this when I was teaching medical sociology.

Okay, now that that's off my chest, who's up for some beer and football?