Growing up, it never occurred to me that my body was something I was supposed to hate. I was overblessed with a great family, I was always surrounded by adults who treated me like I mattered, and I had the completely unfair advantage of being white, able-bodied, middle class, American. In my privileged little childhood, I was never given any reason to believe that my body could have anything wrong with it. But unfortunately, sometime around the beginning of puberty… I realized I was a woman.
Suddenly, the girl characters on TV didn’t matter as much as the boys. Boys kept coming to school in oversized hoodies and jeans, while the girls started wearing smudgy mascara and padded bras. Teachers started treating boys like their words carried more weight than mine did. Adult men started to look at my teenage body like I was something interesting but scary. Very slowly, over the course of a full decade, I picked up experiences that taught me that my womanness was a job: something I had to work at, something that required constant upkeep. If I made peace with one part of my body, another part started changing. There weren’t enough hours in the day to deal with all the parts of me that weren’t good. I felt myself constantly morphing into something less and less manageable, something uglier and uglier.