One of the things that bums us out about most post-baby body coverage is the overwhelming focus on "bouncing back," and the lack of real-talk about what moms can expect. That's why for many women, the most thrilling moment in all the hoopla surrounding the royal baby was when the duchess clasped her hands under her still-swollen, I-just-had-a-baby-YESTERDAY belly.
The thing is, pregnancy changes women's bodies, sometimes forever. You grow a tiny person in there, and then your body does the incredibly hard work of ushering him or her into the world. Some of the resulting shifts are less welcome than others, but -- if you won't hate us for being too sappy -- we'd argue that many are awesome in their own way. It's high time for a frank conversation about real post-pregnancy bodies, so that women, and men, can push back against the notion that moms are supposed to magically appear as though they never gave birth at all.
Without further ado, here's what really happens to your body after baby.