You know, I think we were all on to something before we grew up and became members of the establishment.
Think about this with me, if you will. As a child, you trusted everyone, you never in a million years thought anyone would intentionally hurt someone else, you trusted your instincts, and boys had cooties. What was so wrong about any of that?
A lot of people seem to think that some women are, and have always been, man-hating nightmares. Not so. Here's the thing: no one came out of the womb hating men. You didn't just decide on the playground one day, at the age of two, that all men were cheating assholes, did you? No, you didn't. You thought they were gross cos they ate bugs. Big deal. You still liked them just the same. Cos they were kinda cute and icky all at the very same time.
When we were born, we were so trusting, especially of ourselves. We had good instincts and gut reactions. It was somewhere in the process of growing up that we were taught to be rational, and that sucks. We were taught to effectively ignore everything we instinctively felt. We believed in magic, fairies, unicorns, and God. Yeah, all at the same time, and there was nothing wrong with it. I just want to know when, in the course of a lifetime, that moment comes: the one where you quit believing in what you don't see.
And I think the unfortunate answer to all of this is: adults. Someone, at some point, told us we were silly, or that we needed to grow up, and everything changed. Our tiny little hearts got broken, we cried about it, and then we got assimilated. And I think that sucks. A lot.