I'm wondering: do you actually believe that Americans have the shittiest culture on Earth? And if so, why? For that matter, do we really have a single culture of our own, or are we home to an incredibly diverse range of constantly evolving cultures?

I'm an American, though I've never been very patriotic or whatever. I recently spent two months in Europe to study and travel. I expected to come back feeling fed up with America and Americans, but quite the opposite happened. While I found much of the social and political awareness and atmosphere in Europe refreshing, and often far better than what I see at home, I missed many, many things about American culture and people. And I don't think it's because I was feeling extra foreign; I visited countries where I speak the native language fairly well.

Upon my return, I found I was actually happier than ever to be an American, and maybe even a little proud of it. That's a foreign feeling to me, but I feel it nonetheless.

That said, other cultures are doing A LOT of things right, and we should take note. I actively advocate for all sorts of change in the States, and I think that's the mark of a real patriot anyway: a real patriot wants to change his nation and people for the better. I would, however, never say our culture is shitty.

Anyway, I'd love for you to expand on that, and maybe we can talk about it more.