So I’m European and I’m aware that American culture is very different in many ways. I don’t know if this is just something about American culture and mentality in general that has always been there, or if it’s a trend that only started in the past few years.

I don’t want to generalize about any country, and I know not everyone is like this, but I’ve definitely noticed a pattern.

Over the past few years I’ve increasingly noticed that many Americans seem very hateful/cruel, lacking in empathy, and more and more aggressive, and it seems to be getting worse.

I’m not sure if this is maybe related to Americans needing to be “tough” or something, because I always hear that the American mentality is pretty competitive and individualistic; instead of saying “we’ll get through this together”, they often have this “it’s either me or you, only one of us can win” mentality. I mean, I’m pretty sure things like biker culture, driving big “manly” pickup trucks, wrestling, etc. are pretty prevalent in America compared to other countries, and American culture generally seems very loud and direct. I think here in Europe people are way more reserved, and I guess the strongest opposite of Americans is probably the Japanese.

But to me it seems to go to the point where many Americans have this attitude of being very ignorant and arrogant, basically thinking they’re better than everyone else and only caring about themselves.

And it feels so extreme that everyone is hating on, attacking, and bashing everyone else; instead of being stronger united, they’re fighting among themselves, putting each other down, and always focusing on the negative.

Especially online, it seems that no matter what the topic is, and regardless of whether they’re Democrat or Republican, they’re constantly bashing someone and baselessly calling them “weak”, even though in reality they’re probably the weak ones, trampling on people because they’re obviously dissatisfied with themselves and can’t man up and face the real issues. You just can’t blame everything on others; you have to take responsibility for yourself!

Some of what I’ve seen on American news channels like Fox News just seemed crazy, with reporters personally attacking and bashing people, which would be unthinkable in Europe.

Even though many people used to say that Americans have this “fake friendliness”, I think even that has disappeared in the last few years; they’re becoming more open about showing what they really think, which seems to be that they “don’t give a f* about you”.

Many Americans I’ve encountered seem so aggressive, like they always need to bash something in this toxic way, even though they’re actually in a very good position and have a lot to be grateful for. In poor countries people have real problems: they’re literally starving because they have no food, or there’s war in their country.

I’m always thinking “dude, you need to chill”, because literally no one is attacking them and they’re completely secure. But it seems like they’re always looking for a fight or something.

It seems like many of these people are so disconnected from nature that they’re becoming less human, and I wonder why they can’t just spend meaningful time with other people, being positive, instead of constantly wasting their time hating or complaining about something. That just doesn’t work; in a society of many people, especially in a world more connected than ever, we need to hold together and have empathy for one another. That is one of the core morals a human needs!

It seems like many Americans generally have this cruelness about them. I’ve also heard that many Americans physically beat their children, and then there’s the fact that guns are popular and legal in America, to the point where you can’t even safely walk alone in public at night or safely send your kid to school, plus this general mindset that America does everything best, “America first”. I really don’t want to bash Americans at all; I only want to share my experience, because I just haven’t experienced this kind of hate here in Europe to that extreme degree, and it makes me very uncomfortable because I feel like this mood is affecting the whole world, since American media and influence are prevalent everywhere.

To me it feels like this won’t end well, like it’s just a matter of time until something very bad happens, like a second civil war, and the storming of the Capitol might be nothing compared to that. But maybe that’s the only way they’ll finally learn, if they’re lacking these core morals and integrity and aren’t educated about them in school.

It also seems like they can’t handle criticism and can’t admit to or face these things. When I once asked a similar question on Reddit, all I got back was bashing and personal attacks, and I hope it’s not the same here, because that would literally just prove my point. There need to be constructive discussions.

  • FeloniousPunk@lemmy.today
    6 hours ago

I lived in Texas for 30 years. For the last 10-15 years, I heard gunfire in my area - in the city, nowhere near a gun range - at least once a week, if not more.

I finally left that state of insanity, and we’re much happier for it.

    • Mossy Feathers (She/They)@pawb.social
      5 hours ago

Yep, that’s my point. I’ve lived in Texas all my life and never heard someone discharge a gun outside of a legal area, while you’ve lived in Texas for 30 years and heard guns being fired regularly despite being in the city.

      Don’t get me wrong, Texas is fucked in many ways, and while I’ll be sad to leave the scenery and ecology behind, I won’t be sad to leave the people (except my friends who live here).