I don't know if this classifies as a weird thing or not, and it certainly isn't a recent thing, since I've been noticing it for many years, but what strikes me is how much people talk about healthcare as a basic human need even for young adults. I'm turning 40 in a month. I've never gone to the hospital. I've never taken medication or had a prescription. I haven't had any health scares. I've never had to go to the emergency room. I'm not some health nut or super careful person, and as far as I can tell I'm not just lucky; I've lived a normal life. But whenever I'm on social media and people get into the weeds talking about jobs, careers, politics, demographics, etc., I always see people talk about access to healthcare as if it's essential to their lives. I've seen people talk about staying at horrible jobs because they provided good healthcare. Everyone obsesses over universal healthcare. How unhealthy is everyone?
People must be far unhealthier than I realize, even at my most cynical. Yet even my boomer parents, who are in their 60s and 70s, haven't needed much healthcare beyond basic dental and eye appointments. And they aren't especially health-conscious; beyond eating a decent diet, they don't actively pursue better health.
I also watched my wife go through the healthcare system. She had debilitating headaches. We had access to healthcare at the time, and we used it. Nothing cured her headaches. The doctors were eventually going to suggest cutting some nerves to numb the area (by that point she had received something like six injections to the back of the head to numb her skull, and she still had headaches). She no longer suffers from headaches, but you know what cured them? Her own mind. Simply recognizing the pain as a distraction from emotions she was burying. That's literally all it took. It was all "in her head." Our current healthcare system is such a scam, but so many people are under its spell and freak out if they lose access.