Our culture has bastardized what “right” means.
Now a “right” means nothing can get in the way of you obtaining whatever is being referred to as a right.
In truth, a RIGHT is what we should be allowed to do, not what we should be entitled to get.
We all have the right to an education. I can’t say to you, “You’re female. You can’t go to school.” But I certainly should be able to say, “You don’t have the prerequisite high school coursework to enroll here.” Or, “You don’t have the financial means to afford this school.” Both conditions can be rectified by the individual, and then she can attend.
But today’s concept of “rights” dictates that either the school must accommodate her shortcomings, or society overall (through taxation) must rectify them.
Yes, we have a right to healthcare. Nobody can say to me, “You’re conservative, so I will not fix your broken arm.” You’re ■■■■■■■. You’re Christian. You’re black. You’re white… Whatever. Basic healthcare costs are covered for everyone and anyone through a network of various programs, and it has been so for as long as any of us here have been alive. But access to those basics HAS been violated in the past because of discrimination. We’ve fixed that.
What the new “rights” advocates want goes beyond that. When you hear our “woke” politicians use the term now, it covers free abortions, free gender reassignment surgeries, and all sorts of other non-basic procedures.