Words have meanings, right? Society keeps “evolving” with words, or in “reality” it just changes their meanings to suit the political atmosphere.
This used to be something only hard-core politicos might attempt, but I’m finding more and more “mainstream” Christians are now doing the same thing. Some Christians find it necessary to dance around or water down the true meanings of God’s word and commandments so people around them won’t think they are too harsh or “religious.”
I was trying to figure out why. What would cause someone who says they believe in something to change it, or ignore parts of it, to make themselves feel better?
Read more at TheRealSide