What Ever Happened To Truth?
Whatever happened to telling the truth? Webster defines truth as conformity to fact or reality; integrity; constancy; exactness; or verified fact. When I was growing up, telling the truth was taken seriously by almost everyone, even children. Parents taught their children to always tell the truth. Even our siblings and friends held us accountable as we swore to tell the truth by crossing our hearts! That’s how important telling the truth was then.
For many people today, it seems that telling the truth is of little value and little importance. We witness deception all around us from the highest levels of government, political leaders, foreign leaders, business, media, social media, and our education system. Some are so deceptive that they deny the facts of history and seek to prevent the teaching of those facts in our schools.
Sadly, rather than speaking the truth of God’s word, some religious leaders fail to tell the truth when they remain silent in the face of injustice or the mistreatment of those without a voice, or when they go along with those who practice various lifestyle choices.
However, Believers are to follow God’s word. Scripture tells us that the word of the Lord is right and all His works are done in truth, Psalm 33:4. God is immutable; He never changes, nor does His word, and He never lies. God reminds us, “These are the things you are to do: Speak the truth to each other, and render true and sound judgment in your courts; do not plot evil against each other, and do not love to swear falsely. I hate all this,” Zechariah 8:16-17. Finally, the Lord tells us, “I have no greater joy than to hear that my children are walking in the truth,” 3 John 4.
Picture: Oath Court Witness – Courtesy of learnreligions.com