I believe that the Bible should be taught in public schools. The Bible "…has had a profound impact on the history and development of the United States and remains a vital part of American life and culture. Yet, Americans are less biblically literate now than ever before…" (The Christian Science Monitor). Our nation was shaped by the Bible, yet many don't even know the simplest stories from it. Now, I am not saying that we should all have to memorize verses; I am simply saying that the Bible should be used like any other historical or literary reference (for example, The Odyssey or The Iliad). "The Bible is the most influential book ever written. Not only is the Bible the best-selling book of all time, it is the best-selling book of the year every year." (Time)
I understand that everyone has a constitutional right to practice whatever religion they want, but that does not mean we have to ignore everything in history that relates to religion. I also see the concern that if the Bible is taught, then followers of other religions will want their texts taught as well, but the fact of the matter is that the Bible is important to the history of our nation.