Our country seems to have lost its way and drifted from the principles it was founded on. The true history of our country has been watered down and altered so much that what is taught in schools isn’t the same as what really happened. Does anyone care? Is anyone even interested in finding out the truth about our history, or do we just accept what is taught as fact and live as if our history were what “they” say it was?
We need to find out what the truth is and teach it to our children; otherwise, the mistakes that have been covered over or smoothed out to be politically correct will be repeated. Think about it!