11 Things You’ve Always Thought About the Wild West That Are Totally Wrong

Most myths about the American West originated on the silver screen. Hollywood renditions of cowboys, Indians, gunfights, and outlaws paint a romanticized version of what people believe…