War has played a key role in the history of the United States from the nation's founding down to the present. Wars made the U.S. independent, kept it together, increased its size, and established it as a global superpower. Understanding America's wars is essential for understanding America…