War has played a key role in the history of the United States from the nation’s founding right down to the present. Wars made the U.S. independent, kept it together, increased its size, and established it as a global superpower. Understanding America’s wars is essential for understanding American history. In Key Battles of American History, host James Early discusses American history through …