U.S. Wins the Vietnam War

The Vietnam War deeply scarred the United States. The idea that the country could lose a war to Vietnamese militias, after defeating Nazi Germany and the Japanese Empire in World War II, was shocking to many Americans, and many also lost trust in the system after seeing the atrocities committed by U.S. troops overseas. The war changed the way the United States looked at the world and how it reacted to events abroad. But what if the Vietnam War had played out differently? What if the United States had won?