Board Thread:General Discussion/@comment-10975360-20131129121937/@comment-4919678-20141126124107

I do think that if Germany had won World War One, there would have been a period of tension between the USA and Germany over certain colonies in the Americas. It would be wrong to call it a cold war, though it would have been similar, just with no threat of an apocalypse hanging over everyone's head. But could something like the Nazis have shown up in Britain?