German Empire (No Imperialist America)

History
The German Empire was formed in 1871 as a result of the Franco-Prussian War. In 1914, Germany entered the Great War in defence of its ally, Austria-Hungary. The war ended in 1920 with the Treaty of Leipzig.