The 1940s
The 1940s were a tumultuous era. The first half of the decade was dominated by the Second World War, and the conflict's aftermath shaped the latter half. After the attack on Pearl Harbor in December 1941, the United States entered the war and helped turn the tide in favor of the Allied forces. When the war ended, the United States emerged as the dominant world power, with unmatched political, economic, and military influence. This website outlines some of the most important events, people, and things from the 1940s. Let's dig in!