It seems apparent from your question that you are unfamiliar with the US role during World War II.
The US entered the general war as a result of the Japanese attack on Pearl Harbor on December 7, 1941; before that date the US was officially neutral. But the US declared war only on Japan, not on Germany. By then Germany had annexed Austria and conquered Poland and France. A few days after Pearl Harbor, however, Adolf Hitler declared war on the US, putting an end to the American dilemma concerning its allies in Europe. At a meeting between Churchill and Roosevelt, it was agreed that the British and Americans would pursue a "Germany first" policy.