A majority of Americans believe the U.S. should focus more on issues at home and withdraw from foreign affairs[1], even as a growing number of Americans believe the U.S. should be more engaged and take the lead in international events....