During the current events meetup a couple of weeks ago, we talked a bit about the situation in Syria and what, if anything, would be an appropriate U.S. response. We will probably still touch on that, but I want to broaden the topic to the role of the U.S. in the world today more generally.
What has our role been in the past, how has it changed, and what do you think should be the guiding principle of U.S. foreign policy? Can we be a positive force in the world, as opposed to merely advancing our national and economic interests? We will probably focus on the Middle East, but we don't need to be restricted to it.