All too often, America is seen as a country willing to help other people before they help themselves. My problem with this view is that while we are portrayed as a generous nation, eager to give to other countries so they can do better, is that generosity really about kindness? I think we naturally present ourselves as helping other countries succeed on their own, but in the end Americans just want to be in control of everything. Don't get me wrong, I love my country and our people, but ever since the British colonists arrived in America, we have used persuasion and force to get what we want, then spun the story to make it seem like we were doing good for other people. This is similar to what Limerick argues in her essay "Haunted America." We glorify everything we do, such as war and our desire to control other countries, when in reality we are just trying to be a world powerhouse. If we left other countries to live their own lives without disrupting them, what would our lives look like here? Would our country be in less debt? Would we be able to actually house and feed our homeless families?