Imperialism can be viewed in a negative or positive way depending on how one feels about the opportunity at hand. Imperialism begins with many different motives, which always seem to find their way back to war or conflict. As defined by wordnetweb, imperialism is "a policy of extending your rule over foreign countries." Around the turn of the 19th and 20th centuries, imperialism was at its peak in United States history. The start of imperialism seems to follow a pattern of taking over foreign areas that might serve as a massive benefit in some way.

In my opinion, America no longer takes on the role of an imperialistic nation. Although the country may help its allies or step into a situation that threatens to damage its future relationships, America does not go in and try to take over territory just to get ahead. At this point in time, the nation seems to be fine with the assets it has at hand; there is no need for more. Recent history does not show any imperialistic trends in the country; the only thing noticeable is the helping hand it constantly extends. Imperialism may have led to the spread of the machine gun, new medicines, new technologies, and more, but it also led to the negative aspects of each. In conclusion, one would say imperialism is not an issue with the United States.