James Sherk, a research fellow in Labor Economics at the Heritage Foundation, states, “During the past decade, manufacturing employment has fallen by one-third while manufacturing output has remained roughly constant” (Sherk). As the use of technology has increased dramatically over the past few decades, it has been incorporated into factories more and more to give companies and industries a chance to save money. From 1987 to 2000, United States manufacturing employment stayed relatively stable at approximately seventeen million jobs; by 2010, that number had dropped dramatically to 11.7 million (Sherk). For over 200 years, America has been able to create new jobs to replace the ones no longer needed, keeping employment rates constant; however, with the sudden acceleration of technological advancement, a decrease in available manufacturing jobs was almost inevitable (McAfee). Even though “the US labor force has been shaped by millennia of technological progress,” it has only recently begun to show some of the disadvantages that come along with it (Thompson). According to Sherk, “Manufacturers used more than six times as much information processing equipment in 2007 as they used two decades earlier… Computers and robots now do tasks that once required workers on an assembly line” (Sherk). Many blue-collar workers, the class known for manual work, are left with smaller chances of acquiring a manufacturing job to support themselves and their families. Previously, factories needed many unskilled laborers to complete simple, repetitive tasks; now, technology can complete those tasks, eliminating the demand for workers with less education, such as Americans who dropped out of high school or who hold only high school diplomas. Workers with higher education levels are now being hired in factories instead.