The term “nursing” originally referred to new mothers breastfeeding their children. The meaning later expanded to include those who care for the ill and disabled. Priests, nuns, and nannies were among the first to take on nursing work, healing those in need and caring for children. During wartime, such as World War I and World War II, nurses were women because the men were away in combat; nursing was also a socially acceptable job for women at the time. Over the years, the number of men entering the nursing field has grown: “about 2.7 percent of registered nurses were men in 1970 compared with 9.6 percent in 2011” (Landivar, 2013). In total, there were “3.5 million employed nurses in 2011, about 3.2 million of whom were female and 330,000 male” (Landivar, 2013). Nursing remains primarily a female occupation; however, the number of men becoming nurses has increased and continues to do so.
Nurses are not vixens or prostitutes; they are college-educated health care workers whose responsibilities are a matter of life and death. In one episode of ‘Grey’s Anatomy,’ Dr. Sloan had slept with so many nurses that they boycotted his surgeries. This portrayal depicts nurses as easily replaceable, disposable, and unprofessional. Shows like ‘Grey’s Anatomy,’ ‘ER,’ and ‘House’ associate nursing with sex, leading viewers to see nurses as incompetent and unintelligent. This stereotype, along with others, discourages practicing nurses and encourages sexual abuse in the workplace.