Women in the Medical Field

Have they advanced?

Today, women enjoy far more of the job market than we once did, and that includes the medical field. The field or industry of our choice is open to us, even if the journey there is a rough one. Women in the medical field now have job opportunities that did not exist in bygone eras. We are no longer “just a nurse” but have elevated that title to one of importance. Nor is nursing limited to women; men also become nurses. Nursing is no longer “a woman’s job.”

Women have also advanced to the roles and titles of surgeon and doctor. Women in the field of medicine now enjoy levels of accomplishment that were closed to us only one hundred years ago. We have redefined many of the “men’s jobs” and have shown ourselves more than capable of handling men’s work, and vice versa.

Women now have children while the father raises them and the mother works. Sex or gender no longer defines the roles we as women play in modern society. Women wear the pants and fill them quite nicely, while men fill the traditional “woman’s role.” Women may also serve in the armed forces in more than a supportive capacity, making an impact by filling a job market that was, and still is, in need of personnel.

In the earlier years of the medical industry, women were granted their doctorates, according to an article on the staffcare.com website. As the years have progressed, women are no longer required to attend a female-only college to earn a doctorate. We attend the best schools and are some of the best teachers the medical field has to offer.

Even though we are not there yet, we are slowly gaining ground and have become more than “just a nurse”. WE HAVE RISEN!
