Masters of medical apprentices taught their pupils what they themselves had been taught as young men. In this way, medical education gradually reached an unfortunate standstill in which the educational system hindered rather than facilitated progress. While new minds were entering the field armed with new experiences and ambitions, the system in place was ill equipped to accommodate such advancement and could no longer be considered an adequate model for teaching. The solution to this problem came in the form of medical schools, a trend that would survive its share of highs and lows until the need for reform grew so severe that a complete revamp of the system was required.


A time-honored tradition of apprenticeship ruled medical education in North America from the first days of European settlement until after the establishment of the continent’s first medical schools. The system, in which masters shared their skill with a small number of apprentices, dates back centuries and for most of that time was a sufficient form of training in the medical field. As Flexner himself expressed, the quality of medical training in this period varied considerably with the “capacity and conscientiousness of the master.” When new technologies and newly discovered information overwhelmed that tradition, formal medical classes began to supplement both the training offered by practicing physicians and the informal anatomy classes that dated back to 1750. Beginning in the 18th century, American students seeking to continue their medical education crossed the Atlantic to study in European medical schools, which were far more established than their American counterparts. From this model grew proper medical schools in the Western Hemisphere, the first of which were associated with standing universities. Out of the collaboration of John Morgan of the College of Philadelphia and the partnership of Thomas Bond and Benjamin Franklin at the Pennsylvania Hospital, America’s first medical school opened at the University of Pennsylvania. It is important to note that this first medical school was both part of an existing institution and had access to a public hospital, as the absence of these characteristics became the downfall of many schools opened in the same period.

When American colonists rejected the authority of Great Britain in the second half of the 18th century, interest turned away from the establishment of medical schools and toward the formation of an army and the search for able leaders. In 1783 the risks taken by revolutionaries in the American War for Independence were rewarded with victory. Around that time, new medical schools, including Harvard College’s Medical Department, opened in the newly formed United States of America.

In an era of new possibilities, proprietary medical schools opened across the settled parts of North America. With this, the beginning of the 19th century saw the accelerated decay of the apprenticeship system, lowered requirements for admission to institutions of medical education, and instability in the almost continuous opening and closing of these schools. In the words of Abraham Flexner himself, “scholarly ideals were soon compromised and then forgotten.” That is to say, the promising beginnings of medical schools in America soon surrendered to the profit motive and to educational inadequacy born of a lack of resources.


In the decades following the American Civil War, when stability had returned to the United States, a series of advancements further antiquated the medical educational system. The discovery of the bacteria that cause a number of common infectious diseases, the use of aseptic techniques in surgery, the performance of diagnostic laboratory procedures, the development of clinical pathology as a discipline, and the invention of new instruments such as the ophthalmoscope, the laryngoscope, and the achromatic microscope all required that medical training include instruction in these new technologies, something not easily done under the system in place.


In the 1889-1890 school year, three quarters of American medical schools followed a “repetitional” or repeating curriculum, in which students studied the same material every year for two or three years. Where laboratories were available, laboratory classes were offered in the spring session.
By 1900 the “repetitional” curriculum had lost popularity and was all but replaced by the “graded” curriculum, a four-year program in which students studied different subjects in each of their four years of medical school.
In addition to the differences between “repetitional” and “graded” curricula, both “regular” and “sectarian” medical schools existed across the country. “Regular” schools taught the full range of treatment options, while “sectarian” schools focused on a single philosophy of treatment. These sects of medicine included the homeopathic, eclectic, and physiomedical philosophies.
Given the coexistence of repetitional and graded curricula and of regular and sectarian schools, it is perhaps not surprising that entrance requirements to medical school varied greatly in this period. In the first decade of the 20th century, 74% of medical schools required four years of high school education, 20% required two or more years of college education, and 5% required only one year of college education. Even this was a great improvement: before 1900, only Johns Hopkins University required two or more years of college education for admission to its medical program. From these statistics it is easy to understand the great differences in the quality of medical education that existed. It is also worth noting that the medical schools that did require some college education at this time were “regular” schools associated with a university.