Ever since the first dental school was founded in the United States in 1840, dentistry and medicine have been taught, and viewed, as two separate professions. That artificial division is bad for the public’s health. It’s time to bring the mouth back into the body. In 1840, dentistry focused on extracting decayed teeth…