Doctors



  • n. People who get paid to feel you up and stick needles into your exposed flesh for a living. Though it sounds like a good deal, to become one you have to waste about 8 years of your life in college plus God knows how many more in med school. Being a doctor also takes a massive toll on your mental health: they're the ones who have to tell people they have an incurable disease, tell people they can't have babies, and, don't forget, perform the occasional colonoscopy!