This is an automated archive.

The original was posted on /r/askhistorians by /u/randallranall on 2023-10-18 02:56:37+00:00.


I’m interested in the history of the popular notion that people, especially children, are afraid of the dentist. You can tell a joke predicated on this today without having to explain a fear of the dentist, because everyone knows that’s a thing. When did this start to circulate as common knowledge? Did dentists try to combat the idea?

I see there are many posts about the history of dentistry generally, and I’m happy to read through those for descriptions of what was and is scary about it, but I’m more interested here in the spread of that idea in popular culture. Thanks in advance!