Is there too much emphasis on having the “whitest, brightest smile”? Is that a mainly American thing, or has it caught on globally?
What do YOU honestly think about it?
It’s funny that this question came up, because I had to go in for my routine teeth cleaning yesterday.
Honestly, I think this concept has dulled in recent years, especially since the pandemic. Before then, I would see poster after poster of people smiling with their perfectly white teeth whenever I visited the dentist’s office for a cleaning. I read this prompt before yesterday’s appointment, and when I walked into the office, it suddenly dawned on me that there wasn’t a poster anywhere.
I only started going to this dental clinic last year. Before then, I went to a clinic in my neighborhood, which I didn’t particularly like because they weren’t gentle with my gums.
Back to the question: I think this concept was emphasized more before 2020, and it wasn’t just an American thing but a global phenomenon. When I was visiting China in 2017 and 2018, I would go into stores like Walmart and see posters pointing to the toothpaste aisle, suggesting certain products to whiten your teeth.
Though I have mostly given up on the idea because of my coffee-stained teeth, as well as my many missing ones, there is still a teeny-tiny part of me that holds onto the hope that one day I will get my “whitest, brightest smile.”