Dentists still have work to do to get through to their fellow Americans. A recent survey reveals that Americans view dental care not as a necessity but as "nice to have," even though they understand the consequences of poor dental care.