Dentists still have work to do to get through to their fellow Americans. A recent survey reveals that Americans view dental care not as a necessity but as "nice to have," even though they understand the consequences of poor dental care.
