Are you more inclined to buy a skincare product if the phrase “dermatologist-tested” appears on the packaging? Do you perceive the product to be better because of its association with skincare experts?
For as long as I can remember, these words have never meant much to me. At the very least, they’re never the first thing to influence my purchasing decision. I might take note of the claim, but I wouldn’t buy a product just because it was tested by some dermatologists. After all, I have no idea whether that refers to one dermatologist or a group of them, or who they are. Perhaps the phrases “clinically tested” or “laboratory tested” carry more weight, but again, most of these tests are conducted by the company itself and are neither independent nor unbiased. More often than not, there is fine print indicating that results will vary from person to person.
So what’s your take?