What's the significance of the signs some stores post warning the public about cancer-causing products and chemicals on the premises?
When I was in California last month, several shops I walked into had signs posted warning the public that they were in an area "where there are known cancer-causing agents." I was alarmed to see these signs, and wondered whether there is any real harm to people who visit those stores daily, or to the employees who work there. Is it a real danger? If not, why post signs like that warning everyone who enters?