The Florida Department of Health in West Palm Beach, FL is a government agency dedicated to promoting and protecting the health of Florida residents.
It works to prevent disease, ensure access to quality healthcare, and provide public health information and resources to the community.