The history of vaccine mandates in the US

Vaccine mandates are not new in the United States; they date back to the era of the country's founding fathers.
