As businesses continue to reopen and we prepare for a post-pandemic world, the persistent question confronting employers is whether they can mandate that employees receive a COVID-19 vaccine in order to work. In some ways, the premise seems completely antithetical to what we know about employment law. There are very few instances in which an employer can outright demand that an employee submit to an invasive medical procedure. Front-line workers and those involved in public health come to mind, but there are few other examples.

What does that mean for the rest of the business world? What about employees who have limited or no contact with the public? Does it make sense for employers to require those folks to be vaccinated too? Is it even legal to mandate vaccination?