I have a marketing degree, but I want to change career paths. I would love to go into companies and teach them about preventative health and building a culture of wellness, which would help them reduce insurance costs and make for happier employees! Do I NEED to go back to school for this? How can I get started?