Workers compensation, or workers comp, is a form of insurance that employers can purchase to ensure they are able to cover the costs when one of their employees gets hurt on the job. Individual employees can also purchase workers compensation to cover their wages should they be injured at work, although most rely on their employer to carry this insurance and offer its coverage.

For some it's required

In fact, for some professions the law requires employers to carry a workers compensation insurance policy when their employees work potentially dangerous jobs that could result in injury. In those cases it is illegal for a company to operate without offering workers compensation coverage to its employees.

How did Workers Comp begin?

The idea of workers compensation arose in the early part of the twentieth century as a compromise between employers and laborers: employers would pay part of the medical expenses and lost wages of workers injured on the job, and in return employees gave up the right to sue their employers for injuries sustained in the workplace.

Since then little has changed in the law, leaving many demanding further reforms to bring a more even distribution of power between laborer and employer, as most laborers see the current balance leaning toward the employer. Whatever the future of those hoped-for reforms, the fact remains that employers who carry workers compensation insurance, and the employees who work for them, are better able to pay the enormous expenses, on both sides, of an accidental injury at work.