Workers Compensation is an insurance program that provides medical and disability benefits for work-related injuries and diseases. One of the basic premises of this insurance is an experience rating system that encourages injury prevention by charging higher premiums to employers whose workers have more injuries.
While workers compensation is mandatory in nearly all states, including Florida, California, and Georgia, it is not mandatory in Texas. Administratively, it is a type of insurance fraught with pitfalls for the inexperienced layperson as well as the inexperienced attorney.
Workers Compensation is a social insurance system that provides partial wage replacement for temporary or permanent loss of earnings due to job-related injuries and illnesses.
It originated in Germany in the 1800s and became common in the United States in the 1930s and 1940s. It now appears to be here to stay: although many would like to see it eliminated, the major impediment to doing so is the absence of national health insurance and disability programs in the United States.
Laws vary depending on the state in which benefits are sought. For example, although Georgia and Florida are close in proximity, they may have different laws and requirements relating to workers compensation. Business owners should always consult their attorneys for the most accurate and up-to-date information.
Benefits are awarded through a strict liability system, in which the employee is compensated for the injury regardless of whether it was a direct result of the employer's negligence.
The sole purpose of workers compensation is to cover the medical expenses arising from the injury and to financially support the employee while they recover.
When workers compensation was first proposed, a compromise was reached between businesses and workers: in exchange for employers accepting full responsibility for the premium costs of workers compensation, workers gave up the right to sue the employer for damages resulting from a job-related injury. Rather than a benefit, workers compensation is a legally mandated right of the worker, and as such all medical bills relating to an on-the-job injury are covered by workers compensation.