Can an employer mandate that you research ways to get better at your job on your own time?

My employer used to “suggest” that we spend our own time on self-improvement; now they’re mandating it. Is this legal? They aren’t paying us for this time, and they’re being intimidating about it.

Asked on July 22, 2012 under Employment Labor Law, Washington


MD, Member, California Bar / FreeAdvice Contributing Attorney

Answered 8 years ago | Contributor

Whether an employer can require this depends on your level and type of employment. Usually, if you work for a law firm, for example, the firm will make available or pay for continuing education seminars and programs. If your job is now requiring additional education at your own cost and on your own time, you should speak with the state's department of labor, the human rights commission, and the Equal Employment Opportunity Commission.

IMPORTANT NOTICE: The Answer(s) provided above are for general information only. The attorney providing the answer was not serving as the attorney for the person submitting the question or in any attorney-client relationship with such person. Laws may vary from state to state, and sometimes change. Tiny variations in the facts, or a fact not set forth in a question, often can change a legal outcome or an attorney's conclusion. Although FreeAdvice has verified the attorney was admitted to practice law in at least one jurisdiction, he or she may not be authorized to practice law in the jurisdiction referred to in the question, nor is he or she necessarily experienced in the area of the law involved. Unlike the information in the Answer(s) above, upon which you should NOT rely, for personal advice you can rely upon, we suggest you retain an attorney to represent you.