Is it mandatory to purchase a home warranty policy?

My husband and I are in the process of purchasing a home, and our real estate agent is advising us that it is mandatory to purchase a home warranty policy.

Asked on July 9, 2017 under Real Estate Law, California


M.D., Member, California and New York Bar / FreeAdvice Contributing Attorney

Answered 3 years ago | Contributor

The law does not require the purchase of a home warranty, although your lender might. If that is the case, then you will either need to buy it or find another lender. Also, ask your agent why they say it is "mandatory" in your case.

SJZ, Member, New York Bar / FreeAdvice Contributing Attorney

Answered 3 years ago | Contributor

A home warranty is not required by law. But it may be required by your lender: that is, a lender can make getting a home warranty a term or condition of receiving a loan, the same way they can (and do) require homeowner's insurance. If you don't wish to get a home warranty and your lender wants one, you'd need to find a different lender.

IMPORTANT NOTICE: The Answer(s) provided above are for general information only. The attorney providing the answer was not serving as the attorney for the person submitting the question or in any attorney-client relationship with such person. Laws may vary from state to state, and sometimes change. Tiny variations in the facts, or a fact not set forth in a question, often can change a legal outcome or an attorney's conclusion. Although FreeAdvice has verified the attorney was admitted to practice law in at least one jurisdiction, he or she may not be authorized to practice law in the jurisdiction referred to in the question, nor is he or she necessarily experienced in the area of the law involved. Unlike the information in the Answer(s) above, upon which you should NOT rely, for personal advice you can rely upon, we suggest you retain an attorney to represent you.