Is My Employment Under Contract?

An employment contract is a document that you and your employer sign and that sets out the terms of your professional relationship. Most employees in the United States do not work under an employment contract, but in some situations having one makes sense. If you signed a contract at the start of your employment, you are working under contract. […]