Eyespeak is the first autonomous augmented-reality, eye-tracking interface that lets users communicate with their eyes, in any position and orientation of the head. It consists of a pair of augmented-reality glasses that project a virtual keyboard (or, when connected to a computer, that computer's display) onto your field of view. A micro camera aimed at your eyes detects which key you are selecting. After composing a word or phrase, you select the "speak" button with your eyes, and the device reads your text aloud in a synthetic voice through a speaker integrated into the glasses, so you can converse with anyone, in any position. Because the glasses stay on your head, you always have access to them.

We call the system autonomous because the only assistance you need is for someone to place it on your head and press the ON button; everything else is done by you, with no computer or other external device required. Moreover, you can connect Eyespeak to your existing computer and use your eyes to control the mouse.
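The key-selection step described above (the camera watches your eyes and decides which key you mean) is often handled with a dwell-time rule: a key is "pressed" when your gaze rests on it long enough. The sketch below illustrates that idea only; the class name, the 800 ms threshold, and the simulated gaze samples are all illustrative assumptions, not the actual Eyespeak implementation.

```python
from __future__ import annotations
from dataclasses import dataclass

DWELL_MS = 800  # assumed dwell threshold: gaze must rest on a key this long


@dataclass
class DwellSelector:
    """Emits a virtual-keyboard key once gaze has dwelled on it long enough."""
    current_key: str | None = None
    dwell_start_ms: float = 0.0

    def update(self, key_under_gaze: str | None, t_ms: float) -> str | None:
        """Feed one gaze sample; return a key when its dwell time is reached."""
        if key_under_gaze != self.current_key:
            # Gaze moved to a different key (or off the keyboard): restart timer.
            self.current_key = key_under_gaze
            self.dwell_start_ms = t_ms
            return None
        if key_under_gaze is not None and t_ms - self.dwell_start_ms >= DWELL_MS:
            self.dwell_start_ms = t_ms  # reset so the same key can repeat
            return key_under_gaze
        return None


# Simulated gaze samples: (key under gaze, timestamp in milliseconds).
samples = [("H", 0), ("H", 400), ("H", 900), ("I", 1000), ("I", 1500), ("I", 1900)]
sel = DwellSelector()
typed = [k for k in (sel.update(key, t) for key, t in samples) if k]
print("".join(typed))  # prints "HI"
```

In a real device the samples would come from the eye camera's gaze-estimation pipeline, and the emitted keys would accumulate in the text buffer that the "speak" button sends to the speech synthesizer.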