Mind-Controlled Systems for Human-to-Machine Interaction with Voice Assistants
Implementing mind-controlled systems in smartphone applications enhances human-to-machine interaction with voice assistants.
Visual Component Prototype Description
In order to demonstrate the thesis statement for this project, I will create an interactive iOS application controlled by a brainwave controller (e.g., the Emotiv EPOC EEG headset or the NeuroSky MindWave) that allows the user to send a limited set of mental commands to the iPhone. The custom application will consist of Objective-C algorithms that listen for input from the brainwave controller, a custom natural language understanding (NLU) feature that processes the brainwave information, and methods that execute the mental commands on the smartphone.
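The three-part pipeline described above (brainwave input listener, limited NLU, command execution) can be sketched as follows. This is a minimal, language-agnostic illustration in Python, since the final application will be written in Objective-C; all class names, event labels, and intent names here are hypothetical placeholders, not actual Emotiv or NeuroSky SDK calls.

```python
# Hypothetical sketch of the proposed pipeline: listen for brainwave
# events, interpret them with a limited NLU layer, then execute the
# matching command on the device. No real SDK calls are made here.

from dataclasses import dataclass

@dataclass
class MentalEvent:
    """A simulated event from the brainwave controller."""
    label: str          # e.g. "push", "pull", "lift" (placeholder labels)
    confidence: float   # detection confidence reported by the headset

class LimitedNLU:
    """Maps raw mental-event labels to a small set of app intents."""
    INTENTS = {
        "push": "open_menu",
        "pull": "go_back",
        "lift": "activate_button",
    }

    def interpret(self, event, threshold=0.6):
        # Ignore low-confidence readings to reduce false triggers.
        if event.confidence < threshold:
            return None
        return self.INTENTS.get(event.label)

class CommandExecutor:
    """Stands in for the Objective-C methods that act on the iPhone."""
    def execute(self, intent):
        return f"executing {intent}" if intent else "ignored"

def run_pipeline(events):
    nlu, executor = LimitedNLU(), CommandExecutor()
    return [executor.execute(nlu.interpret(e)) for e in events]

if __name__ == "__main__":
    # The low-confidence "lift" event is discarded by the NLU layer.
    events = [MentalEvent("push", 0.9), MentalEvent("lift", 0.3)]
    print(run_pipeline(events))
```

The confidence threshold illustrates one design concern for the real prototype: mental-command classifiers produce noisy output, so the NLU stage should filter weak detections before any device action is triggered.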
The prototype will also allow the user to issue voice commands by waving a hand over a sensor on the front of the device, activating voice recognition to listen for commands without touching the iPhone. Including this feature will speed up prototype development on the application side for testing before the link between the brainwave controller and the device is established. The application will use Siri speech recognition and will respond with the Siri voice based on the processing of the limited NLU feature. The limits established for the NLU feature will allow the user to execute a number of functions on the iPhone, such as, but not limited to, those listed below:
- Opening a new view or menu
- Activating a button action
- Opening the Maps app for an address
- Playing a video
- Making a phone call
- Moving a virtual ball
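The limited NLU feature outlined above could take the form of a small command table that maps recognized phrases to the supported device functions. The sketch below is a hypothetical illustration in Python (the real implementation will be in Objective-C); the phrase patterns and action names are placeholder assumptions, not part of Siri's API.

```python
# Hypothetical command table for the limited NLU feature: each phrase
# pattern maps to one of the prototype's supported device actions.
# Action names are placeholders for the app's Objective-C handlers.
import re

COMMANDS = [
    (r"open (the )?maps?( app)? for (?P<address>.+)", "open_maps"),
    (r"play (the )?video", "play_video"),
    (r"call (?P<contact>.+)", "make_call"),
    (r"move (the )?ball (?P<direction>left|right|up|down)", "move_ball"),
]

def interpret(utterance):
    """Return (action, parameters) for a recognized phrase, else None."""
    text = utterance.lower().strip()
    for pattern, action in COMMANDS:
        match = re.fullmatch(pattern, text)
        if match:
            # Keep only the named parameters that actually matched.
            params = {k: v for k, v in match.groupdict().items() if v}
            return action, params
    return None

if __name__ == "__main__":
    print(interpret("Call Alice"))          # ('make_call', {'contact': 'alice'})
    print(interpret("Move the ball left"))  # ('move_ball', {'direction': 'left'})
```

Constraining the recognizer to a fixed table like this is what makes the NLU feature "limited": unrecognized utterances simply return nothing, which keeps the prototype's behavior predictable during testing.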
Visual Component Action Plan
The full visual component for the prototype will consist of a mockup of the iPhone application and its components, plus the working prototype (documented on video). A schedule of major milestones for the visual component is listed below:
ITGM 755 – By the end of ITGM 755, a mockup and a working prototype of the iPhone Siri portion will be completed, minus the connection to the brainwave controller. During this time I will investigate the Emotiv EPOC brainwave controller/SDK and other hardware to verify whether this particular hardware or an alternative will be used. Some limited brain-to-machine interaction may take place to evaluate the system. The iPhone virtual assistant will be voice operated for this term so that the base working system can be verified and tested; the brainwave controller will be added in the next phase, during the following term. I will also implement the visual design elements for the project, using available graphic design software for the user interface.
ITGM 765/Thesis Review – During this stage, the working iPhone application will be coupled with communication from the brainwave controller, and the mental command processing will be implemented. The mind-controlled application will be demonstrated to show the enhancements of mind control over voice control in the iPhone application. The findings will be documented on video to show the working visual component.