Three Topic Pitches
The purpose of this document is to propose three concepts for the development of an M.F.A. thesis topic. Each concept below identifies a specific area of exploration and a mode of implementation. The following concepts are presented:
- Augmented Reality in Mobile Games to Improve Family Relationships
- Gesture Recognition for Touch-less Gaming Systems
- Mind-Controlled Systems for Human-to-Machine Interaction with Voice Assistants
Each pitch describes how the proposed system would enhance or extend what is currently available. Because the author is actively involved in pushing the boundaries of iOS development, these systems will make use of the iOS SDK and the iPhone as the primary means of exploring the technology's benefits. Additional hardware for augmented reality, gesture recognition, and mind-controlled systems will be integrated into the development component to support the ideas presented.
Topic Pitch #1 – Augmented Reality in Mobile Games to Improve Family Relationships
With the proliferation of mobile device gaming worldwide, consumers have also seen other technologies extended to enhance the gaming experience. Augmented reality has enhanced gaming by allowing consumers to mix real-time camera views with digital graphic overlays. In the last few years, augmented reality has been applied not only in games but also in art, military, and navigation applications. Since augmented reality draws on a variety of sensors and hardware, including cameras, accelerometers, GPS, and compass-based direction tracking, it seems natural to explore how gaming experiences can be enhanced by augmented reality versus a device-only implementation.
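As a concrete illustration of how these sensors combine, the sketch below is a platform-neutral Python prototype of logic that would ultimately live inside an iOS app: given the device's GPS position and compass heading, it computes where a geo-anchored overlay should appear on screen. The function names, screen width, and field-of-view value are illustrative assumptions, not part of the proposal.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees [0, 360)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return math.degrees(math.atan2(y, x)) % 360

def overlay_x(device_lat, device_lon, heading_deg, target_lat, target_lon,
              screen_width=320, h_fov_deg=60):
    """Horizontal pixel position of a geo-anchored overlay, or None when the
    target lies outside the camera's assumed horizontal field of view."""
    offset = bearing_deg(device_lat, device_lon, target_lat, target_lon) - heading_deg
    offset = (offset + 180) % 360 - 180          # wrap to [-180, 180)
    if abs(offset) > h_fov_deg / 2:
        return None                              # target is off screen
    # Map the angular offset linearly onto the screen's horizontal axis.
    return screen_width / 2 + (offset / (h_fov_deg / 2)) * (screen_width / 2)
```

A target directly ahead of the camera lands at the center of the screen; one outside the field of view is simply not drawn. In a real implementation the same fusion of GPS, compass, and camera data would be fed by Core Location and the device's motion sensors.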
Many people use their iPhones for entertainment by playing casual games, often in the comfort of their own homes. However, for some families, games can also create an isolated environment, even when members of the same family are playing the same game on their own mobile devices while sitting in the same room. Interaction between family members sitting next to one another decreases as each is absorbed in an individual gaming pursuit.
One question to ask is how this type of casual activity can be transformed into a gaming experience that also strengthens the family unit through active interaction. This pitch proposes that augmented reality mobile games that combine mobile robotic devices with multiplayer capabilities can enhance family relationships by promoting active interaction. The project would develop an iOS game built around an augmented-reality-capable robotic device such as Sphero, and a game design that promotes positive acts of kindness would demonstrate the technology's value as a family-strengthening tool.
Topic Pitch #2 – Gesture Recognition for Touch-less Gaming Systems
Many people who play games on smartphones and desktop PCs are familiar with touch-screen gestures and mouse movements for interacting with elements on the screen. Swipe, tap, pinch, and click have become part of everyday interaction. For people who have not previously experienced this type of interaction, gestures such as pinching and tap-and-hold are not immediately obvious, and confusion grows when different devices call for different interaction techniques. With all these gestures occurring on the screen, fingers can also occlude game elements and reduce the entertainment value, especially in games that require one to touch the player character to move it on the small screen of a mobile device.
The increased availability of gesture-recognition and depth-sensing cameras to developers has opened new areas of interaction in gaming. Since many smartphones, such as the iPhone, include a front-facing camera, that camera could serve as an input device for hand tracking and other remote gestures to control game elements and characters on the device screen. For PCs, depth-sensing cameras could provide remote gestures as an interaction and control system for PC games.
To explore remote gestures as a primary user interface for touch-less gaming, the project would create a system for tracking hand movements within a game on the iPhone and on a desktop PC. It would pair depth-sensing cameras and the iPhone's built-in camera with custom algorithms that control visual elements in a gaming environment. Both a touch and a touch-less mode of play would be created for the game system, and each mode would be evaluated to assess the benefits of the system.
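As a rough sketch of the kind of control algorithm involved, the platform-neutral Python prototype below classifies a tracked fingertip path, as a hand-tracking camera might report it in normalized screen coordinates, into a tap or a directional swipe. The thresholds and function name are illustrative assumptions.

```python
def classify_gesture(points, swipe_dist=0.15, tap_radius=0.03):
    """Classify a fingertip path (list of (x, y) in normalized [0, 1] screen
    coordinates) as 'tap', a directional swipe, or None when ambiguous."""
    if len(points) < 2:
        return "tap"
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= tap_radius:
        return "tap"                 # barely moved: treat as a tap
    if dist < swipe_dist:
        return None                  # moved some, but not enough to be a swipe
    # Dominant axis decides the swipe direction (y grows downward on screens).
    if abs(dx) >= abs(dy):
        return "swipe-right" if dx > 0 else "swipe-left"
    return "swipe-down" if dy > 0 else "swipe-up"
```

In the actual system, the input points would come from the depth-sensing camera on the PC or from hand-tracking on the iPhone's camera feed, and the classifier's output would drive the same game actions as on-screen touch gestures, allowing a direct comparison between the two modes of play.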
Topic Pitch #3 – Mind-Controlled Systems for Human-to-Machine Interaction with Voice Assistants
In recent years, several EEG/brainwave systems and SDKs have been released to developers for research or development of mind-controlled interfaces. Many of these systems consist of some type of wearable sensor, such as a headset fitted with electrodes that detect electrical activity from the brain. The potential use of these brainwave-analyzing systems in games and other interactive media has attracted much interest. The idea of controlling a game or a computer program with the mind is alluring and exciting to many.
Another area drawing increased interest is speech recognition in virtual assistants, such as the iPhone's Siri. While these assistants are gaining popularity, some users still have difficulty communicating effectively with them. Even the best natural-language understanding fails if the system mis-hears the spoken words. These speech-recognition systems depend on quality microphones and quiet speaking environments to be useful; if the speaker does not speak clearly or the background noise is too loud, the system may fail to recognize the words.
The use of mind-controlled systems, such as the Emotiv Epoc, as an integrated part of a voice-recognition assistant could enhance the assistant's effectiveness. To explore this avenue, an iPhone application would be created that uses Siri-style speech recognition as its core component. The application would allow interactivity through both voice recognition and brainwave control, and it would use a custom natural-language-understanding algorithm to perform user-initiated tasks on the iPhone and provide feedback to the user. Evaluations of the working system would assess its effectiveness in improving the quality of the human-machine interface.