Compared to image recognition, object detection is a considerably more challenging task because it requires accurate, real-time localization of an object in the target image. In interaction scenarios, this pipeline can be simplified by incorporating the user’s point of regard. Wearable eye trackers can estimate gaze direction, but they lack their own processing capabilities. We enable mobile gaze-aware applications by developing an open-source platform that supports mobile eye tracking based on the Pupil headset and a smartphone running Android OS. Through our platform, we offer researchers and developers a rapid prototyping environment for gaze-enabled applications. We describe the concept, our current progress, and implications for research.