Nowadays, humans are surrounded by many complex computer systems. When people interact with one another, they use multiple modalities, including voice, body posture, hand gestures, facial expressions, and eye gaze. Computers currently understand only a small subset of these modalities, but such cues can be captured by an increasing number of wearable devices. This research aims to improve traditional human-human and human-machine interaction by augmenting humans with wearable technology and developing novel user interfaces. More specifically, (i) we investigate and develop systems that enable a group of people in close proximity to interact using in-air hand gestures and to share information effortlessly, and (ii) we focus on eye gaze, which can further enrich the interaction between humans and cyber-physical systems.