I will talk about our new technology, HuddleLamp (http://huddlelamp.org), which tracks the positions of multiple uninstrumented, off-the-shelf mobile devices in a room or on a desk. This real-time information about the spatial (or proxemic) relations between devices enables new designs of cross-device interactions and distributed applications. Apart from demonstrating the potential of such cross-device interaction for (collaborative) sensemaking, visualization, and knowledge work, I will also talk about several studies in which we elicited cross-device gestures from users or experimentally measured how physically moving devices or hands in space affects human cognition (e.g., memory performance).
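To make the idea of proxemic relations between devices concrete, here is a minimal TypeScript sketch of how an application might consume such tracking data: it pairs two devices for a cross-device view once they lie roughly edge-to-edge and share an orientation. All names (`Pose`, `areHuddled`, `pairDevices`) and thresholds are illustrative assumptions, not HuddleLamp's actual API.

```typescript
// Hypothetical sketch of consuming HuddleLamp-style tracking data.
// The types and thresholds below are assumptions for illustration,
// not HuddleLamp's real API or published parameters.

interface Pose {
  x: number;      // tabletop position in millimetres
  y: number;
  angle: number;  // orientation in degrees
  width: number;  // physical device size in millimetres
  height: number;
}

type PoseMap = Map<string, Pose>; // device id -> latest tracked pose

/** Euclidean distance between the centres of two tracked devices. */
function distance(a: Pose, b: Pose): number {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

/**
 * Treat two devices as "huddled" (candidates for a shared cross-device
 * view) when they sit roughly edge-to-edge with a similar orientation.
 */
function areHuddled(a: Pose, b: Pose): boolean {
  const touchingGap = 30; // mm of tolerance between device edges (guess)
  const maxGap = (a.width + b.width) / 2 + touchingGap;
  const angleDiff = (((a.angle - b.angle) % 360) + 360) % 360;
  const aligned = angleDiff < 15 || angleDiff > 345;
  return distance(a, b) < maxGap && aligned;
}

/** Pair up all devices that currently satisfy the huddle criterion. */
function pairDevices(poses: PoseMap): [string, string][] {
  const ids = [...poses.keys()];
  const pairs: [string, string][] = [];
  for (let i = 0; i < ids.length; i++) {
    for (let j = i + 1; j < ids.length; j++) {
      if (areHuddled(poses.get(ids[i])!, poses.get(ids[j])!)) {
        pairs.push([ids[i], ids[j]]);
      }
    }
  }
  return pairs;
}

// Example: two tablets lying side by side on a desk, one phone further away.
const poses: PoseMap = new Map([
  ["tablet-a", { x: 0,   y: 0,   angle: 0,  width: 240, height: 170 }],
  ["tablet-b", { x: 250, y: 5,   angle: 2,  width: 240, height: 170 }],
  ["phone-c",  { x: 600, y: 400, angle: 90, width: 70,  height: 140 }],
]);

console.log(pairDevices(poses)); // -> [["tablet-a", "tablet-b"]]
```

In a real deployment the pose map would be refreshed continuously from the tracking pipeline, and a pairing event like the one above could trigger, for instance, extending a visualization across the two adjacent screens.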