This paper presents an interactive exploration platform and toolset for spatial, big-data auditory display. The exploration platform is part of the Citygram project, which focuses on geospatial research through a cyber-physical system that automatically streams, analyzes, and maps urban environmental acoustic energies. Citygram currently concentrates on dynamically capturing geo-tagged, low-level audio feature vectors from urban soundscapes. These feature vectors are measured and computed via Android-based hardware, traditional personal computers, and mobile computing devices equipped with a microphone and an Internet connection. The low-level acoustic data streams are then transmitted to, and stored in, the Citygram database. These data can then be used for auditory display, sonification, and visualization by external clients interfacing with the Citygram server. Client users can stream data bidirectionally via custom software built on Cycling '74's Max and SuperCollider, allowing for participatory citizen-science engagement in auditory display.
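The sensing pipeline described above, where a device computes low-level audio features from a captured frame, attaches a geo-tag and timestamp, and transmits the result as a message to the server, can be sketched as follows. This is a minimal illustration, not the Citygram implementation: the feature choices (RMS energy and zero-crossing rate), the JSON message layout, and names such as `rsd-demo` are assumptions made for the example.

```python
import json
import math
import time

def extract_features(frame):
    """Compute two low-level features from one audio frame:
    RMS energy and zero-crossing rate (illustrative choices)."""
    n = len(frame)
    rms = math.sqrt(sum(s * s for s in frame) / n)
    zcr = sum(
        1 for a, b in zip(frame, frame[1:]) if (a < 0) != (b < 0)
    ) / (n - 1)
    return {"rms": rms, "zcr": zcr}

def make_payload(frame, lat, lon):
    """Bundle the feature vector with a geo-tag and timestamp,
    as a remote sensing device might transmit to a server.
    The field names here are hypothetical."""
    msg = {
        "device_id": "rsd-demo",  # hypothetical device identifier
        "lat": lat,
        "lon": lon,
        "timestamp": time.time(),
        "features": extract_features(frame),
    }
    return json.dumps(msg)

# A 440 Hz test tone frame sampled at 8 kHz.
frame = [math.sin(2 * math.pi * 440 * i / 8000) for i in range(256)]
payload = make_payload(frame, 40.7295, -73.9965)  # illustrative coordinates
```

In a real deployment the serialized payload would be sent over the network (e.g., via HTTP or OSC) rather than merely constructed, and the server side would index it by location and time for later sonification and visualization clients.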