February 27, 2015 by Maria Angela Ferrario
The Clasp project kicked off on 3rd February with a brief planning session for the Clasp 18-month agile roadmap – as a taster, here is our roadmap for the next six months. This was followed by a hands-on workshop on personal data interactions, during which we explored the wonders and perils of using Clasp Moodcons to turn our sleep data into NightShapes. Here we briefly explain how.
Let’s start from the beginning: there were seven of us (three core research team members and four key stakeholders). MAF facilitated the workshop and asked participants to write down the five words that, in their opinion, best define ‘personal data’. Participants also brought examples of personal data to the table, along with the opportunities and challenges of interacting with personal data across all the steps of the data life-cycle.
Research shows that there is an association between anxiety levels and sleep patterns, including in people with ASD. The Clasp research team thinks that there is little more personal than data captured during our sleep. The team first tried on themselves what it means to have your own sleep data automatically captured, represented and discussed in a group setting. We tried both automatic capture (and sharing) of sleep data through wearables like FitBit, and personal descriptions. Through this process we came up with the idea of representing sleep data as organic matter, like corals, and presented some of our initial experiments with code linking to Shapeways ShapeJS to automatically generate 3D visualisations of sleep data.
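To give a flavour of the kind of mapping involved, here is a minimal Python sketch of turning an hour of sleep data into parameters for a coral-like shape. The field names and mapping rules are illustrative assumptions for this post, not the actual ShapeJS code we used:

```python
# Hypothetical sketch: map one hour of sleep data to coral-like shape parameters.
# The inputs and mapping rules are illustrative, not our actual ShapeJS pipeline.

def sleep_hour_to_shape(minutes_asleep, times_restless):
    """Map one hour of wearable sleep data to simple shape parameters.

    minutes_asleep: 0-60, minutes actually asleep in that hour
    times_restless: count of restless episodes recorded by the wearable
    """
    radius = 5 + (minutes_asleep / 60) * 10        # deeper sleep -> larger blob
    spikes = min(times_restless, 12)               # restlessness -> more spikes
    smoothness = max(0.0, 1.0 - times_restless / 12)
    return {"radius_mm": radius, "spike_count": spikes, "smoothness": smoothness}

# A night of sleep becomes a list of shape descriptions, one per hour:
night = [(55, 1), (60, 0), (40, 5), (20, 9)]
shapes = [sleep_hour_to_shape(m, r) for m, r in night]
```

Parameters like these could then be fed to a parametric 3D-modelling tool to generate a printable form, one shape per hour of sleep.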
Back to the workshop: the team was invited to represent the previous night’s sleep by combining specially designed and 3D-printed ‘Clasp Moodcons’. Moodcons come in a set of different shapes: some are round, some have spikes. None have specific meanings apart from being ‘data’. Participants are free to add qualities to this data by colouring the shapes, cutting them, or both. During this workshop each Moodcon represented the aggregated data of one hour of sleep. Pipe cleaners were then used to connect the aggregated data units, and white sheets of paper were used by the participants as metadata.
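The aggregation step behind each Moodcon can be sketched in code. The following Python illustration assumes a simple minute-level record format of `(minute_index, state)` pairs (an assumption for this post, not the FitBit API) and collapses them into the hourly units a single Moodcon stands for:

```python
from collections import defaultdict

# Illustrative sketch: aggregate minute-level sleep samples into hourly units.
# The (minute_index, state) record format is an assumption, not the FitBit API.
# state is one of: "asleep", "restless", "awake".

def aggregate_by_hour(samples):
    """Collapse one night of minute-level samples into per-hour counts."""
    hours = defaultdict(lambda: {"asleep": 0, "restless": 0, "awake": 0})
    for minute, state in samples:
        hours[minute // 60][state] += 1
    return dict(sorted(hours.items()))

samples = ([(m, "asleep") for m in range(0, 50)]
           + [(m, "restless") for m in range(50, 60)]
           + [(m, "awake") for m in range(60, 70)])
hourly = aggregate_by_hour(samples)
# hourly[0] -> {"asleep": 50, "restless": 10, "awake": 0}
```

Each entry in `hourly` corresponds to one Moodcon; its mix of asleep, restless and awake minutes is what participants expressed physically through shape, colour and cuts.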
The resulting NightShapes were then securely stored in a box with their metadata, individually picked up at random by the participants, and ‘guessterpreted’.
Overall, the ‘guessed’ interpretations were fairly accurate, even when the Moodcons had been altered (e.g. broken or painted in different colours) to convey very specific sleep qualities. This may seem trivial, but it indicates that, while we may all be different, we convey individual meanings in specific and recognisable patterns. However, in most cases the metadata that would otherwise have added context to the data was ignored.
Key findings were summarised and reported back during our second meet-up and workshop, “Technology Kitchen”, which focused on the material aspects and functionalities of data-sensing devices.