Adding sensors to our computers revolutionized them. I remember buying my first computer paddles, my first mic, and my first webcam, and marveling at the incredible new features unlocked by giving computers a way to sense and respond to the physical world.
1/
Today, our devices are stuffed with enough sensors to beggar the imagination. My latest phone has FOUR cameras, multiple mics, thermal sensors, and, of course, an accelerometer that lets the system measure how it's moving from moment to moment.
2/
Device security and privacy models treat cameras and mics as sensitive and control how apps access them, but accelerometers are treated as utilities, the kind of thing that apps should be able to tap into at will without risk to the user.
That's a bad assumption.
3/
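To make that asymmetry concrete, here's a minimal Kotlin/Android sketch (the class name and logging are mine, purely illustrative): reading the accelerometer needs no manifest entry and triggers no permission prompt, unlike the CAMERA or RECORD_AUDIO permissions that gate the "sensitive" sensors.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import android.util.Log
import kotlin.math.sqrt

// Illustrative only: an ordinary app component that reads motion data with zero friction.
class MotionLogger(context: Context) : SensorEventListener {
    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager

    // No AndroidManifest.xml permission, no runtime dialog; compare with the camera or mic.
    fun start() {
        val accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER) ?: return
        sensorManager.registerListener(this, accel, SensorManager.SENSOR_DELAY_GAME)
    }

    override fun onSensorChanged(event: SensorEvent) {
        val (x, y, z) = Triple(event.values[0], event.values[1], event.values[2])
        // Raw acceleration in m/s^2; nothing stops an app from shipping this off-device.
        Log.d("MotionLogger", "|a| = ${sqrt(x * x + y * y + z * z)}")
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
}
```

(Recent Android versions do gate very high sampling rates behind an extra permission, but the ordinary rates an app gets by default are enough for most of the inferences described below.)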
In "Privacy Implications of Accelerometer Data: A Review of Possible Inferences," a trio of researchers from the Technical University of Berlin document the surprising ways in which acclerometer data can be used to infer sensitive facts about users.
Co-author @JL_Kroger did a great job of breaking down the team's findings, and emphasizing the gap between device permission models and the kinds of wide-ranging inferences that accelerometers enable.
He cites "patents and literature of diverse disciplines" that reveal users' "daily routines, physical activities, social interactions, health condition, gender, age, and emotional state" just by analyzing accelerometer data.
6/
The way you move has a sufficiently unique signature that accelerometers can identify you as the person carrying a device. The same techniques can infer your driving style, whether you are intoxicated, and, through dead reckoning, where you are - even without a GPS fix.
7/
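The dead-reckoning point is easy to see in code. Here's a deliberately naive Kotlin sketch (all names are mine, not from the paper): double-integrate the acceleration samples and you get a displacement estimate with no GPS involved. Real attacks are far more sophisticated, correcting for gravity, orientation and sensor drift, but the raw material is just this stream of numbers.

```kotlin
// Naive dead reckoning: integrate acceleration twice to estimate displacement.
// Illustrative sketch only; error accumulates quickly without the corrections
// that published attacks apply.
data class Sample(val ax: Double, val ay: Double, val dtSeconds: Double) // linear acceleration, m/s^2

fun deadReckon(samples: List<Sample>): Pair<Double, Double> {
    var vx = 0.0; var vy = 0.0 // velocity, m/s
    var px = 0.0; var py = 0.0 // displacement from the starting point, metres
    for (s in samples) {
        vx += s.ax * s.dtSeconds // acceleration -> velocity
        vy += s.ay * s.dtSeconds
        px += vx * s.dtSeconds   // velocity -> position
        py += vy * s.dtSeconds
    }
    return px to py // where the phone ended up, no GPS fix required
}
```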
Alarmingly, accelerometers can be repurposed as crude mics, picking up sound vibrations well enough to enable speech and keyword detection.
8/
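To give a flavour of how a motion sensor becomes a crude mic, here's a small Kotlin sketch of a Goertzel filter, a standard way to measure a signal's energy at a single frequency. Feed it a stream of accelerometer readings and it tells you how strongly the device is vibrating at, say, 180Hz; published attacks layer speech models on top of exactly this kind of frequency analysis. The function and parameters here are illustrative, not taken from the paper.

```kotlin
import kotlin.math.PI
import kotlin.math.cos

// Goertzel filter: energy of `signal` at `targetHz`, given samples taken at `sampleRateHz`.
// Vibration energy in speech-range bands is the raw material for eavesdropping attacks.
fun goertzelPower(signal: DoubleArray, sampleRateHz: Double, targetHz: Double): Double {
    val n = signal.size
    val k = Math.round(n * targetHz / sampleRateHz).toInt()
    val coeff = 2.0 * cos(2.0 * PI * k / n)
    var sPrev = 0.0
    var sPrev2 = 0.0
    for (x in signal) {
        val s = x + coeff * sPrev - sPrev2
        sPrev2 = sPrev
        sPrev = s
    }
    return sPrev * sPrev + sPrev2 * sPrev2 - coeff * sPrev * sPrev2
}
```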
Accelerometer analysis is imperfect and computationally intensive, but it's still worrying, especially in light of the lack of protection for accelerometer data in mobile OSes.
9/
The authors are skeptical that an "informed consent" model will fix this, in part because accelerometer data has a lot of nonobvious uses (correcting photo jitter, say), but also because of the well-theorized flaws in digital consent.
Kröger refers us to Sec 7 of another paper he co-authored on consent and privacy, "The myth of individual control," for some theoretical ways of striking a balance between privacy and functionality.
This section rejects "self-management" of privacy settings as ineffective and proposes things like institutionally administered "social impact assessments" that evaluate "the consequences of information use and misuse."
12/
Kröger admits "that most existing ideas in this area are still vague and hypothetical" and calls for "urgent" further research (that sounds right to me, too).
13/
He concludes by pointing out that there are many other potentially compromising inference techniques latent in all sensor data, and recommends his related paper, "Privacy Implications of Voice and Speech Analysis."
ETA - If you'd like an unrolled version of this thread to read or share, here's a link to it on pluralistic.net, my surveillance-free, ad-free, tracker-free blog:
There's a difference between a con-artist and a grifter. A con-artist is just a gabby mugger, and when they vanish with your money, you know you've been robbed.
1/
A grifter, on the other hand, is someone who can work the law to declare YOUR stuff to be THEIR stuff, which makes you a lawless cur because your pockets are stuffed full of THEIR money and merely handing it over is the least you can do to make up for your sin.
3/