Sarah Jamie Lewis @SarahJamieLewis
In the Dat protocol white paper under Network Privacy, there is a section that reads: "There is an inherent tradeoff in peer to peer systems of source discovery vs. user privacy."

I disagree with that statement, and with the impact the resulting design decisions have on privacy.

Some notes:
Dat defines source discovery as finding the IP:Port pairing of a peer that has access to the data you want.

I really wish we wouldn't build p2p networks directly on top of IP addresses. We have better overlay tech.
To clarify some terms for Twitter: Content Discovery and Source Discovery are two different (but related) problems.

CD: Finding that some content exists
SD: Knowing where to get it.

An old web example:
CD: Google Search
SD: The resulting Website
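The distinction can be sketched as two separate lookups. This is a hypothetical illustration, not any real protocol's API; the indexes and document IDs are made up:

```python
# Content discovery: query -> content identifiers known to exist.
content_index = {
    "cat photos": ["doc-a1b2", "doc-c3d4"],
}

# Source discovery: content identifier -> peers (IP:Port) holding it.
source_index = {
    "doc-a1b2": [("203.0.113.7", 3282), ("198.51.100.9", 3282)],
}

def discover_content(query):
    """CD: learn that some content exists."""
    return content_index.get(query, [])

def discover_sources(doc_id):
    """SD: learn where to get it."""
    return source_index.get(doc_id, [])

for doc in discover_content("cat photos"):
    print(doc, "->", discover_sources(doc))
```

Note that the privacy properties of the two lookups differ: a CD query reveals a vague interest, while an SD query reveals intent to fetch one specific document.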
So, if I have a document identifier the problem becomes how do I translate that to the actual document.

(In old web terms: I have a URI, I do DNS resolution to get an IP address, I initiate an HTTP connection, and the server sends me the content)
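That old-web pipeline can be sketched as a single composition: URI → DNS → IP → HTTP → content. The resolver and HTTP client are injected as stand-ins here so the flow is visible without touching the network; the table entries are illustrative:

```python
from urllib.parse import urlparse

def fetch(uri, resolve, http_get):
    """Resolve the host in `uri` to an IP, then fetch the path over HTTP."""
    parsed = urlparse(uri)
    ip = resolve(parsed.hostname)            # DNS: name -> IP (this IS source discovery)
    return http_get(ip, parsed.path or "/")  # one server holds the content and serves it

# Stand-in resolver and HTTP client for demonstration:
dns_table = {"example.com": "93.184.216.34"}
documents = {("93.184.216.34", "/index.html"): "<html>hello</html>"}

body = fetch("http://example.com/index.html",
             resolve=dns_table.__getitem__,
             http_get=lambda ip, path: documents[(ip, path)])
print(body)  # prints <html>hello</html>
```

The key property: exactly one party (plus the DNS resolver) learns what you asked for.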
p2p tech complicates the problem definition because a document identifier no longer belongs to a single server, it has been published to the network and could be living anywhere, in multiple places, hosted by peers.
And thus, the Dat paper states that this problem makes privacy hard: a peer has to ask other peers whether they know where to get the document, and the more peers it asks, the more peers know it has asked for that document.
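The leak the paper describes can be made concrete: every peer you query learns that *you* want *that* document, whether or not it can serve it. A toy sketch (illustrative names, not Dat's actual wire protocol):

```python
observed = []  # the metadata that third parties accumulate

def ask_peer(peer, asker, doc_id, holdings):
    """Ask `peer` if it holds `doc_id`; the peer logs the query either way."""
    observed.append((peer, asker, doc_id))  # the peer now knows the asker's interest
    return peer in holdings.get(doc_id, set())

holdings = {"doc-a1b2": {"peer3"}}
peers = ["peer1", "peer2", "peer3"]

sources = [p for p in peers if ask_peer(p, "alice", "doc-a1b2", holdings)]
# Finding one source leaked alice's interest to all three peers:
print(sources, len(observed))
```

This is the "tradeoff" as stated; the thread's argument is that the leak is a design choice, not an inherent one.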
So the first thing I will say is that Freenet presented solutions to this problem, with strong guarantees for reader anonymity and publisher anonymity, nearly two decades ago.

Again: We've known that we can do private source discovery in p2p networks for literal decades.
2) We now have networks like Tor and I2P which present really neat peer addressing solutions that anonymize IP endpoints and protect publishers and readers.
3) We *know* that metadata analysis is *the* thing that drives mass surveillance systems.

Why in 2018 are we building new p2p networks that don't offer any reasonable privacy guarantees against mass surveillance capable adversaries?!
It seems like a large number of people on my timeline are talking about "the new web" and I'm sitting here like...these technologies and protocols have learned none of the security lessons we've been taught in the last few decades.
The point of p2p tech is to distribute trust.

You can't distribute trust without consent.

You can't consent without meaningful privacy.

Privacy should be a foundational element in any p2p stack, not a challenge, or a footnote, or a "maybe we will get to this in the future".
And I picked on Dat, but I want to clarify that it isn't just Dat, this problem is pervasive in the new generation of p2p tech in this space.
And I don't know why.

Maybe developers think that anonymizing networks are slow and have limited bandwidth (partially true, but mostly a resource issue, not a fundamental tech issue).
Maybe it's a knowledge problem.

Surveillance and privacy are marginalized issues that impact different communities unevenly.

Solutions to these problems exist but may not be considered by those building the systems as high priority.
Maybe I'm being too demanding of a bunch of community-led, open source projects, expecting them to consider use cases that they don't have the bandwidth to consider.
I want to live in a world where we have a diverse set of p2p solutions. I want these projects to thrive. Fundamentally I think that is the only way we can hope to achieve decentralization of trust and a free and open internet ecosystem that resists censorship and surveillance.
But we have to build privacy into these systems from the ground up, at the platform layer, not the application layer. Application-layer privacy does.not.work.
I'm not sure if the current generation of systems can have privacy built into them...my experience and intuition say "probably not". Privacy is really hard to layer onto a system after design.
Privacy is not an optional design element. When you refuse to build privacy into a system you are further marginalizing populations, enforcing censorship and encouraging surveillance.
When you refuse to build privacy into a system you are stating that you believe that only certain types of people should be able to use your system, and only for certain things.

You might not intend that, but that is fundamentally the result.
So this is an invitation, tell me what you need to make p2p privacy happen.

Is it education about solutions? Is it programming libraries? Is it user stories? Is it design reviews? Something else?

It's part of my job at @OpenPriv to make this happen.

sarah@openprivacy.ca