DRM for Privacy


Building an uncrackable, foolproof and practical Digital Rights Management (DRM) system is an ongoing effort. Today's DRM systems all aim to protect the content producer's interests - they have little, if any, interest in protecting the rights of the consumers. To put it bluntly, the content producer produces the content (like a movie or a music file) and protects it using a DRM system. The consumer, who pays for the right to use the file, is then unable to copy or share it.

This kind of scenario might seem unfair. However, no technology is automatically evil or bad. How the technology is used determines its merits. This also applies to DRM. For instance, if the producer/consumer relationships in DRM systems were reversed, an interesting "good" use for DRM technology could be envisioned.

Let's try a thought experiment. The ordinary consumer (you or me) becomes the content provider. The content provider (the corporation) becomes the consumer. We have now reversed the roles.

But what content could the consumer-turned-content-provider, the ordinary person, you or me (let's call this actor the "user"), produce? What would be valuable and scarce to the corporation yet abundant on the user's side? One answer is personal data.

Upon a request from some corporation, the user may decide to accept it. The user then creates a DRM-protected file containing the personal data the user wishes to reveal (a rough sketch of what such a package might look like follows the list below). With DRM technology in place (the same technology used to protect e.g. movies), the user can be sure that the corporation is not able to:

  • use the personal data after the license period (e.g. 2 hours) has expired
  • share the personal data with third-party companies without permission
  • do other unauthorized nasty things with the personal data
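
To make the mechanics a bit more concrete, here is a minimal sketch of what such a user-issued licence could look like. Everything in it is hypothetical: the field names, the signing key and the helper functions are mine, not part of any existing DRM system, and a real system would use public-key signatures rather than a shared secret. The sketch only shows the licence structure plus an integrity and expiry check; actual enforcement would have to live inside a trusted client on the corporation's side.

    import hashlib
    import hmac
    import json
    import time

    # Hypothetical user-held signing key.  For simplicity the same key is used
    # for verification below; a real system would use public-key signatures and
    # a certified client on the licensee's side.
    USER_SECRET = b"user-held signing key"

    def issue_licence(payload, licensee, valid_seconds):
        """Wrap personal data in a usage licence controlled by the user."""
        licence = {
            "licensee": licensee,                       # who may use the data
            "expires_at": time.time() + valid_seconds,  # e.g. 2 hours from now
            "may_share_with_third_parties": False,
            "payload": payload,                         # the personal data itself
        }
        body = json.dumps(licence, sort_keys=True).encode()
        signature = hmac.new(USER_SECRET, body, hashlib.sha256).hexdigest()
        return {"licence": licence, "signature": signature}

    def licence_permits_use(signed, licensee):
        """The check a trusted client on the corporation's side would perform."""
        body = json.dumps(signed["licence"], sort_keys=True).encode()
        expected = hmac.new(USER_SECRET, body, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, signed["signature"]):
            return False                                # licence was tampered with
        licence = signed["licence"]
        return licence["licensee"] == licensee and time.time() < licence["expires_at"]

    # Example: reveal an e-mail address to "ExampleCorp" for two hours only.
    signed = issue_licence({"email": "user@example.org"}, "ExampleCorp", 2 * 3600)
    print(licence_permits_use(signed, "ExampleCorp"))   # True, within the 2 hours
    print(licence_permits_use(signed, "OtherCorp"))     # False, wrong licensee

Of course, nothing in this sketch enforces the restrictions by itself; just as with movie DRM, enforcement would depend on certified client software, which is exactly where the trust problem discussed in the responses below comes in.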

Using "evil" DRM technology a very good purpose can be achieved: the preservation of the user's privacy. The user gets to decide who can receive and use the information in the first place. The user can disallow use by companies the user considers not to be trustworthy, or who he considers to be immoral, or annoying, or who have the most idiotic advertisements.

It would only be fair if this kind of application of DRM were taken into wider use. It would guarantee the privacy of each individual's personal data, while at the same time allowing companies access to that data with the user's consent. To me this seems like the best of both worlds.

If this idea gets wider publicity, I'm sure there will be arguments against it. I do hope there will be, as some of those arguments, properly role-reversed, could also be used to argue against the current way of things, i.e. the one-sided implementation of DRM technologies where the consumer is just a necessary evil, an untrusted element standing between the company and the consumer's own wallet.

If the corporations can't trust the consumers to handle the content with proper respect, why should the consumers trust the corporations to handle e.g. their personal data with proper respect? The role-reversal idea can help achieve a balance between these concerns and benefit all parties involved. At the very least, it can provide material for stimulating and interesting discussion.

I sent an edited version of this post to the cryptography mailing list at metzdowd. The subject was "DRM of the mirror universe", in case you wish to read the entire thread via some archive. The idea was shot down; summaries of the responses follow.

  • Paul A. S. Ward stated that companies would simply set their policies to work around any restrictions, along the lines of "give us all rights, or take your business elsewhere".
  • Barney Wolff noted that there is a problem of trust. In ordinary DRM, the supplier of the content has certified the software to be safe in the sense that the valuable information cannot escape outside the realm of protection. This raises the question of how you certify the software (or whom you trust to certify it) that some corporation uses to handle your private information. Mr. Wolff also noted that a piece of personal information (such as HIV-positivity) can be so small (one bit) that it is easy to capture with e.g. a camera. Finally, Mr. Wolff pointed out that nothing stops the user from giving false data in the first place.
    (Note: The last point is especially interesting. It's an established fact that all kinds of systems which demand registration before granting access to new features or more content are at the mercy of the honesty (or dishonesty) of their users.)
  • Matt Blaze commented that this kind of idea had been expressed before. He acknowledged that there is a power imbalance between the corporation and the user, and that correcting this imbalance might be a prerequisite for achieving technology-based privacy defences, whether they are based on DRM or something else.