Failure is Always an Option

What if a formal model isn’t possible?

Privacy may be impossible to isolate from other human values and ideals such as trust, justice or spirituality.  Attempting to model it allows for quick (and painless?) failure.  A formal model does not, for example, allow for a detailed consideration of the harms arising from an informational disclosure, treating them as consequences rather than outcomes (as is traditional in legal scholarship on privacy (Solove, 2005)).

In any case, a formal model encourages debate on privacy across disciplines by allowing privacy to be discussed in a precise and consistent manner.

Aside from these conceptual nuances, the word ‘privacy’ itself varies in meaning across languages, and rightly so.  Half the world’s population speaks one of 13 languages, while the rest speaks a great variety of others.  Computing could serve as a universal language, and the ability to express traditionally social concepts in computational models offers one way forward in our shared understanding and experience.

The attempt may nonetheless fail when applied to privacy: there may be too many definitional complexities, or an effectively infinite number of combinations of factors leading to decisions on information disclosure.  Or perhaps a more complex computational model is required to undertake the scholarship.

An effort that fails still matters: it shows that privacy cannot be entirely formalized today, while keeping alive the important issues of contextual and experiential specificity.  These are questions on the cusp of being dismissed before they can properly be subjected to inquiry.

Another justification lies in how privacy is currently handled in computing itself.

Models for implementing privacy focus almost solely on the computing system’s ability to incorporate legal requirements once an information disclosure has occurred.  For example, now that I have created an email account, there is much work on managing the information I have shared.  However, little or none of it considers the original mechanisms of collection, or decision making on behalf of the data subject in disclosing their information.  To allow such consideration, we have to make privacy speak the language of computing.  If that is simply not possible, then we can end the debate about using legal requirements as a basis for ensuring privacy in computational systems.
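To make the contrast concrete, the sketch below imagines a disclosure decision taken before any information is shared, rather than the management of information afterwards.  It is a minimal, purely hypothetical illustration: the names (DisclosureRequest, decide_disclosure) and the toy purpose-based policy are invented for this example and do not represent a proposed model or any existing system; they only suggest what privacy ‘speaking the language of computing’ at the point of collection might look like.

    # Hypothetical sketch of a pre-disclosure decision step.
    # All names and fields are illustrative assumptions, not an established
    # privacy model or API.
    from dataclasses import dataclass

    @dataclass
    class DisclosureRequest:
        data_subject: str   # whose information is being requested
        recipient: str      # who is asking for it
        attribute: str      # what is being requested, e.g. "email_address"
        purpose: str        # stated purpose of the collection

    def decide_disclosure(request, permitted):
        """Return True only if this attribute may flow to this recipient
        for the stated purpose, *before* any information is shared."""
        allowed_purposes = permitted.get((request.attribute, request.recipient), set())
        return request.purpose in allowed_purposes

    # Example: the data subject (or an agent acting for them) evaluates a
    # request at the moment of collection, e.g. before creating an account.
    policy = {("email_address", "mail_provider"): {"account_creation"}}
    request = DisclosureRequest("alice", "mail_provider", "email_address",
                                "account_creation")
    print(decide_disclosure(request, policy))  # True

The point of the sketch is only where the decision sits: the check runs before the disclosure, whereas most existing work begins after it.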


NB: The 13 languages spoken by half the world’s population, as identified in the Swedish Nationalencyklopedin, are: Mandarin, Spanish, English, Hindi, Bengali, Arabic, Portuguese, Russian, Japanese, Punjabi, German, Javanese and Wu.  English translation courtesy of Wikipedia.