I am honored and excited to carry on the unique work of the Program on Economics & Privacy. PEP is a great match for me because my interest in privacy law grew out of my experience running an economics research program.
The research had nothing to do with privacy, but it relied on quantitative methods and depended heavily on access to data. I began to notice that what I found most exciting about the Big Data revolution, its power to rapidly expand human knowledge and problem-solving, was on a collision course with public distrust and consumer protection regulations. My comfort with, and appreciation for, quantitative methods give me an atypical perspective in the privacy policy community. I have openly questioned whether the most popular privacy proposals define the anticipated social problems with enough clarity, and whether the available evidence provides a sufficient basis for alarm.
PEP promotes and facilitates both types of academic work: research that defines a personal data problem with precision, and research that measures and analyzes the effects of modern data practices. This work is invaluable because we are entering a critical period for the digital economy and its regulation.
Outside of PEP, many policy discussions adopt the standard model for privacy protection, based on some variation of the Fair Information Practices. Those practices rest on outdated assumptions about how information is collected and used: they treat the anticipated benefits and problems of digitized information as no different from those of a large warehouse of individual files handled by a very fast clerk. The FIPs are poorly designed for a world where the benefits and problems stem from an entirely different model, one in which filtering and machine learning can make inferences and perform constrained optimization. Consequently, traditional privacy regulations that attempt to enforce notice and consent bear only a crude relationship to the risks of Big Data. The risks we need to understand are related to the societal effects of constrained optimization: whether, for example, the goal of optimization has repercussions that the market is unlikely to correct. The traditional approach to privacy cannot do this work; it was designed to resist large-scale collection, sharing, and unexpected uses of personal data, yet these are critical elements for the success and social benefits of Big Data. Rapid improvements in service and innovation, for example, require data to be repurposed.
This year, PEP will facilitate economic research on Big Data policy issues through a works-in-progress workshop in December. We will share some of the leading research in this area at a public conference on May 10th here at the law school; please mark your calendars. We will also prepare comments for the upcoming Federal Trade Commission hearings on privacy and big data.
Please get in touch with me if you have suggestions for issues that PEP should address or research that PEP should promote. I would love to hear from you. My email is jyakowit@gmu.edu.