Andreas Weigend | Social Data Revolution | Fall 2016
School of Information | University of California at Berkeley | INFO 290A

Contributors: Lukas Schwab, Vlad Rudoy
Video: https://youtu.be/8c1jp-wc1hc
Audio:

Topic 14: Cam4 reflection

Introduction
After the in-class presentation by Cam4 (see: topic 13), SDR 2016 gathered for a conversational discussion of the various speakers’ technologies and methodologies, the key takeaways from the course at that point, and the implications of SDR itself as a sort of sousveillant project on the part of the class.

Main ideas from the first 6 weeks of class (0:00)
Having gone through 13 topics, the class was asked to reflect on the past several weeks and present their opinions. Morals and ethics were at the forefront of the discussion, catalyzed by the idea that technology can be used as a tool for battling inequality. That idea arose from claims made by CAM4 executives; more specifically, from their assertion that their services give unemployed youth an opportunity to escape structural unemployment. We were given income statistics for some CAM4 models, which made it tough to deny that being a cam model provides a chance to alleviate the burdens of unemployment. Nonetheless, the whole notion of selling one’s body appeared rather creepy, and can be extremely unsettling to many people.
Relating back to the class motto of “Data by the people, for the people”, the discussion included a debate on whether the companies that have been our guests over the past several weeks are actually striving to use data to better society, or simply exploiting user data for profit. With the ample amount of data at their disposal, data refineries carry a lot of social responsibility: they have the power either to better society or to exploit users (on both a small and large scale) for the sake of profit. In the case of CAM4, is the team actually striving to build a community, as the presentation claimed, or are they simply in the porn industry to exploit a relatively large market with potential for a serious payout? As a side note worth considering, one student mentioned that money plays a very important part in the life of any company. Without it, many companies would not exist; therefore it is not entirely selfish for executives to place the generation of profit at the top of their business models.
Another value of this class discussed within the group was the exposure to the internal operations of visiting companies and start-ups, which the “revolutionaries” of SDR gained by having various executives come in and speak to the class. The homework from this class made one not only think from the perspective of an executive or a PM at a given company, but also appreciate the decisions, the small details, the vast amounts of data, and the other inputs used to produce an output beneficial to the consumer. Furthermore, the class came to understand the sheer amount of data these companies collect. None of these companies would exist without users providing them with raw data; it appears, then, that we consumers serve as a lifeline to all of these companies, and that we actually have control. But are we actually in control, or is it all simply an illusion?

Objective of the Social Data Revolution (11:25)

The main goal of this class is to open up a discussion about the data we create daily: where this data ends up, who owns it, and what rights we have over it. Andreas started the class with an example of how much data is recorded, from the time we wake up, make coffee, and leave our houses, to whom we meet throughout the day and where we have dinner. But how can this data be used for our benefit, or to our detriment?

According to the Google Glass experiment that Andreas conducted, one of his key findings was that people are more scared of the data collected about them being taken out of context than of the data itself. People are likely to be okay sharing their data, as long as it is anonymous. This brought up the topic of recognition: technology will be, and in some cases already is, capable of recognizing us through our voices, the way we move, our faces, etc. How will we stop people from recording us and tracing that data back to us? For example, Facebook already possesses a huge database of pictures, and is capable of recognizing virtually everyone on the internet.

One student brought up the example of Donald Trump being recorded talking degradingly about women, costing him about half of the Republican vote and probably the election. This is an example of how data recorded in the past can be used against oneself in the future. However, it can also be an example of how data can be used to do good: after all, is it not a good thing for voters to know the truth about candidates? Can we judge the way data is used to benefit or to hurt someone? When is it okay and when is it not? This is something we have talked about throughout the class. Consider, for example, the case of the police officer in Florida who turned off his camera but forgot he had the audio on. This brought up an interesting difference between sound and video recordings: should they be treated differently, given that it is harder to tell when someone is recording our voice with a microphone than when someone is recording a video of us? These are ideas that we should keep in mind while constructing the data rights of the future.

A student brought up the fact that privacy was introduced in class as a social norm, a social construct. Before houses had chimneys, people had to live together around a fire to keep warm, so privacy was different from what we have in mind nowadays. This raised the issue of what the norm for privacy is, and whether we should adapt to a different norm or whether technology should mold itself to the norms we expect.

Andreas concluded this part of the discussion by taking a stand on privacy: privacy is changing and we are changing as a society; technology will become more advanced, and we cannot stop that. This is why he proposes his six privacy rights, which we have discussed throughout the semester. The rights aim to provide transparency about what companies do with our data and also to give us power over that data. Since this reality is here, it is important for us as a society to know and protect our data by means of these rights.


SDR as data broadcast (19:21)
To what degree can we consider SDR itself a data product? Clearly the core material (the various companies and speakers who have been engaged, and Andreas’s framework for evaluating data rights and data ownership) is not a data product. It is a collection of information constructed with the specific intention of broadcast.

On the other hand, the recorded discussions and homework (including crowdsourced feature recommendations for Yelp, beta testing of social applications like Minute, and feedback on the content of Our Data) constitute user data from the students “using” the aforementioned course materials.

This is a demonstrative component of the coursework: we aren’t only consuming course material in exchange for money, which was the dominant paradigm of economic exchange before the information age. Instead, we consume the course material in exchange for our data production about that course content. This contributes value to the various thinkers who guided the course and also to those interested in consuming SDR content without the privilege of access to an institution like UC Berkeley.

As mentioned earlier in this discussion, this kind of sousveillant contribution to economic exchange is not a new activity for many members of this class. Quite to the contrary, popular data-to-use services like Facebook or Yelp operate by the same principles. Less consensually, government surveillance programs collect digital traces in the name of counterterrorism and modern law enforcement develops facial, gait, and voice recognition to link individuals to criminal activity. Even when one’s activity isn’t criminal, one contributes data on non-deviant activity that’s critical for the development of more accurate filters and machine learning models.

What sets SDR apart is that the workings of this apparatus are laid bare. Some members of the course are at times uncomfortable being recorded; this is not because the experience of being recorded is new, but because the experience of knowing they are being recorded is new.

When discussing data collection, critical scholars frequently return to Michel Foucault’s genealogy of the prison. Specifically, Chapter 9 of Discipline and Punish describes Bentham’s development of the Panopticon: rather than punishing prisoners for deviant behaviour, it sought to reform them by exercising constant disciplinary power via the gaze of unseen guards.

More sinister modern data collection apparatuses actually buck that modernist trend. In a way, rather than attempting to influence an individual’s behavior by giving them the impression that they are being watched, they seek to eliminate perception of a disciplinary gaze altogether. The NSA does not want us to change our online habits in fear of the NSA; instead, the NSA wants us to forget it’s there so it can effectively observe our online habits sans influence.

That is one really tremendous advantage of the Social Data Revolution: by laying bare a system of data collection for student observation (albeit a simple one, where data refining is minimal), it serves as a reminder of similar but less conspicuous interactions we frequently forget we take part in. Moreover, by acting as conscious data producers and thinking through the potential data we collectively produce, we can come to understand the thinking that makes more complex data refineries/products what they are.