Introduction
【About the Presenter】Carwyn Morris (Twitter @Carwyn) holds a Ph.D. in human geography from the London School of Economics and Political Science (LSE). He is starting a new position as a researcher at the University of Manchester this coming fall. His research interests include censorship, surveillance, migration, the "low-end population" (低端人口), spatial informality, and China.
【Keywords for the presentation】 a spatial approach to digitality; surveillance and architectural systems; ethics and desirable surveillance; surveillance capitalism vs. authoritarian surveillance.
主讲人·Presenter: Dr. Carwyn Morris
主持人·Host: Punky
日期·Date: Sun. Sep. 13th, 2020
时间·Time: 10:30 am — 12:45 pm US West Coast Time (PDT)
流程·Agenda: 40-min presentation and one-hour moderated discussion
【活动录屏 · Event Recording】
【Timestamps】
00:00:00 — Introduction
00:04:35 — Part I-Lecture 1
00:29:54 — Part I-Q&A 1
00:53:32 — Part II-Lecture 2
01:20:33 — Part III Breakout Discussion and Group Discussion
02:11:30 — Ending
The contents below are 【Notes】 based on the presenter's lecture and the discussion. They are not a verbatim record of what the presenter and audience said; please watch the video for the original version.
The copyright of the Slides below belongs to the presenter.
Part I — Lecture 1 by Dr. Morris
☛ The theoretical positioning: Digital Space
Some phrases: [Digital Space] [Digital Place]
[Physical/Digital Space] will be used interchangeably.
As a geographer, I'm interested in how places and territory are produced. We will think about how physical and digital territory are produced online/offline: how nation-state territory, local governmental territory, or borders are produced (e.g., the Great Firewall as a border of digital territory), and how nationalism is produced.
[Spaces] Humans and/or non-humans come together for social interaction. People make places, and places are constantly changing; they shift based on the way people interact. I consider hashtags, messaging groups, and Zoom meetings as places. Spaces are sites where places can potentially be made, where places are produced. Control and surveillance are built into space (e.g., WeChat and Zoom). Territory enables surveillance and government.
《网络空间的理论》 (Theories of Cyberspace) by Fang Binxing (方滨兴), who is responsible for much of the theory behind these ideas of cyberspace.
“Digital Cold War”
Since most of the world's Internet users are American, if the Internet is a global village, it is a village in California, not somewhere else.
These positions enable me to think about how controlling space enables the technologies of surveillance.
By doing this, many things become possible, such as surveillance capitalism, repressive surveillance, etc.
☛ Surveillance
What matters is the reason why you are doing something, the things driving you to do it, and the structures of thought, philosophies, and ethics controlling the practice. Surveillance is treated here as a practice, with no inherently negative meaning.
According to Winner (1980), technologies may not have politics, but they form a sort of action.
Citing MacKenzie and Wajcman (1999): it seems that when we start to use a certain technology, we might be making a deal […] technologies can influence our social, cultural, economic, and political formations.
【Does surveillance have politics?】
- When you are watching a video:
[Actions] Watching a camera feed, watching a video now
[Materialities] Camera/Screen
[Things/Meanings/Politics] attached to it (not only bad things): securing property (e.g., watching for theft in a 7-Eleven), looking after life in a hospital, and having fun watching your cat cam are built on the same principle but carry different meanings and politics; gratification (sex cams); doing parenting (baby cams). BUT when these data are hacked, these materialities and actions can be circumvented and subverted for different meanings.
BUT surveillance is not inherently a way to catch evil.
It is the power relations around meaning and politics that matter in surveillance, rather than just the technology itself.
- In digital sites:
When you are parsing text on social media:
[Actions] Parsing text on social media
[Materialities] Algorithm/servers/databases
[Things/Meanings/Politics] Stopping hate speech; making profits (surveillance capitalism); Reducing ‘fake news’/’rumours’/ ‘谣言.’
There are numerous meanings and politics that can be attached to surveillance, and it matters how those meanings are attached. The power relations in the production of meaning, and the politics around it, are what matter here.
Does anybody feel that this technology (surveillance) is inherently bad?
I’m still wondering that too.
- Extended Surveillance model:
Moderation = censorship, but different politics and meanings are intertwined and changing.
It is essential to know how it happens and why it happens.
☛ Imagining Surveillance
【How do we imagine surveillance? 】
From Lyon (2018) we can learn what surveillance does, how it is done, and what is and is not surveillance.
Westworld, Black Mirror — imaginaries of surveillance
Surveillance is constantly changing.
【What’s your surveillance imaginary?】
Surveillance imaginary can influence people, including legislators.
Technology's capability for surveillance can be imagined in multiple ways.
☛ Surveillance Affects
【What is the effect of surveillance? 】
【Do you feel uncomfortable being watched?】
Carwyn: do you feel uncomfortable, Izzy, knowing that I’ve been watching you?
Izzy/Punky: if I have my camera on, I have consented.
【?】 When you turn on your video, does it mean that you give others consent to watch/monitor you?
Consent is an important and interesting thing here.
Carwyn: For everybody here, how do you feel about being seen? Or how do you feel about being a piece of data, a data ghost? How do you feel about being smelt? Does that make you feel anxious?
Part I — Q&A 1
Some questions:
Carwyn: it is interesting to think about surveillance beyond vision/eyes. This is essential for algorithmic surveillance. It is possible to interpret algorithmic surveillance / surveillance capitalism through the eyes, but what lies beyond this is an interesting point.
Adrián: I'd like to hear from others about "what social practices is surveillance a part of?" from a biological perspective. E.g., people use guard dogs to defend themselves. As human beings, where do we draw the line between self-defence and what goes beyond that? (This question also links to the politics of surveillance: is surveillance inherently bad?)
Carwyn: the position is that "surveillance is natural"; you also suggested that securitizing is natural and security is natural. That may be right. It is also related to consent, but whose consent?
Emily: On the anonymity of data collection: is it still surveillance when the data is anonymous?
Carwyn: do we trust that the surveilling institution is truly anonymizing? There's an ethical thing about anonymity here.
Emily: Trust is very important.
Carwyn: Can you write it down in the chatbox? We can continue to think through this later.
(In the chat box) Emily: What is the difference between being surveilled by a piece of algorithm vs. by a person? Do you feel differently? Why or why not?
Olivia: The question above is interesting. People in different places have different feelings about surveillance. To respond to the question "is surveillance neutral?", I think that different populations will also give different opinions on that. So surveillance can't be defined as positive or negative just by itself.
My question is: what do you think people's attitudes towards surveillance are influenced by?
Carwyn: why are we seeing different surveillance imaginaries in different sites/countries? It is also about trust. Professor Yan Yunxiang at UCLA has written about trust in China. These sorts of questions will be asked in my future research projects. The anthropological insights so far argue that there has been a breakdown of the structures that can build a trustworthy society. People are looking for something to trust: structures that can bring regularity and security into social life, even if they don't see them there. This is also an imagination question: how society is imagined. The research on social credit systems being done right now suggests that many people who want social credit systems make two points: firstly, it will give us a system of credit like the West's; secondly, they understand the technology as one that will produce a more ethical society, targeting wrongdoers and others equally. In some "more equal societies," research shows that people are less willing to show themselves as unique and different. Economic inequality may have effects on this as well.
James: Is surveillance a trade-off between privacy and security?
Carwyn: This question is related to the legal system and the imagination of society more broadly. It is less about surveillance itself; beyond the surveillance in a social credit system, there is also the neo-liberal situation, as in the US. Surveillance goes on to reproduce the inequalities that already exist in security.
Flower: Surveillance may be a tool. Regarding the internal/external space, I think what’s happening outside may also affect the internal space.
Carwyn: we can think about visibility, where things can be seen. People pre-censor themselves because their thoughts might be problematic; this is internal to the body (individual/group). Effective surveillance would flatten space; people internalize those structures and scales.
Part II — Lecture 2 by Dr. Morris
My field research was in Beijing in 2017 (the eviction of the "low-end population").
The primary surveillance tool is still people: the "Chaoyang masses" (朝阳群众).
Every surveillance technology has a position (spatial positioning).
【For more details about the analytical model, please follow @Carwyn】
…
Surveillance is not only high tech; it is happening in other ways. We should be careful of techno-determinism: just because something is bound to surveillance technology doesn't mean that surveillance technology can be used to stop everything. Surveillance can also happen with basic technologies.
People created memory spaces to record what had happened at the beginning of this year — other ways of defence, like illegibility/legibility.
The surveillance imaginary is changing.
After March, the problem of ethical surveillance: surveillance was saving lives. The feeling of "being cared for" through surveillance.
When facing the choice between two different regimes, some people could feel it in their bodies and feel that they were doing something.
Going from here, we should ask:
“Smart city approach,” “digital everywhere approach.”
Now, the problem of surveillance is worth debating:
If it is saving lives, how important is surveillance there?
How will it influence the future?
What are the boundaries of surveillance?
Where’s surveillance heading?
Part III — Breakout and Group Discussion
Questions:
14:02:24 From carwyn morris : Has the fact that repressive surveillance caused the harm been forgotten?
With surveillance as surveillance, and not hidden, now saving lives, how will this affect future engagement with potentially ethical surveillance?
Wang (2019) notes that ‘social credit’ is seen as an ethical system too; intensification?
Will life saving surveillance result in an increase and further normalization of surveillance?
What are the cultural, social, economic and political effects of surveillance?
What are the boundaries of surveillance? Or, where is surveillance heading?
Smart cities, urban labs, the datafication of everything. Does this mean surveillance?
How do we do this ethically?
14:14:55 From carwyn morris : On moderation (Whyte, 2006); people argue that 'covert' moderation creates 'conspiratorial atmospheres' which are negative. How do we make surveillance transparent? Actually transparent, not TECHNICALLY/LEGALLY transparent through terms of service. How do we build in forms of consent which people understand?
14:26:54 From carwyn morris : " civil contract of photography"
1. In Breakout rooms — Group A
If you would also like to share your opinions or participate in discussions, please register with our group; you are welcome to share your views with us.
2. Group discussions
☛About data collection and its use
·Fear and not fear
Jennifer: I don't mind that others collect my data on social media. If they (tech companies) can provide me with information relating to my interests, it's okay.
Olivia: It makes sense to use my data to provide me with information/services on social media. But do you feel comfortable sharing your information with your friends/followers on social media? E.g., the recommendation of products bought by one of your friends. This power is about more than just focusing on you; the uses can't be foreseen.
Yun: It is scary to see the platforms collecting your data/preferences. More than this, the platforms also manipulate your preferences, such as political preferences. Then it becomes a deterioration of your identity.
· Transparency and Consent
Izzy: You've mentioned the ethics of care. It is actually the concept of "care" in feminist thought, as I understand it. It comes back to consent. It is possible to give consent to corporations collecting data, and it is possible to surveil the data double from beginning to end. You give consent, but "然并卵" (it makes no real difference anyway).
·Surveillance capitalism*
Louis: We are users of social media platforms, so this question is also related to capital. For example, delivery workers are trapped in the algorithm.* The data push workers into very dangerous situations. It is not only about my privacy being compromised by social media.
·Transparency
Carwyn: I think that there was also a broad lack of transparency throughout this praxis, both governmental and corporate. I begin to wonder, if our first engagement with those corporations is not transparent, how can we believe anything else can be transparent further along the chain? I was also doing research with food delivery drivers. It was not easy for them, both algorithmically and in dealing with customers. The exploitation scales up.
【Editor’s recommendation】Two references for this topic:
- 《外卖骑手，困在系统里》, by 赖祐萱 (ed. 金石), in 《人物》 via 知乎 (Ch.);
- 《有进无出，700万外卖骑手的内卷人生》, YouTube video LINK (Ch.)
James: The use of data sometimes goes beyond consent. How to use data correctly in the current political situation is a very critical issue. On some platforms, the corporations (e.g., Facebook) don't handle data properly, or intentionally use data in inappropriate ways to manipulate their stock price.
Carwyn: As there's no transparency, it is difficult to fix. It is about the spatial imaginary. Here we consider Facebook as a privately owned space where people congregate to interact with each other. There are many public-private spaces around us which we, as the public, are allowed to enter. Once we enter, we are held to different rules. So we should think about the minimum transparency we have in the physical spaces around us, and try to push things further. If something is unacceptable in the physical space around us, why should we allow it in digital space? Why should it be different in digital space? It is not a perfect way to do it, but we should ask for more transparency.
Flower: I'm not sure that thinking of data as digital space is a good analogy. But I'm thinking about the government. The data help us; we give up some of our rights and hand power to the government. But how much should we give up, and how much should we hand over? Like Martin Niemöller's poem "First they came…" (1946):
THEY CAME FIRST for the Communists,
and I didn’t speak up because I wasn’t a Communist.
THEN THEY CAME for the Jews,
and I didn’t speak up because I wasn’t a Jew.
THEN THEY CAME for the trade unionists,
and I didn’t speak up because I wasn’t a trade unionist.
THEN THEY CAME for the Catholics,
and I didn’t speak up because I was a Protestant.
THEN THEY CAME for me,
and by that time, no one was left to speak up.
Carwyn: Concerning the civil contract thing, a book about photography written in 2015 is more approachable. The author asks how you can consent to something you do not understand. If you search "National Geographic Afghan Girl" on Google, you will find the extremely famous picture taken in 1984. It is linked to the Cold War in that country. The photographer tracked the person down. In this situation, even the kid said you could do this. At that moment, the kid didn't know what could happen or what National Geographic is. She didn't know the power of photos or who the audiences being spoken to were. The problem of consent comes back to our question. When most of the population doesn't really know the uses and implications of technologies in a basic way, consent is broken for those things.
Three: Your "consented" data gets used in inappropriate ways. Do you feel comfortable with this?
Olivia: I agree that the barrier to accessing technology can prevent people from giving proper consent. What is the right way to regulate data collection and surveillance? Better education overall? We always have some expectations, but how do we achieve them? Basically, by trusting a higher authority to regulate other authorities that you cannot control. Using powerful authorities against powerful surveillance is an interesting phenomenon. That is a more realistic approach now, to make sure that legislation can catch up.
Louis: Regarding proper consent: you expect consent, but some platforms are part of your life, and you may have no choice. About Jen's statement: we know part of what we have consented to, but we may have no idea about most of the things in that consent. In this case it is more than a personal preference; it is about power and your security.
Jennifer: Many of y'all gave bad examples against my statement. In physical spaces, we cannot guarantee our privacy or avoid invasion either. The question for me is whether you can live with this reality.
Min: Your analogy of space is impressive. I think that one of the biggest problems with social media is that they are monopolies of power. If there were competition, there would be different rules: if you don't use one product, you can use others. But the reality is that we don't have any other choice.
Ending
Carwyn: I will be very happy to hear if you have more thoughts from our discussions, or any other points on these questions in the future. Sometimes these things are just provocations; they are about pushing things further to see what happens, but it is important to break out of the boxes we are currently thinking in. I'm always happy to engage with people on this. Thanks for having me.
Zoom Chat Box Discussions
13:16:46 From Emily: The difference between being surveilled by a piece of algorithm vs by a person; do you feel differently, why or why not?
13:18:52 From Mina : As a person working in data privacy, I'm not buying the argument that as technology gets better, anonymity would get increasingly compromised. I would love to hear why people think that way. My argument is, why couldn't it be the reverse? Maybe anonymization technology is the one that's getting better over time?
13:20:00 From Izzy: when we plan something for the roadmap, the first question is always, what is the business value $$$?
13:20:37 From Emily: I think that's a really great point mina! and perhaps both technologies will get better in an arms race manner, as long as there's profit in maintaining / breaking anonymity.
13:21:34 From Emily: I feel the credit system in the west is highly oppressive as well..
13:25:01 From Izzy: information asymmetry
13:29:36 From Three : surveillance is from sur- ‘over’ + veiller ‘watch’. there’s an inherent power relationship in this word.
13:30:31 From Olivia: @Mina Personally I think that advancing technology generally means more data is collected about an individual and there will be more ways to link the data (e.g. cross-site tracing) that fall out of the control of any individual / corporation and form a data network you can't escape from. Although I agree that there are always countermeasures to that
13:35:08 From flower lalala : @Three The power inequality is embedded in this word. Also, there is a strong sense of "we" against the "others."
13:37:33 From Emily: just curious - what was the chinese for beijing surgery?
13:40:00 From Mina : @olivia Thanks, good point. The question is who has access to all those data, scattered across different corporations etc. There is a theoretical possibility of a mega data thief, gaining access to all those data and applying exceptionally good data linkage algorithms to piece data from different sources together to gain a picture of particular individuals' lives. It's like there is a theoretical possibility of a super stalker that follows a person everywhere, goes through her trash, etc.
13:41:21 From James: I think it's called 北京切除
13:41:23 From Mina : I have to hop off, good discussion people and my thanks to Carwyn!
13:41:37 From Adrian: thanks for the good input, Mina!
13:47:14 From James: https://theinitium.com/channel/wheretogo/
13:54:14 From Olivia: Just a thought on neutrality of surveillance - surveillance may be neutral, but the power to surveil falls disproportionally to the government / corp instead of the individual. This power dynamic might be where anxiety comes from.
13:56:49 From Emily: Gotta go, but thanks so much Carwyn and everyone for the interesting discussion! :)
13:58:30 From Min : I think the technology for de-anonymizing data does not only exist theoretically. In fact, my understanding is that the challenge is in building truly anonymized datasets for research, not the other way around.
13:58:42 From Min : https://techcrunch.com/2019/07/24/researchers-spotlight-the-lie-of-anonymous-data/
...
14:29:10 From Izzy: that’s what i meant by “it is not possible to give consent”. the power dynamic is off from the beginning
14:31:01 From carwyn morris : Who was Yiqiu Han ? I wanna speak to them but they left
14:31:14 From Izzy Wu : what did they say?
14:32:38 From Olivia: Digital space has more barrier to entry / access / knowledge to the general public than physical space
14:35:01 From carwyn morris : maybe in the Weixin group
14:35:47 From Izzy: ah… not sure which group they came from. we could try to ask around.
14:36:19 From carwyn morris : sure, would love to know
14:38:18 From Min : That reminds me of the super long terms&conditions for consent
14:38:29 From carwyn morris : yes, exactly; they are not a real form of consent
14:38:37 From Izzy: yep.
14:38:59 From Izzy: take it or leave it is not really an option.
14:39:57 From carwyn morris : Olivia; on space, there are numerous forms of space/place with barriers and regimes of visibility. Home? Bedroom? etc. It is definitely not exactly the same, digital vs physical, but there is room to learn and think through the ethics of it from this.
14:40:05 From carwyn morris : sorry, not super coherent, listening, writing etc. ^^
14:40:55 From Olivia: For sure! Definitely similarities there
14:41:18 From Izzy: also the boundary between surveillance capitalism and state surveillance is increasingly blurred
14:44:08 From carwyn morris : true Jen, on that things are not like 'we had perfect privacy' before and have now lost it. I think it's important to consider it from this perspective.
14:44:09 From Olivia: @Izzy Very true, especially when state has absolute authority
14:44:18 From Olivia: * power, not authority
14:46:32 From Izzy: they both have more and more power against the people. definitely obstacles when going through the legislation route.
14:48:05 From Olivia: Right, how do we entrust legislation when the same party also benefits from digital surveillance
14:49:18 From Olivia: Thank you Carwyn, great presentation & discussions :)
14:49:41 From Min : Thank you Carwyn!
14:49:43 From Mingyue : Thank you Carwyn and Punky!
14:49:51 From yunz : Thank you~
参考资料 · References
- Datta, Ayona. 2020. 'Self(ie)-Governance: Technologies of Intimate Surveillance in India under COVID-19.' Dialogues in Human Geography, May. https://doi.org/10.1177/2043820620929797.
- Gillespie, Tarleton. 2018. Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media. Yale University Press.
- Haggerty, Kevin D., and Richard V. Ericson. 2000. ‘The Surveillant Assemblage.’ The British Journal of Sociology 51 (4): 605–22. https://doi.org/10.1080/00071310020015280.
- Lyon, David. 2018. The Culture of Surveillance: Watching as a Way of Life. John Wiley & Sons.
- Noys, Benjamin. 2015. ‘Drone Metaphysics.’ Culture Machine 16 (September): 1–22.
- Souza e Silva, Adriana de. 2017. ‘Pokémon Go as an HRG: Mobility, Sociability, and Surveillance in Hybrid Spaces.’ Mobile Media & Communication 5 (1): 20–23. https://doi.org/10.1177/2050157916676232.
- Vukov, Tamara, and Mimi Sheller. 2013. ‘Border Work: Surveillant Assemblages, Virtual Fences, and Tactical Counter-Media.’ Social Semiotics 23 (2): 225–41. https://doi.org/10.1080/10350330.2013.777592.
- Wall, Tyler, and Torin Monahan. 2011. ‘Surveillance and Violence from Afar: The Politics of Drones and Liminal Security-Scapes.’ Theoretical Criminology 15 (3): 239–54. https://doi.org/10.1177/1362480610396650.
- Zuboff, Shoshana. 2019. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. Profile Books.
【版权声明 · Copyright Notice】
The content above represents only the personal views of the presenter and the discussants. The copyright belongs to the presenter; for reposting or collaboration, please contact the discussion group and the presenter for authorization.
The copyright of the presentation slides belongs to the presenter. Please respect intellectual property rights and do not repost without permission.
笔记记录 · Notes: Adrián, Mingyue
排版编辑 · Layout & Editing: Mingyue
海报设计 · Poster Design: Shuo Chen
— — — — — — — — — — — — — —
【欢迎加入理性讨论小组】
*Welcome to Discussion et Rationalité*
理性讨论小组微信公众号 · Our WeChat Official Account: search for "理性讨论小组" (Le_Debat_Rationnel)
理性讨论小组Medium账号· Our Medium Account: https://medium.com/@taolunx
理性讨论小组YouTube账号· Our YouTube Account: https://www.youtube.com/channel/UCaK1iJJGeIKw-mZ7BLcO3ZQ
理性讨论小组新成员加入表格 · New Member Form: https://docs.google.com/forms/d/e/1FAIpQLSfSt3LESds93GBMCoKZ1RPARcXLxsr4rDmcdagHMV_TI0Y9Eg/viewform