Benedict Evans interview

*He may not be right, but he sure is well-briefed.*

Seeing things that aren't really there yet

(...)

SCHUKAI: On the point that we’re moving into other forms of realization – for instance, glasses – it brings up a question about privacy. There’s a constant battle these days, one we’re really starting to see, between how far technology can take us and the invasion of privacy that results from it. How do you see that tension playing out long-term?

EVANS: I think that there are multiple axes here. People think about privacy very differently based on the context, based on the actor, and based on the character of the product. Most obviously, your bank knows everything you spend money on and how much you have. You don’t really think that your bank is invading your privacy. Your mobile operator knows where you go, but again, you don’t really think of your mobile operator as invading your privacy.

There’s a really interesting tension point for Alexa where I’ve heard it said that Google could not have launched Google Home before Amazon had launched Alexa. Amazon had anchored the sense of this product, and some people have a different sense of privacy vis-à-vis Amazon, Google, and Facebook. Apple has positioned itself in a unique place in that conversation.

You could also argue that it would be very difficult for Facebook to launch an Alexa competitor this year just because of where the news cycle is for Facebook, regardless of what the actual product was. There are questions of perception in general about what this thing is, and also questions of perception around which company is doing it. They’re not necessarily entirely rational or predictable; it’s just where you come out as products evolve and people’s perceptions evolve.

I think there are a bunch of conflicted feelings around the Facebook News Feed. A lot of the questions around the News Feed are not actually privacy questions, per se. They are questions about what Facebook chooses to show you and how it chooses to show it to you. The conversation is around, “Are you addicted to this stuff, and are you being manipulated?” Those aren’t privacy questions; those are different questions.

People have a lot of unresolved feelings on this topic: you’re using this thing and telling it what you want, but it’s also using you, and what exactly do we think about that?

SCHUKAI: Do you think, in a similar way, it’s almost a precursor to the tension we’re starting to see in the sphere of cognitive computing, artificial intelligence, and machine learning related to the ethics, the transparency, and the biases that are built into these models? Where does that take us, and how do you teach AI to hold British ethics, or American ethics, or Australian ethics, or whatever the case may be, in its decision-making process?

EVANS: There are a bunch of subtle and complex questions in that...