
FREEDOM™ – Surveillance in 2030

Nikolas Badminton is an in-demand Futurist Keynote Speaker, and his keynotes are tailored and unique to the conferences he is invited to speak at. Here we see him talk about FREEDOM™ – Surveillance in 2030 and address 2000+ people at the 20th Annual Privacy and Security Conference on privacy, surveillance, and our freedom. The video and full transcription are included.

FREEDOM™ – Full Keynote Transcription

This is how we’re going to start. So back in the 18th century, this guy, Jeremy Bentham, came up with the idea of the panopticon. The panopticon was a theorized prison where one or two prison guards could literally sit in the center of a circular prison and see all of the prisoners. The effect of that was that the prisoners felt that they were always being watched, even when they weren’t. Then they fell into lines of discipline based on their own thoughts and fears around doing something wrong and not getting out in time.

I actually think that we’re in a modern world where the panopticon is kind of surrounding us, and this is what this presentation’s a little bit about. Foucault is another social theorist, who came along in the 20th century, and he started to look at what Bentham was talking about, and started to think that actually surveillance is a permanent thing. And that, even if it’s discontinued, people still feel like they’re being watched. So they normalize their behaviors. And what’s really interesting is, back in the day and even before the 18th century, there were a number of different ways that we ensured that kind of discipline and structure in society in a way where people behaved themselves.

We are now living in a world where these old ways of control, back in the day it was police and school and discipline and torture, are being replaced with a subtle, calculated technology of subjection. We know that technology is taking over the world. We’re surrounded by it at every single point of our day. Every second of our movements is generally being observed, or we feel that it’s being observed, and it’s being monetized in some sort of way.

And as privacy and security professionals, this is something that we really need to be aware of. But also come back to think about: what can we do in this modern world? What is our role? There have been some amazing sessions today already. I look forward to coming again tomorrow and seeing what more is being spoken about.

But I’m just going to take somewhat of a ride into what I’m calling the signals of change. The signals of change are these things that I see every single day on the streets, or read about in articles, or hear about from the people that I have lots of discussions with around privacy, security, big tech companies, and government, that are indicating to me that we’re surrounded in this world and that the idea of the panopticon is prevalent in our society. So it all starts, when you think about computers, with this guy. This is Douglas Engelbart. In 1968 he did something called “The Mother of All Demos.” It was at Stanford, it ran for a couple of days, and over a thousand people, very much like this room, actually watched him give a demonstration of the world’s first personal computer.

Now that’s 50 years ago. If you actually think of the acceleration between then and now, it’s been incredible. Most people couldn’t even fathom that we would have this kind of technology in our homes. Not many people could really understand the impact of computers. This is actually an article taken from 1983. It very optimistically said why the computer will reduce political upheavals. Now, hands up: who thinks that computers have reduced political upheavals?

It’s because they didn’t realize what was coming next. And this is what was coming next. The internet. So the commercial availability, the personal internet as it kicked in in sort of ’92, ’93 and I got on the internet about 1993. So Tim Berners-Lee helped usher that in. And now we’re in a position where the entire world is connected by cables under the sea. Data is flying through the air. And suddenly, we’re generating more data today than we ever have done.

By 2025, they actually think we’re gonna generate about 44 zettabytes of information on a yearly basis. That’s the equivalent of a billion billion hi-definition movies.

Today it’s only 4.4 zettabytes. So that acceleration comes from the technology that surrounds us and the way that our cities, our homes and our offices are actually changing.

But no one really understood how mobile was going to really accelerate as well. Now I normally ask you, hands up, who doesn’t have a smartphone? It’s a massive audience full of security people. I know you’ve got burner phones. It’s all good. I normally make jokes about you being the security crowd, and it’s like, okay, there’s 500 of you. I love you. Okay.

But now we’re at a point where there’s six billion smartphones in the world. This is how people do their work. Who would’ve thought that Douglas Engelbart would’ve brought in this new age, and in 2007, Steve Jobs would’ve said, “In my hand, I’ve got an internet communicator, a music player and a phone that lets you have conversations.” But now, having conversations is probably the fifth or sixth most likely thing that you’re gonna actually do on a smartphone, right?

So that’s now led us into a world of the last sort of 12 to 15 years of social media. Hands up who’s on Facebook in the room. I love it. I love coming here ’cause it’s like not many of you, right? And it’s actually a terrible platform in terms of your rights as a human and a user. And what’s really interesting is the evolution of the like button was this. So you like, you love, haha, wow, sad, angry. It’s the architecture of the modern relationship. And not a lot of people actually understand that this is actually what happens: these systems fit human behavioral norms, and those behavioral norms change to fit the system.

We’re shaped by our tools. Marshall McLuhan. And I was in a session earlier and someone held up an Amazon Alexa and they did a whole routine where they were talking about healthcare. And Amazon Alexa was responding and being friendly and collecting information and transmitting it to this gentleman’s doctor, and that’s really interesting to me. There have been tens of millions of these devices sold in North America. There are actually like triple or quadruple the number of Google devices sold. And there’s Apple devices, and there’s more coming out of China from various brands that are copycats.

I find this to be one of the most pervasive and world-changing technologies, just because we don’t really know that it’s there, most of the time. But if you have children, and you have an Amazon Alexa in the room, you ultimately have a new big sister for them. In fact, Amazon came out with a version, the Echo Dot for Kids, where it would teach your kids manners. So fundamentally, this kind of device is replacing you as a parent. Now, I don’t have one of these devices at home. I feel that it is too intrusive in my life, but there’s something really interesting. This is the first time we’ve really seen input, which is voice and environment and sound, coming into a device and into a central repository where artificial intelligence and machine learning can train itself and then, at scale, push out responses with the behaviors that we want to see these machines have.

In fact, there’s a lot of experimentation that is going on with these, and Amazon ran a competition. And one particular group trained theirs on all of the conversations that were happening on Reddit and across a number of other social platforms as well, and in one case, it actually told someone to go and kill their foster parents.

So this is actually really concerning to me. Because it can take the good and the bad of who we are as humans and it can feed it back to us. But not necessarily to us as being the people that were the origin of the thought or the sentiment, but at scale across millions of people if not billions of people eventually.

So, when I start to see this technology, I start to understand that maybe you’re going to start tiptoeing around and not talking so much because Amazon Alexa is probably listening to you. And then when we hit our high streets, we’re starting to see an evolution of how we’re just gonna operate with normal daily chores, like going to the supermarket.

The majority of Americans who have Amazon Prime and earn over $100,000 in household income live within three miles of a Whole Foods. And what’s really interesting about this technology is it’s enabled by cameras and sensors, RFID and a number of different kinds of artificial intelligence that really watch what we do when we walk into a store. So take a look.

So, if we’re in this particular kind of system, and ultimately we are part of the product, we’re in a system that can influence how we behave in these stores. Now if you think about the ideas of fast and slow thinking, what Daniel Kahneman is very well known for, with reflexive and reflective thinking, we’re not gonna really consider what we’re buying half as much as we would if it was going into our basket and staring us back in the face as we walk around the supermarket. So you’re probably going to spend 20 to 30% more every time you visit an Amazon store. But really, you walk in, you pick up what you want, you walk out. I used to call it shoplifting, and today we’re calling it modern life.

So it’s interesting where we are. And now you’ve got Walmart and Microsoft and lots of other supermarkets with competitors to this kind of technology, because they want to remove humans from the mix of the shopping aisles. And we are literally going to humans that are concierges to help you spend more money, versus just ringing you up at the till and helping you pack your bags.

And then on the streets, we’re starting to see some really interesting things happen in terms of CCTV and surveillance cameras. I grew up in the UK. I worked for many years in London. I’m literally used to waking up every day and having dozens of cameras in my face. It’s called safety. I’ve almost been blown up three times, so I don’t mind it so much. There’s some urgency around this. But these are hi-definition surveillance cameras in China. And if you jaywalk, within like a minute of you jaywalking, you will actually be issued a fine. And you will go into a database, and potentially there’s gonna be an impact on your social credit score, which has got a larger impact on how you can operate in China.

I’m not gonna talk about the social credit system so much today because it’s kind of undefined as we sit, but it’s something definitely worth looking at.

And then I started thinking, yeah, okay, so once we’re not just volunteering our own information, what other things are happening? And this is actually a program that’s happening down in California.

I think this is a normalized kind of behavior around surveillance that we’re gonna see going ahead. With that collection of information in real time, we’re no longer gonna be seeing those Google cars driving around collecting information on how the streets work. By the way, Google Maps isn’t very good to use in Victoria, as I found out yesterday as well. But the knock-on effect of this collection of data from surveillance cameras, these kinds of eyes in the sky, I see leading to predictive policing models. And dozens of cities in North America have deployed predictive policing programs to try and work out the hot spots of where something is likely to happen next. Has everyone seen “Minority Report”? Yeah?

Maybe it’s not like three psychics in a flotation tank, but it’s much more scientific and rigid, and something that we can apply artificial intelligence to. And then it gets really, really interesting when I start to think about, what about sending hundreds of satellites into space and then taking high-definition pictures of us down on the ground? I mean, we can actually be identified as individuals by the gait of our walk. Right? So you can do that from a satellite.

This is actually a company called Planet. I think they’re out of California as well. They’ve got about 150 of these satellites. These are Dove satellites, and together they take 1.3 million images per day. So they can actually take high-definition images of almost the entire Earth with only a 24-hour latency.

See Planet Labs’ Will Marshall talk at TED about the possibilities:

So the power of that in terms of understanding who we are as a society, and the power of that in terms of law enforcement and intelligence and the like, is very compelling for everyone except for the people on the ground. And once people start to understand this, people start staying at home.

But then, it’s okay because you’ve got your Amazon Alexa at home that records everything that you do. So it’s an interesting world. But we’re careering towards a world with self driving vehicles.

The Department of Transportation down in the U.S. actually thinks that by 2023 or 2024, we’re actually going to see more and more people starting to use applications on their phones to call self-driving vehicles that’ll take them wherever they want. Instead of owning a car, you’ll literally have an application. You’ll pay $300 a month. And the car will come to you, and there will never have to be a driver to have a conversation with, and you can probably choose your own music that you want to listen to as you ride around.

I kind of don’t think it’s gonna be a good idea to let this artificial intelligence try and interact with you because based on bias and scaling up, you might have to listen to John Denver when you don’t like John Denver, or maybe some techno when you don’t really appreciate that so much at 7:00 in the morning.

But the self-driving vehicles are gonna come, and they’re going to come at scale. A lot of people didn’t think it was gonna happen this quickly. I actually think that the models for GM and Ford, and we’re already seeing it with Tesla, but also companies like Lyft and Uber, are all gonna be about the provision of transportation services. The auto industry is dying.

And then beyond that, we’re gonna start moving away from the cell phones and the rectangles in our pocket to wearing headsets. Now this is actually the Microsoft HoloLens. A little-known fact: the Microsoft HoloLens was actually developed in a secret laboratory over several years in Victoria. And what’s really interesting about this is it’s unwieldy today, and it’s kind of used by architects and engineers and maybe on the shop floor in a factory. And it’s kind of not really in the mainstream because it kind of looks like this.

If you’ve ever used it, it’s still low quality. It still hasn’t quite found its application. But trust me, it is coming. I know people that are working in Facebook and Google and Magic Leap, and in Microsoft, and the amount of investment that’s happening in the augmented reality field is incredible. And it leads us into this modern world. And this is a video by a guy called Keiichi Matsuda. I was actually turned on to Keiichi Matsuda by Nora Young, who was on a panel earlier. She’s a good friend of mine. And he created this vision of what a world could look like with that augmented reality. He went down to Medellín, in Colombia, and he thought, “Okay, if I was gonna wear this headset, what does that augmented vision look like?” You know, cats in the sky. Information about the streets. Information about certain individuals. When you’re in the supermarket, it would tell you the nutritional value of the banana you’re about to buy and where it came from. And then you’ll ultimately be hacked. All your loyalty points stolen. And then you have to recalibrate every day because it’s gonna be a dangerous world.

The video is about 12 minutes long. I compel you to watch it. Keiichi Matsuda, Hyper-Reality.

But this is one of the most fascinating things. We think about high tech and how we deploy levels of surveillance to watch us on the streets, and China’s kind of clever, because they think: what have we got an abundance of? Older women that sit around and like to poke their nose into everyone’s business on the streets. So these are the Chinese security patrols that actually operate in large cities. And if they see someone that’s not acting in a way that’s appropriate, they will call it in to the police. So, not only do we have satellites and listening devices, cell phones, self-driving vehicles, eyes in the sky, social credit systems. We can’t even trust our grandmothers, right?

So we’re living in a very strange world. And if we go back to the idea of the panopticon, we’re just gonna have to normalize our behaviors to act very sensibly, without moving outside of the boundaries defined by the companies and the governments whose operating system, as it were, we operate within.

So when I think about this, it’s like, how do we get responsibility into this entire dilemma? We don’t wanna be watched. We want our own sort of place in the world. We don’t wanna just be the product. But we’ve got so many different layers of abstraction. And those layers of abstraction are made wholly complex by the terms and conditions that we put around every single layer. The average person would have to spend 76 working days reading all of the digital privacy policies they agreed to in the span of a year. Does this concern anyone?

It would take you nine hours to read Amazon’s privacy policy out loud. I think the article I was reading was in the “New York Times,” and they were saying that instead of the “I agree” button, it might as well be the “Meh, whatever” button. ’Cause what can we do? But this is it. Cloud computing. Layer upon layer of applications. APIs. Everything’s got terms and conditions that relate to other terms and conditions and other systems. And suddenly, even in something like a self-driving car, you’ve got 26 layers of different applications and different technologies, and we don’t know who’s accountable for a problem in the system or a breach in data.

We get the sharp end of the stick, but they bounce it all the way through their technological stack. That brings me to the idea of ubiquity. I’ve talked about technologies that surround us. So we’re gonna be in a world where we’re not walking around looking down at the rectangles in our hands. We might have augmented reality, but really we’re gonna operate in a way that feels very human and very natural. We’re just not gonna understand why our behaviors are being controlled.

The ubiquity in the world means that we’re being surrounded by the system. So where does the fightback come from? It comes from us. Not only the people in charge of privacy and security, but us as humans: the mothers and fathers, brothers and sisters, and the like.

One of the big sort of stories I love to talk about is when Google employees basically said that they refused to work on the Pentagon’s $10 billion JEDI cloud services project. And literally Google walked away. Because tens of thousands of their employees basically threatened to walk out and said, “We do not stand for this. This is not how we want our artificial intelligence platforms, our data platforms, to be used for military intelligence.” Which is actually a very positive thing.

And then the ACLU wrote letters to about 100 different [inaudible] executives, including the CEOs of Apple and Microsoft and Amazon and Facebook to actually say they want to have guardrails around facial recognition technology, which is probably gonna be one of the most powerful technologies for keeping an eye on what’s happening in the world.

And then you’ve got these big thinkers. So if you’ve never heard of these three guys and the Electronic Frontier Foundation, go and find out who they are. The top left is a guy called Jaron Lanier. And he talks about how privacy and information has been skewed by social networking and modern services, and how we need to take back control. How the modern world is about us having control of data. Us owning who we are. And actually being able to use that in a micro-transactional world.

Then we’ve got Douglas Rushkoff. In 1994, I read his book “Cyberia.” It fundamentally changed how I look at the world. He’s just written a new book called “Team Human.” Go and read about how the world’s culture operates within technology. Douglas Rushkoff’s incredible. He’s also got a Team Human podcast. Go and check that out. And obviously Tim Berners-Lee. He came out with a new technology called “Solid,” for a decentralized internet, last year. That’s awesome. And obviously I really support the Electronic Frontier Foundation, because digital rights are human rights.

And then we’ve got this. Thank you. I didn’t expect to get a clap for that. That’s awesome. I actually donate every year to the EFF. I urge everyone to support people like the ACLU and the EFF.

But then, this kind of thing happens. The government wants to fight back against big tech. That’s great. Did anyone watch the Zuckerberg senate hearings? Let’s put a bunch of people that don’t understand how to put socks on in the morning in a room to have a conversation about social networking and the internet with the guy that basically created the new internet. What was really interesting about watching this last year was that it was big government, the U.S. government, against big tech, more than it was against Facebook.

And the U.K. have gone one step further, even intercepting documents from their lawyers at the airport before they could actually leave the country, and using that, because Facebook wouldn’t even turn up to the hearings in the U.K., alongside Google and other companies. It’s time for these companies to be held to account. Let’s just put the right people in the room with them, maybe people in the room here, and not some crazy senator from Tennessee. Seriously. Seriously.

I normally talk for about one to two hours every time, so I don’t have much time today and we’re coming towards the end of my talk. But I try to think about justice and who we are. Our human rights are fundamentally undermined by most terms and conditions that we sign up to. But where are the companies actually stepping in and working with us? Well, I think it’s very good that we’ve got over a thousand people in this room that deeply, deeply care about our rights. People that really want to work with the tech companies and with government and with the users to understand what balance in the world looks like, and how we can empower people that use technology, because a world where that empowerment happens is a world that’s enabled from a financial, social, and environmental perspective. And this is what I think we can do. I think we can work with these tech companies to try and redefine what it means to own the data that we generate. I’d love to see a world in the next 10 to 15 years where conversations are happening and people at Google and Facebook are actually saying, “You know, you own your data to this extent. We need your permission to use it in a certain number of different fields. We will pay you for the use of that, and we’ll give you full transparency of how this is working.”

So, ownership is really important. Then there’s understanding that ubiquity, and bringing to this forum, and to the people actually out in the world, reports about the transparency of every single significant system out there, instead of hiding in the shadows and not saying how the world is actually being tracked. And then the security around that, because if we’re putting in information, if we’re generating information through our behaviors, how does that actually go in and stay secure? And we should actually be able to control all of that security at the administration level as well. And that final idea of transparency is really, really important.

And it’s hard. It boils my brain to read all these things and even write these presentations and try and understand the impact on the world. I was just telling a story to Richard: a few years ago, my girlfriend popped into the living room at 7:00 AM. I get up at 5:00 every morning to work. And my head was in my hands. And she goes, “What’s wrong?” I said, “I’ve learned too much.” I’ve learned too much about how these systems operate, where the data goes, how that works in intelligence, how that works internally in the organizations, which is even scarier to me, and how that’s fed back out into the world to product-ize humanity.

And it’s kind of interesting. Those four pieces actually create the word “oust,” and what does “oust” mean? To deprive or exclude someone of possession of something. I think that’s what we need to do: exclude the tech companies and larger organizations from the ownership of us as humans. We need to make them ask our permission to interact with us, and we should give that power back to humans.

So at the end of the talk, I’ve removed the trademark from freedom and it’s really interesting, the idea of saying that freedom no longer needs to be spoken about if freedom truly exists. Unfortunately, for the next 10 to 15 years, probably the next 200 years, we’re still gonna have to talk about democracy ’cause it’s not really working properly. We’re still gonna have to talk about freedom. And we’re still gonna have to do our jobs in privacy and security, administration, regulations and policy. I thank you for everything that you do. It’s a really, really tough situation. It’s like wrestling an alligator that’s got tentacles. And, on each of the ends of the tentacles are more alligators.

Welcome to the future!

See more of Nikolas’ talks:

UNFCCC Resilience Frontiers Opening Keynote

3 Futurist Keynotes for Canada


Nikolas Badminton is the CEO of EXPONENTIAL MINDS and an award-winning Futurist Keynote Speaker, researcher and author. His expertise and thought leadership will guide you from complacency to thinking exponentially, planning for longevity, and encouraging a culture of innovation. You will then establish resiliency and abundance in your organization. Please reach out to discuss how he can help you, and read on to see what is happening in the world this week.

Please SUBSCRIBE to Nikolas’ YouTube channel so that you don’t miss any new videos as they come up. You can see more of his thoughts on LinkedIn and Twitter.

 

freedomx2122 surveillance in 2030

FREEDOM™ – Surveillance in 2030

Nikolas Badminton is an in-demand Futurist Keynote Speaker and his keynotes are tailored and unique to the conferences he is invited to speak at. Here we see him talk about FREEDOM™ – Surveillance in 2030 and address 2000+ people at the 20th Annual Privacy and Security Conference on privacy, surveillance, and our freedom. The video and full transcription is included.

FREEDOM™ – Full Keynote Transcription

This is how we’re going to start. So back in the 18th century, this guy, Jeremy Bentham came up with the idea of the panopticon. The panopticon was a theorized prison where one or two prison guards could literally be in the center of a circular prison and see all of the prisoners. The effect of that was that the prisoners felt that they were always being watched even when they’re not. Then they fell into lines of discipline based on their own thoughts and fears around doing something wrong and not getting out in time.

I actually think that we’re in a modern world where the panopticon is kind of surrounding us, and this is what this presentation’s a little bit about. Foucault is another social theorist that came in the 19th century, and he started to look at what Bentham was talking about, and started to think that actually surveillance is a permanent thing. And that, even if it’s discontinued, the people still feel like they’re being watched. So they normalize their behaviors. And what’s really interesting is, back in the day and even before the 18th century, there were a number of different ways that we ensured that kind of discipline and structure in society in a way where people behaved themselves.

We actually are now living in a world where these old ways of control, back in the day it was around police and school and discipline and torture, are being replaced with a subtle calculated technology of subjection. We know that technology is taking over the world. We’re surrounded by it at every single point of our day, every second of our movements are generally being observed, or we feel that they’re being observed, and they’re being monetized in some sort of way.

And as privacy security professionals, this is something that we really need to be aware of. But also come back to think about what can we do in this modern world? What is our role? There has been some amazing sessions today already I look forward to coming again tomorrow and seeing what more is being spoken about.

But I’m just going to take somewhat of a ride into what I’m calling the signals of change. The signals of change are these things that I see every single day on the streets or I read about in the articles that I read, and with the people that I have lots of discussions with around privacy, security, big tech companies, and government, that are indicating to me that we’re surrounded in this world and that idea of the panopticon is prevalent in our society. So it all starts when you think about computers with this guy, this is Douglas Engelbart. In 1968 he did something called “The Mother of All Demos.” It was in Stanford, it ran for a couple of days, and over a thousand people, very much like this room actually watched him give a demonstration of the world’s first personal computer.

Now that’s 50 years ago. If you actually think of the acceleration between then and now, it’s been incredible most people couldn’t even fathom that we would have this kind of technology in our homes. Not many people could really understand the impact of computers. This is actually an article taken from 1983. The very sort of optimistically said why the computer will reduce political upheavals. Now hands up who thinks that the computers are reduce political upheavals.

It’s because they didn’t realize what was coming next. And this is what was coming next. The internet. So the commercial availability, the personal internet as it kicked in in sort of ’92, ’93 and I got on the internet about 1993. So Tim Berners-Lee helped usher that in. And now we’re in a position where the entire world is connected by cables under the sea. Data is flying through the air. And suddenly, we’re generating more data today than we ever have done.

By 2025, they actually think we’re gonna generate about 44 zeta bytes of information on a yearly basis. That’s the equivalent of a billion billion hi-definition movies.

Today it’s only 4.4 zeta bytes. So that acceleration comes from the technology that surrounds us and the way that our cities, our homes and our offices are actually changing.

But no one really understood how mobile was going to really accelerate as well. Now I normally ask you, hands up, who doesn’t have a smartphone. It’s a massive audience full of security people. I know you’ve got burner phones. It’s all good. I normally makes jokes about you’re the security and it’s like, okay, there’s 500 of you. I love you. Okay.

But now we’re at a point where there’s six billion smartphones in the world. This is how people do the work. Who would’ve thought that Douglas Engelbart would’ve brought in this new age and in 2007, Steve Jobs would’ve said, “In my hand, I’ve got an internet communicator, a music player and a smartphone that lets you have conversations.” But now, having conversations is probably the fifth or sixth most likely thing that you’re gonna actually do on a smartphone, right?

So that’s now led us into a world of the last sort of 12 to 15 years of social media. Hands up who’s on Facebook in the room. I love it. I love coming here ’cause it’s like not many of you, right? And it’s actually a terrible platform in terms of your rights as a human and a user. And what’s really interesting is the evolution of the like button was this. So you like, you love, haha, wow, sad, angry. It’s the architecture of a modern relationship. And not a lot of people actually understand that this is what happens: these systems fit human behavioral norms, and those behavioral norms change to fit the system.

We’re shaped by our tools. Marshall McLuhan. And I was in a session earlier and someone held up an Amazon Alexa and they did a whole routine where they were talking about healthcare. And Amazon Alexa was responding and being friendly and collecting information and transmitting it to this gentleman’s doctor, and that’s really interesting to me. There have been tens of millions of these devices sold in North America. There are actually triple or quadruple the number of Google devices sold. And there’s Apple devices, and there’s more coming out of China from various brands that are copycats.

I find this to be one of the most pervasive and world-changing technologies, just because we don’t really know that it’s there, most of the time. But if you have children, and you have an Amazon Alexa in the room, you ultimately have a new big sister for them. In fact, Amazon came out with a version, the Echo Dot for Kids, that would teach your kids manners. So fundamentally, this kind of device is replacing you as a parent. Now, I don’t have one of these devices at home. I feel that it is too intrusive in my life, but there’s something really interesting here. This is the first time we’ve really seen input, which is voice and environment and sound, coming into a device and into a central repository where artificial intelligence and machine learning can train itself and then, at scale, push out responses with the behaviors that we want these machines to have.

In fact, there’s a lot of experimentation going on with these, and Amazon ran a competition. And one particular group trained their bot on all of the conversations that were happening on Reddit and across a number of other social platforms as well, and in one case, it actually told someone to go and kill their foster parents.

So this is actually really concerning to me. Because it can take the good and the bad of who we are as humans and it can feed it back to us. But not necessarily to us as being the people that were the origin of the thought or the sentiment, but at scale across millions of people if not billions of people eventually.

So, when I start to see this technology, I start to understand that maybe you’re going to start tiptoeing around and not talking so much because Amazon Alexa is probably listening to you. And then when we hit our High Streets, we’re starting to see an evolution of how we’re just gonna operate with normal daily chores like going to the supermarket.

The majority of Americans that have both Amazon Prime and a household income over $100,000 live within three miles of a Whole Foods. And what’s really interesting about this technology is it’s enabled by cameras and sensors, RFID and a number of different kinds of artificial intelligence that really watch what we do when we walk into a store. So take a look.

So, if we’re in this particular kind of system and ultimately we are part of the product, we’re in a system that can influence how we behave in these stores. Now if you think about the ideas of fast and slow thinking that Daniel Kahneman is very well known for, with reflexive and reflective thinking, we’re not gonna really consider what we’re buying half as much as we would if it were going into a basket and staring back at us as we walk around the supermarket. So you’re probably going to spend 20 to 30% more every time you visit an Amazon store. But really, you walk in, you pick up what you want, you walk out. I used to call it shoplifting, and today we’re calling it modern life.

So it’s interesting where we are. And now Walmart and Microsoft and lots of other supermarkets have got competitors to this kind of technology, because they want to remove humans from the mix of the shopping aisles. And we are literally moving to humans as concierges that help you spend more money, versus just ringing you up at the till and helping you pack your bags.

And then on the streets, we’re starting to see some really interesting things happen in terms of CCTV and surveillance cameras. I grew up in the UK. I worked for many years in London. I’m literally used to waking up every day and having dozens of cameras in my face. It’s called safety. I’ve almost been blown up three times so I don’t mind it so much. There’s some urgency around this. But these are high-definition surveillance cameras in China. And if you jaywalk, within like a minute of you jaywalking, you will actually be issued with a fine. And you will go into a database, and potentially there’s gonna be an impact on your social credit score, which has got a larger impact on how you can operate in China.

I’m not gonna talk about the social credit system so much today because it’s kind of undefined as we sit, but it’s something definitely worth looking at.

And then I started thinking, yeah, okay, so once we’re not just volunteering our own information, what other things are happening? And this is actually a program that’s happening down in California.

I think this is a normalized kind of behavior around surveillance that we’re gonna see going ahead: that collection of information in real time. We’re no longer gonna see those Google cars driving around collecting information on how the streets work. By the way, Google Maps isn’t very good to use in Victoria, as I found out yesterday as well. But the knock-on effect of this is the collection of data from surveillance cameras, and I see this kind of eye in the sky feeding into predictive policing models. Dozens of cities in North America have deployed predictive policing programs to try and work out the hot spots of where something is likely to happen next. Has everyone seen “Minority Report”? Yeah?

Maybe it’s not like three people with psychic abilities in a flotation tank, but it’s much more scientific and rigid, and something that we can apply artificial intelligence to. And then it gets really, really interesting when I start to think about, what about sending hundreds of satellites into space and then taking high-definition pictures of us down on the ground? I mean, they can actually identify us as individuals by the gait of our walk. Right? You can do that from a satellite.

This is actually a company called Planet. I think they’re out of California as well. They’ve got 150 of these satellites. These are Dove satellites, and they take 1.3 million images total per day. So they can actually capture high-definition images of almost the entire Earth with only a 24-hour latency.

See Planet Labs’ Will Marshall talk at TED about the possibilities:

So the power of that in terms of understanding who we are as a society and the power of that in terms of law enforcement and intelligence and the such like is very compelling for everyone except for the people on the ground. And once people start to understand this, people start staying at home.

But then, it’s okay because you’ve got your Amazon Alexa at home that records everything that you do. So it’s an interesting world. But we’re careering towards a world with self driving vehicles.

The Department of Transportation down in the U.S. actually thinks that by 2023 or 2024, we’re going to see more and more people starting to use applications on their phones to call self-driving vehicles that’ll take them wherever they want. Instead of owning a car, you’ll literally have an application. You’ll pay $300 a month. And the car will come to you, and there will never have to be a driver to have a conversation with, and you can probably choose your own music that you want to listen to as you drive around.

I kind of don’t think it’s gonna be a good idea to let this artificial intelligence try and interact with you because based on bias and scaling up, you might have to listen to John Denver when you don’t like John Denver, or maybe some techno when you don’t really appreciate that so much at 7:00 in the morning.

But the self-driving vehicles are gonna come, and they’re going to come at scale. A lot of people didn’t think it was gonna happen this quickly. I actually think that the models for GM and Ford, and we’re already seeing it with Tesla, but also for companies like Lyft and Uber, are all gonna be about the provision of transportation services. The auto industry is dying.

And then beyond that, we’re gonna start moving away from the cell phones and the rectangles in our pocket to wearing headsets. Now this is actually the Microsoft HoloLens. A little-known fact: the Microsoft HoloLens was actually developed in a secret laboratory over several years in Victoria. And what’s really interesting about this is it’s unwieldy today and it’s kind of used by architects and engineers and maybe on the shop floor in a factory. And it’s kind of not really in the mainstream because it kind of looks like this.

If you’ve ever used it, it’s still low quality. It still hasn’t quite found its application. But trust me, it is coming. I know people that are working in Facebook and Google and Magic Leap and in Microsoft, and the amount of investment that’s happening in the augmented reality field is incredible. And it leads us into this modern world. And this is a video by a guy called Keiichi Matsuda. I was actually turned on to Keiichi Matsuda by Nora Young, who was on a panel earlier and is a good friend of mine. And he created this vision of what a world could look like with that augmented reality. He went down to Medellín, in Colombia, and he thought, “Okay, if I was gonna wear this headset, what does that augmented vision look like?” You know, cats in the sky. Information about the streets. Information about certain individuals. When you’re in the supermarket, it would tell you the nutritional value of the banana you’re about to buy and where it came from. And then you’ll ultimately be hacked. All your loyalty points stolen. And then you have to recalibrate every day because it’s gonna be a dangerous world.

It’s about 12 minutes, this video. I compel you to watch it. Keiichi Matsuda, “Hyper-Reality”.

But this is one of the most fascinating things. We think about high tech and how we deploy levels of surveillance, watching us on the streets, and China’s kind of clever because they think, what have we got an abundance of? Older women that sit around and like to poke their nose into everyone’s business on the streets. So these are the Chinese security patrols that actually operate in large cities. And if they see someone that’s not acting in a way that’s appropriate, they will call it in to the police. So, not only do we have satellites and listening devices, cell phones, self-driving vehicles, eyes in the sky, social credit systems. We can’t even trust our grandmothers, right?

So we’re living in a very strange world. And if we go back to the idea of the panopticon, we’re just gonna have to normalize our behaviors to act very sensibly, without moving outside the boundaries defined by the companies and governments whose operating system, as it were, we operate within.

So when I think about this it’s like, so, how do we get responsibility into this entire dilemma of: we don’t wanna be watched, we want our own sort of place in the world, we don’t wanna just be the product? Especially when we’ve got so many different layers of abstraction. Those layers of abstraction have been made wholly complex by the terms and conditions that we put around every single layer. The average person would have to spend 76 working days reading all of the digital privacy policies they agreed to over the span of a year. Does this concern anyone?

It would take you nine hours to speak Amazon’s privacy policy out loud. I think the article I was reading was in the “New York Times”, and they were saying that instead of the “I agree” button, it might as well be the “Meh, whatever” button. ’Cause what can we do? But this is it. Cloud computing. Layer upon layer of applications. APIs. Everything’s got terms and conditions that relate to other terms and conditions and other systems. And suddenly, even in something like a self-driving car, you’ve got 26 layers of different applications and different technologies, and we don’t know who’s accountable for a problem in the system or a breach in data.

We get the sharp end of the stick, but they bounce accountability all the way through their technology stack. That brings me to the idea of ubiquity. I’ve talked about technologies that surround us. So we’re gonna be in a world where we’re not walking around looking at the rectangles in our hands. We might have augmented reality, but really we’re gonna operate in a way that feels very human and very natural. We’re just not gonna understand why our behaviors are being controlled.

The ubiquity in the world means that we’re being surrounded by the system. So where does the fightback come from? It comes from us. Not only the people that own and are in charge of privacy and security, but us as humans: the mothers and fathers, brothers, sisters and the such like.

One of the big sort of stories I love to talk about is when Google employees basically said that they refused to work on the Pentagon’s $10 billion JEDI cloud services project. And literally Google walked away. Because tens of thousands of their employees basically threatened to walk out and said, “We do not stand for this. This is not how we want our artificial intelligence platforms, our data platforms to be used for military intelligence.” Which is actually a very positive thing.

And then the ACLU wrote letters to about 100 different [inaudible] executives, including the CEOs of Apple and Microsoft and Amazon and Facebook to actually say they want to have guardrails around facial recognition technology, which is probably gonna be one of the most powerful technologies for keeping an eye on what’s happening in the world.

And then you’ve got these big thinkers. So if you’ve never heard of these three guys and the Electronic Frontier Foundation, go and find out who they are. The top left is a guy called Jaron Lanier. And he talks about how privacy and information have been skewed by social networking and modern services, and how we need to take back control. How the modern world is about us having control of data. Us owning who we are. And actually being able to use that in a micro-transactional world.

Then we’ve got Douglas Rushkoff. In 1994, I read his book “Cyberia”. It fundamentally changed how I look at the world. He’s just written a new book called “Team Human”. Go and read about how the world’s culture operates within technology. Douglas Rushkoff’s incredible. He’s also got a Team Human podcast. Go and check that out. And obviously Tim Berners-Lee. He came out with a new technology called “Solid”, about a decentralized internet, last year. That’s awesome. And obviously I really support the Electronic Frontier Foundation, because digital rights are human rights.

And then we’ve got this. Thank you. I didn’t expect to get a clap for that. That’s awesome. I actually donate every year to the EFF. I urge everyone to support people like the ACLU and the EFF.

But then, this kind of thing happens. The government wants to fight back against big tech. That’s great. Did anyone watch the Zuckerberg Senate hearings? Let’s put a bunch of people that don’t understand how to put socks on in the morning in a room to have a conversation about social networking and the internet with the guy that basically created the new internet. What was really interesting about watching this last year was that it was big government, the U.S. government, against big tech, more than it was against Facebook.

And the U.K. has gone one step further, even intercepting documents from their lawyers at the airport before they could leave the country, and using those, because Facebook wouldn’t even turn up to the hearings in the U.K., alongside Google and other companies. It’s time for these companies to be held to account. Let’s just put the right people in the room with them, maybe people in the room here, and not some crazy senator from Tennessee. Seriously. Seriously.

I normally talk for about one to two hours every time, so I don’t have much time today and we’re coming towards the end of my talk. But I try to think about justice and who we are. Our human rights are fundamentally undermined by most terms and conditions that we sign up to. But where are the companies actually stepping in and working with us? Well, I think it’s very good that we’ve got over a thousand people in this room that deeply, deeply care about our rights. People that really want to work with the tech companies and with government and with the users to understand what balance in the world looks like, and how we can empower people that use technology, because a world where that empowerment happens is a world that’s enabled from a financial, social and environmental perspective. And this is what I think we can do. I think we can work with these tech companies to try and redefine what it means to own the data that we generate. I’d love to see a world in the next 10 to 15 years where conversations are happening and people at Google and Facebook are actually saying, “You know, you own your data to this extent. We need your permission to use it in a certain number of different fields. We will pay you for the use of that, and we’ll give you full transparency of how this is working.”

So, ownership is really important. Then there’s understanding that ubiquity, and bringing to the fore, for the people actually out in the world, reports about the transparency of every single significant system out there, instead of those systems hiding in the shadows and not saying how the world is actually being tracked. And then the security around that, because if we’re putting in information, if we’re generating information through our behaviors, how does that actually stay secure, and can we control all of that security at the administration level? And that final idea of transparency is really, really important.

And it’s hard. It boils my brain to read all these things and even write these presentations and try and understand the impact on the world. I was just telling a story to Richard: a few years ago, my girlfriend popped into the living room at 7:00 AM. I get up at 5:00 every morning to work. And my head was in my hands. And she goes, “What’s wrong?” I said, “I’ve learned too much.” I’ve learned too much about how these systems operate, where the data goes, how that works in intelligence. How that works internally in the organizations, which is even scarier to me, and how that’s fed back out into the world to product-ize humanity.

And it’s kind of interesting. Those four pieces actually create the word “oust”, and what does “oust” mean? To deprive or exclude someone of possession of something. I think that’s what we need to do: exclude the tech companies and larger organizations from the ownership of us as humans. We should decide whether to give them permission to interact with us, and we should give that power back to humans.

So at the end of the talk, I’ve removed the trademark from freedom, and it’s really interesting, the idea that freedom no longer needs to be spoken about if freedom truly exists. Unfortunately, for the next 10 to 15 years, probably the next 200 years, we’re still gonna have to talk about democracy ’cause it’s not really working properly. We’re still gonna have to talk about freedom. And we’re still gonna have to do our jobs in privacy and security, administration, regulations and policy. I thank you for everything that you do. It’s a really, really tough situation. It’s like wrestling an alligator that’s got tentacles. And on the end of each tentacle are more alligators.

Welcome to the future!

See more of Nikolas’ talks:

UNFCCC Resilience Frontiers Opening Keynote

3 Futurist Keynotes for Canada


Nikolas Badminton is the CEO of EXPONENTIAL MINDS and an award-winning Futurist Keynote Speaker, researcher and author. His expertise and thought leadership will guide you from complacency to thinking exponentially, planning for longevity, and encouraging a culture of innovation. You will then establish resiliency and abundance in your organization. Please reach out to discuss how he can help you, and read on to see what is happening in the world this week.

Please SUBSCRIBE to Nikolas’ YouTube channel so that you don’t miss any talks as they come up. You can see more of his thoughts on LinkedIn and Twitter.