The AI-first world could look very different in the next decade

Tyagarajan S May 23, 2017

Google I/O presented a vision of our AI-first future. In this world where machines see, hear, perceive and perhaps even create, Google wants to organise information that lies beyond digital walls.

Like the internet, AI is likely to be a massively disruptive technology that will bring about enormous social, political and cultural change.

This week’s FactorFuture extrapolates into this AI-led world over the next decade to identify some of the major emerging shifts. While in no way exhaustive, they capture broad and diverse areas that will redefine our relationship with technology and, in the process, deeply shape our cultural and social behaviour as a species.

#1 Ubiquitous computing that’s invisible

The paradigm for harnessing computing power, which involves carrying around and interacting with an all-in-one device (like a smartphone), may be changing.

Voice assistants, like Amazon Echo and Google Home, are de-bundling computing to make it just an utterance away as long as you are ‘near’ them. This natural form of interaction is expected to become the default platform of choice in most private settings, especially as machines understand us more easily and accurately.


It will be just one of the interfaces, though.

Amazon’s Echo Show, with a small screen and an Alexa speaker, may even seem regressive at first glance (aren’t we past the era of fixed boxes in our homes?). But it has perfectly valid use cases when you think about it from the ground up. Mobile phones actually present a rather clunky form of interaction in our homes, where our personal space expands compared to public spaces. There’s a solid case for little smart screens, which we can touch, talk to and see, all over our homes in the near future.

Amazon Echo Show is an Alexa-powered smart speaker and touchscreen that can play videos and support video calling. Image credit: Amazon

At Google I/O, the company presented a vision of connecting the various screens that are already in the house, mainly TVs and smartphones, to make them all work through the common intelligent interface of Google Assistant.

But what we’ve seen at Google I/O (Google Lens) and, to a lesser extent, at Facebook’s F8 earlier this year is that computers (smartphones) are also getting good at ‘seeing’, visually understanding the world at a basic component level.

Amazon, without a strong presence in smartphones, launched Echo Look last month. Echo Look has a camera that can see you and provide fashion advice. We can be sure that this narrow premise will expand very soon.


Over the next decade, super-high-resolution cameras powered by AI could see into our lives to order things when we are running out of them, help us build better habits when they see us binge eating, or even automatically control parts of the house (switching off lights or adjusting the temperature because they know you aren’t home).
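At its core, that kind of camera-driven automation is a mapping from observations to actions. Here is a minimal sketch of such a rule engine; every name and threshold is hypothetical, and a real system would feed it from computer-vision models rather than hand-written observations:

```python
# Toy sketch (all names hypothetical): a rule engine a camera-fed home
# system might use to act on what it "sees" around the house.

REORDER_THRESHOLD = 1  # reorder when a tracked item drops to this count

def decide_actions(observations):
    """Map camera observations to home-automation actions."""
    actions = []
    # Restock pantry items that are running low.
    for item, count in observations.get("pantry", {}).items():
        if count <= REORDER_THRESHOLD:
            actions.append(("order", item))
    # Save energy when nobody is home.
    if not observations.get("occupants_home", True):
        actions.append(("lights", "off"))
        actions.append(("thermostat", "eco"))
    return actions

# Example: milk is running low and nobody is home.
print(decide_actions({
    "pantry": {"milk": 1, "eggs": 6},
    "occupants_home": False,
}))
```

The hard part, of course, is not the rules but the perception feeding them; that is where the AI comes in.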


Superimpose the intimate personal information already online, and we’ll have distributed, ubiquitous computing in our homes before anywhere else.

Tomorrow, screens that connect to the Google Assistant or Alexa APIs (or even Siri and Cortana, god forbid) will be commonplace.

In this AI-first world, ‘intelligence’ will reside in everyday objects, from toasters to washing machines to cars and bathroom mirrors, thanks to neat APIs that bring in the intelligent voices of these assistants. We’ll interact with them without ‘breaking’ the flow of our lives the way we do today when we immerse ourselves in smartphones and laptops.
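Conceptually, hooking an appliance into an assistant means registering the handful of things it can do as intents the assistant can dispatch to. Both Google Assistant and Alexa offer real SDKs for this; the sketch below invents its own registry and handler names purely for illustration:

```python
# Hypothetical sketch: how a simple appliance might expose its abilities
# as intents to a voice assistant. All class and intent names are invented.

class IntentRegistry:
    """Maps intent names to handler functions on a device."""

    def __init__(self):
        self.handlers = {}

    def register(self, intent, handler):
        self.handlers[intent] = handler

    def dispatch(self, intent, **slots):
        # The assistant resolves speech to an intent plus slots, then
        # dispatches here; unknown intents get a graceful refusal.
        if intent not in self.handlers:
            return "Sorry, I can't do that."
        return self.handlers[intent](**slots)

# A toaster registers the one thing it knows how to do.
toaster = IntentRegistry()
toaster.register("start_toast", lambda level=3: f"Toasting at level {level}")

print(toaster.dispatch("start_toast", level=4))  # Toasting at level 4
```

The appliance stays dumb and cheap; all the speech recognition and language understanding lives in the assistant’s cloud, which is exactly what makes intelligence in a toaster economical.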

#2 Babelfish

You don’t realise what a groundbreaking construct language is until you travel to a country whose language you do not understand at all. The mundane, efficient and nearly invisible interactions we have on a daily basis suddenly look Himalayan.

But what if the wall of language falls in the next decade? Or perhaps, at least, crumbles a bit.

Late in 2016, Google declared that its AI-powered language translation tools were nearly on par with human translators. This was powered by a shift from Google’s “phrase-based translation” (which breaks a sentence into words and phrases and looks for matching components in the new language) to neural network powered translation (one neural net ‘understands’ the meaning while another converts that meaning into words in the new language).
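To see why the shift matters, consider what phrase-based translation actually does: it substitutes each matched phrase independently, with no view of the sentence as a whole. A toy sketch of that older approach (in no way Google’s production system, and with a hand-made three-entry phrase table):

```python
# Toy phrase-based translation: greedily match the longest known phrase
# and substitute it, independently of the rest of the sentence.

phrase_table = {  # tiny hand-made English -> French phrase table
    "the cat": "le chat",
    "sat on": "était assis sur",
    "the mat": "le tapis",
}

def phrase_based_translate(sentence, table, max_len=2):
    words, out, i = sentence.split(), [], 0
    while i < len(words):
        # Try the longest phrase starting at position i first.
        for n in range(max_len, 0, -1):
            phrase = " ".join(words[i:i + n])
            if phrase in table:
                out.append(table[phrase])
                i += n
                break
        else:
            out.append(words[i])  # unknown word: pass through untranslated
            i += 1
    return " ".join(out)

print(phrase_based_translate("the cat sat on the mat", phrase_table))
```

A neural model, by contrast, first encodes the entire sentence into vectors, so the decoder can draw on full-sentence context rather than isolated phrase matches, which is what lets it handle meaning rather than just words.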


Language is deceptively complex. Beyond deciphering generic meaning and recognising slang, proper nouns and obscure words, languages have subtleties, such as irony, sarcasm and metaphor, that have so far been impossible for machines to understand. But thanks to machine learning, computers are inching ever closer to full accuracy with language.

Over this decade, we’ll likely sport earpieces, the real-life equivalent of the little Babelfish from The Hitchhiker’s Guide to the Galaxy, that can translate speech in real time. Augmented reality (AR) lenses could translate text in real time. By the time 2027 rolls around, the need for multilingual skills may even be on the wane, crumbling another barrier and shrinking the world.

Pilot earpieces from WaverlyLabs claim to provide instant translation between speakers. Image credit: WaverlyLabs

But this is also a necessary milestone for AI to take bigger leaps: once we pass the language summit with flying colours, much of humanity’s codified knowledge becomes available to machines, paving the way towards general artificial intelligence.

#3 AI deterrence

The biggest weaponisation of AI will be the result of our escalating cybersecurity needs. Automated, rapid and increasingly complex cyberattacks will become more common. While even the worst cyberattacks do not leave us permanently crippled today, they could be catastrophic in a soon-to-come AI-first world.

From automated cars to factories to a world running on AI platforms, a ‘digital winter’ (a world after a devastating global cyberattack) would be treated along the lines of a ‘nuclear winter’ (a post-nuclear-war world). Worse, a greater number of non-state actors may get their hands on the weapons capable of causing it.


This is likely to force our hand into employing AI to defend our networks, protect our data and battle cyber viruses. Big governments will build and stockpile cyber weaponry more openly. The result: an AI cyber weapons arms race.

Back in 2014, Edward Snowden revealed that the NSA was working on MonsterMind, an autonomous system to detect and defend against cyberattacks (much like missile shield programmes) and perhaps even initiate attacks of its own. Such systems could become the norm, as manual responses will be too slow to be relevant.

As we become fully dependent on autonomous, AI-controlled cyber defence systems to keep our world from plunging into chaos, the vicious imaginings of our dystopian fiction may just come crawling out.

Will we have conflicting AIs battling each other, making humanity mere pawns in the game?

Person of Interest presents a thrilling dystopia where AIs gain sentience and come into conflict with each other. The increasing weaponisation of AI presents the surest path towards the AI meltdown that science fiction has often feared

We’ll have to wait and watch. But it seems inevitable that over the next decade AI will become the key ‘deterrent’ maintaining an uneasy status quo in cyberspace and beyond.

#4 People programming, aka governance

We’ve read enough about how big data and AI helped elect Donald Trump and enabled Brexit. Closer to home, the BJP government has been doing some serious number crunching to drive political propaganda forward in a country that increasingly relies on Facebook and WhatsApp for news.

It will soon become commonplace for political parties to leverage machine learning to drive political propaganda and influence voters.


AI support will also become necessary for determining policy, as leaders use these systems to churn through increasingly large amounts of personal and public data and identify policy implications in a complex, interconnected world. Governments may even test their actions in simulated cities before acting in real life.

Another area where governments and citizens will likely see a huge impact of AI is in the personalisation of services and benefits. Using complex, multi-dimensional data, governments can proactively offer the right benefits, subsidies and support to individuals, depending on their location, life stage and changes in their circumstances. This personalisation may eventually even do away with the multi-hierarchy federal structure of governance and consolidate power with a central body that can micro-personalise and reach out to individuals, thanks to automation.


Similarly, criminal justice and public safety will increasingly leverage AI and biometrics to proactively bring down crime and ensure order. This will also raise more and more questions about privacy and freedom. A stifled, manipulated citizenry may be commonplace by the next decade as the power of AI consolidates with a handful of corporations and big governments.


#5 AI companionship

I know what you are thinking  —  sex with robots.

If the experts are to be believed, we’ll be bumping uglies with sex bots within the next decade, and people will seriously start thinking about marrying them by 2050.

But sex with robots is going to take a while before both the technology and the cultural shifts make it acceptable. And who knows, by then virtual reality could simulate all of this without the need to construct elaborate life-like robots.

We will, however, get intimate with machines over the next decade in many other ways. While it may not be as simple as Mark Zuckerberg’s suggestion that there could be a “fundamental mathematical law” that can determine social relationships, AI will progress towards making its manifestations (as virtual characters, robots or just voices) likeable and even loveable.


Culturally, we are laying the groundwork for such a world today. Empathy and companionship are fundamental human needs, and pets are increasingly satisfying them, taking on the roles that society had reserved for a spouse, a friend or a baby. Animals are becoming household members and part of many daily lives.

Soon, they will have serious competition.

As AI gets good enough to simulate a pet, we’re going to see a big movement towards adopting artificial pets, ranging from familiar animals to cute, domesticated fantasy creatures that we nurture and grow. Some of these may even be virtual, as the rise of AR enables mapping virtual entities onto our world.

Pepper, the humanoid robot from Softbank Robotics, can perceive human emotions

Increasing urbanisation and restrictions on living with animals, concerns about the treatment of animals, allergies, the flexibility to just ‘turn them off’ and lower maintenance will push people towards robotic or virtual pets. AI will enable these to evolve and behave according to their owners. More importantly, they will provide the kind of emotional relationship many will seek as the world moves towards a more independent, lonely existence.

This will be a significant step in the man-machine symbiosis which will remain the bedrock of our evolution through the rest of this century.


