- On the outskirts of Bangalore, Tata Elxsi is testing its self-driving car rig fitted with 3D Lidar, short range sensors and cameras
- Over the last two and a half years, the team has been building a self-driving platform from scratch
- The company has requested the government for permission to test its robotic cars on a public road in Bangalore
“India would be the last place on earth to get self-driving cars.”
That was Travis Kalanick, Uber CEO and intrepid posterboy of urban mobility, dashing our hopes some months ago. But, we had known it all along, hadn’t we?
Except, someone forgot to send the memo to Tata Elxsi. Inside the tranquil, old-world campus of Tata’s design company, a small team of experts is busy working on rolling out self-driving vehicles on Indian roads. The endeavour caught public attention recently when the company requested the government for permission to test its robotic cars on a public road in Bangalore.
Intrigued (enough to brave the traffic around Whitefield), I made my way to the Tata Elxsi office earlier this month where I met with Nitin Pai, senior vice-president and head of marketing and strategy, and Rajesh Kumar, vice-president, strategic initiatives, to know more about its autonomous vehicle efforts.
India needs its own version of ‘autonomous’
Kumar, a Tata Elxsi veteran of two decades, has been leading the company’s autonomous car initiative for more than two years now. Back in 2014, he and his team bought a car from the market (Tata’s compact sedan Zest), tore it apart and built some mechanical adapters that would allow them to control it by just tapping keys on a keyboard. In essence, they hacked a low-range car and gave it “drive-by-wire” (an industry term for controlling acceleration and braking through electronics) capabilities, normally found in more sophisticated vehicles.
The word jugaad springs to mind.
Today, at a test range on the outskirts of Bangalore, a couple of self-driving rigs are eating track, helping the autonomous vehicle team ‘real world’ test their platform (called the Autonomous Vehicle Platform). Each rig is one of these hacked, drive-by-wire-enabled low/mid-range cars fitted with all the sensory accoutrements: a 3D Lidar on top, multiple cameras (two in the front), and a few short-range sensors.
The cars they are using are both priced below Rs 6 lakhs in the market — a point that underscores the key reason why Tata Elxsi is pursuing this journey in India even as Silicon Valley behemoths pursue it in other parts of the world: to build a solution that’s relevant to India.
Cost is a big factor. “Any autonomous solution for India has to work on a Rs 5 lakh car as much as a Rs 1 crore car,” says Pai, who worked more than a decade in R&D and engineering in the automotive sector before moving to marketing.
It’s not just cost, though. Pai believes that global solutions can’t be applied to India without significant ground-up crafting. He gives an example: a prototype that the AVP team is building, which integrates wheel motion sensors using mechanical adapters to bring that last bit of accuracy to the vehicle’s location and to account for short periods when the GPS signal is unavailable.
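The idea of bridging GPS dropouts with wheel-motion data is a classic dead-reckoning technique. The sketch below is purely illustrative (not Tata Elxsi’s actual implementation): it integrates a measured wheel speed and yaw rate over a timestep to keep the pose estimate alive until GPS returns.

```python
import math

def dead_reckon(x, y, heading, wheel_speed, yaw_rate, dt):
    """Propagate a vehicle pose (x, y in metres; heading in radians)
    from wheel-motion sensors alone over one timestep of dt seconds.
    Illustrative only; a real system would also model sensor noise."""
    heading += yaw_rate * dt
    x += wheel_speed * math.cos(heading) * dt
    y += wheel_speed * math.sin(heading) * dt
    return x, y, heading

# Example: a car moving straight ahead at 10 m/s for 1 second
print(dead_reckon(0.0, 0.0, 0.0, 10.0, 0.0, 1.0))  # (10.0, 0.0, 0.0)
```

Errors accumulate quickly with this approach, which is why it only works for the “short periods” Pai describes; over longer stretches the drift has to be corrected against GPS or a map.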
Elxsi’s self-driving car journey
When the Tata Elxsi autonomous car team started work almost three years ago, they had to make a strategic choice on which approach to follow to build such a platform: the auto industry approach of increasingly integrating advanced driver assistance systems (ADAS) together or the “Google approach”?
For the auto industry, ADAS was the way to vehicle intelligence. When you use “park assist” in a car or feel secure because of its “collision detection” capabilities, that’s ADAS at work. It relied on hard-coded intelligence inside the vehicle’s electronic control unit (ECU) to process the various inputs it received to help the driver make a decision. In some super-advanced vehicles, it did some basic “drive by wire” too — to reduce speed, make a turn, etc — based on its conclusions, but the decision was usually left to the driver.
For a long time, the auto industry, driven by its ability to monetise ADAS more clearly, believed that integrating the various ADAS solutions together was the path to autonomous capabilities. Yet, that approach had a fundamental flaw — real autonomy could never be achieved by hard-coding rules because the real world is too dynamic. This is exactly what Google, with its experience in artificial intelligence (AI), machine learning and robotics, sought to solve.
Despite more than a decade’s experience working with automotive manufacturers on ADAS, Tata Elxsi chose the “Google path” when it started work on its Autonomous Vehicle Platform. Over the last few years, the original equipment manufacturers (OEMs) have begun to come around, realising that the Silicon Valley startups are rapidly upending their industry.
For Tata Elxsi, taking this approach meant working top-down on what kind of automated car it would like to begin with, even though it was never going to produce one of its own. What it wanted to do, instead, was to build an autonomous vehicle platform that vehicle makers and top vendors can use.
The company has deep expertise in individual pieces of the puzzle: the perception system, eyes and ears made up of sensors and cameras; a guidance navigation control (GNC) system, the intelligence that analyses inputs and makes path and movement decisions; and a drive-by-wire system, which translates the decisions into mechanical activities like braking, acceleration and wheel rotation.
Tying all of this together is the self-learning intelligence that the company has built. The sensor-fusion algorithm combines inputs from Lidar and other sensors to create a worldview. The object-classifier algorithm interprets what it is seeing (that’s a dog!). Together, they’re the eye and visual cortex, working in tandem to understand the world around.
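The division of labour between sensor fusion and object classification can be illustrated with a toy example. The names and data shapes below are hypothetical, not Tata Elxsi’s API: Lidar supplies range and bearing to object clusters, a camera classifier supplies labels, and a fusion step pairs them into a labeled worldview.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. "dog", from the camera-based classifier
    distance_m: float  # range to the Lidar point cluster
    bearing_deg: float # direction relative to the vehicle's heading

def fuse(lidar_clusters, camera_labels):
    """Pair each Lidar cluster (distance, bearing) with the camera
    label whose bearing is closest. A deliberately naive association
    step; real sensor fusion is far more involved."""
    world = []
    for dist, brg in lidar_clusters:
        label, _ = min(camera_labels, key=lambda c: abs(c[1] - brg))
        world.append(Detection(label, dist, brg))
    return world

clusters = [(12.0, -5.0), (30.0, 20.0)]      # (distance m, bearing deg)
labels = [("dog", -4.0), ("cyclist", 21.0)]  # (class, bearing deg)
print(fuse(clusters, labels))
```

The point of the sketch is the separation of concerns: the Lidar answers “where is something?”, the classifier answers “what is it?”, and fusion merges the two views into one.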
It’s one thing to know that a dog is standing by the side of the road, but reacting to it requires more complex intelligence. This is the role of the decision-making algorithm that comes up with drive-by-wire decisions. This forms the core of self-driving platforms. The path-planning algorithm translates all of this into a coordinated series of steps that takes the car from point A to point B.
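In its simplest textbook form, path planning is a graph search over free space. The following is a toy stand-in (a breadth-first search on a small occupancy grid, not the platform’s actual planner) that shows how a planner turns “get from A to B” into a sequence of waypoints around an obstacle.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first path planner on an occupancy grid
    (0 = free cell, 1 = blocked). Returns the shortest list of
    (row, col) waypoints from start to goal, or None if blocked."""
    q, seen = deque([(start, [start])]), {start}
    while q:
        (r, c), path = q.popleft()
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                q.append(((nr, nc), path + [(nr, nc)]))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],   # a wall forces a detour
        [0, 0, 0]]
print(plan_path(grid, (0, 0), (2, 0)))
```

Real planners work in continuous space, respect the car’s turning radius, and replan constantly as the worldview updates, but the core idea of searching for a feasible route is the same.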
The team spent most of the last two years simulating inputs before moving on to virtual environments using imported meta-data available in other markets. All of this was to mimic inputs to test and improve the “brain” without actually putting a car in the field. It was in 2016 that the team started running its full-setup rigs on test tracks.
What has made the journey trickier is the fact that sensors have been undergoing a massive revolution with costs dropping and capabilities exploding.
“Three years ago, using a radar and cameras would have been a sufficient approach, but today, Lidar has become the standard sensor backbone. The only debate is whether to use a 2D or 3D Lidar,” says Kumar. The platform, therefore, had to be flexible enough to adapt to technology changes without hardwiring it to a particular type of sensor input.
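One common way to keep a platform from being hardwired to a particular sensor is to put an abstraction layer between the sensors and the fusion logic. A minimal sketch of that design choice, with hypothetical names and stubbed data:

```python
from abc import ABC, abstractmethod

class RangeSensor(ABC):
    """Common interface so downstream code never depends on whether
    the hardware is a 2D Lidar, 3D Lidar, or radar. Hypothetical
    names; a real interface would carry timestamps, intensity, etc."""
    @abstractmethod
    def point_cloud(self):
        """Return a list of (x, y, z) points in the vehicle frame."""

class Lidar3D(RangeSensor):
    def point_cloud(self):
        return [(1.0, 2.0, 0.5)]  # stub: one 3D Lidar return

class Radar(RangeSensor):
    def point_cloud(self):
        return [(4.0, 0.0, 0.0)]  # stub: radar mapped into same frame

def worldview(sensors):
    # The fusion layer sees only the interface, not the sensor type,
    # so swapping sensor hardware doesn't touch this code
    return [p for s in sensors for p in s.point_cloud()]

print(worldview([Lidar3D(), Radar()]))
```

Under this arrangement, replacing a radar-plus-camera setup with a 3D Lidar means writing one new adapter class, not rewriting the fusion and decision layers.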
The team continues to stabilise algorithms and optimise the compute power needed. More importantly, it is ready to test its creation on public roads.
“When we wanted to apply for testing on public roads, we were unsure whether it was a state issue or a central government one,” says Pai.
One can imagine. The state government in Karnataka is still to come to grips with decades-old business models like e-commerce. Self-driving cars bring up complex questions. Here’s one: Who carries the liability in case of accidents?
But, Pai wants the government to separate the larger issue of self-driving-car regulations from the short to medium-term issue of regulating the testing. “The entire industry is in a learning phase right now and the government needs to just regulate the learning experiments. It can start by providing an approved section of road for the tests, with signages and safeguards in place,” he says.
Tata Elxsi continues to work with various governments and other bodies to enable this. And the government seems to be listening. Recent reports cited a top government official as saying that the government would make amendments to the motor vehicles law, which would allow for limited testing on a case-by-case basis.
Future of self-driving cars in India
Despite all the excitement, fully autonomous cars are quite some time away from rolling out on Indian roads.
In a country like India, even (relatively) straightforward algorithms like object classifiers get complicated. “The variety of objects that the algorithm needs to detect is quite large in India — from cyclists to cows to dogs to pedestrians; it’s not simple,” says Pai. There are also issues of inconsistent (or missing) road signage, missing lane markers and often, missing roads!
There are other data-infrastructure requirements, like high-definition maps (Google Maps isn’t good enough for a car to drive itself). While the effort to create a high-definition map has started in Europe and the US, it is yet to begin in India. Kumar is hopeful that it will soon start in India as well.
All of this means that we are unlikely to start seeing self-driving cars being sold in 2017. Or 2018, for that matter. However, over the next few years, we are likely to start seeing carmakers introduce increasingly sophisticated ADAS features through an autonomous backbone.
Pai believes that the use of autonomous platforms for off-road vehicles — like tractors, mining trucks, etc — is likely to come sooner (than on-road vehicles) and will offer a market that is nearly as large. He believes that in the short term, they will also deliver far greater value and, perhaps, benefit from fewer regulatory constraints.
Ultimately, Tata Elxsi is positioning itself for the inevitable disruption. Apart from the autonomous vehicle platform that OEMs would want, the company believes there is equally enormous value in the expertise it will build across all the new technologies and skills needed to work on such products.
“We are neither trying to be a Google nor a BMW, but somewhere in between,” Pai says. Although he means that their focus is to leverage platforms and work with OEMs to bring self-driving capabilities, it also sounds like a good place to be in, philosophically speaking: Between the techno-utopia of Silicon Valley and the conservative scepticism of old automotive.