If you’re a Bengaluru resident, it’s quite likely you’ve seen mfine’s billboards plastered around town, pitching an on-demand healthcare platform that promises a doctor consultation in 60 seconds. The year-old startup, founded by Myntra co-founder Ashutosh Lawania and other members of Myntra’s leadership team, is not only mobile first, it’s also an AI first startup. Its homepage touts its AI healthcare platform, claiming that the system is learning medical standards, protocols, diagnosis and treatment methods.
“Software is going to eat the world, but AI is going to eat software.” These words by Nvidia CEO Jensen Huang ring true for most sectors these days. AI and machine learning have made significant incursions into agriculture, transport, education, and healthcare this decade, and some of these ripples can be felt in India too. Notable AI-driven healthcare startups from India include Qure.ai, which is exploring the medical imaging space; Sigtuple in pathology; CureSkin in dermatology; Niramai, which uses AI for breast cancer detection; and Tricog Health, whose cloud-connected ECG devices detect heart complications.
In India, some of mfine’s peers include Docsapp, Medimetry, and Medlife. Even aggregators such as Practo and Lybrate have started doing online consultations. But mfine, the subject of our story today, is different in the technology and tools it deploys.
Curious about the AI strategy of a B2C healthtech startup like mfine, we caught up with the team at its headquarters in Marathahalli, an east Bengaluru suburb, earlier this month.
“Ashu and I got together at the end of 2016 to think about what we can build again using technology,” says mfine CEO Prasad Kompalli, recounting how they zeroed in on the healthcare space. Prior to his four-year stint at Myntra, where he headed its commerce platform, Kompalli’s first entrepreneurial attempt was at Indus Bionics Systems, which tried to build implants in India. “I gave up on that. It didn’t go very far so I then joined Myntra,” he says.
Kompalli and Lawania decided to build something from scratch in healthcare as they saw little focus in the market on using technology to build a powerful consumer brand. They looked at international players in China, Singapore, the UK, and the US before zeroing in on mfine as the idea: “An on-demand healthcare service that can change the way healthcare is accessed,” as Kompalli says. The company started work in early 2017 and its service has been operational since December.
Mfine’s international comparables would be US-based HealthTap and One Medical, and UK-based Babylon Health, Kompalli says. “Even the big guys are entering, Google is coming from an algorithm point of view, Apple is coming from a device point of view – Apple Watch is becoming primarily a health use-case. Amazon is coming from a commercial point of view, exploring medicine delivery, lab tests, insurance. I think there’s a huge movement from a tech point of view, tech players into healthcare,” he adds.
Mfine’s mission statement is to make quality healthcare available to consumers at scale. “The current healthcare delivery system is a manual system. It completely depends on doctors and how much the patient wants to share… there is no process around it,” says Lawania. “By deploying data collection and assistive AI technologies, it doesn’t take quality time from doctors and it improves their throughput,” he adds.
The product features are built to leverage the always-on connectivity of the mobile, says Kompalli. “When you get a prescription, you’re also getting medicine reminders. You can get an interactive prescription, with videos, gifs, for an exercise plan that an orthopaedic has suggested,” he adds.
Mfine currently covers nine specialities, including paediatrics, general medicine, fertility, gynaecology, dietetics, orthopaedics, gastroenterology, and cardiology. It’s still early days for the startup, which presently has 50 doctors on its platform, over 100,000 app downloads on Google Play, and 15,000 consultations as of the end of May. It raised a $4.2 million Series A round in May and employs 20 doctors, 20 engineers, and about 10 people on the business and operations side.
“A doctor in his lifetime can only see 360,000 patients. Our system will be figuring out the probability through millions of records,” says Lawania, talking about the AI under mfine’s hood. He also stressed that their AI was only an assistive technology, which would streamline the consultation process. “The idea is not to replace the existing system but to create a support system to help existing systems. The doctor marries both and takes the judgement,” he clarifies.
The startup has partnered with sixteen hospitals in Bengaluru. Kompalli says that the number is intentionally small. “We work with premium hospitals like Cloud Nine, Aster, Ovum, PeopleTree etc. And all these hospitals basically give us their doctors onto the platforms. That ensures that certain trust is there behind the brand… and the quality is also there.”
The patient pays for a consultation through the app, and mfine shares the fee with the hospital on a per-consultation basis. The startup wasn’t willing to share the terms of its revenue-share arrangement. Consultation rates start from Rs 499 and go up to Rs 700, depending on the specialty. The app is presently offering a discounted price of Rs 299 per consultation.
Hospitals are approaching mfine as a digital OPD or a digital primary care delivery channel, says Kompalli. “From a hospital point of view, primary and secondary healthcare becomes costlier and costlier, if you only depend on humans, and it’s not scalable.”
It only takes a few minutes to start your doctor consultation after installing the app. Tap on the new consultation button and you’re presented with a list of health issues, ranging from common complaints such as fever, cough, and headache to ailments that need a specialty doctor, such as a nutritionist or an orthopaedic.
You get to choose your doctor in the next step, which lists doctors’ profiles by specialty, hospital, and years of experience. After I made the payment for the consultation, the doctor’s assistant (a chat-based interaction managed by the care team at mfine) asked me a series of questions about my ailment, along with personal details like weight, height, and age, and details on the medical condition for which I needed treatment.
For a dandruff-related problem, the chatbot, overseen by Dr Anuradha Das, asked me how long I had had dandruff, whether there was itching, and whether the dandruff scales were oily. She then asked me for a photograph of my scalp, and a few follow-up questions on allergies, past medical illnesses, and any medication I was taking. The interaction with the chatbot lasted around 18 minutes, following which a consultation with Dr Harish, a dermatologist at Vitals Clinic, was scheduled for 15 minutes later.
Dr Harish went through the details submitted over chat, and after a few minutes, he sent me a prescription, which included general advice and a diagnosis of my condition – seborrhea capitis. All in all, it took about 35 minutes to get a prescription. You can export this prescription from the app and get your medicines from a pharmacy. The day after my consultation, the app flashed notifications asking me if I’d taken my morning dose of the prescribed lotion and shampoo.
Follow-ups with the mfine care team over chat, phone and video for a week are covered in your initial payment, without any additional consultation charges.
Mfine’s CTO Ajit Narayanan gave us a walkthrough of a dashboard used by the care team, where their AI can be seen in action. The team didn’t want to share screenshots of this technology, as it is in the process of applying for patents.
The dashboard has two main panels: on the left side, one can tap on buttons related to disease symptoms. It also takes the patient’s gender and age as input. After tapping on five disease symptoms – fever, headache, vomiting, body ache, and joint pain – the system predicted the likely diseases: chikungunya and dengue fever were listed as the top probable diseases, with percentage values indicating the certainty of the diagnosis.
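To make the demo concrete – five symptoms in, a ranked list of probable diseases with percentages out – here is a toy symptom-overlap sketch. The disease-symptom sets and the Jaccard-based scoring are invented for illustration; mfine’s actual model, described below, combines neural networks and decision trees and is proprietary.

```python
# Toy sketch: rank diseases by overlap between reported symptoms and
# known symptom profiles, then normalise scores into percentages.
# The disease->symptom mapping here is invented, not medical advice.
DISEASE_SYMPTOMS = {
    "dengue fever": {"fever", "headache", "vomiting", "body ache", "joint pain", "rash"},
    "chikungunya":  {"fever", "headache", "joint pain", "body ache", "rash"},
    "common cold":  {"cough", "sore throat", "runny nose", "headache"},
}

def rank_diseases(reported):
    reported = set(reported)
    scores = {}
    for disease, symptoms in DISEASE_SYMPTOMS.items():
        # Jaccard similarity: shared symptoms / all symptoms involved
        scores[disease] = len(reported & symptoms) / len(reported | symptoms)
    total = sum(scores.values()) or 1.0
    return sorted(((d, 100 * s / total) for d, s in scores.items()),
                  key=lambda x: -x[1])

ranking = rank_diseases(["fever", "headache", "vomiting", "body ache", "joint pain"])
```

With the five symptoms from the demo, dengue fever and chikungunya come out on top, mirroring the dashboard’s behaviour, though the real system’s probabilities are learned from data rather than computed from hand-written sets.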
Narayanan gave us a walkthrough of the elements of mfine’s AI platform, which helped the system arrive at this diagnosis. Its AI play can be divided into two pillars. The data aspect of it, what the team calls their ‘Data Ingestion Platform’, is based on four corpora of data. Firstly, there’s medical literature, derived from hundreds of thousands of publicly available articles from resources like PubMed. Another data source is UMLS (Unified Medical Language System), which feeds in thousands of standard medical terms for mapping core medicine concepts. Medicine data – detailing each class of drug, its composition, dosage guidance and potential complications – is also fed into the system. And finally, there’s mfine’s own body of data, based on its previous history of diagnosis and treatments.
As for the machine learning aspect, the AI platform uses a combination of deep neural networks, disease decision trees, concept vectors (using Word2vec), and a knowledge graph based on a vocabulary of medical concepts and how they relate to one another. Deep neural networks, algorithms loosely inspired by how the human brain works, are used for pattern recognition problems, while decision trees are used for classification and regression problems. Concept vectors, using Word2vec (an NLP tool created by Google researchers), and the knowledge graph are used to compute how similar one medical concept is to another.
Neural networks and decision trees sit at opposite ends of the explainable-AI spectrum, so we asked whether there were internal debates on whether the former should be used. “We’re building with explainability at its core,” Narayanan says. “Our [neural] nets are identifying very specific parts of the inference layer. So it’s not like I give it a bunch of features and it says, ‘This is pharyngitis.’ It’s a collection of several things. In the end, there is also a decision tree that is being executed. So there, I can clearly say, I’ve seen symptom x, y, and z, and therefore I think it is this,” he says.
“With Word2vec, you can find similarity between terms. Once you build the math, you can then find an angle between concepts in a multi-dimensional space. Once you model that, a question like ‘What is similar to cough’ can be answered by the system – it starts to tell me nasal congestion, vomiting, wheezing are conceptually similar,” says Narayanan, sharing a diagram of what nasal congestion looks like in math.
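The “what is similar to cough” query Narayanan describes reduces to measuring the angle (cosine similarity) between concept vectors. A minimal sketch with invented four-dimensional vectors – real Word2vec embeddings are learned from text corpora and typically have hundreds of dimensions:

```python
import math

# Invented 4-dimensional "concept vectors" for medical terms, purely for
# illustration. In practice these would be learned by Word2vec from text.
VECTORS = {
    "cough":            [0.9, 0.8, 0.1, 0.1],
    "nasal congestion": [0.8, 0.9, 0.2, 0.1],
    "wheezing":         [0.9, 0.6, 0.2, 0.2],
    "joint pain":       [0.1, 0.1, 0.9, 0.8],
}

def cosine(a, b):
    # Cosine of the angle between two vectors: 1.0 = same direction
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def most_similar(term, k=2):
    v = VECTORS[term]
    others = [(t, cosine(v, u)) for t, u in VECTORS.items() if t != term]
    return sorted(others, key=lambda x: -x[1])[:k]

neighbours = most_similar("cough")
```

With these toy vectors, nasal congestion and wheezing rank as the concepts nearest to cough, while joint pain falls far behind – the same kind of answer Narayanan says their system produces.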
Narayanan says that mfine has built an algorithm on top of this that attempts to mimic a doctor’s behaviour. “What is similar? What is dissimilar? What have I seen together? What is the elimination set? What is the convergence set? What kind of questions can I ask more to narrow down the problem?”
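One common way to implement the “what can I ask to narrow down the problem” behaviour is a greedy question-selection heuristic: ask about the symptom that splits the remaining candidate diseases most evenly, so either answer eliminates as many candidates as possible. This is a generic sketch with invented disease data, not mfine’s unpublished algorithm:

```python
# Greedy next-question selection: prefer the symptom whose yes/no answer
# divides the candidate diseases closest to half and half.
# The candidate disease data below is invented for illustration.
CANDIDATES = {
    "dengue fever": {"fever", "rash", "joint pain", "bleeding gums"},
    "chikungunya":  {"fever", "rash", "joint pain"},
    "malaria":      {"fever", "chills", "sweating"},
    "typhoid":      {"fever", "abdominal pain", "constipation"},
}

def next_question(candidates, asked):
    # Symptoms we haven't asked about yet
    remaining = set().union(*candidates.values()) - asked

    def split_score(symptom):
        positives = sum(1 for syms in candidates.values() if symptom in syms)
        # 0 means a perfect half split; larger is a worse question
        return abs(positives - len(candidates) / 2)

    return min(remaining, key=split_score)

# All four candidates have fever, so fever is useless to ask about;
# rash or joint pain each split the candidates 2-2 and get picked.
question = next_question(CANDIDATES, asked={"fever"})
```

Production systems usually weight this with symptom prevalence and expected information gain rather than a raw count split, but the elimination intuition is the same.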
As for the future roadmap, Narayanan says the mfine AI will develop various qualities and senses. “Today, it is text. Tomorrow, it will be able to listen to voice and give voice, and it will be able to see. We’ll probably start on x-ray understanding, MRI understanding, as well.”
By the end of the year, mfine is looking to integrate devices like heart rate monitors. “Today, patient-reported data collection is happening. Eventually, the patient will just be strapping on a watch; without explicitly reporting, we can collect the data as well,” says Kompalli. By the end of next year, mfine also plans to integrate medicine delivery and at-home lab services from within the app. It also has plans to launch a Netflix-style subscription plan in the coming months, says Lawania, adding that the pricing is not final. “We would be taking care of his/her primary and secondary healthcare, working towards the health outcome,” he says. “We would be adding a lot more value with technology. Anything related to tech, we would not be charging for that. For example, adding a diabetes treatment plan or a pain management plan, we would not be charging for that,” he says.
FactorDaily reached out to a doctor for an opinion on mfine’s app, and she took the app for a spin, making a payment and going through an online consultation. She offered her opinion on the condition that she not be named. “The interface looks very nice,” the doctor says. “It was interesting to note that I said I have loose stools and it showed me doctors who were practising general medicine, but it showed me a dermatologist as well, which doesn’t make sense. I’m not sure why they did it,” she says.
She complimented the app for doing a good job of taking in patient history, and for a smooth payment process. “It assigned me a doctor assistant – it is clearly a bot, considering the rate at which the questions were sent to me. The bot asked me details on my stools: the colour, consistency etc.,” she says. Once the patient details and history were taken, there was a gap before the doctor actually came in. “However, I saw that there was a repetition of questions that the bot had asked me – loose stools since how many days, the doctor again asked me the same question. What this means is that what the bot has asked has not been assimilated by the doctor.”
If mfine can iron out some of these glitches in its process, the AI under its hood could help it ease the pain point it is trying to solve: scaling up doctor consultations.