By Jonathan Wood, General Manager, India, Middle East, and Africa (IMEA)
There's broad acceptance that artificial intelligence (AI) is here. However, we're still figuring out how to apply it across retail businesses: in supply chains, operations, omnichannel experiences, and even in-store. While we're still in the early stages of application, one thing is certain: as we increasingly rely on machines to assist us with business-to-business and customer interactions, AI technology will only be as good as the data humans can supply.
The big question is where to begin. Perhaps the simplest way to start is by picturing the outcome you want to produce. For many of us in the retail industry, that begins with the customer.
The Customer is Already There
Between chatbots and voice assistants, consumers have already started to embrace simple interactions with machines.
It’s artificial, yes, but it hardly counts as intelligence. Whether posing basic questions or commands to a chatbot about item availability or having Alexa order a fresh box of detergent, it’s easily done on today’s platforms. However, what about the tough questions — e.g., what to wear to a business meeting or guidance on a home improvement project? The next phase of AI is all about context and intent — the what and why behind a customer’s shopping experience.
These contextual interactions will only become more prevalent. As such, the next step for these digital assistants is to take trend analysis into account, marry it up with personal preferences, and give better advice on what to buy and where to buy it. Here’s the catch — most retailers will not own the dominant AI platforms of the near future. Google, Amazon.com, eBay and Apple already have a solid head start, not to mention all the emerging startups and vendors hard at work on their own solutions. With this in mind, retailers probably shouldn't commit budget to creating their own AI platforms; those dollars will be better spent making sure the robots can find you.
Aligning Data for the New World
Recent reports have claimed that the next billion internet users will use voice and video over the good old-fashioned keyboard. We're moving beyond the browser, and now is the time for retailers to align product data for visual and voice search.
People perform searches differently with an AI digital assistant compared to a browser-based search engine. Rather than a string of short words, shoppers tend to speak in full sentences with AI. They ask questions, and those questions tend to focus less on a specific product and more on its intended use. People expect AI assistants to tell them what to wear to a wedding or the best way to get a stain out of their favourite shirt; they take a picture and ask the digital assistant to recreate a look with the exact items or similar products that fit within their budget. As AI-powered digital assistants get more sophisticated, they’ll use machine learning to mine more product data and provide better results in response to these new customer expectations.
How do you prepare for that kind of world? Start by cleaning up your product data, and start now. Make sure that each product has accurate and unique descriptions (i.e., metadata) to set it apart not only in text but voice and visual searches, too. The next step is to make sure peripheral efforts — e.g., blogs, online ads — are aligned with the message you want to send and link back to the bigger picture for the customer.
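To make the clean-up concrete, the audit below is a minimal sketch of what "accurate and unique descriptions" can mean in practice: it flags catalog entries whose descriptions are missing, too short to answer a spoken question, or duplicated across products. The catalog structure, field names, and thresholds here are illustrative assumptions, not a prescribed schema.

```python
def audit_catalog(products):
    """Return a dict mapping product id -> list of metadata problems."""
    issues = {}
    seen = {}  # lowercased description -> first product id that used it
    for p in products:
        pid = p.get("id", "<missing id>")
        desc = (p.get("description") or "").strip()
        problems = []
        if not desc:
            problems.append("missing description")
        elif len(desc.split()) < 5:
            # A one- or two-word description gives a voice assistant
            # almost nothing to match a full-sentence query against.
            problems.append("description too short for voice search")
        elif desc.lower() in seen:
            problems.append(f"duplicate of product {seen[desc.lower()]}")
        else:
            seen[desc.lower()] = pid
        if problems:
            issues[pid] = problems
    return issues

catalog = [
    {"id": "sku-1", "description": "Slim-fit navy blazer in breathable wool for business wear"},
    {"id": "sku-2", "description": "Slim-fit navy blazer in breathable wool for business wear"},
    {"id": "sku-3", "description": "Detergent"},
    {"id": "sku-4"},
]

for pid, problems in sorted(audit_catalog(catalog).items()):
    print(pid, "->", "; ".join(problems))
```

A real pipeline would layer richer checks on top (structured attributes, image alt text, category-specific vocabulary), but even a pass this simple surfaces the entries that text, voice, and visual search will all struggle with.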
Indeed, AI and the shift in consumer internet use will only accelerate retail’s pace of change. However, retailers don't necessarily need to build their own bots to survive. Start with the consumer in mind, get data organized, and make sure the right underlying technology is in place — business intelligence, merchandising systems, and even point-of-sale software — to show the machines your business is relevant in this new connected world.