Have you heard of AI? Of course, you have; it’s bigger than Taylor Swift.
After a long 2023 of nonstop AI hype, hyperbole, and hiccups, we all hope 2024 will be the year AI proves its worth. For retailers and commerce CTOs, that means better customer service chatbots, intelligent merchandising assistants, hyper-personalization, and so much more.
But it takes more than buzzwords and marketing effort to deliver commerce-centric AI that retailers and their customers will truly value. Here’s my take on what retail CTOs must focus on in commerce AI, and how fabric plans to make it easy for you.
Gartner’s year-old hype cycle already showed AI technologies cresting the dreaded “peak of inflated expectations.” However, it doesn’t take much surface scratching to see that many of these technologies and companies are skating on thin ice. Learning that models were trained on public data like Wikipedia or scraped web content doesn’t do much to build confidence among enterprise executives.
The first thing CTOs need from AI is large language models (LLMs) trained on industry-, business-, and task-specific data, not just random and unfocused information.
Luckily, we’re now seeing a wave of tailored LLMs that will improve product enrichment, image augmentation, labeling, and text generation, benefiting unique roles and personas, and even individual workers and customers.
Enterprise businesses require robust and repeatable results that both validate business hypotheses and have proper attribution. However, one challenge with AI is the inability of current neural networks to provide causality. Worse yet, AI output is sometimes difficult to understand even for those who’ve developed it!
To make the technology less of a black box, causal models provide audit trails so organizations can show why the AI delivered a certain outcome. This is especially important for generative and predictive AI applications, which will have a bigger impact on the customer experience.
Consider how a rules-based engine displays prices or discounts for a customer in a specific scenario: it’s always possible to backtrack to understand why a price was shown. That’s not possible with a deep learning algorithm, so these types of use cases are currently ill-suited for generative AI. Issues like explainability and AI hallucinations are slowing adoption for retail CTOs, as is the lack of major breakthroughs in AI specifically for commerce.
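To make that contrast concrete, here’s a minimal sketch of a rules-based discount engine that records why each price was shown. The rule names, conditions, and multipliers are hypothetical; a real engine would load rules from configuration rather than hard-coding them:

```python
from dataclasses import dataclass, field

@dataclass
class PricingResult:
    price: float
    audit_trail: list = field(default_factory=list)  # records why each adjustment fired

# Hypothetical rules: (name, condition on the customer context, price multiplier).
RULES = [
    ("loyalty_member_10pct", lambda ctx: ctx.get("loyalty_member", False), 0.90),
    ("cart_over_100_5pct",   lambda ctx: ctx.get("cart_total", 0) > 100,   0.95),
]

def price_for(base_price: float, context: dict) -> PricingResult:
    """Apply each matching rule in order, logging it to the audit trail."""
    result = PricingResult(price=base_price)
    for name, condition, multiplier in RULES:
        if condition(context):
            result.price = round(result.price * multiplier, 2)
            result.audit_trail.append(f"{name}: applied x{multiplier}")
    return result

# Every displayed price can be traced back to the exact rules that produced it.
r = price_for(120.0, {"loyalty_member": True, "cart_total": 150})
print(r.price, r.audit_trail)
```

A deep learning model offers no equivalent of that `audit_trail` list, which is exactly the explainability gap described above.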
Another challenge for retail CTOs is the separation of front ends and back ends, which isolates data across vendors. Consider your current stack: content management (CMS), search, personalization, cart/checkout, pricing, storefront, order management (OMS), inventory, sourcing, returns, and the list goes on. You may conceivably have a different vendor for each of these areas. That’s one of the benefits of a headless system, but it can also create miniature data silos that hamper AI learning and performance.
Sure, many enterprises consolidate information in a data lake on which AI algorithms and deep learning can run. But it then becomes increasingly critical to ensure that AI works from complete, accurate, and correctly correlated data. That creates friction for technology vendors as they try to deliver on both data needs and feature innovation. It can further hamper composability as data and APIs grow more complex and slow the development of data-rich algorithms. Today, enterprises work around this with customizations stitched together in a patchwork of difficult-to-maintain applications.
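The correlation problem is easy to sketch: each headless service returns its own partial view of a SKU, and those views must be stitched into one accurate record before any model can learn from it. The service views and field names below are hypothetical:

```python
# Hypothetical per-service views of the same SKU, as a headless stack might return them.
pim_data       = {"sku": "TEE-001", "title": "Organic Cotton Tee", "category": "apparel"}
inventory_data = {"sku": "TEE-001", "on_hand": 42, "warehouse": "PDX-1"}
pricing_data   = {"sku": "TEE-001", "list_price": 29.99, "currency": "USD"}

def consolidate(*views: dict) -> dict:
    """Merge per-service views keyed on SKU, refusing mismatched records."""
    skus = {v["sku"] for v in views}
    if len(skus) != 1:
        raise ValueError(f"views describe different SKUs: {skus}")
    merged: dict = {}
    for view in views:
        merged.update(view)  # later views fill in additional fields
    return merged

record = consolidate(pim_data, inventory_data, pricing_data)
```

Even this toy version has to decide what “accurately correlated” means (here, a strict SKU match); at enterprise scale, that decision multiplies across every vendor boundary.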
fabric is working to help retail CTOs overcome these challenges. Here are a few things we see for the very near future of commerce AI.
Technologies and terms like microservices, API-first, cloud-native, and headless (together, MACH), plus composable and others, are all great tools fit for a purpose. But on their own, they don’t solve retailers’ challenges.
We’re tackling your challenges by rethinking the commerce platform to overcome boundaries, increase speed, improve customer convenience, and more. We started with a clean slate, approaching technology customer-first, from our customers’ perspective. We’re also applying the latest technology principles, building on AWS, microservices, headless architecture, and standards like REST and OpenAPI.
We’re fortunate to be in a position to redefine digital commerce by working backward from the customer.
We see the promise of ML and deep learning to improve predictions and generative content and disrupt several parts of the commerce experience. This enables “store of the future” use cases like single-page stores and checkouts, where retailers will no longer need to enforce a product detail page (PDP), product listing page (PLP), and cart journey.
Many of today’s offerings are superficial in these areas, but we’ll soon see better recommendations, better search results, and enhanced merchandising tools based on AI-driven predictions and without the need to alter the current customer or worker experience. We are even imagining entirely new shopping experiences based on generative AI and AI chatbots.
For operators, AI will begin to enable more automation that enhances the behind-the-scenes retail tooling. This will support a “commerce at your fingertips” future where every merchandiser can utilize an army of agents to analyze data, run reports, or create SKUs, collections, and campaigns.
We want to invest in enabling the right capabilities for data processing and integrations with top LLM models to create modern commerce experiences, rather than just rebranding or tuning third-party LLMs.
A copilot-like feature within our merchandising dashboard is a good example of what this could look like.
We understand that LLM copyright issues have yet to be resolved. That’s especially worrisome for LLMs trained on dubiously obtained public data. Closed-source LLM providers could struggle in the risk-averse enterprise market, while commerce executives prefer less sexy but more usable models from Google, Microsoft, and Amazon that were trained on data obtained through consent.
With the above points in mind, fabric continues to focus on AI use cases that drive the most value to our commerce customers. Here’s what we’re currently exploring:
We’re already building native integrations with top AI models, and doing further work to fine-tune, augment, and validate these integrations. Expect to see APIs for these capabilities soon, as well as dedicated interfaces to ease development and deployment.
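As one illustration of the “validate” step, generated copy can be checked against authoritative catalog attributes before it ever reaches a shopper. This is purely a sketch with hypothetical function and attribute names, not fabric’s actual pipeline:

```python
def validate_copy(generated: str, attributes: dict) -> list:
    """Return a list of catalog attribute values the generated copy fails to mention."""
    issues = []
    text = generated.lower()
    for name, value in attributes.items():
        if str(value).lower() not in text:
            issues.append(f"missing attribute '{name}': {value}")
    return issues

# Hypothetical catalog attributes serving as the source of truth.
attrs = {"material": "organic cotton", "fit": "relaxed"}

good = "A relaxed-fit tee in soft organic cotton."
bad  = "A slim polyester tee."

print(validate_copy(good, attrs))  # no issues
print(validate_copy(bad, attrs))   # flags both attributes
```

A production validator would go further, catching contradictions and hallucinated claims rather than only missing attributes, but the principle is the same: the catalog, not the model, is the source of truth.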
The obvious next step, we think, is a sort of fabricGPT that uses obfuscated data to deliver custom experiences to your customers. Yeah, it’s going to be pretty cool. I mentioned copilot-like tools above, so expect some announcements there, too.
Of course, our customers will always have the power to train and tune data and maintain isolated models — all as a service.
We have a great team at fabric and a great set of partners. If you’re looking to solve real-world retail problems through automation and the latest advances in ML, let’s talk.
CTO @ fabric. Previously software development @ Amazon and Williams Sonoma.