Over the last year, there has been a flurry of activity applying Generative AI to all areas of business. Many companies have been rapidly exploring the space, eager to understand how to modernize their operations, increase revenue, or reduce costs to improve gross margin. These efforts are usually initiated by C-suite executives, often driven by FOMO – such is the perceived disruptive potential of AI today.
Yellow Radio has had the privilege since 2022 of working with many of the world’s leading companies, particularly in Financial Services, as they explore the potential of Gen AI. We have seen trials, PoCs and pilots across a number of use cases, such as:
- Using Conversational Interfaces to transform end user interactions, for example chatbots handling sales enquiries and customer support
- Question Answering (QA) to unlock access to institutional knowledge, often combining internal and external knowledge search
- Augmenting, automating and accelerating software development with AI copilots
- Automating inside sales functions to provide SDR/BDR capabilities, able to research potential leads and construct relevant messaging and engagement
- CPD (continuing professional development) and learning content production, providing tailored learning material that accelerates career progression and aids staff retention
- Talent acquisition, able to find better candidates globally and screen more effectively
- Legal contract management, to classify, validate, and automate the processing of contracts and claims
- PR and marketing material production, able to automate generation of social media posts, and accelerate production of long form content
These are just the use cases we have worked on; there are many more.
But the question that is often missed is why. If a use case is the what, then the pain point is the why. AI is being wielded like the mother of all shiny hammers, and not everything is really a nail. It is essential to start with a review of core processes, the unique value proposition (UVP), competitive differentiators, and business goals to find the real pain points. If, for example, our core business is an online travel brokerage and we are losing market share, then perhaps we should first check whether our supplier network has become uncompetitive or out of date before we rush to AI-enable our user-facing sales platform interfaces.
The question that is always missed, however, is should we? Without exception, every technology-focused event, conference, or webinar that we have attended focuses on what AI can do and how to use it, but never pauses to consider whether there might be unintended negative consequences for our business, our staff, our users, or wider society. Governance and ESG considerations are often treated as an unwelcome overhead that gets in the way of progress and, frankly, is simply less exciting than using cutting-edge technology to disrupt the way we work.
Even when we have assured ourselves that there is a genuine use case for AI in our business, and that the benefits outweigh the risks, there is still a range of ways in which AI can be applied. Some points to consider:
- Broadly, there are different approaches for cross-enterprise applications, departmental functional applications, and productivity applications at an individual level.
- Using a GPT model, either directly via the OpenAI API or via MS Azure (perhaps as a CustomGPT), is just one of many ways of accessing GenAI capabilities; a minimal example of this route is sketched after this list. There are many platforms that can help evaluate the bewildering range of available models, which changes every week.
- There are choices for building with foundation models and using techniques such as RAG (Retrieval Augmented Generation) to improve performance (accuracy) and minimize hallucination (a sketch of the RAG pattern also follows this list).
- There are continuing improvements in fine-tuning and training small language models, which may represent a ‘better’ option (faster, cheaper, easier to deploy, and offering more control) than using a foundation LLM, even with RAG.
- Many traditional Natural Language Processing (NLP) use cases do not actually need a generative, i.e. LLM, capability. An older extractive model, such as one from the BERT family (open sourced by Google in 2018 and built on the Transformer architecture that underpins today’s LLMs), may work just fine, or even better than an LLM; see the extractive example after this list.
- A point solution, i.e. a COTS (commercial off-the-shelf) application that addresses a specific pain point, may be a perfect fit, delivering the benefits of AI without the need to build anything custom.
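To make the hosted-API route concrete, here is a minimal sketch of calling a GPT model through the OpenAI Python SDK; the model name, prompts, and customer-support framing are illustrative assumptions, and an Azure OpenAI deployment would follow the same chat-completions pattern with different endpoint credentials.

```python
# Minimal sketch (illustrative): calling a hosted GPT model via the OpenAI Python SDK.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4",  # illustrative model choice
    messages=[
        {"role": "system", "content": "You are a helpful customer support assistant."},
        {"role": "user", "content": "What is your refund policy for cancelled bookings?"},
    ],
)
print(response.choices[0].message.content)
```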
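The RAG pattern mentioned above is, at its core, straightforward: retrieve the internal documents most relevant to a question, then ask the model to answer using only that retrieved context, which is what keeps it grounded. The sketch below illustrates the idea with the OpenAI embeddings and chat APIs; the sample documents, model names, and question are assumptions made purely for illustration.

```python
# Minimal RAG sketch (illustrative): rank documents by embedding similarity,
# then ground the model's answer in the top-ranked context.
import numpy as np
from openai import OpenAI

client = OpenAI()

# Stand-in for an internal knowledge base (illustrative content).
documents = [
    "Refunds are processed within 14 days of a cancelled booking.",
    "Premium support is available to enterprise customers 24/7.",
    "Contracts renew automatically unless cancelled 30 days in advance.",
]

def embed(texts):
    """Return one embedding vector per input text."""
    result = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([item.embedding for item in result.data])

doc_vectors = embed(documents)

def answer(question, top_k=2):
    # Rank documents by cosine similarity to the question embedding.
    q = embed([question])[0]
    scores = doc_vectors @ q / (np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q))
    context = "\n".join(documents[i] for i in np.argsort(scores)[::-1][:top_k])

    # Constrain the model to the retrieved context to limit hallucination.
    response = client.chat.completions.create(
        model="gpt-4",  # illustrative model choice
        messages=[
            {"role": "system", "content": f"Answer using only this context:\n{context}"},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(answer("How long do refunds take?"))
```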
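And as a counterpoint to the generative approaches, the sketch below shows an extractive question-answering model from the BERT family pulling an answer span directly out of a supplied passage via the Hugging Face transformers pipeline; the specific model and passage are illustrative choices. Models of this size typically run comfortably on a CPU, which is part of their appeal for narrow, well-defined tasks.

```python
# Extractive QA sketch (illustrative): a small BERT-family model selects the
# answer span from the passage rather than generating new text.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="distilbert-base-cased-distilled-squad",  # illustrative BERT-family QA model
)

passage = (
    "Claims submitted before the 25th of the month are settled in the same "
    "billing cycle; later claims roll over to the next cycle."
)

result = qa(question="When are late claims settled?", context=passage)
print(result["answer"], f"(score: {result['score']:.2f})")
```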
For a non-exhaustive list of AI point solutions to common pain points and use cases, get in touch – we have built up a vendor-neutral, cross-industry matrix view that we would be happy to share.
Lastly, as a ‘third way’, no-code / low-code platforms that embed AI can be an ideal deployment route: they spare the effort of building with foundation or smaller language models, while offering more flexibility than a COTS point solution. This is likely to be a growth area in 2024 and beyond.
But without an understanding of your true business pain points, perhaps there isn’t an AI use case for you yet after all.