Large language models have grabbed a lot of attention since the launch of ChatGPT, but the hype about their future potential can obscure the very real business case for using them right now.
It’s fair to say that the November 2022 launch of OpenAI’s ChatGPT large language model (LLM) artificial intelligence (AI) shook the world. Here, for the first time, was a conversational AI that not only simulated responses to questions asked of it but did so in an utterly convincing manner. In one fell swoop, OpenAI showed that so-called ‘generative’ AIs are now able to perform tasks once impossible for machines, from drawing to writing e-mails.
It’s no surprise then that it was followed by a flood of articles and videos on everything from how AI will transform our homes to how it will come for our jobs.
To some extent, the dust has now settled as cooler heads prevail. Developments in the field have not settled down, however. If anything, OpenAI and its partner Microsoft have raced forward by demonstrating ever more intelligent iterations of the underlying LLM, not to mention novel applications for its use. In fact, now that we are no longer stunned by the shock and awe of it all, it is a good time to look closely at how AI can be applied in the real world.
At its Build developer conference in May, Microsoft unveiled its plans to make AI a part of daily life. ChatGPT is being integrated not only into the Bing search engine but also into the Windows operating system itself.
Bing Chat will integrate conversational AI directly into the Edge browser. Windows Copilot, meanwhile, will work alongside similar Copilots for common business applications such as Dynamics 365, Power Platform and Microsoft 365, assisting users with complex tasks. These should be of particular interest to businesses as they allow organisations to dip a toe in the AI waters without committing to developing their own applications.
Exciting as these developments are, AI has in fact been built into Microsoft’s Azure for many years. Azure Cognitive Services, which incorporates OpenAI’s models, allows developers to leverage advanced language-processing capabilities to build applications that can both understand and generate human-like text. Azure Machine Learning, meanwhile, allows for the creation of data-centric machine learning (ML) applications.
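To make this concrete, the sketch below assembles a chat-completion request in the shape Azure OpenAI’s REST endpoint expects. The resource name, deployment name and API version are placeholder assumptions (substitute your own from the Azure portal), and the request is built but not sent, since a live call needs credentials.

```python
import json

# Hypothetical values -- substitute your own Azure resource and deployment.
RESOURCE = "my-azure-resource"   # name of your Azure OpenAI resource
DEPLOYMENT = "gpt-35-turbo"      # name you gave the model deployment
API_VERSION = "2023-05-15"       # assumed API version; check the Azure docs

def build_chat_request(user_question: str) -> tuple[str, dict]:
    """Build the URL and JSON body for an Azure OpenAI chat completion."""
    url = (
        f"https://{RESOURCE}.openai.azure.com/openai/deployments/"
        f"{DEPLOYMENT}/chat/completions?api-version={API_VERSION}"
    )
    body = {
        "messages": [
            {"role": "system", "content": "You are a helpful business assistant."},
            {"role": "user", "content": user_question},
        ],
        "temperature": 0.2,  # low temperature for more predictable answers
    }
    return url, body

url, body = build_chat_request("Summarise our Q3 sales policy.")
print(url)
print(json.dumps(body, indent=2))
```

In a real application the body would be POSTed with an `api-key` header from the Azure portal; that step is omitted here because it requires live credentials.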
However, some other news from the Build conference is also relevant: Microsoft demonstrated Azure OpenAI Service, which brings the power of OpenAI’s generative AI to your data.
Your business, your data, your intelligence
ChatGPT and its successors are applications built on top of OpenAI’s large language model and have access to data provided to them by OpenAI’s engineers. Fascinating as the results have been, where the rubber really hits the road as far as business is concerned is applying the LLM to your own data and building custom applications on top of that.
Chatbots with access to your customer data and an understanding of not only what those customers want, but the tone your company wants to adopt in its communications, are only the most obvious potential application of LLMs in business. It is a powerful one, though. Chatbots have long been used as a cost-saving measure, reducing the load on contact centres, but an LLM with access to your data can handle ever more complex queries.
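The combination described above, a house tone of voice plus customer context, is typically expressed as the message list handed to the model. The sketch below shows one way that might look; the company, tone text and customer record are invented for illustration, not taken from any real system.

```python
# A minimal sketch of grounding a chatbot prompt in customer data and a
# house tone of voice. The customer record and tone text are invented
# examples, not part of any real API.

HOUSE_TONE = (
    "You are a support assistant for Contoso Ltd. "
    "Be concise, friendly and never promise refunds without approval."
)

def build_grounded_messages(customer: dict, question: str) -> list[dict]:
    """Combine tone instructions, customer context and the live question."""
    context = (
        f"Customer name: {customer['name']}. "
        f"Plan: {customer['plan']}. Open tickets: {customer['open_tickets']}."
    )
    return [
        {"role": "system", "content": HOUSE_TONE},
        {"role": "system", "content": context},  # grounding data for this customer
        {"role": "user", "content": question},
    ]

messages = build_grounded_messages(
    {"name": "A. Customer", "plan": "Premium", "open_tickets": 2},
    "Why was I billed twice this month?",
)
for m in messages:
    print(m["role"], "->", m["content"][:60])
```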
How far LLMs can go is a question that will only be answered as applications are developed, but speaking at Build, Microsoft’s executive chairman and CEO Satya Nadella gave some indication of their vast potential scope.
“Even Azure OpenAI API customers are all new. And the workload conversations, whether it’s B2C conversations in financial services or drug discovery […] these are all new workloads that we really were not in the game in the past, whereas we now are,” he said.
These are exciting possibilities, but there is a strong case for setting LLM AIs to work today. Over time, OpenAI’s LLM in Azure will no doubt be at the heart of all manner of interesting developments. Right now, however, it can already unlock value for the business by allowing users, in effect, to chat with documents, and because it runs as a private instance, it remains GDPR-compliant in doing so.
It is also easy to get up and running. Unlike previous generations and other kinds of AI, it does not need to be trained on your data. Instead, you use Azure’s semantic search capabilities, known as Cognitive Search, to index your document repository; the model can then be instructed to compare documents, find similar ones, and extract and summarise information from them.
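The search-then-prompt pattern just described can be sketched end to end. In the illustrative example below, a trivial keyword-overlap scorer stands in for Cognitive Search (which does the real semantic ranking over your index), and the final step shows how retrieved passages would be pasted into a prompt for the model; the document names and text are invented.

```python
# Illustrative retrieve-then-prompt loop. A trivial keyword-overlap score
# stands in for Azure Cognitive Search's semantic ranking.

DOCUMENTS = {
    "travel-policy.txt": "Employees may book economy flights for trips under six hours.",
    "expense-policy.txt": "Meal expenses are reimbursed up to 40 euros per day with receipts.",
    "security-policy.txt": "Laptops must use full-disk encryption and a company VPN.",
}

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k document names sharing the most words with the query."""
    q = set(query.lower().split())
    scored = sorted(
        DOCUMENTS,
        key=lambda name: len(q & set(DOCUMENTS[name].lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str) -> str:
    """Paste the retrieved passages above the question, ready for the LLM."""
    passages = "\n".join(f"[{n}] {DOCUMENTS[n]}" for n in retrieve(query))
    return f"Answer using only these documents:\n{passages}\n\nQuestion: {query}"

print(build_prompt("What is the meal expense limit per day?"))
```

Because the model answers only from the passages supplied in the prompt, the same loop supports comparing and summarising documents simply by changing the instruction text.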
In other words, just because the possibilities opened by AI are endless doesn’t mean its use should be relegated to only the most imaginative applications. As the old saying goes: a journey of a thousand miles begins with a single step. Today, thanks to AI, even that single step can be quite the leap forward.