Smaller, open-source large language models offer enterprises the chance to leverage their own data assets for innovation and competitive advantage.

For the last 30 years, the dream of collecting, managing, and making use of an organization's accumulated knowledge assets has never been truly realized. Systems for sharing information across the enterprise have grown more sophisticated, but they have not managed to take the next step of turning the information that resides in digital files into usable knowledge. Data accumulates in ever larger silos, while real knowledge still resides in employees.

The rise of large language models (LLMs), however, is starting to make true knowledge management (KM) a reality. These models can extract meaning from digital data at a scale and speed beyond the capabilities of human analysts. The 2023 State of the CIO survey reveals that 71% of CIO respondents anticipate greater involvement in business strategy over the next three years, with 85% saying they are becoming more digital- and innovation-focused. Applying LLMs to an organization's knowledge assets has the potential to accelerate these trends.

Less is More

OpenAI's ChatGPT and DALL-E 2 generative AI (GenAI) models have revolutionized how we think about AI and what it can do. From writing poems to creating images, it is staggering how a computer can create new content from a few simple prompts. However, the scale of the LLMs behind these services is vast and expensive to operate. GPT-4 was reportedly trained on over 45 terabytes of text data across more than a thousand GPUs over 34 days, at a compute cost of almost $5 million. In 2022, OpenAI lost $540 million despite having raised $11.3 billion in funding rounds. Clearly, costs and operations at this scale are beyond most organizations wanting to develop their own LLMs.
However, the AI future for many enterprises lies in building and adapting much smaller models based on their own internal data assets. Rather than relying on APIs from firms such as OpenAI, with the risk of uploading potentially sensitive data to third-party servers, new approaches are allowing firms to bring smaller LLMs in-house. Tuning model parameters, together with newer languages such as Mojo and AI programming frameworks like PyTorch, significantly reduces the compute resources and time needed to run AI workloads.

Open is Better

Just as the web is built on open-source software and protocols, many enterprise AI initiatives are likely to be built on open-source models such as LLaMA and freely available techniques such as LoRA (low-rank adaptation). According to a recently leaked Google memo, "The barrier to entry for training and experimentation has dropped from the total output of a major research organization to one person, an evening, and a beefy laptop." These barriers to entry will only fall further, and the results will improve, allowing startups and enterprises to build niche models focused on the specific needs of businesses and workflows.

From GenAI to SynthAI

Central to these developments is the move from AI systems that create new content from simple prompts to models trained on an enterprise's internal data and designed to generate usable insights and recommendations. LLMs such as ChatGPT often produce believable results, but it is not clear how the data fed into the model was used, or whether the answers it gives are true or hallucinations. The recent case of a New York lawyer who used ChatGPT to generate court filings, citing what he presumed were historical cases to support his client's claim, showed the dangers of relying on GenAI output: despite looking like genuine evidence, six of the cases cited never took place.
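To see why a technique like the LoRA mentioned above makes in-house adaptation affordable, consider a back-of-the-envelope sketch. This is illustrative only, written in plain NumPy rather than any particular training framework: instead of updating every weight in a large matrix W, LoRA freezes W and learns two small low-rank factors B and A, so the number of trainable parameters drops by orders of magnitude.

```python
import numpy as np

def lora_param_counts(d_in, d_out, rank):
    """Trainable parameters: full fine-tuning vs. a rank-r LoRA update."""
    full = d_in * d_out           # every weight in the matrix
    lora = rank * (d_in + d_out)  # two small factors: B (d_out x r) and A (r x d_in)
    return full, lora

def apply_lora(W, A, B, alpha=16, rank=8):
    """Effective weight = frozen base W plus scaled low-rank update B @ A."""
    return W + (alpha / rank) * (B @ A)

rng = np.random.default_rng(0)
d, r = 512, 8
W = rng.standard_normal((d, d))          # frozen pretrained weights
A = rng.standard_normal((r, d)) * 0.01   # trainable factor
B = np.zeros((d, r))                     # trainable factor, zero-init so the update starts at 0
W_eff = apply_lora(W, A, B, alpha=16, rank=r)

full, lora = lora_param_counts(d, d, r)
print(f"full: {full:,} params, LoRA: {lora:,} params ({100 * lora / full:.1f}%)")
# → full: 262,144 params, LoRA: 8,192 params (3.1%)
```

Only A and B would be updated during training; at these (hypothetical) dimensions the trainable parameter count falls to about 3% of full fine-tuning, which is the kind of saving that puts model adaptation within reach of a single machine.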
Silicon Valley venture capital firm a16z recently outlined its belief that the future of AI in the workplace lies not necessarily with general-purpose LLMs like ChatGPT, but with more focused models designed to address specific business needs. It calls this SynthAI: models trained on proprietary data sets and optimized for discrete purposes, such as resolving customer support issues, summarizing market research results, and creating personalized marketing emails. Applying the SynthAI approach to better managing a firm's data assets is a natural evolution of this next stage in the AI revolution.

Consulting firm BCG has taken this approach with 50 years' worth of its archives, largely reports, presentations, and data collected from surveys and client engagements. Previously, employees could only search these files by keyword and then read through each document to check its relevance. Now the system provides usable answers to questions. The knowledge management dream is becoming a reality.
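The article does not describe how BCG's system is built, but the shift it describes, from keyword search to direct answers, is typically implemented with embedding-based retrieval: documents and questions are mapped to vectors, the closest documents are found by similarity rather than exact word match, and the top passages are handed to an LLM as context for answer generation. A minimal, hypothetical sketch of the retrieval step, using a toy bag-of-words vector where a real system would use learned embeddings:

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words vector; a production system would use a learned embedding model.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    # Rank documents by similarity to the query and return the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

# Hypothetical stand-ins for an internal archive.
archive = [
    "2019 client survey on retail pricing strategy",
    "market entry report for telecom operators in Asia",
    "employee onboarding checklist",
]
print(retrieve("pricing survey results", archive))
# → ['2019 client survey on retail pricing strategy']
```

In the full retrieval-augmented pattern, the returned passages would be inserted into the LLM's prompt, grounding its answer in the firm's own documents rather than in whatever the model may hallucinate.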