
Active learning is the future of generative AI: Here’s how to leverage it

Image Credits: Andriy Onufriyenko / Getty Images

Eric Landau

Contributor

Before Eric Landau co-founded Encord, he spent nearly a decade at DRW, where he was lead quantitative researcher on a global equity delta one desk and put thousands of models into production. He holds an S.M. in Applied Physics from Harvard University, an M.S. in Electrical Engineering and a B.S. in Physics from Stanford University.

During the past six months, we have witnessed some incredible developments in AI. The release of Stable Diffusion forever changed the art world, and ChatGPT shook up the internet with its ability to write songs, mimic research papers and provide thorough and seemingly intelligent answers to commonly Googled questions.

These advancements in generative AI offer further evidence that we’re on the precipice of an AI revolution.

However, most of these generative AI models are foundation models: high-capacity, self-supervised learning systems that train on vast amounts of data and cost millions of dollars in processing power to do so. Currently, only well-funded institutions with access to massive amounts of GPU power are capable of building these models.

The majority of companies developing the application-layer AI that’s driving the widespread adoption of the technology still rely on supervised learning, using large swaths of labeled training data. Despite the impressive feats of foundation models, we’re still in the early days of the AI revolution and numerous bottlenecks are holding back the proliferation of application-layer AI.

Downstream of the well-known data labeling problem exist additional data bottlenecks that will hinder the development of later-stage AI and its deployment to production environments.

These problems are why, despite the early promise and floods of investment, technologies like self-driving cars have been perpetually "just one year away" since 2014.

These exciting proof-of-concept models perform well on benchmark datasets in research environments, but they struggle to make accurate predictions when released in the real world. A major problem is that the technology struggles to meet the higher performance threshold required in high-stakes production environments and fails to hit important benchmarks for robustness, reliability and maintainability.

For instance, these models often can’t handle outliers and edge cases, so self-driving cars mistake reflections of bicycles for the bicycles themselves. They aren’t reliable or robust, so a robot barista makes a perfect cappuccino two out of every five times but spills the cup the other three.

As a result, the AI production gap, the gap between “that’s neat” and “that’s useful,” has been much larger and more formidable than ML engineers first anticipated.

Fortunately, as more and more ML engineers have embraced a data-centric approach to AI development, the implementation of active learning strategies has been on the rise. The most sophisticated companies will leverage this technology to leapfrog the AI production gap and build models capable of running in the wild more quickly.

What is active learning?

Active learning makes training a supervised model an iterative process. The model trains on an initial subset of labeled data from a large dataset. Then, it tries to make predictions on the rest of the unlabeled data based on what it has learned. ML engineers evaluate how certain the model is in its predictions and, using a variety of acquisition functions, quantify the performance benefit of annotating each of the unlabeled samples.

By expressing uncertainty in its predictions, the model is deciding for itself what additional data will be most useful for its training. In doing so, it asks annotators to provide more examples of only that specific type of data so that it can train more intensively on that subset during its next round of training. Think of it like quizzing a student to figure out where their knowledge gaps are. Once you know which problems they are getting wrong, you can provide them with textbooks, presentations and other materials so they can target their learning to better understand that particular aspect of the subject.

With active learning, training a model moves from being a linear process to a circular one with a strong feedback loop.
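To make the loop concrete, here is a minimal sketch of a single active learning round using entropy-based uncertainty sampling. Everything here is a stand-in rather than any particular vendor's pipeline: the data pools and labeling budget are hypothetical, and any classifier that exposes class probabilities (scikit-learn's LogisticRegression is used only as a convenient example) would work. A production acquisition function could just as easily be margin sampling, ensemble disagreement or an expected-model-change estimate.

```python
# Sketch of one active learning round with uncertainty (entropy) sampling.
# The model, data pools and budget below are illustrative placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

def entropy_acquisition(probs: np.ndarray) -> np.ndarray:
    """Score each unlabeled sample by predictive entropy (higher = more uncertain)."""
    eps = 1e-12
    return -np.sum(probs * np.log(probs + eps), axis=1)

def active_learning_round(model, X_labeled, y_labeled, X_unlabeled, budget=100):
    # 1. Train on the currently labeled subset.
    model.fit(X_labeled, y_labeled)
    # 2. Predict on the unlabeled pool and score how uncertain each prediction is.
    probs = model.predict_proba(X_unlabeled)
    scores = entropy_acquisition(probs)
    # 3. Send the `budget` most uncertain samples to annotators for labeling.
    query_idx = np.argsort(scores)[-budget:]
    return query_idx  # indices of samples to label before the next round

# Example usage with synthetic data standing in for a real dataset:
rng = np.random.default_rng(0)
X_lab, y_lab = rng.normal(size=(200, 16)), rng.integers(0, 2, 200)
X_unlab = rng.normal(size=(5000, 16))
to_label = active_learning_round(LogisticRegression(max_iter=1000), X_lab, y_lab, X_unlab)
```

Each pass through this loop shrinks the unlabeled pool where the model is weakest, which is what turns training into the circular, feedback-driven process described above.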

Why sophisticated companies should be ready to leverage active learning

Active learning is fundamental for closing the prototype-production gap and increasing model reliability.

It’s a common mistake to think of AI systems as static pieces of software; these systems must constantly learn and evolve. If they don’t, they make the same mistakes repeatedly, or, when they’re released in the wild, they encounter new scenarios, make new mistakes and don’t have an opportunity to learn from them. They need the ability to learn over time, making corrections based on previous mistakes as a human would. Otherwise, models will have reliability and robustness issues, and AI systems will not work in perpetuity.

Most companies using deep learning to solve real-world problems will need to incorporate active learning into their stack. If they don’t, they’ll lag their competitors. Their models won’t respond to or learn from the shifting landscape of possible scenarios.

However, incorporating active learning is easier said than done. For years, a lack of tooling and infrastructure made active learning difficult to implement. Out of necessity, companies that began taking steps to improve their models’ performance with respect to the data have had to take a Frankenstein approach, cobbling together external tools and building tools in-house.

As a result, they don’t have an integrated, comprehensive system for model training. Instead, they have modular block-like processes that can’t talk to each other. They need a flexible system made up of decomposable components in which the processes communicate with one another as they go along the pipeline and create an iterative feedback loop.

The best ways to leverage active learning

Some companies, however, have implemented active learning to great effect, and we can learn from them. Companies that have yet to put active learning in place can also do a few things to prepare for and make the most of this methodology.

The gold standard for active learning is a stack built as a fully iterative pipeline. Every component is run with respect to optimizing the performance of the downstream model: Data selection, annotation, review, training and validation are done with an integrated logic rather than as disconnected units.

Counterintuitively, the best systems also have the most human interaction. They fully embrace the human-in-the-loop nature of iterative model improvement by opening up entry points for human supervision within each subprocess while also maintaining optionality for completely automated flows when things are working.

The most sophisticated companies therefore have stacks that are iterative, granular, inspectable, automatable and coherent.
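As a rough sketch of what such a stack can look like, the code below wires the stages into one loop that shares state and exposes an optional human review hook at each iteration. The interfaces are assumptions for illustration (PipelineState, the hook signatures and stage names are hypothetical, not any specific vendor's API): plugging in a review callable gives the human-in-the-loop entry point, while passing None falls back to a fully automated flow.

```python
# Illustrative sketch of an iterative, human-in-the-loop training data pipeline.
# All names and interfaces here are hypothetical placeholders.
from dataclasses import dataclass, field
from typing import Callable, Optional

@dataclass
class PipelineState:
    labeled: list = field(default_factory=list)    # samples with confirmed labels
    unlabeled: list = field(default_factory=list)  # remaining pool to query from
    metrics: dict = field(default_factory=dict)    # validation results from the last round

def run_iteration(
    state: PipelineState,
    select: Callable[[PipelineState], list],       # data selection, e.g. uncertainty sampling
    annotate: Callable[[list], list],              # labeling, human or model-assisted
    review: Optional[Callable[[list], list]],      # human review hook; None = fully automated
    train_and_validate: Callable[[PipelineState], dict],
) -> PipelineState:
    queried = select(state)                        # pick the data the model needs most
    labels = annotate(queried)
    if review is not None:                         # human-in-the-loop entry point
        labels = review(labels)
    state.labeled.extend(labels)
    state.unlabeled = [x for x in state.unlabeled if x not in queried]
    state.metrics = train_and_validate(state)      # feedback that drives the next selection
    return state
```

The design point is that every stage reads from and writes to the same state, so selection in the next round can react to the validation results of this one, which is what makes the stack iterative, inspectable and automatable rather than a chain of disconnected tools.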

Companies seeking to build neural networks that take advantage of active learning should build their stacks with the future in mind. These ML teams should project the types of problems they’ll have and understand the issues they’re likely to encounter when attempting to run their models in the wild. What edge cases will they encounter? In what unreasonable way is the model likely to behave?

If ML teams don’t think through these scenarios, models will inevitably make mistakes in ways a human never would. Those errors can be quite embarrassing for companies and should be penalized heavily because they’re so misaligned with human behavior and intuition.

Fortunately, for companies just entering the game, there’s now plenty of know-how to be gained from companies that have broken through the production barrier. With more and more companies putting models into production, ML teams can more easily anticipate future problems by studying their predecessors, as they will likely face similar issues when moving from proof of concept to production.

Another way to troubleshoot problems before they occur is to think about what a working model looks like beyond its performance metric scores. By thinking about how that model should operate in the wild and the sorts of data and scenarios it will encounter, ML teams will better understand the kinds of issues that might arise once it’s in the production stage.

Lastly, companies should make themselves aware of the tools available to support an active learning and training data pipeline. Five or six years ago, companies had to build infrastructure internally and combine these in-house tools with imperfect external ones. Nowadays, every company should think twice before building something internally. New tooling is being developed rapidly, and it’s likely there’s already a tool that will save time and money while requiring no internal resources to maintain.

Active learning is still in its very early days. However, every month, more companies are expressing an interest in taking advantage of this methodology. The most sophisticated ones will put the infrastructure, tooling and planning in place to harness its power.
