
Deep learning has a small data problem

Deep learning has had flagship successes over the past decade in image and speech recognition, text generation, and gaming, and is now one of the leading AI paradigms.

Whatever one thinks about the prospects of deep learning achieving general intelligence in the long run (we’re unpersuaded), it is falling short when it comes to helping organizations solve their most pressing data challenges today. Leveraging small data is one such challenge. 

Small data

70% of organizations are shifting their focus from big to small data, according to Gartner. Small data comes in many shapes, sizes, and varieties. We highlight four kinds of small data that pervade organizations. 

Low volume data consists of a small number of observations and features; datasets on internal business processes are a typical example. Low volume data is ubiquitous: “For every big data set fueling an AI or advanced analytics initiative, a typical large organization may have a thousand small data sets that go unused”, Accenture Research finds.

70% of organizations are shifting their focus from big to small data, now or in the near future.

Gartner


Low velocity data is updated infrequently. For instance, in real estate investment — one area in which causaLens is making an impact — time-series data often has quarterly or annual time steps. Similarly, GDP, inflation and other kinds of macroeconomic forecasting typically involve low velocity data.

Data that’s high in volume and velocity can also pose a small data problem. One issue is that valuable targets may be extremely rare in big data, so AI systems need to learn to identify them from just a few examples (“few-shot learning”). Take IoT sensor data used for anomaly detection in manufacturing, where fewer than three or four parts per million are defective.
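To make that rarity concrete, here is a minimal sketch (the production volume and defect rate below are illustrative, not figures from a real manufacturing line) of why a naive model looks deceptively accurate while detecting nothing:

```python
# Illustrative numbers only: ~3 defective parts per million, as in the text.
n_parts = 10_000_000              # simulated production volume
defect_rate = 3 / 1_000_000       # ~3 defects per million parts

n_defects = round(n_parts * defect_rate)
print(f"Defective parts in the dataset: {n_defects}")            # ~30 positives

# A model that always predicts "no defect" is almost always right,
# and completely useless for anomaly detection.
naive_accuracy = 1 - defect_rate
print(f"Accuracy of always predicting 'no defect': {naive_accuracy:.5%}")

# The detector must therefore learn what a defect looks like from only a
# handful of positive examples: the essence of the few-shot problem.
```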

Another issue is that big data can be made obsolete by a rapid distribution shift. Think of the disruption wrought by COVID-19, which led to standard analytics models breaking down. Businesses need to improvise ways of working with unprecedented data in the aftermath of crises, when historical datasets are useless. 

Deep learning fails for small data

In deep learning, many layers of artificial neurons are stacked on top of each other, with increasingly complex aspects of the data learned at each layer.

Deep learning tends to perform well only when there is very big data, and ideally only when the world does not change (as in a computer game).

However, it is bad at learning from small data in the real world. This is a consequence of the multi-layered architecture: deep learning models have huge numbers of parameters (often millions, sometimes billions) whose values are radically underdetermined by small data — it’s analogous to trying to solve a system of simultaneous equations with far more unknowns than equations.
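That underdetermination can be made concrete with a few lines of NumPy. The toy linear system below stands in for an over-parameterized model: with more unknowns than observations, wildly different parameter settings fit the data equally well, so the data alone cannot pin the model down.

```python
import numpy as np

rng = np.random.default_rng(0)

# More unknown "parameters" than "observations": 3 equations, 6 unknowns.
A = rng.normal(size=(3, 6))   # each row is one observation's equation
b = rng.normal(size=3)        # the observed outcomes

# One parameter vector that fits the data exactly (the minimum-norm solution).
x1, *_ = np.linalg.lstsq(A, b, rcond=None)

# A second, very different parameter vector that also fits exactly, obtained by
# moving along the null space of A, i.e. directions the data simply cannot see.
_, _, vt = np.linalg.svd(A)
null_direction = vt[-1]                     # A @ null_direction is ~0
x2 = x1 + 100.0 * null_direction

print(np.allclose(A @ x1, b), np.allclose(A @ x2, b))   # True True: both fit
print(np.linalg.norm(x1 - x2))                          # ~100: yet they differ hugely
```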

The performance of today’s best AI systems tends to take a hit when they go from the lab to the field.

Yoshua Bengio, Geoffrey Hinton and Yann LeCun, the “Godfathers of Deep Learning”

Deep learning’s small data problem is one reason why just 16% of organizations have successfully embedded these algorithms in their standard business processes, and even fewer are realizing meaningful returns from their investments in machine learning.

“The performance of today’s best AI systems tends to take a hit when they go from the lab to the field”, as Bengio, Hinton and LeCun, three leading deep learning researchers, recently acknowledged.

Causal AI for small data

Causal AI is the next giant leap in AI: it is the only technology that can reason about the world the way humans do. Humans can learn from extremely small datasets, often from just one example, and Causal AI can do the same.

More technically, Causal AI models can be learned from very few data points by leveraging causal discovery algorithms — a novel class of algorithms designed to pick out the critical structure of a complex world from very limited observations, much as humans do.
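As a rough illustration of the constraint-based flavour of causal discovery (a toy conditional-independence check on synthetic data, not causaLens’ own algorithms): with only 100 samples from a chain X → Y → Z, the pattern of correlations and partial correlations already rules some causal structures out.

```python
import numpy as np

rng = np.random.default_rng(0)

# Only 100 samples from a known causal chain X -> Y -> Z.
n = 100
x = rng.normal(size=n)
y = 0.8 * x + rng.normal(scale=0.5, size=n)
z = 0.8 * y + rng.normal(scale=0.5, size=n)

def partial_corr(a, b, given):
    """Correlation between a and b after regressing 'given' out of both."""
    resid_a = a - np.polyval(np.polyfit(given, a, 1), given)
    resid_b = b - np.polyval(np.polyfit(given, b, 1), given)
    return np.corrcoef(resid_a, resid_b)[0, 1]

# X and Z are clearly correlated...
print(f"corr(X, Z)     = {np.corrcoef(x, z)[0, 1]:+.2f}")
# ...but approximately independent once Y is accounted for. Constraint-based
# discovery algorithms use exactly this kind of conditional-independence
# signature to decide which causal graphs are compatible with the data.
print(f"corr(X, Z | Y) = {partial_corr(x, z, y):+.2f}")
```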

Causal AI also enables humans to share their insights and background knowledge with the AI, essentially providing data when and where it isn’t available. 

Equipped with a causal model of the business environment, Causal AI can think outside of the small data it’s given and imagine novel scenarios not present in historical data. 
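As a minimal sketch of what that looks like in practice (the variables, graph, and equations below are invented purely for illustration, not taken from causaLens’ models): once an expert-supplied causal graph links price, demand, and revenue, the model can simulate an intervention that the historical data never contained.

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy structural causal model whose graph (price -> demand -> revenue) is
# supplied by a domain expert rather than learned purely from data.
def simulate(n, price=None):
    # do(price = value): override the usual price mechanism with an intervention.
    p = np.full(n, price) if price is not None else rng.uniform(8, 12, size=n)
    demand = 100 - 5 * p + rng.normal(scale=2, size=n)
    revenue = p * demand
    return revenue.mean()

# "Historical data": prices only ever ranged between 8 and 12.
baseline_revenue = simulate(10_000)

# A novel scenario absent from the historical data: do(price = 15).
intervention_revenue = simulate(10_000, price=15.0)

print(f"Average revenue under observed prices: {baseline_revenue:,.0f}")
print(f"Average revenue if price is set to 15: {intervention_revenue:,.0f}")
```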


Causal AI is making a significant impact on businesses today, and its ability to learn from small data is one reason why.