The growth of artificial intelligence (AI) over the last five years has been driven by scale, and it has made huge progress. By taking strategies developed in the mid-2010s and applying ever more computing power to them, natural language processing (NLP), image understanding, speech recognition and other areas have advanced dramatically. This has led to an interesting power dynamic in the usage and distribution of AI systems, one that makes AI look a lot like the electrical grid.
In NLP, bigger is better
The current state of the art in NLP is powered by neural networks with billions of parameters, trained on terabytes upon terabytes of text. Simply holding these networks in memory requires multiple GPUs, and training them requires supercomputer clusters beyond the reach of most organizations.
One could train a smaller neural network using the same methods on significantly less text, but the performance would be much worse: so much worse that it becomes a difference in kind rather than a difference in degree. Tasks such as entity extraction, text classification and summarization are all tasks where large language models excel and small ones perform poorly.
This is a remarkable development for anyone who has worked with neural networks for over a decade. It is not obvious, from a technical standpoint, that increasing the number of parameters in a neural network would lead to such a drastic improvement in capability. Yet here we are in 2022, training neural networks nearly identical in architecture to those of 2017, but with hundreds of times more compute, and getting far better results.
This points to an exciting new dynamic in the field. State-of-the-art models are too computationally expensive for nearly any company, let alone an individual, to create or even deploy. For a company to make use of such models, it needs to use one created and hosted by someone else, similar to the way electricity is created and distributed today.
Sharing AI like it’s a metered utility
Electricity is essential for every office building, but few office buildings have the infrastructure required to generate their own power. Instead, they connect to a central power grid and pay for the electricity they use.
Many companies could reap the benefits of integrating NLP into their operations, but few have the resources or expertise to develop their own AI models. In response, a handful of companies have developed large AI models and made them accessible via an API. By offering businesses a way to “hook up” to the proverbial NLP power grid, the cost of training these large-scale state-of-the-art models is amortized across many customers, giving them access to cutting-edge technology without the cutting-edge infrastructure.
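To make the amortization point concrete, here is a toy calculation. The training cost, customer count and the assumption of an even split are illustrative placeholders, not figures from any real provider:

```python
# Toy model of utility-style cost amortization; all numbers are illustrative.
TRAINING_COST = 10_000_000  # assumed one-time cost to train a large model, in USD
CUSTOMERS = 1_000           # assumed number of customers sharing the hosted model

# If the provider spreads training cost evenly across its customer base,
# each customer's share is a tiny fraction of the total.
amortized_share = TRAINING_COST / CUSTOMERS

# Versus each customer bearing the full cost of training its own model.
solo_cost = TRAINING_COST

print(f"Amortized share per customer: ${amortized_share:,.0f}")
print(f"Go-it-alone training cost:    ${solo_cost:,.0f}")
print(f"Savings factor:               {solo_cost / amortized_share:.0f}x")
```

The same logic applies to serving costs: the GPUs that hold the model in memory are shared across every customer's requests rather than sitting idle in one company's data center.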
To give a concrete example, let’s say a company that stores legal documents wants to display a summary of each document in its possession. It could hire a few law students to read and summarize each document, or it could use a neural network. A large-scale neural network working in tandem with a law student would drastically speed up summarization. Training one from scratch, though, would cost orders of magnitude more than simply hiring more law students. But if the company had access to a state-of-the-art neural network through a network-based API, it could just hook up to the AI “power grid” and pay for the summarization it uses.
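A minimal sketch of what “hooking up” might look like in code, assuming a hypothetical hosted summarization endpoint. The URL, field names and key are placeholders for illustration, not any real provider’s API:

```python
import json
import urllib.request

# Hypothetical endpoint and credentials -- placeholders, not a real service.
API_URL = "https://api.example.com/v1/summarize"
API_KEY = "YOUR_API_KEY"

def build_summarize_request(document: str) -> urllib.request.Request:
    """Package a document into the JSON request a hosted model might expect.

    The heavy lifting (a multi-billion-parameter model) runs on the
    provider's cluster; the client only sends text and pays per call.
    """
    payload = json.dumps({"text": document, "length": "short"}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=payload,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_summarize_request("This agreement, made between the parties ...")
# urllib.request.urlopen(req) would send it; it is left unexecuted here
# because the endpoint above is a placeholder.
```

The point of the sketch is what is absent: no GPUs, no model weights, no training pipeline on the client side, just a metered request to someone else’s infrastructure.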
Taken to its extreme, this analogy has some very interesting implications. Electricity, like water and transport infrastructure, is a utility. These services are vital to our society’s functioning, and in Ontario, where I am writing this, they are maintained by crown corporations owned and regulated by the federal or provincial governments. These crown corporations are responsible not only for infrastructure and distribution but also for evaluation and quality assurance, such as water-quality testing.
Controlling the use of AI is also important
Like electricity, AI can be misused, and it has demonstrated limitations. A lot of research has examined the potential harms these models could cause, from astroturfing to the propagation of biases. Given this technology’s potential to reshape the way we work, it is important to consider how it should be regulated and governed. A number of NLP API providers have released best practices for the deployment of these models, but this is only a first step, building on previous work.
Andrew Ng famously said that “AI is the new electricity.” I believe he meant that AI will power a wave of progress and innovation, becoming crucial to the functioning of our economy, with an impact on the same scale as the introduction of electricity. While that may sound a little optimistic, the statement may be more accurate than intended: AI will be the new electricity, and that means it will require a new set of power plants to enable it.
Nick Frosst is a cofounder of Cohere.