AI

Hype cycles are interesting. With each, my communication channels become flooded with articles, conference invites, and sales pitches. It's wild how many novices become experts overnight.

In the distribution space, the "Digital Transformation" hype cycle has graduated to "Artificial Intelligence." Broad terms are a prerequisite of hype cycles; no legitimate one is without them. Their lack of specificity is leveraged against the psychological constant of scarcity bias.

To be clear, I don't dismiss hype cycles. On the contrary, I seek to understand what the hype is about. That process involves disregarding placeholder terms and acquiring enough depth of knowledge to apply the underlying technology. In short: less talking, more building.

What we call AI today primarily encompasses a text-based natural language technology built on the Transformer architecture, first published in 2017 by researchers at Google Brain. Like every software system before it, "AI" is a product of inputs and outputs.

The Transformer architecture is fundamental to designing and operating Large Language Models (LLMs). It was popularized to the masses by ChatGPT, but developers have been using the technology for years. My first experience came during the beta of GitHub's Copilot in early 2021.

So, how do LLMs work? A product like OpenAI's ChatGPT or Google's Bard (both client interfaces) accepts text input. The input text is passed to a model, where it undergoes "tokenization." In English, each token consists of roughly four characters. These tokens are processed based on patterns and relationships the model learned during training. As the response is crafted, prior parts of the conversation are referenced: outputs become recursive inputs, which is what makes the models context-aware.
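For the builders reading this, here is a minimal sketch of tokenization using OpenAI's open-source tiktoken library. The example sentence is invented for illustration; the point is simply to see text become the integer tokens a model actually processes.

```python
# Minimal tokenization sketch using OpenAI's open-source tiktoken library
# (install with `pip install tiktoken`).
import tiktoken

# cl100k_base is the encoding used by recent OpenAI chat models.
enc = tiktoken.get_encoding("cl100k_base")

text = "Distributors can use AI to streamline quoting and sourcing."
token_ids = enc.encode(text)

print(token_ids)  # the integer IDs the model processes
print(f"{len(text)} characters -> {len(token_ids)} tokens")
# In English, the ratio works out to roughly four characters per token.
```

Run it on a few sentences and the four-characters-per-token rule of thumb shows up quickly.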

While the coherent outputs of one-shot or sequential conversational inputs can appear magical, they are certainly not. Large Language Models operate on probabilities, not certainties. What has changed considerably with this technology is the degree of sophistication between inputs and outputs.
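A toy example makes the "probabilities, not certainties" point concrete. The numbers below are made up; a real model assigns a probability to every token in its vocabulary and samples from that distribution to pick the next one.

```python
import random

# Hypothetical next-token probabilities after the prompt
# "The shipment will arrive on" (values are invented for illustration).
next_token_probs = {
    " Monday": 0.40,
    " Tuesday": 0.25,
    " time": 0.20,
    " schedule": 0.15,
}

# The model samples rather than always taking the single "best" answer,
# so the same prompt can produce different continuations.
tokens, weights = zip(*next_token_probs.items())
print(random.choices(tokens, weights=weights, k=1)[0])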

An appropriate way to explain the software side of AI to the business side is through an analogy. Imagine you are a carpenter who's worked your entire life using a hand-cranked drill. Then, someone hands you a power drill (a brushless M18 😉). Those hand cranks across every pilot hole and screw are replaced with a trigger pull. While the carpenter's craft remains the same, the tool drives radical efficiency.

Unlike many previous hype cycles in the tech space, the hype surrounding the current state of AI is legitimate. Nvidia's guidance on its Q2 earnings call is evidence. But the hype is still hype. The application of technology to solve problems is what matters, and we are very early in the lifecycle of how AI will transform industries.
