Spent some time with a great annual read from Ben Evans on trends.
First up, a bunch of facts to internalize:
500 Y Combinator startups funded in 2023 — 50% focused on AI
Venture capital into software back to $10B/year
Right at the average from 2015-2020 (down from $40B in 2021)
Nvidia quarterly revenues 4x between March 2022 and March 2023 (and then did it again with their quarterly earnings posted this week)
Big three cloud platforms plus Meta spent $125B+ on infrastructure in 2023
Next up, a view on where this is all headed:
The big guys have two major natural advantages that they will fight to protect from every angle: tech, patents, contracts, policy, regulatory, capital, you name it.
First, massive amounts of first-party (or near first-party) data. When Meta was beaten down to $90/share last year, a friend argued to me that the company’s data alone was worth at least its market cap, regardless of how much Zuck was spending on VR, and I agree. A fraction of this data can be (and is being) used to create AI models unlike any others.
Second, a scaled infrastructure footprint. I visited a Google Data Center years ago and felt like Professor Otto Lidenbrock in Jules Verne’s Journey to the Center of the Earth. The advantage cannot be overstated. These companies run the internet, partially open and partially closed, with servers and fiber cables that offer incredible spillover value from other parts of their business and that match nicely with the compute and storage needs of AI. (Hopefully my NDA has expired by now.)
Thus, a bazillion dollars will be spent competing to be among the two or three companies that build and support the top LLM models, which eventually and probably reach parity in user value because they’re built on similar data (insight) and infrastructure (latency). The pace of “innovation” will be fast, fierce, and heavily advantaged toward incumbents.
OK, now to some other views that we are debating on the porch and as always, appreciate insight and feedback from this esteemed community:
View #1. Front Porch can’t/won’t mess with the infrastructure layer, and maybe/probably not even the data layer. We are too small and the big guys are too big, and if you want to make this bet, you should just go buy Big Tech/Nvidia (ideally 18 months ago, or even just a week ago). The application and efficiency layers are about the only places for us to reasonably play.
View #2. We are usually hesitant about companies that provide simple “process efficiency” — meaning, you can use tool X or Y to help a customer or partner save money (or worse, hypothetically save money but really save time). Why? It is especially hard to preserve an efficiency advantage over time, and even if you can, profits tend to get competed away and/or things get too customized and non-scalable.
So… how are you guys thinking about AI then?
1 — We are investing with targeted fund partners in our region (especially Atlanta, where corporations and technical universities share a unique operating model) to help us build a broader, scaled perspective on the landscape.
2 — We give extra bonus points to founders who can demonstrate that they are “AI-natives” (or are building their business with this mentality). As we hear pitches, it is becoming pretty clear when a founder is really living and learning at the AI frontier versus simply adding a few buzzwords to their presentation to check the box.
3 — Double extra bonus points for founders who quantify how their exploration of product-market fit is accelerating with the use of AI tools. Triple extra bonus points for founders who can demonstrate that their AI-native operating model is not too dependent on any single provider (hard, especially now). We have our eyes on hedging against a world where a founder’s hard work just becomes another module available after login on OpenAI, AWS, Google/Bard, etc. That risk is typically highest when the company has an extremely specific and narrow vertical focus.
Appreciate you joining us on the porch for a few — please send feedback!