Gold Rush

Goldman Sachs is fully embracing AI. 

Earlier this year, at the Cisco AI Summit, David Solomon said:

“The work of drafting an S-1 (for an IPO) used to take a six person team two weeks to complete, but now it can be 95% done by AI in minutes. The last 5% now matters because the rest is commodity.”

Quite the testimonial for foundational AI companies like OpenAI.

Not so great for the junior talent on deal teams.

We’ve spent the past two months meeting with nearly a dozen founders building AI tools and platforms for the largest banks, private equity funds, and hedge funds.

Today, we break down our findings.

“internal tools”

OpenAI was founded in December 2015 by Sam Altman and Elon Musk, with involvement from Ilya Sutskever, Greg Brockman, Peter Thiel, Reid Hoffman, Jessica Livingston, and others.

A total of $1B was pledged by the founders and other institutions to fund safe artificial intelligence and machine learning research.

Only $130M was actually contributed between 2015 and 2019.

Elon Musk ended up stepping down from the board in 2018, after his attempt to take over OpenAI came up short.

Seven years after its founding, OpenAI dropped ChatGPT, which broke the internet.

The rest is history…


OpenAI is the leader in foundational artificial intelligence models, with startups like Anthropic (Claude) and Cohere (Command) commanding respect as well.

For large financial institutions, the biggest concerns around AI adoption are data privacy and pride. Yes, pride.

Let’s take KKR for example.

KKR deals with tons of sensitive information, so using ChatGPT and Claude poses an obvious risk to the firm and its clients.

KKR also has an image to uphold in front of its LPs and clients. The firm has proprietary tools for investment analysis, due diligence, and other parts of the private markets value chain. These tools are part of its brand, a marketing asset. Even if ChatGPT were theoretically low-risk from a data privacy standpoint, pride would prevent large asset managers from letting their analysts and associates use it.

So instead, they use “internal tools”… Which is ChatGPT / Claude in disguise.


The resources it would take to build artificial general intelligence from the ground up are a price too high even for Tier-1 banks and funds to pay. There's a reason OpenAI has raised nearly $60B in the past few years.

So Wall Street firms effectively white-label ChatGPT or Claude, have their engineers sprinkle some sauce on it, and call it a day. Not a bad thing per se: no need to reinvent the wheel, just throw some chrome on it.

This leads to the next topic of discussion,

“wrappers”

An AI “wrapper” is a startup or product that innovates on the application layer rather than the model layer.

  • Application Layer = the user interface through which AI models are made accessible to end users. This is the least expensive layer to build a product on.

  • Model Layer = the foundational technology of large language models. OpenAI, Anthropic, Mistral, Google, etc., play in this field. There are also opportunities to build more niche models for specific use cases, rather than chasing the (very expensive) artificial general intelligence goal that the aforementioned players are aiming for. More expensive to build on than the application layer, in both capital and time.

  • Infrastructure Layer = NVIDIA, Microsoft, Google. The chips and physical data centers that AI compute depends on. Ultra expensive; effectively inaccessible for new companies.
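In practice, an application-layer product is often just a thin shim over the model layer: a branded system prompt, some light post-processing, and a call down to a vendor's model. A minimal sketch of that idea — every name here (`DealMemoWrapper`, the prompt, the stub) is a hypothetical placeholder, not any firm's actual tooling, and the model call is injected as a plain function so no vendor SDK or API key is required:

```python
from typing import Callable

# Stand-in for a model-layer client (e.g. an OpenAI- or Anthropic-style SDK
# call). Injected as a plain function so the wrapper logic runs without a
# network call: (system_prompt, user_prompt) -> completion text.
ModelCall = Callable[[str, str], str]

class DealMemoWrapper:
    """A hypothetical application-layer product: branded prompt + post-processing."""

    SYSTEM_PROMPT = (
        "You are an analyst at a private markets firm. "
        "Summarize the provided deal materials in three bullet points."
    )

    def __init__(self, model_call: ModelCall):
        self.model_call = model_call  # the only dependency on the model layer

    def summarize(self, deal_text: str) -> str:
        raw = self.model_call(self.SYSTEM_PROMPT, deal_text)
        # "Sprinkle sauce on it": trim whitespace and stamp the firm's branding.
        return f"[Internal Tool]\n{raw.strip()}"

# Usage with a stubbed model call (a real product would pass the vendor SDK here):
stub = lambda system, user: f"- Summary of: {user}"
tool = DealMemoWrapper(stub)
print(tool.summarize("Project Falcon CIM"))
```

The point of the sketch: everything proprietary lives in a prompt and a formatting function, while the hard part (the completion itself) is rented from the model layer.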

The vast majority of AI startups are wrappers because it’s easiest to do, quite frankly.

In 1848, James Marshall found gold at Sutter’s Mill in Northern California.

By the next year, hundreds of thousands of newcomers flooded the region in search of easy money.

So many migrants arrived in 1849 that gold-seekers became known as "forty-niners".

The gold rush lasted nearly 20 years.

Tens of billions of dollars' worth of gold were unearthed during the gold rush, but only a few individuals became wealthy from it.

Most of those who participated were not better off for having done so.

The truth of the matter is, easy money does not exist.


Founders are treating AI like gold in the 1850s.

Almost every startup is now an AI startup, even if there is no real benefit from incorporating AI.

The unit economics of AI wrappers typically do not pan out favorably.

For instance, imagine two startups building the same product for hedge funds. One platform's primary functionality relies entirely on Claude's foundational model. The other startup goes through the more difficult process of building a niche, proprietary model. The second startup's compute costs will likely be 4-10x lower in the medium term.

While this hyper-specific example does not apply to every wrapper startup, it’s helpful to keep in mind when evaluating opportunities.
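As a toy illustration of that unit-economics gap (all numbers hypothetical, chosen only to show the mechanics): if both startups charge the same per-query price, but the wrapper pays a frontier-model API rate while the proprietary-model startup pays roughly 5x less (within the 4-10x range above), gross margins diverge quickly.

```python
def gross_margin(price_per_query: float, cost_per_query: float) -> float:
    """Gross margin as a fraction of revenue for a single query."""
    return (price_per_query - cost_per_query) / price_per_query

# Hypothetical numbers: both startups charge $0.05/query. The wrapper pays
# $0.02/query in API fees; the proprietary-model startup pays $0.004/query.
wrapper_margin = gross_margin(0.05, 0.02)        # 0.60 -> 60% gross margin
proprietary_margin = gross_margin(0.05, 0.004)   # 0.92 -> 92% gross margin
print(f"wrapper: {wrapper_margin:.0%}, proprietary: {proprietary_margin:.0%}")
```

Same product, same price, but one business keeps 32 more points of every dollar — which compounds hard at scale.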

A number of AI wrappers will likely prove generational in the long term, so take this with a grain of salt.

“job security”

The elephant on the Street for the past few years.

What happens to junior-level knowledge workers on Wall Street?


Wall Street is investing tons of money into replacing analysts, full stop.

  • Bridgewater launched a fund last year driven entirely by AI; its AIA Labs worked to replicate every stage of the investment process with machine learning

  • Balyasny Asset Management is in the midst of building the AI equivalent of a senior analyst, built on OpenAI's API

  • EQT built an AI engine called Motherbrain for its dealmakers to source deals more efficiently

The potential cost savings for large banks and funds would be enormous…

Nearly 25% of surveyed banking executives from major institutions like Citigroup, JP Morgan, and Goldman Sachs anticipate sizable AI-related job losses, ranging from 5-10% of their total staff.

The shift toward AI is expected to increase bank profitability as much as 12-17% by 2027.

Nonetheless, it would be quite difficult for Wall Street to get rid of analysts entirely. Analysts are a key source of internal talent for future leadership positions. Artificial intelligence is good, but good isn’t good enough when you’re dealing with institutional investors. Hallucination risk is real and costly. AI allows analysts to get work done faster, so modest headcount reduction in the future is the most likely path. As the technology gets better, business models will adapt.

“cursor”

Last week, news broke that OpenAI was buying Windsurf for a reported $3B.

Windsurf is the maker of a popular AI coding assistant.

The most popular AI coding assistant, however, is Cursor.

Apparently, OpenAI tried to buy Cursor, but failed. Cursor went on to raise money at a $10B valuation.

Cursor allows talented software engineers to seriously increase their output, reducing the time it takes to ship new product. According to a number of technical founders, Cursor is by far the best use case of artificial intelligence from a productivity standpoint. The most interesting thing I learned: AI coding assistants like Cursor are steroids for talented, experienced coders, but can be dangerous in the hands of a novice, because human judgment is still critical.

Cursor went from $1M to $100M in revenue over the course of 12 months…

Headlines

  • Columbia student suspended over cheating tool raises $5.3M to cheat on everything. TechCrunch

  • 19 new unicorns in 2025. TechCrunch

  • Nico Harrison didn't know fans' level of love for Luka. ESPN

  • The VC-AI paradox. Axios

Schedule a Call: Meet the team. Learn more about what we’re working on, give us feedback / ideas, say what’s up

Submit a Deal: Send us an interesting early stage deal to look at

Follow us on LinkedIn: Stay up to date on all things mainstreet.
