“If you’re the head of risk and you let a $60m loss go by, then a $200m loss, and you don’t ask what the hell is happening here, what are you doing?”
"At Credit Suisse Group AG, executives had given the point salesman to Archegos Capital Management on its swaps desk the new responsibility of instead overseeing risk-taking in the broader prime-brokerage unit"
Two very interesting articles filled with unusually specific details about Israeli spying on Hezbollah, printed in separate papers the same day. The why here is as interesting as the what.
The FT in particular goes into greater detail, sourced from Israeli officials.
Hezbollah’s involvement in Syria exposed them in a way they hadn’t been exposed before, particularly their interactions with Syrian and Russian intelligence services.
"the street is modeling $167B in cumulative AI capex, which is enough to support over 12,000 ChatGPTs.
We think one of the big players may blink and cut back the capex plans, but not likely until we get well into 2025 or beyond"
"Based on these estimates, Google is assuming around 180T AI text queries (both input and output) and 15T AI image queries. This is a staggering figure, as there are around 11T web search queries per year right now worldwide. Stated differently, Google’s AI capex assumes a market that is 15x-20x larger than the web search market by 2026"
"Based on the 2026 consensus AI inference capex above, we estimate that the industry is assumed to produce upwards of 1 quadrillion AI queries in 2026. This would result in over 12,000 ChatGPT-sized products to justify this level of spending, illustrated below."
Reading the Nadella/Scott interviews and the JPM transcript of Alysa Taylor, who heads Commercial Cloud GTM, you get insights on 3 key topics:
1) frontier models vs. models-as-a-service; 2) confidence in demand relative to capex spend; 3) MSFT's attempts at differentiation
Asked about vertical integration in AI, Nadella says "I'm more of the believer in the horizontal specialization".
More importantly, he goes on: "So I think any enterprise application, really what they're most excited about is models-as-a-service".
Mark Murphy asks: "re: foundation models do you expect we're going to see some convergence in capabilities or do you suspect... we're going to see sustained performance differential"
Alysa Taylor, heavily cribbing from AWS: "We don't believe there is one model to rule them all"
Something noticeable: on each of MSFT's peer-review slides for Apple, Amazon, and Google, they highlighted progress on proprietary chips.
And the Azure and Windows slides both list "develop custom silicon chips" as a long-term driver. We knew this was the case, but it's interesting to see them highlight other companies' successes and set goals against them.
"base of our stack, our custom silicon efforts will help us remain competitive.. Our efforts will be a mix of internal and partnership... ultimately, we will need to become a first-class provider of chipset designs, especially the most critical chips given our scale in the cloud"
The FTC request for comments on hyperscalers is interesting. You can tell a lot from each participant's response:
ORCL/NET: whining about egress fees
GOOG: whining about MSFT/ORCL leveraging on-prem biz to win cloud
MSFT: "very competitive market, even IBM"
AWS: history and tutorial on cloud
This is arguably more descriptive from AWS than anything they've said to investors about AI.
Google complaining about MSFT and ORCL and then pitching Workspaces right in the doc. ABC guys, ABC.