Product-market fit is not enough anymore. You need position-market fit:
There was a time, not too long ago, when in startupland one could build a thing that solved a real problem, put a price tag on it, and see if the market wanted it:
• If it did, we would say the founder had found product-market fit
• If it didn’t, it was “back to the lab”
That time, for better or worse, is long gone.
Here’s how @0zne and I break it down:
In 2024, a utility delivered through software can’t make a dent on its own anymore. People’s heads are overstuffed with competing products, messaging, and narratives, so it’s hard for a product alone to earn a market edge.
The main exceptions are breakthrough technology or hyper-niche markets. ChatGPT, for example. But that is a rare instance where the “product” itself carries the bulk of the impact.
Most companies don't have that luxury. So, if a product alone isn’t enough, then what is?
Enter position-market-fit.
• If “product-market-fit” means that you’ve found the right kind of product that the market wants…
• “Position-market-fit” means that you’ve found the right combination of product/brand/marketing/pricing/go-to-market/sales/etc in a given domain.
The Importance of Brain Estate
The fundamental reason “position-market-fit” matters so much is that it operates at a personal, subconscious level. Our brains can only conceptualize a finite set of “characters” per domain.
Similar to the "Dunbar number" rule, which suggests we can maintain stable social relationships with up to 150 people, our brains are wired to understand only a finite number of company-market associations.
Gaining a strong positional edge, or nailing “position-market-fit,” is the exercise through which a company, with the right combination of product, brand, pricing, marketing, and go-to-market, conquers a certain portion of consumer “brain-estate.”
The Story of Startup Success
If you step back and analyze some of the best startups from the last decade, you'll see they excelled at this.
→ Are you building in an established market dominated by large incumbents with feature-bloated, slow, and clunky software? In that case, you might want to position your product as a speed-first, high-craft, premium option, similar to a luxury car company. Does Linear ring a bell?
→ Alternatively, if you're entering a highly commoditized market dominated by a few corporate-looking brands, consider positioning yourself as the quirky, fun company that doesn’t take itself too seriously. Embrace the David vs. Goliath narrative with bold, edgy marketing and design. Does Arc Browser come to mind?
Their pricing strategies are almost a second-order effect of their market positioning.
And that’s the power of position-market fit.
• • •
Cursor is raising at a $50 billion valuation on the claim that its “in-house models generate more code than almost any other LLMs in the world.” Less than 24 hours after launching Composer 2, a developer found the model ID in the API response: kimi-k2p5-rl-0317-s515-fast.
That’s Moonshot AI’s Kimi K2.5 with reinforcement learning appended. A developer named Fynn was testing Cursor’s OpenAI-compatible base URL when the identifier leaked through the response headers. Moonshot’s head of pretraining, Yulun Du, confirmed on X that the tokenizer is identical to Kimi’s and questioned Cursor’s license compliance. Two other Moonshot employees posted confirmations. All three posts have since been deleted.
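For illustration only: in the standard OpenAI-compatible response shape, the server echoes a `model` field back to the client, which is where an upstream identifier can surface if a proxy doesn’t rewrite it. A minimal sketch of that mechanic — the response payload below is fabricated, and only the leaked identifier itself comes from this story:

```python
import json

# Fabricated chat-completion response body in the standard
# OpenAI-compatible shape. The "model" field is where an upstream
# model identifier can leak through if the proxy passes it along.
raw = json.dumps({
    "id": "chatcmpl-example",
    "object": "chat.completion",
    "model": "kimi-k2p5-rl-0317-s515-fast",
    "choices": [{"message": {"role": "assistant", "content": "..."}}],
})

def extract_model_id(body: str) -> str:
    """Return the model identifier the server reports about itself."""
    return json.loads(body).get("model", "<unknown>")

print(extract_model_id(raw))  # -> kimi-k2p5-rl-0317-s515-fast
```

Nothing clever is required on the client side: the identifier is part of the response contract, so any tool pointed at the base URL sees it.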
This is the second time. When Cursor launched Composer 1 in October 2025, users across multiple countries reported the model spontaneously switching its inner monologue to Chinese mid-session. Kenneth Auchenberg, a partner at Alley Corp, posted a screenshot calling it a smoking gun. KR-Asia and 36Kr confirmed both Cursor and Windsurf were running fine-tuned Chinese open-weight models underneath. Cursor never disclosed what Composer 1 was built on. They shipped Composer 1.5 in February and moved on.
The pattern: take a Chinese open-weight model, run RL on coding tasks, ship it as a proprietary breakthrough, publish a cost-performance chart comparing yourself against Opus 4.6 and GPT-5.4 without disclosing that your base model was free, then raise another round.
That chart from the Composer 2 announcement deserves its own paragraph. Cursor plotted Composer 2 against frontier models on a price-vs-quality axis to argue they’d hit a superior tradeoff. What the chart doesn’t show is that Anthropic and OpenAI trained their models from scratch. Cursor took an open-weight model that Moonshot spent hundreds of millions developing, ran RL on top, and presented the output as evidence of in-house research. That’s margin arbitrage on someone else’s R&D dressed up as a benchmark slide.
The license makes this more than an attribution oversight. Kimi K2.5 ships under a Modified MIT License with one clause designed for exactly this scenario: if your product exceeds $20 million in monthly revenue, you must prominently display “Kimi K2.5” on the user interface. Cursor’s ARR crossed $2 billion in February. That’s roughly $167 million per month, 8x the threshold. The clause covers derivative works explicitly.
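The arithmetic behind that claim is easy to check. A back-of-the-envelope sketch, using the figures reported above and assuming revenue is spread evenly across months:

```python
# Figures as reported in the text above.
ANNUAL_REVENUE = 2_000_000_000    # Cursor ARR, USD
MONTHLY_THRESHOLD = 20_000_000    # Modified MIT attribution trigger, USD/month

monthly_revenue = ANNUAL_REVENUE / 12
multiple = monthly_revenue / MONTHLY_THRESHOLD

print(f"monthly revenue: ${monthly_revenue / 1e6:.0f}M")  # -> monthly revenue: $167M
print(f"threshold multiple: {multiple:.1f}x")             # -> threshold multiple: 8.3x
```

Even with generous rounding, the revenue sits several multiples past the clause’s trigger.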
Cursor is valued at $29.3 billion and raising at $50 billion. Moonshot’s last reported valuation was $4.3 billion. The company worth 12x more took the smaller company’s model and shipped it as proprietary technology to justify a valuation built on the frontier lab narrative.
Three Composer releases in five months. Composer 1 caught speaking Chinese. Composer 2 caught with a Kimi model ID in the API. A P0 incident this year. And a benchmark chart that compares an RL fine-tune against models requiring billions in training compute without disclosing the base was free.
The question for investors in the $50 billion round: what exactly are you buying? A VS Code fork with strong distribution, or a frontier research lab? The model ID in the API answers that.
If Moonshot doesn’t enforce this license against a company generating $2 billion annually from a derivative of their model, the attribution clause becomes decoration for every future open-weight release. Every AI lab watching this is running the same math: why open-source your model if companies with better distribution can strip attribution, call it proprietary, and raise at 12x your valuation?
kimi-k2p5-rl-0317-s515-fast is the most expensive model ID leak in the history of AI licensing.
• • •
1. Daily Lesson Plan
"You're my personal Hebrew tutor. Build me a 20-minute lesson for today based on my current level [beginner/intermediate/advanced]. Include: 10 new vocabulary words with context, 1 key grammar concept with 3 examples, and 5 practice sentences I should translate. End with tomorrow's preview."
2. Real Conversation Partner
"Let's have a 10-minute conversation in Hebrew about [topic: weekend plans/work/hobbies]. Start each response in Hebrew, then add English corrections below using this format: '❌ You said X → ✅ Say Y (because Z)'. Adjust your Hebrew complexity to match my responses."
• • •
In product management, not everything is straightforward maths or solvable by AI.
Yet, some PMs still make better decisions most of the time.
How?
That's product sense:
"The ability to find the right solution for the user and business, despite limited and ambiguous information."
I love this definition from @Sid Arora.
You start with the PM process:
1. Take a vague & ambiguous problem statement
2. Create, or clarify, the overall goal
3. Identify all users in the ecosystem
4. Pick 1-2 users
5. Identify the major problems of those users
6. Select the problems to solve
7. Brainstorm solutions
8. Select the highest-ROI solution
9. Build and deploy the solution
10. Measure success / collect feedback
• • •
This is the most important part! You want to describe your unicorn candidate-market fit. This is your opportunity to make a bumpy career look like a straight line toward a sector of the market.
2. Navigation
The beauty of a website is that you can add layers. Add follow-up pages that deep-dive into your experience, and link them together. Add blog and podcast appearances too.