I'd like to share my initial reaction to today's Berkeley Lights report. But first, I need to do some housekeeping. I can't comment on stock movements, share financial projections, or debate fair value.
Generally, I respect anyone who's put this much work into a topic. I won't pretend to have a clean rebuttal to every point. In my experience, beyond the hyperbole and hasty generalizations, there is some truth in these types of reports.
I want to soberly appraise those truths.
Also, I'd invite the subject-matter experts waiting in the wings to build off of this thread, add detail, or share their experiences. Ultimately, we're all after the same thing.
I will start with a few concessions and end with a few counterpoints to today's report:
(1) I agree that a $2M ASP for Beacon is too high, especially under a direct purchase model. Further, this price likely prevents small or mid-tier biotechs and academic labs from becoming buyers. These market segments may be better served by a subscription access model.
(2) Beacon is an ambitious platform. Its internals are complicated, involving interconnected liquid-reagent pumps, semiconductor chips, optics, and software. The result seems to be inter-run variability, though I'm unsure whether the errors are stochastic or systematic.
(Cont.)
Regardless, if scientists need to perform manual experiments to generate 'ground truth,' it weakens the value proposition of Beacon, which is to avoid doing manual experiments. I would like to see more workflow-specific, 3rd-party data on coefficients of variation between runs.
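To be concrete about the statistic I'm asking for, here's a minimal sketch of a run-to-run CV calculation in Python, using made-up clone titers rather than real Beacon data:

import statistics

# Hypothetical titers (µg/mL) for the same workflow repeated across four runs
runs = {
    "run_1": [98, 102, 105, 99],
    "run_2": [110, 115, 108, 112],
    "run_3": [95, 97, 101, 96],
    "run_4": [104, 100, 103, 107],
}

# Inter-run CV: spread of per-run means relative to the overall mean
run_means = [statistics.mean(values) for values in runs.values()]
inter_run_cv = statistics.stdev(run_means) / statistics.mean(run_means)
print(f"Inter-run CV: {inter_run_cv:.1%}")  # lower is better; a trend across runs would hint at systematic error

Published, workflow-specific numbers like this from independent labs would go a long way toward settling the stochastic-vs-systematic question.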
(3) The volume of data coming off a fully-loaded Beacon is immense. This likely will compound with higher-throughput chips. It's apparent that Berkeley will need to improve its analysis software to help users contend with the volume and richness of data produced by its systems.
(4) While it should be weighed against the capital budgets and R&D allocations slashed by COVID-19, the declining system-utilization trend is concerning and deserves close attention in the coming quarters. Though anecdotal, several users reported ...
(Cont.)
... system underutilization because of hardware malfunctions or software bugs. Customer trust is fragile and vital, so I would urge both product development and field application support teams to reevaluate what's going wrong. I would expect more white-glove service.
As time goes on, I likely will come back with more concessions. These are the ones that leaped off the page the most, to me at least.
Now, I'm going to switch gears to offer where I believe this report is errant:
(1) The terms Berkeley uses to describe its technology are not nonsensical or misleading. The prefix "Opto-" refers to optical tweezer technology, which allows Beacon to non-destructively manipulate living cells.
The suffix "-select" refers to Beacon's ability to isolate, clone, and export desired cells for downstream analysis. I won't waste time doing this for each term the author finds esoteric. I believe none of them are and would be happy to expound on all of them in the comments.
(2) Contrary to the author's assertion, Berkeley is not a synthetic biology (synbio) company, nor is its success tied to that of the synthetic biology industry. Berkeley makes instruments that synbio companies can use, but this is its smallest customer segment.
(Cont.)
I believe the author is smart and knows this. I think they've made this purposeful and incorrect comparison solely to bring turbulence from the synbio sector onto Berkeley, which is unequivocally a life science tools manufacturer. This is one of the nastier arguments, IMO.
(3) Semiconductors are not archaic technology. In fact, semiconductor innovation is central to the life sciences. Companies like Pacific Biosciences use semiconductors embedded with nanoscale confinement chambers to enhance light and observe DNA synthesis in real-time.
(Cont.)
The author doesn't reference improvements to surface chemistry (activation, passivation) or the microfluidics that integrate with the semiconductor. Moreover, I take issue with trivializing the process of 'miniaturizing' standard well plates.
(Cont.)
Shrinking embedded features on a semiconductor is extremely challenging and requires clever circumvention of physical constraints, like the diffraction limit of light or transport phenomena. NanoPens may serve the role of reaction wells, but building them is genuinely hard engineering.
(4) In my opinion, every reference to 10X Genomics' products is incorrect. While true that both companies sell instruments that operate on single cells, that is where the similarities end. These companies do not really compete against one another.
(Cont.)
10X makes single-cell sample-prep boxes. Their systems take in thousands of cells, partition individual cells into little oil droplets, and then tag each cell's genome with a unique barcode. The library of tagged DNA molecules is then fed to a sequencer.
(Cont.)
Using 10X barcodes, the sequencer can tell which DNA molecules came from which cell, but that only gives you sequence (genotypic) information. Knowing the sequence of the genes that encode surface proteins is insufficient for the functional characterization of cells.
(Cont.)
Beacon is both a sample-prep box and a sensor. It generates phenotypic information that single-cell sequencing cannot. We believe 10X's technology and Berkeley's offer complementary views of single-cell biology and that their successes aren't zero-sum.
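To make the genotype-vs-phenotype distinction concrete, here's a toy Python sketch (made-up barcodes and fragments, not 10X's actual software) of what cell barcoding buys you, and what it doesn't:

from collections import defaultdict

# Hypothetical sequencer output: (cell_barcode, cDNA_fragment)
reads = [
    ("AACGTGAT", "ATGGCC..."),
    ("TTGCACCA", "GGCTAA..."),
    ("AACGTGAT", "CCATGG..."),
]

# Group molecules by the cell they came from
cells = defaultdict(list)
for barcode, fragment in reads:
    cells[barcode].append(fragment)

for barcode, fragments in cells.items():
    print(barcode, "->", len(fragments), "fragments")

# You recover which sequences came from which cell, but nothing about what each
# cell actually does (secretion, growth, cytotoxicity) -- the phenotypic gap Beacon targets.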
(5) In my first Google search, I found 167 articles that reference Berkeley Lights, so I'm unsure why the author only presents 4. Truthfully, I've not filtered each one, so it's possible there are duplicates or non-journal articles. I welcome anyone's help checking!
(Cont.)
Generally, we've found a logarithmic relationship between SKU publications and revenue growth, so I agree with the author that it's a relevant statistic. Notably, the first article I found (by Amgen) runs contrary to the report's interview with an Amgen employee.
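For anyone unfamiliar with what I mean by a logarithmic relationship, here's a rough Python illustration with hypothetical numbers (not Berkeley Lights data):

import math

publications = [10, 25, 60, 140, 320]   # cumulative papers citing a platform SKU
revenue_musd = [12, 20, 28, 36, 44]     # hypothetical annual revenue, $M

# Ordinary least squares of revenue on ln(publications)
x = [math.log(p) for p in publications]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(revenue_musd) / n
slope = (sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, revenue_musd))
         / sum((xi - x_bar) ** 2 for xi in x))
intercept = y_bar - slope * x_bar
print(f"revenue ~ {intercept:.1f} + {slope:.1f} * ln(publications)")

In other words, each doubling of the literature tends to add a roughly fixed increment of revenue, which is why a thin publication record is a fair thing to scrutinize.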
(6) I'm not an expert on FACS, but it seems wrong to conflate its role with Beacon's. Before I go on, I welcome FACS experts to chime in. My understanding is that FACS can't preserve cells and is incapable of measuring cytokine release or toxicity. Ostensibly, Beacon does both.
(Cont.)
While this last point isn't a counter or concession, I wasn't surprised that Beacon required specialized personnel or months of groundwork to get going, especially given how complex and expensive the instrument is. As stated earlier, Berkeley needs to improve ...
(Cont.)
... its field support and/or new Beacon workflows so they work better and faster in customers' hands. However, I'd be surprised if the first Quadrupole MS^2 setups worked out of the box. Even today, NGS isn't plug-and-play. I'm also not surprised that ...
(Cont.)
... scientists are resistant to change. If it ain't broke, don't fix it, right? The "innovator's dilemma" has and will continue to slow change in the life sciences industry. It's certainly not unique to Berkeley, though they have work to do in this department.
Hopefully, this has been a balanced review of today's report. Let's be clear, there's still much more investigative work to do, and I invite others to flip over stones if they'd like. Finally, even though I may disagree with some things, I thank the author for their diligence.
Imagine that a meteor was hurtling through space towards the Earth. Its speed and trajectory indicate that it will destroy the planet in approximately 10 years.
Now, let's say that our best sensors are only ...
... capable of seeing said meteor 1 year in advance. So, 9 years go by and we are blissfully unaware of our impending doom. Then, at the 9-year mark, we detect the meteor and measure our remaining survival time to be just 1 year.
What if I gave you a better sensor? What if this sensor could see the meteor from 10 years away instead of just 1?
How long would our survival time be? While we may have a 10-year lead time instead of a 1-year lead time, the meteor still strikes us on the same day.
We often discuss how more comprehensive and sensitive techniques improve the diagnostic yield for patients affected by rare genetic diseases. Indeed, yields have improved as we've gone from microarrays to whole genome #sequencing.
However, there's another critical component.
Case-Level Reanalysis (CLR)
By reanalyzing genomic data as our global knowledge base grows, we improve diagnostic yields.
We believe the broadest tests should be done first to avoid the need to re-contact and re-accession patient samples.
The economics for both the lab and the patient also change dramatically in a 'generate-once, reassess-often' framework. As more is known, variant interpretation may shift from being mostly manual to mostly automated.
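A toy Python sketch of what that loop could look like, with hypothetical variants and classifications rather than any lab's actual pipeline:

# Stored genomic findings per case -- generated once
stored_case_variants = {
    "case_001": ["GENE_A:c.123A>G", "GENE_B:c.456del"],
    "case_002": ["GENE_C:c.789C>T"],
}

# Knowledge base at initial analysis vs. after a later update
kb_2020 = {"GENE_A:c.123A>G": "VUS"}  # variant of uncertain significance
kb_2023 = {"GENE_A:c.123A>G": "Pathogenic", "GENE_C:c.789C>T": "Likely pathogenic"}

def reanalyze(cases, knowledge_base):
    """Return cases whose stored variants now have a reportable classification."""
    newly_solved = {}
    for case, variants in cases.items():
        hits = [v for v in variants
                if knowledge_base.get(v, "VUS") in ("Pathogenic", "Likely pathogenic")]
        if hits:
            newly_solved[case] = hits
    return newly_solved

print(reanalyze(stored_case_variants, kb_2020))  # {} -- no diagnosis yet
print(reanalyze(stored_case_variants, kb_2023))  # yield improves with zero new wet-lab work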
Still, this is a really hard technological problem.
The widespread adoption of liquid biopsy seems to be 'un-commoditizing' DNA synthesis in the molecular diagnostics industry.
Recall that synthetic DNA probes, molecules that bind and pull a target DNA fragment out of solution, are a critical input for liquid biopsy.
Diagnostics companies buy probes to use in their clinical tests, oftentimes in bulk, from a synthetic DNA provider. There's been a prevailing notion recently that DNA providers only can differentiate on the basis of cost or turnaround time.
I think liquid biopsy changes this.
Firstly, a huge technical constraint in liquid biopsy is the scarcity of tumor-derived DNA in a tube of blood, which falls off sharply as tumors get smaller.
Remember that smaller tumors don't leak as much DNA into the bloodstream.
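A back-of-the-envelope Python sketch, with illustrative numbers rather than measured values, of why that matters:

# If tumor-derived fragments are a tiny fraction of cell-free DNA, a fixed-size blood
# draw may not even contain a mutant molecule at a given locus.
tumor_fraction = 0.0001        # hypothetical ctDNA fraction for an early-stage tumor
genome_equivalents = 5_000     # haploid genome copies recoverable from ~1 tube of blood

expected_mutant_copies = tumor_fraction * genome_equivalents
p_at_least_one = 1 - (1 - tumor_fraction) ** genome_equivalents

print(f"Expected mutant copies at one locus: {expected_mutant_copies:.1f}")
print(f"P(>=1 mutant copy in the tube): {p_at_least_one:.0%}")
# With ~0.5 expected copies, many draws contain zero mutant molecules at that locus.
# No probe, however good, rescues a molecule that never made it into the tube -- which
# is part of why panel breadth and probe performance matter so much here.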
@NatHarooni @snicobio I'm watching Jeopardy -- will come back later tonight. Short answer: no, not competitive to PacBio. Likely friends down the road.
@NatHarooni @snicobio Alright, so in theory the QSI platform can enable DNA (or RNA) sequencing on chip. However, I think of it more as a call option than a near-term goal. Proteomics is the killer app enabled by the QSI platform. But, as OP alluded to, multi-omics (inc. proteins) on one ...
@NatHarooni @snicobio ... instrument could be an attractive value prop. from a capital-outlay point of view, especially at $50K, which is achievable for many labs w/o needing to seek a major grant (so speedy sales cycles). Now, back to the main point about sequencing. If you read the patents ...
@NatHarooni @AlbertVilella In my opinion: HiFi reads are the most accurate/complete, but currently are more expensive and lower-throughput. Nanopore reads are cheap, fast, and high-throughput, but have a weaker error profile. Both of these descriptors are changing and may not hold in a few years.
@NatHarooni @AlbertVilella As far as QSI is concerned, it's a little too early for me to calculate operating costs per run. I'll update when I know more.
@NatHarooni @AlbertVilella Regardless of how you weigh the remaining engineering obstacles, necessary R&D spend, or computational issues, I feel that long-read sequencing (as a class of tech) will outperform short reads on virtually every relevant metric by 2024-2025.
@nhawk45 @AlbertVilella Hey, @nhawk45 -- Sure, I think I can take some of these. Let's start from the beginning to help explain why certain features of QSI's approach/IP are needed and interesting.
First: Why are proteins the hardest molecules to sequence of the 'big three'? (DNA/RNA/Proteins) ...
@nhawk45 @AlbertVilella Here are some of my notes on the reasons I could think of; I'll take a moment to elaborate on a few of them.
The proteome is estimated to have the largest 'unit diversity,' for lack of a better term. DNA = 20K genes, RNA = 10^5 isoforms, proteins = 10^6 proteoforms ...
@nhawk45 @AlbertVilella Proteomic diversity is driven by post-translational modifications (PTMs), a set of chemical alterations to peptides not dissimilar to #epigenetic modifications of DNA, like methylation. Combine this w/ the fact that peptides have a 20-letter alphabet ...
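To give a feel for the combinatorics, here's an illustrative Python sketch (the PTM states per residue are hypothetical):

peptide_length = 10          # a short tryptic peptide
amino_acids = 20
ptm_states_per_residue = 3   # e.g., unmodified, phosphorylated, acetylated (hypothetical)

sequence_space = amino_acids ** peptide_length
with_ptms = (amino_acids * ptm_states_per_residue) ** peptide_length

print(f"Peptide sequences of length {peptide_length}: {sequence_space:.2e}")
print(f"With {ptm_states_per_residue} PTM states per residue: {with_ptms:.2e}")
# A DNA or RNA sequencer resolves a 4-letter alphabet (plus a handful of modifications);
# a protein sequencer faces a vastly larger, messier state space.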