Companies wishing to sell IVDs must seek pre-market approval (PMA) from the FDA, which can take roughly 200 days. As I understand it, IVDs must be run on FDA-cleared diagnostic equipment, such as Illumina's NextSeq 550Dx, which is also CE-marked for sale in Europe.
This way, FDA has end-to-end oversight of the diagnostic workflow. They must approve the kit (the IVD) and the machine that the IVD is run on (Dx-enablement).
That way, all a lab/hospital has to do is buy both, and they can run (and bill for) diagnostic tests.
Conversely, an LDT is a diagnostic test developed and operated by one lab in a central location. You do not need FDA-clearance on the reagents/equipment used in an LDT, so long as the lab is CLIA/CAP-certified.
So why seek FDA clearance at all, then? If you wish to sell an IVD kit, it's necessary: you must obtain the PMA from the FDA.
If you're an instrument provider, FDA clearance is helpful insofar as it allows for decentralized testing. As you may guess, an LDT is expensive.
So, having self-contained, FDA-cleared equipment lightens the burden on the lab/hospital: they don't need to build an LDT on-site. All they need is your machine, and they can start running IVDs.
I'll end by saying that FDA-cleared equipment needs to be matched by FDA-cleared IVDs. If there isn't a menu of IVDs, then the decentralization benefit to having approved equipment diminishes. You could argue FDA-clearance is helpful for a sales pitch, though.
Regulation is a tedious subject, so if you think I've misunderstood any nuances, please comment. Thanks!
• • •
The first day of the conference hasn't disappointed, especially if you're a fan of talking cubes. What is this mysterious object and what sorcery is inside?
See disclosures at the end.
The Tempus One, meant to be carried in a doctor's coat or sat at the bedside, is a physical manifestation of @TempusLabs' genomic and phenotypic data-lake. Oncologists can ask One all sorts of questions regarding their patients, though I'm unsure if it'll (...)
(...) just reflex you to a computer after a sufficiently difficult question. I'm sure we'll learn more soon. Has this sort of form-factor been tried before?
I'm just hoping it has adjustable humor/honesty settings like TARS from Interstellar.
Tomorrow kicks off the JP Morgan Healthcare Conference, one of the most information-dense and exciting weeks for biotechnology. #JPM2021
Though I’ll miss annual lab tours, I’m excited not to have my shoes destroyed amidst all the shoulder-to-shoulder crowds.
Unlike last year, I’m going to try to give a daily news recap once the US trading session closes (inspired by @aurmanARK). My hope is to aggregate input from folks who can offer alternative takes.
We’ve got our #mARKetUpdate webcast on Tuesday, where I’ll be talking more about my recent blog on earlier #cancer detection as well as plans for including community feedback in the forthcoming white paper.
First, let's deconstruct the paper's title: "De Novo Assembly of 64 Haplotype-Resolved Human Genomes of Diverse Ancestry and Integrated Analysis of Structural Variation".
De novo (Latin for "anew") assembly involves sequencing a genome without the help of a reference.
Assembling a #genome de novo is like solving a jigsaw puzzle without using the picture on the front of the box. You could start with the corners, assemble the edges, and try to fill in the rest using color- or shape-matching methods.
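To make the jigsaw analogy concrete, here's a minimal sketch of the "shape-matching" step: greedily merging reads by their longest suffix-prefix overlap. Real assemblers use overlap graphs or de Bruijn graphs (and the reads and sequences below are made up for illustration), but the intuition is the same.

```python
# Toy reference-free (de novo) assembly: repeatedly merge the pair of
# reads with the longest suffix-prefix overlap, like fitting puzzle
# pieces by matching their edges.

def overlap(a: str, b: str) -> int:
    """Length of the longest suffix of `a` matching a prefix of `b`."""
    for n in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:n]):
            return n
    return 0

def greedy_assemble(reads: list[str]) -> str:
    reads = reads[:]
    while len(reads) > 1:
        # Find the pair of reads with the largest overlap and merge them.
        i, j, n = max(
            ((i, j, overlap(reads[i], reads[j]))
             for i in range(len(reads)) for j in range(len(reads)) if i != j),
            key=lambda t: t[2],
        )
        merged = reads[i] + reads[j][n:]
        reads = [r for k, r in enumerate(reads) if k not in (i, j)] + [merged]
    return reads[0]

reads = ["ATTAGACC", "GACCTGCC", "TGCCGGAA"]
print(greedy_assemble(reads))  # ATTAGACCTGCCGGAA
```

Longer reads (PacBio's specialty) make these overlaps longer and less ambiguous, which is exactly why long reads help so much with de novo assembly.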
For every cumulative doubling in sequence data generated across its install base, @PacBio has been able to lower (consumables) costs by roughly 30%, as shown below.
What could this imply about the future of long-read #sequencing?
First, let's acknowledge a Catch-22. Does PacBio need to (a) derive knowledge from platform utilization to lower sequencing costs or (b) lower costs first in order to unlock greater platform utilization?
At present, we believe it's more of the latter. Why?
PacBio's HiFi chemistry and Sequel II optics are relatively nascent (2019). This suggests a lot of near-term headroom left for optimization in these areas.
It's crucial that all long-read users, not just the top 1%, have access to this innovation.
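The ~30%-per-cumulative-doubling relationship above is a form of Wright's Law, and it's easy to sketch what it implies. The dollar figures below are hypothetical placeholders, not PacBio's actual costs:

```python
def wrights_law_cost(initial_cost: float, doublings: float,
                     decline_per_doubling: float = 0.30) -> float:
    """Projected unit cost after `doublings` cumulative doublings of
    output, assuming each doubling cuts cost by `decline_per_doubling`
    (Wright's Law)."""
    return initial_cost * (1 - decline_per_doubling) ** doublings

# Illustrative only: if a long-read genome cost $1,000 today, five
# cumulative doublings of data generated would imply roughly $168.
print(round(wrights_law_cost(1000, 5)))  # 168
```

The key feature of Wright's Law is that costs fall with cumulative production, not with calendar time, which is why utilization and cost form the feedback loop described above.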
Researchers at @CNIOStopCancer just published an exciting proof-of-concept showing how CRISPR can delete cancer-causing gene fusions, selectively killing cancer cells.
First, let's discuss what gene fusions are. As shown below, fusions result when two genes crash into each other and fuse together.
The resulting protein product is a hybrid. It has some features of Protein A and some of Protein B.
This usually is very bad.
We know that cancer-causing (#oncogenic) fusions have been found in nearly all cancer types. They're more common in pediatric cancers, but still are present in as many as 15-20% of adult cancers.
If present, fusions often are the main drivers of tumor growth.
Interestingly, Exact Sciences ($EXAS) is halted and spiking up ~15%, likely because of what's going on at the Cowen liquid biopsy conference. I will provide updates.
This is the first time, to my knowledge, Exact has seriously discussed multi-cancer liquid biopsy instead of just colorectal cancer screening via Cologuard. They presented preliminary data evaluating a blood-based multi-cancer test.
The cohort was relatively small, but showed sensitivity of ~85% (true-positive rate) and specificity of ~95% (true-negative rate). This is definitely the highest sensitivity I've seen from a test like this, but also the weakest specificity. Granted, this is early data.
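For anyone unfamiliar with the two metrics above, here's how they fall out of a confusion matrix. The cohort counts below are hypothetical, chosen only to reproduce the quoted ~85%/~95% figures:

```python
def sensitivity(tp: int, fn: int) -> float:
    """True-positive rate: fraction of actual cancers the test flags."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True-negative rate: fraction of cancer-free people correctly cleared."""
    return tn / (tn + fp)

# Hypothetical cohort: 200 people with cancer, 1,000 without.
tp, fn = 170, 30   # 170 of 200 cancers detected
tn, fp = 950, 50   # 950 of 1,000 healthy people test negative
print(sensitivity(tp, fn))  # 0.85
print(specificity(tn, fp))  # 0.95
```

Specificity matters enormously for screening: at low cancer prevalence, even a few percentage points of false positives can swamp the true positives.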