On #biocompute news, $TSLA Tesla AI Day gave some technical details on how they do their training and video labelling. It seems $NVDA Nvidia GPUs are the norm, with a 14,000-GPU HPC setup heavily optimised on the software side. #pytorch #AVX2 #CUDA #SMT #smol #CUDNN
None of this is for #Bioinformatics applications; rather, it's for the Full Self-Driving software. But the technical details show choices for high-throughput #AI training that one can compare to the #ComputationalBiology #ComputeAcceleration world.
Now even though $TSLA Tesla continues to use $NVDA Nvidia for their #AI training, they gave an update on their own #Computing platform, #DOJO. Difficult to say from the cost-comparison slide whether 1 Dojo tile is worth 6 GPU boxes (each with 8x A100?), but they say it's on par.
At the moment it does a bit more work than an $NVDA Nvidia A100 GPU; the bar in the middle is faster due to faster VRAM, and the last one is a projection of hardware and software improvements. No numbers on wattage.
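For a rough sense of that comparison, here is a back-of-envelope sketch using publicly cited peak figures, not numbers from Tesla's slide itself: Nvidia quotes ~312 TFLOPS BF16 (dense) for the A100, and Tesla claimed ~9 PFLOPS BF16/CFP8 peak per Dojo training tile.

```python
# Back-of-envelope peak-throughput comparison (illustrative public figures,
# NOT the numbers on Tesla's cost-comparison slide)
a100_bf16_tflops = 312                 # Nvidia's quoted A100 BF16 peak (dense)
gpu_box_tflops = 8 * a100_bf16_tflops  # one box of 8x A100
six_boxes_tflops = 6 * gpu_box_tflops  # 14,976 TFLOPS, i.e. ~15 PFLOPS
dojo_tile_pflops = 9                   # Tesla's claimed per-tile BF16/CFP8 peak
print(six_boxes_tflops / 1000, dojo_tile_pflops)  # peak PFLOPS: 6 boxes vs 1 tile
```

On peak FLOPS alone the six GPU boxes come out ahead, so the "on par" claim presumably rests on achieved utilisation and interconnect, not raw peak.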
The Exapod cabinet demo shows, I believe, one Dojo tile per case, roughly the size of an Apple Mac mini, so only 4 in the picture; the compute below is, I presume, the CPUs / hard drives / networking? If someone has more insight, do comment below please.
Again, nothing shown here directly relates to #Bioinformatics #ComputeAcceleration, but it illustrates the "Nvidia A100 as the work-horse of #AI" trend, and also the fact that people at Tesla believe they need their own silicon to move forward with freedom to operate independently of other vendors.
Looking forward to hearing more about Dojo and how it compares to Nvidia. The thread is currently open to all for comments (so be nice and behave).
A final comment: from the Exapod cabinet demo it looks like the Dojo die is quite large. I am not sure what to make of their plan to produce 7 Exapods in the next 1-2 years, but I presume it means they haven't at all considered opening up Dojo to the outside world.
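Tesla's AI Day figures put an ExaPOD at 120 training tiles for roughly 1.1 EFLOPS BF16/CFP8. Assuming those claimed numbers (again, Tesla's own peak figures, not independently measured), a quick sanity check of the 7-ExaPOD target:

```python
# Sanity check of Tesla's claimed ExaPOD scale (their figures: 120 tiles/ExaPOD,
# ~9 PFLOPS BF16/CFP8 peak per training tile)
tiles_per_exapod = 120
pflops_per_tile = 9
exapod_pflops = tiles_per_exapod * pflops_per_tile   # 1080 PFLOPS ~ 1.1 EFLOPS
seven_exapods_eflops = 7 * exapod_pflops / 1000      # ~7.6 EFLOPS across 7 ExaPODs
print(exapod_pflops, seven_exapods_eflops)
```

So the 7-ExaPod plan, if the claimed peaks hold, is in the several-EFLOPS range of peak compute, all of it in-house.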
There is also the software side to this: it's all well and good to write software for a new #ComputeAcceleration silicon *within* a company, but if they ever wanted wide software adoption, they would have to open it up to the outside world in software terms as well
which I believe to be very unlikely. I have posted before about other #Acceleration computing platforms, such as @CerebrasSystems and @graphcoreai , but their adoption rate compared to Nvidia GPUs is still minuscule. Hopefully there is enough software, and enough of it
is nicely integrated into #Linux #Tensorflow and other #OpenSource libraries that we see a rich #ComputeAcceleration ecosystem come out of this wave of #AI initiatives.

