187 tweets, 121 min read
@skdh @earltcampbell You are asking for several hours of my time.
Okay, I really had other plans for this morning, including prepping to talk to a potential funding agency this afternoon, but here goes...
Let's talk #QuantumComputer architecture, broadly.
@skdh @earltcampbell Let's divide the scalability problem into four areas:
1. architecture-aware compilation of realistic algorithms, including
resource analysis and error correction (QEC)
2. architecture of large-scale quantum computers, including QEC
3. classical control systems
4. NISQy business
@skdh @earltcampbell (to borrow Joe Fitzsimons' fantastic phrase)
@skdh @earltcampbell (In the interest of some amount of brevity, I'm going to name only one author for most papers; my apologies to those left out. I'll also stick to arXiv and open access links where I can.)
@skdh @earltcampbell (Also, some of this we have covered here on Twitter in the last couple of years, but sometimes it's easier to recreate than find the links.)
@skdh @earltcampbell 1. IMO, the first compilations of quantum algorithms (done by hand, as most of the papers cited here are) were VBE and BCDP, both focusing on Shor, but with little architecture focus (more in BCDP).
@skdh @earltcampbell Nearly a decade later, Simon Devitt, Austin Fowler and I came along, with more of a focus on architecture awareness, but still without QEC.
@skdh @earltcampbell My collaborator Byung-Soo Choi, yet another person with a classical architecture background, focused on architecture-aware circuits.
@skdh @earltcampbell A few years later, Cody Jones and James Whitfield worked on quantum chemistry, this time taking into consideration some aspects of QEC.
@skdh @earltcampbell Around that same time, the IARPA program was under way and HHL was all the rage, but Clader, Scherer, and others found shocking consumption of resources for the problems IARPA asked for.
@skdh @earltcampbell This was the era that brought Margaret Martonosi into the community, and saw the development of some of the first nearly-practical programming languages, such as Scaffold.
@skdh @earltcampbell Quipper, from Green, Selinger and others, and used by Siddiqui and Islam, is another.
@skdh @earltcampbell They, and Jon Dowling and Ken Brown, can probably answer additional questions about their findings on resource consumption for various algorithms.
@skdh @earltcampbell Krysta Svore and Andrew Cross, working with Ike Chuang, published an influential paper on software architectures in the early days.
@skdh @earltcampbell After Krysta joined Microsoft, she initiated a team with Dave Wecker, Matthias Troyer, Nathan Wiebe, Martin Roetteler and Alex Bocharov and worked on programming languages and resource counts for quantum chemistry, elliptic curves, optimization algorithms, etc.
@skdh @earltcampbell This era also saw some advances in arithmetic, useful for Shor's algorithm but also much more. I like the Pavlidis paper, despite not being a big fan of the QFT-based adder.
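@skdh @earltcampbell To make the arithmetic concrete: at their core, these adders are classical reversible networks of CNOT and Toffoli gates, so you can check one on basis states in plain Python. Here's my own toy sketch of a Cuccaro-style in-place ripple-carry adder, for illustration only (not code from any of the papers above):

```python
def maj(reg, x, y, z):
    # Cuccaro MAJ gate: two CNOTs and a Toffoli, here on classical bits
    reg[y] ^= reg[z]
    reg[x] ^= reg[z]
    reg[z] ^= reg[x] & reg[y]

def uma(reg, x, y, z):
    # Cuccaro UMA gate: undoes MAJ while depositing the sum bit in y
    reg[z] ^= reg[x] & reg[y]
    reg[x] ^= reg[z]
    reg[y] ^= reg[x]

def add(x, y, n):
    # reg layout: [c0, b0, a0, b1, a1, ..., b_{n-1}, a_{n-1}, z]
    reg = [0] * (2 * n + 2)
    for i in range(n):
        reg[1 + 2 * i] = (y >> i) & 1   # b register: becomes the sum
        reg[2 + 2 * i] = (x >> i) & 1   # a register: restored at the end
    maj(reg, 0, 1, 2)
    for i in range(1, n):
        maj(reg, 2 * i, 2 * i + 1, 2 * i + 2)
    reg[2 * n + 1] ^= reg[2 * n]        # copy out the carry
    for i in range(n - 1, 0, -1):
        uma(reg, 2 * i, 2 * i + 1, 2 * i + 2)
    uma(reg, 0, 1, 2)
    s = sum(reg[1 + 2 * i] << i for i in range(n))
    return s | (reg[2 * n + 1] << n)    # n-bit sum plus carry bit
```

Because the network is purely classical reversible logic, testing it on basis states like this tells you it's correct as a quantum circuit, too.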
@skdh @earltcampbell I assume you're aware of the Quantum Algorithm Zoo
I've been saying for years that one of the key things we in the community need to be focusing on is establishing *concrete* use cases for those algorithms,
@skdh @earltcampbell including resource counts for both small-machine demonstrations and desirable large-machine implementations, including error analysis. That is really starting to happen now.
@skdh @earltcampbell Finally, let me close with another of mine. Horsman and I wrote about the top-to-bottom problem in CACM.
The SOM (supplementary online material) is hard to find, but includes 60 more references, many focused on the IARPA-driven problem set:
@skdh @earltcampbell If you get the impression I'm simply bombarding you with volume, rather than a few superhero papers, what I'm trying to convey is that building useful systems is an end-to-end problem, and that *hundreds*, perhaps thousands, of researchers are all pulling in the same direction, trying to solve these problems.
We had almost fifty researchers at the Dagstuhl seminar focused on quantum programming languages last September, organized by Peter, Martin and Michele Mosca, and that was basically only one representative per research group.
@skdh @earltcampbell More than enough for now. And that's just topic one of four I listed above. If you're waiting in the wings to see if I brought up you, or your favorite paper, feel free to chime in, though the odds are good I have that planned for the next segment!
/end (temporary)
@skdh @earltcampbell (I apparently branched this once above, which wasn't my intention. I'll try to keep it linear from here.)
@skdh @earltcampbell Okay, let's do this.
Let's talk #QuantumComputer architecture. Earlier, we discussed the work of the hundreds of researchers working on quantum programming tools and on figuring out the resource requirements of algorithms.
2. Now let's do large-scale architectures.
@skdh @earltcampbell I'm assuming you're familiar with Seth Lloyd's "first buildable quantum architecture", and the basic ideas of solid-state (superconducting, quantum dot), ion trap, all-optical, and optical hybrid (esp. NV diamond).
@skdh @earltcampbell If not, go read Thaddeus Ladd et al.'s "Quantum computing".
@skdh @earltcampbell What we're after here is not few-qubit systems, but designs for *big* machines, capable of running big algorithms with full error correction and scaling to thousands of logical qubits.
@skdh @earltcampbell This doesn't necessarily mean I think this is the _only_ path to _useful_ quantum computers, but it's a route we'll have to follow eventually.
@skdh @earltcampbell The first papers with input from computer architects were by a group centered around Ike Chuang, featuring Fred Chong, Mark Oskin, and John "Kubi" Kubiatowicz.
@skdh @earltcampbell This is 2002-3, just before Simon, Austin and I arrived on the scene, as discussed earlier.
@skdh @earltcampbell Dean Copsey, Setso Metodi (Tzvetan Metodiev), Darshan Thaker, Andrew Cross, Nemanja Isailovic, Mark Whitney started contributing around this time, too.
@skdh @earltcampbell Dean's paper on the costs of moving data around inside big machines led the way. "This is an important paper," I wrote in my notes.
@skdh @earltcampbell Mark and I wrote a survey on the architectural implications of technologies, based on a chapter from my Ph.D. thesis.
@skdh @earltcampbell The architectural direction I went, which I still maintain is the right one, is quantum "multicomputers", with the name borrowed from classical distributed-memory computers. My undergrad adviser, Chuck Seitz, coined the term, I believe.
@skdh @earltcampbell I can't say the term has fully caught on, but a number of people had similar ideas.
@skdh @earltcampbell My most important paper on multicomputer architecture was at ISCA, the premier computer architecture conference, same time as a couple of other papers. I later expanded it to a JETC paper.
@skdh @earltcampbell My thesis tried to address all of what I considered *architectural* issues, independent of technology.
@skdh @earltcampbell Other papers I would classify as multicomputer are Daniel Oi's and Liang Jiang's.
@skdh @earltcampbell And, arguably, Yuan Liang Lim's, and Jungsang Kim and Changsoon Kim.
@skdh @earltcampbell Then, a few years later, came designs from Naomi Nickerson in Simon Benjamin's group, and, for ion traps, from Chris Monroe's group, growing out of the work of Jungsang.
@skdh @earltcampbell This paper gets only a modest amount of love, but represents perhaps(?) the first attempt to do a full stack design, including taking into account distillation and QEC overhead. The microarchitecture even includes the ability to work around dead qubits!
@skdh @earltcampbell (Yes, if you're paying close attention, I just copied a tweet from a couple of months ago:
@skdh @earltcampbell Arguably, that paper by Thaddeus Ladd, Austin, Yoshi and me had a _negative_ influence; I believe at least one or two senior researchers shifted direction _away_ from practical quantum computing with error correction because of the huge resource demands in that.
@skdh @earltcampbell But it was always intended to be a first "what if" paper, and many of the designs that followed from that use far fewer resources (qubits and classical hardware) as a result.
@skdh @earltcampbell Cody's paper is one such, and does a great job of laying out a structure for the architectural roles for various subsystems.
@skdh @earltcampbell So multicomputer is a popular, and relatively obvious, architectural choice, but the devil is in making the interconnect work. Ahsan, Jungsang and I worked on scheduling the interconnect during my year's sabbatical in Durham.
@skdh @earltcampbell We were all striving toward architectures with error correction. During the early part of this period, there was increasing awareness of the architectural difficulty of *executing* quantum error correction.
@skdh @earltcampbell I'm not even going to be able to *begin* to address QEC, except to point out two great introductions from Simon and Barbara Terhal. See esp. Tab. VIII in Simon's.
@skdh @earltcampbell Bravyi, Kitaev, Dennis, Landahl (one of the many Quantum Andrews) and Preskill proposed a weird topological idea shortly before I joined the field.
@skdh @earltcampbell Which became maybe-physically-buildable with the work of Raussendorf.
@skdh @earltcampbell Austin latched onto that, and has become our paladin, working to make it all achievable. It's impossible to express how much *work* Austin has done in the last decade. This 54-page paper only scratches the surface.
@skdh @earltcampbell This Nature paper gives a hint of how much influence Austin has on the design of physical computers inside Google, though.
@skdh @earltcampbell Simon and the group that has coalesced around him have worked to shrink resource requirements, again harder than words can convey.
@skdh @earltcampbell Dominic Horsman, Austin, Simon and I found an improvement that is really gaining traction, known as lattice surgery.
@skdh @earltcampbell Overall, there is broad but not universal consensus that these surface codes are the most architecturally practical, and many groups are working to *build* these systems.
@skdh @earltcampbell Simon and Austin worked with another group that proposed a football field-sized quantum computer, which is indeed what it might take.
@skdh @earltcampbell In some cases, these architectural designs are abstract enough to provide principles that guide all designs. In other cases, they are specific to one technology. Certainly all-optical has different demands.
@skdh @earltcampbell The best integrated optical experiments might come from Jeremy O'Brien's group.
@skdh @earltcampbell But the most complete system designs come, again, from Simon Devitt, Kae Nemoto, Ashley Stephens, et al. The scale of the latter of these is mind-boggling, and I wish I had been involved in that work!
@skdh @earltcampbell Kae herself took the lead on one NV-diamond architecture that to my eye looks very buildable, if NV centers can be fabbed with less human intervention.
@skdh @earltcampbell David DiVincenzo -- yes, *that* DiVincenzo -- has worked on scalable architectures. I've seen pictures at conferences, but this is the best paper match I could find.
@skdh @earltcampbell We discussed some of this back over a year ago, here on Twitter:
@skdh @earltcampbell Finally, let me close again with one of mine: Simon and I wrote about the path to scalability a couple of years ago in IEEE Computer.
@skdh @earltcampbell (Ugh, I never updated the arXiv version, and the published version is *very* different. I should fix that.)
@skdh @earltcampbell Enough for tonight. I'm offline for a couple of days, but this should give you enough to read. And at this point, if you turn this into an article, I want coauthorship :-).
@skdh @earltcampbell Okay, I'm back in the saddle (though I'm allergic to horses). Let's do this. Next stage in our #QuantumComputerArchitecture discussion.
@skdh @earltcampbell We were talking about large-scale quantum computer architectures. Before we leave the topic, it would be remiss of me not to plug Kane and Kielpinski.
@skdh @earltcampbell Kielpinski's design, with Chris Monroe and Dave Wineland, is the ancestor of single-trap scalable ion systems. The Chuang orbit mentioned earlier designed architectures around this technology.
@skdh @earltcampbell Kane's design for donors in Si is not just a technology, he proposed actual architectures with 1-D and 2-D layouts.
@skdh @earltcampbell This idea proved to be an enormous fabrication challenge; two decades on, Michelle Simmons leads the charge, working with Lloyd Hollenberg, Charles Hill, Michael Bremner, and a whole cast.
@skdh @earltcampbell The Copsey paper I mentioned (long ago now) led to QEC-focused designs based on both those technologies. It's hard to overstate the importance of those two papers.
@skdh @earltcampbell Importantly, Kane took into consideration the need for space for classical control and I/O AND the need to achieve scalability...
@skdh @earltcampbell which brings us to our next topic:
3. classical control systems
Let's throw in I/O with this.
@skdh @earltcampbell I/O here covers three meanings:
a. coaxing a photon from a quantum device, often entangled with a stationary qubit, to make long-distance entanglement;
@skdh @earltcampbell b. measuring quantum states in whole or in part, whether for algorithmic results, quantum error correction, or teleportation; and
c. building a detailed quantum superposition based on large classical datasets.
@skdh @earltcampbell To the surprise of many people, in most technologies, qubits are huge, especially when you incorporate the classical control structures and I/O pads (in solid state systems). A few hundred qubits in a single device will be a challenge.
@skdh @earltcampbell Kane and Kielpinski were perhaps the first to propose systems that would reach these limits of individual device size, forcing us toward multicomputer architectures.
@skdh @earltcampbell I'm not going to talk technology much, but it's important to note that entangling between two ion traps was done by Moehring & Monroe a while back.
@skdh @earltcampbell And Narla, Schoelkopf and Devoret demonstrated it just recently between two separate superconducting chips in the same fridge.
@skdh @earltcampbell The other thing you have to worry about, besides the technological basis, is the interconnect topology, scheduling and resource consumption; hence my thesis and related work.
@skdh @earltcampbell If you're doing this in optics, of course, you need a way to switch those photons around, and Jungsang Kim knows this better than anybody.
@skdh @earltcampbell This is also the basis for quantum networking, whether system area networks (as in multicomputers) or for the Quantum Internet.
@skdh @earltcampbell (Ha! You didn't really think I was going to go through this whole thread without plugging my book, did you?!?)
@skdh @earltcampbell Apologies to the Americans (north, south, wherever) who are in or going to bed, but now it's time to start picking up the European early risers.
@skdh @earltcampbell Okay, so that's a. above. What about measurement? Ugh, that's such a hassle, at the physical level...measurements are slow and inaccurate in most technologies, compared to the actual gates.
@skdh @earltcampbell At the algorithm level, you only have to measure each qubit once, generally speaking. But some algorithms involve byproduct operators, where you need to select your next operation based on the outcome of the current one.
@skdh @earltcampbell That's called "feedforward", and it's tough, especially in solid-state systems, because the signal integration times for measurement can be on the order of the memory coherence time. It's one of the biggest open architectural problems across multiple platforms.
@skdh @earltcampbell It's a challenge in today's systems already, but more importantly, for large-scale, fault-tolerant systems, we need it to make quantum error correction work, and it's going to have to be both fast and accurate.
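@skdh @earltcampbell To show what feedforward looks like in control-flow terms, here's a toy statevector simulation of teleportation, where the X and Z corrections on the target qubit are chosen by the two measurement outcomes. This is my own illustrative sketch, not production simulator code:

```python
import math
import random

def apply_h(state, q):
    # Hadamard on qubit q (little-endian bit order)
    s = 1 / math.sqrt(2)
    out = [0j] * len(state)
    for i, a in enumerate(state):
        if not a:
            continue
        if (i >> q) & 1:
            out[i & ~(1 << q)] += s * a
            out[i] -= s * a
        else:
            out[i] += s * a
            out[i | (1 << q)] += s * a
    return out

def apply_cnot(state, c, t):
    out = [0j] * len(state)
    for i, a in enumerate(state):
        out[i ^ (1 << t) if (i >> c) & 1 else i] += a
    return out

def apply_x(state, q):
    out = [0j] * len(state)
    for i, a in enumerate(state):
        out[i ^ (1 << q)] += a
    return out

def apply_z(state, q):
    return [-a if (i >> q) & 1 else a for i, a in enumerate(state)]

def measure(state, q, rng):
    # projective measurement of qubit q: (outcome, collapsed state)
    p1 = sum(abs(a) ** 2 for i, a in enumerate(state) if (i >> q) & 1)
    m = 1 if rng.random() < p1 else 0
    norm = math.sqrt(p1 if m else 1 - p1)
    return m, [a / norm if ((i >> q) & 1) == m else 0j
               for i, a in enumerate(state)]

def teleport(alpha, beta, rng):
    # qubit 0 holds the unknown state; qubits 1 and 2 get a Bell pair
    state = [0j] * 8
    state[0], state[1] = alpha, beta
    state = apply_h(state, 1)
    state = apply_cnot(state, 1, 2)
    state = apply_cnot(state, 0, 1)
    state = apply_h(state, 0)
    m0, state = measure(state, 0, rng)
    m1, state = measure(state, 1, rng)
    if m1:                         # feedforward: the remaining gates
        state = apply_x(state, 2)  # depend on classical outcomes
    if m0:
        state = apply_z(state, 2)
    idx = m0 | (m1 << 1)
    return state[idx], state[idx | 4]  # qubit 2 amplitudes for |0>, |1>
```

In software the branch is trivial; the architectural problem is doing it in hardware before the target qubit decoheres.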
@skdh @earltcampbell But rather than the tech, let's talk about the control. It has been suggested that the output problem presents a challenge. Fortunately, as the systems themselves scale, for the most part, the classical control logic scales linearly.
@skdh @earltcampbell Again, let's turn to Simon and Austin. There is *zero* doubt that they are not the only ones working on this problem, but I think some of those efforts are proprietary, and I couldn't find other literature on the topic.
@skdh @earltcampbell Rather than the I/O bandwidth itself, the question is whether we will be able to do the calculations necessary to decode error syndromes and correct the state in real time.
@skdh @earltcampbell Some quantum codes have very straightforward decoding, but the surface code that we have fallen in love with (2-D or 3-D) presents a very difficult problem. Simon looked at this a full decade ago, and concluded yes, it's feasible.
@skdh @earltcampbell Austin has worked on the decoder more than anyone, looking for the simplest and the most effective methods. Somewhere, I think on a Google research blog, I have seen a photo of Austin and others with a rig of FPGA boards emulating an error syndrome stream and executing a simplified decoder in real time, as a demonstration that yes, it works. But I couldn't find that among the absolute torrent of Austin's work.
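@skdh @earltcampbell For intuition about what a decoder does, here's the 3-qubit bit-flip repetition code (purely for illustration; the surface code replaces the table lookup with a hard matching problem). The job every cycle is the same: syndrome in, correction out:

```python
# syndrome -> which qubit to flip (None: no error detected)
DECODE = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}

def syndrome(bits):
    # the two parity checks Z0Z1 and Z1Z2 of the repetition code
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def correct(bits):
    flip = DECODE[syndrome(bits)]
    if flip is not None:
        bits[flip] ^= 1
    return bits
```

The real-time question is whether this loop (at surface-code complexity) can keep up with the syndrome stream, cycle after cycle.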
@skdh @earltcampbell More prosaically, the classical control hardware scales linearly, but currently still has a high per-qubit cost. But it has come down from a 19" rack full per qubit, to an FPGA or two in the last decade.
@skdh @earltcampbell Martinis's group (I think) even published elements of their FPGA design as open source, if I recall, but I couldn't dig that up, either.
@skdh @earltcampbell But while looking for that, I found this review by David Reilly on classical control of solid-state qubits that I hadn't seen before. What I'm after is probably buried in those references.
@skdh @earltcampbell Moreover, while the idea of multiplexing control of qubits on-chip is an automatic thought, it's hard because many semiconductor circuits don't work at the millikelvin temperatures the quantum devices prefer. But people are working that problem, as well.
@skdh @earltcampbell Sorry I'm not more help on this area, it's just outside my usual trodden path. Let's leave it here.
@skdh @earltcampbell So, c. building superpositions from classical data. Without some solution to this problem, quantum systems will be limited to problems with a small problem description but large computational space, such as chess/shogi/go or small molecule simulation.
@skdh @earltcampbell In particular, quantum computing isn't going to work well for image recognition or any machine learning tasks with dataset-based learning until we fix this (but see caveat a little later).
@skdh @earltcampbell We want to take a big pile of classical data, feed in a quantum superposition as an address, and receive as output a superposition of all of the data.
@skdh @earltcampbell The first description of the general idea I'm aware of is actually Fig. 6.9 in Mike & Ike.
@skdh @earltcampbell It doesn't seem to have a reference; I don't know if they originated the idea or not. But I used it in one of my early papers on improving Shor's algorithm performance.
@skdh @earltcampbell Giovannetti, Lloyd and Maccone took a step toward making this practical, and coined the term "QRAM".
@skdh @earltcampbell But there is still no proposed concrete implementation. I refer to this as the "billion dollar patent". Build it, and you're famous (and rich).
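@skdh @earltcampbell At the logical level, the operation we want is easy to state; the hard part is exactly the missing physical implementation. A toy dictionary-based sketch, with names of my own invention:

```python
def qram_query(address_amps, table):
    # sum_a alpha_a |a>|0>  ->  sum_a alpha_a |a>|D_a>
    # (state as a dict mapping basis labels to amplitudes)
    return {(a, table[a]): amp for a, amp in address_amps.items()}

# uniform superposition over four addresses, querying a 4-entry table
amps = {a: 0.5 for a in range(4)}
state = qram_query(amps, [7, 1, 4, 2])
```

Note the query is one "gate" here; in hardware it has to touch every memory cell coherently, which is the whole difficulty.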
@skdh @earltcampbell All right, that one was shorter than topics 1 & 2 from our list. Time for a break, then we'll take up #4.
@skdh @earltcampbell Let's pick up topic #4: noisy, intermediate-scale quantum technology, or NISQ. John Preskill, with his usual timeliness and insight, coined the term about two years ago, and people have latched on to it.
@skdh @earltcampbell It describes our current era, with tens of qubits and the fight to declare quantum supremacy. It's likely true that the first problems solved using quantum computers will be abstruse, of little direct value. We all know that; see Harrow & Montanaro.
@skdh @earltcampbell And we know that there are problems on the far horizon that large-scale, fault-tolerant, fast quantum computers will solve, famously factoring of large numbers but numerous others. See Montanaro for the most recent, thorough list.
@skdh @earltcampbell Along with Bacon & van Dam and the Quantum Algorithm Zoo mentioned earlier.
@skdh @earltcampbell And so the challenge in front of us is to *find the boundary*. Where is the minimal machine and software combination that *customers will pay money for* and use to solve their pressing problems?
@skdh @earltcampbell That sounds nakedly capitalist, and it is. Our goal here is to seed an entire industry, where the computational outcomes are used to make the world a better place, which means creating products delivered to the public.
@skdh @earltcampbell Governments have seeded entire industries before, famously the airlines via postal services, and the Internet and much of our modern IT industry (which lets me talk to you now).
@skdh @earltcampbell In the short run, this means finding hybrid quantum-classical algorithms, breaking down large problems into subproblems or iterate-able problems. This field is *wide open*.
@skdh @earltcampbell Alán Aspuru-Guzik's team has been working this problem for VQE, the variational quantum eigensolver.
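@skdh @earltcampbell The control-flow shape of VQE is worth seeing: a classical optimizer loops over evaluations of a parameterized circuit. A deliberately tiny sketch of mine, with a one-qubit ansatz Ry(theta)|0> whose energy <Z> = cos(theta) I've worked out by hand as a stand-in for the quantum processor; the parameter-shift gradient rule is real, the rest is toy:

```python
import math

def energy(theta):
    # <psi(theta)|Z|psi(theta)> for |psi> = Ry(theta)|0>; stands in
    # for the expectation value a quantum chip would estimate by sampling
    return math.cos(theta)

def vqe(theta=0.3, lr=0.4, steps=100):
    # classical outer loop; each gradient needs two "circuit" evaluations
    for _ in range(steps):
        # parameter-shift rule: exact gradient from shifted evaluations
        grad = 0.5 * (energy(theta + math.pi / 2)
                      - energy(theta - math.pi / 2))
        theta -= lr * grad
    return theta, energy(theta)
```

The quantum computer only ever does short circuits; the iteration, and most of the engineering, lives on the classical side.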
@skdh @earltcampbell Martin Suchara, Fred Chong, Graeme Smith and others are thinking about hybrid quantum-classical supercomputing systems, making concrete ideas that have been floating around.
@skdh @earltcampbell While Bravyi, Smith and Smolin quantitatively examined this tradeoff in one specific case.
@skdh @earltcampbell This is the caveat I mentioned above -- if you want to do big data on a quantum computer, can you break it down into bite-sized chunks where the QC can help?
@skdh @earltcampbell Also, there are weird mathematical tricks you can play with noisy outcomes to estimate what the outcome would have been in a perfect world, as the IBM team has shown.
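@skdh @earltcampbell The trick I have in mind is zero-noise extrapolation: deliberately run the circuit at a few amplified noise levels, then extrapolate the measured expectation value back to the zero-noise limit. A minimal linear-fit sketch, assuming the noise response is linear (real implementations use fancier models):

```python
def zero_noise_extrapolate(expectations, scales):
    # least-squares line through (scale, expectation), evaluated at
    # scale 0: the estimated zero-noise expectation value
    n = len(scales)
    sx = sum(scales)
    sy = sum(expectations)
    sxx = sum(s * s for s in scales)
    sxy = sum(s * e for s, e in zip(scales, expectations))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return (sy - slope * sx) / n

# pretend we ran a circuit at noise amplification 1x, 2x, 3x and
# measured these expectation values (made-up numbers):
estimate = zero_noise_extrapolate([-0.95, -0.90, -0.85], [1, 2, 3])
```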
@skdh @earltcampbell *This* is where the battle is being fought right now. And bringing things full circle...I don't think we have the programming tools, indeed, even the vocabulary to think about and describe these systems as they span the spectrum from NISQ to full scale.
@skdh @earltcampbell So the end of this epic tweetstorm is in sight. We'll come back tonight or tomorrow with one more round on programming, then do a review & wrap-up.
@skdh @earltcampbell Oh, back in the I/O section, I forgot to include recent work from Zhikuan Zhao of Joe Fitzsimons' group, on building superpositions from classical data. It's on my own stack to read!
@skdh @earltcampbell And one (near) final name that hasn't yet managed to pop up, but we would be remiss to ignore: Jake Taylor, now at the White House OSTP.
@skdh @earltcampbell We started this Homeric epic on the topic of programming quantum systems, and that's where we're going to end. It turns out to be of utmost relevance both to large-scale systems and, urgently, to NISQ. It's also critical for education.
@skdh @earltcampbell For the real systems available today, we have Qiskit, PyQuil, etc. Libraries and compilers for, essentially, assembly languages for quantum computers. These are awesome, and necessary, toolkits.
@skdh @earltcampbell Remember back when I was talking about architecture-aware compilation, and graph embedding? The idea is important in the abstract, and absolutely critical for making programs run on IBM's and Rigetti's current processors.
@skdh @earltcampbell Alwin Zulehner and Alexandru Paler, of Robert Wille's group specializing in reversible computing (which has both deep and practical connections to quantum), built a toolkit and tested it against real machines.
@skdh @earltcampbell One of the critical characteristics of those computers is that each qubit and each coupler is different. So, we need *error aware* compilation. My group started working on that topic a while back, focusing on daily-use arithmetic circuits rather than abstract benchmarking.
@skdh @earltcampbell We addressed specific aspects of the problem, including experimentally assessing our ability to _predict_ error rates on the IBM machines. We also focused on heuristics that will scale, and we hope provide good-enough answers.
@skdh @earltcampbell Unbeknownst to us, Margaret Martonosi's group was working on the same topic. They took a broader approach, and covered several machines and algorithms very thoroughly.
@skdh @earltcampbell Sadly for us, our paper was rejected, while theirs was (very deservedly) accepted to a top architecture conference. Ours is currently under review again. I think the two papers are complementary, and expect to see ours in print as well.
@skdh @earltcampbell Oh, and the need for compilation heuristics? This graph embedding problem is NP-complete. I had long assumed it was, but to my surprise, that apparently was never proven until last year.
@skdh @earltcampbell (Not being a theorist myself, that's a result I'm happy to read and apply; it's the kind of paper I'm unlikely to write.)
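@skdh @earltcampbell Stripped down, the decision problem underneath is: does the program's interaction graph map onto the hardware's coupling graph? A brute-force checker of my own (ignoring the SWAP insertion real compilers use to escape when the answer is no) makes the problem statement concrete:

```python
from itertools import permutations

def find_embedding(interaction, coupling, n):
    # map program qubits onto hardware qubits so every interacting pair
    # sits on a coupler; exhaustive search, fine only for toy sizes
    cset = {frozenset(e) for e in coupling}
    for perm in permutations(range(n)):
        if all(frozenset((perm[u], perm[v])) in cset
               for u, v in interaction):
            return perm
    return None
```

Exhaustive search is n! placements, which is why the heuristics matter.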
@skdh @earltcampbell And Matthew Amy, Dmitri Maslov, Michele Mosca and Martin Roetteler worked on optimizations as well, back in the intermediate period.
@skdh @earltcampbell All this is great. But I honestly don't think that the current generation of production tools are what we will use ten or twenty years from now. We need better theory of programming, and better tools for writing and debugging programs.
@skdh @earltcampbell Lloyd Hollenberg's group (from whence Simon and Austin came) is building a fantastic toolset, called QUI, intended to support intuition in programming.
@skdh @earltcampbell And you remember the Dagstuhl seminar I mentioned? Reports are available.
@skdh @earltcampbell Earl Campbell, who started me down this thread, has his hand in a whole slew of theory topics that advance our understanding of how to compute on top of error correction. He was there.
@skdh @earltcampbell So was Olivia Di Matteo, working on circuit synthesis and (to my pleasant surprise) error tolerance in QRAM.
@skdh @earltcampbell There were a *bunch* of folks there I admire, some with bold visions for what it means to program quantum systems. One such is Robert Rand, whose thesis is on formally verifying quantum programs.
@skdh @earltcampbell Another advanced concept is the Scalable ZX calculus, from Dominic Horsman's group and orbit. It supports rewrite-based optimization that is provably correct.
@skdh @earltcampbell Earlier this year, an experimentalist I *deeply* admire told me that it's way too early to be working on high-level programming languages and tools, and that low-level optimization will be required for the foreseeable future.
@skdh @earltcampbell I couldn't possibly disagree more. Building these systems takes *insight*, and none of that comes for free; it takes *time* and *work*.
@skdh @earltcampbell Moreover, I think we need to build tools that will support the *intuition* of the #QuantumNative students we are educating today.
@skdh @earltcampbell And, we need software engineering tools that will scale to large programs, and ways of *debugging* quantum software. I think these tools will be entirely new.
@skdh @earltcampbell So, building scalable systems is an end-to-end problem, requiring many skills. "It takes a village," someone said in a different context. Come join us, we can use the help!
@skdh @earltcampbell Okay, let me end there. I don't know about you, but I'm pretty much exhausted at this point. I'll come back tomorrow with a recap and some final thoughts. Good night!
@skdh @earltcampbell Welcome to the final day of your master class in #QuantumArchitecture!
There is no final exam, but if you're a student, I just wrote your bibliography for you.
@skdh @earltcampbell Author's privilege: I've been saving for the end a plug for the Ph.D. thesis of my student Shota Nagayama, whose large-scale architecture includes a different approach to dealing with defective qubits,
@skdh @earltcampbell and detailed mechanisms for architectures that depend on different error correcting codes for different tasks.
@skdh @earltcampbell So let's go back and review some of the principles of quantum computer architectures. Most of these have classical precedents.
@skdh @earltcampbell A. basic gate-level parallelism. When I joined the field, it was widely but not yet universally understood that we *have* to have this for QEC to work. I think it's now universally known.
@skdh @earltcampbell B. architecture-aware compilation. Graph theory-based techniques are useful here, and there are echoes of classical dataflow architectures and dependency graphs. The abstraction underlying this is still poorly understood by many.
@skdh @earltcampbell (Dataflow architectures in their purest form have passed on, but the principle is deep and fundamental in both hardware and software.)
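@skdh @earltcampbell As a tiny illustration of the dependency-graph view, here's my own toy sketch: greedy as-soon-as-possible layering of a gate list, where gates touching disjoint qubits land in the same time step:

```python
def asap_layers(gates):
    # gates: list of tuples of qubit indices, in program order;
    # returns layers of gates that can execute simultaneously
    free = {}       # earliest layer in which each qubit is available
    layers = []
    for g in gates:
        t = max((free.get(q, 0) for q in g), default=0)
        if t == len(layers):
            layers.append([])
        layers[t].append(g)
        for q in g:
            free[q] = t + 1
    return layers
```

The number of layers is the circuit depth, which (against the coherence time) is often the resource you care about most.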
@skdh @earltcampbell C. application-level parallelism and resource tradeoffs. Scheduling of computations and the application of Amdahl's Law are practically subfields in computer architecture.
@skdh @earltcampbell D. scalability requires distribution. This idea led us to multicomputer architectures -- independent units must have autonomy while sharing data in large computations.
@skdh @earltcampbell E. constant factors matter. I lost track of the number of people in my early days who stared at me uncomprehendingly and replied, "But it's exponentially faster." This misunderstanding in particular led to non-viable designs.
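@skdh @earltcampbell Here's the kind of constant-factor arithmetic I mean, using the commonly quoted surface-code rule of thumb p_L ≈ 0.1(p/p_th)^((d+1)/2) and roughly 2d² physical qubits per logical qubit. The numbers below are illustrative assumptions, not a claim about any specific machine:

```python
def distance_needed(p_phys, p_thresh, p_target):
    # smallest odd code distance d meeting the target logical error
    # rate under the rule of thumb p_L ~ 0.1 * (p/p_th)**((d+1)/2)
    d = 3
    while 0.1 * (p_phys / p_thresh) ** ((d + 1) / 2) > p_target:
        d += 2
    return d

# illustrative numbers: physical error 1e-3, threshold ~1e-2,
# target 5e-13 logical error per round
d = distance_needed(1e-3, 1e-2, 5e-13)
physical_per_logical = 2 * d * d   # rough surface-code footprint
```

A thousand-odd physical qubits per logical qubit is the constant factor that turns "a few thousand logical qubits" into millions of physical devices.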
@skdh @earltcampbell One thing I like to say is that the *principles* are the same, but the technological differences mean that the *answers* look very different.
@skdh @earltcampbell Overall, there is much for the physicists to learn from the computer engineers, and vice versa. It would be a help for the physicists to take undergrad and graduate computer architecture classes.
@skdh @earltcampbell So, are we there yet? Have the vision, math and technology come together? In my 2006 thesis, I commented that "We are, in effect, in the time of Babbage asking what Knuth, Lampson and Torvalds will do with the machines we build," but maybe flight is a better analogy.
@skdh @earltcampbell Are our architectures above like da Vinci's flying machines, where the vision was there but the math and technology weren't yet? Or Samuel P. Langley, with so much right about his organizational approach, which we might call Open Science today, but the wrong technology?
@skdh @earltcampbell I prefer to think we're in the time of the Wright brothers, going back and examining everything we're told, applying a scientific and engineering mindset, solving *all* the problems, and putting the first ones in the air.
@skdh @earltcampbell (Their legacy afterwards was more problematic, but they certainly set off the revolution.)
@skdh @earltcampbell All right, by now either you're on board with the overall message here, or you're not. But to find out if I'm way off base, just now I conducted a sanity check.
@skdh @earltcampbell I went to duckduckgo.com, where my searches should be anonymous, and searched for "scalable quantum computer architecture".
@skdh @earltcampbell The first few pages will find you the Microsoft Svore group, the Berkeley Kubi group, our "Blueprint" paper, some things from the Monroe-Kim-Brown orbit I didn't cite above, Michelle Simmons' team, some additional work from Misha Lukin's team, the football field paper.
@skdh @earltcampbell Opening an incognito window and going to Google Scholar, I got Kielpinski, Monroe, Nemoto, Oskin, Copsey, Yao-Taylor-Jiang-Lukin all in the first page.
@skdh @earltcampbell A couple of things I failed to cite, but I'd say that more or less validates the overall thrust here. (After about 150 tweets, that's a relief.)
@skdh @earltcampbell About a third of the people I've cited here are on Twitter, but I deliberately didn't tag them in. You can find them if you try. And of course you are _all_ welcome to extend or refute what I've written!
@skdh @earltcampbell And with that, we come to the end. Thank all y'all for sticking with me. I hope you found something useful here, and that creating this thread was worth my time!
@skdh @earltcampbell The End

(swirling Erich Wolfgang Korngold music as the credits roll and the most un-Errol Flynn-like person you know rides off into the sunset)
@skdh @earltcampbell Coda: okay, okay, I know I threw a complete bus, driver and all, at you. By my count, I've included about 97 references, and I know not all of you are going to read that many. So, if you must have the tl;dr:
@skdh @earltcampbell tl;dr:
If you want to understand the state of affairs in quantum computing, you could do much worse than these six:
1. Ladd et al., "Quantum Computers"
2. Van Meter & Horsman, "Blueprint"
3. Devitt et al., "Beginners"
4. Montanaro, "Algorithms"
5. Preskill, "NISQ"
6. Harrow, "Supremacy"
@skdh @earltcampbell (yes, three of those are by me and/or my friends & collaborators, and the first one desperately needs an update, but so it goes.)
@skdh @earltcampbell Postscript (yes, both a coda and a postscript! Two for the price of one!)
@skdh @earltcampbell It seems to be customary to follow up popular postings with a "feed me" button of some sort. I'm not an independent creator, so I don't need money directly. What I want from you -- *especially* if you're building systems --
@skdh @earltcampbell is to *read* the lit, *use* the lit, *cite* the lit. And, if you're in a position to do so, invite the people listed here to come give talks at your institutions, and maybe even collaborate!
@skdh @earltcampbell That is all.
@skdh @earltcampbell @threadreaderapp unroll. Onegai shimasu (please).
@skdh @earltcampbell Good grief, I forgot the Chong-Franklin-Martonosi survey on programming languages. That should definitely be on this minimal reading list, and should have been brought up earlier.
Thread by Rod Van Meter