Thinking about this tweetstorm, one of the issues I’ve run into as an engineering leader is what to call the software engineering stuff that’s “agile” given that the Agile Community(tm) has killed the brand.
And by & large, I’ve taken to calling it “DevOps”, because the DevOps community have taken up much of the mantle @KentBeck & the XP community started with. & Kent has independently focused on safe small changes deployed to production. Which is DevOps.
3/
Much of the art here is making changes safe enough to deploy to production continuously. And to do that, we need to design incrementally, test obsessively, take architecture seriously so we decompose dependencies. & we need to automate everything & do it all the time.
4/
It turns out that this is what Kent, @RonJeffries, @GeePawHill & many other folks have been nattering on about & being broadly misunderstood. @KentBeck has some brilliant essays (scattered across FB & his site, alas) & @GeePawHill has amazing twitter threads on the topic.
5/
When you look at *what it takes* to hit the DORA measures that @nicolefv & team write about in Accelerate (the inputs behind those output metrics), it comes down to making small changes safe.
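A minimal sketch, purely for illustration, of what those four DORA output metrics look like when computed from deployment records; the record fields (commit_time, deploy_time, failed, restore_minutes) and the 30-day window are assumptions, not anything from the thread or from Accelerate itself:

```python
from dataclasses import dataclass
from datetime import datetime
from statistics import median

# Hypothetical deployment record; swap in whatever your delivery tooling actually emits.
@dataclass
class Deploy:
    commit_time: datetime       # when the change was committed
    deploy_time: datetime       # when it reached production
    failed: bool                # did it cause a production incident?
    restore_minutes: float = 0  # time to restore service if it failed

def dora_metrics(deploys: list[Deploy], window_days: int = 30) -> dict:
    """Compute the four DORA measures over a reporting window."""
    lead_hours = [(d.deploy_time - d.commit_time).total_seconds() / 3600
                  for d in deploys]
    failures = [d for d in deploys if d.failed]
    return {
        "deploys_per_day": len(deploys) / window_days,
        "median_lead_time_hours": median(lead_hours),
        "change_failure_rate": len(failures) / len(deploys),
        "median_time_to_restore_minutes":
            median(d.restore_minutes for d in failures) if failures else 0.0,
    }
```

Small, safe changes move every one of these numbers in the right direction: more deploys, shorter lead times, fewer failures, faster recovery.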
6/
As an engineering leader, I provide training, tools, mentorship, leadership development, vision, etc. to help people learn the skills needed to achieve those output metrics. And most of those skills are what @GeePawHill might call the skills of making.
7/
Unfortunately many of those skills are deeply counterintuitive & much of the work is as much unlearning as learning. For example, there’s an implicit definition of work as writing new code, or even writing code.
8/
Because that’s what engineers love to do, and because there are emotional and sometimes financial incentives to build customer-visible functionality, we sometimes need to overcorrect by focusing on the tools of making.
9/
Providing visibility & reward for the people who build the CI/CD tooling, or build a deployment pipeline that automates acceptance testing, or figure out how to do AppMesh with Terraform as a module, or automate linters & code coverage tools in the pipelines.
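As one concrete example of the kind of pipeline automation worth rewarding, here is a minimal sketch of a quality gate that runs a linter and the test suite with a coverage threshold. The tool choices (ruff, pytest with the pytest-cov plugin) and the 80% bar are illustrative assumptions, not a prescription from the thread:

```python
import subprocess
import sys

def run(cmd: list[str]) -> int:
    """Run one check, echoing the command, and return its exit code."""
    print("$", " ".join(cmd))
    return subprocess.call(cmd)

def main() -> int:
    checks = [
        ["ruff", "check", "."],                        # lint
        ["pytest", "--cov=.", "--cov-fail-under=80"],  # tests + coverage gate
    ]
    for cmd in checks:
        if run(cmd) != 0:
            print(f"quality gate failed at: {cmd[0]}", file=sys.stderr)
            return 1
    print("quality gate passed")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```

Wired into a CI job, a gate like this makes the “tools of making” work visible: the pipeline, not an individual reviewer, enforces the quality bar on every small change.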
10/
Great teams end up spending most of their time building user-facing functionality because they build the tools of making and sweat automation, IoT, design, architecture, code quality & test automation. Less successful teams try to write lots of code & get stuck.
11/
Accelerating teams towards that point where they’ve incorporated the habits of investing in the tools of making & designing architectures that decompose change into very small safe increments is a key area of software engineering management.
12/
Rallying these changes under the flag of DevOps has been the most successful way I’ve seen to describe these habits.
END
If I were public health interoperability czar with the power to pull the Levers Of Power
A 🧵
2/
ELR:
- Unify FDA, CLIA & CMS powers to tie payment & lab certification to reporting, work w/ CDC to create a registry of reportable tests & include public health reporting in analyte machine test scenarios
- Require reporting of demographics & contact info as COP
3/
- Create a certification fast lane to the LOI/LRI specifications & create test methods that go all the way to lab reporting
- Create a long-term funding mechanism for public health, create certification criteria & tie grants to certification & prod deployment
NCVHS was created forever ago, but its mission was updated w/ HIPAA to advise the secretary on, among other things, the administrative transactions created for HIPAA (& as a reminder, the privacy & security stuff in HIPAA was a sideline back in the day…
3/
b/c HIPAA was about insurance portability & administrative efficiency, & health data privacy came along for the ride).
HITAC, defined in Cures, was the update to HITSC/HITPC, which were created by HITECH. Phew.
Anyway, 2 FACAs advising on standards tells you a lot.
Speakers in the public health community explicitly note that the Promoting Interoperability (née MU) program established the baseline for syndromic surveillance & put many of the feeds in place.
A tweetstorm that will make some people mad. I apologize in advance.
There are two perspectives on the IHE SOAP-based specifications (primarily XDS, XCA and XCPD), not mutually exclusive:
2) a) They have worked in practice, so should not be ripped and replaced
b) They are legacy technology and we should work on more modern alternatives.
A 3rd perspective, enshrined in #TEFCA #QTF, is:
c) We should continue to make them the core of our health infrastructure
3)
The word "legacy" makes people mad -- my perspective is that technology moves on regardless of whether we like it or not. Some standards, particularly low-level networking standards, have stood the test of time, some have... not
To start with, a basic question: are market forces for interoperability good, or nah?
2/
There are three basic positions that I hear articulated:
a) "making money on interoperability is immoral"
b) "making 'unreasonable' profit on interoperability is immoral"
c) "extracting monopoly rents as an interoperability gatekeeper is immoral"
3/
These are obviously *very* different stances, and the inability to articulate and distinguish them causes real problems. I often hear a silent assumption that all fees are monopoly rents and ipso facto immoral.