Lloyd W
Jul 1, 2021 · 173 tweets
Today's my seven-year BBC-versary, and also my last: 25 more working days until I leave.

Because these can never possibly go wrong, one like => one anecdote/interesting thing from my time here (well, there, not my desk in my flat). I only have about five prepared...
1/ I was once summoned at around midnight - a live stream had failed because an operator, investigating a silent audio alarm, had taken the stream down entirely by mistake.

It was a Springwatch stream of a rabbit burrow. There was silence because the rabbit had left the burrow.
2/ The number of times an MP has written to escalate a complaint and I've had to be involved in drafting the response: 2
3/ Eagle-eyed viewers once saw me in the background of BBC News at One, gracefully tripping on the spiral stairs.
4/ The iPlayer Windows Phone 8 app would only compile in a VM inside another VM on my desktop at home.
5/ Many devices and orgs around the world use loading bbc.co.uk (often HTTP-with-no-S) as an automated test for a working connection. A more efficient alternative would be to use the network status API: bbc.co.uk/emp/network_st…
6/ All three of these (on the same day) were my fault. [image: three ticket summaries]
7/ This short story starring @g_bonfiglio:
(yes, my Slack photo is from before I started at the BBC, but tbf my Twitter one is even older) [image: Slack screenshot]
8/ Whilst Pudsey is shy so doesn't speak in public, he uses a series of secret hand (paw?) signals to communicate needs to his human minder, and the human has secret code phrases too.
9/ The error you get if things with the BBC's web browser media player go really wrong (emp.bbc.co.uk/ohno) includes a random hidden Capaldi-era Doctor Who quote in the bottom left.
10/ Media outlet @TheRegister once referred to me as a "techie minion": theregister.com/2015/09/30/bbc…

I like to think in the intervening time I've at least moved into a techie lieutenant role.
11/ Relatedly, the best koala is absolutely the one at bbc.co.uk/html5, end of discussion.
12.1/ I was once asked to fill a gap in the background of The One Show for thirty seconds as I walked past on my way home.
12.2/ I still don't know why Alex Jones then appeared in a swimsuit, got into a bathtub of seaweed and was offered a platter of hamburgers by Chris Evans. She declined, they went to VT and I was thanked and sent on my way.
13/ I once arrived in the office in the morning and was asked what I did to fix a stream being broken overnight.

It turns out I was called at 5am when I wasn't on call, so was sleeping off some alcohol, because nobody else could figure out what to do. I have no memory of this.
14/ The BBC has a unique definition of "in the UK", excluding some things and including places everything else online considers "rest of world", some overseas schools, and, depending on who's up there, space.
15/ Miss the old BBC Fireclown error page (it's now Clangers looking into a hole)? Get it on a t-shirt, mug or phone case at redbubble.com/i/t-shirt/beeb….

Oooh, and face coverings and jigsaws. I may be about to spend some money...
16/ Oh, and the fireclown was replaced because it scared children, which is fair enough.
17/ Somewhere in the basements of Old Broadcasting House, there was a jukebox of 7 inch vinyl records. It included an Oxford/Cambridge boat race where the commentator lost track of which boat was which, the Home Service announcement of WW2, and iirc some Innuendo Bingo.
18/ Last time I tried to show somebody the jukebox, it dropped a record. I tried to report this in a few places but the responses were all that nobody else knew it existed but wanted a go once it was fixed. I haven't seen it since so am starting to wonder if I imagined it.
19/ A Freedom of Information Act request was once submitted for all of the BBC's custom Slack reactions. We decided that, due to consent/rights over the potential images, we could not share the reactions themselves, but could share their names.

So: :shy_thuglife: exists, and you will never know why.
20/ On the responsive web media player, you can press '888' to turn on subtitles.
21/ There is a BBC service called Syncopaticaption. But it's not the machine learning transcription service. Those components are named sensibly.
22.1/ During a major upgrade, it was found the reason one step worked in Scotland but not Wales/NI was different firmware versions. The box in question had a special cable needed for upgrades that the nations couldn't find.
22.2/ During the lunch break engineers used photos of the cable from Scotland to make their own replacements. The upgrade completed successfully.
23/ The first BBC blog post published under my name was not written by me. I was on holiday in New York. My BBC blogs photo is still a holiday photo, though you can't tell.
24/ BBC staff get a blue lanyard. There was also red (now discontinued), green (first aid), yellow (fire warden) and pink (iPlayer). I made it my mission to get a full rainbow, but gave up after I got a yellow misprint and lost it. A rainbow lanyard later appeared as a one-off.
c/w sex
25/ In the IRC days (if you don't know, IRC is like Slack in that it's a chat service, but like Twitter in that there's no edit button), I once got 'c' and 's' confused when telling the NOC that I can't help with their "failed disk problem". I was *technically* correct.
26/ You can link to a specific time in iPlayer using #tXmYs, e.g. bbc.co.uk/iplayer/episod…
27/ Part of Television Centre, which BBC Public Service sold, is now office space leased by BBC Studios. It has a cafe with a Dalek in it.

It's called the Dalekatessen.
28/ There's an engineering room in one building that took me four attempts over three days to find for the first time, despite having a map with it on. I eventually found a contractor drinking tea who knew. It's on -2, but accessed from +3.
28.2/ It was on one of these searches that I found the jukebox from 17.
29/ I've worked in two departments at the BBC: (FM(&T)?|Digital|Design [&+] Engineering) Platform Media Services,
and (Design [&+] Engineering Infrastructure, Services, Operations and Commercial \(👁🧦\) Online Technology Group)|(Technology Group Digital Distribution)
30/ One of the live streaming control interfaces is called Marvin because "Live? Don't talk to me about live."

At its peak, it had 42 hardware distribution encoders configured in it.
31/ I once managed to get trapped in a corridor between a fire exit and an on air radio studio.
32/ There was once a major incident over a missing asset for BBC One, due for broadcast in an hour or so: the server that received a file from a server, only to transfer it back to the same server, had failed.
33/ The BBC day starts at 06:00 and runs until 30:00.
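If you haven't met broadcast days before: the hours just keep counting past midnight, so 27:30 on Thursday's schedule is 03:30 on Friday morning. A minimal sketch of the conversion (the function is my own, not a real BBC tool):

```python
from datetime import date, datetime, timedelta

def broadcast_time_to_datetime(broadcast_day: date, hhmm: str) -> datetime:
    """Convert a time on a 06:00-30:00 broadcast day into a real wall-clock datetime."""
    hours, minutes = (int(part) for part in hhmm.split(":"))
    midnight = datetime.combine(broadcast_day, datetime.min.time())
    return midnight + timedelta(hours=hours, minutes=minutes)

# "27:30" on the 2021-07-01 broadcast day is actually 03:30 the next morning.
print(broadcast_time_to_datetime(date(2021, 7, 1), "27:30"))  # 2021-07-02 03:30:00
```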
34.1/ CBeebies and BBC Four time share - using the same bit of bandwidth for non-online distribution, since only one is on air at a time. A side effect is that online has a single video input for both, using a cron job on the encoders to put up a static slate at cutover times.
34.2/ For years, every daylight savings switch, we'd forget to move the time of the cron job, and you'd get an hour of BBC Four on CBeebies or similar. The same with CBBC/Three when Three was first a thing. The encoders now follow local time instead of UTC.
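The gotcha in #34.2, sketched (my illustration of the general problem, not the actual encoder cron): a job pinned to a fixed UTC time lands an hour away from a UK-local cutover for half the year.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

UK = ZoneInfo("Europe/London")

def utc_hour_of_local_cutover(year: int, month: int, day: int, local_hour: int = 19) -> int:
    """Return the UTC hour at which a 19:00 UK-local channel cutover actually happens."""
    cutover = datetime(year, month, day, local_hour, tzinfo=UK)
    return cutover.astimezone(ZoneInfo("UTC")).hour

print(utc_hour_of_local_cutover(2021, 1, 15))  # 19 - in GMT, UTC and local agree
print(utc_hour_of_local_cutover(2021, 7, 15))  # 18 - in BST, a cron fixed at 19:00 UTC fires an hour adrift
```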
35/ Similar to #23, the first BBC blog post I drafted was not published under my name.
36/ A contact once made it through the filters of things I see; it was from a person who had become my new celebrity crush about two days earlier, when I started watching a show they starred in.

They were writing to complain that we'd credited a completely different actor.
37/ There is not a single correct answer to the question "How many radio stations does the BBC have?". The most correct answer is "More than you think". It's possibly somewhere between 100 and 125; it depends on how you count and what time of day it is, really.
38/ Several radio services were once taken off air because a fire alarm got so wet it decided there must be a fire. Radio 3 did this: soundcloud.com/lloyd-wallis/b…
39/ I once, in a work conversation, said the phrase "Maybe it's nothing to do with a nation state at all".
40/ There was a period of a couple of years where bbc.co.uk/archive was, itself, archived: web.archive.org/web/2018090207…
41/ One of our service providers had a special escalation noted in support tickets raised by one of our accounts which essentially says "If a ticket saying there's a network problem is raised here, there usually *actually* is, so please escalate immediately."
42/ The longest-running high priority (i.e. people are called out, it can't wait until morning) incident I ever handled was because a hard drive broke. At Sky. As any enterprise-grade storage array tends to do, the controllers then crashed and lost all data.
42.2/ If you think you saw an earlier version of this tweet where I typed "enterprise-gay" instead of "enterprise-grade", no you didn't.
43/ Charlie Sloth once walked up to me (and @andydurant I think) and asked if we thought he looked like a smurf. He was wearing a blue sweatsuit. We were too polite(?) to admit he did, except for being too tall.
44/ Times I have been asked if it's possible to delete something from the Internet: 3
45.1/ The OG Britbox, in US/Canada/Australia and possibly others I've forgotten, is an SVOD service where the streaming stack is the same one that powers iPlayer (and therefore my problem).

Britbox, in the UK, is a completely unrelated SVOD service with its own stack.
45.2/ Neither are related to BBC Player, an SVOD service in Singapore and Malaysia.
46/ One of the most surreal meetings I had was being in a conference call with YouTube's artist relations and Ariana Grande's social media people, the former walking the latter through clicking the "Go live" button and emailing me the stream key.
47/ This underused David Attenborough clip: media-availability.tools.bbc.co.uk/smp?mediaId=p0…
48/ The Broadcast Centre lifts would regularly announce the fourth floor as the fourth basement. Sometimes it was otherwise fine; sometimes the doors then didn't open, and after a while the lift went to another random floor and deposited you there.
49/ The fire training for equipment rooms on one campus can be summarised as:
There are four distinct alarms, each with their own sound and meaning; if you hear any of them, get out. There's a fifth sound that means the next one is a test. If you hear that, get out anyway to be sure.
50/ In one of my collections of incident reports, there's an equal number where a causal factor is automation (put in place to stop other automation going crazy) stopping the automation from doing something it did need to do, and ones where a factor is a drop in water pressure.
51/ This is probably my favourite world service language story: bbc.com/pidgin/world-4…
52/ There's a service called Catapuss. It's essentially a slightly more complex version of the 'cat' command, which is used to join (concatenate) one or more files together, in this case usually audio/video transport streams.
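In case the Unix reference doesn't land: the whole job really is that shape. A toy Python equivalent (emphatically not the real Catapuss):

```python
import sys

def concatenate(output_path: str, input_paths: list[str], chunk_size: int = 1 << 20) -> None:
    """Join files end-to-end, like `cat seg1.ts seg2.ts > out.ts`, without loading them whole."""
    with open(output_path, "wb") as output:
        for path in input_paths:
            with open(path, "rb") as segment:
                while chunk := segment.read(chunk_size):
                    output.write(chunk)

if __name__ == "__main__":
    # usage: python catapuss_toy.py out.ts seg1.ts seg2.ts ...
    concatenate(sys.argv[1], sys.argv[2:])
```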
53/ When we launched the BBC's responsible disclosure policy (bbc.com/backstage/secu…), as we can't offer real money, the original reward was a t-shirt with a Dalek on it exclaiming "I exterminated a bug on the BBC website". I *think* the design's changed since but am not certain.
54/ The web browser media player has been translated into forty-six languages (there's not much visible text, but lots for screen readers), including Welsh, Amharic, Pidgin and Pirate. [images: media player screenshots, one in an error state]
55/ Because of #45, I was once woken up at midnight by a call asking where this day's episode of the ITV show Good Morning Britain was.
56.1/ There are two politicians with last names that when spoken sound the same, but are spelt differently. Subtitles once incorrectly attributed a statement to the wrong one, so it was decided to do a spoken on air apology.
...
...
luckily the obvious upcoming gaffe was caught
56.2/ and conversations were had on how to make sure the subtitles got it right the second time around. The BBC ended up providing pre-scripted text to Red Bee (who do our subtitling) for that specific news segment to prevent further embarrassment.
56.3/ Yes, I've been vague, and yes, it's because I can't figure out which way around they were.
57/ That Christmas BBC One ident of a cat on a roomba-other-robotic-vacuum-cleaners-are-available has a ten hour edit:
58/ For a few years, I'd do late shifts editing photos from festivals (you could volunteer to spend some time in Radio 1's cool kids bit of the office to do this, and I was cheapest to taxi home after the tube stopped), including the Coldplay one at the top of bbc.in/3wdcW8Q
59/ When testing live streams, if I don't need a specific source, I will usually use CBeebies as I find it more mentally stimulating and less repetitive than daytime One/Two (but our test env channels are limited to One Northern Ireland and Alba).
60/ Because of #59, I was very happy when S4C aired a Welsh language version of Sarah and Duck, Sara a Kwak. Unfortunately it's not available on iPlayer now, but you *can* get SpynjBob Pantsgwâr: bbc.co.uk/iplayer/episod…
61/ Sticking to a children's show theme, #52 introduced catapuss. There is another service called Bagpuss, because it also `cat`s files together, but usually older, bigger (fatter), though rarely furry files.
62/ Because Trust Is The Foundation Of The BBC, We Are Independent, Impartial And Honest, many tools/services at least provide read access by default unless there's a good reason not to.

I know of one case where a show was reversioned because a JIRA issue was read out verbatim.
63/ During the same migration project as #22, there was panic on the 50-engineer conference call as all outputs for a channel went black. Luckily it turned out someone was watching properly, and noted the show was following a train that had just gone into a tunnel.
64/ Number of times I've seen an email from a producer demanding that whoever screwed up that badly (me) needs to be fired: 1. To be fair, the last thing I did before I closed my laptop on Friday started systematically deleting a whole station's content from BBC Sounds overnight.
65/ This phone menu that is quite possibly the Most BBC Thing, and definitely tells you all you need to know about supporting internal user services at the BBC: soundcloud.com/lloyd-wallis/b…
66/ As part of the fallout from the Sounds issue in #64, there was a discussion about adding an "are you sure" to the button I clicked, or otherwise explicitly calling out what it would do. It was decided it wasn't worth it, that nobody would ever be that silly again.
67/ Around two weeks after #64, and the conversation in #66, someone else did the same thing to Radio 4.
68/ Most online streams have a Digital On-Screen Graphic of the BBC blocks placed in the top left. This can sometimes conflict with another Digital On-Screen Graphic, especially during sports coverage. This phenomenon is known as "double DOGing".
69/ The 100% factually accurate documentary, W1A, narrated by renowned nature documentary... narrator David Tennant was, of course, partly filmed in New Broadcasting House at W1A. Unfortunately, the producers didn't like the look of the sixth floor, but also wanted one. And so: [image: a large Number Six painted on…]
70/ The "BBC Transparency" signs in Series 3 of the W1A documentary had production staff dedicated to making sure you didn't walk into them (which I certainly needed!), and a big X of hazard tape was put over them between takes. Image
71/ #69/70 happened whilst we had candidates in for job interviews, who were finding it very difficult to navigate the area, and were probably left with inaccurate experiences of how well organised the building is.
72/ Friends used to ask how accurate W1A is. I have to admit that I later realised my default response was completely untrue - "Well Will goes to get coffee from the ground floor, and that doesn't make any sense." There's the Media Café and the Unbranded Costa to choose from on 0!
73/ (Last W1A one, I promise) During filming, a coworker was berated for sitting at a hot desk with headphones in and actually working, instead of reacting to the filming around him, and was asked to pay attention or move.
74/ Continuing the upgrade project started back in #22, after the train tunnel incident of #63, the call was advised whenever upcoming content made use of 'artistic black', and touching things during those parts was avoided.
75/ Demonstrating my second-to-none music knowledge, during one festival (#58), I submitted a collection of Mumford & Sons photos. None of the images contained any members of Mumford & Sons.
76.1/ The first BBC UHD TV series was physically couriered, encrypted, to R&D, who then encoded and packaged it. I then walked over to their building and back with the hard drive to upload it, with a reasonable amount of effort going into tracking chain-of-custody for leaks.
76.2/ I got back to my desk, and discovered I'd been handed a BitLocker encrypted drive.

The only Windows device I could find had no wired network port and only 802.11g, so that next step then painfully happened over a temporary 54Mbps Wifi connection.
77/ A corrective actions report we once received from a vendor included the sentence "This will decrease the frequency of this state by a factor of 16 million"

To be fair, I haven't noticed it again, so maybe that's enough.
78.1/ For a Taylor Swift set (see #58), we were told to wait until images moved into an 'approved' or 'rejected' folder so agents could review the photos. Eventually one photo moved into 'approved'; it looked odd, but it was approved and we really wanted a photo, so we got to it.
78.2/ Meanwhile, all the other photos had gone into the 'rejected' folder. Just as publish was hit on the one photo, everything in 'rejected' moved to 'approved' and the 'approved' image moved to rejected.

This was one of the three in #44.
c/w scenes of a sexual nature

79.1/ On my first day at the BBC, there was a quick welcome drink with the team before they all went off to see something they'd booked tickets for months ago. Around half an hour after they left, they discovered there'd been a no-show and invited me.
79.2/ Two days into living in London, I was told to travel to an old Royal Mail sorting office near Paddington, and meet someone from the team who'd actually had the day off and so I hadn't met yet. So we're already at about 300% disoriented.
79.3/ It turned out to be some sort of theatre thing, The Drowned Man. We were given shots and face masks, then I was shoved out of a lift and immediately lost track of who else came out that I knew. I also really, *really* needed the toilet.
79.4/ The scene that opened the toilets hadn't happened yet, I was told by staff, so I was led 'backstage', but then *not* escorted back to where anything was happening. I spent the next 40 minutes wandering around the various sets in the building, not encountering anyone else.
79.5/ I eventually entered a dim room with a checkerboard floor just as a corpse suddenly appeared. A few minutes later the room flooded with hundreds of people, and all of the ones not wearing masks started having a sex party. I still had no context other than it was a play.
79.6/ This is not representative of life at the BBC, contrary to popular tabloid media presentation, and I can confirm the first day was the only day I experienced such a thing.
It was a great show (loops three times so I saw some after that), and I'm sad I never went again.
80/ The responsive web media player supports two casting modes: Chrome, and Cat. [image: "Select casting device" menu]
81/ There is a service called Kholkikos, as all DRM-related services are named after dragons, because "here be dragons". This is a controversial name because 1) there was a dispute as to whether Kholkikos is a dog, not a dragon; and 2) you try typing Kholkikos at 4am.
82/ There is not a DRM service called Drogon, because the name was used to register a Lambda Service instead of an EC2 one, and our internal tooling means Drogon's gone, never to be seen again. Viserion and Rhaegal do exist.
83/ There is not a DRM service called Toothless and this regularly makes me very, very sad. But not sad enough to convince people to come up with a reason to have more DRM.
84/ A good indicator of how long someone has been at the BBC is whether working here hasn't been the same since BC bar, TVC bar, Kingswood Warren, or Bush House closed.

They all closed before I joined.
85.1/ To change the configuration for what bitrates, CDNs, and other tidbits will be used for live streaming clients, you first edit a directory of XML configuration files. These are then run through a service called Manifesto to generate YAML files.
85.2/ Manifesto used to run as an AWS Lambda, but the config got too big and started running out of file descriptors, so we moved to running it locally. Next, you convert all the YAML files into CSVs, which you then cat, sort, and dedupe (not with Catapuss). With me so far? Good.
85.3/ You then diff the past and new, to see if the output looks vaguely like what you wanted. That CSV file is then catted to *another* CSV file, where things Manifesto doesn't understand are manually configured. The CSV file is then converted back to YAML.
85.4/ You then run a script, passing that YAML file as input, which spins up a local version of the API service itself, instantiating the perl representations of all this configuration. The perl objects are then serialised into JSON and output to another file.
85.5/ This JSON file, which at its peak was 130MB, is then committed to version control, and a signal sent to start a rolling redeploy of the service to load the new configuration files.
This is mostly the result of around a decade of abstractions on abstractions.
85.6/ Oh and don't worry - if you commit the files and forget the deploy step, you'll find out if they worked when the service reboots automatically at 3am. Or, well, stops at least.
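For a feel of the middle of that chain, the CSV wrangling in 85.2/85.3 is roughly this shape (a toy sketch with made-up directory names, nothing like the real tooling):

```python
import csv
from pathlib import Path

def cat_sort_dedupe(csv_paths: list[Path]) -> list[tuple[str, ...]]:
    """Concatenate rows from several CSV files, then sort and de-duplicate them."""
    rows: set[tuple[str, ...]] = set()
    for path in csv_paths:
        with path.open(newline="") as handle:
            rows.update(tuple(row) for row in csv.reader(handle))
    return sorted(rows)

def diff(old_rows: list[tuple[str, ...]], new_rows: list[tuple[str, ...]]) -> None:
    """Print which config rows would be added or removed, as a pre-deploy sanity check."""
    for row in sorted(set(new_rows) - set(old_rows)):
        print("+", ",".join(row))
    for row in sorted(set(old_rows) - set(new_rows)):
        print("-", ",".join(row))

# Directory names are illustrative only.
diff(cat_sort_dedupe(sorted(Path("previous").glob("*.csv"))),
     cat_sort_dedupe(sorted(Path("generated").glob("*.csv"))))
```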
86/ I once as part of an escape room fed a co-worker to a zombie so the rest of us could live.
c/w language

87/ Number of times I have said the f-word one or more times in a Slack message: 569^W 570 [image: Slack screenshot]
c/w language

88/ Number of times I have said the f-word one or more times in a BBC IRC message: 147
c/w language

89/ Number of times I have said the f-word one or more times in a JIRA ticket: 0
90/ For The OG Britbox (#45)'s Royal Wedding coverage, it was decided that the US would like a live stream. That's fine, the BBC is covering it already!
...but that they'd like *ITV*'s presentation of it. So my job for the wedding was watching ITV's coverage with ads blanked out.
91/ After the first UHD series (#76), more of the processing was given to BBC 'proper' rather than R&D. It was delivered to a colleague for encoding, then to me for packaging and uploading. After a couple of these I turned my script into an EC2 service driven by SQS messages.
92.1/ On the subject of packaging... when I talk about it in this thread this is taking encoded media essence, e.g. HEVC, AAC, h264, and putting it into an appropriate container file, e.g. transport stream, fragmented MP4.
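If 'packaging' sounds abstract: most of the time it's remuxing, not re-encoding, i.e. the compressed bits are copied into a different wrapper unchanged. A hedged illustration using ffmpeg (assuming you have ffmpeg and an elementary stream lying around; this is not how the BBC pipeline invokes anything):

```python
import subprocess

# Copy an already-encoded H.264 elementary stream into an MPEG transport stream
# container. "-c copy" means the video itself is not re-encoded, only re-wrapped.
# File names are made up for illustration.
subprocess.run(["ffmpeg", "-i", "programme.h264", "-c", "copy", "programme.ts"], check=True)
```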
92.2/ For most of my BBC time, the vendor we used for live stream packaging had architectural decisions (which can be summarised as 'SQLite DB') that severely limited horizontal scaling, i.e. at any given time there was a single, very big EC2 instance responsible for BBC One HD.
92.3/ This sometimes did not end well, and it ended up with more caching layers and more workarounds to try and make it more stable. Eventually, on 2018-07-07, it went bang during an England football match and we decided we could do this 'packaging' thing better ourselves.
92.4/ Long story short it took a while and it finally started going live in March of this year. Apart from a couple of teething issues it's made so much of how things work so much easier. It's now multi-region and designed so that number of clients doesn't change packager load.
93.1/ I was once minding Pudsey (#8) during the rehearsals for a Children in Need flash mob. It was nice: in the MediaCityUK plaza, a Strictly choreographer, an 8ft ten-year-old bear, everything you'd want. And best of all, I didn't have to do any dancing.
93.2/ Then, just as it was time to do the real thing, Radzi Chinyanganya, a Blue Peter presenter, came out to join in the fun. He took over Pudsey responsibilities and I was put into the main mob. To dance. After skipping the rehearsal.

This was later broadcast on BBC One.
94/ It may shock you that the BBC is not without its controversies. One of the largest of these was in a broadcast of Trainspotting Live, which was later revoked from iPlayer until it was re-edited, and DG Tony Hall apologised, because one train was, in fact, not live.
95/ Speaking of trains, and UHD, the first UHD programme made available on The OG Britbox that was not also available on BBC Public Service was an hour of a steam train living its best life. [image: a steam engine, steam billowing]
95.2/ It does get quite the glow-up by the end, to be fair. [image: the steam train in the evening]
96/ On my first BBC training course, I spent a week in a training space at Elstree studios. It was not easy to focus: Elstree was set to Christmas at the time, and it was September, so the snow machines outside the window were pretty loud.
97/ There was once a disagreement over the use of the word 'legal' in the slates put up when a live programme isn't available online, and it was requested to be changed to 'technical'. We didn't want to change it to technical. Long story short, it now looks like this: [images: BBC One branded background slates]
98/ Much of The Thick of It was filmed in offices in the White City campus. Due to complaints about 'bad language' and 'shouting', most of it was done after 5pm.
99/ The project to build our own live packaging software was called Project Lapland. Because, you know, packages, Christmas, presents?
100/ Elf is the Project Lapland service that takes the raw media essence and turns it into a nice packaged object.
101/ Santa is the Project Lapland service that takes the packaged objects and delivers them to origin object stores for Content Delivery Networks (CDNs)
102/ Rudolf is the Project Lapland service that generates manifest files - lighting the way from stream entry points to the media chunks themselves.
103/ Finally, Ribbon is the Project Lapland service that provides a control service over the rest - tying it all up into a neat little bow.
104/ Assuming nothing breaks, from about fifteen minutes before kickoff you can tell fairly reliably how many people will be watching a match at full time.

Except on Saturday when either we ran out of new people to tune in, or people got bored around goal #3.
105.1/ Around half the time I visit the equipment rooms with the physical distribution encoders for live streams, I either get lost or forget to check the location reference first.
105.2/ Luckily, despite being in an immense space filled with things going 'brr', these encoders manage to be on a whole other level and are audibly identifiable from about 50 metres.

brrrRRRRRRRR
106.1/ The physical simulcast encoders are stuck on a software version from 2016 - an update rewrote a critical feature such that it no longer worked for our use case. For quite a while we tried to get this sorted so we could upgrade.
106.2/ The vendor was eventually used in a significant build-out for another streaming provider, and one of the systems engineers at another company working on this was also 'person I eat pizza and drink beer with on Mondays'.
106.3/ On one such Monday, they were talking about one of the problems they were working on, which we solved by using this feature. A "wouldn't it be cool if..." later, it was fed back, and a couple of weeks later we discovered reimplementing it was on the product roadmap.
106.4/ It's now done, but on a version our hardware is too old to upgrade to. So we're still stuck for now.
107/ There was once an outage to live streams for radio because somebody unplugged the time cable.
108/ Oh no, he's getting into time, make it stop!

Online video streams have a timecode based on UTC inserted a couple of SDI cables up from the contribution encoder in each capture node. This is practically never the same time code by the time it reaches your device.
109.1/ The video distribution encoders count time thusly:

1. At startup, read the timecode from the first video frame received
2. For each frame encoded, increment internal timecode by one frame

That's okay, as long as everything else is perfect.
109.2/ If video is ever hard, such as if there's a dance show that doesn't believe in 'too much confetti', the encoder might decide it doesn't have enough time to finish encoding a frame, and so repeats the previous one.
109.3/ However, it doesn't *drop* the frame it was encoding - it outputs it when it's done, repeating as much as necessary. Over an episode of Strictly Come Dancing, an encoder can manage to drift ten seconds or more out of sync from these frame repeats.
109.4/ This is fixed in newer encoder versions, but see #106.
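A toy model of the drift, as I understand the behaviour described above (my simplification, not vendor code): the counter ticks once per frame *emitted*, repeats included, so every repeated frame pushes the timecode one frame further ahead of the wall clock.

```python
FPS = 25  # UK broadcast frame rate

def timecode_drift_seconds(input_frames: int, repeated_frames: int, fps: int = FPS) -> float:
    """How far a frame-counted timecode runs ahead of real time after some frame repeats."""
    output_frames = input_frames + repeated_frames  # repeats are emitted as extra frames
    return (output_frames - input_frames) / fps

# A ~100 minute episode at 25fps, with 0.2% of frames repeated during the
# confetti-heavy numbers (the percentage is mine, purely illustrative):
episode_frames = 100 * 60 * FPS
repeats = int(episode_frames * 0.002)
print(f"{timecode_drift_seconds(episode_frames, repeats):.1f}s of drift")  # 12.0s of drift
```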
110/ This is quite convenient for leap seconds, as the encoder doesn't notice them until you restart it, so you can insert a 1s frame hop at your own leisure. But it's also pretty rubbish otherwise.
111.1/ Leap seconds! There's currently 37 of them, of which 27 you can count*.

*I'm probably wrong, but nobody understands time anyway.

Unlike video streams, the timecode for radio is based on *TAI*. This is like UTC (seconds since 1970-01-01 00:00:00).
111.2/ The difference is 1) when 1970-01-01 00:00:00 was - ten seconds earlier for TAI than UTC, and 2) TAI does not have leap seconds - it has been completely monotonic.

To turn TAI time back to UTC in your player and elsewhere, the number '37' is configured in seven places.
111.3/ Well, seven last time there was a leap second and we had to find them all.
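To make the '37' concrete, this is the shape of the conversion those seven configuration points are doing (a minimal sketch; the offset is correct as of the 2017 leap second, and it glosses over the epoch detail in 111.2):

```python
TAI_MINUS_UTC_SECONDS = 37  # changes whenever another leap second is declared

def tai_to_utc_seconds(tai_seconds: float) -> float:
    """Convert a TAI-based media timestamp into a UTC-based one by removing the offset."""
    return tai_seconds - TAI_MINUS_UTC_SECONDS

# e.g. a segment stamped 1_625_097_637 on the TAI scale corresponds to
# 1_625_097_600 on the UTC scale (2021-07-01 00:00:00 UTC).
print(tai_to_utc_seconds(1_625_097_637))
```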
112/ A couple more UHD series after #91, I discovered that the coworker who did the encoding part (the step before I put the message onto a queue to run the packaging script) was now also just running a script. I rinsed, repeated, and wired the two together.
113/ Still not having had roadmap space to fully integrate UHD series, the next time around I wrote docs as I did it, and the time after that I showed the product manager how to do it themselves, until this could be done properly.
114/ I have contributed to 89 distinct GitHub repositories at the BBC, and for several years was an admin of the dash.js project (though largely inactive, I'll admit).
115/ The One Show have their own printer/photocopiers. Or at least have taped a sheet of A4 paper on them to tell other people to keep their print jobs off. Kind of like the hotdesks featured in W1A, but for multifunctional printing appliances.
116.1/ During the build-up to the Royal Wedding in 2018 (actually there were two Royal Weddings, but people only talk about one of them (also, is it still a Royal Wedding retrospectively now they've retired from being Royal?))
116.2/ Sorry. During the build up to Royal Wedding 1 in 2018, the feed from the Outside Broadcast cut out a couple of times. We were advised not to worry though as it was believed to be aircraft interference and the flight restriction would be taking effect shortly.
117/ I have spent about the last forty tweets trying to get Tensorflow to work in the background to see what training a neural network on all my JIRA activity would generate as new tickets, and have failed.
118/ I have been an Active Participant of 2,685 JIRA tickets.
119/ The two UHD scripts mushed onto EC2 instances now have a Rundeck job in front that operators can use, and it's been considered 'production enough' for now.
It's called the Interim Production UHD Workflow, or IPUW, pronounced 👁💩.
120/ When an online radio station output is blanked, sometimes the loop cuts between random points. We know why, we know how to fix it, we just never have. So sometimes it will say, for example, "Thank you for choosing to try again later."
121.1/ That's an excerpt from the default backup loop speech, which obviously I can recite from memory with proper cadence and enunciation, and which is ~15s long. Some stations have their own. Local Radio's is 35 seconds, 5 Live Sports Extra's a whopping 12 minutes 9 seconds,
121.2/ and Asian Network's is 3 seconds. That one gets irritating to hear on a loop real fast.
122/ I once said the sentence "There's nothing wrong with a Confluence calendar that tells you what you're manually redeploying today" and was told it was perfect.

This is actually one of the other problems Project Lapland solves.
123/ In an incident summary from a vendor, the document made clear that the service is resilient and never has issues due to the failure of a single hardware appliance, and then went on to explain how it had issues due to the failure of a single hardware appliance.
124.1/ On the web, a 'cache' is a service that concentrates requests, offloading them from your origin service - if someone asked it for a file earlier, it keeps a copy so the next time someone asks it can serve that instead of asking the origin again.
124.2/ Things that are basically fancy caches make up 99% of my job (and 100% of my new one), so actually there's a little more to it than that, but that's the general idea.
125.1/ One way of measuring how well a cache service is performing is the 'bytes offloaded' metric - this is the ratio of bytes served to users versus those it needed to fetch from the origin. A 90% offload means for every 10MB served, it needed to get 1MB from the origin.
125.2/ Due to a minor misconfiguration once, one of our caching layers reached an offload of -1600%. For every 1MB served to a user, it requested 16MB from the origin.

This was not a good day.
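For anyone who wants the arithmetic: offload is just one minus the origin-to-served ratio, so a layer that keeps missing (or re-fetching more than it serves) can swing steeply negative. A quick sketch, with the pathological numbers being my own illustration:

```python
def offload_percent(bytes_served: int, bytes_from_origin: int) -> float:
    """Bytes-offloaded metric: the share of served bytes the origin didn't have to supply."""
    return 100 * (1 - bytes_from_origin / bytes_served)

# The healthy case from 125.1: 10MB served to users, only 1MB fetched from the origin.
print(offload_percent(10_000_000, 1_000_000))   # 90.0

# A pathological case: 17MB of origin fetches for every 1MB served, e.g. cache keys
# that never match so the same ranges get requested over and over.
print(offload_percent(1_000_000, 17_000_000))   # -1600.0
```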
126/ On the Reading + Leeds live streams in 2016, you could see the photographer fall over about three seconds after this photo was taken: ichef.bbci.co.uk/images/ic/raw/…

I personally feel like it was worth it.
127/ I eventually stopped doing photo editing for festivals. This is because I ended up looking more at those cache things than photo editing during the events, so decided to keep myself fully free to do that. And definitely wasn't because I was bad at it.
128/ Along with a YouTube stream (#46), for One Love Manchester I also looked after the syndication to dozens of small radio stations around the world who could only get receivers for simple HTTP streams set up in time.
129/ Having given up on Tensorflow, I just counted uses of each token in my JIRA history.
As a sign of confidence in the way things are, I am slightly more likely to say "should" (872) than "will" (866).
130/ Whilst I could do a whole other thread on how the cloud doesn't scale, an on-premises service once had capacity constraints over a weekend.

Developers arrived Monday morning asking questions like "why are the pre-prod environments gone" and "didn't this have RAM before?"
131/ I once took the BBC One HD online stream down because I was trying to reboot a server in the non-live path, but it kept rejecting a password. To check the password was definitely right, I logged into the live equivalent, and when it let me in rebooted that one instead.
132/ I lost my favourite coat on a train because I got distracted by a live issue so rushed off of it to sit on the platform fixing it. It was a very, very cold platform, but at least I remembered my suitcase.
133/ There were erm... 'elevated error rates' for BBC live streams once due to misconfiguration of the update service of a popular operating system causing devices to connect to the wrong servers.
133 likes, 133 things. Since I've managed to catch up I'm going to call it there, I think. It's been a ride, and thanks to the BBC people who provided ideas/fact checking/"is this okays?", and to other friends who kept telling me I could do this. Now go watch some Netflix.
