It’s been a while since we had a weeknotes, hasn’t it? To cut a long story short, it’s been a busy winter for most of the tech team and we haven’t had as much time to do interesting stuff, let alone write about it. But we’re here now, and here’s a quick run-down of the past couple of months:

We did Christmas

There are three times in the year when we’re put under pressure to make sure things go seamlessly: Easter, Remembrance Sunday, and Christmas. This year for Christmas we went all-out and streamed every Christmas service, even streaming our crib service internally so we could use our Community Centre for socially distanced overflow if needed.

Most of these were streamed using our usual complement of equipment, but for the Festival of Lessons and Carols we built on what we had learned at Remembrance to add two whole new temporary camera angles. One gave a closer view of the choir to help our viewers feel more connected to the music, and the other was an alternative angle on the lectern to add variety during readings. This went well for a couple of reasons: the extra angles make streaming feel more creative and engaging for both the video operator and viewers, and they give us a way to test new angles before we commit to anything when thinking about expanding our permanent system.

Our Omada write-up is being used

Way back last August we wrote an in-depth dive into how we set up our Omada controller. Since then we’ve heard from a couple of people who followed the guide to get their own installation up and running. It’s always nice to know that we’re not just talking to the void!

We fixed some email configuration

As a sort of follow-up to the above Omada configuration, we identified a problem with our email settings: they weren’t playing nicely with Google’s two-factor authentication requirements. To resolve this we swapped our controller’s email configuration over to our Mailgun account instead.
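If you want to sanity-check settings like these outside the controller’s UI, a quick test along these lines will do it – this is just Python’s standard smtplib against Mailgun’s documented SMTP endpoint, and the domains and credentials are placeholders for the real ones from the Mailgun dashboard:

```python
# Quick SMTP sanity check using only Python's standard library.
# smtp.mailgun.org:587 is Mailgun's documented SMTP endpoint; the
# domains and credentials below are placeholders.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "controller@mg.example.org"
msg["To"] = "tech-team@example.org"
msg["Subject"] = "Omada controller email test"
msg.set_content("If you can read this, the new SMTP settings work.")

with smtplib.SMTP("smtp.mailgun.org", 587) as smtp:
    smtp.starttls()  # Mailgun expects STARTTLS on port 587
    smtp.login("postmaster@mg.example.org", "SMTP_PASSWORD")
    smtp.send_message(msg)
```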

We removed a load of equipment from the stage. Then we put it back.

Since our Community Centre was refurbished we’ve had a somewhat temporary kludge of stuff running the sound and stage lighting. During the pandemic there wasn’t much of a drive to do anything about this, but now that we’re back up and running it was getting annoying.

To fix this we removed everything (including all our power and signal cables), came up with a tidier plan, and put it back again. We’ve got more work on this planned for the next couple of months to get things looking even smarter, keep them safer, and make them easier to maintain.

We started looking at better heating controls

Energy costs are going up, and our buildings are awkward and expensive places to heat. At the moment all our heating is controlled by fairly dumb thermostats with schedulers, and they need someone physically standing next to them to update the schedule. We’re starting to look at connected solutions which not only let us update the schedule remotely, but which also support things like weather-aware pre-heating, and better zoning of our heat to get some potentially significant efficiency gains.
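Purely as an illustration of what ‘weather-aware pre-heating’ could mean, here’s a back-of-the-envelope sketch in Python – the warm-up rates are made-up numbers, and a real system would learn them from how each building actually behaves:

```python
from datetime import datetime, timedelta

# Hypothetical heating constants -- real values would come from measuring
# how quickly each building warms up at different outdoor temperatures.
BASE_RATE_C_PER_HOUR = 2.0    # warm-up rate on a mild day
RATE_LOSS_PER_DEGREE = 0.05   # warming slows as it gets colder outside

def preheat_start(target_c: float, indoor_c: float, outdoor_c: float,
                  service_time: datetime) -> datetime:
    """Estimate when to switch the heating on so the room reaches
    target_c by service_time, given current indoor/outdoor temps."""
    deficit = max(target_c - indoor_c, 0.0)
    rate = max(BASE_RATE_C_PER_HOUR
               - RATE_LOSS_PER_DEGREE * (target_c - outdoor_c), 0.5)
    return service_time - timedelta(hours=deficit / rate)

# e.g. a 10:30 service, 12C inside, 2C outside, aiming for 19C
start = preheat_start(19.0, 12.0, 2.0, datetime(2022, 3, 6, 10, 30))
print(f"Switch heating on at {start:%H:%M}")
```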

We built some tools to automate some workflows

One thing we dislike is having to repeat ourselves, which is why a lot of what we do relies on only having one ‘true’ source of information for each set of facts within the Church. For a lot of these things the source of truth is a thing called ChurchSuite – everything in our calendar comes straight from there, as does our list of what’s on (and we share the code we use for this). It knows about – at least in theory – all our upcoming services and events.

But quite a lot of our streaming workflow doesn’t yet rely on this single source of truth, instead depending on a series of checklists and our technical team manually stitching things together to make sure our orders of service and YouTube channel are in sync. This unfortunately involves quite a lot of repetition – we enter things like dates and times in at least half a dozen places for every service, and write things like video titles in a very formulaic way.

Fortunately, computers are really good at copying things into lots of places in a formulaic way. So we built some bits of code which automate chunks of workflow for our Communications Team and Streaming Team. These pull events from ChurchSuite into another tool called Airtable, where those teams can then layer information specific to what they’re doing on top, without needing to duplicate all the basics. And when something is updated in ChurchSuite, the replicated pieces of information are updated as well. Each team then uses Airtable to generate the reports they need to do their job.
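In spirit the sync boils down to an upsert keyed on the ChurchSuite event ID. Here’s a rough sketch, not the real code – the feed URL, Airtable base/table IDs and field names are all placeholders, and a production version would also handle Airtable’s pagination:

```python
import requests

# Illustrative only: the ChurchSuite feed path and the Airtable
# base/table IDs would be specific to the real setup.
CHURCHSUITE_FEED = "https://example.churchsuite.com/embed/calendar/json"
AIRTABLE_URL = "https://api.airtable.com/v0/appXXXXXXXXXXXXXX/Services"
AIRTABLE_HEADERS = {"Authorization": "Bearer AIRTABLE_API_KEY"}

def existing_records() -> dict:
    """Map the ChurchSuite event ID stored on each Airtable record
    to that record's Airtable ID, so updates hit the right row."""
    records = requests.get(AIRTABLE_URL,
                           headers=AIRTABLE_HEADERS).json()["records"]
    return {r["fields"].get("ChurchSuite ID"): r["id"] for r in records}

def sync():
    known = existing_records()
    for event in requests.get(CHURCHSUITE_FEED).json():
        fields = {
            "ChurchSuite ID": event["id"],
            "Name": event["name"],
            "Start": event["datetime_start"],  # field names are assumptions
        }
        if event["id"] in known:
            # Already replicated: push any upstream changes through.
            requests.patch(f"{AIRTABLE_URL}/{known[event['id']]}",
                           headers=AIRTABLE_HEADERS, json={"fields": fields})
        else:
            requests.post(AIRTABLE_URL, headers=AIRTABLE_HEADERS,
                          json={"fields": fields})

if __name__ == "__main__":
    sync()
```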

…and then we automated telling people the result

Once a week, another bit of tooling goes and grabs the information from Airtable, does some work to group it and highlight things of interest, and then emails the teams a summary. This improved visibility makes sure that our teams can more easily keep on top of what’s going on and what they need to do, stopping things from falling down the cracks and getting problems solved earlier.
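The grouping-and-highlighting step is the interesting bit. It looks something like the sketch below (the field names and the ‘needs attention’ rule are stand-ins), with the result then sent out via the same Mailgun SMTP route as earlier:

```python
from collections import defaultdict

import requests

AIRTABLE_URL = "https://api.airtable.com/v0/appXXXXXXXXXXXXXX/Services"
HEADERS = {"Authorization": "Bearer AIRTABLE_API_KEY"}

def weekly_summary() -> str:
    """Group upcoming services by date and flag anything incomplete."""
    records = requests.get(AIRTABLE_URL, headers=HEADERS).json()["records"]
    by_date = defaultdict(list)
    for r in records:
        f = r["fields"]
        by_date[f.get("Start", "")[:10]].append(f)  # YYYY-MM-DD prefix
    lines = []
    for date in sorted(by_date):
        lines.append(date)
        for f in by_date[date]:
            # "Checked" stands in for whatever completeness flag a team uses
            flag = "" if f.get("Checked") else "  ** needs attention **"
            lines.append(f"  - {f.get('Name', 'Untitled')}{flag}")
    return "\n".join(lines)

print(weekly_summary())
```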

…and then we automated some downstream processes

For the Streaming Team, though, we went a step further by automating the process of adding streamed services to YouTube. As soon as a service is marked as being streamed we now automatically create a new stream with the relevant details, ready to go.
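The YouTube side of this uses the Data API’s liveBroadcasts.insert call. A minimal version looks roughly like this – the token file, title and start time are placeholders, where in practice they’d be built from the Airtable record:

```python
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

# A stored OAuth token with the YouTube scope; the filename is a placeholder.
credentials = Credentials.from_authorized_user_file("token.json")
youtube = build("youtube", "v3", credentials=credentials)

broadcast = youtube.liveBroadcasts().insert(
    part="snippet,status",
    body={
        "snippet": {
            # In practice the formulaic title comes from Airtable fields
            "title": "Sung Eucharist - Sunday 6 March",
            "scheduledStartTime": "2022-03-06T10:30:00Z",
        },
        "status": {"privacyStatus": "public"},
    },
).execute()
print("Created broadcast", broadcast["id"])
```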

Once we’re confident in what we’ve built so far, the next step is to include the creation and interlinking of order of service entries as well, at which point we’ll have automated away the vast majority of the weekly drudge work. This will free up our time for things that actually need a human’s attention, instead of copying and pasting.

We started working on an improved switcher for our stage equipment

As part of improving the house rig in the Community Centre, we’re building an improved mains switcher and interface so that Centre users can more intuitively manage the sound system and stage lighting. Our current plan is a pair of Raspberry Pis, one with a touchscreen and one hooked up to a relay board, with a server running on the relay unit communicating with a JavaScript frontend (there’s a sketch of the relay side after the list below).

This has a few advantages over our current way of doing things:

  • It’s more unified (so no more hunting for the right switch)
  • It’s more intuitive (we can put instructions for things right there on the screen)
  • It’s more extensible (without needing to add yet more boxes to the front of the stage)
  • Because it’s software-defined we can update it easily (without having to go under the stage to visit the equipment rack)
  • Since it’s connected to our network we can control things remotely (for example by remotely assisting a Centre user if they’re having trouble)
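For the curious, the relay side of the proof-of-concept boils down to something like this. It’s a sketch only – we’ve used Python and Flask here for illustration rather than committing to a stack, and the GPIO pin numbers are placeholders for however the relay board ends up wired:

```python
from flask import Flask, jsonify
from gpiozero import OutputDevice

app = Flask(__name__)

# GPIO pins are placeholders -- they'd match the relay board's wiring.
CHANNELS = {
    "sound": OutputDevice(17),
    "stage-lighting": OutputDevice(27),
}

@app.route("/channel/<name>/<state>", methods=["POST"])
def set_channel(name, state):
    """Switch a named mains channel on or off and report its state."""
    device = CHANNELS.get(name)
    if device is None:
        return jsonify(error="unknown channel"), 404
    if state == "on":
        device.on()
    else:
        device.off()
    return jsonify(channel=name, on=bool(device.value))

if __name__ == "__main__":
    # Listen on the LAN so the touchscreen Pi (and remote helpers) can reach it.
    app.run(host="0.0.0.0", port=8080)
```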

When we’ve finished our proof-of-concept we’ll open-source the whole thing and publish a more complete write-up.