Review: Remembrance Sunday 2023

After all of our big tech events, we like to review how things went and identify areas for improvement, even if we felt that nothing went wrong. We always do these in a no-blame way, following the Retrospective Prime Directive:

“Regardless of what we discover, we understand and truly believe that everyone did the best job they could, given what they knew at the time, their skills and abilities, the resources available, and the situation at hand.”

Norm Kerth, Project Retrospectives: A Handbook for Team Review

For our reviews we follow a format called The Good, The Bad and The Ugly, where we identify things which went well, things which shouldn’t have happened, and things which we could have done better.

The Good

  • The stream from the War Memorial was good, with solid connectivity and clear audio.
  • Using an additional speaker at the War Memorial gave clearer, crisper audio for those there.
  • Relaying the Act of Remembrance into the Community Centre was appreciated by those unable to join the procession for whatever reason.

The Bad

  • We lost our in-church connectivity just as the procession was forming, which meant we had to hurriedly switch to a backup. We were only able to do this because performance issues earlier in the morning had already left us in a state where that backup was available. We should investigate ways of improving the resiliency of this connection.

The Ugly

  • The internet connection in the church was performing poorly at the start of the service, prompting us to use a more adaptive but lower quality streaming process.
  • There was a slight mis-timing at the start of the service due to some confusion over flag parties.
  • We forgot to disable the wifi on our mobile camera, which led to a drop in connectivity as we left the church.
  • We used a radio microphone at the War Memorial, but this wasn’t as reliable as we expected. We should spend the time to install a wired microphone in future.

Community Centre: Now with extra wifi

A few weeks ago during a routine check on some of our equipment in our Community Centre we noticed that one of our wifi access points was no longer connected to the network and totally refused to reconnect no matter what we tried. This was annoying, but not entirely unexpected – the access points were installed in 2019, as a relatively cheap stop-gap solution until something more permanent was put in place. Then 2020 happened, and priorities changed slightly.

But still, a failing access point forced our hand. More specifically, it forced our hands to move the remaining access points around to try and maintain coverage where it was most needed – the bar, where it supports our till and card terminal, and the office, where people do a lot of parish administration.

Unfortunately, this deployment meant that the Main Hall suffered from poor coverage. It would sometimes work if you stood in the right place, but couldn’t really be relied on. Since we promote our guest wifi network, and advertise wifi coverage as one of the features of the building, this wasn’t great. Not only was it far below the standard we aim to provide, but it also began to affect the quality of events.

So we decided to fix it. Fortunately we already had the beginnings of a solution powering the wifi in our church building, a pair of TP-Link Omada hotspots (more specifically, EAP115s), along with a controller. Instead of spending time scoping out a new system, we knew we could just extend this. All we needed to do was decide on the new hardware for our Community Centre and get it installed.

During the refurbishment of our Community Centre in 2019 we installed some structured network cabling (the plan being to one day hook proper wifi access points up to it), which meant we knew the points in the building where hotspots could be easily installed. Being able to power them over ethernet was crucial because not all these locations had mains. Ideally, we wanted ones which could be powered by our network switch directly; the EAP115 access points need an external power injector, which in turn takes up space and needs sockets. We also considered the density of users, and whether we should future-proof by supporting the newer WiFi 6 standard.

In the end we settled on four new WiFi 6 access points: three EAP620HD units which cover the main hall and bar, and an EAP615-Wall unit which gives the Smeaton Room dedicated coverage, as well as maintaining access to physical network sockets in there should we ever need them. In an ideal world we would have ceiling-mounted all of the EAP620s, but limitations of our cabling mean that two of them are wall-mounted instead. Fixing this would require a significant amount of work to move cabling, channelling out walls and ceilings and then making good again, and it’s simply not worth it for the marginal gains.

Once the hardware was installed, the next step was provisioning it and configuring our network. For the most part, we just applied the Church’s existing network configuration to the new access points. Print off some updated signage with our new network details for guests, and we’re done!

Finally, because we know everything we do has an environmental footprint, we’ve offset a full tonne of CO2 emissions to cover the impact of manufacturing and shipping this new equipment. We’ve also funded the planting of another 25 trees, more specifically mangrove trees in Mozambique.

Oops: Streaming failure

This morning we suffered a networking failure which meant we couldn’t stream the full service; we only captured the first few minutes.

What happened?

Shortly after the start of the service, we began to see signs of unexpected buffering on our video stream. A few minutes after this we lost all internet connectivity in the church. We quickly tracked this down to a complete failure of the wireless bridge between the church and the Community Centre, but were unable to restore the connection.

Why did this happen?

We don’t know. It’s possible that the cold temperatures were having an impact on the equipment we use for the wireless bridge, and that the timing of the failure was entirely coincidental. The equipment, however, is intended for use outdoors and should be comfortable operating in a much wider range of temperatures. For this reason, we’re wary of naming the weather as the culprit.

What are we doing to fix it?

In the short term, we’ve re-angled one of the receivers, which had been knocked during routine maintenance, to make sure we always have the strongest possible connection between the two buildings.

In the long term, we’ll investigate the feasibility of installing a permanent fibre-optic link between the two buildings so that we’re unaffected by issues such as signal alignment and weather in future.

Post-mortem: Temple Newsam Centenary Weekend

One thing we like to do in the technology team is look back at things we’ve done and contemplate what we liked, what we didn’t like, what we learned, and what we’re going to do next. We call this a post-mortem, and after a particularly busy weekend celebrating 100 years of Temple Newsam being owned by the City of Leeds, we’ve got two events to look at!

As always our reviews – especially around stuff which didn’t go quite right – are blameless. No one individual is held responsible, as we believe this is the best way to identify our weaknesses and get them fixed.


The Whitkirk Lecture

This was a lecture held in our Community Centre, for which we provided projection and sound reinforcement, as well as video recording for later release.

Things that we liked

  • Having a clicker and presenter display made our presenter more confident, as well as keeping the podium clear

Things that we didn’t like

  • Our projector’s VGA input wasn’t compatible with our chosen computer’s VGA adapter, so we had to cobble together an alternative on the fly
  • There weren’t many good or unobtrusive places to put our tripods to record the talk

Things we learned

  • We don’t have enough power distribution options to get power where we needed it without resorting to a mix of cabling

Things we’re going to do

  • Encourage more investment in the Centre as a presentation venue, such as purchasing a proper podium
  • Look at the costs involved in purchasing a new projector and screen suitable for the space
  • Invest in some more power distribution options for the tech team

The Annual Eucharist

We streamed this service live from Temple Newsam House, as well as relaying it across the building to an overflow space in case the Long Gallery exceeded capacity.

Things that we liked

  • Using a dedicated 4G router simplified our setup, meaning we could more confidently use multiple devices

Things that we didn’t like

  • Our outbound network speed wasn’t as fast as we predicted, meaning we suffered some drop-outs and loss of quality
  • The poor lighting in the room meant some of our shots weren’t as sharp as we would like
  • The lack of internal networking in the building meant we were forced into maintaining two entirely separate mobile network connections

Things we learned

  • Our internal network’s performance was impacted by the sheer volume of people in the room once everyone was standing
  • The mobile signal in the Long Gallery, despite our router having much larger antennas than a phone, still isn’t great

Things we’re going to do

  • In future, make sure our hotspot is lifted above the crowd on a tripod
  • Investigate an external antenna (or two) to let us maximise our external connectivity

Weeknotes: Saturday 20 August

The period of Ordinary Time between Trinity Sunday and Advent is pretty empty in the Church calendar… but for the tech team it’s a chance to get on with all kinds of bits and pieces.

We automated the notices

Over the past few months, we’ve been progressively automating more of the things which happen every week in an easily repeatable way. The latest thing to be taken over by the computer is the process of generating our weekly notices both for our website and email.

We’ve had fibre installed

As part of our plans for continuous improvement, we’ve had fibre-optic internet fitted to the Community Centre. This gives us slightly higher speeds than before, but more importantly it offers improved reliability, future-proofing and scope for further improvement if needed.

Better continuity and disaster planning

Nobody likes thinking about things going wrong, but we’ve spent some time figuring out how we can improve the tech team’s resiliency in a number of situations.

Weeknotes: Saturday 12 February

It’s been a while since we had a weeknotes, hasn’t it? To cut a long story short, it’s been a busy winter for most of the tech team and we haven’t had as much time to do interesting stuff, let alone write about it. But we’re here now, and here’s a quick run-down of the past couple of months:

We did Christmas

There are three times in the year when we’re put under pressure to make sure things go seamlessly: Easter, Remembrance Sunday, and Christmas. This year for Christmas we went all-out and streamed every Christmas service, even streaming our crib service internally so we could use our Community Centre for socially distanced overflow if needed.

Most of these were streamed using our usual complement of equipment, but for the Festival of Lessons and Carols we built on things we had learned at Remembrance to add two whole new temporary camera angles. One of these gave us a closer view of the choir to help our viewers feel more connected to the music, and the other was an alternative angle on the lectern to give extra variety during readings. This went well for a couple of reasons: the extra angles made the process of streaming feel more creative and engaging, both for the video operator and for viewers, and they gave us a way to test out new angles before we commit to anything as we think about expanding our permanent system.

Our Omada write-up is being used

Way back last August we wrote an in-depth dive into how we set up our Omada controller. Since then we’ve heard from a couple of people who followed the guide to get their own installation up and running. It’s always nice to know that we’re not just talking to the void!

We fixed some email configuration

As a sort of follow-up to the above Omada configuration, we identified a problem with our email settings, which weren’t playing nicely with Google’s two-factor authentication requirements. To resolve this we swapped our controller’s email configuration to use our Mailgun account instead.
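
If you’re in a similar position, the change amounts to pointing the controller’s mail settings at Mailgun’s SMTP endpoint rather than Gmail. The snippet below is a minimal sketch for checking that a set of Mailgun SMTP credentials works before you plug them into the controller’s web interface; the domain, addresses and password are placeholders rather than our real details.

```python
import smtplib
from email.message import EmailMessage

# Placeholder Mailgun SMTP credentials - substitute your own domain and password.
SMTP_HOST = "smtp.mailgun.org"
SMTP_PORT = 587  # STARTTLS
SMTP_USER = "postmaster@mg.example.org"
SMTP_PASS = "your-mailgun-smtp-password"

msg = EmailMessage()
msg["From"] = SMTP_USER
msg["To"] = "tech-team@example.org"
msg["Subject"] = "Omada controller email test"
msg.set_content("If you can read this, the SMTP credentials work.")

with smtplib.SMTP(SMTP_HOST, SMTP_PORT) as server:
    server.starttls()  # Mailgun requires TLS on port 587
    server.login(SMTP_USER, SMTP_PASS)
    server.send_message(msg)
```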

We removed a load of equipment from the stage. Then we put it back.

Since our Community Centre was refurbished we’ve had a somewhat temporary kludge of stuff running the sound and stage lighting. During the pandemic there wasn’t much of a drive to do anything about this, but now that we’re back up and running it was getting annoying.

To fix this we removed everything (including all our power and signal cables), came up with a tidier plan, and put it back again. We’ve got more work on this planned for the next couple of months to get things looking even smarter, keep them safer, and make them easier to maintain.

We started looking at better heating controls

Energy costs are going up, and our buildings are awkward and expensive places to heat. At the moment all our heating is controlled by fairly dumb thermostats with schedulers, and these need people to be physically stood next to them to update the schedule. We’re starting to look at connected solutions which not only let us update the schedule remotely, but which also support things like weather-aware pre-heating, and better zoning of our heat to get some potentially significant efficiency gains.

We built some tools to automate some workflows

One thing we dislike is having to repeat ourselves, which is why a lot of what we do relies on only having one ‘true’ source of information for each set of facts within the Church. For a lot of these things the source of truth is a thing called ChurchSuite – everything in our calendar comes straight from there, as does our list of what’s on (and we share the code we use for this). It knows about – at least in theory – all our upcoming services and events.

But quite a lot of our streaming services don’t yet rely on this single source of truth, instead relying on a series of checklists and our technical team manually stitching things together to make sure our orders of service and YouTube channel are in sync. This unfortunately involves quite a lot of repetition – we set things like dates and times in at least half a dozen places for every service, and write things like video titles in a very formulaic way.

Fortunately, computers are really good at copying things into lots of places in a formulaic way. So we built some bits of code which automate chunks of workflow for our Communications Team and Streaming Team. These pull events from ChurchSuite into another tool called Airtable, where each team can then layer the information specific to what they’re doing on top, without needing to duplicate all the basics. And when something is updated in ChurchSuite, those replicated pieces of information are updated as well. Each team then uses Airtable to generate the reports they need to do their job.
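
To give a flavour of the sync step, here’s a minimal sketch in Python. The ChurchSuite feed URL and field names are illustrative placeholders (not our real configuration), and it uses Airtable’s REST API to create or update one record per event; our actual tooling does rather more than this.

```python
import requests

# Illustrative endpoints and credentials - not our real configuration.
CHURCHSUITE_FEED = "https://example.churchsuite.com/embed/calendar/json"
AIRTABLE_URL = "https://api.airtable.com/v0/appXXXXXXXXXXXXXX/Services"
AIRTABLE_HEADERS = {"Authorization": "Bearer YOUR_AIRTABLE_TOKEN"}

def sync_events():
    """Pull upcoming events from ChurchSuite and mirror them into Airtable."""
    events = requests.get(CHURCHSUITE_FEED, timeout=30).json()

    # Fetch existing Airtable records so we can update rather than duplicate.
    existing = requests.get(AIRTABLE_URL, headers=AIRTABLE_HEADERS, timeout=30).json()
    by_event_id = {
        rec["fields"].get("ChurchSuite ID"): rec["id"]
        for rec in existing.get("records", [])
    }

    for event in events:
        fields = {
            "ChurchSuite ID": str(event["id"]),
            "Name": event["name"],
            "Starts": event["datetime_start"],
        }
        record_id = by_event_id.get(fields["ChurchSuite ID"])
        if record_id:
            # Event already mirrored - push any updated details.
            requests.patch(f"{AIRTABLE_URL}/{record_id}", headers=AIRTABLE_HEADERS,
                           json={"fields": fields}, timeout=30)
        else:
            requests.post(AIRTABLE_URL, headers=AIRTABLE_HEADERS,
                          json={"fields": fields}, timeout=30)

if __name__ == "__main__":
    sync_events()
```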

…and then we automated telling people the result

Once a week, another bit of tooling goes and grabs the information from Airtable, does some work to group it and highlight things of interest, and then emails the teams a summary. This improved visibility makes sure that our teams can more easily keep on top of what’s going on and what they need to do, stopping things from falling through the cracks and getting problems solved earlier.
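
The summary job itself is nothing clever: roughly speaking it groups the Airtable records by team and builds a plain-text digest, something like the sketch below (again with placeholder base, table and field names), which then goes out through the same Mailgun account mentioned above.

```python
from collections import defaultdict
import requests

# Placeholder Airtable base, table and token - not our real configuration.
AIRTABLE_URL = "https://api.airtable.com/v0/appXXXXXXXXXXXXXX/Services"
HEADERS = {"Authorization": "Bearer YOUR_AIRTABLE_TOKEN"}

def build_digest() -> str:
    """Group upcoming services by responsible team and format a plain-text summary."""
    records = requests.get(AIRTABLE_URL, headers=HEADERS, timeout=30).json()["records"]

    by_team = defaultdict(list)
    for rec in records:
        fields = rec["fields"]
        by_team[fields.get("Team", "Unassigned")].append(
            f"- {fields.get('Starts', 'TBC')}: {fields.get('Name', 'Untitled')}"
        )

    return "\n\n".join(
        f"{team}:\n" + "\n".join(sorted(lines))
        for team, lines in sorted(by_team.items())
    )

if __name__ == "__main__":
    print(build_digest())  # in practice this body is emailed to the teams
```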

…and then we automated some downstream processes

For the Streaming Team, though, we went a step further by automating the process of adding streamed services to YouTube. As soon as a service is marked as being streamed we now automatically create a new stream with the relevant details, ready to go.
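
Under the hood this leans on the YouTube Data API’s live-streaming endpoints. A stripped-back sketch of the idea looks something like this, assuming OAuth credentials have already been set up with google-api-python-client; the title and start time in the usage line are made-up examples rather than a real service.

```python
from googleapiclient.discovery import build
from google.oauth2.credentials import Credentials

# Assumes OAuth credentials with the YouTube scope have already been obtained.
creds = Credentials.from_authorized_user_file(
    "credentials.json", scopes=["https://www.googleapis.com/auth/youtube"]
)
youtube = build("youtube", "v3", credentials=creds)

def create_broadcast(title: str, start_time_iso: str) -> str:
    """Create a scheduled YouTube live broadcast and return its ID."""
    broadcast = youtube.liveBroadcasts().insert(
        part="snippet,status,contentDetails",
        body={
            "snippet": {
                "title": title,                      # e.g. built formulaically from the event name
                "scheduledStartTime": start_time_iso,
            },
            "status": {"privacyStatus": "public"},
            "contentDetails": {"enableAutoStart": True, "enableAutoStop": True},
        },
    ).execute()
    return broadcast["id"]

# Hypothetical usage with an illustrative title and date:
print(create_broadcast("Sung Eucharist - Sunday 20 February 2022", "2022-02-20T10:00:00Z"))
```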

Once we’re confident in what we’ve built so far, the next step is to include the creation and interlinking of order of service entries as well, at which point we will have automated away the vast majority of the weekly drudge work. This will free up our time to worry about things that actually need a human’s attention, instead of copying and pasting.

We started working on an improved switcher for our stage equipment

As part of improving the house rig in the Community Centre, we’re building an improved mains switcher and interface so that Centre users can more intuitively manage the sound system and stage lighting. Our current plan is a pair of Raspberry Pis, one with a touchscreen and one hooked up to a relay board, with a server running on the relay unit communicating with a JavaScript frontend.

This has a few advantages over our current way of doing things:

  • It’s more unified (so no more hunting for the right switch)
  • It’s more intuitive (we can put instructions for things right there on the screen)
  • It’s more extensible (without needing to add yet more boxes to the front of the stage)
  • Because it’s software-defined we can update it easily (without having to go under the stage to visit the equipment rack)
  • Since it’s connected to our network we can control things remotely (for example by remotely assisting a Centre user if they’re having trouble)
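
To make that a little more concrete, here’s a minimal sketch of what the relay-side server might look like, using Flask and gpiozero. The channel names, GPIO pins and port are placeholders; the real thing is still a proof-of-concept and will appear in the full write-up.

```python
from flask import Flask, jsonify
from gpiozero import OutputDevice

app = Flask(__name__)

# Placeholder channel-to-GPIO mapping; the real relay board wiring will differ.
CHANNELS = {
    "sound": OutputDevice(17),
    "stage_lighting": OutputDevice(27),
}

@app.route("/channels")
def list_channels():
    """Report the current state of every switched channel."""
    return jsonify({name: dev.value for name, dev in CHANNELS.items()})

@app.route("/channels/<name>/<state>", methods=["POST"])
def set_channel(name, state):
    """Turn a channel on or off, e.g. POST /channels/sound/on."""
    device = CHANNELS.get(name)
    if device is None:
        return jsonify({"error": "unknown channel"}), 404
    device.on() if state == "on" else device.off()
    return jsonify({name: device.value})

if __name__ == "__main__":
    # Listen on the local network so the touchscreen Pi (and remote helpers) can reach it.
    app.run(host="0.0.0.0", port=8080)
```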

When we’ve finished our proof-of-concept we’ll open-source the whole thing, as well as do a more complete write-up.

Making our orders of service more accessible

Our tagline begins with the words “where all find a welcome”, and we believe that this doesn’t just apply to the people greeting you at the door or to the people leading the service, but to the technology which supports us as well. That’s why when we heard from a member of our congregation with poor eyesight that our digital orders of service had given them a whole new way of experiencing services – by being able to scale up the text on their phone to a comfortable reading size – we decided to explore other ways we can make our orders of service more accessible.

We already try to make our website friendly for assistive technologies like screen readers (although we know we can do better, so we’re always making improvements), but we realised we could use the power of technology to put more accessible orders of service front and centre for a commonly sidelined disability: dyslexia.

We can do this thanks to a typeface called OpenDyslexic, which is designed specifically to combat some of the more common symptoms of dyslexia. Amongst other things, the letter shapes are all unique and have distinctive ‘heavy bottoms’ which help combat rotation and transposition.


The impact of streaming

Advent is a time for looking forward, but it’s also a time for taking stock and reflecting. Given it’s the end of the year, it’s also a great time for us to do some maths and figure out the environmental impact of our service streaming.

In the last year, people watched around 3,200 hours of video on our YouTube channel. Streaming an hour of video online causes around 36g of CO2 emissions. This means our streaming was responsible for around 0.12 tonnes of carbon emissions.
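
The sums behind that number are simple enough; as a quick sanity check:

```python
hours_watched = 3200   # approximate hours of video watched on our channel this year
grams_per_hour = 36    # estimated CO2 emitted per hour of streaming

tonnes = hours_watched * grams_per_hour / 1_000_000  # grams -> tonnes
print(f"{tonnes:.2f} tonnes of CO2")                 # 0.12 tonnes
```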

We’re committed to reducing our impact (in fact, the PCC recently approved a whole new policy and strategy on it), so we decided to offset these emissions. Unfortunately, the company we use for offsetting can only offset in minimums of half a tonne at a time. So we decided to round up, and offset a full tonne to make sure we were also mopping up streaming from 2020 which wasn’t included in the total, as well as cover any unaccounted-for costs in equipment purchases (where we’ve previously offset larger items, but not some consumables).

Next, we counted up the number of likes which people had given our content in the last year – 153 overall – and we’ve funded the planting of a tree for each one, which you can see in our forest. These trees will continue to absorb carbon into the future, as well as boost the local ecosystem.

We hope to repeat this exercise every year, not only making streaming our services carbon negative, but helping to build a greener future for everyone.

You can read more about our commitment to the environment, and discover what else we’re doing across the Church.