At Drupalcon North America this year (2021), I gave a presentation called "Less human interaction, more security?". This pandemic-ridden year meant it was an online version of the conference, which makes presenting a bit different. One interesting aspect of online presentations is that they enable feedback and chatter in the comments section. First of all, this meant that I could get feedback on my GIFs even though I wasn't in the same room as the session participants. As we all know, GIFs are an important part of every presentation. Second, I also received a lot of relevant and interesting comments on other matters related to the presentation. Today, I want to talk about one of the subjects that came up: is it scary to let a robot update and deploy your website?

For those of you who didn't attend the presentation, let's do a quick recap. First, I talked about how to set up fully automatic discovery, patching, and deployment of Drupal security updates using violinist.io and Gitlab CI. Next, I demonstrated how a new Drupal release was picked up automatically, the update was applied, and the change was pushed to open a Gitlab merge request for the project in question.

The merge request was then analyzed and identified as a security update, which enabled auto-merge for it.
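For the curious: auto-merge in Gitlab boils down to the "merge when pipeline succeeds" feature, which can be toggled through the Gitlab API. The call below is only meant to illustrate the underlying mechanism, not necessarily how violinist.io does it, and the project ID, merge request IID, and token are placeholders.

```
# Enable "merge when pipeline succeeds" on an existing merge request.
# 12345 (project ID), 42 (MR IID) and $GITLAB_TOKEN are placeholders.
curl --request PUT \
  --header "PRIVATE-TOKEN: $GITLAB_TOKEN" \
  "https://gitlab.com/api/v4/projects/12345/merge_requests/42/merge?merge_when_pipeline_succeeds=true"
```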

Finally, it was tested according to the continuous integration setup for the project, and deployed to the production server used in the demonstration.
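To make that last step a little more concrete, here is a minimal sketch of what a Gitlab CI configuration for this kind of workflow could look like. It is not the exact setup from the presentation; the image, job names, paths, and the DEPLOY_HOST variable are placeholders you would replace with your own.

```yaml
# .gitlab-ci.yml (sketch): run tests on every push, deploy only the default branch.
stages:
  - test
  - deploy

test:
  stage: test
  image: php:8.1-cli
  script:
    # Install dependencies and run the project's own test suite.
    # A real Drupal functional test run also needs a database and web
    # server service; that wiring is omitted here.
    - curl -sS https://getcomposer.org/installer | php
    - php composer.phar install --no-interaction --no-progress
    - vendor/bin/phpunit

deploy:
  stage: deploy
  rules:
    # Only deploy what has already been merged to the default branch.
    - if: '$CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH'
  script:
    # The SSH key for the deploy user is stored as a protected CI variable
    # (setup omitted here), so only this pipeline can reach production.
    - ssh deploy@$DEPLOY_HOST 'cd /var/www/example && git pull && vendor/bin/drush deploy'
```

With something like this in place, the moment a security update is auto-merged to the default branch, the deploy job takes it the rest of the way to production.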

No human hands were used to write commands or click buttons in the demonstration. What does this mean? Robots taking over? Reason for anxiety attacks? In this case, it's a clear "no" from me. Let me explain why.

Several things could be considered scary about letting a robot update and deploy your website. First, you're somehow allowing a third party access to your production server. In my opinion, this can actually be a step towards making your deployments more secure. Let me explain: maybe your current workflow involves a person remotely accessing the server and running deployment commands. Depending on how that remote access is configured, it can create a larger attack surface than making sure only the machine deploy user can deploy to the server. There are of course more aspects to this question, but my point still stands: moving towards automated-only deployments will make your application more secure, more robust and more predictable.
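As an illustration of what "only the machine deploy user can deploy" could look like in practice: if deployment happens over SSH, the deploy key can be restricted so it is only able to run a single deployment script. The user name, key, and script path below are placeholders.

```
# Sketch of /home/deploy/.ssh/authorized_keys on the production server.
# This key can run the deployment script and nothing else.
command="/usr/local/bin/deploy-site.sh",no-pty,no-port-forwarding,no-agent-forwarding,no-X11-forwarding ssh-ed25519 AAAA... ci-deploy-key
```

Compare that with a handful of people holding full shell access for manual deployments, and the reduced attack surface becomes easier to see.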

Another potential cause of concern about letting a robot update and deploy your website is that an update might break the website. Before deploying a change to a production server, people often like to check that their website still works in a real browser. I acknowledge this need for manual verification, but I believe there's a better way. Take the task of updating Drupal core. I'll wager that a Drupal core release has far more test coverage than any website you are currently maintaining. If you combine this with some basic functional testing of your own website, an update like this is probably one of the safest code changes you can make. Furthermore, having a robot do it for you makes it very unlikely that the update commands are run incorrectly, which can happen when they are typed by human hands.
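As an example of what "basic functional testing" can mean, here is a minimal Drupal functional test sketch. The module name mymodule is a placeholder; the test simply checks that the front page still responds after a deployment.

```php
<?php

namespace Drupal\Tests\mymodule\Functional;

use Drupal\Core\Url;
use Drupal\Tests\BrowserTestBase;

/**
 * Smoke test: the front page still responds after an update.
 */
class FrontPageSmokeTest extends BrowserTestBase {

  /**
   * {@inheritdoc}
   */
  protected $defaultTheme = 'stark';

  /**
   * {@inheritdoc}
   */
  protected static $modules = ['node'];

  /**
   * The front page should load with a 200 response.
   */
  public function testFrontPageLoads(): void {
    $this->drupalGet(Url::fromRoute('<front>'));
    $this->assertSession()->statusCodeEquals(200);
  }

}
```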

Of course, sometimes automatic updates will crash your site in one way or another. I find this unproblematic. Not that I want websites to crash, but it happens. It also happens when a human being is the author of the code changes. However, I prefer that automatic updates crash my site, because this uncovers missing test coverage and makes me more confident about future updates. Say your Drupal website relies on the (fantastic) Metatag module, but one particular update made your meta tags stop working on a particular content type. How come the update was deployed, then? Because you did not have test coverage for that functionality. By learning from this, you can expand your test coverage and feel even more confident about automatically updating the Metatag module the next time there is a new release.

Don't get me wrong. You don't have to wait for your site to crash to learn that you're missing test coverage. Start by introducing automated updates to your website through merge requests. When a new merge request comes along for an update, you'll get a feeling for how confident you are about deploying it. Maybe it's the Metatag module, and you know you have to test it manually to make sure it still works. That is an indication that you are lacking test coverage. To be able to automatically deploy the next Metatag release, just write tests for the things you are currently testing manually.
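To make that concrete, here is a sketch of what such a test could look like for the Metatag example. It assumes your Metatag configuration outputs a description meta tag for a content type called article; the module name, the content type, and the expected tag are all assumptions for this sketch, so adjust them to match what you actually check by hand.

```php
<?php

namespace Drupal\Tests\mymodule\Functional;

use Drupal\Tests\BrowserTestBase;

/**
 * Checks that meta tags are still rendered for article nodes.
 */
class ArticleMetatagsTest extends BrowserTestBase {

  /**
   * {@inheritdoc}
   */
  protected $defaultTheme = 'stark';

  /**
   * {@inheritdoc}
   */
  protected static $modules = ['node', 'metatag'];

  /**
   * An article page should still contain a description meta tag.
   *
   * This assumes the site's Metatag configuration outputs a description
   * for articles (for example based on the node summary).
   */
  public function testArticleHasDescriptionMetaTag(): void {
    $this->drupalCreateContentType(['type' => 'article']);
    $node = $this->drupalCreateNode([
      'type' => 'article',
      'title' => 'Automatic updates are not that scary',
      'body' => [['value' => 'Robots are mostly friendly.']],
    ]);
    $this->drupalGet($node->toUrl());
    $this->assertSession()->elementExists('css', 'meta[name="description"]');
  }

}
```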

Ultimately, my claim is that updating automatically will make your updates more secure and predictable. Over time, it will also increase your test coverage and the general robustness of your project. And at a certain point, maybe when your site is in more of a maintenance state, you can merge and deploy all updates automatically.

Now that these concerns have been addressed, please check out the presentation and make up your own mind about how scary automatic updates really are. And if you want to start updating automatically right away, here is a link to violinist.io!

Disclaimer: I am the founder of violinist.io