Automating updates to a digital vigil
Monday, November 24, 2025
November 20th each year is Transgender Day of Remembrance (TDoR). It is a day when we memorialize those who've been lost to transphobia, through violence or suicide. And each year, I make the most difficult git commit of the year, updating the list of names in the digital vigil that I made a few years ago.
I was late doing it this year, and last year I didn't do it until November 20th itself. Ideally, it would be done early. I keep procrastinating because it's tough emotional work. Next year, I want it done early, so... how do I actually get myself to do that?
The solution is either more therapy or more automation. So naturally, I decided on the automation! (Don't worry, I have a therapist that I love.)
We'll need to solve two main problems: updating the lists of names, and deploying it on a schedule.
Updating the names
The digital vigil is a static site, and we need to know all the names when it's built. All the names are stored in a couple of arrays of strings in a Rust file[1]. We'll want to update that file, which means we get to do codegen, baby!
Let's tackle getting the names first.
An authoritative source of names for TDoR vigils is the website Trans Lives Matter. This is where I download the report from each year for manual updates. It's a great source of data, and I'm only using a fraction of what is there.
I decided to write a Python script to pull the data. I got partway through the script using the same endpoint the human-consumable webpage offers for a download, when I realized it gives me a zip file. After opening a few too many tabs, I remembered: there's an API for this! Of course there's an API, and like all principal engineers working on their hobby projects, I didn't remember to check the obvious things first[2]. After switching to the API, I got JSON directly, and the data was super easy to retrieve.
Here's what that looks like.
import http.client
import json
import os

api_key = os.environ.get("TDOR_API_KEY")

def get_report(year, country="all"):
    """Retrieves the data for a given year's vigil.

    This will request the data from October 1 of the previous year through
    September 30 of the requested year.

    Params:
    - year: (int) what year's vigil the data is for
    - country: (str) scope of the data; default="all"
    """
    from_date = f"{year-1}-10-01"
    to_date = f"{year}-09-30"
    headers = {"User-Agent": "tdor-digital-vigil-bot"}
    path = f"/api/v1/reports/?key={api_key}&from={from_date}&to={to_date}&country={country}&category=&filter="
    conn = http.client.HTTPSConnection("tdor.translivesmatter.info")
    conn.request("GET", path, None, headers)
    resp = conn.getresponse()
    if resp.status != 200:
        print(f"Error: expected 200, got {resp.status} ({resp.reason})")
        exit(1)
    body = resp.read()
    data = json.loads(body)
    return data
The next portion is fun and straightforward: turning this into some Rust code! "Codegen" sounds fancy, but for a lot of problems like this, codegen can be really simple.
In this case, we just have a file containing two static arrays. The code generation is easy: iterate through our lists of names, bracketing them with the lines that open and close each declaration.
It looks like this.
usa_data = get_report(2025, "usa")
all_data = get_report(2025, "all")

with open("src/names.rs", "w+") as f:
    all_names = [r["name"] for r in all_data["data"]["reports"]]
    usa_names = [r["name"] for r in usa_data["data"]["reports"]]

    f.write(f"pub const FULL_NAMES: [&'static str; {len(all_names)}] = [\n")
    for name in all_names:
        escaped = name.replace('"', '\\"')
        f.write(f'    "{escaped}",\n')
    f.write("];\n")
    f.write("\n")
    f.write(f"pub const US_NAMES: [&'static str; {len(usa_names)}] = [\n")
    for name in usa_names:
        escaped = name.replace('"', '\\"')
        f.write(f'    "{escaped}",\n')
    f.write("];\n")
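One subtle bit in that loop is the quote escaping: a name containing a double quote has to come out as a valid Rust string literal, or the generated file won't compile. The script only escapes double quotes; a standalone helper like this hypothetical rust_string_literal (a sketch, not part of the script, which also escapes backslashes for extra safety) captures the idea:

```python
def rust_string_literal(name):
    # Escape backslashes first, then double quotes, so the name
    # round-trips as a valid Rust string literal.
    escaped = name.replace("\\", "\\\\").replace('"', '\\"')
    return f'"{escaped}"'

print(rust_string_literal('Jane "JJ" Doe'))  # → "Jane \"JJ\" Doe"
```

Escaping backslashes before quotes matters; doing it in the other order would double-escape the backslashes the quote step just introduced.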
That generates a file like this.
pub const FULL_NAMES: [&'static str; 367] = [
"Name Withheld",
...
];
pub const