Jay's blog

Adopt These Trackers

A recent blog post by Kian Bradley called "Resurrecting a dead torrent tracker and finding 3 million peers" is making the rounds, and it captured my imagination. There's something undeniably punk rock about adopting a dead tracker and bringing it back to life.

I was curious how someone might find old tracker domains to resurrect, though. Kian seemed to find his through natural usage of BitTorrent. I had a different idea.

This git repository has an automatically updated list of valid tracker URLs as well as an automatically updated block list. It also has 50k stars on GitHub, so I figure it's a decent source of data.

I vibe-coded a Nushell script that digs through the git history for tracker domains that were removed from the tracker list but never added to the block list. It writes the results out in files of at most 100 entries each, because the final step is embarrassingly manual: I pasted the removed-but-not-blocked domains into the bulk search feature of my preferred registrar, which caps bulk searches at 100 entries at a time.

#!/usr/bin/env nu

def parse_file [path: string] {
  open $path
  | lines
  | str trim
  | where {|x| $x != ""}  # blank lines (and runs of them) drop out here
  | uniq
}

def get_all_commits [] {
  ^git rev-list master
  | lines
}

def build_blacklist_set [] {
  get_all_commits
  | each {|rev|
      let file = (try { ^git show $"($rev):blacklist.txt" } catch { "" })
      $file
      | lines
      | str trim
      | where {|x| $x != ""}
  }
  | flatten
  | uniq
}

def main [] {
  let all_commits = (get_all_commits | reverse)
  let blacklist_set = (build_blacklist_set)

  mut prev_trackers = []
  mut removed_total = []

  for commit in $all_commits {
    let file = (try { ^git show $"($commit):trackers_all.txt" } catch { "" })
    let trackers = (
      $file
      | lines
      | str trim
      | where {|x| $x != ""}
      | uniq
    )

    if ($prev_trackers | length) > 0 {
      let removed = ($prev_trackers | where {|x| $x not-in $trackers })
      $removed_total = ($removed_total | append $removed)
    }

    $prev_trackers = $trackers
  }

  let final_list = (
    $removed_total
    | uniq
    | where {|x| $x not-in $blacklist_set }
    | each {|uri|
        let host = (try { $uri | url parse | get host } catch { "" })
        let parts = ($host | split row ".")
        if ($parts | length) >= 2 {
          # naive root-domain guess: keep the last two labels
          # (wrong for multi-part suffixes like .co.uk)
          $parts | last 2 | str join "."
        } else {
          $host  # fall back to the full host if the URL failed to parse
        }
    }
    | where {|x| $x != ""}
    | uniq
  )

  # Save output in batches of 100
  let chunked = ($final_list | chunks 100)

  let saved_files = $chunked
  | enumerate
  | each {|row|
      # right-align so index 1 becomes "001" rather than "100"
      let idx = ($row.index | into string | fill -a right -c '0' -w 3)
      let filename = $"output_part_($idx).txt"
      $row.item | str join "\n" | save --force $filename
      $filename
  }

  $saved_files
}
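The git trick the script leans on can be seen in miniature with plain git and shell: `git show <rev>:<path>` prints a file as it existed at a commit, so you can replay history oldest-first and diff the list without ever checking anything out. Here's a sketch against a throwaway repo with made-up tracker URLs:

```shell
set -e
# Build a tiny disposable repo in which one tracker URL gets removed.
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo
printf 'udp://old.example.com:1337/announce\nudp://kept.example.com:1337/announce\n' > trackers_all.txt
git add trackers_all.txt
git commit -qm 'add trackers'
printf 'udp://kept.example.com:1337/announce\n' > trackers_all.txt
git commit -qam 'remove a tracker'

# Replay history oldest-first, collecting lines that vanish between commits.
prev=""
removed=""
for rev in $(git rev-list --reverse HEAD); do
  cur=$(git show "$rev:trackers_all.txt" 2>/dev/null || true)
  if [ -n "$prev" ]; then
    printf '%s\n' "$cur" > cur.txt
    # lines of the previous snapshot that exactly match nothing in the current one
    removed="$removed $(printf '%s\n' "$prev" | grep -vxFf cur.txt || true)"
  fi
  prev="$cur"
done
echo "removed:$removed"
```

Running this prints the `old.example.com` URL as removed and stays quiet about the one that survived, which is exactly the set the Nushell script then checks against the block list.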

After a little copy-pasta, here are the available domain names that were once trackers:

This was a quick little project that took me all of 20 minutes to satisfy my curiosity. If I cared more deeply about answering this question, I might be inclined to try to scrape torrent search websites to collect tracker URLs from their magnet links instead of relying on one GitHub repository. I'd also want to figure out a way to automate checking if domains are available for sale. That data appears to be gated behind services that require registration and might cost money.
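One rough, free signal I might try first is RDAP, the structured successor to WHOIS. This is only a sketch I haven't wired into the pipeline: it assumes the public rdap.org redirector, that the TLD offers RDAP at all, and that an HTTP 404 (no registration record) is close enough to "available" — a lapsed domain can still be stuck in a redemption period.

```shell
# Hypothetical helper: ask the rdap.org redirector for a domain's RDAP record
# and report only the final HTTP status code. Assumption: 404 means no
# registration record exists, which is a proxy for (not a guarantee of)
# the domain being available to register.
rdap_status() {
  curl -sL -o /dev/null -w '%{http_code}' "https://rdap.org/domain/$1"
}

# Usage (needs network access):
#   rdap_status example.com        # registered domains return 200
#   rdap_status some-dead-tracker.example   # unknown records return 404
```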

#bittorrent #nushell