Friday, November 21, 2025

AutoPkg GitHub Actions and Issues

Automating AutoPkg

For many years we have been running AutoPkg recipes daily using tools like Jenkins, GitLab CI, and GitHub Actions. These recipes feed our endpoint management systems and fill a niche: quickly packaging and caching software for deployment. While we were able to receive status emails for each run of recipes, one area of improvement was to automatically create issues for recipes that failed. If that sounds like an interesting topic, keep reading!

Auto-Running Recipes

Setting up GitHub Actions to run AutoPkg recipes is covered in many places, so I won't dive too deep into that area. It suffices to say that we are using GitHub Actions with an on-premises Mac mini that was set up to run the AutoPkg recipes locally. The GitHub Action we use triggers the run of these recipes using a text file that lists each one. This is the command we use to trigger the run and write the logs to a file named "autopkg-log.txt":

sudo -H -u macadmin bash -c '/usr/local/bin/autopkg run --recipe-list ~/Library/AutoPkg/RecipeRepos/com.github.company.autopkg-recipes/_common.txt | tee ${{ github.workspace }}/autopkg-log.txt'

Our GitHub repo of recipes, "com.github.company.autopkg-recipes", contains the "_common.txt" file that lists the name of each recipe to run, e.g. "GoogleChrome.jamf". Lines that are commented out with "#" are skipped. The results of the AutoPkg run are sent both to STDOUT, for showing results in the GitHub Actions log, and to the "autopkg-log.txt" file for our processing. That file lives in a local copy of the repo on the Mac mini; we do not keep it, and it is overwritten on each run. Thanks to AutoPkg displaying error results in a nice block format, we can scrape for this within the file and do something with it!
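For illustration, a trimmed-down "_common.txt" built from the recipe names mentioned in this post might look like this (the commented-out line is a made-up placeholder to show the skip behavior):

GoogleChrome.jamf
Anaconda3-PSU.jamf
Anki-PSU.jamf
# PausedRecipe.jamf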

Getting JSON-y

Processing text is the next step in our quest to determine the detrimental recipes. Another GitHub Action takes over to parse the results of the text file into a JSON format that is easier to iterate over and automate with. The following is an example of failed-recipe output that we are looking to create issues for:

The following recipes failed:
    Anaconda3-PSU.jamf
        Error in local.jamf.Anaconda3-PSU: Processor: URLTextSearcher: Error: No match found on URL: https://www.anaconda.com/download/success
    Anki-PSU.jamf
        Error in local.jamf.Anki-PSU: Processor: CodeSignatureVerifier: Error: Code signature verification failed. Note that all verifications can be disabled by setting the variable DISABLE_CODE_SIGNATURE_VERIFICATION to a non-empty value.

Getting some assistance from the Internet led me to a solution for creating a JSON array for the issues. It uses a simple shell script to parse each failure and create a new "title" and "body" JSON entry in a list. This list is then placed into the "issues" JSON key and saved in a file locally:

- name: Create Json Errors
              id: extract
              run: |
                  FILE="${{ github.workspace }}/autopkg-log.txt"   # adjust to your file path

                  # Grab the section starting at "The following recipes failed:"
                  SECTION=$(awk '/The following recipes failed:/,/^[[:space:]]*$/' "$FILE")

                  # Parse into JSON
                  JSON=$(echo "$SECTION" | awk '
                      BEGIN { RS="\n"; FS=":"; recipe=""; error=""; first=1 }
                      /^    / {
                          if ($0 ~ /^[[:space:]]+[A-Za-z0-9.-]+$/) {
                              recipe=$1; gsub(/^[[:space:]]+/, "", recipe);
                          } else {
                              error=$0; gsub(/^[[:space:]]+/, "", error);
                              if (!first) { printf(","); } else { first=0; }
                              printf("{\"title\":\"%s\",\"body\":\"%s\"}", recipe, error);
                          }
                      }
                  ')

                  echo "{\"issues\":[${JSON}]}" > ${{ github.workspace }}/autopkg-log.json

We're using the "github.workspace" variable to help ensure the file is written and available locally. After this action runs, the "autopkg-log.json" file is available for our next step.
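For reference, running that step against the sample failure block above should produce JSON along these lines (the file itself is written on a single line; it is wrapped here for readability):

{"issues":[
{"title":"Anaconda3-PSU.jamf","body":"Error in local.jamf.Anaconda3-PSU: Processor: URLTextSearcher: Error: No match found on URL: https://www.anaconda.com/download/success"},
{"title":"Anki-PSU.jamf","body":"Error in local.jamf.Anki-PSU: Processor: CodeSignatureVerifier: Error: Code signature verification failed. Note that all verifications can be disabled by setting the variable DISABLE_CODE_SIGNATURE_VERIFICATION to a non-empty value."}
]}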

Making Issues

The Internet was a lovely help again with this task, where we want to process the JSON results and create a new issue for each recipe, if one does not already exist. Since we can iterate through the "issues" JSON and retrieve each "title" and "body", it's a simple github-script away from wrapping it all with a nice bow:

- name: Parse JSON and create issues
              uses: actions/github-script@v7
              with:
                  script: |
                      const fs = require('fs');
                      const data = fs.readFileSync('${{ github.workspace }}/autopkg-log.json', 'utf8');
                      const issues = JSON.parse(data).issues;

                      for (const issue of issues) {
                          // Search for existing issues with the same title
                          const existing = await github.rest.issues.listForRepo({
                              owner: context.repo.owner,
                              repo: context.repo.repo,
                              state: 'open',
                              per_page: 100
                          });

                          const found = existing.data.find(i => i.title === issue.title);

                          if (found) {
                              console.log(`Issue "${issue.title}" already exists (#${found.number}), skipping.`);
                          } else {
                              await github.rest.issues.create({
                                  owner: context.repo.owner,
                                  repo: context.repo.repo,
                                  title: issue.title,
                                  body: issue.body,
                                  labels: ["bug"]
                              });
                              console.log(`Created new issue: "${issue.title}"`);
                          }
                      }

This code finds the local file in the location we saved it, which requires running the action in the same set of steps in GitHub Actions, or at least on the same runner. Once the file is located, we process each issue using the JSON "title" and "body". Existing issues are checked for matching titles, and new issues are only created when they do not already exist.
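One configuration note worth checking (this depends on your repository settings and isn't shown in the workflow above): the GITHUB_TOKEN that github-script uses needs write access to issues. If your repo's default workflow permissions are read-only, adding something like this at the workflow or job level takes care of it:

permissions:
    contents: read
    issues: write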

Final Thoughts

It is possible to pass the json between steps, and jobs, using GitHub variables, which may be better way for those who are running AutoPkg completely in the cloud. Creating and relying on a file isn't necessarily the best method, however it does allow for a backup of the issues locally on a runner and for other processes to access the same data.
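As a rough sketch of that approach, and not what we actually run: the parsing step could publish the JSON as a step output (this assumes the JSON stays on a single line, as it does above, and reuses the "extract" step id), and the github-script step could then read it from the environment instead of a file:

- name: Create Json Errors
  id: extract
  run: |
      # ...same parsing as above...
      echo "issues_json={\"issues\":[${JSON}]}" >> "$GITHUB_OUTPUT"

- name: Parse JSON and create issues
  uses: actions/github-script@v7
  env:
      ISSUES_JSON: ${{ steps.extract.outputs.issues_json }}
  with:
      script: |
          const issues = JSON.parse(process.env.ISSUES_JSON).issues;
          // ...same issue creation loop as above...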

As with any automation, testing is a big part of ensuring this will work in the way you expect. For example, when a processor we use was updated, ALL of the recipes needed updating. Does it make sense to create an issue for each one when there is a common error? This example does not take those considerations into account and should be used as a starting point for your own solutions.
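For example, one quick way to spot a shared failure before filing anything, assuming jq is available on the runner (a sketch, not part of our workflow), is to pull the processor name out of each error body and count how often it appears. This relies on the "Processor: <Name>:" format shown in the failure output above:

jq -r '.issues[].body' "${GITHUB_WORKSPACE}/autopkg-log.json" | awk -F': ' '{print $3}' | sort | uniq -c | sort -rn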

-rusty

Updates:

Updated "autopkg run" to use "--recipe-list": https://github.com/autopkg/autopkg/wiki/Running-Multiple-Recipes#recipe-lists Thanks elios!
