Remember how I mentioned that GitHub Actions is underrated? I’m going to show, at a high level, how GitHub Actions combined with PowerShell can save you time and make you more efficient.
What does it do?
- Web-scrapes the website into a PowerShell object
- Compares the scrape to the JSON “database” file (FidoKeys.json) of all the keys, matching by AAGUID
  - Adds a key to FidoKeys.json if it doesn’t exist
  - Removes a key from FidoKeys.json if it is no longer in the web scrape
- If it’s a new key
  - Checks whether the first word of the description matches the valid vendor list (Valid_Vendors.json) and, if it matches, adds the vendor
  - If it doesn’t have a valid vendor, it creates a GitHub issue for that vendor and key
- If it’s an existing key
  - Checks whether any of the properties have changed and updates FidoKeys.json
- If it’s a missing key
  - If the key is no longer in the web scrape, it removes it from FidoKeys.json
- Updates the merge dates in FidoKeys.json
  - If it checks for changes and finds none, it only updates databaseLastChecked
  - If it checks for changes and finds some, it updates both databaseLastChecked and databaseLastUpdated
- Creates GitHub issues for invalid vendors
  - If a vendor isn’t in the valid_vendors.json list, or the vendor name is blank, it automatically creates a GitHub issue for that key and invalid vendor name
  - Assigns me as the owner of the issue
- Closes GitHub issues for valid vendors
  - If a vendor now matches a vendor name in valid_vendors.json, it automatically closes the issue for the now-valid vendor
- Updates merge_log.md
  - merge_log.md is only updated when something has changed since the previous check
- Updates detailed_log.txt
  - This file is written on every run; if nothing changed since the previous check, it writes “No changes detected during this run”
It does all of that automatically once a day. I could run it more often, but I didn’t think that was necessary. Best of all, this is done for free: since it is a public repository, GitHub Actions minutes cost nothing. Today I’ll go over the GitHub Action; I’ll do another post going into detail on the PowerShell script side.
Let’s start from the beginning. First we have to name the GitHub Action; we’ll use “Main Entra Merge” in this case, since this workflow runs on the main branch and merges keys for Entra.
name: Main Entra Merge
Then we have to determine when it will run. What I like to do is always include a “workflow_dispatch:” trigger, since it lets you run the workflow manually for testing without waiting for any other trigger. In this case I also have it run at midnight, and any time there is a push or pull request to the main branch.
on:
  workflow_dispatch:
  schedule:
    - cron: '0 0 * * *'
  push:
    branches:
      - main
  pull_request:
    branches:
      - main
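One thing to keep in mind: cron schedules in GitHub Actions run in UTC. If you ever wanted more frequent runs, only the cron line changes; for example, a twice-daily schedule (hypothetical — not what this workflow uses) would look like:

```yaml
schedule:
  - cron: '0 0,12 * * *'  # midnight and noon UTC
```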
Next, we have to define which OS we want to run on. I usually use ubuntu-latest unless I have a real need for macOS or Windows: Windows minutes are billed at twice the Linux rate, and macOS at ten times. I know it’s free for me here, but why use resources that aren’t needed? You can also use different versions of Ubuntu (see GitHub’s runner documentation). Note that you need “jobs:” followed by the name of the job, or the workflow won’t work. Also, spacing is very important in YAML — it has burned me a few times.
jobs:
  merge-fido-data:
    runs-on: ubuntu-latest
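If you ever need reproducibility over convenience, you can pin a specific runner image instead of tracking the latest:

```yaml
runs-on: ubuntu-22.04  # pin a specific Ubuntu image instead of ubuntu-latest
```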
The workflow begins by checking out the repository to the runner using the actions/checkout@v4 action. This step ensures that all necessary files and scripts are available for subsequent steps.
- name: Checkout repository
  uses: actions/checkout@v4
  with:
    fetch-depth: 0
    ref: main
Next, it installs the PSParseHTML PowerShell module, which is essential for parsing HTML content in the scripts that follow.
- name: Install PSParseHTML Module
  shell: pwsh
  run: Install-Module -Name PSParseHTML -Force -Scope CurrentUser
The workflow runs a series of custom PowerShell scripts that perform data validation and merging:
- Validation scripts: Test-GHValidVendor.ps1 and Test-GHAAGUIDExists.ps1 ensure that the vendor information and AAGUIDs are valid.
- Data export and merge: Export-GHEntraFido.ps1 exports data from Microsoft Entra, and Merge-GHFidoData.ps1 merges it with the existing data.
- name: Run Merge-GHFidoData Script
  shell: pwsh
  env:
    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
    GITHUB_REPOSITORY: ${{ github.repository }}
  run: |
    Import-Module PSParseHTML
    . ./Scripts/Test-GHValidVendor.ps1
    . ./Scripts/Test-GHAAGUIDExists.ps1
    . ./Scripts/Export-GHEntraFido.ps1
    . ./Scripts/Merge-GHFidoData.ps1
- name: Read Environment Variables
  shell: bash
  run: |
    if [ -f ./Scripts/env_vars.txt ]; then
      echo "Setting environment variables from env_vars.txt"
      cat ./Scripts/env_vars.txt >> $GITHUB_ENV
    else
      echo "env_vars.txt not found."
    fi
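For context, env_vars.txt is expected to contain one KEY=value pair per line, which is the format $GITHUB_ENV understands. Here is a hypothetical example — the values below are made up purely for illustration. Multi-value entries are percent-encoded and joined with %0A so each variable stays on a single line:

```text
ISSUE_ENTRIES=Invalid%20vendor%20for%20AAGUID%2000000000-0000-0000-0000-000000000000%7CVendor%20not%20in%20Valid_Vendors.json%7CInvalidVendor
KEYS_NOW_VALID=00000000-0000-0000-0000-000000000000
VENDORS_NOW_VALID=
```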
For transparency, the workflow outputs the values of key environment variables, which aids in debugging and verification. This step could be removed, but I’m leaving it in for now for testing.
- name: Debug - Display ISSUE_ENTRIES, KEYS_NOW_VALID, and VENDORS_NOW_VALID Environment Variables
  shell: bash
  run: |
    echo "ISSUE_ENTRIES: $ISSUE_ENTRIES"
    echo "KEYS_NOW_VALID: $KEYS_NOW_VALID"
    echo "VENDORS_NOW_VALID: $VENDORS_NOW_VALID"
Using actions/github-script@v6, the workflow runs a JavaScript snippet that automates issue creation and closure based on the validation results:
- Creates issues for any data discrepancies found.
- Closes issues that were previously reported and are now resolved.
- Assigns issues with certain labels automatically to DevClate.
- name: Close Fixed Issues and Create New Issues
  uses: actions/github-script@v6
  with:
    github-token: ${{ secrets.GITHUB_TOKEN }}
    script: |
      const issueEntriesRaw = process.env.ISSUE_ENTRIES || '';
      const issueEntries = issueEntriesRaw.split('%0A').map(entry => decodeURIComponent(entry)).filter(entry => entry.trim() !== '');

      if (issueEntries.length === 0) {
        console.log('No new issue entries found.');
      } else {
        for (const entry of issueEntries) {
          const parts = entry.split('|');
          if (parts.length < 2) {
            console.error(`Invalid entry format: ${entry}`);
            continue;
          }
          const [issueTitle, issueBody, issueLabel] = parts;
          console.log(`Processing issue: ${issueTitle}`);

          const { data: issues } = await github.rest.issues.listForRepo({
            owner: context.repo.owner,
            repo: context.repo.repo,
            state: 'open',
            labels: 'auto-generated',
          });

          const existingIssue = issues.find(issue => issue.title === issueTitle);
          if (!existingIssue) {
            const assignees = [];
            if (issueLabel === 'InvalidVendor' || issueLabel === 'DuplicateEntry') {
              assignees.push('DevClate');
            }

            await github.rest.issues.create({
              owner: context.repo.owner,
              repo: context.repo.repo,
              title: issueTitle,
              body: issueBody,
              labels: issueLabel ? ['auto-generated', issueLabel] : ['auto-generated'],
              assignees: assignees,
            });
            console.log(`Issue created: ${issueTitle}`);
          } else {
            console.log(`Issue already exists: ${issueTitle}`);
          }
        }
      }

      // Close issues for keys (AAGUIDs) that are now valid
      const keysNowValidRaw = process.env.KEYS_NOW_VALID || '';
      const keysNowValid = keysNowValidRaw.split('%0A').map(entry => decodeURIComponent(entry)).filter(entry => entry.trim() !== '');

      if (keysNowValid.length === 0) {
        console.log('No keys have become valid.');
      } else {
        console.log('Keys that are now valid:', keysNowValid);
        for (const aaguid of keysNowValid) {
          const { data: issues } = await github.rest.issues.listForRepo({
            owner: context.repo.owner,
            repo: context.repo.repo,
            state: 'open',
            labels: ['auto-generated', 'InvalidVendor'],
            per_page: 100,
          });

          for (const issue of issues) {
            if (issue.title.includes(aaguid)) {
              await github.rest.issues.update({
                owner: context.repo.owner,
                repo: context.repo.repo,
                issue_number: issue.number,
                state: 'closed',
                state_reason: 'completed',
              });
              await github.rest.issues.createComment({
                owner: context.repo.owner,
                repo: context.repo.repo,
                issue_number: issue.number,
                body: `The vendor for key with AAGUID '${aaguid}' is now valid. This issue is being closed automatically.`,
              });
              console.log(`Closed issue for key with AAGUID: ${aaguid}`);
            }
          }
        }
      }
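To make the ISSUE_ENTRIES format concrete, here is a small standalone sketch (with made-up values) of the round trip the script assumes: the PowerShell side URI-encodes each “title|body|label” entry and joins the entries with a literal %0A, and the JavaScript above reverses that:

```javascript
// Hypothetical entries as the PowerShell side might build them ("title|body|label")
const entries = [
  'Invalid vendor for AAGUID 00000000-0000-0000-0000-000000000000|Vendor not in Valid_Vendors.json|InvalidVendor',
  'Invalid vendor for AAGUID 11111111-1111-1111-1111-111111111111|Vendor name is blank|InvalidVendor',
];

// Encoding side: URI-encode each entry, then join with a literal %0A separator.
// encodeURIComponent never emits "%0A" unless the input contains a newline,
// so the separator is unambiguous for single-line entries.
const raw = entries.map(encodeURIComponent).join('%0A');

// Decoding side: mirrors the github-script step above
const decoded = raw.split('%0A').map(e => decodeURIComponent(e)).filter(e => e.trim() !== '');
const [title, body, label] = decoded[0].split('|');
```

This round-trips cleanly because the pipe character is encoded as %7C inside each entry, so splitting on a literal “|” after decoding recovers the three fields.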
The workflow extracts the newest entries from merge_log.md and detailed_log.txt and appends them to the GitHub Actions summary for easy access.
- name: Display Merge Log
  shell: bash
  run: |
    # Append the newest log entries to the Actions run summary (illustrative
    # sketch; the exact extraction and formatting is up to you)
    echo "## Merge Log" >> "$GITHUB_STEP_SUMMARY"
    tail -n 20 merge_log.md >> "$GITHUB_STEP_SUMMARY"
    tail -n 40 detailed_log.txt >> "$GITHUB_STEP_SUMMARY"
Configuring Git ensures that any commits made by the workflow are properly attributed.
- name: Configure Git
  run: |
    git config --global user.name 'D--ate'
    git config --global user.email 'c---@--t.com'
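As a side note, if you’d rather not commit under a personal identity, a commonly used alternative is the github-actions bot identity:

```yaml
- name: Configure Git
  run: |
    git config --global user.name 'github-actions[bot]'
    git config --global user.email '41898205+github-actions[bot]@users.noreply.github.com'
```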
Finally, the workflow commits the changes made to the data and logs, pushing them back to the main branch.
- name: Commit changes
  run: |
    git add Assets/FidoKeys.json merge_log.md detailed_log.txt
    git commit -m "Update Fidokeys.json, merge_log.md, and detailed_log.txt" || echo "No changes to commit"
- name: Push changes
  uses: ad-m/github-push-action@v0.6.0
  with:
    github_token: ${{ secrets.GITHUB_TOKEN }}
    branch: main
And that’s it! It’s completely OK if you don’t fully understand it all; I wanted to give you a quick breakdown of how it works in case you have a project you’re working on, or have been holding off because you didn’t know this was possible. If you have any tips, I’d be glad to talk, as I’m always open to improvement and learning new ideas.
If you want to see this in action, check out https://github.com/DevClate/EntraFIDOFinder
I also have a PowerShell module that works with this and lets you find and filter which FIDO2 keys are Entra attestation approved; it can be downloaded there or from the PowerShell Gallery.
I even made an interactive website at https://devclate.github.io/EntraFIDOFinder/Explorer/
I will be doing a breakdown of the PowerShell side of this in part 2!
Hope this was helpful and have a great day!