
Clatent

Technology | Fitness | Food

First Snowfall of the Season

November 22, 2024 by ClaytonT

First snowfall in NY! Tell me about the first time PowerShell just clicked… what was that moment?

Mine was adding new users to AD (it’s usually AD or Exchange, right?) and remembering the standard fields that needed to be filled out with their default values. On top of that, when we had multiple new hires it was so time-consuming clicking through the GUI. That’s when I learned how to create AD users from a CSV. It was life changing, and I realized I needed to do more of this.
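
If you haven’t tried it, the pattern looks roughly like this; a minimal sketch with made-up CSV column names, OU, and domain, assuming the ActiveDirectory module is installed:

# Minimal sketch: bulk-create AD users from a CSV (column names, OU, and domain are illustrative)
Import-Module ActiveDirectory

Import-Csv -Path .\NewHires.csv | ForEach-Object {
    New-ADUser -Name "$($_.FirstName) $($_.LastName)" `
        -GivenName $_.FirstName `
        -Surname $_.LastName `
        -SamAccountName $_.Username `
        -UserPrincipalName "$($_.Username)@domain.com" `
        -Path 'OU=Staff,DC=domain,DC=com' `
        -AccountPassword (ConvertTo-SecureString $_.TempPassword -AsPlainText -Force) `
        -Enabled $true
}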

Now I have a module that is meant to create Microsoft 365 test environments but can be used in production to create users, groups, and much more from an Excel file, without even having Excel on the computer! You can check out the module below.

We all start somewhere, and I love hearing about the light bulb moment that triggers the snowball effect!

365AutomatedLab
PowerShell Gallery: https://www.powershellgallery.com/packages/365AutomatedLab/2.11.0
GitHub: https://github.com/DevClate/365AutomatedLab

Tagged With: 365, 365AutomatedLab, AD, Automation, PowerShell

GitHub Actions and PowerShell: The Underdog

November 15, 2024 by ClaytonT

Remember how I mentioned that GitHub Actions are underrated? I’m going to show, at a high level, how GitHub Actions with PowerShell can save you time and make you more efficient.

What does it do?

  • Web scrapes the website into a PowerShell object
  • Compares the web scrape to the JSON “database” file (FidoKeys.json) of all the keys (a simplified sketch of this matching logic follows the list)
    • Matches by AAGUID
      • Adds to FidoKeys.json if it doesn’t exist
      • Removes from FidoKeys.json if it’s not in the web scrape anymore
    • If new key
      • Checks whether the first word in the description matches the valid vendor list (Valid_Vendors.json) and, if it matches, adds the vendor
        • If it doesn’t have a valid vendor, it creates a GitHub issue for that vendor and key
    • If existing key
      • Checks whether any of the properties have changed and updates FidoKeys.json
    • If missing key
      • If the key is no longer in the web scrape, it removes it from FidoKeys.json
  • Updates merge dates in FidoKeys.json
    • If there are no changes since the last check, it only updates databaseLastChecked
    • If there are changes, it updates both databaseLastChecked and databaseLastUpdated
  • Creates GitHub issues for invalid vendors
    • If a vendor isn’t in the valid_vendors.json list, or the vendor name is blank, it automatically creates a GitHub issue for that key and invalid vendor name
    • Assigns me as the owner of the issue
  • Closes GitHub issues for valid vendors
    • If a vendor now matches a vendor name in valid_vendors.json, it automatically closes the issue for the now-valid vendor
  • Updates merge_log.md
    • It only updates merge_log.md when a new change has occurred since the previous check
  • Updates detailed_log.txt
    • This is written to every time, but if nothing has changed since the previous check it writes “No changes detected during this run”
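
I’ll save the actual PowerShell for part 2, but here is a simplified, self-contained sketch of that matching logic. The sample data and property names are illustrative only, not the real Merge-GHFidoData.ps1:

# Simplified sketch of the AAGUID matching logic (sample data, not the real script)
$existing = @(
    [PSCustomObject]@{ AAGUID = '1111-aaaa'; Description = 'YubiKey 5 NFC'; Vendor = 'Yubico' }
    [PSCustomObject]@{ AAGUID = '2222-bbbb'; Description = 'Old Key Model'; Vendor = 'Feitian' }
)
$scraped = @(
    [PSCustomObject]@{ AAGUID = '1111-aaaa'; Description = 'YubiKey 5 NFC' }
    [PSCustomObject]@{ AAGUID = '3333-cccc'; Description = 'SoloKeys Solo 2' }
)
$validVendors = @('Yubico', 'Feitian', 'SoloKeys')

# Keys in the scrape but not in the database get added; keys no longer scraped get removed
$newKeys     = $scraped  | Where-Object { $_.AAGUID -notin $existing.AAGUID }
$missingKeys = $existing | Where-Object { $_.AAGUID -notin $scraped.AAGUID }

foreach ($key in $newKeys) {
    $vendor = ($key.Description -split ' ')[0]   # first word of the description
    if ($vendor -in $validVendors) {
        $key | Add-Member -NotePropertyName Vendor -NotePropertyValue $vendor
    }
    else {
        Write-Warning "Invalid vendor '$vendor' for $($key.AAGUID) - a GitHub issue would be created"
    }
}

"{0} new key(s), {1} removed key(s)" -f @($newKeys).Count, @($missingKeys).Count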

It does all of that automatically once a day. I could run it more often, but I didn’t think it was necessary. Best of all, this is done completely for free: since it is a public repository, GitHub Actions minutes are free. Today I’ll go over the GitHub Action, and I’ll do another post going into detail on the PowerShell script side.

Let’s start from the beginning. We first have to name the GitHub Action, so we will use “Main Entra Merge” in this case, as this is for the main branch and merges keys for Entra.

name: Main Entra Merge

Then we have to determine when it will run. What I like to do is always include “workflow_dispatch:”, as this allows you to trigger the workflow manually for testing without waiting for any other triggers. In this case I also have it run at midnight, and any time there is a push or pull request to the main branch.

on:
  workflow_dispatch:
  schedule:
    - cron: '0 0 * * *'
  push:
    branches:
      - main
  pull_request:
    branches:
      - main

Next, we have to define which OS we want to run on. I usually only use ubuntu-latest unless I have a real need for macOS or Windows, since Windows and macOS runners are billed at a higher rate than Linux (roughly 2x for Windows and 10x for macOS). I know it’s free for me here, but why use resources that aren’t needed? You can also use different versions of Ubuntu (see GitHub Runners). Also, you need “jobs:” followed by the name of the job or it won’t work, and spacing is very important with YAML. It has burned me a few times.

jobs:
  merge-fido-data:
    runs-on: ubuntu-latest

The workflow begins by checking out the repository to the runner using the actions/checkout@v4 action. This step ensures that all necessary files and scripts are available for subsequent steps.

- name: Checkout repository
  uses: actions/checkout@v4
  with:
    fetch-depth: 0
    ref: main

Next, it installs the PSParseHTML PowerShell module, which is essential for parsing HTML content in the scripts that follow.

- name: Install PSParseHTML Module
  shell: pwsh
  run: Install-Module -Name PSParseHTML -Force -Scope CurrentUser

The workflow runs a series of custom PowerShell scripts that perform data validation and merging:

  • Validation Scripts: Test-GHValidVendor.ps1 and Test-GHAAGUIDExists.ps1 ensure that the vendor information and AAGUIDs are valid.
  • Data Export and Merge: Export-GHEntraFido.ps1 exports data from Microsoft Entra, and Merge-GHFidoData.ps1 merges it with existing data.
- name: Run Merge-GHFidoData Script
  shell: pwsh
  env:
    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
    GITHUB_REPOSITORY: ${{ github.repository }}
  run: |
    Import-Module PSParseHTML
    . ./Scripts/Test-GHValidVendor.ps1
    . ./Scripts/Test-GHAAGUIDExists.ps1
    . ./Scripts/Export-GHEntraFido.ps1
    . ./Scripts/Merge-GHFidoData.ps1
- name: Read Environment Variables
  shell: bash
  run: |
    if [ -f ./Scripts/env_vars.txt ]; then
      echo "Setting environment variables from env_vars.txt"
      cat ./Scripts/env_vars.txt >> $GITHUB_ENV
    else
      echo "env_vars.txt not found."
    fi
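
I’ll cover the PowerShell side in part 2, but based on the step above, the hand-off presumably works like this: the script writes KEY=value lines to ./Scripts/env_vars.txt, with each entry URL-encoded and joined by a literal %0A so the JavaScript step further down can split and decode them. A hypothetical sketch (not the actual script):

# Hypothetical hand-off from the PowerShell side (illustrative, not the real Merge-GHFidoData.ps1)
$issueEntries = @(
    'Invalid vendor for AAGUID 1111-aaaa|First word of the description is not in Valid_Vendors.json|InvalidVendor'
)

# URL-encode each entry, join with a literal %0A, and write a KEY=value line for $GITHUB_ENV
$encoded = ($issueEntries | ForEach-Object { [uri]::EscapeDataString($_) }) -join '%0A'
"ISSUE_ENTRIES=$encoded" | Out-File -FilePath ./Scripts/env_vars.txt -Append -Encoding utf8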

For transparency, the workflow outputs the values of key environment variables, which helps with debugging and verification. This could be removed, but I’m leaving it in for now for testing.

- name: Debug - Display ISSUE_ENTRIES, KEYS_NOW_VALID, and VENDORS_NOW_VALID Environment Variables
  shell: bash
  run: |
    echo "ISSUE_ENTRIES: $ISSUE_ENTRIES"
    echo "KEYS_NOW_VALID: $KEYS_NOW_VALID"
    echo "VENDORS_NOW_VALID: $VENDORS_NOW_VALID"

Utilizing actions/github-script@v6, the workflow runs a JavaScript script that automates issue creation and closure based on validation results.

  • Creates Issues: For any data discrepancies found.
  • Closes Issues: If previously reported issues are now resolved.
  • Assigns Issues: Automatically assigns issues to DevClate for certain labels.
- name: Close Fixed Issues and Create New Issues
  uses: actions/github-script@v6
  with:
    github-token: ${{ secrets.GITHUB_TOKEN }}
    script: |
          const issueEntriesRaw = process.env.ISSUE_ENTRIES || '';
          const issueEntries = issueEntriesRaw.split('%0A').map(entry => decodeURIComponent(entry)).filter(entry => entry.trim() !== '');
          if (issueEntries.length === 0) {
            console.log('No new issue entries found.');
          } else {
            for (const entry of issueEntries) {
              const parts = entry.split('|');
              if (parts.length < 2) {
                console.error(`Invalid entry format: ${entry}`);
                continue;
              }
              const [issueTitle, issueBody, issueLabel] = parts;
              console.log(`Processing issue: ${issueTitle}`);
              const { data: issues } = await github.rest.issues.listForRepo({
                owner: context.repo.owner,
                repo: context.repo.repo,
                state: 'open',
                labels: 'auto-generated',
              });
              const existingIssue = issues.find(issue => issue.title === issueTitle);
              if (!existingIssue) {
                const assignees = [];
                if (issueLabel === 'InvalidVendor' || issueLabel === 'DuplicateEntry') {
                  assignees.push('DevClate');
                }
                await github.rest.issues.create({
                  owner: context.repo.owner,
                  repo: context.repo.repo,
                  title: issueTitle,
                  body: issueBody,
                  labels: issueLabel ? ['auto-generated', issueLabel] : ['auto-generated'],
                  assignees: assignees,
                });
                console.log(`Issue created: ${issueTitle}`);
              } else {
                console.log(`Issue already exists: ${issueTitle}`);
              }
            }
          }

          // Close issues for keys (AAGUIDs) that are now valid
          const keysNowValidRaw = process.env.KEYS_NOW_VALID || '';
          const keysNowValid = keysNowValidRaw.split('%0A').map(entry => decodeURIComponent(entry)).filter(entry => entry.trim() !== '');
          if (keysNowValid.length === 0) {
            console.log('No keys have become valid.');
          } else {
            console.log('Keys that are now valid:', keysNowValid);
            for (const aaguid of keysNowValid) {
              const { data: issues } = await github.rest.issues.listForRepo({
                owner: context.repo.owner,
                repo: context.repo.repo,
                state: 'open',
                labels: ['auto-generated', 'InvalidVendor'],
                per_page: 100,
              });
              for (const issue of issues) {
                if (issue.title.includes(aaguid)) {
                  await github.rest.issues.update({
                    owner: context.repo.owner,
                    repo: context.repo.repo,
                    issue_number: issue.number,
                    state: 'closed',
                    state_reason: 'completed',
                  });
                  await github.rest.issues.createComment({
                    owner: context.repo.owner,
                    repo: context.repo.repo,
                    issue_number: issue.number,
                    body: `The vendor for key with AAGUID '${aaguid}' is now valid. This issue is being closed automatically.`,
                  });
                  console.log(`Closed issue for key with AAGUID: ${aaguid}`);
                }
              }
            }
          }

The workflow extracts the newest entries from merge_log.md and detailed_log.txt and appends them to the GitHub Actions summary for easy access.

- name: Display Merge Log
  shell: bash
  run: |
    # Extract and format logs
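
The extraction logic is trimmed here, but for reference, GitHub exposes a $GITHUB_STEP_SUMMARY file: anything you append to it shows up in the run summary. A simplified version (this one appends the whole log instead of just the newest entries, and uses pwsh rather than bash) might look like:

# Simplified: append merge_log.md to the job summary (the real step extracts only the newest entries)
Get-Content ./merge_log.md -Raw | Add-Content -Path $env:GITHUB_STEP_SUMMARY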

Configuring Git ensures that any commits made by the workflow are properly attributed.

- name: Configure Git
  run: |
    git config --global user.name 'D--ate'
    git config --global user.email 'c---@--t.com'

Finally, the workflow commits the changes made to the data and logs, pushing them back to the main branch.

- name: Commit changes
  run: |
    git add Assets/FidoKeys.json merge_log.md detailed_log.txt
    git commit -m "Update Fidokeys.json, merge_log.md, and detailed_log.txt" || echo "No changes to commit"

- name: Push changes
  uses: ad-m/github-push-action@v0.6.0
  with:
    github_token: ${{ secrets.GITHUB_TOKEN }}
    branch: main

And that’s it! It’s completely OK not to fully understand it, but I wanted to give you a quick breakdown of how it works in case you have a project you are working on, or have been holding off on one because you didn’t know this was possible. If you have any tips, I’d be glad to talk, as I’m always open to improving and learning new ideas.

If you want to see this in action, check out https://github.com/DevClate/EntraFIDOFinder

I do have a PowerShell module that works with this and allows you to find/filter which FIDO2 keys are Entra attestation approved; it can be downloaded there or from the PowerShell Gallery.

I even made an interactive website at https://devclate.github.io/EntraFIDOFinder/Explorer/

I will be doing a breakdown of the PowerShell side of this in part 2!

Hope this was helpful and have a great day!

Tagged With: 365, Automation, Entra, FIDO2, GitHub Actions, PowerShell, Reporting

GitHub Copilot Password Warning

October 4, 2024 by ClaytonT

Did you know that GitHub Copilot now detects hard-coded credentials and gives you a warning? It’s not perfect, but it will flag anything that even looks like hard-coded creds: another script I had contained numbers that looked like they could be private, and it warned me about those too. Honestly, I’d rather it flag more potential credentials than miss them. That’s it for today, hope you have a great day!
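
For illustration, this is the kind of pattern that tends to trigger the warning (a made-up example, not from my script):

# Hard-coded credential - the kind of thing Copilot may warn about (don't do this)
$securePassword = ConvertTo-SecureString 'P@ssw0rd123!' -AsPlainText -Force
$credential     = New-Object System.Management.Automation.PSCredential ('svc-account', $securePassword)

# Safer: prompt for it at runtime, or pull it from a secret store
$credential = Get-Credential -UserName 'svc-account' -Message 'Enter the service account password'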

Tagged With: Automation, Copilot, GitHub, Passwords, Security

Simple Tip for GitHub Copilot

September 20, 2024 by ClaytonT

If you have GitHub Copilot, you may or may not know about this little tip, but I wanted to share it just in case. It has saved me so much time, and it can be applied to a lot of scenarios.

Here is the prompt: “Can you add the export to markdown capability from #file:Test-365ACCompanyName.ps1 to #file:Test-365ACStreetAddress.ps1”

What this does is add the export-to-Markdown functionality to the new file the same way I did it in CompanyName, but adapted to fit StreetAddress. Furthermore, it adds help for it and creates the parameter for it (with ValidatePattern).

From this:

<#
.SYNOPSIS
    Tests whether users have a street address and generates a report.

.DESCRIPTION
    The Test-365ACStreetAddress function tests whether users have a street address and generates a report. 
    It takes a list of users as input and checks if each user has a street address. 
    The function generates a report that includes the user's display name and whether they have a street address.
    The report can be exported to an Excel file or an HTML file.

.PARAMETER Users
    Specifies the list of users to test. If not provided, it retrieves all users from the Microsoft Graph API.

.PARAMETER TenantID
    The ID of the tenant to connect to. Required if using app-only authentication.

.PARAMETER ClientID
    The ID of the client to use for app-only authentication. Required if using app-only authentication.

.PARAMETER CertificateThumbprint
    The thumbprint of the certificate to use for app-only authentication. Required if using app-only authentication.

.PARAMETER AccessToken
    The access token to use for authentication. Required if using app-only authentication.

.PARAMETER InteractiveLogin
    Specifies whether to use interactive login. If this switch is present, interactive login will be used. Otherwise, app-only authentication will be used.

.PARAMETER OutputExcelFilePath
    Specifies the file path to export the report as an Excel file. The file must have a .xlsx extension.

.PARAMETER OutputHtmlFilePath
    Specifies the file path to export the report as an HTML file. The file must have a .html extension.

.PARAMETER TestedProperty
    Specifies the name of the property being tested. Default value is 'Has Street Address'.

.EXAMPLE
    Test-365ACStreetAddress -Users $users -OutputExcelFilePath "C:\Reports\StreetAddressReport.xlsx"

    This example tests the specified list of users for street addresses and exports the report to an Excel file.

.EXAMPLE
    Test-365ACStreetAddress -OutputExcelFilePath "C:\Reports\StreetAddressReport.xlsx"

    This example retrieves all users from the Microsoft Graph API, tests them for street addresses, and exports the report to an Excel file.

#>

Function Test-365ACStreetAddress {
    [CmdletBinding()]
    param
    (
        [Parameter(ValueFromPipeline = $true)]
        [array]$Users = (Get-MgUser -All -Property DisplayName, StreetAddress | Select-Object DisplayName, StreetAddress),
        
        [Parameter(Mandatory = $false)]
        [string]$TenantID,
        
        [Parameter(Mandatory = $false)]
        [string]$ClientID,
        
        [Parameter(Mandatory = $false)]
        [string]$CertificateThumbprint,
        
        [Parameter(Mandatory = $false)]
        [string]$AccessToken,
        
        [Parameter(Mandatory = $false)]
        [switch]$InteractiveLogin,

        [ValidatePattern('\.xlsx$')]
        [string]$OutputExcelFilePath,
        
        [ValidatePattern('\.html$')]
        [string]$OutputHtmlFilePath,
        
        [string]$TestedProperty = 'Has Street Address'
    )
    BEGIN {
        if ($InteractiveLogin) {
            Write-PSFMessage "Using interactive login..." -Level Host
            Connect-MgGraph -Scopes "User.Read.All", "AuditLog.read.All"  -NoWelcome
        }
        else {
            Write-PSFMessage "Using app-only authentication..." -Level Host
            Connect-MgGraph -ClientId $ClientID -TenantId $TenantID -CertificateThumbprint $CertificateThumbprint -Scopes "User.Read.All", "AuditLog.Read.All"
        }
        
        $results = @()
    }
    PROCESS {
        foreach ($user in $Users) {
            $hasStreetAddress = [bool]($user.StreetAddress)
            $result = [PSCustomObject]@{
                'User Display Name' = $user.DisplayName
                $TestedProperty     = $hasStreetAddress
            }
            $results += $result
        }
    }
    END {
        $totalTests = $results.Count
        $passedTests = ($results | Where-Object { $_.$TestedProperty }).Count
        $failedTests = $totalTests - $passedTests
        if ($OutputExcelFilePath) {
            Export-365ACResultToExcel -Results $results -OutputExcelFilePath $OutputExcelFilePath -TotalTests $totalTests -PassedTests $passedTests -FailedTests $failedTests -TestedProperty $TestedProperty
        }
        elseif ($OutputHtmlFilePath) {
            Export-365ACResultToHtml -Results $results -OutputHtmlFilePath $OutputHtmlFilePath -TotalTests $totalTests -PassedTests $passedTests -FailedTests $failedTests -TestedProperty $TestedProperty
        }
        else {
            Write-Output $results
        }
    }
}

To this:

<#
.SYNOPSIS
    Tests whether users have a street address and generates a report.
.DESCRIPTION
    The Test-365ACStreetAddress function tests whether users have a street address and generates a report. 
    It takes a list of users as input and checks if each user has a street address. 
    The function generates a report that includes the user's display name and whether they have a street address.
    The report can be exported to an Excel file, an HTML file, or a Markdown file.
.PARAMETER Users
    Specifies the list of users to test. If not provided, it retrieves all users from the Microsoft Graph API.
.PARAMETER TenantID
    The ID of the tenant to connect to. Required if using app-only authentication.
.PARAMETER ClientID
    The ID of the client to use for app-only authentication. Required if using app-only authentication.
.PARAMETER CertificateThumbprint
    The thumbprint of the certificate to use for app-only authentication. Required if using app-only authentication.
.PARAMETER AccessToken
    The access token to use for authentication. Required if using app-only authentication.
.PARAMETER InteractiveLogin
    Specifies whether to use interactive login. If this switch is present, interactive login will be used. Otherwise, app-only authentication will be used.
.PARAMETER OutputExcelFilePath
    Specifies the file path to export the report as an Excel file. The file must have a .xlsx extension.
.PARAMETER OutputHtmlFilePath
    Specifies the file path to export the report as an HTML file. The file must have a .html extension.
.PARAMETER OutputMarkdownFilePath
    Specifies the file path to export the report as a Markdown file. The file must have a .md extension.
.PARAMETER TestedProperty
    Specifies the name of the property being tested. Default value is 'Has Street Address'.
.EXAMPLE
    Test-365ACStreetAddress -Users $users -OutputExcelFilePath "C:\Reports\StreetAddressReport.xlsx"
    This example tests the specified list of users for street addresses and exports the report to an Excel file.
.EXAMPLE
    Test-365ACStreetAddress -OutputExcelFilePath "C:\Reports\StreetAddressReport.xlsx"
    This example retrieves all users from the Microsoft Graph API, tests them for street addresses, and exports the report to an Excel file.
#>
Function Test-365ACStreetAddress {
[CmdletBinding()]
param
(
    [Parameter(ValueFromPipeline = $true)]
    [array]$Users = (Get-MgUser -All -Property DisplayName, StreetAddress | Select-Object DisplayName, StreetAddress),
    
    [Parameter(Mandatory = $false)]
    [string]$TenantID,
    
    [Parameter(Mandatory = $false)]
    [string]$ClientID,
    
    [Parameter(Mandatory = $false)]
    [string]$CertificateThumbprint,
    
    [Parameter(Mandatory = $false)]
    [string]$AccessToken,
    
    [Parameter(Mandatory = $false)]
    [switch]$InteractiveLogin,
    
    [ValidatePattern('\.xlsx$')]
    [string]$OutputExcelFilePath,
    
    [ValidatePattern('\.html$')]
    [string]$OutputHtmlFilePath,
    
    [ValidatePattern('\.md$')]
    [string]$OutputMarkdownFilePath,
    
    [string]$TestedProperty = 'Has Street Address'
)
BEGIN {
    if ($InteractiveLogin) {
        Write-PSFMessage "Using interactive login..." -Level Host
        Connect-MgGraph -Scopes "User.Read.All", "AuditLog.read.All"  -NoWelcome
    }
    else {
        Write-PSFMessage "Using app-only authentication..." -Level Host
        Connect-MgGraph -ClientId $ClientID -TenantId $TenantID -CertificateThumbprint $CertificateThumbprint -Scopes "User.Read.All", "AuditLog.Read.All"
    }
    
    $results = @()
}
PROCESS {
    foreach ($user in $Users) {
        $hasStreetAddress = [bool]($user.StreetAddress)
        $result = [PSCustomObject]@{
            'User Display Name' = $user.DisplayName
            $TestedProperty     = $hasStreetAddress
        }
        $results += $result
    }
}
END {
    $totalTests = $results.Count
    $passedTests = ($results | Where-Object { $_.$TestedProperty }).Count
    $failedTests = $totalTests - $passedTests
    if ($OutputExcelFilePath) {
        Export-365ACResultToExcel -Results $results -OutputExcelFilePath $OutputExcelFilePath -TotalTests $totalTests -PassedTests $passedTests -FailedTests $failedTests -TestedProperty $TestedProperty
        Write-PSFMessage "Excel report saved to $OutputExcelFilePath" -Level Host
    }
    elseif ($OutputHtmlFilePath) {
        Export-365ACResultToHtml -Results $results -OutputHtmlFilePath $OutputHtmlFilePath -TotalTests $totalTests -PassedTests $passedTests -FailedTests $failedTests -TestedProperty $TestedProperty
        Write-PSFMessage "HTML report saved to $OutputHtmlFilePath" -Level Host
    }
    elseif ($OutputMarkdownFilePath) {
        Export-365ACResultToMarkdown -Results $results -OutputMarkdownFilePath $OutputMarkdownFilePath -TotalTests $totalTests -PassedTests $passedTests -FailedTests $failedTests -TestedProperty $TestedProperty
        Write-PSFMessage "Markdown report saved to $OutputMarkdownFilePath" -Level Host
    }
    else {
        Write-PSFMessage -Level Output -Message ($results | Out-String)
    }
}
}

And here is the source Test-365ACCompanyName.ps1 code

<#
.SYNOPSIS
Tests if users have a company name property and generates test results.

.DESCRIPTION
The Test-365ACCompanyName function tests if users have a company name property and generates test results. It takes an array of users as input and checks if each user has a company name property. The test results are stored in an array of custom objects, which include the user's display name and the result of the test.

.PARAMETER Users
Specifies the array of users to test. Each user should have a DisplayName and CompanyName property.

.PARAMETER TenantID
The ID of the tenant to connect to. Required if using app-only authentication.

.PARAMETER ClientID
The ID of the client to use for app-only authentication. Required if using app-only authentication.

.PARAMETER CertificateThumbprint
The thumbprint of the certificate to use for app-only authentication. Required if using app-only authentication.

.PARAMETER AccessToken
The access token to use for authentication. Required if using app-only authentication.

.PARAMETER InteractiveLogin
Specifies whether to use interactive login. If this switch is present, interactive login will be used. Otherwise, app-only authentication will be used.

.PARAMETER OutputExcelFilePath
Specifies the path to the output Excel file. If provided, the test results will be exported to an Excel file.

.PARAMETER OutputHtmlFilePath
Specifies the path to the output HTML file. If provided, the test results will be exported to an HTML file.

.PARAMETER OutputMarkdownFilePath
Specifies the path to the output Markdown file. If provided, the test results will be exported to a Markdown file.

.PARAMETER TestedProperty
Specifies the name of the tested property. Default value is 'Has Company Name'.

.INPUTS
An array of users with DisplayName and CompanyName properties.

.OUTPUTS
If OutputExcelFilePath or OutputHtmlFilePath is not provided, the function outputs an array of custom objects with the user's display name and the result of the test.

.EXAMPLE
$users = Get-MgUser -All -Property DisplayName, CompanyName | Select-Object DisplayName, CompanyName
Test-365ACCompanyName -Users $users -OutputExcelFilePath 'C:\TestResults.xlsx'
This example tests if the users have a company name property and exports the test results to an Excel file.
#>
Function Test-365ACCompanyName {
    [CmdletBinding()]
    param
    (
        [Parameter(ValueFromPipeline = $true)]
        [array]$Users = (Get-MgUser -All -Property DisplayName, CompanyName | Select-Object DisplayName, CompanyName),
        
        [Parameter(Mandatory = $false)]
        [string]$TenantID,
        
        [Parameter(Mandatory = $false)]
        [string]$ClientID,
        
        [Parameter(Mandatory = $false)]
        [string]$CertificateThumbprint,
        
        [Parameter(Mandatory = $false)]
        [string]$AccessToken,
        
        [Parameter(Mandatory = $false)]
        [switch]$InteractiveLogin,
        
        [ValidatePattern('\.xlsx$')]
        [string]$OutputExcelFilePath,
        
        [ValidatePattern('\.html$')]
        [string]$OutputHtmlFilePath,
        
        [ValidatePattern('\.md$')]
        [string]$OutputMarkdownFilePath,
        
        [string]$TestedProperty = 'Has Company Name'
    )
    BEGIN {
        if ($InteractiveLogin) {
            Write-PSFMessage "Using interactive login..." -Level Host
            Connect-MgGraph -Scopes "User.Read.All", "AuditLog.read.All"  -NoWelcome
        }
        else {
            Write-PSFMessage "Using app-only authentication..." -Level Host
            Connect-MgGraph -ClientId $ClientID -TenantId $TenantID -CertificateThumbprint $CertificateThumbprint -Scopes "User.Read.All", "AuditLog.Read.All"
        }
        $results = @()
    }
    PROCESS {
        foreach ($user in $Users) {
            $hasCompanyName = [bool]($user.CompanyName)
            $result = [PSCustomObject]@{
                'User Display Name' = $user.DisplayName
                $TestedProperty     = $hasCompanyName
            }
            $results += $result
        }
    }
    END {
        $totalTests = $results.Count
        $passedTests = ($results | Where-Object { $_.$TestedProperty }).Count
        $failedTests = $totalTests - $passedTests
        if ($OutputExcelFilePath) {
            Export-365ACResultToExcel -Results $results -OutputExcelFilePath $OutputExcelFilePath -TotalTests $totalTests -PassedTests $passedTests -FailedTests $failedTests -TestedProperty $TestedProperty
            Write-PSFMessage "Excel report saved to $OutputExcelFilePath" -Level Host
        }
        elseif ($OutputHtmlFilePath) {
            Export-365ACResultToHtml -Results $results -OutputHtmlFilePath $OutputHtmlFilePath -TotalTests $totalTests -PassedTests $passedTests -FailedTests $failedTests -TestedProperty $TestedProperty
            Write-PSFMessage "HTML report saved to $OutputHtmlFilePath" -Level Host
        }
        elseif ($OutputMarkdownFilePath) {
            Export-365ACResultToMarkdown -Results $results -OutputMarkdownFilePath $OutputMarkdownFilePath -TotalTests $totalTests -PassedTests $passedTests -FailedTests $failedTests -TestedProperty $TestedProperty
            Write-PSFMessage "Markdown report saved to $OutputMarkdownFilePath" -Level Host
        }
        else {
            Write-PSFMessage -Level Output -Message ($results | Out-String)
        }
    }
}

And yes, it did write out the whole function again; I didn’t manipulate it at all, so after reviewing it you can literally just copy it in and not have to worry about forgetting a “(” or “{”. I know it’s not a huge lift, but you can also follow up with “Can you also do this for #file:Test-365ACZipcode.ps1” and it will.

I’ve used this prompt syntax for other functions as well with great results. Have you used anything like this before?

You can see more uses of it at GitHub – 365AutomatedCheck

Tagged With: AI, Automation, GitHub Copilot, PowerShell, Templates

Why did I get this email?

March 25, 2024 by ClaytonT

Here’s the scenario…

An executive forwards an email to your ticketing system and asks why they are receiving it. Then they send another from the day before. There is a Microsoft 365 distribution list (DL) in both emails, but not one they should be on. What do you do?

You check whether there are any tickets for that DL, and you see there haven’t been any for that DL or even for that person. You then check the DL and, indeed, see they are in it… but how?

PowerShell to the rescue! Have you ever used “Search-UnifiedAuditLog”, a cmdlet for Exchange Online PowerShell? It is great for one-off investigations in 365, and here we will use it to find any admin activity for that user in the past week. Full disclosure, I’ve only used it a handful of times and had never really dug into it, which was a mistake on my part. Knowing what it can do would have saved me so much time on other resolutions where I went through the 365 portal instead. Don’t be me: start using it now, create your own functions, and use Purview as well to save yourself time and headaches. Enable it now, as it can’t be backdated.

# See if you have it enabled
Get-AdminAuditLogConfig | Format-List UnifiedAuditLogIngestionEnabled

# If not enabled, run this
Enable-OrganizationCustomization

# Enable Audit logs - this can take up to 60 mins
Set-AdminAuditLogConfig -UnifiedAuditLogIngestionEnabled $true

Hopefully you already have it enabled, or you just enabled it and can wait for it to start ingesting the logs, so that when you do need it, it is available.

Back to our executive incident. How do we find out what happened? The quick way is to run:

# Check for all admin activity for named user
Search-UnifiedAuditLog -StartDate 2/1/2024 -EndDate 3/16/2024 -ObjectIds execuser@domain.com

And this is the way I originally did it to get the answer I needed. That’s it! In “AuditData” you will then see which groups they were added to or removed from, and any operations that happened with the groups they are in. This broad search will show even more, but I’m only mentioning the parts related to this task. At the end of this post I’ll have a list of great resources on how to get granular with your searches.
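
As a quick taste of the cleanup post to come, the AuditData property is JSON, so you can expand it and keep just the group membership operations. Treat this as a sketch, since the exact property names can vary a bit by workload:

# Sketch: expand AuditData and keep only the group membership operations
Search-UnifiedAuditLog -StartDate 2/1/2024 -EndDate 3/16/2024 -ObjectIds execuser@domain.com |
    ForEach-Object { $_.AuditData | ConvertFrom-Json } |
    Where-Object { $_.Operation -like '*member*group*' } |
    Select-Object CreationTime, Operation, UserId |
    Sort-Object CreationTime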

Now you can see that another engineer accidentally added them (after confirming with the engineer), and you can just remove them from the list. This is the best-case scenario; if I hadn’t looked and had just removed the executive without searching and asking the engineer, they could have been added by a compromised account probing what kind of privileges they had.

There is a way to clean up the audit data so it is easier to view, but that will be in a longer blog post coming soon. Again, I’ll have some links at the end to give you a head start. Honestly, this was only supposed to be a quick one-liner post that definitely grew, and I’ve spent more time than I would like to admit researching it. It has given me more ideas on how to use it, and I’ll put together functions in a repository, or possibly a module, of the most useful commands.

One function I’ll be creating will check whether a user has changed their password recently, has multiple failed sign-in attempts, and/or has locked themselves out. How nice would it be for you or your help desk if it saw who submitted the ticket, ran the check, and gave you that feedback? To go one step further, if they aren’t locked out, it could automatically send them the password reset portal link so they can reset their password.
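
For what it’s worth, here is a rough sketch of where I’d start, using the Microsoft Graph PowerShell SDK; the names and filter are illustrative, not the finished function:

# Rough sketch of the planned check - illustrative only, not a finished function
Connect-MgGraph -Scopes 'User.Read.All', 'AuditLog.Read.All' -NoWelcome

$upn  = 'execuser@domain.com'
$user = Get-MgUser -UserId $upn -Property DisplayName, LastPasswordChangeDateTime

# Recent sign-ins for the user; anything with a non-zero error code is a failure
$failures = Get-MgAuditLogSignIn -Filter "userPrincipalName eq '$upn'" -Top 50 |
    Where-Object { $_.Status.ErrorCode -ne 0 }

[PSCustomObject]@{
    User                = $user.DisplayName
    PasswordLastChanged = $user.LastPasswordChangeDateTime
    RecentFailedSignIns = @($failures).Count
}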

If you already use this, what scripts/functions have you created? I’d love to hear about them, and I can create a repository for us to keep them in one spot.

Useful Links:

Search-UnifiedAuditLog – Microsoft Learn Cmdlet
How it works – Services that support auditing
Detailed info – Detailed Microsoft Script

Hope this helps save you from headaches; I can’t wait to hear how you use it! Have a great day!

Tagged With: 365, AuditLog, Automation, Entra, PowerShell, Reporting, Security

365AutomatedLab Update: Now Supports MacOS

October 23, 2023 by ClaytonT

You read that right: v0.1.4 now supports macOS! As long as you are using PowerShell 7.x you are good to go. Also in v0.1.4, you are now able to export your users from your main tenant into a worksheet that is easily imported into your dev tenant. You will only have to add the licensing (please let me know if you want me to default to the developer licenses). I’ve cleaned up some of the documentation and help as well. I’m really excited about this release and can’t wait to add more features! Let me know what you would like to see next.

For easy installation:

Install-Module -Name 365AutomatedLab

As always any PR, Issue, or question is always welcome. Have a great day!

Detailed Changelog:
https://github.com/DevClate/365AutomatedLab/blob/main/CHANGELOG.md

GitHub:
https://github.com/DevClate/365AutomatedLab

PowerShell Gallery:
https://www.powershellgallery.com/packages/365Automatedlab/0.1.4

Tagged With: 365, 365AutomatedLab, Automation, MacOS, Module Monday, PowerShell
