Mapping legacy file shares for Azure AD joined devices

More and more of my customers are moving their devices from a traditional IT model to a Modern Desktop build joined directly to Azure AD, with devices managed via Microsoft Intune rather than Group Policy or System Center Configuration Manager. The move to this modern approach to delivering IT services usually sits alongside moving the organisation's unstructured file data to OneDrive and SharePoint Online, which is the logical place to store it rather than leaving it sat on a file server in an office or datacentre.

What if, however, you still have a large volume of data sitting on your on-premises file servers? Users will still require access to these shares, but there is no native way of mapping file shares from the Intune console. This is the challenge I have been working through for a customer in recent weeks, and I have developed a couple of PowerShell scripts that map drives when a user logs in and support both dedicated and shared devices.

The Challenge

Looking back at the legacy IT approach, drive mappings were done through either Group Policy Preferences or login scripts such as batch or KIX. Both processes follow a similar method:

  1. User signs into a device
  2. The GPP or login script runs, containing a list of drive mappings aligned to security groups of users who should have access
  3. If the user signing into the device is in the relevant groups, the drive letter is mapped to the shared location

This method has worked for years, with IT admins maintaining one or other of these processes to give users access to corporate data. Looking forward to the modern managed IT environment, however, there are a few issues when working with legacy file servers:

  • There is no native construct in Intune that maps UNC file paths for users
  • Whilst you can deploy a PowerShell script that runs the New-PSDrive cmdlet, Intune scripts execute only once on the device and never again.

You may think the second point isn't an issue: simply create a couple of scripts to map the network drives to the file shares and they will run once, with the drives remaining mapped. But what if the devices are shared and multiple users need to sign into the computer, or you need to amend the drive mappings? We needed a solution that could map drives at every user sign-in and be easy to change as the organisation moves away from file servers.

The Solution

As with most things, I started by looking at what was already on the Internet and quickly came across blogs from Nicola Suter and Jos Lieben, but neither quite did what I needed for my customer (they have more than 100 different network drives). So I set about writing scripts that would deliver what the customer needed.

My requirements for the new drive mapping script were as follows:

  • Work natively with Azure AD joined devices
  • Support users on dedicated or shared workstations
  • Process the drive mappings sequentially, as a traditional GPP or login script would

Let's start with the drive mapping script itself.

Drive mapping script

For the drive mapping script to work, it needs to run silently and enumerate the groups the user is a member of. That sounds easy, but PowerShell on an Azure AD joined device doesn't natively have a way of doing this. I settled on using the Microsoft Graph API's list memberOf function, as this can be called to pull the groups a user is a member of into a variable that the drive mapping logic can work with. The call requires a minimum permission of Directory.Read.All, which needs to be granted through an App Registration in Azure AD. Step forward my next piece of web help, in the form of Lee Ford's blog on using the Graph API with PowerShell.

Configure Azure AD

First sign into the Azure Portal and navigate to Azure AD and Application Registrations (Preview) to create a new App Registration. Give the app a name.

Create new App Registration

When it's created you will be shown the new app details. Make sure that you note down the Directory (tenant) ID and the Application (client) ID, as you will need these in the script.

App and Directory ID values will be used in the script

As well as these ID values, you also need a Redirect URI that is referenced in the script. Click on Add a Redirect URI, choose the item shown in the screenshot below, then click Save.

Now that the app is registered, we need to add permissions to read data from the Graph API. Click on the API Permissions heading to grant the required Directory.Read.All delegated permission.

Add the Directory.Read.All permission to the App Registration

By default, the users making a connection to the API will be required to consent to the permissions change. To make this seamless, we can use an administrative account to grant consent on behalf of all users in the organisation.

Grant Admin consent
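
As an alternative to the consent button in the portal, the tenant-wide admin consent endpoint can be opened directly in a browser. This is only a minimal sketch, assuming you substitute the IDs from your own app registration:

$tenantId = "<your-directory-id>" # placeholder - use your Directory (tenant) ID
$clientId = "<your-application-client-id>" # placeholder - use your Application (client) ID
$redirectUri = "https://login.microsoftonline.com/common/oauth2/nativeclient"
# Opens the default browser at the admin consent prompt for this application
Start-Process "https://login.microsoftonline.com/$tenantId/adminconsent?client_id=$clientId&redirect_uri=$redirectUri"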

That completes the setup of the Azure AD application that will be used to access the Graph API; we can now focus on the PowerShell script that will map the drives.

The Drive Mapping Script

The Drive mapping script is made up of several parts:

  • A configuration section where you set up the Application Registration details and the drive mappings that will be processed for each user
  • Connection to the Graph API
  • Enumeration of the user's group membership
  • Iteration through the drive mappings, mapping those whose include group the user is a member of.

I will share the script in full further down this post, but the key snippets are included below for each section.

First we define the variables for the app registration we created earlier.

$clientId = "73b7bec7-738#-####-####-############" #This is the Client ID for your Application Registration in Azure AD
$tenantId = "3b7b2097-f13#-####-####-############" # This is the Tenant ID of your Azure AD Directory
$redirectUri = "https://login.microsoftonline.com/common/oauth2/nativeclient" # This is the Return URL for your Application Registration in Azure AD
$dnsDomainName = "skunklab.co.uk" #This is the internal name of your AD Forest

Once these are in place, we set up our array of drive mappings. In this section there are four attributes that can be defined for each mapping:

  • includeSecurityGroup – the group of users who should have the drive mapped
  • excludeSecurityGroup – a group of users who shouldn't have the drive mapped (this is optional)
  • driveLetter – the letter that will be used for the drive mapping
  • UNCPath – the file share that should be mapped to the drive letter.

The code in the script looks like this:

$Drivemappings = @( #Create a line below for each drive mapping that needs to be created.
    @{"includeSecurityGroup" = "FOLDERPERM_FULL-ACCESS" ; "excludeSecurityGroup" = "" ; "driveLetter" = "T" ; "UNCPath" = "\\skunklab.co.uk\dfs\Shared"},
    @{"includeSecurityGroup" = "FOLDERPERM_ALL-STAFF" ; "excludeSecurityGroup" = "FOLDERPERM_FULL-ACCESS" ; "driveLetter" = "T" ; "UNCPath" = "\\skunklab.co.uk\dfs\shared\SHARED ACCESS"}
)

You can add as many lines to the $Drivemappings variable as you have drive mappings to create; just make sure the final line doesn't end with a comma.

Next we create the connection to the Graph API. I used the code from Lee's blog mentioned earlier and it worked first time:

# Add required assemblies
Add-Type -AssemblyName System.Web, PresentationFramework, PresentationCore
 
# Scope - needs to include all permissions required, separated with a space
$scope = "User.Read.All Group.Read.All" # This is just an example set of permissions
 
# Random State - state is included in response, if you want to verify response is valid
$state = Get-Random
 
# Encode scope to fit inside query string 
$scopeEncoded = [System.Web.HttpUtility]::UrlEncode($scope)
 
# Redirect URI (encode it to fit inside query string)
$redirectUriEncoded = [System.Web.HttpUtility]::UrlEncode($redirectUri)
 
# Construct URI
$uri = "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/authorize?client_id=$clientId&response_type=code&redirect_uri=$redirectUriEncoded&response_mode=query&scope=$scopeEncoded&state=$state"
 
# Create Window for User Sign-In
$windowProperty = @{
    Width  = 500
    Height = 700
}
 
$signInWindow = New-Object System.Windows.Window -Property $windowProperty
    
# Create WebBrowser for Window
$browserProperty = @{
    Width  = 480
    Height = 680
}
 
$signInBrowser = New-Object System.Windows.Controls.WebBrowser -Property $browserProperty
 
# Navigate Browser to sign-in page
$signInBrowser.navigate($uri)
    
# Create a condition to check after each page load
$pageLoaded = {
 
    # Once a URL contains "code=*", close the Window
    if ($signInBrowser.Source -match "code=[^&]*") {
 
        # With the form closed and complete with the code, parse the query string
 
        $urlQueryString = [System.Uri]($signInBrowser.Source).Query
        $script:urlQueryValues = [System.Web.HttpUtility]::ParseQueryString($urlQueryString)
 
        $signInWindow.Close()
 
    }
}
 
# Add condition to document completed
$signInBrowser.Add_LoadCompleted($pageLoaded)
 
# Show Window
$signInWindow.AddChild($signInBrowser)
$signInWindow.ShowDialog()
 
# Extract code from query string
$authCode = $script:urlQueryValues.GetValues(($script:urlQueryValues.keys | Where-Object { $_ -eq "code" }))
 
if ($authCode) {
 
    # With Auth Code, start getting token
 
    # Construct URI
    $uri = "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token"
 
    # Construct Body
    $body = @{
        client_id    = $clientId
        scope        = $scope
        code         = $authCode[0]
        redirect_uri = $redirectUri
        grant_type   = "authorization_code"
    }
 
    # Get OAuth 2.0 Token
    $tokenRequest = Invoke-WebRequest -Method Post -Uri $uri -ContentType "application/x-www-form-urlencoded" -Body $body
 
    # Access Token
    $token = ($tokenRequest.Content | ConvertFrom-Json).access_token
 
}
else {
 
    Write-Error "Unable to obtain Auth Code!"
 
}
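
The code above generates a random state value but never validates what comes back. If you want that extra check (as the comment in the snippet suggests), a minimal sketch that could sit just before the token request, using the $state and $script:urlQueryValues variables already defined, would be:

# Optional: confirm the state returned in the query string matches the random state we sent
$returnedState = $script:urlQueryValues.GetValues("state") | Select-Object -First 1
if ($returnedState -ne "$state") {
    throw "State returned by the authorization endpoint does not match the value sent - aborting"
}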

Now we need to use the token we generated to query the Graph API for the list of groups that our user is a member of:

$uri = "https://graph.microsoft.com/v1.0/me/memberOf"
$method = "GET"
 
# Run Graph API query 
$query = Invoke-WebRequest -Method $method -Uri $uri -ContentType "application/json" -Headers @{Authorization = "Bearer $token"} -ErrorAction Stop
$output = ConvertFrom-Json $query.Content
$usergroups = @()
foreach ($group in $output.value) {
    $usergroups += $group.displayName
}
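
One thing to be aware of (it also came up in the comments below): the memberOf endpoint pages its results, returning around 100 groups at a time with an @odata.nextLink property pointing at the next page. If your users are in more groups than that, a hedged variation of the loop above that follows the paging links would look something like this:

# Sketch: follow @odata.nextLink so users with more than one page of groups are handled
$uri = "https://graph.microsoft.com/v1.0/me/memberOf"
$usergroups = @()
do {
    $query = Invoke-WebRequest -Method "GET" -Uri $uri -ContentType "application/json" -Headers @{Authorization = "Bearer $token"} -ErrorAction Stop
    $output = ConvertFrom-Json $query.Content
    foreach ($group in $output.value) {
        $usergroups += $group.displayName
    }
    # '@odata.nextLink' is only present when there are more pages to fetch
    $uri = $output.'@odata.nextLink'
} while ($uri)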

Finally, we check that the user can resolve the domain name (there's no point mapping drives if they are out of the office) and then, for each drive mapping whose group the user is a member of, map the drive using the New-PSDrive cmdlet:

$connected=$false
$retries=0
$maxRetries=3

Write-Output "Starting script..."
do {
    if (Resolve-DnsName $dnsDomainName -ErrorAction SilentlyContinue) {
        $connected=$true
    }
    else {
        $retries++
        Write-Warning "Cannot resolve: $dnsDomainName, assuming no connection to fileserver"
        Start-Sleep -Seconds 3
        if ($retries -eq $maxRetries){
            Throw "Exceeded maximum number of retries ($maxRetries) to resolve DNS name ($dnsDomainName)"
        }
    }
} 
while( -not ($Connected))

Write-Output $usergroups

$drivemappings.GetEnumerator()| ForEach-Object {
    Write-Output $PSItem.UNCPath
    if(($usergroups.contains($PSItem.includeSecurityGroup)) -and ($usergroups.contains($PSItem.excludeSecurityGroup) -eq $false)) {
        Write-Output "Attempting to map $($Psitem.DriveLetter) to $($PSItem.UNCPath)"
        New-PSDrive -PSProvider FileSystem -Name $PSItem.DriveLetter -Root $PSItem.UNCPath -Persist -Scope global
    }
}

That's it: when executed, the script runs as the user and maps the drives. We now need to host it somewhere that can be reached from any device with an Internet connection.

Uploading the script to Azure

We will host the drive mapping script in blob storage in Azure. Sign into the Azure Portal, click on Storage Accounts and create a new account with the following settings.

Once created we need to add a Container that will store the script

and finally we upload the script to the container

Once uploaded, we need the URL of the script so that we can reference it in the Intune script later.
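
If you prefer to script the upload rather than click through the portal, a rough sketch using the Az PowerShell modules would look like the below (the resource group, storage account, container and file path are placeholders, not values from my tenant):

# Hypothetical example of uploading the drive mapping script with the Az.Storage module
Connect-AzAccount
$storageAccount = Get-AzStorageAccount -ResourceGroupName "rg-intune-scripts" -Name "mystorageaccount"
$context = $storageAccount.Context

# Container with public read access on blobs so the scheduled task can download the script anonymously
New-AzStorageContainer -Name "pub-intune-scripts" -Context $context -Permission Blob

# Upload the script and print the URL to paste into the Intune script's $scriptLocation variable
$blob = Set-AzStorageBlobContent -File "C:\Scripts\DriveMapping.ps1" -Container "pub-intune-scripts" -Blob "DriveMapping.ps1" -Context $context
$blob.ICloudBlob.Uri.AbsoluteUri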

The Intune Script

Now that we have our drive mapping script and it's uploaded to Azure blob storage, we need a way of calling it every time a user signs into the computer. This script will:

  • Be run from the Intune Management Extension as the SYSTEM account
  • Create a new Scheduled Task that will execute a hidden PowerShell window at logon which will download and run the previous script

The only variables we need to change in this script are the URL of the drive mapping script and the name of the scheduled task that is created. The whole script looks like this:

<#
    DESCRIPTION:    Create a Scheduled Task that runs at user logon and executes
                    a PowerShell script stored in an Azure blob storage account
    AUTHOR:         Matt White (matthewwhite@itlab.com)
    DATE:           2019-04-06
    USAGE:          Edit the values in the first section with the link to your script
                    Add in the name of the scheduled task that you want to be created
                    Upload the script to Intune to execute as a system context script

#>

<#
    DO NOT EDIT THIS SECTION
#>

$scriptName = ([System.IO.Path]::GetFileNameWithoutExtension($(Split-Path $script:MyInvocation.MyCommand.Path -Leaf)))
$logFile = "$env:ProgramData\Intune-PowerShell-Logs\$scriptName-" + $(Get-Date).ToFileTimeUtc() + ".log"
Start-Transcript -Path $LogFile -Append

<#
    END SECTION
#>

<#
    Setup Script Variables
#>

$scriptLocation = "https://#############.blob.core.windows.net/pub-intune-scripts/DriveMapping.ps1" #enter the path to your script StorageAccounts->"account"->Blobs->"container"->"script"->URL
$taskName = "Map Network Drives" #enter the name for your scheduled task

<#
    END SECTION
#>

<#
    Setup the Scheduled Task
#>


$schedTaskCommand = "Invoke-Expression ((New-Object Net.WebClient).DownloadString($([char]39)$($scriptLocation)$([char]39)))"
$schedTaskArgs= "-ExecutionPolicy Bypass -windowstyle hidden -command $($schedTaskCommand)"
$schedTaskExists = Get-ScheduledTask -TaskName $taskName -ErrorAction SilentlyContinue
If (($schedTaskExists)-and (Get-ScheduledTask -TaskName $taskName).Actions.arguments -eq $schedTaskArgs){
    Write-Output "Task Exists and names match"
}
Else {
    if($schedTaskExists) {
        Write-Output "OldTask: $((Get-ScheduledTask -TaskName $taskName).Actions.arguments)"
        Write-Output "NewTask: $($schedTaskCommand)"
        Write-Output "Deleting Scheduled Task"
        Unregister-ScheduledTask -TaskName $taskName -Confirm:$false -ErrorAction SilentlyContinue
    }
    Write-Output "Creating Schdeuled Task"
    $schedTaskAction = New-ScheduledTaskAction -Execute 'Powershell.exe' -Argument $schedTaskArgs
    $schedTaskTrigger = New-ScheduledTaskTrigger -AtLogon
    $schedTaskSettings = New-ScheduledTaskSettingsSet -AllowStartIfOnBatteries -DontStopIfGoingOnBatteries -Compatibility Win8
    $schedTaskPrincipal = New-ScheduledTaskPrincipal -GroupId S-1-5-32-545
    $schedTask = New-ScheduledTask -Action $schedTaskAction -Settings $schedTaskSettings -Trigger $schedTaskTrigger -Principal $schedTaskPrincipal -ErrorVariable NewSchedTaskError
    Register-ScheduledTask -InputObject $schedTask -TaskName $taskName -ErrorVariable RegSchedTaskError
}

Stop-Transcript

This now needs to be added to Intune so that it can be executed on the devices. Navigate to Intune, Device configuration, PowerShell scripts and add a new script.

Once the file is uploaded, click on Configure to check how the script should be run

Once complete click Save and the script will be uploaded.

Finally, we need to assign the script to users or devices. In my example, all my computers are deployed via Autopilot, so I assign the script to my Autopilot security groups, which contain all the computer accounts.

The end result

When the Intune script runs on the endpoint, it checks whether the scheduled task exists and whether the command it executes matches any previous configuration. If there is no task, one is created; if there are changes, the old task is deleted and a new one is created.

When a user signs in they will see a popup window as the auth token is generated and then, if they are connected to the corporate network, their network drives will be mapped.

If you need to change the drives that a user has access to (either as you migrate to a more appropriate cloud service or because you change the servers that host the data), simply amend the script in the blob store and the new drives will be mapped at the next logon.

The Intune script can be re-used for any other code that you want to run at user logon: simply reference the link to the script in the blob store and the name of the scheduled task you wish to use.
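
For example, to re-use it for a different logon action you would only need to change the two variables at the top of the Intune script (the values below are hypothetical):

# Hypothetical re-use of the scheduled task script for another blob-hosted logon script
$scriptLocation = "https://<yourstorageaccount>.blob.core.windows.net/pub-intune-scripts/SetDefaultPrinter.ps1"
$taskName = "Set Default Printer"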

The scripts in full

Drive Mapping Script

<#
    DESCRIPTION:    Iterate through a list of drive mappings, matching the include/exclude groups to Azure AD groups
                    Where they match, connect the drive letter to the UNC path
    AUTHOR:         Matt White (matthewwhite@itlab.com)
    DATE:           2019-04-06
    USAGE:          Edit the variables section with your app registration details and drive mappings
                    Upload the script to your Azure blob storage container so the scheduled task can download and run it

#>

<#
    DO NOT EDIT THIS SECTION
#>

$scriptName = ([System.IO.Path]::GetFileNameWithoutExtension($(Split-Path $script:MyInvocation.MyCommand.Path -Leaf)))
$logFile = "$env:ProgramData\Intune-PowerShell-Logs\$scriptName-" + $(Get-Date).ToFileTimeUtc() + ".log"
Start-Transcript -Path $LogFile -Append

<#
    END SECTION
#>

$clientId = "73b7bec7-####-####-####-############" #This is the Client ID for your Application Registration in Azure AD
$tenantId = "3b7b2097-####-####-####-############" # This is the Tenant ID of your Azure AD Directory
$redirectUri = "https://login.microsoftonline.com/common/oauth2/nativeclient" # This is the Return URL for your Application Registration in Azure AD

$dnsDomainName = "skunklab.co.uk" #This is the internal name of your AD Forest


$Drivemappings = @( #Create a line below for each drive mapping that needs to be created.
    @{"includeSecurityGroup" = "FOLDERPERM_FULL-ACCESS" ; "excludeSecurityGroup" = "" ; "driveLetter" = "T" ; "UNCPath" = "\\skunklab.co.uk\dfs\shared"},
    @{"includeSecurityGroup" = "FOLDERPERM_ALL-STAFF" ; "excludeSecurityGroup" = "FOLDERPERM_FULL-ACCESS" ; "driveLetter" = "T" ; "UNCPath" = "\\skunklab.co.uk\dfs\shared\sharedaccess"}
)
 
# Add required assemblies
Add-Type -AssemblyName System.Web, PresentationFramework, PresentationCore
 
# Scope - needs to include all permissions required, separated with a space
$scope = "User.Read.All Group.Read.All" # This is just an example set of permissions
 
# Random State - state is included in response, if you want to verify response is valid
$state = Get-Random
 
# Encode scope to fit inside query string 
$scopeEncoded = [System.Web.HttpUtility]::UrlEncode($scope)
 
# Redirect URI (encode it to fit inside query string)
$redirectUriEncoded = [System.Web.HttpUtility]::UrlEncode($redirectUri)
 
# Construct URI
$uri = "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/authorize?client_id=$clientId&response_type=code&redirect_uri=$redirectUriEncoded&response_mode=query&scope=$scopeEncoded&state=$state"
 
# Create Window for User Sign-In
$windowProperty = @{
    Width  = 500
    Height = 700
}
 
$signInWindow = New-Object System.Windows.Window -Property $windowProperty
    
# Create WebBrowser for Window
$browserProperty = @{
    Width  = 480
    Height = 680
}
 
$signInBrowser = New-Object System.Windows.Controls.WebBrowser -Property $browserProperty
 
# Navigate Browser to sign-in page
$signInBrowser.navigate($uri)
    
# Create a condition to check after each page load
$pageLoaded = {
 
    # Once a URL contains "code=*", close the Window
    if ($signInBrowser.Source -match "code=[^&]*") {
 
        # With the form closed and complete with the code, parse the query string
 
        $urlQueryString = [System.Uri]($signInBrowser.Source).Query
        $script:urlQueryValues = [System.Web.HttpUtility]::ParseQueryString($urlQueryString)
 
        $signInWindow.Close()
 
    }
}
 
# Add condition to document completed
$signInBrowser.Add_LoadCompleted($pageLoaded)
 
# Show Window
$signInWindow.AddChild($signInBrowser)
$signInWindow.ShowDialog()
 
# Extract code from query string
$authCode = $script:urlQueryValues.GetValues(($script:urlQueryValues.keys | Where-Object { $_ -eq "code" }))
 
if ($authCode) {
 
    # With Auth Code, start getting token
 
    # Construct URI
    $uri = "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token"
 
    # Construct Body
    $body = @{
        client_id    = $clientId
        scope        = $scope
        code         = $authCode[0]
        redirect_uri = $redirectUri
        grant_type   = "authorization_code"
    }
 
    # Get OAuth 2.0 Token
    $tokenRequest = Invoke-WebRequest -Method Post -Uri $uri -ContentType "application/x-www-form-urlencoded" -Body $body
 
    # Access Token
    $token = ($tokenRequest.Content | ConvertFrom-Json).access_token
 
}
else {
 
    Write-Error "Unable to obtain Auth Code!"
 
}
 
####
# Run Graph API Query to get group membership
####
 
$uri = "https://graph.microsoft.com/v1.0/me/memberOf"
$method = "GET"
 
# Run Graph API query 
$query = Invoke-WebRequest -Method $method -Uri $uri -ContentType "application/json" -Headers @{Authorization = "Bearer $token"} -ErrorAction Stop
$output = ConvertFrom-Json $query.Content
$usergroups = @()
foreach ($group in $output.value) {
    $usergroups += $group.displayName
}
 
# Loop the Drive Mappings and check group membership

$connected=$false
$retries=0
$maxRetries=3

Write-Output "Starting script..."
do {
    if (Resolve-DnsName $dnsDomainName -ErrorAction SilentlyContinue) {
        $connected=$true
    }
    else {
        $retries++
        Write-Warning "Cannot resolve: $dnsDomainName, assuming no connection to fileserver"
        Start-Sleep -Seconds 3
        if ($retries -eq $maxRetries){
            Throw "Exceeded maximum number of retries ($maxRetries) to resolve DNS name ($dnsDomainName)"
        }
    }
} 
while( -not ($Connected))

Write-Output $usergroups

$drivemappings.GetEnumerator()| ForEach-Object {
    Write-Output $PSItem.UNCPath
    if(($usergroups.contains($PSItem.includeSecurityGroup)) -and ($usergroups.contains($PSItem.excludeSecurityGroup) -eq $false)) {
        Write-Output "Attempting to map $($Psitem.DriveLetter) to $($PSItem.UNCPath)"
        New-PSDrive -PSProvider FileSystem -Name $PSItem.DriveLetter -Root $PSItem.UNCPath -Persist -Scope global
    }
}

Stop-Transcript

Intune Scheduled Task Script

<#
    DESCRIPTION:    Create a Scheduled Task that runs at user logon and executes
                    a PowerShell script stored in an Azure blob storage account
    AUTHOR:         Matt White (matthewwhite@itlab.com)
    DATE:           2019-04-06
    USAGE:          Edit the values in the first section with the link to your script
                    Add in the name of the scheduled task that you want to be created
                    Upload the script to Intune to execute as a system context script

#>

<#
    DO NOT EDIT THIS SECTION
#>

$scriptName = ([System.IO.Path]::GetFileNameWithoutExtension($(Split-Path $script:MyInvocation.MyCommand.Path -Leaf)))
$logFile = "$env:ProgramData\Intune-PowerShell-Logs\$scriptName-" + $(Get-Date).ToFileTimeUtc() + ".log"
Start-Transcript -Path $LogFile -Append

<#
    END SECTION
#>

<#
    Setup Script Variables
#>

$scriptLocation = "https://###########.blob.core.windows.net/pub-intune-scripts/DriveMapping.ps1" #enter the path to your script StorageAccounts->"account"->Blobs->"container"->"script"->URL
$taskName = "Map Network Drives" #enter the name for your scheduled task

<#
    END SECTION
#>

<#
    Setup the Scheduled Task
#>


$schedTaskCommand = "Invoke-Expression ((New-Object Net.WebClient).DownloadString($([char]39)$($scriptLocation)$([char]39)))"
$schedTaskArgs= "-ExecutionPolicy Bypass -windowstyle hidden -command $($schedTaskCommand)"
$schedTaskExists = Get-ScheduledTask -TaskName $taskName -ErrorAction SilentlyContinue
If (($schedTaskExists)-and (Get-ScheduledTask -TaskName $taskName).Actions.arguments -eq $schedTaskArgs){
    Write-Output "Task Exists and names match"
}
Else {
    if($schedTaskExists) {
        Write-Output "OldTask: $((Get-ScheduledTask -TaskName $taskName).Actions.arguments)"
        Write-Output "NewTask: $($schedTaskCommand)"
        Write-Output "Deleting Scheduled Task"
        Unregister-ScheduledTask -TaskName $taskName -Confirm:$false -ErrorAction SilentlyContinue
    }
    Write-Output "Creating Schdeuled Task"
    $schedTaskAction = New-ScheduledTaskAction -Execute 'Powershell.exe' -Argument $schedTaskArgs
    $schedTaskTrigger = New-ScheduledTaskTrigger -AtLogon
    $schedTaskSettings = New-ScheduledTaskSettingsSet -AllowStartIfOnBatteries -DontStopIfGoingOnBatteries -Compatibility Win8
    $schedTaskPrincipal = New-ScheduledTaskPrincipal -GroupId S-1-5-32-545
    $schedTask = New-ScheduledTask -Action $schedTaskAction -Settings $schedTaskSettings -Trigger $schedTaskTrigger -Principal $schedTaskPrincipal -ErrorVariable NewSchedTaskError
    Register-ScheduledTask -InputObject $schedTask -TaskName $taskName -ErrorVariable RegSchedTaskError
}

Stop-Transcript

Comments on "Mapping legacy file shares for Azure AD joined devices"

  1. Thanks a lot for the great script. I have one more question: is it possible to fill the Credential Manager with access data in the script? At the moment I use a script in which the users have to enter their access data themselves:

    $Username = Read-Host "Enter Username (Example: firstname.lastname)"
    $Username = $Username.ToLower()
    #$UsernameShort = $Username.Replace('@DOMAIN','')
    $Password = Read-Host "Enter Password"

    #CleanUp Credentials from Credentials Manager
    cmdkey /delete:server01 >$Null
    cmdkey /delete:server02 >$Null

    #Create Credentials to Credentials Manager
    cmdkey /add:server01 /user:$Username /pass:$Password
    cmdkey /add:server02 /user:$Username /pass:$Password

  2. Have you got your identity federated to another idp? Either on-prem adfs or something external like Okta or Ping?

    Presume when you complete the sign in the drives map as expected

    1. Yes federated AD with a few domains, you could say the domain we’re logging on to is a secondary domain and the primary domain with the dirsync for all 3 domains is on a different domain. For the local AD domain in the script we set the domain to the local secondary domain not the domain which is primary for doing dirsync etc.

    2. Hi Matt,

      Just to update I had to run the script first on a machine running ps as an Azure AD admin which added some further delegation to the Azure App to read all users groups and read all users profile data. The script now runs fine.

      However, one drive maps for a user but not all the drives…

      @{"includeSecurityGroup" = "Logon Admin, Logon Library, Logon Medical" ; "excludeSecurityGroup" = "" ; "driveLetter" = "N" ; "UNCPath" = "\\srv-apps\Apps"},

      Is this the correct formatting for adding multiple groups to be assigned one drive share?

      Strangely the drive which did map is further down the code in the script. The logs shows the drive unc paths but only one is actually mapped.

      Also when the next user logs in the new drives aren’t mapped, the powershell script doesn’t even pop up to run at the next logon. Still looking in to all this just wondering if you have come across it before?

      Thanks,
      Stefan

  3. Hi Matt,

    Thanks for sharing the scripts and knowledge. We have setup the Azure AD app, delegated, and granted admin permissions, however when the user logs on a popup appears asking for authentication. I thought the authentication was seamless because of the granted admin permissions? Is something missing?

    Thanks,
    Stefan

    1. Hi Stefan, are the users signing into the device with their AzureAD identity as part of the script execution?

      1. Thanks for the quick reply. I’ll have to check but I think they are logging on with their local domain credentials, which are dirsync’d to Azure AD. Would have thought using either local or aad would both authenticate the same as they’re in sync… do they have to login with azuread login and not @domain.local ?

        1. I haven’t tested it in that scenario but if they are signing in with local domain credentials and the devices are hybrid aad joined wouldn’t you just use gpo instead and always-on vpn to get the machines online

          Alternatively if they are trying to connect to on prem resources using aad credentials that are syncd do you have the correct domain and forest functional level in your ad estate?

          1. Trying to move away from using domain so avoiding hybrid aad. Actually thinking about it they must be logging on using the azure ad identity, because the devices aren’t on the local domain. It’s just strange that the requesting permission box pops up I was pretty sure the Application > API > delegate admin access was all needed to skip this box.

          2. After accepting the azure ad permissions box nothing happens. The scheduled task is created though. After rebooting and logging on again with the same user nothing happens.

  4. Nice script..works OK! 😉

    But can you maybe expand the scheduled task script with an addition to also run/trigger at event id 20224 from the RASclient?

    this is because if users setup a VPN to Azure they also want their drives mapped after the connection is established. useful for home or field users.

    Above will then also run it when the event id has occured.

    Which is "The link to the Remote Access Server has been established by user DOMAIN\USER".

    My knowledge of Powershell is not enough to add the above.

    1. Hi JJ, this is something that I think is achievable, as you can trigger a scheduled task based on an event ID. What I think we would need in this case is a custom piece of XML that defines the triggers for the task to run and looks for the event ID. I think I have something similar in another script so will try and dig it out to share it.

    2. So having had a look back through code samples I was right – you can’t use the stock PowerShell cmdlets to create custom conditions for when the task should trigger. You can however set the task to register based on an XML input instead. The code below should replace the Intune script that creates the Scheduled task and it will look specifically for the eventID you asked to be triggered on

      If you replace the “Setup Scheduled Task” code with the detail below this should work:

      <#
          Setup the Scheduled Task
      #>
       
       
      $schedTaskCommand = "Invoke-Expression ((New-Object Net.WebClient).DownloadString($([char]39)$($scriptLocation)$([char]39)))"
      $schedTaskArgs= "-ExecutionPolicy Bypass -windowstyle hidden -command $($schedTaskCommand)"
      $schedTaskExists = Get-ScheduledTask -TaskName $taskName -ErrorAction SilentlyContinue
      If (($schedTaskExists)-and (Get-ScheduledTask -TaskName $taskName).Actions.arguments -eq $schedTaskArgs){
          Write-Output "Task Exists and names match"
      }
      Else {
          if($schedTaskExists) {
              Write-Output "OldTask: $((Get-ScheduledTask -TaskName $taskName).Actions.arguments)"
              Write-Output "NewTask: $($schedTaskCommand)"
              Write-Output "Deleting Scheduled Task"
              Unregister-ScheduledTask -TaskName $taskName -Confirm:$false -ErrorAction SilentlyContinue
          }
          Write-Output "Creating Schdeuled Task"
          $taskxml = @"
      
      
        
        
        
          
            true
          
      	
            true
            <QueryList><Query Id="0" Path="Application"><Select Path="Application">*[System[Provider[@Name='RasClient'] and EventID=20224]]</Select></Query></QueryList>
          	  
        
        
          
            S-1-5-32-545
            LeastPrivilege
          
        
        
          IgnoreNew
          false
          false
          true
          false
          false
          
            PT10M
            PT1H
            true
            false
          
          true
          true
          false
          false
          false
          true
          false
          PT72H
          7
        
        
          
            Powershell.exe
            $($schedTaskArgs)
          
        
      
      "@
          Register-ScheduledTask -xml $taskXML -TaskName $taskName -ErrorVariable $RegSchedTaskError
      }

      Let me know if this works

      1. yes…this works also…the last challenge to have this 100% functional is when the user activates the business SSID wi-fi. if they logon through wireless the scripts runs, but the drives are not available yet.

        The last part of the script then has to be a trigger on both eventID 8001 from WLAN-AutoConfig, and in the detail part it should look for the SSID of the business Wi-Fi only, or it will trigger at every wi-fi connection made.

        1. I’ve got a new post in the offing that will overhaul the login script to enable multiple triggers. Should allow you to specify this but would need to check

  5. Hi Matt – another quick question: we discovered a minor flaw in that the memberOf Graph API call is only pulling in the first 100 security groups. We've got users who have more than that, which results in the drive mapping not working… any thoughts or workarounds? Regards Jim.

    1. I think this is a feature of the graph that it “pages” the results to reduce the payload in each query. It should be possible to modify the query to get each page until you have them all.

      I’ll have a chat to a colleague who found something similar with another graph query and share the adapted code to get multiple pages.

  6. Hi – great script, we're having an interesting issue which we're hoping you might be able to clarify. We want to use Windows Hello as the initial sign in to the device; however, this causes a problem with this script as it tries to use these credentials to map to the share, which of course it can't as the legacy file servers are not configured to support this. You have to manually map a share with the domain credentials, e.g. username + password, for this to work. Once the credentials are cached, happy days, everything maps fine. Any thoughts on a way forward on this?

    We don’t really want to be having users to intervene like this as it will create confusion.

    1. Hi Jim,

      I have come across this before and it depends how your legacy forest and directory services are setup and integrated with Azure AD. For this to work seamlessly you need to have Windows Hello for Business configured in either Key Trust or Certificate Trust model with the relevant supporting version of Active Directory Domain Services as well as your Domain and Forest Functional Levels.

      When this is in place your users will authenticate to Azure AD as normal using the strong credentials but if they can see a Domain Controller for your forest they will also generate a Kerberos token that is valid for user authentication to legacy resources.

      Personal preference has been to do this as a Key Trust deployment as I didn’t need to deploy ADCS and certificate profiles to all my devices, just my Domain Controllers. In this mode, you need to have DCs running Windows Server 2016 or newer with up to date certificates issued to them by your internal PKI. This article from Microsoft should sort out the authentication issues for you if you opt for the Hybrid Key Trust model.

      1. Nice one, thanks for the quick response on this, much appreciated, will have a look + review. I think in the interim we'll probably disable Windows Hello for the time being until we can get all our DCs up to the required OS level + forest auth.

        1. Yeah, depending on what state the AD forest is in at the start of the process and the size of the organisation this can be a relatively sizeable undertaking. There is also still the requirement to have some form of internal PKI which, depending upon what has been deployed in the past, may also be a requirement to setup.

        2. One other thing that may work for you – getting the user to add *.domain.local in Credential Manager with their UPN and password seems to fix the use of Windows Hello. The only time people would need to update this is when their password changes, but in testing it seems to provide a domain-wide alternative to the certificate piece.

  7. So this does work great on my machine and a couple of other folks who have admin rights. I can’t get it to work for standard users though.

    1. What happens on the standard user machine, can you share a subset of the log files that run on a standard user machine

    1. The account needs to exist in Azure AD and have a matching account in Active Directory that can authenticate, normally this would be synchronised through Azure AD Connect yes

  8. Hi

    I wanted to use this in a project I am currently on. I seem to get the scheduled task to work, but every time I log on the script does not seem to run and there is nothing in the logs.

    1. Did you manage to get this working? If you run the script interactively from a PowerShell window, what do you get in the output? Is there anything like AppLocker that would prevent the task/script from executing?

  9. Very cool tutorial. I’m curious, does this require the computer to be Hybrid AD Joined? I ask because I didn’t see any reference when mapping the drive to where you would need to specify different credentials to make this connection. My assumption going into this was that this was for a purely AAD Joined machine connecting to a traditional file share that’s joined to an On-Premise domain (or even just a WORKGROUP server), with no AAD Connect involved. Can you verify what the circumstances are here?

    1. I believe it’s a AADJ machine but connected to corporate network to be able to access org resources.

      1. That’s right, the devices are purely AADJ rather than hybrid joined. If we had a hybrid join I would still probably use GPP to map the drives rather than script the functionality as I have done here
