In Michael Niehaus’ recent blog post, Configuring Windows 10 defaults via Windows Autopilot using an MSI, he talked about the ability to set the time zone of the device based on a variable in the Config.xml file. One of the comments on the blog asked whether it would be possible to set the time zone based on where the device was at the time of setup, rather than on an attribute in the file.
Whilst Windows 10 has a feature to detect the time zone automatically, I have found that this doesn’t always work out of the box, and the time zone occasionally remains set to a region that is not accurate for where the device is. This should be relatively simple to automate if we can geolocate the public IP address and match those coordinates to a valid time zone. For this I am using two web services:
https://ipstack.com – this maps the public IP address to a geolocation, including coordinates; you get access to 10,000 queries a month for free
Bing Maps API – this returns the correct time zone in a valid Windows format that the script can use to set the time zone on the device; there is a free tier here that also gives you 10,000 queries per month without charge
The script will execute the following in sequence (a condensed sketch follows the list):
Attempt to get the coordinates of the IP address you are using via the IPStack API
If successful, attempt to find the time zone from the Bing Maps API
Compare the value of the current time zone on the machine with the response from the Bing Maps API
Change the time zone on the machine
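Here is that sequence condensed into a minimal sketch. The JSON field names (latitude, longitude and the Bing windowsTimeZoneId property) are taken from the two APIs’ documentation, so double-check them against the current docs before relying on this:

$ipStackAPIKey = "yourIPStackKey"   # your ipstack API key goes here
$bingMapsAPIKey = "yourBingMapsKey" # your Bing Maps API key goes here

# Geolocate the public IP via ipstack
$geo = Invoke-RestMethod -Uri "http://api.ipstack.com/check?access_key=$ipStackAPIKey"

# Ask Bing Maps for the Windows time zone at those coordinates
$tzUri = "https://dev.virtualearth.net/REST/v1/TimeZone/$($geo.latitude),$($geo.longitude)?key=$bingMapsAPIKey"
$windowsTimeZone = (Invoke-RestMethod -Uri $tzUri).resourceSets[0].resources[0].timeZone.windowsTimeZoneId

# Only change the time zone if it differs from the current setting
if ($windowsTimeZone -and (Get-TimeZone).Id -ne $windowsTimeZone) {
    Set-TimeZone -Id $windowsTimeZone
}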
The only variables you should need to change in the script are the two lines that contain your API keys, helpfully called ipStackAPIKey and bingMapsAPIKey.
You can now choose either to save the file as a script to run once in Intune, or to incorporate it into Michael’s MSI. If you wanted the script to run every time the machine starts up, you could adapt the logon script from my recent post on Mapping legacy file shares for Azure AD joined devices.
DESCRIPTION: Based on the public egress IP, obtain geolocation details and match them to a time zone
Once obtained, set the time zone on the destination computer
More and more of my customers are moving their devices from a traditional IT model to a Modern Desktop build directly in Azure AD, managing devices via Microsoft Intune rather than Group Policy or System Center Configuration Manager. The move to this modern approach of delivering IT services usually sits alongside moving the organisation’s unstructured file data to OneDrive and SharePoint Online, which is the logical place to store this data instead of leaving it sat on a file server in an office or datacentre.
What if, however, you still have a large volume of data that remains on your on-premises file servers? Users will still require access to these shares, but there is no native way of connecting to file shares within the Intune console. This is the challenge I have faced for a customer in recent weeks, and I have developed a couple of PowerShell scripts that can be run to map drives when a user logs in, supporting both dedicated and shared devices.
Looking back at a legacy IT approach, drive mappings were done through either Group Policy Preferences or login scripts such as batch or KIX. Both processes follow a similar method:
User signs into a device
The GPP or login script runs, containing a list of drive mappings aligned to security groups of users who should have access
If the user signing into the device is in the relevant groups, the drive letter is mapped to the shared location
This method has worked for years, and IT admins maintain one or the other process to give users access to corporate data. If we now look forward to the modern managed IT environment, there are a few issues when working with legacy file servers:
There is no native construct in Intune that maps UNC file paths for users
Whilst you can run a PowerShell script containing the New-PSDrive cmdlet, Intune will only execute it once on the device and never again.
You may think that the second point isn’t an issue: simply create a couple of scripts to map the network drives to the file shares and they will run once and remain mapped. But what if the devices are shared and multiple users need to sign into the computer, or you need to amend the drive mappings? We needed a solution that could map drives at user sign-in and be easy to change as the organisation moves away from file servers.
As with most things, I started by looking at what was on the Internet and quickly came across these blogs from Nicola Suter and Jos Lieben, but neither really did what I needed for my customer (they have >100 different network drives). I set about building scripts that would deliver what the customer needed.
My requirements for the new drive mapping script were as follows:
Work natively with Azure AD joined devices
Support users on dedicated or shared workstations
Process the drive mappings sequentially, as a traditional GPP or login script would
Let’s start with the drive mapping script itself.
Drive mapping script
For the drive mapping script to work, it needs to run silently and also enumerate the groups that the user belongs to. Sounds easy, but PowerShell and Azure AD don’t natively have a way of matching these. I settled on making use of Microsoft’s Graph API listMemberOf function, as this can be called to pull the groups that a user is a member of into a variable that the drive mapping can work with. The function requires a minimum permission of Directory.Read.All, which needs to be granted through an App Registration in Azure AD. Step forward my next web help in the form of Lee Ford’s blog on using Graph API with PowerShell.
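To illustrate the idea (token acquisition itself is covered in Lee Ford’s post, so assume $token already holds a valid bearer token for the app registration), the group lookup is a single REST call:

# Sketch only - assumes $token already holds a bearer token for the app registration
$headers = @{ Authorization = "Bearer $token" }
$result = Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/me/memberOf" -Headers $headers

# Keep just the group display names for matching against the drive mapping list
# (note: memberships spanning more than one page are returned via an @odata.nextLink property)
$groups = $result.value | ForEach-Object { $_.displayName }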
Configure Azure AD
First, sign in to the Azure portal and navigate to Azure AD, then Application Registrations (Preview), to create a new App Registration. Give the app a name.
When it’s created you will be shown the new app’s details. Make sure that you note down the Directory (tenant) ID and the Application (client) ID, as you will need these in the script.
As well as these ID values, you also need a Redirect URI that is referenced in the script. Click on Add a Redirect URI, choose the item in the screenshot below, then click Save.
Now that the app is registered, we need to add permissions to read data from the Graph API. Click on the API Permissions heading to grant the required Directory.Read.All delegated permission.
By default, each user making a connection to the API will be required to consent to the permissions. To make this seamless, we can use an administrative account to grant consent on behalf of all users in the organisation.
That completes the setup of the Azure AD application that will be used to access the Graph API; we can now focus on the PowerShell script that will map the drives.
The Drive Mapping Script
The Drive mapping script is made up of several parts:
Configuration section where you set up the Application Registration details and the drive mappings that will be run for each user
Connection to the Graph API
Enumeration of group membership for the user
Iterating through all drive mappings and mapping those whose groups the user is a member of.
I will share the script in full further down this post, but have included the key snippets at each step.
First, we define the variables for the app registration we created earlier.
$clientId = "73b7bec7-738#-####-####-############" # The Client ID for your Application Registration in Azure AD
$tenantId = "3b7b2097-f13#-####-####-############" # The Tenant ID of your Azure AD directory
$redirectUri = "https://login.microsoftonline.com/common/oauth2/nativeclient" # The Redirect URI for your Application Registration in Azure AD
$dnsDomainName = "skunklab.co.uk" # The internal DNS name of your AD forest
Once these are in place, we set up our array of drive mappings. In this section there are four attributes that can be defined:
includeSecurityGroup – the group of users who should have the drive mapped
excludeSecurityGroup – a group of users who shouldn’t have the drive mapped (this is optional)
driveLetter – the letter that will be assigned to the drive mapping
UNCPath – the path of the file share that should be mapped to the drive letter
The code in the script looks like this:
$Drivemappings = @( # Create a line below for each drive mapping that needs to be created
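    # For illustration only - these group names and UNC paths are hypothetical
    @{includeSecurityGroup = "GRP-Finance"; excludeSecurityGroup = ""; driveLetter = "F"; UNCPath = "\\fileserver01\Finance"},
    @{includeSecurityGroup = "GRP-Sales"; excludeSecurityGroup = "GRP-Sales-Contractors"; driveLetter = "S"; UNCPath = "\\fileserver01\Sales"}
)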
Finally, we need to check that the user can see the domain (there’s no point executing the script if they are out of the office) and then, for each group the user is a member of, map the corresponding drives using the New-PSDrive cmdlet.
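A sketch of that logic, assuming $groups holds the display names returned from the Graph call and $Drivemappings is the array above:

# Sketch only - assumes $groups (from the Graph call) and $Drivemappings are already populated
if (Test-Connection -ComputerName $dnsDomainName -Count 1 -Quiet) {
    foreach ($mapping in $Drivemappings) {
        $include = $groups -contains $mapping.includeSecurityGroup
        $exclude = $mapping.excludeSecurityGroup -and ($groups -contains $mapping.excludeSecurityGroup)
        if ($include -and -not $exclude) {
            New-PSDrive -Name $mapping.driveLetter -PSProvider FileSystem -Root $mapping.UNCPath -Persist -Scope Global -ErrorAction SilentlyContinue
        }
    }
}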
This now needs to be added to Intune so that it can be executed on the devices. Navigate to Intune, Device Configuration, PowerShell scripts and add a new script.
Once the file is uploaded, click on Configure to set how the script should be run.
Once complete click Save and the script will be uploaded.
Finally, we need to assign the script to users or devices. In my example, all my computers are deployed via Autopilot, so I assign the script to my Autopilot security groups, which contain all the computer accounts.
The end result
When the Intune script runs on the endpoint, it checks whether the scheduled task exists and whether the script it will execute matches any previous configuration. If there is no task, one is created; if there are changes, the old task is deleted and a new one is created.
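Conceptually, the system-context script does something like this (the task name and script path are placeholders, not the values used in the full script below, and the comparison of script contents is omitted for brevity):

# Conceptual sketch - task name and script path are placeholders
$taskName = "MapNetworkDrives"
$scriptPath = "C:\ProgramData\Scripts\DriveMapping.ps1"

# Remove any existing task so it can be recreated with the current configuration
$existing = Get-ScheduledTask -TaskName $taskName -ErrorAction SilentlyContinue
if ($existing) { Unregister-ScheduledTask -TaskName $taskName -Confirm:$false }

# Register a task that runs the drive mapping script at user logon
$action = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-ExecutionPolicy Bypass -File `"$scriptPath`""
$trigger = New-ScheduledTaskTrigger -AtLogOn
Register-ScheduledTask -TaskName $taskName -Action $action -Trigger $trigger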
When a user signs in, they will see a popup window as the auth token is generated, and then, if they are connected to the corporate network, their network drives will be mapped.
If you need to change the drives that a user has access to (either as you migrate to a more appropriate cloud service or because you change the servers that host the data), simply amend the script in the blob store and the new drives will be mapped at the next logon.
The Intune script can be re-used for any other code that you want to run at user logon; simply reference the link to the script in the blob store and the name of the scheduled task you wish to use.
The scripts in full
Drive Mapping Script
DESCRIPTION: Iterate through a list of drive mappings and match the groups to Azure AD groups
Where they match, connect to the UNC path
AUTHOR: Matt White (firstname.lastname@example.org)
USAGE: Edit the values in the first section with respect to your link to the script
Add in the name of the scheduled task that you want to be called
Upload the script to Intune to execute as a system context script
After I completed a recent upgrade of Windows 10 to 1803, I noticed that my English (United Kingdom) language had been joined by English (United States), which I couldn’t remove from the system.
A lot of my work recently has been working with Microsoft Intune to utilise Microsoft Modern Management constructs and principles to deliver a cloud first approach to provisioning new Windows 10 endpoints for an organisation.
Since Microsoft migrated Intune management from the classic interface to the Azure Portal, the ability to execute installers for legacy line of business applications has been reduced. The idea is that the modern workplace consumes data via apps from an app store, and this is evident in Microsoft’s support for the Microsoft Store for Business and Universal Windows Platform .appx package support in Intune; however, this is not always feasible in most workplaces. There are still legacy line of business applications that require an MSI or EXE based installer, and whilst Intune will support Line of Business installers that are MSI based, there is again a limitation that the MSI must contain all the code required to install the application. There is currently no support for EXE based installers in the Azure Portal for Intune.
Back at Microsoft Ignite 2017, Microsoft announced the availability of the Intune Management Extension and the support to execute PowerShell scripts on Windows 10 Endpoints via Microsoft Intune (Read More). This got me thinking about how to extend the functionality of Microsoft Intune to deliver a more traditional (MDT / SCCM) provisioning process for legacy applications on modern managed Windows 10 devices.
If you could store your legacy line of business applications in a web accessible location (with appropriate security controls to prevent unauthorised access), you could then utilise the Intune Management Extension and PowerShell scripts to download the application install payload to a temporary location and execute it, overcoming the limitation of the Intune portal.
Looking around the Internet, I came across this blog post by MVP Peter van der Woude, which integrates the Chocolatey package manager with Intune. With a bit of reworking, I amended the PowerShell code to download and install the AEM agent onto a target machine.
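The resulting pattern boils down to a few lines; the URL and silent-install switches below are placeholders rather than the actual AEM values:

# Placeholder URL and switches - substitute your own installer location and arguments
$url = "https://yourstorageaccount.blob.core.windows.net/apps/agent.msi"
$installer = Join-Path $env:TEMP "agent.msi"

# Download the payload to a temporary location, then execute it silently
Invoke-WebRequest -Uri $url -OutFile $installer
Start-Process -FilePath "msiexec.exe" -ArgumentList "/i `"$installer`" /qn" -Wait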
As I deploy more and more instances of Microsoft Intune, I am having to configure managed applications for Android and iOS enrolled devices. Whilst the iOS App Store has been neatly integrated into the Azure portal, Android apps need to be added by their relevant app URL (frustrating, and something I hope Microsoft / Google can fix in the near future).
When configuring the mobile app and app protection policies, it can be useful to have the correct Microsoft suite (Outlook, Word, Excel, PowerPoint, OneDrive, Skype for Business etc.) installed automatically on the end user device. The list below is hopefully a time saver, letting you get to each of the applications without having to click through or search the Google Play store manually:
Following a long break after completing my MCSA: Messaging in Server 2003, I have finally got round to updating it for the modern era, upgrading first to MCTS in Windows Server 2008 and finally, this afternoon, completing my 70-647 exam to attain the qualification of Microsoft Certified IT Professional: Enterprise Administrator.
For those of you with an MCSA in Windows Server 2003 the upgrade is done with the following exams:
70-648 – TS: Upgrading from Windows Server 2003 MCSA to Windows Server 2008, Technology Specialist. This is equivalent to completing the following two exams: 70-640 and 70-642
70-680 – TS: Windows 7, Configuring. This is the client exam required as part of the qualification
70-643 – TS: Windows Server 2008 Applications Infrastructure, Configuring
70-647 – Pro: Windows Server 2008, Enterprise Administrator
Once Microsoft confirms it on the MCP site, I will update the qualification links on the left with the new logo.
I realise this has been around for a while now, but until a few weeks ago I never really appreciated Group Policy Preferences and the simplicity they offer.
Back in the days of Windows NT, Server 2000 and Server 2003, server administrators would create login scripts to perform a number of commands such as mapping network drives, installing printers, and creating shortcuts and folders… I could go on but you get the idea. In Server 2008, Microsoft introduced Group Policy Preferences to allow you to natively configure a whole host of settings in Group Policy that would otherwise take a number of lines of batch/KIX/VB script.
As you can see from the image to the left, there are a vast number of options that can be configured for a user when they log in. For most of the items there are four options: Create, Delete, Update and Replace, which let you make changes to the drive mappings, folders and so on. The difference between Update and Replace can vary from item to item, but my general understanding is that Update will attempt to modify the existing item to match what is in the Preference, whereas Replace will remove what is there and recreate the object (similar to a net use P: /DELETE /Y followed by net use P: \\Server\Users\%USERNAME%).
Another benefit is that in a single GPO you can define a number of different Preferences and then filter these around Group Membership.
This should all work out of the box with Windows Vista and above; for legacy clients and servers (Windows XP, Server 2003) you will need to download the appropriate updates from Microsoft: http://support.microsoft.com/kb/943729.
All in all, this should save time and administrative overhead once fully adopted. The only problem is getting the legacy scripts switched over to the new Preferences.
Backup Exec backs up the DFSr replicated folders using the shadow copy components, and in the past you were unable to redirect restored files to an alternate location. This could cause issues if you wanted to keep both versions of a file, as Backup Exec would overwrite the file and then perform an initial replication of that DFSr folder to the other servers in its replication group.
Whilst you could also perform an authoritative restore of the DFSr folder, this has recently caused me even more issues, resulting in support calls to Symantec and Microsoft to follow up on why this happens and what state my DFS is now in as a result of these restores.
During the initial support call to Symantec, they advised me that, for the first time in Backup Exec, you can redirect the files you restore from the Volume Shadow Copy of the DFSr folders. Simply select the server and location in the File Redirection tab in Backup Exec and you will be able to dump the folder structure to wherever you want it, then copy the relevant files back into your DFS structure as required.