Provision Microsoft Teams with Azure Automation – Part II

I am a big fan of Azure Automation and its helpful capabilities around provisioning and administering Office 365 and its components such as SharePoint, Office 365 Groups or Microsoft Teams. In this little series I want to show you how you can provision your Teams with Azure Automation in a quick and simple way, but the same would be scalable to a much more complex solution depending on your needs.

In this second part of my little series I want to show you the way currently supported in production: first create a Group and then a Team based on it. We will also have a look at how we can further adjust that Team by executing some PnP Provisioning templates against it and its corresponding SharePoint site. Finally, some adjustments to the Team (not possible with PnP Provisioning so far) will be done.

To recap what I started to explain in Part I: we have a modularized approach consisting of several runbooks, where one needs to be called first.

  • Parent Runbook Module (calls:)
    • Create Team runbook
    • PnP Provision against site
    • Execute further PS manipulations …
All our Runbooks for creating Microsoft Teams (Part I & II of this series)

You might remember our parent runbook from Part I, but here it is once again and a bit more complete.

Our parent runbook to create a team from a group

After our input parameters we retrieve our assets (credentials, AppIDs, Urls and so on) from the current Automation Account (refer to another of my posts in case you need more information on Automation Account assets). Once we have them we can call our first runbook to create the Team (see next), which we already did in part I, but there with another approach.
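A minimal sketch of that asset retrieval could look like the following (the asset names are placeholders, not necessarily the ones in your Automation Account):

# Hypothetical asset names - adjust them to your own Automation Account
$graphCreds = Get-AutomationPSCredential -Name 'GraphAppCreds'   # AppID as UserName, secret as Password
$tenant     = Get-AutomationVariable -Name 'TenantName'          # e.g. 'contoso'

# Stored globally so the called (sub)runbooks can use them as well
$global:appId     = $graphCreds.UserName
$global:appSecret = $graphCreds.GetNetworkCredential().Password
$global:aadDomain = "$tenant.onmicrosoft.com"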

Our (sub)runbook to create an Office 365 Group first and afterwards a Team from it

Most code parts are the same as in part 1. We first connect to Microsoft Graph and get the access token. Then we create our REST header. The block to retrieve the user id for the given login is also the same. What's new is the first While loop, where we check if the desired alias is already in use. If so, we modify it by appending a "2", then a "3", and so on. This pattern is only an example of course, but it gives us a bit more control in case we cannot handle it more comfortably in a potential UI.
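A hedged sketch of such a loop, assuming $headers already contains the Graph bearer token and $alias holds the requested mailNickname:

# Check whether the alias is already taken and append 2, 3, ... until it is free
$candidate = $alias
$counter = 2
do {
    $url = "https://graph.microsoft.com/v1.0/groups?`$filter=mailNickname eq '$candidate'"
    $result = Invoke-RestMethod -Uri $url -Headers $headers -Method Get
    if ($result.value.Count -gt 0) {
        $candidate = "$alias$counter"
        $counter++
    }
} while ($result.value.Count -gt 0)
$alias = $candidate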

As in this part we want to create a Group first, it's now time to create a GroupRequest body object. This is in addition to the intro video on this approach by Paolo Pialorsi that I already referenced in part 1. I also provide the alias here, as we already handled that above, and furthermore it is good practice not to provision Groups without any owners.

Next we convert the request object to JSON and execute a REST call against Microsoft Graph. Having the Group and its ID as a result, we can immediately go on preparing the next request to create a Team from that Group ("teamify"). The request is a bit smaller than in part 1, which is obvious because we already provided many parameters for the Group. It follows the same pattern: convert the request object to JSON, execute the REST request and grab the ID from the result (which is not really necessary, because Team and Group ID are in fact the same!).
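The two requests could look roughly like this (a sketch with example values; $userId, $alias and $headers come from the steps above):

# Create the Office 365 Group first
$groupRequest = @{
    displayName         = $displayName
    description         = $description
    mailNickname        = $alias
    mailEnabled         = $true
    securityEnabled     = $false
    groupTypes          = @("Unified")
    "owners@odata.bind" = @("https://graph.microsoft.com/v1.0/users/$userId")
}
$body  = ConvertTo-Json -InputObject $groupRequest
$group = Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/groups" -Headers $headers `
                           -Method Post -Body $body -ContentType "application/json"

# Then "teamify" the Group - the resulting Team ID equals the Group ID
$teamRequest = @{
    memberSettings    = @{ allowCreateUpdateChannels = $true }
    messagingSettings = @{ allowUserEditMessages = $true; allowUserDeleteMessages = $true }
}
$body = ConvertTo-Json -InputObject $teamRequest
$team = Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/groups/$($group.id)/team" -Headers $headers `
                          -Method Put -Body $body -ContentType "application/json"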

After the new Team is created (back in our parent runbook!) we can retrieve a Group object and, with it, its site url. With a small waiting loop we check that the Team and its site are really ready, as provisioning might take some time. Having that, we can call our further runbooks to go on provisioning with a PnP template and further PS cmdlets.
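Such a waiting loop might look like this (a sketch; $groupId is the ID returned by the create runbook):

# Poll until the Group's SharePoint site is provisioned (or give up after ~5 minutes)
$siteUrl = $null
$retries = 0
while (-not $siteUrl -and $retries -lt 30) {
    try {
        $site    = Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/groups/$groupId/sites/root" `
                                     -Headers $headers -Method Get
        $siteUrl = $site.webUrl
    }
    catch {
        Start-Sleep -Seconds 10   # provisioning might take some time
        $retries++
    }
}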

The next step is applying a PnP Provisioning template with which we want to create some fields, a content type and a list. The list will also get some DataRows so we have content for demonstration purposes. You find the runbook for that next:

Our (sub)runbook to apply PnP Provisioning template

First you find a function to download our existing provisioning template from an Azure Blob storage secured by a key.
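The download function could be sketched like this, assuming the Az.Storage module is available in the Automation Account (account, container and file names are placeholders):

function Get-TemplateFromAzureStorage {
    param(
        [string]$StorageAccount,  # e.g. 'mystorageaccount'
        [string]$StorageKey,      # the key securing the Blob storage, ideally an Automation asset
        [string]$Container,       # e.g. 'templates'
        [string]$TemplateName,    # e.g. 'TeamTemplate.xml'
        [string]$LocalPath        # e.g. 'C:\temp'
    )
    $context = New-AzStorageContext -StorageAccountName $StorageAccount -StorageAccountKey $StorageKey
    Get-AzStorageBlobContent -Container $Container -Blob $TemplateName `
                             -Destination $LocalPath -Context $context -Force
}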

PnP Provisioning template, stored in Azure Blob storage

Next we establish the authentication against SharePoint. You can either grab credentials from our Automation Account as a credential asset or use modern authentication with an app registration and certificate as described here.

After Connect-PnPOnline we construct our template name, download it to the local runbook runtime environment (C:\Users\… or C:\temp\… are our options) and then simply apply our provisioning template.
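In short, that part might look like this (a sketch, assuming the certificate-based connection values from our assets):

# Connect app-only and apply the downloaded template
Connect-PnPOnline -Url $siteUrl -Tenant $aadDomain -ClientId $appId `
                  -CertificatePath $certPath -CertificatePassword $secPassword

$templatePath = Join-Path "C:\temp" $templateName   # the local runbook runtime environment
Apply-PnPProvisioningTemplate -Path $templatePath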

Unfortunately I was still not able to output PnP Provisioning debugging information in a runbook output, so the following line of code is obsolete. Any hint would be greatly appreciated.

Set-PnPTraceLog -On -Level Debug

Next we come to our third runbook. Unfortunately not everything can be solved with PnP Provisioning (or not yet). So in our scenarios we decided to isolate further code steps in another runbook, typically called afterwards. In this scenario let's assume we want to create a new tab in our Team and show the just created SharePoint list in a website tab. This is what the third (sub)runbook will do for us.

First we connect to Microsoft Graph again. This might not be necessary, as we already did so in the first (sub)runbook, but it is safer in case the connection got lost. As we need to add a new tab to a Channel, we first need to identify our Channel by retrieving the "General" one.

Afterwards we create a request url based on the IDs of our Team and of the Channel we just identified. Building an object for a website tab, converting it to JSON and executing a POST request against Microsoft Graph is all we need to do to achieve our result.
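Sketched in PowerShell, assuming $teamId, $siteUrl and the Graph $headers are already in place (list name and tab title are examples):

# Identify the "General" channel of our Team
$channels = Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/teams/$teamId/channels" `
                              -Headers $headers -Method Get
$general  = $channels.value | Where-Object { $_.displayName -eq "General" }

# Build the request body for a website tab showing our just created list
$tabRequest = @{
    displayName           = "Our List"
    "teamsApp@odata.bind" = "https://graph.microsoft.com/v1.0/appCatalogs/teamsApps/com.microsoft.teamspace.tab.web"
    configuration         = @{
        contentUrl = "$siteUrl/Lists/OurList/AllItems.aspx"
        websiteUrl = "$siteUrl/Lists/OurList/AllItems.aspx"
    }
}
$body = ConvertTo-Json -InputObject $tabRequest -Depth 3
Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/teams/$teamId/channels/$($general.id)/tabs" `
                  -Headers $headers -Method Post -Body $body -ContentType "application/json"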

The custom SharePoint list, provisioned with PnP Provisioning to a Team's SharePoint site
And the same list embedded in a Teams Website tab for instance

I hope this post illustrated a bit how you can create even more complex provisioning solutions with Azure Automation and PnP PowerShell. The same is applicable to SharePoint only or Office 365 Groups, of course. In the create runbook you already saw how to create Groups; replacing this with creating SharePoint sites is an obvious possibility.

Depending on feedback, next might be a post pointing out the basics of Azure Automation a bit more, or an architectural discussion of when it makes more or less sense compared with (or combined with) alternatives such as Azure (Durable) Functions, Site Designs and so on. Looking forward to receiving some questions on that.

Markus is a SharePoint architect and technical consultant with focus on latest technology stack in Office 365 and SharePoint Online development. He loves the new SharePoint Framework as well as some backend stuff around Azure Automation or Azure Functions and also has a passion for Microsoft Graph.
He works for Avanade and is based in Munich.
Although partially inspired by his daily work, opinions are always personal.
Provision Microsoft Teams with Azure Automation – Part I

I am a big fan of Azure Automation and its helpful capabilities around provisioning and administering Office 365 and its components such as SharePoint, Office 365 Groups or Microsoft Teams. In this little series I want to show you how you can provision your Teams with Azure Automation in a quick and simple way, but the same would be scalable to a much more complex solution depending on your needs.

Inspired by a video from Paolo Pialorsi, I wanted to establish some base runbooks in Azure Automation for provisioning Microsoft Teams. In the past I built lots of them for SharePoint Online and Office 365 Groups.

As always, striving for the latest and greatest, I wanted to omit the two-step approach that Paolo also shows in his video: first create an Office 365 Group and then enhance this Group to a Team. And indeed there is at least a beta Api in Microsoft Graph for directly creating a Team, based on an existing template:
POST https://graph.microsoft.com/beta/teams

The pattern is quite simple and similar to any other POST request, you need to

  • Care for authentication by creating a token and providing it in your request header
  • Create a body to provide information for the object to be created
  • Execute a POST request by handing in the header and body

While the token creation is similar to others and shown in Paolo's video as well, let's have a detailed look at the request body:

{
  "template@odata.bind": "https://graph.microsoft.com/beta/teamsTemplates('standard')",
  "displayName": "My First Team",
  "description": "My First Team’s Description",
  "owners@odata.bind": [
    "https://graph.microsoft.com/beta/users/<UserGuid>"
  ]
}

The first thing I missed here was a "mailNickname" or alias. I checked and tried, but you cannot provide that this way, at least at the moment; you need to rely on what Microsoft creates from your requested displayName. With Groups provisioning, in contrast, I clearly preferred to give the user a chance to enter their own wishes.

The second thing I found out (and here I moved away from Microsoft's example in the beta documentation, as that wasn't working) is that you need to provide a user Guid. Same with Groups, by the way, if you want to provide at least one owner from the start (which you should!). A user Guid is surely not a good thing to request from a user directly, so we need some additional lines of code. Let's come to that right now:

At first we connect against Microsoft Graph. A simple way to do that is with PnP PowerShell, of course. I assume you have already registered an app for creating Groups as documented here, and that we already grabbed our id, secret and domain (see later in another runbook). After connecting we get our access token and put it in a header object.

Next we come to the problem with the user Guid. Here we assume our user request provides a login name. With a simple Graph request we get the corresponding user and its id, which we store in a variable.
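Those two steps could be sketched like this with the legacy SharePointPnPPowerShellOnline module (app registration values are placeholders retrieved from Automation assets):

# Connect to Microsoft Graph via PnP PowerShell and build the REST header
Connect-PnPOnline -AppId $appId -AppSecret $appSecret -AADDomain $aadDomain
$token   = Get-PnPAccessToken
$headers = @{
    "Content-Type"  = "application/json"
    "Authorization" = "Bearer $token"
}

# Resolve the user's Guid from the given login name
$user   = Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/users/$login" -Headers $headers -Method Get
$userId = $user.id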

Next we can fill our requestBody from above; not only with the user Guid but also with displayName, description and our template. I won't go into details here, but there are several Microsoft templates available and you can even override properties of the available templates. One problem we have are the single quotes which enclose the template name, as ConvertTo-Json escapes them!

"template@odata.bind" = "https://graph.microsoft.com/beta/teamsTemplates('standard')"

As we convert the just created body object to JSON in the next step, this would not work unless we pipe the JSON output to a regex method called Unescape from Microsoft's standard regex class. I found this simple but effective tip in another blogpost.

ConvertTo-Json -InputObject $teamRequest | % { [System.Text.RegularExpressions.Regex]::Unescape($_) }

The final thing now is to invoke a REST request of type POST to the teams Url, providing our just created header and body. We do that and store the response in another variable.

We would now expect to have our most recently created Team in a variable and could use its Id to request it and do further things with it. Unfortunately this is the next shortcoming of the beta version we currently use: it returns NOTHING.

So for reasons of completeness I added some "weak" lines of code to get the "latest" Team created with our requested displayName. Such a method surely isn't 100% reliable, but hopefully this gets fixed before this Api is supported for use in production.
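A sketch of that workaround: query the Groups by displayName and take the newest one.

$url    = "https://graph.microsoft.com/v1.0/groups?`$filter=displayName eq '$displayName'"
$result = Invoke-RestMethod -Uri $url -Headers $headers -Method Get
$teamId = ($result.value | Sort-Object { [datetime]$_.createdDateTime } -Descending | Select-Object -First 1).id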

Finally I want to show my regular concept for provisioning with runbooks: I always use a modularized approach because I hate PS scripts with tons of code lines. So I always use one parent runbook to be called, which calls further runbooks for the significant steps in my provisioning process, such as:

  • Create my site/Group/Team (that runbook I showed above)
  • Provision my artefacts with PnP Provisioning
  • Post Provisioning stuff, modifications where I need additional code and what doesn’t work with PnP Templates
  • Have Shared or Helper runbooks to be called from different runbook modules …

So in our case, here is a simple parent runbook that calls the just mentioned "Create" runbook, retrieves the created Team's Id and processes it further.

First we grab our assets, such as Graph access credentials, and store them in $global: variables so we have them available in all called (sub)runbooks as well, in case we need to share variables …

That was it for now. Hopefully there will be some progress with this Api for Teams creation soon. I will observe it and potentially update this post then.

In the next part I will show you the approach which is already available for production use (v1.0 Graph Api): first create a Group, then a Team out of it, and further modify the just created Team. There you will see additional things to note when handling provisioning with Azure Automation runbooks. So stay tuned.

Markus is a SharePoint architect and technical consultant with focus on latest technology stack in Office 365 and SharePoint Online development. He loves the new SharePoint Framework as well as some backend stuff around Azure Automation or Azure Functions and also has a passion for Microsoft Graph.
He works for Avanade and is based in Munich.
Although partially inspired by his daily work, opinions are always personal.

Deploy SPFx app package to SharePoint from Azure DevOps with modern authentication

As you can see from my last posts, I got heavily involved with modern SharePoint authentication in the recent past. To repeat once again:

Intro

If your tenant has turned off legacy authentication you cannot simply authenticate with PowerShell and UserCredentials anymore. To check this run the following script:

Connect-SPOService -Url "https://<Your-Tenant>-admin.sharepoint.com"

$tenantsettings = Get-SPOTenant

$tenantsettings.LegacyAuthProtocolsEnabled

And potentially modify this setting (turn it on: $true):

Set-SPOTenant -LegacyAuthProtocolsEnabled $true

In one of my last posts I showed how modern authentication is handled in a PowerShell script, especially in an Azure Automation environment where you can store and retrieve the necessary self-signed certificate as an Azure Automation asset.

Now another challenge: inside an Azure DevOps Release pipeline you have one or two more PowerShell tasks. In my scenario two: one for uploading the app package to the app catalog, and a second one to upload the assets to a SharePoint library that acts as your Office 365 public CDN.

In Azure DevOps Release (and Build) pipelines you have no capability to simply store a certificate as an asset. Having it in the source code might not be an option either, to avoid distributing it to every developer's environment. I only found the way to store variables (strings!); if anyone has a better idea, speak up please 🙂

Fortunately this is also a working scenario, as you can connect to SharePoint with another variant of the Connect-PnPOnline cmdlet by providing a PEMCertificate and a PEMPrivateKey, both represented by large strings. According to my last post, both can simply be extracted from a certificate once it is created by the New-PnPAzureCertificate cmdlet.

$Password = "******"
$secPassword = ConvertTo-SecureString -String $Password -AsPlainText -Force
$cert = New-PnPAzureCertificate -Out "AzureAutomationSPOAccess.pfx" `
 -ValidYears 10 `
 -CertificatePassword $secPassword `
 -CommonName "AzureAutomationSPOAccess" `
 -Country "DE" `
 -State "Bavaria"
$PEMCert = $cert.Certificate
$PEMSecret = $cert.PrivateKey 
Write-Host $PEMCert
Write-Host $PEMSecret

On top you also have to output the KeyCredentials and combine them with an Azure AD App registration.

Release pipeline basic configuration

In the past I mentioned Elio Struyf as a good resource for SPFx Azure DevOps Build&Release Pipelines but today I would also like to mention the blogpost of Giuliano de Luca which lists up all the necessary steps and options in a fantastic way.

In my simple scenario I picked most of the basics from him, but I prefer to use a Release pipeline only in staging environments (as you have to package again for every stage; changes in write-manifest.json, for instance). To simplify, we will cover only one stage:

Simple Azure DevOps SPFx Release Pipeline (1 Stage)
Azure DevOps Release Pipeline – The Tasks

The first tasks are quite obvious. At first we follow a recommendation from Elio to use Node version 8 which significantly improves performance. Afterwards we run npm install, bundle and package our solution.

Here I want to give you a special hint on dealing with larger projects:

There I tend to use larger Git repositories, meaning not every SPFx solution has its own, as in the typical standard deployment demo scenario. That leads to the point where the standard file structure is not valid anymore. When entering the System.DefaultWorkingDirectory followed by your artifact folder ("MM" in my example, see 1st screenshot) we come to different solution directories below:

For those solution directories (assume today you want to deploy Webpart1 but tomorrow Webpart2, you have them all in one Git repository, and all are handled by one configured Release pipeline) I created a Release variable localPath that can be set at release time.

Azure DevOps Release variables

This leads us to the point where we can refer to it in the npm install or our gulp tasks when it comes to a local file path:

npm install
gulp bundle task (gulp package-solution quite similar)

So depending on which webpart we want to deploy, according to our variables above we would hand in different "packageFile" (we use it later!) and "localPath" values, and the task will find its package.json (for npm install) as well as its gulpfile.js.

PowerShell Tasks and modern authentication

But now let's come to the main point of this post: creating the two PowerShell tasks to upload our app package and the assets to the library that represents our Office 365 public CDN. At first let me clarify which task to use, as there are several out there (in the past, for instance, I used one from the marketplace). I have now switched to the official Microsoft task.

Azure DevOps Task Template PowerShell (by Microsoft)

The first task uploads the package to the app catalog. It is quite simple, as beyond the script we do not really need to configure anything else. I skipped giving a working directory (I use cd in the script instead) or providing environment variables here. Maybe worth investigating in future.

Azure DevOps PowerShell task Deploy to App catalog

As the complete script did not fit into the screenshot, here it is once again:

Write-Host ("Adding package " + $($env:packageFile) + " to the AppCatalog at https://" + $($env:tenant) + ".sharepoint.com/" + $($env:catalogsite))
cd $env:System_DefaultWorkingDirectory/MM/$env:localPath/sharepoint/solution/
Install-PackageProvider -Name NuGet -Force -Scope "CurrentUser" 
Install-Module SharePointPnPPowerShellOnline -Scope "CurrentUser" -Verbose -AllowClobber -Force
Import-Module SharePointPnPPowerShellOnline -Scope "Local" -WarningAction SilentlyContinue

Write-Host ("Connecting to Tenant " + $($env:tenant) + ".onmicrosoft.com with AppID " + $($env:SPOAppID))

Connect-PnPOnline -Url https://$(tenant).sharepoint.com/$(catalogsite) -Tenant "$(tenant).onmicrosoft.com" -ClientId $(SPOAppID) -PEMCertificate "$(PEMCertificate)" -PEMPrivateKey "$(PEMPrivateKey)"

Write-Output 'Connected to SPO'

Add-PnPApp -Path $env:packageFile -Publish -Overwrite
Write-Host ("Finished upload of app package " + $($env:packageFile))

For debug reasons I left a couple of Write-Host calls in, but they are not necessary of course. Anyway, the first one shows quite well how to reference our release variables from above ($env:<ReleaseVariableName>).

After the first output we switch our current directory into our current solution and the folder where the .sppkg file resides.

Afterwards we install our PnP-PowerShell module with 3 lines of code.

But then it comes to the ‘magic’ point of this post, the modern authentication with PnP Online:

Connect-PnPOnline 
    -Url https://$(tenant).sharepoint.com/$(catalogsite) 
    -Tenant "$(tenant).onmicrosoft.com" 
    -ClientId $(SPOAppID) 
    -PEMCertificate "$(PEMCertificate)" 
    -PEMPrivateKey "$(PEMPrivateKey)"

We build our Url from known and obvious parts combined with our release variables, the Tenant as well.

We then hand in the ID of our App registration and finally our PEMCertificate and PEMPrivateKey strings retrieved from our variables.

Finally I simplified things again and added a simple Add-PnPApp cmdlet, assuming SkipFeatureDeployment is always false. Giuliano de Luca shows a cool optional handling for this in his code repository.

The next task is quite the same:


Azure DevOps PowerShell task Upload to Office 365 public CDN
Write-Host ("Adding package " + $($env:packageFile)+ " to the AppCatalog at https://"+ $($env:tenant) + ".sharepoint.com/" + $($env:catalogsite))
cd $env:System_DefaultWorkingDirectory/MM/$env:localPath/

$cdnConfig = Get-Content -Raw -Path ("config\copy-assets.json") | ConvertFrom-Json
$bundlePath = $cdnConfig.deployCdnPath
$files = Get-ChildItem $bundlePath\*.*

Write-Output "Connecting to CDN-Site $env:CDNSite"

Connect-PnPOnline -Url $env:CDNSite -Tenant "$(tenant).onmicrosoft.com" -ClientId $(SPOAppID) -PEMCertificate "$(PEMCertificate)" -PEMPrivateKey "$(PEMPrivateKey)"

foreach ($file in $files) {
    $fullPath = $file.DirectoryName + "\" + $file.Name
    Write-Output "Uploading file $fullPath to folder $env:CDNLib"
    $f = Add-PnPFile -Path $fullPath -Folder $env:CDNLib
}

We do not need to install PnP PowerShell anymore, but only cd to a different directory. Then we grab a JSON file from our repository to get a value from there; an elegant way to omit some release variables.

We once again authenticate with our certificate values to SharePoint.

Finally we use the path value (retrieved from the JSON config file) where our bundles reside to grab all files from there. We then iterate over them one by one and upload them to the library/folder inside our CDN site.

Markus is a SharePoint architect and technical consultant with focus on latest technology stack in Office 365 and SharePoint Online development. He loves the new SharePoint Framework as well as some backend stuff around Azure Automation or Azure Functions and also has a passion for Microsoft Graph.
He works for Avanade and is based in Munich.
Although partially inspired by his daily work, opinions are always personal.
Secure Azure Functions Part 2 – Handle certificates with Azure KeyVault when accessing SharePoint Online

This is the second post of my little series on secure Azure Functions working with Office 365. The first one was about “simple” credential (user/password or ID/secret) access. Now we need to use an additional certificate.

Recently I spent lots of time with modern SharePoint authentication used in either Azure Automation or Azure Functions. For most of the parts there is some documentation, but not as a whole, step-by-step guide. This is what this blog post wants to accomplish. One of the best existing posts is this one from Jeremy Hancock.

The Architecture Scenario

As in part 1 we use an Azure Function to be securely called from a SPFx webpart with the AADHttpClient, for instance. We do not use user authentication but an impersonation, that is, a dedicated Azure AD App registration with its own permissions. This can be a necessary scenario when you need elevated privileges (as SPFx can directly use only user permissions when calling SharePoint).

Example scenarios can be around provisioning or post-provisioning site modifications (where you want to allow specific users to handle stuff that needs elevated privileges).

Securely access SharePoint Online from an Azure Function (architecture overview)

The Azure Function is once again MSI enabled so it can authenticate "itself" against the Key Vault (which granted access to the function, see part 1). With the Client ID of a registered app, which is given SharePoint Api permissions, the Azure Function will access SharePoint.

Create a self-signed certificate

For this step we can reuse the certificate / description from my last blogpost.

But in short again:

  • Create a self signed certificate with New-PnPAzureCertificate cmdlet
  • Keep the PS window open, we need the certificate’s KeyCredentials in a minute
$Password = "******"
$secPassword = ConvertTo-SecureString -String $Password -AsPlainText -Force
$cert = New-PnPAzureCertificate -Out "AzureAutomationSPOAccess.pfx" `
 -ValidYears 10 `
 -CertificatePassword $secPassword `
 -CommonName "AzureAutomationSPOAccess" `
 -Country "DE" `
 -State "Bavaria"

$cert.KeyCredentials
{
    "customKeyIdentifier": "zUFQhchR6FJ0...",
    "keyId": "4d2fe8fc-0dbb-45a7-...",
    "type": "AsymmetricX509Cert",
    "usage": "Verify",
    "value":  "MIIDJDCCAgygAwIBAgIQV9qo..."
}

App registration and SharePoint Online Api

For this step we can reuse the certificate / description from my last blogpost.

In short again:

  • Go to your Azure Portal 
  • Switch to Azure Active Directory
  • Choose the "App registrations (preview)" version (the standard works too, but…)
  • Register an App with a name of your choice
  • Provide adequate SharePoint Api permissions (Application permissions for an elevated scenario!)
  • Grant your permissions with an admin account 
  • Insert the Key Credentials in the app registration's manifest settings
Register Azure AD App registration
Provide SharePoint Api permissions
Paste the KeyCredentials from the certificate to the app registration manifest

Import certificate in Azure Key vault

This is also quite the same as loading the certificate to an Azure Automation account:

You have to import the .pfx file under “Certificates” to your Azure Key vault by entering a name and the given password (to make sure you are the one who “controls” the certificate)

Example Code

We need three parts for this:

  • Our key vault controller as we had it in part 1 as well
  • An authentication helper class for establishing our authentication / client context
  • The Azure Function itself using both parts mentioned beforehand

Retrieve certificate from Key vault

It might not seem obvious, but this step is quite similar to the one in part 1. Although we imported a certificate above (and not a secret), we now need to retrieve the secret of exactly that certificate first. And this works quite the same as in part 1.

The 'magic' happens afterwards: while in part one we simply returned the retrieved secret value, we NOW use that value to create an X509Certificate2 from it and return that one for further usage.

Access SharePoint Online

To access SharePoint Online we use a simple CSOM / MSAL combination.

This time our helper class retrieves the client ID from the configuration manager (your local.settings.json for local debugging, respectively the "Application settings" of your Azure Function); in part 1 of this series I retrieved it via the Azure Key vault, but for an ID this is not 100% necessary.

Then we create our MSAL authentication context quite similar to my last blog post. The next step is once again retrieving an access token, but this time we provide a combination of our ID and an X509Certificate2.

Final thing is to create a CSOM client context and attach the access token to every request.

Using it in an Azure Function

Using that stuff is also no rocket science anymore. For simplicity reasons I minimized the SharePoint operation itself to a simple web.Title retrieval. Of course that part is much more code in other scenarios, but here I wanted to put the attention on Key vault access and SharePoint authentication only.

Only two lines are really important to mention here. At first we retrieve our certificate by using our KeyVaultAccess-controller from above in line 8. Next we retrieve our SharePoint ClientContext from our SPOAuthHelper in line 10.

Conclusion

Modern SharePoint authentication becomes more and more relevant. Furthermore there is a necessity for a secure but comfortable handling of secret artefacts such as credentials, app secrets or private keys. The combination of Azure Functions, Azure Key vault and modern SharePoint authentication addresses this.

Markus is a SharePoint architect and technical consultant with focus on latest technology stack in Office 365 and SharePoint Online development. He loves the new SharePoint Framework as well as some backend stuff around Azure Automation or Azure Functions and also has a passion for Microsoft Graph.
He works for Avanade and is based in Munich.
Although partially inspired by his daily work, opinions are always personal.
Modern SharePoint authentication in Azure Automation (Runbook) with PnP PowerShell

Modern authentication (not only) in SharePoint Online becomes more and more relevant as more and more organizations turn off legacy authentication. In fact this means the classic credential authentication with username and password does not work anymore.

The issue is, when connecting with SharePoint Designer or PowerShell with classic credentials you will receive a "Cannot contact web site or the web site does not support SharePoint Online credentials" error.

The setting that handles this is the following:

Connect-SPOService -Url "https://<Your-Tenant>-admin.sharepoint.com"

$tenantsettings = Get-SPOTenant

$tenantsettings.LegacyAuthProtocolsEnabled

If this setting is False you cannot log in with classic credentials. To change this you could run

Set-SPOTenant -LegacyAuthProtocolsEnabled $true

But for security reasons your organization might think differently.

For Microsoft.Online.SharePoint.PowerShell the story ends here for now, as it does not work with modern authentication, especially in an unattended mode such as Azure Automation runbooks.

Luckily the more popular PowerShell module for SharePoint Online is PnP PowerShell. This module provides capabilities for an unattended authentication towards SharePoint Online. In this blogpost I will show you how to handle this inside an Azure Automation runbook.

According to Microsoft's PnP documentation for an app-only permission authentication, you need the following things to get this scenario up and running. I will show you each of them and point out the "Azure Automation specifics".

Create a self-signed certificate

Here you can either use a script or (much simpler) the new New-PnPAzureCertificate cmdlet. For instance run the following:

$Password = "******"
$secPassword = ConvertTo-SecureString -String $Password -AsPlainText -Force
$cert = New-PnPAzureCertificate -Out "AzureAutomationSPOAccess.pfx" `
 -ValidYears 10 `
 -CertificatePassword $secPassword `
 -CommonName "AzureAutomationSPOAccess" `
 -Country "DE" `
 -State "Bavaria"

This creates a self-signed certificate with the needed values and, enforced by the "-Out" parameter, writes it to a .pfx file in the current directory.

After that simply run the following command and keep the window open. We will need the JSON result (shortened here) in a minute.

$cert.KeyCredentials

{
    "customKeyIdentifier": "zUFQhchR6FJ0...",
    "keyId": "4d2fe8fc-0dbb-45a7-...",
    "type": "AsymmetricX509Cert",
    "usage": "Verify",
    "value":  "MIIDJDCCAgygAwIBAgIQV9qo..."
}

Register an Application

The next thing is to register an app in your Office 365 Azure AD. Simply go to your Office 365 Admin Center and from there, down below on the left side under "Admin Centers", to the "Azure Active Directory" admin center.

Here I encourage you now to use the new “App registrations (preview)” version and register an application.

Register a new application in Azure AD

Here only the “Name” is really relevant for us. Once you have that, you would need to give permissions to that app registration.

App registration: select permissions

Assuming you want to create lots of runbooks covering several admin scenarios, the selected permissions would be necessary. If you only have specific needs you might reduce them to a lower level. Pay attention to check "Application permissions", as we cannot handle a delegated scenario.

App registration: grant permissions

Anyway, any given SharePoint permission needs an administrator consent. Simply click the button, but make sure you are a tenant administrator.

App registration: permissions granted

Then your permissions should be granted.

Next we need to add our "KeyCredentials". This "relates" our app registration to the recently created certificate. So switch back to our still open PowerShell window and copy the JSON output. Then switch to the app registration's manifest and insert it there inside the [].
At the moment of writing there was an issue with the "Preview" version of the app registration: if you receive the error "Failed to update application SPAccess. Error details: KeyValue cannot be null or empty", simply re-open your app registration in the classic mode, insert it there and save. Switch back to the preview mode, and there are even more parameters now:

App registration: manifest with inserted KeyCredentials

The final thing is to copy the App Id. Therefore switch to the "Overview" tab. Here you can copy the ID to the clipboard, as we need it in another minute.

App registration: copy the App Id

Automation Account settings

PnP PowerShell module

I assume you already have an Automation Account configured; if not, follow the documentation from Microsoft. On top you should install the PnP PowerShell module under Modules: simply install it on your local machine, then ZIP the local folder "C:\Program Files\WindowsPowerShell\Modules\SharePointPnPPowerShellOnline" and upload it under "Modules" in your Automation Account.
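Zipping the folder can be done with one line of PowerShell, for instance:

# Zip the locally installed module for upload to the Automation Account
Compress-Archive -Path "C:\Program Files\WindowsPowerShell\Modules\SharePointPnPPowerShellOnline" `
                 -DestinationPath "$env:TEMP\SharePointPnPPowerShellOnline.zip" -Force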

Upload pfx certificate

Now under “Certificates” upload your recently created certificate (the .pfx file). Therefore you would need the given name and the password.

Azure Automation: add the certificate

Pay attention to check "Yes" for Exportable, as we later need exactly this capability!

Finally I would like to store our App ID as well as the password for the certificate inside the Azure Automation Account assets. Although both values do not exactly belong together, you can store them as a simple credential pair. We later use them independently.

Azure Automation: add the credential pair
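If you prefer to script this step as well, a sketch with the Az.Automation module could look like this (account, resource group and asset names are placeholders):

# Store the App ID / certificate password pair as a credential asset
$secPassword = ConvertTo-SecureString -String $certPassword -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential($appId, $secPassword)
New-AzAutomationCredential -ResourceGroupName "MyResourceGroup" -AutomationAccountName "MyAutomationAccount" `
                           -Name "AzureAutomationSPO" -Value $cred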

The Runbook – PnP PowerShell Commands

Now we have everything in place to create our first runbook with modern app only authentication. Therefore create a new PowerShell runbook and insert the following code:

$azureSPOcreds = Get-AutomationPSCredential -Name 'AzureAutomationSPO'
$siteUrl = "https://.sharepoint.com/teams/MMTeamNo1"
$aadDomain = ".onmicrosoft.com"

$Password = $azureSPOcreds.GetNetworkCredential().Password
$secPassword = $azureSPOcreds.Password
$clientID = $azureSPOcreds.UserName

$cert = Get-AutomationCertificate -Name 'AzureAutomationSPOAccess'
$pfxCert = $cert.Export(3, $Password) # 3=Pfx
$certPath = Join-Path "C:\Users" "SPSiteModification.pfx"
Set-Content -Value $pfxCert -Path $certPath -Force -Encoding Byte

if (Test-Path $certPath)
{
    $pnpCert = Get-PnPAzureCertificate -CertificatePassword $secPassword `
                                       -CertificatePath $certPath

    Write-Output "Connecting to $siteUrl"

    Connect-PnPOnline -CertificatePath $certPath `
                      -CertificatePassword $secPassword `
                      -Tenant $aadDomain `
                      -ClientId $clientID `
                      -Url $siteUrl

    $web = Get-PnPWeb
    Write-Output $web.Title
}
else
{
    Write-Output "No Cert"
}

For simplicity reasons I hardcoded some other values which you would also put into different Automation variables (or ONE as a config XML).

The first interesting thing is getting $azureSPOcreds. After retrieval we extract the UserName on the one hand; that is our AppID. On the other hand, with two commands we extract the password: once simply as a SecureString, by retrieving the Password attribute and assigning it to the $secPassword variable, and then, with the help of the GetNetworkCredential() method, also in PlainText (!!!), assigned to the $Password variable. We need it that way for the certificate export.

But here you see a disadvantage of the Azure Automation assets: once you have some admin credentials in there AND access to them, you also have the clear text password! 😉

The next trick we use is to export the certificate to the “local filesystem of the Automation account” which is only valid during runtime of a runbook execution.

This is because the Connect-PnPOnline command we use next expects it that way and cannot handle an X509Certificate2 object directly (which is what we get from the Get-AutomationCertificate cmdlet).

Once we have that exported, we simply 'Test-Path' our export, and if everything is fine we can Connect-PnPOnline to our SharePoint and execute a simple demo task to get the current web's title.

Secure Azure Functions Part 1 – Use Azure KeyVault Secrets when accessing Microsoft Graph

This is the first post of a little series on secure Azure Functions working with Office 365. This first one is about “simple” credential (user/password or ID/secret) access. Next we are going to use an additional certificate while accessing SharePoint Online.

When secure access to APIs is needed from Azure Functions, the challenge is to securely store access credentials or secrets. To access Microsoft Graph, for instance, you need a ClientID and a secret for the OAuth authentication. The question is where to store these. Is it a good idea to store them in your code or AppSettings? Most likely not!

In this post I want to show you a simple scenario of accessing Microsoft Graph from an Azure Function while retrieving the ClientID and AppSecret from Azure KeyVault.

Accessing Microsoft Graph in this case is only an example for an API access where a secret is needed.

The architecture scenario

Securely access Microsoft Graph from an Azure Function (architecture overview)

Our architectural scenario is an Azure function that is securely called, for instance from a SharePoint Framework (SPFx) component via the new AADHttpClient. Nevertheless we will omit the UI part in this post and fully concentrate on the backend side:

So there we have our Azure function which will securely retrieve access credentials from Azure Key vault. Having that it will securely authenticate against our backend Api, that is Microsoft Graph in our example.

At first we will start creating the “credentials” for our backend Api. In our case this is an Azure Active Directory app registration.

App registration to access Microsoft Graph

There are two versions of Azure App registrations, V1 and V2. For a detailed overview of the differences refer to here. But as we want to access Microsoft Graph, and this is possible with V2, we will use that one now. In a later scenario we will access SharePoint Online, which until the recent past was only possible with V1, and then we will have a look at that variant.

For accessing V2 we open the Url https://apps.dev.microsoft.com/ and register a new converged application. We will need to

  • Enter a name
  • Copy the application Id (we can do this later as well)
  • Create a new secret and copy it (Attention: we CAN'T do this later; we can only create a new one if we forget it)
  • Add the platform Web-Api
  • As we want to access Office 365 Groups and their owners: add the Application permissions Group.Read.All and Directory.Read.All
  • You can uncheck Live SDK support

Managing V2 app registrations from the Azure Portal is currently in preview at the time of writing (end of 2018). To use it, simply log in to https://portal.azure.com/, go to "Azure Active Directory" and then "App registrations (preview)". The steps are quite the same as mentioned above but slightly simpler. We will have a look at this in a later scenario, as through this way it is now also possible to handle SharePoint Online Api permissions.

Once you have registered your app you need to consent it. Use the following Url template:

https://login.microsoftonline.com/{tenant}/adminconsent?client_id={AppId_seeAbove}&state=12345&redirect_uri=https://localhost/myapp/permissions

The tenant can be your GUID or the full name ( .onmicrosoft.com ).

The redirect_uri does not really matter, as it might not be reachable; so once everything went fine, simply ignore the resulting 404 error.

In the new app registration preview this step can be done directly from the app registration with one click (make sure you are logged in as a tenant admin who has adequate permissions to grant them for all users).

Grant Microsoft Graph Api permissions (app registration preview)

Establish Azure KeyVault

Creating an Azure KeyVault is also straightforward, with some small but important steps to note. First we switch to "Key vaults" in the Azure portal and create a new one:

Create an Azure Key vault

A unique name is necessary, as well as the subscription and the resource group. The rest we can keep as is for now.

Next we will add our created AppID and Secret to the Key vault:

Key vault: add a secret

We do this twice: once for the ID (debatable, you might also put this only in the AppSettings, but …) and once for the secret (not debatable 😉 )

Key vault: both secrets added

Before leaving the key vault for a moment, we note down its Url, which we need for access from our code later:

Key vault: note the Url

That’s it for the moment. Once we are going to retrieve them from our Azure function we will come back to establish access to the key vault.

Configure the Azure Function App

For the Azure Function App we basically don't need anything special. For those of you new to the topic: pay attention to create the Function App via "App Services":

Create the Function App via "App Services"

Additionally (for instance when consuming your Azure Function from SharePoint Framework (SPFx)) you might need to

  • Configure CORS settings
  • Configure Authentication

Both is described here.

Managed Service Identity (MSI)

For sure the access to the recently established Azure key vault is not "anonymous" but also secured. The trick to avoid handling yet another credential/secret is the "Managed Service Identity".

We can simply extend our Azure Function App to register with Active Directory. In a next step we allow access to our Azure key vault for that AAD registration (think of it as an AD account for the app which auto-authenticates against the key vault later on).

In your function app select “Platform settings” and there “Managed Service Identity”

Function App: Managed Service Identity under "Platform settings"

Next you simply turn MSI on and save it

Function App: turn MSI on and save

That’s it. After a short while you can switch back to your Azure key vault. There you choose “Access policies” and “+ Add new”.

Under "Principal" search for the recently created function app name. It can easily be detected by its special icon logo. For permissions select at least "Get" and "List" for "Secret permissions" (for a later scenario I already added the same for key and certificate…)

Key vault: add an access policy for the MSI principal

Example Code

Retrieve Key Vault Secret

The first thing we need are our app credentials (ID / secret) to access Microsoft Graph afterwards. The retrieval is quite simple as we need no explicit authentication. It is handled implicitly by MSI.

The async task will return the secret for a given name. First we build the Uri for it, constructed from the base Url we noted above when establishing our key vault and now retrieved from the ConfigurationManager. The Url is something like:

https://<Your-Key-Vault-Name>.vault.azure.net

Then with that Uri we finally retrieve the secretValue asynchronously and return it back.

Access Microsoft Graph

To access Microsoft Graph we use the MSAL library, represented by the NuGet package "Microsoft.Identity.Client". This library is designed to handle Azure AD V2 endpoint access with AccessTokens. It is still a pre-release at the time of writing (end of 2018) but is declared "production ready". You could also use the ADAL library for this scenario (you find an alternative example in the source code); we will use it in a later scenario for V1 access. Further explanation you can find here.

We extract our Graph stuff to a separate controller. At first we need some basic values which we can either hardcode or retrieve from ConfigurationManager.

But on top we need our registered values from above which we included both in Azure KeyVault. We will provide them within our constructor.

The next thing is to establish a GraphServiceClient. Therefore we retrieve an AccessToken via MSAL on behalf of our client app registration values.

Finally we have our Task to retrieve the data. In our small example case we want to check if a given user account is an owner of a given Office 365 Group ID.

The Azure Function – Putting it all together

The magic is already shown, but now we simply put it all together inside our Azure Function.

For our simple demo scenario we expect an Office 365 GroupID and a user login from query parameters.

Next we retrieve our application ID and our application secret from Azure Key vault by using our static KeyVaultAccess class.

Having those values we instantiate our GraphAccess class.

Once we have it, we finally call the isValid method to evaluate whether the given login is an owner of the given group. The method retrieves the login and group id, but before that it needs a "GraphServiceClient". I showed you two methods (ADAL and MSAL versions) inside the GraphAccess class, and here the MSAL version is called. That's it.

Conclusion

You could now call this function from a SPFx solution for instance but might also say: Hey, why not directly call the Graph from SPFx?

You are right, but this is an example that can simply be adapted to your own API call (that is, not MS Graph, but something that also needs ID/secret or user/password access). Furthermore you might not want to provide privileged access to your users by admitting WebApiPermissions. Or you need Graph access in combination with other backend code, for instance?

An even better scenario for this is elevating SharePoint permissions. Therefore I am going to write another post establishing access to SharePoint Online from an Azure Function by using Azure Key vault. There we will need a certificate on top. So stay tuned.

AppSettings in SharePoint Framework (SPFx) and Staging

With SharePoint Framework 1.6.0 the great capability to use 3rd party Apis went GA. To use such a 3rd party Api you regularly need a Url. A simple scenario of how to do that is described in Microsoft's tutorial, and another one for when you have a multi-tenant scenario (your Office 365 tenant is not the same as the managed Azure Active Directory, which can be the case especially in staging environments).

When it comes to handling Urls in enterprise solutions you automatically get to the staging challenge. You might have a test stage where your test SPFx webpart wants to call a test Azure Function, while the acceptance SPFx webpart wants to call the acceptance version of your Azure Function, for instance.

So how to handle that?

I assume you have already established a Release pipeline in your VSTS environment; if not, Elio Struyf is your go-to resource. Furthermore there is a great post from Velin Georgiev who shows how to use AppSettings inside an SPFx project. His post gave me the inspiration for this alternative solution I want to show here.

According to his post we simply need an appSettings file with the following content:

Additionally as per Velin’s post we also need the corresponding d.ts file which in this case might look like this:

To use this in your code when calling the aadHttpClient it would look like the following:

We first import our appSettings. For the sake of brevity I combined retrieving the aadHttpClient with the Azure Function call here. In line 6, respectively line 9, we use the AppId / Url of our function retrieved from the appSettings. So far, so good.

But now comes the new stuff: How to update this on each new stage deployment? Therefore in my case I decided to put ALL stage settings in another json file which I call appSettings.all.json:

As you can guess, during deployment I will now hand in the current stage name/abbreviation and copy the right Url / AppID combination from the "all" source to appSettings (we come to that in a minute). But at first I want to answer another question: why do I put all my settings in the source code and not in a DevOps variable in VSTS, for instance? Well, these parameters are no Fort Knox secrets like passwords or credentials; they will finally occur in the JS code anyway when the user runs it. And in big solutions with lots of webparts you might have ONE release pipeline but a significant number of Azure Functions to call. That is my reason for this way.

Let’s go on with the final thing, that is a new gulp task to copy the right Url/ID for the related stage. The gulp task you find below is slightly similar to those from Elio in his VSTS posts where he updates the write-manifest.

So what does the task do (it is quite simplified for easy understanding)? It first gets the stage from an input parameter (run "gulp update-appSettings --stage DEV" for instance). Stages per our definition in appSettings.all.json are DEV, TST, ACT & PRD.
Next it retrieves the stage-corresponding Url and ID parameters (lines 11, 12, 15, 16), writes them to the target appSettings.json file (lines 15, 16) and stores that file (line 17). That's it.

Hope this helps you enhance your DevOps release pipeline for SharePoint Framework (SPFx) in VSTS.

Executing batch requests with Microsoft Graph and SharePoint Framework (SPFx)

In my recent blogpost I wrote about using batch requests with the SharePoint Rest Api within a SharePoint Framework (SPFx) solution. Today I want to construct a similar scenario which shows batch requesting using the new Microsoft Graph access class within SPFx.

Please note that this approach is currently, at the time of writing this post (Jul-2018), in final preview mode and therefore not supported in production scenarios, as it might be subject to change. GA is expected with the next SPFx version (1.6.0) at the end of this month.

Update: This is now targeting the GA version from SPFx 1.6.0 on.

Let's start with a simple scenario. Suppose we have a bunch of Group IDs.

Within our component we will do several things now. First, within the constructor (you might use a Graph call several times within your code; otherwise you can also do it in your function) we instantiate an MSGraphClient.

As this is an async call now, it is debatable whether to do this in the constructor or at a later stage in componentDidMount. In larger projects I usually extract the whole data stuff into a separate controller class which I instantiate in the constructor.

In a getGroupsData function we then create our requests based on the IDs of our Groups from above. (Please note that we also hand over an index which we can later use to identify the response if needed, as the responses might be ordered differently than the requests originally were.)

What happens afterwards is one simple call to the Graph Api, handing over all requests in the body. If no error occurs and we receive a status of 200 per response (a single request might still fail, for instance when we have provided a wrong Id) we can push the response part to our array based on our custom UnifiedGroup interface.

When every response is handled, we set our state, which lets our component re-render as usual.

Please pay attention that there is a hard limit of 20 (!!) requests per batch request. So in case you have more than 20 Ids (or something similar; this is a constructed scenario of course) you need to create several batch requests, similar to a paging approach.

Before we can test our stuff, we also need to request permissions in the package-solution.json.

For debugging with the workbench this is sufficient (this now works with SPFx 1.5.0!); when deploying our solution we also have to grant those permissions tenant-wide by an admin, as described here.

Update: two additional things to mention here. First, there is the general request for "Windows Azure Active Directory" | "User.Read", which is required for all requests. On top, I recently came across an obviously general limitation that the string values for resources and scopes must not exceed 255 characters in total. So pay attention to this as well when having to request a significant number of Graph resources. As you only have to approve once, you might split your requests over several packages, but then you must rely on all of them existing inside a tenant.
Once we run our simple solution it might (simply rendered) look like this:
The simply rendered result
Finally for your reference the whole component’s source:

Use SharePoint Rest API Batch component to retrieve site logos

In the modern design of Office 365, and SharePoint Online in detail, it is all about logos, photos and tiles or cards. Unfortunately there is still no easy Api to retrieve them.

There is an undocumented version that maybe helps, but only in case 'followed' or 'recent' sites are what you want. Here is the issue and here the uservoice request.

In this post I want to show you another approach that uses the batch capability which is available for the SharePoint Rest API and can be easily used from the SPHttpClient class within SharePoint Framework (SPFx).

Furthermore the example is kept quite simple, so you can also get an idea about batch requests with the SharePoint Rest Api and SPFx in general and adapt it to your own case.

Assume we have a bunch of site Urls in our use case:

When we start a batch we first need a url:

Now we have the problem that we cannot simply request the site properties as this would lead to totally different Urls which doesn’t work.

So in detail, a batch request with requests to
https://<yourtenant>.sharepoint.com/sites/SiteA
https://<yourtenant>.sharepoint.com/sites/SiteB
https://<yourtenant>.sharepoint.com/sites/SiteC
won't work, but several requests to the same site, for different lists, items, e.g., will work.

Fortunately search helps out here. We can simply request the needed properties with a search query for a site with our specific URL from current context:

Now we have constructed and collected all of our requests. We can now send them to the server and handle the responses:

Two important things to explicitly mention here:
First, we use Promise.all() to wait for all requests to be completed. If we handled each one by one we would need to write to our state each time a request is done. Or we could store the results in a variable, but then the challenge is: when are all done so we can transfer them to state? A counter can help (increase on request, decrease on response), but Promise.all() is much simpler.
Second, we have to handle another async call, the json() method. Therefore we build another array of requests, and with another Promise.all() we handle our final JSON response.

As we have to handle the Response of a search call, we use the helper Interfaces as below:

This approach is taken from Elio Struyf’s SPFx search example webpart.

Finally we can render our data.

The rendered result of the batch request

As you can see, I only used Groups, respectively modern team sites, in my example. I like that they expose a site logo by default as a combination of an acronym and a randomly picked color. Modern communication sites display the same by default, but do not expose the values (except in the above-mentioned undocumented Api on followed or recent sites) or a built image as modern team sites do. Mikael Svenson also commented on this a while ago, so we hope that Microsoft will improve the logo handling quite soon.
But I hope I gave you some ideas how to improve this yourself and also how to handle batch requests in general.

Now stay tuned for my next example where I will cover batch requests with Microsoft Graph and SharePoint Framework (SPFx).

Last but not least, the whole component class for your reference:

Extend PnP SharePoint Framework React ListView Control with a context menu

The ListView component from the PnP SharePoint Framework React Controls is really simple to use, and it's furthermore easily extensible. In this little post I want to show how you can add a context menu to a list item, like the one available in the default modern experience of lists and like the "edit control block" in the classic experience.

Assume we have a simple list view webpart like the following

The simple employee ListView webpart

achieved by the following code

We now need to create another component which represents our context menu. For this we will use a combination of an IconButton and a ContextualMenu from the Office UI Fabric components.

The code for this additional component looks like this:

One of the tricks here is to give an empty iconName for the menuIconProps. This is because once you attach a menuProps attribute to any kind of Office UI Fabric button, a ChevronDown selector will occur by default on the right side of the button. With the menuIconProps attribute you can adjust this; that is, when specifying an empty name you can remove it. This is what we want, because we only want to have our "MoreVertical" icon, which is the three dots in a vertical order.
To place this a bit more centered, we have a small CSS manipulation as well:

Finally we need to put the pieces together. Therefore we insert another ViewField in the webpart code at the position of our choice, for instance after the "Lastname":

{
  name: "",
  sorting: false,
  maxWidth: 40,
  render: (rowitem: IListitem) => {
    const element: React.ReactElement<IECBProps> = React.createElement(
      ECB, {
        item: rowitem
      }
    );
    return element;
  }
},

We use the render method of IViewField. Inside, we create our ECB component and hand over the clicked rowitem as a property. We could also hand over functions to be executed inside the context menu component but bound to the parent component, that is, the webpart which contains the ListView.

The result will look like the following:

The employee ListView with the new context menu column

And in action it shows which function and item was clicked:

The context menu in action