An Outlook Add-in with SharePoint Framework (SPFx) – Introduction

Since SharePoint Framework version 1.10 you can also develop Office Add-Ins with SPFx, starting with Outlook Web Access in public preview. The great benefit is that you already have a prepared context for accessing Microsoft Graph. In this demo scenario I want to show you how to create a valuable Outlook Add-In with the capability to store complete mails to OneDrive, Office 365 Groups or Microsoft Teams.

An Outlook Add-In to copy a whole mail to Teams, Groups or OneDrive

Let’s start the first part by creating our first Outlook Add-In. This is already documented quite well in a Microsoft tutorial, so I won’t repeat the necessary steps here; I suppose they are also quite familiar to you if you are not new to the SharePoint Framework.

The only difference I made to the tutorial above (and you can do the same, of course): use the “React Framework” instead of “No JavaScript Framework”.

First, let’s take a short look into the Outlook Add-In manifest. I won’t go into details about changing the position or logo, but we should at least give it a custom title. To do so, open the XML file in the \officeAddin folder and adjust the following settings to your liking:
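As an illustration, the relevant manifest entries might look like this (the values are my own choice; the element names follow the standard Office Add-in manifest schema):

```xml
<!-- Illustrative excerpt: set DefaultValue to your own title and description -->
<DisplayName DefaultValue="Save my mail" />
<Description DefaultValue="Store a complete mail to OneDrive, Groups or Teams" />
```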

What we need to do next is to get the current mail, as we want to run our add-in in the context of a selected mail. The following code snippet simply handles that:

Base Webpart render function

We first check this.context.sdks.office, and if it is present we try to use this.context.sdks.office.context.mailbox.item. From there we can retrieve the mail ID and the subject, which we will need later. We hand them over to our first React component, and in case we found nothing we handle that null value adequately. On top we hand over the msGraphClientFactory, which we will use to connect to Graph for folder retrieval, mail storage and so on.
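To make that check concrete, here is a hedged TypeScript sketch of the logic; the interfaces are simplified stand-ins for the SPFx Office SDK types, not the real ones:

```typescript
// Simplified stand-ins for the SPFx Office SDK shapes (illustrative only).
interface IMailboxItem { itemId: string; subject: string; }
interface IOfficeSdk { context?: { mailbox?: { item?: IMailboxItem } }; }

// Return the selected mail's id and subject, or null when the add-in
// does not run in the context of a selected mail.
function getSelectedMail(office?: IOfficeSdk): { id: string; subject: string } | null {
  const item = office?.context?.mailbox?.item;
  return item ? { id: item.itemId, subject: item.subject } : null;
}
```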

I reduced that component a bit in code and explanation here. Its main task is to render the initial logic for browsing your Teams, Groups or OneDrive folders, but it also holds the storage methods. I will cover both in separate parts. You can find the full code in my GitHub repository.

Base Component

First we instantiate a GraphController. Once it is ready, it is set to the state so we know it’s there and can display our entry points for OneDrive, Groups and Teams accordingly.
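A minimal sketch of that initialization, with GraphController reduced to an illustrative stub (the real one wraps the msGraphClientFactory):

```typescript
// Illustrative stub of the controller; the real one acquires an
// MSGraphClient via msGraphClientFactory.getClient().
class GraphController {
  public ready = false;
  public async init(): Promise<void> { this.ready = true; }
}

interface IState { graphController: GraphController | null; }

// Called from componentDidMount: instantiate, initialize, put to state.
async function initGraphController(setState: (s: IState) => void): Promise<IState> {
  const controller = new GraphController();
  await controller.init();
  const patch: IState = { graphController: controller.ready ? controller : null };
  setState(patch);
  return patch;
}
```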

In the second part we will get to know the browsing logic through all available Teams, Groups and OneDrive folders where we can store our mail.

Meanwhile you can check the full code repository here in my GitHub.

Markus is a SharePoint architect and technical consultant with a focus on the latest technology stack in Office 365 and SharePoint Online development. He loves the new SharePoint Framework as well as backend topics around Azure Automation and Azure Functions, and also has a passion for Microsoft Graph.
He works for Avanade and is based in Munich.
Although partially inspired by his daily work, opinions are always his own.
An Outlook Add-in with SharePoint Framework (SPFx) – Store custom metadata with your mail

Since SharePoint Framework version 1.10 you can also develop Office Add-Ins with SPFx, starting with Outlook Web Access in public preview. The great benefit is that you already have a prepared context for accessing Microsoft Graph. In a demo scenario I showed you how to create a valuable Outlook Add-In with the capability to store complete mails to OneDrive, Office 365 Groups or Microsoft Teams.

Recently I presented that solution in the Microsoft SharePoint Framework bi-weekly community call.

During the call, the idea was born in the chat to enhance this solution with some metadata: stored with the mail, so that it is persisted when and where the mail was saved. Many thanks to Neelamugilan Gobalakrishnan and my colleague and MVP Wictor Wilen for the feedback on this.

Now I want to show you a simple solution for that. The existing project is only enhanced by a simple operation that saves an openExtension to the message. When we open the add-in, we first try to get an existing openExtension for that mail, and if it exists we display when and where we previously saved the mail as a reminder.

Now why an open extension, you might ask? It depends on whether you use the (simpler) scenario of an open extension or the more complex but also more valuable schema extension scenario.

As per this great post by Mikael Svenson, I have three arguments for choosing open extensions:

  • I do not want to run a filter query on my messages based on the open extensions
  • I do not need my schema going public
  • I do not want that metadata schema to be stored irreversibly (once ‘Available’, you can only deprecate it)

So let’s start with the expected result. Once we open our existing add-in to store a mail somewhere, we might want a hint like this that we already stored the mail previously, and where:

Outlook add-in informs you about previous save operation

To achieve this we first need to store that information during our save process. In our controller we create another function that stores our metadata as an open extension on the mail. It works simply like this:

The function receives the ID of the mail, a displayName, a URL of the target location and finally a date. We construct the displayName from the target (OneDrive or the Group/Team name) combined with the path we already displayed in our breadcrumb. That’s all for the metadata storage.
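As a hedged sketch, the payload we POST as a Graph open extension to /me/messages/{id}/extensions might look like this; the extensionName namespace is my own illustrative choice, and IGraphPoster stands in for the MSGraphClient:

```typescript
// Stand-in for the Graph client's POST capability.
interface IGraphPoster { post(url: string, body: object): Promise<object>; }

// Build the open-extension payload persisted with the mail.
function buildMailMetadata(displayName: string, url: string, saved: Date): Record<string, string> {
  return {
    "@odata.type": "microsoft.graph.openTypeExtension",
    extensionName: "com.demo.mailstore",      // illustrative namespace
    savedTo: displayName,                     // e.g. "Team A / General / Folder 1"
    savedUrl: url,
    savedDate: saved.toISOString()
  };
}

async function saveMailMetadata(client: IGraphPoster, mailId: string,
                                displayName: string, url: string): Promise<void> {
  await client.post(`/me/messages/${mailId}/extensions`,
                    buildMailMetadata(displayName, url, new Date()));
}
```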

Now back to the start of our add-in, where the requirement is to retrieve potentially existing metadata to remind the user in case the mail was already stored somewhere. In our base component we try to get the metadata once our graphController is established. As per the get an existing openExtension documentation, you can either get the specific extension or expand the extensions when getting a known instance, that is, your mail. We do the latter, as we do not know whether the extension exists at all, and with the former option a non-existing extension would raise an error (which we could handle, but that’s not elegant, of course).

The Graph request retrieves the mail again with only its id and subject, but also with the expanded extension. If there is metadata, we return it as an object, otherwise null. The result is written to the component’s state and rendered if it is not null.
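A hedged sketch of that extraction: the extension name matches the illustrative one used when saving, and a missing extension yields null instead of an error:

```typescript
// Simplified shape of the expanded message returned by Graph.
interface IExpandedMessage {
  id: string;
  subject: string;
  extensions?: { extensionName: string; savedTo?: string; savedUrl?: string; savedDate?: string }[];
}

// Return the stored metadata object, or null when no extension exists.
function extractMetadata(msg: IExpandedMessage) {
  const ext = (msg.extensions ?? []).find(e => e.extensionName === "com.demo.mailstore");
  return ext ? { savedTo: ext.savedTo, savedUrl: ext.savedUrl, savedDate: ext.savedDate } : null;
}
```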

Display in case metadata is filled
Display in case metadata is null

That’s all with my little enhancement using custom metadata for mails based on Microsoft Graph open extensions for my Outlook add-in.
You can check the full code repository here in my GitHub.

An Outlook Add-in with SharePoint Framework (SPFx) – Storing mail with MicrosoftGraph

Since SharePoint Framework version 1.10 you can also develop Office Add-Ins with SPFx, starting with Outlook Web Access in public preview. The great benefit is that you already have a prepared context for accessing Microsoft Graph. In this demo scenario I want to show you how to create a valuable Outlook Add-In with the capability to store complete mails to OneDrive, Office 365 Groups or Microsoft Teams.

An Outlook Add-In to copy a whole mail to Teams, Groups or OneDrive

In this part you will handle the mail storage to your selected Team, Group or OneDrive via Microsoft Graph.

From our last part you know that storing files via Microsoft Graph to any kind of drive (the handling is the same for Teams, Groups, SharePoint and OneDrive, so the selection of the drive was already made in part 2) depends on whether your file size is less or more than 4 MB.

If it’s less than 4MB the action is quite simple:

GraphController: Save a ‘normal’ mail smaller than 4 MB

Having the driveID and selected folderID from part 2, the filename, and the mimestream created in part 3, we simply call the Graph API with a PUT.

After completion, or in case of an error, we call the client callback method handed in from the UI, where it can respond with a user-friendly error/success message.
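The two steps above can be sketched as follows; IGraphPutter is a stand-in for the MSGraphClient used in the real controller, and the names are illustrative:

```typescript
// Stand-in for the Graph client's PUT capability; resolves to success.
interface IGraphPutter { put(url: string, content: string): Promise<boolean>; }

// Target path for a single-PUT upload of a file below 4 MB.
function uploadUrl(driveId: string, folderId: string, fileName: string): string {
  return `/drives/${driveId}/items/${folderId}:/${fileName}:/content`;
}

async function saveSmallMail(client: IGraphPutter, driveId: string, folderId: string,
                             fileName: string, mime: string,
                             callback: (success: boolean) => void): Promise<void> {
  try {
    callback(await client.put(uploadUrl(driveId, folderId, fileName), mime));
  } catch {
    callback(false);   // the UI turns this into a friendly error message
  }
}
```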

If our message mimestream is bigger than 4 MB, it gets a bit more complex, as we first have to establish an UploadSession and afterwards upload the whole mimestream in chunks.

GraphController: Save big mail with an UploadSession

This function still looks quite easy. We create an upload session with much the same parameters as before when storing a smaller mail in one single action. Once created, we asynchronously call the real upload function, which splits our whole mimestream into slices and comes next:
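A hedged sketch of the session creation; IGraphClient stands in for the MSGraphClient, and the returned uploadUrl is what the slice uploads are PUT against afterwards:

```typescript
// Minimal shapes for the session response and the Graph client.
interface IUploadSession { uploadUrl: string; }
interface IGraphClient { post(url: string, body: object): Promise<IUploadSession>; }

function sessionUrl(driveId: string, folderId: string, fileName: string): string {
  return `/drives/${driveId}/items/${folderId}:/${fileName}:/createUploadSession`;
}

async function createSession(client: IGraphClient, driveId: string,
                             folderId: string, fileName: string): Promise<IUploadSession> {
  // The item body mirrors the small-file case: a name plus conflict behavior.
  return client.post(sessionUrl(driveId, folderId, fileName),
                     { item: { "@microsoft.graph.conflictBehavior": "rename", name: fileName } });
}
```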

GraphController: Upload the mail slices

We first use two working variables, minSize and maxSize. These are used in a “from-to” manner to extract slices from the whole sequential mimestream. We iterate over the whole mimestream, in our case in 320 KB slices (but you can and should increase that value later!), until we reach the end of the stream. Each slice we extract is uploaded through the created UploadSession.

The function for a single slice is again quite simple: it creates a header, takes the received mimestream slice and PUTs it against the received uploadUrl, which includes a reference to the selected UploadSession.
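The slicing itself can be sketched like this; the minSize/maxSize names mirror the description above, and the Content-Range header is what each slice PUT carries:

```typescript
// Graph requires slice sizes in multiples of 320 KiB; this is the demo value.
const SLICE_SIZE = 320 * 1024;

// Walk the MIME stream in fixed-size chunks and compute each slice's
// byte range plus the Content-Range header for its PUT request.
function mailSlices(totalSize: number, sliceSize: number = SLICE_SIZE) {
  const result: { minSize: number; maxSize: number; header: string }[] = [];
  for (let minSize = 0; minSize < totalSize; minSize += sliceSize) {
    const maxSize = Math.min(minSize + sliceSize, totalSize) - 1;
    result.push({ minSize, maxSize, header: `bytes ${minSize}-${maxSize}/${totalSize}` });
  }
  return result;
}
```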

That’s all to describe here. I hope you found my little series useful on how to work with the new SPFx capability for Outlook Add-Ins, and on the possibilities around accessing mails or uploading files of different sizes with Microsoft Graph.

Again, you can check the full code repository here in my GitHub.

An Outlook Add-in with SharePoint Framework (SPFx) – Retrieving the mail as Mime

Since SharePoint Framework version 1.10 you can also develop Office Add-Ins with SPFx, starting with Outlook Web Access in public preview. The great benefit is that you already have a prepared context for accessing Microsoft Graph. In this demo scenario I want to show you how to create a valuable Outlook Add-In with the capability to store complete mails to OneDrive, Office 365 Groups or Microsoft Teams.

An Outlook Add-In to copy a whole mail to Teams, Groups or OneDrive

In this part you will see how to retrieve a mail as a whole mimestream so you can later store it as a file. In the past this was possible with the (meanwhile deprecated) Exchange ASMX web services. Nowadays there is also a Microsoft Graph endpoint for this.

The API is simple: me/messages/${mail.id}/$value
In the following function we call that endpoint and hand the response over to the next function, which stores it somewhere.

GraphController: Retrieve Mail as MimeStream

Several parameters are not really needed here; they are just passed through to the next function that handles the response. The only parameter needed here is the mail, or rather its ID.
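A hedged sketch of that retrieval: the $value segment returns the raw MIME content of the message, which is handed straight to the next function; IGraphGetter is a stand-in for the MSGraphClient:

```typescript
// Stand-in for the Graph client's GET capability (raw string response).
interface IGraphGetter { get(url: string): Promise<string>; }

const mimeUrl = (mailId: string): string => `/me/messages/${mailId}/$value`;

async function getMailAsMime(client: IGraphGetter, mailId: string,
                             next: (mime: string) => void): Promise<void> {
  next(await client.get(mimeUrl(mailId)));   // hand over for storage
}
```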

Once we have a positive result we need to check the size of the response, that is, our mimestream. This is because for a file save operation via Microsoft Graph there are two different scenarios: a simple one for files smaller than 4 MB, and a more complex one for larger files. We will cover this difference in the final part, where we handle the storage of the mimestream to a Group, Team, SharePoint or OneDrive. So stay tuned.

Meanwhile you can check the full code repository here in my GitHub.

An Outlook Add-in with SharePoint Framework (SPFx) – Browsing your target OneDrive, Groups, Teams

Since SharePoint Framework version 1.10 you can also develop Office Add-Ins with SPFx, starting with Outlook Web Access in public preview. The great benefit is that you already have a prepared context for accessing Microsoft Graph. In this demo scenario I want to show you how to create a valuable Outlook Add-In with the capability to store complete mails to OneDrive, Office 365 Groups or Microsoft Teams.

An Outlook Add-In to copy a whole mail to Teams, Groups or OneDrive

In this part you will see how to retrieve all joined Groups or Teams and further retrieve their drives and folders, as well as those from OneDrive, to browse them and select one as the target.

The UI simply follows much the same process as moving an item within OneDrive or SharePoint. First you select your target system: OneDrive, Office 365 Groups or Microsoft Teams.

Once you click one (and here we start with Groups and Teams, as they are “higher” in terms of object hierarchy), we need to list all available Groups or Teams to the user. For that we have two functions in our GraphController.

GraphController: Get joined Groups and Teams

As we do not need specific attributes of Groups and Teams, only a parent container for browsing, we treat them as folders (IFolder interface). Drives and folders themselves need only the same attributes (id, name, and optionally a reference to the parent item).
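A hedged sketch of that mapping; joined Teams come from /me/joinedTeams and Groups from /me/memberOf (filtered to unified groups), and both contribute only id and displayName here:

```typescript
// The minimal folder shape we browse with.
interface IFolder { id: string; name: string; parentFolder: IFolder | null; }

// Map Graph group/team results onto top-level folders (no parent).
function mapToFolders(items: { id: string; displayName: string }[]): IFolder[] {
  return items.map(i => ({ id: i.id, name: i.displayName, parentFolder: null }));
}
```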

At the beginning of our “Groups” or “Teams” component we call that method and put the retrieved items into our state as folders.

We then render them in a simple way. This will be repeated later for all child items the same way.

Teams component (Groups component quite similar)
Folder component

Interesting is the subFolderCallback attribute of each folder, which is set in the render function. If we have a top-level item (one whose parentFolder is null), we know it is a Group or Team, so a click on it should show the drives below. In all other cases, such as a drive or a folder, a click should retrieve the child folders. This is always the same method.

GraphController: GetDrives and SubFolders

As a folder has a parentFolder attribute of the same IFolder type, that parent can also have a parentFolder itself. With that information we can also build our simple breadcrumb.

Breadcrumb component

This breadcrumb can have up to three parts: a << root anchor that brings you back to the root; the grandParentFolder (here “Folder 1”), which is linked and brings you one level up; and the parentFolder (here “Subfolder 1”), which is not linked, as it is the current location where your mail would be stored right now if you pushed the button, and from which you might see any available child items.
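Deriving those parts from the parentFolder chain can be sketched as follows (the interface mirrors IFolder; names are illustrative):

```typescript
// Minimal folder shape with a parent chain.
interface ICrumbFolder { name: string; parentFolder: ICrumbFolder | null; }

// Build the breadcrumb: the "<<" root anchor plus at most the last
// two folder levels (grandparent and current location).
function breadcrumbParts(current: ICrumbFolder): string[] {
  const names: string[] = [];
  for (let f: ICrumbFolder | null = current; f !== null; f = f.parentFolder) {
    names.unshift(f.name);
  }
  return ["<<", ...names.slice(-2)];
}
```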

In OneDrive it works much the same, but with two levels less, as you only have one OneDrive (and not several Groups or Teams) and directly want to list the first level of folders once you open it. From there the functionality is largely the same.

In the next part we will get to know how to retrieve our mail as a full MimeStream object from Microsoft Graph.

Meanwhile you can check the full code repository here in my GitHub.

PnP SPFx React Carousel improvements – Dot Navigation

The PnP SPFx React carousel control is great when it comes to displaying varied content in a limited amount of space. Recently I had to implement two improvements which I now want to share with you. Another idea would be to implement this directly in the control itself.

In this second part of the carousel improvements we want to implement a dot navigation, that is, showing a dot for each element inside the carousel control. It allows us to navigate directly to each element and also shows us the position of the active element (filled vs. empty circle).

For simplicity we omit retrieving data and mapping it to content elements and simply use pre-defined React elements, as in the PnP SPFx React test webpart. I also assume you already know how to install the PnP SPFx React controls.

Carousel – The basics

This is the same starting point as in part 1. We can skip the state here for the moment. But again, take a look at our pre-defined elements AND note that the carousel uses our implemented triggerNextElement function. Also remember that the “key” attribute of the elements is mandatory in combination with triggerPageEvent.

Carousel – Dot Navigation

The first thing to note here is the creation of the dot navigation (dotNav). We create one Icon element per content element from above. The name of the Icon is either “StatusCircleRing”, or “StatusCircleInner” (the filled circle) for the element with the current index. Furthermore we style the Icon a bit (refer to my GitHub repository for further details), and we connect the onClick event to our triggerNextElement function.

This function is exactly the same as in part 1: it only updates the current state after it ensures the index always stays in the given range (0 to element length - 1).
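The dotNav creation described above can be sketched like this; in the real webpart these names become Fluent UI Icon elements whose onClick calls triggerNextElement(i):

```typescript
// One icon name per content element; the active index gets the
// filled circle ("StatusCircleInner"), all others the empty ring.
function dotIcons(elementCount: number, currentIndex: number): string[] {
  return Array.from({ length: elementCount }, (_, i) =>
    i === currentIndex ? "StatusCircleInner" : "StatusCircleRing");
}
```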

So although we skipped the state above: it’s essential for controlling the current index as well as the ability to move to the previous/next element. Although there is an infinite property in the PnP carousel control, it does not seem to work when triggerPageEvent is implemented.

The final solution might look like this:

Carousel Dot Navigation – Final result

That’s all, and nothing really complex, right? Nevertheless, feel free to inspect the whole solution in my public GitHub repository.

PnP SPFx React Carousel improvements – Autoplay

The PnP SPFx React carousel control is great when it comes to displaying varied content in a limited amount of space. Recently I had to implement two improvements which I now want to share with you. Another idea would be to implement this directly in the control itself.

In this first part of the carousel improvements we want to implement autoplay, that is, automatically moving forward through all content elements of the carousel control. For simplicity we omit retrieving data and mapping it to content elements and simply use pre-defined React elements, as in the PnP SPFx React test webpart. I also assume you already know how to install the PnP SPFx React controls.

Carousel – The basics

First we have 5 pre-defined content elements. As you would usually create them from retrieved data, we also copy them to the current state, where we also hold the currently displayed element and its index.

We are then going to render our carousel element with its defined properties. Pay attention that we only hand in ONE element here, the current one, and that the carousel logic is driven by the triggerPageEvent method. Please also note that for this to work properly it’s essential that the elements hold a “key” property for sorting purposes.

Now let’s handle the autoplay logic:

Carousel – The autoplay logic

In componentDidMount we start an interval. The delay is the given interval property in seconds, which we have to multiply by 1000 to get the required milliseconds. The interval starts another function, autoplayElements. This function is called at the given interval (every X * 1000 milliseconds) and always loads the next element (currentCarouselIndex + 1).

To load the element we have the triggerNextElement function. It was already referenced above by our carousel control, because it is used not only by autoplay but also when you click the next or previous button. This function only updates the current state after it ensures the index always stays in the given range (0 to element length - 1).
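The index handling inside triggerNextElement boils down to a clamp; componentDidMount would then schedule it via setInterval(autoplayElements, interval * 1000). A hedged sketch:

```typescript
// Clamp the requested index into the valid range [0, length - 1],
// as used by both autoplay and the prev/next buttons.
function clampIndex(requested: number, length: number): number {
  return Math.max(0, Math.min(length - 1, requested));
}
```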

The final solution might look something like this:

Carousel Autoplay – Final result

That’s all, and nothing really complex, right? Nevertheless, feel free to inspect the whole solution in my public GitHub repository.

Provision Microsoft Teams with Azure Automation – Part II

I am a big fan of Azure Automation and its helpful capabilities for provisioning and administering Office 365 and its components such as SharePoint, Office 365 Groups or Microsoft Teams. In this little series I want to show you how you can provision your Teams with Azure Automation in a quick and simple way; the same approach scales to much more complex solutions depending on your needs.

In this second part of my little series I want to show you the currently production-supported way: first create a Group and then a Team based on it. We will also have a look at how we can further adjust that Team by executing PnP Provisioning templates against it and its corresponding SharePoint site. Finally, some adjustments to the Team (not possible with PnP Provisioning so far) will be made.

To recap what I started to explain in Part I: we have a modularized approach consisting of several runbooks, where one needs to be called first.

  • Parent Runbook Module (calls:)
    • Create Team runbook
    • PnP Provision against site
    • Execute further PS manipulations …
All our Runbooks for creating Microsoft Teams (Part I & II of this series)

You might remember our parent runbook from Part I; here it is once again, a bit more complete.

Our parent runbook to create a team from a group

After our input parameters we retrieve our assets (credentials, app IDs, URLs and so on) from the current Automation Account (refer to another of my posts in case you need more information on Automation Account assets). Once we have them we can call our first runbook to create the Team (see next), which we already did in part I, but there with another approach.

Our (sub)runbook to create an Office 365 Group first and a Team from it afterwards

Most code parts are the same as in part I. We first connect to Microsoft Graph and get the access token. Then we create our REST header. The block that retrieves the user ID for the given login is also the same. What’s new is the first while loop, where we check if the desired alias is already in use. If so, we modify it by appending a “2”, next a “3”, and so on. This pattern is only an example, of course, but it gives us a bit more control in case we cannot handle this more comfortably in a potential UI.

As in this part we want to create a Group first, it’s now time to create a GroupRequest body object. In addition to the intro video on this approach by Paolo Pialorsi that I already referenced in part I, I also provide the alias here, as we already handled that above; furthermore, it is good practice not to provision Groups without any owners.

Next we convert the request object to JSON and execute a REST call against Microsoft Graph. Having the Group and its ID as a result, we can immediately go on preparing the next request to create a Team from that Group (“teamify”). The request is a bit smaller than in part I, which is obvious, because we already provided many parameters for the Group. It follows the same pattern: convert the request object to JSON, execute the REST request and grab the ID from the result (which is not really necessary, because the Team and Group IDs are in fact the same!).
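Condensed to its essence, the two calls might look like this in PowerShell (variable names are illustrative; $header was built earlier from the access token and $groupRequest above):

```powershell
# Create the Group first ($groupRequest was built above)
$groupJson = ConvertTo-Json -InputObject $groupRequest
$group = Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/groups" `
    -Method Post -Headers $header -Body $groupJson -ContentType "application/json"

# "Teamify" the Group - note the Team ID equals the Group ID
$teamJson = ConvertTo-Json -InputObject @{ memberSettings = @{ allowCreateUpdateChannels = $true } }
Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/groups/$($group.id)/team" `
    -Method Put -Headers $header -Body $teamJson -ContentType "application/json"
```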

After the new Team is created (back in our parent runbook!) we can retrieve a Group object and, with it, its site URL. With a small waiting loop we check that the Team and its site are really ready, as provisioning might take some time. Having that, we can call our further runbooks to continue provisioning with a PnP template and further PowerShell cmdlets.

The next step is applying a PnP Provisioning template with which we want to create some fields, a content type and a list. The list also gets some DataRows so we have content for demonstration purposes. You will find the runbook for that next:

Our (sub)runbook to apply PnP Provisioning template

First you find a function to download our existing provisioning template from Azure Blob storage, secured by a key.

PnP Provisioning template, stored in Azure Blob storage

Next we establish the authentication against SharePoint. You can either grab credentials from our Automation Account as a credential asset, or use modern authentication with an app registration and certificate as described here.

After Connect-PnPOnline we construct our template name, download the template to the local runbook runtime environment (C:\Users\… or C:\temp\… are our options) and then simply apply it.

Unfortunately I was still not able to output PnP Provisioning debugging information in the runbook output, so the following line of code is obsolete. Any hint would be greatly appreciated.

Set-PnPTraceLog	-On -Level Debug 

Next we come to our third runbook. Unfortunately, not everything can be solved with PnP Provisioning (or not yet). So in our scenarios we decided to isolate further code steps in another runbook, typically called afterwards. In this scenario, let’s assume we want to create a new tab in our Team that shows the just-created SharePoint list in a website tab. This is what the third (sub)runbook does for us.

First we connect to Microsoft Graph again. This might not be necessary, as we already did so in the first (sub)runbook, but it is safer in case the connection was lost. As we need to add a new tab to a channel, we first need to identify our channel by retrieving the “General” one.

Afterwards we create a request URL based on the IDs of our Team and the channel we just identified. Building an object for a website tab, converting it to JSON and executing a POST request against Microsoft Graph is all we need to do to achieve our result.
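As a hedged sketch (the teamsApp ID is the standard website tab app; $teamId, $header and $listUrl come from earlier steps, and the tab name is my choice):

```powershell
# Find the "General" channel of the Team
$channels = Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/teams/$teamId/channels" -Headers $header
$general  = $channels.value | Where-Object { $_.displayName -eq "General" }

# Build the website tab request and POST it
$tabRequest = @{
    displayName           = "Our List"
    "teamsApp@odata.bind" = "https://graph.microsoft.com/v1.0/appCatalogs/teamsApps/com.microsoft.teamspace.tab.web"
    configuration         = @{ contentUrl = $listUrl; websiteUrl = $listUrl }
}
Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/teams/$teamId/channels/$($general.id)/tabs" `
    -Method Post -Headers $header -Body (ConvertTo-Json -InputObject $tabRequest) -ContentType "application/json"
```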

The custom SharePoint list, provisioned with PnP Provisioning to a Team’s SharePoint site
And the same list embedded in a Teams Website tab for instance

I hope this post illustrated how you can create even more complex provisioning solutions with Azure Automation and PnP PowerShell. The same is applicable to SharePoint alone or Office 365 Groups, of course. In the create runbook you already saw how to create Groups; replacing this with creating SharePoint sites is an obvious possibility.

Depending on feedback, next might be to point out the basics of Azure Automation in more detail, or an architectural discussion of when it makes more or less sense to use it compared with, or combined with, alternatives such as Azure (Durable) Functions, Site Designs and so on. I am looking forward to receiving some questions on that.

Provision Microsoft Teams with Azure Automation – Part I

I am a big fan of Azure Automation and its helpful capabilities for provisioning and administering Office 365 and its components such as SharePoint, Office 365 Groups or Microsoft Teams. In this little series I want to show you how you can provision your Teams with Azure Automation in a quick and simple way; the same approach scales to much more complex solutions depending on your needs.

Inspired by a video from Paolo Pialorsi, I wanted to establish some base runbooks in Azure Automation for provisioning Microsoft Teams. In the past I did lots of them for SharePoint Online and Office 365 Groups.

As always striving for the latest and greatest, I wanted to omit the two-step approach that Paolo also shows in his video: first create an Office 365 Group and then enhance that Group to a Team. And indeed there is at least a beta API in Microsoft Graph for directly creating a Team based on an existing template:
POST https://graph.microsoft.com/beta/teams

The pattern is quite simple and similar to any other POST request, you need to

  • Care for authentication by creating a token and providing it in your request header
  • Create a body to provide information for the object to be created
  • Execute a POST request by handing in the header and body

While the token creation is similar to other scenarios and is shown in Paolo’s video as well, let’s have a detailed look at the request body:

{
  "template@odata.bind": "https://graph.microsoft.com/beta/teamsTemplates('standard')",
  "displayName": "My First Team",
  "description": "My First Team’s Description",
  "owners@odata.bind": [
    "https://graph.microsoft.com/beta/users/<UserGuid>"
  ]
}

The first thing I missed here was a “mailNickname”, or alias. I checked and tried it out, but you cannot provide that this way, at least at the moment; you need to rely on what Microsoft creates from your requested displayName. In Groups provisioning I clearly preferred to give the user a chance to enter their own wishes.

The second thing I found out (and here I moved away from Microsoft’s example in the beta documentation, as that wasn’t working) is that you need to provide a user GUID. The same holds for Groups, by the way, if you want to provide at least one owner from the start (which you should!). A user GUID is certainly not a good thing to request from a user directly, so we need some additional lines of code; let’s come to that right now.

First we connect to Microsoft Graph. A simple way to do that is with PnP PowerShell, of course. I assume you already registered an app for creating Groups as documented here, and that we already grabbed our ID, secret and domain (see later in another runbook). After connecting, we get our access token and put it in a header object.

Next we come to the problem of the user GUID. We assume our user request provides a login name. With a simple Graph request we get the corresponding user and its ID, which we store in a variable.
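A minimal sketch of that lookup, assuming the header object from the step above is in $header and the requested login name in $loginName:

```powershell
# Resolve the user's GUID from the login name (UPN) via Microsoft Graph
$userResponse = Invoke-RestMethod -Method Get `
    -Uri "https://graph.microsoft.com/v1.0/users/$loginName" `
    -Headers $header
$userGuid = $userResponse.id
```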

Next we can fill our request body from above: not only with the user GUID but also with displayName, description, and our template. I won’t go into details here, but there are several Microsoft templates available and you can even override properties of the available templates. One problem we have are the single quotation marks that enclose the template name!

"template@odata.bind" = "https://graph.microsoft.com/beta/teamsTemplates('standard')"

In the next step we convert the just-created body object to JSON. ConvertTo-Json would escape those quotation marks, so we pipe the output through the Unescape method of .NET’s standard Regex class. I found this simple but effective tip in another blog post.

ConvertTo-Json -InputObject $teamRequest | % { [System.Text.RegularExpressions.Regex]::Unescape($_) }

The final step is to invoke a REST request of type POST against the teams URL, providing the header and body we just created. We do that and store the response in another variable.
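Put together, the POST could be sketched like this (assuming $header and the $teamRequest body object from above):

```powershell
# Convert the body to JSON and revert the escaped quotation marks
$body = ConvertTo-Json -InputObject $teamRequest |
    % { [System.Text.RegularExpressions.Regex]::Unescape($_) }

# Fire the POST request against the beta teams endpoint
$response = Invoke-RestMethod -Method Post `
    -Uri "https://graph.microsoft.com/beta/teams" `
    -Headers $header `
    -Body $body
```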

We would now expect to have our freshly created Team in a variable and could use its ID to request it and do further things with it. Unfortunately this is the next shortcoming of the beta version we currently use: it returns NOTHING.

So for reasons of completeness I added some “weak” lines of code to get the “latest” Team created with our requested displayName. Such a method certainly isn’t 100% reliable, but hopefully this gets fixed before this API is supported for production use.
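Those “weak” lines could be sketched as follows (variable names assumed from the earlier steps):

```powershell
# Fallback: filter groups by the requested displayName and take the newest one.
# Not 100% reliable if several teams share the same displayName!
$uri = "https://graph.microsoft.com/v1.0/groups?`$filter=displayName eq '$displayName'"
$groups = (Invoke-RestMethod -Method Get -Uri $uri -Headers $header).value
$teamId = ($groups | Sort-Object createdDateTime -Descending | Select-Object -First 1).id
```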

Finally, let me show my regular concept for provisioning with runbooks: I always use a modularized approach because I hate PowerShell scripts with tons of code lines. So I always use one parent runbook that calls further runbooks for the significant steps of my provisioning process, such as:

  • Create my site/Group/Team (the runbook I showed above)
  • Provision my artefacts with PnP Provisioning
  • Post-provisioning stuff: modifications where I need additional code and what doesn’t work with PnP templates
  • Shared or helper runbooks to be called from different runbook modules …

So in our case here is a simple parent runbook that calls the just-mentioned “Create” runbook, retrieves the created Team’s ID, and processes it further.

First we grab our assets, such as the Graph access credentials, and store them in $global: variables so they are also available in all called (sub) runbooks in case we need to share them …
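A reduced sketch of such a parent runbook (runbook and asset names are hypothetical; in Azure Automation, PowerShell child runbooks can be invoked inline like a script):

```powershell
# Grab shared assets once and expose them globally to all child runbooks
$global:appId     = Get-AutomationVariable -Name "GraphAppId"
$global:appSecret = Get-AutomationVariable -Name "GraphAppSecret"
$global:aadDomain = Get-AutomationVariable -Name "AADDomain"

# Call the "Create" runbook inline and take back the created Team's ID
$teamId = .\Create-Team.ps1 -DisplayName "My First Team" `
                            -Description "My First Team's Description" `
                            -OwnerLogin "first.last@contoso.onmicrosoft.com"

# Process the ID further, e.g. PnP provisioning and post-provisioning steps
.\Provision-Artefacts.ps1 -TeamId $teamId
```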

That was it for now. Hopefully there will be some progress with this API for Teams creation soon. I will keep an eye on it and potentially update this post.

In the next part I will show you the approach that is already available for production use (v1.0 Graph API): first create a Group, then a Team out of it, and further modify the just-created Team. There you will see additional things to note when handling provisioning with Azure Automation runbooks. So stay tuned.

Markus is a SharePoint architect and technical consultant with focus on latest technology stack in Office 365 and SharePoint Online development. He loves the new SharePoint Framework as well as some backend stuff around Azure Automation or Azure Functions and also has a passion for Microsoft Graph.
He works for Avanade and is based in Munich.
Although partially inspired by his daily work, opinions are always personal.

Deploy SPFx app package to SharePoint from Azure DevOps with modern authentication

As you can see from my last posts, I got heavily involved with SharePoint modern authentication in the recent past. To repeat once again:

Intro

If your tenant has turned off legacy authentication, you cannot simply authenticate with PowerShell and user credentials anymore. To check this, run the following script:

Connect-SPOService -Url "https://<Your-Tenant>-admin.sharepoint.com"

$tenantsettings = Get-SPOTenant

$tenantsettings.LegacyAuthProtocolsEnabled

And potentially modify this setting (to turn it on: $true):

Set-SPOTenant -LegacyAuthProtocolsEnabled $true

In one of my last posts I showed how modern authentication is handled in a PowerShell script, especially in an Azure Automation environment where you can store and retrieve the necessary self-signed certificate as an Azure Automation asset.

Now another challenge: inside an Azure DevOps release pipeline you need one or two more PowerShell tasks. In my scenario two: one to upload the app package to the app catalog, and a second one to upload the assets to a SharePoint library, that is, your Office 365 public CDN.

In Azure DevOps release (and build) pipelines you have no capability to simply store a certificate as an asset. Having it in the source code might not be an option, as you do not want to distribute it to every developer’s environment. I only found the way to store variables (strings!); if anyone has a better idea, please speak up 🙂

Fortunately this is also a working scenario, as you can connect to SharePoint with another variant of the Connect-PnPOnline cmdlet by providing a PEMCertificate and a PEMPrivateKey, both represented by large strings. According to my last post, both can simply be extracted from a certificate once it is created by the New-PnPAzureCertificate cmdlet.

$Password = "******"
$secPassword = ConvertTo-SecureString -String $Password -AsPlainText -Force
$cert = New-PnPAzureCertificate -Out "AzureAutomationSPOAccess.pfx" `
 -ValidYears 10 `
 -CertificatePassword $secPassword `
 -CommonName "AzureAutomationSPOAccess" `
 -Country "DE" `
 -State "Bavaria"
$PEMCert = $cert.Certificate
$PEMSecret = $cert.PrivateKey
Write-Host $PEMCert
Write-Host $PEMSecret

On top, you also have to output the KeyCredentials and add them to an Azure AD app registration.
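As a sketch of that step, the object returned by New-PnPAzureCertificate also exposes a KeyCredentials snippet for the app registration (assuming the $cert variable from the block above):

```powershell
# Output the KeyCredentials JSON snippet for the certificate
Write-Host $cert.KeyCredentials

# Paste the output into the "keyCredentials" section of the
# Azure AD app registration's manifest in the Azure portal
```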

Release pipeline basic configuration

In the past I mentioned Elio Struyf as a good resource for SPFx Azure DevOps build and release pipelines, but today I would also like to mention the blog post of Giuliano de Luca, which lists all the necessary steps and options in a fantastic way.

In my simple scenario I picked most of the basics from him, but I prefer to use a release pipeline only in staging environments, as you have to package again for every stage (changes in write-manifests.json, for instance). But to simplify, we will cover only one stage:

Simple Azure DevOps SPFx Release Pipeline (1 Stage)
Azure DevOps Release Pipeline – The Tasks

The first tasks are quite obvious. At first we follow a recommendation from Elio to use Node version 8, which significantly improves performance. Afterwards we run npm install, then bundle and package our solution.

Here I want to give you a special hint on dealing with larger projects:

There I tend to use larger Git repositories, meaning not every SPFx solution has its own, as in the typical standard deployment demo scenario. That leads to the point where the standard file structure is no longer valid. When entering the System.DefaultWorkingDirectory followed by your artifact folder (“MM” in my example, see the first screenshot), we come to different solution directories below:

For those solution directories (assume today you want to deploy Webpart1 but tomorrow Webpart2, you have them all in one Git repository, and all are handled by one configured release pipeline) I created a release variable localPath that can be set at release time.

Azure DevOps Release variables

This leads us to the point where we can refer to this in the npm install or our gulp tasks when it comes to a local file path:

npm install
gulp bundle task (gulp package-solution quite similar)

So depending on which web part we want to deploy, according to our variables above we would hand in different “packageFile” (we use it later!) and different “localPath” values, and the task will find its package.json (for npm install) as well as its gulpfile.js.

PowerShell Tasks and modern authentication

But now let’s come to the main point of this post. Let’s create the two PowerShell tasks to upload our app package and the assets to the library that represents our Office 365 public CDN. First let me clarify which task to use, as there are several available (in the past, for instance, I used one from the marketplace). I have now switched to the official Microsoft task.

Azure DevOps Task Template PowerShell (by Microsoft)

The first task uploads the package to the app catalog. It is quite simple, as beyond the script we do not really need to configure anything else. I skipped setting a working directory (I use cd in the script instead) and providing environment variables here. Maybe worth investigating in the future.

Azure DevOps PowerShell task Deploy to App catalog

As the complete script did not fit into the screenshot, here it is once again:

Write-Host ("Adding package " + $($env:packageFile) + " to the AppCatalog at https://" + $($env:tenant) + ".sharepoint.com/" + $($env:catalogsite))
cd $env:System_DefaultWorkingDirectory/MM/$env:localPath/sharepoint/solution/
Install-PackageProvider -Name NuGet -Force -Scope "CurrentUser" 
Install-Module SharePointPnPPowerShellOnline -Scope "CurrentUser" -Verbose -AllowClobber -Force
Import-Module SharePointPnPPowerShellOnline -Scope "Local" -WarningAction SilentlyContinue

Write-Host ("Connecting to Tenant " + $($env:tenant) + ".onmicrosoft.com with AppID " + $($env:SPOAppID))

Connect-PnPOnline -Url https://$(tenant).sharepoint.com/$(catalogsite) -Tenant "$(tenant).onmicrosoft.com" -ClientId $(SPOAppID) -PEMCertificate "$(PEMCertificate)" -PEMPrivateKey "$(PEMPrivateKey)"

Write-Output 'Connected to SPO'

Add-PnPApp -Path $env:packageFile -Publish -Overwrite
Write-Host ("Finished uploading app package " + $($env:packageFile))

For debugging reasons I left a couple of Write-Host lines in, but they are not necessary, of course. The first one shows quite well how to reference our release variables from above ($env:<ReleaseVariableName>).

After the first output we switch the current directory to our current solution and the folder where the .sppkg file resides.

Afterwards we install our PnP PowerShell module with three lines of code.

But then comes the ‘magic’ point of this post, modern authentication with PnP Online:

Connect-PnPOnline `
    -Url https://$(tenant).sharepoint.com/$(catalogsite) `
    -Tenant "$(tenant).onmicrosoft.com" `
    -ClientId $(SPOAppID) `
    -PEMCertificate "$(PEMCertificate)" `
    -PEMPrivateKey "$(PEMPrivateKey)"

We build our URL from known and obvious parts combined with our release variables, and the tenant as well.

We then hand in the ID of our app registration and finally our PEMCertificate and PEMPrivateKey strings retrieved from our variables.

Finally, I simplified things again and added a simple Add-PnPApp cmdlet, assuming SkipFeatureDeployment is always false. Giuliano de Luca shows a cool optional handling for this in his code repository.

The next task is quite the same:


Azure DevOps PowerShell task Upload to Office 365 public CDN
Write-Host ("Uploading assets of package " + $($env:packageFile) + " to the CDN site " + $($env:CDNSite))
cd $env:System_DefaultWorkingDirectory/MM/$env:localPath/

$cdnConfig = Get-Content -Raw -Path ("config\copy-assets.json") | ConvertFrom-Json
$bundlePath = $cdnConfig.deployCdnPath
$files = Get-ChildItem $bundlePath\*.*

Write-Output "Connecting to CDN-Site $env:CDNSite"

Connect-PnPOnline -Url $env:CDNSite -Tenant "$(tenant).onmicrosoft.com" -ClientId $(SPOAppID) -PEMCertificate "$(PEMCertificate)" -PEMPrivateKey "$(PEMPrivateKey)"

foreach ($file in $files) {
    $fullPath = $file.DirectoryName + "\" + $file.Name
    Write-Output "Uploading file $fullPath to folder $env:CDNLib"
    $f = Add-PnPFile -Path $fullPath -Folder $env:CDNLib
}

We do not need to install PnP PowerShell anymore; we only cd to a different directory. Then we grab a JSON file from our repository to read a value from it, an elegant way to avoid some release variables.
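For reference, a minimal copy-assets.json could look like this (the path is an assumption, matching the common SPFx deploy folder):

```json
{
  "deployCdnPath": "temp/deploy"
}
```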

We once again authenticate with our certificate values to SharePoint.

Finally we use the path value (retrieved from the JSON config file) where our bundles reside to grab all files from there. We then iterate over them one by one and upload them to the library/folder inside our CDN site.
