
SharePoint Developer Reference - Blog

July 28
How to publish a Workflow Definition in SharePoint using PowerShell

Today, while working on the PnP Provisioning Engine, I had to write some code to publish a new XAML-based Workflow Definition in SharePoint, as well as to create a Subscription between the workflow and a target list/library. To make it easier, I wrote some PowerShell scripting that leverages Erwin van Hunen's PnP PowerShell extensions together with the SharePoint Client Side Object Model and the Workflow Services Manager client library.

Here is a PowerShell excerpt, with in-line comments, that explains how you can create, publish and subscribe to a workflow definition starting from a XAML file.

# Load the WorkflowServicesManager client library (please provide the proper file path …)
[System.Reflection.Assembly]::LoadFrom("Microsoft.SharePoint.Client.WorkflowServices.dll")

# Connect to the remote SharePoint Site
Connect-SPOnline "https://tenant.sharepoint.com/sites/TargetSite/"

# Init context variables
$web = Get-SPOWeb
$ctx = $web.Context

# Create a WorkflowServicesManager instance
$wfm = New-Object Microsoft.SharePoint.Client.WorkflowServices.WorkflowServicesManager -ArgumentList $ctx,$web

# Get a reference to the Workflow Deployment Service
$wfDeploymentService = $wfm.GetWorkflowDeploymentService()

# Load the Workflow XAML
$xamlPath = 'SampleWorkflowDefinition.xaml'
$xaml = [System.Xml.Linq.XElement]::Load($xamlPath)

# Prepare the Workflow Definition object
$wfDefinition = New-Object Microsoft.SharePoint.Client.WorkflowServices.WorkflowDefinition -ArgumentList $ctx
$wfDefinition.DisplayName = "SampleWorkflow"
$wfDefinition.Description = "This is a sample workflow published using PowerShell"
$wfDefinition.Xaml = $xaml.ToString()

# Save and publish the Workflow Definition object
$definitionId = $wfDeploymentService.SaveDefinition($wfDefinition)
$ctx.Load($wfDefinition)
$ctx.ExecuteQuery()

# Publish the Workflow Definition
$wfDeploymentService.PublishDefinition($definitionId.Value)

# Retrieve IDs of targets (list/library, history, and tasks)
$targetLibrary = $web.Lists.GetByTitle("Orders")
$ctx.Load($targetLibrary)

$historyList = $web.Lists.GetByTitle("Workflow History")
$ctx.Load($historyList)

$tasksList = $web.Lists.GetByTitle("Workflow Tasks")
$ctx.Load($tasksList)

$ctx.ExecuteQuery()

# Associate the Workflow Definition to a target list/library
$wfSubscriptionService = $wfm.GetWorkflowSubscriptionService()
$wfSubscription = New-Object Microsoft.SharePoint.Client.WorkflowServices.WorkflowSubscription -ArgumentList $ctx

# Configure the Workflow Subscription
$wfSubscription.DefinitionId = $definitionId.Value
$wfSubscription.Name = $wfDefinition.DisplayName + " - Definition"
$wfSubscription.Enabled = $true
$eventTypes = New-Object System.Collections.Generic.List[String]

# Available values are: ItemAdded, ItemUpdated, WorkflowStart
$eventTypes.Add("WorkflowStart")
$wfSubscription.EventTypes = $eventTypes

$wfSubscription.EventSourceId = $targetLibrary.Id.ToString()
$wfSubscription.SetProperty("TaskListId", $tasksList.Id.ToString())
$wfSubscription.SetProperty("HistoryListId", $historyList.Id.ToString())

# Publish the Workflow Subscription
$wfSubscriptionService.PublishSubscriptionForList($wfSubscription, $targetLibrary.Id)
$ctx.ExecuteQuery()

And the XAML of the sample workflow looks like this:

<Activity mc:Ignorable="mwaw" x:Class="SampleWF.MTW" xmlns="http://schemas.microsoft.com/netfx/2009/xaml/activities" xmlns:local="clr-namespace:Microsoft.SharePoint.WorkflowServices.Activities" xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006" xmlns:mwaw="clr-namespace:Microsoft.Web.Authoring.Workflow;assembly=Microsoft.Web.Authoring" xmlns:scg="clr-namespace:System.Collections.Generic;assembly=mscorlib" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml">
  <Sequence>
    <Sequence>
      <mwaw:SPDesignerXamlWriter.CustomAttributes>
        <scg:Dictionary x:TypeArguments="x:String, x:String">
          <x:String x:Key="InitBlock">InitBlock-7751C281-B0D1-4336-87B4-83F2198EDE6D</x:String>
        </scg:Dictionary>
      </mwaw:SPDesignerXamlWriter.CustomAttributes>
    </Sequence>
    <Flowchart StartNode="{x:Reference __ReferenceID0}">
      <FlowStep x:Name="__ReferenceID0">
        <mwaw:SPDesignerXamlWriter.CustomAttributes>
          <scg:Dictionary x:TypeArguments="x:String, x:String">
            <x:String x:Key="Next">4294967294</x:String>
          </scg:Dictionary>
        </mwaw:SPDesignerXamlWriter.CustomAttributes>
        <Sequence>
          <mwaw:SPDesignerXamlWriter.CustomAttributes>
            <scg:Dictionary x:TypeArguments="x:String, x:String">
              <x:String x:Key="StageAttribute">
                StageContainer-8EDBFE6D-DA0D-42F6-A806-F5807380DA4D
              </x:String>
            </scg:Dictionary>
          </mwaw:SPDesignerXamlWriter.CustomAttributes>
          <local:SetWorkflowStatus Disabled="False" Status="Stage 1">
            <mwaw:SPDesignerXamlWriter.CustomAttributes>
              <scg:Dictionary x:TypeArguments="x:String, x:String">
                <x:String x:Key="StageAttribute">StageHeader-7FE15537-DFDB-4198-ABFA-8AF8B9D669AE</x:String>
              </scg:Dictionary>
            </mwaw:SPDesignerXamlWriter.CustomAttributes>
          </local:SetWorkflowStatus>
          <Sequence DisplayName="Stage 1">
            <local:WriteToHistory Message="Hello World!" />
          </Sequence>
          <Sequence>
            <mwaw:SPDesignerXamlWriter.CustomAttributes>
              <scg:Dictionary x:TypeArguments="x:String, x:String">
                <x:String x:Key="StageAttribute">
                      StageFooter-3A59FA7C-C493-47A1-8F8B-1F481143EB08
                    </x:String>
              </scg:Dictionary>
            </mwaw:SPDesignerXamlWriter.CustomAttributes>
          </Sequence>
        </Sequence>
      </FlowStep>
    </Flowchart>
  </Sequence>
</Activity>

Of course, this is a very simple workflow that just writes “Hello World!” to the Workflow History list, but it can be as complex as you like. Enjoy!

July 04
Using SharePoint REST API from PowerShell

In this short post I want to explain how to use the SharePoint REST API from PowerShell, targeting a SharePoint Online site collection. As you probably know, you can do almost everything (and when I say everything, I really mean everything …) using the PowerShell extensions created by my friend Erwin van Hunen, which he kindly made available for free in the Office 365 Developer Patterns and Practices community project. Just in case, you can install those PowerShell extensions from here. They are available in two flavors: v.15, which targets SharePoint on-premises, and v.16, which targets SharePoint Online.

Now, let’s say that you want to do something using low-level SharePoint REST API calls within PowerShell, targeting SharePoint Online. In that case, the biggest issue is properly providing your credentials to the target REST endpoint. To do that you can leverage the Connect-SPOnline cmdlet from Erwin, together with a bit of custom PowerShell scripting. Here is a code excerpt that accomplishes a very simple task (getting the title of a library):

# Connect to SharePoint Online
$targetSite = "https://<your-tenant>.sharepoint.com/sites/<SiteName>/"
$targetSiteUri = [System.Uri]$targetSite

Connect-SPOnline $targetSite

# Retrieve the client credentials and the related Authentication Cookies
$context = (Get-SPOWeb).Context
$credentials = $context.Credentials
$authenticationCookies = $credentials.GetAuthenticationCookie($targetSiteUri, $true)

# Set the Authentication Cookies and the Accept HTTP Header
$webSession = New-Object Microsoft.PowerShell.Commands.WebRequestSession 
$webSession.Cookies.SetCookies($targetSiteUri, $authenticationCookies)
$webSession.Headers.Add("Accept", "application/json;odata=verbose")

# Set request variables
$targetLibrary = "Documents"
$apiUrl = "$targetSite" + "_api/web/lists/getByTitle('$targetLibrary')"

# Make the REST request
$webRequest = Invoke-WebRequest -Uri $apiUrl -Method Get -WebSession $webSession

# Consume the JSON result
$jsonLibrary = $webRequest.Content | ConvertFrom-Json
Write-Host $jsonLibrary.d.Title

You can now adapt the authentication code to your needs, and you can call almost every REST endpoint using this technique.
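
For instance, still reusing the $webSession and the variables defined above, you could query the items of the same library. This is just a sketch to illustrate how to reuse the authenticated session; the endpoint and the $top value are examples only:

# Reuse the authenticated web session to query the items of the target library
$itemsApiUrl = "$targetSite" + "_api/web/lists/getByTitle('$targetLibrary')/items?`$top=5"
$itemsRequest = Invoke-WebRequest -Uri $itemsApiUrl -Method Get -WebSession $webSession

# With the verbose OData format, the returned items are in the d.results array
$jsonItems = $itemsRequest.Content | ConvertFrom-Json
$jsonItems.d.results | ForEach-Object { Write-Host $_.Title }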

June 17
SharePoint 2013 Hybrid Topologies Maintenance

While waiting for Microsoft SharePoint 2016 and its upcoming new features and capabilities in the field of hybrid cloud, maybe you already have a hybrid topology with Microsoft SharePoint 2013.

It may also be that you created that topology following the instructions I provided last year at TechEd Europe 2014, which you can still find here.

In that case, you probably know that - when you want to enable the hybrid search topology - you have to register your on-premises farm’s STS certificate in the Office 365 tenant. In order to do that you can use the following cmdlet from PowerShell:

New-MsolServicePrincipalCredential

The STS X.509 certificate will be used to authenticate against the SPO tenant. Like any other X.509 certificate, it will expire. Moreover, the Service Principal Credential associated with your on-premises farm has a StartDate and an EndDate. And what happens when the certificate or the credentials expire? Almost nothing!

I mean: no errors in the UI, no alerts to the users … but the outbound hybrid search will simply stop working and you will get back results from the on-premises index only! Is that cool? Not that much, I guess, but it is understandable … you should not block end users with fancy error messages just because of that. However, in the ULS log you will find an exception like the following one, raised while sending the remote search request from on-premises to SPO: “ACS50027: JWT token is invalid”.

How can you fix it? You simply need to update the certificate and re-issue the credentials using the same PowerShell cmdlet as before, but providing the new certificate and the new EndDate. If you want, you can do that in advance, so that you replace the certificate on the fly, without any service interruption for your end users.
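
Just as a reference, here is a minimal sketch of how the re-issue could look; the SharePoint Online service principal ID, the certificate file path, and the dates are assumptions that you should adapt to (and verify against) your own environment:

# Minimal sketch (assumptions: MSOnline module installed, new STS certificate exported as a .cer file)
Connect-MsolService

# Well-known service principal ID of SharePoint Online (verify it in your own tenant)
$spoAppPrincipalId = "00000003-0000-0ff1-ce00-000000000000"

# Load the new STS certificate and get its base64-encoded raw data
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("C:\Certs\NewSTSSigningCert.cer")
$certValue = [System.Convert]::ToBase64String($cert.GetRawCertData())

# Re-issue the Service Principal Credential with the new certificate and the new EndDate
New-MsolServicePrincipalCredential -AppPrincipalId $spoAppPrincipalId -Type Asymmetric -Usage Verify -Value $certValue -StartDate $cert.NotBefore -EndDate $cert.NotAfter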

More information about the hybrid topologies can be found here.

Enjoy your hybrid topologies.

June 10
Slides and demos of my session at CEUS by Iberian SPC

Here you can find the slides of my session “Remote Provisioning with the new PnP Provisioning Engine” at the CEUS by Iberian SPC. Moreover, here are the demos:

I hope you enjoyed the content, and I hope you will contribute to the OfficeDev PnP Community Project.

Thanks!

June 03
Slides of my session at SPS Paris 2015

Here you can find the PDF slides of my session “Overview of Hybrid Scenarios with Office 365 and SharePoint” at the latest SharePoint Saturday Paris 2015. Enjoy!

May 08
Introducing the PnP Provisioning Engine

This week at the Microsoft Ignite conference we introduced the new PnP Provisioning Engine.

What is PnP?

Before talking about the new Provisioning Engine, let me introduce you to the PnP group. PnP stands for Office 365 Developer Patterns and Practices. I'm proud of being one of the PnP Core Team members, and what we do is support developers in adopting the new Cloud App Model, moving away from the Full Trust Code model, and developing SharePoint Add-Ins (formerly known as SharePoint Apps), as well as Office 365 Apps. We have a site, which is hosted by Microsoft, and we have a GitHub repository. Please have a look at what we do, and feel free to contribute.

What is the PnP Provisioning Engine?

And now, let's talk about the PnP Provisioning Engine. It is an engine, available for FREE and under an Open Source model, to easily do Remote Provisioning (via CSOM) of artifacts in SharePoint on-premises and SharePoint Online. Would you like to learn more about this topic? Please read this white paper and let us know your feedback!

How can you get it?

You like it? I guess so … You can get the new PnP Provisioning Engine in many different ways, depending on your needs.

If you plan to use the engine within PowerShell, you can install the PowerShell extensions that we provide (thanks to Erwin van Hunen). Those PowerShell extensions are available here. You can download the extensions for SharePoint Online (v16) or for SharePoint on-premises (v15).
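
Just to give you an idea of the overall flow with the PowerShell extensions, here is a minimal sketch; the cmdlet names below follow the SPO-prefixed convention of that release (like Connect-SPOnline used elsewhere in this blog), so double-check them against the documentation of the version you install:

# Sketch only: extract a provisioning template from a source site ...
Connect-SPOnline "https://tenant.sharepoint.com/sites/SourceSite/"
Get-SPOProvisioningTemplate -Out "template.xml"

# ... and apply it to a target site
Connect-SPOnline "https://tenant.sharepoint.com/sites/TargetSite/"
Apply-SPOProvisioningTemplate -Path "template.xml"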

Another option is to get the OfficeDev PnP Core library as a NuGet package in Microsoft Visual Studio. There is one library for on-premises, and another one for the cloud.

Lastly, you can download the engine (including its source code) from the OfficeDev PnP repository on GitHub. The PnP Provisioning Engine is in the Framework/Provisioning folder. We have both the monthly release (master branch) and the “under development” release (dev branch).

What’s next?

Enjoy the new PnP Provisioning Engine, let us know what you think about it, post any issue on GitHub, and feel free to contribute to this community project!

Thank you!

March 14
Slides and demos of my session “Office 365 API for .NET Developers” at SPS Helsinki

Thanks for attending my session @SPSHelsinki. Here you can find the slide deck I showed during the session. Moreover, on the OfficeDev PnP project you can find the entire demo solution I illustrated: a WPF application, an ASP.NET MVC web application, and an ASP.NET Web Forms application that show how to consume the Office 365 API from .NET.

Thanks again and enjoy the Office 365 API!

March 11
Microsoft Office 365, ADFS and signing/encrypting certificates renewal

Let’s say that you have an Office 365 tenant, and that you also deployed a set of Federated Identities with Windows Active Directory Federation Services (ADFS). It could happen that you log into the Admin portal of your Office 365 tenant and you see the following polite message.

Office365-Alert

Let me repeat the text to help people find this content via web search, in case of need: “Renew your Certificates – One of your on-premises Federation Service certificates is expiring. Failure to renew the certificate and update trust properties within X days will result in a loss of access to all Office 365 services for all users.” Wow! That’s a very clear message, but also quite scary. What’s happening?!

Well, from an ADFS perspective you will have at least three certificates to support the security of your own environment:

  • The “Service Communications” certificate, which will be used to protect the HTTPS endpoint
  • The “Token-decrypting” certificates, which will be used to decrypt security tokens
  • The “Token-signing” certificates, which will be used to sign security tokens

The first one is used to secure the HTTPS endpoint, and when it expires you simply need to renew it and replace it in ADFS and in your reverse proxies, if any.

In your environment, which is federated with Azure Active Directory, the last two certificates are used to secure the security token exchange with Azure AD, which sits under the covers of your Office 365 tenant.

The kind message you get in the Office 365 Admin center simply states that one or both of these certificates are going to expire and, if you do not replace them promptly, nobody will be able to consume your services once those certificates actually expire. Luckily, you also have the “Update Now” link at the very end of the alert. If you click on it you will be redirected to a Wiki page (created in March 2011 and updated in February 2015), which mainly targets ADFS 2.0 on Windows Server 2012. Now, my personal suggestion is to leverage Windows Server 2012 R2 and ADFS 3.0 for federating identities with Azure AD and Office 365, if possible.

Well, with ADFS 2.0 or ADFS 3.0 the solution to the problem is almost automatic, if you enable the AutoCertificateRollover capability. Let’s see how.

Connect to your ADFS server and open a PowerShell console or PowerShell ISE. Run the Get-AdfsProperties cmdlet and look at the output:

ADFS-Properties-Highlighted

You will find some interesting properties like:

  • AutoCertificateRollover (default value TRUE): determines whether the ADFS service will automatically manage the enrollment of new certificates before the current ones expire.
  • CertificateCriticalThreshold (default value 2): the number of days before the current certificates’ expiration that triggers a critical self-enrollment of new certificates to replace the expiring ones. This should not happen if the auto certificate rollover procedure works properly.
  • CertificateDuration (default value 365): the duration, in days, of the enrolled certificates.
  • CertificateGenerationThreshold (default value 20): the number of days before the current certificates’ expiration at which the certificate auto-rollover procedure runs. That procedure generates new certificates that will soon replace the current ones. Initially the new auto-generated certificates will be “Secondary,” while the current certificates keep their “Primary” status.
  • CertificatePromotionThreshold (default value 5): the number of days during which the new certificates remain “Secondary.” After that, the certificates are switched: the expiring ones become “Secondary,” while the new ones become “Primary.”

More details about these ADFS properties, and some others, can be found here.
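
If you want to quickly check (or tune) these values on your own ADFS server, here is a small sketch to run in an elevated PowerShell session on the ADFS machine; the values passed to Set-AdfsProperties are just examples:

# Show only the certificate rollover related properties
Get-AdfsProperties | Select-Object AutoCertificateRollover, CertificateCriticalThreshold, CertificateDuration, CertificateGenerationThreshold, CertificatePromotionThreshold

# Example only: make sure auto rollover is enabled and adjust the generation threshold
Set-AdfsProperties -AutoCertificateRollover $true -CertificateGenerationThreshold 20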

In the demo environment I am showing you, the expiration date of my encrypting and signing certificates was set to March 27th, 2015, as you can see in the following figure.

ADFS-Certificates-Highlighted

Well, on March 7th, 2015, which is exactly 20 days (= CertificateGenerationThreshold) before the certificates’ expiration, ADFS 3.0 automatically generated a couple of new certificates for encrypting and signing, and set them as “Secondary.” Here is what you can see.

ADFS-Certificates-Double-Highlighted

This way, the Azure AD engine is able to retrieve the information about these two new certificates within the next 5 days (= CertificatePromotionThreshold). In fact, on March 10th 2015 the polite alert disappeared from my tenant Admin center, as you can see in the following figure.

Office365-NoMore-Alert

Lastly, on March 13th 2015 (i.e. today), the ADFS 3.0 engine switched the “Primary” and the “Secondary” certificates, in order to promote the new ones and demote the expiring ones. Here is the result in the ADFS administration console.

ADFS-Certificates-Double-Switched-Highlighted

And you are done! Your ADFS certificates are updated, the Azure AD tenant is aware of the new certificates, and for the next 365 days (= CertificateDuration) after the creation date of the new certificates you don’t need to worry about certificate expiration. Your users can continue consuming their services without any interruption.

When the secondary (old and expiring) certificate actually expires, ADFS will automatically remove it from the list of available certificates, and only the primary, valid (not expired) one will remain.

February 16
Slide, demos and links of my webinar “Consuming Office 365 REST API” for ESPC15

Here you can find the demos (on the OfficeDev/PnP project) about how to consume the Office 365 APIs. The slides and the recorded video will be available soon on the European SharePoint & Office 365 Community.

Moreover, as promised, here you can find the list of libraries (released and beta) available to consume Azure AD: https://msdn.microsoft.com/en-us/library/azure/dn151135.aspx .

I hope you enjoyed my webinar, and hope to meet you again shortly.

My next stop will be in Helsinki, on March 14th.

February 15
SPSSTHLM18 - Advanced SharePoint Workflow Scenarios

And here you can find slides and demos of my session “Advanced SharePoint Workflow Scenarios” at SharePoint Saturday Stockholm, 14 February 2015.

Stay tuned … the workflow topic is a huge one!

Visit my company: PiaSys.com 
 

About this blog

Welcome to the SharePoint Developer Reference Blog. I'm Paolo Pialorsi and I'm a senior consultant, author and trainer, focused on SharePoint development and customization. I'm based in Italy, but I work wherever it is needed (mainly EMEA). I'm a Microsoft Certified Master on SharePoint 2010.

You can also follow me on Twitter: @PaoloPia
