
SharePoint Developer Reference - Blog

July 12
Updating declarative custom actions for SharePoint 2013 Workflows

Let’s say you have created a declarative custom action for a SharePoint 2013 workflow, as I illustrated in this post. Now, let’s say you want to update it, changing the .actions4 file or updating its inner workings in the .xaml file.

When you upload the .WSP sandboxed solution that includes the custom action, and then activate the feature that deploys it, under the covers SharePoint 2013 creates a folder (with the name of the custom action) under the Workflows folder in your site, as you can see in the following screenshot of Microsoft SharePoint Designer 2013.

[Screenshot: the Workflows folder of the site in Microsoft SharePoint Designer 2013]

Each folder inside the Workflows folder contains the .actions4 and .xaml files of a declarative custom action, as you can see in the following screenshot.

[Screenshot: the .actions4 and .xaml files inside a custom action folder]

This happens because the feature element file, which is created by the Visual Studio action template, instructs SharePoint to do exactly that. Here you can see a sample Elements.xml file related to a sample declarative custom action.

[Screenshot: a sample Elements.xml file for a declarative custom action]

Well now, imagine that you want to update the custom action, as I stated at the very beginning of this post. You can simply update it in Visual Studio, rebuild the .WSP package, release an updated version of the solution, and upgrade or deactivate/re-activate the feature. However, it could happen that the action doesn’t upgrade. Can you guess why?

It doesn’t update the action because the Elements.xml file provisions the files just once, and does not update the .actions4 and .xaml files on any future upgrade. In order to change this behavior, you can simply edit the Elements.xml file, adding the ReplaceContent attribute with a value of TRUE to each of the File elements. Here you can see the updated Elements.xml file.

[Screenshot: the updated Elements.xml file with ReplaceContent="TRUE" on the File elements]
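By the way, if you have several File elements to update, you can also patch the file with a bit of PowerShell. The following is just a minimal sketch of mine, assuming the default Elements/Module/File layout produced by the Visual Studio template (the path is illustrative):

# Minimal sketch: add ReplaceContent="TRUE" to every File element of an Elements.xml file
# (path and Elements/Module/File layout reflect the default Visual Studio template)
$elementsPath = "C:\Projects\MyCustomAction\MoveFileActivity\Elements.xml"

[xml]$elements = Get-Content -Path $elementsPath -Raw
foreach ($file in $elements.Elements.Module.File) {
    $file.SetAttribute("ReplaceContent", "TRUE")
}
$elements.Save($elementsPath)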

That’s all! This way, whenever you update the feature that provisions the declarative custom action, the files will be overwritten and the action will be updated!

February 13
Publishing SharePoint apps that leverage SharePoint Workflow Services

Just a quick reference for those of you who are developing SharePoint apps that leverage the SharePoint Workflow Services. If you are using Microsoft Visual Studio 2013, you will see that the app publishing process has been improved and simplified. You simply right-click the SharePoint app project, select “Publish …”, and you are prompted with a nice and friendly page with all the buttons for managing your publishing phase.

[Screenshot: the publishing page in Visual Studio 2013]

Moreover, the publishing process is the same whether you are publishing the app to the Office Store or on-premises to a Corporate Catalog.

[Screenshot: the publishing options for the Office Store and the Corporate Catalog]

Well, first of all, by using this nice and friendly UI you can configure a publishing profile, providing the Client ID and Client Secret to use for publishing the app on Office 365 using OAuth, or providing the Client ID, the .PFX certificate file and password, and the Issuer ID if you plan to leverage a server-to-server deployment for on-premises farms.

You can deploy the web app directly to the target server (web deploy, web deploy package, FTP, file system).

[Screenshot: the web app deployment options]

Then, you can create the .APP package file for publishing the app to the target store (Office Store or Corporate Catalog) by clicking the “Package the app” button.

[Screenshot: the “Package the app” wizard]

Notice that the “Package the app” wizard accepts only apps published via HTTPS. Thus, you will have to provide an SSL-secured URL for your app, which is good and safe … but not always possible, for instance if you are packaging a test app and you do not want to use an SSL certificate. Keep in mind that in a production environment you should always use SSL!

Well, now if your app leverages one or more of the services available in SharePoint, those services will be listed in the AppManifest.xml file as Prerequisites. However, if you are using a workflow inside the app … the SharePoint Workflow Services requirement will make your app deployment fail!

In fact, the AppManifest.xml generated by Visual Studio 2013 will reference a Capability with ID:

CDD8F991-B459-4512-8048-03D5A03FF27E

Here is a sample AppManifest.xml file generated by Visual Studio 2013:

<?xml version="1.0" encoding="utf-8"?>
<App xmlns="http://schemas.microsoft.com/sharepoint/2012/app/manifest" Name="NameOfYourApp" ProductID="{777fd9aa-cf34-4de3-bc86-e5d0c00b58bc}" Version="1.0.0.0" SharePointMinVersion="15.0.0.0">
  <Properties>
    <Title>Title of your App</Title>
    <StartPage>https://host.domani.tld/?{StandardTokens}</StartPage>
    <SupportedLocales>
      <SupportedLocale CultureName="en-US" />
    </SupportedLocales>
  </Properties>
  <AppPrincipal>
    <RemoteWebApplication ClientId="35f7958e-a9b3-44c0-86b1-cf363c716f90" />
  </AppPrincipal>
  <AppPermissionRequests>
    <AppPermissionRequest Scope="http://sharepoint/content/sitecollection/web" Right="FullControl" />
  </AppPermissionRequests>
  <AppPrerequisites>
    <AppPrerequisite Type="Capability" ID="CDD8F991-B459-4512-8048-03D5A03FF27E" />
  </AppPrerequisites>
</App>

This is also documented here. However, as clearly stated by a comment on that article, the ID given there for the SharePoint Workflow Services capability is wrong. The right ID is instead:

B1DDD88F-6ADD-4700-B5CD-18E451635E24

If you try to publish the app, let’s say to the Corporate Catalog, with the wrong ID in the AppManifest.xml file … the result will be something like this:

[Screenshot: the error shown when adding the app to the Corporate Catalog]

Moreover, by clicking on the “Find out why” link, you will see something like this:

[Screenshot: the error details shown by the “Find out why” link]

A friendly message stating “Sorry, this app is not supported on your server.” will inform you that your target farm does not satisfy the declared requirements of your app. This can also happen if you reference a real requirement, with a valid Capability ID, that is not available in the target farm. But in the case of SharePoint Workflow Services, the issue is caused by the wrong Capability ID referenced in the AppManifest.xml file.

Well, to fix the issue you simply need to edit the content of the .APP file, which under the covers is a .ZIP file. You can open it with WinZip, WinRAR, or a similar tool. Then, you have to provide the proper Capability ID, which is B1DDD88F-6ADD-4700-B5CD-18E451635E24, and you are done! Upload the new and updated .APP file and enjoy your app!
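If you prefer to script the fix, here is a minimal PowerShell sketch (the package path is illustrative, and it relies on the .NET 4.5 System.IO.Compression classes) that replaces the wrong Capability ID directly inside the .APP package:

# Patch the Capability ID inside the .APP package (which is just a ZIP archive)
Add-Type -AssemblyName System.IO.Compression, System.IO.Compression.FileSystem

$appPackage = "C:\Packages\MyWorkflowApp.app"   # illustrative path
$zip = [System.IO.Compression.ZipFile]::Open($appPackage, [System.IO.Compression.ZipArchiveMode]::Update)
try {
    # Read the current AppManifest.xml content
    $entry = $zip.GetEntry("AppManifest.xml")
    $reader = New-Object System.IO.StreamReader($entry.Open())
    $manifest = $reader.ReadToEnd()
    $reader.Close()

    # Replace the wrong SharePoint Workflow Services Capability ID with the right one
    $manifest = $manifest -replace "CDD8F991-B459-4512-8048-03D5A03FF27E", "B1DDD88F-6ADD-4700-B5CD-18E451635E24"

    # Write the updated manifest back into the package
    $entry.Delete()
    $writer = New-Object System.IO.StreamWriter($zip.CreateEntry("AppManifest.xml").Open())
    $writer.Write($manifest)
    $writer.Close()
}
finally {
    $zip.Dispose()
}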

I hope this will help.

January 25
Understanding the REST API of SharePoint 2013–Slide and Demo (#SPSSTHLM17)

Here you can find the slides and demos of my session “Understanding the REST API of SharePoint 2013”, delivered at SharePoint Saturday Stockholm on 25th January 2014.

I hope you enjoyed the session, and I’m looking forward to meeting you again at the upcoming events where I will be speaking.

[Image: SharePoint Saturday Stockholm header]

Thanks to Matthias, Erwin, and Hannah for this really great and well-managed event!

November 22
How to create a Workflow Custom Action for SharePoint Designer 2013 using Visual Studio 2013

Quite often customers ask me how to create Workflow Custom Actions for SharePoint Designer 2013 using Microsoft Visual Studio. Because there is a lot of content available online, some of it not very complete or clear … let me try to clarify this topic, or add some more entropy :-) …

Thus, I will show you how to create a Custom Action to move a file from one document library to another. Because the move action requires write permissions on the source and target folders, and because you are not guaranteed that the user running the workflow has appropriate permissions … I will also show you how to create this action elevating your identity and acting as the app only, where the app is the workflow app. For further details about this topic, you can also read the following document on MSDN: Create a workflow with elevated permissions by using the SharePoint 2013 Workflow platform.

Prerequisites

First of all, you have to properly configure your environment. Because this Custom Action needs to use app permissions, you have to configure the target web site to support App Step sequences. In order to do that, open your site and navigate to the Site Settings page. There, select the “Manage Site Features” option in the “Site Actions” group of actions. In the resulting page, at the very end, you should see a feature called “Workflows can use app permission”. Activate that feature.

[Screenshot: the “Workflows can use app permission” site feature]

Then, you have to configure the permissions for the workflow app. Go back to the Site Settings page and click the “Site app permissions” link in the “Users and Permissions” group of actions. If you have already created and published at least one workflow with the new SharePoint 2013 workflow engine, you should find an app named “Workflow” (or however “Workflow” is translated in your language, if you are using a localized UI for SharePoint 2013). If you do not find an app named “Workflow”, open Microsoft SharePoint Designer 2013, create a fake workflow, publish it, and go back to the “Site app permissions” page. Here you can see how your page should look.

[Screenshot: the “Site app permissions” page]

Copy the ID of the “Workflow” app, which is the value between the last | and @ in the App Identifier field. In my sample image it is “f189f858-5565-4221-8d33-0099df9306fd”.
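If you prefer to grab that value with a little PowerShell instead of copying it by hand, here is a tiny sketch (the identifier below is just an illustrative value in the usual format):

# The App Identifier has the form <claims prefix>|<app id>@<realm>;
# we want the part between the last "|" and the "@" (illustrative value below)
$appIdentifier = "i:0i.t|ms.sp.ext|f189f858-5565-4221-8d33-0099df9306fd@7a813216-1234-4444-9999-aaaaaaaaaaaa"
$appId = $appIdentifier.Split("|")[-1].Split("@")[0]
$appId   # f189f858-5565-4221-8d33-0099df9306fd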

Change the web browser URL in order to navigate to the /_layouts/15/appinv.aspx page, which is not linked from the “Site Settings” page. You will be prompted with the following form.

[Screenshot: the form of the /_layouts/15/appinv.aspx page]

Fill the “App Id” field with the ID you have just copied, click the “Lookup” button, and fill in the “Permission Request XML” field with an XML permission request like the following one:

<AppPermissionRequests>
    <AppPermissionRequest Scope="http://sharepoint/content/sitecollection/web"
        Right="FullControl" />
</AppPermissionRequests>

Adjust the request as needed, in order to properly configure the permissions you want to assign to the “Workflow” app. For further details about the available permissions for apps, you can read my book :-) or you can read the following article on MSDN: App permissions in SharePoint 2013.

As soon as you click the “Create” button, you will be prompted to trust the “Workflow” app. Click the “Trust it” button, if you want to trust it and assign the declared permissions to it.

Implementing the Move File Custom Action

Now you are ready to implement the custom action. This action will use the REST APIs provided by SharePoint 2013 in order to retrieve a reference to a file in a library, and will invoke the MoveTo method of the File class (Microsoft.SharePoint.Client) in order to move that file somewhere else.

Start Microsoft Visual Studio 2013, choose “New Project” and select a “SharePoint 2013 – Empty Project” template.

[Screenshot WF-Custom-Action-01: the “SharePoint 2013 – Empty Project” template]

Select to create a sandboxed solution, in order to be able to leverage the custom action both on-premises and on Office 365.

[Screenshot WF-Custom-Action-02: choosing the sandboxed solution option]

Right-click the project item in the Solution Explorer and add a new item. Choose the “Workflow Custom Activity” template and name the new item “MoveFileActivity”.

[Screenshot WF-Custom-Action-03: the “Workflow Custom Activity” item template]

That action will add a new feature and a new feature element made of three files:

  • Elements.xml: it is the feature element used to provision the new activity.
  • MoveFileActivity.actions4: it is an XML file that will be used to describe the custom action. We will talk about it later in this article.
  • MoveFileActivity.xaml: it is the real markup-based (XAML) custom activity.

Keep in mind that now, with SharePoint 2013 and the new workflow engine based on Workflow Manager, any custom activity suitable for SharePoint 2013 on-premises and online has to be markup based. You can also implement code-based activities, but those will be suitable for on-premises farms only.

In the designer area of Visual Studio you will find an activity designer with a Sequence activity ready to go. At the bottom of the designer, click the Arguments tab in order to add some input arguments, which will be used by SharePoint Designer 2013 to provide a reference to the file to move and to the target library.

[Screenshot: the Arguments tab of the activity designer]

The arguments will be:

  • ItemGuid – In – System.Guid
  • ListId – In – System.Guid
  • TargetFolder – In – System.String

Then, you can open the toolbox and drag and drop the following activities onto the designer surface:

  • WebUri (available under the “SP – Current Context” group of activities)
  • LookupSPListItemId (available under the “SP – List” group of activities)
  • AppOnlySequence (available under the “SP – Utilities” group of activities)

The WebUri activity will be used to retrieve the URL of the current web site. Define a workflow variable with name currentWebUri and store the result of the WebUri activity into it.

[Screenshot WF-Custom-Action-04: the WebUri activity and the currentWebUri variable]

Now you can retrieve the ID of the source item using the LookupSPListItemId activity. Provide the ListId and ItemGuid values as filters for selecting the ID. Save the result in a variable named sourceItemID, of type System.Int32, as shown in the following screenshot; this is the variable that will be used to compose the REST URL.

[Screenshot WF-Custom-Action-05: the LookupSPListItemId activity configuration]

Now you are ready to compose the URL of the REST request for moving the file. As already stated, the REST API that we will use is the MoveTo method of the File class of the client object model. The URL of that method, from a REST perspective, will look like the following one:

{currentWebUri}/_api/web/lists/GetById(guid'{ListId}')/Items({sourceItemID})/File/MoveTo(newUrl='{targetFolder}',flags='1')

Here, the tokens wrapped in { } are the variables and arguments just defined, and the last parameter, named flags, with a value of ‘1’, means: overwrite any file that already exists at the destination. You can find the full documentation of the MoveTo method here.

Thus, you are ready to prepare the REST request. Within the AppOnlySequence, drag and drop a new Sequence activity, and inside it define the following activities:

  • Assign (available under the “Primitives” group of activities)
  • BuildDynamicValue (available under the “DynamicValue” group of activities)
  • HttpSend (available under the “Messaging” group of activities)

The Assign activity will be used to compose the target URL of the REST API to invoke. Define a String variable named restAPIUri and assign it the following value:

String.Format("{0}_api/web/lists/GetById(guid'{1}')/Items({2})/File/MoveTo(newUrl='{3}',flags='1')", currentWebUri, ListId, sourceItemID, TargetFolder)

Now define a variable named restHttpHeaders of type DynamicValue and assign it an item with path “Accept” and value “application/json;odata=verbose”, using the BuildDynamicValue activity. That header will instruct the REST API to respond using JSON (JavaScript Object Notation).

[Screenshot: the BuildDynamicValue activity configuration]

Now you are ready to configure the HttpSend activity like the following.

[Screenshot WF-Custom-Action-06: the HttpSend activity configuration]

The HTTP Method property will have a value of “POST”, the Uri will be the restAPIUri variable, and the RequestHeaders will be the restHttpHeaders variable. In some cases, depending on the REST API you are invoking, you may have to provide other HTTP headers like X-RequestDigest, IF-MATCH, and so on. Covering that topic is out of the scope of this article. However, consider that to retrieve such information you can still use one or more HttpSend activity instances. For example, to retrieve a value for the X-RequestDigest you can invoke the /_api/ContextInfo URI via POST and parse the JSON response.
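Just to give you an idea of what that call returns, here is a quick PowerShell probe (PowerShell 3.0, an on-premises site, Windows authentication, and an illustrative URL assumed); the FormDigestValue property of the JSON response is the value you would pass in the X-RequestDigest header:

# POST to /_api/contextinfo and read the form digest from the JSON response
$siteUrl = "https://intranet.contoso.com/"   # illustrative URL
$request = [System.Net.WebRequest]::Create($siteUrl + "_api/contextinfo")
$request.Method = "POST"
$request.Accept = "application/json;odata=verbose"
$request.ContentLength = 0
$request.UseDefaultCredentials = $true

$response = $request.GetResponse()
$reader = New-Object System.IO.StreamReader($response.GetResponseStream())
$contextInfo = $reader.ReadToEnd() | ConvertFrom-Json
$response.Close()

$contextInfo.d.GetContextWebInformation.FormDigestValue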

Publishing the Custom Action

At this point, you are almost ready. You simply need to publish the Custom Action. In order to accomplish this task you have to define the .actions4 file mentioned above. This file is mainly an XML-based definition of the action, of its input and output arguments, and of the bindings between the UI of SharePoint Designer 2013 and the arguments expected by the action. Unfortunately, there is not much documentation available online about the schema of .actions4 files. However, you can read the following article: WorkflowActions4 schema reference.

Nevertheless, the best thing to do - in order to understand the schema and to learn the supported values for elements and attributes of .actions4 files - is to inspect the out-of-the-box .actions4 files deployed by the standard setup of SharePoint 2013. These files are deployed in the SharePoint15_Root\TEMPLATE\{Language LCID}\Workflow folder. There you can open the existing files using Notepad or any other text editor … and inspect them.
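For instance, on an English farm you can list them with a quick bit of PowerShell (the path below is the default location of the 15 hive; adjust the LCID folder to your language):

# List the out-of-the-box .actions4 files (1033 = English; use your language LCID)
$hive = "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15"
Get-ChildItem -Path (Join-Path $hive "TEMPLATE\1033\Workflow") -Filter *.actions4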

Here you can see the .actions4 file for the current custom action.

<Action Name="MoveFileActivity"
    ClassName="DevLeap.SP2013.MoveFileAction.MoveFileActivity"
    Category="Files" AppliesTo="all">
    <RuleDesigner Sentence="Move item from %1 to %2">
        <FieldBind Field="ListId,ItemGuid" Text="this document" Id="1"
            DesignerType="ChooseDoclibItem" DisplayName="Item" />
        <FieldBind Field="TargetFolder" Text="target" Id="2"
            DesignerType="TextArea" DisplayName="Target" />
    </RuleDesigner>
    <Parameters>
        <Parameter Name="ListId" Type="System.Guid" Direction="In" DesignerType="Hide" />
        <Parameter Name="ItemGuid" Type="System.Guid" Direction="In" DesignerType="ListItem"
            Description="ID of the list item used by this action." />
        <Parameter Name="TargetFolder" Type="System.String, mscorlib" Direction="In"
            Description="Target Folder where the file will be moved to." />
    </Parameters>
</Action>

As you can see, the ClassName and Name attributes map to the corresponding elements in the Visual Studio 2013 project. The RuleDesigner element defines the fields that will be prompted to end users in SharePoint Designer 2013, in order to select the item to move (field with Id 1) and the target folder (field with Id 2). Furthermore, in the Parameters section you can see the real input arguments expected by the custom action, whose Name attributes map to the Field attributes of the FieldBind elements.

Build the project, publish the .WSP package, and upload it to the target SharePoint 2013 Site Collection in the Sandboxed Solutions gallery (“Site Settings” –> “Web Designer Galleries” –> “Solutions”), whether on-premises or on SharePoint Online. Activate the solution and the target feature as well. Start SharePoint Designer 2013 (or close and restart it) and create a new workflow definition. You will find the new action available in the “Files” group of actions. In the following figure, you can see the output in SharePoint Designer 2013.

[Screenshot WF-Custom-Action-07: the custom action available in SharePoint Designer 2013]

Sometimes the custom action does not show up in SharePoint Designer 2013. In that situation, try to clear the cache of the tool by simply deleting the folder named after the target site collection, available under %LOCALAPPDATA%\Microsoft\WebsiteCache.
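For reference, here is a minimal PowerShell sketch of that cleanup (close SharePoint Designer first; the wildcard clears the cached folders of all sites, so narrow it down if you prefer):

# Clear the SharePoint Designer web site cache for the current user
Remove-Item -Path "$env:LOCALAPPDATA\Microsoft\WebsiteCache\*" -Recurse -Force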

That’s all … Enjoy with your custom actions!

Here you can download the code related to this article.

November 04
Updating the X.509 Certificate of a Trusted Identity Provider in SharePoint 2010/2013

Many times I have been asked by customers how to update an X.509 certificate associated with a Trusted Identity Provider. It is a common request, and a common need … because certificates expire on a schedule.

Here you can see a sample PowerShell code excerpt to update the certificate of a trusted IP:

Add-PSSnapin Microsoft.SharePoint.PowerShell -erroraction SilentlyContinue

$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("C:\Certificates\IPSTS.cer")
New-SPTrustedRootAuthority -Name "IPSTS Certificate for 2013" -Certificate $cert


Set-SPTrustedIdentityTokenIssuer -Identity "IPSTS" -ImportTrustCertificate $cert

This assumes that the X.509 certificate is saved in a file with path C:\Certificates\IPSTS.cer and that the trusted IP is named “IPSTS” in SharePoint.
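If you want to double check the result, something like the following should show that the issuer now references the new certificate:

# Verify the trusted identity provider now references the updated certificate
Get-SPTrustedIdentityTokenIssuer -Identity "IPSTS" | Select-Object Name, SigningCertificate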

Meanwhile, in order to register the trusted IP for the first time, you should use the following PowerShell script:

Add-PSSnapin Microsoft.SharePoint.PowerShell -erroraction SilentlyContinue

$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("C:\Certificates\IPSTS.cer")
New-SPTrustedRootAuthority -Name "IPSTS Certificate for 2013" -Certificate $cert

$map0 = New-SPClaimTypeMapping -IncomingClaimType "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress" -IncomingClaimTypeDisplayName "Email" -SameAsIncoming
$map1 = New-SPClaimTypeMapping -IncomingClaimType "http://schemas.microsoft.com/ws/2008/06/identity/claims/role" -IncomingClaimTypeDisplayName "Role" -SameAsIncoming

$realm = "http://www.company.com/_trust/default.aspx"
$signinurl = "https://www.ipsts.demo/Identity/Issue.aspx"
$ip = New-SPTrustedIdentityTokenIssuer -Name "IPSTS" -Description "IPSTS" -Realm $realm -ImportTrustCertificate $cert -ClaimsMappings $map0,$map1 -SignInUrl $signinurl -IdentifierClaim $map0.InputClaimType

And to remove the trusted IP you should use:

Remove-SPTrustedIdentityTokenIssuer -Identity "IPSTS"

August 31
MCM/MCSM/MCA are not just certification exams!

After the shocking news that I found in my mailbox this morning, as did many of my friends and buddies, and after reading a ton of emails, tweets, and blog posts … I needed to think a little bit, spend some time with my family, and drink some good beers …
Now, I am ready to tell you about my feelings and about what is happening to the most brilliant certification path I have ever been involved in.

As of today, the MCM/MCSM/MCA certifications are dead. So what?!

The Microsoft Certified Solutions Master certification for me

It was about five years ago when I became aware of the Advanced Certifications program. Instantly I thought “I wanna get those certifications!” At that time my goal was to show my muscles and to demonstrate to myself that “I can do that” :-) ! I studied a lot, I saved a lot of money in order to reach the budget ($18,500!) to register for the training (I’m self-employed), I attended the training (what a great experience!), I took the exams (the knowledge exam twice, the lab once), and I asked a huge sacrifice of my family during more than six months of intensive training and studying. While working toward the certification I simply saw the certification by itself, almost like yet another badge! In my career, I have passed more than 40 Microsoft certification exams, and getting the MCM was like collecting one more, a big one, or the biggest one!

However, step by step, while approaching the training I understood that it is not just a certification exam. First of all: the suggested pre-reading list is a great source of information, and it took me a lot of time to read just a subset of all the suggested documents. If you want to become an expert on SharePoint, regardless of the MCSM certification, download it (as long as it is still available) and read it carefully. Then, I attended the screening call with Brett (thank you man!), and I understood that the process was really serious and rigorous, but challenging! However, when I attended the training in Redmond I realized that the real value of being an MCM was something more than just getting one more badge! Yes, because the MCM/MCSM/MCA certifications are not “just certifications!” But if you don’t live and feel the experience, you cannot be aware of that.

The first great value you get back from the MCM certification is the knowledge you gain and share with all your classmates and your trainers. The training material is really great! You cannot find that quantity and quality of content, and that real experience from the field, anywhere else. Moreover, you have the opportunity to meet the most brilliant people, you can share experiences with them, and you can understand how big, fascinating, and involving the SharePoint world and the community of SharePoint experts are. Nevertheless, if you are so lucky (and prepared!) as to pass the exams (knowledge and lab), you have the opportunity to be part of a REAL community of people excited about technology, and about SharePoint. The community, and being part of it, is the real great return on investment that you achieve when you become an MCM/MCSM/MCA.

[Photo: the community picture from SPC2012]

Thus, regardless of the current fate of the Advanced Certifications program, I would like to publicly thank all the people involved in the program for the great job they have done! You deserve my thankfulness and respect forever and ever.

What happened to the advanced certifications?

This morning I woke up to a very impersonal and unfair email, telling me that the Advanced Certifications program will shut down and that the MCM/MCSM/MCA certifications will die on October 1st, 2013! And so what?

I don’t want to start a flame war about why they killed the certifications, and I don’t want to speculate about what they should have done instead … that’s it. They killed the program. I don’t understand why, I don’t like it, I hate it … but that’s it. I don’t think that my complaints, or those of all my colleagues, will change anything, and I’m sad about that. I would have liked to be involved in a process that so completely involves me and my friends, but they didn’t do that. Thus, I think my feelings and my feedback do not really matter from the business perspective of a company that has to make money, more today than in the past.

Moreover, from a business viewpoint, I could even understand the reasons that led them to kill the program. However, they did not simply kill a program or a product. They also injured about 100 people, who were presented no more than one year ago, during the keynote of SPC2012, as the most brilliant community of SharePoint experts supporting customers in the field. We are all enthusiasts and ambassadors, and we are a community. You cannot kill a community so easily and rapidly, and we will not die!

So what am I asking for?

I would like to keep the real value of being an MCM/MCSM, which is being part of a real community of real people sharing common interests, ideas, experiences, and passion. Thus, even if Microsoft decided to kill our baby (!), they should keep the community alive (including the private distribution list), they should keep alive the web pages describing what the program was (!) and who the members of the community are, to justify the return on the investment we made, and they should keep for all of us the benefits that derive from being an MCM/MCSM. This is a matter of honesty and fairness, in my opinion. We invested our lives in this, and we deserve what was promised to us.

Lastly, I would suggest setting up a kind of “backup plan” in order to keep updating and providing the training contents (as a new business? Come on ... :-)! ), instead of throwing all the investments and all the great value of those contents out of the window. I can somehow provide my contribution, in case of need ...

Today I feel really sad and bitter … but “the show must go on” …

Feedback is welcome (paolo@pialorsi.com).

July 30
Accessing SharePoint 2010 via CSOM using claims-based Authentication and providing FedAuth cookie

A few days ago I was asked how to access a SharePoint 2010 web site using CSOM, when the target Web Application is configured to use claims-based authentication, there are multiple authentication providers configured, and you want to provide a FedAuth cookie automatically.

In fact, the ClientContext class provides the Credentials property and the FormsAuthenticationLoginInfo property, which together with the AuthenticationMode property (an enum with values Default, FormsAuthentication, and Anonymous) allow you to authenticate either with Windows credentials or with FBA.

However, when you configure claims-based authentication, you define multiple authentication providers, and you need to provide a FedAuth cookie to SharePoint via CSOM, the previously shown properties do not fit your needs.

If you track the requests made from your browser while authenticating against a target SharePoint site, let’s say using Windows Integrated authentication and the claims-based authentication, you will see a flow like the following one:

[Screenshot: Fiddler2 trace of the claims authentication flow]

As you can see, your browser is redirected from the requested URL to the /_layouts/Authenticate.aspx page, which redirects (HTTP 302) your browser to the /_login/default.aspx page. This last page is the default login page, which prompts the end user with the authentication provider selector dropdown.

[Screenshot: the login page with the authentication provider selection dropdown]

Let’s say you select the “Windows Authentication” option. As you can see in the Fiddler2 trace, your browser will be redirected again (HTTP 302) to the /_windows/default.aspx page. You will authenticate with your Windows credentials (possibly leveraging Integrated Authentication) and, as soon as you are authenticated, your browser will be redirected (one more time!) to the /_layouts/Authenticate.aspx page, which will finally send one last redirect to the originally requested page. Under the covers, between steps 15 and 16 of the traced flow, SharePoint will emit a cookie (FedAuth) that holds a reference to your authentication session. It will be something like this:

Set-Cookie: FedAuth=77u/PD94bWwg...; expires=Wed, 31-Jul-2013 07:34:25 GMT; path=/; HttpOnly

Every subsequent request will provide the FedAuth cookie to SharePoint, in order to make it aware of the current authentication context.

In order to manually orchestrate such a flow, you have to manually retrieve the FedAuth cookie and provide it to the target Web Application via CSOM. Luckily, the ClientContext class provides an event called ExecutingWebRequest, which allows you to intercept a web request going from the ClientContext to the target SharePoint site, just before the request is sent on the wire. Moreover, within the ExecutingWebRequest event you get a variable of type WebRequestEventArgs, which provides a hook to the web request executor, the collection of HTTP headers, the cookies, etc.

Through the WebRequestExecutor property of the current event args, you will be able to access all the main information about the outgoing request, including the cookies. By providing a CookieContainer object to the WebRequest object used by the ClientContext you will be able to keep track of the issued FedAuth cookie and you will be able to authenticate against your target SharePoint.

In the following code excerpt you can see how to manage this task.
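Note that the excerpt assumes a few members declared elsewhere in the hosting code: a shared CookieContainer named cookies, an object named cookiesSyncLock used only for locking, a string fedAuthCookieName holding the value "FedAuth", a string fedAuthCookieValue, and baseSiteUrl, the URL of the target site collection (with a trailing '/').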

    // Create the ClientContext instance
    ClientContext ctx = new ClientContext(baseSiteUrl);

    // Configure anonymous authentication, because we will use FedAuth cookie instead
    ctx.AuthenticationMode = ClientAuthenticationMode.Anonymous;

    // Register an anonymous delegate to the ExecutingWebRequest event handler
    ctx.ExecutingWebRequest += new EventHandler<WebRequestEventArgs>((s, e) => {

        // If we do not have a cookies variable, which will be a shared instance of a CookieContainer 
        if (null == cookies)
        {
            lock (cookiesSyncLock)
            {
                if (null == cookies)
                {
                    // Let’s create the CookieContainer instance
                    cookies = new CookieContainer(); 

                    // Make a “fake” request to the /_windows/default.aspx page
                    // emulating the flow previously illustrated
                    HttpWebRequest request = WebRequest.Create(
                        baseSiteUrl + "_windows/default.aspx?ReturnUrl=%2f_layouts%2fAuthenticate.aspx%3fSource%3d%252FDefault%252Easpx&Source=%2FDefault.aspx") as HttpWebRequest;

                    // Provide a set of Windows credentials (default or explicit)
                    request.Credentials = CredentialCache.DefaultNetworkCredentials;
                    request.Method = "GET"; 

                    // Assign the CookieContainer object
                     request.CookieContainer = cookies;
                    request.AllowAutoRedirect = false;

                    // Execute the HTTP request
                    HttpWebResponse response = request.GetResponse() as HttpWebResponse;
                    if (null != response)
                    {
                        // The following variable simply holds the FedAuth cookie value, but that value
                        // is not used directly
                        fedAuthCookieValue = response.Cookies[fedAuthCookieName].Value;
                    }
                }
            }
        }

        // Grab the CookieContainer, which now holds the FedAuth cookie, and configure
        // it into the WebRequest that the ClientContext is going to execute and …
        // you have done all you need!
        e.WebRequestExecutor.WebRequest.CookieContainer = cookies;
    });

    Site site = ctx.Site;
    Web web = ctx.Web;

    List targetList = web.Lists.GetByTitle("Shared Documents");
    ListItemCollection items = targetList.GetItems(CamlQuery.CreateAllItemsQuery());

    ctx.Load(items);
    ctx.ExecuteQuery();

    foreach (ListItem item in items)
    {
        Console.WriteLine(item["Title"]);
    }

That’s all! I hope this will help someone.

July 02
Final Demos of my sessions at TechEd Europe 2013

At the following links you can find the latest and updated demos of my sessions at TechEd Europe 2013:

Keep in mind that the samples of SES-B402 are also part of the code samples of my latest book, and you can download those samples for free from the publisher web site.

Meanwhile, here you can download or watch online the videos of the sessions:

Thanks everybody for attending my sessions!

June 27
Visual Studio Ultimate 2013 Preview VM model on Windows Azure VMs (for SP2013 devs, too!)

Yesterday at the Build 2013 conference Microsoft announced the availability of Visual Studio 2013 Preview, as well as the availability of Windows Server 2012 R2, Windows 8.1 Preview, and .NET 4.5.1. Since yesterday, some new VM models are also available in the Windows Azure VMs IaaS offering.

One of them is really interesting from a SharePoint 2013 developer perspective. In fact, the model named “Visual Studio Ultimate 2013 Preview” includes not only the preview of the new Visual Studio 2013, but also SQL Server 2012 Express and SharePoint 2013 Trial (note: the 180-day trial should expire on 12/21/2013).

Here you can see the screenshot of the VMs gallery with the new VM model:

[Screenshot: the Windows Azure VM gallery with the “Visual Studio Ultimate 2013 Preview” model]

Moreover, after creating a VM based on this model you can configure - simply leveraging a set of out-of-the-box PowerShell scripts - any of the following development environments:

  • SharePoint 2013 development machine for local development, which allows you to play with SharePoint 2013, SQL Server 2012 Express, and Visual Studio 2013 on a single VM.
  • SharePoint 2013 development machine joined to an existing AD domain, which requires a domain controller VM accessible from the development VM (i.e. either within the same cloud service or within a common virtual network).
  • SQL Express development machine, which allows you to play with Visual Studio 2013 Preview and SQL Server 2012 Express

You can find further details about configuring the development environment at the following URL: http://www.microsoft.com/en-us/download/details.aspx?id=39346.

There you will also find instructions about how to create such a VM remotely using PowerShell, already configured for the proper target scenario. The image name of the VM model for this scenario is:

03f55de797f546a1b29d1b8d66be687a__Visual-Studio-2013-Preview-Ultimate-12.0.20617.1

which is useful information, in case you would like to create the VM manually through PowerShell.
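Just to give you an idea, with the classic (service management) Azure PowerShell cmdlets the creation looks more or less like the following sketch; every name in it is illustrative, and the exact parameter set (the admin user name, for instance) varies a bit across versions of the module:

# Illustrative sketch: create the VM from the gallery image with the classic Azure PowerShell cmdlets
$imageName = "03f55de797f546a1b29d1b8d66be687a__Visual-Studio-2013-Preview-Ultimate-12.0.20617.1"

New-AzureVMConfig -Name "SP2013DEV" -InstanceSize "Large" -ImageName $imageName |
    Add-AzureProvisioningConfig -Windows -AdminUsername "spadmin" -Password "Pass@word1!" |
    New-AzureVM -ServiceName "sp2013devsvc" -Location "West Europe"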

To tell the truth, the scripts are really useful not only for their main purpose, but also for learning how you can leverage PowerShell to automate deployment processes!

Be careful: as clearly stated in the document I just referenced a few lines above, the Workflow Manager engine will not be installed and configured by the scripts available in the VM model, whether you install the local or the domain-joined SharePoint development machine. Therefore, if you plan to develop workflows on the development machine, you will have to configure Workflow Manager yourself; the setup files, however, are already available on the system disk of the VM model, including the latest fixes (KB2799752 and KB2799754). Moreover, the only service applications installed on the farm are those available by default:

  • Application Discovery and Load Balancer Service Application
  • Security Token Service Application

Thus, if you plan to play with the real world of SharePoint (including apps for SharePoint; Managed Metadata Service; User Profile Service, which requires the domain-joined configuration, and the Enterprise Social features; Search; Workflow; etc.) you will have to provision those service applications manually. Just in case, consider reading this paper, which contains some useful instructions and code samples about how to configure a SharePoint 2013 farm.

So, what are you waiting for?! :-) Let’s start creating your VM development environment and discover the new features of the Office Developer Tools for Visual Studio 2013!

June 26
Visual Studio 2013 Preview and new Office Developer Tools for VS2013

Today, at the Build 2013 conference, Microsoft announced – as expected – the availability of a preview of Visual Studio 2013, together with the new .NET Framework 4.5.1 and Windows 8.1.

Here you can find further information about .NET 4.5.1 and Visual Studio 2013, while here you can find a preview of the new features and capabilities of the Office Developer Tools for Visual Studio 2013. The native support for ASP.NET MVC apps in SharePoint is really compelling, as is the new Publishing Manager for apps for Office and SharePoint.

You can download Visual Studio 2013 Preview from here: http://www.microsoft.com/visualstudio/eng/2013-downloads

Enjoy!

About this blog

Welcome to the SharePoint Developer Reference Blog. I'm Paolo Pialorsi and I'm a senior consultant, author, and trainer, focused on SharePoint development and customization. I'm based in Italy, but I work wherever I am needed (mainly across EMEA). I'm a Microsoft Certified Master on SharePoint 2010.

You can also follow me on Twitter: @PaoloPia

[Image: Microsoft Certified Master logo]