Running VS 2015 Tools for Docker in Azure VM

Recently I tried to deploy an ASP.NET 5 web app to a remote Docker host by following the post in this article. The reason I want to deploy to a remote host is that my development machine is an Azure VM, and docker-machine from the Docker Toolbox won't be able to create a Docker host (a VirtualBox VM) locally.

There is some missing information in the article; for example, to be able to deploy to a remote Docker host (VM), the remote host needs to be registered with the local docker-machine. In this post I will explain how to register the remote Docker host with the docker-machine running locally.

These are the steps based on this article, with some additional steps:

  1.  Prerequisite: Create an Azure Docker host VM. Instead of following the article, I just used the Azure portal to create a basic A0 VM based on the Ubuntu 16.04 image and installed Docker by running wget -qO- | sh as explained in this article. In my case I have machine name: and login name: mylogin
    In addition, install Docker Toolbox for Windows. We need docker-machine and Git Bash from this installation, as VS Tools for Docker uses them.
  2. Nothing changes in this step: Create ASP.NET 5 web app
  3. Add Docker support: nothing changes
  4. Point to the remote Docker host. Note: {name-of-your-vm} is the Docker host VM name you created in step 1 above, for example:
  5. In this step we need to set up several endpoints (ports) on our remote Docker host (the Ubuntu VM): 80, 2376 (for docker-machine) and 22 (SSH). Then do the additional configuration below to register the Ubuntu Docker host with the local docker-machine.

Below is the additional configuration to let the local docker-machine talk to our remote host:

a). Allow port 2376 on the remote host (Ubuntu). SSH (using PuTTY) to the remote Ubuntu host and run this command:

sudo ufw allow 2376
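The same pattern covers all three endpoints mentioned in step 5 (80, 2376 and 22). The sketch below only prints the ufw rules as a dry run; remove the leading echo to actually apply them on the Ubuntu host:

```shell
# Dry run: print the ufw rules for the endpoints used in step 5.
# Drop the "echo" to apply them for real (requires sudo on the host).
for port in 80 2376 22; do
  echo sudo ufw allow "$port"
done
```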

b). Set up a passwordless sudo user. Still in the SSH session above, run these commands:

sudo adduser mylogin sudo
sudo visudo
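For reference, a passwordless-sudo rule added via visudo typically looks like the line below. This is a sketch: mylogin is the login name from the post, and granting NOPASSWD for all commands is an assumption about what the post intends.

```
mylogin ALL=(ALL) NOPASSWD:ALL
```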

c). Set up the SSH RSA key. Back on your development Windows VM, open Git Bash and run the command below, as in step 1 of this article. The command will generate SSH keys and store them in /user/username/.ssh/.
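The exact key-generation command isn't shown above; a typical invocation is sketched below. It uses a scratch directory instead of the default ~/.ssh path so it can be run safely anywhere:

```shell
# Generate an RSA key pair with an empty passphrase into a scratch directory.
# In the article's workflow the default ~/.ssh/id_rsa location is used instead.
keydir=$(mktemp -d)
ssh-keygen -t rsa -b 2048 -N "" -f "$keydir/id_rsa" -q
ls "$keydir"    # id_rsa and id_rsa.pub
```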


d). Then copy this public key to the remote host. Unfortunately, step 2 in the article above doesn't work, as Git Bash on Windows doesn't have the ssh-copy-id command, so we need the workaround from this article. Run the command below in Git Bash, replacing username, mylogin and mydockertest accordingly:

  cat /users/username/.ssh/ | ssh -p 22 "cat - >> ~/.ssh/authorized_keys"

e). Test that SSH is working by running from Git Bash.


f). Once your development VM can SSH to the remote host, we need to register the remote host with the local docker-machine by running the command below. The reference for this command is here. docker-machine.exe is located in the Program Files folder where Docker Toolbox is installed. Replace mylogin and mydockertest accordingly. You can remove the --debug flag from the command.

docker-machine.exe --debug create --driver generic --generic-ip-address --generic-ssh-user mylogin

g). Check whether the remote Docker host is registered with the local docker-machine by running this command:

docker-machine.exe ls


Now we can build the Visual Studio project as in step 6 of this article. Please make sure you change the DockerMachineName in the Docker.props file as in step 4 above. After the container is deployed you can SSH to the Ubuntu VM and run this command:

sudo docker ps


SQL Azure Limit for SharePoint Autohosted App

To find the SQL Azure edition and its maximum size for the SharePoint Online Autohosted app, we first need to get the connection string of the database used by the app, by displaying it in an aspx page or, more discreetly, returning it via Web API. Don't forget that the connection string is defined in the AppSettings; the snippet below shows how to get the value from code.

public string GetConnectionString()
{
   return WebConfigurationManager.AppSettings["SqlAzureConnectionString"];
}

Normally you will get something like below.

Data Source=(a_server);
Initial Catalog=db_(guid);
User ID=db_(guid_as_above)_dbo;
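The interesting parts are the server and database names. As a quick illustration (placeholder values matching the shape above, not real ones), they can be pulled out of such a string like this:

```shell
# Split the connection string on ";" and extract the Initial Catalog value.
cs='Data Source=a_server;Initial Catalog=db_guid;User ID=db_guid_dbo'
db=$(printf '%s\n' "$cs" | tr ';' '\n' | sed -n 's/^Initial Catalog=//p')
echo "$db"    # db_guid
```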

Put this connection string into Visual Studio's SQL Server Object Explorer or SQL Server Management Studio. The User ID won't have access to the master db, so you need to connect directly to the database.
After it connects, run the script below to get the SQL edition and size limit (thanks to the Azure How-to talk series' post):


And the result is Azure Web edition with a maximum size of 1 GB. Unfortunately we can't change this using the ALTER DATABASE command as shown here, since we don't have access to the master db. Below is the screenshot from VS 2013's SQL Server Explorer:


P.S. please let me know if there is a way to change the limit :)

Azure emulator won't start as CacheInstaller has stopped working

Several times I got the CacheInstaller error shown below while starting the Azure emulator from Visual Studio. In most cases I could just click 'Close the program' and the emulator would continue to start.
But on one occasion the CacheInstaller error dialog kept coming up each time I closed it, and since then the emulator wouldn't start. So I clicked 'Debug the program' to attach CacheInstaller to Visual Studio, and copied the error message from the CacheInstallationException dialog (shown below) to Notepad.
And below is the message I got from the exception:

An unhandled exception of type 'Microsoft.ApplicationServer.Caching.AzureServerCommon.CacheInstallationException' occurred in CacheInstaller.exe
Additional information: Install Manifests : Script completed with error: ExitCode: 2
ErrorStream: Configuration error.
Configuration error.

ERROR: wevtutil.exe im failed with error 15010
OutputStream: F:\Projects\My Cloud Project\My Cloud Project\csx\Debug\roles\MyCloudWorkerRole\plugins\Caching>C:\Windows\system32\unlodctr.exe /

Info: Successfully uninstalled the performance counters from the counter definition XML file
F:\Projects\My Cloud Project\My Cloud Project\csx\Debug\roles\MyCloudWorkerRole\plugins\Caching>C:\Windows\system32\wevtutil.exe um

Provider Microsoft-Windows-Fabric{{751c9dc0-4f51-44f6-920a-a620c7c2d13e}} is missing channels under the channelreferances registry key.
F:\Projects\My Cloud Project\My Cloud Project\csx\Debug\roles\MyCloudWorkerRole\plugins\Caching>C:\Windows\system32\wevtutil.exe um
F:\Projects\My Cloud Project\My Cloud Project\csx\Debug\roles\MyCloudWorkerRole\plugins\Caching>C:\Windows\system32\wevtutil.exe um
F:\Projects\My Cloud Project\My Cloud Project\csx\Debug\roles\MyCloudWorkerRole\plugins\Caching>C:\Windows\system32\wevtutil.exe um
F:\Projects\My Cloud Project\My Cloud Project\csx\Debug\roles\MyCloudWorkerRole\plugins\Caching>C:\Windows\system32\lodctr.exe /
Info: Successfully installed performance counters in F:\Projects\My Cloud Project\My Cloud Project\csx\Debug\roles\MyCloudWorkerRole\plugins\Caching\

F:\Projects\My Cloud Project\My Cloud Project\csx\Debug\roles\MyCloudWorkerRole\plugins\Caching>if 0 NEQ 0 (
 echo ERROR: lodctr.exe failed with error 0  1>&2
 exit /b 2

F:\Projects\My Cloud Project\My Cloud Project\csx\Debug\roles\MyCloudWorkerRole\plugins\Caching>C:\Windows\system32\wevtutil.exe in
Provider Microsoft-Windows-Fabric{{751c9dc0-4f51-44f6-920a-a620c7c2d13e}} is missing channels under the channelreferances registry key.
F:\Projects\My Cloud Project\My Cloud Project\csx\Debug\roles\MyCloudWorkerRole\plugins\Caching>if 15010 NEQ 0 (
 echo ERROR: wevtutil.exe im failed with error 15010  1>&2
 exit /b 2

From the highlighted message above, it is obvious that the wevtutil command is returning an error (15010) instead of the expected value 0, and the error has something to do with missing registry entries. Hmm, it looked like registry corruption, and unfortunately I didn't have a restore point of my VM that I could restore to the last working state. So I did more research on the wevtutil command, as a developer I am not really familiar with it. To cut the story short, the wevtutil command provisions/registers the custom event log defined in the manifest file into the Event Viewer, and it depends on the registry values in this location: [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\WINEVT\Publishers]. Then I searched for the registry key {751c9dc0-4f51-44f6-920a-a620c7c2d13e} under the Publishers node and found this key: [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\WINEVT\Publishers\{751c9dc0-4f51-44f6-920a-a620c7c2d13e}], which is the event source for Azure Caching.

In the error message above it complains about missing channels. Under this registry node on my development VM I only saw 2 channels (1 and 2). So I remote-desktopped to one of the worker roles in our UAT cloud service and checked this registry setting, and there were 3 channels (0, 1 and 2). I added channel 0 and its associated keys following the ones in UAT and restarted my Visual Studio. And finally I could debug my cloud project again.

Below are the registry settings that were missing from my development VM:


Azure Service Bus Relay with WCF Routing Service

Azure Service Bus (SB) Relay gives us the ability to create public endpoints for internal on-premise/corporate WCF services that reside behind a proxy/firewall. This is quite useful during the development phase of any external app that consumes on-premise LOB services. For example, we can use our development cloud (MSDN), which in most cases doesn't have a Site-to-Site VPN set up, to connect to these services. But if our app talks to more than one on-premise service, each of them requires an SB Relay endpoint, so we need to manage multiple endpoints that point to the SB namespace. Luckily, WCF 4.0 supports a routing service that enables message forwarding based on different rules such as endpoint addresses. The diagram below shows the architecture:

A WCF routing service host with an Azure Service Bus Relay endpoint maintains a routing table that maps request URLs to on-premise WCF service endpoints. I will outline the steps to set this up, though not in detail, as most of them are covered in these MSDN articles (article_1 and article_2). Basically, we need to combine the configuration from both articles.

  1. Set up a Windows Azure Service Bus namespace as shown in article_1. Let's say the service namespace is MyRouting
  2. Set up the Routing Service with a Service Bus endpoint: create a Visual Studio Console Application or Windows Service, add the Windows Azure Service Bus NuGet package (it adds the required DLLs and some WCF extensions used by Service Bus Relay to the app.config), and reference System.ServiceModel.dll as well as System.ServiceModel.Routing.dll.
  3. In your console app's Program.cs (in a Windows Service you add this routine to the OnStart & OnStop methods) add the code below, as shown in article_2
     public static void Main(string[] args)
     {
        var host = new ServiceHost(typeof(RoutingService));
        host.Open();
        Console.WriteLine("host is listening");
        Console.ReadLine();
        host.Close();
     }
  4. Configure the service host with the SB endpoint and routing capability. Basically it combines the service host endpoint configuration shown in article 1 with the routing XML shown in article 2, which ends up like below:
          <service behaviorConfiguration="RoutingConfiguration"
            <endpoint binding="basicHttpRelayBinding"

    The routing is defined in the service's behavior configuration called RoutingConfiguration, which contains the 'routing table'. As the endpoint uses basicHttpRelayBinding, the address uses the https protocol.

  5. The behavior configuration is shown below
            <behavior name="RoutingConfiguration">
              <routing filterTableName="LOBFilterTable" routeOnHeadersOnly="false" />
              <serviceDebug includeExceptionDetailInFaults="False"  />
            <behavior name="sbTokenProvider">
                  <sharedSecret issuerName="owner" issuerSecret="**key**" />
              <binding name="httpServiceBusBinding">
                <security mode="Transport" relayClientAuthenticationType="None" />

    The RoutingConfiguration defines the routing table named LOBFilterTable, and the Service Bus Relay endpoint uses the sbTokenProvider endpoint behavior to connect to the Azure Service Bus Relay. As my on-premise WCF services use basicHttpBinding, I use basicHttpRelayBinding for the Azure Service Bus endpoint's binding.

  6. The next step is defining the routing table as shown below. The filterTable matches the filters to client endpoints. The routing is based on EndpointAddress (this article shows the different filters we can use in the WCF Routing Service); in this case the public endpoint on the Service Bus is mapped to a WCF client endpoint (an on-premise WCF service endpoint).
            <filter name="LobSvc_A" filterType="EndpointAddress" filterData="" />
            <filter name="LobSvc_B" filterType="EndpointAddress" filterData=""/>
            <filter name="LobSvc_C" filterType="EndpointAddress" filterData=""/>
            <filterTable name="LOBFilterTable">
              <add filterName="LobSvc_A" endpointName="LobSvc_A_Client" />
              <add filterName="LobSvc_B" endpointName="LobSvc_B_Client"/>
              <add filterName="LobSvc_B" endpointName="LobSvc_C_Client"/>
  7. Finally we put in the client configuration as shown below. As you can see, the client endpoint names match those in the filterTable
      <endpoint address="your.onPremise.WCF.A.url" contract="*"
            binding="basicHttpBinding" name="LobSvc_A_Client" />
      <endpoint address="your.onPremise.WCF.B.url" contract="*"
            binding="basicHttpBinding" name="LobSvc_B_Client" />
      <endpoint address="your.onPremise.WCF.C.url" contract="*"
            binding="basicHttpBinding" name="LobSvc_C_Client" />

    Of course you can add your own binding configuration and endpoint behavior to the client, for example to connect using certain credentials. We have finished with the host, so the next step is setting up the client to connect to this host.

  8. The client can be a web role in Azure or any external application that needs to connect to the on-premise WCF services. The code below shows a generic method to create a Service Bus channel. The endPoint string parameter is the last part of the endpoint in the filter's filterData configuration. In our example it can be WcfEndpointA, WcfEndpointB or WcfEndpointC, so WcfEndpointA will be routed to the on-premise WCF A service.
     private static ChannelFactory<T> CreateChannelFactory<T>(string endPoint)
     {
       var binding = new BasicHttpRelayBinding();
       binding.Security.Mode = EndToEndBasicHttpSecurityMode.Transport;
       binding.Security.RelayClientAuthenticationType = RelayClientAuthenticationType.None;
       var serviceBusEndpoint = new EndpointAddress(ServiceBusEnvironment.CreateServiceUri("https", "myrouting", "rttest/" + endPoint));
       ChannelFactory<T> sbChannelFactory = new ChannelFactory<T>(binding, serviceBusEndpoint);
       sbChannelFactory.Endpoint.Behaviors.Add(
              new TransportClientEndpointBehavior {
                 TokenProvider = TokenProvider.CreateSharedSecretTokenProvider("owner", "your sb secret key")
              });
       return sbChannelFactory;
     }
  9. In conclusion, we can have one Service Bus endpoint that maps to many on-premise WCF services by utilizing the WCF Routing Service capability.
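Conceptually, the filterTable in steps 6 and 7 is just a lookup from the relay endpoint suffix to an on-premise client endpoint. A minimal sketch of that mapping (endpoint names as in the configuration above):

```shell
# Map a relay endpoint suffix to the client endpoint the router forwards to.
route() {
  case "$1" in
    WcfEndpointA) echo "LobSvc_A_Client" ;;
    WcfEndpointB) echo "LobSvc_B_Client" ;;
    WcfEndpointC) echo "LobSvc_C_Client" ;;
    *)            echo "no route" ;;
  esac
}
route WcfEndpointA    # prints LobSvc_A_Client
```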

Call Sharepoint Online CSOM from an external application

There might be times when an external application wants to talk to SharePoint Online (Office 365) without user interaction. Below are several scenarios where we might want to use SharePoint 2013 resources from external applications:

  • A console/Windows application/service performing administration of SkyDrive Pro or SharePoint Online
  • Using SharePoint Online resources from an Azure worker role, i.e. uploading documents to a document library, adding items to a list, interacting with workflows, etc.

These are steps to allow an external application to use site collection resources:

Firstly, we need to register a new SharePoint app. If the external application needs to access a site collection in SharePoint Online, for example, we register an app by going to an application page called appregnew.aspx. For the site collection above, it will be


  • Generate Client Id – and copy it to notepad
  • Generate Client Secret – and copy it to notepad
  • Title is your app title
  • App domain: the domain of your app, or anything such as a GUID. I normally put a URI that identifies the external app
  • Redirect URI: can be blank, or the current site collection URL where this app is registered.

Then click the Create button; it will register an Azure AD service principal with an Id equal to the Client Id. This service principal will allow the OAuth process between the external application and SharePoint Online. (You can run this command from the Office 365 PowerShell console to get more information about the service principal: Get-MsolServicePrincipal -AppPrincipalId <Client Id>.) This MSDN article provides some guidelines about registering a SharePoint app.

Next, we need to set the permissions for the app by going to /_layouts/15/appinv.aspx. The permissions will authorize the external application to access SharePoint resources.

  • App Id: paste the Client Id we created in the first step and click Lookup. It will populate the other information except the Permission Request XML.
  • In the Permission Request XML, put the XML below. The AllowAppOnlyPolicy flag means the registered app can be used by an external application regardless of the user.
  • The Scope represents the permission that the app can have. I noticed that with Write permission I was still getting access denied when uploading documents or adding items to a list, so I need the FullControl permission.
<AppPermissionRequests AllowAppOnlyPolicy="true">
    <AppPermissionRequest Scope="http://sharepoint/content/sitecollection" Right="FullControl" />
</AppPermissionRequests>

Of course you need to be a site collection admin to be able to set the permissions above; then click Create and click Trust It on the next screen. If you want to call SharePoint CSOM against personal sites in SkyDrive Pro, you need to register the app with full control at the tenant scope http://sharepoint/content/tenant, but only a tenant administrator can register an app with this scope. This article contains all the possible scopes that you can use in app permissions.

Before we jump to the code we need to get the Realm (in the case of SharePoint Online, it is the Tenant Id). Go to /_layouts/15/appprincipals.aspx and copy the GUID after the ampersand to Notepad. Click the image for more detail.

Now we're ready to write code. In your external application project add TokenHelper.cs (you can get the file from any SharePoint App project), then put the app settings below in the project's app.config or web.config; they will be used by the TokenHelper:

    <add key="ClientId" value="the-client-id"/>
    <add key="ClientSecret" value="the-client-secret"/>
    <add key="Realm" value="the-tenant-id"/>

These settings will be used by the TokenHelper to perform the OAuth process. Below is the code to get the SharePoint ClientContext that can be used to access the site collection in SPO using CSOM.

 public static ClientContext GetClientContextForApp(Uri siteUrl)
 {
    var SharePointPrincipalId = "00000003-0000-0ff1-ce00-000000000000";
    var token = TokenHelper.GetAppOnlyAccessToken(SharePointPrincipalId, siteUrl.Authority, null).AccessToken;
    return TokenHelper.GetClientContextWithAccessToken(siteUrl.ToString(), token);
 }

The GetAppOnlyAccessToken's third parameter is targetRealm; it is set to null here because it uses the value set in the app settings. If you want to retrieve the realm dynamically you can call the TokenHelper.GetRealmFromTargetUrl method, but this makes another HTTPS round trip to Azure AD.

With this approach we can use SharePoint Online resources from any external application; for example, an Azure worker role can upload files to a document library, insert items into a list, or kick off a workflow in SharePoint.

SharePoint 2013 Execute Javascript function after MDS load

SharePoint 2013 introduces a performance improvement to reduce page load time called the Minimal Download Strategy (MDS). It is a feature that is activated by default in team sites. MDS is performed by functions inside a JavaScript file called start.js. This blog explains the flow of an MDS call.

So it is a bit tricky to execute a JavaScript function that modifies the DOM after MDS finishes, as a normal jQuery document.ready function or _spBodyOnLoadFunctionNames entry is executed before the MDS script finishes.

Inside the start.js file there is an object called asyncDeltaManager that performs most of the async page loading. After several rounds of stepping through the code in a JavaScript debugger, I could see what is going on under the hood: at the end of the process a function called _scriptsLoadComplete is called, and inside it a function called _endRequest. The _endRequest function iterates an event handler list called endRequest, so there is an event handler we can use to hook our function into the asyncDeltaManager pipeline. In short, the code below shows how to perform this task.

$(function () {
      ExecuteOrDelayUntilScriptLoaded(function () {
          if (typeof asyncDeltaManager != "undefined")
              asyncDeltaManager.add_endRequest(setFullScreenMode);
          else setFullScreenMode();
      }, "start.js");
});

function setFullScreenMode() {
     // set the page in full screen mode and hide the site actions
     // menu and the full screen button
     return false;
}

In the code above, the function setFullScreenMode sets the page to full screen mode and hides the site actions menu and the full screen button. This code needs to be executed after MDS finishes, so I hooked the function into the MDS script pipeline via asyncDeltaManager.add_endRequest inside the jQuery load handler to make sure it executes after everything is loaded.

Apply DesignPackage using SharePoint 2013 Client Object Model

SharePoint 2013 introduced the Design Manager as part of the Publishing feature to provision look-and-feel assets such as master pages, page layouts, themes, composed looks, etc. We can also package the design files as a design package, which is actually a sandbox solution file (wsp). Whenever we import a design package into a SharePoint site, it performs these tasks in the background:

  • Rename the wsp (design package) file and upload it to the Solution Gallery
  • Activate the solution file and Activate the features inside the solution. The feature will provision the design files into the master page gallery
  • And finally apply the design such as applying the master page to the site.

There are also APIs available, both server and client, to perform this task. The client API gives us the ability to push the design package to a remote site such as SharePoint Online (Office 365). The code below shows how to use the client API to provision a design package to a remote site. The fileUrl is the location of the file in the document library relative to the site, such as "SiteAssets/Bootstrap.wsp". This post shows the routine to upload the file to the document library using the Client Object Model.

private static void ApplyDesign(ClientContext context, string fileUrl)
{
  if (context.IsSiteFeatureActivated(PublishingFeature))
  {
    DesignPackageInfo info = new DesignPackageInfo()
    {
       PackageGuid = Guid.Empty,
       MajorVersion = 1,
       MinorVersion = 1,
       PackageName = "Bootstrap"
    };
    Console.WriteLine("Installing design package ");

    string fileRelativePath = GetSiteRelativePath(context.Url) + "/" + fileUrl;
    DesignPackage.Install(context, context.Site, info, fileRelativePath);

    Console.WriteLine("Applying Design Package!");
    DesignPackage.Apply(context, context.Site, info);
  }
}

First we check whether the Publishing feature GUID (f6924d36-2fa8-4f0b-b16d-06b7250180fa) is activated at the site collection level. This method, as well as GetSiteRelativePath, is a custom method. DesignPackage.Install requires the wsp URL relative to the root; for example, in a site collection we need to pass /sites/TestA/SiteAssets/Bootstrap.wsp to the method.

 public static bool IsSiteFeatureActivated(this ClientContext context, Guid guid)
 {
    var features = context.Site.Features;
    context.Load(features);
    context.ExecuteQuery();

    foreach (var f in features)
    {
        if (f.DefinitionId.Equals(guid))
          return true;
    }
    return false;
 }

 public static string GetSiteRelativePath(string url)
 {
    string urlDoubleSlsRemoved = url.Replace("://", string.Empty);
    int index = urlDoubleSlsRemoved.IndexOf("/");
    return urlDoubleSlsRemoved.Substring(index);
 }
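GetSiteRelativePath just strips the scheme and host from the URL. The same manipulation, redone in shell with a placeholder URL to show the intermediate values:

```shell
url="https://contoso.sharepoint.com/sites/TestA"   # placeholder site URL
no_scheme="${url#*://}"     # contoso.sharepoint.com/sites/TestA
rel="/${no_scheme#*/}"      # /sites/TestA
echo "$rel"
```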

Another thing I found is that the DesignPackage.Install method installs the wsp file into the solution gallery. As there is no client API to perform sandbox solution installation, we can literally use this method (DesignPackage.Install) to install ANY sandbox solution package.

Upload a wsp file to Office 365 (SP 2013) using WebClient

In a recent project I needed to create a script to apply a design package to site collections in SharePoint Online. The first thing I needed to do was upload the design package (wsp file) to a document library. There is a Client API to upload the file to SharePoint, but it has a size limitation, so I decided to use the WebClient PUT method instead. There are some catches that I found when using this approach:

  • I couldn't just set WebClient.Credentials to the SharePoint Online credentials, as I would get a 403 error. Solution: I need to send the SharePoint security token back as a FedAuth cookie; as a result I need a custom WebClient that has a cookie container
  • I was getting a ServerException (Unable to read cabinet info from) when running the Client API DesignPackage.Install on the uploaded wsp file, and I found that the WebClient.UploadFile method somehow didn't maintain the file format properly. Solution: I need to use the OpenWrite method and upload the file as a stream

Below is the working code. First I created a custom web client that accepts cookies. Another benefit of this approach is that we have more control over the web request object; for example, we can increase the timeout as well.

    class SPWebClient : WebClient
    {
        public CookieContainer CookieContainer { get; set; }

        protected override WebRequest GetWebRequest(Uri address)
        {
            HttpWebRequest request = base.GetWebRequest(address) as HttpWebRequest;
            if (request != null)
                request.CookieContainer = CookieContainer;
            return request;
        }
    }

Below is the upload routine. The siteUrl is the site URL in Office 365, the fileUrl is the document library URL such as "SiteAssets/MyDesignPackage.wsp", and localPath is the location of the wsp file on the file system.

As I mentioned above, I need to have the claims-based authentication cookie ("FedAuth") and set its value to the security token found in the SPOIDCRL cookie. And lastly, upload the file as a stream (DON'T use the UploadFile method!).

   private static void UploadFile(string siteUrl, string fileUrl, string localPath, SharePointOnlineCredentials credentials)
   {
            var targetSite = new Uri(siteUrl);

            using (var spWebClient = new SPWebClient())
            {
                var authCookie = credentials.GetAuthenticationCookie(targetSite);
                spWebClient.CookieContainer = new CookieContainer();
                spWebClient.CookieContainer.Add(new Cookie("FedAuth",
                          authCookie.Replace("SPOIDCRL=", string.Empty),
                          string.Empty, targetSite.Authority));
                spWebClient.UseDefaultCredentials = false;
                try
                {
                    Stream stream = spWebClient.OpenWrite(targetSite + fileUrl, "PUT");
                    BinaryWriter writer = new BinaryWriter(stream);
                    writer.Write(File.ReadAllBytes(localPath));
                    writer.Close();
                }
                catch (WebException we)
                {
                    Console.WriteLine(we.Message);
                }
            }
   }
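For comparison, the raw HTTP request the code performs is a PUT with the FedAuth cookie attached. The curl sketch below only prints the equivalent command as a dry run; the site URL, file name and token value are placeholders:

```shell
site="https://yourtenant.sharepoint.com/sites/TestA/"   # placeholder
fileUrl="SiteAssets/MyDesignPackage.wsp"
# Dry run: drop the "echo" and substitute a real FedAuth value to upload.
echo curl -X PUT --data-binary @MyDesignPackage.wsp \
  -H "Cookie: FedAuth=<token-from-SPOIDCRL>" "$site$fileUrl"
```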

In the next post I will show you how to use the DesignPackage CSOM to install and apply the Design Package.

PSConfig / SharePoint Configuration Wizard runs longer when upgrading the servers

During a production farm (multi-server environment) SP1 upgrade, I noticed that psconfig ran significantly longer. I expected that the first run would be the only one that took a very long time, but in my case the subsequent runs on the other servers took 25 minutes on average to finish. As we know, the first run updates the SharePoint databases, so it makes sense that it takes longer in proportion to the number of databases, but subsequent runs should only take around 5 minutes each.

I noticed that my configuration wizard seemed to pause at certain times, so I checked the upgrade log in the 14 hive\LOGS folder, where I found that during those apparent pauses the log spat out a lot of the message below; and there were a lot of them, as this site is in the MySite DB, which has hundreds of site collections.

[PSCONFIG] [SPContentDatabaseSequence] [DEBUG] [3/3/2012 10:10:24 PM]: Site with Id = 961cfd1e-5cf8-4e40-8756-0032a517119b is contained in dictSitesNeedUpgrade, but is not added. Possible sitemap conflicts. Could not connect to http://localhost:32843/SecurityTokenServiceApplication/securitytoken.svc. TCP error code 10061: No connection could be made because the target machine actively refused it

Looking at that message, it seems that psconfig tried to communicate with the Security Token Service, which was obviously shut down during the process because the IIS service W3SVC (World Wide Web Publishing Service) is stopped.

So I looked at Central Admin > Upgrade and Migration > Review Database Status, and I found that the content databases for site collections that use claims-based authentication had a Status field saying 'Database is up to date, but some sites are not completely upgraded'. It became clear to me why psconfig ran much longer during the server upgrades: it tries to upgrade the claims-based site collections by calling the unavailable Security Token Service. It seems psconfig tries to call this service several times before it spits out the error message, and it was worsened in my case by the hundreds of site collections of this type.

So my solution is, before running psconfig to upgrade the servers (after successfully upgrading the databases by running psconfig the first time), to run PowerShell commands to upgrade the databases whose status is 'not completely upgraded', as mentioned in the previous paragraph. This works because the Token Service will be available. Run the PowerShell script below once for each db, where db_name is the 'not fully upgraded' database name. After running this command, my configuration wizard ran for only approximately 5 minutes on each server.

  $ct = Get-SPContentDatabase -Identity <db_name>
  Upgrade-SPContentDatabase -id $ct

SP OOB Approval Workflow: Change Request & Reassignment stop working

I have noticed on several occasions that whenever I customize the OOB approval workflow or modify its task form, Request Change and Reassignment (Reassign Task) stop working. Before we get into the workaround, I want to outline how the OOB Approval workflow interacts with its task form. The task page (WrkTaskIP.aspx) contains an XmlFormView control. The page's code-behind parses the task form's InfoPath XML data into a Hashtable that is passed to the Approval workflow (OfficeTask) through the SPWorkflow.AlterTask method. The workflow then accesses the hashtable during its processing via SPWorkflowTaskProperties.ExtendedProperties, and it relies on the correct combination of keys and values in the hashtable.

The most important key in the Hashtable is TaskStatus, whose value equals the submit data connection of the associated button, as shown in the picture. So, for example, when the Request Change button is clicked, the task status will be set to ChangeRequest.

if(Request Change clicked)
hash["TaskStatus"] = "ChangeRequest"

Another interesting fact is that a hashtable key will be changed to a field's GUID from the task list where the workflow task resides, if that list has a field with an internal name or static name equal to the key. So if the associated task list's Status field has a static name equal to TaskStatus, the hashtable key will be replaced with the GUID of the Status field.

So if we have a custom form with a custom submit connection and a custom workflow, our workflow needs to handle both extendedProperties["TaskStatus"] = "our submit connection's name" and extendedProperties[SPBuiltInFieldId.TaskStatus] = "our submit connection's name".

With this knowledge we can troubleshoot some of the OOB Approval workflow issues:

  • The Change Request and Reassign Task are not executed at all. Symptom: you click the Request Change or Reassign Task button with the Request Change From/Reassign Task To field empty (meaning the task is reassigned to the originator); nothing happens to the workflow and it just stays at the current approval step. The Request From and Reassign Task To fields are represented by the FieldName_RequestTo and FieldName_DelegateTo elements respectively, and they have InfoPath Person XML as their inner XML. In the unmodified task form the inner XML is literally an empty string when the fields are empty, but once we modify the task form the inner XML is not empty but an empty Person element. Somehow the OOB approval workflow fails whenever the value is an empty Person XML. To work around this we need to set the default values of these two fields to empty.
    1. Go to Data Ribbon tab of the Infopath Designer and choose Default Values in the Form Data section.
    2. Expand the dataFields elements and untick the elements below to set the default  values as empty.
    3. Save and republish the form
  • The Change Request and Reassign Task are executed, but instead of creating a new task they change the current status to ChangeRequest or ReassignTask. This has to do with the TaskStatus extended property: as I mentioned in a previous paragraph, the TaskStatus key may be replaced with a GUID if the task list's Status field has a static name equal to TaskStatus. To fix this, the Status field's static name needs to be changed; best practice is to change it to the same as the internal name. How do we know/change the static name of the field? The answer is PowerShell: open the SharePoint Management Shell and check the static name of the Status field, and if it equals TaskStatus, change it. Below is a short PowerShell snippet.
       #$list is the workflow task list
       $field = $list.Fields["Status"]
       #if static name = TaskStatus
       $field.StaticName = $field.InternalName
       $field.Update()
  • When a user is assigned a Change Request task, the Request field on the task form is empty even though there was a comment. The Request field is associated with the Description field (Body is its internal name) of the task list. The field is a Note field (multiple lines of text) and the InfoPath form expects it as plain text. Somehow, when we modify the OOB task form, the Description field is provisioned as a rich text field. We can change this setting to plain text from the list settings as below.