Cloud and Microsoft technologies enthusiast architect in Switzerland
# Thursday, November 15, 2012

Speakers : Oleg Kofman, Jon Epstein

The goal of SharePoint governance is to keep both IT and users happy and to set some processes in place. It should involve a broad range of people, from the business (really important to get adoption) to the network people. The legal and compliance teams also become even more important, as data and files go online and licensing concerns arise.

SLAs should be published, in order to limit the number of escalations and to help set expectations.

So far, there are 3 models: farm solutions (since 2007), sandboxed solutions (deprecated) and SP Apps. Sandboxed solutions, even if deprecated, are not gone yet; the recommendation is to convert them. Apps are the preferred option for multi-tenant Office 365. They are easy to deploy, maintain and reuse, but there is no server-side code. Even if an App does not have server-side code, it can be an umbrella on top of another solution that has server-side code.

Process to determine whether an App can be used: first, check if something already exists in the Enterprise Catalog and if it can be done without code, to avoid reinventing the wheel. Then, check the SP Store or third-party vendors to weigh build vs. buy. Then, check whether a timer job or any other server-side code needs to be developed. Will it save time, and who will maintain the solution (once the developers are gone)? The last step is to define who will publish the App in the store.

Different hosting models: SharePoint-hosted, where a subweb is created when a user deploys an App; so if 10000 users request to install the App, potentially as many subwebs are created! No server-side code is allowed. Provider-hosted, where you host the App on your own infrastructure, which can be completely separate, enabling other languages, such as PHP, for App development. Autohosted, where a Windows Azure Web Site and a SQL Azure DB are automatically provisioned when the App is installed.

So far, every developer needed his own SP farm. With the new App model, this is no longer really required, as the developer can stay in a single site.

Publishing an App requires choosing between two App scopes: Web scope (one site per instance) or Tenant scope (one site per tenant). This can't be defined by the developer (there is no manifest entry). It is important to publish the evaluation criteria for App permissions, so that developers know what is expected and what is allowed. High Trust Apps (for on-premises) require more scrutiny. New App versions may also request different permissions, so it is important, from a governance perspective, to check which permissions are requested and to challenge them. Plan SLAs for publishing requests: someone has to proactively look for the requests and approve them. Plan who will highlight, add and review (technical) Apps. It is possible to have one App catalog per Web Application, but it is not possible to share an App catalog across Web Applications. Therefore, define whether the Apps should be published in all the catalogs or not, and put such a process in place.

On the operations side, it is important to monitor usage, errors and licensing. It can be done from Central Admin, and Apps monitoring information is stored in the Usage database.

Thursday, November 15, 2012 11:32:00 PM (GMT Standard Time, UTC+00:00)
SP2013 | SPC12

Speaker : Sesha Mani

SharePoint 2013 Server-to-Server (S2S) out-of-the-box implemented scenarios: SP to Exchange, SP to SP and SP to MTW (Multi-tenant workflow service). OAuth is a standard that enables an application to access a user's resources without prompting for the user's credentials. S2S is a kind of extension of OAuth 2.0 that allows high trust between applications. An Application Principal is like a user principal (user identity), but for applications.

S2S implementation in O365

All S2S applications must exist in MSO-DS (a kind of application directory). ACS plays the role of trust broker. When someone connects to SP Online, SP Online calls ACS to authenticate itself (with a certificate exchange), asks to talk to Exchange Online and tries to get a token. ACS validates the SP Online token and checks the requested endpoint before issuing an STS token. SP Online augments the token with the user identity information before sending it (the STS token is composed of the inner token – the basic ACS token – and the outer token – containing the added user's claims). Exchange Online validates the STS token, ensuring it has been issued by ACS. It also validates that the inner-token endpoint is the same as the outer-token endpoint. Then, it ensures that the user has the necessary permissions. Basically, the Application Identity is the inner token, while the User Identity is the outer token. Finally, Exchange Online returns the resources to SP Online.
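As a rough illustration of the nesting described above (field names here are illustrative, NOT the real JWT claim identifiers), the inner/outer token relationship can be sketched like this:

```javascript
// Sketch of the S2S token nesting: the inner token carries the
// application identity, the outer token wraps it and adds user claims.
// Claim names are hypothetical; only the structure mirrors the notes.
function buildS2STokens(appId, userEmail, targetEndpoint) {
  // Inner token: the application identity, as issued/validated by ACS.
  const innerToken = {
    issuer: "ACS",
    appId: appId,              // the calling application (e.g. SP Online)
    audience: targetEndpoint   // the endpoint the token is scoped to
  };
  // Outer token: self-issued by the caller, carrying the user's claims.
  const outerToken = {
    issuer: appId,
    audience: targetEndpoint,  // must match the inner token's endpoint
    userClaims: { smtp: userEmail },
    actorToken: innerToken     // the application identity travels inside
  };
  return outerToken;
}

const t = buildS2STokens("spo-principal", "alice@contoso.com",
                         "https://exchange.contoso.com");
// The endpoint check Exchange performs: inner and outer must agree.
console.log(t.audience === t.actorToken.audience);
```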

This scenario is also valid for on-premises.

S2S authentication - On-Premise Only

SP hosts the App Management Service, the STS and the User Profile App (UPA) Service. Exchange hosts an STS as well. Establishing a trust between the two STSes can simply be done using PowerShell commands (New-SPTrustedSecurityTokenIssuer and New-PartnerApplication). A user connects to SP and wants to do some activity on both SP and Exchange. The SP STS issues an STS token containing the outer and inner tokens. That STS token is sent to Exchange, which checks whether the token accepts delegation. An endpoint check is also done between the inner- and outer-token information. Exchange checks the user's permissions before returning the resources to SP. The configuration steps are: STS trust establishment (using the PS cmdlets), permissions for the principal, and scenario-specific settings.

Hybrid Solution

In the Cloud, MSO-DS synchronizes with ACS, and SP Online trusts ACS, which plays the role of the trust broker. On-prem, the setup is the same as in the previous scenario. In addition, the SP STS synchronizes with the SP Online STS, and AD synchronizes with MSO-DS. A user makes a query in SP Online through on-prem SP. SP issues an STS token to connect to SP Online. The request is sent to ACS for validation. Then SP sends the augmented token to the SP Online STS (containing the e-mail address – SMTP – and the UPN). SP Online accepts the token and returns the resources back to on-prem SP.

S2S not supported topologies

Cross-premises or cross-product S2S calls (SP calling Exchange Online), cross-tenant scenarios (Contoso to Fabrikam), S2S calls from SP without AD to Exchange or Lync on-prem, and web apps using Windows classic authentication.

Thursday, November 15, 2012 9:58:00 PM (GMT Standard Time, UTC+00:00)
SP2013 | SPC12

Speaker : Eric Shupps

The SP2013 model still has sites, content and services APIs, but now also has Apps, with a package, HTML/JS (or another technology) and data. For an App to be authorized in SharePoint, OAuth is used.

The autohosted App model enables SharePoint to automatically deploy the package on Azure. Be careful of the limitations of this model. In Visual Studio, when creating an App for SharePoint, it actually creates two projects: the App project and the SharePoint project. The App project is the entity that will be deployed. You can only deploy using Visual Studio to a SharePoint site created from the Developer Site template. A TokenHelper.cs file is automatically created in the VS solution to deal with all the token-related operations and authorization. It is used to get the client context.

An App can consume SQL data using WCF/JSON/XML, SharePoint data using OAuth/REST/CSOM or Office data using HTML/XML.

He wrote a WCF “proxy” that interfaces with the SQL database and serializes/deserializes the data to JSON for consumption from JavaScript.

Javascript is the language to use when dealing with Office data.

To get the data, the App Web REST API has to be called. Be sure to deploy a SharePoint artifact, otherwise no App Web will be created. It is easy to add the SharePoint chrome look-and-feel using a bit of JavaScript.
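SharePoint passes the App Web URL to the app on the query string (the SPAppWebUrl parameter, filled in when {StandardTokens} is used in the app's start page URL). A minimal sketch of extracting it and building an App Web REST URL — the list title "Tasks" is just an example:

```javascript
// Extract a parameter from the query string SharePoint appends
// when launching an app (e.g. SPAppWebUrl, SPHostUrl, SPLanguage).
function getQueryParam(search, name) {
  const pairs = search.replace(/^\?/, "").split("&");
  for (const pair of pairs) {
    const [key, value] = pair.split("=");
    if (decodeURIComponent(key) === name) {
      return decodeURIComponent(value || "");
    }
  }
  return null;
}

// Build a REST URL against the App Web; returns null when no
// App Web exists (i.e. no SharePoint artifact was deployed).
function buildAppWebListUrl(search, listTitle) {
  const appWebUrl = getQueryParam(search, "SPAppWebUrl");
  if (!appWebUrl) return null;
  return appWebUrl + "/_api/web/lists/getbytitle('" +
         encodeURIComponent(listTitle) + "')/items";
}

const listUrl = buildAppWebListUrl(
  "?SPAppWebUrl=https%3A%2F%2Fapp-123.contoso.com%2Fmyapp&SPLanguage=en-US",
  "Tasks");
console.log(listUrl);
// https://app-123.contoso.com/myapp/_api/web/lists/getbytitle('Tasks')/items
```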

SQL Azure database tables must have a primary key before deployment. Azure Virtual Machines come in 5 different sizes: XS/S/M/L/XL. They have persistent storage and virtual networking. A Web Role is an Azure VM: it can be shared or reserved, third-party assemblies can be deployed, and TFS/Git/Web Deploy are also available. Azure Web Sites are free but only contain the default assemblies (i.e. WIF is not there and can't be deployed); TFS/Git/Web Deploy are also available.

SharePoint Apps use HTTPS only.

Office 365 Apps use HTTPS only and need a unique App ID. In order to deploy with “F5”, the site has to be based on the Developer Site template. Publishing to the Office Store requires App & package validation. The Office Store is public, whereas the App catalog is private and therefore does not require validation or licensing.

Thursday, November 15, 2012 6:02:00 AM (GMT Standard Time, UTC+00:00)
SP2013 | SPC12

Speakers : Mike Ammerlaan, Neil McCarthy

To integrate Yammer in an App, “Yammer Embed” is what to use. It provides support for profile information and also communicates with the Yammer platform. When a conversation is started in your application, the embed will also post it on the Yammer network. A conversation can be at the site level or at the item or document level, which then uses the Open Graph protocol.

Yammer is exposed as a set of protocols and APIs that allow to build any kind of application (example of an ASP.NET application). Documentation can be found at .

An App needs a key (ClientID) and can be proposed in the company's App Store. From the Yammer web application, your application has to be registered; then keys and tokens are delivered. Some other information has to be filled in, such as the website and URIs.

No server-side code is needed; using server-side code is more difficult than using JavaScript. It can be done using JavaScript (example shown). A single reference to yam.js is needed in the HTML, and that is where the ClientID is specified. OAuth is used to authorize the application.
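As a sketch of that single reference, this helper builds the script tag carrying the ClientID. The asset URL and the data-app-id attribute name are recalled from the session, so treat them as assumptions rather than verified documentation:

```javascript
// Build the yam.js script tag; the ClientID obtained when
// registering the app is passed via a data attribute.
// URL and attribute name are assumptions from the session notes.
function buildYammerScriptTag(clientId) {
  return '<script type="text/javascript" ' +
         'data-app-id="' + clientId + '" ' +
         'src="https://assets.yammer.com/platform/yam.js"><\/script>';
}

const tag = buildYammerScriptTag("my-client-id");
console.log(tag.indexOf('data-app-id="my-client-id"') !== -1);
```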

The REST APIs support Messages (be careful not to flood users' feeds), Groups, Users, Suggestions, Autocomplete, Search and Networks.

An activity story is composed of an actor, an object and an action (“Robert Red has updated this file”). Activity stories appear in the Activity feed (“Recent Activity”). A story contains a URL (e.g. to a document), a Type, an Image, a Title, the name and e-mail address of the actor, an action (e.g. Update; custom actions are also supported) and a Message.
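Putting those fields together, an activity story payload could be assembled like this. The exact JSON shape expected by Yammer's Activity API is an assumption here; the field names simply follow the list above, and the thumbnail URL is hypothetical:

```javascript
// Assemble an activity story from the parts listed above:
// actor (name + e-mail), action, object (url/type/title/image), message.
function buildActivityStory(actorName, actorEmail, action,
                            objectUrl, objectTitle, message) {
  return {
    activity: {
      actor: { name: actorName, email: actorEmail },
      action: action,            // e.g. "update"; custom actions supported
      object: {
        url: objectUrl,          // e.g. link to the document
        type: "document",
        title: objectTitle,
        image: "https://contoso.example/doc-thumb.png" // hypothetical
      },
      message: message
    }
  };
}

const story = buildActivityStory("Robert Red", "robert@contoso.com",
                                 "update",
                                 "https://contoso.example/docs/plan.docx",
                                 "Plan", "Updated the plan");
console.log(story.activity.action); // "update"
```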

From SharePoint, Remote Event Receivers can be used to publish activities happening in SharePoint to Yammer. But in order for this to work, the OAuth token must be cached so it can be reused when calling the Yammer REST APIs. For example, for each update on a document or list item, the Yammer APIs can be called.

User-enabled and admin-enabled Apps need the paid version of Yammer. Admin-enabled Apps are useful for posting information on behalf of users (impersonation).

Global Yammer App development steps: register the app, develop it, list the App in the network's directory, describe the App, submit it to Yammer, and list it in every network's directory. The last 3 steps are only required for Global Apps.

To export Yammer data, there is a Data Export API which generates a zipped file containing the data.

Thursday, November 15, 2012 4:11:00 AM (GMT Standard Time, UTC+00:00)
SP2013 | SPC12
# Wednesday, November 14, 2012

Speaker : Spencer Harbar

Almost all the features of SharePoint have to deal with identity management and the User Profiles. Identity management is only 10% about technology. One of the primary considerations when talking about identity management is who owns the data. The other is the quality of the data: is the data clean and up to date? Another important consideration is, for example, the Active Directory data quality. Sometimes, data is also stored in legacy or LOB systems. Access to identity management data has to be controlled as well, but for external systems, the question of authorization and authentication comes into play.

It is really important to work closely with the DS admins, as they are at the center of such a project. Communication is therefore key. Also, several permissions are needed for the synchronization.

An issue so far was a misunderstanding of the UPA architecture; its features and design constraints drive the deployment options. 4 key areas to be careful with: Security, Privacy, Policy, Operations. Several services are in the scope of UPA: SQL, Distributed Cache, Search, Managed Metadata, Business Data Connectivity.

The goals of the new Profile Sync in SP2013 are performance improvements and wider compatibility. As an example, a directory with more than 100'000 users or groups can be imported in 60 hours instead of 2 weeks previously.

Several synchronization “modes” : AD import, UP Sync and custom code synchronization.

You can filter on users and groups (object selection) using LDAP queries (inclusion-based; UPS has exclusion-based filters). It requires one connection per domain. It supports shadow accounts, and property mapping is possible, as well as account mappings between AD and FBA or others. Replication of AD changes is still needed, but improves the import. There is no cross-forest contact resolution, and mapping to SP system properties is not supported. Enriching profiles with data from the BDC is not possible, nor is mapping multi-valued properties. When the AD configuration changes (schema), a full import is required, as well as a purge after the import. The full import can't be scheduled. AD import connections are stored in the Profile DB, whereas UPS stores them in the Sync DB. Mappings and filters are not moved.

Provisioning the UPA and UPS is done in Manage Service Applications or with PowerShell, but with PS, there is still the default schema issue. Two workarounds: log on to the machine using the Farm account, or change the data manually in the database (not supported).

Some profile properties are automatically put in the taxonomy when provisioning the Managed Metadata Service. Indeed, the MMS is leveraged by the User Profile import. In order to start the User Profile Service Application, the Farm account has to be put in the local Admins group. Therefore, a warning complaining that the Farm account is in the admin group will be displayed in the SP Health Analyzer. The recommendation is to enable NetBIOS, right after the UPSA provisioning, if the FQDN and NetBIOS domain names don't match.

Planning is the key to success. Remember that if the data is rubbish, it will not be better once imported. The health of the AD is very important.

The web front-end servers are still making direct TDS calls to the SQL Server.

Wednesday, November 14, 2012 9:48:00 PM (GMT Standard Time, UTC+00:00)
SP2013 | SPC12

Speaker : Scot Hillier

Search should be seen as a data access technology, using client-side code (CSOM) and REST. Windows 8 (for example) can therefore leverage SharePoint search.

REST and CSOM are connected to the Query Pipeline. Both JavaScript and C# can be used with REST or CSOM, but JavaScript and REST are the recommended way for remote applications. The Search API is available under the /_api/search/query URI.

Managed Properties have attributes describing what can be done with them. The Keyword Query Language has been improved (the SQL query language no longer exists). KQL allows quite interesting queries and even filters (by date). XRANK makes it possible to boost some content in the results, such as pushing some documents to the top of the search results. WORDS enables synonyms when searching.
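The REST endpoint and KQL mentioned above can be combined by building a query URL. A minimal sketch — the XRANK expression below is an illustrative boosting example, not one shown in the session:

```javascript
// Build a call to the /_api/search/query endpoint with a KQL query.
// Single quotes inside the OData string literal are doubled, and the
// whole query text is URL-encoded.
function buildSearchQueryUrl(webUrl, kql) {
  return webUrl.replace(/\/$/, "") + "/_api/search/query?querytext='" +
         encodeURIComponent(kql.replace(/'/g, "''")) + "'";
}

// Example: boost items whose title contains "roadmap" above the rest.
const kql = "sharepoint XRANK(cb=100) title:roadmap";
console.log(buildSearchQueryUrl("https://contoso.example/sites/team/", kql));
```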

Result Sources are similar to scopes (a subset of the index). They are built using a query-like language, enabling filtering by content type or metadata values. To build a Result Source, a query builder is available from the Site Settings page. It is then used by the Search Results web part.

Query Rules use words to target some content only (e.g. “deck” would return only PowerPoint presentations). They apply to a given Result Source.

Result Types give a specific presentation to content based on its type. A Result Type is also linked to a Result Source and associated with a template for the display. The template is defined using a URL and specifies the managed properties needed in order to use it. A template is basically a bit of HTML to generate for each item of the search result. Templates are stored in the Master Page and Page Layout gallery. From a template originally written in HTML, SharePoint generates the JavaScript file, which is used for the rendering.
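Conceptually, a display template boils down to a function that turns one search result item (a bag of managed properties) into an HTML fragment. This plain-JavaScript sketch illustrates the idea; it is not the code SharePoint actually generates:

```javascript
// Escape text before interpolating it into the HTML fragment.
function escapeHtml(s) {
  return String(s)
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;");
}

// Render one search result item. "Title" and "Path" are standard
// managed properties; any others would be declared in the template's
// header so that the query retrieves them.
function renderItem(item) {
  return '<div class="result">' +
         '<a href="' + escapeHtml(item.Path) + '">' +
         escapeHtml(item.Title) + '</a>' +
         '</div>';
}

console.log(renderItem({
  Title: "Q4 <Plan>",
  Path: "https://contoso.example/plan.docx"
}));
```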

Search settings can be exported and imported as a SearchConfiguration.xml file, but it does not contain master pages or web parts. This can be useful to move configuration from one environment to another. The options are directly available in the Site Settings page.

The Content Search Web Part (CSWP) is not available in Office 365.

Wednesday, November 14, 2012 4:09:00 AM (GMT Standard Time, UTC+00:00)

Speaker : Daniel Kogan

The new (search-driven) model is about improving publishing compared to what was done so far. The search engine can crawl a lot more content than SharePoint alone can. Content can be published wherever it is located. It is also a way to separate the content from the presentation.

Search-driven publishing is not about searching for content; it is about assembling pages based on search. SharePoint and external content will be crawled to build the index. Some libraries can be declared as a catalog, meaning that their content can be used across site collections. On the publishing side, there are the term store, the Content Search web part, managed navigation and publishing pages. Indexed content is published through the web part framework and the page framework. The idea is also to propose new content to the user, based on previously requested content.

Content Search Web Part

The CSWP executes a search query. The content is skinned for the presentation. The query can be set to return one or more results, so it can be used to display a single article as well. The CSWP can also take parameters, like the term the user navigated to, to drive the search query. One little issue with the CSWP is the search latency: if a 2-minute crawl latency is a problem, then the CSWP is not a good candidate. When editing a page, two choices are proposed: either the page template, or only a given article or URL. Editing the page for one article creates an individual instance of that page. The CSWP proposes a query builder to set the query itself, as well as the refiners and sorting settings. Queries rely on the managed properties. Query Rules manipulate the way we want to return the results.

Display Templates

They are HTML and JavaScript. They take the search query results and display them with a given look-and-feel. Publishing content and display templates go hand in hand. One of the display templates presented was the Slideshow template.

Query Builder

It is a UI-based query tool to create queries on the index, aimed at information workers.

Query Rules

They are pretty technical and a little more complex in terms of management. They allow tweaking the query results based on, for example, the user. They are more for information architects. They are available from the Site Settings page. Query rules can be stopped. It is in the CSWP that you set whether the Query Rules should be used or not.

Content Catalog

3 steps: enable a library as a catalog (or create a new catalog site from the site template), indexing, and connection. The search index will “advertise” the catalog. The Manage Catalog Connections page displays the list of available catalogs. Connecting asks several questions before making the link to the catalog. On the library side, the catalog has to be enabled, along with the kind of filters that could be used and some other settings.

Managed Navigation

Hierarchies are now a bit different from what they were in SP2010. The intended use of managed terms has been extended to navigation. Selecting a term set in the term store offers more options, such as setting the purpose of the term set to be used for navigation. Hierarchies can differ from one site collection to another. It is possible to assign a specific page to a given term. Custom properties are now exposed in the UI. In the navigation settings, Managed Navigation can be selected, which needs a term set to use for the navigation.

Cross Site Publishing

The goal of XSP is to author the content somewhere and to use it from any other SharePoint publishing environment. XSP is not content deployment: content deployment moves content or artifacts, XSP does not. XSP really reuses the content, which stays at the same storage location. It requires the publishing feature to be enabled, and the catalog. It can be used when dealing with multilingual environments.

Wednesday, November 14, 2012 2:46:00 AM (GMT Standard Time, UTC+00:00)
# Tuesday, November 13, 2012

Speaker : Dux Raymond Sy

Warning: this post is absolutely not neutral, as @meetdux is one of my favorite speakers, and once again, I was not disappointed.

Before starting his presentation, he invited the attendees to vote via twitter and to see the result of the poll live on the screen.

When a user comes to you on Monday and tells you “SharePoint sucks”, just answer “SharePoint does not suck. You suck!”. E-mail is still the most used collaboration tool in companies. Don’t be misled by the word “Social”. It is not Facebook and telling what you ate at lunch; for companies, it is collaboration and working together, but it does not imply any preferred tool. It does not mean that a wiki or newsfeed has to be used. Wikis or blogs will not solve the collaboration issues that companies are facing. If the business says “we need a wiki”, just ask “why? For what?”.

Social fails because of a lack of Executive support and ownership. Social should be understood as a way to deliver business value and not as a tool.

Step 1: Gain Executive Engagement. It means commitment from the executives. Not a one-shot effort, but a long-term one. It has to have financial gains, but not only. It has to promote innovation and people's engagement to work together.

Step 2: Develop Relevant Use Cases. Stop pushing tools and features and talk about solutions. Social communication differs from group to group (HR, IT, Marketing, etc.). Get the pain points of the users and identify quick wins, using the tools people are used to or familiar with. Don’t try to make people leave Excel. To support his explanation, he demos an Excel table synchronized with SharePoint 2013, browsing with a Mac. Another problem with IT is that its speech does not target the users: stop talking about SharePoint, CRM, SAP and so on, and rather speak the users’ language.

Step 3: Establish a Social Roadmap, or Enterprise Social Journey. Help people and give them the power. Once users are empowered, they have to engage with each other. But it takes time and intention: “Nothing happens by accident or overnight”. Everything can’t be done at the same time; therefore, prioritizing is key. He shows an example of a list of pain points mapped to SharePoint out-of-the-box features. Then, for each feature, he records the priority coming from the people, the effort to implement the feature and the impact on the business, plus a rating for reusability. From these values, he can derive the business value of each feature. But don’t forget to also assess the IT impact, in terms of training, support and cost. And don’t forget to make the business pay for the feature: because it is not free, they will engage more as well.
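One hypothetical way to turn those ratings into a single prioritization score (the actual weighting used in the session was not given, so both the formula and the 1-5 scales below are illustrative):

```javascript
// Hypothetical business-value score: reward priority, impact and
// reusability; penalize implementation effort. All ratings 1-5.
function businessValue(feature) {
  const { priority, effort, impact, reusability } = feature;
  return (priority + impact + reusability) / effort;
}

// Example feature list with made-up ratings.
const features = [
  { name: "Team newsfeed", priority: 4, effort: 2, impact: 3, reusability: 5 },
  { name: "Custom wiki",   priority: 2, effort: 4, impact: 2, reusability: 2 }
];

// Sort highest business value first to get an implementation order.
features.sort((a, b) => businessValue(b) - businessValue(a));
console.log(features[0].name); // "Team newsfeed"
```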

Step 4: Identify and Groom Champions. IT or developers can't answer all the questions; hence the need for SharePoint Analysts.

Step 5: Deliver Sustainable Adoption. It does not mean only training; it is a constant process. Training does not make people experts: people have to practice and work on the solutions. In every implementation, people should be able to help themselves, by having a location where help can be provided or videos published. First, raise the awareness of your Champions. Then, get a bigger buy-in beyond the Champions, in other departments or groups of people. Don’t forget to put a timeline and a budget along with the adoption plan. In many companies, there is a budget line for SAP (training, licenses, etc.), whereas SharePoint is only a small line in the Microsoft budget. That is a big problem, as it is then impossible to leverage the platform.

Of course, the session finished with a Gangnam Style dance session:

Tuesday, November 13, 2012 11:11:00 PM (GMT Standard Time, UTC+00:00)

Speakers : Eray Chou, Keenan Newton

3 architecture options for App hosting: SharePoint-hosted Apps (created in a separate App Web; server-side code is not allowed, but a proxy can be defined in the manifest), and for cloud-based Apps, provider-hosted Apps (hosted on your own infrastructure) and autohosted Apps (SharePoint automatically and invisibly hosts the App on Azure).

Choosing between cloud-hosted Apps and SharePoint-hosted Apps: SharePoint-hosted Apps are more for smaller apps and resource storage. If server-side code is needed, go with a cloud-hosted App. Cloud-hosted Apps are also the preferred hosting model in most cases.

On the services side, client.svc got extended to support REST access, with GET, PUT and POST verbs. The protocol used is now OData. New APIs have been added to the CSOM in order to support SharePoint Server or Windows Phone applications. The API now covers the whole set of SharePoint services (e.g. Taxonomy, Workflow, eDiscovery, etc.). These last APIs are only available in the SharePoint Server SKU (in DocumentManagement.dll, Publishing.dll, Taxonomy.dll and UserProfiles.dll). REST is now really the recommended way to call the APIs.

_api is the new alias for _vti_bin/client.svc, such as http://contososerver/_api/web

Results can be in JSON or ATOM format and calls can be tested in a browser. URLs are now mapping objects to resources (_api/web/lists). Even feeds can be queried through REST calls.
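Choosing between the JSON and ATOM formats is done with the Accept header. A small sketch describing such a call (the header values are the 2013-era OData formats; the "web/lists" path reuses the example above):

```javascript
// Describe a GET against the _api endpoint; the Accept header
// selects JSON (OData verbose) or ATOM for the response format.
function describeRestCall(serverUrl, resourcePath, wantJson) {
  return {
    url: serverUrl.replace(/\/$/, "") + "/_api/" + resourcePath,
    method: "GET",
    headers: {
      Accept: wantJson ? "application/json;odata=verbose"
                       : "application/atom+xml"
    }
  };
}

const call = describeRestCall("http://contososerver/", "web/lists", true);
console.log(call.url); // http://contososerver/_api/web/lists
```

The same descriptor could be fed to any HTTP client (jQuery's $.ajax at the time); because URLs map objects to resources, swapping `web/lists` for another path is all that changes between calls.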

Remote Event Receivers are introduced and are meant to be used to call external systems. Their scope is List Item, List, Web or App, and they support both synchronous and asynchronous After events. The purpose is not synchronization, but rather notification (do not use them to sync or mirror content). They can be defined either using declarative syntax or via code.

This session, full of demos, was a solid developer session. If you did not attend it, download it when it becomes available.

Tuesday, November 13, 2012 4:28:00 AM (GMT Standard Time, UTC+00:00)

Speakers : Kenan Newton, Rolando Jimenez

This first part is about the basics and core concepts of the Apps and what makes them work.

With two products and platforms and many different services, how can we make them work together? Apps basically bridge the two worlds. To help discover these Apps, there are the catalog and the store.

The goal of the Apps is to unify the developer worlds (Office and web development). It is done by relying on standards.

To secure the App on the client side, it runs in the browser sandbox. On the server side, it is no longer hosted directly in SharePoint. The API is provided through CSOM, REST, Office JS or SharePoint JS.

Tools mainly used for the development are Visual Studio 2012 or Visual Studio NAPA. But, any tool could be used, such as Notepad or Eclipse.

An example is shown of how to integrate a Bing map in an Excel file and how data contained in the worksheet is used to pin locations on the map.

The first thing needed is the manifest, describing, among other things, the permissions required to run the App in Excel. The second element is the Bing HTML page, using jQuery and the Bing Maps component, to be displayed.

The same App can be used in several Office client applications (demo of an App taking a table in Word or in Excel to create a SharePoint list in Azure). When starting an Office 2013 App project, the different Office clients can be selected to make the App available there.

SharePoint-hosted Apps disallow server-side code; therefore, the code is in JavaScript, interacting with the SharePoint API. SharePoint Server is no longer needed on the developer's machine.

Provider-hosted means any web server.

Autohosted App allows both client-side and server-side logic. SharePoint deploys then the package to Azure.

App project templates generate a .app and a .wsp, but they do not contain any DLL, only declarative code. The .app package contains the .wsp as well as the manifest, which holds properties such as the start page URL or the AppPrincipal.

Apps can be App Parts, custom actions or (immersive) pages. Apps are hosted in a separate domain to avoid XSS and to isolate the App.

Once developed, the SharePoint App is published in the App Catalog (or the Office Store for Office Apps). Uploading an App package from Visual Studio is simple. Downloading a publishing profile for Visual Studio from the Azure portal is needed beforehand. Once the App is uploaded to Azure, the manifest needs to be sent to the catalog. When it is in the catalog, users will be able to see it and install it in their Office.

Tuesday, November 13, 2012 2:58:00 AM (GMT Standard Time, UTC+00:00)

As the SharePoint Conference 2012 starts with a keynote gathering more than 10'000 attendees from 85 different countries, I will post here summaries of the sessions I attend.

This year, a little change as these posts won’t be only here, but also on 2 different community blogs :

The SharePoint Bar, because with SharePoint it is always happy hour

And SharePoint Edu Tech, from Dave Coleman, where I will be posting with few others.

So, stay tuned here for more about SharePoint 2013.

Tuesday, November 13, 2012 1:06:00 AM (GMT Standard Time, UTC+00:00)
# Thursday, October 25, 2012

<Caution>This post is based on the SharePoint 2013 Consumer Preview. Thus, behavior described here may change in the next release of the platform</Caution>

When a site collection is created and if the “SharePoint Server Publishing Infrastructure” site collection feature is activated, SharePoint 2013 automatically creates a term group in the term store attached to the web application. This term group can then be used for the navigation. The structure below is therefore created :


But, when you delete the site collection, the term group and its structure underneath will remain, which can in one sense be understood. The drawback of this is that once the site collection is deleted, you can’t see it in the Term Store Management Tool from the Central Admin site. It also means that if you create a site collection with the same name as the previous one, this term group will be suffixed by a number, like “-1” or “-2”. This can be a bit dirty.

Two possibilities to avoid this situation :

  1. Delete the Term Sets and Term Group from the site collection before its deletion
  2. Use PowerShell to delete these ghost Term Groups if it is too late.

For the second solution, the following script can be used :

$Site = Get-SPSite "$url"
$ServiceName = "Managed Metadata Service"
$session = Get-SPTaxonomySession -Site $Site
$termStore = $session.TermStores[$ServiceName]
$termGroup = $termStore.Groups[$termGroupName]

"About to delete Term group in"
"URL : " + $url
"Group : " + $termGroup.Name

if ($termStore -ne $null)
{
    if ($termGroup -ne $null)
    {
        # A Term Group can only be deleted once it is empty,
        # so delete its Term Sets first (copy the collection
        # before enumerating, as items are removed from it)
        @($termGroup.TermSets) | ForEach-Object {
            "deleting " + $_.Name
            $_.Delete()
            $_.Name + " deleted"
        }
        $termGroup.Delete()
        $termStore.CommitAll()
    }
}

Here, $url is the target site collection URL, and $termGroupName is the name of the Term Group you want to delete.

If you want to check which Term Groups exist in your store, you can use this script :

$url = "http://yourserver/sites/yoursite"    # target site collection (placeholder)

$Site = Get-SPSite $url
$ServiceName = "Managed Metadata Service"
$session = Get-SPTaxonomySession -Site $Site
$termStore = $session.TermStores[$ServiceName]

"Groups in"
"URL : " + $url
if ($termStore -ne $null)
{
    $termStore.Groups | ForEach-Object { $_.Name }
}
$Site.Dispose()

Thursday, October 25, 2012 1:18:00 AM (GMT Daylight Time, UTC+01:00)  #    Comments [0] -
# Friday, October 19, 2012

<Caution>This post is based on the SharePoint 2013 Consumer Preview. Thus, behavior described here may change in the next release of the platform</Caution>

SharePoint 2013 comes with a very nice feature, the Managed Metadata Navigation, allowing us to define the navigation completely separately from the content or the physical pages. But it has to be used with some care.

I found what could be interpreted as a bug in the SharePoint 2013 Consumer Preview and its new Managed Metadata Navigation. When a term has the same name as a subsite, navigation to the page targeted by the term itself works fine, but for all its sub-terms you get a “Page not found” message. The pictures below show that navigating to the “About” term correctly goes to the default page of the “About” subsite (i.e. /about/Pages/default.aspx), but selecting an “About” sub-term does not display the /about/companyinformation/Pages/default.aspx page :



The structure of the content is the following :


And the metadata structure :


Even if, for the “About” term, I define a custom target page and explicitly specify /about/Pages/default.aspx , I still get the “Page not found” error.

The only way to solve this problem is to change the automatically assigned Friendly URL AND also to change the Target Page.

In fact, this is because there is a naming collision: /about, the Friendly URL associated with the “About” term, is also the URL of the “About” sub-site. This is why the “About” term itself works.

It also works for the sub-term “About” / “Locations”, because the associated Friendly URL is /about/locations, which is also the URL of a sub-site, leading to its default page. Unfortunately, the “Company Information” term is associated with the /about/company-information Friendly URL, which does not correspond to any sub-element of the /about site, and the physical structure seems to take precedence over the Friendly URL resolution and its Target Page. That explains why, even if you specify a valid Target Page for the “Company Information” term, “Page not found” is still displayed.

So, as written above, changing the Friendly URL automatically associated with the “About” term to “aboutus” would only partially solve the issue. Indeed, /aboutus would remove the collision with the sub-site name, but would lead to a nonexistent page. Also changing the Target Page then completely works around the problem.

As a conclusion, be careful with the names you give to your sites and the Friendly URLs associated with your terms.

Friday, October 19, 2012 8:28:00 PM (GMT Daylight Time, UTC+01:00)  #    Comments [0] -
# Tuesday, October 16, 2012

That is something we do almost every time when building workflows. Indeed, at some point in your process, you need to know who belongs to a SharePoint group and you wonder how to get this list of users. Of course, there is a SharePoint API for that, but when working with Nintex, the only way, if you don’t want to code your own custom action, is to use the SharePoint web services. And because it is something that is regularly needed, it made sense to have a “User Defined Action” (UDA) just for that. Not that it is difficult to do, but it is quite useful to have a single action that returns the list of users for a given group.

The SharePoint web service that can be used is located at /_vti_bin/usergroup.asmx. One of the methods that can be called, and which is especially interesting here, is GetUserCollectionFromGroup, which has the following WSDL signature :

POST /_vti_bin/usergroup.asmx HTTP/1.1
Host: myserver
Content-Type: text/xml; charset=utf-8
Content-Length: length
SOAPAction: "http://schemas.microsoft.com/sharepoint/soap/directory/GetUserCollectionFromGroup"

<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetUserCollectionFromGroup xmlns="http://schemas.microsoft.com/sharepoint/soap/directory/">
      <groupName>string</groupName>
    </GetUserCollectionFromGroup>
  </soap:Body>
</soap:Envelope>

HTTP/1.1 200 OK
Content-Type: text/xml; charset=utf-8
Content-Length: length

<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetUserCollectionFromGroupResponse xmlns="http://schemas.microsoft.com/sharepoint/soap/directory/">
      <GetUserCollectionFromGroupResult>
        <Users>
          <User />
        </Users>
      </GetUserCollectionFromGroupResult>
    </GetUserCollectionFromGroupResponse>
  </soap:Body>
</soap:Envelope>

As can be seen, the web method only needs a “groupName” string parameter and returns a user collection (in other words, an array of users, which is in fact a <Users><User/></Users> element).
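Outside of Nintex, the request itself can be sketched in plain Python (used here as a language-neutral illustration; the SOAP namespace is the standard SharePoint directory namespace, while the server name and group name are placeholders) :

```python
# Sketch: build the SOAP 1.1 request body for GetUserCollectionFromGroup.
# The namespace is the standard SharePoint "directory" namespace; the
# group name passed in is an assumption for illustration purposes.
SOAP_NS = "http://schemas.microsoft.com/sharepoint/soap/directory/"

def build_request(group_name: str) -> str:
    return (
        '<?xml version="1.0" encoding="utf-8"?>'
        '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
        '<soap:Body>'
        f'<GetUserCollectionFromGroup xmlns="{SOAP_NS}">'
        f'<groupName>{group_name}</groupName>'
        '</GetUserCollectionFromGroup>'
        '</soap:Body>'
        '</soap:Envelope>'
    )

# The envelope would then be POSTed to
# http://myserver/_vti_bin/usergroup.asmx with the SOAPAction header set
# to SOAP_NS + "GetUserCollectionFromGroup".
body = build_request("Approvers")
```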

So, to create a useful UDA, the parameters it should have are :

| Parameter                           | Type       | Direction |
|-------------------------------------|------------|-----------|
| The Web Service URL                 | Text       | Input     |
| The Group Name                      | Text       | Input     |
| The XML Field you are interested in | Text       | Input     |
| The Collection of Users             | Collection | Output    |



The special thing here is the “XML Field you are interested in” parameter. Because I didn’t want to limit the UDA to returning either the login or the user’s first or last name, I decided to let the user calling the UDA choose which field to get in the collection. The format of the parameter is directly tied to the XML output by the web service :

<GetUserCollectionFromGroup xmlns="http://schemas.microsoft.com/sharepoint/soap/directory/">
    <Users>
        <User ID="116" Name="John Doe" LoginName="DOMAIN\jdoe" Email="jdoe@contoso.com" Flags="0" />
    </Users>
</GetUserCollectionFromGroup>

If you are interested in the name, the parameter should contain “@Name”, as it is an attribute of the User element. On the other hand, if you are interested in the users’ e-mail addresses, then it should contain “@Email”. The image below shows the activities that are part of the UDA. Basically, 3 activities are needed : Call Web Service, Query XML, Regular Expression.
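What the “Query XML” activity does with this “@Name” / “@Email” parameter can be mimicked like this (a minimal Python sketch; the sample XML fragment and its attribute values are invented for illustration) :

```python
import xml.etree.ElementTree as ET

# Standard SharePoint "directory" namespace used by usergroup.asmx
NS = "{http://schemas.microsoft.com/sharepoint/soap/directory/}"

# Sample response fragment; IDs, names and e-mails are made up.
sample = (
    '<GetUserCollectionFromGroup '
    'xmlns="http://schemas.microsoft.com/sharepoint/soap/directory/">'
    '<Users>'
    '<User ID="116" Name="John Doe" Email="jdoe@contoso.com" Flags="0"/>'
    '<User ID="117" Name="Jane Roe" Email="jroe@contoso.com" Flags="0"/>'
    '</Users>'
    '</GetUserCollectionFromGroup>'
)

def extract_field(response_xml: str, field: str) -> list:
    """Return the chosen attribute ('@Name', '@Email', ...) of each User."""
    root = ET.fromstring(response_xml)
    attr = field.lstrip("@")          # '@Name' -> 'Name'
    return [user.get(attr) for user in root.iter(NS + "User")]
```

For example, `extract_field(sample, "@Email")` yields the e-mail addresses of all users in the group, exactly like selecting “@Email” in the UDA.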


To configure the Call Web Service activity, enter the URL of the site’s usergroup web service, for example http://myserver/_vti_bin/usergroup.asmx. Enter the user name and password of an account that has the permissions to call the web service in the appropriate text boxes. A good practice would be to define a protected constant and to select it using the lock icon. Clicking on the Refresh button will list all the methods available from this web service, so select the “GetUserCollectionFromGroup” one:


Once this is done, the parameters will be listed in the “Web service message” section. You can then replace the different web method arguments with the UDA’s parameters, such as “URL”, “groupName” and “Store result in”. You will end up with the following configuration :


The next activity, “Query XML”, is there to parse the XML response we get from the web method and to create the collection of users. In the case below, we use the semicolon as a separator, and we use the “User Field” UDA parameter to match the correct attribute using an XSL stylesheet, the bold part corresponding to the UDA’s parameter :

<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:p="http://schemas.microsoft.com/sharepoint/soap/directory/">
    <xsl:template match="p:GetUserCollectionFromGroup/p:Users">
        <xsl:for-each select="p:User">
        <xsl:value-of select="{WorkflowVariable:User Field}"/>;</xsl:for-each>
    </xsl:template>
</xsl:stylesheet>

Finally, the “Query XML” should look like :


The very last activity is “Regular Expression”. For a reason that I still have to understand, the collection generated by the “Query XML” activity and stored in “ParsedGroupUsers” does not behave like a “correct” collection. This is why I use a “Regular Expression” activity, to split the “ParsedGroupUsers” variable on the semicolon and to store the result in the “User Collection” output parameter :
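The effect of that final split can be sketched like this (Python as a neutral illustration of what the Regular Expression activity produces; the addresses are placeholders) :

```python
def to_collection(parsed: str) -> list:
    """Split the semicolon-joined output of the Query XML step into a
    real collection, dropping the empty entry left by the trailing ';'."""
    return [item for item in parsed.split(";") if item]

users = to_collection("jdoe@contoso.com;jroe@contoso.com;")
```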


That’s it !

Publish the User Defined Action and then use this UDA in your workflows.

Tuesday, October 16, 2012 2:06:00 AM (GMT Daylight Time, UTC+01:00)  #    Comments [0] -
Nintex | SharePoint
# Wednesday, August 01, 2012

<Caution>This post is based on the SharePoint 2013 Consumer Preview. Thus, behavior described here may change in the next release of the platform</Caution>


While working on a SharePoint 2013 environment, after several hours of uptime, the user interface showed me an error message when I wanted to save an item in a list : “The server was unable to save the form at this time. Please try again.” Looking at different possible causes, I found that the available memory was drastically low. Indeed, on this 8GB RAM front-end server virtual machine (the database is hosted on a second VM), only a few megabytes were still available.


Then, in the “Processes” tab, I saw that the noderunner.exe process was eating a lot of memory. A quick tour on Google and I found Marc Molenaar’s blog post about the noderunner.exe process.

I decided to give it a try and, as suggested in the post, I restarted the “SharePoint Search Host Controller” service. Same observation as Marc: the service took a long time to restart and a huge part of the memory was released. The good thing is that at the same time it solved my item-saving issue; the error disappeared.

To be sure this service restart was really “solving” the issue, I worked for several more hours, also playing with the search, and when the VM ran short of memory, the same error message was shown to me again.

Another side effect of this low-memory situation occurs when browsing the Managed Metadata tree. I constantly received an “Unexpected response from server. The status code of response is ‘500’. The status text of response is ‘System.ServiceModel.ServiceActivationException’.” message. Unfortunately, it was impossible to get out of this message loop, and the only way to get rid of it was to kill the Internet Explorer application.


Wednesday, August 01, 2012 10:01:23 PM (GMT Daylight Time, UTC+01:00)  #    Comments [0] -
# Wednesday, July 18, 2012

For a couple of weeks, the Microsoft world has been boiling, and the recent announcements raised the level of excitement among partners, developers and users. Among the most recent events, it started with the Yammer acquisition by Microsoft, followed by the Windows Phone 8 announcement. Then Windows 8 and the related devices, accompanied by the Surface tablet. And now, what many people had been expecting for a while : the Office 2013 wave, including SharePoint 2013. This is not only a wave like we had with the “Office 14 wave”, but a tidal wave, should we say, with the release of the “Consumer Preview” of the products, introducing the “Modern Office” concept.

During the July 16th presentation, SharePoint 2013 was only slightly mentioned, but the complete set of Office 2013 Consumer Preview products was released, and following the #officepreview tweets was amazing. At the same time, the NDA by which the closest communities (like the MVPs) were bound was lifted and a massive amount of information was released.

The install

So, during the event, I downloaded the SharePoint Server 2013 setup (2.1 GB) to start an install in a VM with 4 CPUs and 8GB of RAM. The first surprise came when I started the setup program, which directly offered an “Install Prerequisites” option, saving us from downloading the prerequisites individually (if I remember well, the first versions of SharePoint 2010 didn’t have such a shortcut). And fortunately so, because the list of prerequisites is quite big. Once the prerequisites were installed, the setup itself took around 20 minutes to install the beast. The configuration wizard is well known too, as it looks like (if it is not the same as) the one in 2010. Finally, the post-install wizard starts and displays the first bits of the new SharePoint 2013 user interface.


Quick Round

Once the install is done and the first site created, the new user interface using the Metro style is presented. To be honest, the default theme is not the most successful one; it is really difficult to see what is part of the header, what is part of the current navigation, and what is content. So, the first operation I did was to go to the former “Site Actions” menu, which is now on the right of the top bar, then to “Site Settings” and “Change the look”. Some themes are better than others at distinguishing between the content and the navigation. In addition, for each theme (or look ?), it is possible to select a different color scheme, and to change or remove the background image or the fonts used.

By default, the ribbon is hidden, and to make it appear you have to click on one of the menu headers. The ribbon will then appear, “sliding” from the top of the page header. The current navigation didn’t change much, but a nice feature is the ability to modify the links of both the global navigation and the current navigation using the “Edit Links” link, pretty convenient as it does not force you to go to the “Top Link Bar” or “Quick Launch” settings.

The user menu is quite simple now. Gone are “Sign in as a different user” and other items; in SharePoint 2013, there is only “About Me”, leading you to your personal page and the social part of SharePoint, and “Sign Out”. In the same area, the “Share” button allows you to invite others and assign them permissions on the current page, “Follow” makes the current page appear in your feeds, “Sync” synchronizes the content of your site locally, “Edit” is a shortcut to the Page => Edit action, and then there is the surprising “Focus on Content” button. This feature toggles between a view without any navigation, having only the content area on the screen, and the standard view of the SharePoint page. Why not…


But how do you create a document library ? If you are not familiar with it (meaning the first 15 minutes), you will desperately look for a “Create” button somewhere. Instead, going to “Site Contents” enables you to “add an app”, which proposes the different types of lists or libraries you can create. Thus, I created a first library and uploaded a file into it, which does not differ from the previous version of SharePoint. Where I am surprised again is regarding the usability of some features. For example, viewing the properties of a file: where before it was quick and needed only one click, in SharePoint 2013 it requires 2 clicks, each time on the “…” button. For such functionality, I would expect to have it directly in the context menu of the item. Let’s see if it stays like this in the final release, but maybe it is worth some improvement in some cases.

The performance


With SharePoint 2010, installing it on a VM with 4 CPUs and 8GB of RAM was quite OK for trying some things. Having SQL Server in the same VM was not that bad. Sometimes slow, but not that bad. Here, with SharePoint Server 2013, I decided to install it on the same kind of machine, and after a bit of time it became really slow. I connected to the server to check the performance and, even if the CPUs were only a few percent utilized, it was radically different with the memory. It is simple: less than 1GB was “free”, and the main memory eaters were SQL Server and the Distributed Cache Service (AppFabric). This demonstrates another level of requirements. Indeed, checking on the web, I found this article from Bjorn Furuknap and, later, the hardware requirements from Microsoft : 24GB of RAM (yes, 24 !!) is recommended. This new release of SharePoint has a price… It is also true that during the install and the configuration I selected all the services, which for sure plays a role. But Visual Studio is not yet installed, and I am wondering what kind of setup a developer will need to have a decent development environment.

This concludes my very first post about SharePoint 2013; other articles will follow on a regular basis, describing either the (new) features of the platform or what is new in terms of architecture and development. So, like SharePoint 2013, I am “working on it”, and thanks for staying tuned.

Wednesday, July 18, 2012 10:37:07 PM (GMT Daylight Time, UTC+01:00)  #    Comments [0] -
About the author/Disclaimer

The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.

© Copyright 2020
Yves Peneveyre