Cloud and Microsoft technologies enthusiast architect in Switzerland
# Wednesday, November 14, 2012

Speaker : Spencer Harbar

Almost all SharePoint features have to deal with identity management and the User Profiles. Identity Management is only 10% about technology. One of the primary considerations when talking about Identity Management is who “owns” the data. The other is the quality of the data: is it clean and up to date? Another important consideration is, for example, the Active Directory data quality. Sometimes, data is also stored in legacy or LOB systems. Access to Identity Management data has to be controlled, and for external systems the questions of authorization and authentication come into play.

It is really important to work closely with the DS admins, as they are at the center of such a project. Communication is therefore key. Also, several permissions are needed for the synchronization.

An issue so far was a misunderstanding of the UPA architecture; its features and design constraints drive the deployment options. Four key areas need careful attention: Security, Privacy, Policy, Operations. Several services are in the scope of the UPA: SQL, Distributed Cache, Search, Managed Metadata, Business Data Connectivity.

The goals of the new Profile Sync in SP2013 are performance improvements and wider compatibility. As an example, a directory with more than 100’000 users or groups can now be imported in 60 hours instead of 2 weeks previously.

There are several synchronization “modes”: AD import, UP Sync and custom code synchronization.

Filtering on users and groups (object selection) is done using LDAP queries (inclusion-based, whereas UPS has exclusion-based filters). One connection per domain is required. Shadow accounts are supported, and property mapping as well as account mapping between AD and FBA or others is possible. Replication of AD changes is still needed, but the import is improved. There is no cross-forest contact resolution, and mapping to SP system properties is not supported. Enriching profiles with data from BDC is not possible, nor is mapping multi-valued properties. When the AD configuration (schema) changes, a full import is required, as well as a purge after the import. The full import can’t be scheduled. AD import connections are stored in the Profile DB, whereas UPS stores them in the Sync DB. Mappings and filters are not moved.
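As an illustration of inclusion-based object selection, an AD import filter could look like the following (a made-up example; the `department` clause is purely illustrative):

```
(&(objectCategory=person)(objectClass=user)(department=Sales))
```

Only objects matching the whole expression would be imported, instead of excluding unwanted objects as UPS filters do.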

Provisioning the UPA and UPS is done in Manage Service Applications or with PowerShell, but with PowerShell there is still the default schema issue. Two workarounds: log on to the machine using the Farm account, or change the data manually in the database (not supported).

Some profile properties automatically land in the taxonomy when the Managed Metadata Service is provisioned. Indeed, the MMS is leveraged by the User Profile import. In order to start the User Profile Service Application, the Farm account has to be put in the local Administrators group. A warning complaining that the Farm account is in the admin group will therefore be displayed in the SP Health Analyzer. The recommendation is to enable NetBIOS right after the UPSA provisioning if the FQDN and NetBIOS domain names don’t match.

Planning is the key to success. Remember that if the data is rubbish, it will not get better once imported. The health of the AD is very important.

The web front-end servers are still making direct TDS calls to the SQL Server.

Wednesday, November 14, 2012 9:48:00 PM (GMT Standard Time, UTC+00:00)
SP2013 | SPC12

Speaker : Scot Hillier

Search should be seen as a data access technology, using client-side code (CSOM) and REST. Windows 8 (for example) can therefore leverage the SharePoint search.

REST and CSOM are connected to the Query Pipeline. Both JavaScript and C# can be used with REST or CSOM, but JavaScript and REST are the recommended way for remote applications. The Search API is available under the /_api/search/query URI.

Managed Properties have additional attributes describing what can be done with them. The Keyword Query Language has been improved (the SQL query syntax no longer exists). KQL allows quite interesting queries and filters (by date, for example). XRANK makes it possible to boost some content in the results, such as pushing certain documents to the top. WORDS enables synonyms when searching.
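To make this concrete, here is a small sketch (hypothetical server name and query) of how a KQL query using XRANK could be URL-encoded into the search REST endpoint mentioned above:

```python
from urllib.parse import quote

# Hypothetical KQL query: match "sharepoint", boosting items that also match "roadmap".
kql = "sharepoint XRANK(cb=100) roadmap"

# The querytext argument of /_api/search/query is wrapped in single quotes.
url = "http://contososerver/_api/search/query?querytext='{0}'".format(quote(kql))

print(url)
```

Pasting such a URL in a browser against a real farm returns the search results directly, which is a convenient way to experiment with KQL.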

Result Sources are similar to scopes (a subset of the index). They are built using a query-like language, enabling filtering by content type or metadata values. To build a Result Source, a query builder is available from the Site Settings page. The Result Source is then used by the Search Results web part.

Query Rules use words to target specific content (e.g. “deck” would return only PowerPoint presentations). They apply to a given Result Source.

Result Types give a specific presentation to content based on its type. They are also linked to a Result Source and associated with a display template. The template is referenced by a URL and specifies the managed properties it needs. A template is basically a bit of HTML generated for each item of the search results. Templates are stored in the Master Page and Page Layout gallery. From a template originally written in HTML, SharePoint generates the JavaScript file which is actually used at rendering time.

Search settings can be exported or imported via a SearchConfiguration.xml file, which however does not contain master pages or web parts. This can be useful to move configuration from one environment to another. The options are directly available in the Site Settings page.

The CSWP is not available in Office 365.

Wednesday, November 14, 2012 4:09:00 AM (GMT Standard Time, UTC+00:00)

Speaker : Daniel Kogan

The new (search-driven) model is about improving publishing compared to what was done so far. The search engine can crawl a lot more content than SharePoint alone. Content can be published wherever it is located. It is also a way to separate the content from the presentation.

Search-driven is not about searching for content; it is about assembling pages based on search. SharePoint and external content will be crawled to build the index. Some libraries can be declared as catalogs, meaning that their content can be used across site collections. On the publishing side, there are the term store, the Content Search web part, managed navigation and publishing pages. Indexed content is published through the web part framework and the page framework. The idea is also to propose new content to the user, based on previously requested content.

Content Search Web Part

The CSWP executes a search query, and the content is skinned for presentation. The query can be set up to return one or more results; it can therefore be used to display a single article as well. The CSWP can also take parameters, like the term the user navigated to, to drive the search query. One little issue with the CSWP is the search latency: if a 2-minute crawl latency is a problem, then the CSWP is not a good candidate. When editing a page, two choices are proposed: either the page template, or only a given article or URL. Editing the page for one article will create an individual instance of that page. The CSWP proposes a query builder to set the query itself, as well as the refiners and sorting settings. Queries rely on the managed properties. Query Rules manipulate the way we want the results to be returned.

Display Templates

They are HTML and JavaScript. They take the search query results and display them using a given look-and-feel. Publishing content and display templates go hand-in-hand. One of the display templates presented is the Slideshow template.

Query Builder

It is a UI-based query tool to create queries against the index. It is meant for information workers.

Query Rules

They are pretty technical and a little more complex in terms of management. They allow tweaking the query results based on, for example, the user. They are more for information architects, and are available from the Site Settings page. Query Rules can be stopped. It is in the CSWP that you set whether the Query Rules should be used or not.

Content Catalog

3 steps: enable a library as a catalog (or create a new catalog site from the site template), indexing, then connection. The search index will “advertise” the catalog. The Manage Catalog Connections page displays the list of available catalogs. Connecting asks several questions before making the link to the catalog. On the library side, the catalog has to be enabled, along with the kind of filter that can be used and some other settings.

Managed Navigation

Hierarchies are now a bit different from what they were in SP2010. The intended use of managed terms has been extended to navigation. Selecting a term set in the term store offers more options, such as setting the purpose of the term set to be used for navigation. Hierarchies can differ from one site collection to another. It is possible to assign a specific page to a given term. Custom properties are now exposed in the UI. In the navigation settings, Managed Navigation can be selected, requiring a term set to use for the navigation.

Cross Site Publishing

The goal of XSP is to author the content somewhere and to use it from any other SharePoint publishing environment. XSP is not content deployment: content deployment moves content or artifacts, whereas XSP really reuses the content, which stays at its original storage location. It requires the publishing feature to be enabled and a catalog. It can also be used when dealing with multilingual environments.

Wednesday, November 14, 2012 2:46:00 AM (GMT Standard Time, UTC+00:00)
# Tuesday, November 13, 2012

Speaker : Dux Raymond Sy

Warning: this post is absolutely not neutral, as @meetdux is one of my favorite speakers, and once again, I was not disappointed.

Before starting his presentation, he invited the attendees to vote via twitter and to see the result of the poll live on the screen.

When a user comes to you on Monday and tells you “SharePoint sucks”, just answer “SharePoint does not suck. You suck!”. E-mail is still the most used collaboration tool in companies. Don’t be misled by the word “Social”. It is not Facebook and telling what you ate at lunch; for companies, it is collaboration and working together, and it does not imply any preferred tool. It does not mean that a wiki or a newsfeed has to be used. Wikis or blogs will not solve the collaboration issues that companies are facing. If the business says “we need a wiki”, just ask “why? For what?”.

Social fails because of a lack of Executive support and ownership. Social should be understood as a way to deliver business value and not as a tool.

Step 1: Gain Executive Engagement. It means commitment from the executives. Not a one-shot, but over the long run. It has to bring financial gains, but not only: it has to promote innovation and engage people to work together.

Step 2: Develop Relevant Use Cases. Stop pushing tools and features, and talk about solutions. Social communication differs from one group of people to another (HR, IT, Marketing, etc.). Get the pain points of the users and identify quick wins, using the tools people are familiar with. Don’t try to get people to leave Excel. To support his explanation, he did a demo of an Excel table synchronized with SharePoint 2013, browsing with a Mac. Another problem with IT is that its speech does not target the users; it should stop talking about SharePoint, CRM, SAP and so on, and rather speak the users’ language.

Step 3: Establish a Social Roadmap. Or an Enterprise Social Journey. Help people and give them the power. Once users are empowered, they have to engage with each other. But it takes time and intention: “Nothing happens by accident or overnight”. Everything can’t be done at the same time; therefore, prioritizing is key. He showed an example of a list of pain points put in front of SharePoint out-of-the-box features. Then, for each feature, he set the priority coming from the people, along with the effort to implement the feature and the impact on the business. A rating for reusability is also set. From these values, he can extract the business value of each feature. But don’t forget to also assess the IT impact, in terms of training, support and cost. And don’t forget to make the business pay for the feature: because it is not free, they will engage more as well.

Step 4: Identify and Groom Champions. IT or developers can’t answer all the questions. Here comes the need for SharePoint Analysts.

Step 5: Deliver Sustainable Adoption. It does not mean only training; it is a constant process. Training does not make people experts: people have to practice and work with the solutions. In every implementation, people should be able to help themselves, by having a location where help is provided or videos are published. First raise the awareness of your Champions. Then get a bigger buy-in beyond the Champions, in other departments or groups of people. Don’t forget to put a timeline and a budget along with the adoption plan. In every company there is a budget line for SAP (training, licenses, etc.) whereas SharePoint is only a small line in the Microsoft budget. A big problem, as it is then impossible to leverage the platform.

Of course, the session finished with a Gangnam Style dance session:

Tuesday, November 13, 2012 11:11:00 PM (GMT Standard Time, UTC+00:00)

Speakers : Eray Chou, Keenan Newton

3 architecture options for App hosting: SharePoint-Hosted Apps (created in a separate App Web; server-side code is not allowed, but a proxy can be defined in the manifest) and, for Cloud-Based Apps, Provider-Hosted Apps (hosted on-premises or on any web server) and Autohosted Apps (SharePoint automatically and invisibly hosts the App on Azure).

Choosing between Cloud-Hosted Apps and SharePoint-Hosted Apps: SharePoint-Hosted Apps are more for smaller apps and resource storage. If server-side code is needed, go with a Cloud-Hosted App, which is also the preferred hosting model in most cases.

On the services side, client.svc was extended to support REST access with the GET, PUT and POST verbs. The protocol used is now OData. New APIs have been added to the CSOM in order to support SharePoint Server or Windows Phone applications. The API now covers the whole set of SharePoint services (i.e. Taxonomy, Workflow, eDiscovery, etc.). These last APIs are only available in the SharePoint Server SKU (in DocumentManagement.dll, Publishing.dll, Taxonomy.dll and UserProfiles.dll). REST is now really the recommended way to call the APIs.

_api is the new alias for _vti_bin/client.svc, such as http://contososerver/_api/web

Results can be returned in JSON or ATOM format, and calls can be tested in a browser. URLs now map objects to resources (_api/web/lists). Even feeds can be queried through REST calls.
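A minimal sketch (hypothetical server name, authentication omitted) of how such a REST call could be prepared, asking for JSON instead of the default ATOM feed via the Accept header:

```python
import urllib.request

# _api is the alias for _vti_bin/client.svc, so both URLs address the same endpoint.
endpoint = "http://contososerver/_api/web/lists"

# Ask for JSON; without an Accept header, SharePoint answers with an ATOM feed.
request = urllib.request.Request(
    endpoint,
    headers={"Accept": "application/json;odata=verbose"},
)

print(request.get_method(), request.full_url)
```

Sending the request would additionally require credentials (NTLM or OAuth), which is deliberately left out of this sketch.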

Remote Event Receivers are introduced and are meant to be used to call external systems. Their scope is List Item, List, Web or App, and both synchronous and asynchronous After events are supported. The purpose is notification rather than synchronization (do not use them to sync or mirror content). They can be defined either declaratively or via code.

This session, full of demos, was a solid developer one. If you did not attend it, download it when it becomes available.

Tuesday, November 13, 2012 4:28:00 AM (GMT Standard Time, UTC+00:00)

Speakers : Keenan Newton, Rolando Jimenez

This first part is about the basics and core concepts of the Apps and what makes them work.

With two products and platforms and many different services, how can we make them work together? Apps basically bridge the two worlds. To help discover these Apps, there are the catalog and the store.

The goal of the Apps is to unify the developer worlds (Office and web development). It is done by relying on standards.

To secure the App on the client-side, it is run in the browser sandbox. For the server-side, it is no longer hosted directly in SharePoint. The API is provided through CSOM, REST, Office JS or SharePoint JS.

Tools mainly used for the development are Visual Studio 2012 or Visual Studio NAPA. But, any tool could be used, such as Notepad or Eclipse.

An example was shown of how to integrate a Bing map in an Excel file, and how the data contained in the worksheet is used to pin locations on the map.

The first thing needed is the manifest, describing, among other things, the permissions required to run the app in Excel. The second element is the HTML page, using jQuery and the Bing map component, to be displayed.

The same App can be used in several Office client applications (demo of an App taking a table in Word or in Excel to create a SharePoint list in Azure). When starting an Office 2013 App project, the different Office clients in which the App will be available can be selected.

SharePoint-Hosted Apps disallow server-side code; the code is therefore in JavaScript, interacting with the SharePoint API. SharePoint Server is no longer needed on the developer’s machine.

Provider-hosted means any web server.

Autohosted Apps allow both client-side and server-side logic. SharePoint then deploys the package to Azure.

App project templates generate a .app and a .wsp, neither containing any DLL, only declarative code. The .app package also contains the .wsp as well as the manifest, which holds properties such as the start page URL or the AppPrincipal.

Apps can be App Parts, Custom Actions or (immersive) pages. Apps are hosted in a separate domain to avoid XSS and to isolate the App.

Once developed, a SharePoint App is published in the App Catalog (or the Office Store for Office Apps). Uploading an App package from Visual Studio is simple; a publishing profile has to be downloaded from the Azure portal beforehand. Once the App is uploaded to Azure, the manifest needs to be sent to the catalog. Once in the catalog, users are able to see the App and install it in their Office.

Tuesday, November 13, 2012 2:58:00 AM (GMT Standard Time, UTC+00:00)

As the SharePoint Conference 2012 starts with a keynote gathering more than 10’000 attendees from 85 different countries, I will post here summaries of the sessions I attend.

This year, a little change, as these posts won’t only be here, but also on 2 different community blogs:

The SharePoint Bar, because with SharePoint it is always happy hour ;-)

And SharePoint Edu Tech, from Dave Coleman, where I will be posting with a few others.

So, stay tuned here to get more stuff around SharePoint 2013.

Tuesday, November 13, 2012 1:06:00 AM (GMT Standard Time, UTC+00:00)
# Thursday, October 25, 2012

<Caution>This post is based on the SharePoint 2013 Consumer Preview. Thus, behavior described here may change in the next release of the platform</Caution>

When a site collection is created and the “SharePoint Server Publishing Infrastructure” site collection feature is activated, SharePoint 2013 automatically creates a term group in the term store attached to the web application. This term group can then be used for the navigation. The structure below is therefore created:


But when you delete the site collection, the term group and its underlying structure remain, which can in a sense be understood. The drawback is that once the site collection is deleted, you can’t see this term group anymore in the Term Store Management Tool from the Central Admin site. It also means that if you create a site collection with the same name as the previous one, the new term group will be suffixed with a number, like “-1” or “-2”. This can be a bit dirty.

There are two possibilities to avoid this situation:

  1. Delete the Term Sets and Term Group from the site collection before its deletion
  2. Use PowerShell to delete these ghost Term Groups if it is too late.

For the second solution, the following script can be used :

$Site = Get-SPSite "$url"
$ServiceName = "Managed Metadata Service"
$session = Get-SPTaxonomySession -Site $Site
$termStore = $session.TermStores[$ServiceName]
$termGroup = $termStore.Groups[$termGroupName]

"About to delete Term Group in"
"URL : " + $url
"Group : " + $termGroup.Name

if ($termStore -ne $null)
{
    if ($termGroup -ne $null)
    {
        $termGroup.TermSets | ForEach-Object {
            "deleting " + $_.Name
            $_.Delete()
            $_.Name + " deleted"
        }
        $termGroup.Delete()
        $termStore.CommitAll()
    }
}

$url represents the target site collection URL, and $termGroupName is the name of the Term Group you want to delete.

If you want to check the existing Term Groups in your store, you can use this script:

$Site = Get-SPSite "$url"
$ServiceName = "Managed Metadata Service"
$session = Get-SPTaxonomySession -Site $Site
$termStore = $session.TermStores[$ServiceName]

"Groups in"
"URL : " + $url
if ($termStore -ne $null)
{
    $termStore.Groups | ForEach-Object {
        $_.Name
    }
}

Thursday, October 25, 2012 1:18:00 AM (GMT Daylight Time, UTC+01:00)
# Friday, October 19, 2012

<Caution>This post is based on the SharePoint 2013 Consumer Preview. Thus, behavior described here may change in the next release of the platform</Caution>

SharePoint 2013 comes with a very nice feature, the Managed Metadata Navigation, allowing us to define the navigation completely separately from the content or the physical pages. But it has to be used with some care.

I found what could be interpreted as a bug in the SharePoint 2013 Consumer Preview and its new Managed Metadata Navigation. When a term has the same name as a subsite, navigating to the page targeted by the term itself is fine, but for all the sub-terms you get a “Page not found” message. The pictures below show that navigating to the “About” term correctly goes to the default page of the “About” subsite (i.e. /about/Pages/default.aspx), but when selecting an “About” sub-term, the /about/companyinformation/Pages/default.aspx page is not displayed:



The structure of the content is the following :


And the metadata structure :


Even if, for the “About” term, I define a custom target page and explicitly specify /about/Pages/default.aspx, I still get the “Page not found” error.

The only way to solve this problem is to change the automatically assigned Friendly URL AND also to change the Target Page.

In fact, this is due to a naming collision: /about, the Friendly URL associated with the “About” term, is also the URL of the “About” subsite. This is why the “About” term itself works.

It also works for the “About” / “Locations” sub-term, because the associated Friendly URL is /about/locations, which is also the name of a subsite, leading to its default page. Unfortunately, the “Company Information” term is associated with the /about/company-information Friendly URL, which does not correspond to any sub-element of the /about site, and the site seems to take precedence over the Friendly URL resolution and its Target Page. That explains why, even if you specify a valid Target Page for the “Company Information” term, “Page not found” is still displayed.

So, as written above, changing the Friendly URL automatically associated with the “About” term to “aboutus” would only partially solve the issue: /aboutus would remove the collision with the subsite name, but would lead to a nonexistent page. Also changing the Target Page then completely works around the problem.

As a conclusion, be careful with the names you give to your sites and the Friendly URLs associated with the terms.

Friday, October 19, 2012 8:28:00 PM (GMT Daylight Time, UTC+01:00)
# Tuesday, October 16, 2012

That is something we do almost every time when building workflows. Indeed, at some point in your process, you need to know who belongs to a SharePoint group, and you wonder how to get this list of users. Of course, there is a SharePoint API for that, but when working with Nintex, the only way, if you don’t want to code your own custom action, is to use the SharePoint web services. And because it is something that is regularly needed, it made sense to have a “User Defined Action” (UDA) just for that. Not that it is difficult to do, but it is quite useful to have a single action that returns the list of users for a given group.

The SharePoint web service that can be used is located at /_vti_bin/usergroup.asmx. One of the methods that can be called, and which is especially interesting here, is GetUserCollectionFromGroup, which has the following WSDL signature:

POST /_vti_bin/usergroup.asmx HTTP/1.1
Host: myserver
Content-Type: text/xml; charset=utf-8
Content-Length: length
SOAPAction: "http://schemas.microsoft.com/sharepoint/soap/directory/GetUserCollectionFromGroup"

<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetUserCollectionFromGroup xmlns="http://schemas.microsoft.com/sharepoint/soap/directory/">
      <groupName>string</groupName>
    </GetUserCollectionFromGroup>
  </soap:Body>
</soap:Envelope>

HTTP/1.1 200 OK
Content-Type: text/xml; charset=utf-8
Content-Length: length

<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetUserCollectionFromGroupResponse xmlns="http://schemas.microsoft.com/sharepoint/soap/directory/">
      <GetUserCollectionFromGroupResult>
        <GetUserCollectionFromGroup>
          <Users>
            <User />
          </Users>
        </GetUserCollectionFromGroup>
      </GetUserCollectionFromGroupResult>
    </GetUserCollectionFromGroupResponse>
  </soap:Body>
</soap:Envelope>
As can be seen, the web method only needs a “groupName” string parameter and returns a user collection (in other words, an array of users, which is in fact a <Users><User></User></Users> element).

So, to create a useful UDA, the parameters it should have are:

| Parameter | Type | Direction |
|---|---|---|
| The Web Service URL | Text | Input |
| The Group Name | Text | Input |
| The XML Field you are interested in | Text | Input |
| The Collection of Users | Collection | Output |



The special thing here is the “XML Field you are interested in” parameter. Because I didn’t want to limit the UDA to returning only the login or the user’s first or last name, I decided to leave it to the user calling the UDA to choose which field to put in the collection. The format of the parameter is directly tied to the XML output by the web service:

<GetUserCollectionFromGroup xmlns="http://schemas.microsoft.com/sharepoint/soap/directory/">
    <Users>
        <User ID="116" Name="John Doe" LoginName="DOMAIN\jdoe"
              Email="jdoe@contoso.com" Notes="" IsSiteAdmin="False"
              IsDomainGroup="False" Flags="0" />
    </Users>
</GetUserCollectionFromGroup>

If you are interested in the name, the parameter should contain “@Name”, as it is an attribute of the User element. On the other hand, if you are interested in the users’ e-mail, it should contain “@Email”. The image below shows the activities that are part of the UDA. Basically, 3 activities are needed: Call Web Service, Query XML, Regular Expression.
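Outside Nintex, the attribute lookup itself is straightforward. A minimal sketch, using a simplified sample payload (attribute values made up for illustration), of how the “@Name”/“@Email” parameter selects one attribute per User element:

```python
import xml.etree.ElementTree as ET

# Namespace used by usergroup.asmx responses.
NS = "{http://schemas.microsoft.com/sharepoint/soap/directory/}"

# Simplified sample response body; real payloads carry more attributes.
payload = """
<GetUserCollectionFromGroup xmlns="http://schemas.microsoft.com/sharepoint/soap/directory/">
  <Users>
    <User ID="116" Name="John Doe" Email="jdoe@contoso.com" Flags="0" />
    <User ID="117" Name="Jane Roe" Email="jroe@contoso.com" Flags="0" />
  </Users>
</GetUserCollectionFromGroup>
"""

def collect_field(xml_text, field):
    """Return, for each User element, the attribute named by the UDA parameter (e.g. '@Email')."""
    root = ET.fromstring(xml_text)
    return [user.get(field.lstrip("@")) for user in root.iter(NS + "User")]

print(collect_field(payload, "@Email"))
```

This is exactly the decision the “User Field” parameter delegates to the UDA’s caller: which attribute of each User element ends up in the collection.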


To configure the Call Web Service activity, enter the URL of the site’s usergroup web service, for example http://myserver/_vti_bin/usergroup.asmx. Enter the user name and password of an account that has permission to call the web service in the appropriate text boxes. A good practice would be to define a protected constant and to select it using the lock icon. Clicking on the Refresh button lists all the methods available from this web service; select “GetUserCollectionFromGroup”:


Once this is done, the parameters are listed in the “Web service message” section. You can then replace the different web method arguments by the UDA’s parameters, such as “URL”, “groupName” and “Store result in”. You will end up with the following configuration:


The next activity, “Query XML”, is there to parse the XML response we get from the web method and to create the collection of users. In the case below, we use the semicolon as a separator, and we use the “User Field” UDA parameter to match the correct attribute using an XSLT, the bold part corresponding to the UDA’s parameter:

<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:p="http://schemas.microsoft.com/sharepoint/soap/directory/">
    <xsl:template match="p:GetUserCollectionFromGroup/p:Users">
        <xsl:for-each select="p:User">
            <xsl:value-of select="{WorkflowVariable:User Field}"/>;</xsl:for-each>
    </xsl:template>
</xsl:stylesheet>

Finally, the “Query XML” configuration should look like this:


The very last activity is “Regular Expression”. For a reason that I still have to understand, the collection generated by the “Query XML” activity and stored in “ParsedGroupUsers” does not behave like a “correct” collection. This is why I use this “Regular Expression” activity: to split the “ParsedGroupUsers” variable on the semicolon and to store the result in the “User Collection” output parameter:
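The split itself is trivial; a quick sketch of the equivalent logic (hypothetical sample value), noting that the XSLT above leaves a trailing semicolon, which produces an empty entry that should be discarded:

```python
# Hypothetical output of the Query XML step: values joined by ";" with a trailing separator.
parsed_group_users = "CONTOSO\\jdoe;CONTOSO\\jroe;CONTOSO\\msmith;"

# Split on the semicolon and drop the empty entry left by the trailing separator.
user_collection = [u for u in parsed_group_users.split(";") if u]

print(user_collection)
```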


That’s it !

Publish the User Defined Action and then use this UDA in your workflows.

Tuesday, October 16, 2012 2:06:00 AM (GMT Daylight Time, UTC+01:00)
Nintex | SharePoint
# Wednesday, August 01, 2012

<Caution>This post is based on the SharePoint 2013 Consumer Preview. Thus, behavior described here may change in the next release of the platform</Caution>


While working on a SharePoint 2013 farm, after several hours of uptime, the user interface showed an error message when I wanted to save an item in a list: “The server was unable to save the form at this time. Please try again.” Looking at the different possible causes, I found that the available memory was drastically low. Indeed, on this 8GB RAM front-end server virtual machine (the database is hosted on a second VM), only a few megs were still available.


Then, in the “Processes” tab, I saw that the noderunner.exe process was eating a lot of memory. A quick tour on Google and I found Marc Molenaar’s blog post about the noderunner.exe process.

I decided to give it a try and, as suggested in the post, I restarted the “SharePoint Search Host Controller” service. Same observation as Marc: the service took a long time to restart, and a huge part of the memory was released. The good thing is that, at the same time, it solved my item-saving issue; the error disappeared.

To check whether this service restart was really “solving” the issue, I worked again for several hours, also playing with the search, and when the VM ran short on memory, the same error message was shown to me again.

Another side effect of this low-memory situation occurs when browsing the Managed Metadata tree. I suddenly and constantly received an “Unexpected response from server. The status code of response is ‘500’. The status text of response is ‘System.ServiceModel.ServiceActivationException’.” message. Unfortunately, it was impossible to get out of this message loop, and the only way to get rid of it was to kill the Internet Explorer application.


Wednesday, August 01, 2012 10:01:23 PM (GMT Daylight Time, UTC+01:00)
# Wednesday, July 18, 2012

For a couple of weeks now, the Microsoft world has been boiling, and the recent announcements have raised the level of excitement among partners, developers and users. Among the most recent events, it started with the acquisition of Yammer by Microsoft, followed by the Windows Phone 8 announcement. Then Windows 8 and the related devices, accompanied by the Surface tablet. And now, what many people had been expecting for a while: the Office 2013 wave, including SharePoint 2013. This is not only a wave like the “Office 14 wave”; it is rather a tidal wave, with the release of the “Consumer Preview” of the products, introducing the “Modern Office” concept.

During the July 16th presentation, SharePoint 2013 was only slightly mentioned, but the complete set of Office 2013 Consumer Preview products was released, and following the #officepreview tweets was amazing. At the same time, the NDA by which the closest communities (like the MVPs) were bound was lifted, and a massive amount of information was released.

The install

So, during the event, I downloaded SharePoint Server 2013 (2.1 GB) to start an install in a VM with 4 CPUs and 8GB of RAM. The first surprise came when I started the setup program, which directly offers an “Install Prerequisites” option, saving us from downloading the prerequisites individually (if I remember well, the first versions of SharePoint 2010 didn’t have such a shortcut). And fortunately so, because the list of prerequisites is quite big. Once the prerequisites were installed, the setup itself took around 20 minutes to install the beast. The configuration wizard is well-known too, as it looks like (if not the same as) the one from 2010. Finally, the post-install wizard starts and displays the first bits of the new SharePoint 2013 user interface.


Quick Round

Once the install is done and the first site created, the new Metro-style user interface is presented. To be honest, the default theme is not the most successful one: it is really difficult to tell apart the header, the current navigation and the content. So, the first thing I did was to go to the former “Site Actions” menu, now on the right of the top bar, then to “Site Settings” and “Change the look”. Some themes are better than others at distinguishing the content from the navigation. In addition, for each theme (or look?), it is possible to select a different color scheme, to change or remove the background image, or to change the fonts used.

By default, the ribbon is hidden; to make it appear, you have to click on one of the menu headers, and the ribbon then appears, “sliding” down from the top of the page header. The current navigation didn’t change much, but a nice feature is the ability to modify the links of both the global navigation and the current navigation using the “Edit Links” link. This is pretty convenient, as it does not force you to go through the “Top Link Bar” or “Quick Launch” settings.

The user menu is quite simple now. Gone are “Sign in as a different user” and the other items; in SharePoint 2013, there are only “About Me”, leading you to your personal page and the social part of SharePoint, and “Sign Out”. In the same area, the “Share” button allows you to invite others and assign them permissions on the current page, “Follow” makes the current page appear in your feeds, “Sync” synchronizes the content of your site locally, “Edit” is a shortcut to the Page => Edit action, and then there is the surprising “Focus on Content” button. This feature toggles between a view without any navigation, showing only the content area on the screen, and the standard view of the SharePoint page. Why not…


But how do you create a document library? If you are not familiar with the new UI (meaning the first 15 minutes), you will desperately look for a “Create” button somewhere. Instead, going to “Site Contents” enables you to “add an app”, which proposes the different types of lists and libraries you can create. Thus, I created a first library and uploaded a file into it, which does not differ from the previous version of SharePoint. Where I was surprised again is the usability of some features. For example, viewing the properties of a file: where before it was quick and needed only one click, in SharePoint 2013 it requires two clicks, each time on the “…” button. I would expect such a functionality directly in the context menu of the item. Let’s see if it stays like this in the final release; it may be worth some improvements in some cases.
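Underneath the new “add an app” wording, a document library is still an ordinary SharePoint list, so it can also be created from the Management Shell with the server object model. A minimal sketch, assuming a farm is available; the URL and titles are placeholders:

```powershell
# Load the SharePoint snap-in when running from a plain PowerShell console
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$web = Get-SPWeb "http://myserver"   # placeholder site URL

# Lists.Add returns the GUID of the newly created list
$id = $web.Lists.Add("Test Documents", "A test document library",
                     [Microsoft.SharePoint.SPListTemplateType]::DocumentLibrary)

$web.Lists[$id].Title   # the new library is immediately available
$web.Dispose()
```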

The performance


With SharePoint 2010, installing it on a VM with 4 CPUs and 8 GB of RAM was quite OK for trying things out. Having SQL Server in the same VM was not that bad. Sometimes slow, but not that bad. Here, with SharePoint Server 2013, I decided to install it on the same kind of machine, and after a while it became really slow. I connected to the server to check the performance and, even though the CPUs were only a few percent used, it was radically different with the memory. Simply put, less than 1 GB was “free”, and the main memory eaters were SQL Server and the Distributed Cache service (AppFabric). This demonstrates another level of requirements. Indeed, checking on the web, I found this article from Bjorn Furuknap and, later, the hardware requirements from Microsoft: 24 GB of RAM (yes, 24!!) is recommended. This new release of SharePoint has a price… It is also true that during the install and the configuration I selected all the services, which certainly plays a role. But Visual Studio is not yet installed, and I am wondering what kind of setup a developer will need for a decent development environment.
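To see quickly which processes are eating the memory, a plain PowerShell one-liner (no SharePoint dependency) is enough; a sketch:

```powershell
# Top 5 processes by working set, converted to MB
Get-Process |
  Sort-Object WorkingSet64 -Descending |
  Select-Object -First 5 Name, @{ n = 'WS(MB)'; e = { [math]::Round($_.WorkingSet64 / 1MB) } }
```

On my VM, this listed SQL Server and the Distributed Cache (AppFabric) host at the top.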

This concludes my very first post about SharePoint 2013; other articles will follow on a regular basis, describing either the (new) features of the platform or what is new in terms of architecture and development. So, like SharePoint 2013, I am “working on it”, and thanks for staying tuned.

Wednesday, July 18, 2012 10:37:07 PM (GMT Daylight Time, UTC+01:00)  #    Comments [0] -
# Saturday, June 09, 2012

This post is also published on The SharePoint Bar

Recently, I was called in to troubleshoot and fix an issue on a SharePoint 2010 farm, with the simple statement that not all users were displayed in the people picker. Indeed, some users were listed and some were not, without a clear common pattern that could lead to something like “they are not in a group with enough privileges” or anything similar.

The symptoms

A simple way to reproduce the issue was to open “Library Permissions” or “List Permissions” from the ribbon of any library or list, and then to select “Check Permissions”. This opens a dialog from which a people picker can easily be opened.


Now, when clicking the address book button and looking for a specific user, that user was not displayed and therefore not selectable. The user existed in Active Directory and, after some investigation, I also found that the problem was not present in the Central Administration or in a completely fresh web application and site collection. Thus, it was clear that only one site collection had the issue.


The issue was caused by a restriction applied to the people picker. Indeed, it is possible to restrict the scope of the people picker to a specific OU (Organizational Unit) or to use a specific LDAP filter. Let’s illustrate this. In my Active Directory, I created 3 OUs and, in each of them, a user:


In the people picker, I have all the users :


Now, execute the following command, which applies the limitation to the site collection specified by the -url parameter:

stsadm -o setsiteuseraccountdirectorypath -url http://centaurus -path "OU=OU2,OU=OU1,DC=plab,DC=local"

The result is that you will limit the scope of the people picker to the OU2 within OU1 :


To check the state of the limitations, execute the command below :

stsadm -o getsiteuseraccountdirectorypath -url http://centaurus

The result will be :


To simply remove any restriction, execute the following command :

stsadm -o setsiteuseraccountdirectorypath -url http://centaurus/ -path ""
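While stsadm is the command-line interface for this setting, the value behind it is, to my knowledge, also exposed by the SPSite.UserAccountDirectoryPath property of the server object model, so it can at least be inspected from the Management Shell. A sketch, using the same site collection URL as above:

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$site = Get-SPSite "http://centaurus"
# An empty string means no OU restriction is applied to the people picker
$site.UserAccountDirectoryPath
$site.Dispose()
```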

This command has no PowerShell equivalent. It is part of a set of commands and properties dedicated to configuring the people picker, which are listed below (from the TechNet article):

Peoplepicker-activedirectorysearchtimeout : Configures the timeout when a query is issued to Active Directory. The default timeout value is 30 seconds.

Peoplepicker-distributionlistsearchdomains : Restricts the search of a distribution list to a specific subset of domains.

Peoplepicker-nowindowsaccountsfornonwindowsauthenticationmode : Specifies not to search Active Directory when the current port is using forms-based authentication.

Peoplepicker-onlysearchwithinsitecollection : Displays only users who are members of the site collection when the Select People and Groups dialog box is used.

Peoplepicker-peopleeditoronlyresolvewithinsitecollection : Displays only users who are members of the current site collection when the Check Names button is clicked.

Peoplepicker-searchadcustomfilter : Enables a farm administrator to specify a unique search query.

Peoplepicker-searchadcustomquery : Permits the administrator to set the custom query that is sent to Active Directory.

Peoplepicker-searchadforests : Permits a user to search from a second one-way trusted forest or domain.

Peoplepicker-serviceaccountdirectorypaths : Enables a farm administrator to manage the site collections that have a specific organizational unit (OU) setting, as defined by the Setsiteuseraccountdirectorypath setting.


Another TechNet article explains the other people picker configurations that can be performed.

These different commands can be really useful to restrict the users that can be added to a site collection, based on OUs in Active Directory. To enable this, the Active Directory should follow the security model of your SharePoint organization, because the restriction can only target a single OU; it is not possible to specify several OUs.

And, finally, it has to be documented, as these properties and commands are not available in the SharePoint user interface, and this feature may not come to the mind of administrators trying to find out why some users cannot be found in the SharePoint infrastructure.

Saturday, June 09, 2012 8:40:00 PM (GMT Daylight Time, UTC+01:00)  #    Comments [0] -
About the author/Disclaimer

The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.

© Copyright 2022
Yves Peneveyre