窗外

Build a SharePoint Server 2016 Hybrid Lab

SharePoint Server 2016 has been out for a while now. One of its headline features is hybrid configuration with Office 365. To understand how it works, I built a lab environment based on Azure VMs and a trial subscription of Office 365. Here is how I did it.

Prerequisites

To build a lab environment for hybrid solutions, you need the following components in place.

  • An Office 365 subscription. A trial is fine.
  • A public domain name. The default <yourcompany>.onmicrosoft.com domain that comes with the O365 subscription won’t work in hybrid scenarios, so you have to register a public domain if you don’t have one.

Configure Office 365

To configure the hybrid environment, you must register a public domain with your O365 subscription. The process goes roughly like this: you kick off the domain setup wizard in your O365 subscription, and O365 generates a TXT value. You create a TXT record with that value in the DNS zone at your domain registrar, and then ask O365 to verify it. Once verification succeeds, the domain is registered with your O365 subscription. More details can be found in Microsoft's documentation.
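
Before asking O365 to verify the domain, it can save a round trip to check that the TXT record has actually propagated. Here is a minimal PowerShell sketch; the domain name and the MS= token are placeholders for your own values:

# Check that the O365 verification TXT record is visible in public DNS.
# "contoso.com" and the "MS=" token value are placeholders.
Resolve-DnsName -Name "contoso.com" -Type TXT |
    Where-Object { $_.Strings -match "^MS=" } |
    Select-Object -ExpandProperty Strings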

If you only want to test SharePoint hybrid scenarios, you don’t need to create the mail-related DNS records such as MX. You only need them if you also want to test the mailbox features.

The next step is to configure AD sync between your on-premises AD and the Azure AD created with your O365 subscription. You can use the Azure AD Connect tool to do this. For a lab environment, AD sync with password sync is good enough. You can also try AD sync with SSO if you have an AD FS deployment to play with.

Before kicking off the AD sync, you might have to do some cleanup of AD attributes (see the sketch after this list). I changed the following:

  • Add a valid and unique email address in the proxyAddresses attribute.
  • Ensure that each user who will be assigned Office 365 service offerings has a valid and unique userPrincipalName value on their user object.
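
A minimal PowerShell sketch of that cleanup for a single user, assuming the ActiveDirectory module is available; the account and domain names are placeholders:

# Give one user a routable UPN and a primary SMTP address.
# "alice" and "contoso.com" are placeholders.
Import-Module ActiveDirectory
Set-ADUser -Identity "alice" `
    -UserPrincipalName "alice@contoso.com" `
    -Add @{ proxyAddresses = "SMTP:alice@contoso.com" }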

With the cleanup done, you can start the sync. You should be able to see the user accounts in the O365 admin center after syncing.
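
You can also verify the result from PowerShell. A sketch, assuming the MSOnline module is installed:

# List the accounts that were synced from the on-premises AD.
Connect-MsolService
Get-MsolUser -Synchronized | Select-Object UserPrincipalName, DisplayName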

Configure SharePoint Server 2016

Deploy the SharePoint Server 2016 farm. You can try the MinRole deployment if you have multiple servers. In my lab, I just deployed a single server.

The following service applications are required for the hybrid scenarios; a provisioning sketch follows the list.

  • Managed Metadata Service
  • User Profile Service, with user profile sync and a My Site host
  • App Management Service
  • Subscription Settings Service
  • Search Service, for the hybrid search scenarios
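
Most of these can be created from Central Administration, but the Subscription Settings Service can only be provisioned with PowerShell. A minimal sketch, with placeholder pool, account, and database names:

# Provision the Subscription Settings Service (PowerShell only).
# The pool, managed account, and database names are placeholders.
$pool = New-SPServiceApplicationPool -Name "SettingsServicePool" -Account "CONTOSO\spservice"
$sa = New-SPSubscriptionSettingsServiceApplication -ApplicationPool $pool `
        -Name "Subscription Settings" -DatabaseName "SubscriptionSettingsDB"
New-SPSubscriptionSettingsServiceApplicationProxy -ServiceApplication $sa | Out-Null
Get-SPServiceInstance |
    Where-Object { $_.TypeName -like "*Subscription Settings*" } |
    Start-SPServiceInstance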

The user profile properties need the following mappings:

  • The User Principal Name property is mapped to the userPrincipalName attribute.
  • The Work email property is mapped to the mail attribute.

Configure Hybrid

Once you have O365 and SharePoint Server 2016 ready, you can start to configure the hybrid. It is fairly simple with the help of the Hybrid Picker in SharePoint Online: go to the SharePoint admin center of O365, click to configure hybrid, pick a hybrid solution, and follow the wizard. If everything is OK, the hybrid will be configured. Browse to an on-premises site, and you should see the app launcher like the screenshot below.

Next Step

The next things to try are configuring the server-to-server trust and cloud hybrid search. Stay tuned.
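
As a small preview, cloud hybrid search is built on a cloud Search service application, which differs from a normal SSA mainly by the -CloudIndex switch. A minimal sketch with placeholder names; the full topology and proxy provisioning follow the usual SSA steps:

# Create a cloud Search service application for cloud hybrid search.
# The pool, account, and database server names are placeholders.
$appPool = New-SPServiceApplicationPool -Name "CloudSearchPool" -Account "CONTOSO\spsearch"
$ssa = New-SPEnterpriseSearchServiceApplication -Name "Cloud SSA" `
         -ApplicationPool $appPool -DatabaseServer "SQL01" -CloudIndex $true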


A better way to control the toolbar of the XsltListViewWebPart

Sometimes you may want to programmatically control how the toolbar of an XsltListViewWebPart is shown on a page. If you search the web, you will find that a lot of people have tried to accomplish this with .NET Reflection, like what was shown in this post: reflecting into the SharePoint assembly and calling an internal method. That may achieve the goal, but it is a bad idea, because it breaks a basic .NET programming rule: when a method is declared internal, it is not meant to be called publicly. Calling it via Reflection, particularly in SharePoint, may raise supportability concerns.

So is there a better way to achieve the goal? The answer is yes. For example, to hide the toolbar of an XsltListViewWebPart, you can simply use the following 3 lines of code.

view.Toolbar = null;                        // clear the current toolbar definition first
view.Toolbar = @"<Toolbar Type='None' />";  // CAML that hides the toolbar
view.Update();                              // persist the change to the view

If you want to do something else with the toolbar, just replace the CAML in the 2nd line with your own. The view object in the above code is an SPView object. This way is simple, and more importantly it doesn’t use any internal method.
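
If you only need a one-off change rather than compiled code, the same property can also be set from the SharePoint Management Shell. A minimal sketch, with a placeholder web URL and list title:

# Hide the toolbar of a list view from PowerShell.
# The web URL and list title are placeholders.
$web = Get-SPWeb "http://intranet.contoso.com"
try {
    $view = $web.Lists["Documents"].DefaultView
    $view.Toolbar = "<Toolbar Type='None' />"
    $view.Update()
}
finally {
    $web.Dispose()
}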

That brings us to the end of this post. If you want to know why this works, you can find out for yourself with a reflector tool by checking how the SPView.Toolbar property is implemented.

Exporting SharePoint search results with a custom display template

In a recent project, I got a requirement to export the SharePoint 2013 search results to a CSV file. Initially, I thought it would be simple: I could probably extend the search result web part, generate the CSV on the server side, and send it to the client. But when I looked at it in detail, I doubted that a search extension web part was a good idea. It could be way too heavy for such a simple requirement, and it didn’t fit our project well either, because we were trying to leverage the new app model as much as possible and keep server-side customization to a minimum.

So I decided to explore another way: use a display template to get the search results and generate the CSV with JavaScript. That left two questions to answer.

  1. Is it possible to easily get all the search results in a display template?
  2. How do you generate a file with JavaScript on the fly and prompt the user to download it?

Fortunately, I managed to figure out the answers to both questions, and I will show you how I did it in this post.

First of all, I created a control display template based on the OOTB Default Result template. I wrapped all my JavaScript code in a custom js file, exportresult.js, and uploaded it to the Style Library. I then linked the js in the display template with the following code:

<body>
  <script>
    $includeScript(this.url, "~sitecollection/Style%20Library/export/exportresult.js");
  </script>
...
</body>

I hooked my JavaScript function into ctx.OnPostRender so that it would be called after the results are rendered. I also added a button on top of the search results for the user to trigger the export:

<div style="display: table-cell">
   <a id="idExportSearchResults" href="#">Export Result</a>
</div>

There is almost no useful documentation about the JavaScript objects used in display templates, so I had to debug the JavaScript to figure out where the search results are stored. In the ctx.ListData object, there is a ResultTables array which stores several result tables. Based on my testing, the first item of this array is the search result shown in the search result web part; the second item could be the results for refinement. So I used the following code to export the results with the required metadata.

var propArray = ["Author", "Created", "ExternalMediaURL", "LastModifiedTime", "Path"];
var exportedResult = [];
// Keep only the required managed properties from each result row.
ctx.ListData.ResultTables[0].ResultRows.forEach(function (row) {
   var obj = {};
   for (var i = 0; i < propArray.length; i++) {
     obj[propArray[i]] = row[propArray[i]] ? row[propArray[i]] : "";
   }
   exportedResult.push(obj);
});

Finally, I needed a way to save the data to a file with JavaScript. I found a very helpful post on exactly this; all credit for the approach below goes to that post. The following code shows the idea.

var showSave;
// Only works with IE10 or later.
if (window.Blob && navigator.msSaveBlob) {
  showSave = function (data, name, mimeType) {
    var resultBlob = new Blob([data], { type: mimeType });
    navigator.msSaveBlob(resultBlob, name);
  };
}

With all of the above, I could fulfill the requirement. The code only works in IE10 and later, but it could be extended to support other browsers. If you are interested in extending it, feel free to fork it on GitHub.

Deploying a Remote Event Receiver in a Provider-Hosted App

The remote event receiver (RER) and the app event receiver are new concepts in SharePoint 2013. Although there are already many articles and posts on the web about how to create a remote event receiver, how to deploy it, and so on, you may still be bitten by subtle differences when you deploy a provider-hosted app that has a remote event receiver, just as I was recently in one of my projects. My app worked fine in the dev environment; I only hit problems when I tried to publish and deploy it with certificates and SSL turned on. I spent most of the past two days figuring out what was wrong, so here are some of my findings.

Binding

A remote event receiver is an endpoint of a WCF service. But interestingly, when we create such a receiver in our app, we don’t have to declare the endpoint in web.config or configure it in code; SharePoint knows how to talk to it. Given my limited WCF knowledge, I guess this helps prevent clients other than SharePoint from easily figuring out how to communicate with the service.

Although SharePoint knows the binding and protocol used to talk to the RER, this is not documented, at least as far as I could find. In a dev environment without SSL, it works fine. But after turning on SSL, you may see an error like the following, even though the URL of the service opens in the browser.

There was no endpoint listening at https://app1.contosoapp.com/AppEventReceiver.svc that could accept the message. This is often caused by an incorrect address or SOAP action.

This happens because SharePoint talks to the service with basicHttpBinding. Without SSL it works, because basicHttpBinding with the HTTP scheme is one of the defaults in the protocolMapping. With SSL on, the HTTPS scheme is not there and the endpoint cannot be reached. To overcome this, basicHttpBinding with the HTTPS scheme needs to be added to the protocolMapping with the following settings in web.config:

    <bindings>
      <basicHttpBinding>
        <binding name="secureBinding">
          <security mode="Transport" />
        </binding>
      </basicHttpBinding>
    </bindings>
    <protocolMapping>
      <add binding="basicHttpBinding" scheme="https" bindingConfiguration="secureBinding" />
    </protocolMapping>

You may have seen the above snippet in many RER samples without knowing why it is needed, as it is usually only annotated with something like “used by SharePoint app”. Hopefully you know the reason now.
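
A quick way to confirm the endpoint is reachable after adding the binding is to request the .svc URL over HTTPS; a WCF service should answer with its help page. A sketch reusing the service URL from the error above, assuming the machine trusts the SSL certificate:

# Sanity-check the RER endpoint over HTTPS.
Invoke-WebRequest -Uri "https://app1.contosoapp.com/AppEventReceiver.svc" -UseBasicParsing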

Authentication

In most cases, the remote web of a provider-hosted app uses some sort of authentication to prevent anonymous access. Likewise, in my project I turned on Windows authentication for the remote web in IIS. But if you do so on an app with an RER, you may see the following error:

The HTTP request is unauthorized with client authentication scheme ‘Anonymous’. The authentication header received from the server was ‘NTLM,Negotiate’.

What happened here is that, with the basicHttpBinding settings above, no credential is configured for client authentication, since the default clientCredentialType of basicHttpBinding is None. So when the client (here, SharePoint) calls the service, it doesn’t provide any credential and cannot pass Windows authentication.

To overcome this issue, Anonymous authentication must be enabled on the remote web in IIS. However, I didn’t want to expose the whole site anonymously, so I ended up enabling both Anonymous and Windows authentication in IIS and allowing anonymous access only on the RER service, with something similar to the following configuration in web.config:

<configuration>
  <location path="AppEventReceiver.svc">
    <system.web>
      <authorization>
        <allow users="?"/>
      </authorization>
    </system.web>
  </location>
  <system.web>
    <compilation debug="true" targetFramework="4.5" />
    <httpRuntime targetFramework="4.5" maxRequestLength="2147483647" executionTimeout="14000" />
    <authorization>
      <deny users="?" />
    </authorization>
  </system.web>
......
</configuration>

With all of the above changes and configuration in place, I was able to deploy the provider-hosted app with the RER successfully, and all functions work properly as well.

SharePoint 2013: Everything About Configuring Claims Authentication

There are plenty of articles on the web about how to configure claims-based authentication in SharePoint; back in the SharePoint 2010 days I wrote a few myself. However, most of them focus on one specific topic, such as how to configure the trust between AD FS and SharePoint, or how to deploy a custom-developed claim provider. Very few give a complete walkthrough of deploying a claims authentication environment end to end: not just configuring the web application, but also configuring User Profiles, My Sites, and so on. My last project happened to be about claims authentication, and this post is a summary of that project.

Configure the Identity Provider

The first step of using claims-based authentication is, of course, configuring the trust relationship between SharePoint and the identity provider. Documentation for this is abundant. For AD FS, you can follow the official documentation and basically can't go wrong. Other identity providers are configured in much the same way, so there is not much to say. I once wrote a post about configuring a custom-developed identity provider, which may also be useful. Multiple web applications can share one trusted identity provider configuration, as long as the corresponding realms are set up. To add a new realm to the trusted token issuer, use the following PowerShell code:

# Map the web application's URL to the realm configured at the identity provider.
$uri = New-Object System.Uri("http://intranet.contoso.com")
$ap = Get-SPTrustedIdentityTokenIssuer
$ap.ProviderRealms.Add($uri, "urn:SharePoint:Intranet")
$ap.Update()

Configure the Claim Provider

Because the default claim provider does essentially nothing, managing user permissions becomes painful once claims authentication is enabled. In general, after enabling claims authentication you need to deploy a custom claim provider to handle user search and name resolution. There is not much to say about developing and deploying a claim provider either; there is plenty of material on the web, and there are corresponding open-source projects on Codeplex. Once the claim provider is deployed, you can associate the trusted identity provider with it using the following PowerShell commands:

# Associate the custom claim provider with the trusted token issuer.
$sts = Get-SPTrustedIdentityTokenIssuer
$sts.ClaimProviderName = "ClaimProviderInternalName"
$sts.Update()

Configure User Profiles

With the web application and the claim provider configured, claims-based authentication is basically usable. However, if you want to use User Profiles, for example to display users' display names or to support People Search, you also need to configure User Profile Synchronization.

When configuring the User Profile Synchronization connection, the Authentication Provider Type needs to be set to Trusted Claims Provider Authentication, along with the corresponding trusted identity provider. The Claim User Identifier property in the user profile needs to be mapped to the claim used as the identifier. For example, if the trusted claims provider is configured to use the email address as the identifier, then the Claim User Identifier needs to be mapped to the mail AD attribute. See the following two screenshots:

Additionally, if you want to avoid name conflicts in My Site URLs, or don't want users' Windows logon names to appear in My Site URLs, you can change the mapping of the User name property. For example, to use the email address in My Site URLs, map User name to the AD mail attribute, as shown in the figure below.

After finishing the settings above, run a Full Sync, and the users' profiles will be updated.

Finally

With all of the above configuration done, a complete SharePoint environment based on claims authentication is essentially finished. If you want to hide the default Windows Authentication option in the People Picker, a simple PowerShell command will do it; there is detailed official documentation for this, so I won't repeat it here.

Crossroads

Microsoft's stock price has been on a roller coaster lately. A few days ago, Ballmer announced he would retire within 12 months, and the stock jumped more than 7%. Just days later, he and Nokia's Elop jointly announced that Microsoft would acquire Nokia's mobile device division, and the stock promptly dropped more than 5%. Back and forth, the gains and losses nearly canceled out; Microsoft's shareholders rode a roller coaster for a few weeks only to end up right back where they started.

By conventional logic, a CEO announcing retirement with no succession plan should be bad news, yet Microsoft's stock surged on it. Acquiring Nokia's device division strengthens Microsoft's hardware capability, integrates the supply chain, lowers the cost of Lumia devices, and, most importantly, came at a bargain price; that should be good news, yet Microsoft's stock plunged on it. Is Ballmer really that unpopular, or is Microsoft's PR just that bad? Or perhaps both? Today's Microsoft seems to be standing at a crossroads: it must improve product quality while reinventing its corporate image, and defend its enterprise market while keeping up the push into the consumer market. Which road to take is the question the next CEO must answer.

From a product point of view, I think Microsoft's products are actually quite good. But Microsoft re-entered the mobile internet space late, and its traditional software engineering model can't keep up with the pace of mobile. Together, these two factors make a breakthrough in mobile very difficult for Microsoft. I still very much support Ballmer's insistence on the consumer electronics market. Microsoft is not IBM: IBM sells mainframes and professional servers, and doesn't care what individual consumers use. Microsoft is different. If people stop using Microsoft products in their daily lives, its products will inevitably be squeezed out of the enterprise as well; as iPads and iPhones enter the enterprise in growing numbers, the signs are already showing. If Microsoft truly gave up the consumer market, it would be one step away from collapse. As the saying goes, offense is the best defense: even if Microsoft can't take the lead in the consumer market, merely tying its competitors down there keeps the enterprise market secure.

Speaking of Microsoft's PR, I have to mention another thing from last weekend. I previously posted a letter that Microsoft Learning (MSL) sent to the MCM/MCA community, announcing that the MCM/MCA certifications would be retired. That bluntly worded letter set off a storm in the MCM/MCA community. Many people may not know what the MCM/MCA certifications are, especially those who don't work with Microsoft server products. They are the highest level in Microsoft's certification program, comparable to Cisco's CCIE. Earning them takes more than written exams; you must also pass a lab exam, which is genuinely hard, and even veterans who have worked with a product or technology for years don't necessarily pass every test on the first attempt. That is exactly why these certifications were considered the most valuable; the community around them is small, but its influence is large. MSL killed the program with one curt email; no wonder there was a backlash. It fully exposes how bad Microsoft is at PR and effective communication. If policies change overnight like this, who will trust Microsoft's technical direction and commitments in the future?

The certifications look certain to be killed, but the story isn't over. How will Microsoft heal the wounds of these advanced users and partners? How will it rebuild trust? I'm waiting to see how MSL responds.

Update: Cross-posted to http://zhuanlan.zhihu.com/wurenyedu/19579555, to try out how Zhihu columns work.

It's not that I don't understand; it's that the world is changing fast!

From: Advanced Certification
Sent: Saturday, August 31, 2013 1:05 PM
Cc: Advanced Certification
Subject: MCM/MCSM/MCA Program Update

We are contacting you to let you know we are making a change to the Microsoft Certified Master, Microsoft Certified Solutions Master, and Microsoft Certified Architect certifications. As technology changes so do Microsoft certifications and as such, we are continuing to evolve the Microsoft certification program. Microsoft will no longer offer Masters and Architect level training rotations and will be retiring the Masters level certification exams as of October 1, 2013. The IT industry is changing rapidly and we will continue to evaluate the certification and training needs of the industry to determine if there’s a different certification needed for the pinnacle of our program.

As a Microsoft Certified Master, Microsoft Certified Solutions Master, or Microsoft Certified Architect, you have earned one of the highest certifications available through the Microsoft Certification program. Although individuals will no longer be able to earn these certifications, you will continue to hold the credential and you will not be required to recertify your credential in the future. You will continue to have access to the logos through the MCP site, and your certifications will continue to show in the appropriate section of your transcript, according to Microsoft technology retirement dates. If you are a Charter Member, you will continue to hold the Charter Member designation on your transcript.

Also as a Microsoft Certified Master, Microsoft Certified Solutions Master, or Microsoft Certified Architect, you are a member of an exclusive, highly technical community and you’ve told us this community is one of the biggest benefits of your certification. We encourage you to stay connected with your peers through the main community distribution lists. Although we won’t be adding more people to this community, you continue to be a valued member of it. Over time, Microsoft plans to transition the distribution lists to the community, and, with your consent, will include your information so that it can continue to be a valuable resource for your ongoing technical discussions.

Within the coming weeks, you will receive invitations to an updated community site. This community site will require you to sign in with a Microsoft Account and will replace the need for a Microsoft Partner account as is required today. From this site, you will be able to manage service requests for the Masters and Architects communities – such as ordering welcome kits and managing your contact information for the distribution lists and directory – and accessing training rotation and other community content (if applicable).

If you have not ordered your Welcome Kit, the last day to do so is October 31, 2013. To order your Welcome Kit, please contact the Advanced Cert team at *****@microsoft.com.

We thank you for your commitment to Microsoft technologies.

Respectfully,

S*****

*****, Certification Product Management
Developer & Platform Evangelism

A documentation bug can really hurt

My project needed a script to deploy a SharePoint farm. Since the scripting requirements were simple, I didn't use a heavyweight tool like AutoSPInstaller; instead I wrote a lightweight PowerShell script myself, which installs the binaries and then calls psconfig.exe to configure the farm. To confirm that the usage of psconfig.exe in SharePoint 2010 was the same as in MOSS 2007, I specifically checked its documentation: http://technet.microsoft.com/en-us/library/cc263093.aspx

To my surprise, the script still failed when it ran, right at the psconfig.exe step. To pin down what was wrong, I had to take the script apart and run the generated command line by hand. It turned out that when creating the configuration database, the -passphrase parameter was missing. But in that document, the parameter list of the configdb command doesn't include passphrase at all. Clearly, the document was copied straight over from MOSS 2007.
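
For reference, here is a sketch of the working command; the path is for SharePoint 2010, and the server, database, and account values are placeholders:

# Create the farm configuration database. Note -passphrase, which
# SharePoint 2010 requires but the MOSS 2007-era document omits.
& "$env:CommonProgramFiles\Microsoft Shared\Web Server Extensions\14\BIN\psconfig.exe" `
    -cmd configdb -create `
    -server "SQL01" -database "SharePoint_Config" `
    -user "CONTOSO\spfarm" -password "<password>" `
    -passphrase "<passphrase>" `
    -admincontentdatabase "SharePoint_AdminContent"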

A small error like this wasted a lot of my time. Inaccurate documentation really hurts.