All posts tagged FS4SP

Upcoming Events

SharePoint Search is alive and well, as the market changes – by Jeff Fried, CTO of BA Insight

See the first part of this series here: Is SharePoint Search Dead?

Despite all these signals, Microsoft has continued to invest quite heavily in search, and SharePoint search is a remarkably capable and very affordable product. The search market was commoditized for quite some time, due largely to Microsoft and Google.

The market is changing, though. And there are a lot of positive signs for Microsoft in Enterprise Search. Read more…


Is SharePoint Search Dead? – by Jeff Fried, CTO of BA Insight

I’m asked regularly whether Microsoft has abandoned the Enterprise Search market. This was a frequent question in 2015, and less frequent in 2016, but there’s been a recent uptick, and I got this question 10 times last month. As a long-standing search nerd who lives close to Microsoft, I know the answer is NO. But I was baffled about why this question keeps coming up.

So I decided to investigate. This blog post takes you through what I’ve found and how you can answer the question when it comes up. Search Explained is the perfect place to publish it. Read more…

Debugging and Troubleshooting the Search UI

Recently, I have been giving several Search presentations, and some of them focused on Crawled and Managed Properties. In this post, I’m focusing on the User Experience part of this story, especially on debugging and troubleshooting.

As you know, our content might have one (or more) unstructured part(s) and some structured metadata (properties). When we crawl the content, we extract these properties – these are the crawled properties. And based on the crawled properties, we can create managed properties.

Managed Properties in a Nutshell

Managed Properties are controlled and managed by the Search Admins. You can create them and map them to one or more Crawled Properties.

For example, let’s say your company has different content coming from different source systems: Office documents, emails, database entries, etc., stored in SharePoint, the File System, Exchange or Lotus Notes mailboxes, Documentum repositories, and so on. For each piece of content, there’s someone who created it, right? But the name of this property might differ across the various systems and/or document types. For Office documents, it might be Author, Created By, Owner, etc. For emails, it’s usually called From.

At this point, we have several different Crawled Properties used for the same thing: tagging the creator of the content. Why not display this in a common way for you, the End User? For example, we can create a Managed Property called ContentAuthor and map each of the Crawled Properties above (Author, Created By, Owner, From, etc.) to it. With this, we’ll be able to use this property in a common way on the UI: display it in the Core Results Web Part, use it as a Refiner, or as a Sorting value in the case of FAST.

(NOTE: Of course, you can map each Crawled Property to more than one Managed Property.)
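On FS4SP, a mapping like the ContentAuthor example above can also be created from PowerShell. Here is a minimal sketch (the property and crawled property names are just the examples from above; note that if a crawled property name exists in several categories, Get-FASTSearchMetadataCrawledProperty returns more than one object, so filter accordingly):

```powershell
# Run in the FS4SP shell.
# Type 1 = Text; see the New-FASTSearchMetadataManagedProperty reference for other type codes.
$mp = New-FASTSearchMetadataManagedProperty -Name "ContentAuthor" -Type 1

foreach ($cpName in @("Author", "From")) {
    # Take the first crawled property with this name (there may be several, one per category)
    $cp = @(Get-FASTSearchMetadataCrawledProperty -Name $cpName)[0]
    if ($cp) {
        New-FASTSearchMetadataCrawledPropertyMapping -ManagedProperty $mp -CrawledProperty $cp
    }
}
```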

On the Search UI

If you check a typical SharePoint Search UI, you can find the Managed Properties in several ways:

Customized Search UI in SharePoint 2010 (with FS4SP)

1. Refiners – Refiners can be created from Managed Properties. You can define several refiner types (text, numeric range, date range, etc.) by customizing this Web Part’s Filter Category Definition property. There are several articles and blog posts describing how to do this; one of my favorites is this one by John Ross.

2. Search Result Properties – The out-of-the-box Search Result Set is something like this:

OOTB Search Results

This UI contains some basic information about your content, but I’ve never seen an environment where it didn’t need to be customized to some extent – like the first screenshot above. You can include the Managed Properties you want, and you can customize the way they are displayed too. For this, you’ll have to edit some XMLs and XSLTs, see below…

3. Property-based Actions – If you can customize how properties are displayed in the Core Results Web Part, why not assign some actions to them? For example, a link to a related item. A link to more details. A link to the customer dashboard. Anything that has a (parameterized) URL and business value for your Search Users…

4. Scopes and Tabs – Managed Properties can be used for creating Scopes, and each Scope can have its own Tab on the Search UI.

Core Result Web Part – Fetched Properties

If you want to add some managed properties to the Search UI, the first step is adding them to the Fetched Properties. This field is a bit tricky though:

Fetched Properties

Edit the page, open the Core Results Web Part’s properties, and expand Display Properties. Here, you’ll see the Fetched Properties field. Take a deep breath and try to edit it – yes, it’s a single-line, crazy long XML. And don’t try to copy and paste it into your favorite XML editor, because if you do, break it into lines and tabs, and then try to copy it back, you’ll get another surprise – this really is a single-line text editor control. If you paste a multi-line XML here, you’ll get the first line only…

Instead, copy the content to the clipboard and paste it into Notepad++ (a free text editor, and really… it’s a Notepad plus-plus :)). It looks like this:

Fetched Properties in Notepad++

Open the Language menu and select XML. Your XML will still be one line, but at least it’s formatted.

Open the Plugins / XML Tools / Pretty Print (XML only – with line breaks) menu, and there you go! Here is your well-formatted, nice Fetched Properties XML:

Notepad++ XML Tools Pretty print (XML Only - with line breaks)

So, you can enter your Managed Properties by using the Column tag:

<Column Name="ContentAuthor"/>
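For context, the Column entries live inside a Columns element in that XML. After adding the new property, the relevant part looks roughly like this (the neighboring column names are illustrative – keep whatever your environment already has):

```xml
<root xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <Columns>
    <Column Name="Title"/>
    <Column Name="Author"/>
    <Column Name="ContentAuthor"/>
  </Columns>
</root>
```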

Ok, you’re done with editing, but as I’ve mentioned, it’s not a good idea to copy this multi-line XML and paste it into the Fetched Properties field of the Core Results Web Part. Instead, use the Linearize XML menu item of XML Tools in Notepad++, and your XML becomes one loooooooooong line immediately. From this point, it’s an easy copy-paste again. Do you like it? 🙂

NOTES about the Fetched Properties:

  • If you enter a property name that doesn’t exist, this error message will be displayed:

Property doesn't exist or is used in a manner inconsistent with schema settings.

  • You’ll get the same(!) error if you enter the same property name more than once.
  • Also, you’ll get the same error if you enter invalid property names in the Refinement Panel Web Part!

Debugging the Property Values

Once you’ve entered the proper Managed Property names to the Fetched Property field, technically, you’re ready to use them. But first, you should be able to check their values without too much effort. Matthew McDermott has published a very nice way to do this: use an empty XSL on the Core Results Web Part, so that you’ll get the plain XML results. You can find the full description here.
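The “empty” XSL is essentially a stylesheet that dumps the raw result XML instead of rendering it. A commonly used minimal version (a sketch, not necessarily Matthew’s exact one) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <!-- Dump the raw result XML so you can inspect the fetched property values -->
  <xsl:template match="/">
    <xmp><xsl:copy-of select="*"/></xmp>
  </xsl:template>
</xsl:stylesheet>
```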

In summary: if you create a Managed Property AND add it to the Fetched Properties, you’re ready to display (and use) it on the Result Set. For debugging the property values, I always create a test page with Matthew’s empty XSL, and begin to work with the UI customization only afterwards.



Event-Driven Crawl Schedule

Recently I’ve been working with a customer who had an interesting requirement: they had several content sources and wanted to crawl them one after another. Scheduling the incrementals for fixed times was not a good solution, as their incremental crawls were very hectic: an incremental crawl of the same content source took 5 minutes one time, then 1.5 hours the next. And of course, they didn’t want idle time.

But we cannot define this kind of rule from the UI, so the ultimate solution was PowerShell.

First, we need to be able to start a crawl. Let’s talk about the Incremental Crawl only this time. Here is the PowerShell script for this:

$SSA = Get-SPEnterpriseSearchServiceApplication -Identity "Search Service Application"

$ContentSourceName = "My Content Source"

$ContentSource = $SSA | Get-SPEnterpriseSearchCrawlContentSource -Identity $ContentSourceName

$ContentSource.StartIncrementalCrawl()


It’s an easy one, isn’t it?

The next step is checking the status of this content source. We need this for several reasons: for example, we want to start the crawl only if it’s in Idle status, or we want to display the current status of the crawl every minute, etc.

Here is the PowerShell command you need:

$ContentSource.CrawlStatus

What values can it have? Here is the list of possible crawl statuses:

  • Idle
  • CrawlStarting
  • CrawlingIncremental / CrawlingFull
  • CrawlPausing
  • Paused
  • CrawlResuming
  • CrawlCompleting
  • CrawlStopping

Ok, we can determine the status now, and we can start a crawl. How do we make it event-driven? Here is the logical sequence we have to follow:

  1. Start the crawl of a content source.
  2. Wait until it’s done.
  3. Take the next content source and repeat steps 1 and 2 until you’re done with each content source.
  4. Repeat this sequence.

The first step is creating a function if we want nice code. Here you go, my first one:

function Crawl {
    # Start crawling
    $ContentSourceName = $args[0]
    $ContentSource = $SSA | Get-SPEnterpriseSearchCrawlContentSource -Identity $ContentSourceName
    $CrawlStarted = Get-Date

    # Check crawl status and start only if the content source is idle
    if ($ContentSource.CrawlStatus -eq "Idle") {
        $ContentSource.StartIncrementalCrawl()
        Start-Sleep 1
        Write-Host $ContentSourceName " - Crawl Starting..."

        do {
            Start-Sleep 60    # Display the crawl status every 60 seconds
            $Now = Get-Date
            $Duration = $Now.Subtract($CrawlStarted)    # Duration of the current crawl
            $Speed = $ContentSource.SuccessCount / $Duration.TotalSeconds    # Speed of the current crawl, docs/sec
            Write-Host $ContentSourceName " - " $ContentSource.CrawlStatus (Get-Date).ToString() "-" $ContentSource.SuccessCount "/" $ContentSource.WarningCount "/" $ContentSource.ErrorCount "(" ("{0:N2}" -f $Speed) " doc/sec)"
        } while (($ContentSource.CrawlStatus -eq "CrawlStarting") -or ($ContentSource.CrawlStatus -eq "CrawlCompleting") -or ($ContentSource.CrawlStatus -eq "CrawlingIncremental") -or ($ContentSource.CrawlStatus -eq "CrawlingFull"))

        Write-Host $ContentSourceName " - Crawling Finished"
        Write-Host ""
    }
}

This is how you can call this function (note that PowerShell functions take space-separated arguments, not parenthesized ones):

Crawl "My Content Source"

Some additional steps you might need:

  • If you want to run this script once a day (you need daily incrementals only, but would like them done as quickly as possible), just schedule this script as a Windows task.
  • If you want to run this script during the day only (and release the resources for some other jobs at night, for example), you can apply a “start in the morning, stop in the evening” logic. I made a simple example in a blog post a few months ago.
  • If you want to run this sequence all day long, you might insert this logic into an infinite loop. (But be careful: sometimes you’ll need to run a full crawl, and then you have to stop running this script.)
  • You can insert some other steps into this script too. If you want to do something (logging, sending some alerts, etc.) when the crawl starts / stops, just do that here. It’ll be your custom event handler on the crawl events.
  • You can even write the output of this script to a file, so that you’ll have your own crawl log.
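As a sketch of the “all day long” variant from the list above (the content source names are placeholders, and Crawl is the function defined earlier):

```powershell
# The content sources to crawl, in the desired order
$ContentSourceNames = @("Content Source 1", "Content Source 2", "Content Source 3")

# Infinite loop: crawl them one after another, with no idle time in between.
# Stop the script manually when you need to run a full crawl instead.
while ($true) {
    foreach ($csName in $ContentSourceNames) {
        Crawl $csName
    }
}
```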

The scripts above work fine with both SharePoint Search and FAST Search for SharePoint. Enjoy!

Why are some Refiner values hidden?

Refiners are cool whether you use SharePoint Search or FAST, that’s not a question. I really like them; they give so many options and so much power to the end users.

But there’s a very common question around them: the deep vs. shallow behavior. You know the definitions very well: FAST Search for SharePoint has deep refiners, which means every result in the result set is processed when calculating the refiners. SharePoint Search uses shallow refiners, where the refiner values are calculated from the first 50 results only.

These definitions are easy, right? But let’s think a bit further and try to answer the question that pops up at almost every conference: Why are some Refiner values not visible when searching? Moreover: why are they visible when running Query1 and hidden when running Query2?

For example: let’s say you have a lot of documents crawled, and you enter a query whose result set contains many, many items. Thousands, tens of thousands, or even more.

Let’s say you have an Excel workbook in the result set that might be relevant for you, but this Excel file is not boosted in the result set at all; say the first Excel result is in the 51st position (you have a lot of Word, PowerPoint, PDF, etc. files in positions 1–50).

What happens if you use FAST Search? – As the refiners are deep, every result will be processed, including your Excel workbook. In the Result Type refiner, for example, you’ll see the Word, PowerPoint and PDF file types as well as Excel. You can simply click on the Excel refiner and get what you’re looking for immediately.


But what’s the case if you don’t have FAST Search, only SharePoint Search? – As only the first 50 results are processed for the refiner calculation, your Excel workbook won’t be included. This means the Result Type refiner displays the Word, PowerPoint and PDF refiners but doesn’t display Excel at all, as your Excel file is not among the top results. You’ll see the Result Type refiner as if there weren’t any Excel results at all!


Conclusion: the difference between shallow and deep refiners doesn’t seem that important at first sight. But you have to be aware that there’s a huge difference in a real production environment, as you and your users might have some hidden refiner values, and sometimes it’s hard to understand why.

In other words, if a refiner value shows up on your Refinement Panel, that means:

  • In the case of FAST Search for SharePoint (deep refiners): there’s at least one item matching this refiner value in the whole result set. The exact number of items matching the refiner value is displayed as well.
  • In the case of SharePoint Search (shallow refiners): there’s at least one item matching this refiner value in the first 50 results.

If you cannot see a specific value on the Refiner Panel, that means:

  • In the case of FAST Search for SharePoint (deep refiners): there’s no result matching this refiner value at all.
  • In the case of SharePoint Search (shallow refiners): there’s no result matching this refiner value in the first 50 results.


Q&As of my Recent Webinar – Part 1.

Recently, I did a webinar with Dave Coleman and MetaVis. Thanks to the great people attending, I got far more questions than I was able to answer live, so as the ultimate solution I promised to answer them on my blog. Here are my answers, part #1:

  1. What is the motivation behind preventing folders under Document Sets – is it search related or just a business-level decision? – Unfortunately, I don’t know the exact answer. As far as I know, it was a technical decision made at design time. Property management and all the inheritance logic could be much more complicated if we could have a hierarchy inside a Document Set.
  2. Will you provide the PPTs after the webinar? – Yes, the PPTs have been uploaded to SlideShare:
  3. Can the content uploaded and processed by the Content Organizer be moved to different libraries in different site collections? – Out of the box, Content Organizer Rules can move documents within a Site Collection. If you need to move them outside of it, you need a custom solution.
  4. Is SharePoint 2010 search based on FAST? – No, FAST Search for SharePoint is based on the FAST Search product line, while SP2010 Search is based on the previous SharePoint products’ code.
  5. Can we import term sets from an external system – like external data in other software running in the company – as we can do via BCS? – Yes, you can import term sets in CSV format. You can find more information here:
  6. Did we see any BA Insight Longitude Search additions to FAST Search, or was it all plain FAST Search? – No, all my demos contained out-of-the-box capabilities only (both SP2010 and FAST Search).
  7. Is Search limited to a single site (server), or can it run over multiple sites (servers) at the same time? – You can scale out your Crawler and Query servers to multiple servers with both SP2010 Search and FAST Search.
  8. How do you activate FAST Search? – FAST Search Server 2010 for SharePoint (FS4SP) is a separate product. The steps for activating FS4SP are the following:
    1. You must have a SharePoint 2010 Enterprise farm.
    2. Install the FAST Search server(s). Depending on your environment, you can have one or more FS4SP servers; logically, it’s a farm too.
    3. Configure your SP2010 farm and the FS4SP farm to work together, and create the Search Service Applications in SharePoint.
    4. If you’re interested in more details, here are some useful links and troubleshooting steps for you:
  9. How can search be better integrated into custom applications (not from inside SharePoint)? – SharePoint has both an API (Object Model) and a collection of Web Services. You can call those Web Services from any remote application (with proper permissions, of course), while the Object Model can be used on the SharePoint server only. You can find the QueryService reference here:

More to come soon (sooner than this one, I can tell you)…

How to check the Crawl Status of a Content Source

As you know, I work (well, play) with SharePoint/FAST Search a lot. I have a lot of tasks where I have to sit on the F5 key while crawling and check the status: has it started? Is it still crawling? Is it finished yet?…

I have to hit F5 every minute. I’m too lazy for that, so I decided to write a PowerShell script that does nothing but check the crawl status of a Content Source and write it to the console for me. And I can work on my second screen while it’s working and working and working – without touching F5.

The script is pretty easy:

$SSA = Get-SPEnterpriseSearchServiceApplication -Identity "Search Service Application"
$ContentSource = $SSA | Get-SPEnterpriseSearchCrawlContentSource -Identity "My Content Source"

do {
    Write-Host $ContentSource.CrawlStatus (Get-Date).ToString() "-" $ContentSource.SuccessCount "/" $ContentSource.WarningCount "/" $ContentSource.ErrorCount
    Start-Sleep 5
} while ($true)
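If you’d rather have the script exit once the crawl is finished (instead of looping forever), the loop condition can check the status, something like this (same $ContentSource as above):

```powershell
# Display the status every 5 seconds until the content source goes back to Idle
do {
    Write-Host $ContentSource.CrawlStatus (Get-Date).ToString() "-" $ContentSource.SuccessCount "/" $ContentSource.WarningCount "/" $ContentSource.ErrorCount
    Start-Sleep 5
} while ($ContentSource.CrawlStatus -ne "Idle")
```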

Yes, it works fine for FAST (FS4SP) Content Sources too.

Troubleshooting: FAST Admin DB

The environment:

A farm with three servers: SharePoint 2010 (all roles), FAST admin, FAST non-admin. SQL is on the SharePoint box too.

The story:

Recently, I had to reinstall the box with SP2010 and SQL. Everything seemed to be fine: installing SQL and SP2010, configuring the FAST Content and Query Service Apps, crawling, searching… It was like a dream, almost unbelievable. But then I started to get an error on the BA Insight Longitude Connectors admin site when I started to play with the metadata properties: Exception configuring search settings: … An error occurred while connecting to or communicating with the database…

I went to the FAST Query SSA / FAST Search Administration / Managed Properties, and got this error: Unexpected error occurred while communicating with Administration Service

Of course, I went to the SQL Server’s event log, where I found this error: Login failed for user ‘MYDOMAIN\svc_user’. Reason: Failed to open the explicitly specified database. On the Details tab, I could see ‘master’ as the related DB.

I went to SQL Server Profiler, but the trace told me the same.

Of course, I checked everything around FAST: the user was in the FASTSearchAdministrators group, permission settings were correct on SQL, etc.

Finally, I found what I was looking for: the Event Log on the FAST admin server contained this error: System.Data.SqlClient.SqlException: Cannot open database “FASTSearchAdminDatabase” requested by the login. The login failed. Login failed for user ‘MYDOMAIN\svc_user’

The solution:

Yes, that was it: I had indeed forgotten to restore the FASTSearchAdminDatabase. But what do you do if you don’t have a backup of it?

Never mind, here is the PowerShell command for you:

Install-FASTSearchAdminDatabase -DbServer SQLServer.mydomain.local -DbName FASTSearchAdminDatabase

Voilà, it’s working again! 🙂

PowerShell script for exporting Crawled Properties (FS4SP)

Recently, I was working with FAST Search Server 2010 for SharePoint (FS4SP) and had to provide a list of all crawled properties in the category MyCategory. Here is my pretty simple script, which writes the list to a .CSV file:

$outputfile = "CrawledProperties.csv"

if (Test-Path $outputfile) { Clear-Content $outputfile }

foreach ($crawledproperty in (Get-FASTSearchMetadataCrawledProperty)) {
    $category = $crawledproperty.CategoryName
    if ($category -eq "MyCategory")
    {
        # Get the name and type of the crawled property
        $name = $crawledproperty.Name
        $type = $crawledproperty.VariantType

        switch ($type) {
            20 {$typestr = "Integer"}
            31 {$typestr = "Text"}
            11 {$typestr = "Boolean"}
            64 {$typestr = "DateTime"}
            default {$typestr = "Other"}
        }

        # Build the output: $name and $typestr separated by a space
        $msg = $name + " " + $typestr

        Write-Output $msg | Out-File $outputfile -Append
    }
}

$msg = "Crawled properties have been exported to the file " + $outputfile
Write-Output ""
Write-Output $msg
Write-Output ""
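By the way, the same export can be written more idiomatically with the pipeline and Export-Csv (a sketch; this version keeps VariantType as the raw number instead of translating it to a name):

```powershell
# Run in the FS4SP shell
Get-FASTSearchMetadataCrawledProperty |
    Where-Object { $_.CategoryName -eq "MyCategory" } |
    Select-Object Name, VariantType |
    Export-Csv -Path "CrawledProperties.csv" -NoTypeInformation
```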

Online Search Trainings (SharePoint 2010 and FAST)

The following trainings contain 24 free, distinct modules with downloadable content, assessments, and hosted hands-on labs. They cover both SharePoint Server 2010 and FAST Search Server 2010 for SharePoint:

SharePoint Summit 2011 Presentations

I’ve just uploaded my presentations from SharePoint Summit 2011 (updated links to SlideShare!):

All feedback is welcome here!

Best Practices for FAST Search Server for SharePoint 2010 Installation [Updated]

Last week, I was honored to be a speaker at the Best Practices Conference 2010 in Washington DC, and I got a lot of questions about the installation of FAST Search Server for SharePoint 2010. So I’ve decided to collect my tips and best practices and publish them here. I know my list is not complete, and it will definitely keep growing in the future.

  1. Install FAST Search Server on one or more separate boxes, not on the SharePoint server(s). – Why? Because not only will the performance be better, but you’ll have a more manageable and maintainable environment. For example, checking the performance or troubleshooting is much easier this way. Of course, there can always be exceptions. Although the best practice is to have separate App Servers and WFEs as well as separate admin and non-admin FAST servers in the farm, in some cases we have to deploy a different architecture. Even in my case: as you know, I do a lot of sessions and presentations, online as well as in person. If you have speaker experience, you know we basically have three options for doing a demo during a presentation:
    • Bring the farm to the venue. – Yes, sometimes it’s an option, for example if the event is in your company’s office. But in most cases it isn’t; you cannot bring 2-3-5-… boxes to a foreign conference.
    • Connect to a remote farm. – If you leave the farm in the office (or at home), you can connect to it from your session via Remote Desktop, or simply in the browser, for example. It’s a good choice if you have a good Internet connection; otherwise, your whole presentation can collapse on a demo that’s unable to work.
    • Have the demo environment on your laptop. – Ok, you can do that only if you have a proper laptop. But even if you do, it’s not too easy to have a full SharePoint 2010 farm, SQL, AD and FAST Search Server all together on your laptop. But I can say: it’s not impossible 🙂 I have installed a VMware image on my laptop (HP 8530p with 8GB RAM), with Win2008 R2 as a domain controller, SQL Express and SharePoint Server 2010 Enterprise on it, as well as FAST Search Server on the same virtual box. It works, but of course it’s not a production environment: it doesn’t contain much content, there are no crawls scheduled, etc. It’s just a simple demo machine for my sessions. So please don’t do this kind of installation unless you really need a demo environment! In production, never. Never! Please…
  2. After installing the FAST bits, use the FAST Configuration Wizard for the final steps, but don’t forget the additional steps! – First of all, the Config Wizard doesn’t create and import the certificates, so you have to do that manually. The following steps have to be performed:
    • Export the certificate with the following Powershell command running in the SharePoint 2010 Management Shell:

      $stsCert = (Get-SPSecurityTokenServiceConfig).LocalLoginProvider.SigningCertificate

      $stsCert.Export("Cert") | Set-Content -encoding byte MOSS_STS.cer

    • Import the certificate on the FAST Search Server, either in the Certificates MMC Console or by running Powershell again.
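For the PowerShell route, FS4SP ships helper scripts in its installer\scripts folder. If I remember correctly, the STS certificate exported above is installed with InstallSTSCertificateForClaims.ps1 (the path is a placeholder; please double-check the script name and parameters against the official deployment guide):

```powershell
# Run in the FS4SP shell on the FAST server, from [FASTSearchDirectory]\installer\scripts
.\InstallSTSCertificateForClaims.ps1 -certPath "C:\install\MOSS_STS.cer"
```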


    • [Update 03/10/2010] If you get a certificate error during the configuration (check the config log for the error message), don’t panic: close the FAST Search Configuration Wizard and start it again as Administrator (‘Run as Administrator’). It’ll work.

  3. Check InstallInfo.txt before deploying the FAST Service Applications. – Even if you’ve installed a lot of FAST servers and think you know every port number and setting, please check InstallInfo.txt. It contains all the URLs and port numbers for a proper deployment, and believe me: it’s worth the few seconds!
  4. Double-check all the URLs and port numbers when deploying the FAST Service Applications.
  5. When you’re done with the installation and configuration, create a FAST Search Center, and don’t forget to crawl your content!
    • [Update 03/11/2010] If the SharePoint content cannot be crawled by the FAST Content SSA, and the only thing you get instead of a result set is an error message like We did not find any results, but everything else (see point 6) looks fine, check the Firewall on the servers in the farm. You might also have an Unable to resolve Contentdistributor error in the Event Log.
  6. Once the content has been crawled, check the environment by querying something in the FAST Search Center. If you get an error, don’t panic! 🙂
    • The search request was unable to connect to the Search Service – This error means that SharePoint is unable to connect to the FAST engine. The cause can vary depending on your settings, so please check all of them:
      • Check the services with the command nctrl status.
      • Check the URLs and port numbers: compare if your Service Application settings fit to the InstallInfo.txt. Don’t forget to check the protocols as well (HTTP or HTTPS), as the port numbers can be different with the different protocols. And don’t forget to check both FAST Content SSA and FAST Query SSA.
      • Check if the FAST Query SSA is associated to your Web Application, and this is the default one.
      • Check if the certificate has been imported properly to the FAST server. Don’t copy it from another farm!
      • [Update 03/10/2010] Check [FASTSearchDirectory]\etc\contentdistributor.cfg for the Content Distributor URL and port number.
      • [Update 03/11/2010] Check if the FAST server has been restarted after the installation. If not, you’ll probably get an Unexpected error occurred while communicating with Administration Service error too when trying to configure the FAST settings in SharePoint 2010 (for example, keywords, user context, etc.). Restart it (again) if necessary.
    • Unable to display this Web part – This error message means that probably you have something misconfigured in FAST Search Server for SharePoint 2010.
      • Check the services with the command nctrl status.
      • Check the Managed Properties settings on the FAST Query SSA / FAST Search Administration page. If you get an error here, the FAST Search Server definitely doesn’t work properly and you should check it, not SharePoint. If you get the list of managed properties, the FAST Search Server component is definitely running.
      • Check the URLs and port numbers: compare if your Service Application settings fit to the InstallInfo.txt. Don’t forget to check the protocols as well (HTTP or HTTPS), as the port numbers can be different with the different protocols. And don’t forget to check both FAST Content SSA and FAST Query SSA.
    • [Update 03/11/2010] Unexpected error occurred while communicating with Administration Service – The FAST Search Server’s Administration Service cannot be reached by SP2010.
      • Check the services with the command nctrl status.
      • Check the URLs and port numbers: compare if your Service Application settings fit to the InstallInfo.txt. Don’t forget to check the protocols as well (HTTP or HTTPS), as the port numbers can be different with the different protocols.
      • Check if the FAST Server has been restarted after the installation. Restart it (again) if it’s necessary.
    • [Update 03/10/2010] If the Word/PowerPoint thumbnails and previews don’t work, check Office Web Apps (try to open some documents in the browser). Be careful, because on DCs you can have a strange issue: if ‘Edit in Browser’ works but ‘View in Browser’ doesn’t, check this article, run IISRESET, and you might also have to run the SharePoint 2010 Config Wizard. After these steps, Office Web Apps should start to work properly. You might also need a full crawl before it works on FAST result pages.
    • [Update 03/10/2010] Sometimes it looks like the FAST Search results come from an old index: the result set contains items that no longer exist and doesn’t contain some new items. If a new crawl doesn’t help, clear the FAST Content Collection by running this command in the FAST Search Server 2010 for SharePoint Management Shell: Clear-FASTSearchContentCollection -Name <ContentCollectionName>, and run an IISRESET. Of course, a new crawl will be needed to get a correct index again.
    • [Update 04/12/2010] If you get an error like Failed to communicate with WCF service. when running FAST admin functions (for example, Clear-FASTSearchContentCollection), check if you have FAST admin privileges on the server. If not, try to run the command again as a user with admin privileges.


Stay tuned, more details and updates are coming later, based on my experiences!


How to Test your FAST Search Deployment?

Recently I set up a farm where SharePoint 2010 (SP2010) and FAST Search Server 2010 for SharePoint (FS4SP) had to be installed on separate boxes. After a successful installation, it’s always useful to do some testing before indexing the production content sources. In the case of FS4SP, it’s much easier than you’d think.

First, you have to push some content to the content collection. Follow these steps:

  1. Create a new document anywhere on your local machine, for example C:\FAST_test.txt
  2. Fill some content into this document, for example: Hello world, this is my FAST Test doc.
  3. Save the document.
  4. Run the Microsoft FAST Search Server 2010 for SharePoint shell.
  5. Run the following command: docpush -c <collection name> "<full path to a file>" (in my case: docpush -c sp C:\FAST_test.txt) (See the full docpush reference here.)

If this command runs successfully, your document has been pushed to the FAST content collection and can be queried. The next step is to test some queries:

  1. Open a browser on the FAST server and go to http://localhost:[base_port+280]. If you used the default base port number (13000), go to http://localhost:13280. This is the FAST Query Language (FQL) test page, so you can run some test queries directly here.
  2. Search for a word contained in the document you’ve uploaded (C:\FAST_test.txt). For example, search for the word ‘world’ or ‘FAST’. The result set should contain the document you pushed to the content collection.
  3. You can also set other parameters on the FQL test page, for example the language setting, debug info, etc.
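The port arithmetic in step 1 is easy to get wrong, so here it is spelled out as a tiny POSIX shell sketch (13000 is the default base port chosen at install time):

```shell
# QRServer's FQL test page listens on base_port + 280.
BASE_PORT=13000
QR_PORT=$((BASE_PORT + 280))
echo "http://localhost:${QR_PORT}"   # → http://localhost:13280
```

If your deployment uses a non-default base port, substitute it and the test page moves accordingly.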

FAST Search Query Language (FQL) test page

But this site (http://localhost:13280) is much more than a simple FQL test page. The top navigation offers other useful functions too:

  • Log
  • Configuration
  • Control
  • Statistics
  • Exclusion List
  • Reference

I’ll dive deeper into these functions in a later post. Stay tuned!


My Favorite SharePoint 2010 Search Features

Recently, I’ve published my first article on EUSP2010. This is the first article of my series about SharePoint Search, including FAST Search Server 2010 for SharePoint.
Although MOSS 2007 also has very powerful Search capabilities, the SharePoint 2010 improvements are very impressive. In this article I enumerate the most important capabilities of SharePoint 2010 Search, not including FAST Search; that will be the topic of the next article in this series.
But first of all, let’s see the most important SharePoint 2010 Search improvements for end users:
  1. Rich User Interface with Refinement Panel
  2. Boolean Query Syntax
  3. Suggestions while typing
  4. ‘Did you mean’ suggestions
  5. Federated results
  6. ‘View in Browser’ for Office documents
  7. Improved People Search
  8. SharePoint Search engine as a Federated Location in Windows 7
More details and screenshots can be found in the full article on EUSP2010.

Installing FAST Search Server for SharePoint 2010

No, this time I won’t write a full step-by-step install guide for FAST Search Server 2010 for SharePoint. If you’d like a detailed install guide, go to the Microsoft Download Center and download B2FASTSearchDeploy.xps from there. It’s a very detailed, 59-page guide. Ohhhhhhh, is it too long? Would you like a shorter summary? – Yes, you can find shorter summaries around the blogosphere. BUT please be careful, because they generally miss some very important steps, so the end result won’t be what you expect. And if you’ve already installed and configured FAST Search Server 2010 for SharePoint by following one of them, verify your setup against the official guide.
So, the only install guide with a 100% success rate is the detailed, 59-page one from Microsoft. I know, it’s not easy to follow all those steps without making a single mistake. I made mistakes too. And I suffered through web part errors and empty result sets as well. I was going crazy. I was ready to destroy every computer around me and find a new job without bits on a deserted island. Yes, I was really upset. I think you can imagine that…
Finally, I got some ideas and help from the Microsoft FAST Team. Here are the steps you should check if you see errors or unexpected symptoms after the installation:
  • Check FAST Query SSA server URLs, port numbers and protocols.
  • FAST Query SSA should be set as default for your webapp.
  • Check %FASTSEARCH%\var\log\syslog for logs, especially qrproxy.log if it exists.
  • Try a local query on the FAST Search Server 2010 Query node (http://localhost:13280)
  • Run nctrl status to check that all required processes are running.
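The last three checks can be run from a command prompt on the FAST server. A sketch (paths as in the checklist; %FASTSEARCH% is the FAST install root):

```shell
rem Run on the FAST Search server.
nctrl status
rem Inspect the query proxy log, if it exists:
type "%FASTSEARCH%\var\log\syslog\qrproxy.log"
rem Open the local QRServer test page (default base port assumed):
start http://localhost:13280
```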
To be honest, in my case all of these checks passed without any error. Finally, I noticed that the command New-FASTSearchSecurityClaimsUserStore -id win hadn’t run successfully; it gave me this error:
New-FASTSearchSecurityClaimsUserStore : Unable to create new Claims content: Could not connect to net.tcp://localhost:13278/ConfigManager. The connection attempt lasted for a time span of 00:00:02.0252610. TCP error code 10061: No connection could be made because the target machine actively refused it
In the Event log, I found this:
Log Name:      FAST Search
Source:        FAST Search QRServer
Date:          12/8/2009 11:37:11 AM
Event ID:      1000
Task Category: None
Level:         Error
Keywords:      Classic
User:          N/A
Computer:      SP2010.demo2010.local
Description:   Admin error occurred during loading of keyword data. The keyword cache may be stale. Exception Message: Failed to communicate with the WCF service.
OK, let’s check whether all of the required services are running, including the IIS application pools (e.g. SharePoint Web Service).
If they are all running, check that the following folders exist under the FAST install root:
  • data
  • data\data_security
  • data\data_security\admin
  • data\data_security\worker
In my case, these folders were missing. Yes, they should have been created manually, but unfortunately I missed this step during the configuration. What could I do? I stopped the SAMAdmin and SAMWorker services, created the required folders, then restarted the services.
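The fix itself is just "create the missing tree, bounce the SAM services". The folder-creation part, as a generic POSIX shell sketch (the /tmp root is a stand-in for the real FAST install directory; on the actual server you’d stop SAMAdmin and SAMWorker first and restart them afterwards):

```shell
# Recreate the missing security folders under the FAST root.
FASTROOT=/tmp/fast_demo   # placeholder for the real FAST install dir
for d in data/data_security/admin data/data_security/worker; do
  mkdir -p "$FASTROOT/$d"   # -p also creates data and data_security
done
ls -d "$FASTROOT"/data/data_security/*
```

mkdir -p creates the whole chain, so you don’t need to create data and data_security separately.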
And voila, everything started working!
The world is nice and good again… More posts about FAST Search are coming soon, stay tuned!