
Automated sitemap creation to make your SharePoint 2010 public site more searchable

In my last article we started on the subject of making your site more searchable and covered the concepts behind creating an XML sitemap. In this part I will dig into implementing the site crawl using a custom SharePoint 2010 service job.

One advantage of using a service job is that you can schedule (and later change the schedule of) how often your site is crawled, and you get a nice report of your crawl executions.

The interface for service jobs has been around since MOSS; in 2010 we get a few improvements, including an out-of-the-box UI to change the schedule or force the job to run.
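If you ever want to trigger a run from code rather than from that UI, SPJobDefinition exposes a RunNow method in 2010. The snippet below is only a minimal sketch: the site URL is a placeholder, and "MyServiceJobId" is the internal job name we will register later in this article.

using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;

// Minimal sketch: force an installed job to run once, right away (SharePoint 2010).
// The site URL is a placeholder; "MyServiceJobId" must match the installed job's name.
using (SPSite site = new SPSite("http://localhost"))
{
    SPWebApplication webApplication = site.WebApplication;
    SPJobDefinition jobDefinition = webApplication.JobDefinitions["MyServiceJobId"];
    if (jobDefinition != null)
    {
        jobDefinition.RunNow();
    }
}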

Each job has its own feature that takes care of installing the job; there is also an execution class that defines the job and what it’s supposed to do. In this article we will create the job, and the actual crawling mechanism will be implemented in the last part of this article series.

Let’s get started.

1. In your Visual Studio project, locate the Features folder and create a new feature in it.
2. Ensure your feature is scoped to Web Application.
3. Right-click the name of the newly created feature and select Add Event Receiver.
4. Switch to the code of your newly created event receiver.
5. Add the following namespace references into the appropriate section:
using Microsoft.SharePoint.Administration;
using System.Diagnostics;
6. Locate your FeatureActivated method and place the following code in it:
public override void FeatureActivated(SPFeatureReceiverProperties properties)
{
    SPWebApplication webApplication = (SPWebApplication)properties.Feature.Parent;

    // The lookup name must match the name passed to the MyJobDefinition
    // constructor below ("MyServiceJobId"), not the job's display title.
    SPJobDefinition jobDefinition = webApplication.JobDefinitions["MyServiceJobId"];
    if (jobDefinition != null)
    {
        jobDefinition.Delete();
    }

    try
    {
        MyJobDefinition myJobDefinition =
            new MyJobDefinition("MyServiceJobId", webApplication);

        // Run the job every minute.
        SPMinuteSchedule minuteSchedule = new SPMinuteSchedule();
        minuteSchedule.BeginSecond = 1;
        minuteSchedule.EndSecond = 59;
        minuteSchedule.Interval = 1;

        myJobDefinition.Schedule = minuteSchedule;
        myJobDefinition.Update();
    }
    catch (Exception ex)
    {
        Debug.Write("Exception in FeatureActivated occurred: " + ex.Message);
    }
}

In the code above, we first delete an existing instance of our job definition if, for whatever reason, it is still there when we activate the feature; webApplication.JobDefinitions gives us access to the job definitions registered on the web application. Then we create an instance of our custom job definition MyJobDefinition, which we’re about to add below. Per the SPMinuteSchedule we created, our job will run every minute.
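A one-minute interval is handy while testing, but for a production public site you will probably want to crawl much less often. The schedule type can simply be swapped; here is a hedged sketch that attaches a daily schedule to the same myJobDefinition instance (the 2:00 to 2:05 AM window is just an example):

// Alternative to SPMinuteSchedule: run the job once a day, somewhere between 2:00 and 2:05 AM.
SPDailySchedule dailySchedule = new SPDailySchedule();
dailySchedule.BeginHour = 2;
dailySchedule.BeginMinute = 0;
dailySchedule.BeginSecond = 0;
dailySchedule.EndHour = 2;
dailySchedule.EndMinute = 5;
dailySchedule.EndSecond = 0;
myJobDefinition.Schedule = dailySchedule;
myJobDefinition.Update();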

7. Now, let’s add a custom job definition class. Create a new folder called ServiceJobs under the root of your Visual Studio project.

8. Add a new class called MyJobDefinition.cs to the folder, and open the code of the newly created class.

9. Ensure you have added the following namespace references:

using Microsoft.SharePoint.Administration;
using System.Diagnostics;

10. Replace the default definition of the class with the following code:

public class MyJobDefinition : SPJobDefinition
{
    // Parameterless constructor, required so the timer service can deserialize the job.
    public MyJobDefinition() : base()
    {
    }

    public MyJobDefinition(string jobName, SPWebApplication webApplication)
        : base(jobName, webApplication, null, SPJobLockType.Job)
    {
        this.Title = "My Service Job Name";
    }

    public override void Execute(Guid targetInstanceId)
    {
        try
        {
            SPWebApplication webApplication = this.Parent as SPWebApplication;

            // TODO: Implement the crawler code in the next article
        }
        catch (Exception ex)
        {
            Debug.Write("Problem during service job execution: " + ex.Message);
        }
    }
}

That’s it. If you were to deploy your Visual Studio solution now, the job would be deployed with the name you have specified.
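One more housekeeping note: when the feature is deactivated or the solution is retracted, it is good practice to remove the job again. A minimal FeatureDeactivating sketch for the same event receiver could look like this (it assumes the job name "MyServiceJobId" used in FeatureActivated):

public override void FeatureDeactivating(SPFeatureReceiverProperties properties)
{
    SPWebApplication webApplication = (SPWebApplication)properties.Feature.Parent;

    // Remove the job if it is still registered; the name must match the one
    // used in FeatureActivated ("MyServiceJobId").
    SPJobDefinition jobDefinition = webApplication.JobDefinitions["MyServiceJobId"];
    if (jobDefinition != null)
    {
        jobDefinition.Delete();
    }
}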

In my next article we will implement the Execute part of the job that will actually perform the crawl and sitemap creation.

Stay tuned...