New SharePoint Kompendium Magazine Article About The ECS Web Service Designer For SharePoint

In the last issue of the SharePoint Kompendium magazine (Band 6, 2014) I published an article in German about the new ECS Web Service Designer for SharePoint 2010/2013 from Theobald Software. The application is a desktop Windows application that allows you to visually design SAP-driven web services for SharePoint.


How To Integrate SAP Business Data Into SharePoint 2010 Using BCS Services And BCS Connector From Theobald Software

The Business Connectivity Services (BCS) of SharePoint 2010 provide a great way to fully integrate external data into SharePoint. In most cases developers integrate SQL database tables into the BCS services. But how do we connect to an SAP system? How do we integrate real-world SAP business data such as tables, function modules or BAPIs into SharePoint?

The answer is just a few clicks away. Theobald Software has just released the ERPConnect Services (ECS) for SharePoint 2010 product suite, which includes a great tool named BCS Connector. The BCS Connector allows developers to create SAP-driven BDC models in minutes with just a couple of clicks.

The BCS Connector application can be installed on client machines and connects remotely to the SharePoint BCS service application. In this post I will give you an overview of the tool by creating a BDC model with an entity called Customer. We will also create two operations for the entity, a Finder and a SpecificFinder method. Both methods use the built-in SAP function module SD_RFC_CUSTOMER_GET, a very simple SAP function that returns a list of customers.

To create a new BDC model, first enter the SAP and SharePoint connection data after starting the BCS Connector application (see the screenshot of the SAP connection below).

Once you have entered the SAP connection data, press the OK button and start adding your new BDC entity. To add an entity that is connected to the SAP function module SD_RFC_CUSTOMER_GET, press the New button on the lower left side. A new wizard dialog will pop up. Select Function and press the Next button.

Then search for the function module and press the Next button to select the structure of our new BDC entity.

The last wizard page shows a list of all possible entity structures available for this function module. Select the table CUSTOMER_T, which contains our customer data, and press the Finish button.

We have now created our new entity, but we still need to rename it from CUSTOMER_T to Customer. Each entity in the BCS services must define at least two operations or methods: a so-called Finder method that returns a list of entities and a SpecificFinder method that returns a specific entity from that list.

You also need to define the identifier field within the entity by clicking the checkbox for the field KUNNR (Customer ID). You may also rename this field or any of the other fields. Next, we create the Finder method by clicking the New button.

The Finder option is already selected. Just press the Finish button; the BCS Connector automatically creates everything else and afterwards opens the Edit Operation dialog.

This dialog allows you to define the return parameter, input parameters and filters for the entity operation. To execute the SAP function module SD_RFC_CUSTOMER_GET we need to define a value for the input parameter KUNNR or NAME1. For demonstration purposes I define a name pattern for the field NAME1; this query returns all customers whose names start with T. Which input parameters you can define depends on the function itself. Clicking the Preview button displays a list of all filtered customers.

In the same way we create the SpecificFinder method:

We have now created a new entity with two entity operations and are able to save it to the SharePoint server. Just press the Save Model button; this creates a new BDC model on the server:

You can find the BDC models within the Central Administration of SharePoint 2010.

So far we have only created a model, but we also want to display the customer data within an external list. We can create an external list using either the SharePoint Designer or the BCS Connector. I will show you the second option. Switch to the External Lists tab of the ribbon bar and click the New External List button.

The New External List dialog has all values pre-selected. You may change the name of the external list; then click the Create button and you are done. The final external list looks as follows:

That was really easy, and you can even export the BDC model to Visual Studio 2010 for additional customizing.

Further information about the product ERPConnect Services and BCS Connector can be found here:

http://www.theobald-software.com/en/products/erpconnectservices.htm

FluentSP – The Fluent SharePoint API

Download FluentSP 1.0 from Codeplex.com

More information on my new homepage at http://www.parago.net

If you do a lot of SharePoint programming you know that you often have to write lengthy pieces of code to implement simple tasks like querying SharePoint lists. Nowadays you read a lot about fluent APIs or fluent interfaces. jQuery, for instance, is a JavaScript library that successfully introduced a fluent API to handle the hierarchical structure of HTML documents.

Today, I want to introduce a small library I have developed, FluentSP, a modern fluent interface around the classic SharePoint 2010 API. By using FluentSP instead of the classic SharePoint API, you will be able to chain methods and act on sets of items of the underlying SharePoint objects.

What is a fluent API?
Check out the CodeProject article A Look at Fluent APIs and the Wikipedia article Fluent interface.

To enter the fluent API you call the Use() method on SPSite, SPWeb, SPWebCollection or SPListCollection. The Use() method is implemented as an extension method that returns the entry facade object (see the facade table below). Another entry point to the fluent API is the static class FluentSP with its static methods CurrentSite, CurrentWeb, CurrentLists and RootWebLists.

SPContext.Current.Site.Use()... // => Returns the SPSiteFacade as entry point

// OR:
FluentSP.CurrentSite()...       // => Returns the SPSiteFacade as entry point 

Using the entry facade instance you can start chaining the available facade methods as follows:

FluentSP.CurrentSite().Web("Home").List("Tasks").Items().ForEach(i => { /* Do something with the item i of type SPListItem... */ });

// OR:
FluentSP.CurrentSite()
  .Web("Home")
    .List("Tasks")
      .Items()
      .ForEach(i => { /* Do something with... */ });

Each facade object wraps an underlying data item; for instance, the SPSiteFacade class is the fluent wrapper of the SPSite class. Depending on which facade method you call, the method returns either the current facade instance (e.g. ForEach() or Where()) or a new child facade object (e.g. Items()). By chaining methods in this way you build up a tree or hierarchy of facade instances. To step back to the parent or previous facade instance you call the End() method:

site.Use()
       .RootWeb()
         .Site()
       .End()		// Returns SPWebFacade  as parent facade
         .Site()
       .End()		// Returns SPWebFacade  as parent facade
     .End();		// Returns SPSiteFacade as parent facade

FluentSP is currently missing a number of potentially useful methods, but you can easily extend the FluentSP API with custom facade classes and extension methods; see below and the source code for implementation examples.

Samples

SPSite site = SPContext.Current.Site;

// ----------------------------

// Outputs titles of all lists of the root web where the list title starts with T
site.Use().RootWeb().Lists().Where(l => l.Title.StartsWith("T")).ForEach(l => Console.WriteLine(l.Title));

// Outputs titles of all lists of the root web where the list title ends with ts (using a RegEx pattern)
int c;
site.Use().RootWeb().Lists("ts$").ForEach(l => Console.WriteLine(l.Title)).Count(out c);

// Outputs titles of all lists of the root web in ascending order where the title starts with T
site.Use().RootWeb().Lists().Where(l => l.Title.StartsWith("T")).OrderBy(l => l.Title).ForEach(l => Console.WriteLine(l.Title));

// Outputs titles of all lists of the root web in descending order where the title starts with T
site.Use()
    .RootWeb()
      .Lists()
      .Where(l => l.Title.StartsWith("T"))
      .OrderByDescending(l => l.Title)
      .ForEach(l => Console.WriteLine(l.Title));

// ----------------------------

// Delete all items in the Members list, then add 7 new members and then select and output 
// the titles of a few of the newly created items
site.Use()
    .RootWeb()
      .List("Members")
      .Do(w => Console.WriteLine("Deleting all members..."))
       .Items()
       .Delete()
      .End()
      .Do(w => Console.WriteLine("Adding all members..."))
      .AddItems(7, (i, c) => i["Title"] = "Member " + c)
       .Items()
       .Skip(2)
       .TakeUntil(i => ((string)i["Title"]).EndsWith("6"))
       .ForEach(i => Console.WriteLine(i["Title"]));

// ----------------------------

// Search for lists that are created by a specific user and depending on the results
// displays different messages by calling the IfAny or IfEmpty methods
site.Use()
    .RootWeb()
      .Lists()
      .ThatAreCreatedBy("Unknown User")
      .IfAny(f => f.ForEach(l => Console.WriteLine(l.Title)))
      .IfAny(l => l.Title.StartsWith("M"), f => Console.WriteLine("Lists found that start with M*"))
      .IfEmpty(f => Console.WriteLine("No lists found for user"))
    .End()
    .Do(w => Console.WriteLine("---"))
      .Lists()
      .ThatAreCreatedBy("System Account")
      .IfAny(f => f.ForEach(l => Console.WriteLine(l.Title)));

// ----------------------------

var items = new List<SPListItem>();

// Query with Skip and TakeUntil methods
site.Use().RootWeb().List("Members").Items().Skip(2).TakeUntil(i => i.Title.EndsWith("5")).ForEach(i => { items.Add(i); Console.WriteLine(i.Title); });

// Query with Skip and TakeWhile methods
site.Use()
    .RootWeb()
      .List("Members")
       .Items()
       .Skip(2)
       .TakeWhile(i => i.Title.StartsWith("Member"))
       .ForEach(i => { items.Add(i); Console.WriteLine(i.Title); })
      .End()
       .Items()
       .Where(i => i.Title == "XYZ")
       .ForEach(i => { items.Add(i); Console.WriteLine(i.Title); });

// ----------------------------

// Adds new items using the Do method with the passed facade object
site.Use()
    .RootWeb()
    .AllowUnsafeUpdates()
      .List("Members")
      .Do((f, l) => {
        for(int c = 1; c <= 5; c++)
          f.AddItem(i => i["Title"] = "Standard Member #" + c);
      })
      .AddItem(i => i["Title"] = "Premium Member")
       .Items()
        .OrderBy(i => i.Title)
        .ForEach(i => Console.WriteLine(i["Title"]));

Extensibility Samples

// This sample uses the ThatAreCreatedBy extension method defined in Extensions.cs to show how to extend the fluent API
site.Use()
        .RootWeb()
          .Lists()
          .ThatAreCreatedBy("System Account", "jbaurle")
          .Count(c => Console.WriteLine("Lists found: {0}", c))
          .ForEach(l => Console.WriteLine(l.Title));

// This sample uses the new SPWebApplicationFacade extension defined in SPWebApplicationFacade.cs to show how to extend the fluent API
site.WebApplication.Use()
              .Sites()
              .ForEach(i => Console.WriteLine(i.Url));

// This sample uses an alternative implementation for SPSiteFacade defined in SPSiteFacadeAlternate.cs to show how to extend the fluent API
site.WebApplication.Use().WithFirstSite().DoSomething();
site.Use<SPSiteFacadeAlternate<BaseFacade>>().DoSomething();

The custom method ThatAreCreatedBy, which is used in the first query of the extensibility samples, is implemented as follows:

static class Extensions
{
  public static SPListCollectionFacade<TParentFacade> ThatAreCreatedBy<TParentFacade>(this SPListCollectionFacade<TParentFacade> facade, params string[] names)
    where TParentFacade : BaseFacade
  {
    // NOTE: This sample uses the GetCollection method of the given facade instance to retrieve the current 
    // collection and adds its query to it (see LINQ deferred execution). The Set method updates the 
    // underlying collection. The GetCurrentFacade method then returns the current facade to allow 
    // method chaining.

    if(names.Length > 0)
      facade.Set(facade.GetCollection().Where(i => names.Contains(i.Author.Name)));

    return facade.GetCurrentFacade();
  }
}

For more samples and details check out the source code, which you can download from Codeplex.

Built-In Facades and Methods

See Codeplex

How To Implement A Generic Template Engine For SharePoint 2010 Using DotLiquid


During the process of creating a complex SharePoint application you often need to send mails and create text files based on SharePoint data elements like SPListItem or SPWeb. Mail templates, for instance, mostly contain specific list item data. It would sometimes be helpful if the text generation itself were template-driven.

This article shows how to implement a generic template manager based on the free DotLiquid templating system with SharePoint-specific extensions. This allows you, for example, to iterate through all SharePoint lists available within a site collection and render details only for lists whose title contains Task:

<p>All task lists for the current web '{{SP.Web.Title}}' and site '{{SP.Site.Url}}'
  <ul>
    {% for list in SP.Lists %}
      {% if list.Title contains 'Task' %} 
        <li><i>{{list.Title}}</i> with ID '{{list.ID|upcase}}' <i>(Created: 
                                                     {{list.Created|sp_format_date}})</i></li>
      {% endif %}
    {% endfor %}
  </ul>
</p>
<p>All lists for current web '{{SP.Site.RootWeb.Title}}' created by the 'splists' tag
  <ul>{% splists '<li>{0}</li>' %}</ul>
</p>

The screenshot below shows the result of the rendered template sample:

[Screenshot: rendered template output]

Of course, the technique implemented in this article can also be used in conjunction with other technologies or applications; it is not only SharePoint-related.

DotLiquid Template Engine

The DotLiquid template engine is a C# port of Ruby's Liquid templating system and is available for .NET 3.5 and above. DotLiquid is open source and can be downloaded at dotliquidmarkup.org. The software is also available as a NuGet package for Visual Studio.

The templating system includes features like variables, text replacement, conditional evaluation and loop statements that are similar to those of common programming languages. The language elements consist of tags and filter constructs.

The engine can also easily be extended by implementing and adding custom filters and/or tags. This article shows how to extend DotLiquid and implement the SharePoint-specific parts.

The following sample shows a Liquid template file:

<p>{{ user.name | upcase }} has to do:</p>

<ul>
{% for item in user.tasks -%}
  <li>{{ item.name }}</li>
{% endfor -%}
</ul>

Output markup is surrounded by double curly brackets {{…}} and tag markup by {%…%}. Output markup can take filter definitions like upcase. Filters are simple static methods, where the first parameter is always the output of the left side of the filter and the return value becomes the new left value when the next filter is run. When there are no more filters, the template receives the resulting string.
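
A custom filter is nothing more than a public static method on a class that is registered with the engine. The following minimal sketch (the TextFilters class and its shorten filter are hypothetical examples, not part of the solution developed later) shows the pattern; with DotLiquid's default naming convention the method Shorten is addressed as shorten in templates:

public static class TextFilters
{
  // The first parameter is always the value on the left side of the filter;
  // additional template arguments are passed as further parameters.
  public static string Shorten(string input, int length)
  {
    if(string.IsNullOrEmpty(input) || input.Length <= length)
      return input;

    return input.Substring(0, length) + "...";
  }
}

// Register the filter class once, e.g. at application start:
Template.RegisterFilter(typeof(TextFilters));

// Usage in a template: {{ user.name | shorten: 10 }}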

There is a large number of standard filters available, but later on we will implement a custom filter method for SharePoint. The rendered result of the above template looks like this:

<p>TIM JONES has to do:</p>

<ul>
  <li>Documentation</li>
  <li>Code comments</li>
</ul>

To pass variables and render the template you first need to parse the template and then just call the Render method with the variable values:

string templateCode = @"<ul>
{% for item in user.tasks -%}
  <li>{{ item.name }}</li>
{% endfor -%}
</ul>";

Template template = Template.Parse(templateCode);

string result = template.Render(Hash.FromAnonymousObject(new {
                                    user = new User
                                    {
                                      Name = "Tim Jones",
                                      Tasks = new List<Task> {
                                        new Task { Name = "Documentation" },
                                        new Task { Name = "Code comments" }
                                      }
                                    }}));

public class User : Drop
{
	public string Name { get; set; }
	public List<Task> Tasks { get; set; }
}

public class Task : Drop
{
	public string Name { get; set;	 }
}

The User and Task classes inherit from the Drop class, an important class in DotLiquid. The next section explains the class in more detail. It is out of the scope of this article to discuss all features of DotLiquid in detail. For more information please see the homepage of DotLiquid (dotliquidmarkup.org) or the website of the original creator of the Liquid template language at liquidmarkup.org. There you will find a lot of manuals and sample code.

Template Manager

The TemplateManager class is a wrapper around the DotLiquid template engine and provides the SharePoint support. The class allows you to cache parsed templates, register tags and filters, and render templates using a top-level custom Drop class named SharePointDrop:

internal class TemplateManager
{
  public Dictionary<string, Template> Templates { get; protected set; }

  public TemplateManager()
  {
    Templates = new Dictionary<string, Template>();
  }

  public void AddTemplate(string name, string template)
  {
    if(string.IsNullOrEmpty(name))
      throw new ArgumentNullException("name");
    if(string.IsNullOrEmpty(template))
      throw new ArgumentNullException("template");

    if(Templates.ContainsKey(name))
      Templates[name] = Template.Parse(template);
    else
      Templates.Add(name, Template.Parse(template));
  }

  public void RegisterTag<T>(string tagName) where T : Tag, new()
  {
    Template.RegisterTag<T>(tagName);
  }

  public void RegisterFilter(Type type)
  {
    Template.RegisterFilter(type);
  }

  public string Render(string nameOrTemplate, IDictionary<string, object> values)
  {
    Template template;

    if(Templates.ContainsKey(nameOrTemplate))
      template = Templates[nameOrTemplate];
    else
      template = Template.Parse(nameOrTemplate);

    SharePointDrop sp = new SharePointDrop();

    if(values != null)
    {
      foreach(KeyValuePair<string, object> kvp in values)
        sp.AddValue(kvp.Key, kvp.Value);
    }

    return template.Render(new RenderParameters { LocalVariables = 
             Hash.FromAnonymousObject(new { SP = sp }), RethrowErrors = true });
  }
}

The Render method uses the SharePointDrop class to support objects like SPListItem or SPListCollection. The Drop class, a key concept of DotLiquid, needs to be explained in more detail. The DotLiquid template engine focuses on making templates safe. A Drop is a class that allows you to export DOM-like objects. By default, DotLiquid only accepts a limited number of types as parameters to the Render method. These data types include the .NET primitive types (integer, float, string, etc.) and some collection types including IDictionary, IList and IIndexable (a custom DotLiquid interface).

If DotLiquid supported arbitrary types, properties or methods could be unintentionally exposed to template authors. To prevent this, the DotLiquid templating system uses Drop classes, which take an opt-in approach to exposing object data.

The following code shows the SharePointDrop implementation:

internal class SharePointDrop : Drop
{
  Dictionary<string, object> _values;

  public SharePointDrop()
  {
    _values = new Dictionary<string, object>();

    if(SPContext.Current != null)
    {
      _values.Add("Site", SPContext.Current.Site);
      _values.Add("Web", SPContext.Current.Web);
      _values.Add("User", SPContext.Current.Web.CurrentUser);
    }

    _values.Add("Date", DateTime.Now);
    _values.Add("DateISO8601",
                   SPUtility.CreateISO8601DateTimeFromSystemDateTime(DateTime.Now));

    // TODO: Add more default values
  }

  public void AddValue(string name, object value)
  {
    if(string.IsNullOrEmpty(name))
      throw new ArgumentNullException("name");

    if(_values.ContainsKey(name))
      _values[name] = value;
    else
      _values.Add(name, value);
  }

  public override object BeforeMethod(string method)
  {
    if(!string.IsNullOrEmpty(method) && _values.ContainsKey(method))
      return DropHelper.MayConvertToDrop(_values[method]);

    return null;
  }
}

The main objective of the SharePointDrop class is to solve the problem of converting unsupported data types like SPListItem or SPListItemCollection and other SharePoint-related types. Therefore the class overrides the BeforeMethod method of the Drop class to analyze the requested variable value. If the variable is available in the value context, the method tries to convert the data type to a known Drop type by calling the MayConvertToDrop method of the DropHelper class:

public static object MayConvertToDrop(object value)
{
  if(value != null)
  {
    // TODO: Add your own drop implementations here

    if(value is SPList)
      return new SPPropertyDrop(value);
    if(value is SPListCollection)
      return ConvertDropableList<SPPropertyDrop, SPList>(value as ICollection);
    if(value is SPListItem)
      return new SPListItemDrop(value as SPListItem);
    if(value is SPListItemCollection)
      return ConvertDropableList<SPListItemDrop, SPListItem>(value as ICollection);
    if(value is SPWeb)
      return new SPPropertyDrop(value);
    if(value is SPSite)
      return new SPPropertyDrop(value);
    if(value is SPUser)
      return new SPPropertyDrop(value);
    if(value is Uri)
      return ((Uri)value).ToString();
    if(value is Guid)
      return ((Guid)value).ToString("B");
  }

  return value;
}

The SPListItemDrop class, for instance, returns the value of the requested field:

internal class SPListItemDrop : SPDropBase
{
  public SPListItem ListItem { get { return DropableObject as SPListItem; } }

  public SPListItemDrop()
  {
  }

  public SPListItemDrop(SPListItem listItem)
  {
    DropableObject = listItem;
  }

  public override object BeforeMethod(string method)
  {
    if(!string.IsNullOrEmpty(method))
    {
      StringBuilder sb = new StringBuilder();
      string name = method + "\n";

      for(int i = 0; i < name.Length; i++)
      {
        if(name[i] == '\n')
          continue;
        if(name[i] == '_')
        {
          if(name[i + 1] != '_')
            sb.Append(' ');
          else
          {
            i++;
            sb.Append('_');
          }
        }
        else
          sb.Append(name[i]);
      }

      name = sb.ToString();

      if(ListItem.Fields.ContainsField(name))
        return DropHelper.MayConvertToDrop(ListItem[name]);
    }

    return null;
  }
}

The method parameter of BeforeMethod (the field name of the SPListItem) can contain underscores, which are replaced by spaces. So field names with spaces, like Start Date of a task item, must be written in the template as {{task.Start_Date}}.

The SPPropertyDrop class, also part of the solution of this article, is a generic Drop implementation that exposes all properties of an object and, if needed, converts them into Drop objects again. For implementation details see the source code.
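
As a rough sketch (assuming the class simply reflects over the public instance properties of the wrapped object; the actual implementation in the download may differ), SPPropertyDrop could look like this:

internal class SPPropertyDrop : Drop
{
  object _instance;

  public SPPropertyDrop(object instance)
  {
    _instance = instance;
  }

  public override object BeforeMethod(string method)
  {
    if(_instance == null || string.IsNullOrEmpty(method))
      return null;

    // Look up a public instance property with the requested name (requires
    // System.Reflection) and convert the value into a Drop again if needed.
    PropertyInfo property = _instance.GetType().GetProperty(method,
      BindingFlags.Public | BindingFlags.Instance);

    return property == null ? null : DropHelper.MayConvertToDrop(property.GetValue(_instance, null));
  }
}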

Filters and Tags

The solution also provides a custom filter and a custom tag implementation. The filter called sp_format_date (see template above) is implemented by the method SPFormatDate and calls the FormatDate method of the SPUtility class from the SharePoint API:

internal static class SPFilters
{
  public static object SPFormatDate(object input)
  {
    DateTime dt = DateTime.MinValue;

    if(input is string)
    {
      try
      {
        dt = SPUtility.ParseDate(SPContext.Current.Web, input as string, 
               SPDateFormat.DateOnly, false);
      }
      catch { }
    }
    else if(input is DateTime)
      dt = (DateTime)input;

    if(dt != DateTime.MinValue && dt != DateTime.MaxValue && SPContext.Current != null)
      return SPUtility.FormatDate(SPContext.Current.Web, dt, 
               SPDateFormat.DateOnly);

    return input;
  }
}

The custom tag named splists returns a formatted list of all SPList names of the current web (see template above):

internal class SPListsTag : Tag
{
  string _format;

  public override void Initialize(string tagName, string markup, List<string> tokens)
  {
    base.Initialize(tagName, markup, tokens);

    if(string.IsNullOrEmpty(markup))
      _format = "{0}";
    else
      _format = markup.Trim().Trim("\"".ToCharArray()).Trim("'".ToCharArray());
  }

  public override void Render(Context context, StreamWriter result)
  {
    base.Render(context, result);

    if(SPContext.Current != null && !string.IsNullOrEmpty(_format))
    {
      foreach(SPList list in SPContext.Current.Web.Lists)
        result.Write(string.Format(_format, list.Title));
    }
  }
}
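
To put the pieces together, the TemplateManager, the custom tag and the custom filter can be wired up as in the following sketch. The template name TaskOverview and the extra value Company are just examples; templateCode is assumed to hold the Liquid markup shown at the beginning of the article, and extra values are exposed to the template through the SP variable (e.g. {{SP.Company}}):

TemplateManager manager = new TemplateManager();

// Register the SharePoint-specific extensions shown above.
manager.RegisterTag<SPListsTag>("splists");
manager.RegisterFilter(typeof(SPFilters));

// Parse and cache the template under a name, then render it with an additional value.
manager.AddTemplate("TaskOverview", templateCode);

string html = manager.Render("TaskOverview",
  new Dictionary<string, object> { { "Company", "Parago" } });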

Download Source Code | Download Article (PDF)

How To Use SharePoint 2010 Secure Store As Single Sign-On Service For SAP Applications Using ERPConnect


The Secure Store Service in SharePoint 2010 replaces the Single Sign-on Shared Service of MOSS 2007 and provides an easy way to map user credentials of external resources like SAP systems to Windows users. During the process of developing SAP interfaces using the very handy ERPConnect library from Theobald Software you have to open an R3 connection to the SAP system using SAP account credentials (username and password).

In most cases you will use a so-called technical user with limited access rights to execute or query objects in SAP, but an SAP system stores a lot of sensitive data that cannot be shown to all users. So creating a new secure store in SharePoint 2010 to save the SAP user credentials is the solution. Accessing the secure store from program code is quite simple.

A trial version of the ERPConnect library can be downloaded at www.theobald-software.com.

Secure Store Configuration

The Secure Store Service is managed in the Central Administration (CA) of SharePoint 2010 under Application Management > Manage service applications > Secure Store Service:

[Screenshot]

As the screenshot above shows, it is possible to create multiple target applications within one Secure Store Service.

Clicking the New button opens the Create New Secure Store Target Application page. In this dialog you have to enter the new Target Application ID, a display name, a contact email address and other application-related details (see screenshot below).

[Screenshot]

Next, the application fields must be defined:

[Screenshot]

It's important to select the field types User Name and Password, because our implementation will later check the target application for those two field types.

In the last dialog step the application administrator must be defined. After defining the administrator and clicking the OK button, SharePoint creates the new secure store:

[Screenshot]

Next, the Windows users must be mapped to the SAP user credentials. To do this, mark the checkbox for the newly created secure store SAPCredentialsStore and click the Set button in the toolbar. This opens the following dialog:

[Screenshot]

The Credential Owner is the Windows user for whom the SAP user credentials will be set. Enter the SAP username and password and click the OK button to save them.

That's it!

Secure Store Programming

Accessing the secure store by code is simple. We implement a SecureStore class that encapsulates all the access logic. A second class called SecureStoreCredentials contains the retrieved user credentials returned by the GetCurrentCredentials method of the SecureStore class.
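
The SecureStoreCredentials class is just a small container for the two values; a minimal sketch (the class in the downloadable source may expose additional members) looks like this:

internal class SecureStoreCredentials
{
  public string UserName { get; private set; }
  public string Password { get; private set; }

  public SecureStoreCredentials(string userName, string password)
  {
    if(string.IsNullOrEmpty(userName))
      throw new ArgumentNullException("userName");
    if(string.IsNullOrEmpty(password))
      throw new ArgumentNullException("password");

    UserName = userName;
    Password = password;
  }
}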

But first, we need to create a Visual Studio 2010 SharePoint project and reference a couple of assemblies. You can directly enter the file paths in the Reference dialog (Browse tab) to add the assemblies:

Microsoft.BusinessData.dll
C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\ISAPI\Microsoft.BusinessData.dll

Microsoft.Office.SecureStoreService.dll
C:\Windows\assembly\GAC_MSIL\Microsoft.Office.SecureStoreService\14.0.0.0__71e9bce111e9429c\Microsoft.Office.SecureStoreService.dll

The following code shows the SecureStore class implementation:

internal class SecureStore
{
  public string ApplicationId { get; private set; }

  public SecureStore(string applicationId)
  {
    if(string.IsNullOrEmpty(applicationId))
      throw new ArgumentNullException("applicationId");
    if(!IsApplicationValid(applicationId))
      throw new ArgumentException(string.Format("Target application with ID '{0}' is not defined.", applicationId));

    ApplicationId = applicationId;
  }

  public SecureStoreCredentials GetCurrentCredentials()
  {
    SecureStoreProvider provider = new SecureStoreProvider { Context = SPServiceContext.Current };
    string userName = string.Empty;
    string password = string.Empty;

    using(SecureStoreCredentialCollection data = provider.GetCredentials(ApplicationId))
    {
      foreach(ISecureStoreCredential c in data)
      {
        if(c != null)
        {
          if(c.CredentialType == SecureStoreCredentialType.UserName)
            userName = GetDecryptedCredentialString(c.Credential);
          else if(c.CredentialType == SecureStoreCredentialType.Password)
            password = GetDecryptedCredentialString(c.Credential);
        }
      }
    }

    if(string.IsNullOrEmpty(userName) || string.IsNullOrEmpty(password))
      throw new SecureStoreException("Credentials for the current Windows user are not valid or not defined.");

    return new SecureStoreCredentials(userName, password);
  }

  public static bool IsApplicationValid(string applicationId)
  {
    if(string.IsNullOrEmpty(applicationId))
      throw new ArgumentNullException("applicationId");

    SecureStoreProvider provider = new SecureStoreProvider { Context = SPServiceContext.Current };

    foreach(TargetApplication application in provider.GetTargetApplications())
    {
      if(application.ApplicationId == applicationId)
      {
        ReadOnlyCollection<ITargetApplicationField> fields = provider.GetTargetApplicationFields(applicationId);
        bool existsUserNameDefinition = false;
        bool existsPasswordDefinition = false;

        foreach(TargetApplicationField field in fields)
        {
          if(field.CredentialType == SecureStoreCredentialType.UserName)
            existsUserNameDefinition = true;
          else if(field.CredentialType == SecureStoreCredentialType.Password)
            existsPasswordDefinition = true;
        }

        if(existsUserNameDefinition && existsPasswordDefinition)
          return true;
      }
    }

    return false;
  }

  public static string GetDecryptedCredentialString(SecureString secureString)
  {
    IntPtr p = Marshal.SecureStringToBSTR(secureString);

    try
    {
      return Marshal.PtrToStringUni(p);
    }
    finally
    {
      if(p != IntPtr.Zero)
        Marshal.FreeBSTR(p);
    }
  }
}

The constructor checks whether an application ID is passed and whether it is valid by calling the static method IsApplicationValid. First, the IsApplicationValid method creates an instance of the SecureStoreProvider class to get access to the Secure Store Service; the SecureStoreProvider class provides all methods needed to talk to the SharePoint service. Then the method queries all target applications and checks for the given application. If the application has been created and can be found, the method analyzes the application field definitions and looks for two fields of the types User Name and Password (see above).

The GetCurrentCredentials method actually tries to get the SAP user credentials from the store. The method creates an instance of the SecureStoreProvider class to get access to the service and then calls the GetCredentials method of the provider class. If credentials are available, the method decrypts the username and password from the SecureString values using the internal method GetDecryptedCredentialString. It then wraps the data in an instance of the SecureStoreCredentials class and returns it.

For more details of the implementation see the source code.
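
Using the class is straightforward. A short sketch, assuming the target application ID SAPCredentialsStore created earlier, looks like this:

// Read the SAP credentials that are mapped to the current Windows user.
SecureStore store = new SecureStore("SAPCredentialsStore");
SecureStoreCredentials credentials = store.GetCurrentCredentials();

// The decrypted values can now be used to build the SAP connection string.
string userName = credentials.UserName;
string password = credentials.Password;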

Accessing SAP Using The Secure Store Credentials

The sample and test code calls the SAP function module SD_RFC_CUSTOMER_GET to retrieve all customer data matching certain criteria (NAME1 starting with Te*):

[Screenshot]

The following code shows the implementation of the Test button click event:

protected void OnTestButtonClick(object sender, EventArgs e)
{
  string licenseKey = "<LICENSEKEY>";
  string connectionStringFormat = "CLIENT=800 LANG=EN USER={0} PASSWD={1} ASHOST=HAMLET ...";

  R3Connection connection = null;

  try
  {
    LIC.SetLic(licenseKey);

    ...

    SecureStoreCredentials credentials =
      new SecureStore(ApplicationID.Text).GetCurrentCredentials();
    string connectionstring =
      string.Format(connectionStringFormat, credentials.UserName, credentials.Password);

    connection = new R3Connection(connectionstring);
    connection.Open();

    RFCFunction function = connection.CreateFunction("SD_RFC_CUSTOMER_GET");
    function.Exports["NAME1"].ParamValue = "Te*";
    function.Execute();

    ResultGrid.DataSource = function.Tables["CUSTOMER_T"].ToADOTable();
    ResultGrid.DataBind();

    OutputLabel.Text = string.Format("The test called...",
      ApplicationID.Text, Web.CurrentUser.Name, Web.CurrentUser.LoginName,
      credentials.UserName);
  }
  catch(Exception ex)
  {
    WriteErrorMessage(ex.Message);
  }
  finally
  {
    if(connection != null && connection.Ping())
      connection.Close();
  }
}

The interesting part is the retrieval of the SAP user credentials from the secure store whose ID is entered in the text box named ApplicationID. The application ID is passed as a parameter to the constructor of the SecureStore class. After creating the instance, the GetCurrentCredentials method is called to ask the store for the credentials of the current Windows user.

After the credential query has been executed successfully, the SAP connection string is constructed. The connection string is then used to create an instance of the R3Connection class to connect to the SAP system. The remaining code just calls the function module SD_RFC_CUSTOMER_GET and binds the result to the SPGridView.

Download Source Code | Download Article (PDF)

How To Implement A Custom SharePoint 2010 Logging Service For ULS And Windows Event Log

Prior to Microsoft SharePoint 2010 there was no officially documented way to programmatically use the built-in ULS service (Unified Logging Service) to log your own custom messages. There are still solutions available on the internet that can be used, but SharePoint 2010 now offers full-blown logging support.

To log a message to the SharePoint log files just call the WriteTrace method of the SPDiagnosticsService class:

SPDiagnosticsService.Local.WriteTrace(0, new SPDiagnosticsCategory("My Category", TraceSeverity.Medium, EventSeverity.Information), TraceSeverity.Medium, "My log message");

The problem with this technique is that the log entry does not contain any category information; instead SharePoint uses the value "Unknown":

[Screenshot]

Of course, that does not matter for small solutions. For larger custom solutions, however, you may want to implement your own logging service with custom categories and UI integration within the Central Administration (CA) of SharePoint 2010.

This article shows how to develop a custom logging service that integrates with the Diagnostic Logging UI of the Central Administration:

[Screenshot]

Custom Logging Service

A custom logging service must inherit from the SPDiagnosticsServiceBase class. This is the base class for diagnostic services in SharePoint and it offers the option to log messages to log files via ULS and to the Windows Event Log. By overriding the ProvideAreas method the service provides information about diagnostic areas, categories and logging levels. A diagnostic area is a logical container of one or more categories.

The sample service defines one diagnostic area and one category (used by application pages) for this area:

[Guid("D64DEDE4-3D1D-42CC-AF40-DB09F0DFA309")] 
public class LoggingService : SPDiagnosticsServiceBase
{
  public static class Categories
  {
    public static string ApplicationPages = "Application Pages";
  }

  public static LoggingService Local
  {
    get { return SPFarm.Local.Services.GetValue<LoggingService>(DefaultName); }
  }

  public static string DefaultName
  {
    get { return "Parago Logging Service"; }
  }

  public static string AreaName
  {
    get { return "Parago"; }
  }

  protected override IEnumerable<SPDiagnosticsArea> ProvideAreas()
  {
    List<SPDiagnosticsArea> areas = new List<SPDiagnosticsArea>
    {
      new SPDiagnosticsArea(AreaName, 0, 0, false, new List<SPDiagnosticsCategory>
      {
        new SPDiagnosticsCategory(Categories.ApplicationPages, null, TraceSeverity.Medium, 
              EventSeverity.Information, 0, 0, false, true)
      })
    };

    return areas;
  }

  // . . .

}

The area name as well as the category names will also be shown in the Diagnostic Logging UI of the CA. It is also possible to define a resource DLL to localize the names.

The service offers two static methods, WriteTrace and WriteEvent. WriteTrace writes the log message to the SharePoint log files, usually stored in the SharePoint folder C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\LOGS.

WriteEvent writes the log message to the Windows Event Log. The event source is named after the AreaName and is created later on within the FeatureReceiver:

[Guid("D64DEDE4-3D1D-42CC-AF40-DB09F0DFA309")] 
public class LoggingService : SPDiagnosticsServiceBase
{

  ...

  public static void WriteTrace(string categoryName, TraceSeverity traceSeverity, 
    string message)
  {
    if(string.IsNullOrEmpty(message))
      return;

    try
    {
      LoggingService service = Local;

      if(service != null)
      {
        SPDiagnosticsCategory category = service.Areas[AreaName].Categories[categoryName];
        service.WriteTrace(1, category, traceSeverity, message);
      }
    }
    catch { }
  }

  public static void WriteEvent(string categoryName, EventSeverity eventSeverity, 
    string message)
  {
    if(string.IsNullOrEmpty(message))
      return;

    try
    {
      LoggingService service = Local;

      if(service != null)
      {
        SPDiagnosticsCategory category = service.Areas[AreaName].Categories[categoryName];
        service.WriteEvent(1, category, eventSeverity, message);
      }
    }
    catch { }
  }
}

The usage of the new custom logging service is quite simple:

 

// ULS Logging
LoggingService.WriteTrace(LoggingService.Categories.ApplicationPages, 
  TraceSeverity.Medium, "...");

// Windows Event Log
LoggingService.WriteEvent(LoggingService.Categories.ApplicationPages, 
  EventSeverity.Information, "...");

Next, we need to register it with SharePoint.

Service Registration

The custom logging service must be registered with SharePoint 2010 to show up in the Diagnostic Logging UI of the CA. The event sources must also be created on each server of the SharePoint farm. These two registration steps can be bundled within the FeatureActivated override method of the FeatureReceiver.

[Guid("50CA5F69-381F-4C2A-BE6C-F28219AFF20C")]
public class FeatureEventReceiver : SPFeatureReceiver
{
  const string EventLogApplicationRegistryKeyPath = 
    @"SYSTEM\CurrentControlSet\services\eventlog\Application";

  public override void FeatureActivated(SPFeatureReceiverProperties properties)
  {
    RegisterLoggingService(properties);
  } 

  public override void FeatureDeactivating(SPFeatureReceiverProperties properties)
  {
    UnRegisterLoggingService(properties);
  }

  static void RegisterLoggingService(SPFeatureReceiverProperties properties)
  {
    SPFarm farm = properties.Definition.Farm;

    if(farm != null)
    {
      LoggingService service = LoggingService.Local;

      if(service == null)
      {
        service = new LoggingService();
        service.Update();

        if(service.Status != SPObjectStatus.Online)
          service.Provision();
      }

      foreach(SPServer server in farm.Servers)
      {
        RegistryKey baseKey = RegistryKey.OpenRemoteBaseKey(RegistryHive.LocalMachine, 
                                server.Address);

        if(baseKey != null)
        {
          RegistryKey eventLogKey = baseKey.OpenSubKey(EventLogApplicationRegistryKeyPath,  
                                      true);

          if(eventLogKey != null)
          {
            RegistryKey loggingServiceKey = eventLogKey.OpenSubKey(LoggingService.AreaName);

            if(loggingServiceKey == null)
            {
              loggingServiceKey = eventLogKey.CreateSubKey(LoggingService.AreaName, 
                                   RegistryKeyPermissionCheck.ReadWriteSubTree);
              loggingServiceKey.SetValue("EventMessageFile", 
                @"C:\Windows\Microsoft.NET\Framework\v2.0.50727\EventLogMessages.dll", 
                RegistryValueKind.String);
            }
          }
        }
      }
    }
  }

  static void UnRegisterLoggingService(SPFeatureReceiverProperties properties)
  {
    SPFarm farm = properties.Definition.Farm;

    if(farm != null)
    {
      LoggingService service = LoggingService.Local;

      if(service != null)
        service.Delete();

      foreach(SPServer server in farm.Servers)
      {
        RegistryKey baseKey = RegistryKey.OpenRemoteBaseKey(RegistryHive.LocalMachine, 
                                server.Address);

        if(baseKey != null)
        {
          RegistryKey eventLogKey = baseKey.OpenSubKey(EventLogApplicationRegistryKeyPath, 
                                      true);

          if(eventLogKey != null)
          {
            RegistryKey loggingServiceKey = eventLogKey.OpenSubKey(LoggingService.AreaName);

            if(loggingServiceKey != null)
              eventLogKey.DeleteSubKey(LoggingService.AreaName);
          }
        }
      }
    }
  }
}

The FeatureActivated override calls the RegisterLoggingService helper method to register the service with the SharePoint system if it is not already available. Since one of the base classes of LoggingService is SPService, which provides the Update and Provision methods, we can register the new service farm-wide.

The second step is to create a new Windows Event Log source. To do this, we iterate over the collection of SharePoint farm servers and remotely add the new source by generating registry entries on each server.

To unregister, we reverse the registration steps by overriding the FeatureDeactivating method of the FeatureReceiver class. The UnRegisterLoggingService method deletes the service and removes the registry keys on all servers of the SharePoint farm.

The sample solution also contains an application page to test the logging service. The source code is available as a download.
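
The code-behind of the test page essentially just forwards the entered message to the two static methods. A minimal sketch (the control name MessageTextBox and the handler name are assumptions, not necessarily identical to the download) could look like this:

protected void OnLogButtonClick(object sender, EventArgs e)
{
  string message = MessageTextBox.Text;

  if(string.IsNullOrEmpty(message))
    return;

  // Write the message to the SharePoint log files (ULS)...
  LoggingService.WriteTrace(LoggingService.Categories.ApplicationPages,
    TraceSeverity.Medium, message);

  // ...and to the Windows Event Log.
  LoggingService.WriteEvent(LoggingService.Categories.ApplicationPages,
    EventSeverity.Information, message);
}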

[Screenshot]

Enter a test message and press the Log button. The message will be logged to the Windows Event Log:

And to the SharePoint log files:

[Screenshot]

That’s it.

Download Source Code | Download PDF Version