How To Integrate SAP Business Data Into SharePoint 2010 Using BCS Services And BCS Connector From Theobald Software

The Business Connectivity Services (BCS) of SharePoint 2010 provide a great way to fully integrate external data into SharePoint. In most cases developers integrate SQL database tables via the BCS services. But how do we connect to a SAP system? How do we integrate real-world SAP business data like tables and function modules or BAPIs into SharePoint?

The answer is just a few clicks away. Theobald Software has just released the ERPConnect Services (ECS) for SharePoint 2010 product suite, which includes a great tool named BCS Connector. The BCS Connector allows developers to create SAP-driven BDC models in minutes with just a couple of clicks.

The BCS Connector application can be installed on client machines and connects remotely to the SharePoint BCS service application. In this post I will give you an overview of the tool by creating a BDC model with an entity called Customer. We will also create two operations for the entity, a Finder and a SpecificFinder method. Both methods use the built-in SAP function module SD_RFC_CUSTOMER_GET, a very simple SAP function that returns a list of customers.

To create a new BDC model, start the BCS Connector application and first enter the SAP and SharePoint connection data (see the screenshot of the SAP connection below).

Once you have entered the SAP connection data, press the OK button and start adding your new BDC entity. To add an entity connected to the SAP function module SD_RFC_CUSTOMER_GET, press the New button on the lower left side. A new wizard dialog will pop up. Select Function and press the Next button.

Then search for the function module and press the Next button to select the structure of our new BDC entity.

The last wizard page shows a list with all possible entity structures available for this function module. Select the table CUSTOMER_T, which contains our customer data, and press the Finish button.

We have now created our new entity, but we still need to rename it from CUSTOMER_T to Customer. Each entity in the BCS services must define at least two operations or methods: a so-called Finder method that returns a list of entities and a SpecificFinder method that returns a specific entity of the list.

You also need to define the identifier fields within the entity by clicking the checkbox for the field KUNNR (Customer ID). You may also rename this field or any of the other fields. Next, we create the Finder method by clicking the New button.

The Finder option is already selected. Just press the Finish button; the BCS Connector automatically creates everything else and then opens the Edit Operation dialog.

This dialog allows you to define the return parameter, input parameters and filters for the entity operation. To execute the SAP function module SD_RFC_CUSTOMER_GET we need to define a value for the input parameter KUNNR or NAME1. For demonstration purposes I just define a name pattern for the field NAME1. This query returns all customers whose name starts with T. Which input parameters you can define depends on the function itself. Clicking the Preview button displays a list of all filtered customers.

In the same way we create the SpecificFinder method:

Finally, we have created a new entity with two entity operations and are now able to save it to the SharePoint server. Just press the Save Model button; this creates a new BDC model on the server:

You can find the BDC models within the Central Administration of SharePoint 2010.

So far we have just created a model, but we also want to display the customer data within an external list. We can create an external list using the SharePoint Designer or the BCS Connector; I will show you the second option. Switch to the External Lists tab of the ribbon bar and click the New External List button.

The New External List dialog has pre-selected all values. Click on the Create button and you are done. You may also change the name of the external list. The final external list looks as follows:

That was really easy and you can even export the BDC Model to Visual Studio 2010 and do additional customizing.

Further information about the product ERPConnect Services and BCS Connector can be found here:

http://www.theobald-software.com/en/products/erpconnectservices.htm

FluentSP – The Fluent SharePoint API

Download FluentSP 1.0 from Codeplex.com

More information on my new homepage at http://www.parago.net

If you do a lot of SharePoint programming, you know that you often have to write lengthy pieces of code to implement simple tasks like querying SharePoint lists. Nowadays you read a lot about fluent APIs or fluent interfaces. jQuery, for instance, is a JavaScript library that successfully introduced a fluent API to handle the hierarchical structure of HTML documents.
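
To illustrate the first point, here is roughly what a simple task (printing the titles of all lists of a root web whose title starts with "T") looks like with the classic server object model; the site URL is just a placeholder, and the FluentSP one-liner for the same task appears in the samples further below:

// Classic SharePoint object model: output the titles of all lists
// of the root web whose title starts with "T"
using(SPSite site = new SPSite("http://localhost"))
{
    SPWeb web = site.RootWeb;

    foreach(SPList list in web.Lists)
    {
        if(list.Title.StartsWith("T"))
            Console.WriteLine(list.Title);
    }
}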

Today, I want to introduce a small library I have developed, FluentSP, a modern fluent interface around the classic SharePoint 2010 API. By using FluentSP instead of the classic SharePoint API, you will be able to chain methods and act on sets of items of the underlying SharePoint objects.

What is a fluent API?
Check out this CodeProject article A Look at Fluent APIs and the Wikipedia article Fluent interface.

To start using the fluent API you call the Use() method on an SPSite, SPWeb, SPWebCollection or SPListCollection instance. The Use() method is implemented as an extension method that returns the entry facade object (see the facade table below). Another entry point to the fluent API is the static class FluentSP with its static methods CurrentSite, CurrentWeb, CurrentLists or RootWebLists.

SPContext.Current.Site.Use()... // => Returns the SPSiteFacade as entry point

// OR:
FluentSP.CurrentSite()...       // => Returns the SPSiteFacade as entry point 

Using the entry facade instance you can start chaining the available facade methods as follows:

FluentSP.CurrentSite().Web("Home").List("Tasks").Items().ForEach(i => { /* Do something with the item i of type SPListItem... */ });

// OR:
FluentSP.CurrentSite()
  .Web("Home")
    .List("Tasks")
      .Items()
      .ForEach(i => { /* Do something with... */ });

Each facade object wraps an underlying data item; for instance, the SPSiteFacade class is the fluent wrapper of the SPSite class. Depending on which facade method you call, it returns either the current facade instance (e.g. ForEach() or Where()) or a new child facade object (e.g. Items()). By chaining methods in this way you build up a tree or hierarchy of facade instances. In order to step back to the parent or previous facade instance you call the End() method:

site.Use()
       .RootWeb()
         .Site()
       .End()		// Returns SPWebFacade  as parent facade
         .Site()
       .End()		// Returns SPWebFacade  as parent facade
     .End();		// Returns SPSiteFacade as parent facade

FluentSP is currently missing a number of possibly useful methods, but you can easily extend the FluentSP API with custom facade classes and extension methods; see below and the source code for implementation examples.

Samples

SPSite site = SPContext.Current.Site;

// ----------------------------

// Outputs titles of all lists of the root web where the list title starts with T
site.Use().RootWeb().Lists().Where(l => l.Title.StartsWith("T")).ForEach(l => Console.WriteLine(l.Title));

// Outputs titles of all lists of the root web where the list title ends with ts (using RegEx)
int c;
site.Use().RootWeb().Lists("ts$").ForEach(l => Console.WriteLine(l.Title)).Count(out c);

// Outputs titles of all lists of the root web in ascending order where the title starts with T
site.Use().RootWeb().Lists().Where(l => l.Title.StartsWith("T")).OrderBy(l => l.Title).ForEach(l => Console.WriteLine(l.Title));

// Outputs titles of all lists of the root web in descending order where the title starts with T
site.Use()
    .RootWeb()
      .Lists()
      .Where(l => l.Title.StartsWith("T"))
      .OrderByDescending(l => l.Title)
      .ForEach(l => Console.WriteLine(l.Title));

// ----------------------------

// Delete all items in the Members list, then add 7 new members and then select and output 
// the titles of a few of the newly created items
site.Use()
    .RootWeb()
      .List("Members")
      .Do(w => Console.WriteLine("Deleting all members..."))
       .Items()
       .Delete()
      .End()
      .Do(w => Console.WriteLine("Adding all members..."))
      .AddItems(7, (i, c) => i["Title"] = "Member " + c)
       .Items()
       .Skip(2)
       .TakeUntil(i => ((string)i["Title"]).EndsWith("6"))
       .ForEach(i => Console.WriteLine(i["Title"]));

// ----------------------------

// Search for lists that are created by a specific user and, depending on the results,
// displays different messages by calling the IfAny or IfEmpty methods
site.Use()
    .RootWeb()
      .Lists()
      .ThatAreCreatedBy("Unknown User")
      .IfAny(f => f.ForEach(l => Console.WriteLine(l.Title)))
      .IfAny(l => l.Title.StartsWith("M"), f => Console.WriteLine("Lists found that starts with M*"))
      .IfEmpty(f => Console.WriteLine("No lists found for user"))
    .End()
    .Do(w => Console.WriteLine("---"))
      .Lists()
      .ThatAreCreatedBy("System Account")
      .IfAny(f => f.ForEach(l => Console.WriteLine(l.Title)));

// ----------------------------

var items = new List<SPListItem>();

// Query with Skip and TakeUntil methods
site.Use().RootWeb().List("Members").Items().Skip(2).TakeUntil(i => i.Title.EndsWith("5")).ForEach(i => { items.Add(i); Console.WriteLine(i.Title); });

// Query with Skip and TakeWhile methods
site.Use()
    .RootWeb()
      .List("Members")
       .Items()
       .Skip(2)
       .TakeWhile(i => i.Title.StartsWith("Member"))
       .ForEach(i => { items.Add(i); Console.WriteLine(i.Title); })
      .End()
       .Items()
       .Where(i => i.Title == "XYZ")
       .ForEach(i => { items.Add(i); Console.WriteLine(i.Title); });

// ----------------------------

// Adds new items using the Do method with the passed facade object
site.Use()
    .RootWeb()
    .AllowUnsafeUpdates()
      .List("Members")
      .Do((f, l) => {
        for(int c = 1; c <= 5; c++)
          f.AddItem(i => i["Title"] = "Standard Member #" + c);
      })
      .AddItem(i => i["Title"] = "Premium Member")
       .Items()
        .OrderBy(i => i.Title)
        .ForEach(i => Console.WriteLine(i["Title"]));

Extensibility Samples

// This sample is using the ThatAreCreatedBy extension method defined in Extensions.cs to show how to extend the fluent API
site.Use()
        .RootWeb()
          .Lists()
          .ThatAreCreatedBy("System Account", "jbaurle")
          .Count(c => Console.WriteLine("Lists found: {0}", c))
          .ForEach(l => Console.WriteLine(l.Title));

// This sample uses the new SPWebApplicationFacade extension defined in SPwebApplicationFacade.cs to show how to extend the fluent API
site.WebApplication.Use()
              .Sites()
              .ForEach(i => Console.WriteLine(i.Url));

// This sample uses an alternative implementation for SPSiteFacade defined in SPSiteFacadeAlternate.cs to show how to extend the fluent API
site.WebApplication.Use().WithFirstSite().DoSomething();
site.Use<SPSiteFacadeAlternate<BaseFacade>>().DoSomething();

The custom method ThatAreCreatedBy which is used in the first query of the extensibility samples is implemented as follows:

static class Extensions
{
  public static SPListCollectionFacade<TParentFacade> ThatAreCreatedBy<TParentFacade>(this SPListCollectionFacade<TParentFacade> facade, params string[] names)
    where TParentFacade : BaseFacade
  {
    // NOTE: This sample uses the GetCollection method of the given facade instance to retrieve the current 
    // collection and adds its query (see LINQ Deferred Execution). The Set method updates the 
    // underlying collection. The GetCurrentFacade method will then return the current facade to allow 
    // method chaining.

    if(names.Length > 0)
      facade.Set(facade.GetCollection().Where(i => names.Contains(i.Author.Name)));

    return facade.GetCurrentFacade();
  }
}

For more samples and details check out the source code you can download from Codeplex.

Built-In Facades and Methods

See Codeplex

LINQ to SAP

A lot has been written about Microsoft’s new data access technology LINQ (Language Integrated Query) since the first preview version was published. But there are still some interesting aspects to LINQ and its extensibility. This article will introduce a new provider called LINQ to SAP, which offers an easy way to retrieve business data from SAP/R3 systems within a .NET application.

With the introduction of the .NET Framework 3.5 and the programming language extensions for C# and VB.NET, Microsoft began to redefine the way developers implement data access. Nearly all applications today query data from different data sources, mainly SQL databases, XML files or in-memory collections. LINQ offers a universal and standardized approach to access all those data sources using special data providers. The LINQ language syntax strongly resembles that of SQL. The following sample shows how to query data from a string array with LINQ:

string[] names = { "John", "Patrick", "Bill", "Tom" };

var res = from n in names where n.Contains("o") select n;

foreach(var name in res)
    Console.WriteLine(name);

This simple LINQ query based on an in-memory collection (array) selects all items in the array "names" that contain the letter "o". Console output: "John, Tom". A LINQ introduction is beyond the scope of this article. A very good introduction article can be found on CodeProject.com.

LINQ Providers

The .NET framework comes with built-in providers for collections and lists (LINQ to Objects), for Microsoft SQL Server databases (LINQ to SQL), for XML files (LINQ to XML) and finally for DataSet object instances (LINQ to DataSet). Besides the standard providers, developers can extend LINQ by creating custom providers to support special data sources. LINQ to LDAP or LINQ to Amazon are examples of such custom providers.

To write a custom LINQ provider one must basically implement two interfaces: IQueryable and IQueryProvider. These interfaces make objects queryable in LINQ expressions. Developing a LINQ provider can be a very complex task, but there are quite a few good blog entries on the net explaining the steps in detail.
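
As a rough illustration (this is not the LINQ to SAP implementation, and all type names are made up), the skeleton of such a provider consists of an IQueryable<T> implementation that records the expression tree and an IQueryProvider that translates and executes it:

using System;
using System.Collections;
using System.Collections.Generic;
using System.Linq;
using System.Linq.Expressions;

// Skeleton of a custom LINQ provider (illustrative names only)
public class CustomQueryable<T> : IQueryable<T>
{
    readonly CustomQueryProvider provider;
    readonly Expression expression;

    public CustomQueryable(CustomQueryProvider provider)
    {
        this.provider = provider;
        this.expression = Expression.Constant(this);
    }

    public CustomQueryable(CustomQueryProvider provider, Expression expression)
    {
        this.provider = provider;
        this.expression = expression;
    }

    public Type ElementType { get { return typeof(T); } }
    public Expression Expression { get { return expression; } }
    public IQueryProvider Provider { get { return provider; } }

    public IEnumerator<T> GetEnumerator()
    {
        // Execution is delegated to the provider, which translates the
        // expression tree into calls against the external data source
        return provider.Execute<IEnumerable<T>>(expression).GetEnumerator();
    }

    IEnumerator IEnumerable.GetEnumerator()
    {
        return GetEnumerator();
    }
}

public class CustomQueryProvider : IQueryProvider
{
    public IQueryable<TElement> CreateQuery<TElement>(Expression expression)
    {
        return new CustomQueryable<TElement>(this, expression);
    }

    public IQueryable CreateQuery(Expression expression)
    {
        // Non-generic variant, rarely used when working with the generic operators
        Type elementType = expression.Type.GetGenericArguments()[0];
        return (IQueryable)Activator.CreateInstance(
            typeof(CustomQueryable<>).MakeGenericType(elementType), this, expression);
    }

    public TResult Execute<TResult>(Expression expression)
    {
        // This is the place where a real provider analyzes the expression tree
        // (e.g. the Where clause) and queries the external system
        throw new NotImplementedException();
    }

    public object Execute(Expression expression)
    {
        return Execute<object>(expression);
    }
}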

This article will introduce a new provider called LINQ to SAP from Theobald Software which provides developers with a simple way to access SAP/R3 systems and their data objects. The software also provides a Visual Studio 2008 Designer to define SAP objects interactively and integrate them in .NET applications.

SAP Background

This section will give you a short explanation and background of SAP objects that are queryable by LINQ to SAP. The most important objects are Function Modules, Tables, BW Cubes and Queries.

A Function Module is basically similar to a normal procedure in conventional programming languages. Function Modules are written in ABAP, the SAP programming language, and are accessible from any other program within a SAP/R3 system. They accept import and export parameters as well as other kinds of special parameters. The image below shows an example of a Function Module named BAPI_EQUI_GETLIST within the SAP Workbench:

Figure 1: Function Module (SAP Workbench)

In addition, BAPIs (Business APIs) are special Function Modules that are organized within the SAP Business Object Repository. LINQ to SAP also allows access to SAP tables; those are basically straightforward relational database tables. Furthermore, the LINQ to SAP Designer allows developers to define and access BW Cubes (Business Warehouse Cubes) and Queries (SAP Query). BW Cubes are also known as OLAP cubes; data is organized in a multi-dimensional way within a cube. SAP Queries work just like other queries. To identify a SAP Query uniquely, three pieces of information are necessary: the user area (global or local), the user group and the name of the query. With the concept of Queries, SAP provides a very simple way to generate reports without the need to know the ABAP programming language.

Visual Studio 2008 Designer for LINQ to SAP

In order to use LINQ to SAP and the associated Visual Studio Designer, the .NET library ERPConnect.net from Theobald Software must be installed first. This software is the basic building block between .NET and a SAP/R3 system and provides an easy API to exchange data between the two systems. The company offers a free trial version to download. After installing ERPConnect.net, LINQ to SAP must be installed separately using a setup program (see manual). The provider and the designer are actually extensions to the ERPConnect.net library. The LINQ to SAP provider itself consists of the Visual Studio 2008 Designer and additional class libraries that are bundled within the namespace ERPConnect.Linq.

The setup adds a new project item type to Visual Studio 2008 with the file extension .erp and links it with the designer. Double-clicking an .erp file will open the LINQ to SAP Designer. The designer supports application developers with the option to automatically generate source code to integrate with SAP objects. For all SAP objects defined in an .erp file, the provider creates a context class which inherits from the ERPDataContext base class. The generated context class contains methods and sub-classes that represent the defined SAP objects. Besides the .erp file, the LINQ to SAP Designer saves the associated, automatically generated source code in a file with the extension .Designer.cs.

Figure 2: Add new project item

Figure 3: SAP objects in LINQ to SAP Designer

Figure 4: Connection dialog in LINQ to SAP Designer

Function Modules

This section shows how to access and obtain data using the function module BAPI_EQUI_GETLIST by creating a LINQ to SAP object. The module returns an equipment list for pre-defined plants. First of all, one must add a new LINQ to SAP file (.erp) to a new or existing Visual Studio 2008 project. Opening the .erp file starts the LINQ to SAP Designer. Double-clicking the Function item in the Visual Studio toolbox adds a new SAP Function Module object. In the next step the object search dialog opens and the developer can search for function modules.

Figure 5: Search dialog in LINQ to SAP Designer

Once the selection is made, the LINQ to SAP Designer shows the Function Module dialog box with all data, properties and parameter definitions of the selected module BAPI_EQUI_GETLIST. The user can now change the naming of the auto-generated method as well as of all used parameters.

Figure 6: Function Module dialog in LINQ to SAP Designer

For each function module the LINQ to SAP Designer will generate a context class method with all additional object classes and structures. If, for instance, the user defines a method name called GetEquipmentList for the function module BAPI_EQUI_GETLIST, the designer will generate a context class method with that name and the defined method signature. The user can also specify the parameters to exchange. The lower area of the dialog displays the typical SAP parameters, i.e. IMPORT, EXPORT, CHANGING and TABLES parameters. LINQ to SAP allows defining default values for SAP parameters. Those parameters can also be used as parameters for the auto-generated context class method as well as for return values. The parameters and the associated structures can also be renamed.

The method signature for the function module defined above looks like this:

public EquipmentTable GetEquipmentList(PlantTable plants)

The context class itself is named SAPContext by default. The context class name, the namespace, the connection settings as well as other flags can be defined in the properties window of the LINQ to SAP Designer. The following code shows how to use the context class SAPContext:

class Program
{
    static void Main()
    {
        SAPContext dc = new SAPContext("TESTUSER", "XYZ");

        SAPContext.PlantTable plants = new SAPContext.PlantTable();
        SAPContext.PlantStructure ps = plants.Rows.Add();
        ps.SIGN = "I";
        ps.OPTION = "EQ";
        ps.LOW = "3000";

        SAPContext.EquipmentTable equipList = dc.GetEquipmentList(plants);
    }
}

Tables

The procedure for adding a SAP Table is basically the same as for function modules (see above). After adding the SAP Table object from the toolbox in Visual Studio and finding the table with the search dialog, the Table dialog will show up:

Figure 7: Tables Module dialog in LINQ to SAP Designer

In the upper part of the table dialog the user must define the class name for the table object to be auto-generated. The default name is the name of the table. The lower part shows a data grid with all table fields and their definitions. For each field a class property name can be defined for the auto-generated context class code. The checkbox in the first column selects whether the field will be part of the table class.

The figure above shows the definition of the SAP Table object T001W. This table stores plant information. The class name has not been changed, so the designer will create a C# class with the name T001W. In addition, the context class will contain a property T001WList. The type of this property is ERPTable<T001W>, which is a LINQ-queryable data type.

The following code shows how to query the table T001W using the context class:

class Program
{
    static void Main()
    {
        SAPContext dc = new SAPContext("TESTUSER", "XYZ");
        dc.Log = Console.Out;

        var res = from p in dc.T001WList
                  where p.WERKS == "3000"
                  select p;

        foreach (var item in res)
            Console.WriteLine(item.NAME1);
    }
}

SAP Context Class and Logging

To access objects using LINQ to SQL, the provider generates a context class named DataContext. Accordingly, LINQ to SAP also creates a context class, called SAPContext. This class is defined as a partial class. A partial class is a type declaration that can be split across multiple source files and therefore allows developers to easily extend auto-generated classes like the context class of LINQ to SAP.

The code sample below shows how to add a partial class (file SAPContext.cs) which adds a new custom method GetEquipmentListForMainPlant to extend the context class generated by the LINQ to SAP Designer. This new method internally calls the auto-generated method GetEquipmentList with a pre-defined parameter value. The C# compiler merges the auto-generated LINQtoERP1.Designer.cs with the SAPContext.cs source file.

using System;

namespace LINQtoSAP
{
    partial class SAPContext
    {
        public EquipmentTable GetEquipmentListForMainPlant()
        {
            SAPContext.PlantTable plants = new SAPContext.PlantTable();

            SAPContext.PlantStructure ps = plants.Rows.Add();
            ps.SIGN = "I";
            ps.OPTION = "EQ";
            ps.LOW = "3000";

            return GetEquipmentList(plants);
        }
    }
}

LINQ to SAP also provides the capability to log LINQ query translations. In order to log data, the Log property of the context class must be set to a TextWriter instance, e.g. the console output Console.Out. LINQ to SAP only does very rudimentary logging, which is restricted to table objects, but it helps developers get a feeling for what the translated WHERE part looks like.
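
As a small variation of the table sample above, the log output can just as well be captured in a StringWriter (from System.IO) for later inspection; this sketch only assumes the Log property shown earlier:

// Capture the LINQ to SAP log in a StringWriter instead of writing it to the console
StringWriter log = new StringWriter();

SAPContext dc = new SAPContext("TESTUSER", "XYZ");
dc.Log = log;

var res = from p in dc.T001WList
          where p.WERKS == "3000"
          select p;

foreach(var item in res)
    Console.WriteLine(item.NAME1);

// The captured text contains the rudimentary log, e.g. the translated WHERE part
Console.WriteLine(log.ToString());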

Summary

Overall, LINQ to SAP is a simple yet powerful LINQ data provider and Visual Studio 2008 Designer that is easy to use. You also get a feeling for how to develop against a SAP/R3 system using .NET. For more information about the product, please check the vendor's homepage at http://www.theobald-software.com.

Creating DAL Components Using Custom ASP.NET Build Providers And Compiler Techniques

There are many articles on the internet dealing with the creation and usage of a Data Access Layer (DAL) and its components, also known as DALCs (DAL Components). There is nothing new in the process of creating a DAL. You can use Typed DataSets, Microsoft’s Enterprise Library (DAAB) or one of the many third-party tools to implement a comprehensive DAL system.

The main objectives of this article are to show how to create and use ASP.NET build providers and to explain how easily you can analyze a small self-defined description language to declare DALCs or anything else. Implementing a full-blown DAL killer application which you can use for the rest of your programming life is not the objective of this article. Normally you would not define your own description language to declare DALCs; you would instead use an XML-based description of the components and analyze it using the feature-rich XML classes that come with the .NET framework.

I just wanted to implement a lexical analyzer (a.k.a. scanner or tokenizer), some parsing techniques and dynamic code generation using .NET's CodeDOM. There are a lot of situations in daily work where it would be handy to develop some kind of parser (even a very small and simple one) to come up with an acceptable and elegant solution. In fact, for one of my customers I defined a description language to automate the extension of a web application.

To show an example of the final result of a dynamically generated DAL using the DALComp application, let's assume we have a database table called Articles. For this table the DAL, or more precisely the build provider (see the section "DALC Description Language" below), will automatically create a class called Article, containing private member fields and public properties that correspond to the table column names. Nullable types are created for value types.

In addition, the system generates static methods (also defined within a .dal file) to select the requested data. The data is returned as a generic list of type Article (in C#: List<Article>) and can be used as follows:

Sample 1:

foreach(Article article in Article.SelectAll())
    Console.WriteLine(article.Title);

Sample 2:

ArticlesGridView.DataSource = Article.SelectAll();
ArticlesGridView.DataBind();

Sample 3:

<asp:ObjectDataSource ID="ArticlesDS" TypeName="Parago.DAL.Article" SelectMethod="SelectAll" runat="server" />

So, now let's start!

Build Providers

Build providers are new to ASP.NET and version 2.0 of the .NET framework. They basically allow you to hook into the ASP.NET compilation process and build environment. That means you can define a new file type for which you generate source code based on arbitrary file content. The source code (provided for instance as a CodeCompileUnit) will then be built into the final compiled website assemblies. In our case, we will define a new file type .dal whose content is plain text containing our own little description language to define DALCs.

In fact, the ASP.NET framework is basically doing the exact same thing for file types like .aspx and .ascx as well as for many more. The corresponding build providers are defined in the global web.config configuration file. For instance the file type .aspx is handled by a framework class called PageBuildProvider. If you are interested in how the ASP.NET team has implemented this provider, you can either use ILDASM or Lutz Roeder's ".NET Reflector" to disassemble the code.

To use build providers in your web applications, you have to activate new file types within the local web.config. The file type .dal for the DALComp application is defined in the configuration file as follows:

<compilation>
  <buildProviders>
    <add extension=".dal" type="Parago.DALComp.DALCompBuildProvider, DALComp.BuildProvider"/>
  </buildProviders>
</compilation>

The class DALCompBuildProvider handles from now on all files with the extension .dal. The class extends the abstract base class BuildProvider and overrides the GenerateCode method. ASP.NET calls this method during the compilation and build process of the website and passes an instance of type AssemblyBuilder to it. Code can then be added by calling the AddCodeCompileUnit method of the AssemblyBuilder.

CodeCompileUnits represent containers for CodeDOM program graphs. Basically they are an internal image of the source code. Each .NET language that supports the code provider model can create source code in its own language based on a CodeCompileUnit.

Creating a clean language-independent CodeDOM program graph is an annoying and somewhat cumbersome task, but you have to create one if you want to generate source code for different languages. The CodeGen class of DALComp currently generates DAL source code for C# and VB.NET.
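
As a side note, once such a program graph exists, turning it into concrete source code only takes a few lines. The following sketch (not part of DALComp) emits C#; passing "VisualBasic" instead of "CSharp" would emit VB.NET:

using System.CodeDom;
using System.CodeDom.Compiler;
using System.IO;

static class CodeDomSample
{
    // Renders a CodeCompileUnit as source code in the given language
    public static string GenerateSource(CodeCompileUnit unit, string language)
    {
        CodeDomProvider provider = CodeDomProvider.CreateProvider(language);

        using(StringWriter writer = new StringWriter())
        {
            provider.GenerateCodeFromCompileUnit(unit, writer, new CodeGeneratorOptions());
            return writer.ToString();
        }
    }
}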

The BuildProvider class also provides a method called OpenReader to read the source code (a file with the extension .dal). The next steps are to tokenize, parse and generate a CodeDOM program graph which we can turn over to the ASP.NET build process:

Tokenizer tokenizer=new Tokenizer(source);
Parser parser=new Parser(tokenizer);
CodeGen codeGen=new CodeGen(parser);

builder.AddCodeCompileUnit(this, codeGen.Generate());
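
Putting the pieces together, the enclosing build provider class looks roughly as follows; this is only a sketch based on the description above, using the Tokenizer, Parser and CodeGen classes from the following sections, and the real implementation may differ in details:

using System.IO;
using System.Web.Compilation;

namespace Parago.DALComp
{
    public class DALCompBuildProvider : BuildProvider
    {
        public override void GenerateCode(AssemblyBuilder builder)
        {
            // Read the content of the .dal file handled by this provider
            string source;
            using(TextReader reader = OpenReader())
                source = reader.ReadToEnd();

            // Tokenize, parse and generate the CodeDOM program graph,
            // then hand it over to the ASP.NET build process
            Tokenizer tokenizer = new Tokenizer(source);
            Parser parser = new Parser(tokenizer);
            CodeGen codeGen = new CodeGen(parser);

            builder.AddCodeCompileUnit(this, codeGen.Generate());
        }
    }
}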

In the next section we first take a look at some sample source code to see what kind of language we want to analyze and generate code for.

DALC Description Language

The description "language" to formally describe DAL components uses a very simple syntax. The following shows a sample DAL definition contained in the file Sample.dal (stored in the special folder App_Code):

Config {
  Namespace = "Parago.DAL",
  DatabaseType = "MSSQL",
  ConnectionString = "Data Source=.\SQLEXPRESS;…"
}

//
// DAL component for table Articles
//
DALC Article ( = Articles ) {

  Mapping {                // Map just the following fields, leave others
    ArticleID => Id,
    Text1 => Text
  }

  SelectAll()
  SelectByAuthor(string name[CreatedBy])
  SelectByCategory(int category[Category])

}

DALC Category ( = "Categories" ) {

  SelectAll()

}

The syntax for the language is defined using the extended Backus-Naur form (EBNF), which is an extension of the basic Backus-Naur form (BNF) metasyntax notation, a formal way to describe languages. The following syntax rules illustrate the definition of the DALC description language:

digit                        = "0-9"
letter                       = "A-Za-z"
identifier                   = letter { letter | digit }
string                       = '"' string-character { string-character } '"'
string-character             = ANY-CHARACTER-EXCEPT-QUOTE | '""'

dal                          = config dalc { dalc }
config                       = "Config" "{" config-setting { "," config-setting } "}"
config-setting               = ( "Namespace" | "DatabaseType" | "ConnectionString" ) "=" string
dalc                         = "DALC" identifier [ dalc-table ] "{" [ dalc-mapping ] dalc-function { dalc-function } "}"
dalc-table                   = "(" "=" ( identifier | string ) ")"
dalc-mapping                 = "Mapping" "{" dalc-mapping-field { "," dalc-mapping-field } "}"
dalc-mapping-field           = ( identifier | string ) "=>" identifier
dalc-function                = identifier "(" [ dalc-function-parameter-list ] ")"
dalc-function-parameter-list = dalc-function-parameter { "," dalc-function-parameter }
dalc-function-parameter      = ( "string" | "int" ) identifier "[" ( identifier | string ) "]"

The next section explains how to scan and parse the syntax shown above.

Compiler Techniques

The DALComp application uses compiler techniques in a very basic way. Implementing a real-world compiler can be a very complicated task; it involves techniques like syntax error recovery, variable scoping or bytecode (IL) generation and many more.

The first step in the implementation is to create a tokenizer. A tokenizer analyzes the input character by character and tries to split it into so-called tokens. Tokens are categorized blocks of text. The categories may be language keywords like the C# loop statement "for", comparison operators like "==" or whitespace. DALComp defines a class named Token to represent a single token as well as an enumeration called TokenKind to define the categories of tokens:

public enum TokenKind {
    KeywordConfig,
    KeywordDALC,
    KeywordMapping,
    Type,
    Identifier,
    String,
    Assign,               // =>
    Equal,                // =
    Comma,                // ,
    BracketOpen,          // [
    BracketClose,         // ]
    CurlyBracketOpen,     // {
    CurlyBracketClose,    // }
    ParenthesisOpen,      // (
    ParenthesisClose,     // )
    EOT                   // End Of Text
}

public class Token {

    public TokenKind Type;
    public string Value;

    public Token(TokenKind type) {
        Type=type;
        Value=null;
    }

    public Token(TokenKind type, string value) {
        Type=type;
        Value=value;
    }
}

The Tokenizer class does the tokenizing by analyzing the input stream character by character. The Tokenizer constructor initializes the object instance with the input text to scan, creates a generic queue of type Token and calls the method Start to do the work:

public Tokenizer(string text) {

    // To avoid index overflow append new line character to text
    this.text=(text==null?String.Empty:text)+"\n";

    // Create token queue (first-in, first-out)
    tokens=new Queue<Token>();

    // Tokenize the text!
    Start();
}

The constructor also appends an additional character ("\n") to the input text to avoid an index overflow. The Start method looks similar to the following:

void Start() {

    int i=0;

    // Iterate through input text
    while(i<text.Length) {

        // Analyze next character and maybe the following series of characters
        switch(text[i]) {

            // Ignore whitespaces
            case '\n':
            case '\r':
            case '\t':
            case ' ':
                break;

            // Comment (until end of line)
            case '/':
                if(text[i+1]=='/')
                    while(text[++i]!='\n') ;
                continue;

            case '{':
                tokens.Enqueue(new Token(TokenKind.CurlyBracketOpen));
                break;

            case '}':
                tokens.Enqueue(new Token(TokenKind.CurlyBracketClose));
                break;

            // '=' or '=>'
            case '=':
                if(text[i+1]=='>') {
                    i++;
                    tokens.Enqueue(new Token(TokenKind.Assign));
                }
                else
                    tokens.Enqueue(new Token(TokenKind.Equal));
                break;

            // ... (remaining cases and end of loop omitted)

As you can see, it is simple and straightforward to implement a tokenizer. There are two more methods in the Tokenizer class: PeekTokenType, to look ahead at the next token type in the queue, and GetNextToken, to actually return the next token from the queue (it also removes the token from the queue):

public TokenKind PeekTokenType() {
    // Always return at least TokenKind.EOT
    return (tokens.Count>0)?tokens.Peek().Type:TokenKind.EOT;
}

public Token GetNextToken() {
    // Always return at least Token of type TokenKind.EOT
    return (tokens.Count>0)?tokens.Dequeue():new Token(TokenKind.EOT);
}

Both methods are called from the parser, the next step in compiling source code. Parsing is the process of analyzing the sequence of tokens in order to determine its grammatical structure with respect to a given formal grammar. An instance of the Tokenizer class is handed over to the Parser class. The Parser class produces a structure that represents semantically correct source code, usually called an abstract syntax tree (AST). The name is used somewhat loosely here, since the structure is not really a tree: the AST used in this context is just a generic list of DALC class objects plus the settings (the Config part).

The Parser class implementation is also simple and straightforward. There is no room here to explain in detail how parsing and the concepts behind it work; this is a huge area of computing. The source code for the Parser class is self-explanatory and easy to understand.

The parser basically just implements the syntax rules defined above in extended Backus-Naur form (see also the section "DALC Description Language"):

/// <summary>
/// dal = config dalc { dalc }
/// </summary>
void ParseDAL() {

    ParseConfig();

    do {
        ParseDALC();
    } while(Taste(TokenKind.KeywordDALC));

    Eat(TokenKind.EOT);
}

/// <summary>
/// config = "Config" "{" config-setting { "," config-setting } "}"
/// </summary>
void ParseConfig() {

    Eat(TokenKind.KeywordConfig);
    Eat(TokenKind.CurlyBracketOpen);

    ParseConfigSetting();

    while(true) {
        if(!Taste(TokenKind.Comma))
            break;
        Eat();
        ParseConfigSetting();
    }

    Eat(TokenKind.CurlyBracketClose);
}

The parsing methods use helper functions to "eat" the tokens of the queue by calling the aforementioned GetNextToken method of the Tokenizer class, or simply abort the parsing process. Here is an example:

/// <summary>
/// Looks ahead in the token queue and checks if the next token is of the given type.
/// </summary>
bool Taste(TokenKind type) {
    return tokenizer.PeekTokenType()==type;
}

/// <summary>
/// Returns the next token.
/// </summary>
string Eat() {
    token=tokenizer.GetNextToken();
    return token.Value;
}

/// <summary>
/// Returns the next token of the given type, otherwise aborts.
/// </summary>
string Eat(TokenKind type) {
    token=tokenizer.GetNextToken();
    if(token.Type!=type)
        Abort();
    return token.Value;
}

/// <summary>
/// Returns the next token of any of the passed array of types, otherwise aborts.
/// </summary>
string EatAny(TokenKind[] types) {
    token=tokenizer.GetNextToken();
    foreach(TokenKind type in types)
        if(token.Type==type)
            return token.Value;
    Abort();
    return String.Empty;
}

The third phase uses the structure generated by the Parser class and transforms it into a CodeDOM structure that can be used to create C# or VB code. This phase is called the code generation phase. Language compilers targeting the .NET framework usually generate Intermediate Language (IL) code. The DALComp application does not generate any IL code; instead it generates a CodeDOM graph that can be compiled, e.g. by ASP.NET, into web assemblies.

The Generate method of the CodeGen class first of all creates a container for the CodeDOM structure, adds a namespace unit to it and tries to connect to the defined database using the connection string that is specified within the DAL definition file (see Sample.dal):

// Create container for CodeDOM program graph
CodeCompileUnit compileUnit=new CodeCompileUnit();

try {

    // If applicable replace the value '|BaseDirectory|' with the current
    // directory of the running assembly (within the connection string)
    // to allow database access in DALComp.Test.Console
    string connectionString=dal.Settings["CONNECTIONSTRING"]
        .Replace("|BaseDirectory|", Directory.GetCurrentDirectory());

    // Define new namespace (Config:Namespace)
    CodeNamespace namespaceUnit=new CodeNamespace(dal.Settings["NAMESPACE"]);
    compileUnit.Namespaces.Add(namespaceUnit);

    // Define necessary imports
    namespaceUnit.Imports.Add(new CodeNamespaceImport("System"));
    namespaceUnit.Imports.Add(new CodeNamespaceImport("System.Collections.Generic"));
    namespaceUnit.Imports.Add(new CodeNamespaceImport("System.Data"));
    namespaceUnit.Imports.Add(new CodeNamespaceImport("System.Data.SqlClient"));

    // Generate private member fields (to save public property values)
    // by analyzing the database table which is defined for the DALC
    SqlConnection connection=new SqlConnection(connectionString);
    connection.Open();

    // Generate a new public accessible class for each DALC definition
    // with all defined methods
    foreach(DALC dalc in dal.DALCs) {

        // Generate new DALC class type and add to own namespace
        CodeTypeDeclaration typeUnit=new CodeTypeDeclaration(dalc.Name);
        namespaceUnit.Types.Add(typeUnit);

        // Generate public empty constructor method
        CodeConstructor constructor=new CodeConstructor();
        constructor.Attributes=MemberAttributes.Public;
        typeUnit.Members.Add(constructor);

        // Get schema table with column definitions for the current DALC table
        DataSet schema=new DataSet();
        new SqlDataAdapter(String.Format("SELECT * FROM {0}", dalc.Table), connection)
            .FillSchema(schema, SchemaType.Mapped, dalc.Table);

        // Generate for each column a private member field and a public
        // accessible property to use
        foreach(DataColumn column in schema.Tables[0].Columns) {

            // Define names by checking DALC mapping definition
            string name=column.ColumnName;
            string nameMapped=
                dalc.Mapping.ContainsKey(name.ToUpper())?dalc.Mapping[name.ToUpper()]:name;

            // Generate private member field with underscore plus name; define
            // member field type by checking if value type and create a
            // nullable of that type accordingly
            CodeMemberField field=new CodeMemberField();
            field.Name=String.Format("_{0}", nameMapped);
            field.Type=GenerateFieldTypeReference(column.DataType);
            typeUnit.Members.Add(field);

            // Generate public accessible property for private member field,
            // to use for instance in conjunction with ObjectDataSource
            CodeMemberProperty property=new CodeMemberProperty();
            property.Name=nameMapped;
            property.Type=GenerateFieldTypeReference(column.DataType);
            property.Attributes=MemberAttributes.Public;
            property.GetStatements.Add(
                new CodeMethodReturnStatement(
                    new CodeFieldReferenceExpression(
                        new CodeThisReferenceExpression(),
                        field.Name
                    )
                )
            );
            property.SetStatements.Add(
                new CodeAssignStatement(
                    new CodeFieldReferenceExpression(
                        new CodeThisReferenceExpression(),
                        field.Name
                    ),
                    new CodePropertySetValueReferenceExpression()
                )
            );
            typeUnit.Members.Add(property);
        }
    }
}

Based on the DAL specification file the method reads in each schema table of a DALC table, builds up a new class type and adds all columns as private member fields and public properties to it. If a column data type is a value type then it will create a nullable version of that value type as follows:

CodeTypeReference GenerateFieldTypeReference(Type columnType) {

    // If the column data type is not a value type just return it
    if(!columnType.IsValueType)
        return new CodeTypeReference(columnType);

    // Type is a value type, generate a nullable type and return that
    Type nullableType=typeof(Nullable<>);
    return new CodeTypeReference(nullableType.MakeGenericType(new Type[] { columnType }));
}

For example, if a column is of type "int", this helper function will generate an "int?" or "System.Nullable<int>". Here is a sample of the auto-generated C# code:

public class Article {

    private System.Nullable<int> _Id;
    private string _Title;
    private string _Text;
    private string _Text2;
    private string _Language;
    private System.Nullable<int> _Category;
    private string _CreatedBy;
    private System.Nullable<System.DateTime> _CreatedOn;

    public Article() {
    }

    public virtual System.Nullable<int> Id {
        get {
            return this._Id;
        }
        set {
            this._Id = value;
        }
    }

    // ...

    // Helper method to query data
    public static List<Article> SelectData(string sql) {
        List<Article> result;
        result = new List<Article>();
        System.Data.SqlClient.SqlConnection connection;
        System.Data.SqlClient.SqlCommand command;
        System.Data.SqlClient.SqlDataReader reader;
        connection = new System.Data.SqlClient.SqlConnection("Data Source=…");
        connection.Open();
        command = new System.Data.SqlClient.SqlCommand(sql, connection);
        reader = command.ExecuteReader();
        for (; reader.Read(); ) {
            Article o;
            o = new Article();
            if (Convert.IsDBNull(reader["ArticleID"])) {
                o.Id = null;
            }
            else {
                o.Id = ((System.Nullable<int>)(reader["ArticleID"]));
            }
            // ...
            result.Add(o);
        }
        reader.Close();
        connection.Close();
        return result;
    }

    // DALC function
    public static List<Article> SelectAll() {
        string internalSql;
        internalSql = "SELECT * FROM Articles";
        return SelectData(internalSql);
    }
}

For more detailed information please refer to the source code.

Summary

The DAL itself is a basic implementation and shows the concepts of creating dynamic code. To be accurate, the DALComp compiler is actually more a source-to-source translator than a compiler. The current version only generates methods to select data, no updates or inserts. As you can see, there is plenty of room for extending the DAL by augmenting the description language and generating more dynamic code to make the DAL production-ready.

For more information regarding building compilers and virtual machines, I recommend Pat Terry's book "Compiling with C# and Java". Another way to study compiler techniques in practice is to take a look at the sources of the .NET implementation of Python, IronPython. The source code is available for download on the CodePlex website.

For real-world compiler development there are plenty of utility tools available, such as Coco/R, a scanner and parser generator, or the ANTLR compiler tools (used by the Boo compiler). You can also find a lot of information on the website of Microsoft Research, e.g. about the F# compiler. Another interesting topic is the Phalanger project ("The PHP Language Compiler for the .NET Framework") on CodePlex.com.

Download Source Code

Using ERPConnect Services To Integrate SAP Business Data Into SharePoint 2010

SharePoint 2010 provides developers with the capability to integrate external data sources like SAP business data via the Business Connectivity Services (BCS) into the SharePoint system. The concept of BCS is based on entities and associated stereotyped operations. This perfectly suits flat and simply structured data sets like SAP tables.

Another, and far more flexible, option to use SAP data in SharePoint is the ERPConnect Services for SharePoint 2010 from Theobald Software. The product suite consists of three components: the ERPConnect Services runtime, the BCS Connector application and Xtract PPS for the PerformancePoint Services.

The ERPConnect Services runtime provides a Service Application that integrates with the new service architecture of SharePoint 2010. The runtime offers a secure middle-tier layer to integrate different kinds of SAP objects, like tables and function modules, into your SharePoint applications.

The BCS Connector application allows developers to quickly create BDC models for the BCS Services, completely without programming knowledge. You are even able to export the BDC models created by the BCS Connector to Visual Studio 2010 for further customizing. The Xtract PPS component offers a SAP data source provider for the PerformancePoint Services of SharePoint 2010.

This article gives you an overview of the ERPConnect Services runtime and shows how easily you can create and incorporate business data from SAP into different SharePoint application types, like Web Parts, Application Pages or Silverlight modules. This article introduces neither the BCS Connector nor the Xtract PPS component.

SAP Background

This section will give you a short explanation and background of SAP objects that can be used in ERPConnect Services for SharePoint 2010. The most important objects are SAP tables and function modules.

A function module is basically similar to a normal procedure in conventional programming languages. Function modules are written in ABAP, the SAP programming language, and are accessible from any other program within a SAP system. They accept import and export parameters as well as other kinds of special parameters. In addition, BAPIs (Business APIs) are special function modules that are organized within the SAP Business Object Repository. In order to use function modules with ERPConnect Services they must be marked as remote-enabled (RFC).

SAP tables can also be accessed by ERPConnect Services. Tables in SAP are basically relational database tables. Other SAP objects like BW Cubes or SAP Queries can be accessed via the XtractQL query language (see below).

ERPConnect Services Installation & Configuration

Theobald Software provides an evaluation version that can be downloaded from their website. Installing ERPConnect Services on a SharePoint 2010 server is done by an installer and is straightforward. The SharePoint Administration Service must run on the local server (see Windows Services). For more information see the product documentation. After the installation has completed successfully, navigate to the Service Applications screen within the Central Administration (CA) of SharePoint:


Before creating your first ERPConnect Service Application, a Secure Store target application must be created, where ERPConnect Services will save the SAP user credentials. A SNC (Secure Network Communication) option for Single Sign-On (SSO) scenarios will be available starting with the next product version. In the settings page for the "Secure Store Service" create a new Target Application and name the application "ERPConnect Services". Click on the button "Next" to define the store fields as follows:


Finish the creation process by clicking on "Next" and define application administrators. Then, mark the application, click "Set Credentials" and enter the SAP user credentials:


Let’s go on and create a new ERPConnect Service Application!

Click the "ERPConnect Service Application" link in the "New" menu of the Service Applications page (see also first screenshot above). This opens the "Create New ERPConnect Service Application" dialog to define the name of the service application, the SAP connection data and the IIS application pool:


Click "Create" after entering all data and you will see the following entries in the Service Applications screen:


That’s it! You are now done setting up your first ERPConnect Service Application.

ERPConnect Services Development

The ERPConnect Services runtime functionality covers different programming demands through generically usable interface functions. The ERPConnect Services are managed by the Central Administration of SharePoint. The following service and function areas are provided by ERPConnect Services:

1. Querying and retrieving data directly from SAP tables

2. Executing SAP function modules / BAPIs

3. Executing XtractQL query statements

The next sections show how to use these service and function areas and access different SAP objects from within your custom SharePoint applications using the ERPConnect Services. The runtime can be used in applications within the SharePoint context like Web Parts or Application Pages. In order to do so, you need to reference the assembly ERPConnectServices.Server.Common.dll in the project.

Before you can access data from the SAP system you must create an instance of the ERPConnectServiceClient class. This is the gateway to all SAP objects and to the generic API of the ERPConnect Services runtime in general. In the SharePoint context there are two options to create a client object instance:

// Option #1
ERPConnectServiceClient client = new ERPConnectServiceClient();

// Option #2
ERPConnectServiceApplicationProxy proxy = SPServiceContext.Current.GetDefaultProxy(
    typeof(ERPConnectServiceApplicationProxy)) as ERPConnectServiceApplicationProxy;

ERPConnectServiceClient client = proxy.GetClient();

For more details on using ERPConnect Services in Silverlight or desktop applications see the specific sections below.

Querying Tables

Querying and retrieving table data is a common task for developers. The ERPConnect Services runtime allows retrieving data directly from SAP tables. The ERPConnectServiceClient class provides a method called ExecuteTableQuery, with two overloads, which queries SAP tables in a simple way. The method also supports passing miscellaneous parameters like row count and skip, a custom function, a where clause definition and a list of fields to return. These parameters are defined using an ExecuteTableQuerySettings instance.

DataTable dt = client.ExecuteTableQuery("T001");

…

ExecuteTableQuerySettings settings = new ExecuteTableQuerySettings {
  RowCount = 100,
  WhereClause = "ORT01 = 'Paris' AND LAND1 = 'FR'",
  Fields = new ERPCollection<string> { "BUKRS", "BUTXT", "ORT01", "LAND1" }
};

DataTable dt = client.ExecuteTableQuery("T001", settings);

…

// Sample 2
DataTable dt = client.ExecuteTableQuery("MAKT",
  new ExecuteTableQuerySettings {
    RowCount = 10,
    WhereClause = "MATNR = '60-100C'",
    OrderClause = "SPRAS DESC"
});

The first query reads records from the SAP table T001 where the field ORT01 equals Paris and the field LAND1 equals FR (France); it returns the top 100 records and the result set contains only the fields BUKRS, BUTXT, ORT01 and LAND1. The second query returns the top ten records of the SAP table MAKT, where the field MATNR equals the material number 60-100C. The result set is ordered by the field SPRAS.

Executing Function Modules

In addition to querying SAP tables, the runtime API executes SAP function modules (BAPIs). Function modules must be marked as remote-enabled modules (RFC) within SAP. The ERPConnectServiceClient class provides a method called CreateFunction to create a structure of metadata for the function module. The method returns an instance of the data structure ERPFunction. This object instance contains all parameter types (import, export, changing and tables) that can be used with function modules.

In the sample below we call the function SD_RFC_CUSTOMER_GET and pass a name pattern (T*) for the export parameter with name NAME1. Then we call the Execute method on the ERPFunction instance. Once the method has been executed the data structure is updated. The function returns all customers in the table CUSTOMER_T.

ERPFunction function = client.CreateFunction("SD_RFC_CUSTOMER_GET");
function.Exports["NAME1"].ParamValue = "T*";
function.Execute();

foreach(ERPStructure row in function.Tables["CUSTOMER_T"])
  Console.WriteLine(row["NAME1"] + ", " + row["ORT01"]);

The following code shows an additional sample. Before we can execute this function module we need to define a table with HR data as input parameter. Which parameters you need and what values the function module returns depends on the implementation of the function module.

ERPFunction function = client.CreateFunction("BAPI_CATIMESHEETMGR_INSERT");
function.Exports["PROFILE"].ParamValue = "TEST";
function.Exports["TESTRUN"].ParamValue = "X";

ERPTable records = function.Tables["CATSRECORDS_IN"];
ERPStructure r1 = records.AddRow();
r1["EMPLOYEENUMBER"] = "100096";
r1["WORKDATE"] = "20110704";
r1["ABS_ATT_TYPE"] = "0001";
r1["CATSHOURS"] = (decimal)8.0;
r1["UNIT"] = "H";

function.Execute();

ERPTable ret = function.Tables["RETURN"];

foreach(var i in ret)
  Console.WriteLine("{0} - {1}", i["TYPE"], i["MESSAGE"]);



Executing XtractQL Query Statements

The ERPConnect Services runtime offers a new way of accessing SAP data. Theobald Software has developed a SAP query language called XtractQL. The XtractQL query language, also known as XQL, consists of ABAP and SQL syntax elements.

XtractQL allows querying SAP tables, BW Cubes and SAP Queries as well as executing function modules. It also returns metadata for the objects, and even MDX statements can be executed with XQL. All XQL queries return a data table object as result set. When executing function modules, the caller must define the returning table (see the sample below – INTO @RETVAL). XQL is very useful in situations where you need to handle dynamic statements. The following list shows a couple of query samples you may use in your applications:

SELECT TOP 5 * FROM T001W WHERE FABKL = 'US'

This query selects the top 5 records of the SAP table T001W where the field FABKL equals the value US.

SELECT * FROM MARA WITH-OPTIONS(CUSTOMFUNCTIONNAME = 'Z_XTRACT_IS_TABLE')

This query selects all records and fields of the SAP table MARA using a custom SAP function module to retrieve the data called Z_XTRACT_IS_TABLE.

SELECT MAKTX AS [ShortDesc], MANDT, SPRAS AS Language FROM MAKT

This query selects all records of the SAP table MAKT. The result set will contain three fields named ShortDesc, MANDT and Language.

EXECUTE FUNCTION 'SD_RFC_CUSTOMER_GET'
    EXPORTS KUNNR='0000003340'
    TABLES CUSTOMER_T INTO @RETVAL;

This query executes the SAP function module SD_RFC_CUSTOMER_GET and returns as result the table CUSTOMER_T (defined as @RETVAL).

DESCRIBE FUNCTION 'SD_RFC_CUSTOMER_GET' GET EXPORTS

This query returns metadata about the export parameters of the SAP function module.

SELECT TOP 30 LIPS-LFIMG, LIPS-MATNR, TEXT_LIKP_KUNNR AS CustomerID
    FROM QUERY 'S|ZTHEO02|ZLIKP'
    WHERE SP$00002 BT '0080011000' AND '0080011999'

This statement executes the SAP Query "S|ZTHEO02|ZLIKP" (the name includes the workspace, user group and the query name). As you can see, XtractQL extends the SQL syntax with ABAP- or SAP-specific syntax elements. This way you can define fields using the LIPS-MATNR format and SAP-like where clauses like "SP$00002 BT '0080011000' AND '0080011999'".

ERPConnect Services for SharePoint 2010 (ECS) provides a little helper tool, the XtractQL Explorer (see screenshot below), to learn more about the query language and to test XQL queries. You can use this tool independent of SharePoint 2010, but you need access to a SAP system.



To find out more about the XtractQL language syntax, see the product manual.

ERPConnect Services In Silverlight And Desktop Applications

So far all samples have used the assembly ERPConnectServices.Server.Common.dll as project reference, and all code snippets shown run within the SharePoint context, e.g. in a Web Part. The ERPConnect Services runtime also provides client libraries for Silverlight and desktop applications:

ERPConnectServices.Client.dll for Desktop applications

ERPConnectServices.Client.Silverlight.dll for Silverlight applications

You need to add the reference that matches the type of project you are implementing.

In Silverlight the implementation and design pattern is a little bit more complicated, since all web services are called asynchronously. It's also not possible to use the DataTable class, because it is simply not implemented for Silverlight. ERPConnect Services provides a similar class called ERPDataTable, which the API uses in these cases.

The ERPConnectServiceClient class for Silverlight provides the method ExecuteTableQueryAsync and an event called ExecuteTableQueryCompleted as callback delegate.

public event EventHandler<ExecuteTableQueryCompletedEventArgs> ExecuteTableQueryCompleted;

public void ExecuteTableQueryAsync(string tableName)
public void ExecuteTableQueryAsync(string tableName, ExecuteTableQuerySettings settings)

The following code sample shows a simple query of the SAP table T001 within a Silverlight client. First of all, an instance of the ERPConnectServiceClient is created using the URI of the ERPConnectService.svc, then a delegate is attached to handle the completed callback. Next, the query is executed with a RowCount of 150 to limit the number of records in the result set. Once the result is returned, the data is bound to a DataGrid control (see screenshot below) within the callback method.

void OnGetTableDataButtonClick(object sender, RoutedEventArgs e)
{
   // Create the service client with the URI of the ERPConnectService.svc
   ERPConnectServiceClient client = new ERPConnectServiceClient(
      new Uri("http://<SERVERNAME>/_vti_bin/ERPConnectService.svc"));

   client.ExecuteTableQueryCompleted += OnExecuteTableQueryCompleted;

   // Query the SAP table T001 and return at most 150 records
   client.ExecuteTableQueryAsync("T001",
      new ExecuteTableQuerySettings { RowCount = 150 });
}

void OnExecuteTableQueryCompleted(object sender, ExecuteTableQueryCompletedEventArgs e)
{
   if(e.Error != null)
      MessageBox.Show(e.Error.Message);
   else
   {
      // Group the result by city (ORT01) and bind it to the DataGrid
      e.Table.View.GroupDescriptions.Add(new PropertyGroupDescription("ORT01"));
      TableGrid.ItemsSource = e.Table.View;
   }
}

The screenshot below shows the XAML of the Silverlight page:



The final result can be seen below:



ERPConnect Services Designer

The ERPConnect Services for SharePoint 2010 product suite includes a Visual Studio 2010 plugin, the ECS Designer, that allows developers to visually design SAP interfaces. It works similarly to the LINQ to SAP Designer I wrote about a while ago; see the article at CodeProject: LINQ to SAP.

The ECS Designer is not installed automatically with the ERPConnect Services suite; you need to run its installation program manually. The setup adds a new project item type with the file extension .ecs to Visual Studio 2010 and links it with the ECS Designer. The needed references are added automatically after adding an ECS project item. The designer generates source code to integrate with the ERPConnect Services runtime whenever the project item is saved. The generated context class contains methods and sub-classes that represent the defined SAP objects (see screenshots below).



Before you access the SAP system for the first time you will be asked to enter the connection data. You may also load the connection data from the SharePoint system. The ECS Designer GUI is shown in the screenshots below:





The screenshot above, for instance, shows the tables dialog. After clicking the Add (+) button in the main designer screen and searching for a SAP table in the search dialog, the designer opens the tables dialog. In this dialog you can change the name of the generated class, the class modifier and all properties (fields) the final class should contain. To preview your selection, press the Preview button. The next screenshot shows the automatically generated classes in the file named EC1.Designer.cs:



Using the generated code is simple. The project type we are using for this sample is a standard console application, so the ECS Designer references the ERPConnectServices.Client.dll for desktop applications. Since we are not within the SharePoint context, we have to define the URI of the SharePoint system by passing it into the constructor of the ERPConnectServicesContext class.

The designer has generated the class MAKT and an access property MAKTList on the context class for the table MAKT. The type of this property, ERPTableQuery&lt;MAKT&gt;, is a LINQ-queryable data type. This means you can use LINQ statements to define the underlying query. Internally, the ERPTableQuery&lt;T&gt; type translates your LINQ query into a call to ExecuteTableQuery.
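
The snippet below is a minimal usage sketch of the generated context. The URL passed to the ERPConnectServicesContext constructor, the supported LINQ operators and the generated MAKT properties (MAKTX, SPRAS) are assumptions based on the description above; compare it with the code actually generated into EC1.Designer.cs.

// Minimal sketch - constructor argument, LINQ operator support and the
// MAKT properties MAKTX/SPRAS are assumptions, compare with EC1.Designer.cs.
var context = new ERPConnectServicesContext("http://<SERVERNAME>");

var query = from m in context.MAKTList
            where m.SPRAS == "E"       // language key (E = English)
            select m;

foreach(var makt in query)
   Console.WriteLine(makt.MAKTX);      // material short description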



That's it! SAP access at its best.

Advanced Techniques

There are situations when you have to use the exact same SAP connection while calling a series of function modules in order to receive the correct result. Let’s take the following code:

ERPConnectServiceClient client = new ERPConnectServiceClient(); 

using(client.BeginConnectionScope()) 
{ 
    ERPFunction f = client.CreateFunction("BAPI_GOODSMVT_CREATE"); 

    ERPStructure s = f.Exports["GOODSMVT_HEADER"].ToStructure(); 
    s["PSTNG_DATE"] = "20110609"; // Posting Date in the Document 
    s["PR_UNAME"] = "BAEURLE"; // UserName 
    s["HEADER_TXT"] = "XXX"; // HeaderText 
    s["DOC_DATE"] = "20110609"; // Document Date in Document 

    f.Exports["GOODSMVT_CODE"].ToStructure()["GM_CODE"] = "01"; 

    ERPStructure r = f.Tables["GOODSMVT_ITEM"].AddRow(); 
    r["PLANT"] = "1000";            // Plant 
    r["PO_NUMBER"] = "4500017210";    // Purchase Order Number 
    r["PO_ITEM"] = "010";        // Item Number of Purchasing Document 
    r["ENTRY_QNT"] = 1;            // Quantity in Unit of Entry 
    r["MOVE_TYPE"] = "101";        // Movement Type 
    r["MVT_IND"] = "B";            // Movement Indicator 
    r["STGE_LOC"] = "0001";        // Storage Location 
    
    f.Execute(); 

    string matDocument = f.Imports["MATERIALDOCUMENT"].ParamValue as string; 
    string matDocumentYear = f.Imports["MATDOCUMENTYEAR"].ParamValue as string; 

    ERPTable ret = f.Tables["RETURN"]; //.ToADOTable(); 

    foreach(var i in ret) 
        Console.WriteLine("{0} - {1}", i["TYPE"], i["MESSAGE"]); 

    ERPFunction fCommit = client.CreateFunction("BAPI_TRANSACTION_COMMIT"); 
    fCommit.Exports["WAIT"].ParamValue = "X"; 
    fCommit.Execute(); 
} 

In this sample we create a goods receipt for a goods movement with BAPI_GOODSMVT_CREATE. The final call to BAPI_TRANSACTION_COMMIT will only work if both calls are executed over the same SAP connection under the hood.

ERPConnect Services does not provide direct access to the underlying SAP connection, but the library offers a mechanism called connection scoping. You may create a new connection scope with the client library, telling ERPConnect Services to use the same SAP connection until you close the connection scope. Within the connection scope every library call will use the same SAP connection.

In order to create a new connection scope you need to call the BeginConnectionScope method of the ERPConnectServiceClient class. The method returns an IDisposable object, which can be used in conjunction with the C# using statement to end the connection scope. Alternatively, you may call the EndConnectionScope method, as sketched below.
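
The following is a minimal sketch of the explicit variant. Only BeginConnectionScope, EndConnectionScope, CreateFunction and Execute are taken from the samples and text above; the surrounding structure is illustrative.

ERPConnectServiceClient client = new ERPConnectServiceClient();

client.BeginConnectionScope();
try
{
    ERPFunction create = client.CreateFunction("BAPI_GOODSMVT_CREATE");
    // ... fill exports and tables as shown in the goods movement sample above ...
    create.Execute();

    ERPFunction commit = client.CreateFunction("BAPI_TRANSACTION_COMMIT");
    commit.Exports["WAIT"].ParamValue = "X";
    commit.Execute();
}
finally
{
    // Close the scope explicitly instead of disposing the scope object
    client.EndConnectionScope();
}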

It's also possible to use function modules with nested structures as parameters, a special construct in SAP. The goods receipt sample above uses a nested structure for the export parameter GOODSMVT_CODE. For more detailed information about nested structures and tables see the product documentation.

Summary

ERPConnect Services for SharePoint 2010 is a powerful product suite for integrating SAP business data into SharePoint applications. Combining the ERPConnect Services runtime with the BCS Connector tool unleashes the real power of this toolkit. You will find more information about ERPConnect Services for SharePoint 2010 here:

Product website of ERPConnect Services for SharePoint 2010