My new website at Parago.de

http://www.parago.de



How To Integrate SAP Business Data Into SharePoint 2010 Using BCS Services And BCS Connector From Theobald Software

The Business Connectivity Services (BCS) of SharePoint 2010 provide a great way to fully integrate external data into SharePoint. In most cases developers integrate SQL database tables into the BCS services. But how do we connect to a SAP system? How do we integrate real-world SAP business data like tables and function modules or BAPIs into SharePoint?

The answer is just a few clicks away. Theobald Software has just released the ERPConnect Services (ECS) for SharePoint 2010 product suite, which includes a great tool named BCS Connector. The BCS Connector allows developers to create SAP-driven BDC models in minutes with just a couple of clicks.

The BCS Connector application can be installed on client machines and connects remotely to the SharePoint BCS service application. In this post I will give you an overview of the tool by creating a BDC model with an entity called Customer. We will also create two operations for the entity, a Finder and a SpecificFinder method. Both methods use the SAP built-in function module SD_RFC_CUSTOMER_GET, a very simple SAP function that returns a list of customers.

To create a new BDC model, you must first enter the SAP and SharePoint connection data after starting the BCS Connector application (see the screenshot of the SAP connection below).

Once you have entered the SAP connection data, press the OK button and start adding your new BDC entity. To add an entity that is connected to the SAP function module SD_RFC_CUSTOMER_GET, press the New button on the lower left side. A new wizard dialog will pop up. Select Function and press the Next button.

Then search for the function module and press the Next button to select the structure of our new BDC entity.

The last wizard page shows a list of all possible entity structures available for this function module. Select the table CUSTOMER_T, which contains our customer data, and press the Finish button.

Now we have created our new entity, but we still need to rename it from CUSTOMER_T to Customer. Each entity in BCS services must define at least two operations or methods: a so-called Finder method that returns a list of entities and a SpecificFinder method that returns a specific entity within the list.

You also need to define the identifier fields within the entity by clicking the checkbox for the field KUNNR (Customer ID). You may also rename any or all fields. Next, we create the Finder method by clicking the New button.

The Finder option is already selected. Just press the Finish button; the BCS Connector automatically creates everything else and then opens the Edit Operation dialog.

This dialog allows you to define the return parameter, input parameters and filters for the entity operation. To execute the SAP function module SD_RFC_CUSTOMER_GET we need to define a value for the input parameters KUNNR or NAME1. For demonstration purposes I just define a name pattern for the field NAME1. This query returns all customers whose names start with T. What you can define as an input parameter depends on the function itself. Clicking the Preview button displays a list of all filtered customers.

In the same way we create the SpecificFinder method:

Finally, we have created a new entity with two entity operations and can now save it to the SharePoint server. Just press the Save Model button. This results in a new BDC model being created on the server:

You can find the BDC models within the Central Administration of SharePoint 2010.

So far we have only created a model, but we also want to display the customer data in an external list. We can create an external list using either the SharePoint Designer or the BCS Connector. I will show you the second option. Switch to the External Lists tab of the ribbon bar and click the New External List button.

The New External List dialog has all values pre-selected. You may change the name of the external list. Click the Create button and you are done. The final external list looks as follows:

That was really easy, and you can even export the BDC model to Visual Studio 2010 for additional customizing.

Further information about the product ERPConnect Services and BCS Connector can be found here:

http://www.theobald-software.com/en/products/erpconnectservices.htm

FluentSP – The Fluent SharePoint API

Download FluentSP 1.0 from Codeplex.com

More information on my new homepage at http://www.parago.net

If you do a lot of SharePoint programming, you know you often have to write lengthy pieces of code to implement simple tasks like querying SharePoint lists. Nowadays you read a lot about fluent APIs or fluent interfaces. jQuery, for instance, is a JavaScript library that successfully introduced a fluent API to handle the hierarchical structure of HTML documents.

Today, I want to introduce a small library I have developed, FluentSP, a modern fluent interface around the classic SharePoint 2010 API. By using FluentSP instead of the classic SharePoint API, you will be able to chain methods and act on sets of items of the underlying SharePoint objects.
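To see the difference, here is a rough comparison of a classic query and its fluent counterpart. This is a sketch: the fluent chain uses the FluentSP facade methods described below, and error handling is omitted for brevity.

```csharp
// Classic SharePoint object model: explicit intermediate objects
using(SPWeb web = SPContext.Current.Site.OpenWeb("Home"))
{
    foreach(SPListItem item in web.Lists["Tasks"].Items)
        Console.WriteLine(item.Title);
}

// FluentSP: the same query as a single method chain
SPContext.Current.Site.Use()
    .Web("Home")
        .List("Tasks")
            .Items()
            .ForEach(i => Console.WriteLine(i.Title));
```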

What is a fluent API?
Check out the CodeProject article A Look at Fluent APIs and the Wikipedia article Fluent interface.

To enter the fluent API, call the Use() method on an SPSite, SPWeb, SPWebCollection or SPListCollection instance. The Use() method is implemented as an extension method that returns the entry facade object (see the facade table below). Another entry point to the fluent API is the static class FluentSP with its static methods CurrentSite, CurrentWeb, CurrentLists and RootWebLists.

SPContext.Current.Site.Use()... // => Returns the SPSiteFacade as entry point

// OR:
FluentSP.CurrentSite()...       // => Returns the SPSiteFacade as entry point 
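Such an entry-point extension method could be implemented roughly like this. This is a simplified sketch, not the actual FluentSP source; the SPSiteFacade constructor signature is an assumption.

```csharp
// Hypothetical sketch of the Use() entry point; the real FluentSP
// implementation may differ.
public static class FluentExtensions
{
    // Wraps the SPSite in its fluent facade so chaining can begin
    public static SPSiteFacade Use(this SPSite site)
    {
        return new SPSiteFacade(site);
    }
}
```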

Using the entry facade instance you can start chaining the available facade methods as follows:

FluentSP.CurrentSite().Web("Home").List("Tasks").Items().ForEach(i => { /* Do something with the item i of type SPListItem... */ });

// OR:
FluentSP.CurrentSite()
  .Web("Home")
    .List("Tasks")
      .Items()
      .ForEach(i => { /* Do something with... */ });

Each facade object wraps an underlying data item; for instance, the SPSiteFacade class is the fluent wrapper of the SPSite class. Depending on which facade method you call, the method returns either the current facade instance (e.g., ForEach() or Where()) or a new child facade object (e.g., Items()). By chaining methods in this way you build up a tree or hierarchy of facade instances. To step back to the parent or previous facade instance, call the End() method:

site.Use()
       .RootWeb()
         .Site()
       .End()		// Returns SPWebFacade  as parent facade
         .Site()
       .End()		// Returns SPWebFacade  as parent facade
     .End();		// Returns SPSiteFacade as parent facade
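One way such parent tracking could work: each facade keeps a reference to the facade that created it, and End() simply returns that reference. This is an illustrative sketch only; the real FluentSP base classes are presumably generic (as the TParentFacade type parameter in the extensibility sample below suggests), so that End() can return the concrete parent facade type.

```csharp
// Hypothetical sketch: each facade remembers its parent so End()
// can unwind the chain. Names and signatures are illustrative.
public abstract class BaseFacade
{
    readonly BaseFacade parent;

    protected BaseFacade(BaseFacade parent)
    {
        this.parent = parent;
    }

    // Steps back to the facade that created this one
    public BaseFacade End()
    {
        return parent;
    }
}
```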

FluentSP is currently missing a number of potentially useful methods, but you can easily extend the FluentSP API with custom facade classes and extension methods; see below and the source code for implementation examples.

Samples

SPSite site = SPContext.Current.Site;

// ----------------------------

// Outputs titles of all lists of the root web where the list title starts with T
site.Use().RootWeb().Lists().Where(l => l.Title.StartsWith("T")).ForEach(l => Console.WriteLine(l.Title));

// Outputs titles of all lists of the root web where the list title ends with ts (using RegEx)
int c;
site.Use().RootWeb().Lists("ts$").ForEach(l => Console.WriteLine(l.Title)).Count(out c);

// Outputs titles of all lists of the root web in ascending order where the title starts with T
site.Use().RootWeb().Lists().Where(l => l.Title.StartsWith("T")).OrderBy(l => l.Title).ForEach(l => Console.WriteLine(l.Title));

// Outputs titles of all lists of the root web in descending order where the title starts with T
site.Use()
    .RootWeb()
      .Lists()
      .Where(l => l.Title.StartsWith("T"))
      .OrderByDescending(l => l.Title)
      .ForEach(l => Console.WriteLine(l.Title));

// ----------------------------

// Delete all items in the Members list, then add 7 new members and then select and output 
// the titles of a few of the newly created items
site.Use()
    .RootWeb()
      .List("Members")
      .Do(w => Console.WriteLine("Deleting all members..."))
       .Items()
       .Delete()
      .End()
      .Do(w => Console.WriteLine("Adding all members..."))
      .AddItems(7, (i, c) => i["Title"] = "Member " + c)
       .Items()
       .Skip(2)
       .TakeUntil(i => ((string)i["Title"]).EndsWith("6"))
       .ForEach(i => Console.WriteLine(i["Title"]));

// ----------------------------

// Search for lists that are created by a specific user and, depending on the results,
// display different messages by calling the IfAny or IfEmpty methods
site.Use()
    .RootWeb()
      .Lists()
      .ThatAreCreatedBy("Unknown User")
      .IfAny(f => f.ForEach(l => Console.WriteLine(l.Title)))
      .IfAny(l => l.Title.StartsWith("M"), f => Console.WriteLine("Lists found that start with M*"))
      .IfEmpty(f => Console.WriteLine("No lists found for user"))
    .End()
    .Do(w => Console.WriteLine("---"))
      .Lists()
      .ThatAreCreatedBy("System Account")
      .IfAny(f => f.ForEach(l => Console.WriteLine(l.Title)));

// ----------------------------

var items = new List<SPListItem>();

// Query with Skip and TakeUntil methods
site.Use().RootWeb().List("Members").Items().Skip(2).TakeUntil(i => i.Title.EndsWith("5")).ForEach(i => { items.Add(i); Console.WriteLine(i.Title); });

// Query with Skip and TakeWhile methods
site.Use()
    .RootWeb()
      .List("Members")
       .Items()
       .Skip(2)
       .TakeWhile(i => i.Title.StartsWith("Member"))
       .ForEach(i => { items.Add(i); Console.WriteLine(i.Title); })
      .End()
       .Items()
       .Where(i => i.Title == "XYZ")
       .ForEach(i => { items.Add(i); Console.WriteLine(i.Title); });

// ----------------------------

// Adds new items using the Do method with the passed facade object
site.Use()
    .RootWeb()
    .AllowUnsafeUpdates()
      .List("Members")
      .Do((f, l) => {
        for(int c = 1; c <= 5; c++)
          f.AddItem(i => i["Title"] = "Standard Member #" + c);
      })
      .AddItem(i => i["Title"] = "Premium Member")
       .Items()
        .OrderBy(i => i.Title)
        .ForEach(i => Console.WriteLine(i["Title"]));

Extensibility Samples

// This sample is using the ThatAreCreatedBy extension method defined in Extensions.cs to show how to extend the fluent API
site.Use()
        .RootWeb()
          .Lists()
          .ThatAreCreatedBy("System Account", "jbaurle")
          .Count(c => Console.WriteLine("Lists found: {0}", c))
          .ForEach(l => Console.WriteLine(l.Title));

// This sample uses the new SPWebApplicationFacade extension defined in SPWebApplicationFacade.cs to show how to extend the fluent API
site.WebApplication.Use()
              .Sites()
              .ForEach(i => Console.WriteLine(i.Url));

// This sample uses an alternative implementation for SPSiteFacade defined in SPSiteFacadeAlternate.cs to show how to extend the fluent API
site.WebApplication.Use().WithFirstSite().DoSomething();
site.Use<SPSiteFacadeAlternate<BaseFacade>>().DoSomething();

The custom method ThatAreCreatedBy, which is used in the first query of the extensibility samples, is implemented as follows:

static class Extensions
{
  public static SPListCollectionFacade<TParentFacade> ThatAreCreatedBy<TParentFacade>(this SPListCollectionFacade<TParentFacade> facade, params string[] names)
    where TParentFacade : BaseFacade
  {
    // NOTE: This sample uses the GetCollection method of the given facade instance to retrieve the current 
    // collection and adds its query (see LINQ Deferred Execution). The Set method updates the 
    // underlying collection. The GetCurrentFacade method will then return the current facade to allow 
    // method chaining.

    if(names.Length > 0)
      facade.Set(facade.GetCollection().Where(i => names.Contains(i.Author.Name)));

    return facade.GetCurrentFacade();
  }
}

For more samples and details check out the source code you can download from Codeplex.

Built-In Facades and Methods

See Codeplex

LINQ to SAP

A lot has been written about Microsoft’s data access technology LINQ (Language Integrated Query) since the first preview version was published. But there are still some interesting aspects to LINQ and its extensibility. This article introduces a new provider called LINQ to SAP, which offers an easy way to retrieve business data from SAP/R3 systems within a .NET application.

With the introduction of the .NET Framework 3.5 and the programming language extensions for C# and VB.NET, Microsoft began to redefine the way developers implement data access. Nearly all applications today query data from different data sources, mainly SQL databases, XML files or in-memory collections. LINQ offers a universal and standardized approach to access all those data sources using special data providers. The LINQ language syntax strongly resembles that of SQL. The following sample shows how to query data from a string array with LINQ:

string[] names = { "John", "Patrick", "Bill", "Tom" };

var res = from n in names where n.Contains("o") select n;

foreach(var name in res)
    Console.WriteLine(name);

This simple LINQ query based on an in-memory collection (array) selects all items in the array names that contain the letter "o". Console output: "John, Tom". A LINQ introduction is beyond the scope of this article; a very good introductory article can be found on CodeProject.com.

LINQ Providers

The .NET Framework comes with built-in providers for collections and lists (LINQ to Objects), for Microsoft SQL Server databases (LINQ to SQL), for XML files (LINQ to XML) and for DataSet object instances (LINQ to DataSet). Besides the standard providers, developers can extend LINQ by creating custom providers to support special data sources. LINQ to LDAP or LINQ to Amazon are examples of such custom providers.

To write a custom LINQ provider one must basically implement two interfaces: IQueryable and IQueryProvider. These interfaces make objects queryable in LINQ expressions. Developing a LINQ provider can be a very complex task, but there are quite a few good blog entries on the net explaining the steps in detail.
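To show the shape of these two interfaces, here is a minimal skeleton of a hypothetical provider. The type names are illustrative, and the hard part, translating the expression tree into an actual SAP call, is deliberately left out.

```csharp
using System;
using System.Collections;
using System.Collections.Generic;
using System.Linq;
using System.Linq.Expressions;

// Illustrative skeleton only, not the actual LINQ to SAP implementation
public class SapQueryable<T> : IQueryable<T>
{
    public SapQueryable(IQueryProvider provider, Expression expression)
    {
        Provider = provider;
        Expression = expression;
    }

    public Type ElementType { get { return typeof(T); } }
    public Expression Expression { get; private set; }
    public IQueryProvider Provider { get; private set; }

    public IEnumerator<T> GetEnumerator()
    {
        // Executing the query enumerates the translated results
        return Provider.Execute<IEnumerable<T>>(Expression).GetEnumerator();
    }

    IEnumerator IEnumerable.GetEnumerator() { return GetEnumerator(); }
}

public class SapQueryProvider : IQueryProvider
{
    public IQueryable<T> CreateQuery<T>(Expression expression)
    {
        return new SapQueryable<T>(this, expression);
    }

    public IQueryable CreateQuery(Expression expression)
    {
        throw new NotImplementedException();
    }

    public T Execute<T>(Expression expression)
    {
        // Here the expression tree would be translated into an SAP query
        throw new NotImplementedException();
    }

    public object Execute(Expression expression)
    {
        throw new NotImplementedException();
    }
}
```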

This article will introduce a new provider called LINQ to SAP from Theobald Software which provides developers with a simple way to access SAP/R3 systems and their data objects. The software also provides a Visual Studio 2008 Designer to define SAP objects interactively and integrate them in .NET applications.

SAP Background

This section gives you a short explanation and background of the SAP objects that are queryable by LINQ to SAP. The most important objects are Function Modules, Tables, BW Cubes and Queries.

A Function Module is basically similar to a normal procedure in conventional programming languages. Function Modules are written in ABAP, the SAP programming language, and are accessible from any other program within a SAP/R3 system. They accept import and export parameters as well as other kinds of special parameters. The image below shows an example of a Function Module named BAPI_EQUI_GETLIST within the SAP Workbench:

Figure 1: Function Module (SAP Workbench)

In addition, BAPIs (Business APIs) are special Function Modules that are organized within the SAP Business Object Repository. LINQ to SAP also allows access to SAP tables, which are basically straightforward relational database tables. Furthermore, the LINQ to SAP Designer allows developers to define and access BW Cubes (Business Warehouse Cubes) and Queries (SAP Query). BW Cubes are also known as OLAP cubes; data are organized in a multi-dimensional way within a cube. SAP Queries work just like other queries. To identify a SAP Query uniquely, three pieces of information are necessary: the user area (global or local), the user group and the name of the query. With the concept of Queries, SAP provides a very simple way to generate reports without the need to know the ABAP programming language.

Visual Studio 2008 Designer for LINQ to SAP

In order to use LINQ to SAP and the associated Visual Studio Designer, the .NET library ERPConnect.net from Theobald Software must be installed first. This software is the basic building block between .NET and a SAP/R3 system and provides an easy API to exchange data between the two systems. The company offers a free trial version for download. After installing ERPConnect.net, LINQ to SAP must be installed separately using a setup program (see the manual). The provider and the designer are actually extensions to the ERPConnect.net library. The LINQ to SAP provider itself consists of the Visual Studio 2008 Designer and additional class libraries that are bundled within the namespace ERPConnect.Linq.

The setup adds a new ProjectItem type to Visual Studio 2008 with the file extension .erp and links it with the designer. Double-clicking an .erp file opens the LINQ to SAP Designer. The designer gives application developers the option to automatically generate source code to integrate with SAP objects. For all SAP objects defined in an .erp file, the provider creates a context class which inherits from the ERPDataContext base class. The generated context class contains methods and sub-classes that represent the defined SAP objects. Beside the .erp file, the LINQ to SAP Designer saves the associated, automatically generated source code in a file with the extension .Designer.cs.

Figure 2: Add new project item

Figure 3: SAP objects in LINQ to SAP Designer

Figure 4: Connection dialog in LINQ to SAP Designer

Function Modules

This section shows how to access and obtain data using the function module BAPI_EQUI_GETLIST by creating a LINQ to SAP object. The module returns an equipment list for pre-defined plants. First of all, one must add a new LINQ to SAP file (.erp) to a new or existing Visual Studio 2008 project. Opening the .erp file starts the LINQ to SAP Designer. Double-clicking the Function item in the Visual Studio toolbox adds a new SAP object of type Function Module. In the next step the object search dialog opens and the developer can search for function modules.

Figure 5: Search dialog in LINQ to SAP Designer

Once the selection is made, the LINQ to SAP Designer shows the Function Module dialog box with all data, properties and parameter definitions of the selected module BAPI_EQUI_GETLIST. The user can now change the name of the auto-generated method as well as all used parameters.

Figure 6: Function Module dialog in LINQ to SAP Designer

For each function module the LINQ to SAP Designer generates a context class method with all additional object classes and structures. If, for instance, the user defines a method name GetEquipmentList for the function module BAPI_EQUI_GETLIST, the designer generates a context class method with that name and the defined method signature. The user can also specify which parameters to exchange. The lower area of the dialog displays the typical SAP parameters: IMPORT, EXPORT, CHANGING and TABLES parameters. LINQ to SAP allows you to define default values for SAP parameters. Those parameters can also be used as parameters for the auto-generated context class method as well as for return values. The parameters and the associated structures can also be renamed.

The method signature for the function module defined above looks like this:

public EquipmentTable GetEquipmentList(PlantTable plants)

The context class itself is named SAPContext by default. The context class name, the namespace, the connection settings as well as other flags can be defined in the properties window of the LINQ to SAP Designer. The following code shows how to use the context class SAPContext:

class Program
{
    static void Main()
    {
        SAPContext dc = new SAPContext("TESTUSER", "XYZ");

        SAPContext.PlantTable plants = new SAPContext.PlantTable();
        SAPContext.PlantStructure ps = plants.Rows.Add();
        ps.SIGN = "I";
        ps.OPTION = "EQ";
        ps.LOW = "3000";

        SAPContext.EquipmentTable equipList = dc.GetEquipmentList(plants);
    }
}

Tables

The procedure for adding a SAP Table is basically the same as for function modules (see above). After adding the SAP Table object from the toolbox in Visual Studio and finding the table with the search dialog, the Table dialog will show up:

Figure 7: Tables Module dialog in LINQ to SAP Designer

In the upper part of the table dialog the user must define the class name for the table object to be auto-generated. The default name is the name of the table. The lower part shows a data grid with all table fields and their definitions. For each field a class property name can be defined for the auto-generated context class code. The checkbox in the first column selects whether the field will be part of the table class.

The figure above shows the definition of the SAP Table object T001W, which stores plant information. The class name has not been changed, so the designer will create a C# class with the name T001W. In addition, the context class will contain a property T001WList. The type of this property is ERPTable&lt;T001W&gt;, which is a LINQ-queryable data type.

The following code shows how to query the table T001W using the context class:

class Program
{
    static void Main()
    {
        SAPContext dc = new SAPContext("TESTUSER", "XYZ");
        dc.Log = Console.Out;

        var res = from p in dc.T001WList
                  where p.WERKS == "3000"
                  select p;

        foreach(var item in res)
            Console.WriteLine(item.NAME1);
    }
}

SAP Context Class and Logging

To access objects using LINQ to SQL, the provider generates a context class derived from DataContext. Accordingly, LINQ to SAP creates a context class called SAPContext. This class is defined as a partial class. A partial class is a type declaration that can be split across multiple source files and therefore allows developers to easily extend auto-generated classes like the context class of LINQ to SAP.

The code sample below shows how to add a partial class (file SAPContext.cs) with a new custom method GetEquipmentListForMainPlant that extends the context class generated by the LINQ to SAP Designer. This new method internally calls the auto-generated method GetEquipmentList with a pre-defined parameter value. The C# compiler will merge the auto-generated LINQtoERP1.Designer.cs with the SAPContext.cs source file.

using System;

namespace LINQtoSAP
{
    partial class SAPContext
    {
        public EquipmentTable GetEquipmentListForMainPlant()
        {
            SAPContext.PlantTable plants = new SAPContext.PlantTable();

            SAPContext.PlantStructure ps = plants.Rows.Add();
            ps.SIGN = "I";
            ps.OPTION = "EQ";
            ps.LOW = "3000";

            return GetEquipmentList(plants);
        }
    }
}

LINQ to SAP also provides the capability to log LINQ query translations. In order to log data, the Log property of the context class must be set to a TextWriter instance, e.g. the console output Console.Out. LINQ to SAP does only very rudimentary logging, restricted to table objects, but it helps developers get a feeling for what the translated WHERE part looks like.

Summary

Overall, LINQ to SAP is a simple yet powerful LINQ data provider and Visual Studio 2008 Designer. It also gives you a feeling for how to develop against a SAP/R3 system using .NET. For more information about the product, please check out the vendor’s homepage at http://www.theobald-software.com.

Creating DAL Components Using Custom ASP.NET Build Providers And Compiler Techniques

There are many articles on the internet dealing with the creation and usage of a Data Access Layer (DAL) and its components, also known as DALCs (DAL Components). There is nothing new in the process of creating a DAL. You can use Typed DataSets, Microsoft’s Enterprise Library (DAAB) or one of the many third-party tools to implement a comprehensive DAL system.

The main objectives of this article are to show how to create and use ASP.NET build providers and to explain how easily you can analyze a small self-defined description language to declare DALCs or anything else. Implementing a full-blown DAL killer application which you can use for the rest of your programming life is not the objective of this article. Normally you would not define your own description language to declare DALCs; you would instead use an XML-based description of the components and analyze it using the feature-rich XML classes that come with the .NET framework.

I just wanted to implement a lexical analyzer (a.k.a. scanner or tokenizer), some parsing techniques and dynamic code generation using .NET’s CodeDOM. There are a lot of situations in daily work where it would be handy to develop some kind of parser (even a very small and simple one) to come up with an acceptable and elegant solution. In fact, for one of my customers I defined a description language to automate the extension of a web application.

To show an example of the final result of a dynamically generated DAL using the DALComp application, let’s assume we have a database table called Articles. For this table the DAL (see section "DALC Description Language" below), or rather the build provider, will automatically create a class called Article, containing private member fields and public properties that correspond to the table column names. Nullable types are created for value types.

In addition, the system generates static methods (also defined within a .dal file) to select the requested data. The data is returned as a generic list of type Article (in C#, List&lt;Article&gt;) and can be used as follows:

Sample 1:

foreach(Article article in Article.SelectAll())
    Console.WriteLine(article.Title);

Sample 2:

ArticlesGridView.DataSource = Article.SelectAll();
ArticlesGridView.DataBind();

Sample 3:

<asp:ObjectDataSource ID="ArticlesDS" TypeName="Parago.DAL.Article" SelectMethod="SelectAll"
    runat="server" />

So, now let’s start!

Build Providers

Build providers are new to ASP.NET and version 2.0 of the .NET framework. They basically allow you to hook into the ASP.NET compilation process and build environment. That means you can define a new file type for which you generate source code based on arbitrary file content. The source code (provided, for instance, as a CodeCompileUnit) will then be built into the final compiled website assemblies. In our case, we will define a new file type .dal whose content is plain text containing our own little description language to define DALCs.

In fact, the ASP.NET framework does basically the exact same thing for file types like .aspx and .ascx as well as many more. The corresponding build providers are defined in the global web.config configuration file. For instance, the file type .aspx is handled by a framework class called PageBuildProvider. If you are interested in how the ASP.NET team has implemented this provider, you can use either ILDASM or Lutz Roeder’s .NET Reflector to disassemble the code.

To use build providers in your web applications, you have to register new file types within the local web.config. The file type .dal for the DALComp application is defined in the configuration file as follows:

<compilation>
  <buildProviders>
    <add extension=".dal" type="Parago.DALComp.DALCompBuildProvider, DALComp.BuildProvider"/>
  </buildProviders>
</compilation>

From now on, the class DALCompBuildProvider handles all files with the extension .dal. The class extends the abstract base class BuildProvider and overrides the GenerateCode method. ASP.NET calls this method during the compilation and build process of the website and passes an instance of type AssemblyBuilder to it. Code can then be added by calling the AddCodeCompileUnit method of the AssemblyBuilder.

CodeCompileUnits represent containers for CodeDOM program graphs. Basically they are an in-memory representation of the source code. Each .NET language that supports the code provider model can create source code in its own language based on a CodeCompileUnit.
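As a tiny illustration of what such a program graph looks like (not DALComp’s actual output), the following snippet builds a CodeCompileUnit describing an empty public class Article in the namespace Parago.DAL:

```csharp
using System.CodeDom;

// Illustrative only: builds a CodeDOM graph equivalent to
// "namespace Parago.DAL { public class Article { } }"
CodeCompileUnit unit = new CodeCompileUnit();

CodeNamespace ns = new CodeNamespace("Parago.DAL");
unit.Namespaces.Add(ns);

CodeTypeDeclaration cls = new CodeTypeDeclaration("Article");
cls.IsClass = true;
cls.TypeAttributes = System.Reflection.TypeAttributes.Public;
ns.Types.Add(cls);
```

A language-specific code provider (e.g. CSharpCodeProvider with its GenerateCodeFromCompileUnit method) can then turn this graph into C# or VB.NET source code.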

Creating a clean, language-independent CodeDOM program graph is an annoying and somewhat cumbersome task, but you have to create one if you want to generate source code for different languages. The CodeGen class of DALComp currently generates DAL source code for C# and VB.NET.

The BuildProvider class also provides a method called OpenReader to read the source code (a file with the extension .dal). The next steps are to tokenize and parse it and to generate a CodeDOM program graph which we can turn over to the ASP.NET build process:

Tokenizer tokenizer = new Tokenizer(source);
Parser parser = new Parser(tokenizer);
CodeGen codeGen = new CodeGen(parser);

builder.AddCodeCompileUnit(this, codeGen.Generate());

In the next section we first take a look at a sample source file to see what kind of language we want to analyze and generate code for.

DALC Description Language

The description "language" to formally describe DAL components uses a very simple syntax. The following shows a sample DAL definition contained in the file Sample.dal (stored in the special folder App_Code):

Config {
  Namespace = "Parago.DAL",
  DatabaseType = "MSSQL",
  ConnectionString = "Data Source=.\SQLEXPRESS;…"
}

//
// DAL component for table Articles
//
DALC Article ( = Articles ) {

  Mapping {                // Map just the following fields, leave others
    ArticleID => Id,
    Text1 => Text
  }

  SelectAll()
  SelectByAuthor(string name[CreatedBy])
  SelectByCategory(int category[Category])
}

DALC Category ( = "Categories" ) {
  SelectAll()
}

The syntax of the language is defined using the Extended Backus-Naur Form (EBNF), an extension of the basic Backus-Naur Form (BNF) metasyntax notation, a formal way to describe languages. The following syntax rules illustrate the definition of the DALC description language:

digit              = "0-9"
letter             = "A-Za-z"
identifier         = letter { letter | digit }
string             = '"' string-character { string-character } '"'
string-character   = ANY-CHARACTER-EXCEPT-QUOTE | '""'
dal                = config dalc { dalc }
config             = "Config" "{" config-setting { "," config-setting } "}"
config-setting     = ( "Namespace" | "DatabaseType" | "ConnectionString" ) "=" string
dalc               = "DALC" identifier [ dalc-table ] "{" [ dalc-mapping ] dalc-function { dalc-function } "}"
dalc-table         = "(" "=" ( identifier | string ) ")"
dalc-mapping       = "Mapping" "{" dalc-mapping-field { "," dalc-mapping-field } "}"
dalc-mapping-field = ( identifier | string ) "=>" identifier
dalc-function      = identifier "(" [ dalc-function-parameter-list ] ")"
dalc-function-parameter-list = dalc-function-parameter { "," dalc-function-parameter }
dalc-function-parameter      = ( "string" | "int" ) identifier "[" identifier | string "]"

The next section explains how to scan and parse the syntax shown above.

Compiler Techniques

The DALComp application uses compiler techniques in a very basic way. Implementing a real-world compiler can be a very complicated task, involving techniques like syntax error recovery, variable scoping, bytecode (IL) generation and many more.

The first step in the implementation is to create a tokenizer. A tokenizer analyzes the input character by character and tries to split it into so-called tokens. Tokens are categorized blocks of text. The categories may be language keywords like the C# loop statement "for", comparison operators like "==" or whitespace. DALComp defines a class named Token to represent a single token as well as an enumeration called TokenKind to define the categories of tokens:

public enum TokenKind {
    KeywordConfig,
    KeywordDALC,
    KeywordMapping,
    Type,
    Identifier,
    String,
    Assign,               // =>
    Equal,                // =
    Comma,                // ,
    BracketOpen,          // [
    BracketClose,         // ]
    CurlyBracketOpen,     // {
    CurlyBracketClose,    // }
    ParenthesisOpen,      // (
    ParenthesisClose,     // )
    EOT                   // End Of Text
}

public class Token {
    public TokenKind Type;
    public string Value;

    public Token(TokenKind type) {
        Type = type;
        Value = null;
    }

    public Token(TokenKind type, string value) {
        Type = type;
        Value = value;
    }
}

The Tokenizer class does the actual tokenizing by analyzing the input stream character by character. The Tokenizer constructor initializes the object instance with the input text to scan, creates a generic queue of type Token, and calls the Start method to do the work:

public Tokenizer(string text) {
    // To avoid index overflow append new line character to text
    this.text = (text == null ? String.Empty : text) + "\n";

    // Create token queue (first-in, first-out)
    tokens = new Queue<Token>();

    // Tokenize the text!
    Start();
}

The constructor also appends an additional character ("\n") to the input text to avoid an index overflow. The Start method looks similar to the following:

void Start() {
    int i = 0;

    // Iterate through the input text
    while (i < text.Length) {
        // Analyze the next character and possibly the following series of characters
        switch (text[i]) {
            // Ignore whitespace
            case '\n':
            case '\r':
            case '\t':
            case ' ':
                break;

            // Comment (until end of line)
            case '/':
                if (text[i + 1] == '/') {
                    while (text[++i] != '\n') ;
                    continue;
                }
                break;

            case '{':
                tokens.Enqueue(new Token(TokenKind.CurlyBracketOpen));
                break;

            case '}':
                tokens.Enqueue(new Token(TokenKind.CurlyBracketClose));
                break;

            // '=' or '=>'
            case '=':
                if (text[i + 1] == '>') {
                    i++;
                    tokens.Enqueue(new Token(TokenKind.Assign));
                }
                else
                    tokens.Enqueue(new Token(TokenKind.Equal));
                break;

            // ... the remaining cases (brackets, parentheses, keywords,
            // identifiers, strings) follow the same pattern

As you can see, implementing a tokenizer is simple and straightforward. There are two more methods in the Tokenizer class: PeekTokenType, which looks ahead at the type of the next token in the queue, and GetNextToken, which actually returns the next token and removes it from the queue:

public TokenKind PeekTokenType() {
    // Always return at least TokenKind.EOT
    return (tokens.Count > 0) ? tokens.Peek().Type : TokenKind.EOT;
}

public Token GetNextToken() {
    // Always return at least a Token of type TokenKind.EOT
    return (tokens.Count > 0) ? tokens.Dequeue() : new Token(TokenKind.EOT);
}
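The scan-into-a-queue flow above is language-agnostic, so it can be illustrated compactly outside of C#. The following Python sketch (not the author's code; it covers only a handful of token kinds) mirrors the same pattern: a single pass fills a FIFO queue, and PeekTokenType/GetNextToken become simple peek and dequeue operations:

```python
from collections import deque

# Minimal token kinds, mirroring a subset of the C# TokenKind enumeration
EOT, EQUAL, ASSIGN, CURLY_OPEN, CURLY_CLOSE, IDENT = (
    "EOT", "Equal", "Assign", "CurlyBracketOpen", "CurlyBracketClose", "Identifier")

def tokenize(text):
    """Scan the input once and return a FIFO queue of (kind, value) tokens."""
    tokens = deque()
    text += "\n"                            # sentinel to avoid index overflow
    i = 0
    while i < len(text):
        c = text[i]
        if c in " \t\r\n":                  # ignore whitespace
            i += 1
        elif c == "/" and text[i + 1] == "/":
            while text[i] != "\n":          # comment until end of line
                i += 1
        elif c == "{":
            tokens.append((CURLY_OPEN, None)); i += 1
        elif c == "}":
            tokens.append((CURLY_CLOSE, None)); i += 1
        elif c == "=":
            if text[i + 1] == ">":          # distinguish '=>' from '='
                tokens.append((ASSIGN, None)); i += 2
            else:
                tokens.append((EQUAL, None)); i += 1
        elif c.isalpha():
            start = i
            while text[i].isalnum():        # identifier = letter { letter | digit }
                i += 1
            tokens.append((IDENT, text[start:i]))
        else:
            raise ValueError(f"unexpected character {c!r}")
    return tokens

def peek_token_type(tokens):
    return tokens[0][0] if tokens else EOT          # always at least EOT

def get_next_token(tokens):
    return tokens.popleft() if tokens else (EOT, None)
```

Running tokenize("Name => Id // mapping") yields an Identifier token, an Assign token, and another Identifier token; the trailing comment is skipped entirely.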

Both methods are called from the parser, the next step in compiling source code. Parsing is the process of analyzing the sequence of tokens in order to determine its grammatical structure with respect to a given formal grammar. An instance of the Tokenizer class is handed over to the Parser class, which produces a structure representing semantically correct source code, commonly called an abstract syntax tree (AST). Strictly speaking, the name is a misnomer here, since the structure is not a tree of any kind: the AST used in this context is just a generic list of DALC class objects plus the settings (the Config part).

The Parser class implementation is also simple and straightforward. There is no room here to explain in detail how parsing and the concepts behind it work; this is a huge area of computing. The source code of the Parser class is self-explanatory and easy to understand.

The parser basically just implements the syntax rules defined above in Extended Backus-Naur Form; see also the section "DALC Description Language":

/// <summary>
/// dal = config dalc { dalc }
/// </summary>
void ParseDAL() {
    ParseConfig();

    do {
        ParseDALC();
    } while (Taste(TokenKind.KeywordDALC));

    Eat(TokenKind.EOT);
}

/// <summary>
/// config = "Config" "{" config-setting { "," config-setting } "}"
/// </summary>
void ParseConfig() {
    Eat(TokenKind.KeywordConfig);
    Eat(TokenKind.CurlyBracketOpen);
    ParseConfigSetting();

    while (true) {
        if (!Taste(TokenKind.Comma))
            break;

        Eat();
        ParseConfigSetting();
    }

    Eat(TokenKind.CurlyBracketClose);
}

The parsing methods use helper functions to "eat" the tokens off the queue by calling the mentioned GetNextToken method of the Tokenizer class, or to abort the parsing process on unexpected input. Here are the helper methods:

/// <summary>
/// Looks ahead in the token queue and checks whether the next token is of the given type.
/// </summary>
bool Taste(TokenKind type) {
    return tokenizer.PeekTokenType() == type;
}

/// <summary>
/// Returns the value of the next token.
/// </summary>
string Eat() {
    token = tokenizer.GetNextToken();
    return token.Value;
}

/// <summary>
/// Returns the value of the next token if it is of the given type, otherwise aborts.
/// </summary>
string Eat(TokenKind type) {
    token = tokenizer.GetNextToken();

    if (token.Type != type)
        Abort();

    return token.Value;
}

/// <summary>
/// Returns the value of the next token if it matches any of the passed types, otherwise aborts.
/// </summary>
string EatAny(TokenKind[] types) {
    token = tokenizer.GetNextToken();

    foreach (TokenKind type in types)
        if (token.Type == type)
            return token.Value;

    Abort();
    return String.Empty;
}
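Helper methods like Eat and Taste are the backbone of any hand-written recursive-descent parser: each EBNF rule becomes one method that consumes exactly the tokens the rule describes. As a rough, simplified illustration (again in Python, not the author's code), here is how the config rule could be parsed from a pre-tokenized list of (kind, value) pairs:

```python
class ParseError(Exception):
    pass

class ConfigParser:
    """Recursive-descent parser for a simplified rule:
       config  = "Config" "{" setting { "," setting } "}"
       setting = identifier "=" string
    """
    def __init__(self, tokens):
        self.tokens = list(tokens)   # sequence of (kind, value) pairs
        self.pos = 0
        self.settings = {}

    def taste(self, kind):
        # Look ahead: is the next token of the given kind?
        return self.pos < len(self.tokens) and self.tokens[self.pos][0] == kind

    def eat(self, kind=None):
        # Consume the next token, aborting if a required kind does not match
        if self.pos >= len(self.tokens):
            raise ParseError("unexpected end of input")
        tok_kind, value = self.tokens[self.pos]
        if kind is not None and tok_kind != kind:
            raise ParseError(f"expected {kind}, got {tok_kind}")
        self.pos += 1
        return value

    def parse_config(self):
        self.eat("KeywordConfig")
        self.eat("CurlyBracketOpen")
        self.parse_setting()
        while self.taste("Comma"):
            self.eat()
            self.parse_setting()
        self.eat("CurlyBracketClose")
        return self.settings

    def parse_setting(self):
        name = self.eat("Identifier")
        self.eat("Equal")
        self.settings[name.upper()] = self.eat("String")
```

parse_config consumes the Config block token by token and raises a ParseError on the first unexpected token, analogous to the Abort call in the C# version.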

The third phase uses the structure generated by the Parser class and transforms it into a CodeDOM structure that can be used to create C# or VB code. This phase is called the code generation phase. Language compilers targeting the .NET Framework usually generate Intermediate Language (IL) code. The DALComp application does not generate any IL code; instead it generates a CodeDOM graph that can be used, e.g. by ASP.NET, to compile into web assemblies.

The Generate method of the CodeGen class first creates a container for the CodeDOM structure, adds a namespace unit to it, and tries to connect to the defined database using the connection string specified in the DAL definition file (see Sample.dal):

// Create container for CodeDOM program graph
CodeCompileUnit compileUnit = new CodeCompileUnit();

try {
    // If applicable replace the value '|BaseDirectory|' within the connection
    // string with the current directory of the running assembly to allow
    // database access in DALComp.Test.Console
    string connectionString = dal.Settings["CONNECTIONSTRING"]
        .Replace("|BaseDirectory|", Directory.GetCurrentDirectory());

    // Define new namespace (Config:Namespace)
    CodeNamespace namespaceUnit = new CodeNamespace(dal.Settings["NAMESPACE"]);
    compileUnit.Namespaces.Add(namespaceUnit);

    // Define necessary imports
    namespaceUnit.Imports.Add(new CodeNamespaceImport("System"));
    namespaceUnit.Imports.Add(new CodeNamespaceImport("System.Collections.Generic"));
    namespaceUnit.Imports.Add(new CodeNamespaceImport("System.Data"));
    namespaceUnit.Imports.Add(new CodeNamespaceImport("System.Data.SqlClient"));

    // Open the database connection to analyze the tables defined for the
    // DALCs and to generate the member fields and properties from them
    SqlConnection connection = new SqlConnection(connectionString);
    connection.Open();

    // Generate a new publicly accessible class for each DALC definition
    // with all defined methods
    foreach (DALC dalc in dal.DALCs) {
        // Generate new DALC class type and add it to the namespace
        CodeTypeDeclaration typeUnit = new CodeTypeDeclaration(dalc.Name);
        namespaceUnit.Types.Add(typeUnit);

        // Generate public empty constructor method
        CodeConstructor constructor = new CodeConstructor();
        constructor.Attributes = MemberAttributes.Public;
        typeUnit.Members.Add(constructor);

        // Get schema table with column definitions for the current DALC table
        DataSet schema = new DataSet();
        new SqlDataAdapter(String.Format("SELECT * FROM {0}", dalc.Table), connection)
            .FillSchema(schema, SchemaType.Mapped, dalc.Table);

        // Generate for each column a private member field and a publicly
        // accessible property
        foreach (DataColumn column in schema.Tables[0].Columns) {
            // Define names by checking DALC mapping definition
            string name = column.ColumnName;
            string nameMapped =
                dalc.Mapping.ContainsKey(name.ToUpper()) ? dalc.Mapping[name.ToUpper()] : name;

            // Generate private member field (underscore plus name); if the
            // column type is a value type, use a nullable of that type
            CodeMemberField field = new CodeMemberField();
            field.Name = String.Format("_{0}", nameMapped);
            field.Type = GenerateFieldTypeReference(column.DataType);
            typeUnit.Members.Add(field);

            // Generate publicly accessible property for the private member field,
            // to use for instance in conjunction with ObjectDataSource
            CodeMemberProperty property = new CodeMemberProperty();
            property.Name = nameMapped;
            property.Type = GenerateFieldTypeReference(column.DataType);
            property.Attributes = MemberAttributes.Public;
            property.GetStatements.Add(
                new CodeMethodReturnStatement(
                    new CodeFieldReferenceExpression(
                        new CodeThisReferenceExpression(), field.Name)));
            property.SetStatements.Add(
                new CodeAssignStatement(
                    new CodeFieldReferenceExpression(
                        new CodeThisReferenceExpression(), field.Name),
                    new CodePropertySetValueReferenceExpression()));
            typeUnit.Members.Add(property);
        }
    }
}
// catch/finally blocks omitted for brevity

Based on the DAL specification file, the method reads the schema of each DALC table, builds up a new class type, and adds all columns to it as private member fields and public properties. If a column data type is a value type, it creates a nullable version of that value type as follows:

CodeTypeReference GenerateFieldTypeReference(Type columnType) {
    // If the column data type is not a value type just return it
    if (!columnType.IsValueType)
        return new CodeTypeReference(columnType);

    // Type is a value type, generate a nullable type and return that
    Type nullableType = typeof(Nullable<>);
    return new CodeTypeReference(nullableType.MakeGenericType(new Type[] { columnType }));
}
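The effect of GenerateFieldTypeReference is easier to see outside of CodeDOM. The following Python sketch (purely illustrative; the set of value types is an invented subset) computes the C# type string the generated field would end up with:

```python
# Value types get wrapped in System.Nullable<T>; reference types are kept as-is.
# This set is an illustrative stand-in for .NET's Type.IsValueType check.
VALUE_TYPES = {"int", "bool", "decimal", "System.DateTime"}

def field_type_reference(column_type: str) -> str:
    """Mimics GenerateFieldTypeReference: nullable for value types only."""
    if column_type not in VALUE_TYPES:      # reference type: return unchanged
        return column_type
    return f"System.Nullable<{column_type}>"
```

For instance, field_type_reference("int") yields "System.Nullable<int>", while "string" is returned unchanged.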

For example, if a column is of type "int", this helper function will generate an "int?" (i.e. "System.Nullable<int>"). Here is a sample of the auto-generated C# code:

public class Article {
    private System.Nullable<int> _Id;
    private string _Title;
    private string _Text;
    private string _Text2;
    private string _Language;
    private System.Nullable<int> _Category;
    private string _CreatedBy;
    private System.Nullable<System.DateTime> _CreatedOn;

    public Article() {
    }

    public virtual System.Nullable<int> Id {
        get {
            return this._Id;
        }
        set {
            this._Id = value;
        }
    }

    // Helper method to query data
    public static List<Article> SelectData(string sql) {
        List<Article> result;
        result = new List<Article>();
        System.Data.SqlClient.SqlConnection connection;
        System.Data.SqlClient.SqlCommand command;
        System.Data.SqlClient.SqlDataReader reader;
        connection = new System.Data.SqlClient.SqlConnection("Data Source=…");
        connection.Open();
        command = new System.Data.SqlClient.SqlCommand(sql, connection);
        reader = command.ExecuteReader();
        for (; reader.Read(); ) {
            Article o;
            o = new Article();
            if (Convert.IsDBNull(reader["ArticleID"])) {
                o.Id = null;
            }
            else {
                o.Id = ((System.Nullable<int>)(reader["ArticleID"]));
            }
            result.Add(o);
        }
        reader.Close();
        connection.Close();
        return result;
    }

    // DALC function
    public static List<Article> SelectAll() {
        string internalSql;
        internalSql = "SELECT * FROM Articles";
        return SelectData(internalSql);
    }
}

For more detailed information, please refer to the source code.

Summary

The DAL itself is a basic implementation and shows the concepts of creating dynamic code. To be accurate, the DALComp compiler is actually more a source-to-source translator than a compiler. The current version only generates methods to select data, no updates or inserts. As you can see, there is plenty of room to extend the DAL by augmenting the description language and generating more dynamic code to make the DAL production-ready.

For more information regarding building compilers and virtual machines, I recommend Pat Terry's book "Compiling with C# and Java". Another way to study compiler techniques in practice is to take a look at the sources of IronPython, the .NET implementation of Python. The source code is available for download on the CodePlex website.

For real-world compiler development there are plenty of utility tools available, such as Coco/R, a scanner and parser generator, or the ANTLR compiler tools (used by the Boo compiler). You can also find a lot of information on the Microsoft Research website, e.g. about the F# compiler. Another interesting topic is the Phalanger project ("The PHP Language Compiler for the .NET Framework") on CodePlex.com.

Download Source Code