If you are writing an Ajax-based .NET web application or web page, there are situations where you might want to block the whole page from being accessed while a background or server-side process is taking place. For example, when the user clicks Submit, you may want to make the page inaccessible until the backend server-side event handler finishes and reloads the page.
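One simple way to achieve this on the client side is a full-screen overlay. The sketch below is plain JavaScript; the element id and styling are my own illustrative choices, not part of any particular framework:

```javascript
// Minimal sketch: block the page with a full-screen overlay while the
// server-side handler runs. The id and styles are illustrative only.
function buildOverlayStyle() {
  // Returned as a plain object so it can be copied onto element.style.
  return {
    position: 'fixed',
    top: '0',
    left: '0',
    width: '100%',
    height: '100%',
    background: 'rgba(0, 0, 0, 0.4)',
    zIndex: '9999',
    cursor: 'wait'
  };
}

function blockPage(doc) {
  var overlay = doc.createElement('div');
  overlay.id = 'page-blocker';
  var style = buildOverlayStyle();
  for (var key in style) {
    overlay.style[key] = style[key];
  }
  doc.body.appendChild(overlay);
  return overlay;
}
```

You would call blockPage(document) from the submit button's client-side click handler; the full postback reload then discards the overlay along with the rest of the page.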

In one of my latest projects, I found the client extensively using LLBLGen Pro, which I had never used before. After a few hours of playing around with it, I managed to find my way around it just enough to get the basic functionality working.

For this reason, I thought I would write a post showing the first, basic steps of using LLBLGen Pro, as an introduction for anyone who, like me a few weeks ago, is completely new to it.

The main feature of LLBLGen Pro is that it automatically creates a data layer for your application to use. The data layer provides full data access and management (DML and DDL) features, giving you complete control of the specified database.

LLBLGen Pro generates this data layer for a variety of database types, with full support for Firebird, IBM DB2, MS Access, MySQL, Oracle 8, 9, 10 and 11, Sybase, PostgreSQL and, of course, almost all Microsoft SQL Server versions, including SQL Server 7, 2000, 2005, 2008, SQL Server Express and MSDE. You literally don't need to write any code: a full data layer is created for you to access any database of the above types.

LLBLGen Pro generates the code as a .NET project in your chosen development language (C# or VB.NET). It accesses your database using the server name plus a login ID and password (for database-specific authentication), or Windows authentication if your database allows it. It then generates the code along with a web.config that already contains the connection string to your database.

Now to use LLBLGen Pro you need to do the following:

The first thing to do is obviously to install LLBLGen Pro and add the licence file to its installation folder. Next, run the application from the Start menu.

Once LLBLGen Pro has loaded, you need to create a new project. Go to File –> New Project and fill in the details as follows:

Once connected to the server, you will get the list of available catalogs (databases) on that database server.

Select your chosen database(s) (you can select more than one) and then click Create.

Once you have done that, the project is created and saved in the selected location.

The next step is to add the tables, views and stored procedures. To add tables, right-click on Entities and click Add New Entities Mapped on Tables. To add views, right-click on Entities and choose Add Entities Mapped on Views, and so on.

Once you have added all your tables and views, they will all be created as entities, i.e. converted to classes in the LLBLGen Pro generated code.

Now you have the project ready for code generation. There is much more detailed configuration you can do if you want to generate complex code, but for most straightforward data layers, this project configuration is enough.

Now go to Project –> Generate and you will get the code generation screen. This is where you specify the details of the generated code and the project holding it.

The Generate Project window has three tabs. For a simple, straightforward first project, you only need to fill in the General Settings details; they are the main and most-used settings. You can explore the more complex ones later if you want to.

Template Group: choose Self-Servicing. This is the default and, in my view, the most straightforward option.

Destination root folder: the folder where all generated code files will be created.

Everything is ready now. Click Generate!

Done.

The generated code can be found in the destination root folder specified in the previous step. You will see that the generated project is a class library. Now go to your solution and add the generated project (right-click on the solution –> Add Existing Project –> the new LLBLGen-generated project).

The next and final step in this long post is to add the project that will consume the data layer LLBLGen Pro has just created for you.

Using the LLBLGen Pro classes is easy and straightforward, but that is the subject of another post.

Please comment if you want me to blog about the basics of using LLBLGen Pro generated code.

This post is an introduction and a collection of useful information, notes and facts about WCF in general, Silverlight, and running a Silverlight web application that uses a WCF service via asynchronous web service calls.

A WCF Service Application is the conventional web service application, similar to an ASP.NET web service but built on Windows Communication Foundation technology.

A WCF Service Library project is different. The output of this project is one or more compiled DLL files along with a dll.config configuration file. The produced output can then be added to a client service application or project, deployed as a separate secure web service, or included as part of a larger hosted web service.

– When creating a WCF service for a Silverlight application, you can simply add the WCF service to the test web project that Visual Studio adds by default when you create a new Silverlight 3 project. To do this, right-click on the Silverlight web project (the test project, usually called .Web) and then click Add New Item. In the new item window, always choose Silverlight-enabled WCF service in this case, as this adds one required line to your web.config. Alternatively, you can add the WCF service as another project in the same solution, or even in a separate solution, which is the more practical approach, as normally the web service is separate and hosted independently from its client(s).

– If the WCF client application is not Silverlight, opt for the asynchronous approach in your operation implementation, especially when the service implementation makes a blocking call, such as doing input/output (I/O) work.

Generally, even if you have the choice between a synchronous and an asynchronous call, always try to opt for the asynchronous one.
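To make the idea concrete, here is a conceptual sketch of a non-blocking call, written in JavaScript rather than as actual WCF code; fakeSlowService is a stand-in I made up for a real service operation that does slow I/O:

```javascript
// Conceptual sketch (not WCF itself): invoke an operation asynchronously
// so the caller is never blocked while the service does slow I/O.
// `fakeSlowService` is a made-up stand-in for a real service call.
function fakeSlowService(input) {
  return new Promise(function (resolve) {
    // Simulate an I/O-bound operation completing a little later.
    setTimeout(function () { resolve(input.toUpperCase()); }, 10);
  });
}

async function invokeAsync() {
  // The caller stays free to do other work until the result arrives.
  var result = await fakeSlowService('hello');
  return result;
}
```

In WCF the same idea appears as the asynchronous operation pattern on the client proxy; the point is simply that the caller does not sit blocked while the service works.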

MSDN (and Microsoft) specifies the essential cases in which you need to use asynchronous calls to WCF as follows:

* If it is a Silverlight application (the only option you have, really!).

* If you are invoking operations from a middle-tier application.

* If you are invoking operations within an ASP.NET page, use asynchronous pages.

* If you are invoking operations from any application that is single-threaded, such as Windows Forms or Windows Presentation Foundation (WPF).

In all these cases, always opt for asynchronous calls rather than synchronous ones whenever you have the option.

– WCF is Microsoft's attempt at unifying web and distributed services. WCF services can work with a variety of client applications, including PHP, Ruby and any other client that can use the basicHttpBinding.

– WCF is regarded as the successor to conventional Microsoft web service and communication technologies such as DCOM, .NET Remoting, ASMX web services, WSE, etc. This is especially because WCF is built with the aim of providing a service-oriented architecture (SOA) for distributed applications.

– WCF can send messages over a variety of channels, including HTTP, TCP, MSMQ and named pipes.

– WCF provides a DataContractSerializer, which allows complex data types and private fields to be serialized and sent. Being serializable means the object can be dehydrated into a stream of bytes and rehydrated back into a living class instance.
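The dehydrate/rehydrate round trip can be illustrated with a conceptual analogue; DataContractSerializer does this with XML for [DataContract] types, while the sketch below uses JSON purely to show the round trip:

```javascript
// Conceptual analogue of serialization: dehydrate an instance into a
// transportable form and rehydrate it back into a living object.
// (DataContractSerializer does this with XML; JSON is used here only
// to illustrate the round trip.)
function dehydrate(obj) {
  return JSON.stringify(obj); // instance -> string ready for the wire
}

function rehydrate(text) {
  return JSON.parse(text); // wire format -> living object again
}

var original = { accountName: 'Contoso', credit: 42 };
var wire = dehydrate(original);
var copy = rehydrate(wire);
```

The copy carries the same data as the original, which is exactly what a serializer guarantees when a message crosses a service boundary.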

I’m sure many people have needed to do a bulk delete operation on Microsoft Dynamics CRM 4.0. You may have uploaded thousands of records from an imported file, migrated them through Scribe, or used a .NET application to mass-create records.

Unfortunately, as far as I can see, there is no straightforward way to bulk delete records in Dynamics CRM 4.0 using its out-of-the-box functionality and interface.

To bulk delete records in Dynamics CRM 4.0, you have the following main options:

Get a third-party tool or CRM add-on to bulk delete records. This option is straightforward, but you might have to pay to purchase or use the tool, and it may have security issues. I would not recommend it to my clients, as the tool has most probably been created by a small company or an individual I don’t know. It would therefore be difficult to justify putting the tool on a live production environment or client server, let alone adding it to CRM Online or to a CRM solution hosted by a partner.

Use the CRM SDK to write a .NET application (or console application) that deletes all records for a specified entity or entities. This is a more robust way of doing it, but it may take longer and is probably not suitable for people who do not come from a .NET development background.

Use Scribe Insight. This is what this post is really about: using Scribe Insight to bulk delete Dynamics CRM records.

Please note: this is a workaround. It is not supported by Scribe, and the advice in this post is provided as is, with no warranty. I have tried it and it works perfectly, but I cannot guarantee it will produce the same acceptable results in any other environment.

Here is what you need to do:

Create a new Scribe Workbench DTS (or job). Point it to your usual source file (even a sample one) and connect to CRM, using either IFD forms for hosted CRM or a direct connection.

Configure the target: create one Delete step on the target.

Make sure that the option “Allow multiple record matches on updates/deletes” is ticked under the All Steps tab.

Under the Step Control tab, leave failures going to the next row, but change all the success results (Success (0), Success (1) and Success (>1)) to End Job. Select the Success radio button at the bottom and write a message to your log, such as: “All records deleted”.

No data links are needed, as you are only deleting.

On the lookup link, make the lookup condition impossible to satisfy, such as: Account Name = 123456789, or whatever.

Run the DTS.

The job will read the first source line and then try to find the matching record at the target (remember, it is an update/delete). Since we have set up the lookup link to look for something impossible to find, the result of the update will be Success (0).

Once this happens, Scribe will delete all records for your chosen entity (or CRM table). This is a complete bulk delete of all CRM records using Scribe.

I was recently researching how to implement postcode and address look-up on websites and in Microsoft Dynamics CRM. This feature is widely used to auto-fill address details when a customer inputs their postcode and, sometimes, their house number. It is also used in some desktop applications and plug-ins to Microsoft Dynamics CRM, in store locator functions, in bank details verification, and in applications where the customer inputs their postcode and the system accordingly determines which region or area the customer falls into. For example, quote websites, i.e. websites offering quotes to visitors nationwide, will most probably need to know which region the customer is in, since the service may be much cheaper in the North East of England than in London, for example.
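The region-from-postcode part can be sketched very simply. The mapping table below is a tiny made-up sample for illustration only; a real system would drive this from licensed PAF data:

```javascript
// Illustrative sketch: map the outward part of a UK postcode to a
// region. The table is a tiny made-up sample, not real coverage data.
var regionByOutwardPrefix = {
  NE: 'North East of England',
  SW: 'London',
  EC: 'London'
};

function regionForPostcode(postcode) {
  // The outward code is the part before the space, e.g. "NE1" in "NE1 4ST".
  var outward = postcode.trim().toUpperCase().split(/\s+/)[0];
  var prefix = outward.replace(/[0-9].*$/, ''); // letters only, e.g. "NE"
  return regionByOutwardPrefix[prefix] || 'Unknown';
}
```

A quote website would call regionForPostcode with the customer's input and branch its pricing on the result.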

After a bit of research, I found out that there are several UK postcode data suppliers. The major supplier is, of course, Royal Mail, which provides a service called the Postcode Address File (PAF®). The problem, though, is that Royal Mail will not give you any application; they will just give you raw data on a CD and send you updates for this data every month, 3 months or 6 months, and even some daily changes, depending on your choice. The data is great, and I think it covers almost all of Britain’s postcode areas and addresses. Regular updates also ensure that the information stays up to date. The raw data, though, needs to be manipulated by an application. Some coding will be required, which is not too difficult to do, but if you are not an experienced developer or have no development background, this will be an issue for you. I will try to address the coding of such functionality in Microsoft .NET (C# and VB.NET) in a later post.

Alternatively, I found quite a few online companies that provide solutions built on the raw data. Basically, they create web services that manipulate the Royal Mail (PAF) raw data, and all you need to do is write simple code to return the details you require. So you pass the postcode to them, for example, and they send you back the full address details. Many of them will even give you example code, making your life easier. If you do not need this for a website, you can always opt for their other services, such as an online application you can log in to and get the address information you require.

Pricing from these postcode data providers varies, with most of them, if not all, selling the service in credits. Each credit gives you one address information request. Credits are sold in packs, with prices averaging around £50 for a 750-credit pack. Do not buy the first one you see; ask around and shop around. Royal Mail, though, has a different licence structure, with prices starting from £85 for a single workstation plus £2 for every extract, up to thousands of pounds for multi-organisation, multi-server licences.

If you need more information, recommendations and links to the best providers of these postcode and address information services, please get in touch and I will get back to you. I have now managed to nail down the best providers and have got good offers from them. Just get in touch!

A few months ago, I was asked to research the possibility of integrating two systems for a client, specifically using Scribe software. The two systems are Microsoft CRM and SAP. I did intensive research on the subject and came up with a nine-page document detailing the answer.

The quick answer is yes. Scribe is a good application that can be used to provide an integration between Microsoft Dynamics CRM and SAP.

If you want the report I created on the strengths of Scribe and the possibility of using it for such an integration, please request it via a comment on this page and I will email you the document. I will also email you a technical specification document that Scribe sent me detailing such an integration. This technical document is not publicly available on their website as far as I know, but you can always request it directly from them.

My technical report also includes some example case studies in which a Microsoft CRM system has been integrated successfully with SAP. It also lists some white papers and technical documentation that have covered the subject in general, as well as the specific integration between SAP and CRM using Scribe.

The document does not include any reference to the client, the exact project specification or any information that could be confidential. It contains just simple facts and findings on Scribe and the possibility of using it for SAP and CRM integration.

I have just started my Microsoft CRM 4.0 VM and tried to run a .NET console application, and I got a full list of errors, although the code was working fine the night before! When I tried to open Microsoft CRM 4.0, I got an “Invalid Action” error, with the message starting with “error in command”. This failure was happening with all CRM organisations, not just a specific one.

This post is a copy of an entry I posted on the NopCommerce forum. NopCommerce is an excellent free .NET shopping basket. I suggest any .NET developer/consultant have a look at it before even going down the Commerce Server route. The company I work for is a Microsoft Gold Partner, so they are (rightly) rather inclined to go for the Commerce Server option whenever a client project sounds like an online shop or e-commerce portal. But for any smaller kind of project, I definitely recommend NopCommerce. At least you have the full source code to play with as you wish and to add your own bits.

Anyway, I had a nightmare trying to set up NopCommerce on GoDaddy and did quite a bit of fiddling to get it working.

You will have problems with GoDaddy specifically, as you won’t be able to upload the create-data SQL file (it is too big: the maximum is 2.5MB and the file is 4MB). To overcome this, you will need to split the create-data script into a few SQL files. Make sure that the insert setting is ON on the first line of every one of these SQL files.
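The splitting step can be scripted instead of done by hand. The sketch below (Node.js; the size limit and the repeated header line are my assumptions about what the host and the script require) breaks a large script into chunks under a byte budget, cutting only at line boundaries:

```javascript
// Sketch: split a large SQL script into chunks that stay under an
// upload size limit, breaking only at line boundaries and repeating a
// header line (e.g. the setting the post mentions) at the top of every
// chunk. The limit and header handling are assumptions, not NopCommerce
// or GoDaddy requirements I have verified.
function splitSqlScript(sql, maxBytes, headerLine) {
  var lines = sql.split('\n');
  var chunks = [];
  var current = headerLine;
  for (var i = 0; i < lines.length; i++) {
    var candidate = current + '\n' + lines[i];
    // Start a new chunk when adding this line would exceed the budget.
    if (Buffer.byteLength(candidate, 'utf8') > maxBytes &&
        current !== headerLine) {
      chunks.push(current);
      current = headerLine + '\n' + lines[i];
    } else {
      current = candidate;
    }
  }
  chunks.push(current);
  return chunks;
}
```

You would write each returned chunk to its own .sql file and upload them one at a time.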

Also, the installation wizard could not find my database (on GoDaddy), although it connected to the host correctly. I guess it tries to find the database at an incorrect level. To overcome this, you will have to use the SQL files in the scripts folder (inside the install folder) to create the tables and the data manually rather than through the wizard.

What else… you will also need to amend the web.config manually. Basically, either use the wizard or do everything manually; as I said earlier, the wizard didn’t work with GoDaddy when I tried it.

Despite all that, NopCommerce is excellent, and many thanks to the development team for all their effort. I have to say it works perfectly when I play with it on my dev machine; it is only when you try to set it up on a shared hosting account that these problems appear.