In two recent blogs, I demonstrated how to write web clients for REST APIs – with XML (demo application here) or JSON (demo application here) as the data transfer format. In this blog, I will focus on the server side: how to implement a REST API as an ABAP request handler. You can inspect all the code I am discussing here on the MIGROS BSP website: it’s all in the class ZCL_JOB_DATA.

The ICF Tree

Request handlers are classes implementing the interface IF_HTTP_EXTENSION, which consists of one single method, HANDLE_REQUEST. A request handler class can be attached to a path in transaction SICF. An incoming HTTP request is analyzed by the Internet Communication Framework, which tries to match the request path against the SICF path. The matching process stops as soon as a node with an attached request handler is found. If this is the case, an instance of the handler class is created, and the method HANDLE_REQUEST is called.

Our example service is attached to the path /job/attributes. The class ZCL_JOB_DATA is declared to be responsible for all incoming requests where the request path starts with /job/attributes :
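In code, the mechanics look roughly like this; the following is only a bare skeleton to illustrate the interface, not the actual implementation of ZCL_JOB_DATA:

```abap
" Minimal request handler skeleton (illustrative only)
class zcl_job_data definition public create public.
  public section.
    interfaces if_http_extension.
endclass.

class zcl_job_data implementation.
  method if_http_extension~handle_request.
    " server->request  holds the parsed HTTP request (if_http_request),
    " server->response the HTTP response to be filled (if_http_response)
    server->response->set_cdata( 'Hello from /job/attributes' ).
  endmethod.
endclass.
```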

First Strategy: HTTP Request Method

The implementation of the interface method if_http_extension~handle_request( ) forms the uppermost level of processing. Therefore, it only gives the rough processing skeleton: an instance for database operations and an instance for processing the REST operation are created, the request handling is delegated to the latter, and a catch block handles the error case that no instance could be determined for processing the request. Such a situation results in an HTTP response with status code 400 (Bad Request).

Here, we are using the Strategy design pattern: depending on the HTTP method (GET, PUT, POST, DELETE, OPTIONS), a specific instance is created. Each possible instance corresponds to one specific strategy.

We use a naming convention for the instance determination: the class LCL_REST_GET is associated with the HTTP verb GET, LCL_REST_PUT with PUT, and so on. All these classes implement the interface LIF_REST. This way, we can use dynamic instance creation. Alternatively, we could have written a large CASE statement with many WHEN branches. The advantage of the CASE would be that the CREATE OBJECT statement could be checked statically for syntactical correctness. I have chosen the dynamic variant since I find it clearer and more readable than a bunch of WHEN branches.

Observe that the HTTP request method (GET, PUT, POST, …) is available as a pseudo header field with the name ‘~request_method’:
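The instance determination can be sketched like this; this is a simplified illustration of the idea, not the original code, and the signature of lif_rest~handle_request( ) is an assumption:

```abap
data: lv_verb  type string,
      lv_class type string,
      lo_rest  type ref to lif_rest.

" The HTTP verb is available as pseudo header field '~request_method'
lv_verb = server->request->get_header_field( '~request_method' ).

" Naming convention: LCL_REST_GET, LCL_REST_PUT, ...
concatenate 'LCL_REST_' lv_verb into lv_class.

try.
    create object lo_rest type (lv_class).
  catch cx_sy_create_object_error.
    " No strategy class for this verb -> 400 Bad Request
    server->response->set_status( code = 400 reason = 'Bad Request' ).
    return.
endtry.

lo_rest->handle_request( server ).
```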

Second Strategy: Data Transfer Format

Now we have different handler classes for the different HTTP request methods. But all these handlers share some common tasks. One of them is to determine the current data transfer format and to convert the input – if available – into ABAP data, and vice versa: to convert the ABAP result data into output in the desired data transfer format (XML or JSON).

Now, some request methods like GET do not require any request content. So the conversion of incoming data is performed only by those method handlers that require content data. On the other hand, there will always be a result of the following data type:

There may not always be entries in the job table, but not every component of this structure will be initial: if the job table is empty, there will usually be a message. So the conversion of the result can always be performed.

It makes sense to work with an abstract converter class, with specific subclasses containing the conversion algorithm for each content type. This is the second application of the Strategy pattern.
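Such a converter hierarchy could be sketched as follows; the names and signatures are illustrative, with ty_job and ty_result standing for the data types discussed above:

```abap
" Abstract strategy: one converter per data transfer format
class lcl_converter definition abstract.
  public section.
    methods:
      " request body -> ABAP data
      to_abap abstract
        importing iv_cdata type string
        exporting es_job   type ty_job,
      " ABAP result -> response body
      from_abap abstract
        importing is_result       type ty_result
        returning value(rv_cdata) type string.
endclass.

" Concrete strategies, selected by content type
class lcl_converter_xml definition inheriting from lcl_converter.
  public section.
    methods: to_abap redefinition, from_abap redefinition.
endclass.

class lcl_converter_json definition inheriting from lcl_converter.
  public section.
    methods: to_abap redefinition, from_abap redefinition.
endclass.
```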

The Common Plot for All Requests

We can extract the common tasks into a superclass lcl_rest of all the specific method handlers, implementing the interface method lif_rest~handle_request( ) once for all subclasses.

The common code in the superclass needs to be mixed with specific code, implemented in the subclasses and defining the specific behaviour of each subclass. To achieve this, we call an abstract method do( ) at the appropriate point in lif_rest~handle_request( ); this method has to be redefined in the subclasses and contains the specific action.

Now, the common implementation of lif_rest~handle_request( ) in the superclass only defines the flow of the processing, leaving the concrete actions to the subclasses or to delegates like go_converter:
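A sketch of this flow; the attribute names go_converter and go_server, as well as the signature of do( ), are assumptions for illustration:

```abap
method lif_rest~handle_request.
  data: ls_result type ty_result,
        lv_cdata  type string.

  " Specific action: the abstract do( ), redefined in each subclass
  do( importing es_result = ls_result ).

  " Common part: convert the ABAP result with the current format's
  " converter, and place it into the HTTP response body
  lv_cdata = go_converter->from_abap( ls_result ).
  go_server->response->set_cdata( lv_cdata ).
endmethod.
```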

A Specific Task – the PUT Request

Let’s look at a specific task for illustration: the PUT request, which is always a request to update or insert the job attributes for a given ID on the database. As follows from the design, there is a dedicated local class LCL_REST_PUT handling PUT requests. Actually, for this request handler, only the do( ) method itself had to be implemented – which is the absolute minimum for a specific task class, since do( ) is abstract in the parent class, and without an implementation, no instances could be created:

Note that the implementation of this task cares neither about the HTTP data structure, nor about the format actually in use, nor about the details of the transfer data format. It simply works with the ABAP data structures ls_job for the input and es_result for the output.
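A do( ) implementation along these lines could look as follows; go_db and its method modify_job( ) are illustrative stand-ins for the actual database access instance:

```abap
method do.
  data ls_job type ty_job.

  " Request body -> ABAP data, delegated to the format converter
  go_converter->to_abap(
    exporting iv_cdata = go_server->request->get_cdata( )
    importing es_job   = ls_job ).

  " Update or insert the job attributes; the outcome (messages,
  " current job table) is returned in es_result
  go_db->modify_job(
    exporting is_job    = ls_job
    importing es_result = es_result ).
endmethod.
```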

Session, Identity and Locking

In the test applications (neither in the JSON app nor in the XML app), there is neither a login nor an enqueue of the data. Since the applications are open to everybody, this works only because I don’t really operate on a database table ZJOBS. Actually, each client calling the application works with its own session data, so it doesn’t conflict with other users’ operations and is itself not disturbed by other users. The session data are preserved as server-side cookies, surviving the single dialogue step (for example, reloading the page reproduces the current state of the data).

When a web client is written as a BSP, a session ID is available in the attribute runtime->server_id. This session ID identifies the particular browser instance that made the request. On the client side, this session ID is always contained in a cookie called sap-appcontext. If an application has state that has to be preserved with a session ID, the ID has to be extracted from the sap-appcontext cookie and passed as a query parameter with all the Ajax requests. Here is the function which extracts the session ID from the sap-appcontext cookie:

As a fallback, in line 22, the server->session_id is used. However, there will be a new server->session_id for each request, which results in fresh session data with each dialogue step. If you really need session management, it is essential that the session ID is passed to the server.

It is a good idea to combine the session ID with the login procedure: when the user authenticates, his browser receives a session ID with a limited validity. That session ID has to be passed with each subsequent REST operation. In ABAP, it can be used to store and retrieve session-specific data in the database table SSCOOKIE, via its database access class CL_BSP_SERVER_SIDE_COOKIE.

This coupling of a session ID with the login is – roughly – the way the REST API of HP Quality Center works.

Using ABAP’s Built-In JSON Converter

While the XML converter instance is pretty straightforward to implement – calling one XSLT transformation for XML → ABAP, and another one for the way back – it might come as a surprise that the JSON conversion can be handled in exactly the same way: with transformations. This is possible because the CALL TRANSFORMATION statement supports the JSON format (at least as of SAP_BASIS 702). JSON is auto-detected and parsed into an intermediate JSON-XML format, which can be processed with an arbitrary XSLT transformation and converted into other XML documents or into ABAP data.

For example, a PUT request from our test application may send the following JSON data to the server:

If a string with this content is passed as SOURCE XML to ABAP’s CALL TRANSFORMATION statement, the JSON will be parsed into an XML representation like this one (the format is easy to understand – I think a detailed explanation is not necessary here):

When an arbitrary XSLT transformation is processed with the CALL TRANSFORMATION statement and a JSON string is passed as source, the XSLT will operate on this internal JSON-XML representation. It is easy to transform such a JSON-XML document into ABAP data – more precisely: into an asXML representation of ABAP data. For example, consider the following XSLT transformation:
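Applied to the request body, this looks roughly as follows; the result binding name job is an assumption that must match the asXML output of the transformation:

```abap
data lv_json type string.
lv_json = server->request->get_cdata( ).

" The JSON source is auto-detected, parsed into the JSON-XML format,
" and transformed by ZJSON2JOB into asXML that fills es_job
call transformation zjson2job
  source xml lv_json
  result job = es_job.
```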

This is a valid ABAP data description. If the transformation is named ZJSON2JOB, the data can simply be imported into an ABAP data structure with the components ID, REPID, and so on – like the structure es_job in the following implementation of the JSON converter.

Many things can be done with the identity transformation ID, with no need to define your own XSLT transformation at all. If you can dictate the JSON data structure used in the web application, it is advantageous to use such a “canonical” structure. For example, consider wrapping the JSON hash with the job attributes into another hash, making it the value of a symbolic key name like “JOB”:

Then the data could be parsed into a structure without the need to develop a custom XSLT transformation, simply by using the identity:

call transformation id
source xml iv_cdata
result job = es_job.

In this example, since I have written both the web client and the server-side processing, I could have chosen this more “canonical” format. But by not choosing it, I learned how to work with more flexible JSON data formats.

There are several reasons for working with “non-canonical” JSON representations of ABAP data:

A JSON format may be designed in favour of the web application – to optimize the readability of the client JavaScript code working on the data.

JSON-based third-party services may be called from the ABAP side (with an HTTP client object).

ABAP data may be projected onto the essential data, reducing the message size to the data which are really needed.

Just to illustrate, let’s have a look at the other conversion – the way out, from the server to the client. Again, the format differs slightly from the “canonical” JSON format that would simplify the ABAP-side handling considerably. As mentioned, the result data structure contains

The following format would be a perfect JSON pendant to this structure. It could simply be produced with the identity transformation, passing “source result = ls_result” (where ls_result is a structure of type ty_result):

All the component names match perfectly with the JSON hash key names,

An internal table is mapped to a JSON array of hashes, each hash representing one entry of the table,

And there is a top level hash with a symbolic name “RESULT” for the complete thing:

We proceed in a similar way as above, only in the other direction: based on the ABAP data type ty_result, we write an XSLT transformation that produces the internal JSON-XML format corresponding to this JSON data string.
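The call for the way out can be sketched like this; the transformation name zjob2json is illustrative. Note that to obtain a JSON string, the result must be written into a sXML JSON writer, not directly into a string:

```abap
data: lo_writer type ref to cl_sxml_string_writer,
      lv_xjson  type xstring.

" A JSON writer serializes the transformation's JSON-XML output as JSON
lo_writer = cl_sxml_string_writer=>create( type = if_sxml=>co_xt_json ).

call transformation zjob2json
  source result = ls_result
  result xml lo_writer.

" lv_xjson is an xstring with the UTF-8 encoded JSON; decode to string
lv_xjson = lo_writer->get_output( ).
ev_cdata = cl_abap_codepage=>convert_from( lv_xjson ).
```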

The JSON-XML data format of the desired JSON data string looks like this:

We see that, basically, for each deviation from the “canonical” JSON representation of ABAP data, there is a template in the XSLT transformation handling this deviation. For example, the different name TYPE instead of MSGTYPE in the target is handled with the template

The ID has to be rearranged: From being a simple attribute of the ZJOBS data structure, it has to be raised one level higher to become the key of a hash. All the other attributes, except ID, are copied as string nodes into the result. For this, these two templates are necessary:

That’s all: ev_cdata will then contain the JSON data string, to be placed in the HTTP response body.

Summary

I outlined some typical topics concerning the implementation of REST APIs in ABAP. It is possible to keep separate concerns in separate (local or global) classes by applying patterns like Strategy. This is how the class ZCL_JOB_DATA, serving my demo REST API, is organized (the basic ideas have been discussed in this blog):

The problem is I created a service in SICF with a handler class which implements IF_HTTP_EXTENSION~HANDLE_REQUEST, but when I execute the corresponding HTML page it is not triggering the handler class 🙁 ..

“when i execute the corresponding html page it is not triggering the handler class 🙁 ..”

Assuming you know that a REST service is not an HTML page but only a remotely callable function using HTTP as protocol and returning data as response (usually in XML or JSON format), I understand that you have a web application which calls the REST API implicitly with JavaScript.

When you analyze the page behaviour with a tool like Firebug:

Is the request really issued by your JavaScript? You can see this in the Network tab.

is there a response sent from the SAP system?

Is the response OK (status 200), or not OK (status 40x or even 50x)? In the Network tab, the latter are shown in red.

If the response is not OK, analyze it to find out what went wrong.

There is a variety of possible reasons why a request may fail: wrong URL, SICF service not active, wrong or missing request data (when required), an ABAP short dump during processing, …

I am connecting to SAP using HttpWebResponse and HttpWebRequest. We are fetching a list of orders from SAP. The issue is that for each order fetched from SAP, a session gets created and does not get closed. On debugging, we took the URL used to fetch the data (created in the code) and pasted it in the browser window: it created a session. Then we took the URL for closing the session (created in the code) and pasted it in the browser window: it logged off the session. Ultimately, from the browser we are able to open and close the session, but not from the code. Can you please help?

I have been reading about and trying to implement a REST API in ABAP and communicate with it from an Android device (out of curiosity, mostly). So far I have succeeded by defining a class implementing IF_HTTP_EXTENSION~HANDLE_REQUEST( ).

Your idea of implementing the Strategy pattern to handle PUT, GET etc. is cool. I was happy that I could do the same until I ran into this help documentation: REST Interfaces and Classes – Connectivity – SAP Library. The new REST library already has interfaces to implement the PUT and GET requests, instead of doing it via our own logic using the ‘~request_method’ header value.

But I am not very clear on how to achieve this, since I couldn’t get my hands on the REST library – it’s not yet available in the system that I am working on.

It will be very useful if we get to see some documented implementation.

The ABAP REST Library is quite new, and it is very likely that your system doesn’t have it yet. It is coming in the basis system with release 7.40, and you can get it in previous releases only with the latest SAP_BASIS patches.

The classes are defined with the namespace ‘/IWCOR/’ (e.g. /IWCOR/CL_REST_HTTP_HANDLER). But even in that system I was not able to find any service defined using the REST library, and there is no developer access, hence it was not of much use. Still, it could be helpful if we need to know how SAP has handled REST communications.

The system you’re accessing is currently a release 7.02. As such, the SREST package is missing.

The classes you’re mentioning do indeed belong to the REST library, but this is the implementation specific to Gateway. This implementation is in principle not intended to be used generically, so I’m afraid you cannot do much with it. You need a 7.03/7.31 or 7.40 system.

Thanks for pointing me to the SAP REST library, which has been delivered with basis release 731 and is absolutely new to me.

As with every framework, it may be that it delivers too much for a concrete purpose. I will have to explore that.

If you have questions about my way of implementation – as exposed here – then go ahead. What I cannot do is answer questions about the SAP REST library, since it is new to me. (But you don’t have a concrete question about it either – only one on its availability, which seems to be SAP_BASIS 731.)

I have a situation where I need to automatically log in to SAP when the user enters ID and password into a .NET application. I am planning to create a REST service through an SICF node and redirect the user to an SAP Web Dynpro application.

My biggest challenge is

How can I authenticate him? I am planning to match the user ID and password of the .NET application with those of SAP, so when we call the REST service using the URL, I don’t want a pop-up for user ID and password.

Is there a way to avoid the popup requesting user ID and password, since I already have the user ID and password and can pass them as URL parameters? There are no security concerns, as it is a terminal server.

So you have a .NET application and want to present the user a Web Dynpro from there. But what do you need REST for, then?

“Is there a way to avoid the popup requesting userid and password since I already have the user id and password and can pass as URL param. there is no security concerns as it a terminal server.”

This seems to be somehow unrelated to the former. If it bothers you that your service requires user and password, you could make the service anonymous by providing credentials in the SICF service.

If you really want to pass user ID and password as URL parameters – which nobody would recommend, even in a seemingly protected environment – you can do this by using the appropriate URL query parameters: sap-user and sap-password.

Hi Rüdiger, I really appreciate your reply. We have a scenario where single sign-on is not a possibility.

I have a third-party application through which the user enters the credentials. Basically, I want to call a Web Dynpro screen after the user logs in through the third party; we are planning to sync the user ID of the third-party application with that of SAP. After the user logs in through the third-party application, I want to call a service through an SICF node and log in without prompting for user ID and password again; after the login is successful, I shall redirect from the service handler to the Web Dynpro application. Sorry if I am confusing you.

I want to skip the password prompt which SAP gives when we call the SICF service by specifying the URL.

If it bothers you that your service requires user and password, you could make the service anonymous by providing credentials in the SICF service.

If you really want to pass user ID and password as URL parameters – which nobody would recommend, even in a seemingly protected environment – you can do this by using the appropriate URL query parameters: sap-user and sap-password.

Additional question: how is it possible to handle locks? I want to change e.g. a notification in a web application, but if I lock it in SAP (with enqueue), the lock will be deleted after the request is finished.

In the stateless requests that I prefer, there is no other way than to write a lock entry into a special database table. In the applications in which we needed such locking, we had created a special database table ZLOCK, with a GUID as key, and in the data part a generic object key field and an object type (together building a secondary index), a field for the user ID, and the lock creation time. Requesting a lock looks for an entry in ZLOCK with the same object key and type, a creation date not older than a system-wide defined timeout parameter, and a different user ID than one's own. If such an entry exists, the current user cannot set his lock; an exception is thrown to inform the service that the object in question can’t be changed. If no such entry exists, the user can set his own entry. If an entry exists but is out of date, the user and creation time can be updated.

In addition to the timeout mechanism, you can provide a special logoff function which clears all entries for the current user.

Similar to SM12 for normal enqueue entries, a little report is helpful for support, showing the current locks and allowing them to be deleted (in special cases).
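The lock request described above could be sketched like this; the table ZLOCK and its fields are taken from the description, while the exception class, the timeout parameter gv_timeout, and the GUID key handling (omitted here for brevity) are illustrative:

```abap
data: ls_lock type zlock,
      lv_now  type timestamp.

get time stamp field lv_now.

select single * from zlock into ls_lock
  where objkey  = iv_objkey
    and objtype = iv_objtype.

if sy-subrc = 0
   and ls_lock-uname <> sy-uname
   and ls_lock-crtime > cl_abap_tstmp=>subtractsecs( tstmp = lv_now
                                                     secs  = gv_timeout ).
  " A fresh lock of another user exists -> refuse the change
  raise exception type lcx_foreign_lock.   " hypothetical exception class
else.
  " No entry, own entry, or an expired entry: set / refresh the lock
  ls_lock-uname  = sy-uname.
  ls_lock-crtime = lv_now.
  modify zlock from ls_lock.
endif.
```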

Thanks for your answer. I had the same idea, but it’s a lot of work to develop an extension in or around the enqueue FMs’ code to handle this. So I changed to stateful requests with the method server->set_session_stateful( ) inside the handler. I hope that no big disadvantages will appear.

It’s me again. Now I have written a stateful application (outside SAP, PHP on a web server). It works great with the standard locking mechanism in SAP. I can remove locks within the same session, and if I log out, the locks are also removed! I can read and manipulate data. Perfect!

But… if I make a change to the source code (SAP side), the current session doesn’t recognize the changes. This isn’t a big thing; I could kick the user/session from SM05.

Now my problem: if I change e.g. an equipment in SAP, the changes also aren’t recognized in the current session (if I’ve already opened this equipment from the web).

Do you know why this is happening? Is there a setting for it?

I thought, why not use SAP standard stuff… now I regret that I didn’t implement your suggestion…

What you describe is standard SAP behaviour. The needed data are loaded into the session memory from the database when a transaction is started, and later changes to the database won’t affect the session data in memory. This ensures data consistency during a session.

BTW, you would face similar problems if you kept session data in the table SSCOOKIE for usage across dialogue steps when working with stateless applications.

In both cases, you would need to have some kind of notification mechanism to get your transaction informed about changes.

For this, you would need a central entry point in the code which is passed with each dialogue step. For BSP applications, this would be something like the method DO_INITATTRIBUTES or even DO_REQUEST (redefined). In it, you could check something like a “change stack” table for recent changes and reload the master data if necessary. You won’t get this for free!

But in spite of this being technically possible, the question remains:

What if the user’s changes make sense only for the version of data as of when the user started the transaction? How to handle this?

This blog gave me some understanding of configuring a service in SICF and implementing the interface IF_HTTP_EXTENSION~HANDLE_REQUEST. But I need to access the URL used to call this service: I need to read the parameters concatenated to the URL generated by SICF. To the URL generated by SAP, users will append some default values for the selection screen of a transaction code. I need to capture those parameters and perform some operation on them. Please help.

In addition, with the implementation of a custom class, the parameters passed in the GUI configuration of SICF have become irrelevant. Though I am passing the ~singletransaction parameter, it is not showing any effect, as we are using a custom handler class. Any suggestions on how to get this to work?

You want to get form field values (passed as query parameters in the URL)? Then use server->request->get_form_field( ) and RTFM for further questions on the call details.

For this, you wouldn’t need the URL. You can use server->request->get_form_field( ); it retrieves the URL query parameters as parsed from the URL, so you don’t need to parse the URL yourself. If, for whatever reason, you would like to do this parsing yourself, you can use the pseudo header field ~query_string (that is, you call server->request->get_header_field( ‘~query_string’ )). See List of the Pseudo Header Fields – Components of SAP Communication Technology – SAP Library for more details.
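A minimal sketch of both variants inside handle_request( ); the parameter name 'id' is just an example:

```abap
data: lv_id    type string,
      lv_query type string.

" Parsed access to a single query parameter / form field
lv_id = server->request->get_form_field( 'id' ).

" Or the raw, unparsed query string via the pseudo header field
lv_query = server->request->get_header_field( '~query_string' ).
```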

As for the second question: I must admit I don’t understand it. Maybe somebody else does.

Regarding the second question: we have the option to set parameters of the service. As shown below, I have set the parameter ~singletransaction to 1. This does not allow the user to navigate to any transaction other than the one linked with the respective URL. Since I have used a custom implementation of HANDLE_REQUEST, this is not working anymore.

Hi Arijit, your question is slightly off-topic, as this blog describes the SAP system in the server role. You are asking how to use an SAP system as a client to perform API calls via the unified tool Apigee.

Anyway – if you know the form fields to submit, you can use the class cl_http_client for this. It is documented, and there are sample reports like RSHTTP01.
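A minimal sketch of such a call; the URL and the form field are illustrative, and error handling is trimmed:

```abap
data: lo_client type ref to if_http_client,
      lv_body   type string.

cl_http_client=>create_by_url(
  exporting url    = 'http://example.com/api/resource'
  importing client = lo_client ).

lo_client->request->set_method( 'POST' ).
lo_client->request->set_form_field( name = 'field1' value = 'value1' ).

lo_client->send( ).
lo_client->receive( ).

lv_body = lo_client->response->get_cdata( ).
lo_client->close( ).
```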

Hi Jean-François, from your question it is not clear which problem you have. The piece of code that you pasted seems to indicate that you

have a self-written transformation zjpar_test04 (which we don’t know),

you want to apply this transformation to a structure, not an internal table, since you use a variable named ls_result

and you want to receive the result in the form of JSON data.

The result of the transformation must therefore be a valid XML document, conforming to the JSON-XML syntax. You can check this by using my Schematron validator, which I described in another blog. Just write the result not into lo_writer but into a string (“result xml lv_result”), copy the XML from the debugger at that point, and paste it into the JSON-XML validator.

If it is valid, then lo_writer will quite surely receive correct JSON in your code snippet. If not, you’ll have to adapt the transformation to make it work.

I know that my problem is not clear, as I’m improving my skills concerning APIs while developing.

Thanks to you, I succeeded in developing the transformation sheet and converting XML to JSON.

Contrary to your solution, I’m not receiving data from an API; I’m trying to send data to an API.

Unfortunately, I’m facing two issues:

– With my SAP version (SAP BASIS 70207), when using SM59 the HTTP user is limited to 32 characters. There is an OSS note 2020611 which seems to correct this issue; however, it is not available for my SAP BASIS version.

– With the API, it seems that the version may evolve in the future, hence the URL may change as well. It seems easier to change SM59 directly in the production system than to change the ABAP code and transport TOs.

The idea of a RESTful web service is that it should be stateless and loosely coupled, and that’s why I use it directly.

I suggest testing your API and the request parameters directly from some other application (like JavaScript etc.) or using an online HTTP request site (like Request Maker, Hurl.it etc.) and seeing if you can connect. If you are successful there, then you should be successful within SAP also.