SAP Gateway

Code snippet for adding annotations to your Gateway Service


Now that annotations are making UI5 development easier by using Smart controls, it is important to learn how to add these annotations to your service. SEGW does not yet allow you to add most of the annotations. Till SEGW inherently provides that feature, here is how you can do it using code.


Step 1. Go to your MPC_EXT class.


Step 2. Redefine the DEFINE method.


Step 3. Add the following code:


    DATA: lo_entity_type TYPE REF TO /iwbep/if_mgw_odata_entity_typ,
          lo_property    TYPE REF TO /iwbep/if_mgw_odata_property,
          lo_annotation  TYPE REF TO /iwbep/if_mgw_odata_annotation.

    super->define( ). "Ensure you call the parent metadata

    lo_entity_type = model->get_entity_type( iv_entity_name = 'EmpDetail' ). "Your entity name

    lo_property = lo_entity_type->get_property( iv_property_name = 'DateOfHire' ). "Property inside your entity

    lo_annotation = lo_property->/iwbep/if_mgw_odata_annotatabl~create_annotation( /iwbep/if_mgw_med_odata_types=>gc_sap_namespace ). "SAP's annotation namespace

    lo_annotation->add( iv_key = 'display-format' iv_value = 'Date' ). "The specific annotation you want to add


This will result in the annotation appearing in your service's $metadata document.
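For reference, the property in the generated $metadata would then look something like this (entity and property names taken from the snippet above; the other attributes depend on your model):

```xml
<EntityType Name="EmpDetail">
  <Property Name="DateOfHire" Type="Edm.DateTime"
            sap:display-format="Date"/>
</EntityType>
```

Smart controls read the sap:display-format attribute and render the field accordingly.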


Real life example of upgrading a 3-tier NetWeaver Gateway landscape




This blog shares our experience of upgrading a 3-tier NetWeaver Gateway landscape, pointing out the challenges we faced and how we were able to solve them.


Source Landscape


Existing 3-tier landscape with NetWeaver Gateway 2.00 SP 07 (NetWeaver 7.31 SP 09) in a central hub deployment model. Applications connect to a CRM 7.0 EHP1 system with NetWeaver Gateway 2.00 SP 07 backend components.


Target Landscape


Existing 3-tier landscape with SAP Gateway 7.40 SP 13 in a central hub deployment model. No changes to backend systems.


Upgrade Strategy


Execute an in-place upgrade. Start with the sandbox systems, which are recent copies of the respective production systems.


Initial Plan


According to SAP Note 1830198, systems can be upgraded independently: upgrading Gateway doesn't require upgrading the backend components of the connected backend systems, assuming sufficient SP levels exist in both the Gateway and the backend systems. In our case we met the SP level requirements, so the plan was not to upgrade the backend components.




As soon as we had upgraded sandbox, we realized that our existing Gateway services no longer worked. More specifically, none of the services leveraging filter functionality worked. In addition, currency handling that used to work was now broken.




Debugging the filter problem, we found that passing filter values was broken by the upgrade: values were being truncated. Looking into it in detail, we found SAP Note 2205402, which we applied on both the Gateway system and the backend system, as instructed by the note. This, however, wasn't sufficient. Since the corrections are only partly contained in 740/13, we also had to implement SAP Notes 2232883 and 2241188 on the Gateway system. Even that wasn't enough; we also had to implement SAP Note 2245413 in the backend system.


Applying the SAP notes fixed the issues with filter functionality. The issue with currencies is explained in SAP note 2028852. We chose to change the applications in order to avoid the decimal problems described in the SAP note.


New Plan


In order to apply the SAP notes required to fix the issues with filtering, we had to also update the backend components to 2.00 SP 11. The new plan is to execute the in-place upgrade of NetWeaver Gateway and update the backend components.




I'm sure breaking compatibility or interoperability wasn't on SAP's radar, but it happened. I have contacted SAP Gateway Product Management but I haven't yet been provided with an official explanation. In our case a simple technical upgrade of NetWeaver Gateway turned into a full-fledged upgrade project.




Take everything with a grain of salt; even official information can't be trusted blindly. Test and validate everything yourself, preferably in a sandbox environment.


We are currently executing the new plan in our development landscape. I will update this blog should we run into other issues.

Slacking Off (1 of 3)


(This is part 1 of a 3 part series. See Part 2 and Part 3 for the whole story.)


Did you ever have a late night instant message conversation that went something like this:


[Screenshot: a late-night instant message conversation demanding that you check something in SAP]


It’s no fun to be in that conversation. You know you’re stuck sitting in front of a screen for at least the next 10 minutes. And since it’s your work laptop, you know the corporate internet police won’t even let you browse reddit for cat pictures while you wait for VPN and SAP GUI to load. What’s more, you know that whatever this person is yelling about is probably not your fault.


I’ve been there, trust me.


What if your conversation could look like this, instead:


[Screenshot: the same conversation, this time answered immediately in chat]


Did you notice Commander Data interject in that exchange? More on that later.


As nerds, our jobs often involve performing routine technical tasks for the people who use our systems. Maybe you reset a lot of passwords, check the status of integrations, or respond to a support inbox. You probably have loads of different communication tools at your disposal. Chat, email, carrier pigeons… whatever gets the job done. If someone needs your help they’ll generally find a way to get in front of you, digitally or otherwise.


One of the coolest communication tools I’ve worked with in the last couple years is Slack. It’s got individual conversations, group chats, categories, and anything you’d expect from a team chat tool. It’s quickly overtaken email as my preferred method of talking with colleagues.


Except it’s way more than chat. Slack allows you to use pre-built integrations to look at your Jira tasks, GitHub commits, and thousands of other things. What’s even better: you can make your own integrations that interact with its web API. Which makes it the perfect thing to plug into your SAP Gateway to make use of the REST APIs you’ve already created for other purposes.


In my next couple posts, I’ll show you how to make exactly what I did above using (nearly) free tools.


Slack Setup

If you're not using Slack already, you can get a free account. It's very simple and straightforward. Once you've got an account, follow these steps to set up the Slack end of this chain:


  • I set this up as a Slash Command. That's where the "/ask-sap" piece comes from in my chat transcript above. Go here to set a new one up for yourself.
  • On the next screen, choose the command that you want to use, starting with a '/' character. You can use /ask-sap if you want to stay perfectly within the tutorial, since these custom commands are for your own Slack team only.
  • Click the "Add integration" button.
  • [Screenshot: the Slash Command configuration screen]
  • On the next page, pay close attention to the "Outgoing Data" and the "Token" sections. You may want to copy or screenshot them for reference later.
  • The only thing that absolutely has to be filled in is the URL field. This has to be an https destination that you own or control, and it has to be an endpoint programmable by you. Post 2 of 3 in this series will show you how to set that up in Google App Engine - but you could realistically do it anywhere you have the rights to run programs on a web server.
  • Click "Add integration" at the bottom of the page, filling out whatever else you want to along the way. I suggest at least showing your command in the autocomplete list.


You have now set it up so that Slack will respond to any message that starts with "/ask-sap" by sending an HTTP POST to the URL you provided in the settings. The format of the POST will look like the "Outgoing Data" section that you saw in the setup process. For this demo, the most important pieces are the token and text fields.
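The body of that POST is ordinary form-encoded data, so it's easy to see what your endpoint will receive. A quick sketch of pulling out the two fields that matter for this demo (Python 3 used for illustration; the payload values are made up):

```python
from urllib.parse import parse_qs

# Example body of a Slash Command POST (all values are made up)
body = "token=gIkuvaNzQIHg97ATvDxqgjtO&team_id=T0001&channel_name=general" \
       "&user_name=paul&command=%2Fask-sap&text=shout+out+a+test+of+RFC+NONE"

# parse_qs returns lists per key; take the first value of each
fields = {k: v[0] for k, v in parse_qs(body).items()}

token = fields["token"]  # compare this against the token from the setup screen
text = fields["text"]    # the text after the slash command itself

print(token)  # gIkuvaNzQIHg97ATvDxqgjtO
print(text)   # shout out a test of RFC NONE
```

The token lets you verify the request really came from your Slack team; the text carries the user's actual question.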


That's it! You now have a Slash Command available in any of your Slack channels. It won't do anything yet, but that's what we'll set up in the next section.


On to Part 2!

Slacking Off (2 of 3)


(This is Part 2 of a 3 part series. See Part 1 and Part 3 for the whole story.)

In Part 1, we got Slack up and running with a Slash Command that sends an HTTP POST to a specified URL endpoint. In this part, I'll show you how to set up a basic Google App Engine web server in Python to respond to that HTTP POST and format a request for SAP Gateway. From Gateway, we'll output the data that the request asks for and send it back to Slack. I won't cover every feature of App Engine - this is an SAP blog, after all - but I'll provide sample code, links to how-tos, and some tricks I learned along the way. The amazing thing is that a super basic implementation is only about 40 lines of Python code!




  • You'll need a Google account (if you have a Gmail address, you're good to go). I like using an IDE like Eclipse with PyDev installed, but if you are a complete notepad.exe ninja then go for it.
  • You'll need to secure a domain name for yourself, or have rights to one. Google, again, has an easy way to do this.
  • You'll also need to get SSL set up for that domain, which you can do for 90 days free at Comodo.
  • Once you have the cert, you can apply it to your Google Domain like this.


Now you're ready to code! The easiest way to set up a project for App Engine is to do the play-at-home 5-minute version. This will get you a project set up, the right deployment tools installed, and a project folder ready to go. Try it out, test it a few times.


Once you're comfortable with how that works, you can simply replace the code files with code I'll provide below. Note that there are several places in the code where I've put some angle brackets with comments - this is where you'll need to fill in your own solution details. My meager programmer salary won't cover a giant hosting bill because everyone copies my domain/settings and sends all their messages through my server.


First, replace the contents of your app.yaml file with this code:


application: <your-google-app-id>
version: 1
runtime: python27
api_version: 1
threadsafe: true

handlers:
- url: /.*
  script: main.app



Very straightforward, not much to comment on here. Just remember to replace the app-id section at the top.


Next, create a file called main.py (or replace the contents of the existing one) with this code:


import webapp2
import json
from google.appengine.api import urlfetch


class SlackDemo(webapp2.RequestHandler):

    def post(self):
        sap_url = '<your-sap-gateway>/ZSLACK_DEMO_SRV/RfcDestinationSet'
        json_suffix = '?$format=json'
        authorization = 'Basic <your-basic-credentials>'
        slack_token = '<your-slack-token>'
        request_token = self.request.get('token')

        # Reject anything that doesn't carry the token Slack generated for us
        if slack_token != request_token:
            self.response.headers['Content-Type'] = 'text/plain'
            self.response.write('Invalid token.')
            return

        text = self.request.get('text')
        details = {}
        response_text = ''

        # 'shout' makes the response visible to the whole channel
        if text.find('shout') > -1:
            details['response_type'] = 'in_channel'

        if text.find('test') > -1:
            # The RFC destination is the last word of the command text
            rfc_destination = text.split()[-1]
            request_url = sap_url + "('" + rfc_destination + "')" + json_suffix
            headers = {'Authorization': authorization}
            response_tmp = urlfetch.fetch(url=request_url, headers=headers)
            response_info = json.loads(response_tmp.content)
            response_text += 'Sensor sweep indicates the following:\n'
            response_text += response_info['d']['Destination'] + ' - '
            response_text += response_info['d']['ConnectionStatus'] + ' - '
            response_text += str(response_info['d']['ConnectionTime']) + ' ms response'
        else:
            response_text += "I'm sorry, Captain, but my neural nets can't process your command."

        details['text'] = response_text
        json_response = json.dumps(details)
        self.response.headers['Content-Type'] = 'application/json'
        self.response.write(json_response)


app = webapp2.WSGIApplication([
    ('/slackdemo', SlackDemo),
], debug=True)



I'll do a little explaining here.

  • We'll set up ZSLACK_DEMO_SRV in the next post, part 3.
  • To use Basic authentication, you'll need to take some credentials with access to your SAP Gateway and turn them into base64 encoded characters. One easy way is to bring up the Chrome javascript console (ctrl-shift-j), type "btoa('USERNAME:PASSWORD')", and take the resulting string. Obviously use a real user and password here.
  • Take the slack_token value from the screen where you set up your slack slash command in part 1.
  • The routing configuration at the bottom means you should configure Slack to send its commands to https://<your-domain>/slackdemo. Change that path to whatever you like.
  • We treat the 'shout' text as a command to send the result of the command to the whole chat window. Otherwise the command will respond only to the person who sends the command and others won't see it.
  • We look for the word 'test' as the key to actually invoke our functionality. If we don't find that, Commander Data will respond with his polite apology.
  • We look for the name of the RFC by splitting the command up into words and then just taking the last word. Python has this nice little syntax for lists where index [-1] is the last element of the list. text.split()[-1] does this for us.
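Two of the bullets above are easy to sanity-check outside the browser console. Here's the same base64 encoding and last-word extraction done in plain Python (Python 3 used for illustration; the credentials are obviously fake):

```python
import base64

# Equivalent of btoa('USERNAME:PASSWORD') in the Chrome javascript console
credentials = base64.b64encode(b"USERNAME:PASSWORD").decode("ascii")
auth_header = "Basic " + credentials

# Picking the RFC destination out of the command text: the last word wins
text = "shout out a test of RFC NONE"
rfc_destination = text.split()[-1]

print(auth_header)      # Basic VVNFUk5BTUU6UEFTU1dPUkQ=
print(rfc_destination)  # NONE
```

The resulting header value is exactly what the main.py code above expects in its authorization variable.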


Build the project and deploy it to the web site you're using. Now we're ready to create the Gateway service that will do the simple RFC test that Commander Data did in part 1.


Off to part 3!

Slacking Off (3 of 3)


(This is Part 3 of a 3 part series. See Part 1 and Part 2 for the whole story.)

In the last 2 posts we paved the way to get some data out of SAP from Slack. First, we set up Slack to send out a request when a user enters a Slash Command. Then, Google App Engine handles that request and forwards it to Gateway. Now Gateway needs to respond back to Google with the RFC connection test that the Slack user asked for.

Here's a simple OData service setup that will test an RFC connection on the ABAP system. My intention is to inspire you to build other cool solutions - I'm just setting this up quick-n-dirty style to explain the concepts. Take this and make something that works for you!

Go to SEGW and create a service. I called mine ZSLACK_DEMO. Here's an example setup of the fields for an entity called RfcDestination:

[Screenshot: SEGW field setup for the RfcDestination entity]

Then code up the RFCDESTINATIONSE_GET_ENTITY method in the generated class ZCL_ZSLACK_DEMO_DPC_EXT (assuming you kept the same names I used). Make sure you generate the project first, and then do the redefinition process for the method I mentioned. Here's a great document on setting up class-based Gateway services that goes more in-depth.

Here's a simple implementation of an RFC ping method that matches up with the service we created.

   METHOD rfcdestinationse_get_entity.
     DATA: lv_start   TYPE i,
           lv_end     TYPE i,
           lv_msg     TYPE c LENGTH 255,
           lo_ex      TYPE REF TO cx_root,
           lv_rfcdest TYPE rfcdest,
           ls_key_tab LIKE LINE OF it_key_tab.

     TRY.
         READ TABLE it_key_tab INTO ls_key_tab WITH KEY name = 'Destination'.
         IF sy-subrc IS INITIAL.
           lv_rfcdest = ls_key_tab-value.
           er_entity-destination = lv_rfcdest.

           GET RUN TIME FIELD lv_start.
           CALL FUNCTION 'RFC_PING' DESTINATION lv_rfcdest
             EXCEPTIONS
               system_failure        = 1 MESSAGE lv_msg
               communication_failure = 2 MESSAGE lv_msg
               OTHERS                = 99.
           GET RUN TIME FIELD lv_end.

           IF sy-subrc IS INITIAL.
             er_entity-connection_status = 'OK'.
             er_entity-connection_time = ( lv_end - lv_start ) / 1000. "microseconds to ms
           ELSE.
             er_entity-connection_status = lv_msg. "RFC error text from the failed call
           ENDIF.
         ENDIF.
       CATCH cx_root INTO lo_ex.
         er_entity-connection_status = lo_ex->get_text( ).
     ENDTRY.
   ENDMETHOD.

Maybe not production quality, but ready to do the trick. For a good connection, it will give you an OK ConnectionStatus and a number in milliseconds for the response time. For a bad connection, it will respond with the RFC error in the ConnectionStatus field. Our Google App Engine web server receives this and plugs it into a text response to Slack. When Slack receives the response it puts the text into the chat window for the user who requested it.

Wrapping Up

Assuming all the pieces of the chain have been done correctly, you can activate your slash command. Try it with something like "/ask-sap shout out a test of RFC <your_destination>". If you're all set, the chat window will shortly show a response from SAP.

This was a very simple prototype implementation - but there are so many things you could do! I'll leave you with a brain dump of ideas to inspire you beyond my work.

  • Set up a batch program that looks for new work items for people and send them a Slack message. Wouldn't it be cool to get a digest of everything I need to approve in one private chat window every morning? Check out Incoming Webhooks for more possibilities here.
  • Incoming webhooks would even be a simple way to enable some basic real-time notifications. User-exits or enhancement points could be created with knowledge of the webhooks and respond right away to almost any ABAP event.
  • Plug in some of your master data creation processes (assuming they're simple enough) to fit into a command. Imagine "/sap-create new business partner PAUL MODDERMAN <other field> <other field>".
  • Slack is cloud-based and doesn't require any complex VPN setup. The Slack app on your smartphone would be an easy way to enable your processes for quick response.


Go make cool stuff!

Maintain Data Concurrency in OData




More than one user can access the same entity, and hence the same data, in SAP Fiori applications. So how do you prevent parallel edit access, so that data isn't overwritten in flight?


There are three ways to lock records to manage the concurrency.

    1. Pessimistic

        In this scenario the application assumes that concurrent writes will occur and protects against them by aggressively locking resources. This can lead to deadlocks and reduced performance, as applications using a resource have to wait in a queue to apply their changes.

    2. Optimistic

        In this scenario the application assumes that concurrent writes are rare and so allows operations to continue, verifying at write time that nothing has changed in the meantime.

    3. Semi-optimistic

        This is a combination of pessimistic and optimistic concurrency controls. This kind of solution is used for very specific scenarios.


For data concurrency, OData recommends implementing entity tags (ETags). Since OData is built on the stateless HTTP protocol, it can use only optimistic concurrency control, and only that is discussed in detail further in this post.
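The optimistic flow can be sketched in a few lines of code. This is a hypothetical in-memory store, not Gateway code; it just shows the compare-then-write idea that ETags implement over HTTP:

```python
class OptimisticStore:
    """Toy key-value store that rejects writes based on a stale version."""

    def __init__(self):
        self._data = {}      # key -> value
        self._version = {}   # key -> integer version (stands in for an ETag)

    def read(self, key):
        return self._data[key], self._version[key]

    def write(self, key, value, expected_version=None):
        current = self._version.get(key, 0)
        # Precondition check: the caller's version must match the stored one
        if expected_version is not None and expected_version != current:
            return False  # analogous to HTTP 412 Precondition Failed
        self._data[key] = value
        self._version[key] = current + 1
        return True


store = OptimisticStore()
store.write("order-1", "draft")                   # initial write, version 1
value, version = store.read("order-1")

store.write("order-1", "approved", version)       # first editor wins
ok = store.write("order-1", "rejected", version)  # second editor loses
print(ok)  # False
```

Nothing is ever locked; the second writer simply finds out at save time that its copy is stale.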



Implementing ETags in SAP Gateway


SAP Gateway allows three ways to implement ETags.


1. A field based ETag (typically a timestamp)

When an entity is a subset of a database table and the table has a field that maintains a timestamp signifying when the record last changed, that field can be included in the entity and used as the ETag.


In transaction SAP Gateway Service Builder (SEGW), under Entity Types, select a field as your ETag property. Save and re-generate the service.



2. A full entity based ETag (typically a timestamp)

    The shortcoming of the above method is that there are scenarios where only a few fields from a database table are exposed in an entity model. Changes to the DB table in the backend can occur through various sources, including background programs, manual changes through backend transactions, etc. These backend changes might not have changed the entity's fields at all, but they will still be considered a change.


    In such a scenario the full entity based ETag could be implemented.


    Include a field ETAG with data element HASH160 in the entity, and use that field as the ETag property for the entity.


    Once all data for the entity has been fetched, the hash value is calculated over the full entity.
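The hash calculation itself is simple. Here is the idea behind a HASH160-style ETag sketched in Python (SHA-1 is 160 bits; the entity fields are hypothetical, and in ABAP you would use a system hash utility rather than rolling your own):

```python
import hashlib

def entity_etag(entity):
    """Build a SHA-1 (160-bit) hash over all entity fields, in a stable order."""
    payload = "|".join("%s=%s" % (k, entity[k]) for k in sorted(entity))
    return hashlib.sha1(payload.encode("utf-8")).hexdigest().upper()

emp = {"EmpId": "1001", "Name": "Jane Doe", "DateOfHire": "2015-06-01"}
etag1 = entity_etag(emp)

emp["Name"] = "Jane Smith"   # any field change...
etag2 = entity_etag(emp)     # ...produces a different ETag

print(etag1 != etag2)  # True
print(len(etag1))      # 40 hex characters = 160 bits
```

Because the hash covers every field of the entity as the client saw it, any relevant change invalidates the client's ETag, regardless of which backend process made the change.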




3. Partial entity based ETag


Whatever ETag method we implement will apply to all operations except Create. There are scenarios where the get_entityset and get_entity implementations of the same entity might not produce the same ETag values. This is due to the fact that certain information in an entity is costly to fetch during a get_entityset operation. In such a scenario the partial entity based ETag can be implemented.


    The implementation of partial entity based ETag is very similar to the full entity based ETag. The only change is the calculation of the hash value.


    Here, instead of passing the complete entity, you clear the values of the fields that you do not want to be part of the hash calculation and then calculate the hash.


For example:
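The idea can be sketched like this: clear the expensive fields first, then hash what's left (Python for illustration, field names hypothetical; in ABAP you would clear the fields in a local copy of the entity structure before calling the hash calculation):

```python
import hashlib

# Fields that are too costly to fetch during get_entityset (hypothetical names)
EXCLUDED_FIELDS = {"AttachmentCount", "ApprovalHistory"}

def partial_etag(entity):
    """Hash only the cheap fields so get_entity and get_entityset agree."""
    trimmed = {k: v for k, v in entity.items() if k not in EXCLUDED_FIELDS}
    payload = "|".join("%s=%s" % (k, trimmed[k]) for k in sorted(trimmed))
    return hashlib.sha1(payload.encode("utf-8")).hexdigest().upper()

full = {"EmpId": "1001", "Name": "Jane Doe", "AttachmentCount": "7"}
cheap = {"EmpId": "1001", "Name": "Jane Doe", "AttachmentCount": ""}

print(partial_etag(full) == partial_etag(cheap))  # True: excluded field is ignored
```

Both code paths now produce the same ETag even though one of them never fetched the expensive field.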



Using ETags in Fiori


Classic Scenario with ETags





During get_entity, if a valid ETag is provided and the entity has not changed, the server responds with return code 304 (Not Modified). If the entity has changed, the new data is returned with return code 200.



If-None-Match: W/"'DBF5DD4DE0073002917521B7057C0826FC5A7F8E'"



If you do not want to check against an existing ETag but want the data anyway, you can use '*':


If-None-Match: *



Update/Merge/Delete operations use the HTTP header If-Match.


If a valid ETag is provided, the OData infrastructure checks the hash value by calling get_entity and verifies the passed ETag value. If the check is successful, the update/merge/delete operation proceeds without any issues. If the ETag does not match, the server returns status code 412 (Precondition Failed).
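Put together, the server-side decision logic for both headers boils down to a couple of comparisons. A simplified sketch (status codes as described in the text; this is not the actual Gateway framework code):

```python
def handle_get(current_etag, if_none_match=None):
    """Conditional read: 304 if the client's ETag is still current."""
    if if_none_match is not None and if_none_match != "*" \
            and if_none_match == current_etag:
        return 304, None                  # Not Modified, no body sent
    return 200, {"etag": current_etag}    # fresh data (with the current ETag)

def handle_update(current_etag, if_match=None):
    """Conditional write: 412 if the client's ETag is stale."""
    if if_match is not None and if_match != "*" and if_match != current_etag:
        return 412                        # Precondition Failed
    return 204                            # update proceeds

etag = "DBF5DD4DE0073002917521B7057C0826FC5A7F8E"
print(handle_get(etag, if_none_match=etag))   # (304, None)
print(handle_update(etag, if_match="STALE"))  # 412
print(handle_update(etag, if_match="*"))      # 204: client wins
```

Note how '*' short-circuits both checks, matching the behavior described for the SAP Gateway Client below.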


Note 1:

    As a developer, if you want to update the data from the client regardless (e.g. while using the SAP Gateway Client for testing purposes), you can pass the If-Match value as '*', which means the client wins and the data is updated unconditionally.


Note 2:

     During Update/Merge/Delete operations with an ETag, a get_entity call is made in the backend before the update call to verify the ETag value.






How to expose Gateway Services via HCI OData Provisioning, and secure them using SAP API Management on HCP Trial - Part 1


There is already a very fine blog from Martin Bachmann explaining HANA Cloud Platform, the benefits of the cloud, and HCI OData Provisioning in general, as well as the scenario for connecting SAP HCI OData Provisioning to an SAP Gateway system: How to connect the SAP Business Suite to the SAP HANA Cloud Platform using Gateway as a Service Trial Edition. However, like so many things in the cloud, there have already been changes since it was published. I will not re-cover the informational content contained therein, but I will show the updated screens for the most recent version.


If you have already configured HCI OData Provisioning using Martin's blog, or on your own, and want to skip to the SAP API Management part, please copy down the URL path to your HCI OData Provisioning services and go to Part 2 of this blog.


A quick bit on "Why SAP API Management?"

SAP API Management does not replace SAP Gateway; in fact, it relies on SAP Gateway to expose data from SAP backends. What SAP API Management adds is an enhancement of the capabilities provided by SAP Gateway. It can sit on top of a Gateway deployment to provide secure and scalable access through security and data management policies, as well as a developer engagement area. With a deployment on HCP it is even easier, as a user of SAP Gateway only needs to install and run the IWBEP component of Gateway on their SAP backend system and use HCI OData Provisioning on HCP to connect to it, consuming the exposed OData endpoints directly in SAP API Management. Additionally, SAP API Management can combine other data sources, such as HANA XS or non-SAP data, with Gateway-exposed data, exposed via a single secure endpoint for developers to build impressive apps.


But this walkthrough will focus on using HCI OData provisioning (hereafter referred to as HCIODP) to consume services exposed by SAP Gateway, and expose them as OData endpoints which will be consumed by SAP API Management as API Proxies.



PART I – Creating OData Endpoints in HCIODP


* Pre-Requisite: An account on an SAP Gateway system accessible via Internet. For this walk-through I will be using the SAP Developer Center ES4 system. Anyone can sign up for an account here: SAP NetWeaver Gateway Developer Center

1. Enabling HCIODP in Trial


Log in to your HCP Trial Account and check the Services section, under "Integration":




Click the HCI OData Provisioning tile to enter the service and check its status. If the status is "Not enabled", click the "Enable" button and wait until you see the service status change to Enabled.





2. Configuring HCIODP Destination(s)


Click “Configure HCI OData provisioning” - This should bring up the “Destinations” tab under "Configure HCI OData Provisioning". Click “New Destination” in order to create the destination for the SAP Developer Center ES4 system.


Enter details for the Gateway system. All details, including login, and password will be those which you have registered on the Gateway system. E.g. for SAP Developer center ES4 system, see below:




After you save, wait until the details are saved in the system, which will be indicated by the configuration screen turning grey and no longer allowing input.


3. Configuring HCIODP Roles


Click the “Roles” tab to configure user access in HCIODP.


Select GW_Admin role, and click “Assign” below in the “Individual Users” section. This will authorize the user to enter the Admin window for HCIODP to configure available services, view logs, or configure Metadata options.


In the popup window, enter the SAP ID login information (P-User, S-User, I#, etc.) you will be using, and click "Assign".





Repeat this process with the role GW_User; this will authorize a user to consume the services configured on HCIODP (but not to access the Admin window).

Once complete, you should have a user assigned to both the roles GW_Admin and GW_User.



4. Configuring HCIODP Services

Click the “HCI OData Provisioning” link at the top of the window to return to the HCIODP base screen.



Then click “Go to Service” from the base screen for HCIODP.


If everything worked correctly, this will open a new browser tab, for the HCIODP Admin screen. You may be prompted to enter SAP ID credentials, enter the credentials for the user configured in the GW_Admin role. After login the Admin screen should appear as below:




To begin exposing services from Gateway system configured, click the "Register" button at the top of the screen to bring up the “Register Service” screen. Select the SAP Gateway system configured in Step 2 from the drop down list for Destinations, then click the icon of a Magnifying Glass next to “Search Services” to bring up a list of available Gateway Services.




Select the desired service to be exposed to API Management by clicking the empty box on the far left to highlight that selection. E.g. to select “GWDEMO” below:


Note: The box will fill blue when selected, and if you move the mouse cursor away, you will see the entire row is blue when selected. If this is not the case, the row was not properly selected.


Click the "Register" button, to register the selected service in HCIODP. The service should now appear in the list of Registered Services for HCIODP.



Click “Open Service Document” for the newly added service to test that the service is exposing data as expected. This will open a new browser tab with service data in OData format. Copy down the URL in your browser bar for the service; this will be used as the Target Endpoint for the API Proxy.

Repeat these steps above for each Gateway service you want to expose.


When you have completed registering services, the next step will be creating API proxies in SAP API Management, using HCIODP as the OData Target Endpoints, and the Services as the APIs in this case. This will be covered in Part 2.



For questions, feedback, concerns, feel free to leave a comment, or send us an E-Mail.

Also follow us online
SAP API Management: SAP.com | SCN | YouTube

How to expose Gateway Services via HCI OData Provisioning, and secure them using SAP API Management on HCP Trial - Part 2


This is a continuation from Part 1, where we walked through setting up HCI OData Provisioning on HANA Cloud Platform against SAP Developer Center's ES4 Gateway system. If you haven't already completed this, I recommend going through Part 1 in order to be ready for this blog.

Now that you should have at least one SAP Gateway service exposed in HCI OData Provisioning (hereafter referred to as HCIODP), it's time to set things up so that SAP API Management can connect to it. Thanks to the platform nature of HANA Cloud Platform, this is fortunately remarkably easy. If you have not yet enabled SAP API Management, please follow the steps outlined here: Free Trial of SAP API Management on HANA Cloud Platform is available now!


Once you are set up, you can get started right away.

PART II – Creating API Proxies from OData Endpoints in SAP API Management


1. Creating a "System" connection to HCIODP.

Login to your HCP Trial Account with SAP API Management enabled. Open the Services pane, and locate SAP API Management under the "Integration" section. Launch API Management API Portal by clicking "Access SAP API Management API Portal"




This should load the "Configure" section of SAP API Management by default; if not, click the drop-down menu in the top left-hand corner and select “Configure”. This is where you create Target systems for SAP API Management. Click "Create" in the bottom right-hand corner to add a system.

This will take you to the Create System window, where you will need to enter the relevant details for the HCIODP system used in Part I.


Details: HCI OData Provisioning system, providing exposed Gateway Services

Host: enter the Base URL from the Service Document URL you saved from Part I. (The format will be gwaas-<userID>trial.hanatrial.ondemand.com)

Port: 443
Check Use SSL
Authentication Type:

Service Collection URL:  /CATALOGSERVICE/ServiceCollection




* The Catalog URL is something unique to SAP Gateway systems, which allows SAP API Management to "Discover" available services from the Catalog Service. It is not used for Non-SAP systems.


After checking the details entered, click “Save”. Once the system has been saved, click the “Launch” hyperlink at the bottom left-hand side of the screen. This will launch a new tab, opening the SAP API Management Destinations area in the HCP Cockpit (not to be confused with the global HCP Destinations). Here you will add authentication settings for the newly created system. Click the name of the system you created, e.g. HCIODP, and click the "Edit" button at the bottom of the page.


User: <Use the accountID that was added under GW_User in Part I>

Password: <Password for accountID added under GW_User in Part I>
Next, Click “New Property” under Additional Properties, and enter Key: TrustAll || Value: true




Then click “Save”. Changes can take up to 5 minutes to save (but usually take only a few seconds).



2. Creating an API Proxy using HCIODP Service


Close HCP Cockpit to return to API Management API Portal window. Select drop-down menu list from top left hand corner and select "Manage" to open API Proxy page.


In the API Proxy window, click “Create” at the bottom-right hand corner, and select “Create API” from popup selection to bring up the “Create API” screen.

Provider System: <Select the system created in step 1>
Click “Discover” button to see a list of all services exposed by HCI-OData Provisioning.

Select a Service and click “OK” this should auto-fill all the API Information for you.



* Link Target Server will create the API with all system information determined by the Provider System you select, and all pathing linked as a virtual path. This allows for easy transition between environments (such as Dev, Test, Prod) for the API Proxy. The Documentation flag determines whether SAP API Management attempts to pull existing documentation from the service, and is only applicable for SAP Gateway endpoints.

Click “Create” to generate the API Proxy. If everything was entered correctly, the API Details screen should come up, with all Resources (and corresponding descriptions of the model) created automatically from the Metadata.




Then click “Save”. If everything went well you should see a message at the bottom of the screen telling you "API Registered Successfully".


To test that the API Proxy is working correctly, click “GET” for one of the resources with that operation available. If no resources are available, select “Test” from the main Drop-Down menu, then select the name of the API Proxy from the list of APIs.

Click “Authentication: None” and select “Basic Authentication” from the drop down list.
User Name: <UserID added under GW_User in Part I>
Password: <Password for userID added under GW_User in Part I>
Select the “GET” operation.




Click “Send” in the bottom right hand corner.

This should successfully retrieve the OData Collection provided by HCIODP, via a call to the SAP API Management API Proxy Endpoint URL.


Now that you have an API Proxy sitting on top of the HCIODP service, you can start adding Policies to extend the functionality, as well as expose it to the Developer Portal so that Developers could begin to build apps on top of the data. I will not get into that in this blog, as we were just quickly getting up and running connecting SAP API Management to HCIODP.


If you would like to start learning more about what you can do with SAP API Management, I suggest looking at the repository of information SAP API Management - Overview & Getting started which will continue to be updated as more enablement content is added.


Of particular interest to this exercise is the blog on creating a policy, SAP API Management - Enabling URL masking. If you look at the response above, the returned data includes links to the HCI OData Provisioning service, which is not what you want if SAP API Management is the intended target for connectivity. The linked blog explains how to have SAP API Management automatically mask those URLs.


For questions, feedback, concerns, feel free to leave a comment, or send us an E-Mail.

Also follow us online
SAP API Management: SAP.com | SCN | YouTube

Making your SAP Gateway work smarter not harder : 1) Developing Re-Use Patterns



As SAP customers, partners and consultants embark on the journey of designing and building their own Fiori-style applications, a major part of that journey will be building their own RESTful APIs using SAP Gateway (OData services).



I'm going to break down my learning and insights into SAP Gateway development into a series of blogs where I would like to cover the following topics:


  • Re-Use and Development Patterns
  • Encapsulation of Business Logic & Searching for Entities
  • Performance
  • eTags for concurrency control




Acknowledgement and Use Cases

The development patterns covered in this series are not originally mine.  If you look at the My Accounts Fiori application delivered by SAP in CRM (UICRM001), this pattern was written to support the OData service "CRM_BUPA_ODATA".


Our colleagues who put together this pattern probably don't realise how much of a gem it has been for me in rapidly producing re-usable, nicely encapsulated entity implementations in SAP Gateway, so hats off to the folk who put a lot of thought and effort behind this.

You can consider this pattern when creating your own OData services in the customer namespace.  For modifying SAP delivered OData services you should always follow the SAP recommended approach and enhancement documentation.




Key Learnings from this article:


  • You need someone who understands how to build OData services in SAP Gateway; they should know how to leverage re-use patterns to help accelerate UX development and multi-channel consumption.

  • You can build a generic template that seeds a base implementation for your concrete entities.

  • You can build a module specific abstract object where you can define re-usable functions on a module level (SD, MM, BP for example )

  • This is NOT the only way, or the standard SAP way, of building OData services. I just really like this pattern: it has served me enormously well on a large-scale, wall-to-wall Fiori project, I'm hoping to adopt it as a gold standard on future projects I work on, and I hope it helps you accelerate your OData service development.





Putting the Pattern Together


First let’s cover our typical OData service from a high level and use a simple entity relationship model as an example ( Account or Business Partner with an Address association ):





SEGW - Design Time.


Here we define our entities, attributes of those entities and association between them ( navigation path ).


In our example we want to look at two entities, Account and Address.  You can generate the entity with various methods but I like to generate new entities from a custom ABAP structure.


  • I like to start with a standard SAP structure as an Include ( BUT000 in this example )
  • I like to include the term "ODATA" in the naming convention to signify the structure is associated with a REST service that implies restricted use.







DPC, DPC_EXT - Data Provider Classes

These are the classes that are generated when activating the OData service, and where you have been told to implement your entity logic for CRUD functions.


I have a couple of rules around DPC_EXT:


  • Pass-through implementation only: each entity CRUD function delegates to an encapsulated class
  • NO business logic in DPC, this should all be encapsulated in a concrete class
  • Function Imports should have their own API, in other words any implementation in EXECUTE_ACTION should be calling some API or utility type class.





Entity Implementation and Class Hierarchy

Most implementations I have seen have a loose encapsulation concept, in other words implementation in CRUD methods in the DPC_EXT class contain some business logic and other business logic is contained somewhere else.


I don’t like this as it becomes difficult to maintain and causes regression problems when new developers take over and add new features.


Instead we can use a nice class hierarchy that helps us encapsulate different business logic in layers. 


This is the current hierarchy I like:






In context with our “Account” entity, this is what we would end up with:







This class provides a baseline search for GET_ENTITYSET that handles skip / count / paging. The baseline search is based on a dynamic SQL approach. It also allows us to do eTag calculation across all our entities. Apart from the search pattern in this class, I only use it for Gateway framework type functions.


I’ve called this the modular layer, where we can encapsulate module-specific functions. In our use case for Accounts (Business Partner) we may have common functions, such as adding leading zeros to a BP number (internal / external display) or formatting the Business Partner's full name, that we can write once and call from any concrete class that belongs to our module.
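For example, a leading-zeros helper in the module layer might look like the following sketch. The class and method names are hypothetical; CONVERSION_EXIT_ALPHA_INPUT is the standard function module for the internal display of such numbers.

```abap
METHOD format_bp_number_internal.
* Add leading zeros to the BP number (internal display);
* CONVERSION_EXIT_ALPHA_OUTPUT would do the reverse for external display.
  rv_partner = iv_partner.
  CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
    EXPORTING
      input  = rv_partner
    IMPORTING
      output = rv_partner.
ENDMETHOD.
```

Any concrete class in the Business Partner module can then call this once-written helper instead of duplicating the conversion.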


This is the concrete class. Here you implement the CRUD functions GET_ENTITYSET, GET_ENTITY, UPDATE_ENTITY etc., which are called from the corresponding methods in the actual DPC_EXT class of your service. All entity-specific business logic is maintained here.




Re-use of Entity Objects


Start thinking about the services you need to build.  Are there re-usable entities across these services? Let’s take a look at a one example.


Below is a diagram that shows two OData services:

  • My Accounts Service
  • My Sales Order Service


These two services share common entities:

  • Account
  • Address


In some implementations, I have seen the same code duplicated across different DPC_EXT classes for the same entity, which doesn't lend itself to good re-use patterns, although there certainly may be use cases where this is necessary.

Here is what I mean by acceleration: once I have the Account and Address entity implementations up and running, tested and stable, I can re-use these entities across new services I'm putting together.

The initial build is obviously the longest but scaffolding up a new service with the same entities then becomes considerably faster.






Data Provider Class ( DPC_EXT )

To facilitate this pattern, we need to make some changes in our DPC_EXT. This allows us to access instantiated concrete classes at runtime.


First we need an attribute on our DPC_EXT class that holds the instances of each entity implementation:




The structure of the table is:
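The screenshot isn't reproduced here, but the table could be declared along these lines (type and field names are my assumptions, not the original code):

```abap
TYPES: BEGIN OF ty_entity_instance,
         entity_name TYPE string,              " e.g. 'Account'
         class_name  TYPE seoclsname,          " e.g. 'ZCL_ODATA_ACCOUNT'
         instance    TYPE REF TO zcl_odata_rt, " the instantiated concrete class
       END OF ty_entity_instance.

TYPES: ty_entity_instances TYPE STANDARD TABLE OF ty_entity_instance
                                WITH DEFAULT KEY.

* Attribute on the DPC_EXT class:
DATA: gt_entity_instances TYPE ty_entity_instances.
```

Typing the instance reference against the common superclass (ZCL_ODATA_RT) is what lets one table hold every entity implementation.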






Now to instantiate our entity implementations, during the constructor call in the DPC_EXT class we instantiate each entity and store the instance, class name and entity name:
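A sketch of that constructor, following the Account/Address example (the concrete class names are hypothetical):

```abap
METHOD constructor.
  DATA: ls_instance TYPE ty_entity_instance.

  super->constructor( ).

* Instantiate each entity implementation once and cache it
  CREATE OBJECT ls_instance-instance TYPE zcl_odata_account.
  ls_instance-entity_name = 'Account'.
  ls_instance-class_name  = 'ZCL_ODATA_ACCOUNT'.
  APPEND ls_instance TO gt_entity_instances.

  CREATE OBJECT ls_instance-instance TYPE zcl_odata_address.
  ls_instance-entity_name = 'Address'.
  ls_instance-class_name  = 'ZCL_ODATA_ADDRESS'.
  APPEND ls_instance TO gt_entity_instances.
ENDMETHOD.
```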







Now we need to call our entity concrete class implementation. Here we assume the service has been generated and you have re-defined the relevant method.
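The redefined DPC_EXT method then becomes a pure pass-through, something like the following sketch (parameter list abbreviated and signatures simplified from the generated ones):

```abap
METHOD accountset_get_entityset.
* Look up the cached concrete instance for this entity and delegate
  READ TABLE gt_entity_instances ASSIGNING FIELD-SYMBOL(<fs_instance>)
       WITH KEY entity_name = 'Account'.
  IF sy-subrc = 0.
    <fs_instance>-instance->get_entityset(
      EXPORTING it_key_tab               = it_key_tab
                it_filter_select_options = it_filter_select_options
                it_navigation_path       = it_navigation_path
                is_paging                = is_paging
      IMPORTING et_entityset             = et_entityset ).
  ENDIF.
ENDMETHOD.
```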








We've mentioned this class called ZCL_ODATA_RT.  The purpose of this class is to provide:

  • Common search capability based on dynamic SQL. This means that when I implement an entity for the first time, I can re-use this search pattern, which allows me to process complex searches and handles a range of other functions like paging. In a nutshell, we have implemented a re-usable search feature once, across all our entity implementations, that we can tailor for each concrete class.


  • A place for framework-specific functions, such as eTag hash calculations, that will work across all of my entities

Overview of Entity Search

The first thing we probably want to do in the Fiori application is search for an entity.  I usually start by building out the generic search pattern in ZCL_ODATA_RT that follows this basic flow:


NB: I don't have to use this when setting up a new concrete class for the first time; I can redefine the GET_ENTITYSET method in ZCL_ODATA_ACCOUNT and write other code to do the search for us.


Here is an overview of the logic of the GET_ENTITYSET method in ZCL_ODATA_RT:


1) Process paging, max hits and skip tokens. This allows our dynamic SQL to retrieve results based on certain filter parameters, with a generic paging pattern taken care of for us. Think of the master object list in a Fiori app, where you type in some search criteria and the OData call includes a "substring" filter.




2) Generate a dynamic SQL statement.  This method contains the select, inner join, where statements.


3) Apply additional filters.  Here I can read my entity filters passed by $filter and add additional criteria to our dynamic SQL



4) Execute the dynamic SQL statement and page the results




5) Enrich the data and return the result. This is where you populate additional attributes of the entity that could not be retrieved by the dynamic SQL statement, things such as texts or formatted names.
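Pulled together, the five steps above might look roughly like this in ZCL_ODATA_RT. This is a sketch under my own assumptions: the method names GENERATE_SELECT, APPLY_FILTERS and ENRICH_ENTITYSET and the variable names are mine, not the original implementation, and the unbounded case ($top omitted) is not handled.

```abap
METHOD get_entityset_generic.

  DATA: lv_select TYPE string,
        lv_from   TYPE string,
        lv_where  TYPE string,
        lv_max    TYPE i.

  FIELD-SYMBOLS: <fs_results> TYPE STANDARD TABLE.

  ASSIGN er_entityset->* TO <fs_results>.

* 1) Process paging: read enough rows to cover $skip plus $top
  lv_max = is_paging-skip + is_paging-top.

* 2) Generate the dynamic SQL statement (redefined per concrete class)
  generate_select( IMPORTING ev_select = lv_select
                             ev_from   = lv_from ).

* 3) Apply additional filters derived from $filter
  apply_filters( EXPORTING it_filter_select_options = it_filter_select_options
                 CHANGING  cv_where                 = lv_where ).

* 4) Execute the dynamic SQL statement and page the results
  SELECT (lv_select)
    FROM (lv_from)
    INTO CORRESPONDING FIELDS OF TABLE <fs_results>
    UP TO lv_max ROWS
    WHERE (lv_where).

  IF is_paging-skip > 0.
    DELETE <fs_results> FROM 1 TO is_paging-skip.
  ENDIF.

* 5) Enrich the data and return the result (redefined per concrete class)
  enrich_entityset( CHANGING ct_results = <fs_results> ).

ENDMETHOD.
```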






Implementing the search in our concrete class


I now need to implement this in our concrete class. Once I have created the ZCL_ODATA_ACCOUNT class, I can redefine the appropriate methods where I need to put my logic, including:

  • ENRICH_WITH_USER_FILTER (not shown in our basic example below)





Generate Select

In our generate select method, all we are doing is forming up the dynamic SQL: which attributes we want to select into our entity set. We can also include joins here if we want to search across multiple tables.
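As an illustration only (the column and table names below are assumed, based on the BUT000 include mentioned earlier), the redefined method could be as simple as:

```abap
METHOD generate_select.
* Columns to select into the entity set; add INNER JOINs to ev_from
* if the search needs to span multiple tables.
  ev_select = 'PARTNER NAME_FIRST NAME_LAST'.
  ev_from   = 'BUT000'.
ENDMETHOD.
```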




Enrich Entity Set Data

Before we pass the selected entity set records back to the OData framework, we want to format the result and add additional data, such as full names or texts. In our example we've just formatted the full name of the business partner into the field FULL_NAME.
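A minimal sketch of that enrichment step, assuming a hypothetical entity structure with NAME_FIRST, NAME_LAST and FULL_NAME fields:

```abap
METHOD enrich_entityset.
  FIELD-SYMBOLS: <fs_account> TYPE zodata_account_s. " hypothetical entity structure

  LOOP AT ct_results ASSIGNING <fs_account>.
*   Format the business partner's full name into FULL_NAME
    CONCATENATE <fs_account>-name_first <fs_account>-name_last
           INTO <fs_account>-full_name SEPARATED BY space.
  ENDLOOP.
ENDMETHOD.
```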




We now have a concrete class implementation with an entity set search routine.

Implementing Other CRUD methods

We can then continue to implement the other CRUD functions by redefining the appropriate methods in our concrete class (ZCL_ODATA_ACCOUNT) and data provider class.

Let's do a GET_ENTITY example. Here is the re-definition in the DPC_EXT class:


And let's put some logic in the ZCL_ODATA_ACCOUNT method GET_ENTITY:
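Since the screenshots aren't reproduced here, a sketch of both pieces follows. Names and signatures are my assumptions, simplified from the generated ones (the real DPC works with data references rather than flat structures):

```abap
* DPC_EXT: pass-through only, no business logic
METHOD accountset_get_entity.
  READ TABLE gt_entity_instances ASSIGNING FIELD-SYMBOL(<fs_instance>)
       WITH KEY entity_name = 'Account'.
  IF sy-subrc = 0.
    <fs_instance>-instance->get_entity( EXPORTING it_key_tab = it_key_tab
                                        IMPORTING es_entity  = er_entity ).
  ENDIF.
ENDMETHOD.

* ZCL_ODATA_ACCOUNT: the entity-specific read logic lives here
METHOD get_entity.
  READ TABLE it_key_tab ASSIGNING FIELD-SYMBOL(<fs_key>)
       WITH KEY name = 'AccountId'.
  IF sy-subrc = 0.
    SELECT SINGLE partner name_first name_last
      FROM but000
      INTO CORRESPONDING FIELDS OF es_entity
      WHERE partner = <fs_key>-value.
  ENDIF.
ENDMETHOD.
```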


I won't go through POST, DELETE and UPDATE; this should provide enough foundation for how the pattern works and how it encapsulates our business logic nicely into a concrete object.





In this part of the series we've demonstrated that it is possible to build a re-use pattern that encapsulates our entity implementations cleanly and also include a powerful search feature that we can consume on demand if required without having to re-write the same functions more than once.


I hope you have found this blog useful, and see you next time, when we cover some performance considerations for your OData services and eTag handling for data concurrency control.

Stay tuned for updates as I prepare a set of video walkthroughs to augment the content outlined in part 1 of this series and to show more complex search patterns.

Thanks for stopping by.

How $expand Influences the Importing Parameters of GET_ENTITYSET in SAP Gateway


Did you know that the method parameter IV_SOURCE_NAME of GET_ENTITYSET has different values depending on whether the service is triggered with or without $expand? The same applies to the parameter IT_NAVIGATION_PATH. I recently found out about this the hard way, because not knowing it caused a bug in my code, and I think it's worth sharing my findings. In short, you will see that $expand fills the parameters mentioned above differently than directly triggering GET_ENTITYSET without $expand but still with navigation properties.


First of all, I’m using NW 7.4 SP12 together with SAP_GWFND SP12 (and some additional patches/notes).


We’re going to use the GWSAMPLE_BASIC project/service that is shipped with your GW installation. You can simply go to the Service Builder (transaction code SEGW) and open the project /IWBEP/GWSAMPLE_BASIC. I like this project because I could see how SAP implements GW services – and I learned a lot from that in the past :-) Also, check Demo service ZGWSAMPLE_SRV | SCN by Andre Fischer which tells you a little more about the data model of the Gateway Sample Project.


Now let’s first set a breakpoint in the method CONTACTSET_GET_ENTITYSET of the class /IWBEP/CL_GWSAMPLE_BAS_DPC_EXT. You can navigate there directly from SEGW (see screenshot above), or just use SE80.


Now, after we have set the breakpoint, let’s try some example URLs and see what happens.

1. Without $expand: /sap/opu/odata/IWBEP/GWSAMPLE_BASIC/SalesOrderSet('0500000000')/ToBusinessPartner/ToContacts


As you can see this URL is using two Navigation Properties. First, we navigate to the BusinessPartner of a given SalesOrder, then to the BusinessPartner’s contacts. This will only return the contact data to the caller of the service. Calling this URL will hit the breakpoint we set above:



While we are in the debugger inside CONTACTSET_GET_ENTITYSET let’s check the two mentioned parameters:

  • IV_SOURCE_NAME ==> "SalesOrder"
  • IT_NAVIGATION_PATH ==> two entries, ToBusinessPartner and ToContacts (see screenshot below)



This basically tells us from the source "SalesOrder", we first navigated to the BusinessPartner and then to the contacts. The itab IT_NAVIGATION_PATH also contains information about the Target_Type (=Entity) and the corresponding namespace. So far so good. Now let’s assume we want a given SalesOrder and the contacts of its BusinessPartner in one request. We will achieve this using $expand.



2. With $expand: /sap/opu/odata/IWBEP/GWSAMPLE_BASIC/SalesOrderSet('0500000000')?$expand=ToBusinessPartner/ToContacts


Again, hitting this URL will hit our breakpoint. However, this time both IV_SOURCE_NAME and IT_NAVIGATION_PATH have different values (see screenshots below):


  • IV_SOURCE_NAME ==> ”BusinessPartner”
  • IT_NAVIGATION_PATH ==> 1 entry, ToContacts (for some reason the TARGET_TYPE_NAMESPACE is not filled)







This basically tells us from the source "BusinessPartner" we navigated to the contacts. However, this is not what I expected. I expected that using $expand in this scenario would be equal to calling our first URL from above in Step 1 (/sap/opu/odata/IWBEP/GWSAMPLE_BASIC/SalesOrderSet('0500000000')/ToBusinessPartner/ToContacts).


Instead, when using $expand in our scenario it’s basically equal to calling the URL /sap/opu/odata/IWBEP/GWSAMPLE_BASIC/BusinessPartnerSet('0100000000')/ToContacts, the explanation comes next.
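A practical consequence: if your GET_ENTITYSET logic branches on these parameters, it should handle both shapes. A defensive sketch (the helper methods are hypothetical):

```abap
* Inside CONTACTSET_GET_ENTITYSET: the same expanded data can arrive
* with different source entities depending on how the URL was built.
CASE iv_source_name.
  WHEN 'SalesOrder'.
*   Direct navigation .../SalesOrderSet('...')/ToBusinessPartner/ToContacts:
*   IT_NAVIGATION_PATH holds ToBusinessPartner and ToContacts, so resolve
*   the BusinessPartner from the SalesOrder key first.
    lv_partner = get_partner_for_order( it_key_tab ).   " hypothetical helper
  WHEN 'BusinessPartner'.
*   $expand=ToBusinessPartner/ToContacts (or direct /ToContacts navigation):
*   IT_NAVIGATION_PATH holds only ToContacts and the key in IT_KEY_TAB is
*   already the BusinessPartner.
    lv_partner = get_partner_from_keys( it_key_tab ).   " hypothetical helper
ENDCASE.
```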



3. Behind the scenes: /sap/opu/odata/IWBEP/GWSAMPLE_BASIC/BusinessPartnerSet('0100000000')/ToContacts


So, when using $expand, we now know what is actually called behind the scenes by the Gateway. Well, let’s prove this in the debugger by calling /sap/opu/odata/IWBEP/GWSAMPLE_BASIC/BusinessPartnerSet('0100000000')/ToContacts directly:



As you can see, the result is equal to what we had in Step 2 – and that's the proof:


  • IV_SOURCE_NAME ==> ”BusinessPartner”
  • IT_NAVIGATION_PATH==> 1 entry, ToContacts


However, this time for some reason the TARGET_TYPE_NAMESPACE is filled correctly – which I can’t explain :-)

I’m sure there is a good reason for the different values passed to the parameters – but I don’t know that reason :-)

Anyway… I’m aware of this now and I hope you are as well.

Make your SAP Gateway Work Smarter not Harder – Part 1.2) Search Patterns and Service Inclusion





Part 1 of this series can be found here if you have not seen it yet:




In Part 1 we discussed development patterns in SAP Gateway and how we can achieve re-use and business logic encapsulation.


Two concepts we covered were a “search pattern” where we used Dynamic SQL to retrieve entity sets and a class hierarchy to encapsulate business logic and separate out functions on a module and gateway level.


In this blog I’d like to clarify and enhance the discussion by covering different options for Search and how to do Service Inclusion for re-use purposes.



Search Pattern and Class Hierarchy


In Part 1 we covered a possible class hierarchy where we could gain re-use and business logic encapsulation across our OData services and also included SAP module specific functions.


We also covered a generic search pattern that uses Dynamic SQL, that is ABAP code where you build up your own query and execute the SQL statements by manipulating “From” “Where” clauses.


The dynamic SQL doesn't sit well with a lot of people, and here are a few reasons why:


  • The code can be difficult to maintain; there are a lot of hard-coded variables, and it requires a deep understanding of the SAP data model.

  • It is easy to go from simple use cases to complex ones and therefore create a larger technical debt than required.

  • There are obviously other choices and where a good "framework" is available there is certainly no rule that says don't use that instead.



Re-Jigging the Class Hierarchy


  • Removing the "Search Pattern" from ZCL_ODATA_RT. This ensures that only SAP Gateway level functions are encapsulated here; it makes more sense for this class to perform things like "E-Tag" handling rather than search.

  • Have a separate inheritance for the search pattern, in this case I've put a new class in the hierarchy called ZCL_ODATA_QRY_[search_function] where "[search_function]" is a framework specific search pattern. This doesn't have to be module specific, in systems like CRM I have different frameworks like BOL or Reporting Framework that I can re-use here across different modules like Service or Sales.





Search Pattern - Example with Reporting Framework


This is an example I put together using the Reporting Framework in CRM. We created a new class, "ZCL_ODATA_QRY_RF", which designates that this class is coupled with the Reporting Framework. It still inherits from our ZCL_CRM_ODATA_RT class.



We have a simple constructor that takes in a CRM BOL query and object type (like a Business Activity, BUS2000126) and instantiates an instance of the IO_SEARCH class provided by SAP.



And of course a "Search" method that takes in some query parameters, the IS_PAGING structure from our OData service, and a flag that allows us to return only the GUIDs as a result rather than all the functional data.



This is our search method implementation:


METHOD search.

  DATA: lt_return      TYPE bapiret2_tab,
        ls_message_key TYPE scx_t100key,
        lv_max_hits    TYPE i.

* Pass the query parameters to the search object
  IF it_query_parameters IS NOT INITIAL.
    CALL METHOD gr_search->set_selection_parameters
      EXPORTING
        iv_obj_il               = gv_query
        iv_obj_type             = gv_object_type
        it_selection_parameters = it_query_parameters
      IMPORTING
        et_return               = lt_return
      EXCEPTIONS
        partner_fct_error         = 1
        object_type_not_found     = 2
        multi_value_not_supported = 3
        OTHERS                    = 4.

    READ TABLE lt_return ASSIGNING FIELD-SYMBOL(<fs_message>) WITH KEY type = 'E'.
    IF sy-subrc = 0.
      ls_message_key-msgid = 'CLASS'.
      ls_message_key-msgno = 000.
      RAISE EXCEPTION TYPE /iwbep/cx_mgw_busi_exception
        EXPORTING
          textid = ls_message_key.
    ENDIF.
  ENDIF.

* Retrieve either the result keys only, or the full result values
  IF iv_keys_only = abap_true.
    CALL METHOD gr_search->get_result_guids
      EXPORTING
        iv_max_hits  = lv_max_hits
      IMPORTING
        et_guid_list = rt_guids
        et_return    = lt_return.
  ELSE.
    CALL METHOD gr_search->get_result_values
      EXPORTING
        iv_max_hits  = lv_max_hits
      IMPORTING
        et_results   = rt_results
        et_guid_list = rt_guids
        et_return    = lt_return.
  ENDIF.

  READ TABLE lt_return ASSIGNING <fs_message> WITH KEY type = 'E'.
  IF sy-subrc = 0.
    ls_message_key-msgid = 'CLASS'.
    ls_message_key-msgno = 000.
    RAISE EXCEPTION TYPE /iwbep/cx_mgw_busi_exception
      EXPORTING
        textid = ls_message_key.
  ENDIF.

* Process Top / Skip tokens for paging
  IF is_paging-skip > 0.
    DELETE rt_results FROM 1 TO is_paging-skip.
    DELETE rt_guids   FROM 1 TO is_paging-skip.
  ENDIF.
  IF is_paging-top > 0.
    DELETE rt_results FROM ( is_paging-top + 1 ).
    DELETE rt_guids   FROM ( is_paging-top + 1 ).
  ENDIF.

ENDMETHOD.


So now when you want to execute the search in your concrete class, you can consume the SEARCH method in the inherited Search Framework plugin you've created:


METHOD /iwbep/if_mgw_appl_srv_runtime~get_entityset.

  DATA: lt_query_parameters TYPE genilt_selection_parameter_tab,
        ls_query_parameter  LIKE LINE OF lt_query_parameters.

  FIELD-SYMBOLS: <fs_results> TYPE STANDARD TABLE.

* Create the result table dynamically from the entity's structure
  CREATE DATA er_entityset TYPE TABLE OF (gv_result_structure).
  ASSIGN er_entityset->* TO <fs_results>.

* Navigation path from an Account
  READ TABLE it_key_tab ASSIGNING FIELD-SYMBOL(<fs_key>) WITH KEY name = 'AccountId'.
  IF sy-subrc = 0.
    ls_query_parameter-attr_name = 'ACTIVITY_PARTNER'.
    ls_query_parameter-sign      = 'I'.
    ls_query_parameter-option    = 'EQ'.
    MOVE <fs_key>-value TO ls_query_parameter-low.
    APPEND ls_query_parameter TO lt_query_parameters.
  ENDIF.

* Process filters passed via $filter
  LOOP AT it_filter_select_options ASSIGNING FIELD-SYMBOL(<fs_filter_select_option>).
    CASE <fs_filter_select_option>-property.
      WHEN 'ProcessType'.
        LOOP AT <fs_filter_select_option>-select_options ASSIGNING FIELD-SYMBOL(<fs_select_option>).
          MOVE-CORRESPONDING <fs_select_option> TO ls_query_parameter.
          ls_query_parameter-attr_name = 'PROCESS_TYPE'.
          APPEND ls_query_parameter TO lt_query_parameters.
        ENDLOOP.
    ENDCASE.
  ENDLOOP.

* Delegate to the inherited search framework plugin
  CALL METHOD search
    EXPORTING
      it_query_parameters = lt_query_parameters
      iv_keys_only        = abap_false
      is_paging           = is_paging
    IMPORTING
      rt_results          = <fs_results>.

ENDMETHOD.

Service Pattern Summary


Whilst this is a brief example, it shows the possibility of plug-and-play type frameworks for your OData services, rather than tackling the dynamic SQL that was included in the first part of this blog series.


If you've implemented a similar pattern, I would love to hear details about what you have put together.


Service Inclusion


Service Inclusion is the process of including another Service Model in your SAP Gateway Service Builder project, like this:









Effectively, it allows you to access the Entities in service B directly from service A:




This approach maximises the re-use of your services: not only do you get re-use and business logic encapsulation in the class hierarchy, your design-time re-use is maximised as well.

Limitations / Gotchas


There are a couple of things to watch out for.


Referring to the diagram above, when you execute the URI “/SALES_ORDER_SRV/Accounts(‘key’)/SalesOrders”, the navigation path is not populated; instead, a filter parameter is passed to the SalesOrder GET_ENTITYSET, where you must now filter the sales orders by the Account ID.


What I mean by “limitation” is that usually you would be passed a navigation path (IT_NAVIGATION_PATH) from which you can assess where the navigation came from and what the keys were; in this use case, the KEY_TAB values are missing from the IT_NAVIGATION_PATH importing table in your GET_ENTITYSET method.


For this to work you must also set the referential constraint in the SEGW project and build your Associations as an external reference, like this:





When the SAP Gateway framework attempts to assess which model reference your entity belongs to so it can execute a CRUD function, the underlying code loops over the collection of model references you have included (in Service B) and tries to find the first available model where your external navigation entity is located.

If you have implemented the same entity in multiple included services, SAP picks up the first one available, which can lead to surprises: if your breakpoint is not being triggered during a GET_ENTITYSET call, it may be because the call went to the wrong service.


Service Inclusion Summary

If you haven't used this feature yet, it provides a really great way to maximise re-use of your OData services. Just a side note here: a really good use for this pattern is your common configuration entity model, things such as country and state codes, status key-value pairs etc.

I have built a common service before that reads all of the configuration used across different Fiori applications, contained in one common service.

This way I simply "include" the common service so I don't have to keep implementing the same entity set or function import in different services.

Final Summary

I have put together this blog to try and clarify and provide a different perspective on Search capability within our OData services.  I know there are some of us out there that dislike the dynamic SQL scenario and for good reason.

My aim is to encourage some thought leadership in this space, so those customers tackling their OData service development can at least learn from our experience and embrace re-use patterns to really try and reduce TCO not to mention accelerate UX development in parallel.

As always, I'd love to hear from our community about other development patterns you've come up with so please share your thoughts and learnings, it's really important as our customers start to ramp up their OData and Fiori development.

Exploring Multiple Origin using System ID and Client


While working with multiple origin, we have to pass the additional parameter SAP_Origin (which is nothing but your system alias pointing to the corresponding ECC system) in the service URL for filter and GetEntity operations.


While working on these scenarios, I found one interesting possibility: passing the system ID (SID) and client (of the ECC system) to fetch data from a particular system.


What is Multi Origin?


Multiple origin is used to fetch data from different back-end systems, collect it in one single service, and update the different back-end systems using the same user.


Why System ID and Client?


  • When creating OData requests you might not have information on any SAP NetWeaver Gateway system alias.
  • You might wish to refrain from exposing the system name of your SAP NetWeaver Gateway system in the URL of your service for security reasons. In this case, instead of the system name you can also use the system ID (SID) together with the corresponding client.


Pre-Requisites:


  • The Gateway system is connected to all back-end systems
  • A system alias is available for each back-end system
  • The GW project is created in all the back-end systems
  • All system aliases are added in transaction /n/IWFND/MAINT_SERVICE for ZPROJECT_SRV in the GW system
  • Calling the URL /sap/opu/odata/sap/ZPROJECT_SRV;mo/$metadata returns the expected output.

Alias examples:

FIORI_ECC1 : Backend System Alias 1 (SID - ABC, Client -100)

FIORI_ECC2 : Backend System Alias 2 (SID - XYZ, Client -100)

How to Use System ID and Client in URL ?

  • In the Gateway system, open the SAP Reference IMG in transaction SPRO and navigate to SAP NetWeaver → SAP Gateway → OData Channel → Configuration → Connection Settings → SAP Gateway to SAP System → Manage SAP System Aliases




  • Check for System Alias FIORI_ECC1 and FIORI_ECC2 in list.
  • Maintain SID and Client against System Alias


Now we are all set to test the scenario of using SID and Client in Gateway Service URl

Syntax for using SID and Client:


In our case it will be: /sap/opu/odata/sap/ZPROJECT_SRV;o=sid(ABC.100)/EntitySet

See the result, which restricts the output to system ABC, client 100.


Syntax for using System Alias:


or /sap/opu/odata/sap/ZPROJECT_SRV;mo/EntitySet?$filter=SAP_Origin eq 'SystemAlias'

In our case it will be: /sap/opu/odata/sap/ZPROJECT_SRV;o=FIORI_ECC1/EntitySet

See the result, which restricts the output to system alias FIORI_ECC1.



It works similarly for create and deep create operations.

Below is an example working with create deep entity.



The processing of the request is as follows:

1. The SAP NetWeaver Gateway system searches for all existing system aliases for the user and the specified service.

2. The SAP NetWeaver Gateway system checks whether one of these system aliases equals sid(ABC.100). If so, that system is used.

3. If no such system exists underneath the specified service, SAP NetWeaver Gateway checks whether one of the system aliases has defined a system ID ABC and client 100.

4. If this is the case, that system is used. Otherwise an error message is displayed.

I hope this helps in your new developments.

Why not build fewer, more useful OData services?


Do you ever have the feeling that you are missing something obvious?

The present


It seems to me that the current orthodoxy is that an OData service should be developed for each UI5 app.  In other words, each app should only have to call one service.  I think we should question whether that is the best approach.


Of course, a UI5 app can use many OData models, so long as each one has a unique name within the app.


In my opinion, when developing OData services (e.g. using Netweaver Gateway or HCI OData Provisioning), we should think of the big picture.  We shouldn't be too focused on the bespoke Fiori-style app we happen to be developing that day.  In a few years time we may have a large number of Fiori apps, both standard and bespoke.  There may be many other consumers of our OData services both internal (e.g. our own company website or intranet) and external (e.g. customer or supplier systems).


Why not take an 'API' approach, and think of our OData services as a way of interacting with the entities in our back end system?  Why not organise the services in such a way as to make it easy to navigate for any UI developer working with any technology? For example we could have a service for customers, one for inventory and one for employees. It seems to me that this would be much more in line with the RESTful architecture.  Just because we are using an SAP technology, such as Gateway, to deliver our services, it doesn't mean we should only consider SAP UI-technologies as consumers.  A big plus of the modern web-architecture is that the UI-layer is completely de-coupled from the data-provider and I think we should take advantage of that.


The future


If we (SAP and customers) carry on developing new services at a rapid rate, within a few years I fear we will have a large number of overlapping services.  It will be so hard to find one to use that developers will simply create another brand new service rather than trying to sort through that long list.  There will be much duplication of logic.


Each published service represents a responsibility (does it work as it should?) and a vulnerability (could it facilitate malicious activity?).  This isn't new (the same can be said for a remote-enabled function module) but surely a very large number of services makes it harder to manage these risks.


Pros & Cons


This is my take on the pros and cons of the one-service-per-app model:


  Pros:

  • Robust, as development of a service for app #2 does not affect app #1.  This helps us get an app live very quickly, and may fit in better with an agile methodology(?).  This appeals to those who found the ESOA (Enterprise Service-Orientated Architecture) approach difficult to manage in practice
  • Re-usability can be introduced via 'Concrete classes', as described by Leigh Mason in Making your SAP Gateway work smarter not harder, and also via Service Inclusion
  • Data provider classes are quite small and manageable as there are only a few entities in each service
  • The service can be optimised for the app.  However, this can probably be achieved in a generic service by using features such as $select to specify the fields required.  Also, there is nothing to say that the 'customer' service must only have one, generic 'customer' entity
  • An external 'API' product could perhaps be used to unify and present the services provided by Gateway (and other OData providers too)

  Cons:

  • Will lead in time to a large number of overlapping services.  It will be hard for a UI-developer to find what they need
  • Likely there will be a lot of duplication of logic (across services), leading to greater effort and reduced maintainability.  How many times will an employee entity be required, for example, in order to provide employee names?
  • Services are designed for individual apps on one particular platform (UI5/Fiori) and might not be suitable for future apps or other UI platforms


What do you think?  What do you do?


So, is there an obvious point that I am missing?  Are there advantages of the one-service-per-app approach that I have missed?  Am I wrong to call one-service-per-app the orthodoxy?


What is the strategy in your organisation (or at your clients) for OData services? I look forward to your comments.



Step 1. Open transaction SEGW and create a project as shown in the screenshot below.


Provide the following details


Then the components

  1. Data Model
  2. Service Implementation
  3. Runtime Artifacts
  4. Service Maintenance

get displayed automatically.

Step 2. Create an entity type as follows.


Provide the following.



Click on properties as shown below.



Add the following values for header as shown below


In the same way, create entity type PR_Item for the item and give the following values.


Step 3. Create an entity set as shown below.


Give the following details


Then Header Entityset is created.


     Create the Item entity set in the same way.


Step 4. Create an association as shown below.






And association set is automatically created.

Step 5. The navigation property is now created automatically.


Step 6. After completion of the data model, the Service Implementation section of the OData service is filled automatically, as shown in the screenshot below.


Step 7. Now we need to generate the runtime artifacts. Select Runtime Artifacts and click on the generate button.




Click OK and save it.


We get the following in Runtime Artifacts.


Step 8. We need to write code in ZCL_Z_PURCHASE_REQUISI_DPC_EXT, so double-click on it.



Step 9a. Right-click on the methods and redefine the required methods as follows.


And write the following code to get the entity.


Code for Get Entity:

    DATA: ls_key_tab   TYPE /iwbep/s_mgw_name_value_pair,
          lv_pr_number TYPE banfn.

    "Read the PRNumber key from the request URI
    READ TABLE it_key_tab INTO ls_key_tab WITH KEY name = 'PRNumber'.
    lv_pr_number = ls_key_tab-value.










Step 9b. Likewise, redefine the other required methods, PRITEMCOLLECTION_GET_ENTITYSET and the deep insert (CREATE_DEEP_ENTITY), in the same way.






DATA: ls_key_tab       TYPE /iwbep/s_mgw_name_value_pair,
      lv_pr_number     TYPE banfn,
      lv_pr_item       TYPE bnfpo,
      lt_pr_items_bapi TYPE TABLE OF bapieban,
      ls_pr_item_bapi  TYPE bapieban.

TYPES: BEGIN OF ts_pr_item,
         pritem             TYPE c LENGTH 5,
         purgroup           TYPE c LENGTH 3,
         material           TYPE c LENGTH 18,
         shorttext          TYPE c LENGTH 40,
         plant              TYPE c LENGTH 4,
         materialgroup      TYPE c LENGTH 9,
         quantity           TYPE p LENGTH 7 DECIMALS 3,
         unit               TYPE c LENGTH 3,
         documenttype       TYPE c LENGTH 4,
         itemcategory       TYPE c LENGTH 1,
         acctassigncategory TYPE c LENGTH 1,
         prnumber           TYPE c LENGTH 10,
       END OF ts_pr_item.




READ TABLE it_key_tab INTO ls_key_tab WITH KEY name ='PRNumber'.

lv_pR_number = ls_key_tab-value.


READ TABLE it_key_tab INTO ls_key_tab WITH KEY name ='PRItem'.

lv_pR_item = ls_key_tab-value.




"Read the requisition details for the requested PR number
CALL FUNCTION 'BAPI_REQUISITION_GETDETAIL'
  EXPORTING
    number            = lv_pr_number
  TABLES
    requisition_items = lt_pr_items_bapi.

*DELETE lt_pr_items_bapi WHERE preq_item NE lv_pr_item.


READ TABLE lt_pr_items_bapi INTO ls_pr_item_bapi INDEX 1.


LOOP AT lt_pr_items_bapi INTO ls_pr_item_bapi.


      es_entityset-PRITEM = ls_pr_item_bapi-PReq_ITEM.

      es_entityset-PURGROUP = ls_pr_item_bapi-PUR_GROUP.

      es_entityset-material = ls_pr_item_bapi-material.

      es_entityset-SHORTTEXT = ls_pr_item_bapi-short_text.

      es_entityset-plant = ls_pr_item_bapi-plant.

      es_entityset-MATERIALGROUP = ls_pr_item_bapi-mat_grp.

      es_entityset-quantity = ls_pr_item_bapi-quantity .

      es_entityset-UNIT = ls_pr_item_bapi-unit .

      es_entityset-DOCUMENTTYPE = ls_pr_item_bapi-doc_type .

      es_entityset-ITEMCATEGORY = ls_pr_item_bapi-item_cat .

      es_entityset-ACCTASSIGNCATEGORY = ls_pr_item_bapi-acctasscat .

      es_entityset-PRNUMBER = ls_pr_item_bapi-preq_no .

      es_entityset-DeliveryDate = ls_pr_item_bapi-deliv_date .


      APPEND es_entityset TO et_entityset.
      CLEAR es_entityset.

ENDLOOP.















     DATA: lv_new_pr_no TYPE BAPIEBANC-PREQ_NO.

*             ls_new_pr_header TYPE BAPIMEREQHEADER.


DATA: ls_bapi_item TYPE bapiebanc,

          lt_bapi_item TYPE TABLE OF bapiebanc,

          lt_return TYPE TABLE OF bapiret2.


TYPES: ty_t_pr_items TYPE TABLE OF zcl_z_purchase_requisi_mpc=>ts_pr_item WITH DEFAULT KEY.


TYPES: BEGIN OF ts_pr_items.

            INCLUDE TYPE zcl_z_purchase_requisi_mpc=>ts_pr_header.

TYPES: PrItemCollection TYPE ty_t_pr_items,

           END OF ts_pr_items.


DATA: lt_items TYPE zcl_z_purchase_requisi_mpc=>tt_pr_header,

          ls_item  TYPE zcl_z_purchase_requisi_mpc=>ts_pr_item,

          lt1_items type ty_t_pr_items,

          ls_pritems TYPE ts_pr_items.


DATA: ls_data TYPE ts_pr_items.


CALL METHOD io_data_provider->read_entry_data( IMPORTING es_data = ls_data ).

*    ls_item-PRItemCollection  = ls_data-PrItemCollection.

lt1_items  = ls_data-PrItemCollection.

*append ls_item to it_items.

*clear ls_item.



*    LOOP AT lt_items INTO ls_item.

LOOP AT lt1_items INTO ls_item.


      ls_bapi_item-material = ls_item-material.

      ls_bapi_item-plant = ls_item-plant.

      ls_bapi_item-quantity = ls_item-quantity.

      ls_bapi_item-doc_type = ls_item-DocumentType.

      ls_bapi_item-DELIv_DATE = ls_item-DeliveryDate.

      ls_bapi_item-PUR_GROUP = ls_item-PURGROUP.

      ls_bapi_item-PREQ_ITEM = ls_item-PRITEM.

      ls_bapi_item-SHORT_TEXT = ls_item-SHORTTEXT.

      ls_bapi_item-MAT_GRP = ls_item-MATERIALGROUP.

      ls_bapi_item-UNIT = ls_item-UNIT.

      ls_bapi_item-ITEM_CAT = ls_item-ITEMCATEGORY.

      ls_bapi_item-ACCTASSCAT = ls_item-ACCTASSIGNCATEGORY.

      ls_bapi_item-PREQ_NO = ls_item-PRNUMBER.

*      ls_itemx-po_item = ls_item-item.

      APPEND ls_bapi_item TO lt_bapi_item.

*      APPEND ls_itemx TO lt_itemx.

      CLEAR ls_item.

ENDLOOP.






"Create the purchase requisition from the posted items
CALL FUNCTION 'BAPI_REQUISITION_CREATE'
  IMPORTING
    number            = lv_new_pr_no
  TABLES
    requisition_items = lt_bapi_item
    return            = lt_return.

CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
  EXPORTING
    wait = 'X'.

"Format the new PR number with leading zeros
CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
  EXPORTING
    input  = lv_new_pr_no
  IMPORTING
    output = lv_new_pr_no.

*MOVE-CORRESPONDING ls_headerdata TO ls_pritems.
ls_pritems-prnumber = lv_new_pr_no.

"Return the created data as the deep entity response
copy_data_to_ref(
  EXPORTING
    is_data = ls_pritems
  CHANGING
    cr_data = er_deep_entity ).



Step 10. Service Maintenance is created automatically, but we still need to register the service. Select the system (i.e. EC7) and click on Register.


Give the system alias LOCAL_GW and click OK. The service is then registered.

Step 11. Test the service.


Provide the following query



Click on Use as Request.


Then the response will appear as the request on the left side.

For Purchase Requisition creation you need to remove the Purchase Requisition number from the left side.


Use /sap/opu/odata/SAP/Z_PURCHASE_REQUISITION_TEST_SRV/PRHeaderCollection in the Gateway client and click on POST.

Purchase Requisition '0010017270' is created, as shown in the screenshot below.


Checking the table-level entries in EBAN, we can find the Purchase Requisition '0010017270'.
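Outside the Gateway client, the same read could be sketched with curl. This is only an illustration: the host, port and credentials are placeholders, and only the PR number comes from the example above.

```shell
# Hypothetical curl read of the service registered above; gwhost, port and
# credentials are placeholders. The PR number is the one created in this example.
HOST="https://gwhost:443"
URI="$HOST/sap/opu/odata/SAP/Z_PURCHASE_REQUISITION_TEST_SRV/PRHeaderCollection('0010017270')"
echo "$URI"
# curl -s -u user:password -H "Accept: application/json" "$URI"
```

The actual curl call is commented out since it needs a reachable Gateway host.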

Elementary load testing of SAP Gateway OData services using cURL



Probably everyone understands the need to perform load testing of applications and services. Usually it comes down to how much time and money one can spend doing it. This blog gives a quick and free approach; although not very advanced, it can be used to simulate load easily.



A fairly recent version of cURL is required. If you plan on load testing with HTTPS, your cURL needs to support HTTPS, and the certificate of the accessed server needs to be trusted as well. With cURL the latter can be a bit tricky, depending on how cURL was compiled. In case cURL doesn't have built-in support for certificates, you can retrieve the CAs from the cURL website and convert them to CRT format using the provided mk-ca-bundle. Name the output file curl-ca-bundle.crt and place it in the cURL directory.


This blog was written using Windows as the platform; it should be straightforward to map the commands to your operating system of choice, such as Linux.



First you need to solve how to get the CSRF token and use it for all POST requests. The solution is to invoke cURL with -i to include the response headers and with -D to dump them to a file so that you can parse them:

curl -s -i -H "Accept: application/json" -H "X-CSRF-Token: fetch" -D headers.txt -u user:password https://myurl/mylistservice

To parse the header variables use

findstr "x-csrf-token" headers.txt > token.txt


set /p TOKEN=<token.txt

to store the CSRF token in an environment variable called TOKEN. Now, to use the CSRF token in POST requests, use -H "%TOKEN%" in addition to any other header variables you may want to set, e.g.

curl -s -H "Content-Type: application/json" -H "Accept: application/json" -H "%TOKEN%" -u user:password -X POST --data-binary @payload.txt https://myurl/mycreateservice

The payload (in JSON format) is stored in a text file called payload.txt.
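On Linux, where findstr is not available, the token-parsing step might look like the sketch below. The headers file written here is a fabricated example purely for illustration; in the real script it would be the file dumped by curl's -D option.

```shell
#!/bin/sh
# Linux sketch of the findstr/set parsing above.
# Extract the value of the x-csrf-token response header, stripping the CR
# that terminates HTTP header lines.
parse_token() {
  grep -i '^x-csrf-token:' "$1" | tr -d '\r' | awk '{print $2}'
}

# Fabricated headers file standing in for the one dumped by curl -D
printf 'HTTP/1.1 200 OK\r\nx-csrf-token: AbCdEf123\r\nContent-Type: application/json\r\n' > headers.txt

TOKEN=$(parse_token headers.txt)
echo "$TOKEN"
```

The resulting TOKEN variable would then be sent back with `-H "X-CSRF-Token: $TOKEN"` on the POST request.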

CSRF token validation failed


If you followed the instructions in this blog to the letter, you probably still couldn't make it work. Any attempt to use POST would fail with a token validation error, even though you sent the CSRF token in the request. The most likely reason is that Gateway generates a new token for each request, invalidating the previous one. The root cause is that Gateway doesn't recognize the session. It's not sufficient to pass the CSRF token together with Basic Authentication; you also need to pass the identifying session information, which is stored in cookies. With cURL you have to use the -c parameter to store the received cookies and -b to send them back. In other words, you would add -c cookies.txt -b cookies.txt to each cURL command. The commands used in our examples become:

curl -s -i -c cookies.txt -b cookies.txt -H "Accept: application/json" -H "X-CSRF-Token: fetch" -D headers.txt -u user:password https://myurl/mylistservice

curl -s -c cookies.txt -b cookies.txt -H "Content-Type: application/json" -H "Accept: application/json" -H "%TOKEN%" -u user:password -X POST --data-binary @payload.txt https://myurl/mycreateservice

Taking it up a notch


Now that you have your elementary script ready, you might ask yourself how to simulate load, e.g. have multiple instances running. The quick solution is to run the script in as many directories as you want parallel instances. A more clever approach is to have cURL create and use files in one directory with a prefix/suffix tying them to a specific instance. I did the latter by using the parent process ID (of the Command Prompt) as the identifier; that way you can start as many scripts from the same directory as you want and as your systems can handle. To give you an idea how that would look, here is how the two commands from our example look in my script:

curl -s -i -c _cookies_"%PPID%".txt -b _cookies_"%PPID%".txt -H "Accept: application/json" -H "X-CSRF-Token: fetch" -D _headers_"%PPID%".txt -u "%USR%":"%PWD%" https://myurl/mylistservice

curl -s -c _cookies_"%PPID%".txt -b _cookies_"%PPID%".txt -H "Content-Type: application/json" -H "Accept: application/json" -H "%TOKEN%" -u "%USR%":"%PWD%" -X POST --data-binary @payload.txt https://myurl/createservice

The PPID environment variable stores the process ID of the parent process. The user and password are stored in environment variables USR and PWD defined in the script. To use multiple users, you could either use a rotating (or random) list or use the multiple-directory approach.

Other tricks


If you want your script to wait between individual commands (or loop iterations), you can use

SET /A SLEEP=%RANDOM% * 10 / 32768 + 5

timeout /t %SLEEP% /nobreak

to set a random wait time of roughly 5-15 seconds. If you plan to implement the parent-process-ID approach you'll soon find out that it's surprisingly difficult to retrieve the process ID of the parent process in Windows. I ended up writing a small C# program and calling it from the script.
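On Linux, both tricks are simpler: bash exposes the parent process ID natively as $PPID, so no helper program is needed, and the random wait can be sketched as below (the sleep itself is commented out so the snippet runs instantly):

```shell
#!/bin/bash
# Bash sketch of the random 5-15 second wait used between loop iterations.
# RANDOM yields 0..32767; modulo 11 gives 0..10, so SLEEP lands in 5..15.
SLEEP=$(( RANDOM % 11 + 5 ))
echo "$SLEEP"
# sleep "$SLEEP"   # uncomment in the real load script

# Bash provides the parent process ID directly, e.g. for per-instance file names:
echo "parent pid: $PPID"
```

A per-instance cookie file would then simply be `_cookies_${PPID}.txt`, mirroring the Windows variant.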



While the approach discussed in this blog is elementary, it does the job. Commercial alternatives provide more advanced and realistic load testing, but they usually come with a significant price tag, especially if you need to simulate tens, hundreds or even thousands of clients.

How to apply a range table in the integration between Gateway and a HANA view



In the integration between Gateway and a HANA view, the HANA view provides an input parameter as a filter for one field, and normally that input parameter accepts a single value. The issue is how to handle a filter with multiple values. For example, we need to count the customer age distribution based on the multiple branches a customer belongs to.



1. AMDP:

Define and implement the method in an AMDP class. The method imports the filter condition and exports the result table:

          VALUE(branch)              TYPE string
          VALUE(et_agedistribution)  TYPE tt_output_agedistribution

The SQLScript implementation passes the condition to the calculation view via its input parameter:

et_agedistribution = SELECT "AGE", "AGERATE"
                         FROM "_SYS_BIC"."ZOUI5_CV_CUSTOMERAGE_SQL"
                         (PLACEHOLDER."$$ZBRANCH$$" => :branch)
                         GROUP BY "AGE", "AGERATE", "ZCOUNT";

2. ABAP Class:

Combine the filter values in a range table, derive the SQL condition and call the AMDP method.

DATA lt_partner TYPE RANGE OF bu_partner.
DATA ls_partner LIKE LINE OF lt_partner.
DATA: o_cond      TYPE REF TO cl_lib_seltab,
      branch_cond TYPE string.

ls_partner-sign   = 'I'.
ls_partner-option = 'EQ'.
ls_partner-low    = '1000000001'.
APPEND ls_partner TO lt_partner.

ls_partner-sign   = 'I'.
ls_partner-option = 'EQ'.
ls_partner-low    = '1000000002'.
APPEND ls_partner TO lt_partner.

"Build a seltab object from the range table
"(parameter names may vary by cl_lib_seltab release)
o_cond = cl_lib_seltab=>new( it_sel = lt_partner ).

"Render the range table as an SQL WHERE condition on field BRANCH
branch_cond = o_cond->sql_where_condition( iv_field = 'BRANCH' ).
CONCATENATE '( ' branch_cond ' )' INTO branch_cond.


branch_cond then looks like this:

(BRANCH = '1000000001' OR BRANCH = '1000000002')


Call the AMDP class (class and method names are illustrative):

TRY.
    zcl_amdp_customerage=>get_agedistribution(
      EXPORTING
        branch             = branch_cond
      IMPORTING
        et_agedistribution = lt_agedistribution ).
  CATCH cx_amdp_execution_failed INTO DATA(lxage).
    "error handling
ENDTRY.


3. Stored Procedure:

Update the stored procedure of 'ZOUI5_CV_CUSTOMERAGE_SQL' to apply the filter: the passed condition is used to build var_filter, the distinct matching customers are selected into var_int, and the age distribution is then calculated based on var_int.

The output is "AGE", "AGERATE".

OData service development with SAP Gateway - code-based service development - Part I




In this blog I would like to show the basics of OData service development with SAP Gateway when using code-based service implementation, as it is shown in the upcoming SAP CodeJam events about OData service development with SAP Gateway.

Though the recommended approach for OData service development in the most recent SAP NetWeaver releases is to use CDS views, there are still a number of use cases where code-based service implementation is a valid option:

  1. Your SAP backend system is based on an SAP NetWeaver release 7.31 or earlier
  2. Your system is based on SAP NetWeaver 7.40 or later, but the business logic you want to re-use has been implemented in ABAP code
  3. It is not possible to implement your business logic in CDS views


As an example we will use a simple OData model that consists of SalesOrders and SalesOrderItems, with the option to navigate from a SalesOrder to the corresponding line items.





The examples shown in this blog are based on the latest SAP NetWeaver release, 7.50, so that we can compare the different implementation approaches.



Creating a basic OData service


We will perform the following steps:


  • Use the SAP Gateway Service Builder to create a new project
  • Import a DDIC structure for SalesOrder to create the data model
  • Generate, register and test the OData service using the Gateway Client
  • Perform a first implementation of the GET_ENTITYSET method


Create a new Service Builder project


  • Start the Gateway Service Builder by running transaction SEGW.


  • Click on the Create Project button
  • Provide the following information:


    Project: ZE2E100_XX

    Description: ZE2E100_XX

    Service Package: $TMP


    Note: Replace XX with your session-id / group number.






    • Press Continue
    • Press Save

    Import a DDIC structure for SalesOrder

    • Right-click on Data Model and select Import --> DDIC Structure.


    • In the first step of the wizard provide:

      In the field Name, enter SalesOrder
      In the field ABAP Structure, enter SEPM_ISOE.


    • In the second step of the wizard provide:

      Select the checkbox for SEPM_ISOE.
      Deselect the checkbox for MANDT,
      Click Next.


    • In the third screen of the wizard

      Select the checkbox Is Key for the field SALESORDER.
      Click Finish.
    • Expand Folder Properties and examine the entity SalesOrder and the entity set SalesOrderSet.

      (For example the property names, the Edm Core Type values and the ABAP field names)
    • Press Save
    • Press the Check Project Consistency button.
    • Verify that no errors were found.
    • Press the Generate Runtime Objects button.

      Note: Using the Generate Runtime Objects button automatically saves the project.

    • Leave all values as defaulted and press Enter.


    • Choose Local Object in the Create Object Directory Entry popup.
    • Verify that the generation was successful.


    Now you have created an (empty) service implementation in the SAP backend.



    Register and Test the OData service using the Gateway client


    • Expand the node Service Maintenance and right-click on GW_HUB and select Register.


    • In the Add Service popup leave all values as defaulted and press the Local Object button to assign the artifacts to the $tmp package. Then press Enter to continue


    • Double-click on the GW_HUB entry under Service Maintenance. Verify that the registration status on the right-hand side is showing a green traffic light.


    • In the navigation tree right-click on GW_HUB and select SAP Gateway Client.

      Alternatively click on the SAP Gateway Client button in the detail section.
    • Confirm the warning popup that warns you that you will be redirected to the selected system. Select Yes.
    • This opens up the SAP Gateway Client in a new screen. The URI of the service document is already defaulted.
      Press the Execute button.


    • As a result you get the service document of your OData service in the HTTP response window


    • Press the Button “Entity Sets” and select SalesOrderSet.


    • After pressing the Execute button, which executes the following URI,


      you see an error message that indicates that the method SALESORDERSET_GET_ENTITYSET has not yet been implemented.


    Provision the GET_ENTITYSET method of the service using ABAP code


    • Open the Service Builder Project again (if you have closed it)
    • Expand the node Service Implementation and expand SalesOrderSet.
      Now right-click on GetEntitySet(Query) and choose
      Go to ABAP Workbench.


    • Confirm the warning

      “Operation SALESORDERSET_GET_ENTITYSET has not yet been implemented”
    • Switch to edit mode, scroll down to the method SALESORDERSET_GET_ENTITYSET and make sure to select it.

      Click on the Redefine Method button.


    • Use Copy&Paste to copy the entire coding shown below into the salesorderset_get_entityset method. Please note: this is the new ABAP SQL syntax.

    select * from sepm_i_salesorder_e into corresponding fields of table @et_entityset.

      • Info: The inserted coding selects data from the CDS view sepm_i_salesorder_e.

        The results are filled into the corresponding fields of the return table et_entityset of the GET_ENTITYSET method.
        This works out of the box since the structure SEPM_ISOE was used to create the entity type via DDIC import. This structure is the so-called sqlViewName of the CDS view sepm_i_salesorder_e. The source code of the DDL source can be viewed using the report RUTDDLSSHOW.
    • Click on Activate.
    • Confirm the activation popup.
    • Navigate back to the Service Builder main screen using the green back button multiple times.
    • In the navigation tree right-click on GW_HUB and select SAP Gateway Client.

      Alternatively click on the SAP Gateway Client button in the detail section.


    • This opens up the SAP Gateway Client in a new screen. The URI of the service document is already defaulted.
      Press the button “Entity Sets” and select SalesOrderSet.
      The request URI field will contain the following value:



             (Replace ‘<XX>’ by your group number.)


             After pressing the Execute button you see a list of sales orders.




    • You can retrieve the number of sales orders by adding “/$count” to the request URI

      Replace ‘<XX>’ by your group number and press Execute


    Add support for $filter to your code


    • Open the Service Builder Project again
    • Expand the node Service Implementation and SalesOrderSet and choose the entry GetEntitySet (Query).
      Now right-click on GetEntitySet (Query) and choose
      Go to ABAP Workbench.


    • This time the ABAP editor will open the method salesorderset_get_entityset  immediately.
      Switch to the change mode and replace the code with the code shown below.
      The code retrieves the $filter statement as an osql statement for the where clause.
      The data is retrieved and inserted into the return table et_entityset.

    data: lv_osql_where_clause type string.

    lv_osql_where_clause = io_tech_request_context->get_osql_where_clause( ).

    select * from sepm_i_salesorder_e
      where (lv_osql_where_clause)
      into corresponding fields of table @et_entityset.


    • Click on Activate.
    • Navigate back to the Service Builder main screen using the back button
    • In the navigation tree right-click on GW_HUB and select SAP Gateway Client.

      Alternatively click on the SAP Gateway Client button in the detail section.
    • This opens up the SAP Gateway Client in a new screen. The URI of the service document is already defaulted.
    • Replace the URI with the following:

      /sap/opu/odata/SAP/ZE2E100_XX_SRV/SalesOrderSet?$filter=Grossamountintransaccurrency ge 100000&$select=Salesorder,Grossamountintransaccurrency,Transactioncurrency&$format=json

      Replace ‘<XX>’ by your group number and press Execute

      This will deliver a list of four sales orders whose gross amount is at least 100.000 Euro and will only show the sales order number, the gross amount and the transaction currency.



    Add support for additional query options to your code



    • Open the Service Builder Project again
    • Expand the node Service Implementation and right-click on SalesOrderSet and choose the entry GetEntitySet (Query).
      Now right-click on GetEntitySet (Query) and choose
      Go to ABAP Workbench.

    • This time the ABAP editor will open the method salesorderset_get_entityset immediately.
    • Switch to the edit mode.
      Replace the code with the code shown below.
      The code retrieves the values for $top and $skip as well as the $filter statement as an osql statement for the where clause. The data is retrieved, and at the end it is checked whether $inlinecount is requested (which is the default for SAPUI5).


    data: lv_osql_where_clause type string,
          lv_top               type i,
          lv_skip              type i,
          lv_max_index         type i.

    "get number of records requested
    lv_top = io_tech_request_context->get_top( ).
    "get number of lines that should be skipped
    lv_skip = io_tech_request_context->get_skip( ).
    "value for maxrows must only be calculated if the request also contains a $top
    if lv_top is not initial.
      lv_max_index = lv_top + lv_skip.
    endif.

    lv_osql_where_clause = io_tech_request_context->get_osql_where_clause( ).

    select * from sepm_i_salesorder_e
      where (lv_osql_where_clause)
      into corresponding fields of table @et_entityset
      up to @lv_max_index rows.

    "skipping entries specified by $skip
    if lv_skip is not initial.
      delete et_entityset to lv_skip.
    endif.

    "inlinecount - get the total number of entries that fit the where clause
    if io_tech_request_context->has_inlinecount( ) = abap_true.
      select count(*) from sepm_i_salesorder_e where (lv_osql_where_clause).
      es_response_context-inlinecount = sy-dbcnt.
    else.
      clear es_response_context-inlinecount.
    endif.

    • Click on Activate.
    • Navigate back to the Service Builder main screen using the back button  multiple times.
    • In the navigation tree right-click on GW_HUB and select SAP Gateway Client.

      Alternatively click on the SAP Gateway Client button in the detail section.
    • This opens up the SAP Gateway Client in a new screen. The URI of the service document is already defaulted.
    • Replace the URI with the following:

      /sap/opu/odata/SAP/ZE2E100_XX_SRV/SalesOrderSet?$filter=Grossamountintransaccurrency ge 100000&$select=Salesorder,Grossamountintransaccurrency,Transactioncurrency&$top=2&$skip=1&$inlinecount=allpages&$format=json

      Replace ‘<XX>’ by your group number and press Execute
      This will deliver a list of only 2 sales orders whose gross amount exceeds 100.000 EUR. The first sales order 500000005 is skipped and only the first 2 of the rest of the list (highlighted in bold) are shown.
      The inlinecount however shows that in total 40 sales orders would fit to the filter criteria.

      500000030 <--
      500000034 <--

      Please note:
      $skip, $top and $inlinecount are used for client side paging. SAPUI5 uses this to calculate how many pages one has to navigate through.


    What is now left is the implementation of the GET_ENTITY method of the entity set SalesOrderSet and the modelling and implementation of the navigation between sales order header and the line items.

    This will be described in the second part of this blog post OData service development with SAP Gateway - code-based service development - Part II because the editor refused to let me add any additional screen shots at this point .

    OData service development with SAP Gateway - code-based service development - Part II


    Since the editor didn't let me enter additional screen shots I am continuing my last blog post OData service development with SAP Gateway - code-based service development - Part I in a new one:


    We will continue our service implementation by implementing


    1. the GET_ENTITY method for the SalesOrderSet entity set and
    2. by implementing navigation to the items of a sales order


    Provision the GET_ENTITY method of the service using ABAP code

    • Open the Service Builder Project again
    • Expand the node Service Implementation and SalesOrderSet and choose the entry
      GetEntity (Read).
      Now right-click on
      GetEntity (Read) and choose
      Go to ABAP Workbench.


    • Confirm the warning

      “Operation SALESORDERSET_GET_ENTITY has not yet been implemented”


    • Switch to edit mode, scroll down to the method SALESORDERSET_GET_ENTITY and make sure to select it.

      Click on the Redefine Method button.

      Please note that the method SALESORDERSET_GET_ENTITYSET has already been redefined and therefore appears in black color.


    • Use Copy&Paste to copy the entire coding shown below into the ABAP editor.

      Replace ‘<XX>’ by your group number

      The code first calls the method io_tech_request_context->get_converted_keys.

      The retrieved key value is used to select a single entry, which is stored in the return structure
      er_entity of the method.

    method salesorderset_get_entity.


         data: lt_keys       type /iwbep/t_mgw_tech_pairs,

               ls_key        type /iwbep/s_mgw_tech_pair,

               ls_bp_key     type zcl_ze2e100_xx_mpc=>ts_salesorder-salesorder,

               ls_headerdata type zcl_ze2e100_xx_mpc=>ts_salesorder.


         call method io_tech_request_context->get_converted_keys
           importing
             es_key_values = ls_headerdata.


         ls_bp_key = ls_headerdata-salesorder.


         select single *
           from sepm_i_salesorder_e
           where salesorder = @ls_headerdata-salesorder
           into corresponding fields of @er_entity.

    endmethod.




    • Click on Activate
    • Confirm the Activation popup
    • Navigate back to the Service Builder main screen using the back button  multiple times.
    • In the navigation tree right-click on GW_HUB and select SAP Gateway Client.
    • Alternatively click on the SAP Gateway Client button in the detail section. Enter the following URI,
      replace ‘<XX>’ by your group number and Execute.
      You will get the details of a single sales order as a response.




    Add an additional entity set for Sales Order Items

    • Open the Service Builder Project again
    • We will now create a second entity type for sales order items and entity set.

      Right-click again on Data Model and select Import --> DDIC Structure.

    • In the first step of the wizard provide:

      In the field Name, enter SalesOrderItem.
      In the field ABAP Structure, enter SEPM_ISOIE.
      Click Next.


    • In the second step of the wizard provide:

      Select the checkbox for SEPM_ISOIE.
      Deselect the checkbox for MANDT. Click Next.


    • In the third screen of the wizard:
      Select the checkbox Is Key for the fields SALESORDER and SALESORDERITEM.
      Click Finish.


    • Press Save

    Add an association and navigation properties

    • Expand Data Model, right-click on Association and select Create.


    • In the first step of the wizard provide:

      Association Name

      Principal Entity:

      Entity Type Name

      Leave the checkbox Create related Navigation Property checked.
      Enter ToItems in the field Navigation Property.

      Dependent Entity:

      Entity Type Name

      Leave the checkbox Create related Navigation Property checked.
      Enter ToSalesOrder in the field Navigation Property.


    • Click Next
    • In the second step of the wizard in the field Dependent Property, enter SalesOrder by using the value help.


    • In the third step of the wizard press Finish.
    • Press the Check Project Consistency button.
    • Verify that no errors were found.
    • Press Generate Runtime Objects

    Provision the GET_ENTITY_SET method of the entity set SalesOrderItemSet


    • Open the Service Builder Project again
    • Expand the node Service Implementation and SalesOrderItemSet and choose the entry GetEntitySet (Query).
      Now right-click on GetEntitySet (Query) and choose Go to ABAP Workbench.


    • Confirm the warning

      “Operation SALESORDERITEMSE_GET_ENTITYSET has not yet been implemented”
    • Switch to edit mode, scroll down to the method SALESORDERITEMSE_GET_ENTITYSET and make sure to select it.

      Click on the Redefine Method button.


    • Copy and paste the entire code shown below.

      Replace ‘<XX>’ by your group number

      The code first calls the method io_tech_request_context->get_navigation_path.

      If the entity set is accessed via the navigation property ‘TOITEMS’ the sales order id is retrieved from the URI.

      If the entity set is accessed directly the where clause is retrieved from io_tech_request_context.

      Please note that the methods SALESORDERSET_GET_ENTITY and SALESORDERSET_GET_ENTITYSET have already been redefined and therefore appear in black color.


    method salesorderitemse_get_entityset.

         data: lt_nav_path   type /iwbep/t_mgw_tech_navi,
               ls_nav_path   type /iwbep/s_mgw_tech_navi,
               lt_keys       type /iwbep/t_mgw_tech_pairs,
               ls_key        type /iwbep/s_mgw_tech_pair,
               ls_so_key     type zcl_ze2e100_xx_mpc=>ts_salesorder-salesorder,
               ls_headerdata type zcl_ze2e100_xx_mpc=>ts_salesorder.

         data: lv_osql_where_clause type string.

         lt_nav_path = io_tech_request_context->get_navigation_path( ).

         read table lt_nav_path into ls_nav_path with key nav_prop = 'TOITEMS'.

         if sy-subrc = 0.

           " Accessed via the navigation property ToItems:
           " retrieve the sales order key from the source entity
           call method io_tech_request_context->get_converted_source_keys
             importing
               es_key_values = ls_headerdata.

           ls_so_key = ls_headerdata-salesorder.

           select * from sepm_i_salesorderitem_e
             where salesorder = @ls_so_key
             into corresponding fields of table @et_entityset.

         else.

           " Accessed directly: build the where clause from the request
           lv_osql_where_clause = io_tech_request_context->get_osql_where_clause_convert( ).

           select * from sepm_i_salesorderitem_e
             where (lv_osql_where_clause)
             into corresponding fields of table @et_entityset.

         endif.

    endmethod.







    • Click on Activate
    • Confirm the Activation popup
    • Navigate back to the Service Builder main screen using the back button multiple times.
    • In the navigation tree right-click on GW_HUB and select SAP Gateway Client.

      Alternatively click on the SAP Gateway Client button in the detail section.

    • Enter the following URI

      Replace ‘<XX>’ by your group number and Execute.
      You will get the items of sales order '500000000' as a response.
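This URI was also lost from this copy. Based on the navigation property ToItems created above and the service name used throughout this exercise, a navigation request would plausibly look like:

```
/sap/opu/odata/SAP/ZE2E100_XX_SRV/SalesOrderSet('500000000')/ToItems
```

Replace ‘<XX>’ by your group number before executing.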

    • Enter the following URI

      /sap/opu/odata/SAP/ZE2E100_XX_SRV/SalesOrderItemSet?$filter=Salesorder eq '500000000'

      Replace ‘<XX>’ by your group number and Execute.

      Here, too, you will get the items of sales order '500000000' as a response.

    OData service development with SAP Gateway using CDS via Referenced Data Sources



    • 02.06.2016 Added link to blog that describes how to implement updates
    • 09.06.2016 Added section how to create the CDS view used for generating an OData service via Referenced Data Sources




    In a previous blog, OData service development with SAP Gateway - code-based service development - Part I, I described how to implement a basic OData service using a code-based implementation.


    Since the recommended development approach as of SAP NetWeaver 750 is to use CDS views, I would like to show how a service with the same capabilities as the one in the blog mentioned above can be generated based on CDS views.


    It will turn out that, from a development perspective in the ABAP stack, this requires much less effort if appropriate CDS views are in place and your backend system is based on SAP NetWeaver 750.


    If, however, no appropriate CDS view is in place, the effort of developing an OData service via code-based implementation is replaced by the effort of developing the appropriate DDL source code.




    As a prerequisite we are using a consumption view ZSEPM_C_SALESORDER_TPL, since it is not good practice to access core CDS views directly.


    This consumption view is then also annotated as an analytical view, like the underlying core CDS view SEPM_I_SALESORDER_E.


    How to create this consumption view ZSEPM_C_SALESORDER_TPL is shown in the following.

    If you are using a system at an SAP CodeJam you can skip this step, or create your own CDS view by replacing <TPL> with your group number.

    Creating the consumption view


    Please note:

    The creation of a consumption view is only necessary if you want to follow this how-to-guide on your own system or if you want to use your own CDS view.

    For a SAP CodeJam or other SAP event we will use a system where this view has already been created.




    • Start the ABAP Development Tools


    From the menu choose New --> ABAP Project


    • Select the system (here A4H) from the list of system connections and choose Next


    • In the "Connection Settings" screen press Next

    • Enter your credentials

    • Right click on your system connection and select New --> Other ...


    • Select DDL source and press Next

    • Enter the following details for the new DDL source and press Finish

      Package: $TMP
      Description: SalesOrders - EPM Demo Data

    • Copy and paste the source code below and press the Activate button. This will create the CDS view that we will use in this how-to guide.



    @AbapCatalog.sqlViewName: 'ZSEPM_ISOE_TPL'
    @AbapCatalog.compiler.compareFilter: true
    @AccessControl.authorizationCheck: #CHECK
    @EndUserText.label: 'SalesOrders - EPM Demo Data'
    define view Zsepm_C_Salesorder_Tpl as select from SEPM_I_SalesOrder_E {
          @ObjectModel.text.association: '_Text'
      key SEPM_I_SalesOrder_E.SalesOrder,

          @ObjectModel.readOnly: true
          ... /* the remaining fields of SEPM_I_SalesOrder_E,
                 each preceded by @ObjectModel.readOnly: true */

      /* Associations */
      ...
    }

    Generating an OData service via Referenced Data Source


    • We first have to create a new Service Builder project called ZE2E100_<XX>_2, since the project in the previous blog mentioned above was called ZE2E100_<XX>.

      Replace <XX> with your group number.
    • Right click on the folder Data Model.

      Choose Reference -->
      Modeled Data Source Reference from the context menu


    • The Reference Data Source Wizard opens.

      On the first screen choose:
      CDS Core Data Services for the field Modeled Data Source Type.
      ZSEPM_C_SALESORDER_TPL for the field Modeled Data Source Name


    • In the second window of the wizard select the following associations:

      _CUSTOMER and _ITEM.

      Please note that the CDS views SEPM_I_BUSINESSPARTNER_E and SEPM_I_SALESORDERITEM_E are automatically selected as well.

      Press Finish




    • Press the Generate Runtime Objects button

    • In the Model and Service Definition screen leave the default values unchanged and press Continue.


    • In the Create Object Directory Entry dialogue press Local Object or enter $TMP.

    • Expand the folder Service Maintenance, right-click on the entry GW_HUB and choose Register.


    • In the Add Service dialogue enter $TMP for the package assignment or press Local Object.


    • Expand the folder Service Maintenance, right-click on the entry GW_HUB and choose SAP Gateway Client.

      Press <Execute>


    • Check the Service Document

      It should contain three entity sets:
      1. Zsepm_C_Salesorder_Tpl
      2. SEPM_I_SalesOrderItem_E
      3. SEPM_I_BusinessPartner_E


    • Check the metadata document by choosing Add URI Option and selecting $metadata.


    Check the Metadata Document
    Please note that the entity type Zsepm_C_Salesorder_TplType contains:

      • Two navigation properties to_Customer and to_Item
      • A property SalesOrder_Text that has been generated by the SADL framework based on the annotation @ObjectModel.text.association: '_Text'.

        Please note that this property is annotated as sap:updatable="false".



    • Now we can test the service: it supports various query options, $expand, and selecting single entities out of the box.

      To do so, execute the following URIs in the SAP Gateway Client (transaction /n/IWFND/GW_CLIENT):
      • /sap/opu/odata/SAP/ZE2E100_XX_2_SRV/Zsepm_C_Salesorder_Tpl?$filter=GrossAmountInTransacCurrency ge 100000&$select=SalesOrder,GrossAmountInTransacCurrency,TransactionCurrency&$format=json


      • $skip and $top together with $inlinecount work out of the box as well

        /sap/opu/odata/SAP/ZE2E100_XX_2_SRV/Zsepm_C_Salesorder_Tpl?$filter=GrossAmountInTransacCurrency ge 100000&$select=SalesOrder,GrossAmountInTransacCurrency,TransactionCurrency&$top=2&$skip=1&$inlinecount=allpages&$format=json

    Please note that via $inlinecount=allpages we retrieve the number of entries that would be returned without $skip and $top.
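As an illustration of the response shape (the field values below are made up), an OData V2 JSON response to the $inlinecount query above looks roughly like this, with __count carrying the total number of matching entries while results contains only the $top/$skip page:

```json
{
  "d": {
    "__count": "12",
    "results": [
      { "SalesOrder": "500000001", "GrossAmountInTransacCurrency": "102453.00", "TransactionCurrency": "EUR" },
      { "SalesOrder": "500000002", "GrossAmountInTransacCurrency": "134900.00", "TransactionCurrency": "EUR" }
    ]
  }
}
```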


      • Read a single entity from the list of sales orders.


        Please note that the odd-looking key stems from the fact that the CDS view is annotated as an analytical view.

      • Read the list of items of a single sales order via the navigation property to_Item





    How to implement updates in a service that has been generated using referenced data sources is described in the following blog:


    OData service development with SAP Gateway using CDS via Referenced Data Sources - How to implement updates



    OData service development with SAP Gateway using CDS via Referenced Data Sources - How to implement updates




    The OData service that has been generated in the first part of this blog is read only.


    In this second part of the blog, OData service development with SAP Gateway using CDS via Referenced Data Sources, I would like to show how we can make the description of the sales order updatable.


    This will show


    1. how simple updates can be implemented, and in particular
    2. how this can be done for texts that are accessed by the SADL framework via a join and the annotation @ObjectModel.text.association: '_Text'.




    • Open the Service Builder project ZE2E100_<XX>_2 again that has been built based on the blog mentioned above.
    • Expand the folder Runtime Artifacts, right-click on ZCL_ZE2E100_XX_2_DPC_EXT and choose the entry Go To ABAP Workbench.


    • Switch to edit mode, scroll down to the method ZSEPM_C_SALESORD_UPDATE_ENTITY, select it and click on the Redefine Method button.


    • Copy and Paste the coding into the Update method

      method zsepm_c_salesord_update_entity.

         data: lt_keys               type /iwbep/t_mgw_tech_pairs,
               ls_key                type /iwbep/s_mgw_tech_pair,
               ls_so_id              type bapi_epm_so_id,
               ls_headerdata_update  type bapi_epm_so_header,
               ls_headerdatax        type bapi_epm_so_headerx,
               ls_headerdata_key     type zcl_ze2e100_xx_2_mpc=>ts_zsepm_c_salesorder_tpltype,
               ls_headerdata_payload type zcl_ze2e100_xx_2_mpc=>ts_zsepm_c_salesorder_tpltype,
               lt_return             type table of bapiret2,
               ls_return             type          bapiret2,
               err_msg               type          string,
               ls_message            type          scx_t100key.

         call method io_tech_request_context->get_converted_keys
           importing
             es_key_values = ls_headerdata_key.

         io_data_provider->read_entry_data( importing es_data = ls_headerdata_payload ).

         ls_so_id-so_id = ls_headerdata_key-salesorder.

         " Sales order header data (non-key) fields that can be updated
         " via the BAPI are marked with an 'X'

         ls_headerdatax-so_id = ls_headerdata_key-salesorder.
         ls_headerdatax-note = 'X'.

         " move the content of the fields that should be
         " updated from the payload to the corresponding
         " fields of the BAPI structure

         move ls_headerdata_key-salesorder to ls_headerdata_update-so_id.
         move ls_headerdata_payload-salesorder_text to ls_headerdata_update-note.

         call function 'BAPI_EPM_SO_CHANGE'
           exporting
             so_id         = ls_so_id             " EPM: SO Id
             soheaderdata  = ls_headerdata_update " EPM: SO header data of BOR object
             soheaderdatax = ls_headerdatax
           tables
             return        = lt_return.           " Return Parameter

         if lt_return is not initial.

           loop at lt_return into ls_return.
             err_msg = ls_return-message.
           endloop.

           ls_message-msgid = 'SY'.
           ls_message-msgno = '002'.
           ls_message-attr1 = err_msg.

           raise exception type /iwbep/cx_mgw_busi_exception
             exporting
               textid = ls_message.

         endif.

      endmethod.







    Info: The coding above retrieves the content of the properties of the incoming request.

    Since different DDIC structures are used by the entity type and by the BAPI that updates the sales order, the incoming fields are moved to the data structure used by the BAPI.

    • To make the SAP Web IDE CRUD Master-Detail template aware that the property SalesOrder_Text is now updatable, we have to annotate the property with sap:updatable="true".

      In this special case this cannot be done in the CDS view, so we have to add the additional metadata by redefining the DEFINE method in the model provider extension class.
    • Expand the folder Runtime Artifacts, right-click on ZCL_ZE2E100_XX_2_MPC_EXT and choose the entry Go To ABAP Workbench.
    • Copy and paste the following coding into the DEFINE method.

    method define.

         data: lo_entity_type type ref to /iwbep/if_mgw_odata_entity_typ,
               lo_property    type ref to /iwbep/if_mgw_odata_property.

         super->define( ).

         lo_entity_type = model->get_entity_type( iv_entity_name = 'Zsepm_C_Salesorder_TplType' ).
         lo_property = lo_entity_type->get_property( iv_property_name = 'SalesOrder_Text' ).
         lo_property->set_updatable( abap_true ).

    endmethod.




    • Click on Activate.
    • Confirm the activation popup.
    • Navigate back to the Service Builder by pressing the “Back-Button” several times
    • In the navigation tree right-click on GW_HUB and select SAP Gateway Client.

      Alternatively start the SAP Gateway Client in a separate window by starting transaction /n/IWFND/GW_CLIENT

      If we now test the update using the SAP Gateway Client this should work fine.

    • Enter the following URI

      Replace ‘<XX>’ with your group number.
      After pressing the Execute button you see a single sales order.
    • Press the Use as Request button to create a proper HTTP request body.
      Change the content of the field SalesOrder_Text, for example to ‘Test Update Text’.
      Change the HTTP method from GET to PUT.
      Press Execute.


    • As a result you get an empty HTTP response with the return code 204.


    • Now perform a GET request again to verify that the data has been changed.
      Enter the following URI.
      Replace ‘<XX>’ with your group number.
      After pressing the Execute button you see the single sales order with the changed text.



    Testing with SAP Web IDE


    When testing with the CRUD Master-Detail template in SAP Web IDE it turns out that the update of the field does not work.


    This is due to a small bug that is planned to be fixed with SAP NetWeaver 750 SP04.


    As a workaround we have to redefine the method /IWBEP/IF_MGW_CORE_SRV_RUNTIME~CHANGESET_BEGIN.


    After you have added the following code, which sets the changing parameter of CHANGESET_BEGIN so that processing is not deferred:

         cv_defer_mode = abap_false.





    The text will be updated, as you can check with the SAP Gateway Client.


