Channel Description:
SAP NetWeaver Gateway (formerly called "Project Gateway") is a technology that provides a simple way to connect devices, environments and platforms to SAP software based on market standards. The framework enables the development of innovative, people-centric solutions, bringing the power of SAP business software into new experiences such as social and collaboration environments, mobile and tablet devices, and rich internet applications. It offers connectivity to SAP applications using any programming language or model, without the need for SAP knowledge, by leveraging REST services and the OData/Atom protocols.



    logo_codejam_hannover.jpg

    On Thursday, May 21, 2015 from 10:00 AM to 4:00 PM (CEST) there will be an SAP CodeJam in Hannover, which will be conducted by my colleague Frederic Berg and me.

     

    If you have not been able to make it to the CodeJam event in Bielefeld that takes place on May 5th, this event is for you.

     

    The topic will again be the end-to-end development of an SAP Fiori-like application, showing the development of a simple OData service and the development of a SAPUI5 application using the SAP Web IDE.

     

    So the event is of interest for ABAP developers as well as frontend developers. While ABAP developers will learn how to code an OData service using SEGW, they will also have the chance to see how to generate a simple SAPUI5 application using SAP Web IDE.

     

    Frontend developers will have the chance to learn more about SAP Web IDE to speed up their development of SAPUI5-based applications, and will also see what happens behind the scenes in the ABAP stack when an OData service is developed, shedding some light on this black box.

     

    As always, this is not a classroom training but a chance to get your hands dirty, meet people and have fun with coding.

     

    If you are interested here is the registration page link:

     

    SAP CodeJam Hannover Registration, 30659 Hannover | Eventbrite

     

    Best Regards,

    Andre



    Mobile computing is becoming more and more integrated into our daily lives. In the last 10 years users went from tiny laptops to netbooks and tablets, from ordinary mobile phones to smartphones, and now we have finally arrived at the age of the so-called "wearable devices". It doesn't matter whether you are wearing a smartband, a smartwatch or even Google's highlight product, the Glass: with all those little devices you can do nearly everything you could do with an ordinary computer. You can call your friends or send them a short message or e-mail. You can go on the internet and search for the current cinema program or whatever else you are looking for. It's also possible to monitor various bodily functions, to take videos and pictures, or to navigate through an unknown city just by looking through a small display fixed to a pair of eyeglasses, for example. These features are quite cool for ordinary home users, but what if you are a business user and want to take advantage of them in your company? This is exactly the question businesses are struggling with at the moment, and it's also what I'm trying to answer in this blog post.

     

    My name is Stefan Heitzer and I'm a developer working for the SAP consulting company T.CON GmbH & Co. KG in Plattling (Germany). We are always interested in new technologies and in things that can improve the user experience and the look and feel of applications. For this reason I dealt with scenarios for wearable computing in the field of ERP systems for my bachelor thesis. We soon bought a Google Glass for our company and started with the first attempts at developing applications for the Glass, which finally resulted in an app for mobile maintenance in connection with SAP NetWeaver Gateway. Our company had already developed the so-called "Mobile Maintenance Management" for Android phones. The app mainly helps technicians to get an overview of maintenance and repair processes and to handle them much more efficiently. For more information please check out the following link: Mobile Maintenance - T.CON Team Consulting. But let's start from the beginning. This post covers the following points:

    • The general functionalities of the so-called "Mobile Maintenance Management Glass"
    • How to consume OData with the Glass
    • How to take pictures and videos with the Glass and send them to an SAP backend

     

    The General functionalities of the so-called “Mobile Maintenance Management Glass”

    The app itself has two main menu items. It's possible to review the current inspection and repair orders and to create confirmations for them. The other action is to generate new notifications for machine damages. As is the normal way of using the Glass, you can start the actions either by voice or by tapping and swiping through a normal menu. (To create a uniform experience, I decided to display all attached images the way they look when tapping and swiping.)

    glass_blog01.jpg

    The above picture shows you the main menu / start menu with the options you have. Let's have a look at the first option.

    Via "Show orders" the User is able to get an overview of all the current inspection and repair orders. This means that the first step is to establish a connection with the SAP Backend and retreive all the orders. The process looks like the following:

    glass_blog02.jpg
    Each order can have one or more so-called "suborders". The user should be able to explore each suborder and to make full or partial confirmations for it, since he is wearing the Glass in his company and can fix the problem on the spot. The confirmation itself can consist of a confirmation text and also needs a valid time in minutes that was required to fix the problem. All the required information can therefore be entered by voice. To make sure that only valid values for the time can be entered, the app checks the corresponding voice input for a number. The menu for this process and the validator are shown in the next screenshots:

    glass_blog03.jpg

    glass_blog04.jpg
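    The number check on the voice input can be sketched like this. This is a minimal sketch with hypothetical names, not the app's actual code: a helper that accepts the voice-recognition result only if it is a positive whole number of minutes.

    ```java
    // Hypothetical helper: accept a voice-input string as confirmation time
    // only if it is a positive whole number of minutes.
    public class TimeInputValidator {

        // Returns the parsed minutes, or -1 if the input is not a valid number.
        public static int parseMinutes(String voiceInput) {
            if (voiceInput == null) {
                return -1;
            }
            String trimmed = voiceInput.trim();
            // Voice recognition may return words ("forty five") instead of digits.
            if (!trimmed.matches("\\d+")) {
                return -1;
            }
            try {
                int minutes = Integer.parseInt(trimmed);
                return minutes > 0 ? minutes : -1;
            } catch (NumberFormatException e) {
                return -1; // e.g. the value exceeds the Integer range
            }
        }
    }
    ```

    If the helper returns -1, the app would re-prompt the user for the time instead of sending the confirmation.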

    As already mentioned, the user can make a partial or a full confirmation. A partial one should be made if the technician has done something towards fixing the problem; a full confirmation should only be made if the error is completely solved. When all the required information has been entered, the subsequent steps look like the following:

    glass_blog05.jpg

    glass_blog06.jpg

    It doesn't matter which kind of confirmation the user makes: whenever this process is executed, the app connects to SAP NetWeaver Gateway and creates the confirmation with the entered information. To indicate that partial confirmations have already been made for a suborder, each order has a small icon (circle) in the left corner. Red means that the error hasn't been solved yet, and yellow means that a partial confirmation has already been made.

     

    Another possible scenario is that the technician is on the shop floor and suddenly a problem appears. For this reason the second menu item "Create notification" exists. As with "Show orders", the first step is to establish a connection to the SAP backend. Each notification has a specific damage code and of course a priority; this information is fetched at the beginning. The notification itself always refers to a functional location or a piece of equipment. This code can either be known by the technician or it can be found as a barcode on a machine, for example. So for entering it there are generally two possibilities: either by voice or by scanning a barcode with the built-in camera of the Glass. While the codes are scanned or entered by voice it's of course possible that errors occur, so all entered values are checked, just as with the input in the "Create confirmation" process. The complete order of events looks like the following:

    glass_blog07.jpg

    glass_blog08.jpg

    glass_blog09.jpg

    At the beginning of the description of the second function I said that it's also important to enter a damage code and the priority of the fault notification. For a better user experience we designed a list consisting of those elements. By swiping you can browse through the elements, and with a simple tap you make your choice:

    glass_blog10.jpg

    After choosing the damage code and the priority and entering a valid code for a functional location / equipment, the user reaches the point where he can create the notification. But more information can be entered as well: for example, the user is able to set a text, take a picture or record a video. All of this information is later transported to the SAP backend:

    glass_blog11.jpg

    Now let's go on with the coding. In the next step I'm going to show you some relevant pieces of code that are needed to make the app work.

     

    How to consume OData with the Glass

    The app itself is an Android development project, which means that we coded in plain Java. Since we consume data from an SAP backend, we had to look for a way to connect to it, and the best choice was to use SAP NetWeaver Gateway. The Gateway provides the data in the so-called OData format. For this reason it was necessary to create the corresponding entities in the backend and also the classes in the frontend. Let's have a look at the entity for confirmations and the entity for orders:

     

    vorgang.jpgrueckmeldung.jpg

    As you can see, these are normal entities which can be created with the NetWeaver Gateway. In the background we have code for the needed CRUD operations. For the orders, for example, there is a GET_ENTITYSET method which is responsible for retrieving all the orders stored in the system, and the redefined method for the confirmation is a simple CREATE method. If we call the GET_ENTITYSET method in the browser, we receive all the relevant data in the OData format. So we need a way to handle this format in our Java code, and for that we use odata4j. odata4j provides a toolkit with which we can build a complete OData consumer inside our Android app.

    With this framework we first create a new ODataConsumer. The builder for this consumer needs the URL which should be called. In many cases the services need basic authentication, so we also have to set a so-called "ClientBehavior" ("BasicAuthenticationBehavior") which accepts a username and a password. In the following code you'll also see an instance of "SAPCSRFBehavior". This instance is needed because of the Cross-Site Request Forgery protection: for each modifying request such as an insert, update or delete, SAP NetWeaver Gateway uses a so-called "CSRF token". This token needs to be included in the HTTP request, because the gateway checks the validity of the token for each request. If the validation fails you'll get an error 403. To avoid this error, our class tries to fetch the token by sending a normal GET request first. After setting this "ClientBehavior" we still haven't made the request. To finally execute the service call, we have to choose the entity set which we want to receive and call the method "execute()". The whole code looks like the following:

    glass_blog12.jpg

    As you can see, we loop over the found orders, and for each one we create a new order object and get the corresponding properties via methods of the class "OEntity". Afterwards we can store all found orders in an ArrayList, for example, and go on with displaying them.
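    The token handling inside our "SAPCSRFBehavior" can be sketched roughly like this. The class and method names below are illustrative, not the actual implementation: the idea is that the token arrives in the x-csrf-token response header of an initial GET request (sent with the header value "fetch") and is then echoed on every modifying request.

    ```java
    import java.util.Map;

    // Illustrative sketch of the CSRF token handling: a GET request is sent
    // with the header "x-csrf-token: fetch", and the token returned by the
    // Gateway is reused on every modifying request (POST/PUT/DELETE).
    public class CsrfTokenHelper {

        public static final String CSRF_HEADER = "x-csrf-token";

        // Extracts the token from the response headers, case-insensitively
        // (HTTP header names are not case-sensitive).
        public static String extractToken(Map<String, String> responseHeaders) {
            for (Map.Entry<String, String> e : responseHeaders.entrySet()) {
                if (e.getKey() != null && e.getKey().equalsIgnoreCase(CSRF_HEADER)) {
                    return e.getValue();
                }
            }
            return null; // no token -> a modifying request would fail with 403
        }
    }
    ```

    In the real behavior class, the extracted token is then set as an x-csrf-token request header on each modifying request that the ODataConsumer sends.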

    The creation of a confirmation is no magic either. Here we just have to call another method of our ODataConsumer:

    glass_blog13.jpg

    With the method "createEntity" we're able to say for which entityset we want to send the POST request. Afterwards we only have to set the properties and give them the right values. Actually all requests are handled in this way.

    There is only one special thing. As I mentioned in the first section, the user should be able to take pictures and record a video for a notification. This requires some special code, which I'm going to show you in the next section!

     

    How to take pictures and videos with the Glass and send them to an SAP backend

    The process of taking a picture and sending it to the backend is a bit easier than the video part, so let's start with the picture. The Glass has a built-in camera which we use for this procedure. So the first thing we have to do is trigger the capture of the picture. For that we use the following code:

    glass_blog14.jpgglass_blog15.jpg

    With the new instance of the "Intent" we can start the picture process. What we receive in the method "startActivityForResult" is - if it was successful - the path of the picture that was just taken. With this path we can create a new file. Afterwards we only have to check whether the picture really exists. The best way to send the picture to the system is to convert it into a BASE64 string:

    glass_blog16.jpg

    The entity set for creating a notification includes a property for the picture. This attribute gets the converted BASE64 string as its value. Afterwards the request is simply executed via the "createEntity" method.

    A picture is fine to transport in one request, but the video is a problem because it is far too large for a single request. If you try to convert the whole video into a BASE64 string at once, you will end up with an OutOfMemory exception. Each Android application has a specific amount of memory which can be used for transactions, processing etc. Converting the video into the mentioned string, or sending it in one single request, is just way too much for the app, and this leads to the named exception. To counteract this, we created a loop which reads blocks of 1024 bytes and converts these blocks step by step into the BASE64 string. Afterwards you can put the complete string together and the conversion is finished. The corresponding code looks like the following:

    glass_blog17.jpg
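    The chunk-wise conversion can be sketched as follows. This is a sketch, not our exact production code, and one detail is worth noting: the per-block Base64 strings can only be concatenated into one valid string if the block size is a multiple of 3, because 3 input bytes map to exactly 4 output characters with padding only at the very end. The sketch therefore uses 3 * 1024 bytes per block.

    ```java
    import java.io.ByteArrayInputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.util.Arrays;
    import java.util.Base64;

    // Sketch of the chunk-wise Base64 conversion. The block size is a
    // multiple of 3 so that concatenating the per-block encodings yields
    // the same string as encoding the whole file at once.
    public class ChunkedBase64 {

        private static final int BLOCK_SIZE = 3 * 1024;

        public static String encode(InputStream in) throws IOException {
            StringBuilder result = new StringBuilder();
            byte[] buffer = new byte[BLOCK_SIZE];
            int read;
            while ((read = readFully(in, buffer)) > 0) {
                // The final block may be shorter than BLOCK_SIZE.
                byte[] block = (read == buffer.length)
                        ? buffer
                        : Arrays.copyOf(buffer, read);
                result.append(Base64.getEncoder().encodeToString(block));
            }
            return result.toString();
        }

        // Convenience overload for in-memory data (IOException cannot occur here).
        public static String encode(byte[] data) {
            try {
                return encode(new ByteArrayInputStream(data));
            } catch (IOException e) {
                throw new RuntimeException(e);
            }
        }

        // Fills the buffer completely unless the stream ends first; a plain
        // InputStream.read() may otherwise return a short, non-final block.
        private static int readFully(InputStream in, byte[] buf) throws IOException {
            int off = 0;
            while (off < buf.length) {
                int n = in.read(buf, off, buf.length - off);
                if (n < 0) break;
                off += n;
            }
            return off;
        }
    }
    ```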

    Now we have solved the OutOfMemory exception while converting the video, but as already said, the same problem can occur when the request is executed. Therefore we use a streaming mechanism. This means that we created a function which includes a recursion: the BASE64 string is again separated into blocks of 2,000,000 bytes, each block is transported, and the function calls itself again until the whole request is finished:

    glass_blog18.jpg
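    The splitting and flagging logic behind this recursion can be sketched like this. The names are illustrative, and the actual request sending via "createEntity" is omitted: every block except the last is sent without the "finished" flag, and the final block carries it so the backend knows the transfer is complete.

    ```java
    import java.util.ArrayList;
    import java.util.List;

    // Sketch of the streaming idea: the Base64 string is cut into fixed-size
    // blocks; every block except the last is sent without the flag, and the
    // final block is sent with the flag set, so the backend knows when to
    // assemble the complete video.
    public class VideoBlockSplitter {

        // Block size used in this post for one request (illustrative).
        public static final int BLOCK_SIZE = 2_000_000;

        // Splits a Base64 string into blocks of at most blockSize characters.
        public static List<String> split(String base64Video, int blockSize) {
            List<String> blocks = new ArrayList<>();
            for (int start = 0; start < base64Video.length(); start += blockSize) {
                int end = Math.min(start + blockSize, base64Video.length());
                blocks.add(base64Video.substring(start, end));
            }
            return blocks;
        }

        // Only the last block carries the "Flag" attribute checked by the backend.
        public static boolean isLastBlock(List<String> blocks, int index) {
            return index == blocks.size() - 1;
        }
    }
    ```

    In the app, each block would be sent in its own "createEntity" call, with the "Flag" property set only for the last one.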

    The entity set which is called by this request has a special attribute called "Flag". This flag is used to recognize whether the request is finally finished or not. If the request is not yet the last one, the transported BASE64 string is appended to the already received string, which is stored in a database table. The code for the whole procedure looks like the following:

    video_request_finished.jpg

    video_request_not_finished.jpg

    This was the last code snippet which I wanted to show you, so we've actually reached the end of this blog post. If you need further information about MMM, the Glass development or SAP NetWeaver Gateway, check out the following links:

    I really hope you enjoyed reading my new blog post and if you have additional questions don't hesitate to contact me!

     

    Kind regards

    Stefan



    This week was the second time that I conducted an SAP CodeJam event in Bielefeld at the location of our partner Lynx Consulting. The venue was the "knight's hall" of the Lynx building, which offered a nice view into the spacious garden (see below).


    Probably due to a strike of the train drivers in Germany, not all participants that had registered were able to join this nice event.

     

    WP_20150505_14_30_42_Pro.jpgIMG_5896.JPG

     

    We nevertheless had 15 motivated participants that wanted to learn about the E2E development scenario of SAP Fiori-like applications, starting with the development of an OData service in SAP Gateway and consuming it from an SAPUI5-based application.

     

    What really impressed me was that we had customers and partners that came from the Netherlands as well as from the south of Germany to join the event in Bielefeld. So I am thinking about offering such an event in or near Walldorf in the near future as well.

     

    My colleague Matthias Osswald and I started with presentations about SAP Web IDE, SAPUI5 and SAP Gateway that showed the development process, and finally the participants could get their hands dirty on SAP Gateway instances and the SAP Web IDE.

     

    Besides showing features and code, we also had time for interesting discussions amongst the participants. The weather was fine, too, so we could get out into the nice garden ...

    WP_20150505_13_04_33_Pro.jpg

     

     

     

    ... and as always there was plenty to eat and to drink ...

     

    IMG_5894.JPGFullSizeRender.jpg20150505_123232.jpg

     

    So if you are interested in another CodeJam in Bielefeld, there is an upcoming event about SAP HANA at Lynx in Bielefeld in June: SAP CodeJam Bielefeld Registration, 33615 Bielefeld | Eventbrite

     

    If you missed this event but are also interested in the E2E development process of an SAP Fiori-like application, you can join my colleague Frederic Berg and me at the upcoming SAP CodeJam in Hannover.



    Introduction

     

    First, let's try to understand what OData is about:

     

     

    1.) OData helps you build and consume RESTful web services.

    2.) It helps you focus on business logic rather than on request and response headers, status codes, HTTP methods, URL conventions, media types, payload formats, query options etc. that we would otherwise have to handle while developing RESTful web services.

    3.) Additionally, OData provides a facility for extension to fulfill any custom needs of your RESTful APIs.

    4.) OData (Open Data Protocol) is an OASIS standard.

     

    So now we can say that OData is a RESTful web service with a standardized XML format.

     

    Now, to create an OData server in Java, Apache provides a very useful library called Olingo.

    An Olingo project can be created with or without JPA, but when we create an OData project without JPA the following OData features cannot be used:

    $filter: works like the WHERE condition in SQL.

    $select: specifies the fields to be returned by the query.

    $top: works like ROWNUM in the query.

     

     

     

    Implementation:

     

    Now let's have a look at creating our first Olingo project:

     

    1.) Create a Maven project using the following command:

    mvn archetype:generate -DgroupId=com.sf.example -DartifactId=odata-sample -DarchetypeArtifactId=maven-archetype-quickstart -DinteractiveMode=false

     

    2.) Create a pom.xml file as attached; this will download all the required Olingo OData2 dependencies.

     

    3.) Now create the persistence.xml as shown:

              <?xml version="1.0" encoding="UTF-8"?>

                   <persistence version="2.0" xmlns="http://java.sun.com/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"                xsi:schemaLocation="http://java.sun.com/xml/ns/persistencehttp://java.sun.com/xml/ns/persistence/persistence_2_0.xsd">

                     <persistence-unit name="mysql" transaction-type="RESOURCE_LOCAL">

                     <provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>

                               <class>model.Employee</class>

                     <properties>

                                       <property name="javax.persistence.jdbc.url" value="jdbc:mysql://localhost:3306/EmpManagement" />

                                        <property name="javax.persistence.jdbc.driver" value="com.mysql.jdbc.Driver" />

                                        <property name="javax.persistence.jdbc.user" value="root" />

                                        <property name="javax.persistence.jdbc.password" value="root" />

                    </properties>

                    </persistence-unit>

                    </persistence>

     

    4.) Now create a table Employee in the database and create the entity class for it as shown:

     

         package model;

         import javax.persistence.Column;
         import javax.persistence.Entity;
         import javax.persistence.Id;
         import javax.persistence.Table;

         @Entity
         @Table(name = "Employee")
         public class Employee {

             @Id
             @Column(name = "EmplID")
             private String emplID;

             @Column(name = "FirstName")
             private String firstName;

             @Column(name = "LastName")
             private String lastName;

             public String getEmplID() {
                 return emplID;
             }
             public void setEmplID(String emplID) {
                 this.emplID = emplID;
             }
             public String getFirstName() {
                 return firstName;
             }
             public void setFirstName(String firstName) {
                 this.firstName = firstName;
             }
             public String getLastName() {
                 return lastName;
             }
             public void setLastName(String lastName) {
                 this.lastName = lastName;
             }
         }

     

    5.) Now let's create the most important file of our OData project, the service factory, as shown:

     

    package main;

    import javax.persistence.EntityManagerFactory;
    import javax.persistence.Persistence;

    import org.apache.olingo.odata2.jpa.processor.api.ODataJPAContext;
    import org.apache.olingo.odata2.jpa.processor.api.ODataJPAServiceFactory;
    import org.apache.olingo.odata2.jpa.processor.api.exception.ODataJPARuntimeException;

    public class PolicyServiceFactory extends ODataJPAServiceFactory {

        private static final String PERSISTENCE_UNIT_NAME = "mysql";

        @Override
        public ODataJPAContext initializeODataJPAContext() throws ODataJPARuntimeException {
            ODataJPAContext oDataJPAContext = this.getODataJPAContext();
            try {
                EntityManagerFactory emf = Persistence.createEntityManagerFactory(PERSISTENCE_UNIT_NAME);
                oDataJPAContext.setEntityManagerFactory(emf);
                oDataJPAContext.setPersistenceUnitName(PERSISTENCE_UNIT_NAME);
                return oDataJPAContext;
            } catch (Exception e) {
                e.printStackTrace();
                throw new RuntimeException(e);
            }
        }
    }
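    One piece is not shown in the post: the servlet registration in WEB-INF/web.xml that wires the factory into the WAR. A typical Olingo 2 setup looks roughly like the following; the servlet name and URL pattern are assumptions chosen to match the extdatasrc.svc URLs used in step 8:

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <web-app xmlns="http://java.sun.com/xml/ns/javaee" version="2.5">
      <servlet>
        <servlet-name>ODataServlet</servlet-name>
        <servlet-class>org.apache.cxf.jaxrs.servlet.CXFNonSpringJaxrsServlet</servlet-class>
        <init-param>
          <param-name>javax.ws.rs.Application</param-name>
          <param-value>org.apache.olingo.odata2.core.rest.app.ODataApplication</param-value>
        </init-param>
        <init-param>
          <!-- Points Olingo at our service factory class -->
          <param-name>org.apache.olingo.odata2.service.factory</param-name>
          <param-value>main.PolicyServiceFactory</param-value>
        </init-param>
      </servlet>
      <servlet-mapping>
        <servlet-name>ODataServlet</servlet-name>
        <url-pattern>/extdatasrc.svc/*</url-pattern>
      </servlet-mapping>
    </web-app>
    ```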

     

    6.) Now run the following Maven commands to compile the project and create the WAR file:

              mvn clean
              mvn compile
              mvn install

     

    7.) Now place the created WAR file in the webapps folder of Tomcat to deploy the project.

    8.) Once the project is successfully deployed, use the following URLs and see how easy it has become to access the database from the browser itself:

     

              http://localhost:8081/odata-sample/extdatasrc.svc/

     

    collection.png

    http://localhost:8081/odata-sample/extdatasrc.svc/Employees  

     

    This will fetch all the records in the Employee table as shown:

     

     

     

    EmployeeList.png

     

              http://localhost:8081/odata-sample/extdatasrc.svc/Employees?$filter=EmplID eq '1'

     

    This will fetch the record with EmplID equal to 1.

     

    http://localhost:8081/odata-sample/extdatasrc.svc/Employees?$top=1

     

    This will fetch the records with row number less than or equal to 1, i.e. only the first record.

     

    http://localhost:8081/odata-sample/extdatasrc.svc/Employees?$select=FirstName

     

    This will fetch all the records but only the selected column, FirstName in this case.



    Introduction

     

    In this blog I will give a solution to a typical error when reloading the metadata of an OData service.

     

    Symptom

     

    In transaction /IWFND/MAINT_SERVICE (Activate and Maintain Services), when reloading the metadata, an error message appears:

    Loading the metadata for service 'YOUR_SERVICE_NAME' failed. Check the error log. (/IWFND/CM_MGW066).

     

    Furthermore, the error log shows:

    ..ERROR_INFO    ABAP Dictionary element 'YOUR_STRUCTURE-DELETED_FIELD' not found (/IWFND/CM_MGW004)

     

    Reason

     

    The metadata cache contains the old version of the DDIC structures. If a field has been removed in the backend system, the metadata cache is not informed about this.

     

    Solution

     

    1. Start transaction /IWBEP/REG_SERVICE (Maintain Service)
    2. Put your service name in the 'Technical Service Name' field
    3. Click on 'Display'
    4. Click on 'Cleanup Cache' to clean up the metadata cache
    5. In transaction /IWFND/MAINT_SERVICE you can now repeat reloading the metadata
      (The cleanup is also possible from there: select your service and click on the '(Display) Service Implementation' button in the bottom right area 'System Aliases')

     

    Other Terms

     

    /IWFND/CM_MGW066, /IWFND/CM_MGW004, /IWFND/CM_MGW, ABAP Dictionary element not found, metadata cache, DDIC field deleted

     

    Scenario

     

    In this case I have:

    1. An SAP backend system with an OData mapping
      • Model Provider Class
      • Data Provider Class
        • a structure has been bound to the actual entity (/IWBEP/IF_MGW_ODATA_ENTITY_TYP=>BIND_STRUCTURE)
    2. An SAP NW Gateway with the OData service catalog


    Is there any tool for separately monitoring the SAP HANA Cloud database, like SQL Developer or SQLyog?



    Here are a few highlights from the SAP Gateway hands-on sessions conducted at the ongoing TechEd 2015 in Bangalore.

     

    Session: DEV360 - Advanced OData Service Development with SAP Gateway

     

    With the buzz around S/4HANA, it was clear that there would be a renewed need to supplement Business Suite capabilities, either as HCP extensions for the public cloud or as ABAP extensions for managed cloud / on-premise installations.

     

    The Advanced OData Service Development with SAP Gateway hands-on session on the day of the keynote couldn't have been scheduled in a better slot.

    The session saw a packed house, and we were concerned that more than a handful of participants might not find a vacant desktop to occupy!

     

    20150311_133414.jpg

     

     

    Arun started with a quick recap of Gateway concepts and explained the salient features of our design time tools via SEGW.

      

    After getting the developers onboarded onto their systems and handing out the exercise files, Kranti took over and explained framework features of $expand and how self-expand can be utilized, backed up with some perceivable improvements in response time.

     

    Arun then went on to speak about server-side caching capabilities via soft state and its relevance, and guided the audience through the exercise files.

     

    It was my turn to put client-side caching into perspective via ETag support for conditional read and update requests.

     

    In a nutshell it was a well-balanced distribution between 'slide talk' and actual hands-on. At the end of the two hours, as we were winding up, we were glad to see happy faces leave the hall having completed the exercises, and hopefully having added a trick or two to their knowhow of OData service development.

     

     

    Session: INT263 - Consuming SAP data in Google Apps leveraging SAP Gateway

     

    As an exciting day 1 was drawing to a close, we were skeptical of how many people would show up for the hands-on session on Consuming SAP data in Google Apps leveraging SAP Gateway, which was coincidentally timed exactly around the afternoon snacks interval.

     

    But our fears were put to rest as participants started pouring in and it was just a matter of time before all seats were taken, so much so that we had to create additional backend users to account for the deluge of participants.

     

    20150311_160942.jpg

     

     

     

    Balu began with an overview of SAP Gateway, its architecture in brief, and its relevance for user-centric applications.

     

    Soon Chandan took over and started off the exercises to create a backend entity via Gateway tools, and after having dealt with that portion satisfactorily, we started the much-awaited segment of weaving a consumption story around Google APIs.

     

    I started off with some context around Google Apps Script and showed a few examples with Gmail and Forms integration, and we then led the participants to round it up with a spreadsheet app that fetched business partner and sales order data from the Gateway service.

     

    Understandably, with questions and queries pouring in, we soon ran out of time, and thankfully we were able to extend the session by a good 15 minutes before we wound up an interesting day at TechEd 2015.

     

     

     

    -Vinayak Adkoli

     

     

     

     



    SAP_CodeJam_Freiburg.jpg

     

    On July 23rd we will again have an SAP CodeJam about SAP Gateway. After two events in the north of Germany, namely in Bielefeld and Hannover, we will now visit the south of Germany.

     

    The event will take place on the premises of Haufe-Lexware GmbH & Co. KG, who will kindly provide a location for this event.

     

    The topic will again be the end-to-end development of an SAP Fiori-like application, with a focus on the OData service development.

     

    We will show the development of a simple OData service, how you can use SAP Web IDE to quickly generate a SAPUI5 application based on a template that consumes this service, and how to adapt it.

     

    If you have not worked with the Service Builder so far, here is your chance to get your hands dirty. For the experienced developer there will be the chance to try out new features like soft state, which we added to our stack recently, and to have some in-depth discussions about the product and its roadmap with representatives of the SAP Gateway development team.

     

    If you are interested you can register here:

     

    Best Regards,

    Andre



    As the title says, this blog does not introduce the OData trace functionality itself, but shows how to find and figure out the usage of the trace functionality by yourself, for example by finding the corresponding transaction code or report name to launch the trace.

     

    Actually, this approach is not dedicated to Gateway but applies generically to any similar scenario:

     

    - You have found a switch or flag evaluation in some ABAP source code which dynamically controls the enablement of certain functionality, and you need to know where and how you can access this switchable function.

     

    For example, in the Gateway system, I found two flags which enable or disable the OData trace:

    clipboard1.png

    I need to find out how to perform the OData trace from the source code alone, without any debugging at runtime.

     

    Step 1: Perform a where-used list on mv_perf_level:

    clipboard2.png

    7 hits. Based on experience, I can judge that line 100 fills the attribute with a value fetched from a DB table via Open SQL.

    Double-click line 100.

    clipboard3.png

    Step 2: Now I have found the configuration table which stores the trace configuration. Perform a where-used list on this table:


    clipboard4.png

    The second report, /IWFND/SUTIL_TRACE_CONFIG, is what I am looking for: the one that launches the OData trace UI.

    clipboard5.png

    To verify, simply execute it. And that's it. After I make the following settings and click the Save button:

    clipboard6.png

    a corresponding entry is persisted in the table I found in this step.

    clipboard7.png

    Step 3: I am also curious about when the other flag, mv_odata_trace_active, can become true. The same approach applies.

     

    Check the result. Based on experience, only the first method, ENABLE_ODATA_TRACE, performs a write access on the flag; all the rest are read accesses such as IF mv_odata_trace_active = abap_true.

    clipboard10.png

    Double-click on ENABLE_ODATA_TRACE, and we learn that the flag becomes true if io_context->debug is true.

    clipboard11.png

    So now we research /IWCOR/IF_DS_CNTXT instead:

    clipboard12.png

    Again, the attribute DEBUG of the interface can only become true in the constructor method of the implementation class; all other 41 hits are read accesses and can be ignored.

    clipboard13.png

    So perform the where-used list on the constructor method:


    clipboard14.png

    Here we are very close to the target:

    clipboard15.png

    Just scroll up, and we get the result. The other flag can only become true when two prerequisites are met:

     

    1. The query parameter sap-ds-debug is defined in the OData request URL.

    2. The current user has debug authorization, that is, passes the check by function module SYSTEM_DEBUG_AUTHORITY_CHECK.

    clipboard16.png
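For illustration, here is a small sketch (in Python; the service path and the helper function are made up, while sap-ds-debug is the query parameter we traced above) of how the two prerequisites combine:

```python
from urllib.parse import urlencode, urlsplit, parse_qs

# Hypothetical service path, for illustration only
base = "/sap/opu/odata/SAP/ZDEMO_SRV/EmployeeSet"
url = base + "?" + urlencode({"sap-ds-debug": "true"})

def debug_flag(request_url, user_has_debug_auth):
    # The flag becomes true only if BOTH prerequisites are met:
    # 1. sap-ds-debug appears in the request query string
    # 2. the user passes the debug authority check
    qs = parse_qs(urlsplit(request_url).query)
    return "sap-ds-debug" in qs and user_has_debug_auth

print(url)                     # .../EmployeeSet?sap-ds-debug=true
print(debug_flag(url, True))   # True: both prerequisites met
print(debug_flag(url, False))  # False: no debug authorization
```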



    Hi All,


    This document describes how to create a trusted RFC connection between the Gateway and back-end systems.


    Benefit of a trusted RFC: passwords are no longer sent when logging on to the trusting system.


    Case Study:


    ND1 is my Front End system (Gateway)

    DO1 is my Back End system (ECC)


    Prerequisites:

    1.   The user must be the same in both systems
    2.   A role with the S_RFCACL authorization must be assigned to the user

    Then we create a trusted RFC destination in both systems.


    From ND1 to the DO1 system: create the trusted RFC destination with the technical settings only.

    1.JPG

    The connection test now works fine.

    2.JPG

    Then go to transaction SMT1 in the ND1 system.

    3.JPG

    Create the trusted system from ND1 by following the wizard.

    Enter the RFC destination which we created for the back-end system.

    5.JPG

    Select Continue; the details below are filled in automatically.

    6.JPG

    Here we provide the user name and password for the DO1 system.

    Set the validity period to 00:00:00 and select Continue.

    7.JPG

    Select Complete to finish the wizard.

    8.JPG

    As in the ND1 system, we have to create the trusted RFC destination (ND1_TRUSTED) in the DO1 system as well, then go to SMT1 and follow the wizard as shown below.

    From the DO1 system we create the trusted relationship to ND1:

    Select Continue

    9.JPG

    Enter the RFC Destination

    10.JPG

    Here we provide the user name and password for the ND1 system.

    11.JPG

    Select Continue

    12.JPG

    Set the validity period to 00:00:00.

    13.JPG

    Select Complete to finish the wizard.

    14.JPG

    Now create the RFC destination to connect to the DO1 system from the ND1 system, and select Yes in the Trust Relationship tab.

    15.JPG

    Check the connection test and the remote logon test; the remote logon should take you to the DO1 system.


    Now create the RFC destination to connect to the ND1 system from the DO1 system, and select Yes in the Trust Relationship tab.

    16.JPG

    Check the connection test and the remote logon test; the remote logon should take you to the ND1 system.


    The trusted RFC connection between the front-end and back-end systems is now successfully completed.


    Thank you.


    Pagination decrypted... (07/09/15)

    Introduction

    With this blog, I want to share some aspects of pagination that I have come across in Gateway projects.

     

    Pagination

    Pagination is the process of showing only a selected subset of records from the total list. A typical example is a word processor such as Microsoft Word, which has page breaks inside a document; later pages are rendered only on demand. I will analyse pagination from a Gateway point of view and its impact on performance.

     

    As an initial step, I have set up some test data.

     

    Data setup

    I have two tables: one an employee table and the other a text table for that employee table.

     

    Employee table

     

    Emp_table.jpg

     

    Employee text table

     

    EMp_text_table.jpg

    I also maintained some dummy values for both English and German. Forgive me for my linguistic blunders.

     

    Employee table entries

    emp_table_entries.jpg

     

    Employee text table entries

    emp_text_table_entries.jpg

     

    We have two types of Pagination

     

    1. Client-side pagination

    Here the client calls the shots: the client decides the number of records to be loaded. This does not mean the client fetches all the records from the backend and buffers them; it requests further records only as the user asks for them (e.g. scrolls down).
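With client-side pagination the client simply computes $skip and $top for each page it wants. A minimal sketch (in Python; the helper name is made up, the service path is the one used later in this blog):

```python
def page_url(service, entity_set, page, page_size):
    # Page n of size k maps to $skip=(n-1)*k and $top=k
    skip = (page - 1) * page_size
    return f"{service}/{entity_set}?$skip={skip}&$top={page_size}"

# Second page of two records, as fired against the demo service below
print(page_url("/sap/opu/odata/SAP/ZARS_PAGE_SRV", "EmployeeSet", 2, 2))
# -> /sap/opu/odata/SAP/ZARS_PAGE_SRV/EmployeeSet?$skip=2&$top=2
```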

     

    Not so good approach

     

    I have seen projects implement this kind of pagination by fetching the entire record set from the backend and then filtering the records in the Gateway layer. This hits the performance of the UI application hardest. It can also happen with standard BAPIs where there is no option to restrict the maximum number of records (no iv_max parameter); the only option is then to apply the filter criteria to the full set of records.

     

    We will simulate this "not so good" approach by creating an FM API which fetches all the records from the tables above.

    FM_GET_EMP.jpg

    In this case there is no iv_max or up-to parameter in the FM interface.

    We map the FM to the entity and execute it from the URL.

    EMP_MAP.jpg

    As you know, in this case the DPC class code is generated automatically for this query. If you have a look at the corresponding entity's GET_ENTITYSET method, you can see that it performs pagination only after fetching all the records from the DB.

    Pagination_bad.jpg

     

    The matter becomes worse when you also have multiple sort parameters to handle. Here you must take care of the sequence in which you do ordering and pagination: first sorting, then pagination, otherwise you will get weird results. Moreover, the sorting part is not automatically handled by the framework method. So if I run the URLs below in the Gateway client:

     

    1 /sap/opu/odata/SAP/ZARS_PAGE_SRV/EmployeeSet?$skip=2&$top=2

    I get correct results (even though performance is not good)

    GW_Client_rsult.jpg

    2  /sap/opu/odata/SAP/ZARS_PAGE_SRV/EmployeeSet?$skip=2&$top=2&$orderby=Id desc

    This still yields the same results; the $orderby is not applied.

     

    You need to handle the sorting explicitly, as described in the blog below:

    SAP GW - Implement a better OrderBy for Cust EXT Class

    I am copying that piece of code here:

    
    

     

    METHOD orderby.
      DATA: lt_otab TYPE abap_sortorder_tab,
            ls_otab TYPE abap_sortorder.
      DATA: ls_order LIKE LINE OF it_order.
      CONSTANTS: BEGIN OF lcs_sorting_order,
                   descending TYPE string VALUE 'desc',
                   ascending  TYPE string VALUE 'asc',
                 END OF lcs_sorting_order.
      LOOP AT it_order INTO ls_order.
        ls_otab-name = ls_order-property.
        IF ls_order-order = lcs_sorting_order-descending.
          ls_otab-descending = abap_true.
        ELSE.
          ls_otab-descending = abap_false.
        ENDIF.
        APPEND ls_otab TO lt_otab.
      ENDLOOP.
      SORT ct_data BY (lt_otab).
    ENDMETHOD.

    PS: The it_order table (from io_tech_request_context) mentioned in the blog contains the ABAP field names of the Gateway properties. In most cases these differ from the field names of the ABAP backend API result.

    For example, I have changed the ABAP field name of the GW property Id to EMPLOYEEID.

    abap_field_name.jpg

    This will trigger a corresponding change in the DPC code

    abap_field_name1.jpg

    So in this case, to handle sorting along with pagination, you have to perform the following steps:

     

    1. Loop through the backend internal table to fill et_entityset (mapping Id to EmployeeId)

    2. Pass et_entityset to the sorting utility method (as mentioned in the blog) to get back the sorted table

    3. Do the pagination using the lv_skip and lv_top parameters
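Why the sequence of these steps matters can be shown with a tiny illustration (in Python, with made-up ids): slicing before sorting returns different rows than sorting before slicing.

```python
records = [{"id": i} for i in (5, 3, 1, 4, 2)]
skip, top = 1, 2

# Correct: sort first, then apply $skip/$top
right = sorted(records, key=lambda r: r["id"])[skip:skip + top]

# Wrong: apply $skip/$top first, then sort the slice
wrong = sorted(records[skip:skip + top], key=lambda r: r["id"])

print([r["id"] for r in right])  # [2, 3]
print([r["id"] for r in wrong])  # [1, 3]
```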

     

    The sample code is as below

    * Map properties from the backend to the Gateway output response table
    DATA: lt_entityset_bfr_paging TYPE zcl_zars_page_mpc=>tt_employee.

    LOOP AT et_data INTO ls_et_data.
    * Only fields that were mapped will be delivered to the response table
      ls_gw_et_data-text       = ls_et_data-text.
      ls_gw_et_data-name       = ls_et_data-name.
      ls_gw_et_data-employeeid = ls_et_data-id.
      APPEND ls_gw_et_data TO lt_entityset_bfr_paging.
      CLEAR ls_gw_et_data.
    ENDLOOP.

    lt_tech_order = io_tech_request_context->get_orderby( ).

    * Now pass this to the utility method for sorting
    LOOP AT lt_tech_order INTO ls_tech_order.
      ls_otab-name = ls_tech_order-property.
      IF ls_tech_order-order = lcs_sorting_order-descending.
        ls_otab-descending = abap_true.
      ELSE.
        ls_otab-descending = abap_false.
      ENDIF.
      APPEND ls_otab TO lt_otab.
    ENDLOOP.
    SORT lt_entityset_bfr_paging BY (lt_otab).

    * Pagination: provide the response entries according to the $top and $skip
    * parameters that were provided at runtime
    LOOP AT lt_entityset_bfr_paging INTO ls_gw_et_data FROM lv_skip TO lv_top.
      APPEND ls_gw_et_data TO et_entityset.
    ENDLOOP.

    Here we see a lot of unnecessary loops being executed to achieve the correct result. To avoid these loops there is another way: yes, you guessed it, push the pagination to the backend side. Let's look at this approach next.

     

    Good approach..

     

    Here we pass the skip and top parameters to the backend FM, so we select (almost) only the required records in the backend.

    If we say skip 2 and top 4, it means records 3 to 6 should be selected.

     

    As a first step, we go ahead and change the backend API interface.

    FM_new_API.jpg

     

    Now we change the source code to handle pagination

    
    
    IF iv_skip IS NOT INITIAL AND iv_top IS NOT INITIAL.
      DATA(lv_up_to) = iv_skip + iv_top.
    ELSEIF iv_skip IS INITIAL AND iv_top IS NOT INITIAL.
      lv_up_to = iv_top.
    ENDIF.

    SELECT emp~id emp~name empt~text
      FROM zars_emp AS emp
      INNER JOIN zars_empt AS empt ON emp~id = empt~id
      INTO TABLE et_data
      UP TO lv_up_to ROWS
      WHERE empt~spras EQ sy-langu.

    IF iv_skip IS NOT INITIAL.
      DELETE et_data FROM 1 TO iv_skip.
    ENDIF.

    Here I select records with the UP TO clause and delete all records whose index is less than or equal to iv_skip. I agree that this is not the most efficient way of doing pagination, but we have to live with it, as $skip and $top are not directly supported in an Open SQL SELECT.
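The arithmetic of this workaround, sketched outside ABAP (in Python; the helper name is made up): read skip + top rows and then drop the first skip of them.

```python
def paginate_via_up_to(rows, skip, top):
    # Mirrors SELECT ... UP TO (skip + top) ROWS followed by
    # DELETE ... FROM 1 TO skip
    up_to = skip + top if skip else top
    fetched = rows[:up_to]
    return fetched[skip:] if skip else fetched

data = list(range(1, 11))              # ids 1..10
print(paginate_via_up_to(data, 2, 4))  # skip 2, top 4 -> records 3..6
```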

    There are workarounds with AMDPs (ABAP Managed Database Procedures), which support LIMIT ($top) and OFFSET ($skip), but the issue is that they do not support dynamic values. Another option is dynamic native SQL using ADBC, but that also comes with its own disadvantages.

     

    Now we need to redefine the DPC_EXT GET_ENTITYSET method to integrate this call...

     

    lv_rfc_name = 'ZARS_FM_GET_EMP'.
    IF lv_destination IS INITIAL OR lv_destination EQ 'NONE'.
      TRY.
          CALL FUNCTION lv_rfc_name
            EXPORTING
              iv_skip        = ls_paging-skip
              iv_top         = ls_paging-top
            IMPORTING
              et_data        = et_data
            EXCEPTIONS
              system_failure = 1000 MESSAGE lv_exc_msg
              OTHERS         = 1002.
          lv_subrc = sy-subrc.
    * In case of co-deployment the exception is raised and needs to be caught
        CATCH cx_root INTO lx_root.
          lv_subrc = 1001.
          lv_exc_msg = lx_root->if_message~get_text( ).
      ENDTRY.
    ENDIF.

    LOOP AT et_data INTO ls_et_data.
    * Only fields that were mapped will be delivered to the response table
      ls_gw_et_data-text       = ls_et_data-text.
      ls_gw_et_data-name       = ls_et_data-name.
      ls_gw_et_data-employeeid = ls_et_data-id.
      APPEND ls_gw_et_data TO et_entityset.
      CLEAR ls_gw_et_data.
    ENDLOOP.


    It's time to test the code. We head to the Gateway client and run the URL below:

    /sap/opu/odata/SAP/ZARS_PAGE_SRV/EmployeeSet?$skip=2&$top=3

     

    correct_results.jpg

     

     

    As you can see, here we get the correct results.

     


    Closing comments..

     

    Here I have tried to explain an (almost) efficient way of doing client-side pagination. In my next blogs I will cover server-side pagination and pagination with CDS views/AMDPs. Please let me know your comments on the blog.



    Now a 4 day course

     

    The new version of our classroom training GW100 is now available as a 4-day course, since we have added various new topics to the content.

     

     

    What's in?

     

    The course covers the development of OData services which are the basis of any SAP Fiori application.

     

    In the first unit we start with an introduction to SAP Gateway, where you will learn about the benefits of the OData protocol, the SAP products that use the OData protocol, the architecture of SAP Gateway and the different deployment options.

     

    In unit 2 we move on to explain the term REST; you will examine an OData service and learn how to perform OData operations.

     

    Unit 3 covers the code-based implementation of an OData service in SAP Gateway. You will learn about the toolset available to you as a developer and how to implement the Create, Read, Update, Delete and Query operations. In addition, we will show you how to implement navigation, filtering and paging.

     

    Unit 4 is about how to generate OData services with SAP Gateway without writing a single line of code. Here you will learn how to generate OData services based on an RFC function module, a search help, and the new CDS views that are available with SAP NetWeaver 7.40.

     

    In unit 5 we explain how to extend an OData Service in SAP Gateway.

     

    In unit 6 you will learn about the functionalities of the SAP Gateway hub: the support of multiple backend systems (multi-origin composition and routing) and the My Inbox app with the TaskGateway service.

     

    Unit 7 is about advanced OData operations such as function imports, $expand and deep insert. We also cover the handling of ETags, delta queries and $batch.

     

    Unit 8 is dedicated to the security of SAP Gateway.

     

    In unit 9 you will finally learn how to use SAP Web IDE to generate a SAPUI5 application.

     

    Why should you attend?

     

    This course is meant for ABAP developers who want to learn how to build OData services with SAP Gateway, so that they can leverage these services to build their own custom SAP Fiori applications.

     

    Since an OData service is at the heart of any SAP Fiori application, knowing how to create these services is essential.

     

    How can I attend?

     

    The GW100 classroom training can be booked via our  SAP Training and certification shop

     

     

    Best Regards,

    Andre



    Introduction

    In the previous blog Pagination decrypted... we saw how to implement client-side pagination efficiently. In this blog I will introduce server-side pagination, which is used mostly in offline scenarios and is generally considered better than client-side pagination.

     

    Server says "I will decide".Client says "I am delegating"

    Client and server will not always be on the same page (both have their own ego). Let's look at a fireside-chat scenario:


    The server says: "I know how many records there are in the backend; let me decide the packet size to be sent across the wire."

    The client says: "It is because I delegate the task to you (more like a manager) that you are able to decide."

     

    The client just delegates paging to the server, and the server provides a chunk of data along with the total number of available items and a link for requesting more data. If you are an OData producer, don't forget to set the entity set page size limit. If you are a consumer, look out for the "next" link in the feed; make use of it and loop through all the pages accordingly.
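The consumer-side loop can be sketched as follows (in Python; the responses are simulated in memory rather than fetched over HTTP, and the entity set name and token format follow this blog's example):

```python
# Simulated server pages keyed by skiptoken; a real consumer would issue HTTP GETs
pages = {
    None: {"results": [1, 2, 3], "__next": "EmployeeServerSet?$skiptoken=3"},
    "3":  {"results": [4, 5, 6], "__next": "EmployeeServerSet?$skiptoken=6"},
    "6":  {"results": [7, 8]},  # no __next link: last page
}

def read_all():
    token, out = None, []
    while True:
        body = pages[token]           # stand-in for one HTTP round trip
        out.extend(body["results"])
        nxt = body.get("__next")
        if not nxt:                   # stop when the server sends no next link
            return out
        token = nxt.split("$skiptoken=")[1]

print(read_all())  # [1, 2, 3, 4, 5, 6, 7, 8]
```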

     

     

    Implementing server side pagination


    Implementing server side pagination is pretty easy. The following steps have to be performed

    • Read the $skiptoken parameter from the client request
    • Decide on the packet size of the data to be sent back to the UI
    • If the number of result entries is greater than the packet size, split the result into packets
    • Set the last index of the returned packet as the skiptoken for the client's subsequent request
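The steps above can be sketched in a few lines (Python for brevity; packet size 3 as used in the backend API below):

```python
PACKET_SIZE = 3

def get_entityset(all_rows, skiptoken=0):
    # Steps 1-3: read the token and cut one packet out of the result
    chunk = all_rows[skiptoken:skiptoken + PACKET_SIZE]
    # Step 4: the last index becomes the skiptoken of the follow-up request,
    # or None when everything has been delivered
    nxt = skiptoken + PACKET_SIZE
    next_token = nxt if nxt < len(all_rows) else None
    return chunk, next_token

rows = list(range(1, 8))        # 7 records
print(get_entityset(rows, 0))   # ([1, 2, 3], 3)
print(get_entityset(rows, 3))   # ([4, 5, 6], 6)
print(get_entityset(rows, 6))   # ([7], None)
```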


    Enough of the lecture... let's get our hands dirty and do something in the system.

    We will implement server-side paging for the test data from my previous blog.


    Data Modeling


    I created a new entity type and entity set, named EmployeeServer/EmployeeServerSet, similar to the Employee entity in my previous blog Pagination decrypted...

    server_side_paging_1.jpg



    Backend API

     

    The backend API has two importing parameters: one for the skiptoken and one for the packet size, which I have defaulted to 3.

     

    server_side_paging_3.jpg

    Here the exporting parameter EV_END_INDEX is returned for building the link to the subsequent request, EV_TOTAL_COUNT is for the inline count, and ET_DATA is for the actual data.

     

    server_side_paging4.jpg

     

    Now we need to handle these importing parameters in the source code.

     

    * Find the number of rows to be selected from the DB
    IF iv_skiptoken IS NOT INITIAL AND iv_packet_size IS NOT INITIAL.
      DATA(lv_up_to) = iv_skiptoken + iv_packet_size.
    ELSEIF iv_skiptoken IS INITIAL AND iv_packet_size IS NOT INITIAL.
      lv_up_to = iv_packet_size.
    ENDIF.

    * Select data from the table, up to n rows
    SELECT emp~id emp~name empt~text
      FROM zars_emp AS emp
      INNER JOIN zars_empt AS empt ON emp~id = empt~id
      INTO TABLE et_data
      UP TO lv_up_to ROWS
      WHERE empt~spras EQ sy-langu.

    * Delete the initial skip entries
    IF iv_skiptoken IS NOT INITIAL.
      DELETE et_data FROM 1 TO iv_skiptoken.
    ENDIF.

    * Send back the end index for triggering the subsequent request,
    * but only if the DB holds more records than the up-to count
    SELECT COUNT(*) FROM zars_emp INTO @DATA(lv_total_count).
    ev_total_count = lv_total_count.
    IF lv_up_to LT lv_total_count.
      ev_end_index = lv_up_to.
    ENDIF.

    From the above code snippet, you can see that it performs the following steps:

    1. Find the number of rows to be selected from the DB (the up-to value).

    2. Select the data from the DB.

    3. Get the total count to be passed back as the inline count (useful if the UI has to show the user something like "3/10").

    4. Return the index of the last selected row as the next-link reference for firing the subsequent request.

     

    One minor improvement is to determine the total count (i.e. 10 in this case) only if the consumer/client has actually requested it.

    So we add a new interface parameter to the API which indicates whether the total inline count is requested by the consumer.

     

    server_side_paging_impr.jpg

     

    Only query the count if it is requested

     

    IF iv_total_count_flag EQ abap_true.
      SELECT COUNT(*) FROM zars_emp INTO @DATA(lv_total_count).
      ev_total_count = lv_total_count.
    ENDIF.

     

    Testing

     

    We head to the gateway client for testing .

     

    First we fire the request without any skiptoken; we expect three rows, as we have set the packet size to 3.

    We fire the URL /sap/opu/odata/SAP/ZARS_PAGE_SRV/EmployeeServerSet?$inlinecount=allpages

     

    server_side_paging_gw_client_1.jpg

     

     

     

    Next we copy the "Next Ref" link, append it to the existing URL, and fire the next request. This time we expect the next packet, record numbers 4, 5 and 6.

     

     

    /sap/opu/odata/SAP/ZARS_PAGE_SRV/EmployeeServerSet?$skiptoken=3%20

     

     

    server_side_paging_gw_client_2.jpg

    As expected, it returns the next packet together with a link for the next call.

     

    Conclusion

    The choice between client-side and server-side pagination depends on the particular scenario. I personally prefer server-side pagination, as it gives more flexibility: if the application scales, you control it from the server side and do not need to make any changes on the client side. But there is a catch: your backend SQL should be fine-tuned so that you select only the required records from the DB. That is the reason why I am a huge advocate of LIMIT and OFFSET in Open SQL; I hope SAP lends an ear...

     

    In my next blog on pagination, I will discuss how pagination can be applied to our new-age artifacts such as CDS views and AMDPs.

    Stay tuned..



    I was randomly browsing about OData services in SAP NetWeaver Gateway systems and ended up reading about Eclipse's Ogee project. It caught my attention, so I thought of sharing it with everyone here in the community, hearing your views on it, and at the same time asking SAP experts whether there are any plans down the line to use this Eclipse open-source project.

     

    Now you may be wondering: what is 'Eclipse Ogee'?

    Let me give you a very brief idea about it.

    Ogee is a plug-in provided by Eclipse: an open-source developer tool for the provisioning and consumption of OData services, regardless of the service provisioning platform and the service consuming technology; for example, it can consume services from an SAP NetWeaver Gateway system. In my view it will be a complementary tool for developers working on OData models, as it lessens the developer's effort. It comprises a graphical OData Model Editor for the creation of OData models that define the required data structure, which seems like a very cool feature. The models can then be exported as a service metadata document.

    As a plug-in, Ogee lowers the barrier for developers who want to produce and consume data by bringing the OData advantages to the fingertips of Eclipse developers.

     

    Ogee allows developers to perform the following operations:

    1. Model new OData services.
    2. Visualize and explore existing OData services.
    3. Validate the OData model against the OData specification.
    4. Export the OData models as a service metadata document against the selected OData specification.
    5. Extend the Ogee framework for:
      • Consumption from other environments
      • Provisioning of OData services for different provider environments
      • Importing OData service models from different sources
    6. The architecture of the project is open and extensible through Eclipse extension points.

     

    If you want to know more about this, below link will give you more insights regarding this:

    https://projects.eclipse.org/projects/technology.ogee

     

    Sounds interesting, right? Let me know what you think of this new plug-in in the comments section below.



    After initiating a strategic partnership with Apigee last year, SAP delivered an on-premise API management solution based on the Apigee Edge intelligent API management platform in 2014. In June 2015, SAP launched SAP API Management, a cloud offering powered by SAP HANA Cloud Platform that provides scalability to meet demand as needed at low investment cost, amortized over a subscription model. Since its launch there has been tremendous interest from customers, as these products provide a quick, easy and low-investment path to the digital transformation of your business.


    Running on SAP HANA® Cloud Platform, the SAP® API Management technology enables organizations that publish APIs to share digital assets and processes. It also enables developer communities to consume digital assets with ease in new channels, devices, and user interfaces. Learn more about it at SCN API Management Community. SAP Gateway is a technology that enables a simple way to connect devices, environments and platforms to SAP software based on OData standard. SAP Gateway is one of the important API sources that work seamlessly with the API management solution. Learn more about it at SCN Gateway Community. Below is an overview of how SAP API management integrates with the SAP product family including SAP Gateway:


    APIMgtArchitecture.png 

     

    Do you want to check out these solutions, see the demos and have a face-to-face conversation with SAP experts about Gateway and API Management? Then SAP TechEd 2015 in Las Vegas (Venue: Venetian|Palazzo Congress Center, Date: Oct 19-23, 2015) is a great opportunity for you. The easiest way to connect with SAP experts is to visit our pod. Here are the details:

    • Pod Headline: API Process Integration For Your Digital Economy
    • Pod Location: Explore (Hall C Level 2)
    • Pod Hours: Tuesday–Thursday • 10:00 a.m. - 6:00 p.m.
    • SAP experts present at the POD:

                   - Michael Hill- Director, HANA Cloud Platform Product Marketing at SAP

                   - Elijah Martinez- Product Manager, SAP Gateway and API Management at SAP

     

    Here is what our team offers at TechEd 2015, Las Vegas:

     

    1. Lectures, hands-on and other sessions on SAP API Management and SAP Gateway:

     

    Session ID | Session Type | Duration | Title | Speakers | Session Date/Time
    INT100 | Lecture | 1 hr | Integration and Orchestration: Overview and Outlook | Christoph Liebig | Tue: 11:00am-12:00pm; Wed: 10:30am-11:30am
    INT103 | Lecture | 1 hr | Apply REST with SAP API Management | Harsh Jegadeesan | Tue: 2:00pm-3:00pm; Thu: 5:45pm-6:45pm
    INT104 | Lecture | 1 hr | OData Service Deployment Options for On-Premise, Mobile and Cloud Scenarios | Christian Loos, Stephan Herbert | Tue: 12:15pm-1:15pm; Wed: 9:15am-10:15am
    INT105 | Lecture | 1 hr | Intel Case Study: Enable Your Business with SAP Gateway | Stephan Herbert | Wed: 5:45pm-6:45pm; Thu: 2:00pm-3:00pm
    INT201 | Lecture | 1 hr | How to Deal with Integration and Orchestration Aspects of IoT Scenarios | Stephan Herbert | Wed: 11:45am-12:45pm; Thu: 8:00am-9:00am
    INT208 | Lecture | 1 hr | Enabling Shadow IT with SAP Gateway | Brian ONeill | Thu: 4:30pm-5:30pm
    INT260 | Hands-On | 4 hrs | Develop an E2E Integration Scenario with SAP Gateway, SAP HANA, and SAPUI5 | John Patterson, Oliver Heinrich, Joerg Singler, Andre Fischer | Wed: 8:00am-12:00pm; Thu: 8:00am-12:00pm
    INT269 | Hands-On | 2 hrs | SAP API Management: On Demand and on Premise | Peter Ng, Harsh Jegadeesan | Tue: 4:30pm-6:30pm; Wed: 2:00pm-4:00pm
    INT600 | Mini Code Jam | 1 hr | Build Your First SAP Gateway App (OData) with SAP API Management | Harsh Jegadeesan | Wed: 12:30pm-1:30pm
    INT802 | Road Map Q&A | 1 hr | Road Map Q&A: SAP Gateway | Stephan Herbert | Tue: 1:15pm-2:15pm; Thu: 4:00pm-5:00pm
    INT804 | Road Map Q&A | 1 hr | Road Map Q&A: SAP API Management | Harsh Jegadeesan | Tue: 3:15pm-4:15pm; Wed: 4:00pm-5:00pm

     

    2. Demos shown at the pod:

     

    Visit and interact with our experts at the POD at Explore zone (Hall C Level 2). There will be several innovative demos featuring entire Integration and Orchestration product portfolio that covers Gateway, API Management, HANA Cloud Integration, Process Integration, Operational Intelligence and much more. Here is the preview of one of the demos powered by SAP API Management that you can experience live at the POD:


     

    3. Pod Expert Networking Sessions

     

    More information coming soon.

     

    4. Tech Talk:

     

    Time: Wed 3:15 p.m. - 4:00 p.m. (a 30-35 minute talk followed by a 10-15 minute Q&A).

    Presenter: Ed Anuff, VP of Product Strategy at Apigee

     

    For more information, check out following social channels:

     

    1. SAP Community Network: Gateway   API Management

    2. YouTube: Gateway    API Management

    3. Twitter: SAP Cloud   SAP Developers

    4. Facebook: SAP Cloud   SAP Developers

    5. SAP API Management Web site

     

     

    SAP API management and Gateway Product Management Team



    Knowledge preparation:

    Topic:Expand in Framework and Data Provider

    URI:    Expand in Framework and Data Provider - SAP NetWeaver Gateway - SAP Library


    A request with a $expand query option enables the reading of entries of an entity together with an associated entity.
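For illustration, a sketch (in Python, with a made-up task/log payload and URL) of the nested response shape such a request returns:

```python
# GET .../TaskCollection('T1')?$expand=ApplicationLogCollection  (illustrative)
response = {
    "d": {
        "TaskId": "T1",
        "ApplicationLogCollection": {          # associated entries come inlined
            "results": [
                {"LogId": "L1", "Message": "created"},
                {"LogId": "L2", "Message": "released"},
            ]
        },
    }
}

task = response["d"]
logs = task["ApplicationLogCollection"]["results"]
print(task["TaskId"], [log["LogId"] for log in logs])  # T1 ['L1', 'L2']
```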


    Below are some blogs posted on SCN about $expand; I would like to acknowledge them here.

    Step-by-step guide to build an OData Service based on RFCs – Part 3

     

     

    Ok, now let us start.

     

    1. Create an entity. Entity Name: ApplicationLog; Entity Set Name: ApplicationLogCollection.

     

    Image.png

     

    2. Set the ABAP structure and properties for the entity ApplicationLog.

     

    Image.png

     

     

    Image.png

     

    3. Create an association with the name Task_To_M_ApplicationLog.

     

    Image.png

     

    Click [Next] and fill in the referential constraints.

    Image.png

     

    Click [Next], then click [Finish].

     

     

     

    4. Click [Generate Runtime Objects] and make sure you see the message "Runtime Objects for project 'xxxx' generated".


    Image.png

     

    5. Test the metadata.

     

    6. Create the class CL_CRM_ODATA_RT_APPLICATIONLOG (runtime class for ApplicationLog).

    Code Snippet:

    Class:CL_CRM_ODATA_RT_ApplicationLog

    Method: /IWBEP/IF_MGW_APPL_SRV_RUNTIME~GET_ENTITYSET

     

    DATA: applicationlog_get_entityset TYPE cl_crm_odata_mpc=>tt_applicationlog.

    " EntitySet - ApplicationLogCollection
    IF iv_entity_set_name = gc_application_log_collection.
      applicationlog_get_entityset(
        EXPORTING
          iv_entity_name           = iv_entity_name
          iv_entity_set_name       = iv_entity_set_name
          iv_source_name           = iv_source_name
          it_filter_select_options = it_filter_select_options
          it_order                 = it_order
          is_paging                = is_paging
          it_navigation_path       = it_navigation_path
          it_key_tab               = it_key_tab
          iv_filter_string         = iv_filter_string
          iv_search_string         = iv_search_string
          io_tech_request_context  = io_tech_request_context
        IMPORTING
          et_entityset             = applicationlog_get_entityset
          es_response_context      = es_response_context ).

    * Send specific entity data to the caller interface
      copy_data_to_ref(
        EXPORTING
          is_data = applicationlog_get_entityset
        CHANGING
          cr_data = er_entityset ).


    Class: CL_CRM_ODATA_RT_ApplicationLog

    Method: /IWBEP/IF_MGW_APPL_SRV_RUNTIME~GET_EXPANDED_ENTITY


    DATA: lr_bupa_provider TYPE REF TO cl_crm_bp_odata_data_provider,
          lv_bupa_supports TYPE abap_bool.

    CREATE OBJECT lr_bupa_provider.
    lv_bupa_supports = lr_bupa_provider->supports_entity( iv_entity_name = iv_entity_name ).

    IF iv_entity_name = gc_task.
      IF go_task_impl IS NOT BOUND.
        CREATE OBJECT go_task_impl
          EXPORTING
            ir_context = mo_context.
      ENDIF.

      go_task_impl->get_task_expanded_entity(
        EXPORTING
          iv_entity_name           = iv_entity_name
          iv_entity_set_name       = iv_entity_set_name
          iv_source_name           = iv_source_name
          it_key_tab               = it_key_tab
          it_navigation_path       = it_navigation_path
          io_expand                = io_expand
          io_tech_request_context  = io_tech_request_context
        IMPORTING
          er_entity                = er_entity
          et_expanded_clauses      = et_expanded_clauses      " Table of strings
          et_expanded_tech_clauses = et_expanded_tech_clauses " Table of strings
      ).

    ELSEIF lv_bupa_supports = abap_true.
      CALL METHOD lr_bupa_provider->/iwbep/if_mgw_core_srv_runtime~set_context
        EXPORTING
          io_context = mo_context.

      CALL METHOD lr_bupa_provider->/iwbep/if_mgw_appl_srv_runtime~get_expanded_entity
        EXPORTING
          iv_entity_name           = iv_entity_name
          iv_entity_set_name       = iv_entity_set_name
          iv_source_name           = iv_source_name
          it_key_tab               = it_key_tab
          it_navigation_path       = it_navigation_path
          io_expand                = io_expand
          io_tech_request_context  = io_tech_request_context
        IMPORTING
          er_entity                = er_entity
          et_expanded_clauses      = et_expanded_clauses
          et_expanded_tech_clauses = et_expanded_tech_clauses.
    ENDIF.
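    One detail worth noting about GET_EXPANDED_ENTITY: any navigation property you expand yourself must be announced to the framework through et_expanded_tech_clauses, otherwise the framework will try to resolve the same navigation property again. A minimal sketch (the navigation property name below is only an illustrative assumption):

    * Hypothetical sketch: announce self-handled expands to the framework so
    * it does not trigger separate reads for the same navigation property.
      DATA lv_tech_clause TYPE string.
      lv_tech_clause = 'APPLICATIONLOGS'. " technical name of the navigation property
      APPEND lv_tech_clause TO et_expanded_tech_clauses.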


    Class: CL_CRM_ODATA_RT_ApplicationLog

    Method: APPLICATIONLOG_GET_ENTITYSET


    TRY.
        DATA: lv_obj_guid TYPE crmt_object_guid.

        FIELD-SYMBOLS <fs_key_set_option> TYPE /iwbep/s_mgw_name_value_pair.

        READ TABLE it_key_tab ASSIGNING <fs_key_set_option> WITH KEY name = gc_guid.
        IF <fs_key_set_option> IS ASSIGNED AND <fs_key_set_option>-value IS NOT INITIAL.
          lv_obj_guid = <fs_key_set_option>-value.
          gr_task_rt->get_log_data(
            EXPORTING
              iv_object_guid = lv_obj_guid
            IMPORTING
              et_log         = et_entityset ).
        ENDIF.
      CATCH /iwbep/cx_mgw_busi_exception.
      CATCH /iwbep/cx_mgw_tech_exception.
    ENDTRY.

     

    7. You can then test the entity with the following two kinds of URIs:

          (1) /sap/opu/odata/sap/CRM_ODATA/TaskCollection(guid'xxxxx')/ApplicationLogs

          (2) /sap/opu/odata/sap/CRM_ODATA/TaskCollection(guid'xxxxx')?$expand=ApplicationLogs

     

    8. Finished.


    0 0

    Since we launched SAP Gateway some years ago, our partners and customers have been constantly developing creative and innovative solutions. The capability to expose APIs in a convenient and well-managed way to external communities, or to give internal developers convenient and homogeneous access to the backend systems, was a piece that was still missing. With SAP API Management there is now a solution available to add that missing piece, both on-premise and in the cloud.

     

    This Wednesday (October 14th) I will give a 30-minute webinar on how SAP API Management helps enable innovative digital scenarios. If you are interested, please register here to get more details on the planned webinar.

     

    The webinar is part of a series showing the various aspects of "Integration Solutions for the digital age", as detailed by Matthias here. Sindhu started the series last week with an overview of how to simplify the integration challenge.


    0 0

    Here is a recap of the first three days at SAP TechEd 2015.

     

    On the evening of the first day, Steve Lucas shed light on the digital economy in his keynote session. He pointed out that today all economies are going digital (even agriculture), and that to survive, you must make the digital shift. He described how SAP is moving from a traditional enterprise company to a cloud company: the SAP you knew was analytics, databases and HANA; the SAP you know now is analytics, S/4 and HANA; and the SAP you should know is Cloud for Analytics, S/4, HANA and Vora. Check out the recorded session here


    SteveKeynote.png

     

    The second day started with an energizing keynote by SAP Executive Board Member Bernd Leukert. He said, "You have the ability to manage your company in a completely different way", and stated that SAP has reinvented the analytics business. Leukert called Cloud for Analytics "the newest member of our SAP family" before a demonstration featuring a fictitious retail chain with a few underperforming stores. The tool quickly identified them, represented them geographically and even evaluated their competition, revealing that rival stores were stealing market share. He also stressed SAP HANA Cloud Platform as the platform to build business applications, and a demo showed how HCP can help an enterprise create a web page to conduct business in another country. There was much more, and you can watch the recorded video here.

     

    Keynote2.jpg

    Keynote2_1.JPG

     

    After this energetic start to the day, there was a hands-on session (INT269) on SAP API Management given by Harsh Jagadeesan and Peter NG. They talked about how this solution provides simple, scalable, secure, and unified access to APIs based on open standards such as REST, OData, and OAuth2. There were also hands-on exercises to get an idea of the different features of SAP API Management.

     

    HandsOn2.JPG

     

    Throughout the day, our staff at the POD (on the show floor in the Explore zone, Hall C Level 2) - Michael Hill, Elijah Martinez and Sonali Desai - was busy interacting with interested customers and showing them demos related to Integration and Orchestration products (including Process Integration, Process Orchestration, HANA Cloud Integration, Gateway and API Management). You can watch our demos for Gateway here and check out the API Management demos here

     

    Img1.jpg

    Img2.jpg

    Img3.jpg

    On day 3, there was an expert networking session with Paul J. Modderman from Mindset Consulting on 'Innovate with SAP Gateway and API Management'. He talked about how API Management can onboard developers quickly and how you can use SAP Gateway to convert your RFCs/BAPIs into SOAP and OData/REST services. Here are some pics from that networking session:

     

    IMG_3191.JPG

    Expert2.jpg

    There was another expert networking session with Sandeep Murusupalli, Solution Architect at Apigee, and Peter NG, Senior Product Manager at SAP Labs, LLC. It was a Q&A session on API Management. They showed a live demo of asset repairs using a smartwatch and discussed the critical role API Management plays in that demo. Here are some pics:

     

    Expert3.jpg

    There was another session given by Stephan Herbert, VP P&I Technology I/O Gateway, and Lisa Hoshida from Intel on 'Enable your business with SAP Gateway'. They discussed an Intel case study and how SAP Gateway can empower your workforce by moving critical, real-time data from a back end consisting of SAP solutions to any mobile device.

     

    INT105.jpg

    Overall it was a great experience. Customers were delighted to see the Integration and Orchestration product portfolio and were interested in learning more about the products and how they can fit their use cases.


    0 0

    Scenario:

     

    Many times there is a business requirement to link documents, enter notes, send notes or link an internet address to various SAP objects. These external attachments can be reference documents, pictures, email attachments, designs, diagrams or related spreadsheets. To meet this requirement SAP provides a toolbar called Generic Object Services (GOS).

     

    Recently, I came across a requirement where I had to create an attachment for an existing sales order (VA02) through a NetWeaver Gateway service.

     

    Using the approach in this blog, you can attach a wide range of documents, such as Word documents, Excel sheets, PDFs, text files, images and many more, through a Gateway service.

     

    Procedure:

     

    We have created a project in the ECC system to create attachments for a sales order through a Gateway service, as shown below.

     

    2.png

     

    Right-click on the Data Model folder, select Import DDIC structure, give the Entity Type Name as Attachment and select the required properties for the entity type.

     

    In the entity type properties, select the Media checkbox. Our entity type Attachment and its properties look as below.

     

    3.png

     

    Then click on Generate Runtime Objects. It will display "Generated Objects Successfully", and all required classes are generated automatically.

     

    Then redefine the DEFINE method in the *MPC_EXT class and add the logic below.

     

    METHOD define.

      super->define( ).

      DATA: lo_entity   TYPE REF TO /iwbep/if_mgw_odata_entity_typ,
            lo_property TYPE REF TO /iwbep/if_mgw_odata_property.

      lo_entity = model->get_entity_type( iv_entity_name = 'Attachment' ). "Entity name

      IF lo_entity IS BOUND.
        lo_property = lo_entity->get_property( iv_property_name = 'Filename' ). "Key value (SLUG)
        lo_property->set_as_content_type( ).
      ENDIF.

    ENDMETHOD.

     

     

    Then redefine the CREATE_STREAM method (/IWBEP/IF_MGW_APPL_SRV_RUNTIME~CREATE_STREAM) in the *DPC_EXT class and implement the logic below to upload the file attachment into the sales order (VA02) based on the sales order number.

     

    Code:


    METHOD /iwbep/if_mgw_appl_srv_runtime~create_stream.

    *-------------------------------------------------------------
    * Constants
    *-------------------------------------------------------------
      CONSTANTS:
        c_bus2032 TYPE swo_objtyp VALUE 'BUS2032', " BOR object type for sales order
        c_ext(3)  TYPE c          VALUE 'EXT',
        c_atta(4) TYPE c          VALUE 'ATTA',
        c_b(1)    TYPE c          VALUE 'B',
        c_x(1)    TYPE c          VALUE 'X',
        c_o(1)    TYPE c          VALUE 'O'.

    *-------------------------------------------------------------
    * Data declaration
    *-------------------------------------------------------------
      DATA: it_content   TYPE STANDARD TABLE OF soli, " Content of file storage
            it_objhead   TYPE STANDARD TABLE OF soli,
            wa_folmem_k  TYPE sofmk,                  " Folder content data
            wa_note      TYPE borident,               " BOR object identifier
            wa_object    TYPE borident,
            wa_obj_id    TYPE soodk,                  " Definition of an object (key part)
            wa_fol_id    TYPE soodk,
            wa_obj_data  TYPE sood1,                  " Object definition and change attributes
            lv_ep_note   TYPE borident-objkey,        " BOR object key
            lv_extension TYPE c LENGTH 4,             " File extension only
            lv_so_num    TYPE vbeln_va,               " Sales order number
            lv_file_des  TYPE so_obj_des.             " File name

    */ Refresh data
      REFRESH: it_content[], it_objhead[].

    */ Field symbol for SLUG
      FIELD-SYMBOLS <fs_key> TYPE /iwbep/s_mgw_name_value_pair.

    */ Read the SLUG value and name based on index
      READ TABLE it_key_tab ASSIGNING <fs_key> INDEX 1.

    */ Function module for xstring to binary conversion
      CALL FUNCTION 'SCMS_XSTRING_TO_BINARY'
        EXPORTING
          buffer          = is_media_resource-value " xstring
          append_to_table = c_x
        TABLES
          binary_tab      = it_content[].           " binary

    */ Get folder id
      CALL FUNCTION 'SO_FOLDER_ROOT_ID_GET'
        EXPORTING
          region                = c_b
        IMPORTING
          folder_id             = wa_fol_id
        EXCEPTIONS
          communication_failure = 1
          owner_not_exist       = 2
          system_failure        = 3
          x_error               = 4
          OTHERS                = 5.

      CLEAR: lv_so_num, lv_file_des.
      IF iv_slug IS NOT INITIAL.
        SPLIT iv_slug AT '/' INTO lv_so_num lv_file_des.
        IF lv_so_num IS NOT INITIAL.
          CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
            EXPORTING
              input  = lv_so_num
            IMPORTING
              output = lv_so_num.
        ENDIF.
      ENDIF.

    */ Assign values to the standard structure fields
      wa_object-objkey     = lv_so_num.    " Sales order number
      wa_object-objtype    = c_bus2032.    " BOR object type
      wa_obj_data-objsns   = c_o.          " Sensitivity of object (O = standard)
      wa_obj_data-objla    = sy-langu.     " Language
      wa_obj_data-objdes   = lv_file_des.  " Slug value - description
      wa_obj_data-file_ext = lv_extension. " File extension

    */ Change extension to upper case
      TRANSLATE wa_obj_data-file_ext TO UPPER CASE.

      wa_obj_data-objlen = lines( it_content ) * 255.

    */ Insert data
      CALL FUNCTION 'SO_OBJECT_INSERT'
        EXPORTING
          folder_id                  = wa_fol_id
          object_type                = c_ext
          object_hd_change           = wa_obj_data
        IMPORTING
          object_id                  = wa_obj_id
        TABLES
          objhead                    = it_objhead
          objcont                    = it_content
        EXCEPTIONS
          active_user_not_exist      = 1
          communication_failure      = 2
          component_not_available    = 3
          dl_name_exist              = 4
          folder_not_exist           = 5
          folder_no_authorization    = 6
          object_type_not_exist      = 7
          operation_no_authorization = 8
          owner_not_exist            = 9
          parameter_error            = 10
          substitute_not_active      = 11
          substitute_not_defined     = 12
          system_failure             = 13
          x_error                    = 14
          OTHERS                     = 15.

      IF sy-subrc = 0 AND wa_object-objkey IS NOT INITIAL.
        wa_folmem_k-foltp = wa_fol_id-objtp.
        wa_folmem_k-folyr = wa_fol_id-objyr.
        wa_folmem_k-folno = wa_fol_id-objno.

    */ Please note: wa_fol_id and wa_obj_id are different work areas
        wa_folmem_k-doctp = wa_obj_id-objtp.
        wa_folmem_k-docyr = wa_obj_id-objyr.
        wa_folmem_k-docno = wa_obj_id-objno.

        lv_ep_note = wa_folmem_k.
        wa_note-objtype = 'MESSAGE'.
        wa_note-objkey  = lv_ep_note.

    */ Link it
        CALL FUNCTION 'BINARY_RELATION_CREATE_COMMIT'
          EXPORTING
            obj_rolea      = wa_object
            obj_roleb      = wa_note
            relationtype   = c_atta
          EXCEPTIONS
            no_model       = 1
            internal_error = 2
            unknown        = 3
            OTHERS         = 4.

        CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
          EXPORTING
            wait = c_x.

        IF sy-subrc EQ 0.
    */ Commit it
          COMMIT WORK.
        ENDIF.
      ENDIF.
    ENDMETHOD.

     

    All input values have to be passed in the SLUG parameter from the UI side. If you have multiple input values, concatenate them with a delimiter and pass the combined string in the SLUG parameter.
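    For example, a slug like '5000001234/Invoice.pdf' could be taken apart as follows (a sketch; the two-part slug layout and the derived extension are assumptions of this example, and the extension could then be used to fill wa_obj_data-file_ext, which the code above leaves initial):

    " Hypothetical sketch: split a two-part slug 'SO_NUMBER/FILENAME.EXT'
    DATA: lv_so_num   TYPE vbeln_va,
          lv_file_des TYPE so_obj_des,
          lv_ext      TYPE c LENGTH 4.

    SPLIT iv_slug AT '/' INTO lv_so_num lv_file_des.
    " derive the file extension from the file name
    SPLIT lv_file_des AT '.' INTO lv_file_des lv_ext.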

     

    Testing Our Service:

     

    Now we will test our service in the Gateway Client, which can be reached from transaction /IWFND/MAINT_SERVICE.

     

    Upload a file through the Add File button on the left-hand side, as shown in the screenshot below.

     

    4.png

    Pass the SLUG values (mandatory): the file name and the sales order number, as shown in the screenshot below.

     

    In this example we pass multiple parameter values, the sales order number and the file description, separated by the delimiter used in the code ('/'), as shown in the screenshot below.

     

    5.png

     

    Paste the URI into the Request URI field and execute it with the POST HTTP method.

     

    URI: /sap/opu/odata/sap/ZASC_ECOMM_SO_ATTACHMENT_SRV/AttachmentSet

     

    Service Response:


    6.png

    The attachment was created successfully via the Gateway service.

     

    Result:

    Go to the Sales Order Display transaction (VA03) and click on 1.png Services for Objects in the title bar; you will then get the attachment list as shown below.

     

    You will find your attachment.

     

    7.png

     

    The attachment was added successfully to the sales order.

     

    Thanks & Regards,

    Harikrishna Malladi


    0 0

    In this blog, I will explain creating a Gateway service with Read and Query operations using a date as input. Before starting, I expect that you have a basic idea of Gateway services.

     

    The reason for writing this blog is that I had a requirement to get data from the back-end system based on a date field. I searched Google but was not able to find proper information on this requirement.

     

    Scenario:

     

    Search for flight details on a given date and between date ranges.

     

    Procedure:


    We have created the ZSFLIGHT_DETAILS Gateway project to get data based on a date field, as shown below.

     

    1.png

     

    Right-click on the Data Model folder, select Import DDIC structure, give the Entity Type Name as Sflight and select the required properties for the entity type.

     

    In the entity type properties, mark Fldate as a key field. Our entity type and its properties look as below.

     

    2.png

     

    Then click on Generate Runtime Objects. It will display "Generated Objects Successfully", and all required classes will be generated automatically.

     

    1. Read Operation Using Date:


    This covers the case where the user inputs a date to find a flight on that particular date. For this scenario we need to redefine the SFLIGHTSET_GET_ENTITY method in the *DPC_EXT class and implement the logic below to read the data based on the date.

     

    Code:

     

    METHOD sflightset_get_entity.

    * Data declarations
      DATA: ls_key LIKE LINE OF it_key_tab.

      READ TABLE it_key_tab INTO ls_key WITH KEY name = 'Fldate'.
      IF sy-subrc = 0.
        SELECT SINGLE * FROM sflight INTO er_entity WHERE fldate = ls_key-value.
      ENDIF.

    ENDMETHOD.
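    As a side note, on newer SAP_GWFND releases the framework can convert the OData key literals for you, so you do not have to read it_key_tab by name. A minimal sketch (assuming the io_tech_request_context parameter offers GET_CONVERTED_KEYS on your release and that the entity structure is compatible with SFLIGHT; verify this in your system):

    METHOD sflightset_get_entity.
    * Hypothetical alternative: let the framework convert the OData key
    * literals (e.g. datetime'1995-02-28T00:00:00') into typed ABAP fields.
      DATA ls_key_data TYPE sflight.

      io_tech_request_context->get_converted_keys(
        IMPORTING es_key_values = ls_key_data ).

      SELECT SINGLE * FROM sflight INTO er_entity
        WHERE fldate = ls_key_data-fldate.
    ENDMETHOD.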


    2. Query Operation Using Date as filter:


    This covers the case where the user wants to find flights between two dates. For this scenario we need to redefine the SFLIGHTSET_GET_ENTITYSET method in the *DPC_EXT class and implement the logic below to get the data based on the date using a filter query.

     

    Code:


    METHOD sflightset_get_entityset.

    * Data declarations
      DATA: lo_filter                TYPE REF TO /iwbep/if_mgw_req_filter, " Filter system query option class
            lv_filter_str            TYPE string,                          " Filter declaration
            lv_fldate                TYPE s_date,
            lt_filter_select_options TYPE /iwbep/t_mgw_select_option,
            r_date                   TYPE RANGE OF sflight-fldate,
            ls_date                  LIKE LINE OF r_date,
            lt_select_options        TYPE /iwbep/t_cod_select_options,
            ls_filter_select_options LIKE LINE OF lt_filter_select_options,
            ls_select_options        TYPE /iwbep/s_cod_select_option.

    * Read the filter select options
      READ TABLE it_filter_select_options INTO ls_filter_select_options INDEX 1.
      IF sy-subrc = 0.
        lt_select_options = ls_filter_select_options-select_options.
      ENDIF.

    * Read the select options
      READ TABLE lt_select_options INTO ls_select_options INDEX 1.
      IF sy-subrc = 0.
        ls_date-sign   = 'I'.
        ls_date-option = 'BT'.
        ls_date-low    = ls_select_options-low.
        ls_date-high   = ls_select_options-high.
        APPEND ls_date TO r_date.
        CLEAR ls_date.
      ENDIF.

    * Fetch the data from sflight based on the date range
      SELECT mandt carrid connid fldate FROM sflight INTO TABLE et_entityset
        WHERE fldate IN r_date.

    ENDMETHOD.
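    Note that this implementation only handles the first select option, i.e. a single BT range. For more complex $filter expressions, one possible variant (a sketch only; the dynamic-WHERE use of iv_filter_string works for simple comparisons on this entity but is fragile, so validate it on your release before relying on it) is:

    * Hypothetical sketch: fall back to the framework-converted filter
    * string as a dynamic WHERE clause for filters that are not a plain range.
      IF iv_filter_string IS NOT INITIAL.
        SELECT mandt carrid connid fldate FROM sflight INTO TABLE et_entityset
          WHERE (iv_filter_string).
      ENDIF.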

    Testing Our Services:

    Now we will test our service in the Gateway Client, which can be reached from transaction /IWFND/MAINT_SERVICE.


    3.png

     

    4.png

    To get the data for a flight on a given date, use the following URI with the HTTP GET method.


    URI: /sap/opu/odata/sap/ZSFLIGHT_DETAILS_SRV/SflightSet(Fldate=datetime'1995-02-28T00:00:00')


    Service Response:

     

    5.png

     

    To get the flight details between the given date ranges, use the following URI with the HTTP GET method.

     

    URI: /sap/opu/odata/sap/ZSFLIGHT_DETAILS_SRV/SflightSet?$filter=Fldate ge datetime'1995-02-28T00:00:00' and Fldate le datetime'1997-08-30T00:00:00'

     

    Service Response:

     

    6.png

     

    Thanks & Regards,

    Gangadhar B


older | 1 | .... | 8 | 9 | (Page 10) | 11 | 12 | 13 | newer