SAP Gateway

Easily Importing a Data Model from File in SAP NetWeaver Gateway


Design Data Model

Prerequisites:

You can use either the hub or the embedded deployment. In this blog I'm using the embedded deployment.

 

Core Components (SAP NetWeaver Stack):

 

SAP NetWeaver 7.0 and 7.01


  • GW_CORE 190
  • IW_FND 240


SAP NetWeaver 7.02, 7.03, and 7.31


  • GW_CORE 200
  • IW_FND 250


SAP NetWeaver 7.40 and higher


  • SAP_GWFND 740: SAP NetWeaver Gateway Foundation 7.40

 

Create a Project:


To build OData services with SAP NetWeaver Gateway, you should have a basic knowledge of ABAP and ABAP Objects.


Creating a project is simple: open the Service Builder (transaction SEGW) in the back-end system.

1.png

Enter the details as shown below.

2.png

Press Enter; the project is now created successfully.


Creating data model:


Now it's time to create a data model as per the requirement; here I'm creating it by importing a data model from a file. SAP NetWeaver Gateway provides many useful generator tools that allow you to develop services around RFCs, BAPIs, Dynpro screens, and so on.


               When copying a particular Gateway service from one server to another, the most time-consuming part is recreating the data model manually: all the entities, entity sets, associations, and navigation paths. To overcome this, SAP provides a better way: an option called "Data Model from File".


The file import function allows data model files defined by external model editors such as Visual Studio (edmx files), as well as metadata files (xml), to be imported into the Service Builder to create a data model. Files can be imported into the Service Builder for the following project types:

 

                    1) Service with SAP Annotations

                    2) Service with Vocabulary-Based Annotations
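Both project types consume the same kind of EDMX/metadata XML. As a rough, language-neutral illustration of what the importer reads, here is a minimal Python sketch (the sample metadata document and all names in it are invented for illustration) that lists the entity types and entity sets in such a file:

```python
import xml.etree.ElementTree as ET

# CSDL namespace used by OData V2 metadata documents; the exact namespace
# varies between CSDL versions, so real code may need to probe for it.
EDM = "http://schemas.microsoft.com/ado/2008/09/edm"

metadata = """<edmx:Edmx xmlns:edmx="http://schemas.microsoft.com/ado/2007/06/edmx" Version="1.0">
  <edmx:DataServices>
    <Schema xmlns="http://schemas.microsoft.com/ado/2008/09/edm" Namespace="ZDEMO">
      <EntityType Name="Product">
        <Key><PropertyRef Name="Id"/></Key>
        <Property Name="Id" Type="Edm.String" Nullable="false"/>
      </EntityType>
      <EntityContainer Name="ZDEMO_Entities">
        <EntitySet Name="Products" EntityType="ZDEMO.Product"/>
      </EntityContainer>
    </Schema>
  </edmx:DataServices>
</edmx:Edmx>"""

root = ET.fromstring(metadata)
entity_types = [e.get("Name") for e in root.iter(f"{{{EDM}}}EntityType")]
entity_sets = [e.get("Name") for e in root.iter(f"{{{EDM}}}EntitySet")]
print(entity_types, entity_sets)  # ['Product'] ['Products']
```

The Service Builder does the same walk over the file and turns each EntityType/EntitySet into the corresponding project artifact.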

Importing a Data Model - New and Empty Project


Right-click on Data Model ---> Import ---> Data Model from File


3.png


For the "Data Model from File" option, SAP provides two modes: Custom Import and Full Import.


Custom Import: Imports a metadata/edmx file into an existing project that already has a data model (re-import). In this case, the existing data model is overwritten.


Full Import: Imports a metadata/edmx file into a new project, i.e. when you have just created a project and are importing a file for the first time.


In my case I'm "Importing a Data Model - New and Empty Project", which is why the Custom Import option is greyed out, as shown in the screenshot below.

Select Full Import to import the complete model, then follow the remaining wizard steps to finish the import.


4.png

Browse to select an xml or edmx file and click Next.

5.png

6.png

Click Next; you will see the entity types and entity sets as shown below.

7.png

8.png

Click Finish. The model is created, and the corresponding entity types, entity sets, associations, and navigation properties are generated automatically, as shown below.

9.png

After creating the data model, click the Generate button in the toolbar, as shown below.

10.png

The runtime classes are generated successfully, as shown below.

11.png

Importing a Model File - Existing Project


Right-click on Data Model ---> Import ---> Data Model from File

3.png

Select Custom Import to select the objects manually from the model.

19.png

Click Browse to select an xml or edmx file and click Next.

20.png

The wizard's Step 2 of 3: File Import window appears and displays one of the following statuses for each artifact.


1.New:

If the imported file contains an artifact that is not present in the data model, that artifact is displayed with the status New. On selecting the artifact, a new entry is created in the data model.

21.png

2.Delete:

If the file does not contain an artifact that exists in the data model, the artifact is displayed with the status Delete. If you select the artifact, it is deleted from the data model. If this option is not selected, the data model remains unchanged.

22.png

3.Edit:

If an artifact is available in both the file and the data model, but the one in the file (the file being imported) has some changes, the artifact is displayed with the status Edit. You can choose to adopt the file's changes in the data model. If this option is not selected, the artifact in the data model remains unchanged.


25.png


4.Equal:

If an artifact exists in both the data model and the file and the two are identical, the artifact is disabled for selection. These artifacts are displayed with the status Equal.

23.png

After selecting the entity types and entity sets, click Next and then Finish. The model is created, and the corresponding entity types, entity sets, associations, and navigation properties are generated automatically, as shown below.

24.png

Now you can redefine the methods and write your own logic in the *DPC_EXT class. Using Data Model from File, you can easily build a model from an existing one.


Thanks & Regards,

Harikrishna M

SSO problem with the SAP Gateway Client /IWFND/GW_CLIENT


Since I ran into the same problem twice this week, I thought it would be a good idea to post the solution in a small blog ...

 

When configuring a demo system for SAP Gateway I tried to test an OData service using the SAP Gateway client /IWFND/GW_CLIENT and got the following error message:

 

HTTP Send failed:

HTTPIO_ERROR_CUSTOM_MYSAPSSO~Fehlermeldung beim Senden der Daten.
(i.e. "error message while sending the data")

 

HTTPIO_ERROR_CUSTOM_MYSAPSSO.PNG

 

 

Since Single Sign-On is one of the key features of the SAP Gateway Client such an error message is quite annoying.

 

First possible root cause

 

The reason was that the following instance profile parameters were not set correctly.

 

login/accept_sso2_ticket

login/create_sso2_ticket

 

The correct values (as mentioned in the online help, Profile Parameters - SAP Gateway Foundation (SAP_GWFND) - SAP Library) are:

 

login/accept_sso2_ticket = 1

login/create_sso2_ticket = 2

 

So always read the documentation carefully.

 

To test the settings without maintaining the instance profile you can use transaction RZ11.

 

rz11.PNG

 

Press the Display button and the following window opens:

 

rz11_2.PNG

 

Here you can choose the Change Value button, which changes the parameter value immediately (but not permanently), so I was able to proceed with my testing.

 

Second possible root cause

 

The same error also occurred in a second system, but here the SSO problem did not vanish after changing the profile parameters mentioned above.

 

The error was more basic here, since the certificates in that demo system were no longer valid.

 

Looking into STRUST showed that something was wrong with the certificates.

 

An error that should hopefully not occur too often ;-).

 

STRUST_Problem.png

 

 

Best Regards,

Andre

Measure the performance of your OData service



The OData service URL being tested is: /sap/opu/odata/sap/CRM_ODATA/TaskCollection?$filter=isMyTask eq true&$expand=DocumentNotes,DocumentNextUserStatuses,DocumentHistories,DocumentApplicationLogs,Attachments
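Note that the spaces and special characters in this URL have to be percent-encoded when it is used outside the Gateway client, for example in Chrome or Postman. A small Python sketch of the encoding, reusing the path and query options from the URL above:

```python
from urllib.parse import urlencode, quote

base = "/sap/opu/odata/sap/CRM_ODATA/TaskCollection"
params = {
    "$filter": "isMyTask eq true",
    "$expand": "DocumentNotes,DocumentNextUserStatuses,DocumentHistories,"
               "DocumentApplicationLogs,Attachments",
}
# safe="$," keeps the OData '$' prefix and the comma-separated $expand list
# readable; quote_via=quote encodes spaces as %20 instead of '+'.
url = base + "?" + urlencode(params, safe="$,", quote_via=quote)
print(url)
```

The result is the same request with `isMyTask%20eq%20true` in the filter, which all four measurement approaches below accept.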


Approach 1 - Gateway client

 

Log on to your Gateway (front-end) server, start transaction /IWFND/GW_CLIENT, paste the URL, and execute. You will get the execution time in milliseconds.

clipboard1.png

Approach 2 - Chrome developer tools

 

Open the Chrome developer tools via F12, paste the COMPLETE URL including host name and port number, and execute. The time is displayed in the "Time" column.

clipboard2.png

Hover the mouse over the "Timeline - Start Time" column, and a more detailed breakdown of the elapsed time is displayed, as below:

clipboard3.png

Approach 3 - Chrome extension Postman

 

You can also use the Chrome extension Postman REST Client to get a rough measurement:

clipboard4.png

Approach 4 - Gateway performance trace /IWFND/TRACES

 

Launch the tcode and enable the performance trace by selecting the checkbox "Performance Trace".

clipboard5.png

Then trigger a request via any of the above three approaches, and check your performance trace here:

clipboard6.png

If you want a more detailed view, double click the entry, and then click "Trace Details":

clipboard7.png


clipboard8.png

How to implement multi-level create deep


This blog will explain how to implement multi-level create deep.

 

Prerequisite: please make sure you are familiar with the create-deep implementation via Gateway:

Step by Step development for CREATE_DEEP_ENTITY operation

 

 

For multi-level create deep, we assume this following scenario:

Scenario.PNG

 

Here, FirstSon and SecondSon are one level below Father; FirstGrandson is one level below FirstSon, but two levels below Father.

 

How do we implement this create-deep scenario?

 

 

1. Create entities/associations in SEGW which map to this relationship.

     Four entities are created: Father/FirstSon/SecondSon/FirstGrandson. Each entity has three properties: Key/Property1/Property2.

     Three associations are created: "Father to FirstSon", "Father to SecondSon" and "FirstSon to FirstGrandSon".

Data Model.PNG

 

2. Generate and register the OData service.

     I assume you are familiar with these general Gateway steps.

 

3. Implement the create_deep_entity method.

     First create the multi-level deep structure that can hold the nested data from the input; then use io_data_provider to read the input.

     Here I just show a simple code snippet, enough for us to get the nested data at runtime.

Code.PNG

 

4. Test in the RestClient

     I first tried to test the multi-level create deep in our traditional GW client using an XML payload. The nested data was transferred into the create_deep_entity method, but the position of the second-level nested data was wrong. I therefore strongly suggest using a JSON payload. Since the GW client does not support JSON well, I recommend using the RestClient. (In the RestClient you have to first fetch a CSRF token and then post.)

    

Payload:

 

{
    "Key": "Father-Key",
    "Property1": "Father-Property-1",
    "Property2": "Father-Property-2",
    "FirstSon": [
        {
            "Key": "FirstSon-Key-1",
            "Property1": "Firstson-Property-1",
            "Property2": "Firstson-Property-2",
            "FirstGrandson": [
                {
                    "Key": "GrandSon-Key-1",
                    "Property1": "GrandSon-Property-1",
                    "Property2": "GrandSon-Property-2"
                }
            ]
        },
        {
            "Key": "FirstSon-Key-2",
            "Property1": "Firstson-Property-3",
            "Property2": "Firstson-Property-4",
            "FirstGrandson": [
                {
                    "Key": "GrandSon-Key-2",
                    "Property1": "GrandSon-Property-3",
                    "Property2": "GrandSon-Property-4"
                },
                {
                    "Key": "GrandSon-Key-3",
                    "Property1": "GrandSon-Property-5",
                    "Property2": "GrandSon-Property-6"
                }
            ]
        }
    ],
    "SecondSon": [
        {
            "Key": "SecondSon-Key-1",
            "Property1": "SecondSon-Property-1",
            "Property2": "SecondSon-Property-2"
        },
        {
            "Key": "SecondSon-Key-2",
            "Property1": "SecondSon-Property-3",
            "Property2": "SecondSon-Property-4"
        },
        {
            "Key": "SecondSon-Key-3",
            "Property1": "SecondSon-Property-5",
            "Property2": "SecondSon-Property-6"
        }
    ]
}
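Before posting, it is worth sanity-checking the nesting offline. A small Python sketch over a reduced copy of the payload above (properties omitted, structure kept) that verifies each grandson list sits under its own FirstSon entry:

```python
# Reduced copy of the payload above: only the keys and the nesting matter here.
payload = {
    "Key": "Father-Key",
    "FirstSon": [
        {"Key": "FirstSon-Key-1",
         "FirstGrandson": [{"Key": "GrandSon-Key-1"}]},
        {"Key": "FirstSon-Key-2",
         "FirstGrandson": [{"Key": "GrandSon-Key-2"},
                           {"Key": "GrandSon-Key-3"}]},
    ],
    "SecondSon": [{"Key": "SecondSon-Key-1"},
                  {"Key": "SecondSon-Key-2"},
                  {"Key": "SecondSon-Key-3"}],
}

# The grandsons must hang off their own FirstSon entry, not off Father.
grandson_counts = [len(son.get("FirstGrandson", [])) for son in payload["FirstSon"]]
second_son_count = len(payload["SecondSon"])
print(grandson_counts, second_son_count)  # [1, 2] 3
```

These counts are exactly what the runtime check in step 5 should reproduce.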

 

 

5. Check if the multi-level nested data is mapped to the right position at runtime.

   

Father-level data:

Father.PNG

 

Son-level data (deep to father):

FirstSon.PNGSecond.PNG

 

Grandson-level data (deep to FirstSon):

     FirstSon[1] has one entry; FirstSon[2] has two entries, as expected from the payload.

Grandson1.PNGGrandson2.PNG

 

Now that we have the multi-level nested data in the right position, we can process it however we need.



Hope it helps!

How to do data mining on the Gateway statistics using CDS views and AMDP


Recently I got a very interesting requirement from the management team: do a POC based on the Gateway statistics that addresses the following business questions.

 

  1. What are the top 10 apps accessed within a certain period?
  2. Who are the top 10 users within a certain period?
  3. At which hour of the day do most users access the system?
  4. On which day of the week do most users access the system?
  5. What are the most common search criteria across all the services?
  6. How many times is each business function defined in the apps executed?
  7. What are the average response time and maximum payload of the HTTP requests?

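Most of these questions reduce to frequency counts over the statistics records. As a language-neutral illustration (the row values below are invented), the same aggregations that the AMDP performs later can be sketched in Python:

```python
from collections import Counter
from datetime import datetime

# Hypothetical extract of /IWFND/SU_STATS rows: (userid, service_name, timestamp).
stats = [
    ("ALICE", "ZSRV_TASKS",  datetime(2016, 3, 7, 9, 15)),
    ("ALICE", "ZSRV_TASKS",  datetime(2016, 3, 7, 9, 40)),
    ("BOB",   "ZSRV_TASKS",  datetime(2016, 3, 8, 14, 5)),
    ("BOB",   "ZSRV_ORDERS", datetime(2016, 3, 8, 9, 50)),
]

top_apps  = Counter(s for _, s, _ in stats).most_common(10)    # question 1
top_users = Counter(u for u, _, _ in stats).most_common(10)    # question 2
hot_hour  = Counter(t.hour for _, _, t in stats).most_common(1)[0][0]            # question 3
hot_day   = Counter(t.strftime("%A") for _, _, t in stats).most_common(1)[0][0]  # question 4
print(top_apps[0], hot_hour, hot_day)
```

The AMDP below pushes the same group-by/order-by logic down into HANA instead of doing it in the application layer.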

I did some research on this topic and would like to share it with this community.

Agenda.jpg


Step 1: The Gateway service registry is stored in tables /iwfnd/i_med_srh and /iwfnd/i_med_srt, which can be accessed via transaction /iwfnd/maint_service.

   The Gateway performance statistics are stored in table /iwfnd/su_stats, which can be accessed via transaction /iwfnd/stats.


Step 2: CDS View Building

CDS View 1:

@AbapCatalog.sqlViewName: 'V_CDS_SERV'
define view /nsl/cdsv_gw_service
  ( service_name, service_description )
  as select from /iwfnd/i_med_srh as srh
  association [0..1] to /iwfnd/i_med_srt as srt
    on srh.srv_identifier = srt.srv_identifier and
       srh.is_active      = srt.is_active
{
  srh.service_name,
  srt.description
}
where srt.language  = 'E'
  and srt.is_active = 'A'

 

CDS View 2

@AbapCatalog.sqlViewName: 'V_CDS_STATS'
define view cdsv_stats_basic
  ( namespace, service_name, userid, timestampl, service_description,
    operation, entity_type, expand_string, request_address )
  as select from /iwfnd/su_stats as stats
  association [1..1] to v_cds_serv as service
    on stats.service_name = service.service_name
{
  stats.namespace,
  stats.service_name,
  stats.userid,
  stats.timestampl,
  service[1:inner].service_description,
  stats.operation,
  stats.entity_type,
  stats.expand_string,
  stats.request_address
}
where stats.namespace = '/SAP/'

 

Step 3: ABAP Managed Database Procedure

Create a structure - Overview

Overview.jpg

AMDP Class Definition:

AMDP Snippet

class /XXX/cl_gw_su_stats definition
  public
  final
  create public.

  public section.
    interfaces: if_amdp_marker_hdb.

    methods:
      get_su_stats_total
        importing
          value(iv_client)        type symandt
          value(iv_start_time)    type timestampl
          value(iv_end_time)      type timestampl
        exporting
          value(ev_user_num)      type /XXX/gw_su_basics_s-user_num
          value(ev_apps_num)      type /XXX/gw_su_basics_s-apps_num
          value(ev_apps_per_user) type /XXX/gw_su_basics_s-apps_per_user
          value(ev_top_user)      type /XXX/gw_su_basics_s-top_user
          value(ev_top_app)       type /XXX/gw_su_basics_s-top_app
          value(ev_hot_hour)      type /XXX/gw_su_basics_s-hot_hour
          value(ev_hot_day)       type /XXX/gw_su_basics_s-hot_day.
  protected section.
  private section.
endclass.

method get_su_stats_total by database procedure
                          for hdb language sqlscript
                          options read-only
                          using /XXX/v_cds_stats.
  DECLARE v_servicename INT;
  DECLARE v_user INT;

  /* get the total number of users */
  select count(distinct userid) into ev_user_num from "/XXX/V_CDS_STATS"
    where MANDT = :iv_client and timestampl between :iv_start_time and :iv_end_time;

  /* get the total number of services */
  select count(distinct service_name) into ev_apps_num from "/XXX/V_CDS_STATS"
    where MANDT = :iv_client and timestampl between :iv_start_time and :iv_end_time;

  /* get the apps per user */
  -- ev_apps_per_user := :ev_apps_num / :ev_user_num;

  /* get the top user name */
  select top 1 userid,
         count( service_name ) as service_name into ev_top_user, v_servicename
    from "/XXX/V_CDS_STATS"
    where MANDT = :iv_client and timestampl between :iv_start_time and :iv_end_time
    group by userid
    order by service_name desc;

  /* get the top app name */
  select top 1 service_name,
         count( userid ) as userid into ev_top_app, v_user
    from "/XXX/V_CDS_STATS"
    where MANDT = :iv_client and timestampl between :iv_start_time and :iv_end_time
    group by service_name
    order by userid desc;

  /* which day of the week do the agents log in the most */
  select top 1 to_char( utctolocal( to_timestamp( timestampl ), 'UTC+8' ), 'DAY' ) as date,
         count( userid ) as userid into ev_hot_day, v_user
    from "/XXX/V_CDS_STATS"
    where MANDT = :iv_client and timestampl between :iv_start_time and :iv_end_time
    group by to_char( utctolocal( to_timestamp( timestampl ), 'UTC+8' ), 'DAY' )
    order by userid desc;

  /* which hour of the day do the agents log in the most */
  select top 1 hour( to_time( utctolocal( to_timestamp( timestampl ), 'UTC+8' ) ) ) as hour,
         count( userid ) as userid into ev_hot_hour, v_user
    from "/XXX/V_CDS_STATS"
    where MANDT = :iv_client and timestampl between :iv_start_time and :iv_end_time
    group by hour( to_time( utctolocal( to_timestamp( timestampl ), 'UTC+8' ) ) )
    order by userid desc;
endmethod.

Gateway service snippet

data: ls_entity            like line of et_entityset,
      lv_start_time_feeder type datum,          " declarations added for completeness
      lv_end_time_feeder   type datum,
      lv_start_time        type timestampl,
      lv_end_time          type timestampl.

loop at it_filter_select_options into data(ls_filter_select_option).
  case ls_filter_select_option-property.
    when 'SelectionDate'.
      loop at ls_filter_select_option-select_options into data(ls_select_option).
        if ls_select_option-low is initial.
          lv_start_time_feeder = sy-datum.
        else.
          lv_start_time_feeder = ls_select_option-low.
        endif.

        if ls_select_option-high is initial.
          lv_end_time_feeder = sy-datum.
        else.
          lv_end_time_feeder = ls_select_option-high.
        endif.
      endloop.
    when others.
      raise exception type /iwbep/cx_mgw_busi_exception
        exporting
          textid = /iwbep/cx_mgw_busi_exception=>filter_not_supported.
  endcase.
endloop.

if sy-subrc <> 0.
  raise exception type /iwbep/cx_mgw_busi_exception
    exporting
      textid            = /iwbep/cx_mgw_busi_exception=>business_error_unlimited
      message_unlimited = |Filter is required|.
endif.

convert date lv_start_time_feeder time sy-uzeit into time stamp lv_start_time time zone sy-zonlo.
convert date lv_end_time_feeder   time sy-uzeit into time stamp lv_end_time   time zone sy-zonlo.

data(lo_overview) = new /XXX/cl_gw_su_stats( ).
try.
    lo_overview->get_su_stats_total(
      exporting
        iv_client        = sy-mandt
        iv_start_time    = lv_start_time
        iv_end_time      = lv_end_time
      importing
        ev_user_num      = ls_entity-user_num
        ev_apps_num      = ls_entity-apps_num
        ev_apps_per_user = ls_entity-apps_per_user
        ev_top_user      = ls_entity-top_user
        ev_top_app       = ls_entity-top_app
        ev_hot_day       = ls_entity-hot_day
        ev_hot_hour      = ls_entity-hot_hour ).
  catch cx_amdp_execution_failed into data(lv_error).
endtry.
append ls_entity to et_entityset.

 

Last step: test the service in the GW client.

Test.jpg

Problems with multi-origin in SAP NetWeaver 740 SP13 ... and how to solve them


Today I came across a problem that can occur if you use the multi-origin feature of SAP Gateway and have upgraded to SAP NetWeaver 740 SP13.

 

Strictly speaking, it would also occur if you had upgraded only the software component SAP_GWFND to Support Package level SAPK-74013INSAPGWFND.

 

A request such as /sap/opu/odata/sap/gbapp_poapproval;mo/WorkflowTaskCollection in this case does not return any data.

 

On the other hand, $count would still return a result when running the following request in the SAP Gateway client: /sap/opu/odata/sap/gbapp_poapproval;mo/WorkflowTaskCollection/$count.


As a result of this strange behavior, SAP Fiori applications such as My Inbox would not work correctly.

 

Fortunately there is a solution.

 

Simply apply SAP Note 2250491 ("OData request does not return entity set data in case of multi-origin composition") to your system.

 

Best regards,

Andre

How to show file name when calling GET_STREAM


This document is about how to enhance our OData service for file download.


Currently, based on the training material, we usually redefine the GET_STREAM method and pass the MIME type and stream value to the result.

ls_stream-mime_type = output-return-mime_type.
ls_stream-value     = output-return-document_stream.
copy_data_to_ref( EXPORTING is_data = ls_stream
                  CHANGING  cr_data = er_stream ).

This works; however, if you trigger the service in Chrome, you only get a document named 'entity'.

 

1.png

Problem

  • What if the customer has already stored the file name in ECM or in the database? They want to see the real name.
  • What if the customer does not want a direct download, but wants to preview the file before downloading?

Solution

The solution is to set a header in the HTTP framework. The code is attached below.

DATA ls_lheader TYPE ihttpnvp.
" DATA lv_filename TYPE string.
" lv_filename = escape( val = lv_filename format = cl_abap_format=>e_url ).
ls_lheader-name  = 'Content-Disposition'.
ls_lheader-value = 'attachment; filename="Mobile.pdf";'.
set_header( is_header = ls_lheader ).

ls_stream-mime_type = output-return-mime_type.
ls_stream-value     = output-return-document_stream.
copy_data_to_ref( EXPORTING is_data = ls_stream
                  CHANGING  cr_data = er_stream ).

Let’s test the result now:

2.PNG

If the customer wants to preview the document instead of downloading it directly, change the code as below.

ls_lheader-value = 'inline; filename="Mobile.pdf";'.

Let’s test the result:

A new page opens for preview; you can right-click and choose to save. The file name 'Mobile.pdf' comes up.

3.PNG
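One caveat with hard-coding the header value as above: file names containing spaces or non-ASCII characters have to be percent-encoded, which is what the commented-out escape( ) call hints at. As a hedged illustration, here is a Python sketch of building an RFC 6266-style header value (the helper name is invented for this example):

```python
from urllib.parse import quote

def content_disposition(filename: str, inline: bool = False) -> str:
    """Build a Content-Disposition value; filename* carries the UTF-8 name."""
    disposition = "inline" if inline else "attachment"
    # Plain ASCII fallback for old clients, percent-encoded name for modern ones.
    fallback = filename.encode("ascii", "replace").decode()
    return (f'{disposition}; filename="{fallback}"; '
            f"filename*=UTF-8''{quote(filename)}")

print(content_disposition("Mobile.pdf"))
print(content_disposition("Bericht März.pdf", inline=True))
```

In the ABAP service, the same string would simply be assigned to ls_lheader-value before calling set_header.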

How to configure the Fiori News app using a SICF service and RSS 2.0


This document is a step-by-step guide to configuring the News app in Fiori. In all, there are three steps:

  1. Configure the Fiori tile
  2. Create a table to store news information
  3. Generate a SICF service and provide the RSS 2.0 format

 

Step one: Configure the Fiori tile

Click on the News tile.

1.png

Fill in the SICF service name in the Feed field. Article Refresh Interval is the refresh interval of the News app; 15 min is a reasonable value.

SICF service name: znews (to be created later)

1.png

 

Step two: Create a Table to store news information

This table should contain all essential information needed for news.

1.png

 

Step three: Generate SICF service and provide RSS 2.0 format

For the News app itself, we need to provide an RSS document.

Please refer to XML RSS for the principles of RSS 2.0.

We create a SICF service called znews to generate this kind of RSS document (this is the service name you entered in step one).

Create a class called ZCL_NEWS_SERVICE and add IF_HTTP_EXTENSION to its Interfaces tab. Put the code below in IF_HTTP_EXTENSION~HANDLE_REQUEST, then check and activate the class.

1.png

 


method IF_HTTP_EXTENSION~HANDLE_REQUEST.
  DATA action TYPE string.
  DATA xmlcstr TYPE string.
  DATA newscontent TYPE TABLE OF znewscontent.
  DATA newscontent_l LIKE LINE OF newscontent.
  DATA guidstr TYPE string.
  DATA titlestr TYPE string.
  DATA descriptionstr TYPE c LENGTH 400.
  DATA imagestr TYPE string.
  DATA linkstr TYPE string.

  action = server->request->get_form_field( name = 'type' ).
  CASE action.
    WHEN 'news'.
      SELECT * FROM znewscontent INTO TABLE newscontent.
      xmlcstr = '<?xml version="1.0" encoding="utf-8"?><rss version="2.0"><channel>'.
      LOOP AT newscontent INTO newscontent_l.
        CLEAR: guidstr, titlestr, descriptionstr, imagestr, linkstr.
        CONCATENATE xmlcstr '<item>' INTO xmlcstr.
        guidstr        = newscontent_l-guid.
        titlestr       = newscontent_l-title.
        descriptionstr = newscontent_l-description.
        imagestr       = newscontent_l-imagelink.
        linkstr        = newscontent_l-contentlink.
        CONCATENATE xmlcstr '<guid>' guidstr '</guid>'
                            '<title>' titlestr '</title>'
                            '<description>' descriptionstr '</description>'
                            '<image>' imagestr '</image>'
                            '<link>' linkstr '</link>' INTO xmlcstr.
        CONCATENATE xmlcstr '</item>' INTO xmlcstr.
      ENDLOOP.
      CONCATENATE xmlcstr '</channel></rss>' INTO xmlcstr.

      server->response->set_header_field( name  = 'Content-Type'
                                          value = 'application/xml' ).
      server->response->set_header_field( name  = 'accept-origin'
                                          value = '*' ).
      server->response->set_cdata( data = xmlcstr ).
  ENDCASE.
endmethod.
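String concatenation works here as long as the news fields are plain text, but it breaks as soon as a title contains '<' or '&'. For comparison, here is the same RSS 2.0 structure built with a proper XML API that escapes such characters automatically (a Python sketch; the sample row is invented):

```python
import xml.etree.ElementTree as ET

# One hypothetical row from the news table.
items = [{"guid": "1", "title": "Go-live", "description": "News <1>",
          "image": "/img/1.png", "link": "/news/1"}]

rss = ET.Element("rss", version="2.0")
channel = ET.SubElement(rss, "channel")
for row in items:
    item = ET.SubElement(channel, "item")
    for tag in ("guid", "title", "description", "image", "link"):
        ET.SubElement(item, tag).text = row[tag]

xml_out = ET.tostring(rss, encoding="unicode")
print(xml_out)  # '<' in the description comes out escaped as &lt;
```

In ABAP, the same robustness can be achieved with the iXML or sXML libraries instead of CONCATENATE.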

 

Run transaction SICF.

In the path default_host/sap/bc, right-click on bc and choose New Sub-Element. Fill in your SICF service name.

1.png

Fill in a description and the handler list; leave the others as default.

1.png

Check and activate this service.

You can test the SICF service now: right-click on the service name and choose Test Service.

1.png

Also, if you have configured the Fiori launchpad correctly, you can see the real News app.

1.png

2.PNG

Error - ABAP Dictionary element not found


If you have been creating and updating Gateway services for some time, you might have encountered an issue where you transported your objects/changes to the productive system, but the changes were not reflected in the metadata and therefore did not work.

 

The change could be as simple as introducing a new field, renaming a field, or changing its properties. How it is supposed to work: every time you change the structure of your service via SEGW or directly in your model definition class, the objects are regenerated; once you call the service's metadata, the framework checks the time stamps and refreshes the metadata based on the model definition.

 

Sometimes that doesn't work so well, resulting in the infamous error /IWBEP/CM_MGW_RT106 (message class /IWBEP/CM_MGW_RT, message no. 106). It reads "ABAP Dictionary element 'ZCLXXX_MDP=>FieldXX' not found.", where ZCLXXX_MDP=>FieldXX is the field name, which seems to be the root cause. You can look this up in the error log in the Gateway system under transaction /n/IWFND/ERROR_LOG.

error.png

The reference documentation has the solution: simply clear the cache in BOTH the Gateway and the back-end system.


Usually the landscape admins set up maintenance jobs for the cache. It can also be cleared directly using transaction /IWFND/CACHE_CLEANUP on the Gateway system or /IWBEP/CACHE_CLEANUP in the back-end system, either fully or for a specific model.

 

The error message class also points to where (in which system) the problem is. In my quoted example, it is the back-end system, indicated by IWBEP.

 

Hope this blog is helpful to you!

Field Workforce Management


Executive Summary: Crave InfoTech introduces an Asset Tracking & Field Workforce Management solution certified by SAP.

This solution can help you improve your field workforce productivity, emergency planning & response, scheduling of jobs and, of course, customer satisfaction.

A solution for your retiring workforce that helps track jobs, vehicles, employees and assets.

 

As an electric or water utility professional, you recognize the value of good data. When you link that data to a geographic location on a map, you can visualize the big picture, which gives you a powerful decision-making tool.

Asset Tracking & FWMS provides you with an efficient platform for tracking movable and immovable assets, data management, planning and analysis, field workforce automation and situational awareness.

Asset Tracking can take you to places that are beyond imagination, such as underground fittings, pipelines, poles, switches, etc.

 

 

Breadcrumb trail, alerts, internal messaging and geofencing are a few of the eye-catching features; downloadable routes for mobile users, route-deviation alerts, and a powerful algorithm that routes to the nearest job using the Directions API are among the intuitive features.

For more information visit:  www.Cravetracker.com



Asset Tracking.png

Capture.PNG

 

 

 

 

Asset tracking

  • Asset Tracking is a cloud-based solution for tracking jobs, immovable assets, vehicles and people in the field in real time using Google & ESRI maps.
  • Job creation, assignment and execution are made much easier and more transparent using C-ASTRA (the best solution to keep track of field work).
  • Vehicle theft, fuel theft, misguided routes and long halts are no longer a problem; C-ASTRA is a single solution to strengthen your field force.
  • Asset Tracking provides scintillating features like the breadcrumb trail (vehicle history), alerts (misguided route, vehicle maintenance, geofence entry & exit and much more), internal messaging and geofencing.

Best surveillance system for your immovable and movable assets.

Vehicle Tracking:

  • Track your vehicles in real time on a map using a GPS device.
  • Disable the ignition remotely using SMS commands.
  • Calculate mileage, maximum idling and kilometers driven for all vehicles.
  • Control speed using alerts to the driver; get acceleration and hard-braking events during the trip.
  • Get the history of each trip.

Employee Tracking:

  • Track your marketing professionals in real time during working hours.
  • Monitor the client visits of your marketing professionals.
  • Helps employees claim accident insurance in case of a mishap during working hours.

Asset Inspection & inventory

  • This app captures the physical condition of an asset with current location details and sends it to the GIS server.
  • Easy to use, saves time and helps to locate assets easily.

Mobile device Management:

  • With hundreds of users being tracked in the field, managing the mobile user base is quite difficult; the MDM solution Afaria helps to serve big customers.
  • Helps to deploy mobile apps easily on a bigger scale.
  • Apply policies to minimize use of the data plan while roaming.
  • Helps to locate stolen mobile devices.
  • Troubleshoot if no data is received.

Interface with SAP and Other Applications:

  • Interface with SAP using SAP NetWeaver Gateway.
  • Use of web services and Online Data Proxy to interface with GIS, to download jobs and assign work centers (drivers to vehicles) from GIS.
  • Can run independently.

GPS Device (Cal Amp LMU)

  • CalAmp GPS devices are small and easy to install, and deliver highly accurate results; the devices are built so that even remote and difficult areas do not affect their accuracy.
  • CalAmp devices are easy to configure and troubleshoot remotely using SMS commands.
  • Minimal use of the data plan, as data is transmitted in hexadecimal code.
  • Built-in memory stores data when there is no signal to send it.

 

 

Visit SAP Store

https://store.sap.com/sap/cpa/ui/resources/store/html/SolutionDetails.html?pcntry=US&sap-language=EN&pid=0000012059



Asset Tracking & Inspection is listed as top five apps on SAP store

http://www.news-sap.com/5-top-apps-sap-store/


 


 

 

Field Service Manager (Using SAP Mobile Platform):

This application is used for field workforce management. It is developed using SAP Mobile Platform as middleware and SAP IS-U as the backend, targeting Android and MDT devices. The application manages service order completion, inventory movements (truck-to-truck transfers), meter reading entry, meter installation, removal, and exchange, turn-on, turn-off, shut-off for non-payment, customer complaint investigation,

disconnection and reconnection, permitting, maintenance orders, measuring points and payment collection, substation measuring points, hydrant maintenance, leak readings, pole inspection, and pending and completed jobs. Jobs are routed with shortest-distance calculation using Google Maps and voice navigation. The business process logic and data are encapsulated in Mobile Business Objects (MBOs) using SAP SyncBO and BAPI wrappers.

 

C-FSM (Crave Field Service Manager) for Utilities enables platform-independent processing of maintenance and service processes on mobile devices. As a full offline/online mobile application, C-FSM provides ready-made scenarios that help field forces and service technicians perform their daily activities at customer sites and within plants. Tight integration with Enterprise Asset Management (EAM) and SAP Utilities (IS-U/CCS or SAP ECC Industry Extension Utilities) allows field forces to work on internal orders (within your own technical installations) as well as at the customer site. The solution integrates customer data from SAP Utilities and SAP EAM in one mobile application.

Visit Android app Store: https://play.google.com/store/apps/details?id=com.crave.mam&hl=en


 

 

 

PLAN YOUR FIELD WORKFORCE TODAY WITH INTELLIGENT GLOBAL INFORMATION SYSTEM.

 

Visit : www.craveinfotech.com

http://www.news-sap.com/5-top-apps-sap-store/

Sneak Peek into SAP Gateway and API management Program at SAP TechEd, 2015 at Las Vegas on Oct 19-23, 2015


After initiating a strategic partnership with Apigee, SAP delivered an on-premise API management solution based on the Apigee Edge intelligent API management platform in 2014. In June 2015, SAP launched SAP API Management, a cloud offering powered by SAP HANA Cloud Platform that provides scalability to meet demand as needed, at low investment cost amortized over a subscription model. Since its launch there has been tremendous interest from customers, as these products provide a quick, easy, and low-investment path to the digital transformation of your business.


Running on SAP HANA® Cloud Platform, the SAP® API Management technology enables organizations that publish APIs to share digital assets and processes. It also enables developer communities to consume digital assets with ease in new channels, devices, and user interfaces. Learn more about it at SCN API Management Community. SAP Gateway is a technology that enables a simple way to connect devices, environments and platforms to SAP software based on OData standard. SAP Gateway is one of the important API sources that work seamlessly with the API management solution. Learn more about it at SCN Gateway Community. Below is an overview of how SAP API management integrates with the SAP product family including SAP Gateway:


APIMgtArchitecture.png 

 

Do you want to check out these solutions, see the demos, and have a face-to-face conversation with SAP experts about Gateway and API Management? Then SAP TechEd 2015 at Las Vegas (Venue: Venetian|Palazzo Congress Center, Date: Oct 19-23, 2015) is a great opportunity for you. The easiest way to connect with SAP experts is to visit our pod. Here are the details:

  • Pod Headline: API Process Integration For Your Digital Economy
  • Pod Location: Explore (Hall C Level 2)
  • Pod Hours: Tuesday–Thursday • 10:00 a.m. - 6:00 p.m.
  • SAP experts present at the POD:

               - Michael Hill- Director, HANA Cloud Platform Product Marketing at SAP

               - Elijah Martinez- Product Manager, SAP Gateway and API Management at SAP

 

Here is what our team offers at TechEd 2015, Las Vegas:

 

1.       Lectures, hands-on and other sessions on SAP API management and SAP Gateway:

 

Session ID | Session Type | Duration | Title | Speakers | Session Date/Time
INT100 | Lecture | 1 hr | Integration and Orchestration: Overview and Outlook | Christoph Liebig | Tue: 11:00am-12:00pm; Wed: 10:30am-11:30am
INT103 | Lecture | 1 hr | Apply REST with SAP API Management | Harsh Jegadeesan | Tue: 2:00pm-3:00pm; Thu: 5:45pm-6:45pm
INT104 | Lecture | 1 hr | OData Service Deployment Options for On-Premise, Mobile and Cloud Scenarios | Christian Loos, Stephan Herbert | Tue: 12:15pm-1:15pm; Wed: 9:15am-10:15am
INT105 | Lecture | 1 hr | Intel Case Study: Enable Your Business with SAP Gateway | Stephan Herbert | Wed: 5:45pm-6:45pm; Thu: 2:00pm-3:00pm
INT201 | Lecture | 1 hr | How to Deal with Integration and Orchestration Aspects of IoT Scenarios | Stephan Herbert | Wed: 11:45am-12:45pm; Thu: 8:00am-9:00am
INT208 | Lecture | 1 hr | Enabling Shadow IT with SAP Gateway | Brian ONeill | Thu: 4:30pm-5:30pm
INT260 | Hands-On | 4 hrs | Develop an E2E Integration Scenario with SAP Gateway, SAP HANA, and SAPUI5 | John Patterson, Oliver Heinrich, Joerg Singler, Andre Fischer | Wed: 8:00am-12:00pm; Thu: 8:00am-12:00pm
INT269 | Hands-On | 2 hrs | SAP API Management: On Demand and on Premise | Peter Ng, Harsh Jegadeesan | Tue: 4:30pm-6:30pm; Wed: 2:00pm-4:00pm
INT600 | Mini Code Jam | 1 hr | Build Your First SAP Gateway App (OData) with SAP API Management | Harsh Jegadeesan | Wed: 12:30pm-1:30pm
INT802 | Road Map Q&A | 1 hr | Road Map Q&A: SAP Gateway | Stephan Herbert | Tue: 1:15pm-2:15pm; Thu: 4:00pm-5:00pm
INT804 | Road Map Q&A | 1 hr | Road Map Q&A: SAP API Management | Harsh Jegadeesan | Tue: 3:15pm-4:15pm; Wed: 4:00pm-5:00pm

 

2.      Tech Talk:


Time: 10/21/2015- Wed: 3.15 p.m. – 4 p.m. (30-35 minutes talk followed by 10-15 min Q&A).

Presenter: Ed Anuff, VP of Product Strategy at Apigee

Title: API-First: The New Enterprise Architecture For Systems of Engagement
Abstract: Many developers and architects who are very familiar with the principles of SOA are confused by all the talk of APIs. Ed Anuff will discuss why APIs are core to your enterprise's "Systems of Engagement" and why they differ from "Systems of Record" technologies such as ESBs and SOA. He'll explore the latest trends in API design, such as experience-based APIs designed to power end-user interactions across devices and mobile apps, as well as the emerging use of microservices and API best practices to assemble web-scale composite applications.
Check out more here.


3.       Demos shown at the POD:

 

Visit and interact with our experts at the POD in the Explore zone (Hall C Level 2). There will be several innovative demos featuring the entire Integration and Orchestration product portfolio, which covers Gateway, API Management, HANA Cloud Integration, Process Integration, Operational Intelligence, and much more. Here is a preview of one of the demos, powered by SAP API Management, that you can experience live at the POD:


 

  • Augmented Reality - Visual Showroom: Through a tablet device, users can see an augmented-reality view of an interior design item such as furniture or windows, check the catalog details, and create orders in SAP. SAP developed this demo app jointly with Quantilus. Learn more about Quantilus at: http://www.quantilus.com/index.html



VisualShowroom.png


  • SAP API Management, cloud edition: showcases tight integration between the SAP API Management solution and a classical SAP backend system running SAP Gateway to expose business data via OData as an API; in addition, it demonstrates a mashup of SAP business partner data onto a live Google map.


4.     POD Expert Networking Sessions

 

With Sandeep Murusupalli, Solution Architect from Apigee and Peter NG, Senior Product Manager at SAP Labs, LLC


Join us for a Q&A session on API Management. They will show a live demo of asset repairs using a smartwatch and discuss the critical role API Management plays in this demo.

 

 

 

For more information, check out following social channels:

 

1. SAP Community Network: Gateway   API Management

2. YouTube: Gateway    API Management

3. Twitter: SAP Cloud   SAP Developers

4. Facebook: SAP Cloud   SAP Developers

5. SAP API Management Web site

 

 

  • SAP API management and Gateway Product Management Team

FIELD WORK FORCE MANAGEMENT


Executive Summary: Crave InfoTech introduces an SAP-certified Asset Tracking & Field Workforce Management solution.

This solution can help you improve your field workforce productivity, emergency planning and response, job scheduling and, of course, customer satisfaction.

A solution for your retiring workforce which helps tracking jobs, vehicles, employees and assets.

 

As an electric or water utility professional, you recognize the value of good data. When you link that data to a geographic location on a map, you can visualize the big picture, which gives you a powerful decision-making tool.

Asset Tracking & FWMS provides an efficient platform for tracking movable and immovable assets, data management, planning and analysis, field workforce automation, and situational awareness.

Asset Tracking can take you to places beyond easy reach, such as underground fittings, pipelines, poles, switches, etc.

 

 

Bread Crumb Trail, alerts, internal messaging, and geo-fencing are a few of the eye-catching features. Downloadable routes for mobile users, route-deviation alerts, and a powerful algorithm that routes to the nearest job using the Directions API are among the intuitive features.

For more information visit:  www.Cravetracker.com



Asset Tracking.png

Capture.PNG

 

 

 

 


File Attachment in Material Document (MIGO) using SAP Gateway


Business Case: The business requirement is to attach a document while doing the goods receipt for a material. The attachments can be reference documents, images, or any other documents related to the goods receipt of the material. To meet this requirement, SAP provides a toolbar called Generic Object Services (GOS).

 

 

Nowadays, with mobile applications in use, there is a requirement to create an attachment for an existing material document through SAP NetWeaver Gateway. We can upload any document type (PDF/DOC/JPG/XLS), and through an OData service the document will be linked to the corresponding material document.

 

Below are the steps required to create this OData service in SAP NetWeaver Gateway.

 

Step-1: Create a project for Attachment using SEGW transaction.

 

Step-2: Create an entity type and entity set. Remember that the entity type must be defined as a Media type.

 

1.jpg

 

Step-3: Create a property for the Entity type.

 

2.jpg

Step-4: Generate the project. This creates the backend classes MPC, MPC_EXT, DPC, and DPC_EXT.

Now go to the DPC_EXT class and redefine the method /IWBEP/IF_MGW_APPL_SRV_RUNTIME~CREATE_STREAM.

Inside the method, write the code below to convert the XSTRING received from the OData service into binary format and then upload the binary data into SAP.

 

METHOD /iwbep/if_mgw_appl_srv_runtime~create_stream.

  DATA: lt_objhead    TYPE STANDARD TABLE OF soli,
        lt_content    TYPE solix_tab,
        lt_xdata      TYPE solix_tab,
        lt_data       TYPE soli_tab,
        ls_folmem_k   TYPE sofmk,
        ls_note       TYPE borident,
        ls_object     TYPE borident,
        ls_obj_id     TYPE soodk,
        ls_fol_id     TYPE soodk,
        ls_obj_data   TYPE sood1,
        ls_data       TYPE soli,
        ls_xdata      TYPE solix,
        lv_ep_note    TYPE borident-objkey,
        lv_extension  TYPE c LENGTH 4,
        lv_mblnr      TYPE mblnr,
        lv_mjahr      TYPE mjahr,
        lv_objkey     TYPE char70,
        lv_tmp_fn     TYPE string,
        lv_file_des   TYPE so_obj_des,
        lv_offset     TYPE i,
        lv_size       TYPE i,
        lv_temp_len   TYPE i,
        lv_offset_old TYPE i.

  CONSTANTS: lc_x        TYPE c LENGTH 1 VALUE 'X',
             lc_hex_null TYPE x LENGTH 1 VALUE '20'.

* Convert the incoming XSTRING to binary
  CALL FUNCTION 'SCMS_XSTRING_TO_BINARY'
    EXPORTING
      buffer          = is_media_resource-value
      append_to_table = lc_x
    TABLES
      binary_tab      = lt_content.

* Get the root folder ID
  CALL FUNCTION 'SO_FOLDER_ROOT_ID_GET'
    EXPORTING
      region                = 'B'
    IMPORTING
      folder_id             = ls_fol_id
    EXCEPTIONS
      communication_failure = 1
      owner_not_exist       = 2
      system_failure        = 3
      x_error               = 4
      OTHERS                = 5.

* Get the document number and file name from the slug
  SPLIT iv_slug AT '/' INTO lv_mblnr lv_mjahr lv_file_des.

* Get the file extension
  SPLIT lv_file_des AT '.' INTO lv_tmp_fn lv_extension.

  CONCATENATE lv_mblnr lv_mjahr INTO lv_objkey.
  ls_object-objkey = lv_objkey.

* For goods movements the BOR object type is BUS2017
  ls_object-objtype    = 'BUS2017'.
  ls_obj_data-objsns   = 'F'.
  ls_obj_data-objla    = sy-langu.
  ls_obj_data-objdes   = lv_file_des.
  ls_obj_data-file_ext = lv_extension.
  TRANSLATE ls_obj_data-file_ext TO UPPER CASE.

* Calculate the length and split the payload into 255-byte lines
  lv_offset = 0.
  lv_size   = xstrlen( is_media_resource-value ).
  ls_obj_data-objlen = lv_size.

  WHILE lv_offset <= lv_size.
    lv_offset_old = lv_offset.
    lv_offset     = lv_offset + 255.
    IF lv_offset > lv_size.
      lv_temp_len = xstrlen( is_media_resource-value+lv_offset_old ).
      CLEAR ls_xdata-line WITH lc_hex_null IN BYTE MODE.
      ls_xdata-line = is_media_resource-value+lv_offset_old(lv_temp_len).
    ELSE.
      ls_xdata-line = is_media_resource-value+lv_offset_old(255).
    ENDIF.
    APPEND ls_xdata TO lt_xdata.
  ENDWHILE.

* Convert the hex table to a text table
  CALL FUNCTION 'SO_SOLIXTAB_TO_SOLITAB'
    EXPORTING
      ip_solixtab = lt_xdata
    IMPORTING
      ep_solitab  = lt_data.

* Insert the document
  CALL FUNCTION 'SO_OBJECT_INSERT'
    EXPORTING
      folder_id                  = ls_fol_id
      object_type                = 'EXT'
      object_hd_change           = ls_obj_data
    IMPORTING
      object_id                  = ls_obj_id
    TABLES
      objhead                    = lt_objhead
      objcont                    = lt_data
    EXCEPTIONS
      active_user_not_exist      = 1
      communication_failure      = 2
      component_not_available    = 3
      dl_name_exist              = 4
      folder_not_exist           = 5
      folder_no_authorization    = 6
      object_type_not_exist      = 7
      operation_no_authorization = 8
      owner_not_exist            = 9
      parameter_error            = 10
      substitute_not_active      = 11
      substitute_not_defined     = 12
      system_failure             = 13
      x_error                    = 14
      OTHERS                     = 15.

  IF sy-subrc = 0 AND ls_object-objkey IS NOT INITIAL.
    ls_folmem_k-foltp = ls_fol_id-objtp.
    ls_folmem_k-folyr = ls_fol_id-objyr.
    ls_folmem_k-folno = ls_fol_id-objno.
    ls_folmem_k-doctp = ls_obj_id-objtp.
    ls_folmem_k-docyr = ls_obj_id-objyr.
    ls_folmem_k-docno = ls_obj_id-objno.

    lv_ep_note = ls_folmem_k.

    ls_note-objtype = 'MESSAGE'.
    ls_note-objkey  = lv_ep_note.

*   Link the inserted object to the material document
    CALL FUNCTION 'BINARY_RELATION_CREATE_COMMIT'
      EXPORTING
        obj_rolea      = ls_object
        obj_roleb      = ls_note
        relationtype   = 'ATTA'
      EXCEPTIONS
        no_model       = 1
        internal_error = 2
        unknown        = 3
        OTHERS         = 4.

    CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
      EXPORTING
        wait = 'X'.
    IF sy-subrc EQ 0.
      COMMIT WORK.
    ENDIF.
  ENDIF.

ENDMETHOD.
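The 255-byte chunking performed by the WHILE loop above can be illustrated outside ABAP. A rough Python analogue (for illustration only, not part of the service; the exact padding of a size-aligned payload differs slightly from the ABAP loop):

```python
def chunk_255(payload: bytes) -> list:
    """Split a byte string into 255-byte lines, padding the last line with
    0x20 bytes, mirroring what the ABAP loop does with SOLIX lines."""
    lines = []
    for offset in range(0, len(payload), 255):
        line = payload[offset:offset + 255]
        if len(line) < 255:
            # ABAP clears the remaining line with hex '20' in byte mode
            line = line.ljust(255, b"\x20")
        lines.append(line)
    return lines
```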

 

Step-5: Now go to the Gateway client (transaction /IWFND/GW_CLIENT) and add a document. In the SLUG parameter, pass the material document number, year, and file name separated by '/'.

 

3.jpg

 

Now click Execute. While executing, keep in mind that the HTTP method POST must be selected.
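The same upload can also be scripted outside the Gateway client. A minimal Python sketch, under the assumption that the service's media entity set is exposed at some URL (the URL, the content type, and the CSRF token handling are placeholders you must adapt to your own service):

```python
import urllib.request

def build_slug(mblnr: str, mjahr: str, filename: str) -> str:
    """SLUG format expected by CREATE_STREAM above:
    material document number / year / file name."""
    return "/".join([mblnr, mjahr, filename])

def upload_attachment(service_url: str, slug: str, payload: bytes, token: str):
    # POST the raw file bytes; CREATE_STREAM receives them as
    # is_media_resource-value. The CSRF token must be fetched beforehand
    # with a GET carrying the header "X-CSRF-Token: Fetch".
    req = urllib.request.Request(
        service_url,
        data=payload,
        method="POST",
        headers={
            "SLUG": slug,
            "Content-Type": "application/pdf",
            "X-CSRF-Token": token,
        },
    )
    return urllib.request.urlopen(req)
```

For example, `build_slug("5000000123", "2016", "invoice.pdf")` yields the slug string that the SPLIT in CREATE_STREAM parses back into document number, year, and file name.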

 

Step-6: After executing the OData service, go to transaction MIGO and enter the material document number and year. You will be able to see the attachment.

 

I hope this will be helpful to many of you. 

 

Regards,

Nitin

Action packed first 3 days at TechEd 2015


Here is the recap of first 3 days at SAP TechEd 2015.

 

On the first day's evening, Steve Lucas highlighted the digital economy in his keynote session. He pointed out that today all economies are going digital (even agriculture); to survive, you must make the digital shift. He described how SAP is moving from a traditional enterprise company to a cloud company: the SAP you knew was analytics, databases, and HANA; the SAP you know now is analytics, S/4, and HANA; and the SAP you should know is Cloud for Analytics, S/4, HANA, and Vora. Check out the recorded session here.


SteveKeynote.png

 

The second day started with an energizing keynote session by SAP Executive Board Member Bernd Leukert. He said, "You have the ability to manage your company in a completely different way," and that SAP has reinvented the analytics business. Leukert called Cloud for Analytics "the newest member of our SAP family" before a demonstration featuring a fictitious retail chain with a few underperforming stores. The tool quickly identified them, represented them geographically, and even evaluated their competition, revealing that sales at rival stores were stealing market share. He also stressed SAP HANA Cloud Platform as the platform for building business applications, and a demo showed how HCP can help an enterprise create a Web page to conduct business in another country. There was much more; you can see the recorded video here.

 

Keynote2.jpg

Keynote2_1.JPG

 

After this energetic start to the day, there was a hands-on session (INT269) on SAP API Management given by Harsh Jegadeesan and Peter Ng. They talked about how this solution provides simple, scalable, secure, and unified access to APIs based on open standards such as REST, OData, and OAuth2. There were hands-on exercises as well, to give an idea of the different features of SAP API Management.

 

HandsOn2.JPG

 

Throughout the day, our staff at the POD (on the show floor in the Explore zone, Hall C Level 2), Michael Hill, Elijah Martinez, and Sonali Desai, was busy interacting with interested customers and showing them demos of the Integration and Orchestration products (Process Integration, Process Orchestration, HANA Cloud Integration, Gateway, and API Management). You can watch our Gateway demos here and check out the API Management demos here.

 

Img1.jpg

Img2.jpg

Img3.jpg

On day 3, there was an expert networking session with Paul J. Modderman from Mindset Consulting on 'Innovate with SAP Gateway and API Management'. He talked about how API Management can onboard developers quickly and how you can use SAP Gateway to expose your RFCs/BAPIs as SOAP and OData/REST services. Here are some pics from that networking session:

 

IMG_3191.JPG

Expert2.jpg

There was another expert networking session with Sandeep Murusupalli, Solution Architect from Apigee, and Peter Ng, Senior Product Manager at SAP Labs, LLC. It was a Q&A session on API Management. They showed a live demo of asset repairs using a smartwatch and discussed the critical role API Management plays in it. Here are some pics:

 

Expert3.jpg

There was another session, given by Stephan Herbert, VP P&I Technology I/O Gateway, and Lisa Hoshida from Intel, on 'Enable Your Business with SAP Gateway'. They discussed the Intel case study and how SAP Gateway can empower your workforce by moving critical, real-time data from an SAP backend to any mobile device.

 

INT105.jpg

Overall it was a great experience. Customers were delighted to see Integration and Orchestration Product portfolio offering and were interested in learning more about the products and how these products can fit in their use cases.


Gateway Goggles


Imagine this: your friend is a plant maintenance supervisor. His colleagues running the plant fill out hand-written forms identifying broken down equipment. Half those papers he loses somewhere in his disaster of a desk. The other half he has to mash out on his desktop keyboard over lunch. His job is fixing problems, but he's spending more time pushing papers.

 

Enter you. You're a phenomenally skilled SAP developer, known through the company for creative solutions. You hand your buddy an iPhone. On it runs an SAPUI5 app that lets him take a picture of the paper as soon as it's handed to him. The app interprets the things written on the paper and puts them in the appropriate fields on his screen. He gives it a once-over and hits "Submit" - ready to actually enjoy a distraction-free lunch break.

 

You are a hero to your friend and he tells everyone how awesome you are. The CEO gets wind of this innovation and makes you CTO. You retire wealthy and start that surf shop on the beach you always dreamed of.


I'm going to show you how to give your Gateway system “eyes” that can interpret content in a photo. In a ridiculously easy way. No promises on the surf shop, though.

 

Google recently made their Cloud Vision API available for anyone to try. I love when the big software companies give me things to play with. It means I can try out wild and crazy ideas from the comfort of my keyboard. So as soon as I could, I took some free time to tinker with the Vision API and mash it up with SAP Gateway.

 

I present here a simple prototype for using these two tools in tandem. There are about a billion ways this could be useful, so I hope my little slice of code helps someone along the way.

 

I’ll show you how to use Gateway to request Google Vision API processing. I picked a couple Vision abilities that I find awesome, but the API is capable of more.


Without further ado - let’s get started!


Setup


Before you write any code, you’ll need:

  • An SAP Gateway system. If you’re reading this blog and don’t know what that is, then I apologize because you’re probably really bored.
  • Configuration to allow that system to HTTP POST to an external internet API. See here for setting up STRUST to allow that.
  • A Google account, with the Cloud Vision API enabled. Be warned: if you use it more than 1,000 times a month, it’s not free. Just make sure it takes you less than 1,000 tries to get it right.
  • An API key set up in the Google account. I suggest using the browser API key for prototyping, and service accounts for productive use. Getting an API key is covered in the Google getting started guide.
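With the account set up, you can sanity-check the Google side before writing any ABAP. A minimal Python sketch of the request body that the `images:annotate` endpoint expects; `YOUR_KEY` and the dummy image bytes are placeholders, and the feature type (e.g. `LABEL_DETECTION` or `TEXT_DETECTION`) is what the ABAP version later passes in via the slug:

```python
import base64
import json

# Placeholder endpoint; replace YOUR_KEY with your own browser API key.
VISION_URL = "https://vision.googleapis.com/v1/images:annotate?key=YOUR_KEY"

def build_annotate_request(image_bytes: bytes, feature_type: str,
                           max_results: int = 1) -> str:
    """Build the JSON body for images:annotate; the image is sent
    base64-encoded, mirroring the structure the ABAP code assembles."""
    payload = {
        "requests": [
            {
                "image": {"content": base64.b64encode(image_bytes).decode("ascii")},
                "features": [{"type": feature_type, "maxResults": max_results}],
            }
        ]
    }
    return json.dumps(payload)

# Example: request a single label for a (dummy) image
body = build_annotate_request(b"\x89PNG...", "LABEL_DETECTION")
```

POSTing that body to `VISION_URL` with Content-Type `application/json` is exactly what the Gateway service will do from ABAP later on.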

 

Once you have the above configured, it's time to get cracking on the code.

 

Show Me The Code Already

Now that we have things ready to roll, fire up your Gateway system and go to t-code SEGW. I set up a very simple entity that will just hold the description of what Google thinks an image is. Just 3 fields:


Screen Shot 2016-02-26 at 4.56.55 PM.png


Make sure to flag that entity as a "Media" entity:


Screen Shot 2016-02-26 at 4.57.38 PM.png


And that’s it for our bare-bones service definition. You could get a lot more architecturally crazy and set up a bunch of entities and fields to capture every single thing that comes out of the Google API - but I just wanted to get it up and running to see what I could get.


Only two steps left in the setup: coding the model enhancement necessary for media entities and coding the CREATE_STREAM method for processing the file data.


First, the model enhancement. Navigate to the *MPC_EXT class in your project and do a redefinition of the DEFINE method. This code should get you what you need. It’s so short that it’s basically self-explanatory:



  METHOD define.
    super->define( ).

    DATA: lo_entity   TYPE REF TO /iwbep/if_mgw_odata_entity_typ,
          lo_property TYPE REF TO /iwbep/if_mgw_odata_property.

    lo_entity = model->get_entity_type( iv_entity_name = 'VisionDemo' ).
    IF lo_entity IS BOUND.
      lo_property = lo_entity->get_property( iv_property_name = 'ContentType' ).
      lo_property->set_as_content_type( ).
    ENDIF.
  ENDMETHOD.


The model is now ready to support media stuff (in our case pictures) coming in. The other side of the equation is to prepare the request to be sent to Google for processing. We’ll do that in the CREATE_STREAM method of the *DPC_EXT class that the SEGW project generated. Same deal as before, do a redefine of that method and put in the following code:



  METHOD /iwbep/if_mgw_appl_srv_runtime~create_stream.

    TYPES: BEGIN OF feature,
             type        TYPE string,
             max_results TYPE i,
           END OF feature.
    TYPES: features TYPE STANDARD TABLE OF feature WITH DEFAULT KEY.
    TYPES: BEGIN OF image,
             content TYPE string,
           END OF image.
    TYPES: BEGIN OF request,
             image    TYPE image,
             features TYPE features,
           END OF request.
    TYPES: requests TYPE STANDARD TABLE OF request WITH DEFAULT KEY.
    TYPES: BEGIN OF overall_request,
             requests TYPE requests,
           END OF overall_request.

    DATA overall_request  TYPE overall_request.
    DATA requests         TYPE TABLE OF request.
    DATA request          TYPE request.
    DATA feature          TYPE feature.
    DATA lv_b64_content   TYPE string.
    DATA lo_http_client   TYPE REF TO if_http_client.
    DATA lv_response_data TYPE string.
    DATA lv_url           TYPE string.
    DATA lv_request_json  TYPE string.
    DATA lv_response_json TYPE string.
    DATA lo_descr         TYPE REF TO cl_abap_structdescr.
    DATA lv_start         TYPE i.
    DATA lv_end           TYPE i.
    DATA lv_total_chars   TYPE i.
    DATA ls_visiondemo    TYPE zcl_zgoogle_vision_mpc=>ts_visiondemo.
    DATA lv_end_marker    TYPE string.

    " Google expects the picture data base64-encoded
    CALL FUNCTION 'SCMS_BASE64_ENCODE_STR'
      EXPORTING
        input  = is_media_resource-value
      IMPORTING
        output = lv_b64_content.

    lv_url = 'https://vision.googleapis.com/v1/images:annotate?key=GET_YOUR_OWN_KEY'.

    " Build the request structure: one image, one feature taken from the slug
    request-image-content = lv_b64_content.
    feature-type          = iv_slug.
    feature-max_results   = 1.
    APPEND feature TO request-features.
    APPEND request TO requests.
    overall_request-requests = requests.

    lo_descr ?= cl_abap_typedescr=>describe_by_data( overall_request ).
    lv_request_json = /ui2/cl_json=>dump( data        = overall_request
                                          type_descr  = lo_descr
                                          pretty_name = abap_true ).

    cl_http_client=>create_by_url(
      EXPORTING
        url    = lv_url
      IMPORTING
        client = lo_http_client ).

    lo_http_client->request->set_method( method = 'POST' ).
    lo_http_client->request->set_content_type( content_type = 'application/json' ).
    lo_http_client->request->append_cdata2( EXPORTING data = lv_request_json ).
    lo_http_client->send( ).
    lo_http_client->receive( ).

    lv_response_data = lo_http_client->response->get_cdata( ).

    " Pull the description out of the response by string markers
    IF iv_slug = 'LOGO_DETECTION'.
      lv_end_marker = '"score":'.
    ELSE.
      lv_end_marker = '"boundingPoly":'.
    ENDIF.

    SEARCH lv_response_data FOR '"description":'.
    lv_start = sy-fdpos + 16.
    SEARCH lv_response_data FOR lv_end_marker.
    lv_end = sy-fdpos.
    lv_total_chars = lv_end - lv_start.

    ls_visiondemo-id          = 1.
    ls_visiondemo-description = lv_response_data+lv_start(lv_total_chars).

    copy_data_to_ref( EXPORTING is_data = ls_visiondemo
                      CHANGING  cr_data = er_entity ).

  ENDMETHOD.


Note the following about this code snippet:

  • I’m using the IV_SLUG parameter to control what kind of request (logo or text detection) I’m making to Google. This means using the “slug” header in an HTTP request, which I’ll show you below.
  • Google expects picture data to be base64 encoded, so the FM SCMS_BASE64_ENCODE_STR handles that for us.
  • Get your own API key - the string at the end of my URL will not work for you. Replace GET_YOUR_OWN_KEY with your actual key.
  • There are a number of ways to handle JSON type data in ABAP. I used the /ui2/cl_json method purely out of simplicity for a demo. For a more robust solution see how to use JSON with the XML parsing tools.
  • There is basically no error handling here. That’s the great thing about prototyping.
  • I know the way I pull the description out of the response is a total hack.
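
For orientation, here is roughly what the JSON built by /ui2/cl_json=>dump looks like for a TEXT_DETECTION request. The base64 content is truncated here; the field names follow the Vision API images:annotate format, with max_results rendered as maxResults by the pretty-name option:

```json
{
  "requests": [
    {
      "image": {
        "content": "<base64-encoded image bytes>"
      },
      "features": [
        { "type": "TEXT_DETECTION", "maxResults": 1 }
      ]
    }
  ]
}
```

A trimmed response then contains the markers the string hack searches for: a "description": field, followed later by "boundingPoly": (or "score": in the logo case).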

 

 

Try It Out

The easiest way to try this out is through the Gateway Client (/iwfnd/gw_client). Here’s how:


Navigate to /iwfnd/gw_client on your Gateway system. Enter the request parameters as seen here (assuming you’ve named things the same as I have):


gw_client setup.PNG


The two possible values I verified for “slug” are TEXT_DETECTION and LOGO_DETECTION - though the API supports many more than that.
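
Because the slug is passed straight through as the feature type, other detection types documented for the Vision API should reach Google unchanged, for example LABEL_DETECTION, FACE_DETECTION, LANDMARK_DETECTION, or SAFE_SEARCH_DETECTION. Only the features part of the generated request body changes, e.g.:

```json
{ "type": "LABEL_DETECTION", "maxResults": 1 }
```

Keep in mind that the quick-and-dirty parsing in CREATE_STREAM only hunts for "description" and the two end markers, so feature types with a different response shape (SAFE_SEARCH_DETECTION, for instance, returns no description at all) would need their own handling.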


Next, put a picture in the body of the request by clicking “Add File”. If you choose TEXT_DETECTION as the slug, then make sure your image actually has text. Here’s what the result looks like if I put in a picture of my business card. Look at the “Description” field in the right hand text (and note that Google automatically puts in newline characters if there are line breaks in the picture):


result from business card.PNG



And check it out if I put in a logo with the LOGO_DETECTION slug parameter (“Skunk Works” is the right answer for this picture):


logo detection result.PNG


Wrap It Up, Paul

So I’ve proved out that I can use the Google Cloud Vision API in conjunction with SAP Gateway - but I haven’t really done anything truly useful. However, I have some really exciting ideas for this and can’t wait to continue using these mega-powerful cloud services! I hope that my small example helps someone else dream big and make something amazing.

OData Service for Purchase Order Using RFC


Let’s discuss the PO creation steps using Gateway services in detail.

Step 1. Open transaction SEGW and create a project as shown below.

1.png

Enter the details as shown below.

2.png

Step 2. Create the first entity by importing an RFC interface. To do this, right-click Data Model and choose Import -> RFC/BOR Interface.

3.png

Step 3. Enter the following values in the wizard and then choose Next:

 

Entity Type Name: PurchaseOrder
Target System: Local
Data Source Type: Remote Function Calls
Data Source Name: BAPI_PO_GETDETAIL

4.png

Step 4. Expand the PO_HEADER node and select the following fields:
POHEADER, COMPANYCODE, DOC_CAT, DOC_TYPE, STATUS, VENDOR, PURCH_ORG, PUR_GROUP, then choose Next.



5.png

Step 5. In the first line, PO_NUMBER, select the field Is Key and choose Finish:

6.png

Step 6. Create the second entity again by importing an RFC interface. Right-click Data Model and choose Import -> RFC/BOR Interface

7.png

Step 7. Enter the following values in the wizard and choose Next:

 

Entity Type Name: PurchaseOrderItem
Target System: Local
Data Source Type: Remote Function Calls
Data Source Name: BAPI_PO_GETITEMS

8.png

Step 8. Expand the PO_ITEMS node and select the following fields:
PO_ITEM, MATERIAL, PUR_MAT, MAT_GRP, NET_PRICE, PRICE_UNIT, DISP_QUAN, then choose Next.

     9.png

10.png

Step 9. Now our project has two entities, one for the Purchase Order and one for the Purchase Order Item. As a next step we create entity sets out of these entities. Expand the node Data Model and double-click Entity Sets:

Name: PurchaseOrderSet, Entity Type Name: PurchaseOrder
Name: PurchaseOrderItemSet, Entity Type Name: PurchaseOrderItem

11.png

Step 10. Now the basic definition of the model is done. As a next step we generate the necessary runtime artifacts.

a.     Click on the Generate pushbutton:

b.     Leave the default values and choose Enter:

12.png

Please note the Technical Service Name ZPURCHASEORDER_SRV is equal to the External Service Name required to consume this service later on. 

c. Choose Local Object.

d. The runtime objects are now generated successfully.

Step 11. Now we can register and activate the service.

a. Double-click Service Maintenance

13.png

b. Select system EC7 and click the Register button. Please note that the entries listed here depend on the System Alias configuration you have done in the SAP NetWeaver Gateway Implementation Guide (IMG). In an embedded deployment (backend and hub components deployed on the same box) you might also find LOCAL with the RFC destination NONE here.

14.png

c. Confirm the warning message displayed in the popup: click Yes.

d. Press F4 to select the system alias. Select LOCAL from the input help.

e. Confirm the Select System Alias popup: click OK.

f. Leave the default values, enter $TMP as the package, and choose Enter:

15.png

The External Service Name defaults to the Technical Service Name from the generation step.

g. Verify that the service has been registered and activated successfully:

16.png

Step 12. Now we can run our service for the first time. Please note that we’ve only maintained the basic model data so far; as a consequence we can access the metadata of the service.

a. Open a new window, start transaction /IWFND/GW_CLIENT.

b. Enter the URI /sap/opu/odata/sap/ZPURCHASEORDER_SRV/$metadata and choose Execute.

17.png

Step 13. ZPURCHASEORDER_SRV is the External Service Name that was registered before.

We have created a Service Builder Project with two entities and two entity-sets. We have generated the runtime artifacts and registered and activated our OData service.

Step 14. Now we will map the data provider to bring our OData service to life.

(i)Get the PO header data

  • We will start with the Read method for the PurchaseOrderSet entity set. Expand the node Service Implementation -> PurchaseOrderSet, right-click GetEntity (Read) and select Map to Data Source:

18.png

  • In the map to data source window, enter the following values and choose Enter:

Target System: Local
Data Source Type: Remote Function Call
Data Source Name: BAPI_PO_GETDETAIL

19.png

  • Map the output parameters by dragging the fields from the model on the right side. Also create an input parameter for passing the PO number to the RFC. Choose Enter:

Mapping will look like,

20.png

Then Save.

We are done with getting the PO header details.

(ii) Now we need to get the line items for a given purchase order number.

For this, we will map an operation on the entity set PurchaseOrderItemSet.

  • We will start with the Query method for the PurchaseOrderItemSet entity-set.
  • Expand the node Service Implementation -> PurchaseOrderItemSet, right-click GetEntitySet (Query) and select Map to Data Source:

21.png

  • Provide the data source name BAPI_PO_GETITEMS and the data source type Remote Function Call.
  • Map the output parameters by dragging the fields from the model on the right side. Also create an input parameter for passing the PO number to the RFC. Choose Enter:

22.png

  • Regenerate the artifacts.

Testing the Service:

Select the registered service and click on gateway client.

23.png

Sample Case 1: For getting PO HEADER DATA provide

/sap/opu/odata/SAP/ZPURCHASEORDER_SRV/PurchaseOrderSet('3000000004') and click on execute.

24.png

25.png

Sample Case 2: For getting PO ITEM DATA provide /sap/opu/odata/SAP/ZPURCHASEORDER_SRV/PurchaseOrderItemSet?$filter=PoNumber eq '3000000004' and click on execute

26.png

27.png
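
Besides $filter, the Gateway framework understands the other standard OData system query options; how much is handled generically versus left to your GetEntitySet implementation depends on your release and mapping. A few illustrative URIs against the same service and PO number:

```
/sap/opu/odata/SAP/ZPURCHASEORDER_SRV/PurchaseOrderItemSet?$filter=PoNumber eq '3000000004'&$top=2
/sap/opu/odata/SAP/ZPURCHASEORDER_SRV/PurchaseOrderSet('3000000004')?$format=json
/sap/opu/odata/SAP/ZPURCHASEORDER_SRV/PurchaseOrderItemSet/$count
```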

Ever wanted to know why SAP uses OData?

SAP Gateway and OData in a nutshell


The following video on YouTube provides a nice and comprehensive high level overview of SAP Gateway and OData.

 

So if you want to explain to somebody what SAP Gateway and OData are in just 1:45 minutes, you can share this link.

 

SAP Gateway and OData - YouTube

 

Best Regards,

Andre

Consume Odata Service in ABAP CL_HTTP_CLIENT->CREATE_BY_DESTINATION


This blog would not have been possible but for the awesome community that is SCN. I would like to thank every contributor who has ever helped me out in my hours of need!

 

I had tremendous help navigating my current requirement thanks to the blog posts below.

How to consume an OData service using OData Services Consumption and Integration (OSCI), by Andre Fischer

 

Consuming an External RESTful Web Service with ABAP in Gateway, by Paul J. Modderman

 

Both of these blogs helped me understand the intricacies of consuming an OData service. This blog can be considered an extension of the blog by Paul J. Modderman.

 

We can consume an OData service using the CREATE_BY_URL method of class CL_HTTP_CLIENT, but when authentication is involved that method is ill-suited.

 

The CREATE_BY_DESTINATION method, however, enables us to store the credentials in a more standard and secure fashion.

 

The Requirement:-

I needed to access an OData service that was exposed by HANA. We were required to trigger this service from the ECC system and process the result.

 

The user would be logging in to ECC directly and not via the portal, so the RFC destination must hold the logon credentials.

 

The Process:-

Step 1.

We have to create the RFC connection in SM59 as below.

 

  1. Go to transaction SM59.
  2. Click the Create New Connection button.
  3. Provide a description.
  4. Choose connection type G (HTTP connection to external server).
  5. Enter the host name and port number (an OData service URL generally contains the host, then ":" followed by the port number):
    1. The part up to the ".com", without the "http://", is the host name.
    2. The part after the ":" is the port number.
  6. Enter any proxy details if a proxy is being used (we were not using one in my case).
  7. Go to the Logon & Security tab:
    1. Choose the Basic Authentication radio button.
    2. Enter the logon credentials.
  8. Save and click Connection Test.
  9. If all settings are correct, you should see a web page relevant to the host you entered in the Response Body/Response Text tabs.
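
As a concrete, made-up example for step 5: given a service URL like the one below, the SM59 fields split as follows (the host name and port here are hypothetical):

```
Full URL:  https://myhanahost.example.com:8443/sap/opu/odata/sap/ZMY_SRV/
Host:      myhanahost.example.com
Port:      8443
URI:       /sap/opu/odata/sap/ZMY_SRV/   (used later as the URI in the code)
```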

 

Step 2.

 

Now that we have created the RFC connection, we proceed to the creation of the HTTP client.

 

To create the client we use the code below, also attached as CL_HTTP_CLIENT.txt.

 

 

DATA: lo_http_client TYPE REF TO if_http_client,
      lv_destination TYPE rfcdest,   " name of the RFC destination created in SM59
      lv_uri         TYPE string,    " path and query string of the service
      lv_result      TYPE string.

" XML variables
DATA: lo_ixml            TYPE REF TO if_ixml,
      lo_streamfactory   TYPE REF TO if_ixml_stream_factory,
      lo_istream         TYPE REF TO if_ixml_istream,
      lo_document        TYPE REF TO if_ixml_document,
      lo_parser          TYPE REF TO if_ixml_parser,
      lo_weather_element TYPE REF TO if_ixml_element,
      lo_weather_nodes   TYPE REF TO if_ixml_node_list,
      lo_curr_node       TYPE REF TO if_ixml_node,
      lv_node_length     TYPE i,
      lv_node_index      TYPE i,
      lv_node_name       TYPE string,
      lv_node_value      TYPE string.

************************************************************************
* lv_destination is the name of the RFC destination we created in SM59
************************************************************************
CALL METHOD cl_http_client=>create_by_destination
  EXPORTING
    destination              = lv_destination
  IMPORTING
    client                   = lo_http_client
  EXCEPTIONS
    argument_not_found       = 1
    destination_not_found    = 2
    destination_no_authority = 3
    plugin_not_active        = 4
    internal_error           = 5
    OTHERS                   = 6.
IF sy-subrc <> 0.
* MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
*            WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.

 

*************************************************************************************************
* We need to build the URI: the part of the OData service URL that comes after the port number.
* It includes the path and query string for the service being called on the host.
* lv_uri holds the path and query string.
*************************************************************************************************

CALL METHOD cl_http_utility=>set_request_uri
  EXPORTING
    request = lo_http_client->request
    uri     = lv_uri.

" Send the request before receiving the response
lo_http_client->send(
  EXCEPTIONS
    http_communication_failure = 1
    http_invalid_state         = 2
    http_processing_failed     = 3 ).

lo_http_client->receive(
  EXCEPTIONS
    http_communication_failure = 1
    http_invalid_state         = 2
    http_processing_failed     = 3 ).

 

**************************************************
* Making sense of the result: parsing the XML
**************************************************
lv_result = lo_http_client->response->get_cdata( ).

lo_ixml = cl_ixml=>create( ).
lo_streamfactory = lo_ixml->create_stream_factory( ).
lo_istream = lo_streamfactory->create_istream_string( lv_result ).
lo_document = lo_ixml->create_document( ).
lo_parser = lo_ixml->create_parser(
              stream_factory = lo_streamfactory
              istream        = lo_istream
              document       = lo_document ).

" This actually makes the XML document navigable
lo_parser->parse( ).

" Navigate the XML to the nodes we want to process
DATA: lv_name TYPE string.
lv_name = 'content'.
lo_weather_element = lo_document->find_from_name( lv_name ).
lo_weather_nodes = lo_weather_element->get_children( ).

" Move through the nodes and assign appropriate values to export
lv_node_length = lo_weather_nodes->get_length( ).
lv_node_index = 0.

WHILE lv_node_index < lv_node_length.
  lo_curr_node = lo_weather_nodes->get_item( lv_node_index ).
  lv_node_name = lo_curr_node->get_name( ).
  lv_node_value = lo_curr_node->get_value( ).
  ADD 1 TO lv_node_index.
ENDWHILE.
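
The loop above reads each node's name and value but does not yet do anything with them. Below is a minimal sketch of how one might dispatch on the node name and fill a result structure; the structure and the element names matched here are hypothetical, for illustration only:

```abap
" Hypothetical result structure, for illustration
DATA: BEGIN OF ls_entry,
        title   TYPE string,
        updated TYPE string,
      END OF ls_entry.

WHILE lv_node_index < lv_node_length.
  lo_curr_node  = lo_weather_nodes->get_item( lv_node_index ).
  lv_node_name  = lo_curr_node->get_name( ).
  lv_node_value = lo_curr_node->get_value( ).

  " Map Atom/OData element names to our fields
  CASE lv_node_name.
    WHEN 'title'.
      ls_entry-title = lv_node_value.
    WHEN 'updated'.
      ls_entry-updated = lv_node_value.
  ENDCASE.

  ADD 1 TO lv_node_index.
ENDWHILE.
```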



Hope this helps! Let me know if I can clarify further.


Peace!!!!
