From my blog on SCN – http://scn.sap.com/community/pi-and-soa-middleware/blog/2013/12/23/troubleshooting-adapter-module-jco-rfc-calls
We often have adapter modules (AM) that in turn make an RFC call to the ABAP stack of the XI system to retrieve information. This can be handy for building a generic AM that can be plugged into multiple interfaces.
As an example, if we already have a text file with IDoc data in plain text format, the adapter module can be used to convert this data into IDoc format. An ABAP function module (FM) can be created to look up the IDoc metadata in IDX2 and use it to create the IDoc structure.
The different steps are as follows:
1. This step shows the data being processed in the module chain. The message is represented in hex format. So, if we have a text file with content “hello”, the message will be “68 65 6c 6c 6f”.
2. The message is entering the module chain. For a sender FTP adapter, this can be after reading the file (if this is the first module) or after another module call, e.g. for code page conversion.
3. As part of the adapter module processing, an RFC call is made using the SAP Java Connector (JCo) with the message and other parameters. These will be adapter module parameters.
4. The returned message is the IDoc data as the hex form of the XML stream.
5. The message is leaving the adapter module.
6. The data is the IDoc in hex. To keep it simple, if the IDoc data is “<hello>”, the message here is “3c 68 65 6c 6c 6f 3e”.
In the above diagram, the module parameters are sent to the RFC function module in text format though the message data will be hex.
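The hex representation in steps 1 and 6 is easy to reproduce in plain Java when you want to compare a file's content with what the module chain log shows (a minimal sketch; the class and method names are illustrative):

```java
// Sketch: convert a message byte array to the space-separated hex string
// shown in the module chain log (e.g. "hello" -> "68 65 6c 6c 6f").
public class HexDump {
    public static String toHex(byte[] data) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < data.length; i++) {
            if (i > 0) sb.append(' ');
            sb.append(String.format("%02x", data[i]));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(toHex("hello".getBytes(java.nio.charset.StandardCharsets.UTF_8)));
        // prints: 68 65 6c 6c 6f
    }
}
```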
With all the above theoretical knowledge, let’s create a simple Java program that can aid in troubleshooting.
Prerequisite libraries to be added to our project:
1. Google Guava :
Get it from here – http://code.google.com/p/guava-libraries/
2. JCO libs
Get it from service marketplace.
3. Create a jar with the property file: I’ve provided a sample property file. It holds the details needed to connect to the host and authenticate.
Files to be created:
1. Props.properties : to store connection / authentication details so that it’s easy to change the target instead of hardcoding the values in the program.
2. PropHelper.java : Helper class to read properties file.
3. simulateRFCBean: This is the main class – it’s used for reading the file and making the JCo RFC call.
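A PropHelper-style class can be sketched in plain Java as below (this is an illustrative assumption of what the helper looks like; only the idea of loading connection settings from a stream instead of hardcoding them is from the post):

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

// Sketch of a PropHelper-style class: loads JCo connection settings
// (host, client, credentials, ...) from a Props.properties stream so the
// target system can be changed without touching the program.
public class PropHelper {
    private final Properties props = new Properties();

    public PropHelper(InputStream in) throws IOException {
        props.load(in);
    }

    public String get(String key) {
        return props.getProperty(key);
    }
}
```

A caller would then fetch values such as `helper.get("jco.client.ashost")`; the exact key names depend on your property file and are assumptions here.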
The steps will be:
1. Check the function module for parameters of type IMPORT (or ask your friendly ABAP colleague about the function module).
This FM requires two parameters:
SOURCE (the message sent as hexadecimal stream).
SNDPRN (this is just normal text)
The names don’t necessarily have to match; within the module, we map the parameter ‘partner’ to ‘SNDPRN’ for the RFC call.
2. Map the data types to Java using the table mentioned above.
Looking at the function module and the data mapping, we’ll need to
a) Convert the file contents to a hexadecimal stream (byte array in Java)
b) Send the sending system as a string (String in Java)
3. With the above information, it’s straightforward to create the Java program.
a) Read the JCo properties – PropHelper.java is a helper class to read these properties instead of hardcoding them in the program.
b) Read the file to be checked and convert it to a byte array.
byte[] fileData = Files.toByteArray(new File("C://temp//bad3.txt"));
c) Do the necessary JCo setup, set the parameters to be supplied for the RFC call, and finally make the call.
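As an aside, if you prefer not to pull in Guava just for reading the file, the JDK’s own java.nio offers an equivalent (a sketch; the path is illustrative):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

// Sketch: read the test payload into a byte array using only the JDK,
// equivalent to Guava's Files.toByteArray(File).
public class FileReadDemo {
    public static byte[] read(String path) throws IOException {
        return Files.readAllBytes(Paths.get(path));
    }
}
```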
Now, with the setup done, we’re able to replicate the error scenario that the communication channel log indicated. Using this program, we can reproduce the failure at will.
For debugging on ABAP, an external break-point needs to be set up in the called function module.
Some of the error scenarios we encountered:
- Bad data – new line appearing in the message.
- IDX2 configuration missing for the unique message combination – it existed for the basic type and an extension.
However, once we’re able to simulate the failure and debug the issue, finding the resolution is much easier.
Source code link: https://github.com/viksingh/SimulateRFCCalls
I have been doing some work of late with Java proxies on SAP PO. I tried to use the product documentation to understand how they work. Product documentation is always the best source of information, but things are much clearer for me after I’ve developed something using a concrete example.
I hope that this post will be useful for anyone trying out SAP PI Java proxies. Here both the sender as well as receiver are Java proxies and it has been implemented on a 7.31 SAP PO system.
Scenario: We’ll use the flight demo model scenario with two Java proxies – interfaces FlightSeatAvailabilityQuery_Out_new for outbound service call and FlightSeatAvailabilityQuery_In_new for inbound service implementation.
Broadly, we’ll have three major objects:
Sender Proxy (Consumer) – Being the sender, this proxy doesn’t provide any service but is used to make the outbound service call.
Receiver Proxy – It has the actual service implementation .
Servlet (for testing) – As the proxy calls are to be made in a JEE environment, we’ll create a simple servlet for testing.
At a high level, the below diagram shows what we’ll end up creating. We’ll need to create a configuration scenario as well but that shouldn’t cause too much grief.
We need to create the below projects in NWDS. The names in italics are our project names.
a) Dynamic Web Project ( JavaProxyWeb ). This will hold consumer proxy (sender objects) and servlet used for testing.
b) EJB Project ( JavaProxy ) : This will hold service implementation on the receiver side.
c) EAR (JavaProxyEAR) – EAR to deploy JEE objects.
So we have the below three projects to start our work.
Consumer Proxy Generation
Let’s generate code on the sending side (consumer proxy). We don’t want to choose a JavaBean as there is no actual service implementation. We’re just generating a client to call our proxy, and hence choose “Generate Client”.
And we don’t have to provide a WSDL as we’re not really using the endpoints for any service call. We’ll be creating a configuration scenario in PI Designer to generate the configuration.
After that, let the wizard go through default values and just press finish.
Inbound Service Generation
For our inbound service implementation, let’s take the corresponding service interface. This time we want to generate a bean, and hence choose “Generate JavaBean Skeleton”.
Our EJB project looks something like this.
And the web project should look similar to this.
Adding Servlet to web project ( for testing )
As we need a servlet for testing, let’s create one in our web project.
Additions to servlet which makes the outbound proxy call object
and the actual proxy call to get the result.
We have set the business system programmatically. It’s also possible not to set it here and instead set the Party and Sender Service in the Java proxy configuration (this appears once the project is deployed) under Port Configuration.
Inbound Service Implementation
For the inbound service implementation, we’ll need to add the @XIEnabled annotation, and it’ll need @TransportBindingRT as well. Add @XIEnabled and the IDE will help with @TransportBindingRT, as they’re required together.
Our EAR contains our EJB as well as the dynamic web project. Deploy the EAR.
Service verification in NWA
Once the project is successfully deployed, we should be able to find our consumer proxy. This was actually confusing for me: I tried to deploy the generated client without the actual service call in the servlet, and it never worked. Only when I created the servlet making the outbound call could I see the outbound proxy. It’s best to finish the outbound proxy and the inbound service implementation along with the consumer application (the servlet in this case) and then deploy them in one shot.
Our development is done. Now, we need to create an integration flow with the required sender / receiver interface values.
The sender communication channel uses the XI message protocol for a SOAP adapter and HTTP as the transport protocol.
Similarly, create a receiver communication channel of type SOAP, using HTTP as transport protocol and XI as message protocol.
The path prefix for sending proxy messages via JPR is /MessagingSystem/receive/JPR/XI.
Now, configure the receiver communication channel. Change the authentication mode to “Use logon data to non-SAP system” and provide the logon details.
Activate the iFlow and we’re ready to test the setup.
Fire the servlet and it should come up with the below screen
Put in your flight and connection details. We aren’t really using the date, and hence it’s not on the screen.
Voila ! Our proxy worked and it has come back with the response.
This matches with our service implementation – we’re just returning 20 free and 200 max business class seats .
The message can be displayed in RWB / PIMON.
The audit log confirms that JPR has successfully delivered the message along with the response message.
The source code is available at https://github.com/viksingh/PIJavaProxy. I’ve just put the outbound proxy call in the servlet and the inbound service implementation, as these are the two files we need to modify. The rest were left standard, including the deployment descriptors.
PI 7.31 introduces a Java-only version of PI. This is obviously much more efficient for message processing. However, many existing XI/PI customers have built processing logic on the ABAP stack. In this blog, we’ll look at the steps that can be carried out to create similar functionality on the Java stack. I hope the information will be useful, and I welcome any suggestions/comments.
I’ll take a sample use case – dynamic receiver determination based on custom table entries (i.e. based on file name, route the message to different receivers). For our discussion, the table will be very simple – it has three columns: id, filename and receiver. Id is the primary key, and filename is the field that is pattern-matched against the message’s filename to determine the receiver business system. The table is too simplistic and the data isn’t normalized, but let’s ignore that for now.
Of course, this can be easily implemented using XPath expressions in PI 7.31, but I’m using it as it covers functionality that was often implemented by clients on the ABAP stack, and it’s a good use case to show the usage of persistence. On the ABAP stack, this functionality can be implemented using the following approach:
– Create an ABAP class (with interface IF_MAPPING) for the processing logic where the lookup on the custom table is performed. This ABAP class is used in the operation / interface mapping.
– SM30 can be used for table maintenance.
– SE16 can be used to display table entries.
As shown in the below diagram, a message may arrive at / leave the Java stack, but the scenario uses the ABAP development infrastructure. Of course, this was not mandatory, but given that the ABAP skill set is more widespread in SAP landscapes, we had the below situation in many cases.
So in a file-to-file transfer, the FTP adapters (on the Java stack) will receive / send files, and an ABAP map is used to perform the lookup to find the receiver.
Option from PI7.31 onwards:
The following approach could be taken to create similar functionality on Java stack:
- Create tables in the Java dictionary.
- Since SAPUI5 is supported by SAP AS Java and is the UI recommended by SAP, it can be used to build the display and maintenance screens.
- For server-side processing, we can use servlets to handle requests from the SAPUI5 client. We’ll need two servlets – one to display data entries already in the database and a second one to create new entries.
- The receiver determination class is on the Java stack, where we can use the Lookup API in a Java mapping program. This is a normal Java map and can be built using the Database Accessor. I won’t cover it in this blog, but the GitHub link for this project has the code for reference.
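The lookup at the heart of the receiver determination can be sketched in plain Java. Here the in-memory map stands in for the FILENAME/RECEIVER table columns, the prefix-style matching and all names except BS_DUMMY_IN are illustrative assumptions:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of the receiver-determination lookup: match the message's filename
// against configured patterns and return the receiver business system.
public class ReceiverLookup {
    private final Map<String, String> patternToReceiver = new LinkedHashMap<>();

    public void addRule(String filenamePattern, String receiver) {
        patternToReceiver.put(filenamePattern, receiver);
    }

    // Returns the receiver for the first matching pattern, or the default.
    public String determineReceiver(String filename, String defaultReceiver) {
        for (Map.Entry<String, String> rule : patternToReceiver.entrySet()) {
            String pattern = rule.getKey();
            // Simple prefix-style pattern: "IN*" matches names starting with IN.
            boolean matches = pattern.endsWith("*")
                    ? filename.startsWith(pattern.substring(0, pattern.length() - 1))
                    : filename.equals(pattern);
            if (matches) {
                return rule.getValue();
            }
        }
        return defaultReceiver;
    }
}
```

In the real scenario the rules would come from the dictionary table via JPA rather than addRule calls.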
To keep the model simple, we can perform database operation in the servlet itself via JPA and not really go via EJBs. EJBs can be used for more complicated scenarios.
Additional Libraries Required:
Hence, our development objects will look somewhat like this.
We’ll use SAP Java Development Components ( DCs ) to leverage benefits of NetWeaver Development infrastructure for lifecycle management.
Some customers may still want to maintain the database on the ABAP stack (e.g. in their ECC system / SOLMAN etc.) and have it replicated to the Java stack. In these cases, the messages can be sent to the Java AS over HTTP via an RFC destination, tied in with changes to the ABAP tables. In case the servlet expects / returns JSON messages, we can use the standard ABAP class /SDF/CL_E2E_XI_ALERT_JSON_DOC, or use SCN’s code exchange to get the JSON tool in case your ABAP system doesn’t have this standard JSON processing class.
In this case, the scenario will look as below.
Implementation Steps ( our original scenario using SAPUI5):
1. Create a DC for the external library and add the jar file to the libraries folder.
Change the access to allow both COMPILATION as well as ASSEMBLY.
Ensure that both public parts have jars.
2. Create a dictionary DC. It’ll have an ID column and FILENAME and RECEIVER columns. Based on FILENAME, we’ll determine the RECEIVER.
Create dictionary and deploy it to SAP Java Server.
Create an additional table for generating keys. We’ll use this later for primary key generation.
Verify that the table exists in the database. You may switch to the database development perspective and create a connection using your DB connection details.
3. Create a Java EE application of type “Web Module” and perform the following steps:
It should already have the below annotation.
As we’ll use key generation from a different table (TMP_KEYGEN) , the below annotations are required.
Update persistence.xml to change the transaction type to RESOURCE_LOCAL, as we’ll be managing transactions in the servlet code while persisting data.
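For reference, a minimal persistence.xml along these lines might look as follows. The persistence-unit and entity class names here are illustrative assumptions, not the project’s actual names:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<persistence xmlns="http://java.sun.com/xml/ns/persistence" version="1.0">
  <!-- RESOURCE_LOCAL: the servlet code begins and commits transactions itself -->
  <persistence-unit name="detcvr_unit" transaction-type="RESOURCE_LOCAL">
    <class>com.demo.RoutingEntry</class>
  </persistence-unit>
</persistence>
```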
- Add SAPUI5 libraries – Refer to the below blog for detailed steps.
- In summary:
- Add SAPUI5 SCAs and put dependency for both the DC as well as the SC.
- Add SAPUI5 Java libraries.
- Update web.xml to add components required for SAPUI5
- We’ll create two servlets: RetrieveData will display existing table entries and UpdateData is used to create entries. Both servlets will need the JPA information injected at runtime, and hence we need the following annotation / declaration.
RetrieveData servlet will return all values in the database.
- UpdateData inserts values into the database table:
write(response, map) is a helper method written to produce the JSON response – it’s the last 3 lines of the previous servlet.
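The serialisation inside such a write helper can be sketched as below. The original uses the GSON library; this hand-rolled version is illustrative only, kept self-contained, and assumes values need no JSON escaping:

```java
import java.util.Map;

// Sketch of the serialisation behind a write(response, map)-style helper:
// turn a simple String-to-String map into a JSON object string.
public class JsonWriter {
    public static String toJson(Map<String, String> map) {
        StringBuilder sb = new StringBuilder("{");
        boolean first = true;
        for (Map.Entry<String, String> e : map.entrySet()) {
            if (!first) sb.append(",");
            sb.append("\"").append(e.getKey()).append("\":\"")
              .append(e.getValue()).append("\"");
            first = false;
        }
        return sb.append("}").toString();
    }
}
```

In the servlet, this string would be written to response.getWriter() with the content type set to application/json.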
Nothing too fancy in html file – it calls the main view.
And the code to trigger the POST. We just display an alert message.
Java files structure for the web module:
and files for UI in the web module:
This is how the project DCs look.
Dictionary DC was created and deployed separately as it doesn’t have to be part of this EAR.
jsonlib: has the Google GSON libraries for JSON processing.
detcvr: the web module – it has the servlet, views, as well as the JPA-generated entity file.
dtrcvear: our EAR to deploy the application to SAP AS Java.
This is how the screen turns out to be. We can maintain values :
And entries are displayed using the “Display Routing Values” tab.
To test in PO, create an iFlow with two receivers: one default, and a second one determined by the Java map, which performs a lookup on the table to get the receiver business system based on the filename. Configure Routing with the Dynamic Message Router
and process two files: the first with an arbitrary name, and the second with a name starting with IN, which was entered in our DB via the SAPUI5 application. Based on the table entry, the message is routed to the BS_DUMMY_IN business system.
The code for this project is available at https://github.com/viksingh/PIDYNAMICRCVDCs.
This is in continuation of the first blog on reducing integration effort by using SAP enterprise services (http://scn.sap.com/community/pi-and-soa-middleware/blog/2013/06/04/reducing-integration-effort-by-leveraging-sap-enterprise-services )
I’ll describe the steps in more detail here.
Step1. Identify services required.
Go to http://esworkplace.sap.com/ and identify the service required. There are various ways you can navigate the content. If you’re implementing a new scenario, typically a complete process, you can use integration scenarios (e.g. agency business). Using the solution map is higher level (e.g. Sales Order Management). The Business Scenario Description is a hybrid of the above two, and I prefer that. Then, in each bundle, you can navigate to the service operation and read through the documentation. Make sure the selected operation is not deprecated. SAP has good documentation around features, configuration required in the backend system, error handling and any business functions required.
Step2. Identify system set up requirements
Identifying all operations will help to come up with a list of requirements for system setup. It’s relatively straightforward in PI, as we just need to import the software component. However, for the backend system, there can be a couple of scenarios:
– A service requiring business function activation (e.g. SAP APPL 6.04 requires LOG_ESOA_OPS_2 activation).
– A service requiring add-ons to be installed by BASIS via transaction SAINT (e.g. ESA ECC-SE 604 requires BASIS to install the ECC-SE add-on).
Get the ECC activities organised (as mentioned in part 1, some of the business functions are irreversible) and import the corresponding PI SWCVs into the ESR – these can be downloaded from SMP and imported as usual.
Step3. Check components are completely installed in the system.
Go to transaction SPROXY. If the ECC system satisfies all the pre-requisites and PI has the components as well, the SWCVs should appear in transaction SPROXY.
In SLD, the additions should appear as software component versions under Installed Software of the ECC technical system.
Step4. Create a custom software component version. This is strictly not required, but in my experience there’s always a need to customize messages, and hence it should be created with the required service’s SWCV as the underlying SWCV.
Step5. Test: Once the services are in ECC, you can use them to start doing testing.
Use transaction SPROXY for testing – this should help to identify the elements that need to be populated in the message and the business documents processed. In my experience, this is where you’ll spend the maximum time trying to identify what is required, where to populate the information etc.
You can test using SOAMANAGER as well, but I prefer to just use SPROXY and then, when the PI configuration is complete, use SoapUI.
Step6. There will be cases where the standard doesn’t fit the requirement completely. In that case, perform a data type enhancement in SAP PI in the custom namespace.
This should make the data type enhancement show up in ECC SPROXY. Creating the proxy here will update the ECC structures which are used by backend ECC classes for business document processing.
Step7. Some hints: As usual, testing can be done by configuration in ID and using SoapUI. Just a couple of hints here:
Many of the standard services are synchronous, so ensure that message logging for sync interfaces is on.
The messages appear in SXMB_MONI in ECC only when there’s an error (not application error but more like a system error like configuration, input message not conforming to length / type restrictions etc).
For debugging, you can create a comm channel with your username and turn on HTTP debugging if you’re trying to investigate the SOAP message (say headers).
Step 8: Special case: Lean Order being too Lean !
For one of the scenarios, while creating a sales order, we realised that the lean order interface doesn’t have the fields we’re interested in. However, there’s a good document on SMP about “Enhancement Options for the Lean Order Interface”, and it was very helpful.
Step 9: Generate some positive karma – do some good for the people maintaining it later. As the development on the ECC side is going to involve mostly enhancements, two things can be really useful.
– Keep all the enhancements in a composite enhancement.
– Create a checkpoint group so that it’s easier to debug messages.
Step 10: Logging in SAP: For synchronous interfaces, SAP does return proper messages back to the calling application. However, the functional team doesn’t have access to PI (or interest in accessing it) to look at the errors. Hence, we had to build logic to record the messages in an application log (which can be checked in SLG1). This in some ways satisfies their requirement to look at the system to figure out what’s going on.
This is one area where perhaps either I’m missing something or SAP needs to provide information so that users / functional consultants can monitor the messages. Many functional consultants don’t want to even try to look at XML.
A simple class can be created to log the information and call them at appropriate enhancement points. However, we also created a generic method to convert any exception into a BAPI message.
Code can be referenced here.
A couple of observations where things could potentially be improved by SAP:
– There should be a free tool from SAP to let users / functional teams monitor messages during testing. I’m aware of AIF but don’t have experience with it, as it’s not free.
– Lack of out-of-the-box support for JSON RESTful web services in SAP PI. The initial requirement was to use them, but the source application had to be modified to use CXF to generate SOAP web service calls on the calling side. I was almost ashamed to go back to the third party with this request.
Some of the books & articles I found useful.
SAP Press book “Developing Enterprise Services for SAP”: Although I referenced this book only recently, I found it an absolute joy to read and did pick up many things. It definitely helped to refine some of the ideas.
Enterprise Services Enhancement Guide
SOA MADE EASY WITH SAP
Blog-Add a field to LORD API
Posted from my blog on SCN : http://scn.sap.com/community/pi-and-soa-middleware/blog/2013/05/23/pi-adjustments-for-app-and-db-server-split
Recently, I was involved in a project with the goal of splitting the app server and DB onto different hosts. If the app and DB server are on the same host, there’s no confusion, as there’s only a single host. However, in case of a DB and app server split, the DB host should be used for the technical systems.
We got the following errors after the app/DB split:
– Error with cache notification.
– Adapter engine not found.
– Business system not found on execution of LCR_GET_OWN_BUSINESS_SYSTEM.
We had to do the following steps in the below order to get it working:
1. Recreate SAP technical systems
a) Delete existing technical systems for SAP systems.
b) Regenerate these technical systems (RZ70).
2. Regenerate the PI business system and technical systems of type “Process Integration”
a)Delete all technical systems of type ‘Process Integration’.
b) Delete PI business system
c) Execute PI self-registration:
In SAP NetWeaver Administrator:
Configuration Management –> Scenarios –> Configuration Wizard: All Configuration Tasks –> PI SLD Self Registration
i) Regenerates technical systems of type ‘Process Integration’
ii) Recreates the PI business system. If the PI business system is not deleted in step 2 b), this step will fail. If you want to use a different name, it should be changed after all steps are complete.
3. Restart the Integration Builder / Integration Repository / RWB and AE.
In SAP NetWeaver Administrator:
Operation Management –> Systems –> Start & Stop –> Java EE Applications
- com.sap.aii.af.app (Adapter Engine)
- com.sap.xi.directory (Integration Builder/Configuration)
- com.sap.xi.repository (Integration Builder/Design)
- com.sap.xi.rwb (Runtime Workbench)
4. Adjust all SAP business systems to use the new technical system. The technical systems should appear as below.
Hopefully it can save someone some time in trying to work these out.