WebCenter Content – ATeam Chronicles

Embedded Oracle Documents Cloud Service Web User Interfaces for folder contents and search results



The Embedded Oracle Documents Cloud Service Web User Interface allows customers to embed the Oracle Documents Cloud Service experience everywhere people work. Oracle Documents Cloud Service puts content at the center of collaboration in any application.

The Embedded Web User Interface uses an HTML inline frame (IFRAME tag). It removes the default branding and resizes the content to fit the enclosing frame, allowing customers and partners to integrate Oracle Documents Cloud Service into their own web applications.

The Embedded Oracle Documents Cloud Service Web User Interface allows users to work together on centralized content with document management features in a ready-to-use interface.

To embed the Documents Cloud Service user interface in an inline frame, add /embed to a link immediately after the /documents URL element.

An iframe element is of the form:

<iframe src="URL" more attributes>

</iframe>

Inline frames appear inside the presentation of a page much the same way images do: the browser allocates some space for an inline frame and takes this into account when rendering the page. An inline frame as a whole scrolls along with the rest of the page, so it might be scrolled out of view. Whether this is positive or negative depends on the aims and nature of the page. (An inline frame usually has its own internal scroll bar too, for scrolling its own content.)
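For example, using the folder URL from the folder browse example shown later in this post, a host page could embed that folder view with markup like the following (a minimal sketch; the width and height values are arbitrary placeholders to be adjusted for the enclosing page):

<!-- Minimal sketch: embed a Documents Cloud folder view in a host page -->
<!-- The src value is the folder browse example URL used later in this post -->
<iframe src="https://documents.us.oracle.com/documents/embed/folder/F231C3E825CFEFE995BEAB52T0000DEFAULT00000000/_Document_Cloud_Service/nameasc"
        width="100%" height="600" frameborder="0">
</iframe>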

Some navigational features for the Embedded Oracle Documents Cloud Service Web User Interface are:

  • The Oracle Documents icon opens the user’s home page using the standard interface in a new tab or window, depending on the browser’s default setting.
  • The elements in the path are active links that open the associated folder in the current window using the embedded interface.
  • The Launch Full Site icon opens the current folder using the standard interface in a new tab or window, depending on the browser’s default setting.

Find below a few examples:

Embedded Interfaces for Folder Browse
A folder is identified by a Folder ID. In the example below, the ID "F231C3E825CFEFE995BEAB52T0000DEFAULT00000000" corresponds to the folder "Document Cloud Service".

https://documents.us.oracle.com/documents/embed/folder/F231C3E825CFEFE995BEAB52T0000DEFAULT00000000/_Document_Cloud_Service/nameasc

[Screenshot: Embedded UI folder browse]
Embedded Interfaces for Search Operations

Searching in Oracle Documents Cloud Service is case-insensitive: a search for performance finds the same results as Performance. An empty search finds all items. The not operator has the highest precedence and the and operator has the lowest. Search results can be sorted by Last Updated, Name Ascending, or Name Descending.

Embedded Interfaces for “Standard” Search Operations

Search for "REST" and "API". Use "and" or a space.

https://documents.us.oracle.com/documents/embed/search/global/REST%20API/updated

[Screenshot: Embedded UI search REST API]

Embedded Interfaces for Search Inclusions and Exclusions

Search for "jpg" and DO NOT include "Mercedes". Use "not" or the dash (-).

https://documents.us.oracle.com/documents/embed/search/global/jpg%20-Mercedes/updated

[Screenshot: Embedded UI search jpg -Mercedes]

Embedded Interfaces for Search OR Operators

Search for either "bmw" OR "mercedes". Use "or" or a comma.

https://documents.us.oracle.com/documents/embed/search/global/bmw,mercedes/updated

[Screenshot: Embedded UI search BMW OR Mercedes]

Favorites 

Direct access to a user’s favorites is provided through the Favorites Embedded UI:

https://documents.us.oracle.com/documents/embed/favorites/nameasc

[Screenshot: Embedded UI favorites]

Please note that the embedded user interface adjusts the content to fit within windows as small as 320 pixels wide. Windows that are smaller than 320 pixels may hide content on the right edge of the window.



Calling REST Services in WebCenter Enterprise Capture Client Scripts


For customizing WebCenter Enterprise Capture, the JavaScript engine included with the Java Runtime Environment can be used. The developer’s guide for WebCenter Enterprise Capture contains two sample client scripts that can be used to modify the standard behavior of the Capture client that runs in the browser. The two samples do the following tasks:

Sample 1: This sample script customizes client behavior in the following ways:

  • Prevents the client user from leaving a metadata field if the entry contains the word “test”
  • Prevents the user from entering an asterisk in any metadata field.
  • Outputs event information to the Java console, such as coordinates after a user right-mouse-drags a selection on an image.

Sample 2: This sample script customizes client behavior in the following ways:

  • Uses the BatchScanBegin function to restrict files that can be imported to those with a .TIF extension only.
  • Uses the DBSearchResults function to modify the results of a database lookup so that only the first result is used, and prevents the results list from displaying.

These scripts create a starting point for further client-side customizations in Enterprise Capture. To extend the samples a bit further, the following script shows how to call a web service and populate a client profile field automatically. This example pulls a stock quote from a Yahoo Finance public REST API to demonstrate how to call the REST service, parse the JSON response, and set the Capture field value.

 

Note that using the JavaScript engine, packages and classes that need to be included can be imported in either of the following ways:

  importClass(java.awt.event.KeyEvent);
  importPackage(java.io, java.net);

 

For the example script that follows, the java.io and java.net packages are required, since calling the web service uses the URL and HttpURLConnection classes. To read the JSON response, the InputStreamReader and BufferedReader classes are used.

importClass(java.awt.event.KeyEvent);
importPackage(java.io, java.net);

function DocumentSelected(event) {
  println("DocumentSelected: " + event.getDocument().getTitle());
  var MY_FIELD = "MetaField"; // The name of the metadata field to populate
  var document = event.getDocument();
  var batch = document.getParentBatch();
  var fieldDef = batch.getWorkspace().getFieldDefinitions().findByName(MY_FIELD);

  if (fieldDef != null) {
    var fieldId = fieldDef.getId();
    var fields = document.getFields();
    var field = fields.get(fieldId);

    //Setup URL to call Yahoo Finance service
    var yqlHostAndContext = "http://query.yahooapis.com/v1/public/yql";
    var yqlQuery = "select * from yahoo.finance.quotes where symbol in ('ORCL')";
    var yqlEnv = "http://datatables.org/alltables.env";
    //Encode uri parameters
    var urlStr = yqlHostAndContext + "?q=" + encodeURIComponent(yqlQuery) + "&env=" + encodeURIComponent(yqlEnv);

    println(urlStr);

    var url = new URL(urlStr);
    var conn = url.openConnection(); //Returns HttpUrlConnection
    conn.setRequestMethod("GET");
    conn.setRequestProperty("Accept", "application/json");

    var responseCode = conn.getResponseCode();
    println("Sending 'GET' request to URL : " + url);
    println("Response Code : " + responseCode);
    if(responseCode == "200"){
      var is = conn.getInputStream();
      var isr = new InputStreamReader(is);
      var br = new BufferedReader(isr);
      var response = new java.lang.StringBuffer();
      var line;
      while ((line = br.readLine()) != null) {
        response.append(line);
      }
      br.close();
      conn.disconnect();

      println(response);
      var jsonObj = JSON.parse(response);
      var price = jsonObj.query.results.quote.LastTradePriceOnly;

      //Set the field value
      field.setValue(price);
    }
    else{
      //Could throw an error instead, but for sample script just setting meta field to a generic error string so user can see it.
      field.setValue("ERROR getting stock price. HTTP Response code:" + responseCode );
    }

    // Save the document data
    document.persist();
  }
}

 

When the script succeeds, the last trading price value of the ORCL stock will be placed automatically into the field (MetaField).

[Screenshot: MetaField populated with the stock price]

When troubleshooting client-side scripts, see the Java Console. This is where ECMA errors or other syntax errors will appear. Note that when updating client scripts, the Enterprise Capture managed server does not need to be restarted. Log out of the client profile and log in again, and the updated client script will load.

[Screenshot: Java Console]

 

Oracle Support has additional as-is samples for customizing and enhancing the Enterprise Capture import processor, recognition processor, and client profiles. See these knowledge base articles for other example scripts.

    • Document 1900716.1 – Enterprise Capture Recognition Processor Script to Modify Database Lookup Search Value does not Work
    • Document 1909021.1 – Sample Import Processor Script (Email Jobs Only) for Converting PDF to TIFF Using GhostScript
    • Document 1664009.1 – Enterprise Capture Sample Script to Display Message in FieldGotFocus Event
    • Document 1901189.1 – Enterprise Capture Sample Client Script to Perform DB Lookup on FieldLostFocus Event
    • Document 1672851.1 – Sample Enterprise Capture Client Script to Show Field Properties in FieldGotFocus Event
    • Document 1662311.1 – Sample Import Processor Script to Rename a Batch to Match a FileName
    • Document 1665817.1 – How to Access a Batch Name in the Enterprise Capture Recognition Processor Script
    • Document 1639684.1 – Sample Enterprise Capture Script to Populate Field in Client Profile with UserName in DocumentSelected Event
    • Document 1677731.1 – Sample Enterprise Capture Client Script to Connect to the Database using JDBC/ODBC Bridge
    • Document 1918874.1 – Sample Enterprise Capture Recognition Processor Script to Retain Existing Meta-Data for New Documents
    • Document 1943347.1 – Sample Enterprise Capture Client Script to Copy Meta Data on PreBatchRelease Event (Sticky Fields)
    • Document 1934031.1 – Sample Enterprise Capture Import Processor Script to Set Meta Data Value in “preCreateDocument” Event
    • Document 1917387.1 – How To Get The Total Number Of Pages In A Document Using Scripts Within Enterprise Capture
    • Document 1614232.1 – Enterprise Capture Sample Script to Add a User ID to a Batch Name at Scan Time
    • Document 1936687.1 – Sample Enterprise Capture Client Script to Unlock a Batch and Make Available for Other Users (Skips Post Processing Step)
    • Document 1594882.1 – How to Execute Enterprise Capture WLST Commands
    • Document 1946580.1 – Sample Enterprise Capture Client Script to Call A REST Web Service and Populate a Field With the Result
    • Document 1946581.1 – Sample Enterprise Capture Client Script to Call Documents Cloud Service Using Jersey

 

Development Patterns in Oracle Sales Cloud Application Composer, Part 1


Global Functions and Object Trigger Functions — What Goes Where

 

Introduction

Overheard the other day while walking by a new home construction site:

Electrician speaking to carpenter: “I wish I didn’t have to follow these house plans. It would be much easier if I could put some of the outlets and light switches somewhere else. And have you seen where they want me to put the main circuit box? It’s crazy. What was this architect thinking?”

Carpenter, replying: “Quit your complaining — these plans are OK. At least everything’s laid out in detail. I’ve been on some jobs where I had to decide where to place the wall studs. The plans were useless. Let’s just say that my choices came back to bite me on my last job. I had to redo seven or eight walls and an entire upstairs floor when the plumbers got on site to do their thing.”

Plumber, adding to the conversation: “No surprise — the plumbers always get the blame. But yeah, I’ve also been on jobs where I ended up getting the worst of it. On my last job, the carpenters were long gone when I found out I had no room between the wall studs to put in the PVC drain pipes and vents. Almost everything had to be rerouted. What it amounts to is this: in the end it’s the plans that are at fault.”

We could continue with this semi-hypothetical building trades conversation, but the lesson should be obvious from the short (edited for a G rating) excerpt: building a house without well-designed and thorough plans and blueprints will usually result in one disaster after another, and the amount of work and re-work required to get the job done will blow through any estimates and budgets. And this says nothing about completing construction on or before the homeowners’ scheduled move-in date, let alone how maintainable the (cobbled-together) house would be in the future.

Applying the lessons from this home construction story to software design and development is a very short logical leap: the potential for re-work, delivery date slippages, the general overall quality of the end product, and the ease of future maintenance are all directly related to having complete, detailed, and well-thought-out plans for a software project.

New building construction and ground-up software projects require plans. But what about remodeling or adding on to a house? Or, on the software side, what role does planning play when customizing or extending an existing application? For these smaller-scale projects, it may be tempting to forgo or expend less effort in the planning process. This path, however, would be ill-advised. Discounting the need for plans, even for software extension projects or home remodeling, can be dangerous for both software and building construction. With software, for all but the most trivial projects, relying on design/development patterns maximizes productivity, minimizes the need for rework and refactoring, eases code maintenance, and results in reusable code. This dictum applies to extensibility projects as much as it applies to new application development.

Taking the construction/software development analogy one step further, it is usually the role of a home architect to produce a set of house plans, as it is the responsibility of the software architect to develop plans for software projects. Both types of architects have a variety of tools at their disposal to make the design/planning process as efficient as possible. But both types of architects would be woefully unproductive if they had to start from scratch with each new project. What options are available to improve productivity?  On top of their accumulated experience, architects can take advantage of shortcuts, utilizing best practices and design patterns wherever and whenever they can. There are tried-and-true software development patterns that architects can (and should) leverage in system design. The ability of either type of architect to take advantage of and utilize best practices in their plans will only improve the quality of the plans and therefore maximize the quality and integrity of the completed building or software application.

This post illustrates several design patterns and best practices when working with Oracle Sales Cloud Application Composer. Specifically, the focus here, in Part 1 of this post, is on the relationship between global functions and object trigger functions, and how to model these functions optimally no matter what kind of extensibility is required. The use case grounding the discussion is a requirement to add enhanced document management capabilities to the Opportunity object in Sales Cloud by integrating with Oracle Document Cloud Service using REST as the communications vehicle.

Oracle Sales Cloud Customization and Extensibility Tools

Oracle Sales Cloud includes Application Composer, an integrated bundle of browser-based customization tools or tasks available to extend and/or customize the out-of-the-box application components. Oracle Sales Cloud and other related offerings in the Oracle Fusion Applications family (Marketing Cloud, etc.) can be extended with Application Composer in several ways: extending the data model by adding custom objects and building new relationships, adding new objects to the user interface, modifying the default business logic of the application by hooking into application events, enhancing the default reporting capabilities of the application, and by modifying and extending the application security model.

Within App Composer, there are a number of places where Groovy scripting can be used to hook into existing Sales Cloud processes, making it possible to add custom flows or to call out to external applications with web services. But with such a wide variety of options available, it then becomes necessary to make choices on how best to execute the integration. Lacking best practices or design patterns that apply to the requirements, it might take an architect/developer team multiple iterations before an optimal solution is discovered. Our goal here is to reduce the number of these iterations by presenting a set of generic design patterns that will apply to a multitude of extensibility use cases. Although we will be using integration examples from Oracle Documents Cloud Service, or DOCS (which has been generally available as an Oracle PaaS offering for a month or two as of this post), the intent is to concentrate on the Application Composer side of the integration pattern rather than to drill down into the details of any specific PaaS application’s web service offerings. The discussion is devoted almost exclusively to design patterns and what belongs where in the App Composer framework. Although security, parsing methodologies, handling asynchronous calls, and other web service framework components are extremely important, they are not going to receive much attention here. Some of these components will get more attention in Part 2 of this post.

The spotlight’s focus is on global functions and object trigger functions, and the roles they should play in external application integrations and extensions. The end result should be a representative framework of optimized, highly reusable global functions acting as the foundation of a reusable script library, with object-based trigger functions built on top of this foundation to address the specific business requirements. In order to be productive with Application Composer functions, it is necessary to understand Groovy scripting. Fortunately, there is an excellent guide available that not only covers generic Groovy scripting features, but also functions that are specific to using Groovy inside Application Composer. (The Release 8 version of the guide is located here: http://docs.oracle.com/cloud/latest/salescs_gs/CGSAC.pdf)

Integrating with External Applications

For integration purposes, it has become standard practice for today’s enterprise applications to expose web services interfaces as a way to communicate with other applications using industry standard protocols. SOAP (Simple Object Access Protocol) used to be the predominant protocol, but several factors, not the least of which is the ever-growing requirement to support mobile clients more efficiently, have made REST (Representational State Transfer) a better fit, and therefore more and more applications are opting for REST support, either as an adjunct to SOAP methods or as an all-out replacement for SOAP. In either case, the process flow pattern in App Composer is virtually identical for SOAP (synchronous) and REST: build a request payload, submit the request to a designated URL, receive the response payload and parse it, process the parsed response, and do something based on payload component values, assuming there are no errors. Of course, error handling becomes a required piece of the process flow, but for our introductory discussion it will be taking a back seat.

Here is a graphical depiction of the general process flow:

[Diagram: general process flow]

In many integration patterns, the same web service endpoint is the target for multiple calls at different processing points within the source application. Frequently, data returned in one response will be used to build requests for subsequent responses, hence the circular flow of the diagram.

What Lives in Global Functions

To borrow from object-oriented programming strategies, global functions should be designed for maximum re-use. Although there may be unusual situations that are exceptions to this design principle, the goal should be to maximize the potential for re-use whenever possible. In many cases, global functions can be designed to be reused across multiple applications, making them truly “global”. How is this design goal best accomplished? Here is a partial list of design strategies and objectives:

  • Minimize hard-coding. Whenever any values are hard-coded in a global function it may diminish the ability for it to be reused somewhere else. While the practice of hard-coding values may be impossible to eliminate entirely, minimizing hard-coding will pay dividends in the ability to reuse global functions to their maximum potential.
  • Liberal use of input parameters. Instead of hard-coding values inside functions, pass them to the global function as parameters. This practice allows for far greater flexibility in when and where the global function can be used.
  • One job for one global function. Build the function so that it is not doing too much for its purpose. Think about multiple situations where the function will be used and then design appropriately for all of them. If the global function contains logic or flows that will not apply to these situations, extract the extraneous logic and build additional functions if it still needs to be supported. There is a tightrope to walk at this stage of the design process: model the function to do all that is required, but do not add so many sub-functions that it becomes too specialized.
  • Take strategic advantage of return values. Global functions can be used more flexibly if they are designed to return appropriate values to the processes that call them. In some cases it may be necessary to return multiple values to the calling process; if that is the case, populate and return a Map object instead of returning a single value of one type.
  • Manage the scope of error and exception handling. In general it is advisable to handle errors and exceptions as soon as possible after they occur instead of letting them bubble up into the calling stack. However, in the case of global functions, there may not be enough information or context to determine whether an event or a SOAP or REST response payload is an error condition or not. In those cases the error handling would then have to be relegated to the calling process.

When developing global functions evaluate every piece of logic and every line of code, and if certain parts of the code logic start to tie the function down to one or two business usage patterns, the function’s global nature diminishes.  Granted, occasionally there may be a legitimate need for a hybrid type of global function which is application-specific and less reusable in scope. These semi-global functions consolidate processes that are called from multiple places in a specific application.  They may contain application-specific logic which would restrict their use elsewhere.  There is a subtle difference between these application-specific global functions and generic, cross-application global functions. The former category encapsulates patterns that may be required by two or more calling object trigger functions. Properly-designed application-specific functions will normally be able to call a generic global function to get their job done. Acting as wrappers around generic global functions, they should only contain logic needed to support application-specific requirements. The bulk of the work should occur in the called generic global functions. (Part 2 of this post will explore the relationship between application-specific global functions and what we are referring to as truly cross-application global functions.)
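To make the distinction concrete, here is a hypothetical sketch (not part of the integration built later in this post) of what such an application-specific wrapper might look like. The function name createDocFolder and its parameters are assumptions for illustration only; the wrapper builds only the Documents Cloud Service-specific pieces and delegates the real work to the generic callRest and json2Map global functions described below:

/*************************************************************
* Function Name: createDocFolder (hypothetical application-specific wrapper)
* Returns: Map (parsed response payload)
* Description: Creates a folder under a given parent folder in Documents Cloud Service
* by delegating to the generic callRest and json2Map global functions
* Example: createDocFolder( String parentFolderGuid, String folderName ) returns Map
**************************************************************/

println 'Entering createDocFolder'
// Application-specific pieces: the folders resource URI and the request payload
def urlExt = '/folders/' + parentFolderGuid
def reqPayload = [name:(folderName)]
// The generic global functions do the bulk of the work
def respPayload = adf.util.callRest('POST', urlExt, reqPayload)
def respMap = adf.util.json2Map(respPayload)
println 'Exiting createDocFolder'
return respMap

Object trigger functions that need to create folders could then call a wrapper like this instead of repeating the URI and payload construction in each trigger.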

Often it is difficult to determine where to start with the breakdown of application process flows into discrete global functions. What normally produces the optimal result is to decide on what sub-processes will be called most often from different places in the application, and work down from the “most popular” processes to those that will not be called as often. In the case of integrating with web services, starting with a skeleton of three global functions – a function that prepares the request payload, a function that calls the service, and a function for processing the response payload — is logically sound. Our high-level process flow diagram should have given us a big hint on how to break down the process into discrete functions.

In Release 8 of Oracle Sales Cloud, there are vast differences in how SOAP and REST endpoints are defined and supported in Application Composer.  With SOAP, there is a web services registration applet in Application Composer that formalizes and streamlines the definition and configuration of web services by pointing to the WSDL URL and setting up a security scheme that can be used across the active application (Common, Customer Center, Marketing, Sales, etc.) in Application Composer. The identifying name given to these connections can then be referenced in Groovy scripts, which then exposes the specific functions supported by the defined service endpoint in the Groovy Scripting palette.

By contrast, for REST endpoints there is no formal endpoint definition support as of Sales Cloud Release 8 or Release 9, and therefore all support needs to be developed from lower-level classes. One solution, packaged as a global function that utilizes lower-level Java classes, may look like this:

/*****************************************************************************************************
* Function Name: callRest
* Returns: String (JSON-formatted)
* Description: When supplied with parameters REST method (GET, POST, PUT, DELETE), resource URI extension,
and optional map for request payload parameters, returns JSON-formatted response payload String
* Example: callRest( String requestMethod, String urlExt, Map requestProps)
*
* returns response as a String or JSON-formatted ERROR String
*******************************************************************************************************/

println('Entering callRest')
def jsonInput
def respPayload
def restParamsMap = adf.util.getDocCloudParameters()
def authString = (restParamsMap.userName + ':' + restParamsMap.userPw).getBytes().encodeBase64().toString()
def fullUrlStr = restParamsMap.url + restParamsMap.restUrlExt + urlExt
def url = new URL(fullUrlStr)
HttpURLConnection connection
try {
    connection = (HttpURLConnection) url.openConnection()
    connection.setRequestMethod(requestMethod)
    connection.setRequestProperty('Authorization', 'Basic ' + authString)
    connection.setRequestProperty('Accept', 'application/json')
    connection.setRequestProperty('Content-Type', 'application/json')
    connection.setDoOutput(true);
    connection.setDoInput(true);
    def processed = false
    switch (requestMethod) {
        case 'PUT' :
        case 'POST' :
            jsonInput = ''
            if (requestProps) {
                jsonInput = map2Json(requestProps)
            }
            DataOutputStream os = new DataOutputStream(connection.getOutputStream());
            os.writeBytes(jsonInput)
            os.flush()
            os.close()
            processed = true
            break
        case 'GET' :
        case 'DELETE' :
            connection.connect()
            processed = true
            break
        default :
            println 'Unrecognized REST Method!'
    }
    if (processed) {
        respPayload = ''
        if (connection.responseCode == 200 || connection.responseCode == 201) {
            respPayload = connection.content.text
        } else {
            respPayload = '{ERROR:' + connection.responseCode + ', URL = ' + fullUrlStr + '}'
        }
    } else {
        respPayload = '{ERROR:Unrecognized REST Method ' + requestMethod + '}'
    }
} catch (e) {
    println e.getMessage()
    connection?.disconnect()
}
println 'Exiting callRest'
return respPayload

After creating a properly-formatted Basic authentication string from user credentials, the function uses the HttpURLConnection class to create a connection to the REST endpoint. Depending on the REST method that is supplied to it, the function optionally streams a request payload through the connection and saves the response, if a response is available. Otherwise it will either catch an exception or return an error string with the HTTP response code.

Note the call to adf.util.getDocCloudParameters() in the initial part of this function. This is a call to another global function. In this case, the getDocCloudParameters() function satisfies a requirement that is almost universal to App Composer extensions: because App Composer does not support global variable constructs, it has become common practice to write one global function that provides the equivalent of global variables and constants used throughout the application. (In this case the userName and userPw Map values are pulled from the function’s returned Map and are used to build the Basic authentication string.) Building such a function allows commonly-used hard-coded values and session-scoped variables to be maintained in a single place, and making these values available with a function call facilitates flexibility by avoiding hard-coding of values in other function code.
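The getDocCloudParameters function itself is not shown in this post, but based on the Map keys referenced by callRest and by the trigger scripts later in this post (url, restUrlExt, userName, userPw, topFolderGuid), a minimal sketch might look like the following. Every value shown is a placeholder, not a real endpoint or credential:

/*************************************************************
* Function Name: getDocCloudParameters (minimal sketch)
* Returns: Map
* Description: Central place for the hard-coded values and session-scoped "constants"
* used by the Documents Cloud Service integration; App Composer has no global
* variables, so other functions call this to retrieve shared values
* Example: getDocCloudParameters() returns Map
**************************************************************/

// Placeholder values only -- replace with the target Documents Cloud instance,
// integration credentials, and the GUID of the parent folder for opportunity folders
return [url           : 'https://tenant.documents.us2.oraclecloud.com',
        restUrlExt    : '/documents/api/1.1',
        userName      : 'integration.user',
        userPw        : 'integrationPassword',
        topFolderGuid : 'FXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX']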

It is critical here to warn that the HttpURLConnection class is not officially supported in Application Composer Groovy scripts as of Release 8, and that depending on the Sales Cloud environment and patch bundle level the class may very well be blacklisted and unavailable for use in Groovy scripts. In the short term, or until Application Composer scripting support for REST endpoints becomes more formalized, the future for REST calls in Groovy/App Composer is somewhat hazy. One possibility is that the URL and HttpURLConnection classes will be given exception status by adding them to a whitelist. Another possibility is that a wrapped version of these classes (e.g. OrclHttpUrlConnection), which would prevent them from being used maliciously, will be made available to Groovy scripts. There could be other outcomes as well. No matter what the future holds in store, the general design pattern of this generic global function should be valid until future releases of Sales Cloud add more formal support mechanisms for outbound REST integrations.

Does this callRest function satisfy the development guidelines for a truly global function? In this case there may be some room for improvement:

  • Minimize hard-coding. This version of the function hard-codes JSON support. It would be fairly easy to extend the function to support both JSON and XML requests and responses.
  • Liberal use of input parameters. JSON/XML support options could be enabled by adding input parameter(s) that could tell the function which request/response payloads are needed/returned by a specific REST call.
  • One job for one global function. This function fulfills the one function/one job guideline, while not being too granular.
  • Take strategic advantage of return values. The function returns a String. What the calling process does with the return value can take many different routes.
  • Manage the scope of error and exception handling. Because the function is working within a network protocol layer and not a business-related layer of the application, it should manage errors only within that layer. Other exceptions, for example an error condition that may be packaged inside a returned response payload, would need to be handled by the calling process.

One other significant rationale for structuring the callRest function in this manner has to do with making any future maintenance as easy as possible. Knowing that the function may need to be rewritten for compatibility with future releases of Sales Cloud as support (or lack thereof) for the URL/HttpURLConnection classes changes, it makes sense to keep potential modifications localized to one function if at all possible. Structuring the function in this way does exactly that.

Other global functions that would normally be called by a trigger function immediately before and after calling the callRest global function are responsible for preparing the request payload (called before) and processing the response payload (called after).

The map2Json function in the Document Cloud Service integration receives a Java Map object (which Groovy can manipulate very efficiently) as an input parameter and converts it to a JSON-formatted String. There are a number of ways to get this job done; what is presented here makes use of the third-party Groovy jackson libraries that are popular with developers for supporting JSON requirements. The function could look like this:

/*************************************************************
* Function Name: map2Json
* Returns: String (JSON formatted)
* Description: Returns a JSON-formatted String that is built from a passed-in Map of key:value pairs
* Example: map2Json( Map reqParms ) returns String
**************************************************************/

println 'Entering map2Json function'
String jsonText = ''
ByteArrayOutputStream byteOut = new ByteArrayOutputStream()
org.codehaus.jackson.map.ObjectMapper mapper = new org.codehaus.jackson.map.ObjectMapper()

try {
    mapper.writeValue(byteOut, (HashMap)reqParms)
} catch (org.codehaus.jackson.JsonGenerationException e1) {
    e1.printStackTrace()
} catch (org.codehaus.jackson.map.JsonMappingException e2) {
    e2.printStackTrace()
} catch (IOException e3) {
    e3.printStackTrace()
}
jsonText = byteOut.toString()
println 'Exiting map2Json function'

return jsonText

After instantiating a new jackson ObjectMapper, the function feeds the ObjectMapper the reqParms Map object, which the function has received as an input parameter. Converting the ByteArrayOutputStream to a String is the final step before the function can return the JSON-formatted text back to the calling process.

Again, a warning is necessary here. There is no guarantee that the jackson libraries (or any other libraries such as JsonSlurper or XmlSlurper) will be available in future releases of Sales Cloud. If access from Groovy in AppComposer does get changed, it would be possible to replace dependency on the third-party libraries with a Groovy script that calls only native Java methods. Even though it means extra coding work and more lines of code, the work would be isolated to the global function and, with proper design, would not extend out to the trigger functions.

The converse function, json2Map, may look like this:

/*************************************************************
* Function Name: json2Map
* Returns: Map
* Description: Returns a Map of key:value pairs that are parsed from a passed-in JSON-formatted String
* Example: json2Map( String jsonText ) returns Map
**************************************************************/

println 'Entering json2Map function'
def map
org.codehaus.jackson.map.ObjectMapper mapper = new org.codehaus.jackson.map.ObjectMapper()
try {
    map = mapper.readValue(jsonText, Map.class)
} catch (org.codehaus.jackson.JsonGenerationException e1) {
    e1.printStackTrace()
} catch (org.codehaus.jackson.map.JsonMappingException e2) {
    e2.printStackTrace()
} catch (IOException e3) {
    e3.printStackTrace()
}
println 'Exiting json2Map function'
return map

Like the map2Json function, this function also takes advantage of the third-party jackson libraries. After instantiating a new ObjectMapper, it feeds the JSON-formatted String into the mapper and reads the value returned from the ObjectMapper as a Map, which is then returned to the calling process.

How does this pair of global functions measure up to the best practice design criteria? Again, there may be some room for marginal improvement, but not too much:

  • Minimize hard-coding. Not much apparent room for improvement.
  • Liberal use of input parameters. No need for any more parameters in its current implementation.
  • One job for one global function. These functions fulfill the one function/one job guideline, while not being too granular.
  • Take strategic advantage of return values. Return values fit the need of the calling processes, as will be seen below.
  • Manage the scope of error and exception handling. The functions include very basic exception handling at the level of the JSON processing, i.e. to handle any exceptions thrown by the jackson libraries.

Again, given the possibility that the availability of the Groovy jackson libraries that provide JSON processing support may change in future releases, these functions are built with a level of granularity that will make it far easier to rewrite them if it becomes necessary.

Object Trigger Functions – How They Should Be Designed

With the three global functions now in place to support REST calls to an external service from Sales Cloud business processes, the focus can now change to designing and building object trigger functions in Application Composer to support the business requirements. For this use case, one of the requirements is to create a dedicated opportunity folder in Document Cloud Service whenever a new opportunity is created. There are related requirements to change the Document Cloud Service folder name if the opportunity name changes, and also to delete the dedicated opportunity folder if and when the opportunity in Sales Cloud is deleted.

All three business requirements can be satisfied with object trigger functions for the Opportunity object. To get to the design-level entry point for trigger functions, open Application Composer, change to the Sales application if necessary (which is where the Opportunity object lives), open the Opportunity object, and drill down to the Server Scripts link.

From the Server Scripts work area page, click on the Triggers tab, and then either click the New icon or select Add from the Action drop-down. This will bring up the “Create Object Trigger” page. The first task on this page, which is irreversible by the way, is to decide on which trigger event to fire the function. The strategy here is to select the event which offers the highest degree of transaction integrity.

The “After Insert in Database” event is not perfect, but it does satisfy the requirement of calling the REST service to create the folder in Document Cloud Service only after the opportunity is successfully created on the Sales Cloud side. (Refer to Section 3.9 of the Release 8 Groovy Scripting Guide for a detailed list of the fifteen trigger events that can be accessed at the object level.)

Here is the implementation of the CreateNewOpportunityFolderTrigger object trigger script:

/***********************************************************
* Trigger: After Insert in Database
* Trigger Name: CreateNewOpportunityFolderTrigger
* Description: Creates new opportunity DocCloudService folder after opportunity record is written to database
************************************************************/

println 'Entering CreateNewOpportunityFolderTrigger'
def docFolderGuid = nvl(DocFolderGuid_c, '')
if (!docFolderGuid) {
    def restParamsMap = adf.util.getDocCloudParameters()
    def urlExt = '/folders/' + restParamsMap.topFolderGuid
    def reqPayload = [name:(Name)]
    def respPayload = adf.util.callRest('POST', urlExt, reqPayload)
    def respMap = adf.util.json2Map(respPayload)
    //TODO: better error checking required here
    def newFolderGuid = respMap.id
    setAttribute('DocFolderGuid_c', newFolderGuid)
    println 'DocFolderGuid is ' + nvl(DocFolderGuid_c, 'null')
    def urlExtSub = '/folders/' + newFolderGuid
    def reqPayloadDocuments = [name:'Documents']
    def respPayloadDocuments = adf.util.callRest('POST', urlExtSub, reqPayloadDocuments)
    def reqPayloadSpreadsheets = [name:'Spreadsheets']
    def respPayloadSpreadsheets = adf.util.callRest('POST', urlExtSub, reqPayloadSpreadsheets)
    def reqPayloadPresentations = [name:'Presentations']
    def respPayloadPresentations = adf.util.callRest('POST', urlExtSub, reqPayloadPresentations)
    def reqPayloadPublished = [name:'Published']
    def respPayloadPublished = adf.util.callRest('POST', urlExtSub, reqPayloadPublished)
} else {
    println 'Opportunity folder already created for ' + Name
}
println 'Exiting CreateNewOpportunityFolderTrigger'

NOTE: The script depends upon the existence of a custom field that was added to the top-level Opportunity object: DocFolderGuid with the API name of DocFolderGuid_c.

Sequential sub-tasks that are handled by the script:

  1. Checks to see if the folder already exists by checking whether DocFolderGuid_c is null. If it is, the script builds a REST endpoint URI containing the parent folder GUID under which the new opportunity folder will be created.
  2. Builds a request payload consisting of a Map containing the name of the folder to be created.
  3. Invokes the callRest global function to create the opportunity folder.
  4. Converts the returned JSON response from the callRest function to a Map, pulls out the GUID value for the newly-created folder, and assigns this value to the custom DocFolderGuid field.
  5. Creates four subfolders under the dedicated opportunity folder which was just created.

By relying on calls to global functions, all of these subtasks can be done with fewer than 24 lines of code (although adding complete error/exception handling to this script would probably increase this line count significantly).

The object trigger script to handle the folder renaming is even more compact. There may be an inclination to hook into one of the field-level trigger events, which would fire after the value of a specific field has changed. But this strategy leads to problems. Testing/debugging shows that this event fires too often, and far too many network REST/HTTP calls, most of them totally unnecessary, would be generated by hooking into this event. As it turns out, if it is necessary to check for a change in the value of a field using the isAttributeChanged() function, the “Before Update in Database” event is the correct event for this object trigger script:

/***********************************************************
* Trigger: Before Update in Database
* Trigger Name: ModifyOpportunityFolderTrigger
* Description: Renames the opportunity DocCloudService folder when the Opportunity Name changes
************************************************************/

println 'Entering ModifyOpportunityFolderTrigger'
if (isAttributeChanged('Name')) {
    def reqPayloadFolder = [name:nvl(Name, 'NULL Opportunity Name -- should not happen')]
    def docFolderGuid = nvl(DocFolderGuid_c, '')
    if (docFolderGuid) {
        def respPayloadFolder = adf.util.callRest('PUT', '/folders/' + docFolderGuid, reqPayloadFolder)
        println respPayloadFolder
        def respMapFolder = adf.util.json2Map(respPayloadFolder)
        //TODO: error checking based on respMapFolder
    } else {
        println 'Empty DocFolderGuid, so cannot complete operation.'
    }
} else {
    println 'No name change, so no folder edit.'
}
println 'Exiting ModifyOpportunityFolderTrigger'

This script initially determines if there is any work for it to do by checking to see if the value of the Opportunity Name was updated. If so, it verifies that a Document Cloud Service folder GUID has been stored for the active Opportunity, and then, after building a request payload (in this case a one-element Map key:value pair) containing the new folder name, the script calls the callRest global function to change the folder name.

One more object trigger function completes the example use case for extending default Sales Cloud Opportunity processing.   Because the global functions for REST support were designed properly (more or less), it is fairly simple to write the trigger function to delete the dedicated Document Cloud Service folder whenever a Sales Cloud Opportunity is deleted. (Instead of deleting the folder and its contents, it would be just as easy to move the document folder to an archive location, which may be a more real-world example of what a Sales Cloud user would require.)

Here is the object trigger script:

/************************************************************
* Trigger: After Delete in Database
* Trigger Name: DeleteOpportunityFolderTrigger
* Description: Will delete DocCloudService folder tied to a specific opportunity when an Opportunity object is deleted in the database
*************************************************************/

println 'Entering DeleteOpportunityFolderTrigger'
def respPayload
def docFolderGuid = nvl(DocFolderGuid_c, '')
if (docFolderGuid) {
    respPayload = adf.util.callRest('DELETE','/folders/' + docFolderGuid, [:])
    // TODO: error checking with response payload
    println respPayload
} else {
    println 'No GUID for Opportunity folder, so no folder delete'
}
println 'Exiting DeleteOpportunityFolderTrigger'

This script once again checks to see if there is a value in the DocFolderGuid custom field that points to the Opportunity-specific folder that was created in the Document Cloud Service instance. If so, it then builds a REST URI and passes control to the callRest global function. The pattern is very similar to the other object trigger functions: build the required pieces that are going to be passed to a global function, then call it, and finally process the response. The global function should be built with the appropriate try/catch structures in place so that system-level exceptions are caught and dealt with at that level. However, the situation-specific object trigger function needs to handle any higher-level exception processing that could not (should not) be caught in the global function. (More detail on exception processing will be covered in Part 2 of this post.)

Which Trigger Event for What Purpose?

The mantra of experienced Application Composer Groovy scriptwriters normally is to use the “Before…” trigger types. As the names imply, these types of functions fire before an event occurs. They are the best fit when a function (through web services calls) will be populating standard or custom field attributes prior to a transaction getting saved to the database. But there also may be a valid case for taking advantage of the “After” triggers, especially when the web service being called is creating or updating a record (or database row) of its own in another database. An example here will help to clarify. Suppose that the business requirement is for an object to be created in the target system whenever an Opportunity, Account, Contact, etc. is created in Sales Cloud. It makes more sense to have the web service call fire only after it is absolutely certain that the Sales Cloud object has been created, modified, or deleted, whatever the case may be. (More details on handling trigger events will be covered in Part 2 of this post.)

More often than not, after everything has been fleshed out, it will be necessary to refactor and possibly restructure both global and trigger functions after these components have been subjected to analysis and to unit testing. Obviously the preferred approach is to recognize the need for refactoring as early along in the development lifecycle as possible so as to minimize the amount of rework and retesting required.

Summary – Part 1

Much like building a house from the ground up or adding on to an existing home, integrating Sales Cloud with external applications using web services in Application Composer requires advance planning, but the amount of effort for planning can be minimized if best practices and proven design patterns exist, and are followed, to shortcut the design process. The design patterns presented in this post demonstrate that reusable global functions can be built in such a way that object trigger functions handling the business logic can rely on them to do the bulk of the processing. Following this pattern will result in leaner, tighter, more efficient code that is easier to maintain, and in many cases, will be reusable across multiple Sales Cloud applications. Although the context here is REST web services, following the design pattern presented here will produce the same desirable outcome for other development scenarios. The net end result is something that pleases everyone: a development team with members who can be more responsive to business end users and an organization that isn’t tied down by having to reinvent the wheel every time a new business requirement surfaces.

Part 2 of this post will present other related design patterns in Application Composer and Groovy scripting, again with a grounding reference to an integration pattern between Sales Cloud and Documents Cloud Service.

Integrating with Documents Cloud using a Text Editor as an example


Introduction

Oracle Documents Cloud Service provides a powerful REST API and embed tools that allow integration with almost anything today. This section covers a web text editor that reads HTML/text content from, and sends it to, the Documents Cloud using basic JavaScript and jQuery. The first example covers the basic action of loading a text area and sending its content as a text file to the server. For the second example, the web editor is CKEditor, and the example covers the following steps:

  •  Basic Authentication
  •  Download (GET) an existing file to the editor from the Documents Cloud
  •  Change the text and save (POST) to the Documents Cloud as a new revision

This post is the first of a series covering different uses of the API for document creation and manipulation. Other features such as the Document Picker will be covered in the future.

Main Article

To keep this example simple, I have removed complex scripts and functions and am using basic authentication. The web editor component, CKEditor, is also loaded with minimal plugins and only basic features are covered; you can change this later in a more complex solution.

What you need

  •   Access to an Oracle Documents Cloud instance (https://cloud.oracle.com/documents)
  •   A web server to host your custom HTML file
  •   Optionally, download the 3rd party editor CKEditor (any edition) from http://ckeditor.com
    (the examples use the CDN version)
  •   Follow the steps and have fun

Preparing the environment

Each example is a single HTML file that you can host on a web server.

For this example, we will create three HTML files: hellodocs.html to test the environment, simpletexteditor.html with just a textarea, and texteditor.html, which uses the 3rd party CKEditor to create rich text and send it to the Documents Cloud.

Testing the environment

Test access to the Documents Cloud UI by entering your username, password, and identity domain. The address will look like this: https://<DoCS_server>/documents. Optionally, you can also enter the address https://<DoCS_server>/documents/api to see the response from the server.

To make sure that your environment is ready, use this code to see if you can run JQuery and you have connectivity to your Oracle Documents Cloud instance:

        $(document).ready(function(){
          $("#hellodocs").click(function(){
              var docsUrl = DoCSinstance.value + '/documents/api/1.1';
              $("#test1").text('Loading!');
              $.ajax ( {
                  type: 'GET',
                  url: docsUrl,
                  dataType: 'text',
                  beforeSend: function (xhr) {
                      xhr.setRequestHeader ('Authorization',
                        'Basic ' + btoa(DoCSuser.value + ':' + DoCSpassword.value));
                  },
                  success: function(data) {
                      $("#test1").text(data);
                      $("#editor1").val(data);
                      $("#status").text('Success');
                  },
                  error: function(jqXHR, textStatus, errorThrown) {
                      $("#status").text('ErrorMessage: '+ jqXHR.responseText);
                      $("#test1").text('Error: '+ textStatus, errorThrown);
                  }
              } );
          });
        });

 

For the jQuery REST calls you need to have the correct setup to avoid CORS issues, or else you will experience 401 error messages.

Now include the following code to test the CKEditor:

    <script src="//cdn.ckeditor.com/4.4.7/full/ckeditor.js"></script>
    <script src="//cdn.ckeditor.com/4.4.7/full/adapters/jquery.js"></script>
    <script>
      $( document ).ready( function() {
          $("#editor1").ckeditor();
      } );
    </script>

 

Full Hello World code here:

<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8">
    <meta http-equiv="X-UA-Compatible" content="IE=edge,chrome=1">
    <title>Hello DoCS</title>
    <meta name="description" content="Hello Documents Cloud - by A-Team">
    <script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.3/jquery.min.js"></script>
    <script src="//cdn.ckeditor.com/4.4.7/full/ckeditor.js"></script>
    <script src="//cdn.ckeditor.com/4.4.7/full/adapters/jquery.js"></script>
    <script>
      $( document ).ready( function() {
          $("#editor1").ckeditor();
      } );
    </script>
  </head>
  <body>
    <script>
        $(document).ready(function(){
          $("#hellodocs").click(function(){
              var docsUrl = DoCSinstance.value + '/documents/api/1.1';
              $("#test1").text('Loading!');
              $.ajax ( {
                  type: 'GET',
                  url: docsUrl,
                  dataType: 'text',
                  beforeSend: function (xhr) {
                      xhr.setRequestHeader ('Authorization',
                        'Basic ' + btoa(DoCSuser.value + ':' + DoCSpassword.value));
                  },
                  success: function(data) {
                      $("#test1").text(data);
                      $("#editor1").val(data);
                      $("#status").text('Success');
                  },
                  error: function(jqXHR, textStatus, errorThrown) {
                      $("#status").text('ErrorMessage: '+ jqXHR.responseText);
                      $("#test1").text('Error: '+ textStatus, errorThrown);
                  }
              } );
          });
        });
      </script>
      <h2>Hello Documents Cloud</h2>
      <p>
      Username: <input id="DoCSuser" type="text" value="tenant.user">
      Password: <input id="DoCSpassword" type="password" value="userpassword">
      </p>
      DoCS Instance: <input id="DoCSinstance" type="text" size="36" value="https://tenant.documents.us2.oraclecloud.com">
      <button id="hellodocs">DoCS Test</button>
      <p id="test1">.</p>
      <p id="status">Enter your username/password and Documents Cloud Instance to test</p>
      <textarea cols="80" id="editor1" name="editor1" rows="10">
        Some Text
      </textarea>
  </body>
</html>

 

With this simple code, your page will look like this:

[Screenshot: Hello Documents Cloud test page]

Cloud Text Editor

Only two REST calls are used in the example: Download File (GET /documents/api/1.1/files/{{file id}}/data) and Upload File Version (POST /documents/api/1.1/files/{{file id}}/data).

Here you can find the code example of the text editor in the cloud, integrated with the Documents Cloud:

[Example 1: simpletexteditor.html]

 

<!DOCTYPE html>
<html>
    <head>
        <meta charset="utf-8">
        <meta http-equiv="X-UA-Compatible" content="IE=edge,chrome=1">
        <title>Simple Documents Cloud Text Editor Example(Not really editor)</title>
        <meta name="description" content="Simple Documents Cloud Text Editor (Not really editor)">
        <script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.3/jquery.min.js"></script>      
    </head>
    <body>    
        <script>
            $(document).ready(function(){
                $("#btnLoadDoc").click(function(){
                    var docsUrl = DoCSinstance.value + '/documents/api/1.1';
                    var strFileId = metadataFileId.value;
                    $("#status").text('Loading!');
                    $.ajax ( {
                        type: 'GET',
                        url: docsUrl + '/files/' + strFileId + '/data',
                        crossDomain: true,
                        xhrFields: { withCredentials: true },                        
                        beforeSend: function (xhr) { 
                            xhr.setRequestHeader ('Authorization', 
                                                  'Basic ' + btoa(DoCSuser.value + ':' + DoCSpassword.value)); 
                        },
                        success: function(data) { 
                            $("#editor1").text(data);
                            $("#status").text('Document loaded');
                            $("#metadataInfo").text('');
                        },
                        error: function(jqXHR, textStatus, errorThrown) {
                            $("#status").text('ErrorMessage: '+ jqXHR.responseText);
                            $("#metadataInfo").text('Error: '+ textStatus, errorThrown);
                            
                        }
                    } ); 
                });
                $("#btnSaveDoc").click(function(){
                    var docsUrl = DoCSinstance.value + '/documents/api/1.1';
                    var strFileId = (metadataFileId.value == '' ? '' : '/' + metadataFileId.value);
                    var strFileName = metadataFilename.value;
                    $("#status").text("Saving!");
                    var fileContent = new Blob([editor1.value], { type: 'text/plain'});
                    var filePackage = new FormData();
                    filePackage.append('jsonInputParameters','{"parentID": "self"}');
                    filePackage.append('primaryFile',fileContent, strFileName);
                    $.ajax ( {
                        type: 'POST',
                        url: docsUrl + '/files' + strFileId + '/data',
                        enctype: 'multipart/form-data',
                        data: filePackage,
                        cache: false,
                        processData: false,
                        contentType: false,
                        crossDomain: true,
                        xhrFields: { withCredentials: true },
                        beforeSend: function (xhr) { 
                            xhr.setRequestHeader ('Authorization', 
                                                  'Basic ' + btoa(DoCSuser.value + ':' + DoCSpassword.value));  
                        },
                        success: function(data) { 
                            $("#status").text('Document Saved');
                            $.each(data, function(key, value) { 
                                if(key == "version"){
                                    $("#metadataVersion").text('Version: ' + value);
                                }
                                if(key == "id"){
                                    $("#metadataFileId").val(value);
                                }
                                $("#metadataInfo").append(key + ': ' + value + '<br>');
                            });
                        },
                        error: function(jqXHR, textStatus, errorThrown) {
                            $("#status").text('ErrorMessage: '+ jqXHR.responseText);
                            $("#metadataInfo").text('Error: '+ textStatus, errorThrown);
                        }
                    } ); 
                });
            });                     
        </script>
        <h2>Simple Oracle Documents Cloud Text Editor sample</h2>
        <p>
        Username: <input id="DoCSuser" type="text" value="tenant.user">
        Password: <input id="DoCSpassword" type="password" value="userpassword">
        </p>
        Documents Cloud Address: <input id="DoCSinstance" type="text" size="50" value="https://tenant.documents.us2.oraclecloud.com">
        <p>
        File Name: <input id="metadataFilename" type="text" size="10" value="">
        File Id: <input id="metadataFileId" type="text" size="53" value="">
        <span id="metadataVersion" style="color:blue">--</span>
        </p>
        <p></p>
        <button id="btnLoadDoc">Load Text</button>
        <button id="btnSaveDoc">Save Text</button>
        <br>
        <textarea cols="80" id="editor1" name="editor1" rows="10">
            My First Documents Cloud text document
        </textarea>
        <p id="status">Enter your username/password and Documents Cloud Instance to test</p>
        <p id="metadataInfo"></p>
        <script>
            $("#metadataFilename").val('mytext' + (Math.floor((Math.random() * 1000) + 1)) + '.txt');
            $("#btnLoadDoc").prop('disabled', true);
            $('#metadataFileId').on('input', function() {
                $("#btnLoadDoc").prop('disabled', false);
            });
        </script>
    </body>
</html>

 


Now including the 3rd party CKEditor:

[Example 2: texteditor.html]

 

<!DOCTYPE html>
<html>
    <head>
        <meta charset="utf-8">
        <meta http-equiv="X-UA-Compatible" content="IE=edge,chrome=1">
        <title>Custom Documents Cloud Text Editor Sample - by A-Team</title>
        <meta name="description" content="Custom Documents Cloud Text Editor - by A-Team">
        <script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.3/jquery.min.js"></script>
        <script src="//cdn.ckeditor.com/4.4.7/full/ckeditor.js"></script>
        <script src="//cdn.ckeditor.com/4.4.7/full/adapters/jquery.js"></script>
        <script>
          $( document ).ready( function() {
              $("#editor1").ckeditor();
          } );
        </script>
    </head>
    <body>    
        <script>
            $(document).ready(function(){
                $("#btnLoadDoc").click(function(){
                    var docsUrl = DoCSinstance.value + '/documents/api/1.1';
                    var strFileId = metadataFileId.value;
                    $("#status").text('Loading!');
                    $.ajax ( {
                        type: 'GET',
                        url: docsUrl + '/files/' + strFileId + '/data',
                        crossDomain: true,
                        xhrFields: { withCredentials: true },                        
                        beforeSend: function (xhr) { 
                            xhr.setRequestHeader ('Authorization', 
                                                  'Basic ' + btoa(DoCSuser.value + ':' + DoCSpassword.value)); 
                        },
                        success: function(data) { 
                            $("#editor1").val(data);
                            $("#status").text('Document loaded');
                            $("#metadataInfo").text('');
                        },
                        error: function(jqXHR, textStatus, errorThrown) {
                            $("#status").text('ErrorMessage: '+ jqXHR.responseText);
                            $("#metadataInfo").text('Error: '+ textStatus, errorThrown);
                            
                        }
                    } ); 
                });
                $("#btnSaveDoc").click(function(){
                    var docsUrl = DoCSinstance.value + '/documents/api/1.1';
                    var strFileId = (metadataFileId.value == '' ? '' : '/' + metadataFileId.value);
                    var strFileName = metadataFilename.value;
                    $("#status").text("Saving!");
                    var fileContent = new Blob([$("#editor1").val()], { type: 'text/plain'});;
                    var filePackage = new FormData()
                    filePackage.append('jsonInputParameters','{"parentID": "self"}');
                    filePackage.append('primaryFile',fileContent, strFileName);
                    $.ajax ( {
                        type: 'POST',
                        url: docsUrl + '/files' + strFileId + '/data',
                        enctype: 'multipart/form-data',
                        data: filePackage,
                        cache: false,
                        processData: false,
                        contentType: false,
                        crossDomain: true,
                        xhrFields: { withCredentials: true },
                        beforeSend: function (xhr) { 
                            xhr.setRequestHeader ('Authorization', 
                                                  'Basic ' + btoa(DoCSuser.value + ':' + DoCSpassword.value));  
                        },
                        success: function(data) { 
                            $("#status").text('Document Saved');
                            $.each(data, function(key, value) { 
                                if(key == "version"){
                                    $("#metadataVersion").text('Version: ' + value);
                                }
                                if(key == "id"){
                                    $("#metadataFileId").val(value);
                                }
                                $("#metadataInfo").append(key + ': ' + value + '<br>');
                            });
                            $("#btnLoadDoc").prop('disabled', false);
                        },
                        error: function(jqXHR, textStatus, errorThrown) {
                            $("#status").text('ErrorMessage: '+ jqXHR.responseText);
                            $("#status").text('Login Error: '+ textStatus, errorThrown);
                        }
                    } ); 
                });
            });                     
        </script>
        <h2>Oracle Documents Cloud Text Editor Sample</h2>
        <p>
        Username: <input id="DoCSuser" type="text" value="tenant.user">
        Password: <input id="DoCSpassword" type="password" value="userpassword">
        </p>
        Documents Cloud Address: <input id="DoCSinstance" type="text" size="50" value="https://tenant.documents.us2.oraclecloud.com">
        <p>
        File Name: <input id="metadataFilename" type="text" size="10" value="">
        File Id: <input id="metadataFileId" type="text" size="53" value="">
        <span id="metadataVersion" style="color:blue">--</span>
        </p>
        <p></p>
        <button id="btnLoadDoc">Load Text</button>
        <button id="btnSaveDoc">Save Text</button>
        <br>
        <textarea cols="80" id="editor1" name="editor1" rows="10">
            My First <b>Documents Cloud</b> text document
        </textarea>
        <p id="status">Enter your username/password and Documents Cloud Instance to test</p>
        <p id="metadataInfo"></p>
        <script>
            $("#metadataFilename").val('mytext' + (Math.floor((Math.random() * 1000) + 1)) + '.html');
            $("#btnLoadDoc").prop('disabled', true);
            $('#metadataFileId').on('input', function() {
                $("#btnLoadDoc").prop('disabled', false);
            });
        </script>
    </body>
</html>

 

Your page will look like this:

Documents Cloud Text Editor Sample Screenshot

 

Simple and powerful.

 

View of the Documents Cloud UI and the articles stored with the example:

Documents Cloud UI

 

You can also use inline editing with multiple editable regions:

Inline Editing example

 

More complex code can use multiple instances of the editor with different Documents Cloud documents on the same page.

 

 

Conclusion

This article covers the basics of using the Documents Cloud REST API, with a simple example of what you can do with Oracle Documents Cloud Service. From here you can expand to more features for a complete solution fitting your needs, such as browsing for an image in the repository and including it in the text, creating new files, sharing your documents with others, and much more.

 

Reference

Oracle Documents Cloud Service info: http://cloud.oracle.com/documents

DoCS REST API: http://docs.oracle.com/cloud/latest/documentcs_welcome/WCCCD/odcs-restapi.htm

(3rd Party) CKEditor Documentation: http://docs.ckeditor.com

Calling Oracle Documents Cloud Service REST APIs from WebCenter Content Custom Components


Files from Oracle Documents Cloud Service can be migrated to and from WebCenter Content using the SkySync desktop application, which makes for simple transferring between WCC and ODCS. However, other use cases with ODCS may be desired from within WCC itself. To interact with ODCS from WCC, a custom component can be created to call Documents Cloud REST APIs. This allows for creating hybrid ECM solutions specific to business needs. This post illustrates one method for integrating WCC with ODCS using the AppLink REST API.

The method of creating folders in Documents Cloud and displaying AppLinks in on-premise WCC has many potential applications. Moreover, the use of the AppLink feature of Documents Cloud has many integration possibilities outside of WCC and the same approach may be re-used in other Oracle applications, and even third party applications. The use case here is just an illustration of how to use the AppLink feature in on-premise WebCenter Content to achieve additional hybrid ECM functionality.

The use case in the sample is to create Documents Cloud attachments tied to an “anchor” document in WebCenter Content. The demo uses a “JobApplications” profile as the sample use case. The WCC content item is a Job Posting that provides the anchor for attachments in the cloud. When a Job Posting content item is created in WCC, a cloud folder is created at the same time and the GUID for that cloud folder is stored in a WCC metadata field. The AppLink feature allows temporary access for non-provisioned Documents Cloud users. This is accomplished through a REST call that grants tokens for displaying a cloud folder.
When the anchor content item’s DOC_INFO page is displayed, an AppLink is generated to display the embedded cloud folder. This provides an attachments user interface in on-premise WCC.

The embedded AppLink on WebCenter Content for this example displays the iframe on the top of the DOC_INFO page. This enables drag and drop uploads to Documents Cloud and secures the cloud content so that it is only visible to users that have access to that content item in WCC.

Topics covered in this post are:

  • WebCenter Content Filters
  • Reading Credentials from Enterprise Manager in a WCC Component
  • Using Jersey to call Documents Cloud REST Services
  • Parsing JSON responses into POJOs
  • Overriding the display of a metadata field to display the AppLink iframe

 

 

Documents Cloud folder AppLink embedded on DOC_INFO page in WebCenter Content

 


 

WebCenter Content Filters Review

Using WebCenter Content filters, functionality to call Documents Cloud Service REST services can be added to a custom component. Filter events execute on service calls and allow the behavior of content server actions such as checkins, metadata updates, or retrieving content information to be modified. In this example, filter events in the CHECKIN and DOC_INFO services are used to trigger REST calls to Documents Cloud. A custom component’s hda file, also known as a “glue file” that describes the component, has a section for filter events that defines the Java code to execute. The Filters result set in the hda file defines the event, such as postComputeDocName or onEndServiceRequestActions, and the Java class that implements intradoc.shared.FilterImplementor. FilterImplementor has a doFilter method that can be overridden; this is where custom code is placed to run.
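For orientation, a minimal filter skeleton is sketched below. The class and package names are illustrative, and the exact FilterImplementor contract shown is an assumption; the sample component attached to this post contains the real implementations.

package demo.filter;

import intradoc.common.ExecutionContext;
import intradoc.common.ServiceException;
import intradoc.data.DataBinder;
import intradoc.data.DataException;
import intradoc.data.Workspace;
import intradoc.shared.FilterImplementor;

// Minimal sketch of a filter class referenced from the component's Filters result set.
public class MyDocsCloudFilter implements FilterImplementor {
    public int doFilter(Workspace ws, DataBinder binder, ExecutionContext cxt)
            throws DataException, ServiceException {
        // Custom code runs here, for example calling a Documents Cloud REST API and
        // writing results into the binder's local data with binder.putLocal(...).
        return CONTINUE; // let the content server service proceed normally
    }
}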

postComputeDocName: Upon checkin of a new content item, a cloud folder is created and the GUID for that folder is stored in a metadata field.
The filter class demo.filter.DOCSRestFilterSample accesses a CSF key for the Documents Cloud username and password. A call is made to the ODCS REST API to create a folder. The returned JSON response is stored in a POJO called FolderItems.java. This class allows easy access to the JSON response in a Java object. For details on the REST call to create a folder, see the documentation on the Folder Resource.

onEndServiceRequestActions: When a DOC_INFO page is accessed, this filter is used to read the CSF key for the Documents Cloud username and password. A REST call is then made to create an AppLink in ODCS. The JSON response is parsed into a class called AppLink.java. This allows simple access to the AppLink details. For details on the REST call to create an AppLink, see the documentation on the AppLink Resource. The AppLink response contains variables that must be set into the databinder’s local data section so that the values can be accessed from idocscript. An AppLink response contains several key pieces: a URL, an access token, a refresh token, and the user’s role for access rights to the Cloud Folder. At a minimum these values are needed to render an iframe properly with an AppLink URL.

 

@ResultSet Filters
4
type
location
parameter
loadOrder
postComputeDocName
demo.filter.DOCSRestFilterSample
null
1
onEndServiceRequestActions
demo.filter.DOCSAppLinkRestFilterSample
null
1
@end

 

Note: Discovering which filter to use for customizations is an important step in creating WebCenter Content custom components that use filters. To determine which filter to use, verbose “services” tracing on the System Audit Information page can be enabled. Be aware that this tracing is quite verbose, so clear the output and then run the service for which you are seeking a filter to use. Viewing only the output for the service call you are investigating will make it much easier to determine what filter hooks are available for your customization.

 

Since only two filters are used in this demo, only two Java classes are needed in the component to create the integration. Both classes are similar in that they require:

1. Access to a secured username and password, obtained using the JPS packages. The credentials are secured in Enterprise Manager and read only at runtime.

2. Ability to call external REST services. This can be done using Jersey or Apache CXF libraries (which must be added to the classpath in WCC). This example uses the Jersey classes.

3. Interaction with the service call’s localdata in the databinder. This is needed to modify the behavior of the checkin and document information service calls.

 


 

Create a credential in Enterprise Manager

A credential can be created through Enterprise Manager (EM) or through the WebLogic Scripting Tool (WLST). This is needed to secure the REST calls to the Documents Cloud Service. The filters will need to access the map and key programmatically.

To create a credential using WLST, execute the following command:

createCred(map="oracle.documents.cloud.service", key="DocumentsCloudUser", user="peter.flies@oracle.com", password="mysecretpassword")

 

To create a credential using EM, do the following:
1. Log in to Enterprise Manager.
2. Click WebLogic Domain.
3. Click Security and then Credentials.
4. Create the map and key:
a. Select Create Map.
b. Enter oracle.documents.cloud.service in the map field and click OK. Note that this can be something else if preferred, just remember to update the component’s environment setting DocumentsCloudUserCredMap.
c. Click Create Key. The key is now available for selection.
5. Enter a key name of DocumentsCloudUser. This is the credential alias used in the component’s environment file setting DocumentsCloudUserCredKey. This name can be anything, but the key must match the component’s environment file to locate the key (see the sketch after this list).
6. Select password as the type.
7. Enter a user name and password. (e.g. peter.flies@oracle.com, welcome1)
8. Optionally, enter a description for the credential. (e.g. “Documents Cloud user for Job Posting attachments”)
9. Click OK.
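The map and key names are read by the sample component from its environment file. A sketch of the two settings referenced in the steps above, shown with the default values (the setting names come from the sample component):

DocumentsCloudUserCredMap=oracle.documents.cloud.service
DocumentsCloudUserCredKey=DocumentsCloudUser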

 

Once this map and key are created, the JpsContextFactory class can be used to get the credentials at runtime. The code snippet below shows a method of interaction with the factory.

            JpsContextFactory ctxFactory;
            CredentialStore store;

            ctxFactory = JpsContextFactory.getContextFactory();
            JpsContext ctx = ctxFactory.getContext();
            store = ctx.getServiceInstance(CredentialStore.class);

            PasswordCredential pc = null;

            pc = (PasswordCredential) store.getCredential(map, key);
            if (pc == null) {
                Report.trace("restfilter", "Credential not found for map " + map + " and key " + key + ".", null);
                throw new DataException("Credential not found for map " + map + " and key " + key + ".");
            } 

            String authString = pc.getName() + ":" + new String(pc.getPassword());

            try {
                encodedAuth = DatatypeConverter.printBase64Binary(authString.getBytes("UTF-8"));
            } catch (UnsupportedEncodingException e) {
                e.printStackTrace();
            }
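The snippet above is a fragment taken from inside a filter method. For reference, the imports it relies on would look roughly as follows; the package names are assumptions based on the standard JPS and content server libraries:

import java.io.UnsupportedEncodingException;

import javax.xml.bind.DatatypeConverter;

import intradoc.common.Report;     // WCC trace logging (package assumed)
import intradoc.data.DataException;

import oracle.security.jps.JpsContext;
import oracle.security.jps.JpsContextFactory;
import oracle.security.jps.service.credstore.CredentialStore;
import oracle.security.jps.service.credstore.PasswordCredential;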

 


 

Creating the Folder and Storing the Folder GUID in Metadata

During a new checkin event a folder is created in Documents Cloud Service using standard Jersey libraries to make the REST call. The HTTP request is sent, and the JSON response will contain the newly created folder, identified by a GUID. This GUID must be kept with the content item in WCC. Metadata fields in WCC are ideal for storing this GUID. Worth noting is that this should only be executed on a new checkin and not a revision. Likewise, this should not occur on a metadata update. The cloud folder creation in this example is a one-time event, since the GUID will be used throughout all revisions of the WCC content item. The metadata field should also be “info only” for all users, locked down to block updates to the field.

The call to Documents Cloud REST APIs in Jersey can be done using the usual Jersey methods. One item to note here is that the client configuration can use FEATURE_POJO_MAPPING to map the JSON response into a Java object. This must be enabled when creating the client configuration.

 

        ClientConfig cc = new DefaultClientConfig();
        Client client = null;

        cc.getClasses().add(MultiPartWriter.class);
        cc.getFeatures().put(JSONConfiguration.FEATURE_POJO_MAPPING, Boolean.TRUE);

Once the client is prepared, the folder creation can be executed. Creating a Documents Cloud folder takes a JSON payload with a name and description. The POST message is sent and the response is mapped to a FolderItems object, which is a POJO that represents the JSON response.

        String jsonPayload = "{\"name\":\"" + folderName + "\",\"description\":\"" + folderDescription + "\"}";
        WebResource webResource = client.resource(restRequestUrl);
        Report.trace("restfilter", "Calling POST on " + restRequestUrl + " with payload " + jsonPayload, null);
        ClientResponse response =
            webResource.header("Authorization", "Basic " + encodedAuthString)
            .header("Content-Type", "application/json")
            .post(ClientResponse.class, jsonPayload);

        //Check for success, otherwise exit. Can be 200 or 201 on folder create
        if (response.getStatus() != 200 && response.getStatus() != 201) {
            Report.trace("restfilter", "Rest call failed with " + response.getStatus() + "\nStatus Info: " + response.getStatusInfo(), null);
            throw new RuntimeException("Failed : HTTP error code: " + response.getStatus() + "\nStatus Info: " +
                                       response.getStatusInfo());
        }

        fi = response.getEntity(FolderItems.class);
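For reference, a FolderItems POJO sufficient for this integration can be as small as the sketch below. The package name and the name property are illustrative (only id is read in this post), and the Jackson annotation assumes the POJO mapping support bundled with Jersey:

package demo.model;

import org.codehaus.jackson.annotate.JsonIgnoreProperties;

// Minimal sketch: map only what the integration needs and ignore the rest of the JSON response.
@JsonIgnoreProperties(ignoreUnknown = true)
public class FolderItems {
    private String id;    // GUID of the created cloud folder
    private String name;  // folder name

    public String getId() { return id; }
    public void setId(String id) { this.id = id; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}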

 

Once the POJO is ready, the metadata field can be assigned the folder GUID using the standard WCC databinder method putLocal.

            //Persist new folder into the metadata field 
            dataBinder.putLocal(folderGUIDField, fi.getId());

At this point, the cloud folder is saved to the content item’s metadata in WCC.

 

Creating the AppLink

In the other filter, a similar process is used to create the AppLink so that the logged in user can see the embedded Documents Cloud folder. The saved GUID in the metadata field is used to create the AppLink. The AppLink REST API is called in the same manner as the create folder API was called. First, the credential for the Documents Cloud user must be obtained. The AppLink service can be called using Jersey, and the JSON response can be parsed into a POJO as well to obtain the token information.

The service call to create a folder AppLink requires a JSON payload with “assignedUser” and “role”. The user that is currently logged into WCC should be the “assignedUser”. This demonstration assigns the “contributor” role to all AppLinks created, but the role could be determined if the user’s access rights to the content item were interrogated. For simplicity of the demo, the same role is used for all WCC users that have access to the content item’s DOC_INFO page.

        AppLink al = null;
        String jsonPayload = "{\"assignedUser\":\"" + assignedUser + "\",\"role\":\"" + role + "\"}";
        WebResource webResource = client.resource(restRequestUrl);
        Report.trace("restfilter", "Calling POST on " + restRequestUrl + " with payload " + jsonPayload, null);
        ClientResponse response =
            webResource.header("Authorization", "Basic " + encodedAuthString)
            .header("Content-Type", "application/json")
            .post(ClientResponse.class, jsonPayload);

        //Check for success, otherwise exit. Can be 200 or 201 on AppLink create
        if (response.getStatus() != 200 && response.getStatus() != 201) {
            Report.trace("restfilter", "Rest call failed with " + response.getStatus() + "\nStatus Info: " + response.getStatusInfo(), null);
            throw new RuntimeException("Failed : HTTP error code: " + response.getStatus() + "\nStatus Info: " +
                                       response.getStatusInfo());
        }

        al = response.getEntity(AppLink.class);
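The AppLink POJO follows the same pattern. A minimal sketch covering only the values used below, with property names assumed to match the JSON response:

package demo.model;

import org.codehaus.jackson.annotate.JsonIgnoreProperties;

// Minimal sketch of the AppLink response mapping.
@JsonIgnoreProperties(ignoreUnknown = true)
public class AppLink {
    private String appLinkUrl;
    private String accessToken;
    private String refreshToken;
    private String role;

    public String getAppLinkUrl() { return appLinkUrl; }
    public void setAppLinkUrl(String appLinkUrl) { this.appLinkUrl = appLinkUrl; }
    public String getAccessToken() { return accessToken; }
    public void setAccessToken(String accessToken) { this.accessToken = accessToken; }
    public String getRefreshToken() { return refreshToken; }
    public void setRefreshToken(String refreshToken) { this.refreshToken = refreshToken; }
    public String getRole() { return role; }
    public void setRole(String role) { this.role = role; }
}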

 

Once the response is obtained and parsed into a POJO created for the AppLink, the databinder can have the required tokens and details added using the putLocal method.

            dataBinder.putLocal("appLinkRefreshToken", al.getRefreshToken());
            dataBinder.putLocal("appLinkAccessToken", al.getAccessToken());
            dataBinder.putLocal("appLinkUrl", al.getAppLinkUrl());
            dataBinder.putLocal("appLinkRole", al.getRole());

 


 

Displaying the AppLink in an iframe on the DOC_INFO page

With the AppLink URL prepared and ready for display, the remaining step is to enable it on the desired page. This example overrides the metadata field that is storing the GUID and replaces it with an iFrame. With dynamichtml includes, the display of WebCenter Content pages is customizable and the placement of the iframe could be done in different ways. A simple way to do this is to use a custom include for the metadata field (e.g. xDocumentCloudFolder in the sample).

The appLinkUrl that was already set in the databinder is accessible in server side idocscript. This allows the iframe source URL to be set during page creation. The include that displays the field should be limited only to execute when the page is a DOC_INFO page (IsInfo) and when the field actually contains a value.

Custom include for metadata field that holds the cloud folder GUID
Using Profiles and Rules, the display of the cloud folder metadata field can be altered as needed. Profiles and Rules are a feature of WebCenter Content and allow admin users to change behavior and display of metadata fields.

As was stated before, the display of the iframe can be done in different ways and this is only one example of how to accomplish the UI end of this demonstration.

<@dynamichtml DOCSRestFilterSample_display_field_xDocumentCloudFolder@>
	[[% Used as a custom include in the rule DocumentsCloudFolder. Displays the iframe for the AppLink to load. %]]

	<$if isInfo AND fieldValue$>

		<$trace("DOCSRestFilterSample display_field_xDocumentCloudFolder", "#console", "restfilter")$>
		<iframe id="content_frame" src="<$appLinkUrl$>" style="width: 100%; height: 520px; overflow: scroll;"></iframe>
	<$endif$>
<@end@>

 

 


 

Handling the appLinkReady messaging between the parent and the iframe

When using the AppLink resource of Documents Cloud, placing the returned URL into an iframe’s src attribute is not the last step. A message must be handled from the Documents Cloud service when the AppLink is “ready”. This can be done using the HTML5 postMessage method. A message containing the access tokens and the user role must be passed to the iframe in order for the AppLink to appear.

One dynamichtml include that can handle the appLinkReady event from Documents Cloud Service is custom_schema_finalize_display. This include is used because it is one of the last to load during page creation of WebCenter Content pages, meaning the DOM is ready.  JavaScript handling of the appLinkReady event is added to pass the appropriate message to the iframe. Again, the contributor role is assumed in this sample but could be assigned based on WCC rights for the logged in user.

<@dynamichtml custom_schema_finalize_display@>
	[[% Outputs trace to the console on the "system" subject. Also, browser console outputs added for illustration purposes. %]]
	<$trace("DOCSRestFilterSample include", "#console", "restfilter")$>
	<script type="text/javascript">
		console.log("DOCSRestFilterSample: applink in ready function");

		function OnMessage (evt) {
			console.log("DOCSRestFilterSample: onMessage function " + evt);
			if (evt.data.message === 'appLinkReady') {
				console.log("OnMessage invoked for appLinkReady event...");
				var dAppLinkUrl = "<$appLinkUrl$>";
				var dAppLinkRefreshToken = "<$appLinkRefreshToken$>";
				var dAppLinkAccessToken = "<$appLinkAccessToken$>";
				var dAppLinkRoleName = "<$appLinkRole$>";
				var embedPreview = "true";

				var iframe = document.getElementById("content_frame");
				var iframewindow = iframe.contentWindow ? iframe.contentWindow : iframe.contentDocument.defaultView;
				var msg = {
					message: 'setAppLinkTokens',
					appLinkRefreshToken: dAppLinkRefreshToken,
					appLinkAccessToken: dAppLinkAccessToken,
					appLinkRoleName: dAppLinkRoleName,
					embedPreview: embedPreview
				};

				console.log("DOCSRestFilterSample: sending message to iframe with access and refresh tokens");

				iframewindow.postMessage(msg, '*');
			}
		}
		window.addEventListener && window.addEventListener('message', OnMessage, false);
	</script>
<@end@>

If the appLinkReady event is not handled, the AppLink URL will show a generic loading page that will never fully render.

 


 

AppLink in Action: Checkin a Job Posting and Add Cloud Attachments

 

The end-user scenario for this example is a WebCenter Content user who creates Job Postings and attaches or views submitted resumes for candidates seeking the job. The steps are basic content server tasks:

1. Checkin a content item

2. Go to the DOC_INFO page for the content item.

3. Drag and drop attachments into the iframe.

4. View the attachments.

5. When the user navigates away from the DOC_INFO page, the AppLink is no longer used. Each time the DOC_INFO page is opened, a new AppLink is generated.

 



After checkin, the DOC_INFO page will show an empty cloud folder in the iframe.

 

Information page displaying newly created Cloud Folder with no files

Files can be dragged into the folder or uploaded using the Upload icon on the AppLink. The AppLink can be opened in a separate window as well using the popout icon.

To verify the folder’s creation, log into the Documents Cloud Service as the owner of the folders in a separate browser or window. As users create Job Postings, folders will begin appearing in the Documents Cloud interface, but using the AppLink feature, users will only see the folder tied to the content item in WebCenter Content.

Documents uploaded to Cloud folder

 


 

Sample Component

Download the attached DOCSRestFilterSample.zip file for the sample component described in this post. Note that this sample is provided as-is and not supported, and is intended solely to show the possibilities of WCC integrations with Documents Cloud through the REST API.

DOCSRestFilterSample

 

 


 

Using File Based Loader for Fusion Product Hub


Using File Based Loader for Fusion Product Hub

Introduction

File Based Loaders (FBL) offer a broad range of options to import batch data, either manually through user interaction or automatically via locally scheduled processes using existing APIs and Web Services. This article will highlight the Fusion Product Hub-specific capabilities for importing item data in batches using FBL. Another, more generic article about File Based Loaders can be found here.

The current FBL solution for Fusion Product Hub is covering the following customer scenarios:

  • Manual (UI based) upload and import of item data to Fusion Product Cloud or Fusion Product Hub on-premise instances
  • Automated loader and import processes for item data to Fusion Product Cloud or Fusion Product Hub on-premise instances

This article describes a technical implementation that can be used with Fusion Product Hub in the cloud and with on-premise installations in the same way. It will also cover some basic and necessary functional setup aspects. Please note that item import via FBL doesn’t replace the other Product Hub solutions for data import such as item batch imports via Fusion Desktop Integration. It can rather be seen as an additional offering for item imports.

Main Article

File Based Loader for Fusion Product Hub uses standard technologies and components from the Fusion Apps technology stack on the backend – both in the cloud and on-premise. It’s not necessary to install extra components or products in addition to Fusion Product Hub.

The figure below visualizes the available product data load options for manual (user interaction through portal or desktop integration) and automatic (Web Services, APIs) scenarios. This blog will explain how to use the various features.

Overview

Customers can use a thin technology footprint in their client environment to use the capabilities of FBL for Fusion Product Hub. The following runtime components and tools are sufficient to create a connection to FBL for uploading item data and triggering scheduling jobs:

  • Java Development Kit (JDK) 1.8.x
  • JDeveloper 12c
  • WebCenter Content Document Transfer Utility for Oracle Fusion Applications (free of charge utility available on Oracle Technology Network)

Especially for cloud customers this footprint eliminates the necessity to install additional server components in their data center while Fusion Apps on-premise or Fusion Middleware customers can leverage their existing infrastructure to run the FBL related client programs and tools.

FBL can be seen as an additional integration point with an option to provide item loader data in Fusion Content Server (UCM) for further import processing. These tasks can be done as manual interactions (occasional item loads) or alternatively as automated tasks via scripts and APIs. Details are explained in the following sections:

  • Common functional setup for successful item imports
  • Loading data to Fusion Content Server
  • Initiating an item load scheduled job

Note: Other item import capabilities using Desktop Integration co-exist with the current FBL offering and remain an alternative import option for on-premise customers.

Part I: Functional Setup for Fusion Product Hub Item Loader

This blog will not cover all aspects of the functional setup steps. Instead, we’ll give a basic introduction to a functional setup that is generic enough to be valid for other item definitions as well. Fusion Product Hub offers a set of capabilities for defining custom item structures as required by customers’ needs.

DefineAttributes

In a first step, after receiving information describing the item structure and validations, an authorized user creates custom attributes by running the setup task Manage Attribute Groups and Attributes, as shown in the screenshot above. This step is optional and needs to be carried out only if attributes other than those available out of the box (operational attributes) in Product Hub are required.

DefineAttributes2

Attribute Groups consist of attributes, which describe a specific feature of an item. Attribute values can be validated by value sets or by more complex, coded validations. All these definitions are stored in an internal metadata repository called Extensible Flexfields (EFF).

DefineAttributes3

Once these Attributes and Attribute Groups have been defined, they can be assigned to Item Classes as shown below. New items loaded via FBL belong to dedicated item classes after import.

DefineItemClass

Before running an item import we must create a mapping between the import item structure and the equivalent item class in Fusion Product Hub. Once defined, we must save the Import Map and will refer to it later in the loader process.

DefineItemClass2

The screenshot below shows a sample mapping in an overview page. The mapping process consists of assigning columns in the CSV structure to attributes defined per item.

MouserImportMapDefinition2

The import structure for mapping is derived from a sample CSV file loaded to Fusion Product Hub. The first line (header) in a CSV file describes the columns in the import structure that have to be mapped to the target structure. This can be done via the UI by dragging target fields to source fields.

MouserImportMapDefinition

Last but not least, an Item Import runs in the context of a Spoke System in Fusion Product Hub. If not already existing, it must be created and assigned to an Item Organization. Every import job started via FBL must refer to a spoke system.

PIMDH_SpokeSystemDefinition

The functional setup for FBL Item Import as shown above doesn’t differ from any other item import like Desktop Integration. This is usually a one-time activity per import structure. The functional setup is complete after finishing the previous tasks.

Part II: Loading Product Data to Fusion Content Server

FBL leverages the Universal Content Management (UCM) server that comes with Fusion Product Hub for storing import files. It’s usually available under the following URL:

https://<fusion_product_hub_host>:<port>/cs

Customers have a choice to either use the FBL UI for occasional data loads or to set up machine-to-machine communication instead. This chapter gives an overview of the folder structures, basic security structures and functionality in UCM used to put loader files into a staging area for further processing, for both variants: manual and automated loader tasks.

Manual Steps

The login page of Fusion Content Server is available via the URL above.

UCM_LoginPage

In the demo system we’re using a Fusion Product Hub identity named PIMQA.

UCM_LoginPage2

This user PIMQA is assigned to the roles shown below. By using these roles we ensure that all required permissions are given to run File Based Loader.

PIMQA_Roles

File Based Loader requires two files to be available in UCM:

  • Data File in CSV format containing the item information
  • Manifest file defining the import mapping used (see above) and the path/name of the file containing the item data

Both files must exist and be accessible in Fusion Content Server before triggering the loader job. The screenshot below shows a sample of item data in CSV format.

NewStructureItemCSV

As stated above, a manifest file describes the file and import mapping information for a dedicated item load, as shown below.

NewStructureManifestCSV

The staging area for Fusion Product Hub is predefined as /Contribution Folders/PIM.

New files must be uploaded into that folder via the menu New Item. The screenshot below provides further details. The field Account must be filled with the correct values for access permissions – in this FBL for Fusion Product Hub sample we used scm$/item$/import$. This account is a seeded value and can be used for this purpose. Users can set up their own accounts and use them alternatively. It’s also possible to use Security Groups instead of Accounts when using UI-based file upload. More details about the security mechanisms are explained in the next section below.

UCM_UploadPimFile_Manually

Once all files have been uploaded – either manually or automatically via scripts – the required files must reside in the UCM folder before triggering the item load job. The screenshot below shows a sample.

UCM_PimFiles3

When double-checking the properties of the uploaded files, the screenshot below shows a typical setup (the meaning of Security Group and Account is explained further down in this document):

  • Folder: /Contribution Folders/PIM
  • Security Group: FAImportExport
  • Account: scm$/item$/import$

UCM_PimFiles2

As soon as these files have been uploaded and correct data has been provisioned, the UCM part of FBL is done and we can proceed to the next step.

Optional: Checklist Security Setup in UCM (on-premise only)

Normally there are no requirements to modify or extend the UCM security setup. The security features described above (i.e. Security Group, Account etc.) are supposed to exist already. However, for troubleshooting it is useful to have a quick checklist of the UCM security options needed by FBL for Fusion Product Hub. Full documentation can be found on the product documentation site here.

The following relationship between Users, Roles, Policies and Resources exist:

  • UCM resources like Security Groups and Accounts define access to various folders and files
  • These resources are grouped in APM Policies
  • APM Policies are assigned to Application Roles
  • Application Roles are assigned to Fusion Product Hub Users

The best option to check the security setup is to use a privileged user like FAAdmin. Obviously that won’t work in Fusion Product Cloud; it’s recommended to submit a Service Request if there is any doubt that security options might not be set correctly when using cloud services.

After logging in as a privileged user, open the Administration sub-page. In the right window pane a list of Admin Applets appears after activating Administration -> Admin Applets (see below).

UCM_AdminApplets

The applet User Admin shows the user PIMQA we’re using in this sample as an external user, which means the user is registered in Fusion Identity Management. Only a few UCM built-in users are marked as local. Usually it’s neither necessary nor recommended to touch any of these entries via this applet.

UCM_UserAdminApplet

The screenshot below shows more details of our sample user.

UCM_UserAdminApplet2

Furthermore, it might be useful to show where the UCM Account scm$/item$/import$ (see usage in the section above) is defined, as it will improve the understanding of the underlying security concepts.

Entries can be found via the Authorization and Policy Management (APM) page in Fusion Product Hub via a link like this:

https://<fusion_product_hub_server>:<port>/apm

You must use privileged user credentials like FAAdmin for a successful login to APM.

Once logged in, we can search for the UCM Account details as shown below.

Search for Application Roles -> IDCCS -> Resources

APM_UCM_Account

The next step is to search for resources starting with scm in the Search Resources page as shown below.

APM_UCM_Account2

Open the detail page for scm$/item$/import$ with the results as shown below and click the button Find Policies.

APM_UCM_Account3

In the Policies overview page we find the attached policies.

APM_UCM_Account4

Opening these policies will show the details about document permissions per resource as defined for Item Import in Fusion Content Server.

APM_UCM_Account5

Programmatic Interface for Item Upload to Fusion Content Server

As an alternative to manual uploads of item data we can use a publicly available toolset called WebCenter Content Document Transfer Utility for Fusion Apps. It can be downloaded from OTN as shown below.

WebCenterTransferUtility_Download

This toolset provides some Java programs to be used from a command line interface. Such an interface is useful when running periodic jobs in customer environments to upload new or changed item data without human interaction.

A processing pipeline could look like this:

  • Extract item data from a local system and transform it into a CSV format as expected by Fusion Product Hub
  • Put the file into a staging area
  • Create a manifest file or reuse an existing manifest file in case file names remain the same on Fusion Content Server
  • Run the command line utility to upload file(s) (see the Java sketch after this list)
  • Initiate further processing to load the item data by calling a web service to run an Import Job
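A minimal Java sketch of the upload step is shown below. It simply wraps the RIDC command line documented further down in a ProcessBuilder call so it can run from a locally scheduled job; the jar location, file names and connection.properties values are assumptions that must match your environment.

import java.io.File;
import java.util.Arrays;
import java.util.List;

// Sketch of the "run the command line utility" step of the pipeline.
public class ItemUploadStep {
    public static void main(String[] args) throws Exception {
        String java = System.getProperty("java.home") + File.separator + "bin" + File.separator + "java";
        // Mirrors the RIDC UploadTool command line shown later in this article.
        List<String> cmd = Arrays.asList(java,
            "-jar", "./oracle.ucm.fa_client_11.1.1.jar", "UploadTool",
            "--propertiesFile=./connection.properties",
            "--primaryFile=ItemManifest.csv",
            "--dDocTitle=ItemManifest.csv",
            "--k0=dCollectionPath", "--v0=/Contribution Folders/PIM/",
            "-dDocAccount=/scm$/item$/import$");

        Process p = new ProcessBuilder(cmd).inheritIO().start();
        int exitCode = p.waitFor();
        if (exitCode != 0) {
            throw new IllegalStateException("Upload failed with exit code " + exitCode);
        }
        // Next step: trigger the scheduled import job via the web service (see Part III).
    }
}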

Recently some related articles have been published on Fusion Apps Developer Relations Blog like this post. Please refer to those sources if you want to learn more about the tool usage.

In this section we will cover the tool usage as required for Fusion Product Hub item load.

The transfer utility provides two different interfaces to connect to Fusion Content Server:

  • The RIDC-based transfer utility as a feature-set Java library that encapsulates a proprietary protocol (ridc) to Fusion Content Server via HTTPS.
  • A generic soap-based transfer utility using the Oracle JRF supporting libraries for JAX/WS over HTTPS to communicate with the Fusion Content Server.

After downloading and extracting the transfer utility, two sub-directories will exist: ridc and generic. Details about the specific command line parameters and connection information can be found below.

In addition to these two sub-directories, a file WebCenter Content Document Transfer Utility Readme.html is extracted with comprehensive documentation about tool usage, troubleshooting and additional options.

Upload via the RIDC Java library

Using a RIDC connection might be the preferred option for those customers who have no FMW products in place. The Java library oracle.ucm.fa_client_11.1.1.jar (found in the sub-directory ridc after extraction) can be used standalone and doesn’t require any other libraries in addition to a JDK with a minimum release of 1.7 (1.6 for JRockit).

Connection information can be located in a configuration file connection.properties with content like this:

url=https://<fusion_product_hub_server>:<port>/cs/idcplg
username=<IntegrationUser>
password=<Password>
policy=oracle/wss_username_token_client_policy

In production environments it’s strongly recommended to avoid saving passwords in clear text in configuration files like this. Putting them into wallets and reading values from there would be the preferred choice.

A command line running the document upload via RIDC would look like this (“\” marks a continuation of the same line where lines are too long):

${JAVA_HOME}/bin/java \
-jar ./oracle.ucm.fa_client_11.1.1.jar UploadTool \
--propertiesFile=./connection.properties \
--primaryFile=ItemManifest.csv \
--dDocTitle="ItemManifest.csv" --k0=dCollectionPath \
--v0="/Contribution Folders/PIM/" \
-dDocAccount="/scm$/item$/import$"

A successful execution will result in an output like this:

Oracle WebCenter Content Document Transfer Utility
Oracle Fusion Applications
Copyright (c) 2013-2014, Oracle. All rights reserved.
* Custom metdata set: "dCollectionPath"="/Contribution Folders/PIM/".
Performing upload (CHECKIN_UNIVERSAL) ...
Upload successful.
[dID=76 | dDocName=UCMFA000076]

The uploaded document from the example above resides in the Fusion Content Server with a global document id 76 and an internal document name UCMFA000076. For further processing we’d rather locate it by its logical file information /Contribution Folders/PIM/ItemManifest.csv.

Using the RIDC connection is the obvious first choice for cloud customers who are not using any Oracle Fusion Middleware runtime environment. However, Fusion Product Hub on-premise customers can use this connection type too.

Upload via the generic Java library

The generic approach connects to a web service in Fusion Content Server to perform a file upload. After extraction, the folder generic contains a Java library oracle.ucm.fa_genericclient_11.1.1.jar. For this type of connection the connection.properties file points to a different URL as shown below:

url=https://<fusion_product_hub_server>:<port>/idcws
username=<IntegrationUser>
password=<Password> 
policy=oracle/wss_username_token_client_policy

It’s important to mention that the tool can’t run standalone, as we must add an additional library from a WebLogic Server runtime directory: jrf-client.jar. This can be found in the WLS directory oracle_common/modules/oracle.jrf_11.1.1. No other libraries need to be added to the classpath, as the remaining Oracle JRF Web Service libraries are referenced from jrf-client.jar.

The command line using the generic Java library would look like this:

${JAVA_HOME}/bin/java -classpath \
<WLS_HOME>/oracle_common/modules/oracle.jrf_11.1.1/jrf-client.jar:./oracle.ucm.fa_genericclient_11.1.1.jar \
oracle.ucm.idcws.client.UploadTool \
-propertiesFile=./connection.properties \
--primaryFile=/home/oracle/CASE_1-CsvMap.csv \
--dDocTitle="Product Item Import 0001" \
--k0=dCollectionPath --v0="/Contribution Folders/PIM/" \
-dDocAccount="/scm$/item$/import$"

The output looks identical to the RIDC version:

Oracle WebCenter Content Document Transfer Utility
Oracle Fusion Applications
Copyright (c) 2013-2014, Oracle. All rights reserved.
* Custom metdata set: "dCollectionPath"="/Contribution Folders/PIM/".
Performing upload (CHECKIN_UNIVERSAL) ...
Upload successful.
[dID=77 | dDocName=UCMFA000077]

As mentioned above, using this connection type requires a fully installed WebLogic runtime environment and uses a standard web service interface.

Logging option

In order to adjust the level of logging information, the log level can be controlled through a properties file such as logging.properties that can be added to the runtime Java call by option

-Djava.util.logging.config.file=./logging.properties

The content of this file could look like this:

handlers=java.util.logging.ConsoleHandler
.level=FINEST
java.util.logging.ConsoleHandler.formatter=java.util.logging.SimpleFormatter
java.util.logging.ConsoleHandler.level=FINEST
oracle.j2ee.level=FINEST

As this is a standard Java feature the full list of values looks as follows:

  • SEVERE (highest value – least logging)
  • WARNING
  • INFO
  • CONFIG
  • FINE
  • FINER
  • FINEST (lowest value – most logging)

Using the logging features might help in cases where content transfer utility runs into issues with the connections and/or the upload of files.

Optional: Managing SSL Self-Signed Certificates

It’s strongly recommended to use the HTTPS protocol when connecting to Fusion Content Server despite the fact that plain HTTP connections would technically work as well. In scenarios using Fusion Product Cloud the server certificates are signed by well-known authorities (trust centers), whose root certificates are normally part of JDK or browser distributions, so no special certificate handling is required.

When using Fusion Product Hub on-premise there might be situations where self-signed certificates are used for SSL. When running Java programs these certificates must be imported into the client’s certificate store. Here is a short explanation of how to manage this:

The connection to a server with a self-signed certificate will produce a warning in web browsers. It’s possible to take a closer look at the certificate details as shown in the Firefox screenshot below:

  • Warning page appears stating that a connection can’t be trusted
  • Click on “I understand the risks”
  • Click on “Add exception …”
  • Click on “View” and as a result the certificate details appear like shown below

DownloadSSLSelfSignedCert2

Usually unknown certificates shouldn’t be trusted, but in this special case we are the issuers and make an exception.

We can download the certificate via the following access path in Firefox:

  • Click on tab “Details” as shown in screenshot above
  • Click on “Export … ” as shown in screenshot below
  • In File Save dialog choose “X.509 Certificate (DER)”
  • Save the file in a folder

DownloadSSLSelfSignedCert

Once saved, we must import this certificate into the certificate store of the Java runtime that we use to run the content transfer utility. The command line looks like this:

${JAVA_HOME}/bin/keytool -importcert \
-alias <name_referring_to_ssl_server> \
-keystore ${JAVA_HOME}/jre/lib/security/cacerts \
-file <path_to_der_certificate>

When asked for a password, the default value is “changeit” unless it has been changed.

Part III: Initiating the Item Data Load

In the previous section the provisioning of files to Fusion Content Server was explained. The final step to import these items into Fusion Product Hub is running the loader and import job. This job runs seamlessly and includes the following steps:

  • Transfer item data from the file in Fusion Content Server to Item Interface Tables
  • Run a batch import from Item Interface Tables to Item tables

Step 2 above is identical to the Item Batch Import that has existed in Fusion Product Hub for a while, including exception handling and job status reporting.

Customers have the option to initiate the scheduled job (occasional triggering) via the UI or to wrap it in scripts for automated, periodic runs.

Manual Item Load via Fusion Product Hub UI

Initiating a manual item load is pretty straightforward, as users just have to follow the standard dialog to trigger a job. For this purpose use the Scheduled Processes menu entry in Fusion Product Hub Navigator.

ScheduleProcessesNavigator

Search for a job called Schedule Product Upload Job as shown below.

NewScheduleJob

Provide parameters as required:

  • Manifest File Path: file location of the manifest file as uploaded to Fusion Content Server previously
  • Assigned Spoke System: Spoke system as defined in the functional setup previously (see section I)

Once the job parameters have been provided, the job can be submitted for immediate execution or scheduled for a later time.

NewScheduleJobParameters

The execution status of jobs can be monitored via the same UI as shown below. Once finished, the items are transferred from the CSV file into the system and can be found in the standard Fusion Product Hub UI for further processing.

NewScheduleJobProgress

Programmatic execution of the loader job from command line

As with uploading files into Fusion Content Server, the triggering of loader jobs can be initiated by Java programs. For this purpose it’s recommended to use Oracle JDeveloper 12c, which can be downloaded from OTN as shown below.

JDev_Download

It’s not necessary to download more technology products to run the programmatic interface for job scheduling.

Create WebService Client to initiate a scheduling job

Technically the job scheduler can be accessed via an existing web service interface. Oracle JDeveloper 12c provides a convenient way to generate the web service client code via a wizard.

The series of screenshots below will document the step-by-step procedure to generate the Java code. Once done, we have a skeleton of Java code and configuration files that require some minor extensions in order to execute the web service.

As a very first step create a Java application with a custom project. Then choose Create Web Service Client and Proxy via “New …” and “Gallery …”.

JDeveloper2

As shown in the screenshots below, we must provide the information for the web service we intend to invoke in our code. For the Item Loader it has the following format:

https://<fusion_product_hub_server>:<port>/finFunShared/FinancialUtilService?wsdl

JDeveloper3

Once provided, click Next and the wizard will start determining the web service configuration by introspecting the provided web service WSDL.

JDeveloper4

As shown in the screenshot below, there is a choice to enter a custom root package for the generated web service client code. The default code will use a package name like this:
com.oracle.xmlns.apps.financials.commonmodules.shared.financialutilservice

In most cases customers want to reflect their own package naming conventions, and this screen is where that is configured.

JDeveloper5In next step, as shown screenshot below, it is not necessary to change any information and user can enter Next.

JDeveloper6The next dialog will give users a choice to configure the client using a synchronous or asynchronous method. Scheduling a job is a synchronous activity and therefore its not required to generate asynchronous Java methods.

JDeveloper7After reading and analyzing the web service a WSM policy oracle/wss11_username_token_with_message_protection_server_policy has been found on the server side. The code generator uses the corresponding client policy oracle/wss11_username_token_with_message_protection_client_policy to fulfill the server requirements. This value must be accepted, as the communication between client and server will fail otherwise.

JDeveloper8On the next dialog screen no changes are required and the user can press Next.

JDeveloper9Last screenshot of this code generation dialog will show a summary of methods being generated, as they will fit with the methods found in the web service wsdl. After clicking finish the code generation starts and might take up to one or two minutes.

JDeveloper10

After code generation finishes, the development environment will look like the screenshot below.

JDeveloper

This code generation saves a tremendous amount of time compared to programming the client manually. It is worth mentioning that some parts of the generated code are under the control of JDeveloper and might be overwritten if configuration changes happen. Developers must be careful to add their own code only in the sections foreseen and indicated in the code.

The generated code doesn’t provide any details about:

  • Authentication by providing credential
  • Message encryption as required by the WSM policy
  • Web service operations to be initiated by this Java code – here Java method submitESSJobRequest() for web service operation submitESSJobRequest
  • Parameters to be passed to these operation calls

All the additions above are manual tasks to be performed by programmers.

Below is a piece of Java code that shows a working example, kept simple for better readability. For production use we recommend the following improvements (a small configuration sketch follows the list):

  • Put the keystore details and similar settings into configuration files
  • Do the same for the username
  • Store passwords in a wallet
  • Important: as mentioned, the code is under the control of a code generator. To avoid unintentional code changes it is strongly recommended to create your own class by copying the generated class.
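
One simple way to address the first two recommendations is to move the connection settings out of the Java source into a properties file. The following is a minimal sketch of that idea; the file name wsclient.properties and the property keys are assumptions and not part of the generated client, and passwords should still preferably go into a wallet rather than a plain text file.

package wsclient.mycompany.com;

import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;

// Hypothetical helper that loads connection settings from a properties file,
// e.g. keystore.location=/home/oracle/FusionClient.jks
public class ClientConfig {
  private final Properties props = new Properties();

  public ClientConfig(String path) throws IOException {
    try (FileInputStream in = new FileInputStream(path)) {
      props.load(in);
    }
  }

  public String get(String key) {
    return props.getProperty(key);
  }
}

In the client class the hard-coded values can then be replaced by calls such as cfg.get("keystore.location") or cfg.get("ws.username").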

Generated and modified Java File FinancialUtilServiceSoapHttpPortClient.java

package wsclient.mycompany.com;

import com.sun.xml.ws.developer.WSBindingProvider;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

import oracle.webservices.ClientConstants;

import weblogic.wsee.jws.jaxws.owsm.SecurityPoliciesFeature;

// This source file is generated by Oracle tools.
// Contents may be subject to change.
// For reporting problems, use the following:
// Generated by Oracle JDeveloper 12c 12.1.3.0.0.1008
public class FinancialUtilServiceSoapHttpPortClient {
  public static void main(String[] args) {
      FinancialUtilService_Service financialUtilService_Service = 
            new FinancialUtilService_Service();

      // Configure security feature
      SecurityPoliciesFeature securityFeatures = 
            new SecurityPoliciesFeature(new String[] {
"oracle/wss11_username_token_with_message_protection_client_policy"
            });
      FinancialUtilService financialUtilService =
      financialUtilService_Service.getFinancialUtilServiceSoapHttpPort(
                                      securityFeatures);
      // Add your code to call the desired methods.
      WSBindingProvider wsbp = (WSBindingProvider) financialUtilService;
      Map<String, Object> reqCon = wsbp.getRequestContext();

      reqCon.put(WSBindingProvider.USERNAME_PROPERTY, "IntegrationUser");
      reqCon.put(WSBindingProvider.PASSWORD_PROPERTY, "Password");

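      // Keystore settings required by the message protection part of the OWSM policy;
      // see the "Managing WS Security" section below for how these keystore entries are created.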
      reqCon.put(ClientConstants.WSSEC_KEYSTORE_TYPE, "JKS");
      reqCon.put(ClientConstants.WSSEC_KEYSTORE_LOCATION, 
                                "/home/oracle/FusionClient.jks");
      reqCon.put(ClientConstants.WSSEC_KEYSTORE_PASSWORD, "Welcome1");
      reqCon.put(ClientConstants.WSSEC_ENC_KEY_ALIAS, "mykey");
      reqCon.put(ClientConstants.WSSEC_ENC_KEY_PASSWORD, "Welcome1");
      reqCon.put(ClientConstants.WSSEC_RECIPIENT_KEY_ALIAS, "mykeys");

      Long jobID = startEssJob(financialUtilService);

      System.out.println("Item Data Import Job started with ID: " +
                     jobID.toString());
    }

  private static Long startEssJob(FinancialUtilService fus) {
      Long essRequestId = new Long(-1);

      try {
          List<String> paramList = new ArrayList<String>();
          // UCM folder and file name
          paramList.add("/Contribution Folders/PIM/ProductLoad.csv"); 
          // Spoke System Code
          paramList.add("PIMDH"); 
          // Product Upload - static value here
          paramList.add("true");
          // Product Hub Portal Flow
          paramList.add("false");
          essRequestId = fus.submitESSJobRequest( 
             "/oracle/apps/ess/scm/productHub/itemImport/",
             "ExtProductUploadSchedulingJobDef", paramList);
        } 
      catch (ServiceException e) {
            e.printStackTrace();
            System.exit(1);
        }

      return essRequestId;
  }
}

Running this Java program from the command line doesn’t require any additional libraries except those coming with a JDeveloper installation and a standard JDK.

It is recommended to package all project files into a JAR file via a Deployment Profile. Once done, a sample call of this WebService client would look as follows:

clientJar=<project_dir>/deploy/FinancialUtilService-Client.jar
jdevDir=<jdev12c_install_dir>
modulesDir=${jdevDir}/oracle_common/modules

${JAVA_HOME}/bin/java \
-server \
-Djava.endorsed.dirs=${jdevDir}/oracle_common/modules/endorsed \
-classpath ${clientJar}:\
${jdevDir}/wlserver/server/lib/weblogic.jar:\
${modulesDir}/oracle.jrf_12.1.3/jrf.jar:\
${modulesDir}/oracle.toplink_12.1.3/eclipselink.jar:\
${modulesDir}/oracle.toplink_12.1.3/org.eclipse.persistence.nosql.jar:\
${modulesDir}/oracle.toplink_12.1.3/org.eclipse.persistence.oracle.nosql.jar:\
${jdevDir}/wlserver/modules/com.bea.core.antlr.runtime_2.0.0.0_3-2.jar:\
${modulesDir}/javax.persistence_2.0.jar:\
${modulesDir}/com.oracle.webservices.fmw.wsclient-impl_12.1.3.jar:\
${modulesDir}/com.oracle.webservices.fmw.jrf-ws-api_12.1.3.jar \
wsclient.mycompany.com.FinancialUtilServiceSoapHttpPortClient

The output of this call will be the Job ID of the scheduled item loader job. Job progress can be monitored via the application UI. There are other web services that can be used for checking job status, but they will be the subject of a future blog post.

Managing WS Security

As mentioned earlier in this blog, the web service policy to run a scheduler job is oracle/wss11_username_token_with_message_protection_server_policy. For our Java client this means that two core requirements must be satisfied, as shown in the code sample above:

  • Passing username/password (here: IntegrationUser/Password)
  • Encrypt the message content

For encryption we must use the public key of the web service, as provided inside the WSDL. An example can be seen in the screenshot below.

WSDLX509Cert

The following steps are required to create an entry in the client side Java Key Store for message encryption:

  • Save the certificate from the WSDL in a certificate file in .pem or .der format
  • Import the certificate to an existing key store or create a new key store by importing the certificate
  • Refer to the certificate entries for message encryption as shown above in the sample class.

Open the WSDL in a web browser and search for the XML tag dsig:X509Certificate. Copy its content and paste it into a text file between a BEGIN CERTIFICATE and an END CERTIFICATE line, as shown below (sample data as copied from our test case):

-----BEGIN CERTIFICATE-----
MIIB+zCCAWSgAwIBAgIEUzRQ0zANBgkqhkiG9w0BAQUFADBCMRMwEQYKCZImiZPyLGQBGRYDY29tMRkwFwYKCZImiZPyLGQBGRYJbXljb21wYW55MRAwDgYDVQQDEwdzZXJ2aWNlMB4XDTE0MDMyNzE2MjQ1MVoXDTE3MDMyNzE2MjQ1MVowQjE
TMBEGCgmSJomT8ixkARkWA2NvbTEZMBcGCgmSJomT8ixkARkWCW15Y29tcGFueTEQMA4GA1UEAxMHc2VydmljZTCBnzANBgkqhkiG9w0BAQEFAAOBjQAwgYkCgYEA1eGZzKgK5ZvSzfDVJ06oDYR0Zn79JQXNpddopXKLTWy87w95hfVv2UFSmK
0+3yjHR/OCpxHERwtBk3Q4jjLVv3nINwKmt/ELnMAm+pa4pAK3wXEzopoxM5phQPp2Mn/iLLNp1OfRI8yzRGowi9K71JcuDhlWJCGRETLxyDgxy3ECAwEAATANBgkqhkiG9w0BAQUFAAOBgQBdLYGuZSbCBpG9uSBiPG+Dz+BMl4KuqSjPjv4Uu
nCWLFobNpb9avSw79nEp4BS42XGaOSrfXA2j+/9mY9k9fUxVV+yP7AeKDKwDMoLQ33Yoi0B2t/0LkUDkYEa3xlluLAavrFvJfxSZH87WanJ2HbNwQWpbfRq1iG1aiji/2g9Tw==
-----END CERTIFICATE-----

Save the file under a name like <fusion_product_hub_server>.der and create an entry or a new Java keystore file as follows:

${JAVA_HOME}/bin/keytool -importcert \
-alias <alias_as_referred_in_Java_code> \
-keystore <my_local_client_trust_store> \
-file <path_to_der_certificate_above>

If the keystore doesn’t exist, it will be created and a new password set (Welcome1 in the Java code sample above). If the file already exists, we must provide the keystore password in order to create the entry.

Once created, the key store will contain an entry like this:

$ ${JAVA_HOME}/bin/keytool -v -list -keystore <my_local_client_trust_store>
Enter keystore password:  

Keystore type: JKS
Keystore provider: SUN

Your keystore contains 1 entry

Alias name: mykeys
Creation date: Feb 24, 2015
Entry type: trustedCertEntry

Owner: CN=fahost2.mycompany.com, OU=defaultOrganizationUnit, O=defaultOrganization, C=US
Issuer: CN=fahost2.mycompany.com, OU=defaultOrganizationUnit, O=defaultOrganization, C=US
Serial number: 5397498d
Valid from: Tue Jun 10 20:08:13 CEST 2014 until: Sat Jun 10 20:08:13 CEST 2017
Certificate fingerprints:
	 MD5:  6D:BA:94:CE:84:E6:C0:A3:CA:A3:F1:8A:39:1E:E9:2E
	 SHA1: C7:3D:62:42:D8:E7:A0:DB:57:93:40:32:A8:54:E0:57:60:F0:8B:FD
	 SHA256: 40:D0:C3:81:CF:5D:6B:61:95:23:27:24:83:8D:1A:34:9F:31:C7:E5:15:BE:49:44:81:E6:D9:34:0A:69:FA:06
	 Signature algorithm name: SHA1withRSA
	 Version: 3


*******************************************
*******************************************

Summary

In this article we provided a 360° view on tasks and activities for automating the use of File Based Loader for Fusion Product Hub. Everything discussed in this article applies to both cloud and on-premise deployments of Fusion Product Hub the same way.

Link collection in order of appearance in this blog:

Fusion HCM Cloud Bulk Integration Automation


Introduction

Fusion HCM Cloud provides a comprehensive set of tools, templates, and pre-packaged integrations to cover various scenarios using modern and efficient technologies. One of these patterns is bulk integration to load and extract data to and from the cloud. The inbound tool is the File Based data Loader (FBL), evolving into HCM Data Loader (HDL). HDL supports data migration for full HR and incremental loads to support co-existence with Oracle Applications such as E-Business Suite (EBS) and PeopleSoft (PSFT). It also provides the ability to bulk load into configured flexfields. HCM Extracts is an outbound integration tool that lets you choose data, then gathers and archives it. This archived raw data is converted into the desired format and delivered to recipients over supported channels.

HCM cloud implements Oracle WebCenter Content, a component of Fusion Middleware, to store and secure data files for both inbound and outbound bulk integration patterns. This post focuses on how to automate data file transfer with WebCenter Content to initiate the loader. The same APIs will be used to download data file from the WebCenter Content delivered through the extract process.

WebCenter Content replaces SSH File Transfer Protocol (SFTP) server in the cloud as a content repository in Fusion HCM starting with Release 7+. There are several ways of importing and exporting content to and from Fusion Applications such as:

  • Upload using “File Import and Export” UI from home page navigation: Navigator > Tools
  • Upload using WebCenter Content Document Transfer Utility
  • Upload programmatically via Java Code or Web Service API

This post provides an introduction, with working sample code, on how to programmatically export content from Fusion Applications to automate the outbound integration process to other applications in the cloud or on-premise. A Service Oriented Architecture (SOA) composite is implemented to demonstrate the concept.

Main Article

Fusion Applications Security in WebCenter Content

The content in WebCenter Content is secured through users, roles, privileges and accounts. The user could be any valid user with a role such as “Integration Specialist.” The role may have privileges such as read, write and delete. The accounts are predefined by each application. For example, HCM uses the accounts /hcm/dataloader/import and /hcm/dataloader/export for inbound and outbound integration, respectively.

Let’s review the inbound and outbound batch integration flows.

Inbound Flow

This is a typical Inbound FBL process flow:

 

HDL_loader_process

The data file is uploaded to WebCenter Content Server either using Fusion HCM UI or programmatically in /hcm/dataloader/import account. This uploaded file is registered by invoking the Loader Integration Service – http://{Host}/hcmCommonBatchLoader/LoaderIntegrationService.

You must specify the following in the payload:

  • Content id of the file to be loaded
  • Business objects that you are loading
  • Batch name
  • Load type (FBL)
  • Imported file to be loaded automatically
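
A registration request specifying the items above might look like the following sketch. The operation name submitBatch and the element names are inferred from the submitEncryptedBatch sample shown later in this post; verify them against the service WSDL before use.

<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
<soap:Body>
<ns1:submitBatch
xmlns:ns1="http://xmlns.oracle.com/apps/hcm/common/batchLoader/core/loaderIntegrationService/types/">
<ns1:ZipFileName>LOCATIONTEST622.ZIP</ns1:ZipFileName>
<ns1:BusinessObjectList>Location</ns1:BusinessObjectList>
<ns1:BatchName>LOCATIONTEST622.ZIP</ns1:BatchName>
<ns1:LoadType>FBL</ns1:LoadType>
<ns1:AutoLoad>Y</ns1:AutoLoad>
</ns1:submitBatch>
</soap:Body>
</soap:Envelope>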

Fusion Applications UI also allows the end user to register and initiate the data load process.

 

Encryption of Data File using Pretty Good Privacy (PGP)

All data files transit over a network via SSL. In addition, HCM Cloud supports encryption of data files at rest using PGP.
Fusion supports the following types of encryption:

  • PGP Signed
  • PGP Unsigned
  • PGPX509 Signed
  • PGPX509 Unsigned

To use this PGP encryption capability, a customer must exchange encryption keys with Fusion so that:

  • Fusion can decrypt inbound files
  • Fusion can encrypt outbound files
  • Customer can encrypt files sent to Fusion
  • Customer can decrypt files received from Fusion

Steps to Implement PGP

  1. Provide your PGP Public Key.
  2. Oracle’s Cloud Operations team provides you with the Fusion PGP Public Key.

Steps to Implement PGP X.509

  1. Self-signed Fusion key pair (default option):
    • You provide the public X.509 certificate
  2. Fusion key pair provided by you:
    • Public X.509 certificate uploaded via Oracle Support Service Request (SR)
    • Fusion key pair for Fusion’s X.509 certificate in a Keystore with Keystore password

Steps for Certificate Authority (CA) signed Fusion certificate

  1. Obtain a Certificate Authority (CA) signed Fusion certificate
  2. Public X.509 certificate uploaded via SR
  3. Oracle’s Cloud Operations exports the Fusion public X.509 CSR certificate and uploads it to the SR
  4. Using the Fusion public X.509 CSR certificate, the customer provides the signed CA certificate and uploads it to the SR
  5. Oracle’s Cloud Operations provides the Fusion PGP Public Certificate to you via an SR

 

Modification to Loader Integration Service Payload to support PGP

The loaderIntegrationService has a new method called “submitEncryptedBatch” which has an additional parameter named “encryptType”. The valid values to pass in the “encryptType” parameter are taken from the ORA_HRC_FILE_ENCRYPT_TYPE lookup:

  • NONE
  • PGPSIGNED
  • PGPUNSIGNED
  • PGPX509SIGNED
  • PGPX509UNSIGNED

Sample Payload

<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
<soap:Body>
<ns1:submitEncryptedBatch
xmlns:ns1="http://xmlns.oracle.com/apps/hcm/common/batchLoader/core/loaderIntegrationService/types/">
<ns1:ZipFileName>LOCATIONTEST622.ZIP</ns1:ZipFileName>
<ns1:BusinessObjectList>Location</ns1:BusinessObjectList>
<ns1:BatchName>LOCATIONTEST622.ZIP</ns1:BatchName>
<ns1:LoadType>FBL</ns1:LoadType>
<ns1:AutoLoad>Y</ns1:AutoLoad>
<ns1:encryptType>PGPX509SIGNED</ns1:encryptType>
</ns1:submitEncryptedBatch>
</soap:Body>
</soap:Envelope>

 

Outbound Flow

This is a typical Outbound batch Integration flow using HCM Extracts:

extractflow

The extracted file can be delivered to the WebCenter Content server. HCM Extracts can generate an encrypted output file. In the Extract delivery options, ensure the following are correctly configured:

  1. Set the HCM Delivery Type to “HCM Connect”
  2. Select an Encryption Mode from the four supported encryption types, or select None
  3. Specify the Integration Name; this value is used to build the title of the entry in WebCenter Content

 

Extracted File Naming Convention in WebCenter Content

The file will have the following properties:

  • Author: FUSION_APPSHCM_ESS_APPID
  • Security Group: FAFusionImportExport
  • Account: hcm/dataloader/export
  • Title: HEXTV1CON_{IntegrationName}_{EncryptionType}_{DateTimeStamp}
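
For example, an extract run for an integration named “MyExtract” with unsigned PGP encryption would produce a title of the form HEXTV1CON_MyExtract_PGPUNSIGNED_{DateTimeStamp}; the exact timestamp format depends on the environment and should be confirmed in your instance.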

 

Programmatic Approach to export/import files from/to WebCenter Content

In Fusion Applications, the WebCenter Content Managed server is installed in the Common domain Weblogic Server. The WebCenter Content server provides two types of web services:

Generic JAX-WS based web service

This is a generic web service for general access to the Content Server. The context root for this service is “/idcws”. For details of the format, see the published WSDL at https://<hostname>:<port>/idcws/GenericSoapPort?WSDL. This service is protected through Oracle Web Services Manager (OWSM). As a result of allowing WS-Security policies to be applied to this service, streaming Message Transmission Optimization Mechanism (MTOM) is not available for use with this service. Very large files (greater than the memory of the client or the server) cannot be uploaded or downloaded.

Native SOAP based web service

This is the general WebCenter Content service. Essentially, it is a normal socket request to Content Server, wrapped in a SOAP request. Requests are sent to the Content Server using streaming Message Transmission Optimization Mechanism (MTOM) in order to support large files. The context root for this service is “/idcnativews”. The main web service is IdcWebRequestPort and it requires JSESSIONID, which can be retrieved from IdcWebLoginPort service.

The Remote Intradoc Client (RIDC) uses the native web services. Oracle recommends that you do not develop a custom client against these services.

For more information, please refer to “Developing with WebCenter Content Web Services for Integration.”

Generic Web Service Implementation

This post provides a sample of implementing generic web service /idcws/GenericSoapPort. In order to implement this web service, it is critical to review the following definitions to generate the request message and parse the response message:

IdcService:

IdcService is a predefined attribute of the service node that names the service to be executed, for example, CHECKIN_UNIVERSAL, GET_SEARCH_RESULTS, GET_FILE, CHECKOUT_BY_NAME, etc.

User

User is a subnode within a <service> and contains all user information.

Document

Document is a collection of all the content-item information and is the parent node of all the data.

ResultSet

ResultSet is a typical row/column based schema. The name attribute specifies the name of the ResultSet. It contains a set of Row subnodes.

Row

Row is a typical row within a ResultSet, which can have multiple <row> subnodes. It contains a set of Field objects.

Field

Field is a subnode of either <document> or <row>. It represents document or user metadata such as content Id, Name, Version, etc.

File

File is a file object that is either being uploaded or downloaded

For more information, please refer to Configuring Web Services with WSDL, SOAP, and the WSDL Generator.

Web Service Security

The genericSoapPort web service is protected by Oracle Web Services Manager (OWSM). In Oracle Fusion Applications cloud, the OWSM policy is: “oracle/wss11_saml_or_username_token_with_message_protection_service_policy”.

In your SOAP envelope, you will need the appropriate “wsee” headers. This is a sample:

<soapenv:Header>
<wsse:Security xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd" soapenv:mustUnderstand="1">
<saml:Assertion xmlns:saml="urn:oasis:names:tc:SAML:1.0:assertion" MajorVersion="1" MinorVersion="1" AssertionID="SAML-iiYLE6rlHjI2j9AUZXrXmg22" IssueInstant="2014-10-20T13:52:25Z" Issuer="www.oracle.com">
<saml:Conditions NotBefore="2014-10-20T13:52:25Z" NotOnOrAfter="2015-11-22T13:57:25Z"/>
<saml:AuthenticationStatement AuthenticationInstant="2014-10-20T14:52:25Z" AuthenticationMethod="urn:oasis:names:tc:SAML:1.0:am:password">
<saml:Subject>
<saml:NameIdentifier Format="urn:oasis:names:tc:SAML:1.1:nameid-format:unspecified">FAAdmin</saml:NameIdentifier>
<saml:SubjectConfirmation>
<saml:ConfirmationMethod>urn:oasis:names:tc:SAML:1.0:cm:sender-vouches</saml:ConfirmationMethod>
</saml:SubjectConfirmation>
</saml:Subject>
</saml:AuthenticationStatement>
</saml:Assertion>
</wsse:Security>
</soapenv:Header>

Sample SOA Composite

The SOA code provides a sample on how to search for a document in WebCenter Content, extract a file name from the search result, and get the file and save it in your local directory. The file could be processed immediately based on your requirements. Since this is a generic web service with a generic request message, you can use the same interface to invoke various IdcServices, such as GET_FILE, GET_SEARCH_RESULTS, etc.

In the SOA composite sample, two external services are created: GenericSoapPort and FileAdapter. If the service is GET_FILE, then it will save a copy of the retrieved file in your local machine.

Export File

The GET_FILE service returns a specific rendition of a content item, the latest revision, or the latest released revision. A copy of the file is retrieved without performing a check out. It requires either dID (content item revision ID) for the revision, or dDocName (content item name) along with a RevisionSelectionMethod parameter. The RevisionSelectionMethod could be either “Latest” (latest revision of the content) or “LatestReleased” (latest released revision of the content). For example, to retrieve file:

<ucm:GenericRequest webKey="cs">
<ucm:Service IdcService="GET_FILE">
<ucm:Document>
<ucm:Field name="dID">401</ucm:Field>
</ucm:Document>
</ucm:Service>
</ucm:GenericRequest>

Search File

The dID of the content can be retrieved using the service GET_SEARCH_RESULTS. It uses a QueryText attribute in the <Field> node. The QueryText attribute defines the query and must be XML encoded. You can append values for title, content Id, and so on, in the QueryText to refine the search. The syntax for QueryText can be challenging, but once you understand the special character formats, it is straightforward. For example, to search content by its original name:

<ucm:Service IdcService="GET_SEARCH_RESULTS">
<ucm:Document>
<ucm:Field name="QueryText">dOriginalName &lt;starts&gt; `Test`</ucm:Field>
</ucm:Document>
</ucm:Service>

In plain text, the query is dOriginalName <starts> `Test`; the backtick delimiters around the value are mandatory. You can further refine the query by adding more parameters, as in the sketch below.
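
For instance, a query that combines two conditions and sorts the results might look like the following sketch; the <AND> operator, the dDocType field, and the SortField/SortOrder parameters are standard Content Server query constructs, but verify the exact syntax against your instance before relying on it.

<ucm:Service IdcService="GET_SEARCH_RESULTS">
<ucm:Document>
<ucm:Field name="QueryText">dOriginalName &lt;starts&gt; `Test` &lt;AND&gt; dDocType &lt;matches&gt; `Document`</ucm:Field>
<ucm:Field name="SortField">dInDate</ucm:Field>
<ucm:Field name="SortOrder">Desc</ucm:Field>
</ucm:Document>
</ucm:Service>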

This is a sample SOA composite with two external references, genericSoapPort and FileAdapter.

ucmComposite

This is a sample BPEL process flow that demonstrates how to retrieve the file and save a copy to a local directory using File Adapter. If the idcService is GET_SEARCH_RESULTS, then do not save the file. In a real scenario, you will search, check out and start processing the file.

 

ucmBPEL1

The original file name is preserved when copying it to a local directory by passing the header property to the FileAdapter. For example, create a variable fileName and use assign as follows:

1. Get the file name from the response message in your <assign> activity as follows:

<from expression="bpws:getVariableData('InvokeGenericSoapPort_GenericSoapOperation_OutputVariable','GenericResponse','/ns2:GenericResponse/ns2:Service/ns2:Document/ns2:ResultSet/ns2:Row/ns2:Field[@name=&quot;dOriginalName&quot;]')"/>
<to variable="fileName"/>

Please make note of the XPath expression as this will assist you to retrieve other metadata.

2. Pass this fileName variable to the <invoke> of the FileAdapter as follows:

<bpelx:inputProperty name="jca.file.FileName" variable="fileName"/>

Please add the following property manually to the ../CommonDomain/ucm/cs/config/config.cfg file to enable the QueryText syntax: AllowNativeQueryFormat=true
Then restart the managed server.
The typical error otherwise (returned in the StatusMessage field) is: “Unable to retrieve search results. Parsing error at character xx in query….”

Testing SOA Composite:

After the composite is deployed in your SOA server, you can test it either from Enterprise Manager (EM) or using SoapUI. These are the sample request messages for GET_SEARCH_RESULTS and GET_FILE.

The following screens show the SOA composites for “GET_SEARCH_RESULTS” and “GET_FILE”:

searchfile

getfile

Get_File Response snippet with critical objects:

<ns2:GenericResponse xmlns:ns2="http://www.oracle.com/UCM">
<ns2:Service IdcService="GET_FILE">
<ns2:Document>
<ns2:Field name="dID">401</ns2:Field>
<ns2:Field name="IdcService">GET_FILE</ns2:Field>
....
<ns2:ResultSet name="FILE_DOC_INFO">
<ns2:Row>
<ns2:Field name="dID">401</ns2:Field>
<ns2:Field name="dDocName">UCMFA000401</ns2:Field>
<ns2:Field name="dDocType">Document</ns2:Field>
<ns2:Field name="dDocTitle">JRD Test</ns2:Field>
<ns2:Field name="dDocAuthor">FAAdmin</ns2:Field>
<ns2:Field name="dRevClassID">401</ns2:Field>
<ns2:Field name="dOriginalName">Readme.html</ns2:Field>
</ns2:Row>
</ns2:ResultSet>
</ns2:ResultSet>
<ns2:File name="" href="/u01/app/fa/config/domains/fusionhost.mycompany.com/CommonDomain/ucm/cs/vault/document/bwzh/mdaw/401.html">
<ns2:Contents>
<xop:Include href="cid:7405676a-11f8-442d-b13c-f8f6c2b682e4" xmlns:xop="http://www.w3.org/2004/08/xop/include"/>
</ns2:Contents>
</ns2:File>
</ns2:Document>
</ns2:Service>
</ns2:GenericResponse>

Import (Upload) File for HDL

The above sample can also be used to import files into the WebCenter Content repository for inbound integration or other use cases. The service name is CHECKIN_UNIVERSAL; a sketch of such a request follows.
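
A CHECKIN_UNIVERSAL request through the same GenericSoapPort follows the Service/Document/Field structure already shown, with the file contents supplied in a File node (typically as an MTOM attachment). The sketch below is illustrative only: the metadata values are placeholders and the fields required depend on your configuration.

<ucm:GenericRequest webKey="cs">
<ucm:Service IdcService="CHECKIN_UNIVERSAL">
<ucm:Document>
<ucm:Field name="dDocTitle">Location Batch Upload</ucm:Field>
<ucm:Field name="dDocType">Document</ucm:Field>
<ucm:Field name="dSecurityGroup">FAFusionImportExport</ucm:Field>
<ucm:Field name="dDocAccount">hcm/dataloader/import</ucm:Field>
<ucm:File name="primaryFile" href="Location.zip">
<ucm:Contents><!-- file bytes, typically attached via MTOM --></ucm:Contents>
</ucm:File>
</ucm:Document>
</ucm:Service>
</ucm:GenericRequest>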

Summary

This post demonstrates how to secure and automate the export and import of data files in WebCenter Content server implemented by Fusion HCM Cloud. It further demonstrates how integration tools like SOA can be implemented to automate, extend and orchestrate integration between HCM in the cloud and Oracle or non-Oracle applications, either in Cloud or on-premise sites.

The SOA sample code is here.

Development Patterns in Oracle Sales Cloud Application Composer, Part 2


Introduction

In Part 1 of this post (http://www.ateam-oracle.com/development-patterns-in-oracle-sales-cloud-application-composer-part-1), we used the experiences of three tradespersons – a plumber, a carpenter, and an electrician – to make the case that planning is critical before building or remodeling a house. During their brief hypothetical conversation while working on a job site, they convinced each other that formal planning, normally executed in building construction projects by drafting blueprints, ensures that all of the individual sub-systems work together efficiently in the completed house. We used the gist of their conversation to reinforce the necessity of planning in software development, especially when mapping out how all of the individual components of a software development project will work together, as much as possible optimizing the relationships among the various components. This kind of planning is as much a fundamental requirement for successful software development projects as blueprints are for building construction.

Laying out the structural framework for more complex software development projects greatly increases the odds of successful outcomes. This should come as no surprise, but nonetheless planning is often given short shrift or even ignored entirely.  Also generally accepted, but also occasionally ignored, is the practice of not re-inventing the wheel with every new project. With both building construction and software development, it would be redundant to start the planning and design stages of every new project from scratch. Normally, proven and optimized patterns are available to jumpstart planning and design, and they should be utilized whenever possible. In fact, the main goal of Part 1 was to suggest a framework for how global functions and trigger functions could interact in an Oracle Sales Cloud (OSC) Application Composer extensibility project, using the Oracle Documents Cloud Service (DOCS) as a representative integration target.

The plan for Part 2 (even blog posts benefit from plans!) is to continue exploring the relationships between the global function library we started to build and other extensibility artifacts, adding new features to the extensibility project with an additional trigger function, again using the Oracle Documents Cloud Service (now in version 1.1 of the REST API) as the integration target. If they are designed correctly, the global functions should be able to support the addition of new features without major refactoring.

Availability of New REST Resource in DOCS: Folder Sharing

Think of the global function library as similar to the building’s foundation. With the foundation in place, the fundamental means of interacting with Oracle Documents Cloud Service REST services from Sales Cloud is ready to support the superstructure, which in the case of Application Composer usually takes the form of object trigger functions. With working trigger functions (covered in Part 1 of the post) that support three outbound calls to the Oracle Documents Cloud Service – (1) creating a dedicated document folder when a new opportunity is created, (2) renaming the folder if and when the opportunity is renamed, and (3) deleting the folder with its content if the opportunity is deleted — the extensions are functional/usable and can at least be tested end-to-end even if they are not quite ready for a production rollout.

One set of new features added to version 1.1 of the DOCS REST API allows for programmatic sharing of folders and documents. To round out the integration with Sales Cloud, we would like to take advantage of the new feature by writing a trigger function that adds or removes contributors from the DOCS folder set up for the opportunity whenever a new team member is added to or removed from the opportunity. Adding this new trigger function will be an additional test of how well the global functions are designed. If we can implement the trigger function with minimal effort, it is a good sign that the global functions have been built correctly.

Implementing the New Feature

As a refresher, below is the sequence of how the global functions work together when making a generic REST call from an object function or object trigger function:

  1. Prepare request payload if required.  Required format: Map<String, Object>
  2. Call global function: callRest( String restHttpMethod, String urlExtension, Map requestPayload)
    1. Inside callRest function: use map2Json function to convert requestPayload Map to JSON format.
  3. General error checking after function returns (e.g. check that response is returned in the expected format)
  4. Process response: json2Map(responsePayload) converts JSON response to Map
  5. Detailed error checking based on response content
  6. Process converted Map content returned in response as needed (function-specific)

In Sales Cloud Application Composer, the steps necessary for adding folder sharing support can be used as a prototype for adding virtually anything exposed in a REST API. Being able to address multiple sets of requirements is part of the advantage of having well-crafted global functions.

Below are the steps for designing and incorporating the new feature, including details and discussion of options:

  1. Identify the affected Sales Cloud object in the transaction. In this case we know we are working with an Opportunity or something related to an Opportunity (either a parent, a child, or another object with a defined relationship).
  2. Decide on the best-fit trigger event. Normally all but a few of the sixteen or so triggers available can be eliminated.
  3. Create the trigger function and write the code. This step may require a few iterations.
  4. Add error handling code. How this step is implemented is going to depend on whether or not any supporting global functions exist for error handling.

Identify the affected Sales Cloud object in the transaction that will trigger the outbound web service call.  Typically, it is easiest to work from the top down to identify the object for which to write the trigger function. For example, in the folder sharing case, the known top-level object is Opportunity. In OSC Release 8/9 there are eleven candidate objects, consisting of four child objects and seven objects with defined one-to-many relationships. (The lists of child and related objects are shown on the overview page for the top level object.) It is obvious, as it will be in the vast majority of cases that the object of interest is Opportunity Team Member. The trigger will be fired whenever a team member is added or removed from the parent object. In the small number of cases where it may be impossible to isolate just one object, opening up the candidate child or related objects and examining the fields should lead to identifying (or eliminating) the object as the candidate for a trigger function.

Decide on the best-fit trigger event. For Release 8 and Release 9, refer to Table 1 for the available object triggers.

Table 1: Object Trigger Events for Groovy Function Code*

Event: Fires when?
After Create: A new instance of an object is created
Before Modify: A field is initially modified in an existing row
Before Invalidate: A field is initially modified in an existing object, or a child row is created, removed, or modified
Before Remove: An attempt is made to delete an object
Before Insert: Before a new object is inserted
After Insert: After a new object is created
Before Update: Before an existing object is modified in the database
After Update: After an existing object is modified in the database
Before Delete: Before an existing object is deleted in the database
After Delete: After an existing object is deleted in the database
Before Commit: Before pending changes are committed
After Commit: After pending changes are committed
Before Rollback: Before pending changes are rolled back
After Rollback: After pending changes are rolled back
After Changes: After all changes have been posted but before the transaction is committed

*NOTE: Not all trigger events are exposed for every object. Minor variations exist across application containers and object types.

There are a number of behavioral caveats around the use of trigger events. Primarily, selecting the trigger that makes the most sense in the context of whether or not data updates are occurring in the trigger function code will dictate the correct event to use. For example, if a function is updating a field value it does not make any sense at all to do that in any of the “After…” events, as the main database transaction will have taken place already. From a performance perspective, this will force another set of transactions to the database, which is bad enough, but to add insult to injury, all validation code will run another time needlessly.  In the worst case, triggers may be called repeatedly, which may result in an endless loop (if App Composer did not have safeguards in place to prevent this from happening).

In some cases business logic will help make an informed choice of the best trigger event. For example, it makes little sense to add an opportunity team member to a DOCS folder as a contributor unless it is certain that the database transaction which adds or deletes the team member completes successfully. Since the team member trigger function is not making any data updates, it is not only safe, but also logical, to use one of the “After…” events.

Create the trigger function and write the code. Obviously this step will probably take up the bulk of the effort. To ease the amount of work required, look at what the global function(s) require for input parameters as well as what the global functions return. Structurally and functionally, that discovery process, in conjunction with the business need, will dictate a large part of what the trigger function needs to accomplish.

Below is the code for the new trigger function that adds a folder contributor:

println 'Entering AddFolderContributorTrigger'
def docFolderGuid = nvl(Opportunity?.DocFolderGuid_c, '')
if (docFolderGuid) {
  def restParamsMap = adf.util.getDocCloudParameters()
  // prepare URL extension
  def urlExt = '/shares/' + docFolderGuid
  // prepare request payload  
  def userGUID = adf.util.getDocCloudServiceUserGUID(Name)
  def reqPayload = [userID:(userGUID), role:'Contributor', message: 'adding you to Opportunity folder']
  // make REST call (this is POST method) and save response payload 
  def respPayload = adf.util.callRest('POST', urlExt, reqPayload) 
  // convert JSON to Map for ease of handling individual attributes 
  def respMap = adf.util.json2Map(respPayload) 
  //TODO: better error checking required here 
  def errorCode = respMap.errorCode 
  if (errorCode != 0) { 
    // error occurred 
  } else { 
    println 'Team member successfully added as contributor'
  } 
} else { 
  println 'Opportunity folder has not been created for ' + Opportunity?.Name 
} 
println 'Exiting AddFolderContributorTrigger'

By leveraging the global functions, the object trigger script to add a contributor to a DOCS folder when a new opportunity team member is added is less than a dozen lines of code. The first block of code, after checking to see if a DOCS folder exists, obtains a DOCS user GUID by querying the service with a REST call. Then a URL extension string is built, a Map of required key:value pairs is populated, and both are fed to the global callRest function. The JSON response from the function is converted to a Map and rudimentary error checking is performed.

Below is the code for the new trigger function that removes an existing folder contributor:

println 'Entering RemoveFolderContributorTrigger'
def docFolderGuid = nvl(Opportunity?.DocFolderGuid_c, '')
if (docFolderGuid) {
  def restParamsMap = adf.util.getDocCloudParameters()
  // prepare URL extension
  def urlExt = '/shares/' + docFolderGuid + '/user'
  // prepare request payload
  def userGUID = adf.util.getDocCloudServiceUserGUID(Name)
  def reqPayload = [userID:(userGUID), role:'Contributor', message: 'removing you from Opportunity folder']
  // make REST call (this is DELETE method) and save response payload 
  def respPayload = adf.util.callRest('DELETE', urlExt, reqPayload) 
  // convert JSON to Map for ease of handling individual attributes 
  def respMap = adf.util.json2Map(respPayload) 
  //TODO: better error checking required here 
  def errorCode = respMap.errorCode 
  if (errorCode != 0) { 
    // error occurred 
  } else { 
    println 'Team member successfully removed from DOCS folder'
  }
} else { 
    println 'Opportunity folder has not been created for ' + Opportunity?.Name 
} 
println 'Exiting RemoveFolderContributorTrigger'

The script to remove a folder contributor is also less than a dozen lines of code, and relies upon the global functions in the same way as the add contributor script.  Obviously, the REST DELETE method is specified instead of using a POST, as per the DOCS REST specifications.

One additional function to obtain a specific user GUID, or unique id, from DOCS is needed.  This function takes a search string, representing a user name, as input, and after making a REST call into the DOCS users resource, returns the user GUID.  Below is the code for the function:

println 'Entering getDocCloudServiceUserGUID'
def returnGUID = ''
// prepare URL extension
def urlExt = '/users/items?info=' + searchString
// no request payload
def reqPayload = [:]
// make REST call (this is GET method) and save response payload
def respPayload = adf.util.callRest('GET', urlExt, reqPayload)
// convert JSON to Map for ease of handling individual attributes
def respMap = adf.util.json2Map(respPayload)
//TODO: better error checking required here
def errorCode = respMap.errorCode
if (errorCode != 0) {
   // error occurred
   println 'DocCloudService error; errorCode ' + errorCode
} else {
   // get user GUID
   returnGUID = respMap.items[0].get('id')
}
println 'Exiting getDocCloudServiceUserGUID'
return returnGUID

It may make the most sense to create this as an object function under the Opportunity object, or perhaps as a global function.  The differences are minor, and function location is a matter of developer preference.

Add error handling code. Given the simple integration architecture set up for this example — a SaaS Sales Cloud instance making REST calls into a PaaS Documents Cloud Service instance — admittedly there are not many options available, other than reporting that something bad happened, when unexpected errors occur at runtime. In an environment where user actions – for example saving a new opportunity – trigger synchronous outbound web service calls, interrupting the user experience by blocking the database transaction may not be optimal.

The error handling options are few: (1) continue with the Sales Cloud transaction, in this case completing the create or edit of an Opportunity object, (2) back out of the Sales Cloud transaction if any failures are detected in the web service calls, or (3) take a hybrid approach and give the user a certain degree of control over what to do after an error.  Due to the non-critical nature of the transactions between Sales Cloud and DOCS in this example, reporting the error and moving on suffice.  If there is a need to create a DOCS folder for an Opportunity after the fact, it would be possible to create an Action button that could call into the same global functions with the same logic as the object trigger functions.

Summary

Planning out what work is done in global functions and what gets done in object trigger scripts, if done correctly, can lead to major efficiencies when adding new features to an existing extensibility project. This example used existing global functions that make REST calls from Sales Cloud to Documents Cloud Service to implement support for maintaining a group of DOCS folder contributors as team members are added or removed from the Opportunity team. Due to prior planning and following guidelines laid out in Part 1 of this post, object trigger functions were extremely lightweight and were added to the extensibility project with minimal effort.


Managing Oracle Documents Cloud Service files via curl commands


 

Oracle Documents Cloud Service allows users to keep their files in the cloud and allows organizations to centralize their data. This blog post covers how to manage your files from a terminal, whether it is running on your MacBook laptop, Windows desktop, or Linux server. Command-line tools are simple applications for uploading content into the cloud.

I will cover how to use curl, a command-line tool for transferring data, to issue HTTP/HTTPS requests that interact with the Oracle Documents Cloud Service REST API File Resource.

 

curl is free and open-source software. It supports a range of internet protocols including HTTP, HTTPS, FTP, and FTPS. It can be downloaded from http://curl.haxx.se/download.html

The Oracle Documents Cloud Service REST API File Resource documentation is available at: http://docs.oracle.com/cloud/latest/documentcs_welcome/WCCCD/GUID-2A7675E9-536D-47FD-B761-DD1881ADBC7E.htm#WCCCD3763

 

Find below a list of operations allowed on the File Resource:

 

1) Get File Metadata (GET)

FileID for the target document is passed as part of the URL

curl -u <username>:<password> https://<ORACLE-DOCS-SERVER>/documents/api/1.1/files/D7FA6B2DFA043CF6C8F460BAT0000DEFAULT00000000

 

2) Rename File (PUT)

FileID for the target document is passed as part of the URL and the new filename is passed as a JSON argument

curl -u <username>:<password> -X PUT -d "{\"name\":\"renamed_document.pdf\"}" https://<ORACLE-DOCS-SERVER>/documents/api/1.1/files/D7FA6B2DFA043CF6C8F460BAT0000DEFAULT00000000

 

3) Copy File (POST)

FileID for the source document is passed as part of the URL and the FolderID for the destination is passed as a JSON argument

curl -u <username>:<password> -X POST -d "{\"destinationID\":\"FD00F625F56050BA15CF567AT0000DEFAULT00000000\"}" https://<ORACLE-DOCS-SERVER>/documents/api/1.1/files/D7FA6B2DFA043CF6C8F460BAT0000DEFAULT00000000/copy

 

4) Delete File (DELETE)

FileID for the document to be deleted is passed as part of the URL. Note that this file will be placed in the Trash bin and will not be removed from the system until it is purged from the Trash.

curl -u <username>:<password> -X DELETE https://<ORACLE-DOCS-SERVER>/documents/api/1.1/files/D6178E2C7B4ACAE15E179393T0000DEFAULT00000000

 

5) Upload File (POST)

This request passes a two-part multipart/form-data payload via the curl -F option: the FolderID for the destination as a JSON argument and the local file to be uploaded

curl -u <username>:<password> -X POST -F "jsonInputParameters={\"parentID\":\"F5F20CC273D80927A89D0701T0000DEFAULT00000000\"}" -F "primaryFile=@sample_document.pdf" https://<ORACLE-DOCS-SERVER>/documents/api/1.1/files/data

 

6) Upload File Version (POST)

FileID for the target document is passed as part of the URL and the new version of the document is passed as a multipart via the curl -F option

curl -u <username>:<password> -X POST -F "primaryFile=@sample_document.pdf" https://<ORACLE-DOCS-SERVER>/documents/api/1.1/files/DE21B5568A9F9A27B16CBA00T0000DEFAULT00000000/data

 

7) Download File (GET)

FileID for the target document is passed as part of the URL

curl -u <username>:<password> https://<ORACLE-DOCS-SERVER>/documents/api/1.1/files/D7FA6B2DFA043CF6C8F460BAT0000DEFAULT00000000/data

 

8) Get File Versions (GET)

FileID for the target document is passed as part of the URL

curl -u <username>:<password> https://<ORACLE-DOCS-SERVER>/documents/api/1.1/files/DE21B5568A9F9A27B16CBA00T0000DEFAULT00000000/versions

 

The above operations were tested on a Windows 7 laptop using curl version 7.42.0 (x86_64-pc-win32).
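
Because each operation is a plain HTTP call, the commands above can be combined in simple scripts. The sketch below uploads a file and then lists its versions; it assumes a Unix-like shell, that the upload response contains an "id" attribute for the new file, and that the placeholder server, credentials, folder ID, and file name are replaced with real values.

#!/bin/sh
DOCS="https://<ORACLE-DOCS-SERVER>/documents/api/1.1"
CRED="<username>:<password>"
FOLDER="F5F20CC273D80927A89D0701T0000DEFAULT00000000"

# Upload a local file into the target folder
RESPONSE=$(curl -s -u "$CRED" -X POST \
  -F "jsonInputParameters={\"parentID\":\"$FOLDER\"}" \
  -F "primaryFile=@sample_document.pdf" \
  "$DOCS/files/data")

# Crude extraction of the new FileID from the JSON response; adjust if the payload differs
FILE_ID=$(echo "$RESPONSE" | sed -n 's/.*"id":"\([^"]*\)".*/\1/p' | head -1)
echo "Uploaded file id: $FILE_ID"

# List the versions of the file just uploaded
curl -s -u "$CRED" "$DOCS/files/$FILE_ID/versions"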

 

Index of WebCenter Content articles

  • WebCenter Content
  • Integrating Oracle Fusion Applications – WebCenter / Universal Content Management (UCM) with Oracle Business Intelligence Cloud Service (BICS)


    Introduction

     

    This article describes how to integrate Oracle Fusion Applications – WebCenter / Universal Content Management (UCM) with Oracle Business Intelligence Cloud Service (BICS). The integration pattern covered shares similarities to those addressed in the previously published A-Team blog on: “Integrating Oracle Fusion Sales Cloud with Oracle Business Intelligence Cloud Service (BICS)”. The motivation behind this article is to provide a fresh perspective on this subject, and to offer an alternative for use cases unable to use OTBI web services to extract Fusion data.

    The solution uses PL/SQL and Soap Web Services to retrieve the Fusion data. It was written and tested on Fusion Sales Cloud R10. That said, it is also relevant to any other Oracle product that has access to WebCenter / UCM – provided that idcws/GenericSoapPort?wsdl is publicly available. The article is geared towards BICS installations on an Oracle Schema Service Database. However, it may also be useful for DbaaS environments.

    The artifacts provided can be used as a starting point to build a custom BICS – WebCenter / UCM adapter. The examples were tested against a small test data-set, and it is anticipated that code changes will be required before applying to a Production environment.

    The article is divided into four steps:

     

    Step One – Create and Activate the Schedule Export Process

    Describes the data staging process – which is configured through “Schedule Export” (accessed via “Setup and Maintenance”). “Schedule Export” provides a variety of export objects for each Fusion module / product family. It walks through creating, editing, scheduling, and activating the “Scheduled Export”. The results of the “Scheduled Export” are saved to a CSV file stored in WebCenter / UCM.

    Step Two – Confirm data is available in WebCenter / UCM

    Verifies that the user can log into Webcenter / UCM and access the CSV file that was created in Step One. The “UCM ID” associated with the CSV file is visible from the WebCenter / UCM Content Server Search. The id is then used in Step Three to programmatically search for the object.

    Step Three – Test GET_SEARCH_RESULTS and GET_FILE Soap Requests

    Outlines how to build the Soap requests that utilizes the public idcws/GenericSoapPort?wsdl (available in Fusion R10). “GET_SEARCH_RESULTS” is used to retrieve the “dID” based on the “UCM ID” of the CSV file (gathered from Step Two). “GET_FILE” is then used to retrieve the file associated the given “dID”. The file is returned as a SOAP attachment.

    Step Four – Code the Stored Procedure

    Provides PL/SQL samples that can be used as a starting point to build out the integration solution. The database artifacts are created through Apex SQL Workshop SQL Commands. The Soap requests are called using apex_web_service.make_rest_request.

    Generally speaking, make_rest_request is reserved for RESTful web services and apex_web_service.make_request for SOAP web services. However, in this case it was not possible to use apex_web_service.make_request, as the data returned was not compatible with the mandatory output type of XMLTYPE. Apex_web_service.make_rest_request has been used as a workaround, as it offers additional flexibility and allows the data to be retrieved as a CLOB.

    For “GET_SEARCH_RESULTS” the non-XML components of the file are removed, and the data is saved as XMLTYPE so that the namespace can be used to retrieve the “dID”.

    For “GET_FILE” the data is kept as a CLOB. The non-CSV components of the file are removed from the CLOB. Then the data is parsed into the database using “csv_util_pkg.clob_to_csv”, which is installed from the Alexandria PL/SQL Utility Library.

     

    Main Article

     

    Step One – Create and Activate the Schedule Export Process

     

    1)    Click Setup and Maintenance.

    Snap0

     2)    Enter “Schedule Export” into the “Search: Tasks” search box.

    Click the arrow to search.

    Snap2

    3)    Click Go to Task.

    Snap3

     

    4)    Click Create.

    Snap4

     

     

     

     

     

     

    5)    Type in Name and Description.

    Click Next.

    Snap1

     

    6)    Click Actions -> Create.

    Snap6

     

     

    7)    Select the desired export object.

    Click Done.

    Snap2

     

     

     

     

     

    8)    Click the arrow to expand the attributes.

    Snap9

     

    9)    Un-check unwanted attributes.

    For this example the following five attributes have been selected:

    a)    Territory Name
    b)    Status Code
    c)    Status
    d)    Type
    e)    Forecast Participation Code

    Snap3 Snap4

    10)   Select Schedule Type = Immediate.

    Click Next.

    Snap10

    11)   Click Activate.

    Snap11

     

    12)   Click refresh icon (top right) until status shows as “Running”.

    Confirm process completed successfully.

    Snap12a

     

    Snap6

    Snap7

    13)   Once the process is complete – Click on “Exported data file” CSV file link.

    Confirm CSV contains expected results.

    Snap8

    Step Two – Confirm data is available in WebCenter / UCM

     

    1)    Login to WebCenter / UCM.

    https://hostname.fs.em2.oraclecloud.com/cs

    2)    Search by Title.

    Search by Title for the CSV file.

    Snap1

     

    3)    Click the ID to download the CSV file.

    4)    Confirm the CSV file contains the expected data-set.

    Snap2

    Step Three – Test GET_SEARCH_RESULTS and GET_FILE Soap Requests

     

    1)    Confirm that the idcws/GenericSoapPort?wsdl is accessible. (Note this is only public in Fusion Applications R10.)

    https://hostname.fs.em2.oraclecloud.com/idcws/GenericSoapPort?wsdl

    Snap15

     

    2)    Launch SOAPUI.

    Enter the idcws/GenericSoapPort?wsdl in the Initial WSDL box.

    Click OK.

    https://hostname.fs.em2.oraclecloud.com/idcws/GenericSoapPort?wsdl

    Snap16

     

    3)    Right Click on the Project.

    Select “Show Project View”.

    Snap17

    4)    Add a new outgoing WSS Configuration called “Outgoing” and a new WSS Entry for “Username”

    a)    Click on the “WS-Security Configurations” tab.

    b)    Click on the + (plus sign) located in the top left.

    c)    Type “Outgoing” in Name.

    d)    Click OK.

    e)    Click on the + (plus sign) located in the middle left. Select Username. Click OK.

    f)    Type in the WebCenter / UCM  user name and password.

    g)    In the Password Type drop down box select “PasswordText”

    Snap18

     

    5)    Add a new WSS Entry for Timestamp

    a)    Click on the + (plus sign) again located in middle left.

    b)    Select Timestamp. Put in a very large number. This is the timeout in milliseconds.

    c)    Close the window.

    Snap19

     

    6)    Click on Request

    Delete the default envelope and replace it with below:

    Replace the highlighted UCM ID with that found in “Step Two – Confirm data is available in WebCenter / UCM”.

    Do not remove the [``] tick marks around the UCM ID.

    For a text version of this code click here.

    <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns:ucm="http://www.oracle.com/UCM">
    <soapenv:Header>
    Right Click here … then remove this text
    </soapenv:Header>
    <soapenv:Body>
    <ucm:GenericRequest webKey="cs">
    <ucm:Service IdcService="GET_SEARCH_RESULTS">
    <ucm:Document>
    <ucm:Field name="QueryText">dDocName &lt;starts&gt; `UCMFA001069`</ucm:Field>
    </ucm:Document>
    </ucm:Service>
    </ucm:GenericRequest>
    </soapenv:Body>
    </soapenv:Envelope>

    7)    Place the cursor in between the <soapenv:Header> tags (i.e. “Right Click here … then remove this text”).

    Right Click -> Select Outgoing WSS -> Select Apply “Outgoing”.

    Snap4

    8)    The previously defined header containing the user name, password, and timeout settings should now be added to the request.

    Remove the “Right Click here … then remove this text” comment.

    Confirm Outgoing WSS has been applied to the correct position.

    Snap5

    9)    Submit the Request (by hitting the green arrow in the top left).

    Snap6

    10)   The Request should return XML containing the results of GET_SEARCH_RESULTS.

    Ctrl F -> Find: dID

    Snap7

     

     

     

     

    11)   Make note of the dID. For example the dID below is “1011”.

    Snap8

     

     

    12)   Right Click on the Request.

    Rename it GET_SEARCH_RESULTS for later use.

    Snap25a

     

     

    Snap25b

     

     

    13)   Right Click on GenericSoapOperation -> Select New Request

    Snap26

     

     

    14)   Name it GET_FILE.

    Snap27

     

     

    15)   Delete the default request envelope and replace with below:

    For a text version of the code click here.

    <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:ucm="http://www.oracle.com/UCM">
    <soapenv:Header>
    Right Click here … then remove this text
    </soapenv:Header>
    <soapenv:Body>
    <ucm:GenericRequest webKey="cs">
    <ucm:Service IdcService="GET_FILE">
    <ucm:Document>
    <ucm:Field name="dID">1011</ucm:Field>
    </ucm:Document>
    </ucm:Service>
    </ucm:GenericRequest>
    </soapenv:Body>
    </soapenv:Envelope>

    16)  (a)   Repeat process of adding Outgoing WSS

    Place the cursor in between the <soapenv:Header> tags (i.e. “Right Click here … then remove this text”).

    Right Click -> Select Outgoing WSS -> Select Apply “Outgoing”.

    The previously defined header containing the user name, password, and timeout settings should now be added to the request.

    Remove the “Right Click here … then remove this text” comment.

    Confirm Outgoing WSS has been applied to the correct position.

    (b)   Submit the Request (green arrow top left).

    An attachment should be generated.

    Click on the Attachments tab (at bottom).

    Double click to open the attachment.

    Snap9

     

    17)   Confirm results are as expected.

    Snap10

    Step Four – Code the Stored Procedure

    1)    Test GET_SEARCH_RESULTS PL/SQL

    (a)    Copy the below PL/SQL code into the Apex -> SQL Workshop -> SQL Commands.

    (b)    Replace:

    (i) Username

    (ii) Password

    (iii) Hostname

    (iv) dDocName i.e. UCMFA001069

    (c)    For a text version of the PL/SQL click here.

    DECLARE
    l_user_name VARCHAR2(100) := 'username';
    l_password VARCHAR2(100) := 'password';
    l_ws_url VARCHAR2(500) := 'https://hostname.fs.us2.oraclecloud.com/idcws/GenericSoapPort?wsdl';
    l_ws_action VARCHAR2(500) := 'urn:GenericSoap/GenericSoapOperation';
    l_ws_response_clob CLOB;
    l_ws_response_clob_clean CLOB;
    l_ws_envelope CLOB;
    l_http_status VARCHAR2(100);
    v_dID VARCHAR2(100);
    l_ws_resp_xml XMLTYPE;
    l_start_xml PLS_INTEGER;
    l_end_xml PLS_INTEGER;
    l_resp_len PLS_INTEGER;
    l_xml_len PLS_INTEGER;
    clob_l_start_xml PLS_INTEGER;
    clob_l_resp_len PLS_INTEGER;
    clob_l_xml_len PLS_INTEGER;
    clean_clob_l_end_xml PLS_INTEGER;
    clean_clob_l_resp_len PLS_INTEGER;
    clean_clob_l_xml_len PLS_INTEGER;
    v_cdata VARCHAR2(100);
    v_length INTEGER;
    BEGIN
    l_ws_envelope :=
    '<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:ucm="http://www.oracle.com/UCM">
    <soapenv:Body>
    <ucm:GenericRequest webKey="cs">
    <ucm:Service IdcService="GET_SEARCH_RESULTS">
    <ucm:Document>
    <ucm:Field name="QueryText">dDocName &lt;starts&gt; `UCMFA001069`</ucm:Field>
    </ucm:Document>
    </ucm:Service>
    </ucm:GenericRequest>
    </soapenv:Body>
    </soapenv:Envelope>';
    apex_web_service.g_request_headers(1).name := 'SOAPAction';
    apex_web_service.g_request_headers(1).value := l_ws_action;
    apex_web_service.g_request_headers(2).name := 'Content-Type';
    apex_web_service.g_request_headers(2).value := 'text/xml; charset=UTF-8';
    l_ws_response_clob := apex_web_service.make_rest_request(
    p_url => l_ws_url,
    p_http_method => 'POST',
    p_body => l_ws_envelope,
    p_username => l_user_name,
    p_password => l_password);
    --dbms_output.put_line(dbms_lob.substr(l_ws_response_clob,24000,1));
    --Tested on a very small CLOB. Less than 32767. If larger may need to slice.
    --dbms_output.put_line(length(l_ws_response_clob));
    --Remove the header as it is not XML
    clob_l_start_xml := INSTR(l_ws_response_clob,'<?xml',1,1);
    clob_l_resp_len := LENGTH(l_ws_response_clob);
    clob_l_xml_len := clob_l_resp_len - clob_l_start_xml + 1;
    l_ws_response_clob_clean := dbms_lob.substr(l_ws_response_clob,clob_l_xml_len,clob_l_start_xml);
    --dbms_output.put_line(l_ws_response_clob_clean);
    --Remove the tail as it is not XML
    clean_clob_l_end_xml := INSTR(l_ws_response_clob_clean,'------=',1,1);
    clean_clob_l_resp_len := LENGTH(l_ws_response_clob_clean);
    clean_clob_l_xml_len := clean_clob_l_end_xml - 1;
    l_ws_response_clob_clean := dbms_lob.substr(l_ws_response_clob_clean,clean_clob_l_xml_len,1);
    --dbms_output.put_line(l_ws_response_clob_clean);
    --Convert CLOB to XMLTYPE
    l_ws_resp_xml := XMLTYPE.createXML(l_ws_response_clob_clean);
    select (cdata_section)
    into v_cdata
    from
    xmltable
    (
    xmlnamespaces
    (
    'http://schemas.xmlsoap.org/soap/envelope/' as "env",
    'http://www.oracle.com/UCM' as "ns2"
    ),
    '//env:Envelope/env:Body/ns2:GenericResponse/ns2:Service/ns2:Document/ns2:ResultSet/ns2:Row/ns2:Field[@name="dID"]'
    passing l_ws_resp_xml
    columns
    cdata_section VARCHAR2(100) path 'text()'
    ) dat;
    dbms_output.put_line('dID:' || v_cdata);
    END;

     (d)    The Results should show the corresponding dID.

    Snap11

    2)    Install the relevant alexandria-plsql-utils

    (a)    Go to: https://github.com/mortenbra/alexandria-plsql-utils

    Snap2

    (b)    Click Download ZIP

    Snap1

(c)    Run these three SQL scripts / packages in this order (a quick verification query follows the list):

    \plsql-utils-v170\setup\types.sql

    \plsql-utils-v170\ora\csv_util_pkg.pks

    \plsql-utils-v170\ora\csv_util_pkg.pkb
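
To confirm the package installed correctly, a quick smoke test along these lines can be run in SQL Commands (a sketch; the package exposes parsed values as generic columns C001, C002, …, and the third argument is assumed to be the number of leading rows to skip, matching how it is called later in this post):

    SELECT c001, c002
    FROM TABLE(csv_util_pkg.clob_to_csv(TO_CLOB('"Col A","Col B"' || CHR(10) || 'alpha,beta'), ',', 1));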

    3)    Create table in Apex to insert data into.

    For a text version of the SQL click here.

    CREATE TABLE TERRITORY_INFO(
    Territory_Name VARCHAR(100),
    Status_Code VARCHAR(100),
    Status VARCHAR(100),
    Type VARCHAR(100),
    Forecast_Participation_Code VARCHAR(100)
    );

    4)    Test UCM_GET_FILE STORED PROCEDURE

    (a)    Copy the below PL/SQL code into the Apex -> SQL Workshop -> SQL Commands.

    (b)    Replace:

(i) Column names in SQL INSERT as needed

(ii) Header name of the first column, i.e. "Territory Name"

    (c)    For a text version of the PL/SQL click here.

CREATE OR REPLACE PROCEDURE UCM_GET_FILE
(
p_ws_url VARCHAR2,
p_user_name VARCHAR2,
p_password VARCHAR2,
p_dID VARCHAR2
) IS
l_ws_envelope CLOB;
l_ws_response_clob CLOB;
l_ws_response_clob_clean CLOB;
l_ws_url VARCHAR2(500) := p_ws_url;
l_user_name VARCHAR2(100) := p_user_name;
l_password VARCHAR2(100) := p_password;
l_ws_action VARCHAR2(500) := 'urn:GenericSoap/GenericSoapOperation';
l_ws_resp_xml XMLTYPE;
l_start_xml PLS_INTEGER;
l_end_xml PLS_INTEGER;
l_resp_len PLS_INTEGER;
clob_l_start_xml PLS_INTEGER;
clob_l_resp_len PLS_INTEGER;
clob_l_xml_len PLS_INTEGER;
clean_clob_l_end_xml PLS_INTEGER;
clean_clob_l_resp_len PLS_INTEGER;
clean_clob_l_xml_len PLS_INTEGER;
BEGIN
l_ws_envelope :=
'<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:ucm="http://www.oracle.com/UCM">
<soapenv:Body>
<ucm:GenericRequest webKey="cs">
<ucm:Service IdcService="GET_FILE">
<ucm:Document>
<ucm:Field name="dID">'|| p_dID ||'</ucm:Field>
</ucm:Document>
</ucm:Service>
</ucm:GenericRequest>
</soapenv:Body>
</soapenv:Envelope>
';
apex_web_service.g_request_headers(1).name := 'SOAPAction';
apex_web_service.g_request_headers(1).value := l_ws_action;
apex_web_service.g_request_headers(2).name := 'Content-Type';
apex_web_service.g_request_headers(2).value := 'text/xml; charset=UTF-8';
l_ws_response_clob := apex_web_service.make_rest_request(
p_url => l_ws_url,
p_http_method => 'POST',
p_body => l_ws_envelope,
p_username => l_user_name,
p_password => l_password);
--Note: This was tested with a very small result-set
--dbms_output.put_line(dbms_lob.substr(l_ws_response_clob,24000,1));
--Tested on a very small CLOB. Less than 32767. If larger may need to slice.
--dbms_output.put_line(length(l_ws_response_clob));
--Remove junk header
clob_l_start_xml := INSTR(l_ws_response_clob,'"Territory Name"',1,1);
clob_l_resp_len := LENGTH(l_ws_response_clob);
clob_l_xml_len := clob_l_resp_len - clob_l_start_xml + 1;
l_ws_response_clob_clean := dbms_lob.substr(l_ws_response_clob,clob_l_xml_len,clob_l_start_xml);
--dbms_output.put_line(l_ws_response_clob_clean);
--Remove junk footer
clean_clob_l_end_xml := INSTR(l_ws_response_clob_clean,CHR(13),-3)-2;
clean_clob_l_resp_len := LENGTH(l_ws_response_clob_clean);
clean_clob_l_xml_len := clean_clob_l_end_xml;
l_ws_response_clob_clean := dbms_lob.substr(l_ws_response_clob_clean,clean_clob_l_xml_len,1);
--dbms_output.put_line(l_ws_response_clob_clean);
--Insert into database
DELETE FROM TERRITORY_INFO;
INSERT INTO TERRITORY_INFO (Territory_Name,Status_Code,Status,Type,Forecast_Participation_Code)
select C001,C002,C003,C004,C005 FROM table(csv_util_pkg.clob_to_csv(l_ws_response_clob_clean,',',1));
END;

    (d)    To run the stored procedure – Copy the below PL/SQL code into the Apex -> SQL Workshop -> SQL Commands.

    (e)    For a text version of the PL/SQL click here.

    (f)    Replace:

    (i) Username

    (ii) Password

    (iii) Hostname

    (iv) dID i.e. 1011

BEGIN
UCM_GET_FILE('https://hostname.fs.us2.oraclecloud.com/idcws/GenericSoapPort?wsdl','username','password','dID');
END;

    (g)    Confirm data was loaded successfully

    SELECT * FROM TERRITORY_INFO;

    Snap3

    5) Combine GET_SEARCH_RESULTS and GET_FILE

    (a)    Copy the below PL/SQL code into the Apex -> SQL Workshop -> SQL Commands.

    (b)    Replace:

    (i) Username

    (ii) Password

    (iii) Hostname

    (iv) dDocName i.e. UCMFA001069

(c)    Note: The only change from the original GET_SEARCH_RESULTS code is at the end, where the retrieved dID is passed to UCM_GET_FILE instead of being printed.

    (d)    For a text version of the PL/SQL click here.

    (e)    Note: This code would also be converted to a stored procedure with parameters should it be implemented in production.

DECLARE
l_user_name VARCHAR2(100) := 'username';
l_password VARCHAR2(100) := 'password';
l_ws_url VARCHAR2(500) := 'https://hostname.fs.us2.oraclecloud.com/idcws/GenericSoapPort?wsdl';
l_ws_action VARCHAR2(500) := 'urn:GenericSoap/GenericSoapOperation';
l_ws_response_clob CLOB;
l_ws_response_clob_clean CLOB;
l_ws_envelope CLOB;
l_http_status VARCHAR2(100);
v_dID VARCHAR2(100);
l_ws_resp_xml XMLTYPE;
l_start_xml PLS_INTEGER;
l_end_xml PLS_INTEGER;
l_resp_len PLS_INTEGER;
l_xml_len PLS_INTEGER;
clob_l_start_xml PLS_INTEGER;
clob_l_resp_len PLS_INTEGER;
clob_l_xml_len PLS_INTEGER;
clean_clob_l_end_xml PLS_INTEGER;
clean_clob_l_resp_len PLS_INTEGER;
clean_clob_l_xml_len PLS_INTEGER;
v_cdata VARCHAR2(100);
v_length INTEGER;
BEGIN
l_ws_envelope :=
'<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:ucm="http://www.oracle.com/UCM">
<soapenv:Body>
<ucm:GenericRequest webKey="cs">
<ucm:Service IdcService="GET_SEARCH_RESULTS">
<ucm:Document>
<ucm:Field name="QueryText">dDocName &lt;starts> `UCMFA001069`</ucm:Field>
</ucm:Document>
</ucm:Service>
</ucm:GenericRequest>
</soapenv:Body>
</soapenv:Envelope>';
apex_web_service.g_request_headers(1).name := 'SOAPAction';
apex_web_service.g_request_headers(1).value := l_ws_action;
apex_web_service.g_request_headers(2).name := 'Content-Type';
apex_web_service.g_request_headers(2).value := 'text/xml; charset=UTF-8';
l_ws_response_clob := apex_web_service.make_rest_request(
p_url => l_ws_url,
p_http_method => 'POST',
p_body => l_ws_envelope,
p_username => l_user_name,
p_password => l_password);
--dbms_output.put_line(dbms_lob.substr(l_ws_response_clob,24000,1));
--Tested on a very small CLOB. Less than 32767. If larger may need to slice.
--dbms_output.put_line(length(l_ws_response_clob));
--Remove header as it is not XML
clob_l_start_xml := INSTR(l_ws_response_clob,'<?xml',1,1);
clob_l_resp_len := LENGTH(l_ws_response_clob);
clob_l_xml_len := clob_l_resp_len - clob_l_start_xml + 1;
l_ws_response_clob_clean := dbms_lob.substr(l_ws_response_clob,clob_l_xml_len,clob_l_start_xml);
--dbms_output.put_line(l_ws_response_clob_clean);
--Remove the tail as it is not XML
clean_clob_l_end_xml := INSTR(l_ws_response_clob_clean,'------=',1,1);
clean_clob_l_resp_len := LENGTH(l_ws_response_clob_clean);
clean_clob_l_xml_len := clean_clob_l_end_xml - 1;
l_ws_response_clob_clean := dbms_lob.substr(l_ws_response_clob_clean,clean_clob_l_xml_len,1);
--dbms_output.put_line(l_ws_response_clob_clean);
--Convert CLOB to XMLTYPE
l_ws_resp_xml := XMLTYPE.createXML(l_ws_response_clob_clean);
select (cdata_section)
into v_cdata
from
xmltable
(
xmlnamespaces
(
'http://schemas.xmlsoap.org/soap/envelope/' as "env",
'http://www.oracle.com/UCM' as "ns2"
),
'//env:Envelope/env:Body/ns2:GenericResponse/ns2:Service/ns2:Document/ns2:ResultSet/ns2:Row/ns2:Field[@name="dID"]'
passing l_ws_resp_xml
columns
cdata_section VARCHAR2(100) path 'text()'
) dat;
--dbms_output.put_line('dID:' || v_cdata);
UCM_GET_FILE(l_ws_url,l_user_name,l_password,v_cdata);
END;

    Further Reading

    Click here for the Application Express API Reference Guide –  MAKE_REST_REQUEST Function

    Click here for the Alexandria-plsql-utils

    Click here for related A-Team BICS blogs

    Summary

    This article described how to integrate Oracle Fusion Applications – WebCenter / Universal Content Management (UCM) with Oracle Business Intelligence Cloud Service (BICS). It covered the functional and technical steps necessary to automate exporting of data from WebCenter / UCM and load it into a BICS database.

    In order to implement this solution the WebCenter / UCM idcws/GenericSoapPort?wsdl must be publicly available. At the time of writing this was available in Fusion Applications R10.

    The solution did not cover creating the BICS Data Model or Dashboards. Information on this topic can be found on other A-Team blogs published by the same author.

The SQL scripts provided are for demonstration purposes only. They were tested on a small sample data-set. It is anticipated that code adjustments would be needed to accommodate larger production data-sets.

    Fusion HCM Cloud – Bulk Integration Automation Using Managed File Transfer (MFT) and Node.js


    Introduction

    Fusion HCM Cloud provides a comprehensive set of tools, templates, and pre-packaged integration to cover various scenarios using modern and efficient technologies. One of the patterns is the bulk integration to load and extract data to/from the cloud.

    The inbound tool is the File Based data loader (FBL) evolving into HCM Data Loaders (HDL). HDL is a powerful tool for bulk-loading data from any source to Oracle Fusion Human Capital Management (Oracle Fusion HCM). HDL supports one-time data migration and incremental load to support co-existence with Oracle Applications such as E-Business Suite (EBS) and PeopleSoft (PSFT).

HCM Extracts is an outbound integration tool that lets you choose HCM data, gathers it from the HCM database, and archives it as XML. This archived raw XML data can be converted into a desired format and delivered to recipients over supported channels.

    HCM cloud implements Oracle WebCenter Content, a component of Fusion Middleware, to store and secure data files for both inbound and outbound bulk integration patterns.

Oracle Managed File Transfer (Oracle MFT) enables secure file exchange and management with internal systems and external partners. It protects against inadvertent access to unsecured files at every step in the end-to-end transfer of files. It is easy to use, especially for non-technical staff, so you can leverage more resources to manage the transfer of files. The built-in, extensive reporting capabilities allow you to get a quick status of a file transfer and resubmit it as required.

Node.js is a platform for executing server-side JavaScript, similar to the JavaScript that runs in the browser. It enables real-time, two-way connections in web applications with push capability, using a non-blocking, event-driven I/O model. Node.js is built on an event-driven, asynchronous model: incoming requests are non-blocking, and each request is passed off to an asynchronous callback handler, which frees the main thread to respond to more requests.
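
As an illustration of this model, a minimal Node.js HTTP server handles each request with a callback while the event loop stays free to accept new connections (a generic sketch, not part of the mft2hcm utility described later in this post; the status.txt file name is a placeholder):

    // Minimal event-driven HTTP server: the callback runs once per request.
    var http = require('http');
    var fs = require('fs');

    http.createServer(function (req, res) {
      // Asynchronous, non-blocking read; the main thread remains free to serve other requests.
      fs.readFile('./status.txt', 'utf8', function (err, data) {
        if (err) {
          res.writeHead(500, {'Content-Type': 'text/plain'});
          res.end('Error reading file');
          return;
        }
        res.writeHead(200, {'Content-Type': 'text/plain'});
        res.end(data);
      });
    }).listen(3000, function () {
      console.log('Listening on port 3000');
    });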

    This post focuses on how to automate HCM Cloud batch integration using MFT (Managed File Transfer) and Node.js. MFT can receive files, decrypt/encrypt files and invoke Service Oriented Architecture (SOA) composites for various HCM integration patterns.

     

    Main Article

    Managed File Transfer (MFT)

    Oracle Managed File Transfer (MFT) is a high performance, standards-based, end-to-end managed file gateway. It features design, deployment, and monitoring of file transfers using a lightweight web-based design-time console that includes file encryption, scheduling, and embedded FTP and sFTP servers.

    Oracle MFT provides built-in compression, decompression, encryption and decryption actions for transfer pre-processing and post-processing. You can create new pre-processing and post-processing actions, which are called callouts.

    The callouts can be associated with either the source or the target. The sequence of processing action execution during a transfer is as follows:

1. Source pre-processing actions
2. Target pre-processing actions
3. Payload delivery
4. Target post-processing actions
    Source Pre-Processing

Source pre-processing is triggered right after a file has been received and a matching Transfer has been identified. This is the best place to do file validation, compression/decompression, encryption/decryption and/or extend MFT.

    Target Pre-Processing

    Target pre-processing is triggered just before the file is delivered to the Target by the Transfer. This is the best place to send files to external locations and protocols not supported in MFT.

    Target Post-Processing

    Post-processing occurs after the file is delivered. This is the best place for notifications, analytic/reporting or maybe remote endpoint file rename.

    For more information, please refer to the Oracle MFT document

     

    HCM Inbound Flow

    This is a typical Inbound FBL/HDL process flow:

    inbound_mft

    The FBL/HDL process for HCM is a two-phase web services process as follows:

    • Upload the data file to WCC/UCM using WCC GenericSoapPort web service
    • Invoke “LoaderIntegrationService” or “HCMDataLoader” to initiate the loading process.

    The following diagram illustrates the MFT steps with respect to “Integration” for FBL/HDL:

    inbound_mft_2

    HCM Outbound Flow

    This is a typical outbound batch Integration flow using HCM Extracts:

    extractflow

     

    The “Extract” process for HCM has the following steps:

    • An Extract report is generated in HCM either by user or through Enterprise Scheduler Service (ESS) – this report is stored in WCC under the hcm/dataloader/export account.
    • MFT scheduler can pull files from WCC
    • The data file(s) are either uploaded to the customer’s sFTP server as pass through or to Integration tools such as Service Oriented Architecture (SOA) for orchestrating and processing data to target applications in cloud or on-premise.

    The following diagram illustrates the MFT orchestration steps in “Integration” for Extract:

     

    outbound_mft

     

The extracted file could be delivered to the WebCenter Content server. HCM Extract has the ability to generate an encrypted output file. In the Extract delivery options, ensure the following options are correctly configured:

• Set the HCM Delivery Type to "HCM Connect"
• Select an Encryption Mode from the four supported encryption types, or select None
• Specify the Integration Name – this value is used to build the title of the entry in WebCenter Content

     

    Extracted File Naming Convention in WebCenter Content

    The file will have the following properties:
    Author: FUSION_APPSHCM_ESS_APPID
    Security Group: FAFusionImportExport
    Account: hcm/dataloader/export
    Title: HEXTV1CON_{IntegrationName}_{EncryptionType}_{DateTimeStamp}

     

    Fusion Applications Security

The content in WebCenter Content is secured through users, roles, privileges and accounts. The user could be any valid user with a role such as "Integration Specialist." The role may have privileges such as read, write and delete. The accounts are predefined by each application; for example, HCM uses /hcm/dataloader/import and /hcm/dataloader/export for the inbound and outbound integrations respectively.
    The FBL/HDL web services are secured through Oracle Web Service Manager (OWSM) using the following policy: oracle/wss11_saml_or_username_token_with_message_protection_service_policy.

    The client must satisfy the message protection policy to ensure that the payload is encrypted or sent over the SSL transport layer.

    A client policy that can be used to meet this requirement is: “oracle/wss11_username_token_with_message_protection_client_policy”

    To use this policy, the message must be encrypted using a public key provided by the server. When the message reaches the server it can be decrypted by the server’s private key. A KeyStore is used to import the certificate and it is referenced in the subsequent client code.

    The public key can be obtained from the certificate provided in the service WSDL file.

    Encryption of Data File using Pretty Good Privacy (PGP)

    All data files transit over a network via SSL. In addition, HCM Cloud supports encryption of data files at rest using PGP.
    Fusion HCM supports the following types of encryption:

    • PGP Signed
    • PGP Unsigned
    • PGPX509 Signed
    • PGPX509 Unsigned

    To use this PGP Encryption capability, a customer must exchange encryption keys with Fusion for the following:

    • Fusion can decrypt inbound files
    • Fusion can encrypt outbound files
    • Customer can encrypt files sent to Fusion
    • Customer can decrypt files received from Fusion

    MFT Callout using Node.js

     

    Prerequisites

To automate HCM batch integration patterns, the following components must be installed and configured:

     

    Node.js Utility

A simple Node.js utility, "mft2hcm", has been developed as an MFT server callout for uploading files to and downloading files from the Oracle WebCenter Content server and for initiating the HCM SaaS loader service. It utilizes the node "mft-upload" package and provides SOAP substitution templates for WebCenter (UCM) and the Oracle HCM Loader service.

    Please refer to the “mft2hcm” node package for installation and configuration.

    RunScript

RunScript is configured as "Run Script Pre 01", a callout that can be injected into MFT pre- or post-processing. This callout always sends the following default parameters to the script:

    • Filename
    • Directory
    • ECID
    • Filesize
    • Targetname (not for source callouts)
    • Sourcename
    • Createtime

    Please refer to “PreRunScript” for more information on installation and configuration.

    MFT Design

    MFT Console enables the following tasks depending on your user roles:

    Designer: Use this page to create, modify, delete, rename, and deploy sources, targets, and transfers.

    Monitoring: Use this page to monitor transfer statistics, progress, and errors. You can also use this page to disable, enable, and undeploy transfer deployments and to pause, resume, and resubmit instances.

    Administration: Use this page to manage the Oracle Managed File Transfer configuration, including embedded server configuration.

    Please refer to the MFT Users Guide for more information.

     

    HCM FBL/HDL MFT Transfer

    This is a typical MFT transfer design and configuration for FBL/HDL:

    MFT_FBL_Transfer

    The transfer could be designed for additional steps such as compress file and/or encrypt/decrypt files using PGP, depending on the use cases.

     

    HCM FBL/HDL (HCM-MFT) Target

    The MFT server receives files from any Source protocol such as SFTP, SOAP, local file system or a back end integration process. The file can be decrypted, uncompressed or validated before a Source or Target pre-processing callout uploads it to UCM then notifies HCM to initiate the batch load. Finally the original file is backed up into the local file system, remote SFTP server or a cloud based storage service. An optional notification can also be delivered to the caller using a Target post-processing callout upon successful completion.

    This is a typical target configuration in the MFT-HCM transfer:

    Click on target Pre-Processing Action and select “Run Script Pre 01”:

    MFT_RunScriptPre01

     

    Enter “scriptLocation” where node package “mft2hcm” is installed. For example, <Node.js-Home>/hcm/node_modules/mft2hcm/mft2hcm.js

    MFTPreScriptUpload

     

Do not check "UseFileFromScript". This property replaces the inbound (source) file of the MFT transfer with the file returned by the target execution; in FBL/HDL, the response from the target execution does not contain a file.

     

    HCM Extract (HCM-MFT) Transfer

An external event or scheduler triggers the MFT server to search for a file in WCC using a search query. Once a document id is identified, it is retrieved using a "Source Pre-Processing" callout which injects the retrieved file into the MFT Transfer. The file can then be decrypted, validated or decompressed before being sent to an MFT Target of any protocol such as SFTP, File system, SOAP Web Service or a back-end integration process. Finally, the original file is backed up into the local file system, a remote SFTP server or a cloud-based storage service. An optional notification can also be delivered to the caller using a Target post-processing callout upon successful completion. The MFT server can live either on premise or in a cloud iPaaS hosted environment.

    This is a typical configuration of HCM-MFT Extract Transfer:

    MFT_Extract_Transfer

     

    In the Source definition, add “Run Script Pre 01” processing action and enter the location of the script:

    MFTPreScriptDownload

     

The "UseFileFromScript" option must be checked because the source scheduler is triggered with the mft2hcm payload (UCM-PAYLOAD-SEARCH) to initiate the search and get operations against WCC. Once the file is retrieved from WCC, this flag tells the MFT engine to substitute the inbound file with the file downloaded from WCC.

     

    Conclusion

    This post demonstrates how to automate HCM inbound and outbound patterns using MFT and Node.js. The Node.js package could be replaced with WebCenter Content native APIs and SOA for orchestration. This process can also be replicated for other Fusion Applications pillars such as Oracle Enterprise Resource Planning (ERP).

    Displaying Oracle Documents Cloud Services File Picker and Link Picker from other Domains


    Introduction

The Oracle Documents File Picker and Link Picker allow web applications to select files and folders that reside in the Oracle Documents Cloud Service. These Pickers can start from a specific folder and can further be restricted to a folder structure for a specific user and role when combined with the Oracle Documents App Link. Some of these integration calls are made from external domains; therefore, this embedded content transfer needs to be explicitly allowed and configured.

    Main Article

    The FilePicker tutorial allows the user to choose a List or a Grid Layout and the Sort Order. Some options include the ability to select a single item, to select folders only, to select files only, and to allow the upload of new files as shown below:

    File Picker Tutorial

     

Most applications run on a different domain from the one where the Oracle Documents Cloud Service (DOCS) instance is running. Therefore an additional configuration step is required to embed a DOCS web user interface inline frame:

1) Go to the Administration page on the source DOCS instance (http://hostname:port/documents/admin)
2) Go to the System-wide Settings tab
3) Go to the Embedded Content section and select YES
4) Add the target domain and the CAPTCHA/SafeMode options

    Further information is available in the document “Administering Oracle Documents Cloud Service” – section: “Displaying Content from Other Domains” in the link below:
    http://docs.oracle.com/cloud/latest/documentcs_welcome/WCCCA/GUID-6511347B-87ED-43D9-A183-BBD91E9E17C8.htm#WCCCA-GUID-6511347B-87ED-43D9-A183-BBD91E9E17C8
    The example below shows a Java Cloud Service (JCS) application invoking the DOCS File Picker from a different domain.

    First of all, the domain where the app is going to be invoking the File Picker needs to be added to the list of allowed domains:

    File Picker Embedded Content YES

Note that CAPTCHA was enabled for the invoking domain. This will challenge users to perform a simple visual test, thereby preventing automated scripts from reaching the DOCS instance.

A simple application was created to call the DOCS File Picker from an HTML page. This application was developed in JDeveloper so that the EAR file could be deployed to the Java Cloud Service.

    File Picker App in JDeveloper

    The application was deployed successfully in JCS:

    File Picker App deployed in JCS

    When running the App, the below simple visual challenge is issued since CAPTCHA was enabled for this target domain:

    File Picker CAPTCHA

    Note the two different domains from the application running on JCS and from the DOCS instance:

    File Picker App running on JCS

After selecting the item to be picked, DOCS returns JSON data corresponding to the selection made:

    File Picker returns data

The preClick mechanism allows options to be passed to createFilePickerButton at click time; for example, the ID of the initial folder may not be available until page load. The File or Link Picker can create the button, or a custom one can be used. Using the pre-configured button from DOCS is simpler, but it may not match UI requirements, in which case a custom button would be preferred.

function onPreClick(options) {

	//Application logic to obtain the folder id
	options.id = getFolderId();
};

function onOk(selection) {
	//Do something with the selection
};

window.onload = function() {
	var options = {
	   preClick : onPreClick,
	   ok : onOk,
	   id: ""  // Id not yet known
	};

	var button = OracleDCS.createFilePickerButton(options);
	document.getElementById("button-container").appendChild(button);
};

    Both the File and Link Picker could be used in combination with AppLink to restrict access to a folder structure for a specified user and role. The Picker would need an appLinkID, an accessToken, and a refreshToken. These are returned by the createAppLink service.

function onOk(selection) {
	//Do something with the selection
};

window.onload = function() {

	//Call application logic to get AppLink target folder id
	var appLinkFolderId = getFolderId();

	//Application logic to invoke create folder AppLink service with folder id
	var createApplink = createAppLink(appLinkFolderId);

	var options = {
		ok: onOk,
		id: appLinkFolderId,
		appLinkId: createApplink.appLinkID,
		appLinkAccessToken: createApplink.accessToken,
		appLinkRefreshToken: createApplink.refreshToken
	};

	var button = OracleDCS.createFilePickerButton(options);
	document.getElementById("button-container").appendChild(button);
};

    The source for the above code is available at the Picker Tutorials:

    http://hostname:port/documents/static/api/FilePickerTutorial.html

    http://hostname:port/documents/static/api/LinkPickerTutorial.html

    The hostname and port variables identify the Oracle Documents Cloud Service instance.

In summary, invoking domains need to be added to the list of allowed domains in the Embedded Content section of the DOCS Administration page. If the domain is not present, the application remains stuck at the File Picker screen: clicking OK performs no action until the File/Link Picker dialog is cancelled and closed. The simple JDeveloper application above called the DOCS File Picker; however, the same principles apply to the Link Picker.

     

    Oracle HCM Cloud – Bulk Integration Automation Using SOA Cloud Service


    Introduction

    Oracle Human Capital Management (HCM) Cloud provides a comprehensive set of tools, templates, and pre-packaged integration to cover various scenarios using modern and efficient technologies. One of the patterns is the batch integration to load and extract data to and from the HCM cloud. HCM provides the following bulk integration interfaces and tools:

    HCM Data Loader (HDL)

    HDL is a powerful tool for bulk-loading data from any source to Oracle Fusion HCM. It supports important business objects belonging to key Oracle Fusion HCM products, including Oracle Fusion Global Human Resources, Compensation, Absence Management, Performance Management, Profile Management, Global Payroll, Talent and Workforce Management. For detailed information on HDL, please refer to this.

    HCM Extracts

HCM Extract is an outbound integration tool that lets you select HCM data elements, extract them from the HCM database, and archive them as XML. This archived raw XML data can be converted into a desired format and delivered to recipients over supported channels.

Oracle Fusion HCM provides the above tools with comprehensive user interfaces for initiating data uploads, monitoring upload progress, and reviewing errors, with real-time information provided for both the import and load stages of upload processing. Fusion HCM provides the tools, but additional orchestration is still required, such as generating the FBL or HDL file, uploading it to WebCenter Content, and initiating the FBL or HDL web services. This post describes how to design and automate these steps leveraging Oracle Service Oriented Architecture (SOA) Cloud Service deployed on Oracle's cloud Platform As a Service (PaaS) infrastructure.  For more information on SOA Cloud Service, please refer to this.

Oracle SOA is the industry's most complete and unified application integration and SOA solution. It transforms complex application integration into agile and re-usable service-based components to speed time to market, respond faster to business requirements, and lower costs. SOA facilitates the development of enterprise applications as modular business web services that can be easily integrated and reused, creating a truly flexible, adaptable IT infrastructure. For more information on getting started with Oracle SOA, please refer to this. For developing SOA applications using SOA Suite, please refer to this.

    These bulk integration interfaces and patterns are not applicable to Oracle Taleo.

    Main Article

     

    HCM Inbound Flow (HDL)

    Oracle WebCenter Content (WCC) acts as the staging repository for files to be loaded and processed by HDL. WCC is part of the Fusion HCM infrastructure.

    The loading process for FBL and HDL consists of the following steps:

    • Upload the data file to WCC/UCM using WCC GenericSoapPort web service
    • Invoke the “LoaderIntegrationService” or the “HCMDataLoader” to initiate the loading process.

However, the above steps assume the existence of an HDL file and do not provide a mechanism to generate one for the respective objects. In this post we will use a sample use case in which we receive a data file from the customer, transform the data to generate an HDL file, and then initiate the loading process.

    The following diagram illustrates the typical orchestration of the end-to-end HDL process using SOA cloud service:

     

    hcm_inbound_v1

    HCM Outbound Flow (Extract)

    The “Extract” process for HCM has the following steps:

    • An Extract report is generated in HCM either by user or through Enterprise Scheduler Service (ESS)
    • Report is stored in WCC under the hcm/dataloader/export account.

     

    However, the report must then be delivered to its destination depending on the use cases. The following diagram illustrates the typical end-to-end orchestration after the Extract report is generated:

    hcm_outbound_v1

     

For an introduction to HCM bulk integration, including security, roles and privileges, please refer to my blog Fusion HCM Cloud – Bulk Integration Automation Using Managed File Transfer (MFT) and Node.js. For an introduction to WebCenter Content integration services using SOA, please refer to my blog Fusion HCM Cloud Bulk Automation.

     

    Sample Use Case

Assume that a customer periodically receives benefits data from their partner in a file with CSV (comma separated value) format. This data must be converted into HDL format for the "ElementEntry" object and the loading process initiated in Fusion HCM Cloud.

    This is a sample source data:

    E138_ASG,2015/01/01,2015/12/31,4,UK LDG,CRP_UK_MNTH,E,H,Amount,23,Reason,Corrected all entry value,Date,2013-01-10
    E139_ASG,2015/01/01,2015/12/31,4,UK LDG,CRP_UK_MNTH,E,H,Amount,33,Reason,Corrected one entry value,Date,2013-01-11

This is the HDL format of the ElementEntry object that needs to be generated based on the above sample file:

    METADATA|ElementEntry|EffectiveStartDate|EffectiveEndDate|AssignmentNumber|MultipleEntryCount|LegislativeDataGroupName|ElementName|EntryType|CreatorType
    MERGE|ElementEntry|2015/01/01|2015/12/31|E138_ASG|4|UK LDG|CRP_UK_MNTH|E|H
    MERGE|ElementEntry|2015/01/01|2015/12/31|E139_ASG|4|UK LDG|CRP_UK_MNTH|E|H
    METADATA|ElementEntryValue|EffectiveStartDate|EffectiveEndDate|AssignmentNumber|MultipleEntryCount|LegislativeDataGroupName|ElementName|InputValueName|ScreenEntryValue
    MERGE|ElementEntryValue|2015/01/01|2015/12/31|E138_ASG|4|UK LDG|CRP_UK_MNTH|Amount|23
    MERGE|ElementEntryValue|2015/01/01|2015/12/31|E138_ASG|4|UK LDG|CRP_UK_MNTH|Reason|Corrected all entry value
    MERGE|ElementEntryValue|2015/01/01|2015/12/31|E138_ASG|4|UK LDG|CRP_UK_MNTH|Date|2013-01-10
    MERGE|ElementEntryValue|2015/01/01|2015/12/31|E139_ASG|4|UK LDG|CRP_UK_MNTH|Amount|33
    MERGE|ElementEntryValue|2015/01/01|2015/12/31|E139_ASG|4|UK LDG|CRP_UK_MNTH|Reason|Corrected one entry value
    MERGE|ElementEntryValue|2015/01/01|2015/12/31|E139_ASG|4|UK LDG|CRP_UK_MNTH|Date|2013-01-11

    SOA Cloud Service Design and Implementation

A canonical schema pattern has been implemented to design the end-to-end inbound bulk integration process – from the source data file to generating the HDL file and initiating the loading process in HCM Cloud. The XML schema of the HDL object "ElementEntry" is created. The source data is mapped to this HDL schema and SOA activities generate the HDL file.

Having a canonical pattern automates the generation of the HDL file, and it becomes a reusable asset for various interfaces. The developer or business user only needs to focus on mapping the source data to this canonical schema. All other activities, such as generating the HDL file, compressing and encrypting the file, uploading the file to WebCenter Content and invoking web services, need to be developed only once, after which they become reusable assets.

    Please refer to Wikipedia for the definition of Canonical Schema Pattern

    These are the following design considerations:

    1. Convert source data file from delimited format to XML

    2. Generate Canonical Schema of ElementEntry HDL Object

    3. Transform source XML data to HDL canonical schema

    4. Generate and compress HDL file

    5. Upload a file to WebCenter Content and invoke HDL web service

     

    Please refer to SOA Cloud Service Develop and Deploy for introduction and creating SOA applications.

    SOA Composite Design

    This is a composite based on above implementation principles:

    hdl_composite

    Convert Source Data to XML

    “GetEntryData” in the above composite is a File Adapter service. It is configured to use native format builder to convert CSV data to XML format. For more information on File Adapter, refer to this. For more information on Native Format Builder, refer to this.

    The following provides detailed steps on how to use Native Format Builder in JDeveloper:

In Native Format Builder, select the delimited format type and use the source data as a sample to generate an XML schema. Please see the following diagrams:

    FileAdapterConfig

    nxsd1

    nxsd2_v1 nxsd3_v1 nxsd4_v1 nxsd5_v1 nxsd6_v1 nxsd7_v1

    Generate XML Schema of ElementEntry HDL Object

    A similar approach is used to generate ElementEntry schema. It has two main objects: ElementEntry and ElementEntryValue.

    ElementEntry Schema generated using Native Format Builder

<?xml version = '1.0' encoding = 'UTF-8'?>
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:nxsd="http://xmlns.oracle.com/pcbpel/nxsd" xmlns:tns="http://TargetNamespace.com/GetEntryHdlData" targetNamespace="http://TargetNamespace.com/GetEntryHdlData" elementFormDefault="qualified" attributeFormDefault="unqualified" nxsd:version="NXSD" nxsd:stream="chars" nxsd:encoding="UTF-8">
<xsd:element name="Root-Element">
<xsd:complexType>
<xsd:sequence>
<xsd:element name="Entry" minOccurs="1" maxOccurs="unbounded">
<xsd:complexType>
<xsd:sequence>
<xsd:element name="METADATA" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="ElementEntry" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="EffectiveStartDate" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="EffectiveEndDate" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="AssignmentNumber" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="MultipleEntryCount" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="LegislativeDataGroupName" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="ElementName" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="EntryType" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="CreatorType" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="${eol}" nxsd:quotedBy="&quot;"/>
</xsd:sequence>
</xsd:complexType>
</xsd:element>
</xsd:sequence>
</xsd:complexType>
</xsd:element>
<xsd:annotation>
<xsd:appinfo>NXSDSAMPLE=/ElementEntryAllSrc.dat</xsd:appinfo>
<xsd:appinfo>USEHEADER=false</xsd:appinfo>
</xsd:annotation>
</xsd:schema>

    ElementEntryValue Schema generated using Native Format Builder

<?xml version = '1.0' encoding = 'UTF-8'?>
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:nxsd="http://xmlns.oracle.com/pcbpel/nxsd" xmlns:tns="http://TargetNamespace.com/GetEntryValueHdlData" targetNamespace="http://TargetNamespace.com/GetEntryValueHdlData" elementFormDefault="qualified" attributeFormDefault="unqualified" nxsd:version="NXSD" nxsd:stream="chars" nxsd:encoding="UTF-8">
<xsd:element name="Root-Element">
<xsd:complexType>
<xsd:sequence>
<xsd:element name="EntryValue" minOccurs="1" maxOccurs="unbounded">
<xsd:complexType>
<xsd:sequence>
<xsd:element name="METADATA" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="ElementEntryValue" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="EffectiveStartDate" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="EffectiveEndDate" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="AssignmentNumber" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="MultipleEntryCount" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="LegislativeDataGroupName" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="ElementName" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="InputValueName" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>
<xsd:element name="ScreenEntryValue" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="${eol}" nxsd:quotedBy="&quot;"/>
</xsd:sequence>
</xsd:complexType>
</xsd:element>
</xsd:sequence>
</xsd:complexType>
</xsd:element>
<xsd:annotation>
<xsd:appinfo>NXSDSAMPLE=/ElementEntryAllSrc.dat</xsd:appinfo>
<xsd:appinfo>USEHEADER=false</xsd:appinfo>
</xsd:annotation>
</xsd:schema>

In Native Format Builder, change the "|" separator to "," in the sample file, and then change the separator back to "|" for each element in the generated schema.
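
For example, an element generated from the comma-separated sample would initially look like the first line below and should be edited to the second (a sketch based on the schemas shown above):

<xsd:element name="ElementName" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="," nxsd:quotedBy="&quot;"/>
<xsd:element name="ElementName" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="|" nxsd:quotedBy="&quot;"/>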

    Transform Source XML Data to HDL Canonical Schema

Since we are using a canonical schema, all we need to do is map the source data appropriately, and Native Format Builder will convert each object into the HDL output file. The transformation could be complex depending on the source data format and the organization of data values. In our sample use case, each row contains one ElementEntry object and 3 ElementEntryValue sub-objects.

    The following provides the organization of the data elements in a single row of the source:

    Entry_Desc_v1

The main ElementEntry attributes are mapped from each respective row, but the ElementEntryValue attributes are located at the end of each row; in this sample, each row yields 3 ElementEntryValue entries. This can be achieved easily by splitting and transforming each row with different mappings as follows:

<xsl:for-each select="/ns0:Root-Element/ns0:Entry"> – map pair of columns "1" from above diagram

<xsl:for-each select="/ns0:Root-Element/ns0:Entry"> – map pair of columns "2" from above diagram

<xsl:for-each select="/ns0:Root-Element/ns0:Entry"> – map pair of columns "3" from above diagram

     

    Metadata Attribute

The most common use case is the "merge" action for creating and updating objects. In this sample it is hard-coded to "MERGE", but the action could be made dynamic if the source data row carries this information. The "delete" action removes the entire record and must not be combined with a "merge" instruction for the same record, as HDL cannot guarantee the order in which the instructions will be processed. It is highly recommended to correct the data rather than to delete and recreate it using the "delete" action; deleted data cannot be recovered.
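
For illustration only, based on the ElementEntry sample above, a DELETE instruction follows the same line layout as MERGE (a sketch only; confirm the exact attributes HDL requires for DELETE in the HDL documentation, and never mix MERGE and DELETE for the same record in one file):

MERGE|ElementEntry|2015/01/01|2015/12/31|E138_ASG|4|UK LDG|CRP_UK_MNTH|E|H
DELETE|ElementEntry|2015/01/01|2015/12/31|E139_ASG|4|UK LDG|CRP_UK_MNTH|E|H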

     

This is the sample XSL transformation developed in JDeveloper to split each row into 3 rows for the ElementEntryValue object:

<xsl:template match="/">
<tns:Root-Element>
<xsl:for-each select="/ns0:Root-Element/ns0:Entry">
<tns:Entry>
<tns:METADATA>
<xsl:value-of select="'MERGE'"/>
</tns:METADATA>
<tns:ElementEntry>
<xsl:value-of select="'ElementEntryValue'"/>
</tns:ElementEntry>
<tns:EffectiveStartDate>
<xsl:value-of select="ns0:C2"/>
</tns:EffectiveStartDate>
<tns:EffectiveEndDate>
<xsl:value-of select="ns0:C3"/>
</tns:EffectiveEndDate>
<tns:AssignmentNumber>
<xsl:value-of select="ns0:C1"/>
</tns:AssignmentNumber>
<tns:MultipleEntryCount>
<xsl:value-of select="ns0:C4"/>
</tns:MultipleEntryCount>
<tns:LegislativeDataGroupName>
<xsl:value-of select="ns0:C5"/>
</tns:LegislativeDataGroupName>
<tns:ElementName>
<xsl:value-of select="ns0:C6"/>
</tns:ElementName>
<tns:EntryType>
<xsl:value-of select="ns0:C9"/>
</tns:EntryType>
<tns:CreatorType>
<xsl:value-of select="ns0:C10"/>
</tns:CreatorType>
</tns:Entry>
</xsl:for-each>
<xsl:for-each select="/ns0:Root-Element/ns0:Entry">
<tns:Entry>
<tns:METADATA>
<xsl:value-of select="'MERGE'"/>
</tns:METADATA>
<tns:ElementEntry>
<xsl:value-of select="'ElementEntryValue'"/>
</tns:ElementEntry>
<tns:EffectiveStartDate>
<xsl:value-of select="ns0:C2"/>
</tns:EffectiveStartDate>
<tns:EffectiveEndDate>
<xsl:value-of select="ns0:C3"/>
</tns:EffectiveEndDate>
<tns:AssignmentNumber>
<xsl:value-of select="ns0:C1"/>
</tns:AssignmentNumber>
<tns:MultipleEntryCount>
<xsl:value-of select="ns0:C4"/>
</tns:MultipleEntryCount>
<tns:LegislativeDataGroupName>
<xsl:value-of select="ns0:C5"/>
</tns:LegislativeDataGroupName>
<tns:ElementName>
<xsl:value-of select="ns0:C6"/>
</tns:ElementName>
<tns:EntryType>
<xsl:value-of select="ns0:C11"/>
</tns:EntryType>
<tns:CreatorType>
<xsl:value-of select="ns0:C12"/>
</tns:CreatorType>
</tns:Entry>
</xsl:for-each>
<xsl:for-each select="/ns0:Root-Element/ns0:Entry">
<tns:Entry>
<tns:METADATA>
<xsl:value-of select="'MERGE'"/>
</tns:METADATA>
<tns:ElementEntry>
<xsl:value-of select="'ElementEntryValue'"/>
</tns:ElementEntry>
<tns:EffectiveStartDate>
<xsl:value-of select="ns0:C2"/>
</tns:EffectiveStartDate>
<tns:EffectiveEndDate>
<xsl:value-of select="ns0:C3"/>
</tns:EffectiveEndDate>
<tns:AssignmentNumber>
<xsl:value-of select="ns0:C1"/>
</tns:AssignmentNumber>
<tns:MultipleEntryCount>
<xsl:value-of select="ns0:C4"/>
</tns:MultipleEntryCount>
<tns:LegislativeDataGroupName>
<xsl:value-of select="ns0:C5"/>
</tns:LegislativeDataGroupName>
<tns:ElementName>
<xsl:value-of select="ns0:C6"/>
</tns:ElementName>
<tns:EntryType>
<xsl:value-of select="ns0:C13"/>
</tns:EntryType>
<tns:CreatorType>
<xsl:value-of select="ns0:C14"/>
</tns:CreatorType>
</tns:Entry>
</xsl:for-each>
</tns:Root-Element>
</xsl:template>

    BPEL Design – “ElementEntryPro…”

    This is a BPEL component where all the major orchestration activities are defined. In this sample, all the activities after transformation are reusable and can be moved to a separate composite. A separate composite may be developed only for transformation and data enrichment that in the end invokes the reusable composite to complete the loading process.

     

    hdl_bpel_v2

     

     

    SOA Cloud Service Instance Flows

    The following diagram shows an instance flow:

    ElementEntry Composite Instance

    instance1

    BPEL Instance Flow

    audit_1

Receive Input Activity – receives the delimited data and converts it to XML format through Native Format Builder using the File Adapter

    audit_2

    Transformation to Canonical ElementEntry data

    Canonical_entry

    Transformation to Canonical ElementEntryValue data

    Canonical_entryvalue

    Conclusion

    This post demonstrates how to automate HCM inbound and outbound patterns using SOA Cloud Service. It shows how to convert customer’s data to HDL format followed by initiating the loading process. This process can also be replicated to other Fusion Applications pillars such as Oracle Enterprise Resource Planning (ERP).

    Preventing Deletes from Replicating In Archiver


    I'm working on a project in which there is a particular use-case to prevent the deletion of content from migrating to a target instance of WebCenter Content.  Normally, when automatic replication is configured between instances of WebCenter Conten...


    Oracle WebCenter and Dynamic Groups from an External LDAP Server (Part 1 of 2)


    Oracle WebCenter and Dynamic Groups from an External LDAP Server (Part 1 of 2)

    Do you have some dynamic groups on your Directory?

    Are you puzzled on how to get these privileges into your WebCenter Portal/Spaces/Content -a.k.a. UCM?

    This blog post will cover what works and how to fix what doesn’t work yet so that you could have your dynamic-group-based memberships on your WebCenter products.

    What do you mean you don’t read HDA?


Any WebCenter Content or Records administrator who's done any
customizing or troubleshooting of the server has undoubtedly run across
an .hda (HDA) file.  An HDA file is a proprietary data structure in ASCII
text files used by WebCenter Conten...

    Oracle WebCenter and Dynamic Groups from an External LDAP Server (Part 2 of 2)


    This blog post will cover how to get dynamic groups to work with Oracle WebCenter without having to use an External Oracle Virtual Directory instance.

    Background information on Dynamic Groups and Oracle WebCenter could be found on the Part 1 of 2 blog post. It also covers how the OVD DynamicGroups Plug-in works as well as its use with an external OVD instance.

WebCenter's user creation, authentication, and authorization is managed using Oracle Platform Services Security (OPSS). The OPSS API queries the LDAP server for groups with users identified by the uniquemember objectclass. The DynamicGroups Plug-in works by monitoring returned LDAP objects; when it detects an object where the memberurl attribute is present, it automatically processes any memberurl values and adds the results to the uniquemember attribute.

    Creating custom report templates with BI Publisher


    In the records management capabilities within WebCenter Content 11g, reports that are created are generated by a runtime version of Oracle BI Publisher.  In order to create those reports, there are report templates that are checked in during the ...

    Full-text indexing? You must read this


    For those of you who may have missed it, Peter Flies, Principal Technical Support Engineer for WebCenter Content, gave an excellent webcast on database searching and indexing in WebCenter Content.  It's available for replay along with a downl...
