Channel: WebCenter Content – ATeam Chronicles

WebCenter Content Web Search Performance: Do you really need that folder path info?


Introduction

End-users want content at their fingertips, at the speed of thought if possible. When running search operations in the WebCenter Content web interface, every second, or even fraction of a second, of improvement matters.

Main Article

While doing some trace analysis with systemdatabase tracing on a customer environment, we came across some SQL queries that were being triggered unnecessarily. These were related to determining the folder path for every entry in the search result set. However, this folder path was not even part of the information displayed in the user interface.

Why was the folder path information being collected when it was not even displayed in the UI? We found that the configuration parameter ‘FolderPathInSearchResults’ was set to ‘true’ under Administration > Admin Server > General Configuration > Additional Configuration Variables as shown below:

[Figure 1: FolderPathInSearchResults set to ‘true’ in Additional Configuration Variables]

When executing a quicksearch by keyword we were getting 100 out of 2280 entries in the first page of the result set.

[Figure 2: Quick search returning 100 of 2280 entries]

When the ‘FolderPathInSearchResults’ configuration parameter is set to ‘true’, the following queries appear in the systemdatabase tracing:

100 executions for a query on the FolderFiles table for each of the documents displayed in the first page:

>systemdatabase/6       12.13 11:17:48.188      IdcServer-199   1.45 ms. SELECT * FROM FolderFiles WHERE dDocName='SLC02VGVUSORAC140641' AND fLinkRank=0[Executed. Returned row(s): true]

382 executions of a query on the folders tables – most of the documents that match the keyword criteria are at a folder depth of three or four:

>systemdatabase/6       12.13 11:17:48.114      IdcServer-199   2.57 ms. SELECT FolderFolders.*,FolderMetaDefaults.* FROM FolderFolders,FolderMetaDefaults WHERE FolderFolders.fFolderGUID=FolderMetaDefaults.fFolderGUID(+) AND((FolderFolders.fFolderGUID = '1EB8E527E19B09ED3FE82EE310AEA13A' ) )[Executed. Returned row(s): true]

By setting this ‘FolderPathInSearchResults’ configuration parameter to ‘false’, the above queries were no longer reported in the Server Output System Audit Information.

Now, let’s consider a practical scenario:
Search result set page size: 100
Average folder depth per document in the search result set: 5

The number of folder-path-related queries will be: 100 + 5*100 = 600.
If each query takes slightly over 3 ms, that amounts to roughly 2,000 ms (2 seconds) of server time spent gathering this information.
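As a sanity check, the arithmetic above can be sketched in a few lines. This is an illustration only; the 3.3 ms per-query cost is an assumed figure roughly in line with the traced queries shown earlier, not a measured constant.

```python
# Back-of-the-envelope estimate of folder-path query overhead when
# FolderPathInSearchResults=true. The per-query time is an assumption.

def folder_path_query_cost(page_size, avg_depth, ms_per_query):
    # One FolderFiles lookup per result on the page, plus one
    # FolderFolders lookup per ancestor folder of each result.
    query_count = page_size + page_size * avg_depth
    return query_count, query_count * ms_per_query

queries, total_ms = folder_path_query_cost(page_size=100, avg_depth=5,
                                           ms_per_query=3.3)
print(queries)          # 600 queries
print(round(total_ms))  # roughly 2 seconds of server time
```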

[Figure 3: Firebug timing of the search request with FolderPathInSearchResults enabled]

The overall performance impact goes beyond server execution time, as this information also needs to travel from the server to the browser. If the documents are nested deeper in the folder hierarchy, additional hundreds of queries may be executed. If the folder path is not displayed in your end-user interface profile, your system may be better off with the ‘FolderPathInSearchResults’ configuration parameter disabled.


Generating barcodes in reports


Introduction

I recently had a comment posted on a previous blog post regarding generating barcodes in the reports that come with the records management module (either in WebCenter Content/UCM or WebCenter Content: Records/URM).

I knew we could output barcodes because we do so in some of the default reports that come with the product.  But even when looking at those rich-text templates, it wasn’t clear how the fields were defined.  So I did a little digging and discovered the code that needs to be added to those fields to do the barcode magic.  I won’t repeat the steps on how to update/create the custom reports from my earlier post, but will just cover the few extra steps for barcodes.

Main Article

Once you have your field input into the template in Word, right-click on the field and choose BI Publisher -> Properties.  Click on the Advanced tab and you should see the box for Code with the field you are outputting surrounded by <?field_name?>. For barcodes, you’ll want to enter this in that code field:

<?register-barcode-vendor:'oracle.xdo.template.rtf.util.barcoder.BarcodeUtil';'XMLPBarVendor'?><?dBarcodeFormated?>*<?dBarcode?>*<?format-barcode:dBarcodeFormated;code39;XMLPBarVendor?>

Just replace dBarcode with your field name (e.g. dDocName, xComments, etc).
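For example (an illustrative substitution following the pattern above, not taken from the original post), if the field being output were dDocName, the code entry would read:

```
<?register-barcode-vendor:'oracle.xdo.template.rtf.util.barcoder.BarcodeUtil';'XMLPBarVendor'?><?dDocNameFormated?>*<?dDocName?>*<?format-barcode:dDocNameFormated;code39;XMLPBarVendor?>
```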

Barcode code

Next, you’ll want to change the font on the field to ‘BC 3of9’.  This font should have been added when the BI Publisher Desktop add-in for Word was installed.

Barcode font

Now simply follow the steps to add the template to the repository and configure the appropriate reports.  When the reports are run, the values should be rendered as barcodes.

Barcode report

One thing I noticed is when I saved the Word document in rich-text format, I was no longer able to re-open that rtf file and get back to the code for the field properties.  But in Word’s default doc format, I was.  So if you think you might need to edit the report later on, it’s probably a good idea to save a copy in doc format as well.

WebCenter Content Performance: Web Browser Choice


Introduction

New web technologies are helping end users get a faster and richer web experience. However, some organizations are clinging to old browser technologies, and traditionally vendors specify only a minimum browser version to be supported. Old web browser versions contribute to the overall “not-so-great” performance of some web applications. WebCenter Content makes extensive use of JavaScript for its browser interface.

Main Article

In the case of WebCenter Content, section 2.1 “Using a Supported Web Browser” from the Oracle WebCenter Content User’s Guide for Content Server 11g Release 1 (11.1.1) states the following:

Consumers and contributors access Content Server from a standard Web browser. The computer you use to access Content Server is a client computer. You can access Content Server on a supported client computer from a Web browser listed in Table 2-1.

Table 2-1 Supported Web Browsers

Browser Versions
Internet Explorer 7.0 or higher
Firefox 3.5 or higher
Safari 4.0 or higher
Google Chrome 10.0 or higher


Let’s take a look at the browser market for starters. According to StatCounter Global Stats, from January to December 2012, the following four web browsers captured 84% of usage:

  • Chrome: 32.78%
  • Firefox 5+: 20.88%
  • IE 9.0: 15.88%
  • IE 8.0: 14.61%

[Figure: StatCounter top browser versions for 2012]

Source: http://gs.statcounter.com/#browser_version_partially_combined-ww-monthly-201201-201212-bar

WCC makes extensive use of JavaScript for its browser interface. The navigation structure, menus, option lists, folder item listings, and other parts of the page are drawn dynamically by the browser using JavaScript. The reason for this is to provide less download/network traffic as the JS files can be cached on the client. It also alleviates work having to be done on the server-side to render certain parts of the page. But some browsers can process JavaScript faster than others.

Using an independent JavaScript benchmark site, SunSpider (http://www.webkit.org/perf/sunspider/sunspider.html), we measured that performance differences between IE8 and the others were dramatic. The primary reason for the difference between browsers is the JavaScript engine that each uses to process client-side JavaScript.

Microsoft has tuned IE9.0 to run well for the WebKit SunSpider JavaScript Benchmark, and their results are shown below for the IE 9.0 Final Release version:

[Figure: SunSpider benchmark results for the IE 9.0 final release]

Source: http://ie.microsoft.com/testdrive/benchmarks/sunspider/default.html

Therefore, when choosing a browser to access a WebCenter Content server, make sure it is a modern browser with a good JavaScript engine. Beware, however, that you may still encounter IE 8.0 at many organizations. Some of these are still running Windows XP as the operating system – according to StatCounter, about 31% of desktop internet usage is on Windows XP – and on that O/S, IE is capped at IE8; IE9 cannot be installed on Windows XP. Chrome 10+, Firefox 5+, IE 9.0+, Safari 5+, and Opera 11+ will do just fine. With this, the Recommended Web Browsers for WebCenter Content 11g table would be:

Browser Versions
Internet Explorer 9.0 or higher
Firefox 5.0 or higher
Safari 5.0 or higher
Google Chrome 10.0 or higher


Adding browser search engines in WebCenter Content


Introduction

In a post I made a few years ago, I described how you can add WebCenter Content (UCM at the time) search to the browser’s search engines.  I think this is a handy shortcut if you find yourself performing searches often enough in WCC.

Main Article

Well, in the PS5 release, this was actually included as a new feature.  You need to enable the DesktopIntegrationSuite component in order to access it.  Once you do, go to the My Content Server -> My Downloads link.  There you will see the ‘Add browser search’ link.

Add Browser Search

Once clicked, an OpenSearchDescription XML file is produced, which modern browsers support for registering the search engine.

Search bar

The one piece that’s missing is something I mentioned in my earlier post: forcing authentication.  If you haven’t logged into the server, your search will be performed anonymously and you will only get back content that is available to the guest role.  To make sure the search is performed as your user, the extra parameter Auth=Internet can be passed to the server to cause the server to challenge your request and force a login if needed.  Because the definition of the search engine URL is defined within the DesktopIntegrationSuite component, a new custom component can be added to override this.  Basically, the new component must override the dis_search_plugin resource and modify the Url locations.  Below is an example:

<@dynamichtml dis_search_plugin@>
 <?xml version="1.0" encoding="UTF-8"?>
 <OpenSearchDescription xmlns="http://a9.com/-/spec/opensearch/1.1/"
 xmlns:moz="http://www.mozilla.org/2006/browser/search/">
 <ShortName><$if DIS_SearchPluginTitle$><$DIS_SearchPluginTitle$><$else$>Oracle WebCenter Content Server Search<$endif$></ShortName>
 <Description><$lc("wwDISSearchPluginDescription")$></Description>
 <Url type="text/html" method="get" template="<$xml(HttpBrowserFullCgiPath & "?IdcService=DESKTOP_BROWSER_SEARCH&Auth=Internet&MiniSearchText={searchTerms}")$>" />
 <$iconlocation=strReplace(HttpBrowserFullCgiPath,HttpCgiPath,"") & HttpImagesRoot & "desktopintegrationsuite/dis_search_plugin.ico"$>
 <Image height="16" width="16" type="image/x-icon"><$iconlocation$></Image>
 <Developer>Oracle Corporation</Developer>
 <InputEncoding>UTF-8</InputEncoding>
 <moz:SearchForm><$xml(HttpBrowserFullCgiPath & "?IdcService=DESKTOP_BROWSER_SEARCH&Auth=Internet&MiniSearchText=")$></moz:SearchForm>
 </OpenSearchDescription>
 <$setContentType("application/xml")$>
 <$setHttpHeader("Content-Disposition","inline; filename=search_plugin.xml")$>
 <$setHttpHeader("Cache-Control", "public")$>
 <@end@>

I’ve included a sample component that does just that.

UPDATE (Jan 15, 2013)

In addition to enabling the component, there is also a configuration preference that must be enabled.   After enabling the Desktop Integration Suite component,  go to the ‘advanced component manager’.  Go to the bottom to the ‘Update Component Configuration’ list and select DesktopIntegrationSuite and click Update.  The first entry is to ‘Enable web browser search plug-in’.  Check that and click Update.

DIS Configuration

If you’ve already restarted to enable the DIS component, you do not need to restart for this configuration to take effect.

Migrating folders and content together in WebCenter Content


Introduction

In the case of migrating from one WebCenter Content instance to another, there are several different tools within the system to accomplish that migration depending on what you need to move over.

This post will focus on the use case of needing to move a specific set of folders and their contents from one instance to another.  The folder architecture in this example is Folders_g. Although Framework Folders is the recommended folders component for WebCenter Content 11g PS5 and later, there are still cases where you must use Folders_g (e.g. WebCenter Portal, Fusion Applications, Primavera, etc.).  Or perhaps you are on an older version and Folders_g is the only option.

Main Article

To prepare, you must first have the FoldersStructureArchive component enabled on both the source and target instances.  If you are on UCM 10g, this component will be available within the CS10gR35UpdateBundle/extras folder.  In addition to enabling the component, there is a configuration flag to set.  By default, the config variable ArchiveFolderStructureOnly is set to false which means content will be exported along with the folders, so that can be left alone.  The config variable AllowArchiveNoneFolderItem is set to true by default which means it will export content both in the folder structure as well as those not selected…or even outside of folders.  Basically, it means you must use the Export Criteria in the archive to control the content to export. In our use case, we only want the content within the folders we select, so the configuration should be set as AllowArchiveNoneFolderItem=false.  Now only content that is in our selected folders will get exported into the archive. This can be set in the General Configuration in the Admin Server.
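Put together, the configuration for this use case boils down to the two variables discussed above (a sketch of the relevant config.cfg entries; set them via the Admin Server General Configuration as described):

```
ArchiveFolderStructureOnly=false
AllowArchiveNoneFolderItem=false
```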

You will also need to make sure the custom metadata fields on both instances are identical. If they are mismatched, the folders will not import into the target instance correctly. You can use the Configuration Migration Utility to migrate those metadata fields.

Once the component is enabled and configurations set, go to Administration -> Admin Applets -> Archiver and select Edit -> Add… to create a new archive.

New Archive

Now that the archive is established, go back to the browser and go to Administration -> Folder Archiver Configuration.  For the Collection Name, it will default to the local collection.  Change this if your archive is in a different collection.  Then select your Archive Name from the list.

Folder Archive Setup

Expand the folder hierarchy and you can now select the specific folder(s) you want to migrate.  One thing to keep in mind is the parent folders of the ones you are selecting.  If the idea is that you want to migrate a certain section of the folder hierarchy to the other server and have it land in the same place in the target instance, you want to make sure that the parent folder already exists in the target.  It is possible to migrate a folder and place it within a different parent folder in the target instance, but then you need to make sure you set the import maps correctly to specify the destination folder (more on that later).

Select Folder

Once they are selected, click the Add button to save the configuration.  This will add the right criteria to the archive. Now go back to the Archiver applet.  Highlight the archive and select Actions -> Export.  Be sure ‘Export Tables’ is selected.  Note: If you try using the Preview on either the contents or the Table data, both will show everything and not just what you selected.  This is normal. The filtering of content and folders is not reflected in the Preview. Once completed, you can click on the View Batch Files… button to verify the results.  You should see an entry for the Collections_arTables and one or more for the content items.

View batch files

If you highlight the Collections row and click Edit, you can view and verify the results.

Collections Table

You can do the same for the document entries as well.

Once you have the archive exported, you need to transfer it from the source to the target instance. If I don’t have the outgoing providers set up to do the transfer, I sometimes cheat and copy over the archive folder from <cs instance dir>\archives\{archive name} directly over to the other instance.  Then I manually modify the collection.hda file on the target to let it know about the archive:

@ResultSet Archives
 2
 aArchiveName
 aArchiveDescription
 exportfoldersandfiles
 Export some folders and files
 @end

Or if I have Site Studio installed and my archive is fairly small, I’ll take the approach described in this earlier post.

Before you import the archive on the target, you need to make sure the folders will be going into the right “parent” folder. If you’ve already migrated the parent of your folders to the target instance, then the IDs should match between instances and you should not have to do any import mappings. But if you are migrating the folders and the parent IDs will be different on the target (such as the main Contribution Folders or the WebCenter Spaces root folder), then you will have to map those values.

First, to check what a folder’s ID is, you can simply place your mouse over the link to the particular folder to get its ID.  It will be identified as dCollectionID in the URL.  Do this on both the source and target instances.

Collection ID

In this example, the dCollectionID on the source instance for the parent folder (Contribution Folders) is 826127598928000002.  On the target instance, its Contribution Folders ID is 838257920156000002.  So that means when the top level ‘Product Management’ folder in our archive moves over, the ID that specifies the ParentID needs to be mapped to the new value. So now we have all the information we need for the mapping.

Go to the Archiver on the target instance and highlight the archive.  Click on the Import Maps tab and then on the Table tab.  Double-click on the folder and then expand the data entry.  It should then show the Collections table.

Import Table map

Click on the Edit button for the Value Maps. For the Input Value, you want to enter the value of the dCollectionID of the parent folder from the source instance. In our example, this is 826127598928000002. For the Field, you want to change this to be the dParentCollectionID. And for the Output Value, you want this to be the dCollectionID of the parent folder in the target instance.  In our example, this is 838257920156000002.  Click the Add button.
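Conceptually, the value map performs a simple substitution on each incoming Collections row. The following Python sketch is illustrative only (the function and row layout are not the Archiver’s actual code), using the example IDs above:

```python
# Illustrative only: mimics what the Archiver import value map does --
# rows pointing at the source instance's parent folder ID are
# re-pointed at the target instance's parent folder ID.

SOURCE_PARENT_ID = "826127598928000002"  # Contribution Folders on source
TARGET_PARENT_ID = "838257920156000002"  # Contribution Folders on target

def map_parent(row):
    if row.get("dParentCollectionID") == SOURCE_PARENT_ID:
        row = dict(row, dParentCollectionID=TARGET_PARENT_ID)
    return row

row = {"dCollectionName": "Product Management",
       "dParentCollectionID": SOURCE_PARENT_ID}
print(map_parent(row)["dParentCollectionID"])  # 838257920156000002
```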

Value mapping

This will now map the folders into the correct location on target.

The archive is now ready to be imported.  Click on Actions -> Import and be sure the ‘Import Tables’ check-box is checked. To check for any issues, be sure to go to the logs at Administration -> Log Files -> Archiver Logs.

And that’s it.  Your folders and files should now be migrated over.

Conversions in WebCenter Content


Introduction

One of the guiding principles with WebCenter Content has been to make it as easy as possible to consume content.  And part of that means viewing content in a format that is optimal for the end user… regardless of the format the content was created in.  So WebCenter Content has a long history of converting files from one format to another.  Often this involves converting a proprietary desktop publishing format to something more open that can be viewed directly from a browser.  Or taking a high-resolution image and creating a rendition that downloads quickly over a slow network.

Over the life of the product, the types and methods for those conversions have grown to provide a broad range of options.  It’s sometimes confusing to know what conversions are available and where exactly they are done (Content Server or Inbound Refinery), so I’ve put together a flowchart and list describing all of the different types of conversion, how and where they are done, and the pros and cons of each.  This list covers what’s available as of the current release – WebCenter Content 11g PS5.

WCC Conversion Decision Tree
(click for full-version)

Main Article

PDF Conversions

Where: Inbound Refinery
When: Upon check-in
How: Multiple ways
Platform: All (* but depends)

So PDF conversions are probably the most common type of conversion done with WCC.  This involves converting a desktop publishing format (e.g. Microsoft Word) into Adobe PDF format.  The benefits obviously include being able to read the document directly in the browser (with a PDF reader plug-in) and not requiring a 3rd party product to read the proprietary format. In addition, PDFs provide benefits such as being able to start viewing the document before the entire file downloads, possible compression of the file size, and the ability to apply watermarks and additional security to the file.  And optionally, PDF/A format can be chosen, which is recognized as an approved archival format.

Within PDF conversions, there are several different methods that can be used to create the PDF, depending on the needs and requirements.

PDFExportConverter – This method uses Oracle’s own OutsideIn filters to directly convert multiple format types into PDF.  The benefits include multiple platform support (any platform that WCC supports), the fastest conversion, and no 3rd party software requirements.  The main downside is that this type of conversion has the lowest fidelity to the original document, meaning it won’t always exactly match the look and feel of the original.  These formats are supported by the OutsideIn filters for conversion to PDF.

WinNativeConverter – Like the name implies, this type of conversion uses the native applications on Windows to do the conversion.  By using the original application that was used to create the document, you will get the best fidelity of PDF compared to the original.  The downside is that the Inbound Refinery can only be run on Windows and not other platforms.  It also requires a distiller engine to convert the PostScript format that gets printed from the native applications to PDF.  The recommended choice for that is AFPL Ghostscript.

OpenOfficeConversion – The Open Office conversion is a bit of a compromise between the two types of conversions mentioned above.  It uses Apache OpenOffice to open and convert the native file. In most cases, it will give you better PDF fidelity than PDFExportConverter, but still not as good as WinNativeConverter.  It also supports more than just Windows, so it has broader platform support than WinNativeConverter.

Tiff Converter

Where: Inbound Refinery
When: Upon check-in
How: Uses a 3rd party (CVISION PdfCompressor) engine to perform OCR and PDF conversion
Platform: Windows Only

When needing to convert TIFF formatted files into PDFs, this can be done with either PDFExportConverter or Tiff Converter.  The major difference is if optical character recognition (OCR) needs to be performed on the file in order to extract the full-text off the image.  If OCR is required, then Tiff Converter is used for that type of conversion.  In addition, a 3rd party tool, CVISION PdfCompressor, is required to do the actual OCR and conversion piece.  Tiff Converter acts as the controller between the Inbound Refinery and PdfCompressor.  But because PdfCompressor is a Windows-only application, the Inbound Refinery must also be on Windows.

XML Converter

Where: Inbound Refinery
When: Upon check-in
How: Uses Oracle OutsideIn filters to convert native formats into XML
Platform: All

The XML Converter allows for native documents to be converted into 2 flavors of XML: FlexionXML (based on FlexionDoc schema) and SearchML (based on the SearchML schema).  In addition, those formats can go through additional transformation with a custom XSLT.  Because the XML Converter utilizes the Oracle OutsideIn filter technology, it supports all platforms.

DAM Converter

Where: Inbound Refinery
When: Upon check-in and updates
How: Can use both Oracle OutsideIn filters as well as 3rd party applications to do image conversions.  Flip Factory is required for video conversions.
Platform: All (* but depends)

DAM Converter is used to create multiple renditions of image or video files.  The primary goal is to convert original formats, which can typically be high resolution and large in size, into other formats geared towards web or print delivery.  One thing unique to DAM Converter is that the metadata used to specify the rendition set can be updated after the item has been submitted, which will send the file back to the Inbound Refinery to be reprocessed.

When using the image converter, the Inbound Refinery comes with the Oracle OutsideIn filters to create renditions, so nothing else is required and it can run on all platforms.  But the converter also supports other command-line-driven image converters such as Adobe Photoshop, XnView NConvert, and ImageMagick.  Some are commercial and some are freeware.  Each has different capabilities for different use cases and is supported on various platforms.  But for general-purpose resizing, resolution, and format changes, OutsideIn can handle it.

For video conversion, Telestream’s Flip Factory is required.  The DAM Converter acts as the controller between the Inbound Refinery and Flip Factory.  What makes this integration a bit unique is that it is handled purely at a file system level.  This means that Flip Factory, which is a Windows-only application, does not need to reside on the same server as the Inbound Refinery.  They simply need shared file system access between servers.  So the Inbound Refinery can be on Linux while Flip Factory is on Windows.

HTML Converter

Where: Inbound Refinery
When: Upon check-in
How: Uses Microsoft Office to convert Office documents into HTML
Platform: Windows Only

HTML Converter uses Microsoft Office to save the documents as HTML documents, collects the output (into a zip file if multiple files), and returns them to Content Server.  Using the HTML save output directly from Office, you get a very good fidelity of HTML compared to the original native format.  This is especially true for Excel and Visio which are less text-based.  The downside is you have no control over the HTML output to make any changes or provide consistency between conversions.  It’s simply formatted based on Office’s formatting.  Also, it does not apply any templating around the content to insert code before or after the content or present the document within the structure of a larger HTML page such as in the case of Site Studio.

Dynamic Converter

Where: Content Server
When: Upon check-in or on-demand
How: Uses Oracle OutsideIn filters to convert native documents into HTML
Platform: All

Like HTML Converter, Dynamic Converter converts Office documents into HTML.  But there are several key differences between the two.  First, Dynamic Converter uses OutsideIn filters to convert to HTML, so it supports a wide range of native formats. Another difference is that the processing happens on the Content Server side and not the Inbound Refinery.  This allows the conversion to happen on-demand the first time the HTML version is requested.  Alternatively, DC can be configured to do the conversion upon check-in and cache the results so they are immediately available and don’t need to go through conversion on first request. DC also supports a wide range of controls over how the HTML is precisely formatted.  The result can be very minimal and clean HTML with various div or span tags to allow styling with CSS.  This can lead to a more consistent look and feel between converted documents.  It also allows for insertion of code before or after the content to embed the output within a template, which is what is used within Site Studio.

Thumbnail Creation

Where: Content Server or Inbound Refinery
When: Upon check-in
How: Uses Oracle OutsideIn filters to create a thumbnail representation of the document to be used on search results
Platform: All

As a new feature in PS5, thumbnails can now be generated directly in the Content Server and not require the document to be sent to the Inbound Refinery (if it doesn’t need other conversions).  This allows the document to become available much more quickly.  But if the file is sent to the Inbound Refinery for other types of conversions, the thumbnail can be generated at that point.

For further information on conversions, see the documentation on Conversions as well as Dynamic Converter.

Caught in the act!


Introduction


Sometimes when troubleshooting issues, the exact cause of the issue may be difficult to find.  You may run across an error appearing in the log file.  But it may not have enough information about what went wrong…or how it might happen again.  So you can turn on tracing and watch the output, but if you don’t know when the error may happen, you may have to sift through a lot of trace logs to find the spot of the error.  That’s where Event Trap tracing comes in.

Main Article

Event Trap tracing allows you to specify keywords for the content server to look for as it’s writing out tracing to the server output.  If a keyword is found, all of the tracing in the buffer at that time is sent to a separate event tracing output file.  So now you have a nice slice of tracing activity at the exact moment the particular keyword (based on an error message or the like) is hit. In addition, a thread dump from the JVM can be obtained at the same time to capture all of the thread activity as well. By default, the keyword is Exception, so that every exception is captured this way.

Event Trap

By default, the log files can be found in the <content server instance directory>/data/trace/event directory or they can be viewed in the browser by clicking on the ‘View Event Output’ link.

Getting started with Desktop Integration Suite


 Introduction

I recently discovered the Oracle Learning Library which is a nice site for self-learning videos and tutorials on Oracle products.

Main Article

Getting Started with DIS

Marsha Hancock, Senior Principal Curriculum Developer for WebCenter Content, just posted a video on Getting Started with Desktop Integration Suite (DIS).  This is a great way to quickly understand how to connect to WebCenter Content with DIS and begin working with it.


Index of WebCenter Content articles

  • WebCenter Content
  • Moving To Oracle WebCenter Content 11g Web Services


    Introduction

    Oracle WebCenter Content 11g now provides a new set of JAX-WS Web Services interfaces. Read on to find out more about how to integrate with Oracle WebCenter Content Server 11g using these new Web Services interfaces.

    Main Article

    Oracle WebCenter Content (formerly UCM) has been a pioneer in Service-Oriented Architecture since long before the term took on the meaning it has in today’s technical realm (e.g. Web Services, UDDI, WSDL, SOAP, XML).  Since the introduction of the Intradoc 4.x release as a Java application back in late 1999/early 2000, the platform has been making its content management services (a.k.a. IdcService) accessible over the HTTP protocol.

    When the concept of Service-Oriented Architecture turned to the meaning it has today with Web Services, UDDI, WSDL, SOAP, and XML, the Oracle UCM platform followed suit in its 7.x release by making its IdcServices available to Web Service clients, publishing a set of WSDLs that were packaged and released with the product for its core services like CHECKIN_NEW, DOC_INFO, GET_SEARCH_RESULTS, etc.  In addition to the set of pre-packaged WSDLs, there was also a WSDL generator that provided the ability to create and publish WSDLs for any custom service developed using the Component Architecture.  Information on this version of the Web Service implementation is available in section 25 of the WebCenter Content developers guide located at http://docs.oracle.com/cd/E23943_01/doc.1111/e10807/c25_wsdl_and_soap.htm

    The core WSDLs and the WSDL generator are still packaged and deployed with Oracle WebCenter Content 11g; however, with the 11g release a new set of Web Service interfaces became available, deployed as Web Applications on Oracle WebLogic Server. The two new Web Service implementations that became available with WebCenter Content 11g are:

    Generic Web Service – A JAX-WS Web Service implementation whose application context root is /idcws and which publishes a single Web Service provider interface, GenericSoapService. This provider is targeted at Web Service proxy clients implemented in your development platform of choice (e.g. Java, .NET, etc.) whose proxy stubs are constructed from the WSDL published at /idcws/GenericSoapPort?WSDL.

    Native Web Service – A native SOAP-based Web Service implementation whose application context root is /idcnativews and which publishes two Web Service provider interfaces, IdcWebRequestService and IdcWebLoginService. These providers are targeted specifically at Web Service proxy clients implemented in Java that embed the Oracle WebCenter Content RIDC interface.

    When I started this endeavor my intent was to publish it in its entirety as a blog post, but the breadth and depth of the topic quickly outgrew the length of a traditional blog post, so I turned it into a technical article that is attached here. In addition to the article I have supplied a sample application that can be downloaded from here; information on the sample application is provided in the article. For those of you who need to integrate with Oracle WebCenter Content 11g over a Web Service interface, the article and sample application should give you a jump start on the latest Web Services interfaces available on the Oracle WebCenter Content 11g platform.
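As a quick illustration of what a Generic Web Service client sends, here is a minimal sketch that builds a GenericRequest SOAP envelope by hand. This is illustrative only: the element names and the http://www.oracle.com/UCM namespace are what the published WSDL describes, but verify them against /idcws/GenericSoapPort?WSDL before relying on them, and in practice you would use generated proxy stubs rather than hand-built XML.

```python
# Illustrative sketch only: hand-building a GenericSoapService request envelope.
# Element names and the UCM namespace are assumptions taken from the published
# WSDL; a real client would use proxy stubs generated from that WSDL.
from xml.sax.saxutils import escape

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
UCM_NS = "http://www.oracle.com/UCM"

def build_generic_request(idc_service, fields, web_key="cs"):
    """Build a GenericRequest SOAP envelope invoking the given IdcService."""
    field_xml = "".join(
        '<ucm:Field name="%s">%s</ucm:Field>' % (escape(name), escape(value))
        for name, value in fields.items()
    )
    return (
        '<soapenv:Envelope xmlns:soapenv="%s" xmlns:ucm="%s">'
        '<soapenv:Body>'
        '<ucm:GenericRequest webKey="%s">'
        '<ucm:Service IdcService="%s">'
        '<ucm:Document>%s</ucm:Document>'
        '</ucm:Service>'
        '</ucm:GenericRequest>'
        '</soapenv:Body>'
        '</soapenv:Envelope>'
    ) % (SOAP_NS, UCM_NS, web_key, idc_service, field_xml)

# Example: a search request, which would be POSTed to /idcws/GenericSoapPort
# with Content-Type: text/xml.
envelope = build_generic_request(
    "GET_SEARCH_RESULTS", {"QueryText": "dDocType <matches> `Document`"}
)
```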

    Getting Started with WebCenter Portal – Content Contribution Project – Part 1


    Introduction

    A very common use-case for WebCenter applications is to use them for content contribution projects. WebCenter Portal and WebCenter Content are designed to work seamlessly with each other. Once set up correctly, a content contribution user can make all content changes from within the Portal application, without going into the content administration application.

    The following article shows the steps required to start a new WebCenter project for content contribution.

    Main Article

    Jdeveloper Setup

    Create a new Content repository connection in JDeveloper and add the following values.

     

    Parameter                     Value
    RIDC Socket Type              socket
    Server Host Name              <Content Host name>
    Listener Port                 4444
    Context Root                  /cs
    Cache Invalidation Interval   2
    Authentication                Identity Propagation

     

    In our example the content server is running on a local VM, hence the host name is localhost.

     content-rep-conn

     

    Application Setup

    Create a new Portal Framework Application

    create-app

    Index Page: Change the default page in index.html to be your home page. This will show a pretty URL when the home page is rendered.

    index

     

    Login Success Page: To show a pretty URL after login, change the login_success page in faces-config.xml to your home page. This will always redirect to the home page after login, so do not change it if users can log in from any page in your application.

    faces-config

    Page Hierarchy: Once the necessary design-time pages are created, add them to the page hierarchy and set the appropriate Title and security options. Failure to do so will prevent the page from being shown in the UI.

    Contrary to the name, the page hierarchy shown in the UI is controlled by the Navigation Model.

    pages

     

    Navigation Model: Best practice states that all pages should be added as Page Links within the Navigation Model. The navigation hierarchy created here dictates how page URLs are created.

    nav-model

     

    Page Template: Most users choose to create their own custom template. Our recommendation is to keep taskflows to a minimum for optimal page performance. To allow users to edit pages at runtime, add the page editor tags either within the template or within the page.

     

      <pe:pageCustomizable id="hm_pgc1">
        <cust:panelCustomizable id="hm_pnc1" layout="scroll">
          <af:facetRef facetName="content"/>
        </cust:panelCustomizable>
        <f:facet name="editor">
          <pe:pageEditorPanel id="pep1"/>
        </f:facet>
      </pe:pageCustomizable>

     

    Content Presenter: Content can be added directly via JDeveloper or at runtime via the composer. Custom content presenter templates should be created as per requirements.

    When adding the content presenter via JDeveloper, surround it with a showDetailFrame so that it can be edited at runtime by the composer.

     

    create-presenter

     

    Configuration Settings

    High Availability Support: Most production-grade applications run on multiple servers in a cluster, hence “High Availability for ADF Scopes” should be checked so JDeveloper can prompt the developer when high availability violations are detected.

    high-avail

    Cookie Path: Cookies are enabled by default in weblogic.xml and should remain that way. Additionally, the Cookie Trigger Path should be set to a unique value; a good rule of thumb is to set it to the same value as the context root.

    cookie-trigger

     

    Context Root: Needless to say, the context root should always be set to an appropriate name. This is done via the Java EE Application entry in the Portal project properties. Optionally, the deployment profile name and application name can also be changed for clarity.

    context-root

     

    Refer to performance blogs for other application configuration settings.

     

    Deployment

    Integrated Server deployment: For design/development-time testing, the application can be deployed by running the index.html page.

    image021.

     

    In Part 2 we will explain how the runtime environment should be set up for development-time testing.

     

    Getting Started with WebCenter Portal – Content Contribution Project – Part 2


    Introduction

    This continues from Part 1 and dives into the runtime environment setup of a WebCenter Portal project.

    Main Article

    After following Part 1, the application looks like this:

    image021

    Runtime Activities

    In order to edit the page, an Administrator login is needed. The user can log in by clicking the Login link.

    image023

     

    Update Content

    Hit Ctrl+Shift+C to enable contribution mode. In this mode the user can update the content on the page, e.g. replacing an image from the content server or changing text.

    image025

    Add new content

    New content can be added to the page by adding a Content Presenter from the Resource Catalog.

    Hit Ctrl+Shift+E to reload the page in Edit mode.

    image027

    To add content based on a WebCenter Site Studio region definition, click the “Create Web Content” button. The detailed process of setting up a Site Studio region for consumption in the content presenter is given in this blog.

    image029

    Clicking Create Web Content will, however, give a 404 page even though the content server is running. The reason is that the popup contains an iframe which points to <portal-hostname>/cs. In order to get this working on a local machine, both the portal and the content server must be served via the same web server.

    image031

    OHS Configuration

    The OHS configuration needs to be changed to route to both the local portal and the remote content server.

    In our test environment the OHS and content server are on a local VM while the JDeveloper integrated server is running on a desktop machine, hence we use the VirtualBox Host Network IP address. If OHS is installed locally then hostname 127.0.0.1 can be used, while if OHS is external then the desktop’s IP address can be used. Make sure the external OHS has access to both the content server and the desktop’s integrated server.

    image035

    Open the mod_wl_ohs.conf file located in the WebTier installation.

    image033

    Add the highlighted code below, making sure to replace the host IP accordingly. The port number is the integrated server port shown in the browser during the previous run.

    Also make sure a <Location> tag for /cs is created as well.

    image037
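A sketch of what the additions might look like (the host IPs and port numbers below are assumptions for illustration; substitute the values from your own environment, and note that 7101 stands in for the integrated server port and 16200 for the content server port):

```apache
# Route the portal context root to the JDeveloper integrated server on the desktop
<Location /mycontent>
  SetHandler weblogic-handler
  WebLogicHost 192.168.56.1
  WebLogicPort 7101
</Location>

# Route /cs to the WebCenter Content managed server on the VM
<Location /cs>
  SetHandler weblogic-handler
  WebLogicHost 192.168.56.101
  WebLogicPort 16200
</Location>
```

With both Location blocks in place, the portal page and the content server iframe are served from the same OHS host, which is what the Create Web Content popup requires.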

    Restart OHS and run the page with a URL like http://<ohs-hostname>/mycontent, which will route the JDeveloper page via OHS.

    Follow the edit steps again to select a region and content from the content server seamlessly.

    content_working

     

    Sign Here Please


    Introduction

    For those of you who manage a process which requires you to capture electronic signatures on the documents that are part of that process, this blog post is a must-read.

    Main Article

    Sign Here

    With the 11.1.1.6.0 (PS5) release of WebCenter Content 11g a new, and not so well publicized, feature called Electronic Signatures was introduced. The electronic signatures feature is enabled in Oracle WebCenter Content 11g by enabling the ElectronicSignatures component through the Component Manager web interface. To avoid confusion on terminology that tends to travel in close company, a distinction should be made between electronic and digital signatures before proceeding with the Oracle WebCenter Content 11g electronic signature features. An electronic signature allows you to capture specific data points (e.g. name of signer, date of signature) that represent the intent of signature, and provides an electronic record of proof that someone has authorized the information contained within a document. Electronic signatures are not embedded within the document they are associated with. Digital signatures can be considered an implementation of electronic signatures; however, unlike electronic signatures, digital signatures are implemented with cryptographically based public and private keys. These keys cannot easily be repudiated, represent a user’s/approver’s handwritten signature, and provide authenticity that is directly bound to the document or message being sent. In some instances digital signatures can also refer to capturing a digital image of a person’s handwritten signature that is linked to or imprinted on a document.

    In Oracle WebCenter Content 11g the implementation of Electronic Signatures provides you with the following features that I will cover in this post:

    End User:

    • Document Check Sum Creation
    • Electronic Signature Capture
    • Electronic Signature Viewing
    • Matching of a local file to revisions of a specific WebCenter Content managed content item
    • Searching for a document in the WebCenter Content Repository based on a local file


    Administrative:

    • Signature Metadata Configuration
    • Signature Watermark Configuration

    Document Check Sum Creation

    When the Electronic Signature feature is enabled, a checksum is calculated during the check-in of a document. The calculated checksum value is stored as metadata with each revision of the document being checked in, in a custom metadata field called xCheckSum. The checksum value is based solely on the document contents and does not depend on an electronic signature being captured for the document. The supported algorithms for computing the document checksum value are:

     

    • MD2
    • MD5
    • SHA-1
    • SHA-256
    • SHA-384
    • SHA-512

     

    The checksum algorithm used can be altered through a Checksum_Algorithm environment configuration entry that is part of the Electronic Signature component configuration; on initial installation of the component it is set to SHA-512. The computation of the checksum can be disabled by setting the environment configuration entry primaryFile:computeChecksum to a value of false. When a document checksum value is computed, it is used by the following features, discussed further down in this post:

    • Matching a local file to a selected content item’s revision(s) based on the calculated checksum value
    • Searching for files in the WebCenter Content repository based on the checksum value of a local file
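To make the matching idea concrete, here is a minimal sketch in Python of how a content-addressed comparison like this works. It is illustrative only; the server's actual implementation is internal to the ElectronicSignatures component, and the function names below are my own.

```python
# Illustrative sketch of checksum-based file matching, as the local-file
# matching features described above conceptually work. Not the server's code.
import hashlib

def file_checksum(path, algorithm="sha512"):
    """Compute the hex checksum of a file's contents, reading in chunks."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_revision(local_path, stored_checksum, algorithm="sha512"):
    """True when the local file's checksum equals a stored xCheckSum value."""
    return file_checksum(local_path, algorithm) == stored_checksum.lower()
```

Because the checksum depends only on the file bytes, comparing a local file's digest against the stored xCheckSum value is sufficient to decide a match, provided both sides use the same configured algorithm.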

    Electronic Signature Capture

    Electronic signature capture can be performed on documents whether or not they are participating in a workflow approval process. The following illustrations outline the user experience for capturing an electronic signature in both scenarios.

    Outside of Workflow Approval Process

     

    1. From the document information page of a content item revision, select the Sign Content option from the Content Actions menu.

    approvenowf



    2. The Sign Content Item dialog is displayed, where the user supplies electronic signature metadata. The password that is supplied is authenticated against the security realm of the WebLogic domain the WebCenter Content application is running in. This is done to ensure that the user supplying the electronic signature is truly the user identified by the user id. The password is not stored as part of the signature data.

    EsigForm

    Sign Content Item Dialog

    In a workflow approval process

    When the electronic signatures feature is enabled, the workflow approval tasks are modified by changing the standard Approve action to a Sign and Approve action.


    1. The user selects the Sign and Approve action on the workflow review task.

    SignatureWfTask



    2. The approval of the content presents the user with the Sign Workflow Content Item dialog, which allows the user to supply an electronic signature before the document advances in the workflow process.

    SignWfContentForm



    Note: By default, capturing an electronic signature requires SSL to be enabled on the Oracle WebCenter Content instance, via the UseSSL configuration entry being set to true in the WebCenter Content core config.cfg configuration file. When you think of it this makes sense, as a user is going to have greater peace of mind applying a signature to a document that has no potential of being modified in transit, which is what SSL encryption provides. For implementations of WebCenter Content whose data only traverses internal corporate network segments, where there is no concern of data being breached in transit, the following configuration entry can be set in either the ESignature environment resource configuration file or the core WebCenter Content config.cfg file to disable the SSL requirement for signature capture:

    • DisableESigSSLCheck=true


    Electronic Signature Viewing

    When an electronic signature has been captured on a specific content item revision, the content item’s document information page will display a Signatures tab that allows the user to view the signatures that have been captured. (Note: a content item revision can receive more than one electronic signature, as illustrated below.)

    SignatureList

    Clicking the actions icon in the column of a particular signature row will display additional details about the signature.

    SignatureDetails


    Matching of a local file to revisions of a specific WebCenter Content managed content item

    This feature uses the checksum value calculated for the revisions of checked-in content items to let users see whether a document on their local file system matches any revision of a selected content item. The following illustrations walk through the end-user experience of matching a local file to a content item revision.


    1. From the Signatures tab of the content item’s document information page, the user selects Search For Local File in Repository.

    MatchLocalAction


    2. The user is presented with the Search For Local File dialog, where they use the Choose File button to browse their local file system for a file. The user clicks the Search button to submit a request to see if the locally selected file matches any of the revisions of the currently selected WebCenter Content managed content item.

    MatchLocalForm


    3. The user is presented with either the File Search Successful or File Search Unsuccessful dialog, depending on whether the local file matched any revisions of the selected managed content item based on the calculated checksum value.

    MatchLocalResultsSuccess

    MatchLocalResultFailed


    Searching for a document in the WebCenter Content Repository based on a local file

    This feature uses the checksum value calculated on revisions of checked-in content items to allow users to search the WebCenter Content repository for content items whose checksum value matches that of a file on their local file system. The following illustrations walk through the end-user experience.


    1. From the Content Management menu, the user selects Search For Local File In Repository.

    SearchForLocalAction


    2. The user is presented with the “Search For Local File” dialog. The user clicks the Choose File button to browse their local file system for the file they want to use for the search. After the file is selected, the Search button is clicked to initiate the search.

    SearchForLocalForm


    3. The user is presented with either the File Search Successful or File Search Unsuccessful dialog, depending on whether any content items matched the selected local file based on the calculated checksum value.

    SearchForLocalResultsSingle

    Single Content Item Match


    SearchForLocalResultMultiple

    Multiple Item Content Matches


    Electronic Signature Metadata Configuration

    There is one administration option available with the Oracle WebCenter Content Electronic Signatures feature that allows you to extend the default set of electronic signature metadata. The default signature metadata supplied with the initial installation of the signature feature is:

    • User ID of signer
    • Full Name of Signer
    • Date that signature is captured


    The following illustration shows the administration interface that is provided for setting up additional signature metadata.

    EsigAdmin

    Electronic Signature Document Watermarking

    When the Electronic Signature feature is enabled along with PDF Watermark, the PDF Watermark capabilities are enhanced to provide the ability to apply a watermark of selected electronic signature data on a PDF rendition of a document. The following illustrations show how to configure a watermark template with electronic signature data using the PDF Watermark Administration applet. For complete instructions on PDF Watermark administration, refer to the Oracle Fusion Middleware Managing Oracle WebCenter Content guide.

    As it relates to electronic signatures and PDF watermarks, there are two main configuration points to consider:

    • Rules
    • Template Configuration

    There is a custom Boolean metadata field named xESigHasElectronicSignature maintained with each content item that gets stamped with a value of 1 when an electronic signature has been captured. I have found this a good metadata field to use in a rule for a watermark template that will be used to watermark an electronic signature on a PDF rendition of a document.
    PDFWaterMarkRule


    When the electronic signature feature is enabled, the Edit Template dialog of the PDF Watermark Administration applet has a SignatureWaterMark tab added. The illustration below shows the interface for adding electronic signature fields. It should be noted that the only electronic signature fields available for applying to a watermark template are the custom ones added through the Electronic Signature Administration page.

    PDFWaterMarkSignature

    Extended Functionality

    As I was working with the features of the Electronic Signature component, I thought the document checksum calculation it provides lent itself to some nice feature enhancements, which I have implemented in an add-on component called ESignatureExtensions. The extended features supplied with this component are:

    • File Download validation

    This feature compares the checksum calculated when the content item revision was originally checked in to a checksum computed when a request to download the document is initiated through the GET_FILE service, validating that the file has not been modified on the server’s file system after it was checked in. If the comparison of checksums fails, indicating the file has been changed, the file download is interrupted and the user is presented with the following message:

    downloaderrror

    • Locating duplicate content items of a selected revision or all revisions of a currently selected content item

    This feature uses the checksum calculated on the currently selected revision, or on all revisions of a content item, to search the repository for other content items which are duplicates based on the checksum value computed from the file contents. These features are accessible from the content information page actions menu.
    findduplicatesactions2


    findduplicatesresults3

    • Scanning the entire repository for duplicate content items

    This feature uses the checksums computed on documents to scan the entire repository for content items which are duplicates of each other. In addition, this feature provides the ability to download the results of the scan to a CSV-formatted file that can be distributed to end users to view and analyze the data in Microsoft Excel, assisting in the process of eliminating duplicate content from the repository.
    scanrepodups4

    ~Happy Signing~

    WebCenter Content and Multiple Identity Providers: The Virtualization Issue


    A common scenario that arises with the WebCenter Content suite of products is one where an external LDAP directory such as Oracle Internet Directory or Active Directory is used along with the embedded WebLogic LDAP ‘DefaultAuthenticator’. By default, only users/groups from the primary authenticator (the first authenticator in the WLS provider list) are available to Oracle Platform Security Services (OPSS). When the virtualization setting is enabled, however, the identity store merges the roles from both the external LDAP and the embedded LDAP, solving the problem where OPSS will only use the top provider defined in the list in the WLS security realm.

    To view whether the setting is enabled or not, Enterprise Manager can be used.

    1. Log in to Fusion Middleware Control and navigate to Domain > Security > Security Provider Configuration to display the Security Provider Configuration page.
    2. Expand, if necessary, the area Identity Store Provider, and click the “Configure” button to display the page Identity Store Configuration.
    3. In the Custom Properties section, if turned on, the virtualize setting will be displayed.

     idstore-virtualize
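For reference, enabling the setting in Enterprise Manager adds a custom property to the identity store service instance in the domain's jps-config.xml. A sketch of the resulting fragment, with instance and provider names assumed from a typical OPSS configuration (yours may differ):

```xml
<!-- Illustrative jps-config.xml fragment: identity store with virtualize on -->
<serviceInstance name="idstore.ldap" provider="idstore.ldap.provider">
  <!-- ...existing identity store properties... -->
  <property name="virtualize" value="true"/>
</serviceInstance>
```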

     

    The side effect of this is added complexity in looking up user groups, since additional work is then needed to authorize users. Given that most customers only use the DefaultAuthenticator for one user (weblogic), turning on virtualize for the sake of one user is not recommended. The impact of using virtualize can be significant, depending on the complexity of the external LDAP directory. The solution, then, is to determine how to avoid using the DefaultAuthenticator at all, and to use only the external LDAP directory for all user roles, including the WebLogic Console administrator.

    Turning off the Default Authenticator can be simple with Oracle Internet Directory, but only if the OID administrator will create a user named “weblogic” (or whatever admin user name is chosen) and add that user to a group called “Administrators” in OID. In the case of Active Directory this is not as simple: in Active Directory the “Administrators” group has special meaning, just like it does in WebLogic, so there is a naming collision. AD admins are loath to add any user to the Administrators group, since that opens up the domain to that user. This would mean a “weblogic” user would have full access to AD, and no AD administrator is likely to give the nod to that request.

    The solution to this naming issue with the Administrators group in WebLogic is to use a different LDAP group, such as FMWAdministrators, separating Active Directory’s need to protect the Administrators group from the need of WLS and application users to have full access. Once an FMWAdministrators group exists in Active Directory, the XACML (eXtensible Access Control Markup Language) settings in WebLogic can be updated to use FMWAdministrators instead of Administrators for allowing access to the WLS console. Of course, the “weblogic” administration user needs to be a member of the FMWAdministrators group for this to work properly.

    The primary reason to remove the Default Authenticator is to improve performance. The virtualize=true setting is easy to turn on but adds complexity to the user authorization process. In development and test environments using this setting may not show any performance degradation, but in production this can lead to unwanted side effects in your applications as the LDAP structure becomes increasingly complex. The best scenario in production is to use your external enterprise LDAP directory, such as OID or Active Directory, and turn off the Default Authenticator.

     

    WebCenter Content prerequisite steps

    Removing the DefaultAuthenticator for use with WebCenter Content and WebCenter Content: Imaging requires a set of steps that must be performed in order to maintain proper access to content. Like WebLogic and Active Directory, WebCenter Content also uses the Administrators group for assigning elevated access. Before removing the WebLogic DefaultAuthenticator, create a Credential Map in WebCenter Content that maps FMWAdministrators to both the Administrators and admin roles.

    Example Credential Map name: FMWAdministrators

    Example Credential Map contents:

    |#all|, %%
    FMWAdministrators, Administrators
    FMWAdministrators, admin

     

    This mapping allows the content server to make use of the FMWAdministrators group for admin users. The map must be set for use in the JpsUserProvider, otherwise it will not take effect; the provider.hda file must be updated to contain a line like the one shown below. For the JpsUserProvider, the provider.hda file is located at this path:

    <domain>/ucm/cs/data/providers/jpsuserprovider

    Keep in mind that if WebCenter Content is clustered, this path will be on the shared file system, not on local disk. Edit the file in a text editor; the following line can be added anywhere in the “@Properties LocalData” section.

    ProviderCredentialsMap=FMWAdministrators
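For illustration, the relevant section of provider.hda would then look something like this (the surrounding provider entries, which vary by installation, are elided):

```
@Properties LocalData
...existing provider entries...
ProviderCredentialsMap=FMWAdministrators
@end
```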

     

    Upon restart, the JpsUserProvider in WebCenter Content will begin mapping the FMWAdministrators role. An important point: add this Credential Map prior to removing the Default Authenticator, otherwise you will have to add the map manually on the file system.

    The other requirement for WebCenter Content when removing the Default Authenticator is that the Admin Server link will not work in WCC unless the admin user (e.g. weblogic) is a member of either the Administrators or sysmanager group; WebCenter Content blocks access to the Admin Server otherwise. In the case of Active Directory you will likely not be using Administrators, which is why the FMWAdministrators group is needed in the first place. Thus the only other option is to create a group called “sysmanager” in the external LDAP directory (AD or OID) and assign the desired WebCenter Content admin user to that group, whether it is weblogic or another user.

     

    Removing the Default Authenticator

    Many of the steps below are similar to the Oracle Business Intelligence documentation on “Using Alternative Authentication Providers”; however, some of the BI-specific users and groups are not needed. To view the official documentation used by BI customers, see the link below.

    http://docs.oracle.com/cd/E23943_01/bi.1111/e10543/privileges.htm#BABFHAIC

    Backup config.xml file

    Back up the system before deleting the Default Authenticator. To do so, make a copy of the <domain-home>/config directory so that it may be restored if needed.

    Create the Active Directory or OID authenticator

    This has likely already been done if you are considering removal of the Default Authenticator. If not, this provider should be moved to the top of the provider list in the WebLogic security realm and the services restarted.

    Identify or Create Essential Users Required in external LDAP

    Create the following essential users in Active Directory or OID.

    weblogic – This username may be different if you have defined another username in the embedded LDAP directory in WebLogic.
    OracleSystemUser – This user is needed for Oracle Web Services Manager (OWSM).

    Create Essential Groups in external LDAP

    To remove the Default Authenticator, certain groups must exist in the external LDAP directory.

    FMWAdministrators – This group name can be anything you choose.
    AdminChannelUsers
    AppTesters
    CrossDomainConnectors
    Deployers
    Monitors
    Operators
    OracleSystemGroup

    sysmanager  (This is the group needed for WebCenter Content access to the admin server.)

     

    After creating the users and groups in the Active Directory, the default weblogic user (or username that you’ve chosen) should be made a member of FMWAdministrators group. The OracleSystemUser should be a member of OracleSystemGroup.

    Note: The WebLogic security realm provides the capability to export an ldif file of all users and groups defined in the embedded LDAP directory. This can provide a method of exporting all users and groups at once; however, the ldif produced will need modification before it can be used to create users and groups in Active Directory or OID. To get an ldif file for the DefaultAuthenticator, in the WebLogic Console click Security Realms -> myrealm -> Providers -> DefaultAuthenticator -> Migration -> Export.
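As an illustration only of the kind of entry the external directory will need (attribute requirements vary by directory; the DN below is hypothetical, and an AD group also typically requires a sAMAccountName), a group entry might look like:

```
dn: CN=FMWAdministrators,CN=Users,DC=example,DC=com
objectClass: top
objectClass: group
cn: FMWAdministrators
sAMAccountName: FMWAdministrators
```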

     

    Update WebLogic to use FMWAdministrators for the “Admin” role

    Login to the WebLogic Admin Server console.

    Click on Security Realms -> myrealm -> Roles and Policies -> Global Roles

    Expand “Roles”.

    On the row with the role “Admin”, click “View Role Conditions”.

    Click the “Add Condition” button.

    Select “Group” from the Predicate List drop down. Click “Next”.

    In the “Group Argument Name” text field, enter “FMWAdministrators”.

    Click the “Add” button and then click “Finish”.

    At this point the “Admin” role for the WLS console will be assigned to users who are members of either Administrators or FMWAdministrators. Once the DefaultAuthenticator has been removed, the Administrators group condition should also be removed, unless you want to allow your external LDAP administrators admin access to the WebLogic Console.

    Imaging Solution Accelerators and SOA Roles

    If using the Imaging Accounts Payable Solution Accelerator, steps must be taken to update the SOA application roles in Enterprise Manager. Various roles for SOA rely on the Administrators group, thus the addition of FMWAdministrators is also needed for SOA to function as expected when the Administrators group is no longer in use.

    Add FMWAdministrators to the following SOA application roles. Out of the box, SOA has all of these set to “Administrators”:

     

    SOAAdmin

    SOAOperator

    SOAMonitor

    SOAAuditAdmin

    SOAAuditViewer

    SOADesigner

     

    This can be done through Enterprise Manager or WLST.

    Updating the SOA role using Enterprise Manager:

     

    Log in to Enterprise Manager (e.g. http://hostname:7001/em).

    Click on soa-infra.

    Click on the SOA Infrastructure dropdown and navigate to Security -> Application Roles.

    Click on the Search button in the middle of the screen (this will display the SOA application roles).

    Add the Group to each of the roles listed above.

     

    Updating the SOA role using WLST:

    The WebLogic scripting tool can also be used for this step. Change the paths to the Middleware home as needed.

    export ORACLE_HOME=/opt/middleware/Oracle_SOA1
    cd $ORACLE_HOME/common/bin
    ./wlst.sh
    
    #connect to the SOA server
    connect('weblogic','welcome1', 't3://hostname:7001')
    
    #Add the new external group name to SOAAdmin role
    grantAppRole(appStripe='soa-infra',appRoleName='SOAAdmin',principalClass='weblogic.security.principal.WLSGroupImpl',principalName='FMWAdministrators')
    grantAppRole(appStripe='soa-infra',appRoleName='SOAOperator',principalClass='weblogic.security.principal.WLSGroupImpl',principalName='FMWAdministrators')
    grantAppRole(appStripe='soa-infra',appRoleName='SOAMonitor',principalClass='weblogic.security.principal.WLSGroupImpl',principalName='FMWAdministrators')
    grantAppRole(appStripe='soa-infra',appRoleName='SOAAuditAdmin',principalClass='weblogic.security.principal.WLSGroupImpl',principalName='FMWAdministrators')
    grantAppRole(appStripe='soa-infra',appRoleName='SOAAuditViewer',principalClass='weblogic.security.principal.WLSGroupImpl',principalName='FMWAdministrators')
    grantAppRole(appStripe='soa-infra',appRoleName='SOADesigner',principalClass='weblogic.security.principal.WLSGroupImpl',principalName='FMWAdministrators')

    Updating Imaging security

    At this point, you will also want to run the MBean operation "refreshIPMSecurity" to make sure everything is updated in the Imaging managed server.

    Login into Enterprise Manager.
    Navigate down to the Imaging server under the Weblogic Domain Folder.
    Once the right hand pane refreshes, click on the ‘Weblogic Server’ drop down menu and select ‘System MBean Browser’.
    On the MBean Browser tree go to Application Defined MBeans -> oracle.imaging -> Server: IPM_server1 -> cmd -> cmd
    Click on the ‘refreshIPMSecurity’ link on the right hand pane.
    Press Invoke button.

    In WebCenter Content: Imaging (IPM), all the existing security references to DefaultAuthenticator users/groups that have not been duplicated in the external LDAP will need to be replaced with external LDAP users/groups by walking through the System Security, Definition Security, and Application Document Security using the IPM UI. This must be performed before the DefaultAuthenticator or virtualization have been removed.

    Delete the DefaultAuthenticator

    Before deleting the DefaultAuthenticator, verify that the Active Directory or OID users and groups show up in the WLS security realm. Click on the “Users and Groups” tab of the realm page to verify that the LDAP provider is finding external users and groups. Also log out of the WebLogic Console and attempt to log in as the new administrative user you have created in the external LDAP provider.

    In addition, don’t forget to turn off virtualize=true in Enterprise Manager. This step should be done before removing the DefaultAuthenticator. 

     

    1. Log in to Fusion Middleware Control and navigate to Domain > Security > Security Provider Configuration to display the Security Provider Configuration page.
    2. Expand, if necessary, the area Identity Store Provider, and click the “Configure” button to display the page Identity Store Configuration.
    3. In the Custom Properties section, if turned on, the virtualize setting will be displayed. Remove the setting.

    Once that is verified, go to the Providers tab and check the box next to DefaultAuthenticator, and then click the “Delete” button.

    Restart the WebLogic Admin/Managed Servers and verify that you can login to the WLS console as weblogic or whatever user you setup as the WLS Admin.

     

     

    Note: If users or passwords are changed for the admin users referenced in credentials for web services, csf-key values will need to be updated. Credentials are used in keys for calling web services, so if a username or password change is made in AD, the credential stores need to be updated to match. Imaging's AXF integration point can be used with web services that communicate with SOA, E-Business Suite, and PeopleSoft; if a password or username changes, the corresponding keys must be updated.
    Imaging: MA_CSF_KEY and basic.credential

    SOA: basic.credential

    EBS: SOAP user used by AXF.
    execute fnd_vault.put('AXF','AXF_SOAP_USER','SOAP_PASSWORD');

    PSFT: Integration Broker -> Node Configuration
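    One way to update a stored credential after such a change is the OPSS WLST command updateCred. The sketch below is illustrative only: the connection details are examples, and the map name and new password are placeholders you must supply (look up the actual map/key pairs, such as the one holding basic.credential, in Enterprise Manager under Security -> Credentials).

```
# WLST sketch -- run via wlst.sh and connect to the Admin Server first
connect('weblogic','welcome1','t3://hostname:7001')

# Placeholders: <map-name> and <new-password> must come from your environment
updateCred(map='<map-name>', key='basic.credential',
           user='weblogic', password='<new-password>',
           desc='updated after directory password change')
```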

     

    If using a different user or password from the original weblogic user, the boot.properties file will need to be manually updated. Back this file up and replace the encrypted user and/or password with the new values. Once the services have been restarted, this information will automatically be encrypted.

    Additional warning: If using Active Directory with WebLogic running on Linux (or another non-Windows platform), and the weblogic user is set to only be allowed to log on from certain workstations/hosts, the WebLogic servers may fail to start because of a bind failure. The weblogic user cannot be restricted to specific "Logon Workstations"; otherwise the error is difficult to locate. Run a manual ldapbind to test that the weblogic user can bind to AD. If it cannot bind, the issue may be an LDAP 531 error, meaning the user is restricted to logging on only from certain machines. An example error from an ldapbind command is shown below:

    ldap_bind: Invalid credentials ldap_bind: additional info: 80090308: LdapErr: DSID-0C0903A9, comment: AcceptSecurityContext error, data 531, v1db1

     

    Verify Imaging and Content Server login work as expected

    Login to Imaging to ensure that all applications show up as they did before when the DefaultAuthenticator was in place.

    Login to the Content Server as the weblogic user and click on the username in the upper right. The FMWAdministrators role should appear, and the user should also have the “admin” role in the list. Verify that the sysmanager role also appears. To test that the Admin Server is opening correctly, go into the Administration menu and click “Admin Server”.

     

    Conclusion

    The virtualize=true option is a powerful feature, but it is often not needed and adds a layer of complexity to security. If you have multiple external LDAP providers, you may be required to keep the virtualization setting enabled. Even in that situation, however, it is often best to remove the DefaultAuthenticator: otherwise, instead of virtualizing only the two external LDAP directories, the embedded WLS LDAP directory is virtualized as well. The WLS embedded LDAP directory can be removed safely, giving a performance bump to user interactions not only within the Imaging and WebCenter Content instances but throughout the entire ECM product.

    Lastly, performance between Imaging and WebCenter Content can be further boosted by changing the user caching settings on WebCenter Content. By default, WCC has a UserCacheTimeout of 1 minute (UserCacheTimeout=60000). This can be increased to make the link between Imaging and the WCC repository faster, since authorization checks will then be performed at a longer interval. To increase the UserCacheTimeout on the Content Server, add an entry to the config.cfg file using milliseconds as the value. To set the timeout to 1 hour, use the following:

    UserCacheTimeout=3600000

    This will make the Content Server return requests faster to Imaging, as well as reduce load on the LDAP directory servers.
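    Because the value is specified in milliseconds, a small helper avoids conversion slips. This is just an illustrative Python sketch, not part of WCC:

```python
def user_cache_timeout_ms(hours):
    """Return the UserCacheTimeout value (in milliseconds) for a cache window given in hours."""
    return int(hours * 60 * 60 * 1000)

# 1 hour corresponds to the UserCacheTimeout=3600000 entry shown above
print("UserCacheTimeout=%d" % user_cache_timeout_ms(1))
```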

     

     

    If virtualization is still needed with AD…

    For customers that require multiple security providers, the virtualize=true flag will still need to be set. Even in this case, the DefaultAuthenticator should still be removed to reduce the workload of resolving groups in the OPSS layer. This is especially important if users have many group associations (e.g. 50 assigned groups).

    When virtualize=true is enabled, a parameter called max.search.filter.length is hardcoded to 500 bytes. When making nested group membership calls, the number of (uniquemember=group_dn_1 OR uniquemember=group_dn_2 OR ...) filter values is limited by this maximum filter length. For a user with around 100 roles assigned, this results in roughly 10-15 LDAP search calls for a single user. Each LDAP search call may take only 50 milliseconds, but because the calls are made serially, and for each individual security provider, performance problems arise.

     

    However, most LDAP providers can support filters of much greater length. Increasing max.search.filter.length to a larger value can reduce the number of nested search calls significantly.
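    The effect of the filter-length cap can be sketched locally. The helper below is illustrative only (libOVD's actual batching logic is internal): it packs (uniquemember=...) clauses into OR filters under a byte budget, showing how a 500-byte cap turns one membership lookup into many searches while a 20000-byte cap collapses it to one:

```python
def membership_filters(group_dns, max_len):
    """Pack (uniquemember=dn) clauses into OR filters no longer than max_len bytes.

    Illustrative sketch only -- this is not libOVD's code; it just shows how a
    filter-length cap translates into multiple LDAP search calls.
    """
    filters, batch = [], []
    for dn in group_dns:
        clause = "(uniquemember=%s)" % dn
        candidate = batch + [clause]
        if len("(|%s)" % "".join(candidate)) > max_len and batch:
            filters.append("(|%s)" % "".join(batch))
            batch = [clause]
        else:
            batch = candidate
    if batch:
        filters.append("(|%s)" % "".join(batch))
    return filters

# A user with ~100 group memberships (example DNs)
groups = ["cn=group%03d,cn=Groups,dc=example,dc=com" % i for i in range(100)]
print(len(membership_filters(groups, 500)))    # many small searches
print(len(membership_filters(groups, 20000)))  # a single larger search
```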

    To benefit from this setting, a patch needs to be applied to the oracle_common home. Once the patch is applied, the setting needs to be added to the jps-config.xml file. To add this setting in Enterprise Manager, use the following steps. Note that this setting will only have value if virtualize=true is already enabled.

    1. Log in to Fusion Middleware Control and navigate to Domain > Security > Security Provider Configuration to display the Security Provider Configuration page.
    2. Expand, if necessary, the area Identity Store Provider, and click the “Configure” button to display the page Identity Store Configuration.
    3. In the Custom Properties section, verify the virtualize=true flag exists. If it does not, the max.search.filter.length setting is not required.

    4. Add the parameter max.search.filter.length=20000

     

    The base bug for the max.search.filter.length setting is below. This bug is fixed in 11.1.1.9 and 11.1.1.5. Backports to 11.1.1.6 and 11.1.1.7 have been requested at this time.

    Bug 17302469 - MAX.SEARCH.FILTER.LENGTH IS NOT CONFIGURABLE FOR LIBOVD PROVIDER

    Oracle Database Vault security policies for Oracle WebCenter Content


    The Oracle Database is the content repository of choice for many customers. Oracle Database Vault prevents data breaches from WebCenter Content Administrator Accounts and applications that may connect to WebCenter Content.

    Oracle WebCenter Content, an Oracle Fusion Middleware component, is an integrated suite of applications designed for managing content. Oracle WebCenter Content contains the Oracle WebCenter Content Server, which is used to manage the content repository. The content repository is used to store content and deliver it to users as needed in the correct format.

    Oracle Database Vault provides privileged user controls, command controls, and privilege analysis, helping prevent data breaches resulting from threats, external or internal, that target applications and privileged user accounts.

    These Database Vault security policies for WebCenter Content are available on the Oracle Support website via the Master Note for Oracle Database Vault security policies with Siebel, PeopleSoft, JD Edwards EnterpriseOne and WebCenter Content (Doc ID 1623425.1). This note contains an attachment (179.32 KB) that can be downloaded to access the Database Vault security policies for WebCenter Content.

    Before getting started:

    1. Make sure WebCenter Content 11.1.1.7+ is installed on Oracle Database release 11.2.0.3 or higher.
    2. Make sure the database has "TEMP" as a temporary tablespace.
    3. Installation steps:

      1. Unzip the file wcc_dbvault_sec_policies.zip into a temporary directory.
      2. Using SQL*Plus, log in to the database with the Database Vault Owner account and run the script wcc_create_dbv_policies.sql. This script is located under the create_policies directory.

       To temporarily disable the protection policies, run the 'wcc_disable_dbv_policies.sql' script, located in the 'disable_policies' subdirectory. Note: Disabled Database Vault policies expose your application data to access by privileged user accounts.
      To re-enable the security policies, go to the 'enable_policies' subdirectory and run the 'wcc_enable_dbv_policies.sql' script.
      To delete the security policies, go to the 'delete_policies' directory and run the 'wcc_delete_dbv_policies.sql' script.

      Description:

      The following security policies are installed:

    WebCenter Content Realm: This realm protects all WebCenter Content business data owned by the _IPM, _MDS, _OCS, _OCSSEARCH, _OIM, _OPSS, _ORAIRM, _ORASDPM, _SOAINFRA, _URMSERVER schemas (note that the <PREFIX> chosen at installation/configuration time needs to be added to the schema names) against unauthorized access by privileged users, such as DBAs. Access rights to business data by authorized application users remain unchanged. You may need to adjust the script files to reflect your instance users.
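    When adjusting the script files for your instance, the protected schema list can be generated from the prefix chosen at installation time. The small Python helper below is illustrative only (it is not part of the kit; "DEV" is an example prefix):

```python
def wcc_dbv_schemas(prefix):
    """Expand the installation <PREFIX> into the schema names protected by the realm."""
    suffixes = ["IPM", "MDS", "OCS", "OCSSEARCH", "OIM", "OPSS",
                "ORAIRM", "ORASDPM", "SOAINFRA", "URMSERVER"]
    return ["%s_%s" % (prefix, suffix) for suffix in suffixes]

print(wcc_dbv_schemas("DEV"))
```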

    Connect Command Rule: This command rule allows connections to the database by various users according to specific security policies.

    • Database users are allowed to connect to the database using a pre-defined list of processes. These include middle-tier processes and any client tools.
    • The rules also specify which IP addresses and/or hostnames these processes should connect from. You may want to enter here the IP and hostname for all the middle tiers in the WebCenter Content cluster.

    Use Cases

    Our main goal is to validate that customers can ensure a DBA and other privileged users (system administrators) cannot view application data but can still perform necessary DBA and system administration functions such as application rollout and system maintenance. When it comes to backend privileged user access, Oracle Database Vault can be used to help fulfill various compliance-related requirements such as the following:

    Database Vault (DBV) can help mitigate the risks of the following regulations at the data tier level

    Regulatory Legislation | Regulation Requirement | Does DBV Mitigate This Risk?

    Sarbanes-Oxley Section 302 | Unauthorized changes to data | Yes
    Sarbanes-Oxley Section 404 | Modification to data, Unauthorized access | Yes
    Sarbanes-Oxley Section 409 | Denial of service, Unauthorized access | Yes
    Gramm-Leach-Bliley | Unauthorized access, modification and/or disclosure | Yes
    HIPAA 164.306 | Unauthorized access to data | Yes
    HIPAA 164.312 | Unauthorized access to data | Yes
    Basel II – Internal Risk Management | Unauthorized access to data | Yes
    PCI | Restrict Access to cardholder data by business need-to-know | Yes
    CFR Part 11 | Unauthorized access to data | Yes
    Japan Privacy Law | Unauthorized access to data | Yes

    Several of the anticipated customer use cases for Database Vault include:

    • Limiting the system DBA’s access to WebCenter Content application data.
    • Providing the application DBA proper access to the usual database objects such as tables, views, and indexes, for maintenance and deployment, without giving the ability to see any application business data.
    • Ensuring the above policies are followed in a production environment as well as during any application maintenance or patching.

    The scripts delivered have been developed to accomplish the above security policies.  If your policies differ from the above, the scripts can be modified to support your specific security needs.

     

    The Rule Set 'WebCenter Content Access' is configured to be static, which means that changes to the underlying Rules are not effective until the user(s) log out and back in. However, static Rule Sets minimize the performance impact, especially for Rule Sets with many complex Rules.

    Clustered environments

     

    Find below the modifications to 01_wcc_create_rule.sql in order to handle clustered WebCenter Content environments:

    -- create Rule 'Check WebCenter Content IP 3'

    DBMS_MACADM.CREATE_RULE(
    rule_name => 'Check WebCenter Content IP 3',
    rule_expr => 'SYS_CONTEXT(''USERENV'',''IP_ADDRESS'') = ''111.222.333.444'' OR
    SYS_CONTEXT(''USERENV'',''IP_ADDRESS'') = ''111.222.333.555'' OR
    SYS_CONTEXT(''USERENV'',''IP_ADDRESS'') = ''111.222.333.666'' OR
    SYS_CONTEXT(''USERENV'',''IP_ADDRESS'') = ''111.222.333.777'''
    );

    commit;

    -- create Rule 'Check WebCenter Content Hostname 4'
    DBMS_MACADM.CREATE_RULE(
    rule_name => 'Check WebCenter Content Hostname 4',
    rule_expr => 'SYS_CONTEXT(''USERENV'',''HOST'') = ''myhost.us.oracle.com'' OR
    SYS_CONTEXT(''USERENV'',''HOST'') = ''myhost2.us.oracle.com'' OR
    SYS_CONTEXT(''USERENV'',''HOST'') = ''myhost3.us.oracle.com'' OR
    SYS_CONTEXT(''USERENV'',''HOST'') = ''myhost4.us.oracle.com'''
    );

    commit;
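    For larger clusters, the rule expression can be generated rather than hand-edited. The Python helper below is hypothetical (not part of the kit); it emits the doubled-single-quote form required because the expression is embedded in a PL/SQL string literal:

```python
def dbv_context_rule_expr(attribute, values):
    """Build a rule_expr matching a USERENV attribute against a list of allowed values.

    Doubled single quotes are required because the expression is itself
    passed to DBMS_MACADM.CREATE_RULE inside a PL/SQL string literal.
    """
    clauses = ["SYS_CONTEXT(''USERENV'',''%s'') = ''%s''" % (attribute, value)
               for value in values]
    return " OR\n".join(clauses)

print(dbv_context_rule_expr("HOST", ["myhost.us.oracle.com", "myhost2.us.oracle.com"]))
```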

    Hostname vs. IP addresses

    The initial set of rules provided in this kit is a sample to aid system administrators in creating rules that meet their end users' business requirements. Either the IP check rule or the hostname check rule can be removed if not needed.


    WebCenter Sites and WebCenter Content Integration



     

    Overview

    WebCenter Sites is a great tool for business users and marketers to easily manage their web experience. WebCenter Content allows consolidation of all enterprise documents and digital assets in a single repository, and its transformation engine easily creates different renditions of documents and images. As an enterprise standard, it is recommended that clients use WC Content as the enterprise repository for all documents, images, videos, and digital assets, and WC Sites to manage the web experience for all sites.

    The WC Sites Content Connector tool automates the process of pulling documents from WC Content into WC Sites. The documents and digital assets should be created, edited, workflowed, and approved in WC Content. The approved documents and digital assets, along with their metadata, are pulled from WC Content as assets in WebCenter Sites. If the documents and digital assets have different renditions, those can also be pulled into WebCenter Sites. WC Sites Content Integration is designed to synchronize content from WebCenter Content to WebCenter Sites – documents and digital assets, including images and videos – as native files, renditions, and HTML conversions. Renditions can be created by the core transformation engine or Digital Asset Manager, depending on the type of native file and which renditions are required. HTML conversions are generated by Dynamic Converter. File formats are unlimited.

    When it runs for the first time, the WC Sites Content connector imports the latest released versions of native files, renditions, conversions, metadata, or combinations of them into WebCenter Sites. Connector rules and the mappings you specify decide which content it imports.

    If the content items are then modified on WebCenter Content, the connector synchronizes them to WebCenter Sites in its next session. Updated content items are re-imported into WebCenter Sites, and deleted content items have their counterpart assets deleted from WebCenter Sites (unless dependencies on those assets were created since the last synchronization session). Import and synchronization can be triggered manually, or they can be scheduled to run on a timed basis.

    Synchronization is always unidirectional, from WebCenter Content to WebCenter Sites. WC Sites Content Integration supports a many-to-one client-server model. Any number of WebCenter Sites clients can be connected to a single WebCenter Content instance.

     

    Use Case

    WC Sites Content Connector was designed for the use case where documents, images and digital assets are created, edited, managed and approved in WC Content, but they are rendered from WC Sites. The documents from WC Content are pulled over and assets are created in Sites.

    However, there are many times when clients do not want to pull the documents from WC Content into WC Sites. This may be due to the following reasons:

    • Extra Storage: If large quantities of documents are pulled from WC Content into WC Sites, they are duplicated in WC Sites. Moreover, WC Sites has multiple environments (Authoring/Editorial, QA, Delivery, etc.), which means the documents need to be copied to two or three environments. Some clients do not want the extra storage required to duplicate the documents two or three times.
    • Publish Times: Publishing documents adds to the publish time. A client with tens of thousands or hundreds of thousands of documents needs to plan for the initial pull of all documents from WC Content into WC Sites, and, depending on how many documents are published on a regular basis, may need to plan for extra time on an ongoing basis.
    • Multiple Sources of Documents: The recommended best practice is to make WC Content the master source for all enterprise documents, edited and maintained only in WC Content. But if documents are pulled from WC Content into WC Sites, it opens the possibility that someone will download a document from WC Sites, edit it, and upload it back into WC Sites.

    Thus there is a case where the client does not want to pull the document itself from WC Content into WC Sites, but only the metadata associated with the document. Using the metadata, the client should be able to build a link to the document and place that link on any page of the web site.

     

     

     

    Setting-up WC Sites Content Connector for Meta-Data Only

    The WC Sites Content Connector documentation says that the metadata, along with the document and the required renditions, will be pulled from WC Content. However, it is possible to pull only the metadata and not the documents, as shown below. In the screens that follow, I am working with the AVISports sample site released with WC Sites.

     

    STEP 1. Asset Definition

     

    Create an asset definition that does not have a blob attribute to hold the document.

    I have created a new AVIArticle definition called wcDoc. You will notice that the wcDoc definition does not have any blob attribute that could hold a document.

     

    SiteConnector1

     

     

    Step 2: Configure WC Sites Connector for only Meta-Data

    In the screen below, I have added a new rendition called WC MetaData, and removed the Primary & Web default renditions.

     

    SiteConnector2

     

    STEP 3: Setup Rules

    Setup the rules for pulling the documents from WC Content. Give it a name, and define the rules and target.

     

    Specify a name for the rule:

    SiteContent3A

     

    Specify rules for selecting Content Items from WC Content

    SitesConnector3

     

    Specify the target (site name) in WC Sites

    SitesConnector4

     

    STEP 4: Define Attributes

    While defining the attributes, do not pull the document itself, but do include the URL for the document. The document URL is not a default attribute that can be pulled from WC Content into WC Sites; however, a WC Content admin can easily expose this attribute so it can be pulled from Content into Sites. In the screen below, I am pulling xAbsoluteURL from WC Content and mapping it to the docURL attribute of wcDoc assets in WC Sites.

     

    Define attributes to be loaded:

    SitesConnector5

     

    That is all you need to do for the setup. The next time the Connector runs, it will pull the metadata for the document, including its URL, but will not pull the physical document into WC Sites.

     

    Here is the asset inspection screen in WC Sites, showing the metadata and the URL for the document pulled from WC Content.

    SitesConnector6

     

    Rendering the document from WC Content

    If the document itself is not pulled from WC Content, but only the metadata along with its URL, the document can be rendered from WC Content in the following ways:

    1) Use the URL: A link to the document can be made using the URL from WC Content. Using this link, the document is rendered directly from WC Content.
    2) Use a proxy: In many cases, WC Content may be behind a firewall and not easily accessible from the Internet. In such cases, a proxy can be set up outside the firewall that connects to WC Content and renders the document.
    3) Secure Document Servlet: Documents in WC Content are secure and may have access rules governing who can see them. In such cases, it is best to write a simple document servlet to render the document. The links from WC Sites should point to this document servlet. The servlet should pass the username and authentication information about the user to WC Content, read the document from WC Content using the GET_FILE service APIs, and render it on the site. It is not advisable to render this document directly from a WC Sites template, as that has performance implications for WC Sites; it is best done from a separate servlet.
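    As a sketch of the first approach, the xAbsoluteURL pulled by the connector already provides the link; an equivalent link can also be assembled from the Content Server GET_FILE service URL pattern. The host name and helper below are hypothetical, for illustration only:

```python
from urllib.parse import urlencode

def get_file_url(cs_base, ddocname):
    """Build a GET_FILE URL for the latest released revision of a content item.

    cs_base is the Content Server root, e.g. "http://wcc-host/cs" (hypothetical host).
    """
    params = {"IdcService": "GET_FILE",
              "dDocName": ddocname,
              "RevisionSelectionMethod": "LatestReleased"}
    return "%s/idcplg?%s" % (cs_base, urlencode(params))

print(get_file_url("http://wcc-host/cs", "DOC123"))
```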
     

     

     

     

    Methods for Resubmitting Imaging Documents to SOA Workflow


    When a document is created or uploaded in WebCenter Imaging, if a workflow connection exists, the Imaging managed server automatically attempts to inject the document into workflow. However, in scenarios where either the SOA instance is down or another error causes a workflow injection failure, the workflow never gets started. If an error occurs, the document id is written to the BPEL_FAULT_DATA table in the Imaging schema (e.g. DEV_IPM). It is worth noting up front that this table only contains errors related to the injection of the document into workflow; it does not track faults that occur within the BPEL process itself. The BPEL_FAULT_DATA table only tracks issues where Imaging could not initiate the workflow.

    fault-data

    The question most Imaging admins arrive at sooner or later is “how can a document be resubmitted to workflow?” There are a few different ways to resolve these issues where the document never entered workflow, or the document did enter workflow but for some reason needs to re-start the workflow process.

    Several MBeans can be used in Enterprise Manager under the Application Defined MBeans for oracle.imaging. Under the server name, the cmd MBeans contain operations that can list the data from the BPEL_FAULT_DATA table and repair the failures by retrying those. In addition, another MBean can clear the fault data once all desired documents have been resubmitted to workflow.

    The MBeans useful for viewing, resubmitting, and clearing workflow faults are accessible using WLST as well, with full examples and documentation on how to execute at this link.

    • clearIPMWorkflowFaults – Clear processing failures that occurred during workflow agent processing.
    • listIPMWorkflowFaults – Provide details of processing failures that occurred during workflow agent processing.
    • repairIPMWorkflowFaults – Repair processing failures that occurred during workflow agent processing.
    • sumIPMWorkflowFaults – Count processing failures during workflow agent processing, grouped by choice of date, application ID, or batch ID.
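    A typical triage session with these commands might look like the following WLST sketch. The host, port, and credentials are illustrative, and the operations may take filter arguments (by date, application ID, or batch ID); see the linked documentation for exact signatures:

```
# Run via wlst.sh against the Imaging managed server (illustrative values)
connect('weblogic','welcome1','t3://myimaginghost:16000')

listIPMWorkflowFaults()     # inspect which documents failed workflow injection
repairIPMWorkflowFaults()   # retry injection for the faulted documents
listIPMWorkflowFaults()     # confirm nothing failed a second time
clearIPMWorkflowFaults()    # clear the fault table once repairs succeed
```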

    There is an additional MBean that can be very useful for resubmitting individual Imaging documents that have been identified as needing to re-enter workflow.

    submitIPMToWorkflow – Submits a document to the workflow agent. Note that a confirmation message is displayed stating that the document has been submitted; however, if the document is stored in an application that is not configured with a workflow, no action is taken.

    mbeans

     

    In some cases the “repairIPMWorkflowFaults” may be needed to clean up failed resubmissions. After this is invoked, the listIPMWorkflowFaults can be used once again to see if any of the documents have failed a second time. When all of the failed documents are resubmitted, the clearIPMWorkflowFaults can be used.

    In other situations, the submitIPMToWorkflow task can be invoked for individual documents, or it can be scripted for a list of documents if using WLST. This can be a powerful tool where a batch or selection of Imaging documents failed to enter workflow or hit a non-recoverable fault within a BPEL process.

    submittoworkflow

     

    For some admins, the use of MBeans and WLST is not the preferred approach. Imaging also provides Java and Web Service methods for resubmitting to workflow. The “DocumentService” web service has an operation called submitToWorkflow that allows for the same behavior as the MBean. The WSDL path is shown below:

    http://myimaginghost:16000/imaging/ws/DocumentService?wsdl

    A quick test with SoapUI can list all of the capabilities of the DocumentService WSDL, with submitToWorkflow being one of many. A sample SOAP request is shown below for resubmitting a single Imaging document to SOA for workflow injection. The only parameter needed in the request is the document id, such as 3.IPM_057805.

     

    <soapenv:Envelope xmlns:imag="http://imaging.oracle/" xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
       <soapenv:Header>
          <wsse:Security soapenv:mustUnderstand="1" xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd" xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd">
             <wsse:UsernameToken wsu:Id="UsernameToken-1">
                <wsse:Username>weblogic</wsse:Username>
                <wsse:Password Type="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-username-token-profile-1.0#PasswordText">welcome1</wsse:Password>
                <wsse:Nonce EncodingType="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-soap-message-security-1.0#Base64Binary">H2eFojfVXWdW2n4k8PJbjg==</wsse:Nonce>
                <wsu:Created>2014-03-18T19:39:41.638Z</wsu:Created>
             </wsse:UsernameToken>
          </wsse:Security>
       </soapenv:Header>
       <soapenv:Body>
          <imag:submitToWorkflow>
    
             <documentId>3.IPM_057805</documentId>
          </imag:submitToWorkflow>
       </soapenv:Body>
    </soapenv:Envelope>
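    If scripting the call rather than using SoapUI, the envelope can be generated programmatically. The Python sketch below builds only the body of the request; the WS-Security header shown in the sample above is omitted because its values (username, password, nonce, timestamp) are deployment-specific:

```python
import xml.etree.ElementTree as ET

SOAPENV = "http://schemas.xmlsoap.org/soap/envelope/"
IMAG = "http://imaging.oracle/"

def submit_to_workflow_envelope(document_id):
    """Build a submitToWorkflow SOAP envelope (WS-Security header omitted)."""
    envelope = ET.Element("{%s}Envelope" % SOAPENV)
    ET.SubElement(envelope, "{%s}Header" % SOAPENV)
    body = ET.SubElement(envelope, "{%s}Body" % SOAPENV)
    operation = ET.SubElement(body, "{%s}submitToWorkflow" % IMAG)
    doc = ET.SubElement(operation, "documentId")  # unqualified, as in the sample above
    doc.text = document_id
    return ET.tostring(envelope, encoding="unicode")

print(submit_to_workflow_envelope("3.IPM_057805"))
```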

    soapui

     

    In addition to Web Service calls, Java can be used to resubmit documents to workflow. The DocumentService class allows for the same operation using the following method:

    docService.submitToWorkflow("3.IPM_057805");

    A basic sample class file below shows how to use the DocumentService to update to an application field and then resubmit an Image to workflow.

    package devguidesamples;
    
    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.List;
    import java.util.Locale;
    
    import oracle.imaging.BasicUserToken;
    import oracle.imaging.Document;
    import oracle.imaging.DocumentContentService;
    import oracle.imaging.DocumentService;
    import oracle.imaging.ImagingException;
    import oracle.imaging.ServicesFactory;
    import oracle.imaging.UserToken;
    
    public class ResubmitToWorkflow {
       public static void main(String[] args)
          throws IOException {
          try { // try-catch
             UserToken credentials = new BasicUserToken("weblogic", "welcome1");
             ServicesFactory servicesFactory =
                ServicesFactory.login(credentials, Locale.US, "http://myimaginghost:16000/imaging/ws");
    
             try { // try-finally to ensure logout
    
                DocumentService docService = servicesFactory.getDocumentService();
                DocumentContentService docContentService = 
                       servicesFactory.getDocumentContentService();
    
                String documentID = "3.IPM_057809";
    
                // update field value before resubmit.
                List<Document.FieldValue> fieldValues = new ArrayList<Document.FieldValue>();
                fieldValues.add(new Document.FieldValue("Organization", new String("Vision Operations")));
                docService.updateDocument(documentID, null, fieldValues, false);   
    
                docService.submitToWorkflow(documentID);
             }
             finally {
                if (servicesFactory != null) {
                   servicesFactory.logout();
                }
             }
          }
          catch (ImagingException e) {
             System.out.println(e.getMessage());
             e.printStackTrace();
          }
       }
    }

    Getting The Most Out Of Your WebCenter Content Database Metadata Search Engine configuration On Oracle 11g Database


    Metadatasearch

     

    Introduction

    Being able to retrieve information as quickly as possible in a Content Management implementation is extremely important for providing knowledge workers with the data they need to do their jobs efficiently. This article provides information on some recent discoveries on ways to keep your WebCenter Content metadata search at peak performance. It needs to be noted that although the issues were experienced in WebCenter Content, the remedies described in this article are database centric and require the skills of your database administrator to implement. These techniques could also likely benefit any other application which is query intensive.

    Main Article

    As most everybody is aware, WebCenter Content 11g provides three different search engine options in the core product:

      • Oracle Text Search
      • Database Full Text
      • Database Metadata

    The first two options (Oracle Text Search and Database Full Text) provide the capability to search not only on the metadata tagged on the content but also on the text within the checked-in documents themselves (i.e. full-text searching). With the third option (Database Metadata), the metadata criteria specified on searches are distilled into standard SQL SELECT statements, with WHERE predicates built from the metadata fields specified as the search criteria, which are then submitted to the database to execute and return results.
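To make this concrete, a metadata search in this mode boils down to a query of roughly the following shape (a hedged sketch: the custom field names are hypothetical, and the actual SQL generated varies by release and configuration):

```sql
-- Illustrative shape only; the server generates the real statement
SELECT Revisions.dID, Revisions.dDocName, Revisions.dDocTitle
  FROM Revisions, DocMeta, Documents
 WHERE Revisions.dID = DocMeta.dID
   AND Revisions.dID = Documents.dID
   AND Revisions.dReleaseState = 'Y'
   AND DocMeta.xDepartment = :dept      -- custom metadata field (hypothetical)
   AND Revisions.dDocType = :docType
 ORDER BY Revisions.dInDate DESC
```

Because every search criterion lands in a WHERE predicate against these tables, index coverage on them and the physical I/O needed to read them dominate search response time.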

    Recently I have been dealing with a couple of cases with customers whose WebCenter Content Server search engine configuration was Database Metadata and who were experiencing significant performance issues with searches. The impact of the performance issues was that the JDBC connection timed out and WebCenter Content generated an error page to the users. The size of these customers' repositories, in terms of rows in the WebCenter Content schema tables used in searches, was in the range of 40 to 80 million rows. These customers had applied the standard tuning to their WebCenter Content database schemas, such as:

    • Ensuring indexes were applied to the most frequently referenced metadata fields from the DocMeta table used in searches.
    • Performing regular gathering of statistics on the database instance.

    Yet despite these standard tuning efforts they were still experiencing significant performance issues with searching due to latency on the database.

    Upon analyzing the situation more deeply using tools like Automatic Workload Repository (AWR) reports, it was realized that a great percentage of query execution time was being spent in User I/O. High User I/O time means the database is spending the majority of its time doing physical I/O on its disk subsystem to gather the data needed to return the results of a submitted SQL query. Any time large amounts of physical disk I/O are involved with an application or database, it is going to be one of the slower parts, due to the physical logistics of spindle rotation and head seeks leading to higher latency. If you are experiencing performance issues with your WebCenter Content metadata search and are seeing the same symptoms in your AWR reports, the first thing to do is ensure that your database disk subsystem is not the thing introducing significant bottlenecks. If this has been eliminated and the health of the database disk subsystem has been deemed acceptable, then the following database configuration has been effective at addressing these symptoms and restoring performance to the WebCenter Content metadata search.

    There are four primary tables in the WebCenter Content database schema that are involved when the WebCenter Content search engine configuration is Database Metadata:

    • Revisions
    • DocMeta
    • Documents
    • RevClasses

    Using the advanced compression feature of the database, compression was applied at the tablespace level, which, by compressing blocks, significantly reduces physical I/O. To further speed performance, parallelism (parallel degree) was applied to the four tables outlined above and to all of the indexes on those tables. The best degree of parallelism to apply depends on the physical compute resources (e.g. CPU/memory) allocated to the database servers, and some trial-and-error testing is required to find it. With parallel degree you also want to be sure you strike a good balance between optimal performance and CPU saturation: if you over-allocate parallel degree, you risk CPU saturation and negatively impacting performance by not leaving enough compute resources available for O/S kernel operations on the database servers. The increase in query performance will of course vary with data volumes, queries, and hardware sizing; testing in environments where this issue was experienced yielded average gains of 5.9 times (590%) to 10 times (1000%).

    As it pertains to advanced compression, it needs to be noted that this is a separately licensable feature of the Oracle 11g database, so be sure you are licensed for it before exercising it as a method of tuning. Over and above the performance benefits outlined in this article, advanced compression will also provide storage savings due to the compression applied to reduce the physical footprint of the database on disk. The degree of compression gained will vary based on data patterns; in one instance where this was applied, it took the physical storage footprint from 19G to 9G.

    This article has focused on the tables involved in WebCenter Content metadata searching, but it should be noted that if any other queries implemented in customizations to the WebCenter Content platform suffer from the same symptoms, these methods could be applied to the tables involved in those queries as well.
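As a sketch, the compression and parallelism changes described above amount to DDL along the following lines (the tablespace name, index name, and degree of 8 are placeholders; test in non-production first, and confirm you are licensed for Advanced Compression):

```sql
-- Default OLTP compression for new segments in the tablespace (name is a placeholder)
ALTER TABLESPACE wcc_data DEFAULT COMPRESS FOR OLTP;

-- Compress an existing table in place and set a parallel degree on it
ALTER TABLE DocMeta MOVE COMPRESS FOR OLTP;
ALTER TABLE DocMeta PARALLEL 8;

-- Rebuild each of the table's indexes with the same parallel degree
ALTER INDEX DocMeta_xDepartment_idx REBUILD PARALLEL 8;
```

The same pattern would be repeated for the Revisions, Documents, and RevClasses tables and their indexes.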

    Implementation of any of the tuning changes outlined in this article should first be done in a non-production environment with infrastructure and data volumes comparable to production. Testing to monitor impact and results should first be done directly against the database by submitting the application queries through a tool like SQL*Plus or SQL Developer. When testing your search queries, bind variables (as opposed to literal values) should be used in the WHERE predicates so that the queries execute the same as when they are received from the application. AWR reports should be used to monitor the results of the tuning. After the appropriate tuning has been achieved, as identified by AWR reports and improved raw query response times, searches can then be tested through the WebCenter Content application.

    Mounting your WebCenter Content NFS file system correctly


    Introduction

    NFS

    It is a very well known fact that the NFS protocol used to mount file systems in clustered application environments can be prone to issues like lock contention, dirty reads, and stale file handles if not configured correctly.  This article outlines how a WebCenter Content clustered implementation requires NFS file systems to be mounted in order to operate without issues.

    Main Article

    NFS V3 and V4 are very common protocols in Unix/Linux environments for mounting shared file systems that are mutually accessible to the physical servers hosting clustered applications.  WebCenter Content is an application which supports, and is commonly deployed in, a clustered capacity to provide high availability and fault tolerance.  I had always thought the requirements WebCenter Content has for mounting shared file systems over the NFS protocol were common knowledge, but recently I have run into a couple of very high profile implementations where the NFS mounts were not correctly configured.  This leads me to believe that the knowledge I thought was common might not be as common as I thought, and has inspired me to write this article to ensure that this information is made highly visible via the A-Team Chronicles.

    In a clustered implementation of WebCenter Content a shared file system is required. This is true even if WebCenter Content has been configured to store managed content in the database using a filestore provider storage rule configured for jdbc storage.   This is the case due to the fact that in addition to content there is application process data that resides on a portion of the shared file system used by WebCenter Content which also needs to be mutually accessible to all nodes in a cluster.  The high level requirements of a shared file system for WebCenter Content no matter what protocol is used are:

    • Must be seen as local disk to all nodes in the cluster (i.e. be transparent to the application).
    • Must provide the exact same view of the data (file reads/writes/renames/deletes/time stamps, etc…) to every attached cluster node at the same time.
    • All member servers of a cluster are time synchronized within 5 seconds. This includes the shared file system server/appliance and all servers where the WebCenter Content application process is running in the same instance/cluster. Setting all members to use the same time (NTP) server is the easiest way to accomplish this.
    • File system needs to support atomic operations (write/rename/etc) to ensure that operations complete successfully before they are visible/available to processes running on node which did not initiate the operation

    Specific to when NFS protocol is being used for file system mounting and sharing over a standard TCP/IP backbone the following requirements also need to be adhered to in order to avoid performance issues

    • The device used for NFS sharing should be a dedicated filer appliance designed for high speed file operations.  This is in contrast to using hardware with a full O/S kernel that supports being used as an NFS server; such hardware is not designed to handle file operations at the speed required.
    • The bandwidth of the TCP/IP backbone connecting the servers hosting the WebCenter Content application cluster to the filer appliance should be at minimum a 10G-E connection, to keep network latency and collisions from causing bottlenecks to storage.
    • The NFS mount should be configured with locking and file handle and attribute caching disabled.  The options used for this can vary by operating system, but as an example on a Linux kernel these would be nolock, noac, and actimeo=0.  Consult with your server administration team to make sure the correct options are used for your environment.
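On a Linux client, the resulting mount might look like the following /etc/fstab entry (the filer host, export path, mount point, and NFS version are placeholders; confirm the exact option names for your platform with your server administrators):

```
# WebCenter Content shared volume with locking and caching disabled (illustrative)
filer01:/export/wcc_data  /u01/wcc/data  nfs  rw,tcp,vers=3,nolock,noac,actimeo=0  0 0
```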

    It is common that the last requirement of having locking and caching disabled will give most server administrators cause for concern over file contention and file system performance degradation.   These NFS mounting options are required for WebCenter Content due to how it uses a portion of the shared file system as a semaphore for process coordination between nodes in a cluster running the application.

    To minimize concerns related to locking, WebCenter Content has been designed to manage file contention and locking on shared file system resources at the application level.  Therefore it needs full control over files, without NFS locking potentially conflicting with its application processes.  Disabling attribute and file handle caching is required to ensure that the application processes running on each node are not subject to dirty reads for operations like checking time stamps on files used as part of process coordination/synchronization.  To manage the breadth of impact disabling caching has on operations like

    • Content check-in (CHECKIN_NEW service)
    • Content retrieval (GET_FILE service)
    • Text extraction during indexing

    WebCenter Content provides configuration entries that allow the sub-section of the overall shared file system used for process coordination and indexing to be placed on a separate NFS-mounted volume that has locking and caching disabled.  This enables the volume that stores things like content (e.g. Vault and Weblayout) and components, which does not require locking and caching to be disabled, to remain on a volume mounted with caching and locking enabled, mitigating performance issues.  The configuration entries are the following:

     

    • DataDir – Used to partition application data onto its own storage volume so the file handle caching and locking requirements can be met without affecting other volumes with these settings.
    • UserProfileDir – Used to partition user profile data, which typically resides under the larger application data, onto its own NFS-mounted volume that has file handle caching and locking enabled to make disk access to this data faster. (Note: this configuration entry was not available until the 11.1.1.6 release.)
    • SearchDir – Used to partition application search processing data onto its own storage volume so the file handle caching and locking requirements can be met without affecting other volumes with these settings.
    • VaultDir – Used to partition application native rendition content data onto its own storage volume so the file handle caching and locking requirements can be met without affecting other volumes with these settings.
    • WebLayoutDir – Used to partition application web rendition content data onto its own storage volume so the file handle caching and locking requirements can be met without affecting other volumes with these settings.

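These entries are set in the Content Server configuration (for example config.cfg); a sketch of the resulting split, with placeholder mount points, might look like:

```
# Volume mounted with locking and caching disabled (process coordination/indexing)
DataDir=/u02/wcc_nolock/data/
SearchDir=/u02/wcc_nolock/search/

# Volumes that can stay on a mount with locking and caching enabled
UserProfileDir=/u01/wcc/data/users/profiles/
VaultDir=/u01/wcc/vault/
WebLayoutDir=/u01/wcc/weblayout/
```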
    The following diagram is used to illustrate the recommended distribution of WebCenter Content directory structure across two NFS volumes to allow WebCenter Content to operate in the most scalable capacity in relation to shared file system access.

    NFSFileSystemDistribution

    Hopefully this article has been able to provide you with good insight on the NFS shared file system requirements that WebCenter Content has and a good explanation behind these requirements.

    Oracle BPM 12c just got Groovy – A Webcenter Content Transformation Example


    Introduction

    On the 27th June 2014 we released Oracle BPM 12c which included some exciting new features.
    One of the less talked about new features is the support for BPM Scripting, which incorporates the Groovy 2.1 compiler and runtime.

    So what is Groovy anyway?

    Wikipedia describes Groovy as an object-oriented programming language for the Java platform and you can read the definition here.

    In short though it is a Java like scripting language, which is simple to use. If you can code a bit of Java then you can write a bit of Groovy and most of the time only a bit is required.

    If you can’t code in Groovy yet, don’t worry: you can just code in Java and that will work most of the time too.

    With great power comes great responsibility?

    The benefits and possibilities of being able to execute snippets of groovy code in a BPM process execution are almost limitless. Therefore we must be responsible in its use and decide whether it makes sense from a BPM perspective in each case and always implement best practices which leverage the best of the BPM execution engine infrastructure.

    If you can easily code, then it is easy to write code to do everything. But this goes against what BPM is all about. We must always first look to leverage the powerful middleware infrastructure that the Oracle BPM execution engine sits on, before we look to solve our implementation challenges with low level code.

    One benefit of modelled BPM over scripting is Visibility. We know that ideally BPM processes should be modelled by the Business Analysts and Implemented by the IT department.

    Business Process Logic should therefore be modelled into the business process directly and not implemented as low level code that the business will not understand nor be aware of at runtime. In this manner the logic always stays easily visible and understood by the Business. Overuse of logic in scripting will quickly transcend into a solution that will be hard to debug or understand in problem resolution scenarios.

    If one argues that the business logic of your business process cannot be modelled directly in the BPM process, then one should revisit the business process analysis and review whether the design really makes sense and can be improved.

     

    What could be a valid use case for Groovy in BPM?

    One valid use case for Groovy scripting is complex and dynamic data transformation. In Oracle BPM 12c we have the option to use the following mechanisms for transformations:

    Data Association

    Good for:

    • Top level transformations of the same or similar types
    • Simple transformations of a few elements
    • Lists and arrays
    • Performance

    XSL transformation

    Good for:

    • Large XML schema elements
    • Assignment of optional XML schema elements and attributes
    • Lists and arrays
    • Reuse

    Groovy Scripting

    Good for:

    • Generic XML schema types like xsd:any
    • Dynamic data structures
    • Complex logic
    • Error handling
    • Reuse

    Java callouts using a mediator or Spring component

    Good for:

    • Pure Java implementation requirements
    • Large batch processing

    Each method has its own benefits and downsides, but in combination you can transform any payload. What to use is largely a case of:

    • Best practice within your organization
    • Best practice for BPM
    • The level of organized structure of your schemas

    In practice, an efficiently implemented BPM process will use a combination of associations, XSLT and BPM scripts.

     

    tip3Tip: Always try to solve transformation tasks using a data association first before turning to XSLT or Groovy. Use the right tool in your toolkit for the right job.

     

     Upgrading from BPM 10g

    The inclusion of BPM scripting will also aid in the upgrade from BPM 10g processes. This should be seen as an opportunity to review and improve the implementation as opposed to blindly copying the existing functionality. This is a process that is beyond the scope of this post.

     

    A Complex and Dynamic Webcenter Content SOAP Example

    Invoking very generic SOAP services can be one instance where Groovy can save the day. When a SOAP service is well defined, it’s very easy to create a mapping using the XSL or data association mappers. But what if the element definition is wide open, with the use of schema elements like xsd:any, xsd:anyType or xsd:anyAttribute?

    Solving this transformation in XSLT could potentially be complex, with lots of hand-written, harder-to-read code.

    The GenericRequest of the Webcenter Content SOAP service is an example of such a generic SOAP service. The flexibility of its use means that the payload required is very dynamic.

    The actual schema element looks like this.

     

    content.xsd

     

    Now consider the situation where this payload for the GenericRequest needs to look like this and could potentially have lots of required logic.

     

    soapui
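In case the screenshot is hard to read, a GenericRequest of this shape looks roughly like the following (the service name, field names, and values are illustrative, not taken from the screenshot):

```xml
<ucm:GenericRequest webKey="cs" xmlns:ucm="http://www.oracle.com/UCM">
   <ucm:Service IdcService="CHECKIN_UNIVERSAL">
      <ucm:Document>
         <ucm:Field name="dDocName">INCIDENT_001</ucm:Field>
         <ucm:Field name="dDocTitle">Incident Report</ucm:Field>
         <ucm:Field name="dSecurityGroup">Public</ucm:Field>
         <ucm:File name="primaryFile">
            <ucm:Contents>SGVsbG8sIFdvcmxkIQ==</ucm:Contents>
         </ucm:File>
      </ucm:Document>
   </ucm:Service>
</ucm:GenericRequest>
```

The Field and File/Contents elements here correspond to the Service–>Document structure that the Groovy script builds up later in this article.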

    This might be accomplished using a complex, hand coded xslt transformation.

    Alternatively, if you don’t have any XSLT world champions on the team, anyone on your development team who can code Java can do this easily with Groovy scripting.

    Building the Transformation Demo

    To demonstrate the transformation capabilities of groovy scripting we are going to create a simple synchronous BPM process based on the above usecase.

    We send an Incident as a request and as a response will receive the transformed GenericRequest. In this manner it will be easy for us to see the whole transformed payload that we would normally send to Webcenter Content.

    The finished process looks like this.

     

    FinishedProcess

    Create a new BPM Application and define Data Objects and Business Objects

    We will create a new BPM application and define the:

    • Input arguments as an Incident
    • Output argument as a Webcenter GenericRequest

     

    1) Download the schema zipfile called docs and extract to a local location. Then open Studio (JDeveloper) and from the top menu choose Application->New->BPM Application

     

    NewApplication

    2) Click OK, use the application name GroovyDemoApp and click Next

     

    AppName

    3) Use the Project Name GroovyDemo, then click Next

     

    ProjectName

    4) Now choose the Synchronous Service, name the process GroovyDemoProcess and click Next

     

    SyncProcess

    Now we need to define and add the input and output arguments. Here we use some predefined schema elements in schema files that I provide. Firstly we define these as Business Objects, then we use these Business Objects as a definition for our arguments and Data Objects in the process itself.

     

    5) Click on the green add icon to add a new argument, name the argument incidentARG

     

    incidentARG

    6) Choose Browse under Type and then click the Create Business Object Icon

     

    CreateBO

    7) Use the name IncidentBO and click the magnify icon choose a Destination Module

     

    DestModule2

    8) Click the Create Module icon and use the name Domain

     

    Domain

    9) Click OK twice to return back to the Create Business Object window

    10) Select the checkbox Based on External Schema and the magnifying glass icon to choose a Type

     

    TypeChooser

    11) Click the Import Schema File icon, select the incidents.xsd schema file and OK

    12) Click OK to localize the schema files to your composite project

     

    localize

    13) Select the Incident element from the Type Explorer and OK twice to return to Browse Types

     

    type_explorer

    14) Select the IncidentBO type and OK

     

    IncidentBOSelect

    15) To complete the In argument creation click OK

     

    InArgumentFinal

    16) Now click the output tab to define the GenericRequest type as an Output

     

    InArgComplete3

    17) Using the same procedure as before create an output argument using the following values:

     

    Output Argument Name: GenericRequestARG
    Type: GenericRequestBO
    Schema Filename: content.xsd
    Module: Domain
    Element: GenericRequest

     

    OutArg5

    18) Click Finish to complete the initial definition of the GroovyDemoProcess BPM process.

     

    DefinitionProcess

    We have created a GroovyDemoProcess synchronous BPM process that has an Incident as a request and a GenericRequest as a response.

    Next we need to define process variables based on the business objects that we have already created. These will be used to store the payload data in the BPM process.

     

    19) Ensure the GroovyDemoProcess is selected in the Application Navigator, then in the Structure Window right-click the Process Data Objects icon. Use the name incidentDO and select the IncidentBO as the Type.

    20) Similarly create another process data object called genericRequestDO of Type GenericRequestBO

     

    GenericRequestDO

    Performing Data Associations of the Data Objects

    Now we have to assign the payload of the incidentARG argument to the data object we have just created. We do this in the Catch activity.

     

    21) Right-click the Start catch activity and select Properties. Select the Implementation tab and click the Data Associations link.

     

    DataAssociations

     

    Now we need to assign the incidentARG argument to the incidentDO data object.

    Since we have defined these to be the same type it is easy. All we need to do is a top level assignment and not even worry about optional sub-elements.

    21) Drag from the incidentARG to the incidentDO nodes and click OK twice to complete and close the Start node property definition.

    Now we need to associate the GenericRequestDO data object to the response.

    This is in the Properties of the Throw End node.

    22) Create a Copy association from the genericRequestDO to the GenericRequestARG nodes.

    Defining the Groovy Expression in the BPM Script

    Now at last we are ready to start defining the groovy code that will be responsible for the transformation.

    Drag a Script Activity and place it just after the Start node. Re-name this to Transform Request

     

    transform

    Transform2

    23) Right-click the Transform Request Script Activity and select Go To Script 

     

     

    GoToScript

    tip3Tip: The Script Activity must not have any implementation defined when it is being used for Groovy scripting. It functions as a container for the Groovy script.

     

    Before we can start scripting we have to define the imports for the script, similar to what we would do in Java. First lets take a look at the Scripting Catalog to see what is already there. This will help us understand what we need to import.

     

    24) In the Scripting Catalog expand the oracle–>scripting nodes to see what is already available to us.

     

    Here we can see the Business Objects we have already created and all the elements that are included in the schema files that we imported.

     

    ScriptingCatalog

    Now we need to recall what is the format of the GenericRequest that is the target data structure of our transformation. We need to know this so we can choose the correct imports for our Groovy script.

     

    soapui

    Above we can see that a GenericRequest contains the following elements:

     

    • Service–>Document–>Field
    • Service–>Document–>File–>Contents

     

    25) Now return to the Scripting tab and enter the following in the Script Editor window. These, as you can see, are comments and a print to the console, which will appear directly in the WebLogic server diagnostic log.

     

    //You can add comments like this
    //You can print to the console like this during your development/testing procedures
    println("Starting transformation of Incident to Generic Request")

     

    tip3Tip: printing to the console log like this should only be used in development scenarios and should be removed for production. Alternatively we could add some logic to conditionally log messages only by specifying a payload value or composite mbean.

     

    Selecting the Scripting Imports

    Now we need to add in the imports for the elements that we will be using.

    26) Click the Select Imports button on the top right of the editor to open the Select Imports window

    SelectImports

    27) Click the green Add icon and click with the mouse cursor in the new import row that appears

     

    SelectImports2

    28) Type oracle. (oracle and a dot)

     

    OracleDot

    The context menu will now open up to help you find the correct package path.

     

    ConextMenu

    tip3Tip: Do not use the cursor keys until you have clicked inside the context menu with your mouse; using them earlier will cause the context menu to disappear.

     

    29) Now use the cursor keys to choose oracle.scripting.xml.com.oracle.ucm.type.Service, or type it in directly, and click the Add icon to add another import.

     

    Imports

    30) Add the following imports and click OK

     

    oracle.scripting.xml.com.oracle.ucm.type.Service
    oracle.scripting.xml.com.oracle.ucm.type.File
    oracle.scripting.xml.com.oracle.ucm.elem.Field
    oracle.scripting.xml.com.oracle.ucm.type.Service.Document

     

    Writing the Groovy Expression

    31) Return back to the Groovy Script editor window.

     

    Now we need to define the classes we need to use to build our GenericRequest. We define a Service, Document, Field, File and two arrays for the lists of files & fields.

     

    tip3Tip: In essence here we are just instantiating POGO (plain old groovy objects) objects that are a Groovy representation of our GenericRequest element

     

    32) Now enter the following code after the debug code you entered earlier.

     

    //Define the message element types for data population
    
    //The Service element
    Service service = new Service()
    //The Document element
    Document document = new Document()
    //The File element (base64 message embedded attachment)
    File file = new File()
    //The Field element
    Field field = new Field()
    //An array of type Field
    List<Object> fields = new ArrayList()
    //An array of type File
    List<Object> files = new ArrayList()

     

    Now that we have created our POGO objects, we need to populate them with real data. Since we are transforming from an Incident to a GenericRequest, most of our data comes from the data object incidentDO, which we populated from the argument.

    We will start by creating each of the individual Field elements and assigning them to the array, since these constitute the bulk of our message.

    Our first field looks like this.

     

    [Screenshot: the dDocName Field element]

    It contains an XML Schema attribute called name and a value which is the Internal BPM process ID of the in flight process.
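As a sketch, the resulting element would serialize to something like the following (the name attribute comes from the GenericRequest schema used here; the instance ID value is illustrative):

```xml
<Field name="dDocName">1234567</Field>
```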

    Type field.set (field dot set) in the expression editor to show the context list of the available methods for the field object. We can see that the methods to set and get data from the field POGO already exist.
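Because Field is a POGO, Groovy generates these accessors for us, and explicit setter calls and property-style access are interchangeable. A minimal self-contained sketch, using a stand-in class (DemoField is hypothetical, not part of the scripting imports):

```groovy
// DemoField stands in for the generated Field POGO
class DemoField {
    String name
    String inner_content
}

DemoField f = new DemoField()
f.setName('dDocName')           // explicit setter call
f.inner_content = 'ID-12345'    // property-style write calls setInner_content()
```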

     

    [Screenshot: the field.set method completion list]

    33) Type in the following expression to populate the first Field element and add it to the array at position 0 (so it appears first in the message).

     

    //dDocName element containing the BPM process instance ID
    field.setName("dDocName")
    field.setInner_content(predef.instanceId)
    fields.add(field)

     

    Tip: We could get the BPM process instance ID by executing an XPath expression in a data association. However, BPM 12c conveniently provides several predefined variables, available from predef, some of which can also be updated in a Groovy expression. See the full list in the official BPM documentation.

     

    The next field that we need to populate in the GenericRequest is the dDocTitle, which comes from the incident subject.

    The transformed element looks like this.

     

    [Screenshot: the dDocTitle Field element]

    This time we get the value from the process data object incidentDO by directly calling the get method.

     

    34) Add the following expression to the end of the script.

     

    //dDocTitle from the incident subject
    field = new Field()
    field.setName("dDocTitle")
    field.setInner_content(this.incidentDO.subject)
    fields.add(field)

     

    Now, this is really straightforward, right? With the power of Groovy expressions, it really is.

    Now imagine that you wanted to implement some complicated if/then logic to include certain elements only conditionally. All you need to do is write that logic into the script. Perhaps you need to format some dates, concatenate some string values or convert some data types; again, easy as pie.
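For example, a sketch of such conditional logic (the comments value and the map-based field are illustrative stand-ins, not the actual Field POGO):

```groovy
// Only add an optional field when the incident actually carries a value;
// in Groovy truth, null and empty strings both evaluate to false.
def fields = []
def comments = ''               // stand-in for e.g. this.incidentDO.comments
if (comments) {
    fields << [name: 'xComments', value: comments]
}
```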

    Consider the xIncidentDate field below. Here we get a date and convert it into the format WebCenter Content requires in a few lines.

     

    Calendar nowCal = this.incidentDO.getDate().toGregorianCalendar()
    Date now = nowCal.time
    String nowDate = now.format('M/d/yy HH:mm aa')

    35) Now add the remaining field definitions to the expression.

     

    field = new Field()
    field.setName("dDocAuthor")
    field.setInner_content(this.incidentDO.reporter)
    fields.add(field)

    field = new Field()
    field.setName("dDocAccount")
    field.setInner_content("incident")
    fields.add(field)

    field = new Field()
    field.setName("dSecurityGroup")
    field.setInner_content("webcenter")
    fields.add(field)

    field = new Field()
    field.setName("dDocType")
    field.setInner_content("Incident")
    fields.add(field)

    field = new Field()
    field.setName("xClbraRoleList")
    field.setInner_content(":CaseMgr(RW),:CaseWorker(RW),:ActionOfficer(RW)")
    fields.add(field)

    field = new Field()
    field.setName("xClbraUserList")
    field.setInner_content("&${this.incidentDO.getReporter()}(RW)")
    fields.add(field)

    field = new Field()
    field.setName("xIdcProfile")
    field.setInner_content("IncidentRecord")
    fields.add(field)

    field = new Field()
    field.setName("xComments")
    fields.add(field)

    field = new Field()
    field.setName("xCitizenName")
    field.setInner_content(this.incidentDO.name)
    fields.add(field)

    field = new Field()
    field.setName("xEMail")
    field.setInner_content(this.incidentDO.email)
    fields.add(field)

    field = new Field()
    field.setName("xCity")
    field.setInner_content(this.incidentDO.city)
    fields.add(field)

    field = new Field()
    field.setName("xGeoLatitude")
    field.setInner_content(this.incidentDO.geoLatitude)
    fields.add(field)

    field = new Field()
    field.setName("xGeoLongitude")
    field.setInner_content(this.incidentDO.geoLongitude)
    fields.add(field)

    field = new Field()
    field.setName("xIncidentDate")
    Calendar nowCal = this.incidentDO.getDate().toGregorianCalendar()
    Date now = nowCal.time
    String nowDate = now.format('M/d/yy HH:mm aa')
    field.setInner_content(nowDate)
    fields.add(field)

    field = new Field()
    field.setName("xIncidentDescription")
    field.setInner_content(this.incidentDO.description)
    fields.add(field)

    field = new Field()
    field.setName("xIncidentStatus")
    field.setInner_content(this.incidentDO.incidentStatus)
    fields.add(field)

    field = new Field()
    field.setName("xIncidentType")
    field.setInner_content(this.incidentDO.incidentType)
    fields.add(field)

    field = new Field()
    field.setName("xLocationDetails")
    field.setInner_content(this.incidentDO.locationDetails)
    fields.add(field)

    field = new Field()
    field.setName("xPhoneNumber")
    field.setInner_content(this.incidentDO.phoneNumber.toString())
    fields.add(field)

    field = new Field()
    field.setName("xStreet")
    field.setInner_content(this.incidentDO.street)
    fields.add(field)

    field = new Field()
    field.setName("xStreetNumber")
    field.setInner_content(this.incidentDO.streetNumber)
    fields.add(field)

    field = new Field()
    field.setName("xPostalCode")
    field.setInner_content(this.incidentDO.getPostalCode())
    fields.add(field)

    field = new Field()
    field.setName("xTaskNumber")
    field.setInner_content(this.incidentDO.taskNumber)
    fields.add(field)

     

    The next element to add is the embedded base64 attachment. We add this in a similar fashion.

     

    36) Add the following expression.

     

    file.setContents(this.incidentDO.attachment.file)
    file.setName("primaryFile")
    file.setHref(this.incidentDO.attachment.name)
    files.add(file)

     

    Now we have nearly finished our Groovy script. All we need to do is:

     

    • Add the arrays to the Document element
    • Add the Document element to the Service element
    • Add the Service to the process data object genericRequestDO

     

    37) Add the following expression for the Document, Service and genericRequestDO:

    //Add Field and Files
    document.setField(fields)
    document.setFile(files)
    
    //Add Document to Service
    service.setDocument(document)
    service.setIdcService("CHECKIN_UNIVERSAL")
    
    //Add the Service element to data object genericRequestDO
    genericRequestDO.setWebKey("cs")
    genericRequestDO.setService(service)
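
Once populated, the GenericRequest we have assembled should serialize to something like the following sketch (the element nesting and the UCM namespace are assumed from the WebCenter Content GenericRequest schema; the values are illustrative):

```xml
<GenericRequest webKey="cs" xmlns="http://www.oracle.com/UCM">
  <Service IdcService="CHECKIN_UNIVERSAL">
    <Document>
      <Field name="dDocName">1234567</Field>
      <Field name="dDocTitle">There is a cow in the road</Field>
      <!-- remaining Field elements -->
      <File name="primaryFile" href="hello.txt">aGVsbG8KCg==</File>
    </Document>
  </Service>
</GenericRequest>
```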

     

    The BPM script is now complete and your Studio application should look similar to this.

    [Screenshot: the completed script in the BPM Studio expression editor]

    Deploying the Process

    Now we need to deploy the BPM process to our BPM server so we can test it. We are going to deploy to the new BPM 12c Integrated WebLogic Server that comes with Studio, but another server can be used if preferred.

     

    Tip: If this is the first deployment to the Integrated WebLogic Server, Studio will first ask for parameters and create the domain before deploying.

     

    38) In the Application Explorer, right-click the GroovyDemo project and select Deploy -> GroovyDemo -> Deploy to Application Server -> Next -> Next -> IntegratedWeblogicServer -> Next -> Next -> Finish.

     

    [Screenshot: the deployment wizard]

    [Screenshot: the deployment wizard (continued)]

    The deployment log should complete successfully.

    [Screenshot: the deployment log]

    Testing the Deployed Process

    Now it is time to test the process. We will invoke our BPM process through the web service test page.

    39) Open a browser window, go to the Web Services Test Client page at http://localhost:7101/soa-infra/ and log in with the weblogic user.

    Click on the Test GroovyDemoProcess.service link.

    [Screenshot: the Web Services Test Client page]

    40) Click on the start operation.

     

    [Screenshot: the start operation]

    41) Click on the Raw Message button to enter a raw XML SOAP payload.

     

    [Screenshot: the Raw Message option]

    In the text box, paste the following sample WebCenter Content GenericRequest payload.

     

    <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:gro="http://xmlns.oracle.com/bpmn/bpmnProcess/GroovyDemoProcess" xmlns:v1="http://opengov.com/311/citizen/v1">
       <soapenv:Header/>
       <soapenv:Body>
          <gro:start>
             <v1:Incident>
                <v1:Name>Joe Bloggs</v1:Name>
                <v1:Email>joe.blogs@mail.net</v1:Email>
                <v1:PhoneNumber>12345</v1:PhoneNumber>
                <v1:Reporter>03a7ee8a-ae3f-428b-a525-7b50ac411234</v1:Reporter>
                <v1:IncidentType>Animal</v1:IncidentType>
                <v1:IncidentStatus>OPEN</v1:IncidentStatus>
                <v1:Date>2014-09-17T18:49:45</v1:Date>
                <v1:Subject>There is a cow in the road</v1:Subject>
                <v1:Description>I have seen a big cow in the road. What should I do?</v1:Description>
                <v1:GeoLatitude>37.53</v1:GeoLatitude>
                <v1:GeoLongitude>-122.25</v1:GeoLongitude>
                <v1:Street>500 Oracle parkway</v1:Street>
                <v1:StreetNumber>500</v1:StreetNumber>
                <v1:PostalCode>94065</v1:PostalCode>
                <v1:City>Redwood City</v1:City>
                <v1:LocationDetails>Right in the middle of the road</v1:LocationDetails>
                <v1:Attachment>
                   <v1:File>aGVsbG8KCg==</v1:File>
                   <v1:Name>hello.txt</v1:Name>
                   <v1:Href/>
                </v1:Attachment>
             </v1:Incident>
          </gro:start>
       </soapenv:Body>
    </soapenv:Envelope>

     

    42) Click the Invoke button in the bottom right-hand corner.

     

    [Screenshot: the Invoke button]

    43) Scroll down to the bottom to see the Test Results.

    [Screenshot: the test results]

    Congratulations! We can see that the Incident request we sent to Oracle BPM 12c has been transformed to a WebCenter Content GenericRequest using Groovy scripting.

     

    Tip: The Web Services Test Client is a lightweight way to test deployed web services without using Enterprise Manager. For full instance debugging and instance details, use Enterprise Manager or the Business Process Workspace.

     

    If we track this instance in Enterprise Manager, we can see what happened at runtime in graphical form.

     

    [Screenshot: the process instance flow trace in Enterprise Manager]

    We can also look at the log from the Integrated WebLogic Server in Studio, which shows the debug expression we included.

    [Screenshot: the server log showing the debug output]

    Tip: This process could easily be remodelled to be asynchronous or reusable, and the transformed GenericRequest could be used in the input association of a Service activity to actually invoke the WebCenter Content SOAP service.

    The actual implemented process that this example comes from in the B2C scenario looks like this. It is a reusable process that waits for the upload to WebCenter Content to complete before querying the final document details and returning to the main BPM process.

    [Screenshot: the reusable CreateContent process]

    Summary

    In this blog we introduced Groovy BPM scripting in BPM 12c. First, we learned how to model a synchronous BPM process based on predefined XML schema types.

    With BPM scripting, we covered:

    • Where and how to use BPM scripting in a BPM process
    • How to import classes
    • How to instantiate and declare Groovy objects
    • How to print debug messages to the WebLogic log file
    • How to use process data objects
    • How to use predefined variables
    • How to format data
    • How to dynamically build data object structures
    • How to programmatically transform data between different XML schema types
    • How to deploy and test using the Web Services Test Client tool

     

    In the next blog in this series I will demonstrate how to use BPM scripting in Business Objects and how to handle exceptions in BPM scripts.

     

    Tip: For more information on BPM scripting (e.g. the list of predefined variables) see the section Writing BPM Scripts in the official BPM documentation.