These are general-interest articles about computer hardware, software, and the internet. You know what kind of nerd I must be when I have separate sections for Science and Technology... and multiple subsections for each.

I'm currently at IOUG Collaborate 2014 in Las Vegas, and I recently finished my 2-hour deep dive into WebCenter. I collected a bunch of tips & tricks in 5 different areas: metadata, contribution, consumption, security, and integrations:

We recently had a client with some LDAP performance issues, and needed to tune how WebLogic was querying their LDAP repository. In WebLogic, the simplest way to do this is with LDAP filters. While trying to explain how to do this, I was struck by the lack of clear documentation on what exactly these filters are and why on earth you would need them... The best documentation was in the WebCenter guide, but it was still a bit light on the details.

Firstly, all these filters use LDAP query syntax. For those familiar with SQL, LDAP query syntax looks pretty dang weird... mainly because it uses prefix, or Polish, notation to construct the queries. So if you wanted all Contact objects in the LDAP repository with a Common Name that began with "Joe", your query would look like this:

(&(objectClass=contact)(cn=Joe*))

Notice how the ampersand AND operator is in the front, and the conditionals are in their own parentheses. Also note the * wildcard. If you wanted to grab all Group objects that had either Marketing or Sales in the name, the query would look like this:

(&(objectClass=group)(|(cn=*Marketing*)(cn=*Sales*)))

Notice that the pipe OR operator prefixes the conditionals checking for Marketing or Sales in the group. Of course, this would not be a great query to run frequently... substring searches are slow, and multiple substring searches are even worse!
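To make the prefix notation concrete, here's a small JavaScript sketch (the helper names are my own invention) that assembles filters like the ones above:

```javascript
// Tiny helpers that build LDAP filters in prefix (Polish) notation:
// each clause is "(attr=value)", and the & / | operator is prefixed
// to the whole group of clauses.
function clause(attr, value) {
  return "(" + attr + "=" + value + ")";
}
function and(...clauses) {
  return "(&" + clauses.join("") + ")";
}
function or(...clauses) {
  return "(|" + clauses.join("") + ")";
}

// All Contact objects whose Common Name starts with "Joe":
const joes = and(clause("objectClass", "contact"), clause("cn", "Joe*"));
// → (&(objectClass=contact)(cn=Joe*))

// All Group objects with Marketing or Sales in the name:
const groups = and(
  clause("objectClass", "group"),
  or(clause("cn", "*Marketing*"), clause("cn", "*Sales*"))
);
// → (&(objectClass=group)(|(cn=*Marketing*)(cn=*Sales*)))
```

Note how the operator always comes first and wraps the whole group, which is what makes the real queries below read "inside out" compared to SQL.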

Below are what these filters do, and why I think you'd need to change them...

All Users Filter: This is basically the initial filter to grab all "user" objects in the entire repository. LDAP stores all kinds of objects (groups, contacts, computers, domains), and this is a simple query to narrow the list of user objects from the collection of all objects. A common setting is simply:

Users From Name Filter: This is a query to find a user object based on the name of the user. It's a sub-filter of the previous All Users Filter that grabs one specific person based on the user name. You would sometimes change this based on what single sign-on system you are using: some use the common name as the official user ID, whereas other systems use the sAMAccountName. The %u token is the name being looked up. One of these two usually works:

(&(cn=%u)(objectclass=user))
(&(sAMAccountName=%u)(objectclass=user))
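If you ever substitute a user-supplied name for the %u token yourself, it's worth escaping the LDAP special characters from RFC 4515 so a crafted name can't inject extra filter clauses. A hypothetical sketch (WebLogic handles this substitution internally; the function names are mine):

```javascript
// Escape the special characters from RFC 4515 before dropping a
// user-supplied value into an LDAP filter.
function escapeLdapValue(value) {
  return value
    .replace(/\\/g, "\\5c") // backslash first, so we don't re-escape
    .replace(/\*/g, "\\2a")
    .replace(/\(/g, "\\28")
    .replace(/\)/g, "\\29")
    .replace(/\0/g, "\\00");
}

// Fill the %u token in a Users From Name Filter template.
function fillUserFilter(template, userName) {
  return template.replace("%u", escapeLdapValue(userName));
}

const filter = fillUserFilter("(&(cn=%u)(objectclass=user))", "Joe Smith");
// → (&(cn=Joe Smith)(objectclass=user))
```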

All Groups Filter: Similar to the All Users Filter, this filter narrows the list of all objects in the LDAP repository down to just the groups. By default, most applications just grab all group objects with this filter:

(objectCategory=group)

However, if you have a particularly large LDAP repository, this can be a performance problem. We usually don't need all the groups defined in the repository; we just need the ones with a specific name:

(&(objectCategory=group)(|(cn=Sales)(cn=Marketing)))

Or the ones under a specific organizational unit:

(&(objectCategory=group)(|(ou:dn:=Sales)(ou:dn:=Marketing)))

Then the list of group objects to query based on name is much smaller and faster.

Group From Name Filter: Similar to the Users From Name Filter, this filter looks up a specific group by name (the %g token). Again, the value here usually depends on what single sign-on solution you are using, but one of these two usually works:

Hopefully that clears things up a bit! If you have performance problems, your best bet is to modify the All Groups Filter and the All Users Filter to only grab the groups and users relevant to your specific app.

Three years ago I blogged about the site 99 Bottles of Beer, which is a site dedicated to generating the lyrics of that oh so annoying song in every programming language known... currently over 1500 languages have been submitted. It's a surprisingly useful exercise when learning a new language... loops, text output, conditionals, etc.

Three years ago I submitted IdocScript to their library. I recently came across it again, and was shocked to find that nobody has submitted ADF yet! Geek rules state that ADF can't be an "official" language until it's on that site, so I had to do my part. Below is my humble submission:

I'm using a pretty basic program here... a vertical af:panelGroupLayout, an af:forEach tag to loop, and an af:outputText tag with expression language to print out the index. The af:spacer is there just to make it easier to read. Unfortunately, the af:forEach tag does not iterate backwards (setting the 'step' attribute to '-1' makes it bark at me), so I have to subtract the index from 100 to get the number of bottles of beer on the wall.

This mainly demonstrates how expression language, loops, and conditionals can be used on an ADF Faces page. Another option would be to generate an array in a backing bean and bind that to the af:forEach tag, but I wanted to keep it all in one file.

Not sure if this will ever be officially accepted... because there are over 1117 languages in their approval queue! I guess after IdocScript was admitted, the site got so popular they just couldn't keep up with demand...

Another talk I gave at Collaborate 2013 is this one on ADF Mobile and WebCenter. It builds off my talk from last year about general techniques, and gets into specifics about the new ADF Mobile technology and how to integrate it with WebCenter Content and WebCenter Portal.

At Collaborate 2013 this year, Tony Field and I put together a talk about a topic that has been floating around the WebCenter community as of late... How do I integrate WebCenter Sites (FatWire) with WebCenter Content or Site Studio? We put together a handful of integration techniques, but the main focus was on upcoming features in the next version of WebCenter... specifically the official Sites/Content connector, and support for External Repositories. Cool by themselves, but when combined with Site Studio for External Applications, it's a compelling set of integration options:

I was recently doing some training on ADF, and the students were complaining about how slow JDeveloper was... Dragging and dropping Data Controls onto a JSF page? It's the pause of death, if you will. Not to mention the "Out Of Memory" errors that crop up in the middle of debugging a large app. Very frustrating for developers, so I decided to figure out once and for all what magic JVM tuning parameters would speed it up.

As a general rule, Java is optimized for throughput, not latency. Once the garbage collector kicks in, performance drops like a rock. A 2-second pause every once in a while is OK for a server, but for an IDE it's misery. So here's the fix:

Go to your JDeveloper root directory; it should be something like C:\Oracle\jdev\Middleware\jdeveloper

Open the file ide\bin\ide.conf, and scroll down to the default memory settings:

Then restart JDeveloper... If it doesn't start, you'll need to reduce the amount of memory allocated in the ide.conf file from step 3.

And that's it! Your mileage may vary, of course... And you may need additional parameters, depending on what version of JDeveloper you're running. Just keep in mind that you are tuning Java for shorter pauses, and not greater throughput.
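To make "shorter pauses, not greater throughput" concrete: options in ide.conf are added with the AddVMOption directive (the same syntax the AggressiveOpts line in Update 2 uses), and era-appropriate low-pause settings looked something like the lines below. These exact values are illustrative guesses, not the post's original settings:

```
# Illustrative only -- exact values depend on your JDeveloper
# version and how much RAM your workstation has.
AddVMOption  -Xms512M                 # starting heap size
AddVMOption  -Xmx1024M                # maximum heap size
AddVMOption  -XX:+UseConcMarkSweepGC  # concurrent, low-pause collector
```

The key idea is the collector choice: the concurrent collector trades a little raw throughput for much shorter garbage-collection pauses, which is exactly the trade an IDE user wants.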

UPDATE 1: some students still had issues, so in addition to the JVM settings, I've found these tips also help out:

Go to Tools / Preferences / Environment, and switch to the "Windows" look and feel. The Oracle look and feel is prettier, but slower.

Disable all extensions that you don't need. This is usually a huge savings... Go to Tools / Preferences / Extensions, and turn off things you know you don't need. One thing I do is disable all extensions by default, then enable only the ones I know I need for my current project. For example, disable everything, then enable only those extensions that start with ADF. This will automatically enable dependent extensions. Enable others (Portal, SOA, RIDC) only if needed.

Open all documents in "Source" mode by default. Go to Tools / Preferences / File Types, and click the Default Editor tab. For all web pages (HTML, JSF, JSP) set the default editor to "Source". You can always click the "Design" tab to see the design. For best results, select items in the "Structure" window (by default on lower left) and edit them in the "Property Inspector" window (by default on the lower right).

If you really want to get extreme... you can install a solid-state hard drive in your workstation. Barring that, if you have enough RAM, you can allocate 4 GB and create a RAM drive for your system. This looks like a normal hard drive, but it's all in RAM. Then install JDeveloper on that, and it will be almost as good as a solid-state drive.

UPDATE 2: A reader has informed me that this line:

#AddVMOption -XX:+AggressiveOpts

breaks offline database support in JDeveloper... so that one will have to be avoided in some cases.

In addition to my FatWire tutorial talk, I gave one on ADF Mobile. Or, more accurately, I talked about how cool it is that Oracle is going to be bundling PhoneGap with their ADF Mobile toolkits!

I was never really a fan of mobile applications: I prefer the mobile web experience. Every mobile device supports HTML5, which means that you can do just about everything a mobile app can do, other than high-performance graphics. In fact, according to an Adobe study, users prefer mobile web to mobile apps for just about everything.

In cases where you do need mobile functionality (camera, bar code scanner), it makes sense to make a hybrid app with PhoneGap/Cordova/ADF, rather than a native app. This means 99% of the functionality is in HTML5, and 1% is in native code called from JavaScript.

And please make sure your mobile strategy is a natural extension of your business model... or you'll be out a lot of money!

Oracle recently acquired FatWire, and renamed it WebCenter Sites. It is a "web experience management" toolkit, which is similar to Oracle's existing Site Studio product -- a part of Oracle UCM, now called WebCenter Content.

After using Site Studio for years, I got pretty accustomed to its terminology and toolkits... so looking at FatWire was initially intimidating because it was just so dang different. But, after using it for several months, I've come to the conclusion that a lot of the fundamentals are pretty similar. Pretty much everything Site Studio does is built into FatWire, and FatWire has a few nifty extras as well.

So, for IOUG Collaborate this year, I put my insights together into a presentation: Crash Course in FatWire for Site Studio Developers:

It's not a replacement for actual training... but it does cover all the major low-level assets, and how they fit together to form a site. If you know a thing or two about Site Studio, this should help you get over the initial "fear of the unknown!"

Upgrading ECM can be a multi-step process. You need to upgrade WebLogic before upgrading ECM, and you need to make sure you have the right version of the Repository Creation Utility (RCU)... not to mention the multi-gigabyte general installer for ECM itself (which includes IPM, UCM, IRM, and URM). If it's a new install, just grab the most recent WebLogic Server downloads. Otherwise, use the upgrade installers below:

You're looking around Oracle for the latest patches, and after copious amounts of digging, you finally find the mystery patch that you need... you click on the "download" link, install it, and you're good to go!

Later on... your client, or co-worker, or somebody on the message board asks, "How'd you do that?" And because you have a photographic memory, you reply "With patch 12395560, of course!" Then they ask, "got a link?" And then you say this:

In order to simplify the process (and make my documentation more readable), I set up a URL Shortener for Oracle patches for myself. Unlike most URL shorteners, it takes a parameter. The number after the slash is the Oracle patch number... which should be easy to spot on the form. So, instead of the crazy URL above, you could use one of these two:

The first one goes to the standard My Oracle Support page -- with all its flashy goodness -- and gets as close to a "quick link" as I could manage. The second URL goes to the old-fashioned Oracle Updates web site, which supports parameterized URLs quite nicely. Guess which one I prefer? ;-)

Ideally, the Oracle support team would implement a parameter-based redirect themselves... and expose that "quick link" on the support page. Until then, I'm going to do it this way. I wonder if it will catch on???

In part 1 of this post, I covered the JSON-P "standard" for mashups. Not so much a standard per se, but a sneaky way to share JSON between servers by wrapping it in a 'callback' function... For example, if we have our raw JSON data at this URL:

http://example.com/data.js

A direct access would return the raw data dump in JSON format:

{ "foo": "FOO", "bar": "BAR" }

However, a JSON-P call would return a JavaScript file, that calls a 'callback' function with the raw data:

callback({ "foo": "FOO", "bar": "BAR" });

Since this is pure JavaScript, we can use it to bypass the "Same-Origin Policy" for AJAX... A typical AJAX call uses the XmlHttpRequest object, which only allows calls back to the originating server... which, of course, means true mashups are impossible. JSON-P is one of the (many) ways around this limitation.
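The "padding" itself is trivial. On the server side, it's nothing more than wrapping the serialized JSON in a function call, along the lines of this sketch (the function name is my own):

```javascript
// Server-side sketch of JSON-P "padding": given the response data and
// the callback name from the query string, wrap the JSON in a function
// call so the browser can load it via a plain <script> tag.
function padJson(callbackName, data) {
  return callbackName + "(" + JSON.stringify(data) + ");";
}

const body = padJson("callback", { foo: "FOO", bar: "BAR" });
// → callback({"foo":"FOO","bar":"BAR"});
```

The server would send this back with a text/javascript content type instead of the raw JSON, and only when a callback parameter is present in the request.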

Since JSON-P is something of a hack, many developers started looking for a more secure standard for sharing JSON and XML resources between web sites. They came up with Cross-Origin Resource Sharing, or CORS for short. Enabling CORS is as simple as passing this HTTP header in your XML/JSON resources:

Access-Control-Allow-Origin: *

Then, any website on the planet would be able to access your XML/JSON resources using the standard XmlHttpRequest object for AJAX. Although I like where CORS is going, and see it as the future, I just cannot recommend it at this point.

Security

Since CORS is built on top of the XmlHttpRequest object, it has much nicer error handling. If the server is down, you can recover from the error and display a message to the user immediately. With JSON-P, you can't access the HTTP error code... so you have to roll your own error handling. Also, since CORS is a standard, it's pretty easy to just put an HTTP header in all your responses to enable it.

My big problem with CORS comes from the fact that it just doesn't seem that well supported yet... Only modern browsers understand it, and cross-domain authentication seems to be a bit broken everywhere. If you want secure or personalized JSON on a mashup, your back-end applications will also need to set this HTTP header:

Now, JSON-P isn't great with security, either. Whereas CORS is too restrictive, JSON-P is too permissive. If you enable JSON-P, then you pass auth credentials to the back-end server with every request. This may not be a concern for public content, but if an evil web site can trick you into going to their mashup instead of your normal mashup, they can steal information with your credentials. This is called Cross-Site Request Forgery, and it's a general security problem with Web 2.0 applications... JSON-P is just one more way to take advantage of any security holes you may have.

Performance

In addition, the whole CORS process seems a bit 'chatty.' Whereas JSON-P requires one HTTP request to get secure data, CORS requires three requests. For example, assume we had two CORS enabled applications (app1 and app2) and we'd like to blend the data together on a mashup. Here's the process for connecting to app1 via CORS and AJAX:

1. Pre-Flight Request: a round-trip from the client browser to app1 as an HTTP 'OPTIONS' request, to see if CORS is enabled between the mashup and app1

2. Request: if CORS is enabled, the browser then sends a request to app1, which sends back an 'access denied' response.

3. Authenticated Request: if cross-origin authentication is enabled, the data is sent a third time, along with the proper auth headers, and hopefully a real response comes back!

That's three HTTP requests for CORS compared to one for JSON-P. Also, there's a lot of magic in step 3: will it send back all the auth headers? What about cookies? There are ways to speed up the process, including a whole ton of good ideas for CORS extensions, but these appear to be unpopular for now.
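To make the pre-flight step concrete, here's a sketch of what the server side of that OPTIONS exchange computes. The allow-list and helper name are invented for illustration; note the Access-Control-Max-Age header, which lets the browser cache the pre-flight answer and is one of the ways to cut down the chattiness:

```javascript
// Sketch of the server side of a CORS pre-flight exchange: map the
// Origin header of the OPTIONS request to the response headers the
// browser expects before it will send the real request.
const ALLOWED_ORIGINS = ["http://mashup.example.com"];

function preflightHeaders(origin) {
  if (!ALLOWED_ORIGINS.includes(origin)) {
    return null; // no CORS headers -> the browser blocks the real request
  }
  return {
    "Access-Control-Allow-Origin": origin,
    "Access-Control-Allow-Methods": "GET, POST",
    // Let the browser cache this answer for a day, so the extra
    // pre-flight round-trip isn't paid on every single request.
    "Access-Control-Max-Age": "86400"
  };
}

const ok = preflightHeaders("http://mashup.example.com");
const blocked = preflightHeaders("http://evil.example.com");
```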

Conclusion: Use JSON-P With Seatbelts

If all you care about is public content, then CORS will work fine. Also, it's a 5-minute configuration setting on your web server... so it's a breeze to turn on and let your users create mashups at their leisure. If you don't create the mashups yourself, this is sufficient.

However... if you wish to do anything remotely interesting or complex, JSON-P has much more power, and fewer restrictions. But, for security reasons, on the server side I'd recommend a few safety features:

But wait, isn't it easy to spoof the HTTP referrer? Yes, an evil client can spoof the value of the Referer header, but an evil server cannot. In order for an evil mashup to spoof the referrer, it would have to trick the innocent user into downloading and running a signed applet, or something similar. This is a typical trojan-horse attack, and if you fall for it, you've got bigger problems than fancy AJAX attack vectors... DNS rebinding is much more dangerous, and is possible with any AJAX application, regardless of JSON-P or CORS support.
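As one hypothetical example of such a server-side seatbelt, a Referer allow-list check might look like this (the helper and the site list are invented for illustration):

```javascript
// Hypothetical server-side "seatbelt" for JSON-P: only serve padded
// responses when the Referer header belongs to a site you trust.
// Spoofable by a hostile client, but not by a hostile mashup server.
const TRUSTED_SITES = ["mashup.example.com", "intranet.example.com"];

function refererIsTrusted(refererHeader) {
  if (!refererHeader) return false;
  try {
    const host = new URL(refererHeader).hostname;
    return TRUSTED_SITES.includes(host);
  } catch (e) {
    return false; // malformed header: deny by default
  }
}
```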

Links and Free Downloads

For those of you interested in Oracle WebCenter, I created a CrossDomainJson component that enables both CORS and JSON-P, and it includes some sample code and documentation for how to use it. It currently works with WebCenter Content, but I might expand it to include WebCenter Spaces, if I see any interest.

For those of you in the Toronto area, I'll be presenting at the AIIM/Oracle Social Business Seminar this Thursday! It's at Ruth's Chris Steakhouse, 145 Richmond Street West, Toronto, ON. The agenda is as follows:

10:00 a.m: How Social Business Is Driving Innovation, Presented by: John Mancini, AIIM

In my previous post, I was talking about the JSON-P standard for mashups. It's very handy, but more of a "convention" than a true standard... Nevertheless, it's very popular, including support in jQuery and Twitter. In this post I'm going to discuss what some consider to be the modern alternative to JSON-P: Cross-Origin Resource Sharing, or CORS for short.

Let's say you had two applications, running at app1.example.com and app2.example.com. They both support AJAX requests, but of course, they are limited by the "Same-Origin Policy." This means app1 can make AJAX requests to app1, but not to app2. Let's further assume that you'd like to make a mashup of these two apps at mashup.example.com.

No problem! In order to enable cross-origin AJAX, you simply need to make sure app1 and app2 send back their AJAX responses with this HTTP header:

Access-Control-Allow-Origin: http://mashup.example.com

This is easily done by adding one line to the Apache httpd.conf file on app1 and app2:

Header set Access-Control-Allow-Origin http://mashup.example.com

DONE! Now, with standard AJAX calls, you can host an HTML page on mashup.example.com and connect to app1 and app2 using nothing but JavaScript! There are about a half dozen additional cross-origin HTTP headers that you can set... including what methods are allowed (GET/POST), how long to cache the data, and how to deal with credentials in the request... naturally, not all browsers support all headers, so your mileage may vary!

Not to mention, because the XmlHttpRequest object is used, CORS has much better error handling than JSON-P. If there's an error accessing a file, you can catch that error and warn the end user. Contrast that with JSON-P, where there's no built-in way to know when you can't access a file. You can build your own error handling, but there's no standard.

Nevertheless, I still prefer JSON-P for mashups. Why? Well, it boils down to two things: performance and security. I'll be covering the specifics in part 3 of this post.

In a recent project, I had a client who wanted to resurface Oracle UCM content on another web page. The normal process would be to use some back-end technology -- like SOAP, CIS, or RIDC -- to make the connection. But, as a lark, I thought it would be more fun to do this purely as a mashup. I would need to tweak UCM to be more "mashup-friendly" -- I'll be sharing the code (eventually) -- but first I needed to do some research on the best mashup "standard" out there.

UCM supports JSON, but that's not enough for a true mashup. The problem is that even though UCM can send back JSON encoded responses, you cannot access this data from a different web page. This is because of the "Same-Origin Policy" in AJAX. Basically, you can make an AJAX call back to the originating server, but you cannot make it to a different server. This is quite annoying, because then you can't "mash-up" UCM content onto another web page using just JavaScript. The best mashup APIs -- like Google Maps -- can't use AJAX because of this limitation.

Many developers consider this 'security' feature quite odd, because it's totally okie-kosher to include JavaScript from other people's web sites... so why not AJAX? Knowing full well that this was kind of stupid, some developers came up with a 'convention' for fixing it: "padded JSON," or JSON-P. This means 'padding' a standard JSON response with a callback, and then calling that callback function with the response. For example, if you called the PING_SERVER service with JSON enabled, with a URL like so:

You would then use the standard AJAX XmlHttpRequest object, parse this JSON data, then do something with the message. My jQuery Plugin for UCM does exactly this... but of course it has the limitation that it will only work on HTML pages served up by UCM. You can use fancy proxies to bypass this limitation, but it's a pain.

Instead, if UCM supported 'padded JSON', the process would be different. The URL would look something like this:

In this case, the callback=processData parameter triggers the server to 'wrap' the JSON response in a call to the function processData. Then, instead of using the XmlHttpRequest object, you'd use good old-fashioned remote scripting. Like so:

Notice how we define a function on the page called 'processData.' When the UCM response returns, it will call that function with our response data. The beauty here is that you can put this JavaScript on any web page in your enterprise, and connect directly with UCM with nothing but JavaScript. Pretty nifty, eh?
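The remote-scripting pattern described above can be sketched like this. The URL builder is testable anywhere; the script-injection part needs a real browser DOM. The UCM base URL and the IsJson parameter here are illustrative, not taken from the original post:

```javascript
// Build a JSON-P request URL: the service parameters plus a callback
// parameter naming the function the padded response should invoke.
function jsonpUrl(baseUrl, params, callbackName) {
  const all = Object.assign({}, params, { callback: callbackName });
  const query = Object.entries(all)
    .map(([k, v]) => encodeURIComponent(k) + "=" + encodeURIComponent(v))
    .join("&");
  return baseUrl + "?" + query;
}

// The function the padded response will call when it arrives:
function processData(data) {
  console.log("Server said: " + JSON.stringify(data));
}

// Script injection (browser only -- needs a DOM):
function injectScript(url) {
  const s = document.createElement("script");
  s.src = url;
  document.body.appendChild(s);
}

const url = jsonpUrl(
  "http://ucm.example.com/idc/idcplg",
  { IdcService: "PING_SERVER", IsJson: "1" },
  "processData"
);
// → http://ucm.example.com/idc/idcplg?IdcService=PING_SERVER&IsJson=1&callback=processData
```

In a page, you'd call injectScript(url); when the script loads, the server's padded response runs and hands your data to processData.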

Now... JSON-P is a good idea, but it's about 5 years old... A lot of newer browsers support a slightly different standard: Cross-Origin Resource Sharing. It's an actual standard, unlike JSON-P, which is more of a convention... the purpose is to safely allow some sites to violate the silly "Same-Origin Policy". I'll be covering CORS in part 2 of this post, including its security enhancements. But in part 3, I'll explain why I still prefer JSON-P, provided you add some extra security.

I gave two presentations at Oracle Open World this month... one on Integrating WebCenter Content: Five Tips to Try, and Five Traps to Avoid! I broke it down into the big sections: contribution, consumption, metadata, security, and integrations. Special thanks to IOUG for sponsoring this talk!

My second talk was a case study based on a big project that completed recently, integrating WebLogic Portal, UCM, E-Business Suite, Autonomy IDOL, and a whole bunch of other stuff to make a global e-commerce web site. The client is in a highly regulated industry, and I was unable to get permission to use their name... but if you're curious about the details ping me!

The WebCenter Portal team has put together a VirtualBox virtual machine to showcase the WebCenter Portal product. You can download it from Oracle. It's a big one: clocking in at 30 GB, so pack a lunch before downloading it.

The install instructions are pretty good for Windows and Linux clients... but if you're on a Mac (like me), it's missing one important tip. The file REAVDD-HOL-WC.ovf contains the information needed to import the files into a VirtualBox VM... but if you're running the free version of VirtualBox, it chokes on the import every time. The culprit is this line:

If you're on Windows, and have a D drive, this works fine... but if you're on a Mac (and probably Linux), this will break the import. The fix? Use this XML instead:

And re-do the import... you'll need to set up sharing again once it's running. But at least now it will have a valid path!

NOTE: This is just meant to be a sandbox for testing integrations and the like. It's not meant to be placed into a production environment... but, like all demo code, I'm sure I'll find it floating around in production eventually... and have to make it work.

I was always a little skeptical about the initial mobile offerings for UCM and WebCenter. They never impressed me, because I felt strongly that these apps were fundamentally flawed in their design...

Why? Because they focused on being Mobile Applications instead of Mobile Web. The first time I held an iPhone, I noticed that it was running a browser that supported HTML5. The first Android was the same. This was at a time when HTML5 support was rare on desktop browsers, and few developers knew how to use it. Nevertheless, I predicted years ago that it would be the future... HTML5 was so powerful that Flash and native mobile apps were unnecessary for 95% of applications. Many clients asked my advice on mobile apps, and my answer was always the same: "Skip native apps, and focus on the mobile web!"

This week, Oracle announced their next generation of the ADF Mobile toolkit... and (as I predicted) they are going the same route! Native code is no longer the focus: previously, you would create an ADF component, and it would be compiled down into native iOS or Android controls. No more! The next version will compile to HTML5 and be rendered in the mobile browser!

How can this be? With a technology called PhoneGap. It allows you to create your application in nothing but HTML5, render it in a browser, and still access native functionality (camera, location, files) with JavaScript functions. It's basically a wrapper around the built-in HTML5 browser, plus a plug-in library, which together give you an extremely powerful development environment. The next generation of ADF Mobile will be an ADF wrapper around PhoneGap, plus a few extra goodies (that I'm not allowed to talk about yet!). They call these hybrid applications because they are mostly HTML5, with a tiny bit of native code mixed in.

Well, what about those candy-coated user interfaces? How do I get those? The same way as always: mobile JavaScript toolkits. There are several available that can make very attractive interfaces, that render in any smartphone:

If you prefer to roll-your-own UI, I'd recommend Zepto as a minimalist framework instead...

What's next for the web, then? I believe that mobile application development will be the biggest driver for the adoption of HTML5 browsers. Yes, probably only 10% of mobile phones are HTML5-enabled smart phones... but people cycle through cell phones every 2 years. Compare that to the enterprise, some of which stubbornly refuse to upgrade from IE6!

I'd bet 90% of Americans will have an HTML5 mobile phone before 90% of them are off IE6! Sad, but true... but good news for mobile developers!

UPDATE: Dang it! Just as soon as I blog about this, Adobe goes and purchases PhoneGap! What does this mean for Oracle? Tough to say... it's probably a good thing, since most of PhoneGap is open source. The only piece that's not open source is their nifty build engine. But, since Oracle already owns its own build engines (JDeveloper and an Eclipse plugin), this is not a stumbling block.

UPDATE 2: It appears that Adobe has done "The Right Thing" and is submitting PhoneGap to the Apache group, re-branding it as Project Callback. This will help cement it as "the standard" toolkit for mobile app development.

PowerPoint is a necessary evil... everybody is expected to give presentations in it, but few people are good at it. They cram too much information into one slide, and pack them full of data that might better go in a report. Presentations work best when used to persuade; they're awkward tools for educating. There's a reason PowerPoint was banned by the Pentagon:

"PowerPoint is dangerous because it can create the illusion of understanding and the illusion of control" -- Brig. Gen. H. R. McMaster

But alas... we're still stuck with PowerPoint... so we should probably make the best of it!

One of the ways to make PowerPoint presentations more compelling is to tell a story... unfortunately, most people are pretty bad at telling stories as well. There's an entire industry built around corporate storytelling that trains people to engage an audience with a full-fledged story... but there's an even simpler approach. The creators of South Park stumbled on a formula that they still use to assemble stories:

These same rules can apply to making a PowerPoint presentation flow like a story.

You initially assemble your main points -- which is usually the hard part. Then, when arranging your points to tell a story, try to transition between them with the word "therefore," or the word "but." Like so:

Slide 1

therefore...

Slide 2

but...

Slide 3

therefore...

Slide 4

but...

Simple, no? You'll be surprised how much better your presentations will "flow" from one point to the next with this method.

Naturally, not all presentations can fit into this pattern... for example, "Top 10" presentations flow numerically from one point to another... so if people doze off, they can pick back up at the start of the next chunk. Also, there may be times when the dreaded "and then" transition is needed, such as when a point needs to be communicated over several slides.

Nevertheless, if you try hard to use better transitions, your story will be more compelling, and PowerPoint will be one notch less evil.

Open World is barely a month away! I'll be heading there early for some Oracle ACE briefings and the like... I'm normally a "broadcast only" Twitter user, but when I'm at conferences I check it all the time, and tweet with location services on. If you want to meet up, just message me!

I have a couple of sessions this year... unfortunately they are all on Thursday! Dang it! I was hoping to leave the conference early -- since Michelle and I are having our first kid, and her due date is a few weeks after Open World. Alas, the scheduling gods were not with me:

Session: 10843

Creating a Global E-Commerce Site with Oracle E-Business Suite and Oracle Fusion Middleware

I know picking Open World sessions can be a bit of a baffling ordeal... so if you're pressed for time, I'll suggest a few tips. If you want to see WebCenter based content, check out the WebCenter partner sessions. Lots of good stuff there. If you're curious about non-WebCenter products but don't know where to start, I'd recommend the Oracle ACE sessions over just about everything else. ACE sessions are a good bet: speakers are usually very knowledgeable, very passionate, and very excited to share what they know. Translation: minimal marketing fluff. You don't get the title "Oracle ACE" by being a self-promoting fool!