Category: java livelink

Recently I was tasked with upgrading our system from its initial version in a two-version jump, from 9.7.1 to 10.5. I will chronicle my experiences here and perhaps provide a structure. I also want to quell some myths about the whole process, just to make sure you can do this logically and correctly.

First, create a draft plan. In my draft plan I included this: I would need at least one VM that could house our Production 9.7.1 binaries, a clone of the database, and a clone of the EFS. If one has been storing items in, say, Archive Server, one should have a playing copy of that as well. An OTAS-based Livelink upgrade can pose its challenges if not done correctly, but it is no different from an EFS: if you do it wrong, you can write test data into your productive store 🙂 This involves talking to other people like DBAs and storage folks, as a heavily used LL system can have tons and tons of data. You will most likely encounter the after-effects of long-time use, different administrators and their styles, and so on.

So after a few failed attempts I understood why OT says a DB5 verification is very important. The verification basically runs single-threaded and tries to pinpoint anomalies in the database. Most likely it will pinpoint content loss, such as bulk imports without the actual version files et al. In my case the database had wrong pointers, duplicate DataIDs (wow), and removed KUAF IDs (somebody had fun with an Oracle tool). So after going back and forth between OT and us, I thought it through and created a program (Oscript) to check important structures. I list three, and your mileage will vary, but essentially I grabbed all categories, form templates and WF map structures. None of these checks happen in DB5; it just looks for existence. In one hilarious case I uncovered a category data structure that mysteriously turned up as a drawing PDF. One could argue that the users would have noticed, so this was an old category that nobody missed. You don't have to do this, but each of those anomalies would prevent an upgrade, and fixing them means back and forth with OT support, so I did it just to gain some time advantage.
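The kind of anomalies I was hunting for (duplicate DataIDs, dangling KUAF references) can be sketched in plain Java. This is only an illustration of the check logic, not my Oscript program: the in-memory rows are made up, and a real check would pull them from the DTree and KUAF tables via SQL.

```java
import java.util.*;

public class SanityCheck {
    // Flags DataIDs that appear more than once (DataID should be unique per node).
    static Set<Integer> duplicateIds(List<Integer> dataIds) {
        Set<Integer> seen = new HashSet<>(), dups = new TreeSet<>();
        for (int id : dataIds) {
            if (!seen.add(id)) dups.add(id);
        }
        return dups;
    }

    // Flags nodes whose owner ID does not resolve to any known KUAF (user/group) ID.
    static Set<Integer> danglingOwners(Map<Integer, Integer> nodeToOwner,
                                       Set<Integer> kuafIds) {
        Set<Integer> dangling = new TreeSet<>();
        for (Map.Entry<Integer, Integer> e : nodeToOwner.entrySet()) {
            if (!kuafIds.contains(e.getValue())) dangling.add(e.getKey());
        }
        return dangling;
    }

    public static void main(String[] args) {
        // Hypothetical rows: node 2001 exists twice, node 2002's owner was deleted.
        System.out.println(duplicateIds(Arrays.asList(2000, 2001, 2002, 2001)));
        Map<Integer, Integer> owners = new HashMap<>();
        owners.put(2000, 1000);
        owners.put(2002, 9999);
        System.out.println(danglingOwners(owners,
                new HashSet<>(Arrays.asList(1000, 1001))));
    }
}
```

Either finding would have to go back to OT support before an upgrade attempt; the point of pre-checking is only to shorten that loop.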

The second thing was that I had to get all the optional modules. Our system is no different from any other: a lot of optional modules, some inserting schema, some not.

The third thing: our systems were customized with Oscript to provide an enhanced user experience, as well as awesome home-written modules that take the tedium out of long-running things like category upgrades, permission pushes et al, plus a very intelligent add-on to the OI module that people can use generically to upload their data. So our team sat down and thought about what we could retire and what we should re-code or refactor; many modules were thrown out, and many useful ones we kept. We particularly liked the Distributed Agent framework; ours is very similar to it, and if the time comes and there is an appetite, we would re-code ours on top of it.

About the re-coding effort, here's the clunker: we were to use CSIDE. Naturally, all of my team are hard-core Builder aficionados, and boy, that was a journey. I think we were probably the first org to do something with CSIDE, so naturally I gave OT a lot of screenshots and dumps, and OT dev was very receptive. Under extremely tight time constraints we developed parallel modules in 9.7.1 and 10 and checked them into our source code maintenance system, TFS. We also learned how to work in parallel on Oscript; not being a code-churning house, our code is not that complicated, so if I was working on one module, somebody else was working on another.

So the environment prepared was:

A 9.7.1 VM with the same Oracle client and the same type of DB clone.

I removed the Prod EFS and Prod Admin server info from the clone.

We decided to rebuild the search index, as nobody had any clue whether it was good, and it dated from eons before the new search technology.

A base install of LL was done, and the working 9.7.1 binary copy from prod was put on top as an overlay. Many people starting out do not know this technique, where you get a copy of your prod binary, replete with patches et al, into a playing system. BTW, this is what you hear called the "Parallel Upgrade" approach. Its database connection was then redefined. In the olden days, when there was no differentiation between 32 and 64 bits and VMs were not popular and cheap, it was possible to use the existing boxes and run the upgrade in place. That was the "Update an Instance" method. It is largely vestigial at this point and very error-prone, not to mention you would have a real hard time going back to the working version 🙂

All old 9.7.1 custom modules were removed, so this database had knowledge only of OT software, core and optional.

I did the health check I described earlier (looking at categories, form templates and WF maps) and removed any that would hamper an upgrade.

At this point one can take this to a CS 10.5 binary and start the upgrade. That is my next article.

BTW, I am not a novice when it comes to upgrades. I started using Livelink software in 1999 and have worked on versions as far back as 8.1.5, so I pretty much know the heartbeat of an upgrade, and I get pleasure from challenges and from methodically removing them. Do not try an upgrade if you have not, without fail, dry-run the procedure a couple of times. In most cases the upgrade itself can be done in a timely way; the planning can take weeks, if not months.

Livelink comes with an extremely powerful Workflow Engine. At a minimum, if you have only installed Livelink OOB, you get decent workflow features. This involves a "business user" designing a logical map that may involve "players" or "roles". The only criterion is that the process needs to be repeatable, or at least plottable in Visio or the like. Note that Livelink WF came before the WF consortium came up with open standards, but it has almost everything one sees in the standard.

WF Manager - By default, the person who created the original map. Best practice dictates that you assign a proper Livelink group as the 'Master Manager'. This is not to be confused with the Livelink team's manager or an organizational manager. In theory this manager can re-assign steps and repaint the map. This is one of the few places where the sysadmin profile or the 'admin' user won't cut it, so exercise due diligence, keeping in mind that at some point a higher-up user may be asked to assist. Please do not run maps with <Initiator /> as the Master Manager. It is quite possible these users only know how to click, and when something breaks nobody can intervene. You can restrict them to "See Details" only, which is more than enough in many cases.

WF Instance - The process logic that is set in motion when a WF Map is initiated. It is easily identifiable by a clickable link in the WF manager's assignments, and it will show the current step in GREEN along with useful info.

Steps - The series of steps that can be assigned to processes and users. This is what the business user can apparently be trained to paint; there is a myriad of steps that OT puts on the palette.

Role Based or Map Based - If you create a map OOB, Livelink gives you a map where the forward steps can be driven by business programming, e.g. if a WF attribute is green, then my user is this group. If you change the map to "Role" based, then before initiation the initiator will have to fill in all the people in the workflow. People who use the Transmittals product of Livelink can see this in action very clearly.

Attachments Volume - By default the attachments package is permissioned Full Control to "Public Access". This is a very bad idea, as when a casual user does a search he will see unwanted or secretive results. Always set a proper permission there. If you do not give participants Add Items up through Delete in that volume, you will get crude messages from Livelink. Note that the truth-table implementation is observed by Livelink on any kind of object creation, so to get around problems in workflow, OT assigns default Full Control to Public Access. Just observe what OT gives, change that to a good group, and add all the people in the workflow to that group.

Loop Back - Be extremely careful with this, as you can create an infinite WF instance.

Item Reference - You can point to existing object subtypes in Livelink, like folders/documents etc., so in conjunction with Item Handlers you can perform "auto magic".

It is quite possible that a good business user or a Livelink user can be trained in a day to understand the process flow/swim lanes. I usually design my maps on paper. I maintain a cheat sheet of sorts, and many of the things in this article are based on it. Always check "Verify Map Definition" to understand any problems the map may have.

The Item Handler is a predictive step modeled with placeholder logic (Design Book 3). Although it appears useful, it can be frustrating, as very little process automation can be done by it alone; the "auto magic" workaround is usually a human at a prior step. If you use it with XML WF Extensions, however, a lot of "auto magic" can be done with it.

"XML WF Extensions" is an optional module that uses the capability of Livelink objects to have an XML representation (everybody has probably heard of XML Export/XML Import).

The XML WF Extensions module uses XML representations of Livelink objects and manipulates those objects with a SAX (or DOM; I am not sure which, but I know LL has both) parser. Usually people get bitten by load balancers and file system permissions.
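As a rough illustration of the parse-modify-serialize cycle such a module performs (this is not OT's actual code, and the XML shape and attribute name here are invented), here is the equivalent using Java's standard DOM API:

```java
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import java.io.ByteArrayInputStream;
import java.io.StringWriter;

public class WfXmlDemo {
    // Parses a simplified, made-up XML representation of a node, rewrites one
    // attribute, and serializes it back out, the way an XML-based WF extension
    // would manipulate an exported object before re-importing it.
    static String renameNode(String xml, String newName) {
        try {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));
            doc.getDocumentElement().setAttribute("name", newName);

            StringWriter out = new StringWriter();
            Transformer t = TransformerFactory.newInstance().newTransformer();
            t.setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");
            t.transform(new DOMSource(doc), new StreamResult(out));
            return out.toString();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
```

For example, `renameNode("<node name=\"old\"/>", "shiny")` yields the same element with `name="shiny"`. The load-balancer and file-permission bites the paragraph mentions happen outside this cycle, when the serialized file has to land somewhere all front ends can read.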

Likewise, XML Workflow Interchange can send to and call web services of other systems.

ESign - A specialized workflow that can do electronic signatures, prevalent in 21 CFR Part 11 operations.

The workflow engine is perfectly suited for Livelink organizations that will not invest in Oscript coding, or that have been scared by OT sales/marketing away from writing Oscript to get an understanding of this remarkable product. It was very expensive when I started Livelink programming, so there is a personal bias here as well: I like looking at the map, and many times I can get a 1:1 representation of the process.


Oscript is a 4GL that is the basis of the application that OpenText adopted when they chose to market Livelink, the document management system. It is safe to say that anything the Livelink product lacks, if it is a valid requirement, can be coded in this language. The power of the Oscript language takes root in philosophy that was prevalent in the '60s. Pundits say Oscript is a superior offering to traditional OOP languages such as Java or C++.

A trained Oscripter is able to understand how Livelink code works, as well as change it. Now, change to any COTS product comes at a cost. That is where strategy, as well as an overall grasp of the product, comes into play. If the organization has decided for itself that it will spend big bucks on Livelink, one approach is to identify ways to use it more. Where the organization will make mistakes is when it balks at the cost of "customization". Many decision makers wrongly decide to go after fly-by-night programmers/vendors, only to be left in the lurch. OT has a professional vested interest in selling customers more code and services, so I have been in decisions where sales and marketing say "do not customize Livelink", but there is only a half-truth in that. If you buy these OT customizations, they are almost certain to be written in Oscript. In many cases they will have been written by senior Oscripters with thought given to maintenance and upgradability as well. The same kind of service can be bought from other Oscript houses; in many cases these are extremely talented ex-OT'ers.

Many of the Professional Services modules sometimes outgrow their intent and become marketable modules as well (I have no inside knowledge and am just speculating on that). Once you have identified and decided to use Livelink, the best avenue is to hire, or engage the services of, reputed vendors. Modules containing Oscript code that I wrote in 1999 for Livelink 9.1 (my first time was on a Livelink 8 system) I can almost compile in the modern 10.5 version. My entire career was and is fruitful because I tried to interact, collaborate and share with the OT team in the KB, as well as on smaller websites like Greg Griffiths'. I was also able to learn from the very die-hard Oscripters I knew from work, and by tracing code to find bugs that needed closure before OT could provide fixes. Knowledge of OT code also allows me to be a better application architect, administrator and integrator.

If the organization is strong in Java and .NET skills, note that the user interface at this time cannot be changed using those languages; however, you may use the exposed APIs to make functioning applications that use Livelink as the data store.

Typical things organizations can and should do:

Create implementations of nodes, just like Folder or Document, if there is a need for them. Just because OT shows you an "Address Book" or "Contacts" module does not mean every Livelink installation in the world needs new subtypes. It is there if you want to use it, when there is a complex business requirement that calls for a "document"-like or a "folder"-like implementation. Search the KB for terms such as "invasive customizations", overrides, orphans, subclasses, customizationsRT, and weblingo customization.

Add commands to the nodes; these are basically very easy to do in short order. These two are taught very nicely in class as well.

Write agents and distributed agents, and expose to the Columns and Facets API something that OT does not give you OOB.

Extending SOAP and REST can only be done using Oscript. If you are a shop that needs to program an integration using Java or C# and you see a gap in the existing offering, it is rather easy for an Oscripter to extend it.

OpenText programmers have black-boxed things so that many of them act as internal APIs. For example, we are able to do work without understanding how the "Node" itself is created in Livelink. Another example is the way Oscript talks to the search system.


Finally, I found the link to my original blog. In 2007 I joined Alitek as a senior person and was instrumental in training/mentoring most of the people there to fearlessly dig into Oscript. I am not sure whether OT marketing/sales, because of their vested interests, try to scare people away from using the language of the application by creating rumors among the less technical decision makers of an org. What has happened is a reckless decline of proper design architects in the beautiful language of Livelink. It is written in unparalleled style, embodying many patterns of OOP.

In any case, what I would advise decision makers is this: if you have installed Content Server (I even hate that name; Livelink it is for me) and decided it is there for the org, then hire or train good developers. It does not matter if you cannot get Oscripters. All you need is a good, thinking programmer versed in either Java or .NET technologies. That person can be sent for training, and that is it.

Do not employ cheap gimmicks like skins, reporting avenues, etc. What an org needs can be handled by Oscript. You DO NOT have to look at anything other than Oscript to get 150% out of your Livelink application.

Now, anyone who knows their Computer Science 101 would not be questioning whether Oscript can be mastered.

I was very thrilled to get accolades from OT, and was told the selection was done by a peer group, which does mean something to me. They gave me a blog, and I lost it because I could not obey one of their time limits. I doubt OT has an award for selfless souls now; most awards I see at CW are for huge accounts and managers. Money talks in software decisions, one must conclude.


My colleagues and I recently used this framework for transporting a custom subtype that we designed. We were helped by a kind OT developer. All in all, it saved our administration team from running XMLEXPORTs and XMLIMPORTs, but the key factor is that it needs to work for the transport code to work.

Update ends…

Recently OT announced a "Transport" mechanism. Users working in SAP etc. are familiar with such a mechanism, where changes to a baseline system are marked as configuration changes and ported to other systems. While I am oversimplifying, in a SAP deployment so many code modules have to be configured that it is impossible for an admin to remember everything that has to be changed; hence, at least 20 years back, they decided an enterprise program needs a transporting strategy as well.

Now let's look at how Livelink gets deployed in an org. Mostly, Livelink is deployed to address a document management pain or problem. Some consultants, or an admin/developer who gets trained in Livelink, or somebody who has some documentation, starts to install and configure it. Assuming the person(s) have some common sense, and after some tickets, it is working. Actually, the first-time install guide is very nicely written, and it is mostly a "next", "next" kind of install.

Having configured it, the person or persons start modelling the solution. It may involve setting up folders, categories for collecting metadata, classifications for alternative taxonomies, RM classifications, permissions (ACLs), etc. OT and a lot of ecosystem partners will take your money to set it up for you. You can make it complicated or simple.

OK, now that it is past the POC phase, the next phase is perhaps a TEST, a more formalized place. Now comes the first shock: how do I avoid duplicating all the work I did on the lower system? Fortunately, while there was no advanced GUI or "Transport Bench" mechanism, many of these building blocks can easily be transferred by XML export/import operations. Smarter modules such as Classifications/RM anticipated the need for this, so they made their own transport mechanisms well ahead of OT. But almost everything could be accomplished with these poor man's methods.

Now, in controlled industries like pharma, this manual stuff was sheer heresy, so companies like Global Cents made excellent products.

So why the Transport now? My theory is the entry of integrations between SAP and Livelink. In fact, I think the web services were also something SAP forced on OT. Many SAP customers are now getting a facelift, with Livelink doing a lot of backend services, pretty GUIs, etc.

Here's an interesting fictitious exchange between different players in an integration meeting, circa 2010-2011:

SAP Basis Admin - Here, we got the xECM thingy and the Archive Server thingy going with OpenText (notice "OpenText" there, because there is no product called that). Is there anything else you guys need, and can I close the client?

OT Consultant - Oh sure, but on the next client you have to give me high access to edit the connections, change a few GUIDs (dataids), etc.

SAP Basis Admin - What the… That means you have to do all this again?

OT Consultant - Not everything, just a few tweaks. And, pointing to the Livelink admin: Hey Kishore, where are we on the next OT system? Are the ELIB items (categories and folder-looking things) coming along?

Livelink Admin - I did that according to the documentation.

OT Consultant - How about the dataids? Did you run the queries I gave you for the re-mapping?

Livelink Admin - Yeah, everything is there.

SAP Basis Admin - (all the while thinking: what are these guys talking about, re-configuring every system? I have about 20 different clients to maintain.)

OT Consultant - (just to prop up the SAP guy) Well, you know, our guys in Germany are coming up with a transport mechanism just like yours…

Well, finally, after a lot of back and forth, the project is a success.

If it is just a DMS project, a lot of us have no problem using the old methods.

I am pretty sure the transport mechanism is a GUI version of XML Export/XML Import.

The reason I wrote this is that recently a friend asked if he would have any luck on a project that would not have Object Importer (OI)/Object Exporter (OE) and their RM variations.

I told him this:

OI/OE can be leased from OT for a price. They used to entertain that.

With XMLEXPORT/XMLIMPORT, here are a few things you will run into:

XMLEXPORT is available almost universally, but the reverse is not universally true. So if one exported PhysObjects, the corresponding import will fail.

XMLEXPORT will choke on big documents and on the number of nodes, because of DOM parser limitations. In OE they bundle the file separately, so they can get more throughput, even though the code invokes a DOM parser there as well.
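To see why the DOM limitation matters: a DOM parse materializes the entire export in memory, while a SAX pass streams it with near-constant memory. A minimal Java sketch of the streaming side (the <node> element name is made up for illustration):

```java
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;
import java.io.ByteArrayInputStream;

public class StreamCount {
    // Counts <node> elements without ever building a tree. A DOM parse of the
    // same export would hold every node in memory at once, which is what a
    // large XMLEXPORT file runs into.
    static int countNodes(String xml) {
        final int[] count = {0};
        try {
            SAXParserFactory.newInstance().newSAXParser().parse(
                new ByteArrayInputStream(xml.getBytes("UTF-8")),
                new DefaultHandler() {
                    @Override
                    public void startElement(String uri, String localName,
                            String qName, Attributes attributes) {
                        if (qName.equals("node")) count[0]++;
                    }
                });
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
        return count[0];
    }
}
```

Bundling the version files outside the XML, as OE does, keeps the document small enough that even a DOM parse stays workable.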

XMLEXPORT/XMLIMPORT is based on source dataids, which are not going to match the target's. But if you take the exported XML file and change the category IDs to match the target, it will work.
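A hedged sketch of that ID rewrite, assuming a made-up categoryId attribute (real exports use different element and attribute names, so treat this purely as the shape of the fix):

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class DataIdRemap {
    // Rewrites categoryId="..." attributes in an exported XML string so that
    // source-system DataIDs point at the matching objects on the target system.
    // IDs with no mapping are left untouched.
    static String remap(String xml, Map<String, String> idMap) {
        Matcher m = Pattern.compile("categoryId=\"(\\d+)\"").matcher(xml);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            String target = idMap.getOrDefault(m.group(1), m.group(1));
            m.appendReplacement(sb, "categoryId=\"" + target + "\"");
        }
        m.appendTail(sb);
        return sb.toString();
    }
}
```

The mapping table itself (source ID to target ID) is the part you build by hand, or with the kind of re-mapping queries mentioned in the fictitious meeting above.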

XMLEXPORT/XMLIMPORT does not understand hierarchy. For example, OI will create nodes as needed, e.g. enterprise:folder1:…:folder99. With XMLEXPORT/XMLIMPORT you would have to export and follow the reverse process yourself. So if a child object is nested deep in a path, OI will handle it; with XML import, you will have to handle it.
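The manual hierarchy handling can be sketched like this: given a colon-separated target path (mirroring the OI path syntax above), compute which ancestor folders must be created, in order, before the child can be imported. The helper and its inputs are hypothetical; the folder-creation calls themselves would go through LAPI or web services.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Set;

public class PathBuilder {
    // Returns the full paths of folders that must be created, shallowest
    // first, so that the deepest element of 'path' exists. 'existing' holds
    // the paths already present on the target system.
    static List<String> foldersToCreate(String path, Set<String> existing) {
        List<String> toCreate = new ArrayList<>();
        StringBuilder current = new StringBuilder();
        for (String part : path.split(":")) {
            if (current.length() > 0) current.append(':');
            current.append(part);
            if (!existing.contains(current.toString())) {
                toCreate.add(current.toString());
            }
        }
        return toCreate;
    }
}
```

OI does this bookkeeping for you; with raw XMLIMPORT it is your loop, and you create each missing ancestor before importing the child.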

Most people will use XMLEXPORT/XMLIMPORT with LAPI code, so they can manage a lot of things with code in Java or a .NET language.

Now that GUIDs are part of 10 and 10.5, I do not know how that affects any of this.

OI/OE is kind of a framework, so with it I was able to transport compound emails, email folders and emails in a project with relatively little coding.

Readers: please tell me if the Transport Bench from OT is good and easy to use, and whether it resolves conflicts et al, and I will delete my post.

I lost all the content that was stored in communities.opentext.com. If you are a starter reading this and would like to see the original content, then depending on the time I have, I can send you snippets. Leave a comment to that effect.

Java samples for Content Web Services are few and far between. OT mainly puts samples in C#, and many a time I wonder how I would manage in Java if my company did not allow me to use C#. In truth, OT ships a massive application, TreeView.java and TreeView.sln, for both languages. But the Java one takes a lot of setup, especially if one has not been keeping active in that world. However, I used the NetBeans IDE to do most of it, so I use that debugging environment to try all my Java stuff. Recently I worked out how to run a Live Report in Livelink via C# web services; then somebody wanted me to show them how to do it via Java. One of these days I want to see if I can do all this from the command line with nothing other than Notepad.

Well, I created two Word documents. My guess is the poster did not speak English very well, did not know Livelink, and did not know Java to start with. Hence I dumbed it down enough for people to follow under almost any circumstances.

Having Livelink and its Builder helps me in my endeavours, and so Oscript remains my first love, although I am getting farther and farther away from it all.