So you are smart and have a heart of gold. You want to use your talent for social good but do not know how or where to start.

How about we give you a chance to showcase your smarts for a social cause and let you win dollars too!

Sounds exciting? Read on.

Natural or man-made, disasters bring chaos. developerWorks gives you a chance to put your coding and testing skills to work and become part of the rescue team from miles away. Participate in the contest and help ease the communication and logistics breakdown that happens when disaster strikes. Use your own idea, or take someone else's and improvise on it.

Every line of code you create comes with a complexity cost. How can you tame this complexity for your large source base? One way is to streamline your delivery turnaround time for enhancements and fixes by visualizing your projects' source code—after all, "a picture is worth…" Some major financial systems are driven by industry-standard visual representations that allow those projects to meet agile DevOps demands. Join this session to learn about productivity gains and improve your continuous delivery of software.

Roger brings twenty-five years of experience in software product innovation and consultative engagements across several industries.

Roger is an OMG Certified UML Professional, IBM and Open Group Certified Specialist and has been a featured global speaker on Cloud Software, DevOps for Mobile, SOA, Design, and Rational software topics.

Roger is also a volunteer for the American Youth Soccer Organization that promotes a fun, family soccer experience for 1,000 kids in the Eastern Panhandle of West Virginia.

***Dial-in codes will be sent a few minutes before the webcast and posted in the online meeting.

By registering for this webcast you are allowing the GRUC to provide your information to IBM and/or webcast sponsors for direct contact regarding IBM products and promotions. You will also receive a complimentary membership to the Global Rational User Community.

Model Driven Development (MDD), together with its associated UML-based tools, has been around for more than a decade now. Several advanced organizations have successfully used MDD to substantially increase their competitive edge and market share through improved productivity, quality, and time-to-market.

UML-to-COBOL transformation requires modeling the data structures and programs; the next step in the process is generating the COBOL source code from the output of a transformation. This process uses both Rational® Software Architect and Rational Developer for System z as platforms.

Generating COBOL from UML is a two-step process: modeling data structures and programs by using Rational® Software Architect and generating COBOL source from the output of this model by using Rational Developer for System z®.

Rational® Software Architect is required to begin the UML-to-COBOL process.

Note: Rational Developer for System z provides an extension that is installed with Rational Software Architect that lets you develop a COBOL model. To use this capability, download and install the "UML Profiles for COBOL Development" extension.

The final goal of the UML-to-COBOL process is to generate COBOL source code that can be enhanced further within the development environment of Rational® Developer for System z®.

At this point in the process, the system architect who has been modeling the programs and data structures has been using the capabilities of Rational Software Architect, enhanced with profiles containing additional stereotypes, primitive types, and patterns. The second modeling transformation generates the output that is used with Rational Developer for System z to continue COBOL development for System z.

Rational Asset Analyzer provides in-depth insight into dependencies within and among mainframe and distributed applications. Rational Asset Analyzer can assist you with the maintenance, extension, and reuse of existing mainframe and web applications.

Rational Asset Analyzer can simplify software projects by:

delivering up-to-date knowledge of application assets from the code itself

making the same application insight available to all team members through a web browser interface

taking an inventory of mainframe and distributed application assets

analyzing the impact of a change on mainframe and distributed application assets

improving the reuse of assets and sharing of knowledge throughout the development cycle

During the inventory process for software assets, users of the distributed-usage tool face many challenges in offloading code from the mainframe and processing it. To overcome this, RAA has been enabled to scan source code directly from RTC repositories.

Use the Rational Team Concert™ build engine to launch and record Rational® Asset Analyzer scans to provide ongoing analysis as part of your software development and change lifecycle.

Rational Asset Analyzer software analyzes source code artifacts, such as COBOL or JCL, and subsystem information, such as resources defined in IBM® CICS® or IBM DB2® software. To analyze source code, you point Rational Asset Analyzer software to the source code and scan. You can include this source code analysis as part of your ongoing software change process. Do this by associating a Rational Team Concert build definition with a Rational Asset Analyzer scan request. The Rational Team Concert build extracts source code to a location specified by the build definition and then makes a request to Rational Asset Analyzer software to scan from that location. This solution uses standard capabilities of both products.

We are pleased to invite you to the Global Rational User Community (GRUC) virtual webcast on DevOps.

With the unprecedented explosion of technology around us and increasing customer expectations, speed of delivery becomes a key differentiator. Over the last few years, the DevOps approach to software delivery has been gaining more and more traction. Organizations now recognize that DevOps is a business capability that brings value to their business. They are seeking to understand how they can adopt DevOps to become more efficient, deliver higher-quality products, be more agile, and innovate.

Government agencies are constantly seeking ways to reduce unnecessary overhead and non-value-added work and to transform their operations.

This webcast will help you adopt lean thinking to identify and address delivery bottlenecks.

Take this opportunity to listen to and interact with Sanjeev Sharma, a DevOps thought leader from IBM.

DevOps is a set of principles and practices designed to help development, test, and operations teams work together to deploy code more frequently and to ensure a more effective feedback loop. The practices include iterative development, deployment automation, test automation, release coordination, monitoring and optimization, and many more. This article describes the factors to consider as you build a deployment automation solution for an enterprise that has applications that run on multiple platforms, including the mainframe.

Manage the deployment of multi-platform applications

Although DevOps principles apply across all platforms, the shared nature of the IBM z/OS environment has shaped, and sometimes constrained, the deployment process. In the current z/OS environment, deployment is generally automated consistently across all environments. However, this capability cannot extend to other platforms because the tools are specific to the z/OS platform. In the z/OS environment, the tools that manage source code also provide the build and deployment capabilities. Because these tools have been in place for many years, they have been significantly customized. In the current multi-platform environment, composite applications drive the need to coordinate the deployment of the entire application across various platforms. The deployment capability in place for the z/OS environment does not coordinate well with other environments. A comprehensive and automated deployment solution is not available. At the heart of a multi-platform application deployment system, you might expect to see a consolidated inventory view, which shows you the application with all its components and subsystems, mapped to the deployment environment.

Manage the environment

A software project typically has a set of deployment environments such as development, quality assurance (QA), and production. An environment is a collection of resources as a deployment target. A resource can be a physical server, a logical partition (LPAR), a virtual machine, or a subset of a cloud. It can also be a logical deployment target, such as an IBM® CICS® region, a database, or an application server platform. The deployment system needs to understand and be able to model the environment before it can create and maintain an inventory of application versions mapped to environments. In the distributed platforms, most IT organizations use application-specific environments, but multi-tenant servers can be the targets of multiple applications. The mainframe environment is typically highly shared. Approvals and
team processes are typically scoped to environments.
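The environment and resource concepts above can be sketched as a simple data model. This is a minimal illustration with hypothetical class and field names, not any product's actual API:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical model: an environment is a named collection of deployment targets.
public class DeploymentModel {

    // A resource is any deployment target: an LPAR, a VM, a CICS region, a database...
    public record Resource(String name, String kind) {}

    // An environment (development, QA, production) groups resources and records
    // which application versions are currently deployed to it.
    public static class Environment {
        final String name;
        final List<Resource> resources = new ArrayList<>();
        final List<String> deployedVersions = new ArrayList<>();

        Environment(String name) { this.name = name; }

        void addResource(Resource r) { resources.add(r); }

        void recordDeployment(String appVersion) { deployedVersions.add(appVersion); }

        int resourceCount() { return resources.size(); }

        // The consolidated inventory view: application versions mapped to this environment.
        String inventory() {
            return name + ": " + deployedVersions + " on " + resourceCount() + " resource(s)";
        }
    }

    // Illustrative QA environment mixing mainframe and subsystem targets.
    public static Environment sampleQa() {
        Environment qa = new Environment("QA");
        qa.addResource(new Resource("LPAR01", "z/OS LPAR"));
        qa.addResource(new Resource("CICSQA1", "CICS region"));
        qa.recordDeployment("payments-app 2.3.1");
        return qa;
    }
}
```

A real deployment system would populate such a model from discovery and build metadata; the point here is only that the inventory view maps application versions onto modeled environments.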

CLM was developed to support transparent, agile development in a collaborative fashion; transparency across stakeholders was its main focus from the beginning. Gradually, enterprise customers wanted to give the external world restricted access to their work items. The question is how to implement that without sacrificing benefits such as usability and performance. One way is to place a reverse proxy outside the firewall. Another is to give customers a way to reach the server(s), for example through a VPN tunnel. Or you can punch a hole in the firewall so that the public URIs are reachable. In this article, we explain one way of providing external access to CLM: through a proxy.

Scenario

JKE, a product company, wants a mechanism to expose its CCM server to end customers so they can create new Features for existing products and track their progress. One CLM instance/server is allocated exclusively to the internal JKE team for its product development lifecycle (name: CLM). A second CLM instance gives external users access while isolating the production server from the outside world (name: CLMEXT). One user ID will be provided for each customer, with restricted access, to fulfill the following conditions.

One user id for each customer and one JKE id for the sales team

Each customer will get access only to their specific Features and their workflows.

Only common features are accessible to all the customers.

The sales team should be in a position to respond to customer queries through pre-configured responses.

Reverse proxy has to be configured in JKE reverse proxy server to provide access to the external customers. (Reverse proxy can be configured within CLM also.)

Solution

Once customers have created Features in CLMEXT, the JKE team can validate them, and a new WI will be added manually to the CLM server. (Note: work items can be added to the CLM server automatically using cross-server communication.)

A link from each CLM WI to the corresponding CLMEXT Feature will be created; it will be visible to the JKE team only from CLM.

The JKE team has to manually update the progress status of the new Features in CLMEXT.

The complete document can be found in the Rational User Group India chapter:

Many teams find it challenging to get a project started quickly, to get team members oriented, to set up and configure tools, and to take advantage of proven patterns of success to do their jobs. Many other teams are required to document their processes for compliance reasons and to show that they follow that process.

Jazz allows us to create a new process template or customize an existing one in RRC, RTC, and RQM.

Jazz tools (RRC, RTC, and RQM) have predefined process templates that can be modified according to project needs. A process template has different components, such as Overview, Timelines, Roles, Permissions, Access Control, Configuration Data, and Process Description. Each of these components can be customized to your project's needs.

Managing software and product lifecycle integration has always been a challenge, and with the rate of new demands on the enterprise, the challenges are increasing. Leaders from standards organizations and industry will lead interactive discussions on the importance of open technologies in helping enterprises manage lifecycle activities within their environments. Learn about the direction lifecycle integration is taking as a result of the inclusion of open standards, and why this work matters to you. You will also hear how you can bring forward your requirements and influence the supporting work activities.

The Summit is free to attend for all those attending IBM Innovate. Join us for an exciting session and refreshments to start your attendance at Innovate 2013. For more information and to RSVP visit http://ibm.co/16jTusU

I am Guru Prasad, working for Tata Consultancy Services as a Rational Product Specialist and part of the TCS Rational CoE team. I have conducted several internal webinars and workshops on various Rational tools, including Rational Software Architect, Rational Requirements Composer, and Rational RequisitePro. I like to develop custom extensions for Rational tools and keep myself skilled and up to date on Rational trends. I spend much of my time browsing IBM jazz.net and IBM developerWorks, going through the latest tool releases, discussion forums, and articles on Rational tools. I also attend most of the webinars conducted by the IBM Rational User Group, write Rational articles and publish them on internal sites, and assist my team with POC activities and with publishing white papers on external forums such as IBM developerWorks and jazz.net.

Since most Rational tools are Eclipse based, Rational users can develop custom extensions easily using the Eclipse plug-in development features. IBM provides REST APIs and Java APIs to extend the features of Jazz-based tools. Users can connect to jazz.net and the RUG forum to discuss queries specific to such implementations.

My accomplishments

I am an active member of the IBM Rational User Group and an IBM Certified Deployment Professional and Solution Designer. I have developed several plug-ins for Rational tools on demand.

Below are methodologies that have proven highly successful in RFT automation projects.

Steps in designing the architecture

First Step: Organize the code into a layered architecture as per the ITCL architecture.

1. First tier: AppObjects --> All the application objects of the application reside in the AppObject scripts. All the finders will be in this package/script.
2. Second tier: Tasks --> Methods around these AppObjects that perform a task, e.g., logging into an application.
3. Third tier: Test Cases --> The scripts that actually verify and test the application. Test cases are abstracted from the AppObjects, which makes this architecture robust.

Here, the test cases talk to tasks, and the tasks in turn talk to AppObjects.

Advantages of this architecture

1. Highly organized code.
2. Ease of debugging.
3. If an application object changes, the tester need not update every test script that uses it; modifying the affected AppObject automatically updates all the test cases, because the test cases are abstracted from the AppObjects.
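The three tiers can be sketched as follows. This is a minimal illustration of the layering idea with hypothetical names; real RFT AppObjects would return TestObjects located via the object map or dynamic find, with strings standing in for them here:

```java
// Hypothetical sketch of the ITCL layering: AppObjects -> Tasks -> Test Cases.

// First tier: AppObjects hold all object finders in one place.
class LoginAppObjects {
    // Strings stand in for the TestObjects a real finder would return.
    String userField() { return "loginForm.user"; }
    String passwordField() { return "loginForm.password"; }
    String loginButton() { return "loginForm.submit"; }
}

// Second tier: Tasks compose AppObjects into business actions.
class LoginTasks {
    private final LoginAppObjects app = new LoginAppObjects();

    String login(String user, String password) {
        // A real task would type into the fields and click the button.
        return "typed " + user + " into " + app.userField()
             + ", clicked " + app.loginButton();
    }
}

// Third tier: test cases talk only to tasks, never to raw finders,
// so a locator change touches exactly one AppObjects class.
public class LoginTestCase {
    public static String run() {
        LoginTasks tasks = new LoginTasks();
        return tasks.login("tester", "secret");
    }
}
```

If the login form's locators change, only LoginAppObjects is edited; LoginTasks and every test case stay untouched, which is the robustness the advantages above describe.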

If the application is a Dojo application, we can additionally use the BLUE framework on top of ITCL. BLUE contains very robust methods for Dojo.

Also, for web applications, if RFT does not recognize the objects beneath the layers/pages, we can use the IE developer tools to find the unique properties of the object and feed them to RFT's dynamic find.

One can make the code browser-independent by making the document page, rather than the browser, the top parent. All the underlying code will then have the document page as its top parent.

Third Step: Scripted Verifications

Use scripted verifications instead of recorded verifications to verify the state of an application. This makes the automation suite very robust and resilient to application changes. For instance, a verify_table method fetches the contents of a table or grid during the first run, writes it to an Excel file, and places it in an expected-results canon. During subsequent runs, it again fetches the contents of the table or grid and writes them to an actual-results file. The method then compares the two files and logs the result.
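The verify_table idea can be sketched as a generic baseline check. This sketch uses plain text files instead of Excel, and the method names are hypothetical; the logic is the same: record on the first run, compare on later runs.

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

// Hypothetical sketch of a scripted verification: the first run records a
// baseline (the "expected results canon"); later runs compare fresh data
// against it and log the outcome.
public class TableVerifier {

    // Returns true if the actual rows match the baseline. On the first run the
    // baseline does not exist yet, so the rows are recorded and the check passes.
    public static boolean verifyTable(Path baseline, List<String> actualRows) {
        try {
            if (!Files.exists(baseline)) {
                Files.write(baseline, actualRows);   // record the expected results
                return true;
            }
            List<String> expected = Files.readAllLines(baseline);
            boolean match = expected.equals(actualRows);
            System.out.println(match ? "PASS: table matches baseline"
                                     : "FAIL: table differs from baseline");
            return match;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // Helper: a fresh baseline path under the system temp directory.
    public static Path tempBaselinePath() {
        try {
            return Files.createTempDirectory("verify").resolve("table.txt");
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

In a real RFT suite, actualRows would come from reading the grid's TestObject; comparing stored files keeps the verification logic independent of how the data was fetched.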

Fourth Step: Appobject Unit Tests

Keeping AppObject unit tests in place gives confidence in the automation code. Once the build is ready, the AppObject unit tests are run to ensure that all the application objects are recognized. If object properties have changed, the tester updates the recognition properties. This further eases the debugging effort, because we can be sure that failing test cases indicate a bug, defect, or functionality change, not an object-recognition problem.
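A minimal sketch of such a recognition smoke test, with hypothetical names (a real version would attempt to find each TestObject in the running application):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical AppObject smoke test: before running functional test cases,
// confirm that every application object can still be recognized, so later
// failures point at real defects rather than broken locators.
public class AppObjectSmokeTest {

    // Stand-in for "try to find this object in the application under test".
    interface Finder { boolean found(); }

    // Runs every finder and records whether each named object was recognized.
    public static Map<String, Boolean> run(Map<String, Finder> finders) {
        Map<String, Boolean> results = new LinkedHashMap<>();
        finders.forEach((name, finder) -> results.put(name, finder.found()));
        return results;
    }

    // True only when every object was recognized; run the test cases only then.
    public static boolean allRecognized(Map<String, Boolean> results) {
        return results.values().stream().allMatch(Boolean::booleanValue);
    }
}
```

Gating the functional suite on allRecognized is what separates "the locator broke" from "the application broke", as described above.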

Fifth Step: Multi-Client Test Script Maintenance

Using a property file, a single test script can run on multiple clients. This is done by defining a property in the properties file that determines which client-specific objects are instantiated.
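The property-file approach can be sketched like this. The property key and class names (client.type, WebClient, DesktopClient) are illustrative, not from the article:

```java
import java.util.Properties;

// Hypothetical sketch: one test script, multiple clients, selected by a property.
public class ClientFactory {

    // Common interface that the test script codes against.
    interface Client {
        String clickLogin();
    }

    static class WebClient implements Client {
        public String clickLogin() { return "web: clicked Login link"; }
    }

    static class DesktopClient implements Client {
        public String clickLogin() { return "desktop: clicked Login button"; }
    }

    // The properties file (e.g. client.type=desktop) decides which
    // client-specific objects are instantiated.
    public static Client forProperties(Properties props) {
        String type = props.getProperty("client.type", "web");
        switch (type) {
            case "desktop": return new DesktopClient();
            default:        return new WebClient();
        }
    }
}
```

The test script calls only the Client interface, so switching clients means editing one line in the properties file, not the script.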

Before actually starting to automate, please go through the above link to check whether your environment is supported by RFT; otherwise, RFT will not run.

Rational Functional Tester plugs into the open-source Integrated Development Environment (IDE) known as Eclipse. By embracing this IDE, Rational Functional Tester sits alongside other development tools created by IBM Rational and other vendors, allowing easy tool integration into a common interface. RFT is the QSE (Quality Software Engineering) recommended tool for GUI automation.

Scripting with RFT

RFT enables
you to program in standard Java or with Visual Basic .NET. The fact
that RFT uses these programming languages provides two advantages.

One
advantage is that the languages are standard so the learning curve is smaller
than if the script developer had to learn both the tool and the language.

Another advantage is the flexibility the two languages offer. Programmers have
the choice of familiar languages. So they can write programs in the one at
which they are most adept or which they feel best suits their needs.

GUI test automation tools
also feature libraries of functions useful for testing, such
as click, select, or verify. In RFT, you can add to this library
("wrapping" several lines of code to perform a single operation
is one useful technique), to provide functions useful to all RFT programmers.

Script
maintenance

When developers change properties of UI elements, existing scripts potentially
can fail. How do you maintain your scripts if developers keep changing
properties that identify the user interface elements, such as the position of
the object or its name? This is an inevitable part of development, but how can
your scripts keep up?

This is one of the most
important issues to think about when selecting a tool and creating your
scripts, because if your scripts take too much effort to maintain, they cease
to be a cost-effective and efficient solution for testing.

RFT uses two technologies to
address this problem.

The core technology is the RFT Object Map feature.

The
RFT Object Map feature is enhanced by ScriptAssure.

The RFT Object Map stores
information about GUI objects and their properties during test development.
This information is used to find GUI objects during test execution. Some of the
properties that identify an object include color, size, position, state (such
as checked and unchecked), text label, and logical name (the name you assign to
the object). Object maps are often shared across multiple tests.

The purpose of ScriptAssure is
to eliminate the need to update the script when the objects in the user
interface change. ScriptAssure accomplishes this by allowing you to weight
the different properties used to identify a UI element. You determine the
most important characteristics for identifying the object. When one property
changes, ScriptAssure can still identify the object based on the other
properties. A single change to one of an object's properties therefore does not usually prevent an RFT script from running to completion.
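The weighting idea can be illustrated with a toy matcher. This is not ScriptAssure's actual algorithm or API, just a sketch of weighted recognition under an assumed threshold: each property carries a weight, and a candidate object is accepted when enough of the total weight still matches.

```java
import java.util.Map;

// Toy illustration of weighted object recognition (not ScriptAssure's real API):
// an object still matches when high-weight properties agree, even if a
// low-weight property (such as position) has changed.
public class WeightedMatcher {

    // Assumed fraction of total weight that must match for acceptance.
    static final double THRESHOLD = 0.7;

    public static boolean matches(Map<String, String> recorded,
                                  Map<String, String> candidate,
                                  Map<String, Double> weights) {
        double total = 0, matched = 0;
        for (Map.Entry<String, Double> w : weights.entrySet()) {
            total += w.getValue();
            String expected = recorded.get(w.getKey());
            if (expected != null && expected.equals(candidate.get(w.getKey()))) {
                matched += w.getValue();
            }
        }
        return total > 0 && matched / total >= THRESHOLD;
    }
}
```

With the name weighted heavily and the position weighted lightly, a moved button is still found, while a renamed one is correctly rejected.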

Another way that
RFT reduces script maintenance is with the Object Map update
tool. The tool enables you to globally update a centralized object map.

Rational Quality Manager is based on the Jazz platform and inherits many characteristics from that platform.

IBM Rational Quality Manager is a collaborative and web-based tool

It offers comprehensive test planning, test construction, and test artifact management functions throughout the software development lifecycle. Rational Quality Manager is designed to be used by test teams.