Hi, everyone. I wanted to bring your attention to an article recently published on developerWorks that describes the pureQuery / OpenJPA integration I discussed in my earlier blog post. This new feature in WebSphere Application Server v7 enables developers to generate SQL from their JPA application entities and named queries, then bind that SQL into static DB2 packages, providing a fast path to the security and performance benefits of static access for DB2 data. In addition, when you use this capability, your JPA application can seamlessly take advantage of optimizations provided in pureQuery, such as the ability to update multiple tables in a single network call.

This integration is our first step toward providing a strong, integrated stack between DB2 and WebSphere, using pureQuery as the "glue". (As an aside, the Performance Expert Extended Insight Feature also uses pureQuery to provide new insights into the interactions between Java applications and DB2, with the most capabilities being provided for Java applications in WebSphere.)

From a tooling perspective, this initial JPA/pureQuery integration is fairly light, but you will notice that the static binder utility is now invocable from the WebSphere Application Server console, so WebSphere admins won't have to switch to another tool to do the bind. In addition, you'll be able to use the new capability in Data Studio Developer 2.1 to visualize elapsed time for SQL statements directly from the pureQuery outline.

Nevertheless, there is still much more we can do to make this process easier and less command-line driven. WebSphere plans to ship enabling technology that will "turn on" capability in Data Studio Developer 2.1 to invoke the wsdb2gen utility. In addition, you'll be able to use the output from wsdb2gen within Data Studio Developer and take advantage of other pureQuery outline capabilities, including the ability to correlate SQL statements with specific OpenJPA queries and to see the relationships between the SQL and the associated tables and columns. Check out the article if you get a chance.

-- Steve Brodsky

What if there were no walls between the DBA and developers? I'm not really talking about your cubicle walls but those implicit barriers we create when we are so focused on our own thing and work using our own tools. But to develop higher quality applications with more agility, we all need to work together and even cross-pollinate our skills more.

I'm the Team Lead for the Data Studio Developer tooling. Earlier this year, with Data Studio Developer 1.2, we provided tools that help teams break down barriers between developers and DBAs:

If the developer knew about the queries in their Java application, we told them more about the SQL and about the database objects used by those queries.

If the DBA knew about the database, we told them more about where in the Java application the SQL came from.

DBAs and developers can control the quality and performance of the SQL, and not lose control to a framework.

We now tell you more about the performance of your SQL and how many times it ran, so you can easily determine which statements to focus your tuning efforts on. If your product is in the development phase, you can change the SQL in your application. If you are in production, we also give you the ability to replace a statement with more optimal SQL without changing your application. Automate the performance tests in your regular test buckets and ensure consistent application performance!
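The "replace SQL without changing your application" idea boils down to a substitution lookup that the data access layer consults before a statement is sent to the server. This is only an illustrative sketch of the concept, not pureQuery's actual API; the class and method names here are hypothetical:

```java
import java.util.HashMap;
import java.util.Map;

// Conceptual sketch only: a substitution map consulted by the data
// access layer before any statement goes to the server. The application
// code that issues the SQL never changes.
public class SqlSubstituter {
    private final Map<String, String> replacements = new HashMap<>();

    // Register a tuned replacement for a problematic statement.
    public void replace(String original, String optimized) {
        replacements.put(original, optimized);
    }

    // Return the tuned version if one was registered, else the original.
    public String resolve(String sql) {
        return replacements.getOrDefault(sql, sql);
    }

    public static void main(String[] args) {
        SqlSubstituter sub = new SqlSubstituter();
        sub.replace("SELECT * FROM EMP", "SELECT EMPNO, LASTNAME FROM EMP");
        System.out.println(sub.resolve("SELECT * FROM EMP"));
    }
}
```

Because the lookup sits below the application, the tuned SQL takes effect in production without a code change or redeploy of the application itself.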

We tell you more (and with greater accuracy) about the SQL in your application, so you can get an answer to a question like "What are all the SELECT statements in my application?" Other search criteria can be applied as well.

Most of these new features are available whether you are using pureQuery APIs or have existing JDBC or framework-based applications. And these features work for both DB2 and IDS servers.

As always, if you use DB2, you can use static SQL to reduce CPU consumption in some cases. With this release, you'll find the experience of developing static SQL applications and deploying them significantly improved.

Ever since I blogged about my experience with Shell Sharing, people (OK, that will be exactly one person) started thinking I was some sort of expert and asking me questions. Luckily, I have found a newly published Shell Sharing article (written by real experts, may I add) on IBM developerWorks that should have all the answers. To save you the time of searching, you can get it from http://www.ibm.com/developerworks/db2/library/techarticle/dm-0811khatri/index.html.

This article uses DSD 1.2 as an example, which is just a bit different from my previous experience with Data Studio 1.1.2. First of all, the default installation directories have been changed to:

1) Installation directory: C:\Program Files\IBM\DSDEV1.2

2) Shared Resources Directory: C:\Program Files\IBM\DS12Shared

The default package name has also changed to “IBM Data Studio” instead of the more general “IBM Software Development Platform” in the last release.

I then followed the instructions to download a trial version of Rational Data Architect (RDA) and used Installation Manager to install it as shell-shared with Data Studio. The only funny thing was that Installation Manager displayed a message that a newer version of Installation Manager must be installed in order to continue. After I clicked OK, it just went ahead and installed a new version of itself. Don’t you wish every product would upgrade itself like this?

Here I’ll share a secret: developerWorks has a special space for “Data Studio” at http://www.ibm.com/developerworks/spaces/datastudio. Clicking on the “Trials and downloads” tab will not only bring you to the trial download page, but it will also tell you what will happen after the trial period ends. You may also find the following information useful:

The following products can shell-share with Data Studio Developer Version 1.2.

Data Studio Administrator Version 1.2

Rational Data Architect Version 7.0.0.5

Rational Application Developer Version 7.0.0.6 and 7.0.0.7

Rational Software Architect Version 7.0.0.6 and 7.0.0.7

Data Studio Developer 2.1 has been announced and will be available soon. Since it’s an Eclipse 3.4 based product, it is safe to assume it won’t shell share with the Eclipse 3.2 based products. My secret source told me one nice improvement is that there will be a common splash screen with a list of the products being shell-shared upon launch. I can’t wait to try it out.

It's been less than 5 months since we announced our 1.2 releases of Data Studio, which I blogged about back in July.

Since then, we have talked to thousands of people, provided demonstrations to hundreds, and visited dozens of customers. People are starting to understand Data Studio and the value of Integrated Data Management better.

With this latest release, announced today, we are really targeting the DBA with enhancements across the portfolio to help DBAs improve application performance, security, manageability, and TCO. In this release, the enhancements are particularly targeting Java applications that access DB2 data, but you'll see we're starting to branch into .NET as well.

The announcements today are for:

Data Studio Administrator 2.1, in which we've really focused on both usability and functionality. We've done lots of usability testing with DBAs and have provided a more natural approach for doing many tasks, including copy and paste of database changes, flatter traversal of the data source explorer, better sorting and filtering of objects, and new task assistants for utilities, commands, and configuration parameters, so you won't have to leave your environment to go out to the command line or control center to perform those tasks.

Eliminate SQL injection risk for Java database applications by giving you the ability to indicate that only SQL that has been captured and approved may be executed.
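The "captured and approved only" restriction works like an allowlist: a statement that isn't in the approved set never reaches the server, so injected SQL has no path to execution. A minimal conceptual sketch (the names are hypothetical, not pureQuery's real API):

```java
import java.util.HashSet;
import java.util.Set;

// Conceptual sketch only: an allowlist of approved statements.
// Anything outside the set is rejected before it reaches the database,
// which closes the door on injected SQL.
public class ApprovedSqlGuard {
    private final Set<String> approved = new HashSet<>();

    // Add a statement that was captured during testing and approved.
    public void approve(String sql) {
        approved.add(sql);
    }

    // The data access layer checks this before executing anything.
    public boolean isAllowed(String sql) {
        return approved.contains(sql);
    }

    public static void main(String[] args) {
        ApprovedSqlGuard guard = new ApprovedSqlGuard();
        guard.approve("SELECT NAME FROM CUSTOMER WHERE ID = ?");
        System.out.println(guard.isAllowed("SELECT NAME FROM CUSTOMER WHERE ID = ?"));
        System.out.println(guard.isAllowed("SELECT NAME FROM CUSTOMER WHERE ID = 1 OR 1=1"));
    }
}
```

Note that parameter markers (`?`) are part of the approved text, so changing parameter *values* is fine, but tampering with the statement text itself is blocked.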

Optimize SQL performance by providing developers with the ability to profile the SQL to see immediately how many times a SQL statement is executed, and how long it takes to run (elapsed time), giving developers an easy way to start identifying potential hot spots in the application before coming to the DBA.
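Conceptually, that profiling boils down to keeping two numbers per statement: how many times it ran and how much elapsed time it accumulated. A toy sketch of that bookkeeping (hypothetical names, not the actual tooling internals):

```java
import java.util.HashMap;
import java.util.Map;

// Conceptual sketch only: per-statement execution count and accumulated
// elapsed time -- the two numbers a profiler surfaces to spot hot spots.
public class SqlProfiler {
    private static final class Stats {
        long executions;
        long elapsedNanos;
    }

    private final Map<String, Stats> stats = new HashMap<>();

    // Record one execution of a statement and how long it took.
    public void record(String sql, long elapsedNanos) {
        Stats s = stats.computeIfAbsent(sql, k -> new Stats());
        s.executions++;
        s.elapsedNanos += elapsedNanos;
    }

    public long executions(String sql) {
        Stats s = stats.get(sql);
        return s == null ? 0 : s.executions;
    }

    public long totalElapsedNanos(String sql) {
        Stats s = stats.get(sql);
        return s == null ? 0 : s.elapsedNanos;
    }

    public static void main(String[] args) {
        SqlProfiler profiler = new SqlProfiler();
        profiler.record("SELECT * FROM ORDERS WHERE ID = ?", 1_200_000);
        profiler.record("SELECT * FROM ORDERS WHERE ID = ?", 900_000);
        System.out.println(profiler.executions("SELECT * FROM ORDERS WHERE ID = ?"));
    }
}
```

A statement with a high count and a high total elapsed time is the natural first candidate to bring to the DBA.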

Improve quality of service for OpenJPA and .NET applications. Steve Brodsky blogged about the integration of pureQuery with OpenJPA, which actually was available with the 1.2 pureQuery release with WebSphere Application Server v7. For the many people who ask when they can see the benefits of static SQL with .NET applications, we have taken an initial step in this release by allowing client optimization for .NET applications; in other words, the ability to capture dynamically executing SQL and bind it into packages.

Last but not least, DB2 Performance Expert for Linux, UNIX, and Windows 3.2 and the new DB2 Performance Expert Extended Insight Feature 3.2. This is an announcement particularly close to my heart, as many of you who have sat in on my talks probably know. Whenever I sit down with DBAs and talk about the problems with diagnosing performance problems in a Java application environment, they always nod their heads in agreement. There is a real pain point in not having the same diagnostic capabilities for Java that many DBAs are familiar with for COBOL/CICS applications.

If you extend DB2 Performance Expert with the Extended Insight feature (separate PID and separately priced but prereqs DB2 PE), you can enable new end-to-end database monitoring for Java applications for DB2 servers on Linux, UNIX, and Windows. This monitoring capability will really help improve availability of mission-critical database applications by making it much easier to detect performance issues and figure out whether the problem is one in the database or somewhere else in the software stack.

Also, you can set thresholds (your SLAs, so to speak) so you can easily see how the application is performing against those targets. If you haven't read it yet, I encourage you to see the article that the Germany team who develops this feature wrote. It's a great introduction to this new capability, and it's really just our first step. This whole concept of providing greater insight to DBAs and developers is planned to be rolled out across more databases and more data access environments.

Just a heads up: we're not done. We have more announcements coming soon!

Is "agile data" just another buzzphrase? Does it even make sense to try to apply agile development principles to the database?

An expert in agile development, Scott Ambler, sees agile data as an essential component for application development that goes against a database. You can learn more about agile data here: http://www.agiledata.org/

I think one of the classic challenges that agile data faces is about dealing with a "brittle" database. What do I mean by brittle? Basically, I am talking about how difficult and time consuming it can be to refactor the database schema to improve software. Check out the results of this survey question: "How long does it take to safely rename a column in a production database?"

The database and/or your software development techniques around the database are "brittle" if it takes longer than one week to make a simple rename change. Almost half of these respondents fell into that category. I would venture to say that more interesting refactoring would therefore take most shops much longer than a week.

Another part of the agile data challenge is being able to quickly tell what the impact of a change is going to be. If we want to rename a column, what are all the database objects (table spaces, views, stored procedures, etc.) that will be impacted, and is there a tool to help automate a script to make these changes?
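Impact analysis like this is essentially a traversal of the database's dependency graph: start at the object being renamed and walk outward through everything that depends on it. A simplified sketch, assuming we already have a dependency map in hand (hypothetical names; a real tool would read this from the database catalog):

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Collections;
import java.util.Deque;
import java.util.HashMap;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Conceptual sketch only: breadth-first walk of a dependency graph to
// find every object impacted by changing one column.
public class ImpactAnalyzer {
    // object -> objects that directly depend on it
    private final Map<String, List<String>> dependents = new HashMap<>();

    public void addDependency(String object, String dependent) {
        dependents.computeIfAbsent(object, k -> new ArrayList<>()).add(dependent);
    }

    // Everything directly or transitively impacted by changing 'object'.
    public Set<String> impactOf(String object) {
        Set<String> impacted = new LinkedHashSet<>();
        Deque<String> queue = new ArrayDeque<>();
        queue.add(object);
        while (!queue.isEmpty()) {
            String current = queue.poll();
            for (String d : dependents.getOrDefault(current, Collections.emptyList())) {
                if (impacted.add(d)) {
                    queue.add(d);   // follow transitive dependencies
                }
            }
        }
        return impacted;
    }
}
```

For example, renaming `EMP.NAME` would flag a view built on that column and, in turn, any stored procedure that reads the view; a tool can then generate the change script for the whole set at once.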

If this sounds interesting to you and you want to learn more about agile data and how Data Studio can help, come listen to a replay (until May 09) of a webcast I did last week on how Data Studio can help make data more agile.

If you listen to the replay or are exploring agile data I am very curious to get your feedback. Just call me an agile guy. What do you think of applying agile techniques to the database? Are you doing it? If so, what is your experience? What tools are you using? What tools do you need?

Recently I finished conducting a day long Proof of Technology session in New York on Data Studio Developer and pureQuery, and I thought I'd share my experience. For those of you who have never attended an IBM Proof of Technology, it is usually a day long event at an IBM location and is a combination of presentations and hands-on exercises designed to help attendees learn and play with the technology. The computers at these sites are pre-loaded with the software and exercises that complement the presentations. Your IBM sales rep or tech sales contact is the one who would nominate you to attend one of these.

Back to the pureQuery PoT - It walks attendees through some of the basics of Data Studio Developer, all the way to advanced pureQuery concepts. Here are details of some of the modules:

Basics of Data Studio Developer (including a primer on the Eclipse environment). This is especially useful if you are not familiar with the Eclipse environment.

pureQuery concepts and exercises

Tooling in Data Studio Developer for pureQuery

Bottom-up code generation using pureQuery

Deploying existing Java applications using Static SQL without changing a line of code (aka client optimization)

Explain capabilities within Data Studio. Check the Explain plan as you develop Java programs and stored procedures, or in the Integrated Query editor.
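The client optimization module above rests on a simple idea: while the application runs dynamically, record every distinct statement it executes; that inventory is what a later bind step turns into static packages. A conceptual sketch of the capture half (hypothetical names, not pureQuery's real capture mechanism):

```java
import java.util.Collections;
import java.util.LinkedHashSet;
import java.util.Set;

// Conceptual sketch only: the capture phase of client optimization.
// Each distinct statement the running application executes is recorded
// once; the resulting inventory is the input to a later bind step.
public class SqlCapture {
    private final Set<String> captured = new LinkedHashSet<>();

    // Called by the driver layer on every execution; the application
    // itself is untouched, which is the whole point of the technique.
    public void onExecute(String sql) {
        captured.add(sql);
    }

    // The distinct statements seen so far, in first-seen order.
    public Set<String> capturedStatements() {
        return Collections.unmodifiableSet(captured);
    }

    public static void main(String[] args) {
        SqlCapture capture = new SqlCapture();
        capture.onExecute("SELECT NAME FROM CUSTOMER WHERE ID = ?");
        capture.onExecute("SELECT NAME FROM CUSTOMER WHERE ID = ?");
        capture.onExecute("UPDATE CUSTOMER SET NAME = ? WHERE ID = ?");
        System.out.println(capture.capturedStatements().size());
    }
}
```

Running the application through its test buckets with capture on yields the complete statement set without touching a line of application code.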

Judging from the questions and comments from the attendees, it seemed as if they found it worthwhile.

I always like the feedback and validation (and sometimes invalidation) of our ideas. Things I learnt during this trip:

Challenges associated with deploying applications when using SQLJ. This is significantly simplified using pureQuery.

The PoT has way more material than one could cover in a day. Attendees cherry-picked some of the later exercises.

NYC can get quite cold at night without a thick jacket! (Call me a California wimp.)

There were two sessions on using pureQuery with Groovy at IOD. One of the sessions was a joint one that I presented with my friend Vladimir Bacvanski at InferData, who, by the way, offers a pureQuery course.

Here's a description of the session we gave at IOD:

Project Zero and WebSphere® sMash represent a new wave of technologies for agile development of dynamic Web applications based on a scripting runtime environment for Groovy and PHP, REST-style interfaces, mashups, and rich Web interfaces. In this talk we show the united power of Project Zero with pureQuery, a high-performance technology that accesses relational data and is the foundation for data access in Project Zero.

This session was due to be held on Friday morning. Vladimir and I created the first ever (we think!) Groovy, pureQuery and DB2 Static SQL application on Thursday night, 10/30/08 at 11:14 pm PDT at a casino in Las Vegas.

In getting prepared for the Friday session, Vladimir and I started experimenting with what is possible. Groovy runs on the JVM, and pureQuery is also Java-based, so hypothetically we should be able to mix and match them completely. So, we thought, let's try using sMash's Groovy to quickly build a database Web 2.0 application that leverages the power of DB2 static SQL execution. After some hacking, at 11:14 we had built a small Dojo app that received data via JSON from DB2 using static execution. Waaaayyyyyy coool!!!!! We got so excited about how cool, easy, and quick it was that we ended up playing with the app till 2 am in a Las Vegas casino, cranking out code.

So we are almost near the end of the conference, and attendees are starting to head back. I am still amazed at the scale of these events, especially watching lunch being served to so many people. I heard there are over 7,000 attendees!

I am done with two presentations on Data Studio Developer, and have just one more lab to go. The presentations went well; however, for me, the post-presentation conversations are always the most interesting part. Some of the questions / comments:

How do I deploy pureQuery applications and merge that with existing deployment processes? - I need to pass along some examples

How does Data Studio Developer integrate with RAD? - I explained and showed how shell sharing works on my laptop

One random comment: an attendee came up to me and said, "I like how you present" (I say: thank you). Then he said, "I like how you do the 1-2 punch!" ... (I am trying to figure out that comment... I am still clueless) :)

I also spent quite a bit of time at the Data Studio pedestal. Every customer that I talked to seemed to have pain points associated with Java and data access that pureQuery is addressing. We had a significant number of sessions, so the conversations with quite a few of them usually started with some level of familiarity and that was refreshing.

Attendees, what was your experience like?

p.s. The background music in the conference hallways is super-annoying! Agree?

We had the Data Studio Customer Advisory Council today. One of our toughest customers gave Torsten a standing ovation upon completing his demo of the E2E database monitoring that is planned for delivery soon. I never saw Torsten blush before today.

One of the Toronto user-centered design guys, Rick, came up with a clever way (aka Vegas style) for the CAC members to vote on future functions... They were given poker chips to put into feature function cups. It was really clever.

I finally won tonight at the tables after the Rock the Mainframe party.

-- Bryan Smith

Here are my impressions from Monday through Wednesday.

There is a lot of interest in the new end-to-end database monitoring technology we are adding to DB2 Performance Expert. We had 133 people in our session on Monday. That's more than we ever had in a PE session before.

Torsten has been asked to do many demos of this functionality for customers during this week.

As usual, you have to walk a lot during IOD. The conference building is huge, and the hotels are sometimes a good walking distance away. :-)

Curt almost missed his session yesterday. Someone called him 10 mins before the session to remind him -- he recovered quickly and was only 5 mins late. Full room, with heads bobbing up and down when talking about problems with supporting Java applications.

Most folks liked Dana Carvey much better last year than Martin Short.

Listening now to Jim Pickel on DB2 Security.

Lost more money at the craps and blackjack tables. I'm now starting to feel this financial crisis that I keep hearing about on the news.

-- Bryan Smith

I am sitting in the awesome developer den, where there is a roomful of colorful bean bags, along with two Wiis that they will be giving away at the end of the week! IOD attendees should stop by to enter the Wii contest; there are also many laptops set up with Data Studio developerWorks articles. Visitors can check out articles on Data Studio and its family of products and also ask the experts about Data Studio. We are located in Breakers G from 10am - 5pm.

Bad news: we had connectivity problems in the Data Studio for DB2 for z/OS labs. But we have a re-do on Thursday, and I hope everyone who really wanted to do this lab with a DB2 for z/OS server will come:

Session: HOL-2670B Data Studio and DB2 for z/OS
Time: Thu, 30/Oct, 02:00 PM - 05:00 PM
Location: Mandalay Bay South Convention Center - Lagoon F

-- Tina Chen

This is actually day 4 for me as I spent the weekend with DB2 LUW customers attending the DB2 Customer Advisory Council, a group of DB2 customers that provide feedback to the DB2 Toronto team on upcoming releases and future strategies to help shape DB2 LUW. We had a three hour window with these customers on Data Studio and the feedback was tremendous. The big change I'm seeing from 6 months ago at IDUG is customers are now downloading and using Data Studio throughout their developer communities. They're seeing the value Data Studio has to add over and above what they get from either Developer Workbench or other 3rd party development tools. Also, not only are they using this for their DB2 LUW environments, but several users indicated that they're using this for DB2 z/OS, as it enables their developers to easily build and debug SQL stored procedures for their DB2 z/OS environments. This is a big change from 6 months ago when customers didn't even know Data Studio existed.

Yesterday I held a session called "Empowering DBAs with Data Studio"; the room was full, with standing room only. Goes to show that DBAs are always looking for the latest and greatest technology to manage their databases. I demonstrated the new Data Studio coming soon. This upcoming release has added a ton of functionality for the DBA, including utilities, commands, and more DDL management. The audience was very excited and really wants to see Data Studio become their tool for managing both DB2 LUW and DB2 z/OS databases. There's definitely a buzz around Data Studio at this conference.

Tonight I arranged a podcast with YL&A consultants and Curt Cotner on Data Studio. I'll let you know how that goes....

-- Deb Jenson

Opening session: Ambuj announced over 7000 registered. It looked like they all attended the opening session. Martin Short was hilarious. His no-holds-barred routine was great.

I gave my session with Holger this am on Administration tooling. It went well. I took some questions about some upcoming branding changes that were discussed during the opening session. As we've been saying over time, the tools will evolve to support more than one database platform. In addition, over time the thought is to rebrand to Optim, which has good heterogeneous support.

Sitting now in Torsten and Holger's session on end-to-end database monitoring. There are probably 150 folks in attendance. If you want to know more about it... see this hot off the press article in IBM Database Magazine.

I wonder what it's like outside... wish I could get some time to just sit and veg....

-- Bryan Smith

Definitely growing momentum. Tina did a Data Studio Developer for DB2/z hands on lab that was at capacity and had 15 people waiting outside to get into the lab.

The Gold consultants heard about next generation Web 2.0 and data-oriented cloud computing topics. Then, Curt talked about Data Studio plans, which was mostly stuff that is going to come in 2009, and he was asked three times when exactly is that feature going to be available -- just goes to show how compelling the Data Studio vision is.

I did a session on Data Studio and Rational integration, and a few interesting common themes came out. It seems that folks are very hungry to be able to tie software requirements in Rational RequisitePro to both the application and the data model. Folks are very happy about the shell sharing concept: being able to install a series of tools into a common Eclipse in which the value of the whole is greater than the sum of the individual parts.

I was also working the Data Studio demo booth, and it was interesting to see the reaction of DBAs on being re-empowered to manage the data access layer via pureQuery, for both existing JDBC applications and new applications based on pureQuery.

-- Rafael Coss

"RU Ready?" This is what I was thinking when I was on the plane to China.

Being that this was my first time in China, I didn’t even know what to expect much less how many people would show up at my Data Studio session. I was on my way to attend the Rational Software Development Conference in China, which was hosted in the Shanghai Convention Center in September.

When I finally arrived I was completely surprised to see how many people showed up at the one day conference!

Around 900-plus people packed the 2nd floor to listen to “R Heroes” discuss what Rational had to offer this year. The audience was so much larger than expected that the auditorium was packed for the keynote speech, and they had to open a separate hall just to televise it! Did I mention the entire conference was also streamed live online? Talk about using technology to its fullest advantage!

To my surprise, I had around 110 people show up at my talk, and the conference hall was so packed that there were people standing on chairs outside the door trying to get a glimpse of the slides.

At the last-minute request of the local team, I was asked to present my Data Studio slides in Chinese. Now I really had to put my 10-plus years of Chinese school in the US to work!

When my speech was over, there was silence. I could hear people scribbling on their notepads, jotting down notes. Was my Chinese so bad that no one knew what I was saying?

And all of sudden, hands were popping into the air.

This was the first time many of the attendees were introduced to Data Studio and its family of products. Attendees were asking great questions, like how to use pureQuery with their existing Hibernate applications, and how Data Studio could help them become more productive.

As I boarded the plane back to the States, I exhaled a sigh of relief. My mission was complete.

Hi, I'm a Program Director with IBM Data Studio software. I've had a long history with IBM and, in particular, database and database tools spanning DB2 for z/OS, data replication, business intelligence, information integration, and database management tools.

I recently attended the CIO Forum in Pittsburgh, Pennsylvania. The CIO Forum is a 2-day regional event held in various cities across the United States focused on developing executive IT leadership, networking with peers, and leveraging technology to succeed in the global economy. There were around one hundred attendees, which made it a great size for interacting.

A few things stand out for me:

First, of course, our panel on Integrated Data Management with Deborah Hurley, Chief Warehouse Architect for Constant Contact, and Joe Farrugia, VP of Sales and Consulting for Dynamic Systems Solutions. Deborah is leading the way towards effective alignment of business and IT at Constant Contact. Using Rational Data Architect, IBM InfoSphere Information Server, and Business Glossary together, Constant Contact is driving ownership of the glossary terms with the subject matter experts in the line of business, but keeping IT in synch through integrations with Rational Data Architect and Information Server. Linking business and IT is key to developing trusted information assets, and business analysts are going to be front and center in this evolution.

The increasing importance of the business analyst role was also echoed in an analyst research note I'd read on the plane on the way to Pittsburgh. And at the event, one of the attendees spoke of their business curriculum that was morphing to focus on developing the cross-business-and-IT skill set needed to perform these emerging roles. Check out Deborah's session at Information on Demand.

I'd really like to hear from you... How is your organization managing alignment across business and IT?

The next thing I noticed is the degree of interest in data archiving solutions (even more than data privacy solutions). Our IBM table garnered lots of interest, particularly about our data archiving solutions (or maybe it was the ever-popular THINK ball caps). Folks we spoke to were looking to data archiving to improve performance for both online and batch processing and to retain information off-line for regulatory purposes while providing fast access in case of discovery requirements. They thought IBM Optim solutions sounded very promising, particularly given their cross-platform, cross-database, and application-aware characteristics. I think Joe Farrugia's description of client successes and savings drew them our way, too. If this is an area of interest to you, too, you might want to check out another session at IOD: Mid-American Energy: DB2 Archiving for Performance and Cost Control, Wednesday, Oct 29th at 11:30 AM.

Are archiving or data privacy top issues for your organization? What are your key business drivers?

I'm really interested in hearing from you all about issues around business and IT. It's always good to be able to feed real scenarios back to our engineering teams to ensure that we hit the mark. Feel free to comment using the Comment link below, or send an email to dstudio@us.ibm.com and tell us your story!

(Updated 10/15/2008 with updated Guide to IOD)

These Information on Demand conferences that we have every year really keep our lab and sales people busy. But all the work leading up to them is really worth it in the end, because it gives us an opportunity to talk directly to so many of our customers and get insight into the problems they face (which hopefully we can help solve). Last year’s IOD is where Data Studio was launched, and I’m very pleased that at this year’s IOD we have so much more to share with you – lots of sessions, demos, labs, community events, usability sandboxes, BOFs, etc. It’s also a great opportunity to get a sneak peek of upcoming enhancements.

I’m very honored that State Farm Insurance will be co-presenting with me on their experiences using pureQuery and Data Studio software, and I hope you can make that session.

We like working with customers on early releases to help us drive the product vision forward. This is the same basic pattern that we’ve used in DB2 for our entire 25 year history for each of the major steps in our technology journey (initial release of DB2, sysplex technology, DRDA and DDF, stored procedures, and now Data Studio and pureQuery). At a high level, the pattern is pretty simple:

Design and implement the next technology leap based on the best information available to our product developers

Find some key customers that want to push the envelope in this technology area, and partner closely with them

Be responsive to these customers, and adapt the technology as needed to make sure the early customers are successful and extremely happy with the solution. Sometimes, this will mean that we will drive the technology in new directions as we learn more about the problems the customer is trying to solve.

Finally, release the product for general availability and hope that we picked the early customers wisely… Here, I mean that we want to partner with customers that are representative of the broader customer population. If we picked the right customers, whatever direction they help us choose will end up being the solution that the other customers need as well. I’m happy to report that so far, we’ve had a stellar record of choosing the right customers, and have never ended up with a solution that was great for one customer, but had little value for the rest of the customer population.

It really helps our team deliver products that will fulfill your needs if they actually spend time talking to you. This is why I encourage you to come to our sessions, our birds of a feather sessions, our demos and tell us what you think about what we’re saying and showing. Effective product management and development relies on conversations, not on lectures.

For those of you who can’t make the conference in person, our team will be blogging from the conference. Hopefully that will give you the opportunity to get some idea of what is happening and provide you with the opportunity in this blog to add your comments and feedback.

I’m attaching a document here that highlights key sessions around Data Studio and Integrated Data Management and provides a pretty comprehensive list of sessions by day and product/topic area. I hope you find it useful, and I hope to see you there.

In my previous blog on the RDA 7.5 announcement, I promised to let you know when the trial code is ready. Well, it's ready now. You can download the 30-day trial off developerWorks.

For the highlights of the new release, check out the What's New documentation and check out my earlier blog on this topic that focuses on the data privacy and integration aspects of the new release, or even better, listen to my webcast.

I can show you all the new features in person. Come meet me at the IOD 2008 conference. I'll be busy at the conference - you can find me at either of the following sessions: