DataRage 2009 Replays

Abstract: DataRage is three days of 100% online technical sessions focused on database development and data management issues, which you can attend from wherever you like to log in. It brings you top industry speakers, technologists, and practitioners presenting on a wide variety of database-related topics: a raging confluence of conveniently delivered information you can't get anywhere else.

Watching Session Replays:

ER/Studio

ER/Studio Macro Mania

Ever wondered what all those macros do, but never had time to investigate? This session provides detailed explanations and examples of each macro available in ER/Studio 8.0. Join us and watch the macro madness unfold!

CASE STUDY: Establishing Data Modeling as a Service in BP

BP is a large federated organization with a diverse set of systems and data. This case study describes the four-year journey BP embarked upon (and is still on) when establishing Data Modeling as a Service (DMaaS).

Evolve or Die: Modeling Isn't Just for DBMSs Anymore

Data modeling has been around for 30 years. Its roots were firmly in the DBMS world. How many of you can remember implementing a DBMS on a tape-based system? But in the intervening three decades, the world has moved on.

Jumpstart your SOA Infrastructure with Data Models

The backbone of any SOA implementation requires defining, building, and managing XML schemas. Often, this is done with very little visibility of the back-end databases, which ultimately results in a huge chasm between the data represented in the service bus and what is persisted in the database. This session shows you how to reverse that trend and drive XML schemas from your data models so they're built with the same standards as your databases.
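As a toy illustration of the idea (not ER/Studio's actual schema generator), the sketch below maps hypothetical relational column metadata to XML Schema declarations, so the service-bus schema and the database share one set of definitions. The table, columns, and type mapping are all assumptions for the example.

```python
# Minimal sketch: derive XML Schema declarations from column metadata.
# The SQL-to-XSD type mapping and the Customer table are hypothetical.

SQL_TO_XSD = {
    "VARCHAR": "xs:string",
    "INTEGER": "xs:integer",
    "DATE": "xs:date",
}

def column_to_xsd(name: str, sql_type: str, nullable: bool) -> str:
    """Render one column as an xs:element declaration; nullable maps to minOccurs=0."""
    min_occurs = ' minOccurs="0"' if nullable else ""
    return f'<xs:element name="{name}" type="{SQL_TO_XSD[sql_type]}"{min_occurs}/>'

def table_to_complex_type(table: str, columns) -> str:
    """Wrap a table's columns in an xs:complexType with an xs:sequence."""
    body = "\n    ".join(column_to_xsd(*c) for c in columns)
    return (f'<xs:complexType name="{table}Type">\n  <xs:sequence>\n    '
            f"{body}\n  </xs:sequence>\n</xs:complexType>")

customer = [("CustomerId", "INTEGER", False),
            ("Name", "VARCHAR", False),
            ("BirthDate", "DATE", True)]
print(table_to_complex_type("Customer", customer))
```

Because the generator reads the same metadata the DDL is built from, a change to the model propagates to both the database and the schema.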

Adding Content to the Enterprise Portal

Extend the capabilities of the Enterprise Portal with new reports and charts. The Enterprise Portal provides a great set of reports out of the box, but how do you extend this to satisfy the ever-changing needs of users? During this session, learn how to create new reports and integrate them into the Enterprise Portal by using the same tools that were used to build the Enterprise Portal itself.

Integrating Advanced Model Lifecycle Management with the Enterprise Repository & MS SharePoint

In large environments, the integration of data modeling into the entire life cycle is important. The intellectual knowledge invested in well-designed data models is not of use simply to database developers. When enterprise, conceptual, and logical data models are used in an effective information management framework, business users and technical users alike can leverage the information contained within them.

Comparing Generalized and Detailed Approaches to Data Modeling

Some data modelers prefer large models with many highly detailed entities. Others prefer small models with a handful of highly generalized entities. Each approach has merit, but combining them in a best-of-both-worlds approach is so difficult that most modelers give up and declare themselves to be in one camp or the other. In this presentation, a Burton Group Senior Analyst catalogs the relative merits of the two approaches and describes why one approach is highly desirable, but the other is absolutely indispensable.

Five Classic Data Modeling Mistakes & How to Avoid Them

We've all been there: a shortcut here, a compromise there, a bit of overmodeling over there. In this presentation, Karen Lopez demonstrates five classic and all-too-common data modeling mistakes that are easy to make and yet just as easy to avoid.

Universal Data Models and Patterns: Developing Higher Quality Data Models in Less Time

Learn how Universal Data Models and Universal Patterns can help your organization reduce development time on new projects and facilitate standardization of existing data models. Presented by renowned author and data modeling expert Len Silverston.

This session takes an in-depth look at the data dictionary system of ER/Studio and how it can benefit data architects and their models. Attendees will come out of this session with a deeper understanding of domains, attachments, reference values, and the enterprise data dictionary system as a whole, and will be able to take their modeling to an entirely new level of productivity.

Standardizing data across the organization lowers the cost and effort of data integration, reporting, and warehousing tasks and raises the reliability (i.e., the validity) of data sets and report data. But how do you ensure that, in today's world of fast-paced, distributed development, data definitions are consistent, valid, and reliable? The answer lies in creating a data dictionary. To be successfully adopted, the data dictionary must not only be complete and specific to the organization, but must also be easy to access and search. This session describes the process of analyzing existing data structures, engaging stakeholders to agree upon a common set of definitions, and publishing a data dictionary in an easily accessed Web format.
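The harvest-then-publish step described above can be sketched in a few lines. This is a toy illustration using SQLite, not any Embarcadero tooling; the table, columns, and placeholder definitions are hypothetical.

```python
# Toy sketch of building a searchable data dictionary: harvest column
# metadata from an existing database, load it into a dictionary table,
# and search it. SQLite stands in for the real environment.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (customer_id INTEGER, birth_date TEXT)")

# The dictionary itself: one agreed-upon definition per data element.
conn.execute("""CREATE TABLE data_dictionary (
    table_name TEXT, column_name TEXT, definition TEXT)""")

# Analyze existing structures so the dictionary stays complete.
for (table,) in conn.execute(
        "SELECT name FROM sqlite_master "
        "WHERE type = 'table' AND name != 'data_dictionary'"):
    for row in conn.execute(f"PRAGMA table_info({table})"):
        conn.execute("INSERT INTO data_dictionary VALUES (?, ?, ?)",
                     (table, row[1], "TODO: definition agreed with stakeholders"))

# Easy to search: find every element whose name mentions 'date'.
hits = conn.execute("""SELECT table_name, column_name FROM data_dictionary
                       WHERE column_name LIKE '%date%'""").fetchall()
print(hits)
```

The stakeholder agreement is the hard, human part; the point of the sketch is only that publishing the result as a queryable table (or Web page generated from it) makes the definitions easy to access and search.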

Change Manager and JBuilder

Building a Repository of Database Changes

Learn to write a plug-in for Change Manager that populates a database repository of comparison results when running schema and configuration comparisons. See how to set up comparisons, schedule them, and then query the repository to see what's changed, with example views that deliver useful and concise difference information.
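To make the repository idea concrete, here is a minimal sketch, assuming a hypothetical `schema_diff` table; it is not the Change Manager plug-in API. Each comparison run appends its differences, and a short query then answers "what changed in the latest run?"

```python
# Sketch: a comparison-results repository and a concise difference query.
# Schema, dates, and object names are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE schema_diff (
    run_date TEXT, object_name TEXT, object_type TEXT, change TEXT)""")

# A plug-in would append rows like these after each scheduled comparison.
conn.executemany("INSERT INTO schema_diff VALUES (?, ?, ?, ?)", [
    ("2009-03-01", "ORDERS", "TABLE", "column ADDED: SHIP_DATE"),
    ("2009-03-01", "IDX_CUST", "INDEX", "DROPPED"),
    ("2009-03-08", "ORDERS", "TABLE", "column ALTERED: STATUS"),
])

# A concise difference view: everything that changed in the latest run.
latest = conn.execute("""SELECT object_name, change FROM schema_diff
    WHERE run_date = (SELECT MAX(run_date) FROM schema_diff)""").fetchall()
print(latest)
```

Keeping every run's results, rather than only the latest, is what turns point-in-time comparisons into a change history you can audit.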

DBArtisan

A Simple Approach to DB2 Index Redesign

This presentation reviews index structure and discusses a number of ways to redesign indexes without affecting application integrity. It applies to all DB2 platforms, and the examples come from the presenter's experience. The use of IBM's Explain is also shown.

DB Optimizer

DB Optimizer: Powerful Simple Database Performance Solutions

Identifying the root cause of a performance issue can be difficult or practically impossible using standard tools. Even if the root cause is found, the next step of finding a solution can prove almost as hard. DB Optimizer solves these issues. DB Optimizer can identify, in seconds and with ease, the root causes of database performance issues. DB Optimizer empowers the user to see database load in the blink of an eye, quickly see anomalies and performance issues, and drill down to the root of the problem. Problems are clearly identified as machine, database configuration, application design, or SQL optimization issues. If SQL has optimization issues, DB Optimizer will parse the query and look through hundreds of options to find better alternative execution paths, as well as verify that the necessary indexes, statistics, and even histogram information are in place for the fastest execution.

All Access

The Right Tools Just in Time: How to Best Position Yourself for Perpetual Fire Fighting

In an agile development environment, change happens quickly and code is developed, tested, and deployed to production with extraordinary frequency. Whether you're developing internal-facing applications that serve the business or external-facing applications that serve your customers, you need the right combination of tools to manage change, ensure high performance, and meet expected service levels (SLAs). This session introduces the tools that any master craftsman should have handy to be best positioned for any firefighting situation.

Performance Center

Sybase ASE Performance & Tuning: Best Practices

This session covers the fundamental concepts, recommendations, and problems of performance monitoring in Sybase ASE and how to utilize Embarcadero tools to facilitate the process.

RapidSQL

Getting the Most from JBuilder and RapidSQL Developer

You are a Java developer and you need to write or work with SQL. What tools can help you get your job done faster and better than before? JBuilder 2008 R2 and Power SQL 2.0 are two products that work well together and add great value to the developer's tool chest. Learn how to use these two products together successfully.

MassMutual's SQL Preprocessing Efficiencies

Dan Lukasik of MassMutual explains the efficiencies that have been realized using preprocessor directives in SQL code. He reviews how MassMutual uses the directives and provides examples that have been proven not only to save developers time, but also to result in error-free code.
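MassMutual's actual directives are not described here, so as a generic illustration of the technique, the sketch below implements a tiny `#ifdef`/`#endif` preprocessor that emits different SQL for different target environments from a single source file. The directive syntax and the sample query are assumptions.

```python
# Toy SQL preprocessor: keep or drop lines based on #ifdef/#endif blocks.
# Supports nesting; directive names and the sample query are invented.
def preprocess(sql: str, defines: set) -> str:
    out, keep = [], [True]
    for line in sql.splitlines():
        s = line.strip()
        if s.startswith("#ifdef"):
            # A nested block is active only if its parent is also active.
            keep.append(keep[-1] and s.split()[1] in defines)
        elif s == "#endif":
            keep.pop()
        elif keep[-1]:
            out.append(line)
    return "\n".join(out)

source = """SELECT *
FROM policies
#ifdef DEV
WHERE rownum < 100
#endif
"""
print(preprocess(source, {"DEV"}))  # dev build keeps the row limit
print(preprocess(source, set()))    # production build strips it
```

The payoff claimed in the session follows from this shape: one audited source file instead of hand-maintained per-environment variants, so the variants cannot drift apart.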

Using InterBase Encryption

This session covers how to use the new encryption feature of InterBase using IBConsole and ISQL.

With Jerry Barnes of Embarcadero Technologies

Rapid Application, Web and Java™ Development Tools

Delphi

Multi-tier Database Applications with Delphi 2009

This session provides an introduction to the renewed DataSnap architecture in Delphi 2009, which lets you create multi-tier applications in an easy, RAD way, delivering power and flexibility for your remote, zero-configuration client applications.

With Marco Cantu - WinTech Italia Srl

The DBX Architecture in Delphi

This session covers the basics of the TClientDataSet Provider Architecture and how it's used by DBX. Learn how to use the components both visually and in code. The session also covers the common needs of a database application and how DBX provides solutions to meet these needs.

With Robert Love - State of Utah

Data Mashups in Web 2.0

How do we make data available to the Web 2.0 world? Come and see database data surfacing in Google Maps, home page gadgets and widgets, Google Apps spreadsheets, and even Facebook.

With Marco Cantu - WinTech Italia Srl

Working with Data Using ADO.NET

Microsoft documentation for ADO.NET 2.0 (and later versions) emphasizes the use of data source objects for data access. These objects, however, hide the real worker classes in ADO.NET, including connections, commands, data readers, command builders, and data tables. This presentation discusses the roles of these important classes and demonstrates how to use them directly in your .NET applications.

With Cary Jensen - Jensen Data Systems

Bridging the Gap between Application and Database Developers

Most applications interact in some way with a database. This presentation demonstrates how Change Manager, RapidSQL, and ER/Studio can help Delphi developers understand the interactions between an application and a database, making it easy to quickly find and fix bugs in application code.

With Pawel Glowacki of Embarcadero Technologies

Writing Unit Tests for SQL Databases Using Delphi

Writing unit tests for client/server applications that access a database on a SQL server is more complex than writing tests for stand-alone applications: the resulting state of the system is not a value returned by every function, but is stored in the database and must be retrieved in a more involved way. Although Delphi does not ship with a tool specifically designed to test SQL databases, with some coding work it is possible to use the DUnit framework to write tests not only for classes working on the DB, but also for stored procedures on the server. It is also possible to extend DUnit so that DB developers and testers can write tests effectively without deepening their knowledge of Delphi syntax and class libraries, focusing on their main task of writing SQL code.
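The session itself is about DUnit and Delphi; purely as a language-neutral illustration of the pattern it describes (the test oracle is database state, not a return value), here is the same shape using Python's `unittest` and SQLite. The `orders` table and `add_order` function are invented for the example.

```python
# Illustration of DB-state unit testing: the code under test writes to the
# database, and the test retrieves and asserts on the stored state.
import sqlite3
import unittest

def add_order(conn, customer_id, amount):
    """Code under test: its effect lives in the database, not a return value."""
    conn.execute("INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
                 (customer_id, amount))

class OrderTests(unittest.TestCase):
    def setUp(self):
        # Fresh in-memory database per test: the DB equivalent of a clean fixture.
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute(
            "CREATE TABLE orders (customer_id INTEGER, amount REAL)")

    def test_add_order_persists_row(self):
        add_order(self.conn, 42, 99.5)
        # Retrieve the resulting state with a query, then assert on it.
        row = self.conn.execute(
            "SELECT customer_id, amount FROM orders").fetchone()
        self.assertEqual(row, (42, 99.5))

if __name__ == "__main__":
    unittest.main(exit=False)  # exit=False so the run can be embedded
```

A stored procedure would be tested the same way: call it, then query the tables it is supposed to have changed; the per-test fresh fixture is what keeps such tests independent of each other.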

With Giacomo Degli Esposti - Optima s.r.l

Building Database Applications with Delphi

This session shows database architects, database developers, and database administrators how simple it is to create database applications using Delphi 2009.