
Consilience is the confluence of concepts and/or principles from different disciplines, especially when they form a comprehensive unifying theory.

Independent Confirmation

Why are some inventions discovered at the same time in different parts of the world? Does this have something to do with the scientific practice of sharing important discoveries? Generally, scientists believe that they are part of a community of knowledge. Their discoveries do not occur in a vacuum; they must give credit to those who went before and created the foundation for their work. Therefore, when they discover something new, they are expected to share it with the entire world. This sharing is part of knowledge evolution. Interestingly enough, it is also key to the World Wide Web: collaboration is one of the great strengths of the Internet and a way to increase the overall knowledge of the planet. Science, likewise, strengthens its theories through independent confirmation.

Result Conciliation

There are oftentimes prescriptions for the types and numbers of witnesses needed to satisfy certain legal requirements. Anyone who has completed an experiment understands the importance of result conciliation. A hypothesis is not accepted as true unless its results can be repeated by independent sources; this shows that the reality is objective. The word consilience was formed from two Latin roots: the prefix "com-," meaning "together," and "-silience," from "salire," meaning "to jump." Therefore, consilience means "jumping together," or a convergence of proof from independent sources. Scientists should use different methods to reach the same conclusion. Business and economics have a similar concept: just think of a recession or depression. These are officially declared when a variety of indicators are in agreement, such as the stock market, employment, inflation, money supply, and so forth.

Knowledge Evolution

Consulting can use the concept of Consilience to teach firms how to follow objective norms. Technology consultants can compare a company's subjective practices to objective industry norms. The best career development succeeds on the basis of objective, independent analysis. The concordance of evidence can help a business create a successful strategy. Consilience is the convergence of evidence from independent sources to prove the validity of a conclusion. Objective corporate success can be achieved by satisfying the objective needs of your customers. Business intelligence requires an objective standard, such as Consilience, to be useful.

Conclusion

Consilience is important to you because the answer to any given problem may not necessarily come from within your field of expertise and experience. Rather, to be truly competitive in an ever-increasing world of knowledge, we need to adopt a broad-scoped, renaissance approach to learning and thinking, one that folds in other sets of concepts and principles to create durable solutions for today and tomorrow.

What does ACID mean in database technologies?

Why is ACID important?

Atomicity, Consistency, Isolation, and Durability (ACID) are important to databases because ACID is the set of properties that guarantees database transactions are processed reliably.

Where is the ACID Concept described?

Originally described by Theo Haerder and Andreas Reuter in 1983, in "Principles of Transaction-Oriented Database Recovery," the ACID concept has been codified in ISO/IEC 10026-1:1992, Section 4.

What is Atomicity?

Atomicity ensures that a transaction that changes multiple data sets has only two possible outcomes:

either the entire transaction completes successfully and is committed as a single unit of work,

or, if any part of the transaction fails, all of the transaction's changes are rolled back, returning the database to its previous, unchanged state.
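The all-or-nothing behavior described above can be sketched with Python's built-in sqlite3 module (the table and account names are illustrative, not from the original post): a failure midway through a two-step transfer rolls back both updates.

```python
import sqlite3

# In-memory database for illustration
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 0)")
conn.commit()

# Transfer funds as one atomic unit of work: both UPDATEs commit, or neither does.
try:
    with conn:  # the context manager commits on success, rolls back on error
        conn.execute("UPDATE accounts SET balance = balance - 50 WHERE name = 'alice'")
        conn.execute("UPDATE accounts SET balance = balance + 50 WHERE name = 'bob'")
        raise RuntimeError("simulated failure mid-transaction")
except RuntimeError:
    pass

# Both changes were rolled back to the previous, unchanged state.
balances = dict(conn.execute("SELECT name, balance FROM accounts"))
print(balances)  # {'alice': 100, 'bob': 0}
```

Note that the partial debit from alice never becomes visible, even though that UPDATE statement itself succeeded.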

What is Consistency?

To provide consistency, a transaction either creates a new valid data state or, if any failure occurs, returns all data to the state that existed before the transaction started. Put another way, if a transaction succeeds, all of its changes will have been properly completed and saved, leaving the system in a valid state.
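One common way a database defines "valid state" is through declared constraints. In this minimal sqlite3 sketch (the table and constraint are hypothetical examples), a CHECK constraint forbids negative balances, so a transaction that would violate it is rejected and the data returns to its pre-transaction state.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# The CHECK constraint defines what a "valid state" means for this table.
conn.execute(
    "CREATE TABLE accounts (name TEXT PRIMARY KEY, "
    "balance INTEGER CHECK (balance >= 0))"
)
conn.execute("INSERT INTO accounts VALUES ('alice', 100)")
conn.commit()

# A withdrawal that would overdraw violates the constraint; the engine
# rejects it and the data returns to its pre-transaction state.
try:
    with conn:
        conn.execute("UPDATE accounts SET balance = balance - 500 WHERE name = 'alice'")
except sqlite3.IntegrityError:
    pass

balance = conn.execute("SELECT balance FROM accounts WHERE name = 'alice'").fetchone()[0]
print(balance)  # 100 -- still a valid state
```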

What is Isolation?

Isolation keeps each transaction's view of the database consistent while that transaction is running, regardless of any changes performed by other transactions, thus allowing each transaction to operate as if it were the only one.
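This can be observed with two connections to the same on-disk SQLite database, standing in for two concurrent transactions (file name and table are illustrative; SQLite's default journal mode provides this read isolation, though details vary by engine and isolation level): the reader does not see the writer's uncommitted row.

```python
import os
import sqlite3
import tempfile

# Two connections to the same on-disk database stand in for two transactions.
path = os.path.join(tempfile.mkdtemp(), "demo.db")
writer = sqlite3.connect(path)
reader = sqlite3.connect(path)

writer.execute("CREATE TABLE events (id INTEGER PRIMARY KEY)")
writer.commit()

# The writer inserts a row but has not committed yet...
writer.execute("INSERT INTO events VALUES (1)")

# ...so the reader's view of the database is unchanged.
before = reader.execute("SELECT COUNT(*) FROM events").fetchall()[0][0]

writer.commit()  # now the change is published
after = reader.execute("SELECT COUNT(*) FROM events").fetchall()[0][0]
print(before, after)  # 0 1
```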

What is Durability?

Durability ensures that the database keeps track of pending changes in such a way that its state is not affected if transaction processing is interrupted. When restarted, the database must return to a consistent state containing all previously saved/committed transaction data.
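Durability is easiest to see with a file-backed database (the file and table names here are illustrative): once commit returns, the data survives the connection going away, and a fresh connection, as after a restart, still sees it.

```python
import os
import sqlite3
import tempfile

# Committed data must survive the connection (or process) going away.
path = os.path.join(tempfile.mkdtemp(), "durable.db")

conn = sqlite3.connect(path)
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, item TEXT)")
conn.execute("INSERT INTO orders VALUES (1, 'widget')")
conn.commit()   # once commit returns, the change is on stable storage
conn.close()    # simulate the process stopping

# A fresh connection -- as after a restart -- still sees the committed row.
conn = sqlite3.connect(path)
row = conn.execute("SELECT item FROM orders WHERE id = 1").fetchone()
print(row[0])  # widget
```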

What is Process Asset Library (PAL)?

A Process Asset Library (PAL) is a centralized repository within an organization that contains essential artifacts documenting processes, as well as process assets (e.g., configuration items and designs) used by an organization, project, team, and/or work group. The assets may also be leveraged to achieve process improvement, which is the intent of a lessons-learned document, for example.

What is in the Process Asset Library (PAL)?

A Process Asset Library (PAL) usually houses the following types of artifacts:

Conway’s Law

Any organization that designs a system (defined more broadly here than just information systems) will inevitably produce a design whose structure is a copy of the organization’s communication structure.

What is DDL (Data Definition Language)?

DDL (Data Definition Language) comprises the statements used to manage tables, schemas, domains, indexes, views, and privileges. The major actions performed by DDL commands are: create, alter, drop, grant, and revoke.
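A minimal sketch of the first three DDL actions, run through Python's sqlite3 (the table and column names are made up for illustration; GRANT and REVOKE are omitted because SQLite has no user privilege model — they apply to server databases such as PostgreSQL):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# CREATE: define a new table
conn.execute("CREATE TABLE staff (id INTEGER PRIMARY KEY, name TEXT)")

# ALTER: change the table's definition by adding a column
conn.execute("ALTER TABLE staff ADD COLUMN title TEXT")

# The schema now reflects both DDL statements.
cols = [row[1] for row in conn.execute("PRAGMA table_info(staff)")]
print(cols)  # ['id', 'name', 'title']

# DROP: remove the table entirely
conn.execute("DROP TABLE staff")
tables = conn.execute("SELECT name FROM sqlite_master WHERE type='table'").fetchall()
print(tables)  # []
```

Note that DDL manages the structure of the database (its schema), in contrast to DML statements such as INSERT or UPDATE, which manage the data within that structure.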

What is RAD?

Rapid Application Development (RAD) is a type of incremental software development methodology that emphasizes rapid prototyping and iterative delivery rather than up-front planning. In the RAD model, the components or major functions are developed in parallel, as if they were small, relatively independent projects, until integration.

RAD projects are iterative and incremental

RAD projects follow the SDLC iterative and incremental model, in which more than one iteration of the software development cycle may be in progress at the same time.

In the RAD model, the functional application modules are developed in parallel as prototypes and integrated to complete the product, enabling faster delivery.

RAD teams are small and composed of developers, domain experts, customer representatives, and other information technology resources working progressively on their components and/or prototypes.

A peer review is an examination of a Software Development Life Cycle (SDLC) work product by team members other than the work product's author to identify defects, omissions, and compliance with standards. This process provides an opportunity for quality assurance, knowledge sharing, and product improvement early in the SDLC.

What, exactly, the definition of a baseline is depends on your role and perspective on the SDLC (Software Development Life Cycle) process. The baseline concept plays a role in many aspects of SDLC execution, including project management and configuration management, among others. Additionally, the baseline concept and practice are applicable to all the SDLC methodologies, including, but not limited to, the Agile Model, Waterfall Model, Iterative Model, Spiral Model, and V-Model.

Baseline Definition

A baseline is a reference point in the software development life cycle marked by the completion and formal approval of a set of predefined work products for phase completion. The objective of a baseline is to reduce a project's vulnerability to uncontrolled change and to provide a point-in-time set of artifacts for reference and recovery, if necessary. Baselining an artifact (requirements specification matrix, design, code, data model, etc.) moves it into formal change control (usually in one or more repository tools) at milestone achievement points in the development life cycle. Baselines are also used to identify the essential software, hardware, and configuration assembly components that make up a specific release of a system.

Data modeling is the documenting of data relationships, characteristics, and standards based on the data's intended use. Data modeling techniques and tools capture and translate complex system designs into easily understood representations of the data, creating a blueprint and foundation for information technology development and reengineering.

A data model can be thought of as a diagram that illustrates the relationships between data. Although capturing all the possible relationships in a data model can be very time-intensive, well-documented models allow stakeholders to identify errors and make changes before any programming code has been written.

Data modelers often use multiple models to view the same data and to ensure that all processes, entities, relationships, and data flows have been identified.

There are several different approaches to data modeling, including:

Concept Data Model (CDM)

The Concept Data Model (CDM) identifies the high-level information entities and their relationships, which are organized in an Entity Relationship Diagram (ERD).

Logical Data Model (LDM)

The Logical Data Model (LDM) defines detailed business information (in business terms) within each entity of the Concept Data Model and is a refinement of the Concept Data Model's information entities. Logical data models are non-RDBMS-specific business definitions of the tables, fields, and attributes contained within each information entity, from which the Physical Data Model (PDM) and Entity Relationship Diagram (ERD) are produced.

Protected Health Information (PHI)

The Privacy Rule protects all “individually identifiable health information” held or transmitted by a covered entity or its business associate, in any form or medium, whether electronic, paper, or oral. The Privacy Rule calls this information protected health information (PHI).