I get tons of questions about "how much it costs to develop an analytical application." Alas, as most of us unfortunately know, the only real answer to that question is “it depends.” It depends on the scope, requirements, technology used, corporate culture, and at least a few dozen more dimensions. However, at the risk of a huge oversimplification, in many cases we can apply the good old 80/20 rule as follows:

Components

~20% for software, hardware, and other data center and communications infrastructure

Comments

Hi Boris,

Your assessment applying the 80/20 rule to infrastructure vs. services, initial vs. ongoing, data integration vs. reporting, and indirect vs. direct costs seems right. I think some of the old rules are still golden. While this rule may help a team validate the effort estimated in the two buckets, the million-dollar question is how you translate that 80/20 rule into actual dollar figures. Depending on parameters such as which tools and software packages are chosen, whether the deployment spans multiple data centers to cover different geographies, and the sizing and capacity of the hardware given the requirements and number of users, the 20% bucket for software, hardware, and data center infrastructure may translate into anything from less than a million to a few million dollars. So while the infrastructure bucket may still amount to 20% of the overall cost, the overall spend to build the analytical application has a wide range.

The new trend of offering analytical applications as managed services (combined with dedicated environments for data security and other related aspects) is changing the whole game, shifting the emphasis toward transactional costs.
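The back-of-envelope math behind this comment can be sketched in a few lines. The dollar figures below are illustrative assumptions, not quotes from the post; only the 20% infrastructure share comes from the article.

```python
# Back-of-envelope estimate: if the infrastructure bucket is ~20% of the
# total, the full project cost can be extrapolated from that one figure.
# All dollar amounts here are illustrative assumptions, not real quotes.

def estimate_total_cost(infrastructure_cost: float, infra_share: float = 0.20) -> float:
    """Extrapolate total project cost from the infrastructure bucket."""
    return infrastructure_cost / infra_share

# Rajesh's point: the same 20% share spans a wide absolute range.
low = estimate_total_cost(800_000)     # infra under $1M  -> $4M total
high = estimate_total_cost(3_000_000)  # infra of $3M     -> $15M total
print(f"${low:,.0f} to ${high:,.0f}")  # prints "$4,000,000 to $15,000,000"
```

The point of the sketch: the ratio may hold, but the absolute spend it implies varies by an order of magnitude with the inputs.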

Rajesh, great comments, thanks for the feedback. There was another series of comments I received recently that I'd like to respond to.

Some other factors that may affect the 80/20 rule stem from the fact that different vendors and analysts draw the lines between BI, DW, and data integration in different places. But if we draw the lines as follows:

BI
- Operational reports
- Ad-hoc reports
- OLAP
- Dashboards
- Data mining
- Alerts

Analytical data integration
- ETL / CDC / EII
- Data quality
- MDM
- Metadata
- Data warehouse, data marts

then the 80/20 rule applies pretty well. But what can throw that ratio rather significantly in favor of BI is if we mix in what I define as “applications” that run on top of the BI platform, such as:
- Performance management (budgeting, planning, forecasting, scorecards, metrics management)
- Customer and/or marketing analytics
- Risk analytics
- HR analytics

In this case the 80/20 ratio gets much closer to 50/50. But I typically define these as “apps” and don’t include them in my estimates for “platforms” or “infrastructure” such as BI, DW, ETL, etc.