Managed services and utilities can cut the cost of reference data, but to be truly effective managed services must be more flexible and utilities must address issues of data access and security.

A panel session led by A-Team Group editor-in-chief Andrew Delaney at the A-Team Group Data Management Summit in London set out to discover the advantages and challenges of managed services and utilities, starting with a definition of these data models.

Martijn Groot, director at Euroclear, said: “A managed service lifts out existing technology and hands it over to the managed service provider, while a utility provides common services for many users.” Tom Dalglish, CTO, group data at UBS, added: “Managed services run data solutions for us and utilities manage data for themselves.”

Based on these definitions, the panellists considered how and why managed services and utilities are developing. Dalglish commented: “We need to move away from all doing the same things with data. Managed business process outsourcing services are well understood, but utilities present more challenges – will they be run as monopolies and make data difficult to access, what is the vendor interest?” Steve Cheng, global head of data management at Rimes Technologies, added: “The market has moved on from lift outs. New technologies mean managed services can be more flexible than outsourcing.”

It is not only the nature of available services that is driving financial firms to third-party providers, but also cost and regulation, both of which are high on the agenda. Jonathan Clark, group head of financial services at Tech Mahindra, explained: “Cost is significant, but regulation is the number one issue. Regulations require more holistic and high quality data and that is high cost for firms, so they are trying to get data quality at a reasonable price point.”

Dalglish focussed on cost, saying: “The business case is about money. Large companies have lost the ability to change, a utility can help to reduce costs. Banks are looking at these data models to regain efficiencies they have lost internally and are difficult to rebuild.”

Cheng described the reference data utility model as being more like the satellite television model than water or electricity models, and noted that Rimes’ experience of customers is that they want to innovate, but not allow their cost base to increase.

While consensus among the panellists was that managed services and utilities can provide cost savings, they also agreed that it is not the cost of data, but the infrastructure, sources, services and people around the data that rack up the cost to an extent that is leading firms to seek lower cost solutions. Firms that opt to use a data utility can convert capital costs to expenditure and chip away at elements such as multiple data sources.

Dalglish commented: “If you can achieve savings of 30% to 35% that is good, but this is a conservative estimate and it should be possible to save more going forward.” Cheng added: “The rule of thumb is that for every £1 spent on data licences, £2 or £3 is spent on infrastructure and staff. The need is to identify those hidden costs so that the use of a managed service or utility can be justified.”

Returning to the pressure of regulation, Delaney asked the panel whether managed reference data services and utilities would be regulated in the same way as banks. While this is not happening at the moment, some panel members expect it to happen and warn that utilities may find a way around regulation by using disclaimers. Cheng said: “Forthcoming regulations are very prescriptive about data models and regulators may look at the whole data chain. This means utilities and managed services may in future be subject to the same regulatory requirements as other market participants.”

The concept of managed services and utilities is not new. Dalglish recalled an effort to set up a utility that did not take off back in 2005 and said that the moment has now come for utilities as the technology stack has improved, data is better understood and this is a good time for competition and collaboration in the market. Groot added: “Data delivery mechanisms have changed, the bar has been raised on projects and the business case for an internal service is difficult, making external services attractive.” Panellists also noted that technologies such as the Internet and cloud are facilitating mass customisation, and pointed to the benefit of utilities built for a single purpose.

With so much to offer, Delaney questioned the panel on what type of organisations will benefit from third-party utilities. Panel members said both large and small firms could benefit, with large companies reducing today’s massive data costs and small firms being able to hand off non-core reference data services. Clark added: “Firms that can benefit most are those that find it difficult to discover the cost of data, perhaps because it is managed in different departments or geographic regions. But these firms are also the hardest to convert because they don’t know their costs.”

A question from the audience about defining reference data, making it open and putting it in a utility for all to use, met a consensus response from panel members who said it is a great idea, but will not happen because there are too many vendors with vested interests in the market.

Closing with a blue skies scenario, Delaney asked how far the utility concept could go. Groot concluded: “There is a need for operational procedures and recovery planning, but utilities could go a long way as there is a lot of data in scope.”

Building a comprehensive data strategy for your organization can be a daunting task. Where do you start, and how do you put all the proper pieces in place for a suitable strategy to bring value to your company from data? The key is to ensure that the strategy focuses on the strengths and needs of the individual company.

1. Identify and Describe

A data strategy is “a roadmap and plan to identify what to do with a company’s data and to support accessing, sharing and managing the content,” says Evan Levy. In identifying and referencing content, look for subject area and data element names/descriptions, content type (structured, unstructured, semi-structured) and metadata. This component also includes the tools and methods to enable and automate the collection, publishing and management of content details, as well as the business and technical-level details.
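As a sketch of the kind of catalogue entry this component produces, the snippet below models a data element with the attributes listed above. The field names and example values are illustrative assumptions, not a prescribed schema:

```python
from dataclasses import dataclass, field

@dataclass
class DataElement:
    """Catalogue entry describing one data element."""
    name: str                # subject area / data element name
    description: str         # business-level description
    content_type: str        # "structured", "unstructured" or "semi-structured"
    metadata: dict = field(default_factory=dict)  # technical-level details

# Example: registering a reference data element (hypothetical values)
isin = DataElement(
    name="ISIN",
    description="International Securities Identification Number",
    content_type="structured",
    metadata={"owner": "reference-data-team", "source": "security master"},
)
```

In practice such entries would be collected and published automatically by the tooling this component defines, rather than created by hand.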

2. Provision and Share

The key to providing and sharing data between systems is to determine packaging (files, transactions, data streams, etc.), content and formatting (values, formats, etc.) and package metadata details (location, origin, availability, changes, etc.). This component also defines the methods that make data available, such as support for internally and externally delivered content, production support and change control (errors, fixes, versions, etc.), and the interfaces and access methods that allow data delivery.
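The packaging, formatting and package metadata details above can be pictured as a simple delivery manifest. This is an illustrative sketch only; the file name, path and feed name are hypothetical:

```python
# Manifest for one data delivery, capturing the three determinations above:
# packaging, content/formatting, and package metadata.
manifest = {
    "package": "eod_prices_20130630.csv",          # packaging: a file
    "format": {"delimiter": ",", "encoding": "utf-8"},  # content and formatting
    "metadata": {
        "location": "sftp://vendor.example/outbound/",  # hypothetical location
        "origin": "vendor-feed-A",
        "availability": "daily 22:00 UTC",
        "version": 3,                               # change control
    },
}
```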

3. Stage and Store

Areas to consider for storing and sharing enterprise content include master/reference data (systems of reference), business event details (transactional history), external reference and descriptive content and processing (applications, reporting and data integration). Also decide the tools and technologies to be used for content storage, specifically the storage system (DBMS, flat files, cloud, Hadoop, etc.) and the access method (API, Web services, applications, etc.).
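The separation between storage system and access method can be sketched as a small abstraction: one interface, with interchangeable back ends. The class names and flat-file example below are assumptions for illustration, not part of any specific product:

```python
import csv
import io
from abc import ABC, abstractmethod

class ContentStore(ABC):
    """Access method over a storage system (DBMS, flat file, cloud, ...)."""
    @abstractmethod
    def lookup(self, key: str) -> dict: ...

class FlatFileStore(ContentStore):
    """Minimal flat-file system of reference, keyed on the 'id' column."""
    def __init__(self, text: str):
        rows = csv.DictReader(io.StringIO(text))
        self._data = {row["id"]: row for row in rows}

    def lookup(self, key: str) -> dict:
        return self._data[key]

store = FlatFileStore("id,name\nUS0378331005,Apple Inc\n")
print(store.lookup("US0378331005")["name"])  # Apple Inc
```

Swapping in a DBMS- or Web-service-backed implementation would change only the subclass, not the consuming code.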

4. Integrate and Move

It is necessary to consider the movement and transformation of source data for use by downstream systems, including identification and matching; cleansing, standardization and acceptance of content; and metadata and data lineage details. Determine the data movement/migration infrastructure to be used for bulk data movement and processing and application/transaction messaging (ESB).
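A cleansing and standardization step of the sort described above might look like the following sketch. The alias table and ticker formats are hypothetical:

```python
def standardize_ticker(raw: str) -> str:
    """Cleansing: trim and collapse whitespace, upper-case, map known aliases."""
    aliases = {"VOD LN": "VOD.L"}  # hypothetical alias table
    cleaned = " ".join(raw.split()).upper()
    return aliases.get(cleaned, cleaned)

print(standardize_ticker("  vod ln "))  # VOD.L
print(standardize_ticker("IBM"))        # IBM
```

In a real pipeline each transformation would also emit lineage metadata recording what was changed and why.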

5. Govern and Manage

What are the company’s policies for managing the data? Data governance methods and processes should cover information access policies, and the methods and processes for supporting data access and resolving conflicts. Data management methods and practices should include the adoption and usage of data standards and the tactical deployment of data policies into applications and data usage.
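One way to picture the tactical deployment of an information access policy into an application is a role-to-domain lookup. The roles and data domains below are invented purely for illustration:

```python
# Hypothetical policy table: which roles may access which data domains.
POLICIES = {
    "risk": {"positions", "counterparties", "market_data"},
    "marketing": {"client_contacts"},
}

def can_access(role: str, domain: str) -> bool:
    """Resolve an information access request against the policy table."""
    return domain in POLICIES.get(role, set())

print(can_access("risk", "positions"))       # True
print(can_access("marketing", "positions"))  # False
```

Real governance frameworks add escalation and conflict-resolution paths on top of such checks, but the principle of policies living in data rather than in scattered application code is the same.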

Download this white paper – Data Management – a Finance, Risk and Regulatory Perspective – about the challenges facing financial institutions operating across borders with respect to ensuring data consistency and usability across multiple user types. The finance, risk and compliance operations of any financial institution need nimble access to business information, for performance measurement, risk management, and client and regulatory reporting. But although the underlying data may be the same, their individual requirements are different, reflecting the group-level view required by senior management and regulators, and the more operational view at the individual business level.

Where in the past, risk managers have been left to figure out what’s most appropriate for their particular institutions, regulators today are adopting a more aggressive stance, challenging the assumptions underpinning banks’ approaches to risk management. As a result, today’s challenge is not only to understand the current regulatory, risk or finance requirements, but also to set in place the analytical framework that will help anticipate future requirements as they come on stream. To find out more, download the white paper now!


“The Global LEI will be a marathon, not a sprint” is a phrase heard more than once during our series of Hot Topic webinars that’s charted the emergence of a standard identifier for entity data. Doubtless, it will be heard again.

But if we’re not exactly sprinting, we are moving pretty swiftly. Every time I think there’s nothing more to say on the topic, there is – well – more to say. With the artifice of the March ‘launch date’ behind us, it’s time to deal with reality. And the reality practitioners are having to deal with is one that’s changing rapidly.

Avaloq subsidiary B-Source is taking over operational responsibility for the Wealth Management Operations Back-Office of Deutsche Bank (Switzerland) Ltd with effect from July 1, 2013, supporting the bank in concentrating further on its core business.

With effect from July 1, 2013, B-Source, a member of the Avaloq group, is taking over operational responsibility for the Wealth Management Operations Back-Office of Deutsche Bank (Switzerland) Ltd, including 80 employees in Geneva. Transformation of the core-banking platform to the integrated Avaloq Banking Suite is scheduled for summer 2014.

Markus Gröninger, CEO of B-Source, said: “Our continuously increasing community corresponds clearly to our growth path. This confirms us to be on the right track with our strategy of offering highly industrialised services that let banks concentrate on their core business and generate future growth.”

Francisco Fernandez, CEO of Avaloq, is also very pleased with the new deal: “In migrating to the integrated Avaloq Banking Suite, our clients are setting the highest standards in terms of individualising processes and industrialising operations. We love challenges like that. This is how we generate maximum added value for our clients.”

The past 12 months has seen the emergence of new players offering Business Process Outsourcing (BPO) services for Reference Data Management. These new arrivals expand the range of options available to financial institutions for addressing the challenges of regulatory compliance, operational cost reduction and scalability.

But BPO has other benefits, and innovative adopters have benefited from using the model to create new value-added services. By catering to clients’ data management needs, these players have been able to transform what’s traditionally been considered a cost centre into a new and significant source of revenue.

This paper – from AIM Software – explores this exciting new trend, and describes how an established financial institution took advantage of BPO to turn its enterprise data management initiative into a new source of revenue and business growth.

Zurich, Switzerland – SIX Financial Information and LUZ Engenharia Financeira, the largest provider of risk management software and consulting services to buy- and sell-side institutions in Brazil, have established a strategic relationship to meet the growing need for broader and deeper international financial market data in Brazil.

As Brazil’s investment community increasingly turns to foreign markets to achieve superior returns, reliable, high quality pricing and reference data becomes more important every day. SIX Financial Information, a leading provider of global financial information since 1930, will fill that need for clients of LUZ Engenharia Financeira.

Edivar Vilela Queiroz, CEO of LUZ Engenharia Financeira commented, “While the Brazilian financial services community has been well served by local data providers, SIX Financial Information has the breadth and depth of global market data to support current needs as well as the capacity to grow as our local market evolves.” He continued, “And as the world’s markets become ever more connected and transparent, this is an important differentiator that will allow seamless growth into offshore markets.”

“As Brazil becomes a major force in the global financial markets, foreign investors will undoubtedly continue their steadily increasing interest in the Brazilian markets and help foster even more growth,” said Barry Raskin, Managing Director for SIX Financial Information USA. “We are excited to extend our focus to this vibrant market, and very pleased that LUZ-EF has given us their stamp of approval through this strategic partnership.”

On Wednesday, March 13 2013, SIX Financial Information and LUZ-EF will jointly host a client event in São Paulo where they will formally announce their partnership and describe the international equity and options pricing and reference data available through the LUZ-EF platform.

B-Source successfully migrated Falcon Private Bank with its global locations to the B-Source Master at the beginning of the year. This will enable the established Swiss private bank to further optimize its processes and concentrate on its strategic expansion.

The successful migration of Falcon Private Bank to the B-Source Master means another Swiss financial institution has put its faith in B-Source’s reliable and innovative banking solution. Falcon Private Bank opted to outsource the operation of its banking platform and migrate it to the B-Source Master, an Avaloq-based banking application landscape using the ASP (application service provisioning) model. All three banking locations in Switzerland, Hong Kong and Singapore were migrated. The work was successfully completed within 15 months, a short period given the differing regional legal regulations. Orbium, a long-standing partner of B-Source, also played a decisive role in the successful project implementation.

By outsourcing its banking platform, Falcon Private Bank has a powerful, efficient and scalable banking solution that will allow it to focus on its strategic expansion in emerging markets. The bank chose B-Source in part due to its extensive expertise and long-standing experience not only in Switzerland but also with locations in other countries.

“The main reason behind our decision was B-Source’s experience in international outsourcing business, as we wanted to migrate several locations to the new banking system at the same time,” explains Tobias Unger, COO of Falcon Private Bank. “The migration of our banking platform to the B-Source Master creates the basis for optimal fulfilment both of our clients’ growing demands for higher quality service and of new regulatory requirements, and for pressing ahead with our global strategy and direction,” adds Unger.

“The migration of Falcon Private Bank to the B-Source Master is a further success for us, and we are proud to count another renowned first-class Swiss private bank among our clients in the shape of Falcon Private Bank. B-Source’s long-standing experience with international private banks enabled us to successfully implement this challenging project in a very short time and to a high level of quality,” says Markus Gröninger, CEO of B-Source AG.

Financial firms are falling short on data management issues such as calculating the true cost of data, identifying the operational cost savings of improved data management and embracing social media data, but according to research by consultancy Capco, these issues can be resolved with a cross-organisational and practical approach to data management and the development of a data culture.

The business and technology consultancy’s report – ‘Why and how should you stop being an organisation that manages data and become a data management organisation’ – is based on interviews with close to 100 senior executives at European financial institutions. It considers the many approaches to data management across the industry and within individual enterprises, as well as the need to rethink data management. It states: “There is one certainty: data and its effective management can no longer be ignored.”

The report suggests an effective data management culture will include agreed best practices that are known to a whole organisation and leadership provided by a chief data officer (CDO) with a voice at board level and control of data management strategy and operational implementation.

Turning the situation around and attaching practical solutions to the data management vision of an all-encompassing data culture, Capco lists regulatory compliance, risk management, revenue increase, innovation and cost reduction as operational areas where good data management can have a measurable and positive effect on profit and loss.

Setting out how an organisation can create an effective data culture, Capco notes the need to change from being an organisation that is obliged to do a certain amount of data management, to a mandated and empowered data management organisation in which data has ongoing recognition as a key primary source. The report concludes: “Every organisation has the potential, as well as the need, to become a true data management organisation. However, the journey needs to begin now.”

Reference data management (RDM) is a foundational element of financial enterprises, yet the collection of solutions used to manage reference data in most firms is not satisfactory, according to a report published this week.

The report – Reference Data Management: Unlocking Operational Efficiencies, published by Tabb Group in conjunction with data integration specialist Informatica – describes current sentiment around RDM. It looks at development through four generations of solutions, details the obstacles to RDM success and sets out how firms at different levels of RDM adoption can move forward towards the holy grail of centralised RDM coupled to consistent reference data processing.

Despite huge investments in RDM over the past decade, research carried out among 20 firms – 25% in Europe, 75% in the US, 50% on the buy side and 50% on the sell side – in April 2012 found 86% of respondents dissatisfied with their RDM capabilities. Of these, 48% are being driven to improvement for reasons related to resource optimisation and outcomes, while 35% are responding to specific catalysts such as compliance.

Recommending how to navigate the road ahead, the study suggests firms committed to bolstering existing suites of RDM solutions should focus on wrapping current solutions with technology that enables a consistent enterprise data governance process, while those yet to make a significant commitment to an RDM solution should seek solutions that manage multiple reference data domains in a consistent and integrated enterprise framework.

The report concludes: “There can be no glory without doing the hard work first. Data fluency, a critical precursor to data consumability, simply means that data flows more easily, which in turn means that end users must be able to find it. And, finding data requires meticulous attention to standards, labels and other metadata, however imperfect they may be now or in the future. That way, no matter how big or complex the data gets, end users will have a much better shot at harvesting value from it.”

Exchange Data International (EDI), a premier back office financial data provider, today announced it has adopted Bloomberg’s Global Securities Identifiers (‘BBGID’) to name and track all equity securities in its Worldwide Corporate Actions service.

EDI is the latest financial data provider to adopt Bloomberg’s Open Symbology (BSYM), an open and free-use system for naming global securities across all asset classes with a BBGID, a 12-character alphanumeric identifier for financial instruments. EDI has implemented BBGID numbers in its equities reference, pricing and corporate actions data feeds. Its Worldwide Corporate Actions service provides detailed information on 50 corporate action event types affecting equities listed on 160 exchanges.
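For illustration, a structural sanity check on a BBGID might look like the sketch below. It assumes the commonly documented layout – a “BBG” prefix, eight characters drawn from digits and consonants, and a trailing numeric check digit – and does not implement the check-digit calculation itself:

```python
import re

# Structural pattern only (assumed layout): "BBG" prefix, then eight
# characters from digits and consonants (vowels excluded), then a digit.
BBGID_RE = re.compile(r"^BBG[0-9BCDFGHJKLMNPQRSTVWXYZ]{8}[0-9]$")

def looks_like_bbgid(identifier: str) -> bool:
    """Cheap first-pass filter before any authoritative symbology lookup."""
    return BBGID_RE.fullmatch(identifier) is not None

print(looks_like_bbgid("BBG000BLNNH6"))  # True
```

A real implementation would validate the check digit and resolve the identifier against the symbology service rather than rely on pattern shape alone.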

“EDI decided to integrate Bloomberg’s Open Symbology, as it is easily accessible and has no license fee or restrictions on usage,” said Jonathan Bloch, the Chief Executive Officer of EDI. “Bloomberg’s Symbology also advances straight-through processing of equity orders, which aids reporting and compliance management.”

Peter Warms, Global Head of Bloomberg Open Symbology, said, “Existing identifiers that change due to underlying corporate actions introduce inefficiencies, increase costs and add complexity to the data management process. Bloomberg and EDI recognise the importance of comprehensive, open and unchanging identifiers, like the BBGID, in enabling customers to track unique securities consistently and to process corporate action data seamlessly. As BSYM grows in adoption, interoperability across market systems and software using BSYM will improve steadily and reduce operational costs.”

NYSE Technologies, the commercial technology division of NYSE Euronext, and Xignite Inc., provider of web-based market data services, have announced their agreement to launch a new service providing access to real-time, historical, and reference market data for all NYSE Euronext markets via the Internet. In extending the benefits offered by the NYSE Technologies Capital Markets Community platform introduced in 2011, NYSE Technologies Market Data Web Services is geared towards non-latency sensitive clients and those in remote locations. The first phase offers real-time retail reference pricing for NYSE, NYSE MKT, and NYSE Arca markets.

NYSE Technologies Market Data Web Services, which is powered by XigniteOnDemand, allows clients the flexibility to access only the content that they need for a wide range of purposes from developing trading solutions for financial web portals to enabling Internet-powered devices. The user interface offers data services from across NYSE Technologies’ full portfolio of market data assets. The second phase scheduled for the third quarter of 2012 will offer NYSE Bonds data, NYSE Liffe Level 1 and Level 2 data, and NYSE and NYSE MKT Order Imbalances.

“Our goal is to connect data consumers directly to our content in multiple ways – via collocation at our Liquidity Centers, direct connection to our SFTI network and now via the web,” said Jennifer Nayar, Head of Global Data Products, NYSE Technologies. “We are pleased to partner with Xignite to address the demand for internet-based delivery of market data and as a result, further extend our client-base to non-latency sensitive and remote clients.”

Using a standard Internet connection, users can access NYSE Euronext market data and customize it according to their specific trading needs. Customers anywhere around the world, including those in remote locations, are able to access the data they need and develop against it with ease for fast time to market.

“The delivery of market data content via websites and mobile devices continues to build momentum and we are excited to leverage these applications to help increase access to NYSE Euronext data,” said Stephane Dubois, Xignite’s CEO and founder. “Both NYSE Technologies and Xignite have demonstrated a strong commitment to the electronic delivery of market data and the ability to serve today’s growing, diverse array of applications, especially the mobile market.”

The initiative with Xignite complements NYSE Technologies’ enterprise cloud strategy. NYSE Technologies Capital Markets Community Platform enables a range of industry firms and registered market participants to purchase computing power as needed, freeing them to focus on core business strategy rather than complicated IT infrastructure. NYSE Technologies Market Data Web Services provides clients with another market data delivery option for NYSE Euronext content, supporting current access methods offered by NYSE Technologies where direct connect clients and SuperFeed clients have the choice of collocating in NYSE Technologies’ Liquidity Center or connecting to its Secure Financial Transaction Infrastructure® (SFTI) network.

The report, Market Data Acceleration: More than Just Speed, also predicts 4.5% compound annual growth in these investments for the next three years based on expected growth in FX, Derivatives and Commodities as well as movement by Asian markets towards automation.

The largest segment of this investment, 73%, will come from Europe and North America, but according to Tabb Group, there’s considerable growth potential from the Asian markets.

Moreover, while the equities markets are matured from a growth perspective, driving 45% of the global spend, a strong percentage of growth will come from over-the-counter (OTC) derivatives, FX and commodities.

According to the report, market data is an area where performance can play a crucial role for a host of trading activities. Obtaining, decoding and utilizing market data in a timely and efficient manner are no longer the purview of the ultra-low-latency firms; everyone involved needs to be able to get at market data in as timely a fashion as possible.

“This is not to say that everyone needs to be at the ‘tip of the spear’; however, it does mean that anyone who is actively involved in trading needs to be moving in that direction,” said the report.

However, according to the research firm, firms are struggling with the conflicting pressures of the “need for speed” versus the “need to save” as they try to reconcile price with performance.

“Market participants need to ensure that their investment in speed gets them more than just a solitary solution for a single platform,” said Tabb partner and report writer Alexander Tabb in a statement.

Different firms, according to Tabb, have different strategies, thus different needs. Whether a firm is a high frequency trader, an institutional market maker, or an algo-trading desk, the challenge is placing speed into its proper context within the accelerated market data equation.

“Due to the democratization of speed, it’s essential for every buyer to remember to factor in total cost of ownership, price versus performance, operational flexibility, control, scalability and time-to-market,” says the report.

The idea of an LEI pre-dates the 2008 financial crisis by several decades. The ISO (International Organization for Standardization) had advocated an LEI (at one time called the IBEI – International Business Entity Identifier) for many years, but was unable to pinpoint an organization ready to build and maintain such a directory. For many securities industry participants, existing identifiers, such as the Bank Identifier Code (BIC), met most of the market’s needs.

The collapse of Lehman Brothers revealed the problems firms faced in readily identifying their counterparty exposure. With critical data residing in multiple, unconnected silos, many firms had no way to calculate their counterparty risk across front, middle, and back office systems. This damaged reputations, led to tremendous financial losses, unleashed lawsuits, and brought into focus the dire need for a system to uniquely identify entities.

On its 30th anniversary Bloomberg officially launched an updated $100 million version of its core terminal yesterday in London and New York simultaneously. The NEXT platform of the Bloomberg Professional Service is intended to give traders and financial services end users faster, deeper insights into the markets and to enable the market data terminal to answer questions more intuitively in future, not just present research and data, via an enhanced ‘natural language’ search function and ‘give me the answer’ front-end tool.

According to Tom Secunda, the co-founder and vice chairman of Bloomberg, speaking at the launch, “this is an evolutionary step” that helps order increasingly complex markets and aids productivity, while continuing the company’s mission to deliver on “Mike Bloomberg’s famous three-legged stool, consisting of news, data and analytics”. The company believes the NEXT platform consolidates and, crucially, integrates these feeds better than ever before, giving users easier access to the information that exists on the terminal and enhancing the customer experience. “For example, you can ask what was US CPI in 2000 …and bang, there is the answer.” Users can then drill down into the answer for further research, added Jean-Paul Zammitt, global head of core product development at Bloomberg, pointing out that this is the key presentational change in the NEXT platform, requiring every help screen and back end process to be rewritten and updated.

Under development for the last two years, the overhaul of the core terminal involved 3,000 technologists, Bloomberg asserts; the terminal is used by traders, analysts and even some large multinational corporate treasuries looking to hedge their foreign exchange exposure. A select group of existing clients, including OCBC Bank, Credit Agricole CIB and Glenhill Capital, were involved in the development phase, allowing Bloomberg to review common keystrokes and commands across an array of functions in order to improve the customer experience.

More than 100,000 clients have already converted to Bloomberg NEXT – at no extra cost beyond the £20,000-a-year outlay – since its ‘soft launch’ at the end of last year, with less than 1% converting back to their old terminal. The company said that two thirds of them are using the NEXT platform more than their old terminal and that it wants to convert its entire 313,000-strong subscriber base for the Bloomberg Professional Service by the end of this year.

“Bloomberg NEXT saves me time by discovering functions and data more quickly,” said Seth Hoenig, head trader at one of the ‘soft launch’ development partners, Glenhill Capital. “The new help menus enable users to find the answer that they need fast. Stumbling upon the hidden gems within Bloomberg has always been revelatory; now it’s easier.”

According to Lars Hansen, senior portfolio manager at Denmark’s DIP, the Danish Pension Fund for Engineers: “Bloomberg NEXT is a major step forward. It is much more intuitive – you can see multiple pieces of information on one screen, which lets you see new interrelationships.”

Bloomberg highlighted what it sees as three key improvements in its updated terminal:

• Better discoverability: Bloomberg NEXT’s new discoverability features allow users to get quick, direct answers to their queries as well as pull together a wide variety of related details such as companies, research and charts. A more powerful search engine means users can type just a few words and go directly to the desired securities, functions, people and news. The streamlined menu listing puts the most relevant information and topics right up front.

• More uniformity: Every screen of the Bloomberg Professional Service has been redesigned to provide a common look and feel. This consistent interface across all asset classes, from FX to commodities and fixed income, and across all functions should allow expert users and generalists alike to more efficiently navigate often-used functions and discover new ones. An educational overview of each market segment for novices is also included in the update.

• Intuitive workflow: The functionality of the Bloomberg Professional service has been re-engineered so that a user should be able to quickly and seamlessly navigate through the series of questions and answers essential to making smart market decisions. The new workflow, with user prompts, in Bloomberg NEXT is intended to allow expert users to drill deeper into the data and to let occasional users discover new functions.

“The complexity and interconnectedness of the global financial marketplace has grown significantly. Business and financial professionals need to synthesize astounding amounts of information to make intelligent investment decisions,” explained co-founder Tom Secunda. The firm is still a big believer in a single-product approach, he stressed at the official launch of NEXT, but this “obviously gives us challenges as markets get more and more complex.”

NEXT is Bloomberg’s response. “The pace of change in financial markets will only accelerate and with it the need for more information,” added Secunda, before concluding that he believes, “Bloomberg is now positioned to quickly answer those evolving questions and ensure that our clients will always have the leading edge in making investment decisions.”

News Analysis

Bloomberg’s new NEXT platform will go head-to-head against Thomson Reuters in the market data sector, which is increasing in value as financial markets get more and more complex and new post-crash regulations place new information demands upon market participants. Both companies are running neck and neck in terms of market data share, with estimates of 30% for each at present.

One terminal is proprietary, of course, with Bloomberg maintaining its closed market data platform in its NEXT iteration, while Thomson Reuters now follows an open access model with its Eikon terminal, allowing users to add their own data and applications. The relative failure of Thomson Reuters’ Eikon platform, which has sold only tens of thousands of units since launch rather than the hoped-for hundreds of thousands, is what prompted the open access model, although it does of course take time to build up a following. It will be interesting to see whether Thomson Reuters’ move allows the firm to win back lost market data share or whether Bloomberg’s updated terminal can keep it on its recent upward curve. The former is still benefiting from the 2008 merger that united Thomson Financial with Reuters, giving it synergies in data collection and delivery, but the competition between the two has just hotted up.