Can Your Business Afford ‘Alternative Facts’?


Kellyanne Conway, part of the Trump administration, first introduced the term ‘alternative facts’ into mainstream media while defending White House Press Secretary Sean Spicer, who had fired back at the media over their reporting of the crowd size at President Donald Trump’s inauguration, claiming it was the “Largest Audience Ever to Witness an Inauguration, Period”. The reality, however, looked rather different in the photographic evidence provided by the media, the U.S. National Park Service and other nonpartisan outlets. Pushing politics aside, this type of misrepresentation is, I would argue, analogous to what we see in the commodity data management industry that DataGenic has been proud to serve for the last 15 years.

Over the years, and even today, I hear and read of companies, such as C/ETRM vendors and desktop solution providers, claiming to offer ‘enterprise data management solutions’. This frustrates our whole team, and even more so some of the unfortunate companies that have succumbed to the razzle-dazzle of the marketing literature and smoke-and-mirrors sales demos. I understand that the landscape is competitive and everyone is trying to get a piece of the pie, but some of the marketing literature out there stretches even the term ‘alternative facts’.

How to avoid ‘fake data management’ promises

Before procuring any system in the marketplace, I would strongly encourage some hard-nosed due diligence, including gaining a detailed understanding of the system. This can be achieved through RFIs or RFPs and, more importantly, a well-qualified Proof of Concept (POC) with clear objectives in mind and transparent KPIs. Furthermore, taking references from the vendor’s existing clients should also be part of the procurement due diligence process.

However, as procuring a data management system is normally a committee decision, it is not uncommon at some stage of the engagement to be asked by a client why they should invest in a dedicated data management solution, rather than receiving their data direct from source via a C/ETRM vendor or desktop solution provider. Sometimes these questions, and the subsequent answers, provide a comfort blanket for consolidating and unifying the team’s decision. Other times the questions have an edge to them, querying why the client should invest this money when their current desktop or C/ETRM vendor can supposedly do the job just as well for a smaller investment, or even as part of the deal they are on. Well, here are just some of the reasons why:

Poor data = poor and costly decisions

Data is the lifeblood of any business, and customers require a new level of information maturity to transform into a more data-responsive organisation.


It is beyond doubt that an organisation which acknowledges and invests in one of the most underrated of corporate assets – data – can gain an edge in today’s competitive marketplace. The degree of advantage is the sum of many things, including: data responsiveness; data timing; data integrity; data consolidation (views); data relevance; data universe; data operators; and execution of corporate and operational actions. Achieving excellence in this area is a continual challenge for any organisation. However, vision, a clear mandate, tight initial planning and goals, the right financial resources and buy-in at corporate level all form a crucial part of the contribution to an organisation’s success.

Achieving this new level of information maturity and a more data responsive organisation requires a technical, functional and operational review of the business, which will dictate the architecture and business road map to follow.

What good data management looks like

Below are some of the key areas that anyone in the market for a commodity data management solution may want to consider, to ensure they have established the actual ‘facts’ rather than ‘alternative facts’:

  • Central Market Data Repository: Optimised for time series and forward curves, a dedicated data management solution acts as a central repository for all market data, both pricing and fundamentals, enabling quick access to and retrieval of the full complement of historical data for each source (irrespective of frequency). ETRMs are not designed for this purpose.
  • Up- & Down-Stream Integrations: A successful data management solution also acts as the central source and integration point for multiple downstream systems, not just your chosen ETRM. Individual data loaders to each application would multiply effort, costs and the number of potential failure points. At DataGenic we have experience of integrating and mapping with numerous such systems (SAP, Matlab, TrayPort).
  • Automation & Standardisation: Data delivered via a central data hub should be automated and standardised, giving you one channel of data from all the required sources. The data is timely, validated, enriched (e.g. mids calculated, absolutes/relatives applied), and corrections are issued and versioned within the system. The data flow is fully transparent and the client should be proactively notified of any issues. Outsourcing this process relieves the pressure both on internal resources and on any assistance required by the ETRM system.
  • Official Partnerships with Data Providers: Your chosen commodity data management solution provider should have official partnerships with the data providers and direct access to their support teams. In addition to enabling them to fully support data delivery, this means they receive direct notification of any methodology or liquidity changes, and should therefore be able to onboard new quotes as soon as they become available.
  • Metadata Support: Any data management provider worth the title should provide full metadata properties, including calendars, conversion rates, time zones etc. Crucially, this would not be provided and maintained in a feed direct from source.
  • Data Quality Checks: These are not a common facility within an ETRM, making it difficult to create customised validation routines and reduce data errors.
  • User Roles & Management: Any enterprise-level data management system should be designed for concurrent users to maintain their own studies, conduct calculations, save formulas to the server, etc., and be able to mirror/match existing user roles and access rights. Again, although this is possible in an ETRM, it is by no means the optimal place to conduct such activity.
  • Forward Curves: The majority of our clients prefer to create their forward curves outside of the ETRM, so they are able to generate and automate any type of curve using the rules packages, maintain versions, undergo full testing, benefit from a full audit trail and apply dependable calendars. They can then control and add to the curve portfolio as they wish. ETRMs aren’t designed for such self-sufficiency and often require the use of their consultants.
  • Extensive Data Feed Library: The more data feeds your provider of choice already handles, the better! In our case, we have over 600 data feeds already linked to our Central Data Hub, meaning the range of data immediately available to clients with no configuration required is second to none. In addition, new sources can generally be added and made available with full history in approximately five working days.
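To make the validation and quality checks described above a little more concrete, here is a minimal sketch of the kind of automated routine a dedicated data management system might run on an incoming price series. This is purely illustrative: the function name, thresholds and data shape are assumptions for the example, not DataGenic’s actual implementation.

```python
# Hypothetical sketch of automated data quality checks on a daily
# price series: flag missing values, suspicious day-on-day spikes,
# and stale (unchanged) runs before data is published downstream.

def validate_series(prices, max_jump_pct=25.0, max_stale_run=3):
    """Return a list of (date, reason) issues found in the series.

    prices: list of (date, value) tuples, ordered by date.
    """
    issues = []
    stale_run = 0
    for i, (date, value) in enumerate(prices):
        if value is None or value <= 0:
            issues.append((date, "missing or non-positive value"))
            stale_run = 0
            continue
        if i > 0 and prices[i - 1][1]:
            prev = prices[i - 1][1]
            jump = abs(value - prev) / prev * 100
            if jump > max_jump_pct:
                issues.append((date, f"spike: {jump:.1f}% day-on-day move"))
            stale_run = stale_run + 1 if value == prev else 0
            if stale_run >= max_stale_run:
                issues.append((date, "stale: value unchanged for several days"))
    return issues

series = [("2017-03-01", 50.0), ("2017-03-02", 50.0), ("2017-03-03", 50.0),
          ("2017-03-06", 50.0), ("2017-03-07", 80.0), ("2017-03-08", None)]
for date, reason in validate_series(series):
    print(date, reason)
```

In a production system such checks would be configurable per feed and per commodity, with flagged points routed to an exceptions queue rather than silently dropped, so that corrections can be issued and versioned as described above.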

The above summary is based on questions we are frequently asked during demos. Are there any other features or functionalities you think are crucial when deciding whether to procure a dedicated commodity data management solution? If so, feel free to let us know in the comments below.

In the meantime, if you’d like to see our very own, award-winning commodity data management solution in action, book your demo now! 


Picture credit: Bob Englehart