Data is ‘the new oil’

Data, like oil, has huge inherent value, yet it is difficult to access and refine in order to realise that value.

The 20th century saw an explosion in oil extraction, and oil became the commodity that drove the world economy. The 21st century is seeing a momentum shift in economic power: those who hold it are those who are strongest in the digital world.

Businesses such as Amazon and Google are built on a ‘data first’ philosophy, using the data they generate to drive customer interaction with their brands. Consequently, they have grown their market caps to values that no one could have foreseen, and even the current values are somewhat theoretical, as no one truly knows the value of the data, and its applications, that such firms are creating.

From 2013 to 2020, the digital world is expected to grow by a factor of 10 – from 4.4 trillion gigabytes to 44 trillion. Like oil, data is worthless unless its inherent value can be extracted. Companies, from start-ups to market leaders like IBM, are working hard to find data’s version of fractional distillation and enable the conversion of these trillions of gigabytes of unstructured data into actionable business intelligence.

To stay ahead of the competition and avoid being eclipsed by data-centric new entrants, business leaders need to realise the value of the data within their businesses. Blockbuster vs Netflix makes for a great case study here. The truly great business leaders of the digital era are those that use the insights found within their business data to make better, more informed and incisive strategic decisions, and so create insight-driven businesses.

Context-rich, real-time analysis of the data that businesses create daily gives firms greater visibility of their organisation and its interactions with its marketplace and customers. This analysis highlights potential opportunities to be explored and exploited.

As well as internal motivations to drive better strategic decisions, external factors such as compliance, social media and digital communications channels are driving specific innovations in data analytics.

Synetec’s Anasight analytics platform provides organisations with a simple-to-use, single view of all communications data within their business. Its machine learning engine enables leaders to drive better behaviours, better governance and better decision making.

Specifically, the volume of data that financial firms generate is increasing as more and more of their transactions and customer engagement become digital. In response, the regulation of financial services firms is also increasing.

Regulators now place a huge emphasis on the monitoring and proactive, rather than reactive, analysis of firms’ communications, both internal and external. Firms must have stringent policies and procedures in place to minimise the risk of non-compliance through mis-selling or the use of unauthorised methods of communication with clients and colleagues.

The use of data and the implementation of the correct mix of technologies to proactively monitor communications is paramount to being compliant. The technologies deployed must be able to record multi-channel communications, store the content, allow access for retrieval and perform analysis.

The requirement to analyse omni-channel communications means that all data needs to be accessible and convertible into easily interrogated forms. This data can then be analysed for the presence or absence of key words and phrases, and for patterns relating to key events and client sentiment, to help drive strategic business decisions.
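As a concrete illustration of this kind of key-word and phrase analysis — a minimal sketch, not Anasight’s actual implementation; the watch-list and message handling are hypothetical:

```python
# Hypothetical watch-list of phrases a compliance team might flag.
WATCH_PHRASES = ["guaranteed returns", "off the record", "personal email"]

def flag_message(text: str) -> list[str]:
    """Return the watch-list phrases present in a message (case-insensitive)."""
    lowered = text.lower()
    return [phrase for phrase in WATCH_PHRASES if phrase in lowered]

messages = [
    "Call me on my personal email about this trade",
    "Quarterly report attached as discussed",
]
# Map each message to the phrases it triggered; empty list means no flags.
flags = {m: flag_message(m) for m in messages}
```

A production system would of course work on normalised records from many channels and add pattern and sentiment models on top, but the core idea — converting communications into an interrogable form and scanning them — is the same.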

Is your client MiFID II compliant?

The client is a company that manages foreign exchange transactions for clients across EMEA. Their traders provide expert risk management at market leading rates. The company provides a suite of online capabilities, structured products, customised analysis and consultancy services.

As an update to the 2007 Markets in Financial Instruments Directive, MiFID II applies to financial industry players that operate in, or do business with, European firms providing investment services, such as investment banks, portfolio managers and brokers. The new directive seeks to increase market stability and confidence; the detail required in transaction reports therefore increases, including personal details identifying execution and investment decision makers, LEIs (unique legal entity identifiers) and trade details. As a financial company, the client must become compliant with MiFID II, and the solution chosen for achieving this was integration with the Bloomberg services BTRL and RHUB.

Bloomberg Trade Repository (BTRL), approved by the European Securities and Markets Authority (ESMA) as a reporting house for derivatives trades, accepts reporting for commodities, credit, FX, equities and interest rate derivatives trades mandated under the European Market Infrastructure Regulation (EMIR).

Bloomberg is authorised by the UK Financial Conduct Authority (FCA) for its Approved Reporting Mechanism (ARM) and also for its own Approved Publication Arrangement (APA).

Bloomberg’s ARM allows companies to enrich their transaction reports with data from Bloomberg or third-party order management systems. BTRL is also integrated with Bloomberg Regulatory Hub (RHUB), a platform that connects to the Bloomberg APA and ARM to help firms comply with their regulatory reporting obligations.

Under MiFID II, financial companies are required to make public through an APA certain transaction information, and Bloomberg’s APA will allow the publication of required trade details when MiFID II goes live.

Synetec’s solution for these compliance regulations consists of a specific implementation for each type of report required by the national competent authorities (NCAs).

The EMIR implementation involves sending the trade and company details as PGP-encrypted XML files with specific fields, over SFTP. The report file is uploaded to Bloomberg’s BTRL and processed, and the results of the processing are available within the Compliance module of the client’s order management and trade execution system for further amendment where necessary.
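Purely as an illustrative sketch — the real BTRL schema is defined by Bloomberg, and the element and field names below are invented — assembling such an XML report before encryption and upload might look like:

```python
import xml.etree.ElementTree as ET

def build_emir_report(trade_id: str, lei: str, notional: str, currency: str) -> bytes:
    """Build a toy trade-report XML document (illustrative element names only)."""
    root = ET.Element("TradeReport")
    ET.SubElement(root, "TradeId").text = trade_id
    ET.SubElement(root, "ReportingCounterpartyLEI").text = lei
    ET.SubElement(root, "Notional").text = notional
    ET.SubElement(root, "Currency").text = currency
    return ET.tostring(root, encoding="utf-8", xml_declaration=True)

xml_bytes = build_emir_report("T-1001", "5493001KJTIIGC8Y1R12", "1000000", "EUR")
# In the real flow these bytes would then be PGP-encrypted and uploaded over SFTP.
```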

The ARM implementation also involves sending the trade details as files over SFTP, in this case in CSV format with a specific header and content. The report file is uploaded to Bloomberg’s RHUB and processed, and the results of the processing are available through the Bloomberg terminal. Users can see the successfully submitted trades and the trades that need further modification; for the latter, amendments can be made using the client’s system and are automatically resubmitted.
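As a hedged sketch of this step — the real RHUB specification defines the exact columns, and the header below is invented for illustration — generating such a CSV report might look like:

```python
import csv
import io

# Illustrative column names only; the real specification differs.
HEADER = ["trade_id", "isin", "price", "quantity", "trade_datetime"]

def build_arm_csv(rows: list[dict]) -> str:
    """Render trade rows as CSV text with the required header line first."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=HEADER)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

csv_text = build_arm_csv([{
    "trade_id": "T-1001",
    "isin": "GB0002634946",
    "price": "1.1025",
    "quantity": "250000",
    "trade_datetime": "2018-01-03T10:15:30Z",
}])
# The resulting file would then be transferred to RHUB over SFTP.
```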

The APA implementation involves sending the post-trade details for a trade and counterparty as a message, using the Financial Information eXchange (FIX) Protocol. FIX was chosen because it is the global messaging standard across the financial industry for pre-trade, trade and post-trade communication. The required fields are sent from the client’s system to Bloomberg and, after processing, the trade is published to Bloomberg’s APA website. Any necessary amendments can be made by the client from within its own system.
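The FIX tag=value wire format itself is public: fields are separated by the SOH character (0x01), tag 9 carries the body length, and tag 10 carries a three-digit checksum (the byte sum of the preceding message, modulo 256). A minimal sketch of assembling such a message — the body tags chosen here are illustrative, not the actual fields Bloomberg requires:

```python
SOH = "\x01"  # FIX field delimiter

def build_fix_message(fields: list[tuple[int, str]]) -> str:
    """Assemble a FIX message: standard header, body, trailing checksum.

    `fields` is the message body as (tag, value) pairs.
    """
    body = "".join(f"{tag}={value}{SOH}" for tag, value in fields)
    # Tag 8 = BeginString, tag 9 = BodyLength (bytes after the tag-9 field,
    # up to but not including the tag-10 checksum field).
    head = f"8=FIX.4.4{SOH}9={len(body)}{SOH}"
    # Tag 10 = CheckSum: byte sum of everything before it, modulo 256.
    checksum = sum((head + body).encode("ascii")) % 256
    return f"{head}{body}10={checksum:03d}{SOH}"

# Hypothetical post-trade fields (35=8 is an Execution Report in FIX).
msg = build_fix_message([(35, "8"), (55, "EUR/USD"), (31, "1.1025"), (32, "250000")])
```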

The main challenge encountered during Synetec’s implementation was determining exactly which data needed to be sent in each of the required fields, for the different report and instrument types required by the NCAs. This challenge was overcome by Synetec’s team of project managers, business analysts and software engineers working closely with the client and Bloomberg’s implementation team.

Through a deep understanding of the MiFID II reporting requirements and the technical ability to deliver the integration quickly, Synetec’s team was able to provide a bespoke solution that ensured the client met its regulatory obligations in a very short space of time.


Written by Constantin Huiung, a software engineer at Synetec, one of the UK’s leading providers of bespoke software solutions

Entity Framework Core

In a recent article about upgrading legacy applications using .NET Core, we mentioned a few different technologies that, as explained, we could not cover in detail individually. This article provides more information about Entity Framework Core and why we decided to use it at Synetec.

Entity Framework Core is, in essence, a lightweight version of the popular Entity Framework. For readers unfamiliar with it, Entity Framework is an object-relational mapper (ORM), which means that developers don’t need to write most of the data access code needed to communicate with the database. One of the big benefits is that you can have engineers with very little or no experience writing SQL. There is debate within the community about the benefits of using an ORM versus simply sticking with SQL code and stored procedures. As we mentioned in the previous article, there is no perfect solution, and the technical decisions you make should be driven by what you want to achieve with your applications.


Let’s look at some advantages of using Entity Framework Core, and ORMs in general:

  1. Productivity: as developers don’t need to worry about writing the data access code, they have more time to focus on feature development itself
  2. Application design: an ORM is a tool designed by experienced engineers and architects, which means that to take full advantage of it you need to adopt good programming practices in your applications
  3. Code reuse: writing a class library to access your entities is a good approach; you need to write it only once and can reference the library from any application that needs it
  4. Maintainability: if the ORM is integrated properly with a good architecture, changing your database schema doesn’t force you to rewrite the business logic or how your entities are used across your applications.

Of course, these advantages come with some disadvantages:

  1. Understanding: developers who are not very curious by nature won’t understand what the code is actually doing behind the scenes
  2. Control: you have less control than when you write plain SQL
  3. Performance: as the SQL is generated by the tool, it is more complicated to control the performance of the data access, and for complex queries an ORM will fail to compete with hand-written SQL, particularly as SQL stored procedures are pre-compiled and ORM queries are not.

As with .NET Core, Entity Framework Core is a brand new tool, not just a simple upgrade of Entity Framework 6. The great benefit is that the tool can start its journey on good foundations; a disadvantage, of course, is that it is not yet as mature as its predecessor.


If you want the details of how to integrate the tool, a Google search will give you more information than you need on the first page. The focus of this series of articles is to analyse the technologies we are currently adopting at Synetec and to explain the decisions we made. So don’t hesitate to subscribe to receive more news about exciting technologies.

Written by Tarik Miri




509 The Print Rooms
164-180 Union Street
London, SE1 0LH
Phone: 0208 1444 206


Important: The information contained in this website is for general information purposes only. Synetec Ltd endeavours to keep it up to date and correct, but any reliance you place on such information is strictly at your own risk.
All images are copyrighted to their respective owners.