
CIO Influence Interview with Grethe Brown, CEO at DiffusionData

“Developers have their own tools and coding styles they like to use to build applications based on their own personal preferences.”

Hi, Grethe. Welcome to our Interview Series. Please tell us a little bit about your role and responsibilities at DiffusionData. 

In my most recent role I was a COO in the fintech industry, before I was headhunted to join Push Technology. After I had run operations alongside the Chairman of the Board for a period, he asked me to take up the post of CEO.

On joining the company, I canvassed opinions on a wide range of things, and one of the messages that filtered through was that neither the Push Technology name nor the website was working for us. These are big, brave things to change, but we conducted a strategy workshop and simply went with the findings.

We are now called DiffusionData. Not one person was against the realignment, because it was needed. Our product is called Diffusion – the name just makes sense. It creates an alignment between what the business is, what the product is, and what we do. We are proud of our new brand and website.

The company is exciting and the people who work here are excited about their product. The sales team and support staff speak of the product with pride – trust me when I say I’ve had very different experiences in this regard in the past. They’re a clever, amazing bunch of people, so I’m delighted to be here.

CIOs are increasingly looking at real-time data streaming and messaging capabilities for their business operations. Could you please tell us where DiffusionData fits on a CIO’s radar?

Everybody knows people are trying to move data quickly and in large quantities over the Internet. That’s what all the products in this general arena are trying to do, but there is a little more to the story. To do it successfully, you have to consume the data, do something to it, position it, and arrange for it to be delivered to the people who actually want it – in the format they want, and only the data they desire, not the whole kitchen sink.

Unfortunately, many companies hadn’t really looked at this up close. Having now spent an enormous amount of time with customers on this particular issue, we have determined that it takes platform intelligence all along the way. Every step of the data journey, from consumption to transformation to delivery, needs intelligence.

You can’t just shove data down the pipe, for a variety of reasons. Firstly, you waste a lot of bandwidth and infrastructure because you are sending data you don’t need to send. Secondly, you lose the ability to hyper-personalise: data delivery is high-volume and real-time, but everyone’s needs are different.

During the data journey, the systems need the intelligence to pull out the pieces that are appropriate to the end consumer without slowing down or impeding the flow of data. To achieve that, we have optimised the journey with intelligence every step of the way. We know what we’re taking in when we consume it, and it is formatted appropriately.

Then it goes through a transformation, and this can be lots of different things. You can take an entire trading flow and ask for just the FX data in Deutschmarks, not dollars, yen or anything else. Crucially, you do not want repeated data: if the information hasn’t changed, there is no reason for it to be sent again.

That capability, which we call Delta Data, relies on special algorithms we’ve developed to avoid sending repeat data, and it saves 90 per cent of the bandwidth used in distributing data. There are a number of pieces to this intelligence; it’s non-trivial, and I think it’s the key to the castle.
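
To make the idea concrete, here is a minimal sketch in Java of change-only publishing. It is not DiffusionData’s proprietary Delta Data algorithm, which the interview describes only at a high level; it simply illustrates the underlying principle that an unchanged value never needs to be resent. The class and topic names are invented for the example.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

/**
 * Toy illustration of the "don't resend unchanged data" principle.
 * Not DiffusionData's Delta Data implementation: values are simply
 * forwarded only when they differ from the last value sent on the
 * same topic.
 */
public final class ChangeOnlyPublisher {

    private final Map<String, String> lastSent = new ConcurrentHashMap<>();

    /** Returns true if the value changed and was therefore forwarded. */
    public boolean publish(String topic, String value) {
        String previous = lastSent.put(topic, value);
        if (value.equals(previous)) {
            return false;          // identical to the last update: suppress
        }
        forward(topic, value);     // first or changed value: send it on
        return true;
    }

    private void forward(String topic, String value) {
        System.out.printf("-> %s = %s%n", topic, value);
    }

    public static void main(String[] args) {
        ChangeOnlyPublisher p = new ChangeOnlyPublisher();
        p.publish("fx/EURUSD", "1.0841");  // sent (first value)
        p.publish("fx/EURUSD", "1.0841");  // suppressed (unchanged)
        p.publish("fx/EURUSD", "1.0842");  // sent (changed)
    }
}
```

A production system would typically go further, for example by sending only the changed portion of a structured value rather than suppressing whole updates.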

You recently announced Gateway Framework 1.0. Could you tell us about its unique capabilities and how it simplifies data streaming for customers?

Our Gateway Framework is a unique industry tool. Developers have their own tools and coding styles they like to use to build applications based on their own personal preferences. This means organizations end up with a plethora of applications that behave differently, which can create daily challenges and issues, especially if the person responsible for a specific set of code leaves the company. Deploying the Gateway Framework resolves this industry-wide problem, because adapters now look and behave the same. This saves time in production support: every adapter has the same architecture, and 90% of the code, already tested and production-ready, sits in the Framework. Not only does this mitigate the risk of bugs every time a new adapter is built, but the user also benefits from a consistent configuration.

The Gateway Framework removes the need for any significant Diffusion knowledge, providing a ‘low code’ solution to integration, where the application writer only needs to concentrate on the interactions with the external systems. In addition, the Gateway Framework is fully integrated with the Diffusion management console, allowing applications to be managed and monitored from the console.
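
To illustrate the shape of that ‘low code’ division of labour, here is a hypothetical sketch in Java. The interface and class names are invented and are not the actual Gateway Framework API; the point is that the framework owns the plumbing while the adapter author writes only the code that talks to the external system.

```java
import java.util.function.BiConsumer;

/**
 * Hypothetical sketch of the division of labour described above. The
 * names are invented for illustration; this is not the real Gateway
 * Framework API. The framework supplies the sink (and would own
 * connection, configuration, monitoring and publishing); the adapter
 * author writes only the external-system interaction.
 */
interface SourceAdapter {
    /** Begin reading from the external system, handing each update
     *  (topic path, value) to the framework-supplied sink. */
    void start(BiConsumer<String, String> sink) throws Exception;

    /** Stop reading and release any external-system resources. */
    void stop();
}

/** Example adapter: polls an imaginary price endpoint once per second. */
final class PollingPriceAdapter implements SourceAdapter {

    private volatile boolean running;

    @Override
    public void start(BiConsumer<String, String> sink) {
        running = true;
        Thread poller = new Thread(() -> {
            while (running) {
                // A real adapter would query the external system here;
                // this placeholder just emits a constant quote.
                sink.accept("prices/EURUSD", "1.0841");
                try {
                    Thread.sleep(1_000);
                } catch (InterruptedException e) {
                    return; // shut down promptly if interrupted
                }
            }
        }, "price-poller");
        poller.setDaemon(true);
        poller.start();
    }

    @Override
    public void stop() {
        running = false;
    }
}
```

Because everything except start() and stop() lives in the framework in a design like this, every adapter naturally shares the same architecture and configuration, which is the consistency described above.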

The release of the Gateway Framework marks an exciting new stage in the development of the Diffusion product, allowing easy integration of Diffusion with external systems of diverse types. It supports our continuing effort to make Diffusion easier to use, and lets customers implement low-code, Diffusion-based solutions very quickly. The framework will significantly cut development and maintenance costs for customers, and it opens up the possibility of a new community of Diffusion application developers.

Who are your customers? What IT services and support do you provide to customers that are just beginning their digital transformation journeys with data streaming platforms?

Over the last 15 years we have acted as a strategic partner, with our platform serving companies worldwide and across industries. In the gaming sector, for example, our customers include Caesars (one of the big five gaming companies currently listed on the S&P 500), William Hill, Microgaming, Sportingbet, Oddschecker, Betsson and others. Serving our blue-chip customer base, Diffusion powers $15+ billion in bets per year. This gives us a unique and experienced view into how to solve the event-data processing challenges that sports betting and gaming organizations face.

It is generally accepted that digital transformation means getting data, in large volumes and at speed, to those who want it, and giving each recipient exactly what they want. What we have determined is that this data needs to be filtered, and this is where the transformation piece comes in.

For a long time, people have been throwing the term digital transformation around, and everyone understood the end state. What wasn’t understood was the set of steps in the journey required to reach that end state successfully and cost-effectively. I believe we have cracked that code and understand what needs to be done.

To achieve this goal, there cannot be restrictions on data consumption, and low-code functionality is required to enable data enrichment and transformation. Most importantly, orchestrating the data journey through these stages requires platform intelligence, which removes the hassles of data handling from the development teams, enabling them to focus on critical business requirements and deliver applications to market faster.

You serve the financial industry too. How is providing data streaming technology to this industry different from serving the rest of the world?

Indeed we do. Intense competition, complicated regulations and increasingly sophisticated clients demand that financial institutions provide tailored, real-time services. Leading trading and retail banks, brokerages, and research firms rely on Diffusion to deliver. They use it to fuel event-data applications that provide superior customer experiences, high performance and significant cost savings.

We have clients such as Lloyds Bank (commercial), Tradition (trading and exchanges), Consorsbank! (retail), IHS Markit (investment), Signal Centre (trading research), and Baker Technology (trading applications).

Using Diffusion, exchanges can easily add new data streams, such as integrated FX and fixed-income systems, to improve scalability and reach new global markets. Global interdealer brokers, including TPICAP and Tradition, who together represent 49% of the sector’s revenues, use Diffusion to assure reliable data management, integration and delivery.

Please tell us more about the role of AI and machine learning in software development at your organization. How do you define new benchmarks for your product and marketing teams?

As companies seek competitive strategies to expand into new markets and geographies, they must shorten development timelines, assure reliable and secure performance, and reduce costs, in order to go to market ahead of the competition with their event-driven applications. Development teams are, more than ever, seeking a low-code approach to building their next-generation event-driven applications.

For companies to continue their digital transformation initiatives, turbocharged by AI, machine learning, and cloud architectures, they need a data platform to handle event stream processing. This is pivotal to the evolution of their infrastructures if companies are to be responsive to their users and deliver a hyper-personalized experience. It can only be achieved with technical intelligence, which will drive the industry forward as events increasingly shape the future of IT.

Your advice to every modern CIO on how to optimize their data management environments.

Companies require intelligent systems to save them development time and money, as well as to reduce the ongoing operational costs of building and running next-generation, mission-critical corporate applications. In addition, the desire for application agility is intensifying.

Companies are constantly under pressure to build and launch new features quickly to stay ahead of the curve. That is where low-code platforms step in: companies don’t want to code, they want to deploy out-of-the-box solutions quickly and efficiently. Having a platform that can manage load with ease and without spiralling costs is a very compelling proposition. Crucially, these products must be stable during peak events, which is a challenge. You don’t want to have to spin up cloud infrastructure that you can’t turn off quickly; the trick is to scale up fast without a huge bill at the end of it. Low-code platforms deliver on that promise and will be highly sought after in 2023, enabling companies to optimize their data management environments.

Your favourite customer case study that you would like to share with our readers. 

Tradition wanted to create a new Electronic Trading System and chose DiffusionData’s Data Platform to assure reliable, real-time data delivery to its institutional clients worldwide.

Tradition is the interdealer broking arm of Compagnie Financière Tradition, one of the world’s largest interdealer brokers of financial and non-financial products on OTC markets and #1 in continental Europe. With operating companies in 29 countries and more than 2,225 people around the world, Compagnie Financière Tradition acts as an intermediary providing broking services for a complete range of products. The Group’s client base primarily comprises banks and financial institutions around the globe.

Interdealer brokers improve price discovery and transparency by posting a bid, offer, and size of available securities for trading. Historically, this process was dependent solely upon people constantly on the telephone relaying information and maintaining anonymity between the buyers and sellers. The objective of the new Tradition platform was to electronically automate the interdealer broker process by matching buyers and sellers and allowing these traders to trade directly with one another, while each side’s identity remained hidden.
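
As a deliberately simplified picture of that anonymity requirement, the toy matcher below (in Java, with invented names, and not Tradition’s engine) keeps each order’s owner internal and exposes only price and size in the published match; partial fills and resting bids are omitted for brevity.

```java
import java.util.Comparator;
import java.util.PriorityQueue;
import java.util.Queue;

/**
 * Toy illustration of anonymous matching, not a production engine.
 * Orders carry their owner's identity internally, but a published
 * Match exposes only price and size, so buyer and seller remain
 * anonymous to each other.
 */
public final class AnonymousMatcher {

    record Order(String ownerId, long price, long size) {}
    record Match(long price, long size) {}  // deliberately identity-free

    // Resting offers, cheapest first, so peek() is the best offer.
    private final Queue<Order> offers =
            new PriorityQueue<>(Comparator.comparingLong(Order::price));

    public void offer(Order o) {
        offers.add(o);
    }

    /** Try to match an incoming bid; null means no crossing offer. */
    public Match bid(Order incoming) {
        Order best = offers.peek();
        if (best == null || incoming.price() < best.price()) {
            return null;  // no offer at or below the bid price
        }
        offers.poll();
        // Trade at the resting offer's price; identities stay internal.
        return new Match(best.price(), Math.min(incoming.size(), best.size()));
    }
}
```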

To build the new Electronic Trading System, the Tradition development team had to plan for their existing high data volume, for the expansion of that volume as the company’s client base grew, and for additional system functionality slated for the future. The data to be managed consisted of the bids, offers, and sizes of available securities for trading.

Tradition had to build an electronic trading platform that could easily scale, run over private networks and the Internet, and allow them to demonstrate the platform to prospective clients via the Internet. Also high on Tradition’s list of requirements was the flexibility to structure and transform data so that clients could access exactly the information they required, along with control for Tradition over entitlements and data access permissions.

Tradition turned to DiffusionData’s Data Platform to address these requirements and to simplify and speed the development of its new electronic trading platform. With Diffusion consuming, enriching and delivering the data, the challenging load on Tradition’s back-end systems has been eliminated.

The firm’s employees, and its bank and financial institution clients around the globe, receive reliable, real-time access to the data they need, thanks to Diffusion’s proprietary streaming and messaging technology. Hundreds of thousands of messages per second are delivered through Tradition’s multi-channel services.

Diffusion powers Tradition’s Trad-X multi-asset-class trading platform for OTC derivatives, the ParFX wholesale electronic spot FX trading platform, and the Torrent hybrid order management platform for trading foreign currency NDFs.

With DiffusionData’s Data Platform, Tradition eliminated the data management and delivery challenges faced by their development team. Further, as Tradition evolves and changes their back-end systems to meet increasing regulatory demands, the Diffusion Data Platform will minimize the impact of those changes on Tradition’s electronic trading platforms.

Thank you, Grethe! That was fun and we hope to see you back on cioinfluence.com soon.

[To participate in our interview series, please write to us at sghosh@martechseries.com]

Grethe has extensive experience in operations, R&D and commercial management for software companies across a variety of industries, including telco, fintech, and food & beverage. In addition to her work at young, high-tech companies, earlier in her career Grethe held software engineering and management positions with companies including Ericsson Ltd. She holds a BSc in mathematics from the University of Linköping in Sweden, and an MBA from the University of Surrey.


DiffusionData has pioneered and led the market in real-time data streaming and messaging solutions that dramatically reduce network bandwidth requirements, allowing customers to expand their businesses.

The company’s DiffusionData Platform consumes raw data of any size, format, or velocity; enriches the data in flight; and distributes it in real time – reliably and at massive scale, with secure, fine-grained, role-based access control. Diffusion is purpose-built to simplify and speed data-driven, real-time application development, reduce operational costs, and economically deliver hyper-personalized data at Internet scale.

Leading brands across industries, including financial services, transportation, energy, retail, healthcare, eGaming, and the Internet of Things, use the Diffusion Data Platform to drive customer engagement, fuel revenue growth, and streamline business operations. Diffusion is available on-premises, in the cloud, or in hybrid configurations, to fit the specific business, regulatory, and infrastructure requirements of the event-driven applications operating in today’s everything-connected world.
