“If everything seems under control, you’re not going fast enough.”

With the rate of data growth continuing to accelerate and market disruption becoming the new normal, companies might wonder how they can react fast enough. David Reed spoke recently to Aaron Auld, CEO of in-memory analytics vendor EXASOL, about how its solution helps clients to keep pace – and how the company itself is coping with a changing marketplace.

Mario Andretti’s saying about speed and control reflects an important truth about Formula 1 cars – they operate optimally when right at the limits of aerodynamic and mechanical traction. Many companies now feel they, too, are hurtling around a track in pursuit of their goal, while pushing the boundaries of their capabilities. From decision making to its supporting technology, the pace is testing – and even breaking – existing processes.

There is, however, one aspect of big data analytics which, like a Formula 1 car, actually performs better the faster it runs – massively parallel processing (MPP). When EXASOL was formed as a spin-out from a university project in Germany, MPP was still an exotic beast – now the company (working with Atheon Analytics) has even managed to shrink its analytical engine down to fit on a four-inch square portable Intel device that can be used for proof-of-concept deployments.

Auld argues that this is almost a case of waiting for the rest of the world to catch up. “What has made it easier for us is that the product works so well. That goes back to the data model created 15 years ago and the way the product has been developed since. EXASOL runs very fast on one node – as soon as you start to scale it up, the performance increases linearly. The way we built it means users get increased performance without increased costs.”
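The scaling behaviour Auld describes rests on a familiar divide-and-combine pattern: each node aggregates its own partition of the data independently, and only the small partial results are merged at the end. The sketch below illustrates that idea in miniature using Python's standard `multiprocessing` module – it is purely illustrative of the MPP principle, not a representation of EXASOL's actual engine, and the function names are invented for the example.

```python
from multiprocessing import Pool

def partial_sum(chunk):
    # Each worker aggregates its own partition independently,
    # with no coordination needed until the final merge.
    return sum(chunk)

def parallel_sum(data, workers):
    # Split the data into roughly equal partitions, aggregate
    # them in parallel, then combine the partial results --
    # the same pattern an MPP database applies across nodes.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with Pool(workers) as pool:
        partials = pool.map(partial_sum, chunks)
    return sum(partials)

if __name__ == "__main__":
    data = list(range(1_000_000))
    print(parallel_sum(data, 4))  # same result as sum(data)
```

Because the workers never need to talk to each other mid-query, adding nodes adds capacity almost proportionally – which is the "linear" scaling the interview refers to.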

This has become an increasingly important consideration with data volumes growing two- or threefold each year, threatening to outstrip processing capabilities as a result. When EXASOL launched in 2008, and for its first five years, the competition was classic data warehousing architecture built by the likes of IBM, Oracle and SAP. With the explosion of big data from mobile and social, alternatives like Hadoop have come into (and even gone out of) favour.

“It becomes increasingly difficult for companies to plan for the resource they are going to need in two or three years’ time,” says Auld. “That influences their decision making because they have to think about what they build on. We can scale linearly up to very high data volumes which makes us a cheaper option for CTOs and CIOs.”

A perfect example is King, which, as delegates at last year’s DataIQ Link heard, needs to understand its 1.5 billion daily game plays and what they mean commercially and for product development. Running a 200TB Hadoop data warehouse, it recently decided to migrate to an EXASOL solution which offers plug-and-play operation, a four-fold storage increase and a continuation of the analytical performance it had already become used to.

To Auld, this is “a typical example of what you can do with an analytics engine” and working proof that data science can have a big impact when given the right operating environment. “Data scientists need to have certain guidelines for what you want them to achieve, but the goal is to throw away the constraints. To do that, you need compute power, performance and flexibility,” he says.

“King is finding that EXASOL gives them a way of working with their data that brings them gains. It is a data science machine, yet they also use it for low-level reporting. They can get information in seconds. They can generate BI without having to make changes to the data or the need for static numbers,” explains Auld.

For many organisations hoping to scale up their data and analytics operation, one of the challenges is the problem of moving data in and out. Vendors know the pain – and have typically responded by effectively locking-in clients through aggressive pricing, bespoke data models or poor integration with other solutions. Where Auld’s company has been scoring is through its ability to be loaded from multiple different data sources and then to interrogate them through almost any front-end tool the user wants to deploy. EXASOL has a strategic partnership with Tableau, but can be used underneath Cognos, MicroStrategy, SAS and many more. 

Easing data in-out problems helps the vendor to gain buy-in from companies facing the challenge of fast-changing environments. Says Auld: “Large organisations have a lot of resource and development power, but change takes a long time and sometimes they can’t do it. We are seeing cultural shifts where a tech-savvy individual is looking at best practice and best-of-breed. That is where our opportunities arise because we can make progress without disrupting everything else.”

It is not just clients’ businesses that are accelerating their transformation. EXASOL has undergone its own quickening as clients increasingly scale up. Typically, deployments start small and then grow. As a result, Auld says the market for his company’s solution is moving quicker than ever: “We have shortened our sales cycle from nine to five months in three years. It was 15 months when I started ten years ago. Companies that recognise the value of turning data into insight are making faster decisions.”
