Quartet FS Releases Benchmark Results for ActivePivot
Quartet FS, a provider of business intelligence tools combined with complex event processing (CEP) technology, has announced benchmark results for its risk analysis software ActivePivot.
Following a project completed for a large global investment bank, ActivePivot has significantly reduced the time it takes to parse, load and aggregate value-at-risk (VaR) data, to the point where the work fits easily within a bank's end-of-day window, making intra-day VaR reporting feasible.
Based on the performance and memory figures from this project, which centred on VaR reporting requirements and used only an 8-core Intel Nehalem server with 64 GB of RAM, ActivePivot parsed, aggregated and loaded into its hypercube eight days' worth of VaR data, or 320 billion P&Ls, in less than 80 minutes, while using only 42 GB of memory. This means ActivePivot loaded approximately 66,000 complex objects per second into the OLAP hypercube, each object being a trade with all its dimensions and measures, including 1,000 simulated P&Ls. As a result, each subsequent day's VaR data (32 billion P&Ls) is now loaded in less than nine minutes, compared with the two hours previously required.
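The shape of the calculation can be pictured with a short sketch: in simulation-based VaR, each trade carries a vector of simulated P&Ls (here 1,000, matching the benchmark's figure), vectors are summed across any aggregation level, and VaR is read off as a percentile of the aggregated vector. This is a minimal illustration of the general technique, not Quartet FS's implementation; the function names and sample figures are invented.

```python
import numpy as np

def aggregate_pnl(trade_pnls):
    """Sum per-trade simulated P&L vectors into one portfolio vector.

    VaR is not additive across trades, so aggregation must happen
    on the raw P&L vectors before the percentile is taken.
    """
    return np.sum(trade_pnls, axis=0)

def value_at_risk(pnl_vector, confidence=0.99):
    """99% VaR: the loss exceeded in only 1% of scenarios."""
    return -np.percentile(pnl_vector, 100 * (1 - confidence))

# Illustrative data: 5 trades, each with 1,000 simulated P&Ls,
# mirroring the per-trade vector size quoted in the benchmark.
rng = np.random.default_rng(42)
trade_pnls = rng.normal(loc=0.0, scale=10_000.0, size=(5, 1_000))

portfolio_pnl = aggregate_pnl(trade_pnls)
var_99 = value_at_risk(portfolio_pnl)
print(f"99% VaR: {var_99:,.0f}")
```

Because only the vector sums are needed at each level of the hierarchy, this style of aggregation parallelises well, which is what makes throughput figures like those above achievable in memory.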
Users now experience sub-second query response times and can 'slice and dice' through multiple hierarchical dimensions and levels using Excel's pivot-table functionality or the ActivePivot Live browser-based graphical user interface, viewing VaR, and a host of other bank-defined measures, at any level required.
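The 'slice and dice' interaction amounts to rolling leaf-level measures up a dimension hierarchy and drilling back down. A small pandas sketch conveys the idea, with pandas standing in for the OLAP hypercube; the desk and book names and all figures are invented for illustration.

```python
import pandas as pd

# Invented sample rows: each row is one trade's measures at leaf level.
trades = pd.DataFrame({
    "desk":     ["Rates", "Rates", "FX", "FX", "Credit"],
    "book":     ["Swaps", "Swaptions", "Spot", "Options", "CDS"],
    "notional": [5e6, 2e6, 8e6, 3e6, 4e6],
    "pv":       [12_000, -4_500, 30_000, 7_200, -1_100],
})

# Roll the measures up the desk -> book hierarchy, the same drill
# path a pivot table exposes to end users.
cube = pd.pivot_table(
    trades,
    values=["notional", "pv"],
    index=["desk", "book"],
    aggfunc="sum",
    margins=True,  # grand total row: the top of the hierarchy
)
print(cube)
```

Additive measures such as present value can be summed this way; as noted above, a measure like VaR instead has to be recomputed from the aggregated P&L vectors at each level.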
Xavier Bellouard, founder and managing director at Quartet London, said: “As investment firms get to grips with the new reporting requirements around VaR that are required following the financial crisis, many risk managers simply do not have the right technology in place to deliver in-depth VaR analysis in an appropriate or timely fashion. Ultimately, speed and flexibility are crucial to an institution’s ability to create their own views, reports and blotters and to ‘slice & dice’ their risk data. With such high volumes of data, powerful memory management is crucial and we have proved our aptitude to handle this.”