Tableau Software officially released Version 6 of its product this week. Tableau approaches business intelligence from the end user’s perspective, focusing primarily on delivering tools that allow people to easily interact with data and visualize it. With this release, Tableau has advanced its in-memory processing capabilities significantly. Fundamentally, Tableau 6 shifts from the intelligent caching scheme used in prior versions to a columnar, in-memory data architecture to increase performance and scalability.
Tableau provides an interesting twist in its implementation of in-memory capabilities, combining in-memory data with data stored on disk. One of the big knocks against in-memory architectures has been the limit imposed by the physical memory of the machine. Some products were known to crash if you exceeded that memory; others didn’t crash but slowed down so dramatically once memory was exhausted that they almost appeared to have crashed.
The advent of 64-bit operating systems dramatically raised the theoretical limits that existed in 32-bit operating systems. Servers can now be configured with significant amounts of memory at reasonable prices, but putting an entire warehouse or large-scale data set in memory on a single machine is still a stretch for most organizations. With Tableau 6, a portion of the data can be loaded into memory and the remainder left on disk. Coupled with the ability to link to data in an RDBMS, this provides considerable flexibility: data can be loaded into memory, kept on disk or linked to one of many supported databases, and as the user interacts with the data, it is retrieved from the appropriate location. Tableau 6 also helps manage and optimize the dividing line between in-memory and on-disk data based on usage patterns.
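To make the hybrid architecture concrete, here is a rough Python sketch of the general idea rather than Tableau’s actual engine or API: a store that keeps designated columns in memory, leaves others on disk (SQLite stands in here for any disk-backed format), and promotes frequently read columns into memory based on simple access counts. The class, file and column names are all hypothetical.

```python
# Illustrative sketch only -- not Tableau's engine or API, just the general
# shape of a hybrid column store: some columns live in memory, the rest stay
# on disk, and frequently read columns get promoted into memory.
import sqlite3
from collections import defaultdict

class HybridColumnStore:
    def __init__(self, db_path="cold_store.db"):  # hypothetical file name
        # On-disk tier; SQLite stands in for any disk-backed columnar format.
        self.disk = sqlite3.connect(db_path)
        self.disk.execute(
            "CREATE TABLE IF NOT EXISTS cold (col TEXT, idx INTEGER, val)"
        )
        # In-memory tier: column name -> list of values.
        self.hot = {}
        # Track how often each column is read, to drive promotion.
        self.reads = defaultdict(int)

    def load_column(self, name, values, in_memory=False):
        if in_memory:
            self.hot[name] = list(values)
        else:
            self.disk.executemany(
                "INSERT INTO cold VALUES (?, ?, ?)",
                [(name, i, v) for i, v in enumerate(values)],
            )
            self.disk.commit()

    def read_column(self, name):
        # Route the read to whichever tier currently holds the column.
        self.reads[name] += 1
        if name in self.hot:
            return self.hot[name]
        rows = self.disk.execute(
            "SELECT val FROM cold WHERE col = ? ORDER BY idx", (name,)
        ).fetchall()
        return [r[0] for r in rows]

    def promote_popular(self, threshold=3):
        # Crude stand-in for usage-based optimization: pull columns that are
        # read often off disk and into memory.
        for name, count in list(self.reads.items()):
            if count >= threshold and name not in self.hot:
                self.hot[name] = self.read_column(name)

store = HybridColumnStore()
store.load_column("region", ["East", "West", "East"], in_memory=True)
store.load_column("revenue", [1200, 950, 1800])
print(store.read_column("region"))   # served from memory
print(store.read_column("revenue"))  # served from disk
```

The point of the sketch is the routing: reads go to whichever tier holds the column, and usage statistics drive what gets promoted, which is the kind of in-memory/on-disk optimization Tableau describes.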
However, one place where this new architecture comes up short is the data refresh process. In Tableau 6, users must manually request a refresh of the in-memory data; ideally there would be an optional automated way to keep it up to date. The other thing I would like to see in Tableau 6 and other in-memory products is better read/write facilities. Although this version includes better “table calcs,” which can be used to display some derived data and support limited what-if analysis, there is no write-back capability that would let you use Tableau as a planning tool and record the changes you explore.
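To illustrate that distinction with made-up numbers, here is a minimal Python sketch of what a display-time table calculation and a simple what-if adjustment produce, and why neither persists anything back to the source.

```python
# Illustrative sketch with hypothetical data: a table calculation derives new
# values at display time, while the underlying rows are never modified, so
# there is nothing to write back to the source.
monthly_revenue = [100, 120, 90, 150]  # hypothetical source data (read-only)

# "Table calc"-style derived column: a running total computed for display.
running_total = []
total = 0
for value in monthly_revenue:
    total += value
    running_total.append(total)

# Simple what-if: apply a scenario assumption to the derived view only.
growth_assumption = 1.10
what_if_view = [round(v * growth_assumption) for v in monthly_revenue]

print(running_total)    # [100, 220, 310, 460]
print(what_if_view)     # [110, 132, 99, 165]
print(monthly_revenue)  # unchanged -- nothing is written back to the source
```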
Tableau 6 includes a number of other features beyond the in-memory capabilities. It now supports a form of data federation in which data from multiple sources can be combined in a single analysis; the data is joined on the fly in the Tableau server, a capability Tableau calls “data blending.” Users can also create hierarchies on the fly simply by dragging and dropping dimensions. And there are new interactive graphing features, such as dual-axis graphs with different levels of detail on each axis and the ability to exclude items from one axis but not the other, which can help correct for outliers such as the impact of one big sale on profitability or average deal size.
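Returning to data blending, here is a simplified Python sketch of the underlying idea, with hypothetical sample data rather than anything from Tableau: rows from two independent sources are matched on a shared dimension at query time instead of being pre-integrated in a warehouse.

```python
# A minimal sketch of the "data blending" idea, not Tableau's implementation:
# two independent sources are matched on a shared dimension on the fly.

# Source 1: transactions from a sales database (hypothetical sample data).
sales = [
    {"region": "East", "revenue": 1200},
    {"region": "West", "revenue": 950},
    {"region": "East", "revenue": 1800},
]

# Source 2: quota targets from a spreadsheet (hypothetical sample data).
quotas = {"East": 2500, "West": 1500}

# Blend on the fly: aggregate the primary source, then look up the matching
# value from the secondary source by the shared "region" dimension.
blended = {}
for row in sales:
    region = row["region"]
    blended.setdefault(region, {"revenue": 0, "quota": quotas.get(region)})
    blended[region]["revenue"] += row["revenue"]

for region, metrics in blended.items():
    pct = metrics["revenue"] / metrics["quota"] * 100
    print(f"{region}: {metrics['revenue']} of {metrics['quota']} quota ({pct:.0f}%)")
```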
This release also supports several new data sources, including the Open Data Protocol (OData), the DataMarket section of Windows Azure Marketplace and Aster Data, which my colleague recently assessed.
Version 6 also includes some IT-oriented enhancements. As Tableau has grown, its software has been deployed in ever-larger installations, which puts a focus on its administrative facilities. The new release improves management of large numbers of users, with grouping, privilege assignment and more specific selection and editing options. It also includes a new project view of the objects created and managed within Tableau. All of these advances help move it toward departmental- and enterprise-class analytics technology.
Overall, the release includes features that should be well received by both end users and IT. Tableau shares the end-user analytics category with QlikView 10, which I recently assessed, and Tibco Spotfire. I’ll be eager to see whether the company can push the in-memory capabilities even further in future releases. With its new in-memory computing and its blending of data across sources, Tableau clearly brings another viable option to the category of analytics for analysts.
Let me know your thoughts or come and collaborate with me on Facebook, LinkedIn and Twitter.
Regards,
David Menninger – VP & Research Director