
Freedom Intelligence Adds Tricks

A few years back, marketing technology experts often debated the virtues of “open” vs. “proprietary” database management systems. Open databases, meaning products that use the industry-standard Structured Query Language, were considered good because they did not lock users into a single vendor's applications. Proprietary databases, which use their own private query language, often were significantly faster and more flexible than their open competitors.

Today, the dust has settled. The open databases have won: It's just about impossible to sell a system built on a proprietary database. Vendors that do use such technologies position them as “query accelerators” that supplement rather than replace an open database. Or they bury the technologies deep within their systems and don't talk about them at all.

But it turns out that the “open” road still has some pretty high tolls at the exits. Even though SQL does allow the same application software to run on different databases, corporate information technology departments still find it painful to switch from one database to another. This is partly because database vendors add nonstandard features to SQL that improve performance but force application changes when the software moves to another database. But the chief reason is that databases are run by experts who usually specialize in a single product. This means that running multiple databases requires hiring more than one set of experts — a difficult and costly endeavor.

The practical result is an extremely strong bias in favor not just of open databases in general, but of whatever specific database is dominant within a particular installation. Even companies that already use several different databases are reluctant to further complicate their lives by adding a new one.

What this means for database software vendors is that SQL compatibility by itself does not truly eliminate the barriers to adoption. Even demonstrably superior performance is no guarantee of success. In recent years, quite a few technically impressive, SQL-compatible products — including Red Brick, HOPS, Sybase IQ and Mercantile IRE — have failed to win widespread adoption.

Freedom Intelligence (519/884-4491, www.freedomintelligence.com) faces this same challenge. The system builds a highly compressed, fully indexed database that can accept standard SQL queries and return results five to 10 times faster than standard relational databases. It does this with data compression, sorting and indexing techniques that are often applied in such systems, plus some special tricks that the vendor will not reveal.

Freedom Intelligence imports data from a conventional data source, either an existing relational database or a comma-separated flat file. The connection is made through standard Open Database Connectivity drivers, which are available for almost any likely source. The system automatically displays the ODBC data dictionary and lets the user pick which elements to import and index. Optimal indexes are built automatically, so setting up the system requires very little specialized technical support.
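The import flow described above — connect to the source, read its data dictionary, pick which elements to bring in — can be sketched in a few lines of Python. This is a minimal illustration only, using the standard-library sqlite3 module as a stand-in for an ODBC source (the actual driver, table and column names depend on the installation):

```python
import sqlite3

# Stand-in for an ODBC source: a small relational table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, first_name TEXT, state TEXT)")

# Read the "data dictionary": the elements the source exposes.
# (An ODBC driver would supply this through its catalog functions.)
columns = [row[1] for row in conn.execute("PRAGMA table_info(customers)")]
print(columns)  # ['id', 'first_name', 'state']

# The user then picks which elements to import and index --
# here, everything except a rarely queried field.
to_import = [c for c in columns if c != "first_name"]
print(to_import)  # ['id', 'state']
```

The point is that the setup work is a selection exercise against an existing schema, not hand-built index design, which is why little specialized technical support is needed.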

During the import process, the system both builds in the specified indexes and creates its own compressed copy of the original data. Having this copy available lets Freedom Intelligence output any data element, including items — say, first names — that are not necessarily indexed. It also lets Freedom Intelligence operate independently of the source data systems themselves, so queries and selections do not interfere with other activities.

But working from a frozen copy also means the data is not up-to-the-minute, as most operational systems and a growing number of marketing applications require. Data is loaded at about one gigabyte per hour. This should rise to two gigabytes per hour in the next release, due by September. Even at the faster rate, load time could pose problems for very large databases.

In some situations, load time can be reduced by using incremental updates rather than building each new database from scratch. This is somewhat limited, since the system can add and delete records but cannot modify existing ones. Incremental updates run at five to 10 gigabytes per hour, based on the total gigabytes in the new and existing database combined. The vendor said there is no degradation in performance after incremental loads.
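The load-rate figures quoted above imply some simple arithmetic worth making explicit. A sketch using the article's stated rates (one gigabyte per hour for a full build; five to 10 gigabytes per hour for incremental loads, measured against the new and existing data combined):

```python
def full_load_hours(gb, rate_gb_per_hour=1.0):
    """Hours to build a new database from scratch at the quoted rate."""
    return gb / rate_gb_per_hour

def incremental_load_hours(existing_gb, new_gb, rate_gb_per_hour=5.0):
    """Incremental rate is quoted against new + existing data combined."""
    return (existing_gb + new_gb) / rate_gb_per_hour

# For a 20 GB database (the largest current installation):
print(full_load_hours(20))            # 20.0 hours from scratch
print(incremental_load_hours(20, 2))  # 4.4 hours to add 2 GB at the slow end
```

So even though the incremental rate sounds much faster, it is charged against the whole database, and the advantage shrinks as the existing data grows relative to the new data.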

The Freedom Intelligence data set takes up one to three times as much space as the original raw data. This is efficient compared with conventional database indexes, but it is still a problem for very large installations. It has not yet been a critical issue, since the largest current installation holds just under 20 gigabytes of data. The next release is expected to handle several hundred gigabytes, although this is still much smaller than today's multi-terabyte enterprise data warehouses. Mindful of this limit, the vendor positions Freedom Intelligence as a tool to build specialized subsidiary data marts, not a replacement for the central warehouse itself.

Once the data is loaded, standard SQL-based tools can query it through ODBC. The current version of Freedom Intelligence supports most SQL functions but is missing some features for complicated subqueries and a few string functions. The next version is expected to support the full 1992 ANSI SQL standard. Still, the existing capabilities already make Freedom Intelligence more powerful than some index-based systems, which cannot do calculations on their compressed data and may have problems handling fields with large numbers of unique values.
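Because the engine accepts standard SQL through ODBC, existing query tools work against it unchanged. A sketch of the kind of statement involved, run here against Python's built-in sqlite3 rather than the product itself (the table and column names are invented for illustration); the HAVING clause uses a subquery, the category of SQL the current release only partly supports:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, 50.0), (1, 75.0), (2, 20.0), (3, 300.0)])

# Find customers whose total spending exceeds the average order value --
# an ordinary aggregate query with a scalar subquery.
rows = conn.execute("""
    SELECT customer_id, SUM(amount) AS total
    FROM orders
    GROUP BY customer_id
    HAVING SUM(amount) > (SELECT AVG(amount) FROM orders)
""").fetchall()
print(rows)  # [(1, 125.0), (3, 300.0)]
```

Any front end that can emit SQL like this through an ODBC driver could, in principle, point at the compressed database without modification.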

Freedom Intelligence also overcomes the limits that some specialized databases place on data structures: It can handle large numbers of tables, can join many tables in the same query without performance problems and is not limited to joins specified when the data is loaded. In addition, the system provides extensive text-search capabilities, including the ability to look for substrings and for groups of words in context. It also does well with hundreds or thousands of simultaneous users.
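The substring search described above maps onto ordinary SQL pattern matching. A minimal sketch, again using sqlite3 with invented names, of what such a query looks like to the calling application:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE notes (id INTEGER, body TEXT)")
conn.executemany("INSERT INTO notes VALUES (?, ?)", [
    (1, "called about renewal offer"),
    (2, "asked for catalog"),
    (3, "renewal declined, follow up"),
])

# Substring search: any note mentioning "renewal", wherever it appears.
hits = conn.execute(
    "SELECT id FROM notes WHERE body LIKE ?", ("%renewal%",)
).fetchall()
print([r[0] for r in hits])  # [1, 3]
```

In a conventional database a leading-wildcard LIKE forces a full scan; the claim here is that the compressed, fully indexed store answers such searches without that penalty.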

The current version of Freedom Intelligence was released in late 1998. It runs on Windows NT/2000 servers and has two live installations plus several pilot sites. The next release also will support Unix servers, which will be key to improving scalability and performance. Pricing is based on the number of users and amount of data; it begins at $50,000 for a few users and five to 10 gigabytes. A large, enterprisewide installation could cost $500,000 to $1 million.

David M. Raab is a partner at Raab Associates, Chappaqua, NY, a consultancy specializing in marketing technology evaluation. His e-mail address is [email protected]
