SQLite: Not Faster, Not Better, But Cheaper?

Actian Corporation

July 2, 2020

Understanding SQLite’s Total Cost of Ownership (TCO)

Over the past three months, this blog series has explored why developers gravitated toward SQLite for embedded data management. Some developers chose SQLite because members of the extended team knew SQL and wanted to leverage that knowledge to support data management or the extraction of data for visualization and reporting. Most developers, though, adopted it to overcome the limitations of existing flat file management systems.

That all makes sense in hindsight. The adoption of new products and technologies very often turns on the answer to a simple question: is this replacement an improvement over what I’ve got now? Even more categorically, is the new thing faster, better, or cheaper? An ideal replacement would be faster, better, and cheaper, but that’s a trifecta that usually eludes us. Rarely, though, does a proposed change take hold unless at least one of these characteristics is present. So what prompted the adoption of SQLite? Was it faster, better, or cheaper than the available alternatives? And if it was any of these things, is it still faster, better, or cheaper than the database alternatives that are available today?

Faster?

Once, yes, SQLite was faster—compared to operations involving a flat file. Today? Hardly.

SQLite is positively lethargic on several fronts. In a head-to-head comparison of SQLite and Actian Zen Core, data access via SQL may be comparable, but accessing the same data using the NoSQL API of Actian Zen delivers an order-of-magnitude boost over SQLite. Or consider speed in terms of the kinds of optimized client-server interactions demanded by applications in the realm of modern edge data management. Client-server interactions in IoT and mobile scenarios depend on high-performance data collection and the inline processing of transactions—from multiple external channels. But because SQLite operates exclusively in a serverless mode, the data must be transformed (the “T” in ETL) before it can move to or from any server-based companion, such as Microsoft SQL Server. That step not only incurs a measurable performance hit, but it also creates a potential chokepoint that can constrain application scalability. Add into the mix a requirement for data encryption and decryption as part of that client-server transformation—and is there really a question about whether encryption will be required in any modern edge data management scenario?—and you can see the speedometer on the SQLite dashboard slipping further back towards zero.
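To make that serverless constraint concrete, here is a minimal Python sketch of the extract-and-transform step: rows collected in a local SQLite file must be read out through the library and reshaped before they can be loaded anywhere server-side. The table schema, the unit conversion, and the `send_to_server()` placeholder are all invented for illustration, not taken from any real pipeline.

```python
import sqlite3

# Edge device collects readings into a local, serverless SQLite store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (sensor_id TEXT, temp_c REAL, ts INTEGER)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?, ?)",
    [("s1", 21.5, 1000), ("s2", 19.0, 1001)],
)

# Extract: every row has to come back out through the SQLite library.
rows = conn.execute(
    "SELECT sensor_id, temp_c, ts FROM readings ORDER BY ts"
).fetchall()

# Transform: reshape into whatever format the server-side loader expects
# (here, dicts with Fahrenheit values; a real pipeline might also encrypt
# each record before it leaves the device).
payload = [
    {"sensor": sid, "temp_f": temp * 9 / 5 + 32, "ts": ts}
    for sid, temp, ts in rows
]

# Load: hand the batch to the server-side channel of your choice.
# send_to_server() is a hypothetical placeholder, not a real API.
# send_to_server(payload)
```

Every record pays this extract-and-reshape toll on every trip, which is the chokepoint the paragraph above describes.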

SQLite was a speed demon in its day—but so was the Intel 80x86 architecture. Need I say more?

Better?

Well, unless you’re still interacting exclusively with the underlying file system, the answer is another easy “no.” We examined the limitations of the SQLite serverless architecture extensively in installments 5, 6, and 7 of this series. While the architecture was a breakthrough at the time, it was also a breakthrough for its time. It met the then-emerging need for a simple mobile and web application data cache. But that’s not today’s need. Today’s mobile and IoT scenarios require an architecture designed for high-speed, multi-channel, multi-process, multi-threaded applications. While some early IoT applications were built on an assumption that the vast majority of data would be sent to the cloud for processing and analytics—a scenario in which SQLite seemed viable as a local data cache—it has become apparent that the underlying assumption itself was flawed. With the emergence of a modern edge data management topology in which analysis and transaction processing can take place at the edge rather than deep in the cloud, an optimized client-server architecture designed for streamlined performance along the entire continuum of device-to-edge-to-cloud/data center redefines the concept of “better.”

As with faster, SQLite was once better than other database alternatives when it came to single-user mobile applications and local data caching. But serverless architectures aren’t meant to address the tasks of our time. Ours is a multi-verse, with multi-machine-to-multi-machine and multi-human-to-machine interactions and transactions occurring all the time. That world demands more than SQLite can deliver.

Cheaper?

Okay, SQLite gets that one. It’s open source and free. Can’t get cheaper than that. For Do-It-Yourselfers (DIYers) who eye the cost of externally produced and purchased software with hawk-like vigilance, SQLite may still exert a pull. The same goes for business decision-makers when they hear that SQLite is free. That could mean more left in the budget for other line items in the BOM or to pay for additional service hours.

But “free,” here, is as misleading as “free puppies.” If you only look at the upfront cost of SQLite, you can’t beat it. But you’re decoupling that assessment from any consideration of the internal cost of that decision. If you factor in the costs of software design, development, testing, updates, ongoing support, and so forth—all of which, as we have previously discussed, involve a significant amount of hoop-jumping, given the inherent limitations of the architecture—then the cost calculation changes dramatically.
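A back-of-the-envelope version of that calculation shows how quickly a $0 license disappears into the totals. Every dollar figure below is invented purely to illustrate the shape of the arithmetic, not to report real costs for any product or project:

```python
# Toy three-year TCO sketch with entirely hypothetical figures.
license_fee = 0  # SQLite's famous line item

# One-time engineering costs of working around the architecture.
one_time = {
    "designing around architectural limits": 40_000,
    "building an encryption layer": 15_000,
    "ETL integration with a server database": 25_000,
}

# Recurring costs, paid every year the application is in service.
annual = {
    "maintenance and support": 30_000,
}
years = 3

tco = license_fee + sum(one_time.values()) + years * sum(annual.values())
print(f"3-year TCO despite a free license: ${tco:,}")
```

The point is not the specific numbers but the structure: the license fee is the only term that is zero, and it is the smallest term in the sum.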

We could devote an entire blog just to DIY cost estimates, so we’re not going to dive deep here. Anyone who still has COBOL assets or other legacy tools and systems completely understands how difficult it can be to maintain and support code that was originally designed to meet the challenges of an earlier era. If cheaper remains the prime mover for you, and if you’re determined to do the work to extend the capabilities of SQLite to meet today’s needs, there are various tools you can access to model the cost burden per line of code that this effort will incur. The models vary by the size of the code body, regulatory guidance, projected lifecycle, and many other factors, but they may be able to help you assess in advance the true cost of this folly—sorry—I mean effort.
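As one example of the kind of model alluded to above, Basic COCOMO estimates effort from code size as effort = a × KLOC^b. The coefficients below are the published organic-mode defaults, and the code size and loaded monthly rate are hypothetical, chosen only to show how such a model is applied:

```python
def cocomo_effort_pm(kloc: float, a: float = 2.4, b: float = 1.05) -> float:
    """Basic COCOMO effort estimate in person-months.

    a and b default to the published 'organic mode' coefficients;
    embedded or heavily regulated projects use larger values, which is
    one way the factors mentioned above enter the model.
    """
    return a * kloc ** b

# Hypothetical scenario: 50 KLOC of DIY extensions around SQLite,
# priced at an assumed $15,000 per loaded person-month.
effort = cocomo_effort_pm(50)
cost = effort * 15_000
print(f"~{effort:.0f} person-months, roughly ${cost:,.0f}")
```

Run the model against your own code-size estimate and loaded labor rate before deciding that “free” software is the cheap option.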

Of course, you may be smiling smugly and thinking that, no, you’re not actually going to do it yourself. There’s an entire industry of boutique developers that specialize in SQLite tools and add-on components, including SQL query editors, utilities for encrypting data at rest and in transit, transformation tools for synchronizing data with Microsoft SQL Server, and much more. But this approach only introduces a different dimension of cost to your undertaking. Not only do these add-ons effectively nullify the “free” aspect of SQLite (since they’re not free), but reliance upon these small vendors introduces an element of risk over which you have no control. Any bugs or shortcomings in their code become an inherent part of your application. If the boutique developer disappears—and the majority of them have historically disappeared in fairly short order—then you’re suddenly back to the DIY model you thought you were avoiding. This time, though, you’re having to DIY without a full understanding of the code you’ve incorporated, which often means suboptimally patching and extending code that you probably would have designed differently from the ground up if it were yours.

Oh, and between the lines above—no pun intended—you can clearly see that you still need to DIY the integration of these boutique add-ons into the solution you’re developing. Do we need to puncture your bubble still further by noting that the burden of troubleshooting any issues or conflicts arising from the incorporation of these add-ons also falls to you? You won’t necessarily have the insight required to resolve these issues easily, but you’ll ultimately be responsible for the solution you’ve delivered and yours will be the throat they reach for when users are unhappy.

Time to Retire That Number

Ultimately, SQLite is not faster, not better, and not cheaper. Not anymore. We’ll give SQLite its due: It was a brilliant addition to the tech team in its youth, but it is time to hoist that jersey to the rafters and retire the number. If it’s not faster, better, or cheaper, why would you still adopt it? Given the demands of modern edge data management, faster, better, and cheaper all point to Actian Zen.

About Actian Corporation

Actian is helping businesses build a bridge to a data-defined future. We’re doing this by delivering scalable cloud technologies while protecting customers’ investments in existing platforms. Our patented technology has enabled us to maintain a 10-20X performance edge against competitors large and small in the mission-critical data management market. The most data-intensive enterprises in financial services, retail, telecommunications, media, healthcare and manufacturing trust Actian to solve their toughest data challenges.