Metaspex's Technology

Every once in a while, a technology comes along and shakes up how we do things. A well-known one is the relational model. It became mainstream in the 1980s and disrupted the way we developed back-ends. That was 40 years ago.

We know that trees do not grow to the sky. 40 years is an eternity at the scale of technological eras. Yet, the relational model and the tools which came with it have dominated the technology landscape until now, well after the Cloud was invented.

An important, although little-known, event happened when Eric Brewer postulated the "CAP Theorem" at the end of the 1990s, more than two decades ago. A mundane formulation is "pick two" (and only two; all three at once are unattainable) among three guarantees given by a distributed system: Consistency, Availability and Partition tolerance ("C", "A" and "P"). The first two are easy to interpret; the last one is, somewhat loosely, what we call "resilience" amongst ourselves.

Why is it so important? Because the entire relational model, and the tools supporting it, have been built around the principle of absolute consistency. CAP predicts that if we choose consistency, we can have either availability or resilience, not both. There is no way around that: a theorem cannot be bent. Consistency is desirable, but it is the least desirable of the three. The reason is quite obvious: in a commercial application depending on a distributed system, only two of the three guarantees are directly related (positively or negatively) to a company's revenue: Availability and Partition tolerance.

Indeed, if the application is not available, revenue takes a hit. If resilience is reduced, same thing. Consistency, on the other hand, is only indirectly related to revenue: if it is breached, the application keeps working and generates revenue, and the inconsistencies can be resolved in a multitude of ways. In contrast, issues with availability and resilience keep everybody awake at night.
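The trade-off above can be made concrete with a toy replicated key-value store. This is an illustrative sketch only (the class names are invented, and it is not Metaspex code): a "CP" store refuses writes during a network partition to stay consistent, while an "AP" store keeps accepting writes, lets replicas diverge, and reconciles once the partition heals.

```python
class Replica:
    """One node's local copy of the data."""
    def __init__(self):
        self.data = {}

class CPStore:
    """Chooses Consistency: during a partition, writes are refused (availability lost)."""
    def __init__(self, replicas):
        self.replicas = replicas
        self.partitioned = False

    def write(self, key, value):
        if self.partitioned:
            raise RuntimeError("unavailable: cannot reach all replicas")
        for r in self.replicas:      # synchronous replication to every node
            r.data[key] = value

class APStore:
    """Chooses Availability: writes always succeed, replicas may diverge."""
    def __init__(self, replicas):
        self.replicas = replicas
        self.partitioned = False

    def write(self, key, value):
        # During a partition only the reachable (local) replica is updated;
        # the others will catch up later (eventual consistency).
        targets = self.replicas[:1] if self.partitioned else self.replicas
        for r in targets:
            r.data[key] = value

    def reconcile(self):
        # Once the partition heals, propagate the surviving state everywhere.
        for r in self.replicas[1:]:
            r.data.update(self.replicas[0].data)
```

A short run shows the two behaviors side by side:

```python
cp = CPStore([Replica(), Replica()])
cp.partitioned = True
# cp.write("x", 2)   # would raise: consistency kept, availability sacrificed

ap = APStore([Replica(), Replica()])
ap.partitioned = True
ap.write("x", 2)     # succeeds: availability kept, replicas temporarily diverge
ap.reconcile()       # consistency restored after the partition heals
```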

This is why we are witnessing ever more widespread usage of document-oriented databases, which part ways with the age-old relational model, relaxing consistency and making strides on availability, scalability and resilience. But they do not offer much more than a distributed filesystem with indexing. The gradual disappearance of the relational model, in spite of its numerous advantages, accelerated by the move to heavily distributed Cloud-based systems (microservices), leaves a wide vacuum around data modeling and handling.

A vacuum is also an opportunity. That's what Darwin's theory of evolution has shown us.

The relational model has also stalled productivity gains. Repeated attempts have been made to create higher abstractions on top of relational databases. These endeavors have failed. This failure costs us $2 trillion (that's two million million) every year, by perpetuating the human-wave approach of the 1970s to delivering back-end projects. The number of programmers has doubled every 3.4 years over the last half-century because technological progress halted, echoing the pause in progress mankind faced during the Middle Ages. A humongous bubble is threatening to burst. What is coming is akin to the switch from soldering basic electronic components by hand to having integrated circuits built automatically from masks by machines: a huge step in productivity, quality and performance.

Modeling, handling and querying data must be redefined, from the ground up. Even the recent explosion of Artificial Intelligence needs proper data modeling to make its way to the core of companies' operations. And we need to build upon these new ideas to restart productivity gains.

Helping the IT community achieve this ambitious goal is what Metaspex does.
  • Metaspex comes with linguistic modeling techniques that are much friendlier to business analysts. A domain is described using what AI calls an "ontology".
  • It offers a strong, science-based, greatly extended definition of referential integrity.
  • It approaches data querying from a more powerful and natural angle: indexable semantics, making SQL feel clunky and cryptic.
  • It reduces complexity, allowing large applications to keep growing harmoniously, saving them from code proliferation and from grinding to a halt, which is painful to business.
  • It offers serviceability, allowing implementations with variations, "dialects" of a master ontology.
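To give a flavor of ontology-style domain modeling, here is a minimal sketch in plain Python. All names are hypothetical and this is emphatically not Metaspex's actual notation: it only illustrates the idea of declaring a domain as typed entities with references (where integrity must be preserved) and derived values (a candidate for "indexable semantics", which an engine could maintain in an index rather than recompute per query).

```python
from dataclasses import dataclass, field

@dataclass
class Customer:
    # A domain entity; other entities may reference it.
    name: str

@dataclass
class OrderLine:
    # A sub-document owned by an Order.
    product: str
    quantity: int

@dataclass
class Order:
    # A reference to another entity: referential integrity applies here.
    customer: Customer
    # Owned sub-documents, nested rather than joined.
    lines: list[OrderLine] = field(default_factory=list)

    def total_items(self) -> int:
        # A derived value: the kind of semantics an engine could index
        # instead of forcing every query to recompute it.
        return sum(line.quantity for line in self.lines)
```

Usage under these assumptions:

```python
alice = Customer("Alice")
order = Order(alice, [OrderLine("widget", 2), OrderLine("gadget", 1)])
order.total_items()  # → 3
```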
The techniques used by Metaspex are better rooted in science than the relational model, and they build upon concepts which predate it. In addition, Metaspex's automatic code generation benefits from the latest advances in compilation and optimization, making Metaspex a tool that takes better care of implementation than human programmers do. Applications are delivered lightning-fast, straight from formal specification, and run faster too. The best of both worlds: this is a first.

But as usual, there are two sides to a coin: we can save a substantial share of the yearly $2 trillion, or we can take advantage of Metaspex's power to do much more than we do today. Somehow we believe the latter will happen... What do you think?