Introduction to AI Infrastructure
The promise of AI remains immense, but one thing may be holding it back: the infrastructure that powers AI today won’t sustain tomorrow’s demands. CIOs must rethink how to scale smarter – not just bigger – or risk falling behind. CrateDB is betting on solving the problem by positioning itself as a ‘unified data layer for analytics, search, and AI.’
The Challenge with Current IT Systems
The challenge is that most IT systems rely on, or were built around, batch or asynchronous pipelines, while the time between the production and the consumption of data now needs to shrink. Stephane Castellani, SVP of marketing at CrateDB, argues that CrateDB is a strong fit because it can deliver insights from large volumes of data, across complex formats, in a matter of milliseconds.
How CrateDB Works
A blog post outlines the four-step process by which CrateDB acts as the ‘connective tissue between operational data and AI systems’: ingestion; real-time aggregation and insight; serving data to AI pipelines; and enabling feedback loops between models and data. The velocity and variety of data are key; Castellani notes query times falling from minutes to milliseconds. In manufacturing, telemetry can be collected from machines in real time, feeding predictive maintenance models.
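The four steps can be sketched end to end in a few lines. This is a minimal, self-contained Python sketch, not CrateDB’s implementation: the telemetry readings, the 70 °C threshold, and the `needs_inspection` rule are hypothetical stand-ins for a real ingestion pipeline and predictive-maintenance model.

```python
from collections import defaultdict
from statistics import mean

# Step 1: ingest -- hypothetical telemetry readings from factory machines.
readings = [
    {"machine": "press-1", "temp_c": 71.0},
    {"machine": "press-1", "temp_c": 74.5},
    {"machine": "lathe-2", "temp_c": 65.2},
    {"machine": "lathe-2", "temp_c": 66.1},
]

# Step 2: aggregate in (near) real time -- mean temperature per machine.
by_machine = defaultdict(list)
for r in readings:
    by_machine[r["machine"]].append(r["temp_c"])
features = {m: mean(temps) for m, temps in by_machine.items()}

# Step 3: serve aggregates as features to a (stand-in) maintenance model.
def needs_inspection(avg_temp, threshold=70.0):
    return avg_temp > threshold

alerts = {m: needs_inspection(v) for m, v in features.items()}

# Step 4: feedback loop -- alerts flow back as labels for future retraining.
labels = [(m, features[m], alerts[m]) for m in sorted(features)]
print(alerts)  # press-1 runs hot; lathe-2 does not
```

In a real deployment, steps 1–2 would be continuous queries against the database rather than an in-memory loop, which is where the minutes-to-milliseconds difference matters.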
Benefits of Using CrateDB
There is another benefit, as Castellani explains: some customers also use CrateDB on the factory floor for knowledge assistance. If something goes wrong and a specific error message appears on a machine, an operator who is not an expert can ask a knowledge assistant – which also relies on CrateDB as a vector database – what the message means and how to fix it, pulling up the right manual and the right instructions in real time.
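The retrieval step behind such a knowledge assistant can be illustrated with a toy example. This sketch substitutes a bag-of-words counter for a real embedding model and an in-memory dict for a vector database such as CrateDB; the error codes and manual snippets are invented.

```python
import math
import re
from collections import Counter

# Invented manual snippets; in production these would be embedded and
# stored in a vector store, then retrieved by similarity search.
manuals = {
    "E042": "E042 spindle overheat: check coolant flow and clean the filter",
    "E107": "E107 conveyor jam: clear the belt and reset the drive",
}

def embed(text):
    # Hypothetical stand-in for a real embedding model: word counts.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    dot = sum(a[k] * b[k] for k in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

index = {code: embed(text) for code, text in manuals.items()}

def lookup(query):
    # Return the manual entry whose vector is closest to the query's.
    q = embed(query)
    return max(index, key=lambda code: cosine(q, index[code]))

print(lookup("machine shows E042 overheat, what do I do?"))  # → E042
```

The real pipeline is the same shape: embed the operator’s question, run a nearest-neighbour search over stored manual embeddings, and feed the best match to the assistant as context.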
The Future of AI
AI, however, does not stand still for long; “we don’t know what [it] is going to look like in a few months, or even a few weeks”, notes Castellani. Organisations are looking to move towards fully agentic AI workflows with greater autonomy, yet according to recent PYMNTS Intelligence research, manufacturing – as part of the wider goods and services industry – is lagging. CrateDB has partnered with Tech Mahindra on this front to help provide agentic AI solutions for automotive, manufacturing, and smart factories.
Model Context Protocol (MCP)
Castellani notes excitement about the Model Context Protocol (MCP), which standardises how applications provide context to large language models (LLMs). He likens it to the trend around enterprise APIs 12 years ago. CrateDB’s MCP Server, which is still at the experimental stage, serves as a bridge between AI tools and the analytics database. “When we talk about MCP it’s pretty much the same approach [as APIs] but for LLMs,” he explains.
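The API analogy can be made concrete: an MCP server advertises its capabilities much as an API publishes endpoints. The JSON-RPC exchange below is a simplified sketch of an MCP `tools/list` request and response; the `query_database` tool name and its schema are hypothetical, not CrateDB’s actual MCP Server interface.

```python
import json

# Client asks an MCP-style server which tools it exposes (JSON-RPC 2.0).
list_request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# Server describes a hypothetical tool an LLM could then choose to call.
list_response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "query_database",  # hypothetical tool name
                "description": "Run a read-only SQL query against the analytics database",
                "inputSchema": {
                    "type": "object",
                    "properties": {"sql": {"type": "string"}},
                    "required": ["sql"],
                },
            }
        ]
    },
}

print(json.dumps(list_response["result"]["tools"][0]["name"]))
```

Just as an API contract lets any client call a service without bespoke integration, this shared description format lets any MCP-aware LLM tool discover and call the database.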
Partnerships and Future Plans
Tech Mahindra is just one of the key partnerships going forward for CrateDB. “We keep focusing on our basics,” Castellani adds. “Performance, scalability… investing into our capacity to ingest data from more and more data sources, and always minimising the latency, both on the ingestion and query side.”
Conclusion
The infrastructure that powers AI today won’t sustain tomorrow’s demands, and CrateDB is working to solve that problem by being a ‘unified data layer for analytics, search, and AI.’ With its real-time insights and millisecond query times, it is a compelling option for organisations looking to move towards fully agentic AI workflows.
FAQs
Q: What is the main challenge with current IT systems?
A: Most IT systems rely on, or were built around, batch or asynchronous pipelines, while the time between the production and the consumption of data now needs to shrink.
Q: How does CrateDB work?
A: CrateDB acts as the ‘connective tissue between operational data and AI systems’ through a four-step process: ingestion; real-time aggregation and insight; serving data to AI pipelines; and enabling feedback loops between models and data.
Q: What is the Model Context Protocol (MCP)?
A: The Model Context Protocol (MCP) is an open standard that defines how applications provide context to large language models (LLMs).
Q: What are CrateDB’s future plans?
A: CrateDB plans to keep focusing on its basics: performance, scalability, investing in its capacity to ingest data from more and more data sources, and minimising latency on both the ingestion and query sides.