by Fabian Schut
In our overview article on Data & AI trends for 2026, we identified a crucial shift: the once self-evident "cloud-first" strategy is being called into question. Where the cloud was once the promised land of limitless possibilities, it is now a complex playing field of geopolitical risks, unpredictable costs and strict regulations.
The focus is shifting from what the cloud makes possible to fundamental, strategic questions: where does the data reside, which legal regime does it fall under, and what does it cost?
In this article, we dive deeper into this new reality. We sharpen the discussion with thought-provoking theses, illustrate the dilemmas with concrete examples, and share our views on how to navigate a world where unconstrained technological freedom gives way to conscious, strategic choices.
Thesis 1: The degree of data sovereignty is becoming a conscious strategic choice in data projects.
In a world of growing geopolitical tensions and a trend toward deglobalization, the question of where your data is stored is becoming increasingly important. For years, choosing the cloud was primarily a technical and financial consideration, with solutions from U.S. tech giants often seen as the default. That default is no longer a given. European organizations are becoming aware of the risks of housing their most valuable data under a foreign legal regime.
There is growing uncertainty about the protection of, and access to, data under U.S. jurisdiction. Legislation such as the CLOUD Act fuels this unease: it gives U.S. authorities the ability to compel access to data even when it is stored outside the country's borders. This forces organizations to rethink control of their data. Reducing dependencies and keeping data "in-house", or at least within their own jurisdiction, becomes a strategic imperative. The discussion of cloud versus on-premises thus takes on a new, strategic dimension. Ensuring that data is covered only by local law is no longer an afterthought, but a decisive factor in designing a future-proof data landscape.
This does not mean, by the way, that the American cloud is being written off. On the contrary, it calls for a more mature approach that makes the most of the unique strengths of these platforms. After all, the major U.S. tech giants are the engine behind global technological innovation. For less sensitive data, they provide direct access to the latest developments in areas such as artificial intelligence and powerful data analytics, capabilities you can rarely build internally as quickly or at the same scale. Moreover, they offer the flexibility to grow effortlessly with demand and provide a stable, reliable digital backbone. The smart, future-proof approach is therefore to classify data: what is truly strategic or privacy-sensitive and needs to stay close to home, and what data can actually flourish by leveraging the unparalleled innovation power and scale of these global players? This is how you combine the best of both worlds in a hybrid architecture, with a well-thought-out design as the basis.
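To make this concrete, the sketch below shows what such a classification step could look like in code. It is a minimal sketch in Python; the labels, datasets, and routing targets are illustrative assumptions, not a prescribed standard.

```python
# Minimal sketch of a data classification step for a hybrid architecture.
# Dataset names, flags, and routing targets are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    contains_pii: bool   # privacy-sensitive data?
    strategic: bool      # core intellectual property or government data?

def route(ds: Dataset) -> str:
    """Decide where a dataset may live based on its classification."""
    if ds.contains_pii or ds.strategic:
        return "sovereign"    # EU-jurisdiction or on-premises platform
    return "hyperscaler"      # public cloud for scale and AI services

datasets = [
    Dataset("citizen_records", contains_pii=True, strategic=True),
    Dataset("public_web_analytics", contains_pii=False, strategic=False),
]

for ds in datasets:
    print(f"{ds.name} -> {route(ds)}")
```

In practice, such rules live in a data catalog or governance tool rather than in application code, but the principle is the same: classification first, placement second.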
Case study
Since the publication of the report 'Vision Digital autonomy and sovereignty of government' on June 30, 2025, we have seen the issue of data sovereignty surface directly in the questions our public sector clients ask. Questions now on the table include: 'How quickly can we migrate our data platform to a European alternative?', 'Which technologies can be deployed both within our current architecture and in a future sovereign platform?', 'Is it possible to switch completely to open source for data transformations?' and 'Will cloud providers that are European-owned today really remain European?'. The latter question is further fueled by a recent acquisition in the application landscape.
The report challenged the self-evidence of current application landscapes, and public sector organizations are now visibly acting on it.
Are you worried about your data?
- Get in touch with us!
Thesis 2: Without cost control, you don't have a modern data platform.
The need for cloud storage and computing power is increasing, and for good reason. Its scalability, flexibility and vast array of managed services make it possible to innovate quickly. But this flexibility comes at a price. As data volumes and data processing complexity increase, cloud costs rise just as fast.
In the cloud, you pay for what you use. Every ad hoc calculation, every inefficient query, every casual prompt to your internal AI chatbot, and every gigabyte of needlessly stored data shows up directly on your bill. Cost monitoring is therefore becoming an increasingly critical part of your data platform. It is no longer enough to focus on technical operation alone; cost efficiency is just as important.

In addition, organizations struggle with the opaque cost structure of cloud solutions, and this lack of clarity delays decision making. The complex range of pricing models, such as fixed price, reservations, or pay-as-you-go, does not make things easier. To make a sound trade-off, it is essential to know, for example, which parts of the data processing require the most computing power. A specialized discipline, FinOps, has even emerged to manage this process. Organizations need to think strategically about the design of their data platform: how data is stored, how processes are optimized, and when computing power is deployed.
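As a minimal illustration of what such cost monitoring can look like, the Python sketch below flags days with anomalous spend. The figures and the threshold factor are invented placeholders; in practice, the input would come from your cloud provider's cost export.

```python
# Minimal FinOps-style guardrail: flag days whose spend deviates sharply
# from the average of the other days. The cost figures are made-up
# example data, not taken from any real bill.
from statistics import mean

daily_spend_eur = [310, 295, 330, 305, 920, 315]  # example export, EUR/day

def flag_anomalies(spend: list[float], factor: float = 2.0) -> list[int]:
    """Return indices of days whose spend exceeds `factor` times the
    average of all remaining days."""
    flagged = []
    for i, value in enumerate(spend):
        rest = spend[:i] + spend[i + 1:]
        if value > factor * mean(rest):
            flagged.append(i)
    return flagged

for day in flag_anomalies(daily_spend_eur):
    print(f"Day {day}: spend of EUR {daily_spend_eur[day]} looks anomalous")
```

A real setup would feed alerts like this into the same channels as technical monitoring, so that a runaway query is treated as seriously as a failing pipeline.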
Case study
We see a good example among our customers who have built their data platform on Microsoft Fabric. As a SaaS solution, it offers a seemingly predictable cost model: a fixed price per month for a certain amount of computing power, with each upgrade doubling both the capacity and the cost.
In practice, however, this simplicity turns out to be deceptive. Customers struggle with the question: how much computing power do we really need? Estimating this in advance is difficult, so you quickly end up either paying too much for unused capacity or running out of it. The need to keep a finger on the pulse, to monitor, and to scale up or down dynamically does not disappear, not even with a SaaS package that promises to take those concerns away.
Tellingly, even Microsoft has partially backtracked by now offering a pay-as-you-go option for certain workloads. It proves that the promise of a fully managed SaaS platform cannot eliminate the need for financial control and detailed insight.
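To illustrate the sizing question, here is a minimal sketch that picks the smallest capacity tier covering an observed usage peak. It assumes a SKU ladder in which each step doubles capacity units and cost, as described above; the SKU names mirror Fabric's F-SKUs, but all usage numbers are invented.

```python
# Minimal capacity sizing check, assuming a SKU ladder where each step
# doubles compute (and cost). Usage figures are illustrative; real ones
# would come from the platform's capacity metrics.
SKUS = [("F2", 2), ("F4", 4), ("F8", 8), ("F16", 16),
        ("F32", 32), ("F64", 64), ("F128", 128)]

def smallest_fitting_sku(peak_cu: float) -> str:
    """Pick the smallest SKU whose capacity units cover the observed peak."""
    for name, cu in SKUS:
        if cu >= peak_cu:
            return name
    return SKUS[-1][0]  # already at the top of the ladder

observed_peak_cu = 21   # e.g., from last month's utilization report
current_sku_cu = 64     # what you are paying for today

needed = smallest_fitting_sku(observed_peak_cu)
print(f"Peak usage {observed_peak_cu} CU -> smallest fitting SKU: {needed}")
if dict(SKUS)[needed] < current_sku_cu:
    print("Likely overprovisioned; consider scaling down.")
```

Because each step doubles the price, picking one tier too high does not cost you a little extra, it costs you twice as much, which is exactly why this estimate deserves ongoing attention rather than a one-off guess.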
What is the best solution for you?
The trend
The days of an unconditional "cloud-first" strategy are over. Organizations are being forced to answer two fundamental questions: who has legal control over our data, and who controls the financial thermostat? These questions are a key reason we are seeing a striking counter-movement: renewed interest in on-premises solutions.
Although cloud investments continue to rise, we also see some organizations repatriating their applications to on-premises infrastructure. By taking the infrastructure back into their own hands, they immediately gain a better grip on costs and guarantee sovereignty over their data. This, however, brings a new consideration: the renewed need to invest in internal management and maintenance, a complex and costly task the cloud used to take off their hands so efficiently. In 2026, this will therefore become a conscious strategic trade-off for organizations, made on the basis of a well-thought-out platform design.
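A back-of-the-envelope comparison shows why this trade-off has to be made explicitly. All figures in the sketch below are invented placeholders; a real business case needs your own contract prices, staffing costs, and depreciation schedule, and it still leaves sovereignty and flexibility out of the equation.

```python
# Back-of-the-envelope break-even sketch for cloud vs. on-premises.
# All figures are invented placeholders for illustration only.
cloud_monthly_eur = 18_000        # current monthly cloud bill

onprem_hardware_eur = 400_000     # servers, storage, networking
hardware_lifetime_months = 48     # depreciation period
onprem_ops_monthly_eur = 9_000    # staff, power, housing, licenses

onprem_monthly_eur = (onprem_hardware_eur / hardware_lifetime_months
                      + onprem_ops_monthly_eur)

print(f"Cloud:       EUR {cloud_monthly_eur:,.0f} per month")
print(f"On-premises: EUR {onprem_monthly_eur:,.0f} per month")
```

With these made-up numbers, the two options land remarkably close together, which is precisely the point: the decision then hinges on sovereignty, required expertise, and flexibility rather than on price alone.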
If you work with Data & AI, you won't want to miss the following articles. Keep an eye on the website or sign up below to automatically receive follow-up articles and other news about Data & AI in your mailbox!