Cloud Independence: Testing a European Cloud Provider Against the Giants

Jun 18, 2025

Thorsten Foltz

Can a European cloud provider like Ionos replace AWS or Azure? We test it—and find surprising advantages in cost, control, and independence.

Somewhere in Europe. You might be running a small online business, managing a mid-sized automotive supplier or leading a global corporation with tens of thousands of employees. No matter the scale, the picture is the same. People are working on computers. They reply to emails, search for information on Google, chat with AI tools, calculate figures in Excel, and prepare presentations in PowerPoint. Some use PowerBI to view the latest sales reports.

In the background, your IT team is setting up databases in AWS, managing access to S3 buckets, and talking about scaling with Kubernetes. They are upgrading servers and considering whether to replace your aging Postgres data warehouse with Snowflake.

Your company seems to be on the right path, with its entire infrastructure running in the cloud. Still, something feels off. The dominant cloud providers — Amazon, Microsoft, and Google — are all based in the United States. Office 365? Also Microsoft. Snowflake or Databricks? American. Salesforce or Slack? The same.

Like many others, you rely on these platforms because they are considered easy to use, secure, and affordable. That is the promise. But it is worth taking a closer look.

Are they really easy to use? To some extent, yes. But think about how many cloud, platform, and data engineers you employ. The cloud may simplify some processes, but it introduces new layers of complexity too.

Are they cheap? That depends. Cloud solutions can be more cost-effective than maintaining your own infrastructure. But as your data and services grow, the costs increase rapidly. And providers often raise their prices whenever they can.

Are they secure? A properly configured cloud environment is usually well protected against external threats. But, just like with traditional systems, you need people who know what they’re doing. And keep in mind, your cloud provider always has access to your data. Under U.S. law, especially the CLOUD Act, American authorities can legally demand it.

Security and cost are both critical concerns, and security deserves special attention. The United States has long been and still is Europe’s closest ally. It is a constitutional democracy that respects the rule of law. Yet, recent developments suggest a shift. The U.S. increasingly follows its own strategic interests, and whether it consistently honors international legal agreements is becoming uncertain.

Maybe you think this doesn’t affect you. But it might. The debate over U.S. access to data is not new. If your business stores sensitive assets like technical documentation or trade secrets, there is a risk that outsiders could gain access.

Perhaps you assume encryption solves this. In theory, strong encryption protects your data. For now. But technologies evolve. Sooner or later, methods will emerge that can break today’s encryption standards. What is more concerning, however, is the assumption that you will always have access to your data. What if one day you don’t? That seems far-fetched? Maybe. Until it happens.

In February 2025, the U.S. government sanctioned the International Criminal Court (ICC). Its chief prosecutor, the British national Karim Khan, not only lost access to his financial accounts but also to his email account. We do not need to dive into the politics of whether this was justified. And to be fair, it is unclear whether Microsoft terminated access or the ICC. According to Microsoft’s President Brad Smith it wasn’t Microsoft. The fact is, it happened.

Now ask yourself this. What would happen if your entire infrastructure and data lived on one of the big American hyperscalers, and suddenly your account was shut down? How long could your company survive?

Do you truly believe the promises made by Microsoft and others that they are building an independent European cloud? Is it even possible? At the end of the day, they are still American companies.

Finally, everyone working with cloud environments has seen their invoices grow. It does not matter which provider you use. AWS, Azure, Snowflake, Salesforce, etc. They all increase their prices. Basic services become more expensive, data volumes continue to grow, and the use of managed services adds significant cost. This becomes even more problematic when scalability issues are solved by simply adding more hardware, often because no one has access to or fully understands what is happening behind the managed service.

Solution

What can you do? One approach is to go back to on-premises infrastructure, running your own servers in your office and managing everything yourself. But with the amount of data and software you rely on, this could require a large number of servers. That brings significant costs: the hardware itself, the administrators you need to hire, security experts (who aren't cheap), electricity, and perhaps even a new building that must be physically secured. And while cyberattacks are more common today, stealing physical hard drives is still a real threat, especially if those drives are not encrypted.

An alternative is to use a European cloud provider. Some, like Hetzner, mainly offer servers, but many others also provide managed services similar to those from the large US hyperscalers. There are numerous providers across Europe. For example, Ionos, Open Telekom Cloud, and StackIT in Germany, OVH and Scaleway in France, Elastx in Sweden, Seeweb and Aruba Cloud in Italy, Cyso in the Netherlands, UpCloud in Finland, and Exoscale in Switzerland. These are just a few of the more prominent ones.

Can they fully replace the US providers? The short answer is no. Their service offerings are still much smaller in comparison. But the more important question is what you actually need. Standard services like object storage, virtual machines, managed databases, infrastructure as code, and managed Kubernetes are widely available. If you focus on your core requirements, there is a good chance that these European providers can meet your needs.

European Cloud Provider Ionos

Let us take the Ionos Cloud as an example. Our goals are to:

  • Create an account and set a payment alarm to notify us if costs exceed a certain threshold

  • Set up storage, a virtual server, and a managed database using Terraform

  • Assign permissions for non-admin users

  • Install open-source software for analytics and a dashboard for visualization

Ionos is a German internet and cloud provider with around 4,500 employees and a turnover of approximately 1.5 billion euros. To get started, open their website, click on login, select your preferred language, and create an account just like on any other platform.

Ionos Cloud

Data Center Designer

After logging in, you need to add a credit card, but you only pay for what you use. There are no fees or hidden costs. During the first 30 days, you receive a free credit of €200 or $200, which is great for testing. Just keep in mind that 30 days is not a very long time.

After logging in, the first thing you notice is the tidy and clearly organized interface. On the left, you see the available infrastructure and platform services. In the center, your virtual data center is displayed, along with your current resource allocation. You can increase this allocation if needed, as the default is just a starting limit. You also see the current status of the Ionos Cloud, news updates, and links to support and documentation. At the moment, the documentation is available only in English. In the top right corner, you can manage users and groups, configure password policies, and handle token management. You can also review usage and costs and set up a cost alert, which works reliably.

From the interface, it becomes clear that Ionos offers a wide range of services. In addition to virtual servers, you will find managed Kubernetes, a container registry, PostgreSQL, other managed databases such as MariaDB, object storage, network file storage, activity logging, IP management, VPN gateways, and Kafka.

Services offered by Ionos

Infrastructure

To begin, you need to create tokens in order to work with code. You need three in total: one for API or SDK authentication, and two (an access key and a secret key) for accessing object storage.


Go to Management and then Token Manager to create tokens

Download token

Go to Storage and Backup

Second, install the Ionos CLI. On Windows, you can use Scoop; on macOS, use Homebrew. For Linux, the method depends on your distribution — for example, use Snap on Ubuntu or the AUR on Arch Linux. After installation, set your token as an environment variable and test the setup by logging in to Ionos through the CLI.

Token is stored in Ionos

The next step is to create a .env file to store the credentials needed for Terraform, or rather for OpenTofu, the open-source alternative that I used. Please note that some of the following code has been shortened; you can find the complete example on GitHub.
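The .env file is not shown in full. As a sketch: OpenTofu automatically reads any environment variable named `TF_VAR_<variable>`, so one option is a file of exports like the following (placeholder values; the variable names match those declared in variables.tf) that you `source` before running OpenTofu:

```shell
# .env — hypothetical placeholder values, never commit this file.
# OpenTofu/Terraform picks up any variable exported as TF_VAR_<name>.
export TF_VAR_ionos_token="paste-your-ionos-api-token-here"
export TF_VAR_ionos_s3_access_key="paste-your-s3-access-key"
export TF_VAR_ionos_s3_secret_key="paste-your-s3-secret-key"
export TF_VAR_ionos_user="you@example.com"
export TF_VAR_ionos_database_password="choose-a-strong-password"
```

Keeping the credentials in the environment rather than in .tfvars files reduces the risk of accidentally committing secrets.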


Now let us create the Terraform or OpenTofu configuration files, starting with main.tf. This file will define the provider and set up the basic infrastructure components.
terraform {
  required_version = ">= 1.9.0"
  required_providers {
    ionoscloud = {
      source  = "ionos-cloud/ionoscloud"
      version = ">= 6.4.10"
    }
    aws = {
      source  = "hashicorp/aws"
      version = ">= 5.0.0" # or latest
    }
  }
}

# Provider for IONOS Cloud API
provider "ionoscloud" {
  token = var.ionos_token

  s3_access_key = var.ionos_s3_access_key
  s3_secret_key = var.ionos_s3_secret_key
  s3_region     = "eu-central-3"
}

locals {
  # The AWS provider requires a valid AWS region string,
  # even though we talk to Ionos S3 via a custom endpoint
  dummy_aws_region = "eu-central-1"
}

# Provider for the S3-compatible API (Ionos S3) via the AWS provider
provider "aws" {
  access_key                  = var.ionos_s3_access_key
  secret_key                  = var.ionos_s3_secret_key
  region                      = local.dummy_aws_region
  skip_credentials_validation = true
  skip_metadata_api_check     = true
  skip_requesting_account_id  = true

  endpoints {
    s3 = "https://s3.eu-central-3.ionoscloud.com"
  }
}

# -------------------------------
# S3 Bucket Policy Setup
# -------------------------------

# Generate the S3 bucket policy document
data "aws_iam_policy_document" "s3_bucket_policy" {
  statement {
    sid    = "AllowEssentialS3Actions"
    effect = "Allow"

    actions = [
      "s3:GetBucketLocation",
      "s3:ListBucket",
      "s3:GetObject",
      "s3:PutObject",
      "s3:DeleteObject"
    ]

    resources = [
      "arn:aws:s3:::${var.bucket_name}",
      "arn:aws:s3:::${var.bucket_name}/*"
    ]

    principals {
      type        = "AWS"
      identifiers = ["*"]
    }
  }
}

# Apply the bucket policy
resource "aws_s3_bucket_policy" "s3_policy" {
  bucket = var.bucket_name
  policy = data.aws_iam_policy_document.s3_bucket_policy.json
}

You might be surprised to see that the AWS provider is used alongside the Ionos provider. This is because Ionos Cloud states that its object storage is compatible with the S3 protocol. As a result, you actually need to use the AWS provider to work with it.

To make this work, you simply change the endpoint to point to Ionos instead of AWS, and everything functions as expected. However, there are a few important things to know. First, the AWS provider requires a valid (dummy) AWS region to initialize correctly, even though all requests go to the Ionos endpoint. Second, setting a bucket policy is necessary, but this cannot be done through the Ionos provider; you must do it through the AWS provider.

As for the variables.tf file, it is not particularly complex. It just includes the settings you previously stored in your .env file.

variable "bucket_name" {
  type        = string
  description = "test bucket"
  default     = "dataminded-terraform-test"
}

variable "ionos_database_password" {
  type        = string
  description = "Postgres database password"
  sensitive   = true
}

variable "ionos_s3_access_key" {
  type        = string
  description = "S3 access key"
  sensitive   = true
}

variable "ionos_s3_secret_key" {
  type        = string
  description = "S3 secret key"
  sensitive   = true
}

variable "ionos_token" {
  type        = string
  description = "Personal access token to ionos cloud"
  sensitive   = true
}

variable "ionos_user" {
  type        = string
  description = "User to ionos cloud"
  sensitive   = true
}


Since we want to use a virtual server, we first need to create a datacenter. This is not complicated, but you do need to know where to create it. Previously, you might have chosen “eu-central-3” as the region, but datacenter locations use a different format, such as “de/fra,” which stands for Frankfurt in Germany. I didn’t find a list in the documentation, but you can find the locations in the UI:

  • de/fra for Frankfurt in Germany

  • de/txl for Berlin in Germany

  • us/ewr for Newark in the US (New Jersey)

  • us/las for Las Vegas in the US (Nevada)

  • us/mci for Lenexa in the US (Kansas)

  • gb/lhr for London in the UK

  • gb/bhx for Worcester in the UK

  • es/vit for Logroño in Spain

  • fr/par for Paris in France

resource "ionoscloud_datacenter" "dataminded-test" {
  name                = "Datacenter Dataminded"
  location            = "de/fra"
  description         = "datacenter description"
  sec_auth_protection = false
}
Next, we need to create a virtual machine or vCPU server. Since I plan to install a data warehouse and a dashboard on this server, I chose a configuration with four cores, 500 GB of storage, and 16 GB of memory running Ubuntu. I also added my public SSH key to allow secure connections from my local computer. Finally, I set up a private LAN connection between this server and the database. Below are my network settings in the network.tf file:

resource "ionoscloud_lan" "lan" {
  datacenter_id = ionoscloud_datacenter.dataminded-test.id
  public        = true
  name          = "public-lan"
}

resource "ionoscloud_lan" "private_lan" {
  datacenter_id = ionoscloud_datacenter.dataminded-test.id
  public        = false
  name          = "private-lan"
}
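The server resource itself has been shortened away. As a rough, untested sketch of the configuration described above (the resource name `analytics` and the exact attribute values are my assumptions; check the ionoscloud provider documentation for the precise schema), it could look something like this:

```hcl
# Sketch of the 4-core / 16 GB / 500 GB Ubuntu server; attribute
# names follow the ionoscloud Terraform provider, values are examples.
resource "ionoscloud_server" "analytics" {
  name          = "analytics-server"
  datacenter_id = ionoscloud_datacenter.dataminded-test.id
  cores         = 4
  ram           = 16384 # 16 GB in MB
  image_name    = "ubuntu:latest"
  ssh_keys      = [file("~/.ssh/id_rsa.pub")]

  volume {
    name      = "system"
    size      = 500 # GB
    disk_type = "HDD"
  }

  # Attach the server to the public LAN defined above
  nic {
    lan  = ionoscloud_lan.lan.id
    dhcp = true
  }
}
```

A second NIC on the private LAN connects the server to the database network.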
Now I set up a PostgreSQL database. Versions 12, 13, 14, and 15 are available. It is a bit disappointing that Ionos does not offer versions 16 and 17 like AWS does, especially considering that version 18 is already available as a beta. I chose version 15 with a single instance configured with four cores, 4 GB of memory, and 100 GB of HDD storage, costing about €0.28 per hour. A private connection between the server and the database has been established. Unlike with the hyperscalers, you cannot connect directly to the database; you need to go through a jump server, for example by tunneling over SSH through the virtual machine. The DBeaver Community Edition is a SQL client that supports this.

resource "ionoscloud_pg_cluster" "postgres" {
  postgres_version = "15"
  instances        = 1
  cores            = 4
  ram              = 4096
  storage_size     = 102400
  storage_type     = "HDD"

  connection_pooler {
    enabled   = true
    pool_mode = "session"
  }

  connections {
    datacenter_id = ionoscloud_datacenter.dataminded-test.id
    lan_id        = ionoscloud_lan.private_lan.id
    cidr          = "192.168.100.1/24"
  }

  location     = ionoscloud_datacenter.dataminded-test.location
  display_name = "PostgreSQL_cluster"

  maintenance_window {
    day_of_the_week = "Sunday"
    time            = "09:00:00"
  }

  credentials {
    username = var.ionos_user
    password = var.ionos_database_password
  }

  synchronization_mode = "ASYNCHRONOUS"
}

resource "ionoscloud_pg_database" "postgres_database" {
  cluster_id = ionoscloud_pg_cluster.postgres.id
  name       = "linkedin"
  owner      = "thorsten"
}


The remaining tasks are creating a storage bucket and adding an additional user with restricted permissions. The bucket.tf file handles the storage setup, groups.tf defines the user group and its permissions, users.tf creates the user, and random.tf generates a password.


resource "ionoscloud_s3_bucket" "bucket" {
  name                = "dataminded-terraform-test"
  region              = "eu-central-3"
  object_lock_enabled = false
  force_destroy       = true

  tags = {
    key1 = "dataminded"
    key2 = "test-bucket"
  }

  timeouts {
    create = "10m"
    delete = "10m"
  }
}

resource "ionoscloud_group" "developer" {
  name         = "developer"
  s3_privilege = true
}

resource "ionoscloud_user" "test_user" {
  first_name     = "Thorsten Test"
  last_name      = "User"
  email          = "thorsten@thorsten.de"
  password       = random_password.user_password.result
  administrator  = false
  force_sec_auth = false
  active         = true
  group_ids      = [ionoscloud_group.developer.id]
}

resource "random_password" "user_password" {
  length  = 30
  special = false
}

Run `tofu init` followed by `tofu apply` to deploy. Deployment takes around seven minutes, mostly because of the database. Going back to the Ionos console, we see some changes.

Welcome Thorsten Foltz



Connecting by SSH is possible now.


A database has been created as well.


Postgres database

In addition, there is the object storage (S3).

Finally, a user “thorsten” has been created, as well as the group “developer”.

Object Storage

Now we fetch and store some data using Python 3.13. I used a public API from RapidAPI to retrieve some data from LinkedIn. The fetched data is stored as .csv files in the object storage. As you can see, the well-known library boto3 is used for this purpose. However, you’ll quickly notice that it does not work out of the box. The issue is an authentication error caused by checksum verification. To fix this, you need to add the following lines:

import os
os.environ['AWS_REQUEST_CHECKSUM_CALCULATION'] = 'WHEN_REQUIRED'

Find the complete script on GitHub.

import httpx
import yaml
import os
from dotenv import load_dotenv
import io
import csv
import boto3
from botocore.client import Config
from pathlib import Path
from datetime import date, datetime, timezone
import time

env_path = Path(__file__).resolve().parents[2] / ".env"
load_dotenv(dotenv_path=env_path)

with open("config.yml", "r") as f:
    config = yaml.safe_load(f)

#------------ Config -----------------------------------

bucket_name = 'dataminded-terraform-test'
dataminded = config["companies"]["data-minded"]["id"]
dataminded_name = "data-minded"


# ------------------------------------------------------

def s3_client(config):
    os.environ['AWS_REQUEST_CHECKSUM_CALCULATION'] = 'WHEN_REQUIRED'
    s3 = boto3.client('s3',
                      endpoint_url = config["s3"]["endpoint"],
                      region_name = config["s3"]["region"],
                      aws_access_key_id = os.getenv("access_key"),
                      aws_secret_access_key = os.getenv("secret_key"),
                      config = Config(signature_version='s3v4')
                     )
    return s3

def object_name(filename):
    # Build a dated object key such as
    # raw/company/<name>/2025/05/02/<name>_2025_05_02_15_08_55.csv
    path = f"raw/company/{filename}/{date.today().strftime('%Y/%m/%d')}/{filename}_{datetime.now(timezone.utc).strftime('%Y_%m_%d_%H_%M_%S')}.csv"
    return path

Having a look into our object storage confirms that it worked.

Data is stored as csv files

Data Warehouse and Dashboard

The data is now available and ready to be loaded into a data warehouse, then queried by a BI tool to create a dashboard. For the data warehouse, I chose Databend, an open-source alternative to Snowflake. For the dashboard, I used Metabase, which is also open source.

Please note that this setup is intended solely for testing and demonstration purposes. As seen earlier, firewalls are not configured, MFA is not enabled, and both Databend and Metabase are installed as containers on the same server we provisioned earlier. Running everything on a single server is not best practice, and both applications use their default images — Databend, for example, doesn’t even require a password by default.

This setup does not scale well, although the overall system could be made scalable with more effort. That, however, is beyond the scope of this demonstration. Don’t use any of this configuration in production!

Databend (assuming you have installed Docker on the VM) is ready to use with

docker run --net=host datafuselabs/databend

and Metabase with

docker run -d -p 3000:3000 --name metabase metabase/metabase

Let’s begin with Databend. After starting it, you can connect using any SQL client. DBeaver is a solid choice again. All you need is the IP address of the virtual machine, the default username (root), and port 8000.

The next steps include:

  • Creating a schema

  • Setting up a staging area using S3 as the backend (not AWS S3, but the bucket created on Ionos)

  • Creating an external table that reads from the S3 storage

  • Creating an internal table stored on the server itself

  • Copying the .csv files from S3 into the tables for further processing

My SQL script:

create schema linkedin;

-- Create Stage
CREATE STAGE s3_stage
URL = 's3://dataminded-terraform-test/'
CONNECTION = (
ACCESS_KEY_ID = 'EEAAAAGJ...'
SECRET_ACCESS_KEY = 'n4KB0L...'
ENDPOINT_URL = 'https://s3.eu-central-3.ionoscloud.com');

-- Create an external table
create or replace table linkedin.profiles(
id int
, name varchar
, global_name varchar
, url_linkedin varchar
, tagline varchar
, description varchar
, type varchar
, employees_counter int
, hq_area varchar
, hq_country varchar
, hq_city varchar
, hq_zip_code varchar
, industries varchar
, specialities varchar
, website varchar
, found int
, followers_counter varchar
)
's3://dataminded-terraform-test/linkedin-data/'
CONNECTION = (
ACCESS_KEY_ID = 'EEAAAAG...'
SECRET_ACCESS_KEY = 'n4KB0...'
ENDPOINT_URL = 'https://s3.eu-central-3.ionoscloud.com')
;


create or replace table linkedin.posts(
id int
, content varchar
, likes_count int
, comments_count int
, interest_count int
, empathy_count int
, reposts_count int
, posted_at varchar
, timestamp_posted varchar
, url varchar
)
's3://dataminded-terraform-test/linkedin-data/'
CONNECTION = (
ACCESS_KEY_ID = 'EEAAAAGJ...'
SECRET_ACCESS_KEY = 'n4KB0L...'
ENDPOINT_URL = 'https://s3.eu-central-3.ionoscloud.com')
;


-- copy data from external
COPY INTO linkedin.profiles
FROM @s3_stage/raw/company/dataminded-profile/2025/05/02/dataminded-profile_2025_05_02_15_08_55.csv
FILE_FORMAT = (
TYPE = 'CSV',
skip_header=1
);


COPY INTO linkedin.posts
FROM @s3_stage/raw/company/dataminded-posts/2025/05/02/dataminded-posts_2025_05_02_15_10_41.csv
FILE_FORMAT = (
TYPE = 'CSV',
skip_header=1
);

As a result, you should be able to query the data in Databend and observe some new files stored in S3.
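A quick sanity check against the tables defined above (the row counts will depend on the data you fetched):

```sql
-- Verify that both loads succeeded
SELECT count(*) FROM linkedin.profiles;
SELECT count(*) FROM linkedin.posts;
```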

Schema and tables created


Linkedin-data

The final step is running Metabase and connecting it to Databend. You’ll notice that Metabase does not support Databend out of the box. However, Databend provides its own JDBC driver for integration.

To set this up:

  1. Create a directory on your server to hold the Databend drivers.

  2. Download the driver files from the official Databend repository.

  3. Modify your Docker command to mount this directory into Metabase’s plugin path (/plugins inside the container).

Once mounted and Metabase is restarted, Databend should appear as an available database type. You can then configure the connection using the IP, port, and default credentials.

mkdir metabase
cd metabase
wget https://github.com/databendcloud/metabase-databend-driver/releases/download/v0.0.8/databend.metabase-driver.jar
docker run -p 3000:3000 -v ./databend.metabase-driver.jar:/plugins/databend.metabase-driver.jar --name metabase metabase/metabase

Open your browser and go to http://<your-ip>:3000 to access Metabase. Choose your preferred language, enter your credentials, and proceed through the setup. Once the custom Databend driver is recognized, you can select Databend as your data source and complete the connection.


Choose your language



Now databend is available


Successfully connected

Technical Result

As demonstrated, setting up a complete infrastructure, mainly for analytics, using a European cloud provider is not only possible, but in many ways comparable to working with US-based providers. So, can we just switch to them easily? Unfortunately, not quite.

In this case, Ionos performs well overall, but there are some clear pain points:

  • Documentation is limited and lacks depth.

  • Community support is small, making it harder to find solutions or best practices.

  • Self-hosted open source requires more expertise than relying on managed services.

  • Specific issues may arise, such as Metabase failing to read data from Databend. This was likely a misconfiguration on my side, but it still required investigation.

  • If you rely on specific services (e.g. MySQL instead of MariaDB, or a built-in secret manager), you’ll need to run and maintain those yourself on a virtual machine.

  • No certifications are currently offered. Finding skilled professionals with hands-on experience may be difficult.

That said, it’s still worth a try. Core services like S3-compatible storage and virtual machines work reliably. You’re not overwhelmed by a bloated catalog of services and features, which keeps complexity manageable.

And perhaps most importantly: it’s a European provider. You’re subject only to European, and in this case German, law, with no entanglement in U.S. legal frameworks like the CLOUD Act.

So from a technical point of view, it is more than just an option. It is a real alternative for core services, though not a complete replacement if more advanced services are needed.

Let’s now move on and compare pricing.

Cost Comparison

Anyone who has ever tried to calculate cloud costs knows: it’s nearly impossible to get an exact number. You can estimate, but with so many variables, like region, API calls, traffic, scaling patterns, service types, and frequent price changes, precise calculation is unrealistic.

Comparing providers makes it even more complex. That said, I’ve prepared a rough estimate focusing on two core cost drivers: virtual machines and data storage.

Assumptions:

This is a simplified model to give a sense of pricing. Real costs can vary significantly depending on your architecture, discount plans, reserved instances, spot pricing, and managed service usage.

To estimate costs, let’s assume:

  • You store 1,000 GB of data per month

  • You run a virtual machine with 32 CPUs and 128 GB RAM for 720 hours per month (24/7)

  • Region is Frankfurt, Germany

  • Exchange rate at the time of calculation: 1 USD = 0.8788 EUR

Monthly storage costs (for 1,000 GB):

  • AWS: €21.53

  • Azure: €17.30

  • GCP: €20.21

  • Ionos: €7.00

Monthly compute costs (32 CPUs, 128 GB RAM, 720 hours):

  • AWS: €931.39

  • Azure: €874.44

  • GCP: €1,180.69

  • Ionos: €460.80

Annual costs for compute plus storage (assuming the storage volume stays constant):

  • AWS: €11,430.02

  • Azure: €10,700.89

  • GCP: €14,410.81

  • Ionos: €5,613.60

Ionos is significantly cheaper than the major US hyperscalers, more than 50 percent less in this example. The difference becomes even more impactful with larger workloads. If you don’t rely on highly specific managed services, a European provider like Ionos can be a serious and cost-effective alternative.
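The annual figures above can be reproduced from the monthly ones: monthly compute plus monthly storage, times twelve. A small sketch (minor differences against the listed numbers come from rounding the converted USD prices):

```python
# Reproduce the annual cost estimate from the monthly figures:
# (compute + storage) * 12, with the storage volume held constant.
monthly = {
    "AWS":   (931.39, 21.53),
    "Azure": (874.44, 17.30),
    "GCP":   (1180.69, 20.21),
    "Ionos": (460.80, 7.00),
}

annual = {
    provider: round((compute + storage) * 12, 2)
    for provider, (compute, storage) in monthly.items()
}
print(annual)  # Ionos comes out at 5613.60

# More than 50 percent cheaper than AWS in this example
savings_vs_aws = 1 - annual["Ionos"] / annual["AWS"]
print(f"Ionos saves {savings_vs_aws:.0%} versus AWS")
```

The same calculation against Azure and GCP also lands above the 50 percent mark.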

Of course, there are likely additional costs depending on your specific setup. Things like network egress, API usage, snapshots, premium storage, or licensing can shift the balance. Furthermore, using spot instances can lead to costs even below Ionos. And it’s also possible that some services are more expensive at Ionos than with the US hyperscalers. Finally, never underestimate the costs that can arise the more you do yourself.

However, even Ionos itself advertises that it is up to 50 percent cheaper than the major US providers. Based on this example, that claim holds up. Especially for core infrastructure like compute and storage, the price gap is significant.

Conclusion

The goal of this experiment was to test the maturity of a European cloud provider. Ionos left a solid impression. Stable core services, good performance, and a competitive interface. Still, it cannot fully replace the major US hyperscalers. If your workflows rely on features like AWS Lambda, BigQuery from Google, or managed orchestration platforms like Airflow, then US providers remain essential. Moreover, this is just a very basic test, scratching the surface. There might be surprises in the details.

And let’s be realistic: even if you migrate to Ionos or another European provider, you’re still dependent on US technology in some form. Almost all businesses run on Windows and Office. Even if you switch to Linux and LibreOffice, your machines are powered by Intel or AMD chips. Full independence from US tech is not only unrealistic, it’s also not necessary.

But here’s the critical point: losing access to Microsoft Word and Excel is inconvenient. Losing access to your infrastructure and data is catastrophic.

From a pricing standpoint, the advantages are compelling. European providers like Ionos are not just cheaper, they’re often dramatically so. That alone makes them worth serious consideration.

You don’t need to make a complete switch overnight. A gradual shift, a hybrid or multi-cloud strategy, or migrating select workloads can reduce risk while increasing control. If nothing else, you gain experience and options.

And right now, options matter more than ever.


Belgium

Vismarkt 17, 3000 Leuven - HQ
Borsbeeksebrug 34, 2600 Antwerpen


Vat. BE.0667.976.246

Germany

Spaces Tower One,
Brüsseler Strasse 1-3, Frankfurt 60327, Germany

© 2025 Dataminded. All rights reserved.

