Conveyor unites an entire ecosystem of modern tools and services into a single, simple workflow for building maintainable and cost-effective data projects.
Simplify Data Engineering
Get started with a single command.
Scaffold your batch or stream processing project from one of our templates following industry best practices.
Choose your compute size and deploy in seconds.
Use T-shirt sizing to pick an appropriate size for your workload. Deploy to test, then promote to production.
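To make the idea concrete, here is a minimal sketch of what T-shirt sizing boils down to: a small set of named sizes mapped to compute resources. The size labels and resource values below are hypothetical illustrations, not Conveyor's actual size table.

```python
# Hypothetical T-shirt size table: label -> compute resources.
# The labels and values are illustrative; Conveyor's real sizes may differ.
SIZES = {
    "S":  {"cpu": 1, "memory_gb": 4},
    "M":  {"cpu": 2, "memory_gb": 8},
    "L":  {"cpu": 4, "memory_gb": 16},
    "XL": {"cpu": 8, "memory_gb": 32},
}

def resources_for(size: str) -> dict:
    """Return the compute resources for a given T-shirt size."""
    return SIZES[size]

print(resources_for("M"))  # {'cpu': 2, 'memory_gb': 8}
```

The appeal of this scheme is that developers reason about a handful of well-understood labels instead of raw CPU and memory numbers.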
Extend your use cases with your own tools and libraries.
Containerization is at the core. Fit it in a container and run it.
Easy journey from notebooks to production.
Embedded notebooks run in the same context as the project, sharing its code, environment, and permissions. This lets you debug with notebooks and iteratively industrialize experiments.
Experiment and test in isolated environments.
Spin up new environments in seconds and avoid impacting your colleagues. Each environment comes with a dedicated workflow manager.
Metrics and logs, out of the box.
Track performance and errors with live access to metrics and logs.
Analyse cost per project.
With cost monitoring, you get insight into your most expensive data projects. Optimize where there is the most to gain.
Default to spot.
Running on spot instances typically saves 70-90% over on-demand pricing. Critical workloads can be configured to run on-demand, or mixed to get the best of both worlds.
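The savings from mixing spot and on-demand capacity follow from simple blended-cost arithmetic. The function and rates below are an illustrative sketch, not a Conveyor API; the 80% discount is just one point inside the 70-90% range quoted above.

```python
def blended_cost(on_demand_rate: float, spot_discount: float,
                 spot_fraction: float, hours: float) -> float:
    """Cost of a workload that runs `spot_fraction` of its hours on spot
    and the rest on-demand. Rates are per hour; discount is fractional."""
    spot_rate = on_demand_rate * (1 - spot_discount)
    return hours * (spot_fraction * spot_rate
                    + (1 - spot_fraction) * on_demand_rate)

# 100 hours at $1/hour on-demand, with 80% of the hours on spot
# at an 80% discount:
print(round(blended_cost(1.0, 0.80, 0.80, 100), 2))  # 36.0
```

In this example, a workload that would cost $100 fully on-demand runs for $36 with the mixed strategy, while the on-demand fraction shields the critical portion from spot interruptions.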
Services and infrastructure are always up-to-date. No patching, no management, no worries.
37 projects - 53 users
"Introducing new technologies and cloud adoption in data & analytics is a challenge, but getting IT and business on the same page can be an even bigger challenge. Working with Conveyor helps in managing the full ecosystem, it accelerates the scaling, and it supports collaboration, allowing data engineers and data scientists to focus on creating value with new data products, without having to worry too much about some of the underlying complexities of running a scalable data infrastructure in the cloud."
IT Deputy Director Application Delivery at Luminus
Deploy the use cases you love with zero friction
Run any workload that can be wrapped in a container at any scale. This standardizes the work across data engineers, data scientists and data analysts.
Effortless data project development for all
Conveyor is adopted by organizations across industries. We're proud to be part of their data journeys and to see them grow.