Check out our first ever data blog post about how we migrated from dbt Cloud and scaled our data development! Our data team is constantly looking for ways to move faster and more efficiently, and this is an excellent example of that.
Read on to discover why we did this, the lessons learned along the way, and how this has led to higher productivity, in turn enabling us to serve our customers even better.
This is the first installment of our data blog, so stay tuned - we'll be sharing more good stuff! Shout out and credit to Alejandro Rojas Benítez for authoring this piece!
🔗 https://lnkd.in/e2gD_BVu
Are you setting up your dbt data platform like Katie Bauer did?
In her blog post, she hits on all the things Datacoves does (they are not a customer, but have a very similar setup)
VS Code extensions "such as Jinja syntax highlighting, easier git file exploration, SQLFluff and YAML linters," and a few others available on Datacoves to improve productivity.
Airflow "is constantly helping us glue together services and processes from the data team, feeding ingestions, transformations and even data exports to other services, enhancing our overall efficiency." Datacoves saw this need from the start because complexity is inherent in most orgs. Our customers kick off Glue Jobs, Fivetran, Airbyte, StreamSets Inc., Azure Data Factory, Databricks notebooks, custom Python scripts, etc. Oh, and by the way, your pipelines don't stop at transformation.
Datacoves also helps companies set up Slim CI and Blue/Green deployments in their CI of choice (GitHub Actions, GitLab, Jenkins, etc.). There's no lock-in. People use Datacoves because they can implement quickly and don't have to worry about managing a platform.
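For context, the Slim CI pattern mentioned above typically means building only the dbt models changed in a pull request, deferring unchanged upstream models to production artifacts. Here is a minimal sketch as a GitHub Actions fragment; the job name, adapter, script path, and state directory are all hypothetical placeholders, not Datacoves' actual configuration:

```yaml
# Hypothetical GitHub Actions job illustrating the Slim CI pattern with dbt.
# Paths, adapter, and helper script below are placeholders.
name: slim-ci
on: pull_request
jobs:
  dbt-slim-ci:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: pip install dbt-snowflake  # swap in your warehouse adapter
      # Fetch the production manifest.json so dbt can diff against it
      # (how you retrieve it depends on where your deploy job stores artifacts).
      - run: ./scripts/fetch-prod-artifacts.sh ./prod-state
      # Build only models modified in this PR (and their downstream models),
      # deferring everything unchanged to the production environment.
      - run: dbt build --select state:modified+ --defer --state ./prod-state
```

The key flags are `--select state:modified+` to pick up changed models plus their descendants, and `--defer --state` to resolve unchanged references against production instead of rebuilding them.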
Kudos to the GlossGenius team for making this happen; it is not simple. Understanding that tools alone are not the answer is also key to how they work.
Are you thinking of setting up your dbt data stack? Talk to us; you will find we save you time, headaches, and money. Most importantly, we are in it to make you successful. If you find we are not for you, you can still do it all on your own; we use the same tools you can use. It's your repo, it's your process, it's your platform. Datacoves is just here to simplify things.
https://lnkd.in/gsPgnzry
I hear from many companies that say they want to improve their business decision-making but don't feel they have the right data. This is fundamentally a process problem. Different teams have different tools to answer the SAME questions 🤔 ⛔.
dbt Labs commissioned Forrester to conduct a Total Economic Impact (TEI) study on the value of dbt Cloud.
They found that prior to dbt Cloud, organizations had:
🚩 lack of consistency in data transformations
🚩 eroding trust in data across the org
🚩 extensive data rework among engineers
🚩 missed deadlines around data-driven initiatives
After adopting dbt Cloud, these organizations saw:
✅ Accelerated data development productivity by 30%
✅ $1M+ in avoided data transformation costs from poor data quality
✅ Over 60% time savings on data rework leading to $2M+ in productivity gains
Overall, these organizations saw a payback period of 6 months and a 3-year ROI of 194% 📈 🤝
Take a peek at the full report here ➡️
https://lnkd.in/gBjahFVW
If one of your 2024 goals is to eliminate time spent on data pipeline maintenance, allow us to help you 🙌
In our latest case study, we walk you through how Fundrise used dbt Cloud to achieve:
✅ >50% reduction in time to introduce new data sources
✅ 0 time spent on data pipeline maintenance
Get all the details, below ⬇️
Are you curious about Nasdaq’s journey from data complexity to analytics nirvana using dbt Cloud in AWS?
Then be sure to fit this panel into your AWS re:Invent itinerary ➡️
Monday, November 27: 8:30AM PT
https://lnkd.in/exfQx6ng #aws #reinvent
Over the past couple of years, dbt Labs has made dbt Cloud better and better. I am certainly in the 'Cloud' camp myself, but I have used dbt Core as well, and have found it to be a powerful tool.
Check out this article written by Ryan Murphy that details some of the key differences between Cloud and Core, as well as some of the situations where you might want one over the other.
#dbt #ClymOn #analyticsengineering #dbtcloud #dbtcore
dbt Labs recently released a slew of exciting new dbt Cloud features at #dbtCoalesce, making the question of dbt Core vs dbt Cloud even more pressing.
In this blog, Ryan Murphy, Cloud Data Engineer at Data Clymer, dives into the basics of dbt Core and dbt Cloud, and when and why dbt Cloud might be a better fit. ➡ https://lnkd.in/gMjMWCNz #dbt #datamodeling #dataengineering
Building a strong data quality framework is essential to ensure trust in your data and to catch issues before they impact stakeholders. In this blog, Matt Winkler outlines a great framework for testing data in dbt Labs' dbt Cloud, including CI/CD, unit testing, source and output testing, and more.
Read more: https://lnkd.in/ggx4DxQq
We’re excited to see the improvements dbt Labs have made to dbt Cloud!
Combining Fivetran and dbt is a lightweight, low-cost way to build a powerful data infrastructure for your startup or e-commerce business.
dbt Explorer will be an interesting addition — a move into the data catalog space, which is becoming increasingly important as businesses connect their data assets to LLMs.
——
Give a 👍 if you're an e-commerce company tired of spreadsheets and interested in scaling up your data infrastructure.
🚨 dbt Labs made some MAJOR new announcements today at Coalesce:
- dbt Mesh: domain-level ownership without compromising governance 🕸
- dbt Explorer: a 360-degree view of your dbt assets 🔎
- Cloud CLI: develop using any terminal, backed by dbt Cloud 👨💻
- dbt Semantic Layer: powered by MetricFlow, with new integrations 📊
- Microsoft Azure Synapse/Fabric 🤝 dbt Cloud adapter
What are you most excited about??
Check out Purple's strides with dbt Cloud ☁
20% less financial data inconsistency
60% fewer support requirements
80% quicker response times to data requests
Interested in dbt Cloud's benefits? Feel free to reach out or visit our website for a demo and to chat with the team.
Here's more about Purple's success 👇
This is great, Katie Bauer! It is as if we had met, because your vision for what you built is exactly what we did :)