The SentryOne team has kicked off 2020 with changes to our product portfolio that aren’t just enhancements. This evolution includes the introduction of new options for documenting and mapping your data estate with SentryOne Document and new functionality in our database performance monitoring solution, SQL Sentry. It’s also the foundation of a new approach to managing data—Intelligent DataOps—that transforms businesses by elevating the importance of data over the underlying business processes and technology.
Intelligent DataOps—building the people, processes, and technology for a data-driven culture—is core to SentryOne’s stated purpose of helping improve the quality of life for data professionals. And we’ve noticed that a well-functioning DataOps culture empowers our customers to rein in data chaos, monetize data, realize business value—even drive cloud migration.
DataOps Is More Than DevOps for Data
DataOps is still emerging and often miscategorized as “DevOps for data.” While DataOps and DevOps certainly share some common ground—most notably collaboration among people to improve outcomes—DataOps focuses on data. Today, data is arguably the most critical asset for any business, and certainly the most critical asset for every application.
Data is the critical element that guides a company’s course, from how to improve success to whether to stay in a particular market. DataOps helps companies get the data right—and on time. As Nick Heudecker of Gartner framed it in his “Hyping DataOps” blog post:
DataOps uses technology to automate data delivery with the appropriate levels of security, quality, and metadata to improve the use and value of data in a dynamic environment.
I simplify by saying that DataOps brings together all the data and data assets with all the stakeholders, across the entire data pipeline. That’s a lot to manage. Data pros need what SentryOne has dubbed Intelligent DataOps: a platform that informs, forecasts, recommends, and auto-tunes. This approach helps customers build a DataOps practice so they can systematically and predictably increase the value of their business-critical data.
Our Intelligent DataOps Journey
What do we at SentryOne mean when we talk about Intelligent DataOps?
At the core, it’s analytics for DataOps—rather than DataOps for analytics. Our focus with every solution we offer is bringing together the people, processes, and technology to improve the value of the data.
Here are some key takeaways we’ve gathered on this journey:
- Observability across the entire data pipeline is critical. Optimizing observability starts with building in data pipeline performance by design and by default. As part of that process, your development teams need to monitor and tune database applications during development and testing, before they’re released to production. This is more than introducing monitoring (one-directional oversight) into your entire data pipeline. It’s using the intelligence gained from monitoring to advise on performance and best practices (bi-directional integration). I like to think of this as an observability contract or pact. As data teams mature and apply predictive analytics to monitoring, they create amplified value from Intelligent DataOps.
- Process observability is paramount. Good communication throughout an organization’s processes is critical—but exhausting. Intelligent DataOps practices are observable: they are intuitive, standardized, and transparent. And technology (collaboration software, reporting and analytics, etc.) can be applied to observable processes to help encourage engagement among teams.
- Data testing is not optional. Every application is a data-centric application, and data is the most volatile component in every app. An application is never truly tested until it’s tested with your wildest-possible data. Automated, integrated data testing addresses this common gap in many data pipelines—think of it as monitoring for your data (a minimal sketch of such a check follows this list). And when it comes to data science projects, if you are using untested data, your project is useless—you cannot build and train a model on bad data.
- Mapping your data estate is essential. In a fully optimized DataOps environment, all data is accounted for and has a home. Your data underpins all your business decisions—and you are bound by law to meet data privacy regulations. So, you need a reliable map of where your data lives, where it originated, and where it ends up. (“Wow, social security numbers are included in over 200 reports!”) Automated database documentation and data lineage analysis help data teams check these boxes.
- Accessible data starts with optimized performance. In a well-oiled data pipeline, relational database management systems (RDBMS) provide the structure that’s necessary for continuous integration/continuous delivery (CI/CD). Continuous monitoring of your RDBMS—observability across the data environment—improves data delivery to stakeholders, end users, and customers.
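To make the data-testing idea concrete, here’s a minimal sketch of what an automated data check might look like as a step in a build pipeline. It uses SQLite purely for illustration, and the database file, table, and column names (warehouse_copy.db, customers, email, signup_date) are hypothetical; a real pipeline would point similar assertions at a test copy of your own RDBMS.

```python
# A minimal, hypothetical data test: fail the build if the data violates
# basic expectations. SQLite is used only so the sketch is self-contained.
import sqlite3

def check_customers_table(conn):
    """Run a few basic data-quality assertions against the customers table."""
    cur = conn.cursor()

    # No duplicate primary keys.
    cur.execute("SELECT COUNT(*) - COUNT(DISTINCT customer_id) FROM customers")
    assert cur.fetchone()[0] == 0, "duplicate customer_id values found"

    # Required columns must not contain NULLs.
    cur.execute("SELECT COUNT(*) FROM customers WHERE email IS NULL")
    assert cur.fetchone()[0] == 0, "NULL emails found"

    # Dates must fall in a plausible range (no future signups).
    cur.execute("SELECT COUNT(*) FROM customers WHERE signup_date > DATE('now')")
    assert cur.fetchone()[0] == 0, "signup dates in the future"

if __name__ == "__main__":
    with sqlite3.connect("warehouse_copy.db") as conn:  # hypothetical test copy
        check_customers_table(conn)
        print("all data checks passed")
```

Run as part of continuous integration, checks like these catch bad data before it reaches production reports or model training, which is exactly the gap the data-testing principle above describes.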
Building your DataOps practice to encompass these principles can help your organization get more value from your data, both by monetizing the data and by using it to evolve the business.
A Data-Driven Culture Demands Intelligent DataOps
Data is your primary currency, and building a data-driven culture requires some soul-searching about the status of data in your organization. Can your users get to the data they need? Is your data trustworthy? Is your data delivered on time?
As you adopt DataOps and then upgrade to Intelligent DataOps, you’ll see better alignment between your data and DevOps teams. You can start to rein in chaos that devalues the role of data in your organization. By focusing on the ecosystem—the people, processes, and technology—surrounding your data estate, you can build an Intelligent DataOps environment that actualizes data value, forms the foundation of a data-driven culture, and promotes peace between your data and dev teams.
Curious about how SentryOne is manifesting the principles of Intelligent DataOps today? Contact me and let’s start the conversation.