The Oxford Learner’s Dictionaries define “ad hoc” as “arranged or happening when necessary and not planned in advance”, giving examples such as “an ad hoc meeting to deal with the problem” and “meetings held on an ad hoc basis” (https://www.oxfordlearnersdictionaries.com/us/definition/english/ad-hoc_1).
When I first entered the IT workforce, the term “ad-hoc” in the data world referred to performing a task or creating a simple report that was custom or outside the usual, established process for generating data. Sometimes those custom requests were initiated outside the regular boundaries, where the visual aspect of data presentation was low on the list of priorities and your teammates simply wanted “the numbers.”
Other times, the looseness or flexibility was in the toolset used to generate those data results. People already relied on automated or event-driven reports, but they wanted more flexibility and more options to get different outcomes. You felt empowered as a data professional to fulfill those ad-hoc data requests, or knowledgeable enough to adjust an existing standard reporting solution on the fly; otherwise, you might have seemed less adaptable.
Coming back to the modern day, it has become remarkably easy, both in terms of time and required skill, to answer ad-hoc data inquiries using publicly available datasets and conversational AI tools. In the corporate enterprise world, various LLM (Large Language Model) powered applications now allow employees to pick that cherry on top of the large data cake with the ease of a simple conversation: “I need this…” or “Can I get that summary…?”
A few moments later, the response appears with the expected results. Data validation might be another story, of course, but the point is that it is here, it is now, and you no longer need a middle person or a middle team to deliver it.
But how does it feel now when you, as a data professional or a member of a data team, are asked to handle a custom data request? Is there an acknowledgment of your expertise and familiarity with the available datasets? Maybe. Is there a sense of trust in your technical skills? Probably. And perhaps there is a good working relationship that allows others to wait a little, though not too long, for the results before they decide to dive into the data world themselves to get what they need.
Those are all positive signs. But what are the risks? They’re not difficult to see. Your expertise and familiarity may gradually fade as the volume and refresh rate of internal corporate datasets continue to grow, along with the supplementary data points offered by third-party providers. Trust in your technical skills may start to feel more like a myth, especially in an environment where modern data platforms are properly configured and all critical datasets are interconnected. In such a world, the conversational approach to data inquiries could easily become the new mainstream.
So what can I do without falling into the abyss of self-doubt and worry? I think one of the things we, as data professionals, can do while learning and practicing new technological tools is to continue deepening our knowledge of proper data modelling techniques, where data quality and clear business domain definitions simplify AI efforts to process requests and return the expected results. Knowledge and practice of modern composable or modular data architecture can also be handy: treating data as a product and emphasizing reusable components, like metrics or semantic layers, enables faster adoption of AI tools. All of this helps center design on business outcomes while keeping it tightly integrated with data governance.
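To make “metrics as reusable components” a bit more concrete, here is a minimal Python sketch of the idea behind a semantic layer: a business metric is defined once, with its plain-language definition attached, and every consumer (a dashboard, an AI assistant, an ad-hoc script) renders it the same way. All names, tables, and columns here are hypothetical, and a real semantic layer (dbt, Cube, and the like) would carry far more metadata; this only illustrates the principle.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Metric:
    """One named, governed unit of business logic, defined exactly once."""
    name: str
    expression: str   # aggregation expression over a governed dataset
    source: str       # the dataset (business domain) the metric lives on
    description: str  # plain-language definition, useful to humans and AI alike

    def to_sql(self, group_by: str) -> str:
        """Render the metric as a query so every consumer computes it identically."""
        return (
            f"SELECT {group_by}, {self.expression} AS {self.name} "
            f"FROM {self.source} GROUP BY {group_by}"
        )


# Hypothetical metric definition; names are illustrative only.
monthly_revenue = Metric(
    name="monthly_revenue",
    expression="SUM(amount)",
    source="sales.orders",
    description="Total order amount, summed per grouping period.",
)

print(monthly_revenue.to_sql(group_by="order_month"))
# SELECT order_month, SUM(amount) AS monthly_revenue FROM sales.orders GROUP BY order_month
```

The point of the design is that the definition of “revenue” is no longer buried in a dozen report queries: it lives in one governed place, which is exactly what makes both human ad-hoc work and AI-generated answers consistent.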
Along with that, we can still strengthen our human connections and relationships with other teams by working together, where words like “please” and “thank you” won’t result in higher electricity bills for the machines to process. Why? Because we’re humans, and we care 😊