It is that time of year again–time for predictions! We start off the 2025 bonanza of forecasts, estimates, and prognostications with a topic that’s near and dear to our hearts here at BigDATAwire: data analytics.
The world has seen all sorts of patterns for analytics: data lakes, data warehouses, in-memory analytics, and embedded analytics. But in 2025, the standard for analytics will be the data lakehouse, says Emmanuel Darras, CEO and Co-founder of Kestra, developer of an open-source orchestration platform.
“By 2025, over half of all analytics workloads are expected to run on lakehouse architectures, driven by the cost savings and flexibility they offer,” Darras says. “Currently, companies are shifting from cloud data warehouses to lakehouses, not just to save money but to simplify data access patterns and reduce the need for duplicate data storage. Large organizations have reported savings of over 50%, a major win for those with significant data processing needs.”
One of the big drivers of the data lakehouse is the standardization of open data formats. That is a trend that will continue to build in 2025, predicts Adam Bellemare, principal technologist in the Technology Strategy Group at Confluent.
“Next year we will see a widespread standardization of open data formats, such as Apache Iceberg, Delta Lake, and Apache Hudi,” says Bellemare. “This will be driven by a greater demand for interoperability, with enterprises looking to seamlessly combine data across different platforms, partners, and vendors. As enterprises prioritize access to timely, high-quality data, open data formats will no longer be optional but imperative for businesses to succeed. Those who fail to embrace these open standards risk losing a competitive advantage, and those who adopt them will be able to deliver a high-quality offering and real-time, cross-platform data insights.”
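To make the interoperability point concrete, here is a minimal sketch in PySpark against Apache Iceberg. The catalog name, warehouse path, and table name are placeholders, and the session is assumed to already have the Iceberg Spark runtime on its classpath; the idea is simply that a table written once in an open format can be read by any engine that understands it.

```python
from pyspark.sql import SparkSession

# Assumes the Apache Iceberg Spark runtime is available; "demo", the
# warehouse path, and the table name below are illustrative placeholders.
spark = (
    SparkSession.builder
    .appName("open-format-sketch")
    .config("spark.sql.catalog.demo", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.demo.type", "hadoop")
    .config("spark.sql.catalog.demo.warehouse", "/tmp/iceberg-warehouse")
    .getOrCreate()
)

# Write a small dataset once, in an open table format...
events = spark.createDataFrame(
    [(1, "signup"), (2, "purchase")], ["user_id", "event"]
)
events.writeTo("demo.analytics.events").createOrReplace()

# ...and any engine that speaks Iceberg (Spark, Trino, Flink, and the major
# lakehouse platforms) can read the same files without copying the data.
spark.table("demo.analytics.events").show()
```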
Two of the biggest backers of the data lakehouse are Snowflake and Databricks. But in 2025, people will tire of the Snowflake/Databricks war and look to federated IT for an evolved data architecture, says Andrew Madson, a technical evangelist at Dremio and professor of data and analytics at Southern New Hampshire and Grand Canyon universities.
“Central IT teams will continue decentralizing responsibilities to business units, creating more federated operating models,” Madson says. “Meanwhile, monolithic architectures from major vendors like Snowflake and Databricks will integrate additional tools aimed at improving cost-efficiency and performance, creating hybrid ecosystems that balance innovation and practicality.”
Data modeling has wallowed in relative obscurity for years. In 2025, the practice will have its moment in the sun, says Adi Polak, Confluent’s director of advocacy and developer experience engineering.
“Data modeling has long been the domain of DBAs (database administrators), but with the increased adoption of open table formats like Apache Iceberg, data modeling is a skill that more engineers need to master,” Polak says. “For application development, engineers are increasingly tasked with creating reusable data products, supporting both real-time and batch workloads while anticipating downstream consumption patterns. To build these data products effectively, engineers must understand how data will be used and design the right structure, or model, that’s suitable for consumption, early on. That’s why data modeling will be an essential skill for engineers to master in the coming year.”
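As a hypothetical illustration of that kind of up-front modeling, the sketch below defines an Iceberg table whose schema and partitioning are chosen around how downstream consumers will query it rather than how the source system happens to emit it. The catalog, namespace, and column names are invented for this example, and an Iceberg-enabled Spark session like the one configured earlier is assumed.

```python
from pyspark.sql import SparkSession

# Assumes an Iceberg-enabled Spark session; every identifier is illustrative.
spark = SparkSession.builder.getOrCreate()

spark.sql("""
    CREATE TABLE IF NOT EXISTS demo.analytics.orders (
        order_id     BIGINT,
        customer_id  BIGINT,
        order_ts     TIMESTAMP,
        status       STRING,
        total_amount DECIMAL(12, 2)
    )
    USING iceberg
    PARTITIONED BY (days(order_ts))  -- most consumers filter by day
""")
```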
There’s one topic that will be impossible to avoid in 2025: AI (yes, we’ll have an AI 2025 predictions piece soon). AI’s impact will be felt everywhere, including the data analytics stack, says Christian Buckner, SVP of analytics and IoT at Altair.
“Today, many business leaders struggle with knowing what questions to ask their data or where to find the answers,” Buckner says. “AI agents are changing that by automatically delivering insights and recommendations, without the need for anyone to ask. This level of automation will be crucial for helping organizations unlock deeper understanding and connections within their data and empowering them to make more strategic decisions for business advantage. It’s important for businesses to establish guardrails to control AI-driven suggestions and maintain trust in the results.”
The word “analytics” used to conjure images of someone firing up a desktop BI tool to work with a slice of data from the warehouse. My, times have changed. According to Sisense CEO Ariel Katz, 2025 will bring about the demise of traditional BI, which will be replaced with API-first and GenAI-integrated analytics in every app.
“In 2025, traditional BI tools will become obsolete, as API-first architectures and GenAI seamlessly embed real-time analytics into every application,” Katz says. “Data insights will flow directly into CRMs, productivity platforms, and customer tools, empowering employees at all levels to make data-driven decisions instantly–no technical expertise needed. Companies that embrace this shift will unlock unprecedented productivity and customer experiences, leaving static dashboards and siloed systems in the dust.”
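What might “API-first” analytics look like in practice? Here is one deliberately simple sketch, using FastAPI with made-up endpoint and metric names, of analytics exposed as a service that a CRM or productivity app can call rather than a dashboard a user has to open. This is an illustration of the pattern, not Sisense’s product API.

```python
from fastapi import FastAPI

app = FastAPI()

# Stand-in for a query against a lakehouse or warehouse; in a real service
# these values would be computed from live data.
_METRICS = {"weekly_active_users": 48210, "churn_rate": 0.031}

@app.get("/metrics/{name}")
def get_metric(name: str) -> dict:
    """Return the latest value of a named metric so a host app can embed it."""
    return {"metric": name, "value": _METRICS.get(name)}
```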
Big data was big because–well, it just was (trust us). But in 2025, the big data movement will open a new chapter by welcoming a relative of big data called small data, predicts Francois Ajenstat, the Chief Product Officer at Amplitude.
“The past few years have seen a rise in data volumes, but 2025 will bring the focus from ‘big data’ to ‘small data,’” Ajenstat says. “We’re already seeing this mindset shift with large language models giving way to small language models. Organizations are realizing they don’t need to bring all their data to solve a problem or complete an initiative–they need to bring the right data. The overwhelming abundance of data, often referred to as the ‘data swamp,’ has made it harder to extract meaningful insights. By focusing on more targeted, higher-quality data–or the ‘data pond’–organizations can ensure data trust and precision. This shift towards smaller, more relevant data will help speed up analysis timelines, get more people using data, and drive greater ROI from data investments.”
It’s always been cool to have high-quality data. But in 2025, having high-quality data will become a business imperative, says Rajan Goyal, the CEO and co-founder of DataPelago.
“We’re seeing growing reports that LLM providers are struggling with model slowdown, and AI’s scaling law is increasingly being questioned,” Goyal says. “As this trend continues, it will become accepted knowledge next year that the key to developing, training and fine-tuning more effective AI models is no longer more data but better data. In particular, high-quality contextual data that aligns with a model’s intended use case will be key. Beyond just the model developers, this trend will place a greater onus on the end customers who possess most of this data to modernize their data management architectures for today’s AI requirements so they can effectively fine-tune models and fuel RAG workloads.”
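To put the “better data, not more data” point in concrete terms, the toy sketch below filters a corpus down to records that pass some basic quality checks before they are used for fine-tuning or indexed for RAG. The fields, thresholds, and sample documents are entirely made up for illustration.

```python
# Toy quality filter: keep only documents with some substance, known
# provenance, and the language the model is actually intended for.
def is_high_quality(doc: dict) -> bool:
    text = doc.get("text", "")
    return (
        len(text.split()) >= 5              # threshold is arbitrary
        and bool(doc.get("source_verified"))
        and doc.get("language") == "en"
    )

corpus = [
    {"text": "A detailed support ticket describing the failure mode in full.",
     "source_verified": True, "language": "en"},
    {"text": "buy now!!!", "source_verified": False, "language": "en"},
]

# Only curated documents move on to fine-tuning or the RAG index.
curated = [doc for doc in corpus if is_high_quality(doc)]
```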
Data silos are like mushrooms: They appear naturally without any human input. But in 2025, businesses will need to get on top of the growth of data silos if they want to succeed, says Molly Presley, the SVP of global marketing for Hammerspace.
“In 2025, breaking down data silos will emerge as a critical architectural concern for data engineers and AI architects,” Presley writes. “The ability to aggregate and unify disparate data sets across organizations will be essential for driving advanced analytics, AI, and machine learning initiatives. As the volume and diversity of data sources continue to grow, overcoming these silos will be crucial for enabling the holistic insights and decision-making that modern AI systems demand.”
Managing user access to data sometimes feels like everything everywhere all at once. Instead of fighting that worker- and data-sprawl, teams in 2025 will learn how to more effectively harness tools like streaming data to make themselves more productive, predicts Arcitecta CEO Jason Lohrey.
“The rise of remote work and geographically distributed teams has changed how businesses operate,” Lohrey says. “Real-time data streaming allows organizations to record events and share live feeds globally, enabling employees to collaborate on continuous data streams without needing to be physically present. This trend will likely accelerate in 2025 as more companies adopt tools that facilitate seamless broadcasting and data distribution. By enabling real-time collaboration across a distributed workforce, businesses can reduce travel costs, increase efficiency, and make quicker, more informed decisions. The global reach of data streaming technology will expand, allowing organizations to tap into a wider talent pool and create more dynamic and flexible operational structures.”
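As a generic illustration of the pattern Lohrey describes (not Arcitecta’s own platform), the sketch below publishes events to a shared Apache Kafka topic that distributed teammates can consume as a live feed. The broker address, topic name, and payload are placeholders.

```python
import json

from kafka import KafkaProducer  # kafka-python client

# Broker address, topic name, and payload are placeholders for illustration.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Each site records events to a shared topic; remote colleagues subscribe to
# the same topic and see the feed in near real time.
producer.send("instrument-events", {"site": "sydney", "reading": 42.7})
producer.flush()
```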