
February 18, 2020

Becoming an Event-Driven Enterprise

Kevin Petrie


Events drive modern business. A website click, a credit card swipe, the turn of a gear – almost any event now can be digitized to create business value.

Enterprises that analyze events as they happen are more agile and compete more effectively than those that do not. Event-driven enterprises can increase revenue, reduce cost, and control risk.

This huge opportunity spurs action across the organization. IT teams build event streams, often using Kafka or Kafka-like commercial offerings, to reduce full-load, batch processing and improve efficiency. Business teams, meanwhile, seek to analyze those event streams. They want to capitalize on events more rapidly and devise new strategies.
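To ground that IT-side shift, here is a minimal sketch of publishing individual click events to a stream as they occur, rather than staging them for a nightly batch load. It assumes the open source kafka-python client and a local broker; the topic name and event fields are hypothetical.

```python
# Minimal sketch: publish website click events to a Kafka topic as they happen,
# instead of accumulating them for a batch load.
# Assumes the kafka-python client and a broker at localhost:9092 (hypothetical).
import json
import time

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

def publish_click(user_id: str, page: str) -> None:
    """Send one click event to a hypothetical 'website-clicks' topic."""
    event = {"user_id": user_id, "page": page, "ts": time.time()}
    producer.send("website-clicks", value=event)

publish_click("user-123", "/pricing")
producer.flush()  # ensure the event leaves the client before exiting
```

Downstream consumers can then analyze such events within seconds of the click, rather than hours later.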

But to become event-driven, enterprises face key architectural tradeoffs. Standard or custom software? Commercial or open source? Suite or best of breed? Automated or scripted? Truly event-driven enterprises will base their choices on a realistic assessment of their objectives, capabilities, and organizational maturity.

First let’s explore what event streaming looks like, in the form of real-world use cases.

  • Contextualization: A meteorologist interprets various event streams from satellites, covering temperature, humidity, pressure, wind speed, and precipitation. Each event stream adds a bit of context to his weather prediction.
  • Analysis: A gas pipeline manager monitors and automatically correlates streaming sensor signals about pipeline pressure and flow rates. Threshold-based alerts help her keep things running smoothly (see the sketch after this list).


  • Automated action: A credit card company’s machine learning software automatically identifies an unusual transaction, compares it to historical behavior, and alerts the card owner of potential fraud.
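To make the pipeline-monitoring example concrete, the sketch below applies simple threshold-based alerting to a stream of sensor readings consumed with kafka-python. The topic name, field names, and threshold values are assumptions for illustration, not an actual operator's configuration.

```python
# Minimal sketch: threshold-based alerts on streaming pipeline sensor readings.
# Topic name, field names, and thresholds are hypothetical.
import json

from kafka import KafkaConsumer

PRESSURE_MAX_PSI = 1200    # assumed operating limit
FLOW_MIN_M3_PER_H = 50     # assumed minimum acceptable flow rate

consumer = KafkaConsumer(
    "pipeline-sensors",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)

for message in consumer:     # blocks, yielding events as they arrive
    reading = message.value  # e.g. {"sensor": "P-17", "pressure": 1250, "flow": 64}
    if reading.get("pressure", 0) > PRESSURE_MAX_PSI:
        print(f"ALERT: {reading['sensor']} pressure {reading['pressure']} psi exceeds limit")
    if reading.get("flow", FLOW_MIN_M3_PER_H) < FLOW_MIN_M3_PER_H:
        print(f"ALERT: {reading['sensor']} flow {reading['flow']} m3/h below minimum")
```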

Adoption Trends

Most organizations have adopted event streaming, and a significant, growing share of them are piloting or deploying streaming analytics. Let’s dig deeper into what your peers are doing.

Data modernization. Event streaming is not an island unto itself. Organizations frequently budget for and execute event streaming projects as part of larger data modernization initiatives. They adapt legacy architectures to become faster, more efficient, more scalable, and more flexible than was previously possible. Data modernization initiatives might also include the migration of analytics workloads from mainframe systems to data lakes, NoSQL stores, and/or event streaming systems. Whatever the mix of projects, business teams and data teams that adopt event streaming usually need to manage a lot of interdependencies.

Shift to cloud. Each year data teams base a higher portion of their event streaming workloads on the cloud. They publish on-premises data to cloud-based event streaming platforms, migrate on-premises streaming workloads to the cloud, and launch new projects entirely on the cloud. As with other data initiatives, cloud platforms can improve economics by converting capital expenditure to operating expense. The cloud also improves operational flexibility, enabling IT teams to easily scale up or wind down infrastructure resources based on changing business requirements.


Machine learning (ML). Many organizations use ML software to automatically learn from and adapt to event streams. Data scientists use ML to design a stream processor before applying it to an event stream. They also use ML to have a stream processor learn from and adapt to event streams on the fly. If managed carefully, ML improves the accuracy of streaming analytics, and helps address entirely new streaming analytics use cases.
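As a loose illustration of designing a stream processor with ML before applying it to a stream, the sketch below fits a scikit-learn IsolationForest to historical transaction amounts offline, then scores new amounts as they arrive. The model choice, features, and figures are hypothetical, not the method of any particular card issuer.

```python
# Minimal sketch: train an anomaly detector offline, then score streaming events.
# Model choice, features, and data are hypothetical.
import numpy as np
from sklearn.ensemble import IsolationForest

# Offline step: fit on historical transaction amounts (one feature for simplicity).
history = np.array([[25.0], [40.0], [12.5], [60.0], [33.0], [18.0], [45.0]])
model = IsolationForest(random_state=42).fit(history)

def looks_anomalous(amount: float) -> bool:
    """Return True when the model labels the transaction an outlier (-1)."""
    return model.predict([[amount]])[0] == -1

# Online step: in practice these amounts would arrive from a stream consumer.
for amount in [30.0, 27.5, 5000.0]:
    if looks_anomalous(amount):
        print(f"Potential fraud: transaction of {amount} flagged for review")
```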

IoT edge processing. Organizations in industries such as manufacturing frequently analyze data streams from physical sensors that are attached to “things” in the so-called “Internet of Things” (IoT). By processing events near these sensors, at the “edge,” they can speed up results and address new IoT use cases.
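One common edge pattern, sketched below under assumed field names and thresholds, is to aggregate raw readings locally and forward only compact summaries plus out-of-range values upstream, which reduces bandwidth and latency.

```python
# Minimal sketch: aggregate sensor readings at the edge and forward only
# per-window summaries plus threshold breaches. The forwarding function is a
# stand-in for a real uplink (e.g. MQTT or Kafka); values are hypothetical.
from statistics import mean

TEMP_LIMIT_C = 90.0  # assumed alarm threshold

def forward_upstream(payload: dict) -> None:
    """Stand-in for publishing to a central event stream."""
    print("forwarding:", payload)

def process_window(readings: list) -> None:
    # Send one compact summary instead of every raw reading.
    forward_upstream({
        "count": len(readings),
        "mean_temp_c": round(mean(readings), 2),
        "max_temp_c": max(readings),
    })
    # Send individual readings only when they breach the threshold.
    for temp in readings:
        if temp > TEMP_LIMIT_C:
            forward_upstream({"alert": "over_temp", "temp_c": temp})

# Example: one window of simulated readings from an edge gateway.
process_window([71.2, 72.0, 95.3, 70.8])
```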

The Journey to the Event-Driven Enterprise

Becoming an event-driven enterprise is a journey, not a destination. The key is to set realistic goals and achieve them – not necessarily to reach for the stars.

Eckerson Group research has found that event streaming initiatives fall into one of two types, opportunistic or transformational, according to their level of customization. Business and data leaders can choose opportunistic or transformational initiatives based on the scope of their use case, their organizational maturity, and their readiness to incur risk.

Opportunistic Initiatives: Start with simpler projects that have lower cost and lower risk.

With an Opportunistic initiative, business and/or data leaders scope a relatively narrow, basic use case. These initiatives, often led by data teams within IT, target growing pains: is the organization struggling to meet demand for an existing analytics service? Perhaps that service is hobbled by legacy batch or SOA technology, and would benefit from a conversion to Kafka-based streaming analytics. Then they set a quantifiable business objective: increasing product revenue 5% through cross-selling, reducing production cost 2% via preventive maintenance, and so forth.

Comparison of Event Streaming Initiatives (Image Source: Eckerson Group)

With an Opportunistic initiative, data teams seek to minimize in-house development, maintenance, and troubleshooting. They use standard features of commercial solutions, or potentially vertical-specific commercial packages, to avoid customization. They take advantage of open source code, but only as part of commercially supported offerings. Opportunistic initiatives use suite offerings and fewer component types to keep architectures relatively homogeneous and therefore easier to manage. Both data teams and business users leverage automation to reduce manual scripting wherever possible.

Transformational Initiatives: Incur higher cost and higher risk in return for higher value.

Business teams typically lead transformational initiatives. They set strategic objectives, such as the creation of a new revenue-generating service, or a company-wide effort to reduce product defects by 5%. They commit resources to crafting sustainable competitive advantage. They define more complex streaming analytics use cases, often spanning multiple departments or the full enterprise. Alternatively, they focus on one highly specialized use case that offers strategic advantages.

With Transformational initiatives, organizations customize. They try new types of event sources, stream processors, and visualization offerings, favoring best-of-breed components over suites. They implement commercial offerings, but also favor specialized open source offerings even if they lack commercial support. They hire and train developer teams. They create custom self-service processors and visualization interfaces for business users. They embrace heterogeneous architectures for the sake of specialization.

Set Your Course Based on Your Starting Point

Larger enterprises with legacy technologies often start with opportunistic initiatives, then mature to transformational as they modernize their environments and acquire additional developer skills. If instead they leapfrog to transformational, they will need to consider significant upfront investments in specialized tools and staff.

Younger, “born in the cloud” organizations potentially can start with transformational initiatives more cost-effectively than large enterprises. They often have fewer legacy constraints, are more familiar with open source, and have more developer skills.

Either way, the organizations that first assess their capabilities and risk tolerance are most likely to move in the right direction on their event-driven journey.

About the author: Kevin Petrie is vice president of research at Eckerson Group. His report, “Streaming Analytics: Architecting for Real-Time Insights,” will be published in March.

Related Items:

Kafka Transforming Into ‘Event Streaming Database’

How Disney Built a Pipeline for Streaming Analytics

Can We Stop Doing ETL Yet?
