
Business at the Speed of Now

Driving business results with intelligent analytics and decision-making using real-time streaming data.

What is Intelligent Streaming Analytics?

Customers of all types have ever-increasing expectations for personalization and immediate response.

Modern business operations of all kinds are becoming digital and smart. They can be highly agile and adaptive in a dynamic world – one enabled and fueled by data, AI, cloud computing, digital networks, and relationships. Accordingly, business leaders are looking for better results and competitiveness in today’s constantly changing market. With the growing number of connected devices, the adoption of 5G, and the explosion of big data – along with technology-enabled ways to leverage data in real time to power AI-driven models, workflows, and rules engines – we can now, more than ever, run data-driven organizations in real time. It is time to adopt the new NOW management thinking, state-of-the-art capabilities, and AI that will get your organization doing business at the speed of now (or maybe even at the speed of tomorrow!).

Let’s start with some basics and then look at some examples and RCG’s framework to identify and execute real-time streaming intelligent analytics.

Streaming data processing is a big data technology. The data is generated continuously by thousands of data sources, which typically transmit data records simultaneously in small sizes (think kilobytes). There are many types of streaming data, including log files generated by mobile and web applications, eCommerce purchases, in-game player activity, information from social networks, financial trading floors, and geospatial services, and telemetry from connected devices (IoT) or instrumentation in data centers.

Streaming data analytics goes by many other names: real-time analytics, streaming analytics, complex event processing, real-time streaming analytics, and event processing. At RCG, we call it intelligent streaming analytics. The name reflects the vital pairing of streaming data with machine learning and other rules engines: a continuous data stream feeds decision models and rules engines that detect conditions and anomalies from the moment the data is received, helping run the business at the speed of now.

For example, with intelligent streaming analytics, you can feed a credit card fraud detection machine learning model, receive an alert, initiate a bot to create a case and a workflow, and decide immediately what to do – without human intervention, or by handing off relevant data to a human for a specific action.
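To make this concrete, here is a minimal sketch of what the scoring step might look like, assuming a Kafka topic of card transactions and a pre-trained scikit-learn model. The topic name, feature fields, model file, and alert threshold are all illustrative assumptions, not a reference implementation.

```python
# Minimal sketch: score each arriving card transaction against a pre-trained
# fraud model and trigger a follow-up action when the risk is high.
import json
import pickle

from kafka import KafkaConsumer  # pip install kafka-python

# Hypothetical pre-trained scikit-learn classifier.
with open("fraud_model.pkl", "rb") as f:
    model = pickle.load(f)

consumer = KafkaConsumer(
    "card-transactions",                          # assumed topic name
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    txn = message.value
    # Assumed feature fields; a real model would use an agreed feature schema.
    features = [[txn["amount"], txn["merchant_risk"], txn["velocity_1h"]]]
    p_fraud = model.predict_proba(features)[0][1]
    if p_fraud > 0.9:
        # In production this step would open a case or hand off to an analyst.
        print(f"ALERT: transaction {txn['id']} flagged (p_fraud={p_fraud:.2f})")
```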

What does Intelligence mean?

This type of real-time streaming analytics works with data flows, feeding decision and analytics models with tasks that include correlations, aggregations, filtering, and sampling, which together provide a line of sight into many aspects of business operations and customer activity.

This enables timely decision-making and proactive – or at least highly responsive – intervention in emerging scenarios. A typical goal is to deliver up-to-date information and keep the state of data current for users and automation.
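As a simple illustration of those streaming tasks, the sketch below filters out bad readings and keeps per-sensor averages over tumbling windows in plain Python. The event shape and the 60-second window are assumptions made for the example.

```python
# Minimal sketch: filtering plus windowed aggregation over a stream of events.
# Events are assumed to arrive roughly in time order, each shaped like
# {"ts": epoch_seconds, "sensor": str, "value": float}.
from collections import defaultdict

def windowed_averages(events, window_seconds=60):
    """Yield (window_start, sensor_id, average_value) as each window closes."""
    buckets = defaultdict(list)
    current_window = None
    for event in events:
        if event["value"] is None:          # filtering: drop bad readings
            continue
        window = int(event["ts"] // window_seconds) * window_seconds
        if current_window is not None and window != current_window:
            for sensor, values in buckets.items():  # aggregation: close window
                yield current_window, sensor, sum(values) / len(values)
            buckets.clear()
        current_window = window
        buckets[event["sensor"]].append(event["value"])
```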

That said, it is now possible to use even more complex AI and machine learning models that run in real time or near-real time. From live video processing to real-time prediction using machine learning techniques suited to the task (like gradient boosted trees), we can create and train mathematical models in ways that reduce them to compact vector representations. The importance of a speedy decision (such as using computer vision to identify a suspect at an airport) then drives the business case for how much computing power (and cost) can be devoted to executing a real-time, performant solution. This approach effectively creates the possibility of analytics as a service with real-time, responsive intelligence.
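For a feel of why techniques like gradient boosted trees fit this pattern, here is a rough sketch, using purely synthetic data, of training a model offline and then scoring a single arriving record with the low latency a streaming path requires.

```python
# Minimal sketch: train a gradient boosted tree model in batch, then score
# one record at a time as it would arrive on a stream. Data is synthetic.
import time

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 8))                # 8 illustrative features
y = (X[:, 0] + X[:, 3] > 0).astype(int)         # synthetic label

model = GradientBoostingClassifier().fit(X, y)  # trained offline (batch)

record = rng.normal(size=(1, 8))                # one arriving event
start = time.perf_counter()
score = model.predict_proba(record)[0, 1]       # scored online (stream)
elapsed_ms = (time.perf_counter() - start) * 1000
print(f"score={score:.3f} in {elapsed_ms:.2f} ms")
```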

One simple example: a firm can continuously analyze social media streams to track real-time changes in customer sentiment about its experience, products, services, and brands, and respond in a timely fashion in a way that makes sense for the business.

Intelligence, in our view, comes both from the “smarter decisions and actions” a firm will take and from the potential of artificial intelligence to make the use case execution “smart.”

The Best of Both Worlds: the benefits of a hybrid model with both streaming and batch processing

Many organizations are building a hybrid model by combining both streaming data processing and batch processing approaches.

A hybrid model leverages the best of both real-time data and aggregate data analytics with AI model development. This is done by maintaining a real-time data layer and a batch data layer. Data is processed through a streaming platform to enable real-time insights, recommendations, and actions via rules engines and machine learning models. The information is then stored so that it can be further transformed and evaluated in additional use cases.

At RCG, we execute this type of model quite frequently. We enable our clients with best-of-breed architectural and data engineering solutions to handle the real-time (high volume, low latency) requirements for data management and data science.

One common approach is to enable a “fork,” where an analytics result is delivered from the stream in real time and the data and results are then queued up for batch processing.
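A minimal sketch of that fork is below, assuming a simple threshold rule on the real-time leg and a local JSON-lines file standing in for the batch store; in practice the batch leg usually lands in object storage or a data lake.

```python
# Minimal sketch: each event is evaluated in real time AND persisted for
# later batch processing. The rule and file are illustrative assumptions.
import json

def handle_event(event, batch_file, threshold=100.0):
    # Real-time leg: immediate rule evaluation on the live stream.
    if event["value"] > threshold:
        print(f"real-time alert: event {event['id']} value={event['value']}")
    # Batch leg: persist the raw event for model training and deeper analysis.
    batch_file.write(json.dumps(event) + "\n")

with open("events.jsonl", "a") as batch_file:
    for event in [{"id": 1, "value": 42.0}, {"id": 2, "value": 120.5}]:
        handle_event(event, batch_file)
```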

Amazon Web Services provides a concise comparison of batch and streaming data processing. To help in understanding RCG’s hybrid framework, consider these differences between stream and batch processing.

Batch processing can be used to compute arbitrary queries over different sets of data. It usually calculates results derived from all the data it encompasses and enables in-depth analysis of big data sets. In contrast, stream processing ingests a sequence of data and incrementally updates metrics, reports, and summary statistics in response to each arriving data record. It is better suited for real-time monitoring and response functions.
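The sketch below illustrates that contrast in a few lines of Python: the batch style recomputes a metric over the whole data set, while the streaming style updates the same metric incrementally as each record arrives.

```python
# Minimal sketch: batch recomputation versus incremental (streaming) update.
class RunningMean:
    """Incrementally maintained mean -- the streaming style."""
    def __init__(self):
        self.count = 0
        self.total = 0.0

    def update(self, value):
        self.count += 1
        self.total += value
        return self.total / self.count  # metric is current after every record

data = [3.0, 5.0, 4.0, 8.0]
batch_mean = sum(data) / len(data)      # batch style: full pass over all data

stream_mean = RunningMean()
for value in data:                      # streaming style: record by record
    latest = stream_mean.update(value)

assert latest == batch_mean
```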

Other considerations about streaming versus batch data processing include:

Streaming is a better fit for continuous time-series data.

Some data naturally arrives as a continuous, never-ending stream of events – readings from machines, devices, and other sensors (IoT), or activity on a website – and naturally fits with time-series data and detecting patterns over time. Streaming handles these never-ending data flows. Identifying sequences and time patterns is difficult in batch processing, where the data must be split across multiple batches. Stream processing is therefore a natural fit for detecting patterns across a variety of use cases, even across multiple data streams at once.
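Here is a small illustration of time-pattern detection on such a stream: flagging a sensor after three consecutive readings above a threshold. The threshold and run length are arbitrary choices for the example.

```python
# Minimal sketch: detect a time pattern (a run of high readings) per sensor
# on a never-ending, time-ordered stream of (sensor, value) pairs.
from collections import defaultdict

def detect_runs(readings, threshold=75.0, run_length=3):
    streak = defaultdict(int)           # per-sensor consecutive-high counter
    for sensor, value in readings:
        streak[sensor] = streak[sensor] + 1 if value > threshold else 0
        if streak[sensor] == run_length:
            yield sensor                # pattern detected

stream = [("a", 80), ("b", 20), ("a", 90), ("a", 85), ("b", 95)]
print(list(detect_runs(stream)))  # -> ['a']
```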

Sometimes the data is too large to store easily.

Stream processing lets you handle large volumes of data while retaining only the useful elements. As a result, stream processing can often work with far less hardware than batch processing.

Streaming data processing helps you forge new insight frontiers.

There is a lot of under-explored and under-utilized streaming data available (e.g., customer transactions, activities, website visits). This type of data will continue to accumulate at increasing rates with more sophisticated digital interactions and more IoT use cases (sensors everywhere). The value and potential impact of the data will only increase over time.

Streaming for approximate results.

Stream processing also enables approximate query processing through systematic load shedding. This makes it well suited to situations where approximate answers are sufficient to do the analytics job.
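One common way to shed load is to keep a fixed-size random sample of the stream and answer queries approximately from it. The sketch below uses classic reservoir sampling; the sample size is an illustrative assumption.

```python
# Minimal sketch: reservoir sampling keeps a uniform random sample of fixed
# size from a stream of unknown length, enabling approximate answers.
import random

def reservoir_sample(stream, k=1000):
    sample = []
    for i, item in enumerate(stream):
        if i < k:
            sample.append(item)          # fill the reservoir first
        else:
            j = random.randint(0, i)     # replace with decreasing probability
            if j < k:
                sample[j] = item
    return sample

values = (x * 0.5 for x in range(1_000_000))  # stand-in for a live stream
sample = reservoir_sample(values, k=1000)
approx_mean = sum(sample) / len(sample)       # approximate, not exact
print(f"approximate mean: {approx_mean:.1f}")
```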

While stream processing enables many exceptional opportunities, it is not suited to every use case. For example, if processing and modeling require multiple passes over the full data set to train a model, then batch processing will be needed. Thus, the hybrid model provides a more robust approach to making the most of your streaming data assets.

Find the Best Use Cases: practical applications in business


In general, stream processing is useful in use cases where:

  • a problem or trend can be detected;
  • there is an excellent opportunity to improve the outcome or make a decision in real time; and
  • speed of intervention and decision-making is essential for operations happening in milliseconds.

Business outcomes from smart operations include lowering operating costs, lowering inventory levels, improving product quality or service, improving customer service, decreasing time to market, lowering capital costs, reducing operational risk, and increasing innovation like the development of usage-based business models.

Additionally, better customer management can yield better insight for decision-making, reduced churn, optimized marketing investment, better targeting and segmentation, and delivering better overall customer experience.

While the opportunities indeed are endless, here are some examples:

  • Financial
    • Transaction processing, market/currency state monitoring
    • Algorithmic Trading, Stock Market Surveillance
      A financial institution tracks changes in the stock market in real-time, computes value-at-risk, and automatically rebalances portfolios based on stock price movements.
  • Healthcare
    • Real-time monitoring of health-conditions, clinical risk-assessment, patient-state analysis, and alerts;
    • Smart Patient Care
  • Insurance
    • Real-time claims processing
    • Real-time underwriting
  • Consumer, Retail, and Marketing
    • Customer location intelligence
    • Context-aware promotions and advertising
    • Proximity marketing
    • Advertising optimization
    • Real-time customer churn prediction and intervention
    • Retail/customer service: customer behavior analysis and operations improvement
  • Operations, Manufacturing, and Industrial
    • Production line monitoring
    • Intrusion and Fraud Detection (e.g., Uber, Credit Card, Healthcare, etc.)
    • Operating heavy machinery and transportation fleets (sourcing data streams from sensors and IoT devices)
    • Predictive maintenance
    • Sensors in vehicles, industrial equipment, and farm machinery send data to a streaming application. The application monitors performance and detects any potential defects in advance.
    • Manufacturing/supply chain: real-time monitoring, predictive maintenance, disruption/risk assessment
  • Information Technology/Systems
    • Computer system and network monitoring
    • Geospatial data processing
    • Any kind of real-time data analysis like fraud detection or system maintenance.
  • Supply Chain
    • Predictive inventory: a spare-part order is placed automatically to prevent equipment downtime
    • Merchandising optimization
  • Other
    • Social distance capacity alert and management
    • Traffic Monitoring, Geofencing, Vehicle, and Wildlife tracking
    • Anomaly detection (e.g., power theft, medical care, etc.)

RCG’s Seven-Step Framework for accelerating real-time streaming intelligent analytics use cases

With an understanding of the opportunity and some types of use cases, let’s explore how to build a portfolio of opportunities for the use of streaming data with machine learning.

RCG suggests an approach and framework to help identify, understand, communicate, and prioritize opportunities. We tackle opportunities where the stream data has a source and a destination, and we can achieve positive business outcomes.

Step 1: Identify/Inventory: Look for the Sources of Gold

  • Start by assessing the inventory of areas that are prime for streaming.
  • Identify ideal sources that generate high volumes of data and require low-latency response.
  • Understand whether the target data are synchronous or asynchronous.

Step 2: Conduct an operational data intensity assessment and scoring

  • Evaluate the volume and value of actionable data generated in a specific business function or process using a data intensity scoring methodology (such as RCG’s)1, considering the inventory from Step 1.
  • Assess and score to measure the quantity of actionable and impactful data. Business analysis helps identify where there may be greater opportunities for advanced analytics and AI.
  • Consider key drivers impacting data intensity such as cost basis, employee basis, product/process basis, and network basis.

Step 3: Model the Business and Financial Impact

  • Look at the business and financial model implications and opportunities to formulate essential questions related to the desired business outcome.
  • Use the Business Model Canvas2 and the RCG high-performing organization model3 to frame the idea.
  • Prioritize your critical questions based on value and complexity. Portfolio and value management tools can be helpful here.
  • Leverage integrative frameworks for both the customer data and the customer experience journey, such as the customer experience DNA (CxDNA)4. These can help you understand the steps and actions the organization needs to take to manage the customer experience across the critical areas of marketing, demand generation, sales enablement, customer care, and customer support.
  • Analyze the enterprise ecosystem of operations across these core stages: sourcing, making, storing, routing, and delivering. By looking at the jobs to be done in each stage, the potential use cases and value opportunities become discoverable.

Step 4: Develop Specific Use Cases

  • Determine the streaming data opportunity and real-time opportunity.
  • Consider the downstream batch opportunity in addition to the streaming opportunity.
  • Consider the potential for modeling specific issues/problems or questions (e.g., what is the likelihood of customer churn?) and the potential for AI to generate a course of action.

Step 5: Frame the advanced analytics and AI opportunity in the AI canvas

  • Use the AI canvas framework for framing your AI business case on one page (seven essential questions, the business problem, and the value that the initiative will generate).
  • For AI models, define how the model will be set up and how its performance will be improved over time.
  • The customer value proposition describes how customers will benefit. What is the economic value that will be created?
  • Consider the data strategy – a vital technical consideration covering data collection, analysis, cleansing, and wrangling. As much as 90% of the effort in an AI initiative goes to data cleansing and wrangling.

Step 6: Create your execution plan, which should include Proof of Concept and testing

  • Create both a plan and portfolio of opportunities.
  • Align cross-functional teams with clear business sponsorship around implementation.

Step 7: Realize the results and monitor value creation and time to value

  • Implement your plan against pre-set objectives or outcomes with an emphasis on time-to-value.
  • Consistently measure the impact and success of your analytics.
  • Use portfolio and value management tools to drive for continuous improvement as a data-driven learning organization.

The opportunity to run the business at the speed of now is here.

High-performing organizations will utilize streaming data, analytics, and AI to enable a data-driven learning organization.

RCG has extensive experience in all the components necessary to help you develop your real-time intelligent analytics muscle. From business analysis, functional expertise, and process improvement to data engineering, architecture, and data science, RCG can help. The Innovation Hub at RCG Global has also developed specific technology proofs of concept, designs, frameworks, and roadmaps to help you accelerate your journey to business at the speed of now.

About RCG and Real-time Streaming Intelligent Analytics

RCG is a leader in the creation of custom real-time streaming data solutions combined with machine learning. RCG has taken a unique and practical approach, combining several streaming data technologies with machine learning and AI algorithms to create a robust foundation for driving real-time business decisions. The list of use cases and possibilities is endless. Check out some deep-dive examples in our recent blogs that explore three new use cases relative to Customer Location Intelligence:

  1. Proximity Marketing and Ad Optimization
  2. Social Distancing and Capacity Management
  3. Real-time, Automotive Predictive Maintenance and Performance Optimization

RCG can help you develop your use case and design your tailor-fit solution using the latest technology for scalable and sustainable solutions and results that make a difference to your bottom line.

To discover how you can apply streaming data technologies with machine learning and AI algorithms to create a robust foundation for driving your real-time business decisions and improved business outcomes, contact us at info@rcggs.com.

 

Works Cited

1. The Data Intensity Scoring methodology, the AI Canvas, and the seven-step use case development framework were developed by Dr. Mohanbir Sawhney at Northwestern University.

2. The Business Model Canvas was developed by Dr. Alexander Osterwalder; see Business Model Generation: A Handbook for Visionaries, Game Changers, and Challengers by Alexander Osterwalder and Yves Pigneur.

3. The RCG high-performance organization model was developed by Dr. Rob Nelson at RCG Global Services.

4. The customer experience DNA (CxDNA) framework was also developed by Dr. Mohanbir Sawhney at Northwestern University.
