
Emerging drivers for “common” enterprise information analytics


David Loshin, President of Knowledge Integrity, Inc.

It would be unusual to find a business that is not continuously seeking better ways to increase revenues, decrease operational costs, and extend profitable customer relationships. A closer inspection of the popular approaches to achieving these goals centers on what could be called “common” analytics: techniques that are not specific to any particular industry. Some examples include:

  • Customer profiling and segmentation – Divides the customer community into customer categories based on key variables as a way of developing predictive models for behavior analysis
  • Customer/product affinity analysis – Examines which customer segments have affinities to specific products (or products within organized categories)
  • Market basket analysis – Looks at predispositions to purchasing certain products at the same time
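At its simplest, market basket analysis counts how often pairs of products are purchased together. The sketch below is a minimal illustration of that idea using only the Python standard library; the product names and transaction log are hypothetical, and a production system would compute support, confidence, and lift over far larger data with a purpose-built algorithm such as Apriori or FP-Growth.

```python
from itertools import combinations
from collections import Counter

def pair_counts(transactions):
    """Count how often each pair of products appears in the same basket."""
    counts = Counter()
    for basket in transactions:
        # sorted(set(...)) deduplicates items and gives each pair a
        # canonical ordering so ("bread", "butter") is counted once.
        for pair in combinations(sorted(set(basket)), 2):
            counts[pair] += 1
    return counts

# Hypothetical transaction log: each inner list is one purchase.
transactions = [
    ["bread", "butter", "jam"],
    ["bread", "butter"],
    ["bread", "milk"],
    ["butter", "jam"],
]

counts = pair_counts(transactions)
# ("bread", "butter") co-occurs in 2 of the 4 baskets, i.e. support 0.5.
```

Pairs with high co-occurrence relative to the popularity of the individual items are the candidates for cross-selling or bundling mentioned above.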

All of these are examples of analytical approaches to drive increased product sales via upselling, cross-selling, understanding customer price sensitivity, or through the purchase of product bundles with higher profit margins.

And while the ability to execute projects enabling these analyses has typically been reserved for only the largest organizations with the biggest analytics budgets, a combination of factors is enabling a much broader spectrum of companies to benefit from analytics, including:

  1. Data volumes: Not only are the volumes of data expanding, the rate of expansion of newly created digital content continues to increase.
  2. Positive marketing: The information management industry has done a very good job in marketing the purported benefits of analytics, effectively generating a blossoming demand.
  3. Feasibility: Larger organizations may have already had the resources to implement large-scale analytics programs, but with high-performance platforms deployed on collections of easily acquired commodity hardware components, the barrier to entry for implementing an analytics program has been significantly lowered.
  4. Right-time delivery: As the time windows for responding to emerging opportunities continue to shrink, there is a growing appetite for close to real-time delivery of actionable knowledge to drive trustworthy decision-making.

The result is that a greater number of smaller organizations are seeking to employ more sophisticated analysis techniques over a broader variety of digital content that spans both structured and unstructured sources. Examples of such digital content include:

  • Structured data sets acquired either directly through the World Wide Web or through data aggregator vendors
  • Social media data, such as the unstructured comments and posts streamed through Twitter or Facebook, among others
  • Machine-generated data, such as periodic reading of smart energy meters installed across a residential network
  • Mixed-format content, such as documents and web sites containing text, photo images, graphic images, video, etc.

Analytic applications such as customer profiling, segmentation, and classification can be greatly enhanced with data from a wider variety of sources. But as the demand grows for applications incorporating different types of data sources, the data management environment must be able to scale with the size and complexity of the data, and not just from a strict throughput performance perspective. There must be processes for extracting entity data from an unstructured source, identifying that entity, and augmenting the entity’s profile with discovered or learned characteristic attributes.
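The extract-identify-augment flow described above can be sketched in miniature. This is a toy illustration, not a real entity-resolution pipeline: the attribute patterns, the sample comment, and the rule of keeping existing profile values are all assumptions, and production systems would use named-entity recognition and probabilistic matching rather than two regular expressions.

```python
import re

# Hypothetical patterns for a couple of characteristic attributes.
ATTRIBUTE_PATTERNS = {
    "email": r"[\w.+-]+@[\w-]+\.[\w.]+",
    "phone": r"\b\d{3}-\d{3}-\d{4}\b",
}

def augment_profile(profile, text):
    """Extract attribute mentions from unstructured text and merge them
    into an existing entity profile, keeping any value already present."""
    for attr, pattern in ATTRIBUTE_PATTERNS.items():
        match = re.search(pattern, text)
        if match and attr not in profile:
            profile[attr] = match.group(0)
    return profile

# A known entity, enriched from a hypothetical social media comment.
profile = {"name": "Pat Example"}
comment = "Reach me at pat@example.com or 555-123-4567."
augment_profile(profile, comment)
```

The design point the paragraph makes still holds at scale: extraction, identification (matching the mention to the right entity), and augmentation are distinct processing steps, each of which must perform predictably as data volume and variety grow.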

All aspects of data utility have to be taken into account, aligned with the idea of data management as a “dial-tone” service. Enabling predictive analytics implies that data management services must meet a base level of expectations: the performance of data delivery must be predictable, the framework must provide trustworthy information, there must be ways to ensure that commonly used terms are not confused by downstream reinterpretation, and data and business rules must be effectively incorporated directly into developed applications as part of the system development life cycle.

