In today’s “Now Economy”, consumers demand highly relevant and personalized experiences as they interact with brands on a multitude of devices. As their expectations increase, an omnichannel strategy becomes mission-critical for organizations.

According to a recent study, 47% of customers would switch to a competitor within a day of a poor customer experience. Therefore, businesses must be able to deliver a consistent experience across channels, while accounting for the different devices and channels that consumers use to interact with them.

Designing a Seamless User Experience for Customers at Every Touchpoint

While much has been said about the front-end shopping experience, the main challenge continues to be the systems and processes that must be put in place behind the scenes to enable the complete omnichannel customer experience. The key to such success is a cost-effective infrastructure that can optimally integrate all customer touchpoints.

A typical consumer experience today involves a contextual experience across many channels: it may start with gathering product information during a store visit, continue with researching features online or on a mobile device and soliciting product reviews through social networks, and end with a purchase through a mobile application, perhaps with a preference for in-store pickup.


Figure 1: The Omnichannel Purchase Path

The omnichannel landscape comprises a synchronized operating model in which all channels are aligned and present a single face to the customer, along with a consistent way of doing business. At the root of the omnichannel strategy is the 360-degree view of the consumer, facilitating a consistent and seamless way of engaging and serving each individual.

A customer-driven approach requires a dynamic infrastructure working in lockstep to engage customers with a full understanding and prediction of their buying decisions. Organizations today must develop a personalized marketing strategy through which they can effectively retarget a customer across all channels, supported by real-time analytics and geospatial capabilities.

According to a Walker study, by the year 2020 customer experience will overtake price and product as the key brand differentiator, and a proper omnichannel experience is a key component of this initiative.

3 Core Preliminary Components of Omnichannel:

  1. Channel-agnostic inventory visibility and order management
  2. Customer-360 view and predictive analytics
  3. Flexible fulfillment strategy for in-store and online

The Real-Time Challenge

In an omnichannel world the time value of data—the measure of its useful timespan—is growing shorter and shorter due to:

  • A decrease in the cumulative time spent on an eCommerce website (less than 5 minutes)
  • A requirement for personalization of each experience in real-time to allow for proper customer segmentation (buyer vs. hunter vs. browser) and appropriate retargeting
  • Time-to-insight driven workflows (e.g., adding a wish-list item on your mobile phone in-store, then expecting recommendations around it on the website channel)

Decreasing data analysis and decision latencies have a ripple effect from the frontend all the way downstream to storage architecture layers. This is a low-latency and event-driven architecture problem that is not well suited to the world of traditional databases, or the volume-centric world of NoSQL databases.
Figure: The GigaSpaces Omnichannel Ecosystem

In the above general retailer architecture model, you can see how highly fragmented a typical retailer’s current IT infrastructure is. With so many data silos, channel silos, cost-prohibitive legacy back-ends (such as mainframes), and a lack of real-time technologies to gain KPI insights across all sources, implementing an omnichannel-ready infrastructure is a massive undertaking and, seemingly, a huge capital investment.

How can retailers meet the demands of the “Now” customer and gain a competitive edge?

“Now” customers require speed, and modern in-memory insight platforms, such as the GigaSpaces InsightEdge Platform, are a solution to common low-latency/high-throughput problems in mission-critical application domains. The core technology of InsightEdge is our XAP in-memory data grid (IMDG). An IMDG is created from a cluster of machines that work together, federating their collective RAM capacity to create a resilient shared data fabric for low-latency data access and extreme transactional processing.

GigaSpaces InsightEdge Platform

GigaSpaces InsightEdge Platform, with its core engine XAP, can streamline channel operations and data by providing a unified data layer that can integrate multiple channel streams along with back-end workflows. At the heart of GigaSpaces is the co-location of data and business logic within the boundaries of one node, promoting maximum data locality. This unit of scale is one of many instances cooperating in a shared or replicated topology to provide distributed computing capabilities.

Benefits of InsightEdge Platform

  • Instant Insights: Unlock insights the moment data is born, enabling time-to-analytics and time-to-action at sub-second scale, with hyperscale analytics from streaming SQL to machine learning through Apache Spark.
  • Extreme Performance: Ultra-low latency, high-throughput transaction and stream processing, with co-location of applications and analytics to act on time-sensitive data at millisecond performance.
  • TCO Reduction: A simplified architecture that eliminates cluster and component sprawl, radically minimizing the number of moving parts. Cloud-native, infrastructure-agnostic deployment for hybrid cloud and on-premises environments. Intelligent multi-tiered data storage across RAM, SSD and storage-class memory (3D XPoint).
  • Total Confidence: A mature, battle-tested platform for mission-critical businesses. Highly available with up to five-nines reliability and auto-healing. Geo-redundancy and fast data replication for disaster recovery.

Throughout our engagements with large-scale retailers, we’ve seen the in-memory data grid pattern gradually make its way, beyond simple distributed caching scenarios, into the different layers of the retailer’s eCommerce technology stack. This reflects the current transition in the retail world from multi-channel customer interaction to a real-time, event-driven omnichannel one. While the term “in-memory” may imply only RAM, the storage fabric behind InsightEdge is multi-tiered: RAM for hot data, and storage-class memory and SSD for warm and cold data in high-capacity use cases. The tiers are combined under one data grid fabric to provide unified query and transaction semantics across all data storage tiers.

The below architecture model shows the essential components that can enable real-time data processing, queries, fast data analytics and event-driven capabilities across different commerce domains (product, inventory, merchandising, pricing, analytics, etc.):

Figure: GigaSpaces InsightEdge Omnichannel Reference Architecture

The reference architecture consists of 3 tiers:

  • Consumers tier: a stateless tier which contains all the front-end channels and customer-facing components such as shopping carts, eCommerce desktop websites, mobile websites, as well as any service API providing a commerce business capability. Each system in this tier is integrated with the downstream Business Operations & Analytics tier, either via service APIs or via data co-location/integration through special integration adapters.
  • Business Operations & Analytics tier: the components and services in this tier run entirely in memory to provide low-latency and real-time access to data. The purpose of this tier is to integrate session, transaction, shopping, catalog, and operational data under one shared low-latency fabric to provide a single source of truth and enterprise-wide service access and evolution. The unification of fast-data analytics, AI and transactional processing enables real-time analytics on hot data as it’s born.
  • Data tier: new and emerging persistence components (relational databases, NoSQL), as well as legacy systems, provide the foundational data that is loaded into the in-memory Business Operations & Analytics tier.

Resilient Low-latency Global Shopping Carts

A Global Shopping Cart can be implemented by utilizing HTTP session sharing functionality along with multi-site data replication (if geographical redundancy is required). An eCommerce frontend hosting the shopping cart would include an intercepting filter that utilizes a space as a storage medium for HTTP sessions. Any other application server sharing this data would utilize the same session-sharing filter, or directly use the space object data and link it to its own context.


Sample Code
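The session-sharing pattern can be sketched in a few lines of Python. The store below is a toy stand-in for a space-backed HTTP session repository shared by several application servers; the class and method names are illustrative, not the GigaSpaces API.

```python
import time
import uuid

class SharedSessionStore:
    """Toy stand-in for a space-backed HTTP session store that several
    application servers share (names are illustrative)."""

    def __init__(self, ttl_seconds=1800):
        self._sessions = {}  # session_id -> [expiry_timestamp, attributes]
        self._ttl = ttl_seconds

    def create(self):
        """Create a new session and return its id."""
        session_id = str(uuid.uuid4())
        self._sessions[session_id] = [time.time() + self._ttl, {}]
        return session_id

    def attributes(self, session_id):
        """Return the mutable attribute map, or None if expired/unknown."""
        entry = self._sessions.get(session_id)
        if entry is None or entry[0] < time.time():
            self._sessions.pop(session_id, None)
            return None
        entry[0] = time.time() + self._ttl  # sliding expiration
        return entry[1]

# Two "application servers" sharing one store: server A writes the cart,
# server B reads the very same session state.
store = SharedSessionStore()
sid = store.create()
store.attributes(sid)["cart"] = ["sku-123"]     # server A
cart_seen_by_b = store.attributes(sid)["cart"]  # server B
```

Because both frontends resolve the session through the same shared store, the cart survives load-balancer re-routing between servers, which is the essence of the global shopping cart.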

Scaled Real-time Inventory & Cross Channel Visibility

Inventory data services are usually the endpoints where the heaviest write throughput in an eCommerce architecture takes place. To provide highly resilient and low-latency access, inventory data is scaled out in a partitioned topology, following the principle of maximizing data locality.

Mutable inventory data, which requires transactional semantics, benefits from the data locality achieved by co-locating code (business logic) and data together in one partition. Providing such locality allows for linear scaling of inventory data access. In addition, contention is reduced via an asynchronous write-behind mechanism using a mirror service.

Sample Code
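A rough sketch of the partitioned, write-behind pattern in plain Python (not the XAP API): each partition owns a slice of the SKUs and mutates stock locally, while changes are queued and drained asynchronously to a mirror database.

```python
import queue
import zlib

class InventoryPartition:
    """One partition: the data and the business logic that mutates it
    live together, so a reservation never leaves the partition."""

    def __init__(self, mirror_queue):
        self.stock = {}  # sku -> quantity on hand
        self.mirror_queue = mirror_queue

    def reserve(self, sku, qty):
        on_hand = self.stock.get(sku, 0)
        if on_hand < qty:
            return False  # not enough stock
        self.stock[sku] = on_hand - qty
        self.mirror_queue.put((sku, self.stock[sku]))  # write-behind
        return True

class PartitionedInventory:
    """Routes each SKU to a partition by a stable hash of its key."""

    def __init__(self, partition_count=4):
        self.mirror_db = {}  # stand-in for the backing database
        self.updates = queue.Queue()
        self.partitions = [InventoryPartition(self.updates)
                           for _ in range(partition_count)]

    def _route(self, sku):
        return self.partitions[zlib.crc32(sku.encode()) % len(self.partitions)]

    def load(self, sku, qty):
        self._route(sku).stock[sku] = qty

    def reserve(self, sku, qty):
        return self._route(sku).reserve(sku, qty)

    def drain_mirror(self):
        # In a real deployment, a mirror service drains this continuously.
        while not self.updates.empty():
            sku, qty = self.updates.get()
            self.mirror_db[sku] = qty

inv = PartitionedInventory()
inv.load("sku-42", 10)
ok = inv.reserve("sku-42", 3)
inv.drain_mirror()
```

Because reservations touch only one partition, throughput scales roughly linearly with the number of partitions, and the synchronous path never waits on the database.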

Predictive Analytics for Real-Time Personalization

Shopping cart and clickstream data provide extremely valuable input that eCommerce websites can use to implement product recommendations, channel retargeting (for shopping cart abandonment), and overall customer journey analytics. An average shopper spends no more than a few minutes on a site, and businesses must act within this short timeframe, in essence in real time.

The co-location of shopping cart session data, operational and product data, and fast data analytics enables businesses to run advanced analytics on hot data in real-time. Changes to session data (adding or removing products from a cart) are read by a notify container, which triggers a process to look up a recommendation item or invokes a predictive model to predict and act on the customer’s next potential behavior.

Sample Code
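A minimal, framework-free sketch of this event flow: the listener below plays the role of the notify container, reacting to cart changes by looking up recommendations. The co-purchase table is a hypothetical stand-in for a real predictive model; none of these names come from the GigaSpaces API.

```python
class CartEventBus:
    """Minimal stand-in for a notify container: registered listeners
    are invoked whenever a cart change event is published."""

    def __init__(self):
        self._listeners = []

    def subscribe(self, listener):
        self._listeners.append(listener)

    def publish(self, session_id, sku):
        for listener in self._listeners:
            listener(session_id, sku)

# Hypothetical co-purchase table standing in for a trained model.
CO_PURCHASE = {"camera": ["sd-card", "tripod"]}
recommendations = {}

def recommend_on_cart_change(session_id, sku):
    """Triggered per cart change: look up items to recommend next."""
    recommendations[session_id] = CO_PURCHASE.get(sku, [])

bus = CartEventBus()
bus.subscribe(recommend_on_cart_change)
bus.publish("session-1", "camera")  # customer adds a camera to the cart
```

The key property is that the recommendation is computed as a side effect of the cart mutation itself, within the shopper's session, rather than by a batch job that runs after the visit is over.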

Data-Driven Dynamic Pricing

The ability to generate real-time insights by analyzing data both in motion and at rest helps you improve consumer price perception, identify key value items (KVIs), set and validate prices for long-tail items, and drive profitable retail growth through dynamic pricing.

Being able to leverage advanced analytics to predict pricing trends and evaluate real-time information helps you generate offers on the spot that may change consumers’ consideration of what they’re purchasing at the moment.

Implementing a pricing engine that can instantly respond and even predict the optimal pricing requires a real-time big data architecture with special emphasis on the volume and velocity aspects.

Building a pricing engine as a real-time big data architecture requires the combination of two types of systems: a write-heavy, velocity-driven tier (RAM) to process incoming data, and a read-heavy, volume-driven tier (storage-class memory or SSD) to process even larger data samples. The MemoryXtend module combines both by creating a unified RAM + SSD in-memory data grid that abstracts the data query semantics from where the data is located.

Figure: GigaSpaces MemoryXtend
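The two-tier idea can be sketched as a bounded hot (RAM) map that evicts its least-recently-used entries into an unbounded warm map standing in for SSD, behind a single get/put API. This illustrates only the unified-query concept, not how MemoryXtend is actually implemented.

```python
from collections import OrderedDict

class TieredStore:
    """Sketch of a unified two-tier store: a bounded hot tier with LRU
    eviction into a warm tier (a stand-in for SSD or storage-class memory)."""

    def __init__(self, hot_capacity):
        self.hot = OrderedDict()  # bounded "RAM" tier
        self.warm = {}            # unbounded "SSD" tier
        self.hot_capacity = hot_capacity

    def put(self, key, value):
        self.warm.pop(key, None)  # the hot copy becomes authoritative
        self.hot[key] = value
        self.hot.move_to_end(key)
        if len(self.hot) > self.hot_capacity:
            lru_key, lru_value = self.hot.popitem(last=False)
            self.warm[lru_key] = lru_value  # demote the coldest entry

    def get(self, key):
        # One query API regardless of which tier holds the value.
        if key in self.hot:
            self.hot.move_to_end(key)  # keep hot data hot
            return self.hot[key]
        return self.warm.get(key)

prices = TieredStore(hot_capacity=2)
prices.put("sku-a", 9.99)
prices.put("sku-b", 14.99)
prices.put("sku-c", 4.99)  # evicts sku-a to the warm tier
```

Callers never know or care which tier served a read, which is precisely the "query semantics abstracted from data location" property described above.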

Product Catalog Content Delivery Network

Traffic to an eCommerce or mCommerce website grows by at least 500% during the holiday season (e.g. “Black Friday” or “Cyber Monday”). To provide high availability and fast static content loading, retailers can utilize a CDN such as Akamai or Amazon CloudFront. Implementing a similar concept for product data, which doesn’t change rapidly, requires multiple copies of the product catalog data to exist across many geographical regions.


Multi-site data replication through the GigaSpaces WAN gateway can be used in a single-master/multi-slave topology, where the master grid holds the product data and replicates it to the slave grids. The slave grids act as endpoints accessed through a geolocation DNS routing policy, so each slave grid serves the users geographically closest to it.

Sample Code
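The topology can be sketched as follows, with illustrative names rather than the WAN gateway API: the master grid pushes catalog writes to the regional slave grids, and a resolver stands in for the geolocation DNS policy.

```python
class CatalogGrid:
    """One grid instance holding a copy of the product catalog."""

    def __init__(self, region):
        self.region = region
        self.products = {}

class ReplicatedCatalog:
    """Single master / multiple slaves: writes go to the master and are
    replicated (synchronously here, for simplicity) to every slave."""

    def __init__(self, regions):
        self.master = CatalogGrid("master")
        self.slaves = {region: CatalogGrid(region) for region in regions}

    def publish(self, sku, details):
        self.master.products[sku] = details
        for slave in self.slaves.values():  # simplified WAN replication
            slave.products[sku] = details

    def resolve(self, user_region):
        # Stand-in for geolocation DNS: route to the nearest slave grid,
        # falling back to the master for unknown regions.
        return self.slaves.get(user_region, self.master)

catalog = ReplicatedCatalog(["us-east", "eu-west", "ap-south"])
catalog.publish("sku-42", {"name": "4K Camera", "price": 399.0})
grid = catalog.resolve("eu-west")
```

In production the replication would be asynchronous over the WAN, but the read path is the same: every region answers catalog queries from its local copy.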

Final Thoughts

In-memory architecture delivers the low latency and scalability you need to capture, process and analyze real-time events at scale for a seamless omnichannel shopping experience.

Next-generation applications must embrace analytics in order to generate engaging and inspiring personalized experiences. And they must do it without sacrificing performance. Fortunately, with the right infrastructure and strategy, these two objectives are not mutually exclusive.

By utilizing a unified data analytics and applications platform to gain insights from advanced data analytics and Machine Learning on your business data as it’s born, your business can stay ahead of the competition and deliver a seamless, personalized and convenient customer experience across all digital channels and in-store.

Learn more about how GigaSpaces can help you create the best omnichannel experience for your business

Delivering the Ultimate Omnichannel Experience with Unified Transactional and Analytical Processing Platforms
Dana Meschiany
Marcom Manager @ GigaSpaces
Dana is a content wizard, a lover of words and a tech geek at heart. When she's not planning events or tweetin' about them, you can find her doing her pilates exercises or catching up on the latest Netflix series.