
Summary:

In order to respond to customers in a timely manner, it’s important that businesses focus on data velocity: the speed at which data can be generated, collected, processed, and analyzed.

Most data has three characteristics that businesses are concerned with: volume, variety, and velocity. Much time and effort is spent on volume (after all, it’s big data) and even variety (all those databases for unstructured data), but velocity is a tougher problem to ponder and solve. Velocity is the speed at which data can be generated, collected, processed, and analyzed. It matters because customer behavior is constantly changing, and organizations that understand the active customer can adapt and act quickly on changing situations or opportunities.

Unfortunately, velocity is emerging as the forgotten V in the big data equation. Yet, the key to a successful big data project is making sure all three V’s—volume, variety, velocity—are equally considered and implemented.

Velocity is what differentiates big data from traditional business intelligence (BI) practices. It enables real-time decisions and actions, whereas traditional BI generally only covers volume and variety. In a traditional data warehouse, data is typically collected and analyzed at the end of the day, and then made available the following morning, at the start of business. This creates issues: a great deal can happen after that last load (continued customer activity, supply chain issues, product faults, and so on), and companies can’t afford to wait for the data to become available.
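To make the contrast concrete, here is a minimal sketch in plain Python of the difference between an end-of-day batch query and acting on each event as it arrives. The Transaction fields and the alert threshold are invented for illustration; this is the shape of the idea, not anyone’s production streaming stack.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Iterable, List


@dataclass
class Transaction:
    customer_id: str
    amount: float
    timestamp: datetime


def end_of_day_batch(transactions: List[Transaction]) -> float:
    """Traditional BI: wait until the whole day's data has landed, then analyze."""
    return sum(t.amount for t in transactions)


def on_arrival(stream: Iterable[Transaction]) -> None:
    """Velocity-oriented: act on each event as it arrives instead of waiting."""
    running_total = 0.0
    for t in stream:
        running_total += t.amount
        # Decision logic runs immediately, while the customer is still active.
        if t.amount > 1_000:  # hypothetical threshold for illustration
            print(f"{t.timestamp:%H:%M} alert: large purchase by {t.customer_id}")


if __name__ == "__main__":
    events = [
        Transaction("c1", 120.0, datetime(2013, 12, 24, 9, 5)),
        Transaction("c2", 1500.0, datetime(2013, 12, 24, 9, 7)),
    ]
    on_arrival(events)               # reacts at 9:07, mid-morning
    print(end_of_day_batch(events))  # answer only available after close of business
```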

Velocity in motion

In this age of social media, customers turn to Facebook, Twitter, and blogs to post questions and feedback on anything from customer service to product issues. The longer it takes a company to react to negative feedback on these channels, the greater the likelihood of churn. Organizations need to be able to respond to customers in a timely manner, and this is where velocity comes in. There is a huge amount of data arriving on all of these channels, and it needs to be delivered quickly so that companies can take immediate, intelligent action.

Consider product launches. Taking advantage of quickly arriving data in real time helps companies track adoption rates and act on them. If a product is appealing to a particular region or age group, real-time access to this data enables marketers to take immediate action to steer the launch in the direction they want, whether that means changing the supply chain, adjusting inventory, or marketing to the right customers. Time is money, and companies can’t afford to fall 24 hours behind on fixing a product, making updates, or responding to customers; otherwise it becomes traditional BI and data warehousing, where companies are seeing but not doing.
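As a rough illustration of what “real-time access to adoption rates” can boil down to, the sketch below keeps a rolling, per-region count of sign-ups so marketers can see where a launch is catching on while it is still live. The AdoptionTracker class, region names, and one-hour window are assumptions made up for this example.

```python
from collections import Counter, deque
from datetime import datetime, timedelta


class AdoptionTracker:
    """Keeps a rolling count of product sign-ups per region over a recent window."""

    def __init__(self, window: timedelta = timedelta(hours=1)):
        self.window = window
        self.events = deque()    # (timestamp, region), oldest first
        self.counts = Counter()  # region -> sign-ups currently inside the window

    def record(self, timestamp: datetime, region: str) -> None:
        self.events.append((timestamp, region))
        self.counts[region] += 1
        self._expire(timestamp)

    def _expire(self, now: datetime) -> None:
        # Drop sign-ups that have fallen out of the rolling window.
        while self.events and now - self.events[0][0] > self.window:
            _, old_region = self.events.popleft()
            self.counts[old_region] -= 1

    def hottest_regions(self, n: int = 3):
        return self.counts.most_common(n)


tracker = AdoptionTracker()
tracker.record(datetime(2013, 12, 24, 10, 0), "EMEA")
tracker.record(datetime(2013, 12, 24, 10, 5), "EMEA")
tracker.record(datetime(2013, 12, 24, 10, 6), "APAC")
print(tracker.hottest_regions())  # marketers can act on this while the launch is still live
```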

Handling data velocity can change prices and prevent fraud


The same can be applied to pricing. Buyers are price conscious and are often more focused on the price than on the product itself. Knowing how to price a product appropriately, and when to offer a discount on an item, is all about timing and not losing the opportunity. If a shopper puts an item in a cart and then abandons it for a lower-priced item, the velocity side of big data allows companies to take immediate action: pricing items appropriately, sending personalized offers, or issuing coupons based on payment transactions and location, all at the right time.
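A hedged sketch of that kind of in-session reaction might look like the following. The CartEvent fields, the discount cap, and the pricing rule are assumptions invented for the example, not a description of any particular vendor’s pricing engine.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class CartEvent:
    customer_id: str
    sku: str
    our_price: float
    competitor_price: Optional[float]  # price of the item the shopper switched to, if known


def offer_on_abandonment(event: CartEvent, max_discount: float = 0.10) -> Optional[str]:
    """React while the shopper is still in session, not in tomorrow's report."""
    if event.competitor_price is None or event.competitor_price >= event.our_price:
        return None
    # Match the lower price, but never drop below the allowed margin floor.
    floor = event.our_price * (1 - max_discount)
    offer_price = max(event.competitor_price, floor)
    return f"Offer {event.sku} to {event.customer_id} at {offer_price:.2f}"


# Hypothetical abandonment event: shopper left a 499.00 tablet for a 469.00 one.
print(offer_on_abandonment(CartEvent("c42", "TABLET-10", 499.00, 469.00)))
```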

Financial services is another great opportunity for data speed to shine. The ability to detect fraud and block rogue operations mid-transaction avoids costly recovery operations. Fraudsters try to mask their activities by quickly changing their behavior and interaction channel over time, so having a complete view of the data, processed in real time, helps detect this kind of channel-hopping fraud. Online banking is an extension of this: the time customers spend on their banking website is limited, often only weekly or even less frequently, so targeting them with the right offer or message needs to be done in-session, not afterwards, because by then their intent or interest has likely drifted.
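For the channel-hopping case, one rough approach is to count how many distinct channels an account touches within a short window and challenge the transaction when that count spikes. The window size, channel limit, and account IDs below are made up for illustration; real fraud models are far more sophisticated.

```python
from collections import defaultdict, deque
from datetime import datetime, timedelta

# Hypothetical thresholds: how many distinct channels within the window before we flag.
WINDOW = timedelta(minutes=10)
CHANNEL_LIMIT = 3

recent = defaultdict(deque)  # account_id -> deque of (timestamp, channel)


def observe(account_id: str, channel: str, ts: datetime) -> bool:
    """Return True if this event looks like channel-hopping and should be challenged mid-transaction."""
    events = recent[account_id]
    events.append((ts, channel))
    # Keep only activity inside the rolling window.
    while events and ts - events[0][0] > WINDOW:
        events.popleft()
    distinct_channels = {c for _, c in events}
    return len(distinct_channels) >= CHANNEL_LIMIT


t0 = datetime(2013, 12, 24, 12, 0)
print(observe("acct-7", "web", t0))                                 # False
print(observe("acct-7", "mobile", t0 + timedelta(minutes=2)))       # False
print(observe("acct-7", "call-center", t0 + timedelta(minutes=4)))  # True: challenge before completing
```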

It’s all about creating the best overall customer experience, managing the active customer, and the speed at which you do it. While volume and variety are integral components to a successful big data strategy, they are worth much less if they’re not working in conjunction with velocity.

Fortunately, with increased access to affordable cloud computing, companies can avoid expensive technology investments when it comes to analyzing data quickly. These days, it is relatively simple to connect to cloud services such as Amazon, SAP HANA, and others for access to real-time, high-volume computing to put ideas and concepts into action. Plus, with the services model you pay only for what you use, making it a win-win for trying out a data strategy that takes velocity into account.

Alain Vandenborne is senior vice president of business development at NGDATA.

Comments

  1. The general approach to addressing this, or building it into the ecosystem, is to front it with in-memory data fabrics that have big data support and the ability to distinguish operational from historic data.

    1. There’s no way to get ideal velocity when the data is stored in an unorganized way. There’s no way around it. If you want vast unstructured data, you give ground on the indexing that speeds up analytic access. If you want analytic speed, you have to build some kind of structure in when you store the data. Those two things are hard to make coexist, no matter what kind of fish oil these big data salesmen are selling.

  2. Sanjay Umakanth Tuesday, December 24, 2013

    Can I get some more information on big data? And will this software bring a drastic change to the industry?

  3. The big data concept undoubtedly revolves around the 3 V’s: volume, variety, and velocity.
    Each V is equally important and hence cannot be neglected. The volume and variety attributes are going to grow because of the large mass of consumers moving toward e-commerce. It then depends on a company’s ability to analyze and act.

  4. Hector Sanchez-Villeda Tuesday, December 24, 2013

    That’s true about the triple V. When using unstructured database models to analyze information, it is important to apply business intelligence so that the results of the data, which becomes information and ultimately knowledge, reach the target audience. The question is not just how to handle big data, but how to build intelligent software that people with little knowledge of computers can use in an easy, user-friendly way.

  5. Great of GigaOm to recognize Gartner’s original 3Vs from over a dozen years ago, albeit without the courtesy of a citation. See http://goo.gl/wH3qG for the original piece from 2001. –Doug Laney, VP Research, Gartner, @doug_laney
