Challenge #1: It’s Hard to Visualize Data When You’re Overwhelmed With It

Contributed by Derrick Snyder | Austin, TX

This is the first installment of the blog series, “The Top Challenges of Visualizing Time-Based Measurement Data”.

Data visualization is complicated by the sheer amount of data produced by some engineering measurement systems. Depending on the application, data volume can become a problem in two scenarios: collecting a little data over a long period of time, or collecting a lot of data in a short time (of course, collecting a lot of data over a long time is the epitome of this dilemma).

Consider the example of temperature mentioned in my earlier post. All things considered, temperature doesn’t change very quickly; measuring one or two data points per second is sufficient to characterize temperature change. However, if you’re characterizing environmental conditions in the rainforest by collecting one data point per second over the course of a month (or more), those data points add up. At the other end of the spectrum, data is sometimes measured incredibly quickly for just a few seconds or less. In an automotive safety test, for example, vehicles are crashed into barriers to measure the force of impact over just a few tenths of a second. Systems like this are often capable of capturing millions of data points per second, yielding the same overwhelming outcome.
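
To put rough numbers on both scenarios, here is a minimal back-of-the-envelope sketch in Python; the sample rates, test duration, and channel count are illustrative assumptions, not figures from any specific test.

```python
# Back-of-the-envelope sample counts for the two scenarios described above.
# All rates, durations, and channel counts here are illustrative assumptions.

SECONDS_PER_DAY = 60 * 60 * 24

# Slow but long: one temperature reading per second for a 30-day month
rainforest_samples = 1 * SECONDS_PER_DAY * 30
print(f"Rainforest logger: {rainforest_samples:,} data points")  # 2,592,000

# Fast but short: 1,000,000 samples per second for 0.3 s, across 8 assumed channels
crash_samples = int(1_000_000 * 0.3) * 8
print(f"Crash test: {crash_samples:,} data points")  # 2,400,000
```

Either way, you end up with millions of points competing for a few thousand pixels of screen space.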

Figure 1. In this automotive crash test, millions of data points were collected from several accelerometer sensors in the head of the crash test dummy over a span of just three tenths of a second.

Many traditional data processing and visualization software tools are crippled by the sheer volume of data acquired in engineering measurement applications. Microsoft Excel, for example, only recently removed a graphing limitation that restricted each curve to 32,000 data points (far too few for engineering measurement data). Even with that restriction lifted, visualizing large quantities of data on a graph is painfully slow and can make the software unresponsive.

For these applications, it’s important to seek out a visualization tool built to handle large volumes of data. Appropriate software tools include optimizations that make this challenge easier. As one example, some software can automatically condense data sets through a technique called data reduction, displaying only the minimum, maximum, or average of every N data points.
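
As a rough illustration of that data-reduction idea, here is a minimal sketch using NumPy; the function name and block size are my own illustrative choices, not part of any particular tool.

```python
import numpy as np

def reduce_for_display(samples, n):
    """Condense a long 1-D signal by keeping only summary values of every
    n consecutive samples, a common data-reduction step before plotting.

    Returns (minimums, maximums, averages), each of length len(samples) // n.
    """
    # Trim the tail so the data divides evenly into blocks of n samples
    usable = (len(samples) // n) * n
    blocks = np.asarray(samples[:usable]).reshape(-1, n)
    return blocks.min(axis=1), blocks.max(axis=1), blocks.mean(axis=1)

# Example: shrink one million simulated samples down to 1,000 points per curve
signal = np.random.default_rng(0).normal(size=1_000_000)
mins, maxs, avgs = reduce_for_display(signal, n=1_000)
print(len(mins), len(maxs), len(avgs))  # 1000 1000 1000
```

Keeping the minimum and maximum curves alongside the average preserves short spikes and dips that averaging alone would smooth away, which is why reduced plots typically show more than one summary curve.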

– Derrick Snyder derrick.snyder@ni.com

Connect with Derrick on LinkedIn

*~*~*~*~*

About Derrick Snyder

Derrick Snyder is a product manager for NI DIAdem measurement data processing and visualization software at National Instruments. He received a bachelor’s degree in Computer Engineering from Vanderbilt University.
