Data Visualization is a hot topic, but as a practice, it’s still in its infancy. While business magazines and conferences are breathless about the power of data to provide new insights and unlock new value, the number of practitioners capable of producing accessible visualizations that use data to tell a compelling story is still fairly low. So what do good data visualizations entail?
A panel of data viz experts—New York Times Graphics Editor Amanda Cox, Facebook Data Scientist Tye Rattenbury, and Data Solutions Architect Chris Whong—answered that question and more at a panel moderated by Huge’s Analytics Lead Dylan Spencer.
The panelists articulated some of the basic rules and general principles for designing with data. These well-documented guidelines include constraints on spatial layout, avoiding color ranges that make numerical scale hard to read, keeping a zero baseline in bar charts, and more. Amanda emphasized that these rules can be broken, but only if you have a working knowledge of why they exist in the first place and can articulate why you’re breaking them.
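As a minimal sketch of why one of those rules exists (using hypothetical revenue figures, not data from the panel): truncating a bar chart’s baseline exaggerates small differences, because readers compare the drawn heights of the bars, not the underlying values.

```python
# A hypothetical illustration of the zero-baseline rule for bar charts:
# truncating the axis makes a small difference look dramatic.

def drawn_ratio(a: float, b: float, baseline: float = 0.0) -> float:
    """Ratio of the drawn bar heights when the axis starts at `baseline`."""
    return (b - baseline) / (a - baseline)

# Hypothetical figures: a 10% year-over-year increase.
revenue_2022, revenue_2023 = 100.0, 110.0

honest = drawn_ratio(revenue_2022, revenue_2023)           # 1.1 — bars reflect the real 10% gap
truncated = drawn_ratio(revenue_2022, revenue_2023, 95.0)  # 3.0 — one bar is drawn 3x the other

print(f"zero baseline: {honest:.1f}x taller; truncated at 95: {truncated:.1f}x taller")
```

Starting the axis at 95 instead of 0 turns a 10% difference into a bar three times taller, which is exactly the distortion the guideline guards against.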
Audience matters too. Tye noted that some audiences are “more comfortable with percentages, while others with geo-spatial stuff.” Having a solid understanding of your target audience will determine what approach to use when communicating a particular data set. It’s also important to simplify, taking care to strip away all irrelevant data without disrupting the core story being told. The panelists said they all use constant iteration and user testing to ensure clutter-free, comprehensible visualizations.
As for what’s on the horizon, all agreed: standardization and increasing human bandwidth to grasp large data sets. Standardized data requirements across cities will not only allow for more complex visualizations and comparisons, but also more accurate market indicators. Examples of attempts to standardize certain market data sets include the Real Estate Transaction Standard, pioneered by Zillow and other real estate websites, and Yelp’s criteria for restaurant data. Human bandwidth is trickier—the human brain has its limitations, ones that ultimately will get in the way of successfully presenting larger and larger data sets.
The panelists also discussed the necessary skepticism involved when looking at data sets, data journalism and its future, open data, big data, and the intersection between qualitative and quantitative data.
Check out the video for more.