
#data14 – Start Your Blenders!

I’m going to plug some sessions for the 2014 Tableau Conference. If you want to promote yourself, please add a comment below!

Getting your blend on has never been easier

At the 2014 Tableau Conference there’s a whole track’s worth of sessions on data blending by some fabulous folks, with my comments in italics.

  • Mix It Up: Data Blending Basics by Alex Woodcock of Tableau. Beginner, Wednesday, 10:45am-1:00pm and Thursday, 10:45am-1:00pm. If you’ve never blended before, this is the class for you.
  • What’s In Your Blender by Charles Schaefer and Kelly Hotta of Tableau. Advanced, Tuesday, 11:15am-12:15pm. Tips and tricks for data blending.
  • Jedi Calculation Techniques by Bethany Lyons and Alan Eldridge of Tableau. Jedi, Tuesday, 11:15am-1:30pm and Wednesday, 3:30-6pm. Covers when blending might be used, among lots of other non-blending topics, in the 2-hour session.
  • Become a Mix Master with Data Blending by Bethany Lyons of Tableau. Jedi, Tuesday, 2:30-3:30pm and Wednesday, 10:45-11:45am. Bethany gave this presentation at the London conference; it covers how blending works in more detail.
  • Mix and Match Your Data: Advanced Data Blending by Alex Woodcock of Tableau. Advanced, Tuesday, 2:30pm-5:00pm and Wednesday, 3:30-6pm. A 2-hour training to bootstrap yourself from basic to more advanced knowledge of data blending.
  • Flowing with Tableau by Joe Mako (Tableau guru to the gurus). Jedi, Wednesday, 12-1pm. See how Joe approaches Tableau and conceives of the solutions that he does; he gave a similar talk in California this summer.
  • Extreme Data Blending by Jonathan Drummey (yours truly), Jedi, Wednesday, 3:30-4:30pm. See below.

I’m energized about all of these sessions, especially Joe Mako’s. It’s not so much tips and tricks as how to “think Tableau” and work with the software. I’ve used the metaphor of a structured poem before: when writing something like a sonnet we have certain conventions to follow, and as long as we do we can have lovely results. The same goes for Tableau in how we structure the data and use the different features and functions in the software.

My own session on Extreme Data Blending mashes up South Park and Frozen in a deep dive into how data blending works. I’m excited to share what I’ve learned, especially that every single odd, strange, or seemingly broken result of data blending actually has a logic and reasoning behind it that can be understood, explained, and even made use of. If you’re new to data blending and want to attend my session, I suggest you go to one of the other sessions first to get grounded in blending behaviors. If you’ve been using blending already, I promise you’ll learn something new, though if you’ve read all of my posts on data blending then some of the use cases will be familiar. If you’re already a Tableau Jedi, you’ll like this session because I’ve purposely created it to start out with a review of known territory, then we’re going Jedi++.

A few other sessions and meetups that I’d like to plug are:

  • First timers and conference newbies – Emily Kund and Matt Francis (they host the one and only Tableau Wannabe Podcast, totally worth a listen) are hosting a conference orientation session on Monday at 4pm, before the welcome reception. They then repeat that First Timers’ Field Guide session on Tuesday at 11:15am.
  • Tableau Community Meetup – Wednesday, 12:30-2pm in Community Alley. Here’s your chance to meet in real life Tracy, Patrick, and Jordan who run the Tableau forums along with assorted other forum helpers.
  • Meet the Tableau Zen Masters – Besides my session, the one place you can definitely find me (ok, besides stalking Neil deGrasse Tyson and Hans Rosling for selfies) is here on Wednesday at 6pm, though I’m not sure yet where “here” will be.
  • Women in Data Meetup – Jenn Day and Anya A’Hearn are hosting this meetup on Tuesday at 12:45pm in the University Room at the Sheraton. I think it’s fantastic that Jenn & Anya are hosting this and that Tableau is supporting the meetup; see #womenindata for more on the topic. We all need to find our tribes, maybe this is yours!

See you in Seattle!


I Have Wee Data – Microsoft Access and Tableau

In all the hype about big data, we have to acknowledge that some of us have “wee” data. Not every organization has a fully built-out Information Systems department or Business Intelligence team with access to petabytes of data and the latest tools like Hadoop and Alteryx. Some of us are still running on legacy hardware and software, with tiny budgets, part-time staff, and thousands or tens of thousands of records that we want to analyze vs. billions.

My day job is in the latter camp. For all that US healthcare includes the latest treatments and technology, healthcare IT has historically been behind the times. My desktop is running Windows XP SP3, Office 2007 is our productivity tool, Microsoft Access is our most commonly used database, and Tableau is our go-to choice for data visualization (so there’s at least one area where we’ve got current technology).

Every couple-few months I get a question about Microsoft Access and Tableau, so I thought I’d take a few minutes to combine my answers into one post. Read on for what I know about integrating Access and Tableau.

Continue reading

clock image from http://images.cdn.fotopedia.com/flickr-4750765479-original.jpg

Formatting Time Durations in Tableau

Here’s a quick lunchtime post on working with durations in Tableau. By duration, I mean a result showing the number of seconds, minutes, hours, and/or days in the form dd:hh:mm:ss. This isn’t quite a built-in option; there are several ways to go about it:

  • Use any duration formatting that is supported in your data source, for example by pre-computing values or using a RAWSQL function.
  • Do a bunch of calculations and string manipulations to build the formatted string. I prefer to avoid these mainly because string manipulations can be over 1000x slower than numeric manipulations. If you want to see how to do this, there’s a good example on this Idea for Additional Date Time Number Formats. (If that idea is implemented and marked as Released, then you can ignore this post!)
  • If the duration is less than 24 hours (86400 seconds), then you can use Tableau’s built-in date formatting. I’ll show how to do this here.
  • Do some calculations and then use Tableau’s built-in number formatting. This is the brand-new solution and involves a bit of indirection; both this and the under-24-hours approach are sketched just after this list.
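
Here’s a minimal sketch of those last two approaches as Tableau calculated fields, assuming a field named [Duration in Seconds] holding a non-negative number of seconds (the field names are made up for illustration):

    // Calculated field [Secs]: truncate to whole seconds so the integer
    // math below behaves
    INT([Duration in Seconds])

    // Under-24-hours approach: shift the seconds onto a dummy date, then
    // give this field a custom date format of hh:mm:ss
    DATEADD('second', [Secs], #1900-01-01#)

    // Number-formatting approach (the bit of indirection): pack days,
    // hours, minutes, and seconds into one number, then give this field
    // a custom number format of 00:00:00:00 to display it as dd:hh:mm:ss
    INT([Secs] / 86400) * 1000000
    + INT([Secs] % 86400 / 3600) * 10000
    + INT([Secs] % 3600 / 60) * 100
    + [Secs] % 60

As a check, 90,061 seconds (one day, one hour, one minute, one second) packs to 1,010,101, which the custom number format displays as 01:01:01:01.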

Continue reading


Summer Studies of Tableau

If you’re not off on some sunny beach somewhere (or even if you are), here are some (free!) opportunities coming up for you to sharpen your Tableau skills and get previews of material that will be in my book. I’ve got 3 presentations in the next month; two are in New England, the other is a webinar:

  1. June 24th at the Boston Tableau User Group: Making Tableau More Predictable: Understanding the Multiple Levels of Granularity. This is a reschedule of the session I was going to give back in April; it’ll be a combination of presentation and hands-on practice on how to “think Tableau” so your calculated fields, top & conditional filters, table calcs, etc. are more likely to come out the way you expect. Alteryx is demoing their software, and Zach Leber is also presenting.
  2. July 10th for a Think Data Thursday webinar: Setting up for Table Calculation Success. This will also review some of the granularity material, and go through how you can set up views and table calculations so that a) they work, and b) if they don’t, you know how to diagnose what is going on so you can get back to a working calc or submit a really detailed support request.
  3. July 22nd at the (inaugural) Maine Tableau User Group: Getting Good at Tableau. Hosted by Abilis Solutions in Portland, I’m helping to kick off the MaineTUG with a talk on how to set up your data and build your Tableau skills (including how to avoid getting distracted by all the gee-whiz features of the Tableau interface), and I’ll give an intro to Tableau 8.2. Grant Hogan of Abilis will be presenting, as well as someone from Tableau.

I’ll update this post as the links for registering appear; I hope to see you (virtually or in person) at one of these events! And if not, I’ll be at the Tableau Conference in September.


At the Level – Unlocking the Mystery Part 1: Ordinal Calcs

There was a Tableau forums thread on At the Level a while back where Matthew Lutton asked for an alternative explanation of this somewhat puzzling table calculation configuration option, and I’d promised I’d take a swing at it. Plus, I’ve been deep into book writing about shaping data for Tableau, and taking a break to write about obscure table calc options sounds like fun! (Yes, I’m wired differently.)

Read on for a refresher on addressing and partitioning and my current understanding of uses of At the Level for ordinal table calculations such as INDEX() and SIZE(). Part 2 will cover LOOKUP(), and Part 3 will cover WINDOW_SUM(), RUNNING_SUM(), and R scripts. If you’re new to table calcs, read through at least the Beginning set of links in Want to Learn Table Calculations. Thanks to Alex Kerin, Richard Leeke, Dimitri Blyumin, Joe Mako, and Ross Bunker for their Tableau forum posts that have informed what you’re about to read.

Continue reading


The End of the World – by Noah Salvaterra

A guest post by Noah Salvaterra, you can find him on the Tableau forum or on Twitter @noahsalvaterra.

I expect the header image may spark some discussion about visualization best practices; actually, I sort of hope it does. The data shown is from NOAA’s online database of significant earthquakes and is displayed by magnitude on a globe, so four dimensions packed into a two-dimensional screen. While it was created in Tableau, it might be a long wait before something like this appears in the Show Me menu.

For those who missed the header because they are reading this in an email, I’ve included an animated 3D version on the left, though actually seeing it in 3D requires ChromaDepth glasses (I discussed this technique in more detail in a prior blog post). Use of 3D glasses adds even more controversy, because while we can get some understanding of depth from a 3D image, depth isn’t perceived in an equal way to height and width. Data visualization best practices can help in choosing between several representations of the same dataset, choosing bar graphs over pies, for example, since bars will typically lead to a better understanding of the data. Best practices also instruct us to avoid distorted presentations such as 3D or exploding pies and 3D bar charts, since these are likely to lead to misunderstanding. I’m not exactly sure what best practices have to say about this spinning 3D anomaly; my guess is it would be frowned upon. I think there is something to be said for including a novel view of your data if it helps people engage with the topic, and even if this one does break some rules, it’s hard to look away. If you’d rather just see the earth spinning, without all the data overlaid, there is an earth-only view at the end.

The images above may not be the best choice as a general way to visualize this earthquake data. In fact, I’m the first to admit they have some significant issues. Comparing earthquake magnitudes between two geographic areas would be tricky, plus half of the earth is hidden from view completely because it is on the back. Adding the ability to rotate the globe in various directions in a Tableau workbook helps a bit, but you’re left to rely on your memory to assemble the complete picture. If the magnitude of the quakes is the story you’re telling, you might be better served with a flat map, maybe using circles to represent the magnitude of the quakes, such as the one shown below. I think this is a good presentation; it has some nice interactivity and as far as I know doesn’t break any major rules from a best-practices standpoint. But it certainly isn’t perfect, nor is it without distortion. Judging the relative size of circles isn’t something that will be perceived consistently, but the failure I had in mind isn’t one of perception; it is about whether the data is accurate at all. The map itself brings a tremendous amount of distortion to the picture, in location of all things.

In case you haven’t heard, the earth isn’t flat (I like to imagine someone’s head just exploded as they read that sentence). It is roughly spherical. Well, technically it is a bit more ellipsoidal, bulging out slightly along the equator, and more technically still this ellipsoid is irregularly dotted with mountains, oceans, freeways, trees, elephants and Wal-Marts (not meant to be a comprehensive list). Also, as the moon orbits, it causes a measurable effect not just on the tides; it distorts the land a bit as well as it passes by. Furthermore, the thin surface we inhabit floats, lifting, sinking, and circulating on top of a spinning liquid center. Earthquakes serve as a reminder of this fact. The truth can be overwhelming in its complexity, so we simplify. Though not the complete truth, a well-chosen model can be a valuable proxy when it doesn’t oversimplify. One way to understand the difference is to analyze the scale of the errors introduced. The point on earth farthest from the center is the summit of Mt. Chimborazo in Ecuador, about 6,384.4 km from the center… you were thinking Everest? Everest is the highest above sea level, but the sea bulges as well, and Chimborazo gets a boost from being close to the equator. The closest point to the center of the earth is in the Arctic Ocean near the north pole, about 6,353 km from center. If we use the mean radius of 6,371 km we are doing pretty well: both extremes differ from it by less than 20 km, so the error is within 0.3%. A sphere seems like a reasonable compromise.

So the earth is spherical… but our map is rectangular. You don’t need to invest in a differential geometry course to understand that there is something fishy going on there (though you might need one to prove it). In fact, there is no way to map a spherical earth to a rectangle, or any flat surface, without messing something up, the something being angle, size, or distance; at least one will be distorted when the earth is presented on a flat surface (sometimes all of them). This seems to be a bit of a problem given the goal of presenting data accurately. What if your story is one of angle, distance, area, or density?

What shape are the various shifting plates? What are their relative sizes? How fast do they move? Where do they rise and fall? What effect does this have? Can you tell this story in Tableau? Can you tell it at all? Maybe. I’d certainly like to see it done, but seismology isn’t an area where I have any specialized knowledge. In areas where I do have such knowledge, I’m lucky to get questions so well defined and spanning just a handful of dimensions. When I’m dealing with 50 dimensions that writhe and twist through imaginary spaces, whispering patterns so subtle that the best technique I’ve found for discovering them is often just to give up and go to sleep, I’m not deciding between a pie chart and a bar chart; it is an all-out street fight. Exploring the Mercator projection seemed like a good analogy for the struggle to represent a complex world in a rectangle, plus it seemed like a fun project. As I undertook this exercise, though, I realized that other map projections weren’t much further afield. Also, Richard Leeke mentioned something about extra credit if I could build a 3D globe with data on it. I’m a sucker for bonus points.

How bad are the maps in Tableau? Well, it depends where you look at them, and what you hope to learn from them. Your standard Tableau world map is a Mercator projection. If you’re planning to circumnavigate the globe using an antique compass and sextant, it will actually serve you pretty well, since the Mercator projection has a nice property for navigating a ship: if you connect two points with a straight line, you can determine your compass heading, and if you follow that course faithfully, you’ll probably end up pretty close to where you intended. Eventually. You can actually account for the distortion in such situations, with a bit of math, so you’re not completely guessing on how long you’ll need to sail. Incidentally, I’m not particularly riled up about Tableau’s choice of the Mercator projection; sailing around the world with a sextant and compass sounds like a whole lot of fun to me, and any flat map is going to involve a compromise on accuracy somewhere. What I do think is important is knowing the distortion is there in the first place. How bad is it? Scale distortion on a Mercator map can be measured locally as sec(latitude) (if your trigonometry is rusty, sec is 1/cos). Comparing a 1m x 1m square near the equator with one at the north pole, you’d find that a Mercator projection introduces infinite error, which is a whole lot of error. To be fair, since printed maps are finite and the Mercator projection isn’t, the poles get cut off at some point (so the most common maps of the whole world are actually excluding part of it…). If we cut off at +/- 85 degrees of latitude, we reach a scale factor of sec(85°), which is about 11.47, i.e. objects appear at more than eleven times the size of their equivalents at the equator! That seems like a pretty significant lie factor…
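
If you want to see that distortion against your own data, here’s a minimal sketch of the local scale factor as a Tableau calculated field, assuming a [Latitude] field in degrees (the field name is illustrative):

    // Local linear scale factor of the Mercator projection: sec(latitude).
    // Returns 1 at the equator, about 11.47 at 85 degrees, and grows
    // without bound toward the poles.
    1 / COS(RADIANS([Latitude]))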

Recently (on a cartographic time scale), the Peters projection has gotten a lot of attention. This is a good place to pause for a brief video interlude:

Maps that preserve angles locally are called conformal. The Peters projection is not conformal, so while it represents relative area more accurately, it would be a terrible choice for navigation.

Stereographic projection is another noteworthy map. Like Mercator, it is a conformal map. It represents angle, size, and distance pretty faithfully close to the center, so it is a common choice for local maps (you probably use such maps often without even realizing it). Stereographic projection isn’t a very popular choice for a world map, however, because (among other things) you’d need an infinite sheet of paper to plot the whole thing. On the right is a stereographic projection map from my Tableau workbook. In case you can’t see them, North America, South America, Europe and Africa are all near the center of the map. The yellow country on the left is the Philippines…
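
For the curious, the projection math is compact. Here’s a minimal sketch of one conventional form, projecting from the north pole onto the equatorial plane, written as Tableau calculated fields with [Latitude] and [Longitude] in degrees (the field names are illustrative, and Noah’s workbook centers and scales its maps differently):

    // Stereographic X, in units of earth radii; undefined at the north
    // pole itself, which is why the full map needs an infinite sheet
    COS(RADIANS([Latitude])) * COS(RADIANS([Longitude]))
        / (1 - SIN(RADIANS([Latitude])))

    // Stereographic Y, in units of earth radii
    COS(RADIANS([Latitude])) * SIN(RADIANS([Longitude]))
        / (1 - SIN(RADIANS([Latitude])))

Points near the projection center land near the origin with little distortion, while the point antipodal to the center flies off toward infinity, which is why countries on the far side of the globe sprawl so enormously.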

I included the maps I did because they are popular, and I knew most of the math involved; however, there are lots of other options. I’m not arguing that any one is best, rather that they are all pretty bad in one way or another, and that we should choose our maps like our other visualizations: so they best tell a story or answer a question, with the inevitable distortion chosen in a way that doesn’t compete with what we hope to learn or teach.

In addition to the earthquake maps seen already, the workbook for this post contains an interface to explore some of these different projections, and not just the most traditionally presented versions of each of them. I invite you to create your own map of the world, based on whatever is most important to you. Flip the north and south poles, or rotate them through the equator. My hope is that exploring these a bit by rotating or shifting the transverse axis will be a useful exercise in understanding what it is you’re looking at when you see one of these maps, so you might have a better chance of seeing things as they truly are.

I’m pretty sure there is a rule about not putting 7 worksheets on a single dashboard; there may even be a law against it. But once I had all these maps I wasn’t entirely sure what to do with them all. I apologize for not arranging them thoughtfully into 2 or 3 at a time. I experimented with this approach, but ultimately abandoned it because I didn’t think I had enough material on map projection to make an interactive presentation of all of these very interesting. I also thought about a parameter to choose between them, but since they are necessarily different shapes, it didn’t seem practical to try to fit them all in the same box. Truthfully, I think there is a lot of room for improvement in terms of dashboarding these, but when I open the workbook I just end up tinkering with something else. It is time for me to set this one free. Feel free to download and play with them as long as Richard and I have.

Here is a link to the workbook on Tableau Public

When I’m presenting, or exploring data, accuracy is usually something I pay careful attention to, but it isn’t my goal. The most important thing for me is to find a story (or THE story) and to share it effectively. If you hadn’t noticed from my previous posts, I don’t let what is easy stand in the way of a good question; in fact if it is easy I get a little bored. I like to bite off more than I can chew (figuratively; literally doing this could potentially be pretty embarrassing). Having the confidence to take on big challenges is something I’m deeply grateful for; knowing when to ask for help, and where to find it has taken a bit more effort, but is something I’m getting better at. As with Enigma, Richard Leeke was a huge resource for this post. Having seen his work on maps I thought he might have something I could use as an initial dataset. He came through there, and helped me to work through the many subtleties of working with complex polygons without making a complete mess. You have him to thank for the workbook being as fast as it is (assuming I didn’t break it again; if it takes more than 7 seconds to load, my bad).

I feel a kinship with cartographers during the age of exploration. The discipline still holds value, certainly, but the recesses of our planet have been documented to the point where it doesn’t hold the same mystique in my imagination. When I think of old-world cartographers, I think of an amalgam of artist and scientist: assimilating reports from a variety of sources, often incomplete and sometimes incorrect, they crafted this data to accurately paint a picture that would help drive commerce, avoid catastrophe, or just build understanding. They created works of art that might mark the end of a significant exploration, or might be the vehicle through which exploration takes place. Sound familiar? If not, just use a bar chart. It is just better.

I almost forgot: I promised a spinning earth without all the earthquake data. Enjoy.


The Letdown and the Pivot

The Letdown

Tableau does amazing demos. Fire up the software, connect to a data source, select a couple of pills, click Show Me, and boom, there’s a view. Do a little drag and drop, boom, another view. Duplicate that one, boom, another view to rearrange. Within three minutes or less you can have a usable dashboard, whether for 200 rows of data or 200 million.

If you’ve seen those demos, the not-so-dirty little secret of Tableau is that they pretty much all start with clean, well-formatted, analytics-ready data sources. As time goes on, I’ve interacted with more and more new Tableau users who are all fired up by what they saw in the demos, and then let down when they can’t immediately do the same with their own data. They’ve got to reshape the data, learn some table calcs right away, figure out data blending to deal with differing levels of granularity, and/or put together their first-ever SQL query to do a UNION or a cross product, etc. Shawn Wallwork put it this way in a forum thread back in January: “On the one hand Tableau is an incredibly easy tool to use, allowing the non-technical, non-programmers, non-analysts to explore their data and gain useful insights. Then these same people want to do something ‘simple’ like a sort, and bang they hit the Table Calculation brick wall…”

I work with nurses and doctors who are smart, highly competent people who daily make life or death decisions. Give them a page of data and they all know how to draw bar charts, line charts, and scatterplots with that data. They can compute means and medians, and with a little help get to standard deviations and more. But hand them a file of messy data and they are screwed; they end up doing a lot of copy & paste, or even printing out the file and manually retyping the data in a more usable format. The spreadsheet software they are used to (hello, Excel) lets them down…

…and so does Tableau.

A data analyst like myself can salivate over the prospect of getting access to our call center data and swooping and diving through hundreds of thousands of call records looking for patterns. However, the call center manager might just want to know whether the outgoing reminder calls are leading to fewer missed appointments. In other words, the call center manager has a job to do that leads to a question she wants answered, and she doesn’t necessarily care about the tool, the process, or the need to tack a few characters onto the front of the medical record number to make it match what comes out of the electronic medical record system; she just wants an answer to her question so she can do her job better. To the degree that the software doesn’t support her needs, there has to be something else to help her get her job done.

The Pivot

When Joe Mako and I first talked about writing a book together, our vision was to write “the book” on table calculations and advanced use cases for Tableau. We wanted (and still want) to teach people *how* to build the crazy-awesome visualizations that we’ve put together, and how they can come up with their own solutions to the seemingly intractable and impossible problems that get posted on the Tableau forums and elsewhere. And we’ve come to realize that there is a core set of understandings about data, and how Tableau approaches data, that is not explicitly revealed in the software nor well covered in existing educational materials. Here are a few examples:

  • Spreadsheets can hold a table of data, and so do databases (we’ll leave JSON and XML data sources out of the mix for the moment). But spreadsheet tables and database tables are very different: spreadsheet tables are very often formatted for readability by humans, with merged cells and extra layers of headers that don’t make sense to computers. A single column in a spreadsheet can have many different data types and cells with many meanings, whereas databases are more rigid in their approach. We tend to assume that new users know this, and then they get confused when their data has a bunch of Null values because the Microsoft Jet driver assumed a column starting with numbers was numeric and wiped out the text values.
  • We—Tableau users who train and help other users—talk about how certain data sets are “wide” vs. “tall”, and how tall data is (usually) better for Tableau, but we don’t really explain the specific characteristics of the data and the principles involved in a way that new Tableau users who are not data analysts can understand and apply to arrange their own data for best use in Tableau.
  • Working with Tableau, we don’t just need to know the grain of the data (what makes a unique row in the data); we also need to understand the grain of the view (the distinct combinations of values of the dimensions in the view). There can be additional grains involved when we start including features like data blending and top filters. Even “simple” aggregations get confusing when we don’t understand the data or Tableau well enough to make sense of how adding a dimension to the view can change the granularity.

Carnation, Lily, Lily, Rose by John Singer Sargent, from Wikimedia Commons

Just as we can’t expect to be a brilliant painter without an understanding of the interplay between color and light, we can’t expect to be a master of Tableau without a data- and Tableau-specific set of understandings. Therefore, we’ve been pivoting our writing to focus more on these foundational elements. When they are in place, doing something like a self-blend to get an unfiltered data source for a Filter Action becomes conceivable and implementable.

This kind of writing takes time to research, think about, synthesize, and explain. I’ve been reading a lot of books, trawling through painfully difficult data sets, filling up pages with throw-away notes & diagrams, and always trying to keep in mind the nurses and doctors I work with, the long-time Tableau users who tell me they still “don’t get” calculated fields in Tableau (never mind table calcs), and the folks I’m helping out on the Tableau forums. So “the book” is going slower than I’d hoped, and hopefully will be the better for it.

If you’d like a taste of this approach, I’ll be leading a hands-on workshop on pill types and granularity at this month’s Boston Tableau User Group on April 29.

Postscript #1: I’m not the only person thinking about this. Kristi Morton, Magdalena Balazinska, Dan Grossman (of the University of Washington), and Jock Mackinlay (of Tableau) have published a new paper Support the Data Enthusiast: Challenges for Next-Generation Data-Analysis Systems. I’m looking forward to what might come out of their research.

Postscript #2: This post wouldn’t have been possible without the help (whether they knew it or not) of lots of other smart people, including Dan Murray, Shawn Wallwork, Robin Kennedy, Chris Gerrard, Jon Boeckenstedt, Gregory Lewandoski, and Noah Salvaterra. As I was writing this post, I read this quote from a Tableau user at the Bergen Record via Jewel Loree & Dustin Smith on Twitter: “Data is humbling, the more I learn, the less I know.” That’s been true for me as well!