If you’re not off on some sunny beach somewhere (or even if you are), here are some (free!) opportunities coming up for you to sharpen your Tableau skills and get previews of material that will be in my book. I’ve got three presentations in the next month: two are in New England, the other is a webinar:
June 24th at the Boston Tableau User Group: Making Tableau More Predictable: Understanding the Multiple Levels of Granularity. This is a reschedule of the session I was going to give back in April. It’ll be a combination of presentation and hands-on practice on how to “think Tableau” so your calculated fields, top & conditional filters, table calcs, etc. are more likely to come out the way you expect. Alteryx is demoing their software, and Zach Leber is also presenting.
July 10th for a Think Data Thursday webinar: Setting up for Table Calculation Success. This will also review some of the granularity material, and go through how you can set up views and table calculations so that a) they work, and b) if they don’t work, you can diagnose what is going on and get back to a working calc or submit a really detailed support request.
July 22nd at the (inaugural) Maine Tableau User Group: Getting Good at Tableau. Hosted by Abilis Solutions in Portland, I’m helping to kick off the MaineTUG with a talk on how to set up your data and build your Tableau skills (including how to avoid getting distracted by all the gee-whiz features of the Tableau interface), and I’ll do a brief intro to Tableau 8.2. Grant Hogan of Abilis will be presenting, as well as someone from Tableau.
I’ll update this post as the links for registering appear. I hope to see you (virtually or in person) at one of these events! And if not then, I’ll be at the Tableau Conference in September.
Tableau does amazing demos. Fire up the software, connect to a data source, select a couple pills, click Show Me, boom there’s a view. Do a little drag and drop, boom, another view. Duplicate that one, boom, another view to rearrange. Within three minutes or less you can have a usable dashboard, for 200 rows of data or 200 million.
If you’ve seen those demos, the not-so-dirty little secret of Tableau is that they pretty much all start with clean, well-formatted, analytics-ready data sources. Over time, I’ve interacted with more and more new Tableau users who are all fired up by what they saw in the demos, and then let down when they can’t immediately do the same with their own data. They’ve got to reshape the data, learn some table calcs right away, figure out data blending to deal with differing levels of granularity, and/or put together their first-ever SQL query to do a UNION or a cross product, etc. Shawn Wallwork put it this way in a forum thread back in January: “On the one hand Tableau is an incredibly easy tool to use, allowing the non-technical, non-programmers, non-analysis to explore their data and gain useful insights. Then these same people want to do something ‘simple’ like a sort, and bang they hit the Table Calculation brick wall…”
I work with nurses and doctors who are smart, highly competent people who daily make life or death decisions. Give them a page of data and they all know how to draw bar charts, line charts, and scatterplots with that data. They can compute means and medians, and with a little help get to standard deviations and more. But hand them a file of messy data and they are stuck: they end up doing a lot of copy & paste, or even printing out the file to manually retype the data in a more usable format. The spreadsheet software they are used to (hello, Excel) lets them down…
…and so does Tableau.
A data analyst like myself can salivate over the prospect of getting access to our call center data and swooping and diving through hundreds of thousands of call records looking for patterns. However, the call center manager might just want to know if the outgoing reminder calls are leading to fewer missed appointments. In other words, the call center manager has a job to do, that leads to a question she wants to answer, and she doesn’t necessarily care about the tool, the process, or the need to tack on a few characters as a prefix to the medical record number to make it correspond to what comes out of the electronic medical record system; she just wants an answer to her question so she can do her job better. To the degree that the software doesn’t support her needs, there has to be something else to help her get her job done.
When Joe Mako and I first talked about writing a book together, our vision was to write “the book” on table calculations and advanced use cases for Tableau. We wanted (and still want) to teach people *how* to build the crazy-awesome visualizations that we’ve put together, and how they can come up with their own solutions to the seemingly-intractable and impossible problems that get posted on the Tableau forums and elsewhere. And we’ve come to realize that there is a core set of understandings about data and how Tableau approaches data that are not explicitly revealed in the software nor well-covered in existing educational materials. Here are a few examples:
Spreadsheets can have a table of data, and so can databases (we’ll leave JSON and XML data sources out of the mix for the moment). But spreadsheet tables and database tables are very different: spreadsheet tables are very often formatted for readability by humans, with merged cells and extra layers of headers that don’t make sense to computers. A single column in a spreadsheet can hold many different data types and cells with many meanings, whereas databases are more rigid in their approach. We tend to assume that new users know this, and then they get confused when their data has a bunch of Null values because the Microsoft Jet driver assumed a column starting with numbers was numeric, and wiped out the text values.
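To make that column-typing behavior concrete, here’s a quick sketch in Python with pandas. This is a stand-in, not the Jet driver itself, but it mimics the same “sample the column, guess a type, coerce everything else” behavior that silently turns text values into nulls:

```python
import pandas as pd

# A spreadsheet-style column mixing numbers and text in one column,
# which a spreadsheet happily allows.
col = pd.Series([101, 102, "N/A", 104, "pending"])

# A driver that looks at the first rows and decides "this is numeric"
# will coerce the whole column, wiping out the text values as nulls.
coerced = pd.to_numeric(col, errors="coerce")

print(coerced.isna().sum())  # 2 -- both text values became nulls
```

The numbers survive, but “N/A” and “pending” come through as Null, and a new user sees mysterious missing data with no error message to explain why.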
We—Tableau users who train and help other users—talk about how certain data sets are “wide” vs. “tall”, and that tall data is (usually) better for Tableau, but we don’t really talk about the specific characteristics of the data and the principles involved in a way that new Tableau users who are not data analysts can understand, so they can apply those principles themselves to arrange their data for best use in Tableau.
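Here’s a tiny illustration of wide vs. tall, again sketched in pandas (the region/month data is made up for the example):

```python
import pandas as pd

# "Wide" data: one row per region, one column per month --
# easy for humans to read, awkward for Tableau to work with.
wide = pd.DataFrame({
    "Region": ["East", "West"],
    "Jan": [100, 80],
    "Feb": [120, 90],
})

# "Tall" data: one row per region-month combination, and each column
# holds exactly one kind of value (a dimension or a measure).
tall = wide.melt(id_vars="Region", var_name="Month", value_name="Sales")

print(len(tall))  # 4 rows: 2 regions x 2 months
```

In the tall shape, Month and Sales become fields Tableau can put on shelves directly, instead of a separate pill for every month column.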
Working with Tableau, we don’t just need to know the grain of the data–what makes a unique row in the data–we also need to understand the grain of the view–the distinct combinations of values of the dimensions in the view. There can be additional grains involved when we start including features like data blending and top filters. Even “simple” aggregations get confusing when we don’t understand the data or Tableau well enough to make sense of how adding a dimension to the view can change the granularity.
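The data-grain vs. view-grain distinction can be sketched with a grouping example (made-up order data; the groupby here stands in for what Tableau does when you put dimensions on shelves):

```python
import pandas as pd

# Grain of the data: one row per order line (Order + Region).
data = pd.DataFrame({
    "Order": [1, 1, 2, 2, 3],
    "Region": ["East", "East", "West", "West", "East"],
    "Sales": [10, 20, 30, 40, 50],
})

# Grain of a view with only Region as a dimension: one mark per Region.
by_region = data.groupby("Region", as_index=False)["Sales"].sum()

# Adding Order as a dimension changes the grain of the view: one mark
# per Region-Order combination, so each SUM() covers fewer rows.
by_region_order = data.groupby(["Region", "Order"], as_index=False)["Sales"].sum()

print(len(by_region), len(by_region_order))  # 2 marks vs. 3 marks
```

Same data, same SUM(Sales), but the number of marks (and what each aggregate means) changes as soon as another dimension enters the view.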
Just as we can’t expect to be a brilliant painter without an understanding of the interplay between color and light, we can’t expect to be a master of Tableau without a data- and Tableau-specific set of understandings. Therefore, we’ve been pivoting our writing to have more focus on these foundational elements. When they are in place, then doing something like a self-blend to get an unfiltered data source for a Filter Action becomes conceivable and implementable.
This kind of writing takes time to research, think about, synthesize, and explain. I’ve been reading a lot of books, trawling through painfully difficult data sets, filling up pages with throw-away notes & diagrams, and always trying to keep in mind the nurses and doctors I work with, the long-time Tableau users who tell me that they still “don’t get” calculated fields in Tableau (never mind table calcs), and the folks I’m helping out on the Tableau forums. So “the book” is going slower than I’d hoped, and hopefully will be the better for it.
Postscript #2: This post wouldn’t have been possible without the help (whether they knew it or not) of lots of other smart people, including: Dan Murray, Shawn Wallwork, Robin Kennedy, Chris Gerrard, Jon Boeckenstedt, Gregory Lewandoski, and Noah Salvaterra. As I was writing this post, I read this quote from a Tableau user at the Bergen Record via Jewel Loree & Dustin Smith on Twitter: “Data is humbling, the more I learn, the less I know.” That’s been true for me as well!
I think I’ve mostly met those goals…my first post was 16 months ago today, with 41 since, and I’ll continue posting but at a slower pace for a while. I’ve quietly announced this already: my bigger news is that I’m collaborating with Joe Mako on a series*** of ebooks on Tableau. We’ve been doing a lot of deep thinking and research to figure out how we approach and work with Tableau to build the amazing solutions we come up with, so that we can share those fundamental techniques with others. In one sense, we’re writing the books that we wish had existed when we started with Tableau; in another, we’re trying to make more Tableau Jedi and Zen Masters. Our goal is to help users who want a better sense of how Tableau thinks and works “under the hood” so they can come to their own solutions. It’s a bit different in structure and scope than the other Tableau books out there, with more theory mixed in with the practice.
*** Between a day job, parenting, husband-ing, householding, and feeding my Tableau forums addiction, writing the book is taking longer than I’d initially thought, so instead of trying to write a magnum opus (and have to keep revising for the moving target of new functionality in new Tableau releases), we’re cutting things down to be able to ship something sooner. I’m really excited about this. Here’s a little snippet (of a draft of) something you’ll see in the first book, due out this winter:
We’ll also have a full-size poster available for the real Tableau fanboys & fangirls out there. 🙂
Thanks to everyone who has given me encouragement and support, and I look forward to the next 100K!