FEED YOUR CRAVING!
This project grew out of having a lot of time on my hands at work: I was on the bench as a newly-minted consultant, eagerly awaiting my first client project. True to form, I'm writing this blog post belatedly, so I'm now a bit fuzzy on the early details, since it was about 8 months and 2 jobs ago, but I'll do my best to recount what probably happened.
In my boredom, as I got to thinking about what I might like to visualize, there had probably been some recent outing with my husband during which he commented that a certain house or town was "too far from the nearest Taco Bell." This is a regular occurrence for him/us, so it stands to reason that this was the inspiration. I managed to find an open data set with the locations of all Taco Bell restaurants in the United States and Canada on the handy-dandy site POI Factory, and lo and behold, they also had a handful of other restaurant data sets available with the same schema (the same column structure, which is super helpful when merging data sets together!), so I downloaded all of them.
The next step was to dump all of these files into Tableau Prep Builder to union them together and clean up some of the fields. I undoubtedly spent way more time on this step than I really needed to, because I wound up only using the lat/long fields and then included the address string in the tooltip, but hey, I was bored.
After Tableau Prep, I dusted off my janky Python skills to create the circles you see in the map as GeoJSON polygons. (For those not in the know, GeoJSON is JSON specifically for spatial data, and JSON is a text format that stores data as nested key-value pairs.) I painstakingly did the calculations to convert miles into degrees of latitude and longitude, then implemented a function to draw circles of the various radii I specified around each point. I had trouble getting Python to loop over both the radii and all the points, so I wound up just making a separate GeoJSON file for each radius value and unioning them together in Tableau. Months after I finished this little project, I attended Tableau Conference 2022 and learned that there are two functions in Tableau that would have turned my week of Python work into 2 very simple calculations: MAKEPOINT() to make a point out of the lat/long pair, and BUFFER() to generate the circles of varying radius, in miles, around my points. Oh well, hooray for learning!
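For the curious, a minimal sketch of that circle step looks something like this. It assumes points stored as (lat, lon) pairs in degrees, and uses the usual back-of-the-envelope conversion of about 69 miles per degree of latitude, with degrees of longitude shrunk by cos(latitude):

```python
import math

def circle_polygon(lat, lon, radius_miles, n_points=64):
    """Approximate a circle around (lat, lon) as a GeoJSON Polygon geometry.

    Uses roughly 69 miles per degree of latitude, and shrinks degrees of
    longitude by cos(latitude) to account for the converging meridians.
    """
    deg_lat_per_mile = 1 / 69.0
    deg_lon_per_mile = 1 / (69.0 * math.cos(math.radians(lat)))
    ring = []
    for i in range(n_points):
        theta = 2 * math.pi * i / n_points
        ring.append([
            lon + radius_miles * deg_lon_per_mile * math.cos(theta),
            lat + radius_miles * deg_lat_per_mile * math.sin(theta),
        ])
    ring.append(ring[0])  # GeoJSON rings must end where they start
    return {"type": "Polygon", "coordinates": [ring]}

def feature_collection(points, radius_miles):
    """One GeoJSON file's worth of circles, all sharing a single radius."""
    return {
        "type": "FeatureCollection",
        "features": [
            {
                "type": "Feature",
                "properties": {"radius_miles": radius_miles},
                "geometry": circle_polygon(lat, lon, radius_miles),
            }
            for lat, lon in points
        ],
    }
```

(For comparison, the Tableau version is roughly a single calculated field along the lines of BUFFER(MAKEPOINT([Latitude], [Longitude]), [Radius], 'miles').)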
And so, after a bunch of coding and data cleanup, I was finally able to jump into Tableau and start visualizing. I assigned each restaurant a main color from its logo. Here I was reminded that many restaurants use red in their branding (check out this study from the University of New Hampshire to read about the phenomenon), so in Wendy's case I had to dig especially deep for a distinguishable color; since I paired the bars with the restaurant logos as a legend, though, it works well enough. Here's what I came up with and published originally:
I made the header image in Figma, with the little confetti matching the colors of the bars. I'm a huge fan of the symmetry of butterfly charts, so I included essentially two of them below the map, one as a pair of lollipop charts separated by the center content, and one vertically.
Riding the high of a completed project, I was congratulating myself on a job well done until I got some feedback from a viz guru I admire. My friend told me to think about balance across the visualization as a whole, horizontally as well as vertically on the whole page. He said the bottom was very "heavy" and the title design was a bit jarring. Among other ideas, he suggested making the map expandable and the controls hideable, and using a scatter plot instead of the two very tall lollipop charts.
This was all well and good, and of course he was totally right, but by that time I'd started on a project and didn't have time to implement any of the ideas he'd given me, so my notes waited forlornly on a whiteboard until about a month ago, when I found myself back on the bench with some time to kill.
The resulting viz is a good bit shorter (as in less tall), and the scatterplot seemed to be well-received by the folks to whom I've shown the new draft. The bars with the restaurant logos work much better side-by-side instead of one atop the other, and I was able to put the controls and text in a collapsible container to make the map expandable. Overall I'm really happy with how it turned out, and I hope you enjoy it as well!
Dance of the Planets
For once, this post is not belated because I just finished development on this little app yesterday. Small victories!
I am now working as a Senior Consultant out of Slalom's San Francisco office. Since I just started early this month, I'm still "on the bench" as they say, waiting to be sent into action on my first project. I love that as a consultant, I am encouraged to use this down time to learn and grow professionally! I've been wanting to branch out from just using Tableau all the time, more into the world of front-end data visualization, so now is the perfect time.
Tutorials completed, I could finally get started.
For this project, I'm using data from NASA's Horizons API, which has coordinates for many different bodies in our solar system - from planets and their moons to asteroids and comets - at pretty much any given time, relative to your choice of city or even planet. So my first real step was to get something on the page using static data that I downloaded from the Horizons web app, as shown above. Once I got that working with one body, I downloaded a few more and made sure my script would work by looping through the list of bodies. I wrapped all of my spheres in high-resolution textures from this site to make the planets and the Sun all look realistic.
Next, it was time to start making real API calls to get my data. This is where development had to slow down a little, because the NASA API server can't handle too many calls in quick succession, so each one had to be spaced out using a timeout function in Angular. The final wait time between calls is 2.5 seconds(!), which explains why the app takes so stinking long to load: it has to pause between the calls for all 8 planets (no, Pluto is not a planet!) and the Sun.
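The throttling pattern itself is simple. My app does it with an Angular timeout, but here's the same idea sketched in Python instead (the body IDs are the Horizons codes for the Sun and the 8 planets):

```python
import time

# Horizons major-body IDs: 10 = Sun, then 199 (Mercury) through 899 (Neptune).
BODIES = ["10", "199", "299", "399", "499", "599", "699", "799", "899"]

def fetch_all(fetch_one, bodies=BODIES, delay_seconds=2.5):
    """Call fetch_one(body_id) for each body, pausing between calls so the
    rate-limited server never sees back-to-back requests."""
    results = {}
    for i, body in enumerate(bodies):
        if i > 0:
            time.sleep(delay_seconds)  # the pauses add up, hence the slow load
        results[body] = fetch_one(body)
    return results
```

With 2.5 seconds between each of the 9 calls, that's 20 seconds of pure waiting before the last response can even begin.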
Ultimately, it was the first part of this StackOverflow answer that I was able to use to fix the CORS problem once and for all. I very easily (and freely) created a little app on Heroku and deployed this CORS Anywhere proxy to it. For once in my dev career, everything actually worked as expected and I didn't run into any weird, undocumented bugs or glitches. It was blissful.
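The proxy pattern is delightfully simple: the front end prefixes every request URL with the Heroku app's own URL, and CORS Anywhere forwards the request and adds the missing CORS headers. A tiny sketch, with a made-up proxy hostname standing in for my real one:

```python
def proxied(url, proxy_base="https://my-cors-proxy.herokuapp.com"):
    """CORS Anywhere expects the full target URL appended directly after
    the proxy's own URL: https://<proxy>/<full-target-url>."""
    return f"{proxy_base}/{url}"
```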
From here, there wasn't too much left to do! I created the header as an image in Figma, which is a wonderful design tool - great for creating and editing vector images, amazing for collaboration, easy to use, and has a very solid free option. Then I added a minimal blurb of informative text, slapped on my cute little logo (created by the amazing Andrea Millea), and deployed it to my Github site.
Are there things I could have done differently? Absolutely. To name just a few, I could have added:
This is a VERY belated post! Like more than 3 years late! But better late than never, right?
For this project, I used data from Crit Role Stats. They don't currently have an API but they do house most of their data in Google Sheets and what they don't have in a sheet lives in Google Docs which are embedded on their site. The Google Sheets data was easy to pull into the dashboard using Tableau's built-in Sheets connector, and the rest presented a fun challenge.
To begin, for each set of non-Sheets data that I wanted, I built a Python-powered web scraper on Morph.io; each one is housed in my own Github in a repository ("repo") labeled with "critrolestats". I then created a Google Sheet of my own, where each tab points to one of my Morph.io APIs, like so:
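At heart, each scraper just pulls the rows out of an HTML table in an embedded doc. The real ones run on Morph.io (which stores their output in a SQLite database), but here's a stripped-down, stdlib-only sketch of the row-extraction step:

```python
from html.parser import HTMLParser

class TableScraper(HTMLParser):
    """Collect the text of every <td> cell, grouped by <tr> row --
    roughly what each little scraper does to its embedded table."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_cell = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
        elif tag == "td":
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell and data.strip():
            self._row.append(data.strip())

def scrape_rows(html):
    """Return the table's contents as a list of rows of cell text."""
    parser = TableScraper()
    parser.feed(html)
    return parser.rows
```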
The "importdata()" function in Sheets was instrumental to making this all work! From here, I was able to again use Tableau's built-in Sheets connector to pull the data into the dashboard. The API call is made each time the extract is refreshed, which is daily on Tableau Public, so the dashboard stayed up to date until the end of the campaign in 2021.
As for the details about the inner workings of the Tableau workbook, I leave it to the reader to download and sniff around. :)
In a very cool (for me) turn of events, the Wildemount Dashboard was part of the Long List for the 2019 Information is Beautiful Awards. Click here to see the entry.
As of this writing, the viz has almost 10,000 views on Tableau Public.
Also I'm pretty sure this dashboard helped me land a job in early 2020, as I joined a team of D&D nerds with whom I still play on a quasi-regular basis even though I've since moved on from that company!
There is a business in the small town of Logan, nestled in the hills of rural southeastern Ohio, called The Artbreak. It is truly a unique place: in one building, owned by one couple, it houses a real estate brokerage, a piano repair shop and showroom, and an art gallery. I worked there as an administrative assistant for 4 years right out of high school, their first and sometimes only employee, while I put myself through my first round of college at nearby Ohio University.
In spring of 2018, I received a message from my former boss, the owner of this distinguished and artistic establishment. She told me that she was putting on a gallery exhibit that fall about alumni of the local high school who'd had "particularly creative career paths" and wondered if I might like to submit one or more data visualizations to the exhibit.
So it was that I embarked on a journey to make a new visualization for this exhibit, one which would be relevant and hopefully interesting to the good people of rural southeastern Ohio. On the State of Ohio's website, I found W-2 data in CSV form ranging from 2011 through 2017; this data included name, job title, department, total wages earned for the year, and hourly wage.
What a treasure trove! After some data wrangling and cleaning in Tableau Prep, I realized I would be safest using the hourly wage: some positions might have overtime, and some people might not have been employed the entire year, so hourly wage was the best way to treat all positions equally. I then ran the names through an API that returns the likely gender of each, and excluded from all further analysis the surprisingly few that came back as Unknown Gender. I should point out that this likely gives an incorrect result for some names which may be either female or male, but I've assumed the effect isn't large enough to make a significant difference in the results.
My original intention had been to show the gender pay gap over time, but I found the more interesting story to be in the most recent full year: 2017. As you can see in the visualization, at a high level using both the mean and median, the pay gap is nearly non-existent. The median is higher for women than for men, while the mean is higher for men; this suggests that there are more men than women in the higher-paying roles, which is shown in the lower left corner of the visualization.
While I would have liked to break things down by seniority and job experience, that was not possible with this data, so I settled for breaking down by department (those having at least 100 employees) and looking at the difference in average (or median) hourly wage between the men and women therein. You can see this result on the right-hand side of the visualization. The $1.09 (average) and $1.10 (median) figures come from taking, for each department with at least 100 employees, the difference between the men's and the women's average (or median) hourly wage, and then averaging those differences across departments.
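For the curious, that per-department calculation looks roughly like this in Python, using made-up toy records rather than the real Ohio data (the actual work happened in Tableau):

```python
from statistics import mean, median

def department_gaps(records, min_employees=100):
    """records: (department, gender, hourly_wage) tuples, gender 'M' or 'F'.

    Returns the average, across departments with at least min_employees
    people, of the men-minus-women gap in mean and in median hourly wage."""
    by_dept = {}
    for dept, gender, wage in records:
        by_dept.setdefault(dept, {"M": [], "F": []})[gender].append(wage)

    mean_gaps, median_gaps = [], []
    for wages in by_dept.values():
        if len(wages["M"]) + len(wages["F"]) < min_employees:
            continue  # skip small departments
        if not wages["M"] or not wages["F"]:
            continue  # can't compute a gap with only one gender present
        mean_gaps.append(mean(wages["M"]) - mean(wages["F"]))
        median_gaps.append(median(wages["M"]) - median(wages["F"]))
    return mean(mean_gaps), mean(median_gaps)
```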
This was an incredibly rewarding project, which I proudly stood beside during the exhibit opening in Fall 2018.
Here you'll find detailed write-ups of all projects featured on the homepage.