2014-11-17

Introduction

I have been thinking a lot about how Alteryx and Tableau work together to help me innovate when solving business and science problems. In this article, I’m going to explain what I mean by the term innovation and how I use Alteryx and Tableau to achieve it.

I will also provide some thoughts on what I think is going to happen in the future with these products.

Although Part 1 and Part 2 of this series are mostly independent material, it is probably a good idea for new readers of this blog to start with Part 1. If you want to read Part 1 of this series, click here.

Basic Definitions

If you want to learn about Alteryx, click here to visit their website. If you want to learn about Tableau, click here to visit their website. If you click on each of these links as I just did, the definitions for these products as presented by their respective companies are:

Alteryx - Intuitive workflow for data blending and advanced analytics.

Tableau - EASY TO USE, EASY TO LOVE. Make analytics easy for everyone.

Notice that each company is selling a message with two things in common. First, both companies indicate that their product is easy to use. Alteryx says that its product is “intuitive”, and intuitive implies something is easy; we have all heard people say, “Oh yes, that is intuitively obvious”. Tableau simply says that its product is “easy for everyone”. Second, both companies indicate that their product is to be used for “analytics”. Alteryx goes further to say that its product is for “advanced analytics”, while Tableau says its product “makes analytics easy for everyone”.

Does this mean that Tableau does not do “advanced” analytics? Does this mean that Alteryx is not necessarily easy or intended for everyone, or routine analytics? These are just a few of the questions that a new user of these software products might ask themselves as they begin looking into using these tools.

I have a goal for this blog post. I want to define the term “analytics”. I want to understand what this term actually means. The reason I want a definition for this term is simple. I find this term being used everywhere I look. Software products are being created for the purpose of doing “analytics”. If this term is so important, I wonder why all the dictionaries I use report this word as a misspelling every time I use it?

The term “Analytics” is defined as the science of logical analysis. OK, so that is the definition, but what does that actually mean? Does that mean that “analytics” only applies to the world of logic?

The word “analytics” originates from the 1590s as a term in logic, from Latin analytica, from Greek analytika. This means that people have been thinking about “logical analysis” for over 400 years! Since the term “analytics” is basically vague, we can approximate the word by dropping the “s”. By doing so, we see that the word “analytic” helps us better understand what we are talking about, as shown in Figure 1.



Figure 1 – Definitions for the word analytic.

Analytic(s) in the context of Tableau and Alteryx potentially refers to definitions 1, 2, 4 and 5. This means that the term “analytics” can have different meanings to different people. In fact, how these software products are used to perform “analytics” can vary significantly from person to person since they are general-use products. However, it is clear that both products are intended to be used for the “analysis” of data, or more specifically, the “logical analysis” of data.

If both products are intended for the “logical analysis” of data, how are they the same and how are they different? If I were to be asked to describe Tableau to someone unfamiliar with the software package, I would say that it is a computer program that allows you to visualize data. That is a clear and simple statement that most people would understand. Breaking the sentence down we see (1) A singular computer program that (2) visualizes data. That concept is clear and easily understood.

Given the same question for Alteryx, I would say that it is a software program that contains a collection of quantitative tools that allows you to manipulate and visualize data. Comparing these two definitions indicates that Alteryx may be somewhat more complicated than Tableau but that they are both in existence to be able to visualize data.

The Alteryx definition I have offered may be less well-defined than the Tableau definition because it involves some “quantitative tools” that allow data “to be manipulated”. When I use the word “manipulated”, I don’t mean changing the values of the data, as Don Henley sings about in the song “Garden of Allah”. What I mean is to manipulate data into different forms, structures or formats to be able to achieve some objective.

Alteryx has a user interface that allows “data manipulation” to be accomplished in a “workflow”. A workflow (Figure 2) is developed and presented on a drawing canvas, but the actions completed by the workflow are quantitative in nature. Alteryx also allows a user to visualize data in certain ways such as on maps, but I feel that data visualization is not the primary focus of Alteryx. At the core, Alteryx is a tool to work with data, plus a whole lot more.



Figure 2 – An example workflow.

By examining these two products in this way, we begin to see the similarities and the differences. We can see that Tableau is intended to be used principally as a visualization platform. Alteryx is principally intended to be used as a way of working with data to reshape, reformat, restructure, slice/dice, assemble, dissect, filter, join, group, or otherwise modify data to prepare it for spatial and/or temporal visualization.

Although each program is intending to achieve the same “logical analysis of data” through visualization, they go about achieving that goal in radically different ways. This leads to particular strengths and weaknesses for each product.

To summarize what I now understand from having worked with each tool fairly intensively, I have concluded that I can use Tableau as a stand-alone product to solve a wide array of problems. If simple data structures exist and only a few data sources are present, I can handle the project using Tableau. I have also been able to handle moderately complex projects using Tableau because I have a lot of experience using it and I have a big toolbox of data manipulation skills, techniques and experience. Some of the things I have accomplished in Tableau probably are not going to be easily understood by the average Tableau user. However, if a complex project is attempted with a wide array of potentially disconnected data sources, I will be most effective in solving the problem by using Alteryx in combination with Tableau.

My history and work experience has allowed me to understand when the time is right to use both tools. Although there is significant overlap in what each product can do for you, there are some clear strengths (and arguably some weaknesses) that exist in each product. Once you understand the strengths for each, you can combine them to accomplish some amazing work. This insight is what this blog post will attempt to define.

I am going to explain my vision of how these tools can be used together in harmonious and effective ways. I have learned that some of the work I am now doing would not be possible with either tool by itself. Although they are both incredible platforms by themselves, it is the interaction between the two that can lead to some seriously innovative work that can knock your socks off. So let’s see what this really means by going through an example that is purely hypothetical but is partially based on real-world experience.

A Theoretical Example Based on Real World Complexity

Imagine that you are a boss at General Motors or Ford or Chrysler, or any other car manufacturer for that matter, and you have been building cars for years. You might have started building the cars decades ago or just a short time ago. In either case, your cars are built from parts that are supplied to you by manufacturers that deliver the goods within the product specifications you gave them when the job began. Over time, parts change and are refined to improve their reliability as lessons are learned along the way.

At your primary manufacturer’s site, they receive products from their suppliers to build the parts they are building for you. At the supplier’s site, they are receiving parts from other suppliers, and so on. A supply chain exists for the products that are built for you.

There is variation in the products you receive over time because the supply chain has variation at all levels. Your cars are built using this supply chain and everyone is doing their best to produce the best parts for these cars. Until the cars are assembled and driven, however, nobody really knows how reliable these cars and their specific parts will be over time. Problems usually take time to develop, but chances are good that some issues will occur for all the cars you produce despite your best efforts to build the perfectly reliable car. It can be said that no perfectly reliable car has ever been created due to the complexities inherent in the product.

A Problem Emerges With Your Cars

Now imagine that a certain car that you produce is experiencing some unusual failure at an alarming rate. Let’s say that the car owners are hitting the gas but the cars won’t accelerate properly. When the gas is applied, a hesitation occurs, followed by a hard jolt as the transmission engages and the car accelerates. This condition has been happening on some cars that are in cold weather areas, with the transmissions not at a normal operating temperature. The consequence of this behavior is that cars that are trying to accelerate into high-speed roadways are getting hit as they attempt to pull into traffic. The basic problem is that the transmission is slipping for several seconds when the gas is applied, causing a delayed acceleration.

You, as the boss, realize that this improper acceleration is a serious problem and you need to find the cause and find a solution to the issue as fast as you can. How are you going to do it?

Let’s assume that based on the feedback from car owners, the car transmission seems to slip when the car is cold and the owners are close to home. You are able to quickly verify this condition exists by testing some of the cars with the faulty condition. Your engineers verify that the transmissions are slipping when the gas is applied and the transmissions are cold. Based on a wide survey of owners, you are able to assemble the VINs of the cars that have experienced this phenomenon, so you have a pretty good idea of where to start your “analytics” investigation.

You know that data has been collected all along the manufacturing processes for transmissions, from the lowest-level suppliers up to your assembly line. You know that chances are very good that somewhere buried in all that data are the reasons why this problem is happening. So you assemble your engineering team and give them the directive to identify the cause of the slippage problem, and to find it fast.

Your team begins by assembling all the data files from the transmission suppliers. They quickly realize that this data is stored in a multitude of databases, all of which have different formats and structures. This data variation exists because each parts manufacturer has its own IT group and they do things their way. This scenario also occurred because nobody had thought about the day when all the data from the various suppliers would be needed to solve a problem like this. Everyone just thought that delivering the manufacturing data along with the manufactured parts was good enough. Nobody had the vision to think about the eventual need for data connectivity between suppliers.

You realize that you are quickly immersed in a data quantity and connection nightmare. You have a large number of files that are not linked together in any type of analytical framework. Your team reports to you that it will take them a long time to complete a cause and effect analysis because the data is disconnected throughout the supply chain.

Looking for A Solution Using Tableau

The engineering group determines that they are going to have to look through the individual data files one-by-one, looking for clues. They are going to try to trace the data for the problem cars, file by file, without any ability to effectively drill through the data up and down the supply chain.

Due to the sheer volume of files, they distribute the files throughout the engineering group to balance the workload. They decide to start visualizing the individual files with Tableau to see if they can spot a trend, like a bad part production run in the third shift on a particular day, for example. By working independently on separate files, progress is slow.

After working a while with this data, they begin blending a few files in Tableau to connect pieces together. When they try to go deeper than a few connections, they start having trouble working with the data. There is too much information and there are too many data sources; the analysts are getting confused and cannot shape the data the way they want.

The engineers also realize that they are looking for a needle in a haystack, so to speak. They are hoping that the problem is being caused by a singular part failure, such as a mal-formed spring, a bad gear, or a bad pressure valve. They are looking for anything to explain what has happened so that the repair campaign can begin and the problem can get solved.

Since the team is working independently on separate pieces of data, they have moments of success followed by the realization that they don’t have the complete picture available to them to conclude anything. They realize that more of the data needs to be connected in some way for them to understand the scope of the problem. They need an integrated data source that can search from the VIN down to all the parts used to assemble the transmission, including the part names, specs of the delivered parts, manufacturing information, car mileage, etc.

After several weeks of work, they have managed to identify a certain gear that was potentially faulty. When this gear was manufactured, the data indicated it was slightly out of spec on average and it had a high reject rate. This finding was discovered by using Tableau to analyze a large number of the supplier data files.

The engineers are perplexed, however, because they cannot understand why only some of the transmissions that contain that gear have failed. They wonder if it is only a matter of time before additional failures occur, or if there is another cause. They wonder if all the cars with the potentially bad part should be recalled. They start thinking of questions that the data currently cannot answer because so much of it is disconnected.

Since you are the boss, you are watching this investigation unfold from a distance. You suddenly realize that your problem is bigger than just this one event. You wonder what you will do when another problem like this arises, with another part of the car.

You fear the arrival of that day because you will once again have to pour countless man-hours into another one-time analysis to identify the problem. You cannot possibly get all the manufacturers to change their IT systems, so what are you going to do with all of this disconnected data from your suppliers?

Adding Alteryx To The Analysis

You realize that a potential solution to your data problem is to use a data blending software tool like Alteryx. This tool will allow your engineers to assemble your data into a consistent framework that can be analyzed both quantitatively and visually. In fact, you can assemble all of the transmission-specific data by developing workflows that target only the key information needed in the analysis. Not only will you gain the visibility that you need across the supply chain, you will eliminate one huge problem: too much unneeded data. Alteryx will allow you to assemble the critical data you need, in the exact format that you need it in, to be able to find a solution to your problem. You cannot easily do this with Tableau since it is not designed to do these sorts of operations.

Once you create your optimized data structure in Alteryx, you can directly create a Tableau Data Extract (*.tde) file for rapid visual analysis in Tableau. By combining the best attributes of Alteryx and Tableau, you can create a solution for the engineering team that allows them to effectively and efficiently drill through the supply chain of parts for the transmissions, starting with the VIN.

Using Alteryx also gives you another huge advantage when future problems arise. Once you have the Alteryx workflows developed, as new cars are produced annually, you simply re-run the workflows to capture the latest manufacturing data. You no longer have to perform one-time operations to connect pieces of data. By taking the time to document and connect the various databases together, you have a long-term solution to a very problematic situation that will continue to repeat itself over time.

Applying Serious Alteryx Horsepower to Solve the Problem

Now back to the story. The engineers have identified a potentially bad gear that might be the cause of the transmission slippage, but they cannot conclude that it is the only cause. They have identified a bad batch of gears that were produced for a period of time at a specific supplier. However, the cars sold with these gears have been dispersed throughout the country and the engineers cannot economically do direct inspections of the cars. Not all the cars containing the potentially bad gear have had a problem.

To understand why only certain cars have failed, the engineers say that they need more data. They recommend to you (the boss) that you need to get them some data from people who fix and maintain these cars. The engineers indicate that they need to learn about the maintenance history of the cars because they think there is another issue that is causing these failures and they can’t find it in the manufacturing data for the specific parts.

As a major car manufacturer, you know that there are many things that go wrong with cars you produce. At your dealerships, you record the maintenance histories of the cars you service but you know that only a fraction of the cars you make will make it back to your dealerships for service.

To expand your service history data, you turn to companies like Firestone, Sears, Midas, and many others to see what data they can provide, hoping to glean insights. These companies are able to provide you with specific service records for your problematic cars. Once you receive this data, you realize that you are back to the same problem you had originally. These files are disconnected and have different data structures. The information you need is there; you just need to find a way to examine it effectively. You realize that Alteryx can solve this problem for you, too.

Upon further analysis, you see that there is something in common with the way the records are kept for car maintenance histories. Each company essentially stores each visit to their shop the same way, although the databases do change from company to company. Each record in the databases you received represents one repair visit. Each record contains a lot of fields, potentially hundreds of fields. Each record contains the car information such as VIN (vehicle identification number), the service date, mileage, and the owner’s personal information.

Each service record also contains codes for each documented problem (111 = front brakes, 112 = rear brakes, etc.), for each service performed (99111 = front brakes replaced, 99112 = rear brakes replaced), and fields for other things like cost of parts, cost of labor, disposal costs, notes, etc.

You quickly realize that to find what you are looking for, you are going to have to join files together, lookup the meaning of codes, filter data, and do other things to isolate the salient (i.e. transmission) repairs that have occurred to these cars. In other words, you are going to have to do the things that Alteryx is designed to do.
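A rough sketch of those operations in plain Python may help make the idea concrete. All field names, VINs, and service codes below are invented for illustration; a real workflow would run these steps over the suppliers’ actual files:

```python
# Sketch: look up the meaning of service codes, then filter records down
# to the salient (transmission-related) repairs. Data is hypothetical.

service_codes = {
    111: "front brakes",
    112: "rear brakes",
    310: "transmission fluid change",
    311: "transmission rebuild",
}

records = [
    {"vin": "VIN001", "code": 111, "mileage": 42000},
    {"vin": "VIN002", "code": 310, "mileage": 98000},
    {"vin": "VIN003", "code": 311, "mileage": 120000},
]

# "Join" each record to the code lookup table, keeping only transmission
# work (codes in the invented 300 range).
transmission_repairs = [
    {**r, "description": service_codes[r["code"]]}
    for r in records
    if 300 <= r["code"] < 400
]

for r in transmission_repairs:
    print(r["vin"], r["description"], r["mileage"])
```

An Alteryx workflow expresses the same join-lookup-filter chain visually with Join and Filter tools instead of code, which is exactly why it fits this job.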

Within a few days of working with these new data files, your analysts have used Alteryx to make a few breakthroughs. They have been able to do some data operations such as joins, multiple transposing operations for data reshaping and applying filters so that only transmission related repairs are emerging through the workflow. The data has been reduced, targeted, and organized so that it can be related to your original manufacturing data. Now the engineers feel that they can make some progress on solving the problem.

When the final Alteryx workflow is completed and Tableau is used to visualize the car histories, a startling find is made. A very strong correlation is found between the car VINs that have experienced the acceleration problem and cars that have had transmission fluid changes as part of routine maintenance. In fact, the vast majority of the cars involved in transmission slippages have been high-mileage cars. Most of these cars have had transmission fluid changes completed as part of their recommended service history. The surprising twist to the story is not obvious, however.

By reviewing the data assembled by Alteryx and visualized in Tableau, it was found that the cars that experienced transmission slippage had one thing in common. They were all serviced by a singular national-level auto service provider. This provider, it seems, made an independent decision to use a different type of transmission fluid that was “specially formulated for high-mileage cars”, rather than using the transmission fluid recommended by the manufacturer. Since the cars that experienced transmission slippage were all high-mileage cars, they received this new type of fluid.
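The kind of tally that surfaces this common factor is simple once the data is assembled. Here is a minimal sketch, with invented provider names and VINs standing in for the blended service history:

```python
# Sketch: tally the service provider for each failed VIN to surface the
# common factor. Providers, fluids, and VINs are invented for illustration.
from collections import Counter

service_history = {
    "VIN001": {"provider": "NationalAuto", "fluid": "high-mileage blend"},
    "VIN002": {"provider": "NationalAuto", "fluid": "high-mileage blend"},
    "VIN003": {"provider": "NationalAuto", "fluid": "high-mileage blend"},
    "VIN004": {"provider": "LocalGarage",  "fluid": "OEM fluid"},
}

failed_vins = ["VIN001", "VIN002", "VIN003"]

providers = Counter(service_history[v]["provider"] for v in failed_vins)
print(providers.most_common(1))  # every failure traces back to one provider
```

In the story, this is the relationship that jumps out of a Tableau view once Alteryx has joined the failure list to the service histories.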

The chemical composition and fluid properties of this new formulation were significantly different than the recommended transmission fluid. Properties such as viscosity, rust inhibition, heat transfer, and a multitude of other items were simply different. Apparently these fluid property differences were enough to cause the bad gear to malfunction under the load of car acceleration when the transmissions were cold. Maybe the fluid viscosity was too high when the transmissions were cold, or maybe the kinematic frictional forces were causing the problem. In any case, the bad gear and the transmission fluid were in some way incompatible in cold transmissions.

Example Conclusions

In the final analysis, it was concluded that the suspect gear and the new type of transmission fluid both contributed to the transmission slippage events. The problem occurred exclusively when these two factors were present in a car. This type of condition is known as a two-factor interaction.
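The interaction logic can be stated in a few lines. This sketch, with invented flags, just encodes the conclusion above: slippage occurs only when both factors are present:

```python
# Sketch of the two-factor interaction: slippage occurs only when BOTH the
# suspect gear AND the new fluid are present. The four cases are invented.

cars = [
    {"bad_gear": True,  "new_fluid": True},   # slips
    {"bad_gear": True,  "new_fluid": False},  # ok
    {"bad_gear": False, "new_fluid": True},   # ok
    {"bad_gear": False, "new_fluid": False},  # ok
]

for car in cars:
    slips = car["bad_gear"] and car["new_fluid"]
    print(car, "-> slips" if slips else "-> ok")
```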

A nationwide notice was sent out to owners of the cars informing them of this incompatibility. Repairs were made to cars with the potentially bad gear and the auto service centers were informed of the problem with the new transmission fluid. The problem was resolved without a major recall event.

In terms of the “logical analysis” of the problem, the combined usage of Alteryx and Tableau allowed this problem to be solved. The problem could not be solved efficiently with either product on its own.

Alteryx allowed the engineers to manipulate the data at whatever level and in whatever ways were needed to gain the insights needed to solve the problem. Tableau excelled at allowing the engineers to visualize the relationships between the wide number of variables. The correlation between the gear and the fluid was discovered in a simple Tableau scatter plot.

Innovative approaches to solving complex problems like this are possible by using the combination of Alteryx and Tableau. These types of challenging data issues exist everywhere, especially as more and more businesses and industries collect and rely on data to improve performance.

My Thoughts of Future Work With Alteryx

First, I want to talk about Alteryx because it is my data Erector Set, my data Lincoln Logs, and my data Legos all built into one package. I use these toy analogies because Alteryx gives us the fundamental building tools to manufacture creative solutions for data. Alteryx is my creative data toolbox, and I can build any kind of data structure that I want. I no longer have to depend upon a variety of software packages to do these types of things; I go directly to Alteryx without hesitation.

Alteryx allows you to assemble and analyze data in an efficient manner. You do not have to be familiar with the details of the data structures, the data schemas, or the data warehouses from which the data originated. You are free to connect to data and to use this data to build a foundation for a “logical analysis”. You just have to learn some basic techniques and the details of how the Alteryx commands operate to become an efficient user. You can quickly learn how simple options such as limiting the number of records when reading big files can save you huge amounts of development time. Every option in Alteryx is designed to save you time and to give you flexibility in building your workflows. Alteryx is a very cleverly designed program.
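The record-limiting idea is worth illustrating outside of Alteryx, too. This sketch reads only the first few rows of a (here, invented in-memory) CSV, the same trick that keeps development iterations fast on big files:

```python
# Sketch: cap the number of records read from a big CSV while developing,
# analogous to the record limit option on an Alteryx input tool.
# The file contents below are invented for illustration.
import csv
import io
from itertools import islice

big_file = io.StringIO(
    "vin,code\n"
    "VIN001,310\n"
    "VIN002,111\n"
    "VIN003,311\n"
    "VIN004,112\n"
)

reader = csv.DictReader(big_file)
sample = list(islice(reader, 2))  # read only the first 2 records
print(sample)
```

With a million-row file on disk, the same `islice` cap means a test run finishes in seconds instead of minutes; you remove the cap once the logic is proven.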

Alteryx frees you from writing custom computer codes and worrying about details like memory allocation and deallocation, creating objects, defining variables and having to do computer code maintenance and development. Alteryx does those things for you and delivers updates to you every quarter, like clockwork. You don’t have to spend time documenting your computer programs, rather you just document your workflows.

Working with Alteryx is equivalent to having a very high-level computer language designed specifically for rapid and comprehensive data analysis. Alteryx gives you functions at your fingertips that allow you to do a tremendous amount of work with the greatest of ease. You just have to learn how to use the functions to get the benefits. Alteryx also provides flexibility in being able to customize these functions for your applications. Alteryx saves you so much time in handling data, for so many different reasons, that I cannot quantify the overall effect. It is truly amazing.

Alteryx allows you to build a more complete data set than you otherwise could with a tool like Tableau. There are a lot of things that you can do in Tableau, but when it comes to building a complex workflow that assembles a lot of different data sources, this is the domain of Alteryx. So when you get a problem that is complex, you need to be thinking about building an Alteryx workflow.

However, don’t forget about using Alteryx for the simpler tasks, too, because there are also great benefits to you at that level. Simple operations like search and replace operations (even with regular expressions), multi-level sorting of files, and filtering big files to create smaller files is very easily completed with Alteryx once you know the basics.
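Those simpler tasks map to a few one-liners when sketched in code. The sample rows and field names here are invented; the point is just how little machinery a regex replace, a multi-level sort, and a filter require:

```python
# Sketch of the simple operations mentioned above: a regular-expression
# search-and-replace, a multi-level sort, and a filter. Data is invented.
import re

rows = [
    {"part": "gear  assy", "supplier": "Acme",    "qty": 5},
    {"part": "valve",      "supplier": "Acme",    "qty": 2},
    {"part": "spring",     "supplier": "Bolt Co", "qty": 9},
]

# Search-and-replace with a regular expression: collapse repeated spaces.
for r in rows:
    r["part"] = re.sub(r"\s+", " ", r["part"])

# Multi-level sort: by supplier name, then quantity descending.
rows.sort(key=lambda r: (r["supplier"], -r["qty"]))

# Filter a big list down to a smaller one.
acme_only = [r for r in rows if r["supplier"] == "Acme"]
print(acme_only)
```

Alteryx wraps each of these in a dedicated tool (Formula/RegEx, Sort, Filter), so the same pipeline becomes three icons on the canvas.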

Also, notice that I haven’t said anything about the advanced features of Alteryx, like spatial analysis, regression modeling, cluster analysis, connections to R, etc. The scope of Alteryx is so big that I cannot attempt to describe it all in this blog post. That is why I’m going to take about 100 more blog posts to tell you that story over the next couple of years. I’ll be focusing on demonstrating technical techniques that are intended to solve common business problems, much like I have done in Tableau.

Alteryx also excels at creating repeatable processes. If you are building a technique or service that you want to sell, or that is simply a repetitive job requirement, Alteryx allows you to design, test, and then implement a solution in a short amount of time. You don’t have to write custom computer code to transform the incoming data to have a repeatable process. You inherently get efficiency and repeatability in the Alteryx workflow process. You also get reusability. You can quickly modify a workflow to solve another problem just by making a copy of an existing workflow and then making modifications. This is a huge advantage we get when using Alteryx.

My Thoughts of Future Work With Tableau

Now I’m going to talk about Tableau. If you have read any of the 75 or so blog posts I have written about Tableau, you already know how I feel about this package. It is absolutely my favorite software package of all time. I have tried a lot of tools through the years, but nothing compares to the excitement I feel when I nail a project using Tableau.

On the critical side of my brain, I have seen deficiencies in Tableau that are principally related to its intent. Tableau is never going to give us tools to attack a complex data management problem like the example that I described in this article. Tableau is built for ease of use, fast response, glitz and glamour. Tableau does not want us to use their tool to do the dirty, data preparation work. They are not going to give us a programming language or scripting tool to expand what we can do. They want to avoid complexity for the user. Remember how this article started: Tableau - EASY TO USE, EASY TO LOVE. Make analytics easy for everyone. That is the Tableau mission and I am OK with that.

If you have a one-time project, Tableau may be completely sufficient to solve your problem. Tableau is not, however, designed to offer you a repeatable or reusable solution. It doesn’t have configuration files that can be used to store your favorite settings such as font sizes, font styles, or graphical settings for a particular type of chart. You cannot store your most recent project settings so that they become the defaults.

When you load Tableau, your style is pre-set and pre-determined by Tableau. In Tableau, you can’t really build a template with a particular look and feel and then reuse that template for a new data source that has a different data structure than the one you are using. Of course, if your data source is the same, you can reuse your workbook by replacing your data source with the new one. These are some of the reasons that Tableau is not a great tool as far as reusability is concerned.

Tableau cannot be customized much at all, so you have to start over with a blank worksheet to begin a new project. This limitation eliminates coding complexity for Tableau and creates a more stable product for the user, but it also causes the user to do a large amount of work on the front-end when beginning a new project. I suspect that some of these deficiencies will be overcome with third-party add-ons to Tableau in the future, but look for native Tableau to remain streamlined and designed for easy usage. Clearly Tableau will continue to improve some aspects of their software, such as copying formatting from sheet to sheet, but overall I do not expect the package to become full of many fine-scale configuration options. Tableau wants to avoid software complexity, and that is a very good thing for users of their product.

Tableau’s simplicity belies its brilliance. It is the most useful tool I have ever discovered because it allows me to explain data so that anyone can see and understand what I am talking about. If a picture is worth a thousand words, some of the Tableau dashboards I have created would fill an encyclopedia. The ability that Tableau gives us to explore data is profound, is unprecedented, and it will continue to get better. If you want to read more about my Tableau enthusiasm, you can read a three-part series I wrote by clicking here. Wow, with all of that praise, what else can I say after that?

Some people speculate that Tableau and Alteryx will merge into one software offering. I’ve seen that in writing. I don’t think that will happen because the company cultures are too different and the mission statements clash. I think it is more likely that a new software program will emerge that takes the best features of each product to produce the next generation of analytics software. The next-generation product will attempt to do what we can now do with these combined tools and expand the universe further by incorporating machine learning techniques and other current-research techniques for working with data. New concepts such as flow-based programming may be the future of this frontier. When this will occur is anyone’s guess, but as I have seen throughout my working life, something better always comes along.

Final Thoughts

With these written words coming to a close, I feel the need to explain why I feel the way I do. I want to explain what makes me tick. Why do I want to keep writing these blog posts when there is almost no feedback, good or bad?

The exception is my lovely wife Toni (Figure 3), who gives me a lot of feedback! She knows first-hand how this mission can drain a man of his energy. She teases me by saying that my writing is simplistic, like “See Jane run.” That is normal, since she teases me about everything. I tell her that I write that way so that people can understand what I am trying to say. I am not trying to write like William Faulkner.



Figure 3 – My best friend, my wife Toni.

Toni wonders why I have malfunctioned. She thinks I have gone crazy because she sees me writing this stuff when most people are asleep. She does not necessarily understand why I buzz in the middle of the night, but she feels it. She sees me staring off into the distance while Alteryx and Tableau solutions are being created in my brain. She doesn’t necessarily comprehend my enthusiasm for this endeavor, but she gives me the support I need to get the job done. For this, I thank her and love her very much.

Toni is a great partner, giving me the latitude to express my creativity. She may not realize that it is hard for me to shut off this energy flow because it is in my nature! I have always been like this, and I hope I will continue to be like this far into the future. In closing, let me put how I feel and think about writing this blog into two simple paragraphs.

Life is short. Time is ticking and memories fade. I have an innate sense of mortality that was etched in my brain when my brother Danny died at an early age, and I know we don’t live forever. Accomplishments are one of the reasons we work. When I accomplish something, I want to share it. I want to document it so that others can benefit from it. If our forefathers hadn’t shared their knowledge, we wouldn’t be where we are now. If the developers of the transistor had kept their secrets to themselves, we wouldn’t have these incredible computing machines that allow us to do these quantitative studies. If computer programmers didn’t share their ideas, we wouldn’t have Tableau and Alteryx. And if Tableau and Alteryx users don’t share their ideas through a medium such as blogging, then our ability to solve really complicated problems will lag.

I believe that we have to push the technology and share our ideas, both to drive software innovation and to improve our ability to solve big problems. I want to use data to help solve worldwide problems, like climate change or healthcare. I want to blast through barriers that I have faced and could not overcome. Every day I get a bit of a return on my investment when I look at my blog stats and see that a few hundred people have visited that day. Over time, that adds up to helping thousands of anonymous partners who share my interests. These are people I may never know, but each day I feel a little happiness knowing that I helped someone solve a problem they were having. These are the reasons I do what I do. I simply want to help people. Nothing more, nothing less. With that, I close my 100th blog post. Thanks for reading.

See Spot run. See Jane run. Spot jumps over the fence. Go Spot.
