Eric Kavanagh, Bloor Group CEO, chatted with TIBCO Analytics Senior VP and General Manager Brian Gentile on April 1, 2015. This Inside Analysis interview covers the subject of putting analytics to work to gain faster, smarter insights throughout the organization.
Eric Kavanagh: Ladies and gentlemen, hello and welcome back once again to Inside Analysis. My name is Eric Kavanagh. I will be your host for what’s going to be a really interesting conversation with Brian Gentile, the Senior Vice President and General Manager of TIBCO Analytics. He came in from the Jaspersoft acquisition, so Brian, welcome to the show.
Brian Gentile: Thanks Eric, it’s really great to be with you.
Eric: Sure thing. We were just talking before we hit the record button about how you’ve been on the road; you’ve just come back from a multi-country tour and heard the latest. What’s going on with analytics in the world today?
Brian: It’s fascinating to get together with customers, the people who are taking the theory of analytics and putting it into action; that intersection is so valuable. I’m always excited about meeting with and learning through the lens of our customers. I can tell you that the starting point for most of the conversations today is how quickly they need to put data to work in order to capture more of its value. It’s the velocity angle, the velocity vector of analytics and data, that we start with. We don’t end there, but how quickly we must put data to work to capture its value is typically the starting point.
Eric: When we’re talking about analytics, I come out of this whole data warehousing and business intelligence history. That’s a very specific kind of intelligence, and analytics, in my mind at least, is much broader than that because you can focus on business processes. You can focus on all kinds of data, not just that traditional transactional data that’s stored in a relational database, for example. How would you define analytics? What is the spectrum of solutions that you talk about with customers?
Brian: This is it precisely. How quickly must we put the data to work in order to gain value? That’s my definition of the velocity of data. With customers, we talk about it across a spectrum from fast data to rich data. Different applications have different needs for putting data to work. It is a bit of a speed versus richness debate. Some applications must have data put to work in seconds or subseconds in order to really get value. The contextual value of that data degrades very quickly after a few seconds. That’s a fast data application by definition. On the other side of the spectrum, the data needs to accumulate over time, and it needs to be defined richly for greater value to be assigned to it, so that temporally, it can be explored to look for patterns and trends. This is a rich data application.
It’s no less valuable and, in fact, in some ways more valuable than a fast data application. It just requires an understanding of how quickly the data must be put to work before it becomes stale. That’s why we look at the spectrum as fast data versus rich data, and we believe at TIBCO that we are experts in solutions across that spectrum of need.
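To make that spectrum concrete, here is a minimal Python sketch, not anything from TIBCO, that routes a workload to a fast-data or rich-data path based on its freshness budget; the five-second cutoff and the workload names are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    freshness_budget_s: float  # seconds before the data's contextual value decays

def route(workload: Workload) -> str:
    """Route a workload to a fast-data or rich-data path by its freshness budget."""
    # A few seconds or less: value decays quickly, so act in-stream.
    if workload.freshness_budget_s <= 5.0:
        return "fast path: stream processing, act in seconds or subseconds"
    # Otherwise: accumulate richly defined data and explore it for patterns.
    return "rich path: accumulate, enrich, explore temporally"

print(route(Workload("fraud scoring", 0.5)))
print(route(Workload("quarterly yield study", 7 * 24 * 3600)))
```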
Eric: With TIBCO, you’ve got this history in fast-moving data and technologies like complex event processing, so it seems that TIBCO is uniquely suited to address some of these business analytics challenges, because you’ve got a history of focusing on and understanding data in motion.
As I recall, what spawned the idea for us to do this conversation was our discussion of some of the latest updates coming out of TIBCO. I realized that on the visualization side you would have a very keen perspective on how to use visualization to help people understand not just static reports, but data in motion or data over periods of time, because time is such an important component of understanding a business, to your point of fast versus rich. Can you talk about how the TIBCO legacy of understanding moving data is now being woven into your ability to show data visually and to allow people better insights through analytics?
Brian: This is right. It’s about three prongs of equally important processes and activities with the data, and TIBCO brings enormous experience to all three. It’s about integration of data, intelligent events and the definition of those events for the data, and then analytics on the data. So integration, events, and analytics each define a pillar of need. TIBCO began as an integration and data integration company. The first pillar is the ability to pull data from a wide variety of sources in real time or in rich time. The second is the ability to define it intelligently: setting thresholds and milestones, and designing those thresholds to be meaningful in a business process. That is the idea of defining events.
Ultimately, it’s about using insightful analytics to visualize the data and take some sort of thought-based action, some insight-based decision, on that data. This is about putting fast data or rich data to work in an organization under an integration, events, and analytics framework. As I traveled, I heard and saw lots of examples from customers that are doing precisely this in drug discovery, energy exploration, and financial services, putting integration, events, and analytics together to create a really powerful return for their business and their customers.
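As a rough illustration of those three pillars, the hedged Python sketch below pulls records from multiple sources, flags the ones that cross a business threshold, and summarizes the result; the record shape, threshold, and function names are hypothetical, not a TIBCO API.

```python
def integrate(sources):
    """Integration: pull records from a variety of sources into one stream."""
    for source in sources:
        yield from source

def detect_events(records, threshold):
    """Events: flag records that cross an intelligently chosen business threshold."""
    for record in records:
        if record["value"] > threshold:
            yield {"event": "threshold_crossed", **record}

def analyze(events):
    """Analytics: summarize flagged events into something a person can act on."""
    events = list(events)
    return f"{len(events)} threshold crossings; worst value: {max(e['value'] for e in events)}"

# Hypothetical sources feeding one pipeline.
crm = [{"source": "crm", "value": 12}, {"source": "crm", "value": 95}]
sensors = [{"source": "sensor", "value": 71}]
print(analyze(detect_events(integrate([crm, sensors]), threshold=70)))
```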
Eric: I love this concept of events. This is something that my partner, Dr. Robin Bloor, has been talking about for the last few years. It used to be that the transaction itself, money flying in one direction and product going in the other direction, that was a transaction. You could call it an event. Now, we’re viewing the concept of an event from a much broader perspective. Does that make sense?
Brian: It does.
Eric: Now you can, for example, understand someone moving through your website and picking up a telephone. You can get that richness and that texture around a transaction to better understand why the transaction took place or why it didn’t take place, why shopping carts are abandoned, that kind of thing. This is where I was going with the window of time – when you put those markers down and say, “I want to be assessing what’s happening within 60-second intervals on my website.” That’s the kind of thing that this history at TIBCO really allows you to understand better than maybe some of the other players in the market, but what do you think?
Brian: That’s exactly right. Today, we simply have a time window that is far more specific than ever before. Several years ago, a company or an organization would have been doing well to design events and integrate them in a way that allowed you to act with maybe one day of delay. Today, it’s literally in seconds or maybe even subseconds. The applications of this type of temporal fast data use are incredibly broad, everything from marketing-related applications, where consumer interaction is being measured and monitored in real time, through to financial services, where you’re watching underlying financial information and transactions flow, through to process manufacturing, where you’re looking at yield information, perhaps in subseconds, through the course of some sort of manufacturing process.
Across all these different industries, the fundamental change is the ability to shrink the amount of time necessary to analyze and put data to work. Designing intelligent events around these thresholds and intelligent definitions is what’s making all the difference.
Eric: You just hit a really good point right there, intelligent definitions, because that’s what we’re really talking about – understanding the nature of the kind of event we’re looking for. The definition then boils down to a set of business rules that are driven by the business model, right?
Brian: That’s right. Precisely.
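Eric’s 60-second website example maps naturally onto a tumbling time window. The minimal Python sketch below, with invented event names, groups clickstream events into fixed 60-second buckets where a business rule could then fire.

```python
from collections import defaultdict

def tumbling_counts(events, window_s=60):
    """Group (timestamp_seconds, action) events into fixed windows, counting actions."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, action in events:
        windows[int(ts // window_s)][action] += 1
    return windows

# Invented clickstream: seconds since the hour started, plus an action name.
clicks = [(3, "view_cart"), (42, "abandon_cart"), (61, "view_cart"), (118, "checkout")]
for window, counts in sorted(tumbling_counts(clicks).items()):
    # A business rule could fire here, e.g. alert when abandons outpace checkouts.
    print(f"window {window}: {dict(counts)}")
```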
Eric: You mentioned an energy example. Maybe walk through what some of those folks are doing from start to finish, or from data to insight.
Brian: Yes, energy exploration is a particularly rich field for putting data to work. You can imagine and you can uncover real-time examples where, during the course of energy exploration, constant feedback from drilling for a new reserve could trigger modifications to the techniques used to uncover that reserve based upon yield information or geological information uncovered during the course of a drilling day. You can imagine how that would have an immediate impact on the success and the costs of exploiting some new energy reserve. You can also imagine over a lengthier period of time how quality data and production capacity data could be accumulated around a certain new reserve to determine whether or not that is a generally fruitful area to be exploring later or whether it should be capped and you should move on.
There are just so many varieties of ways to use analytics to make the entire energy exploration equation more successful, lower cost, higher yielding. It’s one of the most fascinating areas for data.
Eric: Of course, the issue, too, with energy is that you have this physical supply chain that’s so important. You have prices going up and down and different kinds of refining processes and locations, and then even turmoil in the Mideast is a factor, so you really need to build a very sophisticated model to understand what’s happening, and then to optimize your processes and optimize your pricing, right?
Brian: That’s exactly right. Some of the world’s most sophisticated energy companies are using TIBCO Spotfire, for instance, to build out these predictive models, weighing a wide variety of factors, geopolitical, geological, and yield-based information, to determine the success and the profitability of certain exploration techniques or reserves that they’re considering.
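As a toy version of the multi-factor predictive modeling Brian describes, the sketch below fits a linear model over invented geological, geopolitical, and yield inputs; scikit-learn stands in purely for illustration and says nothing about how Spotfire actually builds such models.

```python
# Toy multi-factor profitability model; every number here is invented.
import numpy as np
from sklearn.linear_model import LinearRegression

# Columns: geological score, geopolitical risk, observed yield (illustrative only).
X = np.array([[0.8, 0.2, 60.0],
              [0.5, 0.6, 35.0],
              [0.9, 0.1, 72.0],
              [0.3, 0.7, 20.0]])
profit = np.array([41.0, 12.0, 55.0, 3.0])  # hypothetical profitability per reserve

model = LinearRegression().fit(X, profit)
candidate = np.array([[0.7, 0.3, 50.0]])  # a reserve under consideration
print(f"predicted profitability: {model.predict(candidate)[0]:.1f}")
```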
Eric: That’s really cool stuff. What are some other stories from the road that you’ve heard?
Brian: One of the key threads I learned across financial services firms, for instance, is the need to keep up with very modern techniques for fraud detection, both looking at patterns of fraudulent data from the past to predict what might come in the future and, at the same time, monitoring in real time a wide variety of transactions and click streams to look for the next likely threat and the next likely fraudulent transaction. It’s an amazingly costly endeavor for financial services firms, and anything that can be done to help manage and control it more fully is being sought after today. A lot of the conversations in financial services gravitate to applications around fraud and fraud detection.
Eric: That is such a difficult nut to crack, because what a lot of people don’t realize is that there is an entire industry that is focused on stealing electronically. There are certain countries where there is a higher percentage of these hackers than others, but the fact is there is a really well-orchestrated effort on the part of many individuals and organizations to undermine the security of financial transactions done online. You constantly have to stay on top of that. I’m just fascinated by the whole process and what can be done. I guess it gets back to this whole concept of pattern analysis, of looking for certain patterns of behavior of people who enter into a particular system and do certain things, and then you just have to have some awareness when a new pattern arises that might be worth investigating, right?
Brian: That’s right. It’s made it a lot more complicated to be competitive, because today financial services firms must offer a growing array of access points into their systems. At the same time, every new point of access creates another dimension for potential fraud. It grows exponentially with the number of users and the number of access points. In many ways, it’s the very nature of the competition that forces more and more sophisticated techniques for predicting, managing and controlling potential fraud.
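One simple instance of the pattern analysis discussed here is flagging a transaction that deviates sharply from an account’s recent behavior. The Python sketch below is an illustrative baseline under that assumption, not a description of any firm’s actual controls.

```python
# Flag a transaction whose amount sits far outside the account's recent pattern.
# Real fraud systems combine many such signals; this is one toy baseline.
from statistics import mean, stdev

def is_suspicious(history, amount, z_cutoff=3.0):
    """Flag `amount` if it is more than `z_cutoff` deviations from history."""
    if len(history) < 2:
        return False  # not enough behavior to judge against
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return amount != mu
    return abs(amount - mu) / sigma > z_cutoff

past = [25.0, 31.5, 22.0, 28.0, 30.0, 27.5]  # invented purchase history
print(is_suspicious(past, 29.0))    # False: fits the established pattern
print(is_suspicious(past, 950.0))   # True: likely worth a second look
```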
Eric: Have you noticed any trends that are specific to countries, or are we all facing the same challenges in that regard?
Brian: I think that different countries have different interests across consumption techniques for analytics. For instance, in some countries in Asia Pacific, we’re seeing a faster uptake and interest in consuming cloud-based or cloud-delivered analytics, almost as if they’re skipping some of the infrastructure building that took place elsewhere in the world in the last decade or two.
In Europe, we see quite a lot of focus and concern around the location of the storage of data. If it’s cloud-consumed, where is my data? Is it in my country or some other country? There’s a lot of nationalism at work in different countries in Europe around that concept. It can be both a plus and a minus as it relates to advancing and creating a more sophisticated architecture.
In the United States, we’re surprisingly a little more relaxed about it all. We’re consuming more of a variety of cloud-delivered analytics in different forms, in different applications. I’d say that’s maybe one of the biggest differences I saw as I traveled around the globe: the interest in different techniques for deploying and consuming analytics, either on premises or in the cloud.
Eric: Cloud computing is just amazing. I think it’s still like the sleeping giant out there. Three years ago, interestingly, cloud still did not really register on too many radar screens out there, fewer than I would have thought, honestly. We’re starting now to see, and we track this stuff because we deal with our audience on a regular basis through webcasts, email newsletters, and Twitter, that everyone is getting the cloud religion and realizing that you can leverage the power of the cloud and benefit from some of the cost savings associated with it and the fast on-ramp time. That’s a seriously big deal. I guess I’d ask, how big is cloud for you guys going forward in terms of delivering analytics?
Brian: I’ve always said that the consumption of analytics in the cloud will grow as cloud-originated data grows. In other words, companies and organizations typically don’t want to move their data into the cloud in order to analyze it. If their data is on premises, they’re probably going to want to choose an on-premises tool for analysis. As cloud-originated data grows, and we become more comfortable with originating transactional application data in the cloud and therefore our quotient of cloud-originated data shifts to the cloud, then we’re going to be more and more comfortable with using a tool that is also cloud-hosted.
I use that as a context setter for saying that cloud analytics is incredibly important to TIBCO. We have already placed ourselves in a leadership position with our tools being available in the cloud and being built for the cloud, so the customer has a no-compromise, beautiful cloud-based experience using either TIBCO Spotfire or TIBCO Jaspersoft from and within the cloud and, in fact, within whatever cloud environment they’re most interested in. You’ll watch us continue to shift a higher percentage of our features, functionality, and delivery toward the cloud, because we know that as customers originate more data in the cloud, they’re going to be more and more interested in using a cloud-based, cloud-delivered analytics tool for finding insight in that data.
Eric: Yes, that’s great. You’ve reminded me that two or three years ago, I was talking with some of the folks from Jaspersoft, Brian Boyarsky in particular, about this concept of the intelligence inside and focusing on embedded analytics.
I thought that was an absolutely brilliant strategy, in part because I use lots of different cloud-based solutions that frankly do not provide very good analytical environments for understanding what’s happening. The idea, as I recall, was to work with companies that are developing various kinds of operational software, maybe financial software or industry-specific solutions, and to embed some analytical capability into those applications to bring the analytics right to people in their working environment, where they can start slicing and dicing and better understanding their own unique line of business. Let’s face it, operational software is everywhere, and it’s unique to the industry. It’s often unique to companies. You’ll even get lots of home-grown solutions out there still to this day.
I thought that was a very clever strategy, and I’m guessing that’s still a focus for you guys. Can you talk about that for a second?
Brian: Yes, you actually summarized it very nicely, Eric. The truth is that if we look purely at the penetration rates, the usage rates of analytics over the last two decades, we could conclude that the analytics industry has failed its audience. Today, only 20% to 25%, best case, of knowledge workers in an organization use an analytics tool to analyze any amount of data. Instead of looking at the world through the lens of an analytics tool and saying, “Well, we have to make these tools more broadly available so more people can use them,” we probably now just have to admit that 75% of the audience in an enterprise is not going to leave their day-to-day production application, their day-to-day system, whatever it is they’re working within. They’re not going to leave that system for another separate system that is designed to analyze data.
The way I like to say it is that in order to succeed, analytics must become a thing that you do, not a place that you go. Herein lies the challenge for those of us who build tools. We have to let go of the idea that it’s about the tool. It’s not. It’s about the analytics. It’s about putting analytics in the right place, at the right time, with just the right amount of context – not too much, not too little – so that everyone in an organization can be made more capably analytic, even if it’s just for a small slice of the day that they need in order to make a decision based on data and insight. That might be used just once or twice during a day, but if we can do that, we will have succeeded where everyone else has failed.
This is the charter, in a sense, of TIBCO Analytics, to reach the 75% who, today, have no interest, no ability, even, to go to a separate application to analyze data, and at the same time, we want to make the lives of those 25% whose job it is to analyze data even more delightful. Our job overall is to reach 100% of those in an enterprise.
Eric: Yes, and I really like this idea of bringing the analytics to the user in their existing application.
Brian: That’s right.
Eric: Because the fact is, that’s where they live most of the time if they’re doing their job. So if you can deliver underneath the covers and through the user interface that they are accustomed to using, that’s a huge victory. Even if you did go out and buy a high-powered analytical tool, even if you bought 500 licenses so that all of your line-of-business managers and their direct reports could use it, you’d have to train those people, disrupt their workflow, and get them going from one application to another. But if it’s all in the same screen, and they can see some analysis or maybe drill down a little bit to better understand why this order is late, why this happened, why that happened, then it so incredibly lowers the barrier to analysis that you’ve suddenly changed the workflow. You’ve changed how people think, and to me, that is the pathway to fostering this culture of what you might call analytical awareness, right?
Brian: That’s it. If we do our jobs right, people won’t even think of what they just did as an analytic artifact. They’ll just think that they were going through the normal process of working through their day. Analytic information was delivered to them inside of that process or inside of that application, to which they responded or acted, and it made for a more successful outcome. In their mind, that is fully analytics, although they might not ever really recognize that’s what they were doing.
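To illustrate “a thing you do, not a place you go,” here is a minimal, framework-free Python sketch in which an operational order screen inlines one contextual analytic instead of sending the user to a separate tool; the data and names are hypothetical.

```python
# Hypothetical operational data; in practice this would come from the app's store.
ORDERS = [
    {"id": 101, "customer": "Acme", "late": True},
    {"id": 102, "customer": "Acme", "late": False},
    {"id": 103, "customer": "Acme", "late": True},
]

def late_rate(customer):
    """The embedded analytic: share of this customer's orders arriving late."""
    rows = [o for o in ORDERS if o["customer"] == customer]
    return sum(o["late"] for o in rows) / len(rows)

def render_order_screen(order):
    """The operational screen the user already lives in, with insight inlined."""
    rate = late_rate(order["customer"])
    hint = f"heads-up: {rate:.0%} of this customer's orders ran late" if rate > 0.5 else ""
    return f"Order {order['id']} for {order['customer']}. {hint}".strip()

print(render_order_screen(ORDERS[0]))
```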
Eric: Right. The other interesting thread that I’ve been following these days revolves around this whole story line of machines and the rise of machine learning. There are, in my opinion, some scare tactics out there to make people afraid of machines. I have to say, I don’t care how good your analytical tool is. There will always be a need for people to be involved, because we are the ones who can understand the context of an event, understand the business model as such, and then make decisions. I really don’t see any threat to the human workforce from machines, no matter how powerful their analytical processes might be, because, again, we’re just enabling people to make better decisions with more useful information and better insights, right?
Brian: Yes. This is exactly right. With the whole concept of using the logic of algorithms to improve our lives, we’re just at the very forefront of that today. Using machine-generated data has this incredible hope, this incredible possibility, of allowing us to more quickly assimilate useful facts from the day and make a more capable software system emerge. Within TIBCO Spotfire, for instance, we have recently released a new class of functionality that we refer to as recommendations. Recommendations is actually a really powerful, artificial intelligence-like engine that, during the data ingestion process, runs through about 2,000 algorithms characterizing the data being ingested.
By the time Spotfire has pulled all of your data into its memory, it knows far more about your data than you do. In fact, it knows so much more that it’s able to immediately start recommending specific visualizations and best practices to you as a starting point. We see this as Step 1 in putting algorithm-based heuristics to work in a way that makes for a much better human experience, and it’s an incredibly powerful feature set all by itself.
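As a deliberately tiny stand-in for the idea behind such an engine, the sketch below profiles the ingested columns and suggests a starting chart. The handful of rules are invented for illustration and bear no relation to Spotfire’s actual algorithms.

```python
def profile(column):
    """Characterize one column of values as numeric, temporal, or categorical."""
    sample = column[0]
    if isinstance(sample, (int, float)):
        return "numeric"
    if isinstance(sample, str) and sample[:4].isdigit() and "-" in sample:
        return "temporal"  # crude date heuristic, e.g. "2015-04-01"
    return "categorical"

def recommend(columns):
    """Suggest a first visualization from the mix of column types present."""
    kinds = {name: profile(values) for name, values in columns.items()}
    if "temporal" in kinds.values() and "numeric" in kinds.values():
        return "line chart over time", kinds
    if list(kinds.values()).count("numeric") >= 2:
        return "scatter plot", kinds
    return "bar chart by category", kinds

data = {"day": ["2015-04-01", "2015-04-02"], "revenue": [1200.0, 1350.0]}
print(recommend(data))
```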
Eric: I really like that as well, because I think the first step is often the hardest for people if they’re trying to understand, especially if they’re new to this business or to the industry. Maybe they’re businesspeople who are joining a company that has this analytical mindset. They have the tools. A senior executive says, “Okay, we’ve given you access to these tools.” That first step of just throwing some visualizations on top of some prepared data, that’s a daunting thing, but if it’s done for you, and especially if the graphic itself is interactive with slider bars, for example, which are one of my favorite tools to use for understanding data, or any other kind of interactivity, that really helps get the juices flowing and helps people realize, “Oh, wait, there is something underneath here. I just have to play around to find it.” Right?
Brian: That’s right. Immediately putting live data to work, especially for a business user, invites them into a dialogue, into a conversation. The tool should be able to have that dialogue with you in a way that yields much richer insight more quickly and essentially makes you, as the businessperson, a better analyst. Ultimately, that’s what we need in order for everyone to be more capably analytic.
Eric: Yes, that’s good stuff. I think that there is a tendency to either overplay or underplay the possibilities with analytics, and one key point I like to remind people about is that if you’re going to do a good job with analytics, it does take time. You’re not going to get the best idea in the first five minutes. It usually takes a fairly significant discussion back and forth with the data to understand some patterns and to start to map things together in your head to understand where the possibilities are, because it’s not just some pattern in the data. It has to be a meaningful pattern in the data that represents some opportunity for change in your business. I think that’s a pretty fair assessment. What do you think?
Brian: I think you’re right. The faster time to insight is very real today. It literally can happen in seconds, especially with something like the recommendations engine, but then usually what happens is a conversation, a series of iterations where you start exploring ever more fully, and that’s what leads to fundamental change. Even though the insights can come in seconds now, it’s about the dialogue that you get pulled into that creates much better business outcomes. That does take more time. At least, though, the software is helping to pull you into that dialogue.
Eric: Maybe, to close out, what about closing the loop and getting people to change what they do in the business? Any given business analyst only has so much influence over the management team or over people who actually run operations. If you’re going to get value from analytics, you have to change something at some point. What’s some advice you can offer to the people out there who get all this stuff, who understand it, and are just having a hard time persuading someone above them, below them, or adjacent to them to change what they’re doing?
Brian: I think we’ve already talked about two concepts that I’ll summarize, and then I’ll offer a third that is maybe the most important specifically for your question. The first concept we’ve already talked about is understanding the fast data versus rich data spectrum and knowing where along that spectrum the specific application or business problem you’re trying to solve sits. The second concept is putting integrated data, intelligently defined around event thresholds, together with analytics to enable a fast enterprise to emerge. The kind of power you can bring to a solution in your market through this combination of integration, events, and analytics is probably unparalleled.
The third, which we haven’t talked about but which broadens the lens for every organization, is what I refer to as use all the data now. Today, like never before, we can use all of the data that is available to us. If you think back over several decades of computing power and information technology, we were forced into environments where, based on costs and compute capacity, we had to use a subset of the data for a wide variety of reasons. Today, those limitations are removed. In effect, the only limitation on using all the data is essentially the size of our ideas and the quality of what it is we want to put to work.
Using all the data today is something that every company has to get comfortable with and embrace. Big data, small data, fast data, rich data, all of it can be turned into competitive advantage and value. It’s up to us. It’s almost like a new responsibility of all business leaders to understand, in their organization and their market and industry, how they can put all of the data to work in a way that creates more value for their customers, their partners, their shareholders, or whomever.
Eric: That makes a lot of sense. I like this new responsibility concept, and I really liked the intelligent events concept as well. I think that’s where the future of analytics is going, and it’s going to improve life for businesses and users and for the population in general.
Brian: Yes, ultimately!
Eric: Folks, we’ve been talking to Brian Gentile, Senior VP and General Manager of TIBCO Analytics. Find out more about them online: look up TIBCO at www.tibco.com, along with Spotfire and, of course, TIBCO Analytics. Thank you so much for your time, Brian.
Brian: Thank you, Eric. It’s always good to speak with you.
Eric: Okay, folks, you’ve been listening to Inside Analysis. Take care. Bye-bye.