I met Einblick’s Vice President of Marketing at 1am in the airport, on my first day of work. The Einblick team was attending Gartner’s annual Data and Analytics Summit in Orlando, Florida for the first time, and invited me to tag along so I could absorb as much information as possible. The summit is usually scheduled for the spring, but Gartner, like all of us, has needed to be more flexible in the face of an increasingly uncertain world. On August 22, bleary-eyed from several unexpected red-eye flights, the Einblick team joined the other 118 or so exhibitors on the show floor, ready for three full days of demos, mingling, and conference proceedings. There have been many write-ups of Gartner’s summit in the last month, summarizing various talks, Gartner’s classic bake-off (think cooking show meets analytics software), and more. In this post, I will highlight the core paradoxes that Gartner introduced to help the data and analytics community unleash innovation and transform uncertainty. By unifying seemingly disparate concepts, Gartner’s summit created opportunities for new perspectives on age-old problems.
Gartner Data & Analytics Summit: Unleash Innovation, Transform Uncertainty
Gareth Herschel, Gartner VP Analyst, and Debra Logan, Gartner Distinguished VP, opened the summit with their keynote address, “Unleash Innovation, Transform Uncertainty” (Gartner, 2022). Based on the title and the purpose of the summit, it may be surprising that Herschel and Logan quoted not one but two novelists in their 45-minute keynote: Sir Terry Pratchett and Toni Morrison.
Data and Analytics = Technology plus Storytelling and Imagination
Aside from being engaging speakers and storytellers in their own right, pivoting from one illustrative example to another, Herschel and Logan reminded conference attendees that data and analytics is just one piece of a larger story: an organization, a community, a business. As a result, data and analytics is not just about having the newest machine learning model or AI, but also about driving a particular story, and using your imagination to push the possibilities.
Big data has driven the concept that more data is inherently better, when in fact, sometimes all we need is what Gartner calls a “minimum viable dataset.” Any data analyst or data scientist (or social scientist, researcher, anyone that uses data) has encountered the feeling of not having enough data, or not having the most data. But is more data always better? Herschel and Logan argue that it isn’t. Sometimes, data can become a liability: the more data you have, the more storage you need, and the greater the potential for security risks. The story you’re telling can help you determine what you need, and that focus can help you become more effective.
Data and Analytics = Art plus Science
The second paradox that was particularly interesting came from psychology: maximizing vs. satisficing personalities. The maximizing personality aims for the best–collect all the information available, and make the best decision possible–sounds like cutting-edge machine learning and artificial intelligence, right? Logan and Herschel pointed out that, yes, we tend to think of maximizing when we think of data and analytics. But is maximizing always optimizing? Maybe not. Satisficers focus on what is good enough, not what is best. This does not sound like big data, the future of AI and machine learning, but it should at least play a role. After all, psychological research shows that satisficers are happier than maximizers.
Although Logan and Herschel did not use the term interdisciplinary, their talk took us in that direction. How can we bring the values and virtues of art into data and analytics? One of the key messages here was that great artists ask great questions. Great data science should give us great answers, but we need those questions to drive the answers. Otherwise, we’re just using newer techniques and technologies for their own sake–without purpose, and without direction.
What Now? Activate and Augment
Building on the paradoxes introduced in Gartner’s opening keynote address, these were my two main takeaways from the remainder of the conference, drawn from Gartner’s key trends for data and analytics in 2022.
- Activate Dynamism and Diversity: share as much as you can, share your data, share your vision, and share your story
- Augment People and Decisions: we have to keep innovating, but we need a starting point, so we need to take the tools of today, and reimagine how they can push us forward
Activate–shared data, shared vision, shared story
One of the unifying themes throughout the sessions, as well as what we saw on the exhibition floor, was the importance of sharing. In fact, one of Gartner’s top trends for data and analytics in 2022 is “Always Share Data” (Gartner 2022). In Gartner’s talk on “How to Drive the Business Value of Data and Analytics,” Alan Duncan shared Gartner’s “Must Share Data Unless” model (Gartner 2022). The core idea is that we no longer have the luxury of working in silos. We must keep in mind security risks and privacy issues, but we have to share something if we really hope to move forward. The emphasis on sharing went beyond just data, and touched on having a shared vision within a company, as well as sharing process and takeaways with stakeholders. Communication and collaboration then become vital.
Communication and collaboration, however, are terms bandied about with such frequency that they sometimes lose meaning. Moreover, it may seem bizarre, even counterintuitive, to center data and analytics on communication and collaboration, but recent events have shown otherwise. The realities of a global pandemic forced all of us to innovate and reimagine communication and collaboration immediately, regardless of industry or field. We needed band-aid fixes just to muddle through our day-to-day. While we are not in a post-pandemic world, we can start striving for more than just making it through the day; we can aim for something a bit better.
Augment–leveraging today’s reality
Data and analytics, like many technical fields, can suffer from inaccessibility. So let’s be clear: what do I mean by augmenting, and what did Gartner mean? “Augmented analytics” or any other such term can become so buzzy as to lose meaning. According to Gartner,
Augmented analytics is the use of enabling technologies such as machine learning and AI to assist with data preparation, insight generation and insight explanation to augment how people explore and analyze data in analytics and BI platforms. It also augments the expert and citizen data scientists by automating many aspects of data science, machine learning, and AI model development, management and deployment.
The idea of augmentation, or enhancement, however, was common throughout the conference. We are all striving to bring the benefits of data and analytics to consumers and businesses. As evangelists of data science, we want to empower people to reap the benefits of data and analytics. But the path to this augmented reality can seem murky.
In Rita Sallam’s talk, “Scaling Analytics for Everyone Through Automation,” she highlighted two key issues:
- How are capabilities shifting from the analyst to consumer?
- How do you leverage and extend your existing investments and skills?
The second issue intrigued me. Although we have to pivot perhaps more frequently than before, we cannot instantaneously download new skills into our people or new capabilities into our tools. Updating infrastructure and training people take time. Even as we move forward, at any given moment, we have to be scrappy and creative with what we have.
In order to leverage these existing resources, Sallam offered a path to scaling analytics adoption through automation, which included investing in data literacy, and investing in the explainability of insights and models. At the end of her talk, Sallam offered six concrete recommendations to the data and analytics leaders present at the conference, including the following:
- Augment the consumer by incorporating new, “beyond the dashboard” capabilities into your strategy and operating model
- Expand adoption by piloting to demonstrate value and build trust; aligning with key business drivers and stakeholders
How can we move beyond today’s data workflow to achieve these recommendations?
Einblick–beyond Python notebooks, an experiment in art and science
At Einblick, we are embedding what designers and artists have embraced in the digital world–the unbounded canvas–into the data science experience. We are doing for data science what Figma has done for design.
Today, there are many pain points in data scientists’ workflow, and one of them centers on the inability to collaborate in Python notebooks. Everyone uses Python notebooks because we’ve inherited these legacy tools. We can code in Python, connect to databases, build models, and more. But the setup is laborious, and the inability to collaborate, explore data easily, and share work in real time is becoming deleterious to progress. If we can’t share work throughout, then we can’t share our story, and we can’t share a vision.
At Einblick, we’re making collaboration inherent to our platform. We understand the difficulties that data scientists have. It’s taxing and time-consuming to import packages, connect to databases, clean data, tune hyperparameters, build out different models, and more. To that end, Einblick released a new set of features to make it easy for users to import Jupyter notebooks. Python cells in Einblick all use the same kernel, storing global variables that can be referenced throughout the canvas. But reproducing what is already available today is not enough for true innovation that will allow teams and businesses to pivot and adapt as quickly as possible.
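To make the shared-kernel idea concrete, here is a minimal sketch in plain Python. The cell boundaries, data, and variable names are my own illustration, not Einblick’s API: the point is simply that because every cell runs on one kernel, a variable defined in one cell is visible from any other cell on the canvas, just as if the cells were executed in one session.

```python
# Hypothetical sketch of a shared-kernel model: each "cell" is ordinary
# Python run against one shared global namespace, so state defined in one
# cell is available to every other cell on the canvas.

import pandas as pd

# Cell 1: load some data (the dataset here is illustrative)
df = pd.DataFrame({
    "region": ["east", "west", "east"],
    "sales": [100, 250, 175],
})

# Cell 2: placed elsewhere on the canvas, this cell can still reference
# `df` directly, because both cells share the same kernel
totals = df.groupby("region")["sales"].sum()
print(totals)
```

In a linear notebook, the second cell only works if it is executed after the first; on an unbounded canvas, the same shared state lets collaborators branch off any intermediate result without duplicating the upstream steps.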
Data science needs to be both an art and a science, as Gartner indicated. The inherently interdisciplinary nature of data science would greatly benefit from more collaboration, and an ease of sharing that is not yet available in widely used tools today.
Once in Einblick, we’re not in the linear notebook environment that data scientists have grown accustomed to. We’re not boxed in. The expansive canvas allows users to easily iterate on different chunks of code, compare AutoML models, and create different visualizations for exploratory data analysis with a few simple clicks.
We are moving and need to continue moving beyond what has been possible before.
Einblick is an agile data science platform that provides data scientists with a collaborative workflow to swiftly explore data, build predictive models, and deploy data apps. Founded in 2020, Einblick was developed based on six years of research at MIT and Brown University. Einblick customers include Cisco, DARPA, Fuji, NetApp and USDA. Einblick is funded by Amplify Partners, Flybridge, Samsung Next, Dell Technologies Capital, and Intel Capital. For more information, please visit www.einblick.ai and follow us on LinkedIn and Twitter.