Your data represents a massive investment, but are you getting a good return? A new CIO.com survey of data leaders highlights that data analytics is a priority, but shows that those leaders don’t have the people and tools they need to find and leverage the value in their data.

Traditional data analysis tools and techniques are falling short in today’s data-rich environment, limiting the advantages and value to be gained by analysts. “The Secret to Doing More With Data” shares where teams are today, as well as what organizations must do to gain a competitive advantage moving forward.

The results of the survey illustrate that teams today know they could, and should, be doing more with their data. Traditional BI dashboards aren’t giving teams enough insight to make strategic decisions, leaving teams in the dark about the complex relationships in their data. The solution to that problem could be to hire more data scientists, but the survey confirms that those skills are hard to find.

So what can data leaders do? They can upskill their data analysts with advanced analytics tools that use AI to enable Intelligent Exploration, so analysts can see and share the insights that will impact their company strategy.

Learn how Intelligent Exploration is the answer to doing more with your data by downloading the CIO.com report.

Summer is for road trips, right? Virtualitics is excited to be participating in two major events next week: Snowflake Summit 2023 and the Data + AI Summit by Databricks. Whichever show you’re visiting, you’ll find Virtualitics experts ready to talk all things data analytics and Intelligent Exploration.

Where Data Analysts Can See Intelligent Exploration Live

The Snowflake Summit in Las Vegas, Nevada, and Databricks Summit in San Francisco, California, will both feature keynote speakers from leading companies, as well as breakout sessions and workshops on a variety of data-related topics. These conferences are great opportunities to network with other data professionals, learn from leading experts, and see the latest data and data collaboration technologies…including the Virtualitics AI Platform!

Attendees will be able to experience sessions and demos about:

  • Data lakes and lakehouses
  • Machine learning and artificial intelligence
  • Data engineering and data science
  • Cloud computing
  • Data governance and security
  • Data collaboration

We are looking forward to hearing and seeing the latest trends in data and data collaboration from leading experts in the industry. We’re also excited to show how the Virtualitics AI Platform can be used to solve real-world problems for Databricks and Snowflake users.

Databricks Summit attendees can meet with our team at booth 42.

Snowflake Summit attendees can visit our team at booth 2132-B.

Advanced Analytics Tools Powered by Intelligent Exploration

Not planning to be at either show? No worries! We would love to chat about Intelligent Exploration wherever you are.

And of course, you can always request a demo with our tech team to see how the Virtualitics AI Platform can help you get more value from your data.

 

Ok, maybe you don’t have to ditch your dashboards entirely. But chances are good that the old-school BI dashboards or spreadsheets you’re using are providing incomplete and outdated snapshots of your business. And there are so many of them, with more added all the time. That’s not how to get the most value out of your data or your analysts.

If you feel like your data and analytics capabilities have fallen behind, you aren’t alone. A recent CIO survey found that 85% of organizations aren’t using tools designed to explore complex data. That means most companies have teams of analysts who are providing reports on the snippets of the past, rather than strategically recommending actions for the future.

Virtualitics is already being recognized as an advanced analytics platform that is ready to elevate data analysis across industries. With new investment from a trusted financial institution, exciting new commercial opportunities, and awards for innovative design, we know our solution helps companies across industries find value in their data. Traditional BI tools and methods consistently fall short when it comes to exploring complex data to unearth meaningful and actionable insights. If you want your analysts to level up to advanced analytics, then it’s time to back up your dashboards with something more powerful.

Dashboards and Spreadsheets Don’t Serve Analysts

Dashboards and spreadsheets have been the go-to tools for data analysis for many years. They attempt to organize and present data through charts, graphs, and pivot tables. But our datasets have grown, and the limitations these tools have always had now severely hinder their ability to deliver comprehensive insights.

  • Traditional BI tools are usually static. Dashboards and spreadsheets present data in a static format, providing a limited snapshot of information at a particular moment in time. They lack the dynamism needed to explore the data with different techniques or to move between different visualizations.
  • Dashboards don’t identify patterns and relationships. Dashboards and spreadsheets are primarily designed for visualization of relationships between just a couple of variables–sales by month, or revenue by client size–but they don’t allow for analysis across wide data sets. Even with multiple dashboards, it’s a struggle to discover intricate patterns, correlations, and trends across attributes. The temptation with more data points is to just add more tables to the dashboard.
  • Insight is left to be discovered. BI tools rely heavily on the user’s ability to manually identify insights, which can be time-consuming, require significant experience, and be prone to human bias. And as datasets keep getting bigger and dashboards keep getting busier, the insight just gets more deeply buried.

Make the Move to Advanced Analytics

While dashboards and spreadsheets have played a valuable role in data analysis, the limitations they possess can hinder businesses from fully capitalizing on their data assets. Advanced analytics tools allow analysts to leverage powerful data exploration, advanced visualizations, AI integration, and collaboration capabilities. Embracing advanced analytics empowers organizations to extract deeper insights, make informed decisions, and gain a competitive edge in today’s data-centric world.

What should teams be looking for in a great advanced analytics platform?

Powerful Data Exploration: AI-powered advanced analytics tools provide a flexible and interactive environment that enables users to explore data in-depth. Users can easily navigate complex datasets, interact with visualizations, apply various filters, and uncover hidden insights that may have gone unnoticed with traditional BI methods.

Machine Learning and AI-guided Analysis: Generative AI should be your data analyst’s best sidekick, not a gimmick added on for looks. Machine learning and artificial intelligence algorithms can guide analysts through complex data, highlighting and explaining patterns, predicting outcomes, and providing proactive recommendations. This level of automation and intelligence can significantly enhance decision-making processes.

Advanced Visualizations and Storytelling: Advanced analytics platforms may leverage cutting-edge techniques like data storytelling, augmented reality, and virtual reality to deliver immersive and engaging visualizations. By presenting data in a more intuitive and interactive manner, users can easily grasp complex concepts and communicate insights effectively.

Are you ready to level up your data analysts? Check out our free e-book: Intelligent Exploration for Data Analysts – How Advanced Analytics Tools Turn Data Analysts into Strategic Heroes.

 

It’s a new year and with that comes a clean slate and the best of intentions to get things right! If you have “Build a Successful AI Program” at the top of your 2023 resolution list, then here are 5 things that you need to embrace.

1. Rediscover the Lost Art of Data Exploration–This Time with AI

Many AI use cases will fail from a lack of appropriate data exploration early in the process. The old tools in use for basic analytics, while still useful, are not getting the job done for exploring the vast datasets that are relevant for AI use cases. And considering that AI is intended to automate decision-making and action-taking at scale, making sure that you’re pursuing the right use case and using the right data sets is critical.

In 2023, don’t try to validate hypotheses using basic BI to justify your next AI use case. Instead, leverage AI algorithms to explore business challenges with an open mind. This will bring more options, reduce bias, and ensure that you’re following up on high-impact possibilities. To learn more, read What is Intelligent Exploration?

2. Finally(!) Start Using Modern Data Visualizations

Fully interactive multidimensional visualizations are the next standard in advanced data analytics. They are game-changers both for front-end exploration of data and for illustrating the findings and implications of your AI algorithms. Analysts and data scientists who try to show complex data relationships in traditional business intelligence visualizations will fail to communicate them clearly; they won’t get informed buy-in, and they could actually lead organizations down the wrong path.

AI-generated true 3D visualizations make the relationships between multiple data points consumable. Instead of a rabbit hole of comparing myriad pie charts, histograms, and scatter plots, true 3D illustrates the connections concisely. We’re not talking about forced-perspective 3D, either. We’re talking about visualizations that can be manipulated like an object in space: rotated by the observer and zoomed in on a particular point. We’re talking about native 3D visualizations. To create better AI, you need more data; to find the insights in more data, you need more robust visualizations.

3. All Things Network Graph

Gartner analyst Rita Sallam said the research and advisory firm forecasts that 80% of data and analytics innovations will be made using graph technology by 2025, and that the market is expected to grow by 28%. But most heads of Analytics have only a passing familiarity with network graphs and how they can be used at the enterprise level. It’s no surprise, really. The tech stack and methodology have been complex, meaning that most data professionals just don’t have any experience with them.

So for 2023, your goal should be to learn more about what network graphs can do, because they’re more accessible than ever before with AI-powered Network Extractors. Now teams are able to create persona profiles of high-churn customers, spot the weak spots in a supply chain, and analyze any highly connected datasets. Read more in our Beginner’s Guide to Network Analysis.

4. Find Beauty in Simplicity

We were all blown away by the capabilities of GPT-3 (an advanced AI language model that powers search, conversation, and text-completion applications) and perhaps dream of unveiling something that vast and ambitious ourselves. But for 2023, let’s take a moment to appreciate the beauty of a simple solution. Just because AI is the mission doesn’t mean that AI is always the answer. With just over half of AI projects getting to production, it’s pretty clear that we could be doing a better job vetting the use case list.

The ability to spot the best, most effective path forward is where robust exploration in the preliminary project phases really pays off. Once you’ve surfaced your potential use cases, look at possible solutions from all angles, and include business stakeholders who will have a different perspective. When considering feasibility, don’t forget to count technical costs and change management costs alongside everything else. And if you come to the conclusion that the best solution is not an AI model? That’s a win, proof of the team’s value, and a sign of a thoughtful assessment process. Read more about Finding the Right AI Use Case.

5. Responsible AI Means Ownership of the Model

Much is said about responsible AI–AI that is transparent in its predictions and is thoughtfully developed to account for bias and downstream impact. But nothing can be truly responsible without accountability. Frustratingly, once deployed, AI models seem to become orphaned, with no one taking ownership. If something goes wrong in production, where does the buck stop? And if no one is responsible once the model is applied, who feels the responsibility during the development process to make sure that the model itself is responsible?

Data and Analytics leaders should consider clearly defining roles and responsibilities for all AI projects for the business. These responsibilities should include providing the business with clear information about what data is being used, how it is being used, and the impact the model had during testing, in a format that business stakeholders can understand. It is the responsibility of the business stakeholders to consume this information and ensure that they understand it. A plan for remediation should be in place should the model drift or not perform as expected, and someone should be named to monitor the model. Furthermore, users should be empowered to understand the model, and to recognize when they should challenge the model’s results.

Are you ready to make Intelligent Exploration, 3D visualizations, and AI part of your 2023 game plan? Contact us for a personalized demonstration.

 

Multitasking? Great! You can listen to this blog post by clicking above or find our podcast, Intelligent Data Exploration, on major podcast platforms.

Gartner reports that just over half of all AI projects make it into production. And of those that do, many will go unused, either because the business doesn’t trust the models, or because they’re just not solving the right problem.

So how can you find the AI use case that will go the distance and kick off a project that will have an impact? It all starts with a good foundation.

Don’t skimp on data exploration

You can’t fix a problem you don’t understand. And you definitely can’t fix a problem if you don’t even know it exists. Data exploration is the only way to make sense of all the moving parts in your business and it’s a critical first step in any potential AI project. You need to discover where the real challenges lie if you’re to have an impact on the business. And you need to understand what is driving those challenges in order to target them effectively.

Exploration is often guided by a hypothesis–we think the challenge could be X so let’s explore to see if that’s true or not. The challenge with exploring this way is that it really limits the scope of what’s explored and has a high risk of introducing confirmation bias. Hypotheses can lead data scientists and analysts down the wrong path and away from the most meaningful discoveries. And if you’re not solving the right problem then your project is based on a false premise and you won’t be able to find and implement a solution that works.

Exploration should focus on the business challenge. Sales are down? Explore as many attributes relating to sales as possible. Want to try generating personnel schedules that anticipate demand? Explore all the attributes that impact staffing. The key is to look for all the relationships and drivers that exist, with an open mind, so that nothing gets overlooked. 

If this feels daunting, don’t worry. This is where the use of AI to explore data—we call it Intelligent Exploration—can help. Traditional BI tools aren’t designed to support the breadth and depth of exploration that good AI demands as a foundation. But when you leverage Intelligent Exploration, you can start to surface the most impactful opportunities for AI.

Identify your opportunities

Very few business challenges are straightforward so your exploration will probably identify a few challenges and, for each of those challenges, a few contributing factors. The next step is to refine your results to identify which challenges are significant and worth pursuing.

Start by consulting your business stakeholders and reviewing the results of your exploration. Their knowledge of the business can help inform your understanding of your findings and together you can translate your findings into potential use cases. Involving the business stakeholders early also helps to ensure they understand how AI projects develop and what data will ultimately be used to drive the model.

Making sure that your business partners really understand the insight you’ve uncovered, particularly if it involves multiple, interconnected relationships, can be challenging but it is possible:

  1. Structure and present your discovered insight with a storytelling narrative.
    • Set the scene (the high-level challenge you were exploring)
    • Point out the early areas of insight that were surfaced by the AI and caught your attention, then what you chose to look at next as a result
    • Describe the big ‘ah-ha’ insights that came into focus as potential use cases
  2. Use plain language wherever possible. It’s so easy to slip into the jargon we use every day, but unfamiliar terms can be distracting and make stakeholders feel like what you’re showing is beyond their comprehension.
  3. Be very thoughtful about which visualizations you use to illustrate your findings, particularly if you need to highlight the interplay between attributes. 
    For example, if you want to show the interplay between 3 attributes, showing three separate 2×2 plots will not get that information across in a consumable fashion, but a single true 3D visualization will.
  4. If your toolset allows it, leave the analysis with the business leaders to consume and explore on their own. Ideally, you’ll be able to annotate it, calling attention to the areas of interest. Doing this provides the team more time to digest what you’ve told them, and to “kick the tires” of the results. 

Refine use cases and create a ranked list

Together with your business stakeholders, plot the use cases by feasibility and business value. Create and prioritize the list of possible problems to solve so that the most important challenges are addressed first.

When determining feasibility consider the following:

  1. Should the use case be tackled with an AI model or could it be solved some other way? Just because you have a hammer, doesn’t mean that everything is a nail. Other solutions could be a process change, a system modification, or better or more timely analytics.
  2. Determine the cost or impact of not addressing the problem. This will help determine impact, mobilize the business to embrace the solution, and provide you with key metrics to measure success down the road.
  3. Can you put this model into production, support it, and foster the organizational change required to leverage it? A complex model that can be seamlessly integrated into a current workflow may be more feasible than a simple model that requires a big change to established processes.
  4. Fairly assess your data sources. If the data you have is weak, it won’t matter how good your report or AI model is; it won’t be usable. In fact, it could do harm. 
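As a rough sketch of how the feasibility-and-value prioritization described above might look in practice, here is a minimal scoring pass in Python. The use cases, scores, and the simple product-based scoring are all illustrative assumptions, not output from any real assessment:

```python
# Illustrative sketch: rank candidate AI use cases by feasibility x business value.
# Every use case and score below is a hypothetical example.

use_cases = [
    {"name": "Churn-risk model",        "feasibility": 4, "value": 5},
    {"name": "Supply-chain rerouting",  "feasibility": 2, "value": 4},
    {"name": "Demand-based scheduling", "feasibility": 5, "value": 3},
]

def priority(uc):
    # Simple product score; a real assessment would also weigh data quality,
    # change-management cost, and production support, as discussed above.
    return uc["feasibility"] * uc["value"]

# Highest-priority use cases first.
ranked = sorted(use_cases, key=priority, reverse=True)
for uc in ranked:
    print(f'{uc["name"]}: score {priority(uc)}')
```

Even a toy matrix like this keeps the prioritization conversation with business stakeholders concrete: everyone can see why one use case outranks another and argue about the scores rather than about vague impressions.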

Select the Right Use Case

Armed with your carefully plotted use cases, the result of thorough exploration and consultation with the business, you, your team, and your business leadership will be able to select the right AI use case to move forward with.

To learn more about how to successfully create the next-generation AI strategy that leadership is asking for, download our eBook “Building a sustainable AI strategy from the ground up”.

We’ve been exploring data the same way for so long that we’ve stopped recognizing how it’s holding us–and our businesses–back. It’s time to break free from outdated tools and the shortcuts they forced us to take and start doing things differently.

The Restrictions

With so much data at our disposal, we should be able to investigate problems from every angle, but we don’t. Every analysis starts with a hypothesis and that hypothesis is used to narrow the scope of the exploration. From the data points we include in our data set, to the questions we ask of the data, to the conclusions that we spot, the hypothesis is always in the driver’s seat. 

Given the limits of the analytic tools we had, it’s no surprise that analysts and data scientists have taken to using informed hypotheses to limit the scope of data exploration. But hypothesis-driven exploration has always injected some risk and, as AI has amplified the potential uses of data, these risks have been amplified as well.

This blog is based on a presentation we delivered at the Gartner Data and Analytics Conference. Watch it here!

Hypothesis-Driven Exploration

So what is hypothesis-driven exploration? It’s when the data is explored with a particular hypothesis in mind:

1.  Observe the problem (or opportunity): Some orders keep arriving too late!
2.  Formulate a hypothesis: We have an issue with our supply chain.
3.  Gather only the data relevant to that hypothesis: Explore the supply chain data.
4.  Explore the data with that hypothesis in mind: Where is the weak link in our supply chain?
5.  Move forward with a project based on your results: Diversify the supply chain under certain circumstances. But are we targeting the right problem? Will this resolve the issue of late orders?

Given the time and effort data exploration takes using traditional BI tools, it’s not a surprise that data scientists and analysts have leaned on hypotheses to keep project scope in check. It’s also just human nature–you have to focus your attention somewhere and it’s hard to set aside your theories to explore with an open mind. Even data scientists who develop AI algorithms to explore data will be directing their exploration using hypotheses to some degree or another. 

But there are real risks to allowing a hypothesis to direct exploration:

  • Missed Opportunities
    You could be looking in the entirely wrong direction–what the problem is or how you should go about solving it.
  • Underwhelming AI or Meaningless Insights
    Your exploration may yield some insight, but if it’s not insight about the real issue then any action you take will have weak or non-existent results.
  • Overlooked Risks
    You may have left out data that pointed to big issues, or that could have led you to draw completely different conclusions. That means the real problems are left unchecked.
  • Biased AI
    Conversely, limiting exploration could allow some data sources to have an outsized impact on the results.

Exploration should form the foundation for all of your data-driven initiatives but when it’s being done on the narrow premise of a hypothesis, everything built atop it is at risk. And when you’re planning to build AI that will automate business decisions across the enterprise, the repercussions of getting it wrong are just too great. There is a real cost to leaving insight on the table.

What Does Exploration with Today’s Tools Look Like?

The renowned Pew Research Center conducts a lot of great surveys and kindly makes not only its findings but its data available to the public. We thought it would be interesting to look at their 2021 Social Media Use survey data and conclusions to help us illustrate the benefits of Intelligent Exploration.

Let’s start by looking at all of the data points that Pew gathered–it’s a lot!

Gender
Age
Marital status
Employment
Income
State
Race
Party
Home internet
Home internet kind
Do you want high-speed at home?
Does disability impact you?
Parent of under 18
Education level
Current cable TV
Books read
Printed books
Audiobooks
eBooks
Internet user
Smart device?
Internet on device
Internet frequency
Social media use
Twitter user
Instagram user
Facebook user
Snapchat user
YouTube user
WhatsApp user
Pinterest user
LinkedIn user
Reddit user
TikTok user
NextDoor user
Twitter frequency
Instagram frequency
Facebook frequency
Snapchat frequency
YouTube frequency

Below are some of the key results that the Pew center found:

Facebook remains the most popular and most visited site. Snapchat has the highest age spread between users.

These are interesting findings, but we’re not seeing any interactions between the dimensions. There’s no sense of who these respondents are or how the use of one platform relates to another. If you were an advertiser looking to leverage these platforms, your knowledge of the average user is pretty light.

And given the amount of data that was collected, the analysis is very simple: 2-dimensional pivot tables. There’s also no way to tell whether these differences are real (statistically significant) or just the result of sampling bias. The BI tools used for dashboards and pivot-table analysis are not well suited for multidimensional exploratory data analysis.
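To make that significance question concrete, here is a minimal two-proportion z-test in plain Python. The respondent counts are invented stand-ins for illustration, not Pew’s actual figures:

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Z statistic for the difference between two sample proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical counts: 72% of 500 younger respondents vs. 65% of 480 older
# respondents report using a given platform.
z = two_proportion_z(360, 500, 312, 480)

# |z| > 1.96 corresponds to p < 0.05 for a two-sided test.
print(f"z = {z:.2f}, significant at p < 0.05: {abs(z) > 1.96}")
```

A check like this is what separates a real difference between groups from noise in the sample, and it is exactly the kind of question a flat pivot table leaves unanswered.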

Breaking Limits with Intelligent Exploration

Intelligent Exploration is the practice of using AI to explore and understand data. Many data scientists focus on throwing a lot of data at a problem and applying supervised ML techniques to sort it out. While this feels efficient, the number of failed AI projects–a number that has only inched up 2% in two years–would say that it’s not. But doing good, thoughtful exploratory analysis with the aid of AI purpose-built to do just that will de-risk projects and lead to more trusted outcomes.

Intelligent Exploration creates a more complete picture and facilitates better decision-making because:

  • It can comb through complex datasets so that there’s no need to impose our own thoughts on what data should be included or not.
  • Intelligent Exploration can look not just at more dimensions, but at the many possible relationships between those dimensions.
  • It cuts through the noise of complicated data and pulls out the significant insight, so data science teams know where to focus instead of burning cycles trying to find the ‘Aha!’ insight.
  • AI has no preconceived ideas about what trends or relationships it will find, keeping human bias out of the exploration. This means that Intelligent Exploration may find relationships that people won’t even think to look for.
  • Data quality is the single most important factor in an AI model. Every model that has gone off the rails has failed because it relied on either the wrong data or weak data. Intelligent Exploration is ideal for assessing data quality effectively and efficiently.

Let’s explore the Pew Social Media Survey data using Virtualitics’ Intelligent Exploration capabilities. We listed all of the dimensions collected earlier in this article, but here’s Pew’s analysis of their demographic data in a 2-dimensional heatmap:

We asked the Virtualitics AI to look at all of the data collected (and there were 40 attributes!) and create communities of the respondents that could be visualized in a network graph. The AI-powered Network Extractor was able to sift through them all and group respondents together based on their similarities and differences. The end result is nine distinct communities, but it’s what defines them, and their relationships to the other communities, that’s really interesting.

This network graph tells us so much more about the respondents:

  • That even though they were significantly underrepresented in the survey data, young people make up the largest community, suggesting that their responses are very similar. (How do we know that they’re underrepresented? We did a quick analysis of the basic demographics before we got started.)
  • The Young group is characterized by the breadth and frequency of platform use. They’re most likely to use Instagram, Snapchat, Reddit, TikTok, LinkedIn, and Twitter, and to use them all quite frequently.
  • There are three groups at the top: Doesn’t Use Internet (green), Low Internet Use (peach), and Low Income (light blue). These groups are all characterized by low or no internet use. But while the Doesn’t Use Internet and Low Internet Use groups are quite tightly grouped, the Low Income group is spread out and starts to reach down toward the Young group below, telling us that its members are a) young, and b) have more in common with that group than with the other two low-use groups. It’s quite likely that as the incomes of individuals in this group go up, they’ll reflect more of the patterns of the Young group.
  • There is a group for Readers–people who identify as regular readers. While it may be tempting to assume that readers are not internet users, the group’s positioning near the other communities characterized by internet use suggests that they are still regular internet users.
  • Only two groups are defined by their use of a specific platform: WhatsApp and NextDoor, respectively.

Network graphs are visualizations that capture the connections and relationships in data. Those relationships are shown using nodes and edges, where nodes represent what is being analyzed and edges show how those nodes are connected. Similarity between nodes is denoted by the nodes’ proximity to one another; the closer they are, the more similar they are. 

Groups of nodes that are distinctly similar are called communities. Some communities are highly similar, which you can determine visually because they’re tightly bunched together, while other communities have more variety between members and are placed further apart. You can learn more about network graphs and how they can be used in this blog or by downloading our eBook.
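To make the node, edge, and community ideas concrete, here is a minimal, stdlib-only sketch: respondents become nodes, an edge connects respondents with similar platform use, and connected groups of similar nodes are treated as communities. The respondents, platform sets, and similarity threshold are all invented for illustration; a real network extractor is far more sophisticated:

```python
from itertools import combinations

# Hypothetical respondents and the platforms they report using.
respondents = {
    "r1": {"Instagram", "TikTok", "Snapchat"},
    "r2": {"Instagram", "TikTok"},
    "r3": {"Facebook", "WhatsApp"},
    "r4": {"Facebook", "WhatsApp", "Nextdoor"},
    "r5": {"Pinterest"},
}

def jaccard(a, b):
    """Similarity of two sets: shared items over total distinct items."""
    return len(a & b) / len(a | b)

# Nodes are respondents; an edge connects two nodes whose platform sets
# are sufficiently similar (the 0.5 threshold is an arbitrary choice).
edges = [(u, v) for u, v in combinations(respondents, 2)
         if jaccard(respondents[u], respondents[v]) >= 0.5]

def communities(nodes, edges):
    """Group connected similar nodes via union-find, largest group first."""
    parent = {n: n for n in nodes}
    def find(n):
        while parent[n] != n:
            parent[n] = parent[parent[n]]
            n = parent[n]
        return n
    for u, v in edges:
        parent[find(u)] = find(v)
    groups = {}
    for n in nodes:
        groups.setdefault(find(n), set()).add(n)
    return sorted(groups.values(), key=len, reverse=True)

print(communities(respondents, edges))
```

In this toy graph the two Instagram/TikTok users form one community, the two Facebook/WhatsApp users form another, and the lone Pinterest user sits apart, which is the same "proximity means similarity" intuition that the full network graph above conveys at scale.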

Next, we got a little more targeted and used Intelligent Exploration to determine what drove usage for each platform. These are the results for Pinterest. While it’s probably no surprise that women were the biggest users of Pinterest, when Virtualitics Intelligent Insight was used to call out the statistically significant insight, it noted that it’s not just women–it’s married women who work full-time who were most likely to use Pinterest. If you, like me, associated Pinterest with wedding planning and crafts requiring far more time than anyone could possibly devote to them, this finding was surprising.

The Missing Piece

Intelligent Exploration makes it so much easier to work with complex data sets by cutting through the noise to the insight of interest, but those findings are even more usable when they’re presented in 3D.

Research has shown that information presented in 3D gives consumers a 23% boost in understanding. When you’re trying to work with business stakeholders and get everyone on board with an AI solution that rests on the interplay between a number of different data points, that understanding boost is critical. You need your business and your stakeholders to understand, provide useful context, and ultimately buy in to your initiative.

To build AI that works–or at least won’t break anything–you need to understand the interplay between dimensions. That interplay is much clearer when you can pivot the visualization. This image here is of the three basic demographic dimensions of the Pew survey data–age, income, and employment status. It’s clear which groups are over- and under-represented as we rotate the visualization.

Finally, when we limit our visualizations to 2D, we either limit our analysis to 2×2 plots, or we attempt to broaden the analysis but flatten the visual, as with most common network visualization tools. But flattening something that’s supposed to communicate relationships by using proximity, like a network graph, actually distorts the information contained within. Only 3D allows for accurate illustration.

Plus, 3D is just way more fun to work with.

How Will Intelligent Exploration Transform Your Ability To Execute?

It’s time to change how we interrogate data. Our world is interconnected and we can’t afford to narrow the scope of our exploration. Intelligent Exploration–the use of AI to explore, visualize, and mine data for insight–must be the start of any data-driven initiative.

Intelligent Exploration flips the old hypothesis-driven exploration process entirely:

1.  Observe the problem (or opportunity): Some orders keep arriving too late!
2.  Gather all of the relevant data: Gather inventory, order, and supply chain data.
3.  Use Intelligent Exploration to uncover drivers behind late orders: The biggest driver of late orders is on the order management side, and changing suppliers would make no difference at all.
4.  Pinpoint the source and the change needed: Orders for certain parts need to be placed 4 weeks in advance to arrive on time, so inventory levels need to be flagged earlier.
5.  Move forward with a project based on your unbiased results: Develop an algorithm that tracks equipment usage and flags the projected need for the parts 4 weeks out, so that warehouse staff can order in advance to have them on hand, avoiding downtime.

Intelligent Exploration keeps your data science team focused on the right problems, and prevents issues early on instead of when an AI app is lurching towards deployment after months of investment. With a solid data-based foundation, your business can amplify the impact of your AI programs–and your business analysts. Start strong with a data-proven baseline and you will be leading the pack in successful AI.