We live in a world swimming in data. Each day on Earth we generate 500 million tweets, 294 billion emails, 4 million gigabytes of Facebook data, 65 billion WhatsApp messages and 720,000 hours of new content on YouTube.
Software solutions have been solving business problems related to large amounts of data for years. As time has gone on, the amount and sources of data available to better solve these problems have grown exponentially. In 2018, the total amount of data created, captured, copied, and consumed in the world was 33 zettabytes (ZB) – the equivalent of 33 trillion gigabytes. This grew to 59 ZB in 2020 and is predicted to reach a mind-boggling 175 ZB by 2025. One zettabyte is a trillion gigabytes – 8,000,000,000,000,000,000,000 bits.
Business Intelligence (BI) is the traditional way a software solution collects, processes, and displays the data in meaningful and efficient ways for the user. It can encompass many levels of data analysis capabilities, but the most common capability is reporting. Reporting, however, puts the onus on the user to interpret the data, as well as intuit insights and decisions. In the early stages of a solution’s lifetime, this is typically sufficient.
But what happens when the datasets become too large and more and more sources of data start to enter the equation? The burden on the user grows quickly. The typical response is to add better filters to shrink the relevant data for analysis and/or create rules to get the system to flag anomalous or interesting data. While this is often a sufficient next step, the user is likely to become frustrated with the amount of data they need to pore through to extract actionable insights.
As datasets grew, software evolved, and the possibilities expanded, it became necessary to make the system work for its users. That raised a series of questions, primarily: where can manual analyses be automated to find trends, and what can those trends tell us about the future?
More importantly, as new data is collected, can the system “learn” from potential changes in trends, make new predictions, and can it learn from the successes and failures of its previous predictions? Are there new areas of insights to be discovered as part of this process, even in areas outside of the original business problems being solved by the platform?
Data Science vs. Machine Learning vs. Artificial Intelligence
Over the years, the software industry has introduced concepts like data science, machine learning, and artificial intelligence (AI) as the path to solving some of these problems. The goal isn’t necessarily to remove the human element, but to leverage technology and mathematics to handle large amounts of data and provide conclusions, and ultimately forecasts, that would be very difficult to come up with manually. This strategy allows the user to advance to analyzing the generated insights instead of the source data. They may still refer to the source data as part of this analysis, but that’s not the main activity any longer.
The human brain is excellent at seeing patterns in disparate data, as well as using past experiences and knowledge to make decisions. However, it has a limited capacity to put large amounts of data together to that end. It can also be biased and sometimes it can be wrong. That’s where data science comes into play.
Data science is the general term for the analysis and modeling of large amounts of data, allowing users to extract meaning out of the data. A data scientist will apply statistics-based mathematics and visualizations on the data to that end. Data science is the tool on which AI is built.
AI combines the aspiration of pattern recognition and knowledge-based decisions with immense computing power to process large datasets. The more data there is in breadth and depth, the more possibilities exist not only to solve the initial problem with an increasing level of accuracy, but also to recognize other useful patterns.
AI is a very broad umbrella, however. Terms for disciplines within AI are often used interchangeably and sometimes cause confusion. The most common implementation of AI is machine learning, which uses data science methodologies to teach computers how to “think” and make decisions about specific problems.
At the basic level, it means giving the platform enough data to identify clusters, see patterns, and make connections. When new data is introduced, it can identify it because it has learned from data already observed. Once the new data is incorporated, the patterns are updated to help refine the connections.
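The learn-identify-refine loop described above can be sketched in miniature. In this hypothetical example (the cluster names, data points, and helper functions are illustrative, not any vendor's actual model), previously observed data forms clusters, a new observation is identified by its nearest cluster, and the cluster is then updated to incorporate it:

```python
# Minimal sketch of the learn/identify/refine loop. Real systems use
# libraries such as scikit-learn and far richer features; this only
# illustrates the idea.

def centroid(points):
    """Mean position of a cluster's points."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def nearest_cluster(clusters, point):
    """Name of the cluster whose centroid is closest to the point."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(clusters, key=lambda name: dist2(centroid(clusters[name]), point))

# Patterns "learned" from previously observed data (hypothetical).
clusters = {
    "low_spend":  [(1.0, 2.0), (1.5, 1.8), (0.9, 2.2)],
    "high_spend": [(8.0, 9.0), (8.5, 9.5), (7.8, 8.7)],
}

# A new observation is identified from what was already learned...
new_point = (8.2, 9.1)
label = nearest_cluster(clusters, new_point)
print(label)  # prints "high_spend"

# ...then incorporated, refining the pattern for future decisions.
clusters[label].append(new_point)
```

Each new data point nudges its cluster's center, which is the "refinement" step: the next observation is judged against patterns that already include everything seen so far.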
The Role of the User
The user plays a key role in data development and accuracy. With the system recommending outcomes, the user can spend their time evaluating the predicted outcomes instead of the raw data. Their responsibility is to be a partner in continually teaching and refining the system.
In solutions where the user can participate in training the system, the more input, the better the partnership. In other situations where the solution is trained by the vendor, providing feedback about the accuracy of predictions is essential to keeping the insights relevant.
How much can the user trust the results? Even the most advanced AI systems are still only as good as their programming and input data. IBM’s Watson performed brilliantly on Jeopardy!, but it had occasional big misses. Most AI systems aren’t nearly as advanced as Watson, but a good feedback loop with the software vendor helps keep the results accurate.
AI in the Hospitality Industry
How is the hospitality industry using AI now? There are many applications, but two areas of note are revenue management and the group market.
Revenue management has used data science and machine learning for many years to forecast transient business, allowing its users to make informed decisions about room pricing, market segmentation, overbooking, marketing offers, day-to-day staffing, and more.
Historically, this was based on past stay patterns, future booking paces, and broader segment patterns. As the tools have evolved, they have sought out other data inputs to help refine the forecasts (e.g., market reports and competitive set analysis).
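One classic technique behind forecasts built on booking pace is the additive "pickup" method: final demand is estimated as rooms already on the books plus the average additional pickup seen at the same lead time before comparable past stay dates. A minimal sketch, with hypothetical numbers (the function and figures are illustrative, not any vendor's actual model):

```python
# Hedged sketch of an additive "pickup" forecast. All numbers are
# hypothetical; production revenue-management systems blend many more
# signals (segments, pace curves, market data) than this.

def pickup_forecast(on_books_now, history):
    """history: list of (rooms_on_books_at_same_lead_time, final_rooms_sold)
    for comparable past stay dates."""
    avg_pickup = sum(final - early for early, final in history) / len(history)
    return on_books_now + avg_pickup

# Three comparable past stay dates: rooms on books at this lead time
# vs. rooms finally sold.
history = [(60, 95), (55, 90), (70, 100)]

print(pickup_forecast(65, history))  # 65 + avg pickup of ~33.3 ≈ 98.3
```

The appeal of the approach is that each realized stay date feeds back into the history, so the forecast refines itself as new data arrives, which is exactly the learning loop described earlier.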
Another emerging area of predictive analysis in hospitality is the group market. Particularly for hotels with a large group presence, it is important for hoteliers to understand how much group business is coming – and whether the groups asking to use the hotel and event space are the right ones for the property. Solutions using AI can help determine whether there are opportunities that better match your event criteria and identify ways to help your teams find them.
The right data can be a powerful tool in the hands of a salesperson. Bringing together actionable meeting intelligence can help sales teams better identify the right prospects that are meeting today.
Even more importantly, AI can help identify future group prospects by understanding the nuances of groups that may have met in the past. In this case, historical data is a prime source, as it reveals trends that are likely to return in the next 12-24 months.
Equipping Your Sales Team with Actionable, Meaningful Insights
Group meetings may seem stalled at the moment, but the reality is demand is returning. Even pre-COVID, meetings with more than 500 attendees accounted for only 5% of total meetings worldwide. The bread and butter for the industry was meetings with fewer than 100 attendees (more than 60% of all meetings). These meetings are returning to the marketplace, and as vaccination levels grow and markets redefine meeting restrictions, these “smaller” meetings are likely to surge in the last quarter of 2021 and into 2022.
Hoteliers must prepare for new possibilities and a shift in how they identify and sell group business to capitalize on this renaissance. The time is now for your sales teams to leverage data science to conduct targeted outreach.
While virtual and hybrid meetings will continue for a while, the online meetings of the past 18 months have lost their uniqueness and, with that, their luster. People are weary of online communication, and many feel a driving need to re-connect with colleagues and friends and to participate in like-minded group activities.
At the end of the day, data empowers sales calls by helping inform the talk track. The right insights can help your group sales team better assess fit and probability to book while increasing the group sales pipeline. Quality data, obtained quickly and easily, can make a significant difference to your bottom line.
Making Data Work for You
To stay on top of today’s influx of data, you need powerful data tools and software solutions to develop actionable insights. The power of data comes from the knowledge it gives us to make better decisions and improve efficiency. It is the key to managing and interpreting data to make better decisions in distribution, revenue management, group sales, operational efficiency, and more. For our industry to thrive, it is necessary to unlock the value of data to achieve new levels of profitability. The time is now.