
Lighthouse's AI game plan: Leveraging Google's cutting-edge technology with internal expertise


In this post I intend to explain how we approach building AI solutions at Lighthouse. But first, it's important to align on some definitions.

What do we actually mean by AI?

Let’s start by defining what I mean by AI in this post. Since I’m an engineer, I’ll stick to clear definitions. I’m going to talk not only about the new wave of Generative AI, but about a broader range of technologies used to enable computers to simulate human intelligence and problem-solving capabilities.

To make it less vague, I will split the types of solutions we’ve worked on at Lighthouse into three categories:

Predictive AI

  • Forecasts future outcomes

  • Works best with structured, (mostly) numerical data

  • The “black box” can be unpacked and thus understood (to some extent)

  • Data-hungry, with mostly numerical outputs

  • Examples: occupancy prediction

Generative AI

  • Creates content (e.g., images, videos, music, or text)

  • Works best with unstructured, ideally text or image data

  • Fully “black box” (at least today, it’s basically impossible to explain why the output is what it is)

  • Produces creative outputs that mimic human-like patterns

  • Examples: summarizing a long article

Automation

  • Automates well-defined human processes

  • Doesn’t require data, but needs a detailed process description (and programming)

  • Fully understandable and easy to adjust

  • Performs exactly as programmed

  • Examples: automating the gathering of multiple data sources into one Excel file

While automation is sometimes referred to as AI, it isn’t AI from a technical perspective, as it doesn’t "learn" from data patterns. Today, I’ll focus on the first two types of solutions and how we approach building them at Lighthouse.
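To make the distinction concrete, here is a deliberately toy sketch (not one of Lighthouse’s actual models): the "predictive" part learns a trend from structured, numerical data via ordinary least squares, while the "automation" part just executes a fixed, fully predictable process.

```python
# Toy illustration of predictive AI vs. automation.
# This is a minimal sketch, NOT one of Lighthouse's production models.

def fit_trend(history: list[float]) -> tuple[float, float]:
    """Predictive AI (tiny version): learn a slope and intercept from
    data with ordinary least squares, so behaviour depends on the data."""
    n = len(history)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    slope = (
        sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history))
        / sum((x - mean_x) ** 2 for x in xs)
    )
    return slope, mean_y - slope * mean_x

def forecast_occupancy(history: list[float], days_ahead: int) -> float:
    """Extrapolate the learned trend a few days into the future."""
    slope, intercept = fit_trend(history)
    return slope * (len(history) - 1 + days_ahead) + intercept

def merge_reports(sources: list[dict]) -> dict:
    """Automation: no learning, just a well-defined process
    (here, gathering multiple data sources into one report)."""
    merged: dict = {}
    for source in sources:
        merged.update(source)
    return merged

history = [0.60, 0.62, 0.64, 0.66]                # structured, numerical data
print(round(forecast_occupancy(history, 2), 2))   # learned trend, extrapolated
print(merge_reports([{"a": 1}, {"b": 2}]))        # always the same, by design
```

The point of the contrast: change the training data and the forecast changes with it, while the automation step performs exactly as programmed no matter what data it sees.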

AI: buy or build? Or maybe both?

Should you have an AI / Data Science team in your company? Should you outsource this kind of work completely? Should you build the infrastructure to support AI development in-house? Or pay for an out-of-the-box business solution?

For us, the answer is somewhere in between.

We're not yet large enough to develop "foundation models" or advance groundbreaking research in-house. We also don’t want to build physical computing clusters or data centers, nor do we want to build AI-supporting infrastructure from scratch: these things require massive capital and time investments and are far more convenient to buy.

For these reasons, we use Google Cloud Platform, specifically services like Google Cloud Storage, Spanner, and BigQuery for data storage and processing, and Google Kubernetes Engine for deploying our AI models. This gives us both a scalable, secure way to store and process our data and access to cutting-edge AI technology.

As a data-first company with unique data assets relevant to solving real-world problems in the hospitality industry, we need people to continuously work with that data. Thus, growing our own Data Science department was an easy decision.

What were the most important aspects of building such a team?

Hire the right people

Currently, our department is composed of experts from diverse backgrounds, from economics to theoretical physics, with experience spanning business, academia, and industries like telecommunications, food, finance, and taxi services.

What I value most in building such a team is the breadth and depth of different views and perspectives. As I like to say, we look for a “culture add”, not only a “culture fit”. This diversity allows the team to deliver high-quality solutions by challenging each other's ideas.

Create a culture of learning

We spend a lot of time exchanging ideas, brainstorming, and following news from industry and academia. It is crucial to build an R&D organization where people feel it’s okay to spend some time on learning in order to bring even better solutions to the table.

However, we also realize that AI is a complex and continuously evolving field, making it impossible to keep up while also doing the daily job of researching new product features.

We occasionally partner with Google and external Google-certified AI experts, which gives us access to a wider range of expertise and knowledge than we could reach by relying solely on our internal team. We use these partnerships as a “learning curve booster”, so we can learn things the right way from the get-go.


Embrace innovation, and accept that things can fail

Innovation seems to be an easy sell to the business. It often goes like this:

“Hi there! My team would like to run this super innovative project. It’s such a great idea - if it works it will bring us a lot of revenue.”

However, everyone tends to forget the "if" and expects an innovation project to fit the mold of a standard project - one with a defined timeline and outcome.

But that goes against the nature of data science and, more generally, innovation. Building this type of environment for your team is easier said than done, so here is how we have reinforced this idea at Lighthouse.

Innovation roadmap

Lighthouse has had a Data Science team (initially a small one) almost since the beginning. Over the years, we have researched and developed (Predictive) AI-driven solutions for multiple problems brought to us by our customers.

Let me showcase a few examples where we successfully delivered using predictive AI:

  1. Market Insight Demand - a first-in-class “nowcast” using forward-looking data. Rather than relying only on historical trends, it captures abrupt market changes and tells users what to focus on today.

  2. Smart compset - an AI model that chooses the most relevant competitor set for our customers. It’s used in Rate Insight, Market Insight, and Benchmark Insight.

  3. Occupancy forecast - a prediction model that competes with the best players in the market.

  4. Price recommendations - a recommendation system used in Pricing Manager that takes vast amounts of data into account to advise our customers on the best price point.

Believe me, there were also many research projects that failed to deliver the expected outcome and so never reached customers. This is a natural part of the innovation process.

Then, seemingly out of nowhere, ChatGPT was released in Q4 2022, gaining 1 million users in just 5 days. Suddenly, people were convinced that Generative AI would solve all our problems. The belief was that all white-collar jobs would be automated, and Data Scientists would become obsolete.

Well, unfortunately (or fortunately for me), it didn’t turn out this way. Over the following months, we learned about the limitations of Large Language Models (LLMs): their cost, scalability challenges, and, most famously, accuracy issues caused by (often hilarious) hallucinations.

We learned that GenAI is yet another tool in our AI toolbox: great for working with text and image data, but for other tasks it simply doesn’t work as well as Predictive AI or even good old automation.

And here we get to the main point of this post: how did we actually learn all of this at Lighthouse?

Our innovation roadmap approach

We decided to invest more in experimentation and created a dedicated “Innovation roadmap”. What exactly do I mean by an innovation roadmap, and what kind of projects do we tackle there?

At Lighthouse, we categorize our data science projects into three types:

  1. Projects with predictable scope and outcome - we know exactly what we need, and we know which technology will get us there because we have already worked on similar projects.

  2. Projects with unpredictable scope but predictable outcome - we know something will work, but there is a lot of uncertainty about the solution or data that will get us there.

  3. Projects with unpredictable scope and outcome - we don’t know whether something will work at all, or how long it will take to find out whether we even have the tools to make it work.

It's the last category - projects with unpredictable scope and outcome - that we designate as part of our "Innovation roadmap".

Lighthouse's AI innovation roadmap

Our innovation roadmap has been instrumental in driving bottom-up innovation at Lighthouse. When we have ideas but not yet the knowledge to research them, we occasionally partner with Google-certified AI experts, who provide guidance and help us navigate the complex, ever-changing landscape of AI.

One of the first projects we tackled this way was a proof of concept for “Smart Summaries”. There, we accelerated progress by partnering with Google-certified AI experts to learn how best to build it. You can read more about that in my previous post, Peek behind the curtain: how we built AI Smart Summaries, our first Generative AI feature.

Key takeaways

  1. Choose the right strategic partner: This allows your company to stay at the forefront of AI advancements while focusing on its core, industry-specific strengths.

  2. Build a strong, diverse team: Diverse backgrounds bring different views and perspectives, leading to better solutions.

  3. Embrace innovation and its risks: Implement strategies that help your business and stakeholders give your team the latitude to fail and learn.

About the author
Joanna Kochel
Director of Data R&D at Lighthouse

Joanna leads Lighthouse's Data R&D teams, creating the algorithms behind the company's AI-driven products. With over a decade of Data Science experience across industries, she has worked on telecommunications, data analytics platforms, and ride-hailing algorithms. She joined Lighthouse five years ago as one of its first Data Scientists, developing algorithms for Market Insight and Pricing Manager before transitioning to a leadership role.
