AI in 2024 – What does the future hold?

Tags: AI/ML, AIML, Kubeflow, MLOps

2023 was an epic year for artificial intelligence. In a year when industry outpaced academia in machine learning (source), the state of the art came to depend on ever larger amounts of data, and bringing sufficient computing resources to bear on new use cases remained a challenge for many organisations.

With the rise of AI, concerns were not far behind. According to an article published by Stanford, BLOOM’s training run emitted 25 times more carbon than a single air traveller on a one-way trip from New York to San Francisco. 

In light of these trends and challenges, what do we foresee in the AI space this year and where is the AI community focusing its energy? Let’s first look back at 2023 and then explore expectations for AI in 2024.

Rewind of AI in 2023

In 2022, we said that it was the year of AI, but… guess what? 2023 was also the year of AI, and it’s a safe bet that 2024 will follow suit. In the last 12 months, the adoption of AI has grown tremendously. WEKA’s 2023 report found that AI pioneers and explorers alike primarily used the public cloud for both training and inference. Organisations started moving projects into production, which created new challenges and prompted companies to take a closer look at their options for scaling their infrastructure.

Following the announcement from NVIDIA and Microsoft, the arrival of DGX Cloud on the marketplace expanded the options that enterprises have to get started with AI quickly. At the same time, it highlighted the need for hardware and software that are optimised to work together – and certifications such as DGX-Ready Software Solutions emerged to address this need. Charmed Kubeflow is one of the MLOps tools that have been validated on NVIDIA hardware.

Machine learning security is still a concern 

According to a report published by the AI Infrastructure Alliance, more than 40% of organisations have all the resources needed to create value with AI and drive AI transformation. However, companies are also reporting challenges related to security and compliance, performance and cost, and governance.

Securing the tooling used for machine learning projects is crucial. The PyTorch security breach raised even more awareness of the topic and the possible risks. Data science tools often have access to highly sensitive data, so professionals need to ensure that both the environment and its artifacts are secured. Read more about securing your MLOps platform.
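
One concrete, low-effort layer of defence is artifact integrity checking. The sketch below (the file path and expected digest are hypothetical placeholders, not part of any tool discussed in this post) shows how a pipeline can refuse to load a model file whose SHA-256 digest does not match the one published alongside it:

```python
import hashlib
from pathlib import Path

# Hypothetical artifact and digest -- replace with the values published
# alongside the model you actually download.
ARTIFACT_PATH = Path("models/classifier-v3.pt")
EXPECTED_SHA256 = "replace-with-the-published-digest"


def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream the file in chunks so large artifacts don't exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


if __name__ == "__main__":
    actual = sha256_of(ARTIFACT_PATH)
    if actual != EXPECTED_SHA256:
        raise RuntimeError(
            f"Checksum mismatch for {ARTIFACT_PATH}: refusing to load the artifact"
        )
    print("Artifact integrity verified, safe to load.")
```

The same principle extends to the environment itself, for example by installing dependencies only from pinned, verified sources.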

During KubeCon EU 2023, Maciej Mazur and I also addressed this topic and talked about secure MLOps on highly sensitive data. In our keynote, we covered options for securing the environment at different layers of the stack.

Kubeflow in 2023

In 2023, MLOps was an important topic for AI practitioners. Canonical offers one of the official distributions of Kubeflow, so naturally, we kept a close eye on the project. Kubeflow had two new releases, 1.7 and 1.8. Daniela Plascencia, part of the Canonical engineering team, was the release lead for Kubeflow 1.8. 

Towards the end of the year, the Kubeflow Summit took place. With great sessions and working groups, use cases from companies such as Roblox, and challenges brought to the table, the event energised the community. Next year, the Kubeflow Summit will be co-located with KubeCon EU. Buy your ticket now and meet us there.

Canonical MLOps in 2023

2023 was, without a doubt, a busy year for us. Our activity went beyond Charmed Kubeflow: in September 2023 we released Charmed MLflow. We also kept working on our documentation, publishing new guides such as:

While many companies are rethinking their AI strategies, we know how important it is to share our knowledge and help our audience make informed decisions. In 2023, we published more than 50 blogs on our website and our Medium publication, hosted 10 webinars, launched the Ubuntu AI podcast, released 5 whitepapers and went on a world tour to give talks and workshops on different topics. Some of the most successful pieces of content from 2023 were:

Canonical AI Roadshow

In September 2023 we launched the Canonical AI Roadshow, a series of events and presentations that highlighted how enterprises can make better use of their own data and turn AI use cases into reality. With more than 10 stops across 4 continents over the course of 3 months, Canonical experts talked about artificial intelligence, open source MLOps and how to run AI initiatives in production. We held a joint workshop with NVIDIA at the World AI Summit and a joint event with Microsoft in São Paulo.

What’s in store for AI in 2024?

MLOps is here to stay and 2024 is likely to be another extraordinary year for those who are active in the industry. There is no doubt that the percentage of companies moving their AI projects into production will continue to grow. For all players in the sector, the pressure will be on to improve security standards, document better and keep integrating existing hardware and software to offer a seamless experience.

At the same time, 2024 will not only be about AI, but also about open source. This shift did not surprise many people, but it is of great importance. On the one hand, open source gives everyone the ability to get started quickly and at a lower cost; on the other, it enables collaboration between different people, communities and organisations. Given the latest concerns about sustainability, open source also lets organisations build on existing models rather than training them, especially large language models (LLMs), from scratch, reducing the computing power required and therefore the carbon footprint. Leading open source projects such as Kubeflow and MLflow have already started adding features to deliver better results for genAI and LLM-related projects.
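
To make the reuse point concrete, here is a minimal sketch, assuming the openly available Hugging Face transformers library and a small pre-trained open model (both are illustrative choices, not tools covered in this post): rather than training a language model from scratch, a team can load an existing open model and spend compute only on inference or light fine-tuning.

```python
# Illustrative only: the `transformers` library and the model name below are
# assumptions for this sketch, not tools discussed in this post.
# Requires: pip install transformers torch
from transformers import pipeline

# Reuse a small, openly licensed model that someone else already trained,
# instead of spending compute (and carbon) training one from scratch.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

print(classifier("Open source made this example possible."))
```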

It’s a wrap for now…

With 12 months ahead of us, MLOps has plenty of time to surprise everyone in 2024. It is, at the end of the day, a collaborative function that brings together data scientists, DevOps engineers and IT. As the market evolves, new roles will be added to the list. However, everyone should stay focused on an ongoing goal: running secure AI at every stage, from experimentation to production.

New solutions will likely appear on the market, similar to the ones mentioned above, and a growing number of open source projects are likely to be integrated into cohesive solutions. Enterprises will probably have higher expectations of machine learning projects, and thus of the tooling behind them. Cost efficiency and time-effectiveness will become increasingly important discussions, influencing business decisions related to MLOps.

Canonical’s promise to deliver secure open source software will continue to include MLOps tooling. Our goal is to help organisations not only by providing the right technology, but also by offering guidance so they can make the best decisions for their use case, constraints and stage of the AI journey. At the same time, open source is in our DNA, so we will continue to enable AI enthusiasts to use our tools, educate early adopters on how to get started and encourage everyone to contribute.


