Highlights of the Canonical AI Roadshow 2023

It’s a wrap – Canonical AI Roadshow 2023 has come to an end. From Brazil to the United Arab Emirates, from Europe to the US, we’ve spent an amazing 10 weeks talking with people all over the world about how to innovate at speed with open source artificial intelligence (AI), and how to make enterprise AI use cases a reality.

Now that our globetrotting is over and the winter break is around the corner, let’s look back at some of the big news we shared and some of the lessons we learned during Canonical AI Roadshow 2023.

Charmed MLflow is here

In September 2023, right at the beginning of the roadshow, we announced the general availability of Charmed MLflow, Canonical’s distribution of the upstream project, as part of our MLOps portfolio. Charmed MLflow can be deployed on a laptop within minutes, facilitating quick experimentation. It is fully tested on Ubuntu and can be used on other operating systems through Canonical’s Multipass or Windows Subsystem for Linux (WSL). It has all the features of the upstream project, as well as additional enterprise-grade capabilities, such as:

  • Simplified deployment and upgrades: Deployment takes less than 5 minutes, and users can upgrade their tools just as seamlessly.
  • Automated security scanning: The bundle is scanned at a regular cadence.
  • Security patching: Charmed MLflow follows Canonical’s process and procedure for security patching. Vulnerabilities are prioritised based on severity, the presence of patches in the upstream project and the risk of exploitation.
  • Maintained images: All Charmed MLflow images are actively maintained.
  • Comprehensive testing: Charmed MLflow is thoroughly tested on multiple platforms, including public cloud, local workstations, on-premises deployments, and various CNCF-compliant Kubernetes distributions.
  • Tools integration: Charmed MLflow is integrated with leading open source tools such as Kubeflow or Spark.
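
To give a flavour of the quick experimentation mentioned above, here is a minimal sketch of logging a run to an MLflow tracking server, such as a local Charmed MLflow deployment. The tracking URI, experiment name and logged values are placeholders for illustration, not part of the roadshow material.

    # Minimal MLflow tracking sketch. The tracking URI is a placeholder for
    # wherever your MLflow server (e.g. a local Charmed MLflow deployment) is exposed.
    import mlflow

    mlflow.set_tracking_uri("http://localhost:5000")  # hypothetical endpoint
    mlflow.set_experiment("roadshow-demo")             # hypothetical experiment name

    with mlflow.start_run():
        # Log hyperparameters and results of a (hypothetical) training run
        mlflow.log_param("learning_rate", 0.01)
        mlflow.log_metric("accuracy", 0.93)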

Charmed Spark is here

Our promise to offer secure open source software goes beyond MLOps. In Dubai at Gitex Global 2023, we also released Charmed Spark, which provides users with everything they need to run Apache Spark on Kubernetes. It is suitable for use in diverse data processing applications including predictive analytics, data warehousing, machine learning data preparation and extract-transform-load (ETL). Canonical Charmed Spark accelerates data engineering across public clouds and private data centres alike and comes with a comprehensive support and security maintenance offering, so teams can work with complete peace of mind.
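
As a flavour of the kind of workload Charmed Spark targets, below is a minimal PySpark ETL sketch. The bucket paths, column names and local session setup are illustrative assumptions only; on a Charmed Spark deployment the job would typically be submitted to the Kubernetes cluster rather than run locally.

    # Minimal PySpark ETL sketch. Paths and column names are illustrative only.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("etl-demo").getOrCreate()

    # Extract: read raw CSV events (hypothetical bucket and schema)
    raw = spark.read.option("header", True).csv("s3a://my-bucket/raw/events.csv")

    # Transform: keep successful events and count them per day
    daily = (
        raw.filter(F.col("status") == "ok")
           .groupBy("event_date")
           .count()
    )

    # Load: write the curated result as Parquet
    daily.write.mode("overwrite").parquet("s3a://my-bucket/curated/daily_counts")

    spark.stop()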

Sustainable AI with open source

While AI is at the forefront of a revolution across industries and in the way we work, it also raises numerous questions about long-term environmental impact. Google revealed that AI accounts for 10-15% of its electricity usage (source), and there is growing concern over the CO2 footprint of these technologies around the globe. From optimising computing power to using more open source software, throughout the Roadshow we learned how organisations are taking steps to build sustainably with these new technologies.

Open source tools help optimise energy consumption by enabling organisations to spend less time training models from scratch or developing the software needed to run AI at scale. In this way, environmental responsibility goes hand in hand with faster project delivery, so organisations have a double incentive to follow a sustainable approach with open source tools, models, datasets and even frameworks.

Responsible AI

As the adoption of artificial intelligence grows within enterprises, so does the need for guidance in the market. Initiatives such as the European Artificial Intelligence Act address this gap and put forward proposals for a more responsible approach to AI. Data security, artifact ownership and practices for sharing the same infrastructure are just some of the topics on which the industry needs clearer answers. The European AI Forum, EY, the Rosenberg Institute and Activision Blizzard are just some of the organisations that addressed responsible AI during World AI Summit 2023, discussing how to build trust in generative AI. Public sector players aren’t shying away either, with organisations such as the Dutch Authority for Digital Infrastructure tackling the topic and proposing a “European Approach to artificial intelligence”.

Run AI at scale

One big challenge that organisations face is moving projects beyond experimentation and into production. Running AI at scale requires new capabilities such as model monitoring, infrastructure monitoring, pipeline automation and model serving. At the same time, the hardware needs to be adjusted so that it remains time-efficient and cost-effective.
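
To make pipeline automation concrete, here is a minimal sketch using the Kubeflow Pipelines SDK (kfp v2), the kind of tooling shipped with Charmed Kubeflow. The component logic, pipeline name and parameter values are placeholders for a real workflow.

    # Minimal pipeline-automation sketch with the Kubeflow Pipelines SDK (kfp v2).
    # Component bodies, names and defaults are placeholders for a real workflow.
    from kfp import compiler, dsl

    @dsl.component
    def preprocess(rows: int) -> int:
        # Placeholder preprocessing step
        return rows * 2

    @dsl.component
    def train(rows: int) -> str:
        # Placeholder training step
        return f"trained on {rows} rows"

    @dsl.pipeline(name="demo-training-pipeline")
    def training_pipeline(rows: int = 1000):
        prep = preprocess(rows=rows)
        train(rows=prep.output)

    if __name__ == "__main__":
        # Compile to a YAML definition that can be uploaded to a Kubeflow instance
        compiler.Compiler().compile(training_pipeline, "training_pipeline.yaml")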

Michael Balint, Senior Manager, Product Architecture at NVIDIA, and Maciej Mazur, Principal AI Field Engineer at Canonical, held a hands-on workshop on building an LLM factory during World AI Summit. They highlighted a cohesive solution that runs on NVIDIA DGX as well as other hardware, built with open source tools and libraries such as NVIDIA NeMo, Charmed Kubeflow and NVIDIA Triton.
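
For model serving specifically, the sketch below shows how a client might query a model hosted on an NVIDIA Triton Inference Server using its Python HTTP client. The server address, model name and tensor names are assumptions for illustration, not details from the workshop.

    # Minimal sketch of querying a model served by NVIDIA Triton Inference Server.
    # The URL, model name and tensor names below are illustrative assumptions.
    import numpy as np
    import tritonclient.http as httpclient

    client = httpclient.InferenceServerClient(url="localhost:8000")

    # Build a single request with one FP32 input tensor
    infer_input = httpclient.InferInput("INPUT0", [1, 4], "FP32")
    infer_input.set_data_from_numpy(np.random.rand(1, 4).astype(np.float32))

    # Run inference against a hypothetical model and read back the output tensor
    result = client.infer(model_name="my_model", inputs=[infer_input])
    print(result.as_numpy("OUTPUT0"))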

A global roadshow

That’s all for the Canonical AI Roadshow 2023. We had a great time discussing the latest trends in generative AI, showcasing how Canonical technology can speed up companies’ AI journeys, and spotlighting our MLOps and Data Fabric solutions. But rest assured, there’s still plenty more to come – both for Canonical and the AI industry at large – so stay tuned for what’s next. 

Further reading

  • Generative AI explained
  • Building a comprehensive toolkit for machine learning
  • ML observability: what, why, how
