Absolutely. Here is a notebook-first, fresh-workspace, self-contained Databricks AI/ML tutorial built around serverless notebook compute. It is based on the current Databricks docs as of late March 2026, covering serverless notebooks, the official ML quickstart, the Unity Catalog model lifecycle, Model Serving, AI Playground, and the current retrieval-agent tutorial. This fits your setup because serverless notebooks are the right environment for Python, MLflow, training, and experiments, and sample data is already available in Databricks through the samples catalog and `/databricks-datasets`. ([Databricks Documentation][1])
This lab lets a student run the full flow in one workspace: create a notebook, load built-in sample data, write Python, train a model, track experiments with MLflow, register the model in Unity Catalog, and deploy it with Mosaic AI Model Serving. From there it moves into the GenAI side with AI Playground and a Databricks-provided retrieval-agent notebook.