Hands-on Tutorial for Model Owners

Overview

In this guide, you will learn how a Model Owner can use MedPerf to take part in a benchmark. It's highly recommended that you follow this guide or this one first to implement your own model MLCube and use it throughout this tutorial. However, this guide provides an already implemented MLCube if you prefer to proceed directly to learning how to interact with MedPerf.

The main tasks of this guide are:

  1. Testing MLCube compatibility with the benchmark.
  2. Submitting the MLCube.
  3. Requesting participation in a benchmark.

It's assumed that you have already set up the general testing environment as explained in the setup guide.

Before You Start

First steps

Running in cloud via Github Codespaces

The easiest way to play with the tutorials is to launch a preinstalled Codespaces cloud environment for MedPerf by clicking this link:

Open in GitHub Codespaces

Running in local environment

To start experimenting with MedPerf through this tutorial on your local machine, follow these quick steps first:

  1. Install MedPerf
  2. Set up MedPerf

Prepare the Local MedPerf Server

For the purpose of the tutorial, you have to initialize a local MedPerf server with a fresh database and then create the necessary entities that you will be interacting with. To do so, run the following (make sure you are in MedPerf's root folder):

cd server
sh reset_db.sh
python seed.py --demo model
cd ..

Download the Necessary files

A script is provided to download all the necessary files so that you can follow the tutorial smoothly. Run the following (make sure you are in MedPerf's root folder):

sh tutorials_scripts/setup_model_tutorial.sh

This will create a workspace folder medperf_tutorial where all necessary files are downloaded. The folder contains the following content:

Toy content description

Model MLCube

medperf_tutorial/model_mobilenetv2/ is a toy Model MLCube. Once you submit your model to the benchmark, all participating Data Owners will be able to run it within the benchmark pipeline. Therefore, your MLCube must support the specific input/output formats defined by the Benchmark Owners.

For the purposes of this tutorial, you will work with a pre-prepared toy benchmark. In a real-world scenario, you should refer to your Benchmark Owner to get the format specifications and details for your particular case.

In real life, all the listed artifacts and files have to be created on your own. However, for the tutorial's sake, you may use this toy data.

Login to the Local MedPerf Server

The local MedPerf server is pre-configured with a dummy local authentication system. Remember that when communicating with the real MedPerf server, you should follow the steps in this guide to log in. For the tutorials, no login step is needed.

You are now ready to start!

1. Test your MLCube Compatibility

Model Owner implements & tests MLCube

Before submitting your MLCube, it is highly recommended that you test its compatibility with the benchmarks of interest to avoid later edits and multiple submissions. Your MLCube should be compatible with the benchmark workflow in two main ways:

  1. It should expect a specific data input structure
  2. Its outputs should follow a particular structure expected by the benchmark's metrics evaluator MLCube

These details should usually be acquired by contacting the Benchmark Committee and following their instructions.
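
As a rough illustration of this contract, a model MLCube's manifest defines an infer task whose inputs and outputs follow the benchmark's conventions. The snippet below is only a simplified sketch loosely modeled on the toy MLCube used in this tutorial; the actual parameter names and directory layout are dictated by the Benchmark Committee and may differ for your benchmark:

# Illustrative sketch only -- see medperf_tutorial/model_mobilenetv2/mlcube/mlcube.yaml
# for the real manifest used in this tutorial.
tasks:
  infer:
    parameters:
      inputs:
        data_path: data/                  # prepared dataset in the structure the benchmark expects
        parameters_file: parameters.yaml  # model configuration
      outputs:
        output_path: predictions/         # predictions in the structure the metrics MLCube expects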

To test your MLCube's compatibility with the benchmark, first run medperf benchmark ls to identify the benchmark's server UID. In this case, it is going to be 1.

Next, locate the MLCube. Unless you are using your own implementation, the one provided for this tutorial is located in your workspace at medperf_tutorial/model_mobilenetv2/mlcube/mlcube.yaml.

After that, run the compatibility test:

medperf test run \
   --benchmark 1 \
   --model "medperf_tutorial/model_mobilenetv2/mlcube/mlcube.yaml"

Assuming the test passes successfully, you are ready to submit the MLCube to the MedPerf server.

2. Submit the MLCube

Model Owner submits Model MLCube

How Does MedPerf Recognize an MLCube?

The MedPerf server registers an MLCube as metadata comprising a set of files that can be retrieved from the internet. This means that before submitting an MLCube, you have to host its files on the internet. The MedPerf client provides a utility to prepare the MLCube files that need to be hosted. You can refer to this page if you want to understand what these files are, but using the utility script is enough.

To prepare the files of the MLCube, run the following command ensuring you are in MedPerf's root folder:

python scripts/package-mlcube.py --mlcube medperf_tutorial/model_mobilenetv2/mlcube --mlcube-types model

This script will create a new folder named assets in the MLCube directory, containing all the files that should be hosted separately.
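
If you want to see exactly what will be hosted, you can list the generated folder (its exact contents depend on your MLCube):

ls medperf_tutorial/model_mobilenetv2/mlcube/assets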

Host the Files

For the tutorial to run smoothly, the files are already hosted. If you wish to host them by yourself, you can find the list of supported options and details about hosting files in this page.

Submit the MLCube

The submission should include the URLs of all the hosted files. For the MLCube provided in this tutorial:

  • The URL to the hosted mlcube manifest file is
https://raw.githubusercontent.com/mlcommons/medperf/main/examples/chestxray_tutorial/model_mobilenetv2/mlcube/mlcube.yaml
  • The URL to the hosted mlcube parameters file is
https://raw.githubusercontent.com/mlcommons/medperf/main/examples/chestxray_tutorial/model_mobilenetv2/mlcube/workspace/parameters.yaml
  • The URL to the hosted additional files tarball file is
https://storage.googleapis.com/medperf-storage/chestxray_tutorial/mobilenetv2_weights.tar.gz
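
Optionally, you can sanity-check that each hosted file is publicly reachable before submitting, for example with curl (this is just a convenience check, not part of the MedPerf workflow):

curl -fsSLI "https://raw.githubusercontent.com/mlcommons/medperf/main/examples/chestxray_tutorial/model_mobilenetv2/mlcube/mlcube.yaml"
curl -fsSLI "https://raw.githubusercontent.com/mlcommons/medperf/main/examples/chestxray_tutorial/model_mobilenetv2/mlcube/workspace/parameters.yaml"
curl -fsSLI "https://storage.googleapis.com/medperf-storage/chestxray_tutorial/mobilenetv2_weights.tar.gz"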

Use the following command to submit:

medperf mlcube submit \
   --name my-model-cube \
   --mlcube-file "https://raw.githubusercontent.com/mlcommons/medperf/main/examples/chestxray_tutorial/model_mobilenetv2/mlcube/mlcube.yaml" \
   --parameters-file "https://raw.githubusercontent.com/mlcommons/medperf/main/examples/chestxray_tutorial/model_mobilenetv2/mlcube/workspace/parameters.yaml" \
   --additional-file "https://storage.googleapis.com/medperf-storage/chestxray_tutorial/mobilenetv2_weights.tar.gz" \
   --operational

The MLCube will be assigned a server UID. You can check it by running:

medperf mlcube ls --mine

3. Request Participation

Model Owner requests to participate in the benchmark

Benchmark workflows are run by Data Owners, who will be notified when a new model is added to a benchmark. You must request an association for your model to become part of the benchmark.

To initiate an association request, you need to collect the following information:

  • The target benchmark ID, which is 1.
  • The server UID of your MLCube, which is 4.

Run the following command to request associating your MLCube with the benchmark:

medperf mlcube associate --benchmark 1 --model_uid 4

This command will first run the benchmark's workflow on your model to ensure it is compatible with the benchmark workflow. Then, the association request information, including an executive summary of that compatibility test, is printed on the screen. You will be prompted to confirm sending this information and initiating the association request.

What Happens After Requesting the Association?

Benchmark Committee accepts / rejects models

When participating in a real benchmark, you must wait for the Benchmark Committee to approve the association request. You can check the status of your association requests by running medperf association ls. An association is identified by the server UIDs of your MLCube and the benchmark with which you are requesting association.
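
For example, to list your association requests and their current approval status:

medperf association ls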

The end

Cleanup (Optional)

You have reached the end of the tutorial! If you are planning to rerun any of the tutorials, don't forget to clean up:

  • To shut down the local MedPerf server: press CTRL+C in the terminal where the server is running.

  • To clean up the downloaded files workspace (make sure you are in MedPerf's root directory):

rm -fr medperf_tutorial

  • To clean up the local MedPerf server database (make sure you are in MedPerf's root directory):

cd server
sh reset_db.sh

  • To clean up the test storage:

rm -fr ~/.medperf/localhost_8000