Getting Started
Let's start with a simple notebook.
Notebooks
Go to TIR AI Platform
Create a New Project. Enter a suitable name for your project (e.g. sentinel)
Visit the Notebooks tab and click on Create a Notebook
Enter a name for the notebook, if desired.
Choose a notebook image - PyTorch 2
Choose a machine plan - CPU3.8_Free. Leave the rest of the fields at their defaults
Click CREATE
Wait for the notebook to come to a running state. Click the refresh icon in the status column to monitor the status.
When the notebook is in the running state, either click on the notebook name or on the three dots (…) to launch it. The project explorer (left sidebar) also displays quick-launch links for your notebooks.
A new window or tab will open in your browser with a Jupyter notebook ready for your use. When it does, you are all set to create magic.
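Before moving on, you may want a quick sanity check that the image you picked is working. The sketch below assumes the PyTorch 2 image and can be run from a terminal opened via the JupyterLab launcher (or from a notebook cell prefixed with !); the printed version depends on your image:

    # Print the installed PyTorch version (assumes the PyTorch 2 image)
    python -c "import torch; print(torch.__version__)"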
Datasets
Now that you are comfortable with notebooks, let's look at creating a dataset.
Go to the TIR AI Platform
Create a new project or select an existing one
Go to Datasets tab
Click Create Dataset
Choose the bucket type New EOS Bucket. This will create a new EOS bucket tied to your account, along with access keys for it.
Enter a name for your dataset (e.g. paws)
Click on CREATE
Note down the Bucket name, Access Key and Secret Key. You will need them to upload data later on. An easier approach is to copy the mc command from the Setup MinIO CLI tab and paste it into your command line.
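The copied command is roughly equivalent to the sketch below; the alias name, endpoint URL and keys here are placeholders, so always prefer the exact command shown in the Setup MinIO CLI tab:

    # Placeholder values; substitute the endpoint, access key and secret key noted above
    mc alias set paws https://<eos-endpoint> <ACCESS_KEY> <SECRET_KEY>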
Wait for the dataset to come to a ready state. Click the refresh icon in the status column to monitor the progress.
When the dataset is ready, locate and click on your dataset row.
You should see two tabs, Details and Objects, at the bottom of the page.
Click on Objects tab
Upload any files of your choice here. Though this is the easier option, we recommend using mc (the MinIO CLI) for larger datasets.
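For a larger upload with mc, a typical session might look like the sketch below, assuming the paws alias configured earlier and a local directory named data; the alias, bucket name and paths are placeholders for your own values:

    # Mirror the contents of a local directory into the dataset bucket (names are placeholders)
    mc mirror data paws/<dataset-bucket-name>
    # List the uploaded objects to verify
    mc ls paws/<dataset-bucket-name>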
Using Datasets with Notebooks
Go back to Notebooks tab
Create a new notebook and fill in the form. In the dataset field, select the dataset we just created (e.g. paws)
Click CREATE
When the notebook is ready, launch JupyterLab.
Enter the following command in a Jupyter notebook cell and run it:
ls /datasets/
If all went well, you should see your dataset name in the output. If you go deeper into the directory, you will see the files you uploaded from the Objects tab.
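For example, assuming the dataset was named paws, a follow-up cell such as the one below lists the uploaded files; the dataset name is a placeholder for whatever you chose:

    ls /datasets/paws/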
Model and Model Endpoints
In this example, we will deploy a TorchServe-based model, but TIR also supports other frameworks such as Triton (ONNX, PyTorch, TensorFlow, etc.) and TF Serving. You can find more details in the Models section.
Go to Models tab
Click Create Model
Enter a model name and click CREATE to generate EOS credentials and setup commands
Use the mc command to configure the MinIO CLI on your notebook or on your local desktop.
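The generated setup command is roughly equivalent to the sketch below; the alias, endpoint and keys are placeholders for the values created in the previous step, so prefer the exact command shown on the model page:

    # Placeholder values copied from the model's setup instructions
    mc alias set <model-name> https://<eos-endpoint> <ACCESS_KEY> <SECRET_KEY>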
Run the following command to confirm the setup works:
mc ls <model-name>/<model-eos-bucket>
TorchServe requires a model archive to serve the API. For this step, you can use your own model archive or download this MNIST archive.
If you downloaded the MNIST archive from the link in the prior step, unzip the archive.
Upload the model archive contents (they must include config.properties and a model-store directory) to your model EOS bucket (see step 5).
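Assuming you unzipped the archive into your current working directory and configured the <model-name> alias in the earlier step, the upload might look like the following sketch; every name and path here is a placeholder, and the next step verifies the layout with mc ls:

    # Mirror the unzipped config and model-store directories into the model bucket (placeholders throughout)
    mc mirror config <model-name>/<model-eos-bucket>/config
    mc mirror model-store <model-name>/<model-eos-bucket>/model-store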
Then list the objects in your model EOS bucket (use mc ls) and ensure the structure is similar to the one below.
├── config
│   ├── config.properties
├── model-store
│   ├── mnist.mar
Now that the model store is ready, go to the Model Endpoints section.
Click Create Endpoint
Select the model name (same as created in step 3)
Select the model format as PyTorch
Create the model endpoint.
Use the instructions (e.g. the curl command) in the model endpoint details to test the model. If you are using our MNIST model, use this tensor input file for testing.
curl -k -H "Authorization: Bearer $AUTH_TOKEN" -X POST https://infer.e2enetworks.net/project/<projectid>/endpoint/<endpoint-id>/v1/models/mnist:predict -d @./mnist_request.json
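The curl command above assumes your API token is available in the AUTH_TOKEN environment variable. One way to provide it is sketched below; the token value is a placeholder obtained from your TIR account (the exact page may vary):

    # Placeholder token; exporting it lets the shell expand $AUTH_TOKEN in the curl command above
    export AUTH_TOKEN=<your-api-token>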
Note: The model endpoints follow the KServe inference protocol. For more details, see the KServe website.