Set up env variable in AWS SageMaker container (bring your own container)

We are using AWS SageMaker, which runs our container on ECS. Is there a way to set an environment variable (e.g. stage or prod) in the container when calling the SageMaker API using the low-level Python SDK?

Pandybat answered 6/7, 2018 at 17:16 Comment(0)

Even invoking the API directly (which is lower level than using the Python SDK), you cannot set arbitrary environment variables inside the container. You can, however, pass arbitrary hyperparameters in as configuration for a TrainingJob, for example a hyperparameter like {"mystage": "prod"}. Hyperparameters show up in the container in a file called /opt/ml/input/config/hyperparameters.json, which is a simple key-value map as a JSON object. You can use this to set the environment variable in a launching script like this:

#!/bin/bash
# Read the "mystage" hyperparameter from SageMaker's config file
# (jq must be installed in the container image).
export STAGE=$(jq -r ".mystage" /opt/ml/input/config/hyperparameters.json)

# Now run your code...

You can get SageMaker to invoke this script either by making it the ENTRYPOINT in your Dockerfile, or by calling it train and making sure it's on the PATH for the shell if you're not setting an ENTRYPOINT.
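If your entrypoint is Python rather than a shell script, the same file can be read directly. A minimal sketch, assuming the hyperparameter is named mystage as above (the path is the one SageMaker uses for hyperparameters):

```python
import json
import os

# Path where SageMaker places the hyperparameters for a training job.
HYPERPARAMS_PATH = "/opt/ml/input/config/hyperparameters.json"

def export_stage(path=HYPERPARAMS_PATH):
    """Read the 'mystage' hyperparameter and expose it as the STAGE env var."""
    with open(path) as f:
        hyperparameters = json.load(f)
    os.environ["STAGE"] = hyperparameters["mystage"]
    return os.environ["STAGE"]
```

This avoids the jq dependency entirely if your training code is already Python.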

Chao answered 17/7, 2018 at 20:43 Comment(1)
We ended up doing just this hack, unless we find a better way. – Pandybat

You can configure environment variables for an ECS task; this is a common way to differentiate between dev and prod mode.

environment - The environment variables to pass to a container. This parameter maps to Env in the Create a container section of the Docker Remote API and the --env option to docker run.

My answer isn't related to SageMaker, since I think the question refers only to ECS.
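For concreteness, a minimal sketch of such a container definition as it would be passed to register_task_definition via boto3 (the family, container name, and image URI below are hypothetical placeholders):

```python
def container_definition(stage: str) -> dict:
    """Build an ECS container definition carrying the stage as an env var.
    The name and image here are hypothetical placeholders."""
    return {
        "name": "my-app",
        "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:latest",
        "environment": [
            # Maps to Env in the Docker API / `docker run --env STAGE=...`
            {"name": "STAGE", "value": stage},
        ],
    }

# Registering it requires AWS credentials, so shown as a comment only:
# import boto3
# boto3.client("ecs").register_task_definition(
#     family="my-app", containerDefinitions=[container_definition("prod")])
```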

Subaltern answered 8/7, 2018 at 11:37 Comment(0)

If you're using the low-level Boto3 SageMaker client, it might work for you to set environment variables for your models using the create_model method. This method lets you define environment variables as part of the PrimaryContainer; they will be available alongside the model artifacts in an instance of your container.

In the high-level sagemaker Python package, environment variables can be set as well, e.g. through the Estimator.deploy() and Estimator.create_model() methods (additional kwargs are passed through to Model).

Note: It seems that this approach only works at inference time, not during a training job.

Rousseau answered 27/7, 2018 at 15:47 Comment(0)

An extension to @leopd's answer is to parse the SM_TRAINING_ENV environment variable that SageMaker sets and use it directly from your Python code (train.py):

import json
import os

if __name__ == '__main__':
    # SM_TRAINING_ENV is a JSON blob describing the whole training job,
    # including the hyperparameters passed to the Estimator.
    sm_training_env = os.environ.get('SM_TRAINING_ENV')
    sm_training_env = json.loads(sm_training_env)
    hyperparameters = sm_training_env.get('hyperparameters')
    # do_train(hyperparameters)

Also, I just realized that the hyperparameters passed to the Estimator also get set as environment variables named SM_HP_NAMEOFPARAMETER and can be accessed directly.
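A minimal sketch of reading one of those variables directly, simulating what SageMaker would set for a hyperparameter named learning_rate (the parameter name is just an example):

```python
import os

def read_sm_hyperparameter(name, default=None):
    """Read a single SageMaker hyperparameter from its SM_HP_ env var."""
    return os.environ.get("SM_HP_" + name.upper(), default)

# Simulate what SageMaker would set for
# Estimator(..., hyperparameters={"learning_rate": "0.1"}):
os.environ["SM_HP_LEARNING_RATE"] = "0.1"
print(read_sm_hyperparameter("learning_rate"))  # prints 0.1
```

Note the values arrive as strings, so cast them yourself as needed.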

Boche answered 22/12, 2021 at 17:45 Comment(0)

You can use the "Environment" attribute when defining the container for the model; see the boto3 create_model docs for details. @anthnyprschka mentioned this, but here's an example.

import os

import boto3

def create_model(model_name: str, docker_img_url: str, account_id: str):
    container = {
        "Image": docker_img_url,
        "Mode": "SingleModel",
        # Passed through to the model container running the inference endpoint
        "Environment": {
            "S3_BUCKET": os.environ["S3_BUCKET"],
        },
    }
    role = os.environ["AWS_EXEC_ROLE"]

    sm_client = boto3.client(service_name="sagemaker")
    sm_client.create_model(
        ModelName=model_name,
        ExecutionRoleArn=f"arn:aws:iam::{account_id}:role/{role}",
        Containers=[container],
    )
Firedamp answered 23/7, 2024 at 22:37 Comment(0)

© 2022 - 2025 — McMap. All rights reserved.