How to make parameters/files available to Tensorflow Endpoint Instance #217

Closed
@tyler-lanigan-hs

Description

I'm looking to make some hyperparameters or files available to the serving endpoint in SageMaker. The training instance is given access to input parameters via the hyperparameters argument:

estimator = TensorFlow(entry_point='autocat.py',
                       role=role,
                       output_path=params['output_path'],
                       code_location=params['code_location'],
                       train_instance_count=1,
                       train_instance_type='ml.c4.xlarge',
                       training_steps=10000,
                       evaluation_steps=None,
                       hyperparameters=params)

However, when the endpoint is deployed, there is no way to pass in parameters that are used to control the data processing in the input_fn(serialized_input, content_type) function.

What would be the best way to pass parameters to the serving instance? Is the source_dir parameter defined in the sagemaker.tensorflow.TensorFlow class copied to the serving instance? If so, I could use a config.yml or similar.
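If source_dir is indeed copied over, the serving script could read a small config file shipped alongside it. A minimal sketch of that idea, using JSON instead of YAML to avoid an extra dependency; the filename and keys here are assumptions, not anything confirmed by the SDK:

```python
import json
import os

# Hypothetical config file placed next to the entry point inside source_dir.
CONFIG_PATH = os.path.join(os.path.dirname(os.path.abspath(__file__)), "config.json")

def load_serving_config(path=CONFIG_PATH):
    """Read serving-time parameters, e.g. the vectorizer's S3 location."""
    with open(path) as f:
        return json.load(f)
```

The entry point could call this at module import so the parameters are ready before the first request hits input_fn.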

The reason I'm asking is that I keep the location of a TFIDF vectorizer in the params dictionary and load it from S3 at training time. In the future I'd like to use the same approach to load embeddings at serving time.
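At serving time those parameters would then drive the preprocessing. A rough sketch of an input_fn that deserializes a JSON request and applies a stand-in transform; in practice the transform would wrap the vectorizer loaded from the S3 location recorded in the config, and the content type handling is an assumption:

```python
import json

# Stand-in for the TF-IDF transform; the real version would call the
# vectorizer downloaded from the configured S3 path (e.g. via boto3).
def _transform(texts):
    return [t.lower().split() for t in texts]

def input_fn(serialized_input, content_type):
    """Deserialize the request and apply serving-time preprocessing."""
    if content_type == "application/json":
        payload = json.loads(serialized_input)
        return _transform(payload["instances"])
    raise ValueError("Unsupported content type: %s" % content_type)
```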
