Add aws batch #5409
Changes from 9 commits
pyproject.toml

```diff
@@ -42,6 +42,7 @@ dependencies = [
     "jinja2>=3.0,<4.0",
     "sagemaker-mlflow>=0.0.1,<1.0.0",
     "mlflow>=3.0.0,<4.0.0",
+    "nest_asyncio>=1.5.0",
```
Member: Is this a required dependency? How much additional size does it add to the sagemaker-train package?

Collaborator (Author): This is used in the result() method of TrainingQueuedJob. It's a very small package. That said, I'm aligned with removing it from pyproject.toml and letting users pip install it themselves (we do something similar for inference, where we support many ML frameworks but don't require all of them in pyproject.toml, since users can pick and choose which frameworks they want to use).

Collaborator (Author): Removed nest_asyncio as a dependency in pyproject.toml.

Collaborator (Author): Synced with David. Adding the dependency back in; it is a small package and should be available for users when they install sagemaker-train.
```diff
 ]

 [project.urls]
```
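For context on the dependency discussion above: nest_asyncio exists to let asyncio.run() be called from an environment whose event loop is already running (e.g. a Jupyter notebook). The sketch below only illustrates that pattern; the poller and the result() signature are hypothetical, not the PR's TrainingQueuedJob implementation.

```python
# Illustrative sketch of the pattern nest_asyncio enables; this is NOT the
# PR's TrainingQueuedJob.result() implementation.
import asyncio

import nest_asyncio

nest_asyncio.apply()  # patch the running loop so asyncio.run() can be nested


async def _poll_until_complete(job_id: str) -> dict:
    """Hypothetical async poller standing in for the real result() internals."""
    await asyncio.sleep(0)  # placeholder for real describe/poll calls
    return {"jobId": job_id, "status": "SUCCEEDED"}


def result(job_id: str) -> dict:
    # Without nest_asyncio.apply(), asyncio.run() raises RuntimeError inside a
    # notebook cell because Jupyter's event loop is already running.
    return asyncio.run(_poll_until_complete(job_id))
```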
New file (+186 lines): AWS Batch submit/describe/terminate/list service-job API helpers.

```python
# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"). You
# may not use this file except in compliance with the License. A copy of
# the License is located at
#
# http://aws.amazon.com/apache2.0/
#
# or in the "license" file accompanying this file. This file is
# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
# ANY KIND, either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
"""The module provides helper functions for the Batch Submit/Describe/Terminate job APIs."""
from __future__ import absolute_import

import json
from typing import List, Dict, Optional
from sagemaker.train.aws_batch.constants import (
    SAGEMAKER_TRAINING,
    DEFAULT_TIMEOUT,
    DEFAULT_SAGEMAKER_TRAINING_RETRY_CONFIG,
)
from sagemaker.train.aws_batch.boto_client import get_batch_boto_client


def submit_service_job(
    training_payload: Dict,
    job_name: str,
    job_queue: str,
    retry_config: Optional[Dict] = None,
    scheduling_priority: Optional[int] = None,
    timeout: Optional[Dict] = None,
    share_identifier: Optional[str] = None,
    tags: Optional[Dict] = None,
) -> Dict:
    """Batch submit_service_job API helper function.

    Args:
        training_payload: A dict containing the arguments for the Training job.
        job_name: Batch job name.
        job_queue: Batch job queue ARN.
        retry_config: Batch job retry configuration.
        scheduling_priority: An integer representing scheduling priority.
        timeout: Timeout configuration if specified, else defaults to 1 day.
        share_identifier: Value of shareIdentifier if specified.
        tags: A dict of string to string representing Batch tags.

    Returns:
        A dict containing jobArn, jobName and jobId.
    """
    if timeout is None:
        timeout = DEFAULT_TIMEOUT
    client = get_batch_boto_client()
    training_payload_tags = training_payload.pop("Tags", None)
    payload = {
        "jobName": job_name,
        "jobQueue": job_queue,
        "retryStrategy": DEFAULT_SAGEMAKER_TRAINING_RETRY_CONFIG,
        "serviceJobType": SAGEMAKER_TRAINING,
        "serviceRequestPayload": json.dumps(training_payload),
        "timeoutConfig": timeout,
    }
    if retry_config:
        payload["retryStrategy"] = retry_config
    if scheduling_priority:
        payload["schedulingPriority"] = scheduling_priority
    if share_identifier:
        payload["shareIdentifier"] = share_identifier
    if tags or training_payload_tags:
        payload["tags"] = __merge_tags(tags, training_payload_tags)
    return client.submit_service_job(**payload)


def describe_service_job(job_id: str) -> Dict:
    """Batch describe_service_job API helper function.

    Args:
        job_id: Job ID used.

    Returns: a dict. See the sample below:
        {
            'attempts': [
                {
                    'serviceResourceId': {
                        'name': 'string',
                        'value': 'string'
                    },
                    'startedAt': 123,
                    'stoppedAt': 123,
                    'statusReason': 'string'
                },
            ],
            'createdAt': 123,
            'isTerminated': True|False,
            'jobArn': 'string',
            'jobId': 'string',
            'jobName': 'string',
            'jobQueue': 'string',
            'retryStrategy': {
                'attempts': 123
            },
            'schedulingPriority': 123,
            'serviceRequestPayload': 'string',
            'serviceJobType': 'EKS'|'ECS'|'ECS_FARGATE'|'SAGEMAKER_TRAINING',
            'shareIdentifier': 'string',
            'startedAt': 123,
            'status': 'SUBMITTED'|'PENDING'|'RUNNABLE'|'STARTING'|'RUNNING'|'SUCCEEDED'|'FAILED',
            'statusReason': 'string',
            'stoppedAt': 123,
            'tags': {
                'string': 'string'
            },
            'timeout': {
                'attemptDurationSeconds': 123
            }
        }
    """
    client = get_batch_boto_client()
    return client.describe_service_job(jobId=job_id)


def terminate_service_job(job_id: str, reason: Optional[str] = "default terminate reason") -> Dict:
    """Batch terminate_service_job API helper function.

    Args:
        job_id: Job ID.
        reason: A string representing the terminate reason.

    Returns: an empty dict.
    """
    client = get_batch_boto_client()
    return client.terminate_service_job(jobId=job_id, reason=reason)


def list_service_job(
    job_queue: str,
    job_status: Optional[str] = None,
    filters: Optional[List] = None,
    next_token: Optional[str] = None,
) -> Dict:
    """Batch list_service_jobs API helper function.

    Args:
        job_queue: Batch job queue ARN.
        job_status: Batch job status.
        filters: A list of dicts. Each dict contains a filter.
        next_token: Used to retrieve data in the next page.

    Returns: A generator yielding pages of list results.
    """
    client = get_batch_boto_client()
    payload = {"jobQueue": job_queue}
    if filters:
        payload["filters"] = filters
    if next_token:
        payload["nextToken"] = next_token
    if job_status:
        payload["jobStatus"] = job_status
    part_of_jobs = client.list_service_jobs(**payload)
    next_token = part_of_jobs.get("nextToken")
    yield part_of_jobs
    if next_token:
        yield from list_service_job(job_queue, job_status, filters, next_token)


def __merge_tags(batch_tags: Optional[Dict], training_tags: Optional[List]) -> Optional[Dict]:
    """Merges Batch and training payload tags.

    Returns a copy of Batch tags merged with training payload tags. Training payload tags take
    precedence in the case of key conflicts.

    :param batch_tags: A dict of string to string representing Batch tags.
    :param training_tags: A list of `{"Key": "string", "Value": "string"}` objects representing
        training payload tags.
    :return: A dict of string to string representing batch tags merged with training tags.
        batch_tags is returned unaltered if training_tags is None or empty.
    """
    if not training_tags:
        return batch_tags

    training_tags_to_merge = {tag["Key"]: tag["Value"] for tag in training_tags}
    batch_tags_copy = batch_tags.copy() if batch_tags else {}
    batch_tags_copy.update(training_tags_to_merge)

    return batch_tags_copy
```
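A hypothetical usage sketch for the helpers above. The import path, queue ARN, and training payload are placeholders; the actual module path is not shown in this excerpt of the diff.

```python
# Hypothetical usage sketch for the Batch helper functions above. The import
# path, queue ARN, and payload values are placeholders.
from sagemaker.train.aws_batch.batch_api_helper import (  # module path assumed
    describe_service_job,
    list_service_job,
    submit_service_job,
    terminate_service_job,
)

job_queue_arn = "arn:aws:batch:us-west-2:123456789012:job-queue/my-training-queue"
training_payload = {
    # A CreateTrainingJob-style request; it is serialized into
    # serviceRequestPayload by submit_service_job().
    "TrainingJobName": "example-training-job",
    "Tags": [{"Key": "team", "Value": "ml-platform"}],
}

# Submit the training job as a Batch service job.
response = submit_service_job(
    training_payload=training_payload,
    job_name="example-batch-job",
    job_queue=job_queue_arn,
    tags={"project": "queued-training"},
)
job_id = response["jobId"]

# Check its status.
print(describe_service_job(job_id)["status"])

# list_service_job is a generator that follows nextToken, yielding one raw
# list_service_jobs response page at a time.
for page in list_service_job(job_queue_arn, job_status="RUNNING"):
    print(page)

# Cancel the job if it is no longer needed.
terminate_service_job(job_id, reason="no longer needed")
```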
New file (+33 lines): helper for creating the Batch boto3 client.

```python
# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"). You
# may not use this file except in compliance with the License. A copy of
# the License is located at
#
# http://aws.amazon.com/apache2.0/
#
# or in the "license" file accompanying this file. This file is
# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
# ANY KIND, either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
"""The file provides a helper function for getting the Batch boto client."""
from __future__ import absolute_import

from typing import Optional
import boto3


def get_batch_boto_client(
    region: Optional[str] = None,
    endpoint: Optional[str] = None,
) -> boto3.session.Session.client:
    """Helper function for getting a Batch boto3 client.

    Args:
        region: Region specified.
        endpoint: Batch API endpoint.

    Returns: Batch boto3 client.
    """
    return boto3.client("batch", region_name=region, endpoint_url=endpoint)
```

Member: Make it internal?

Collaborator (Author): This is used in the notebook, so it should stay external.
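For context on the "used in the notebook" point, a hedged example of direct client use; the region value is a placeholder and the import path is assumed from the package layout implied by the other files.

```python
# Illustrative only: obtaining the Batch client directly in a notebook, as the
# thread above describes. The region is a placeholder; the import path is assumed.
from sagemaker.train.aws_batch.boto_client import get_batch_boto_client

client = get_batch_boto_client(region="us-west-2")

# The return value is a plain boto3 "batch" client, so any AWS Batch API is
# available on it, e.g. inspecting the configured job queues:
for queue in client.describe_job_queues().get("jobQueues", []):
    print(queue["jobQueueName"], queue["state"])
```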
New file (+34 lines): constants used by the Batch API helpers.

```python
# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"). You
# may not use this file except in compliance with the License. A copy of
# the License is located at
#
# http://aws.amazon.com/apache2.0/
#
# or in the "license" file accompanying this file. This file is
# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
# ANY KIND, either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
"""The file defines constants used for Batch API helper functions."""

from __future__ import absolute_import

SAGEMAKER_TRAINING = "SAGEMAKER_TRAINING"
DEFAULT_ATTEMPT_DURATION_IN_SECONDS = 86400  # 1 day in seconds.
DEFAULT_TIMEOUT = {"attemptDurationSeconds": DEFAULT_ATTEMPT_DURATION_IN_SECONDS}
POLL_IN_SECONDS = 5
JOB_STATUS_RUNNING = "RUNNING"
JOB_STATUS_COMPLETED = "SUCCEEDED"
JOB_STATUS_FAILED = "FAILED"
DEFAULT_SAGEMAKER_TRAINING_RETRY_CONFIG = {
    "attempts": 1,
    "evaluateOnExit": [
        {
            "action": "RETRY",
            "onStatusReason": "Received status from SageMaker:InternalServerError: "
            "We encountered an internal error. Please try again.",
        },
        {"action": "EXIT", "onStatusReason": "*"},
    ],
}
```
New file (+52 lines): custom exceptions for Batch queueing.

```python
# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"). You
# may not use this file except in compliance with the License. A copy of
# the License is located at
#
# http://aws.amazon.com/apache2.0/
#
# or in the "license" file accompanying this file. This file is
# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
# ANY KIND, either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
"""The file defines customized exceptions for Batch queueing."""
from __future__ import absolute_import


class NoTrainingJob(Exception):
    """Define NoTrainingJob Exception.

    It means no Training job has been created by the AWS Batch service.
    """

    def __init__(self, value):
        super().__init__(value)
        self.value = value

    def __str__(self):
        """Convert Exception to string.

        Returns: a String containing exception error messages.
        """
        return repr(self.value)


class MissingRequiredArgument(Exception):
    """Define MissingRequiredArgument exception.

    It means some required arguments are missing.
    """

    def __init__(self, value):
        super().__init__(value)
        self.value = value

    def __str__(self):
        """Convert Exception to string.

        Returns: a String containing exception error messages.
        """
        return repr(self.value)
```
Member: Wanted to check what this utility is used for?

Collaborator (Author): This is to maintain parity with the Estimator::logs method, which tails the logs emitted by an active training job. Example of usage here, under the "Monitor Job Status" section: https://github.com/aws/amazon-sagemaker-examples/blob/default/%20%20%20%20%20%20build_and_train_models/sm-training-queues/sm-training-queues_getting_started_with_estimator.ipynb

Check out this line in the example notebook in this PR: `model_trainer.sagemaker_session.logs_for_job(model_trainer._latest_training_job.training_job_name, wait=True)`

Member: It seems like we are replacing:

v2 logic: `job.get_estimator().logs()`

v3 logic: `model_trainer.sagemaker_session.logs_for_job(model_trainer._latest_training_job.training_job_name, wait=True)`

The v3 experience looks pretty bad to me, and since this is not an existing v2 parity issue, can we think about tying the get-logs functionality to the training queue job or the model trainer directly? We can also think this through and not make a decision for the first release. Let's use a utility method within the example notebooks (replicate the _logs_for_job functionality within the notebook as a standalone utility). This would not break customers using the notebook, and we can design the right offering for logs exposure later and update the notebook.

Collaborator (Author): Synced with Mufi. Resolution: let's make logs_for_job a notebook method so that we are not introducing a new external method (and we can take more time on this in the future). We are okay with allowing the user to get the training job name from _latest_training_job since this is an internal parameter (not an internal class or method). Will be implementing this.
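A possible shape for the standalone notebook utility discussed above; this is a minimal sketch, not the PR's implementation. It assumes the standard CloudWatch log group for SageMaker training jobs, and the helper name and polling behavior are made up for illustration.

```python
# Minimal sketch of a notebook-local log-tailing utility (NOT the PR's code).
# Assumes the standard CloudWatch log group for SageMaker training jobs.
import time
from typing import Optional

import boto3

LOG_GROUP = "/aws/sagemaker/TrainingJobs"


def tail_training_job_logs(job_name: str, region: Optional[str] = None, poll_seconds: int = 5) -> None:
    """Print CloudWatch log events for a training job until no new events arrive."""
    logs = boto3.client("logs", region_name=region)
    streams = logs.describe_log_streams(
        logGroupName=LOG_GROUP, logStreamNamePrefix=job_name
    ).get("logStreams", [])
    next_tokens = {stream["logStreamName"]: None for stream in streams}

    while True:
        saw_new_events = False
        for stream_name, token in next_tokens.items():
            kwargs = {
                "logGroupName": LOG_GROUP,
                "logStreamName": stream_name,
                "startFromHead": True,
            }
            if token:
                kwargs["nextToken"] = token
            response = logs.get_log_events(**kwargs)
            for event in response["events"]:
                saw_new_events = True
                print(event["message"])
            next_tokens[stream_name] = response["nextForwardToken"]
        if not saw_new_events:
            break
        time.sleep(poll_seconds)


# Notebook usage, with the job name pulled from the queued ModelTrainer as in
# the thread above:
# tail_training_job_logs(model_trainer._latest_training_job.training_job_name)
```

Whether something like this stays a notebook utility or is eventually attached to the queued job or ModelTrainer is the open question the thread defers past the first release.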