Merged

26 commits
1d6c559
Add aws batch implementation (works with example notebook)
aviruthen Dec 11, 2025
2cdb2d4
fixing unit tests and adding integration test
aviruthen Dec 11, 2025
3438a87
Merge branch 'master' into add-aws-batch
aviruthen Dec 11, 2025
a60459c
add example notebook
aviruthen Dec 11, 2025
149f02c
Merge branch 'add-aws-batch' of https://github.com/aviruthen/sagemake…
aviruthen Dec 11, 2025
41f71d6
Adding missing dependencies for aws_batch
aviruthen Dec 11, 2025
9326821
Fixing indentation bug in source code
aviruthen Dec 12, 2025
711b5a3
comment out delete resources in example notebook
aviruthen Dec 12, 2025
1e5a534
Merge branch 'master' into add-aws-batch
aviruthen Dec 12, 2025
3cd0846
Add notebook png and remove extraneous comments
aviruthen Dec 16, 2025
d41bd5a
Merge branch 'add-aws-batch' of https://github.com/aviruthen/sagemake…
aviruthen Dec 16, 2025
486377b
Merge branch 'master' into add-aws-batch
aviruthen Dec 16, 2025
5e5d0cc
Add in png correctly
aviruthen Dec 16, 2025
5030010
Merge branch 'add-aws-batch' of https://github.com/aviruthen/sagemake…
aviruthen Dec 16, 2025
8194f1b
Removing logs_from_job from session_helper
aviruthen Dec 18, 2025
f96e58a
Merge branch 'master' into add-aws-batch
aviruthen Dec 18, 2025
8121734
Adding helpers for logging
aviruthen Dec 18, 2025
3fc1eff
Merge branch 'add-aws-batch' of https://github.com/aviruthen/sagemake…
aviruthen Dec 18, 2025
9fd23a9
Make helper methods internal
aviruthen Dec 18, 2025
1dffe75
Merge branch 'master' into add-aws-batch
aviruthen Dec 18, 2025
b7f4cfb
Adding back nest asyncio dependency
aviruthen Dec 18, 2025
848dff9
Merge branch 'add-aws-batch' of https://github.com/aviruthen/sagemake…
aviruthen Dec 18, 2025
c3dab73
Merge branch 'master' into add-aws-batch
aviruthen Dec 18, 2025
4044caa
Merge branch 'master' into add-aws-batch
aviruthen Dec 19, 2025
5148e14
Updating unit tests for internal-external method changes
aviruthen Dec 19, 2025
8ab3b05
Merge branch 'add-aws-batch' of https://github.com/aviruthen/sagemake…
aviruthen Dec 19, 2025
26 changes: 25 additions & 1 deletion sagemaker-core/src/sagemaker/core/helper/session_helper.py
@@ -1865,6 +1865,30 @@ def expand_role(self, role):
if "/" in role:
return role
return self.boto_session.resource("iam").Role(role).arn


def logs_for_job(self, job_name, wait=False, poll=10, log_type="All", timeout=None):
Member:
Wanted to check on what this utility is used for?

Collaborator:
This is to maintain parity with the Estimator::logs method, which tails the logs being emitted from an active training job. Example of usage here, down under the "Monitor Job Status" section: https://github.com/aws/amazon-sagemaker-examples/blob/default/%20%20%20%20%20%20build_and_train_models/sm-training-queues/sm-training-queues_getting_started_with_estimator.ipynb

Check out this line in the example notebook in this PR: model_trainer.sagemaker_session.logs_for_job(model_trainer._latest_training_job.training_job_name, wait=True)

Member:
It seems like we are replacing:
v2 logic: job.get_estimator().logs()
v3 logic: model_trainer.sagemaker_session.logs_for_job(model_trainer._latest_training_job.training_job_name, wait=True)

The v3 experience looks pretty bad to me, and since this is not an existing v2 parity issue, can we think about tying the get-logs functionality to the training queue job or the model trainer directly? We can also think this through and not make a decision for the first release: use a utility method within the example notebooks (replicate the _logs_for_job functionality in the notebook as a standalone utility). That would not break customers using the notebook, and we could then design the right offering for logs exposure and update the notebook.

Collaborator Author:
Synced with Mufi. Resolution: let's make logs_for_job a notebook method so that we are not introducing a new external method (and we can take more time on this in the future). We are okay with allowing the user to get the training job name from _latest_training_job since this is an internal parameter (not an internal class or method). Will be implementing this.

"""Display logs for a given training job, optionally tailing them until job is complete.

If the output is a tty or a Jupyter cell, it will be color-coded
based on which instance the log entry is from.

Args:
job_name (str): Name of the training job to display the logs for.
wait (bool): Whether to keep looking for new log entries until the job completes
(default: False).
poll (int): The interval in seconds between polling for new log entries and job
completion (default: 5).
log_type ([str]): A list of strings specifying which logs to print. Acceptable
strings are "All", "None", "Training", or "Rules". To maintain backwards
compatibility, boolean values are also accepted and converted to strings.
timeout (int): Timeout in seconds to wait until the job is completed. ``None`` by
default.
Raises:
exceptions.CapacityError: If the training job fails with CapacityError.
exceptions.UnexpectedStatusException: If waiting and the training job fails.
"""
_logs_for_job(self, job_name, wait, poll, log_type, timeout)
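Per the resolution in the thread above, the example notebook could replicate this behavior as a standalone utility instead of relying on a new public Session method. A rough sketch, assuming _logs_for_job is importable from sagemaker.core.helper.session_helper (the import path and the helper name tail_training_logs are assumptions, not part of this PR):

# Hypothetical notebook cell: replicate logs_for_job as a standalone utility.
from sagemaker.core.helper.session_helper import _logs_for_job  # assumed import path

def tail_training_logs(model_trainer, wait=True, poll=10):
    """Tail logs for the trainer's latest training job (illustrative helper only)."""
    job_name = model_trainer._latest_training_job.training_job_name
    # Mirrors the call site in the method above: (session, job_name, wait, poll, log_type, timeout)
    _logs_for_job(model_trainer.sagemaker_session, job_name, wait, poll, "All", None)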


def _expand_container_def(c_def):
@@ -2962,4 +2986,4 @@ def container_def(
c_def["Mode"] = container_mode
if image_config:
c_def["ImageConfig"] = image_config
return c_def
return c_def
1 change: 1 addition & 0 deletions sagemaker-train/pyproject.toml
@@ -42,6 +42,7 @@ dependencies = [
"jinja2>=3.0,<4.0",
"sagemaker-mlflow>=0.0.1,<1.0.0",
"mlflow>=3.0.0,<4.0.0",
"nest_asyncio>=1.5.0",
Member:
Is this a required dependency? How much additional size does it add to the sagemaker-train package?

Collaborator Author (aviruthen), Dec 17, 2025:
This is used in the result() method for TrainingQueuedJob. It's a really small package; however, I'm aligned with removing it from pyproject.toml and having it be a dependency users can pip install themselves (we do something similar for inference, where there are many different ML frameworks but we don't require all of them in pyproject.toml, since users can pick and choose which frameworks they want to use).

Collaborator Author:
Removed nest_asyncio as a dependency in pyproject.toml.

Collaborator Author:
Synced with David. Adding the dependency back in: it is a small package and should be available for users when they install sagemaker-train.

]

[project.urls]
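As background for the nest_asyncio discussion above: the package exists so that asyncio.run() can be called from inside a notebook's already-running event loop. A minimal sketch of that pattern (the coroutine below is a placeholder, not the actual TrainingQueuedJob.result() implementation):

import asyncio

import nest_asyncio

nest_asyncio.apply()  # patch the running Jupyter loop so nested asyncio.run() calls succeed

async def _poll_until_terminal():
    await asyncio.sleep(0)  # stand-in for the real polling coroutine
    return "SUCCEEDED"

# Without apply(), asyncio.run() raises RuntimeError when a loop is already running (e.g. in Jupyter).
print(asyncio.run(_poll_until_terminal()))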
Empty file.
186 changes: 186 additions & 0 deletions sagemaker-train/src/sagemaker/train/aws_batch/batch_api_helper.py
@@ -0,0 +1,186 @@
# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"). You
# may not use this file except in compliance with the License. A copy of
# the License is located at
#
# http://aws.amazon.com/apache2.0/
#
# or in the "license" file accompanying this file. This file is
# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
# ANY KIND, either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
"""The module provides helper function for Batch Submit/Describe/Terminal job APIs."""
from __future__ import absolute_import

import json
from typing import List, Dict, Optional
from sagemaker.train.aws_batch.constants import (
    SAGEMAKER_TRAINING,
    DEFAULT_TIMEOUT,
    DEFAULT_SAGEMAKER_TRAINING_RETRY_CONFIG,
)
from sagemaker.train.aws_batch.boto_client import get_batch_boto_client


def submit_service_job(
    training_payload: Dict,
    job_name: str,
    job_queue: str,
    retry_config: Optional[Dict] = None,
    scheduling_priority: Optional[int] = None,
    timeout: Optional[Dict] = None,
    share_identifier: Optional[str] = None,
    tags: Optional[Dict] = None,
) -> Dict:
    """Batch submit_service_job API helper function.
    Args:
        training_payload: a dict containing a dict of arguments for Training job.
        job_name: Batch job name.
        job_queue: Batch job queue ARN.
        retry_config: Batch job retry configuration.
        scheduling_priority: An integer representing scheduling priority.
        timeout: Set with value of timeout if specified, else default to 1 day.
        share_identifier: value of shareIdentifier if specified.
        tags: A dict of string to string representing Batch tags.
    Returns:
        A dict containing jobArn, jobName and jobId.
    """
    if timeout is None:
        timeout = DEFAULT_TIMEOUT
    client = get_batch_boto_client()
    training_payload_tags = training_payload.pop("Tags", None)
    payload = {
        "jobName": job_name,
        "jobQueue": job_queue,
        "retryStrategy": DEFAULT_SAGEMAKER_TRAINING_RETRY_CONFIG,
        "serviceJobType": SAGEMAKER_TRAINING,
        "serviceRequestPayload": json.dumps(training_payload),
        "timeoutConfig": timeout,
    }
    if retry_config:
        payload["retryStrategy"] = retry_config
    if scheduling_priority:
        payload["schedulingPriority"] = scheduling_priority
    if share_identifier:
        payload["shareIdentifier"] = share_identifier
    if tags or training_payload_tags:
        payload["tags"] = __merge_tags(tags, training_payload_tags)
    return client.submit_service_job(**payload)
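For illustration, a hypothetical call might look like the following. The queue ARN, role, image URI and bucket are placeholders, and the payload mirrors the standard CreateTrainingJob request shape rather than anything defined in this PR:

# Illustrative only -- all ARNs, URIs and names below are placeholders.
training_payload = {
    "TrainingJobName": "queued-training-example",
    "AlgorithmSpecification": {"TrainingImage": "<ecr-image-uri>", "TrainingInputMode": "File"},
    "RoleArn": "arn:aws:iam::123456789012:role/ExampleSageMakerRole",
    "OutputDataConfig": {"S3OutputPath": "s3://example-bucket/output/"},
    "ResourceConfig": {"InstanceType": "ml.m5.xlarge", "InstanceCount": 1, "VolumeSizeInGB": 30},
    "StoppingCondition": {"MaxRuntimeInSeconds": 3600},
}
response = submit_service_job(
    training_payload,
    job_name="queued-training-example",
    job_queue="arn:aws:batch:us-east-1:123456789012:job-queue/example-queue",
)
print(response["jobArn"], response["jobId"])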


def describe_service_job(job_id: str) -> Dict:
"""Batch describe_service_job API helper function.
Args:
job_id: Job ID used.
Returns: a dict. See the sample below
{
'attempts': [
{
'serviceResourceId': {
'name': 'string',
'value': 'string'
},
'startedAt': 123,
'stoppedAt': 123,
'statusReason': 'string'
},
],
'createdAt': 123,
'isTerminated': True|False,
'jobArn': 'string',
'jobId': 'string',
'jobName': 'string',
'jobQueue': 'string',
'retryStrategy': {
'attempts': 123
},
'schedulingPriority': 123,
'serviceRequestPayload': 'string',
'serviceJobType': 'EKS'|'ECS'|'ECS_FARGATE'|'SAGEMAKER_TRAINING',
'shareIdentifier': 'string',
'startedAt': 123,
'status': 'SUBMITTED'|'PENDING'|'RUNNABLE'|'STARTING'|'RUNNING'|'SUCCEEDED'|'FAILED',
'statusReason': 'string',
'stoppedAt': 123,
'tags': {
'string': 'string'
},
'timeout': {
'attemptDurationSeconds': 123
}
}
"""
client = get_batch_boto_client()
return client.describe_service_job(jobId=job_id)


def terminate_service_job(job_id: str, reason: Optional[str] = "default terminate reason") -> Dict:
"""Batch terminate_service_job API helper function.
Args:
job_id: Job ID
reason: A string representing terminate reason.
Returns: an empty dict
"""
client = get_batch_boto_client()
return client.terminate_service_job(jobId=job_id, reason=reason)
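The describe_service_job helper above pairs naturally with the status constants defined in constants.py. A hedged sketch of a polling helper built on top of it (wait_for_service_job is hypothetical, not part of this PR):

import time

from sagemaker.train.aws_batch.constants import (
    JOB_STATUS_COMPLETED,
    JOB_STATUS_FAILED,
    POLL_IN_SECONDS,
)

def wait_for_service_job(job_id: str) -> dict:
    """Poll a Batch service job until it reaches a terminal status (illustrative only)."""
    while True:
        job = describe_service_job(job_id)
        if job["status"] in (JOB_STATUS_COMPLETED, JOB_STATUS_FAILED):
            return job
        time.sleep(POLL_IN_SECONDS)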


def list_service_job(
    job_queue: str,
    job_status: Optional[str] = None,
    filters: Optional[List] = None,
    next_token: Optional[str] = None,
) -> Dict:
    """Batch list_service_job API helper function.
    Args:
        job_queue: Batch job queue ARN.
        job_status: Batch job status.
        filters: A list of Dict. Each contains a filter.
        next_token: Used to retrieve data in next page.
    Returns: A generator containing list results.
    """
    client = get_batch_boto_client()
    payload = {"jobQueue": job_queue}
    if filters:
        payload["filters"] = filters
    if next_token:
        payload["nextToken"] = next_token
    if job_status:
        payload["jobStatus"] = job_status
    part_of_jobs = client.list_service_jobs(**payload)
    next_token = part_of_jobs.get("nextToken")
    yield part_of_jobs
    if next_token:
        yield from list_service_job(job_queue, job_status, filters, next_token)
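Because the function is a generator that follows nextToken recursively, callers iterate pages rather than individual jobs. For example (the queue ARN is a placeholder, and the jobSummaryList key is assumed to mirror the ListJobs response shape):

queue_arn = "arn:aws:batch:us-east-1:123456789012:job-queue/example-queue"  # placeholder
for page in list_service_job(queue_arn, job_status="RUNNING"):
    # Each page is a raw list_service_jobs response dict.
    for job in page.get("jobSummaryList", []):  # key assumed from the ListJobs response shape
        print(job["jobName"], job["jobId"])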


def __merge_tags(batch_tags: Optional[Dict], training_tags: Optional[List]) -> Optional[Dict]:
"""Merges Batch and training payload tags.
Returns a copy of Batch tags merged with training payload tags. Training payload tags take
precedence in the case of key conflicts.
:param batch_tags: A dict of string to string representing Batch tags.
:param training_tags: A list of `{"Key": "string", "Value": "string"}` objects representing
training payload tags.
:return: A dict of string to string representing batch tags merged with training tags.
batch_tags is returned unaltered if training_tags is None or empty.
"""
if not training_tags:
return batch_tags

training_tags_to_merge = {tag["Key"]: tag["Value"] for tag in training_tags}
batch_tags_copy = batch_tags.copy() if batch_tags else {}
batch_tags_copy.update(training_tags_to_merge)

return batch_tags_copy
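Within this module, the precedence rule plays out as in this small example:

# Training payload tags win on key conflicts; Batch-only keys are preserved.
merged = __merge_tags(
    {"team": "research", "env": "dev"},
    [{"Key": "env", "Value": "prod"}],
)
assert merged == {"team": "research", "env": "prod"}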
33 changes: 33 additions & 0 deletions sagemaker-train/src/sagemaker/train/aws_batch/boto_client.py
@@ -0,0 +1,33 @@
# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"). You
# may not use this file except in compliance with the License. A copy of
# the License is located at
#
# http://aws.amazon.com/apache2.0/
#
# or in the "license" file accompanying this file. This file is
# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
# ANY KIND, either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
"""The file provides helper function for getting Batch boto client."""
from __future__ import absolute_import

from typing import Optional
import boto3


def get_batch_boto_client(
Member:
Make it internal?

Collaborator Author:
This is used in the notebook, so it should stay external.

    region: Optional[str] = None,
    endpoint: Optional[str] = None,
) -> boto3.session.Session.client:
    """Helper function for getting Batch boto3 client.
    Args:
        region: Region specified
        endpoint: Batch API endpoint.
    Returns: Batch boto3 client.
    """
    return boto3.client("batch", region_name=region, endpoint_url=endpoint)
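As noted in the thread above, the example notebook calls this helper directly. A trivial usage sketch (the region value is illustrative; both arguments are optional and fall back to boto3's default resolution):

batch_client = get_batch_boto_client(region="us-east-1")
print(batch_client.meta.region_name)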
34 changes: 34 additions & 0 deletions sagemaker-train/src/sagemaker/train/aws_batch/constants.py
@@ -0,0 +1,34 @@
# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"). You
# may not use this file except in compliance with the License. A copy of
# the License is located at
#
# http://aws.amazon.com/apache2.0/
#
# or in the "license" file accompanying this file. This file is
# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
# ANY KIND, either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
"""The file defines constants used for Batch API helper functions."""

from __future__ import absolute_import

SAGEMAKER_TRAINING = "SAGEMAKER_TRAINING"
DEFAULT_ATTEMPT_DURATION_IN_SECONDS = 86400 # 1 day in seconds.
DEFAULT_TIMEOUT = {"attemptDurationSeconds": DEFAULT_ATTEMPT_DURATION_IN_SECONDS}
POLL_IN_SECONDS = 5
JOB_STATUS_RUNNING = "RUNNING"
JOB_STATUS_COMPLETED = "SUCCEEDED"
JOB_STATUS_FAILED = "FAILED"
DEFAULT_SAGEMAKER_TRAINING_RETRY_CONFIG = {
    "attempts": 1,
    "evaluateOnExit": [
        {
            "action": "RETRY",
            "onStatusReason": "Received status from SageMaker:InternalServerError: "
            "We encountered an internal error. Please try again.",
        },
        {"action": "EXIT", "onStatusReason": "*"},
    ],
}
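Callers can swap this default via the retry_config argument of submit_service_job in batch_api_helper.py. A hedged sketch of an override that retries the same InternalServerError status reason up to three times:

# Hypothetical override, passed as submit_service_job(..., retry_config=custom_retry_config).
custom_retry_config = {
    "attempts": 3,
    "evaluateOnExit": DEFAULT_SAGEMAKER_TRAINING_RETRY_CONFIG["evaluateOnExit"],
}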
52 changes: 52 additions & 0 deletions sagemaker-train/src/sagemaker/train/aws_batch/exception.py
@@ -0,0 +1,52 @@
# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"). You
# may not use this file except in compliance with the License. A copy of
# the License is located at
#
# http://aws.amazon.com/apache2.0/
#
# or in the "license" file accompanying this file. This file is
# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
# ANY KIND, either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
"""The file Defines customized exception for Batch queueing"""
from __future__ import absolute_import


class NoTrainingJob(Exception):
    """Define NoTrainingJob Exception.

    It means no Training job has been created by AWS Batch service.
    """

    def __init__(self, value):
        super().__init__(value)
        self.value = value

    def __str__(self):
        """Convert Exception to string.

        Returns: a String containing exception error messages.

        """
        return repr(self.value)


class MissingRequiredArgument(Exception):
    """Define MissingRequiredArgument exception.

    It means some required arguments are missing.
    """

    def __init__(self, value):
        super().__init__(value)
        self.value = value

    def __str__(self):
        """Convert Exception to string.

        Returns: a String containing exception error messages.

        """
        return repr(self.value)
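A brief illustration of how these exceptions are meant to be consumed (the raising code here is hypothetical; the queue wrappers would raise NoTrainingJob before Batch has created the underlying Training job):

try:
    raise NoTrainingJob("AWS Batch has not created a Training job yet")
except NoTrainingJob as err:
    print(err)  # prints the repr of the message via __str__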