content/authors.en.md (4 additions & 0 deletions)
@@ -14,6 +14,10 @@ weight: 100
 1. Daniel Yoder ([danielsyoder](https://github.com/danielsyoder)) - The brains behind amazon-dynamodb-labs.com and the co-creator of the design scenarios
 
 ### 2025 additions
+
+zETL Workshop update with OS pipeline changes (October 2025):
+
+1. John Terhune ([@terhunej](https://github.com/terhunej)) - Primary author
content/change-data-capture/overview/create-tables.en.md (6 additions & 7 deletions)
@@ -9,12 +9,12 @@ In this section you create the DynamoDB tables you will use during the labs for
 In the commands below, the **create-table** AWS CLI command is used to create two new tables called Orders and OrdersHistory.
 
-It will create the Orders table in provisioned capacity mode to have 5 read capacity units (RCU), 5 write capacity units (WCU) and a partition key named `id`.
+It will create the Orders table in on-demand capacity mode with a partition key named `id`.
 
-It will also create the OrdersHistory table in provisioned capacity mode to have 5 RCU, 5 WCU, a partition key named `pk` and a sort key named `sk`.
+It will also create the OrdersHistory table in on-demand capacity mode with a partition key named `pk` and a sort key named `sk`.
 
 * Copy the **create-table** commands below and paste them into your command terminal.
-* Execute the commands to create two tables named Orders and OrdersHistory.
+* Execute the commands to create two tables named `Orders` and `OrdersHistory`.
 
 ```bash
 aws dynamodb create-table \
@@ -23,8 +23,7 @@ aws dynamodb create-table \
     AttributeName=id,AttributeType=S \
   --key-schema \
     AttributeName=id,KeyType=HASH \
-  --provisioned-throughput \
-    ReadCapacityUnits=5,WriteCapacityUnits=5 \
+  --billing-mode PAY_PER_REQUEST \
   --query "TableDescription.TableStatus"
 
 aws dynamodb create-table \
@@ -35,9 +34,9 @@ aws dynamodb create-table \
   --key-schema \
     AttributeName=pk,KeyType=HASH \
     AttributeName=sk,KeyType=RANGE \
-  --provisioned-throughput \
-    ReadCapacityUnits=5,WriteCapacityUnits=5 \
+  --billing-mode PAY_PER_REQUEST \
   --query "TableDescription.TableStatus"
+
 ```
 
Run the command below to confirm that both tables have been created.
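The confirmation command itself lies outside this hunk. As a runnable sketch of the kind of check it performs, the snippet below verifies that both table names appear in the account's table list; the `list_tables` function is a stub standing in for the real `aws dynamodb list-tables` call, which needs configured AWS credentials:

```shell
# Stub standing in for:
#   aws dynamodb list-tables --query "TableNames" --output text
# (kept local so the check runs without AWS credentials)
list_tables() {
  echo "Orders OrdersHistory"
}

result=$(list_tables)
for t in Orders OrdersHistory; do
  case "$result" in
    *"$t"*) echo "$t: OK" ;;
    *)      echo "$t: MISSING" ;;
  esac
done
```

With real credentials you would drop the stub and call the AWS CLI directly; `aws dynamodb wait table-exists --table-name Orders` is another common way to block until creation finishes.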
content/change-data-capture/setup/aws-ws-event.en.md (5 additions & 6 deletions)
@@ -7,30 +7,29 @@ chapter: true
 ### Login to AWS Workshop Studio Portal
 
-1. If you are provided a one-click join link, skip to step 3.
+1. If you are provided a one-click join link, use it and skip to step 3.
 
 2. Visit [https://catalog.us-east-1.prod.workshops.aws](https://catalog.us-east-1.prod.workshops.aws). If you attended any other workshop earlier on this portal, please log out first. Click on **Get Started** on the right-hand side of the window.
-
 
 
 3. On the next **Sign in** page, choose **Email One-Time Passcode (OTP)** to sign in to your workshop page.
-
 
 
 4. Provide an email address to receive a one-time passcode.
 5. Enter the passcode that you received at the provided email address, and click **Sign in**.
 
 6. Next, in the textbox, enter the event access code (e.g. abcd-012345-ef) that you received from the event facilitators. If you are provided a one-click join link, you will be redirected to the next step automatically.
 7. Select **I agree with the Terms and Conditions** at the bottom of the next page and click **Join event** to continue to the event dashboard.
 
 8. On the event dashboard, click on **Open AWS console** to federate into the AWS Management Console in a new tab. On the same page, click **Get started** to open the workshop instructions.
 9. In addition to the AWS console, you should open your Visual Studio Code server by clicking the `VSCodeServerURL` parameter, available from the "Event Outputs" section. When prompted for a password, use the value from `VSCodeServerPassword`.
content/change-data-capture/setup/index.en.md (1 addition & 1 deletion)
@@ -14,7 +14,7 @@ To run this lab, you will need an AWS account, and a user identity with access t
 * Amazon Kinesis
 * AWS Lambda
 * Amazon Simple Queue Service
-* AWS Cloud9 Environment
+* Visual Studio Code
 
You can use your own account, or an account provided through Workshop Studio as part of an AWS organized workshop. Using an account provided by Workshop Studio is the easier path, as you will have full access to all AWS services, and the account will terminate automatically when the event is over.
content/change-data-capture/setup/user-account.en.md (14 additions & 23 deletions)
@@ -6,40 +6,31 @@ chapter: true
 ---
 
-::alert[Only complete this section if you are running the workshop on your own. If you are at an AWS hosted event (such as re\:Invent, Immersion Day, etc), go to :link[At an AWS hosted Event]{href="/event-driven-architecture/setup/start-here/aws-ws-event"}]
+::alert[These setup instructions are identical for LADV, LHOL, LBED, LMR, and LGME - all of which use the same Visual Studio Code template. Only complete this section once, and only if you're running it on your own account.]{type="warning"}
 
-## Create a Cloud9 Environment
+::alert[Only complete this section if you are running the workshop on your own. If you are at an AWS hosted event (such as re\:Invent, Immersion Day, etc), go to :link[At an AWS hosted Event]{href="/hands-on-labs/setup/aws-ws-event"}]
 
-To complete the steps in these labs, you need an IAM role that has the privileges to create, update and delete AWS Cloud9 environments, Lambda functions, DynamoDB tables, IAM roles, Kinesis Data Streams and DynamoDB Streams.
+## Launch the CloudFormation stack
+
+::alert[During the course of the lab, you will make DynamoDB tables that will incur a cost that could approach tens or hundreds of dollars per day. Ensure you delete the DynamoDB tables using the DynamoDB console, and make sure you delete the CloudFormation stack as soon as the lab is complete.]
 
-* Log into the AWS Management Console, go to the AWS Cloud9 service dashboard then select **Create environment**.
+1. **[Deprecated]** - Launch the CloudFormation template in US West 2 to deploy the resources in your account: [](https://console.aws.amazon.com/cloudformation/home?region=us-west-2#/stacks/new?stackName=DynamoDBID&templateURL=:param{key="design_patterns_s3_lab_yaml"})
+1. *Optionally, download [the YAML template](https://github.com/aws-samples/aws-dynamodb-examples/blob/master/workshops/modernizer/modernizer-db.yaml) from our GitHub repository and launch it your own way.*
 
-* Give your new environment a name - **DynamoDB Labs** then provide an optional description for the environment.
-* Select **t2.small** as your instance type, leave all other fields as the default values then select **Create**.
+1. In the Parameters section, note that **AllowedIP** contains a default IP address; if you want to access the instance via SSH, obtain your own public IP address. Ensure to add the `/32` network mask at the end. Do not modify any other parameter and click *Next*.
-* Wait for creation of your Cloud9 environment to complete then select **Open** to launch your Cloud9 environment.
+6. Scroll to the bottom and click *Next*, and then review the *Template* and *Parameters*. When you are ready to create the stack, scroll to the bottom, check the box acknowledging the creation of IAM resources, and click *Create stack*.
+The stack will create a Visual Studio Code EC2 instance, a role for the instance, and a role for the AWS Lambda function used later on in the lab. The CloudFormation template will create a set of folders that can be used to execute individually the lab modules presented in this guide.
 
-Start a command line terminal in Cloud9 and set up the `Region` and `Account ID` environment variables.
-
-```bash
-export REGION={your aws region} &&
-export ACCOUNT_ID={your aws account ID}
-```
-
-Install jq on your AWS Cloud9 environment using the command below.
-
-```bash
-sudo yum install jq -y
-```
 
 ::alert[*After completing the workshop, remember to complete the :link[Clean Up]{href="/change-data-capture/clean-up"} section to remove AWS resources that you no longer require.*]
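The **AllowedIP** parameter in the stack above expects a public IPv4 address with a `/32` mask appended. A minimal sketch of assembling that value (the address below is a documentation-range example, and the `checkip` lookup in the comment is one common option, not necessarily what the workshop assumes):

```shell
# Append the /32 network mask that the AllowedIP parameter expects.
make_allowed_ip() {
  echo "${1}/32"
}

# In practice you would first look up your real public address, e.g.:
#   MY_IP=$(curl -s https://checkip.amazonaws.com)
MY_IP="203.0.113.7"                  # example address for illustration
ALLOWED_IP=$(make_allowed_ip "$MY_IP")
echo "$ALLOWED_IP"                   # prints 203.0.113.7/32
```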
content/rdbms-migration/migration-chapter03.en.md (45 additions & 31 deletions)
@@ -10,47 +10,57 @@ The dataset has over 106K movies, ratings, votes, and cast/crew information.
 The CloudFormation template launched an EC2 Amazon Linux 2 instance with MySQL installed and running.
 It created a MySQL database called `imdb`, added 6 new tables (one for each IMDb dataset), downloaded the IMDb TSV files to MySQL server local directory, and loaded the file contents into the 6 tables.
-The CloudFormation template also configured a remote MySQL user based on input parameters for the template.
-To explore the dataset, follow the instructions below to log in to the EC2 server.
-
-1. Go to [EC2 console](https://console.aws.amazon.com/ec2/v2/home#Instances:instanceState=running).
-2. Select the MySQL Instance and click **Connect**.
-8. If you are completing this workshop at an AWS hosted event, go to [AWS CloudFormation Console](https://console.aws.amazon.com/cloudformation/home#/stacks?filteringStatus=active&filteringText=&viewNested=true&hideStacks=false) and select the stack named **ddb**. Go to the **Parameters** tab and copy the username and password listed next to **DbMasterUsername** and **DbMasterPassword**.
+On the event dashboard, click on **Open AWS console** to federate into the AWS Management Console in a new tab. On the same page, click **Get started** to open the workshop instructions.
+In addition to the AWS console, you should open your Visual Studio Code server by clicking the `VSCodeServerURL` parameter, available from the "Event Outputs" section. When prompted for a password, use the value from `VSCodeServerPassword`.
+If you are completing this workshop at an AWS hosted event, go to [AWS CloudFormation Console](https://console.aws.amazon.com/cloudformation/home#/stacks?filteringStatus=active&filteringText=&viewNested=true&hideStacks=false) and select the stack named **ddb**. Go to the **Parameters** tab and copy the username and password listed next to **DbMasterUsername** and **DbMasterPassword**.
 
 ::alert[_If you are completing this workshop in your AWS account copy the DbMasterUsername and DbMasterPassword from the CloudFormation stack used to configure the MySQL environment._]
-10. Congratulations! You are now connected to a self-managed MySQL source database on EC2. In the following steps, we will explore the database and tables hosting IMDb datasets.
+Congratulations! You are now connected to a self-managed MySQL source database on EC2. In the following steps, we will explore the database and tables hosting IMDb datasets.
-12. We will create a denormalized view with 1:1 static information and get it ready for migration to an Amazon DynamoDB table. For now, go ahead and copy the code below and paste it into the MySQL command line.
+We will create a denormalized view with 1:1 static information and get it ready for migration to an Amazon DynamoDB table. For now, go ahead and copy the code below and paste it into the MySQL command line.
 
 We will discuss the details around the target data model in the next chapter.
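The connection step this chapter refers to presumably uses the stock `mysql` client with the stack's master credentials; a sketch of assembling that invocation (the username value below is illustrative, not the workshop's actual `DbMasterUsername`):

```shell
# Build the mysql client invocation from the stack's DbMasterUsername parameter.
# The password is supplied interactively via -p; `imdb` is the database created
# by the CloudFormation template.
DB_USER="dbadmin"                    # illustrative; substitute your DbMasterUsername value
CONNECT_CMD="mysql -u ${DB_USER} -p imdb"
echo "$CONNECT_CMD"                  # prints: mysql -u dbadmin -p imdb
```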