Commit 7d933a1

Merge pull request #135 from tebanieo/cloud9-update

Cloud9 update - Part 2

2 parents 0ce9c5e + 8f5af18

12 files changed (+240 −199 lines)
content/authors.en.md

Lines changed: 4 additions & 0 deletions
````diff
@@ -14,6 +14,10 @@ weight: 100
 1. Daniel Yoder ([danielsyoder](https://github.com/danielsyoder)) - The brains behind amazon-dynamodb-labs.com and the co-creator of the design scenarios
 
 ### 2025 additions
+zETL Workshop update with OS pipeline changes (October 2025):
+1. John Terhune ([@terhunej](https://github.com/terhunej)) - Primary author
+2. Esteban Serna ([@tebanieo](https://github.com/tebanieo)) - Editor, tech reviewer, and merger
+
 Removing Cloud9 due to End of Life from all the workshops (October 2025):
 1. Esteban Serna ([@tebanieo](https://github.com/tebanieo)) - Primary author, and merger
 
````
content/change-data-capture/overview/create-tables.en.md

Lines changed: 6 additions & 7 deletions
````diff
@@ -9,12 +9,12 @@ In this section you create the DynamoDB tables you will use during the labs for
 
 In the commands below, the **create-table** AWS CLI command is used to create two new tables called Orders and OrdersHistory.
 
-It will create the Orders table in provisioned capacity mode to have 5 read capacity units (RCU), 5 write capacity uints (WCU) and a partition key named `id`.
+It will create the Orders table in on-demand capacity mode with a partition key named `id`.
 
-It will also create the OrdersHistory table in provisioned capacity mode to have 5 RCU, 5 WCU, a partition key named `pk` and a sort key named `sk`.
+It will also create the OrdersHistory table in on-demand capacity mode with a partition key named `pk` and a sort key named `sk`.
 
 * Copy the **create-table** commands below and paste them into your command terminal.
-* Execute the commands to to create two tables named Orders and OrdersHistory.
+* Execute the commands to create two tables named `Orders` and `OrdersHistory`.
 
 ```bash
 aws dynamodb create-table \
@@ -23,8 +23,7 @@ aws dynamodb create-table \
     AttributeName=id,AttributeType=S \
   --key-schema \
     AttributeName=id,KeyType=HASH \
-  --provisioned-throughput \
-    ReadCapacityUnits=5,WriteCapacityUnits=5 \
+  --billing-mode PAY_PER_REQUEST \
   --query "TableDescription.TableStatus"
 
 aws dynamodb create-table \
@@ -35,9 +34,9 @@ aws dynamodb create-table \
   --key-schema \
     AttributeName=pk,KeyType=HASH \
     AttributeName=sk,KeyType=RANGE \
-  --provisioned-throughput \
-    ReadCapacityUnits=5,WriteCapacityUnits=5 \
+  --billing-mode PAY_PER_REQUEST \
   --query "TableDescription.TableStatus"
+
 ```
 
 Run the command below to confirm that both tables have been created.
````
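The confirmation command referenced in the final context line falls outside this hunk. As a sketch, two standard AWS CLI calls can verify the result (table names taken from the commands above):

```bash
# Block until both tables report ACTIVE
aws dynamodb wait table-exists --table-name Orders
aws dynamodb wait table-exists --table-name OrdersHistory

# Spot-check that the new on-demand billing mode took effect
aws dynamodb describe-table --table-name Orders \
  --query "Table.BillingModeSummary.BillingMode"
```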

content/change-data-capture/setup/aws-ws-event.en.md

Lines changed: 5 additions & 6 deletions
````diff
@@ -7,30 +7,29 @@ chapter: true
 
 ### Login to AWS Workshop Studio Portal
 
-1. If you are provided a one-click join link, skip to step 3.
+1. If you are provided a one-click join link, use it and skip to step 3.
 
 2. Visit [https://catalog.us-east-1.prod.workshops.aws](https://catalog.us-east-1.prod.workshops.aws). If you attended any other workshop earlier on this portal, please logout first. Click on **Get Started** on the right hand side of the window.
-
 ![Workshop Studio Landing Page](/static/images/aws-ws-event1.png)
 
 3. On the next, **Sign in** page, choose **Email One-Time Passcode (OTP)** to sign in to your workshop page.
-
 ![Sign in page](/static/images/aws-ws-event2.png)
 
 4. Provide an email address to receive a one-time passcode.
-
 ![Email address input](/static/images/aws-ws-event3.png)
 
 5. Enter the passcode that you received in the provided email address, and click **Sign in**.
 
 6. Next, in the textbox, enter the event access code (eg: abcd-012345-ef) that you received from the event facilitators. If you are provided a one-click join link, you will be redirected to the next step automatically.
-
 ![Event access code](/static/images/aws-ws-event4.png)
 
 7. Select on **I agree with the Terms and Conditions** on the bottom of the next page and click **Join event** to continue to the event dashboard.
 
 8. On the event dashboard, click on **Open AWS console** to federate into AWS Management Console in a new tab. On the same page, click **Get started** to open the workshop instructions.
+![Event dashboard](/static/images/common/workshop-studio-01.png)
+
+9. In addition to the AWS console, open your Visual Studio Code server by clicking the `VSCodeServerURL` link in the "Event Outputs" section. When prompted for a password, use the value from `VSCodeServerPassword`.
 
-![Event dashboard](/static/images/aws-ws-event5.png)
+![Event dashboard](/static/images/common/workshop-studio-02.png)
 
 Now that you are set up, continue on to: :link[2. Scenario Overview]{href="/change-data-capture/overview"}.
````
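If you deployed the template into your own account instead of going through Workshop Studio, the same two values can typically be read from the stack's CloudFormation outputs. A sketch, with a hypothetical stack name:

```bash
# Hypothetical stack name -- substitute the stack you actually deployed
aws cloudformation describe-stacks --stack-name workshop-stack \
  --query "Stacks[0].Outputs[?contains(OutputKey, 'VSCodeServer')]"
```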

content/change-data-capture/setup/index.en.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -14,7 +14,7 @@ To run this lab, you will need an AWS account, and a user identity with access t
 * Amazon Kinesis
 * AWS Lambda
 * Amazon Simple Queue Service
-* AWS Cloud9 Environment
+* Visual Studio Code
 
 You can use your own account, or an account provided through Workshop Studio as part of an AWS organized workshop. Using an account provided by Workshop Studio is the easier path, as you will have full access to all AWS services, and the account will terminate automatically when the event is over.
 
````
content/change-data-capture/setup/user-account.en.md

Lines changed: 14 additions & 23 deletions
````diff
@@ -6,40 +6,31 @@ chapter: true
 ---
 
 
-::alert[Only complete this section if you are running the workshop on your own. If you are at an AWS hosted event (such as re\:Invent, Immersion Day, etc), go to :link[At an AWS hosted Event]{href="/event-driven-architecture/setup/start-here/aws-ws-event"}]
+::alert[These setup instructions are identical for LADV, LHOL, LBED, LMR, and LGME - all of which use the same Visual Studio Code template. Only complete this section once, and only if you're running it on your own account.]{type="warning"}
 
-## Create a Cloud9 Environment
+::alert[Only complete this section if you are running the workshop on your own. If you are at an AWS hosted event (such as re\:Invent, Immersion Day, etc), go to :link[At an AWS hosted Event]{href="/hands-on-labs/setup/aws-ws-event"}]
 
-To complete the steps in these labs, you need an IAM role that has the privileges to create, update and delete AWS Cloud9 environments, Lambda functions, DynamoDB tables, IAM roles, Kinesis Data Streams and DynamoDB Streams
+## Launch the CloudFormation stack
+::alert[During the course of the lab, you will make DynamoDB tables that will incur a cost that could approach tens or hundreds of dollars per day. Ensure you delete the DynamoDB tables using the DynamoDB console, and make sure you delete the CloudFormation stack as soon as the lab is complete.]
 
-* Log into the AWS Management Console, go to the AWS Cloud9 service dashboard then select **Create environment**.
+1. **[Deprecated]** - Launch the CloudFormation template in US West 2 to deploy the resources in your account: [![CloudFormation](/static/images/cloudformation-launch-stack.png)](https://console.aws.amazon.com/cloudformation/home?region=us-west-2#/stacks/new?stackName=DynamoDBID&templateURL=:param{key="design_patterns_s3_lab_yaml"})
 
-![Create Cloud9 environment](/static/images/change-data-capture/setup/cloud9-create-env.png)
+1. *Optionally, download [the YAML template](https://github.com/aws-samples/aws-dynamodb-examples/blob/master/workshops/modernizer/modernizer-db.yaml) from our GitHub repository and launch it your own way*
 
-* Give your new environment a name - **DynamoDB Labs** then provide an optional description for the environment.
+1. Click *Next* on the first dialog.
 
-![Name Cloud9 environment](/static/images/change-data-capture/setup/cloud9-name-env.png)
+1. Provide a CloudFormation stack name.
 
-* Select **t2.small** as your instance type, leave all other fields as the default values then select **Create**.
+1. In the Parameters section, note that **AllowedIP** contains a default IP address; if you want to access the instance via SSH, supply your own public IP address, making sure to add the `/32` network mask at the end. Do not modify any other parameter, and click *Next*.
 
-![Select Cloud9 instance](/static/images/change-data-capture/setup/cloud9-select-ec2.png)
+![CloudFormation parameters](/static/images/common/on-your-own-cf-01.png)
 
-* Wait for creation of your Cloud9 environment to complete then select **Open** to launch your Cloud9 evironment.
+6. Scroll to the bottom and click *Next*, and then review the *Template* and *Parameters*. When you are ready to create the stack, scroll to the bottom, check the box acknowledging the creation of IAM resources, and click *Create stack*.
 
-![Launch Cloud9 environment](/static/images/change-data-capture/setup/cloud9-launch-env.png)
+![CloudFormation parameters](/static/images/common/on-your-own-cf-02.png)
+
+The stack will create a Visual Studio Code EC2 instance, a role for the instance, and a role for the AWS Lambda function used later on in the lab. The CloudFormation template will create a set of folders that can be used to run the lab modules in this guide individually.
 
-Start a command line terminal in Cloud9 and set up the `Region` and `Account ID` environment variables.
-
-```bash
-export REGION={your aws region} &&
-export ACCOUNT_ID={your aws account ID}
-```
-
-Install jq on your AWS Cloud9 environment using the command below.
-
-```bash
-sudo yum install jq -y
-```
 
 ::alert[*After completing the workshop, remember to complete the :link[Clean Up]{href="/change-data-capture/clean-up"} section to remove AWS resources that you no longer require.*]
 
````
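For readers who prefer the CLI to the console walkthrough above, a minimal sketch of the same launch, assuming the template was downloaded as `modernizer-db.yaml` and that `203.0.113.10` stands in for your public IP:

```bash
# Launch the stack from the CLI (file name and IP are placeholders)
aws cloudformation create-stack \
  --stack-name DynamoDBID \
  --region us-west-2 \
  --template-body file://modernizer-db.yaml \
  --parameters ParameterKey=AllowedIP,ParameterValue=203.0.113.10/32 \
  --capabilities CAPABILITY_IAM CAPABILITY_NAMED_IAM

# Wait for CREATE_COMPLETE before opening the VS Code server URL
aws cloudformation wait stack-create-complete \
  --stack-name DynamoDBID --region us-west-2
```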

content/rdbms-migration/migration-chapter03.en.md

Lines changed: 45 additions & 31 deletions
````diff
@@ -10,47 +10,57 @@ The dataset has over 106K movies, ratings, votes, and cast/crew information.
 
 The CloudFormation template launched an EC2 Amazon Linux 2 instance with MySQL installed and running.
 It created a MySQL database called `imdb`, added 6 new tables (one for each IMDb dataset), downloaded the IMDb TSV files to MySQL server local directory, and loaded the file contents into the 6 tables.
-The CloudFormation template also configured a remote MySQL user based on input parameters for the template.
-To explore the dataset, follow the instructions below to log in to the EC2 server.
-
-1. Go to [EC2 console](https://console.aws.amazon.com/ec2/v2/home#Instances:instanceState=running).
-2. Select the MySQL Instance and click **Connect**.
-![Final Deployment Architecture](/static/images/migration9.jpg)
-3. Make sure `ec2-user` is in the **User name** field. Click **Connect**.
-![Final Deployment Architecture](/static/images/migration10.jpg)
-4. Elevate your privileges using the `sudo` command.
-```bash
-sudo su
-```
-![Final Deployment Architecture](/static/images/migration11.jpg)
-5. Go to the file directory.
-```bash
-cd /var/lib/mysql-files/
-ls -lrt
-```
-6. You can see all the 6 files copied from the IMDB dataset to the local EC2 directory.
-![Final Deployment Architecture](/static/images/migration12.jpg)
-7. Feel free to explore the files.
-8. If you are completing this workshop at an AWS hosted event, go to [AWS CloudFormation Console](https://console.aws.amazon.com/cloudformation/home#/stacks?filteringStatus=active&filteringText=&viewNested=true&hideStacks=false) and select the stack named **ddb**. Go to the **Parameters** tab and copy the username and password listed next to **DbMasterUsername** and **DbMasterPassword**.
+
+On the event dashboard, click on **Open AWS console** to federate into AWS Management Console in a new tab. On the same page, click **Get started** to open the workshop instructions.
+
+![Event dashboard](/static/images/common/workshop-studio-01.png)
+
+In addition to the AWS console, open your Visual Studio Code server by clicking the `VSCodeServerURL` link in the "Event Outputs" section. When prompted for a password, use the value from `VSCodeServerPassword`.
+
+![Event dashboard](/static/images/common/workshop-studio-02.png)
+
+During the first 60 seconds, the environment will automatically update extensions and plugins. Any startup notification can be safely dismissed.
+
+![VS Code Setup](/static/images/common/common-vs-code-01.png)
+
+If a terminal is not available at the bottom left side of your screen, please open a new one as the following picture indicates.
+
+![VS Code Setup](/static/images/common/common-vs-code-02.png)
+
+In the terminal, type:
+
+```bash
+cd LDMS
+
+```
+
+If you are completing this workshop at an AWS hosted event, go to [AWS CloudFormation Console](https://console.aws.amazon.com/cloudformation/home#/stacks?filteringStatus=active&filteringText=&viewNested=true&hideStacks=false) and select the stack named **ddb**. Go to the **Parameters** tab and copy the username and password listed next to **DbMasterUsername** and **DbMasterPassword**.
 
 ::alert[_If you are completing this workshop in your AWS account copy the DbMasterUsername and DbMasterPassword from the CloudFormation stack used to configure the MySQL environment._]
 
 ![Final Deployment Architecture](/static/images/migration13.jpg)
-9. Go back to EC2 Instance console and login to mysql.
+
+Go to the terminal and log in to MySQL.
 ```bash
-mysql -u DbMasterUsername -pDbMasterPassword
+mysql -u dbuser -p
 ```
-![Final Deployment Architecture](/static/images/migration14.jpg)
-10. Congratulations! You are now connected to a self-managed MySQL source database on EC2. In the following steps, we will explore the database and tables hosting IMDb datasets.
+
+![Final Deployment Architecture](/static/images/LDMS/mysql-connecting.png)
+
+Congratulations! You are now connected to a self-managed MySQL source database on EC2. In the following steps, we will explore the database and tables hosting IMDb datasets.
+
 ```bash
 use imdb;
 ```
-![Final Deployment Architecture](/static/images/migration15.jpg)
-11. List all the tables created by the CloudFormation stack.
+
+![Final Deployment Architecture](/static/images/LDMS/mysql-use-imdb.png)
+
+List all the tables created by the CloudFormation stack.
 ```bash
 show tables;
 ```
-![Final Deployment Architecture](/static/images/migration16.jpg)
+
+![Final Deployment Architecture](/static/images/LDMS/mysql-show-tables.png)
 
 For illustration purposes, below is a logical diagram represents relationship between various source tables hosting IMDb dataset.
````
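Before moving on, it can be worth sanity-checking the load. A sketch of two quick checks, using table names from the diagram description below (run at the same `mysql>` prompt):

```bash
-- Row counts for two of the six IMDb tables
SELECT COUNT(*) FROM title_basics;
SELECT COUNT(*) FROM title_principals;

-- Column layout of the main title table
DESCRIBE title_basics;
```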

````diff
@@ -60,10 +70,14 @@ For illustration purposes, below is a logical diagram represents relationship be
 - `title_principals` has cast and crew information. It has a 1\:many relationship with the `title_basics` table.
 - `title_crew` has writer and director information. It has a 1:1 relationship with the `title_basics` table.
 - `name_basics` has cast and crew details. Every member has unique `nconst` value assigned.
-![Final Deployment Architecture](/static/images/migration31.jpg)
 
-12. We will create a denormalized view with 1:1 static information and get it ready for migration to Amazon DynamoDB table. For now, go ahead and copy the code below and paste into the MySQL command line.
+
+![Final Deployment Architecture](/static/images/migration31.jpg)
+
+We will create a denormalized view with 1:1 static information and get it ready for migration to an Amazon DynamoDB table. For now, go ahead and copy the code below and paste it into the MySQL command line.
+
 We will discuss the details around the target data model in the next chapter.
+
 ```bash
 CREATE VIEW imdb.movies AS\
 SELECT tp.tconst,\
````
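The `CREATE VIEW` statement is cut off at the hunk boundary; the full text lives on the workshop page. Once it has been run, a hedged way to confirm the view and preview its output:

```bash
-- Confirm the view was registered
SHOW FULL TABLES IN imdb WHERE Table_type = 'VIEW';

-- Preview a few denormalized rows
SELECT * FROM imdb.movies LIMIT 5;
```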

content/relational-migration/data migration/index2.en.md

Lines changed: 3 additions & 3 deletions
````diff
@@ -11,23 +11,23 @@ set into S3. We can run this script in preview mode by using the "stdout" parame
 
 1. Run:
 ```bash
-python3 mysql_s3.py Customers stdout
+python mysql_s3.py Customers stdout
 ```
 You should see results in DynamoDB JSON format:
 
 ![mysql_s3.py output](/static/images/relational-migration/mysql_s3_output.png)
 
 2. Next, run it for our view:
 ```bash
-python3 mysql_s3.py vCustOrders stdout
+python mysql_s3.py vCustOrders stdout
 ```
 You should see similar output from the view results.
 
 The script can write these to S3 for us. We just need to omit the "stdout" command line parameter.
 
 3. Now, run the script without preview mode:
 ```bash
-python3 mysql_s3.py Customers
+python mysql_s3.py Customers
 ```
 You should see confirmation that objects have been written to S3:
 
````
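The destination bucket is not named in this hunk; a hedged way to confirm the upload, with a placeholder bucket name:

```bash
# Placeholder bucket -- substitute the bucket mysql_s3.py actually writes to
aws s3 ls s3://your-migration-bucket/ --recursive --human-readable
```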
