**Developer-Instructions.md** (35 additions, 4 deletions)
@@ -18,7 +18,7 @@ Save your access token in a secure location.

The Retail Demo Store provides several options for managing deployments. Here are the most common ones:

- ### Deploy via an S3 Staging Bucket
+ ### Option 3.1 Deploy via an S3 Staging Bucket

If you want to modify deployment templates and manage the whole deployment process yourself, you will need to configure an S3 bucket for staging Retail Demo Store deployment templates and resources prior to deployment in your own AWS account. This bucket must be in the region in which you plan to deploy.
@@ -69,12 +69,43 @@ The [stage.sh](stage.sh) script at the root of the repository must be used to up

Example of how to stage your project to a custom bucket and path (note the path is optional but, if specified, must end with '/'):

```bash
- ./stage.sh mycustombucket path/
+ ./stage.sh MY_CUSTOM_BUCKET S3_PATH/
```
- The stage script will output a path to your master deployment CloudFormation template. You can use this link to your S3 bucket to start a new deployment via the CloudFormation console in your AWS Console.
+ The stage script will output a path to your master deployment CloudFormation template. You can use this link to your S3 bucket to start a new deployment via the CloudFormation console in your AWS Console, or use the command line below (replace the REGION, MY_CUSTOM_BUCKET, and S3_PATH values).

- ### Deploy Infrastructure from the Main Repo, Deploy Application and Services via GitHub
+ ### Option 3.2 Deploy Infrastructure from the Main Repo, Deploy Application and Services via GitHub

If you only want to modify the web user interface, or the Retail Demo Store backend services, you can deploy Retail Demo Store using the options below, and issue commits in your own fork via GitHub to trigger a re-deploy. This will allow you to push changes to the Retail Demo Store services and web user interface using a CodeDeploy pipeline.
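The "command line below" mentioned in the staging hunk is not included in this excerpt. As a hypothetical sketch only (the stack name and template path are placeholders, not taken from the diff), deploying a staged template with the AWS CLI could look like:

```bash
# Hypothetical sketch; not the command from the pull request itself.
# Replace REGION, MY_CUSTOM_BUCKET, and S3_PATH with your staged values,
# and adjust the template key to match where stage.sh uploaded it.
aws cloudformation create-stack \
  --stack-name retaildemostore \
  --region REGION \
  --template-url https://MY_CUSTOM_BUCKET.s3.REGION.amazonaws.com/S3_PATH/template.yaml \
  --capabilities CAPABILITY_IAM CAPABILITY_NAMED_IAM
```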
**aws/cloudformation-templates/template.yaml** (32 additions, 0 deletions)
@@ -771,3 +771,35 @@ Outputs:

  OffersServiceUrl:
    Description: Offers service load balancer URL.
    Value: !GetAtt Services.Outputs.OffersServiceUrl

+  ExportEnvVarScript:
+    Description: A script to export all required environment variables for running docker-compose locally but connecting to cloud resources. Replace the existing .env file with this output.
**src/README.md** (11 additions, 0 deletions)
@@ -10,6 +10,17 @@ Besides cloning this repository to your local system, you also need to have the

Docker Compose will load the [.env](.env) file to resolve environment variables referenced in the [docker-compose.yml](./docker-compose.yml) file. You can copy the [.env.template](.env.template) file to [.env](.env) as a starting point. This is where you can customize variables to match your desired configuration.
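For readers unfamiliar with the mechanism, a `.env` file is plain `KEY=VALUE` pairs that Docker Compose substitutes into `docker-compose.yml`; the same file can be loaded into a shell. The variable names below are hypothetical illustrations, not actual Retail Demo Store settings:

```shell
# Illustrative sketch only: hypothetical variable names, not the real .env keys.
cat > /tmp/demo.env <<'EOF'
PRODUCTS_SERVICE_HOST=localhost
PRODUCTS_SERVICE_PORT=8001
EOF

set -a              # auto-export every variable assigned while this is active
. /tmp/demo.env
set +a

# The variables are now available for substitution, e.g. building a URL:
echo "http://${PRODUCTS_SERVICE_HOST}:${PRODUCTS_SERVICE_PORT}"
```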
+ You can find the common environment variables from your deployed stack in the CloudFormation output named `ExportEnvVarScript`. Use the AWS CLI to get the output in the proper format. Then you can copy and override variables for each service in your [.env](.env) file.
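The exact CLI command does not appear in this excerpt; a hedged sketch using the standard AWS CLI (YOUR_STACK_NAME is a placeholder for your deployed stack's name) would be:

```bash
# Hypothetical sketch; not confirmed by the diff shown above.
# Fetches the ExportEnvVarScript output value and writes it to .env.
aws cloudformation describe-stacks \
  --stack-name YOUR_STACK_NAME \
  --query "Stacks[0].Outputs[?OutputKey=='ExportEnvVarScript'].OutputValue" \
  --output text > .env
```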
### AWS credentials
Some services, such as the [products](./products) and [recommendations](./recommendations) services, need to access AWS services running in your AWS account from your local machine. Given the differences between these container setups, different approaches are needed to pass in the AWS credentials needed to make these connections. For example, for the recommendations service we can map your local `~/.aws` configuration directory into the container's `/root` directory so the AWS SDK in the container can pick up the credentials it needs. Alternatively, since the products service is packaged from a [scratch image](https://hub.docker.com/_/scratch), credentials must be passed using the `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, and `AWS_SESSION_TOKEN` environment variables. In this case, rather than setting these variables in `.env` and risk exposing these values, consider setting these three variables in your shell environment. The following command can be used to obtain a session token which can be used to set your environment variables in your shell.
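The command itself is cut off in this excerpt. A hypothetical equivalent, assuming the AWS CLI and `jq` are available (not confirmed by the source), is:

```bash
# Hypothetical sketch; the document's actual command is not shown here.
# Prints export statements for the three credential variables from a
# temporary session token; eval the output or paste it into your shell.
aws sts get-session-token --output json | jq -r '.Credentials
  | "export AWS_ACCESS_KEY_ID=\(.AccessKeyId)",
    "export AWS_SECRET_ACCESS_KEY=\(.SecretAccessKey)",
    "export AWS_SESSION_TOKEN=\(.SessionToken)"'
```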