
Commit 844ffd0

Update 00-upload-training-data.md (#460)

Removed link to wikitext data location.

1 parent 04ecdb7
1 file changed: +2 -2 lines changed


content/09-ml-on-parallelcluster/00-upload-training-data.md

Lines changed: 2 additions & 2 deletions
@@ -12,7 +12,7 @@ In this step, you create an environment configuration script to train a Natural
 First, create an Amazon S3 bucket and upload the training data folder. This training folder will be accessed by the cluster worker nodes through FSx.
 
 1. Open a terminal in your AWS Cloud9 instance.
-2. Run the following commands to create a new Amazon S3 bucket. These commands also retrieve and store the [Wikitext 103 dataset](https://blog.einstein.ai/the-wikitext-long-term-dependency-language-modeling-dataset/)
+2. Run the following commands to create a new Amazon S3 bucket. These commands also retrieve and store the Wikitext 103 dataset
 
 ```bash
 # generate a unique postfix
@@ -21,7 +21,7 @@ echo "Your bucket name will be mlbucket-${BUCKET_POSTFIX}"
 aws s3 mb s3://mlbucket-${BUCKET_POSTFIX}
 
 # downloading data:
-export URL="https://s3.amazonaws.com/research.metamind.io/wikitext/wikitext-103-v1.zip"
+export URL="https://path/to/wikitext-103-v1.zip"
 export FILE="wikitext-103-v1.zip"
 wget $URL -O $FILE
 unzip $FILE
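The diff covers only the download; the file's surrounding prose says the training folder is then uploaded to the bucket so the cluster worker nodes can reach it through FSx. A minimal sketch of that upload step, assuming unzip produces a local wikitext-103 directory (a hypothetical path, not shown in the diff) and that BUCKET_POSTFIX is still set from the commands above:

```bash
# Upload the extracted dataset to the new bucket so FSx for Lustre
# can import it onto the cluster. The ./wikitext-103 source path is
# an assumption about what unzip produces, not confirmed by the diff.
aws s3 sync ./wikitext-103 s3://mlbucket-${BUCKET_POSTFIX}/wikitext-103/
```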
