Creating a bash script file to create an S3 bucket
I wrote these commands to create an S3 bucket:

```shell
bucketname=test1234
AWS_ACCESS_KEY_ID=*** AWS_SECRET_ACCESS_KEY=*** REGION=us-east-1 aws s3 mb "s3://$bucketname"
```

This successfully creates a bucket, but when I copy it into a file, pass the bucket name as an argument, and run the script file, I get an error.

Script in the file (createbucket.txt):

```shell
bucketname=$1
AWS_ACCESS_KEY_ID=*** AWS_SECRET_ACCESS_KEY=*** REGION=us-east-1 aws s3 mb "s3://$bucketname"
```

Command used: `./createbucket.txt buckettest1234`

Error:

```
Parameter validation failed:ettest1234
": Bucket name must match the regex "^[a-zA-Z0-9.-_]{1,255}$"
```

It even drops the first four letters of the bucket name for some reason.
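One way to check what the script actually receives (a debugging sketch, not part of the original question; truncated error output like this is often a symptom of a stray carriage return from Windows-style line endings) is to print the argument with `printf %q`:

```shell
#!/usr/bin/env bash
# Print the first argument with %q so invisible characters become
# visible; a trailing carriage return would show up as $'buckettest1234\r'.
printf 'argument: %q\n' "$1"
```

Running `cat -A createbucket.txt` (GNU coreutils) likewise shows `^M` at the end of each line if the file was saved with CRLF line endings.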
bash amazon-s3 aws-cli
Use quotes around variables? Also, you might have to escape special chars.
– Mike Q, Nov 26 '18 at 23:45
I'd also put single quotes around the secrets and put each variable on its own line; it could be that bash is trying to interpret something inside of them. Like: `AWS_ACCESS_KEY_ID='***'` on one line, `AWS_SECRET_ACCESS_KEY='***'` on another, `REGION='us-east-1'` on another, and `aws s3 mb "s3://$bucketname"` on the last line.
– MatrixManAtYrService, Nov 27 '18 at 0:12
Why assign `bucketname=$1`? Just use `aws s3 mb "s3://$1"`.
– John Rotenstein, Nov 27 '18 at 2:09
Thank you @MatrixManAtYrService and Mike Q, your recommendations worked. Could you please add your comment as an answer so that I can upvote it? John, I was just trying something out and forgot it is the same. Thank you for your help.
– Digvijay Sawant, Nov 27 '18 at 17:54
asked Nov 26 '18 at 22:38 by Digvijay Sawant
1 Answer
The error message you're getting indicates that something is happening to the bucket name before it makes it to the `aws` command. You're expecting `buckettest1234` but you're getting something like `ettest1234`.

To make it easier to see why this is happening, try wrapping your secrets in single quotes and your variable references in double quotes, initializing your variables on their own lines, and printing the intermediate value, like so:

createbucket.sh:
```shell
#!/usr/bin/env bash
# Export the variables so the aws child process inherits them;
# plain assignments in a script are not passed to child processes.
export AWS_ACCESS_KEY_ID='***'
export AWS_SECRET_ACCESS_KEY='***'
export AWS_DEFAULT_REGION='us-east-1'  # the AWS CLI reads AWS_DEFAULT_REGION, not REGION
echo "bucket: s3://$1"                 # print the intermediate value for debugging
aws s3 mb "s3://$1"
aws s3 cp "s3://frombucket/testfile.txt" "s3://$1/testfile.txt"
```
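As an aside: variables assigned in a script reach child processes such as the aws CLI only if they are exported (or written as one-command prefixes, as in the one-liner from the question). A minimal sketch of the difference, with illustrative variable names:

```shell
#!/usr/bin/env bash
# Only exported variables appear in the environment of child processes.
UNEXPORTED='value'
export EXPORTED='value'

# A child shell sees only the exported one.
bash -c 'echo "UNEXPORTED=${UNEXPORTED:-<unset>}"'   # prints UNEXPORTED=<unset>
bash -c 'echo "EXPORTED=${EXPORTED:-<unset>}"'       # prints EXPORTED=value
```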
Follow up question: I wrote the following two commands:

```shell
AWS_ACCESS_KEY_ID='***' AWS_SECRET_ACCESS_KEY='***' REGION='us-east-1' aws s3 mb "s3://$1"
AWS_ACCESS_KEY_ID='***' AWS_SECRET_ACCESS_KEY='***' aws s3 cp "s3://frombucket/testfile.txt" "s3://$1/testfile.txt"
```

The second command copies a file from one bucket to another, and it fails with the same error. It fails to even create a bucket. Both lines work when used individually but not in a bash file. I also tried everything on separate lines, but it did not work. Any idea why that might happen?
– Digvijay Sawant, Nov 28 '18 at 0:12
Create a script with the line `ps -ef | grep $$ | grep -v grep` and run it. Does the output contain `bash ./scriptname.sh` or `sh ./scriptname.sh` or something else? Also, what does your command-line environment output when you enter `echo $0`? I'm wondering if you're using the same shell in your command-line environment that you are in your script. If the above two steps don't both say `bash` somewhere, then that may be related to your problem.
– MatrixManAtYrService, Nov 28 '18 at 20:31
Also, try replacing `aws s3 mb` with `echo` and see if the string it prints out is mangled in some way. I don't think this has anything to do with the `aws` utility; something is amiss with how your shell is handling the parameter.
– MatrixManAtYrService, Nov 28 '18 at 20:40
This is the output from `ps -ef | grep $$ | grep -v grep`:

```
ubuntu 7095 7079 0 22:19 pts/1 00:00:00 -bash
ubuntu 7096 7095 0 22:19 pts/1 00:00:00 ps -ef
```

It outputs `-bash` when I enter `echo $0`. When I replace `aws s3 mb` with `echo`, everything works fine and the output of the file name is as I enter it. I really appreciate your help with this. I will keep trying. If anything occurs to you that I might be missing, please do let me know. :-)
– Digvijay Sawant, Nov 28 '18 at 22:24
edited Nov 28 '18 at 20:23 · answered Nov 27 '18 at 19:22 by MatrixManAtYrService
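The echo substitution suggested in the comments above can be sketched as follows (hypothetical filename checkname.sh; it prints the URL instead of calling aws, so any mangling by the shell becomes visible):

```shell
#!/usr/bin/env bash
# Same parameter handling as the real script, but echo instead of aws:
# if this prints the expected URL, the shell is passing $1 through intact.
echo "s3://$1"
```

Running `./checkname.sh buckettest1234` should print `s3://buckettest1234`.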