Get size of a bucket by storage class in Google Cloud Storage
I would like to get the size of the bucket broken down by storage class. I've added lifecycle rules to the bucket that change the storage class of files based on their age.
I've used the commands below.

To get the total bucket size:

gsutil du -sh gs://[bucket-name]

To get object metadata:

gsutil ls -L gs://[bucket-name]

To set the lifecycle configuration:

gsutil lifecycle set life-cycle.json gs://[bucket-name]
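
For context, an age-based configuration of the kind described would live in life-cycle.json along these lines (the storage classes and age thresholds here are illustrative assumptions, not my actual rules):

```json
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "NEARLINE"},
      "condition": {"age": 30}
    },
    {
      "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
      "condition": {"age": 365}
    }
  ]
}
```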

Could anyone help me resolve this?
      google-cloud-platform google-cloud-storage
      asked Nov 23 at 6:27
      prasanna Kumar
          1 Answer
          Edit:



          I have filed a Feature Request for this on the Public Issue Tracker. In the meantime, the code below can be used.



          I believe there is no gsutil command that can show you the total size by storage class for a GCS bucket.
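
As a stopgap, the per-object listing can be post-processed in the shell. The sketch below assumes `gsutil ls -L` prints a "Storage class:" line followed by a "Content-Length:" line for each object, which is worth verifying against your gsutil version:

```shell
# Sum object sizes per storage class from `gsutil ls -L` output.
# The field positions below are assumptions about gsutil's listing format.
sum_by_class='
  /Storage class:/  { cls = $3 }
  /Content-Length:/ { sum[cls] += $2 }
  END { for (c in sum) printf "%s: %d bytes\n", c, sum[c] }'

# Usage (requires gsutil and a real bucket):
#   gsutil ls -L 'gs://[bucket-name]/**' | awk "$sum_by_class"

# Demonstration on canned listing output:
printf 'Storage class: NEARLINE\nContent-Length: 10\nStorage class: COLDLINE\nContent-Length: 5\n' \
  | awk "$sum_by_class"
```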



          However, using the Cloud Storage Client Libraries for Python, I made a script that does what you’re asking for:



from google.cloud import storage
import math

### SET THESE VARIABLES ###
PROJECT_ID = ""
CLOUD_STORAGE_BUCKET = ""
###########################

def _get_storage_client():
    return storage.Client(project=PROJECT_ID)

def convert_size(size_bytes):
    # Convert a byte count to a human-readable string.
    if size_bytes == 0:
        return "0 B"
    size_name = ("B", "KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB")
    i = int(math.floor(math.log(size_bytes, 1024)))
    p = math.pow(1024, i)
    s = round(size_bytes / p, 2)
    return "%s %s" % (s, size_name[i])

def size_by_class():
    client = _get_storage_client()
    bucket = client.bucket(CLOUD_STORAGE_BUCKET)
    blobs = bucket.list_blobs()

    size_multi_regional = size_regional = size_nearline = size_coldline = 0
    for blob in blobs:
        if blob.storage_class == "MULTI_REGIONAL":
            size_multi_regional += blob.size
        elif blob.storage_class == "REGIONAL":
            size_regional += blob.size
        elif blob.storage_class == "NEARLINE":
            size_nearline += blob.size
        elif blob.storage_class == "COLDLINE":
            size_coldline += blob.size

    print("MULTI_REGIONAL: " + convert_size(size_multi_regional) + "\n" +
          "REGIONAL: " + convert_size(size_regional) + "\n" +
          "NEARLINE: " + convert_size(size_nearline) + "\n" +
          "COLDLINE: " + convert_size(size_coldline))

if __name__ == '__main__':
    size_by_class()


          To run this program from the Google Cloud Shell, make sure you have previously installed the Client Library for Python with:



          pip install --upgrade google-cloud-storage


          And in order to provide authentication credentials to the application code, you must point the environment variable GOOGLE_APPLICATION_CREDENTIALS to the location of the JSON file that contains your service account key:



export GOOGLE_APPLICATION_CREDENTIALS="/home/user/Downloads/[FILE_NAME].json"


          Before running the script, set PROJECT_ID to the ID of your Project, and CLOUD_STORAGE_BUCKET to the name of your GCS Bucket.



          Run the script with python main.py. Output should be something like:



          MULTI_REGIONAL: 1.0 GB
          REGIONAL: 300 MB
          NEARLINE: 200 MB
          COLDLINE: 10 MB
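
One likely cause of the Regional and Multi-Regional totals coming out as 0 B is that blob.storage_class is whatever class the API reports, which the four hard-coded checks above won't always match (buckets can report classes such as STANDARD or DURABLE_REDUCED_AVAILABILITY). A variant that groups by the reported class dynamically sidesteps this; the Blob stand-in below is only for illustration, with the real client you would pass bucket.list_blobs():

```python
from collections import defaultdict, namedtuple

def sizes_by_class(blobs):
    """Sum object sizes, grouped by whatever storage class each blob reports."""
    totals = defaultdict(int)
    for blob in blobs:
        totals[blob.storage_class] += blob.size
    return dict(totals)

# Stand-in objects for illustration; with the real client you would call
# sizes_by_class(client.bucket(CLOUD_STORAGE_BUCKET).list_blobs()) instead.
Blob = namedtuple("Blob", ["storage_class", "size"])
demo = [Blob("STANDARD", 100), Blob("NEARLINE", 50), Blob("STANDARD", 25)]
print(sizes_by_class(demo))  # {'STANDARD': 125, 'NEARLINE': 50}
```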
          • Hi @Maxim, thank you for your response, but this doesn't solve my problem; I had already checked this approach. To clarify: my bucket contains Multi-Regional, Nearline and Coldline objects, and I would like the size of the files for every storage class present in the bucket, e.g. 50 GB in Multi-Regional, 20 GB in Nearline, 1 TB in Coldline.
            – prasanna Kumar
            Nov 23 at 12:31










          • I misunderstood your initial inquiry then. I updated my answer. I hope this is what you were looking for.
            – Maxim
            Nov 24 at 0:03










          • Hi @Maxim, this solves about 50% of my problem. I've run the code above and I get the Nearline and Coldline sizes as expected, but the Regional and Multi-Regional sizes are not coming through. Could you check on your end whether you can get all storage class sizes?
            – prasanna Kumar
            Nov 26 at 6:38










          • I filed a Feature Request on your behalf for this: issuetracker.google.com/119989059, as it is indeed something that can be useful when having multiple files with different storage classes. As for the issue you’re encountering, please verify that you have files in the GCS bucket that belong to the storage classes ‘REGIONAL’ and ‘MULTI_REGIONAL’. I’ve tested this on my end and it worked. Nonetheless, I will take another look at the code and will come back to you with an update by Wednesday at the latest.
            – Maxim
            Nov 26 at 7:09










          • I tested it; my bucket has three storage classes, namely Multi-Regional/Regional, Nearline and Coldline. Assume the total size of the bucket is 100 TB; the code reports Multi-Regional/Regional: 0 B, Nearline: 20 TB, Coldline: 70 TB, but the remaining size should fall under either Multi-Regional or Regional.
            – prasanna Kumar
            Nov 26 at 9:36
          edited Nov 26 at 20:07
          answered Nov 23 at 9:56
          Maxim

          1,477210




          1,477210












          • Hi @Maxima Thank you for your response , This one is not solved my problem . This one i've already checked .I'll brief it once again, in my bucket i've multi-region , nearline and codeline . I would like to get size of the files for all types of storage class exists in the bucket . If i've 50-GB in Multi-region , 20 GB in NearLINE , 1TB in codeline Like above i need response.
            – prasanna Kumar
            Nov 23 at 12:31










          • I misunderstood your initial inquiry then. I updated my answer. I hope this is what you were looking for.
            – Maxim
            Nov 24 at 0:03










          • Hi @Maxima, It's solve my 50% of my problem , I've run the above code but i'm getting only nearline and code line size as expected but region and multi region size are . not coming . Please check it from ur end. Will you able to get all storage class sizes .
            – prasanna Kumar
            Nov 26 at 6:38










          • I filed a Feature Request on your behalf for this: issuetracker.google.com/119989059. As it is indeed something that can be useful when having multiple files with different Storage Classes. As for the issue you’re encountering, please verify that you have files in the GCS bucket that belong to the Storage Class ‘REGIONAL’ and ‘MULTI_REGIONAL’. I’ve tested this on my end and it worked. Nonetheless, I will take another look at the code and will come back to you with an update by Wednesday the latest.
            – Maxim
            Nov 26 at 7:09










          • I tested that in my bucket have 3 storage classes namely multi-region/region , Nearline and codline . Assume total size of the bucket is 100TB but as per the code i'm getting as Multi-regional/regional : 0 B Neraline : 20 TB Codeline : 70 TB but remaining size should falls either in multi-regional or regional .
            – prasanna Kumar
            Nov 26 at 9:36


















          • Hi @Maxima Thank you for your response , This one is not solved my problem . This one i've already checked .I'll brief it once again, in my bucket i've multi-region , nearline and codeline . I would like to get size of the files for all types of storage class exists in the bucket . If i've 50-GB in Multi-region , 20 GB in NearLINE , 1TB in codeline Like above i need response.
            – prasanna Kumar
            Nov 23 at 12:31










          • I misunderstood your initial inquiry then. I updated my answer. I hope this is what you were looking for.
            – Maxim
            Nov 24 at 0:03










          • Hi @Maxima, It's solve my 50% of my problem , I've run the above code but i'm getting only nearline and code line size as expected but region and multi region size are . not coming . Please check it from ur end. Will you able to get all storage class sizes .
            – prasanna Kumar
            Nov 26 at 6:38










          • I filed a Feature Request on your behalf for this: issuetracker.google.com/119989059. As it is indeed something that can be useful when having multiple files with different Storage Classes. As for the issue you’re encountering, please verify that you have files in the GCS bucket that belong to the Storage Class ‘REGIONAL’ and ‘MULTI_REGIONAL’. I’ve tested this on my end and it worked. Nonetheless, I will take another look at the code and will come back to you with an update by Wednesday the latest.
            – Maxim
            Nov 26 at 7:09










          • I tested that in my bucket have 3 storage classes namely multi-region/region , Nearline and codline . Assume total size of the bucket is 100TB but as per the code i'm getting as Multi-regional/regional : 0 B Neraline : 20 TB Codeline : 70 TB but remaining size should falls either in multi-regional or regional .
            – prasanna Kumar
            Nov 26 at 9:36
















          Hi @Maxima Thank you for your response , This one is not solved my problem . This one i've already checked .I'll brief it once again, in my bucket i've multi-region , nearline and codeline . I would like to get size of the files for all types of storage class exists in the bucket . If i've 50-GB in Multi-region , 20 GB in NearLINE , 1TB in codeline Like above i need response.
          – prasanna Kumar
          Nov 23 at 12:31




          Hi @Maxima Thank you for your response , This one is not solved my problem . This one i've already checked .I'll brief it once again, in my bucket i've multi-region , nearline and codeline . I would like to get size of the files for all types of storage class exists in the bucket . If i've 50-GB in Multi-region , 20 GB in NearLINE , 1TB in codeline Like above i need response.
          – prasanna Kumar
          Nov 23 at 12:31












          I misunderstood your initial inquiry then. I updated my answer. I hope this is what you were looking for.
          – Maxim
          Nov 24 at 0:03




          I misunderstood your initial inquiry then. I updated my answer. I hope this is what you were looking for.
          – Maxim
          Nov 24 at 0:03












          Hi @Maxima, It's solve my 50% of my problem , I've run the above code but i'm getting only nearline and code line size as expected but region and multi region size are . not coming . Please check it from ur end. Will you able to get all storage class sizes .
          – prasanna Kumar
          Nov 26 at 6:38
















          I filed a Feature Request on your behalf for this: issuetracker.google.com/119989059, as it is indeed something that can be useful when a bucket holds files with different storage classes. As for the issue you're encountering, please verify that the bucket contains files with the storage classes 'REGIONAL' and 'MULTI_REGIONAL'. I've tested this on my end and it worked. Nonetheless, I will take another look at the code and come back to you with an update by Wednesday at the latest.
          – Maxim
          Nov 26 at 7:09
















          I tested it: my bucket has three storage classes, namely Multi-Regional/Regional, Nearline and Coldline. Assume the total size of the bucket is 100 TB; as per the code I'm getting Multi-Regional/Regional: 0 B, Nearline: 20 TB, Coldline: 70 TB, but the remaining size should fall under either Multi-Regional or Regional.
          – prasanna Kumar
          Nov 26 at 9:36





















