Keras concatenate with shapes [(1, 8), (None, 32)]


























My network consists of an LSTM part and a Dense part joined by another Dense part, and I cannot concatenate their outputs, which have shapes [(1, 8), (None, 32)]. Reshape and Flatten do not work.



Here's the architecture:



import keras
from keras.layers import Input, LSTM, Dense, Dropout, BatchNormalization
from keras.models import Model

def build_model_equal(dropout_rate=0.25):
    curve_input_1 = Input(batch_shape=(1, None, 1), name='curve_input_1')
    lstm_1 = LSTM(256, return_sequences=True, dropout=0.1)(curve_input_1)
    lstm_1 = LSTM(64, dropout=0.1)(lstm_1)
    lstm_out = Dense(8)(lstm_1)

    metadata_input = Input(shape=(31,), name='metadata_input')

    dense_1 = Dense(512, activation='relu')(metadata_input)
    dense_1 = BatchNormalization()(dense_1)
    dense_1 = Dropout(dropout_rate)(dense_1)

    dense_out = Dense(32)(dense_1)

    x = keras.layers.concatenate([lstm_out, dense_out], axis=1)

    output_hidden = Dense(64)(x)
    output_hidden = BatchNormalization()(output_hidden)
    output_hidden = Dropout(dropout_rate)(output_hidden)

    output = Dense(n_classes, activation='softmax', name='output')(output_hidden)

    model = Model(inputs=[curve_input_1, metadata_input], outputs=output)
    return model


When I train this model via



model.fit([x_train, x_metadata], y_train,
          validation_data=([x_valid, x_metadata_val], y_valid),
          epochs=n_epoch,
          batch_size=n_batch, shuffle=True,
          verbose=2, callbacks=[checkPoint])


I get an error




ValueError: A Concatenate layer requires inputs with matching shapes except for the concat axis. Got inputs shapes: [(1, 8), (None, 32)]
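The rule behind this error: Concatenate requires every axis except the concat axis to match, and that includes the batch axis. Here the LSTM branch's batch axis is pinned to 1 (from batch_shape=(1, None, 1)) while the Dense branch's is None. A small NumPy sketch of the same rule (illustrative only, not the Keras internals):

```python
import numpy as np

# The LSTM branch's batch axis is fixed at 1; the Dense branch's batch
# axis follows the data (say, 4). Concatenating along axis=1 then fails
# because the batch axes (axis 0) disagree.
lstm_out = np.zeros((1, 8))
dense_out = np.zeros((4, 32))

try:
    np.concatenate([lstm_out, dense_out], axis=1)
except ValueError as err:
    print('mismatch:', err)

# With the batch axes aligned, the merge works and only the feature
# axis differs: (4, 8) + (4, 32) -> (4, 40).
merged = np.concatenate([np.zeros((4, 8)), dense_out], axis=1)
print(merged.shape)
```

The same logic applies symbolically in Keras: a fixed batch of 1 on one branch and None on the other can never be proven equal at graph-construction time.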




When I add a Reshape layer like

dense_out = Dense(32)(dense_1)
dense_out = Reshape((1, 32))(dense_out)

x = keras.layers.concatenate([lstm_out, dense_out], axis=1)


I get




ValueError: A Concatenate layer requires inputs with matching shapes except for the concat axis. Got inputs shapes: [(1, 8), (None, 1, 32)]




Giving the Reshape layer an input_shape=(32,) or input_shape=(None, 32) argument doesn't change the situation; the error and shapes are the same.



Adding a Reshape to the LSTM branch like



curve_input_1 = Input(batch_shape=(1, None, 1), name='curve_input_1')
lstm_first_1 = LSTM(256, return_sequences=True, dropout=0.1, name='lstm_first_1')(curve_input_1)
lstm_second_1 = LSTM(64, dropout=0.1, name='lstm_second_1')(lstm_first_1)
lstm_out = Dense(8)(lstm_second_1)
lstm_out = Reshape((None, 8))(lstm_out)


Produces an error




ValueError: Tried to convert 'shape' to a tensor and failed. Error: None values not supported.




Changing the concatenate axis parameter to 0, 1, or -1 doesn't help.



Changing the Dense part's input shape doesn't help either. When I use metadata_input = Input(shape=(1, 31), name='metadata_input') instead of metadata_input = Input(shape=(31,), name='metadata_input'), I get the same error with shapes [(1, 8), (None, 1, 32)].



My guess is that I need to transform the data to either [(1, 8), (1, 32)] or [(None, 8), (None, 32)], but Reshape and Flatten didn't help.



There should be an easy way to do this that I've missed.










      python tensorflow keras






asked Nov 22 at 21:40 by rufldee
edited Nov 22 at 23:17 by Jonathan Leffler

          1 Answer
I think the problem is the use of batch_shape for the first Input and shape for the second one.



With the first input, the batch size is hard-coded to 1, and each sample has two more dimensions: None (an unspecified number of timesteps) and 1 (a single feature).



For the second input, since you are using shape, you are declaring that the batch size is unspecified and each sample has one dimension of 31 values.



Note that using shape=(31,) is the same as using batch_shape=(None, 31).
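If tensorflow.keras is installed, this equivalence is easy to check directly (a quick sketch; it assumes Input accepts batch_shape as a keyword, which it does in the Keras versions I'm aware of):

```python
import tensorflow as tf

# Both declarations produce a placeholder with an unspecified batch
# dimension followed by 31 features.
a = tf.keras.Input(shape=(31,))
b = tf.keras.Input(batch_shape=(None, 31))

print(tuple(a.shape), tuple(b.shape))  # (None, 31) (None, 31)
```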



Aligning both works for me, at least at model declaration time (I was unable to run fit, though, so I'm not sure whether I'm missing something and this solution doesn't fit your use case).



So, to summarize, you can try fixing the batch size to 1 on both inputs:

curve_input_1 = Input(batch_shape=(1, None, 1), name='curve_input_1')
metadata_input = Input(batch_shape=(1, 31), name='metadata_input')

Or leaving the batch size unspecified on both:

curve_input_1 = Input(batch_shape=(None, None, 1), name='curve_input_1')
metadata_input = Input(batch_shape=(None, 31), name='metadata_input')

Which is equivalent to:

curve_input_1 = Input(shape=(None, 1), name='curve_input_1')
metadata_input = Input(shape=(31,), name='metadata_input')


Please let me know if this worked or led you in a good direction!
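Putting the pieces together, here is a sketch of the whole model with the batch axis left unspecified on both branches (it assumes tensorflow.keras; n_classes is a placeholder default you'd replace with your own class count):

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_model_equal(n_classes=10, dropout_rate=0.25):
    # Variable-length curve input; batch axis left as None on both branches.
    curve_input_1 = layers.Input(shape=(None, 1), name='curve_input_1')
    x1 = layers.LSTM(256, return_sequences=True, dropout=0.1)(curve_input_1)
    x1 = layers.LSTM(64, dropout=0.1)(x1)
    lstm_out = layers.Dense(8)(x1)                         # (None, 8)

    metadata_input = layers.Input(shape=(31,), name='metadata_input')
    x2 = layers.Dense(512, activation='relu')(metadata_input)
    x2 = layers.BatchNormalization()(x2)
    x2 = layers.Dropout(dropout_rate)(x2)
    dense_out = layers.Dense(32)(x2)                       # (None, 32)

    # Batch axes now match, so only the feature axis differs.
    x = layers.concatenate([lstm_out, dense_out], axis=1)  # (None, 40)

    h = layers.Dense(64)(x)
    h = layers.BatchNormalization()(h)
    h = layers.Dropout(dropout_rate)(h)
    output = layers.Dense(n_classes, activation='softmax', name='output')(h)

    return Model(inputs=[curve_input_1, metadata_input], outputs=output)

model = build_model_equal()
print(model.output_shape)  # (None, 10)
```

With shape=(None, 1) the batch size is no longer pinned to 1, so you can also train with whatever batch_size you pass to fit.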






answered Nov 22 at 23:12 by Julian Peller





























