CNTK Loss and Error Metric function for multi-label classification




















Other than squared_error, what other loss functions / error functions would I be able to use?

I looked through https://cntk.ai/pythondocs/cntk.losses.html and wasn't able to find anything that helps.

I found documentation for BrainScript, but not for Python.

Any help would be amazing :)










cntk

asked Nov 29 '18 at 6:19 by zero core

2 Answers




















The best source of documentation (IMHO) is the Python documentation. If you need to write your own loss function, I found this post very helpful. Try using a sigmoid function at the output layer and binary cross-entropy loss or cosine loss.



          target = cntk.input_variable(input_dim)
          loss = cntk.binary_cross_entropy(z, target)


This way your output nodes will produce probabilities independently of each other, e.g. [0.73, 0.02, 0.05, 0.26, 0.68].
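For concreteness, here is a minimal end-to-end sketch of that setup. This is an illustration, not code from the original answer: the network shape, input_dim, num_labels, and the thresholded error metric are all assumptions made for the example.

import cntk as C

input_dim, num_labels = 20, 5  # hypothetical sizes for illustration

features = C.input_variable(input_dim)
target = C.input_variable(num_labels)  # multi-hot labels, e.g. [0, 1, 0, 1, 1]

# sigmoid at the output layer gives each label an independent probability
z = C.layers.Sequential([
    C.layers.Dense(64, activation=C.relu),
    C.layers.Dense(num_labels, activation=C.sigmoid),
])(features)

loss = C.binary_cross_entropy(z, target)

# classification_error assumes mutually exclusive classes, so use the
# fraction of mislabeled entries after thresholding at 0.5 as the metric
metric = C.reduce_mean(C.not_equal(C.greater(z, 0.5), target),
                       axis=C.Axis.all_static_axes())

Each output node is then trained as its own binary classifier, which is exactly the independence the multi-label setting needs.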






answered Dec 5 '18 at 12:26 by papadoble151, edited Dec 5 '18 at 12:59
            For multi-class classification, we typically use cross_entropy_with_softmax.



If you are trying to attribute two or more classes to every sample (multi-label classification), then there is no native implementation in CNTK.
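That said, an equivalent multi-label loss can be composed from CNTK primitives. A minimal sketch, assuming a sigmoid over the raw network output z and a multi-hot target (one reasonable workaround, not something this answer prescribes):

import cntk as C

num_labels = 4  # hypothetical label count

z = C.input_variable(num_labels)       # stand-in for the raw network output
target = C.input_variable(num_labels)  # multi-hot target, e.g. [0., 0., 1., 1.]

p = C.sigmoid(z)
# per-label binary cross entropy, summed over all labels
loss = -C.reduce_sum(target * C.log(p) + (1 - target) * C.log(1 - p))

With well-separated logits, e.g. loss.eval({z: [[-50., -50., 50., 50.]], target: [[0., 0., 1., 1.]]}), this evaluates to approximately 0, which is the behavior the comments below ask for.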






answered Nov 29 '18 at 10:16 by snowflake, edited Dec 3 '18 at 9:06
• I don't think CNTK has a built-in category_cross_entropy_with_softmax? I couldn't find it in the documentation or in the code.

  – zero core, Nov 29 '18 at 15:30

• It's in the Python API; not too sure about BrainScript. I updated the answer with the link.

  – snowflake, Nov 29 '18 at 15:32

• Oh yes, but correct me if I am wrong: that one is mainly for a one-hot encoded target. From their example, I would ideally want C.cross_entropy_with_softmax([[1., 1., 50., 50.]], [[0., 0., 1., 1.]]).eval() to result in 0, but it would say about 50 (array([[-49.30685425]], dtype=float32)).

  – zero core, Nov 29 '18 at 19:03

• For classification problems, you must one-hot encode.

  – snowflake, Nov 30 '18 at 18:31

• I don't think you understand what I am trying to say.

  – zero core, Dec 1 '18 at 23:47













