Prove that the sum and the absolute difference of 2 Bernoulli(0.5) random variables are not independent





Let $X$ and $Y$ be independent $Bernoulli(0.5)$ random variables. Let $W = X + Y$ and $T = |X - Y|$. Show that $W$ and $T$ are not independent.



I know that I have to show that $P(W, T)$ is not equal to $P(W)P(T)$, but finding the joint distribution is hard. Please help.



























probability self-study independence bernoulli-distribution






asked 4 hours ago









MSE

  • Re: "finding the joint distribution is hard": have you made a table? Label the rows with values of $X$, the columns with values of $Y$, and in the cells put the values of $T$, $W$, and the associated probabilities. Collect your results into a new table with rows labeled by $T$ and columns labeled by $W$: put the total probabilities into the entries. That depicts the entire joint distribution of $(T,W)$. You can then draw your conclusion by visual inspection. No operation is any more difficult than computing $1/2 \times 1/2$.
    – whuber
    3 hours ago












  • I get it now. I was looking for an elegant mathematical expression for the joint distribution, but I now realize that I can simply enumerate the sample space and the probabilities. Thanks, @whuber.
    – MSE
    2 hours ago
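whuber's table construction is easy to automate. As a sketch (a reader's illustration, not part of the original thread), the following Python snippet enumerates the four equally likely $(X, Y)$ outcomes and tallies the joint distribution of $(T, W)$ with exact fractions:

```python
from fractions import Fraction
from collections import defaultdict

# Enumerate the four equally likely (X, Y) outcomes and record (T, W).
joint = defaultdict(Fraction)            # (t, w) -> probability
for x in (0, 1):
    for y in (0, 1):
        t, w = abs(x - y), x + y
        joint[(t, w)] += Fraction(1, 4)  # each cell has probability 1/2 * 1/2

for (t, w), p in sorted(joint.items()):
    print(f"T={t}, W={w}: {p}")
```

The three printed entries are the entire joint distribution: $(0,0)$ and $(0,2)$ each with probability $1/4$, and $(1,1)$ with probability $1/2$.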


















2 Answers






(accepted)
The product of the marginal distributions is defined on $\{0,1,2\} \times \{0,1\}$. You can plug in any of the $6$ possible pairs and get a nonzero number out.

However, the joint distribution is supported on a smaller set:
$$
\{(0,0)\} \cup \{(1,1)\} \cup \{(2,0)\}.
$$

To disprove independence, take any $(w,t)$ pair not in this set and plug it into $P(W,T)$ and $P(W)P(T)$. You will see that, for that particular pair,
$$
P(W,T) = 0 \neq P(W)P(T).
$$

Alternatively, because you're dealing with such a small space, you can simply compute every probability and check every possible pair.







        answered 4 hours ago









        Taylor
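This counterexample can be checked numerically. The sketch below (a reader's check, not part of the original answer) computes the joint distribution of $(W, T)$ and both marginals exactly, then evaluates both sides at $(w, t) = (2, 1)$, a pair outside the support:

```python
from fractions import Fraction
from collections import defaultdict

# Joint distribution of (W, T), built from the four equally likely outcomes.
joint = defaultdict(Fraction)            # (w, t) -> probability
for x in (0, 1):
    for y in (0, 1):
        joint[(x + y, abs(x - y))] += Fraction(1, 4)

# Marginal distributions of W and T.
pW = defaultdict(Fraction)
pT = defaultdict(Fraction)
for (w, t), p in joint.items():
    pW[w] += p
    pT[t] += p

# (w, t) = (2, 1) is outside the support of the joint distribution,
# yet both marginals are positive there.
w, t = 2, 1
print(joint[(w, t)], "vs", pW[w] * pT[t])   # 0 vs 1/8
```

Since $P(W=2, T=1) = 0$ while $P(W=2)P(T=1) = 1/4 \cdot 1/2 = 1/8$, independence fails.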

            When $T = 0$, $W = 0$ or $2$; when $T = 1$, $W = 1$. So $T$ and $W$ are not independent.
            See Independence of $X+Y$ and $X-Y$.







            answered 4 hours ago









            user158565
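The same point can be phrased through conditional distributions: if $W$ and $T$ were independent, $P(W \mid T)$ would not depend on $T$. A small Python check (a reader's illustration, not part of the original answer) makes the dependence visible:

```python
from fractions import Fraction
from collections import defaultdict

# Tally the joint distribution of (T, W) over the four outcomes.
joint = defaultdict(Fraction)            # (t, w) -> probability
for x in (0, 1):
    for y in (0, 1):
        joint[(abs(x - y), x + y)] += Fraction(1, 4)

# Marginal of T, then the conditional distribution of W given each T.
pT = defaultdict(Fraction)
for (t, _), p in joint.items():
    pT[t] += p

# T = 0 allows W in {0, 2}; T = 1 forces W = 1.
for t in (0, 1):
    cond = {w: joint[(t, w)] / pT[t] for w in (0, 1, 2) if joint[(t, w)]}
    print(f"P(W | T={t}):", {w: str(p) for w, p in cond.items()})
```

Knowing $T$ changes which values of $W$ are even possible, so the conditional distribution of $W$ shifts with $T$.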

            • I want to mark your solution as correct, too! Thanks, @user158565.
              – MSE
              2 hours ago

















