What does the integral of a function times a function of a random variable represent, conceptually?


I am trying to understand conceptually what the following gives me or tells me:



$$\int f(x) \cdot g(x) \, dx$$



where $f(x)$ is any continuous function of $x$ and $g(x)$ is the probability density function of a random variable; for example, the normal distribution's PDF is:



$$ g(x) = \frac{1}{\sqrt{2 \pi \sigma^2}} \exp\left(\frac{-(x - \mu)^2}{2 \sigma^2}\right) $$



I understand the integral of a PDF gives me the CDF. So:



$$\int_{-\infty}^{0} g(x) \, dx$$



gives me the probability that $X$ is less than $0$. However, what happens when you multiply $g(x)$ by another function $f(x)$ and take the integral? I have heard that it gives you the expected payoff, assuming $f(x)$ is a payoff function and the integral runs from $-\infty$ to $+\infty$. If that is true, I understand it conceptually: the sum of the payoffs times their probabilities is the expected value of whatever game you are playing.



I start getting confused when the limits of the integral are not $\pm \infty$; I am not sure what the integral means conceptually in that case. For example:



$$\int_{-\infty}^{0} f(x) \, g(x) \, dx$$



What does that tell me?










Tags: probability, distributions, normal-distribution, random-variable, expected-value






asked Apr 26 at 14:36 by vt_og, edited Apr 26 at 15:09 by Siong Thye Goh
          3 Answers
          Suppose $g$ is the pdf of random variable $X$, then



$$E[f(X) \mid X \in A] = \frac{\int_A f(x)\, g(x) \, dx}{\int_A g(t) \, dt}$$



Hence $$\int_A f(x)\, g(x) \, dx = \Pr(X \in A)\, E[f(X) \mid X \in A],$$



it gives you the product of the conditional expectation of $f(X)$ given that $X \in A$ and the probability that $X$ is in $A$.



I think $E[f(X) \mid X \in A]$ is the more interesting quantity, and I would divide your integral by $\Pr(X \in A)$ to recover it.






answered Apr 26 at 14:57 by Siong Thye Goh (score: 4)
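To see this identity numerically, here is a minimal sketch (not part of the original answer; the payoff $f(x) = e^x$ and $X \sim N(0,1)$ are arbitrary illustrative choices). It checks that the truncated integral equals $\Pr(X \in A)\, E[f(X) \mid X \in A]$ for $A = (-\infty, 0)$:

```python
# Minimal numerical check of the identity above, with A = (-inf, 0),
# X ~ N(0, 1) and the purely illustrative payoff f(x) = exp(x).
import numpy as np
from scipy import integrate, stats

f = np.exp                       # illustrative payoff function
g = stats.norm(loc=0, scale=1)   # pdf/cdf of X ~ N(0, 1)

# Left-hand side: integral of f(x) g(x) over A = (-inf, 0)
lhs, _ = integrate.quad(lambda x: f(x) * g.pdf(x), -np.inf, 0)

# Right-hand side: Pr(X in A) times a Monte Carlo estimate of E[f(X) | X in A]
rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
p_A = g.cdf(0)                   # Pr(X < 0) = 0.5 here
cond_mean = f(x[x < 0]).mean()   # E[f(X) | X < 0], estimated from samples in A
rhs = p_A * cond_mean

print(lhs, rhs)  # both roughly 0.26 (= exp(1/2) * Phi(-1)), up to Monte Carlo error
```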

            The expected value of $X$ following distribution $g$ is



$$E[X] = \int x \, g(x) \, dx$$



            By the law of the unconscious statistician we know that the expected value of a function $f$ of random variable $X$ is



$$E[f(X)] = \int f(x) \, g(x) \, dx$$



What does it tell you? It tells you what the expected value of your random variable would be if you transformed it somehow. For example, say that you know the distribution of human height in centimetres but are interested in the expected value in metres, i.e. $f(x) = x/100$.






answered Apr 26 at 15:21 by Tim, edited Apr 27 at 5:11 (score: 4)
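As a quick illustration of the height example (a minimal sketch, not from the original answer; the $N(170, 10^2)$ cm distribution is a made-up choice), LOTUS with $f(x) = x/100$ simply reproduces the mean height expressed in metres:

```python
# LOTUS sketch: E[f(X)] = integral of f(x) g(x) dx, with a hypothetical
# height distribution X ~ N(170, 10^2) in centimetres and f(x) = x / 100.
import numpy as np
from scipy import integrate, stats

g = stats.norm(loc=170, scale=10)   # hypothetical heights in cm
f = lambda x: x / 100.0             # convert centimetres to metres

# Integrate over mean +/- 7 sd, which carries essentially all the probability mass
expected_metres, _ = integrate.quad(lambda x: f(x) * g.pdf(x), 100, 240)
print(expected_metres)   # ~1.70, i.e. E[X] / 100
```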

Another way to look at this integral is from the random-variable-transformation point of view. You can think of $Y = f(X)$ as a new random variable, and your integral is the expectation (mean) $E[Y]$ of the new variable $Y$.



Let's build the probability density function of the new variable $Y$. Unfortunately, we can't derive an analytical expression in general, so we'll have to use a more complex, integral form of it.



              What is the probability that $Y$ will have values between $y$ and $y+dy$? We look at all $x$ where $y<f(x)<y+dy$ and add up their probabilities:
$$g_Y(y)\,dy=\int_x \mathbb{1}_{\{y<f(x)<y+dy\}}\, g(x)\,dx$$



Next, we simply integrate over all values of $Y$:
$$E[Y]=\int_y y\, g_Y(y)\,dy$$
Since $\int_y$ runs through all possible $y$, it's the same as running the integral through all possible $x$.
Just pause for a moment and agree with me...



Now that you agree with me, you'll see that:
$$E[Y]=\int_x f(x)\,g(x)\,dx$$






answered Apr 26 at 15:08 by Aksakal, edited Apr 26 at 15:58 (score: 2)
– whuber (Apr 26 at 15:17): This could use some elaboration. Perhaps an example would help. If you would also bring into consideration the idea of truncation or conditioning (which seems to be the main stumbling block), I think your point would be an excellent answer.

– Aksakal (Apr 26 at 15:21): @whuber, I did the impossible: explained the Lebesgue integral without measure theory!
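A small numerical sketch of the transformation argument in this answer (not part of the original; the choice $X \sim N(0,1)$, $f(x) = x^2$ is illustrative, picked because the density of $Y = X^2$ is then known in closed form as $\chi^2_1$): computing $E[Y]$ from the density of $Y$ agrees with integrating $f(x)\,g(x)$ against the density of $X$.

```python
# Compare E[Y] computed from the density of Y = f(X) with the direct
# integral of f(x) g(x) dx, for the illustrative case X ~ N(0,1), f(x) = x^2,
# where Y happens to have a known density (chi-squared with 1 degree of freedom).
import numpy as np
from scipy import integrate, stats

g_x = stats.norm()        # density of X
g_y = stats.chi2(df=1)    # density of Y = X^2 for this particular f

e_from_y, _ = integrate.quad(lambda y: y * g_y.pdf(y), 0, np.inf)
e_from_x, _ = integrate.quad(lambda x: x**2 * g_x.pdf(x), -np.inf, np.inf)

print(e_from_y, e_from_x)   # both equal 1 (= Var(X)), consistent with E[Y] = integral of f g
```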










