

How come the time complexity of Binary Search is log n


I am watching this professor's video on binary search, but at this point I got lost: https://youtu.be/C2apEw9pgtw?t=969. How did he come up with a time complexity of $\log n$ just by drawing the binary tree and noting that its height is $\log n$? And then the number of steps becomes $\log_2 16 = 4$... how does that give a $\log n$ time complexity?

[image from the video: the binary tree drawn over the array]
















algorithms

asked May 13 at 13:59 by Ezeewei, edited May 13 at 14:32 by Ethan Bolker
3 Answers
Here's an answer to the question

    How come the time complexity of binary search is $\log n$?

that describes informally what's going on in the binary tree in the question and in the video (which I have not watched).

You want to know how long binary search will take on input of size $n$, as a function of $n$.

At each stage of the search (each pass through the body of the while loop) you split the input in half, so you successively reduce the size of the problem ($h-l$) this way:
$$
n,\ n/2,\ n/4,\ n/8,\ \ldots
$$
(Strictly speaking, you round those to integers.)

Clearly you will be done when the input has size $1$, for there's just one place left to look. That index is the answer.

So you want the number of steps $k$ such that $n/2^k \le 1$. That's the smallest $k$ for which $2^k \ge n$. By the definition of the logarithm, $k$ is about $\log_2(n)$, so binary search has that complexity.

answered May 13 at 14:16 by Ethan Bolker (edited May 13 at 14:24)
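The halving sequence described above can be made concrete with a short sketch. This is a hypothetical Python implementation (the names `lo`, `hi`, and `steps` are mine, not from the video) that also counts the passes through the while loop:

```python
def binary_search(a, target):
    """Return (index of target in sorted list a, or None; number of loop passes)."""
    lo, hi = 0, len(a) - 1
    steps = 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2      # split the current range in half
        if a[mid] == target:
            return mid, steps
        elif a[mid] < target:
            lo = mid + 1          # discard the left half
        else:
            hi = mid - 1          # discard the right half
    return None, steps

# With 16 sorted elements, the range shrinks 16 -> 8 -> 4 -> 2 -> 1,
# so the loop body runs at most log2(16) + 1 = 5 times.
a = list(range(16))
print(binary_search(a, 0))
print(binary_search(a, 15))
```

Every pass does a constant amount of work, so the total work is proportional to the number of halvings, which is the $k \approx \log_2(n)$ of the answer above.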












• So basically, in this case $\log_2(n)$ is simplified as $\log n$ in the lecture, correct? – Ezeewei, May 13 at 14:27

• @Ezeewei Correct. In computer science just plain $\log$ usually means $\log_2$. In mathematics it usually means $\ln = \log_e$. It hardly ever means $\log_{10}$ these days. All those logs are proportional, so for big-O time complexity arguments they are the same. – Ethan Bolker, May 13 at 14:29

• Yes. In time complexity we don't care about multiplicative factors: $\log n = \log 2 \cdot \log_2 n$. In fact most logs in complexity work are taken to base $2$. – Ross Millikan, May 13 at 14:30

• @Chris All logs are the same for the complexity analysis. Base $2$ is what comes up naturally for this algorithm, and for most estimates in theoretical computer science. If you happen to be working with a divide-and-conquer algorithm that trisects its input, you would work in base $3$. – Ethan Bolker, May 13 at 16:32

• @Chris We're not always doing big-O analysis. Sometimes you actually want a more precise calculation. Also, even with big-O, something like $O(2^{\log n})$ would be sensitive to the base of the logarithm. – Derek Elkins, May 13 at 20:19


















    How come he came up with the time complexity is log n just by breaking off the binary tree and knowing the height is log n

I'm guessing this is a key part of the question: you're wondering not just "why is the complexity log(n)?" but "why does knowing that the height of the tree is log2(n) equate to the complexity being O(log(n))?"

The answer is that steps down the tree are the unit "operations" you're trying to count. That is, as you walk down the tree, what you do at each step is: check whether the value at your current position is equal to, greater than, or less than the value you're searching for; and accordingly return, go left, or go right. That whole chunk of logic is a constant-time-bounded amount of processing, and the question you're trying to answer is, "How many times (at most) am I going to have to do that?"

For every possible search, you start at the root and walk down the tree, at most all the way to a leaf on the bottom level. So the height of the tree equals the maximum number of steps you'll need to take.

One other possible source of confusion is that seeing him draw the whole tree might give the impression that the search process involves explicitly constructing the entire binary search tree data structure (which would itself be an O(n) task). But no: the tree exists abstractly and implicitly, as a graph of the relationships among the elements in the array, and drawing it and tracing paths down it is just a way of visualizing what you're doing as you jump around the array.
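The claim that the tree's height bounds the number of steps can be checked empirically. This is my own sketch (not code from the video): it runs a standard iterative binary search for every element of a sorted array and confirms that no search visits more "nodes" than the height $\lfloor\log_2 n\rfloor + 1$ of the implicit tree:

```python
import math

def search_steps(a, target):
    """Number of 'visit a node' steps binary search takes to find target."""
    lo, hi = 0, len(a) - 1
    steps = 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if a[mid] == target:
            return steps
        elif a[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps  # target absent: still bounded by the height

n = 1000
a = list(range(n))
height = math.floor(math.log2(n)) + 1          # levels in the implicit tree
worst = max(search_steps(a, x) for x in a)     # worst case over all targets
print(worst, height)
assert worst <= height
```

Note that the array is never turned into an explicit tree object; the "tree" is just the pattern of midpoints the loop visits.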




















First, it is important to note that the running time of an algorithm is usually represented as a function of the input size. Then we 'measure' the complexity by fitting this function into a class of functions. For instance, if $T(n)$ is the function describing your algorithm's running time and $g\colon\mathbb{N}\to\mathbb{R}$ is another function, then
$$T\in O(g) \iff \text{there exist } c, n_0 \in \mathbb{R}_{++} \text{ such that } T(n)\le c\, g(n) \text{ for each } n\ge n_0.$$

Similarly, we say that

$$T\in \Omega(g) \iff \text{there exist } c, n_0 \in \mathbb{R}_{++} \text{ such that } T(n)\ge c\, g(n) \text{ for each } n\ge n_0.$$

If $T$ belongs to both $O(g)$ and $\Omega(g)$, then we say that $T\in\Theta(g)$. Let's conclude that for the binary search algorithm we have a running time of $\Theta(\log(n))$. Note that we always solve a subproblem in constant time and are then given a subproblem of size $\frac{n}{2}$. Thus, the running time of binary search is described by the recursive function
$$T(n)=T\Big(\frac{n}{2}\Big)+\alpha.$$
Solving the equation above gives $T(n)=\alpha\log_2(n)$. Choosing the constants $c=\alpha$ and $n_0=1$, you can easily conclude that the running time of binary search is $\Theta(\log(n))$.
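The recurrence above can also be checked numerically. This hypothetical Python sketch (assuming $n$ a power of two and base case $T(1)=0$) unrolls $T(n)=T(n/2)+\alpha$ and compares the result against the closed form $\alpha\log_2 n$:

```python
import math

def T(n, alpha=1.0):
    """Unroll T(n) = T(n/2) + alpha with base case T(1) = 0 (n a power of two)."""
    cost = 0.0
    while n > 1:
        cost += alpha   # constant work done at this level of the recursion
        n //= 2         # recurse on the half-sized subproblem
    return cost

# Each doubling of n adds exactly one more alpha of work.
for k in range(1, 11):
    n = 2 ** k
    assert T(n) == math.log2(n)   # alpha * log2(n) with alpha = 1
print(T(16), T(1024))
```

Each halving contributes one $\alpha$, and there are $\log_2 n$ halvings before the subproblem reaches size $1$, which is exactly the closed form.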













              Your Answer








              StackExchange.ready(function()
              var channelOptions =
              tags: "".split(" "),
              id: "69"
              ;
              initTagRenderer("".split(" "), "".split(" "), channelOptions);

              StackExchange.using("externalEditor", function()
              // Have to fire editor after snippets, if snippets enabled
              if (StackExchange.settings.snippets.snippetsEnabled)
              StackExchange.using("snippets", function()
              createEditor();
              );

              else
              createEditor();

              );

              function createEditor()
              StackExchange.prepareEditor(
              heartbeatType: 'answer',
              autoActivateHeartbeat: false,
              convertImagesToLinks: true,
              noModals: true,
              showLowRepImageUploadWarning: true,
              reputationToPostImages: 10,
              bindNavPrevention: true,
              postfix: "",
              imageUploader:
              brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
              contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
              allowUrls: true
              ,
              noCode: true, onDemand: true,
              discardSelector: ".discard-answer"
              ,immediatelyShowMarkdownHelp:true
              );



              );













              draft saved

              draft discarded


















              StackExchange.ready(
              function ()
              StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f3224473%2fhow-come-the-time-complexity-of-binary-search-is-log-n%23new-answer', 'question_page');

              );

              Post as a guest















              Required, but never shown

























              3 Answers
              3






              active

              oldest

              votes








              3 Answers
              3






              active

              oldest

              votes









              active

              oldest

              votes






              active

              oldest

              votes









              7












              $begingroup$

              Here's an answer to the question




              How come the time complexity of Binary Search is $log n$?




              that describes informally what's going on in the binary tree in the question and in the video (which I have not watched).



              You want to know how long binary search will take on input of size $n$, as a function of $n$.



              At each stage of the search (pass through the body of the while loop) you split the input in half, so you successively reduce the size of the problem (h-l) this way:
              $$
              n, n/2, n/4, n/8 ldots .
              $$

              (Strictly speaking, you round those to integers.)



              Clearly you will be done when the input is $1$, for there's just one place. That index is the answer.



              So you want the number of steps $k$ such that $n/2^k le 1$. That's the smallest $k$ for which $2^k ge n$. The definition of the logarithm says that $k$ is about $log_2(n)$, so binary search has that complexity.






              share|cite|improve this answer











              $endgroup$












              • $begingroup$
                So basically, in this case log2(𝑛) is simplified as log n in the lecture. correct?
                $endgroup$
                – Ezeewei
                May 13 at 14:27






              • 7




                $begingroup$
                @Ezeewei Correct. In computer science just plain $log$ usually means $log_2$. In mathematics it usually means $ln = log_e$. It hardly ever means $log_10$ these days. All those logs are proportional, so for big-O time complexity arguments they are the same.
                $endgroup$
                – Ethan Bolker
                May 13 at 14:29







              • 2




                $begingroup$
                Yes. In time complexity we don't care about multiplicative factors. $log n = log 2 log_2 n$. In fact most logs in complexity work are taken to base $2$
                $endgroup$
                – Ross Millikan
                May 13 at 14:30






              • 2




                $begingroup$
                @Chris All logs are the same for the complexity analysis. Base $2$ is what comes up naturally for this algorithm, and for most estimates in theoretical computer science. If you happen to be working with a divide-and-conquer algorithm that trisects its input you would work in base $3$.
                $endgroup$
                – Ethan Bolker
                May 13 at 16:32






              • 1




                $begingroup$
                @Chris We're not always doing big-O analysis. Sometimes you actually want a more precise calculation. Also, even with big-O, something like $O(2^log n)$ would be sensitive to the base of the logarithm.
                $endgroup$
                – Derek Elkins
                May 13 at 20:19















              7












              $begingroup$

              Here's an answer to the question




              How come the time complexity of Binary Search is $log n$?




              that describes informally what's going on in the binary tree in the question and in the video (which I have not watched).



              You want to know how long binary search will take on input of size $n$, as a function of $n$.



              At each stage of the search (pass through the body of the while loop) you split the input in half, so you successively reduce the size of the problem (h-l) this way:
              $$
              n, n/2, n/4, n/8 ldots .
              $$

              (Strictly speaking, you round those to integers.)



              Clearly you will be done when the input is $1$, for there's just one place. That index is the answer.



              So you want the number of steps $k$ such that $n/2^k le 1$. That's the smallest $k$ for which $2^k ge n$. The definition of the logarithm says that $k$ is about $log_2(n)$, so binary search has that complexity.






              share|cite|improve this answer











              $endgroup$












              • $begingroup$
                So basically, in this case log2(𝑛) is simplified as log n in the lecture. correct?
                $endgroup$
                – Ezeewei
                May 13 at 14:27






              • 7




                $begingroup$
                @Ezeewei Correct. In computer science just plain $log$ usually means $log_2$. In mathematics it usually means $ln = log_e$. It hardly ever means $log_10$ these days. All those logs are proportional, so for big-O time complexity arguments they are the same.
                $endgroup$
                – Ethan Bolker
                May 13 at 14:29







              • 2




                $begingroup$
                Yes. In time complexity we don't care about multiplicative factors. $log n = log 2 log_2 n$. In fact most logs in complexity work are taken to base $2$
                $endgroup$
                – Ross Millikan
                May 13 at 14:30






              • 2




                $begingroup$
                @Chris All logs are the same for the complexity analysis. Base $2$ is what comes up naturally for this algorithm, and for most estimates in theoretical computer science. If you happen to be working with a divide-and-conquer algorithm that trisects its input you would work in base $3$.
                $endgroup$
                – Ethan Bolker
                May 13 at 16:32






              • 1




                $begingroup$
                @Chris We're not always doing big-O analysis. Sometimes you actually want a more precise calculation. Also, even with big-O, something like $O(2^log n)$ would be sensitive to the base of the logarithm.
                $endgroup$
                – Derek Elkins
                May 13 at 20:19













              7












              7








              7





              $begingroup$

              Here's an answer to the question




              How come the time complexity of Binary Search is $log n$?




              that describes informally what's going on in the binary tree in the question and in the video (which I have not watched).



              You want to know how long binary search will take on input of size $n$, as a function of $n$.



              At each stage of the search (pass through the body of the while loop) you split the input in half, so you successively reduce the size of the problem (h-l) this way:
              $$
              n, n/2, n/4, n/8 ldots .
              $$

              (Strictly speaking, you round those to integers.)



              Clearly you will be done when the input is $1$, for there's just one place. That index is the answer.



              So you want the number of steps $k$ such that $n/2^k le 1$. That's the smallest $k$ for which $2^k ge n$. The definition of the logarithm says that $k$ is about $log_2(n)$, so binary search has that complexity.






              share|cite|improve this answer











              $endgroup$



              Here's an answer to the question




              How come the time complexity of Binary Search is $log n$?




              that describes informally what's going on in the binary tree in the question and in the video (which I have not watched).



              You want to know how long binary search will take on input of size $n$, as a function of $n$.



              At each stage of the search (pass through the body of the while loop) you split the input in half, so you successively reduce the size of the problem (h-l) this way:
              $$
              n, n/2, n/4, n/8 ldots .
              $$

              (Strictly speaking, you round those to integers.)



              Clearly you will be done when the input is $1$, for there's just one place. That index is the answer.



              So you want the number of steps $k$ such that $n/2^k le 1$. That's the smallest $k$ for which $2^k ge n$. The definition of the logarithm says that $k$ is about $log_2(n)$, so binary search has that complexity.







              share|cite|improve this answer














              share|cite|improve this answer



              share|cite|improve this answer








              edited May 13 at 14:24

























              answered May 13 at 14:16









              Ethan BolkerEthan Bolker

              49k556125




              49k556125











              • $begingroup$
                So basically, in this case log2(𝑛) is simplified as log n in the lecture. correct?
                $endgroup$
                – Ezeewei
                May 13 at 14:27






              • 7




                $begingroup$
                @Ezeewei Correct. In computer science just plain $log$ usually means $log_2$. In mathematics it usually means $ln = log_e$. It hardly ever means $log_10$ these days. All those logs are proportional, so for big-O time complexity arguments they are the same.
                $endgroup$
                – Ethan Bolker
                May 13 at 14:29







              • 2




                $begingroup$
                Yes. In time complexity we don't care about multiplicative factors. $log n = log 2 log_2 n$. In fact most logs in complexity work are taken to base $2$
                $endgroup$
                – Ross Millikan
                May 13 at 14:30






              • 2




                $begingroup$
                @Chris All logs are the same for the complexity analysis. Base $2$ is what comes up naturally for this algorithm, and for most estimates in theoretical computer science. If you happen to be working with a divide-and-conquer algorithm that trisects its input you would work in base $3$.
                $endgroup$
                – Ethan Bolker
                May 13 at 16:32






              • 1




                $begingroup$
                @Chris We're not always doing big-O analysis. Sometimes you actually want a more precise calculation. Also, even with big-O, something like $O(2^log n)$ would be sensitive to the base of the logarithm.
                $endgroup$
                – Derek Elkins
                May 13 at 20:19
















              • $begingroup$
                So basically, in this case log2(𝑛) is simplified as log n in the lecture. correct?
                $endgroup$
                – Ezeewei
                May 13 at 14:27






              • 7




                $begingroup$
                @Ezeewei Correct. In computer science just plain $log$ usually means $log_2$. In mathematics it usually means $ln = log_e$. It hardly ever means $log_10$ these days. All those logs are proportional, so for big-O time complexity arguments they are the same.
                $endgroup$
                – Ethan Bolker
                May 13 at 14:29







              • 2




                $begingroup$
                Yes. In time complexity we don't care about multiplicative factors. $log n = log 2 log_2 n$. In fact most logs in complexity work are taken to base $2$
                $endgroup$
                – Ross Millikan
                May 13 at 14:30






              • 2




                $begingroup$
                @Chris All logs are the same for the complexity analysis. Base $2$ is what comes up naturally for this algorithm, and for most estimates in theoretical computer science. If you happen to be working with a divide-and-conquer algorithm that trisects its input you would work in base $3$.
                $endgroup$
                – Ethan Bolker
                May 13 at 16:32






              • 1




                $begingroup$
                @Chris We're not always doing big-O analysis. Sometimes you actually want a more precise calculation. Also, even with big-O, something like $O(2^log n)$ would be sensitive to the base of the logarithm.
                $endgroup$
                – Derek Elkins
                May 13 at 20:19















              $begingroup$
              So basically, in this case log2(𝑛) is simplified as log n in the lecture. correct?
              $endgroup$
              – Ezeewei
              May 13 at 14:27




              $begingroup$
              So basically, in this case log2(𝑛) is simplified as log n in the lecture. correct?
              $endgroup$
              – Ezeewei
              May 13 at 14:27




              7




              7




              $begingroup$
              @Ezeewei Correct. In computer science just plain $log$ usually means $log_2$. In mathematics it usually means $ln = log_e$. It hardly ever means $log_10$ these days. All those logs are proportional, so for big-O time complexity arguments they are the same.
              $endgroup$
              – Ethan Bolker
              May 13 at 14:29





              $begingroup$
              @Ezeewei Correct. In computer science just plain $log$ usually means $log_2$. In mathematics it usually means $ln = log_e$. It hardly ever means $log_10$ these days. All those logs are proportional, so for big-O time complexity arguments they are the same.
              $endgroup$
              – Ethan Bolker
              May 13 at 14:29





              2




              2




              $begingroup$
              Yes. In time complexity we don't care about multiplicative factors. $log n = log 2 log_2 n$. In fact most logs in complexity work are taken to base $2$
              $endgroup$
              – Ross Millikan
              May 13 at 14:30




              $begingroup$
              Yes. In time complexity we don't care about multiplicative factors. $log n = log 2 log_2 n$. In fact most logs in complexity work are taken to base $2$
              $endgroup$
              – Ross Millikan
              May 13 at 14:30




              2




              2




              $begingroup$
              @Chris All logs are the same for the complexity analysis. Base $2$ is what comes up naturally for this algorithm, and for most estimates in theoretical computer science. If you happen to be working with a divide-and-conquer algorithm that trisects its input you would work in base $3$.
              $endgroup$
              – Ethan Bolker
              May 13 at 16:32




              $begingroup$
              @Chris All logs are the same for the complexity analysis. Base $2$ is what comes up naturally for this algorithm, and for most estimates in theoretical computer science. If you happen to be working with a divide-and-conquer algorithm that trisects its input you would work in base $3$.
              $endgroup$
              – Ethan Bolker
              May 13 at 16:32




              1




              1




              $begingroup$
              @Chris We're not always doing big-O analysis. Sometimes you actually want a more precise calculation. Also, even with big-O, something like $O(2^log n)$ would be sensitive to the base of the logarithm.
              $endgroup$
              – Derek Elkins
              May 13 at 20:19




              $begingroup$
              @Chris We're not always doing big-O analysis. Sometimes you actually want a more precise calculation. Also, even with big-O, something like $O(2^log n)$ would be sensitive to the base of the logarithm.
              $endgroup$
              – Derek Elkins
              May 13 at 20:19











              1












              $begingroup$


              How come he came up the time coomplexity is log in just by breaking off binary tree and knowing height is log n




              I'm guessing this is a key part of the question: you're wondering not just "why is the complexity log(n)?", but "why does knowing that the height of the tree is log2(n) equate to the complexity being O(log(n))?"



              The answer is that steps down the tree are the unit "operations" you're trying to count. That is, as you walk down the tree, what you're doing at each step is: "[check whether the value at your current position is equal to, greater than, or less than the value you're searching for; and accordingly, return, go left, or go right]". That whole chunk of logic is a constant-time-bounded amount of processing, and the question you're trying to answer is, "How many times (at most) am I going to have to do [that]?"



              For every possible search, you'll be starting at the root and walking down the tree, all the way (at most) to a leaf on the bottom level. So, the height of the tree is equal to the maximum number of steps you'll need to take.



              One other possible source of confusion is that seeing him draw the whole tree might give the impression that the search process would involve explicitly constructing the entire Binary Search Tree data structure (which would itself be a O(n) task). But no -- the idea here is that the tree exists abstractly and implicitly, as a graph of the relationships among the elements in the array; and drawing it and tracing paths down it is just a way of visualizing what you're doing as you jump around the array.






              share|cite|improve this answer











              $endgroup$

















                1












                $begingroup$


                How come he came up the time coomplexity is log in just by breaking off binary tree and knowing height is log n




                I'm guessing this is a key part of the question: you're wondering not just "why is the complexity log(n)?", but "why does knowing that the height of the tree is log2(n) equate to the complexity being O(log(n))?"



                The answer is that steps down the tree are the unit "operations" you're trying to count. That is, as you walk down the tree, what you're doing at each step is: "[check whether the value at your current position is equal to, greater than, or less than the value you're searching for; and accordingly, return, go left, or go right]". That whole chunk of logic is a constant-time-bounded amount of processing, and the question you're trying to answer is, "How many times (at most) am I going to have to do [that]?"



                For every possible search, you'll be starting at the root and walking down the tree, all the way (at most) to a leaf on the bottom level. So, the height of the tree is equal to the maximum number of steps you'll need to take.



                One other possible source of confusion is that seeing him draw the whole tree might give the impression that the search process would involve explicitly constructing the entire Binary Search Tree data structure (which would itself be an O(n) task). But no -- the idea here is that the tree exists abstractly and implicitly, as a graph of the relationships among the elements in the array; drawing it and tracing paths down it is just a way of visualizing what you're doing as you jump around the array.
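                A minimal sketch of that counting argument in Python (my own illustration, not code from the video): each loop iteration is exactly one "compare, then return / go left / go right" step, and the step count never exceeds the tree height plus one.

```python
import math

def binary_search(arr, target):
    """Iterative binary search over a sorted list.

    Returns (index_or_None, steps), where steps counts how many times
    the constant-time "compare, then branch" block runs.
    """
    lo, hi = 0, len(arr) - 1
    steps = 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid, steps
        elif arr[mid] < target:
            lo = mid + 1          # go right: discard the left half
        else:
            hi = mid - 1          # go left: discard the right half
    return None, steps

arr = list(range(1024))           # n = 1024, so log2(n) = 10
_, steps = binary_search(arr, 0)  # a near-worst-case search at the edge
assert steps <= math.floor(math.log2(len(arr))) + 1
```

                The assertion at the end is the whole point: however unlucky the search, the number of compare-and-branch steps is bounded by the height of the implicit tree.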






                  edited May 16 at 5:26

























                  answered May 13 at 19:06









                  dgould























                      First, it is important to note that the running time of an algorithm is usually represented as a function of the input size. Then we "measure" the complexity by fitting this function into a class of functions. For instance, if $T(n)$ is the function describing your algorithm's running time and $g\colon\mathbb{N}\to\mathbb{R}$ is another function, then
                      $$T\in O(g) \iff \text{there exist } c, n_0 \in \mathbb{R}_{++} \text{ such that } T(n)\leq c\,g(n) \text{ for each } n\geq n_0.$$



                      Similarly, we say that



                      $$T\in \Omega(g) \iff \text{there exist } c, n_0 \in \mathbb{R}_{++} \text{ such that } T(n)\geq c\,g(n) \text{ for each } n\geq n_0.$$



                      If $T$ belongs to both $O(g)$ and $\Omega(g)$, then we say that $T\in\Theta(g)$. Let us conclude that binary search has a running time of $\Theta(\log(n))$. Note that we always solve a subproblem in constant time and are then handed a subproblem of size $\frac{n}{2}$. Thus, the running time of binary search is described by the recurrence $$T(n)=T\Big(\frac{n}{2}\Big)+\alpha.$$
                      Solving the recurrence gives $T(n)=\alpha\log_2(n)$. Choosing the constants $c=\alpha$ and $n_0=1$, you can easily conclude that the running time of binary search is $\Theta(\log(n))$.
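                      The recurrence can also be checked numerically. The sketch below (my own illustration, assuming the base case $T(1)=0$, which the answer leaves implicit) unrolls $T(n)=T(n/2)+\alpha$ for powers of two and confirms $T(n)=\alpha\log_2(n)$.

```python
import math

def T(n, alpha=1.0):
    """Unroll the recurrence T(n) = T(n/2) + alpha, with base case T(1) = 0."""
    cost = 0.0
    while n > 1:
        cost += alpha  # constant work at this level of the recursion
        n //= 2        # the remaining subproblem has half the size
    return cost

# For powers of two, the unrolled cost equals alpha * log2(n).
for k in range(1, 11):
    n = 2 ** k
    assert T(n) == math.log2(n)                   # alpha = 1
    assert T(n, alpha=3.0) == 3.0 * math.log2(n)  # scales linearly in alpha
```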
















                          answered May 13 at 15:19









                          Ariel Serranoni



























