How come the time complexity of Binary Search is log n


I am watching this professor's video on binary search (https://youtu.be/C2apEw9pgtw?t=969), but at this point I got lost. How did he come up with a time complexity of $\log n$ just by drawing the binary tree and knowing that its height is $\log n$? And then the time complexity becomes log 16/2 = 4... how is that a $\log n$ time complexity?





























Tagged: algorithms






      edited May 13 at 14:32









      Ethan Bolker

      asked May 13 at 13:59









Ezeewei




















3 Answers































          Here's an answer to the question




          How come the time complexity of Binary Search is $\log n$?




          that describes informally what's going on in the binary tree in the question and in the video (which I have not watched).



          You want to know how long binary search will take on input of size $n$, as a function of $n$.



          At each stage of the search (each pass through the body of the while loop) you split the input in half, so you successively reduce the size of the problem (the range h − l) this way:
          $$
          n,\ n/2,\ n/4,\ n/8,\ \ldots
          $$

          (Strictly speaking, you round those to integers.)



          Clearly you will be done when the remaining input has size $1$: there is just one place left, and that index is the answer.



          So you want the number of steps $k$ such that $n/2^k \le 1$. That's the smallest $k$ for which $2^k \ge n$. The definition of the logarithm says that $k$ is about $\log_2(n)$, so binary search has that complexity.
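The counting above can be made concrete with a short sketch (mine, not from the video or the answer): an iterative binary search over a sorted Python list, with a counter on the while loop.

```python
import math

def binary_search(a, target):
    """Return (index, loop passes) for target in sorted list a; index is -1 if absent."""
    lo, hi = 0, len(a) - 1
    steps = 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2          # split the remaining range in half
        if a[mid] == target:
            return mid, steps
        elif a[mid] < target:
            lo = mid + 1              # keep the right half
        else:
            hi = mid - 1              # keep the left half
    return -1, steps

a = list(range(16))                   # n = 16
index, steps = binary_search(a, 5)
# The loop body runs at most floor(log2(n)) + 1 = 5 times for n = 16,
# matching the "number of halvings until one element remains" count above.
assert steps <= math.floor(math.log2(len(a))) + 1
```

Even an unsuccessful search for $n = 16$ takes at most 5 probes, which is the $\log_2 16 = 4$ halvings plus a final check.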

















          • – Ezeewei (May 13 at 14:27): So basically, in this case $\log_2(n)$ is written as $\log n$ in the lecture, correct?






          • – Ethan Bolker (May 13 at 14:29): @Ezeewei Correct. In computer science just plain $\log$ usually means $\log_2$. In mathematics it usually means $\ln = \log_e$. It hardly ever means $\log_{10}$ these days. All those logs are proportional, so for big-O time complexity arguments they are the same.







          • – Ross Millikan (May 13 at 14:30): Yes. In time complexity we don't care about multiplicative factors: $\log n = \log 2 \cdot \log_2 n$. In fact most logs in complexity work are taken to base $2$.






          • – Ethan Bolker (May 13 at 16:32): @Chris All logs are the same for the complexity analysis. Base $2$ is what comes up naturally for this algorithm, and for most estimates in theoretical computer science. If you happen to be working with a divide-and-conquer algorithm that trisects its input you would work in base $3$.






          • – Derek Elkins (May 13 at 20:19): @Chris We're not always doing big-O analysis. Sometimes you actually want a more precise calculation. Also, even with big-O, something like $O(2^{\log n})$ would be sensitive to the base of the logarithm.
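The proportionality claim in the comments is easy to check numerically (a throwaway sketch; the choices of $n$ are arbitrary):

```python
import math

# log2(n) and ln(n) differ only by the constant factor 1/ln(2),
# independent of n; this is exactly why the base is irrelevant for big-O.
for n in (10, 1_000, 1_000_000):
    ratio = math.log2(n) / math.log(n)
    assert abs(ratio - 1 / math.log(2)) < 1e-12
```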




















          How did he come up with a time complexity of $\log n$ just by drawing the binary tree and knowing its height is $\log n$?




          I'm guessing this is a key part of the question: you're wondering not just "why is the complexity $\log n$?", but "why does knowing that the height of the tree is $\log_2(n)$ equate to the complexity being $O(\log n)$?"



          The answer is that steps down the tree are the unit "operations" you're trying to count. That is, as you walk down the tree, what you're doing at each step is: "[check whether the value at your current position is equal to, greater than, or less than the value you're searching for; and accordingly, return, go left, or go right]". That whole chunk of logic is a constant-time-bounded amount of processing, and the question you're trying to answer is, "How many times (at most) am I going to have to do [that]?"



          For every possible search, you'll be starting at the root and walking down the tree, all the way (at most) to a leaf on the bottom level. So, the height of the tree is equal to the maximum number of steps you'll need to take.



          One other possible source of confusion is that seeing him draw the whole tree might give the impression that the search process would involve explicitly constructing the entire binary search tree data structure (which would itself be an $O(n)$ task). But no: the idea here is that the tree exists abstractly and implicitly, as a graph of the relationships among the elements in the array, and drawing it and tracing paths down it is just a way of visualizing what you're doing as you jump around the array.
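To illustrate the implicit-tree point, here is a hypothetical sketch that records the sequence of array indices a binary search probes; that sequence is exactly the root-to-leaf path in the drawn tree, yet no tree structure is ever built:

```python
def search_path(a, target):
    """Return the list of indices binary search probes in sorted list a."""
    path = []
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        path.append(mid)              # each probe is one "step down the tree"
        if a[mid] == target:
            break
        elif a[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return path

a = list(range(15))                   # 15 elements: the implicit tree has height 4
print(search_path(a, 0))              # root-to-leaf path of indices: [7, 3, 1, 0]
# No search in a 15-element array probes more than 4 positions.
assert all(len(search_path(a, t)) <= 4 for t in range(15))
```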



























            First, it is important to note that the running time of an algorithm is usually represented as a function of the input size. Then we 'measure' the complexity by fitting this function into a class of functions. For instance, if $T(n)$ is the function describing your algorithm's running time and $g\colon\mathbb{N}\to\mathbb{R}$ is another function, then
            $$T\in O(g) \iff \text{there exist } c,n_0\in\mathbb{R}_{++} \text{ such that } T(n)\leq c\,g(n) \text{ for each } n\geq n_0.$$

            Similarly, we say that

            $$T\in \Omega(g) \iff \text{there exist } c,n_0\in\mathbb{R}_{++} \text{ such that } T(n)\geq c\,g(n) \text{ for each } n\geq n_0.$$

            If $T$ belongs to both $O(g)$ and $\Omega(g)$ then we say that $T\in\Theta(g)$. Let us conclude that for the binary search algorithm we have a running time of $\Theta(\log(n))$. Note that we always solve a subproblem in constant time and are then given a subproblem of size $\frac{n}{2}$. Thus, the running time of binary search is described by the recurrence $$T(n)=T\Big(\frac{n}{2}\Big)+\alpha.$$
            Solving the recurrence gives $T(n)=\alpha\log_2(n)$. Choosing constants $c=\alpha$ and $n_0=1$, you can easily conclude that the running time of binary search is $\Theta(\log(n))$.
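The closed form can be sanity-checked by unrolling the recurrence numerically (a sketch; $\alpha = 1$ and the base case $T(1) = 0$ are my choices, and $n$ is restricted to powers of two so the halving is exact):

```python
import math

def T(n, alpha=1):
    """Unroll the recurrence T(n) = T(n/2) + alpha with base case T(1) = 0."""
    if n <= 1:
        return 0
    return T(n // 2, alpha) + alpha

# For powers of two the closed form T(n) = alpha * log2(n) holds exactly.
for k in range(1, 11):
    n = 2 ** k
    assert T(n) == math.log2(n)
```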




















              3 Answers
              3






              active

              oldest

              votes








              3 Answers
              3






              active

              oldest

              votes









              active

              oldest

              votes






              active

              oldest

              votes









              7












              $begingroup$

              Here's an answer to the question




              How come the time complexity of Binary Search is $log n$?




              that describes informally what's going on in the binary tree in the question and in the video (which I have not watched).



              You want to know how long binary search will take on input of size $n$, as a function of $n$.



              At each stage of the search (pass through the body of the while loop) you split the input in half, so you successively reduce the size of the problem (h-l) this way:
              $$
              n, n/2, n/4, n/8 ldots .
              $$

              (Strictly speaking, you round those to integers.)



              Clearly you will be done when the input is $1$, for there's just one place. That index is the answer.



              So you want the number of steps $k$ such that $n/2^k le 1$. That's the smallest $k$ for which $2^k ge n$. The definition of the logarithm says that $k$ is about $log_2(n)$, so binary search has that complexity.






              share|cite|improve this answer











              $endgroup$












              • $begingroup$
                So basically, in this case log2(𝑛) is simplified as log n in the lecture. correct?
                $endgroup$
                – Ezeewei
                May 13 at 14:27






              • 7




                $begingroup$
                @Ezeewei Correct. In computer science just plain $log$ usually means $log_2$. In mathematics it usually means $ln = log_e$. It hardly ever means $log_10$ these days. All those logs are proportional, so for big-O time complexity arguments they are the same.
                $endgroup$
                – Ethan Bolker
                May 13 at 14:29







              • 2




                $begingroup$
                Yes. In time complexity we don't care about multiplicative factors. $log n = log 2 log_2 n$. In fact most logs in complexity work are taken to base $2$
                $endgroup$
                – Ross Millikan
                May 13 at 14:30






              • 2




                $begingroup$
                @Chris All logs are the same for the complexity analysis. Base $2$ is what comes up naturally for this algorithm, and for most estimates in theoretical computer science. If you happen to be working with a divide-and-conquer algorithm that trisects its input you would work in base $3$.
                $endgroup$
                – Ethan Bolker
                May 13 at 16:32






              • 1




                $begingroup$
                @Chris We're not always doing big-O analysis. Sometimes you actually want a more precise calculation. Also, even with big-O, something like $O(2^log n)$ would be sensitive to the base of the logarithm.
                $endgroup$
                – Derek Elkins
                May 13 at 20:19















              7












              $begingroup$

              Here's an answer to the question




              How come the time complexity of Binary Search is $log n$?




              that describes informally what's going on in the binary tree in the question and in the video (which I have not watched).



              You want to know how long binary search will take on input of size $n$, as a function of $n$.



              At each stage of the search (pass through the body of the while loop) you split the input in half, so you successively reduce the size of the problem (h-l) this way:
              $$
              n, n/2, n/4, n/8 ldots .
              $$

              (Strictly speaking, you round those to integers.)



              Clearly you will be done when the input is $1$, for there's just one place. That index is the answer.



              So you want the number of steps $k$ such that $n/2^k le 1$. That's the smallest $k$ for which $2^k ge n$. The definition of the logarithm says that $k$ is about $log_2(n)$, so binary search has that complexity.






              share|cite|improve this answer











              $endgroup$












              • $begingroup$
                So basically, in this case log2(𝑛) is simplified as log n in the lecture. correct?
                $endgroup$
                – Ezeewei
                May 13 at 14:27






              • 7




                $begingroup$
                @Ezeewei Correct. In computer science just plain $log$ usually means $log_2$. In mathematics it usually means $ln = log_e$. It hardly ever means $log_10$ these days. All those logs are proportional, so for big-O time complexity arguments they are the same.
                $endgroup$
                – Ethan Bolker
                May 13 at 14:29







              • 2




                $begingroup$
                Yes. In time complexity we don't care about multiplicative factors. $log n = log 2 log_2 n$. In fact most logs in complexity work are taken to base $2$
                $endgroup$
                – Ross Millikan
                May 13 at 14:30






              • 2




                $begingroup$
                @Chris All logs are the same for the complexity analysis. Base $2$ is what comes up naturally for this algorithm, and for most estimates in theoretical computer science. If you happen to be working with a divide-and-conquer algorithm that trisects its input you would work in base $3$.
                $endgroup$
                – Ethan Bolker
                May 13 at 16:32






              • 1




                $begingroup$
                @Chris We're not always doing big-O analysis. Sometimes you actually want a more precise calculation. Also, even with big-O, something like $O(2^log n)$ would be sensitive to the base of the logarithm.
                $endgroup$
                – Derek Elkins
                May 13 at 20:19













              7












              7








              7





              $begingroup$

              Here's an answer to the question




              How come the time complexity of Binary Search is $log n$?




              that describes informally what's going on in the binary tree in the question and in the video (which I have not watched).



              You want to know how long binary search will take on input of size $n$, as a function of $n$.



              At each stage of the search (pass through the body of the while loop) you split the input in half, so you successively reduce the size of the problem (h-l) this way:
              $$
              n, n/2, n/4, n/8 ldots .
              $$

              (Strictly speaking, you round those to integers.)



              Clearly you will be done when the input is $1$, for there's just one place. That index is the answer.



              So you want the number of steps $k$ such that $n/2^k le 1$. That's the smallest $k$ for which $2^k ge n$. The definition of the logarithm says that $k$ is about $log_2(n)$, so binary search has that complexity.






              share|cite|improve this answer











              $endgroup$



              Here's an answer to the question




              How come the time complexity of Binary Search is $log n$?




              that describes informally what's going on in the binary tree in the question and in the video (which I have not watched).



              You want to know how long binary search will take on input of size $n$, as a function of $n$.



              At each stage of the search (pass through the body of the while loop) you split the input in half, so you successively reduce the size of the problem (h-l) this way:
              $$
              n, n/2, n/4, n/8 ldots .
              $$

              (Strictly speaking, you round those to integers.)



              Clearly you will be done when the input is $1$, for there's just one place. That index is the answer.



              So you want the number of steps $k$ such that $n/2^k le 1$. That's the smallest $k$ for which $2^k ge n$. The definition of the logarithm says that $k$ is about $log_2(n)$, so binary search has that complexity.







              share|cite|improve this answer














              share|cite|improve this answer



              share|cite|improve this answer








              edited May 13 at 14:24

























              answered May 13 at 14:16









              Ethan BolkerEthan Bolker

              49k556125




              49k556125











              • $begingroup$
                So basically, in this case log2(𝑛) is simplified as log n in the lecture. correct?
                $endgroup$
                – Ezeewei
                May 13 at 14:27






              • 7




                $begingroup$
                @Ezeewei Correct. In computer science just plain $log$ usually means $log_2$. In mathematics it usually means $ln = log_e$. It hardly ever means $log_10$ these days. All those logs are proportional, so for big-O time complexity arguments they are the same.
                $endgroup$
                – Ethan Bolker
                May 13 at 14:29







              • 2




                $begingroup$
                Yes. In time complexity we don't care about multiplicative factors. $log n = log 2 log_2 n$. In fact most logs in complexity work are taken to base $2$
                $endgroup$
                – Ross Millikan
                May 13 at 14:30






              • 2




                $begingroup$
                @Chris All logs are the same for the complexity analysis. Base $2$ is what comes up naturally for this algorithm, and for most estimates in theoretical computer science. If you happen to be working with a divide-and-conquer algorithm that trisects its input you would work in base $3$.
                $endgroup$
                – Ethan Bolker
                May 13 at 16:32






              • 1




                $begingroup$
                @Chris We're not always doing big-O analysis. Sometimes you actually want a more precise calculation. Also, even with big-O, something like $O(2^log n)$ would be sensitive to the base of the logarithm.
                $endgroup$
                – Derek Elkins
                May 13 at 20:19
















              • $begingroup$
                So basically, in this case log2(𝑛) is simplified as log n in the lecture. correct?
                $endgroup$
                – Ezeewei
                May 13 at 14:27






              • 7




                $begingroup$
                @Ezeewei Correct. In computer science just plain $log$ usually means $log_2$. In mathematics it usually means $ln = log_e$. It hardly ever means $log_10$ these days. All those logs are proportional, so for big-O time complexity arguments they are the same.
                $endgroup$
                – Ethan Bolker
                May 13 at 14:29







              • 2




                $begingroup$
                Yes. In time complexity we don't care about multiplicative factors. $log n = log 2 log_2 n$. In fact most logs in complexity work are taken to base $2$
                $endgroup$
                – Ross Millikan
                May 13 at 14:30






              • 2




                $begingroup$
                @Chris All logs are the same for the complexity analysis. Base $2$ is what comes up naturally for this algorithm, and for most estimates in theoretical computer science. If you happen to be working with a divide-and-conquer algorithm that trisects its input you would work in base $3$.
                $endgroup$
                – Ethan Bolker
                May 13 at 16:32






              • 1




                $begingroup$
                @Chris We're not always doing big-O analysis. Sometimes you actually want a more precise calculation. Also, even with big-O, something like $O(2^log n)$ would be sensitive to the base of the logarithm.
                $endgroup$
                – Derek Elkins
                May 13 at 20:19















              $begingroup$
              So basically, in this case log2(𝑛) is simplified as log n in the lecture. correct?
              $endgroup$
              – Ezeewei
              May 13 at 14:27




              $begingroup$
              So basically, in this case log2(𝑛) is simplified as log n in the lecture. correct?
              $endgroup$
              – Ezeewei
              May 13 at 14:27




              7




              7




              $begingroup$
              @Ezeewei Correct. In computer science just plain $log$ usually means $log_2$. In mathematics it usually means $ln = log_e$. It hardly ever means $log_10$ these days. All those logs are proportional, so for big-O time complexity arguments they are the same.
              $endgroup$
              – Ethan Bolker
              May 13 at 14:29





              $begingroup$
              @Ezeewei Correct. In computer science just plain $log$ usually means $log_2$. In mathematics it usually means $ln = log_e$. It hardly ever means $log_10$ these days. All those logs are proportional, so for big-O time complexity arguments they are the same.
              $endgroup$
              – Ethan Bolker
              May 13 at 14:29





              2




              2




              $begingroup$
              Yes. In time complexity we don't care about multiplicative factors. $log n = log 2 log_2 n$. In fact most logs in complexity work are taken to base $2$
              $endgroup$
              – Ross Millikan
              May 13 at 14:30




              $begingroup$
              Yes. In time complexity we don't care about multiplicative factors. $log n = log 2 log_2 n$. In fact most logs in complexity work are taken to base $2$
              $endgroup$
              – Ross Millikan
              May 13 at 14:30




              2




              2




              $begingroup$
              @Chris All logs are the same for the complexity analysis. Base $2$ is what comes up naturally for this algorithm, and for most estimates in theoretical computer science. If you happen to be working with a divide-and-conquer algorithm that trisects its input you would work in base $3$.
              $endgroup$
              – Ethan Bolker
              May 13 at 16:32




              $begingroup$
              @Chris All logs are the same for the complexity analysis. Base $2$ is what comes up naturally for this algorithm, and for most estimates in theoretical computer science. If you happen to be working with a divide-and-conquer algorithm that trisects its input you would work in base $3$.
              $endgroup$
              – Ethan Bolker
              May 13 at 16:32




              1




              1




              $begingroup$
              @Chris We're not always doing big-O analysis. Sometimes you actually want a more precise calculation. Also, even with big-O, something like $O(2^log n)$ would be sensitive to the base of the logarithm.
              $endgroup$
              – Derek Elkins
              May 13 at 20:19




              $begingroup$
              @Chris We're not always doing big-O analysis. Sometimes you actually want a more precise calculation. Also, even with big-O, something like $O(2^log n)$ would be sensitive to the base of the logarithm.
              $endgroup$
              – Derek Elkins
              May 13 at 20:19











              1












              $begingroup$


              How come he came up the time coomplexity is log in just by breaking off binary tree and knowing height is log n




              I'm guessing this is a key part of the question: you're wondering not just "why is the complexity log(n)?", but "why does knowing that the height of the tree is log2(n) equate to the complexity being O(log(n))?"



              The answer is that steps down the tree are the unit "operations" you're trying to count. That is, as you walk down the tree, what you're doing at each step is: "[check whether the value at your current position is equal to, greater than, or less than the value you're searching for; and accordingly, return, go left, or go right]". That whole chunk of logic is a constant-time-bounded amount of processing, and the question you're trying to answer is, "How many times (at most) am I going to have to do [that]?"



              For every possible search, you'll be starting at the root and walking down the tree, all the way (at most) to a leaf on the bottom level. So, the height of the tree is equal to the maximum number of steps you'll need to take.



              One other possible source of confusion is that seeing him draw the whole tree might give the impression that the search process would involve explicitly constructing the entire Binary Search Tree data structure (which would itself be a O(n) task). But no -- the idea here is that the tree exists abstractly and implicitly, as a graph of the relationships among the elements in the array; and drawing it and tracing paths down it is just a way of visualizing what you're doing as you jump around the array.






edited May 16 at 5:26 · answered May 13 at 19:06 by dgould





















First, it is important to note that the running time of an algorithm is usually represented as a function of the input size. Then we "measure" the complexity by fitting this function into a class of functions. For instance, if $T(n)$ is the function describing your algorithm's running time and $g\colon\mathbb{N}\to\mathbb{R}$ is another function, then
$$T\in O(g) \iff \text{there exist } c,n_0\in\mathbb{R}_{++} \text{ such that } T(n)\leq c\,g(n) \text{ for each } n\geq n_0.$$

Similarly, we say that

$$T\in \Omega(g) \iff \text{there exist } c,n_0\in\mathbb{R}_{++} \text{ such that } T(n)\geq c\,g(n) \text{ for each } n\geq n_0.$$

If $T$ belongs to both $O(g)$ and $\Omega(g)$, then we say that $T\in\Theta(g)$. Let's conclude that for the binary search algorithm we have a running time of $\Theta(\log(n))$. Note that we always solve a subproblem in constant time and are then left with a subproblem of size $\frac{n}{2}$. Thus, the running time of binary search is described by the recurrence $$T(n)=T\Big(\frac{n}{2}\Big)+\alpha.$$
Solving the recurrence gives $T(n)=\alpha\log_2(n)$. Choosing the constants $c=\alpha$ and $n_0=1$, you can easily conclude that the running time of binary search is $\Theta(\log(n))$.
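The recurrence and its closed form can be checked numerically. A small sketch, assuming the base case $T(1)=0$ and an arbitrary per-step cost `ALPHA` (both are my choices for illustration, valid for $n$ a power of two):

```python
import math

ALPHA = 1.0  # cost of one constant-time halving step (arbitrary units)

def T(n):
    """Unroll T(n) = T(n/2) + ALPHA with T(1) = 0, for n a power of two."""
    return 0.0 if n == 1 else T(n // 2) + ALPHA

for n in [2, 64, 1024]:
    # The closed form derived above: T(n) = ALPHA * log2(n)
    assert math.isclose(T(n), ALPHA * math.log2(n))
print(T(1024))  # 10.0, since 1024 halves to 1 in exactly 10 steps
```

The unrolling makes the solution visible: each halving contributes one $\alpha$, and $n$ can be halved $\log_2(n)$ times before reaching the base case.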






answered May 13 at 15:19 by Ariel Serranoni


























