



How to evaluate sum with one million summands?




For a research project I am currently working on, I need to do a very simple and straightforward calculation. The Mathematica code is very short:



b[x_] := x^2 - x + 1/2

bp[x_] := b[Mod[x, 1]]

d[n_, q_] := Sum[bp[k/n]*bp[q*k/n], {k, 0, n - 1}]


Now I need to compare two values of d[n, q]; in particular, I need to calculate d[1346269, 514229] and d[1346269, 1137064] to see which one is larger. This works perfectly fine for smaller numbers, e.g. d[75025, 28657] gave the correct result in a reasonable amount of time. However, when I evaluated d[1346269, 514229], after some time I got the result



(1/9760128332100732436)(4744910246749618660646829 - 
4880064166050366218 Hold[$ConditionHold[$ConditionHold[
System`Dump`AutoLoad[Hold[Sum`InfiniteSum],
Hold[Sum`InfiniteSum, Sum`SumInfiniteRationalSeries,
Sum`SumInfiniteRationalExponentialSeries,
Sum`SumInfiniteLogarithmicSeries,
Sum`SumInfiniteBernoulliSeries,
Sum`SumInfiniteFibonacciSeries, Sum`SumInfiniteLucasLSeries,
Sum`SumInfiniteArcTangentSeries,
Sum`SumInfiniteArcCotangentSeries,
Sum`SumInfiniteqArcTangentSeries,
Sum`SumInfiniteqArcCotangentSeries,
Sum`SumInfiniteStirlingNumberSeries,
Sum`SumInfiniteqRationalSeries], "Sum`InfiniteSum`"]]]][(1 -
Ceiling[(1 - Sum`FiniteSumDump`l)/1346269] +
Floor[(1346268 - Sum`FiniteSumDump`l)/1346269]) Mod[(
514229 Sum`FiniteSumDump`l)/1346269, 1], Sum`FiniteSumDump`l,
0, 1346268, True])


Now, I am not too familiar with Mathematica, so I am not sure where exactly the problem is. However, I need the two results d[1346269, 1137064] and d[1346269, 514229] exactly (i.e. not numerically): they are so close together that any rounding could change the values enough to alter their order. Is there any way of computing those sums symbolically?
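For readers outside Mathematica, the exact-rational computation asked for here can be sketched in Python with the standard-library `fractions` module. The function names mirror the definitions above; this is only a cross-check for small inputs, not a way to handle n on the order of 10^6:

```python
from fractions import Fraction

def b(x):
    # b(x) = x^2 - x + 1/2, evaluated exactly on rationals
    return x * x - x + Fraction(1, 2)

def bp(x):
    # periodize to [0, 1): bp(x) = b(x mod 1)
    return b(x % 1)

def d(n, q):
    # d(n, q) = sum_{k=0}^{n-1} bp(k/n) * bp(q*k/n), as an exact Fraction
    return sum(bp(Fraction(k, n)) * bp(Fraction(q * k, n)) for k in range(n))
```

Because every intermediate value is a `Fraction`, comparing two d values is exact; the price is that a million-term sum in pure Python is far slower than the vectorized approach in the answer below.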










Tags: summation






edited May 9 at 19:34 by Carl Lange

asked May 9 at 19:20 by Analysis801
          1 Answer
I find that this evaluates much faster. Mathematica is much faster at evaluating a function on a list of 1 million data points than at evaluating the function 1 million times with a single point each.



b[x_] := x^2 - x + 1/2
bp[x_] := b[Mod[x, 1]]
d[n_, q_] := Total[bp[Range[0, n - 1]/n] bp[q Range[0, n - 1]/n]]
d[1346269, 1137064]


          This takes about 17 seconds to evaluate on my machine and gives me:



$\frac{1459973134402153452576673}{9760128332100732436}$



          Evaluating d[1346269, 514229] gives me:



$\frac{1459973134399471446859617}{9760128332100732436}$



          The difference is:



$\frac{670501429264}{2440032083025183109}$



          Evaluated to 100 digits with N[difference, 100], I get



$2.74792054550653169316510562953899839764312057350982296551104800194089580643554604005134744886010933307817655482572397565420836194 \cdot 10^{-7}$
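As a quick sanity check on the arithmetic above, the two fractions can be differenced exactly with Python's `fractions` module (the numerators and denominator are read off from the results in this answer):

```python
from fractions import Fraction

# d[1346269, 1137064] and d[1346269, 514229] as given above
d1 = Fraction(1459973134402153452576673, 9760128332100732436)
d2 = Fraction(1459973134399471446859617, 9760128332100732436)

diff = d1 - d2  # Fraction reduces to lowest terms automatically
```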






answered May 9 at 19:55 by MassDefect
• Your point about the way lists are interacted with via functions during evaluation is going to be quite impactful for me! Thank you :) Do you have a good example or explanation of how the list is evaluated differently from running the evaluation some N number of times? I understand the difference in how much code you'd have to write, but it seemed to me that would only affect the memory taken up, not the speed of evaluation. Is the difference in having to access it N times, reloading the function each time? – CA Trevillian, May 10 at 3:28










• I have tried it your way now and indeed it is much faster, which is really useful: in fact, not only do I need to compute some d[n,q], but for fixed n I need to compute Table[d[n, q], {q, 0, Ceiling[n/2]}] and then find the smallest element. Is there a way of speeding up the computation of the entire table as well? I thought of first computing Table[bp[k/n], {k, 0, n - 1}] and then just reading the function values from that table, but since bp is not a very complicated function, I was not sure whether this is significantly faster. – Analysis801, May 10 at 6:09
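The precomputation idea in the comment above can be pushed a little further: since $qk/n \bmod 1 = ((qk) \bmod n)/n$, a single table of the bp(k/n) values serves every q. A minimal Python sketch with exact rationals (an illustration of the idea, not tested at the full problem size):

```python
from fractions import Fraction

def bp(x):
    x = x % 1                      # reduce to [0, 1)
    return x * x - x + Fraction(1, 2)

def d_table(n, qs):
    # Precompute bp(k/n) once; bp(q*k/n) is then just B[(q*k) % n].
    B = [bp(Fraction(k, n)) for k in range(n)]
    return {q: sum(B[k] * B[(q * k) % n] for k in range(n)) for q in qs}
```

Each d value still costs one pass over n terms, but the table B is built only once for all q.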










• @CATrevillian mathematica.stackexchange.com/questions/18393/… provides a little background and links to a good question/answer by Mr. Wizard. I don't understand all the details behind why it's faster, but I have often found that using this feature speeds up code for large amounts of data. Sin[Range[0, 100, 0.001]] is faster than Table[Sin[i], {i, 0, 100, 0.001}], which is faster than For[i = 0, i <= 100, i = i + 0.001, Sin[i]]. – MassDefect, May 10 at 18:03
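The same vectorize-versus-loop contrast exists outside Mathematica. A rough NumPy analogue of the Sin[Range[...]] example above (assuming NumPy is available; timings vary by machine, so only equality of the results is checked here):

```python
import numpy as np

xs = np.linspace(0, 100, 100001)           # analogue of Range[0, 100, 0.001]
fast = np.sin(xs)                          # one vectorized call over the whole array
slow = np.array([np.sin(x) for x in xs])   # analogue of the Table/For versions

# Both produce the same values; the single vectorized call is typically much faster.
```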










• @Analysis801 Hmmm... I'm not sure. I think that's probably worthy of its own question, if you haven't already asked one. Each one is a pretty major calculation and will take quite some time, so doing half a million of them would be tough. – MassDefect, May 10 at 18:10










• @MassDefect Wow! I unfortunately didn't even know you could do that! Now I fortunately do. The link is awesome and your examples are perfect in illustrating the differences. I'm excited to do some new timing tests :) – CA Trevillian, May 11 at 1:49











          Your Answer








          StackExchange.ready(function()
          var channelOptions =
          tags: "".split(" "),
          id: "387"
          ;
          initTagRenderer("".split(" "), "".split(" "), channelOptions);

          StackExchange.using("externalEditor", function()
          // Have to fire editor after snippets, if snippets enabled
          if (StackExchange.settings.snippets.snippetsEnabled)
          StackExchange.using("snippets", function()
          createEditor();
          );

          else
          createEditor();

          );

          function createEditor()
          StackExchange.prepareEditor(
          heartbeatType: 'answer',
          autoActivateHeartbeat: false,
          convertImagesToLinks: false,
          noModals: true,
          showLowRepImageUploadWarning: true,
          reputationToPostImages: null,
          bindNavPrevention: true,
          postfix: "",
          imageUploader:
          brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
          contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
          allowUrls: true
          ,
          onDemand: true,
          discardSelector: ".discard-answer"
          ,immediatelyShowMarkdownHelp:true
          );



          );













          draft saved

          draft discarded


















          StackExchange.ready(
          function ()
          StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmathematica.stackexchange.com%2fquestions%2f198049%2fhow-to-evaluate-sum-with-one-million-summands%23new-answer', 'question_page');

          );

          Post as a guest















          Required, but never shown

























          1 Answer
          1






          active

          oldest

          votes








          1 Answer
          1






          active

          oldest

          votes









          active

          oldest

          votes






          active

          oldest

          votes









          7












          $begingroup$

          I find that this evaluates much faster. Mathematica is much faster at evaluating functions with a list of 1 million data points than it is at evaluate a function 1 million times with a single point each.



          b[x] := x^2 - x + 1/2
          bp[x_] := b[Mod[x, 1]]
          d[n_, q_] := Total[bp[Range[0, n - 1]/n] bp[q Range[0, n - 1]/n]]
          d[1346269, 1137064]


          This takes about 17 seconds to evaluate on my machine and gives me:



          $frac14599731344021534525766739760128332100732436$



          Evaluating d[1346269, 514229] gives me:



          $frac14599731343994714468596179760128332100732436$



          The difference is:



          $frac6705014292642440032083025183109$



          Evaluated to 100 digits with N[difference, 100], I get



          2.74792054550653169316510562953899839764312057350982296551104800194 089580643554604005134744886010933307817655482572397565420836194
          $cdot 10^-7$






          share|improve this answer









          $endgroup$












          • $begingroup$
            Your point about the way lists are interacted with via functions during evaluation is going to be quite impactful for me! Thank you :) do you have a good example or explanation as to how the list is evaluated differently than running the evaluation some N number of times? I understand the difference in how much code you’d have to write, but it seemed to me that would only affect the memory taken up, and not change the speed of evaluation. Is the difference in having to access it some N amount of times, reloading the function each time?
            $endgroup$
            – CA Trevillian
            May 10 at 3:28










          • $begingroup$
            I have tried it your way now and indeed it is much faster. Which is really useful, since in fact not only do I need to compute some d[n,q], but for fixed n I need to compute Table[d[n, q], q, 0, Ceiling[n/2]] and then find the smallest element. Is there another way of speeding up the computation of the entire table as well? I thought of first computing Table[bp[k/n],k,0,n-1] and then just reading the function values from that table. But since bp is not a very complicated function, I was not sure whether this is significantly faster. But is there some way of speeding this up?
            $endgroup$
            – Analysis801
            May 10 at 6:09










          • $begingroup$
            @CATrevillian mathematica.stackexchange.com/questions/18393/… provides a little background and links to a good question/answer by Mr. Wizard. I don't understand all the details behind why it's faster, but have often found using this feature speeds up code for large amounts of data. Sin[Range[0, 100, 0.001]] is faster than Table[Sin[i], i, 0, 100, 0.001] which is faster than For[i = 0, i <= 100, i = i + 0.001, Sin[i]].
            $endgroup$
            – MassDefect
            May 10 at 18:03










          • $begingroup$
            @Analysis801 Hmmm... I'm not sure. I think that's probably worthy of its own question, if you haven't already asked one. Each one is a pretty major calculation and will take quite some time, so to do half a million of them would be tough.
            $endgroup$
            – MassDefect
            May 10 at 18:10










          • $begingroup$
            @MassDefect wow! I unfortunately didn’t even know you could do that! Now I fortunately do. The link is awesome and your examples are perfect in illustrating the differences. I’m excited to do some new timing tests :)
            $endgroup$
            – CA Trevillian
            May 11 at 1:49















          7












          $begingroup$

          I find that this evaluates much faster. Mathematica is much faster at evaluating functions with a list of 1 million data points than it is at evaluate a function 1 million times with a single point each.



          b[x] := x^2 - x + 1/2
          bp[x_] := b[Mod[x, 1]]
          d[n_, q_] := Total[bp[Range[0, n - 1]/n] bp[q Range[0, n - 1]/n]]
          d[1346269, 1137064]


          This takes about 17 seconds to evaluate on my machine and gives me:



          $frac14599731344021534525766739760128332100732436$



          Evaluating d[1346269, 514229] gives me:



          $frac14599731343994714468596179760128332100732436$



          The difference is:



          $frac6705014292642440032083025183109$



          Evaluated to 100 digits with N[difference, 100], I get



          2.74792054550653169316510562953899839764312057350982296551104800194 089580643554604005134744886010933307817655482572397565420836194
          $cdot 10^-7$






          share|improve this answer









          $endgroup$












          • $begingroup$
            Your point about the way lists are interacted with via functions during evaluation is going to be quite impactful for me! Thank you :) do you have a good example or explanation as to how the list is evaluated differently than running the evaluation some N number of times? I understand the difference in how much code you’d have to write, but it seemed to me that would only affect the memory taken up, and not change the speed of evaluation. Is the difference in having to access it some N amount of times, reloading the function each time?
            $endgroup$
            – CA Trevillian
            May 10 at 3:28










          • $begingroup$
            I have tried it your way now and indeed it is much faster. Which is really useful, since in fact not only do I need to compute some d[n,q], but for fixed n I need to compute Table[d[n, q], q, 0, Ceiling[n/2]] and then find the smallest element. Is there another way of speeding up the computation of the entire table as well? I thought of first computing Table[bp[k/n],k,0,n-1] and then just reading the function values from that table. But since bp is not a very complicated function, I was not sure whether this is significantly faster. But is there some way of speeding this up?
            $endgroup$
            – Analysis801
            May 10 at 6:09










          • $begingroup$
            @CATrevillian mathematica.stackexchange.com/questions/18393/… provides a little background and links to a good question/answer by Mr. Wizard. I don't understand all the details behind why it's faster, but have often found using this feature speeds up code for large amounts of data. Sin[Range[0, 100, 0.001]] is faster than Table[Sin[i], i, 0, 100, 0.001] which is faster than For[i = 0, i <= 100, i = i + 0.001, Sin[i]].
            $endgroup$
            – MassDefect
            May 10 at 18:03










          • $begingroup$
            @Analysis801 Hmmm... I'm not sure. I think that's probably worthy of its own question, if you haven't already asked one. Each one is a pretty major calculation and will take quite some time, so to do half a million of them would be tough.
            $endgroup$
            – MassDefect
            May 10 at 18:10










          • $begingroup$
            @MassDefect wow! I unfortunately didn’t even know you could do that! Now I fortunately do. The link is awesome and your examples are perfect in illustrating the differences. I’m excited to do some new timing tests :)
            $endgroup$
            – CA Trevillian
            May 11 at 1:49













          7












          7








          7





          $begingroup$

          I find that this evaluates much faster. Mathematica is much faster at evaluating functions with a list of 1 million data points than it is at evaluate a function 1 million times with a single point each.



          b[x] := x^2 - x + 1/2
          bp[x_] := b[Mod[x, 1]]
          d[n_, q_] := Total[bp[Range[0, n - 1]/n] bp[q Range[0, n - 1]/n]]
          d[1346269, 1137064]


          This takes about 17 seconds to evaluate on my machine and gives me:



          $frac14599731344021534525766739760128332100732436$



          Evaluating d[1346269, 514229] gives me:



          $frac14599731343994714468596179760128332100732436$



          The difference is:



          $frac6705014292642440032083025183109$



          Evaluated to 100 digits with N[difference, 100], I get



          2.74792054550653169316510562953899839764312057350982296551104800194 089580643554604005134744886010933307817655482572397565420836194
          $cdot 10^-7$






          share|improve this answer









          $endgroup$



          I find that this evaluates much faster. Mathematica is much faster at evaluating functions with a list of 1 million data points than it is at evaluate a function 1 million times with a single point each.



          b[x] := x^2 - x + 1/2
          bp[x_] := b[Mod[x, 1]]
          d[n_, q_] := Total[bp[Range[0, n - 1]/n] bp[q Range[0, n - 1]/n]]
          d[1346269, 1137064]


          This takes about 17 seconds to evaluate on my machine and gives me:



          $frac14599731344021534525766739760128332100732436$



          Evaluating d[1346269, 514229] gives me:



          $frac14599731343994714468596179760128332100732436$



          The difference is:



          $frac6705014292642440032083025183109$



          Evaluated to 100 digits with N[difference, 100], I get



          2.74792054550653169316510562953899839764312057350982296551104800194 089580643554604005134744886010933307817655482572397565420836194
          $cdot 10^-7$







          share|improve this answer












          share|improve this answer



          share|improve this answer










          answered May 9 at 19:55









          MassDefectMassDefect

          2,775311




          2,775311











          • $begingroup$
            Your point about the way lists are interacted with via functions during evaluation is going to be quite impactful for me! Thank you :) do you have a good example or explanation as to how the list is evaluated differently than running the evaluation some N number of times? I understand the difference in how much code you’d have to write, but it seemed to me that would only affect the memory taken up, and not change the speed of evaluation. Is the difference in having to access it some N amount of times, reloading the function each time?
            $endgroup$
            – CA Trevillian
            May 10 at 3:28










          • $begingroup$
            I have tried it your way now and indeed it is much faster. Which is really useful, since in fact not only do I need to compute some d[n,q], but for fixed n I need to compute Table[d[n, q], q, 0, Ceiling[n/2]] and then find the smallest element. Is there another way of speeding up the computation of the entire table as well? I thought of first computing Table[bp[k/n],k,0,n-1] and then just reading the function values from that table. But since bp is not a very complicated function, I was not sure whether this is significantly faster. But is there some way of speeding this up?
            $endgroup$
            – Analysis801
            May 10 at 6:09










          • $begingroup$
            @CATrevillian mathematica.stackexchange.com/questions/18393/… provides a little background and links to a good question/answer by Mr. Wizard. I don't understand all the details behind why it's faster, but have often found using this feature speeds up code for large amounts of data. Sin[Range[0, 100, 0.001]] is faster than Table[Sin[i], i, 0, 100, 0.001] which is faster than For[i = 0, i <= 100, i = i + 0.001, Sin[i]].
            $endgroup$
            – MassDefect
            May 10 at 18:03










          • $begingroup$
            @Analysis801 Hmmm... I'm not sure. I think that's probably worthy of its own question, if you haven't already asked one. Each one is a pretty major calculation and will take quite some time, so to do half a million of them would be tough.
            $endgroup$
            – MassDefect
            May 10 at 18:10










          • $begingroup$
            @MassDefect wow! I unfortunately didn’t even know you could do that! Now I fortunately do. The link is awesome and your examples are perfect in illustrating the differences. I’m excited to do some new timing tests :)
            $endgroup$
            – CA Trevillian
            May 11 at 1:49
















          • $begingroup$
            Your point about the way lists are interacted with via functions during evaluation is going to be quite impactful for me! Thank you :) do you have a good example or explanation as to how the list is evaluated differently than running the evaluation some N number of times? I understand the difference in how much code you’d have to write, but it seemed to me that would only affect the memory taken up, and not change the speed of evaluation. Is the difference in having to access it some N amount of times, reloading the function each time?
            $endgroup$
            – CA Trevillian
            May 10 at 3:28










          • $begingroup$
            I have tried it your way now and indeed it is much faster. Which is really useful, since in fact not only do I need to compute some d[n,q], but for fixed n I need to compute Table[d[n, q], q, 0, Ceiling[n/2]] and then find the smallest element. Is there another way of speeding up the computation of the entire table as well? I thought of first computing Table[bp[k/n],k,0,n-1] and then just reading the function values from that table. But since bp is not a very complicated function, I was not sure whether this is significantly faster. But is there some way of speeding this up?
            $endgroup$
            – Analysis801
            May 10 at 6:09










          • $begingroup$
            @CATrevillian mathematica.stackexchange.com/questions/18393/… provides a little background and links to a good question/answer by Mr. Wizard. I don't understand all the details behind why it's faster, but have often found using this feature speeds up code for large amounts of data. Sin[Range[0, 100, 0.001]] is faster than Table[Sin[i], i, 0, 100, 0.001] which is faster than For[i = 0, i <= 100, i = i + 0.001, Sin[i]].
            $endgroup$
            – MassDefect
            May 10 at 18:03










          • $begingroup$
            @Analysis801 Hmmm... I'm not sure. I think that's probably worthy of its own question, if you haven't already asked one. Each one is a pretty major calculation and will take quite some time, so to do half a million of them would be tough.
            $endgroup$
            – MassDefect
            May 10 at 18:10










          • $begingroup$
            @MassDefect wow! I unfortunately didn’t even know you could do that! Now I fortunately do. The link is awesome and your examples are perfect in illustrating the differences. I’m excited to do some new timing tests :)
            $endgroup$
            – CA Trevillian
            May 11 at 1:49
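
The vectorization advice in the comments above can be checked directly. A minimal timing sketch in the Wolfram Language (timings are machine-dependent; the point is the relative ordering), comparing a Listable function applied to a packed array against Table and For:

```mathematica
(* Sin is Listable, so applying it to a packed array of machine
   reals runs in vectorized internal code with no top-level loop. *)
xs = Range[0, 100, 0.001];   (* packed array of 100001 machine reals *)

AbsoluteTiming[Sin[xs];]                               (* fastest *)
AbsoluteTiming[Table[Sin[i], {i, 0, 100, 0.001}];]     (* slower  *)
AbsoluteTiming[
 For[i = 0., i <= 100., i += 0.001, Sin[i]]]           (* slowest *)
```

The For loop is slowest because every iteration goes through the top-level evaluator; Table avoids some of that overhead, and the Listable call avoids the loop entirely.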
