Minimum number of operations (divide by 2/3 or subtract 1) to reduce $n$ to $1$














This question is inspired by a Stack Overflow question that asks for an algorithm to solve the following problem:




Given a natural number $n$, what is the least number of moves you need to reduce it to $1$? Valid moves are:




  • subtract $1$

  • divide by $2$, applicable if $n \equiv 0 \pmod{2}$

  • divide by $3$, applicable if $n \equiv 0 \pmod{3}$




For example, you can reduce 10 in 3 steps: $10 \xrightarrow{-1} 9 \xrightarrow{/3} 3 \xrightarrow{/3} 1$.



Let's define $f(n)$ as the answer for number $n$. Then we have $f(1) = 0$ and for $n > 1$:



$f(n) = 1 + \min \{ f(n-1),\ (n \bmod 2) + f(\lfloor\frac{n}{2}\rfloor),\ (n \bmod 3) + f(\lfloor\frac{n}{3}\rfloor) \}$



$n$ is restricted to $10^9$ in the original source, which makes it easy to solve in $O(n)$ using dynamic programming or a breadth-first search, but that isn't really interesting.
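(For reference, here is a minimal bottom-up sketch of that $O(n)$ dynamic program, in Python; the function name is mine and not from the original source.)

    # f_table(limit)[n] = minimum number of moves to reduce n to 1
    # using the moves n-1, n/2 (if 2 divides n) and n/3 (if 3 divides n).
    def f_table(limit):
        f = [0] * (limit + 1)                    # f[1] = 0
        for n in range(2, limit + 1):
            best = f[n - 1] + 1                  # subtract 1
            if n % 2 == 0:
                best = min(best, f[n // 2] + 1)  # divide by 2
            if n % 3 == 0:
                best = min(best, f[n // 3] + 1)  # divide by 3
            f[n] = best
        return f

    # Example: f_table(10)[10] == 3, matching 10 -> 9 -> 3 -> 1.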



Initially I thought that the tricky range for $n$ would only be small (below $10^6$ or so) and for larger $n$ we could apply some simple greedy algorithm that prefers division by 3 or 4, even if we need to subtract 1 first. I tried to test some identities that could lead to such a heuristic:




  • $f(n) = 1 + f(n - 1) \quad \forall n: n \equiv 1,5 \pmod{6}$ (that's easy to prove, because there's only one valid move)

  • $f(n) = 1 + \min \{ f(\frac{n}{2}), f(\frac{n}{3}) \} \quad \forall n: n \equiv 0 \pmod{6}$

  • $f(3n) \geq f(n)$

  • $f(n) = 1 + f(\frac{n}{3}) \quad \forall n: n \equiv 0 \pmod{3^3}$

  • $f(n) = 1 + f(\frac{n}{3}) \quad \forall n: n \equiv 0 \pmod{3} \textbf{ and } n \not\equiv 0 \pmod{2}$

  • ...


But all but the first three have turned out not to be correct, and those three are not very helpful because you still have a branching factor of 2. You can use the third inequality to prune during a depth- or breadth-first search, but I also can't prove that this yields a "good" algorithm, $O(\log^c n)$ or something like that.
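(To illustrate how such candidates can be tested, here is a small brute-force search for counterexamples to the last identity in the list, compared against exact values from the $O(n)$ DP; this snippet is my own and not part of the original question.)

    # Look for odd multiples of 3 where dividing by 3 first is NOT optimal,
    # i.e. counterexamples to f(n) = 1 + f(n/3) whenever n % 6 == 3.
    def counterexamples(limit):
        INF = float("inf")
        f = [0] * (limit + 1)
        for n in range(2, limit + 1):
            f[n] = 1 + min(f[n - 1],
                           f[n // 2] if n % 2 == 0 else INF,
                           f[n // 3] if n % 3 == 0 else INF)
        return [n for n in range(3, limit + 1, 6) if f[n] != 1 + f[n // 3]]

    # Example: counterexamples(10**5)[:5]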



I understand that it might have something to do with the exponents of 2 and 3 in the prime factorization of $n$, but I can't put my finger on it, since within at most 2 steps you can always reach any equivalence class modulo 2 or 3, which scrambles everything.



Do you have any ideas on how to formalize this or prove useful properties of the function $f$? I'm not only looking for approaches that necessarily lead to an algorithm for larger $n$, but also for general insights that have escaped me so far.










  • This is essentially oeis.org/A056796. They don't give further reference. – benh, Mar 16 '14 at 14:22










  • @benh good to know, that doesn't leave me with high hopes :) – Niklas B., Mar 16 '14 at 14:25










  • I'm a little confused: you say $f(n) = 1+f(n-1) \ \forall n \neq 0 \pmod 6$, but $f(27)=3$ and certainly $f(26)$ isn't 2. Did you mean $\forall n \equiv 1,5 \pmod 6$? – Steven Stadnicki, Mar 21 '14 at 15:56












  • @Steven yes, exactly. Thanks for spotting that – Niklas B., Mar 21 '14 at 16:38










  • It almost certainly doesn't depend on the exponents of 2 and 3 in the prime factorization of $n$ because subtraction by 1 essentially mangles that value. It probably has more to do with the binary and ternary representation of $n$. – Foo Barrigno, Apr 8 '14 at 17:13
















number-theory discrete-mathematics algorithms






asked Mar 15 '14 at 22:12 by Niklas B. · edited May 23 '17 at 12:39 by Community






















3 Answers







Just a bit of statistics: up to a billion, the worst case is 644,972,543 with 44 moves. Up to 4 billion, the worst case is 3,386,105,855 with 48 moves. 99% of all numbers take 36 moves or fewer, and 99.99% take 39 moves or fewer.



The simple algorithm "Divide by 3 if possible, else divide by 2 if possible, else subtract 1" has the worst-case numbers $3 \cdot 2^k - 2$ taking $2k$ steps and $3 \cdot 2^k - 1$ taking $2k + 1$ steps, which is substantially worse.
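(For comparison, a minimal sketch of that greedy rule, in Python; this is my own illustration, not part of the answer.)

    # Greedy: divide by 3 if possible, else divide by 2 if possible, else subtract 1.
    def greedy_moves(n):
        moves = 0
        while n > 1:
            if n % 3 == 0:
                n //= 3
            elif n % 2 == 0:
                n //= 2
            else:
                n -= 1
            moves += 1
        return moves

    # Worst-case family from the answer: 3*2^k - 2 takes 2k greedy steps.
    # Example: greedy_moves(3 * 2**10 - 2) == 20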






– gnasher729, answered Apr 29 '14 at 0:05

It's an elementary DP problem.

For a recursive solution: first, you can run the DP function up to $10^7$ using memoization, in $O(n)$. Then write another, similar function for bigger numbers, this time without memoization; if the number you call it on becomes less than $10^7$, return the value from the first (memoized) function.

Observations: what can the maximum answer be over all these numbers?

  • $2^{30} > 10^9$
  • $3^{19} > 10^9$
  • And you know you can reduce a number $n$ by 1 if $n/2$ and $n/3$ don't give an optimal solution.

Suppose a number $p$ is prime. Then, in the first step, we reduce it to $p-1$ and recurse. This $p-1$ is obviously divisible by 2 or 3; for the worst case, assume it is divisible by 2, so we call $(p-1)/2$. Let $p = (p-1)/2$.

So we need 2 moves to reduce a number to (roughly) its half. If this new $p$ is prime again, we repeat the steps. So what can the maximum answer be? I would guess 60, but in practice it will be at most 40.

Another question: what is the maximum number of moves needed to reduce a number greater than $10^7$ down to $10^7$?

In practice, the first function is $O(n)$ and the second is $O(3^m)$, where $m \le 12$. The second function I referred to can look like this one: http://paste.ubuntu.com/7131241/

Sorry for the bad English.
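(Below is a rough sketch, in Python, of the two-phase approach described in this answer: an exact table up to a cutoff plus plain recursion above it, using only the division branches of the question's recurrence, with the subtract-1 moves folded into the $n \bmod 2$ / $n \bmod 3$ terms. It is my own illustration; the linked paste may look different, and the cutoff can be lowered to save memory.)

    CUTOFF = 10**7          # as suggested above; lower it to reduce memory use

    # Phase 1: exact DP table up to the cutoff, computed bottom-up in O(CUTOFF).
    small = [0] * (CUTOFF + 1)
    for i in range(2, CUTOFF + 1):
        best = small[i - 1] + 1                      # subtract 1
        if i % 2 == 0:
            best = min(best, small[i // 2] + 1)      # divide by 2
        if i % 3 == 0:
            best = min(best, small[i // 3] + 1)      # divide by 3
        small[i] = best

    # Phase 2: plain recursion for larger n, falling back to the table.
    def f(n):
        if n <= CUTOFF:
            return small[n]
        return 1 + min(n % 2 + f(n // 2),    # subtract down to even, then halve
                       n % 3 + f(n // 3))    # subtract down to a multiple of 3, then divide

    # Example: print(f(10**9))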






– imran, answered Mar 21 '14 at 15:35, edited Mar 21 '14 at 16:04













  • This is basically somewhat worse than BFS, which I know. The question is whether it's actually $o(n)$. – Niklas B., Mar 21 '14 at 16:40






















We will use a very straightforward method that produces a solution of order $o(N)$, though still not polynomial in the size of the input.



We will use the following recursive function to calculate the result:



$$m(N) = 1 + \min\left( N \bmod 2 + m\left( \frac{N}{2} \right),\ N \bmod 3 + m\left( \frac{N}{3} \right) \right)$$
$$m(1) = 0$$



The correctness of the above method is obvious: we pretty much try every possible way to reduce the number to 1.



The total number of operations for the above method is given by the following recurrence relation:



$$ T(N) = T\left(\frac{N}{2}\right) + T\left(\frac{N}{3}\right) + O(1) $$



To analyze this relation we cannot use the master theorem, since the relation is not in the appropriate form; we have to use the Akra–Bazzi method. In short, the method says that if we have a recurrence relation of the following form:



$$ T(x) = g(x) + \sum_{i=1}^{k} a_i T( b_i x + c_i ) $$



Then we can find the asymptotic behaviour of $T(x)$ by first determining the constant $p \in \mathbb{R}$ such that $\sum_{i=1}^{k} a_i b_i^p = 1$ and then evaluating the following integral:



$$I(x) = \int_1^x \frac{g(u)}{u^{p+1}} \, du$$



Then we know that $T(x) \in \Theta( x^p ( 1 + I(x) ) )$.



We will now apply this method to our problem. We know that $g(x) \in O(1)$ and furthermore $a_1 = a_2 = 1$, $b_1 = \frac{1}{2}$, $b_2 = \frac{1}{3}$. Evaluating the integral gives us:



$$ I(x) = \int_1^x \frac{g(u)}{u^{p+1}} \, du = \int_1^x u^{-(p+1)} \, du = \frac{-x^{-p}}{p} + \frac{1}{p} = \frac{1 - x^{-p}}{p} $$



Finally, by substituting we get $T(x) \in \Theta( x^p ( 1 + I(x) ) ) = \Theta( x^p )$.



What remains to be done is to find the value of $p$. We solve the equation $2^{-p} + 3^{-p} = 1$ by numerically computing its root (we can use either the Newton–Raphson or the bisection method), and we obtain $p = 0.78788...$
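(A quick numerical check of that root by bisection, in Python; this snippet is illustrative and not part of the original answer.)

    # Solve 2^(-p) + 3^(-p) = 1 for p by bisection on [0, 1].
    def solve_p(lo=0.0, hi=1.0, iters=60):
        g = lambda p: 2.0 ** (-p) + 3.0 ** (-p) - 1.0   # strictly decreasing in p
        for _ in range(iters):
            mid = (lo + hi) / 2.0
            if g(mid) > 0.0:
                lo = mid    # g(mid) > 0: mid is below the root
            else:
                hi = mid
        return (lo + hi) / 2.0

    # Example: solve_p() is approximately 0.78788.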



Therefore, the complexity of our algorithm is of order $O(N^{0.79})$.



For further details on the Akra–Bazzi method, see https://en.wikipedia.org/wiki/Akra%E2%80%93Bazzi_method






– infinity, answered Dec 22 '18 at 15:21













