[Baekjoon] 1629: Multiplication (곱셈)

whitehousechef · November 15, 2023

https://www.acmicpc.net/problem/1629

initial

Divide and conquer is hard to trace in your head because it relies on recursion. Once we hit a base case, we return a value, that value is used by the previous call on the stack, and the results accumulate upward until we are back at the very first call.

For example, say we want to find 2^5 % 7. We know that 2^5 can be broken down into (2^2)^2 * 2, so for odd b the recurrence is (a^(b//2))^2 * a, whereas for even values of b, like 2^4, it is (a^(b//2))^2. Likewise, 2^2 breaks down into (2^1)^2. When b = 1, a to the power of 1 is just a, so we return a % c, which is 2 % 7 = 2. That returned value is squared (2^2 = 4) and reduced, so 4 % 7 = 4. This 4 goes back to the previous call stack of (2^2)^2 * 2, so 4^2 * 2 = 32, and 32 % 7 = 4, which is the answer.
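To make the call stack concrete, here is a small trace sketch of the recursion (the extra depth parameter and the prints are only for illustration; the actual solution is below):

def dac(a, b, c, depth=0):
    # same recursion as the solution below, with prints showing how
    # each returned value flows back up the call stack
    indent = "  " * depth
    print(indent + "dac(%d, %d, %d)" % (a, b, c))
    if b == 1:
        result = a % c
    elif b % 2 == 0:
        result = dac(a, b // 2, c, depth + 1) ** 2 % c
    else:
        result = dac(a, b // 2, c, depth + 1) ** 2 * a % c
    print(indent + "-> %d" % result)
    return result

dac(2, 5, 7)
# dac(2, 5, 7)
#   dac(2, 2, 7)
#     dac(2, 1, 7)
#     -> 2
#   -> 4
# -> 4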

I was confused about why we have to take the modulo of each returned value. The returned values can get very big and cause overflow, so we reduce them each time. But won't the result be affected?

Actually, modular arithmetic tells us that
(x * y) mod c = ((x mod c) * (y mod c)) mod c,
and in particular a^b mod c = ((a mod c)^b) mod c, so reducing every intermediate result does not change the final answer. So np!
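A quick sanity check of that identity in Python (the numbers are just the example above, nothing special):

a, b, c = 2, 5, 7
print(a ** b % c)                     # 4, computed directly
print((a % c) ** b % c)               # 4, reducing a first changes nothing
print(((a ** 2 % c) ** 2 * a) % c)    # 4, reducing every intermediate result, like the recursion does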

solution

A, B, C = map(int, input().split())

def dac(a, b, c):
    # base case: a^1 mod c
    if b == 1:
        return a % c
    # even exponent: a^b = (a^(b//2))^2
    elif b % 2 == 0:
        return dac(a, b // 2, c) ** 2 % c
    # odd exponent: a^b = (a^(b//2))^2 * a
    else:
        return (dac(a, b // 2, c) ** 2 * a) % c

print(dac(A, B, C))
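For reference, Python's built-in three-argument pow does exactly this kind of modular exponentiation, so the recursive answer can be cross-checked against it (reusing A, B, C, and dac from above):

print(pow(A, B, C))                  # built-in modular exponentiation
assert dac(A, B, C) == pow(A, B, C)  # both give the same result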

complexity

Is it 3^n because there are 3 choices?
Oh no, it is O(log n) because we divide b by 2 on each recursion.

The time complexity of the code is O(log B), and the space complexity is O(log B) due to the recursive calls. This is because the code uses a divide-and-conquer approach to modular exponentiation, where the exponent B is halved in each recursive call.

Explanation:

  1. The recursive function dac is called with a, b, and c as its parameters.
  2. If b is equal to 1, the function returns a mod c.
  3. If b is even, it recursively calls itself with a, b // 2, and c, squares the result, and takes it modulo c.
  4. If b is odd, it recursively calls itself with a, b // 2, and c, squares the result, multiplies by a, and takes it modulo c.
  5. The final result is printed.

The space complexity is O(log B) because each recursive call adds a new frame to the call stack, and the maximum depth of the stack is proportional to log B due to the halving of b in each call.

The time complexity is O(log B) because, in each recursive call, the value of b is effectively halved. The number of recursive calls is O(log B), and each call involves basic operations of constant time complexity.
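As a side note, the same O(log B) idea can also be written iteratively by walking over the bits of B, which keeps the time at O(log B) but drops the call-stack usage to O(1) extra space. A minimal sketch (iterative_pow is just an illustrative name, not part of the submitted solution):

def iterative_pow(a, b, c):
    # binary exponentiation: process the bits of b from least significant up
    result = 1
    a %= c
    while b > 0:
        if b % 2 == 1:          # this bit is set, so multiply in the current power of a
            result = result * a % c
        a = a * a % c           # square the base for the next bit
        b //= 2
    return result

print(iterative_pow(2, 5, 7))   # 4, matches the worked example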
