Goal: study CLRS (Introduction to Algorithms, 4th Edition) over 6 months and strengthen algorithm design and analysis skills.
Week | Topics | Practice & Problems |
---|---|---|
Week 1 | Ch. 1: The Role of Algorithms | Summary of core algorithm concepts |
Weeks 2-3 | Ch. 2-3: Sorting, Algorithm Analysis | Implement insertion sort and merge sort |
Weeks 4-5 | Ch. 4: Divide-and-Conquer | Implement Strassen's matrix multiplication |
Week 6 | Ch. 5: Probabilistic Analysis and Randomized Algorithms | Practice with randomness-based algorithms |
Weeks 7-8 | Ch. 6-9: Heapsort, Quicksort, Sorting in Linear Time, Selection | Implement and compare |
Weeks 9-10 | Ch. 10-13: Data Structures (hash tables, BSTs, red-black trees) | Problems using these data structures |
Weeks 11-12 | Ch. 14-16: Dynamic Programming, Greedy Algorithms, Amortized Analysis | DP and Huffman coding practice |
Weeks 13-15 | Ch. 17-19: Advanced Data Structures | Implement B-trees and disjoint sets |
Weeks 16-18 | Ch. 20-25: Graph Algorithms | BFS, DFS, Dijkstra, MST |
Weeks 19-21 | Ch. 26-28: Parallel Algorithms, Online Algorithms, Matrix Operations | Matrix computation and parallel processing |
Weeks 22-24 | Ch. 29-35: Machine Learning, NP-Completeness, Approximation Algorithms | Gradient descent, tackling NP problems |
What is an algorithm?
Why algorithms matter
Algorithms and technology
Real-world applications of algorithms
Challenges for algorithms
🔥 Next step: start studying Chapter 2 (insertion sort, algorithm analysis)! 🚀
If there is anything in this plan or summary you would like to change or expand, let me know!
Gyver Tc, [2/17/2025 1:24 PM]
Insertion sort: the concept
Insertion sort algorithm (Python)
def insertion_sort(arr):
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # Shift elements greater than key one position to the right
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key
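As a quick sanity check, here is the same function run end to end; the sample array is my own choice, not from the text:

```python
# Demo of insertion_sort on a small sample array (repeated here so the
# snippet runs standalone).
def insertion_sort(arr):
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key

data = [5, 2, 4, 6, 1, 3]
insertion_sort(data)
print(data)  # [1, 2, 3, 4, 5, 6]
```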
Input size
Asymptotic notation
Analyzing recursive algorithms
Practical applications
✅ [Practice 1] Implement insertion sort and compare running times
✅ [Practice 2] Big-O analysis drills (O, Ω, Θ notation)
✅ [Practice 3] Apply the master theorem to analyze divide-and-conquer algorithms
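For Practice 1, a possible timing harness might look like this; the input sizes and the use of random data are my own choices, not part of the plan:

```python
# Rough timing sketch for Practice 1: measure insertion sort on growing
# inputs. For a quadratic algorithm, time should roughly quadruple as n
# doubles.
import random
import time

def insertion_sort(arr):
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key

for n in (500, 1000, 2000):
    data = [random.randrange(n) for _ in range(n)]
    start = time.perf_counter()
    insertion_sort(data)
    elapsed = time.perf_counter() - start
    print(f"n={n}: {elapsed:.4f}s")
```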
Moving on to Chapter 4 (divide-and-conquer), we can cover more efficient algorithms!
If anything is unclear or you have more questions, let me know 😊
Gyver Tc, [2/17/2025 1:26 PM]
Goal: understand the divide-and-conquer technique and build the ability to implement efficient algorithms
def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    return merge(left, right)

def merge(left, right):
    result = []
    i = j = 0
    # Take from left on ties ('<=') to keep the sort stable
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result
arr = [38, 27, 43, 3, 9, 82, 10]
print("Sorted:", merge_sort(arr))
✅ A stable sorting algorithm with guaranteed O(n log n) performance
import random

def randomized_quick_sort(arr):
    if len(arr) <= 1:
        return arr
    pivot = random.choice(arr)
    left = [x for x in arr if x < pivot]
    middle = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    return randomized_quick_sort(left) + middle + randomized_quick_sort(right)
arr = [38, 27, 43, 3, 9, 82, 10]
print("Sorted:", randomized_quick_sort(arr))
✅ Random pivot selection reduces the chance of worst-case O(n²) behavior
Ready to move on to weeks 6-7 (heapsort & sorting in linear time)!
More questions? Just ask 😊
Gyver Tc, [2/17/2025 1:28 PM]
Goal: learn efficient sorting algorithms and master the heap data structure and linear-time sorting methods
What is a heap?
The heapsort algorithm
Heapsort implementation (max-heap)
def heapify(arr, n, i):
    largest = i
    left = 2 * i + 1
    right = 2 * i + 2
    if left < n and arr[left] > arr[largest]:
        largest = left
    if right < n and arr[right] > arr[largest]:
        largest = right
    if largest != i:
        arr[i], arr[largest] = arr[largest], arr[i]
        heapify(arr, n, largest)

def heap_sort(arr):
    n = len(arr)
    for i in range(n // 2 - 1, -1, -1):  # Build heap
        heapify(arr, n, i)
    for i in range(n - 1, 0, -1):  # Extract elements
        arr[i], arr[0] = arr[0], arr[i]
        heapify(arr, i, 0)
arr = [4, 10, 3, 5, 1]
heap_sort(arr)
print("Sorted:", arr)
✅ A comparison-based sorting algorithm with guaranteed O(n log n) performance
✅ Fast with consistent performance; heapsort is an in-place sort, but not a stable one
Comparison-based sorting vs. sorting in linear time
Counting sort - O(n + k)
Usable when the values fall within a limited range
Not applicable (as written) to negative numbers; effective when there are many duplicate values
Implementation:
def counting_sort(arr):
    max_val = max(arr)
    count = [0] * (max_val + 1)
    output = [0] * len(arr)
    for num in arr:
        count[num] += 1
    for i in range(1, len(count)):
        count[i] += count[i - 1]  # Prefix sums give final positions
    for num in reversed(arr):
        output[count[num] - 1] = num
        count[num] -= 1
    return output
arr = [4, 2, 2, 8, 3, 3, 1]
print("Sorted:", counting_sort(arr))
✅ Guarantees O(n + k) performance, but the value range is restricted (non-negative integers only)
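As written, the counting sort above cannot handle negative values. One standard workaround, sketched here as my own addition, shifts every value by the minimum before counting:

```python
# Counting sort extended to negative integers by offsetting each value
# with min(arr). This keeps O(n + k) behavior, where k is the value range.
def counting_sort_signed(arr):
    if not arr:
        return []
    lo, hi = min(arr), max(arr)
    count = [0] * (hi - lo + 1)
    for num in arr:
        count[num - lo] += 1  # index shifted so lo maps to 0
    result = []
    for value, c in enumerate(count, start=lo):
        result.extend([value] * c)
    return result

print(counting_sort_signed([4, -2, 2, -8, 3, 3, 1]))  # [-8, -2, 1, 2, 3, 3, 4]
```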
Radix sort - O(d · n)
Sorts digit by digit, starting from the least significant digit
Sorts without direct element comparisons
Implementation:
def counting_sort_radix(arr, exp):
    n = len(arr)
    output = [0] * n
    count = [0] * 10
    for num in arr:
        index = (num // exp) % 10
        count[index] += 1
    for i in range(1, 10):
        count[i] += count[i - 1]
    for i in reversed(range(n)):
        index = (arr[i] // exp) % 10
        output[count[index] - 1] = arr[i]
        count[index] -= 1
    for i in range(n):
        arr[i] = output[i]

def radix_sort(arr):
    max_val = max(arr)
    exp = 1
    while max_val // exp > 0:
        counting_sort_radix(arr, exp)
        exp *= 10
arr = [170, 45, 75, 90, 802, 24, 2, 66]
radix_sort(arr)
print("Sorted:", arr)
✅ O(n) when the number of digits d is small; no support for negative numbers
Chapter 6 (heapsort):
Chapter 7 (sorting in linear time):
Ready to move on to weeks 8-9 (quicksort & selection algorithms)!
More questions? Just ask 😊
Gyver Tc, [2/17/2025 1:32 PM]
Goal: learn quicksort and the selection algorithm, and master running-time analysis and efficient selection techniques
import random

def quick_sort(arr):
    if len(arr) <= 1:
        return arr
    pivot = random.choice(arr)
    left = [x for x in arr if x < pivot]
    middle = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    return quick_sort(left) + middle + quick_sort(right)
arr = [38, 27, 43, 3, 9, 82, 10]
print("Sorted:", quick_sort(arr))
✅ Uses a random pivot → maintains O(n log n) on average
✅ Fast, but not a stable sort
import random

def randomized_select(arr, k):
    # Returns the k-th smallest element (0-indexed)
    if len(arr) == 1:
        return arr[0]
    pivot = random.choice(arr)
    left = [x for x in arr if x < pivot]
    middle = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    if k < len(left):
        return randomized_select(left, k)
    elif k < len(left) + len(middle):
        return pivot
    else:
        return randomized_select(right, k - len(left) - len(middle))
arr = [38, 27, 43, 3, 9, 82, 10]
k = 3  # Find the 3rd smallest element
print(f"{k}rd smallest element:", randomized_select(arr, k - 1))
✅ Maintains O(n) performance on average
✅ Useful when a full sort is unnecessary
Ready for weeks 10-11 (heap- and tree-based sorting, hash tables)! 🚀
More questions? Just ask! 😊
Gyver Tc, [2/17/2025 1:33 PM]
Goal: learn heap- and tree-based sorting and the hash table data structure, and master efficient data lookup and sorting techniques
def heapify(arr, n, i):
    largest = i
    left = 2 * i + 1
    right = 2 * i + 2
    if left < n and arr[left] > arr[largest]:
        largest = left
    if right < n and arr[right] > arr[largest]:
        largest = right
    if largest != i:
        arr[i], arr[largest] = arr[largest], arr[i]
        heapify(arr, n, largest)

def heap_sort(arr):
    n = len(arr)
    for i in range(n // 2 - 1, -1, -1):  # Build the heap
        heapify(arr, n, i)
    for i in range(n - 1, 0, -1):  # Perform the sort
        arr[i], arr[0] = arr[0], arr[i]
        heapify(arr, i, 0)
arr = [4, 10, 3, 5, 1]
heap_sort(arr)
print("Sorted:", arr)
✅ Guaranteed O(n log n) performance
✅ Sorting built on the heap data structure
class HashTable:
    def __init__(self, size=10):
        self.size = size
        self.table = [[] for _ in range(size)]

    def _hash(self, key):
        return hash(key) % self.size

    def insert(self, key, value):
        index = self._hash(key)
        for pair in self.table[index]:
            if pair[0] == key:
                pair[1] = value
                return
        self.table[index].append([key, value])

    def get(self, key):
        index = self._hash(key)
        for pair in self.table[index]:
            if pair[0] == key:
                return pair[1]
        return None

    def delete(self, key):
        index = self._hash(key)
        self.table[index] = [pair for pair in self.table[index] if pair[0] != key]
ht = HashTable()
ht.insert("apple", 100)
ht.insert("banana", 200)
print("Apple Price:", ht.get("apple"))
ht.delete("apple")
print("Apple Price after deletion:", ht.get("apple"))
✅ Uses chaining to resolve collisions
✅ Fast lookups (O(1) on average)
Chapter 10 (heaps and tree-based sorting):
Chapter 11 (hash tables):
Ready for weeks 12-13 (binary search trees & red-black trees)! 🚀
📚 Weeks 12-24 summary (CLRS 4th ed., Chapters 12-35)
Goal: learn advanced data structures, graph algorithms, NP-completeness, and approximation algorithms, and build practical problem-solving skills
🔹 Weeks 12-15: Binary search trees & balanced trees
Chapter 12: Binary Search Trees (BST)
✅ Properties
Left subtree < root < right subtree
Search, insert, and delete run in O(h) (h = tree height)
✅ Implementing the operations (BST insert, delete, search)
class TreeNode:
    def __init__(self, key):
        self.key = key
        self.left = self.right = None

class BST:
    def __init__(self):
        self.root = None

    def insert(self, key):
        def _insert(node, key):
            if not node:
                return TreeNode(key)
            if key < node.key:
                node.left = _insert(node.left, key)
            else:
                node.right = _insert(node.right, key)
            return node
        self.root = _insert(self.root, key)

    def search(self, key):
        def _search(node, key):
            if not node or node.key == key:
                return node
            return _search(node.left, key) if key < node.key else _search(node.right, key)
        return _search(self.root, key)
bst = BST()
bst.insert(10)
bst.insert(5)
bst.insert(15)
print("Found:", bst.search(5) is not None)
Chapter 13: Red-Black Trees
✅ The problem with plain BSTs: performance degrades when the height reaches O(n)
✅ Red-black trees maintain balance → O(log n) guaranteed
✅ Balancing rules: every node is red or black; the root is black; every leaf (NIL) is black; a red node has only black children; every path from a node down to its descendant leaves contains the same number of black nodes
🔹 Weeks 16-18: Dynamic programming & greedy algorithms
Chapter 14: Dynamic Programming
✅ Avoids recomputing overlapping subproblems → running time can often be reduced to O(n) or O(n²)
✅ Fibonacci example (top-down with memoization)
def fib(n, memo={}):
    if n in memo:
        return memo[n]
    if n <= 2:
        return 1
    memo[n] = fib(n - 1, memo) + fib(n - 2, memo)
    return memo[n]
print(fib(50))
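Alongside the top-down version, a bottom-up sketch (my own addition) computes the same values iteratively and avoids recursion entirely:

```python
# Bottom-up Fibonacci: iterative DP in O(n) time and O(1) space,
# using the same indexing convention as above (fib(1) = fib(2) = 1).
def fib_bottom_up(n):
    if n <= 2:
        return 1
    a, b = 1, 1
    for _ in range(n - 2):
        a, b = b, a + b
    return b

print(fib_bottom_up(50))  # 12586269025, same as the memoized version
```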
✅ Longest common subsequence (LCS)
def lcs(X, Y):
    m, n = len(X), len(Y)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if X[i - 1] == Y[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]
print(lcs("AGGTAB", "GXTXAYB"))
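The same dp table can recover an actual subsequence, not just its length; here is one possible backtracking sketch (my addition, not from the summary):

```python
# Reconstruct one LCS by walking the dp table backwards from dp[m][n].
def lcs_string(X, Y):
    m, n = len(X), len(Y)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if X[i - 1] == Y[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    out = []
    i, j = m, n
    while i > 0 and j > 0:
        if X[i - 1] == Y[j - 1]:
            out.append(X[i - 1])  # matched character is part of the LCS
            i -= 1
            j -= 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return "".join(reversed(out))

print(lcs_string("AGGTAB", "GXTXAYB"))  # GTAB
```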
🔹 Weeks 19-21: Graph algorithms
Chapter 20: Graph traversal (BFS, DFS)
✅ Breadth-first search (BFS)
from collections import deque

def bfs(graph, start):
    visited, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        if node not in visited:
            print(node, end=" ")
            visited.add(node)
            queue.extend(graph[node] - visited)
graph = {0: {1, 2}, 1: {2}, 2: {0, 3}, 3: {3}}
bfs(graph, 2)
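The study plan lists DFS alongside BFS, but only BFS is shown above; a minimal recursive DFS sketch over the same adjacency-set representation (my own addition) could look like this:

```python
# Depth-first search over a set-based adjacency dict, returning the set
# of vertices reachable from start.
def dfs(graph, start, visited=None):
    if visited is None:
        visited = set()
    visited.add(start)
    for neighbor in graph[start]:
        if neighbor not in visited:
            dfs(graph, neighbor, visited)
    return visited

graph = {0: {1, 2}, 1: {2}, 2: {0, 3}, 3: {3}}
print(dfs(graph, 2))  # all four vertices are reachable from 2
```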
✅ Shortest paths (Dijkstra)
import heapq

def dijkstra(graph, start):
    pq, distances = [(0, start)], {v: float('inf') for v in graph}
    distances[start] = 0
    while pq:
        cur_dist, node = heapq.heappop(pq)
        if cur_dist > distances[node]:
            continue
        for neighbor, weight in graph[node].items():
            distance = cur_dist + weight
            if distance < distances[neighbor]:
                distances[neighbor] = distance
                heapq.heappush(pq, (distance, neighbor))
    return distances
graph = {0: {1: 4, 2: 1}, 1: {3: 1}, 2: {1: 2, 3: 5}, 3: {}}
print(dijkstra(graph, 0))
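The weekly plan also mentions MST, but no code appears for it; a sketch of Prim's algorithm using the same heapq style (the undirected example graph is my own) might look like this:

```python
# Prim's algorithm for a minimum spanning tree over an undirected,
# weighted graph given as {vertex: {neighbor: weight}}.
import heapq

def prim(graph, start):
    visited = {start}
    edges = [(w, start, v) for v, w in graph[start].items()]
    heapq.heapify(edges)
    mst, total = [], 0
    while edges and len(visited) < len(graph):
        w, u, v = heapq.heappop(edges)
        if v in visited:
            continue  # edge would create a cycle
        visited.add(v)
        mst.append((u, v, w))
        total += w
        for nxt, nw in graph[v].items():
            if nxt not in visited:
                heapq.heappush(edges, (nw, v, nxt))
    return mst, total

g = {
    'A': {'B': 2, 'C': 3},
    'B': {'A': 2, 'C': 1, 'D': 4},
    'C': {'A': 3, 'B': 1, 'D': 5},
    'D': {'B': 4, 'C': 5},
}
mst, total = prim(g, 'A')
print(mst, total)  # total MST weight is 7
```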
🔹 Weeks 22-24: NP-completeness and approximation algorithms
Chapter 34: NP-Complete Problems
✅ P vs. NP
P: solvable in polynomial time
NP: solutions are quick to verify but appear hard to find
NP-complete: problems to which every problem in NP can be reduced (e.g., the traveling-salesperson problem, the knapsack problem)
✅ Approaches to NP-complete problems
✅ Greedy approximation for the knapsack problem
def knapsack(weights, values, W):
    # Greedy by value-to-weight ratio; this solves the *fractional*
    # knapsack exactly (items may be split at the capacity boundary)
    ratio = sorted(zip(weights, values), key=lambda x: x[1] / x[0], reverse=True)
    total_value, cur_weight = 0, 0
    for w, v in ratio:
        if cur_weight + w <= W:
            cur_weight += w
            total_value += v
        else:
            fraction = (W - cur_weight) / w
            total_value += v * fraction
            break
    return total_value
print(knapsack([10, 20, 30], [60, 100, 120], 50))
✅ Approximation algorithms solve NP-complete problems quickly, but the answer may not be optimal
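For contrast with the greedy fractional solution above, the 0/1 variant (items cannot be split) can be solved exactly with DP in pseudo-polynomial O(nW) time; this sketch is my own addition:

```python
# Exact 0/1 knapsack via dynamic programming: dp[i][w] is the best value
# using the first i items with capacity w.
def knapsack_01(weights, values, W):
    n = len(weights)
    dp = [[0] * (W + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for w in range(W + 1):
            dp[i][w] = dp[i - 1][w]  # skip item i
            if weights[i - 1] <= w:  # or take item i whole
                dp[i][w] = max(dp[i][w], dp[i - 1][w - weights[i - 1]] + values[i - 1])
    return dp[n][W]

# Same data as the greedy demo: fractional gives 240.0, but the best
# whole-item packing is worth only 220.
print(knapsack_01([10, 20, 30], [60, 100, 120], 50))  # 220
```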
📚 Weeks 12-24 overall summary
📝 Practice problems
1️⃣ What is the time complexity of searching for a value in a BST?
O(h); O(n) in the worst case (when the tree is unbalanced)
2️⃣ What is the time complexity of Dijkstra's algorithm?
O((V + E) log V) (when using a priority queue)
3️⃣ Can NP-complete problems be solved in polynomial time?
Not with any known algorithm (possible only if P = NP were resolved affirmatively)
Here's a summary of Chapters 1 and 2 from the algorithms text:
These chapters provide foundational concepts for understanding algorithms, data structures, and computational challenges. If you have any specific questions or need more details, feel free to ask! ๐
NP-complete problems have unique characteristics that set them apart from other algorithmic problems:
Efficiency: For most algorithmic problems, efficient (polynomial-time) algorithms exist that can solve the problem quickly. NP-complete problems, however, do not have any known efficient algorithms. Solving an NP-complete problem in a reasonable amount of time is currently considered infeasible.
Verification vs. Solving: While finding a solution to an NP-complete problem might be hard, verifying a given solution is relatively easy. In other words, if someone provides a solution, you can check its correctness quickly, even though finding the solution may take an impractically long time.
Interconnectedness: One remarkable property of NP-complete problems is that they are interconnected. If an efficient algorithm is found for any one NP-complete problem, it can be adapted to solve all other NP-complete problems efficiently as well. This is why finding an efficient solution for any NP-complete problem is so tantalizing.
Real-World Applications: Many NP-complete problems arise frequently in real-world applications, making them highly relevant. For example, the traveling-salesperson problem (TSP), which seeks the shortest possible route visiting a set of locations and returning to the starting point, is an NP-complete problem with practical implications for logistics and delivery services.
Approximation Algorithms: Due to the difficulty in solving NP-complete problems exactly, researchers often develop approximation algorithms. These algorithms provide solutions that are close to optimal but are found much more efficiently. This approach is often used in practice to handle NP-complete problems.
In summary, NP-complete problems stand out due to their computational complexity and their significant impact on both theoretical computer science and practical applications. If you have any more questions or need further clarification, feel free to ask! ๐
Describe a real-world example that requires sorting, and an example that requires finding the shortest distance between two points.
Other than speed, what other measures of efficiency might matter in a real-world setting?
Select a data structure you have seen before and discuss its strengths and limitations.
How are the shortest-path and traveling-salesperson problems mentioned above similar? How are they different?
Suggest a real-world problem in which only the best solution will do, and one in which a solution that is approximately the best is good enough.
Describe a real-world problem in which sometimes the entire input is available before the problem must be solved, and sometimes the input arrives over time.
If you have any other questions, feel free to ask anytime! 😊
Let's break down and solve each of the exercises and problems step by step.
Question:
Give an example of an application that requires algorithmic content at the application level, and discuss the function of the algorithms involved.
Answer:
An example of such an application is Google Maps.
Question:
Suppose that for inputs of size n on a particular computer, insertion sort runs in 8n² steps and merge sort runs in 64n lg n steps (lg = log base 2). For which values of n does insertion sort beat merge sort?
Answer:
We need to find the values of n for which:

8n² < 64n lg n

Dividing both sides by 8n (valid for n > 0):

n < 8 lg n

To solve this, we test values of n:

n | 8 lg n | Comparison (n < 8 lg n) |
---|---|---|
1 | 0 | 1 < 0 → False |
2 | 8 | 2 < 8 → True |
10 | 8 lg 10 ≈ 26.6 | 10 < 26.6 → True |
50 | 8 lg 50 ≈ 45.1 | 50 < 45.1 → False |

By testing further (8 lg 43 ≈ 43.4 > 43, while 8 lg 44 ≈ 43.7 < 44), we find that insertion sort beats merge sort for 2 ≤ n ≤ 43.
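The crossover claim can be double-checked numerically; this quick script (my own, using base-2 logarithms as CLRS's lg) enumerates the winning range:

```python
# Verify for which n insertion sort (8n^2 steps) beats
# merge sort (64 n lg n steps).
import math

wins = [n for n in range(2, 100) if 8 * n**2 < 64 * n * math.log2(n)]
print(min(wins), max(wins))  # 2 43
```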
Question:
What is the smallest value of n such that an algorithm whose running time is 100n² runs faster than an algorithm whose running time is 2ⁿ on the same machine?
Answer:
We need to find the smallest n for which:

100n² < 2ⁿ

Test values of n:

n | 100n² | 2ⁿ | Comparison (100n² < 2ⁿ) |
---|---|---|---|
1 | 100 | 2 | 100 < 2 → False |
10 | 10,000 | 1,024 | 10,000 < 1,024 → False |
14 | 19,600 | 16,384 | 19,600 < 16,384 → False |
15 | 22,500 | 32,768 | 22,500 < 32,768 → True |

Since the inequality fails at n = 14 and 2ⁿ grows faster than 100n² from there on, the smallest such n is 15.
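This answer can likewise be verified with a tiny search loop (my own sketch):

```python
# Find the smallest n with 100*n^2 < 2^n by direct search.
n = 1
while not (100 * n**2 < 2**n):
    n += 1
print(n)  # 15
```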
Question:
For each function f(n) and time t in the following table, determine the largest size n of a problem that can be solved in time t, assuming that the algorithm to solve the problem takes f(n) microseconds.
Let's illustrate the operation of Insertion Sort on the array initially containing the sequence
⟨31, 41, 59, 26, 41, 58⟩.
Insertion Sort works by iteratively inserting each element into its correct position in the sorted portion of the array.
Initial Array:
[31, 41, 59, 26, 41, 58]
After 1st Iteration (i = 1, key = 41): [31, 41, 59, 26, 41, 58] (already in place)
After 2nd Iteration (i = 2, key = 59): [31, 41, 59, 26, 41, 58] (already in place)
After 3rd Iteration (i = 3, key = 26): [26, 31, 41, 59, 41, 58]
After 4th Iteration (i = 4, key = 41): [26, 31, 41, 41, 59, 58]
After 5th Iteration (i = 5, key = 58): [26, 31, 41, 41, 58, 59]
Final sorted array (five iterations suffice for six elements):
[26, 31, 41, 41, 58, 59]
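The trace above can be reproduced by printing the array after each outer-loop pass; this is a small instrumented variant (my own) of the insertion sort shown earlier:

```python
# Print the array after each insertion step to reproduce the trace.
def insertion_sort_trace(arr):
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key
        print(f"after i={i}: {arr}")

data = [31, 41, 59, 26, 41, 58]
insertion_sort_trace(data)
```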
The SUM-ARRAY procedure computes the sum of the numbers in the array A[1 : n]. Here is the procedure:
SUM-ARRAY(A, n)
    sum = 0
    for i = 1 to n
        sum = sum + A[i]
    return sum
A loop invariant is a condition that is true before and after each iteration of the loop. For the SUM-ARRAY procedure, the loop invariant is:
At the start of each iteration of the for loop, the variable sum contains the sum of the elements in the subarray A[1 : i-1].
Initialization: Before the first iteration, i = 1 and sum = 0, which is the sum of the empty subarray A[1 : 0], so the invariant holds.
Maintenance: Assuming the invariant holds at the start of an iteration, the loop body adds A[i] to sum, so sum then contains the sum of A[1 : i]; when i is incremented for the next iteration, the invariant holds again.
Termination: The loop terminates when i = n + 1; the invariant then says sum contains the sum of A[1 : n].
The loop invariant ensures that the SUM-ARRAY procedure correctly computes the sum of the numbers in the array A[1 : n]. By the properties of initialization, maintenance, and termination, the procedure is proven to be correct.
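A Python rendering of SUM-ARRAY that asserts the invariant at runtime (my own sketch, using 0-based indexing) makes the proof concrete:

```python
# Python version of SUM-ARRAY with the loop invariant checked each pass.
def sum_array(A):
    total = 0
    for i in range(len(A)):
        # Invariant: total equals the sum of A[0 : i]
        assert total == sum(A[:i])
        total += A[i]
    return total

print(sum_array([3, 1, 4, 1, 5]))  # 14
```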
Gyver Tc, [2/17/2025 1:47 PM]
To modify the Insertion Sort algorithm to sort in monotonically decreasing order, we need to change the comparison condition. Instead of inserting elements into the correct position in an increasing sequence, we insert them into the correct position in a decreasing sequence.
INSERTION-SORT-DECREASING(A)
    for j = 2 to A.length
        key = A[j]
        i = j - 1
        // Insert A[j] into the sorted sequence A[1 : j-1] in decreasing order
        while i > 0 and A[i] < key
            A[i + 1] = A[i]
            i = i - 1
        A[i + 1] = key
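A Python version of this pseudocode (0-indexed, my own translation) might be:

```python
# Insertion sort in monotonically decreasing order: flipping the
# comparison from '>' to '<' reverses the output order.
def insertion_sort_decreasing(arr):
    for j in range(1, len(arr)):
        key = arr[j]
        i = j - 1
        while i >= 0 and arr[i] < key:
            arr[i + 1] = arr[i]
            i -= 1
        arr[i + 1] = key

data = [5, 2, 4, 6, 1, 3]
insertion_sort_decreasing(data)
print(data)  # [6, 5, 4, 3, 2, 1]
```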
LINEAR-SEARCH(A, x)
    for i = 1 to A.length
        if A[i] == x
            return i
    return NIL
The loop invariant for the LINEAR-SEARCH algorithm is:
At the start of each iteration of the for loop, the element x does not appear in the subarray A[1 : i-1].
Initialization: Before the first iteration, i = 1, so the subarray A[1 : 0] is empty and trivially does not contain x.
Maintenance: If A[i] were equal to x, the procedure would return i; since the loop proceeds, x is not in A[1 : i], and after i is incremented the invariant holds again.
Termination: If the loop finishes without returning, then i = A.length + 1 and, by the invariant, x does not appear anywhere in A[1 : A.length], so returning NIL is correct.
The problem involves adding two n-bit binary integers stored in arrays A and B and storing the result in an (n + 1)-bit array C.
ADD-BINARY-INTEGERS(A, B, n)
    let C[0 : n] be a new array
    carry = 0
    for i = n - 1 downto 0
        sum = A[i] + B[i] + carry
        C[i + 1] = sum mod 2      // store the low bit of the sum
        carry = floor(sum / 2)    // the high bit becomes the carry
    C[0] = carry
    return C
Initialization: carry starts at 0, and C[0 : n] provides n + 1 slots for the result.
Loop: iterating from the least significant bit (i = n - 1) up to the most significant (i = 0), each step adds A[i] + B[i] + carry; the low bit of the sum is stored in C[i + 1] and the high bit becomes the new carry.
Final Carry: after the loop, any remaining carry is the most significant bit and is stored in C[0].
Return: C holds the (n + 1)-bit sum.
Input:
A = [1, 0, 1, 1] (binary number 1011, decimal 11)
B = [1, 1, 0, 1] (binary number 1101, decimal 13)
n = 4
Execution:
i = 3: 1 + 1 + 0 = 2 → C[4] = 0, carry = 1
i = 2: 1 + 0 + 1 = 2 → C[3] = 0, carry = 1
i = 1: 0 + 1 + 1 = 2 → C[2] = 0, carry = 1
i = 0: 1 + 1 + 1 = 3 → C[1] = 1, carry = 1
Finally, C[0] = carry = 1.
Output:
C = [1, 1, 0, 0, 0] (binary number 11000, decimal 24)
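The procedure can be checked directly in Python (my own translation of the pseudocode, with A[0] as the most significant bit):

```python
# Python check of ADD-BINARY-INTEGERS: A and B are n-bit arrays with the
# most significant bit first; the (n+1)-bit sum is returned in C.
def add_binary_integers(A, B, n):
    C = [0] * (n + 1)
    carry = 0
    for i in range(n - 1, -1, -1):
        s = A[i] + B[i] + carry
        C[i + 1] = s % 2   # low bit of the column sum
        carry = s // 2     # high bit carries into the next column
    C[0] = carry
    return C

print(add_binary_integers([1, 0, 1, 1], [1, 1, 0, 1], 4))  # [1, 1, 0, 0, 0]
```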
Shortest path algorithms have a wide range of practical applications across various fields. Here are some key examples:
Navigation Systems:
Network Routing:
Transportation and Logistics:
Robotics and Autonomous Vehicles:
Supply Chain Management:
Emergency Response:
Social Networks:
Urban Planning:
Healthcare:
These applications showcase the versatility and importance of shortest path algorithms in solving real-world problems efficiently. If you have any more questions or need further details, feel free to ask! ๐