Update 4.2.md #530

Open · wants to merge 1 commit into `main`
docs/Chap04/4.2.md — 68 changes: 61 additions & 7 deletions

@@ -123,17 +123,71 @@

By master theorem, we can find the largest $k$ to satisfy $\log_3 k < \lg 7$ is

> V. Pan has discovered a way of multiplying $68 \times 68$ matrices using $132464$ multiplications, a way of multiplying $70 \times 70$ matrices using $143640$ multiplications, and a way of multiplying $72 \times 72$ matrices using $155424$ multiplications. Which method yields the best asymptotic running time when used in a divide-and-conquer matrix-multiplication algorithm? How does it compare to Strassen's algorithm?

### Analyzing Pan's Methods

Each of Pan's schemes gives a divide-and-conquer matrix-multiplication algorithm with different parameters. Using what we know from the last exercise, we write down the recurrence for each method, compute its exponent $\log_b a$ with the master theorem, pick the method with the smallest exponent, and compare the results to Strassen's algorithm.
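
In general, a scheme that multiplies $b \times b$ matrices using $a$ scalar multiplications yields the recurrence below, which the master theorem (case 1, since each $\log_b a > 2$ here) solves as shown:

$$
T(n) = a \cdot T\left(\frac{n}{b}\right) + \Theta(n^2) \implies T(n) = \Theta\left(n^{\log_b a}\right).
$$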

#### **Method 1: $68 \times 68$ matrices, $132{,}464$ multiplications**
- **Recurrence Relation:**

$$
T(n) = 132{,}464 \cdot T\left(\frac{n}{68}\right) + \Theta(n^2)
$$

- **Parameters:** $a = 132{,}464$, $b = 68$
- **Applying the Master Theorem:**

$$
\log_b a = \frac{\log 132{,}464}{\log 68} \approx 2.7951285
$$

- **Asymptotic Running Time:** $\Theta(n^{2.7951285})$

#### **Method 2: $70 \times 70$ matrices, $143{,}640$ multiplications**
- **Recurrence Relation:**

$$
T(n) = 143{,}640 \cdot T\left(\frac{n}{70}\right) + \Theta(n^2)
$$

- **Parameters:** $a = 143{,}640$, $b = 70$
- **Applying the Master Theorem:**

$$
\log_b a = \frac{\log 143{,}640}{\log 70} \approx 2.7951227
$$

- **Asymptotic Running Time:** $\Theta(n^{2.7951227})$

#### **Method 3: $72 \times 72$ matrices, $155{,}424$ multiplications**
- **Recurrence Relation:**

$$
T(n) = 155{,}424 \cdot T\left(\frac{n}{72}\right) + \Theta(n^2)
$$

- **Parameters:** $a = 155{,}424$, $b = 72$
- **Applying the Master Theorem:**

$$
\log_b a = \frac{\log 155{,}424}{\log 72} \approx 2.7951474
$$

- **Asymptotic Running Time:** $\Theta(n^{2.7951474})$
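
The three exponents (and Strassen's) are easy to check numerically. Here is a minimal Python sketch, not part of the original solution, that computes $\log_b a$ for each method:

```python
import math

# Pan's methods: name -> (submatrix size b, number of multiplications a)
methods = {
    "68x68": (68, 132464),
    "70x70": (70, 143640),
    "72x72": (72, 155424),
}

for name, (b, a) in methods.items():
    # Master theorem exponent: T(n) = a*T(n/b) + Theta(n^2) => Theta(n^(log_b a))
    print(f"{name}: log_b a = {math.log(a, b):.7f}")

# Strassen's algorithm: 7 multiplications of (n/2) x (n/2) submatrices
print(f"Strassen: lg 7 = {math.log2(7):.7f}")
```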

### **Comparison to Strassen’s Algorithm**

Strassen's algorithm has an asymptotic running time of $\Theta(n^{\lg 7}) \approx \Theta(n^{2.8074})$, since $\log_2 7 \approx 2.8074$.

### **Which Method is Best?**

Now, let's compare the exponents:

- **Method 1** ($68 \times 68$): $\log_b a \approx 2.7951285$
- **Method 2** ($70 \times 70$): $\log_b a \approx 2.7951227$
- **Method 3** ($72 \times 72$): $\log_b a \approx 2.7951474$

Since smaller exponents give asymptotically faster algorithms, **Method 2** ($70 \times 70$ matrices using $143{,}640$ multiplications) is the best, and all three methods are asymptotically faster than Strassen's algorithm; the full ranking is shown below.
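
Putting the exponents side by side makes the ranking explicit:

$$
\underbrace{2.7951227}_{70 \times 70} < \underbrace{2.7951285}_{68 \times 68} < \underbrace{2.7951474}_{72 \times 72} < \underbrace{2.8074}_{\text{Strassen: } \lg 7}.
$$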

## 4.2-6
