<h1 align='center'>GREEDY - ALGORITHM - PROBLEMS</h1>

The **Greedy Algorithm** is a problem-solving technique used in optimization problems. It works by making the **locally optimal choice** at each step with the hope that these local choices will lead to a **globally optimal solution**.

In simpler terms, the algorithm chooses the best immediate option without considering the bigger picture or the overall problem space.

### **Key Characteristics of the Greedy Algorithm:**

1. **Greedy Choice Property:**
   - A globally optimal solution can be arrived at by choosing locally optimal solutions at each step.

2. **Optimal Substructure:**
   - A problem exhibits optimal substructure if an optimal solution can be constructed from optimal solutions of its subproblems.

3. **Non-revisiting Decisions:**
   - Once a choice is made, it is never revisited.

### **Steps in the Greedy Algorithm:**
1. **Identify the Problem**:
   - Confirm the problem exhibits the greedy choice property and optimal substructure.

2. **Sort or Arrange Data**:
   - Sort the elements involved based on certain criteria, such as value, weight, or priority.

3. **Iterative Selection**:
   - At each step, pick the most optimal choice.

4. **Check Feasibility**:
   - Verify that the choice meets the constraints of the problem.

5. **Build the Solution**:
   - Continue making choices until the solution is complete.

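The five steps above can be sketched as a small, generic template. The `greedy` function and its `key` and `feasible` parameters below are illustrative names, not part of any standard library:

```python
def greedy(items, key, feasible):
    """Generic greedy skeleton: sort the data, then pick each feasible item."""
    solution = []
    # Step 2: arrange the data by the chosen criterion.
    for item in sorted(items, key=key):
        # Steps 3-4: take the locally best item if it keeps the solution valid.
        if feasible(solution, item):
            solution.append(item)
    # Step 5: the accumulated choices form the final solution.
    return solution


# Hypothetical use: pick the cheapest items that fit within a budget of 10.
prices = [4, 9, 2, 5]
chosen = greedy(prices, key=lambda p: p,
                feasible=lambda sol, p: sum(sol) + p <= 10)
print(chosen)  # [2, 4]
```

Each concrete problem below differs only in the sort criterion and the feasibility check.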
### **Examples of the Greedy Algorithm:**

#### **1. Activity Selection Problem**
**Problem Statement**:
Given a set of activities with their start and end times, select the maximum number of activities that can be performed without overlapping.

**Steps**:
- Sort activities by their end time.
- Always select the activity that finishes first and does not overlap with the previously selected activity.

**Example**:
| Activity | Start Time | End Time |
|---|---|---|
| A1 | 1 | 4 |
| A2 | 3 | 5 |
| A3 | 0 | 6 |
| A4 | 5 | 7 |
| A5 | 8 | 9 |

**Solution**:
- Sort by end time: A1 (4), A2 (5), A3 (6), A4 (7), A5 (9).
- Pick A1 (ends at 4); skip A2 and A3 (both overlap with A1); pick A4 (starts at 5); and finally pick A5 (starts at 8).
- Selected activities: A1, A4, A5.

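The selection above can be sketched in Python (the `select_activities` helper and the `(name, start, end)` tuple layout are illustrative choices):

```python
def select_activities(activities):
    """Greedy activity selection: sort by end time, keep non-overlapping picks."""
    chosen = []
    last_end = float("-inf")
    for name, start, end in sorted(activities, key=lambda a: a[2]):
        if start >= last_end:      # feasible: no overlap with the last pick
            chosen.append(name)
            last_end = end
    return chosen


activities = [("A1", 1, 4), ("A2", 3, 5), ("A3", 0, 6),
              ("A4", 5, 7), ("A5", 8, 9)]
print(select_activities(activities))  # ['A1', 'A4', 'A5']
```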
#### **2. Fractional Knapsack Problem**
**Problem Statement**:
Given weights and values of n items, maximize the total value in a knapsack with a limited capacity W. You can take fractions of an item.

**Steps**:
- Calculate the value-to-weight ratio (v/w) for each item.
- Sort items by this ratio in descending order.
- Take items greedily until the capacity is filled.

**Example**:
| Item | Value (v) | Weight (w) | v/w |
|---|---|---|---|
| I1 | 60 | 10 | 6 |
| I2 | 100 | 20 | 5 |
| I3 | 120 | 30 | 4 |

**Knapsack Capacity**: W = 50

**Solution**:
- Take I1 entirely (w = 10, v = 60).
- Take I2 entirely (w = 20, v = 100).
- Take 2/3 of I3 (w = 20, v = 80).

**Total Value**: 60 + 100 + 80 = 240.

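The greedy steps above can be sketched as follows (the `fractional_knapsack` name and `(value, weight)` tuple layout are illustrative):

```python
def fractional_knapsack(items, capacity):
    """items: (value, weight) pairs; returns the max value, fractions allowed."""
    total = 0.0
    # Sort by value-to-weight ratio, highest first.
    for value, weight in sorted(items, key=lambda i: i[0] / i[1], reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)   # whole item, or the fraction that fits
        total += value * (take / weight)
        capacity -= take
    return total


print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))  # 240.0
```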
#### **3. Huffman Encoding**
**Problem Statement**:
Build a binary tree to encode data such that the overall size of the encoded data is minimized.

**Steps**:
- Build a priority queue with the frequencies of characters.
- Iteratively combine the two smallest frequencies into a single node until one node remains.

**Example**:
| Character | Frequency |
|---|---|
| A | 5 |
| B | 9 |
| C | 12 |
| D | 13 |
| E | 16 |
| F | 45 |

**Solution**:
1. Combine A (5) and B (9) into AB (14).
2. Combine the next two smallest, C (12) and D (13), into CD (25).
3. Combine AB (14) and E (16) into ABE (30).
4. Combine CD (25) and ABE (30) into a node of weight 55.
5. Combine that node with F (45) to form the root (100). Characters closer to the root, such as F, receive shorter codes.

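A minimal sketch of the merging step, using Python's standard `heapq` module. Representing each subtree as a `{char: code}` map is a simplification for illustration; the exact codes depend on how ties are broken, but the total encoded length does not:

```python
import heapq

def huffman_codes(freqs):
    """Build Huffman codes from a {char: frequency} map using a min-heap."""
    # Each heap entry: (frequency, tie_breaker, {char: code_so_far}).
    heap = [(f, i, {ch: ""}) for i, (ch, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # two smallest frequencies
        f2, _, right = heapq.heappop(heap)
        # Prefix left-subtree codes with "0" and right-subtree codes with "1".
        merged = {ch: "0" + code for ch, code in left.items()}
        merged.update({ch: "1" + code for ch, code in right.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]


freqs = {"A": 5, "B": 9, "C": 12, "D": 13, "E": 16, "F": 45}
codes = huffman_codes(freqs)
print(codes)  # F gets the shortest code; A and B get the longest
```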
### **Advantages of Greedy Algorithms**
- **Simplicity**: Easy to implement.
- **Efficiency**: Often faster due to fewer computations (typically O(n log n), dominated by the sorting step).

### **Limitations of Greedy Algorithms**
- **May Not Guarantee Optimal Solutions**:
  - Greedy algorithms don't always produce the global optimum.
  - For example, in the **0/1 Knapsack Problem**, a greedy approach can fail to find the correct solution.

- **Problem Specific**:
  - Requires the problem to exhibit the greedy-choice property and optimal substructure.

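A quick way to see the 0/1 failure is to run greedy-by-ratio on the items from the fractional example, but without fractions, and compare against an exhaustive search:

```python
from itertools import combinations

items = [(60, 10), (100, 20), (120, 30)]   # (value, weight)
capacity = 50

# Greedy by value/weight ratio, whole items only (no fractions allowed).
greedy_value, remaining = 0, capacity
for value, weight in sorted(items, key=lambda i: i[0] / i[1], reverse=True):
    if weight <= remaining:
        greedy_value += value
        remaining -= weight

# Exhaustive search over all subsets gives the true optimum.
best = max(
    sum(v for v, w in combo)
    for r in range(len(items) + 1)
    for combo in combinations(items, r)
    if sum(w for v, w in combo) <= capacity
)

print(greedy_value, best)  # greedy: 160, optimal: 220 (take I2 and I3)
```

The greedy choice of I1 blocks the better combination of I2 and I3, which is why 0/1 Knapsack needs dynamic programming instead.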
### **When to Use Greedy Algorithms?**
- Problems where the greedy choice leads to an optimal solution.
- Scenarios that allow breaking problems into simpler subproblems.
- When performance and simplicity are priorities.