Objective: Minimize two terms using gradient descent.
Update the variable ( y_i ) iteratively by assigning it its previous value plus a term proportional to the deviation of the original point from the current value. Weight this term with a weighting factor, ( \alpha ), set to 0.5.
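Written out, the update described above is plausibly (a reconstruction from the description, with ( x_i ) the original point and ( y_i ) its smoothed copy):
( y_i \leftarrow y_i + \alpha \, (x_i - y_i) ), with ( \alpha = 0.5 ).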
Implementation strategy:
Retain the old values of ( y ) so each update can reference them.
Move ( y_i ) towards the neighboring value ( y_{i+1} ), stepping it away from its current value.
An improved approach combines terms from both directions:
Aim to make ( y_{i} ) close to both ( y_{i-1} ) and ( y_{i+1} ).
Optimize this combined term, setting ( \beta ) to 0.1.
Note: Do not apply this update to the first or last node in the sequence.
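Combining both terms and skipping the endpoints, the full update is plausibly (again a reconstruction from the description):
( y_i \leftarrow y_i + \alpha \, (x_i - y_i) + \beta \, (y_{i+1} + y_{i-1} - 2 y_i) ), for ( 0 < i < n - 1 ), with ( \beta = 0.1 ).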
Path Description: Given a 5x5 grid, the path runs from (0,0) to (4,4), moving right, then down, then right again.
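One plausible encoding of this path as a list of grid points (an assumption; the exact waypoints are not spelled out in the notes):

```python
# Hypothetical path on the 5x5 grid: right along the first row,
# down the middle column, then right again to the corner.
path = [[0, 0], [0, 1], [0, 2],
        [1, 2], [2, 2], [3, 2], [4, 2],
        [4, 3], [4, 4]]
```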
Function to Implement:
The function smooth takes the path, the weighting factors, and a tolerance value.
It creates a new path as a deep copy of the old path.
The nested smoother should apply the update equations iteratively, excluding the first and last nodes, until the total observed change in one pass is smaller than the specified tolerance.
Compute a new path using the smooth function and verify it by uncommenting the print routine that outputs the result.
Both the original path and the modified path must keep the positions of the first and last nodes unchanged.
The in-between values should show a smoother transition: the original path holds constant values over its initial steps, while the smoothed path adjusts them into a more gradual slope, demonstrating the smoothing effect.
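A minimal sketch of such a smooth function, under the assumptions above; the parameter names weight_data, weight_smooth, and tolerance, and the example waypoints, are illustrative rather than taken from the notes:

```python
from copy import deepcopy

def smooth(path, weight_data=0.5, weight_smooth=0.1, tolerance=0.000001):
    # Work on a deep copy so the original path stays available as the
    # data term's reference and is never modified.
    newpath = deepcopy(path)
    change = tolerance
    while change >= tolerance:
        change = 0.0
        for i in range(1, len(path) - 1):    # exclude first and last nodes
            for j in range(len(path[0])):    # each coordinate of the point
                old = newpath[i][j]
                # Data term pulls toward the original point; smoothing term
                # pulls toward the average of the two neighbors.
                newpath[i][j] += weight_data * (path[i][j] - newpath[i][j]) \
                    + weight_smooth * (newpath[i + 1][j] + newpath[i - 1][j]
                                       - 2.0 * newpath[i][j])
                change += abs(old - newpath[i][j])
    return newpath

path = [[0, 0], [0, 1], [0, 2],
        [1, 2], [2, 2], [3, 2], [4, 2],
        [4, 3], [4, 4]]
newpath = smooth(path)
```

With ( \alpha + 2\beta < 1 ) each pass is a contraction, so the loop terminates; the endpoints are untouched while interior points drift toward a gradual slope.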
Corrections to Code:
Replace the ending coordinate (4,4) with (4,2).
Correct the signs from negative to positive in the gradient descent updates so that the iteration converges.
Iterate over each coordinate of the 2D points (the individual components such as ( x_i )) for a simpler implementation.
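A one-dimensional sketch of why the sign matters (the values here are illustrative, not from the notes): with a plus sign the data term pulls ( y ) toward ( x ) and the error shrinks each step, whereas a minus sign would push ( y ) away and diverge.

```python
x, y, alpha = 1.0, 0.0, 0.5   # illustrative target x, estimate y, weight alpha
for _ in range(50):
    y += alpha * (x - y)      # plus sign: the error (x - y) halves each step
# y is now within 1e-9 of x; with "y -= ..." the error would grow instead
```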