The Sphere function is an easy ~~meta-heuristic~~ local-search problem to solve. It is expressed as *"the summation of squares of all numbers in an array"* ...

E.g. `sphere([1,2,3]) = sum([1,4,9]) = 14;`

The sphere function usually limits number inputs to `-5.12 <= n <= 5.12`

You dig?

In this problem, we need to find the configuration of inputs for the sphere function that gives the minimum possible output.

Intuitively, you might arrive at `[0,0,0...,0]`

which would result in `0`

when passed into the sphere function.

But how can we make the computer figure this out for itself? That's what machine learning is ... and that's what ~~meta-heuristics~~ local-search algorithms help us do.

#### Things you need to implement a ~~meta-heuristic~~ local-search algorithm

##### Solution Initialization Function

- A solution initialization function that generates a new random approximation of a solution to the problem every time it is called.

- For instance, a random solution to this problem off the top of my head for 5 numbers could be `[3,-4,5,1,-2]`, which would have a sphere_function value of `55`

```
function getRandomSolution(width) {
  var sol = [];
  for (var i = 0; i < width; i++) {
    sol.push((Math.random() * -10.24) + 5.12); // random number in (-5.12, 5.12]
  }
  return sol;
}
```

##### Objective/Cost/Fitness Function

- A way to determine how close your approximate solution comes to the ideal solution to the problem you're tackling. This is known as the objective/cost/fitness function.

- For the sphere function problem, the objective function is already defined for us as
**"the summation of squares of all numbers in an array"**. Remember?

```
function getFitness(solution) {
  var sum = 0;
  for (var i = 0; i < solution.length; i++) {
    sum += (solution[i] * solution[i]);
  }
  return sum; // sum of squares of numbers in the array
}
```

##### Optimization or Divergence?

- You also need to determine whether your problem is an optimization or divergence problem, i.e. does a reduced or increased objective/cost value signify a better solution?

- If the solution gets better as the objective value reduces, then the problem is a minimization or optimization problem.
- If the solution gets better as the objective value increases, then the problem is a maximization or divergence problem.
- The sphere function problem is a minimization/optimization problem because we want to
*minimize the summation of squares of all numbers in the array*

```
if (fit2 < fit1) // fit2 is better than fit1 because we want the sphere_function value to reduce as much as possible
```
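As a sketch, this comparison could be wrapped in a tiny helper (the name `isBetterThan` is just an illustration, echoing the pseudocode further down):

```
// For a minimization problem like the sphere function,
// a lower fitness value means a better solution.
function isBetterThan(fitA, fitB) {
  return fitA < fitB; // true if fitA beats fitB
}
```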

##### Mutation Function

- We also need a way to mutate/change an existing solution to find its neighbor
*(forgive the metaphors)* in the search space. This is known as the mutation function.

- We think of all possible values of the solutions as a search space. e.g. for the sphere function, if we are dealing with an array of length 1, then the possible values could be
`[-5.12], [-5.11], [-5.10], ... , [-0.01], [0], [0.01], ... , [5.10], [5.11], [5.12]`

- A simple way to mutate a solution could be to
`add/subtract a random number to/from its values`.
This is simple enough, but comes with its own challenges.
- First, values tend to go higher than the maximum 5.12 or lower than the minimum -5.12 if measures are not put in place to prevent that.
- I'll leave you to figure out an efficient mutation function.
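As a naive starting point (not the efficient version you're asked to figure out), one could nudge a single random element and clamp it back into the valid range; the 0.1 step size here is an arbitrary choice:

```
// Naive mutation: nudge one random element by a small random step,
// then clamp it so it stays inside the [-5.12, 5.12] bounds.
function mutate(solution) {
  var neighbor = solution.slice(); // work on a copy, not the original
  var i = Math.floor(Math.random() * neighbor.length);
  neighbor[i] += (Math.random() * 0.2) - 0.1; // random step in [-0.1, 0.1)
  if (neighbor[i] > 5.12) neighbor[i] = 5.12;
  if (neighbor[i] < -5.12) neighbor[i] = -5.12;
  return neighbor;
}
```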

##### Clone Function

- I've found that making an exact clone of a solution before mutation is important because of the way languages like JavaScript pass arrays around by reference. This ensures that you still have a copy of the original solution even after a mutated solution is formed.
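For a flat array of numbers, a shallow copy is enough; a minimal sketch:

```
// Make an independent copy of a flat array of numbers,
// so mutating the copy never affects the original.
function cloneSolution(solution) {
  return solution.slice();
}

var original = [1, 2, 3];
var copy = cloneSolution(original);
copy[0] = 99; // original remains [1, 2, 3]
```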

Well, that's all you need to implement a meta-heuristic algorithm. The simplest meta-heuristics keep things simple and are usually easy to follow. Let's take a look at one of them.

### Hill Climbing

The Hill Climbing algorithm is perhaps the easiest meta-heuristic to implement. It's expressed by the following pseudocode:

```
- initialize a random solution `sol1`
- find the fitness of `sol1` as `sFit1`
- loop till stopping condition is true
- set `sol2` to a "neighbor" of `sol1`
- find the fitness of `sol2` as `sFit2`
- if (sFit2.isBetterThan(sFit1))
- let `sol1` be equal to `sol2`
- let `sFit1` be equal to `sFit2`
- end if
- end loop
- return `sol1` as best found solution
```
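One rough translation of this pseudocode into JavaScript might look like the sketch below. It bundles `getRandomSolution` and `getFitness` from earlier so it runs standalone, with a simple clamped nudge standing in for the mutation function:

```
// getRandomSolution and getFitness as defined earlier in the article
function getRandomSolution(width) {
  var sol = [];
  for (var i = 0; i < width; i++) {
    sol.push((Math.random() * -10.24) + 5.12);
  }
  return sol;
}

function getFitness(solution) {
  var sum = 0;
  for (var i = 0; i < solution.length; i++) {
    sum += (solution[i] * solution[i]);
  }
  return sum;
}

// Simple stand-in mutation: nudge one value, clamp to [-5.12, 5.12]
function mutateSolution(solution) {
  var neighbor = solution.slice();
  var i = Math.floor(Math.random() * neighbor.length);
  neighbor[i] += (Math.random() * 0.2) - 0.1;
  if (neighbor[i] > 5.12) neighbor[i] = 5.12;
  if (neighbor[i] < -5.12) neighbor[i] = -5.12;
  return neighbor;
}

// Hill climbing: keep a neighbor only if it improves the fitness
function hillClimb(width, iterations) {
  var sol1 = getRandomSolution(width);
  var sFit1 = getFitness(sol1);
  for (var iter = 0; iter < iterations; iter++) { // stopping condition: fixed iteration count
    var sol2 = mutateSolution(sol1); // a "neighbor" of sol1
    var sFit2 = getFitness(sol2);
    if (sFit2 < sFit1) { // minimization: lower is better
      sol1 = sol2;
      sFit1 = sFit2;
    }
  }
  return sol1; // best solution found
}

console.log(hillClimb(5, 20000)); // an array of 5 numbers, each near 0
```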

Of course, there are probably better ways to implement hill climbing for the sphere function, but take a look at the implementation below.

Feel free to execute the full code in your browser console.

Also, if you like this, and don't mind coding in C#, check out this .NET library