Optimization

Overview


The built-in optimizers that TensorFlow.js uses to train its models can also be used to minimize simple functions directly (see the Optimization section below).

Define the Function


The first step is to define the function that you wish to minimize. The function must be written with the TensorFlow.js tensor API, so that the optimizer can compute gradients through it.

The code below gives an example: it defines f(x) = (x + 2)^2, which has its minimum at x = -2.


// f(x) = (x + 2)^2, expressed with TensorFlow.js tensor ops
let f = function(x){
    return x.add(tf.scalar(2)).pow(tf.scalar(2, 'int32'));
}
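
If TensorFlow.js is already loaded (the complete example below shows one way to load it), you can sanity-check the definition by evaluating it at a point. This quick check is only illustrative and is not part of the optimization itself.

// evaluate f at x = 0; (0 + 2)^2 gives 4
f(tf.scalar(0)).print();   // prints a scalar tensor containing 4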
				

Optimization


Next, you need to select an optimizer from the library of TensorFlow.js optimizers under tf.train. The following code creates a function that runs a minimization using the Adam optimizer. It minimizes the function f defined above: each call to optim.minimize() evaluates f, computes the gradient with respect to the variable x, and updates x to reduce the value of f.


function minimize(epochs){
  //x is a variable with initial value of 2
  let x = tf.variable(tf.scalar(2));
  let learningRate = 0.1;
  const optim = tf.train.adam(learningRate);  // Adam optimizer (a gradient-based algorithm)
  for(let i = 0 ; i < epochs ; i++) {
    optim.minimize(() => f(x));
  }
  return x;
}			
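
The loop above discards the loss value at each step. If you want to watch the optimization make progress, the optimizer's minimize() method accepts a second argument that makes it return the loss tensor. The sketch below (with the illustrative name minimizeWithLogging, not part of the example above) logs the loss every 20 steps, assuming f is defined as in the previous section.

// variant of the loop that also logs the loss value as it decreases
function minimizeWithLogging(epochs){
  let x = tf.variable(tf.scalar(2));
  const optim = tf.train.adam(0.1);
  for(let i = 0; i < epochs; i++) {
    // passing true asks minimize() to return the loss tensor instead of null
    const loss = optim.minimize(() => f(x), true);
    if(i % 20 === 0) console.log('step', i, 'loss', loss.dataSync()[0]);
    loss.dispose();  // free the returned loss tensor
  }
  return x;
}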
				

Complete Example


The following code puts the steps above together into a single runnable example.


// load the TensorFlow.js library
await $src('https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@latest');

// f is the function (x + 2)^2
// the minimum is at x = -2
let f = function(x){
    return x.add(tf.scalar(2)).pow(tf.scalar(2, 'int32'));
}

function minimize(epochs){
  //x is a variable with initial value of 2
  let x = tf.variable(tf.scalar(2));
  let learningRate = 0.1;
  const optim = tf.train.adam(learningRate);  // Adam optimizer (a gradient-based algorithm)
  for(let i = 0 ; i < epochs ; i++) {
    optim.minimize(() => f(x));
  }
  return x;
}

// run 100 optimization steps; result is the variable holding the optimized x
let result = minimize(100);
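
// read the optimized value back; it should end up close to the true minimum at x = -2
result.print();                                 // prints a scalar tensor with a value near -2
console.log('minimum found near x =', result.dataSync()[0]);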
				
