
Tensorflow.js tf.train.adam() Function

Tensorflow.js is a JavaScript library developed by Google to run and train machine learning models in the browser or in Node.js.

The Adam optimizer (Adaptive Moment Estimation) is a stochastic gradient descent method based on adaptive estimation of first-order and second-order moments. The technique is highly efficient when working with large sets of data and parameters.
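
To make those moment estimates concrete, here is a standalone sketch of a single Adam update for one scalar parameter. This is illustrative plain JavaScript, not TensorFlow.js internals; the default hyperparameter values simply mirror commonly used Adam settings.

Javascript

// One Adam update step for a single scalar parameter.
// Illustrative sketch of the algorithm, not library code.
function adamStep(param, grad, state,
                  lr = 0.001, beta1 = 0.9,
                  beta2 = 0.999, epsilon = 1e-8) {
  state.t += 1;
  // First moment: exponential moving average of the gradient.
  state.m = beta1 * state.m + (1 - beta1) * grad;
  // Second moment: exponential moving average of the squared gradient.
  state.v = beta2 * state.v + (1 - beta2) * grad * grad;
  // Bias correction for the zero-initialized averages.
  const mHat = state.m / (1 - Math.pow(beta1, state.t));
  const vHat = state.v / (1 - Math.pow(beta2, state.t));
  // The step size is adapted per parameter by the second moment.
  return param - lr * mHat / (Math.sqrt(vHat) + epsilon);
}

// Usage: const state = { m: 0, v: 0, t: 0 };
// param = adamStep(param, gradient, state);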

In Tensorflow.js, the tf.train.adam() function creates a tf.AdamOptimizer, which uses the Adam algorithm.

Syntax:

tf.train.adam(learningRate?, beta1?, beta2?, epsilon?)

 Parameters:

  • learningRate: The learning rate to use for the Adam gradient descent algorithm. It is optional.
  • beta1: The exponential decay rate for the 1st moment estimates. It is optional.
  • beta2: The exponential decay rate for the 2nd moment estimates. It is optional.
  • epsilon: A small constant for numerical stability. It is optional.

 Return Value: AdamOptimizer.
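
For illustration, the optimizer can be created with only a learning rate, or with every parameter spelled out. A minimal sketch; the explicit values below are assumptions that mirror commonly used Adam settings, not required arguments:

Javascript

// Created with defaults for everything but the learning rate.
const optimizer1 = tf.train.adam(0.01);

// All four parameters made explicit. These particular values
// are illustrative, mirroring commonly used Adam settings.
const optimizer2 = tf.train.adam(0.001, 0.9, 0.999, 1e-8);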

Example 1: A cubic function is defined over the input tensors x and y, with l, m, and n as random coefficients. We then calculate the mean squared loss of the prediction and pass it to the Adam optimizer, which minimizes the loss and adjusts the coefficients toward the ideal values.

Javascript
// Importing the tensorflow.js library
import * as tf from "@tensorflow/tfjs";

// Training data for a cubic function with coefficients l, m, n.
const x = tf.tensor1d([0, 1, 2, 3]);
const y = tf.tensor1d([1., 2., 5., 11.]);

// Coefficients initialized randomly as trainable variables.
const l = tf.scalar(Math.random()).variable();
const m = tf.scalar(Math.random()).variable();
const n = tf.scalar(Math.random()).variable();

// y = l * x^3 - m * x + n.
const f = x => l.mul(x.pow(3)).sub(m.mul(x)).add(n);

// Mean squared error between predictions and labels.
const loss = (pred, label) => pred.sub(label).square().mean();

const learningRate = 0.01;
const optimizer = tf.train.adam(learningRate);

// Training the model and printing the coefficients.
for (let i = 0; i < 10; i++) {
  optimizer.minimize(() => loss(f(x), y));
  console.log(
    `l: ${l.dataSync()}, m: ${m.dataSync()}, n: ${n.dataSync()}`);
}

// Predictions are made.
const preds = f(x).dataSync();
preds.forEach((pred, i) => {
  console.log(`x: ${i}, pred: ${pred}`);
});


Output:

l: 0.5212615132331848, m: 0.4882013201713562, n: 0.9879841804504395
l: 0.5113212466239929, m: 0.49809587001800537, n: 0.9783468246459961
l: 0.5014950633049011, m: 0.5077731013298035, n: 0.969675600528717
l: 0.49185076355934143, m: 0.5170749425888062, n: 0.9630305171012878
l: 0.48247095942497253, m: 0.5257879495620728, n: 0.9595866799354553
l: 0.47345229983329773, m: 0.5336435437202454, n: 0.9596782922744751
l: 0.4649032950401306, m: 0.5403363704681396, n: 0.9626657962799072
l: 0.4569399356842041, m: 0.5455683469772339, n: 0.9677067995071411
l: 0.4496782124042511, m: 0.5491118431091309, n: 0.9741682410240173
l: 0.44322386384010315, m: 0.5508641004562378, n: 0.9816395044326782
x: 0, pred: 0.9816395044326782
x: 1, pred: 0.8739992380142212
x: 2, pred: 3.4257020950317383
x: 3, pred: 11.29609203338623
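
optimizer.minimize() also accepts a second argument, returnCost; when it is true, the call returns the loss as a scalar tensor instead of null. A small variation of the training loop above that logs the loss each iteration (same f, loss, x, and y as in Example 1):

Javascript

// Passing true as the second argument makes minimize()
// return the loss tensor so it can be inspected.
for (let i = 0; i < 10; i++) {
  const cost = optimizer.minimize(() => loss(f(x), y), true);
  console.log(`iteration ${i}, loss: ${cost.dataSync()}`);
  cost.dispose(); // free the returned tensor
}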

Example 2: Below is the code where we design a simple model, define an optimizer with tf.train.adam using a learning rate of 0.001, and pass it to the model's compilation.

Javascript
// Importing the tensorflow.js library
import * as tf from "@tensorflow/tfjs";

// Defining the model
const model = tf.sequential({
    layers: [tf.layers.dense({ units: 1, inputShape: [12] })],
});

// In the compilation we use the tf.train.adam optimizer
const optimizer = tf.train.adam(0.001);
model.compile({ optimizer: optimizer, loss: "meanSquaredError" });

// Evaluate the model which was compiled above
const result = model.evaluate(tf.ones([10, 12]), tf.ones([10, 1]), {
    batchSize: 4,
});

// Print the result
result.print();


Output:

Tensor
    1.520470142364502
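
Once compiled with the Adam optimizer, the same model can also be trained with model.fit(). A minimal sketch; the all-ones tensors and the epoch count are placeholders chosen only for illustration:

Javascript

// Train the compiled model; fit() resolves with a history object.
const xs = tf.ones([10, 12]);
const ys = tf.ones([10, 1]);
model.fit(xs, ys, { epochs: 5, batchSize: 4 })
    .then(info => console.log("losses:", info.history.loss));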

Reference: https://js.tensorflow.org/api/3.6.0/#train.adam
