
I am working with a piece of math more complicated than I currently understand: a dynamic graph visualization. It uses a physics metaphor of springs and magnets, where the vertices act as magnets repelling each other and the edges act as springs pulling the vertices back together. The simulation currently has six working parameters, which I graph from the running simulation at a regular interval.
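To make that concrete, the update step is roughly of this shape (a rough sketch only; the names and constants below are placeholders rather than my actual code, and my real simulation has six parameters):

    // Sketch of one simulation step: every pair of vertices repels
    // ("magnets"), every edge pulls its endpoints together ("springs"),
    // and positions are updated with damping.
    function stepSimulation(vertices, edges, params) {
      // Pairwise repulsion between vertices
      for (const a of vertices) {
        for (const b of vertices) {
          if (a === b) continue;
          const dx = a.x - b.x, dy = a.y - b.y;
          const distSq = dx * dx + dy * dy + 0.01;   // avoid division by zero
          const f = params.repulsion / distSq;
          a.vx += f * dx;
          a.vy += f * dy;
        }
      }
      // Spring attraction along edges
      for (const {from, to} of edges) {
        const dx = to.x - from.x, dy = to.y - from.y;
        const dist = Math.sqrt(dx * dx + dy * dy) || 0.01;
        const f = params.springStrength * (dist - params.restLength);
        from.vx += f * dx / dist;
        from.vy += f * dy / dist;
        to.vx -= f * dx / dist;
        to.vy -= f * dy / dist;
      }
      // Damped position update
      for (const v of vertices) {
        v.vx *= params.damping;
        v.vy *= params.damping;
        v.x += v.vx * params.timeStep;
        v.y += v.vy * params.timeStep;
      }
    }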

I am trying to find the settings for the simulation necessary to produce a stable visualization.

My approach so far has been to build an array of settings and an array of edge lengths for a fixed arrangement (a simple tetrahedron). I'm not entirely sure where I'm going with this, since I'm still very new to machine learning, but I figured that by randomizing the settings during the data-collection run I could learn to predict the edge lengths for a given set of settings, and then use regression to find the settings that correspond to stable edge lengths.
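For context, one recorded example looks roughly like this (the values are made up for illustration; settings and lengths are the field names I use later in the code):

    // One training sample: the 6 simulation settings that were active,
    // and the 6 resulting edge lengths of the tetrahedron.
    const sample = {
      settings: [0.42, 0.77, 0.13, 0.95, 0.31, 0.58],  // randomized per run
      lengths:  [1.02, 0.98, 1.11, 0.87, 1.05, 0.99],  // measured edge lengths
    };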

To do this, I began adapting Google's TensorFlow.js tutorial, skipping its first data-visualization part, because I have 6 setting inputs and 6 edge-length labels.
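My data-conversion step is essentially the tutorial's convertToTensor with the shapes changed from [N, 1] to [N, 6]; roughly this:

    function convertToTensor(data) {
      return tf.tidy(() => {
        tf.util.shuffle(data);

        // 6 settings in, 6 edge lengths out
        const inputs = data.map(d => d.settings);
        const labels = data.map(d => d.lengths);

        const inputTensor = tf.tensor2d(inputs, [inputs.length, 6]);
        const labelTensor = tf.tensor2d(labels, [labels.length, 6]);

        // Min-max normalization, same as the tutorial
        const inputMax = inputTensor.max();
        const inputMin = inputTensor.min();
        const labelMax = labelTensor.max();
        const labelMin = labelTensor.min();

        const normalizedInputs = inputTensor.sub(inputMin).div(inputMax.sub(inputMin));
        const normalizedLabels = labelTensor.sub(labelMin).div(labelMax.sub(labelMin));

        return {
          inputs: normalizedInputs,
          labels: normalizedLabels,
          inputMax, inputMin, labelMax, labelMin,
        };
      });
    }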

The model looks like this:


  function createModel() {
    // Create a sequential model
    const model = tf.sequential(); 

    // Add a single hidden layer
    model.add(tf.layers.dense({inputShape: [6], units: 1, useBias: true}));

    // Add an output layer
    model.add(tf.layers.dense({ units: 6, useBias: true}));

    return model;
  }

I got the model to train after the graph visualization "blows up" (that is, the vertices disappear from the screen after edges are added), with the lengths of the edges recorded along the way.
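The training step is essentially the tutorial's trainModel, which I have left unchanged apart from feeding it my tensors; roughly:

    async function trainModel(model, inputs, labels) {
      // Prepare the model for training
      model.compile({
        optimizer: tf.train.adam(),
        loss: tf.losses.meanSquaredError,
        metrics: ['mse'],
      });

      const batchSize = 32;
      const epochs = 50;

      return await model.fit(inputs, labels, {
        batchSize,
        epochs,
        shuffle: true,
        callbacks: tfvis.show.fitCallbacks(
          {name: 'Training Performance'},
          ['loss', 'mse'],
          {height: 200, callbacks: ['onEpochEnd']}
        ),
      });
    }

But I am stuck on the testModel function: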

function testModel(model, inputData, normalizationData) {
    console.log('inputData.shape', inputData.shape);

    const {inputMax, inputMin, labelMin, labelMax} = normalizationData;  

    // Generate predictions for a uniform range of numbers between 0 and 1;
    // We un-normalize the data by doing the inverse of the min-max scaling 
    // that we did earlier.
    const [xs, preds] = tf.tidy(() => {

      const xs = tf.linspace(0, 1, 100);
      console.log('xs', xs)
      const preds = model.predict(xs.reshape([null, 6]));      

      const unNormXs = xs
        .mul(inputMax.sub(inputMin))
        .add(inputMin);

      const unNormPreds = preds
        .mul(labelMax.sub(labelMin))
        .add(labelMin);

      // Un-normalize the data
      return [unNormXs.dataSync(), unNormPreds.dataSync()];
    });


    const predictedPoints = Array.from(xs).map((val, i) => {
      return {x: val, y: preds[i]}
    });

    const originalPoints = inputData.map(d => ({
      x: d.settings, y: d.lengths,
    }));


    tfvis.render.scatterplot(
      {name: 'Model Predictions vs Original Data'}, 
      {values: [originalPoints, predictedPoints], series: ['original', 'predicted']}, 
      {
        xLabel: 'settings',
        yLabel: 'edge lengths',
        height: 300
      }
    );
  }

Specifically, it is these lines that throw the error:

const xs = tf.linspace(0, 1, 100);
console.log('xs', xs)
const preds = model.predict(xs.reshape([null, 6]));   

(The reshape!)

So my question for now is this: how do I tell TensorFlow to give me lists of 6 random values to run predictions with, instead of just single values?
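I suspect what I need is something like the following, i.e. a [100, 6] tensor of random (normalized) settings instead of a one-dimensional linspace, but I'm not sure that is the right approach or how it interacts with the un-normalization and the scatterplot afterwards:

    // My guess: 100 rows of 6 random settings in [0, 1), rather than
    // 100 single scalars, so the shape matches inputShape: [6].
    const xs = tf.randomUniform([100, 6], 0, 1);
    const preds = model.predict(xs);   // should come out as [100, 6]?

If that is the right direction, I still don't see how to adapt the scatterplot, since each point would then have 6 x-values and 6 y-values.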

Thanks in advance!

The complete example currently lives at this GitHub page. (Click "Construct Pyramid", wait for the visualization to blow up, and then the TensorFlow part takes over.)

Maybe the entire approach is wrong; any feedback is welcome.
