
I am using a genetic algorithm to maximize a function of a few hundred thousand real-valued variables. Each variable, $x_i$, has its own independent boundary condition.

The fitness function computes a value $g(x_i)$ from each of these variables and returns the sum of everything:

$$\text{fitness} = g(x_1) + g(x_2) + g(x_3) + \dots$$

This is taking an incredibly long time in Python. In this situation, what do I gain by maximizing all the variables at the same time, i.e. using the genetic algorithm to search for the global optimum in this huge joint space? Does that even make sense, or should I just maximize each variable independently, since the fitness function is additively separable?
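To illustrate why separability matters, here is a minimal sketch (with a hypothetical toy $g$ and bounds, not my real problem): because the fitness is a plain sum of per-variable terms, maximizing the sum is equivalent to maximizing each $g(x_i)$ on its own interval, so a cheap vectorized 1-D search per variable replaces a GA over the joint space.

```python
import numpy as np

def g(x):
    # hypothetical stand-in for the real per-variable objective
    return -(x - 0.3) ** 2

n = 1000                          # in the real problem this is ~10^5
lower = np.full(n, -1.0)          # per-variable lower bounds
upper = np.full(n, 1.0)           # per-variable upper bounds

# Since fitness = sum_i g(x_i), each x_i can be optimized independently.
# A 1-D grid search per variable, vectorized across all variables:
grid = np.linspace(0.0, 1.0, 201)                             # fractions of each interval
candidates = lower[:, None] + grid[None, :] * (upper - lower)[:, None]
best_idx = g(candidates).argmax(axis=1)                       # best grid point per variable
x_best = candidates[np.arange(n), best_idx]

fitness = g(x_best).sum()
```

With this toy $g$, every variable lands at its independent maximizer $x_i = 0.3$ and the summed fitness matches the joint optimum, with no population-based search needed.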


2nd question: if I include a constraint that depends on the sum of other additively separable functions, like:

$$\text{if } h(x_1) + h(x_2) + h(x_3) + \dots > 0:$$ $$\text{fitness} = \text{fitness} + \text{penalty},$$

can I still maximize each variable independently in any way? What is the best approach in this case?
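One sketch of what I mean (again with hypothetical toy $g$, $h$, and bounds): maximize each variable independently as if the constraint did not exist, then check whether that unconstrained optimum would trigger the penalty. Only if it does is the problem genuinely coupled and a joint method (e.g. the GA with the penalty folded into the fitness) seems unavoidable.

```python
import numpy as np

def g(x):
    # hypothetical per-variable objective
    return -(x - 0.3) ** 2

def h(x):
    # hypothetical constraint term; here the penalty fires when sum(x_i) > 0
    return x

n = 1000
lower, upper = np.full(n, -1.0), np.full(n, 1.0)

# Step 1: independent per-variable maximization, ignoring the constraint.
grid = np.linspace(0.0, 1.0, 201)
candidates = lower[:, None] + grid[None, :] * (upper - lower)[:, None]
x_star = candidates[np.arange(n), g(candidates).argmax(axis=1)]

# Step 2: check the constraint at the unconstrained optimum.
constraint_coupled = h(x_star).sum() > 0
if constraint_coupled:
    print("independent optimum triggers the penalty: joint search needed")
else:
    print("independent optimum already avoids the penalty: done")
```

In this toy setup the independent optimum ($x_i = 0.3$ everywhere) does trigger the penalty, which is exactly the case where I am unsure whether any per-variable decomposition is still possible.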

Edit: Changed "optimize" to "maximize" to make things clearer.

João Bravo
  • Welcome to DataScienceSE. For the first question I agree with your reasoning: if the variables are independent, it's not necessary to optimize all of them at once, you can optimize every g(x) on its own. For the 2nd question I suspect that you're back to the general case. – Erwan Sep 18 '21 at 18:52
  • Thank you for your comment @Erwan. For the 2nd question, I would assume that maximizing each variable individually would also give me the solution most likely to fulfill the kind of constraints I wrote in my question. But I am not sure about this... – João Bravo Sep 20 '21 at 11:33
  • If the final fitness function really depends on a condition which involves all the variables, I don't think it's possible to optimize each variable independently, unless there are other constraints which apply. – Erwan Sep 20 '21 at 15:14
  • @Erwan I forgot a detail. If $h(x)$ is a monotonic function, can't you maximize the variables independently, and then add everything? In that case, that solution should still be the most likely to fulfill the constraint, right? Because, if only one $x_i$ gets bigger, the sum of $h(x_i)$ cannot decrease. – João Bravo Sep 21 '21 at 15:40
  • Ok, I just realized that the monotonicity of the constraint functions means nothing in the general case, since one $x_i$ becoming smaller could also lead to a larger fitness value. – João Bravo Sep 27 '21 at 19:08

0 Answers