Mistakes in Trid function #2

Open
gbelouze opened this issue Sep 16, 2022 · 5 comments
Comments

@gbelouze
Two mistakes:

  • the LaTeX formula should read r"f(\mathbf{x})=\sum_{i=1}^{d}(x_i-1)^2-\sum_{i=2}^{d}x_ix_{i-1}"
  • the evaluation formula should be np.sum((X - 1) ** 2) - np.sum(X[1:] * X[:-1])
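A minimal sketch of the corrected evaluation (the function name `trid` and the sanity check against the known minimum are my additions, not part of the repository):

```python
import numpy as np


def trid(X):
    """Corrected Trid function:
    f(x) = sum_{i=1}^{d} (x_i - 1)^2 - sum_{i=2}^{d} x_i * x_{i-1}
    """
    X = np.asarray(X, dtype=float)
    return np.sum((X - 1) ** 2) - np.sum(X[1:] * X[:-1])


# Sanity check: the Trid function's known global minimum is
# f(x*) = -d(d+4)(d-1)/6 at x*_i = i(d+1-i).
# For d = 3, x* = [3, 4, 3] and f(x*) = -7.
assert abs(trid([3.0, 4.0, 3.0]) - (-7.0)) < 1e-12
```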
@AxelThevenot
Owner

Hello,

Thank you for the issues.

I don't have much time right now, so if you want you can make a Pull Request and I'll review it ☺️

@gbelouze
Author

Hi,
Of course! I'm still in the process of validating all the functions, as I am using them for research purposes (thank you for this repository, by the way). Once I think I am through with all the issues, I'll make a PR that corrects them all at once :)

@AxelThevenot
Owner

Thank you very much for the review.

There are so many functions that I might have made a lot of mistakes, so this is a great help!

@AxelThevenot
Owner

And what is the research purpose, if it's not a secret? ☺️

@gbelouze
Author

You can read about it here, if you're interested in the use case of your implementation: https://github.com/gbelouze/forward-gradient (it also links to the associated paper).

gbelouze pushed a commit to gbelouze/Python_Benchmark_Test_Optimization_Function_Single_Objective that referenced this issue Sep 22, 2022