tools : fix invalid free() #13436


Merged: 5 commits merged into ggml-org:master on May 11, 2025
Conversation

@aumfer (Contributor) commented May 10, 2025

Added a constructor to server_context that initializes server_context::batch, preventing the destructor's call to llama_batch_free() from causing an invalid free().

Ran into this when bind() failed.

[screenshots: llamacpp1, llamacpp2]

@aumfer aumfer marked this pull request as ready for review May 10, 2025 18:11
@aumfer aumfer requested a review from ngxson as a code owner May 10, 2025 18:12
@aumfer (Contributor, Author) commented May 10, 2025

There are other ways to solve this if you don't want to zero out the entire llama_batch object (e.g. track whether server_context::init() has been called, and check that in the destructor before calling llama_batch_free()).

@ngxson (Collaborator) commented May 10, 2025

I think the proper fix is to check that batch.embd is not nullptr before freeing it. This matches the pattern of the other _free functions in llama.cpp, which do nothing when given a nullptr.

@slaren (Member) commented May 10, 2025

Both free and delete ignore a null pointer, so these checks are not necessary. The problem here is uninitialized memory.

@ngxson (Collaborator) left a comment

Ok, that makes sense. I think this can be the temporary solution until we finally have the private batch API.

Co-authored-by: Xuan-Son Nguyen <[email protected]>
@@ -1884,6 +1884,7 @@ struct server_context {

common_chat_templates_ptr chat_templates;

server_context() : batch({}) {}
@cebtenzzre (Collaborator) commented May 11, 2025
nit: could just use llama_batch batch = {}; in the field declaration on line 1865 to keep the default noexcept constructor.

@aumfer (Contributor, Author) commented May 11, 2025

I didn't realize you could initialize non-scalar members that way. I went ahead and updated this PR to use direct-list-initialization, as described here: https://en.cppreference.com/w/cpp/language/list_initialization

@CISC CISC merged commit 9a390c4 into ggml-org:master May 11, 2025
44 checks passed
5 participants