
Memory measurement and limits for Squirrel #7516

Merged (2 commits) on May 11, 2019

Conversation

@nielsmh (Contributor) commented Apr 15, 2019

Memory management for AI and GS is bad: everything comes out of the runtime library's default heap, so a single script running loose can exhaust all memory available to the OpenTTD process and cause a hard crash of the game.

This PR extends Squirrel to support a custom allocator (which has to be "context switched" manually), and so far uses it to keep track of the allocation size for each running script. The end goal of this PR is to implement an arena allocator that allows putting hard limits on how much memory a single script can use.
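A minimal sketch of the tracking idea (not the actual diff; the names here, such as `ScriptAllocator` and `_active_allocator`, are illustrative only, and the hook signatures are simplified):

```cpp
#include <cstddef>
#include <cstdlib>

/* Per-script allocator that, in this first step, only tracks how much memory
 * has been handed out. */
struct ScriptAllocator {
	size_t allocated_size = 0; ///< Bytes currently allocated by this script.

	void *Malloc(size_t size)
	{
		void *p = std::malloc(size);
		if (p != nullptr) this->allocated_size += size;
		return p;
	}

	void Free(void *p, size_t size)
	{
		std::free(p);
		this->allocated_size -= size;
	}
};

/* The allocator currently in effect; "context switched" by hand before a
 * script's VM is entered and restored afterwards. */
static ScriptAllocator *_active_allocator = nullptr;

/* Hooks of roughly this shape are what the Squirrel build calls instead of
 * plain malloc/free. */
void *sq_vm_malloc(size_t size) { return _active_allocator->Malloc(size); }
void sq_vm_free(void *p, size_t size) { _active_allocator->Free(p, size); }
```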

Obligatory screenshot: (image not included)

Inspired by the report in #7513, but it turns out the problem there may not be related to out-of-memory conditions.

nielsmh changed the title from "Custom allocators for Squirrel" to "Memory measurement and limits for Squirrel" on Apr 20, 2019
nielsmh marked this pull request as ready for review on April 20, 2019, 13:54
@nielsmh (Contributor, Author) commented Apr 20, 2019

I think this is functionally complete now. Two remaining questions:

  • What should the default memory limit be? Right now it's 128 MB and that may not be enough for common use.
  • How much of a problem is it that the Memory column in the frame rate window is shown even when there are no active scripts?

@James103 (Contributor) commented Apr 20, 2019

As for the memory limit, I think the default should depend on the system's RAM. I also propose that the RAM limit for AI/GS be configurable (both total and per-script, hidden from Advanced Settings), accepting the suffixes k/K, m/M and g/G for 1024 (1 KiB), 1048576 (1 MiB) and 1073741824 (1 GiB) respectively. The reason is that some people have computers with 16 GB of RAM or more and want their AI/GS to be able to use more of it, while other computers may only have 512 MB to 1 GB of RAM, and/or their owners don't want AI/GS taking up a lot of RAM, to prevent out-of-memory errors.

Edit: commit f55b811 makes the maximum memory limit per script configurable between 8 MB (required for the regression/dummy AI) and 8 GB (a quarter to half of a high-end gaming PC's RAM).
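The suffix scheme proposed above could be parsed with something as simple as the following sketch (illustrative only; `ParseMemoryLimit` is a hypothetical helper, not a function in the PR):

```cpp
#include <cstddef>
#include <cstdlib>
#include <string>

/* Parse a value like "512m" or "2G" into bytes; plain numbers are bytes. */
size_t ParseMemoryLimit(const std::string &value)
{
	char *suffix = nullptr;
	unsigned long long result = std::strtoull(value.c_str(), &suffix, 10);
	switch (*suffix) {
		case 'k': case 'K': return (size_t)(result << 10); // KiB
		case 'm': case 'M': return (size_t)(result << 20); // MiB
		case 'g': case 'G': return (size_t)(result << 30); // GiB
		default:            return (size_t)result;
	}
}
```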

@PeterN (Member) commented Apr 20, 2019

Good luck finding a reliable cross-platform way to determine the amount of system RAM, let alone free RAM.
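For reference, a sketch of what per-platform probing would look like (illustrative only; each branch uses a different OS API, and none of them tell you how much the process can actually use):

```cpp
#include <cstdint>

#if defined(_WIN32)
#	include <windows.h>
#elif defined(__APPLE__)
#	include <sys/types.h>
#	include <sys/sysctl.h>
#else
#	include <unistd.h>
#endif

/* Total physical RAM in bytes; every platform needs its own code path. */
uint64_t GetTotalSystemRAM()
{
#if defined(_WIN32)
	MEMORYSTATUSEX status;
	status.dwLength = sizeof(status);
	GlobalMemoryStatusEx(&status);
	return status.ullTotalPhys;
#elif defined(__APPLE__)
	uint64_t mem = 0;
	size_t len = sizeof(mem);
	int mib[2] = { CTL_HW, HW_MEMSIZE };
	sysctl(mib, 2, &mem, &len, nullptr, 0);
	return mem;
#else
	/* POSIX, but not guaranteed to be available everywhere. */
	return (uint64_t)sysconf(_SC_PHYS_PAGES) * (uint64_t)sysconf(_SC_PAGE_SIZE);
#endif
}
```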

@nielsmh (Contributor, Author) commented May 2, 2019

Resolved conflict with saveload version.

@glx22 (Contributor) commented May 10, 2019

This PR needs to be rebased.

This avoids out-of-memory situations caused by a single script using up the entire address space.
Instead, scripts that go above the maximum are killed.
The maximum defaults to 1 GB per script, but can be changed via a setting.
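In sketch form, the enforcement amounts to refusing any allocation that would push a script past its limit (hypothetical names; the exact failure path in the merged code may differ):

```cpp
#include <cstddef>
#include <cstdlib>

static const size_t DEFAULT_SCRIPT_MEMORY_LIMIT = (size_t)1 << 30; // 1 GiB

/* Allocator that enforces a hard per-script cap. */
struct LimitedScriptAllocator {
	size_t allocated_size = 0;                              ///< Bytes currently allocated.
	size_t allocation_limit = DEFAULT_SCRIPT_MEMORY_LIMIT;  ///< Configured cap.
	bool limit_exceeded = false;                            ///< Set when the cap was hit.

	void *Malloc(size_t size)
	{
		if (this->allocated_size + size > this->allocation_limit) {
			/* Refuse the allocation; the VM then errors out and the
			 * offending script is killed instead of the whole game crashing. */
			this->limit_exceeded = true;
			return nullptr;
		}
		void *p = std::malloc(size);
		if (p != nullptr) this->allocated_size += size;
		return p;
	}
};
```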