Not all glscopeclient users have 192GB of RAM on their workstation. It would be nice to allow extremely deep history even on low-memory machines by flushing sample data from older history waveforms to disk (keeping metadata in RAM for fast indexing).
Ideally, the maximum RAM and disk history cache sizes would be specified in MB/GB under separate preferences, rather than forcing the user to specify history depth as a number of waveforms (whose RAM footprint changes as capture settings change).
Additionally, we should avoid burning GPU memory on older historical waveforms. Care may be needed to ensure that we properly recycle CPU and GPU waveform buffers, so history churn doesn't waste time on repeated allocations.