A call to sample_MegaLMM(MegaLMM_state, n_iter) will run n_iter iterations of the Gibbs sampler. If nrun > burn, a posterior sample of all monitored variables is stored in MegaLMM_state$Posterior every thin iterations. If you are doing a long run and storing a large number of parameters, this can take a lot of memory. This function estimates the memory requirements.

estimate_memory_posterior(MegaLMM_state, n_iter)

Arguments

MegaLMM_state

The model after calling clear_Posterior

n_iter

number of iterations of the Gibbs sampler

Value

The estimated memory size in bytes

Details

Note 1: The estimated value assumes all iterations are post-burnin

Note 2: sample_MegaLMM() instantiates all arrays to hold the posterior samples before running the iterations, so memory requirements will not increase much during sampling.

Note 3: It is generally not necessary to run sample_MegaLMM(MegaLMM_state, n_iter) with a large n_iter. Instead, run the function many times, each with a small n_iter, calling save_posterior_chunk between runs. This lets you diagnose problems during the run and keeps the memory requirements low. You can always reload the posterior samples from the database on disk using reload_Posterior or load_posterior_param.
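The chunked-run workflow described in Note 3 might be sketched as follows. This is a minimal illustration, assuming MegaLMM_state is an already-constructed model with its Posterior cleared; the chunk count and per-chunk n_iter are arbitrary choices, and only functions named in this documentation are used.

# Sketch: many short runs with periodic saves, instead of one long run.
# Assumes library(MegaLMM) is loaded and MegaLMM_state is set up.
n_chunks <- 10
for (i in 1:n_chunks) {
  # a small n_iter per chunk keeps the in-memory Posterior arrays small
  MegaLMM_state <- sample_MegaLMM(MegaLMM_state, n_iter = 100)
  # write the collected samples to the database on disk and free memory
  MegaLMM_state <- save_posterior_chunk(MegaLMM_state)
  # (optionally inspect diagnostics here before continuing)
}
# later, reload all saved samples from disk:
MegaLMM_state$Posterior <- reload_Posterior(MegaLMM_state)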

Examples

# Assuming MegaLMM_state is a model prepared with clear_Posterior():
estimate_memory_posterior(MegaLMM_state, 100)