Large Memory Usage for HDDM Regression Models #24

@shabnamhossein

Hello,

I am using dockerHDDM (HDDM version 1.0.1RC) and trying to run HDDMRegressor for 155 subjects with 120 trials per subject (two trial types/conditions per subject). Here are my regression model and sampling settings:

import hddm

# Three regression descriptors, one per parameter (v, a, t), each with a
# condition effect relative to the 'compatible' baseline.
model_pre_regression_IAT = hddm.HDDMRegressor(
    data,
    ["v ~ 1 + C(condition, Treatment('compatible'))",
     "a ~ 1 + C(condition, Treatment('compatible'))",
     "t ~ 1 + C(condition, Treatment('compatible'))"],
    group_only_regressors=False,
    keep_regressor_trace=True,
    include=['v', 'a', 't'])

model_pre_regression_IAT.find_starting_values()
model_pre_regression_IAT_infdata = model_pre_regression_IAT.sample(
    10000, burn=500, chains=4,
    save_name='model_pre_regression_IAT',
    return_infdata=True)
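
For scale, here is my rough back-of-envelope estimate of what the per-trial regressor traces alone should occupy under this configuration (just a sketch: the 8-byte-float assumption and the exact set of nodes HDDM retains with keep_regressor_trace=True are my guesses, not verified against the internals):

n_subjects = 155
n_trials_per_subject = 120
n_samples, n_burn, n_chains = 10_000, 500, 4
n_regressed_params = 3  # v, a, t each carry a per-trial regression trace

n_trials = n_subjects * n_trials_per_subject   # 18,600 trials
n_draws = (n_samples - n_burn) * n_chains      # 38,000 retained draws

bytes_per_param = n_trials * n_draws * 8       # assuming one float64 per trial per draw
total_gb = n_regressed_params * bytes_per_param / 1e9
print(f"~{total_gb:.0f} GB for the per-trial regressor traces alone")
# -> ~17 GB, before any intermediate copies made during the
#    "Start converting to InferenceData" step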

It takes about 9.5 hours to run the model, with memory usage around 40 GB during MCMC sampling, which then climbs to roughly 90 GB at the end, once "Start converting to InferenceData" is printed. If I set ppc=True, memory consumption goes even higher, beyond what my laptop can handle. Am I doing something wrong? Does 90 GB of memory for this rather simple model and a medium number of subjects/trials make sense?
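
If it helps to diagnose this, below is the stripped-down run I would compare against, to check whether the per-trial traces account for most of the growth (a sketch that reuses only parameters from the call above; names like model_light are just for this example, and I have not verified any other part of dockerHDDM's sample() signature):

# Same model, but without per-trial regressor traces and with a short
# chain, purely to compare peak memory against the full run above.
model_light = hddm.HDDMRegressor(
    data,
    ["v ~ 1 + C(condition, Treatment('compatible'))",
     "a ~ 1 + C(condition, Treatment('compatible'))",
     "t ~ 1 + C(condition, Treatment('compatible'))"],
    group_only_regressors=False,
    keep_regressor_trace=False)   # drop the per-trial traces

model_light.find_starting_values()
model_light_infdata = model_light.sample(
    2000, burn=500, chains=2,
    save_name='model_light',
    return_infdata=True)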

Thanks,
Shabnam
