elicit.simulations module#

class elicit.simulations.Priors(ground_truth, init_matrix_slice, trainer, parameters, network, expert, seed)[source]#

Bases: Module

Initializes the hyperparameters of the prior distributions.

Attributes:
ground_truth : bool

whether samples are drawn from a true prior (‘oracle’)

global_dict : dict

dictionary containing all user and default input settings

logger : logging method

retrieves the module name for passing it to the logger

init_priors : dict

initialized hyperparameters (i.e., trainable variables); None if ground_truth=True

Initializes the hyperparameters (i.e., trainable variables)

Parameters:
ground_truth : bool

whether samples are drawn from a true prior (‘oracle’)

global_dict : dict

dictionary containing all user and default input settings.

__init__(ground_truth, init_matrix_slice, trainer, parameters, network, expert, seed)[source]#

Initializes the hyperparameters (i.e., trainable variables)

Parameters:
ground_truth : bool

whether samples are drawn from a true prior (‘oracle’)

global_dict : dict

dictionary containing all user and default input settings.

__call__()[source]#

Samples from the initialized prior distribution(s).

Returns:
prior_samples : dict

Samples from prior distribution(s).
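For intuition, here is a minimal, self-contained sketch of what trainable prior hyperparameters and sampling from the resulting distribution look like with TensorFlow Probability; the Normal prior, variable names, and shapes are illustrative assumptions, not the internals of Priors:

import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

# Trainable hyperparameters (illustrative): location and log-scale of a
# Normal prior; the log-scale keeps the scale positive during training.
mu = tf.Variable(0.0, name="mu")
log_sigma = tf.Variable(0.0, name="log_sigma")

prior = tfd.Normal(loc=mu, scale=tf.exp(log_sigma))
prior_samples = prior.sample((128, 200))  # e.g., (batch size B, num_samples)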

elicit.simulations.intialize_priors(init_matrix_slice, method, seed, parameters, network)[source]#

Initialize prior distributions.

Parameters:
global_dict : dict

dictionary including all user and default input settings.

Returns:
init_prior : dict

initialized prior distributions ready for sampling.
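As a rough sketch of the returned structure, assuming each hyperparameter is stored as one trainable variable keyed by name (the initialization values and names below are hypothetical):

import tensorflow as tf

# Hypothetical initialization slice: one starting value per hyperparameter
init_matrix_slice = {"mu0": 0.5, "log_sigma0": -1.2}

# One trainable variable per hyperparameter, keyed by name
init_prior = {
    name: tf.Variable(initial_value=value, trainable=True, name=name)
    for name, value in init_matrix_slice.items()
}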

elicit.simulations.sample_from_priors(initialized_priors, ground_truth, num_samples, B, seed, method, parameters, network, expert)[source]#

Samples from initialized prior distributions.

Parameters:
initialized_priors : dict

initialized prior distributions ready for sampling.

ground_truth : bool

whether simulations are based on ground truth (then sampling is performed from the true distribution).

global_dict : dict

dictionary including all user-input settings.

Returns:
prior_samples : dict

Samples from prior distributions.
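A minimal sketch of seeded sampling with the documented output structure, assuming a single standard-Normal prior (the parameter name "beta0" and the shapes are illustrative):

import tensorflow_probability as tfp

tfd = tfp.distributions

B, num_samples = 128, 200
prior = tfd.Normal(loc=0.0, scale=1.0)

# Draw a (B, num_samples) block of samples with a fixed seed and collect
# them in a dict keyed by parameter name, mirroring prior_samples above.
prior_samples = {"beta0": prior.sample((B, num_samples), seed=2024)}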

elicit.simulations.softmax_gumbel_trick(epred: float, likelihood: callable, upper_thres: float, temp: float, seed: int)[source]#

The softmax-Gumbel trick computes a continuous approximation of ypred from a discrete likelihood, thereby allowing the computation of gradients for discrete random variables.

Currently, this approach is implemented only for models without an upper boundary (e.g., the Poisson model).

Corresponding literature:

  • Maddison, C. J., Mnih, A. & Teh, Y. W. The concrete distribution: A continuous relaxation of discrete random variables in International Conference on Learning Representations (2017). https://doi.org/10.48550/arXiv.1611.00712

  • Jang, E., Gu, S. & Poole, B. Categorical reparameterization with gumbel-softmax in International Conference on Learning Representations (2017). https://openreview.net/forum?id=rkE3y85ee

  • Joo, W., Kim, D., Shin, S. & Moon, I.-C. Generalized gumbel-softmax gradient estimator for generic discrete random variables.

Parameters:
model_simulations : dict

dictionary containing all simulated output variables from the generative model.

global_dict : dict

dictionary including all user-input settings.

Returns:
ypred : tf.Tensor

continuously approximated ypred from the discrete likelihood.
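The trick itself is compact: truncate the countable support, perturb the log-probability of each support point with Gumbel(0, 1) noise, pass the result through a temperature-scaled softmax, and take the weighted sum over the support. The following self-contained sketch implements this for a Poisson likelihood; the function name, the truncation strategy, and all defaults are assumptions, not the package's exact implementation:

import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

def softmax_gumbel_poisson(epred, upper_thres, temp, seed):
    # Sketch of the softmax-Gumbel trick for a Poisson likelihood
    # (illustrative, not the package's implementation).
    tf.random.set_seed(seed)
    # Truncated support 0, 1, ..., upper_thres - 1
    support = tf.range(upper_thres, dtype=tf.float32)
    # Log-probability of each support point under the discrete likelihood;
    # gradients flow into epred through these logits.
    logits = tfd.Poisson(rate=epred).log_prob(support)
    # Gumbel(0, 1) noise via inverse transform sampling
    u = tf.random.uniform(tf.shape(logits), minval=1e-8, maxval=1.0)
    gumbel = -tf.math.log(-tf.math.log(u))
    # Relaxed one-hot weights over the support; differentiable w.r.t. epred
    weights = tf.nn.softmax((logits + gumbel) / temp)
    # Continuous approximation of a discrete draw from the likelihood
    return tf.reduce_sum(weights * support)

ypred = softmax_gumbel_poisson(epred=3.5, upper_thres=60, temp=1.0, seed=2024)

The temperature temp controls the trade-off of the relaxation: lower values approximate the discrete draw more closely but yield higher-variance gradients.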

elicit.simulations.simulate_from_generator(prior_samples, seed, model)[source]#

Simulates data from the specified generative model.

Parameters:
prior_samples : dict

samples from prior distributions.

global_dict : dict

dictionary including all user-input settings.

Returns:
model_simulations : dict

simulated data from generative model.
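To illustrate the interface, a minimal sketch of a generative model that turns prior samples into model simulations; the Poisson regression, parameter names, and design points are illustrative assumptions, not a model shipped with the package:

import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

# Hypothetical prior samples for an intercept and a slope, shape (B, num_samples)
prior_samples = {
    "beta0": tf.random.normal((128, 200)),
    "beta1": tf.random.normal((128, 200)),
}
x = tf.constant([0.0, 0.5, 1.0])  # design points (illustrative)

# A simple Poisson regression: epred = exp(beta0 + beta1 * x), ypred ~ Poisson(epred)
epred = tf.exp(
    prior_samples["beta0"][..., None] + prior_samples["beta1"][..., None] * x
)
ypred = tfd.Poisson(rate=epred).sample()
model_simulations = {"epred": epred, "ypred": ypred}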