Title: | Bayesian Estimation for Finite Mixture of Distributions |
---|---|
Description: | Provides statistical tools for Bayesian estimation of mixture distributions, mainly a mixture of Gamma, Normal, and t-distributions. The package is implemented based on the Bayesian literature for the finite mixture of distributions, including Mohammadi et al. (2013) <doi:10.1007/s00180-012-0323-3> and Mohammadi and Salehi-Rad (2012) <doi:10.1080/03610918.2011.588358>. |
Authors: | Reza Mohammadi [aut, cre] |
Maintainer: | Reza Mohammadi <[email protected]> |
License: | GPL (>= 2) |
Version: | 1.7 |
Built: | 2025-01-07 05:30:25 UTC |
Source: | https://github.com/cran/bmixture |
The R package bmixture provides statistical tools for Bayesian estimation in finite mixtures of distributions. The package implements improvements from the Bayesian literature, including Mohammadi and Salehi-Rad (2012) and Mohammadi et al. (2013). In addition, the package contains several functions for simulation and visualization, as well as a real dataset taken from the literature.
Whenever using this package, please cite it as:
Mohammadi R. (2019). bmixture: Bayesian Estimation for Finite Mixture of Distributions, R package version 1.7, https://CRAN.R-project.org/package=bmixture
Reza Mohammadi <[email protected]>
Mohammadi, A., Salehi-Rad, M. R., and Wit, E. C. (2013) Using mixture of Gamma distributions for Bayesian analysis in an M/G/1 queue with optional second service. Computational Statistics, 28(2):683-700, doi:10.1007/s00180-012-0323-3
Mohammadi, A., and Salehi-Rad, M. R. (2012) Bayesian inference and prediction in an M/G/1 with optional second service. Communications in Statistics-Simulation and Computation, 41(3):419-435, doi:10.1080/03610918.2011.588358
Stephens, M. (2000) Bayesian analysis of mixture models with an unknown number of components-an alternative to reversible jump methods. Annals of statistics, 28(1):40-74, doi:10.1214/aos/1016120364
Richardson, S. and Green, P. J. (1997) On Bayesian analysis of mixtures with an unknown number of components. Journal of the Royal Statistical Society: series B, 59(4):731-792, doi:10.1111/1467-9868.00095
Green, P. J. (1995) Reversible jump Markov chain Monte Carlo computation and Bayesian model determination. Biometrika, 82(4):711-732, doi:10.1093/biomet/82.4.711
Cappe, O., Christian P. R., and Tobias, R. (2003) Reversible jump, birth and death and more general continuous time Markov chain Monte Carlo samplers. Journal of the Royal Statistical Society: Series B, 65(3):679-700
Wade, S. and Ghahramani, Z. (2018) Bayesian Cluster Analysis: Point Estimation and Credible Balls (with Discussion). Bayesian Analysis, 13(2):559-626, doi:10.1214/17-BA1073
## Not run: 
require( bmixture )

data( galaxy )

# Running the BD-MCMC algorithm for the galaxy dataset
mcmc_sample = bmixnorm( data = galaxy )

summary( mcmc_sample )
plot( mcmc_sample )
print( mcmc_sample )

# Simulating data from a mixture of Normals with 3 components
n      = 500
mean   = c( 0  , 10 , 3   )
sd     = c( 1  , 1  , 1   )
weight = c( 0.3, 0.5, 0.2 )

data = rmixnorm( n = n, weight = weight, mean = mean, sd = sd )

# Plot of the simulated data
hist( data, prob = TRUE, nclass = 30, col = "gray" )

x           = seq( -20, 20, 0.05 )
densmixnorm = dmixnorm( x, weight, mean, sd )
lines( x, densmixnorm, lwd = 2 )

# Running the BD-MCMC algorithm for the simulated data
bmixnorm.obj = bmixnorm( data, k = 3, iter = 1000 )
summary( bmixnorm.obj )
## End(Not run)
This function implements several sampling algorithms for Bayesian estimation of a finite mixture of Gamma distributions.
bmixgamma( data, k = "unknown", iter = 1000, burnin = iter / 2, lambda = 1,
           mu = NULL, nu = NULL, kesi = NULL, tau = NULL, k.start = NULL,
           alpha.start = NULL, beta.start = NULL, pi.start = NULL,
           k.max = 30, trace = TRUE )
data | vector of data with size n. |
k | number of components of the mixture distribution; it can take integer values. Default is "unknown". |
iter | number of iterations for the sampling algorithm. |
burnin | number of burn-in iterations for the sampling algorithm. |
lambda | for the case k = "unknown", parameter of the prior distribution of the number of components k. |
mu | parameter of alpha in the mixture distribution. |
nu | parameter of alpha in the mixture distribution. |
kesi | parameter of beta in the mixture distribution. |
tau | parameter of beta in the mixture distribution. |
k.start | for the case k = "unknown", initial value for the number of components of the mixture distribution. |
alpha.start | initial value for parameter alpha of the mixture distribution. |
beta.start | initial value for parameter beta of the mixture distribution. |
pi.start | initial value for parameter pi of the mixture distribution. |
k.max | for the case k = "unknown", maximum value for the number of components of the mixture distribution. |
trace | logical: if TRUE (default), tracing information is printed. |
Sampling from a finite mixture of Gamma distributions, with density

P( x | k, pi, alpha, beta ) = sum_{i=1}^{k} pi_i Gamma( x | alpha_i, beta_i ),

where k is the number of components of the mixture distribution (by default assumed unknown) and pi = ( pi_1, ..., pi_k ) is the vector of mixture weights, with sum_{i=1}^{k} pi_i = 1.
The prior distributions are defined as in Mohammadi et al. (2013); for more details see doi:10.1007/s00180-012-0323-3.
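The mixture density above can be evaluated directly in base R. The following minimal sketch (illustrative weights and parameters, using only stats::dgamma; not the package's internal code) computes a two-component Gamma mixture density:

    # Illustrative sketch: evaluating a finite Gamma mixture density by hand.
    # weight, alpha, beta are example values, not package defaults.
    weight = c( 0.6, 0.4 )   # mixture weights pi_i, summing to 1
    alpha  = c( 12 , 1   )   # shape parameters
    beta   = c( 3  , 2   )   # rate parameters

    x    = seq( 0, 10, 0.05 )
    dens = rowSums( sapply( seq_along( weight ),
             function( i ) weight[ i ] * dgamma( x, shape = alpha[ i ], rate = beta[ i ] ) ) )

    plot( x, dens, type = "l", lwd = 2 )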
An object with S3 class "bmixgamma" is returned:
all_k | a vector which includes the number of mixture components for all iterations; it is needed for monitoring the convergence of the BD-MCMC algorithm. |
all_weights | a vector which includes the waiting times for all iterations; it is needed for monitoring the convergence of the BD-MCMC algorithm. |
pi_sample | a vector which includes the MCMC samples, after burn-in, of parameter pi of the mixture distribution. |
alpha_sample | a vector which includes the MCMC samples, after burn-in, of parameter alpha of the mixture distribution. |
beta_sample | a vector which includes the MCMC samples, after burn-in, of parameter beta of the mixture distribution. |
data | original data. |
Reza Mohammadi [email protected]
Mohammadi, A., Salehi-Rad, M. R., and Wit, E. C. (2013) Using mixture of Gamma distributions for Bayesian analysis in an M/G/1 queue with optional second service. Computational Statistics, 28(2):683-700, doi:10.1007/s00180-012-0323-3
Mohammadi, A., and Salehi-Rad, M. R. (2012) Bayesian inference and prediction in an M/G/1 with optional second service. Communications in Statistics-Simulation and Computation, 41(3):419-435, doi:10.1080/03610918.2011.588358
Stephens, M. (2000) Bayesian analysis of mixture models with an unknown number of components-an alternative to reversible jump methods. Annals of statistics, 28(1):40-74, doi:10.1214/aos/1016120364
Richardson, S. and Green, P. J. (1997) On Bayesian analysis of mixtures with an unknown number of components. Journal of the Royal Statistical Society: series B, 59(4):731-792, doi:10.1111/1467-9868.00095
Green, P. J. (1995) Reversible jump Markov chain Monte Carlo computation and Bayesian model determination. Biometrika, 82(4):711-732, doi:10.1093/biomet/82.4.711
Cappe, O., Christian P. R., and Tobias, R. (2003) Reversible jump, birth and death and more general continuous time Markov chain Monte Carlo samplers. Journal of the Royal Statistical Society: Series B, 65(3):679-700
Wade, S. and Ghahramani, Z. (2018) Bayesian Cluster Analysis: Point Estimation and Credible Balls (with Discussion). Bayesian Analysis, 13(2):559-626, doi:10.1214/17-BA1073
## Not run: 
set.seed( 70 )

# Simulating data from a mixture of Gammas with two components
n      = 1000          # number of observations
weight = c( 0.6, 0.4 )
alpha  = c( 12 , 1   )
beta   = c( 3  , 2   )

data = rmixgamma( n = n, weight = weight, alpha = alpha, beta = beta )

# Plot of the simulated data
hist( data, prob = TRUE, nclass = 50, col = "gray" )

x     = seq( 0, 10, 0.05 )
truth = dmixgamma( x, weight, alpha, beta )
lines( x, truth, lwd = 2 )

# Running the BD-MCMC algorithm for the simulated data
bmixgamma.obj = bmixgamma( data, iter = 1000 )
summary( bmixgamma.obj )
plot( bmixgamma.obj )
## End(Not run)
This function implements several sampling algorithms for Bayesian estimation of a finite mixture of Normal distributions.
bmixnorm( data, k = "unknown", iter = 1000, burnin = iter / 2, lambda = 1,
          k.start = NULL, mu.start = NULL, sig.start = NULL, pi.start = NULL,
          k.max = 30, trace = TRUE )
data | vector of data with size n. |
k | number of components of the mixture distribution; it can take integer values. Default is "unknown". |
iter | number of iterations for the sampling algorithm. |
burnin | number of burn-in iterations for the sampling algorithm. |
lambda | for the case k = "unknown", parameter of the prior distribution of the number of components k. |
k.start | for the case k = "unknown", initial value for the number of components of the mixture distribution. |
mu.start | initial value for parameter mu of the mixture distribution. |
sig.start | initial value for parameter sig of the mixture distribution. |
pi.start | initial value for parameter pi of the mixture distribution. |
k.max | for the case k = "unknown", maximum value for the number of components of the mixture distribution. |
trace | logical: if TRUE (default), tracing information is printed. |
Sampling from a finite mixture of Normal distributions, with density

P( x | k, pi, mu, sigma ) = sum_{i=1}^{k} pi_i N( x | mu_i, sigma_i^2 ),

where k is the number of components of the mixture distribution (by default assumed unknown).
The prior distributions, in which an inverted gamma (IG) distribution is used for the component variances, are defined as in Stephens (2000); for more details see doi:10.1214/aos/1016120364.
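The generative scheme behind such a mixture can be sketched in a few lines of base R: draw a latent component label for each observation from the weights, then draw from the corresponding Normal. The values below are illustrative, and this is a sketch of the mixture model itself, not of the package's sampler:

    # Minimal sketch of sampling from a finite Normal mixture via latent labels.
    # weight, mean, sd are illustrative values, not package defaults.
    set.seed( 1 )
    n      = 500
    weight = c( 0.3, 0.5, 0.2 )
    mean   = c( 0  , 10 , 3   )
    sd     = c( 1  , 1  , 1   )

    z    = sample( seq_along( weight ), size = n, replace = TRUE, prob = weight )
    data = rnorm( n, mean = mean[ z ], sd = sd[ z ] )

    hist( data, prob = TRUE, nclass = 30, col = "gray" )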
An object with S3 class "bmixnorm" is returned:
all_k | a vector which includes the number of mixture components for all iterations; it is needed for monitoring the convergence of the BD-MCMC algorithm. |
all_weights | a vector which includes the waiting times for all iterations; it is needed for monitoring the convergence of the BD-MCMC algorithm. |
pi_sample | a vector which includes the MCMC samples, after burn-in, of parameter pi of the mixture distribution. |
mu_sample | a vector which includes the MCMC samples, after burn-in, of parameter mu of the mixture distribution. |
sig_sample | a vector which includes the MCMC samples, after burn-in, of parameter sig of the mixture distribution. |
data | original data. |
Reza Mohammadi [email protected]
Stephens, M. (2000) Bayesian analysis of mixture models with an unknown number of components-an alternative to reversible jump methods. Annals of statistics, 28(1):40-74, doi:10.1214/aos/1016120364
Richardson, S. and Green, P. J. (1997) On Bayesian analysis of mixtures with an unknown number of components. Journal of the Royal Statistical Society: series B, 59(4):731-792, doi:10.1111/1467-9868.00095
Green, P. J. (1995) Reversible jump Markov chain Monte Carlo computation and Bayesian model determination. Biometrika, 82(4):711-732, doi:10.1093/biomet/82.4.711
Cappe, O., Christian P. R., and Tobias, R. (2003) Reversible jump, birth and death and more general continuous time Markov chain Monte Carlo samplers. Journal of the Royal Statistical Society: Series B, 65(3):679-700
Mohammadi, A., Salehi-Rad, M. R., and Wit, E. C. (2013) Using mixture of Gamma distributions for Bayesian analysis in an M/G/1 queue with optional second service. Computational Statistics, 28(2):683-700, doi:10.1007/s00180-012-0323-3
Mohammadi, A., and Salehi-Rad, M. R. (2012) Bayesian inference and prediction in an M/G/1 with optional second service. Communications in Statistics-Simulation and Computation, 41(3):419-435, doi:10.1080/03610918.2011.588358
Wade, S. and Ghahramani, Z. (2018) Bayesian Cluster Analysis: Point Estimation and Credible Balls (with Discussion). Bayesian Analysis, 13(2):559-626, doi:10.1214/17-BA1073
## Not run: 
data( galaxy )
set.seed( 70 )

# Running the BD-MCMC algorithm for the galaxy dataset
mcmc_sample = bmixnorm( data = galaxy )

summary( mcmc_sample )
plot( mcmc_sample )
print( mcmc_sample )

# Simulating data from a mixture of Normals with 3 components
n      = 500
weight = c( 0.3, 0.5, 0.2 )
mean   = c( 0  , 10 , 3   )
sd     = c( 1  , 1  , 1   )

data = rmixnorm( n = n, weight = weight, mean = mean, sd = sd )

# Plot of the simulated data
hist( data, prob = TRUE, nclass = 30, col = "gray" )

x           = seq( -20, 20, 0.05 )
densmixnorm = dmixnorm( x, weight, mean, sd )
lines( x, densmixnorm, lwd = 2 )

# Running the BD-MCMC algorithm for the simulated data
bmixnorm.obj = bmixnorm( data, k = 3, iter = 1000 )
summary( bmixnorm.obj )
## End(Not run)
This function implements several sampling algorithms for Bayesian estimation of a finite mixture of t-distributions.
bmixt( data, k = "unknown", iter = 1000, burnin = iter / 2, lambda = 1,
       df = 1, k.start = NULL, mu.start = NULL, sig.start = NULL,
       pi.start = NULL, k.max = 30, trace = TRUE )
data | vector of data with size n. |
k | number of components of the mixture distribution. Default is "unknown". |
iter | number of iterations for the sampling algorithm. |
burnin | number of burn-in iterations for the sampling algorithm. |
lambda | for the case k = "unknown", parameter of the prior distribution of the number of components k. |
df | degrees of freedom (> 0, possibly non-integer). df = Inf is allowed. |
k.start | for the case k = "unknown", initial value for the number of components of the mixture distribution. |
mu.start | initial value for parameter mu of the mixture distribution. |
sig.start | initial value for parameter sig of the mixture distribution. |
pi.start | initial value for parameter pi of the mixture distribution. |
k.max | for the case k = "unknown", maximum value for the number of components of the mixture distribution. |
trace | logical: if TRUE (default), tracing information is printed. |
Sampling from a finite mixture of t-distributions, with density

P( x | k, pi, mu, sigma ) = sum_{i=1}^{k} pi_i t_df( x | mu_i, sigma_i ),

where k is the number of components of the mixture distribution (by default assumed unknown) and t_df denotes the t-distribution with df degrees of freedom, location mu_i, and scale sigma_i.
The prior distributions, in which an inverted gamma (IG) distribution is used for the component scales, are defined as in Stephens (2000); for more details see doi:10.1214/aos/1016120364.
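Each component of such a t mixture is a location-scale transform of a standard t variate, so data from the model can be sketched in base R as follows (illustrative values; not the package's internal sampler):

    # Minimal sketch: drawing from a location-scale t mixture via latent labels.
    # weight, df, mean, sd are illustrative values, not package defaults.
    set.seed( 1 )
    n      = 500
    weight = c( 0.3, 0.5, 0.2 )
    df     = c( 4  , 4  , 4   )
    mean   = c( 0  , 10 , 3   )
    sd     = c( 1  , 1  , 1   )

    z    = sample( seq_along( weight ), size = n, replace = TRUE, prob = weight )
    data = mean[ z ] + sd[ z ] * rt( n, df = df[ z ] )

    hist( data, prob = TRUE, nclass = 30, col = "gray" )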
An object with S3 class "bmixt" is returned:
all_k | a vector which includes the number of mixture components for all iterations; it is needed for monitoring the convergence of the BD-MCMC algorithm. |
all_weights | a vector which includes the waiting times for all iterations; it is needed for monitoring the convergence of the BD-MCMC algorithm. |
pi_sample | a vector which includes the MCMC samples, after burn-in, of parameter pi of the mixture distribution. |
mu_sample | a vector which includes the MCMC samples, after burn-in, of parameter mu of the mixture distribution. |
sig_sample | a vector which includes the MCMC samples, after burn-in, of parameter sig of the mixture distribution. |
data | original data. |
Reza Mohammadi [email protected]
Stephens, M. (2000) Bayesian analysis of mixture models with an unknown number of components-an alternative to reversible jump methods. Annals of statistics, 28(1):40-74, doi:10.1214/aos/1016120364
Richardson, S. and Green, P. J. (1997) On Bayesian analysis of mixtures with an unknown number of components. Journal of the Royal Statistical Society: series B, 59(4):731-792, doi:10.1111/1467-9868.00095
Green, P. J. (1995) Reversible jump Markov chain Monte Carlo computation and Bayesian model determination. Biometrika, 82(4):711-732, doi:10.1093/biomet/82.4.711
Cappe, O., Christian P. R., and Tobias, R. (2003) Reversible jump, birth and death and more general continuous time Markov chain Monte Carlo samplers. Journal of the Royal Statistical Society: Series B, 65(3):679-700
Mohammadi, A., Salehi-Rad, M. R., and Wit, E. C. (2013) Using mixture of Gamma distributions for Bayesian analysis in an M/G/1 queue with optional second service. Computational Statistics, 28(2):683-700, doi:10.1007/s00180-012-0323-3
Mohammadi, A., and Salehi-Rad, M. R. (2012) Bayesian inference and prediction in an M/G/1 with optional second service. Communications in Statistics-Simulation and Computation, 41(3):419-435, doi:10.1080/03610918.2011.588358
Wade, S. and Ghahramani, Z. (2018) Bayesian Cluster Analysis: Point Estimation and Credible Balls (with Discussion). Bayesian Analysis, 13(2):559-626, doi:10.1214/17-BA1073
## Not run: 
set.seed( 20 )

# Simulating data from a mixture of Normals with 3 components
n      = 2000
weight = c( 0.3, 0.5, 0.2 )
mean   = c( 0  , 10 , 3   )
sd     = c( 1  , 1  , 1   )

data = rmixnorm( n = n, weight = weight, mean = mean, sd = sd )

# Plot of the simulated data
hist( data, prob = TRUE, nclass = 30, col = "gray" )

x           = seq( -20, 20, 0.05 )
densmixnorm = dmixnorm( x, weight, mean, sd )
lines( x, densmixnorm, lwd = 2 )

# Running the BD-MCMC algorithm for the simulated data
bmixt.obj = bmixt( data, k = 3, iter = 5000 )
summary( bmixt.obj )
## End(Not run)
This dataset consists of 82 observations of the velocities (in 1000 km/second) of distant galaxies diverging from our own, from six well-separated conic sections of the Corona Borealis. The dataset has been analyzed under a variety of mixture models; see, e.g., Stephens (2000).
data( galaxy )
A data frame with 82 observations on the following variable.
speed
a numeric vector giving the speed of galaxies (in 1000 km/second).
Stephens, M. (2000) Bayesian analysis of mixture models with an unknown number of components-an alternative to reversible jump methods. Annals of statistics, 28(1):40-74, doi:10.1214/aos/1016120364
data( galaxy )

hist( galaxy, prob = TRUE, xlim = c( 0, 40 ), ylim = c( 0, 0.3 ),
      nclass = 20, col = "gray", border = "white" )

lines( density( galaxy ), col = "black", lwd = 2 )
Random generation and density functions for a finite mixture of Gamma distributions.
rmixgamma( n = 10, weight = 1, alpha = 1, beta = 1 )

dmixgamma( x, weight = 1, alpha = 1, beta = 1 )
n | number of observations. |
x | vector of quantiles. |
weight | vector of probability weights, with length equal to the number of components (k). |
alpha | vector of non-negative parameters of the Gamma distribution. |
beta | vector of non-negative parameters of the Gamma distribution. |
Sampling from a finite mixture of Gamma distributions, with density

P( x | pi, alpha, beta ) = sum_{i=1}^{k} pi_i Gamma( x | alpha_i, beta_i ),

where pi is the vector of probability weights and alpha_i and beta_i are the parameters of component i.
Generated data as a vector of size n.
Reza Mohammadi [email protected]
Mohammadi, A., Salehi-Rad, M. R., and Wit, E. C. (2013) Using mixture of Gamma distributions for Bayesian analysis in an M/G/1 queue with optional second service. Computational Statistics, 28(2):683-700, doi:10.1007/s00180-012-0323-3
Mohammadi, A., and Salehi-Rad, M. R. (2012) Bayesian inference and prediction in an M/G/1 with optional second service. Communications in Statistics-Simulation and Computation, 41(3):419-435, doi:10.1080/03610918.2011.588358
## Not run: 
n      = 10000
weight = c( 0.6  , 0.3  , 0.1   )
alpha  = c( 100  , 200  , 300   )
beta   = c( 100/3, 200/4, 300/5 )

data = rmixgamma( n = n, weight = weight, alpha = alpha, beta = beta )

hist( data, prob = TRUE, nclass = 30, col = "gray" )

x            = seq( 0, 10, 0.05 )
densmixgamma = dmixgamma( x, weight, alpha, beta )
lines( x, densmixgamma, lwd = 2 )
## End(Not run)
Random generation and density functions for a finite mixture of univariate Normal distributions.
rmixnorm( n = 10, weight = 1, mean = 0, sd = 1 )

dmixnorm( x, weight = 1, mean = 0, sd = 1 )
n | number of observations. |
x | vector of quantiles. |
weight | vector of probability weights, with length equal to the number of components (k). |
mean | vector of means. |
sd | vector of standard deviations. |
Sampling from a finite mixture of Normal distributions, with density

P( x | pi, mu, sigma ) = sum_{i=1}^{k} pi_i N( x | mu_i, sigma_i^2 ),

where pi is the vector of probability weights.
Generated data as a vector of size n.
Reza Mohammadi [email protected]
Mohammadi, A., Salehi-Rad, M. R., and Wit, E. C. (2013) Using mixture of Gamma distributions for Bayesian analysis in an M/G/1 queue with optional second service. Computational Statistics, 28(2):683-700, doi:10.1007/s00180-012-0323-3
Mohammadi, A., and Salehi-Rad, M. R. (2012) Bayesian inference and prediction in an M/G/1 with optional second service. Communications in Statistics-Simulation and Computation, 41(3):419-435, doi:10.1080/03610918.2011.588358
## Not run: 
n      = 10000
weight = c( 0.3, 0.5, 0.2 )
mean   = c( 0  , 10 , 3   )
sd     = c( 1  , 1  , 1   )

data = rmixnorm( n = n, weight = weight, mean = mean, sd = sd )

hist( data, prob = TRUE, nclass = 30, col = "gray" )

x           = seq( -20, 20, 0.05 )
densmixnorm = dmixnorm( x, weight, mean, sd )
lines( x, densmixnorm, lwd = 2 )
## End(Not run)
Random generation and density functions for a finite mixture of univariate t-distributions.
rmixt( n = 10, weight = 1, df = 1, mean = 0, sd = 1 )

dmixt( x, weight = 1, df = 1, mean = 0, sd = 1 )
n | number of observations. |
x | vector of quantiles. |
weight | vector of probability weights, with length equal to the number of components (k). |
df | vector of degrees of freedom (> 0, possibly non-integer). df = Inf is allowed. |
mean | vector of means. |
sd | vector of standard deviations. |
Sampling from a finite mixture of t-distributions, with density

P( x | pi, df, mu, sigma ) = sum_{i=1}^{k} pi_i t_{df_i}( x | mu_i, sigma_i ),

where pi is the vector of probability weights and t_{df_i} denotes the t-distribution with df_i degrees of freedom, location mu_i, and scale sigma_i.
Generated data as a vector of size n.
Reza Mohammadi [email protected]
Mohammadi, A., Salehi-Rad, M. R., and Wit, E. C. (2013) Using mixture of Gamma distributions for Bayesian analysis in an M/G/1 queue with optional second service. Computational Statistics, 28(2):683-700, doi:10.1007/s00180-012-0323-3
Mohammadi, A., and Salehi-Rad, M. R. (2012) Bayesian inference and prediction in an M/G/1 with optional second service. Communications in Statistics-Simulation and Computation, 41(3):419-435, doi:10.1080/03610918.2011.588358
## Not run: 
n      = 10000
weight = c( 0.3, 0.5, 0.2 )
df     = c( 4  , 4  , 4   )
mean   = c( 0  , 10 , 3   )
sd     = c( 1  , 1  , 1   )

data = rmixt( n = n, weight = weight, df = df, mean = mean, sd = sd )

hist( data, prob = TRUE, nclass = 30, col = "gray" )

x        = seq( -20, 20, 0.05 )
densmixt = dmixt( x, weight, df, mean, sd )
lines( x, densmixt, lwd = 2 )
## End(Not run)
Plot function for S3 class "bmixgamma". Visualizes the results of function bmixgamma.
## S3 method for class 'bmixgamma'
plot( x, ... )
x | An object of S3 class "bmixgamma". |
... | System reserved (no specific usage). |
Reza Mohammadi [email protected]
## Not run: 
# Simulating data from a mixture of Gammas with two components
n      = 500           # number of observations
weight = c( 0.6, 0.4 )
alpha  = c( 12 , 1   )
beta   = c( 3  , 2   )

data <- rmixgamma( n = n, weight = weight, alpha = alpha, beta = beta )

# Plot of the simulated data
hist( data, prob = TRUE, nclass = 50, col = "gray" )

x     = seq( 0, 10, 0.05 )
truth = dmixgamma( x, weight, alpha, beta )
lines( x, truth, lwd = 2 )

# Running the BD-MCMC algorithm for the simulated data
bmixgamma.obj <- bmixgamma( data )
plot( bmixgamma.obj )
## End(Not run)
Plot function for S3 class "bmixnorm". Visualizes the results of function bmixnorm.
## S3 method for class 'bmixnorm'
plot( x, ... )
x | An object of S3 class "bmixnorm". |
... | System reserved (no specific usage). |
Reza Mohammadi [email protected]
## Not run: 
# Simulating data from a mixture of Normals with 3 components
n      = 500
weight = c( 0.3, 0.5, 0.2 )
mean   = c( 0  , 10 , 3   )
sd     = c( 1  , 1  , 1   )

data = rmixnorm( n = n, weight = weight, mean = mean, sd = sd )

# Plot of the simulated data
hist( data, prob = TRUE, nclass = 30, col = "gray" )

x           = seq( -20, 20, 0.05 )
densmixnorm = dmixnorm( x, weight, mean, sd )
lines( x, densmixnorm, lwd = 2 )

# Running the BD-MCMC algorithm for the simulated data
bmixnorm.obj = bmixnorm( data, k = 3 )
plot( bmixnorm.obj )
## End(Not run)
Plot function for S3 class "bmixt". Visualizes the results of function bmixt.
## S3 method for class 'bmixt'
plot( x, ... )
x | An object of S3 class "bmixt". |
... | System reserved (no specific usage). |
Reza Mohammadi [email protected]
## Not run: 
# Simulating data from a mixture of Normals with 3 components
n      = 500
weight = c( 0.3, 0.5, 0.2 )
mean   = c( 0  , 10 , 3   )
sd     = c( 1  , 1  , 1   )

data = rmixnorm( n = n, weight = weight, mean = mean, sd = sd )

# Plot of the simulated data
hist( data, prob = TRUE, nclass = 30, col = "gray" )

x           = seq( -20, 20, 0.05 )
densmixnorm = dmixnorm( x, weight, mean, sd )
lines( x, densmixnorm, lwd = 2 )

# Running the BD-MCMC algorithm for the simulated data
bmixt.obj = bmixt( data, k = 3 )
plot( bmixt.obj )
## End(Not run)
Print function for S3 class "bmixgamma". Prints information about the output of function bmixgamma.
## S3 method for class 'bmixgamma'
print( x, ... )
x | An object of S3 class "bmixgamma". |
... | System reserved (no specific usage). |
Reza Mohammadi [email protected]
## Not run: 
# Simulating data from a mixture of Gammas with two components
n      = 500           # number of observations
weight = c( 0.6, 0.4 )
alpha  = c( 12 , 1   )
beta   = c( 3  , 2   )

data <- rmixgamma( n = n, weight = weight, alpha = alpha, beta = beta )

# Plot of the simulated data
hist( data, prob = TRUE, nclass = 50, col = "gray" )

x     = seq( 0, 10, 0.05 )
truth = dmixgamma( x, weight, alpha, beta )
lines( x, truth, lwd = 2 )

# Running the BD-MCMC algorithm for the simulated data
bmixgamma.obj <- bmixgamma( data, iter = 500 )
print( bmixgamma.obj )
## End(Not run)
Print function for S3 class "bmixnorm". Prints information about the output of function bmixnorm.
## S3 method for class 'bmixnorm'
print( x, ... )
x | An object of S3 class "bmixnorm". |
... | System reserved (no specific usage). |
Reza Mohammadi [email protected]
## Not run: 
# Simulating data from a mixture of Normals with 3 components
n      = 500
weight = c( 0.3, 0.5, 0.2 )
mean   = c( 0  , 10 , 3   )
sd     = c( 1  , 1  , 1   )

data = rmixnorm( n = n, weight = weight, mean = mean, sd = sd )

# Plot of the simulated data
hist( data, prob = TRUE, nclass = 30, col = "gray" )

x           = seq( -20, 20, 0.05 )
densmixnorm = dmixnorm( x, weight, mean, sd )
lines( x, densmixnorm, lwd = 2 )

# Running the BD-MCMC algorithm for the simulated data
bmixnorm.obj = bmixnorm( data, k = 3, iter = 1000 )
print( bmixnorm.obj )
## End(Not run)
Print function for S3 class "bmixt". Prints information about the output of function bmixt.
## S3 method for class 'bmixt'
print( x, ... )
x | An object of class "bmixt". |
... | System reserved (no specific usage). |
Reza Mohammadi [email protected]
## Not run: 
# Simulating data from a mixture of Normal with 3 components
n      = 500
weight = c( 0.3, 0.5, 0.2 )
mean   = c( 0, 10, 3 )
sd     = c( 1, 1, 1 )

data = rmixnorm( n = n, weight = weight, mean = mean, sd = sd )

# Plot of the simulated data
hist( data, prob = TRUE, nclass = 30, col = "gray" )

x           = seq( -20, 20, 0.05 )
densmixnorm = dmixnorm( x, weight, mean, sd )
lines( x, densmixnorm, lwd = 2 )

# Running the bdmcmc algorithm for the above simulated data set
bmixt.obj = bmixt( data, k = 3, iter = 1000 )
print( bmixt.obj )
## End(Not run)
Random generation from the Dirichlet distribution.
rdirichlet( n = 10, alpha = c( 1, 1 ) )
n | number of observations. |
alpha | vector of shape parameters. |
The Dirichlet distribution is the multidimensional generalization of the beta distribution.
A matrix with n rows, each row containing a single Dirichlet random deviate.
Reza Mohammadi [email protected]
draws = rdirichlet( n = 500, alpha = c( 1, 1, 1 ) )
boxplot( draws )
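As a quick sanity check (a sketch, not part of the original documentation), the simplex property of the draws can be verified directly: each row is a probability vector, so its entries sum to one, and for many draws the column means approach alpha / sum( alpha ).

```r
library( bmixture )

# Each row of the rdirichlet() output is a point on the probability simplex,
# so its entries are non-negative and sum to one
draws = rdirichlet( n = 1000, alpha = c( 2, 3, 4 ) )

summary( rowSums( draws ) )   # all numerically equal to 1
colMeans( draws )             # close to alpha / sum( alpha ) = c( 2, 3, 4 ) / 9
```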
Summary function for S3 class "bmixgamma"

Provides a summary of the results of function bmixgamma.
## S3 method for class 'bmixgamma' summary( object, ... )
object | An object of class "bmixgamma". |
... | System reserved (no specific usage). |
Reza Mohammadi [email protected]
## Not run: 
# Simulating data from a mixture of gamma with two components
n      = 500          # number of observations
weight = c( 0.6, 0.4 )
alpha  = c( 12, 1 )
beta   = c( 3, 2 )

data <- rmixgamma( n = n, weight = weight, alpha = alpha, beta = beta )

# Plot of the simulated data
hist( data, prob = TRUE, nclass = 50, col = "gray" )

x     = seq( 0, 10, 0.05 )
truth = dmixgamma( x, weight, alpha, beta )
lines( x, truth, lwd = 2 )

# Running the bdmcmc algorithm for the above simulated data set
bmixgamma.obj <- bmixgamma( data, iter = 500 )
summary( bmixgamma.obj )
## End(Not run)
Summary function for S3 class "bmixnorm"

Provides a summary of the results of function bmixnorm.
## S3 method for class 'bmixnorm' summary( object, ... )
object | An object of class "bmixnorm". |
... | System reserved (no specific usage). |
Reza Mohammadi [email protected]
## Not run: 
# Simulating data from a mixture of Normal with 3 components
n      = 500
weight = c( 0.3, 0.5, 0.2 )
mean   = c( 0, 10, 3 )
sd     = c( 1, 1, 1 )

data = rmixnorm( n = n, weight = weight, mean = mean, sd = sd )

# Plot of the simulated data
hist( data, prob = TRUE, nclass = 30, col = "gray" )

x           = seq( -20, 20, 0.05 )
densmixnorm = dmixnorm( x, weight, mean, sd )
lines( x, densmixnorm, lwd = 2 )

# Running the bdmcmc algorithm for the above simulated data set
bmixnorm.obj = bmixnorm( data, k = 3, iter = 1000 )
summary( bmixnorm.obj )
## End(Not run)
Summary function for S3 class "bmixt"

Provides a summary of the results of function bmixt.
## S3 method for class 'bmixt' summary( object, ... )
object | An object of class "bmixt". |
... | System reserved (no specific usage). |
Reza Mohammadi [email protected]
## Not run: 
# Simulating data from a mixture of Normal with 3 components
n      = 500
weight = c( 0.3, 0.5, 0.2 )
mean   = c( 0, 10, 3 )
sd     = c( 1, 1, 1 )

data = rmixnorm( n = n, weight = weight, mean = mean, sd = sd )

# Plot of the simulated data
hist( data, prob = TRUE, nclass = 30, col = "gray" )

x           = seq( -20, 20, 0.05 )
densmixnorm = dmixnorm( x, weight, mean, sd )
lines( x, densmixnorm, lwd = 2 )

# Running the bdmcmc algorithm for the above simulated data set
bmixt.obj = bmixt( data, k = 3, iter = 1000 )
summary( bmixt.obj )
## End(Not run)