Huelsenbeck et al. (2001) said that Bayesian inference of phylogeny is a powerful tool for addressing a number of long-standing, complex questions in evolutionary biology. The power they describe lies in the ability to specify prior distributions for the parameters and, from those priors and the likelihood of the data, to infer the posterior probability of a tree. The posterior probability of each clade is estimated as the frequency with which that clade is recovered among the trees sampled once the MCMC algorithm has reached stationarity in log-likelihood. The numbers on the branches are said to represent the probability that the clade is correct, or true (Huelsenbeck et al., 2002). The MCMC machinery is also credited with a further virtue: it makes Bayesian inference fast.
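To make that mechanics concrete, here is a minimal sketch (my own illustration, not code from the papers cited) of how clade posteriors are typically read off an MCMC sample: each sampled tree is reduced to its set of clades, an initial burn-in fraction is discarded, and a clade's posterior probability is simply its frequency among the retained trees. The representation of trees as sets of frozensets and the `burnin_fraction` parameter are assumptions made for the example.

```python
from collections import Counter

def clade_posteriors(sampled_trees, burnin_fraction=0.25):
    """Estimate posterior clade probabilities from an MCMC sample of trees.

    `sampled_trees`: trees in sampling order, each represented (for
    illustration) as a set of clades, where a clade is a frozenset of
    taxon names. `burnin_fraction`: fraction of early samples discarded
    before the chain is assumed to have reached stationarity.
    """
    start = int(len(sampled_trees) * burnin_fraction)
    kept = sampled_trees[start:]
    counts = Counter(clade for tree in kept for clade in tree)
    # Posterior probability of a clade = its frequency among retained samples.
    return {clade: n / len(kept) for clade, n in counts.items()}

# Toy usage: four retained trees; the clade {A, B} appears in three of them.
sample = [
    {frozenset("AB"), frozenset("ABC")},
    {frozenset("AB"), frozenset("ABD")},
    {frozenset("CD"), frozenset("ACD")},
    {frozenset("AB"), frozenset("ABC")},
]
print(clade_posteriors(sample, burnin_fraction=0.0)[frozenset("AB")])  # 0.75
```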
But are those virtues real, or just a hope that attracts the credulous? First, the prior problem: how do we calculate the probability of something we have not seen (Sober, 2002)? One could argue that the prior distribution on trees is not a problem, because we can assign the same probability to all of them (a flat prior) and so consider every possibility; but this strips away one of the advertised virtues of Bayesian inference, namely bringing the plausibility of an event into the calculation. Moreover, Steel & Pickett (2006) proved that, except in a trivial special case, no prior on trees induces a uniform distribution on clades, which undermines the evaluation of support for particular clades because their prior probability is influenced by clade size.
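That size effect can be checked directly on a tiny example. The sketch below (again my own illustration, not from the cited papers) enumerates all 105 rooted binary topologies on five taxa, counts how often each clade occurs, and reports the clade prior induced by a flat prior on topologies: clades of three taxa receive a lower induced prior (about 0.086) than clades of two or four taxa (about 0.143), so the clade prior is not uniform.

```python
from itertools import combinations
from collections import Counter

def rooted_trees(leaves):
    """Enumerate every rooted binary topology on `leaves` as nested tuples."""
    if len(leaves) == 1:
        return [leaves[0]]
    first, rest = leaves[0], leaves[1:]
    trees = []
    # Keep `first` in the left subtree so each unordered topology appears once.
    for k in range(len(rest)):
        for chosen in combinations(rest, k):
            right = tuple(x for x in rest if x not in chosen)
            for lt in rooted_trees((first,) + chosen):
                for rt in rooted_trees(right):
                    trees.append((lt, rt))
    return trees

def collect_clades(tree, acc):
    """Record every internal node's leaf set in `acc`; return the subtree's leaves."""
    if not isinstance(tree, tuple):
        return frozenset([tree])
    leafset = collect_clades(tree[0], acc) | collect_clades(tree[1], acc)
    acc.add(leafset)
    return leafset

taxa = ("A", "B", "C", "D", "E")
trees = rooted_trees(taxa)                      # 105 topologies for 5 taxa
counts = Counter()
for t in trees:
    acc = set()
    collect_clades(t, acc)
    for clade in acc:
        if 1 < len(clade) < len(taxa):          # skip trivial and root clades
            counts[clade] += 1

# Induced prior of a clade under a flat prior on topologies, grouped by size.
by_size = {}
for clade, n in counts.items():
    by_size.setdefault(len(clade), n / len(trees))
for size in sorted(by_size):
    print(f"clade of {size} taxa: induced prior {by_size[size]:.3f}")
```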
One of the most attractive features of Bayesian inference is its speed, but when we need to be sure that the chains have converged, the running time grows with the complexity and size of the data set (Goloboff & Pol, 2005). The estimation of the posterior probability of each clade rests on chain convergence, so a poorly run analysis leads to mistaken estimates. Moreover, even granting that the posterior probabilities have been well estimated, the way the MCMC sample is summarized can give internally inconsistent answers (between the topology and the data), because the majority-rule consensus may not recognize certain similarities among trees and may be a poor summary (Yang, 2006). Besides, the posterior probability cannot be taken as a universal probability of truth: it is conditioned on the data, the model and the prior, so it is only a "local" probability (Simmons et al., 2004; Yang, 2006). Finally, and not least, in practice Bayesian inference inflates the probabilities of correct clades and recovers high probabilities for incorrect nodes (e.g. Douady et al., 2003; Simmons et al., 2004; Goloboff & Pol, 2005).
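As a small illustration of that summarization step (my own sketch, using the same clade-frequency format as the first example), a majority-rule consensus simply keeps the clades whose sampled frequency exceeds 50%. The retained set reflects marginal clade frequencies only, so the combination of retained clades need not correspond to any single topology the chain actually visited, which is one sense in which it can be a poor summary of the sample.

```python
def majority_rule_clades(posteriors, threshold=0.5):
    """Keep the clades whose estimated posterior exceeds `threshold`.

    `posteriors` maps a clade (frozenset of taxon names) to its frequency
    among sampled trees. The result uses marginal frequencies only and says
    nothing about how clades co-occurred across the sampled topologies.
    """
    return {clade for clade, p in posteriors.items() if p > threshold}

# Toy usage: only the clade {A, B} (frequency 0.75) clears the 50% threshold.
toy = {frozenset("AB"): 0.75, frozenset("ABC"): 0.50, frozenset("CD"): 0.25}
print(majority_rule_clades(toy))   # {frozenset({'A', 'B'})}
```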