In the real world, information is often observed only incompletely. An intelligent system operating in such an environment must therefore combine the observed information with prior knowledge of the environment. To realize this integration mechanism, a model based on Bayesian decision theory is proposed. In this framework, prior knowledge is represented as a conditional prior probability distribution and is integrated with observations by Bayes' theorem. For optimal Bayesian decisions based on the posterior probability, appropriate prior probabilities must be estimated, and these often depend on the particular situation. Such situated prior probabilities can be modeled by Bayesian networks. In particular, a Bayesian network on neural networks (BNNN) offers advantages for learning situated priors in real-world environments. As a demonstration of the framework, situated prior probabilities are learned from an English dictionary and applied to handwritten character recognition using a Bayesian classifier with a neural network. The result shows better recognition performance than an ordinary (maximum likelihood) classifier.
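To make the decision rule concrete, the following is a minimal Python sketch of the idea, not the paper's implementation: class-conditional likelihoods, such as those produced by a neural-network recognizer, are multiplied by a situated prior (e.g. letter probabilities conditioned on context, as could be learned from an English dictionary), and the posterior is maximized; this is contrasted with an ordinary maximum-likelihood decision that ignores the prior. The class labels and probability values below are illustrative assumptions.

```python
import numpy as np

# Minimal sketch: a Bayesian classifier that combines class-conditional
# likelihoods p(x | c) -- e.g. outputs of a neural-network recognizer --
# with a situated prior p(c | context), and compares the result with an
# ordinary maximum-likelihood decision.

def bayes_decision(likelihoods, situated_prior):
    """Return the class maximizing the posterior p(c | x, context),
    which is proportional to p(x | c) * p(c | context)."""
    posterior = likelihoods * situated_prior
    posterior /= posterior.sum()          # normalize (not needed for argmax)
    return int(np.argmax(posterior)), posterior

def ml_decision(likelihoods):
    """Ordinary maximum-likelihood decision: the prior is ignored."""
    return int(np.argmax(likelihoods))

# Hypothetical example with three character classes.
classes = ['a', 'o', 'e']
likelihoods = np.array([0.30, 0.35, 0.35])     # ambiguous handwritten input
situated_prior = np.array([0.60, 0.15, 0.25])  # context strongly favors 'a'

ml_c = ml_decision(likelihoods)
bayes_c, posterior = bayes_decision(likelihoods, situated_prior)
print("ML decision:      ", classes[ml_c])     # 'o' (prior ignored)
print("Bayesian decision:", classes[bayes_c])  # 'a' once the situated prior is applied
print("Posterior:", np.round(posterior, 3))
```

The sketch illustrates why a situated prior can change the decision: when the likelihoods alone are ambiguous, contextual prior knowledge tips the posterior toward the class that is plausible in the current situation.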