
Bibliography

1
S. Akaho.
The EM algorithm in an attention region (in Japanese).
In Proc. of the 4th Annual Conf. of the Japan Neural Network Society (JNNS), 1994.

2
S. Akaho.
The EM algorithm for multiple object recognition.
In Proc. of Int. Conf. on Neural Networks (ICNN'95), pp. 2426-2431, 1995.

3
S. Akaho.
Geometry of the EM algorithm (in Japanese).
Joho Shori (IPSJ Magazine), Vol. 37, No. 1, pp. 43-51, 1996.

4
S. Akaho, O. Hasegawa, T. Yoshimura, H. Asoh, and S. Hayamizu.
Concept acquisition from multiple information sources using the EM algorithm (in Japanese).
IEICE Technical Report PRMU96-91, 1996.

5
S. Akaho, S. Hayamizu, O. Hasegawa, T. Yoshimura, and H. Asoh.
Concept acquisition from multiple information sources using the EM algorithm (in Japanese).
IEICE Transactions, Vol. J80-A, No. 9, pp. 1546-1553, 1997.

6
S. Akaho.
Estimation of location, scale, and rotation parameters of probability distributions using the ECM algorithm (in Japanese).
IEICE Transactions, Vol. J82-D-II, No. 12, pp. 2240-2250, 1999.

7
S. Akaho and S. Umeyama.
Multimodal independent component analysis -- a method for extracting common features from multiple information sources -- (in Japanese).
IEICE Transactions, Vol. J83-A, No. 6, pp. 669-676, 2000.

8
S. Akaho, S. Hayamizu, O. Hasegawa, T. Yoshimura, and H. Asoh.
Multiple attribute learning with canonical correlation analysis and EM algorithm.
Technical Report 97-8, Electrotechnical Laboratory, 1997.

9
S. Akaho, S. Hayamizu, O. Hasegawa, K. Itou, T. Akiba, H. Asoh, T. Kurita, K. Sakaue, K. Tanaka, and N. Otsu.
Recent developments for multimodal interaction by visual agent with spoken language.
In Proc. of Int. Conf. on Multimodal Interaction (ICMI'96), pp. 135-139, 1996.

10
S. Akaho and H. J. Kappen.
Nonmonotonic generalization bias of Gaussian mixture models.
Neural Computation, Vol. 12, No. 6, pp. 1411-1428, 2000.

11
S. Akaho and H. J. Kappen.
Nonmonotonic generalization bias of Gaussian mixture models.
Technical Report 98-22, Electrotechnical Laboratory, 1998.

12
S. Akaho, Y. Kiuchi, and S. Umeyama.
MICA: Multimodal independent component analysis.
In Proc. of Int. Joint Conf. on Neural Networks (IJCNN'99), pp. 927-932, 1999.

13
H. Akaike.
A new look at the statistical model identification.
IEEE Trans. on Automatic Control, Vol. 19, No. 6, pp. 716-723, 1974.

14
S. Amari.
Differential Geometrical Methods in Statistics.
Lecture Notes in Statistics. Springer-Verlag, 1985.

15
S. Amari.
A universal theorem on learning curves.
Neural networks, Vol. 6, pp. 161-166, 1993.

16
S. Amari and H. Nagaoka.
Methods of Information Geometry (in Japanese).
Iwanami Lectures on Applied Mathematics series. Iwanami Shoten, 1993.

17
S. Amari.
Information geometry of the EM and em algorithms for neural networks.
Neural Networks, Vol. 8, No. 9, pp. 1379-1408, 1995.

18
T. W. Anderson.
An Introduction to Multivariate Statistical Analysis.
John Wiley & Sons, 1984.

19
H. Asoh, S. Akaho, O. Hasegawa, T. Yoshimura, and S. Hayamizu.
Intermodal learning of multimodal interaction systems.
In Proc. of the Int. Workshop on Human Interface Technology (IWHIT '97), 1997.

20
H. Asoh and O. Takechi.
An approximation of nonlinear canonical correlation analysis by multilayer perceptrons.
In Artificial Neural Network (Proc. of ICANN'94), pp. 713-716, 1994.

21
H. Attias.
Independent factor analysis.
Neural Computation, Vol. 11, pp. 803-851, 1999.

22
N. Barkai, H. S. Seung, and H. Sompolinsky.
Scaling Laws in Learning of Classification Tasks.
Physical Review Letters, Vol. 70, pp. 3167-3170, 1993.

23
N. Barkai and H. Sompolinsky.
Statistical mechanics of the maximum-likelihood density estimation.
Physical Review E, Vol. 50, No. 3, pp. 1766-1769, 1994.

24
L. E. Baum, T. Petrie, G. Soules, and N. Weiss.
A maximization technique occurring in the statistical analysis of probabilistic functions of Markov chains.
Annals of Mathematical Statistics, Vol. 41, No. 1, pp. 164-171, 1970.

25
J. C. Bezdek.
A convergence theorem for the fuzzy ISODATA clustering algorithms.
IEEE Trans. on PAMI, Vol. 2, No. 1, pp. 1-8, 1980.

26
P. J. Bickel, C. A. J. Klaassen, Y. Ritov, and J. A. Wellner.
Efficient and Adaptive Estimation for Semiparametric Models.
Johns Hopkins Univ. Press, 1993.

27
C. M. Bishop.
Neural Networks for Pattern Recognition.
Oxford University Press, 1995.

28
I. Csiszár and G. Tusnády.
Information geometry and alternating minimization procedures.
In Statistics and Decisions, Supplement Issue No. 1, pp. 205-237. Munich: Oldenbourg Verlag, 1984.

29
D. Dacunha-Castelle and É. Gassiat.
Testing in locally conic models, and application to mixture models.
ESAIM: Probability and Statistics, Vol. 1, pp. 285-317, 1997.

30
A. P. Dempster, N. M. Laird, and D. B. Rubin.
Maximum likelihood from incomplete data via the EM algorithm.
J. of the Royal Statistical Society, Series B, Vol. 39, pp. 1-38, 1977.

31
B. S. Everitt and D. J. Hand.
Finite Mixture Distributions.
Chapman & Hall, 1981.

32
D. H. Fisher, M. J. Pazzani, and P. Langley, editors.
Concept Formation: Knowledge and Experience in Unsupervised Learning.
Morgan Kaufmann, 1991.

33
B. J. Frey.
Graphical Models for Machine Learning and Digital Communication.
MIT Press, 1998.

34
K. Fukumizu.
Special statistical properties of neural network learning.
In Proc. of Int. Sympo. on Nonlinear Theory and Its Applications (NOLTA'97), pp. 747-750, 1997.

35
W. R. Gilks, S. Richardson, and D. J. Spiegelhalter.
Markov Chain Monte Carlo in Practice.
Chapman & Hall, 1996.

36
K. Hagiwara, N. Toda, and S. Usui.
On the problem of applying AIC to determine the structure of a layered feed-forward neural network.
In Proc. of Int. Joint Conf. on Neural Networks (IJCNN'93), pp. 2263-2266, 1993.

37
O. Hasegawa, K. Itou, H. Asoh, S. Akaho, T. Akiba, T. Kurita, S. Hayamizu, K. Sakaue, Tanaka, and N. Otsu.
Analysis of user behavior in interaction with a computer (in Japanese).
IEICE Technical Report PRU95-57, 1995.

38
O. Hasegawa, K. Itou, T. Kurita, S. Hayamizu, K. Tanaka, K. Yamamoto, and N. Otsu.
Active agent oriented multimodal interface system.
In Proc. of Int. Joint Conf. on Artificial Intelligence (IJCAI '95), pp. 82-87, 1995.

39
P. Huber.
Robust Statistics.
John Wiley & Sons, 1981.

40
K. Ishii, N. Ueda, E. Maeda, and H. Murase.
Easy-to-Understand Pattern Recognition (in Japanese).
Ohmsha, 1998.

41
R. A. Jacobs and M. I. Jordan.
A competitive modular connectionist architecture.
In R. P. Lippmann et al., editors, Advances in Neural Information Processing Systems 3, pp. 767-773. Morgan Kaufmann, 1991.

42
M. I. Jordan, editor.
Learning in Graphical Models.
Kluwer Academic Publishers, 1998.

43
M. I. Jordan and R. A. Jacobs.
Hierarchical mixtures of experts and the EM algorithm.
Neural Computation, Vol. 6, pp. 181-214, 1994.

44
B. Kappen.
Using Boltzmann Machines for probability estimation: A general framework for neural network learning.
In S. Gielen et al., editors, Proc. of Int. Conf. on Artificial Neural Networks (ICANN'93), pp. 521-526, 1993.

45
B. Kappen.
Deterministic learning rules for Boltzmann Machines.
Neural Networks, Vol. 8, No. 4, pp. 537-548, 1995.

46
M. Kawato.
Computational Theory of the Brain (in Japanese).
Sangyo Tosho, 1996.

47
T. Kurita, N. Otsu, and T. Sato.
A face recognition method using higher order local autocorrelation and multivariate analysis.
In Proc. of 11th International Conf. on Pattern Recognition (ICPR), Vol. II, pp. 213-216, The Hague, 1992.

48
T. Kurokawa.
Nonverbal Interfaces (in Japanese).
Ohmsha, 1994.

49
E. Levin, N. Tishby, and S. A. Solla.
A statistical approach to learning and generalization in layered neural networks.
Proceedings of the IEEE, Vol. 78, No. 10, pp. 1568-1574, 1990.

50
G. J. McLachlan and K. E. Basford.
Mixture Models.
Marcel Dekker, 1987.

51
G. J. McLachlan and T. Krishnan.
The EM Algorithm and Extensions.
John Wiley & Sons, 1997.

52
X. L. Meng and D. B. Rubin.
Maximum likelihood estimation via the ECM algorithm: A general framework.
Biometrika, Vol. 80, pp. 267-278, 1993.

53
M. Miyakawa.
The EM algorithm and related topics (in Japanese).
Oyo Tokeigaku (Japanese J. of Applied Statistics), Vol. 16, No. 1, pp. 1-19, 1987.

54
M. Miyakawa.
Graphical Modeling (in Japanese).
Asakura Shoten, 1997.

55
J. Moody.
The effective number of parameters: an analysis of generalization and regularization in nonlinear learning systems.
In J. Moody et al., editors, Advances in Neural Information Processing Systems 4. Morgan Kaufmann, 1992.

56
Y. Motomura, S. Akaho, and H. Asoh.
Applications of Bayesian network learning to intelligent systems (in Japanese).
Keisoku to Seigyo (J. of the Society of Instrument and Control Engineers), pp. 468-473, 1999.

57
N. Murata, S. Yoshizawa, and S. Amari.
A criterion for determining the number of parameters in an artificial neural network model.
In T. Kohonen et al., editors, Artificial Neural Network (Proc. of ICANN'94), pp. 9-14. Elsevier, 1994.

58
N. Murata, S. Yoshizawa, and S. Amari.
Network information criterion -- determining the number of hidden units for an artificial neural network model.
IEEE Trans. on Neural Networks, Vol. 5, pp. 865-872, 1994.

59
中川聖一, 中西宏文, 古部好計, 板橋光義.
Concept acquisition based on the integration of audio-visual information (in Japanese).
J. of the Japanese Society for Artificial Intelligence, Vol. 8, No. 4, pp. 499-508, 1993.

60
中川聖一, 升方幹雄.
A system for acquiring concepts and grammar based on the integration of audio-visual information (in Japanese).
J. of the Japanese Society for Artificial Intelligence, Vol. 10, No. 4, pp. 619-627, 1995.

61
M. J. Nijman and H. J. Kappen.
Symmetry breaking and training from incomplete data with radial basis Boltzmann machines.
International J. of Neural Systems, Vol. 8, pp. 301-316, 1997.

62
H. Nishimori.
The interplay between statistical mechanics and knowledge information processing (in Japanese).
Suri Kagaku (Mathematical Sciences), Vol. 438, pp. 5-11, 1999.

63
N. Otsu.
Mathematical studies on feature extraction in pattern recognition (in Japanese).
Technical Report 818, Electrotechnical Laboratory, 1981.

64
N. Otsu.
The Real World Computing program -- toward flexible intelligence in the real world (in Japanese).
J. of the Japanese Society for Artificial Intelligence, Vol. 9, No. 3, pp. 358-364, 1994.

65
J. K. Patel and C. B. Read.
Handbook of the Normal Distribution, Second Edition.
Marcel Dekker, 1996.

66
T. Poggio and F. Girosi.
Networks for approximation and learning.
Proceedings of the IEEE, Vol. 78, No. 9, pp. 1481-1497, 1990.

67
L. Rabiner and B.-H. Juang.
Fundamentals of Speech Recognition.
PTR Prentice-Hall, 1995.
Japanese translation: S. Furui (trans.), Onsei Ninshiki no Kiso, NTT Advanced Technology, 1995.

68
C. R. Rao.
Linear Statistical Inference and Its Applications, 2nd edition.
John Wiley & Sons, 1973.
Japanese translation: Okuno et al. (trans.), Tokei-teki Suisoku to Sono Oyo, Tokyo Tosho, 1977.

69
R. A. Redner and H. F. Walker.
Mixture densities, maximum likelihood and the EM algorithm.
SIAM Review, Vol. 26, pp. 195-239, 1984.

70
M. Revow, C. Williams, and G. Hinton.
Using generative models for handwritten digit recognition.
IEEE Trans. on PAMI, Vol. 18, pp. 592-606, 1996.

71
S. Richardson and P. J. Green.
On Bayesian analysis of mixtures with an unknown number of components (with discussion).
J. of the Royal Statistical Society, Series B, Vol. 59, No. 4, pp. 731-792, 1997.

72
B. D. Ripley.
Pattern Recognition and Neural Networks.
Cambridge Univ. Press, 1995.

73
J. Rissanen.
Stochastic complexity and modeling.
Annals of Statistics, Vol. 14, pp. 1080-1100, 1986.

74
C. P. Robert.
The Bayesian Choice: A Decision-Theoretic Motivation.
Springer-Verlag, 1994.

75
K. Rose, E. Gurewitz, and G. Fox.
Statistical mechanics of phase transitions in clustering.
Physical Review Letters, Vol. 65, pp. 945-948, 1990.

76
D. Roy.
Integration of speech and vision using mutual information.
In Proc. of International Conference on Acoustics, Speech, and Signal Processing (ICASSP2000), 2000.

77
S. J. Russell and P. Norvig.
Artificial Intelligence: A Modern Approach.
Prentice-Hall, 1995.
Japanese translation supervised by K. Furukawa: Agent Approach -- Jinko Chino, Kyoritsu Shuppan, 1997.

78
Y. Sakamoto, M. Ishiguro, and G. Kitagawa.
Akaike Information Criterion Statistics (in Japanese).
Kyoritsu Shuppan, 1983.

79
A. J. C. Sharkey, editor.
Combining Artificial Neural Nets: Ensemble and Modular Multi-Net Systems.
Springer-Verlag, 1998.

80
H. Shimodaira.
A new criterion for selecting models from partially observed data.
In P. Cheeseman and R. W. Oldford, editors, Selecting Models from Data: AI and Statistics IV, chapter 3, pp. 21-29. Springer-Verlag, 1994.

81
S. Shinomoto.
Statistical Mechanics of Information (in Japanese).
Maruzen, 1992.

82
K. Takeuchi.
Distribution of information statistics and criteria for the adequacy of models (in Japanese).
Suri Kagaku (Mathematical Sciences), Vol. 153, pp. 12-18, 1976.

83
K. Takeuchi, editor.
Encyclopedia of Statistics (in Japanese).
Toyo Keizai Shinposha, 1989.

84
M. A. Tanner.
Tools for Statistical Inference: Methods for the Exploration of Posterior Distributions and Likelihood Functions.
Springer-Verlag, 1993.

85
R. A. Tapia and J. R. Thompson.
Nonparametric Probability Density Estimation.
Johns Hopkins Univ. Press, 1978.

86
D. M. Titterington, A. F. M. Smith, and U. E. Makov.
Statistical Analysis of Finite Mixture Distributions.
John Wiley & Sons, 1985.

87
L. G. Valiant.
A theory of the learnable.
Comm. ACM, Vol. 27, No. 11, pp. 1134-1142, 1984.

88
V. N. Vapnik.
Estimation of Dependences Based on Empirical Data.
Springer-Verlag, 1984.

89
M. Watanabe and K. Yamaguchi, editors.
The EM Algorithm and Problems of Incomplete Data (in Japanese).
Taga Shuppan, 2000.

90
S. Watanabe.
Algebraic analysis for singular statistical estimation.
In Lecture Notes in Computer Science, Vol. 1720, pp. 39-50. Springer-Verlag, 1999.

91
C. F. J. Wu.
On the convergence properties of the EM algorithm.
Annals of Statistics, Vol. 11, pp. 95-103, 1983.

92
H. Yanai and H. Takagi.
Handbook of Multivariate Analysis (in Japanese).
Gendai Sugakusha, 1986.

93
I. Ziskind and M. Wax.
Maximum likelihood localization of multiple sources by alternating projection.
IEEE Trans. on Acoustics, Speech, and Signal Processing, Vol. 36, No. 10, pp. 1553-1560, 1988.



Shotaro Akaho, July 22, 2003