Monday 16 January 2017

Inferring Trading Strategies from Probability Distribution Functions

John Ehlers TECHNICAL PAPERS John Ehlers, the developer of MESA, has written and published numerous articles on the principles underlying market cycles. Synopses of the available articles appear below. Download each one by selecting its associated hyperlink.

Why Traders Lose Money (and What to Do About It) An article in the May 2014 issue of Stocks & Commodities magazine described how to create artificial equity curves knowing only the profit factor and percentage of winners of a trading strategy (a minimal simulation sketch appears after these synopses). Bell-curve statistics for randomly selected stock trading and for portfolio trading are also included. An Excel spreadsheet lets you compute these statistical descriptors of trading system performance.

Predictive Indicators for Effective Trading Strategies Technical traders understand that indicators must smooth market data to be useful, and that smoothing introduces lag as an undesirable side effect. We also know that the market is fractal: a weekly-interval chart looks like a monthly, daily, or intraday chart. What may not be as obvious is that as the time interval along the x-axis increases, the high-to-low price swings along the y-axis also increase roughly in proportion. This phenomenon of spectral dilation causes unwanted distortion that has gone unrecognized, or been largely ignored, by indicator developers and market technicians.

Inferring Trading Strategies from Measured Probability Density Functions This was the second-place winner of the MTA 2008 Charles H. Dow Award. In this article I show the implications of different kinds of detrending, and how the resulting probability distributions can be used as strategies to generate effective trading systems. The results of these robust trading systems are compared with standard approaches.

This paper shows an interactive way to remove as much lag as desired from smoothing filters. Of course, the reduced lag comes at the price of decreased filter smoothness. The filter exhibits none of the transient overshoot frequently seen in higher-order filters.

Empirical Mode Decomposition A new approach to cycle detection and trend-mode identification.

Fourier Transform for Traders The problem with Fourier transforms for measuring market cycles is that they have very poor resolution. In this article I show how to use another, nonlinear transform to sharpen the resolution so that Fourier transforms become usable. The measured spectrum is displayed as a heat map.

Swiss Army Knife Indicators Indicators are simply transfer responses of input data. Through a simple change of constants, this one indicator can become an EMA, an SMA, a 2-pole Gaussian low-pass filter, a 2-pole Butterworth low-pass filter, a smoothing filter, a band-pass filter, or a band-stop filter.

Ehlers Filter An unusual nonlinear FIR filter is described. This filter is among the most responsive to price changes while remaining smoother in sideways markets.
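The equity-curve idea in "Why Traders Lose Money" can be reproduced in a few lines of code. The sketch below is not Ehlers' Excel spreadsheet; it is a minimal Python analogue that assumes an average loss of one unit per losing trade, so the profit factor and the fraction of winners together pin down the average win.

```python
# Monte Carlo equity curves from only a profit factor and percent winners.
# Assumption of this sketch: average loss = 1 unit, so the profit factor
# PF = (p * avg_win) / ((1 - p) * 1) forces avg_win = PF * (1 - p) / p.
import numpy as np

def simulate_equity(profit_factor, pct_winners, n_trades=250, n_paths=1000, seed=0):
    rng = np.random.default_rng(seed)
    avg_win = profit_factor * (1.0 - pct_winners) / pct_winners
    wins = rng.random((n_paths, n_trades)) < pct_winners
    trades = np.where(wins, avg_win, -1.0)   # win avg_win units or lose 1 unit
    return trades.cumsum(axis=1)             # one artificial equity curve per row

curves = simulate_equity(profit_factor=1.5, pct_winners=0.40)
print("median final equity:", np.median(curves[:, -1]))
print("share of losing paths:", (curves[:, -1] < 0).mean())
```

Random trade sequences sharing the same (profit factor, percent winners) pair can look strikingly different path by path, which is the point of the exercise.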
System Performance Evaluation The profit factor (gross winnings divided by gross losses) is analogous to the payout ratio in gambling. Thus, when the profit factor is combined with the percentage of winners in a sequence of random events, the equity growth of a trading strategy can be simulated. This article describes how common performance descriptors are related to these two parameters. An Excel spreadsheet is described that lets you perform a Monte Carlo analysis of your trading systems if you know these two (out-of-sample) parameters.

FRAMA (FRactal Adaptive Moving Average) A nonlinear moving average is derived using the Hurst exponent.

MAMA is the Mother of All Adaptive Moving Averages. Actually, the name is an acronym for MESA Adaptive Moving Average. The nonlinear action of this filter is produced by phase feedback every half cycle. When combined with FAMA, a following adaptive moving average, the crossovers form excellent entry and exit signals that are relatively free of whipsaws.

Time Warp Without Space Travel Laguerre polynomials are used to generate a filter structure similar to a simple moving average, with the difference that the time spacing between filter taps is nonlinear. The result enables the creation of very short filters having the smoothing characteristics of much longer filters. Shorter filters mean less lag. The advantages of using Laguerre polynomials in filters are demonstrated in both indicators and automatic trading systems. The article includes EasyLanguage code.

The CG Oscillator The CG Oscillator is unique because it is an oscillator that is both smoothed and has zero lag. It finds the center of gravity (CG) of the price values within an FIR filter. The CG automatically has the smoothing of the FIR filter (similar to a simple moving average), with the position of the CG being exactly in phase with the price movement. EasyLanguage code is included.

Using the Fisher Transform Many trading systems are designed on the assumption that prices have a normal, or Gaussian, probability distribution about the mean. In fact, nothing could be further from the truth. This article describes how the Fisher Transform converts data to have a nearly normal probability distribution. Since the probability distribution is normal after the Fisher Transform is applied, the data can be used to create entry points with surgical precision (a bare-bones sketch appears at the end of these synopses). The article includes EasyLanguage code.

The Inverse Fisher Transform The Inverse Fisher Transform can be used to generate an oscillator that switches quickly between oversold and overbought without whipsaws.

Gaussian Filters Lag is the downfall of smoothing filters. This article shows how lag can be reduced, and higher-fidelity smoothing obtained, by reducing the lag of the high-frequency components in the data. A complete table of Gaussian filter coefficients is provided.
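For concreteness, here is a bare-bones Python rendering of the Fisher Transform idea summarized above. Ehlers' published version is in EasyLanguage and smooths the intermediate series, so treat this as a simplified sketch; the 10-bar window is an arbitrary choice.

```python
# Simplified Fisher Transform: scale prices into (-1, 1) over a rolling
# channel, then apply y = 0.5 * ln((1 + x) / (1 - x)), which stretches
# the tails so the output is approximately Gaussian.
import numpy as np

def fisher_transform(prices, window=10):
    prices = np.asarray(prices, dtype=float)
    out = np.full(prices.shape, np.nan)
    for t in range(window - 1, len(prices)):
        chunk = prices[t - window + 1 : t + 1]
        lo, hi = chunk.min(), chunk.max()
        # scale into (-1, 1); clip to avoid the log singularities at the ends
        x = 0.0 if hi == lo else 2.0 * (prices[t] - lo) / (hi - lo) - 1.0
        x = np.clip(x, -0.999, 0.999)
        out[t] = 0.5 * np.log((1.0 + x) / (1.0 - x))
    return out

prices = 100 + np.cumsum(np.random.default_rng(1).normal(size=500))
print(fisher_transform(prices)[-5:])
```

Because the transform stretches values near the channel edges into long tails, extreme readings become rarer and sharper, which is what makes threshold-based entries better behaved.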
Poles and Zeros Digital filters are described in terms of Z transforms. The ramifications of higher-order filters are described. Coefficient tables for 2-pole filters and 2-pole Butterworth filters are given.

Professor John Geweke Internationally renowned econometrician John Geweke came to UTS as a Distinguished Research Professor in the Business School in 2009. Professor Geweke is distinguished for his contributions to econometric theory in time series analysis and Bayesian modelling, together with applications in macroeconomics, finance and microeconomics. He is a Fellow of the Econometric Society and of the American Statistical Association. He has been co-editor of the Journal of Econometrics and the Journal of Applied Econometrics, and editor of the Journal of Business and Economic Statistics. His most recent book is Complete and Incomplete Econometric Models, published by Princeton University Press in January 2010. He currently leads the sponsored project Massively Parallel Algorithms for Bayesian Inference and Decision-Making, with six researchers.

Awards and recognition: Fellow of the Econometric Society, since 1982; Fellow of the American Statistical Association, since 1990; Alfred P. Sloan Research Fellow, 1982-1984; H. I. Romnes Faculty Fellow, University of Wisconsin, 1982-1983; Dayton-Hudson Fellowship, 1970-1974; National Merit Scholar, 1966-1970; Member, Phi Beta Kappa and Phi Kappa Phi; listed in Marquis Who's Who in America and similar publications.

Previous academic positions: Harlan McGregor Chair in Economic Theory and Professor of Economics and Statistics, University of Iowa, 1999-2009; Professor of Economics, University of Minnesota, 1990-2001; Director, Institute of Statistics and Decision Sciences, Duke University, 1987-1990; Professor of Statistics and Decision Sciences, Duke University, 1987-1990; William R. Kenan, Jr. Professor of Economics, Duke University, 1986-1990; Professor of Economics, Duke University, 1983-1986; Visiting Professor of Economics, Carnegie-Mellon University, 1982-1983; Visiting Professor of Statistics, Carnegie-Mellon University, 1982-1983; Professor of Economics, University of Wisconsin-Madison, 1982-1983; Associate Professor of Economics, University of Wisconsin-Madison, 1979-1982; Visiting Professor, University of Warwick, 1979; Assistant Professor of Economics, University of Wisconsin-Madison, 1975-1979; Distinguished Visiting Professor, Economics Discipline Group; Associate Member, AAI - Advanced Analytics Institute. Doctor of Philosophy.

Books

Geweke, J., Koop, G. & Van Dijk, H. 2012, The Oxford Handbook of Bayesian Econometrics. © Oxford University Press 2011. All rights reserved. Bayesian econometric methods have enjoyed an increase in popularity in recent years. Econometricians, empirical economists, and policymakers are increasingly making use of Bayesian methods. The Oxford Handbook of Bayesian Econometrics is a single source on Bayesian methods in specialized fields. It contains articles by leading Bayesians on the latest developments in their specific fields of expertise. The volume provides broad coverage of the application of Bayesian econometrics in the major fields of economics and related disciplines, including macroeconomics, microeconomics, finance, and marketing.
It examines the state of the art of Bayesian econometric methodology, with articles on posterior simulation and Markov chain Monte Carlo methods, Bayesian nonparametric techniques, and the specialized tools used by Bayesian time series econometricians, such as state space models and particle filtering. It also includes articles on Bayesian principles and methodology.

Geweke, J. 2010, Complete and Incomplete Econometric Models. 1, Princeton University Press, Princeton, USA. Econometric models are widely used in the creation and evaluation of economic policy in the public and private sectors. But these models are useful only if they adequately account for the phenomena in question, and they can be quite misleading if they do not. In response, econometricians have developed tests and other checks of model adequacy. All of these methods, however, take as given the specification of the model to be tested. In this book, John Geweke addresses the critical earlier stage of model development, the point at which potential models are inherently incomplete. Summarizing and extending recent advances in Bayesian econometrics, Geweke shows how simple modern simulation methods can complement the creative process of model formulation. These methods, accessible to economics PhD students as well as to practicing applied econometricians, streamline the processes of model development and specification checking. Complete with illustrations from a wide variety of applications, this is an important contribution to econometrics that will interest economists and PhD students alike.

Geweke, J. 2005, Contemporary Bayesian Econometrics and Statistics. 1, John Wiley & Sons Inc, USA. From the back cover: This publication provides readers with a thorough understanding of Bayesian analysis that is grounded in the theory of inference and optimal decision making. Contemporary Bayesian Econometrics and Statistics provides readers with state-of-the-art simulation methods and models that are used to solve complex real-world problems. Armed with a strong foundation in both theory and practical problem-solving tools, readers discover how to optimize decision making when faced with problems that involve limited or imperfect data. The book begins by examining the theoretical and mathematical foundations of Bayesian statistics, to help readers understand how and why it is used in problem solving. The author then describes how modern simulation methods make Bayesian approaches practical using widely available mathematical applications software. In addition, the author details how models can be applied to specific problems, including linear models and policy choices, modeling with latent variables and missing data, time series models and forecasting, and model comparison and evaluation. The publication has been developed and refined through a decade of classroom experience, and readers will find the author's approach very engaging and accessible.
There are nearly 200 examples and exercises to help readers see how effective use of Bayesian statistics enables them to make optimal decisions. Matlab and S-plus computer programs are integrated throughout the book. An accompanying website provides readers with data sets and computer code for many of the examples. This publication is aimed at research professionals who use econometrics and similar statistical methods in their work. With its emphasis on practical problem solving and extensive use of examples and exercises, it is also an excellent textbook for graduate students in a wide range of fields, including economics, statistics, the social sciences, business, and public policy.

Geweke, J., Bonnen, J., Koshel, J. & White, A. 1999, Sowing Seeds: Informing Public Policy in the Economic Research Service of USDA. National Academy Press, Washington.

Geweke, J., Berry, D. & Chaloner, K. 1996, Bayesian Statistics and Econometrics: Essays in Honor of Arnold Zellner. Wiley, New York.

Geweke, J., Caines, P., Parzen, M. & Taqqu, M. 1993, New Directions in Time Series Analysis, Parts I and II. Springer-Verlag, New York.

Geweke, J. 1992, Decision Making under Risk and Uncertainty: New Models and Empirical Findings. Kluwer Academic Publishers, Dordrecht.

Geweke, J., Barnett, W. & Shell, K. 1989, Economic Complexity: Chaos, Sunspots, Bubbles, and Nonlinearity. Cambridge University Press, Cambridge.

Geweke, J. 1985, Inferring Household Demand for Durable Goods with Heterogeneous Preferences: A Case Study.

Geweke, J. 1984, Chapter 19: Inference and Causality in Economic Time Series Models.

Chapters

Geweke, J., Bayesian inference for logistic regression models using sequential posterior simulation, in Upadhyay, S., Singh, U., Dey, D. & Loganathan, A. (eds), Current Trends in Bayesian Methodology with Applications. CRC Press, USA, pp. 290-310.

Durham, G. & Geweke, J., Adaptive Sequential Posterior Simulators for Massively Parallel Computing Environments (Advances in Econometrics). Emerald Group Publishing Limited, USA, pp. 1-44. Massively parallel desktop computing capabilities, now well within the reach of individual academics, modify the environment for posterior simulation in fundamental and potentially quite advantageous ways. But to fully exploit these benefits, algorithms that conform to parallel computing environments are needed. This paper presents a sequential posterior simulator designed to operate efficiently in this context. The simulator makes fewer analytical and programming demands on investigators, and is faster, more reliable, and more complete than conventional posterior simulators. The paper extends existing sequential Monte Carlo methods and theory to provide a thorough and practical foundation for sequential posterior simulation that is well suited to massively parallel computing environments.
It provides detailed recommendations on implementation, yielding an algorithm that requires only code for simulation from the prior and evaluation of prior and data densities, and that works well in a variety of applications representative of serious empirical work in economics and finance. The algorithm facilitates Bayesian model comparison by producing marginal likelihood approximations of unprecedented accuracy as an incidental by-product, is robust to pathological posterior distributions, and provides estimates of numerical standard error and relative numerical efficiency intrinsically. The paper concludes with an application that illustrates the potential of these simulators for applied Bayesian inference.

Geweke, J., Koop, G. & van Dijk, H. 2011, Introduction, in Geweke, J., Koop, G. & van Dijk, H. (eds), The Oxford Handbook of Bayesian Econometrics. Oxford University Press, Oxford, pp. 1-8. Bayesian econometric methods have enjoyed an increase in popularity in recent years. Econometricians, empirical economists, and policymakers are increasingly making use of Bayesian methods. This handbook is a single source for researchers and policymakers wanting to learn about Bayesian methods in specialized fields, and for graduate students seeking to move from textbook learning to the research frontier. It contains contributions from leading Bayesians on the latest developments in their specific fields of expertise. The volume provides broad coverage of the application of Bayesian econometrics in the major fields of economics and related disciplines, including macroeconomics, microeconomics, finance, and marketing. It examines the state of the art of Bayesian econometric methodology, with chapters on posterior simulation and Markov chain Monte Carlo methods, Bayesian nonparametric techniques, and the specialized tools used by Bayesian time series econometricians, such as state space models and particle filtering. It also includes chapters on Bayesian principles and methodology.

Geweke, J. 2009, The SETAR Model of Tong and Lim and Advances in Computation, in Chan, K. S. (ed), Exploration of a Nonlinear World: An Appreciation of Howell Tong's Contributions to Statistics. World Scientific, Singapore, pp. 85-94. This discussion revisits Tong and Lim's seminal 1980 paper on the SETAR model in the context of advances in computation since that time. Using the Canadian lynx data set from that paper, it compares exact maximum likelihood estimates with those in the original paper. It illustrates the application of Bayesian MCMC methods, developed in the intervening years, to this model and data set. It shows that SETAR is a limiting case of mixture-of-experts models and investigates the application of a variant of these models to the lynx data set. The application is successful, despite the small size of the data set and the complexity of the model. Predictive likelihood ratios favor Tong and Lim's original model.

Geweke, J., Horowitz, J. L. & Pesaran, H. 2008, Econometrics, in Durlauf, S. N. & Blume, L. E. (eds), The New Palgrave Dictionary of Economics Online. Palgrave Macmillan, pp. 1-32.
As a unified discipline, econometrics is still relatively young and has been transforming and expanding very rapidly. Major advances have taken place in the analysis of cross-sectional data by means of semiparametric and nonparametric techniques. Heterogeneity of economic relations across individuals, firms, and industries is increasingly acknowledged, and attempts have been made to take it into account either by integrating out its effects or by modeling the sources of heterogeneity when suitable panel data exist. The counterfactual considerations that underlie policy analysis and treatment evaluation have been given a more satisfactory foundation. New time series econometric techniques have been developed and employed extensively in the areas of macroeconometrics and finance. Nonlinear econometric techniques are used increasingly in the analysis of cross-section and time series observations. Applications of Bayesian techniques to econometric problems have been promoted largely by advances in computer power and computational techniques. The use of Bayesian techniques has, in turn, provided investigators with a unifying framework in which the tasks of forecasting, decision making, model evaluation, and learning can be considered as parts of the same interactive and iterative process, thereby providing a basis for a more unified econometrics.

Keane, M. & Geweke, J. 2006, Bayesian Cross-Sectional Analysis of the Conditional Distribution of Earnings of Men in the United States, 1967-1996, in Upadhyay, S. K., Singh, U. & Dey, D. K. (eds), Bayesian Statistics and Its Applications. Anshan Ltd, New Delhi, pp. 160-197.

Geweke, J. & Whiteman, C. 2006, Bayesian forecasting, in Elliott, G., Granger, C. W. J. & Timmermann, A. (eds), Handbook of Economic Forecasting. Elsevier, Netherlands, pp. 3-80. Bayesian forecasting is a natural product of a Bayesian approach to inference. The Bayesian approach in general requires explicit formulation of a model, and conditioning on known quantities, in order to draw inferences about unknown ones. In Bayesian forecasting, one simply takes a subset of the unknown quantities to be future values of some variables of interest (the sketch below illustrates this with a toy model). This chapter presents the principles of Bayesian forecasting and describes recent advances in computational capabilities for applying them that have dramatically expanded the scope of applicability of the Bayesian approach. It describes historical developments and the analytic compromises that were necessary prior to recent developments, the application of the new procedures in a variety of examples, and reports on two long-term Bayesian forecasting exercises.

Geweke, J., Houser, D. E. & Keane, M. 2001, Simulation-based inference for dynamic multinomial choice models, in A Companion to Theoretical Econometrics. Blackwell Publishing, USA, pp. 466-493.
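As a toy illustration of that last point (future values treated as just more unknowns), the sketch below computes a posterior predictive distribution by simulation in a conjugate normal model with known variance. The model and the numbers are assumptions of this sketch, not anything from the chapter.

```python
# Bayesian forecasting as posterior simulation: draw the unknown parameter
# from its posterior, then draw future data given that draw.  The result is
# the posterior predictive distribution of the next observation.
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(1.0, 2.0, size=50)           # observed sample
sigma = 2.0                                 # known data standard deviation
mu0, tau0 = 0.0, 10.0                       # N(mu0, tau0^2) prior on the mean

# Conjugate posterior for the mean: N(mu_n, tau_n^2)
prec = 1 / tau0**2 + len(y) / sigma**2
mu_n = (mu0 / tau0**2 + y.sum() / sigma**2) / prec
tau_n = prec ** -0.5

# Posterior predictive for the next observation, by simulation
mu_draws = rng.normal(mu_n, tau_n, size=100_000)
y_future = rng.normal(mu_draws, sigma)
print("predictive mean:", y_future.mean())
print("90% predictive interval:", np.percentile(y_future, [5, 95]))
```

Parameter uncertainty enters automatically: the predictive spread exceeds sigma because each simulated future value carries its own draw of the mean.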
Geweke, J. & Keane, M. 2001, Computationally Intensive Methods for Integration in Econometrics, in Handbook of Econometrics. Elsevier Inc, North-Holland, pp. 3463-3568.

Geweke, J. 2001, Incorporating Bayesian tools into mathematical software, in George, E. I. (ed), Bayesian Methods with Applications to Science, Policy and Official Statistics. Eurostat, Brussels, pp. 165-174. The BACC software provides its users with tools for Bayesian analysis, computation, and communication. These tools are integrated into mathematical software applications such as Matlab and Gauss. From the user's perspective, there is seamless integration of the special BACC commands with the host application's powerful built-in commands. Several models are currently available, and BACC is designed to be extensible. We give a brief demonstration of the use of BACC for Matlab and discuss the implementation of new models in BACC.

Geweke, J. & Keane, M. 2001, Chapter 56: Computationally intensive methods for integration in econometrics, pp. 3463-3568. Until recently, inference in many interesting models was precluded by the requirement of high-dimensional integration. But dramatic increases in computer speed, and the recent development of new algorithms that permit accurate Monte Carlo evaluation of high-dimensional integrals, have greatly expanded the range of models that can be considered. This chapter presents the methodology for several of the most important Monte Carlo methods, supplemented by a set of concrete examples that show how the methods are used. Some of these examples are new to the econometric literature. They include inference in multinomial discrete choice models and selection models in which the standard normality assumption is relaxed in favor of a multivariate mixture-of-normals assumption. Several Monte Carlo experiments indicate that these methods succeed in identifying departures from normality when they are present. Throughout the chapter, the emphasis is on inference in parametric models that allow rich variation in the distribution of disturbances. The chapter first describes Monte Carlo methods for the evaluation of high-dimensional integrals, including integral simulators such as the GHK method (sketched below), and Markov chain Monte Carlo methods such as Gibbs sampling and the Metropolis-Hastings algorithm. It then turns to methods for approximating solutions to discrete-choice dynamic optimization problems, including the methods developed by Keane and Wolpin and by Rust, as well as methods for circumventing the integration problem entirely, such as the approach of Geweke and Keane. The remainder of the chapter deals with specific examples: classical simulation estimation of multinomial probit models in cross-sectional and panel data settings, univariate and multivariate latent linear models, and Bayesian inference in dynamic discrete choice models in which the future component of the value function is replaced by a flexible polynomial. © 2001 Elsevier Inc. All rights reserved.
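The GHK simulator mentioned in that chapter summary admits a compact implementation. The sketch below estimates P(a < x < b) for x ~ N(0, Σ) by recursive sampling of truncated normals along the Cholesky factor; it is a textbook version, not the authors' code.

```python
# GHK simulator for multivariate normal rectangle probabilities.
import numpy as np
from scipy.stats import norm

def ghk(a, b, Sigma, n_draws=10_000, seed=0):
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(Sigma)
    m = len(a)
    eta = np.zeros((n_draws, m))
    weight = np.ones(n_draws)
    for j in range(m):
        drift = eta[:, :j] @ L[j, :j]               # effect of earlier components
        lo = norm.cdf((a[j] - drift) / L[j, j])
        hi = norm.cdf((b[j] - drift) / L[j, j])
        u = lo + (hi - lo) * rng.random(n_draws)    # uniform on the truncated range
        eta[:, j] = norm.ppf(u)                     # truncated-normal draw
        weight *= hi - lo                           # probability mass of segment j
    return weight.mean()

Sigma = np.array([[1.0, 0.5], [0.5, 1.0]])
print(ghk(np.array([0.0, 0.0]), np.array([np.inf, np.inf]), Sigma))
# exact orthant probability here: 1/4 + arcsin(0.5) / (2*pi) = 0.3333...
```

Each replication costs one pass through the dimensions, and the estimator is smooth in the parameters, which is why GHK became the workhorse for multinomial probit likelihoods.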
Geweke, J. 2000, Simulation-based Bayesian inference for economic time series, in Mariano, R., Schuermann, T. & Weeks, M. J. (eds), Simulation-Based Inference in Econometrics: Methods and Applications. Cambridge University Press, Cambridge, pp. 255-299.

Geweke, J. & Keane, M. 2000, Bayesian inference for dynamic discrete choice models without the need for dynamic programming, in Mariano, Schuermann & Weeks (eds), Simulation-Based Inference in Econometrics: Methods and Applications. Cambridge University Press, UK, pp. 100-131.

Geweke, J. 1999, Simulation Methods for Model Criticism and Robustness Analysis, in Berger, J. O., Bernardo, J. M., Dawid, A. P. & Smith, A. F. M. (eds), Bayesian Statistics 6. Oxford University Press, Oxford, UK, pp. 275-299. Abstract: This paper sets out and develops Bayesian methods for model criticism and robustness analysis. The objectives are to clarify the Bayesian interpretation of non-Bayesian diagnostic tests and to provide explicitly Bayesian procedures accessible to practicing researchers. Specific methods for prior density criticism and robustness analysis, and for data density criticism, are presented. All are based on the approximation of appropriate Bayes factors, and avoid the need for posterior simulation under alternative model specifications. A general method of data density criticism is developed that requires neither posterior simulation nor analytical approximations under any model specification. Some of the methods presented here have been implemented in user-oriented software. The paper presents some simple illustrations of the methods.

Geweke, J. & Keane, M. 1999, Mixture of Normals Probit Models, in Hsiao, C., Lahiri, K., Lee, L. F. & Pesaran, M. H. (eds), Analysis of Panels and Limited Dependent Variable Models: A Volume in Honour of G. S. Maddala. Cambridge University Press, Cambridge, pp. 49-78. Abstract: This paper generalizes the normal probit model of dichotomous choice by introducing mixtures of normal distributions for the disturbance term. By mixing on both the mean and variance parameters and increasing the number of distributions in the mixture, these models effectively remove the normality assumption and are much closer to semiparametric models (a toy sketch follows below). When a Bayesian approach is taken, there is exact finite-sample distribution theory for the choice probability conditional on the covariates. The paper uses artificial data to show how posterior odds ratios can discriminate between normal and non-normal distributions in probit models. The method is also applied to female labor force participation decisions in a sample of 1,555 observations from the PSID. In this application, Bayes factors strongly favor mixture of normals probit models over the conventional probit model, and the most favored models have mixtures of four normal distributions for the disturbance term.

Geweke, J. 1999, Some experiments in constructing a hybrid model for macroeconomic analysis: a comment, in McCallum, B. (ed), Carnegie-Rochester Conference Series on Public Policy. Elsevier, pp. 143-147.
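The mixture-of-normals probit generalization has a convenient closed form for the choice probability, which the toy sketch below evaluates. The two-component parameter values are invented for illustration, and are chosen so the mixture has mean zero, as identification requires.

```python
# Choice probability when the latent disturbance is a discrete mixture of
# normals: P(y = 1 | x) = sum_k w_k * Phi((x'beta + mu_k) / sigma_k).
# One component with mu = 0, sigma = 1 recovers the ordinary probit.
import numpy as np
from scipy.stats import norm

def mixture_probit_prob(index, weights, means, sds):
    weights, means, sds = map(np.asarray, (weights, means, sds))
    return float(np.sum(weights * norm.cdf((index + means) / sds)))

xb = 0.3                                   # linear index x'beta
print("standard probit :", mixture_probit_prob(xb, [1.0], [0.0], [1.0]))
# Skewed, fat-tailed disturbance with mean zero (0.8*0.2 + 0.2*(-0.8) = 0):
print("2-component mix :", mixture_probit_prob(xb, [0.8, 0.2], [0.2, -0.8], [0.7, 2.0]))
```

The same index produces different choice probabilities under the two disturbance laws, which is exactly the flexibility the posterior odds ratios in the paper are asked to detect.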
Geweke, J. 1997, Posterior Simulators in Econometrics, in Kreps, D. M. & Wallis, K. F. (eds), Advances in Economics and Econometrics: Theory and Applications. Cambridge University Press, Cambridge, pp. 128-165. Abstract: The development of posterior simulators over the past decade has prompted revision of beliefs about the three foregoing propositions, which had long been closely held by many econometricians. The purpose of this paper is to convey these innovations, and their significance for applied econometrics, to econometricians who have not followed the relevant mathematical and applied literature. There are four substantive sections. One section reviews aspects of Bayesian inference essential to understanding the implications of posterior simulators for Bayesian econometrics. Another section describes these simulators and provides the essential convergence results. The implications of these procedures for some selected econometric models are set out in a third section. This is done to indicate the range of tasks for which posterior simulators are well suited, rather than to provide a representative survey of the recent Bayesian econometrics literature. Finally, the paper turns to some implications for model comparison, and for communication between those doing applied work and their audiences, that are beginning to emerge from the use of posterior simulators in Bayesian econometrics.

Geweke, J. 1996, Bayesian Inference for Linear Models Subject to Linear Inequality Constraints, in Modelling and Prediction: Honoring Seymour Geisser. Springer-Verlag, New York, pp. 248-263. Abstract: The normal linear model, with sign or other linear inequality constraints on its coefficients, arises very commonly in many scientific applications. Given the inequality constraints, Bayesian inference is much simpler than classical inference, but standard Bayesian computational methods become impractical when the posterior probability of the inequality constraints (under a diffuse prior) is small. This paper shows how the Gibbs sampling algorithm can provide an attractive alternative for inference subject to linear inequality constraints in this situation (a sketch follows below), and how the GHK probability simulator can be used to evaluate the posterior probability of the constraints.

Geweke, J. 1996, Monte Carlo Simulation and Numerical Integration, in Amman, H. M., Kendrick, D. A. & Rust, J. (eds), Handbook of Computational Economics. Elsevier, pp. 731-800. Abstract: This is a survey of simulation methods in economics, with a specific focus on integration problems. It describes acceptance methods, importance sampling procedures, and Markov chain Monte Carlo methods for simulation from univariate and multivariate distributions, and their application to the approximation of integrals. The exposition emphasizes combinations of the different approaches and assessment of the accuracy of numerical approximations to integrals and expectations. The survey illustrates these procedures with applications to simulation and integration problems in economics.

Geweke, J. 1996, Variable Selection and Model Comparison in Regression, in Bernardo, J. M., Berger, J. O., Dawid, A. P. & Smith, A. F. M. (eds), Bayesian Statistics 5. Oxford University Press, Oxford, pp. 609-620. Abstract: In the specification of linear regression models it is common to indicate a list of candidate variables from which a subset enters the model with nonzero coefficients. This paper interprets this specification as a mixed continuous-discrete prior distribution for coefficient values. It then uses a Gibbs sampler to construct posterior moments. It is shown how this method can incorporate sign constraints and provide posterior probabilities for all possible subsets of regressors. The methods are illustrated using some standard data sets.
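A minimal version of the Gibbs sampler described in those abstracts, for the special case of sign constraints with known variance and a diffuse prior, looks as follows. The prior, the constraint region, and the demo data are assumptions of this sketch.

```python
# Gibbs sampling for a normal linear model with sign constraints beta_j >= 0.
# Unconstrained posterior (diffuse prior, known sigma^2): N(bhat, sigma^2 (X'X)^-1).
# Each full conditional is a univariate normal truncated to [0, inf),
# sampled by the inverse-CDF method.
import numpy as np
from scipy.stats import norm

def gibbs_sign_constrained(X, y, sigma2=1.0, n_iter=5000, seed=0):
    rng = np.random.default_rng(seed)
    P = X.T @ X / sigma2                        # posterior precision matrix
    m = np.linalg.solve(P, X.T @ y / sigma2)    # unconstrained posterior mean
    k = X.shape[1]
    beta = np.abs(m.copy())                     # feasible starting point
    draws = np.empty((n_iter, k))
    for it in range(n_iter):
        for j in range(k):
            cvar = 1.0 / P[j, j]
            # conditional mean of beta_j given the other components
            cmean = m[j] - cvar * (P[j] @ (beta - m) - P[j, j] * (beta[j] - m[j]))
            s = np.sqrt(cvar)
            lo = norm.cdf((0.0 - cmean) / s)    # CDF mass below the constraint
            u = lo + (1.0 - lo) * rng.random()
            beta[j] = cmean + s * norm.ppf(u)
        draws[it] = beta
    return draws

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))
y = X @ np.array([0.05, 1.0]) + rng.normal(size=100)
print(gibbs_sign_constrained(X, y).mean(axis=0))   # posterior means, both >= 0
```

When a coefficient's unconstrained posterior sits near the boundary (the first one here), the constrained posterior piles up against zero rather than crossing it, which rejection sampling would handle very inefficiently.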
Geweke, J. 1996, Variable Selection Tests of Asset Pricing Models, in Gatsonis, C., Hodges, J. S., Kass, R. E., McCulloch, R. E., Rossi, P. & Singpurwalla, N. D. (eds), Case Studies in Bayesian Statistics (Volume 3). Springer-Verlag, New York.

Geweke, J. 1996, Chapter 15: Monte Carlo simulation and numerical integration, pp. 731-800.

Geweke, J. 1993, Inference and Forecasting for Chaotic Nonlinear Time Series, in Day, R. H. & Chen, P. (eds), Nonlinear Dynamics and Evolutionary Economics. Oxford University Press, Oxford.

Geweke, J. 1993, A Dynamic Index Model for Large Cross Sections, in Stock, J. & Watson, M. (eds), New Research on Business Cycles, Indicators and Forecasting. National Bureau of Economic Research, New York.

Geweke, J. 1992, Evaluating the Accuracy of Sampling-Based Approaches to the Calculation of Posterior Moments, in Bernardo, J. M., Berger, J. O., Dawid, A. P. & Smith, A. F. M. (eds), Bayesian Statistics 4. Oxford University Press, Oxford, pp. 169-194. Abstract: Data augmentation and Gibbs sampling are two closely related, sampling-based approaches to the calculation of posterior moments. The fact that each produces a sample whose constituents are neither independent nor identically distributed complicates the assessment of convergence and of the numerical accuracy of the approximations to the expected value of functions of interest under the posterior. In this paper, methods from spectral analysis are used to evaluate numerical accuracy formally and to construct diagnostics for convergence (a simple version of the accuracy calculation is sketched below). These methods are illustrated in the normal linear model with informative priors, and in the Tobit censored regression model.

Geweke, J., Barnett, W. & Wolfe, M. 1991, Seminonparametric Bayesian estimation of consumer and factor demand functions, in Barnett, W. A., Cornet, B., d'Aspremont, C., Gabszewicz, J. & Mas-Colell, A. (eds), Equilibrium Theory and Applications. Cambridge University Press, Cambridge, pp. 425-480.

Geweke, J., Barnett, W. & Yue, P. 1991, Seminonparametric Bayesian Estimation of the Asymptotically Ideal Model: The AIM Demand System, in Barnett, W. A., Powell, J. & Tauchen, G. E. (eds), Nonparametric and Semiparametric Methods in Econometrics and Statistics. Cambridge University Press, Cambridge, pp. 127-174.

Geweke, J. 1989, Modelling with Normal Polynomial Expansions, in Barnett, W. A., Geweke, J. & Shell, K. (eds), Economic Complexity: Chaos, Sunspots, Bubbles, and Nonlinearity. Cambridge University Press, Cambridge, pp. 337-360. Abstract: Polynomial expansions of the normal probability density function are proposed as a class of models for unobserved components. Operational procedures for Bayesian inference in these models are developed, as are methods for combining a sequence of such models and for assessing assumptions of normality and symmetry. The contributions of this chapter are illustrated with an application to daily rates of change of stock prices.
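The numerical-accuracy calculation from the 1992 chapter can be illustrated compactly. Geweke's estimator is spectral; the sketch below uses the closely related Bartlett-window long-run variance, and the truncation rule is an assumption of this sketch.

```python
# Numerical standard error (NSE) of a posterior-moment estimate computed
# from serially correlated simulator output.
import numpy as np

def nse(draws, lags=None):
    x = np.asarray(draws, dtype=float)
    n = len(x)
    if lags is None:
        lags = int(n ** 0.5)                      # rule-of-thumb truncation
    xc = x - x.mean()
    lrv = xc @ xc / n                             # lag-0 autocovariance
    for k in range(1, lags + 1):
        gamma = xc[k:] @ xc[:-k] / n
        lrv += 2.0 * (1.0 - k / (lags + 1)) * gamma   # Bartlett weights
    return np.sqrt(lrv / n)

# AR(1) output mimics the serial dependence of a Gibbs sampler
rng = np.random.default_rng(0)
x = np.empty(20_000); x[0] = 0.0
for t in range(1, len(x)):
    x[t] = 0.9 * x[t - 1] + rng.normal()
print("posterior mean estimate:", x.mean())
print("NSE:", nse(x), "  naive iid s.e.:", x.std() / np.sqrt(len(x)))
```

With autocorrelation 0.9 the honest standard error is several times the naive iid one, which is why treating MCMC output as an iid sample badly overstates accuracy.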
Geweke, J. 1988, Exact Inference in Models with Autoregressive Conditional Heteroscedasticity, in Barnett, W. A., Berndt, E. R. & White, H. (eds), Dynamic Econometric Modeling. Cambridge University Press, Cambridge, pp. 73-103.

Geweke, J. 1987, Endogeneity and Exogeneity, in Eatwell, J., Milgate, M. & Newman, P. (eds), The New Palgrave: A Dictionary of Economic Theory and Doctrine. The Macmillan Press, London.

Geweke, J. 1984, Inference and Causality in Economic Time Series Models, in Griliches, Z. & Intriligator, M. D. (eds), Handbook of Econometrics, Volume 2. North-Holland, Amsterdam, pp. 1101-1144.

Geweke, J. 1983, Causality, Exogeneity, and Inference, in Hildenbrand, W. (ed), Advances in Econometrics. Cambridge University Press, Cambridge, pp. 209-236.

Geweke, J. 1982, Feedback Between Monetary Policy, Labor Market Activity, and Wage Inflation in the United States, 1955-1978, in Workers, Jobs, and Inflation. Brookings Institution Press, Washington, pp. 159-198.

Geweke, J. & Weisbrod, B. 1980, Some Economic Consequences of Technological Advance in Medical Care: The Case of a New Drug, in Helms, R. (ed), Drugs and Health. American Enterprise Institute, Washington.

Geweke, J. & Dent, W. 1980, On Specification in Simultaneous Equation Models, in Kmenta, J. & Ramsey, J. B. (eds), Evaluation of Econometric Models. Elsevier Science & Technology Books, pp. 169-196.

Geweke, J. 1978, The Temporal and Sectoral Aggregation of Seasonally Adjusted Time Series, in Zellner, A. (ed), Seasonal Analysis of Economic Time Series. Government Printing Office, Washington, US, pp. 411-432. Abstract: Procedures for the optimal seasonal adjustment of economic time series and their aggregation are derived, given a criterion suitable for the adjustment of the data used in political or journalistic contexts. It is shown that data should be adjusted jointly and then temporally or sectorally aggregated, as desired, a procedure that preserves linear aggregation identities. Examination of actual economic time series indicates that the optimal seasonal adjustment and aggregation of data provide a substantial improvement in the quality of sectorally disaggregated, adjusted data and considerably reduce the required subsequent revision of current adjusted series.

Geweke, J. 1978, The Revision of Seasonally Adjusted Time Series, in 1978 Proceedings of the Business and Economic Statistics Section - American Statistical Association, pp. 320-325.

Geweke, J. 1977, The Dynamic Factor Analysis of Economic Time Series Models, in Aigner, D. & Goldberger, A. (eds), Latent Variables in Socioeconomic Models. North-Holland, Amsterdam, pp. 365-383.

Geweke, J. 1977, Wage and Price Dynamics in U.S. Manufacturing, in New Methods in Business Cycle Research. Federal Reserve Bank of Minneapolis, Minneapolis, MN, USA, pp. 111-158.

Conferences

Eckert, C., Geweke, J., Louviere, J. J., Satchell, S. E. & Thorp, S. J.
2011, Economic rationality, risk presentation, and retirement portfolio choice, Financial Management Association Annual Meeting, Denver, USA.

Geweke, J. 2010, Complete and Incomplete Bayesian Models for Financial Time Series, JSM Proceedings, Section on Bayesian Statistical Science. American Statistical Association, Vancouver, Canada. This paper introduces the idea of an incomplete Bayesian model, which is a (possibly incoherent) prior predictive distribution for sample moments. Conventional complete Bayesian models also provide prior distributions for sample moments, and consequently formal comparison of complete and incomplete models can be conducted by means of posterior odds ratios. This provides a logically consistent and workable Bayesian alternative to non-Bayesian significance tests and is an effective tool in the process of model development. These ideas are illustrated using three well-known alternative models for monthly S&P 500 index returns.

Geweke, J. 2000, Bayesian Communication: The BACC System, 2000 Proceedings of the Section on Bayesian Statistical Sciences - American Statistical Association, pp. 40-49.

Geweke, J., Keane, M. & Runkle, D. 1994, Recursively Simulating Multinomial Multiperiod Probit Probabilities, American Statistical Association 1994 Proceedings of the Business and Economic Statistics Section.

Geweke, J. & Terui, N. 1991, Threshold Autoregressive Models for Macroeconomic Time Series: A Bayesian Approach, American Statistical Association 1991 Proceedings of the Business and Economic Statistics Section. American Statistical Association, pp. 42-50.

Geweke, J. 1991, Efficient Simulation from the Multivariate Normal and Student-t Distributions Subject to Linear Constraints, Computing Science and Statistics: Proceedings of the Twenty-Third Symposium on the Interface. Interface Foundation of North America, Fairfax, pp. 571-578. The following routines constitute the software for the paper, "Efficient Simulation from the Multivariate Normal and Student-t Distributions Subject to Linear Constraints and the Evaluation of Constraint Probabilities," by John Geweke. This paper is to appear in the volume "Computing Science and Statistics: Proceedings of the Twenty-Third Symposium on the Interface." This work was supported by NSF Grant SES-8908365.

Geweke, J. 1989, The Posterior Distribution of Roots in Multivariate Autoregressions, Erasmus University of Rotterdam - Econometric Institute.

Geweke, J. 1989, Acceleration Methods for Monte Carlo Integration in Bayesian Inference, American Statistical Society, Alexandria, pp. 587-592. Abstract: Methods for the acceleration of Monte Carlo integration with n replications in a sample of size T are investigated. A general procedure for combining antithetic variation and grid methods with Monte Carlo methods is proposed, and it is shown that the numerical accuracy of these hybrid methods can be evaluated routinely. The derivation indicates the characteristics of applications in which acceleration is likely to be most beneficial. This is confirmed in a worked example, in which these acceleration methods reduce the computation time required to achieve a given degree of numerical accuracy by several orders of magnitude (a minimal demonstration of the antithetic idea follows below).

Geweke, J. 1986, Fixed Investment in the American Business Cycle, The American Business Cycle: Continuity and Change. National Bureau of Economic Research, New York.
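The antithetic-variation component of the acceleration methods above is easy to demonstrate: pair every draw z with -z and average, so that odd-order error terms cancel. The integrand and sample size below are arbitrary demo choices.

```python
# Antithetic variates for Monte Carlo integration of E[exp(Z)], Z ~ N(0, 1).
# The true value is exp(1/2).  Both estimators use the same number of
# function evaluations (n), but the antithetic one pairs z with -z.
import numpy as np

rng = np.random.default_rng(0)
n = 50_000
z = rng.normal(size=n)

plain = np.exp(z)                                           # ordinary Monte Carlo
anti = 0.5 * (np.exp(z[: n // 2]) + np.exp(-z[: n // 2]))   # antithetic pairs

print("true value:", np.exp(0.5))
print("plain MC  :", plain.mean(), " est. var:", plain.var() / n)
print("antithetic:", anti.mean(), " est. var:", anti.var() / (n // 2))
```

For a monotone integrand the two halves of each pair are negatively correlated, so the paired estimator's variance drops well below half of the plain estimator's, at identical cost.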
Geweke, J. 1983, Semi-Nonparametric and Nonparametric Regression: Consumer Demand Applications, Proceedings of the Business and Economic Section - American Statistical Association. American Statistical Association, USA.

Geweke, J. 1983, Models of X-11 and X-11 Forecast Procedures, Applied Time Series Analysis of Economic Data. U.S. Bureau of the Census, Washington, pp. 12-13.

Geweke, J. 1982, New Divisia Indices of the Money Supply, Proceedings of the Business and Economics Section. American Statistical Association.

Geweke, J. 1978, On the Synthesis of Time Series and Econometric Models, Directions in Time Series. Institute of Mathematical Statistics.

Geweke, J. 1978, Some Recent Developments in Seasonal Adjustment, Directions in Time Series. Institute of Mathematical Statistics.

Journal articles

Bateman, H., Eckert, C., Geweke, J., Louviere, J. J., Satchell, S. E. & Thorp, S. J. 2016, Risk presentation and retirement portfolio choice, Review of Finance, vol. 20, no. 1, pp. 201-229. Efficient investment of personal savings depends on clear risk disclosures. We study the propensity of individuals to violate some implications of expected utility under alternative mass-market descriptions of investment risk, using a discrete choice experiment. We find violations in around 25% of choices, and substantial variation in rates of violation, depending on the mode of risk disclosure and participants' characteristics. When risk is described as the frequency of returns below or above a threshold, we observe more violations than for range- and probability-based descriptions. Innumerate individuals are more likely to violate expected utility than those with high numeracy. Apart from the very elderly, older individuals are less likely to violate the restrictions. The results highlight the challenges of disclosure regulation.

Geweke, J. 2016, Comment on: Reflections on the probability space induced by moment conditions with implications for Bayesian inference, Journal of Financial Econometrics, vol. 14, no. 2, pp. 253-257.

Geweke, J. 2016, Sequentially Adaptive Bayesian Learning for a Nonlinear Model of the Secular and Cyclical Behavior of US Real GDP, Econometrics, vol. 4, no. 1, article 10. There is a one-to-one mapping between the conventional time series parameters of a third-order autoregression and the more interpretable parameters of secular half-life, cyclical half-life and cycle period (the mapping is sketched below). The latter parameterization is better suited to interpretation of results using both Bayesian and maximum likelihood methods, and to expression of a substantive prior distribution using Bayesian methods. The paper demonstrates how to approach both problems using the sequentially adaptive Bayesian learning (SABL) algorithm and software, which eliminates virtually all of the substantial technical overhead required in conventional approaches and produces results quickly and reliably. The work utilizes methodological innovations in SABL, including optimization of irregular and multimodal functions and production of the conventional maximum likelihood asymptotic variance matrix as a by-product.
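The one-to-one mapping in that abstract is just root arithmetic, as the sketch below shows. The demo constructs an AR(3) from a known real root and complex pair (a choice of this sketch) and recovers the secular half-life, cyclical half-life, and cycle period.

```python
# Map AR(3) coefficients to (secular half-life, cyclical half-life, cycle
# period).  The characteristic roots of z^3 - phi1 z^2 - phi2 z - phi3 = 0
# are one real root r (secular decay) and a complex pair a*exp(+/- i*theta):
# half-life = ln(0.5)/ln(modulus), period = 2*pi/theta.
import numpy as np

def ar3_interpretable(phi1, phi2, phi3):
    roots = np.roots([1.0, -phi1, -phi2, -phi3])
    real = roots[np.abs(roots.imag) < 1e-10].real
    cplx = roots[roots.imag > 1e-10]          # one member of the conjugate pair
    r, z = real[0], cplx[0]
    return {
        "secular_half_life": np.log(0.5) / np.log(r),
        "cyclical_half_life": np.log(0.5) / np.log(abs(z)),
        "cycle_period": 2.0 * np.pi / np.angle(z),
    }

# Build an AR(3) from chosen roots, then recover the labels:
# real root 0.9, complex pair 0.8 * exp(+/- i*2*pi/20)  ->  period 20
coeffs = np.poly([0.9, 0.8 * np.exp(2j * np.pi / 20), 0.8 * np.exp(-2j * np.pi / 20)]).real
phi1, phi2, phi3 = -coeffs[1], -coeffs[2], -coeffs[3]
print(ar3_interpretable(phi1, phi2, phi3))
# expected: secular ~ 6.58, cyclical ~ 3.11, period = 20.0
```

A prior stated on half-lives and a period is far easier to elicit than one stated directly on (phi1, phi2, phi3), which is the point of the reparameterization.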
Durham, G., Geweke, J. & Ghosh, P. 2015, A comment on Christoffersen, Jacobs, and Ornthanalai (2012), "Dynamic jump intensities and risk premiums: Evidence from S&P 500 returns and options", Journal of Financial Economics, vol. 115, no. 1, pp. 210-214.

Geweke, J. & Petrella, L. 2014, Likelihood-based Inference for Regular Functions with Fractional Polynomial Approximations, Journal of Econometrics, vol. 183, no. 1, pp. 22-30. The paper demonstrates limitations in previous work using Müntz-Szász polynomial approximations for regular functions. It introduces an alternative set of fractional polynomial approximations not subject to these limitations. Using Weierstrass approximation theory, it shows that the set of fractional polynomial approximations is dense on a Sobolev space of functions on a compact set. Imposing regularity conditions directly on the fractional polynomials produces pseudo-true approximations that converge rapidly to production functions having no exact representation as fractional polynomials. A small Monte Carlo study recovers this convergence in finite samples, and the results are promising for future development of an adequate sampling-theoretic distribution theory.

Frischknecht, B. D., Eckert, C., Geweke, J. & Louviere, J. J. 2014, A Simple Method to Estimate Preference Parameters for Individuals, International Journal of Research in Marketing, vol. 31, pp. 35-48.

Bateman, H., Eckert, C., Geweke, J., Louviere, J. J., Satchell, S. E. & Thorp, S. J. 2014, Financial competence, risk presentation and retirement portfolio preferences, Journal of Pension Economics and Finance, vol. 13, no. 1, pp. 27-61.

Geweke, J. & Amisano, G. 2014, Analysis of variance for Bayesian inference, Econometric Reviews, vol. 33, no. 1-4, pp. 270-288.

Durham, G. & Geweke, J. 2014, Adaptive sequential posterior simulators for massively parallel computing environments, Advances in Econometrics, vol. 34, pp. 1-44. © 2014 Emerald Group Publishing Limited. Massively parallel desktop computing capabilities, now well within the reach of individual academics, modify the environment for posterior simulation in fundamental and potentially quite advantageous ways. But to fully exploit these benefits, algorithms that conform to parallel computing environments are needed. This paper presents a sequential posterior simulator designed to operate efficiently in this context. The simulator makes fewer analytical and programming demands on investigators, and is faster, more reliable, and more complete than conventional posterior simulators. The paper extends existing sequential Monte Carlo methods and theory to provide a thorough and practical foundation for sequential posterior simulation that is well suited to massively parallel computing environments. It provides detailed recommendations on implementation, yielding an algorithm that requires only code for simulation from the prior and evaluation of prior and data densities, and that works well in a variety of applications representative of serious empirical work in economics and finance. The algorithm facilitates Bayesian model comparison by producing marginal likelihood approximations of unprecedented accuracy as an incidental by-product, is robust to pathological posterior distributions, and provides estimates of numerical standard error and relative numerical efficiency intrinsically. The paper concludes with an application that illustrates the potential of these simulators for applied Bayesian inference.
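A drastically reduced version of the sequential posterior simulator idea, for a one-parameter normal-mean model, is sketched below. The full algorithm adds Metropolis-Hastings mutation steps and parallelizes across particle groups; both are omitted here, and the model and tuning constants are assumptions of this sketch. As the abstract notes, the running product of incremental weights delivers the marginal likelihood as a by-product.

```python
# Bare-bones sequential posterior simulation: particles drawn from the
# prior are reweighted one observation at a time and resampled when the
# effective sample size collapses.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
y = rng.normal(0.5, 1.0, size=200)            # data; sigma = 1 known

n_part = 5000
theta = rng.normal(0.0, 5.0, size=n_part)     # particles from the N(0, 25) prior
W = np.full(n_part, 1.0 / n_part)             # normalized particle weights
log_ml = 0.0

for yt in y:
    lik = norm.pdf(yt, loc=theta)             # p(y_t | theta_i)
    inc = W @ lik                             # estimate of p(y_t | y_1..t-1)
    log_ml += np.log(inc)                     # marginal likelihood by-product
    W = W * lik / inc
    W = W / W.sum()                           # guard against rounding drift
    if 1.0 / np.sum(W**2) < n_part / 2:       # resample when ESS degenerates
        idx = rng.choice(n_part, size=n_part, p=W)
        theta, W = theta[idx], np.full(n_part, 1.0 / n_part)

print("posterior mean:", W @ theta)           # should be near 0.5
print("log marginal likelihood:", log_ml)
```

Each weight update touches every particle independently, which is what makes the method embarrassingly parallel on GPU-style hardware.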
Durham, G. & Geweke, J. 2014, Improving Asset Price Prediction When All Models are False, Journal of Financial Econometrics, vol. 12, no. 2, pp. 278-306.

Geweke, J. 2014, Review Essay on Charles F. Manski's "Public Policy in an Uncertain World: Analysis and Decisions", Journal of Economic Literature, vol. 52, no. 3, pp. 799-804.

Bateman, H., Eckert, C., Geweke, J., Iskhakov, F., Louviere, J. J., Satchell, S. E. & Thorp, S. 2013, Disengagement: A Partial Solution to the Annuity Puzzle, UNSW Australian School of Business Research Paper, no. 2013. This research studies whether individuals make choices consistent with expected utility maximization in allocating wealth between a lifetime annuity and a phased withdrawal account at retirement. The paper describes the construction and administration of a discrete choice experiment to 854 respondents approaching retirement. The experiment finds overall rates of inconsistency with the predictions of the standard CRRA utility model of roughly 50%, and variation in consistency rates depending on the characteristics of the respondents. Individuals with poor numeracy and with low engagement with the choice task, as measured by scores on a task-specific recall quiz, are more likely to increase allocations to the phased withdrawal as the risk of exhausting it increases. Individuals with higher scores on tests of financial capability and with knowledge of retirement income products are more likely to score high on the engagement measure, but capability and knowledge do not have independent effects on consistent choice rates. Results suggest that initiatives to improve specific product knowledge and to help individuals engage with decumulation decisions could be a partial solution to the annuity puzzle.

Bateman, H., Eckert, C., Geweke, J., Louviere, J. J., Thorp, S. J. & Satchell, S. 2012, Financial competence and expectations formation: Evidence from Australia, The Economic Record, vol. 88, no. 280, pp. 39-63. We study the financial competence of Australian retirement savers using self-assessed and quantified measures. Responses to financial literacy questions show large variation and compare poorly with some international surveys. Basic and sophisticated financial literacy vary significantly with most demographics, self-assessed financial competence, income, superannuation accumulation and net worth. General numeracy scores are largely constant across gender, age, higher education and income. Financial competence also significantly affects expectations of stock market performance. Using a discrete choice model, we show that individuals with a higher understanding of risk, diversification and financial assets are more likely to assign a probability to future financial crises rather than expressing uncertainty.

Geweke, J., Koop, G. & Paap, R. 2012, Introduction for the annals issue of the Journal of Econometrics on Bayesian Models, Methods and Applications, Journal of Econometrics, vol. 171, no. 2, pp. 99-100.
This Annals issue of the Journal of Econometrics grew out of the European Seminar on Bayesian Econometrics (ESOBE), held at Erasmus University, Rotterdam on November 5-6, 2010. This conference was important for two reasons. First, it inaugurated ESOBE, which has become a successful annual conference that brings European and international Bayesians together. Second, it celebrated the retirement of Herman van Dijk after a long and successful career in Bayesian econometrics.

Geweke, J. & Amisano, G. 2012, Prediction With Misspecified Models, American Economic Review, vol. 102, no. 3, pp. 482-486. Many decision-makers in the public and private sectors routinely consult the implications of formal economic and statistical models in their work. Especially in large organizations and for important decisions, there are often competing models. Of course, no model under consideration is a literal representation of reality for the purposes at hand; more succinctly, no model is true, and different models focus on different aspects of the relevant environment. This fact can often be supported by formal econometric tests concluding that the models at hand are, indeed, misspecified in various dimensions.

Geweke, J. 2012, Nonparametric Bayesian modelling of monotone preferences for discrete choice experiments, Journal of Econometrics, vol. 171, no. 2, pp. 185-204. Discrete choice experiments are widely used to learn about the distribution of individual preferences for product attributes. Such experiments are often designed and conducted deliberately for the purpose of designing new products. There is a long-standing literature on nonparametric and Bayesian modelling of preferences for the study of consumer choice when there is a market for each product, but this work does not apply when such markets fail to exist, as is the case with most product attributes. This paper takes up the common case in which attributes can be quantified and preferences over these attributes are monotone. It shows that monotonicity is the only shape constraint appropriate for a utility function in these circumstances. The paper models components of utility using a Dirichlet prior distribution and demonstrates that all monotone nondecreasing utility functions are supported by the prior. It develops a Markov chain Monte Carlo algorithm for posterior simulation that is reliable and practical given the number of attributes, choices and sample sizes characteristic of discrete choice experiments. The paper uses the algorithm to demonstrate the flexibility of the model in capturing heterogeneous preferences and applies it to a discrete choice experiment that elicits preferences for different auto insurance policies.

Geweke, J. & Amisano, G. 2011, Hierarchical Markov normal mixture models with applications to financial asset returns, Journal of Applied Econometrics, vol. 26, no. 1, pp. 1-29. Abstract: Motivated by the common problem of constructing predictive distributions for daily asset returns over horizons of one to several trading days, this article introduces a new model for time series. This model is a generalization of the Markov normal mixture model in which the mixture components are themselves normal mixtures, and it is a specific case of an artificial neural network model with two hidden layers (a simulation sketch of the non-hierarchical version follows below). The article characterizes the implications of the model for time series in two ways. First, it derives the restrictions placed on the autocovariance function and linear representation of integer powers of the time series in terms of the number of components in the mixture and the roots of the Markov process. Second, it uses the prior predictive distribution of the model to study the implications of the model for some interesting functions of asset returns. The article uses the model to construct predictive distributions of daily S&P 500 returns 1971-2005, US dollar-UK pound returns 1972-1998, and one- and ten-year maturity bond returns 1987-2006. It compares the performance of the model for these returns with ARCH and stochastic volatility models using the predictive likelihood function. The model's performance is about the same as its competitors for the bond returns, better than its competitors for the S&P 500 returns, and much better than its competitors for the dollar-pound returns. In- and out-of-sample validation exercises with predictive distributions identify some remaining deficiencies in the model and suggest potential improvements. The article concludes by using the model to form predictive distributions of one- to ten-day returns during volatile episodes for the S&P 500, dollar-pound and bond return series.
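To fix ideas, the sketch below simulates the plain (non-hierarchical) Markov normal mixture that the article generalizes: a two-state hidden Markov chain with state-specific normal emissions. All parameter values are invented for the demo; even this reduced model reproduces volatility clustering and excess kurtosis.

```python
# Simulate a two-state Markov normal mixture for daily returns.
import numpy as np

rng = np.random.default_rng(0)
Ptrans = np.array([[0.98, 0.02],            # calm state is persistent
                   [0.05, 0.95]])           # volatile state as well
mu = np.array([0.0005, -0.0010])            # state-specific daily means
sd = np.array([0.005, 0.020])               # state-specific volatilities

T = 2500
s = np.empty(T, dtype=int); s[0] = 0
for t in range(1, T):
    s[t] = rng.choice(2, p=Ptrans[s[t - 1]])   # hidden Markov state
returns = rng.normal(mu[s], sd[s])             # normal emission per state

print("overall std:", returns.std())
print("excess kurtosis:",
      ((returns - returns.mean()) ** 4).mean() / returns.var() ** 2 - 3)
```

The hierarchical version of the article replaces each single normal emission with a normal mixture of its own, adding a second layer of flexibility in the same way a second hidden layer does in a neural network.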
The article characterizes the implications of the model for time series in two ways. First, it derives the restrictions placed on the autocovariance function and linear representation of integer powers of the time series in terms of the number of components in the mixture and the roots of the Markov process. Second, it uses the prior predictive distribution of the model to study the implications of the model for some interesting functions of asset returns. The article uses the model to construct predictive distributions of daily S&P 500 returns 1971-2005, US dollar-UK pound returns 1972-1998, and one- and ten-year maturity bonds 1987-2006. It compares the performance of the model for these returns with ARCH and stochastic volatility models using the predictive likelihood function. The model's performance is about the same as its competitors for the bond returns, better than its competitors for the S&P 500 returns, and much better than its competitors for the dollar-pound returns. In- and out-of-sample validation exercises with predictive distributions identify some remaining deficiencies in the model and suggest potential improvements. The article concludes by using the model to form predictive distributions of one- to ten-day returns during volatile episodes for the S&P 500, dollar-pound and bond return series.

Geweke, J. & Jiang, Y. 2011, Inference and prediction in a multiple-structural-break model, Journal of Econometrics, vol. 163, no. 2, pp. 172-185.
This paper develops a new Bayesian approach to structural break modeling. The focus of the approach is the modeling of in-sample structural breaks and the forecasting of time series allowing for out-of-sample breaks. The model has several desirable features.

Geweke, J. & Amisano, G. 2011, Optimal prediction pools, Journal of Econometrics, vol. 164, no. 1, pp. 130-141.
We consider the properties of weighted linear combinations of prediction models, or linear pools, evaluated using the log predictive scoring rule. Although exactly one model has limiting posterior probability, an optimal linear combination typically includes several models with positive weights. We derive several interesting results: for example, a model with positive weight in a pool may have zero weight if some other models are deleted from that pool. The results are illustrated using S&P 500 returns with six prediction models. In this example models that are clearly inferior by the usual scoring criteria have positive weights in optimal linear pools.
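The pooling problem in the optimal prediction pools entry above is a concave program over the simplex. A minimal sketch under illustrative assumptions (the function name, the misspecified two-model setup and the t-distributed data are mine, not the paper's): given predictive density evaluations dens[t, m] for T observations and M models, choose weights maximizing the log predictive score.

import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def optimal_pool_weights(dens):
    """dens[t, m] = predictive density of model m at observation t."""
    T, M = dens.shape
    neg_log_score = lambda w: -np.sum(np.log(dens @ w))
    cons = {"type": "eq", "fun": lambda w: w.sum() - 1.0}   # weights sum to 1
    bounds = [(0.0, 1.0)] * M
    res = minimize(neg_log_score, np.full(M, 1.0 / M),
                   bounds=bounds, constraints=cons)
    return res.x

# Toy illustration: two misspecified normal models pooled for t-distributed data
rng = np.random.default_rng(0)
y = rng.standard_t(df=5, size=500)
dens = np.column_stack([norm.pdf(y, 0, 0.9), norm.pdf(y, 0, 1.5)])
print(optimal_pool_weights(dens))   # typically interior: both models get weight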
Bateman, H., Eckert, C., Geweke, J., Louviere, J.J., Satchell, S.E. & Thorp, S. 2011, Investment Risk Framing and Individual Preference Consistency, UNSW Australian School of Business Research Paper, no. 2010.
Here we test the usefulness of a discrete choice experiment (DCE) for identifying individuals who consistently exhibit concave utility over returns to wealth, despite variations in the framing of risk. At the same time, we test the relative strengths of nine standard descriptions of investment risk. We ask a sample of 1200 retirement savings account holders to select their most and least preferred investment strategies from a menu of a safe (zero risk) savings account, a risky growth asset portfolio and a 50:50 share of both. We identify respondents who fail to conform with expected utility and test whether this behavior is predictable across different risk frames. Tests confirm that the DCE can help isolate individuals whose preferences violate global risk aversion despite variation in risk presentation. We also identify frames linked to significantly more consistent behavior by respondents. These are frames which simultaneously specify upside and downside risk. Frames that present risk as a frequency of failures or successes against a zero returns benchmark are more likely to generate violations of risk aversion.

Bateman, H., Eckert, C., Geweke, J., Louviere, J.J., Satchell, S.E. & Thorp, S. 2011, Financial Competence and Expectations Formation: Evidence from Australia, UNSW Australian School of Business Research Paper, no. 2011.
We study the financial competence of Australian retirement savers using self-assessed and quantified measures. Responses to numeracy and basic and sophisticated financial literacy questions show large variation and compare poorly with international surveys. We graph the relationships between financial competence index scores and a wide range of demographics, economic outcomes and attitudes. Results show significant variation of basic and sophisticated financial literacy measures with most demographics, self-assessed financial competence, income, superannuation accumulation and net worth. General numeracy scores show patterns of variation different to basic and sophisticated financial literacy, being largely constant by gender, age, higher education and personal income. Our results confirm the usefulness of studying numeracy and financial knowledge-based skills separately. We also consider the impact of financial competence on expectations of stock market performance. Consumers with higher understanding of risk, diversification and financial assets are more likely to assign a probability to future financial crises, and a time-frame to share market recovery, rather than expressing uncertainty.

Geweke, J. 2010, Bayesian Analysis of DSGE Models, Econometric Reviews, vol. 26, pp. 193-200.

Geweke, J. & Amisano, G. 2010, Comparing and Evaluating Bayesian Predictive Distributions of Asset Returns, International Journal of Forecasting, vol. 26, no. 2, pp. 216-230.
Bayesian inference in a time series model provides exact out-of-sample predictive distributions that fully and coherently incorporate parameter uncertainty. This study compares and evaluates Bayesian predictive distributions from alternative models, using as an illustration five alternative models of asset returns applied to daily S&P 500 returns from the period 1976 through 2005. The comparison exercise uses predictive likelihoods and is inherently Bayesian. The evaluation exercise uses the probability integral transformation and is inherently frequentist. The illustration shows that the two approaches can be complementary, with each identifying strengths and weaknesses in models that are not evident using the other.
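The evaluation exercise described in the abstract above rests on the probability integral transformation: if the predictive distributions are correct, u_t = F_t(y_t) is i.i.d. uniform on the unit interval. A minimal sketch with a deliberately misspecified normal predictive model for fat-tailed data (all settings illustrative):

import numpy as np
from scipy.stats import norm, kstest

rng = np.random.default_rng(1)
y = rng.standard_t(df=4, size=1000)       # fat-tailed "returns"
u = norm.cdf(y, loc=0.0, scale=y.std())   # PITs under a normal predictive model

# A Kolmogorov-Smirnov test of uniformity; fat tails show up as non-uniform PITs
stat, pval = kstest(u, "uniform")
print(f"KS statistic {stat:.3f}, p-value {pval:.4f}")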
Geweke, J. 2010, Comment, International Journal of Forecasting, vol. 26, no. 2, pp. 435-438.
The article by Zellner and Ando proposes methods for coping with the excess kurtosis that is often observed in disturbances in applications of the seemingly unrelated regressions (SUR) model. This is an important topic.

Ackerberg, D., Geweke, J. & Hahn, J. 2009, Comments on Convergence Properties of the Likelihood of Computed Dynamic Models, Econometrica, vol. 77, no. 6, pp. 2009-2017.
We show by counterexample that Proposition 2 in Fernández-Villaverde, Rubio-Ramírez, and Santos (Econometrica (2006), 74, 93-119) is false. We also show that even if their Proposition 2 were corrected, it would be irrelevant for parameter estimates. As a more constructive contribution, we consider the effects of approximation error on parameter estimation, and conclude that second order approximation errors in the policy function have at most second order effects on parameter estimates.

Geweke, J. & Keane, M. 2007, Smoothly mixing regressions, Journal of Econometrics, vol. 138, no. 1, pp. 252-290.
This paper extends the conventional Bayesian mixture of normals model by permitting state probabilities to depend on observed covariates. The dependence is captured by a simple multinomial probit model. A conventional and rapidly mixing MCMC algorithm provides access to the posterior distribution at modest computational cost. This model is competitive with existing econometric models, as documented in the paper's illustrations. The first illustration studies quantiles of the distribution of earnings of men conditional on age and education, and shows that smoothly mixing regressions are an attractive alternative to non-Bayesian quantile regression. The second illustration models serial dependence in the S&P 500 return, and shows that the model compares favorably with ARCH models using out-of-sample likelihood criteria.

Geweke, J. 2007, Interpretation and Inference in Mixture Models: Simple MCMC Works, Computational Statistics and Data Analysis, vol. 51, no. 7, pp. 3529-3550.
The mixture model likelihood function is invariant with respect to permutation of the components of the mixture. If functions of interest are permutation sensitive, as in classification applications, then interpretation of the likelihood function requires valid inequality constraints and a very large sample may be required to resolve ambiguities.
If functions of interest are permutation invariant, as in prediction applications, then there are no such problems of interpretation. Contrary to assessments in some recent publications, simple and widely used Markov chain Monte Carlo (MCMC) algorithms with data augmentation reliably recover the entire posterior distribution.

Geweke, J. 2007, Bayesian model comparison and validation, American Economic Review, vol. 97, no. 2, pp. 60-64.
Bayesian econometrics provides a tidy theory and practical methods of comparing and combining several alternative, completely specified models for a common data set. It is always possible that none of the specified models describes important aspects of the data well. The investigation of this possibility, a process known as model validation or model specification checking, is an important part of applied econometric work, but Bayesian theory and practice for model validation are less well developed. A well-established Bayesian literature argues that non-Bayesian methods are essential in model validation, and this line of thought persists in Bayesian econometrics as well; the paper reviews these methods. The paper proposes an alternative, fully Bayesian method of model validation based on the concept of incomplete models, and argues that this method is also strategically advantageous in applied Bayesian econometrics.

Geweke, J., Groenen, P.J.E., Paap, R. & van Dijk, H.K. 2007, Computational techniques for applied econometric analysis of macroeconomic and financial processes, Computational Statistics & Data Analysis, vol. 51, no. 7, pp. 3506-3508.

Geweke, J. 2007, Bayesian dynamic econometrics - Comment, Econometric Reviews, vol. 26, no. 2-4, pp. 193-200.
The article provides detailed and accurate illustrations of Bayesian analysis of DSGE models that are likely to be used increasingly in support of central bank policy making. These comments identify a dozen aspects of these methods, discussing how their application and improvement can contribute to effective support of policy.

Abrantes-Metz, R., Froeb, L., Geweke, J. & Taylor, C. 2006, A Variance Screen for Collusion, International Journal of Industrial Organization, vol. 24, no. 3, pp. 467-486.
In this paper, we examine price movements over time around the collapse of a bid-rigging conspiracy. While the mean decreased by sixteen percent, the standard deviation increased by over two hundred percent. We hypothesize that conspiracies in other industries would exhibit similar characteristics and search for "pockets" of low price variation as indicators of collusion in the retail gasoline industry in Louisville. We observe no such areas around Louisville in 1996-2002.
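The mixture-model entry above argues that simple MCMC with data augmentation works well for permutation-invariant functions of interest. A minimal Gibbs sampler for a two-component normal mixture with known unit variances and known mixing probability (priors and data are illustrative):

import numpy as np

rng = np.random.default_rng(3)
y = np.concatenate([rng.normal(-2, 1, 150), rng.normal(2, 1, 150)])
n, p = len(y), 0.5              # known mixing probability, unit variances
mu = np.zeros(2)                # component means, prior N(0, 10^2)
draws = []
for it in range(2000):
    # 1. Data augmentation: draw component indicators given the means
    w1 = p * np.exp(-0.5 * (y - mu[0]) ** 2)
    w2 = (1 - p) * np.exp(-0.5 * (y - mu[1]) ** 2)
    z = rng.random(n) < w2 / (w1 + w2)          # True -> component 2
    # 2. Draw each mean from its conjugate normal full conditional
    for k, sel in enumerate([~z, z]):
        v = 1.0 / (sel.sum() + 1.0 / 100.0)     # full-conditional variance
        mu[k] = v * y[sel].sum() + np.sqrt(v) * rng.standard_normal()
    draws.append(mu.copy())
draws = np.array(draws[500:])
print(draws.mean(axis=0))       # near (-2, 2), up to label switching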
Geweke, J. 2004, Getting it Right: Joint Distribution Tests of Posterior Simulators, Journal of the American Statistical Association, vol. 99, no. 467, pp. 799-804.
Analytical or coding errors in posterior simulators can produce reasonable but incorrect approximations of posterior moments. This article develops simple tests of posterior simulators that detect both kinds of errors, and uses them to detect and correct errors in two previously published papers. The tests exploit the fact that a Bayesian model specifies the joint distribution of observables (data) and unobservables (parameters). There are two joint distribution simulators. The marginal-conditional simulator draws unobservables from the prior and then observables conditional on unobservables. The successive-conditional simulator alternates between the posterior simulator and an observables simulator. Formal comparison of moment approximations of the two simulators reveals existing analytical or coding errors in the posterior simulator.

Geweke, J. & Tanizaki, H. 2003, Note on the Sampling Distribution for the Metropolis-Hastings Algorithm, Communications in Statistics - Theory and Methods, vol. 32, pp. 775-789.
The Metropolis-Hastings algorithm has been important in the recent development of Bayes methods. This algorithm generates random draws from a target distribution utilizing a sampling (or proposal) distribution. This article compares the properties of three sampling distributions: the independence chain, the random walk chain, and the Taylored chain suggested by Geweke and Tanizaki (Communications in Statistics, Simulation and Computation 28(4) (1999): 867-894; Computational Statistics and Data Analysis 37(2) (2001): 151-170).

Geweke, J., Gowrisankaran, G. & Town, R. 2003, Bayesian Inference for Hospital Quality in a Selection Model, Econometrica, vol. 71, pp. 1215-1238.
This paper develops new econometric methods to infer hospital quality in a model with discrete dependent variables and nonrandom selection. Mortality rates in patient discharge records are widely used to infer hospital quality. However, hospital admission is not random and some hospitals may attract patients with greater unobserved severity of illness than others. In this situation the assumption of random admission leads to spurious inference about hospital quality. This study controls for hospital selection using a model in which the distances between the patient's residence and alternative hospitals are key exogenous variables. Bayesian inference in this model is feasible using a Markov chain Monte Carlo posterior simulator, and attaches posterior probabilities to quality comparisons between individual hospitals and groups of hospitals. The study uses data on 74,848 Medicare patients admitted to 114 hospitals in Los Angeles County from 1989 through 1992 with a diagnosis of pneumonia. It finds the smallest and largest hospitals to be of the highest quality. There is strong evidence of dependence between the unobserved severity of illness and the assignment of patients to hospitals, whereby patients with a high unobserved severity of illness are disproportionately admitted to high quality hospitals. Consequently a conventional probit model leads to inferences about quality that are markedly different from those in this study's selection model.

Geweke, J. & Durham, G. 2003, Iterative and Recursive Estimation in Structured Non-Adaptive Models: Comment, Journal of Business & Economic Statistics, vol. 21, no. 4, pp. 490-492.
This was an invited paper on which the journal solicited comments; my comment starts on page 490, at the end of the article.
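A sketch of the joint distribution test above, for the toy model theta ~ N(0,1), y | theta ~ N(theta,1), where the posterior theta | y ~ N(y/2, 1/2) is known exactly; the batch-means standard error is a crude stand-in for the paper's numerical standard errors:

import numpy as np

rng = np.random.default_rng(2)
R = 50_000

# Marginal-conditional simulator: prior draw, then data draw
theta_mc = rng.standard_normal(R)
y_mc = theta_mc + rng.standard_normal(R)

# Successive-conditional simulator: alternate theta | y and y | theta
g_sc = np.empty(R)
y = 0.0
for r in range(R):
    th = 0.5 * y + np.sqrt(0.5) * rng.standard_normal()   # posterior draw
    y = th + rng.standard_normal()                        # observables draw
    g_sc[r] = th * y

def batch_mean_se(x, nb=100):     # serial-correlation-robust standard error
    b = x[: (len(x) // nb) * nb].reshape(nb, -1).mean(axis=1)
    return b.std(ddof=1) / np.sqrt(nb)

g_mc = theta_mc * y_mc            # a moment the two simulators must agree on
z = (g_mc.mean() - g_sc.mean()) / np.sqrt(g_mc.var() / R + batch_mean_se(g_sc) ** 2)
print(f"z-statistic {z:.2f} (far from 0 would flag a simulator error)")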
Geweke, J. & Martin, D. 2002, Pitfalls in Drawing Policy Conclusions from Retrospective Survey Data: The Case of Advertising and Underage Smoking, Journal of Risk and Uncertainty, vol. 83, pp. 1181-1186.
Measuring the impact of potentially controllable factors on the willingness of youth to undertake health risks is important to informed public health policy decisions. Typically the only data linking these factors with risk-taking behavior are retrospective. This study demonstrates, by means of a recent example, that there can be serious pitfalls in using even longitudinal retrospective data to draw conclusions about causal relations between potentially controllable factors and risk-taking behavior.

Geweke, J. 2002, Commentary: Econometric issues in using the AHEAD Panel, Journal of Econometrics, vol. 112, no. 1, pp. 115-120.
This study provides an illuminating perspective on the relation between health and socio-economic status. It is notable in meeting, head on, various technical but critical issues that arise in using the AHEAD panel to address issues of causation between health and socio-economic status (SES). This panel provides multiple measures of both health and SES, and there is no prior consensus reduction of these many dimensions. Household wealth is the candidate summary measure of economic status, but as users of self-reported wealth know and the authors lucidly demonstrate, severe measurement errors raise a host of methodological problems of their own. These comments focus on the way the authors have addressed these and some of the other technical issues that have to be confronted in one way or another in order to address the central issues.

Geweke, J. 2001, Bayesian Econometrics and Forecasting, Journal of Econometrics, vol. 100, no. 1, pp. 11-15.
Contemporary Bayesian forecasting methods draw on foundations in subjective probability and preferences laid down in the mid-twentieth century, and utilize numerical methods developed since that time in their implementation. These methods unify the tasks of forecasting and model evaluation. They also provide tractable solutions for problems that prove difficult when approached using non-Bayesian methods. These advantages arise from the fact that the conditioning in Bayesian probability forecasting is the same as the conditioning in the underlying decision problems.

Geweke, J. 2001, Bayesian inference and posterior simulators, Canadian Journal of Agricultural Economics, vol. 49, no. 3, pp. 313-325.
Recent advances in simulation methods have made possible the systematic application of Bayesian methods to support decision making with econometric models. This paper outlines the key elements of Bayesian investigation, and the simulation methods applied to bring them to bear in application.

Geweke, J. & McCausland, W.J. 2001, Bayesian Specification Analysis in Econometrics, American Journal of Agricultural Economics, vol. 83, pp. 1181-1186.

Geweke, J. & Tanizaki, H. 2001, Bayesian Estimation of State-Space Models Using the Metropolis-Hastings Algorithm within Gibbs Sampling, Computational Statistics and Data Analysis, vol. 37, no. 2, pp. 151-170.
In this paper, an attempt is made to show a general solution to nonlinear and/or non-Gaussian state-space modeling in a Bayesian framework, which corresponds to an extension of Carlin et al. (J. Amer. Statist. Assoc. 87(418) (1992) 493-500) and Carter and Kohn (Biometrika 81(3) (1994) 541-553; Biometrika 83(3) (1996) 589-601). Using the Gibbs sampler and the Metropolis-Hastings algorithm, an asymptotically exact estimate of the smoothing density is obtained.

Geweke, J. 2001, A Note on Some Limitations of CRRA Utility, Economics Letters, vol. 71, no. 3, pp. 341-345.
In a standard environment for choice under uncertainty with constant relative risk aversion (CRRA), the existence of expected utility is fragile with respect to changes in the distributions of random variables, changes in prior information, or the assumption of rational expectations.

Geweke, J. & Keane, M. 2000, An Empirical Analysis of Male Income Dynamics in the PSID: 1968-1989, Journal of Econometrics, vol. 96, no. 2, pp. 293-356.

Geweke, J., Rust, J. & van Dijk, H.K. 2000, Introduction: Inference and decision making, Journal of Applied Econometrics, vol. 15, no. 6, pp. 545-546.

Geweke, J. & Tanizaki, H. 1999, On Markov Chain Monte Carlo Methods for Nonlinear and Non-Gaussian State-Space Models, Communications in Statistics - Simulation and Computation, vol. 28, pp. 867-894.
In this paper, a nonlinear and/or non-Gaussian smoother utilizing Markov chain Monte Carlo methods is proposed, where the measurement and transition equations are specified in any general formulation and the error terms in the state-space model are not necessarily normal. The random draws are directly generated from the smoothing densities. For random number generation, the Metropolis-Hastings algorithm and the Gibbs sampling technique are utilized. The proposed procedure is very simple and easy to program, compared with the existing nonlinear and non-Gaussian smoothing techniques. Moreover, taking several candidates for the proposal density function, we examine the precision of the proposed estimator.
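Several entries above turn on the Metropolis-Hastings algorithm. A minimal random-walk chain, one of the three sampling distributions compared in the Geweke and Tanizaki work; the standard normal target and the proposal scale are illustrative choices, not the papers':

import numpy as np

def rw_metropolis(log_post, x0, scale, n, rng):
    x, lp = x0, log_post(x0)
    out, accepted = np.empty(n), 0
    for i in range(n):
        prop = x + scale * rng.standard_normal()    # random walk proposal
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:     # accept/reject step
            x, lp, accepted = prop, lp_prop, accepted + 1
        out[i] = x
    return out, accepted / n

rng = np.random.default_rng(4)
draws, acc = rw_metropolis(lambda x: -0.5 * x * x, 0.0, 2.4, 50_000, rng)
print(f"acceptance {acc:.2f}, mean {draws.mean():.3f}, var {draws.var():.3f}")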
Geweke, J. 1999, Using Simulation Methods for Bayesian Econometric Models: Inference, Development and Communication, Econometric Reviews, vol. 18, no. 1, pp. 1-73.
This paper surveys the fundamental principles of subjective Bayesian inference in econometrics and the implementation of those principles using posterior simulation methods. The emphasis is on the combination of models and the development of predictive distributions. Moving beyond conditioning on a fixed number of completely specified models, the paper introduces subjective Bayesian tools for formal comparison of these models with as yet incompletely specified models. The paper then shows how posterior simulators can facilitate communication between investigators (for example, econometricians) on the one hand and remote clients (for example, decision makers) on the other, enabling clients to vary the prior distributions and functions of interest employed by investigators. A theme of the paper is the practicality of subjective Bayesian methods. To this end, the paper describes publicly available software for Bayesian inference, model development, and communication and provides illustrations using two simple econometric models.

Geweke, J. 1999, Power of Tests in Binary Response Models: Comment, Econometrica, vol. 67, pp. 423-425.

Geweke, J. 1998, Real and Spurious Long Memory Properties of Stock Market Data, Journal of Business & Economic Statistics, vol. 16, pp. 269-271.

Geweke, J. & Petrella, L. 1998, Prior Density Ratio Class Robustness in Econometrics, Journal of Business & Economic Statistics, vol. 16, pp. 469-478.
This paper provides a general and efficient method for computing density ratio class bounds on posterior moments, given the output of a posterior simulator. It shows how density ratio class bounds for posterior odds ratios may be formed in many situations, also on the basis of posterior simulator output. The computational method is used to provide density ratio class bounds in two economic models. It is found that the exact bounds are approximated poorly by their asymptotic approximation when the posterior distribution of the function of interest is skewed. It is also found that the posterior odds ratios display substantial variation within the density ratio class, in ways that cannot be anticipated by the asymptotic approximation.

Geweke, J., Keane, M. & Runkle, D. 1997, Statistical Inference in the Multinomial Multiperiod Probit Model, Journal of Econometrics, vol. 80, no. 1, pp. 125-165.
Statistical inference in multinomial multiperiod probit models has been hindered in the past by the high dimensional numerical integrations necessary to form the likelihood functions, posterior distributions, or moment conditions in these models. We describe three alternative estimators, implemented using simulation-based approaches to inference, that circumvent the integration problem: posterior means computed using Gibbs sampling and data augmentation (GIBBS), simulated maximum likelihood (SML) estimation using the GHK probability simulator, and method of simulated moments (MSM) estimation using GHK. We perform a set of Monte Carlo experiments to compare the sampling distributions of these estimators. Although all three estimators perform reasonably well, some important differences emerge. Our most important finding is that, holding simulation size fixed, the relative and absolute performance of the classical methods, especially SML, gets worse when serial correlation in disturbances is strong. In data sets with an AR(1) parameter of 0.50, the RMSEs for SML and MSM based on GHK with 20 draws exceed those of GIBBS by 9% and 0%, respectively. But when the AR(1) parameter is 0.80, the RMSEs for SML and MSM based on 20 draws exceed those of GIBBS by 79% and 37%, respectively, and the numbers of draws needed to reduce the RMSEs to within 10% of GIBBS are 160 and 80, respectively. Also, the SML estimates of serial correlation parameters exhibit significant downward bias. Thus, while conventional wisdom suggests that 20 draws of GHK is "enough" to render the bias and noise induced by simulation negligible, our results suggest that much larger simulation sizes are needed when serial correlation in disturbances is strong.
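The GHK recursive probability simulator named in the abstract above can be sketched in a few lines: it draws truncated standard normals along a Cholesky factorization and averages the accumulated band probabilities. A generic illustration, not the authors' code:

import numpy as np
from scipy.stats import norm

def ghk(lower, upper, Sigma, n_draws, rng):
    L = np.linalg.cholesky(Sigma)
    d = len(lower)
    w = np.ones(n_draws)                  # accumulated rectangle probabilities
    e = np.zeros((n_draws, d))            # truncated standard normal draws
    for j in range(d):
        m = e[:, :j] @ L[j, :j]           # conditional mean from earlier draws
        lo = norm.cdf((lower[j] - m) / L[j, j])
        hi = norm.cdf((upper[j] - m) / L[j, j])
        w *= hi - lo                      # probability of the j-th band
        u = lo + rng.random(n_draws) * (hi - lo)
        e[:, j] = norm.ppf(np.clip(u, 1e-12, 1 - 1e-12))
    return w.mean()

rng = np.random.default_rng(5)
Sigma = np.array([[1.0, 0.5], [0.5, 1.0]])
print(ghk(np.array([-np.inf] * 2), np.zeros(2), Sigma, 20, rng))
# exact orthant probability for rho = 0.5 is 1/4 + arcsin(0.5)/(2*pi) = 1/3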
Geweke, J. & Zhou, G. 1996, Measuring the Pricing Error of the Arbitrage Pricing Theory, Review of Financial Studies, vol. 9, no. 2, pp. 557-587.
This article provides an exact Bayesian framework for analyzing the arbitrage pricing theory (APT). Based on the Gibbs sampler, we show how to obtain the exact posterior distributions for functions of interest in the factor model. In particular, we propose a measure of the APT pricing deviations and obtain its exact posterior distribution. Using monthly portfolio returns grouped by industry and market capitalization, we find that there is little improvement in reducing the pricing errors by including more factors beyond the first one.

Geweke, J. 1996, Bayesian Reduced Rank Regression in Econometrics, Journal of Econometrics, vol. 75, no. 1, pp. 121-146.
The reduced rank regression model arises repeatedly in theoretical and applied econometrics. To date the only general treatments of this model have been frequentist. This paper develops general methods for Bayesian inference with noninformative reference priors in this model, based on a Markov chain sampling algorithm, and procedures for obtaining predictive odds ratios for regression models with different ranks. These methods are used to obtain evidence on the number of factors in a capital asset pricing model.

Geweke, J. & Runkle, D. 1995, A Fine Time for Monetary Policy, Federal Reserve Bank of Minneapolis Quarterly Review, vol. 19, no. 1, pp. 18-31.
Almost everyone would agree, even we in the Federal Reserve System, that monetary policy can be improved. But improving it requires accurate empirical descriptions of the current policy and the relationship between that policy and the economic variables policymakers care about. With those descriptions, we could, conceivably, predict how economic outcomes would change under alternative policies and hence find policies that lead to better economic outcomes. The first requirement of this policymaking problem is policy identification, and it is the focus of this study. Policy identification entails a specification of the instrument the Federal Reserve controls and a description of how that instrument is set based on information available when a policy decision is made. Because policy identification is a crucial step in the search for improved monetary policy, it has received much attention in the literature.

Geweke, J. 1994, Bayesian Analysis of Stochastic Volatility Models, Journal of Business & Economic Statistics, vol. 12, pp. 397-399.

Geweke, J., Keane, M. & Horowitz, J.L. 1994, Advances in Random Utility Models, Marketing Letters, vol. 5, pp. 311-322.
In recent years, major advances have taken place in three areas of random utility modeling: (1) semiparametric estimation, (2) computational methods for multinomial probit models, and (3) computational methods for Bayesian estimation. This paper summarizes these developments and discusses their implications for practice.

Geweke, J. 1994, Priors for Macroeconomic Time Series and Their Application, Econometric Theory, vol. 10, pp. 609-632.
This paper takes up Bayesian inference in a general trend stationary model for macroeconomic time series with independent Student-t disturbances. The model is linear in the data, but nonlinear in the parameters. An informative but nonconjugate family of prior distributions for the parameters is introduced, indexed by a single parameter which can be readily elicited. The main technical contribution is the construction of posterior moments, densities, and odds ratios using a six-step Gibbs sampler. Mappings from the index parameter of the family of prior distributions to posterior moments, densities, and odds ratios are developed for several of the Nelson-Plosser time series. These mappings show that the posterior distribution is not even approximately Gaussian, and indicate the sensitivity of the posterior odds ratio in favor of difference stationarity to the choice of prior distribution.

Geweke, J., Keane, M. & Runkle, D. 1994, Alternative Computational Approaches to Statistical Inference in the Multinomial Probit Model, Review of Economics and Statistics, vol. 76, no. 4, pp. 609-632.
This research compares several approaches to inference in the multinomial probit model, based on two Monte Carlo experiments for a seven choice model. The methods compared include the simulated maximum likelihood estimator using the GHK recursive probability simulator.

Horowitz, J.L., Bolduc, D., Divakar, S., Geweke, J., Gönül, F., Hajivassiliou, V., Koppelman, F.S., Keane, M., Matzkin, R., Rossi, P. & Ruud, P. 1994, Advances in random utility models: report of the workshop on advances in random utility models, Duke Invitational Symposium on Choice Modeling Behavior, Marketing Letters, vol. 5, no. 4, pp. 311-322.

Geweke, J. & Terui, N. 1993, Bayesian Threshold Autoregressive Models for Nonlinear Time Series, Journal of Time Series Analysis, vol. 14, pp. 441-445.
This paper provides a Bayesian approach to statistical inference in the threshold autoregressive model for time series. The exact posterior distribution of the delay and threshold parameters is derived, as is the multi-step-ahead predictive density. The proposed methods are applied to the Wolf sunspot and Canadian lynx data sets.

Geweke, J. 1993, Forecasting Time Series with Common Seasonal Patterns, Journal of Econometrics, vol. 55, pp. 201-202.

Geweke, J. 1993, Discussion on the Gibbs Sampler and Other Markov Chain Monte Carlo Methods, Journal of the Royal Statistical Society, Series B (Methodological), vol. 55, p. 74.

Clifford, P., Jennison, C., Wakefield, J., Phillips, D., Frigessi, A., Gray, A., Lawson, A., Forster, J.,
Ramgopal, P., Arslan, O., Constable, P., Kent, J., Wolff, R., Harding, E., Middleton, R., Diggle, P., Aykroyd, R., Berzuini, C., Brewer, M. & Aitken, C. 1993, Discussion on the Meeting on the Gibbs Sampler and Other Markov Chain Monte Carlo Methods, Journal of the Royal Statistical Society, Series B (Methodological), vol. 55, no. 1, pp. 53-102.

Geweke, J. 1993, Bayesian Treatment of the Independent Student-t Linear Model, Journal of Applied Econometrics, vol. 8, pp. S19-S40.

Geweke, J. 1993, Remarks on My Term at JBES, Journal of Business & Economic Statistics, vol. 11, no. 4, p. 427.

Geweke, J. 1992, Inference and Prediction in the Presence of Uncertainty and Determinism, Statistical Science, vol. 7, pp. 94-101.

Geweke, J. 1991, Generic, Algorithmic Approaches to Monte Carlo Integration in Bayesian Inference, Contemporary Mathematics, vol. 115, pp. 117-135.
A program of research in generic, algorithmic approaches to Monte Carlo integration in Bayesian inference is summarized. The goal of this program is the development of a widely applicable family of solutions to Bayesian multiple integration problems that obviates the need for case-by-case treatment of arcane problems in numerical analysis. The essentials of the Bayesian inference problem, with some reference to econometric applications, are set forth. Fundamental results in Monte Carlo integration are derived and their current implementation in software is described. Potential directions for fruitful new research are outlined.

Geweke, J., Barnett, W. & Wolfe, M. 1991, Seminonparametric Bayesian Estimation of the Asymptotically Ideal Production Model, Journal of Econometrics, vol. 49, pp. 5-50.
Recently it has been shown that seminonparametric methods can be used to produce high-quality approximations to a firm's technology. Unlike the local approximations provided by the conventional class of "flexible functional forms," seminonparametric methods generate global spans within large classes of functions. However, that approach usually spans a much larger space than the neoclassical function space relevant to most production modeling. An exception is the asymptotically ideal model (AIM) generated from the Müntz-Szász series expansion. Since every basis function in that expansion is within the neoclassical function space, a straightforward method exists for imposing neoclassical regularity when all factors are substitutes. Since the relevant constraints are inequality restrictions, we implement the approach using Bayesian methods to avoid the problems of sampling distribution truncation that would occur with sampling theoretic methods. We further discuss the relevant extensions that would permit complementary factors, nonconstant returns to scale, and technological change.

Matchar, D., Simel, D., Geweke, J. & Feussner, J. 1990, A Bayesian Method for Evaluating Medical Test Operating Characteristics When Some Patients' Conditions Fail to be Diagnosed by the Reference Standard, Medical Decision Making, vol. 10, no. 2, pp. 102-115.
The evaluation of a diagnostic test when the reference standard fails to establish a diagnosis in some patients is a common and difficult analytical problem. Conventional operating characteristics, derived from a 2 x 2 matrix, require that tests have only positive or negative results, and that disease status be designated definitively as present or absent. Results can be displayed in a 2 x 3 matrix, with an additional column for undiagnosed patients, when it is not always possible to ascertain the disease status definitively. The authors approach this problem using a Bayesian method for evaluating the 2 x 3 matrix in which test operating characteristics are described by a joint probability density function. They show that one can derive this joint probability density function of sensitivity and specificity empirically by applying a sampling algorithm. The three-dimensional histogram resulting from this sampling procedure approximates the true joint probability density function for sensitivity and specificity. Using a clinical example, the authors illustrate the method and demonstrate that the joint probability density function for sensitivity and specificity can be influenced by assumptions used to interpret test results in undiagnosed patients. This Bayesian method represents a flexible and practical solution to the problem of evaluating test sensitivity and specificity when the study group includes patients whose disease could not be diagnosed by the reference standard. Keywords: Bayesian analysis; test operating characteristics; probability density functions. (Med Decis Making 1990;10:102-111)

Geweke, J. 1989, Bayesian Inference in Econometric Models Using Monte Carlo Integration, Econometrica, vol. 57, no. 6, pp. 1317-1339.
Methods for the systematic application of Monte Carlo integration with importance sampling to Bayesian inference in econometric models are developed. Conditions under which the numerical approximation of a posterior moment converges almost surely to the true value as the number of Monte Carlo replications increases, and under which the numerical accuracy of this approximation may be assessed reliably, are set forth. Methods for the analytical verification of these conditions are discussed. Importance sampling densities are derived from multivariate normal or Student t approximations to the local behavior of the posterior density at its mode. These densities are modified by automatic rescaling along each axis. The concept of relative numerical efficiency is introduced to evaluate the adequacy of a chosen importance sampling density. The practical procedures based on these innovations are illustrated in two different models.
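A sketch of Monte Carlo integration with importance sampling, with the numerical standard error (NSE) and relative numerical efficiency (RNE) diagnostics of the 1989 Econometrica entry; the Student-t target and the normal importance density are illustrative stand-ins for a posterior and its mode-based approximation:

import numpy as np
from scipy.stats import t as student_t, norm

rng = np.random.default_rng(6)
R = 100_000
draws = 1.2 * rng.standard_normal(R)                       # importance draws
w = student_t.pdf(draws, df=5) / norm.pdf(draws, 0, 1.2)   # importance weights
g = draws ** 2                                             # function of interest

est = np.sum(w * g) / np.sum(w)                  # posterior moment estimate
wn = w / w.mean()
nse = np.sqrt(np.mean(wn ** 2 * (g - est) ** 2) / R)       # numerical std error
varg = np.sum(w * (g - est) ** 2) / np.sum(w)              # posterior variance
rne = varg / (R * nse ** 2)           # efficiency relative to i.i.d. sampling
print(f"E[g] ~ {est:.3f} (true 5/3), NSE {nse:.4f}, RNE {rne:.2f}")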
Geweke, J. 1989, Exact Predictive Densities in Linear Models with ARCH Disturbances, Journal of Econometrics, vol. 40, pp. 63-86.
It is shown how exact predictive densities may be formed in the ARCH linear model by means of Monte Carlo integration with importance sampling. Several improvements in computational efficiency over earlier implementations of this procedure are developed, including use of the exact likelihood function rather than an asymptotic approximation to construct the importance sampling distribution, and antithetic acceleration of convergence. A numerical approach to the formulation of posterior odds ratios and the combination of non-nested models is also introduced. These methods are applied to daily quotations of closing stock prices. Forecasts are formulated using linear models, ARCH linear models and an integrated model constructed from the posterior probabilities of the respective models. The use of the exact predictive density in a decision-theoretic context is illustrated by deriving the optimal day-to-day portfolio adjustments of a trader with constant relative risk aversion.

Geweke, J. 1989, Sensitivity Analysis of Seasonal Adjustments: Empirical Case Studies, Journal of the American Statistical Association, vol. 84, pp. 28-30.

Carlin, J.B., Dempster, A.P., Pierce, D.A., Bell, W.R., Cleveland, W.S., Watson, M.W. & Geweke, J. 1989, Sensitivity Analysis of Seasonal Adjustments: Empirical Case Studies - Comments, Journal of the American Statistical Association, vol. 84, no. 405, pp. 6-30.

Geweke, J. 1988, Antithetic Acceleration of Monte Carlo Integration in Bayesian Inference, Journal of Econometrics, vol. 38, no. 1-2, pp. 73-90.
It is proposed to sample antithetically rather than randomly from the posterior density in Bayesian inference using Monte Carlo integration. Conditions are established under which the number of replications required with antithetic sampling, relative to the number required with random sampling, is inversely proportional to sample size as sample size increases. The result is illustrated in an experiment using a bivariate vector autoregression.

Geweke, J. 1988, An Application of Operational-Subjective Statistical Methods to Rational Expectations, Journal of Business & Economic Statistics, vol. 6, pp. 465-466.

Geweke, J. 1988, Operational Bayesian Methods in Econometrics, Journal of Economic Perspectives, vol. 2, pp. 159-166.

Geweke, J. 1988, Employment Discrimination and Statistical Science, Statistical Science, vol. 3, pp. 188-189.

Geweke, J. 1988, Checks of Model Adequacy for Univariate Time Series Models and Their Application to Econometric Relationships, Econometric Reviews, vol. 7, no. 1, pp. 59-62.

Geweke, J. 1988, The Secular and Cyclical Behavior of Real GDP in Nineteen OECD Countries, 1957-1983, Journal of Business & Economic Statistics, vol. 6, pp. 479-486.
Log per capita real gross domestic product is modeled as a third-order autoregression with a pair of complex roots whose amplitude is smaller than the amplitude of the real root. The behavior of this time series is interpreted in terms of these two amplitudes, the periodicity of the complex roots, and the standard deviation of the disturbance. Restrictions are evaluated and inference is conducted using the likelihood principle, applying Monte Carlo integration with importance sampling. These Bayesian procedures efficiently cope with restrictions that are awkward under a classical approach. We find very little difference between countries in the amplitudes of real roots and of complex roots, relative to within-country uncertainty. There are some substantial differences in the periodicities of complex roots, and the greatest differences between countries are found in the standard deviation of the disturbance.
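The antithetic acceleration entry above pairs each posterior draw with its reflection about the posterior mean, so that for nearly linear functions of interest the Monte Carlo error collapses. A minimal illustration (all numbers mine):

import numpy as np

rng = np.random.default_rng(7)
mean, sd, R = 1.0, 0.3, 10_000
g = lambda th: np.exp(0.1 * th)       # smooth, nearly linear function

z = rng.standard_normal(R)
plain = g(mean + sd * z)                               # random sampling
anti = 0.5 * (g(mean + sd * z) + g(mean - sd * z))     # antithetic pairs
print("variance ratio (antithetic/random):", anti.var() / plain.var())
# far below the 1/2 that simply doubling the draws would deliver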
Geweke, J. & Froeb, L. 1987, Long Run Competition in the U.S. Aluminum Industry, International Journal of Industrial Organization, vol. 5, pp. 67-78.
A methodology for examining dynamic structure-performance relationships in a single industry is proposed and illustrated. Implications of long run competitive behavior for a simple simultaneous equations model of structure and performance are derived and tested using recently developed methods for the interpretation of economic time series. It is concluded that structure and performance in the U.S. aluminum industry in the postwar period conform well with the hypothesis that the primary aluminum market was competitive in the long run.

Geweke, J., Marshall, R. & Zarkin, G. 1986, Mobility Indices in Continuous Time Markov Chains, Econometrica, vol. 54, pp. 1407-1423.
The axiomatic derivation of mobility indices for first-order Markov chain models in discrete time is extended to continuous-time models. Many of the logical inconsistencies among axioms noted in the literature for the discrete time models do not arise for continuous time models. It is shown how mobility indices in continuous time Markov chains may be estimated from observations at two points in time. Specific attention is given to the case in which the states are fractiles, and an empirical example is presented.

Geweke, J., Marshall, R. & Zarkin, G. 1986, Exact Inference for Continuous Time Markov Chains, Review of Economic Studies, vol. 53, pp. 653-669.
Methods for exact Bayesian inference under a uniform diffuse prior are set forth for the continuous time homogeneous Markov chain model. It is shown how the exact posterior distribution of any function of interest may be computed using Monte Carlo integration. The solution handles the problem of embeddability in a very natural way, and provides (to our knowledge) the only solution that systematically takes this problem into account. The methods are illustrated using several sets of data.

Geweke, J. 1986, Exact Inference in the Inequality Constrained Normal Linear Regression Model, Journal of Applied Econometrics, vol. 1, pp. 127-141.
Inference in the inequality constrained normal linear regression model is approached as a problem in Bayesian inference, using a prior that is the product of a conventional uninformative distribution and an indicator function representing the inequality constraints. The posterior distribution is calculated using Monte Carlo numerical integration, which leads directly to the evaluation of expected values of functions of interest. This approach is compared with others that have been proposed. Three empirical examples illustrate the utility of the proposed methods using an inexpensive 32-bit microcomputer.
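The inequality-constrained regression entry admits a very short Monte Carlo sketch: draw from the unconstrained posterior and retain the draws satisfying the constraints. Here the unconstrained posterior is approximated by a normal (the exact diffuse-prior treatment uses a multivariate t), and the data and constraint beta >= 0 are illustrative:

import numpy as np

rng = np.random.default_rng(8)
n, beta_true = 200, np.array([0.3, 0.05])
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
y = X @ beta_true + rng.standard_normal(n)

XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ y                             # OLS point estimate
s2 = np.sum((y - X @ b) ** 2) / (n - 2)

draws = rng.multivariate_normal(b, s2 * XtX_inv, size=50_000)
keep = draws[(draws >= 0).all(axis=1)]            # impose beta >= 0
print("acceptance rate:", len(keep) / len(draws))
print("constrained posterior mean:", keep.mean(axis=0))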
Geweke, J. 1986, The Superneutrality of Money in the United States: An Interpretation of the Evidence, Econometrica, vol. 54, pp. 1-22.
Structural and stochastic neutrality have refutable implications for aggregate economic time series only in conjunction with other maintained hypotheses. Simple and commonly employed maintained hypotheses lead to restrictions on measures of feedback and their decomposition by frequency. These restrictions also suggest an empirical interpretation of the notional long and short runs. It is found that a century of annual U.S. data, and postwar monthly data, consistently support structural superneutrality of money with respect to output and the real rate of return, and consistently reject its superneutrality with respect to velocity. A quantitative characterization of the long run is suggested.

Geweke, J. 1986, Modeling Conditional Variance, Econometric Reviews, vol. 5, no. 1, pp. 57-61.

Geweke, J. 1985, Macroeconomic Modeling and the Theory of the Representative Agent, American Economic Review, vol. 75, pp. 206-210.

Geweke, J. & Porter-Hudak, S. 1984, The Estimation and Application of Long Memory Time Series Models, Journal of Time Series Analysis, vol. 4, pp. 221-238.
The definitions of fractional Gaussian noise and integrated (or fractionally differenced) series are generalized, and it is shown that the two concepts are equivalent. A new estimator of the long memory parameter in these models is proposed, based on the simple linear regression of the log periodogram on a deterministic regressor. The estimator is the ordinary least squares estimator of the slope parameter in this regression, formed using only the lowest frequency ordinates of the log periodogram. Its asymptotic distribution is derived, from which it is evident that the conventional interpretation of these least squares statistics is justified in large samples. Using synthetic data, the asymptotic theory proves to be reliable in samples of 50 observations or more. For three postwar monthly economic time series, the estimated integrated series model provides more reliable out-of-sample forecasts than do more conventional procedures.

Geweke, J. 1984, Measures of Conditional Linear Dependence and Feedback, Journal of the American Statistical Association, vol. 79, pp. 907-915.
Measures of linear dependence and feedback for two multiple time series conditional on a third are defined. The measure of conditional linear dependence is the sum of linear feedback from the first to the second conditional on the third, linear feedback from the second to the first conditional on the third, and instantaneous linear feedback between the first and second series conditional on the third. The measures are non-negative and may be expressed in terms of measures of unconditional feedback between various combinations of the three series. The measures of conditional linear feedback can be additively decomposed by frequency. Estimates of these measures are straightforward to compute, and their distribution can be routinely approximated by bootstrap methods. An empirical example involving real output, money, and interest rates is presented.

Geweke, J. & Weisbrod, B. 1984, How Does Technological Change Affect Health Care Expenditures? The Case of a New Drug, Evaluation Review, vol. 8, no. 1, pp. 75-92.
The expenditure consequences of the drug cimetidine for the period 1977-1979 are investigated. Using Medicaid data for the State of Michigan, it is found that expenditures for the first year of treatment of duodenal ulcers are reduced by between 26% and 70%. The methodology employed can be applied to the assessment of other medical technologies.

Geweke, J. 1984, The Indispensable Art of Econometrics, Journal of the American Statistical Association, vol. 79, pp. 25-26.

Geweke, J. 1984, Forecasting and Conditional Projection Using Realistic Prior Distributions, Econometric Reviews, vol. 5, no. 1, pp. 105-112.
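The Geweke and Porter-Hudak estimator described a few entries above regresses the low-frequency log periodogram on log(4 sin^2(w/2)); minus the slope estimates the long memory parameter d. A minimal sketch (the sqrt(n) bandwidth is a common convention, not the paper's prescription):

import numpy as np

def gph_estimate(x, m=None):
    n = len(x)
    m = m or int(np.sqrt(n))                 # number of low frequencies used
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    I = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2 * np.pi * n)
    regressor = np.log(4 * np.sin(freqs / 2) ** 2)
    slope = np.polyfit(regressor, np.log(I), 1)[0]
    return -slope                            # estimate of d

rng = np.random.default_rng(9)
print(gph_estimate(rng.standard_normal(2000)))   # near 0 for white noise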
Geweke, J. & Meese, R. 1984, A Comparison of Autoregressive Univariate Forecasting Procedures for Macroeconomic Time Series, Journal of Business & Economic Statistics, vol. 2, pp. 187-202.
The actual performance of several automated univariate autoregressive forecasting procedures, applied to 150 macroeconomic time series, is compared. The procedures are: the random walk model, as a basis for comparison; long autoregressions, with three alternative rules for lag length selection; and a long autoregression estimated by minimizing the sum of absolute deviations. The sensitivity of each procedure to preliminary transformations, data periodicity, forecast horizon, loss function employed in parameter estimation, and seasonal adjustment procedures is examined. The more important conclusions are that Akaike's lag-length selection criterion works well in a wide variety of situations, that the modeling of long memory components becomes important for forecast horizons of three or more periods, and that linear combinations of forecasts do not improve forecast quality appreciably.

Geweke, J., Meese, R. & Dent, W. 1983, Comparing Alternative Tests of Causality in Temporal Systems: Analytic Results and Experimental Evidence, Journal of Econometrics, vol. 21, pp. 161-194.
This paper discusses eight alternative tests of the absence of causal ordering, all of which are asymptotically valid under the null hypothesis in the sense that their limiting size is known. Their behavior under alternatives is compared analytically using the concept of approximate slope, and these results are supported by the outcomes of Monte Carlo experiments. The implications of these comparisons for applied work are unambiguous: Wald variants of a test attributed to Granger, and a lagged dependent variable version of Sims's test introduced in this paper, are equivalent in all relevant respects and are preferred to the other tests discussed.

Geweke, J. & Weisbrod, B. 1982, Clinical Evaluation vs. Economic Evaluation: The Case of a New Drug, Medical Care, vol. 20, pp. 821-830.
To evaluate a new drug or other medical innovation economically, one must assess both the changes in costs and in benefits. Safety and efficacy matter, but so do resource costs and social benefits. This paper evaluates the effects on expenditures of the recent introduction of cimetidine, a drug used in the prevention and treatment of duodenal ulcers. This evaluation is of interest in its own right and also as a "guide" for studying similar effects of other innovations. State Medicaid records are used to test the effects of this new medical innovation on hospitalization and aggregate medical care expenditures. After controlling to the extent possible for potential selection bias, we find that usage of cimetidine is associated with a lower level of medical care expenditures and fewer days of hospitalization per patient for those duodenal ulcer patients who had zero health care expenditures and zero days of hospitalization during the presample period; an annual cost saving of some $320 (20 per cent) per patient is indicated.
Further analysis disclosed, however, that this saving was lower for patients with somewhat higher levels of health care expenditures and hospitalization in the presample period, and to some extent was reversed for the patients whose prior year's medical care expenditures and hospitalization were highest.

Geweke, J., Parzen, E., Pierce, D., Wei, W. & Zellner, A. 1982, The Measurement of Linear Dependence and Feedback Between Multiple Time Series, Journal of the American Statistical Association, vol. 77, pp. 304-324.
Measures of linear dependence and feedback for multiple time series are defined. The measure of linear dependence is the sum of the measure of linear feedback from the first series to the second, linear feedback from the second to the first, and instantaneous linear feedback. The measures are nonnegative, and zero only when feedback (causality) of the relevant type is absent. The measures of linear feedback from one series to another can be additively decomposed by frequency.

Geweke, J. & Singleton, K. 1981, Latent Variable Models for Time Series: A Frequency Domain Approach with an Application to the Permanent Income Hypothesis, Journal of Econometrics, vol. 17, no. 3, pp. 287-304.
The theory of estimation and inference in a very general class of latent variable models for time series is developed by showing that the distribution theory for the finite Fourier transform of the observable variables in latent variable models for time series is isomorphic to that for the observable variables themselves in classical latent variable models. This implies that analytic work on classical latent variable models can be adapted to latent variable models for time series, an implication which is illustrated here in the context of a general canonical form. To provide an empirical example, a latent variable model for permanent income is developed, its parameters are shown to be identified, and a variety of restrictions on these parameters implied by the permanent income hypothesis are tested.

Geweke, J. 1981, The Approximate Slopes of Econometric Tests, Econometrica, vol. 49, no. 6, pp. 1427-1442.
In this paper the concept of approximate slope, introduced by R.R. Bahadur, is used to make asymptotic global power comparisons of econometric tests. The approximate slope of a test is the rate at which the logarithm of the asymptotic marginal significance level of the test decreases as sample size increases, under a given alternative. A test with greater approximate slope may therefore be expected to reject the null hypothesis more frequently under that alternative than one with smaller approximate slope. Two theorems, which facilitate the computation and interpretation of the approximate slopes of most econometric tests, are established. These results are used to undertake some illustrative comparisons. Sampling experiments and an empirical illustration suggest that the comparison of approximate slopes may provide an adequate basis for evaluating the actual performance of alternative tests of the same hypothesis.
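The 1982 measurement-of-feedback entry above defines the feedback from x to y as the log ratio of innovation variances from autoregressions that exclude and include lags of x. A minimal two-series sketch (the lag length and simulated data are mine):

import numpy as np

def lagmat(v, p, T):
    # columns v_{t-1}, ..., v_{t-p} for t = p, ..., T-1
    return np.column_stack([v[p - k:T - k] for k in range(1, p + 1)])

def feedback_measure(y, x, p=4):
    T, yt = len(y), y[p:]
    Zr = np.column_stack([np.ones(T - p), lagmat(y, p, T)])   # own lags only
    Zu = np.column_stack([Zr, lagmat(x, p, T)])               # add lags of x
    rss = lambda Z: np.sum((yt - Z @ np.linalg.lstsq(Z, yt, rcond=None)[0]) ** 2)
    return np.log(rss(Zr) / rss(Zu))          # F_{x -> y}, nonnegative

rng = np.random.default_rng(10)
x = rng.standard_normal(1000)
y = 0.8 * np.roll(x, 1) + rng.standard_normal(1000)
y[0] = rng.standard_normal()                  # discard the wrap-around value
print(feedback_measure(y, x))                 # clearly positive
print(feedback_measure(x, y))                 # near zero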
Geweke, J. & Singleton, K. 1981, Latent Variable Models for Time Series: A Frequency Domain Approach with an Application to the Permanent Income Hypothesis, Journal of Econometrics, vol. 17, no. 3, pp. 287-304. Abstract: The theory of estimation and inference in a very general class of latent variable models for time series is developed by showing that the distribution theory for the finite Fourier transform of the observable variables in latent variable models for time series is isomorphic to that for the observable variables themselves in classical latent variable models. This implies that analytic work on classical latent variable models can be adapted to latent variable models for time series, an implication which is illustrated here in the context of a general canonical form. To provide an empirical example, a latent variable model for permanent income is developed, its parameters are shown to be identified, and a variety of restrictions on these parameters implied by the permanent income hypothesis are tested.

Geweke, J. 1981, The Approximate Slopes of Econometric Tests, Econometrica, vol. 49, no. 6, pp. 1427-1442. Abstract: In this paper the concept of approximate slope, introduced by R. R. Bahadur, is used to make asymptotic global power comparisons of econometric tests. The approximate slope of a test is the rate at which the logarithm of the asymptotic marginal significance level of the test decreases as sample size increases, under a given alternative. A test with greater approximate slope may therefore be expected to reject the null hypothesis more frequently under that alternative than one with smaller approximate slope. Two theorems, which facilitate the computation and interpretation of the approximate slopes of most econometric tests, are established. These results are used to undertake some illustrative comparisons. Sampling experiments and an empirical illustration suggest that the comparison of approximate slopes may provide an adequate basis for evaluating the actual performance of alternative tests of the same hypothesis.

Geweke, J. & Meese, R. 1981, Estimating Regression Models of Finite but Unknown Order, International Economic Review, vol. 22, no. 1, pp. 54-70. Examines problems associated with estimating the normal linear regression model of finite but unknown order through a sequence of nested alternatives: estimation criteria for model selection, derivation of numerical bounds on the finite-sample distribution, and the relation of the proposed estimation criterion functions to other estimation criterion functions.

Geweke, J. & Singleton, K. 1981, Maximum Likelihood Confirmatory Factor Analysis of Economic Time Series, International Economic Review, vol. 22, no. 1, pp. 37-54. Explains the theory of identification, estimation and inference in the dynamic confirmatory factor model for economic time series: derivation of the frequency-domain representation of the model, illustration of the nature of the identification problem for the dynamic confirmatory model, and a dynamic confirmatory model of the business cycle motivated by Lucas's theory of aggregate activity.

Geweke, J. 1981, A Comparison of Tests of the Independence of Two Covariance Stationary Time Series, Journal of the American Statistical Association, vol. 76, no. 374, pp. 363-373. Abstract: The approximate slopes of several tests of the independence of two covariance stationary time series are derived and compared. It is shown that the approximate slopes of regression tests are at least as great as those based on the residuals of univariate ARIMA models, and that there are cases in which the former are arbitrarily great while the latter are arbitrarily small. These analytical findings are supported by a Monte Carlo study which shows that in samples of size 100 and 250 the asymptotic distribution theory under the null hypothesis is adequate for all tests, but under alternatives to the null hypothesis the rate of Type II error for the test based on ARIMA model residuals is often more than double that of the regression tests.

Geweke, J. & Singleton, K. 1980, Interpreting the Likelihood Ratio Statistic in Factor Models When Sample Size is Small, Journal of the American Statistical Association, vol. 75, no. 369, pp. 133-137. Abstract: The use of the likelihood ratio statistic in testing the goodness of fit of the exploratory factor model has no formal justification when, as is often the case in practice, the usual regularity conditions are not met. In a Monte Carlo experiment it is found that the asymptotic theory seems to be appropriate when the regularity conditions obtain and sample size is at least 30. When the regularity conditions are not satisfied, the asymptotic theory seems to be misleading in all sample sizes considered.

Geweke, J. & Feige, E. 1979, Some Joint Tests of the Efficiency of Markets for Forward Foreign Exchange, Review of Economics and Statistics, vol. 61, no. 3, pp. 334-341.
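One simple member of this family of efficiency tests regresses the realized spot rate on the current forward rate and tests the joint "unbiasedness" restriction that the intercept is zero and the slope is one. The sketch below uses that textbook regression with a Wald statistic on simulated data; it is a stand-in for the flavor of the tests, not necessarily the authors' exact procedure.

```python
# Wald test of forward-rate unbiasedness: regress s_{t+1} on f_t and test
# (intercept, slope) = (0, 1).  Data are simulated under the null; with real
# data, s and f would be log spot and log forward rates (an assumption here).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
f = np.cumsum(rng.standard_normal(300)) * 0.01   # simulated log forward rate
s_next = f + rng.standard_normal(300) * 0.02     # spot follows forward (H0 true)

X = np.column_stack([np.ones_like(f), f])
beta, *_ = np.linalg.lstsq(X, s_next, rcond=None)
e = s_next - X @ beta
sigma2 = e @ e / (len(f) - 2)
V = sigma2 * np.linalg.inv(X.T @ X)              # OLS covariance of beta

r = beta - np.array([0.0, 1.0])                  # deviation from H0
W = r @ np.linalg.inv(V) @ r                     # Wald ~ chi2(2) under H0
print(W, stats.chi2.sf(W, df=2))                 # statistic and p-value
```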
Geweke, J. 1978, Testing the Exogeneity Specification in the Complete Dynamic Simultaneous Equation Model, Journal of Econometrics, vol. 7, no. 2, pp. 163-185. Abstract: It is shown that in the complete dynamic simultaneous equation model exogenous variables cause endogenous variables in the sense of Granger (1969) and satisfy the criterion of econometric exogeneity discussed by Sims (1977a), but that the stationarity assumptions invoked by Granger and Sims are not necessary for this implication. Inference procedures for testing each implication are presented and a new joint test of both implications is derived. Detailed attention is given to estimation and testing when the error vector of the final form of the complete dynamic simultaneous equation model is both singular and serially correlated. The theoretical points of the paper are illustrated by testing the exogeneity specification in a small macroeconometric model.

Geweke, J. 1978, Temporal Aggregation in the Multiple Regression Model, Econometrica, vol. 46, no. 3, pp. 643-661. Abstract: The regression relation between regularly sampled Y(t) and X_1(t), ..., X_N(t) implied by an underlying model in which time enters more generally is studied. The underlying model includes continuous distributed lags, discrete models, and stochastic differential equations as special cases. The relation between parameters identified by regular samplings of Y and the X_j and those of the underlying model is characterized. Sufficient conditions for identification of the underlying model in the limit as disaggregation over time proceeds are set forth. Empirical evidence presented suggests that important gains can be realized from temporal disaggregation in the range of conventional measurement frequencies for macroeconomic data.

Geweke, J. 1976, review of A Monetarist Model of Inflationary Expectations by John Rutledge (D.C. Heath, Lexington, Massachusetts, 1974, pp. xv+115, $12.50), vol. 2, no. 1, pp. 125-127.

Geweke, J. 1975, A Monetarist Model of Inflationary Expectations, Journal of Monetary Economics.

Geweke, J., Durham, G. & Xu, H., Bayesian Inference for Logistic Regression Models Using Sequential Posterior Simulation. The logistic specification has been used extensively in non-Bayesian statistics to model the dependence of discrete outcomes on the values of specified covariates. Because the likelihood function is globally weakly concave, estimation by maximum likelihood is generally straightforward even in commonly arising applications with scores or hundreds of parameters. In contrast, Bayesian inference has proven awkward, requiring normal approximations to the likelihood or specialized adaptations of existing Markov chain Monte Carlo and data augmentation methods. This paper approaches Bayesian inference in logistic models using recently developed generic sequential posterior simulation (SPS) methods that require little more than the ability to evaluate the likelihood function. Compared with existing alternatives SPS is much simpler, and provides numerical standard errors and accurate approximations of marginal likelihoods as by-products. The SPS algorithm for Bayesian inference is amenable to massively parallel implementation, and when implemented using graphics processing units it is more efficient than existing alternatives. The paper demonstrates these points by means of several examples.
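The central claim, that sequential posterior simulation needs little more than likelihood evaluations, can be conveyed with a deliberately crude likelihood-tempered particle sampler for a logistic model. The prior, tempering schedule, and random-walk moves below are all illustrative assumptions; this is a toy sketch of the idea, not the authors' SPS algorithm.

```python
# Toy tempered-SMC sketch for Bayesian logistic regression: particles from
# the prior are moved through posteriors proportional to prior * L(b)^lam as
# lam rises from 0 to 1, using only log-likelihood evaluations.
import numpy as np

rng = np.random.default_rng(2)
n, d, N = 200, 3, 1000                       # observations, params, particles
X = rng.standard_normal((n, d))
b_true = np.array([1.0, -2.0, 0.5])
y = (rng.random(n) < 1 / (1 + np.exp(-X @ b_true))).astype(float)

def loglik(B):                               # log-likelihood for each particle row
    Z = X @ B.T                              # shape (n, N)
    return (y[:, None] * Z - np.logaddexp(0, Z)).sum(axis=0)

B = rng.normal(0, 5, (N, d))                 # draws from an assumed N(0, 25 I) prior
lams = np.linspace(0.0, 1.0, 11)             # tempering schedule 0 -> 1
for lam_prev, lam in zip(lams[:-1], lams[1:]):
    ll = loglik(B)
    w = np.exp((lam - lam_prev) * (ll - ll.max()))
    B = B[rng.choice(N, N, p=w / w.sum())]   # reweight and resample
    lp = loglik(B) * lam - (B ** 2).sum(axis=1) / 50
    for _ in range(5):                       # random-walk Metropolis moves
        Bp = B + 0.2 * rng.standard_normal(B.shape)
        lpp = loglik(Bp) * lam - (Bp ** 2).sum(axis=1) / 50
        acc = np.log(rng.random(N)) < lpp - lp
        B[acc], lp[acc] = Bp[acc], lpp[acc]
print(B.mean(axis=0), b_true)                # posterior mean roughly tracks truth
```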
Geweke, J., Gowrisankaran, G. & Town, R. J., Inferring Hospital Quality from Patient Discharge Records Using a Bayesian Selection Model. This paper develops new econometric methods to estimate hospital quality and other models with discrete dependent variables and non-random selection. Mortality rates in patient discharge records are widely used to infer hospital quality. However, hospital admission is not random and some hospitals may attract patients with greater unobserved severity of illness than others. In this situation the assumption of random admission leads to spurious inference about hospital quality. This study controls for hospital selection using a model in which the distances between a patient's residence and alternative hospitals are key exogenous variables. Bayesian inference in this model is feasible using a Markov chain Monte Carlo posterior simulator, and attaches posterior probabilities to quality comparisons between individual hospitals and groups of hospitals. The study uses data on 77,937 Medicare patients admitted to 117 hospitals in Los Angeles County from 1989 through 1992 with a diagnosis of pneumonia. It finds higher quality in smaller hospitals than in larger ones, and in private for-profit hospitals than in hospitals in other ownership categories. Variation in unobserved severity of illness across hospitals is at least as great as variation in hospital quality. Consequently a conventional probit model leads to inferences about quality markedly different from those in this study's selection model.

Bateman, H., Eckert, C., Geweke, J., Louviere, J. J., Satchell, S. & Thorp, S. J. 2011, Financial Competence, Risk Presentation and Retirement Portfolio Preferences.

Bateman, H., Eckert, C., Geweke, J., Louviere, J. J., Satchell, S. E. & Thorp, S. J. 2011, Financial Competence and Expectations Formation: Evidence from Australia.

Geweke, J. 2010, Comment: Convergence Properties of the Likelihood of Computed Dynamic Models. There are technical errors in this article (Econometrica, January 2006) that are important, simple and correctable. The corrections substantially alter the article's conclusions.

Berg, J. E., Geweke, J. & Rietz, T. A. 2010, Memoirs of an Indifferent Trader: Estimating Forecast Distributions from Prediction Markets. Prediction markets for future events are increasingly common and they often trade several contracts for the same event. This paper considers the distribution held by a normative risk-neutral trader who, given any portfolio of contracts traded on the event, would choose not to reallocate that portfolio even if transaction costs were zero. Because common parametric distributions can conflict with observed prediction market prices, the distribution is given a nonparametric representation together with a prior distribution favoring smooth and concentrated distributions. Posterior modal distributions are found for popular vote shares of the U.S. presidential candidates in the 100 days leading up to the elections of 1992, 1996, 2000, and 2004, using bid and ask prices on multiple contracts from the Iowa Electronic Markets. On some days, the distributions are multimodal or substantially asymmetric. The derived distributions are more concentrated than the historical distribution of popular vote shares in presidential elections, but do not tend to become more concentrated as time to the election diminishes.
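The recovery problem behind that paper can be caricatured in a few lines: given prices of contracts that pay off when the vote share lands in given intervals, find a smooth probability distribution on a grid consistent with those prices. The sketch below uses a penalized least-squares fit with hypothetical midpoint prices; the grid, roughness penalty, and prices are illustrative stand-ins for the paper's formal nonparametric prior and bid/ask treatment.

```python
# Recover a smooth distribution over vote shares from interval-contract
# prices.  Grid, bins, prices, and the penalty weight are all hypothetical.
import numpy as np
from scipy.optimize import minimize

grid = np.linspace(0.30, 0.70, 81)                   # vote-share grid
bins = [(0.30, 0.45), (0.45, 0.50), (0.50, 0.55), (0.55, 0.70)]
prices = np.array([0.15, 0.30, 0.40, 0.15])          # hypothetical contract prices
A = np.array([[(lo <= g < hi) for g in grid] for lo, hi in bins], float)

def objective(theta, alpha=5.0):
    p = np.exp(theta - theta.max())
    p /= p.sum()                                     # softmax => valid distribution
    fit = np.sum((A @ p - prices) ** 2)              # match contract prices
    rough = np.sum(np.diff(theta, n=2) ** 2)         # favor smooth shapes
    return fit + alpha * rough / len(theta)

res = minimize(objective, np.zeros(len(grid)), method="BFGS")
p_hat = np.exp(res.x - res.x.max())
p_hat /= p_hat.sum()
print(grid[np.argmax(p_hat)])                        # modal vote share
```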
Geweke, J., Ackerberg, D. & Hahn, J. 2010, Comments on Convergence Properties of the Likelihood of Computed Dynamic Models.

Geweke, J. 2010, Bayesian and Non-Bayesian Analysis of the Seemingly Unrelated Regression Model with Student-t Errors and Its Application to Forecasting: Comment.

Bateman, H., Eckert, C., Geweke, J., Louviere, J. J., Satchell, S. & Thorp, S. J. 2010, Economic Rationality, Risk Presentation, and Retirement Portfolio Choice.

Geweke, J., Horowitz, J. & Pesaran, M. H. 2006, Econometrics: A Bird's Eye View. As a unified discipline, econometrics is still relatively young and has been transforming and expanding very rapidly over the past few decades. Major advances have taken place in the analysis of cross-sectional data by means of semi-parametric and non-parametric techniques. Heterogeneity of economic relations across individuals, firms and industries is increasingly acknowledged, and attempts have been made to take it into account either by integrating out its effects or by modeling the sources of heterogeneity when suitable panel data exist. The counterfactual considerations that underlie policy analysis and treatment evaluation have been given a more satisfactory foundation. New time series econometric techniques have been developed and employed extensively in the areas of macroeconometrics and finance. Non-linear econometric techniques are used increasingly in the analysis of cross section and time series observations. Applications of Bayesian techniques to econometric problems have been given new impetus largely thanks to advances in computer power and computational techniques. The use of Bayesian techniques has in turn provided investigators with a unifying framework where the tasks of forecasting, decision making, model evaluation and learning can be considered as parts of the same interactive and iterative process, thus paving the way for establishing the foundation of real-time econometrics. This paper attempts to provide an overview of some of these developments.

Geweke, J., Gowrisankaran, G. & Town, R. J. 2002, Bayesian Inference for Hospital Quality in a Selection Model. This paper develops new econometric methods to infer hospital quality in a model with discrete dependent variables and non-random selection. Mortality rates in patient discharge records are widely used to infer hospital quality. However, hospital admission is not random and some hospitals may attract patients with greater unobserved severity of illness than others. In this situation the assumption of random admission leads to spurious inference about hospital quality. This study controls for hospital selection using a model in which the distances between a patient's residence and alternative hospitals are key exogenous variables. Bayesian inference in this model is feasible using a Markov chain Monte Carlo posterior simulator, and attaches posterior probabilities to quality comparisons between individual hospitals and groups of hospitals. The study uses data on 74,848 Medicare patients admitted to 114 hospitals in Los Angeles County from 1989 through 1992 with a diagnosis of pneumonia. It finds the smallest and largest hospitals to be of the highest quality. There is strong evidence of dependence between the unobserved severity of illness and the assignment of patients to hospitals, whereby patients with a high unobserved severity of illness are disproportionately admitted to high-quality hospitals. Consequently a conventional probit model leads to inferences about quality markedly different from those in this study's selection model.
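The selection problem at the heart of the hospital-quality papers is easy to demonstrate by simulation: when sicker patients disproportionately sort into a hospital, its raw mortality rate can exceed that of a genuinely worse hospital. The toy simulation below (all parameters are illustrative, and it implements none of the paper's Bayesian machinery) shows exactly the spurious inference that a selection model corrects.

```python
# Simulate non-random admission: hospital A is truly better, but admits
# sicker patients, so its raw mortality rate looks worse than B's.
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
severity = rng.standard_normal(n)                  # unobserved severity of illness
# Sicker patients are more likely to choose hospital A (the better one).
goes_to_A = rng.random(n) < 1 / (1 + np.exp(-1.5 * severity))
quality_effect = np.where(goes_to_A, -0.5, 0.0)    # A lowers the mortality index
death = rng.random(n) < 1 / (1 + np.exp(-(-2.0 + severity + quality_effect)))

for name, mask in [("A (high quality)", goes_to_A), ("B (low quality)", ~goes_to_A)]:
    print(name, round(death[mask].mean(), 4))
# A's raw mortality rate comes out higher, despite its true quality advantage,
# because its patient pool is sicker: inference assuming random admission fails.
```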
Geweke, J. F. 1998, Using Simulation Methods for Bayesian Econometric Models: Inference, Development, and Communication. This paper surveys the fundamental principles of subjective Bayesian inference in econometrics and the implementation of those principles using posterior simulation methods. The emphasis is on the combination of models and the development of predictive distributions. Moving beyond conditioning on a fixed number of completely specified models, the paper introduces subjective Bayesian tools for formal comparison of these models with as yet incompletely specified models. The paper then shows how posterior simulators can facilitate communication between investigators (for example, econometricians) on the one hand and remote clients (for example, decision makers) on the other, enabling clients to vary the prior distributions and functions of interest employed by investigators. A theme of the paper is the practicality of subjective Bayesian methods. To this end, the paper describes publicly available software for Bayesian inference, model development, and communication and provides illustrations using two simple econometric models.

Geweke, J. F., Keane, M. P. & Runkle, D. E. 1994, Alternative Computational Approaches to Inference in the Multinomial Probit Model. This research compares several approaches to inference in the multinomial probit model, based on Monte Carlo results for a seven-choice model. The experiment compares the simulated maximum likelihood estimator using the GHK recursive probability simulator, the method of simulated moments estimator using the GHK recursive simulator and kernel-smoothed frequency simulators, and posterior means using a Gibbs sampling-data augmentation algorithm. Each estimator is applied in nine different models, which have from 1 to 40 free parameters. The performance of all estimators is found to be satisfactory. However, the results indicate that the method of simulated moments estimator with the kernel-smoothed frequency simulator does not perform quite as well as the other three methods. Among those three, the Gibbs sampling-data augmentation algorithm appears to have a slight overall edge, with the relative performance of MSM and SML based on the GHK simulator difficult to determine.
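The GHK recursive probability simulator used by the SML and MSM estimators in that comparison is compact enough to sketch in full: it estimates an orthant probability for correlated normals by drawing truncated normals one coordinate at a time along the Cholesky factor and averaging the implied weights. The covariance matrix and bounds below are illustrative.

```python
# GHK simulator for P(Z < b), Z ~ N(0, Sigma): recursive truncated-normal
# draws along the Cholesky factor, with the product of one-dimensional
# normal CDFs as the importance weight.
import numpy as np
from scipy import stats

def ghk(b, Sigma, R=10_000, rng=np.random.default_rng(4)):
    L = np.linalg.cholesky(Sigma)
    m = len(b)
    eta = np.zeros((R, m))
    weight = np.ones(R)
    for j in range(m):
        upper = (b[j] - eta[:, :j] @ L[j, :j]) / L[j, j]
        Phi = stats.norm.cdf(upper)
        weight *= Phi                    # mass of the feasible slice at step j
        u = rng.random(R) * Phi          # draw eta_j truncated to (-inf, upper)
        eta[:, j] = stats.norm.ppf(u)
    return weight.mean()

Sigma = np.array([[1.0, 0.5, 0.3],
                  [0.5, 1.0, 0.4],
                  [0.3, 0.4, 1.0]])
b = np.array([0.0, 0.5, 1.0])
print(ghk(b, Sigma))                                  # GHK estimate
print(stats.multivariate_normal(cov=Sigma).cdf(b))    # numerical check
```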
Geweke, J. & Keane, M., An Empirical Analysis of Income Dynamics among Men in the PSID: 1968-1989. This study uses data from the Panel Study of Income Dynamics (PSID) to address a number of questions about life-cycle earnings mobility. It develops a dynamic reduced-form model of earnings and marital status that is nonstationary over the life cycle. A Gibbs sampling-data augmentation algorithm facilitates use of the entire sample and provides numerical approximations to the exact posterior distribution of properties of earnings paths. This algorithm copes with the complex distribution of endogenous variables that are observed for short segments of an individual's work history, not including the initial period. The study reaches several firm conclusions about life-cycle earnings mobility. Incorporating non-Gaussian shocks makes it possible to account for transitions between low and higher earnings states, a heretofore unresolved problem. The non-Gaussian distribution substantially increases the lifetime return to postsecondary education, and substantially reduces differences in lifetime wages attributable to race. In a given year, the majority of variance in earnings not accounted for by race, education, and age is due to transitory shocks, but over a lifetime the majority is due to unobserved individual heterogeneity. Consequently, low earnings at early ages are strong predictors of low earnings later in life, even conditioning on observed individual characteristics.

Geweke, J., Computational Experiments and Reality. A common practice in macroeconomics is to assess the validity of general equilibrium models by first deriving their implications for population moments and then comparing population moments with observed sample moments. Generally the population moments are not explicit functions of model parameters, and so computational experiments are used to establish the link between parameters and moments. In most cases the general equilibrium models are intended to describe certain population moments (for example, means) but not others (for example, variances). The comparison of population moments with observed sample moments is informal, a process that has been termed calibration by some economists and ocular econometrics by others. This paper provides a formal probability framework within which this approach to inference can be studied. There are two principal results. First, if general equilibrium models are taken as predictive for sample moments, then the formal econometrics of model evaluation and comparison are straightforward. The fact that the models describe only a subset of moments presents no obstacles, and the formal econometrics yield as a byproduct substantial insights into the workings of the models. Second, if general equilibrium models are taken to establish implications for population moments but not sample moments, then there is no link to reality because population moments are unobserved. Under this assumption, atheoretical macroeconomic models that link population and sample moments can be introduced coherently into the formal econometrics of model evaluation and comparison. The result is a framework that unifies general equilibrium models (theory without measurement) and atheoretical econometrics (measurement without theory). The paper illustrates these points using some models of the equity premium.
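The first result, that a model taken as predictive for sample moments admits straightforward formal evaluation, can be illustrated with a simple predictive check: simulate the sample moment under the model many times and locate the observed value in that distribution. The AR(1) "model," the moment, and the observed value below are illustrative assumptions, not the paper's equity-premium application.

```python
# Predictive check: does the model's implied distribution of a sample moment
# (here, the sample variance of a simulated AR(1) series) cover the observed
# moment?  All parameters and the observed value are hypothetical.
import numpy as np

rng = np.random.default_rng(5)
T, reps, rho = 200, 5000, 0.9

def simulated_sample_variance():
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = rho * y[t - 1] + rng.standard_normal()
    return y.var()

draws = np.array([simulated_sample_variance() for _ in range(reps)])
observed = 7.2                                 # hypothetical data moment
# Two-sided predictive p-value: small values flag the moment as implausible.
p = 2 * min((draws >= observed).mean(), (draws <= observed).mean())
print(p)
```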
Geweke, J. & Amisano, G., Hierarchical Markov Normal Mixture Models with Applications to Financial Asset Returns. With the aim of constructing predictive distributions for daily returns, we introduce a new Markov normal mixture model in which the components are themselves normal mixtures. We derive the restrictions on the autocovariances and linear representation of integer powers of the time series in terms of the number of components in the mixture and the roots of the Markov process. We use the model's prior predictive distribution to study its implications for some interesting functions of returns. We apply the model to construct predictive distributions of daily S&P 500 returns, dollar/pound returns, and one- and ten-year bonds. We compare the performance of the model with ARCH and stochastic volatility models using predictive likelihoods. The model's performance is about the same as its competitors' for the bond returns, better than its competitors' for the S&P 500 returns, and much better for the dollar/pound returns. Validation exercises identify some potential improvements. JEL Classification: C53, G12, C11, C14.
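A stripped-down relative of this model conveys how such predictive likelihoods are computed: a two-state Markov mixture of normals, filtered recursively, yields a one-step-ahead predictive density whose log is accumulated over the sample. The states, parameters, and simulated returns below are illustrative, and the sketch omits the paper's hierarchical layer in which each component is itself a normal mixture.

```python
# Two-state Markov mixture of normals with the standard filtering recursion;
# the accumulated log predictive density is the predictive log-likelihood.
import numpy as np
from scipy import stats

P = np.array([[0.98, 0.02],                        # state transition matrix
              [0.05, 0.95]])
mu, sigma = np.array([0.0005, -0.001]), np.array([0.006, 0.02])

rng = np.random.default_rng(6)
s, r = 0, []
for _ in range(1000):                              # simulate regime-switching returns
    s = rng.choice(2, p=P[s])
    r.append(rng.normal(mu[s], sigma[s]))
r = np.array(r)

pi = np.array([0.5, 0.5])                          # filtered state probabilities
logscore = 0.0
for rt in r:
    pred = pi @ P                                  # one-step-ahead state probs
    dens = stats.norm.pdf(rt, mu, sigma)           # density of r_t in each state
    mix = pred @ dens                              # predictive density of r_t
    logscore += np.log(mix)
    pi = pred * dens / mix                         # filtering update
print(logscore)                                    # predictive log-likelihood
```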
Geweke, J., Gowrisankaran, G. & Town, R. J., Bayesian Inference for Hospital Quality in a Selection Model. This paper develops new econometric methods to infer hospital quality in a model with discrete dependent variables and non-random selection. Mortality rates in patient discharge records are widely used to infer hospital quality. However, hospital admission is not random and some hospitals may attract patients with greater unobserved severity of illness than others. In this situation the assumption of random admission leads to spurious inference about hospital quality. This study controls for hospital selection using a model in which the distances between a patient's residence and alternative hospitals are key exogenous variables. Bayesian inference in this model is feasible using a Markov chain Monte Carlo posterior simulator, and attaches posterior probabilities to quality comparisons between individual hospitals and groups of hospitals. The study uses data on 74,848 Medicare patients admitted to 114 hospitals in Los Angeles County from 1989 through 1992 with a diagnosis of pneumonia. It finds the smallest and largest hospitals to be of high quality and public hospitals to be of low quality. There is strong evidence of dependence between the unobserved severity of illness and the assignment of patients to hospitals. Consequently a conventional probit model leads to inferences about quality markedly different from those in this study's selection model.

Selected Peer-Assessed Projects


