Working Paper Series

Browse the categories to access academic, scientific, and opinion publications by professors and students of the Department of Economics at PUC-Rio.

Unpacking Neighborhood Effects: Experimental evidence from a large-scale housing program in Brazil

N 695, 05/05/2023

This paper investigates the impacts of neighborhoods on the economic outcomes of adults. We exploit one of the world's largest housing lottery programs and administrative data linking lottery registration, formal employment, and access to social programs in Brazil. Receiving a house has positive impacts on housing quality and reduces household expenditures but has negative effects on beneficiaries' neighborhood characteristics. On average, the program has a negative impact on the probability of being formally employed but no effects on the quality of jobs. Poorer individuals, however, experience better formal employment outcomes and lower welfare dependency. We find no differential impacts by distance to beneficiaries' previous homes or jobs. Leveraging a double-randomization design to allocate houses, we show that there are significant differences in effects across neighborhoods, and we propose a framework to estimate the relative importance of potential underlying mechanisms. Network quality, amenities, and crime play a very limited role, while labor market access explains 82-93% of the observed differences in neighborhood effects.

Carlos Alberto Belchior, Gustavo Gonzaga, Gabriel Ulyssea.


Natural disasters support authoritarian populism: Evidence from the Brazilian shrimp vote

N 694, 21/10/2022

We investigate the effects of extreme weather events on voicing political opposition against authoritarian regimes, using the Brazilian general elections of 1982 as a case study. At the time, Brazil was under a military dictatorship that promoted local and gubernatorial elections to validate its authoritarian power. This context provides a positive measure of protest, coined the "shrimp vote". Moreover, during the elections, the country's northeastern region was facing a long-lasting drought that had started in 1979. Using data from meteorological ground stations to compute a measure of drought severity that takes rainfall and evaporation into account, we estimate the effects of the drought on the voting behavior of individuals in this region. Our findings suggest a negative causal effect of adverse weather shocks on the share of protest votes. Specifically, a water deficit one standard deviation above the historical average reduces the share of the shrimp vote by 2.5%. We also test for heterogeneity across factors such as relief transfers, clientelism, social vulnerability, and economic vulnerability. We only find heterogeneous effects for economic vulnerability: municipalities whose economies depended less on weather-resistant crops featured stronger declines in protest in response to drought severity.

Diogo Baerlocher, Renata Caldas, Rodrigo Schneider, Francisco de Lima Cavalcanti.


Monetary Policy and Liquidity Management With an Endogenous Interbank Network

N 693, 17/10/2022

This paper studies a central bank’s optimal interest rate corridor choice in the presence of an endogenous interbank network. We first provide a characterization of the unique equilibrium of banks’ liquidity holdings for any network of credit lines. Then, we endogenize the network and show that every equilibrium network is a complete core-periphery graph. Central banks face the following trade-off. A narrower corridor implies more precise targeting of the interbank rate. But, when accounting for banks’ endogenous linking decisions, a narrower corridor may lead to a sparser interbank network with higher aggregate liquidity holdings. This incurs an implicit cost, since these funds could instead be invested in the more productive illiquid asset. We solve for the optimal corridor width and provide a comparative statics analysis.

Luiz Guilherme Carpizo Costa, Timo Hiller.


Fiscal Space in an Era of Central Bank Activism

N 692, 26/01/2022

Central banks’ liabilities are still often excluded from debt sustainability analyses, despite the enormous expansions in central banks’ balance sheets that we have witnessed in recent years. In this paper, we construct a dataset that consolidates general government and central bank balance sheets and argue that this metric allows for fairer comparisons across countries. The findings highlight the increasingly important role played by central banks in managing and altering the profile of privately held sovereign debt. In addition, they shed light on the impact of FX reserves accumulation and QE on reducing debt maturity, which cannot be captured by traditional general government debt metrics.

Guido Maia da Cunha, Márcio Gomes Pinto Garcia, Pedro Maia da Cunha.


International Macroeconomic Vulnerability

N 691, 25/01/2022

We propose and implement an index of macroeconomic vulnerability to foreign shocks based on a structural time-varying Bayesian VAR with a block-exogeneity hypothesis for a given pair of a large economy and a small open economy. The index is based on the sum of the responses of the small open economy to shocks in the large economy over time, thus allowing us to disentangle and measure the source of the shock, the impact variables, and the duration of impact. Our approach not only sheds light on vulnerability across countries and over time, but can also be used to elucidate previously unanswered channels. We provide an application of this approach to a global-banks framework, allowing us to measure some as-yet-unmeasured theoretical mechanisms. Using a sample of developed and developing countries, we find that global banks do not increase the macroeconomic vulnerability of a country.

João Pedro Cavaleiro dos Reis Velloso, Márcio Gomes Pinto Garcia, Diogo Abry Guillén, Bernardo Silva de Carvalho Ribeiro.


Ignorance is bliss: voter education and alignment in distributive politics

N 690, 20/09/2021

Central-government politicians channel resources to sub-national entities for political gains. We show formally that the central politicians' allocation decision has two drivers: political alignment (between central and local politicians) and the level of local political accountability. However, the drivers matter one at a time: alignment matters before local elections, while local political accountability matters before central elections. We then test our model using Brazilian data, which corroborates our results. Furthermore, we show and explain why political accountability becomes a curse: better-educated districts receive fewer transfers in equilibrium.

Federico Boffa, Amedeo Piolatto, Francisco de Lima Cavalcanti.


Persistent Monetary Non-neutrality in an Estimated Menu-Cost Model with Partially Costly Information

N 688, 05/09/2021

We propose a model that reconciles microeconomic evidence of frequent and large price changes with sizable monetary non-neutrality. Firms incur separate lump-sum costs to change prices and to gather and process some information about marginal costs. Additional relevant information is continuously available, and can be factored into pricing decisions at no cost. We estimate the model by Simulated Method of Moments, using price-setting statistics for the U.S. economy. The model with free idiosyncratic and costly aggregate information fits well both targeted and untargeted microeconomic moments and generates more than twice as much monetary non-neutrality as the Calvo model.

Marco Bonomo, Carlos Viana de Carvalho, Rene Garcia, Vivian Malta Nunes, Rodolfo Dinis Rigato.


Taylor Rule Estimation by OLS

N 686, 02/09/2021

Ordinary Least Squares (OLS) estimation of monetary policy rules produces potentially inconsistent estimates of policy parameters. The reason is that central banks react to variables, such as inflation and the output gap, which are endogenous to monetary policy shocks. Endogeneity implies a correlation between regressors and the error term, and hence an asymptotic bias. In principle, Instrumental Variables (IV) estimation can solve this endogeneity problem. In practice, IV estimation poses challenges, as the validity of potential instruments depends on various unobserved features of the economic environment. We argue in favor of OLS estimation of monetary policy rules. To that end, we show analytically in the three-equation New Keynesian model that the asymptotic OLS bias is proportional to the fraction of the variance of regressors accounted for by monetary policy shocks. Using Monte Carlo simulation, we then show that this relationship also holds in a quantitative model of the U.S. economy. As monetary policy shocks explain only a small fraction of the variance of regressors typically included in monetary policy rules, the endogeneity bias is small. Using simulations, we show that, for realistic sample sizes, the OLS estimator of monetary policy parameters outperforms IV estimators.
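The key analytical point of the abstract above — that the asymptotic OLS bias scales with the share of regressor variance due to monetary policy shocks — can be illustrated with a stylized two-equation Monte Carlo (a minimal sketch with hypothetical coefficients, not the paper's New Keynesian model):

```python
import numpy as np

rng = np.random.default_rng(0)
phi, b, T = 1.5, 0.5, 100_000   # hypothetical Taylor coefficient, shock pass-through, sample size

def ols_slope(x, y):
    return np.cov(x, y)[0, 1] / np.var(x)

for sig_eps in (1.0, 0.1):                   # std. dev. of the monetary policy shock
    u = rng.normal(0, 1, T)                  # non-policy shock moving inflation
    eps = rng.normal(0, sig_eps, T)          # monetary policy shock
    pi = u - b * eps                         # inflation is endogenous to the policy shock
    i = phi * pi + eps                       # Taylor rule: rate reacts to inflation
    share = (b * sig_eps) ** 2 / np.var(pi)  # variance share of inflation due to eps
    print(f"policy-shock std {sig_eps}: OLS slope {ols_slope(pi, i):.3f} "
          f"(true {phi}), eps variance share {share:.3f}")
```

When policy shocks dominate the variance of inflation, the OLS slope is badly biased; when they account for only a sliver of it, OLS recovers the true coefficient almost exactly — the relationship the paper establishes analytically.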

Forthcoming in the Journal of Monetary Economics

Carlos Viana de Carvalho, Fernanda Feitosa Nechio, Tiago Santana Tristão.


Residual Based Nodewise Regression in Factor Models with Ultra-High Dimensions: Analysis of Mean-Variance Portfolio Efficiency and Estimation of Out-of-Sample and Constrained Maximum Sharpe Ratios

N 684, 22/06/2021

We provide a new theory for nodewise regression when residuals from a fitted factor model are used, and we apply our results to the analysis of the maximum Sharpe ratio when the number of assets in a portfolio is larger than its time span. We introduce a new hybrid model where factor models are combined with feasible nodewise regression. Returns are generated from an increasing number of factors plus idiosyncratic components (errors). The precision matrix of the idiosyncratic terms is assumed to be sparse, but the respective covariance matrix can be non-sparse. Since nodewise regression is not feasible due to the unknown nature of the errors, we provide, as a new method, a feasible residual-based nodewise regression to estimate the precision matrix of the errors. Next, we show that the residual-based nodewise regression provides a consistent estimate for the precision matrix of the errors. In another new development, we also show that the precision matrix of returns can be estimated consistently, even with an increasing number of factors. Benefiting from the consistency of the precision matrix estimate of returns, we show that: (1) the portfolios in high dimensions are mean-variance efficient; (2) the maximum out-of-sample Sharpe ratio estimator is consistent, and the number of assets slows the convergence up to a logarithmic factor; (3) the maximum Sharpe ratio estimator is consistent when the portfolio weights sum to one; and (4) the Sharpe ratio estimators are consistent in global minimum-variance and mean-variance portfolios.

Mehmet Caner, Marcelo Medeiros, Gabriel F. R. Vasconcelos.


Price selection

N 685, 30/05/2021

Price selection is a simple, model-free measure of selection in price setting and its contribution to inflation dynamics. It exploits comovement between inflation and the level from which adjusting prices departed. Prices that increase from lower-than-usual levels tend to push inflation above average. Using detailed micro-level consumer price data for the United Kingdom, the United States, and Canada, we find robust evidence of strong price selection across goods and services. At a disaggregate level, price selection accounts for 37% of inflation variance in the United Kingdom, 36% in the United States, and 28% in Canada. Price selection is stronger for goods with less frequent price changes or with larger average price changes. Aggregate price selection is considerably weaker. A multisector sticky-price model accounts well for this evidence and demonstrates a monotone relationship between price selection and monetary non-neutrality.

Revised in May 2021
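The comovement the measure exploits — adjusters departing from unusually low price levels in periods of above-average inflation — can be reproduced in a toy Ss-style pricing simulation (an illustrative stand-in with made-up parameters, not the authors' measure or data):

```python
import numpy as np

rng = np.random.default_rng(2)
N, T, band = 5_000, 200, 0.10            # firms, periods, adjustment band (hypothetical)
p = rng.uniform(-band, band, N)          # log price gaps relative to the desired price
infl, departed = [], []
for _ in range(T):
    p -= rng.normal(0.01, 0.02) + rng.normal(0, 0.03, N)  # costs drift up, gaps erode
    adj = np.abs(p) > band               # Ss rule: adjust when the gap is large
    departed.append(p[adj].mean())       # average level adjusting prices depart from
    infl.append(-p[adj].mean() * adj.mean())  # adjusters reset gaps to zero
    p[adj] = 0.0
print("corr(inflation, departed level):",
      round(np.corrcoef(infl, departed)[0, 1], 2))
```

Periods whose adjusters depart from lower-than-usual levels deliver above-average inflation, so the correlation comes out strongly negative — the signature of price selection.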

Accepted for publication in the Journal of Monetary Economics


Carlos Viana de Carvalho, Oleksiy Kryvtsov.


Anchored Inflation Expectations

N 689, 05/04/2021

We develop a theory of low-frequency movements in inflation expectations, and use it to interpret the joint dynamics of inflation and inflation expectations for the United States and other countries over the post-war period. In our theory, long-run inflation expectations are endogenous. They are driven by short-run inflation surprises, in a way that depends on recent forecasting performance and monetary policy. This distinguishes our theory from common explanations of the low-frequency properties of inflation. The model, estimated using only inflation and short-term forecasts from professional surveys, accurately predicts observed measures of long-term inflation expectations and identifies episodes of unanchored expectations.

Carlos Viana de Carvalho, Stefano Eusepi, Emanuel Moench, Bruce Preston.


Jumps in Stock Prices: New Insights from Old Data

N 682, 19/03/2021

We characterize jump dynamics in stock market returns using a novel series of intraday prices covering over 80 years. Jump dynamics vary substantially over time. Trends in jump activity relate to secular shifts in the nature of news. Unscheduled news, often involving major wars, drives jump activity in early decades, whereas scheduled news, especially news pertaining to monetary policy, drives jump activity in recent decades. Jump variation measures forecast excess stock market returns, consistent with theory. The results support models featuring a separate jump factor, such that risk premium dynamics are not fully captured by volatility state variables.
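A standard way to flag jump days of the sort described above is to compare realized variance with jump-robust bipower variation, in the spirit of Barndorff-Nielsen and Shephard (a textbook-style sketch on simulated 5-minute returns, not the authors' data or exact procedure):

```python
import numpy as np

rng = np.random.default_rng(4)
M = 78  # number of 5-minute returns in a trading day (hypothetical)

def jump_stat(r):
    """Relative jump measure (RV - BV) / RV: bipower variation BV is robust
    to jumps, so a large positive value flags a day whose realized variance
    RV exceeds its continuous (diffusive) part."""
    rv = np.sum(r ** 2)
    bv = (np.pi / 2) * np.sum(np.abs(r[1:]) * np.abs(r[:-1]))
    return (rv - bv) / rv

calm = rng.normal(0, 0.001, M)            # purely diffusive day
jumpy = calm.copy()
jumpy[40] += 0.02                         # inject one large price jump
print(f"calm day: {jump_stat(calm):.3f}, jump day: {jump_stat(jumpy):.3f}")
```

On the diffusive day the statistic hovers near zero, while the injected jump pushes it sharply positive, which is how jump days can be separated from ordinary volatility in long intraday samples.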

James A. Johnson, Bradley S. Paye, Marcelo Medeiros.


The Proper Use of Google Trends in Forecasting Models

N 683, 04/03/2021

It is widely known that Google Trends has become one of the most popular free tools used by forecasters, both in academia and in the private and public sectors. Many papers, from several different fields, conclude that Google Trends improves forecast accuracy. However, what seems to be widely unknown is that each sample of Google search data differs from the others, even for the same search term, dates, and location. This means that it is possible to reach arbitrary conclusions merely by chance. This paper shows why and when this can become a problem, and how to overcome this obstacle.
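The sampling issue can be mimicked offline by treating each download as the same latent search index plus independent sampling noise, and comparing conclusions across draws (a simulation sketch; the noise process is an assumption, not Google's actual sampling scheme):

```python
import numpy as np

rng = np.random.default_rng(5)
T, n_draws = 104, 30                        # weekly observations, repeated downloads (hypothetical)
latent = np.cumsum(rng.normal(0, 1, T))     # true underlying search interest
target = latent + rng.normal(0, 2, T)       # the series we want to forecast

def one_download():
    # Each "download" returns the latent index plus independent sampling
    # noise, mimicking how repeated pulls of the same term can differ.
    return latent + rng.normal(0, 3, T)

corrs = [np.corrcoef(one_download(), target)[0, 1] for _ in range(n_draws)]
avg_corr = np.corrcoef(np.mean([one_download() for _ in range(n_draws)], axis=0),
                       target)[0, 1]
print(f"single-draw correlations span [{min(corrs):.2f}, {max(corrs):.2f}]; "
      f"averaging {n_draws} draws gives {avg_corr:.2f}")
```

A conclusion based on one draw depends on which draw you happened to get; averaging many draws shrinks the sampling noise, one simple way to guard against the chance findings the paper warns about.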

Marcelo Medeiros, Henrique Fernandes Pires.


Bridging factor and sparse models

N 681, 22/02/2021

Factor and sparse models are two widely used methods to impose a low-dimensional structure in high dimensions. They are seemingly mutually exclusive. In this paper, we propose a simple lifting method that combines the merits of both in a supervised learning methodology that allows us to efficiently explore all the information in high-dimensional datasets. The method is based on a flexible linear model for panel data, called the factor-augmented regression model, with observable and latent common factors, as well as idiosyncratic components, as high-dimensional covariates. This model not only includes both factor regression and sparse regression as special cases but also significantly weakens the cross-sectional dependence and hence facilitates model selection and interpretability. The methodology consists of three steps. At each step, the remaining cross-sectional dependence can be inferred by a novel test for covariance structure in high dimensions. We develop asymptotic theory for the factor-augmented sparse regression model and demonstrate the validity of the multiplier bootstrap for testing high-dimensional covariance structure. This is further extended to testing high-dimensional partial covariance structures. The theory and methods are supported by an extensive simulation study and by applications to the construction of a partial covariance network of financial returns for the constituents of the S&P 500 index and to a prediction exercise for a large panel of macroeconomic time series from the FRED-MD database.

Jianqing Fan, Ricardo Masini, Marcelo Medeiros.


Regularized estimation of high-dimensional vector autoregressions with weakly dependent innovations

N 680, 29/12/2020

There has been considerable progress in understanding the properties of sparse regularization procedures in high-dimensional models. In the time series context, this understanding is mostly restricted to Gaussian autoregressions or mixing sequences. We study oracle properties of LASSO estimation of weakly sparse vector-autoregressive models with heavy-tailed, weakly dependent innovations, under virtually no assumption on the conditional heteroskedasticity. In contrast to the current literature, our innovation process satisfies an L1 mixingale-type condition on the centered conditional covariance matrices. This condition covers L1-NED sequences and strong mixing sequences as particular examples. From a modeling perspective, it covers several multivariate GARCH specifications, such as the BEKK model, and other factor stochastic volatility specifications that were ruled out by assumption in previous studies.
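An equation-by-equation LASSO of a sparse VAR with heavy-tailed innovations, in the spirit of the setting studied here, can be sketched with a plain coordinate-descent solver (dimensions, the penalty, and the diagonal transition matrix are all illustrative choices, not the paper's estimator):

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent LASSO: minimize 0.5*||y - X b||^2 + lam*||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_ss = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]   # partial residual
            rho = X[:, j] @ r
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_ss[j]
    return beta

# Simulate a sparse VAR(1) y_t = A y_{t-1} + e_t with Student-t innovations
rng = np.random.default_rng(3)
k, T = 10, 500
A = np.diag(np.full(k, 0.5))               # sparse truth: diagonal transition matrix
Y = np.zeros((T, k))
for t in range(1, T):
    Y[t] = Y[t - 1] @ A.T + rng.standard_t(df=5, size=k)   # heavy-tailed shocks

X, Ynext = Y[:-1], Y[1:]
A_hat = np.vstack([lasso_cd(X, Ynext[:, i], lam=100.0) for i in range(k)])
print("true nonzeros:", int((A != 0).sum()),
      " estimated nonzeros:", int((A_hat != 0).sum()))
```

The L1 penalty zeroes out most of the spurious off-diagonal coefficients while retaining the (shrunken) diagonal dynamics, even though the innovations are heavy-tailed rather than Gaussian.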

Ricardo Masini, Marcelo Medeiros, Eduardo F. Mendes.


Machine Learning Advances for Time Series Forecasting

N 679, 28/12/2020

In this paper we survey the most recent advances in supervised machine learning and high-dimensional models for time series forecasting. We consider both linear and nonlinear alternatives. Among the linear methods, we pay special attention to penalized regressions and ensembles of models. The nonlinear methods considered in the paper include shallow and deep neural networks, in their feed-forward and recurrent versions, and tree-based methods, such as random forests and boosted trees. We also consider ensemble and hybrid models that combine ingredients from different alternatives. Tests for superior predictive ability are briefly reviewed. Finally, we discuss applications of machine learning in economics and finance and provide an illustration with high-frequency financial data.

Ricardo Pereira Masini, Marcelo Medeiros, Eduardo F. Mendes.


Do We Exploit all Information for Counterfactual Analysis? Benefits of Factor Models and Idiosyncratic Correction

N 678, 03/11/2020

The measurement of treatment (intervention) effects on a single (or just a few) treated unit(s), based on counterfactuals constructed from artificial controls, has become a popular practice in applied statistics and economics since the proposal of the synthetic control method. In high-dimensional settings, principal component or (weakly) sparse regression is often used to estimate counterfactuals. But do these approaches use all of the information in the data? To better estimate the effects of price changes on sales in our case study, we propose a general framework for counterfactual analysis with high-dimensional dependent data. The framework includes both principal component regression and sparse linear regression as special cases. It uses both factor and idiosyncratic components as predictors for improved counterfactual analysis, resulting in a method called the Factor-Adjusted Regularized Method for Treatment (FarmTreat) evaluation. We demonstrate convincingly that using either factors or sparse regression alone is inadequate for counterfactual analysis in many applications, and that the case for an information gain can be made through the use of idiosyncratic components. We also develop theory and methods to formally answer the question of whether common factors are adequate for estimating counterfactuals. Furthermore, we consider a simple resampling approach to conduct inference on the treatment effect, as well as a bootstrap test to assess the relevance of the idiosyncratic components. We apply the proposed method to evaluate the effects of price changes on the sales of a set of products, based on a novel large panel of sales data from a major retail chain in Brazil, and demonstrate the benefits of using the additional idiosyncratic components in the treatment effect evaluations.

Jianqing Fan, Ricardo Pereira Masini, Marcelo Medeiros.


Targeting in Adaptive Networks

N 677, 27/10/2020

This paper studies optimal targeting policies, consisting of eliminating (preserving) a set of agents in a network and aimed at minimizing (maximizing) aggregate effort levels. Differently from the existing literature, we allow the equilibrium network to adapt after a network intervention, and we consider targeting of multiple agents. A simple and tractable adjustment process is introduced. We find that allowing the network to adapt may overturn optimal targeting results for a fixed network and that congestion/competition effects are crucial to understanding the differences between the two settings.

Timo Hiller.


A Simple Model of Network Formation with Congestion Effects

N 676, 16/10/2020

This paper provides a game-theoretic model of network formation with a continuous effort choice. Efforts are strategic complements for direct neighbors in the network and display global substitution/congestion effects. We show that if the parameter governing local strategic complements is larger than the one governing global strategic substitutes, then all pairwise Nash equilibrium networks are nested split graphs. We also consider the problem of a planner who can choose effort levels and place links according to a network cost function. Again, all socially optimal configurations are such that the network is a nested split graph. However, the socially optimal network may differ from equilibrium networks, and efficient effort levels do not coincide with Nash equilibrium effort levels. In the presence of strategic substitutes, Nash equilibrium effort levels may be too high or too low relative to efficient effort levels. Relevant applications include crime networks and R&D collaborations among firms, but also interbank lending and trade.

Timo Hiller.

