#### Foreign direct investment in Brazil, 1840-2024

TD n. 697, 16/09/2024

Marcelo de Paiva Abreu.

TD n. 696, 13/09/2024

Marcelo de Paiva Abreu.

TD n. 695, 05/05/2023

This paper investigates the impacts of neighborhoods on the economic outcomes of adults. We exploit one of the world's largest housing lottery programs and administrative data linking lottery registration, formal employment, and access to social programs in Brazil. Receiving a house has positive impacts on housing quality and reduces household expenditures but has negative effects on beneficiaries' neighborhood characteristics. On average, the program has a negative impact on the probability of being formally employed but no effects on the quality of jobs. Poorer individuals, however, experience better formal employment outcomes and lower welfare dependency. We find no differential impacts by distance to beneficiaries' previous homes or jobs. Leveraging a double-randomization design to allocate houses, we show that there are significant differences in effects across neighborhoods and we propose a framework to estimate the relative importance of potential underlying mechanisms. Network quality, amenities and crime play a very limited role, while labor market access explains 82-93% of the observed differences in neighborhood effects.

Carlos Alberto Belchior, Gustavo Gonzaga, Gabriel Ulyssea.

TD n. 694, 21/10/2022

We investigate the effects of extreme weather events on voicing political opposition against authoritarian regimes. We use the Brazilian general elections of 1982 as a case study. At the time, Brazil was under a military dictatorship that promoted local and gubernatorial elections to validate its authoritarian power. This context provides a positive measure of protest coined the "shrimp vote." Moreover, during the elections, the country's northeastern region was facing a long-lasting drought that had started in 1979. Using data from meteorological ground stations to compute a measure of drought severity that takes rainfall and evaporation into account, we estimate the effects of the drought on the voting behavior of individuals in this region. Our findings suggest a negative causal effect of adverse weather shocks on the share of protest votes. Specifically, a one-standard-deviation increase in the water deficit relative to the historical average reduces the share of the shrimp vote by 2.5%. We also test for heterogeneity among factors such as relief transfers, clientelism, social vulnerability, and economic vulnerability. We only find heterogeneous effects for economic vulnerability: municipalities whose economies depended less on weather-resistant crops featured stronger declines in protest in response to drought severity.

Diogo Baerlocher, Renata Caldas, Rodrigo Schneider, Francisco de Lima Cavalcanti.
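The standardized water-deficit measure described in the abstract can be sketched as follows. This is a minimal illustration with hypothetical numbers; the paper's actual index is computed from meteorological ground-station data, and the function name and inputs here are assumptions.

```python
def drought_severity(rainfall, evaporation, hist_mean, hist_std):
    """Standardized water deficit: how many standard deviations the
    current water balance (evaporation minus rainfall) lies above its
    historical average. Positive values indicate drought conditions."""
    deficit = evaporation - rainfall          # water deficit, in mm
    return (deficit - hist_mean) / hist_std   # z-score vs. history

# Hypothetical municipality: 300 mm of rain, 520 mm of evaporation,
# against a historical deficit of 150 mm with a 70 mm std. deviation.
z = drought_severity(300.0, 520.0, 150.0, 70.0)
print(round(z, 2))  # 1.0 — a one-standard-deviation drought shock
```

Under the paper's estimate, a shock of this size maps to a 2.5% lower share of the shrimp vote.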

TD n. 693, 17/10/2022

This paper studies a central bank’s optimal interest rate corridor choice in the presence of an endogenous interbank network. We first provide a characterization of the unique equilibrium of banks’ liquidity holdings for any network of credit lines. Then, we endogenize the network and show that every equilibrium network is a complete core-periphery graph. Central banks face the following trade-off. A narrower corridor implies more precise targeting of the interbank rate. But, when accounting for banks’ endogenous linking decisions, a narrower corridor may lead to a sparser interbank network with higher aggregate liquidity holdings. This incurs an implicit cost, since these funds could be invested in the more productive illiquid asset instead. We solve for the optimal corridor width and provide a comparative statics analysis.

Luiz Guilherme Carpizo Costa, Timo Hiller.

TD n. 692, 26/01/2022

Central banks’ liabilities are still often excluded from debt sustainability analyses, despite the enormous expansions in central banks’ balance sheets that we have witnessed in recent years. In this paper, we construct a dataset that consolidates both general government and central bank balance sheets and argue that this metric allows for fairer comparisons across countries. The findings highlight the increasingly important role played by central banks in managing and altering the profile of privately-held sovereign debt. In addition, they shed light on the impact of FX reserve accumulation and QE in reducing debt maturity, which cannot be captured by traditional general government debt metrics.

Guido Maia da Cunha, Márcio Gomes Pinto Garcia, Pedro Maia da Cunha.
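The consolidation logic can be illustrated with a stylized sketch. The function name and the numbers are hypothetical, not the paper's data; the point is only the accounting identity behind the metric.

```python
def consolidated_debt(gg_debt, cb_holdings, cb_liabilities):
    """Consolidated public-sector debt: government bonds held by the
    central bank net out, while the central bank's own interest-bearing
    liabilities (bank reserves, repos, sterilization instruments) are
    added back as privately-held, typically very short-term, debt."""
    return gg_debt - cb_holdings + cb_liabilities

# Stylized QE: the central bank buys 10 units of long-term bonds,
# paying with overnight reserves. The headline general government debt
# (100) and the consolidated level are unchanged, but the composition
# shifts from long bonds to overnight liabilities: maturity shortens.
before = consolidated_debt(100.0, 0.0, 0.0)
after = consolidated_debt(100.0, 10.0, 10.0)
print(before, after)  # 100.0 100.0
```

This is why QE leaves traditional debt metrics untouched while the consolidated metric reveals a shorter, more rollover-sensitive profile.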

TD n. 691, 25/01/2022

We propose and implement an index of macroeconomic vulnerability to foreign shocks based on a structural time-varying Bayesian VAR with a block-exogeneity hypothesis for a given pair of a large economy and a small open economy. The index is based on the sum of the responses of the small open economy to shocks in the large economy over time, thus allowing us to disentangle and measure the source of the shock, the impact variables, and the duration of impact. Our approach sheds light not only on vulnerability across countries and over time, but it can also be used to elucidate previously unanswered channels. We provide an application of this approach to a global banks framework, allowing us to measure some previously unmeasured theoretical mechanisms. Using a sample of developed and developing countries, we find that global banks do not increase the macroeconomic vulnerability of a country.

João Pedro Cavaleiro dos Reis Velloso, Márcio Gomes Pinto Garcia, Diogo Abry Guillén, Bernardo Silva de Carvalho Ribeiro.

TD n. 690, 20/09/2021

Central-government politicians channel resources to sub-national entities for political gain. We show formally that the central politicians' allocation decision has two drivers: political alignment (between central and local politicians) and the level of local political accountability. However, the drivers matter one at a time: alignment matters before local elections, while local political accountability matters before central elections. We then test our model using Brazilian data, which corroborates our results. Furthermore, we show and explain why political accountability becomes a curse: better-educated districts receive fewer transfers in equilibrium.

Federico Boffa, Amedeo Piolatto, Francisco de Lima Cavalcanti.

TD n. 673, 15/09/2021

We adopt an artificial counterfactual approach to assess the impact of lockdowns on the short-run evolution of the number of cases and deaths in some US states. To do so, we exploit the different timing with which US states adopted lockdown policies, and divide them into treated and control groups. For each treated state, we construct an artificial counterfactual. On average, and in the very short run, the counterfactual accumulated number of cases would have been two times larger had lockdown policies not been implemented.

Carlos B. Carneiro, Iuri Honda Ferreira, Marcelo Medeiros, Henrique Fernandes Pires, Eduardo Zilberman.
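The artificial-counterfactual step can be sketched as below: fit the treated unit on the (untreated) control units over the pre-treatment window, then project that relation forward. This is a minimal regression-based sketch with synthetic data, not the paper's exact estimator or data.

```python
import numpy as np

def artificial_counterfactual(y_treated, X_controls, T0):
    """Fit the treated unit's outcome on the control units' outcomes
    over the pre-treatment window [0, T0), then project the fitted
    relation over the full sample. Post-T0 fitted values are the
    counterfactual path absent the intervention."""
    X = np.column_stack([np.ones(len(y_treated)), X_controls])
    beta, *_ = np.linalg.lstsq(X[:T0], y_treated[:T0], rcond=None)
    return X @ beta

# Synthetic example: a treated state tracks two control states until a
# "lockdown" at t = 30 lowers its outcome relative to the projection.
rng = np.random.default_rng(1)
Xc = rng.normal(size=(60, 2)).cumsum(axis=0)       # control trajectories
y = 1.0 + Xc @ np.array([0.5, 0.3]) + 0.05 * rng.normal(size=60)
y[30:] -= 2.0                                      # treatment effect
gap = y - artificial_counterfactual(y, Xc, 30)
print(gap[:30].mean(), gap[30:].mean())            # ~0 pre, ~-2 post
```

The post-treatment gap between the observed and counterfactual paths is the estimated effect of the policy.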

TD n. 688, 05/09/2021

We propose a model that reconciles microeconomic evidence of frequent and large price changes with sizable monetary non-neutrality. Firms incur separate lump-sum costs to change prices and to gather and process some information about marginal costs. Additional relevant information is continuously available, and can be factored into pricing decisions at no cost. We estimate the model by Simulated Method of Moments, using price-setting statistics for the U.S. economy. The model with free idiosyncratic and costly aggregate information fits well both targeted and untargeted microeconomic moments and generates more than twice as much monetary non-neutrality as the Calvo model.

Marco Bonomo, Carlos Viana de Carvalho, Rene Garcia, Vivian Malta Nunes, Rodolfo Dinis Rigato.

TD n. 686, 02/09/2021

Ordinary Least Squares (OLS) estimation of monetary policy rules produces potentially inconsistent estimates of policy parameters. The reason is that central banks react to variables, such as inflation and the output gap, which are endogenous to monetary policy shocks. Endogeneity implies a correlation between regressors and the error term, and hence, an asymptotic bias. In principle, Instrumental Variables (IV) estimation can solve this endogeneity problem. In practice, IV estimation poses challenges, as the validity of potential instruments depends on various unobserved features of the economic environment. We argue in favor of OLS estimation of monetary policy rules. To that end, we show analytically in the three-equation New Keynesian model that the asymptotic OLS bias is proportional to the fraction of the variance of regressors accounted for by monetary policy shocks. Using Monte Carlo simulation, we then show that this relationship also holds in a quantitative model of the U.S. economy. As monetary policy shocks explain only a small fraction of the variance of regressors typically included in monetary policy rules, the endogeneity bias is small. Using simulations, we show that, for realistic sample sizes, the OLS estimator of monetary policy parameters outperforms IV estimators.

Forthcoming in the Journal of Monetary Economics

Carlos Viana de Carvalho, Fernanda Feitosa Nechio, Tiago Santana Tristão.
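The paper's core point — OLS bias proportional to the variance share of policy shocks — can be illustrated in a stripped-down static sketch. This is not the paper's three-equation model; the names and parameter values below are illustrative assumptions.

```python
import numpy as np

# Stylized setup: inflation pi reacts to a monetary policy shock eps,
# and the central bank follows a Taylor-type rule i = phi*pi + eps.
rng = np.random.default_rng(0)
T, phi, gamma = 100_000, 1.5, 1.0

def ols_policy_rule(sigma_eps):
    eps = sigma_eps * rng.standard_normal(T)  # monetary policy shocks
    u = rng.standard_normal(T)                # non-policy disturbances
    pi = u - gamma * eps                      # inflation is endogenous to eps
    i = phi * pi + eps                        # policy rule plus shock
    return np.cov(i, pi)[0, 1] / np.var(pi, ddof=1)  # OLS slope

# When policy shocks explain little of the regressor's variance, the
# OLS estimate is close to the true phi = 1.5; when they explain a
# large share, the endogeneity bias pulls it well below phi.
print(ols_policy_rule(0.05), ols_policy_rule(1.0))
```

The asymptotic bias here is -gamma*sigma_eps^2 / (sigma_u^2 + gamma^2*sigma_eps^2), i.e., proportional to the share of the regressor's variance driven by policy shocks, mirroring the paper's analytical result.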

TD n. 672, 10/07/2021

This paper proposes a generalization of the class of realized semivariance and semicovariance measures introduced by Barndorff-Nielsen, Kinnebrock and Shephard (2010) and Bollerslev, Li, Patton and Quaedvlieg (2020a) to allow for a finer decomposition of realized (co)variances. The new “realized partial (co)variances” allow for multiple thresholds with various locations, rather than the single fixed threshold of zero used in semi(co)variances. We adopt methods from machine learning to choose the thresholds to maximize the out-of-sample forecast performance of time series models based on realized partial (co)variances. We find that in low dimensional settings it is hard, but not impossible, to improve upon the simple fixed threshold of zero. In large dimensions, however, the zero threshold embedded in realized semicovariances emerges as a robust choice.

Tim Bollerslev, Marcelo Medeiros, Andrew J. Patton, Rogier Quaedvlieg.
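The decomposition can be sketched in the univariate case as follows. This is a minimal sketch under assumed inputs; the paper additionally treats the covariance case and chooses the thresholds by machine-learning methods.

```python
import numpy as np

def realized_partial_variance(returns, thresholds):
    """Decompose the realized variance sum(r^2) into components from
    returns falling in the bins defined by `thresholds`. With
    thresholds=[0.0] this reduces to the negative/positive realized
    semivariances of Barndorff-Nielsen, Kinnebrock and Shephard (2010)."""
    r = np.asarray(returns, dtype=float)
    edges = [-np.inf, *sorted(thresholds), np.inf]
    return [float(np.sum(r[(r > lo) & (r <= hi)] ** 2))
            for lo, hi in zip(edges[:-1], edges[1:])]

# Four intraday returns, single threshold at zero: the two parts are
# the negative and positive semivariances, and they sum to sum(r^2).
parts = realized_partial_variance([0.01, -0.02, 0.03, -0.01], [0.0])
print(parts, sum(parts))
```

Passing several thresholds (e.g. `[-0.01, 0.0, 0.01]`) yields the finer partial-variance decomposition the paper studies.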

TD n. 684, 22/06/2021

We provide a new theory for nodewise regression when the residuals from a fitted factor model are used. We apply our results to the analysis of the maximum Sharpe ratio when the number of assets in a portfolio is larger than its time span. We introduce a new hybrid model in which factor models are combined with feasible nodewise regression. Returns are generated from an increasing number of factors plus idiosyncratic components (errors). The precision matrix of the idiosyncratic terms is assumed to be sparse, but the respective covariance matrix can be non-sparse. Since nodewise regression is not feasible due to the unknown nature of the errors, we provide, as a new method, a feasible residual-based nodewise regression to estimate the precision matrix of the errors. Next, we show that the residual-based nodewise regression provides a consistent estimate of the precision matrix of the errors. In another new development, we also show that the precision matrix of returns can be estimated consistently, even with an increasing number of factors. Benefiting from the consistency of the precision matrix estimate of returns, we show that: (1) portfolios in high dimensions are mean-variance efficient; (2) the maximum out-of-sample Sharpe ratio estimator is consistent, and the number of assets slows the convergence by up to a logarithmic factor; (3) the maximum Sharpe ratio estimator is consistent when the portfolio weights sum to one; and (4) the Sharpe ratio estimators are consistent in global minimum-variance and mean-variance portfolios.

Mehmet Caner, Marcelo Medeiros, Gabriel F. R. Vasconcelos.
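The nodewise-regression idea can be sketched in the low-dimensional case. Plain OLS stands in for the Lasso here purely for illustration (so the sketch only makes sense for p < n); the paper's feasible version uses penalized regressions on residuals from a fitted factor model.

```python
import numpy as np

def nodewise_precision(X):
    """Nodewise regression sketch: regress each variable on all the
    others; the coefficients gamma and residual variance tau^2 of
    regression j give row j of the precision matrix via
    Theta[j, j] = 1/tau_j^2 and Theta[j, k] = -gamma_jk / tau_j^2."""
    Xc = X - X.mean(axis=0)
    n, p = Xc.shape
    theta = np.zeros((p, p))
    for j in range(p):
        others = np.delete(np.arange(p), j)
        gamma, *_ = np.linalg.lstsq(Xc[:, others], Xc[:, j], rcond=None)
        resid = Xc[:, j] - Xc[:, others] @ gamma
        tau2 = resid @ resid / n
        theta[j, j] = 1.0 / tau2
        theta[j, others] = -gamma / tau2
    return theta

# For i.i.d. standard normal data the true precision matrix is the
# identity, and the nodewise estimate is close to it in large samples.
rng = np.random.default_rng(0)
theta = nodewise_precision(rng.standard_normal((4000, 3)))
print(theta.round(2))
```

In high dimensions the OLS step breaks down, which is precisely why the paper replaces it with the Lasso and establishes consistency for the resulting residual-based estimator.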

TD n. 685, 30/05/2021

Price selection is a simple, model-free measure of selection in price setting and its contribution to inflation dynamics. It exploits comovement between inflation and the level from which adjusting prices departed. Prices that increase from lower-than-usual levels tend to push inflation above average. Using detailed micro-level consumer price data for the United Kingdom, the United States, and Canada, we find robust evidence of strong price selection across goods and services. At a disaggregate level, price selection accounts for 37% of inflation variance in the United Kingdom, 36% in the United States, and 28% in Canada. Price selection is stronger for goods with less frequent price changes or with larger average price changes. Aggregate price selection is considerably weaker. A multisector sticky-price model accounts well for this evidence and demonstrates a monotone relationship between price selection and monetary non-neutrality. Revised in May 2021.

Article accepted for publication in the Journal of Monetary Economics

Carlos Viana de Carvalho, Oleksiy Kryvtsov.

TD n. 670, 17/04/2021

The number of Covid-19 cases is increasing dramatically worldwide. Therefore, the availability of reliable forecasts for the number of cases in the coming days is of fundamental importance. We propose a simple statistical method for short-term real-time forecasting of the number of Covid-19 cases and fatalities in countries that are latecomers – i.e., countries where cases of the disease started to appear some time after others. In particular, we propose a penalized (LASSO) regression with an error correction mechanism to construct a model of a latecomer in terms of the other countries that were at a similar stage of the pandemic some days before. By tracking the number of cases and deaths in those countries, we forecast through an adaptive rolling-window scheme the number of cases and deaths in the latecomer. We apply this methodology to Brazil, and show that (so far) it has been performing very well. These forecasts aim to foster a better short-run management of the health system capacity.

Marcelo Medeiros, Alexandre Street, Davi Valladão, Gabriel F. R. Vasconcelos, Eduardo Zilberman.
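The penalized-regression step can be sketched with a minimal coordinate-descent Lasso. This is illustrative only: the paper's actual model also includes an error-correction mechanism and an adaptive rolling window, and the data layout below (a "latecomer" explained by leading countries' case counts) is a hypothetical stand-in.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=500):
    """Minimal coordinate-descent Lasso minimizing
    (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]            # partial residual
            rho = X[:, j] @ r / n
            z = X[:, j] @ X[:, j] / n
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / z
    return b

# Hypothetical setup: the latecomer's case count is driven by two of
# six leading countries' aligned case counts; the L1 penalty zeroes
# out the irrelevant ones, selecting the informative "leaders".
rng = np.random.default_rng(2)
X = rng.standard_normal((500, 6))
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.1 * rng.standard_normal(500)
b = lasso_cd(X, y, lam=0.1)
print(b.round(2))
```

In the paper's scheme, this fit is re-estimated over a rolling window and the selected leaders' counts are used to forecast the latecomer's cases and deaths a few days ahead.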

TD n. 689, 05/04/2021

We develop a theory of low-frequency movements in inflation expectations, and use it to interpret joint dynamics of inflation and inflation expectations for the United States and other countries over the post-war period. In our theory long-run inflation expectations are endogenous. They are driven by short-run inflation surprises, in a way that depends on recent forecasting performance and monetary policy. This distinguishes our theory from common explanations of low-frequency properties of inflation. The model, estimated using only inflation and short-term forecasts from professional surveys, accurately predicts observed measures of long-term inflation expectations and identifies episodes of unanchored expectations.

Carlos Viana de Carvalho, Stefano Eusepi, Emanuel Moench, Bruce Preston.

TD n. 682, 19/03/2021

We characterize jump dynamics in stock market returns using a novel series of intraday prices covering over 80 years. Jump dynamics vary substantially over time. Trends in jump activity relate to secular shifts in the nature of news. Unscheduled news, often involving major wars, drives jump activity in early decades, whereas scheduled news, and especially news pertaining to monetary policy, drives jump activity in recent decades. Jump variation measures forecast excess stock market returns, consistent with theory. Results support models featuring a separate jump factor such that risk premium dynamics are not fully captured by volatility state variables.

James A. Johnson, Bradley S. Paye, Marcelo Medeiros.

TD n. 683, 04/03/2021

It is widely known that Google Trends has become one of the most popular free tools used by forecasters in academia and in the private and public sectors. Many papers, from several different fields, conclude that Google Trends improves forecast accuracy. What seems to be widely unknown, however, is that each sample of Google search data differs from the others, even for the same search term, dates, and location. This means that it is possible to reach arbitrary conclusions merely by chance. This paper aims to show why and when this can become a problem, and how to overcome the obstacle.

Marcelo Medeiros, Henrique Fernandes Pires.

TD n. 681, 22/02/2021

Factor models and sparse models are two widely used methods to impose a low-dimensional structure in high dimensions. They are seemingly mutually exclusive. In this paper, we propose a simple lifting method that combines the merits of the two models in a supervised-learning methodology that allows one to efficiently explore all the information in high-dimensional datasets. The method is based on a very flexible linear model for panel data, called the factor-augmented regression model, with observable and latent common factors as well as idiosyncratic components as high-dimensional covariates. This model not only includes both factor regression and sparse regression as special cases but also significantly weakens the cross-sectional dependence and hence facilitates model selection and interpretability. The methodology consists of three steps. At each step, remaining cross-sectional dependence can be inferred by a novel test for covariance structure in high dimensions. We develop asymptotic theory for the factor-augmented sparse regression model and demonstrate the validity of the multiplier bootstrap for testing high-dimensional covariance structure. This is further extended to testing high-dimensional partial covariance structures. The theory and methods are further supported by an extensive simulation study and by applications to the construction of a partial covariance network of the financial returns of the constituents of the S&P 500 index and to a prediction exercise for a large panel of macroeconomic time series from the FRED-MD database.

Jianqing Fan, Ricardo Masini, Marcelo Medeiros.
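The first step of the lifting method — splitting a panel into common factors and idiosyncratic components — can be sketched via principal components. This is a minimal sketch; the function name is an assumption, and the paper's methodology then regresses the target on the factors and, via the Lasso, on the idiosyncratic components.

```python
import numpy as np

def factor_augment(X, k):
    """Extract k principal-component factors F from the panel X and
    return them together with the idiosyncratic components
    U = Xc - F L', where L holds the estimated loadings."""
    Xc = X - X.mean(axis=0)
    Usvd, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    F = Usvd[:, :k] * s[:k]   # estimated factors (principal components)
    L = Vt[:k].T              # estimated factor loadings
    return F, Xc - F @ L.T    # factors and idiosyncratic components

# Exact two-factor panel: the k = 2 decomposition recovers it, leaving
# numerically zero idiosyncratic components.
rng = np.random.default_rng(3)
X = rng.standard_normal((100, 2)) @ rng.standard_normal((2, 20))
F, U = factor_augment(X, 2)
print(F.shape, U.shape, np.abs(U).max() < 1e-8)  # (100, 2) (100, 20) True
```

With noisy data, U carries the weakly cross-sectionally dependent remainder on which the sparse-regression step and the covariance-structure tests operate.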