New articles on Economics


[1] 2601.16274

A Nonlinear Target-Factor Model with Attention Mechanism for Mixed-Frequency Data

We propose Mixed-Panels-Transformer Encoder (MPTE), a novel framework for estimating factor models in panel datasets with mixed frequencies and nonlinear signals. Traditional factor models rely on linear signal extraction and require homogeneous sampling frequencies, limiting their applicability to modern high-dimensional datasets where variables are observed at different temporal resolutions. Our approach leverages Transformer-style attention mechanisms to enable context-aware signal construction through flexible, data-dependent weighting schemes that replace fixed linear combinations with adaptive reweighting based on similarity and relevance. We extend classical principal component analysis (PCA) to accommodate general temporal and cross-sectional attention matrices, allowing the model to learn how to aggregate information across frequencies without manual alignment or pre-specified weights. For linear activation functions, we establish consistency and asymptotic normality of factor and loading estimators, showing that our framework nests Target PCA as a special case while providing efficiency gains through transfer learning across auxiliary datasets. The nonlinear extension uses a Transformer architecture to capture complex hierarchical interactions while preserving the theoretical foundations. In simulations, MPTE demonstrates superior performance in nonlinear environments, and in an empirical application to 13 macroeconomic forecasting targets using a selected set of 48 monthly and quarterly series from the FRED-MD and FRED-QD databases, our method achieves competitive performance against established benchmarks. We further analyze attention patterns and systematically ablate model components to assess variable importance and temporal dependence. The resulting patterns highlight which indicators and horizons are most influential for forecasting.
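
To make the attention-based reweighting idea concrete, here is a minimal numpy sketch of attention-weighted PCA. It is not the authors' MPTE architecture: the correlation-based softmax attention, the temperature parameter, and all names are illustrative assumptions, and mixed-frequency handling is omitted entirely.

```python
import numpy as np

def attention_weighted_pca(X, n_factors=2, temperature=1.0):
    """Toy sketch: replace PCA's fixed cross-sectional aggregation with a
    data-dependent softmax attention over series, then extract factors.
    X: (T, N) panel with standardized columns and no missing values."""
    S = np.corrcoef(X, rowvar=False)             # (N, N) similarity scores
    A = np.exp(S / temperature)
    A /= A.sum(axis=1, keepdims=True)            # row-stochastic attention matrix
    X_att = X @ A.T                              # context-aware reweighted panel
    U, s, Vt = np.linalg.svd(X_att, full_matrices=False)
    factors = U[:, :n_factors] * s[:n_factors]   # (T, n_factors)
    loadings = Vt[:n_factors].T                  # (N, n_factors)
    return factors, loadings
```

If the attention matrix were the identity, this would reduce to ordinary PCA, mirroring the abstract's point that the linear case nests classical estimators.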


[2] 2601.16613

Is the diurnal pattern sufficient to explain intraday variation in volatility? A nonparametric assessment

In this paper, we propose a nonparametric way to test the hypothesis that time-variation in intraday volatility is caused solely by a deterministic and recurrent diurnal pattern. We assume that noisy high-frequency data from a discretely sampled jump-diffusion process are available. The test is then based on asset returns, which are deflated by the seasonal component and therefore homoskedastic under the null. To construct our test statistic, we extend the concept of pre-averaged bipower variation to a general Itô semimartingale setting via a truncation device. We prove a central limit theorem for this statistic and construct a positive semi-definite estimator of the asymptotic covariance matrix. The $t$-statistic (after pre-averaging and jump-truncation) diverges in the presence of stochastic volatility and has a standard normal distribution otherwise. We show that replacing the true diurnal factor with a model-free jump- and noise-robust estimator does not affect the asymptotic theory. A Monte Carlo simulation also shows this substitution has no discernible impact in finite samples. The test is, however, distorted by small infinite-activity price jumps. To improve inference, we propose a new bootstrap approach, which leads to almost correctly sized tests of the null hypothesis. We apply the developed framework to a large cross-section of equity high-frequency data and find that the diurnal pattern accounts for a sizable fraction of intraday variation in volatility, but important sources of heteroskedasticity remain present in the data.
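
A minimal sketch of the kind of statistic involved: pre-averaged returns combined with bipower products and jump truncation. The window choice, weight function, truncation rule, and omitted normalization constants below are illustrative assumptions, not the paper's exact estimator.

```python
import numpy as np

def preaveraged_truncated_bipower(prices, kn=None, alpha=4.0):
    """Sketch of a pre-averaged, jump-truncated bipower statistic.
    prices: log-prices; normalization constants (psi-factors, noise
    bias correction) are omitted for readability."""
    r = np.diff(prices)                       # noisy high-frequency returns
    n = r.size
    if kn is None:
        kn = int(np.sqrt(n))                  # pre-averaging window ~ n^{1/2}
    x = np.arange(1, kn) / kn
    w = np.minimum(x, 1 - x)                  # standard weight function g
    # pre-averaged returns: weighted sums of raw returns over rolling blocks
    rbar = np.array([w @ r[i:i + kn - 1] for i in range(n - kn + 2)])
    u = alpha * np.std(rbar)                  # crude truncation level (illustrative)
    lead, lag = rbar[kn:], rbar[:-kn]         # products kn apart to limit overlap
    keep = (np.abs(lead) <= u) & (np.abs(lag) <= u)
    return (np.pi / 2) * np.sum(np.abs(lead) * np.abs(lag) * keep)
```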


[3] 2601.16668

Inference from high-frequency data: A subsampling approach

In this paper, we show how to estimate the asymptotic (conditional) covariance matrix that appears in central limit theorems in high-frequency estimation of asset return volatility. We provide a recipe for the estimation of this matrix by subsampling: an approach that computes rescaled copies of the original statistic based on local stretches of high-frequency data and then studies the sampling variation of these copies. We show that our estimator is consistent both in frictionless markets and in models with additive microstructure noise. We derive a rate of convergence for it and determine an optimal rate for its tuning parameters (e.g., the number of subsamples). Subsampling does not require an extra set of estimators to do inference, which renders it trivial to implement. As a variance-covariance matrix estimator, it has the attractive feature that it is positive semi-definite by construction. Moreover, the subsampler is to some extent automatic, as it does not exploit explicit knowledge about the structure of the asymptotic covariance. It therefore tends to adapt to the problem at hand and to be robust against misspecification of the noise process. As such, this paper facilitates assessment of the sampling errors inherent in high-frequency estimation of volatility. We highlight the finite-sample properties of the subsampler in a Monte Carlo study, while some initial empirical work demonstrates its use to draw feasible inference about volatility in financial markets.
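
The core recipe can be sketched in a few lines. The centering, the use of disjoint blocks, and the constant-volatility rescaling below are simplifying assumptions for illustration; the paper's estimator is the properly general version.

```python
import numpy as np

def subsample_se(returns, stat, m):
    """Sketch of the subsampling idea: recompute `stat` on disjoint local
    stretches of length m, rescale the copies to the full-sample level,
    and use their dispersion to estimate the standard error of the
    full-sample statistic. Assumes roughly constant volatility."""
    n = returns.size
    full = stat(returns)
    blocks = [returns[i:i + m] for i in range(0, n - m + 1, m)]
    copies = (n / m) * np.array([stat(b) for b in blocks])   # rescaled copies
    var_hat = (m / n) * np.mean((copies - full) ** 2)
    return np.sqrt(var_hat)

# usage (illustrative): a standard error for realized variance
# se = subsample_se(r, lambda x: np.sum(x ** 2), m=int(np.sqrt(r.size)))
```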


[4] 2601.16801

Bringing the economics of biodiversity into policy and decision-making: A target and cost-based approach to pricing biodiversity

Given the ongoing, human-induced loss of wild species, we propose the Target and Cost Analysis (TCA) approach as a means of incorporating biodiversity within government appraisals of public spending. Influenced by how carbon is priced in countries around the world, the resulting biodiversity shadow price reflects the marginal cost of meeting government targets while avoiding disagreements over the use of willingness-to-pay measures to value biodiversity. Examples of how to operationalize TCA are developed at different scales and for alternative biodiversity metrics, including extinction risk for Europe and species richness in the UK. Pricing biodiversity according to agreed targets allows trade-offs with other wellbeing-enhancing uses of public funds to be sensibly undertaken without jeopardizing those targets, and is compatible with international guidelines on Cost Benefit Analysis.


[5] 2601.16865

Distributional Instruments: Identification and Estimation with Quantile Least Squares

We study instrumental-variable designs where policy reforms strongly shift the distribution of an endogenous variable but only weakly move its mean. We formalize this by introducing distributional relevance: instruments may be purely distributional. Within a triangular model, distributional relevance suffices for nonparametric identification of average structural effects via a control function. We then propose Quantile Least Squares (Q-LS), which aggregates conditional quantiles of X given Z into an optimal mean-square predictor and uses this projection as an instrument in a linear IV estimator. We establish consistency, asymptotic normality, and the validity of standard 2SLS variance formulas, and we discuss regularization across quantiles. Monte Carlo designs show that Q-LS delivers well-centered estimates and near-correct size when mean-based 2SLS suffers from weak instruments. In Health and Retirement Study data, Q-LS exploits Medicare Part D-induced distributional shifts in out-of-pocket risk to sharpen estimates of the effect of that risk on depression.
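
A minimal sketch of how such a two-step procedure could look, using statsmodels. The quantile grid, the OLS projection step, and all function names are my reading of the abstract rather than the authors' implementation.

```python
import numpy as np
import statsmodels.api as sm

def qls_iv(y, x, z, taus=np.linspace(0.1, 0.9, 9)):
    """Sketch of a Q-LS-style estimator: (1) fit conditional quantiles of
    the endogenous x given instrument z; (2) project x onto the quantile
    fits by OLS to form a mean-square-optimal predictor; (3) use that
    predictor as the instrument in a just-identified linear IV."""
    Z = sm.add_constant(z)
    Q = np.column_stack([sm.QuantReg(x, Z).fit(q=t).predict(Z) for t in taus])
    xhat = sm.OLS(x, sm.add_constant(Q)).fit().predict()      # projection step
    W, X = sm.add_constant(xhat), sm.add_constant(x)
    beta = np.linalg.solve(W.T @ X, W.T @ y)                  # IV normal equations
    return beta                                               # (intercept, slope)
```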


[6] 2601.16488

Anonymous Pricing in Large Markets

We study revenue maximization when a seller offers $k$ identical units to ex ante heterogeneous, unit-demand buyers. While anonymous pricing can be $\Theta(\log k)$ worse than optimal in general multi-unit environments, we show that this pessimism disappears in large markets, where no single buyer accounts for a non-negligible share of optimal revenue. Under (quasi-)regularity, anonymous pricing achieves a $2+O(1/\sqrt{k})$ approximation to the optimal mechanism; the worst-case ratio is maximized at about $2.47$ when $k=1$ and converges to $2$ as $k$ grows. This indicates that the gains from third-degree price discrimination are mild in large markets.


[7] 2601.16749

Finite Population Inference for Factorial Designs and Panel Experiments with Imperfect Compliance

This paper develops a finite population framework for analyzing causal effects in settings with imperfect compliance where multiple treatments affect the outcome of interest. Two prominent examples are factorial designs and panel experiments with imperfect compliance. I define finite population causal effects that capture the relative effectiveness of alternative treatment sequences. I provide nonparametric estimators for a rich class of factorial and dynamic causal effects and derive their finite population distributions as the sample size increases. Monte Carlo simulations illustrate the desirable properties of the estimators. Finally, I use the estimator for causal effects in factorial designs to revisit a well-known voter mobilization experiment that analyzes the effect of phone-call encouragement on voter turnout.
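
For orientation, the simplest member of the family of estimands involved is the textbook 2x2 factorial contrast sketched below; the paper's estimators additionally handle imperfect compliance, treatment sequences, and finite population inference, none of which this sketch attempts.

```python
import numpy as np

def factorial_effects(y, d1, d2):
    """Textbook difference-in-means contrasts for a 2x2 factorial design
    (illustrative only). y: outcomes; d1, d2: binary treatment indicators."""
    cell = lambda a, b: y[(d1 == a) & (d2 == b)].mean()      # cell means
    m00, m01, m10, m11 = cell(0, 0), cell(0, 1), cell(1, 0), cell(1, 1)
    main1 = 0.5 * ((m10 - m00) + (m11 - m01))    # main effect of treatment 1
    main2 = 0.5 * ((m01 - m00) + (m11 - m10))    # main effect of treatment 2
    interaction = (m11 - m10) - (m01 - m00)      # two-way interaction
    return main1, main2, interaction
```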


[8] 2305.12857

One Call Away. Ownership Chains and Ease of Communication in Multinational Enterprises

This study examines how multinational enterprises structure ownership chains to coordinate subsidiaries across multiple national borders. Using a unique global dataset, we first document key stylized facts: 54% of subsidiaries are controlled through indirect ownership, and ownership chains can span up to seven countries. In particular, we find that subsidiaries further down the control hierarchy tend to be more geographically distant from the parent and to operate in different time zones. This suggests that the ease of communication along ownership chains is a critical determinant of their structure. By contrast, tax optimization strategies are not correlated with locations along ownership chains. Motivated by these findings, we develop a location choice model in which parent firms compete for corporate control of final subsidiaries, but monitoring is costly, and they can delegate control to an intermediate affiliate in another jurisdiction. The model generates a two-stage empirical strategy: (i) a trilateral equation that determines the location of an intermediate affiliate conditional on the location of final subsidiaries; and (ii) a bilateral equation that predicts the location of final investment. Our empirical estimates confirm that the ease of communication at the country level has a significant influence on the location decisions of affiliates along ownership chains. Our findings underscore the importance of communication frictions in shaping global corporate structures and provide new insights into the geography of multinational ownership networks.


[9] 2402.08941

Local-Polynomial Estimation for Multivariate Regression Discontinuity Designs

We study a multivariate regression discontinuity design in which treatment is assigned by crossing a boundary in the space of multiple running variables. We document that the existing bandwidth selector is suboptimal for a multivariate regression discontinuity design when the distance to a boundary point is used as the running variable, and we introduce a multivariate local-linear estimator for such designs. Our estimator is asymptotically valid and can capture heterogeneous treatment effects over the boundary. In numerical simulations, it exhibits smaller root mean squared errors and often shorter confidence intervals. We illustrate the estimator in empirical applications to multivariate designs from a Colombian scholarship study and a U.S. House of Representatives voting study, and we demonstrate that it reveals richer heterogeneous treatment effects, often with shorter confidence intervals than the existing estimator.
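
A minimal sketch of a bivariate local-linear fit at a boundary point: fit a kernel-weighted plane on each side of the boundary and take the difference of the intercepts. The product Gaussian kernel and the single common bandwidth are illustrative simplifications, not the paper's implementation.

```python
import numpy as np

def mv_local_linear_rdd(y, R, treated, b, h):
    """Local-linear jump estimate at boundary point b.
    y: (n,) outcomes; R: (n, 2) running variables; treated: (n,) bool;
    b: (2,) boundary point; h: scalar bandwidth (assumed common)."""
    def side_fit(mask):
        Rc = R[mask] - b                                   # center at b
        w = np.exp(-0.5 * np.sum((Rc / h) ** 2, axis=1))   # kernel weights
        X = np.column_stack([np.ones(mask.sum()), Rc])     # local-linear design
        XtW = X.T * w
        return np.linalg.solve(XtW @ X, XtW @ y[mask])[0]  # intercept = fit at b
    return side_fit(treated) - side_fit(~treated)
```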


[10] 2406.14046

Estimating Time-Varying Parameters of Various Smoothness in Linear Models via Kernel Regression

We study kernel-based estimation of nonparametric time-varying parameters (TVPs) in linear models. Our contributions are threefold. First, we establish consistency and asymptotic normality of the kernel-based estimator for a broad class of TVPs, including deterministic smooth functions, the rescaled random walk, structural breaks, the threshold model, and their mixtures. Our analysis exploits the smoothness of the TVP. Second, we show that the bandwidth rate must be determined according to the smoothness of the TVP. For example, the conventional $T^{-1/5}$ rate is valid only for sufficiently smooth TVPs, and the bandwidth should be proportional to $T^{-1/2}$ for random-walk TVPs, where $T$ is the sample size. We show this by highlighting the overlooked fact that the bandwidth determines a trade-off between the convergence rate and the size of the class of TVPs that can be estimated. Third, we propose a data-driven procedure for bandwidth selection that is adaptive to the latent smoothness of the TVP. Simulations and an application to the capital asset pricing model suggest that the proposed method offers a unified approach to estimating a wide class of TVP models.
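
The estimator itself is easy to sketch; what is hard, and what the paper contributes, is choosing the bandwidth to match the unknown smoothness. A local-constant Gaussian-kernel version, with all tuning left out, might look like this:

```python
import numpy as np

def kernel_tvp(y, X, h):
    """Kernel-weighted least squares for y_t = x_t' beta(t/T) + u_t
    (local-constant sketch). y: (T,); X: (T, k); h: bandwidth in
    rescaled time. Returns the (T, k) path of estimated coefficients."""
    T = len(y)
    grid = np.arange(T) / T                              # rescaled time t/T
    betas = np.empty((T, X.shape[1]))
    for t in range(T):
        w = np.exp(-0.5 * ((grid - grid[t]) / h) ** 2)   # kernel weights
        XtW = X.T * w
        betas[t] = np.linalg.solve(XtW @ X, XtW @ y)     # weighted LS at t
    return betas
```

Per the abstract, h proportional to $T^{-1/5}$ suits smooth paths while $T^{-1/2}$ suits random-walk paths, so no single fixed rate covers both; that is the motivation for the adaptive selection rule.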


[11] 2502.12431

Minimizing Instability in Strategy-Proof Matching Mechanism Using A Linear Programming Approach

We study the design of one-to-one matching mechanisms that are strategy-proof for both sides and as stable as possible. Motivated by the impossibility result of Roth (1982), we formulate the mechanism design problem as a linear program that minimizes stability violations subject to exact strategy-proofness constraints. We consider both an average-case objective (summing violations over all preference profiles) and a worst-case objective (minimizing the maximum violation across profiles), and we show that imposing anonymity and symmetry when both sides have the same number of agents can be done without loss of optimality. Computationally, for small markets our approach yields randomized mechanisms with substantially lower stability violations than randomized sequential dictatorship (RSD); in the $3\times 3$ case the optimum reduces average instability to roughly one third of RSD's. For deterministic mechanisms with three students and three schools, we find that any two-sided strategy-proof mechanism has at least two blocking pairs in the worst case, and we provide a simple algorithm that attains this bound. Finally, we propose an extension to larger markets and present simulation evidence that, relative to sequential dictatorship (SD), it reduces the number of blocking pairs by about $0.25$ on average.
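
To show the shape of the LP, here is a toy $2\times 2$ instance (two students, two schools, strict preferences; the variables are the per-profile probabilities of one of the two matchings). It is deliberately smaller than the paper's $3\times 3$ analysis, and in a market this small the optimum may well be zero; the profile encoding and objective are my own illustrative choices.

```python
import itertools
import numpy as np
from scipy.optimize import linprog

# Profile = (s1, s2, c1, c2); for students, 0 means "prefers c1"; for
# schools, 0 means "prefers s1". Matching mu1 = {s1-c1, s2-c2},
# mu2 = {s1-c2, s2-c1}; variable p[profile] = Pr(mu1 is chosen).
profiles = list(itertools.product([0, 1], repeat=4))
idx = {pr: i for i, pr in enumerate(profiles)}

def blocking(pr, mu):                        # number of blocking pairs
    s1, s2, c1, c2 = pr
    if mu == 1:                              # candidates: (s1,c2), (s2,c1)
        return (s1 == 1 and c2 == 0) + (s2 == 0 and c1 == 1)
    return (s1 == 0 and c1 == 0) + (s2 == 1 and c2 == 1)

# objective: sum over profiles of p*v1 + (1-p)*v2  ->  coefficients v1-v2
c = np.array([blocking(pr, 1) - blocking(pr, 2) for pr in profiles], float)

# strategy-proofness: misreporting must not raise the probability of the
# agent's preferred partner; with two options each, every constraint
# compares p at a profile with p at the one-bit-flipped profile.
mu1_top = [0, 1, 0, 1]      # bit value making the mu1 partner the favorite
A_ub, b_ub = [], []
for pr in profiles:
    for k in range(4):
        alt = list(pr); alt[k] ^= 1; alt = tuple(alt)
        row = np.zeros(len(profiles))
        if pr[k] == mu1_top[k]:              # truth must maximize Pr(mu1)
            row[idx[alt]], row[idx[pr]] = 1.0, -1.0
        else:                                # truth must maximize Pr(mu2)
            row[idx[pr]], row[idx[alt]] = 1.0, -1.0
        A_ub.append(row); b_ub.append(0.0)

res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, bounds=[(0, 1)] * 16)
avg_violation = (res.fun + sum(blocking(pr, 2) for pr in profiles)) / 16
print(f"average expected blocking pairs: {avg_violation:.4f}")
```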


[12] 2503.04854

Aggregation Model and Market Mechanism for Virtual Power Plant Participation in Inertia and Primary Frequency Response

The declining provision of inertia by synchronous generators in modern power systems necessitates aggregating distributed energy resources (DERs) into virtual power plants (VPPs) to unlock their potential in delivering inertia and primary frequency response (IPFR) through ancillary service markets. To facilitate DER participation in the IPFR market, this paper proposes an aggregation model and market mechanism for VPPs participating in IPFR. First, an energy-reserve-IPFR market framework is developed, in which a VPP acts as an intermediary to coordinate heterogeneous DERs. Second, taking into account the delay associated with inertial response, an optimization-based VPP aggregation method is introduced to encapsulate the IPFR process involving a variety of DERs. Third, an energy-reserve-IPFR market mechanism with VPP participation is designed to minimize social costs, where stochastic deviations of renewable energy generation are explicitly modeled through chance-constrained reformulations, ensuring that the cleared energy, reserve, and IPFR schedules remain secure against forecast errors. Case studies on the IEEE 30-bus and IEEE 118-bus systems show that the VPP aggregation model reproduces the nadir and quasi-steady-state frequencies with a mean absolute percentage error of at most 0.03%, and the proposed market mechanism with VPP participation reduces the total system cost by approximately 40% and increases the net profit by about 30%.
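
For readers unfamiliar with the chance-constrained device mentioned above, the standard Gaussian reformulation (a generic textbook version, not the paper's exact constraint set) replaces a probabilistic limit with a deterministic one:

$$\Pr\!\big(a^\top x + \xi \le b\big) \ge 1-\varepsilon,\quad \xi \sim \mathcal{N}(0,\sigma^2) \;\Longleftrightarrow\; a^\top x + \Phi^{-1}(1-\varepsilon)\,\sigma \le b,$$

so that the cleared energy, reserve, and IPFR schedules stay feasible for all but an $\varepsilon$-share of forecast-error realizations.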


[13] 2601.14150

Trade relationships during and after a crisis

I study how firms adjust to temporary disruptions in international trade relationships organized through relational contracts. I exploit an extreme, plausibly exogenous weather shock during the 2010-11 La Niña season that restricted Colombian flower exporters' access to cargo terminals. Using transaction-level data from the Colombian-U.S. flower trade, I show that importers with less-exposed supplier portfolios are less likely to terminate disrupted relationships, instead tolerating shipment delays. In contrast, firms facing greater exposure experience higher partner turnover and are more likely to exit the market, with exit accounting for a substantial share of relationship separations. These findings demonstrate that idiosyncratic shocks to buyer-seller relationships can propagate into persistent changes in firms' trading portfolios.


[14] 2007.07703

Failures of Contingent Thinking

We present a behavioral definition of an agent's perceived implication that uniquely identifies a subjective state-space representing her view of a decision problem, and which may differ from the modeler's. By examining belief updating within this model, we formalize the recent empirical consensus that reducing uncertainty improves contingent thinking, and propose a novel form of updating corresponding to the agent 'realizing' a flaw in her own thinking. Finally, we clarify the sense in which contingent thinking makes state-bystate dominance more cognitively demanding than obvious dominance.