Recognizing reproducibility and reusability in times of fast science
Interpretability and implicit model semantics in biomedicine and deep learning
Computational framework to predict and shape human–machine interactions in closed-loop, co-adaptive neural interfaces.
Machine learning global atomic representations with Euclidean fast attention.
Reverse predictivity for bidirectional comparison of neural networks and biological brains.
AI and the long game
Cardiac health assessment across scenarios and devices using a multimodal foundation model pretrained on data from 1.7 million individuals
Synthetic X-ray-driven tracking and control of miniature medical devices
Benchmarking large language models on safety risks in scientific laboratories
Jointly modeling cardiovascular biomarkers
Addendum: Resolving data bias improves generalization in binding affinity prediction
Multi-agent AI systems need transparency
Modelling drug-induced cellular perturbation responses with a biologically informed dual-branch transformer
Proposing and solving olympiad geometry with guided tree search
Teaching machines to blend electrolyte cocktails
A unified predictive and generative solution for liquid electrolyte formulation
On the troubling rise of generative AI suspicion in academic publishing
Identifying spatial single-cell-level interactions with graph transformer
Authorization of prognostic AI medical devices
Attributing and situating knowledge cannot be left to language models
Visual language models show widespread visual deficits on neuropsychological tests
A flaw in using pretrained protein language models in protein–protein interaction inference models
Reusability report: Evaluating the performance of a meta-learning foundation model on predicting the antibacterial activity of natural products
When large language models are reliable for judging empathic communication
What matters in building vision–language–action models for generalist robots
A federated graph learning method to realize multi-party collaboration for molecular discovery
Parallel hierarchical encoding of linguistic representations in the human auditory cortex and recurrent automatic speech recognition systems
Preconditioned inexact stochastic ADMM for deep models
Meta-designing quantum experiments with language models
Author Correction: Mask-prior-guided denoising diffusion improves inverse protein folding
A large-scale randomized study of large language model feedback in peer review
A family of large language models for materials research with insights into model adaptability in continued pretraining
A universal spin–orbit-coupled Hamiltonian model for accelerated quantum material discovery
Conditional diffusion with locality-aware modal alignment for generating diverse protein conformational ensembles
Reinforcement learning-based design of sequential drug treatment targeting the evolving tumour landscape with SequenTx
Learning collision risk proactively from naturalistic driving data at scale
A benchmarking framework for embodied neuromorphic agents
LLMs displaying less cognitive bias are not necessarily better decision makers
Sample-efficient generative molecular design using memory manipulation
Predicting and interpreting cell-type-specific drug responses in the small-data regime using inductive priors
A robot operating system framework for using large language models in embodied AI
The case for stakeholder-driven AI auditing in automatic speech recognition
Brain-inspired warm-up training with random noise for uncertainty calibration
Learning to be uncertain before learning from data
Predicting new research directions in materials science using large language models and concept graphs
Two-dimensional geometric template diffusion for boosting single-sequence protein structure prediction
Deep generative classification of blood cell morphology
Reusability report: A distributed strategy for solving combinatorial optimization problems with hypergraph neural networks
Are neural network representations universal or idiosyncratic?
Accelerating molecular dynamics by going with the flow