Articles

  • Predicting the binding specificity between T-cell receptors (TCRs) and putative antigens can help improve cancer immunotherapy. Lin et al. propose RACER, which uses supervised machine learning to efficiently learn the key molecular interactions that contribute to TCR–peptide binding.

    • Xingcheng Lin
    • Jason T. George
    • Herbert Levine
    Article
  • A dynamic, organ-resolved model of type 1 diabetes is developed by integrating metabolic and regulatory processes, depicting network dynamics, regulation and the response to perturbations in relation to variability in insulin response.

    • Marouen Ben Guebila
    • Ines Thiele
    Article
  • haploSep is a computationally efficient method to infer major haplotypes and their frequencies from multiple samples of allele frequency data, and to provide improved estimates of experimentally obtained allele frequencies.

    • Marta Pelizzola
    • Merle Behr
    • Andreas Futschik
    Article
  • The CARseq method allows users to assess cell type-specific differential expression using RNA-seq data from bulk tissue samples, which opens up several opportunities for re-analyzing existing RNA-seq data and designing new studies.

    • Chong Jin
    • Mengjie Chen
    • Wei Sun
    Article
  • Physics-aware deep generative models are used to design material microstructures exhibiting tailored properties. Multi-fidelity data are used to create inexpensive yet accurate machine learning surrogates for evaluating the physics-based constraints within such design frameworks.

    • Xian Yeow Lee
    • Joshua R. Waite
    • Soumik Sarkar
    Article
  • Stimulated emission depletion (STED) microscopy allows images to be captured at subdiffraction resolution. Here, optimal transport colocalization is proposed for analyzing macromolecule distributions in high-resolution STED images.

    • Carla Tameling
    • Stefan Stoldt
    • Axel Munk
    Article
  • Developing lightweight deep neural networks, while essential for edge computing, remains a challenge. Random sketch learning is a method that creates computationally efficient and compact networks, paving the way for deploying tiny machine learning (TinyML) on resource-constrained devices.

    • Bin Li
    • Peijun Chen
    • Jun Zhang
    Article
  • Through parametric sensitivity analysis and uncertainty quantification of the CovidSim model, a subset of parameters to which the model output is most sensitive is identified. Focusing on these parameters allows better-informed decisions about proposed policies.

    • Wouter Edeling
    • Hamid Arabnejad
    • Peter V. Coveney
    Article
  • Spiking neural network simulations are very memory-intensive, limiting large-scale brain simulations to high-performance computer systems. Knight and Nowotny propose using procedural connectivity to substantially reduce the memory footprint of these models, such that they can run on standard GPUs.

    • James C. Knight
    • Thomas Nowotny
    Article
  • Multi-fidelity graph networks learn more effective representations for materials from large data sets of low-fidelity properties, which can then be used to make accurate predictions of high-fidelity properties, such as the band gaps of ordered and disordered crystals and energies of molecules.

    • Chi Chen
    • Yunxing Zuo
    • Shyue Ping Ong
    Article