Peter J. Cameron, University of St Andrews, Scotland
Aparna Lakshmanan S., Cochin University of Science and Technology, India
Ambat Vijayakumar, Cochin University of Science and Technology, India

The Shrikhande Graph
A Window on Discrete Mathematics

Series: London Mathematical Society Student Texts

Description

The Shrikhande graph, discovered by the Indian mathematician Sharadchandra Shankar Shrikhande in 1959, exhibits several unusual properties and occupies a pivotal position within discrete mathematics. Offering a unique introduction to graph theory and discrete mathematics, this book uses the Shrikhande graph as a window through which these topics can be explored. Providing historical background, including the Euler conjecture and its demise, the authors explore key concepts such as Cayley graphs, topological graph theory, spectral theory, Latin squares, and root systems. A novel and valuable resource for graduate students and researchers interested in graph theory, its history, and its applications, this book offers a comprehensive exploration of the Shrikhande graph and its significance.

Ties together many areas of discrete mathematics, demonstrating their applications to the Shrikhande graph
An in-depth study of a beautiful graph with an interesting history
A tribute to the eminent Indian mathematician, Sharadchandra Shankar Shrikhande

Product details

Published: May 2026
Format: Hardback
ISBN: 9781009709101
Format: Paperback
ISBN: 9781009709088
Length: 178 pages
Dimensions: 229 × 152 mm
Availability: Not yet published - available from May 2026

Contents

Preface
Part I. Biography:
1. The life of S. S. Shrikhande
Part II. Graph Basics:
2. Definitions
3. Strongly regular graphs
Part III. Properties of the Shrikhande Graph:
4. Spectrum and automorphism group
5. Further properties
Part IV. The Shrikhande Graph in Context:
6. Latin squares
7. The Shrikhande graph on the torus
8. Root systems
9. Graphs with least eigenvalue
10. Miscellanea
11. Further reading
References
Index.


T. Asir, Pondicherry University, Pondicherry
M. Evangeline Prathibha, Lady Doak College, Madurai, Tamil Nadu
B. Surendranath Reddy, Swami Ramanand Teerth Marathwada University, Nanded, Maharashtra

Essential Graph Theory
Concepts and Algorithms

Description

Designed for undergraduate students of computer science, mathematics, and engineering, this book provides the tools and understanding needed to master graph theory and algorithms. It offers a strong theoretical foundation, detailed pseudocodes, and a range of real-world and illustrative examples to bridge the gap between abstract concepts and practical applications. Clear explanations and chapter-wise exercises support ease of comprehension for learners. The text begins with the basic properties of graphs and progresses to topics such as trees, connectivity, and distances in graphs. It also covers Eulerian and Hamiltonian graphs, matchings, planar graphs, and graph colouring. The book concludes with discussions on independent sets, the Ramsey theorem, directed graphs and networks. Concepts are introduced in a structured manner, with appropriate context and support from mathematical language and diagrams. Algorithms are explained through rules, reasoning, pseudocode, and relevant examples.

Inquiry-based approach to foster interest in the subject
Detailed mathematical proofs for ease of understanding
Numerical exercises for sharpening problem-solving skills
Pseudocodes for visualization of algorithmic processes

Product details

Published: December 2025
Format: Paperback
ISBN: 9781009559379
Length: 334 pages
Dimensions: 240 × 182 × 13 mm
Weight: 0.468kg
Availability: Not yet published - available from December 2025

Contents

Preface
1. Introduction to graphs
2. Basic properties of graphs
3. Trees
4. Connectivity
5. Distance in graphs
6. Eulerian graphs and Hamiltonian graphs
7. Matchings
8. Planar graphs
9. Coloring of graphs
10. Independent sets and Ramsey theory
11. Directed graphs
Bibliography
Index.

By Samia Challal

Guide to Multiple Regression

Copyright 2026
Hardback
ISBN 9781041018919
240 Pages 35 Color & 1 B/W Illustrations
December 30, 2025 by Chapman & Hall

Description

This book is intended to equip the reader with conceptual knowledge of multilinear regression methods and to develop technical skills through solving problems with the R programming language. The goal is not to become an expert in either direction; rather, the book is an incentive to pursue research on advanced models in data analysis once confidence has been gained through practising these methods.

A grounding in inferential statistics and in a programming language is a necessary prerequisite to engage well with the content. We believe that using a computer is the only practical means of performing a linear or multilinear regression on a data set of even moderate size. Our purpose, therefore, is to make the connections between theory, technology, and applications as clear as possible.

Features

Focused summaries of the main statistical methods, followed by solved questions
Integration of R both as a calculator and as a programming language for solving each question with ease and accuracy
Use of simple coding that illustrates the connection to the theory
Progressive suggestions of alternative code
Visualization and interpretation of the R outputs

Table of Contents

List of Figures
List of Tables
Preface
1. Simple linear probabilistic model
2. Multi-linear regression
3. Non-basic regression
Appendix: Q-Q plot
Appendix: Residual plots
Bibliography
Index

By Gnanadarsha Sanjaya Dissanayake, Hassan Doosti

Long Memory Time Series Analysis

Copyright 2026
Paperback
Hardback
ISBN 9781032626963
ISBN 9781032626994
170 Pages 198 B/W Illustrations
February 19, 2026 by Chapman & Hall

Description

Long Memory Time Series Analysis is a comprehensive text covering long memory time series and the different long memory models in use. The authors cover modelling and forecasting using various time series, deploying both traditional and machine learning methodologies. The reader also learns recent research trends, such as state space modelling of generalized long memory time series and the use of the tsfGRNN machine learning tool in R. The book starts from autoregressive (AR) and moving average (MA) processes and progresses to descriptions of the autoregressive moving average (ARMA) time series, the autoregressive integrated moving average (ARIMA) model, and the autoregressive fractionally integrated moving average (ARFIMA) process. The differences between short, intermediate, and long memory processes are highlighted. The reader will gain knowledge of elementary time series through this extensive coverage.

The book discusses generalized Gegenbauer autoregressive moving average (GARMA) and seasonal GARMA long memory time series, along with state space modelling of generalized and seasonal GARMA. Extensions of the short and long memory models driven by generalized autoregressive conditional heteroskedasticity (GARCH) errors are also presented. The extensive range of problems linked with generalized Gegenbauer long memory time series is presented to reinforce the reader’s conceptual learning. Coverage of the use of time series with high-frequency data captured through the latest technological innovations is an invaluable resource to the reader. This learning is done through examples of time series application case studies in medicine, biology, and finance.

The core audience is students undertaking advanced studies in time series. The book can also be used by researchers and data scientists who employ time series analysis in a modern context.

Table of Contents

1. Introduction to AR, MA Time Series, Autocorrelation, Partial Autocorrelation, Spectral Density
2. ARMA Process and Box–Jenkins Model
3. Integer Differencing and ARIMA Process with White Noise
4. Fractional Differencing and ARFIMA Process with White Noise
5. Short, Intermediate, and Long Memory Properties of Time Series
6. Standard Long Memory and State Space Modeling of ARFIMA Process with White Noise
7. State Space Modeling of GARMA Processes with Generalized Long Memory
8. Nonlinear and Nonstationary Time Series
9. An Introduction to Nonparametric Long Memory Time Series
10. ARMA, ARIMA, ARFIMA, and GARMA Models with GARCH Errors
11. Enhancing Time Series Analysis with Machine Learning, High-Frequency Data, and Applications in Medicine and Biology

By Brian J. Reich, Sujit K. Ghosh

Bayesian Statistical Methods, 2nd Edition
With Applications to Machine Learning

Copyright 2026
Hardback
ISBN 9781032486321
360 Pages 116 B/W Illustrations
February 2, 2026 by Chapman & Hall

Description

Bayesian Statistical Methods: With Applications to Machine Learning provides data scientists with the foundational and computational tools needed to carry out a Bayesian analysis. Compared with other texts, this book focuses more on Bayesian methods applied routinely in practice, including multiple linear regression, mixed effects models, and generalized linear models. This second edition includes a new chapter on Bayesian machine learning methods for large and complex datasets, along with several new applications that illustrate the benefits of the Bayesian approach for uncertainty quantification.

Readers familiar with only introductory statistics will find this book accessible: it includes many worked examples with complete R code, and comparisons with analogous frequentist procedures are presented throughout. The book can serve as a one-semester course for advanced undergraduate and graduate students, including statistics majors as well as non-statistics graduate students from disciplines such as engineering, ecology, and psychology. In addition to a thorough treatment of the basic concepts of Bayesian inferential methods, the book covers many general topics:

Advice on selecting prior distributions
Computational methods including Markov chain Monte Carlo (MCMC) sampling
Model comparison and goodness-of-fit measures, including sensitivity to priors

To illustrate the flexibility of the Bayesian approach for complex data structures, the latter chapters provide case studies covering advanced topics:

Handling of missing and censored data
Priors for high-dimensional regression models
Machine learning models including Bayesian adaptive regression trees and deep learning
Computational techniques for large datasets
Frequentist properties of Bayesian methods

The advanced topics are presented with sufficient conceptual depth that the reader will be able to carry out such analyses and argue the relative merits of Bayesian and classical methods. A repository of R code, motivating data sets, and complete data analyses is made available on the book’s website.

Table of Contents

Preface
1. Basics of Bayesian inference
2. From prior information to posterior inference
3. Computational approaches
4. Linear models
5. Hypothesis testing
6. Model selection and diagnostics
7. Case studies using hierarchical modeling
8. Machine learning
9. Statistical properties of Bayesian methods
Appendices
Bibliography
Index

By Robert H. Shumway, David S. Stoffer

Time Series, 2nd Edition
A Data Analysis Approach Using R

Copyright 2026
Paperback
ISBN 9781041031642
Hardback
ISBN 9781041031611
292 Pages 106 Color & 3 B/W Illustrations
February 9, 2026 by Chapman & Hall

Description

The goal of this second edition is to develop the skills and an appreciation for the richness and versatility of modern time series analysis as a tool for analyzing dependent data. An expanded feature of this edition is the inclusion of many nontrivial data sets illustrating the wealth of potential applications to problems in the biological, physical, and social sciences as well as in economics and medicine.

This edition emphasizes a variety of methodological techniques to illustrate solutions to data analysis problems such as discovering natural and anthropogenic climate change, evaluating pain perception experiments using functional magnetic resonance imaging, and analyzing economic and financial problems.

Key Features:

• Presents a balanced and comprehensive treatment of both time and frequency domain methods with an emphasis on data analysis.

• Detailed R code is included with each numerical example.

• Includes nontrivial data sets.

The book can be used for a one semester/quarter introductory time series course where the prerequisites are an understanding of linear regression, basic calculus-based probability and statistics skills, and math skills at the high-school level. All the numerical examples use the R statistical package without assuming the reader has previously used the software.

Robert H. Shumway was Professor of Statistics, University of California, Davis. He was a Fellow of the American Statistical Association and won the American Statistical Association Award for Outstanding Statistical Application. He was the author of numerous texts and served on editorial boards such as the Journal of Forecasting and the Journal of the American Statistical Association.

David S. Stoffer is Professor Emeritus of Statistics, University of Pittsburgh. He is a Fellow of the American Statistical Association and has won the American Statistical Association Award for Outstanding Statistical Application. He was on the editorial boards of the Journal of Forecasting, the Annals of the Institute of Statistical Mathematics, and the Journal of Time Series Analysis. He served as a Program Director in the Division of Mathematical Sciences at the National Science Foundation and as an Associate Editor for the Journal of the American Statistical Association and the Journal of Business & Economic Statistics. The authors have also published the more advanced Time Series Analysis and Its Applications: With R Examples, Fifth Edition.

Table of Contents

Preface
1. Time Series Elements
2. Correlation and Stationary Time Series
3. Time Series Regression and EDA
4. ARMA Models
5. ARIMA Models
6. Spectral Analysis and Filtering
7. Spectral Estimation
8. Additional Topics*
Appendix A. Probability and Statistics Primer
Appendix B. Complex Number Primer
Hints for Selected Exercises
References
Index

By George Szpiro

The Random Number Code
Unlocking the Secrets of Numbers That You Can't Predict but Can Rely On

Copyright 2026
Hardback
Paperback
ISBN 9781041076445
ISBN 9781041073291
240 Pages 14 B/W Illustrations
February 28, 2026 by Chapman & Hall

Description

Random numbers are immensely important in scientific research, economic decision-making, polling, gaming, cryptography, and algorithm design. Surprisingly, while many popular books in mathematics have been written about prime numbers, or about π, e, and √-1, there are no books for the general reader about random numbers. True, randomness as such has been the subject of several books, but random numbers are mentioned, if at all, only as a by-product. Given the immense theoretical and practical importance of random numbers, this is astonishing.

This book proposes to fill that gap.

The book discusses random numbers under five headings: What are they? What are they good for? How do we produce them? Why do we need them? How do we fake them? The book has been written with a sophisticated general reader in mind, but should be of much interest to students and academics of all levels who have an interest in mathematics and randomness.

Features

Written in an easily readable, conversational style
Aimed at general readers who are interested in mathematics, or who have read books about π, e, √-1, or irrational numbers
Accessible to anybody with a high-school mathematics education

Table of Contents

Introduction
Part I. Random Numbers: What are they? (Chapters 1–4)
Part II. Random Numbers: What are they good for? (Chapters 5–7)
Part III. Random Numbers: How do we produce them? (Chapters 8–11)
Part IV. Random Numbers: Why do we need them? (Chapters 12–16)
Part V. Random Numbers: How do we fake them? (Chapters 17–19)
Epilogue

Edited By Radu V. Craiu, Dootika Vats, Galin Jones, Steve Brooks, Andrew Gelman, Xiao-Li Meng

Handbook of Markov Chain Monte Carlo, 2nd Edition

Copyright 2026
Hardback
ISBN 9781032591575
680 Pages 53 Color & 73 B/W Illustrations
March 6, 2026 by Chapman & Hall

Description

This thoroughly revised and expanded second edition of the Handbook of Markov Chain Monte Carlo reflects the dramatic evolution of MCMC methods since the publication of the first edition. With the addition of two new editors, Radu V. Craiu and Dootika Vats, this comprehensive reference now offers deeper insights into the theoretical foundations and cutting-edge developments that are reshaping the field.

Features:

Completely restructured content with 13 updated chapters from the first edition and 10 entirely new chapters reflecting the latest methodological advances
In-depth coverage of recent breakthroughs in multi-modal sampling, intractable likelihood problems, and involutive MCMC theory
Comprehensive exploration of unbiased MCMC methods, control variates, and rigorous convergence bounds
Practical guidance on implementing MCMC algorithms on modern hardware and software platforms
Cutting-edge material on the integration of MCMC with deep learning and other machine learning approaches
Authoritative treatment of theoretical foundations alongside practical implementation strategies

This essential reference serves statisticians, computer scientists, physicists, data scientists, and researchers across disciplines who employ computational methods for Bayesian inference and stochastic simulation. Graduate students will find it an invaluable learning resource, while experienced practitioners will appreciate its balance of theoretical depth and practical implementation advice. Whether used as a comprehensive guide to current MCMC methodology or as a reference for specific advanced techniques, this handbook provides the definitive resource for anyone working at the intersection of Bayesian computation and modern statistical modeling.

Table of Contents

1 Introduction to MCMC
Charles J. Geyer
2 MCMC using Hamiltonian Dynamics
Radford M. Neal
3 Optimising and Adapting Metropolis Algorithm Proposal Distributions
Jeffrey S. Rosenthal
4 How Many Iterations to Run?
Charles C. Margossian and Andrew Gelman
5 Implementing MCMC: Multivariate Estimation with Confidence
James M. Flegal and Rebecca P. Kurtz-Garcia
6 Importance Sampling, Simulated Tempering, and Umbrella Sampling
Charles J. Geyer
7 Reversible Jump MCMC
Yanan Fan, Scott A. Sisson, and Laurence Davies
8 Perfecting MCMC Sampling: Recipes and Reservations
Radu V. Craiu and Xiao-Li Meng
9 The Data Augmentation Algorithm
Vivekananda Roy, Kshitij Khare, and James P. Hobert
10 Latent Gaussian Models and Computation for Large Spatial Data
Murali Haran, John Hughes, and Ben Seiyon Lee
11 Efficient MCMC in Astronomy
David A. van Dyk, Taeyoung Park, and Hector McKimm
12 Computationally Intensive Inverse Problems
Mikkel B. Lykkegaard, Colin Fox, Dave Higdon, C. Shane Reese, and J. David Moulton
13 MCMC for State Space Models
Paul Fearnhead and Chris Sherlock
II New Chapters
14 MCMC Methods for Multi-modal Distributions
Krzysztof Łatuszyński, Matthew T. Moores, and Timothée Stumpf-Fétizon
15 Algorithms for Models with Intractable Normalizing Functions
Murali Haran, Bokgyeong Kang, and Jaewoo Park
16 Involutive Theory of MCMC
Nathan E. Glatt-Holtz, Andrew J. Holbrook, Justin A. Krometis, Cecilia F. Mondaini, and Ami Sheth
17 Unbiased MCMC
Yves F. Atchadé and Pierre E. Jacob
18 Control Variates for MCMC
Leah South and Matthew Sutton
19 Convergence Bounds for MCMC
Qian Qin
20 Perturbations of Markov Chains
Daniel Rudolf, Aaron Smith, and Matias Quiroz
21 Running MCMC on Modern Hardware and Software
Pavel Sountsov, Colin Carroll, and Matthew D. Hoffman
22 Bayesian Computation in Deep Learning
Wenlong Chen, Bolian Li, Ruqi Zhang, and Yingzhen Li
23 MCMC-driven Learning
Alexandre Bouchard-Côté, Trevor Campbell, Geoff Pleiss, and Nikola Surjanovic


By Olivier Gimenez

Bayesian Analysis of Capture-Recapture Data with Hidden Markov Models
Theory and Case Studies in R and NIMBLE

Copyright 2026
Hardback
ISBN 9781032154237
360 Pages 19 B/W Illustrations
March 30, 2026 by Chapman & Hall

Description

Bayesian Analysis of Capture-Recapture Data with Hidden Markov Models: Theory and Case Studies in R and NIMBLE introduces ecologists and statisticians to a powerful and unifying framework for analysing capture-recapture data. Hidden Markov models (HMMs) have become a cornerstone in modern population ecology, offering a flexible way to decompose complex processes such as survival, recruitment, and dispersal into simpler building blocks, while explicitly accounting for the fact that we only observe imperfect data rather than the true underlying states. Combined with Bayesian inference, HMMs provide a natural and transparent approach to handle uncertainty, explore model structures, and draw robust conclusions. This book illustrates how to bring these ideas to life using the R package NIMBLE, a fast-developing environment for building and fitting hierarchical models.

Key features include:

• A clear introduction to the principles of Bayesian statistics, HMMs, and the NIMBLE package
• Step-by-step tutorials showing how to implement a wide range of capture-recapture models for open populations
• Fully reproducible examples with data and R code, following a “learning by doing” philosophy
• Case studies drawn from the ecological literature, illustrating how to apply methods to real-world conservation questions
• Practical guidance on model specification, coding strategies, and interpretation of results

Written in an accessible style, this book is designed for ecologists, wildlife biologists, and conservation scientists who already use R and wish to deepen their modelling toolkit, as well as statisticians interested in ecological applications. Beginners will find a self-contained path into Bayesian capture-recapture modelling, while experienced researchers will discover a flexible framework to extend and adapt to their own data and questions.

Table of Contents

1. Bayesian statistics & MCMC
2. NIMBLE tutorial
3. Hidden Markov models
4. Alive and dead
5. Sites and states
6. Dealing with covariates
7. Addressing model lack of fit
8. Quantifying life history traits