**1. Introduction**

Nowadays, fundamental and applied research in Condensed Matter Physics relies heavily on the use of large research infrastructures. These include the continuous or spallation neutron sources and the X-ray synchrotron facilities present today worldwide. Sources of neutrons and X-rays are often located at the same geographical site, given the recognized complementarity of these two powerful spectroscopic techniques for the study of matter. This is the case of the European Photon and Neutron (EPN) campus in Grenoble, France, which hosts the Institut Laue-Langevin (ILL) [1] and the European Synchrotron Radiation Facility (ESRF) [2]. Similarly, both neutron and X-ray facilities are hosted by the Rutherford Appleton Laboratory in Oxfordshire, UK (ISIS and Diamond, respectively) [3, 4], by the Paul Scherrer Institut in Villigen, Switzerland (SINQ and SLS, respectively) [5, 6], and soon by the city of Lund, Sweden (ESS and the MAX IV Laboratory, respectively) [7, 8]. Beam time at large-scale facilities is allocated to scientists through a highly competitive proposal selection, carried out by expert panels in a peer-review process. Based on the outcome of this review, the number of days (or even hours) assigned to an experiment is carefully weighed. It is thus readily apparent how critical it is to establish an optimal experimental strategy, one that enables the most informative and precise data to be gathered from an approved measurement. For this purpose, one needs to evaluate not only the ideal number of samples and the related physical and chemical conditions, but frequently (if not always, in neutron scattering experiments) the time needed for a number of ancillary measurements that are mandatory to achieve a clean data set. These include accurate measurements of the resolution function, the background signal, and spurious intensity effects, for which the raw measurement needs to be precisely corrected. Therefore, an optimal use of the beam time assigned to an experiment would greatly benefit from a quantitative criterion for taking sensible decisions during the measurement. Here, we propose a simple method to achieve such a criterion, based on Bayesian statistics and its inferential capabilities [9–11]. In Section 2, we briefly describe an inelastic neutron or X-ray measurement and the main concerns arising when deciding its duration.
In Section 3, we focus on the output of a Brillouin neutron scattering (BNS) experiment, namely the spectrum of the density fluctuations of a system, and we show how a Bayesian approach can be used to model this observable. In the same section, we recall a fundamental property of Bayes' theorem that makes it well suited to recursive use for data analysis purposes. To demonstrate the potential of this approach, we reproduce the results of a typical BNS measurement by generating simulated experimental spectra. We then summarize the results of an on-the-fly modeling of these spectra, which enables us to draw a joint posterior distribution of the adopted model parameters, ultimately guiding the decision on when a spectral acquisition can be conveniently stopped.
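The recursive property of Bayes' theorem mentioned above (the posterior inferred from one batch of data serves as the prior for the next batch, yielding the same result as a single analysis of the pooled data) can be sketched with a minimal, hypothetical example. Here a conjugate Gamma-Poisson model for a counting rate stands in for the full spectral model discussed in Section 3; all function and variable names are illustrative, not part of the method described in this work.

```python
import numpy as np

def update_gamma_poisson(a, b, counts):
    """Conjugate update: Gamma(a, b) prior on a Poisson rate,
    given an array of observed counts, returns posterior (a, b)."""
    counts = np.asarray(counts)
    return a + counts.sum(), b + counts.size

rng = np.random.default_rng(0)
true_rate = 5.0  # counts per channel per time unit (illustrative)
batch1 = rng.poisson(true_rate, size=100)
batch2 = rng.poisson(true_rate, size=100)

# Recursive use: the posterior of batch 1 becomes the prior for batch 2.
a1, b1 = update_gamma_poisson(1.0, 1e-3, batch1)
a_seq, b_seq = update_gamma_poisson(a1, b1, batch2)

# Single analysis of the pooled data with the original prior.
a_all, b_all = update_gamma_poisson(1.0, 1e-3,
                                    np.concatenate([batch1, batch2]))

# The two routes give identical posteriors.
assert (a_seq, b_seq) == (a_all, b_all)
print(a_seq / b_seq)  # posterior mean of the rate
```

In this toy setting the equivalence is exact; for the non-conjugate models of real BNS spectra the same principle applies, but the update is carried out numerically.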

Such a running analysis establishes the premises for developing a Measurement Integration Time Optimizer (MITO), a computational tool to assist scattering experiments at large-scale research facilities. In Section 4, we briefly mention those aspects of the described approach that deserve attention or caution; finally, in Section 5, conclusions and possible perspectives are outlined.
