**4.1 Binary logistic regression model**

The binary logistic regression technique estimates the probability that a given variable, called the dependent variable, takes a particular value, based on the values of the remaining variables, called independent variables.

In our example, the dependent variable is the excursion: it takes the value 0 if there is no loss of position and the value 1 if an excursion occurs.

The rest of the variables are treated as independent variables, which may be quantitative or categorical. In our example, water depth, percentage of thrusters online, and percentage of generators online are quantitative; the remaining independent variables are categorical. Because of this, the statistical program internally recodes each categorical variable into as many indicator variables as there are categories minus one. For example, Wind sensors has five categories, so the program produces four variables, Windsensors(i), i = 1, 2, 3, 4. These new variables are dichotomous: the value 1 indicates the presence of a category, and the value 0 its absence.
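The recoding above can be sketched in a few lines of Python. This is a minimal illustration, not the internal SPSS procedure; the category labels are hypothetical, and the first category is taken as the reference level, so it is represented by all indicators being 0.

```python
def dummy_code(value, categories, name="Windsensors"):
    """Recode a categorical value into k-1 dichotomous (0/1) indicators.

    The first category in `categories` is the reference level and gets
    no indicator of its own: it corresponds to all indicators being 0.
    """
    return {f"{name}({i})": int(value == cat)
            for i, cat in enumerate(categories[1:], start=1)}

# Hypothetical five-category Wind sensors variable -> four indicators.
wind_categories = ["cat1", "cat2", "cat3", "cat4", "cat5"]
coded = dummy_code("cat3", wind_categories)
# coded has exactly four 0/1 entries, with a single 1 marking "cat3".
```

A case in the reference category ("cat1" here) is coded with all four indicators equal to 0, so no information is lost even though only four variables represent five categories.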

From the values of the independent variables for each case, the statistical program (SPSS in our example) calculates the probability of excursion for that case. Because this probability lies between 0 and 1, values close to 0 indicate a negligible probability of excursion, while values close to 1 indicate a high probability of excursion. Thus, each case is assigned a probability p. This recoding matters when interpreting the regression coefficients, and no information is lost in the process.
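The probability p that the program assigns to each case comes from the logistic function applied to a linear combination of the independent variables. The sketch below shows that calculation directly; the coefficient values are hypothetical, chosen only to illustrate how p is obtained from a fitted model.

```python
import math

def excursion_probability(intercept, coefs, values):
    """Binary logistic model: p = 1 / (1 + exp(-(b0 + sum(bi * xi)))).

    `coefs` and `values` are parallel sequences of coefficients bi and
    predictor values xi (quantitative or 0/1 dummy-coded indicators).
    """
    z = intercept + sum(b * x for b, x in zip(coefs, values))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical fitted coefficients for three predictors.
b0 = -2.0
b = [0.8, -0.5, 1.2]
x = [1.0, 0.0, 1.0]   # one case's predictor values
p = excursion_probability(b0, b, x)
# p always lies strictly between 0 and 1.
```

With this form, a positive coefficient increases z and therefore pushes p toward 1 (excursion more likely), while a negative coefficient pushes p toward 0, which is how the coefficients are interpreted.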
