**3. Implications for FSS Design and Research: Putting Theory into Practice**

In conjunction with the decision maker, DSS have been shown to generate better decisions than humans alone by supplementing the decision makers' abilities [56], aiding one or more of the phases of intelligence, design, and choice in decision making [57], facilitating problem solving, assisting with unstructured or semi-structured problems [58-59], providing expert guidance [60], and managing knowledge. Our discussion above raises additional issues pertinent to FDSS design, with emphasis on overcoming inefficiencies such as bias, irrationality, sub-optimization, and over-simplification that underlie judgmental adjustments. Since a growing body of research is focusing attention on specific DSS features such as information presentation, model building and generation, and integration of dynamic knowledge, in this section we view DSS design from the perspective of making directive and non-directive changes in forecaster behavior regarding the application of adjustments. Such behavioral changes can be brought about in two ways: (a) by guiding and correcting forecaster behavior during task structuring and execution and (b) by encouraging evaluative analysis of decision processes through structured learning [61].

Our empirical research has raised two key observations related to forecaster behavior and implications for FSS design:

**I.** Forecasters will make adjustments to forecasts even when provided highly accurate forecasts. However, the direction and magnitude of these adjustments may be defined by the complexity of the forecasting tasks. Considering this, FSS should offer system features in congruence with adjustment behaviors.

**II.** Design of FSS must necessarily factor in, and adapt to, forecasting task complexity.

Elaborating on these findings, we make several propositions for FSS design in the next few sections.

#### **3.1. Design FSS that Adapt to Task Complexity**

For years, DSS designers have proposed designing systems that adapt to decision makers [62-63] and align with their natural thinking. Adaptive DSS support judgment by adjusting to the high-level cognitive needs of decision makers, the context of decision making, and task characteristics [64]. The FTTF framework proposed in this paper provides a task-based approach to such adaptive systems. As a time series is initially input into the FSS, automated feature detection routines can categorize the time series along the simple-to-complex continuum. Task profiles gathered in this way could be used to customize levels of *restrictiveness* and *decisional guidance* for simple versus complex tasks.

**•** *Practical Proposition 7:* Adjustments to FDSS-generated forecasts for simple series will harm forecast accuracy.

**•** *Practical Proposition 8:* Adjustments to FDSS-generated forecasts for complex series, if executed correctly, can improve forecast accuracy.

As a caveat to the last proposition above, judgmental adjustments to complex forecasts may be best supported by FDSS in a way that the adjustments are structured [53] and validated automatically through improvements in forecast accuracy [35, 39]. In the following sections, we rely on the TTF framework and other DSS studies to propose ways in which FDSS could be best designed to adaptively support simple to complex tasks.


182 Decision Support Systems


Restrictiveness is the "degree to which, and the manner in which, a DSS limits its users' decision making process to a subset of all possible processes" [65, p. 52]. For example, a DSS may restrict access to certain data sets or the ability to make judgmental inputs and adjustments to the system. Restrictiveness can be desirable when the intention is to limit harmful decision choices and interventions. However, the general IS literature has largely recommended limited use of restrictive features in DSS [1, 61, 65-66]. Excessive restrictiveness can result in user frustration and system disuse [65, 67]. It can also be difficult for the designer to determine *a priori* which decision processes will be useful for a particular situation [1]. However, when users are poorly trained [1], known to make bad decision choices, or when underlying conditions are stable, restrictive DSS features can be beneficial.
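As a minimal sketch of how such conditional restrictiveness might be encoded, the rule below exposes the full repertoire of decision processes only to trained users facing unstable conditions, and otherwise limits users to processes vetted as safe. The process names, the per-process "safe" flag, and the rule itself are hypothetical design choices, not taken from [65]:

```python
def permitted_processes(processes, user_trained, conditions_stable):
    """Illustrative restrictiveness rule. `processes` is a list of
    (name, safe) pairs; `safe` marks processes vetted against harmful
    interventions. All names and flags are hypothetical."""
    if user_trained and not conditions_stable:
        # Trained users in a changing environment get the full repertoire.
        return [name for name, safe in processes]
    # Poorly trained users, or stable conditions: restrict to the vetted subset.
    return [name for name, safe in processes if safe]
```

For example, an untrained user under stable conditions would see only the vetted processes, reflecting the conditions under which the literature finds restriction beneficial.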

Decisional guidance, "the degree to which, and the manner in which, a DSS guides its users in constructing and executing the decision-making processes by assisting them in choosing and using its operators" [65, p. 57], can be informative or suggestive. *Informative guidance* provides factual and unbiased information, such as visual or text-based displays of data, thereby empowering the user to choose the best course of action. *Suggestive guidance*, on the other hand, recommends an ideal course of action to the user, such as by comparing available methods and recommending the one deemed most suited to the task at hand. [1] also provide an excellent and extensive review of decisional guidance features for FSS that we highly recommend. To complement their recommendations, in the next few paragraphs we provide additional design guidelines emergent from the theme of this study.

*A.1 Restrict Where Harmful Judgment can be Applied:* When unrestricted, forecasters are free to apply adjustments at many levels in the forecasting process, such as toward data to be used or excluded, models to be applied and those to be ignored, and changes to decision outcomes. Similarly, as we demonstrated in our Study 2 [52], inexperienced forecasters may attempt to overcome their limited knowledge of underlying decision processes by making adjustments to the final outcomes [1]. FSS can restrict where such judgmental adjustments are permitted. Specifically, judgment is best utilized as input into the forecasting process or within the context of a validated knowledge base rather than as an adjustment to the final decision outcome [55].
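A.1 might be enforced at the FSS interface itself, as in this toy sketch where judgmental knowledge is accepted as an input before model fitting but direct adjustment of the final outcome is refused. The class and method names are illustrative, not an actual FSS API:

```python
class ForecastSession:
    """Toy FSS interface illustrating A.1: judgment enters as *input*
    to the forecasting process, never as an edit to the final outcome."""

    def __init__(self):
        self.judgmental_inputs = {}

    def add_judgmental_input(self, name, value):
        # Permitted: domain knowledge recorded before forecasts are produced,
        # where it can be validated against the knowledge base.
        self.judgmental_inputs[name] = value

    def adjust_final_forecast(self, delta):
        # Restricted: direct changes to the decision outcome are blocked.
        raise PermissionError(
            "Adjust inputs or the knowledge base, not the final forecast.")
```

Attempting `adjust_final_forecast` raises an error, steering the forecaster back to the permitted intervention points.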

*A.2 Restrict FSS Display Based on Task Complexity:* Since complex tasks pose significant demands on human cognitive and information processing capabilities, FSS displays for such tasks can be restricted, as opposed to simple tasks that can benefit from decisional guidance. Since simple tasks create lower cognitive strain, performance on such tasks can potentially be improved by increasing user awareness of forecasting cues, such as by displaying features underlying the time series, generating processes, forecasts from alternative methods, and forecasting knowledge underlying the final forecasts. For instance, [49] found that making available the long-term trend of a time series improved forecasters' accuracy, since it allowed them to overlook distracting patterns and apply knowledge more consistently.
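A.2 could be realized by assembling the display payload conditionally on the task profile, as in this sketch. The payload keys and cue names are hypothetical display elements, not from the studies cited:

```python
def display_payload(complexity, series_info):
    """Sketch of A.2: richer decisional cues for simple tasks, a
    restricted display for complex ones. Keys are hypothetical."""
    payload = {
        "series": series_info["values"],
        "system_forecast": series_info["forecast"],
    }
    if complexity == "simple":
        # Low cognitive strain: surface extra cues such as the long-term
        # trend and forecasts from alternative methods (cf. [49]).
        payload["long_term_trend"] = series_info.get("trend")
        payload["alternative_forecasts"] = series_info.get("alternatives", [])
    return payload
```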


Designing Effective Forecasting Decision Support Systems: Aligning Task Complexity and Technology Support

http://dx.doi.org/10.5772/51255



As decision makers have a tendency to trade off accuracy in favor of cost efficiency, *informative* and *suggestive* guidance could be displayed prominently such that the forecaster does not have to drill down to make such trade-off decisions [68]. However, this same information presented to the forecaster for complex tasks can result in greater information overload, cognitive strain, and over-reaction. Indeed, [69] confirm that in complex task settings, decision makers tended to ignore suggestive advice and focused on informative guidance. To reduce this cognitive load, several of the features discussed above could be hidden and made available as layered drill-down options. Such adaptive support can reduce information overload and related information processing challenges in the context of complex tasks [66], and is replicable across different contexts and organizational settings.
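The layered drill-down idea might look like the following sketch, where only essential cues stay at the top level for complex tasks and the rest move behind drill-down layers. The per-feature "essential" annotation is an assumed design input, not part of the cited studies:

```python
def layout_features(features, complexity):
    """Sketch of layered drill-down. `features` is a list of
    (name, essential) pairs; names and flags are illustrative."""
    if complexity == "simple":
        # Simple tasks: display everything prominently.
        return {"top_level": [name for name, _ in features], "drill_down": []}
    # Complex tasks: hide non-essential cues behind drill-down layers.
    return {
        "top_level": [name for name, essential in features if essential],
        "drill_down": [name for name, essential in features if not essential],
    }
```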

*A.3 Provide and Adapt Task Decomposition According to Task Complexity*: An individual decision maker's working memory is limited; consequently, complex tasks broken into simple "chunks" can be more effectively executed than tasks not so simplified [12]. Cognitive overload may be avoided through effective and efficient design materials [44], ranging from better information presentation to providing greater structure to the learning environment [70], such as through use of decomposition strategies to simplify the subject domain. Decomposition is found to improve performance over unaided and intuitive judgment [71-72] by breaking down a complex, holistic task into a set of easier tasks which are more accurately executed than the more holistic task [1]. Others [73] also found that DSS users were able to leverage more information when they used decomposition for forecasting tasks. While there are neurological explanations for why decomposition is effective [74-75], from a psychological perspective, decomposition allows the decision maker to partition the problem solving domain into manageable chunks so that information processing for each chunk can be minimal and relevant while cognitive overload is minimized [70, 76-77].

Although it can be argued that decomposition can be a restrictive DSS feature when its use is forced upon the decision maker [1], most often a user may not focus on the benefits of decomposing a task or may not recognize how to proceed with decomposition. To this end, we suggest that decomposition be implemented in both restrictive and decisional guidance modes. Specifically, we use the framework of [12], which suggests that decomposition can be applied at three levels: *decomposition via transformation*, i.e. identifying characteristics of the forecasting task and domain; *decomposition for simplification*, i.e. understanding components of the forecasting process from problem formulation to forecast use [2]; and *decomposition for method selection*, i.e. applying forecasting knowledge and rules to selecting fitting methods. Herein, we propose that *transformational decomposition* should be a restrictive feature in FSS. This decomposition of a time series into its features can enhance forecaster ability to recognize meaningful patterns as opposed to random ones.

In the same vein, *simplification* of the problem domain could follow restrictive design by using the forecasting process presented in Figure 1 to design FSS modules. In such a design, then, the flow of activities presented in Figure 1 could be used to guard against overly rapid convergence on forecast methods and use. In contrast, the evaluative component of this process can lend itself to decisional guidance in numerous ways discussed later in this section. Decomposition by simplification can also be implemented by narrowing task demands for complex decisions. For instance, [49] recommend that forecasts should not be required for multiple time periods because forecasters tend to anchor long-term forecasts to short-term forecasts. Our data indirectly suggest that complex series generate higher errors, and such anchoring and adjustment can compound errors across the long term.

Finally, *decomposition for method selection* could largely be implemented as decisional guidance. Users may be prompted with forecasts from multiple relevant methods (selected using rules applied to time series features) to consider use of alternative methods and processes. Suggestive guidance on how to proceed with method selection and combination could be useful for simple tasks.
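The pairing of [12]'s three decomposition levels with support modes proposed above can be summarized in a small lookup. The encoding is ours and purely illustrative of the design stance, not a prescribed implementation:

```python
# Proposed pairing of decomposition levels (after [12]) with FSS support
# modes, as argued in this section. The dict encoding is illustrative.
DECOMPOSITION_MODES = {
    "transformation": "restrictive",            # always decompose series into features
    "simplification": "restrictive",            # enforce the Figure 1 process flow
    "method_selection": "decisional_guidance",  # prompt with candidate methods
}

def support_mode(level):
    """Return the support mode proposed for a decomposition level."""
    return DECOMPOSITION_MODES[level]
```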

As decision situations become complex, guidance may need to be reduced to minimal levels, as such situations are already characterized by information overload. Adding suggestive guidance to this mix can lead to the FSS itself complicating the decision situation. Forecasters may become increasingly frustrated with interventions from such guidance and consequently engage in deleterious decision making behaviors. These suggestions are supported by [69], who found that for highly complex tasks, subjects who were provided with suggestive guidance performed poorly when compared to those who were provided informational guidance or no decision support. Specifically, we suggest that for complex tasks, informational guidance be provided such that users can determine the best strategy on their own or ignore the additional information as desired.
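This guidance policy reduces to a simple rule, sketched below. The two-level complexity input mirrors our simple-complex continuum; the boolean encoding is ours:

```python
def guidance_plan(complexity):
    """Sketch of the guidance policy argued above: simple tasks get both
    informative and suggestive guidance; complex tasks get informative
    guidance only, which the forecaster is free to ignore (cf. [69])."""
    return {
        "informative": True,                       # always available
        "suggestive": complexity == "simple",      # withheld for complex tasks
    }
```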

**Figure 1.** Components of the Forecasting Process as Presented in [2].


*A.4 Provide In-Task Feedback for Simple Tasks and Shift to Post-Task Feedback on Complex Tasks:* Feedback is intended to promote learning and behavior modification, with the assumption that organizational practices encourage such review. Broadly speaking, evaluative feedback can be offered to forecasters at two stages – *during task execution* and *post task execution* – the former being critical to effective forecasting and the latter being beneficial for fostering reflection and learning [1]. Suggestive and informational feedback regarding the impact of their current actions on other aspects of the forecasting environment may limit the extent to which a series of poor adjustments is executed. However, feedback during execution of complex tasks can frustrate the user. Forecasters facing complex tasks may not have the time or cognitive resources to reflect adequately upon the impact of their adjustments on the environment [78] and consequently fail to consider control actions that can impact the forecasting environment. Indeed, corrective process-based feedback has been found to be transient and shallow [79-80] and to contribute inadequately to long-term behavior modification [81].

To this end, FSS developers may primarily focus on post-execution feedback for complex tasks. Post-task feedback has been found to improve decision quality [82] and attainment of challenging goals [83], particularly when the feedback is informative [69]. Further, [1] suggest four forms of post-task feedback: *outcome feedback*, the result of outcomes from the forecasting task; *performance feedback*, assessment of performance such as forecast accuracy; *cognitive process feedback*, effectiveness of the forecasting process deployed; and *task properties feedback*, information about the task, e.g. presence of conflicting underlying series. Considering that the intention of post-execution feedback is to foster learning, providing informative guidance on the above aspects, complemented with the ability to drill down to the suggestive components, may be most beneficial to forecasters.
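A post-task report covering [1]'s four feedback forms might be assembled as follows. The absolute-error measure and the field names are illustrative choices of ours, not prescriptions from [1]:

```python
def post_task_feedback(actual, system_forecast, adjusted_forecast, task_notes):
    """Sketch of a post-execution report covering the four feedback
    forms suggested by [1]. Error measure and keys are illustrative."""
    system_error = abs(actual - system_forecast)
    adjusted_error = abs(actual - adjusted_forecast)
    return {
        # Outcome feedback: what actually happened.
        "outcome": {"actual": actual},
        # Performance feedback: accuracy of system vs adjusted forecast.
        "performance": {"system_error": system_error,
                        "adjusted_error": adjusted_error},
        # Cognitive process feedback: did the judgmental adjustment help?
        "process": {"adjustment_helped": adjusted_error < system_error},
        # Task properties feedback: information about the task itself.
        "task_properties": task_notes,
    }
```

A report like this makes the consequences of an adjustment explicit after execution, supporting reflection without interrupting a complex task in flight.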


Simple tasks, in contrast, do not require the same level of feedback and support as complex tasks. Moreover, these tasks are cognitively less demanding. Consequently, in-task feedback may not be detrimental and may be designed to provide the user with guidance, such as by displaying features of the time series and discussing their impact on forecasts, providing the original series contrasted with series that have been cleansed of distracting features such as outliers and irrelevant early data, and providing forecasting guidance in the form of rules and relevant methods. As a case in point, RBF rules that pertain to a specific set of features present in the task being executed could be displayed such that the user can recognize the knowledge that has gone into generating the forecast.

*A.5 Restrict Data and Models According to Task Complexity:* Restrictiveness may be relaxed for simpler tasks by increasing the range of available data and models. FSS can shift to making some desirable processes easy to use while making other, less desirable alternatives more difficult [1]. Automating and thereby simplifying the application of desirable strategies can serve to reduce the effort associated with executing the more desirable strategies [84] and thereby reduce the need for making damaging judgmental adjustments to the decision process [13].

*A.6 Restrict to Impose Standards and Best Practices:* Finally, restrictions can be applied when certain organizational best practices and standards need to be applied in the forecasting process. For instance, a critical issue in supply chain forecasting is an escalation of forecasting adjustments as a forecast moves down the supply chain, thereby contributing to the bullwhip effect [85]. Embedding restraints in the forecasting system that contain the magnitude and directionality of adjustments may potentially reduce the risks associated with overcompensating for each element of the supply chain. This is particularly true for complex data, where forecasters may overemphasize random patterns, or simple series, where forecasters may want to overcompensate for seemingly aggressive forecasts. These restraints may take the form of boundaries or confidence intervals which adapt to the nature of the complexity being presented to the forecaster.

#### **3.2. Design FDSS to Increase Forecaster Confidence**

Earlier, we discussed judgmental adjustments as a mechanism for forecasters to develop ownership of the forecasts. If FSS can be designed with features that enhance forecaster confidence in its abilities, possibly the compulsion to make judgmental adjustments may be mitigated. Most studies have focused on DSS use and satisfaction and suggested user attitudes towards DSS and their satisfaction with DSS as indicators of DSS use [86-87]. However, our concern in this paper extends beyond use, since forecasters may use an FSS to generate forecasts and still make judgmental adjustments. Confidence in the system can be enhanced by making its abilities transparent to the forecaster, that is, by making the FDSS and its features fully disclosed [35]. Furthermore, a well-validated FDSS that has demonstrated stability across time and multiple data sets can potentially improve confidence [88]. This validation is particularly simple to implement in FDSS due to the well-defined and universally accepted success measure, forecast accuracy. Confidence in an FDSS may also be enhanced by highlighting the credibility of the knowledge underlying it. When transparent to forecasters, use of expert knowledge, empirically validated findings, and methodical calibrations can potentially enhance forecaster confidence in system abilities, and thereby mitigate the need for adjustments. Finally, user involvement in systems design and development has been shown to increase user satisfaction with and commitment to the system and its outcomes [89-91]. For instance, [92] found that forecasters involved in defining features of the FSS, such as display and models, indicated greater satisfaction with FSS forecasts, even though their overall accuracy was lower than those who were constrained in their involvement.

#### **3.3. Implications for Practical Design Research**

In the sections above, we have offered numerous suggestions regarding FSS design. While some of these have been researched and validated, most require further research attention, particularly in light of the simple-complex task classification that forms the foundation of our paper. To this end, we first suggest that our proposed task classification be tested on a broader time series base to determine (a) whether the application of this framework is generalizable to a larger set of time series, and (b) whether the patterns of judgmental performance and adjustments we observed across the two studies [52] hold ground in a larger context. If our results are proven across a broader base, implications for FSS design are numerous in terms of the recommendations addressed earlier.

Beyond confirmation of the FTTF framework, there are numerous opportunities for examining FSS design issues. Most importantly, our proposition has been that FSS should be designed not only to enhance forecaster support for task execution but also to promote effective behavior modification during and post execution. Since such learning and modification will occur over long-term system utilization, features supporting feedback and learning in FSS should be considered early in the design process. This has implications for finding the ideal balance between restrictive and decisional guidance features and identifying the decision making stage to which these are best applied. As [69] suggest, increased decisional guidance during problem formulation can have an adverse effect on judgmental task performance, but providing feedback at the right opportunity can improve performance. In response, much research is required to identify aspects of forecaster behavior that are amenable to behavior modification and those that are not, the nature of desirable support, and the stage of the forecasting process where these support features are best applied.

To this end, FSS developers may primarily focus on post-execution feedback for complex tasks. Post-task feedback has been found to improve decision quality [82] and attainment of challenging goals [83], particularly when the feedback is informative [69]. Further, [1] sug‐ gest four forms of post-task feedback: *outcome feedback*, result of outcomes from the forecast‐ ing task; *performance feedback*, assessment of performance such as forecast accuracy; c*ognitive process feedback,* effectiveness of forecasting process deployed; *task properties feedback,* infor‐ mation about the task e.g. presence of conflicting underlying series. Considering that the in‐ tention of post-execution is to foster learning, holistic learning is possible for instance, by providing informative guidance on the above aspects complemented with the ability to drill

Simple tasks, in contrast, do not require the same level of feedback and support as complex tasks. Moreover, these tasks are cognitively less demanding. Consequently, in-task feedback may not be detrimental and may be designed to provide the user with guidance such as by displaying features of the time series and discussing their impact on forecasts, providing original series contrasted with series that have been cleansed of distracting features such as outliers and irrelevant early data, and providing forecasting guidance in form of rules and relevant methods. As a case in point, RBF rules that pertain to a specific set of features present in the task being executed could be displayed such that the user can recognize the

*A.5 Restrict Data and Models According to Task Complexity:* Restrictiveness may be relaxed for simpler tasks by increasing the range of available data and models. An FSS can make some desirable processes easy to use while making other, less desirable alternatives more difficult [1]. Automating, and thereby simplifying, the application of desirable strategies can reduce the effort associated with executing them [84] and thereby reduce the need for damaging judgmental adjustments to the decision process [13].

*A.6 Restrict to Impose Standards and Best Practices:* Finally, restrictions can be applied when organizational best practices and standards must be enforced in the forecasting process. For instance, a critical issue in supply chain forecasting is the escalation of adjustments as a forecast moves down the supply chain, contributing to the bullwhip effect [85]. Embedding restraints in the forecasting system that contain the magnitude and direction of adjustments may reduce the risks of overcompensating at each stage of the supply chain. This is particularly true for complex data, where forecasters may overemphasize random patterns, or for simple series, where forecasters may want to overcompensate for seemingly aggressive forecasts. These restraints may take the form of boundaries or confidence intervals that adapt to the nature of the complexity being presented to the forecaster.
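One way to embed such a restraint is to clamp the adjusted forecast to an interval around the system forecast whose width tracks the noise in the series, so that noisier (more complex) series permit less aggressive overrides relative to their noise level. A minimal sketch; the function name, the normal-interval multiplier, and the example values are illustrative assumptions:

```python
def clamp_adjustment(system_forecast, adjusted_forecast, residual_sd, z=1.96):
    """Limit a judgmental adjustment to an interval around the system forecast.

    The interval width adapts to the noise level of the series
    (residual_sd). The 1.96 multiplier, an approximate 95% normal
    interval, is an illustrative choice.
    """
    lower = system_forecast - z * residual_sd
    upper = system_forecast + z * residual_sd
    return min(max(adjusted_forecast, lower), upper)

# A forecaster tries to push a forecast of 100 up to 130 on a noisy series;
# the restraint caps the override at the upper boundary of the interval:
final = clamp_adjustment(100.0, 130.0, residual_sd=10.0)
```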

#### **3.2. Design FDSS to Increase Forecaster Confidence**

Earlier, we discussed judgmental adjustments as a mechanism for forecasters to develop ownership of forecasts. If an FSS can be designed with features that enhance forecaster confidence in its abilities, the compulsion to make judgmental adjustments may be mitigated. Most studies have focused on DSS use and satisfaction, suggesting user attitudes toward a DSS and satisfaction with it as indicators of DSS use [86-87]. However, our concern in this paper extends beyond use, since forecasters may use an FSS to generate forecasts and still make judgmental adjustments. Confidence in the system can be enhanced by making its abilities transparent to the forecaster, that is, by fully disclosing the FDSS and its features [35]. Furthermore, a well-validated FDSS that has demonstrated stability across time and multiple data sets can improve confidence [88]. Such validation is particularly simple to implement in an FDSS because of its well-defined and universally accepted success measure: forecast accuracy. Confidence in an FDSS may also be enhanced by highlighting the credibility of the knowledge underlying it. When transparent to forecasters, the use of expert knowledge, empirically validated findings, and methodical calibrations can enhance forecaster confidence in system abilities and thereby mitigate the need for adjustments. Finally, user involvement in systems design and development has been shown to increase user satisfaction with, and commitment to, the system and its outcomes [89-91]. For instance, [92] found that forecasters involved in defining FSS features such as displays and models reported greater satisfaction with FSS forecasts, even though their overall accuracy was lower than that of forecasters whose involvement was constrained.
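For illustration, stability across time and data sets could be demonstrated to forecasters with a rolling-origin accuracy report. This sketch assumes a one-step-ahead naive benchmark; the data set names and values are invented:

```python
from statistics import mean

def rolling_origin_mape(series, forecast_fn, min_train=4):
    """Rolling-origin evaluation: forecast one step ahead from each origin."""
    errors = []
    for t in range(min_train, len(series)):
        fc = forecast_fn(series[:t])
        if series[t] != 0:
            errors.append(abs(fc - series[t]) / abs(series[t]))
    return 100 * mean(errors)

def naive(history):
    """One-step-ahead naive benchmark: repeat the last observed value."""
    return history[-1]

# A stability report across several data sets (values are illustrative):
datasets = {
    "retail": [12, 13, 13, 14, 15, 15, 16],
    "energy": [100, 98, 101, 99, 100, 102, 101],
}
report = {name: round(rolling_origin_mape(s, naive), 1)
          for name, s in datasets.items()}
```

Publishing such a report inside the FSS makes the system's track record, and hence its credibility, visible to the forecaster.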

**3.2. Design FDSS to Increase Forecaster Confidence**

In the sections above, we have offered numerous suggestions regarding FSS design. While some of these have been researched and validated, most require further research attention, particularly in light of the simple-complex task classification that forms the foundation of our paper. To this end, we first suggest that our proposed task classification be tested on a broader time series base to determine (a) whether the application of this framework generalizes to a larger set of time series, and (b) whether the patterns of judgmental performance and adjustments we observed across the two studies [52] hold in a larger context. If our results hold across a broader base, the implications for FSS design are numerous in terms of the recommendations addressed earlier.

Beyond confirmation of the FTTF framework, there are numerous opportunities for examining FSS design issues. Most importantly, our proposition has been that FSS should be designed not only to enhance forecaster support during task execution but also to promote effective behavior modification during and after execution. Because such learning and modification occur over long-term system utilization, features supporting feedback and learning should enter the FSS design process early. This has implications for finding the ideal balance between restrictive and decisional guidance features and for identifying the decision-making stage to which each is best applied. As [69] suggest, increased decisional guidance during problem formulation can have an adverse effect on judgmental task performance, whereas feedback provided at the right opportunity can improve it. Accordingly, much research is required to identify the aspects of forecaster behavior that are amenable to modification and those that are not, the nature of desirable support, and the stage of the forecasting process where these support features are best applied.
