Defaults for Evaluating Objectives (Turn ON/OFF Objectives Evaluation, Order of Evaluation within the Hierarchy, and Default Pairwise Options)
Evaluate Objectives
The default is to evaluate objectives. However, a Project Manager who wants to conduct the evaluation in stages over a period of time can turn off the evaluation of objectives and evaluate only alternatives during one of those stages (this applies to both AnyTime and TeamTime evaluations).
Order for evaluating within the objectives hierarchy
When there is more than one level of objectives, as is typical for important decisions, it is customary to proceed top-down: first evaluating the relative importance of the main objectives, then the relative importance of the sub-objectives with respect to their parent objectives, and so on. However, for reasons similar to those discussed above (where it is recommended to evaluate bottom-up, that is, alternatives before objectives), it is also recommended to evaluate the levels of the objectives hierarchy bottom-up. Doing so gives evaluators a better idea of the significance of the elements contained within the higher-level objectives when those objectives are evaluated.
One or All pairs on the display
When prioritizing Objectives, the Project Manager can choose to display either one pair or all pairs of Objectives on each screen.
Multi-pairwise evaluation is only applicable for AnyTime Evaluation.
Number of diagonals (Trade-off between accuracy and number of comparisons)
These options apply to the number of pairwise comparisons to be made within each cluster of elements. Let's consider an example of a cluster with five elements, A, B, C, D, and E:
The non-dark cells in the following figure illustrate all possible ((5 * 4)/2 = 10) pairwise comparisons for a cluster of five elements.
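As a quick cross-check of that count, here is a minimal sketch in Python (not part of the product) that enumerates every possible pairwise comparison for the five-element example:

```python
from itertools import combinations

elements = ["A", "B", "C", "D", "E"]

# Each unordered pair of elements corresponds to one pairwise comparison.
all_pairs = list(combinations(elements, 2))

n = len(elements)
print(all_pairs)                         # ('A', 'B'), ('A', 'C'), ..., ('D', 'E')
print(len(all_pairs), n * (n - 1) // 2)  # 10 10
```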
The most accurate results are achieved with the first option above, but at the expense of more time. If the number of elements in a cluster is small, this option provides the most redundancy and hence the most accurate priorities.
Choosing the first and second diagonals in the above example would entail 4 + 3 = 7 judgments. This includes 3 "redundant" judgments (since at least 4 judgments are required for a spanning set) and would be reasonable even if verbal judgments were made.
Choosing the minimum number of comparisons is not recommended unless pairwise graphical judgments are made and you have confidence in the accuracy of each of the judgments.
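To make the diagonal counts concrete, the sketch below assumes the k-th diagonal pairs each element with the element k positions after it in the cluster's order (the usual reading of the comparison matrix). The first diagonal alone already forms the minimum spanning set of 4 judgments; adding the second diagonal gives the 4 + 3 = 7 judgments mentioned above.

```python
def diagonal(elements, k):
    """Pairs each element with the element k positions after it (the k-th diagonal)."""
    return [(elements[i], elements[i + k]) for i in range(len(elements) - k)]

elements = ["A", "B", "C", "D", "E"]

first = diagonal(elements, 1)   # minimum spanning set: 4 judgments
second = diagonal(elements, 2)  # 3 additional (redundant) judgments

print(first)                     # [('A', 'B'), ('B', 'C'), ('C', 'D'), ('D', 'E')]
print(second)                    # [('A', 'C'), ('B', 'D'), ('C', 'E')]
print(len(first) + len(second))  # 7
```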
Force most comparisons
A combination of choosing the first and second diagonals, along with forcing most comparisons when there are fewer than 5 or 6 elements (you can specify how many), is recommended if: 1) some clusters have many elements and would require a large number of judgments for all diagonals; and 2) some clusters have just a few elements, for which more accurate priorities would result from making judgments for all diagonals.
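As a rough sketch of how that combination behaves (the threshold of 6 and the selection rule here are illustrative assumptions, not the product's actual implementation), small clusters fall back to all comparisons while larger clusters use only the first two diagonals:

```python
def pairs_to_judge(elements, force_threshold=6):
    """Illustrative rule only: clusters below the threshold get every comparison;
    larger clusters get just the first and second diagonals."""
    n = len(elements)
    if n < force_threshold:
        return [(a, b) for i, a in enumerate(elements) for b in elements[i + 1:]]
    # First and second diagonals: pair each element with the next one and the one after that.
    return [(elements[i], elements[i + k]) for k in (1, 2) for i in range(n - k)]

print(len(pairs_to_judge(list("ABCD"))))      # 6  -> all comparisons (small cluster)
print(len(pairs_to_judge(list("ABCDEFGH"))))  # 13 -> first + second diagonals (7 + 6)
```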
Pairwise Type (Verbal or Graphical)
You can choose to make the pairwise comparisons either graphically or verbally. Graphical judgments, being ratio-scale measures to begin with, generally produce more accurate results and require less redundancy than verbal judgments. There are circumstances, however, where verbal judgments might be more appropriate. If you do choose verbal judgments as the default, we recommend also specifying that graphical judgments be used when there are only 2 or 3 elements in a cluster, since there are not enough redundant judgments to expect accurate ratio-scale priorities from the ordinal verbal judgments.
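For background, verbal judgments are captured as ordinal intensity labels, while a graphical judgment is entered directly as a ratio. The mapping below is the classic AHP (Saaty) verbal scale, shown only to illustrate why verbal judgments are coarser and benefit from redundancy; it is not necessarily the product's internal conversion.

```python
# Classic AHP (Saaty) verbal scale: ordinal labels mapped to numeric intensities.
# Illustration only; the product's internal conversion may differ.
VERBAL_SCALE = {
    "Equal": 1,
    "Moderate": 3,
    "Strong": 5,
    "Very Strong": 7,
    "Extreme": 9,
}

# A graphical judgment is already a ratio: dragging the bar so that A is judged
# 2.4 times as preferable as B records the value 2.4 directly.
graphical_judgment = 2.4
verbal_judgment = VERBAL_SCALE["Moderate"]  # coarser: recorded as 3
```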
The pairwise options specified here are the defaults when evaluating objectives with respect to the parent objectives for each cluster set to be evaluated using pairwise comparisons. You can specify the pairwise options per cluster on the Advanced Mode of Measurement Methods, which will override the defaults.
Defaults for Evaluating Alternatives (Turn ON/OFF Alternatives Evaluation, Default Measurement Type and Pairwise Options)
Evaluate Alternatives
The default is to evaluate alternatives. However, a Project Manager who wants to conduct the evaluation in stages over a period of time can turn off the evaluation of alternatives and evaluate only objectives.
Default Measurement Type
The covering objectives are the lowest-level nodes in the objectives hierarchy; the alternatives are compared or rated with respect to these "covering objectives." This option sets the default measurement type for comparing or rating the alternatives with respect to the covering objectives: either pairwise comparisons or the default rating scale.
Note: The measurement type can be changed from this default for any or all covering objectives in Measurement Methods for Alternatives. In addition to pairwise comparisons or ratings, other options include utility curves and step functions.
- When creating a new model, the template chosen will set the above option automatically.
- This option applies to covering objectives as they are added to the model. If the option is changed and new covering objectives are then added, the new setting takes effect for those newly added covering objectives (see the sketch below).
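The sketch below illustrates the defaulting behavior described in the notes above; the class and attribute names are hypothetical, not the product's actual API. Each covering objective captures the default measurement type in force when it is added, and a later change to the default affects only covering objectives added afterwards.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CoveringObjective:
    name: str
    measurement_type: str  # e.g. "pairwise" or "ratings"

@dataclass
class Model:
    default_measurement_type: str = "pairwise"
    covering_objectives: List[CoveringObjective] = field(default_factory=list)

    def add_covering_objective(self, name: str) -> None:
        # The default in force *now* is captured for the new covering objective.
        self.covering_objectives.append(
            CoveringObjective(name, self.default_measurement_type))

model = Model(default_measurement_type="pairwise")
model.add_covering_objective("Cost")
model.default_measurement_type = "ratings"   # change the default afterwards
model.add_covering_objective("Safety")
print([(o.name, o.measurement_type) for o in model.covering_objectives])
# [('Cost', 'pairwise'), ('Safety', 'ratings')]
```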
Number of Evaluations on each screen
When prioritizing Alternatives using pairwise comparisons, the Project Manager can choose to display either one pair or all pairs of Alternatives on each screen.
When prioritizing Alternatives using ratings, the Project Manager can select from the various options shown above.
Multi-evaluation is only applicable for AnyTime Evaluation.
Number of diagonals (Trade-off between accuracy and number of comparisons)
The option below applies to evaluating the alternatives with pairwise comparisons and is analogous to the corresponding option for pairwise comparisons of objectives.
Force most comparisons
A combination of choosing the first and second diagonals, along with forcing most comparisons if fewer than 5 or 6 elements (you can specify how many), is recommended if: 1) there are some clusters with many elements that would require a large number of judgments for all diagonals; and 2) there are some clusters with just a few elements for which more accurate priorities would result from making judgments for all diagonals.
Pairwise Type (Verbal or Graphical)
This option is also analogous to the corresponding option for pairwise comparisons of objectives.
The pairwise options specified here are the defaults when evaluating alternatives with respect to the covering objectives for each cluster set to be evaluated using pairwise comparisons. You can specify the pairwise options per cluster on the Advanced Mode of Measurement Methods, which will override the defaults.
You can define the order of evaluation from COLLECT INPUT > Set Measurement Options > Judgment Options:
When an individual or small group derives priorities in an AHP model, they can evaluate either top-down (from the goal to the objectives to the alternatives) or bottom-up (from the alternatives to the covering objectives to the top-level objectives).
Assuming evaluators have roles to evaluate both the importance of the objectives and the preferences for the alternatives with respect to the objectives, it is customary to evaluate the objectives first. However, if the evaluators are not familiar with the alternatives, we recommend that they evaluate the alternatives first. Doing so gives them better context (some feedback) for their judgments about the relative importance of the objectives and lessens the amount of iteration that may be required.
The model element terminology for Objectives and Alternatives, both singular and plural, is defined on the DEFINE MODEL > Model Details page.
You can also specify the wording to use during the evaluation, specifically for pairwise comparison (and Rating; see the explanation below) evaluation. This can be found on the COLLECT INPUT > Set Measurement Options > Judgment Options page.
After "Which of the two" is the name of the element being compared. These terminologies (Objectives and Alternatives) are the same and in sync with what's on the Model Wording page (plural). To edit, simply type in the desired wording on the text box
The pairwise evaluation phrase is defined in the second dropdown:
For Alternatives:
Simply select one of the predefined phrases from the dropdown.
Selecting a predefined phrase will apply a similar phrase for Rating evaluation. For example, if you select "is more preferable", the Rating wording will be "Rate the preference".
You can also select --Custom-- and type in a custom phrase (e.g., "is more influential").
Custom wording will not be applicable for Rating evaluation -- the default will be used.
If you want to fully customize the evaluation questions, you can edit them on the evaluation page itself.