Set up performance measure tables to track, analyze, and compare specific output values for your model.
Transcript
00:03
Performance measure tables enable you to systematically track, measure, and analyze specific output values for your FlexSim Model.
00:12
Examples of output values include total throughput, average wait time, or maximum queue length.
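Conceptually, measures like these are simple aggregations over simulation records. The sketch below is illustrative only (it is not how FlexSim computes them internally) and uses a hypothetical per-item log of arrival, start, and finish times:

```python
# Illustrative only: computing "total throughput" and "average wait time"
# from a hypothetical per-item log of (arrival_time, start_time, finish_time).
records = [
    (0.0, 0.0, 10.0),
    (2.0, 10.0, 20.0),
    (4.0, 20.0, 30.0),
]

total_throughput = len(records)  # number of items completed
# wait time = time between arriving and starting processing
avg_wait = sum(start - arrival for arrival, start, _ in records) / len(records)

print(total_throughput)  # 3
print(avg_wait)          # (0 + 8 + 16) / 3 = 8.0
```

A performance measure table automates exactly this kind of bookkeeping as the model runs.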
00:19
In this example, FlexSim is open to an already created model.
00:24
To add a Performance Measure Table, in the Toolbox, click Add, and then select Statistics > Performance Measure Table.
00:34
In this case, dock PerformanceMeasures1 next to the Model so that you can view them side-by-side.
00:42
From the Toolbox, you can add multiple tables by right-clicking Performance Measure Tables
00:47
and selecting Add Performance Measure Table or by following the previous steps.
00:53
For this example, use the table you just added.
00:57
Here, in PerformanceMeasures1, each row is a performance measure, and you can use the up or down arrows to adjust the number of rows.
01:07
Keep in mind that each Name must be unique across all performance measure tables for your model.
01:13
For this example, rename PerformanceMeasure1 to “Throughput”.
01:19
Click in the Value column, and then expand the drop-down to access the Value Properties.
01:25
Here, select the Reference Sampler.
01:29
Then, in the Model, click Processor1, and in the menu that opens, select Statistics > Output.
01:40
Back in the table, use the Display Units and Description fields to organize and reference your performance measures.
01:47
For example, enter “items” for the Display Units and “Number of items completed” for the Description.
01:56
Reset and Run the Model.
01:59
When an item moves through the process, the Value in the Throughput field increases.
02:04
Here, you can see that the value first increases to 1, and then to 2 as the simulation is Stepped forward.
02:12
Stop and Reset the simulation.
02:15
The Reference Sampler allows you to sample not only most 3D objects in your Model,
02:20
but also most activities and shared assets in your process flow, groups in your Toolbox, and charts on your Dashboard.
02:28
For example, to review the travel distance of the operator, expand the Value field for PerformanceMeasure2,
02:36
click the Reference Sampler, and then sample Operator1 in the Model.
02:42
In the menu that opens, select Statistics > Travel Distance.
02:49
In the Table, name the performance measure “DistanceTraveled”,
02:54
update the Display Units to “meters” and add a Description of “Distance traveled by operator”.
03:01
Next, sample a Group.
03:05
In the Model shown, the three processors are already part of a group called Processors.
03:11
Expand the Value Properties for PerformanceMeasure3 to select the Reference Sampler.
03:18
Then, in the Toolbox, under Groups, select Processors.
03:25
In the Value Properties, expand the Value field and select State percentage by group.
03:32
Then, set the State to 2 – processing and the Aggregation to Average.
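The "State percentage by group" value with an Average aggregation works out to a simple calculation: each object's percentage of run time spent in the chosen state, averaged across the group. A minimal sketch, using hypothetical state times for the three processors over a 100-second run:

```python
# Hypothetical seconds each processor spent in the "processing" state
# during a 100-second run (values invented for illustration).
processing_time = {"Processor1": 60.0, "Processor2": 45.0, "Processor3": 75.0}
run_time = 100.0

# State percentage per object, then the Average aggregation across the group
percentages = {name: 100.0 * t / run_time for name, t in processing_time.items()}
group_average = sum(percentages.values()) / len(percentages)

print(group_average)  # 60.0
```

Other aggregations (such as Minimum or Maximum) would replace the final averaging step with the corresponding reduction over the same per-object percentages.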
03:40
Back in the Table, name the measure “ProcessTime”,
03:45
set the Display Units to “seconds” and add a Description of “Average time spent processing among group”.
03:52
When you Reset and Run the model, you can see the values update.
03:57
Note that, in this example, the ProcessTime measure tracks the processing time of the processor group shown here,
04:03
while the DistanceTraveled measure tracks the operator in the model above.
04:07
These are just a few examples of performance measurement within your model;
04:12
there is considerable flexibility in how you can set up performance measure tables to track different aspects of your model.
04:19
Keep in mind that performance measurement can require substantial computational capacity, as these values are constantly being updated.
04:27
If you find that your model runtime is slow,
04:31
best practice is to close the Performance Measures pane and reopen it once your model finishes running.
04:37
You can also use performance measures in conjunction with parameters and the experimenter tool
04:42
to systematically track, analyze, and compare key performance metrics across different simulation scenarios.