Open Access

A decision support system for benchmarking the energy and waste performance of schools in Toronto

Environmental Systems Research 2012, 1:5

DOI: 10.1186/2193-2697-1-5

Received: 16 May 2012

Accepted: 14 August 2012

Published: 14 August 2012

Abstract

Background

The Toronto District School Board (TDSB) oversees the largest school district in Canada and has spent more than one third of its annual maintenance budget on energy and waste. This has directed attention toward system-wide reductions to both energy consumption patterns and waste generation rates. In this paper, a decision support system (DSS) that can process unit-incompatible measures is used for rating, ranking, and benchmarking the schools within the TDSB.

Results

The DSS permits the ranking of any set of schools by contextually evaluating their relative attractiveness to other identified school groupings. Consequently, the DSS was used to explicitly rank each school’s performance within the district and to determine realistic energy improvement targets. Achieving these benchmarks would reduce system-wide energy costs by twenty-five percent.

Conclusions

The TDSB study demonstrates that this DSS provides an extremely useful approach for evaluating, benchmarking, and ranking the relative energy and waste performance within the school system, and its potential for much broader application clearly warrants additional exploration.

Keywords

Benchmarking; Performance; Data Envelopment Analysis; Decision Support Systems; Energy & Waste

Background

Representing more than 600 schools, the Toronto District School Board (TDSB) oversees the largest education constituency within Canada. In the 2003 fiscal year, the TDSB allocated $48 million of its annual budget to the energy and waste requirements of the school system, but within three years, found that the energy and waste expenditures had escalated to more than $69 million (Christie [2003, 2007]; Christie & Coppinger [2006]). This rapid forty percent increase in spending during a period of public economic retrenchment necessitated stringent attention toward system-wide reductions to both energy consumption patterns and waste generation rates (Christie [2003, 2007]; Christie & Coppinger [2006]).

While the need to decrease energy consumption and minimize waste was universally acknowledged, the TDSB resolutely believed that, to achieve long-term success, any systematic reduction effort would need to adequately address three critical questions: (i) what initiatives would achieve the most effective results; (ii) how could these “most effective reduction initiatives” actually be identified; and, (iii) how effective would these initiatives prove to be once implemented (Christie [2003])? To implement any proposed reduction scheme effectively, the TDSB further required several “essential components”: (i) the establishment of realistic benchmarks for each school to strive toward; (ii) the setting of achievable annual performance targets for each school; (iii) a commitment to rational and objective management of the system data; and, (iv) an assurance of transparency and neutrality in policy-setting and decision-making (Christie [2003, 2007]; Christie & Coppinger [2006]). Unfortunately, it proves extremely difficult to evaluate system-wide performance and to establish benchmarks when multiple incommensurate criteria measurements are present (Camp [1995]), which was the case within the budgetary-constrained and politically-charged environment of the TDSB (Christie [2003, 2007]; Christie & Coppinger [2006]).

Decision support systems (DSS) are intelligent information systems, based on decision models, that can be used to extract large quantities of data from databases, to provide interfaces and methods for effectively processing those data, and to derive decisions of managerial and economic significance from them. DSS have been used to analyze a wide variety of performance information and to provide a readily-accessible medium for distributing any knowledge generated to a wide variety of stakeholders (Lin et al. [2008]). In this paper, a DSS that can simultaneously combine unit-incompatible energy and waste performance measures is introduced for evaluating the system-wide energy and waste performance of the schools within the TDSB. This DSS incorporates several data envelopment analysis (DEA) modules that had been developed in Yeomans ([2004]) and threads these modules together using commonly available software (Albright [2010]; Seref et al. [2007]; Zhu [2003]). The underlying DEA methodology has been shown to hold advantages over many other multi-criteria methods by providing an objective decision-making tool that does not require variables to have the same scale or conversion weights applied to them (Cook & Zhu [2008]; Zhu [2003]), while permitting a simultaneous combination of both quantitative and qualitative measures (Cook et al. [1996]).

The DSS developed can be used to rate and rank each school according to its energy and waste performance relative to the other schools in the system by recursively partitioning the schools into sub-groups of relatively superior and inferior performers. In addition, the DSS can establish realistically achievable improvement targets for each school by benchmarking their performance against the operations of schools in higher efficiency categories. This relative performance comparison between the schools is important since it ensures that the underlying analysis involves the use of energy and waste values actually occurring at peer institutions within the TDSB and not through “externally-generated”, potentially unrepresentative values.

The DSS developed directly addresses the TDSB’s three requisite “critical questions” while simultaneously addressing most of the key issues in the “essential components” identified above. Furthermore, the most practical contribution from this approach is that, since DEA can be readily implemented using common spreadsheet software linked together by relatively straightforward VBA programming (Albright [2010]; Seref et al. [2007]; Zhu [2003]), this entire methodology can be implemented on virtually any computer. Consequently, practitioners in any organization could easily modify and extend this methodology to match their own very specific multi-criteria applications and circumstances. An illustrative example of an application of the DSS is provided through an analysis of a subset of the high schools in the TDSB.

Results and Discussion

The primary purpose in the performance evaluations of most organizational systems is to appraise the current operations of individual entities and to benchmark these against peer entities to identify best practices. While individual performance measure, or “gap”, analysis has often provided the fundamental basis for performance evaluation and benchmarking (Zhu [2003]), it remains a difficult task to satisfactorily combine multiple disparate, unit-incompatible, single-criteria measurements into an overall conclusion (Camp [1995]). Since a complex entity’s actual performance generally represents multifaceted phenomena, the use of single measures in gap analysis explicitly ignores all interactions, substitutions, and tradeoffs between the various performance measures. Therefore, it is a rare occurrence when a one-measure-at-a-time gap-analysis can suffice for the purposes of an effective performance evaluation of organizational systems (Camp [1995]; Zhu [2003]).

Clearly, it is difficult to evaluate an organization’s performance or to establish benchmarks when multiple measurements are present (Camp [1995]). If the specific algebraic functional relationship between performance measures were known, then established multi-criteria techniques could be used to estimate best-practice levels of performance. Unfortunately, these algebraic functional forms cannot be specified without a priori information on the corresponding tradeoffs, and such information is generally unavailable in practical situations. However, when the best practices of similar types of operations can be identified at a specific point in time, it becomes possible to empirically estimate the resulting best-practice level of performance, or efficient frontier, from these observations (Zhu [2003]).

DEA has proved to be an effective empirical tool for identifying multi-criteria efficient frontiers, for subsequently evaluating relative performance efficiencies, and for implicitly estimating the tradeoffs inherent within the empirically designed frontier. DEA’s empirical orientation and absence of a priori assumptions establish it as the ideal analytical instrument for application to a wide variety of practical situations, since its underlying theoretical basis is consistent with the practice of rating entities by concurrently examining the relative efficiencies of their multiple performance measures. Furthermore, DEA allows performance comparisons to be made between numerous entities that have been evaluated by multiple, unit-incomparable measures without employing any a priori weightings or conversion factors typically required in other methods. Since the process is adaptable and invariant to data type, DEA permits the inclusion and comparison of non-numeric environmental-type variables that might prove incomparable using many other techniques (Zhu [2003]). By incorporating DEA into its DSS, the focus of performance evaluation for the TDSB application could shift from a characterization of each school’s energy and waste usage in terms of single measures to evaluating performance from a mathematically rational, multidimensional system perspective (Linton et al. [2007]; Yeomans [2004]). Consequently, inherent energy and waste relationships and their role in performance ranking can be explicitly brought into the analysis in a rational, transparent, and neutral fashion.

An overview of the energy and waste DSS for the TDSB

The DSS, itself, contains a series of specific DEA modules (more fully explained in subsequent sections) for conducting the performance evaluation of the TDSB. The first module recursively partitions the selected schools into sub-groups of relatively superior and inferior performers according to their energy and waste performance. The second module then determines realistically achievable energy and waste improvement targets for each school by benchmarking them against the system-wide operations for schools in selected higher performance categories. If desired, the DSS can also execute a third module to establish an explicit rank ordering of each school relative to any desired set of schools within the TDSB.

The set of DEA modules was created using readily available spreadsheet optimization software linked together by straightforward VBA programming. To make the entire analysis process readily accessible to the various system users, the DSS was implemented using Microsoft Access and Microsoft Excel, with VBA (included in all Microsoft Office products) as the programming language. By using these specific computer packages, the entire set of modules for evaluating the performance of the TDSB was created using software residing on most current personal computers. In the following sections, each of the individual DEA modules used in the DSS is explained in detail and illustrated using data from 65 high schools in the TDSB.

DEA relative performance rating module

The first module in the DSS partitions any selected group of schools into relatively superior and inferior performers based upon their energy and waste performance. In DEA terminology, a decision-making unit (DMU) designates the specific entity being studied (Zhu [2003]). For this example, the set of DMUs consists of the 65 high schools in the TDSB. Each DMU possesses a set of inputs and outputs that represent its multiple measures of performance and, for the high schools, these inputs and outputs consist of various observed energy and waste measures. While a considerable number of different combinations of energy and waste performance indicators could have been selected from the available data, for illustrative purposes the example in this paper contains only two inputs and four outputs. The inputs considered were (i) school enrolments measured in terms of the number of students and (ii) school sizes measured in square metres (sqm), while the set of outputs consisted of (i) total energy consumption measured in gigajoules (GJ), (ii) total energy costs in dollars ($), (iii) total waste in kilograms (Kg), and (iv) waste diversion percentage (%).

One analytical requirement of DEA is that all inputs must be measured in units where “less is better”, while all outputs must be expressed in units in which “more is better”. Hence, prior to implementing any of the procedures, the DSS transforms all selected inputs and outputs into a format consistent with this analytical requirement. Table 1 provides a complete list of the transformed inputs and outputs for the 65 schools.
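The paper does not state the exact transformation used, but the isolated values of 1 appearing in Table 1 (e.g., Georges Vanier SS’s transformed energy) are consistent with a simple reflection of each “less is better” output. A minimal sketch, assuming a max − x + 1 translation; the function name and figures below are illustrative only:

```python
# Hypothetical sketch of a "more is better" reflection; the paper does not
# specify its transformation, but the values of 1 in Table 1 suggest a
# translation of this form, where the worst performer maps to exactly 1.
def reflect(values):
    """Reflect a 'less is better' measure so that larger values are better."""
    worst = max(values)
    return [worst - v + 1 for v in values]

# Illustrative (made-up) raw annual energy consumption figures in GJ:
raw_energy_gj = [34000, 12500, 53999]
print(reflect(raw_energy_gj))  # -> [20000, 41500, 1]
```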
Table 1 Each high school’s energy and waste input/output measurements transformed into the appropriate format for DEA usage

Facility Name | Floor Area (sqm) | Student Enrollment | Transformed Total Energy (GJ) | Transformed Total Energy Cost ($) | Transformed Total Waste (Kg) | Diversion (%)
Agincourt CI | 19,554 | 1,533 | 19,826 | 254,168 | 3,488 | 10.25
Albert Campbell CI | 22,964 | 2,115 | 3,275 | 1 | 3,546 | 9.59
Bendale BTI | 14,693 | 855 | 15,794 | 227,171 | 5,169 | 15.56
Birchmount Park CI | 16,826 | 1,276 | 23,304 | 287,084 | 5,150 | 41.52
Bloor CI | 13,656 | 797 | 28,974 | 314,772 | 3,924 | 18.07
Cedarbrae CI | 23,668 | 1,557 | 14,999 | 202,965 | 4,415 | 40.02
Central Commerce Collegiate | 20,729 | 1,034 | 20,843 | 298,930 | 6,114 | 100
Central Etobicoke HS | 11,086 | 492 | 21,900 | 291,602 | 5,025 | 35.99
CW Jefferys CI | 16,401 | 1,057 | 22,486 | 279,963 | 5,189 | 12.24
David & Mary Thomson CI | 21,576 | 1,628 | 14,098 | 206,145 | 5,856 | 8.95
Downsview SS | 21,483 | 877 | 9,661 | 176,278 | 5,514 | 13.05
Dr Norman Bethune CI | 14,254 | 1,130 | 23,011 | 286,933 | 5,682 | 56.31
Earl Haig SS | 24,849 | 2,394 | 21,630 | 146,889 | 1,159 | 69.46
Eastdale CI | 5,501 | 197 | 34,717 | 407,993 | 5,834 | 81.85
Eastern HS of Commerce / Subway Academy I | 18,330 | 1,155 | 26,755 | 342,930 | 5,520 | 53.13
Emery CI | 22,306 | 1,570 | 16,563 | 190,444 | 6,139 | 100
Etobicoke CI | 19,367 | 1,502 | 22,215 | 232,432 | 4,776 | 30.52
Etobicoke School of the Arts | 12,537 | 889 | 22,467 | 319,969 | 5,270 | 37.14
Frank Oke SS | 4,322 | 154 | 34,371 | 413,854 | 5,755 | 25.6
George Harvey CI | 25,025 | 1,183 | 13,911 | 179,360 | 4,635 | 30.21
Georges Vanier SS | 23,721 | 1,045 | 1 | 15,462 | 5,859 | 0
Greenwood SS / School of Life Experience | 7,847 | 404 | 32,836 | 379,072 | 5,982 | 69.35
Harbord CI | 18,437 | 1,040 | 22,577 | 213,196 | 4,992 | 33.39
Heydon Park SS | 7,475 | 220 | 36,678 | 441,925 | 6,122 | 0
Humberside CI | 17,655 | 1,150 | 24,382 | 304,460 | 5,226 | 63.09
Inglenook Community School | 1,607 | 128 | 40,091 | 488,102 | 6,317 | 89.74
Jarvis CI | 21,783 | 1,313 | 19,217 | 265,879 | 5,066 | 66.76
Kipling CI | 12,276 | 729 | 27,813 | 362,378 | 5,504 | 21.96
Lakeshore CI | 16,208 | 920 | 22,371 | 302,844 | 5,276 | 38.44
Lawrence Park CI | 15,634 | 1,026 | 27,163 | 336,863 | 5,149 | 49.39
Leaside HS | 13,560 | 1,163 | 27,529 | 339,132 | 5,369 | 37.07
Malvern CI | 14,331 | 1,046 | 28,028 | 345,788 | 4,982 | 41.88
Maplewood HS | 10,728 | 523 | 25,715 | 312,227 | 5,914 | 100
Martingrove CI | 14,737 | 1,041 | 32,751 | 331,622 | 5,691 | 48.38
Nelson A Boylen CI | 9,708 | 611 | 28,556 | 325,182 | 6,033 | 0
Newtonbrook SS | 18,230 | 1,789 | 14,985 | 170,648 | 5,443 | 0
North Albion CI | 15,961 | 1,110 | 26,433 | 332,279 | 3,934 | 20.16
North Toronto CI | 16,046 | 1,114 | 28,736 | 355,959 | 2,802 | 20.76
Northern SS | 29,471 | 1,998 | 16,571 | 223,581 | 1 | 21.05
Northview Heights SS | 23,864 | 1,444 | 17,419 | 220,008 | 3,644 | 18.95
Oakwood CI | 18,588 | 983 | 28,163 | 349,240 | 6,155 | 38.56
Parkdale CI | 14,435 | 631 | 28,357 | 342,029 | 5,639 | 64.45
RH King Academy | 17,796 | 1,469 | 18,416 | 228,731 | 4,751 | 36.07
Richview CI | 11,030 | 992 | 27,608 | 344,498 | 5,335 | 39.36
Riverdale CI | 23,418 | 1,217 | 21,011 | 210,624 | 3,854 | 29.29
Rosedale Heights SS | 16,271 | 680 | 22,744 | 289,208 | 5,503 | 50.75
Runnymede CI | 13,491 | 806 | 29,109 | 366,070 | 5,504 | 43.94
Scarlett Heights Entrepreneurial Academy | 11,528 | 749 | 28,293 | 348,189 | 369 | 3.31
School of Experiential Education | 2,525 | 86 | 37,662 | 459,735 | 3,814 | 79.81
Silverthorn CI | 16,537 | 1,263 | 20,870 | 242,216 | 4,868 | 38
Sir John A Macdonald CI | 17,324 | 1,576 | 12,097 | 105,763 | 4,444 | 10.15
Sir Robert L Borden BTI | 13,246 | 722 | 21,546 | 268,610 | 4,820 | 23.52
Sir William Osler HS | 11,010 | 359 | 22,425 | 304,830 | 5,725 | 39.89
Thistletown CI | 15,540 | 1,103 | 21,448 | 296,117 | 5,002 | 24.78
Ursula Franklin Academy | 19,001 | 405 | 26,430 | 320,347 | 4,395 | 63.15
Vaughan Road Academy | 17,021 | 839 | 26,630 | 324,903 | 5,610 | 67.77
Victoria Park SS | 20,525 | 1,295 | 16,123 | 211,734 | 5,823 | 76.02
West Hill CI | 20,161 | 1,338 | 7,304 | 105,948 | 2,922 | 15.03
West Toronto CI | 19,852 | 813 | 18,979 | 152,372 | 4,665 | 61.43
Western Technical-Commercial School / The Student School | 44,367 | 1,220 | 6,691 | 73,962 | 4,191 | 57.35
Weston CI | 18,317 | 1,271 | 21,145 | 206,202 | 4,546 | 37.46
Westview Centennial SS | 25,323 | 1,385 | 11,118 | 101,969 | 3,430 | 21.95
William Lyon Mackenzie CI | 11,619 | 1,231 | 25,392 | 316,995 | 6,039 | 100
Woburn CI | 20,126 | 1,458 | 22,884 | 264,443 | 3,887 | 69.23
York Mills CI | 16,207 | 1,373 | 23,638 | 274,777 | 3,012 | 73.49

The analytical approach of DEA evaluates the data by “enveloping” the entities being studied based upon the values of their performance measures. The underlying concept of DEA requires an evaluation of each DMU through a projection onto an empirically constructed, multi-dimensional efficient frontier. The enveloping process determines the efficiency of DMUs by: (i) creating an m + s dimensional surface, or “efficient frontier”, of the efficient DMUs (where m represents the number of inputs and s represents the number of outputs); (ii) assigning an efficiency score of θ = 1 to any DMU on the efficient frontier; (iii) determining the distance from the frontier for all inefficient DMUs; and, (iv) calculating the value of θ for each inefficient DMU as its proportional, multi-dimensional distance from the efficient frontier. The efficiency score of any inefficient DMU is always a value of θ < 1. The enveloping constructs an efficient frontier of the best-practice entities and also shows how any inefficient DMU can be improved by providing the amounts and directions for improvement to its specific measures (Zhu [2003]).
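To make the enveloping step concrete, the following is a minimal sketch of the input-oriented, variable-returns-to-scale DEA score just described. The paper’s DSS solves these models with Excel Solver driven by VBA; this Python/scipy version is an illustrative stand-in, and the function and variable names are this sketch’s assumptions rather than anything from the actual DSS:

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, k):
    """Input-oriented, variable-returns-to-scale DEA score for DMU k.

    X is an (n, m) numpy array of inputs ("less is better"); Y is an (n, s)
    numpy array of outputs ("more is better"). Returns theta, where
    theta == 1 marks a DMU on the efficient frontier and theta < 1 an
    inefficient one.
    """
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimise theta.
    c = np.r_[1.0, np.zeros(n)]
    A_ub, b_ub = [], []
    for i in range(m):                      # sum_j lam_j * x_ij <= theta * x_ik
        A_ub.append(np.r_[-X[k, i], X[:, i]])
        b_ub.append(0.0)
    for r in range(s):                      # sum_j lam_j * y_rj >= y_rk
        A_ub.append(np.r_[0.0, -Y[:, r]])
        b_ub.append(-Y[k, r])
    A_eq = [np.r_[0.0, np.ones(n)]]         # convexity: sum_j lam_j = 1
    res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, A_eq=np.array(A_eq),
                  b_eq=[1.0], bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]
```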

In assessing the high schools, the overall goal is to identify the system’s best performers by contrasting each school’s observed performance metrics relative to those of every other school considered. However, DEA only determines whether a school is efficient (θ = 1) or inefficient (θ < 1). The magnitude of θ cannot be used to establish relative degrees of inefficiency between non-efficient schools. Because the relative performance of any school can be contrasted only to an identified best-practice frontier, actual measures of relative inefficiency would change only when the best-practice frontier is altered (that is, when one or more of the efficient schools is removed). However, a recursive enveloping module can be created that stratifies the schools into numerous levels, or groupings, of relative best-practice frontiers, rather than the single DEA frontier.

The stratifying module of the DSS proceeds by removing all of those schools placed onto the original best-practice frontier and then using the original DEA enveloping methodology to form a new second-level best-practice frontier from the set of remaining schools. That is, once a first efficient frontier is calculated, all of the associated efficient schools are removed from further consideration and a new efficient frontier based only upon the remaining, initially inefficient schools is calculated. The schools on this second-level efficiency frontier are subsequently removed, permitting a third-level frontier to be constructed, then a fourth-level frontier, and so on, until no schools remain. The final result from this stratification is the creation of a series of efficient frontiers. The recursive procedure stratifies the original set of schools into L groupings of school efficiencies, with the specific value for L algorithmically determined, a posteriori, by an “empty-set” stopping rule.
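A sketch of this recursive peeling, reusing the dea_efficiency function (and imports) from the previous sketch; the numerical tolerance is an assumption of this sketch, since any solver needs one to decide when a computed θ counts as 1:

```python
def stratify(X, Y, tol=1e-6):
    """Recursively peel best-practice frontiers to form efficiency levels.

    X and Y are numpy arrays as in dea_efficiency. Returns a list of lists:
    levels[0] holds the indices of the first-level (best-practice) DMUs,
    levels[1] the second-level frontier, and so on, stopping when the set
    of remaining DMUs is empty (the "empty-set" stopping rule).
    """
    remaining = list(range(len(X)))
    levels = []
    while remaining:
        Xr, Yr = X[remaining], Y[remaining]
        frontier = [remaining[k] for k in range(len(remaining))
                    if dea_efficiency(Xr, Yr, k) >= 1.0 - tol]
        levels.append(frontier)
        remaining = [j for j in remaining if j not in frontier]
    return levels
```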

When the stratification module was applied to the high school data, the 65 schools were partitioned into L = 12 distinct groups with 2, 2, 3, 6, 9, 10, 10, 5, 7, 5, 3, and 3 schools assigned to each respective efficiency stratum (see Table 2). This stratification effectively partitions the high schools into distinct groupings of comparably-performing schools based upon the multi-criteria measurements of their energy and waste performance.
Table 2 Changes required to each high school’s current energy and waste measures to advance into the next higher efficiency group

(Changes required to original outputs to attain efficiency at the next higher level)

DMU No. | DMU Name | Change to Total Energy (GJ) | Change to Total Energy Cost ($) | Change to Total Waste (Kg) | Change to Diversion (%)

Level 1
26 | Inglenook Community School | 0 | 0 | 0 | 0
49 | School of Experiential Education | 0 | 0 | 0 | 0

Level 2
14 | Eastdale CI | −5,374 | −80,109 | −483 | 8
19 | Frank Oke SS | −5,720 | −74,248 | −562 | 64

Level 3
22 | Greenwood SS / School of Life Experience | −7,256 | −109,030 | −335 | 20
24 | Heydon Park SS | −3,413 | −46,177 | −195 | 90
33 | Maplewood HS | −5,334 | −77,604 | −265 | 55

Level 4
35 | Nelson A Boylen CI | −6,411 | −98,118 | −96 | 8
42 | Parkdale CI | −4,340 | −36,153 | −348 | 4
44 | Richview CI | −5,423 | −46,343 | −718 | 5
53 | Sir William Osler HS | −11,658 | −96,559 | −317 | 2
55 | Ursula Franklin Academy | −8,319 | −88,192 | −1,443 | 17
63 | William Lyon Mackenzie CI | −7,230 | −73,073 | −584 | 7

Level 5
7 | Central Commerce Collegiate | −4,022 | −47,268 | −1,321 | 5
8 | Central Etobicoke HS | −1,757 | −23,397 | −620 | 3
12 | Dr Norman Bethune CI | −389 | −13,090 | −30 | 0
28 | Kipling CI | −5,553 | −39,705 | −603 | 2
34 | Martingrove CI | −1,720 | −73,754 | −299 | 3
47 | Runnymede CI | −5,031 | −34,249 | −515 | 4
48 | Scarlett Heights Entrepreneurial Academy | −55 | −671 | −5,337 | 14
56 | Vaughan Road Academy | −6,589 | −60,058 | −342 | 4
65 | York Mills CI | −11,079 | −133,216 | −2,822 | 8

Level 6
5 | Bloor CI | −1,359 | −14,768 | −1,924 | 9
16 | Emery CI | −2,444 | −33,938 | −401 | 5
18 | Etobicoke School of the Arts | −6,006 | −23,782 | −392 | 3
25 | Humberside CI | −2,360 | −21,550 | −386 | 4
30 | Lawrence Park CI | −1,642 | −19,472 | −410 | 3
31 | Leaside HS | −1,369 | −15,909 | −252 | 2
32 | Malvern CI | −1,400 | −17,269 | −539 | 2
46 | Rosedale Heights SS | −1,224 | −12,518 | −238 | 2
52 | Sir Robert L Borden BTI | −1,970 | −33,561 | −441 | 17
59 | West Toronto CI | −3,580 | −127,440 | −551 | 7

Level 7
3 | Bendale BTI | −6,958 | −61,342 | −350 | 36
11 | Downsview SS | −4,154 | −4,813 | −151 | 18
15 | Eastern HS of Commerce / Subway Academy I | −1,860 | −6,006 | −97 | 1
29 | Lakeshore CI | −2,506 | −11,754 | −205 | 10
37 | North Albion CI | −1,868 | −17,433 | −206 | 14
38 | North Toronto CI | −624 | −7,736 | −2,715 | 23
41 | Oakwood CI | −11,928 | −138,862 | −162 | 51
54 | Thistletown CI | −1,156 | −2,288 | −39 | 9
57 | Victoria Park SS | −17,854 | −184,878 | −70 | 1
64 | Woburn CI | −929 | −19,192 | −158 | 3

Level 8
4 | Birchmount Park CI | −1,966 | −33,642 | −434 | 13
9 | CW Jefferys CI | −189 | −17,568 | −44 | 27
21 | Georges Vanier SS | −28,555 | −309,719 | −174 | 0
23 | Harbord CI | −485 | −88,334 | −107 | 4
27 | Jarvis CI | −3,704 | −17,101 | −326 | 4

Level 9
13 | Earl Haig SS | −1,326 | −122,171 | −2,108 | 4
20 | George Harvey CI | −8,662 | −36,385 | −364 | 2
36 | Newtonbrook SS | −3,185 | −72,412 | −333 | 31
45 | Riverdale CI | −1,451 | −14,550 | −1,151 | 2
50 | Silverthorn CI | −1,998 | −52,234 | −341 | 2
60 | Western Technical-Commercial School / The Student School | −12,969 | −166,149 | −624 | 9
61 | Weston CI | −2,175 | −81,412 | −595 | 4

Level 10
1 | Agincourt CI | −2,368 | −30,352 | −1,648 | 6
6 | Cedarbrae CI | −4,810 | −39,285 | −533 | 5
10 | David & Mary Thomson CI | −12,958 | −105,347 | −152 | 0
17 | Etobicoke CI | −117 | −1,224 | −226 | 0
43 | RH King Academy | −2,774 | −8,051 | −141 | 1

Level 11
40 | Northview Heights SS | −3,353 | −17,186 | −285 | 2
51 | Sir John A Macdonald CI | −10,389 | −174,200 | −746 | 2
62 | Westview Centennial SS | −8,051 | −110,998 | −1,294 | 8

Level 12
2 | Albert Campbell CI | −18,940 | −232,431 | −1,230 | 21
39 | Northern SS | −2,571 | −18,254 | −4,099 | 2
58 | West Hill CI | −15,242 | −130,494 | −2,138 | 11

Benchmark module for generating realistic energy and waste performance targets

Benchmarking is widely used for the identification and adoption of best practices and as a means for improving performance and increasing productivity. Benchmarking can be thought of as the process of defining valid measures of performance comparison among peer schools, using these to determine the relative standing of the peer schools, and ultimately in establishing standards of excellence for performance improvement. The TDSB had stated in their “essential component” requirements that, in order to improve their system-wide energy and waste usage, it was crucial for them to be able to determine attainable performance benchmarks and to establish achievable annual performance targets for each school (Christie [2003]). Clearly the satisfaction of these components would necessitate the creation of multi-criteria benchmark targets to improve each school’s performance and it would be imperative that the various competing stakeholders within the system felt that these targets had been set fairly, objectively, and transparently (Christie [2003, 2007]). As observed in the previous section, the stratification module partitioned the 65 high schools into 12 sets of comparably performing groups in which all schools within a higher grouping were better performers than any school in each of the lower groupings. In this sense, the stratification module could be viewed as a type of benchmarking, since the module creates various different performance groupings of peer-efficient schools to which any underperforming schools could be benchmarked.

However, by using the stratification module’s output, it becomes possible to establish performance goals for less efficient schools by benchmarking their current energy and waste usage against the more efficient operations of the schools in any of the higher performing strata. For an inefficient DMU, it is possible to calculate the changes needed to its inputs/outputs for it to become efficient relative to the DMUs on the efficient frontier. These efficiency targets would be the specific values that the inputs and outputs of the inefficient DMU would need to attain in order to move onto the efficiency frontier. For the benchmarking and target setting required in the TDSB case, the set of DMUs in the following procedure would consist of one specific, inefficient DMU under evaluation (i.e. DMU$_0$) and all of the DMUs in the next higher contextual grouping. Hence, assume that there exists a set of $n$ DMUs, with each DMU$_j$, $j = 1, \ldots, n$, consisting of $m$ input measures $x_{ij}$, $i = 1, \ldots, m$, and $s$ output measures $y_{rj}$, $r = 1, \ldots, s$. Suppose that DMU$_0$ is being evaluated, with $x_{i0}$ and $y_{r0}$ representing its $i$th input and $r$th output measures. Let $s_i^-$, $i = 1, \ldots, m$, and $s_r^+$, $r = 1, \ldots, s$, represent the $i$th input and $r$th output slack variables, respectively, and let $\varepsilon$ be some non-Archimedean scalar. Then by solving the optimization model:

$$\min \; \theta - \varepsilon \left( \sum_{i=1}^{m} s_i^- + \sum_{r=1}^{s} s_r^+ \right) \qquad (1)$$

subject to:

$$\sum_{j=1}^{n} \lambda_j x_{ij} + s_i^- = \theta x_{i0}, \quad i = 1, \ldots, m \qquad (2)$$

$$\sum_{j=1}^{n} \lambda_j y_{rj} - s_r^+ = y_{r0}, \quad r = 1, \ldots, s \qquad (3)$$

$$\sum_{j=1}^{n} \lambda_j = 1 \qquad (4)$$

$$\lambda_j \geq 0, \quad j = 1, \ldots, n \qquad (5)$$

efficiency targets $\hat{x}_{i0} = \theta^* x_{i0} - s_i^{-*}$, $i = 1, \ldots, m$, and $\hat{y}_{r0} = y_{r0} + s_r^{+*}$, $r = 1, \ldots, s$, can be calculated for each of the inputs and outputs of DMU$_0$. The presence of the non-Archimedean $\varepsilon$ in the objective function effectively allows the minimization over $\theta$ to pre-empt the optimization involving the slacks, $s_i^-$ and $s_r^+$. This creates a two-stage optimization process, with the maximal reduction of the inputs being achieved in the first stage via the optimal $\theta^*$, followed by the movement onto the efficient frontier achieved in the second stage via the subsequent optimization of the slack variables. This approach to multi-criteria benchmarking proves particularly suitable in practical situations where no objective or pre-existing engineered standards are available to define efficient or effective performance.
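A sketch of this two-stage computation under the same assumptions as the earlier efficiency sketch (Python/scipy standing in for the paper’s Excel Solver implementation; all names are hypothetical). DMU$_0$’s data are passed as the vectors x0, y0 and the next-higher grouping as X_ref, Y_ref:

```python
import numpy as np
from scipy.optimize import linprog

def dea_targets(X_ref, Y_ref, x0, y0):
    """Two-stage targets for one inefficient DMU benchmarked on (X_ref, Y_ref)."""
    n, m = X_ref.shape
    s = Y_ref.shape[1]
    # Stage 1: minimise theta, variables [theta, lambda_1, ..., lambda_n].
    c1 = np.r_[1.0, np.zeros(n)]
    A_ub = [np.r_[-x0[i], X_ref[:, i]] for i in range(m)]
    A_ub += [np.r_[0.0, -Y_ref[:, r]] for r in range(s)]
    b_ub = [0.0] * m + list(-y0)
    A_eq = [np.r_[0.0, np.ones(n)]]
    st1 = linprog(c1, A_ub=np.array(A_ub), b_ub=b_ub, A_eq=np.array(A_eq),
                  b_eq=[1.0], bounds=[(0, None)] * (n + 1), method="highs")
    theta = st1.x[0]
    # Stage 2: theta fixed, maximise total slack; variables [lambdas, s-, s+].
    c2 = np.r_[np.zeros(n), -np.ones(m + s)]
    A2, b2 = [], []
    for i in range(m):      # sum_j lam_j x_ij + s_i^- = theta * x0_i
        row = np.zeros(n + m + s); row[:n] = X_ref[:, i]; row[n + i] = 1.0
        A2.append(row); b2.append(theta * x0[i])
    for r in range(s):      # sum_j lam_j y_rj - s_r^+ = y0_r
        row = np.zeros(n + m + s); row[:n] = Y_ref[:, r]; row[n + m + r] = -1.0
        A2.append(row); b2.append(y0[r])
    row = np.zeros(n + m + s); row[:n] = 1.0
    A2.append(row); b2.append(1.0)                  # convexity constraint
    st2 = linprog(c2, A_eq=np.array(A2), b_eq=b2,
                  bounds=[(0, None)] * (n + m + s), method="highs")
    s_minus, s_plus = st2.x[n:n + m], st2.x[n + m:]
    # Targets: x_hat = theta* x0 - s-*,  y_hat = y0 + s+*.
    return theta * x0 - s_minus, y0 + s_plus
```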

For illustrative purposes, an incremental goal was set for each school to progress only into the next highest level of efficiency. While a longer-term perspective might seek to advance all schools into the very highest level of performers, these intermediate targets establish more attainable improvements that each inefficient school would need to undertake in order to move into the next higher level of efficiency. Hence, the benchmarking module was used to calculate the specific efficiency targets required for each school to proceed into the next higher category of energy and waste performance, and Table 2 shows the specific improvements required in each measure in order to reach the calculated targets. The specific changes shown in Table 2 represent: (i) annual reductions in energy use; (ii) annual reductions in energy expenses; (iii) annual reductions in the quantity of solid waste generated; and, (iv) annual increases in the percentage diversion of solid waste. While the target-setting module actually produces multi-criteria measures for improvement, if the changes in Table 2 were achieved, one significant subset of these improvements would be a 25 percent reduction in annual energy costs. This reduction, alone, represents a direct annual cost saving of $3.7 million from the current $14.9 million energy budget allocated to the TDSB’s high schools. Furthermore, if the established targets were achieved, the table shows potential reductions of over 300,000 GJ of energy and 44,000 Kg of waste. It should be noted that analogous percentage improvements in cost, energy, and waste would be obtained from targets set for the entire set of more than 600 schools in the TDSB.
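As a quick arithmetic check, the two quoted dollar figures are consistent with the stated percentage:

$$\frac{\$3.7\ \text{million}}{\$14.9\ \text{million}} \approx 0.248 \approx 25\%$$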

Rank ordering module

While the stratification module partitions the schools into distinct levels of energy and waste performers, it does not rank order the standings of the schools within each grouping. If only a small number of schools were under consideration, or if the procedure had produced a large number of groupings each containing only a few schools, then the stratification itself might be sufficient for ranking the specific schools. In the general case, however, the stratification might not prove restrictive enough to permit sufficient preference discrimination. In that event, a contextual attractiveness concept can be incorporated into the DSS to permit an explicit ranking of the schools (Simonson & Tversky [1992]; Tversky & Simonson [1993]). Obtaining relative attractiveness scores for the schools requires that relative performance be defined with respect to some particular evaluation context, and the L partitions from the stratification module can be used to supply these contexts.

Define $H_q(d)$ to be the $d$-degree, $d = 1, \ldots, L - l_0$, contextual attractiveness of DMU$_q = (x_q, y_q)$ from some specific level $E^{l_0}$, $l_0 \in \{1, \ldots, L-1\}$, where $F(E^{l})$ denotes the set of schools assigned to efficiency level $E^{l}$. $H_q(d)$ can be calculated by solving the model:

$$H_q^*(d) = \min \; H_q(d), \quad d = 1, \ldots, L - l_0 \qquad (6)$$

subject to:

$$\sum_{j \in F(E^{l_0+d})} \lambda_j x_j \leq H_q(d)\, x_q \qquad (7)$$

$$\sum_{j \in F(E^{l_0+d})} \lambda_j y_j \geq y_q \qquad (8)$$

$$\sum_{j \in F(E^{l_0+d})} \lambda_j = 1 \qquad (9)$$

$$\lambda_j \geq 0, \quad j \in F(E^{l_0+d}) \qquad (10)$$

DMU$_q$ is viewed as a more attractive option than another DMU if it possesses a larger value for its contextual attractiveness measure $H_q(d)$. Hence, it is possible to rank each school within each stratum by directly sorting these contextual attractiveness scores. Since all schools within a contextual grouping are considered better performers than any school in a lower contextual group, concatenating the rankings within each grouping produces, de facto, a complete rank ordering of the entire set of selected schools.
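The attractiveness model (6)–(10) has the same LP structure as the earlier rating model, only with the reference set restricted to a lower-level grouping. A self-contained sketch under the same assumptions as before (hypothetical names; Python/scipy in place of the paper’s Excel Solver):

```python
import numpy as np
from scipy.optimize import linprog

def attractiveness(X_ref, Y_ref, xq, yq):
    """H_q(d): evaluate DMU q against a lower-level reference group.

    X_ref, Y_ref hold the inputs/outputs of the schools in level l0 + d;
    xq, yq are school q's own input and output vectors.
    """
    n, m = X_ref.shape
    s = Y_ref.shape[1]
    c = np.r_[1.0, np.zeros(n)]                    # minimise H_q(d)
    A_ub = [np.r_[-xq[i], X_ref[:, i]] for i in range(m)]
    A_ub += [np.r_[0.0, -Y_ref[:, r]] for r in range(s)]
    b_ub = [0.0] * m + list(-yq)
    A_eq = [np.r_[0.0, np.ones(n)]]                # convexity constraint (9)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, A_eq=np.array(A_eq),
                  b_eq=[1.0], bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

# Ranking within a stratum: largest (most attractive) score first, e.g.
#   scores = {q: attractiveness(X_low, Y_low, X[q], Y[q]) for q in level}
#   ranking = sorted(scores, key=scores.get, reverse=True)
```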

Using the next-lower stratification grouping as the appropriate context, the attractiveness scores $H_q(d)$ for each high school were calculated. Since any school is considered more efficient than the other schools within its grouping if it possesses a larger contextual attractiveness score, it now becomes possible to rank order the schools within each stratum by sorting these scores. Concatenating these individually sorted groupings produces a comprehensive rank ordering from 1 to 65 based upon the energy and waste usage in all of the schools. The results of this contextual scoring system, the subsequent sorting within each partition, and the overall efficiency ranking of each high school within the TDSB are shown in Table 3.
Table 3 Overall rankings of high schools based upon contextual attractiveness scores calculated relative to the next lower partitioning group

DMU No. | DMU Name | Attractiveness Score Hq(d) | Overall Ranking

Level 1
49 | School of Experiential Education | 0.406 | 1
26 | Inglenook Community School | 0.251 | 2

Level 2
19 | Frank Oke SS | 0.539 | 3
14 | Eastdale CI | 0.422 | 4

Level 3
33 | Maplewood HS | 0.540 | 5
22 | Greenwood SS / School of Life Experience | 0.510 | 6
24 | Heydon Park SS | 0.378 | 7

Level 4
44 | Richview CI | 0.862 | 8
42 | Parkdale CI | 0.798 | 9
35 | Nelson A Boylen CI | 0.725 | 10
53 | Sir William Osler HS | 0.640 | 11
63 | William Lyon Mackenzie CI | 0.541 | 12
55 | Ursula Franklin Academy | 0.531 | 13

Level 5
12 | Dr Norman Bethune CI | 0.889 | 14
34 | Martingrove CI | 0.870 | 15
65 | York Mills CI | 0.848 | 16
56 | Vaughan Road Academy | 0.844 | 17
47 | Runnymede CI | 0.821 | 18
48 | Scarlett Heights Entrepreneurial Academy | 0.802 | 19
28 | Kipling CI | 0.773 | 20
7 | Central Commerce Collegiate | 0.728 | 21
8 | Central Etobicoke HS | 0.707 | 22

Level 6
52 | Sir Robert L Borden BTI | 0.914 | 23
25 | Humberside CI | 0.895 | 24
30 | Lawrence Park CI | 0.861 | 25
16 | Emery CI | 0.826 | 26
32 | Malvern CI | 0.795 | 27
31 | Leaside HS | 0.763 | 28
5 | Bloor CI | 0.757 | 29
18 | Etobicoke School of the Arts | 0.745 | 30
59 | West Toronto CI | 0.712 | 31
46 | Rosedale Heights SS | 0.642 | 32

Level 7
54 | Thistletown CI | 0.896 | 33
64 | Woburn CI | 0.859 | 34
57 | Victoria Park SS | 0.825 | 35
3 | Bendale BTI | 0.823 | 36
37 | North Albion CI | 0.820 | 37
11 | Downsview SS | 0.816 | 38
38 | North Toronto CI | 0.769 | 39
15 | Eastern HS of Commerce / Subway Academy I | 0.760 | 40
29 | Lakeshore CI | 0.716 | 41
41 | Oakwood CI | 0.671 | 42

Level 8
4 | Birchmount Park CI | 0.852 | 43
23 | Harbord CI | 0.769 | 44
9 | CW Jefferys CI | 0.724 | 45
21 | Georges Vanier SS | 0.699 | 46
27 | Jarvis CI | 0.647 | 47

Level 9
36 | Newtonbrook SS | 0.905 | 48
45 | Riverdale CI | 0.851 | 49
20 | George Harvey CI | 0.840 | 50
61 | Weston CI | 0.792 | 51
50 | Silverthorn CI | 0.787 | 52
13 | Earl Haig SS | 0.725 | 53
60 | Western Technical-Commercial School / The Student School | 0.547 | 54

Level 10
10 | David & Mary Thomson CI | 0.756 | 55
1 | Agincourt CI | 0.709 | 56
17 | Etobicoke CI | 0.524 | 57
6 | Cedarbrae CI | 0.510 | 58
43 | RH King Academy | 0.418 | 59

Level 11
62 | Westview Centennial SS | 0.706 | 60
40 | Northview Heights SS | 0.541 | 61
51 | Sir John A Macdonald CI | 0.534 | 62

Level 12
58 | West Hill CI | 0.923 | 63
39 | Northern SS | 0.889 | 64
2 | Albert Campbell CI | 0.657 | 65

If a large number of schools existed within any particular level after the stratification stage, then two or more schools might receive exactly the same attractiveness measure. This tied-ranking problem can be alleviated by incorporating a lexicographic ranking modification into the procedure described above, as follows. To reduce the likelihood of tied rankings, attractiveness measures would be calculated for each school using every level below it as the contextual basis. Hence, each school in the highest level would receive L−1 separate contextual attractiveness scores, each school in the second highest level would receive L−2 separate scores, and so on. A lexicographic rank ordering of the schools within each specific grouping would then be performed, with greater emphasis given to attractiveness scores calculated from closer contextual groupings (i.e. an alphabetical, or dictionary, style of sorting). This multi-scoring, lexicographic process provides additional discriminating power by reducing the likelihood of ties between schools within any specific grouping. However, since no tied scores occurred in the contextual attractiveness calculations already performed, this lexicographic ranking produces the same rank ordering as Table 3.
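A minimal sketch of this tie-breaking, assuming each school’s scores are stored from the closest context (d = 1) outward; Python’s tuple comparison supplies exactly the dictionary-style ordering described (all names hypothetical):

```python
def lexicographic_rank(scores_by_school):
    """Rank schools within one stratum by their vector of attractiveness scores.

    scores_by_school maps a school identifier to its list of H_q(d) scores,
    ordered d = 1, 2, ...; tuples compare element-by-element, so a tie on the
    closest context falls through to the next one, dictionary-style.
    """
    return sorted(scores_by_school,
                  key=lambda q: tuple(scores_by_school[q]),
                  reverse=True)  # larger scores first
```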

Conclusions

In this paper, a DEA-based DSS for analyzing, rating, ranking, and benchmarking the multi-criteria energy and waste performance of the schools in the TDSB has been presented. Several benefits of this DSS were demonstrated through an illustrative investigation of 65 Toronto high schools. The DSS stratified the schools into similarly-efficient groupings based upon their energy and waste usage. The DSS then generated realistic energy and waste improvement targets for each relatively inefficient school by benchmarking it against a grouping of more efficient schools. Amongst other findings, it was shown that achieving these target reductions would produce system-wide energy cost savings of twenty-five percent. The DSS also permits the ranking of any set of schools by contextually evaluating their relative attractiveness against other identified school groupings. The findings with respect to the high schools have since been extended in an analysis of all 600 schools within the TDSB. Based upon the TDSB study, DEA has shown itself to be an extremely useful approach for evaluating, benchmarking, and ranking the relative energy and waste performance within a school system, and its potential for much broader application clearly warrants additional exploration.

Methods

The mathematical models of each of the DEA modules were created as separate computer models within Microsoft Excel worksheets (Albright [2010]; Seref et al. [2007]; Zhu [2003]). These spreadsheet DEA models were optimized using the standard, built-in Excel Solver function. The data for all 600 schools in the TDSB can be found in Christie ([2003]), and the data for the 65 schools used in the analysis appear in Table 1. All data were stored in a Microsoft Access database. The programming language used to link the various components together was VBA, which is included in all Microsoft Office products. Hence, the mathematical and computer-based DSS for evaluating the performance of the TDSB was created using software residing on essentially all personal computers.

Author’s contributions

JSY is the sole author of this paper, and read and approved the final manuscript.

Abbreviations

DSS: 

Decision support system

DEA: 

Data envelopment analysis

TDSB: 

Toronto District School Board

DMU: 

Decision making unit

VBA: 

Visual Basic for Applications.

Declarations

Acknowledgments

This work was supported in part by grant OGP0155871 from the Natural Sciences and Engineering Research Council. The author would like to thank Richard Christie of the Toronto District School Board for providing access to the energy and waste data of the various schools.

Authors’ Affiliations

(1)
Operations Management & Information Systems Area, Schulich School of Business, York University

References

  1. Albright SC: VBA for Modelers: Developing Decision Support Systems. Cengage Learning, Mason, OH; 2010.
  2. Camp RC: Business Process Benchmarking: Finding and Implementing Best Practices. ASQC Quality Press, Milwaukee, WI; 1995.
  3. Christie R: The Case for an Annual Environment Report at the Toronto District School Board. Department of Environmental Education, School Services, Toronto District School Board; 2003.
  4. Christie R: Environmental Report. Department of Ecological Literacy and Sustainable Development, Toronto District School Board; 2007.
  5. Christie R, Coppinger F: Toronto District School Board Energy Conservation Report. Department of Ecological Literacy and Sustainable Development, Toronto District School Board; 2006.
  6. Cook W, Zhu J: Data Envelopment Analysis: Modeling Operational Processes and Measuring Productivity. CreateSpace, Charleston, SC; 2008.
  7. Cook W, Kress M, Seiford L: Data Envelopment Analysis in the Presence of both Quantitative and Qualitative Factors. J Oper Res Soc 1996, 47:945–953.
  8. Lin QG, Huang GH, Bass B, Chen B, Zhang BY, Zhang XD: CCEM: A City-cluster Energy Systems Planning Model. Energy Sources, Part A: Recovery, Utilization, and Environmental Effects 2008, 31:1–14. doi:10.1080/15567030802308601
  9. Linton J, Morabito J, Yeomans JS: An Extension to a DEA Support System Used for Assessing R&D Projects. R&D Management 2007, 37:29–36.
  10. Seref MMH, Ahuja RA, Winston WL: Developing Spreadsheet-Based Decision Support Systems. Dynamic Ideas, Boston; 2007.
  11. Simonson I, Tversky A: Choice in Context: Tradeoff Contrast and Extremeness Aversion. J Mark Res 1992, 29:281–295. doi:10.2307/3172740
  12. Tversky A, Simonson I: Context-Dependent Preferences. Manag Sci 1993, 39:1179–1189. doi:10.1287/mnsc.39.10.1179
  13. Yeomans JS: Rating and Evaluating the Combined Financial and Environmental Performance of Companies in the Metals and Mining Sector. Journal of Environmental Informatics 2004, 3:95–105.
  14. Zhu J: Quantitative Models for Performance Evaluation and Benchmarking. Kluwer Academic Publishers, Norwell, MA; 2003.

Copyright

© Yeomans; licensee Springer. 2012

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.