16 Jun The IT Crisis Part 1 | History of Project Performance
*This is a summary and analysis of the following publication. For full references and research details, please see the original publication written by Dr. Isaac Kashiwagi.
Kashiwagi, I. (2018). A Global Study on ICT Project Performance. Journal for the Advancement of Performance Information and Value, 10(1), 8-27.
ICT Project Performance
In 2018, in response to repeated claims of performance issues with information and communications technology (ICT) projects, the Performance Based Studies Research Group (PBSRG) compiled and analyzed the latest performance reports, examined project factors, and proposed solutions for the ICT industry. This is part one of a two-article summary of the findings in the research publication. For references, please see the link to the journal publication above.
Chronology of Performance Studies:
1. The Standish Group (1994). The study surveyed 365 respondents and conducted multiple personal interviews. The findings identified that 83.8% of ICT projects failed to be completed on time and on budget, and that projects completed by the largest American companies retained only 42% of their original features and functions.
2. IT-Cortex (2014) reported several studies on ICT project performance. (1) In 1995 the Organizational Aspects of Information Technology (OASIG) UK group sampled 45 experts primarily employed by universities or consultancies; the interviews led to the conclusion that the success rate of IT projects is an estimated 20–30%. (2) In 1998 the Bull Survey performed 203 telephone interviews with IT project managers who had led the integration of large systems within Times Top 100 organizations, and reported that 75% of the IT projects missed deadlines, 55% exceeded budget, and 37% failed to meet project requirements. (3) In 2001 the Robbins-Gioia survey of ERP systems reported that 51% of ERP implementations were viewed as unsuccessful; 46% of the participants noted that while their organization had an ERP system in place, or was implementing one, they did not feel their organization understood how to use the system to improve the way it conducts business.
3. Hoffman (1999) reported the results of Howard Rubin’s annual worldwide IT trends and benchmark report, which surveyed more than 16,000 IT professionals at 6,000 companies in 28 countries. The results showed that 85% of IT organizations in the US are failing to meet their organizations’ strategic business needs.
4. Whittaker (1999) reported a 1997 study surveying the chief executives of 1,450 public- and private-sector organizations in the ICT industry across Canada, of which 176 responses were analyzed. The findings included that 87% of failed projects exceeded their initial schedule estimates by 30%.
5. Taylor (2000) analyzed 1,027 projects and interviewed 38 practitioners from the Association of Project Managers and the Institute of Management. The findings revealed that of the 1,027 projects, only 130 (12.7%) were successful.
6. Sauer and Cuthbertson (2003) of Oxford surveyed over 1,500 practicing IT project managers and found that 16% of projects end with an average cost overrun of 18%, a schedule overrun of 23%, and a 7% underachievement of scope/functionality.
7. KPMG (2005) conducted a global IT project management survey of more than 600 companies in 22 countries. Among the dominant results: in the preceding 12 months, 49% of participants had experienced at least one project failure; in the same period, only 2% of organizations achieved expected benefits all the time, and 86% of organizations lost up to 25% of target benefits across their entire portfolio.
8. The European Services Strategy Unit (2007) reported on 105 outsourced public-sector ICT projects: 57% of contracts experienced cost overruns, with an average cost overrun of 30.5% and an average schedule overrun of 33%, and 30% of contracts were terminated or never used.
9. The US Government Accountability Office (2008) identified 413 IT projects (totaling at least $25.2 billion in expenditures for fiscal year 2008) as being poorly planned, poorly performing, or both, with just under half re-baselined at least once.
10. The Geneca (2011) survey of 600 U.S. business IT executives and practitioners reported that 75% of respondents admit their projects are either always or usually doomed right from the start; 27% always felt this way.
11. Flyvbjerg and Budzier (2011), writing for the Harvard Business Review, analyzed 1,471 IT projects and reported an average cost overrun of 27%; 17% of the projects failed badly enough to threaten the company’s existence, with an average cost overrun of 200% and a schedule overrun of 70% among that group.
12. McKinsey & Company (2012) analyzed over 5,400 projects and reported that half of large IT projects massively exceed their budgets; on average, large IT projects run 45% over budget and 7% over time while delivering 56% less value than predicted, and 17% of projects end so badly they can threaten the life of the company.
13. The Standish Group (2016) analyzed its database of over 25,000 projects and found that 61% of projects failed to be completed on time and on budget with a satisfactory result.
Additional Claims of Performance Issues:
Other performance metrics have been reported without details such as the year the study was conducted or the methodology used. Although these metrics are less reliable, they are important to consider when examining how the industry is perceived:
1. 15% of all software development projects never deliver anything, and overruns of 100–200% are common (DeMarco, 1982).
2. There is a 50-80% failure rate of large projects (Dorsey, 2000).
3. An estimate of 5-15% of all large-scale software projects are cancelled in the USA and the total yearly cost of cancellations may be as much as $75 Billion USD (Savolainen & Ahonena, 2010).
4. Kappelman et al. (2002) cite two studies: (1) one reporting that 20% of IT projects are cancelled before completion and that less than a third are finished on time, within budget, and with the expected functionality; (2) another reporting that these numbers more than double for large projects of 10,000 function points.
5. Fenech and De Raffaele (2013) report three different studies: (1) an independent study by McCafferty revealed that 25% of projects will not succeed in meeting their requirements, amounting to around $63 billion spent annually on such failed initiatives; (2) a global Gartner study of 845 ICT companies concluded that 44% of the analyzed projects exceeded budget allocations, 42% failed to be delivered within agreed timeframes, and over 42.5% fell short of achieving all expected benefits by the end of the project; (3) Young’s study reported that 15–28% of ICT projects in Australia were abandoned prior to implementation, around 30% experienced significant cost overruns (sometimes up to 189%), and less than 20% achieved all of their established performance objectives.
6. As many as 25% of all software projects are cancelled outright, and as many as 80% are over budget, with the average project exceeding its budget by 50%. It is estimated that three-fourths of all large systems are operational failures because they either do not function as specified or are simply not used (Schmidt et al., 2001).
7. Dijk (2009) reports that 34% of projects are successful, 51% do not go according to plan but ultimately lead to some result, and 15% fail completely.
8. Molokken and Jorgensen (2003) reviewed six studies and found widely varying performance statistics for ICT projects. Average cost overrun was reported by four studies at 33%, 33%, 34%, and 89%; the share of projects completed over budget was reported by four studies at 61%, 63%, 70%, and 80%; and the share of projects completed behind schedule was reported by three studies at 65%, 80%, and 84%.
9. Procaccino et al. (2002) cited two studies: (1) in 1994, 31% of all corporate software development projects resulted in cancellation and (2) a more recent study found that 20% of software projects failed, and that 46% experienced cost and schedule overruns or significantly reduced functionality.
Governmental Inquiries into ICT Performance Issues
Multiple countries, including the United Kingdom, the Netherlands, Australia, and the United States, have addressed the issue of ICT project performance at the governmental level. The UK government spent over £16 billion on IT projects in 2009 across a wide range of areas, yet the UK has been described as “a world leader in ineffective IT schemes for government”. In 2011 the House of Commons appointed a special committee to investigate the state of government IT performance (Public Administration Committee, 2011). In addition to lessons learned and the identification of sources of failure, the investigation revealed various high-cost IT initiatives over the previous twenty years that ended in failure (Public Administration Committee, 2011).
From 2012 to 2014 a parliamentary inquiry was held in the Netherlands to address the poor performance of ICT projects in the public sector (The House of Representatives of the Netherlands, 2014). During the inquiry, it was reported that €1–5 billion are wasted annually on ICT projects in the Netherlands. Recent and notable projects highlighted by the media and the government inquiry included (The House of Representatives of the Netherlands, 2014):
1. Defense department project (SPEER) cancelled after spending € 418 million.
2. Belastingdienst ETPM project cancelled after spending € 203 million.
3. Police Investigation Suite (PSO) cancelled in 2005 after spending € 430 million.
4. C2000 emergency communications system for the police and other services ran € 72 million over budget on implementation due to delays.
5. Payroll administration (P-direct) failed tender cost € 200 million, with a potential € 700 million more.
6. Electronic Patient File (EPD) cancelled after spending € 300 million.
From 2013 to 2014 the Legislative Assembly of the Northern Territory of Australia held a government inquiry prompted by ongoing concerns raised by the Auditor-General regarding the management of information and communication technology projects (Legislative Assembly of the Northern Territory, 2014). The chairperson of the committee commented that it was clearly unacceptable to spend over $70 million only to make systems worse. The inquiry specifically analyzed three large government projects:
1. The Department of Infrastructure’s attempt to replace the nine legacy systems used to manage the Government’s asset management information systems and business processes with an integrated commercial off-the-shelf (COTS) product. The project was budgeted at $14 million and scheduled for completion in April 2010; it was cancelled in March 2014, by which point it had cost around $70 million.
2. The Power and Water Corporation (PWC) project to replace a suite of old systems that were poorly integrated and no longer supported by their suppliers. The project was budgeted at $15 million and scheduled for completion in March 2012; it was completed in August 2012 at a cost of approximately $51.8 million.
3. The Department of Health’s grant management system project, intended to develop and implement an ICT system to support the management of service agreements with NGOs. The project was budgeted at $684 thousand and scheduled for completion in November 2011. At the time of the last report, the project was still in progress, with an expected budget of $979 thousand and an expected completion date of June 2014.
The United States has not held an official government inquiry; however, from 2011 to 2014 it experienced similarly high failure rates with government IT projects, reportedly spending billions of dollars on projects that were incomplete, cancelled, or nonfunctional. Recent and notable projects include:
1. The USAF’s attempt to automate and streamline its logistics operations by consolidating and replacing over 200 separate legacy systems. The project was cancelled after $1.1 billion had been spent, leaving it incomplete and nonfunctional (Institute for Defense Analysis, 2011; Kanaracus, 2012; United States Senate Permanent Subcommittee on Investigations, 2014).
2. The state of California’s attempt to merge 13 separate payroll systems into a single system serving 243,000 employees. The project was cancelled after spending $254 million and had proven to be nonfunctional (Chiang, 2013; Kanaracus, 2013).
3. The Census Bureau’s attempt to convert to handheld computers for the 2010 census. The project was cancelled after spending up to $798 million for a nonfunctional product (Nagesh, 2008; US Department of Commerce, 2011).
4. The IRS’s continual attempt to update their system from legacy software. Multiple projects have been cancelled with over $4 billion spent (Hershey, 1996; Moseley, 2013; Thompson, 2012).
5. The US government’s online healthcare website for “Obamacare” was originally budgeted at $93 million. Official cost figures have not been released, but estimates have run as high as $634 million (Costello & Mcclaim, 2013; Dinan & Howell, 2014; Vlahos, 2013).
6. The Federal Aviation Administration’s attempt to consolidate its terminal automation systems for an initial $438 million; the cost overrun has been estimated at $270 million. The project was still ongoing and nonfunctional according to the last reports (Levin, 2013; Perera, 2013).
The various performance studies and reports, conducted through surveys, interviews, and case studies, use different performance statistics, different methods to obtain those statistics, and different values for them. Each study also defines performance in its own way. Because of these factors, a verifiable, universal performance level for the entire ICT industry cannot be determined. From the literature, however, we can conclude that there is a consensus that the ICT industry is perceived to have performance issues. The chronology and the time periods the performance metrics cover also reveal that the ICT industry has been experiencing these perceived performance issues for many years.