Enhancing project success: the impact of sociotechnical integration on project and program management using earned value management systems

Vartenie Aramali (Department of Civil Engineering and Construction Management, California State University Northridge, Northridge, California, USA)
George Edward Gibson (National Academy of Construction, Austin, Texas, USA)
Hala Sanboskani (School of Sustainable Engineering and the Built Environment, Arizona State University, Tempe, Arizona, USA)
Mounir El Asmar (Head of Strategic Accounts, First Solar, Tempe, Arizona, USA)

International Journal of Managing Projects in Business

ISSN: 1753-8378

Article publication date: 27 February 2024


Abstract

Purpose

Earned value management systems (EVMS), also called integrated project and program management systems, have been extensively examined in the literature, which has typically focused on their technical rather than social aspects. This study hypothesizes that improving both the technical maturity of EVMS and the social environment elements of EVMS applications together will significantly impact project performance outcomes. For the first time, empirical evidence supports a strong relationship between EVMS maturity and environment.

Design/methodology/approach

Data was collected from 35 projects through four workshops, attended by 31 industry practitioners with an average of 19 years of EVMS experience. These experts, representing 23 organizations, provided over 2,800 data points on sociotechnical integration and performance outcomes, covering projects totaling $21.8 billion. Statistical analyses were performed to derive findings on the impact of technical maturity and social environment on project success.

Findings

The results show statistically significant differences in cost growth, compliance, meeting project objectives and business drivers and customer satisfaction, between projects with high EVMS maturity and environment and projects with poor EVMS maturity and environment. Moreover, the technical and social dimensions were found to be significantly correlated.

Originality/value

Key contributions include a novel and tested performance-driven framework to support integrated project management using EVMS. The adoption of this detailed assessment framework by government and industry is driving a paradigm shift in project management of some of the largest and most complex projects in the U.S.; specifically transitioning from a project assessment based upon a binary approach for EVMS technical maturity (i.e. compliant/noncompliant to standards) to a wide-ranging scale (i.e. 0–1,000) across two dimensions.


Citation

Aramali, V., Gibson, G.E., Sanboskani, H. and El Asmar, M. (2024), "Enhancing project success: the impact of sociotechnical integration on project and program management using earned value management systems", International Journal of Managing Projects in Business, Vol. 17 No. 8, pp. 1-21. https://doi.org/10.1108/IJMPB-07-2023-0160

Publisher

Emerald Publishing Limited

Copyright © 2024, Vartenie Aramali, George Edward Gibson, Hala Sanboskani and Mounir El Asmar

License

Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode


Introduction

Earned value management systems (EVMS) have been explored in both the academic literature and industry, providing valuable insights into their effectiveness and application (Aramali et al., 2021; Aramali et al., 2022a; Kwak and Anbari, 2009). EVMS can be viewed as a socio-technical system that unifies social and technological components, in line with Fox's theory (1995) and as indicated by Aramali et al. (2022a) in a recent literature review. The technical dimension, or EVMS maturity, is based on the system's conformity to industry standards. The social dimension of EVMS, referred to as the EVMS environment, relies on the team's ability to work together effectively. EVMS maturity is defined as "the degree to which an implemented system, associated processes, and deliverables serve as the basis for an effective and compliant EVMS" (Aramali et al., 2022a). Higher conformity to standards and guidelines indicates a more mature EVMS. On the other hand, the EVMS environment is "the conditions (i.e. people, culture, practices, and resources) that enable or limit the ability to manage the project/program using the EVMS, serving as a basis for timely and effective decision-making". From a managerial perspective, EVMS maturity demonstrates an organization's capacity to efficiently plan, carry out and oversee projects using EVMS. By building up the EVMS environment effectively, organizations can make decisions that benefit them and lead to improved project achievements.

EVMS is "an organization's management system for project and program management that integrates a defined set of associated work scopes, schedules and budgets for effective planning, performance, and management control; it integrates these functions with other business systems such as accounting and human resources among others" (Aramali et al., 2022a). Abba (2017) defined earned value management (EVM) as "the core discipline of integrated program performance management", to shed light on the fact that EVM's overarching role is to integrate management and business systems. For example, during the organizing process, the work breakdown structure should include the identification of subcontractors, and therefore this process has to be integrated with the subcontract management process. As such, EVMS is oftentimes viewed as an integrated project or program management system. Note that both "projects" and "programs" are considered in this study; however, only the term "project" will be used for brevity.

Government agencies and contractors have often questioned the reliability and effectiveness of EVMS when executing projects, in terms of its impact on project performance. To this end, the authors, in partnership with the U.S. Department of Energy, formed a research team of industry experts representing 19 organizations and developed an EVMS assessment model called the Integrated Project/Program Management Maturity and Environment Total Risk Rating (IP2M METRR) to support reliable and effective EVMS (Gibson et al., 2022). IP2M METRR was developed through a series of steps including a comprehensive literature review, a large industry survey, collaborative meetings and focus groups with the research team, a dozen workshops and performance data collection. The model aims to support the effectiveness of EVMS application in an integrated manner by assessing the two dimensions of EVMS, technical and social: maturity and environment. The framework assesses the EVMS application in terms of its maturity and environment, and results in an EVMS maturity score and an EVMS environment score, each on a 1,000-point scale (with higher being better). Note that the research team chose a wide score range of 0–1,000 for assessing EVMS maturity and environment as it provides greater differentiation between scores compared to smaller ranges such as 0–100. Further elaboration on how maturity and environment are measured is provided in the background section.

On one hand, the current traditional use of EVMS in the industry is equivalent to a "one size fits all" system for projects, often without enough consideration of the unique characteristics, requirements and needs of each project (Bergerud, 2017; Hanna, 2012; Andrews et al., 2010). On the other hand, the EVMS standards and guidelines conformity assessment (i.e. compliance assessment) leans towards a binary "Compliant or Non-Compliant" approach, in which stakeholders use a checklist to assess the technical compliance of the system to each EVMS guideline characteristic with a yes/no question (Liggett et al., 2017; McNamee et al., 2017). To address such limitations, the IP2M METRR model in this study gives the flexibility of assessing only the applicable characteristics of the EVMS in a project, allowing the EVMS to be tailored to the project's specific context and needs. Also, the model was tested on completed projects by assessing maturity on a 5-level scale against each of the 56 identified maturity attributes, resulting in an EVMS maturity rating score out of 1,000 points, along with associated gaps and areas for improvement, in contrast to the "Y/N" approach. Moreover, a new assessment layer is added, namely the assessment of the social environment surrounding EVMS, enhancing integrated project and program management in a novel way; this is a major contribution to the EVMS body of knowledge that has existed since the 1960s.

The EVMS framework's building blocks are rooted in literature sources and primary studies supporting the different components. To position contractors for success, compliance assessment instruments should be clearly defined and defendable (DOE, 2018; Kester et al., 2015), which leads to clarity and consistency. Additionally, as the use of data-driven compliance metrics continues to grow (McNamee et al., 2017; Wu and Liang, 2015; Djali et al., 2010), there was a need to define and quantify the characteristics of the EVMS maturity level and the degree of accuracy of its outputs. In a similar vein, efforts by organizations such as the Construction Industry Institute have led to assessment instruments such as the Front End Engineering Design Maturity and Accuracy Total Rating System (Yussef et al., 2019), as well as the Project Definition Rating Index (Gibson et al., 2019). With these tools proven to be highly effective and supported by empirical evidence in the realm of project planning, it was possible to develop components that assess compliance with guidelines while determining the EVMS maturity level, and that assess the accuracy of its outputs. These outputs are well documented in the literature. EVMS integrates project schedule and cost, helps control cost and schedule growth, and provides an early indicator of cost efficiency, the cost performance index (CPI) (Kim and Pinto, 2019; Yussef et al., 2019). It also aids in managing project changes, ensuring decisions align with project goals (Tariq et al., 2020). Furthermore, it aligns the project with business objectives (Kwak and Anbari, 2012), enhances customer satisfaction through transparency and accountability (Kim et al., 2003), and promotes proactive management, which is important in risk mitigation (Christensen and Heise, 1993). All of these outputs, treated as variables in this study along with compliance to standards and guidelines, were not only referenced in the literature but also validated by the research team, which has extensive EVMS expertise and acknowledges their significance for project success.

Problem statement and hypothesis formulation

Budget and schedule overruns on numerous large federal projects have decreased client satisfaction and have often prevented business objectives from being achieved on time (e.g. GAO, 2023). Some project teams struggle to implement an EVMS that is compliant with industry standards, leading to unreliable EVMS, cost growth and schedule slippage (Chirinos, 2015; GAO, 2012). Additionally, to properly apply effective EVMS technical processes (e.g. planning and scheduling, budgeting, subcontract management, etc.), qualified project management specialists are required due to the growing complexity of projects and increased competition (Sharma and Kirtani, 2021). Data-driven and evidence-based decision-making must be better understood to appropriately control project costs and schedules, as opposed to relying on narratives (Carney, 2023; DOE, 2022). Furthermore, the effective application of management systems is suggested to positively impact project performance, thereby increasing the probability of successfully achieving project objectives (Ashkanani and Franzoi, 2022; Kagioglou et al., 2001). In summary, industry and academic sources from the literature suggest that combined efforts of following guidelines and standards and improving personnel knowledge and team culture greatly support projects. Hence, it is imperative to prioritize the improvement of integrated project management approaches, specifically by elevating the effectiveness of EVMS and considering both its technical and social aspects. Stratton (2006) states that the technical maturity of EVMS implementation depends on critical attributes or criteria derived from industry guidelines. In another study, Aramali et al. (2022b) highlighted critical factors that form the social environment and impact project success. However, empirical evidence is lacking on the relationship between the maturity and environment of EVMS and their interdependent impact on project performance. This leads to the general question of whether improving both EVMS maturity and environment within integrated project management accomplishes better project outcomes. The objective of this paper is to empirically investigate the impact of EVMS on project performance when it is implemented as a sociotechnical system, acknowledging that performance is shaped by both the technical and social conditions around it (Ropohl, 1999). Specifically, the authors aim to test the hypothesis that "effective implementation of EVMS considering maturity (technical conditions) and environment (social conditions) elements will result in significantly improved project performance outcomes". This hypothesis has not been tested before, and is not only formulated based on the literature, but also strengthened by the experience of a large research team representing both government and industry. To address this, the authors hosted workshops with expert practitioners to collect performance data from 35 completed projects and conduct an IP2M METRR assessment on each of them. The data were then statistically analyzed to test the hypothesis.

This study's contributions to the body of knowledge include providing researchers and practitioners with methods to improve their integrated management efforts using EVMS, with a better understanding of how to achieve better project outcomes supported by empirical evidence. A key contribution is the paradigm shift from the traditional use of EVMS as a "one size fits all" binary approach of EVMS compliance with guidelines, to a rating scale of 0–1,000 across two interdependent dimensions (EVMS maturity and environment) supported by performance. Another contribution is providing practitioners with a benchmarking system to understand where their projects stand in terms of EVMS scores compared to the identified thresholds. This allows them to make more informed decisions in the areas of technical and social strengths and risks, to improve their EVMS and project performance. Such contributions do not exist in prior one-dimensional approaches (whether technical or social alone). The rest of the paper provides additional research background, discusses existing literature and gaps, and details the methodology of this study. Then, the IP2M METRR scores and performance data are analyzed and interpreted, and the paper concludes with its research contributions as well as practical recommendations.

Background

The maturity component of the IP2M METRR consists of 10 subprocesses divided into 56 attributes referencing major EVMS guidelines (e.g. the 32 Electronic Industries Alliance (EIA) 748 guidelines (NDIA, 2018), PMI (2019), ISO (2018)). The list of the subprocesses and attributes is provided in Supplementary_Material_Appendix_1. An attribute is a core characteristic or quality that is essential to fielding an effective EVMS. Each maturity attribute has a description and is evaluated on a graduated 1-to-5 maturity scale in terms of compliance, with an additional "N/A" option: "1" means that work on this attribute has not yet started, while "5" means the attribute is best in class. Level "4" is the maturity level at which the attribute is compliant with requirements from EVMS standards and guidelines. "N/A" means the attribute is not applicable. As such, the IP2M METRR allows the removal of any of the 56 attributes that do not apply to the project, therefore tailoring the EVMS. For example, if the project does not involve subcontractors, the attribute related to the prime contractor's EVMS flow-down requirements will not be applicable and will not be considered when assessing the maturity; in this case, the attribute receives an "N/A". For a given project being assessed, the maturity level associated with each attribute, which corresponds to a certain attribute score, is first chosen. The determination of the level is based on reading the detailed narrative description provided in the IP2M METRR framework for each attribute level. Attribute scores in the IP2M METRR are assigned in accordance with the relative importance of each attribute. Then, an overall EVMS maturity score, which is the sum of the individual attribute scores, is obtained by completing the EVMS maturity assessment for all attributes. A greater score denotes a better level of maturity, with a maximum possible value of 1,000 points.
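For illustration only, the following minimal sketch shows how attribute-level assessments could be aggregated into an overall maturity score out of 1,000 points; the attribute names, weights and level-to-score mapping used here are hypothetical assumptions, as the actual per-attribute scores are defined in the IP2M METRR framework itself (Gibson et al., 2022).

```python
# Illustrative sketch (hypothetical weights and level-to-score mapping): aggregating
# attribute-level maturity assessments into an overall EVMS maturity score out of
# 1,000 points, with "N/A" attributes removed to tailor the EVMS to the project.

def maturity_score(assessments, weights, max_total=1000):
    """assessments: attribute name -> maturity level (1-5) or "N/A".
    weights: attribute name -> relative importance weight (assumed)."""
    # Assumed fraction of an attribute's points earned at each maturity level.
    level_fraction = {1: 0.0, 2: 0.25, 3: 0.5, 4: 0.75, 5: 1.0}

    # Drop attributes that do not apply to this project.
    applicable = {a: lvl for a, lvl in assessments.items() if lvl != "N/A"}
    total_weight = sum(weights[a] for a in applicable)

    score = 0.0
    for attribute, level in applicable.items():
        attribute_max = max_total * weights[attribute] / total_weight
        score += attribute_max * level_fraction[level]
    return round(score)


# Example with three (of 56) attributes; names and weights are made up.
example = {"Organizing": 4, "Budgeting": 5, "Subcontract management": "N/A"}
example_weights = {"Organizing": 2.0, "Budgeting": 1.5, "Subcontract management": 1.0}
print(maturity_score(example, example_weights))  # 857 out of 1,000 in this example
```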

The environment component consists of four categories divided into 27 factors (Gibson et al., 2022). The list of the categories and factors is provided in Supplementary_Material_Appendix_1. Similarly, each environment factor is evaluated on a scale ranging from Not Acceptable to Needs Improvement, Meets Some, Meets Most, and finally High Performing, with each rating corresponding to a factor score. The individual factor scores sum to an overall environment score on a 1,000-point scale, with higher scores being better. Using data collected from evaluating 35 completed projects, a preceding analysis was conducted to identify thresholds for each score separately. The results identified 550 as the threshold separating low-maturity projects from high-maturity projects and 800 as the threshold separating a poor environment from a good environment, along with the associated project performance implications (Aramali et al., 2022b, c). Preliminary research findings showed promising results in understanding the influence of EVMS maturity attributes alone, and environment factors alone, on project performance (Aramali et al., 2022b, c). These initial findings motivated the authors to study the interdependent effects of both dimensions and examine their correlation.
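A similar aggregation can be sketched for the environment dimension, together with a check against the single-dimension thresholds reported above (550 for maturity and 800 for environment). The equal weighting of factors below is an assumption made purely for illustration; the IP2M METRR assigns factor scores according to their relative importance.

```python
# Illustrative sketch (equal factor weights assumed): computing an environment score
# from the 27 factor ratings and flagging a project against the thresholds reported
# in prior work (550 for maturity, 800 for environment).

RATING_FRACTION = {
    "Not Acceptable": 0.0,
    "Needs Improvement": 0.25,
    "Meets Some": 0.5,
    "Meets Most": 0.75,
    "High Performing": 1.0,
}

def environment_score(factor_ratings, max_total=1000):
    """factor_ratings: list of qualitative ratings, one per environment factor."""
    per_factor_max = max_total / len(factor_ratings)
    return round(sum(per_factor_max * RATING_FRACTION[r] for r in factor_ratings))

def flag_project(maturity, environment):
    maturity_flag = "high maturity" if maturity >= 550 else "low maturity"
    environment_flag = "good environment" if environment >= 800 else "poor environment"
    return maturity_flag, environment_flag

ratings = ["Meets Most"] * 20 + ["High Performing"] * 7  # 27 factors, hypothetical
env = environment_score(ratings)
print(env, flag_project(maturity=703, environment=env))  # 815 ('high maturity', 'good environment')
```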

Literature review

This section introduces the theoretical framework adopted for this study, namely the sociotechnical systems concept, which is explained in terms of its uses and how it relates to EVMS and project performance. It also emphasizes the role of integrated project management in the context of this theoretical framework and summarizes the identified research gaps based on the literature.

Sociotechnical systems

A socio-technical system is a concept developed at the end of the 1950s in London, in labor studies seeking means to help humans adapt to the organizational and technical framework of production (Emery and Trist, 1960). Sociotechnical systems were designed to accommodate the issues of the industry's working conditions by shaping the technical and social conditions such that efficiency and humanity do not contradict each other (Ropohl, 1999). They encourage bottom-up participation, team autonomy and self-regulation (Geels, 2004).

The use of socio-technical systems is spreading to different disciplines and is no longer restricted to labor and manufacturing organizations. In complex engineering development programs, a sociotechnical system was used to develop a framework that identified the fundamental elements of engineering programs (products, processes, organizations and people) as well as the drivers of program performance (deWeck and Rebentisch, 2016). Framing complex engineering development programs as socio-technical systems helps manage the design, engineering, testing, fielding and maintenance of complex engineering programs (deWeck and Rebentisch, 2016). A recent example of a sociotechnical system is the integrated mechanism for designing the sustainable implementation of the fourth industrial revolution (Sony and Naik, 2020).

As argued by Davis et al. (2014), sociotechnical systems can be applied, and have a useful impact, beyond work design and new technologies, such as in the management of crowd events and environmental sustainability. In engineering design decision-making, a socio-technical framework allows the concurrent examination of decision interdependencies, the patterns of social interactions in knowledge requirements and stakeholders' involvement in and influence on decisions (Pirzadeh et al., 2021). Testing the developed socio-technical framework showed that better design decision outcomes can be attained by aligning the information interdependencies of the decisions and the social interaction patterns. This framework can be applied to different project settings to help achieve effective interaction for design decision-making.

When considering the utility of sociotechnical systems, this approach facilitates both the creation and implementation of designs that are not only efficient but also centered around the user experience. Specifically, it advocates for a collaborative synergy between people (social systems) and technology (technical systems) for the effective functioning of any organizational system (Appelbaum, 1997). For example, when users participate in the system design process, stakeholders directly contribute to many benefits such as technical functionality and organizational knowledge construction (Fischer and Herrmann, 2011). Researchers such as Birasnav et al. (2019) and Lee et al. (2008) have demonstrated the role of sociotechnical systems in boosting organizational performance. For example, studies on quality of work-life programs show how changes in organizational design can elevate employee satisfaction and performance (Guest et al., 2022). Moreover, sociotechnical systems foster flexibility and adaptability, essential for responding to changing customer needs or integrating new technologies like artificial intelligence (Makarius et al., 2020; Manz and Stewart, 1997). Finally, the approach helps mitigate failure risks that involve the integration of human and technical factors (Rasmussen, 1997). Gallina et al. (2014) explored safety-related risks through a sociotechnical lens, providing insights on how safety managers can control failure behaviors and prevent failures. Similarly, Bahaei et al. (2019) explored the risks within augmented-reality applications, given that a sociotechnical lens can help effectively address interrelated concerns. In summary, the utility of sociotechnical systems lies in their ability to build organizational environments that are more effective, adaptive and successful.

This shows that sociotechnical systems theory can be applied to different disciplines and has an impact on performance. The objective of the sociotechnical system approach is to effectively blend the social and technical systems of an organization, considering trade-offs and the interdependent relationship between these two dimensions (Fox, 1995). As organizations seek greater productivity in increasingly turbulent environments, sociotechnical systems are needed more than ever. Thus, the design and performance of an organizational system will improve and meet its goals when both the technical and the social elements are treated as interdependent parts of a complex system.

Sociotechnical nature of EVMS in project management

EVMS is one of the top allied disciplines to project management in the management field of study (Kwak and Anbari, 2009). Project management is a set of processes that use certain tools and techniques to deliver several outputs from a set of inputs (PMI, 2017). It is based on several process groups that range from defining a new project to completing this project, passing through planning, executing, monitoring and controlling. Within earlier project management research, there was a strong focus on the technical aspects; however, studies focusing on the human aspects (leadership, team development, etc.) have been trending since the 1990s (Kloppenborg and Opfer, 2002). As such, it became clear to the academic community that project management is highly interrelated with social elements and has a significant interconnection with the social sciences (Turner et al., 2013). To this end, a project has been claimed to be a "social process" or a "social system" (Turner et al., 2013; Söderlund, 2004). This fact, the social dimension of project management, helped researchers identify ways to improve project processes by targeting the people, the methods, the tasks and the project environment in addition to the technology (Lehtinen et al., 2014).

The technical dimension of project management is the base of its processes and performance; however, the social dimension plays a pivotal role in improving those processes. For example, McLeod and MacDonell (2011) empirically reviewed the impact of social factors on software project development and project outcomes. O'Leary and Williams (2013) developed a model of projects as social trajectories based on alignment between the multiple perspectives of project stakeholders. This framework was used to understand the effectiveness of the alignment process based on the social aspect of an Information Technology (IT)-enabled business change project. Alias et al. (2014) developed a conceptual framework identifying five success factors of project management practice and highlighted human-related factors, which cover the resources in terms of the people and the culture, as one critical element influencing project performance.

On the topic of project performance, many studies have examined the impact of various management practices on project performance (e.g. He et al., 2022; Eric et al., 2020). In terms of EVMS, project performance has been carefully addressed from the angle of complying with the technical guidelines to pass the client's check, from the angle of cost and schedule control, and from the angle of risk management (Kim and Pinto, 2019; Babar et al., 2017). For example, Kim and Pinto (2019) presented a CPI decision support tool for visual risk communication, whereas Babar et al. (2017) developed a model that provides a better estimate at completion and validated it on case studies. Both papers improve decision-making regarding EVMS performance and result in improved overall project outcomes considering that the control mechanism is more efficient. Although EVMS is characterized as a project management mechanism, no quantitative research in the literature has examined project performance considering the roles of EVMS maturity and EVMS environment, which are expected to enable more efficient control, risk management and reliable implementation.

The field of integrated project management and control has been focusing on integrating static planning methods and risk analysis techniques with dynamic project control approaches. As highlighted by Vanhoucke (2012), an integrated project management system revolves around dynamic scheduling, relying on baseline scheduling, schedule risk analysis and project control. Schieg (2009) developed a six-stage model for integrated project management in construction to integrate stakeholders with micro- and macroenvironment factors and with the project's lifecycle, and examined the impact of such a proposition on project performance. That study is a clear example of integrating the environment factors expected to affect efficiency with the project lifecycle in a construction management setting.

Nevertheless, as mentioned by Kim et al. (2003), EVMS is a project management and project control mechanism, making its success and efficiency in application, by extension, heavily reliant on the social factors in an organizational setting. Such social factors collectively represent the environment surrounding the EVMS implementation, as identified through research focus groups (Aramali et al., 2022b). Based on recent studies examining practitioners' experience with EVMS, the social factors are introduced as the drivers of a more efficient technical implementation of EVMS (Rezouki and Mortadha, 2020). Thus, EVMS is a sociotechnical system that "technically" complies with the EVMS subprocess guidelines, "facilitated" by a well-driven, aligned environment. This is shown in Figure 1, which presents the maturity subprocesses surrounded by the environment categories that enable achieving their technical objectives.

Gaps in the existing literature

In summary, research has proven that sociotechnical systems theory can be applied to different disciplines and results in improved outcomes. Research has also addressed how systems can achieve their goals when the technical and social elements play an interdependent role in a complex system. Despite the efforts to address EVMS limitations, effective integration between the social and technical dimensions in project management is still in its infancy. This gap can be addressed by understanding the nature of EVMS as a sociotechnical system and building on this system's characteristics to develop a performance-driven framework leading to effective implementation.

Research methodology

To test the research hypothesis and investigate the impact of EVMS maturity and environment on project performance, the authors first hosted four IP2M METRR workshops (each lasting five hours) to collect data from completed projects. In total, 31 industry practitioners, with an average of 19 years of EVM experience and representing 23 unique organizations, applied the IP2M METRR on 35 projects worth $21.8 billion in total costs and provided inputs on their EVMS maturity and environment, as well as information related to project performance. All the projects in this sample are large, complex projects from different industries, spanning 17 different states in the United States. Their industries include construction, defense, environmental, software, aerospace and science. Further details on the projects, companies and practitioners are kept confidential to respect data confidentiality agreements. The research method described in Figure 2 was followed after the data collection workshops concluded.

In Step 1, the authors compiled the data collected from the workshops into a database. The database included the 11 variables, shown in Figure 3, needed for the investigation as identified by literature sources and the research team. These values (along with more than 80 additional, more granular data points supporting them for each project) were collected confidentially and voluntarily from the workshop participants.

The variables for which data were collected are split into two groups: project performance metrics (five variables) and EVMS practice-related variables (six variables). Figure 3 portrays the project lifecycle times at which the variables were assessed, and the hatched lines group the project performance-related and practice-related variables together. For example, workshop participants were asked to assess 56 EVMS maturity attributes and 27 environment factors for each of their projects retroactively, at a point of around 20% project completion, whereas customer satisfaction was assessed at the project end. The EVMS components were assessed at 20% project completion because this is the earliest time at which an EVMS gets established, based on the literature (e.g. Christensen and Heise, 1993; Christensen and Payne, 1992) and corroborated by the research team's experts via focus groups. Evaluating EVMS before 20% completion would not lead to accurate results (Kwak and Anbari, 2012) because a realistic project baseline oftentimes has not been established before that point. The rest of the variables were measured at project completion, which enabled capturing and studying the impact of EVMS components on final project outcomes.

The project performance-related variables are all numerically "continuous" variables recorded in percentages. They were recorded or calculated versus the project measurement baseline (PMB) that was set at 20% project completion. For example, if cost growth equals 30%, it means that this PMB was overrun by 30% at the end of the project (Yussef et al., 2019; Babar et al., 2017). The cost growth without change orders is the final project cost less the absolute value of change orders, measured versus the PMB (Yussef et al., 2019). The schedule growth is measured versus the initially set baseline duration (Yussef et al., 2019). The change absolute value represents the amount of change orders versus the PMB (Yussef et al., 2019). The CPI is a unitless measure of project cost efficiency (Kim and Pinto, 2019).
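The following is a minimal sketch of how these performance variables could be computed from the raw project inputs, reading the definitions above literally; the numbers are hypothetical and the formulas are illustrative rather than the exact workshop calculations.

```python
# Illustrative sketch of the project performance metrics described above.
# All input values are hypothetical.

def performance_metrics(pmb, final_cost, change_orders_abs,
                        baseline_duration, final_duration, earned_value=None):
    """pmb: performance measurement baseline set at ~20% completion ($M);
    change_orders_abs: absolute value of change orders ($M); durations in months."""
    metrics = {
        "cost_growth_pct": 100 * (final_cost - pmb) / pmb,
        "cost_growth_wo_changes_pct": 100 * (final_cost - change_orders_abs - pmb) / pmb,
        "schedule_growth_pct": 100 * (final_duration - baseline_duration) / baseline_duration,
        "change_abs_value_pct": 100 * change_orders_abs / pmb,
    }
    if earned_value is not None:
        # CPI = earned value / actual cost; values above 1.0 indicate cost efficiency.
        metrics["cpi"] = earned_value / final_cost
    return metrics

print(performance_metrics(pmb=100.0, final_cost=130.0, change_orders_abs=10.0,
                          baseline_duration=48, final_duration=56, earned_value=123.5))
```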

On the other hand, practice-related variables are "discrete" variables. The compliance variable is defined as "The characteristics of an EVMS that ensures the intent of the EIA-748 EVMS guidelines is embodied in the integrated processes and subprocesses of a contractor's methods of operation that generate accurate and auditable project/program performance data" (NDIA, 2018). It was assessed in the IP2M METRR by asking workshop participants a binary yes/no question about whether their EVMS was certified. The variables "meeting business objectives", "customer satisfaction" and "EVMS helped proactively manage the project" were all assessed on a 1–5 Likert scale, ranging from "very unsuccessful" to "very successful" (Tariq et al., 2020; Kwak and Anbari, 2012; Kim et al., 2003; Christensen and Heise, 1993). Finally, the EVMS maturity score and EVMS environment score are "continuous" variables (out of 1,000 points) calculated based on the participants' EVMS assessment at the 20% project completion time.

In Step 2, descriptive statistics were generated to determine the mean, median, standard deviation, minimum and maximum for each variable; these are summarized in Table 1. Generating descriptive statistics helped to arrange and interpret the data (Chattamvelli and Shanmugam, 2023). In Step 3, the authors conducted linear correlation and regression analyses. In these analyses, aggregated data in the form of dependent (Y) and independent (X) variables are graphed on a scatterplot and the independent variable is assumed to predict the behavior of the dependent variable (Moore et al., 2010). The dependent variable was set to be the EVMS maturity score, and the independent variable was set to be the EVMS environment score. According to the expert EVMS practitioners on the research team, a key goal is to test and understand whether issues related to the EVMS environment (e.g. communication between project team members, leadership, funding availability, etc.) could statistically explain the behavior of EVMS maturity (e.g. achieving a degree to which EVMS is compliant with standards and guidelines). A linear regression model was used to test and quantify the relationship between them (Waissi, 2015).
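As a minimal sketch of Step 3, the correlation and regression could be computed as follows using SciPy; the score arrays are placeholders standing in for the 35 project assessments rather than the study data.

```python
# Illustrative sketch of Step 3: Pearson correlation and simple linear regression
# between environment scores (independent) and maturity scores (dependent).
# The arrays below are placeholders, not the study data.
import numpy as np
from scipy import stats

environment = np.array([420, 550, 610, 700, 760, 820, 880])
maturity = np.array([430, 540, 620, 690, 770, 810, 870])

r, p_value = stats.pearsonr(environment, maturity)
fit = stats.linregress(environment, maturity)  # slope, intercept, rvalue, pvalue, stderr

print(f"Pearson R = {r:.3f}, R^2 = {r**2:.2f}, p = {p_value:.4f}")
print(f"Maturity score = {fit.slope:.2f} * environment score + {fit.intercept:.2f}")
```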

In Step 4, the authors developed a "heat map" by plotting the maturity scores against the environment scores for each of the 35 projects and then subdividing the plot into four different zones, with the zones being based on project performance and on score thresholds developed iteratively, as discussed in more detail later (see Figure 4). The sample of projects is distributed across all four zones. Different maturity and environment cut-off scores, and different potential formats for illustrating the data (e.g. matrix and heat map), were initially discussed and investigated based on statistical testing. The results from Steps 1 to 4 were shared with the research team in multiple focus group meetings with the objective of interpreting the results, and their feedback was collected. Based on this feedback, the heat map format was agreed upon. The reason was that this illustration could provide not only statistically significant research contributions, but also practical guidance and flexibility to industry practitioners when trying to improve the EVMS maturity and environment for their project by moving from one zone to the next and incrementally achieving higher scores (in successive yearly assessments, for example). The final zone thresholds were set based on the maturity and environment cut-off scores from the literature (Aramali et al., 2022b, c) and augmented by the expert practitioners' input; iterative statistical analyses were performed to compare the performance of different project groups with respect to the nine project performance-related and practice-related variables. This comparison aided in determining the final score thresholds for each zone. The differences in project cost growth between these four zones were found to be statistically significant, and the authors did not observe greater statistically significant differences when selecting other thresholds. Also, the zones allow practical flexibility for improvement throughout the lifecycle of projects, when trying to move from a low zone to higher zones. The color coding of the four zones is shown in Figure 4, with threshold lines at maturity and environment scores of 500, 700, 800 and 1,000.
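The zone assignment can be expressed compactly, as in the minimal sketch below. The rule shown (classifying by the weaker of the two scores against the 500/700/800 threshold lines) is an assumption made for illustration; the actual zone boundaries are those drawn in Figure 4.

```python
# Illustrative sketch of assigning a project to a heat map zone from its maturity and
# environment scores. The rule below, based on the lower of the two scores and the
# 500/700/800 threshold lines, is an assumption; the actual zone boundaries are
# defined graphically in Figure 4.

def heat_map_zone(maturity, environment):
    limiting_score = min(maturity, environment)  # the weaker dimension drives risk
    if limiting_score < 500:
        return "red"
    if limiting_score < 700:
        return "orange"
    if limiting_score < 800:
        return "yellow"
    return "green"

for m, e in [(430, 300), (640, 690), (760, 720), (850, 880)]:
    print((m, e), heat_map_zone(m, e))
```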

In Step 5, descriptive statistics (mean, median, standard deviation, minimum and maximum) were generated for each of the four zones: red, orange, yellow and green (Chattamvelli and Shanmugam, 2023). Their analysis helped assess the project performance within each zone. Then, in Step 6, the authors conducted additional statistical analyses on the performance differences between the finalized four heat map zones for two reasons. The first reason is to empirically test the hypothesis that links EVMS maturity and environment with project performance, thereby addressing the gap discussed earlier. The second reason is to provide guidance to practitioners based on clear performance differences between projects with different EVMS maturity and environment levels. The authors used Mann–Whitney U tests (MW U tests) and Kruskal–Wallis tests to examine the differences in performance between the projects with the various levels of EVMS maturity and environment scores. In preparation for these analyses, Shapiro–Wilk normality tests were applied to the data corresponding to each heat map zone. For each normality test, the null hypothesis H0 states that the data are normally distributed, whereas the alternative hypothesis H1 states that the data are not normally distributed (Royston, 1983). P-values less than 0.05 (5% significance level) reject the null hypothesis, meaning the data are not normally distributed. Independent-sample t-tests were used when comparing two groups whose data were found to be normally distributed. In this case, the null hypothesis (H0) is that the mean values of the two groups being tested against each other are equal, or nearly equal (Morrison, 2009). The alternative hypothesis (H1) is that the mean values of the two groups being tested against each other are not equal (Morrison, 2009). The associated p-value indicates whether the null hypothesis is rejected or whether there is a failure to reject it (Morrison, 2009). When the data of the two groups tested against each other were not normally distributed, the Mann–Whitney U test was applied, which compares the medians (Corder and Foreman, 2014). The nonparametric Mann–Whitney U test was also applied in the case of the discrete variables (Wilcox, 2009). When comparing more than two groups of projects, the nonparametric Kruskal–Wallis test was applied. The null hypothesis (H0) is that the distributions of all the groups being tested against each other are the same (Ostertagova et al., 2014). The alternative hypothesis (H1) is that the distributions of all the groups being tested against each other are not the same (Ostertagova et al., 2014). If the associated p-value from the test was lower than 0.05, then the null hypothesis was rejected, meaning that at least one of the groups differs from the others. In this case, tests that compare two groups were applied to identify where the differences occurred. The results of these analyses are discussed next.
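The test selection logic of Step 6 can be condensed into the following sketch, using SciPy's implementations of the Shapiro–Wilk, independent-sample t, Mann–Whitney U and Kruskal–Wallis tests; the group arrays are placeholders rather than the per-zone performance data.

```python
# Illustrative sketch of Step 6: choosing between parametric and nonparametric tests
# based on Shapiro-Wilk normality results, at a 5% significance level.
# The group arrays are placeholders, not the study data.
import numpy as np
from scipy import stats

ALPHA = 0.05

def compare_two_groups(a, b, discrete=False):
    """Return the test used and its p-value for two independent groups."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    normal = (not discrete
              and stats.shapiro(a).pvalue >= ALPHA
              and stats.shapiro(b).pvalue >= ALPHA)
    if normal:
        return "t-test", stats.ttest_ind(a, b).pvalue        # compares means
    return "Mann-Whitney U", stats.mannwhitneyu(a, b).pvalue  # compares medians

def compare_many_groups(*groups):
    """Kruskal-Wallis across more than two groups; follow up pairwise if significant."""
    return "Kruskal-Wallis", stats.kruskal(*groups).pvalue

red = [92.3, 110.0, 75.0, 60.0]    # e.g. cost growth (%) per zone, placeholder values
orange = [48.1, 40.0, 55.0]
yellow = [13.7, 10.0, 18.0]
green = [-0.3, 2.0, -5.0, 1.5]
print(compare_two_groups(red, green))
print(compare_many_groups(red, orange, yellow, green))
```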

Results and findings

This section first presents descriptive statistics, followed by the relationship between maturity and environment. Then the performance-based heat map is presented, before concluding with the project performance results. Evaluating each of the 56 maturity attributes and 27 environment factors for each project in the dataset resulted in a unique EVMS maturity and environment score for every project. Table 1 shows the data characteristics of the sample, including both maturity and environment scores, as well as project performance data.

The relationship between EVMS maturity and environment

The average maturity and environment scores for the sample were coincidentally the same, at 657 out of 1,000 points. The scores ranged from 78 to 898 for maturity (with a median of 703), and from 200 to 897 for environment (with a median of 686). The IP2M METRR was able to gauge projects across a wide range of maturity and environment scores, demonstrating diversity in the projects evaluated. Also, to ensure that maturity and environment were reliably assessed and that these scores are representative of the two main dimensions, Cronbach's alpha reliability tests were conducted on the assessments of the 27 environment factors and the 56 maturity attributes in the sample. The results show Cronbach's alpha values of 0.93 and 0.97, respectively, indicating a high degree of consistency (Tavakol and Dennick, 2011).
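Cronbach's alpha is not part of SciPy, but it can be computed directly from the item and total variances; the brief sketch below assumes a projects-by-items matrix of ratings and is given for illustration only.

```python
# Illustrative sketch: Cronbach's alpha for the internal consistency of a set of items
# (e.g. the 27 environment factors rated across projects). Rows are projects, columns
# are items; the small matrix below is a placeholder.
import numpy as np

def cronbach_alpha(ratings):
    ratings = np.asarray(ratings, dtype=float)
    k = ratings.shape[1]                          # number of items
    item_variances = ratings.var(axis=0, ddof=1)  # variance of each item across projects
    total_variance = ratings.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

ratings = [[4, 4, 5, 3], [3, 3, 4, 3], [5, 4, 5, 4], [2, 3, 3, 2], [4, 5, 5, 4]]
print(round(cronbach_alpha(ratings), 2))  # values near 1.0 indicate high consistency
```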

To study the relationship between the environment score and the maturity score, a correlation analysis was conducted on the sample, as illustrated in the plot in Figure 5. The research team's hypothesis is that maturity is in part a function of the environment that the team is exposed to, hence the tested dependency. This hypothesis is rooted in the existing literature on socio-technical systems, which highlights the importance of the social factors shaping system outcomes and performance in different fields (e.g. Pirzadeh et al., 2021; Righi and Saurin, 2015). The authors tested this specifically on large, complex government projects, given that compliance with EVMS standards and guidelines on such projects is a contractual requirement (Bergerud, 2017; Bhaumik, 2016; Marshall, 2007). However, there is no limitation to applying the framework to projects even in the absence of such contractual requirements.

The results showed a Pearson R-value of 0.835 (R2 = 0.69), which indicates a strong correlation between the environment and maturity scores, with a direct, positive and statistically significant relationship between them (p-value = 0.001 < 0.05). The analysis indicates that changes in the predictor variable (environment score) were significantly correlated with changes in the response variable (maturity score) in this sample. Projects that can achieve a high-performing environment with the right people, culture, resources and practices appear to exhibit more mature EVMS subprocesses. This is the first time in the EVMS literature that such a claim can be made based on statistically significant quantitative assessments of both maturity and environment, backed up by more than $20 billion worth of completed projects. The linear regression resulted in the following equation:

Maturity Score = (Environment Score × 0.96) + 24.27      (1)

This means that for each one-point increase in the environment score, the maturity score is predicted to increase by 0.96 points. The sample's environment score accounts for 69% of the variance seen in the maturity score. Interestingly, the authors later added independently collected scores from eight in-progress projects to the sample, and the relationship proved to be similarly strong within the new sample of 43 projects, with an R2 value of 0.71 (compared to 0.69 with the first sample). These results are consistent with Ling et al. (2009), who found 46 significant relationships involving social practices around project management processes.

One implication of these findings is that due importance should be given to the social aspect when seeking improvements in the technical functionality of EVMS. The organization must look at each unique project and find out the means of improving the EVMS environment of the project, which in turn will lead to a more mature EVMS. This highlights how a reliable EVMS viewed as a sociotechnical system enables opportunities for effectively integrating project management while potentially achieving better performance outcomes. The latter component of this statement is tested next.

The EVMS heat map

Each project for which data was collected is plotted as a dot in Figure 6, which shows the maturity and environment scores in (x, y) format, with x representing the environment score and y representing the maturity score. The plot is superimposed with a performance-based heat map.

An important finding based on Figure 6 is that, within the sample of 35 projects, no project fell in the bottom-right red area (i.e. having a high score in environment and a low score in maturity). This indicates that the projects that excelled in their EVMS environment had relatively mature EVMS subprocesses, which is consistent with the finding of the previous section. Furthermore, the scores spanned a considerable range from the lower-left corner all the way to the upper-right corner of the heat map, with the largest concentration between 500 and 800, allowing a diverse sample of projects to be benchmarked against one another on this map.

Project performance impacts

In this section, the project performance results are discussed. The performance variables were analyzed for the projects within each heat map zone, and their average values are shown in Table 2.

Based on Kruskal–Wallis tests, statistically significant differences were found between the four zones in cost growth, compliance, meeting business objectives and customer satisfaction. To determine where these differences come from, the six zone pairs were further compared (i.e. green vs yellow, green vs orange and so forth) using MW U tests for each of the four significant variables. A few of these key metrics are discussed next.

Cost Growth: In terms of cost growth, the MW U test results indicate that projects in the red zone exhibit statistically significant differences in the median (p-value < 0.05) compared to orange, yellow and green projects. The cost growth median of the red projects is 64.8% higher than that of the green projects. Similarly, the green projects outperform the orange projects with statistically significant differences. Past research found that the root causes of low maturity in EVMS application and of project cost growth are in part related to poor risk management (risk identification, risk analysis, risk integration with EVMS) (Aramali et al., 2022c; Alleman et al., 2018). Projects with low maturity and environment scores can potentially improve by integrating project risk with EVMS to produce more reliable data that informs decision-making.

Compliance: The majority (75%) of the projects in the sample had a certified or compliant EVMS at the 20% project completion time. This certification is typically provided by the government owner when the contractor's EVMS conforms with the EIA-748-D EVMS guidelines and when the data outputs from the EVMS are "reliable, timely, and actionable" per current traditional compliance assessment methods (NDIA, 2018; Kester et al., 2015). All the projects in the green and yellow zones were compliant with EIA-748-D. In contrast, the majority of the projects (80%) in the red zone were non-compliant.

Business Objectives and Customer Satisfaction: Interestingly, the average values for meeting business objectives and for customer satisfaction were equal and consistent with each other in all zones for this sample. As the projects reached higher scores in maturity and environment, they performed better and more successfully achieved business objectives and drivers, customer satisfaction and proactive use of EVMS. For both the meeting business objectives and customer satisfaction variables, red zone projects significantly underperformed compared to orange, yellow and green projects (p-value < 0.005) according to the MW U test results. These results align with the earlier findings of Andersen and Jessen (2003), who stated that the knowledge and attitudes of team members are strong drivers of better practices in project management, and hence of mastering the business basics and achieving project goals. Also, these results suggest that the benefits of assessing EVMS maturity and environment are similar to the benefits of the organizational project management maturity model (PMI, 2013). Key benefits from the literature included improved customer satisfaction, better market share, compliance with best practices and effective project management.

In summary, these key results affirm the hypothesis that higher EVMS maturity and environment scores are positively correlated with project performance, specifically in terms of cost growth, compliance, meeting business objectives and customer satisfaction. They provide strong evidence indicating that projects with high EVMS maturity and a good EVMS environment significantly outperform projects with low EVMS maturity and a poor EVMS environment on at least four significant variables, according to this sample of more than $20 billion worth of projects. The authors suggest that organizations need to pay serious attention to any ongoing project that scores less than 500 in both maturity and environment, specifically by adding new environment evaluations to support the practices needed to meet maturity and compliance expectations.

The key recommendation, to achieve truly integrated project management by allocating effort to the social environment surrounding the EVMS, has led to a positive paradigm shift in project management for one of the nation's largest government organizations today. In fact, effectively implementing EVMS as a sociotechnical system aligns with and reinforces earlier findings discussed by researchers including Rode et al. (2022) and Kwak and Anbari (2012), among many others, while also providing a detailed framework to help gauge, assess and score individual attributes and factors, complete with identification of gaps and corrective actions, and linked to project performance. One industry reviewer used the term "the holy grail of project management" as part of his reaction to this new framework. Another government participant representing NASA included "cracking the code" as part of their feedback on the final work. At the same time, the authors understand that the performance results, as for any study, face limitations stemming from the sample of projects used, and that this framework will continue to be improved as more data are collected and as it continues to be used by industry and government in the years to come.

Conclusion and recommendations

The authors analyzed more than 2,800 data points collected from 35 completed projects and studied the interdependent impact of EVMS technical and social variables on project performance; no such investigation had been conducted to date. The data were collected through workshops where participants used the novel IP2M METRR framework, an assessment model that measures an EVMS's maturity and environment by aggregating the scores of 56 maturity attributes and 27 environment factors, each assessed individually. The collected data were analyzed statistically to examine whether higher scores in maturity and environment are positively correlated with project performance and whether implementing EVMS as a sociotechnical system helps achieve efficient integrated project/program management and influences project success. First, it was found that maturity and environment scores ranged from 78 to approximately 900 (out of 1,000 possible points) in the sample of projects studied here, with the highest percentage of the projects scoring between 500 and 800. Second, the results showed statistically significant differences related to both maturity and environment in terms of cost growth, compliance with the EIA-748-D guidelines, meeting project objectives and business drivers, and customer satisfaction. These differences were found between the projects and programs that were implementing an effective EVMS with reliable data and those that were less committed to EVMS. Empirical evidence in the key findings included the statistically significant differences whereby projects with low EVMS maturity and a poor EVMS environment incurred 64.8% greater cost overruns than those exhibiting high EVMS maturity and environment. Third, EVMS environment and maturity were found to be strongly and positively correlated with one another in this sample; projects that achieved excellence in their environment had the most mature EVMS subprocesses. Based on the collected data, IP2M METRR was shown to be an effective novel framework for measuring the EVMS maturity and environment of large and complex projects and programs across various industry sectors.

In addition, the contributions of this research include the creation of an EVMS heat map that displays where a given project stands in terms of EVMS maturity and environment (in one of the red, orange, yellow or green zones). The IP2M METRR heat map is an insightful guide for practitioners in visualizing their projects across both dimensions and can also be used to benchmark a project against others internally and externally. Moreover, the IP2M METRR model has recently been coded into a practitioner webtool, which is currently being used by federal agencies to assess their IP2M maturity and environment using EVMS for some of the nation's largest and most critical projects and programs. As a major paradigm shift from legacy EVMS practices, the novel IP2M METRR assessment model moves away from binary compliance assessments focused entirely on guideline adherence to a more tailorable model scored across two dimensions, out of 1,000 points in each dimension.

The research team recommends the use of the IP2M METRR across the project lifecycle to highlight the human and technical parameters that are essential for an effective EVMS implementation for integrating project and program management. Emphasis must be given to the human aspect around project controls because it is an upstream dimension that affects the maturity of EVMS subprocesses and project outcomes. This environment dimension, which has always been present but has now been newly formalized and tested against performance, not only includes the adequate expertise of the project team members (people), but also the appropriate team size, their values and beliefs (culture), their professional training, effective coordination (practices) and the availability of the right technology (resources), among others (Gibson et al., 2022). This recommendation may serve as a guide for organizations to improve their integrated project and program management relying on EVMS as a sociotechnical system. One limitation of this study is that the conclusions are based on the sample of projects used in the study. Even though the sample size was adequate for the work and the statistical tests, caution should be used in generalizing the findings to all projects. In future studies, the sample of projects could be further increased and broadened by incorporating projects on other continents as well.

Figures

Figure 1. The sociotechnical elements of EVMS in integrated project/program management

Figure 2. Research analysis steps

Figure 3. Assessed variables and project performance metrics

Figure 4. Format of maturity and environment heat map graph

Figure 5. EVMS maturity and environment relationship and linear fit (N = 35)

Figure 6. EVMS maturity and environment heat map graph (N = 35)

Descriptive statistics (N = 35)

| Variables | Mean | Median | Std. Dev. | Min | Max |
| --- | --- | --- | --- | --- | --- |
| Inputs (collected) | | | | | |
| Initial performance measurement baseline (PMB) budget (in $M) | 473.4 | 112.0 | 976.9 | 3.1 | 3981.0 |
| Final project cost (in $M) | 662.1 | 150.0 | 1491.2 | 4.8 | 7500.0 |
| Final cost performance index (unitless) | 0.94 | 0.98 | 0.12 | 0.60 | 1.1 |
| Absolute value of change orders (in $M) | 37.5 | 11.0 | 61.6 | 0.0 | 266.0 |
| Initial baseline project/program duration (in months) | 50.3 | 48.0 | 20.7 | 8.0 | 96.0 |
| Final project/program duration (in months) | 56.0 | 48.5 | 25.8 | 20.0 | 132.0 |
| Meeting business objectives (1–5 scale) | 4.1 | 4.0 | 1.1 | 1.0 | 5.0 |
| Customer satisfaction (1–5 scale) | 4.1 | 4.0 | 1.1 | 1.0 | 5.0 |
| EVMS helped proactively manage (1–5 scale) | 3.5 | 4.0 | 0.9 | 1.0 | 5.0 |
| Outputs (calculated) | | | | | |
| EVMS maturity score (out of 1,000) | 657 | 703 | 182 | 78 | 898 |
| EVMS environment score (out of 1,000) | 657 | 686 | 158 | 200 | 897 |
| Cost growth (in %) | +56.1 | +13.0 | 121.4 | −13.8 | +537.9 |
| Cost growth, without change orders (in %) | +9.9 | +0.0 | 36.7 | −51.6 | +147.1 |
| Schedule growth (in %) | +17.8 | +2.1 | 46.6 | −20.0 | +250.0 |
| Change absolute value (in %) | 53.3 | 13.8 | 122.2 | 0.0 | 537.9 |

Source(s): Tables were created by the authors
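For readers who wish to relate the calculated (output) metrics in the table above to the collected inputs, the short sketch below shows plausible definitions of cost growth, schedule growth and change value percentages. These formulas are assumptions based on common practice and may differ in detail from the authors' calculations.

```python
# Illustrative sketch: plausible (assumed) definitions of the derived metrics.
def cost_growth_pct(initial_pmb: float, final_cost: float) -> float:
    """Cost growth as a percentage of the initial performance measurement baseline budget."""
    return (final_cost - initial_pmb) / initial_pmb * 100.0

def schedule_growth_pct(initial_months: float, final_months: float) -> float:
    """Schedule growth as a percentage of the initial baseline duration."""
    return (final_months - initial_months) / initial_months * 100.0

def change_abs_value_pct(change_orders_abs: float, initial_pmb: float) -> float:
    """Absolute value of change orders relative to the initial baseline budget."""
    return change_orders_abs / initial_pmb * 100.0

# Applied to the sample means purely for illustration; note that the table's +56.1% mean
# cost growth is a mean of per-project growth values, not the growth of the mean values.
print(round(cost_growth_pct(473.4, 662.1), 1))    # ~ +39.9
print(round(schedule_growth_pct(50.3, 56.0), 1))  # ~ +11.3
```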

Project performance metrics per EVMS maturity and environment heat map zone

| Performance metrics | Red | Orange | Yellow | Green | p-value |
| --- | --- | --- | --- | --- | --- |
| Cost growth (in %) | +92.3 | +48.1 | +13.7 | −0.3 | 0.007* |
| Cost growth, without change orders (in %) | +45.4 | +6.3 | +0.2 | −4.4 | 0.093 |
| Schedule growth (in %) | +24.3 | +26.9 | +3.7 | −5.9 | 0.102 |
| Change absolute value (in %) | 56.7 | 47.2 | 13.4 | 7.7 | 0.454 |
| Final cost performance index (unitless) | 0.85 | 0.95 | 0.95 | 1.03 | 0.091 |
| % of projects compliant with EIA-748-D | 20.0% | 71.5% | 100.0% | 100.0% | 0.007* |
| Meeting business objectives (1–5 scale) | 2.7 | 4.3 | 4.4 | 5.0 | 0.003* |
| Customer satisfaction (1–5 scale) | 2.7 | 4.3 | 4.4 | 5.0 | 0.007* |
| EVMS helped proactively manage (1–5 scale) | 2.7 | 3.5 | 3.9 | 4.0 | 0.105 |

Note(s): *denotes p-values that are less than 0.05, meaning the observed differences between the heat map zones are statistically significant in this sample based on Kruskal–Wallis tests

Source(s): Tables were created by the authors
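The note above refers to Kruskal–Wallis tests across the four heat map zones; the short sketch below shows how such a test could be run with SciPy for a single metric such as cost growth. The per-zone values are placeholders for demonstration, not the study data.

```python
# Illustrative sketch: Kruskal-Wallis test of a performance metric across heat map zones.
from scipy import stats

cost_growth_by_zone = {                      # hypothetical project-level values (%)
    "red":    [120.0, 85.0, 60.0, 104.2],
    "orange": [70.0, 30.5, 44.0],
    "yellow": [20.0, 5.0, 16.1],
    "green":  [1.0, -3.0, 0.8],
}

h_stat, p_value = stats.kruskal(*cost_growth_by_zone.values())
print(f"H = {h_stat:.2f}, p = {p_value:.3f}")
if p_value < 0.05:
    print("Differences between zones are statistically significant at the 5% level.")
```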

Appendix

The supplementary material for this article can be found online.

References

Abba, W.F. (2017), “The evolution of earned value management”, College of Performance Management, Vol. 2, pp. 912.

Alias, Z., Zawawi, E.M.A., Yusof, K. and Aris, N.M. (2014), “Determining critical success factors of project management practice: a conceptual framework”, Procedia-Social and Behavioral Sciences, Vol. 153, pp. 61-69, doi: 10.1016/j.sbspro.2014.10.041.

Alleman, G., Coonce, T. and Price, R. (2018), “Increasing the probability of program success with continuous risk management”, College Performance Management, Vol. 4, pp. 27-46.

Andersen, E.S. and Jessen, S.A. (2003), "Project maturity in organisations", International Journal of Project Management, Vol. 21 No. 6, pp. 457-461, doi: 10.1016/s0263-7863(02)00088-1.

Andrews, R., Cooper, J., Ellsworth, B., Sestak, J., Conaway, K.M., Hunter, D. and Coffman, M. (2010), “House armed services committee panel on defense acquisition reform findings and recommendations”, available at: http://seaonline.org/AboutSEA/news/NewsDownloads/DARFINALREPORT032310.pdf

Appelbaum, S.H. (1997), “Socio‐technical systems theory: an intervention strategy for organizational development”, Management Decision, Vol. 35 No. 6, pp. 452-463, doi: 10.1108/00251749710173823.

Aramali, V., Sanboskani, H., Gibson, G.E. Jr, El Asmar, M. and Cho, N. (2022a), “Forward-looking state-of-the-art review on earned value management systems: the disconnect between academia and industry”, Journal of Management in Engineering, Vol. 38 No. 3, 03122001, doi: 10.1061/(asce)me.1943-5479.0001019.

Aramali, V., Gibson, G.E., Jr., El Asmar, M. and Cho, N. (2021), “Earned value management system state of practice: identifying critical subprocesses, challenges, and environment factors of a high-performing EVMS”, Journal of Management in Engineering, Vol. 37 No. 4, 04021031, doi: 10.1061/(asce)me.1943-5479.0000925.

Aramali, V., Gibson, G.E. Jr, El Asmar, M., Cho, N. and Sanboskani, H. (2022b), EVMS Environment Assessment Tool Development Process: Integrated Project/Program Management (IP2M) Maturity and Environment Total Risk Rating (METRR) Using an Earned Value Management System (EVMS), Report No. 3, Annex A, School of Sustainable Engineering and the Built Environment, Ira A. Fulton Schools of Engineering, Arizona State University.

Aramali, V., Gibson, G.E. Jr, El Asmar, M., Cho, N. and Sanboskani, H. (2022c), EVMS Maturity Assessment Tool Development Process: Integrated Project/Program Management (IP2M) Maturity and Environment Total Risk Rating (METRR) Using an Earned Value Management System (EVMS), Report No. 4, Annex A, School of Sustainable Engineering and the Built Environment, Ira A. Fulton Schools of Engineering, Arizona State University.

Ashkanani, S. and Franzoi, R. (2022), “An overview on megaproject management systems”, Management Matters, Vol. 19 No. 2, pp. 129-148.

Babar, S., Thaheem, M.J. and Ayub, B. (2017), “Estimated cost at completion: integrating risk into earned value management”, Journal of Construction Engineering and Management, Vol. 143 No. 3, 04016104, doi: 10.1061/(asce)co.1943-7862.0001245.

Bahaei, S.S., Gallina, B., Laumann, K. and Skogstad, M.R. (2019), “Effect of augmented reality on faults leading to human failures in socio-technical systems”, 2019 4th International Conference on System Reliability and Safety (ICSRS), IEEE, pp. 236-245.

Bergerud, C. (2017), “Adopting a flexible EVM strategy to optimize project performance”, Proc., 2017 AACE Int. Transactions, EVM, 2590.

Bhaumik, H. (2016), “EVMS recommendations for multi-contract projects”, Proc., 2016 AACE Int. Transactions, EVM, 2141.

Birasnav, M., Chaudhary, R. and Scillitoe, J. (2019), “Integration of social capital and organizational learning theories to improve operational performance”, Global Journal of Flexible Systems Management, Vol. 20 No. 2, pp. 141-155, doi: 10.1007/s40171-019-00206-9.

Carney, K. (2023), Understanding Environment Resources and Their Impact on EVMS Implementation, DOE Project Management News, available at: https://www.energy.gov/sites/default/files/2023-04/PMNewsletterApril2023_0.pdf

Chattamvelli, R. and Shanmugam, R. (2023), Descriptive Statistics for Scientists and Engineers: Applications in R, Springer Nature, London.

Chirinos, W. (2015), “Integrated project management approach for EPC Energy projects”, Proc., 2015 AACE Int. Transactions, EVM-1834, AACE, Morgantown, WV.

Christensen, D.S. and Heise, S.R. (1993), “Cost performance index stability”, National Contract Management Journal, Vol. 25 No. 1, pp. 7-15.

Christensen, D. and Payne, K. (1992), “Cost performance index stability: fact or fiction?”, Journal of Parametrics, Vol. 12 No. 1, pp. 27-40, doi: 10.1080/10157891.1992.10462509.

Corder, G.W. and Foreman, D.I. (2014), Nonparametric Statistics: A Step-by-step Approach, John Wiley & Sons, NY.

Davis, M.C., Challenger, R., Jayewardene, D.N. and Clegg, C.W. (2014), “Advancing socio-technical systems thinking: a call for bravery”, Applied Ergonomics, Vol. 45 No. 2, pp. 171-180, doi: 10.1016/j.apergo.2013.02.009.

de Weck, O. and Rebentisch, E. (2016), "Acquisition program teamwork and performance seen anew: exposing the interplay of architecture and behaviors in complex defense projects", Acquisition Research Program, available at: https://www.dair.nps.edu/bitstream/123456789/2664/1/MIT-AM-17-006.pdf

Djali, S., Janssens, S., Van Yper, S. and Van Parijs, J. (2010), “How a data-driven quality management system can manage compliance risk in clinical trials”, Drug Information Journal, Vol. 44 No. 4, pp. 359-373, doi: 10.1177/009286151004400402.

DOE (2018), Office of Project Management EVMS Compliance Review Standard Operating Procedure (ECRSOP)–APPENDIX A: Compliance Assessment Guidance, DoE, Washington, DC.

DOE (2022), Integrated Project Management—Earned Value Management System (EVMS), DOE G 413.3-10B, DOE, Washington, DC.

Emery, F.E. and Trist, E.L. (1960), "Socio-technical systems", Management Sciences, Models and Techniques, Vol. 2, London.

Eric, M., Jean de dieu, D., Placide, M. and Gemariel, N. (2020), “Effect of project planning practices on improving project performance in Rwanda. A case of huguka dukore akazi kanoze project in Nyabihu district (2017-2020)”, International Journal of Social Sciences: Current and Future Research Trends (IJSSCFRT), pp. 88-94.

Fischer, G. and Herrmann, T. (2011), “Socio-technical systems: a meta-design perspective”, International Journal of Sociotechnology and Knowledge Development (IJSKD), Vol. 3 No. 1, pp. 1-33, doi: 10.4018/jskd.2011010101.

Fox, W.M. (1995), “Sociotechnical system principles and guidelines: past and present”, The Journal of Applied Behavioral Science, Vol. 31 No. 1, pp. 91-105, doi: 10.1177/0021886395311009.

Gallina, B., Sefer, E. and Refsdal, A. (2014), “Towards safety risk assessment of socio-technical systems via failure logic analysis”, 2014 IEEE International Symposium on Software Reliability Engineering Workshops, IEEE, pp. 287-292.

GAO (Government Accountability Office) (2012), NASA: Earned Value Management Implementation across Major Spaceflight Projects Is Uneven, GAO, Washington, DC.

GAO (Government Accountability Office) (2023), COLUMBIA CLASS SUBMARINE: Program Lacks Essential Schedule Insight amid Continuing Construction Challenges, GAO, Washington, DC.

Geels, F.W. (2004), “From sectoral systems of innovation to socio-technical systems: insights about dynamics and change from sociology and institutional theory”, Research Policy, Vol. 33 Nos 6-7, pp. 897-920, doi: 10.1016/j.respol.2004.01.015.

Gibson, G.E.J., El Asmar, M. and Cho, N. (2019), Project Definition Rating Index: Maturity and Accuracy Total Rating System (PDRI MATRS), Construction Industry Institute, Austin, TX.

Gibson, G.E. Jr, El Asmar, M., Sanboskani, H. and Aramali, V. (2022), "Implementing the integrated project/program management (IP2M) maturity and environment total risk rating (METRR) using EVMS in a team environment", Report No. 6, School of Sustainable Engineering and the Built Environment, Ira A. Fulton Schools of Engineering, Arizona State University.

Guest, D., Knox, A. and Warhurst, C. (2022), “Humanizing work in the digital age: lessons from socio-technical systems and quality of working life initiatives”, Human Relations, Vol. 75 No. 8, pp. 1461-1482, doi: 10.1177/00187267221092674.

Hanna, A.S. (2012), “Using the earned value management system to improve electrical project control”, Journal of Construction Engineering and Management, Vol. 138 No. 3, pp. 449-457, doi: 10.1061/(asce)co.1943-7862.0000426.

He, C., Liu, M., Alves, T.da C.L., Scala, N.M. and Hsiang, S.M. (2022), “Prioritizing collaborative scheduling practices based on their impact on project performance”, Construction Management and Economics, Vol. 0 No. 0, pp. 1-20, doi: 10.1080/01446193.2022.2048042.

ISO (2018), Earned Value Management in Project and Programme Management, ISO, Geneva.

Kagioglou, M., Cooper, R. and Aouad, G. (2001), “Performance management in construction: a conceptual framework”, Construction Management and Economics, Vol. 19 No. 1, pp. 85-95, doi: 10.1080/01446190010003425.

Kester, D., Cottrell, D. and Carney, K. (2015), “Data driven EVMS compliance: an analytical approach that will transform the way we think about managing”, College Performance Management, Vol. 2, pp. 7-13.

Kim, B.-C. and Pinto, J.K. (2019), "What CPI = 0.85 really means: a probabilistic extension of the estimate at completion", Journal of Management in Engineering, Vol. 35 No. 2, 04018059, doi: 10.1061/(asce)me.1943-5479.0000671.

Kim, E., Wells, W.G. Jr and Duffey, M.R. (2003), “A model for effective implementation of Earned Value Management methodology”, International Journal of Project Management, Vol. 21 No. 5, pp. 375-382, doi: 10.1016/s0263-7863(02)00049-2.

Kloppenborg, T.J. and Opfer, W.A. (2002), “The current state of project management research: trends, interpretations and predictions”, Project Management Journal, Vol. 33 No. 2, pp. 5-18, doi: 10.1177/875697280203300203.

Kwak, Y.H. and Anbari, F.T. (2009), “Analyzing project management research: perspectives from top management journals”, International Journal of Project Management, Vol. 27 No. 5, pp. 435-446, doi: 10.1016/j.ijproman.2008.08.004.

Kwak, Y.H. and Anbari, F.T. (2012), “History, practices, and future of earned value management in government: perspectives from NASA”, Project Management Journal, Vol. 43 No. 1, pp. 77-90, doi: 10.1002/pmj.20272.

Lee, S.M., Kim, K., Paulson, P. and Park, H. (2008), “Developing a socio‐technical framework for business‐IT alignment”, Industrial Management and Data Systems, Vol. 108 No. 9, pp. 1167-1181, doi: 10.1108/02635570810914874.

Lehtinen, T.O., Mäntylä, M.V., Vanhanen, J., Itkonen, J. and Lassenius, C. (2014), "Perceived causes of software project failures - an analysis of their relationships", Information and Software Technology, Vol. 56 No. 6, pp. 623-643, doi: 10.1016/j.infsof.2014.01.015.

Liggett, W., Hunter, H. and Jones, M. (2017), “Navigating an earned value management validation led by NASA: a contractor's perspective and helpful hints”, 2017 IEEE Aerospace Conference, IEEE, pp. 1-28.

Ling, F., Low, S., Wang, S. and Lim, H. (2009), “Key project management practices affecting Singaporean firms' project performance in China”, International Journal of Project Management, Vol. 27 No. 1, pp. 59-71, doi: 10.1016/j.ijproman.2007.10.004.

Makarius, E.E., Mukherjee, D., Fox, J.D. and Fox, A.K. (2020), “Rising with the machines: a sociotechnical framework for bringing artificial intelligence into the organization”, Journal of Business Research, Vol. 120, pp. 262-273, doi: 10.1016/j.jbusres.2020.07.045.

Manz, C.C. and Stewart, G.L. (1997), “Attaining flexible stability by integrating total quality management and socio-technical systems theory”, Organization Science, Vol. 8 No. 1, pp. 59-70, doi: 10.1287/orsc.8.1.59.

Marshall, R. (2007), “The contribution of earned value management to project success on contracted efforts”, Journal of Contract Management, Vol. 5 No. 1, pp. 21-33.

McLeod, L. and MacDonell, S.G. (2011), “Factors that affect software systems development project outcomes: a survey of research”, ACM Computing Surveys (CSUR), Vol. 43 No. 4, pp. 1-56.

McNamee, E.M., Hanner, C.E. and Immonen, C.W. (2017), “Improving EVMS compliance through data integration”, Proc., 2017 AACE Int. Transactions, EVM, 2581.

Moore, D.S., McCabe, G.P., Alwan, L.C., Craig, B.A. and Duckworth, W.M. (2010), The Practice of Statistics for Business and Economics, 3rd ed., W.H. Freeman and Company, New York, NY.

Morrison, J. (2009), Statistics for Engineers: An Introduction, John Wiley & Sons, Chichester.

NDIA (2018), Earned Value Management Systems EIA-748-D Intent Guide, Integrated Program Management Division, Arlington, VA.

Ostertagova, E., Ostertag, O. and Kováč, J. (2014), "Methodology and application of the Kruskal-Wallis test", Applied Mechanics and Materials, Vol. 611, pp. 115-120, doi: 10.4028/www.scientific.net/amm.611.115.

O'Leary, T. and Williams, T. (2012), “Managing the social trajectory: a practice perspective on project management”, IEEE Transactions on Engineering Management, Vol. 60 No. 3, pp. 566-580, doi: 10.1109/tem.2012.2228206.

Pirzadeh, P., Lingard, H. and Blismas, N. (2021), “Design decisions and interactions: a sociotechnical network perspective”, Journal of Construction Engineering and Management, Vol. 147 No. 10, 04021110, doi: 10.1061/(asce)co.1943-7862.0002136.

PMI (2013), Organizational Project Management Maturity Model (OPM3) Knowledge Foundation, PMI, Newtown Square, PA.

PMI (2017), Guide to the Project Management Body of Knowledge PMBOK Guide 6th Edition, PMI, Newtown Square, PA.

PMI (2019), The Standard for Earned Value Management, PMI, Newtown Square, PA.

Rasmussen, J. (1997), “Risk management in a dynamic society: a modelling problem”, Safety Science, Vol. 27 Nos 2-3, pp. 183-213, doi: 10.1016/s0925-7535(97)00052-0.

Rezouki, S.E. and Mortadha, S.B. (2020), “The factors affecting on earned value management”, IOP Conf. Ser.: Mater. Sci. Eng., Vol. 901 No. 1, 012023, doi: 10.1088/1757-899x/901/1/012023.

Righi, A.W. and Saurin, T.A. (2015), “Complex socio-technical systems: characterization and management guidelines”, Applied Ergonomics, Vol. 50, pp. 19-30, doi: 10.1016/j.apergo.2015.02.003.

Rode, A.L.G., Svejvig, P. and Martinsuo, M. (2022), "Developing a multidimensional conception of project evaluation to improve projects", Project Management Journal, Vol. 53 No. 4, pp. 416-432, doi: 10.1177/87569728221095473.

Ropohl, G. (1999), “Philosophy of socio-technical systems”, Society for Philosophy and Technology Quarterly Electronic Journal, Vol. 4 No. 3, pp. 186-194, doi: 10.5840/techne19994311.

Royston, J.P. (1983), "Some techniques for assessing multivariate normality based on the Shapiro-Wilk W", Journal of the Royal Statistical Society: Series C (Applied Statistics), Vol. 32 No. 2, pp. 121-133, doi: 10.2307/2347291.

Schieg, M. (2009), "Model for integrated project management", Journal of Business Economics and Management, Vol. 10 No. 2, pp. 149-160, doi: 10.3846/1611-1699.2009.10.149-160.

Sharma, H. and Kirtani, V. (2021), “Project management processes are important, but are stakeholders aligned correctly?”, International Journal of Indian Culture and Business Management, Vol. 24 No. 3, pp. 331-349, doi: 10.1504/ijicbm.2020.10035035.

Söderlund, J. (2004), “Building theories of project management: past research, questions for the future”, International Journal of Project Management, Vol. 22 No. 3, pp. 183-191, doi: 10.1016/s0263-7863(03)00070-x.

Sony, M. and Naik, S. (2020), “Industry 4.0 integration with socio-technical systems theory: a systematic review and proposed theoretical model”, Technology in Society, Vol. 61, 101248, doi: 10.1016/j.techsoc.2020.101248.

Stratton, R.W. (2006), The Earned Value Management Maturity Model, Berrett-Koehler, Oakland.

Tariq, S., Ahmad, N., Ashraf, M.U., Alghamdi, A.M. and Alfakeeh, A.S. (2020), “Measuring the impact of scope changes on project plan using EVM”, IEEE Access, Vol. 8, pp. 154589-154613, doi: 10.1109/access.2020.3018169.

Tavakol, M. and Dennick, R. (2011), “Making sense of Cronbach's alpha”, International Journal of Medical Education, Vol. 2, pp. 53-55, doi: 10.5116/ijme.4dfb.8dfd.

Turner, J.R., Anbari, F. and Bredillet, C. (2013), “Perspectives on research in project management: the nine schools”, Global Business Perspectives, Vol. 1 No. 1, pp. 3-28, doi: 10.1007/s40196-012-0001-4.

Vanhoucke, M. (2012), Project Management with Dynamic Scheduling, Springer, Heidelberg, doi: 10.1007/978-3-642-40438-2.

Waissi, G. (2015), Applied Statistical Modeling, 2nd ed., Tempe, AZ, available at: http://www.public.asu.edu/∼gwaissi/ASM-e-book/regressionbook.html

Wilcox, R. (2009), Basic Statistics: Understanding Conventional Methods and Modern Insights, Oxford University Press, Cary, NC.

Wu, X. and Liang, H. (2015), “Issues and countermeasures of enterprise compliance management in China”, available at: https://aisel.aisnet.org/whiceb2015/46/

Yussef, A., Gibson, G.E. Jr, Asmar, M.E. and Ramsey, D. (2019), “Quantifying FEED maturity and its impact on project performance in large industrial projects”, Journal of Management in Engineering, Vol. 35 No. 5, 04019021, doi: 10.1061/(asce)me.1943-5479.0000702.

Acknowledgements

The authors would like to thank the U.S. Department of Energy for funding this study entitled “Improving the Reliability of EVMS Compliance Reviews and EVMS Maturity Level Assessments.” The authors also would like to thank the whole research team for their invaluable input, and all government and industry participants who provided project data and detailed comments to help co-develop this framework.

Corresponding author

Vartenie Aramali can be contacted at: vartenie.aramali@csun.edu
