!! title: Engineering Portfolio Management - Part 2
!! slug: engineering-portfolio-management-part2
!! published: 2024-11-03
!! description: Part two of the Engineering Portfolio Management series: a class summary and the lessons learned applied to the software industry

---

Technology in the software industry is being developed at ever-increasing speed. Entire companies
run through their full life-cycle within a few years. With the world moving faster and faster, it is
challenging to pause and take a step back to think about what technology is or what its role is in
society.

_Technology_ is defined as "the practical application of research and science to develop new
solutions that could subsequently be taken into commercial application through a product or service"
(Flinn, p. 5). While I work in the "tech industry", I often forget that the tech industry extends
far beyond my little corner in software. The "tech industry" is the entire industry where research
and science are being applied to develop new solutions. Generative AI is definitely in the tech
industry, and so are mRNA vaccines, CO2-based heat pumps, and solid-state batteries.

Product development is the process in which technologies are turned into a commercial product or
service. How do we know when a technology is sufficiently ready to be turned into a product? Some
might say as soon as someone is willing to pay for it, but this leaves out a whole slew of testing
and safety considerations. It also leaves out one of the more important business questions: "How do
we know which technologies are the most promising (i.e., the best return for the organization)?"

Fortunately for us, one of the great innovators of the 20th century has built a guide for
determining the readiness of technology. These readiness levels help measure a technology against
itself in order to monitor its progress in development. They are also used to compare it against
competing technologies to determine and prioritize the projects that have the best chance of
success. NASA developed Technology Readiness Levels (TRLs) to capture this data (_Technology
Readiness Levels - NASA_).

| Level | Description |
| ----- | ----------- |
| TRL 1 | Basic principles observed and reported |
| TRL 2 | Technology concept and/or application formulated |
| TRL 3 | Analytical and experimental critical function and/or characteristic proof-of-concept |
| TRL 4 | Component and/or breadboard validation in laboratory environment |
| TRL 5 | Component and/or breadboard validation in relevant environment |
| TRL 6 | System/sub-system model or prototype demonstrated in relevant environment (ground or space) |
| TRL 7 | System prototype demonstrated in space environment |
| TRL 8 | Actual system completed and "flight qualified" through test and demonstration (ground or space) |
| TRL 9 | Actual system "flight proven" through successful mission operations |

While specific to aerospace flight vehicles, the differences between levels help extrapolate TRLs
to other industries. A light-weight gate review system can then be built around these levels to
ensure that all criteria have been met to achieve each level before investing too heavily in the
next ones. [Part 4]() will go into more detail on such a system.

It is helpful to expand each level's description along multiple categories. This makes sure that
the same definitions are used when assigning an overall level at a gate review. These definitions
could be unique to every organization, depending on what products or services they provide. Staying
in the physical engineering realm, such an expansion could look like the following (Weber, 2024):

| Level | Product Definition | Form of Physical Realization | Technical Requirements | Testing & Validation | Regulatory Approval | Phase |
| ----- | ------------------ | ---------------------------- | ---------------------- | -------------------- | ------------------- | ----- |
| TRL 1 | Idea | Hand-made/mock-up | | Calculations | | Research |
| TRL 2 | Sketches & narrative description | | Scientific principles in lab or published literature | | | Research |
| TRL 3 | | Operational prototype demonstrated | Key features demoed by test/analysis | Computer models; lab test | Regulations identified; ID need for new standards | Research |
| TRL 4 | Schematics, models; limited change control | | Test/validation plan in place | | | Development |
| TRL 5 | Design rules & operation environment defined | Production unit validation & verification plan defined | | Design for Manufacturing; Failure Mode & Effects Analysis | | Development |
| TRL 6 | Form change control in place | | Inspection, service, repair methods demoed | Accelerated/extreme testing | Approach to regulatory approval defined | Development |
| TRL 7 | Detailed CAD models, specs, bill of materials | Fully representative but not made off tools | | Real environment testing | | Deployment |
| TRL 8 | Full production information | Made off production tools | Design values validated | Real environment testing with end users | | Deployment |
| TRL 9 | | Made off tools in final production process | Final product/process fully qualified | | Full regulatory approval; Qualified specs & standards | Deployment |

At any one time, a technology may be at different levels across these categories. The overall TRL
of the project is the minimum achieved across all categories. For example, if we were reviewing a
new technology that had a TRL 7 in _Product Definition_ but a TRL 2 in _Testing & Validation_ (and
all other categories were greater than 2), the overall TRL of the technology would be TRL 2.

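To make that rule concrete, here is a minimal sketch in Python. The category names match the table
above, but the levels are hypothetical examples, not taken from any real assessment:

```python
def overall_trl(category_levels: dict[str, int]) -> int:
    """Return the overall TRL: the lowest level achieved in any category."""
    return min(category_levels.values())


# Hypothetical assessment using the categories from the table above.
assessment = {
    "Product Definition": 7,
    "Form of Physical Realization": 5,
    "Technical Requirements": 4,
    "Testing & Validation": 2,
    "Regulatory Approval": 3,
}

print(overall_trl(assessment))  # -> 2; Testing & Validation gates the whole project
```
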
These level definitions have a heavy skew towards physical engineering. But luckily for us, NASA
also develops software and has adapted its original TRLs to software engineering (_Software
Technology Readiness Levels - NASA_):

| Level | Software Description |
| ----- | -------------------- |
| TRL 1 | Scientific knowledge generated underpinning basic properties of software architecture and mathematical formulation. |
| TRL 2 | Practical application is identified but is speculative, no experimental proof or detailed analysis is available to support the conjecture. Basic properties of algorithms, representations and concepts defined. Basic principles coded. Experiments performed with synthetic data. |
| TRL 3 | Development of limited functionality to validate critical properties and predictions using non-integrated software components. |
| TRL 4 | Key, functionally critical, software components are integrated, and functionally validated, to establish interoperability and begin architecture development. Relevant Environments defined and performance in this environment predicted. |
| TRL 5 | End-to-end software elements implemented and interfaced with existing systems/simulations conforming to target environment. End-to-end software system, tested in relevant environment, meeting predicted performance. Operational environment performance predicted. Prototype implementations developed. |
| TRL 6 | Prototype implementations of the software demonstrated on full-scale realistic problems. Partially integrate with existing hardware/software systems. Limited documentation available. Engineering feasibility fully demonstrated. |
| TRL 7 | Prototype software exists having all key functionality available for demonstration and test. Well integrated with operational hardware/software systems demonstrating operational feasibility. Most software bugs removed. Limited documentation available. |
| TRL 8 | All software has been thoroughly debugged and fully integrated with all operational hardware and software systems. All user documentation, training documentation, and maintenance documentation completed. All functionality successfully demonstrated in simulated operational scenarios. Verification and Validation (V&V) completed. |
| TRL 9 | All software has been thoroughly debugged and fully integrated with all operational hardware/software systems. All documentation has been completed. Sustaining software engineering support is in place. System has been successfully operated in the operational environment. |

All software technology _projects_ need to meet all TRLs (anything at the task level should be
governed by other processes that reinforce these levels). If a company is building custom software
in-house as a product, some projects may need to start at TRL 1 and work up through TRL 9. However,
not all levels need to be completed by the same organization. For instance, a company doesn't need
to start from TRL 1 when purchasing and rolling out payroll software from a vendor. The architecture
and algorithms have already been figured out. However, the TRL of the project for the purchasing
company may be lower than the vendor's internal TRL for the product. The purchasing company still
needs to integrate it into their business systems and validate that the product will work for them.
Once the payroll software is fully operational across the workforce, it may then be considered TRL 9
for the purchasing company.

As mentioned in [Part 1](./posts/0064-engineering-portfolio-management-part1), my career has been in
the Software-as-a-Service industry and in Cloud Operations. Both of these areas are relatively new
compared to industries like food, aerospace, or steel. It seems we are in our adolescence, where we
think we know better than the engineering industries that have come before. But we are still
learning what works and what does not.

While I advocate for DevOps methodologies and for building the capability to fail fast and fail
forward, some of the approaches adopted seem to have cut TRL corners. "Fail fast" has become a motto
and badge of honor in the software industry, but the original idea seems to have lost its efficacy.
It has become an excuse not to do the upfront legwork to validate ideas before building them.

The intangibility of software makes it look like writing software is cheap. It might take five
minutes to write one set of three lines of code and five days to write another three. The end
product in both cases is three lines of code, but the impact may be completely different: an HTML
button with a new hover state versus a statistical monitoring algorithm that determines the failure
state of a dam generator.

Dr. Ali published a book this year correlating the use of Agile project management with an increased
risk of project failure (Ali, p. 311). His main thesis was that too often teams jump right into
building something rather than taking the time to define the problem and plan. While I am waiting
for follow-up studies on Agile and project failure rates, I have also observed (and participated in)
the behavior of jumping to code instead of designing and planning requirements and specifications.

We can see this behavior in building new software features without the voice of the customer.
Features are built, rolled out, and then A/B tested. If users like a new feature, "Great! Let's keep
it." If they don't, it may get removed (or sometimes it is left in the product because another
feature is a higher priority than removing the unwanted one). A/B testing is useful for low-cost
changes (button color changes, etc.), but significant changes need a lot more planning and
validation.

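For context on what a low-cost change looks like in practice, bucketing users into an A/B test can
be as simple as hashing a user ID. This is a hypothetical sketch, not the API of any particular
experimentation platform:

```python
import hashlib


def assign_variant(user_id: str, experiment: str, variants: list[str]) -> str:
    """Deterministically bucket a user into one of the experiment's variants."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]


# Hypothetical experiment: a button hover-state change with two variants.
print(assign_variant("user-42", "button-hover-state", ["control", "new-hover"]))
```

The mechanics are cheap for a change like this; as argued above, significant changes deserve far
more planning and validation than a coin-flip in production.
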
Failing fast needs to get back to failing early in the TRLs rather than at TRL 9, after the
technology has been fully implemented and deployed to production. While learning after the fact is
valuable, it is better to learn prior to the significant financial investment of TRLs 7-9. The gate
review system discussed in [Part 4]() will provide a process to fail fast in a reliable way.

---

## Resources

1. Flinn, Peter. _Managing Technology and Product Development Programmes: A Framework for Success_. Wiley, 2019.
2. _Technology Readiness Levels - NASA_. 27 Sept. 2023, https://www.nasa.gov/directorates/somd/space-communications-navigation-program/technology-readiness-levels/.
3. Weber, Gary. TRL/MRL Categories. Gonzaga University, ENGM 510, 2024.
4. _Software Technology Readiness Levels - NASA_. https://www.nasa.gov/wp-content/uploads/2017/12/458490main_trl_definitions.pdf.
5. Ali, Junade. _Impact Engineering (Complete Omnibus)_. First edition, Engprax Ltd, 2024.