By Dan Friedlander
Retired following 44 years in component engineering
In part II, Component Engineer Dan Friedlander continues to examine the use of commercial off-the-shelf (COTS) components in aerospace applications, comparing COTS to traditional military standard (MIL-STD) parts. “The 1994 Perry Memo triggered a policy of transition from MIL to COTS products. Unfortunately, space applications were exempted,” he writes. Read part I by clicking here: http://bit.ly/2lnhMhf.
MIL-SPEC to MIL-PRF
The MIL-SPEC to MIL-PRF conversion, as carried out by the reformers, was hasty due to the short schedule imposed. The overwhelming peak workload of processing thousands of MIL-SPECs in a timely manner may explain why the MIL-PRF wording consists largely of edited MIL-SPEC wording. The resulting wording contains "how to do" elements (e.g., a testing/screening baseline) that do not really belong in a performance specification.
The main changes incorporated in the MIL-PRF (MIL performance specification) and the QML (qualified manufacturers list) concept are:
· Shift of responsibilities to the manufacturers;
· Validity to off shore activities (manufacturing, assembly, testing) by approved commercial entities, meaning use of Best Commercial Practices (BCP);
· Authority given to the manufacturers to perform testing optimization; and
· Shift to product line certification.
Those are meaningful steps toward recognizing the value of COTS. However, military EEE components meeting the MIL-PRF still more or less follow the MIL-SPEC testing/screening requirements baseline.
The baseline did not change. Practically, to meet MIL-PRF requirements one must meet the above testing/screening baseline, except for tests eliminated under the given test optimization option. In reality, the extent of test optimization actually exercised by manufacturers is very limited, and the optimization is referenced to the former relevant MIL-SPEC baseline flows.
Materializing the test optimization option is not so simple. The QML-qualified manufacturer has the authority to decide on and justify optimization of the baseline flows; however, government oversight has to be involved as well.
QML-38535 states: "MIL-PRF-38535 contains provisions for test optimization for all listed processes and products including all device classes. Under these provisions, traditional military screens and inspections can be optimized if found to be non-value added through the manufacturers up front controls and monitors as well as design, wafer fabrication, assembly and test design areas. Optimization of these screens and inspections in no way adversely affects the devices military form, fit and function application, rather the device shall be considered more robust since these screens and inspection are no longer value added."
Note that the baseline is the "traditional military screens and inspections." I concur with the conclusion that, after elimination of non-value-added screens, the components are "more robust." In another article, I emphasized the risk of 100% testing. In addition, it has to be mentioned that value-added Statistical Process Control (SPC) has been introduced from commercial practices.
The testing flows shall be documented in the manufacturer's quality management (QM) plan.
Referring to space applications, MIL-PRF-38535K states: "The space community (e.g., DTRA, NASA, NRO, and AFSMC) and the customer shall be notified of major changes to the manufacturer's quality management (QM) plan. Any optimization proposed by the manufacturer must be presented to the qualifying activity and coordinated with the space community with accompanying supporting data to validate the proposed change. The optimization must be approved by the QA in writing prior to implementation."
Is the coordination with space community and customer practically working in the U.S. and/or the rest of the world? The time cycle for Class V EEE components test optimization approval by QA may vary from two to nine months.
To summarize the above, the MIL-PRF is a welcome, relaxed version of the MIL-SPEC. It infuses the vitality of Best Commercial Practices (such as Statistical Process Control) into the outdated MIL system, while keeping the traditional testing/screening baseline. The movement is a step in the right direction.
Without going into a detailed analysis, the following are samples reflecting the issues raised above. It has to be mentioned that MIL-PRF-38535 is a complex specification covering complex components; it is not easy to navigate.
Quote: “Never overlook the power of simplicity.” - Robin S. Sharma.
It seems that too much is squeezed into one document.
Quote: "Simplicity is the ultimate sophistication." - Leonardo da Vinci.
MIL-PRF-38535 replaces MIL-M-38510. Both have a similar structure and wording.
MIL-PRF-38535 states: "The basic section of this specification has been structured as a performance specification, which is supplemented with detailed appendices. These appendices provide guidance to manufacturers on demonstrated successful approaches to meeting military performance needs. These appendices are included as a benchmark and are intended to impose performance requirements."
Author comment: The performance specification is a mixture of performance requirements (the main part, defining what the component's capability should be) and old MIL-SPEC "how to do" requirements (the supplemental appendices). Practically, the two parts are interrelated, implying adherence to the test flows (taken from MIL-M-38510), disguised as a benchmark. De facto, the MIL-SPEC test baseline continues to be the one to follow, less the test optimization option (referenced to the MIL-SPEC baseline) adopted by a few QML manufacturers.
The specification contains a variety of class levels and classes.
"A.220.127.116.11 Class level B. Class level B requirements contained in this document are intended for use for class Q and class M products, as well as class B M38510 JAN slash sheet product. Class level B requirements are also intended for use for product claimed as 883 compliant or 1.2.1 compliant for high reliability military applications."
"A.18.104.22.168 Class level S. Class level S requirements contained in this document are intended for use for class V or class Y and slash sheet M38510 JAN product. Class level S requirements are also intended for use for product claimed as 883 compliant or 1.2.1 compliant for space level applications."
The following table summarizes the complexity a user has to deal with, while searching for the right solution.
To further complicate the understanding of this performance specification, it is multipurpose: it addresses QML-qualified manufacturers, candidates for QML, and those compliant to MIL-STD-883.
Case by case, the appendices (integrated into the specification) are also multipurpose: performance baseline (benchmark), mandatory baseline, or transitional mandatory baseline.
Although the specification does not deal with COTS, it is worth paying attention to paragraph 6.5, which recognizes the value of SPC vs. testing. As noted, the reliability of a component is built in during the manufacturing phase, under a strict regime of SPC, and not by the testing regime.
An extract from paragraph 6.5 of the spec reads: "Manufacturers listed on the QML will be able to produce microcircuits without the need for extensive end-of-manufacturing qualification testing and QCIs on each device design. The reduction of the end-of-manufacturing testing will be replaced with in-line monitoring and testing and SPC."
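The in-line monitoring the spec refers to is classic SPC: a monitored process parameter is tracked against control limits derived from an in-control baseline, and excursions are flagged before they become field failures. The following is a minimal, hypothetical sketch (an individuals chart with 3-sigma limits; the parameter name and all data values are illustrative assumptions, not taken from any spec):

```python
# Minimal SPC sketch: an individuals control chart with 3-sigma limits.
# All data values below are illustrative, not from MIL-PRF-38535.

def control_limits(baseline):
    """Return (center, lower, upper) 3-sigma limits from in-control data."""
    n = len(baseline)
    mean = sum(baseline) / n
    # population standard deviation of the in-control baseline
    sigma = (sum((x - mean) ** 2 for x in baseline) / n) ** 0.5
    return mean, mean - 3 * sigma, mean + 3 * sigma

def out_of_control(samples, lower, upper):
    """Indices of production samples falling outside the control limits."""
    return [i for i, x in enumerate(samples) if x < lower or x > upper]

if __name__ == "__main__":
    # e.g., a monitored wafer-level parameter in arbitrary units
    baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9]
    center, lcl, ucl = control_limits(baseline)
    production = [10.0, 10.1, 12.5, 9.9]  # 12.5 lies well above the UCL
    print(out_of_control(production, lcl, ucl))  # -> [2]
```

The point of the example is the shift of emphasis the spec describes: instead of screening every finished device, the line itself is monitored, and only statistically anomalous material triggers action.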
The MIL/COTS debate is ongoing. There are still debates about the intention of the reform, namely whether MIL-PRF components are COTS. The 1994 Perry reform was the right decision in the right direction. Further activity is needed to clarify ambiguities and simplify the documents.
In view of the value-added Best Commercial Practices (BCP) infusion into military EEE components, the time has come to revisit the testing/screening baseline. The new approach of giving test optimization authority to EEE component manufacturers is the right decision, but of limited benefit. Justified test optimization should be made valid per specification, not per manufacturer. Today, in the era of military components manufactured in commercial foundries under similar regimes, it does not seem logical to tie test optimization to a specific manufacturer or process.
Test optimization has to be done across the board, keeping the value of standardization. As QML-38535 states, after elimination of non-value-added screens the components are "more robust." The MIL/COTS debate should also take into account the security of availability of the relevant EEE components, especially the MIL ones.
The author, Dan Friedlander, graduated from the Engineering School of Tel Aviv University with a degree in physics (1965-1969). He has 44 years of experience in component engineering at MBT/Israel Aerospace Industries (1969 to 2013) as Head of Components Engineering. As such, he was responsible for all aspects of EEE components – including policymaking, standardization at corporate level, approval, etc. – for military and space applications. Now retired, Friedlander has worked as an industry consultant (2013 to present). For further details on his experience, visit https://www.linkedin.com/in/dan-friedlander-63620092?trk=nav_responsive_tab_profile