Regulatory Open Forum

  • 1.  Interim design outputs

    Posted 08-Nov-2019 12:15
    Hi everyone,
    This is another design control question. 
    We are building a device with many subsystems and various modules within each subsystem, so there are many of what I call interim design outputs (for example, a populated PCB) that are not the final design output (the finished medical device) but are what I call "sub-systems" or "modules". In performing design verification (to verify that the outputs meet our input requirements), what is the sampling rationale? We take what we receive from a supplier and check whether the component meets specifications. The PCB would be an interim design output; is there a particular sample size we should aim for when planning design verification of an interim design output? In this case, how many PCBs would be tested? I would appreciate any insight on this. Thanks.


    ------------------------------
    Karen Zhou
    ------------------------------


  • 2.  RE: Interim design outputs

    Posted 08-Nov-2019 13:33

    The correct sample size rationale is that there is no sample size rationale, because this design verification does not require sampling.

    First identify the design output. It is not the parts you receive from a supplier; it is the purchasing data you send to the supplier under 820.50(b). As a design output, the specification will eventually come under document control and move to the purchasing function.

    The design verification step ensures the purchasing data matches the design input. The design verification method is comparing two documents (input and output) for correctness and completeness.

    Don't confuse the sampling plans you might use at receiving acceptance with design verification.

    There is a similar situation with the PCB. You could use circuit review and calculations based on the schematic for design verification, but that is hard. You could build one and test it. You don't need to build two, since they would be the same. The design verification would be a test that applies the inputs and verifies that the circuit design produces the correct outputs.
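
    As a rough sketch only (the signal names, limits, and readings below are invented for illustration, not from any particular device), such a test might be organized like this:

        # Sketch: apply the defined inputs to the built board, measure the outputs,
        # and compare each measurement against the limits that trace to the design input.
        # All signal names, limits, and readings are hypothetical placeholders.
        SPEC_LIMITS = {  # design input requirement: signal -> (min V, max V)
            "regulated_5v_rail": (4.75, 5.25),
            "sensor_amp_output": (1.80, 2.20),
        }

        READINGS = {  # stand-in for real bench measurements of the built board
            "regulated_5v_rail": 5.02,
            "sensor_amp_output": 1.95,
        }

        def verify_outputs() -> bool:
            """Check every specified output against its design input limits."""
            all_pass = True
            for signal, (low, high) in SPEC_LIMITS.items():
                value = READINGS[signal]
                ok = low <= value <= high
                print(f"{signal}: {value:.2f} V in [{low}, {high}] -> {'PASS' if ok else 'FAIL'}")
                all_pass = all_pass and ok
            return all_pass

        verify_outputs()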

    People often confuse design verification with manufacturability. They are not the same.

    My recommendation is to start by determining the design output. It is almost always a drawing for which you would verify only one copy.

    In other cases, you might do design verification using a type test, which typically will be one completed unit. The test demonstrates the unit meets the specification. (Occasionally a specification for a type test will require more than one unit.)



    ------------------------------
    Dan O'Leary CQA, CQE
    Swanzey NH
    United States
    ------------------------------



  • 3.  RE: Interim design outputs

    Posted 08-Nov-2019 15:09
    Building off of Dan's answer above, I would also recommend not confusing design verification with supplier validation.

    As Dan mentioned, verification is confirming that the design functions the way it is intended. If the intent is to ensure that the components you receive from a supplier are made to the specification, then ensuring their process is validated to consistently produce the same product is of higher importance. This may be done through a heightened First Article Inspection or by working with the supplier on a process validation protocol/report. This would fall under management of suppliers and their processes. It does not seem feasible to perform 100% inspection, so a validation would be recommended.

    ------------------------------
    Michael Gerhard
    Quality Manager
    Putnam CT
    United States
    ------------------------------



  • 4.  RE: Interim design outputs

    Posted 11-Nov-2019 09:52
    While I generally agree with Dan on this, I think the reality is a bit more complicated - and I don't want people to walk away with "all you ever need to test is n=1," because while that is true in some cases, in others you may need to do more to account for variability and tolerances on your parts.

    Yes - the part drawing or specification is usually the "design output" that you must verify. That would be the case with the hypothetical PCB. In general, you must verify that said "design output" meets the design inputs, which for this component may include basic function, safety features/risk mitigators, etc. For really simple parts, you might be able to verify completely by modeling in LabView or one of the electrical system modelers. However, for a sufficiently complex system, this can be tricky, and often a more functional verification is needed. Also, many times everyone is just more comfortable "knowing" that things like safety features work in the real world.

    For a PCB and other electrical parts, very often you can do this verification testing on n=1. There is simply very little variability from part to part due to the binary nature of much of the circuitry. However, for more analog parts, there are often important "ranges" to the physical outputs. It is possible you can model these with a sufficiently good software tool. More likely, though, what you'll be able to do is model the potential variability that can be expected. You can then use that information to calculate an appropriate sample size for the verification testing.
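
    To make the sample size step concrete with one common (but by no means the only) approach: for attribute (pass/fail) verification data, a zero-failure "success-run" calculation ties the sample size to a stated confidence and reliability; for variables data where you have modeled the variability, a tolerance interval (k-factor) method would usually be used instead. The targets below are illustrative only, not a recommendation.

        import math

        def success_run_sample_size(confidence: float, reliability: float) -> int:
            """Zero-failure attribute sample size: n = ln(1 - C) / ln(R), rounded up."""
            return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

        # Illustrative targets only:
        print(success_run_sample_size(0.95, 0.95))  # 59 units, all must pass
        print(success_run_sample_size(0.95, 0.90))  # 29 units, all must pass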

    Also, remember that "verification" does NOT equal testing. The usual four verification approaches are inspection, demonstration, test, and analysis. Note that "test" is only one of them, and it only applies in situations where the others are not better approaches. I think, as an industry, we have erred too much toward "testing" at times (which I think was one of Dan's points).

    As Dan also said, this does not mean you won't need to do other types of testing, perhaps to assess how effective your manufacturing process is or how manufacturable it is.

    As much as companies (and most regulators) would like to make this all "cookie cutter," the fact is that engineering is complicated, our devices are complex and unique, and each situation should be assessed in light of the risks, the design, and a firm understanding of the basic principles at play - not from a procedure that says "test 10 in this situation."

    g-, on yet another soapbox

    ------------------------------
    Ginger Glaser RAC
    Chief Technology Officer
    MN
    ------------------------------