Title: Supplier Quality System Assessment Author: Byron Murray Date: 04/10/2014
NEON Form #: NEON.DOC.004242 Revision: C
General Supplier Information
Supplier Name:
Address of Facility Audited:
Type of Audit (check one)
Telephone:
Fax:
Audit Date:
Name(s) of person(s) completing this audit:
Name:
Company:
Title:
Phone:
Name:
Company:
Title:
Phone:
Supplier Senior Company Official’s Name:
Title:
Supplier Senior Quality Official’s Name:
Title:
Supplier Products:
Number of years in business:
Approximate Annual Sales:
Ownership (check one): Partnership / Private / Public
Stock symbol if applicable:
Number of buildings:
Total square footage:
Total MFG square footage:
Total Employees:
Production:
Engineering:
Quality:
Office:
ISO Certified: Yes / No
Level:
ISO Registrar:
Date of last audit:
List top five customers:
Where is/would NEON be on that list (i.e., percent of business):
COMMENTS:
Supplier Overall Audit Rating
% (from the Audit Scorecard calculations)
Audit Rating Recommendations:
95% = Outstanding quality system and performance.
80% = Meets requirements with above average to superior quality system and performance.
65% = Requires improvements in meeting quality system standards and performance.
< 65% = Requires significant improvement in meeting quality system standards and performance.
45% = No quality system in place.
Instructions for completing Sections 1 through 8:
Using the following rating system, answer each question by writing or typing in the number that best describes your response. For supplier self-audits, suppliers do not complete the shaded (boxed) questions, although the supplier should be prepared to answer all of them during an actual NEON on-site audit. The shaded questions are additional items for NEON personnel use only during on-site audits. NEON auditors will complete all questions as applicable.
3 = Procedure or system is thoroughly documented and consistently adhered to.
2 = Procedure or system exists, though it may be inadequate and/or is not consistently followed.
1 = Procedure or system exists but is rarely followed and/or is not accurate.
0 = No procedure or system exists at this time.
N/A = Not applicable.
Where indicated with "∗∗", please attach a one-page example from your supporting document.
Space is provided after each section for any comments. Please provide any details not described by documents.
Additional example documentation may be attached.
1.0 Contract Review & Document Control
Rating System | Score
1.1 Is there a Quality Manual available that describes quality-related procedures and
policies? (Attach an uncontrolled copy)
0 1 2 3 N/A
1.2 Is there a procedure to ensure that revision levels are verified for each
manufacturing/sales order against the customer purchase order beginning from order
entry? ∗∗
0 1 2 3 N/A
1.3 Is there a procedure or policy requiring customer notification and approval of material,
process or supplier/manufacturing site location changes?
0 1 2 3 N/A
1.4 Is there a procedure for maintaining and distributing drawings, drawing revisions, and
specifications? ∗∗
0 1 2 3 N/A
1.5 Is there a master listing identifying current procedures or work instructions and their
latest revisions? ∗∗
0 1 2 3 N/A
1.6 Is there a procedure for the removal of obsolete documents that includes part master
files?
0 1 2 3 N/A
1.A Is there evidence that correct document revisions are available at all locations where
relevant activities take place?
0 1 2 3 N/A
1.B Is there evidence that controlled documents are approved by authorized personnel
prior to use?
0 1 2 3 N/A
1.C Do document changes/revisions include or reference the nature and date of the
changes made?
0 1 2 3 N/A
1.D Is there an up-to-date organization chart? (Attach a copy) 0 1 2 3 N/A
Section 1 Comments:
2.0 Control of Inspection, Measuring & Test Equipment
Rating System | Score
2.1 Is there a procedure that describes calibration intervals and maintenance requirements
for all measurement equipment that is used to measure part or product conformance?
0 1 2 3 N/A
2.2 Is all measurement equipment clearly labeled with the date of last calibration and the date
recalibration is due?
0 1 2 3 N/A
2.3 Is all measurement equipment that is not used to measure part or product conformance (this
may include employee-owned tools and reference tools) identified with a
"NO CALIBRATION REQUIRED" label or words to that effect?
0 1 2 3 N/A
2.4 Are calibration records maintained for all measurement equipment? Who performs the
calibrations? ∗∗
0 1 2 3 N/A
2.5 Are all calibrations performed using equipment and shop masters traceable to the
National Institute of Standards and Technology (NIST) or other suitable standards?
0 1 2 3 N/A
2.6 If equipment is found to be out of tolerance during calibration, are there procedures or
policies to evaluate the impact it may have had on manufactured material?
0 1 2 3 N/A
2.A Is there evidence that all inspections are being performed using calibrated
measurement equipment with sufficient precision and accuracy?
0 1 2 3 N/A
2.B If calibrations are being performed by supplier personnel, are there written instructions
for each piece of measurement and test equipment being serviced?
0 1 2 3 N/A
2.C Is there evidence that test equipment that is overdue for calibration is being removed
from service until calibration has been performed?
0 1 2 3 N/A
2.D Are there adequate environmental, handling, preservation and storage conditions for
measuring, inspection and test equipment?
0 1 2 3 N/A
Section 2 Comments:
3.0 Incoming Inspection
Rating System | Score
3.1 Are incoming materials inspected to all requirements of a purchase order, general
specifications, and/or applicable drawings?
0 1 2 3 N/A
3.2 Are there inspection procedures for incoming materials? ∗∗
0 1 2 3 N/A
3.3 Are statistically valid sampling plans with AQLs based upon customer requirements
utilized? ∗∗
0 1 2 3 N/A
3.4 Is there a procedure for the disposition of discrepant incoming materials? ∗∗
0 1 2 3 N/A
3.5 Are there procedures and practices to ensure that incoming materials as well as
rejected materials are kept segregated and secured from accepted material?
0 1 2 3 N/A
3.6 Is there a procedure that describes how long inspection records are retained? 0 1 2 3 N/A
3.A Is there evidence that test reports, certificates of conformance or chemical and
physical certifications of materials or parts are received with the individual material lots
and are retained?
0 1 2 3 N/A
3.B Is there evidence (records) that the material has passed inspection and tests as
defined by the acceptance criteria?
0 1 2 3 N/A
3.C Is there evidence that first article inspections are performed on new parts/materials or
when materials, processes or suppliers are changed?
0 1 2 3 N/A
3.D Is there evidence that corrective action requests are being routinely issued to suppliers
due to rejected material or other supplier performance issues?
0 1 2 3 N/A
Section 3 Comments:
4.0 Manufacturing & Process Control
Rating System | Score
4.1 Are there written procedures for all manufacturing processes and do the procedures
indicate workmanship criteria, special handling or process conditions, and the specific
equipment to be used? ∗∗
0 1 2 3 N/A
4.2 Is a lot traveler (or router) utilized and does it clearly define all processing and
inspection steps for each product lot as it progresses through manufacturing and test?
Do the records indicate the completed manufacturing processes with the quantities,
names and dates of those who performed each identified step? ∗∗
0 1 2 3 N/A
4.3 Are all software changes validated before approval and issuance, and are there
effective controls to ensure that only the most current version can be used? How is this
documented? ∗∗
0 1 2 3 N/A
4.4 Are important part and process characteristics clearly defined for each part/product,
and are they effectively monitored during production to ensure that specified
requirements are met? Are any statistical process control methods utilized that would
contribute to final product acceptance? ∗∗
0 1 2 3 N/A
4.5 Is there a preventive maintenance schedule established for all production equipment
and tooling, and is it suitable to ensure continuing process capability? Does it include a
system for monitoring tool life and the number of parts produced from a tool before
maintenance and/or replacement?
0 1 2 3 N/A
4.6 Are there procedures and practices to prevent contamination or degradation of parts
from ESD, dust, oil, hazardous substances or other environmental contaminants?
0 1 2 3 N/A
4.A Are there procedures and policies for housekeeping, and is the work area free of safety
conditions that could jeopardize the work environment?
0 1 2 3 N/A
4.B Is there evidence that all Material Certifications, Process Certifications and
Certifications of Compliance are traceable to a manufacturing lot number?
0 1 2 3 N/A
4.C Do the manufacturing equipment and technology being utilized appear adequate
and suitable for the intended purposes?
0 1 2 3 N/A
4.D Is there any evidence that process capability studies, design of experiments, etc., are
being performed and communicated to the customer?
0 1 2 3 N/A
Section 4 Comments:
5.0 In-Process & Final Inspection
Rating System | Score
5.1 Is in-process and/or final inspection being performed on each lot to ensure compliance
with all requirements of the customer purchase order, general specifications, and/or
applicable drawings?
0 1 2 3 N/A
5.2 Where inspection and testing is being performed, are there written procedures, and are
statistically valid AQL-based sampling plans being utilized? ∗∗
0 1 2 3 N/A
5.3 Is there a procedure and policy to ensure that a first article inspection is performed for
all applicable dimensions when a part revision, material, or manufacturing process has
changed?
0 1 2 3 N/A
5.4 Is inspection and test data maintained on file and traceable to each lot? If yes, for how
long?
0 1 2 3 N/A
5.5 Can inspection and test data collected for key specified parameters be summarized to
indicate statistical control/consistency for each lot shipped to the customer?
0 1 2 3 N/A
5.6 Is there a procedure for the identification, segregation and disposition of discrepant
parts and assemblies? ∗∗
0 1 2 3 N/A
5.A Is there evidence of adequate identification, including inspection and test status
throughout the process?
0 1 2 3 N/A
5.B Do retained quality records demonstrate conformance to product requirements? 0 1 2 3 N/A
5.C Is there evidence that customer authorizations are being requested and approved prior
to shipment of any parts or assemblies that may not fully meet all specifications?
0 1 2 3 N/A
5.D Is there evidence that repaired or reworked parts are being fully re-inspected to all
applicable requirements?
0 1 2 3 N/A
Section 5 Comments:
6.0 Packaging, Storage & Shipping
Rating System | Score
6.1 Is there a procedure that describes proper handling, packaging, storage, preservation,
and shipping methods?
0 1 2 3 N/A
6.2 Are raw materials/parts stored and used on a first-in, first-out (FIFO) basis? 0 1 2 3 N/A
6.3 Are all ESD sensitive materials stored in shielded anti-static containers? 0 1 2 3 N/A
6.4 Is there a procedure that describes the handling and expiration date coding of limited
life materials such as adhesives (Loctite), etc.?
0 1 2 3 N/A
6.5 Are finished goods effectively segregated with a manufacturing lot number or a date
coding system that includes the part number and revision level?
0 1 2 3 N/A
6.6 Are labeling, certifications, and packaged finished product inspected and verified to
ensure compliance with customer requirements prior to shipment? ∗∗
0 1 2 3 N/A
6.A Is there evidence that excess issued material returned from manufacturing to raw
materials storage is being properly identified and part marked?
0 1 2 3 N/A
6.B Is there evidence that material/parts are handled, stored and packaged in a way to
prevent damage?
0 1 2 3 N/A
6.C Is there evidence that customer specific packaging requirements are available to
shipping personnel for each part or assembly?
0 1 2 3 N/A
6.D If NEON provides kits of material or parts to the supplier, is there evidence of
appropriate storage, segregation and preservation of parts while under the supplier’s
control?
0 1 2 3 N/A
Section 6 Comments:
7.0 Corrective and Preventive Action
Rating System | Score
7.1 Is there a procedure for implementing corrective and preventive actions? ∗∗
0 1 2 3 N/A
7.2 Is there a follow-up system to identify, evaluate the effectiveness of, and close corrective
actions?
0 1 2 3 N/A
7.3 Is there a log, database or other system used for trending and/or history of corrective
actions? ∗∗
0 1 2 3 N/A
7.4 Is there a procedure for the receipt and evaluation of customer complaints? 0 1 2 3 N/A
7.5 Does the above procedure include the issuance of return material authorizations
(RMAs) and corrective action requests (CARs) to the customer?
0 1 2 3 N/A
7.6 Is quality cost data (scrap, rework, customer returns, etc.) collected, analyzed and
shared throughout the organization to drive process improvement and part variability
reduction activities? ∗∗
0 1 2 3 N/A
7.A Is there evidence that the corrective action system is up-to-date with issues closed in a
uniform and timely manner?
0 1 2 3 N/A
7.B Is there evidence that implemented changes resulting from corrective action have been
appropriately documented and are being effectively disseminated to those who are
directly responsible for assuring product quality and the prevention of future problems?
0 1 2 3 N/A
7.C Is there evidence of timely customer RMA and CAR issuances? 0 1 2 3 N/A
7.D Is there evidence that there is a regular and adequate management review of
corrective and preventive actions?
0 1 2 3 N/A
Section 7 Comments:
8.0 Training
Rating System | Score
8.1 Is there a procedure that defines the responsibilities and training requirements for each
position? ∗∗
0 1 2 3 N/A
8.2 Have training and development plans been implemented for all employees who have
an impact on quality?
0 1 2 3 N/A
8.3 Are processes operated/performed by qualified employees? 0 1 2 3 N/A
8.4 Are training records maintained for each employee that include the training
course/subject, completion date, and trainer name? ∗∗
0 1 2 3 N/A
8.5 Are individual training records easily accessible to each employee? 0 1 2 3 N/A
8.6 Is training provided when changes occur that would affect the form, fit, or function within
the area?
0 1 2 3 N/A
8.A Do employees appear to be adequately trained for their assigned tasks? 0 1 2 3 N/A
8.B Is there evidence that the people being trained are made aware of the defects that may
occur from improper performance of their job function?
0 1 2 3 N/A
8.C If in-house trainers are utilized, is there evidence that the trainers have adequate
experience and expertise to perform the assigned training?
0 1 2 3 N/A
8.D Is there evidence that the quality policy and mission statement are known and
understood by all employees?
0 1 2 3 N/A
Section 8 Comments:
Supplier Quality System Audit Summary
A. Summary of items that we liked
B. Summary of items that require immediate attention (corrective action)
C. Summary of items that are of concern
Supplier Quality System Audit Scorecard
Instructions for use:
1. Transfer the rating scores from the individual questions within each section to the corresponding areas within the table
below.
2. Calculate the Total Section Score by adding together the ratings of each applicable question score per section.
3. Calculate the Maximum Possible Score by multiplying the number of applicable questions rated per section by 3.
4. Calculate the Average Section Score by dividing the Total Section Score by the Maximum Possible Score.
5. Calculate the Supplier Overall Audit Rating by dividing the sum of the Total Section Scores by the sum of the Maximum
Possible Scores.
6. Record the Supplier Overall Audit Rating in the assigned area on the first page of the audit form.
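As a reading aid, the arithmetic in steps 2 through 5 can be expressed in a few lines of Python. This is only an illustrative sketch and is not part of the form: the function names and the sample scores below are hypothetical, and N/A answers are assumed to be excluded from both the Total Section Score and the Max Possible Score, as step 3 implies.

```python
# Illustrative sketch (not part of the form) of the scorecard arithmetic.
# A question score is 0-3; None represents an N/A answer and is excluded
# from both the Total Section Score and the Max Possible Score.

def section_scores(ratings):
    """Return (total, max_possible, average) for one section."""
    answered = [r for r in ratings if r is not None]         # drop N/A questions
    total = sum(answered)                                     # Total Section Score
    max_possible = 3 * len(answered)                          # Max Possible Score
    average = total / max_possible if max_possible else 0.0   # Average Section Score
    return total, max_possible, average

def overall_rating(all_sections):
    """Supplier Overall Audit Rating: sum of section totals / sum of maximums."""
    results = [section_scores(section) for section in all_sections]
    grand_total = sum(total for total, _, _ in results)
    grand_max = sum(max_possible for _, max_possible, _ in results)
    return grand_total / grand_max if grand_max else 0.0

# Hypothetical self-audit example: questions 1.1-1.6 scored, shaded
# questions 1.A-1.D left as N/A.
section_1 = [3, 2, 3, 3, 2, 3, None, None, None, None]
print(section_scores(section_1))              # (16, 18, 0.888...)
print(f"{overall_rating([section_1]):.0%}")   # 89%
```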
Note: Deficiencies in any critical quality system area may warrant corrective action and a change to the supplier’s approval
status. Flexibility may be granted to a supplier at the discretion of the NEON audit team based upon material function,
sourcing availability or other key considerations.
Survey Section | .1 | .2 | .3 | .4 | .5 | .6 | .A | .B | .C | .D | Total Section Score | Max Possible Score | Average Section Score
1. Contract Review & Document Control | | | | | | | N/A | N/A | N/A | N/A | 0 | 18 | 0.00
2. Control of Inspection, Measuring & Test Equipment | | | | | | | N/A | N/A | N/A | N/A | 0 | 18 | 0.00
3. Incoming Inspection | | | | | | | N/A | N/A | N/A | N/A | 0 | 18 | 0.00
4. Manufacturing and Process Control | | | | | | | N/A | N/A | N/A | N/A | 0 | 18 | 0.00
5. In-Process & Final Inspection | | | | | | | N/A | N/A | N/A | N/A | 0 | 18 | 0.00
6. Packaging, Storage & Shipping | | | | | | | N/A | N/A | N/A | N/A | 0 | 18 | 0.00
7. Corrective and Preventive Action | | | | | | | N/A | N/A | N/A | N/A | 0 | 18 | 0.00
8. Training | | | | | | | N/A | N/A | N/A | N/A | 0 | 18 | 0.00
SUPPLIER OVERALL AUDIT RATING | | | | | | | | | | | 0 | 144 | 0.00
Audit Rating Recommendations:
95% = Outstanding quality system and performance.
80% = Meets requirements with above average to superior quality system and performance.
65% = Requires improvements in meeting quality system standards and performance.
< 65% = Requires significant improvement in meeting quality system standards and performance.
45% = No quality system in place.