Project Risk Assessment and Decision Support Tools


QUALITY (AUDIT/FOLLOW-UP)


AUDIT/FOLLOW-UP:

1 – Are/ were vendor surveillances done?
2 – Is contractor/ vendor work verified?
3 – Are procedures, codes and standards being complied with?
4 – Is the impact of errors calculated in terms of cost overrun and schedule delay?
5 – Were errors of omission or commission made?
6 – Are deficiencies identified?
7 – Are deficiencies documented?
8 – Are deficiencies corrected?
9 – Is follow-up consistent?
10 – Does the contractor self-certify their own work?
11 – Were/ are there any significant technical glitches that could impede the project?
12 – Did 3 or more consecutive failures occur during testing?
13 – Was a full audit performed recently?
14 – Is third party certification done?
15 – Is contract performance reviewed?
16 – Were key objectives met?
17 – Was testing performed entirely by computer simulation (as opposed to live testing, which is better)?
18 – Was a site visit done?
19 – Did a qualified/ specialized person inspect the repaired/ defective equipment and approve it before it was returned to service? Do two or more qualified people verify the work is performed correctly? (Yes = good)
20 – Are testers manipulating/ intentionally misinterpreting test results? (Yes = bad)
21 – Are only legal measurements and product/ service configurations being used? (No = bad)
22 – Does the project manager know the status of tools, equipment and parts in inventory, and can this be proven with documentation?
23 – Is a large amount of inventory kept on-hand? (Yes = blue)
24 – Has a comparison been made between inventory on the shelves and previously submitted inventory reports?
25 – Are significant/ critical items/ resources uncounted/ unaccounted for, without a trace?
26 – Does project manager effectively manage disposal of unused/ unnecessary/ no-longer usable supplies/ equipment/ material? Is this management supported by documentation?
27 – Is only a small amount of work actually inspected?
28 – Are fines large enough to deter misconduct?
29 – Are any phony statistics reported, or phony shipments received/ made?
30 – Is a joint inventory performed between the current and new responsible person? (No = bad)
31 – Is project using authorized parts? (No = bad)
32 – Does project have a formal loss control procedure? Is it being followed?
33 – Are mandated tests being done by the product/ service provider and not independently verified?
34 – Are permits approved/ issued without securing the required proof? Are there problems getting permits/ rights of way? (Yes = bad)
35 – Can “finalized” records be changed after-the-fact? (Yes = bad)
36 – Can bill of lading be tampered with or copied? (Yes = bad)
37 – Does PM know what the project’s simulations assume? (No = bad)
38 – Has the test data been validated? Does it exist at all, contain correct data, and is it correctly labeled?
39 – Were the services provided by the person listed on the invoice?
40 – Do any contracts exist just below the threshold requiring formal approval? (Yes = bad)
41 – Must work sheets/ invoices be verified & submitted before payment is made? (No = bad)
42 – Are bills being paid for work completed without an approved contract? (Yes = bad)
43 – PM ensures that project is in compliance with all pertinent federal and state laws? (Yes = good)
44 – Can PM determine where violations are piling up? (No = bad)
45 – Credit card related:
     *Do too many people have a project-related credit card? (Yes = bad)
     *Are charges being submitted and approved without properly filed receipts (such as details regarding location, business reason, and trip costs for hotel, air, ground transportation, meals, and other services)? (Yes = bad)
46 – Any unusual testing procedures being used? (Yes = blue)
47 – Are test results consistent (within 10%) when matching the project’s performance, expected normal range, and qualified third party test results? (No = bad)
48 – Are enough inspections/ testing being consistently done? (No = bad)
49 – Is final product (end item) testing done before delivery to the customer? (Yes = good)
50 – Can paperwork/ reports that were actually submitted late be “backdated” in the records system to make them “appear” to have been completed on-time? (Yes = red)
51 – Is the quality assurance department properly staffed and trained to catch flaws/ potential issues associated with this project? (No = bad)
52 – Can PM/ team trace the source of all the project’s resources? Data? (No = blue)
53 – Any evidence of performance data being falsely (and knowingly) reported/ numerous late entries? Any consistently deficient performance? (Yes = red)
54 – Do workers frequently wait until the end of their shift to update their work reports? (Yes = blue: Distorts actual time to complete)
55 – Are defects/ deficiencies found during inspections/ testing properly documented and promptly fixed? (Yes = good)
56 – Is project operating under a “temporary”/ “emergency” safety certificate? (Yes = blue)
57 – Is/ has the certification process been circumvented? (Yes = bad)
58 – Is testing performed to ensure that components/ equipment is configured properly/ in accordance with manufacturer’s instructions/ specifications? (No = bad)
59 – Is there a difference between the stated procedure and the actual test procedure? (Yes = bad)
60 – Are parts still being used past their initial replacement/ expiry date? (Yes = bad)
61 – Are safeguards in place to prevent accidents/ procedure violations? (Yes = good)
62 – Are bills/ invoices submitted in a timely manner? (No = bad)
63 – Are contractors’ direct labor task/ line-item prices reasonably in-line after independent expert review? (No = bad)
64 – Are contractors’ overhead item (food, lodging, insurance, equipment rental) prices reasonably in-line after independent expert review? (No = bad)
65 – Are line workers forced to figure out details which should have been placed in the quality plan? (Yes = bad)
66 – Do loopholes exist in this project, where if left unaddressed, will provide a less-than-accurate picture of the project’s actual performance? (Yes = bad)
67 – Do project failure reports include: Date/ equipment/ system/ time/ location/ duration/ cause/ effects/ department responsible for corrective action? (Yes = good)
68 – Is there any spending on expenses that have nothing to do with running this project? (Yes = bad)
69 – Are accounting tricks being employed that hide the project’s true situation? (Yes = bad)
70 – Has this project been in compliance the entire time? (Yes = good) – If no, can the compliance gaps be identified? (No = blue)
71 – Is the project’s product(s)/ component(s)/ process deviating from stated specifications? (Yes = blue)
72 – Can QA (or other competent organization) quantify the problem/ amount of deviation? (No = blue)
73 – Can QA (or other competent organization) identify internal and external organizations affected by products/ services operating out-of-specification? (No = red)
74 – Does QA ensure that responsible and qualified project team personnel verify that contractors are following proper worker safety procedures? (No = bad)
75 – Was the number of tests tracked and recorded? (No = blue)
76 – Does PM/ QA have a formal process/ procedure to improve contractor/ team work quality? (e.g.:
A. Verifying contract performance: Contractor actually exists/ work actually was performed;
B. Ensuring accountability/ offering assistance and that the work gets performed;
C. Assigning consequences – such as a formal progressive disciplinary system, if work is not performed to standard;
D. Replacement of contractors who consistently fail to meet performance standards)
77 – Does the PM review actual testing data? (Yes = good)
78 – Is testing for this project standardized and repeatable? (Yes = good)
79 – Are there sufficient auditors at the project location to perform good audits? (No = bad)
80 – Any funds been/ being transferred from restricted/ custodial accounts to cover operating budget expenses? (Yes = bad)
81 – Are layers of testing being used? (e.g., initial testing – for identifying obvious flaws; and advanced testing for unique issues/ possible double blind testing/ comparing current test results with the baseline to see what did and did not change?) (Yes = good)
82 – Is advanced testing being (needlessly) employed, despite no evidence of problems with the initial tests? (Yes = bad)
83 – Is QA making on-site inspection of the work being performed? (Yes = good)
84 – Is/ has the project’s end product passing/ passed all necessary tests? (No = bad)
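The flag annotations above (e.g., “Yes = bad”, “Yes = red”, “Yes = blue”) lend themselves to simple automation as a decision-support aid. The sketch below is a minimal, hypothetical illustration, not part of the original tool: the rule table, function names, and the handful of items encoded are assumptions, and only a few checklist items are mapped as examples. It also includes the 10% consistency tolerance from item 47.

```python
# Hypothetical sketch of a checklist evaluator for the flag annotations above.
# Maps item number -> (answer that raises a flag, flag colour).
FLAG_RULES = {
    20: ("yes", "bad"),   # testers manipulating/ misinterpreting results
    32: ("yes", "blue"),  # large amount of inventory kept on-hand
    50: ("yes", "red"),   # late paperwork being backdated
    53: ("yes", "red"),   # falsified performance data
}

def evaluate(answers):
    """Return (item, colour) flags raised by {item_number: 'yes'/'no'} answers."""
    flags = []
    for item, answer in answers.items():
        rule = FLAG_RULES.get(item)
        if rule and answer.lower() == rule[0]:
            flags.append((item, rule[1]))
    return flags

def within_tolerance(result, expected, tol=0.10):
    """Item 47: is a test result consistent (within 10%) with the expected value?"""
    return abs(result - expected) <= tol * abs(expected)

# Example: one benign answer, two flagged answers, one out-of-tolerance result.
print(evaluate({20: "no", 32: "yes", 50: "yes"}))  # [(32, 'blue'), (50, 'red')]
print(within_tolerance(88.0, 100.0))               # False: deviates by 12%
```

In practice the full rule table would encode every annotated item, and any “red” flag would warrant immediate escalation rather than aggregation into a score.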