[INSERT NAME OF AUDIT AND AUDIT NUMBER]

Objectives
1. Provide management with an independent assessment of the progress, quality, and attainment of project objectives, at defined milestones within the project, based on company policies and procedures.
2. Provide management with an assessment of the adequacy of project management methodologies and of whether the methodologies are applied consistently across all projects.
3. Provide management with an evaluation of the internal controls of proposed business processes at a point in the development cycle where enhancements can be easily implemented and processes adapted.
4. Provide management with an assessment of the adequacy of the security controls implemented.
5. Provide management with an evaluation of the project metrics / KPIs and expected benefits stated within the project business case report.

Scope
The audit of the SDLC process will review each phase of a system implementation project. The audit will address the following areas: governance and risk management, compliance with company procedures and regulation, project management methodology, budget, internal controls, and business processes.

Audit Step
AA - Planning
1. Prepare the audit announcement / notification letter informing applicable people of the estimated start date of the audit, the objective, and the scope. E-mail it to addressee(s). Maintain e-mail in audit file.
2. Prepare a budget of estimated audit hours by audit category. See Audit Time Budget tab. Identify audit staff that will be assigned to the engagement.
3. Review prior SDLC audits and permanent files to ensure understanding of the SDLC process and previously identified audit findings. Document any risks noted in the Risk Assessment tab. Update information in the permanent files, if necessary.
4. Perform a pre-audit risk assessment. Map risks identified to audit procedures by updating the Benchmarking and Detail Audit Testing tabs as necessary.
5. Obtain and review the most current SDLC Policies and Procedures manual from the auditee. Update the Benchmarking and Detail Audit Testing tabs as necessary.
6. Research industry best practices (ISACA, IIA, NIST, ISO, PMBOK) and compliance requirements (PCI DSS, Privacy, HIPAA, etc.) that are applicable to the system being implemented. Update the Benchmarking and Detail Audit Testing tabs as necessary.
W/P Ref
Preparer Sign-off
Reviewer Sign-off
Audit Step
7. Schedule a pre-audit meeting with the audit team and IT Project Team to discuss the objectives, scope, timing, involvement, and requirements of the audit. Maintain meeting minutes. (Note: the Audit Team should communicate to the Project Team the expectation that the Audit Team will be invited to project meetings and included in any project e-mail groups.)
8. Prepare a preliminary request list of documentation and discuss it during the pre-audit meeting (e.g. flow charts, process narratives, listing of project team members, business case, system product information, etc.).

Detailed Audit Testing
See separate tab for detailed audit program.

BB - Audit Conclusion and Reporting
1. Prepare Audit Memos for each major phase of the IT Project (see below) and e-mail them to the Project Team and Project Sponsor. Request from addressee(s) response(s) to all audit findings, along with an implementation date.
a. Business case and planning phase.
b. Development and build phase.
c. Testing phase.
d. Pre Go-live & data conversion phase.
e. Training phase.
2. Prepare the draft Audit Report and e-mail it to direct addressee(s). Request from addressee(s) response(s) to all audit findings, along with an implementation date.
3. Schedule an audit completion meeting with the Project Team and Project Sponsor within 2 weeks of issuing the draft report. Review the draft audit report and discuss audit findings and recommendations. Maintain meeting minutes.
4. Review management response (i.e. Action Plan) to audit findings. Review completion date(s) of action items for reasonableness.
5. Prepare the final version of the Audit Report and include the responses received from addressee(s) for all audit findings (if applicable). E-mail it to addressee(s) and management.
6. Prepare and e-mail the Audit Survey to auditees. Request that responses are returned to the Audit Manager. See Audit Survey tab.
CC - Follow-up Audit Procedures
1. Perform follow-up procedures (if applicable) to ensure that responses to audit findings have been implemented. Follow-up procedures should be performed within 6 months of the issuance date of the audit report. (Note: If management hasn't addressed findings, schedule a meeting with applicable employees to discuss implementation of the action plan. Maintain meeting minutes.)
2. Prepare the Follow-up Audit Report and e-mail it to direct addressee(s) and management.
3. If management is not properly or timely addressing audit findings, escalate the matter to the CAE. Document the escalation in a memorandum to the Audit Files.

DD - Audit Close-Out & Retention Dates
1. Perform a residual risk assessment and incorporate risks into the annual audit risk assessment.
2. Update the permanent file, if necessary.
3. Review audit workpapers. Verify that all review notes have been properly addressed and closed. Verify that audit sign-offs have been performed by audit staff and reviewer(s).
4. Schedule a meeting of the audit team to discuss what worked well during the audit and areas for improvement to consider for future audits. Discuss with the Audit Manager.
5. This audit has been completed in its entirety as of:
[insert date]
6. The audit report and workpapers shall be retained for 7 years from the date of completion, unless indicated otherwise (e.g. permanent files). The retention date is:
[insert date]
Preparer Sign-off
Reviewer Sign-off
Audit Time Budget

Audit Charge Code:

The following audit areas are budgeted separately for the Audit Manager, Audit Senior, and Audit Staff ([Insert Auditor Name] for each), with Budget Hours, Actual Hours, and Notes recorded per area:
- Planning
- Reporting
- Follow-up Audit Procedures
- Audit Close-out
- Detailed Audit Testing:
  - Project Governance
  - Pre Implementation - Business Case & Project Planning
  - Pre Implementation - System Development
  - Pre Implementation - Testing
  - Pre Implementation - Pre Go-Live & Conversion
  - Pre Implementation - Training
  - Post Implementation - Support & Maintenance
  - Post Implementation - Project Assessment
  - Post Implementation - Internal Controls Assessment
- Total Hours (currently 0 budget / 0 actual for each auditor)
[Provide a high level overview of the area(s), function(s), business process(es), and current systems that will be affected by the system being implemented.]
General listing of common risks that may occur during a system implementation project:
• System does not align with strategic objectives
• End Users do not accept the system due to poor design
• Project mismanagement leads to scope creep, budget overruns, and delays
• Security vulnerabilities
• Internal control gaps
• Lack of data completeness, accuracy, and integrity
• Inability to adhere to regulation resulting in fines / penalties
• Damage to reputation (especially if system is used by external parties)
• Disruption of service
Risk Rating Definitions

High Rating: The potential for a material impact on the company's earnings, assets, reputation, customers, and operations. This risk has a high likelihood of occurring.

Medium Rating: The potential for a significant impact on the company's earnings, assets, reputation, customers, and operations. This risk has a medium likelihood of occurring.
Low Rating: The potential for a limited impact on the company's earnings, assets, reputation, customers, and operations. This risk has a low likelihood of occurring.
Risk Assessment Questionnaire

Audit Senior Interview of Project Team Members
R1 How will the new system benefit your area and the company the most?
R2 How critical is the system to the overall organization?
R3 Will the system be included in the corporate-wide Disaster Recovery Plan and / or Business Continuity Plan? (Note: Not all systems will be in the DRP due to a low criticality rating. The auditor should ask whether the department will prepare procedures to perform in situations where the system is unavailable. If not, the system should be reassessed to determine whether it warrants Internal Audit attention, given its insignificance with regard to company-wide needs.)
R4 Will this system be used by external parties (e.g. customers or business partners) or internally?
R5 What function(s) do you believe will be impacted the most by the new system?
R6 Will the new system change the business process(es) significantly?
R7 Were other departmental groups included in the assessment of the product to ensure that it meets all user needs?
R8 What are the weaknesses in your current process(es) and will they be addressed by the new system?
R9 Will the new system require a significant amount of customization?
R10 Will the system contain confidential information? If so, what kind of data (e.g. customer PII, employee PII, R&D, intellectual property, etc.)?
R11 How will access to confidential data be safeguarded during the development of the system (e.g. security is generally not tight in a development or test environment)?
R12 How will access to confidential data be safeguarded through access rights when the system is in production (e.g. segregation of duties)?
R13 What systems will be interfacing with the new system?
R14 What system(s) will be retired once the new system is in production?
R15 What data will be converted over to the new system?
R16 Do you believe the data to be converted is complete and accurate (e.g. good data) or does the data need to be cleansed?
R17 Do you believe the budgeted timeline for the project is achievable?
R18 If the estimated go-live date were significantly delayed, what would be the impact on the organization?
R19 What do you believe would be the most challenging aspect of the project?
R20 Do you believe that the project has the appropriate support from senior / upper management?
R21 In looking back on previous implementation projects, what were the things that worked well and what were the things that did not work well? How do you plan on addressing the things that did not go well?
R22 Do you have any concerns regarding project team personnel resources (e.g. availability, training, experience)?
R23 Do you believe that there are departmental silos that may impact the development and implementation of the system?
R24 What areas do you believe could result in scope creep?
Audit Notes
Risk Rating
Audit Step
Risk Assessment Questionnaire
R25 Do you believe that the new system will require a significant amount of support from the Help Desk, IS, Super Users, or the Project Team members after it goes live?
R26 Did you include an assessment of the vendor's security process related to the product you are purchasing, covering its history of vulnerabilities, notifying customers of vulnerabilities, and remediating identified vulnerabilities through patching?
R27 Will you be using a software development model in implementing this system (e.g. rapid application development, joint application development, agile, spiral, prototype, or waterfall)?
R28 Will you be using any implementation tools on the cloud in implementing this system?
R29 Does this system or the data that will be contained within it fall under the scope of international / federal / state regulation?
R30 [Add additional risks as needed]

Audit Team Assessment
R31 Indicate risks stated in the annual audit risk assessment.
R32 Indicate risks identified in review of prior SDLC audits performed.
R33 Indicate any control deficiencies identified in the area(s), function(s), or business process(es) that have occurred in the past two years.
R34 Determine if the members on the Project Team have the proper training and experience to manage an SDLC project.
R35 Are there any financial risks concerning this project (e.g. going over budget, impacting the financial statements, impacting customer billing or vendor payments, etc.)?
R36 Are there any fraud risks that need to be considered (programming backdoors, unauthorized access to / theft of data, intentionally misconfiguring the system, unauthorized individuals (internal employees and contractors) with access to data and / or systems)?
R37 Are there any security risks that need to be considered (server / OS, application, data, placement in network infrastructure (segmentation))?
R38 Review post implementation project assessment reports and identify any lessons learned that may pose a risk to this project.
R39 Assess if there are any risks related to the response in question #R27.
R40 Assess if there are any risks related to the response in question #R28.
R41 [Add additional risks as needed]
Audit Risk
Section A - Governance Lack of procedures leads to mismanaged project, system not meeting business needs, and ineffective responsibilities and accountabilities.
Employees do not have the required skills to implement a system leading to mismanaged project and system not meeting business needs.
Control
Audit Procedures
IT project management policy, procedures, and templates have been developed and are reviewed on a periodic basis.
A1
Employees involved in IT system implementation projects have been trained on policy, procedures and use of the templates.
A2
Communication of project status is not performed throughout the life cycle of the project resulting in unidentified issues, surprises and delays.
Project Steering Committee provides oversight over IT system implementation projects.
A3
A4 A5
A6
A7
End Users do not accept or use the system.
Project Team meets regularly with project team members, subject matter experts, and system implementors / contractors to review project status and identify tasks and issues.
A8
An organizational change communication plan is developed and implemented. (typically for major systems)
A10
A9
A11
A12 A13
A14
Section B - Pre-Implementation: Business Case and Project Planning

Lack of business justification results in the purchase of a system that does not meet business needs.
IT projects are assessed according to organizational objectives and are approved.
B1
B2
Inadequate vendor is selected to implement the system resulting in cost overruns, project delays, and system not meeting business needs.
Vendor is selected according to company bid procedures.
B3 B4
Inadequate contract terms may result in cost overruns, project delays, and exposure to additional risks (e.g. unauthorized access / disclosure of data).
Vendor contract is prepared according to company procedures.
B5
B6
Inadequate project planning may result in cost overruns, project delays, and system not meeting business needs.
A project plan is created to establish accountability and expectations.
B7
B8
B9
B10
B11
B12
Lack of procedures leads to mismanaged project, system not meeting business needs, and ineffective responsibilities and accountabilities.
Project documentation is in conformance with company procedures.
System design team does not understand capabilities resulting in inadequate system design.
Project Team members are trained on the capabilities of the system prior to blueprinting.
B13
B14
B15
Section C - Pre-Implementation: System Development -- Design & Build

Inadequate system design results in a system that does not meet user needs and increases likelihood of nonacceptance.
System design meetings are held with a cross-functional group of users and technical SMEs.
C1
C2
System Development Plan meets business needs and is updated throughout the project.
C3
Security and internal control requirements are included in the System Development Plan.
C4
System Development Plan is reviewed and approved by Project Sponsor.
C5
Inadequate data conversion plan results in data integrity issues and increases likelihood of nonacceptance by users.
Data Conversion Plan is well designed and meets business needs.
C6
C7
Confidential data is properly secured.
C8
Lack of procedures leads to mismanaged project, system not meeting business needs, and ineffective responsibilities and accountabilities.
Data Conversion Plan is reviewed and approved by Project Sponsor.
C9
Project documentation is in conformance with company procedures.
C10
Project team members are unaware of system design documentation leading to a system that does not meet business needs.
Project documentation is communicated to Project team members.
C11
Lack of procedures leads to mismanaged project, system not meeting business needs, and ineffective responsibilities and accountabilities.
C12
System build is in conformance with company policies and procedures.
C13
C14
Inaccurate converted data results in data integrity issues and increases likelihood of nonacceptance by users.
Data in the old system is reviewed to ensure accuracy and integrity.
C15
C16
C17
Inadequate testing of data conversion results in unidentified issues or errors occurring during the go-live phase.
Data conversion process is tested and errors are addressed.
C18
C19
C20
Lack of technical documentation leads to inability to effectively support the system.
System documentation is developed and maintained to current specifications.
C21
C22
C23
Lack of project monitoring leads to budget overruns, delays and not meeting project objectives.
Project Lead reevaluates the project budget, timeline, and milestones against actual results throughout the project to identify any issues that need to be addressed.
C24
C25
C26
C27
C28
Section D - Pre-Implementation: Testing

Lack of a test plan may lead to ineffective testing resulting in acceptance of a system that does not meet business needs.
Test Plan is created to ensure testing is complete and the system meets stated requirements prior to implementation.
Lack of procedures leads to mismanaged project, system not meeting business needs, and ineffective responsibilities and accountabilities.
Project documentation is in conformance with company procedures.
D1
D2
Lack of a test plan may lead to ineffective testing resulting in acceptance of a system that does not meet business needs.
Test Plan is reviewed and approved by Project Sponsor.
D3
Lack of a test environment results in ineffective testing and may lead to a system not meeting business needs.
Test environment is created and kept separate from the development and production environment.
D4
Lack of a test environment results in ineffective testing and may lead to a system not meeting business needs.
Lack of cross functional testers may result in system not meeting business needs.
Test environment simulates the production environment.
D5
Testers are made up of cross functional users to ensure system meets business needs.
D6
Lack of test scripts may lead to ineffective testing resulting in acceptance of a system that does not meet business needs.
Test scripts are created and monitored for satisfactory results.
D7
D8
D9 D10 Testing issues identified are not resolved prior to implementation.
Issues are tracked in a log and monitored to ensure timely and proper resolution.
D11
Lack of test scripts may lead to ineffective testing resulting in acceptance of a system that does not meet business needs.
Test scripts are created and monitored for satisfactory results.
D12
D13
D14
D15
Lack of technical documentation leads to inability to effectively support the system.
System documentation is developed and maintained to current specifications.
Lack of project monitoring leads to budget overruns, delays and not meeting project objectives.
Project Lead reevaluates the project budget, timeline, and milestones against actual results throughout the project to identify any issues that need to be addressed.
D16
D17
D18
D19
D20
D21
Section E - Pre-Implementation: System Pre Go-live & Data Conversion

Lack of implementation plan results in go-live steps being missed leading to a system that does not meet business needs or unavailability of the new system.
Implementation Plan is created to ensure a smooth and complete transition to the production environment.
E1
Lack of procedures leads to mismanaged project, system not meeting business needs, and ineffective responsibilities and accountabilities.
Project documentation is in conformance with company procedures.
E2
Lack of implementation plan results in go-live steps being missed leading to a system that does not meet business needs or unavailability of the new system.
Implementation Plan is reviewed and approved by Project Sponsor.
E3
E4
Modification or loss of data in the old system may impact the data conversion process.
Data in the old system is backed up prior to data conversion to production environment.
E5
Inadequate testing of data conversion results in unidentified issues or errors occurring during the go-live phase.
Data conversion process is tested and errors are addressed.
E6
E7
E8
Lack of testing of the production environment prior to go live may result in system not meeting business needs or unavailability of the system.
All test scripts have been completed with satisfactory results.
Production environment is tested to ensure it performs in the same capacity as the test environment.
Unresolved issues are not identified resulting in system not meeting business needs in production environment.
E9
E10
Unresolved issues are tracked.
E11
Unresolved issues are assessed for impact on the system prior to going live.
E12
Lack of security controls leads to security vulnerabilities in the system.
Unresolved issues are reviewed and approved.
E13
Privileged users do not have access to the production environment.
E14
Security personnel review and approve the security specifications prior to system going live.
E15
Lack of access review may lead to unauthorized users having access to the system or authorized users set up in the wrong access group.
System owner reviews and approves user access rights prior to system going live.
E16
Lack of approval to go-live may result in system not meeting business needs.
E17
Project Steering Committee approves system to go live.
E18
Lack of go-live checklist results in go-live steps being missed leading to a system that does not meet business needs or unavailability of the new system.
Go-live checklist is maintained to track all go-live tasks and ensure all have been completed.
E19
Lack of project monitoring leads to budget overruns, delays and not meeting project objectives.
Project Lead reevaluates the project budget, timeline, and milestones against actual results throughout the project to identify any issues that need to be addressed.
E20
E21
E22
E23
Section F - Pre-Implementation: Training
Users do not accept the system.
Training is provided to users to provide them with the proper procedures in utilizing the system.
F1
F2
F3
F4
F5
Section G - Post Implementation: Support & Maintenance

Lack of support personnel may lead to user frustration and lack of acceptance.
Support team is properly staffed to meet business needs.
Slow support response may lead to user frustration and lack of acceptance.
SLA metrics are developed and monitored to measure performance and meet business needs.
G1
G2
G3
G4
Support personnel do not understand the system and are unable to resolve user issues.
Technical training is provided to support personnel on newly implemented systems.
G5
Lack of change management procedures results in unauthorized changes being made, changes not being properly tested, and system documentation not being updated.
Change management procedures are in place and reviewed annually.
G6
Lack of a patch management process leads to security vulnerability exposures.
G8
Patch management procedures are in place and reviewed annually.
G7
Lack of inventory listing leads to unauthorized devices / software on the network resulting in exposure to security vulnerabilities.
IT asset inventory listing is maintained and reviewed on an ongoing basis.
G9
Section H - Post Implementation: Review of Project Results & Close Out

Evaluation of the management of the project is not performed, which could lead to ineffective project management practices, an ineffective system, and not meeting business objectives.
Post implementation review is performed by the Project Lead and reviewed by the Project Steering Committee.
H1
H2
H3
H4
H5
H6
Section I - Post Implementation: Internal Controls Assessment

Internal controls are not created or operating effectively for the system, which could lead to inaccurate financial reporting.
Internal controls are created or modified to ensure the safeguarding of assets and financial records. Controls to be considered:
- security
- change management
- operations
- continuity
- business processes
I1
I2
I3
I4
Audit Procedures
COBIT 5
COSO Principle
BAI01.01
4, 5, 10, 11, 12
Examine training records and verify that employees on the project team have been trained on company IT project management procedures.
BAI01.12
4
Verify that a Project Steering Committee exists, as evidenced by the committee charter.
BAI01.02
3
BAI01.05, BAI01.06, BAI01.07, BAI01.11
5, 13, 16
Obtain and examine policy, procedures and templates. Verify that they address the following:
Pre-Audit Risk #
Company Procedures
- Business Case Analysis - Project risk assessment - Roles and responsibilities - System documentation - System specification - User specification - Security specification - System development plan - Change requests - Developing internal controls - Project issue procedures - Data conversion plan - Test plan - Pre Go-live plan - Training - Organizational change management plan - Project monitoring & status updates - Post implementation project review
A member of the Audit Team should attend the Project Steering Committee meetings. Obtain and examine Project Steering Committee meeting minutes to verify that committee is reviewing project status, achievement of project milestones, monitoring budget-to-actual costs, and results of project measurements (i.e. KPIs).
Verify that the Project Team Lead is submitting status reports on a periodic basis and any other required documentation to the Project Steering Committee throughout the lifecycle of the project. Status reports should contain budget-to-actual comparison & variation analysis, monitoring of milestones & KPIs, description of achievements and any issues affecting the progress of the project.
BAI01.06, BAI01.07, BAI01.11
14, 16
Verify that the Project Steering Committee has reviewed the post implementation project results report and has developed an Action Plan to address any actionable lessons learned.
BAI01.06, BAI01.11
17
Examine the Project Team's status meeting minutes and verify that the team discusses tasks completed / to be completed and issues identified / assigned / resolved.
BAI01.08
16
Verify that an organizational change communication plan has been developed and that it addresses the following:
BAI01.03
14
Verify that an external communication plan has been created to provide information to customers and / or business partners regarding the implementation of the system (if system will be used by these parties).
BAI01.03
15
Verify that the Communication Plan has been approved by the Project Sponsor.
BAI01.03
3
Verify that the communication plan has been implemented.
BAI01.03
14
A member of the Audit Team should attend the Project Team status meetings.
- Assessing company's readiness to accept change - Educating end users on the reason and timing behind the change - Roles and responsibilities of organizational change management team - Vision and strategy for change - Communication of vision and strategy to end users - Remove barriers / silos that inhibit end user acceptance - Short-term and long-term goals identified and monitored - Identify training needs - Other communication activities (newsletter, posters, intranet site, etc.) - Continuous feedback
Interview a sample of employees to determine the effectiveness of the communication plan and end user acceptance of system.
BAI01.03
14
Verify that a Business Case Report has been developed and approved by the appropriate level of management and governance committee(s) (e.g. IT Steering Committee, Capital Assets Committee, etc.).
BAI01.02
6
Verify that the Business Case Report includes:
BAI01.02, BAI01.10, BAI02.02
6, 7, 9, 13
- Current state of business process, identifying any control weaknesses
- Expected future state of business process (consider future growth)
- Addresses corporate / department strategic goals
- Description of the application systems reviewed (e.g. proof of concepts, demos)
- Reason behind system recommended to be implemented (e.g. feasibility study)
- Cost benefit analysis (dollar & labor cost / benefits, other benefits)
- Potential risks of the project and significance of risks
- Potential impact to critical systems
- Regulatory concerns / approvals
- End User feedback

Verify that a Request for Proposal was prepared and sent to selected vendors.
APO10.02
Verify that Vendor proposals were reviewed for:
APO10.02
- Reputation and experience of vendor - Experience of vendor personnel - Proposal content met scope of the project - Rates for time and expense - Ability to respond to system vulnerabilities and provide patches to customers timely
Verify that the vendor contract addresses the following:
- Vendor expectations
- Deliverables
- System requirements
- Warranties and liabilities
- Rates for time and expense (payment terms based on achievement of milestones)
- Change request process
- Confidentiality / Non-Disclosure
- Terms and conditions
- Project timelines and milestones
- Insurance requirements
- Adherence to company policies and procedures in developing system
- Terms to restore back to current system
BAI03.03, BAI03.04, APO10.01, APO10.02
Verify that the vendor contract has been approved by the appropriate level of management.

Verify that a Project Plan has been created and includes:
- Project objectives
- Project scope
- Project risk assessment
- Identifies stakeholders
- Project Sponsor
- Team members
- Roles and responsibilities
- Budgets and timelines
- Project milestones and KPIs
- Communication plan
- Deliverables
- Change in scope procedures
3
BAI01.04, BAI01.05, BAI01.07, BAI01.08, BAI01.10, BAI01.12, BAI02.03
Verify that the Project Plan has been approved by the Project Team Lead and Project Sponsor.
BAI01.07
Verify that a project kick-off meeting has been held to review the Project Plan with team members by obtaining the meeting minutes.
BAI01.07, BAI01.08
Assess project timelines and determine if the timeline is reasonably achievable.

Assess project personnel resources for:
- Availability
- Cross functional representation of all departments impacted by system
- Experience
BAI01.12
3, 5, 6, 7, 14
14
Review prior project lessons learned and determine if they have been properly incorporated into the Project Plan.
BAI01.01
Verify that the Project Plan is in compliance with company procedures.
BAI01.01
12
Verify that employees involved in the design and build of the application system have been properly trained to configure / customize the system and to use the system when performing tests.
BAI01.12
4
Verify that project design / blueprint meetings have been held to develop the System Development Plan and Data Conversion Plan by obtaining the meeting minutes.
BAI02.01, BAI03.01
10, 14
Verify that the appropriate employees are participating in the project design meetings:
BAI01.12, BAI03.01
10, 14
BAI02.01, BAI03.01, BAI03.02, BAI03.03
5, 11
Prepare an audit memorandum of the results of this phase of testing and distribute to the Project Team and Project Sponsor.
& Build
- Project team
- System implementors
- Subject matter experts
- Super users
- End users
- Network administrators
- System administrators
- Security administrators

Verify that a System Development Plan has been created and includes:
- System documentation
- System specification
- User specification
- Functional requirements
- Reporting requirements
- Customization
- Security and internal controls requirements
- Interfaces with other systems (consider impact on inter-operability)
- Process and data flowcharts
- Data storage
- Issue identification and resolution
- Constraints
- Backout / Contingency Plan
Verify that security and internal control requirements consider the following: - access rights based on least privilege - segregation of duties - system authorizations - edit checks - audit logs - input checks - matching checks - sequence checks - duplication checks - output - exception reporting
BAI02.01, BAI03.01, BAI03.02, BAI03.03
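Where a user-access design or access export is available during this phase, the segregation-of-duties and least-privilege requirements above can be tested with a short script rather than by inspection alone. The following is a minimal sketch only; the export file name, its columns, and the conflicting role pairs are illustrative assumptions to be replaced with the company's actual access design.

```python
# Minimal sketch: flag segregation-of-duties conflicts in a user-access export.
# The export layout (one row per user/role pair) and the role names are illustrative assumptions.
import csv
from collections import defaultdict

# Hypothetical pairs of roles that should not be held by the same user
CONFLICTS = [("create_vendor", "approve_payment"),
             ("post_journal", "approve_journal")]

roles_by_user = defaultdict(set)
with open("user_access_export.csv", newline="") as f:
    for row in csv.DictReader(f):          # expects columns: user, role
        roles_by_user[row["user"]].add(row["role"])

for user, roles in roles_by_user.items():
    for a, b in CONFLICTS:
        if a in roles and b in roles:
            print(f"SoD conflict: {user} holds both '{a}' and '{b}'")
```

Any conflicts flagged would be discussed with the system owner and documented in the workpapers.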
Verify that the System Development Plan has been approved by the Project Team Lead, Project Sponsor, and System Implementor.
BAI03.01
Verify that a Data Conversion Plan has been created and includes:
BAI03.01, BAI03.02, BAI03.06, BAI07.01
- Identification of data to be transferred / converted
- Data cleansing procedures
- Error tolerances
- Data mapping
- Data extraction
- Data transfer
- Data validation test plans
- Issue identification and resolution
- Conversion timeline
- Conversion tasks included in go-live checklist
- Required approvals

Verify that the Project Team has developed the data map. Determine if the data map is in sufficient detail to assist IT in converting the data and for testers in testing the system.
BAI03.02, BAI07.02
- flow chart of data movement
- identification of common data elements
- identification of field mapping between old system and new system
- determine file format and layout for import: field length, format, name, values, etc.
- translation of data values
- identification of confidential / key data

Assess whether the data to be converted is confidential and whether appropriate security measures have been implemented to protect that data where it resides (e.g. dev / test / prod environments).
BAI03.02, BAI07.04
5, 11
11
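The field-layout elements of the data map described above (field length, format, name, and values) can also be checked programmatically against a staged import file. A minimal sketch in Python with pandas; the file name and the layout dictionary are hypothetical examples standing in for the project's actual data map.

```python
# Minimal sketch: validate an import file against the field layout documented in the data map.
# The file name and the LAYOUT dict (field -> max length, allowed pattern) are illustrative assumptions.
import re
import pandas as pd

LAYOUT = {
    "customer_id": {"max_len": 10, "pattern": r"^\d+$"},
    "postal_code": {"max_len": 10, "pattern": r"^[A-Za-z0-9 -]+$"},
}

df = pd.read_csv("import_file.csv", dtype=str).fillna("")

for field, spec in LAYOUT.items():
    too_long = (df[field].str.len() > spec["max_len"]).sum()
    bad_format = (~df[field].str.match(spec["pattern"])).sum()
    print(f"{field}: {too_long} values exceed the allowed length, {bad_format} fail the format check")
```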
Verify that the Data Conversion Plan has been approved by the Project Team Lead, Project Sponsor, and System Implementor.
BAI02.04
Verify that the System Development Plan and Data Conversion Plan are in compliance with company procedures.
BAI02.01
Verify that the System Development Plan and Data Conversion Plan have been discussed with applicable employees involved in implementing these plans.
BAI01.08
Verify that any servers and operating systems pertaining to the new system have been configured according to the company's configuration management procedures.
BAI01.09, BAI03.05
11
BAI01.09, BAI03.05
11
12
- default and unnecessary accounts / services are disabled, if possible
- disable local admin account
- default passwords are changed and made complex for admin accounts, application / operating systems and any other new networked device
- limiting admin privileges to those who have a business need to modify configuration
- enable logging

Verify that any servers and operating systems pertaining to the new system have been secured according to the company's security procedures. Examples are:
- anti-virus / malware on server
- password management enabled (log-on attempts, password change timeframe, password history)
- admins have different passwords for admin accounts and non-admin accounts
- disabling LM hashes
- encryption
- network segmentation
- enable firewall
- remote administration of servers over secure channels
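One of the configuration checks above (default and unnecessary accounts disabled) can be re-performed from a server account export if one is provided. A minimal sketch, assuming a hypothetical CSV export with server, account, and enabled columns and an illustrative list of vendor default account names:

```python
# Minimal sketch: flag enabled default / vendor accounts in a server account export.
# The export file name, its columns, and the default-account list are illustrative assumptions.
import csv

DEFAULT_ACCOUNTS = {"guest", "admin", "sa", "oracle", "tomcat"}   # hypothetical vendor defaults

with open("server_accounts_export.csv", newline="") as f:
    for row in csv.DictReader(f):            # expects columns: server, account, enabled
        if row["account"].lower() in DEFAULT_ACCOUNTS and row["enabled"].lower() == "true":
            print(f"{row['server']}: default account '{row['account']}' is still enabled")
```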
Verify that changes made to current systems (setting up interfaces, extracting data, importing data) follow the company's change management procedures.
BAI01.09, BAI03.05, BAI06
11
- changes are documented
- changes are tested
- changes are approved by business and IT prior to migration into production environment
- quality assurance review

Verify that the data cleansing has been performed by determining if the Project Team verified that:
BAI07.02
- All mandatory fields are populated
- All records are present
- Default or dummy values cannot be inserted where there is missing data
- Data is complete
- No duplication of data fields

For data that has not been cleansed, determine potential risks and impacts to the project. Determine if error tolerances have been evaluated against the approved thresholds stated in the Data Conversion Plan.
BAI07.02
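The data cleansing checks listed above lend themselves to independent re-performance with a data analysis tool. The following minimal sketch (Python with pandas) illustrates the idea; the extract file name, the mandatory and key field lists, and the dummy-value set are assumptions that would come from the Data Conversion Plan and data map.

```python
# Minimal sketch: re-perform data cleansing checks on a legacy-system extract.
# File name, field lists, and dummy values are illustrative assumptions.
import pandas as pd

MANDATORY = ["customer_id", "name", "effective_date"]   # hypothetical mandatory fields
KEY_FIELDS = ["customer_id"]                             # hypothetical primary key
DUMMY_VALUES = {"N/A", "TBD", "9999", "XXXX"}            # hypothetical placeholder values

df = pd.read_csv("legacy_extract.csv", dtype=str)

# 1. All mandatory fields are populated
missing = {col: int(df[col].isna().sum()) for col in MANDATORY}

# 2. No default / dummy values where real data is expected
dummies = {col: int(df[col].isin(DUMMY_VALUES).sum()) for col in MANDATORY}

# 3. No duplication of key data
duplicate_keys = int(df.duplicated(subset=KEY_FIELDS).sum())

print("Missing mandatory values:", missing)
print("Dummy values found:", dummies)
print("Duplicate keys:", duplicate_keys)
print("Record count (compare to source control total):", len(df))
```

Exception counts would then be compared to the error tolerances approved in the Data Conversion Plan.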
Verify that the Project Team has verified the accuracy, integrity and completeness of data conversion to the test system by reviewing test documentation.
BAI07.02
Verify that data converted to the test system is complete, accurate, and has integrity.
BAI03.05, BAI03.06, BAI07.02
11, 13
- batch and control totals
- check sums / digits
- range checks
- date and time stamps
- use a data analysis tool to compare a sample of data from the old system and the new system
- verify a test sample of data to source documentation

Verify that the Project Team addresses any errors or omissions identified as part of testing the data conversion.
BAI07.02
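Where a data analysis tool is used for the old-system versus new-system comparison above, the reconciliation can be expressed as record counts, control totals, and key-level matching between the two extracts. A minimal sketch, assuming hypothetical extracts old_system.csv and new_system.csv with an account_id key and a balance amount column:

```python
# Minimal sketch: reconcile converted data between old and new system extracts.
# File names, the key column, and the amount column are illustrative assumptions.
import pandas as pd

old = pd.read_csv("old_system.csv", dtype={"account_id": str})
new = pd.read_csv("new_system.csv", dtype={"account_id": str})

# Batch / control totals: record counts and summed amounts should agree
print("Record counts:", len(old), "vs", len(new))
print("Control totals:", old["balance"].sum(), "vs", new["balance"].sum())

# Completeness: keys dropped or added during conversion
dropped = set(old["account_id"]) - set(new["account_id"])
added = set(new["account_id"]) - set(old["account_id"])
print("Keys missing in new system:", len(dropped))
print("Keys appearing only in new system:", len(added))

# Accuracy: field-level comparison on the common keys
merged = old.merge(new, on="account_id", suffixes=("_old", "_new"))
mismatches = merged[merged["balance_old"] != merged["balance_new"]]
print("Records with balance differences:", len(mismatches))
```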
Verify that appropriate controls are in place to prevent or detect any data manipulation during the conversion process and that they are operating effectively.
BAI03.05, BAI07.02
11
Verify that the Project Team has maintained documentation of process design, configuration, and customization.
BAI03.05, BAI07.02, BAI10.03
11
BAI07.02
11
BAI01.09, BAI02.01, BAI03.03, BAI03.05, BAI03.10, BAI10.03
11
- flow charts
- screenshots
- exhibits of code
- online and batch operating instructions
- system narratives
- configuration baselines

At the end of the system build phase, verify that the Project Team has created the User Manual. The manual may include:
- description of the system
- use of the system
- input data and parameters
- output data
- operating procedures
- error identification and resolution
- user responsibilities related to security, privacy and internal controls

At the end of the system build phase, verify that the Project Team has created the Operations and Maintenance Manual. This manual may include:
- description of software
- instructions to operate software
- technical flow charts
- exhibits of code
- technical specifications
- security specifications
- description of internal controls
- description of non-routine procedures and security requirements
- procedures for error resolution
- maintenance procedures
- configuration baselines

Determine if any change orders have been approved. If so, verify if the project budget cost, labor hours and timeline have been updated. Determine if there is any risk due to scope creep.
BAI01.11, BAI03.09
Verify that any milestone(s) achieved during this phase have been reviewed and approved by the Project Sponsor.
BAI01.08, BAI01.11
Verify that the Project Lead has reviewed the Project Plan to ensure that the project is on target with budgets, milestones and timeline. Verify that Project Lead has reassessed the project risks for the activities in this phase. Verify the Project Lead has updated the Project Plan, if necessary.
BAI01.10, BAI01.11, BAI02.03
Review the project actual cost, labor hours and timeline in comparison with the budget. Determine if there are any risks that may impact the project in the testing phase (e.g. going over budget in the design and build phase may lead to decreasing hours dedicated to testing system).
BAI01.05
Prepare an audit memorandum of the results of this phase of testing and distribute to the Project Team and Project Sponsor.

Verify that a Test Plan has been created and includes the following:
BAI01.09, BAI03.06, BAI03.07, BAI07.01, BAI07.03
11
Verify that the Test Plan is in compliance with company policy and procedures.
BAI01.01, BAI03.07, BAI07.03
12
Verify that the Test Plan has been reviewed and approved by the Project Leader and Project Sponsor.
BAI07.03
Verify that there is a separate test environment from the development and production environment.
BAI03.07, BAI07.04
- testing methodology, including types of tests to be performed (e.g. functional, unit, integration, end-to-end, acceptance, performance, parallel / pilot, volume / stress, regression, quality assurance, penetration, scanning, fuzzing, testing for failures, security) - Testing procedures - Testing templates / scripts (purpose, procedure, conclusion, sign-off) - Testing documentation to be maintained, along with retention period - Reporting, tracking and remediating issues identified during testing - Acceptance and approval of test results - test location and preparation
12
Verify that the test environment simulates the production environment.
BAI03.07, BAI07.04
Verify that the Project Team has identified all employees to be used in the testing process. Verify that these employees:
BAI01.12, BAI03.08, BAI07.03
4
- have been provided training on how to use the system
- have been provided a copy of the Test Plan
- understand their roles and responsibilities regarding testing the system
- have the availability to perform the required test scripts and retest if necessary
- are from business areas in the company that will be using the system

Verify that test scripts have been created for all tests that are to be performed and have been mapped back to System Development Plan specifications.
BAI01.09, BAI03.06, BAI03.07, BAI07.03
11
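Mapping test scripts back to the System Development Plan specifications, as described above, can be verified with a simple coverage check when the requirements list and the test script log are available as extracts. A minimal sketch under assumed file names and columns (a shared req_id linking the two):

```python
# Minimal sketch: check that every System Development Plan requirement maps to at least one test script.
# requirements.csv and test_scripts.csv, and their columns, are hypothetical workpaper extracts.
import pandas as pd

reqs = pd.read_csv("requirements.csv")        # e.g. columns: req_id, description
scripts = pd.read_csv("test_scripts.csv")     # e.g. columns: script_id, req_id, status

covered = set(scripts["req_id"].dropna())
uncovered = reqs[~reqs["req_id"].isin(covered)]
print("Requirements without a mapped test script:")
print(uncovered)
```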
BAI03.06, BAI07.05
11
Verify that the test scripts created include negative testing / testing for failures in the process, i.e. that users cannot perform functions that are beyond their authorization or responsibilities. Verify that test scripts include testing of security and system controls.
11
Verify that the Project Team is tracking the performance and completion of all test scripts.
BAI03.08, BAI07.05
11
Verify that the Project Team is tracking all issues identified on a log where the issue is assigned to an owner for resolution. Verify that the remediated issue is retested with a satisfactory result.
BAI03.06, BAI03.08, BAI07.05
11
Verify that the Project Team is tracking testing documentation and ensuring it is being maintained for all test scripts.
BAI01.09, BAI03.08, BAI07.05
11
Select a sample of test scripts and observe the Testers performing the tests. Verify that the Testers are performing the tests in accordance with the Test Plan. Select a sample of test scripts and reperform. Compare the audit results to the Tester's results. Use a data analysis tool to identify any gaps in the security or internal control requirements.
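For the sample selection above, a short script can draw a random but reproducible sample of test scripts for observation and reperformance; the log file name, its columns, and the sample size of 25 are illustrative assumptions.

```python
# Minimal sketch: draw a reproducible random sample of test scripts for audit reperformance.
# test_script_log.csv and its columns are hypothetical workpaper inputs.
import pandas as pd

log = pd.read_csv("test_script_log.csv")                      # e.g. columns: script_id, area, tester, result
sample = log.sample(n=min(25, len(log)), random_state=42)     # fixed seed so the selection can be re-created
sample.to_csv("reperformance_sample.csv", index=False)        # retain the selection in the audit file
print(sample)
```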
Verify that the User Manual and / or Operations Manual have been updated for any changes that occurred during the testing phase to ensure complete and accurate system documentation.
BAI02.01, BAI03.10
Determine if any change orders have been approved. If so, verify if the project budget cost, labor hours and timeline have been updated. Determine if there is any risk due to scope creep.
BAI01.11, BAI03.09
Verify that any milestone(s) achieved during this phase have been reviewed and approved by the Project Sponsor.
BAI01.08, BAI01.11
Verify that the Project Lead has reviewed the Project Plan to ensure that the project is on target with budgets, milestones and timeline. Verify that Project Lead has reassessed the project risks for the activities in this phase. Verify the Project Lead has updated the Project Plan, if necessary.
BAI01.10, BAI01.11
Review the project actual cost, labor hours and timeline in comparison with the budget. Determine if there are any risks that may impact the project in the go-live phase.
BAI01.05
11, 12
Prepare an audit memorandum of the results of this phase of testing and distribute to the Project Team and Project Sponsor.
Verify that an Implementation Plan has been created and includes:
BAI01.09, BAI07.01, BAI07.06
11
BAI01.01
12
- implementation schedule
- development of production environment
- testing of production environment
- securing production environment
- data conversion
- data back-up
- contingency / fallback plan
- approvals to go live
- resolution of any issues identified prior to go-live
- acceptance of any unresolved issues identified
- tracking go-live tasks (e.g. checklist)
- go / no-go criteria

Verify that the Implementation Plan is in compliance with company policy and procedures.
Verify that the Implementation Plan has been reviewed and approved by the Project Lead, Project Sponsor, and System Implementor.
BAI07.01
11
Evaluate the implementation schedule and determine if it is reasonable and achievable. Verify that the data in the current system is backed up prior to converting data to the new system.
BAI07.02
Verify that data converted to the production system is complete, accurate, and has integrity.
BAI01.09, BAI07.02
11
Verify that appropriate controls are in place to prevent or detect any data manipulation during the conversion process and that they are operating effectively.
BAI01.09, BAI07.02
11
Verify that the Project Team addresses any errors or omissions identified as part of testing the data conversion prior to going live with the system.
BAI01.09, BAI07.02
11
Verify that all test scripts have been completed and any issues identified during the testing phase have been resolved prior to the system going live.
BAI01.09
11
Verify that tests scripts performed on the production environment have been completed and any issues identified have been resolved prior to the system going live.
BAI01.09
11
Verify that unresolved issues have been identified by the Project Lead and are being tracked.
BAI07.05
Verify that any unresolved issues that will not be addressed prior to go live will not have a significant impact on the production system.
BAI07.05
- batch and control totals - check sums / digits - range checks - date and time stamps - user reconciliations / data validation - use a data analysis tool to compare a sample of data from the old system and the new system - verify a test sample of data to source documentation
Verify that unresolved issues have been reviewed and approved by the Project Sponsor and Project Steering Committee prior to going live.
BAI07.05
Verify that the production environment has the appropriate security controls to prevent access to the system by administrators or the system implementors once the system is live.
BAI01.09
11
Verify that the Security group has reviewed the security specifications of the system and has approved it to go-live.
BAI01.09
11
Verify that the system owner has reviewed and approved the access rights of end users and assignment of user groups.
BAI01.09
11
Verify that the Project Lead has communicated the results of the system build and testing phases to the Project Steering Committee, along with any issues that are expected to be unresolved by the go-live date.
11
Verify that the Project Steering Committee has approved the system to go live.
3
Verify that all tasks on the go-live checklist have been signed-off on prior to going live.
BAI01.09
Verify that any milestone(s) achieved during this phase have been reviewed and approved by the Project Sponsor.
BAI01.08, BAI01.11
Verify that the Project Lead has reviewed the Project Plan to ensure that the project is on target with budgets, milestones and timeline. Verify that Project Lead has reassessed the project risks for the activities in this phase. Verify the Project Lead has updated the Project Plan, if necessary.
BAI01.10, BAI01.11
Review the project actual cost, labor hours and timeline in comparison with the budget. Determine if there are any risks that may impact the project and consider discussing with the Project Steering Committee.
BAI01.05
Prepare an audit memorandum of the results of this phase of testing and distribute to the Project Team and Project Sponsor.
Verify that the Project Team has developed a training program based on the User Manual and Operations Manual.
BAI08.04
4
Verify that end users, super users, and technical support personnel are properly trained on the new system. Review training schedule and attendance sheets to determine that users attended the training.
BAI08.04
4, 14
Verify that surveys were provided to users regarding feedback on the training materials. Verify that comments are incorporated into the training program and / or User Manual.
14
Prepare an audit memorandum of the results of this phase of testing and distribute to the Project Team and Project Sponsor.

Verify that management has committed the appropriate additional resources to support the system and respond to end users' needs post go-live for a predetermined amount of time (e.g. 3 months).
BAI03.10, BAI07.07
Verify that in-house support personnel have stated service level agreement metrics to meet the needs of the end users timely.
BAI01.06, BAI03.11
Determine if the level of support is meeting its SLA metrics.
BAI01.06, BAI03.10, BAI03.11
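Testing SLA attainment, as described above, can be done directly from a help desk ticket export. A minimal sketch; the export file, its column names, and the 8-hour resolution target are illustrative assumptions to be replaced with the SLA metrics actually agreed for the system.

```python
# Minimal sketch: measure help desk SLA attainment from a ticket export.
# ticket_export.csv, its columns, and the 8-hour target are illustrative assumptions.
import pandas as pd

SLA_HOURS = 8
tickets = pd.read_csv("ticket_export.csv", parse_dates=["opened", "resolved"])
tickets["hours_to_resolve"] = (tickets["resolved"] - tickets["opened"]).dt.total_seconds() / 3600
breaches = tickets[tickets["hours_to_resolve"] > SLA_HOURS]
print(f"{len(breaches)} of {len(tickets)} tickets exceeded the {SLA_HOURS}-hour resolution SLA")
```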
Determine if management will be relying on vendor support for the system. If so, obtain and review support contract for terms and agreement, confidentiality, and access rights. Determine level of support during an incident / disaster.
BAI03.11
Verify that all support personnel have received training on the Operations Manual.
BAI02.01
Verify that any changes made to the system by support personnel follow the company's change management policy and procedures.
BAI03.10, BAI06
11
Verify that there is a process in place to update the Operations Manual based on changes made.
BAI03.10
11
Verify that the system is included in the patch management policy and procedures.
BAI03.10
11, 12
13
Verify that the hardware and software associated with this implementation project have been included in the company's inventory listing of IT assets.
BAI03.04, BAI09.01
Verify that the Project Lead has performed a post implementation assessment. The assessment should include:
BAI01.05, BAI01.06, BAI01.11, BAI01.13, BAI07.08
- determination if project objectives were achieved - assessment on cost benefit analysis presented in business case - assessment of project budgets (cost, labor hours, timeline) in comparison with actual results - project metrics, KPIs - feedback from end users on acceptance of system - identifying lessons learned
Evaluate the lessons learned identified by the Project Lead and determine if they address the findings noted in the audit memorandums issued.
BAI01.13, BAI07.08
For any unresolved issues, verify that they have been assigned an owner and estimated completion date. Verify that unresolved issues are tracked and closed out timely.
BAI01.13, BAI07.08
Verify that the Project Lead presented the post implementation assessment results to the Project Steering Committee.
BAI01.06, BAI01.11
Verify that project documentation is properly secured and retained according to retention policy and procedures. Verify that the Project Lead has obtained approval from the Project Steering Committee to close the project.
BAI01.14
Verify that the ITGC and business process internal control documentation (e.g. narrative, flow charts, matrices) have been created or modified to accommodate the new system.
Verify if policies, procedures, and internal controls have been revised based on the project's lessons learned.
Test the internal controls to verify that they are operating effectively.
a. Test the ITGC controls.
b. Test the Business Process controls.
11, 13, 14
11, 13, 16, 17
BAI01.14
11
11, 12, 17
11
Verify that the new system has been added to the Disaster Recovery Procedures Manual. (Note: based on the criticality of the system, the company may decide not to include it in the DRP. In this situation, assess if the system owner needs to consider a business continuity plan).
BAI09.02
11
20 Critical Security Controls
CSC 17-3
CSC 3-1, 3-3, 3-4, 12-1, 12-3, 12-4
CSC 3-7, 12-6, 12-8, 12-9, 17-1, 19-4
CSC 6-6
CSC 6-3
CSC 6-1
CSC 1 & 2
Audit Procedures Section A - Governance A1 Obtain and examine policy, procedures and templates. Verify that they address the following: - Business Case Analysis - Project risk assessment - Roles and responsibilities - System documentation - System specification - User specification - Security specification - System development plan - Change requests - Developing internal controls - Project issue procedures - Data conversion plan - Test plan - Pre Go-live plan - Training - Organizational change management plan - Project monitoring & status updates - Post implementation project review
A2 Examine training records and verify that employees on the project team have been trained on company IT project management procedures. A3 Verify that a Project Steering Committee exists, as evidenced by the committee charter. A4 A member of the Audit Team should attend the Project Steering Committee meetings. A5 Obtain and examine Project Steering Committee meeting minutes to verify that committee is reviewing project status, achievement of project milestones, monitoring budget-to-actual costs, and results of project measurements (i.e. KPIs).
Testing Comments / Conclusions
W/P Ref
Findings
Preparer Sign-off
Reviewer Sign-off
Audit Procedures
A6 Verify that the Project Team Lead is submitting status reports on a periodic basis and any other required documentation to the Project Steering Committee throughout the lifecycle of the project. Status reports should contain budget-to-actual comparison & variation analysis, monitoring of milestones & KPIs, description of achievements and any issues affecting the progress of the project.
A7 Verify that the Project Steering Committee has reviewed the post implementation project results report and has developed an Action Plan to address any actionable lessons learned.
A8 A member of the Audit Team should attend the Project Team status meetings.
A9 Examine the Project Team's status meeting minutes and verify that the team discusses tasks completed / to be completed and issues identified / assigned / resolved.
A10 Verify that an organizational change communication plan has been developed and that it addresses the following:
- Assessing company's readiness to accept change
- Educating end users on the reason and timing behind the change
- Roles and responsibilities of organizational change management team
- Vision and strategy for change
- Communication of vision and strategy to end users
- Remove barriers / silos that inhibit end user acceptance
- Short-term and long-term goals identified and monitored
- Identify training needs
- Other communication activities (newsletter, posters, intranet site, etc.)
- Continuous feedback
A11 Verify that an external communication plan has been created to provide information to customers and / or business partners regarding the implementation of the system (if system will be used by these parties). A12 Verify that the Communication Plan has been approved by the Project Sponsor. A13 Verify that the communication plan has been implemented. A14 Interview a sample of employees to determine the effectiveness of the communication plan and end user acceptance of system. Section B - Pre-Implementation: Business Case and Project Planning B1 Verify that a Business Case Report has been developed and approved by the appropriate level of management and governance committee(s) (e.g. IT Steering Committee, Capital Assets Committee, etc.). B2 Verify that the Business Case Report includes: - Current state of business process, identifying any control weaknesses - Expected future state of business process (consider future growth) - Addresses corporate / department strategic goals - Description of the application systems reviewed (e.g. proof of concepts, demos) - Reason behind system recommended to be implemented (e.g. feasibility study) - Cost benefit analysis (dollar & labor cost / benefits, other benefits) - Potential risks of the project and significance of risks - Potential impact to critical systems - Regulatory concerns / approvals - End User feedback
Audit Procedures B3 Verify that a Request for Proposal was prepared and sent to selected vendors. B4 Verify that Vendor proposals were reviewed for: - Reputation and experience of vendor - Experience of vendor personnel - Proposal content met scope of the project - Rates for time and expense - Ability to respond to system vulnerabilities and provide patches to customers timely
B5 Verify that the vendor contract addresses the following:
- Vendor expectations
- Deliverables
- System requirements
- Warranties and liabilities
- Rates for time and expense (payment terms based on achievement of milestones)
- Change request process
- Confidentiality / non-disclosure
- Terms and conditions
- Project timelines and milestones
- Insurance requirements
- Adherence to company policies and procedures in developing the system
- Terms for restoring back to the current system
B6 Verify that the vendor contract has been approved by the appropriate level of management.
B7 Verify that a Project Plan has been created and includes:
- Project objectives
- Project scope
- Project risk assessment
- Identification of stakeholders
- Project Sponsor
- Team members
- Roles and responsibilities
- Budgets and timelines
- Project milestones and KPIs
- Communication plan
- Deliverables
- Change-in-scope procedures
B8 Verify that the Project Plan has been approved by the Project Team Lead and Project Sponsor.
B9 Verify that a project kick-off meeting has been held to review the Project Plan with team members by obtaining the meeting minutes.
B10 Assess project timelines and determine if the timeline is reasonably achievable.
B11 Assess project personnel resources for:
- Availability
- Cross-functional representation of all departments impacted by the system
- Experience
B12 Review prior project lessons learned and determine if they have been properly incorporated into the Project Plan.
B13 Verify that the Project Plan is in compliance with company procedures.
B14 Verify that employees involved in the design and build of the application system have been properly trained to configure / customize the system and are able to use the system when performing tests.
B15 Prepare an audit memorandum of the results of this phase of testing and distribute to the Project Team and Project Sponsor.
Section C - Pre-Implementation: System Development -- Design & Build
C1 Verify that project design / blueprint meetings have been held to develop the System Development Plan and Data Conversion Plan by obtaining the meeting minutes.
C2 Verify that the appropriate employees are participating in the project design meetings:
- Project team
- System implementors
- Subject matter experts
- Super users
- End users
- Network administrators
- System administrators
- Security administrators
C3 Verify that a System Development Plan has been created and includes:
- System documentation
- System specification
- User specification
- Functional requirements
- Reporting requirements
- Customization
- Security and internal controls requirements
- Interfaces with other systems (consider impact on interoperability)
- Process and data flowcharts
- Data storage
- Issue identification and resolution
- Constraints
- Backout / Contingency Plan
C4 Verify that security and internal control requirements consider the following (see the sketch after this block):
- access rights based on least privilege
- segregation of duties
- system authorizations
- edit checks
- audit logs
- input checks
- matching checks
- sequence checks
- duplication checks
- output controls
- exception reporting
C5 Verify that the System Development Plan has been approved by the Project Team Lead, Project Sponsor, and System Implementor.
C6 Verify that a Data Conversion Plan has been created and includes:
- Identification of data to be transferred / converted
- Data cleansing procedures
- Error tolerances
- Data mapping
- Data extraction
- Data transfer
- Data validation test plans
- Issue identification and resolution
- Conversion timeline
- Conversion tasks included in the go-live checklist
- Required approvals
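Several of the controls listed in C4 (input, range, sequence and duplication checks, exception reporting) can be re-performed by the auditor against a data extract. The following Python sketch is illustrative only; the record layout, field names and tolerances are assumptions rather than the audited system's actual schema.

# Illustrative re-performance of basic edit / input checks over a
# hypothetical transaction extract.
records = [
    {"id": 1001, "amount": 250.00, "date": "2015-03-02", "approver": "jdoe"},
    {"id": 1002, "amount": -75.00, "date": "2015-03-02", "approver": ""},
    {"id": 1002, "amount": 80.00, "date": "2015-03-03", "approver": "asmith"},
]

issues = []
seen_ids = set()
prev_id = None
for rec in records:
    if rec["id"] in seen_ids:                         # duplication check
        issues.append((rec["id"], "duplicate record id"))
    seen_ids.add(rec["id"])
    if prev_id is not None and rec["id"] <= prev_id:  # sequence check
        issues.append((rec["id"], "out-of-sequence id"))
    prev_id = rec["id"]
    if not (0 < rec["amount"] <= 10_000):             # range / edit check (assumed limit)
        issues.append((rec["id"], "amount outside allowed range"))
    if not rec["approver"]:                           # input (mandatory field) check
        issues.append((rec["id"], "missing approver"))

for rec_id, problem in issues:                        # simple exception report
    print(f"Exception - record {rec_id}: {problem}")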
C7 Verify that the Project Team has developed the data map. Determine if the data map is in sufficient detail to assist IT in converting the data and testers in testing the system (see the sketch after this block). The data map should include:
- flow chart of data movement
- identification of common data elements
- identification of field mapping between the old system and the new system
- file format and layout for import: field length, format, name, values, etc.
- translation of data values
- identification of confidential / key data
C8 Assess whether the data to be converted is confidential and whether appropriate security measures have been implemented to protect that data where it resides (e.g. dev / test / prod environments).
C9 Verify that the Data Conversion Plan has been approved by the Project Team Lead, Project Sponsor, and System Implementor.
C10 Verify that the System Development Plan and Data Conversion Plan are in compliance with company procedures.
C11 Verify that the System Development Plan and Data Conversion Plan have been discussed with the applicable employees involved in implementing these plans.
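To evaluate whether the data map in C7 is sufficiently detailed, and to flag the confidential fields referenced in C8, the auditor can restate the map in a machine-readable form and exercise it against sample records. A hypothetical sketch; the field names, lengths and value translations are placeholders, not the project's actual data map.

# Hypothetical field map between the legacy and new systems (C7).
# Real field names, lengths and translations would come from the
# Project Team's data map; these are placeholders.
FIELD_MAP = {
    # old field      (new field,       max len, confidential, value translation)
    "CUST_NO":       ("customer_id",   10,      False, None),
    "CUST_NM":       ("customer_name", 60,      False, None),
    "SSN":           ("tax_id",        11,      True,  None),
    "STAT_CD":       ("status",        8,       False, {"A": "ACTIVE", "I": "INACTIVE"}),
}

def convert_row(old_row):
    """Apply the field map to one legacy record; raise on over-length values."""
    new_row = {}
    for old_field, (new_field, max_len, confidential, xlate) in FIELD_MAP.items():
        value = old_row.get(old_field, "")
        if xlate:
            value = xlate.get(value, value)  # translate coded values
        if len(str(value)) > max_len:
            raise ValueError(f"{old_field}: value exceeds {max_len} characters")
        new_row[new_field] = value
    return new_row

# Fields flagged as confidential should be protected wherever the data resides (C8).
confidential_fields = [old for old, spec in FIELD_MAP.items() if spec[2]]
print("Fields needing protection in dev / test / prod:", confidential_fields)
print(convert_row({"CUST_NO": "12345", "CUST_NM": "Acme Co",
                   "SSN": "123-45-6789", "STAT_CD": "A"}))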
C12 Verify that any servers and operating systems pertaining to the new system have been configured according to the company's configuration management procedures (see the sketch after this block). Examples are:
- default and unnecessary accounts / services are disabled, if possible
- the local admin account is disabled
- default passwords are changed and made complex for admin accounts, applications / operating systems, and any other new networked device
- admin privileges are limited to those who have a business need to modify the configuration
- logging is enabled
C13 Verify that any servers and operating systems pertaining to the new system have been secured according to the company's security procedures. Examples are:
- anti-virus / malware protection on the server
- password management enabled (log-on attempts, password change timeframe, password history)
- admins have different passwords for admin accounts and non-admin accounts
- LM hashes are disabled
- encryption
- network segmentation
- firewall enabled
- remote administration of servers over secure channels
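Where a hardening standard exists, the baseline comparisons behind C12 and C13 can be partially automated. A minimal sketch, assuming server settings have already been exported into a dictionary (e.g. parsed from a configuration report); the setting names and expected values are illustrative and should be replaced with the company's actual configuration standard.

# Compare exported server settings against an assumed hardening baseline.
BASELINE = {
    "guest_account_enabled": False,
    "local_admin_enabled": False,
    "default_passwords_changed": True,
    "min_password_length": 12,
    "logging_enabled": True,
    "antivirus_installed": True,
    "firewall_enabled": True,
}

def check_server(name, actual):
    """Return a list of baseline deviations for one server's exported settings."""
    findings = []
    for setting, expected in BASELINE.items():
        value = actual.get(setting)  # settings missing from the export are flagged
        if isinstance(expected, bool):
            ok = value is expected
        else:
            ok = value is not None and value >= expected  # numeric minimums
        if not ok:
            findings.append(f"{name}: {setting} = {value!r}, expected {expected!r}")
    return findings

# 'actual' would normally be parsed from a configuration export of each server.
for finding in check_server("app-server-01", {"guest_account_enabled": True,
                                              "min_password_length": 8,
                                              "logging_enabled": True}):
    print(finding)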
C14 Verify that changes made to current systems (setting up interfaces, extracting data, importing data) follow the company's change management procedures:
- changes are documented
- changes are tested
- changes are approved by the business and IT prior to migration into the production environment
- quality assurance review
C15 Verify that data cleansing has been performed by determining if the Project Team verified that (see the sketch after this block):
- all mandatory fields are populated
- all records are present
- default or dummy values cannot be inserted where data is missing
- data is complete
- there is no duplication of data fields
C16 For data that has not been cleansed, determine the potential risks and impacts to the project. Determine if error tolerances have been evaluated against the approved thresholds stated in the Data Conversion Plan.
C17 Verify that the Project Team has verified the accuracy, integrity and completeness of the data conversion to the test system by reviewing test documentation.
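The cleansing checks in C15 and the tolerance comparison in C16 lend themselves to a data analysis tool. A sketch using pandas on a hypothetical extract; the column names, dummy-value list and 2% tolerance are assumptions, not values from the Data Conversion Plan.

# Re-perform basic cleansing checks and compare the error rate to a tolerance.
import pandas as pd

extract = pd.DataFrame({
    "customer_id": [1, 2, 2, 4, 5],
    "name": ["Acme", "Beta", "Beta", None, "Epsilon"],
    "status": ["A", "I", "I", "A", "UNKNOWN"],   # "UNKNOWN" used here as a dummy value
})

mandatory = ["customer_id", "name", "status"]
missing_mandatory = extract[mandatory].isna().any(axis=1)          # mandatory fields populated
duplicates = extract.duplicated(subset=["customer_id"], keep="first")  # duplicate keys
dummy_values = extract["status"].isin(["UNKNOWN", "N/A", "TBD"])    # dummy / default values

error_rows = missing_mandatory | duplicates | dummy_values
error_rate = error_rows.mean()
TOLERANCE = 0.02  # assumed threshold; use the Data Conversion Plan's approved value

print(f"Records with exceptions: {int(error_rows.sum())} of {len(extract)}")
print(f"Error rate {error_rate:.1%} vs tolerance {TOLERANCE:.0%}"
      f" -> {'within' if error_rate <= TOLERANCE else 'exceeds'} tolerance")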
C18 Verify that data converted to the test system is complete, accurate, and has integrity (see the sketch after this block):
- batch and control totals
- check sums / digits
- range checks
- date and time stamps
- use a data analysis tool to compare a sample of data from the old system and the new system
- verify a test sample of data to source documentation
C19 Verify that the Project Team addresses any errors or omissions identified as part of testing the data conversion.
C20 Verify that appropriate controls are in place to prevent or detect any data manipulation during the conversion process and that they are operating effectively.
C21 Verify that the Project Team has maintained documentation of process design, configuration, and customization:
- flow charts
- screenshots
- exhibits of code
- online and batch operating instructions
- system narratives
- configuration baselines
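The record counts, control totals and keyed comparisons behind C18 can be re-performed with a data analysis tool. A sketch using pandas with hypothetical table layouts:

# Old-vs-new reconciliation for the test-system conversion.
import pandas as pd

old = pd.DataFrame({"acct": [10, 11, 12], "balance": [100.00, 250.50, 75.25]})
new = pd.DataFrame({"acct": [10, 11, 12], "balance": [100.00, 250.50, 75.20]})

# Batch / control totals
print("Record counts match:", len(old) == len(new))
print("Control totals:", old["balance"].sum(), "vs", new["balance"].sum())

# Row-level comparison on the key field
merged = old.merge(new, on="acct", how="outer", suffixes=("_old", "_new"),
                   indicator=True)
unmatched = merged[merged["_merge"] != "both"]
differences = merged[(merged["_merge"] == "both") &
                     (merged["balance_old"] != merged["balance_new"])]

print("Accounts missing from either system:")
print(unmatched)
print("Accounts with differing balances:")
print(differences[["acct", "balance_old", "balance_new"]])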
C22 At the end of the system build phase, verify that the Project Team has created the User Manual. The manual may include:
- description of the system
- use of the system
- input data and parameters
- output data
- operating procedures
- error identification and resolution
- user responsibilities related to security, privacy and internal controls
C23 At the end of the system build phase, verify that the Project Team has created the Operations and Maintenance Manual. This manual may include:
- description of software
- instructions to operate software
- technical flow charts
- exhibits of code
- technical specifications
- security specifications
- description of internal controls
- description of non-routine procedures and security requirements
- procedures for error resolution
- maintenance procedures
- configuration baselines
C24 Determine if any change orders have been approved. If so, verify that the project budget cost, labor hours and timeline have been updated. Determine if there is any risk due to scope creep.
C25 Verify that any milestone(s) achieved during this phase have been reviewed and approved by the Project Sponsor.
C26 Verify that the Project Lead has reviewed the Project Plan to ensure that the project is on target with budgets, milestones and timeline. Verify that the Project Lead has reassessed the project risks for the activities in this phase. Verify that the Project Lead has updated the Project Plan, if necessary.
C27 Review the project's actual cost, labor hours and timeline in comparison with the budget. Determine if there are any risks that may impact the project in the testing phase (e.g. going over budget in the design and build phase may lead to decreasing the hours dedicated to testing the system).
C28 Prepare an audit memorandum of the results of this phase of testing and distribute to the Project Team and Project Sponsor.
Section D - Pre-Implementation: Test
D1 Verify that a Test Plan has been created and includes the following:
- testing methodology, including the types of tests to be performed (e.g. functional, unit, integration, end-to-end, acceptance, performance, parallel / pilot, volume / stress, regression, quality assurance, penetration, scanning, fuzzing, testing for failures, security)
- testing procedures
- testing templates / scripts (purpose, procedure, conclusion, sign-off)
- testing documentation to be maintained, along with the retention period
- reporting, tracking and remediating issues identified during testing
- acceptance and approval of test results
- test location and preparation
D2 Verify that the Test Plan is in compliance with company policy and procedures.
D3 Verify that the Test Plan has been reviewed and approved by the Project Lead and Project Sponsor.
D4 Verify that there is a test environment separate from the development and production environments.
D5 Verify that the test environment simulates the production environment.
D6 Verify that the Project Team has identified all employees to be used in the testing process. Verify that these employees:
- have been provided training on how to use the system
- have been provided a copy of the Test Plan
- understand their roles and responsibilities regarding testing the system
- have the availability to perform the required test scripts and retest if necessary
- are from the business areas in the company that will be using the system
D7 Verify that test scripts have been created for all tests that are to be performed and have been mapped back to the System Development Plan specifications.
D8 Verify that the test scripts include testing for failures in the process, i.e. negative testing that users cannot perform functions that are beyond their authorization or responsibilities (see the sketch after this block).
D9 Verify that test scripts include testing of security and system controls.
D10 Verify that the Project Team is tracking the performance and completion of all test scripts.
D11 Verify that the Project Team is tracking all issues identified on a log where each issue is assigned to an owner for resolution. Verify that remediated issues are retested with a satisfactory result.
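Negative testing (D8) is easiest to evidence when each test script is expressed as a pass/fail check. A hypothetical Python sketch; post_journal_entry and the role names stand in for whatever functions and roles the system under test actually exposes.

# Hypothetical negative test: a user without the posting role must be rejected.
class AuthorizationError(Exception):
    pass

def post_journal_entry(user_roles, entry):
    """Stand-in for the system function being tested."""
    if "GL_POSTER" not in user_roles:
        raise AuthorizationError("user lacks posting authority")
    return {"status": "posted", **entry}

def test_clerk_cannot_post_journal_entry():
    """Negative test: the attempt must be rejected, not silently accepted."""
    try:
        post_journal_entry(user_roles={"AP_CLERK"}, entry={"amount": 500})
    except AuthorizationError:
        return "PASS"
    return "FAIL - unauthorized posting was allowed"

print("D8 negative test:", test_clerk_cannot_post_journal_entry())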
D12 Verify that the Project Team is tracking testing documentation and ensuring it is maintained for all test scripts.
D13 Select a sample of test scripts and observe the Testers performing the tests. Verify that the Testers are performing the tests in accordance with the Test Plan.
D14 Select a sample of test scripts and reperform them. Compare the audit results to the Testers' results.
D15 Use a data analysis tool to identify any gaps in the security or internal control requirements.
D16 Verify that the User Manual and / or Operations Manual have been updated for any changes that occurred during the testing phase to ensure complete and accurate system documentation.
D17 Determine if any change orders have been approved. If so, verify that the project budget cost, labor hours and timeline have been updated. Determine if there is any risk due to scope creep.
D18 Verify that any milestone(s) achieved during this phase have been reviewed and approved by the Project Sponsor.
D19 Verify that the Project Lead has reviewed the Project Plan to ensure that the project is on target with budgets, milestones and timeline. Verify that the Project Lead has reassessed the project risks for the activities in this phase. Verify that the Project Lead has updated the Project Plan, if necessary.
D20 Review the project's actual cost, labor hours and timeline in comparison with the budget. Determine if there are any risks that may impact the project in the go-live phase.
D21 Prepare an audit memorandum of the results of this phase of testing and distribute to the Project Team and Project Sponsor.
Section E - Pre-Implementation: System Pre Go-live & Data Conversion
E1 Verify that an Implementation Plan has been created and includes:
- implementation schedule
- development of production environment
- testing of production environment
- securing production environment
- data conversion
- data back-up
- contingency / fallback plan
- approvals to go live
- resolution of any issues identified prior to go-live
- acceptance of any unresolved issues identified
- tracking go-live tasks (e.g. checklist)
- go / no-go criteria
E2 Verify that the Implementation Plan is in compliance with company policy and procedures.
E3 Verify that the Implementation Plan has been reviewed and approved by the Project Lead, Project Sponsor, and System Implementor.
E4 Evaluate the implementation schedule and determine if it is reasonable and achievable.
E5 Verify that the data in the current system is backed up prior to converting data to the new system.
E6 Verify that data converted to the production system is complete, accurate, and has integrity:
- batch and control totals
- check sums / digits
- range checks
- date and time stamps
- user reconciliations / data validation
- use a data analysis tool to compare a sample of data from the old system and the new system
- verify a test sample of data to source documentation
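For the production conversion in E6, a row-level checksum comparison complements control totals because it catches offsetting differences. A sketch assuming pipe-delimited extracts are available from both systems; the file contents here are inline placeholders.

# Row-level checksum comparison between old- and new-system extracts.
import csv
import hashlib
import io

def row_hashes(extract_text):
    """Return a SHA-256 hash per row, normalising surrounding whitespace."""
    hashes = set()
    for row in csv.reader(io.StringIO(extract_text), delimiter="|"):
        normalised = "|".join(field.strip() for field in row)
        hashes.add(hashlib.sha256(normalised.encode("utf-8")).hexdigest())
    return hashes

# In practice these would be read from the two systems' export files.
old_extract = "10|Acme|100.00\n11|Beta|250.50\n"
new_extract = "10|Acme|100.00\n11|Beta|250.05\n"

only_old = row_hashes(old_extract) - row_hashes(new_extract)
only_new = row_hashes(new_extract) - row_hashes(old_extract)
print(f"Rows present only in the old extract: {len(only_old)}")
print(f"Rows present only in the new extract: {len(only_new)}")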
E7 Verify that appropriate controls are in place to prevent or detect any data manipulation during the conversion process and that they are operating effectively.
E8 Verify that the Project Team addresses any errors or omissions identified as part of testing the data conversion prior to going live with the system.
E9 Verify that all test scripts have been completed and any issues identified during the testing phase have been resolved prior to the system going live.
E10 Verify that test scripts performed on the production environment have been completed and any issues identified have been resolved prior to the system going live.
E11 Verify that unresolved issues have been identified by the Project Lead and are being tracked.
E12 Verify that any unresolved issues that will not be addressed prior to go-live will not have a significant impact on the production system.
E13 Verify that unresolved issues have been reviewed and approved by the Project Sponsor and Project Steering Committee prior to going live.
E14 Verify that the production environment has the appropriate security controls to prevent access to the system by administrators or the system implementors once the system is live.
E15 Verify that the Security group has reviewed the security specifications of the system and has approved it to go live.
E16 Verify that the system owner has reviewed and approved the access rights of end users and the assignment of user groups.
E17 Verify that the Project Lead has communicated the results of the system build and testing phases to the Project Steering Committee, along with any issues that are expected to be unresolved by the go-live date.
E18 Verify that the Project Steering Committee has approved the system to go live.
E19 Verify that all tasks on the go-live checklist have been signed off prior to going live.
E20 Verify that any milestone(s) achieved during this phase have been reviewed and approved by the Project Sponsor.
E21 Verify that the Project Lead has reviewed the Project Plan to ensure that the project is on target with budgets, milestones and timeline. Verify that the Project Lead has reassessed the project risks for the activities in this phase. Verify that the Project Lead has updated the Project Plan, if necessary.
E22 Review the project's actual cost, labor hours and timeline in comparison with the budget. Determine if there are any risks that may impact the project and consider discussing them with the Project Steering Committee.
E23 Prepare an audit memorandum of the results of this phase of testing and distribute to the Project Team and Project Sponsor.
Section F - Pre-Implementation: Training
F1 Verify that the Project Team has developed a training program based on the User Manual and Operations Manual.
F2 Verify that end users, super users, and technical support personnel are properly trained on the new system.
F3 Review the training schedule and attendance sheets to determine that users attended the training.
F4 Verify that surveys were provided to users to gather feedback on the training materials. Verify that comments are incorporated into the training program and / or User Manual.
F5 Prepare an audit memorandum of the results of this phase of testing and distribute to the Project Team and Project Sponsor.
Section G - Post Implementation: Support & Maintenance
G1 Verify that management has committed appropriate additional resources to support the system and respond to end users' needs post go-live for a predetermined amount of time (e.g. 3 months).
G2 Verify that in-house support personnel have stated service level agreement (SLA) metrics to meet the needs of end users in a timely manner.
G3 Determine if the level of support is meeting its SLA metrics (see the sketch after this block).
G4 Determine if management will be relying on vendor support for the system. If so, obtain and review the support contract for terms and agreement, confidentiality, and access rights. Determine the level of support during an incident / disaster.
G5 Verify that all support personnel have received training on the Operations Manual.
G6 Verify that any changes made to the system by support personnel follow the company's change management policy and procedures.
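The SLA check in G3 can be re-performed from a ticket extract. A sketch with hypothetical tickets and an assumed 48-hour resolution target; the real target would come from the stated SLA metrics in G2.

# Check post go-live support tickets against an assumed SLA target.
from datetime import datetime

SLA_HOURS = 48  # assumed resolution target

tickets = [
    {"id": "T-101", "opened": "2015-04-01 09:00", "resolved": "2015-04-02 10:00"},
    {"id": "T-102", "opened": "2015-04-01 11:30", "resolved": "2015-04-06 09:00"},
]

breaches = []
for t in tickets:
    opened = datetime.strptime(t["opened"], "%Y-%m-%d %H:%M")
    resolved = datetime.strptime(t["resolved"], "%Y-%m-%d %H:%M")
    hours = (resolved - opened).total_seconds() / 3600
    if hours > SLA_HOURS:
        breaches.append((t["id"], round(hours, 1)))

met = len(tickets) - len(breaches)
print(f"SLA met: {met}/{len(tickets)} tickets ({met / len(tickets):.0%})")
print("Breaches (ticket, hours to resolve):", breaches)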
G7 Verify that there is a process in place to update the Operations Manual based on changes made.
G8 Verify that the system is included in the patch management policy and procedures.
G9 Verify that the hardware and software associated with this implementation project have been included in the company's inventory listing of IT assets.
Section H - Post Implementation: Review of Project Results & Close Out
H1 Verify that the Project Lead has performed a post implementation assessment. The assessment should include:
- determination of whether project objectives were achieved
- assessment of the cost-benefit analysis presented in the business case
- assessment of project budgets (cost, labor hours, timeline) in comparison with actual results
- project metrics / KPIs
- feedback from end users on acceptance of the system
- identification of lessons learned
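The budget-to-actual comparison in H1 is simple arithmetic that the auditor can re-perform. A sketch with placeholder figures; the actual budget and results would come from the Project Plan and the post implementation assessment.

# Re-compute budget-to-actual variances for the post implementation assessment.
budget = {"cost": 500_000, "labor_hours": 4_000, "duration_weeks": 26}
actual = {"cost": 565_000, "labor_hours": 4_600, "duration_weeks": 30}

for measure in budget:
    variance = actual[measure] - budget[measure]
    pct = variance / budget[measure]
    print(f"{measure}: budget {budget[measure]:,}, actual {actual[measure]:,}, "
          f"variance {variance:+,} ({pct:+.1%})")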
H2 Evaluate the lessons learned identified by the Project Lead and determine if they address the findings noted in the audit memorandums issued.
H3 For any unresolved issues, verify that they have been assigned an owner and an estimated completion date. Verify that unresolved issues are tracked and closed out in a timely manner.
H4 Verify that the Project Lead presented the post implementation assessment results to the Project Steering Committee.
H5 Verify that project documentation is properly secured and retained according to the retention policy and procedures.
H6 Verify that the Project Lead has obtained approval from the Project Steering Committee to close the project.
Section I - Post Implementation: Internal Controls Assessment
I1 Verify that the ITGC and business process internal control documentation (e.g. narratives, flow charts, matrices) have been created or modified to accommodate the new system.
I2 Verify that policies, procedures, and internal controls have been revised based on the project's lessons learned.
I3 Test the internal controls to verify that they are operating effectively.
a. Test the ITGC controls.
b. Test the Business Process controls.
I4 Verify that the new system has been added to the Disaster Recovery Procedures Manual. (Note: based on the criticality of the system, the company may decide not to include it in the DRP. In this situation, assess whether the system owner needs to consider a business continuity plan.)
Internal Audit Quality Assurance
In an effort to continuously improve the service we provide to you and the organization, please take a few moments to complete this short survey and return it promptly as indicated below. Thank you!
Audit Name: ________________________________
Rate each of the evaluation criteria below using the following scale: Excellent, Good, Fair, Poor, Not applicable / Don't Know.
Evaluation Criteria
Independence
- Objectivity of auditor team
Professional Proficiency
- Understanding the business & your department
- Technical proficiency of audit team
- Uses technology appropriately
- Professionalism of audit team
- Communication skills of audit team
- Interpersonal skills of audit team
- Works well with your team
- Helps you manage and implement change
Scope of Work
- Notification of the audit purpose and scope
- Audit focused on key areas & risks
- Department's concerns and perspective considered
Performance of Audit Work
- Duration of the audit
- Level of creativity
- Usefulness of the audit
- Disruption of activities was minimal
- Sharing of best practices
- Feedback of findings during the audit
- Timeliness of the audit report
- Clarity of the audit report
- Accuracy of the audit findings
- Value of the audit recommendations
- Provides workable solutions for audit recommendations
- Timely follow-up on corrective action
How has the quality of service you received changed from previous audits you have experienced? (Improved Significantly, Improved, Stayed the same, Declined, Declined significantly)
Was there anything about the audit you especially liked?
Was there anything about the audit you especially disliked?
Are there any recommendations for improvement that you would like us to consider?
Additional Comments:
Name: _____________________________________ Date: ______________________________________ Please return survey to: ________________________
Below is a list of resources that may be used during an SDLC audit:
ISACA
• COBIT 5 Enabling Processes
• COBIT 5 - Governance and Management Practices Activities (COBIT 5 Toolkit)
• COBIT 5 for Assurance
• Systems Development and Project Management Audit / Assurance Program (based on COBIT 4.1)
IIA
• GTAG 12: Auditing IT Projects
• GTAG 14: Auditing User-developed Applications
• GTAG 5: Managing and Auditing Privacy Risks
• GTAG 8: Auditing Application Controls
• GAIT for Business and IT Risk
• GAIT for IT General Control Deficiency Assessment
• GAIT Methodology
• Top 10 System Implementation Audit Considerations (by PwC)
COSO Internal Control -- Integrated Framework
National Institute of Standards and Technology (NIST)
• SP 800-53 rev. 4: Security and Privacy Controls for Federal Information Systems and Organizations
• SP 800-64 rev. 2: Security Considerations in the System Development Life Cycle
Twenty Critical Security Controls (maintained by the Council on Cyber Security)
Project Management Body of Knowledge (PMBOK, maintained by the Project Management Institute)
AuditNet (subscription based)
Protiviti's KnowledgeLeader (subscription based)
Tickmarks {a} {b} {c} {d} {e} {f} {g} {h} {i} {j} {k} {l} {m} {n} {o} {p} {q} {r} {s} {t} {u} {v} {w} {x}
{y} {z}