
Reauthorization of the Community Development Financial Institutions Fund

Testimony Before the Subcommittee on
Financial Institutions and Consumer Credit
Committee on Banking and Financial Services
U.S. House of Representatives
June 17, 1998
 
Statement by Richard B. Calahan
Deputy Inspector General
 
 
Madam Chairwoman and Distinguished Members of the Subcommittee:
 
Introduction

Madam Chairwoman, Members of the Subcommittee, we appreciate the opportunity to discuss with you today our products on the Community Development Financial Institutions (CDFI) Fund. I am joined by Mr. Dennis Schindel, Assistant Inspector General for Audit.
 
Background
 
CDFI was created by the Riegle Community Development and Regulatory Improvement Act of 1994. CDFI was originally established as a wholly owned government corporation and was intended to operate as an independent Federal agency. However, through emergency appropriations legislation in July 1995, CDFI was placed within the Department of the Treasury.
 
Initial Advice and Evaluation Efforts
 
The Office of Inspector General (OIG) was asked by Departmental officials to provide some advice to CDFI at the time the Fund was transferred to the Treasury Department in 1995. Initially, we were asked to review draft regulations to determine if audit requirements were built in that would ensure appropriate control over the program. Later in 1995, we provided advice on program procedures in the form of two technical assistance reports produced by our Office of Evaluations. These two reports, the first on CDFI Award Application Procedures (OIG-96-006) issued in February 1996, and the second on CDFI Award Monitoring Procedures (OIG-96-E15) issued in May 1996, provided information on procedures used by other grant and loan awarding organizations, including other Federal agencies.
 
The first report provided information to CDFI on procedures for accepting applications for assistance and awarding assistance. Our Evaluations group researched, collected, and analyzed procedural information for award administration from other Federal agencies. The information collected was intended to provide the CDFI staff with useful examples of procedures for managing the grant application process. We provided an in-depth narrative on the entire application process including: receiving applications, reviewing applications, and records maintenance and retention. We specifically provided CDFI with information on the application screening process, detailing the importance and benefits of implementing a numeric application scoring process. Our report pointed out that narrative justifications may be used instead of a numeric scoring system, but that these narratives "should be well documented to establish a trail of consistent and logical conclusions by the reviewers." We pointed out that the narratives should provide sufficient information to justify the award or to demonstrate why the applicant did not receive an award. In this report, we also indicated that specific and clear criteria should be developed for use in assessing the applications. The criteria should explicitly state which areas are to be assessed during the evaluation of applications and what standards or principles would be used to assess each of the areas.
 
In describing a review process, we identified five fundamental steps that should apply for all reviewers. We said the reviewers should:
 
- Possess thorough knowledge of the subject matter and the related industry;
- Receive sufficient training on the program;
- Certify that no conflict of interest exists (on the reviewer’s part) prior to reviewing any application;
- Independently review the applications prior to meeting with other reviewers; and
- Follow the criteria and evaluation process established for the program.
 
The second technical assistance report issued by the Office of Evaluations was intended to provide information to CDFI on award monitoring. This report focused on the importance of establishing performance requirements, payment schedules, reporting requirements (i.e., status reports, audited financial statements, records maintenance), program/organization reviews (i.e., desk reviews, site visits), and award closeout procedures.
 
We provided these reports to the CDFI Director and informally heard that the program was using the information. Through discussions about how the Fund would fit into the Treasury financial statements then being prepared for 1996, we subsequently became aware of unresolved issues within the Department regarding the provision of accounting and other support services to CDFI. Because of these issues, and others involving special initiatives in the Department, the OIG began a series of four case studies to determine how such initiatives could benefit from lessons learned. CDFI was one of the case studies.
 
CDFI Implementation Review
 
Purpose
 
We conducted this review using the four case studies to assess Treasury’s ability to support high priority projects and endeavors. The purpose of the review was to identify improvements the Department could institute to plan for and support high priority projects and special endeavors.
 
Findings
 
Our review of the implementation of CDFI found that "Treasury did not develop written implementation guidance and did not, in some respects, execute and oversee a well-planned strategy for implementing CDFI." In addition, we found that no formal written analysis of the implementation effort (identifying issues, mission and objectives, funding sources, etc.) was completed when CDFI was transferred to Treasury. We found that early on, there was significant confusion regarding the organizational placement of CDFI within the Department.
We also found that a needs assessment that would have identified the program requirements regarding staffing, accounting, budgeting, procurement, information technology, telecommunications, and legal services was not completed. Also, a consistent implementation team did not shepherd the implementation of the program, particularly after the appointment of the CDFI Director in October 1995.
Our report noted that the CDFI staff, including the Director, had little experience operating Federal grant and loan programs. Unfortunately, the Department also had little experience in the area of grants administration and was unable to provide CDFI with the needed processes and operating procedures. As a result, CDFI had difficulty obtaining support for such things as accounting services and budget formulation.
 
Suggestions
 
Our suggestions focused on designing and implementing detailed planning processes for high priority projects and special endeavors and on increasing oversight by the senior management group. To help senior management better support these types of projects and programs, we suggested developing a consistent and comprehensive process for planning special projects and endeavors, including a strategy and a needs assessment. We stressed the importance of a strong implementation team as well as strong and consistent oversight from senior management. We also identified lessons learned from these previous endeavors to help the Department better manage high priority projects in the future. It should be noted that our suggestions were directed to the Department and not to CDFI management. The Department generally agreed with our findings and suggestions.
 
Investigative Work
 
In April 1997, after receiving complaints about the application review and award process, staff members of the General Oversight and Investigations Subcommittee of the House Committee on Banking and Financial Services initiated a review of CDFI’s first round of awards. They discovered a lack of documentation and undated memoranda in four application files. When the Secretary was subsequently asked to provide an explanation for the undated memoranda, his office requested that we conduct an investigation into the matter. Our investigation, completed in August of 1997, confirmed that the memoranda in question had been prepared the night before the Congressional staff’s visit.
 
Recent Audit Work
 
After the investigation was completed, we initiated a previously planned audit of CDFI. Before starting the audit, we met with representatives from the General Accounting Office (GAO), which was also planning to conduct an audit of CDFI, and staff members from the U.S. Senate Appropriations Subcommittee on Veterans Affairs, Housing and Urban Development, and Independent Agencies. All parties agreed to divide audit responsibilities between GAO and Treasury OIG. GAO agreed to focus its review on the post-award monitoring procedures. The scope of our audit was limited to the application review and application selection processes only. Our review has confirmed that there was a significant lack of documentation to support conclusions and decisions made in the FY96 application review and award process.
The OIG is nearing completion of this audit of the CDFI Fiscal Year (FY) 1996 and 1997 application award processes. The audit field work is complete, and we are drafting the report. Upon completion, the draft report will be presented to CDFI senior management for comment.
 
Objective
 
The objective of our CDFI audit was to evaluate the application review and application selection processes of the CDFI and Bank Enterprise Award (BEA) Programs during Fiscal Years 1996 and 1997. Specifically, our audit focused on documenting the actual process used in those two fiscal years. We addressed whether adequate internal controls existed for documenting the procedures used in evaluating applications and the decisions to award funds, and whether a consistent process was followed for all applicants.
 
Methodology
 
We reviewed 61 of the 268 application files received by CDFI during FY96. We also reviewed five additional application files that allegedly had received preferential treatment. A limited review of FY97 application files was completed to determine whether CDFI was following the interim procedures instituted after the FY96 application cycle. We also reviewed a sample of BEA Program application files processed during FY96.
In addition to reviewing application files, we conducted interviews with the Director of CDFI, the Deputy Director, the Legal Counsel, the application reviewers, and other selected support staff.
 
Results
 
In 1995, CDFI had the difficult task of simultaneously creating an organizational structure while processing applications for assistance for both the CDFI and BEA Programs. In establishing the CDFI and BEA Programs, the Fund staff published interim regulations that outlined an application review process. After the Director and the Deputy Director assumed their duties in the fall of 1995, the Deputy Director was given primary responsibility for the CDFI Program. He decided to alter the application review and evaluation processes from those that had been recommended by the OIG and published in the Federal Register. He said that he revised the process because there was not enough time to do the three separate reviews specified in the interim regulations, and he wanted to improve the process. He used what he called the "Best Business practices" approach to reviewing the applications; however, he did not document these practices. The BEA Program, which had been assigned to the Senior Policy Officer, followed the process specified in the interim regulations.
In the CDFI application review and award process used in FY96, the Deputy Director was extensively involved in every aspect of application review, evaluation, and selection. He performed a cursory eligibility and completeness review of all the applications before he assigned them to reviewers. During the detailed reviews of applications performed by the initial substantive reviewers, which included another eligibility and completeness review, the Deputy Director monitored and supervised each review, communicating with the reviewers at least weekly. His stated intent was to question them on each application, make suggestions on aspects to consider, and challenge them on their assertions.
The reviewers each produced an evaluation memo that recommended that the application either be sent to an interview panel or not be given further consideration. The application files contained very little documentation of the work performed. We noted an absence of contact logs, correspondence, and analytical documents. The reviewers produced an application evaluation memo with a recommendation, but were not required to provide any supporting documentation. Accordingly, at the discretion of the reviewers, the files sometimes contained additional information provided by applicants, and sometimes not. In addition to the recommendation, the evaluation memos contained a narrative assessment of the quality of the application in the five categories covered in the interim regulations criteria.
During our interviews, the reviewers stated that they were in control of the application reviews. The Deputy Director supervised their work, but they made the recommendation decisions themselves. They stated that they evaluated the applications to a standard of quality and did not make selection recommendations based on a quota. We noted that there seemed to be varying degrees of "strictness" in the application of standards by different reviewers. However, within our sample, we found that each reviewer consistently reviewed his set of applications.
Part of the award process included face-to-face interviews. The interview panel consisted of five members: the Director, the Deputy Director, a staff member who had served as a reviewer, and two consultants (one of whom had been a reviewer). The panel reviewed the application files and evaluation memos, and held a discussion with the application reviewers to prepare for the interviews. The panel interviewed representatives from 59 applicant organizations. The applicant representatives were interviewed at the CDFI offices. Periodic look-back sessions were held by the panel to assess the merits of 14 to 15 applicants at a time. At the end of the process the panel held a final session. There were no official minutes taken during the panel interviews, nor was there any documentary evidence supporting the funding decisions. However, two of the panel reviewers had notes, and these were consistent and in general agreement with the award decisions.
During our review, we noted that the receipt and acknowledgment of applications for assistance were consistently processed using procedures similar to those suggested in earlier OIG reports. Administrative personnel who were not involved in the review or evaluation of applications processed the mail, recording control numbers on the applications and copies. Standard acknowledgment letters signed by the Director were sent to the applicants after all the applications had been received.
A waiver policy allowing deviations from the deadline for acceptance of applications was established. The waiver policy was published in the Federal Register on March 18, 1996. Our review showed the policy was applied equally to all applications received after the deadline.
Initial substantive reviews were consistently processed, with the exception of five applications which were reviewed by the Deputy Director. The Deputy Director reviewed the five applications because the reviewer who specialized in community development banks had past contractual relationships with the applicants, and to avoid a potential conflict of interest could not review the applications.
 
The reviews conducted by the Deputy Director were not subjected to a secondary review and were therefore processed differently than the other applications. Documentary evidence within four of the five application files suggested that reviews had been completed by the Deputy Director, even though none of those application files had timely prepared evaluation memos. However, the fifth file reviewed (an unsuccessful applicant) did not contain any evidence that a review had taken place. Therefore, we concluded that this file probably had not been processed in a manner consistent with the others.
Relationships between entities applying for assistance were adequately considered when applications were reviewed, with no awards made to any entity exceeding the statutory limit of $5 million to a single applicant. This limitation applied to applicants and their affiliates. For purposes of the CDFI Program, affiliate means any company or entity that controls, is controlled by, or is under common control with another company.
 
The applications for assistance required disclosure of the portion of shares held by another depository institution. Any disclosed shareholdings that raised questions of affiliation would have been discovered during the initial review.
In response to an inquiry from the General Oversight and Investigations Subcommittee of the House Committee on Banking and Financial Services regarding an alleged affiliation between Shorebank and two or more of the FY96 award recipients, the CDFI Legal Counsel made inquiries at the Federal Reserve Board. After consultation with the Federal Reserve Board, CDFI Legal Counsel made the determination that no affiliation existed.
The following year (FY97), CDFI created procedures governing the application review process. Our review of selected FY97 CDFI applications for assistance and the controls established for handling them revealed that the new procedures instituted for the second round were followed, resulting in extensive documentation of every step of the review and decision process.
 
The standard review forms contained the same five evaluation categories as in FY96 and had a rating system which was followed in all cases. Each file contained a list of applicant contacts, an interview guide and list of suggested supplemental information to request. The evaluation forms had numeric ratings and were signed by a reviewer.
 
The forms had recommendations and were approved by the Director. Additionally, we found that the FY96 BEA Program application and selection process was well documented. The sample application files reviewed were consistently organized and used standard forms.
Therefore, our work indicates that the problems that developed with the CDFI award process in FY 1996 were remedied in FY 1997. While additional improvements may enhance the award process, the basic structure of an objective CDFI award process has been established. The Fund has hired an experienced Federal grants manager and an experienced Federal CFO, who should be able to provide the expertise for strengthening award processes and controls.
 
Future Audits Planned
 
During the course of our audit work, we became aware of questions about CDFI’s contracting practices, as did the General Oversight and Investigations Subcommittee of the Committee on Banking and Financial Services in their work at the Fund. Therefore, a separate audit of contract practices is scheduled to begin upon completion of our current audit. We will also carry out reviews of internal controls as part of the annual financial statement audits.
 
Closing
 
In closing, we believe that CDFI has weathered a difficult start, is now taking steps to follow accepted Federal processes for assistance programs, and welcomes suggestions for methods to strengthen its operations.
Madam Chairwoman, this concludes my testimony. I would be pleased to answer any questions that you or Members of the Subcommittee may have at this time.
 
 
 
 
 
 