DOCUMENT FOR PUBLIC RELEASE

The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

1. Protest challenging an agencys evaluation of a potential unequal access to information organizational conflict of interest (OCI) is dismissed as academic where, the agency waived any OCI concerns under the authority granted to it by section 9.503 of the Federal Acquisition Regulation.

2. Protest of an agencys technical evaluation is denied where record shows that the evaluation was reasonable and consistent with the stated evaluation criteria.

DECISION

SRA International, Inc., of Fairfax, Virginia, protests the issuance of a task order to Computer Sciences Corporation (CSC), of Falls Church, Virginia, under task order request (TOR) No. GSC-QFOB-12-0020, issued by the General Services Administration (GSA), Federal Systems Integration and Management Center, to procure, on behalf of the Federal Deposit Insurance Corporation (FDIC), information technology (IT) services for the infrastructure support contract (ISC3). SRA argues that the agency's evaluation of offerors' proposals and award decision were improper.[1]

We dismiss the protest in part and deny it in part.

BACKGROUND

The FDIC is a self-funded (i.e., non-appropriated) entity of the federal government. The overall mission of the FDIC is to preserve and promote public confidence in the U.S. financial system by insuring deposits in banks and thrift institutions for up to $250,000; by identifying, monitoring, and addressing risks to the deposit insurance funds; and by limiting the effect on the account holder and financial system when a bank or thrift institution fails. TOR § C.1.

The TOR was issued on June 12, 2012, to all contract holders under GSA's Alliant government-wide acquisition contract (GWAC),[2] and provided for the issuance of a cost-plus-award-fee task order for a 6-month base period and four option years.[3] A detailed performance-based statement of work was provided describing the required services. TOR § C. Offerors were informed that the ISC3 task order would replace the prior ISC2 task order[4] and cover the day-to-day operations of the FDIC's IT infrastructure facilities, hardware, software, and systems. TOR § C.1. The solicitation also stated that the contractor was to provide the support activities that facilitate the FDIC's delivery of software applications by managing the underlying infrastructure, supporting release management, and providing operations and maintenance of the development, quality assurance, testing, production, and disaster recovery environments, as defined by seven task areas. TOR § C.1.

The TOR provided for issuance of the task order on a best-value basis, considering the following evaluation factors: technical approach, key personnel and project staffing approach, management approach, corporate experience, and cost. TOR § M.5. Offerors were informed that the noncost factors were in descending order of importance and, when combined, were significantly more important than cost. TOR §§ M.1, M.5.

Detailed instructions were provided for the preparation of proposals under each factor. TOR § L. For example, with respect to the technical approach factor, offerors were informed that the TOR sought a tailored technical approach and that an offeror was required to clearly describe its technical methodology in fulfilling the technical requirements identified in the TOR. TOR § L.8.1. Offerors were informed that under this factor the agency would consider the clarity and thoroughness, and the effectiveness and efficiency, of the offeror's technical approach.[5] See TOR § M.5.1.

Five offerors, including CSC and SRA, submitted proposals by the July 23 closing date. The technical proposals were evaluated by the agency's technical evaluation board (TEB), which used the following adjectival ratings: excellent, good, acceptable, and not acceptable. The cost proposals were evaluated by different evaluators for reasonableness and realism. On October 12, the agency's source selection authority (SSA) selected CSC's proposal as the best value to the government, and a task order was issued to CSC.

On October 22, SRA protested to our Office, challenging the agency's evaluation of proposals and selection decision. SRA filed supplemental protests on November 13 (following receipt of documents from GSA) and on December 7 (following receipt of the agency's report). On December 13, GSA informed our Office that it would take corrective action in response to SRA's protest by terminating CSC's task order, seeking and evaluating revised proposals, and making a new selection decision. GSA Letter to GAO, Dec. 13, 2012, at 1-2. On December 19, we dismissed SRA's protest as academic. SRA Int'l, Inc., B-407709 et al., Dec. 19, 2012.

On March 6, 2013, GSA issued an amended solicitation for the ISC3 task order procurement. GSA received revised proposals from four offerors, including CSC and SRA. The parties' revised proposals were evaluated as follows:

Agency Report (AR), Tab 119, TEB Report, at 18; Tab 120, Source Selection Decision, at 61. The TEB's adjectival ratings were supported by a narrative report that detailed the proposals' respective strengths, weaknesses, risks, and deficiencies. For example, with respect to the technical approach factor, under which SRA's proposal received an acceptable rating, the TEB identified six strengths and eleven weaknesses. AR, Tab 119, TEB Report, at 47-50.

On August 14, the SSA again determined that CSC's proposal represented the best value to the government. Specifically, the SSA found that CSC's qualitative advantages under the technical approach and the key personnel and staffing approach factors--the two most important technical factors--outweighed SRA's cost advantage ($3.5 million, or less than 1%) and higher ratings under the less important management approach and corporate experience factors. AR, Tab 120, Source Selection Decision, at 61-65.

SRA raises numerous challenges to the agency's evaluation of proposals and selection decision.[8] First, the protester contends that CSC has an organizational conflict of interest (OCI), arising from CSC's proposal of Blue Canopy Group, LLC, as a subcontractor, which GSA failed to identify or mitigate. SRA also contends that GSA unreasonably evaluated SRA's proposal under the technical approach, key personnel/staffing approach, and management approach factors. SRA argues that had the agency conducted a proper evaluation of offerors' proposals, SRA's proposal would have been found to represent the best value to the government. Protest, Aug. 26, 2013, at 1-63.

We have considered all of the protesters arguments, although we address only its primary ones, and find that none provide a basis on which to sustain the protest.

Organizational Conflict of Interest

SRA protests that the agency failed to properly investigate and mitigate a significant OCI concerning CSC's subcontractor, Blue Canopy. According to the protester, Blue Canopy has been performing as the FDIC network security services contractor since at least 2009 and in this role monitors and audits network security on the FDIC's network. SRA states that its performance of the ISC2 task order contract was subject to security monitoring by Blue Canopy, and alleges that there were no limitations on Blue Canopy's access to information stored in or transiting through the FDIC's network. SRA alleges that Blue Canopy had unfettered access to all of SRA's documents and communications under the incumbent contract, including documents marked as proprietary and containing business sensitive information (e.g., staffing numbers, rates, salaries, planned changes to network infrastructure). SRA also implicitly alleges that Blue Canopy took SRA's information and shared it with CSC, thereby giving CSC an unfair competitive advantage in developing its proposal here. Lastly, SRA argues that the agency never investigated or mitigated this unequal access to information OCI.[9] Protest, Aug. 26, 2013, at 12-18.

The agency disputes that Blue Canopys role as the FDIC network security services contractor provided it with access to any SRA information, and argues SRA has done no more than speculate that this may have occurred. AR, Sept. 25, 2013, at 12-14. The agency also argues that SRAs protest regarding CSCs alleged OCI is untimely, as the protester knew of this basis of protest as of November 1, 2012, when SRA received documents from GSA in SRAs prior protest of this same procurement showing that Blue Canopy was CSCs subcontractor. Id. at 8-12.

Late in the protest process, however, GSA advised our Office and the parties that the agency had waived any OCIs regarding the award to CSC, and requested that our Office dismiss the protest as academic. The FAR establishes that, as an alternative to avoiding, neutralizing, or mitigating an OCI, an agency head or designee, not below the level of the head of the contracting activity, may execute a waiver. Specifically, the FAR provides as follows:

The agency head or a designee may waive any general rule or procedure of this subpart by determining that its application in a particular situation would not be in the Government's interest. Any request for waiver must be in writing, shall set forth the extent of the conflict, and requires approval by the agency head or a designee. FAR § 9.503.

Because the agency has waived any OCI arising from the issuance of the task order to CSC, SRA's OCI allegation no longer presents a live controversy, and we dismiss this protest ground as academic.

SRA protests the agencys evaluation of its proposal under the technical approach, key personnel/staffing approach, and management approach factors. In general terms, the protester challenges various weaknesses assigned to its proposal,[10] contends that the assigned ratings were inconsistent with the stated evaluation criteria, and argues that its proposal was entitled to higher ratings. Protest, Aug. 26, 2013, at 28-43. Among other things, SRA complains that the irrationality of the agencys evaluation is demonstrated by the fact that a number of strengths identified in its proposal for these factors by individual evaluators in their own worksheets were not included in the TEBs final consensus report.

The weaknesses that SRA challenges were assessed under the technical approach and management approach factors, and reflect the TEB's judgment that SRA failed to clearly explain its technical and management approaches to performing the work.[11] See AR, Tab 119, TEB Report, at 49-50 (11 technical approach weaknesses), 56 (1 management approach weakness). We have considered each of SRA's challenges to these weaknesses and, although we do not address each specifically, find that SRA's arguments provide no basis to conclude that the agency's evaluation judgments were unreasonable.

Technical Approach Evaluation

For example, SRA complains that GSAs evaluation of its proposal under the technical approach factor was unreasonable, because the agency allegedly used an unstated evaluation criterion in evaluating in the protesters technical approach: the lack of implementation detail. Specifically, SRA argues that implementation detail was neither an express nor implied requirement of the TOR, and that offerors were not on notice that implementation detail of their proposed approaches would be evaluated.[12] Protest, Oct. 17, 2013, at 49-54.

As set forth above, the TEB identified 6 strengths and 11 weaknesses in SRA's technical approach, which the TEB assessed as acceptable. The evaluators found that, overall, SRA's proposal was only somewhat clear and comprehensive, that it essentially regurgitated the TOR's stated requirements with little detail as to how they would be accomplished, and that new or progressive approaches proposed by SRA also had little implementation detail.[13] AR, Tab 119, TEB Report, at 47-48. In fact, all eleven weaknesses identified in SRA's technical approach concerned a lack of detail generally, including lack of detail regarding the offeror's implementation plan, performance methodology, and execution strategy.[14] Id. at 49-50.

The task order competition here was conducted among ID/IQ contract holders pursuant to FAR subpart 16.5. The evaluation of proposals in a task order competition, including the determination of the relative merits of proposals, is primarily a matter within the contracting agency's discretion, since the agency is responsible for defining its needs and the best method of accommodating them. Wyle Labs., Inc., B-407784, Feb. 19, 2013, 2013 CPD ¶ 63 at 6; Optimal Solutions & Techs., B-407467, B-407467.2, Jan. 4, 2013, 2013 CPD ¶ 20 at 6. Our Office will review evaluation challenges to task order procurements to ensure that the competition was conducted in accordance with the solicitation and applicable procurement laws and regulations. Logis-Tech, Inc., B-407687, Jan. 24, 2013, 2013 CPD ¶ 41 at 5; Bay Area Travel, Inc., et al., B-400442 et al., Nov. 5, 2008, 2009 CPD ¶ 65 at 9. A protester's mere disagreement with the agency's judgment is not sufficient to establish that an agency acted unreasonably. STG, Inc., B-405101.3 et al., Jan. 12, 2012, 2012 CPD ¶ 48 at 7.

We find that GSAs consideration of how offerors would implement the technical approaches they were proposing was entirely consistent with the stated evaluation criteria. The performance-based statement of work required the contractor to provide innovative, efficient, and cost-effective IT infrastructure support services. TOR § C.1.1. The solicitation then instructed each offeror to clearly describe its technical methodology [to] fulfilling the technical requirements identified in the TOR. TOR § L.8.1. Finally, the TOR established that the evaluation here would include consideration of the clarity and thoroughness of the [t]echnical [a]pproach, and the degree of effectiveness and efficiency of the offerors approach for meeting the goals, objectives, conditions, and task requirements of the TOR. TOR § M.5.1. In light thereof, the agency did not employ an unstated evaluation criterion when finding as a weakness that SRAs proposal failed to detail the implementation plan and/or execution methodology of its proposed technical approach.[15]SeeAdvanced Tech. Sys., Inc., B-296493.5, Sept. 26, 2006, 2006 CPD ¶ 147 at 16; Ridoc Enter., Inc., B-292962.4, July 6, 2004, 2004 CPD ¶ 169 at 4.

SRAs Lost Strengths

SRA complains that a number of strengths that were initially assigned to its proposal by individual evaluators were subsequently omitted from the final consensus report without explanation (SRA collectively terms these its "lost strengths"). SRA alludes to a total of 71 lost strengths--52 under the technical approach factor, 7 under the key personnel and project staffing approach factor, and 12 under the management approach factor--which SRA argues demonstrates that GSA's evaluation was not reasonable.

When evaluating offerors revised proposals, the agencys evaluators first performed individual assessments of each offerors submission. The evaluators then held a question-and-answer session with each offeror (as set forth in the TOR), followed by a TEB consensus determination. The agencys evaluation was memorialized in several documents: first there were the individual evaluator worksheets (AR, Tab 118); followed by TEB consensus notes (AR, Tab 117), and eventually a TEB final consensus evaluation report (AR, Tab 119). The TEBs final report included both adjectival ratings and detailed narrative findings regarding each offeror. For example, in addition to a summary rationale for each evaluation rating, the TEB identified six strengths and eleven weaknesses in SRAs technical approach, five strengths and two weaknesses in SRAs key personnel and project staffing approach, eight strengths and one weakness in SRAs management approach. AR, Tab 119, TEB Report, at 46-57.

The SSAs selection decision was based upon the TEBs final evaluation findings in its consensus evaluation report. AR, Tab 120, Source Selection Decision, at 61-64. The record shows that the SSA did not focus on the number of strengths and weaknesses identified in the proposals, or even if something had been identified as a strength or weakness. Rather, the SSAs best value tradeoff determination was based on the qualitative merits of each offerors proposal. Id.

SRA nevertheless contends that the agency's evaluation was unreasonable because the final evaluation differed without explanation from the initial evaluation. While the protester acknowledges that some of the initial evaluator findings were duplicative in nature, and that in other instances a strength could be offset by a corresponding weakness, SRA argues that the agency's unjustified omission of the remaining "lost strengths" from its final evaluation was improper. Protest, Oct. 17, 2013, at 33-41.

The agency disputes the merits of the protester's argument here. As a preliminary matter, GSA points out that SRA's assertion is selective and unbalanced: although the protester references its alleged "lost strengths," SRA makes no attempt to account for the numerous "lost weaknesses" also identified by the individual evaluators that were not in the final evaluation report (whose existence SRA does not deny).[16] GSA Dismissal Request, Oct. 23, 2013, at 10. Further, the agency disputes SRA's central assertion that the strengths initially identified in the offeror's proposal were in fact lost. AR, Oct. 30, 2013, at 10-12. In support thereof, the agency submitted a statement from the TEB Chairperson, together with a crosswalk analysis, to demonstrate that the final evaluation report consolidated all duplicative comments, grouped misplaced comments, and otherwise reconciled individual evaluators' initial impressions as appropriate.[17] Id. at 1-4. Moreover, the individual evaluator findings occurred before GSA conducted a question-and-answer session with SRA. Contracting Officer's Statement, Sept. 25, 2013, at 3-4. Thus, the agency's final evaluation report was not based on the same SRA proposal upon which the initial evaluation findings were premised.

We recognize that it is not unusual for individual evaluator ratings to differ from one another, or from the consensus ratings eventually assigned. Systems Research and Applications Corp.; Booz Allen Hamilton, Inc., B-299818 et al., Sept. 6, 2007, 2008 CPD ¶ 28 at 18. Indeed, the reconciling of such differences among evaluators' viewpoints is the ultimate purpose of a consensus evaluation. J5 Sys., Inc., B-406800, Aug. 31, 2012, 2012 CPD ¶ 252 at 13; Hi-Tec Sys., Inc., B-402590, B-402590.2, June 7, 2010, 2010 CPD ¶ 156 at 5. Likewise, we are unaware of any requirement that every individual evaluator's scoring sheet track the final evaluation report, or that the evaluation record document the various changes in evaluators' viewpoints. J5 Sys., Inc., supra, at 13 n.15; see Smart Innovative Solutions, B-400323.3, Nov. 19, 2008, 2008 CPD ¶ 220 at 3. The overriding concern for our purposes is not whether an agency's final evaluation conclusions are consistent with earlier evaluation conclusions (individual or group), but whether they are reasonable and consistent with the stated evaluation criteria, and reasonably reflect the relative merits of the proposals. See, e.g., URS Fed. Tech. Servs., Inc., B-405922.2, B-405922.3, May 9, 2012, 2012 CPD ¶ 155 at 9 (a consensus rating need not be the same as the rating initially assigned by the individual evaluators); J5 Sys., Inc., supra, at 13; Naiad Inflatables of Newport, B-405221, Sept. 19, 2011, 2012 CPD ¶ 37 at 11.

Based on our review, we find the agency's evaluation was reasonable. The TEB's final evaluation report detailed the relative merits of SRA's proposal under each evaluation factor, as required by the solicitation. For example, the TEB more than adequately explained the basis for its conclusion that SRA's technical approach was acceptable: there were some strengths and some weaknesses (which the evaluators identified); the approach was only somewhat clear, detailed, effective, and comprehensive; and the lack of detail in certain areas caused concerns regarding achievability within the proposed timeframes. AR, Tab 119, TEB Report, at 47-48.

Further, we see nothing unreasonable in the existence of differences between the evaluators' preliminary findings and the final consensus evaluation findings regarding SRA's proposal. In performing its evaluation of offerors' proposals, an agency commonly relies upon multiple evaluators, who often perform individual assessments before the evaluation team reaches consensus as to the evaluation findings. In doing so, it is not uncommon for the final group evaluation to differ from individual evaluator findings. Moreover, there is simply no requirement that agencies document why evaluation judgments changed during the course of the evaluation process. Rather, agencies are required to adequately document the final evaluation conclusions on which their source selection decision was based, and we review the record to determine the rationality of those final evaluation conclusions.

We also find our decision in Systems Research and Applications Corp.; Booz Allen Hamilton, Inc., supra, upon which SRA heavily relies, to be distinguishable. In Systems Research, the agency failed to qualitatively assess the merits of the offerors' competing proposals: notwithstanding the fact that the offerors in that procurement all had differing technical approaches, they were all found, without explanation, to be technically acceptable and equal.[18] We noted that, although an agency is not required to retain every document or worksheet generated during its evaluation of proposals, the agency's evaluation must be sufficiently documented to allow review of the merits of a protest. Id. at 11. In Systems Research, however, given the nearly complete absence in the record of any assessment of the firms' different approaches, we found that the agency failed to reasonably evaluate the firms' proposals consistent with the solicitation (i.e., the agency's consensus evaluation documents did not discuss, to any meaningful degree, the differences between the proposals, which the evaluators agreed existed). Id. at 25.

Unlike in Systems Research, the contemporaneous record here documents the agency's evaluation, allowing for our review of the reasonableness of the agency's evaluation judgments. As stated above, the overriding concern for our purposes is not whether the final ratings are consistent with earlier, individual ratings, but whether they reasonably reflect the relative merits of the proposals. Id. at 18. Further, in Systems Research the eliminated strengths were seemingly warranted based on specific proposal content. Here, by contrast, SRA has only established that the strengths were "lost" between the initial individual and final TEB evaluations, not that they were strengths at all, i.e., aspects of the offeror's proposal that exceeded stated requirements in a way beneficial to the government. See Protest, Oct. 17, 2013, at 33-41. Thus, we find SRA's "lost strengths" argument to be a red herring. Quite simply, the only thing SRA has demonstrated is that many of the agency's initial evaluation judgments did not become final evaluation judgments, not that the final evaluation judgments were unreasonable.

Number of Strengths

Lastly, SRA alleges that GSAs evaluation did not conform to the solicitation, as evidenced by statements made by the agency in its report to our Office. Protest, Oct. 17, 2013, at 41-46, citing AR, Sept. 25, 2013, at 27 (the decision on the relative importance of [SRAs] strengths to the Government, or on whether these strengths outweighed its weaknesses . . . rested squarely with the TEB). We find the protesters allegation unsupported by the record. Moreover, SRAs argument here reflects a fundamental misunderstanding of the evaluation process. An agencys evaluation is not to be based upon a mathematical counting of strengths and weaknesses, but rather, deciding what those strengths and weaknesses represent, in terms of qualitative assessments regarding the relative merits of the competing proposals. SeeSmiths Detection, Inc.; American Sci. & Engg, Inc., B-402168.4 et al., Feb. 9, 2011, 2011 CPD ¶ 39 at 14. It is an agencys qualitative findings in connection with its evaluation of proposals that govern the reasonableness of an agencys assessment of offerors proposals. Walton Constr. - a CORE Co., LLC, B-407621, B-407621.2, Jan. 10, 2013, 2013 CPD ¶ 29 at 9; Archer W. Contractors, Ltd., B-403227, B-403227.2, Oct. 1, 2010, 2010 CPD ¶ 262 at 5. Whether these features were considered as strengths, and whether SRAs proposal was rated acceptable or good, is immaterial provided that the agency considered the qualitative merits of the proposal features. Here, GSA clearly considered these features on the merits, and not on their characterization as strengths.

The protest is dismissed in part and denied in part.

Susan A. Poling
General Counsel

[1] While the solicitation was issued using the procedures in Federal Acquisition Regulation (FAR) subpart 16.5, the TOR stated that it sought proposals from offerors.

[3] As a result of corrective action taken by GSA in response to an earlier protest by SRA, the TOR was amended a number of times. Our references to the solicitation are to the TOR as finally amended by amendment 9 on March 28, 2013.

[4] SRA is the incumbent contractor that performed the ISC2 task order.

[5] Similarly, the TOR informed offerors that the agency would evaluate the effectiveness and efficiency of the offeror's key personnel/project staffing and management approaches. See TOR §§ M.5.2, M.5.3.

[6] The TOR stated a total estimated ceiling cost of between $361,914,979 and $435,223,578. TOR § L.5.

[7] As the value of this task order is in excess of $10 million, this procurement is within our jurisdiction to hear protests related to the issuance of task orders under multiple-award indefinite-delivery, indefinite-quantity contracts. 41 U.S.C. § 4106(f)(1)(B).

[8] SRA initially challenged the agency's realism evaluation of CSC's cost proposal. We dismissed SRA's allegation as failing to set forth a valid basis for protest where the challenge was based only upon the fact that GSA had made no adjustments to CSC's proposed costs. See George G. Sharp, Inc., B-408306, Aug. 5, 2013, 2013 CPD ¶ 190 at 1 n.1.

[9] The situations in which OCIs arise, as described in FAR subpart 9.5 and the decisions of our Office, can be broadly categorized into three groups: biased ground rules, unequal access to information, and impaired objectivity. See Organizational Strategies, Inc., B-406155, Feb. 17, 2012, 2012 CPD ¶ 100 at 5. As relevant here, an unequal access to information OCI exists where a firm has access to nonpublic information as part of its performance of a government contract and where that information may provide the firm a competitive advantage in a later competition. FAR §§ 9.505(b), 9.505-4; Networking & Eng'g Techs., Inc., B-405062.4 et al., Sept. 4, 2013, 2013 CPD ¶ 219 at 10. SRA initially also argued that CSC had an impaired objectivity OCI, and that two former SRA employees now working for Blue Canopy improperly had access to SRA proprietary and competitively useful information. Protest, Aug. 26, 2013, at 13-14, 16-17. SRA subsequently withdrew these protest grounds. SRA Comments, Oct. 17, 2013, at 4; SRA Letter to GAO, Sept. 9, 2013, at 13-14.

[11] SRA does not challenge all of the weaknesses assessed in its proposal under the technical approach factor.

[12] SRA also argues that, although not required, its proposal did provide adequate detail regarding many of the identified technical approach weaknesses. Protest, Aug. 26, 2013, at 36-42; Protest, Oct. 17, 2013, at 54-58. Our review, however, indicates that the agency's evaluation of SRA's proposal was reasonable and does not provide a basis on which to sustain the protest. Likewise, we find that the one weakness identified in SRA's management approach (i.e., that the offeror's approach to controlling costs did not provide insight into any cost control mechanisms) was also reasonably assessed.

[13] One example of SRAs lack of detail or explanation of methodology was the firms proposal of several technology initiatives, for which SRA provided a chart outlining high-level timeframes for introducing its proposed innovations (SRA also organized its technical approach around which technology would be leveraged each year). The TEB found that little detail on the implementation methodology was provided. This caused concern for the TEB as to whether the proposed technologies were attainable within the time frame proposed. AR, Tab 119, TEB Report, at 48.

[14] For example, the TEB found that while SRA suggested an [DELETED], no detailed methodology or execution strategy was provided. AR, Tab 119, TEB Report, at 49. Similarly, [DELETED] were proposed [by SRA], but there were no details of how they plan to accomplish it. Id.

[16] For example, Team SRA makes the assumption that all of the current staff and processes will step over to ISC3 and continue. This thought misses many of the critical activities and goals defined in the TOR. AR, Tab 118, SRA Individual Evaluator Notes (Technical - Task Area 1).

[17] For example, individual evaluators initially identified 11 strengths, weaknesses, and other comments for SRA's [DELETED] under the technical approach, key personnel and staffing approach, and management approach factors. During the consensus discussion, the TEB determined that SRA's [DELETED] tool should properly be assessed only under the management approach factor, and entered one strength under that factor. AR, Tab 134, TEB Chairperson Declaration, Oct. 30, 2013, at 2-3.