
Are the i3 results correct?

At least one of the 49 i3 celebrations going on around the country tonight may be premature.

Like many others, I was very interested in seeing the reviewers’ comments and scores for the highest-scoring applicants. When the information was posted this afternoon, I immediately downloaded the comments and scores for the highest-scoring applicant, Saint Vrain Valley School District. With a standardized score of 116.95, well ahead of any other applicant, I was certain it must be flawless.

It was far from it. The raw score for Saint Vrain Valley School District’s Development proposal was 67.17 out of 100, which is far below the level typically funded in federal programs. If the standardized scores were right, all other Development proposals must have been below 67.17, and that seems extremely unlikely.

I checked another Development score at random. District 75 of the New York City Department of Education had a standardized score of 104.60. The district’s raw score was 93. That’s much more in line with expectations.

I checked another one with a standardized score similar to District 75. The Board of Education of the City of New York had a standardized score of 104.18 and a raw score of 93.17. Although it doesn’t seem quite right that the standardized score would be lower than District 75’s even though the raw score was higher, at least it is very close compared to the problem with Saint Vrain Valley.

I checked another. Bay State Reading Institute had a standardized score of 97.51 and a raw score of 90.33. Again, this seems reasonable in comparison to the others.

I’m sure others will do the complete numbers for all of them in time, but for now, someone needs to explain why Saint Vrain Valley doesn’t seem to merit the title “highest-scoring applicant.” I wonder which other scores are wrong? And what of the more than 1,600 other applications for which we won’t be able to review the scores?

Follow-up (12:21 pm CDT, August 5, 2010): Michele McNeil over at Education Week sheds a bit of light on the Saint Vrain scoring. As she points out, it’s all about the standardization process. Unfortunately, this means the raw scores and comments are of limited value for us outside observers.
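To see how standardization can flip the ordering of raw scores, here’s a minimal sketch. The panel means, panel standard deviations, and the mean-100/SD-15 target scale are all my assumptions for illustration, not ED’s actual formula, but the mechanism is the same: a middling raw score from a harshly scoring panel can standardize well above a higher raw score from a lenient panel.

```python
# Hypothetical illustration of within-panel score standardization.
# Panel statistics and the target scale are assumptions, not ED's actual numbers.

def standardize(raw, panel_mean, panel_sd, target_mean=100.0, target_sd=15.0):
    """Convert a raw score to a z-score within its panel, then rescale."""
    z = (raw - panel_mean) / panel_sd
    return target_mean + target_sd * z

# A harsh panel that scored everything low...
harsh = standardize(67.17, panel_mean=55.0, panel_sd=11.0)
# ...versus a lenient panel that scored everything high.
lenient = standardize(93.00, panel_mean=90.0, panel_sd=10.0)

print(round(harsh, 2))    # the raw 67.17 lands well above 100 after standardization
print(round(lenient, 2))  # the raw 93 lands only modestly above 100
```

Under these made-up panel statistics, the 67.17 outranks the 93, which is exactly the kind of inversion the Saint Vrain score suggests.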


Highest-scoring i3 applicants announced

Late yesterday the Department of Education alerted everyone to its plan to release the list of highest-scoring Investing in Innovation (i3) applicants at 5:30 pm EDT today, but it jumped the gun and posted everything this morning.

Preliminary stats:

  • 49 highest-scoring applicants
  • 4 Scale-up
  • 15 Validation
  • 30 Development
  • Total requested funding $637 million
  • 20 of 49 have already secured the 20% match
  • Lowest score of 81.17 netted $45,593,170
  • Highest score of 116.95 netted $3,608,880

Additional information, including reviewers’ comments and scoring details, will be released later today.


ED’s i3 ‘highest scorers’ list to be revealed August 5th

More info here.

And one question worth asking about the reviewers who were screened out: couldn’t their understanding of the program be useful on a Validation panel?

In the end, they chose 330 unblemished panelists and provided training (no details offered) on the program. Expect to hear some noisy complaints about panelist comments in conflict with program requirements.

I’m looking forward to reading some of the winning narratives. They’ll be posted on data.ed.gov.


NCRR requests applications for a Clinical and Translational Science Coordinating Center

The 55 recipients of the Clinical and Translational Science Awards (CTSA) have been invited to apply for an additional award to act as a national coordinating center.

The new center will receive $4 million a year for 5 years ($20 million total) to develop and deliver communication infrastructure (e.g., wikis, forums, other online tools) and to provide administrative support for committee and working group meetings. The center will also be responsible for maintaining CTSAweb.org, the non-governmental website for the consortium of CTSAs. The center won’t be expected to provide any support for research.

A 30-page narrative and supplemental documents are due January 12, 2011, and the new center has an anticipated start date of December 2011.

Full details are included in the RFA.


Overlooked IES provides several great funding opportunities

The Institute of Education Sciences (IES) is often overlooked by grantseekers despite its relatively large award sizes and broad scope. Opportunities for education research range from early childhood to postdoctoral education, and the diversity of funding categories within each opportunity offers something for just about any applicant.

Responses were due yesterday for several of the FY11 RFAs, but most of the programs will accept a second round of submissions on September 16th. The full list of opportunities is available here.

Notably, the Education Research Grants bear a striking resemblance to the i3 program and would be a good fit for some of the smaller i3 Development proposals or for applicants looking to conduct a rigorous (and funded!) evaluation of current programs prior to applying for a Validation or Scale-up award under a future i3 opportunity. Awards range from $200,000 over two years to almost $5 million over four years.

The best thing about IES funding: fewer applicants means less competition, and less competition means better odds of funding for solid proposals.


data.ed.gov

The Department of Education has unveiled its new data portal, data.ed.gov.

Unlike the portal provided through the Open Government site (used by NSF and several other federal agencies), data.ed.gov is very easy to use. It’ll be interesting to see how it evolves.


Fund for Improving Post-Secondary Education

The Department of Education published its complete announcement for FIPSE this week.

Applications are due July 29th. The Department anticipates making 37 three-year awards between $500,000 and $750,000 each.

This year’s program has eight invitational priorities:

Invitational Priority 1.
Under this priority, we are particularly interested in centers of excellence for teacher preparation as described in section 242 of the Higher Education Act of 1965, as amended (HEA).

Invitational Priority 2.
Under this priority, we are particularly interested in university sustainability initiatives as described in section 881 of HEA.

Invitational Priority 3.
Under this priority, we are particularly interested in rural development initiatives for rural-serving colleges and universities as described in section 861 of HEA.

Invitational Priority 4.
Under this priority, we are particularly interested in initiatives to assist highly qualified minorities and women to acquire doctoral degrees in fields where they are underrepresented as described in section 807 of HEA.

Invitational Priority 5.
Under this priority, we are particularly interested in modeling and simulation programs as described in section 891 of HEA.

Invitational Priority 6.
Under this priority, we are particularly interested in higher education consortia to design and offer interdisciplinary programs that focus on poverty and human capability as described in section 741(a)(11) of HEA.

Invitational Priority 7.
Under this priority, we are particularly interested in innovative postsecondary models to improve college matriculation and graduation rates, including activities to facilitate transfer of credits between institutions of higher education (IHEs), alignment of curricula on a State or multi-State level between high schools and colleges and between two-year and four-year postsecondary programs, dual enrollment, articulation agreements, partnerships between high schools and community colleges, and partnerships between K-12 organizations and colleges for college access and retention programs.

Invitational Priority 8.
Under this priority, we are particularly interested in activities to develop or enhance educational partnerships and cross-cultural cooperation between postsecondary educational institutions in the United States and similar institutions in Haiti.


NSF Rapid Response Grants available for oil spill research

The National Science Foundation issued a dear colleague letter last week encouraging researchers to use the agency’s Rapid Response Grants mechanism to request funding for studies of the Gulf of Mexico oil spill and its effects.

If you’re wondering just how rapid the Rapid Response grants are, take note: the agency announced a Rapid Response award to UC Santa Barbara just 31 days after the start of the spill to study the effects of dispersants on microbial degradation of oil.


i3 receives 1,669 proposals

The Department of Education announced it received 1,669 proposals for the new Investing in Innovation (i3) program by the May 12th deadline. Additional proposals from federal disaster areas in Tennessee can be submitted as late as May 19th, and one additional applicant received approval to submit an application by mail that has not yet been received.

The submissions represent 68% of the number of LOIs the department received in April.

If the reduction in applications was distributed proportionally across the three application types, the new odds are better for applicants (but again, the department doesn’t have enough money to fund all the awards it projects, so expect the Validation award rate to be <10% as well):

Development: ~8%
Validation: ~28%
Scale-up: ~9%
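The odds above follow from a simple calculation: planned awards divided by estimated applications, where applications are estimated as LOIs times the 68% overall submission rate the department reported. The sketch below shows the arithmetic; the specific LOI and award counts passed in are hypothetical placeholders, since the department hasn’t published the per-category breakdown.

```python
# Back-of-the-envelope award odds under the proportional-reduction assumption.
# The LOI count and planned-award count used below are hypothetical placeholders.

def award_odds(planned_awards, lois, submission_rate=0.68):
    """Estimated funding odds, assuming the LOI drop-off was proportional
    across categories (applications ~= LOIs * overall submission rate)."""
    estimated_applications = lois * submission_rate
    return planned_awards / estimated_applications

# Example with made-up Development-sized numbers: 80 planned awards, 1,500 LOIs.
print(f"~{award_odds(80, 1500):.0%} chance of funding")
```

With these placeholder inputs the estimate comes out in the high single digits, in the same neighborhood as the ~8% Development figure above, but the real per-category odds depend on breakdowns we don’t yet have.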

It’ll be interesting to see whether the reductions were, in fact, proportional. Of the proposals I’m aware of, ALL initially planned to pursue Validation, and ALL switched to Development and then back to Validation at some point, with every discussion centered on whether their evidence fit the vague descriptions of what is expected.

I suspect most of the Scale-up LOIs resulted in full proposals because those groups likely knew back in November exactly what they wanted to do and how they planned to do it. I also suspect many of the Development groups dropped out after discovering they didn’t have the resources to pull together a competitive proposal or couldn’t get internal agreement about the direction of the proposal. It’s a toss-up whether Validation groups switched to Development or just dropped out.

But we’ll know much more at some point. This from the department’s announcement:

Being transparent: In the coming weeks, the Department will make an unprecedented amount of information available to the public about each i3 applicant and the funding process.  Specifically, the Department intends to provide detailed information on the applicants, partners, priorities, budgets and descriptions of each i3 application.  The Department will leverage a new user-friendly platform that will allow the public to run customized reports on the application pool.  We believe posting this information will improve the quality of the i3 program, spark the imaginations of the public and further our country’s collective efforts to support innovation in education.


NSF: Data management plans to be required for applicants

Starting in October 2010, applications to NSF must include a supplementary two-page data management plan. The agency plans to release more details about this over the summer.

The new requirement is designed to help NSF meet its obligations for more open research under the Open Government Directive.


Copyright © 2005-2010 Bryan DeBusk, PhD. All rights reserved.