Archive for the ‘Federally funded institutions’ Category

NASA collaboration

Thursday, July 14th, 2005

The tragedy that befell the shuttle Columbia in 2003 resulted in numerous changes within NASA’s shuttle program. One of these changes, as outlined at SiliconValley.com, is greater intra-agency collaboration within the shuttle program itself. Experts at NASA’s Ames Research Center in California are now involved in certain aspects of the shuttle program once reserved for NASA’s Johnson Space Center in Texas and the Kennedy Space Center in Florida. Taking advantage of its institution-wide expertise, NASA now involves all of its centers in shuttle processes. Ames has specifically been called upon to run simulations on its supercomputer (the third fastest in the world on June’s Top500 list), appropriately named Columbia, to test scenarios involving the thermal tiles and other components. Personnel at Ames will also be on call to use the Columbia supercomputer to run immediate simulations during missions to find the best solution to specific problems.

NASA’s leveraging of its overall talent, expertise, and varied resources toward a common goal represents an approach that should be used with greater frequency, not only within other agencies but between separate organizations as well (such as academia and government). Such a paradigm would, in many ways, accelerate the nation’s cyberinfrastructure efforts.

Thanks to SiliconValley.com for the article.

More on cyberinfrastructure funding

Wednesday, July 13th, 2005

As a partial acknowledgment of, and natural extension to, the recent academic recommendations for more multidisciplinary collaboration in higher education research, John Marburger recently submitted a memo to the Heads of Executive Departments and Agencies calling for greater unification of funding efforts to address, among other things, project duplication. From the optimist’s perspective, the Office of Science and Technology Policy also wants to place greater emphasis on high performance computing efforts. Quoting from the memo, Marburger says:

While the importance of each of the Networking and Information Technology R&D (NITRD) program areas continues, investments in high-end computing and cyber infrastructure R&D should be given higher relative priority due to their potential for broad impact.

The memo goes on to state:

Advanced networking research (including test-beds) on hardware and software for secure, reliable, distributed computing environments and tools that provide the communication, analysis and sharing of very large amounts of information will accelerate discovery and enable new technological advances.

This is certainly a step in the right direction toward making high-end computing a more salient issue on the federal research agenda. Thanks to Government Computer News for its coverage of the memo.

Weather forecasting breakthrough

Thursday, July 7th, 2005

It seems only fitting, with hurricane season well underway, that some news about weather forecasting gets the spotlight. The Pittsburgh Supercomputing Center, as part of a multi-institution partnership led by NOAA, demonstrated storm forecasting at a higher resolution than currently used forecasting models can achieve. Over a three-month period from April to June, PSC used a new forecasting model on its Terascale Computing System to generate three forecasts a day over an area of the Great Plains in the Midwest. According to Kelvin Droegemeier, director of the Center for Analysis and Prediction of Storms at the University of Oklahoma and one of the partners in the effort:

Results from the spring experiment suggest that the atmosphere may be fundamentally more predictable at the scale of individual storms and especially organized storm systems than previously thought. Real time daily forecasts over such a large area and with such high spatial resolution have never been attempted before.

It’s good to see a parallel software success story like this new weather modeling system, the Weather Research and Forecasting (WRF) Model, especially given that high performance software development lags well behind the much publicized advances in computing power.

The full story can be found on PSC’s website here.

It’s not all about the speed, but that helps

Friday, May 27th, 2005

Everyone has seen the IBM commercials that tout a computer’s ability to reconfigure itself depending on what it’s being asked to do. But such “on-the-fly” circuit changes aren’t just a corporate vision. The academic sector is also a major partner in bringing them to reality, with federal funding supporting just such research. In the latest edition of BusinessWeek Online, the article “Mighty Morphing Power Processors” points to the University of Texas and the Ohio Supercomputer Center as two institutions heavily involved. There is a lot at stake in this next-generation, reconfigurable chip. The article states:

IBM is hardly the only chipmaker chasing morphing semiconductors. Virtually every major supplier of so-called logic chips is working on some such notion, including Hewlett-Packard, Intel, NEC, Philips Electronics, and Texas Instruments. A dozen or more startups are in the race as well, including Velogix, picoChip Designs, and MathStar.


Chronicle on a crisis

Friday, May 20th, 2005

Today’s Chronicle of Higher Education includes an article on the current budget situation for the centers program at NSF. (Free for a few days, then requires subscription.) It gets right to the point:

Many researchers warn that a crisis looms for academic supercomputing in the United States, largely because of what they see as the National Science Foundation’s failure to support the technology adequately…Even some advisers to the Bush administration have recently called on government agencies to develop a clearer road map for purchasing and operating cutting-edge supercomputers and for developing supercomputer software.

The usual suspects make their appearances, including:

[A]cademic scientists worry that the changes in the mission of these centers and the NSF’s financing decisions could upend American supercomputing research. If none of the incumbents win a new contract from the NSF, building a new supercomputer center from scratch would not be easy or inexpensive, they say. It might not even be smart.

“You don’t build a highway and decide a few years later that you’re going to take it away,” says Kelvin K. Droegemeier, a professor of meteorology at the University of Oklahoma who relies heavily on supercomputers in his research.

Losing to your forward

Friday, May 20th, 2005

Most of you have probably already had this forwarded to your inbox, but transcripts of the testimony delivered by the likes of John Marburger and Anthony Tether to the House Science Committee last week are available online.
