Here’s one for the VoIP crowd and eBay fans. It was reported earlier this month that eBay purchased the Internet communications company Skype for $2.6 billion in cash and stock. How eBay intends to integrate Skype’s technology into its operations isn’t entirely clear, though eBay’s sole reliance on e-mail as the primary form of communication between parties has become rather archaic. VoIP has real potential and is getting more and more press lately, but the marriage of eBay with Skype’s technology, which includes peer-to-peer instant messaging and file transfer, may have some problems.
Archive for September, 2005
In this article at Korea’s JoongAng Daily, the president of Lucent Technologies’ research division states that by 2060, computer CPUs will have the capacity of every human brain combined. That’s a pretty bold prediction, but perhaps not as far-fetched as we might think. Those familiar with Ray Kurzweil (more here), the futurist, author, inventor, and AI expert, might recall that he predicted in the 1990s that by 2019 the PC will have the processing power of one human brain. By 2029, 1,000 human brains.
Needless to say, the computational capacity of big computing machines remains a salient issue, but it is rarely put in terms of human brain processing. One claim puts the human brain at nearly 20 million billion (2 × 10^16) calculations per second, though who really knows how accurate that is.
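Out of curiosity, the timelines above can be sanity-checked with a back-of-envelope calculation. Everything below is an illustrative assumption, not a figure from the article: the 2 × 10^16 calculations/second brain estimate, a guessed ~10^10 operations/second for a 2005-era CPU, and a Moore’s-law-style doubling every 18 months.

```python
import math

# Hedged back-of-envelope sketch. All constants are assumptions:
BRAIN_OPS = 20e6 * 1e9   # "20 million billion" calc/s = 2e16
CPU_OPS_2005 = 1e10      # assumed throughput of a 2005-era CPU
DOUBLING_YEARS = 1.5     # assumed performance doubling period

# How many doublings until one CPU matches one brain?
doublings = math.log2(BRAIN_OPS / CPU_OPS_2005)
years = doublings * DOUBLING_YEARS

print(f"doublings needed: {doublings:.1f}")             # ~20.9
print(f"one brain reached around: {2005 + years:.0f}")  # ~2036
```

Under these (debatable) assumptions, one-brain parity lands in the 2030s, later than Kurzweil’s 2019 but well before the 2060 all-brains-combined prediction, which suggests the two forecasts are at least in the same ballpark.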
Microsoft has entered the cluster computing market with hopes of grabbing a share of the market currently led by Linux. The proliferation of clusters for heavy-duty computing continues across many business segments, from industry to government, as evidenced by the latest edition of the Top500 list, in which cluster systems comprise 60% of the entries.
According to this article from Grid Computing Planet, Microsoft’s initial software entry works on clusters of up to 128 machines, and the company intends to integrate heterogeneous applications on the cluster better than Linux does, as well as offer better support. An overview of their cluster solution can be found here.
The new software will include open source MPI middleware. That’s right - open source. More info about Microsoft’s decision to implement this in the Compute Cluster Solution can be found in this eWeek piece.
Not everyone has a supercomputer lying around, especially an idle one, but if you run a high-performance system and want to save some energy without losing much performance, Los Alamos National Laboratory might have the solution. LANL is claiming a potential 10-25% savings in system energy consumption with EnergyFit 1.0. According to this piece from LinuxHPC.org, EnergyFit is
a transparent software layer based on a novel algorithm that reduces the power and energy consumption of high-performance computing systems.
Developed by Chung-Hsing Hsu and Wu-chun Feng at LANL, this software is another approach to addressing the enormous energy consumption and heat output of multiple-processor architectures. Reducing the amount of energy used, and the consequent heat produced, by big machines leads, among other things, to an increase in the MTBF (mean time between failures) of processors and an increase in overall system reliability.
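The article doesn’t detail EnergyFit’s algorithm, but the general technique in this space, dynamic voltage and frequency scaling (DVFS), can be sketched with a toy model. The idea: CMOS dynamic power scales roughly with V² × f, and voltage can drop along with frequency, while memory-bound portions of a job don’t speed up with a faster clock anyway. All numbers below are made-up illustrations, not EnergyFit’s actual parameters.

```python
# Toy DVFS model. The voltages, frequencies, and memory-bound fraction
# are illustrative assumptions, not measurements or EnergyFit's method.

def dynamic_power(freq_ghz, volts, capacitance=1.0):
    """CMOS dynamic power is roughly C * V^2 * f."""
    return capacitance * volts**2 * freq_ghz

def energy(freq_ghz, volts, cpu_secs_at_2ghz, mem_bound_frac):
    # Memory-bound time doesn't shrink with a faster clock, so slowing
    # the CPU stretches total runtime less than linearly.
    cpu_time = cpu_secs_at_2ghz * (2.0 / freq_ghz) * (1 - mem_bound_frac)
    mem_time = cpu_secs_at_2ghz * mem_bound_frac
    runtime = cpu_time + mem_time
    return dynamic_power(freq_ghz, volts) * runtime, runtime

# Assumed operating points: 2.0 GHz @ 1.2 V vs. 1.6 GHz @ 1.0 V,
# on a job that is 40% memory-bound.
e_fast, t_fast = energy(2.0, 1.2, 100.0, mem_bound_frac=0.4)
e_slow, t_slow = energy(1.6, 1.0, 100.0, mem_bound_frac=0.4)

print(f"slowdown:     {t_slow / t_fast:.2f}x")   # 1.15x
print(f"energy saved: {1 - e_slow / e_fast:.0%}")  # 36%
```

In this toy setup a 20% clock reduction costs only 15% in runtime but saves roughly a third of the dynamic energy, which is the flavor of trade-off the 10-25% claim is about.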
More information about power consumption and savings on large systems can be found in this Computerworld article.
In this piece for SiliconValley.com by John Yochelson, president of Building Engineering and Science Talent (BEST), California’s efforts to address the decline in science education are highlighted, along with examples of community-based efforts elsewhere pursuing the same goal. The University of California system is partnering with business and industry on a program, aimed specifically at incoming freshmen, geared toward producing K-12 science teachers.
Part of the motivation stems from what Yochelson considers phenomenal growth in foreign economies and changes in our own:
With China and India churning out tens of thousands of additional engineers each year, poorly paid research apprenticeships in science lasting longer, and the incomes of business and law school graduates going up, it is no surprise that U.S. degree production in many key technical fields has been flat or down since the mid-1980s.
The decline in technical skills and interest in science stems from many sources, which is why multiple efforts are underway to address it. Science and technology disciplines in education need better promotion. We need to do a better job of selling the science.
BEST is an organization founded in 2001 as a result of the recommendations from the 2000 Congressional Commission on the Advancement of Women and Minorities in Science, Engineering and Technology Development.
Following the devastation of the Katrina catastrophe, it seems appropriate to look at the communication infrastructure that failed so miserably in New Orleans. The complete inability to connect people with real-time events in NO underscores the importance of having a more or less fault-tolerant communication infrastructure in place. The need for a more complete and robust cyberinfrastructure has now come to the forefront, whereas before it may have been viewed as just more “Star Wars” technology promoted by computer scientists and computer geeks. According to John Powell, a senior consulting engineer with the National Public Safety Telecommunications Council,
… emerging technologies can lead to better communications in the future, but no technology will help you when the total physical infrastructure is inadequate. This is our first big disaster where urban response teams have had to bring in all their own communications equipment, because there was almost no emergency-communications capability left in the city.
Specifically, Powell was referring to mesh wireless networks based on the 802.11 wi-fi standard and the need to have a better system in place, as discussed in this Information Week article. It is somewhat unbelievable that no communication contingency plan was in place for an event that urban planners knew could, and probably would, happen in NO. The threat of such an event, and discussions of what to do if all power, and thus communication, were lost in the city, have been on the table for years.
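The appeal of a mesh is redundancy: every node can relay for its neighbors, so traffic routes around failures instead of dying with a single tower. A toy illustration of that property (not the 802.11 protocol itself, and with made-up node names):

```python
from collections import deque

# Hypothetical city mesh: each node lists its radio neighbors.
MESH = {
    "city_hall":    {"hospital", "fire_station", "stadium"},
    "hospital":     {"city_hall", "stadium", "shelter"},
    "fire_station": {"city_hall", "shelter"},
    "stadium":      {"city_hall", "hospital", "shelter"},
    "shelter":      {"hospital", "fire_station", "stadium"},
}

def find_route(mesh, src, dst, failed=frozenset()):
    """Breadth-first search for any multi-hop route avoiding failed nodes."""
    if src in failed or dst in failed:
        return None
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in mesh[path[-1]] - failed:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # network partitioned

print(find_route(MESH, "city_hall", "shelter"))
# A relay floods out; the mesh simply routes around it.
print(find_route(MESH, "city_hall", "shelter", failed={"hospital"}))
```

Unlike a hub-and-spoke cellular network, where losing the central tower silences everyone, this topology keeps city_hall and the shelter connected even with a relay down, which is exactly the fault tolerance disaster planners were missing in New Orleans.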