Prior to the late 1920s the prevailing view was that the universe was static: the stars we could see appeared to hold fixed relative positions. When Albert Einstein developed his General Theory of Relativity around 1915, which remains our best description of gravity and the evolution of the universe, he accounted for this static state with a term in his field equations called the cosmological constant. Essentially, it acted as a kind of anti-gravity force that balanced the mutual gravitational attraction between objects in the universe and held everything fixed. However, Edwin Hubble's observation that galaxies are rushing away from each other led to Big Bang models of the universe and forced Einstein to drop his cosmological constant, prompting his famous remark that the constant was his "biggest blunder."
Several years ago, two independent teams observing distant supernovae found that some objects seem to be speeding up as they rush away from us. In other words, it appears there is a repulsive force stronger than the gravitational attraction on these objects, acting just as an 'anti-gravity' force should. The term 'dark energy' has entered the scientific literature, sounding almost as if it were lifted from Star Wars.
Since those preliminary data were published, more groups of researchers have looked at many more supernovae, and the results increasingly resemble Einstein's original cosmological constant. In terms of the equation-of-state parameter commonly used to characterize dark energy, a cosmological constant corresponds to a value of -1.0, and the data place the value at something less than about -0.85. The acceleration appears to be a real phenomenon, and it does not appear to vary with distance. These new data will help thin out the theoretical models attempting to describe what dark energy is and how it behaves, and they show that Einstein may have been on the right track after all...why does that not seem so surprising?!
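For readers who want the quantity behind those numbers: the -1.0 and -0.85 refer to the dark-energy equation-of-state parameter, usually written w. Here is a minimal sketch of the standard relations (my addition, not something from the supernova papers themselves):

```latex
% Dark-energy equation of state: w relates pressure to energy density.
% A cosmological constant has w = -1 exactly; the supernova data quoted
% above constrain w to lie below roughly -0.85.
\[
  w \equiv \frac{p}{\rho c^{2}}, \qquad
  \frac{\ddot{a}}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^{2}}\right),
\]
% so a component with w < -1/3 gives \ddot{a} > 0, i.e. it accelerates the
% expansion, and w = -1 reproduces Einstein's cosmological constant.
```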
Wednesday, November 30, 2005
Saturday, November 26, 2005
New Possibilities for Nuclear Power Production
An article in the December issue of Scientific American describes new technologies that could make conventional nuclear power plants safer and more environmentally sound than those we currently have. The article is timely, since the national energy plan pushed through by the Bush administration and Congress calls for more widespread use of nuclear power.
In this day and age of terrorism (the possibility of dirty bombs and sabotage of nuclear power plants), dependence on Mideast oil, and tons of spent nuclear fuel that is dangerous, toxic waste that will be around for millennia, it is no wonder that one’s reaction to more nuclear plants is one of disbelief and fear. There is no question that nuclear power is an attractive energy source: it produces little in the way of the gaseous emissions that cause illness and contribute to global climate change, as conventional gas- and coal-burning power plants do. There is a supply of uranium that will last long after our oil and gas reserves are spent (if we maximize its use), and it is an efficient energy source. It has been the public and political fear factor after Three Mile Island and Chernobyl that has prevented the U.S. from building new plants over the past thirty years.
However, I read with much interest about the new technologies that may allow the rebirth of this industry in the next decade. The first innovation is a high-temperature method (pyrometallurgical processing) of recycling normal waste products from fission reactions into new forms that can themselves be burned in a modified type of reactor called an advanced fast-neutron reactor. The trouble with present-day thermal, slow-neutron reactors is that, while efficient at generating electricity, they do little to minimize the output of radioactive waste. In addition, something I wasn’t aware of: when reactor technicians remove and replace the uranium fuel, only about 5% of it by weight has actually been used for energy production. The remaining 95% is unused uranium or waste products, including transuranic elements. With other nations increasing their numbers of nuclear reactors, and with the U.S. likely to start building more within the next ten years, the world’s uranium supply may last only fifty years. A new way of harnessing the energy potential of uranium ore needs to be employed if fission-based nuclear power is to be a viable long-term alternative energy source.
The fast-neutron reactor, in conjunction with new methods of recycling nuclear waste products, has been shown to solve many of the traditional problems with nuclear power. Fast reactors (which have been around since the late 1960s) already exist in France, Japan, Russia, and the U.K., and two plants have operated in the U.S. Full recycling of wastes from thermal reactors, for instance, can cut the wasted energy potential of uranium ore to about 1%, versus the roughly 95% we waste now. The relatively small amount of waste that remains contains only trace amounts of transuranic elements such as plutonium (needed for weapons) and americium; these are among the wastes that are currently troublesome, since their half-lives are tens of thousands of years or more, depending on the isotope, and the question is what to do with all that toxic material. The storage issue, as well as black markets selling weapons materials to rogue nations or terrorist organizations, requires us to minimize the amounts available, so the new recycling methods may significantly reduce the problem. What little waste is left from fast reactors typically remains a problem for about 500 years; that is still a long time, but far better than tens of thousands of years, and in much smaller quantities to store in a facility like the proposed Yucca Mountain site. A third consideration is the lack of greenhouse emissions from nuclear facilities, which means such reactors would not contribute to global warming.
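A rough back-of-the-envelope check (my own arithmetic using the figures quoted above, not a calculation from the article): going from roughly 5% utilization of the fuel's energy content to roughly 99% stretches the same uranium supply by about a factor of twenty.

```python
# Back-of-the-envelope sketch, assuming the utilization figures quoted above:
# once-through thermal reactors use ~5% of the fuel's energy content, while
# fast reactors with full recycling waste only ~1% (i.e. ~99% utilization).
thermal_utilization = 0.05   # fraction of energy content used today
fast_utilization = 0.99      # fraction used with full recycling (claimed)
supply_years_thermal = 50    # rough supply lifetime quoted for today's fleet

extension_factor = fast_utilization / thermal_utilization
supply_years_fast = supply_years_thermal * extension_factor

print(f"Extension factor: ~{extension_factor:.0f}x")          # ~20x
print(f"Supply lifetime: ~{supply_years_fast:.0f} years vs ~{supply_years_thermal}")
```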
While there would still be a waste issue with fast-reactor and recycling technologies, it would be minimized for fission-based reactors. Weapons materials would not be produced in any significant amounts, toxic waste would be produced only in small amounts that would need to be stored for far shorter periods than the wastes we now have, and there would be no contribution to climate change. The lifetime of the global uranium supply would be extended well beyond what it would be with thermal reactors, and far beyond what is projected for oil and gas reserves. These new technologies seem to make fission-based nuclear power much more attractive than it was only a few years ago, and they may be worth pursuing. While I would want much more research before doing this on a large scale, the fast reactors that already exist seem to be working well, and large-scale recycling facilities would need to be built in tandem.
Of course, this is not the only option for power production. Wind power is the fastest-growing type of energy production, and I would like to see significant increases in funding for R&D on new solar technologies (these need to be made more efficient for widespread use, and new storage techniques need to be developed for cloudy days and nights). There are good possibilities with geothermal power, as well as with harnessing the endless energy of ocean waves, the tides, and ocean currents. Then, of course, there is the possibility of hydrogen-based power, such as fuel cells.
Either way, nuclear power will almost certainly be part of future energy strategies for the U.S. and numerous other nations, and planning for the end of the oil era must begin now. This is not the sort of thing we can wait on: every aspect of our society now revolves around energy, and economic disaster (as well as potential environmental and security disasters) awaits those who do not have viable plans and massive amounts of new energy infrastructure in place over the next few decades. We must get serious now about how we use energy and how we produce it, because of the enormous amount of infrastructure construction that will be needed.
Wednesday, November 23, 2005
Happy Thanksgiving!
It is easy to fall into the trap of thinking life is difficult and the world is about to end. But everything is relative, and in the end chances are things aren't so bad compared to what a lot of other people have to deal with. If you have a job, a house, enough food, good health, and family and friends who care about you, life is pretty good and there is an awful lot to be thankful for. Have a wonderful holiday.
Tuesday, November 22, 2005
Bush may have wanted to bomb Al-Jazeera
In a story reported by the Associated Press out of London, leaked memos suggest President Bush wanted to bomb the main Qatar headquarters of Al-Jazeera, the major Arab news outlet. British Prime Minister Tony Blair reportedly convinced him this would be a political and public relations disaster. If this is true, thank you, Mr. Blair, for saving us from another colossal mistake in the Bush administration's handling of the war.
What could send a worse signal to the Arab world than a leader who claims to want to establish democracy, and to promote the freedoms a real democracy is supposed to enjoy, blowing up one of the few outlets of free speech in the Arab world just because he disagrees with its coverage? Could the president really have considered this a serious option? What little respect we still carry in the region would likely have gone up in smoke, just as the headquarters would have.
Now, I don't read Al-Jazeera's web site, and I am aware of the claims that their stories often try to spin issues toward the Arab perspective. This upsets the administration, which would like to see more photos of the good things happening for Iraqis rather than photos of dead Iraqi civilians. But I'm sure Arabs familiar with American news coverage would like to see more photos of dead Iraqi civilians and fewer shots of the president on aircraft carriers saying how well things are going, in order to show American viewers the plight of some Arabs. It is a matter of perspective, but it is also part of what you get with free speech. Wars are won in part with propaganda as well as with what happens on the battlefield, and governments try to spin information in the press to fit their agenda as much as possible. The press may also try to slant things ever so slightly. Is that ideal journalism? No. But it is reality. The right complains about our liberal press and all the bias that is unfair to the right's causes. But we don't blow them up. I think we would be upset if Arabs bombed Fox News headquarters because of its positive (biased) coverage of Bush and company, which doesn't help the Arab cause.
The current administration has created a culture of attacking those who disagree with its policies. It is one thing to attack verbally and try to sway public opinion; it is quite another to consider physically destroying those who speak out against you when the stated goal is to promote democratic principles. Again, if this story is in fact true, thank you, Tony Blair.
Sunday, November 20, 2005
Iran Parliament Votes to Keep IAEA Out
In a story just released, Iran's parliament has voted to keep the International Atomic Energy Agency (IAEA) out of Iran and to block any inspections of its nuclear facilities. This, of course, is troublesome, because Iran consistently claims it is building facilities and enriching uranium solely for power production and peaceful purposes. The Iranian leaders also consistently whine that the world is against them (citing, for instance, their inclusion in the 'axis of evil') and that it wrongly accuses Iran of trying to develop a full nuclear weapons capability. Of course, it doesn't help their cause when the Iranian president states publicly that Israel should be wiped off the face of the earth. Nor will actions like the one just taken by the parliament.
When a nation has consistently lied about its intentions and has tried to hide facts (such as state support of terror groups) in the past, it would seem logical that if it wants to gain world acceptance and respect and become a legitimate player in the Mideast, it would want to show a sign of good faith. This is not the way to do it. How else can we interpret their refusal to allow inspections except to conclude that Iran is trying to hide a weapons program?
Iran has had essentially a two-year grace period to do whatever it wants since the U.S. invaded Iraq, because the U.S. did not keep pressure on Iran as I think we should have. Saddam was contained and we had UN inspectors on the ground in Iraq (and they were discovering that our intelligence was not accurate), and during that time I think we made a mistake by not focusing on the larger threats of Iran and North Korea. Has Iran made progress toward a nuclear weapons program during the past two years, as North Korea supposedly has? It may be the price we pay for going into Iraq, and it is absolutely essential that we figure out a way to get inspectors into Iran to make sure. It will be difficult to believe our intelligence after the disastrous failures in Iraq, so we need direct, on-the-ground evidence to be absolutely sure about Iran.
Saturday, November 19, 2005
Thinking Out Loud About Emergent Behavior...Those Power Laws
I’ve addressed the idea of emergence, emergent behavior, network structures, and so on a number of times in the not so distant past, and a new result by a student of mine has me thinking about these topics again. There is an ever-growing list of complex systems in all areas of science, economics, social science, and technology that have been described or categorized as ‘emergent systems.’ Related to these types of systems are things like networks and self-organized systems, where structure/organization arises from what initially is a disorganized system of individual components. In addition, what we commonly call phase transitions can also be grouped in with the mix.
A common problem with all of this is understanding the fundamental organizational principles responsible for the transition from a disorganized or chaotic state to an organized one. In other words, we see what the initial state is, and we observe what comes out as a final state, but the actual emergent process is typically not well understood. I began wondering whether there might be some type of system that is well understood, shows some sort of transition with a signature of emergent behavior, and could be used to give new clues to organizational principles that may be useful in other, different areas of research. One place to look is the most fundamental system possible: subatomic particles.
A common signature linked to emergent behavior, phase transitions, and network structure (specifically scale-free networks) across all fields is the presence of power laws relating quantities relevant to a given system. Do power laws show up in analyses of subatomic particles along the same lines as other emergence studies? Inspired by a study that found a scale-free network structure in cellular metabolic chemistry, where power laws appear when one counts the number of metabolic reactions in which particular molecules participate, we looked at baryons, the number of decay modes they have, and the number of times they appear in other particles' decays. It was not obvious going in what to expect, because each type of baryon has a specific set of allowed decay modes, each with its own branching ratio (i.e. probability of occurring), and there is only a limited number of particles within the quark model. The results were…power laws.
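For anyone curious what "the results were power laws" means operationally, here is a minimal sketch of the analysis shape. The counts below are made-up placeholders, not our actual baryon data; the real numbers come from tabulated decay modes.

```python
# Illustrative sketch only: placeholder counts, not our actual baryon dataset.
# For each particle, count its decay modes (or appearances in other particles'
# decays), histogram how many particles have k such connections, and test
# whether P(k) ~ k^(-gamma) by fitting a line in log-log space.
import numpy as np

k = np.array([1, 2, 3, 4, 6, 8, 12])          # number of decay modes (hypothetical)
counts = np.array([40, 18, 11, 7, 4, 2, 1])   # particles having k modes (hypothetical)

slope, intercept = np.polyfit(np.log(k), np.log(counts), 1)  # least-squares fit in log-log
print(f"Fitted power-law exponent gamma ~ {-slope:.2f}")      # P(k) proportional to k^(-gamma)
```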
The interesting aspect of this is that the decay of particles is described in detail by the Standard Model (SM). The SM predicts what types of particles may exist (there is a finite set of possibilities because there are three families comprising six quarks, and the quarks combine only in pairs, mesons, or in triplets, baryons), it predicts what they are allowed to decay into, and it predicts the branching ratios. What we observe in experiments fits the predictions beautifully in all cases. The SM bases its predictions on conservation laws and selection rules for certain quantum numbers, much as quantum mechanics predicts the allowed electron configurations (i.e. the entire periodic table of the elements) from simple selection rules for electron quantum numbers in bound states.
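To make the "conservation laws and selection rules" concrete, here is a toy example of the sort of bookkeeping involved (my own minimal illustration, not the Standard Model machinery): a decay is allowed only if additive quantum numbers such as electric charge and baryon number balance between the initial and final states.

```python
# Toy illustration of additive quantum-number conservation in particle decays.
# (charge in units of e, baryon number) for a few familiar particles:
Q_B = {
    "Delta++": (2, 1), "proton": (1, 1), "neutron": (0, 1),
    "pi+": (1, 0), "pi-": (-1, 0), "e-": (-1, 0), "anti-nu_e": (0, 0),
}

def conserved(parent, daughters):
    """True if electric charge and baryon number balance in parent -> daughters."""
    q0, b0 = Q_B[parent]
    q1 = sum(Q_B[d][0] for d in daughters)
    b1 = sum(Q_B[d][1] for d in daughters)
    return (q0, b0) == (q1, b1)

print(conserved("Delta++", ["proton", "pi+"]))               # True  (observed decay)
print(conserved("neutron", ["proton", "e-", "anti-nu_e"]))   # True  (beta decay)
print(conserved("proton", ["pi+", "pi-"]))                   # False (violates baryon number)
```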
The questions in my mind now are: Why are power laws found in so many systems that have nothing to do with each other? Why are power laws signatures for emergent behavior? Now that we see power laws in a fundamental system such as baryons, is there a deeper meaning we can gather from the results? That is, are there possible analogs to conservation laws and quantum numbers (specific quantities that can only have certain values) in other systems that haven’t been thought of that could be the organizational principles, and responsible for the observed power laws?
In some sense this is my take on what is being tried in a field such as econophysics, where analysis methods and techniques perfected in describing physical systems are leading to new insights and new ways of thinking in economic models. Could the baryon analysis be used as a cue for a new way of thinking about other systems? I have no idea what the answers to these questions are, or whether these are nonsense questions to ask...I am only thinking out loud at the moment.
Life's Molecules Abundant in Space
It has been established for a number of years that amino acids, the building blocks of proteins (proteins are long chains of amino acids), which in turn are the building blocks of life as we know it, are abundant throughout the solar system. They are found in meteorites that land on earth as well as in other objects in orbit around the Sun. I remember reading about the discovery of amino acids in the far reaches of the Milky Way, well outside the solar system, and I finally remembered to find a reference to it.
In 2003, a collaboration of researchers from NASA, Taiwan, and Poland reported the spectral lines of glycine, the simplest of the 20 amino acids needed to create life. These molecules were found not only in our solar system but in hot gas clouds (in the early stages of active star formation) tens of thousands of light-years away. Presumably, the same chemistry could have occurred when our own solar system was forming some five billion years ago. As astronomers and astrobiologists look around, they find more evidence that such organic molecules are prevalent in numerous locations in the Milky Way. There is no reason not to assume that if these molecules form in our galaxy, they likely form in many other galaxies as well. One open question, as far as I am aware, is through what chemical processes these organic molecules and precursors of organic life form. I'm sure it is an active area of research, and it keeps alive a question first raised some forty years ago: Did life on earth come from outer space?
Thursday, November 17, 2005
21st Century Research: How to Handle Massive Amounts of Data
Think about this: in about two years, if all goes according to plan, the Large Hadron Collider (LHC) will be commissioned and running at CERN, the European particle physics facility near Geneva. One of the experiments, the Compact Muon Solenoid (CMS), will be able to collect 225 MB of data each second and will run for the equivalent of 115 full days in 2008. That adds up to about 2 petabytes of data (enough to fill millions of CDs!). So goes 21st-century scientific research. The question, of course, is where and how one stores such massive amounts of data, let alone distributes it to the hundreds or even thousands of scientists, postdocs, and graduate students who need the data for their individual projects.
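A quick sanity check of those figures (my own arithmetic, using the numbers as stated above):

```python
# Verify the quoted CMS data volume: 225 MB/s for the equivalent of 115 full days.
rate_bytes_per_s = 225e6          # 225 MB per second
seconds = 115 * 24 * 3600         # 115 full days of running

total_bytes = rate_bytes_per_s * seconds
print(f"Total: ~{total_bytes / 1e15:.1f} petabytes")   # ~2.2 PB, i.e. about 2 PB as quoted
```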
Back around 1999, physicists and computer scientists began developing prototype systems known as 'grid computing' in order to handle large datasets. Single labs or universities almost certainly will not have the capability to store and efficiently make use of datasets like the one described above for CMS, so it is necessary to expand on the collaborative efforts of the past for analyses of all kinds. Grid computing gets its name because it is analogous to an electrical grid. When you turn on a light in your house, you don't really have any idea where on the grid the power is coming from, as long as you get it. The grid is a sort of black box: an enormous network of wires, transformers, power stations, and so on that collectively feeds power when and where it is needed on a large scale. Data and information grid computing networks act in a similar manner.
In a sense this means that there will be virtual organizations and collaborations. Members of a particular grid network will request the information they need at any time from their own computers, and whatever they requested will be retrieved from wherever that information is stored on the grid. All members of the grid network will have such capabilities. Large datasets can presumably be broken up and stored at any number of sites all over the world, and the grid architecture and software will be able to quickly grab any files that are requested, regardless of which local computer network has them. Such a system of virtual organizations is essential because of the size and cost of many scientific projects that either exist already or are planned for the future. To put it in perspective, Microsoft's Tony Hey said, "It's no exaggeration to say that we'll collect more data in the next five years than we have in all of human history."
The data being collected come both from physical experiments, such as CMS and other experiments at Fermilab, the national laboratory outside Chicago, and from simulations run as computer-based experiments. Much of the complicated and dense simulation data comes from programs run on modern supercomputers and other parallel-processing networks. Presently, large databases such as protein databases and other information from the Human Genome Project can be accessed on the traditional Internet, and these and similar databases are only expected to grow over time. Large national and international collaborations that share data are not new, and from my own experience at Fermilab, certain types of science cannot be done without such efforts. Grid computing is the natural extension of, and solution to, the ever-increasing information from such collaborations.
An interesting extension of collaborative work comes in the form of increasing amounts of multidisciplinary research. More research areas are overlapping, and with grid computing there is an expectation that biologists, chemists, geologists, physicists, astronomers, theorists, and so on will be able to access each other's data and tools, as well as share research methods and models to enable new breakthroughs. With growing work being done on complex systems and emergent behavior, for instance, such sharing will likely be necessary to expand the field.
How grid computing works depends on four layers of the network. The foundation is the physical network that links all the members and resources (the resources being the second layer) on the grid. Each successive layer depends on the ones below it. On top of the resources sits the middleware, the software that makes the grid work and hides its complexity from users. The top layer is the application software that users actually see and access on their computers. This is like the application icons on your computer desktop: click on one, and software that you don't see and take for granted does its thing to open the program or application you want. The big difference from your current computer is that a computer on the grid will have icons for applications or data that, when you need them and click on them, will not grab the program off your hard drive but will instead pull it from whatever computer on the grid has it...and that computer may be on the other side of the world. This all happens automatically.
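Here is a deliberately oversimplified sketch of that idea (my own toy model with made-up names, not real grid middleware): the user asks for a dataset by name, and a middleware layer figures out which site on the grid holds it and fetches it, hiding the location entirely.

```python
# Toy model of grid middleware: resolve a dataset name to whichever site stores
# it, then "deliver" it to the user, who never sees where it came from.

# Hypothetical catalog: dataset name -> site that currently stores it
CATALOG = {
    "cms_run_0042": "tier1.fnal.example.org",
    "genome_chr21": "datacenter.sanger.example.org",
}

def fetch(dataset: str) -> str:
    """Middleware layer: look up where the dataset lives and retrieve it."""
    site = CATALOG.get(dataset)
    if site is None:
        raise KeyError(f"{dataset!r} is not registered anywhere on the grid")
    # A real system would now transfer the files over the network.
    return f"{dataset} delivered from {site}"

# The user just clicks an icon (or calls fetch); the storage site stays invisible.
print(fetch("cms_run_0042"))
```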
As people gain experience with the relatively small grid networks that currently exist, larger and more complex grids will naturally form. It will be an Internet on steroids, and as always with technology, we cannot imagine all the ways in which it will be used. One potential problem, of course, is the security of enormous datasets, so new types of encryption and anti-virus software will likely be developed as well to maintain the integrity of a grid where members change frequently and new information is added daily, much as already happens on the traditional Internet.
Thursday, November 10, 2005
Update from the House
The House leadership had to delay the vote on the proposed budget cuts because moderate Republicans have broken ranks, in part because of the harshness of the cuts to the poor, and in part because of the overwhelming negative feedback they are receiving about drilling in ANWR. I heard an interview with a Republican congressman (I'm blanking on his name) who received 1,600 calls and emails in the last two days asking him not to vote for the proposed bill because of ANWR. Interesting. We'll have to wait and see how much arm-twisting the leadership puts on the moderates to try to get the votes next week. They are likely nervous about not getting their additional tax cuts...
Wednesday, November 09, 2005
Once Again, a Call to Arms
As was the case about two weeks ago, try to contact your Representative before a likely vote tomorrow on a bill that would cut $50 billion from education, food stamps, college aid, and aid for the elderly and poor. The House leadership claims this is to pay for hurricane relief and to reduce the deficit (when is the last time you heard the leadership worrying about the deficit...give me a break). However, doubting Thomases like me believe the real reason is a new round of $70 billion in tax cuts, which the rumor mill says is likely to come up later this month (I guess this would make for a pleasant Thanksgiving for the well-to-do). How much of those tax cuts do you think you will see? Let's make sure those who struggle to put food on the table can do so during the holidays.
Saturday, November 05, 2005
Speaking of Copernicus....
In a previous post today, Nicolaus Copernicus came up. Copernicus is, of course, the Polish 'revolutionary' cleric who is given credit for the heliocentric model of the solar system. I just came across a story in which Polish archaeologists believe they have found the remains of Copernicus, who lived from 1473 to 1543. An interesting coincidence...
The heliocentric model changed our entire picture of the heavens, where the earth is not the center of the universe, but rather just one of many objects in orbit around the Sun. In Copernicus's lifetime, the Church dominated all aspects of life, and 'science' did not truly exist. Copernicus did not want to go public with his beliefs for fear of persecution by the Church, and it was Galileo who took the torch and promoted the heliocentric model. Galileo, in turn, found himself in conflict with the Church. We should be grateful to these two men, who were largely responsible for the birth of what we now call science (particularly Galileo, who promoted experimentation and observation as the basis for one's conclusions; Isaac Newton was Galileo's successor and gave birth to rigorous mathematical science).
Note to Commenters
Because of spamming in the 'comments' sections of my posts, I have activated word verification for anyone who wants to post comments. Comments are too important to turn off, since I learn a great deal from them, and good, honest discussions and debate are vital for intellectual development and understanding. I apologize for the added inconvenience, but I also want to keep the random crap and advertisements out of my blog. Thanks to Zenpundit for pointing out this option!
Vatican Cardinal Suggests Faithful Should Listen to Scientists
Cardinal Paul Poupard, who heads the Pontifical Council for Culture, said Friday that it is important for the faithful to pay attention to what science has to offer on various issues, because religion risks turning into 'fundamentalism' if it ignores scientific reasoning. He then went further than Pope John Paul II did with his comments on evolution. In 1996, the Pope said "evolution is more than a hypothesis," and this followed a 1992 declaration that the church's 17th-century denunciation of Galileo was an error resulting from "tragic mutual incomprehension." Galileo was condemned for supporting Nicolaus Copernicus' model in which the Earth revolves around the sun; church teaching at the time placed Earth at the center of the universe.
"The permanent lesson that the Galileo case represents pushes us to keep alive the dialogue between the various disciplines, and in particular between theology and the natural sciences, if we want to prevent similar episodes from repeating themselves in the future," Poupard said.
I could not agree more. In addition, I do think that science needs to pay attention to and respect what religion has to offer. Good examples come in the form of guiding principles for scientific ethics, as I addressed in my last post, and where morality fits into scientific research. Religion can help keep good debates going in areas like nuclear weapons research, cloning, limits on medical research on humans, and so on. Without a moral compass, science can go to extremes, as evidenced by Nazi 'medical research' in the 1930s and 1940s.
Most humans practice some sort of religion, so science cannot ignore this. As for the intelligent design debate, Poupard expanded on John Paul's comment and stated that evolution is supported by physical evidence, and that this evidence is constantly growing. This is why evolution should be, and is, the overwhelming, dominant scientific model for how life evolved. Keep in mind that evolution does not state how the first life began; rather, it describes how more complex life evolves from simpler life forms. It describes the process by which we now have such an amazing variety of life. Cardinal Poupard suggests, as I also believe, that religion and science do not have to be mutually exclusive in this debate. To be honest, I personally prefer and believe the intelligent design model, which takes as its premise that a supernatural entity called the 'designer' designed a system as complex as a human being. But I realize my belief in this 'theory' comes only from my personal religious faith and the way I was raised, since I am a Christian. I absolutely do not support, however, including intelligent design in science curricula, because neither I nor anyone else has any physical evidence for a 'supernatural entity' (so, ID supporters out there, let's cut to the chase and say the designer is God). ID is not a scientific theory, but rather a theological and philosophical model.
I get the impression that Cardinal Poupard is saying the same thing, and I am glad to see he understands that both science and religion are important and should have mutual respect, but that there are also boundaries that keep these two different realms of thought and practice distinct, and that is OK! The same can be said for philosophy. There is a place for creationism and a grand 'designer' of life, but that place is in religious and philosophical venues, where one can accept ideas that may not have physical support. I would not want to advocate or mandate the teaching of evolution in such a venue, because that is not its proper place. Likewise, science has a mandate to find the physical reasons why the universe works the way we observe it to, and nonphysical models are not appropriate in the teaching of science.
Wednesday, November 02, 2005
The Need for Ethical Science
A couple of months ago, a study published in the journal Nature reported that about one in three scientists doing research admitted to poor ethics, which typically means they fudged data at some point in their studies. The study surveyed over one thousand researchers.
This is disturbing. The goal of science is to find the truths of Nature. When scientists submit articles for publication in the top journals of their respective fields, it is assumed the primary research was completed with honor and respect for the truth, whether or not what was measured and observed fits with preconceived ideas, attitudes, or beliefs. While articles submitted to the major journals are peer-reviewed and go through a series of revisions, it is impossible for the reviewers to be one hundred percent sure whether an honest effort was made. If a dishonest paper makes it through the process, it may be years before the dishonesty is discovered, most likely through independent checks of the experiments that lead to opposing or conflicting results and/or conclusions.
One trend that has developed in fields such as biochemistry, genetics, and biomedical engineering is large numbers of researchers founding private companies that try to develop a new drug, procedure, or technology for specific needs, in addition to their university research. Private enterprise brings with it the ultimate motive to slightly tweak data or to draw overly optimistic conclusions: monetary profit, of course. In fact, when I was talking with acquaintances in the administration of the medical research facility of a major university hospital, I was shocked by the lack of collaboration and sharing of information on cancer research between their labs and another nearby major university facility. The reason given was largely the race to get patents and private funding from pharmaceutical and biotechnology firms. With potentially billions of dollars at stake in such areas of research for the next wonder drug or life-saving procedure, of course some may weaken under the pressure to win the race and miss something in the data, be a tad sloppy with the analysis, or feel the need to make results just a tad better than they really are in order to get a piece of the money pie. Such actions not only go against the scientific code of ethics, but can also conceivably delay progress in an entire field, which can mean unnecessary deaths. It is likely that real collaboration and sharing of knowledge between the two university facilities could lead to breakthroughs in a more timely manner than the two working in secrecy, but the need for money and glory gets in the way, and all of science is weakened when unethical behavior enters the fray.