A site for science (especially physics), education, and political news, views, commentary, and debate.
Monday, July 31, 2006
NASA Mission Statement Changed
The problems NASA climate expert Jim Hansen has created for the White House probably have much to do with the change in NASA's mission statement, which used to include the line "to understand and protect our home planet." That line has now been cut entirely, which certainly puts pressure on NASA and other government scientists to back away from studying global climate change. Climate research has been a growing area of study within NASA, since the agency obviously has the tools to do the science from space. As Hansen points out, perhaps the White House is trying to eliminate research that is creating headaches for Bush.
Rewriting Science, the Administration Way
There was a fascinating, and extremely disturbing, story on 60 Minutes this past Sunday. I believe it was a replay, as I missed it when it was originally aired.
The story included an interview with NASA climate expert Jim Hansen (generally considered the top expert on the planet in his field), who risked his job by going public with some White House activity. Ever since the Bush administration came to power, there has been an ongoing campaign of overlooking, questioning, and simply ignoring the science behind several issues, most notably global climate change. It is well known that it took a number of years before the president would even accept that the global temperature is rising, and even longer to finally acknowledge that humans have something to do with it. But what scientists have been complaining about for years now is the unprecedented way in which this White House censors government scientists and restricts their results from reaching the public. Now, every administration tries to put its own spin on science. Hansen, for instance, mentioned that he is politically an independent, and that during the Clinton administration officials wanted him to spin the science to make global warming and climate change appear worse than the data suggested. Hansen correctly refused, making sound scientific recommendations and reporting conclusions based on available evidence, not political goals.
However, something that has begun to happen with the Bushies is that all scientific reports from government research scientists must go through a White House editing process before going public; and it is not scientists in the White House who are doing the editing...it is lawyers and administration officials. This is nonsense!!
For example, any reports published for public review on environmental and climate issues are run through the office of Phil Cooney, the chief of staff for the White House Council on Environmental Quality. It is Cooney who has personally edited the science reports coming from the government agencies that study such topics. The first problem is that Cooney is an attorney, and he is marking up and removing scientific evidence and conclusions from research papers and memos. I think every rational person would see a problem with an editor who has absolutely no expertise in technical subjects. The second problem is that Cooney was formerly a star lobbyist for the petroleum industry. The 60 Minutes piece had numerous examples of the edits made by Cooney, as they got their hands on actual copies of the papers he edited, complete with his handwritten marks. My favorite is the complete deletion of a paragraph explaining how and why energy production contributes to global warming through the high levels of greenhouse gases released in the relevant chemical reactions. He did not even try to fudge the language; he simply crossed it out. The papers submitted to congressional committees contained all the edits and deletions made by Cooney, so Congress never saw the actual science it would need in order to act on the problems at hand (since according to the White House, owned and run by big oil, there are no problems). This is despicable in my view, and it takes misinformation and disrespect for science to an entirely new level...and it deals with a major issue that will likely explode into global environmental damage and economic chaos in the near future.
It should be noted that Hansen has not been allowed to take other interviews, and during the 60 Minutes interview a NASA public relations official sat just off camera; if he had said something the administration did not want aired, she would have stopped the interview on the spot. This is not what I want in the United States...it is the mark of an administration in denial, one that has placed political ideology and outright ignorance ahead of facts and science. It sounds like a story out of the former Soviet Union.
In my opinion there is no question that the numerous reports of White House tampering with intelligence prior to the invasion of Iraq are true. This White House in particular has gone beyond the normal spin all politicians engage in, creating an environment where the expectation is to censor, edit, misinform, and select only those morsels of information that work in its favor. We have seen the results of that misinformation in Iraq, and I fear these eight years of Bush denial and total inaction, which are critical years in the fight against environmental damage, will show similar results in several decades. America and the world deserve better, and I think the single best thing that can happen in the near term is for the Democrats to win back the House in the fall election. We need a buffer to stop the shift to the far right in our government, force it back toward the middle, and begin to put science, facts, data, and evidence back into policymaking.
Monday, July 17, 2006
NAACP President Looking for Action
NAACP President Bruce Gordon stated in his speech at the national convention that Black Americans need to end "victim-like thinking" and take advantage of the opportunities that exist today to begin lifting more people of color out of poverty. This year's convention is being held in Washington, DC, and the hope is that the president, who has never appeared at an NAACP event, will come (perhaps the problems in Black precincts during the last two elections have something to do with the tension between the president and the Black community).
With nearly a dozen years of experience working with minority students and parents, I've concluded that the single biggest obstacle perpetuating the achievement gap between white students and students of color is cultural in nature, and the attitude taken by Gordon is certainly a positive step. One grand experiment is Project Excite, which is essentially looking to see whether a critical mass of minority students who excel academically can start a domino effect, where being smart (which most of the students are) turns into acting smart, and where students of color come to believe it is OK (and not an act of "acting white") to take advanced classes and aim for top colleges. Politically, the minority voting blocs are large enough to decide elections, and time will tell if there is the motivation on a large scale to take advantage of that potential.
Friday, July 14, 2006
Will we ever be able to predict what social systems and networks will do? Perhaps globally, but likely not locally
There is a lot of interest in social systems and networks, and in the use of network theory to help explain how and why social systems work the way they do. While research has shown, for instance, how different rulesets lead to different decisions, or how network topology helps explain how disease spreads, one needs to keep in mind that there is a difference between local environments and global environments. What I mean is that the analyses done in these areas of study essentially look at results that affect the system globally. It is quite another thing to predict what happens to individual agents, since in complex systems the rules that govern individuals can be, and typically are, very different from the rules that govern collective behavior.
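To make the local-versus-global distinction concrete, here is a minimal sketch of a stochastic disease-spread simulation on a network. It is a generic toy in Python (assuming the networkx library and a random graph, not any particular study's model): run it many times and the global outbreak size is fairly stable, while the fate of any single individual varies wildly from run to run.

```python
import random
import networkx as nx   # assumed dependency; any graph library would do

def sir_run(G, p_infect=0.2, p_recover=0.1, seed_node=0, rng=None):
    """One stochastic SIR epidemic on G; returns the set of nodes ever infected."""
    rng = rng or random.Random()
    infected, recovered = {seed_node}, set()
    while infected:
        # Local rule: each infected node tries to infect each susceptible neighbor.
        new = {nbr for node in infected for nbr in G.neighbors(node)
               if nbr not in infected and nbr not in recovered
               and rng.random() < p_infect}
        still = {n for n in infected if rng.random() >= p_recover}
        recovered |= infected - still
        infected = still | new
    return recovered

G = nx.erdos_renyi_graph(500, 0.02, seed=1)
runs, sizes, hits = 200, [], {n: 0 for n in G}
for i in range(runs):
    ever = sir_run(G, rng=random.Random(i))
    sizes.append(len(ever))
    for n in ever:
        hits[n] += 1

# The global behavior is comparatively predictable...
print(f"mean outbreak size: {sum(sizes)/runs:.1f} of {G.number_of_nodes()} nodes")
# ...but any one agent's fate is not.
print(f"node 123 was infected in {100*hits[123]/runs:.0f}% of runs")
```

The local rules here are trivial coin flips, yet even so, only the aggregate quantities come out well-determined; with real human agents the gap between the two levels only widens.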
In a physical system this is similar to studying gases. We can in principle use Newton's laws to predict what should happen to individual atoms and molecules, but collectively we need to resort to a statistical/probabilistic approach. Collectively, there are set probability distribution functions for something like molecular speed, but that is meaningless to an individual molecule of the gas. In social systems, we are dealing with complex, unpredictable individual agents that make up the system, and this makes things considerably more difficult to analyze than a gas, whose individual agents are governed by deterministic rules (at least to a good approximation using classical physics). It will be quite difficult to accurately model emotion and religious fanaticism, for example, for individuals in a social system. We can guess and try to take a statistical approach, but this leaves some degree of uncertainty in results and predictions. It will be very difficult to model and predict what is going on in the head of a leader such as Osama bin Laden; there is a good deal we can only guess at, even though there has been research and progress in figuring out how his larger terror network operates and is structured. This is the difference between local and global environments and rulesets.
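The gas example can be written down in a few lines. This is a toy illustration, assuming a nitrogen-like gas at room temperature: each molecule's speed is drawn at random, yet the sample's mean speed lands right on the Maxwell-Boltzmann prediction ⟨v⟩ = √(8kT/πm), which is meaningless for any one molecule.

```python
import numpy as np

# Assumed parameters: nitrogen-like molecules at room temperature.
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # temperature, K
m = 4.65e-26         # mass of an N2 molecule, kg
N = 1_000_000        # number of molecules

rng = np.random.default_rng(0)
sigma = np.sqrt(k_B * T / m)               # per-component velocity spread
v = rng.normal(0.0, sigma, size=(N, 3))    # Maxwell-Boltzmann velocity components
speeds = np.linalg.norm(v, axis=1)

mean_theory = np.sqrt(8 * k_B * T / (np.pi * m))   # <v> = sqrt(8kT / pi m)
print(f"sample mean speed: {speeds.mean():7.1f} m/s")
print(f"theory mean speed: {mean_theory:7.1f} m/s")
# Any single molecule's speed is anyone's guess:
print("five individual molecules (m/s):", np.round(speeds[:5], 1))
```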
Whether it is trying to figure out economies, decision-making, trade networks, terrorist organizations, or worldwide transportation networks and systems, we will likely always be in a better position to understand the global structures and behaviors of those networks than what happens to individuals and local segments of the larger network. There will be a 'fuzzy' area where the transition between the local environment and the global one takes place, similar to the fuzzy region where relativistic effects become significant compared to Newtonian predictions, or the boundary between classical and quantum systems. These are areas of study with no clear-cut borders, and much of the answer depends on what level of sensitivity and precision you are interested in. So it goes, too, for social systems, and those who are interested in such areas of study need to keep this concept in mind. As in physical science, the difficult part will be determining how large the error bars are on results and predictions, which will depend on how sensitive the global system is to local perturbations caused by individuals within it.
Monday, July 10, 2006
Quantum Biology Pushing Computing Technology
Many have heard of areas of science such as quantum mechanics, biochemistry, biophysics, physical chemistry, and so on. One growing area that has not received much popular attention is quantum biology, which looks at biological processes at the molecular and atomic scale, where weak but relevant quantum effects help drive various biochemical reactions, as well as energy-conversion processes such as light into chemical energy. Look at some specific projects here. Another significant aspect of these studies involves computer simulations of such reactions and biological processes. For example, for the first time, a group at the University of Illinois at Urbana-Champaign (UIUC), led by Prof. Klaus Schulten, has done a full-blown, atom-by-atom simulation of an entire life form. It sounds crazy, but this is the level the science is at presently.
UIUC is home to one of a handful of National Supercomputing Centers, and the Schulten group ran a simulation of the satellite tobacco mosaic virus, which consists of about a million atoms in total. Over 100 days, they simulated a 50-nanosecond time interval, tracking how every atom behaves, and could therefore map all the processes occurring in the virus for that period. Now, this doesn't sound like much...only a 50-nanosecond interval. But to study longer intervals, new computing schemes are necessary and are being developed. For example, at UIUC work is being done to advance to the next level of computing power, the petascale computer (a thousand trillion calculations per second; current supercomputers are in the terascale range, only a trillion calculations per second, while most home PCs are in the gigascale range, billions per second). The simulations being done by such groups would take an estimated 35 years on a home PC, which gives a good sense of how advanced supercomputing platforms are. The point of such simulations is to get a handle on all the behaviors of something like a virus. Viruses are the focus because of their relative simplicity (no simulations of, say, humans will be possible any time soon) and because of medical research, where molecular medications may be developed (using nanotechnology) that are effective against a particular harmful virus.
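As a back-of-the-envelope check on those scales (assuming, since the article doesn't say, that the 100-day run sustained an effective terascale throughput of 10^12 calculations per second):

```python
# Back-of-the-envelope check of the compute scales quoted above.
SECONDS_PER_DAY = 86_400

tera = 1e12                       # assumed sustained supercomputer rate, ops/s
run_seconds = 100 * SECONDS_PER_DAY
total_ops = tera * run_seconds    # total work in the 100-day run

# What home-PC speed would make the same job take the quoted 35 years?
pc_seconds = 35 * 365 * SECONDS_PER_DAY
implied_pc_rate = total_ops / pc_seconds
print(f"total work: {total_ops:.2e} operations")
print(f"implied home-PC rate: {implied_pc_rate:.1e} ops/s")  # ~8e9: gigascale, as stated

# The same job on a petascale machine (1e15 ops/s):
peta_days = total_ops / 1e15 / SECONDS_PER_DAY
print(f"petascale time: {peta_days:.1f} days")   # ~0.1 days, i.e. a couple of hours
```

Under those assumptions the numbers hang together: 100 days of terascale work is about 35 years at a few gigaflops, and would collapse to a couple of hours at petascale, which is exactly why the longer simulated intervals have to wait for the next generation of machines.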
Wednesday, July 05, 2006
Is it just me, or is this something we should address - Scientists making connections with the public
I was just made aware of a survey in England. Nearly 1,500 scientists were surveyed about making connections with the public, such as giving popular talks, going into classrooms to talk about their work and encourage students to pursue science, and so on. Nearly two-thirds of respondents (64%) said they were too busy to do any sort of public outreach and instead needed the time to raise funds for their departments, i.e. grant writing. A majority of the respondents thought it was not important to go into schools, participate in public debates, or do media interviews. This sort of thing is viewed as 'fluffy' and not a good career move (my guess is this was the view mostly of non-tenured faculty). The Royal Society put out a statement saying scientists need to be encouraged somehow to get their work out into the public arena.
In this day and age, when science and technology drive the global economy and scientists complain about funding cuts and the public's lack of basic scientific knowledge, these results surprised me. Perhaps we are starting to wake up here in the U.S., where many NSF grants, for instance, require a small public-outreach component. This is actually a good time for schools to approach universities and attempt to collaborate, since many university researchers may consider forming a program or project with local schools in order to have something concrete to put in their grant proposals. I've personally written four letters of support for Northwestern professors in the past year. But it sounds as if overseas this is not yet the case. My only hope is that federal funding agencies in the U.S. do not take such requirements out of grant RFPs.
I think it is true that the general public is largely scientifically illiterate, and that scientists have done a poor job of getting their message out clearly enough that the public cares about what science is and how vital it is to our way of life. If scientists are unwilling to get the message out, help schools, and get involved in debate, then perhaps we should not be so miffed when a significant portion of the public demands intelligent design in science classes and rejects the science of global warming. I would have to think it is in the best interest of the scientific community that the message gets out to the public, and that institutions should encourage it in some way. In addition, with looming shortages of scientists in the near future, one would think scientists would want some contact with the next generation, to encourage them to pursue science, math, and engineering, as well as science education.
Monday, July 03, 2006
Quantum Computing via Quantum Interrogation
Here is one of the strangest things I've ever heard of. Prof. Paul Kwiat's group at my alma mater, the University of Illinois at Urbana-Champaign, has obtained the answer to an algorithm on a quantum computer without ever running the algorithm (see Nature 439, p. 949-952, 2006). Now, quantum mechanics is bizarre no matter how one looks at it, but this is as counterintuitive as it gets, and it is built around the phenomenon called quantum entanglement.
The idea is to use an interferometer, a device that splits a beam of light down two perpendicular arms. This was the device used in the classic Michelson-Morley experiment that ultimately showed there was no ether, and that helped lead to Einstein's relativity theories. If a laser is used as the light source, then one has a system of photons that are all in the same quantum mechanical state. Quantum computers are being designed in which bits, the binary digits that are the 1's and 0's in any computer, are encoded in quantum states of the particles in the system, such as spin. What's more, quantum mechanics is built around the probabilities of particles being in one state as opposed to another, and the quantum bits (or qubits) may be placed in superpositions of 1 and 0. In Kwiat's computing system, photons from a laser were entangled.
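For the curious, the superposition and entanglement ideas can be captured in a few lines of state-vector arithmetic. This is a generic toy in Python/NumPy, not a model of Kwiat's actual optical setup:

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# A qubit in an equal superposition of 0 and 1:
plus = (ket0 + ket1) / np.sqrt(2)
print("P(0), P(1) =", np.abs(plus) ** 2)           # [0.5, 0.5]

# An entangled pair (Bell state): (|00> + |11>) / sqrt(2)
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)
probs = np.abs(bell) ** 2                           # over |00>, |01>, |10>, |11>
print("P(00), P(01), P(10), P(11) =", probs.round(2))
# Measuring one photon instantly fixes the other: only 00 and 11 ever occur.
```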
As Kwiat describes it in Physics Illinois News, "By placing our photon in a quantum superposition of running and not running the search algorithm, we obtained information about the answer even when the photon did not run the search algorithm." This was the first time that a theoretical possibility known as 'counterfactual computation,' or inferring information about an answer even though the computer did not run, was successfully demonstrated in the lab. The goal is to use this quantum interrogation scheme to reduce noise in larger-scale quantum computers, a key technical and engineering problem in this field. Large-scale quantum computing is perhaps the Holy Grail of computing and encryption, and it is experiments like Kwiat's that are leading the way to that type of technology.
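Kwiat's actual interrogation scheme is considerably more elaborate, but the core trick of learning something without the photon ever "doing" anything shows up already in a simple two-arm interferometer model (the Elitzur-Vaidman thought experiment). Here is a toy calculation of that simpler case, assuming idealized 50/50 beamsplitters:

```python
import numpy as np

# Toy Mach-Zehnder model of interaction-free measurement, the idea behind
# quantum interrogation (Kwiat's real scheme adds the quantum Zeno effect).
BS = np.array([[1, 1j], [1j, 1]], dtype=complex) / np.sqrt(2)  # 50/50 beamsplitter

photon_in = np.array([1, 0], dtype=complex)   # photon enters port 0

# Empty interferometer: the two paths interfere and detector D0 never fires.
out_empty = BS @ (BS @ photon_in)
print("empty:   P(D0), P(D1) =", np.abs(out_empty) ** 2)   # [0, 1]

# Absorbing object in arm 1: it removes that path's amplitude.
mid = BS @ photon_in
p_absorbed = np.abs(mid[1]) ** 2               # photon hit the object: 1/2
mid_blocked = np.array([mid[0], 0], dtype=complex)   # only arm 0 survives
out_blocked = BS @ mid_blocked
p = np.abs(out_blocked) ** 2
print(f"blocked: P(absorbed)={p_absorbed:.2f}, P(D0)={p[0]:.2f}, P(D1)={p[1]:.2f}")
# A click at D0 (1/4 of the time) reveals the object even though the
# photon demonstrably never touched it.
```

The normally dark detector firing tells you an object is in the arm without any photon having interacted with it; replace "object in the arm" with "quantum computer running the algorithm" and you have the flavor of counterfactual computation.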