Saturday, December 30, 2006

US Should Absolutely Try to Get the International Linear Collider

I've been meaning to write about this for a while, and a post by Zenpundit finally got me going. I could not agree more with an article in Seed that argues the U.S. needs to make a strong bid to have the International Linear Collider (ILC) built in the States. A likely spot for construction would be the current Stanford Linear Accelerator Center (SLAC) site.

The U.S. presently has the world's most energetic particle physics facility at Fermilab, but its days of world dominance are numbered. The Large Hadron Collider (LHC) is expected to be commissioned late next year or in early 2008 at CERN, the European facility in Geneva, and it will run at roughly seven times the energy of Fermilab's Tevatron. Of course, the most frequent question any particle physicist gets from students, family and friends, the general public (who would likely pay for a good portion of the ILC if the U.S. gets it), and politicians is, "Why on earth would we spend multiple billions of dollars on particle research?" That is a fair question, and one that must be answered in this age of record budget deficits.

Particle accelerators are the necessary tools to study the basic constituents of matter and the fundamental forces of Nature. This is what particle physics is all about. But what many people do not understand about science and technology is that there are generally two types of science, pure and applied. I've posted on these before, including the panel that was formed to determine the best course for particle physics as well as pure versus applied science. While I am the first to admit that determining the mass of the top quark means nothing to the average person, and top quarks are not going to have any direct applications that improve one's life, gaining knowledge has worth in itself. Human curiosity has no bounds, and we are a species driven to find answers to the questions we develop. How did the universe begin? What are we made of? What makes the universe tick the way it does? These are fundamental questions we all ask at some point, and particle accelerators have been the tools used to start finding the answers. This is pure science, and we never know what some new discovery will lead to in the long term. Scientists do not have crystal balls, and cannot know what applications will exist if the fundamental knowledge is not there.

But many still have a difficult time justifying the costs of a machine like the ILC. So we can think of it this way: Fermilab has more than paid for itself over its lifetime. In fact, it has paid for itself many times over. Why would I say this, after saying a major discovery like the top quark has no direct applications? Because there are indirect benefits and applications that develop from the technology created to do this type of work. Building accelerators that are many miles long and that push antimatter and subatomic particles to essentially the speed of light is not a matter of going to Radio Shack and buying the hardware one needs. The technology did not exist when the blueprints were drawn up. Scientists and technicians had to work over many years to build the machinery, write the software, and develop the electronics and computing power that eventually led to the accelerator and the various experimental detectors at these major labs.

In the marketplace, these types of technologies were, at the time, nonexistent and meant nothing to society. As the technology developed, however, think of the following spin-offs: personal computers, the Internet, particle detection systems that now form the basis of detectors being developed for homeland security (to detect nuclear materials, for example), laser applications, fiber optic technology, superconductors and superconducting magnets that now allow MRIs to be available in hospitals, new levels of technological complexity (my old experiment, CDF at Fermilab, has to coordinate a couple hundred thousand individual lines of data to recreate an event, see if it is worth keeping, record it, and reset the detectors in a tiny fraction of a second...it is amazing it works), and even new experience in tunneling technology to dig the vast tunnels several stories below ground. Engineering breakthroughs were required to get one of the most complicated machines in history working. New cancer treatments have been discovered, such as the neutron therapy center at Fermilab that has treated thousands of cancer patients. And yes, the military has been dabbling with particle beam weapons for years. A large lab employs several thousand people. And, something one cannot really put a price tag on, these massive laboratories are training grounds for generations of American scientists, engineers, and technicians.

We live in a technology-driven world. New technologies develop at places where new questions are asked and new solutions are required. Creative solutions and problem-solving flourish. And new applications we do not dream of now will undoubtedly arise over time. The U.S. can either make the investment for the long-term health of the scientific and technological base that has led to its status as the world's only current superpower, or it won't, and some portion of the next generation of scientists will leave and go where the experimental facilities are located. We blew it with the SSC back in the early 1990s when Congress pulled the plug. Let us not repeat history and allow another major science facility to go elsewhere.

Saturday, December 09, 2006

Physics is a Good Domain for Horizontal Thinking

Well, Zenpundit had a thought-provoking post about what field of expertise might be best as a vertical thinking domain that would lead to productive horizontal thinking. Among his possible choices was physics, which is, of course, near and dear to my heart. Simply because of personal bias, I would have to say physics is the best domain to start from in terms of horizontal productivity (besides, physicists are known for being quite arrogant about the range of problems, essentially everything, they feel trained to tackle). But when I think about this seriously, it seems to make the most sense, at least to me.

Physics deals with fundamentals. It is the branch of science that looks to understand the quantities and phenomena that literally make up everything in the universe. In order to do high-level physics, mathematics, another field of study on Zen's list, is essential. So is mathematics a more important domain as far as making progress horizontally? I guess I swing back to physics only because, in the end, to solve real problems, one must have at least one eye that can see reality. One can also look at history: Isaac Newton, not a bad horizontal thinker/visionary himself, had to create calculus in order to solve a physics problem: gravity. I think one of the great examples of horizontal thinking in all of history was Newton's great leap that the force making an apple fall is the same as the force keeping the moon in orbit. That is not at all obvious to mere mortals!

Because physics is a science, it tackles problems through logic, common sense, observation, and experimentation. It studies the basic ingredients of the universe: energy, matter, and forces. And it is built around the idea of finding the relationships, or interconnectedness, between all physical quantities for any physical system, no matter how simple or complex. It is the combination of these three features, mathematical preciseness and logic, fundamentals, and interconnectedness, that would allow a trained mind to expand on and attempt to tackle the most complex problems. It is the nature of a physicist's mind to think we may be capable of a true 'theory of everything.' Now that is arrogance, but it may turn out not to be that far-fetched an idea!

It appears that using physics as a 'training ground' for horizontal breakthroughs is already playing out. The most intriguing areas in human thought right now tend to deal with complex systems. How is globalization going to affect both local and global societies and economies? What are the political, environmental, military, and socioeconomic consequences of global climate change? How do geopolitical hotspots, such as the Mideast, affect the global economy? What is the nature of terror organizations? Where does religion fit into the mix as far as East-West relationships? Now, in each of these examples, complexity reigns supreme because each big question being considered involves multiple interacting agents that make up a given system. In complexity, the interrelationships between the quantities or principles are key to understanding how the system is going to evolve. This is the essence of what physicists do, and how they are trained to think and analyze problems. And physicists have an advantage over mathematicians...not only are physicists trained in advanced mathematics and abstract thinking, but they are also trained as scientists, and are driven to always base conclusions on some type of real evidence - some kind of connection to the real world.

Already, domains of study such as economics have begun using mathematical analysis techniques developed by physicists to revolutionize economic theory. Econophysics is being born. Chemistry and biology are working at the molecular and atomic level, which is the realm of the physicist. Technology is driven by nanotechnology and electronics, the realms of physicists (both classical electromagnetic theory and quantum mechanics). Engineering in general is essentially applied physics. The exploding realm of computational science was launched by theoretical physicists. And, going back to Newton, even the notion of applying mathematical analysis to real systems began with physics questions. Such mathematical analysis now dominates areas such as network theory and complex systems, which include social systems. Even modern areas of psychology, from a research perspective, are at the level of looking at information dispersal and signal processing in neural networks in terms of electrical pulses at the molecular level, which is a biophysical process.

In the end, physics, or at least a physicist's mentality and approach to problem solving, will likely lead to many horizontal breakthroughs in the future. However, I happen to believe certain issues cannot be thoroughly analyzed without some amount of historical analysis. Zen and I have had some amazing discussions over many years by combining historical features and precedents with technological and scientific advancements (which tend to throw off historical analogies, since the hyperspeed with which technology expands on a global scale is in fact creating situations with no historical analogs), so attacking some modern problems will require a mix of domains (i.e. consilient analyses), to be sure. New visions can also occur in unexpected ways, where accidental discoveries might trigger some new thought, or where a creative mind was trained in some field not directly related to a given problem. In the information age, some groups understand that it is imperative to build working teams of people trained in multiple disciplines, but much more of this will be needed in order to tackle the truly complex problems that affect the world presently.

Sunday, December 03, 2006

Unintended EMP strike

A quick story I just found. A military (Air Force) radio signal was being tested in Colorado that would be used to communicate with first responders during some future disaster...the problem is that it sits in the same electromagnetic band as the signals used by some 50 million garage door openers. Hundreds of calls were received from residents who could no longer operate their garage doors. While this is a bit amusing, it should also keep in the front of our minds how easy it is to cause widespread disruptions of everyday life with common, cheap technology. We need to have plans in place for a future EMP attack, with redundant and resilient features built into our electronic, computerized society.

Monday, November 20, 2006

Woodland Consolidated School District 50 - Running for Board of Education

I will be taking on a new challenge before long: running for the school board of the elementary and middle schools my kids are and will be in. There is the old saying that 'all politics is local,' and the village hall and local school district have the biggest impact on a community. I've devoted my adult life to helping kids in the classroom, and now it is time to try and help at the community level. I like to think I have a broad range of experiences that will make me useful on a school board, and I know what goes on in schools and, most importantly, in classrooms. Otherwise, what would be the point of running?

Education is the one thing that cannot be taken from an individual. A good education opens doors and gives a person options and opportunities in life, and nothing is more important to me than giving my own children a good school experience in which they can grow. The first step is to get the signatures, and then do some additional paperwork to get on the April 2007 ballot. Campaigning will soon follow. I'm excited that the present Board has begun 3-5 year strategic planning, in which I have been involved through a community committee, and if I am fortunate enough to get elected I can play a direct role in making sure priorities are set in such a way as to develop a strong school experience that will help our kids reach a point where they can truly compete in a global community, rather than just a local or national one.

Saturday, November 11, 2006

Perhaps Environment will be a New Focus After the Election

I am more hopeful, after this last election in which a Democratic tidal wave overtook the nation, that environmental issues and global climate change will get more attention and, most importantly, some actual action. The past six years of complete Republican control of the government have set back environmental agendas and action, even as mountains of evidence and environmental change have been rapidly accumulating worldwide. Yet another report came out last Thursday: the famous glaciers on some of Africa's mountains are melting and receding at unprecedented rates. For instance, the glaciers on Mt. Kilimanjaro have been reduced by a staggering 80% over the past century, and those on the Rwenzori mountains (between Congo and Uganda) have been reduced by 60%, as temperatures rise in Africa. Runoff from these glaciers provides the region with some of the rare fresh water the people get, and if these glaciers disappear entirely, as could happen within only a couple more decades, the only source of water during the dry season will also vanish. We will see mass migrations of people if and when this occurs, which is not what one wants in an already troubled region of the world. Water supplies will be threatened in similar ways around the world if climate change continues to progress at the accelerating rates we have been seeing over the past few decades.

It is imperative that something, anything, gets done soon in the U.S. so that we begin to do our part on the environment. It is in both our interest and the world's interest that the current leader in the production of greenhouse gases take a leading role in cleaning up this mess, and the new Democratic leadership in Congress can have an impact, as they will get to set the agenda come January.

Monday, November 06, 2006

Get out and Vote!

It is clear that this is an important midterm election. There is a divide in the country about the best path for the last two years of the current administration, and it is time to use one of the most sacred rights we have, the power of the vote, the power of numbers, to let those in leadership positions know your view. Get out and vote, and we'll then see what happens. If you don't at least vote, then I certainly don't want to hear complaints about the way things are or how they should be...get involved if you care, and voting is a great way to do so!

Monday, October 30, 2006

Something we just never hear about - world hunger

I just wanted to make note of a study done by the UN's Food and Agriculture Organization about world hunger. Nearly 860 million people, mostly in developing nations, are severely undernourished. In 1996, a world summit on hunger set the goal of halving the number of undernourished people by 2015, but so far it is estimated that the number has fallen by only about 3 million...a drop so small it is not even statistically significant. The world is richer, there is more food, there is better agricultural technology, and there are better communications and distribution networks and technologies, but virtually no progress has been made when it comes to hunger (and I would like to know what percentage of this group are children, essentially with no future in life). It is heartbreaking to think of these staggering numbers.

Check out Zenpundit - Super-Empowered Individuals

A thought-provoking entry has led to a wide discussion on super-empowerment, or how an individual can single-handedly have enormous influence on some given system, over on Zenpundit. More comments can be found here. My own comment that I emailed to Zen was:

"This post is one of the natural extensions of what we have been discussing. I don't think there is any doubt that it is inevitable. I suppose the 'when' depends on what system is perturbed/attacked. It will be done as our understanding of network theory and complexity advance; to have, say, an individual do tremendous damage, that person will need the means of mapping out and understanding the levels of connectivity inherent to the system, whether that system is social, electronic, environmental, industrial, etc. Even with a lack of understanding of the system's multi-dimensional topology in whatever relevant phase space, I can imagine someone developing and using one of these newer adaptive genetic computer algorithms...this type of program can 'learn' as it crunches data, and can adapt itself to the system. It is along the lines of the programming being tried for intelligent robots, etc. That is probably the scariest scenario to me."

My thinking is that at some point, as these types of algorithms and technology further develop and become more widespread, cheaper, and user-friendly, it will no longer take an expert in the relevant fields to do damage to different systems of concern...some amateur hacker type can just unleash a virus built around such software, and the software will be 'intelligent' enough to do damage on its own. As all aspects of life become computerized at some level, this form of super-empowerment is, in my mind at least, the single greatest technology security issue that faces us in the future (and it is on a level near that of, or arguably equivalent to, nuclear and biological terrorism...while it may not cause immediate death and physical destruction, the potential to adversely affect countless millions of people is there). Resilience in all computer systems and networks is absolutely essential.
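To make the 'adaptive algorithm' idea a bit more concrete, here is a toy genetic algorithm in Python. It is purely illustrative: the bit-string 'genome' and fixed target pattern are invented stand-ins, and a real mapping or probing tool would score candidates against some actual complex system rather than a known answer. The point is only to show how such a program 'learns' through selection, crossover, and mutation, with no human expertise in the loop.

```python
import random

random.seed(42)

TARGET = [1] * 20  # hypothetical optimum the population "learns" to find

def fitness(genome):
    # Score a candidate by counting bits that match the target pattern.
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.05):
    # Randomly flip bits with a small probability.
    return [1 - g if random.random() < rate else g for g in genome]

def crossover(a, b):
    # Splice two parent genomes at a random cut point.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def evolve(generations=60, pop_size=30):
    pop = [[random.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # keep the fitter half
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(fitness(best), "of", len(TARGET), "bits matched")
```

Nothing in the loop knows anything about the problem beyond the fitness score, which is exactly what makes this style of program adaptable to systems its user does not understand.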

Zen, good job as always, my friend.

Saturday, October 28, 2006

A Nice Example of How Science Works - The Case of the Pentaquark

It is clear that most people do not have a good grasp of the fundamental nature of science. I think the fact that a large number of Americans believe that things like creationism/intelligent design should be taught in high school biology classes is evidence of this conclusion. This is why, as a science educator, I am always on the lookout for examples that give a clear snapshot of how science works. A recent example is the case of the pentaquark. A nice, understandable article regarding the pentaquark can be found in Symmetry Magazine, a joint publication of the Stanford Linear Accelerator Center (SLAC) and the Fermi National Accelerator Laboratory (FNAL, or simply Fermilab).

In a nutshell, quantum chromodynamics (QCD), the current quantum field theory that describes the strong nuclear force (responsible for binding quarks into observed particles, as well as holding the nuclei of atoms together), allows for particles that are combinations of five quarks, hence the name pentaquarks. This is very different from the particles we normally observe, which are baryons (3-quark combinations, such as protons and neutrons) and mesons (quark-antiquark pairs). When the possibility of pentaquarks was first theoretically predicted in 1997, experimentalists at a variety of labs around the world began looking for evidence of this potentially strange breed of particle. In 2003, the first announcement that there was some evidence for pentaquarks was made.

This doesn't seem like much so far. A well-established theory predicts something, and when it is looked for, it is found. However, that is just the beginning in science. What many people don't understand about the nature and process of science is that just because one person or one group says they found evidence for something, that doesn't mean we should believe it. Rather, the opposite is true. When new discoveries are announced, the scientific community takes on the role of skeptic. The articles announcing the discovery are gone over with a fine-tooth comb...at least, this is how it is supposed to work. Other scientists in that particular field scrutinize the analysis and methodologies used in the research. Statistical standards must be met within the field in order to announce a discovery. The article is peer-reviewed before even being published. The whole community is supposed to try to find flaws in the work. In the case of the pentaquark, the principle of reproducibility came into play, as independent groups at different labs tried to reproduce the results.

As other groups designed and ran experiments specifically to look for pentaquark signatures and collected greater volumes of data, better statistical results were obtained, and the new conclusion from several independent groups was that there was actually no reliable evidence for pentaquarks. The original studies suggesting there could be this new type of particle were disputed by better experiments and data sets. Does this mean the original experimentalists fabricated their data or did not know what they were doing? Not at all. There could have been a variety of reasons why they reached their conclusions, such as statistical fluctuations in the data, high background rates, detector issues, low statistics, unknown systematic errors, and so on.
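As a toy illustration (not a reconstruction of the actual pentaquark analyses), the following Python sketch shows how a statistical fluctuation can masquerade as a signal when statistics are low and many mass bins are scanned. All the numbers are invented, and the naive per-bin significance deliberately ignores the 'look-elsewhere' effect of having searched many bins, which is one way an apparent few-sigma bump can evaporate once better data arrive.

```python
import math
import random

random.seed(7)

def poisson(mean):
    # Knuth's algorithm for drawing a Poisson-distributed count.
    L, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def max_local_significance(n_bins, bkg_per_bin):
    # Generate pure-background counts in every mass bin (no signal at all),
    # then report the naive significance of the single biggest excess,
    # as if that bin had been the only place we looked.
    counts = [poisson(bkg_per_bin) for _ in range(n_bins)]
    best = max(counts)
    return (best - bkg_per_bin) / math.sqrt(bkg_per_bin)

low_stats = max_local_significance(n_bins=100, bkg_per_bin=5)
print(f"apparent significance from background alone: {low_stats:.1f} sigma")
```

Even with no new particle anywhere in the simulated data, scanning a hundred low-count bins routinely turns up a bump that looks like a multi-sigma excess, which is why reproducibility and larger data sets matter so much.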

The point is, science is always evolving. As technology improves, as new knowledge is developed, and as old, accepted ideas are re-examined under new points of view and studies, if there is evidence that suggests old, accepted theories or ideas are incorrect and need to be modified, then the appropriate changes based on the best new information are made. Perhaps the most impressive example is when young Albert Einstein, with a new, fresh point of view, came out and said that the bedrock foundation of physics, Newton's laws, were fundamentally flawed and simply did not work when objects moved at a substantial fraction of the speed of light. He presented a new theory, the special theory of relativity, which did a better job of describing Nature.

Science is self-correcting. It is skeptical. It challenges us not to accept something the way it may appear at first glance, but rather as it is after exhaustive study. Science bases its conclusions on observation, reality, and evidence, rather than on common sense and unaided logic. If at all possible, scientific conclusions and discoveries should be re-tested independently and either confirmed or disputed. It can be a slow process at times, but this is simply the nature of this realm of human thought and productivity.

Philosophy differs from science in that logic dominates the process. This does not necessarily allow us to accurately describe the world, though, as we found out when it was shown that heavy objects do not fall faster than lighter objects, as Aristotle argued based on logic/common sense, but rather fall at the same rate. We also see a complete loss of common sense and logic in something like quantum mechanics...yet all physical tests of the many bizarre predictions of the theory have confirmed it. Religion also differs in its process of understanding the world around us, as religious texts lay down exactly what should be believed. There is little to no room for skepticism in religion, for one either accepts the word of the Creator or not, and typically it is left at that. And religion lacks physical tests or evidence to prove a Creator exists; rather, one's faith in the Creator is necessary for one's religious development.
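Galileo's result, the one that overturned Aristotle, can even be checked with a few lines of code: in Newtonian mechanics the mass cancels out of free fall, so (ignoring air resistance) a light object and a heavy object take the same time to drop. This is a toy numerical check, not a real experiment; the masses and height are invented.

```python
def fall_time(mass_kg, height_m, g=9.81, dt=1e-4):
    # Step-by-step (Euler) integration of m*dv/dt = m*g, dropped from rest.
    v, y, t = 0.0, height_m, 0.0
    while y > 0:
        a = (mass_kg * g) / mass_kg  # the mass cancels: a = F/m = g
        v += a * dt
        y -= v * dt
        t += dt
    return t

t_light = fall_time(0.1, 20.0)    # a 100 g object from 20 m
t_heavy = fall_time(100.0, 20.0)  # a 100 kg object from 20 m
print(round(t_light, 3), round(t_heavy, 3))  # essentially identical
```

The mass enters the force and the inertia in exactly the same way, so it drops out of the motion entirely, which is what experiment confirms and pure Aristotelian reasoning missed.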

Do pentaquarks exist? The best evidence suggests we have not found them. Does this mean we accept this and never look again? Definitely not. Perhaps in a future experiment there will be some strange signal that appears, and we will get one of the 'accidental' discoveries that are common in science (such as penicillin or X-rays). Perhaps the calculations that led to the prediction were not done accurately and pentaquarks are in reality heavier than first thought, and it will take a new facility to produce them. Who knows? But this is what continues to drive science forward as it tries to figure out how the physical world around us works. And it is a very different approach from what is done in philosophy and religion.

An Amusing Quote...Not

The president is out on the campaign trail in a last-gasp effort to help some GOP congressmen hold their seats. One of his main points is that Democrats cannot be trusted to have control of Congress "because they don't know how to win in Iraq." I guess I have been blinded by reality, having been under the impression that the administration's handling of Iraq is as close to an overall disaster as one can imagine...clearly Bush has the answers for how to win in Iraq (how silly of me to base my own conclusions on the evidence of reality). I want to say that Bush's stump speeches are laughable, but unfortunately things are too serious for our troops to just be sarcastic.

For three long years the GOP-controlled Congress has allowed this president to get us into this mess without any serious objections, oversight, demands for accountability, or suggestions to at least rethink strategy because of the poor results and steady deterioration of conditions on the ground. Only when the polls turned did one see the consistent GOP calls to "stay the course" fade away. Many Republicans who have consistently backed Bush on the war now hope to be seen as independents, including my congressman (Mark Kirk), because they now call for changes in strategy and suggest that timelines, benchmarks, and redeployment need to be considered...the same conclusions reached by most Democrats a year or more ago. They are running for cover, trying to distance themselves from their multiple years' worth of support for Bush's policies on Iraq (and we cannot forget that the Taliban is essentially in control of several regions in Afghanistan, to the point where the top NATO commander said that things need to significantly improve in the next 6 months or else we will lose the Afghan people to the Taliban). Democrats who demanded that we do the job right in Afghanistan, against those who were actually responsible for killing 3000 Americans, before going into Iraq were chastised, and simply dismissed as 'unpatriotic' for daring to question Bush. And the standard line Republican candidates still use against Democrats who want some sort of change in Iraq policy and strategy is "cutting and running," even as the Republicans suddenly say the same thing as the Democrats have been saying. This is ridiculous and infuriating, and as both sides resort to what is likely the most negative campaigning in our history, it is the troops who will continue to suffer because of a lack of leadership from Washington.

I am fairly certain that there is absolutely no person on the planet right now who knows how to achieve true victory in Iraq, largely because I honestly do not know what victory means, and I have not heard anyone give a convincing argument/definition of what victory is. I cannot grasp in my mind (and I have tried) how anyone can listen to Bush's latest speeches and regard them as believable or credible, as if the president knows what victory in Iraq looks like.

Friday, October 13, 2006

Report Slams Teacher Education Programs

This is a report I have been waiting for. When you talk with teachers about the quality and relevance of the courses and programs that prepared them to teach and become certified, chances are most will say that, by and large, the coursework and preparation were not helpful or relevant to what goes on in the classroom. In my own experience, I cannot think of too many things that prepared me for my first teaching job in a Chicago public high school, where over 60 languages were spoken, 75% of my students had English as a second language, and 95% of the 1800 students were from low-income families. I was decidedly unprepared in terms of what to expect and the strategies to use in the actual classes I was teaching, and instead had to learn very quickly on the fly. This is from the NSTA Express:

"Despite some examples of success, the majority of today's teacher-education programs are engaged in a "pursuit of irrelevance," having failed to keep pace with substantial changes in technology, student demographics, and global competition, according to a new report from the non-partisan Education Schools Project. The American Association of Colleges for Teacher Education said it welcomed the report and agreed with some, though not all, of its recommendations. To read the eSchool News article, visit [link]. To read more about the report Educating School Teachers, visit [link]."

If you are a teacher, it is worth a read. Real reform and improvement in student achievement on a large scale will not be possible unless teacher training and education are improved on a large scale. Quality teacher preparation and training in reformed teacher education colleges/programs needs to be a central pillar of any future education policy, without question.

It is time to get some balance back in the federal government

As you might suspect, I tend to agree more with the Democrats compared to the Republicans. Not on all issues to be sure, but a good majority of them.

But as I look at where the country is headed and the major problems looming in the distance, I can't help but think back to what happened in 1994. In that year's midterm election, President Clinton was brought back to the middle after the Republicans took control of Congress. Clinton had been pulled to the left while the Democrats had full control of the government from 1993 to 1995. Most memorable was the national healthcare proposal that Hillary Clinton and her committee developed, which became a symbol for big government and a bureaucratic nightmare.

In my opinion, the GOP takeover of Congress in 1994 was the best thing that could have happened during the Clinton presidency, as far as the country was concerned. It forced Clinton back to the center, which is where I think he wanted to be anyhow (the far left has just as many nuts as the far right). It brought policymaking back to the middle, which is where most Americans are. Sure, there was gridlock, and bickering, and partisan maneuvering, but at the end of the day we had checks and balances back in place that forced both sides toward non-extreme policies and governing.

We are at a point where we desperately need a similar changing of the guard in Congress. The single best thing for the country is undoubtedly for the Democrats to take back at least the House. The far right has helped pull our policy away from the center, and the fact that the GOP has had full control of the government for the last six years has created some real problems. Perhaps the biggest problem of all is the lack of checks and balances, with the executive having nearly free rein over foreign policy. There is a near complete lack of accountability and oversight of the White House. Having such a large degree of power has led to extensive and ever-expanding corruption and scandal among GOP lawmakers (and today Bob Ney did indeed plead guilty to accepting bribes, among other charges). I do hope that the polling data is accurate, and that there is a legitimate chance (and some would say a likelihood) for the Dems to win back the House, and an outside chance of winning the Senate.

US Sweeps Science Nobels

For the first time since 1983, the medicine, physics, and chemistry Nobel Prizes all went to American scientists, five in total. An American also won the economics Nobel Prize. There is a good deal of information on the Nobel Prizes here.

As might be expected, American science educators are thrilled by this development. But one should not forget that while our top students and scientists are typically the best in the world, and the U.S. has by far the largest monetary commitment to research, the science education the average American student receives does not compare well with the rest of the developed world. USA Today has a nice summary article.

Sunday, September 17, 2006

Quick Update - Gas Prices

Well, it has happened even faster than I predicted. It only took 5 days for gas to go from $2.65 to the first station I saw at $2.50 per gallon. This must be the result of the Middle East suddenly being in such a grand, peaceful state, with no more fears or pressures on the world oil supply, and our sudden therapy that has gotten the U.S. over its addiction to oil. China and India are also being good stewards of the environment and have cut their demands and use of oil.

I am expecting prices to balance out around this level for the next few weeks, and even go lower heading into the election. Let's see if that is what happens. And, we'll see if prices just happen to increase after the election. I can't possibly imagine prices will increase significantly, if at all, even if a major event were to occur in the Mideast before the election. Again, time will tell...

Well, prices are down under $2.50, at least in my district, where a Republican incumbent, Mark Kirk, is in a competitive race with Dan Seals, a Democratic challenger. Now, I want to research something, and anyone who reads this should feel free to chime in. As a scientist, I can't help but look for and notice patterns. In other congressional districts around the Illinois 10th, where there really aren't major challenges, gas prices are over 30 cents higher than in the 10th district.

I really want to know why this is, because it has NEVER been like that before! Prices have always been within 10 cents of each other for as long as I have lived here. I would be fascinated to know what the average price is in noncompetitive districts versus competitive Republican districts around the country. And, the Kirk campaign sends out the occasional email campaign update. One was sent out this morning, and the only topic was his gushing over the fact that gas prices have dropped 50 cents within the last 3 weeks, meaning things are fine and dandy under Republican leadership. I find this almost laughable, when literally across the street in the 9th district (a safe Democratic district) prices are still over $2.80. It is hard to conclude anything other than something's up! If this is the case, does anyone know the legalities of price fixing in specific areas? All I can say is Go Dan!!

Wednesday, September 13, 2006

Continuing Decline of Arctic Ice

Just an update, as two new NASA studies show clear and undeniable evidence of rapid melting of winter ice. Since December of 2004, every single month has shown new record declines in the ice cap. And to quantify it better, melting is occurring at an annual rate 10 to 15 times faster than in the past. We (i.e. the U.S.) will continue to do nothing constructive as far as combating global climate change for at least two more years under the Bush administration, but voters had better wake up some day and move environmental issues up the ladder of priorities. Do we not at least owe our kids a better world than the one we inherited, or shall we leave them with the plethora of problems that will occur if the climate continues on the path that is clearly laid out before us? Let's consider this question seriously, shall we?

Monday, September 11, 2006

Let's Collect Some Data on Gas Prices, Shall We?

Here around Chicago something seemingly extraordinary is happening. Gas prices have been coming down steadily and even dramatically. I paid $2.65 today, a price I cannot even remember seeing for a long time. Maybe 2-3 weeks ago it was still over $3 per gallon.

I have to wonder about this. Yes, it is post-Labor Day, but demand has not fallen that much. Worldwide demand certainly is only increasing. The violence in the Middle East, at least in Iraq, has gotten worse, and tensions with Iran have increased, not improved. Al Qaeda just today warned of attacks in the Mideast and against Israel. The price of a barrel of crude is still in the high $60s. These are all the reasons big oil uses time and time again to justify the rise of oil prices over the past two years. So why the significant drops in price? Is Big Oil simply so satisfied with their record and mind-numbing profits that they are being nice to us?

Oh, wait, there may be another reason. Who is Big Oil's best friend, outside of the administration? A Republican-controlled House of Representatives. And I almost forgot, the Labor Day weekend is the unofficial start of the election season's final push, when voters begin to actually pay attention to all the rhetoric and character assassination that is modern American politics. With the GOP in legitimate trouble, with a real possibility of losing the House, I wonder if Big Oil is making the assist... I predict that gas prices will drop to about the $2.50 range on average several weeks before the election, and remain there. Voters will notice the positive effect on their pocketbooks. The GOP desperately needs a pocketbook-issue boost to offset the Iraq mess and gain some distance from the president's dismal approval rating. So, let's just keep track of gas prices. Let's see if I am wrong. Let's see if there is a stable, extended interval of lower prices. And then let's see what happens a week or two after the election. Will prices begin a steady ascent back up to the $3 level before or after the election? Time will tell, but these will, in my mind, be interesting and important questions to note and keep track of.

Tuesday, August 22, 2006

Now this is a Legacy to Leave Behind....

Here's to you, Grandma, as we remember your life and the incredible legacy you and Grandpa have left behind. Mary Vondracek lived to the age of 97, and here is what she and Frank Vondracek, who passed away in 1983 and in his prime was a world-class football (he would roll in his grave if I called it soccer!) player for Czechoslovakia, were all about: 7 kids, 24 grandkids, 33 great-grandkids, and 3 great-great grandkids. Now that is a legacy worth noting, and all of us will always love you both until we join you...and thank you...

Friday, August 18, 2006

Administration Surveillance Policy Fails First Legal Test

King Bush's, I mean President Bush's, secret and warrantless surveillance program has been ruled unconstitutional by a U.S. District Court. In my humble opinion, this is a good thing, and it will hopefully hold up when the administration appeals. Ultimately, this will make it to the Supreme Court, where the decision may in fact be overturned now that the Court is stacked with a majority on the right, but time will tell. The District Court ruled that a warrantless program violates free speech and privacy rights, which I realize will be attacked, but in my mind it is separation of powers that is most relevant.

A majority of Americans don't have a problem with wiretapping suspected terrorists or their acquaintances, regardless of whether they are in or out of the country. The trouble is, just as the administration now has a doctrine of unilateral, first-strike, pre-emptive war (how has that turned out so far? Do you feel like the world is a safer place at the moment?), it has developed a similar view of its place in our government. Certainly in wartime the executive has stretched the boundaries stated in the Constitution in the name of national security, but in the end the reason for three branches is basic oversight and checks and balances. No one disputes that there are national security concerns. No one disputes that surveillance needs to be secret, because we don't want the bad guys to know when they are being tapped. But I think most will sleep better at night when there is a system in place where someone, such as the secret court already authorized under the 1978 FISA law, keeps an eye on the scope of the program. A Republican Congress has already come out and said it will work with the administration to modify FISA. What is the administration's issue at this point? What Americans want is simply a check to make sure it is only suspected terrorists and any associates within that network who are being monitored (and keep that as secret as you want), and that flagrant abuses of executive power are not taking place against the general American public.

Monday, August 14, 2006

Research Showing Reading Gives Brain a Workout

Results of some recent research need to be made available to parents, educators, and children everywhere: Reading gives the brain a more thorough workout than previously believed, and needs to remain a primary component of every classroom. This research comes at a critical time, as results of a landmark literacy study show that from 1992 to 2003, overall illiteracy rates increased (a summary of the 1992 study and the repeated 2003 study is in the September, 2006, Scientific American, page 32). It is now estimated that by the standards of an information economy, about one-third of all American adults are functionally illiterate, rating at either the "below basic" (12%) or "basic" (22%) skill level.

Brain imaging technology shows that areas of the brain become active during reading that, until now, were not known to be active. For instance, reading the word 'cinnamon' activates olfactory portions of the brain (thanks to the Eides for posting this). Reading about 'kicking' activates the area of the brain responsible for leg motion, and reading about 'picking' activates the portion used for hand movements. This involvement of multiple areas of the brain during reading contributes, I would imagine, to the sense a reader gets of not being able to put a good book down: he or she is literally sensing what is happening, with mental imaging and reactions taking place in the brain similar to those that would occur if the reader were actually doing what was being presented in print. As the Eides point out, it does not require virtual imaging experiences to activate the brain so fully, just a good book. These findings support the notion that reading is a tremendous way of learning because of such a dramatic response by the brain when processing the written word.

As for the illiteracy rates, some of this is almost certainly due to larger numbers of immigrants, meaning a larger number and percentage of immigrant children in schools in 2003 compared to 1992, as well as a continuously increasing number of elderly Americans. Age is known to contribute to a decline in literacy skills and abilities. My own experience in schools over the last decade suggests that kids today spend more time with video games and other forms of entertainment that do not require any literacy skills than they did ten years ago, and therefore read less. But to say one in three adults will struggle with literacy in the workplace is still a staggering number. We need to make good use of this type of research to try to convince students and workers that reading is worth the effort and still needs to be a part of one's daily routine throughout life.

Sunday, August 13, 2006

If I were Education King for a day

If there is one good thing about No Child Left Behind, it is the notion that there should be a well-qualified teacher in every classroom. Now, what "well-qualified" means, exactly, has been and continues to be debated, but it does sound like a pretty basic component of a good education system to have professionals working with kids who have been trained in their subject areas as well as trained to work with kids in a particular age group. The trouble is, there is evidence that suggests neither is happening often enough.

For instance, something like a third of all new teachers leave the profession within the first three years of their teaching careers (and half leave after only four years). This raises several questions, starting with: If new teachers were properly prepared for what they would be doing with kids in a classroom, would we expect such a large percentage of them to quit the profession in such a short period of time? A second question is: Would such a large percentage of new teachers quit if they had good support in the schools in which they went to work? In my mind, the answer to these questions is probably not.

Ever since I went through a certification program twelve years ago, I have often thought back to those classes and tried to figure out what I learned and how much I took away from those courses, and to this day I can't think of many experiences that were useful to the reality I faced in the inner-city Chicago high school where I began teaching. And I know for a fact I am not alone in seeing a severe disconnect between teacher training in college and reality in difficult schools. If I were Education King for a day, I would first get involved with reforming education programs in teaching colleges around the country. One example of reform would be to require professors and instructors who teach teachers how to teach to spend X weeks observing not top suburban schools, but some of the tougher schools in the area. Then, their classes would focus not on psychological theory, but rather on the realities of actual classrooms and the real problems and issues teachers face consistently, from day to day. Of all the professors I had in teaching certification classes, not a single one had ever taught high school, and I cannot remember a single professor who had spent more than a few days observing high school classes. Many of the courses focused on 'one size fits all' strategies and methodologies. I learned within the first two weeks of student teaching that most of what was taught to me was irrelevant, and that classroom teaching relies on figuring out, quickly, how to use multiple strategies simultaneously because of the range of ability and discipline from one student to the next. Teachers need to be prepared for the worst-case scenarios in schools, not for ideal-world models that do not exist in most places.

The next thing I would do as Education King would be to focus on teacher mastery of the subjects they are to teach. For instance, science teachers should have at least a semester's worth of actual science research under their belts, so they get a clear understanding of what science really is and how it really is done in a lab, and not just remembering a series of facts. Math teachers should have to take applied math courses in order to be able to explain to students why math is important in and relevant to modern life, so kids don't leave feeling like they simply are being forced to memorize a methodology to solving some type of problem, and then become lost when a slightly different problem (dealing with the same material) comes along. And math teachers should be trained to do simple experiments/hands-on activities where students collect data and use those data in solving a certain set of problems, again to make the math real and relevant.

I am using science and math for the moment largely because I teach science and there are some very good math-related posts on the Eide Neurolearning Blog. First, they note some reasons why children have difficulties learning math in the first place. A second post compares how math is taught in China versus the United States, along with the results of exams taken by the teachers themselves. American teachers did far worse than their Chinese counterparts.

Often a goal of schools is to improve curriculum. Often, however, instruction is forgotten or takes a back seat to curriculum development. Instruction and preparedness are so important, though, that they need to be the focus, especially in schools that are having academic troubles. One may have a world-class curriculum, but if the teacher is terrible, it won't matter. Even if resources are limited and the curriculum is not demanding, a good teacher, with strong subject knowledge and preparation, can still work wonders in that situation. And if a teacher cannot answer the question "Why are we studying this?", chances are learning will suffer. There is much to be done to reform and improve education across all age groups and in all schools, and if teachers are not prepared our kids will continue to suffer the consequences. Much of that reform will not happen until we get serious about better teacher training from day 1 in the teacher colleges and universities. Contrary to much popular belief, teaching is far from an easy job and profession; again, if it is so easy, we shouldn't be losing so many new teachers so quickly at the start of their careers.

Monday, August 07, 2006

Television Marketing of Food Affecting Children's Understanding of Health

A recently published study (in the journal Speech Communication) by U. of Illinois communication professor Kristen Harrison caught my eye. She studied children's understanding of what foods would help make them healthy (and not just slim and trim), and the role television marketing plays in that understanding. Children's diets and health have made banner headlines in the past couple of years, as child obesity has increased steadily over the past decade, and studies such as this should help parents and educators decide on good strategies for establishing a child's awareness and healthy eating habits.

The study found that television has largely stopped touting foods that are rich in nutrients, and instead focuses on what ingredients a product does not have, such as fat or carbohydrates. Many children as well as parents have begun to think that, because of the emphasis on obesity, a skinny kid is a healthy kid, when in fact some skinny kids may actually be malnourished. As is typical, a balance is needed, and the study shows that there are severe misunderstandings of the actual nutritional value of foods among many children. What's more, Harrison found correlations between a child's weight and their understanding (or lack thereof) of nutrition. For example, heavy kids who watch a lot of TV (her study included a panel of 134 1st-3rd graders, who averaged 28 hours of television viewing per week) are more likely to think Diet Coke is healthier than orange juice because, I suspect, they have been exposed to diets by the adults in their lives. They also think fat-free ice cream is healthier than cottage cheese. Harrison also discovered, in interviews where children had to explain their answers, that the more TV the children watched, the worse their nutritional reasoning. One example is the kid who says a particular food is not healthy because "his sister hates it," rather than any legitimate reason, such as it being a fatty food or because parents said it was not healthy.

In the end, this study shows that parents need to be the main line of defense when it comes to a child's health. Limiting television viewing and the number of commercials children watch should help, according to these results, and paying attention to pediatricians and public service announcements about what a truly healthy diet looks like is essential. Schools do need to pay attention to what is served in cafeterias, and this is happening at an accelerated rate; many schools in my area are replacing soda machines with juice and other healthier drinks, and healthier snacks have begun to appear in vending machines in a number of schools, replacing potato chips and some candy bars. These are steps in the right direction, but it is a major challenge because of the sophisticated marketing various companies now use to hook the largest number of youngsters on their products.

Thursday, August 03, 2006

Using Mediciexity to Define the Future of U.S. Particle Physics

In modern scientific research, we more and more often find groups put together in a very multidisciplinary way. A mix of people trained in a variety of fields can often produce rapid progress and new findings, an outcome that has been referred to as the Medici effect. Recently, a panel was put together to discuss and make recommendations to Congress about the future of American high energy physics. The U.S., which has been at the forefront of particle physics for decades, will lose its lead when the Large Hadron Collider (LHC) is commissioned in 2007 at CERN. The LHC will replace Fermilab as the world's most powerful accelerator, and numerous American physicists will center their research overseas (they have been doing so in larger and larger numbers for the past 8-10 years already).

In 2004, a panel (EPP2010) was put together by the National Research Council, following a request by the Department of Energy and the National Science Foundation, to set the course for U.S. high energy physics. Its report came out this past April. What is interesting about the panel, though, beyond its final report and recommendations, is its make-up. In the past, advisory panels consisted of high energy physicists and some administrators of national labs. This time around, knowing that our loss of the lead in this type of research was a certainty for many years to come, the NRC took a new approach and formed a multidisciplinary panel. The chair was Harold Shapiro, an economist and president emeritus of Princeton, and the other members included 3 Nobel winners (2 in physics, 1 in medicine), an astronomer, a former CEO of a technology firm, a former director of Brookhaven National Lab, a former White House OMB official (an expert in budgets), a former Presidential science advisor, a condensed matter physicist, and several high energy experts.

I think this is an important step not only for high energy physics, which historically has been viewed by many non-physicists as a waste of time and vast sums of money, but for American science in general. I believe this panel will become (at least I hope it does) the model for how to map out the future of U.S. science in all areas of research, because of the nature of science today. It is incredibly expensive, and more often than not research programs are emerging as multidisciplinary entities that require the efforts of numerous fields of study. I also think the make-up of EPP2010, for instance, gives the science more credibility in the eyes of Congress and the public, because it addresses not only the particle physics/science issues, but applications in and out of the field and cost effectiveness. In the past, we (high energy physicists) have not been very good at communicating why the work is important and the many benefits that arise directly and indirectly from the research. This approach should begin to improve the communication, and should emphasize that the two types of science, applied and pure, cannot live without each other. Trying to take advantage of mediciexity is the way of the future.

Tuesday, August 01, 2006

Check out Your State's Status for NCLB

Here is the link to the letters sent to each state Board of Education, regarding the state's status for standards and assessments. Most states have failed to meet deadlines for having 'qualified teachers' in every classroom, and most states have had issues gearing all the testing to meet the criteria laid out in the No Child Left Behind law. Some states are losing some of the Title I money as punishment (including my state of Illinois).

Monday, July 31, 2006

NASA Mission Statement Changed

The problems NASA climate expert Jim Hansen has created for the White House probably have much to do with the change in NASA's mission statement, which used to include the line "to understand and protect our home planet." This line is now cut entirely from the mission statement, which certainly puts pressure on NASA and other government scientists to stop worrying about studying global climate change. This has been an increasing area of study within NASA since they obviously have the tools necessary to do the science from space. As Hansen points out, perhaps the White House is trying to eliminate research that is creating headaches for Bush.

Rewriting Science, the Administration Way

There was a fascinating, and extremely disturbing, story on 60 Minutes this past Sunday. I believe it was a replay, as I missed it when it was originally aired.

The story included an interview with NASA climate expert Jim Hansen (generally considered the top expert on the planet in his field), who risked his job by going public with some White House activity. Ever since the Bush administration came to power, there has been an ongoing campaign of overlooking, questioning, and simply ignoring the science behind several issues, most notably global changes in climate. It is certainly well-known that it took a number of years before the president would even accept that the global temperature is rising, and even longer to finally acknowledge that humans have something to do with it. But what scientists have been complaining about for years now is the unprecedented way in which this White House censors and restricts scientific results from being made public by government scientists. Now, every administration tries to put its own spin on science. Hansen, for instance, mentioned that he is politically an independent, and that during the Clinton administration they wanted him to spin the science to make global warming and climate change appear worse than what the data suggest. Hansen correctly did not do this, and instead made sound scientific recommendations and reported his conclusions based on available evidence, not political goals.

However, something that has begun to happen with the Bushies is that all scientific reports that are to go public from government research scientists must go through a White House editing process first; and it is not scientists in the White House doing the editing, it is lawyers and administration officials. This is nonsense!!

For example, any reports that are published for public review on environmental and climate issues are run through the office of Phil Cooney, the chief of staff for the White House Council on Environmental Quality. It is Cooney who has personally edited the science reports coming from the government agencies that study such topics. The first problem is, Cooney is an attorney, and he is marking up and removing scientific evidence and conclusions from research papers and memos. I think every rational person would see some problems with an editor who has absolutely no expertise in technical subjects. The second problem is that Cooney was formerly a star lobbyist for the petroleum industry. The 60 Minutes piece had numerous examples of the edits made by Cooney, as they got their hands on actual copies of the papers he edited with his hand-written marks. My favorite is the complete deletion of a paragraph explaining how and why energy production contributes to global warming through the high levels of greenhouse gases released in the relevant chemical reactions of the process. He did not even try to fudge the language, but simply crossed it out. The paper submitted to congressional committees contained all the edits and deletions made by Cooney, so that Congress would not be able to see the actual science and act to curb the problems at hand (since according to the White House, owned and run by big oil, there are no problems). This is despicable in my view, and goes to an entirely new level of misinformation and lack of respect for science...and it deals with a major issue that will likely explode in the near future into global environmental damage and global economic chaos.

It should be noted that Hansen has not been allowed to take other interviews, and during the 60 Minutes interview he had a NASA public relations official sitting just off camera, so that if he said something the administration did not want to hear, she could have stopped the interview on the spot. This is not what I want in the United States...it is a reflection of an administration that is in denial and has propped up political ideology and overall ignorance ahead of facts and science. It seems like a story that would come out of the former Soviet Union or something.

In my opinion there is no question that the numerous reports of White House tampering with intelligence prior to the invasion of Iraq are true. This White House in particular has gone beyond the normal spin all politicians do, and has created an environment where the expectation is to censor, edit, misinform, and select only those morsels of information that work in their favor. We have seen the results of that misinformation in Iraq, and I fear these eight years of Bush denial and total inaction, which are critical years for the fight against environmental damage, will show similar results in several decades. America and the world deserve better, and I think the single best thing that can happen in the near-term is for the Democrats to win back the House in the fall election. We need a buffer to stop the shift to the far right in our government, force it back to a more moderate position, and begin to put science, facts, data, and evidence back into policymaking.

Monday, July 17, 2006

NAACP President Looking for Action

NAACP President Bruce Gordon stated in his speech at the national convention that Black Americans need to end "victim-like thinking" and take advantage of the opportunities that presently exist to begin pushing more people of color out of poverty. This year's convention is being held in Washington DC, and the hope is that the president, who has never made an appearance at any NAACP event (perhaps all the problems that have existed in black precincts over the last two elections have something to do with the tension between the president and the Black community), will come.

With nearly a dozen years of experience working with minority students and parents, I've concluded that the single biggest obstacle perpetuating the continuing achievement gap between white students and students of color is cultural in nature, and the attitude taken by Gordon is certainly a positive step. One grand experiment is Project Excite, which is essentially looking to see whether a critical mass of minority students who excel academically can begin a domino effect, where being smart (which most of the students are) turns into acting smart and allows students of color to believe it is OK (and not an act of "acting white") to get into advanced classes and aim for top colleges. Politically, the minority blocs are large enough to determine elections, and time will tell if there is the motivation on a large scale to take advantage of that potential.

Friday, July 14, 2006

Will we ever be able to predict what social systems and networks will do? Perhaps globally, but likely not locally

There is a lot of interest in social systems and networks, and the use of network theory to help explain how and why social systems work the way they do. While research has shown, for instance, how different rulesets lead to various decisions or how network topology helps identify how disease spreads, one needs to keep in mind that there is a difference between local environments and global environments. What I mean by this is that the analyses done in these areas of study essentially look at results that affect the system more globally. It is quite another thing to see what happens to individual agents, since in complex systems the rules that govern individuals can be and typically are very different from the rules that govern collective behavior.

In a physical system this is similar to studying gases. We can in principle use Newton's laws to predict what should happen to individual atoms and molecules, but collectively we need to resort to a statistical/probabilistic approach. Collectively, there are set probability distribution functions for something like molecular speed, but that is meaningless to an individual molecule of the gas. In social systems, we are dealing with complex, unpredictable individual agents that make up the system, and this makes things considerably more difficult to analyze than a gas, whose individual agents are governed by deterministic rules (at least to a good approximation using classical physics). It will be quite difficult to accurately model emotion and religious fanaticism, for example, for individuals in a social system. We can guess and try to take a statistical approach, but this leaves some degree of uncertainty in results and predictions. It will be very difficult to model and predict what is going on in the head of a leader such as Osama bin Laden; there is a good deal we can only guess at, even though there has been research and progress in figuring out how his larger terror network operates and is structured. This is the difference between local and global environments and rulesets.
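The gas analogy can be made concrete with a short simulation. The sketch below (my own illustration, with arbitrary units and a made-up parameter sigma) samples molecular speeds from the Maxwell-Boltzmann distribution: the collective mean speed is sharply predictable, while any individual molecule's speed is scattered over a wide range around it.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative parameters in arbitrary units: sigma plays the role of sqrt(kT/m).
sigma = 1.0
n_molecules = 100_000

# Each molecule gets three Gaussian velocity components (Maxwell-Boltzmann).
velocities = rng.normal(0.0, sigma, size=(n_molecules, 3))
speeds = np.linalg.norm(velocities, axis=1)

# Collective behavior is predictable: the sample mean speed converges
# to the theoretical value sigma * sqrt(8 / pi).
theoretical_mean = sigma * np.sqrt(8.0 / np.pi)
print(f"sample mean speed: {speeds.mean():.3f}, theory: {theoretical_mean:.3f}")

# Individual behavior is not: single-molecule speeds have a large spread.
print(f"spread of individual speeds (std): {speeds.std():.3f}")
```

The point of the sketch is exactly the asymmetry in the post: the distribution pins down the ensemble, but tells an individual molecule almost nothing.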

Whether it is trying to figure out economies, decision-making, trade networks, terrorist organizations, or worldwide transportation networks and systems, we will likely always be in a better position to understand the global structures and behaviors of those networks than what happens to individuals and in local segments of the larger network. There will be a 'fuzzy' area where the transition takes place between where the local environment ends and the global begins, and vice versa, similar to the fuzzy area where relativistic effects become significant compared to Newtonian predictions, or the boundary between classical and quantum systems. These are areas of study with no clear-cut borders, and much of the answer depends on what level of sensitivity and precision you are interested in. So it goes, too, for social systems, and those interested in such areas of study need to keep this concept in mind. As in physical science, the difficult part will be determining how large the error bars are on results and predictions, which will depend on how sensitive the global systems are to local perturbations caused by individuals within the system.

Monday, July 10, 2006

Quantum Biology Pushing Computing Technology

Many have heard of areas of science such as quantum mechanics, biochemistry, biophysics, physical chemistry, and so on. One growing area that has not received much popular attention is quantum biology, which looks at biological processes at the molecular and atomic scale, where weak but relevant quantum effects help drive various biochemical reactions, as well as processes involving energy conversion, such as light into chemical energy. Look at some specific projects here. Another significant aspect of these studies involves running computer simulations of such reactions and biological processes. For example, for the first time, a group at the University of Illinois at Urbana-Champaign (UIUC), led by Prof. Klaus Schulten, has done a full-blown, atom-by-atom simulation of an entire life form. It sounds crazy, but this is the level the science is at presently.

UIUC is home to one of a handful of National Supercomputing Centers, and the Schulten group ran a simulation of the satellite tobacco mosaic virus, which consists of about a million atoms in total. Over 100 days of computing, they simulated a 50-nanosecond interval, tracking how every atom behaves, and could therefore map all the processes occurring in the virus during that period. A 50-nanosecond interval doesn't sound like much, but to simulate longer intervals, new computing schemes are necessary and are being developed. For example, work at UIUC is pushing toward the next level of computing power, the petascale computer, capable of a thousand trillion calculations per second; current supercomputers are in the terascale range (a trillion calculations per second), while most home PCs are in the gigascale range (billions). The simulations being done by such groups would take an estimated 35 years on a home PC, which gives a good sense of how advanced supercomputing platforms are. The point of such simulations is to get a handle on all the behaviors of something like a virus. Viruses are the focus both because of their relative simplicity (no simulations of, say, humans will be possible any time soon) and because of medical research, where molecular medications may one day be developed, using nanotechnology, that are effective against a particular harmful virus.
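Some back-of-envelope arithmetic (round numbers of my own, not taken from the paper) shows how the figures quoted above fit together: the 100-day supercomputer run versus the 35-year home-PC estimate implies an effective speedup well below the nominal terascale-to-gigascale ratio.

```python
# Back-of-envelope check of the scale comparison (illustrative round numbers).
supercomputer_days = 100   # reported runtime of the virus simulation
home_pc_years = 35         # estimated runtime of the same job on a home PC

home_pc_days = home_pc_years * 365
effective_speedup = home_pc_days / supercomputer_days
print(f"implied effective speedup: ~{effective_speedup:.0f}x")

# The raw flop-rate ratio (terascale vs. gigascale) is ~1000x; the smaller
# implied figure hints that parallel efficiency and memory traffic eat a
# large share of the nominal advantage, as is typical for real workloads.
nominal_ratio = 1e12 / 1e9
print(f"nominal terascale/gigascale ratio: {nominal_ratio:.0f}x")
```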

Wednesday, July 05, 2006

Is it just me, or is this something we should address - Scientists making connections with the public

I was just made aware of a survey in England. Nearly 1,500 scientists were asked about making connections with the public, such as giving popular talks, going into classrooms to talk about their work, and encouraging students to pursue science. Nearly two-thirds (64%) said they were too busy to do any sort of public outreach and instead needed the time to raise funds for their departments, i.e. grant writing. A majority of the respondents thought it was not important to go into schools, participate in public debates, or do media interviews. This sort of thing is viewed as 'fluffy' and not a good career move (my guess is this was the view mostly of non-tenured faculty). The Royal Society put out a statement saying scientists need to be encouraged somehow to get their work out into the public arena.

In this day and age, when science and technology drive the global economy and scientists complain about funding cuts and the public's lack of knowledge or understanding of basic science, these results surprised me. Perhaps we are starting to wake up here in the U.S., where many NSF grants, for instance, require a small public outreach component. This is actually a good time for schools to approach universities and attempt to collaborate, since many university researchers may well consider forming a program or project with local schools in order to have it to put in grants. I've personally written four letters of support for Northwestern professors in the past year. But it sounds as if overseas this is not yet the case. My only hope is that federal funding agencies in the U.S. do not take such requirements out of grant RFPs.

I think it is true that the general public is largely scientifically illiterate, and scientists have done a poor job of getting their message out in a good, clear manner so the public cares more about what science is and how vital it is to our way of life. If scientists are unwilling to get the message out, help schools, and get involved in debate, then perhaps we should not be so miffed when a significant portion of the masses comes out and wants intelligent design in science classes and don't believe the science of global warming. I would have to think that it is in the best interest of the scientific community that the message gets out to the public, and that institutions should try to encourage it in some way. In addition, with looming shortages of scientists in the near future, one would think scientists would want to have some contact with the next generation and try to encourage them to pursue science, math and engineering, as well as science education.

Monday, July 03, 2006

Quantum Computing via Quantum Interrogation

Here is one of the strangest things I've ever heard of. Prof. Paul Kwiat's group at my alma mater, the University of Illinois at Urbana-Champaign, has obtained the answer to a search algorithm on a quantum computer without ever running the algorithm (see Nature 439, p. 949-952, 2006). Quantum mechanics is bizarre no matter how one looks at it, but this is as counterintuitive as it gets, and it is built around the phenomenon called quantum entanglement.

The idea is to use an interferometer, a device that splits a beam of light down two perpendicular arms. This was the device used in the classic Michelson-Morley experiment, which ultimately showed there was no ether and helped lead to Einstein's relativity theories. If a laser is used as the light source, then one has a system of photons that are all in the same quantum mechanical state. Quantum computers are being designed in which bits, the binary 1's and 0's of any ordinary computer, may be encoded in spin states of the particles in the system. What's more, because quantum mechanics is built around the probabilities of particles being in one state as opposed to another, the quantum bits (or qubits) may be placed in superpositions of 1 and 0. In Kwiat's computing system, photons from a laser were entangled.
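A minimal numerical sketch of the qubit idea described above (this illustrates superposition and measurement probabilities in general, not Kwiat's actual optical setup): a qubit is a two-component state vector, a Hadamard gate puts it into an equal superposition, and the Born rule gives the measurement odds.

```python
import numpy as np

# A single qubit as a two-component state vector over the basis states |0> and |1>.
ket0 = np.array([1.0, 0.0])

# Hadamard gate: sends |0> to an equal superposition of |0> and |1>.
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2.0)

state = H @ ket0

# Born rule: measurement probabilities are the squared amplitudes.
probs = np.abs(state) ** 2
print(f"P(0) = {probs[0]:.2f}, P(1) = {probs[1]:.2f}")  # 0.50 each
```

Until it is measured, the qubit genuinely carries both values at once, which is the resource schemes like Kwiat's exploit.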

As Kwiat describes it in Physics Illinois News, "By placing our photon in a quantum superposition of running and not running the search algorithm, we obtained information about the answer even when the photon did not run the search algorithm." This was the first laboratory demonstration of a theoretical possibility known as 'counterfactual computing': inferring information about an answer even though the computer did not run. The goal is to use this quantum interrogation scheme to reduce noise in larger-scale quantum computers, which is a key technical and engineering problem in the field. Large-scale quantum computing is perhaps the Holy Grail of computing and encryption, and it is experiments like Kwiat's that are leading the way to that type of technology.

Friday, June 30, 2006

Gore Gets the Science Correct, According to Experts

To the many doubters I have talked with about Al Gore's documentary on global warming and climate change, "An Inconvenient Truth": all the climate experts surveyed by the AP say the science is accurate. Many believe, without having seen the film, that it is a doomsday piece and a political comeback in the making for Gore, and that he would be apt to exaggerate scenarios and the science to make a more dramatic impact on the public. This is a logical thought when any politician is involved, because it is largely their job to spin a topic to their advantage and shape public opinion. While I have not yet had the chance to see it, I certainly plan to, because it has received good reviews, and in my case I am most interested in the science of climate change. It is good to hear Gore got it right.

The article referenced notes that many of the top science agency directors (appointed by the president) have not yet seen it, and the president simply refuses to see it...this subject truly is an inconvenient truth for current policies, and it is easier to simply ignore some potentially devastating consequences (leave that to future presidents and generations to worry about) than to anger friends in the oil and industrial sectors.

Friday, June 23, 2006

Getting New Teachers into Public Schools: Teach for America Model

A recent Chicago Tribune editorial praises the recruiting efforts of the Teach for America (TFA) organization. This group recruits graduating college students to commit to two years of teaching in public schools in 22 regions throughout the country. With 3,500 current teachers in 1,000 different schools, TFA has produced 14,000 teachers altogether since 1990. Last year alone there were 19,000 applications to TFA, which is impressive considering the salary falls into a $23,000-$43,000 range, often well below the starting salary these students could earn outside of education. Often the teachers coming through the TFA program will put in their two years and then move on to other careers, but some two-thirds remain in teaching beyond that required period. TFA is looking to expand into more states and regions of the country and to continue increasing the number of teachers it provides to our schools, as we see shortages of qualified teachers in many areas of the country, as well as continued high attrition rates, with about half of all new teachers leaving the profession within their first 3-4 years.

Chicago Public Schools Making a Push to Promote Science Research

I just completed a series of workshops with a couple dozen high school science teachers from the Chicago Public Schools (CPS). These teachers have been involved in the Chicago Science Fair, which is the main thrust of science research in city schools. The Science Fair, which has been running for over fifty years, gets literally thousands of students at all grade levels to do projects, but the work is typically of the 'cookie cutter' variety, meaning projects tend to be recycled each year and have known answers. Many times the research is a standard experiment done in the classroom at some point, and in most cases the outcomes are either known or can be guessed fairly easily. However, new goals for research programs throughout the city now include getting more teachers and students to pursue projects of a more advanced variety, where students attempt more original work and get into 'real' science. Ultimately, CPS wants to see more submissions of student work to the major national competitions, such as the Intel Science Talent Search (also known as the 'Nobel Prize' for high schools, as six former participants have actually won Nobel Prizes!) and the Siemens Competition in Math, Science and Technology.

The quality of work high school students are capable of doing under the right circumstances is truly remarkable. Given doable projects, bright, motivated, and curious students can produce truly advanced work. The obvious and serious problem we have in high schools, though, is a lack of resources and expertise to develop doable projects that yield some level of original work and results. The workshops I led this week focused mainly on getting resources organized and developing strategies for generating ideas and opportunities for students, and CPS is providing resources and time for teachers to go on and develop their own research programs. This new level of commitment in one of the largest school districts in the nation is most welcome as the U.S. seeks to increase the number of students who move into technical majors in college and beyond. It also may provide a necessary boost to science programs, as the No Child Left Behind school ratings will include science scores for the first time in the 2006-07 school year. I encourage all students and teachers who have an interest in research to check out this website in order to begin developing ideas of your own.

Monday, June 12, 2006

Importance of Teacher Quality

A new study on the effect of teacher quality in the classroom was highlighted in this past Sunday's Chicago Tribune (Metro section). It was a comprehensive study that supports anecdotal evidence from teachers, administrators and parents that has been around for years, and shows conclusively how important it is to have high quality teachers in the classroom, particularly for poor and minority students. Schools from Ohio, Illinois and Wisconsin were evaluated at all grade levels, and teacher quality proved to be most important in the high schools.

My first question on any study like this is how 'high teacher quality' is defined. In this study, five factors went into the definition: the average college entrance exam score of all the teachers in a school, results on the state's teacher licensing test of basic skills, a national ranking of the colleges the teachers attended, years of experience, and the number of teachers with provisional credentials. I'm not convinced that college entrance exam scores are always a good indicator (I've seen former students score lower than they wanted and then blossom in college and beyond, for instance), or national rankings of colleges attended (I know some outstanding personnel from smaller, 'no-name' colleges), but this is what they went with.

Results for poor Illinois schools (50-89% poverty rates):

Elementary and middle schools, percent passing the state test (ISAT), by teacher quality:
High quality: 56.4% pass
Middle-high quality: 53.6% pass
Middle-low quality: 53.2% pass
Lowest 10 percent: 43.8% pass

High schools, percent passing the state test (PSAE):
High quality: no low-income high schools in this category had high-quality teachers
Middle-high quality: 32.5% passed
Middle-low quality: 27.0% passed
Lowest 10 percent: 13.7% passed

This is a large discrepancy for the high schools, where subject matter is more advanced and more important, and where energized, competent teachers are vital. Poor schools tend to have the largest minority populations (and are located in cities), and the fact that they typically pay less and have worse teaching conditions than wealthier suburban districts contributes to those schools' further demise. The kids are the ones who pay in the end, and they do not get the same education their peers in wealthier districts receive. Although this has always been 'known' by those of us who have taught in both types of districts (in my case a Chicago public high school as well as a wealthier district in the North Shore region above Chicago), it is good to see those gut reactions supported by data. Hopefully studies such as these will spur continued policy debates about education and how to help those schools that truly need reform.

Tuesday, June 06, 2006

Even though it is 06/06/06, let's not forget 06/06/44

Everyone I have seen today is talking about 06/06/06, the release of the new version of "The Omen," and Satan worship. Let's not forget, however, those who fought on D-Day and helped with the final push to defeat Nazi Germany. We still thank you for your sacrifice...

Creative Team Formation

Whether you work in academia, business, education, the fine arts, construction, or just about any other field, chances are you have been a member of some sort of team charged with solving a problem or creating something new. Within the last couple of years, team formation has been examined through the lens of network theory, and there are some new and interesting findings.

A group out of Northwestern University (NU), led by Prof. Luis Amaral, has looked at how effective creative teams are formed. Their data sets included the teams that created Broadway musicals over the past century and the publication records of the top journals in social psychology, economics, ecology, and astronomy, each over the last half century. I'll just summarize their findings here, but the full paper (published in Science, April 29, 2005) is available online; select the PDF file for the article entitled "Team Assembly Mechanisms Determine Collaboration Network Structure and Team Performance."

The model used in the analysis distinguishes between veterans, who have been involved in creative collaborations before, and rookies, who are about to see their names appear in print for the first time. What determines success (for plays, success is getting to Broadway; for publications, it is appearing in a top journal) comes down to two parameters: the fraction of the team composed of veterans (or incumbents, as used in the paper), and the propensity of the veterans to use their connections within the field's network and select agents they have collaborated with in the past. The research shows a phase transition between one regime, where a large cluster connects a substantial fraction of agents in a particular field, and another, where there is a large number of isolated clusters of agents. Veterans tend to make up the large, well-connected portion of the network (and are more likely to be hubs within it), while rookies tend to be more isolated, since they have not yet had the time or experience to become embedded within the larger network (they tend to sit on its periphery). The size of successful teams varies within the scientific fields studied, but in the Broadway case teams average seven members. It is also important to realize that regardless of team size, teams tend to be embedded in a larger network, because the veterans on a team tend to know others in the field who may be collaborating elsewhere.

The network formation in the NU group's model does indicate a scale-free architecture for the larger network, where hubs form because rookies want to 'make a name for themselves' and be associated with better-known veterans in the field. This is called preferential attachment in network language. For teams that do not make recruiting veterans a priority, success is less likely, and the networks these teams belong to are much more isolated from the rest of the field. However, as the desire to include veterans on the team increases, teams become more successful and, in addition, coalesce into a well-connected single cluster, where links between separate teams exist because of the veterans' links to past collaborators who belong to other teams. This phase transition, shown in the data and predicted by the network model, is clear evidence for what has been called the "invisible college" proposed by other researchers since the 1960s (and a Wikipedia article about the Royal Society claims Robert Boyle used this term as far back as 1646). The invisible college is the web of social and professional contacts that link, say, scientists across universities.
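The two-parameter model can be sketched in a few dozen lines. The toy simulation below is my own simplification (function names and parameter values are made up, and it omits details of the published model such as agent retirement), so it only qualitatively echoes the effect: when the veteran fraction p is low, the collaboration network stays fragmented; when p is high, agents coalesce into one large cluster.

```python
import random

def assemble_teams(n_teams=2000, team_size=4, p=0.7, q=0.5, seed=1):
    """Toy version of the two-parameter team-assembly model.

    p: probability a team slot is filled by a veteran (incumbent).
    q: probability a veteran slot goes to a past collaborator of
       someone already on the team, rather than a random veteran.
    Returns the fraction of all agents in the largest connected cluster.
    """
    rng = random.Random(seed)
    incumbents = []        # every agent who has appeared on some team
    collaborators = {}     # agent -> set of past teammates
    next_id = 0

    for _ in range(n_teams):
        team = []
        for _ in range(team_size):
            if incumbents and rng.random() < p:
                # Veteran slot: maybe reuse a teammate's past collaborator.
                pool = [c for a in team for c in collaborators[a]]
                if pool and rng.random() < q:
                    member = rng.choice(pool)
                else:
                    member = rng.choice(incumbents)
            else:
                # Rookie slot: a brand-new agent enters the network.
                member = next_id
                next_id += 1
                incumbents.append(member)
                collaborators[member] = set()
            if member not in team:
                team.append(member)
        for a in team:     # all teammates become linked collaborators
            for b in team:
                if a != b:
                    collaborators[a].add(b)

    # Size of the largest connected cluster (iterative graph traversal).
    seen, largest = set(), 0
    for start in collaborators:
        if start in seen:
            continue
        seen.add(start)
        stack, size = [start], 0
        while stack:
            node = stack.pop()
            size += 1
            for nb in collaborators[node]:
                if nb not in seen:
                    seen.add(nb)
                    stack.append(nb)
        largest = max(largest, size)
    return largest / max(1, len(collaborators))

# More veterans (higher p) should connect the field into one big cluster.
print(f"largest-cluster fraction, p=0.1: {assemble_teams(p=0.1):.2f}")
print(f"largest-cluster fraction, p=0.8: {assemble_teams(p=0.8):.2f}")
```

Sweeping p from 0 to 1 in a toy like this is a quick way to see the transition between the fragmented and well-connected regimes the paper describes.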

The main, generalized conclusion from this formal application of network theory to a social system such as creative teams is that, to be successful, a 'dream team' needs to be built with a majority of veterans who have not necessarily worked together before. Staying with the same collaborators over and over again can actually hurt the creative process and the performance of the team; you need some new blood from time to time, whether a rookie or another veteran from the larger network whom you may not know very well, to keep the creative juices flowing. In other words, change is good for the creative process.

Much of this seems intuitive. Many people change jobs after a few years because they become bored and simply need a change to spice up their career. Over time, members of teams within a company may find it more difficult to stay productive if no change ever takes place, regardless of past success and creativity. It is rare to find teams of performers who are successful over long periods of time because they have trouble finding something new to keep their audience interested. Normally an athletic team consisting mostly of rookies and other young players, and few veterans, will not find success; rather, it takes a mix of established players (veterans) and some new blood to be successful. And each year or every other year, teams may want to get rid of some of the younger talent and bring in some different young talent or another veteran who would be new to that team, in order to remain successful. This is precisely what the Chicago Bulls did in their dynasty years with Michael Jordan as the main hub. It is truly interesting that applying network theory to team formation shows this is the scenario needed to have the highest chance for success. It pays to dip into the larger network and make some changes in team structure in order to remain fresh and creative.

It may be worthwhile to contrast this with what Howard Gardner writes about creative individuals who make groundbreaking discoveries. For all of Albert Einstein's brilliance, and his nearly single-handed burst of creativity in physics from 1905 to about 1925, he was largely unproductive in the second half of his career because he isolated himself from the rest of the scientific establishment (i.e. the network) and rarely collaborated.

Saturday, June 03, 2006

Mediciexity Defined

Check out a most interesting post at Zenpundit, entitled Creating a Culture of Mediciexity. This gets into the definitions and the interplay between numerous concepts such as vertical and horizontal thinking, resilience, complex thinking and the Medici effect (the creation and/or innovation that happens at the 'intersection' of multiple disciplines working and collaborating on the same problem). Zen's concluding paragraph is:

"Is mediciexity a permanent condition ? No. Like the historical Renaissance it is a moment in time that emerges, is enjoyed and then passes, hopefully leaving a legacy in its wake. However, an organization can build a resilient, institutional culture that nurtures and encourages moments of mediciexity and helps them to come to fruition repeatedly. How ? By embracing change; by honest and regular self-reflection; by a steady engagement of horizontal thinking; by welcoming a regular flow of " new blood" or at least ideas; finally - and this is absolutely critical in my view - by investing in the time and space for " unproductive" intellectual free play that is to human creativity what air is to the body.

Mediciexity is what we need to aspire for to thrive in the 21st century."

Zen and I have discussed and written about these ideas for some time now, and I could not agree more (as, hopefully, a long-term project we are working on will spell out). I have seen this interplay of ideas firsthand on many occasions. In scientific research, check out the make-up of research groups at a university. One professor I've known for a number of years has a large research group made up of postdocs with chemistry, biochemistry, and physics degrees, a computer scientist, and graduate students working in biology and physical chemistry. The professor is a chemical engineer. Bringing together people who are vertically trained (i.e. experts in a specific field) across a variety of fields, and letting them sit down, bounce ideas off each other, and bring multiple perspectives to the same problem, tends to produce a collective horizontal thinking process, where creative ideas and possible solutions are hatched (ones no individual in the group would likely ever develop alone) and, often, innovative products are the end result.

The days of research that is just biology, just chemistry, or just physics are numbered; most of the 'easy' things have largely been done in the major disciplines, and much of the interesting and important work is happening at the interfaces between them. This makes me think of a post I did last September on econophysics. Economists are finding it valuable to work with physicists in order to learn and apply mathematical and simulation tools in economic models. Many believe the 21st century will be the age of biochemistry and molecular biology (which will require more people trained in biophysics and physical chemistry, since they work at the molecular and smaller size scales; we can throw nanotechnologists into the mix, as well as mathematicians and computer scientists!). So in science, at least, bringing a variety of thinkers into a group has been going on for some time, but it is really beginning to take hold with the masses, as research supports the validity of this way of developing a creative process. My next post will expand on this last statement.

Thursday, June 01, 2006

NAEP Science Scores for 4th, 8th, and 12th Graders

The NAEP Science scores from last year are now published. Fourth grade students saw increases, 8th grade students were flat, and 12th grade students had a decrease in scores. An additional disturbing result is that there was a widening in the achievement gap between black and white 12th grade students. Below is from the NSTA Express email:

"Science achievement scores released last week in the National Assessment of Education Progress (NAEP) show improvement among fourth-grade students in science, but scores for eighth-grade students remain flat and twelfth-grade students decline. Considered the “Nation’s Report Card,” NAEP released the latest science performance scores of students at grades 4, 8, and 12. The test was administered in early 2005 by the Department of Education to more than 300,000 students across the nation and on military bases around the world.

According to the NAEP study, fourth-grade students’ achievement scores rose four percentage points since the last assessment in 2000. There were also large gains in the number of students moving into the Basic performance level, and minority students—particularly blacks and Hispanics—made impressive increases.

The scores for eighth-grade students have remained flat since 1996 with students losing ground in the physical science area, and most gaps between minority and white students remain unchanged. The achievement scores of twelfth-grade students declined three percentage points since the 1996 assessment. No significant changes in the scores were reported by racial/ethnic groups since 1996, and there was a significant widening of the gap between whites and blacks.

NSTA issued a press release responding to the NAEP report. President Mike Padilla was quoted in numerous news stories. To read the NSTA Reports Online Exclusive article on the NAEP report, visit;
to read the NSTA press release, go to"

This is discouraging news as the nation also faces declining numbers of American students going into science and other technical fields. With our economy driven by technology and science innovation, and the rest of the world putting increased emphasis on producing their own science personnel and infrastructure, the U.S. has its work cut out for itself if we want to remain ahead of the world and competitive in the global marketplace.