Intro: Apparently I can quote Wired articles at will as long as the whole thing is posted with the copyright notice. Beautiful then, I will have to do so. Thank you Wired.



The article as follows: Copyright © 1994-99 Wired Digital Inc. All rights reserved. 

W I R E D   N E W S 
A New Computer Age Dawns 
Reuters 
WASHINGTON -- Computer experts said Thursday they had taken a big step toward making tiny, super-fast computers known as molecular computers. 
Built on a crystalline structure, such computers will someday replace those based on silicon chips and could ultimately make it possible to have a computer so small it could be woven into clothing, they predicted. 
They will need far less power than current computers and may be able to hold vast amounts of data permanently, doing away with the need to erase files, and perhaps also be immune to computer viruses, crashes and other glitches. 
"You can potentially do approximately 100 billion times better than a current Pentium (chip) in terms of energy required to do a calculation," James Heath, a chemistry professor at the University of California Los Angeles (UCLA), said in a statement. 
"We can potentially get the computational power of 100 workstations on the size of a grain of sand." 
The team at UCLA and at Hewlett-Packard created a molecular "logic gate," which forms the basis of how a computer works. "We have actually built the very simplest gates used in computers -- logic gates -- and they work," Phil Kuekes, a computer architect at Hewlett-Packard in Palo Alto, said in a telephone interview. 
Logic gates switch between "on" and "off" positions, creating the changes in electrical voltage that represent "bits" of information. 
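To make the logic-gate idea concrete: a gate is just a fixed mapping from input bits to an output bit, and the truth tables are the same whether the switch underneath is a silicon transistor or a rotaxane molecule. A minimal Python sketch of the two simplest gates:

```python
# The truth tables a logic gate implements, independent of whether the
# underlying switch is a silicon transistor or a rotaxane molecule.

def AND(a: int, b: int) -> int:
    return a & b

def OR(a: int, b: int) -> int:
    return a | b

for a in (0, 1):
    for b in (0, 1):
        print(f"a={a} b={b}  AND={AND(a, b)}  OR={OR(a, b)}")
```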
Heath's team did this by creating a new compound, called rotaxane, which grows in a crystalline structure. 
Writing in the journal Science, Heath's and Kuekes' teams said the rotaxane molecules, sandwiched between metal electrodes, functioned as logic gates. 
Computers are now based on silicon chips. The information they carry is etched onto them, and it is becoming harder and harder to do this precisely on ever-smaller chips. 
But a crystal can absorb information, in the form of an electrical charge, and organize it more efficiently. 
The "chips" made using this molecular technology could be as small as a grain of dust, Kuekes said. "When you walk into a room, it will turn the TV to your favorite channel. Or instead of getting carpal tunnel syndrome pushing a mouse around, your finger becomes the mouse," he said. 
The next step will be structuring the chip. Instead of etching this structure onto the surface, as is done now with silicon chips, it will be downloaded electrically. 
"We can download all the complexity, by wire, attached to a bigger computer," Kuekes said. 
But currently available wires are too big -- much bigger than the rotaxane molecules -- to do this. "So the next step is going to be to shrink the wires until they are the same diameter as the molecules, and then we will have the miniaturized technology," he said. 
It might be possible to use carbon nanotubes -- long thin tubes made of pure carbon. Also known as "Bucky tubes," they are no thicker than most molecules. 
Last year the same team announced they had made the largest "defect tolerant" computer ever and named it the Teramac. 
Shares of Hewlett-Packard surged Thursday $4.56 to close at $113 in composite US stock market trading, even as rival computer makers saw share price declines. 


Reuters: I don't know how Reuters feels about it, but we'll find out.


09:25 PM ET 07/14/99
Clinton Wants Stem Cell Research
By LAURAN NEERGAARD
AP Medical Writer
	   WASHINGTON (AP) _ President Clinton's top ethics advisers are
close to recommending a change in federal law to allow the
government to finance a certain type of human embryo research _ but
the White House instead said Wednesday it will support a more
conservative approach.
	   The issue is over experiments with ``master cells,'' the
building blocks for other tissues in the body that scientists can
cull from human embryos.
	   These embryonic stem cells are generating huge excitement
because they could lead to new therapies for Alzheimer's and other
devastating diseases. But the use of them has raised troubling
ethical questions, and is generating opposition from anti-abortion
forces and some members of Congress.
	   The National Bioethics Advisory Commission met this week to put
final touches on recommendations that would allow women who have
leftover embryos after fertility treatment to donate those embryos
to taxpayer-funded scientists, who would remove the stem cells for research.
	   Federal law prohibits taxpayer-funded research involving human
embryos. So the commission's draft recommendations urged a change
to that law, citing the great promise of the research.
	   That's further than the National Institutes of Health had proposed going.
	   Scientists working last year with scarce private funding
succeeded in culling some stem cells from embryos _ a process that
destroys the embryos _ and grew more cells for future research.
	   Because government-funded scientists didn't actually touch those
embryos, NIH Director Harold Varmus says it is legal for government
scientists to use the resulting lab-grown cells in an attempt to
create new therapies.
	   In a statement Wednesday, the White House appeared to endorse
NIH's more cautious proposal instead of the ethics committee's
bolder recommendation.
	   ``No other legal actions are necessary at this time, because it
appears that human embryonic stem cells will be available from the
private sector,'' said the White House statement.
	   The White House went on to say ``publicly funded research using
these cells is permissible'' under current law, thus supporting Varmus' position.
	   Varmus has said that the government doesn't need to finance the
culling of stem cells from additional embryos because, he contends,
the lab-grown supplies are sufficient for the necessary research.
The White House ultimately agreed.
	   Ethics commission members, in contrast, have said during their
eight-month debate of the issue that more cells may in fact be needed.
	   The executive director of the ethics panel, Eric Meslin, was
initially unaware of the White House position and had no immediate comment.

Reuters: I don't know how Reuters feels about it, but we'll find out.


02:46 PM ET 07/14/99
New Element on Periodic Table
By DAVID KINNEY
Associated Press Writer
	   Russian physicists have created a new, super-heavy element that
lasted a surprisingly long 30 seconds before disintegrating _
long-sought proof, they say, of the existence of an ``island of stability.''
	   Using an atom smasher to bombard plutonium with calcium ions,
the physicists created an element with an atomic weight of 114. The
newest addition to the periodic table has yet to be named.
	   Ninety-four elements exist in nature. Scientists have spent 60
years creating elements in the lab, registering 21 so far. But some
of the more recent elements were so unstable that they
disintegrated in milliseconds.
	   For decades, physicists have theorized the existence of
super-heavy manmade elements with a much longer life. These
elements would make up an ``island of stability.''
	   In Thursday's issue of the journal Nature, researchers at the
Joint Institute for Nuclear Research in Dubna, Russia, reported
creating two atoms of element 114 that lasted for as long as 30
seconds before flickering out. This, they say, is proof the island exists.
	   The discovery, and more recent creations of even heavier
elements, have no practical applications as far as today's scientists know.
	   But for academics, it's thrilling. The study of super-heavies
could shed light on supernovas and origins of the universe. And
chemists are interested in how they bond with compounds.
	   The new manmade elements are numbered according to how many
protons are in their nuclei, not by their order of discovery.
Numbers 95 through 112 were created between 1944 and 1996. In the
past year, scientists have created not just 114, but also 116 and
118. The ones in between have not yet been created.
	   For decades, scientists thought one isotope, or version, of
element 114 _ with 114 protons and 184 neutrons _ would be very
stable because its nucleus would have a full complement of neutrons
and protons. No more could be squeezed inside.
	   Late last year, the Dubna scientists made an isotope of element
114 with 175 neutrons. In March, the lab created another 114
isotope, but it had only 173 neutrons and was therefore less stable
than the first one they created.
	   This year, another major lab trying to create elements, Lawrence
Berkeley National Laboratory in California, forged the heaviest
element yet, 118, and when it decayed, it morphed into element 116,
then an isotope of 114 with even fewer neutrons than Dubna's. It
lasted for milliseconds.
	   These three types of 114 are just off the ``island of
stability,'' scientists say, because they are all short of the 184
neutrons needed. But physicists say they are in ``shallow water,''
and that's proof enough.
	   If they can create a 114 isotope with 184 neutrons, they would
reach real stability: perhaps a life measured in years.
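The isotope bookkeeping in these paragraphs is easy to check: an isotope's mass number is its proton count plus its neutron count, and its distance from the "island" is how far it falls short of 184 neutrons. A quick sketch, using only the figures quoted in the article:

```python
# Mass number = protons (Z) + neutrons (N); the "island of stability" isotope
# of element 114 is predicted at N = 184. Neutron counts are those quoted above.

Z = 114
TARGET_N = 184

isotopes = {
    "predicted stable isotope": 184,
    "Dubna, late 1998": 175,
    "Dubna, March 1999": 173,
}

for label, n in isotopes.items():
    print(f"{label}: mass number {Z + n}, {TARGET_N - n} neutrons short of the island")
```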
	   One physicist, Albert Ghiorso of Lawrence Berkeley, said he is
skeptical the Russians really did create such an element. He said
that with their setup, it is too difficult to pinpoint a single
atom among all the collision byproducts.
	   But Neil Rowley, of the Institute for Subatomic Research in
France, is convinced the Dubna observations are real. ``Everything
behaves the way it ought to,'' he said.

Reuters: I don't know how Reuters feels about it, but we'll find out.


10:57 AM ET 07/29/99
Pacifica To Reopen Berkeley Station
By JESSIE SEYFER
Associated Press Writer
	   BERKELEY, Calif. (AP) _ Protesters outside community radio
station KPFA were unmoved _ literally and figuratively _ by news
that station owners planned to end a two-week staff lockout.
	   ``It's a positive move, but we're still concerned that the sale
of the station is still on the table,'' said protester Tracy
Rosenberg of the free-speech advocacy group Media Alliance.
	   KPFA supporters have been in an uproar over the past few weeks,
claiming that station owners want to sell and let the radical
station go mainstream. Two weeks ago, owners locked out staffers
and put them on paid leave.
	   Rosenberg was among 30 protesters Wednesday night who carried on
as they had for the past two weeks, planning food runs and night
watches outside the station.
	   Hours earlier, the Pacifica Foundation, which owns KPFA,
announced it would end the lockout, inviting staff to resume
regular programming at 9 a.m. Friday. Security personnel would be
asked to leave, and Pacifica would ``remove itself from management
of the station for six months to one year,'' according to a news release.
	   But Rosenberg and others said they saw the announcement as an
offer that would need approval from the staff's union at mediation
sessions that were scheduled to continue today. Meanwhile, they would sit tight.
	   ``We will continue tomorrow to see the results of the mediation.
Then we'll take it from there,'' Rosenberg said.
	   Pacifica Chairwoman Mary Frances Berry has denied the station
was for sale, but Pacifica board member Pete Bramson said the issue
was under discussion.
	   The Pacifica board said in a resolution Wednesday that if it
cannot resolve the situation at KPFA within six months it would
talk to people interested in buying the station, according to a
copy of the statement obtained by the San Jose Mercury News.
	   KPFA's troubles date to April when a popular station manager was
dismissed. A subsequent management edict that the issue not be
discussed on the air set the stage for a series of showdowns. A
talk show host and music programmer were fired and more than a
dozen protesters arrested for trespassing. In mid-July, veteran
newsman Dennis Bernstein was yanked off the air for talking about
the controversy.
	   Pacifica also owns stations in Los Angeles, Houston, New York and Washington, D.C.
       

Apparently I can quote Wired articles at will as long as the whole thing is posted with the copyright notice. Beautiful then, I will have to do so. Thank you Wired.



The article as follows: Copyright © 1994-99 Wired Digital Inc. All rights reserved. 

W I R E D   N E W S
The Dawn of a New Mesozoic Era
by Kristen Philipkoski

New data on biodiversity show that, if humans don't make some changes, two-thirds of the world's species could face the fate of
dinosaurs within 100 years.
Scientists at the International Botanical Congress in St. Louis said humans had abused the earth so effectively that extinction
rates are close to those of previous mass extinctions in geologic history. The IBC, which convenes every six years, gathered together
more than 4,000 experts in botany, mycology, plant ecology, horticulture, and agriculture from around the world to discuss the
latest developments in the plant sciences.
"It's the same as in a zoo. If a few individuals of some species are genetically not very variable they could be hit by disease much
more easily," said Peter Raven, director of the Missouri Botanical Garden, and a plant conservation expert.
Alan Thornhill, executive director of the Center for Conservation Biology Network, an organization not affiliated with the IBC,
echoed Raven's pessimism.
With world population threatening to double to 12 billion in the next 40 years, Thornhill said humans -- and too many of them -- are
to blame for the precipitous changes in biodiversity.
"The question is, Is that what we want? How many people do we want on this planet?"
Controlling food production could be one way of controlling the population, according to Thornhill. "Every year we produce more than
we did last year, and every year the population gets larger. In any ecological system, more food equals more individuals."
Research presented at the conference predicted that between one-third and two-thirds of all plants and animals -- most of them in
the tropics -- will die during the second half of the next century. If current trends continue, only 5 percent of the earth's
tropical forests will remain in 50 years, the data showed.
The loss would equal that of the last major extinction at the end of the Cretaceous Period and the Mesozoic Era, when the last of
the dinosaurs died off.
About 30 percent of the earth's 300,000 species are in protected cultivation in botanical gardens around the world, Raven said. But
society also needs to protect plants in the wild.
 "The point is that many of the species in cultivation are in only one small patch. That may be the only patch of cultivation of
that species anywhere in the world," he said.
Medicinal plants are particularly important, Raven said.
For example, rosy periwinkle, a plant native only to Madagascar, is the source of a drug that makes Eli Lilly US$130 million per
year. The drug increased the survival rate for childhood leukemia from one in 20 to 19 in 20.
"Ninety percent of the natural vegetation of Madagascar has already been destroyed," Raven said. "With that degree of destruction,
the chance of [rosy periwinkle] surviving is really minimal."
Raven had several suggestions in his speech to the conference, including the establishment of a new United Nations agency that would
monitor plants, detect endangered species, and take steps to conserve them.
He also recommended greater financial support for ongoing research into plant population biology and reproduction.
The money should come from the Global Environment Facility Fund, established by the World Bank, Raven said. The money could be used
to make plant information available on the Internet, and a census of plants in each country would keep researchers informed about
the status of different species.
"While I don't disagree with any of his points, and think these issues are certainly important, they are in no way enough,"
Thornhill said.
"I would suggest that these are all appropriate responses to the ailment, but what we need is to deal with the agent of the disease:
humans. This doesn't make me a very popular ecologist, but humans are turning the biomass of the planet into human mass, and in so
doing, depleting the diversity of the planet."
Raven agreed with Thornhill that overpopulation is the root of the problem.
"In the broader sense, you can't conserve anything without a sustainable world," Raven said.
The United States is particularly guilty of wasting resources, Thornhill said. With a fraction of the population of China, the two
countries consume virtually the same amount of energy, food, and water.
"It's easy to point the finger and say 'They're reproducing too much,' but we're wasting too much," Thornhill said.


Apparently I can quote Wired articles at will as long as the whole thing is posted with the copyright notice. Beautiful then, I will have to do so. Thank you Wired.



The article as follows: Copyright 1994-99 Wired Digital Inc. All rights reserved.
DOD Scientist: Lose the Humans
by Niall McKay
SEATTLE -- Advances in computer science would occur more rapidly if it weren't for one thing -- people.
"People are the single most limiting factor to the progress of computer science," said David Tennenhouse, chief scientist with Darpain a speech at Mobicom, a mobile technology conference.
Tennenhouse was addressing a crowd of 400 international academicians and scientists, most of whom had Ph.D. on their name badges. "We need to get humans out of the (computing) loop," he said.
Tennenhouse is charged with overseeing the funding grants and research projects for Darpa, the central research and development organization for the Department of Defense. Darpa pursues research and technology where benefits might not be seen for decades. Darpa projects have been instrumental in the development of the Internet, high-performance computing, and graphics for military and civilian applications.
However, according to Tennenhouse, the agency needs to change its focus from interactive computing, whereby computers interact with people, to PROactive computing -- his acronym for "physical, real and out there." In this model, computers interact with network sensors or robots. "This is the second front of computer science. We need to declare victory in the field of interactive computing and move on."
Tennenhouse believes scientists need to bridge the gap caused by humans interjecting themselves between computers and the physical world. "For example, one could connect all the weather monitoring sensors (for a region) over a network, suck down all the data produced by those sensors, and then analyze it."
Today, computers interact with people, and then people interact with the physical world, slowing down the process of data analysis, said Tennenhouse. "We need to build applications and computers that operate at faster than human speed."
Tennenhouse cites examples of automotive anti-lock brakes and airbags, as well as fighter plane controls, as successful sensor-to-computing-to-physical world applications.
Despite the technophobia caused by Y2K threats and replicating software viruses, Tennenhouse puts great faith in the abilities of robotic devices to operate without human interaction. Tennenhouse has recently renewed funding of robotics research at Darpa after a lengthy lapse.
"But I don't want to make the mistake Darpa made in the past, which was to take great computer scientists and turn them into mediocre mechanical engineers," he said. Instead, Darpa's research will concentrate on how to build software for robots so that they can operate autonomous devices. "Robots need to be able to coordinate themselves, and at the same time, computer science needs to reinvent the functions of the embedded processor."
Tennenhouse says that in the future, one person could be responsible for overseeing thousands of processors, but since a human can't handle the load, scientists will have to find a way to monitor the data and oversee the process.
At present, computer systems are still largely hierarchical in nature -- a sensor contributes a piece of information that is then processed centrally before any action takes place. "What if a sensor could launch an application individually?" said Tennenhouse. For example, a sensor spotting a unique weather condition could start an application that would analyze all the data from other weather sensors to track patterns.
Tennenhouse also advocates that computer science use more predictive analytical methods. "We need to use stochastics [statistical] software so that devices can predict their environment rather than react to it. The physical world is full of uncertainty, so we're going to have to build systems that can deduce what is going to happen," he said.
While some computer industry observers might assume that computer science has already reached its pinnacle, Tennenhouse said he calculated that only 2 percent of the potential of computer science has been realized to date. "At the current rate of progress it will take another 2000 years to figure out the rest."
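Tennenhouse's weather example boils down to an event-driven loop with no person in it: a sensor reading, not an operator, launches the analysis. Here is a minimal Python sketch of that pattern; the station names, units, and thresholds are invented for illustration and are not anything Darpa has built.

```python
# Sketch of the "proactive" pattern Tennenhouse describes: a sensor reading
# outside its expected range launches an analysis of the whole sensor network,
# with no human in the loop. Station names and thresholds are hypothetical.

import statistics

def analyze_region(readings):
    mean = statistics.mean(readings.values())
    return f"regional mean pressure {mean:.1f} hPa across {len(readings)} stations"

def on_new_reading(sensor_id, value, readings):
    readings[sensor_id] = value
    # The sensor event itself launches the application.
    if value < 980.0 or value > 1040.0:
        print(f"{sensor_id}: anomalous reading {value} hPa -> {analyze_region(readings)}")

readings = {"station-a": 1012.0, "station-b": 1008.5}
on_new_reading("station-c", 975.2, readings)   # anomaly: triggers the analysis
on_new_reading("station-d", 1011.0, readings)  # normal: nothing launched
```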

Apparently I can quote Wired articles at will as long as the whole thing is posted with the copyright notice. Beautiful then, I will have to do so. Thank you Wired.


W I R E D   N E W S
Mercury's Scary Migration
Environment News Service
Researchers say they have found the first evidence that mercury can circumvent the blood-brain barrier that usually prevents toxins from entering the brain. Though the studies involved fish, the 
findings have implications for humans, particularly children, and for other species as well. Scientists at Canada's
Maurice Lamontagne Institute and the Swedish University of Agricultural Sciences found that mercury dissolved in 
lake and river water can enter the nerves that connect water-exposed sensory receptors -- for odor, taste, 
vibration, and touch -- to the brains of brown and rainbow trout. The mercury can go directly to the brain, 
circumventing the blood-brain barrier -- a nearly impermeable membrane that prevents most toxins from reaching 
the brain. The researchers say this is the first study concerning mercury levels in fish brains, as opposed to levels
accumulated in other body areas, and the first time it has been established that mercury can enter fish brains 
through sensory receptors and their connected nerves."Considering the importance of complex behavior in the life 
of fish, and the well-known deleterious effects of mercury upon the nervous system, the toxicological significance of 
this uptake route needs to be assessed," says Claude Rouleau, Ph.D., a research scientist at Environment 
Canada's National Water Research Institute and the study's primary investigator. Rouleau performed the research 
at the Swedish University of Agricultural Sciences, Uppsala, and completed it for publication while at the  Maurice Lamontagne Institute-Department of Fisheries and Oceans Canada, in Mont-Joli, Quebec. Mercury 
pollution in waterways comes from unsafe manufacturing processes, and the combustion of fossil fuels that contain 
mercury. Natural sources such as the degassing of the earth's crust, forest fires, the evaporation of seawater, and 
volcanoes also add mercury to the environment. But an estimated two-thirds of environmental mercury is the result 
of human activities. Mercury's toxic effects on fish and human brains are well established. Fish depend on their 
nervous systems to find food, communicate, migrate, orient themselves, and recognize predators. Dissolved 
mercury usually is taken in by fish through their gills and dispersed by blood as it circulates through the 
body. Exposure to mercury can damage the brain and nervous system, affecting language, attention, and memory, 
particularly in children. The environmental group Clean Water Action has calculated that the average mercury level 
in tuna is high enough that eating as little as two ounces of tuna a week would be unsafe for a child weighing 35 
pounds. In May, the Environmental Working Group and Health Care Without Harm issued a report warning 
pregnant mothers to avoid canned tuna due to mercury contamination risks. In most cases, little mercury 
accumulates in the brain, which is protected by the blood-brain barrier. However, mercury that does accumulate, 
having passed through the bloodstream or through nerves, is concentrated in specific sites connected to primary 
sensory nerves critical to the function of the nervous system. "The accumulation of mercury or other toxic chemicals 
in the brain via water-exposed nerve terminals may result in an alteration of these functions and jeopardize fish 
survival," says Rouleau. "We believe that uptake of metals such as mercury and the subsequent transport along 
sensory nerves is a process common to all fish species, and in this respect, it is possible that other toxins -- such 
as pesticides -- also could reach fish brains in this way and this is a subject worthy of further study." Rouleau says 
that while chemicals in the brains of such fish may not have direct human implications, as most people do not eat 
fish brains, the survival of these species does affect humans. More importantly, mercury may reach the brains of 
humans along similar pathways. Earlier research has shown that manganese, cadmium, and mercury can be taken 
through the nose and mouth linings of rodents and transported to the brain through the olfactory nerves. "The fact 
that mercury is transported along fish nerves can be extrapolated to humans, as nerve transport also occurs in 
mammals, including humans," said Rouleau. "Thus, mercury and other toxins could possibly accumulate in human 
brains via nerve transport." Rouleau's research is published in the 1 October issue of Environmental Science and 
Technology, a peer reviewed publication of the American Chemical Society.
Copyright 1999 Environment News Service (ENS).

Apparently I can quote Wired articles at will as long as the whole thing is posted with the copyright notice. Beautiful then, I will have to do so. Thank you Wired.


W I R E D   N E W S
Earth's Oceans Washing Away?
by Lindsey Arent
Forget surfing, sandcastles, and sunsets as we know them. The Earth could one day be as dry as a bone and as 
desolate as Mars, scientists say. Don't ditch your yacht just yet, however. You've got another billion years.
Geologists from the Tokyo Institute of Technology have found that the Earth's oceans are seeping into the planet's interior five times faster than they are being replenished.
The Japanese team, led by scientist Shigenori Maruyama, calculated that about 1.12 billion tons of water drains 
into the Earth's mantle each year, while just .23 billion tons moves in the opposite direction, according to an article 
published in the 11 September issue of New Scientist.
Maruyama believes that over time the planet's oceans will disappear altogether.
"The world's oceans will dry up within a billion years," Maruyama told New Scientist. "Earth's surface will look very 
much like the surface of Mars, where a similar process seems to have taken place."
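The arithmetic behind that billion-year figure is simple to check, though it needs one number the article doesn't give: the total mass of the oceans, roughly 1.4 x 10^18 tonnes, which is an outside, approximate figure. Treat the sketch below as an order-of-magnitude check only, since (as Jeanloz notes next) there's no guarantee the rates stay constant.

```python
# Ballpark check of the drying-up timescale. The two flow rates are from the
# article; the total ocean mass (~1.4e18 tonnes) is an outside, approximate
# figure, so the result is an order-of-magnitude estimate only.

inflow_to_mantle = 1.12e9   # tonnes of water per year, into the mantle
return_flow      = 0.23e9   # tonnes of water per year, back to the surface
ocean_mass       = 1.4e18   # tonnes, approximate total mass of the oceans

net_loss = inflow_to_mantle - return_flow
print(f"net loss: {net_loss:.2e} tonnes per year")
print(f"time to drain at a constant rate: ~{ocean_mass / net_loss:.1e} years")
```

At a constant net loss of about 0.89 billion tonnes per year, the oceans would last on the order of 1.5 billion years, the same ballpark as Maruyama's figure.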
While others have come to similar conclusions about the rate at which oceans are draining, some scientists find 
Maruyama's theory of an oceanless world a bit hard to swallow.
"It's a neat idea," said UC Berkeley geophysics professor Raymond Jeanloz. "But I'm not so enthusiastic about it 
because there's no reason to expect that how much water comes in and how much flows out is constant as a function 
of time." Just because more water leaks into the earth now doesn't mean that it's going to continue that way, said 
Jeanloz. "More water is going down than coming out right now, but in the past we thought the opposite was true -- that more was coming out than going in. Maybe we've been in this situation many times in geologic history before," 
he said. Geoscientists generally believe that a large reservoir of water lies about 250 miles below the surface of the 
Earth, in the zone between the upper and lower mantles. Water flows to the mantle through areas where oceanic 
plates dive beneath continental plates, called subduction zones. The water returns to the Earth's surface via 
volcanic hotspots and oceanic ridges, where molten rock and gases are pushed up through the earth's crust.
Maruyama will present his findings at the December meeting of the American Geophysical Union in San Francisco. 
But for now, the jury is still out on his theory, Jeanloz said. "The uncertainties are pretty big, even with the current 
rate of water coming out. It is a huge extrapolation, but an interesting hypothesis."

From TechReview


Mining the Genome 
The Human Genome Project has piled up a mountain of data. How will companies extract the gold of new drugs? One emerging tool may hold the key: pattern-finding software that uncovers the richest veins. 

By Antonio Regalado 

Larry Hunter had just moved into his new office when a reporter visited, so the room lacked knickknacks and family snapshots. Hunter had, however, started unpacking his books, and they were already beginning to form an interesting pattern. Roger Schank's Dynamic Memory, a classic title in artificial intelligence, was shelved next to Georg Schulz's Principles of Protein Structure. Machine Learning flanked Oncogenes. Artificial Life leaned on Medical Informatics. 

Properly interpreted, the pattern on Hunter's bookshelf reveals the latest trend in biology, a field now so overwhelmed by information that it is increasingly dependent on computer scientists like Hunter to make sense of its findings. An expert in an offshoot of artificial intelligence research known as machine learning, in which computers are taught to recognize subtle patterns, Hunter was recently lured from a solitary theoretical post in the National Library of Medicine to head the molecular statistics and bioinformatics section at the National Cancer Institute (NCI) -- a group formed in 1997 to use mathematical know-how to sift the slurry of biological findings. 

Where is all the data coming from? The simple answer is that it's washing out of the Human Genome Project. Driven by surprise competition from the commercial sector, the publicly funded effort to catalog the estimated 100,000 human genes is nearing its endgame; several large academic centers aim to finish a rough draft by next spring. By then, they will have dumped tens of billions of bits of data into the online gene sequence repository known as GenBank, maintained by the National Center for Biotechnology Information (NCBI) at the National Institutes of Health (NIH) in Bethesda, Md. And DNA sequences aren't the only type of data on the rise. Using "DNA chips," scientists can now detect patterns as thousands of genes are being turned on and off in a living cell -- adding to the flood of findings. 

"New kinds of data are becoming available at a mind-blowing pace," exults Nat Goodman, director of life sciences informatics at Compaq Computer. Compaq is one of many companies seeking an important commercial opportunity in "bioinformatics." This congress of computers and biology is a booming business, but has so far revolved mostly around software for generating and managing the mountain of gene data. Now, pharmaceutical companies need ever-faster ways to mine that mountain for the discoveries that will lead to new treatments for disease. 

That's where entrepreneurial researchers such as Larry Hunter come in. On Hunter's bookshelf sits a glass bauble reading: "$2,000,000 Series A Preferred. March 5, 1999" -- a celebration of venture capital funds raised by Molecular Mining, a company he co-founded. The firm, based in Kingston, Ontario, hopes to use data-mining methods to help pharmaceutical companies speed the development of new drugs by identifying key biological patterns in living cells -- such as which genes are turned on in particularly dangerous tumors and which drugs those tumors will respond to. And a dozen other startups -- the biotech industry's best indicator of a hot trend -- have been formed to make data-mining tools (see "The Genome Miners"). "Biology," Hunter predicts, "will increasingly be underpinned by algorithms that can find hidden structure in massive amounts of molecular data." This kind of data-mining work, which Hunter specializes in, is often known as "pattern recognition" and it's one of the fastest-moving areas in bioinformatics. Indeed, if Hunter is right, pattern recognition might turn out to be the pick that brings forth the gold of new therapies. 


First You Have to Find Them 
To get a sense of how big the mountain Hunter and his colleagues are tunneling into, consider the fact that every human cell has 23 pairs of chromosomes containing about 3.5 billion pairs of nucleotides, the chemical "letters" A, C, G and T that make up DNA's genetic code. But the actual genes that carry code to make proteins, and go wrong in genetic diseases and cancer, occupy less than 3 percent of the genome; the rest is genetic noise. Making genes still trickier to unearth is the fact that their protein-coding elements are scattered, as are the genetic signals that the cell uses to stitch them back together and guide their "expression": the process that activates them to make proteins. "The key to understanding the genome is understanding the language of these signals," says David Haussler, a leading computational biologist at the University of California at Santa Cruz. "But they are hidden, and they are noisy." 

The first crucial problem is to extract them from this maze of irrelevant code. At Oak Ridge National Laboratory, Edward Uberbacher's Computational Biosciences Section has tackled the gene-finding problem with artificial neural networks -- a type of artificial intelligence (AI) program distinguished by its capacity to learn from experience. At Oak Ridge, neural nets had been used for jobs such as recognizing enemy tanks in fuzzy satellite images; in 1991, Uberbacher adapted these methods to create a program, called GRAIL, that can pick out genes. Since then, GRAIL has been joined by at least a dozen other gene-finding programs, many of which are available to researchers online. 

The current gene-locating programs are far from perfect, sometimes predicting genes that aren't real and often missing genes that are. Partly because of accuracy problems, says Uberbacher, "these methods have been on the fringe for a while." But given the accelerating flood of genome data, biologists will be forced to rely on -- and improve -- them. "Imperfect as they are, they are the best place to start," says Lisa Brooks, program director of the National Human Genome Research Institute's genome informatics branch, whose operation doles out $20 million a year to support bioinformatics databases and to develop new data-mining methods. 

Pattern-recognition programs aren't used only for discovering genes; they're also heavily exploited to give researchers clues as to what genes do. Today the most widely used program -- the NCBI's Basic Local Alignment Search Tool, or BLAST -- receives 50,000 hits per day from researchers searching for similarities between newly discovered DNA sequences and ones whose roles are already understood. Given similar sequences, scientists can often deduce that two genes have similar functions. 
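To give a flavor of what a similarity search does, here is a toy sketch of the seeding idea behind tools like BLAST: count the short "words" (k-mers) a query shares with each database sequence. Real BLAST goes on to extend such seed matches into scored local alignments; the sequences below are invented, not real genes.

```python
# Toy version of the seeding step in sequence-similarity search: count shared
# k-letter "words" between a query and each database sequence. The sequences
# are made up; real tools extend these seed matches into local alignments.

def kmers(seq, k=4):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def shared_kmers(query, subject, k=4):
    return len(kmers(query, k) & kmers(subject, k))

database = {
    "known_gene_1": "ATGGCGTACGTTAGCTAGCTAGGCTA",
    "known_gene_2": "TTTTCCCCAAAAGGGGTTTTCCCCAA",
}
query = "GCGTACGTTAGCTAGCT"

for name, seq in database.items():
    print(f"{name}: {shared_kmers(query, seq)} shared 4-mers")
```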

In researchspeak, the process of interpreting a gene's function and entering it into a database is called "annotation." In May, London's Sanger Center and the European Bioinformatics Institute (EBI), a branch of the multinational European Molecular Biology Laboratory in Hinxton, England, announced a hastily organized project known as EnsEMBL. The goal of EnsEMBL, says EBI's Alan Robinson, is "to make sure the first draft of the human genome will have annotation attached." EnsEMBL's first activity will be to send out gene-finding algorithms to rove the genome and bring back a rough picture of where the genes are -- a prospector's hand-drawn map. With the map drawn, EnsEMBL will use tools such as BLAST to guess at the genes' functions. 

Plans for computerized discovery pipelines like this one are important to pharmaceutical companies, who are racing to identify -- and patent -- key disease-causing genes. In June, for example, the German drug giant Bayer agreed to pay a Heidelberg startup, Lion Bioscience, as much as $100 million for an automated system to mine genetic databases. Lion has dubbed the computerized approach "i-biology," according to its head of bioinformatics Reinhard Schneider, and is promising Bayer that in five years its computers will discover 500 new genes, as well as annotate 70 genes Bayer has already found. Pattern-recognition algorithms, which will drive the daily scourings of the databases, lie at the core of i-biology. 

Although the Bayer-Lion pact is a record-breaker, it is just one among dozens of data-mining alliances between pharmaceutical giants and computationally savvy startups -- evidence that mathematical methods are taking center stage in genomic research. And the academics who write the algorithms also find their stars rising, especially in industry. Lion was founded by top bio-infonauts from the European Molecular Biology Laboratory, headquartered in Heidelberg. At Celera Genomics, the Rockville, Md., company whose plans to decipher the genetic code have shaken up the Human Genome Project and accelerated the publicly funded work, success rides on the expertise of pattern analysis expert Eugene Myers. Celera lured Myers from a tenured position at the University of Arizona to head its informatics efforts, hiring Compaq to build him what's being touted as the world's most powerful civilian supercomputer (see "The Gene Factory," TR March/April 1999). According to Haussler, most scientists think the success of Myers' methods will "make or break" Celera. 


Cancer Categorizer 
Crucial as they are, identifying and comparing genes for clues to their function are just first steps on a long path toward medical relevance -- developing a drug can take many years longer. But computational scientists say pattern mining could have much nearer-term payoffs when applied to another type of genomic data known as "gene expression profiles." 

A gene's expression level refers to how many copies of its specific protein it is being called upon to make at any given time. The proteins are the actual workhorses in the cell, carrying out the daily tasks of metabolism; the levels of each can vary dramatically over time, and are often out of kilter in diseased cells. Thanks to devices known as DNA microarrays, or, more familiarly, "DNA chips," scientists can now for the first time regularly measure the expression levels of thousands of genes at once. DNA chips take advantage of the fact that to make a protein, a cell first "translates" a gene into multiple copies of a molecule called messenger RNA (mRNA). The type and quantity of mRNAs in a cell correspond to the proteins on order -- and by measuring the levels of thousands of different mRNAs at once, DNA chips are able to create a snapshot of the activity of thousands of genes. 

Mark Buguski, senior investigator at NCBI, says the new data on gene expression levels are "unlike anything biologists have ever been exposed to." Before, biologists could only analyze the activity of a few genes at a time. Now, DNA chips can produce a "massively parallel" readout of cellular activity. That's an important advance, because the difference between health and disease usually lies not in the activity of a single gene but in the overall pattern of gene expression. 

A team at the Whitehead/MIT Center for Genome Research is putting this massively parallel readout to work identifying telltale differences between different cancers. Known as the Molecular Pattern Recognition group, it was started last year by genome center director Eric Lander and is led by molecular biologist Todd Golub. Other members include ex-IBM mathematician Jill Mesirov, computer scientist Donna Slonim, and computational physicist Pablo Tamayo, who joined Whitehead from the supercomputer company Thinking Machines. 

This interdisciplinary brain trust is trying to solve an enormously important problem in pattern recognition. Tumors vary in subtle ways, and cancer cells that look the same under a microscope respond very differently to drugs. "The things we call a single type of cancer are surely many types of cancer," says Lander, "but we don't know what [differences] to look for." 

To provide a benchmark for the new methods, Lander's group started with two types of leukemia that can already be distinguished under the microscope: acute myeloid leukemia (AML) and acute lymphoid leukemia (ALL). They measured the levels of about 6,800 different genes in bone marrow samples from 38 leukemia patients, which they would mine for patterns that could distinguish AML from ALL. But working with 6,800 parameters (the genes) and only 38 data points (the samples) made for a task akin to trying to forecast an election by polling a dozen people. After running through a year's supply of pencils and scratch paper, they hit on a solution. 

A key step involved feeding the data points into a learning algorithm known as a "self-organizing map." By plotting the 38 samples into a high-dimensional mathematical space, the map algorithm was able to partition the samples into two groups -- one for each type of cancer. Checking against information about the known tumor types, Lander says, it became clear that the clusters broke out the ALL and AML samples almost perfectly. "We showed that if you hadn't known the distinction between these two types of leukemias -- which in fact took 40 years of work to establish -- you would have been able to recapitulate that in one afternoon," he says. 
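The clustering step itself is simple to sketch. The toy below uses plain k-means in place of a self-organizing map, and a tiny invented expression matrix in place of the 38-sample, 6,800-gene data set, but it shows the same move: samples with similar expression patterns fall into the same group without anyone telling the algorithm which label is which.

```python
# Sketch of the clustering step: plain k-means (k = 2) stands in for the
# self-organizing map, and a tiny invented expression matrix stands in for the
# 38-sample, 6,800-gene data set. Rows are samples, columns are genes.

def squared_distance(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q))

def kmeans_two_groups(samples, iters=10):
    # Deterministic start: seed the two centers with the first and last sample.
    centers = [list(samples[0]), list(samples[-1])]
    clusters = [[], []]
    for _ in range(iters):
        clusters = [[], []]
        for s in samples:
            nearest = 0 if squared_distance(s, centers[0]) <= squared_distance(s, centers[1]) else 1
            clusters[nearest].append(s)
        for i, cluster in enumerate(clusters):
            if cluster:
                centers[i] = [sum(genes) / len(cluster) for genes in zip(*cluster)]
    return clusters

# Toy expression profiles: the first three samples over-express genes 1 and 2,
# the last three over-express genes 3 and 4 (think "two kinds of tumor").
samples = [
    [9.1, 8.7, 1.2, 0.9],
    [8.8, 9.3, 1.0, 1.1],
    [9.5, 8.9, 0.8, 1.3],
    [1.1, 0.7, 9.2, 8.8],
    [0.9, 1.2, 8.9, 9.4],
    [1.3, 0.8, 9.6, 8.7],
]

for i, cluster in enumerate(kmeans_two_groups(samples)):
    print(f"group {i}: {len(cluster)} samples")
```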

The research team also got an inkling of how valuable their methods (still unpublished as TR went to press) could be for patients. At one point, the algorithms failed to categorize a sample into either of the leukemia categories. Was the math flawed? No -- the diagnosis was. Prompted by the program's result, doctors took another look and found what they had believed was leukemia was in fact a highly malignant muscle cancer, for which the patient is now being treated. At Cambridge, Mass.-based Millennium Pharmaceuticals, researchers are betting similar approaches will lead to "optimal diagnostic tests" for cancer, according to Dave Ficenec, a former astrophysicist hired by Millennium to install the latest data-mining algorithms in its in-house software. The company collaborates closely with Lander's center -- Lander is a Millennium co-founder who sits on the company's board of directors. 

The new parallel methods for making snapshots of gene expression are also being used to evaluate new drug candidates. At startup Rosetta Inpharmatics in Kirkland, Wash., a scientific team is assembling and mining databases for gene patterns to speed drug discovery. Rosetta studies yeast cells, exposing them to potential new drugs and then analyzing levels of gene expression for clues to the drugs' actions. For example, the cells can be rapidly checked to see whether their response matches a pattern typical of toxic side effects. Tossing out such losers early on is part of Rosetta's program of "improving the efficiency of drug discovery," says Stephen Friend, who doubles as Rosetta's chief science officer and head of the molecular pharmacology program at Seattle's Fred Hutchinson Cancer Research Center. Drug firms have taken notice, with eight signed up as Rosetta partners. 

Brain Drain 
While researchers at companies and universities are jumping on the data-mining bandwagon, they are likely to encounter plenty of bumps in the road ahead. Some investors, for instance, remain concerned that databases of different biological results are still poorly interconnected, and sometimes of uneven quality. Says Larry Bock, an investor at the Palo Alto office of the venture firm CW Group: "It may be a bit early for data-mining, since your ability to mine is directly related to the quality of the database." Still, says Barbara Dalton, vice president at the venture firm SR One in West Conshohocken, Pa., "the long-term prospects look good." SR One, along with Princeton, N.J.'s Cardinal Health Partners, anted up $2 million to finance Larry Hunter's startup, Molecular Mining. "Data-mining is going to be a core part" of drug discovery, Dalton predicts. 

But before that happens, the field may have to break its most serious bottleneck: an acute shortage of mentors. Bioinformatics has grown explosively during the 1990s, drawing many of the best university teachers and researchers into the high-paying private sector. "We went from very little interest in bioinformatics, to -- Bang! -- having most of the people working in companies," says Mark Adams, who left the academic track to work for the Cambridge, Mass., biotech company Variagenics. With universities drained of some of their brightest minds, many wonder who will train the next generation of computational biologists. 

Part of the answer came in June, when a special advisory panel convened by NIH director Harold Varmus concluded the U.S. government should spend as much as $10 million to fund 20 new "programs of excellence" in biomedical computing. Several universities have also gotten into the act, including Johns Hopkins, where a new computational biology program is under way, thanks to a $2.5 million grant from the Burroughs Wellcome Fund. Stanford, Princeton and the University of Chicago are all planning major centers that will bring physical scientists together with biologists. 

In industry, the convergence is already reality. One-third of Rosetta Inpharmatics' 100 employees are computational scientists, drawn from fields as diverse as sonar detection, air traffic control and astrophysics. Chief scientist Stephen Friend says he's come to an important realization since joining the company in 1997. Biologists may still ask the best questions and design the most compelling experiments, he says, but "the best answers are coming from the physicists or mathematicians." Those answers are likely to lead to important new therapies -- gold extracted from the mountains of the Human Genome Project by the tools of pattern recognition. 

Antonio Regalado is a senior associate editor at Technology Review 

From Wired News

W I R E D   N E W S
The Little Engine that Might
by Leander Kahney
Taking on the world's giant energy business, a tiny startup is set to launch an engine that requires no fuel, produces
no pollution, and is free to run. Naturally, the experts think it's too good to be true -- although they can't exactly say 
why. Entropy Systems, a seven-person startup based in Youngstown, Ohio, is scheduled to launch the Entropy 
engine early next year, said the technology's inventor, Sanjay Amin, a mechanical engineer and co-founder of the 
company. The Entropy engine acts like a heat sponge, absorbing heat in the atmosphere and converting it to power, 
Amin said. Since it consumes no fossil fuels, nuclear fuels, or electrical power, it produces no emissions, directly or 
indirectly. Its only byproduct is cold air. Initially, the technology will be used to create an outboard motor for small 
pleasure boats, simply because it's the easiest market to break into, Amin said. But as it is developed, the technology 
could be used to run refrigerators, air conditioners, generators -- even automobiles. "There's no reason it can't power 
a car," Amin said. So far, Amin has built a prototype, which he said generates one-tenth of one horsepower. The 
outboard motor -- yet to be built -- will produce between two and three horsepower. It will be roughly the same size as 
a conventional outboard motor and only marginally more expensive. But, apart from routine maintenance and 
lubrication, the engine will be free to run. Named after the unit in physics that describes the amount of available 
energy in a system, the Entropy engine consists of a central chamber, filled with air, that has a piston in the center, 
Amin said. The engine operates on a cycle. First, a starter motor spins the engine to a high speed, which pushes the 
gas to the edge of the central chamber, as in a centrifuge. As the gas moves to the edge, it creates a partial vacuum 
in the center that draws the piston out, compressing the gas. In the second part of the cycle, the engine is slowed, 
and the gas redistributes itself throughout the chamber, which increases the pressure on the piston. Heat trapped in 
the gas is converted into the energy that moves the piston, which cools the air in the engine chamber. The engine will 
run year-round in any climate, even in sub-zero temperatures. Although it operates better in warmer climates, it
will work in any environment above absolute zero (minus 273 degrees Celsius). "In physical terms, even ice has a lot 
of heat," Amin said.Amin claims to have patented the technology in the United States, Australia, and Europe. He said 
he has published a book on thermodynamics and in 1996 received an Engineer of the Year award from the American 
Society of Engineers of Indian Origin. Always obsessed with engines, Amin built steam engines as a teenager. He has  
devoted more than a decade to the Entropy engine. He began by looking at gravity as a power source, which 
eventually led to the idea of using atmospheric heat. The technology was developed in part when Amin was studying 
at Youngstown State University, which helped launch the fledgling company.  Bill Dunn, an associate professor of 
mechanical engineering at the University of Illinois at Urbana-Champaign, said that while he hasn't seen the engine in 
action, he has examined the materials on Entropy's Web site. He said the logic appears sound, but the outcome -- 
free power -- doesn't make sense. "It's the end result -- that you can create power from heat at ambient temperature --
that flies in the face of the basic laws of physics," said Dunn, who acknowledges that he hasn't devoted time to 
figure out why the engine shouldn't work. "To track down where his thinking may be flawed is a difficult thing to do," 
Dunn said. In Amin's favor, Dunn noted that he has attracted backing from "some very intelligent people." Hedging his 
bets, Dunn said breakthrough technologies have frequently been greeted with skepticism. "Every time someone 
suggests something like this, you should at least give them the benefit of an open mind." Iain MacGill, an energy 
campaigner at Greenpeace, said that because vehicle pollution makes up about a third of US greenhouse gas 
emissions, a pollution-free engine would be an incredible breakthrough. Nevertheless, it sounds to him like 
fiction. "It's got a flavor of 'too-good-to-be-true' about it," he said. "I'm a wee bit skeptical."