Drug Approved. Is Disease Real?
By ALEX BERENSON, The New York Times, January 14, 2008

Fibromyalgia is a real disease. Or so says Pfizer in a new television advertising campaign for Lyrica, the first medicine approved to treat the pain condition, whose very existence is questioned by some doctors.

For patient advocacy groups and doctors who specialize in fibromyalgia, the Lyrica approval is a milestone. They say they hope Lyrica and two other drugs that may be approved this year will legitimize fibromyalgia, just as Prozac brought depression into the mainstream.

But other doctors — including the one who wrote the 1990 paper that defined fibromyalgia but who has since changed his mind — say that the disease does not exist and that Lyrica and the other drugs will be taken by millions of people who do not need them.

As diagnosed, fibromyalgia primarily affects middle-aged women and is characterized by chronic, widespread pain of unknown origin. Many of its sufferers are afflicted by other similarly nebulous conditions, like irritable bowel syndrome.

Because fibromyalgia patients typically do not respond to conventional painkillers like aspirin, drug makers are focusing on medicines like Lyrica that affect the brain and the perception of pain.

Advocacy groups and doctors who treat fibromyalgia estimate that 2 to 4 percent of adult Americans, as many as 10 million people, suffer from the disorder.

Those figures are sharply disputed by those doctors who do not consider fibromyalgia a medically recognizable illness and who say that diagnosing the condition actually worsens suffering by causing patients to obsess over aches that other people simply tolerate. Further, they warn that Lyrica’s side effects, which include severe weight gain, dizziness and edema, are very real, even if fibromyalgia is not.

Despite the controversy, the American College of Rheumatology, the Food and Drug Administration and insurers recognize fibromyalgia as a diagnosable disease. And drug companies are aggressively pursuing fibromyalgia treatments, seeing the potential for a major new market.

Hoping to follow Pfizer’s lead, two other big drug companies, Eli Lilly and Forest Laboratories, have asked the F.D.A. to let them market drugs for fibromyalgia. Approval for both is likely later this year, analysts say.

Worldwide sales of Lyrica, which is also used to treat diabetic nerve pain and seizures and which received F.D.A. approval in June for fibromyalgia, reached $1.8 billion in 2007, up 50 percent from 2006. Analysts predict sales will rise an additional 30 percent this year, helped by consumer advertising.

In November, Pfizer began a television ad campaign for Lyrica that features a middle-aged woman who appears to be reading from her diary. “Today I struggled with my fibromyalgia; I had pain all over,” she says, before turning to the camera and adding, “Fibromyalgia is a real, widespread pain condition.”

Doctors who specialize in treating fibromyalgia say that the disorder is undertreated and that its sufferers have been stigmatized as chronic complainers. The new drugs will encourage doctors to treat fibromyalgia patients, said Dr. Dan Clauw, a professor of medicine at the University of Michigan who has consulted with Pfizer, Lilly and Forest.

“What’s going to happen with fibromyalgia is going to be the exact thing that happened to depression with Prozac,” Dr. Clauw said. “These are legitimate problems that need treatments.”

Dr. Clauw said that brain scans of people who have fibromyalgia reveal differences in the way they process pain, although he acknowledged that doctors cannot determine who will report having fibromyalgia by looking at a scan.

Lynne Matallana, president of the National Fibromyalgia Association, a patients’ advocacy group that receives some of its financing from drug companies, said the new drugs would help people accept the existence of fibromyalgia. “The day that the F.D.A. approved a drug and we had a public service announcement, my pain became real to people,” Ms. Matallana said.

Ms. Matallana said she had suffered from fibromyalgia since 1993. At one point, the pain kept her bedridden for two years, she said. Today she still has pain, but a mix of drug and nondrug treatments — as well as support from her family and her desire to run the National Fibromyalgia Association — has enabled her to improve her health, she said. She declined to say whether she takes Lyrica.

“I just got to a point where I felt, I have pain but I’m going to have to figure out how to live with it,” she said. “I absolutely still have fibromyalgia.”

But doctors who are skeptical of fibromyalgia say vague complaints of chronic pain do not add up to a disease. No biological tests exist to diagnose fibromyalgia, and the condition cannot be linked to any environmental or biological causes.

The diagnosis of fibromyalgia itself worsens the condition by encouraging people to think of themselves as sick and catalog their pain, said Dr. Nortin Hadler, a rheumatologist and professor of medicine at the University of North Carolina who has written extensively about fibromyalgia.

“These people live under a cloud,” he said. “And the more they seem to be around the medical establishment, the sicker they get.”

Dr. Frederick Wolfe, the director of the National Databank for Rheumatic Diseases and the lead author of the 1990 paper that first defined the diagnostic guidelines for fibromyalgia, says he has become cynical and discouraged about the diagnosis. He now considers the condition a physical response to stress, depression, and economic and social anxiety.

“Some of us in those days thought that we had actually identified a disease, which this clearly is not,” Dr. Wolfe said. “To make people ill, to give them an illness, was the wrong thing.”

In general, fibromyalgia patients complain not just of chronic pain but of many other symptoms, Dr. Wolfe said. A survey of 2,500 fibromyalgia patients published in 2007 by the National Fibromyalgia Association indicated that 63 percent reported suffering from back pain, 40 percent from chronic fatigue syndrome, and 30 percent from ringing in the ears, among other conditions. Many also reported that fibromyalgia interfered with their daily lives, with activities like walking or climbing stairs.

Most people “manage to get through life with some vicissitudes, but we adapt,” said Dr. George Ehrlich, a rheumatologist and an adjunct professor at the University of Pennsylvania. “People with fibromyalgia do not adapt.”

Both sides agree that people who are identified as having fibromyalgia do not get much relief from traditional pain medicines, whether anti-inflammatory drugs like ibuprofen — sold as Advil, among other brands — or prescription opiates like Vicodin. So drug companies have sought other ways to reduce pain.

Pfizer’s Lyrica, known generically as pregabalin, binds to receptors in the brain and spinal cord and seems to reduce activity in the central nervous system.

Exactly why and how Lyrica reduces pain is unclear. In clinical trials, patients taking the drug reported that their pain — whether from fibromyalgia, shingles or diabetic nerve damage — fell on average about 2 points on a 10-point scale, compared with 1 point for patients taking a placebo. About 30 percent of patients said their pain fell by at least half, compared with 15 percent taking placebos.

The F.D.A. reviewers who initially examined Pfizer’s application for Lyrica in 2004, for diabetic nerve pain, found those results unimpressive when weighed against the drug’s side effects, and they recommended against approval.

In many patients, Lyrica causes weight gain and edema, or swelling, as well as dizziness and sleepiness. In 12-week trials, 9 percent of patients saw their weight rise more than 7 percent, and the weight gain appeared to continue over time. The potential for weight gain is a special concern because many fibromyalgia patients are already overweight: the average fibromyalgia patient in the 2007 survey reported weighing 180 pounds and standing 5 feet 4 inches.

But senior F.D.A. officials overruled the initial reviewers, noting that severe pain can be incapacitating. “While pregabalin does present a number of concerns related to its potential for toxicity, the overall risk-to-benefit ratio supports the approval of this product,” Dr. Bob Rappaport, the director of the F.D.A. division reviewing the drug, wrote in June 2004.

Pfizer began selling Lyrica in the United States in 2005. The next year the company asked for F.D.A. approval to market the drug as a fibromyalgia treatment. The F.D.A. granted that request in June 2007.

Pfizer has steadily ramped up consumer advertising of Lyrica. During the first nine months of 2007, it spent $46 million on ads, compared with $33 million in 2006, according to TNS Media Intelligence.

Dr. Steve Romano, a psychiatrist and a Pfizer vice president who oversees Lyrica, says the company expects that Lyrica will be prescribed for fibromyalgia both by specialists like neurologists and by primary care doctors. As doctors see that the drug helps control pain, they will be more willing to use it, he said.

“When you help physicians to recognize the condition and you give them treatments that are well tolerated, you overcome their reluctance,” he said.

Both the Lilly and Forest drugs being proposed for fibromyalgia were originally developed as antidepressants, and both work by increasing levels of serotonin and norepinephrine, brain transmitters that affect mood. The Lilly drug, Cymbalta, is already available in the United States, while the Forest drug, milnacipran, is sold in many countries, though not the United States.

Dr. Amy Chappell, a medical fellow at Lilly, said that even though Cymbalta is an antidepressant, its effects on fibromyalgia pain are independent of its antidepressant effects. In clinical trials, she said, even fibromyalgia patients who are not depressed report relief from their pain on Cymbalta.

The overall efficacy of Cymbalta and milnacipran is similar to that of Lyrica. Analysts and the companies expect that the drugs will probably be used together.

“There’s definitely room for several drugs,” Dr. Chappell said.

But physicians who are opposed to the fibromyalgia diagnosis say the new drugs will probably do little for patients. Over time, fibromyalgia patients tend to cycle among many different painkillers, sleep medicines and antidepressants, using each for a while until its benefit fades, Dr. Wolfe said.

“The fundamental problem is that the improvement that you see, which is not really great in clinical trials, is not maintained,” Dr. Wolfe said.

Still, Dr. Wolfe expects the drugs will be widely used. The companies, he said, are “going to make a fortune.”





Mind: Crisis? Maybe He’s a Narcissistic Jerk
By RICHARD A. FRIEDMAN, M.D., The New York Times, January 15, 2008

With the possible exception of “the dog ate my homework,” there is no handier excuse for human misbehavior than the midlife crisis.

Popularly viewed as a unique developmental birthright of the human species, it supposedly strikes when most of us have finally figured ourselves out — only to discover that we have lost our youth and mortality is on the horizon.

No doubt about it, life in the middle ages can be challenging. (Full disclosure: I’m 51.) What with the first signs of physical decline and the questions and doubts about one’s personal and professional accomplishments, it is a wonder that most of us survive.

Not everyone is so lucky; some find themselves seized by a seemingly irresistible impulse to do something dramatic, even foolish. Everything, it appears, is fair game for a midlife crisis: one’s job, spouse, lover — you name it.

I recently heard about a severe case from a patient whose husband of nearly 30 years abruptly told her that he “felt stalled and not self-actualized” and began his search for self-knowledge in the arms of another woman.

It was not that her husband no longer loved her, she said he told her; he just did not find the relationship exciting anymore.

“Maybe it’s a midlife crisis,” she said, then added derisively, “Whatever that is.”

Outraged and curious, she followed him one afternoon and was shocked to discover that her husband’s girlfriend was essentially a younger clone of herself, right down to her haircut and her taste in clothes.

It doesn’t take a psychoanalyst to see that her husband wanted to turn back the clock and start over. But this hardly deserves the dignity of a label like “midlife crisis.” It sounds more like a search for novelty and thrill than for self-knowledge.

In fact, the more I learned about her husband, the clearer it became that he had always been a self-centered guy who fretted about his lost vigor and was acutely sensitive to disappointment. This was a garden-variety case of a middle-aged narcissist grappling with the biggest insult he had ever faced: getting older.

But you have to admit that “I’m having a midlife crisis” sounds a lot better than “I’m a narcissistic jerk having a meltdown.”

Another patient, a 49-year-old man at the pinnacle of his legal career, started an affair with an office colleague. “I love my wife,” he said, “and I don’t know what possessed me.”

It didn’t take long to find out. The first five years of his marriage were exciting. “It was like we were dating all the time,” he recalled wistfully. But once they had a child, he felt an unwelcome sense of drudgery and responsibility creep into his life.

Being middle-aged had nothing to do with his predicament; it was just that it took him 49 years to reach a situation where he had to seriously take account of someone else’s needs, namely those of his baby son. In all likelihood, the same thing would have happened if he had become a father at 25.

Why do we have to label a common reaction of the male species to one of life’s challenges — the boredom of the routine — as a crisis? True, men are generally more novelty-seeking than women, but they certainly can decide what they do with their impulses.

But surely someone has had a genuine midlife crisis. After all, don’t people routinely struggle with questions like “What can I expect from the rest of my life?” or “Is this all there is?”

Of course. But it turns out that only a distinct minority think it constitutes a crisis. In 1999, the MacArthur Foundation study on midlife development surveyed 8,000 Americans ages 25 to 74. While everyone recognized the term “midlife crisis,” only 23 percent of subjects reported having one. And only 8 percent viewed their crisis as something tied to the realization that they were aging; the remaining 15 percent felt the crisis resulted from specific life events. Strikingly, most people also reported an increased sense of well-being and contentment in middle age.

So what keeps the myth of the midlife crisis alive?

The main culprit, I think, is our youth-obsessed culture, which makes a virtue of the relentless pursuit of self-renewal. The news media abound with stories of people who seek to recapture their youth simply by shedding their spouses, quitting their jobs or leaving their families. Who can resist?

Most middle-aged people, it turns out, if we are to believe the definitive survey.

Except, of course, for the few — mainly men, it seems — who find the midlife crisis a socially acceptable shorthand for what you do when you suddenly wake up and discover that you’re not 20 anymore.

Richard A. Friedman is a professor of psychiatry at Weill Cornell Medical College.





Genetic Study Bolsters Columbus Link to Syphilis
By JOHN NOBLE WILFORD, The New York Times, January 15, 2008

Columbus, it seems, made another discovery of something that he was not looking for.

In a comprehensive genetic study, scientists have found what they say is the strongest evidence yet linking the first European explorers of the New World to the origin of sexually transmitted syphilis.

The research, they say, supports the hypothesis that returning explorers introduced organisms leading, in probably modified forms, to the first recorded syphilis epidemic, beginning in Europe in 1493.

The so-called Columbus hypothesis had previously rested on circumstantial evidence, mainly the timing of the epidemic. It was further noted that earlier traces of syphilis or related diseases had been few and inconclusive in Europe. Yet nonvenereal forms of the diseases were widespread in the American tropics.

Leaders of the new study said the most telling results were that the bacterium causing sexually transmitted syphilis arose relatively recently in humans and was closely related to a strain responsible for the nonvenereal infection known as yaws. The similarity was especially evident, the researchers said, in a variation of the yaws pathogen isolated recently among afflicted children in a remote region of Guyana in South America.

Researchers who conducted the study and others familiar with it said the findings suggested Columbus and his men could have carried the nonvenereal tropical bacteria home, where the organisms may have mutated into a more deadly form in the different conditions of Europe.

In the New World, the infecting organisms for nonvenereal syphilis, known as bejel, and yaws were transmitted by skin-to-skin and oral contact, more often in children. The symptoms are lesions primarily on the legs, not on or near the genitals.

Kristin N. Harper, a researcher in molecular genetics at Emory University who was the principal investigator in the study, said the findings supported “the hypothesis that syphilis, or some progenitor, came from the New World.”

The examination of the evolutionary relatedness of organisms associated with syphilis was reported on Monday in the online journal Public Library of Science/Neglected Tropical Diseases.

Ms. Harper, a doctoral student in the Emory department of population biology, ecology and evolution, was the lead author. Her co-authors included George J. Armelagos, an Emory anthropologist who has studied the origins of syphilis for more than 30 years, and Dr. Michael S. Silverman, a Canadian infectious diseases physician who collected and tested specimens from yaws lesions in Guyana, the only known site today of yaws infections in the Western Hemisphere.

The researchers said their study “represents the first attempt to address the problem of the origin of syphilis using molecular genetics, as well as the first source of information regarding the genetic makeup of nonvenereal strains from the Western Hemisphere.”

They applied phylogenetics, the study of evolutionary relationships between organisms, in examining 26 geographically disparate strains in the family of Treponema bacteria. Treponema pallidum subspecies pallidum is the agent for the scourge of venereal syphilis. The subspecies endemicum causes bejel, usually in hot, arid climates, and pertenue spreads yaws in hot, humid places.

Della Collins Cook, a paleopathologist at Indiana University who did not participate in the study but specializes in treponemal diseases, praised the research as a “very, very interesting step” advancing understanding of syphilis. “They have looked at a wider range of the genome” of these bacteria, Dr. Cook said, “and have scared up some new samples from parts of the world and the group of related diseases that hadn’t been available to researchers before.”

But she recommended an even broader investigation of the natural history of these diseases, making an effort to find more people with active treponemal cases where they probably still exist in parts of South America. Cases of yaws in Africa and Asia are periodically reported.

John W. Verano, an anthropologist at Tulane, said the findings would “probably not settle the debate” over the origins of venereal syphilis, though most scientists had become convinced that the disease was not transmitted sexually before Europeans made contact with the New World.

Donald J. Ortner, an anthropologist at the Smithsonian Institution, questioned whether the organisms causing the first European epidemic were actually distinct from others in the treponemal family. “What we are seeing is an organism with a long history, and it is very adaptable to different modes of transmission that produce different manifestations,” Dr. Ortner said.

Three medical scientists, responding to the new study, pointed out what they considered shortcomings in its methods and interpretations.

In a critique also published by the online journal, Connie J. Mulligan of the University of Florida, Steven J. Norris of the University of Texas at Houston and Sheila A. Lukehart of the University of Washington wrote that caution “must be used in drawing conclusions about the evolution of ‘subspecies’ that may represent a biological continuum, rather than discrete agents.”

“Firm conclusions should not be based,” for example, on the two samples from one location in Guyana, they added.

But scientists generally agreed that the molecular approach would overcome some limitations of other investigations.

Paleopathologists like Dr. Cook have for years analyzed skeletons for the bone scars from lesions produced by treponemal diseases, except for the mild form called pinta. In this way, they traced the existence of these infections in the New World back at least 7,000 years. But it has often been difficult to determine the age of the bones and distinguish the different diseases that share symptoms but have different modes of transmission.

Dr. Cook said the skeletal evidence for treponemal disease in pre-Columbian Europe and Africa was sketchy and even more ambiguous than in the New World. In the 1990s, scientists reported finding bones in Italy and England, from before Columbus’s return, that bore lesion scars that they said appeared to have been caused by venereal syphilis.

Scientists remain skeptical of this interpretation. If highly contagious venereal syphilis had existed in Europe in antiquity, said Dr. Armelagos, the Emory anthropologist, there should be more supporting epidemiological evidence than two or three skeletons bearing suggestive scars.

In her investigation, Ms. Harper studied 22 human Treponema pallidum strains. The DNA in their genes was sequenced in nearly all cases, examined for changes and eventually used in constructing phylogenetic trees incorporating all variations in the strains.

An Old World yaws subspecies was found to occupy the base of the tree, indicating its ancestral position in the treponemal family, she said. The terminal position of the venereal syphilis subspecies on the tree showed it had diverged most recently from the rest of the bacterial family.

Specimens from two Guyana yaws cases were included in the study, after they were collected and processed by Dr. Silverman. Genetic analysis showed that this yaws strain was the closest known relative to venereal syphilis.

Ms. Harper’s team concluded that New World yaws belonged to a group distinct from Old World strains, occupying a position on the tree likely to be intermediate between the nonvenereal strains previously existing in Europe and the strain responsible for modern syphilis.

If this seemed to solidify the Columbus hypothesis, the researchers cautioned that a “transfer agent between humans and nonhuman primates cannot be ruled out using the available genetic data.”

Dr. Armelagos said research into the origins of syphilis would continue, because “understanding its evolution is important not just for biology, but for understanding social and political history.”

Noting that the disease was a major killer in Renaissance Europe, he said, “It could be argued that syphilis is one of the important early examples of globalization and disease, and globalization remains an important factor in emerging diseases.”





Big Brain Theory: Have Cosmologists Lost Theirs?
By DENNIS OVERBYE, The New York Times, January 15, 2008

It could be the weirdest and most embarrassing prediction in the history of cosmology, if not science.

If true, it would mean that you yourself reading this article are more likely to be some momentary fluctuation in a field of matter and energy out in space than a person with a real past born through billions of years of evolution in an orderly star-spangled cosmos. Your memories and the world you think you see around you are illusions.

This bizarre picture is the outcome of a recent series of calculations that take some of the bedrock theories and discoveries of modern cosmology to the limit. Nobody in the field believes that this is the way things really work, however. And so in the last couple of years there has been a growing stream of debate and dueling papers, replete with references to such esoteric subjects as reincarnation, multiple universes and even the death of spacetime, as cosmologists try to square the predictions of their cherished theories with their convictions that we and the universe are real.

The basic problem is that, across the eons of time, the standard theories suggest, the universe can recur over and over again in an endless cycle of big bangs. But it’s hard for nature to make a whole universe. It’s much easier to make fragments of one, like planets, yourself maybe in a spacesuit or even — in the most absurd and troubling example — a naked brain floating in space. Nature tends to do what is easiest, from the standpoint of energy and probability. And so these fragments — in particular the brains — would appear far more frequently than real full-fledged universes, or than us. Or they might be us.

Alan Guth, a cosmologist at the Massachusetts Institute of Technology who agrees this overabundance is absurd, pointed out that some calculations result in an infinite number of free-floating brains for every normal brain, making it “infinitely unlikely for us to be normal brains.”

Welcome to what physicists call the Boltzmann brain problem, named after the 19th-century Austrian physicist Ludwig Boltzmann, who suggested the mechanism by which such fluctuations could happen in a gas or in the universe. Cosmologists also refer to them as “freaky observers,” in contrast to regular or “ordered” observers of the cosmos like ourselves.

Cosmologists are desperate to eliminate these freaks from their theories, but so far they can’t even agree on how, or even on whether they are making any progress.

If you are inclined to skepticism, this debate might seem like further evidence that cosmologists, who gave us dark matter and dark energy and who speak with apparent aplomb about gazillions of parallel universes, have finally lost their minds. But the cosmologists say the brain problem serves as a valuable reality check as they contemplate the far, far future and zillions of bubble universes popping off from one another in an ever-increasing rush through eternity. What, for example, is a “typical” observer in such a setup? If some atoms in another universe stick together briefly to look, talk and think exactly like you, is it really you?

“It is part of a much bigger set of questions about how to think about probabilities in an infinite universe in which everything that can occur, does occur, infinitely many times,” said Leonard Susskind of Stanford, a co-author of a paper in 2002 that helped set off the debate. Or as Andrei Linde, another Stanford theorist given to colorful language, loosely characterized the possibility of a replica of your own brain forming out in space sometime: “How do you compare the probability of being reincarnated to the probability of being born?”

The Boltzmann brain problem arises from a string of logical conclusions that all spring from another deep and old question, namely why time seems to go in only one direction. Why can’t you unscramble an egg? The fundamental laws governing the atoms bouncing off one another in the egg look the same whether time goes forward or backward. Yet in this universe, at least, the future and the past are different, and you can’t remember who is going to win the Super Bowl next week.

“When you break an egg and scramble it you are doing cosmology,” said Sean Carroll, a cosmologist at the California Institute of Technology.

Boltzmann ascribed this so-called arrow of time to the tendency of any collection of particles to spread out into the most random and useless configuration, in accordance with the second law of thermodynamics (sometimes paraphrased as “things get worse”), which says that entropy, which is a measure of disorder or wasted energy, can never decrease in a closed system like the universe.

If the universe was running down and entropy was increasing now, that was because the universe must have been highly ordered in the past.

In Boltzmann’s time the universe was presumed to have been around forever, in which case it would long ago have stabilized at a lukewarm temperature and died a “heat death.” It would already have maximum entropy, and so with no way to become more disorderly there would be no arrow of time. No life would be possible, but that would be all right because life would be excruciatingly boring.

Boltzmann said that entropy was all about odds, however, and if we waited long enough the random bumping of atoms would occasionally produce the cosmic equivalent of an egg unscrambling. A rare fluctuation would decrease the entropy in some place and start the arrow of time pointing and history flowing again.

That is not what happened. Astronomers now know the universe has not lasted forever. It was born in the Big Bang, which somehow set the arrow of time, 14 billion years ago. The linchpin of the Big Bang is thought to be an explosive moment known as inflation, during which space became suffused with energy that had an antigravitational effect and ballooned violently outward, ironing the kinks and irregularities out of what is now the observable universe and endowing primordial chaos with order.

Inflation is a veritable cosmological fertility principle. Fluctuations in the field driving inflation also would have seeded the universe with the lumps that eventually grew to be galaxies, stars and people. According to the more extended version, called eternal inflation, an endless array of bubble or “pocket” universes are branching off from one another at a dizzying and exponentially increasing rate. They could have different properties and perhaps even different laws of physics, so the story goes.

A different, but perhaps related, form of antigravity, glibly dubbed dark energy, seems to be running the universe now, and that is the culprit responsible for the Boltzmann brains.

The expansion of the universe seems to be accelerating, making galaxies fly away from one another faster and faster. If the leading dark-energy suspect, the universal repulsion Einstein called the cosmological constant, is the culprit, this runaway process will last forever, and distant galaxies will eventually be moving apart so quickly that they cannot communicate with one another. Being in such a space would be like being surrounded by a black hole.

Rather than simply going to black like “The Sopranos” conclusion, however, the cosmic horizon would glow, emitting a feeble spray of elementary particles and radiation, with a temperature of a fraction of a billionth of a degree, courtesy of quantum uncertainty. That radiation bath will be subject to random fluctuations just like Boltzmann’s eternal universe, and every once in a very long, long time, one of those fluctuations would be big enough to recreate the Big Bang. In the fullness of time this process could lead to an endless series of recurring universes. Our present universe could be part of that chain.

In such a recurrent setup, however, Dr. Susskind of Stanford, Lisa Dyson, now of the University of California, Berkeley, and Matthew Kleban, now at New York University, pointed out in 2002 that Boltzmann’s idea might work too well, filling the megaverse with more Boltzmann brains than universes or real people.

In the same way that the odds of a real word showing up when you shake a box of Scrabble letters are greater than the odds of a whole sentence or paragraph forming, these “regular” universes would be vastly outnumbered by weird ones, including flawed variations on our own all the way down to naked brains, a result foreshadowed by Martin Rees, a cosmologist at the University of Cambridge, in his 1997 book, “Before the Beginning.”
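The Scrabble analogy can be put in rough numbers. The short sketch below is illustrative only (the particular word and "sentence" are arbitrary choices, not from the article): when letters are drawn uniformly at random, the chance of hitting any fixed string falls off exponentially with its length, which is the sense in which small fluctuations (stray brains) would vastly outnumber large ones (whole universes).

```python
# Illustrative sketch: the probability that uniform random letters spell a
# fixed target string shrinks exponentially with the target's length.
ALPHABET = 26

def p_exact(target: str) -> float:
    """Probability that len(target) uniform random letters equal target."""
    return (1 / ALPHABET) ** len(target)

p_word = p_exact("cat")                    # a 3-letter "word"
p_sentence = p_exact("thecatsatonthemat")  # a 17-letter "sentence"

# The short word is overwhelmingly more likely than the sentence:
# the ratio is 26**14, roughly 6.5e19.
print(p_word / p_sentence)
```

Swapping in longer targets only widens the gap, which is why, in this style of counting, anything as elaborate as a full universe is far rarer than a bare brain.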

The conclusions of Dr. Dyson and her colleagues were quickly challenged by Andreas Albrecht and Lorenzo Sorbo of the University of California, Davis, who used an alternate approach. They found that the Big Bang was actually more likely than Boltzmann’s brain.

“In the end, inflation saves us from Boltzmann’s brain,” Dr. Albrecht said, while admitting that the calculations were contentious. Indeed, the “invasion of Boltzmann brains,” as Dr. Linde once put it, was just beginning.

In an interview Dr. Linde described these brains as a form of reincarnation. Over the course of eternity, he said, anything is possible. After some Big Bang in the far future, he said, “it’s possible that you yourself will re-emerge. Eventually you will appear with your table and your computer.”

But it’s more likely, he went on, that you will be reincarnated as an isolated brain, without the baggage of stars and galaxies. In terms of probability, he said, “It’s cheaper.”

You might wonder what’s wrong with a few brains — or even a preponderance of them — floating around in space. For one thing, as observers these brains would see a freaky chaotic universe, unlike our own, which seems to persist in its promise and disappointment.

Another is that one of the central orthodoxies of cosmology is that humans don’t occupy a special place in the cosmos, that we and our experiences are typical of cosmic beings. If the odds of us being real instead of Boltzmann brains are one in a million, say, waking up every day would be like walking out on the street and finding everyone in the city standing on their heads. You would expect there to be some reason why you were the only one left right side up.

Two cosmologists, James Hartle and Mark Srednicki, both of the University of California, Santa Barbara, have questioned that assumption. “For example,” Dr. Hartle wrote in an e-mail message, “on Earth humans are not typical animals; insects are far more numerous. No one is surprised by this.”

In an e-mail response to Dr. Hartle’s view, Don Page of the University of Alberta, who has been a prominent voice in the Boltzmann debate, argued that what counted cosmologically was not sheer numbers, but consciousness, which we have in abundance over the insects. “I would say that we have no strong evidence against the working hypothesis that we are typical and that our observations are typical,” he explained, “which is very fruitful in science for helping us believe that our observations are not just flukes but do tell us something about the universe.”

Dr. Dyson and her colleagues suggested that the solution to the Boltzmann paradox lay in denying the presumption that the universe would accelerate eternally. In other words, they said, the cosmological constant was perhaps not really constant. If the cosmological constant eventually faded away, the universe would revert to normal expansion and what was left would eventually fade to black. With no more acceleration there would be no horizon with its snap, crackle and pop, and thus no material for fluctuations and Boltzmann brains.

String theory calculations have suggested that dark energy is indeed metastable and will decay, Dr. Susskind pointed out. “The success of ordinary cosmology,” Dr. Susskind said, “speaks against the idea that the universe was created in a random fluctuation.”

But nobody knows whether dark energy — if it dies — will die soon enough to save the universe from a surplus of Boltzmann brains. In 2006, Dr. Page calculated that the dark energy would have to decay in about 20 billion years in order to prevent the universe from being overrun by Boltzmann brains.

The decay, if and when it comes, would rejigger the laws of physics and so would be fatal and total, spreading at almost the speed of light and destroying all matter without warning. There would be no time for pain, Dr. Page wrote: “And no grieving survivors will be left behind. So in this way it would be the most humanely possible execution.” But the object of his work, he said, was not to predict the end of the universe but to draw attention to the fact that the Boltzmann brain problem remains.

People have their own favorite measures of probability in the multiverse, said Raphael Bousso of the University of California, Berkeley. “So Boltzmann brains are just one example of how measures can predict nonsense; anytime your measure predicts that something we see has extremely small probability, you can throw it out,” he wrote in an e-mail message.

Another contentious issue is whether cosmologists in their calculations should consider only the observable universe, which is all we can ever see or be influenced by, or whether they should take into account the vast and ever-growing assemblage of other bubbles forever out of our view predicted by eternal inflation. In the latter case, as Alex Vilenkin of Tufts University pointed out, “The numbers of regular and freak observers are both infinite.” Which kind predominate depends on how you do the counting, he said.

In eternal inflation, the number of new bubbles being hatched at any given moment is always growing, Dr. Linde said, explaining one such counting scheme he likes. So the evolution of people in new bubbles far outstrips the creation of Boltzmann brains in old ones. The main way life emerges, he said, is not by reincarnation but by the creation of new parts of the universe. “So maybe we don’t need to care too much about the Boltzmann brains,” he said.

“If you are reincarnated, why do you care about where you are reincarnated?” he asked. “It sounds crazy because here we are touching issues we are not supposed to be touching in ordinary science. Can we be reincarnated?”

“People are not prepared for this discussion,” Dr. Linde said.
