While he managed to salvage the $1.3 million deal after apologizing to his suitor, Mr. Campbell continues to struggle with the effects of the deluge of data. Even after he unplugs, he craves the stimulation he gets from his electronic gadgets. He forgets things like dinner plans, and he has trouble focusing on his family.

His wife, Brenda, complains, “It seems like he can no longer be fully in the moment.”

This is your brain on computers.

Scientists say juggling e-mail, phone calls and other incoming information can change how people think and behave. They say our ability to focus is being undermined by bursts of information.

These play to a primitive impulse to respond to immediate opportunities and threats. The stimulation provokes excitement — a dopamine squirt — that researchers say can be addictive. In its absence, people feel bored.

The resulting distractions can have deadly consequences, as when cellphone-wielding drivers and train engineers cause wrecks. And for millions of people like Mr. Campbell, these urges can inflict nicks and cuts on creativity and deep thought, interrupting work and family life.

While many people say multitasking makes them more productive, research shows otherwise. Heavy multitaskers actually have more trouble focusing and shutting out irrelevant information, scientists say, and they experience more stress.

And scientists are discovering that even after the multitasking ends, fractured thinking and lack of focus persist. In other words, this is also your brain off computers.

“The technology is rewiring our brains,” said Nora Volkow, director of the National Institute on Drug Abuse and one of the world’s leading brain scientists. She and other researchers compare the lure of digital stimulation less to that of drugs and alcohol than to food and sex, which are essential but counterproductive in excess.

Technology use can benefit the brain in some ways, researchers say. Imaging studies show the brains of Internet users become more efficient at finding information. And players of some video games develop better visual acuity.

More broadly, cellphones and computers have transformed life. They let people escape their cubicles and work anywhere. They shrink distances and handle countless mundane tasks, freeing up time for more exciting pursuits.

For better or worse, the consumption of media, as varied as e-mail and TV, has exploded. In 2008, people consumed three times as much information each day as they did in 1960. And they are constantly shifting their attention. Computer users at work change windows or check e-mail or other programs nearly 37 times an hour, new research shows.

“We are exposing our brains to an environment and asking them to do things we weren’t necessarily evolved to do,” said Adam Gazzaley, a neuroscientist at the University of California, San Francisco. “We know already there are consequences.”

Mr. Campbell, 43, came of age with the personal computer, and he is a heavier user of technology than most. But researchers say the habits and struggles of Mr. Campbell and his family typify what many experience — and what many more will, if trends continue.

For him, the tensions feel increasingly acute, and the effects harder to shake.

He goes to sleep with a laptop or iPhone on his chest, and when he wakes, he goes online. He and Mrs. Campbell, 39, head to the tidy kitchen in their four-bedroom hillside rental in Orinda, an affluent suburb of San Francisco, where she makes breakfast and watches a TV news feed in the corner of the computer screen while he uses the rest of the monitor to check his e-mail.

Major spats have arisen because Mr. Campbell escapes into video games during tough emotional stretches. On family vacations, he has trouble putting down his devices. When he rides the subway to San Francisco, he knows he will be offline for 221 seconds as the train goes through a tunnel.

Their 16-year-old son, Connor, tall and polite like his father, recently received his first C’s, which his family blames on distraction from his gadgets. Their 8-year-old daughter, Lily, like her mother, playfully tells her father that he favors technology over family.

“I would love for him to totally unplug, to be totally engaged,” says Mrs. Campbell, who adds that he becomes “crotchety until he gets his fix.” But she would not try to force a change.

“He loves it. Technology is part of the fabric of who he is,” she says. “If I hated technology, I’d be hating him, and a part of who my son is too.”

Always On

Mr. Campbell, whose given name is Thomas, had an early start with technology in Oklahoma City. When he was in third grade, his parents bought him Pong, a video game. Then came a string of game consoles and PCs, which he learned to program.

In high school, he balanced computers, basketball and a romance with Brenda, a cheerleader with a gorgeous singing voice. He studied too, with focus, uninterrupted by e-mail. “I did my homework because I needed to get it done,” he said. “I didn’t have anything else to do.”

He left college to help with a family business, then set up a lawn mowing service. At night he would read, play video games, hang out with Brenda and, as she remembers it, “talk a lot more.”

In 1996, he started a successful Internet provider. Then he built the start-up that he sold for $1.3 million in 2003 to LookSmart, a search engine.

Mr. Campbell loves the rush of modern life and keeping up with the latest information. “I want to be the first to hear when the aliens land,” he said, laughing. But other times, he fantasizes about living in pioneer days when things moved more slowly: “I can’t keep everything in my head.”

No wonder. As he came of age, so did a new era of data and communication.

At home, people consume 12 hours of media a day on average, when an hour spent with, say, the Internet and TV simultaneously counts as two hours. That compares with five hours in 1960, say researchers at the University of California, San Diego. Computer users visit an average of 40 Web sites a day, according to research by RescueTime, which offers time-management tools.

As computers have changed, so has the understanding of the human brain. Until 15 years ago, scientists thought the brain stopped developing after childhood. Now they understand that its neural networks continue to develop, influenced by things like learning skills.

So not long after Eyal Ophir arrived at Stanford in 2004, he wondered whether heavy multitasking might be leading to changes in a characteristic of the brain long thought immutable: that humans can process only a single stream of information at a time.

Going back a half-century, tests had shown that the brain could barely process two streams, and could not simultaneously make decisions about them. But Mr. Ophir, a student-turned-researcher, thought multitaskers might be rewiring themselves to handle the load.

His passion was personal. He had spent seven years in Israeli intelligence after being weeded out of the air force — partly, he felt, because he was not a good multitasker. Could his brain be retrained?

Mr. Ophir, like others around the country studying how technology bent the brain, was startled by what he discovered.

The Myth of Multitasking

The test subjects were divided into two groups: those classified as heavy multitaskers based on their answers to questions about how they used technology, and those who were not.

In a test created by Mr. Ophir and his colleagues, subjects at a computer were briefly shown an image of red rectangles. Then they saw a similar image and were asked whether any of the rectangles had moved. It was a simple task until the addition of a twist: blue rectangles were added, and the subjects were told to ignore them.

The multitaskers then did a significantly worse job than the non-multitaskers at recognizing whether red rectangles had changed position. In other words, they had trouble filtering out the blue ones — the irrelevant information.

So, too, the multitaskers took longer than non-multitaskers to switch among tasks, like differentiating vowels from consonants and then odd from even numbers. The multitaskers were shown to be less efficient at juggling problems.

Other tests at Stanford, an important center for research in this fast-growing field, showed multitaskers tended to search for new information rather than accept a reward for putting older, more valuable information to work.

Researchers say these findings point to an interesting dynamic: multitaskers seem more sensitive than non-multitaskers to incoming information.

The results also illustrate an age-old conflict in the brain, one that technology may be intensifying. A portion of the brain acts as a control tower, helping a person focus and set priorities. More primitive parts of the brain, like those that process sight and sound, demand that it pay attention to new information, bombarding the control tower when they are stimulated.

Researchers say there is an evolutionary rationale for the pressure this barrage puts on the brain. The lower-brain functions alert humans to danger, like a nearby lion, overriding goals like building a hut. In the modern world, the chime of incoming e-mail can override the goal of writing a business plan or playing catch with the children.

“Throughout evolutionary history, a big surprise would get everyone’s brain thinking,” said Clifford Nass, a communications professor at Stanford. “But we’ve got a large and growing group of people who think the slightest hint that something interesting might be going on is like catnip. They can’t ignore it.”

Mr. Nass says the Stanford studies are important because they show multitasking’s lingering effects: “The scary part for guys like Kord is, they can’t shut off their multitasking tendencies when they’re not multitasking.”

Melina Uncapher, a neurobiologist on the Stanford team, said she and other researchers were unsure whether the muddied multitaskers were simply prone to distraction and would have had trouble focusing in any era. But she added that the idea that information overload causes distraction was supported by more and more research.

A study at the University of California, Irvine, found that people interrupted by e-mail reported significantly increased stress compared with those left to focus. Stress hormones have been shown to reduce short-term memory, said Gary Small, a psychiatrist at the University of California, Los Angeles.

Preliminary research shows some people can more easily juggle multiple information streams. These “supertaskers” represent less than 3 percent of the population, according to scientists at the University of Utah.

Other research shows computer use has neurological advantages. In imaging studies, Dr. Small observed that Internet users showed greater brain activity than nonusers, suggesting they were growing their neural circuitry.

At the University of Rochester, researchers found that players of some fast-paced video games can track the movement of a third more objects on a screen than nonplayers. They say the games can improve reaction and the ability to pick out details amid clutter.

“In a sense, those games have a very strong both rehabilitative and educational power,” said the lead researcher, Daphne Bavelier, who is working with others in the field to channel these changes into real-world benefits like safer driving.

There is a vibrant debate among scientists over whether technology’s influence on behavior and the brain is good or bad, and how significant it is.

“The bottom line is, the brain is wired to adapt,” said Steven Yantis, a professor of brain sciences at Johns Hopkins University. “There’s no question that rewiring goes on all the time,” he added. But he said it was too early to say whether the changes caused by technology were materially different from others in the past.

Mr. Ophir is loath to call the cognitive changes bad or good, though the impact on analysis and creativity worries him.

He is not just worried about other people. Shortly after he came to Stanford, a professor thanked him for being the one student in class paying full attention and not using a computer or phone. But he recently began using an iPhone and noticed a change; he felt its pull, even when playing with his daughter.

Interrupted by a Corpse

It is a Wednesday in April, and in 10 minutes, Mr. Campbell has an online conference call that could determine the fate of his new venture, called Loggly. It makes software that helps companies understand the clicking and buying patterns of their online customers.

Mr. Campbell and his colleagues, each working from a home office, are frantically trying to set up a program that will let them share images with executives at their prospective partner.

But at the moment when Mr. Campbell most needs to focus on that urgent task, something else competes for his attention: “Man Found Dead Inside His Business.”

That is the tweet that appears on the left-most of Mr. Campbell’s array of monitors, which he has expanded to three screens, at times adding a laptop and an iPad.

On the left screen, Mr. Campbell follows the tweets of 1,100 people, along with instant messages and group chats. The middle monitor displays a dark field filled with computer code, along with Skype, a service that allows Mr. Campbell to talk to his colleagues, sometimes using video. The monitor on the right keeps e-mail, a calendar, a Web browser and a music player.

Even with the meeting fast approaching, Mr. Campbell cannot resist the tweet about the corpse. He clicks on the link in it, glances at the article and dismisses it. “It’s some article about something somewhere,” he says, annoyed by the ads for jeans popping up.

The program gets fixed, and the meeting turns out to be fruitful: the partners are ready to do business. A colleague says via instant message: “YES.”

Other times, Mr. Campbell’s information juggling has taken a more serious toll. A few weeks earlier, he once again overlooked an e-mail message from a prospective investor. Another time, Mr. Campbell signed the company up for the wrong type of business account on Amazon.com, costing $300 a month for six months before he got around to correcting it. He has burned hamburgers on the grill, forgotten to pick up the children and lingered in the bathroom playing video games on an iPhone.

Mr. Campbell can be unaware of his own habits. In a two-and-a-half hour stretch one recent morning, he switched rapidly between e-mail and several other programs, according to data from RescueTime, which monitored his computer use with his permission. But when asked later what he was doing in that period, Mr. Campbell said he had been on a long Skype call, and “may have pulled up an e-mail or two.”

The kind of disconnection Mr. Campbell experiences is not an entirely new problem, of course. As they did in earlier eras, people can become so lost in work, hobbies or TV that they fail to pay attention to family.

Mr. Campbell concedes that, even without technology, he may work or play obsessively, just as his father immersed himself in crossword puzzles. But he says this era is different because he can multitask anyplace, anytime.

“It’s a mixed blessing,” he said. “If you’re not careful, your marriage can fall apart or your kids can be ready to play and you’ll get distracted.”

The Toll on Children

Father and son sit in armchairs. Controllers in hand, they engage in a fierce video game battle, displayed on the nearby flat-panel TV, as Lily watches.

They are playing Super Smash Bros. Brawl, a cartoonish animated fight between characters that battle using anvils, explosives and other weapons.

Screens big and small are central to the Campbell family’s leisure time. Connor and his mother relax while watching TV shows like “Heroes.” Lily has an iPod Touch, a portable DVD player and her own laptop, which she uses to watch videos, listen to music and play games.

Lily, a second-grader, is allowed only an hour a day of unstructured time, which she often spends with her devices. The laptop can consume her.

“When she’s on it, you can holler her name all day and she won’t hear,” Mrs. Campbell said.

Researchers worry that constant digital stimulation like this creates attention problems for children with brains that are still developing, who already struggle to set priorities and resist impulses.

Connor’s troubles started late last year. He could not focus on homework. No wonder, perhaps. On his bedroom desk sit two monitors, one with his music collection, one with Facebook and Reddit, a social site with news links that he and his father love. And his iPhone enabled relentless texting with his girlfriend.

When he studied, “a little voice would be saying, ‘Look up’ at the computer, and I’d look up,” Connor said. “Normally, I’d say I want to only read for a few minutes, but I’d search every corner of Reddit and then check Facebook.”

But the parents worry too. “Connor is obsessed,” his mother said. “Kord says we have to teach him balance.”

So in January, they held a family meeting. Study time now takes place in a group setting at the dinner table after everyone has finished eating. It feels, Mr. Campbell says, like togetherness.

No Vacations

For spring break, the family rented a cottage in Carmel, Calif. Mrs. Campbell hoped everyone would unplug.

But the day before they left, the iPad from Apple came out, and Mr. Campbell snapped one up. The next night, their first on vacation, “We didn’t go out to dinner,” Mrs. Campbell mourned. “We just sat there on our devices.”

She rallied the troops the next day to the aquarium. Her husband joined them for a bit but then begged off to do e-mail on his phone.

Later she found him playing video games.

The trip came as Mr. Campbell was trying to raise several million dollars for his new venture, a goal that he achieved. Brenda said she understood that his pursuit required intensity but was less understanding of the accompanying surge in video games.

His behavior prompted a discussion between them. Mrs. Campbell said he told her that he was capable of logging off, citing a trip to Hawaii several years ago that they called their second honeymoon.

“What trip are you thinking about?” she said she asked him. She recalled that he had spent two hours a day online in the hotel’s business center.

On Thursday, their fourth day in Carmel, Mr. Campbell spent the day at the beach with his family. They flew a kite and played whiffle ball.

The next day, the family drove home, and Mr. Campbell disappeared into his office.

Technology use is growing for Mrs. Campbell as well. She divides her time between keeping the books of her husband’s company, homemaking and working at the school library. She checks e-mail 25 times a day, sends texts and uses Facebook.

Recently, she was baking peanut butter cookies for Teacher Appreciation Day when her phone chimed in the living room. She answered a text, then became lost in Facebook, forgot about the cookies and burned them. She started a new batch, but heard the phone again, got lost in messaging, and burned those too. Out of ingredients and shamed, she bought cookies at the store.

She feels less focused and has trouble completing projects. Some days, she promises herself she will ignore her device. “It’s like a diet — you have good intentions in the morning and then you’re like, ‘There went that,’ ” she said.

Mr. Nass at Stanford thinks the ultimate risk of heavy technology use is that it diminishes empathy by limiting how much people engage with one another, even in the same room.

“The way we become more human is by paying attention to each other,” he said. “It shows how much you care.”

That empathy, Mr. Nass said, is essential to the human condition. “We are at an inflection point,” he said. “A significant fraction of people’s experiences are now fragmented.”