Reply Thu 13 Sep, 2012 01:06 am
A scientist says that he has science which shows that the internet makes us dumb. He also believes that computers should be banned from school.

BERLIN - Dr. Manfred Spitzer knows that people find his arguments provocative. In his first book, he warned parents of the very real dangers of letting their children spend too much time in front of the TV. Now, in a second book called Digitale Demenz [Digital Dementia], he’s telling them that teaching young kids finger-counting games is much better for them than letting them explore on a laptop.
Spitzer, 54, may be a member of the slide-rule generation that learned multiplication tables by heart, but his work as a neuropsychiatrist has shown him that when young children spend too much time using a computer, their brain development suffers and that the deficits are irreversible and cannot be made up for later in life.
South Korean doctors were the first to describe this phenomenon, and dubbed it digital dementia – whence the title of Spitzer’s book. Simplistically, the message can be summed up this way: the Internet makes you dumb. And it is of course a message that outrages all those who feel utterly comfortable in the digital world. In the aftermath of the publication of Spitzer’s book, they have lost no time venting their wrath across Germany.
And yet Spitzer has accumulated a wealth of scientific information that gives his thesis solid underpinnings, and the studies and data he draws on offer more than enough room for consternation.

are you willing to consider the possibility that the digital age is bad for us?
Reply Thu 13 Sep, 2012 01:09 am
Reply Thu 13 Sep, 2012 01:16 am
I have read many teachers claiming that technology harms young brains (and teachers are the first people I would expect to notice this if it existed), so that is not new. But this idea that brains will never recover is, I think, new. That science can show this harm may also be new.
Reply Thu 13 Sep, 2012 05:28 am
Lord, you could read such nonsense about comic books from doctors and teachers in the 1950s, so nothing ever changes.

It's not the kids who are exposed to the internet and computers at an early age that we need to be concerned about; it's the kids who are not.
Reply Thu 13 Sep, 2012 06:38 am
Footnote: there were also expressions of concern about the harm the printing press would do to mankind in the late 1400s.

As I said, nothing ever changes except the technology.
Reply Thu 13 Sep, 2012 09:23 am
hawkeye10 wrote:

are you willing to consider the possibility that the digital age is bad for us?
I suspect the main danger comes from lack of exercise, much more than any degradation (or change) in thinking.

But I'm sure he's making lots of nice money on his controversial book. Good for him.
Reply Thu 13 Sep, 2012 09:27 am
hawkeye10 wrote:

A scientist says that he has science which shows that the internet makes us dumb.

quite a bit of research around this has been coming out in the last five or so years. I'm not sure that it's all wrong.
Reply Thu 13 Sep, 2012 09:28 am
Makes us dumb, or reveals that certain people are dumb?

Reply Thu 13 Sep, 2012 09:32 am
rosborne979 wrote:

hawkeye10 wrote:

are you willing to consider the possibility that the digital age is bad for us?
I suspect the main danger comes from lack of exercise, much more than any degradation (or change) in thinking.

But I'm sure he's making lots of nice money on his controversial book. Good for him.

Agreed. Lack of exercise will just as easily lead to degradation in thinking; it impacts circulation, meaning less fuel (blood and oxygen) for the brain.
Reply Thu 13 Sep, 2012 09:50 am
quite a bit of research around this has been coming out in the last five or so years. I'm not sure that it's all wrong

Of course it is harmful to have a large percentage of the total knowledge of the human race at children's fingertips.
Reply Thu 13 Sep, 2012 09:57 am
hawkeye10 wrote:

A scientist says that he has science

Science is something you do, not something you have.
Reply Thu 13 Sep, 2012 10:27 am

are you willing to consider the possibility that the digital age is bad for us?

Certainly. There is no reason not to approach the issue objectively--which means considering the possibility that the use of digital and electronic devices has both positive and negative aspects and effects.

There is reason to believe that reliance on digital devices could negatively impact certain cognitive functions, particularly if we rely excessively on these devices rather than on our brains. Just as muscles can wither and atrophy from lack of use and exercise, our cognitive functions can decline for the same reason.
In a recent survey of 2,030 office workers conducted by the online job portal Incruit and the research firm Embrain, some 63% (1,281 respondents) said they suffer from forgetfulness. When asked about the reason for their forgetfulness, 261 people (20.4%) cited their growing dependence on mobile phones, PCs, and other digital devices...

Types of digital dementia

ㆍNot remembering names, phone numbers, numbers, etc. at critical moments
ㆍNot remembering one’s own home phone number, ID number, account number, passwords, etc.
ㆍNot remembering what one had for lunch.


It is said we are now in the era of “well-thinking,” a step beyond the era of “well-being.” Your wellbeing habits will become meaningless if you cannot remember what you ate, no matter how healthy it was. Digital dementia means a person’s memory or ability to calculate decreases because he or she relies too much on digital tools such as cell phones. The symptom is also often called technology amnesia. The dementia-like state can be interpreted with the “use and disuse theory.” People these days do not need to remember lots of information because cell phones and PDAs remember phone numbers instead of the users, computers save documents, navigation devices tell people the way to destinations, and karaoke machines show them the lyrics to songs. What people do is just press buttons to get to information saved in the digital devices. To prevent digital dementia and live the life of “well-thinking,” one should keep a bit of a distance from digital devices and live in a little more old-fashioned manner. Does that mean that analog will prevent digital dementia?

Ways of preventing digital dementia

1. Remember that your memory, if not used, will decline
2. Try to remember as many phone numbers, names, poems, phrases, etc. as possible
3. Reduce your dependency on digital devices
4. Spend more time reading and watching movies and then discuss them with other people

MR: Smartphones could contribute to ‘Digital Dementia’
Media Release
Thursday, June 21st, 2012

The growing reliance on digital gadgets could lead to ‘Digital Dementia’, resulting in a growing inability to concentrate, read other people’s body language, or even manage our finances, according to Social Futurist Mal Fletcher.

Mr Fletcher is the keynote speaker at this year’s Australian Christian Lobby Tasmanian State Conference being held in Launceston this Saturday, 23rd June.

The younger generation is in danger of living large parts of their lives at ‘phone call’ level where they are unable to read and respond to non-verbal expressions, gestures and signals, he believes.

Signs of dementia today could become something like the normal state of mind by 2022 as the population shows a decline in mental and social function.

It’s an issue that’s already worrying some people, Mal Fletcher believes: “We start to question our memory because we can’t remember 50 different logins, passwords and pin numbers. In reality our brains haven’t developed to function like that.”

Multitasking, he says, is largely just another word for distraction, and it stops people picking up important signals from the people they’re with.

“If someone is with a group of people and they’re also using their smartphone, then they’re not multitasking, they’re simply spreading their attention more thinly. As a result, they are losing their capacity to listen well and also to pick up those non-verbal signals from other people,” says Fletcher.

A leading UK judge has declared that today’s “internet generation” are not well suited to jury duty because they find it hard to take in complex and lengthy arguments in a courtroom.

“What would happen to a baby’s development if, instead of interacting with a live human mother, it was interacting only with mum on a video screen or even a digitally-rendered hologram? By the same token, what happens to the parts of the human brain responsible for reading physical, biometric signals, when we talk more via text, social networking or online video than we do face-to-face?”

Mal Fletcher, a Social Commentator and Social Futurist, leads the London-based think-tank 2020Plus. Originally from Australia, he is a regular expert guest on national BBC TV and radio programmes and other European media and press platforms and has researched global social trends for two decades

And there is no reason to believe that reliance on digital devices wouldn't affect developing brains and negatively impact certain cognitive abilities.
November 21, 2010
Growing Up Digital, Wired for Distraction

REDWOOD CITY, Calif. — On the eve of a pivotal academic year in Vishal Singh’s life, he faces a stark choice on his bedroom desk: book or computer?

By all rights, Vishal, a bright 17-year-old, should already have finished the book, Kurt Vonnegut’s “Cat’s Cradle,” his summer reading assignment. But he has managed 43 pages in two months.

He typically favors Facebook, YouTube and making digital videos. That is the case this August afternoon. Bypassing Vonnegut, he clicks over to YouTube, meaning that tomorrow he will enter his senior year of high school hoping to see an improvement in his grades, but without having completed his only summer homework.

On YouTube, “you can get a whole story in six minutes,” he explains. “A book takes so long. I prefer the immediate gratification.”

Students have always faced distractions and time-wasters. But computers and cellphones, and the constant stream of stimuli they offer, pose a profound new challenge to focusing and learning.

Researchers say the lure of these technologies, while it affects adults too, is particularly powerful for young people. The risk, they say, is that developing brains can become more easily habituated than adult brains to constantly switching tasks — and less able to sustain attention.

“Their brains are rewarded not for staying on task but for jumping to the next thing,” said Michael Rich, an associate professor at Harvard Medical School and executive director of the Center on Media and Child Health in Boston. And the effects could linger: “The worry is we’re raising a generation of kids in front of screens whose brains are going to be wired differently.”

But even as some parents and educators express unease about students’ digital diets, they are intensifying efforts to use technology in the classroom, seeing it as a way to connect with students and give them essential skills. Across the country, schools are equipping themselves with computers, Internet access and mobile devices so they can teach on the students’ technological territory.

It is a tension on vivid display at Vishal’s school, Woodside High School, on a sprawling campus set against the forested hills of Silicon Valley. Here, as elsewhere, it is not uncommon for students to send hundreds of text messages a day or spend hours playing video games, and virtually everyone is on Facebook.

The principal, David Reilly, 37, a former musician who says he sympathizes when young people feel disenfranchised, is determined to engage these 21st-century students. He has asked teachers to build Web sites to communicate with students, introduced popular classes on using digital tools to record music, secured funding for iPads to teach Mandarin and obtained $3 million in grants for a multimedia center.

He pushed first period back an hour, to 9 a.m., because students were showing up bleary-eyed, at least in part because they were up late on their computers. Unchecked use of digital devices, he says, can create a culture in which students are addicted to the virtual world and lost in it.

“I am trying to take back their attention from their BlackBerrys and video games,” he says. “To a degree, I’m using technology to do it.”

The same tension surfaces in Vishal, whose ability to be distracted by computers is rivaled by his proficiency with them. At the beginning of his junior year, he discovered a passion for filmmaking and made a name for himself among friends and teachers with his storytelling in videos made with digital cameras and editing software.

He acts as his family’s tech-support expert, helping his father, Satendra, a lab manager, retrieve lost documents on the computer, and his mother, Indra, a security manager at the San Francisco airport, build her own Web site.

But he also plays video games 10 hours a week. He regularly sends Facebook status updates at 2 a.m., even on school nights, and has such a reputation for distributing links to videos that his best friend calls him a “YouTube bully.”

Several teachers call Vishal one of their brightest students, and they wonder why things are not adding up. Last semester, his grade point average was 2.3 after a D-plus in English and an F in Algebra II. He got an A in film critique.

“He’s a kid caught between two worlds,” said Mr. Reilly — one that is virtual and one with real-life demands.

Vishal, like his mother, says he lacks the self-control to favor schoolwork over the computer. She sat him down a few weeks before school started and told him that, while she respected his passion for film and his technical skills, he had to use them productively.

“This is the year,” she says she told him. “This is your senior year and you can’t afford not to focus.”

It was not always this way. As a child, Vishal had a tendency to procrastinate, but nothing like this. Something changed him.

Growing Up With Gadgets

When he was 3, Vishal moved with his parents and older brother to their current home, a three-bedroom house in the working-class section of Redwood City, a suburb in Silicon Valley that is more diverse than some of its elite neighbors.

Thin and quiet with a shy smile, Vishal passed the admissions test for a prestigious public elementary and middle school. Until sixth grade, he focused on homework, regularly going to the house of a good friend to study with him.

But Vishal and his family say two things changed around the seventh grade: his mother went back to work, and he got a computer. He became increasingly engrossed in games and surfing the Internet, finding an easy outlet for what he describes as an inclination to procrastinate.

“I realized there were choices,” Vishal recalls. “Homework wasn’t the only option.”

Several recent studies show that young people tend to use home computers for entertainment, not learning, and that this can hurt school performance, particularly in low-income families. Jacob L. Vigdor, an economics professor at Duke University who led some of the research, said that when adults were not supervising computer use, children “are left to their own devices, and the impetus isn’t to do homework but play around.”

Research also shows that students often juggle homework and entertainment. The Kaiser Family Foundation found earlier this year that half of students from 8 to 18 are using the Internet, watching TV or using some other form of media either “most” (31 percent) or “some” (25 percent) of the time that they are doing homework.

At Woodside, as elsewhere, students’ use of technology is not uniform. Mr. Reilly, the principal, says their choices tend to reflect their personalities. Social butterflies tend to be heavy texters and Facebook users. Students who are less social might escape into games, while drifters or those prone to procrastination, like Vishal, might surf the Web or watch videos.

The technology has created on campuses a new set of social types — not the thespian and the jock but the texter and gamer, Facebook addict and YouTube potato.

“The technology amplifies whoever you are,” Mr. Reilly says.

For some, the amplification is intense. Allison Miller, 14, sends and receives 27,000 texts in a month, her fingers clicking at a blistering pace as she carries on as many as seven text conversations at a time. She texts between classes, at the moment soccer practice ends, while being driven to and from school and, often, while studying.

Most of the exchanges are little more than quick greetings, but they can get more in-depth, like “if someone tells you about a drama going on with someone,” Allison said. “I can text one person while talking on the phone to someone else.”

But this proficiency comes at a cost: she blames multitasking for the three B’s on her recent progress report.

“I’ll be reading a book for homework and I’ll get a text message and pause my reading and put down the book, pick up the phone to reply to the text message, and then 20 minutes later realize, ‘Oh, I forgot to do my homework.’ ”

Some shyer students do not socialize through technology — they recede into it. Ramon Ochoa-Lopez, 14, an introvert, plays six hours of video games on weekdays and more on weekends, leaving homework to be done in the bathroom before school.

Escaping into games can also salve teenagers’ age-old desire for some control in their chaotic lives. “It’s a way for me to separate myself,” Ramon says. “If there’s an argument between my mom and one of my brothers, I’ll just go to my room and start playing video games and escape.”

With powerful new cellphones, the interactive experience can go everywhere. Between classes at Woodside or at lunch, when use of personal devices is permitted, students gather in clusters, sometimes chatting face to face, sometimes half-involved in a conversation while texting someone across the teeming quad. Others sit alone, watching a video, listening to music or updating Facebook.

Students say that their parents, worried about the distractions, try to police computer time, but that monitoring the use of cellphones is difficult. Parents may also want to be able to call their children at any time, so taking the phone away is not always an option.

Other parents wholly embrace computer use, even when it has no obvious educational benefit.

“If you’re not on top of technology, you’re not going to be on top of the world,” said John McMullen, 56, a retired criminal investigator whose son, Sean, is one of five friends in the group Vishal joins for lunch each day.

Sean’s favorite medium is video games; he plays for four hours after school and twice that on weekends. He was playing more but found his habit pulling his grade point average below 3.2, the point at which he felt comfortable. He says he sometimes wishes that his parents would force him to quit playing and study, because he finds it hard to quit when given the choice. Still, he says, video games are not responsible for his lack of focus, asserting that in another era he would have been distracted by TV or something else.

“Video games don’t make the hole; they fill it,” says Sean, sitting at a picnic table in the quad, where he is surrounded by a multimillion-dollar view: on the nearby hills are the evergreens that tower above the affluent neighborhoods populated by Internet tycoons. Sean, a senior, concedes that video games take a physical toll: “I haven’t done exercise since my sophomore year. But that doesn’t seem like a big deal. I still look the same.”

Sam Crocker, Vishal’s closest friend, who has straight A’s but lower SAT scores than he would like, blames the Internet’s distractions for his inability to finish either of his two summer reading books.

“I know I can read a book, but then I’m up and checking Facebook,” he says, adding: “Facebook is amazing because it feels like you’re doing something and you’re not doing anything. It’s the absence of doing something, but you feel gratified anyway.”

He concludes: “My attention span is getting worse.”

The Lure of Distraction

Some neuroscientists have been studying people like Sam and Vishal. They have begun to understand what happens to the brains of young people who are constantly online and in touch.

In an experiment at the German Sport University in Cologne in 2007, boys from 12 to 14 spent an hour each night playing video games after they finished homework.

On alternate nights, the boys spent an hour watching an exciting movie, like “Harry Potter” or “Star Trek,” rather than playing video games. That allowed the researchers to compare the effect of video games and TV.

The researchers looked at how the use of these media affected the boys’ brainwave patterns while sleeping and their ability to remember their homework in the subsequent days. They found that playing video games led to markedly lower sleep quality than watching TV, and also led to a “significant decline” in the boys’ ability to remember vocabulary words. The findings were published in the journal Pediatrics.

Markus Dworak, a researcher who led the study and is now a neuroscientist at Harvard, said it was not clear whether the boys’ learning suffered because sleep was disrupted or, as he speculates, also because the intensity of the game experience overrode the brain’s recording of the vocabulary.

“When you look at vocabulary and look at huge stimulus after that, your brain has to decide which information to store,” he said. “Your brain might favor the emotionally stimulating information over the vocabulary.”

At the University of California, San Francisco, scientists have found that when rats have a new experience, like exploring an unfamiliar area, their brains show new patterns of activity. But only when the rats take a break from their exploration do they process those patterns in a way that seems to create a persistent memory.

In that vein, recent imaging studies of people have found that major cross sections of the brain become surprisingly active during downtime. These brain studies suggest to researchers that periods of rest are critical in allowing the brain to synthesize information, make connections between ideas and even develop the sense of self.

Researchers say these studies have particular implications for young people, whose brains have more trouble focusing and setting priorities.

“Downtime is to the brain what sleep is to the body,” said Dr. Rich of Harvard Medical School. “But kids are in a constant mode of stimulation.”

“The headline is: bring back boredom,” added Dr. Rich, who last month gave a speech to the American Academy of Pediatrics entitled, “Finding Huck Finn: Reclaiming Childhood from the River of Electronic Screens.”

Dr. Rich said in an interview that he was not suggesting young people should toss out their devices, but rather that they embrace a more balanced approach to what he said were powerful tools necessary to compete and succeed in modern life.

The heavy use of devices also worries Daniel Anderson, a professor of psychology at the University of Massachusetts at Amherst, who is known for research showing that children are not as harmed by TV viewing as some researchers have suggested.

Multitasking using ubiquitous, interactive and highly stimulating computers and phones, Professor Anderson says, appears to have a more powerful effect than TV.

Like Dr. Rich, he says he believes that young, developing brains are becoming habituated to distraction and to switching tasks, not to focus.

“If you’ve grown up processing multiple media, that’s exactly the mode you’re going to fall into when put in that environment — you develop a need for that stimulation,” he said.

Vishal can attest to that.

“I’m doing Facebook, YouTube, having a conversation or two with a friend, listening to music at the same time. I’m doing a million things at once, like a lot of people my age,” he says. “Sometimes I’ll say: I need to stop this and do my schoolwork, but I can’t.”

“If it weren’t for the Internet, I’d focus more on school and be doing better academically,” he says. But thanks to the Internet, he says, he has discovered and pursued his passion: filmmaking. Without the Internet, “I also wouldn’t know what I want to do with my life.”

Reply Thu 13 Sep, 2012 10:45 am
Personally, I view my devices as "digital prostheses."
Reply Thu 13 Sep, 2012 10:55 am
Yes, every time a new technology comes along it is the work of the Devil. And I love the claim that children in poor families whose parents do not oversee their computer use suffer compared to children whose parents do; the same can be said about nearly every aspect of childhood, as children are better off whenever the parents are involved.

As far as memory is concerned, and people's declining ability to remember such things as phone numbers: so what?

If a skill is no longer needed, or at least far less needed, it is to be expected that this skill will decline over time in the population.

Before writing, people used to remember long and complex stories and pass them down over the generations in oral form with few changes.

When writing came along that skill level was no longer needed.

If the computers and scientific calculators stopped working and I needed to go back to using my old log-log Pickett slide rule, I would be in a world of hurt trying to keep a long string of decimal places in my mind.

At one time that was not a problem at all for me, but that ability has greatly decreased, not so much because of my age as because I have not needed it since the late 1970s.
Reply Thu 13 Sep, 2012 10:57 am
DrewDad wrote:

hawkeye10 wrote:

A scientist says that he has science

Science is something you do, not something you have.

Care to document that assertion? "Science" is a noun, not a verb.
Reply Thu 13 Sep, 2012 10:59 am
Brain Development in a Hyper-Tech World
By Brenda Patoine

From tweens to 20-somethings, students are heading back to school this year equipped with the latest electronic gadgets and high-tech accessories. Today’s youth, the most techno-savvy generation yet, have grown up on the computer and Internet and have fully embraced the virtual world, with its emphasis on instant, constant information and communication. They have practically adopted iPod headsets and cell phones as appendages, often to the bafflement of older generations.

In the face of this nonstop barrage of technology-induced stimulation, a question on the minds of many parents, educators and scientists is: how is this affecting young brains? The question is an important one, and from a scientific standpoint, reasonable to ask given what is known about the developing brain.

A central tenet of neuroscience, for example, is that the brain continues to develop its “wiring diagram” at least well into a person’s 20s. The frontal lobes, regions critical to high-level cognitive skills such as judgment, executive control, and emotional regulation, are the last to fully develop. It is also well accepted that during this extended developmental period, the brain is highly adaptable to and influenced by external environmental circumstances. Might the perpetual bath of technology-driven information and sensory overload impact the still-developing brain in some way?

“There are a lot of things we’ve learned about fundamental principles of brain development and interactions with the environment and so forth that one can reasonably hypothesize about what the effects might be,” said Michael Friedlander, head of neuroscience at Baylor College of Medicine and a member of the Dana Alliance for Brain Initiatives. “But for the most part, the data aren’t there yet. In terms of actual science investigating people who are using these technologies–the kind of experiments and hard data that most neuroscientists would like to collect–it’s pretty thin.”

Given that reality, he added: “The best we can do at this point is look at a lot of the science that has been done in much more controlled settings and try to extrapolate that to the real world of kids interacting with these technologies.”

A Cautionary Flag

While acknowledging that the dearth of data makes it impossible to know what’s going on for sure, a few prominent neuroscientists are raising a cautionary flag about the possible long-term consequences of technology overload.

Among them is Dana Alliance member Jordan Grafman, chief of cognitive neuroscience at the National Institute of Neurological Disorders and Stroke. “In general, technology can be good [for children’s cognitive development] if it is used judiciously,” Grafman said. “But if it is used in a nonjudicious fashion, it will shape the brain in what I think will actually be a negative way.”

The problem is that judicious thinking is among the frontal-lobe skills that are still developing way past the teenage years. In the meantime, the pull of technology is capturing kids at an ever earlier age, when they are not generally able to step back and decide what’s appropriate or necessary, or how much is too much. The outcome, Grafman fears, will be a generation marked by “laziness of thinking.”

“A lot of what is appealing about all these types of instant communications is that they are fast,” he said. “Fast is not equated with deliberation. So I think they can produce a tendency toward shallow thinking. It’s not going to turn off the brain to thinking deeply and thoughtfully about things, but it is going to make that a little bit more difficult to do.”

Multitasking Taxes the Brain

One area where the research is particularly strong is what is popularly known as multitasking. Plugged-in kids have gained a reputation for being masters at toggling between, say, a homework assignment and instant-messaging classmates, downloading music and texting on the cell phone, surfing the Internet while updating Facebook pages, and so on.

A 2006 survey by the Kaiser Family Foundation1 found that middle and high school students spend an average of 6.5 hours a day hooked up to computers or otherwise using electronic devices, and more than a quarter of them are routinely using several types of media at once. It also found that when teens are “studying” at the computer, two-thirds of the time they are also doing something else.

“Children’s rooms are now almost pathogenic because they have so many distractions,” said Dana Alliance member Martha Bridge Denckla, a neuroscientist at Kennedy Krieger Institute and Johns Hopkins who studies attention deficit disorders in kids. “I think the most devastating thing that has happened is giving a child a room with a computer in it–you think you’re being a good parent by doing so. Well, a funny thing can happen on the way to the homework.”

While the common perception is that multitasking saves time, enabling one to get things done faster and better, the evidence suggests quite the opposite. It is clear from a large body of solid scientific research conducted over the past two decades that dividing the brain’s attention between two or more tasks simultaneously has costs, both in performance and time.

Several independent research groups have reported evidence that, at the level of neural systems, multitasking actually entails rapid switching from one task to another. Each switch exacts a toll, at least doubling the time it takes to complete a task and decreasing both the level of performance and the ability to recall what you were doing later on. Study after study has found that multitasking degrades the quality of learning.2

Among the leading researchers who have published heavily in this area are Paul Dux, Vanderbilt University; Marcel Just, Carnegie Mellon University; David Meyer, University of Michigan; Hal Pashler, University of California at San Diego; Russell Poldrack, University of California at Los Angeles, and David Strayer, University of Utah.

The bulk of the evidence comes from laboratory-based studies, using carefully designed experiments in controlled settings to tease apart the brain mechanisms underlying task-switching and its costs–and much of it has been conducted with 20-somethings. As such, Grafman said, the research “relates very nicely to multitasking on computers” and is highly relevant to the developing brain.

Strayer’s work has extended the research to real-world situations such as driving while talking on a cell phone. He has found significantly slower reaction times and a two-fold increase in rear-end accidents among both teenage and older drivers who were simultaneously engaged in cell-phone conversations.[3] In one study, Strayer and colleagues concluded that “the impairments associated with using a cell phone while driving can be as profound as those associated with driving while drunk.”[3]

“The bottom line is that if you try to do more than one thing at the same time, you’re going to have a decrement in performance,” said Grafman. “This has been shown over and over again, and it has not changed from the last generation of young people to today’s young people.”

He added: “I think that one of the big trade-offs between multitasking and ‘unitasking,’ as I call it, is that in multitasking, the opportunity for deeper thinking, for deliberation, or for abstract thinking is much more limited. You have to rely more on surface-level information, and that is not a good recipe for creativity or invention.”

‘Mile Wide, Inch Deep’ Knowledge?

Friedlander echoed this sentiment: “If a child is doing homework while on the computer engaged in chat rooms, or listening to iTunes and so forth, I do think there is a risk that there will never be enough depth and time spent on any one component to go as deep or as far as you might have. You might satisfactorily get all these things done, but the quality of the work or of the communication may not reach the level that it could have had it been given one’s full attention. There’s a risk of being a mile wide and an inch deep.”

Grafman emphasized that the issues–while relevant to people of all ages–are of particular concern for children, whose brains are still developing. “When teens are learning routines–whatever those routines are–the dominant routine is going to play a bigger role in how their brain develops and what kinds of strategies are stored,” he said. If they are constantly toggling between homework and instant messaging and videos, they may get really good at toggling, but as Grafman pointed out, “that does not necessarily equate to being a smarter person.”

Social Development in the Facebook Age

Another area of concern in today’s digital world is the impact of electronic communication on social interactions. The hard science is slim, but experts say there is reason to believe that when the bulk of a young person’s interactions with others is done electronically at the expense of face-to-face communication, social development may be affected.

So-called “social cognition,” which encompasses such things as the ability to form impressions of others, make inferences about their intentions, gauge their emotional reactions and adjust your actions accordingly, is another complex skill that relies on the pre-frontal cortex, the brain’s forward-most and last-to-develop region. Like other high-level cognitive functions, mastery of these skills requires practice. “If you don’t have sufficient in-person practice, that has got to be handicapping you in some way,” said Grafman.

Real-world interactions entail what Friedlander calls “broadband communication,” a term borrowed from the digital world. “So much of what we’re conveying to each other comes from the intonation of our voice, the looks, the facial expression, the body language, the pauses–all those subtle cues that go into communication. Kids who are spending all of their time interacting through this cyber world are very likely to not have the opportunity to develop sets of skills that are innate and important to the human brain in terms of what we call social cognition.”

Friedlander also wonders if over-reliance on electronic interactions, which are so often marked by unnatural delays, even minute ones as in cell-phone conversations, might wire developing brains to a different baseline set-point for temporal processing–how time is interpreted.

“We don’t really know how that will affect kids or if it will have long-term effects, but I think it supports the notion that one needs to be careful to not become totally immersed in the cyber world, because it may be a little more awkward interacting with real living people in real situations where those timing delays are somewhat different,” Friedlander said.

How Much is Too Much?

The information explosion brought about by the Internet and other modern technological tools has undeniably had positive influences on society. “These are enabling technologies,” said Friedlander. “I think their greatest power lies in their ability to enable people to reach out to a world that is much greater than what any child is likely to get in their home or school environment. That’s all good and positive.”
The trick, he said, is knowing where to draw the line. “It gets down to a quantitative question: how much is too much? That’s where the rubber really meets the road for most people, and that is a really tough question to answer.”
Reply Thu 13 Sep, 2012 11:21 am
Yes, and TV was going to be the end of human civilization in the 1950s, with all kinds of studies to support that claim.

If there are funds to be had for grants in the social 'sciences' crying doom over TV, comic books, computer games, the internet and so on, there will be papers released with PhDs' names on them.
Reply Thu 13 Sep, 2012 11:23 am
Television, computers and video games are permanent fixtures in American family life. In fact, 99 percent of households have at least one TV, and more than 80 percent now have at least one computer. This gives kids plenty of opportunities for so-called "screen time."

A 2010 study by researchers at the Seattle Children's Research Institute shows that preschoolers get an average of four hours per day of screen time, or twice the limit recommended by the American Academy of Pediatrics. The effects of so much screen time can include speech delays, aggressive behavior and obesity. On the positive side, kids who play games may develop quicker reflexes and better problem-solving skills.

In a 2009 study, researchers at the Seattle Children's Research Institute studied kids ages 2 months to 4 years old to determine whether screen time made a difference in language development. Their conclusion: Yes. The researchers fitted 329 kids and their parents with digital devices that randomly recorded everything they heard or said for 12 to 16 hours at a time. Kids who were exposed to more TV heard 7 percent fewer words from adults and spoke fewer words themselves. The researchers concluded that the screen time cut into the adult-child interaction that is crucial to developing language skills.

Reply Thu 13 Sep, 2012 11:37 am
Footnote: all my step-grandkids have been around computers almost from birth, and one six-year-old grandson got into my wife's laptop administrator account settings and did a number on them that took me over an hour to track down and repair.

I was very impressed that he found settings that most adults do not know exist on their computers.

I said, dear, set up a guest account for the kid and do not allow him to play games on your laptop otherwise. We do try to limit him to his 'own' outdated computer, however.

It is not my grandkids, however, who will suffer by knowing computers in detail from a very early age; it is the kids who only have access to computers at school for a few hours a week.

Junk science studies or no.
Reply Thu 13 Sep, 2012 11:40 am
LOL, as computer time is replacing TV time in most homes, children included, it should be interesting to see how the junk science people work out the relationship between TV time and computer time.
