
Should we stop using computers?


Soup forgot his password


Dopamine is closely related to what a lot of people see as the problem here.

 

A couple of excerpts from "The Shallows" by Nicholas Carr:

 

MICHAEL GREENBERG, IN a 2008 essay in the New York Review of Books, found the poetry in neuroplasticity. He observed that our neurological system, “with its branches and transmitters and ingeniously spanned gaps, has an improvised quality that seems to mirror the unpredictability of thought itself.” It’s “an ephemeral place that changes as our experience changes.”32 There are many reasons to be grateful that our mental hardware is able to adapt so readily to experience, that even old brains can be taught new tricks. The brain’s adaptability hasn’t just led to new treatments, and new hope, for those suffering from brain injury or illness. It provides all of us with a mental flexibility, an intellectual litheness, that allows us to adapt to new situations, learn new skills, and in general expand our horizons.

But the news is not all good. Although neuroplasticity provides an escape from genetic determinism, a loophole for free thought and free will, it also imposes its own form of determinism on our behavior. As particular circuits in our brain strengthen through the repetition of a physical or mental activity, they begin to transform that activity into a habit. The paradox of neuroplasticity, observes Doidge, is that, for all the mental flexibility it grants us, it can end up locking us into “rigid behaviors.”33 The chemically triggered synapses that link our neurons program us, in effect, to want to keep exercising the circuits they’ve formed. Once we’ve wired new circuitry in our brain, Doidge writes, “we long to keep it activated.”34 That’s the way the brain fine-tunes its operations. Routine activities are carried out ever more quickly and efficiently, while unused circuits are pruned away.

Plastic does not mean elastic, in other words. Our neural loops don’t snap back to their former state the way a rubber band does; they hold onto their changed state. And nothing says the new state has to be a desirable one. Bad habits can be ingrained in our neurons as easily as good ones. Pascual-Leone observes that “plastic changes may not necessarily represent a behavioral gain for a given subject.” In addition to being “the mechanism for development and learning,” plasticity can be “a cause of pathology.”35

It comes as no surprise that neuroplasticity has been linked to mental afflictions ranging from depression to obsessive-compulsive disorder to tinnitus. The more a sufferer concentrates on his symptoms, the deeper those symptoms are etched into his neural circuits. In the worst cases, the mind essentially trains itself to be sick. Many addictions, too, are reinforced by the strengthening of plastic pathways in the brain. Even very small doses of addictive drugs can dramatically alter the flow of neurotransmitters in a person’s synapses, resulting in long-lasting alterations in brain circuitry and function. In some cases, the buildup of certain kinds of neurotransmitters, such as dopamine, a pleasure-producing cousin to adrenaline, seems to actually trigger the turning on or off of particular genes, bringing even stronger cravings for the drug. The vital paths turn deadly.

The potential for unwelcome neuroplastic adaptations also exists in the everyday, normal functioning of our minds. Experiments show that just as the brain can build new or stronger circuits through physical or mental practice, those circuits can weaken or dissolve with neglect. “If we stop exercising our mental skills,” writes Doidge, “we do not just forget them: the brain map space for those skills is turned over to the skills we practice instead.”36 Jeffrey Schwartz, a professor of psychiatry at UCLA’s medical school, terms this process “survival of the busiest.”37 The mental skills we sacrifice may be as valuable, or even more valuable, than the ones we gain. When it comes to the quality of our thought, our neurons and synapses are entirely indifferent. The possibility of intellectual decay is inherent in the malleability of our brains.

That doesn’t mean that we can’t, with concerted effort, once again redirect our neural signals and rebuild the skills we’ve lost. What it does mean is that the vital paths in our brains become, as Monsieur Dumont understood, the paths of least resistance. They are the paths that most of us will take most of the time, and the farther we proceed down them, the more difficult it becomes to turn back.

 

 

WHAT DETERMINES WHAT we remember and what we forget? The key to memory consolidation is attentiveness. Storing explicit memories and, equally important, forming connections between them requires strong mental concentration, amplified by repetition or by intense intellectual or emotional engagement. The sharper the attention, the sharper the memory. “For a memory to persist,” writes Kandel, “the incoming information must be thoroughly and deeply processed. This is accomplished by attending to the information and associating it meaningfully and systematically with knowledge already well established in memory.”35 If we’re unable to attend to the information in our working memory, the information lasts only as long as the neurons that hold it maintain their electric charge—a few seconds at best. Then it’s gone, leaving little or no trace in the mind.

 

Attention may seem ethereal—a “ghost inside the head,” as the developmental psychologist Bruce McCandliss says36—but it’s a genuine physical state, and it produces material effects throughout the brain. Recent experiments with mice indicate that the act of paying attention to an idea or an experience sets off a chain reaction that crisscrosses the brain. Conscious attention begins in the frontal lobes of the cerebral cortex, with the imposition of top-down, executive control over the mind’s focus. The establishment of attention leads the neurons of the cortex to send signals to neurons in the midbrain that produce the powerful neurotransmitter dopamine. The axons of these neurons reach all the way into the hippocampus, providing a distribution channel for the neurotransmitter. Once the dopamine is funneled into the synapses of the hippocampus, it jump-starts the consolidation of explicit memory, probably by activating genes that spur the synthesis of new proteins.37

The influx of competing messages that we receive whenever we go online not only overloads our working memory; it makes it much harder for our frontal lobes to concentrate our attention on any one thing. The process of memory consolidation can’t even get started. And, thanks once again to the plasticity of our neuronal pathways, the more we use the Web, the more we train our brain to be distracted—to process information very quickly and very efficiently but without sustained attention. That helps explain why many of us find it hard to concentrate even when we’re away from our computers. Our brains become adept at forgetting, inept at remembering. Our growing dependence on the Web’s information stores may in fact be the product of a self-perpetuating, self-amplifying loop. As our use of the Web makes it harder for us to lock information into our biological memory, we’re forced to rely more and more on the Net’s capacious and easily searchable artificial memory, even if it makes us shallower thinkers.

 

The changes in our brains happen automatically, outside the narrow compass of our consciousness, but that doesn’t absolve us from responsibility for the choices we make. One thing that sets us apart from other animals is the command we have been granted over our attention. “‘Learning how to think’ really means learning how to exercise some control over how and what you think,” said the novelist David Foster Wallace in a commencement address at Kenyon College in 2005. “It means being conscious and aware enough to choose what you pay attention to and to choose how you construct meaning from experience.” To give up that control is to be left with “the constant gnawing sense of having had and lost some infinite thing.”38 A mentally troubled man—he would hang himself two and a half years after the speech—Wallace knew with special urgency the stakes involved in how we choose, or fail to choose, to focus our mind. We cede control over our attention at our own peril. Everything that neuroscientists have discovered about the cellular and molecular workings of the human brain underscores that point.

 

Socrates may have been mistaken about the effects of writing, but he was wise to warn us against taking memory’s treasures for granted. His prophecy of a tool that would “implant forgetfulness” in the mind, providing “a recipe not for memory, but for reminder,” has gained new currency with the coming of the Web. The prediction may turn out to have been merely premature, not wrong. Of all the sacrifices we make when we devote ourselves to the Internet as our universal medium, the greatest is likely to be the wealth of connections within our own minds. It’s true that the Web is itself a network of connections, but the hyperlinks that associate bits of online data are nothing like the synapses in our brain. The Web’s links are just addresses, simple software tags that direct a browser to load another discrete page of information. They have none of the organic richness or sensitivity of our synapses. The brain’s connections, writes Ari Schulman, “don’t merely provide access to a memory; they in many ways constitute memories.”39 The Web’s connections are not our connections—and no matter how many hours we spend searching and surfing, they will never become our connections. When we outsource our memory to a machine, we also outsource a very important part of our intellect and even our identity. William James, in concluding his 1892 lecture on memory, said, “The connecting is the thinking.” To which could be added, “The connecting is the self.”

 

As far as "integrating technology into our lives" When you look at every single piece of technology that has come along in America since the 18th century, it has always been a case of "integrating our lives into technology" In the 1800's, the printed word held a monopoly on public discourse. Then the telegraph did. Then the telephone. Then the radio. In the 1980's it was clearly the television. Today i would argue that the television still holds the throne of "public discourse monopolist" since the GOP primaries is still a tv show, and any media on the internet about our own presidential candidacy is owned and controlled by the same TV broadcasters that have controlled television from the beginning. That may sound like a conspiracy, NBC, CBS and their subsidiaries see themselves as a public utility service like the post office and electricity and I can find direct quotes from their CEO's of this if requested.

 

 

Edit: Fora.tv is one of those sites that keep me on the internet. Check it out.

"Has Malcolm Gladwell's Opinion on Social Media and the Arab Spring Changed?" -
