Should we stop using computers?


Soup forgot his password


This forum is supported by the 12ozProphet Shop, so go buy a shirt and help support!

I reread the previous post, and admittedly, if you haven't read any of the books I've offered, you won't have a clue what I'm talking about. If you don't want to read the books because you're waiting for the movie, this is the best I can do for you.


"The medium is the message"

 

http://en.wikipedia.org/wiki/The_medium_is_the_message

 

Marshall McLuhan suggests that what you communicate is far less important than how you communicate. It doesn't matter what you say to someone through a telephone; using a telephone says a lot more. "The medium is the message."


Neil Postman revises this argument by suggesting "the medium is the metaphor" in Amusing Ourselves to Death.

 

 

Neil Postman argues that what you can communicate through a medium is defined by the medium, which then redefines your entire relationship to people, society, culture, and the world. People who read books place more value on objectivity than people who watch TV, for instance.

 

Each new technology monopolizes communication and therefore monopolizes culture. We see this today in a decline in literacy and reading comprehension compared to the 18th century, when literacy was somewhere around 98%. When the book held a monopoly on public discourse, society demanded high-grade literacy to plug into the culture. We are guided by logic and abstraction when we read print, yet we are presented with concrete images and crave aesthetics when we watch television. This influences decisions such as which president we choose. Media therefore influence the way we tell the truth. Each technology comes with its own epistemology. A print-based culture was objective and rational; the way it expressed ideas was logical and sequential.

 

A television-based epistemology turns politics, education, news, etc. into channels of entertainment, focusing on aesthetics rather than on logic and rationale.

 

When print held a monopoly on discourse in America (until the 19th century), everything was discussed by everyone in a serious and thoughtful manner. It demanded high-grade literacy to plug into society. The Lyceum Movement started as a form of adult education; by 1835 there were more than 3,000 Lyceum lecture halls in 15 states, where intellectuals, writers, and humorists spoke for free to audiences of all classes.

 

Postman defines the "Typographic Man" as "detached, analytical, devoted to logic, abhorring contradiction." He was also classless, as reading was not regarded as an elitist activity in America.

 

The first printing press in the US was set up in 1638 at Harvard University. From the 17th century on, all knowledge began to be transferred to the printed page in America, with books imported from Britain. Soon after, public libraries provided all these books free of charge to every American in the country. There has never been a more open loop of communication in this country. Then came the telegraph, penny papers, the telephone, radio, newspapers, and television, all of which became government-sponsored monopolies controlled by companies like AT&T and Bell.


Since Amusing Ourselves to Death was written in 1984, we have to go to The Shallows by Nicholas Carr, published in 2010, to compare TV and typography to computers.

 

 

Chapter 5: A Medium of the Most General Nature

 

Much like the television tried to do, the computer is slowly replacing all mediums that came before it. The way the Web has progressed as a medium replays, with the velocity of a time-lapse film, the entire history of modern media. Hundreds of years have been compressed into a couple of decades. The first information-processing machine that the Net replicated was Gutenberg's press. Because text is fairly simple to translate into software code and to share over networks — it doesn't require a lot of memory to store, a lot of bandwidth to transmit, or a lot of processing power to render on a screen — early Web sites were usually constructed entirely of typographical symbols. The very term we came to use to describe what we look at online — pages — emphasized the connection to printed documents.

 

Next, the Web began to take over the work of our traditional sound-processing equipment — radios and phonographs and tape decks. The earliest sounds to be heard online were spoken words, but soon snippets of music, and then entire songs and even symphonies, were streaming through sites, at ever-higher levels of fidelity. The network's ability to handle audio streams was aided by the development of software algorithms, such as the one used to produce MP3 files, that erase from music and other recordings sounds that are hard for the human ear to hear. The algorithms allowed sound files to be compressed to much smaller sizes with only slight sacrifices in quality. Telephone calls also began to be routed over the fiber-optic cables of the Internet, bypassing traditional phone lines.
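Carr mentions the MP3 encoder only in passing. As a toy sketch of the underlying idea (drop the components too quiet to hear), not the actual MP3 psychoacoustic model, perceptual compression can be illustrated as thresholding in the frequency domain; all the numbers below are made up for illustration:

```python
import cmath
import math

def dft(samples):
    # Naive discrete Fourier transform (fine for a short illustration).
    n = len(samples)
    return [sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def compress(samples, threshold):
    # Keep only frequency components loud enough to matter; anything
    # quieter than `threshold` is discarded, the way a perceptual
    # coder discards sounds the ear cannot detect.
    spectrum = dft(samples)
    return [c if abs(c) >= threshold else 0 for c in spectrum]

# A loud 2 Hz tone plus a faint 7 Hz tone over 64 samples.
n = 64
signal = [math.sin(2 * math.pi * 2 * t / n) + 0.01 * math.sin(2 * math.pi * 7 * t / n)
          for t in range(n)]
kept = compress(signal, threshold=1.0)
print(sum(1 for c in kept if c != 0), "of", n, "components kept")  # → 2 of 64 components kept
```

Here only the loud tone survives; the faint one falls below the audibility threshold and is discarded, shrinking the data that must be stored while changing the reconstructed sound only slightly.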

 

Chapter 7: The Juggler's Brain

 

Our use of the Internet involves many paradoxes, but the one that promises to have the greatest long-term influence over how we think is this: the Net seizes our attention only to scatter it.

 

Chapter 9: Search, Memory

 

...it's no longer terribly efficient for our brains to store information. Memory

should now function like a simple index, pointing us to places on the Web where

we can locate the information we need at the moment we need it. Why memorize

the content of a single book when you could be using your brain to hold a quick

guide to an entire library? Rather than memorize information, we now store it

digitally and just remember what we stored. As the Web teaches us to think like it

does, we'll end up keeping rather little deep knowledge in our own heads.

 

When a person fails to consolidate a fact, an idea, or an experience in long-term

memory, he's not "freeing up" space in his brain for other functions. In contrast

to working memory, with its constrained capacity, long-term memory expands

and contracts with almost unlimited elasticity, thanks to the brain's ability to

grow and prune synaptic terminals and continually adjust the strength of

synaptic connections. The brain never reaches a point at which experiences can

no longer be committed to memory; the brain cannot be full. The very act of

remembering appears to modify the brain in a way that can make it

easier to learn ideas and skills in the future.

 

We don't constrain our mental powers when we store new long-term memories.

We strengthen them. With each expansion of our memory comes an enlargement

of our intelligence. The Web provides a convenient and compelling supplement to

personal memory, but when we start using the Web as a substitute for personal

memory, bypassing the inner processes of consolidation, we risk emptying our

minds of their riches.

 

In the 1970s, when schools began allowing students to use portable calculators,

many parents objected. They worried that a reliance on the machines would

weaken their children's grasp of mathematical concepts. The fears, subsequent

studies showed, were largely unwarranted. No longer forced to spend a lot of time

on routine calculations, many students gained a deeper understanding of the

principles underlying their exercises. Today, in freeing us from the work of

remembering, it's said, the Web allows us to devote more time to creative

thought. As the experience of math students has shown, the calculator made it

easier for the brain to transfer ideas from working memory to long-term memory

and encode them in the conceptual schemas that are so important to building

knowledge. The Web has a very different effect. It places more pressure on our

working memory, not only diverting resources from our higher reasoning

faculties but obstructing the consolidation of long-term memories and the

development of schemas. The calculator, a powerful but highly specialized tool,

turned out to be an aid to memory. The Web is a technology of forgetfulness.

 

The more we use the Web, the more we train our brain to be distracted — to

process information very quickly and very efficiently but without sustained

attention. That helps explain why many of us find it hard to concentrate even

when we're away from our computers. Our brains become adept at forgetting,

inept at remembering.

 

 

 

I'll save the other half of those books for later. While so far this has been largely an existentialist argument, the rest of the books go into the more economic and politically motivated monopolies of information technology, and the pros and cons of bundling vs. unbundling information, monopolizing vs. democratizing information technology, and so on.

 

Tim Wu - The Master Switch

Nicholas Carr - The Big Switch (couldn't find anything in particular that covers the whole book, so here) http://www.youtube.com/results?search_query=nicholas+carr+the+big+switch&oq=nicholas+carr+the+big+switch&gs_l=youtube.3..0.1961.5963.0.6181.16.5.0.11.11.1.139.464.4j1.5.0...0.0...1ac.tamzZKAvmes

Neil Postman - Technopoly http://www.youtube.com/watch?v=KbAPtGYiRvg

Neil Postman - Building a Bridge to the 18th Century http://www.youtube.com/watch?v=JovJr_LmAP8

 

A couple more:

 

Jaron Lanier - You Are Not a Gadget

Sherry Turkle - Alone Together

William Powers - Hamlet's Blackberry http://www.youtube.com/watch?v=5IFhmw9YdoY


Soup, I wasn't trying to pick a fight with you.

 

Here's a few things that I think may be helpful for you.

 

If you want to make a point, just make the point. (re-read this 5 times)

 

You shouldn't give out a list of books to read and think that we would understand you if only we would read books X, Y, and Z. If you understand the points in those books, just give us the points.

You shouldn't say things like "it's been scientifically proven" to try and make a point. You shouldn't appeal to authority (just because someone says something doesn't make it true). You shouldn't contradict yourself (in post #1 you say Google has indexed .04% of the internet; in post #37 you say Google has indexed between 4-12%). You shouldn't overstate your case; it makes it that much easier to falsify your claim (e.g. post #48: "a hammer has only one use..." False; hammers have many uses, including but not limited to building, demolition, bottle opener, door stop, etc. This is just one simple example, but there are many like it you've made in this thread). You shouldn't stray off of your own topic (this thread was about computer usage; there's no reason to bring in god, religion, or eternity, since whether any of these are true or false, they aren't on topic and they don't further your idea). When someone makes a point, try to understand their point before responding to it.

 

There's more but I'm just going to stop for now. I'm sure I may come off as patronizing but that isn't my intention. I'm just trying to help you out.

 

And I still have no idea if you're trolling or just don't know what you're talking about.


I'm glad you brought up Aristotle's rules of logic. Your examples are a bit of a stretch, but I agree with what you're trying to say, although what you're applying them to misses the points I was making. That could've been my fault, so let me try explaining, since you're not really keen on asking questions and jump to a fuck-ton of conclusions.

 

"You shouldn't give out a list of books to read and think that we would understand you if only we would read books X, Y, and Z."

Because you went straight to being dismissive of my post instead of asking for clarification or posting anything constructive, I could only assume you were done with the discussion, so I left it with a list of books you could read instead.

 

"If you understand the points in those books, just give us the points. "

I can definitely simplify those books into points. The question is why would I want to do that? What value is there in breaking down a bunch of books in this thread when there is no real discussion here?

 

"You shouldn't say things like "it's been scientifically proven" to try and make a point."

You're confusing warrants for a claim with support. That's not an example of support. If you want the support, I can provide it.

 

"You shouldn't appeal to authority, (just because someone says something doesn't make it true)."

True, but I haven't. If you think I have, let me know where. In fact, I just told Spambot to stop appealing to the authority of his Austrian economists.

 

"You shouldn't contradict yourself, (post #1 you say Google has indexed .04% of the internet, post #37 you say Google has indexed between 4-12%)"

You're right; however, I have found research papers that say both. While researchers can't agree on a number, they support my claim that Google indexes only a fraction of the Internet.

 

"You shouldn't overstate your case, it makes it that much easier to falsify your claim, (e.g. post #48 "a hammer has only one use..." false, hammers have many uses including but not limited to: building, demolition, bottle opener, door stop etc."

Valid point, but not a great example of it. The list of uses you gave can be given to a number of things, including your skull, but we're talking about the hammer as a sophisticated tool and technology. The single purpose of the hammer, which defines the hammer as a hammer and not your skull or a rock or whatever, is that a hammer can put nails into a wall.

 

This technology changed people's entire relationship to the world. It created a new kind of building, a new way to think about nature, a new career and role in society, a new culture with a different value for everything around it. We see this with every new technology including agriculture, animal husbandry, the printed word, and so on. THAT was the point I was making. The role of the hammer—and all technology—in shaping modern society cannot be overstated.

 

"This is just one simple example, but there are many like this you've made in this thread). You shouldn't stray off of your own topic, (this thread was about computer usage, no reason to bring in god, religion, eternity.

Whether any of these are true or false they aren't on topic and they don't further your idea). "

 

I completely disagree with this. We are talking about the computer as a technology and a medium. We are attempting to weigh the benefits against the costs, and one part of this is to compare the computer to other technologies, to see how technology changes the metaphors/abstract concepts of our society. Since we see the world through abstract concepts like eternity, religion, and god—also capitalism, individualism, liberty, fairness, etc.—it's important to see how each new technology changes the meaning of and our relationship to these metaphors. As we change everything over to the computer, we must ask ourselves how our relationship to these metaphors and to the world is altered. Since you're clearly interested in epistemology, at least in the context of religion, you might also be interested in looking at technology as epistemology, and with every new technology the epistemology changes.

 

"When someone makes a point, try to understand their point before responding to it. "

Sage wisdom, Mr. "I’m just gonna assume you don’t know what you’re talking about."


Actually you did.

 

There are only two ways to understand your last post.

 

1) You don't know what you're talking about.

 

2) You know what you're talking about and are misusing words and ideas intentionally. In other words, trolling.

And like I said, that's really just one way to understand my post. You just added some flavor to #2.

 

How many of those people have come to a conclusion at all? That question is what's important here.


There actually is context to my quote above.

 

As far as your question 'should we stop using computers?' I'm leaning towards not stopping.

 

Can you answer my question? How many of the people you listed above came to the conclusion that we should stop using computers?


ok

 

"If you understand the points in those books, just give us the points. "

I can definitely simplify those books into points. The question is why would I want to do that? What value is there in breaking down a bunch of books in this thread when there is no real discussion here?"

 

Seems to me like there is some discussion here.

 

"You shouldn't say things like "it's been scientifically proven" to try and make a point."

You're confusing warrants for a claim with support."

 

I'm sort of used to using warrant in a particular way but recognize that people easily use it interchangeably with other words like justification. You're going to have to fill me in on how you use both those words for me to understand the point you're making here.

 

"You shouldn't overstate your case, it makes it that much easier to falsify your claim, (e.g. post #48 "a hammer has only one use..." false, hammers have many uses including but not limited to: building, demolition, bottle opener, door stop etc."

Valid point but not a great example of it. The list of uses you gave can be given to a number of things, including your skull, but we're talking about the hammer as a sophisticated tool and technology. The single purpose of the hammer, which defines the hammer as a hammer and not your skull or a rock or whatever, is that a hammer can put nails into a wall."

 

I disagree that "the single purpose of the hammer, which defines the hammer...is that a hammer can put nails into a wall". I think what actually defines a hammer is common purpose, not single purpose. Thinking about it this way is more inclusive of how we actually use a hammer and still allows for the distinction between a hammer, my skull, and a rock.

 

"This technology changed people's entire relationship to the world. It created a new kind of building, a new way to think about nature, a new career and role in society, a new culture with a different value for everything around it. We see this with every new technology including agriculture, animal husbandry, the printed word, and so on. THAT was the point I was making. The role of the hammer—and all technology—in shaping modern society cannot be overstated."

 

I guess we'll have to agree to disagree on this one. Seems to me like you are overstating things here.

 

"I completely disagree with this. We are talking about the computer as a technology and a medium. We are attempting to weigh the benefits against the costs, and one part of this is to compare the computer to other technologies, to see how technology changes the metaphors/abstract concepts of our society. Since we see the world through abstract concepts like eternity, religion, and god—also capitalism, individualism, liberty, fairness, etc.—it's important to see how each new technology changes the meaning of and our relationship to these metaphors."

 

You give a list of things (eternity, religion, and god—also capitalism, individualism, liberty, fairness) and call them metaphors, but are they all metaphors?

 

"Since you're clearly interested in epistemology, at least in the context of religion."

 

I am interested in epistemology, but I didn't mean to give you the impression that it was only in the context of religion. One statement I noticed from you was "Researching on the internet may seem faster than researching in a library but the information you end up with when you use the internet is so inferior to book learning that its truly a waste of fucking time." and this is so blatantly incorrect that I wanted to comment in this thread. I let you know there is a ton of good information I find on the internet through Google. You asked me what kind and I gave you one example, but there are lots more we don't even need to bother getting into (meaning I've successfully searched for things many times over).

 

"Since you're clearly interested in epistemology...you might also be interested in looking at technology as epistemology, and with every new technology the epistemology changes."

 

I think I would be sure to never slip up here and to always keep the distinction between how we actually "know" things internally vs. the way we come to find new things externally. That's sloppy, but it's open to critique and revision.

 

"I'm sorry, what context did I miss?"

 

Really? The post before the post you quoted.

 

"None of them directly answered that question. Just like a hazard warning on a cigarette box doesn't directly tell you to stop smoking."

 

I know you aren't saying using computers directly causes cancer, so it's not just like a warning on a cigarette label. In other words, you don't have to tell us what it's like; you can just tell us what it is.


I'm sort of used to using warrant in a particular way but

recognize that people easily use it interchangeably with other words like

justification. You're going to have to fill me in on how you use both those words for

me to understand the point you're making here.

 

Claim: What I want readers to believe

 

Support: What I will use to support the claim

 

Warrant: A general principle that explains why I think my evidence is accurate and relevant to the claim.

 

 

 

Claim: Books allow for deeper concentration, contemplation, and memorization

than any other format.

 

Warrant: The claim is evident through scientific research

 

Support: Not given at the time because this is the internet and I don't feel like qualifying things unless someone asks me to, but here:

 

 

 

Taken from "The Shallows" by Nicholas Carr

Keep in mind the entire book is devoted to this topic and this is only one section.

 

 

GARY SMALL, a professor of psychiatry at UCLA and the director of its Memory and

Aging Center, has been studying the physiological and neurological effects of the use of

digital media, and what he’s discovered backs up Merzenich’s belief that the Net causes

extensive brain changes. “The current explosion of digital technology not only is

changing the way we live and communicate but is rapidly and profoundly altering our

brains,” he says. The daily use of computers, smartphones, search engines, and other such

tools “stimulates brain cell alteration and neurotransmitter release, gradually

strengthening new neural pathways in our brains while weakening old ones.”7

 

In 2008, Small and two of his colleagues carried out the first experiment that actually

showed people’s brains changing in response to Internet use.8 The researchers recruited

twenty-four volunteers—a dozen experienced Web surfers and a dozen novices—and

scanned their brains as they performed searches on Google. (Since a computer won’t fit

inside a magnetic resonance imager, the subjects were equipped with goggles onto which

were projected images of Web pages, along with a small handheld touchpad to navigate

the pages.) The scans revealed that the brain activity of the experienced Googlers was

much broader than that of the novices. In particular, “the computer-savvy subjects used a

specific network in the left front part of the brain, known as the dorsolateral prefrontal

cortex, [while] the Internet-naïve subjects showed minimal, if any, activity in this area.”

As a control for the test, the researchers also had the subjects read straight text in a

simulation of book reading; in this case, scans revealed no significant difference in brain

activity between the two groups. Clearly, the experienced Net users’ distinctive neural

pathways had developed through their Internet use.

 

The most remarkable part of the experiment came when the tests were repeated six days

later. In the interim, the researchers had the novices spend an hour a day online, searching

the Net. The new scans revealed that the area in their prefrontal cortex that had been

largely dormant now showed extensive activity—just like the activity in the brains of the

veteran surfers. “After just five days of practice, the exact same neural circuitry in the

front part of the brain became active in the Internet-naïve subjects,” reports Small. “Five

hours on the Internet, and the naïve subjects had already rewired their brains.” He goes

on to ask, “If our brains are so sensitive to just an hour a day of computer exposure, what

happens when we spend more time [online]?” 9

 

One other finding of the study sheds light on the differences between reading Web pages

and reading books. The researchers found that when people search the Net they exhibit a

very different pattern of brain activity than they do when they read book-like text. Book

readers have a lot of activity in regions associated with language, memory, and visual

processing, but they don’t display much activity in the prefrontal regions associated with

decision making and problem solving. Experienced Net users, by contrast, display

extensive activity across all those brain regions when they scan and search Web pages.

The good news here is that Web surfing, because it engages so many brain functions,

may help keep older people’s minds sharp. Searching and browsing seem to “exercise”

the brain in a way similar to solving crossword puzzles, says Small.

 

But the extensive activity in the brains of surfers also points to why deep reading and

other acts of sustained concentration become so difficult online. The need to evaluate

links and make related navigational choices, while also processing a multiplicity of

fleeting sensory stimuli, requires constant mental coordination and decision making,

distracting the brain from the work of interpreting text or other information. Whenever

we, as readers, come upon a link, we have to pause, for at least a split second, to allow

our prefrontal cortex to evaluate whether or not we should click on it. The redirection of

our mental resources, from reading words to making judgments, may be imperceptible to

us—our brains are quick—but it’s been shown to impede comprehension and retention,

particularly when it’s repeated frequently. As the executive functions of the prefrontal

cortex kick in, our brains become not only exercised but overtaxed. In a very real way,

the Web returns us to the time of scriptura continua, when reading was a cognitively

strenuous act. In reading online, Maryanne Wolf says, we sacrifice the facility that makes

deep reading possible. We revert to being “mere decoders of information.”10 Our ability to

make the rich mental connections that form when we read deeply and without distraction

remains largely disengaged.

 

Steven Johnson, in his 2005 book Everything Bad Is Good for You, contrasted the

widespread, teeming neural activity seen in the brains of computer users with the much

more muted activity evident in the brains of book readers. The comparison led him to

suggest that computer use provides more intense mental stimulation than does book

reading. The neural evidence could even, he wrote, lead a person to conclude that

“reading books chronically understimulates the senses.”11 But while Johnson’s diagnosis

is correct, his interpretation of the differing patterns of brain activity is misleading. It is

the very fact that book reading “understimulates the senses” that makes the activity so

intellectually rewarding. By allowing us to filter out distractions, to quiet the problem-

solving functions of the frontal lobes, deep reading becomes a form of deep thinking. The

mind of the experienced book reader is a calm mind, not a buzzing one. When it comes to

the firing of our neurons, it’s a mistake to assume that more is better.

 

John Sweller, an Australian educational psychologist, has spent three decades studying

how our minds process information and, in particular, how we learn. His work

illuminates how the Net and other media influence the style and the depth of our thinking.

Our brains, he explains, incorporate two very different kinds of memory: short-term and

long-term. We hold our immediate impressions, sensations, and thoughts as short-term

memories, which tend to last only a matter of seconds. All the things we’ve learned about

the world, whether consciously or unconsciously, are stored as long-term memories,

which can remain in our brains for a few days, a few years, or even a lifetime. One

particular type of short-term memory, called working memory, plays an instrumental role

in the transfer of information into long-term memory and hence in the creation of our

personal store of knowledge. Working memory forms, in a very real sense, the contents

of our consciousness at any given moment. “We are conscious of what is in working

memory and not conscious of anything else,” says Sweller.12

 

If working memory is the mind’s scratch pad, then long-term memory is its filing system.

The contents of our long-term memory lie mainly outside of our consciousness. In order

for us to think about something we’ve previously learned or experienced, our brain has to

transfer the memory from long-term memory back into working memory. “We are only

aware that something was stored in long-term memory when it is brought down into

working memory,” explains Sweller.13 It was once assumed that long-term memory

served merely as a big warehouse of facts, impressions, and events, that it “played little

part in complex cognitive processes such as thinking and problem-solving.”14 But brain

scientists have come to realize that long-term memory is actually the seat of

understanding. It stores not just facts but complex concepts, or “schemas.” By organizing

scattered bits of information into patterns of knowledge, schemas give depth and richness

to our thinking. “Our intellectual prowess is derived largely from the schemas we have

acquired over long periods of time,” says Sweller. “We are able to understand concepts in

our areas of expertise because we have schemas associated with those concepts.”15

 

The depth of our intelligence hinges on our ability to transfer information from working

memory to long-term memory and weave it into conceptual schemas. But the passage

from working memory to long-term memory also forms the major bottleneck in our brain.

Unlike long-term memory, which has a vast capacity, working memory is able to hold

only a very small amount of information. In a renowned 1956 paper, “The Magical

Number Seven, Plus or Minus Two,” Princeton psychologist George Miller observed that

working memory could typically hold just seven pieces, or “elements,” of information.

Even that is now considered an overstatement. According to Sweller, current evidence

suggests that “we can process no more than about two to four elements at any given time

with the actual number probably being at the lower [rather] than the higher end of this

scale.” Those elements that we are able to hold in working memory will, moreover,

quickly vanish “unless we are able to refresh them by rehearsal.”16

 

Imagine filling a bathtub with a thimble; that’s the challenge involved in transferring

information from working memory into long-term memory. By regulating the velocity

and intensity of information flow, media exert a strong influence on this process. When

we read a book, the information faucet provides a steady drip, which we can control by

the pace of our reading. Through our single-minded concentration on the text, we can

transfer all or most of the information, thimbleful by thimbleful, into long-term memory

and forge the rich associations essential to the creation of schemas. With the Net, we face

many information faucets, all going full blast. Our little thimble overflows as we rush

from one faucet to the next. We’re able to transfer only a small portion of the information

to long-term memory, and what we do transfer is a jumble of drops from different

faucets, not a continuous, coherent stream from one source.

 

The information flowing into our working memory at any given moment is called our

“cognitive load.” When the load exceeds our mind’s ability to store and process the

information—when the water overflows the thimble—we’re unable to retain the

information or to draw connections with the information already stored in our long-term

memory. We can’t translate the new information into schemas. Our ability to learn

suffers, and our understanding remains shallow. Because our ability to maintain our

attention also depends on our working memory—“we have to remember what it is we are

to concentrate on,” as Torkel Klingberg says—a high cognitive load amplifies the

distractedness we experience. When our brain is overtaxed, we find “distractions more

distracting.”17 (Some studies link attention deficit disorder, or ADD, to the overloading of

working memory.) Experiments indicate that as we reach the limits of our working

memory, it becomes harder to distinguish relevant information from irrelevant

information, signal from noise. We become mindless consumers of data.

 

Difficulties in developing an understanding of a subject or a concept appear to be

“heavily determined by working memory load,” writes Sweller, and the more complex

the material we’re trying to learn, the greater the penalty exacted by an overloaded

mind.18 There are many possible sources of cognitive overload, but two of the most

important, according to Sweller, are “extraneous problem-solving” and “divided

attention.” Those also happen to be two of the central features of the Net as an

informational medium. Using the Net may, as Gary Small suggests, exercise the brain the

way solving crossword puzzles does. But such intensive exercise, when it becomes our

primary mode of thought, can impede deep learning and thinking. Try reading a book

while doing a crossword puzzle; that’s the intellectual environment of the Internet.

 

 

 

 

 

7. Gary Small and Gigi Vorgan, iBrain: Surviving the Technological Alteration of the Modern Mind (New York: Collins, 2008), 1.

8. G. W. Small, T. D. Moody, P. Siddarth, and S. Y. Bookheimer, “Your Brain on Google: Patterns of Cerebral Activation during Internet Searching,” American Journal of Geriatric Psychiatry, 17, no. 2 (February 2009): 116–26. See also Rachel Champeau, “UCLA Study Finds That Searching the Internet Increases Brain Function,” UCLA Newsroom, October 14, 2008, http://newsroom.ucla.edu/portal/ucla/ucla-study-finds-that-searching-64348.aspx.

9. Small and Vorgan, iBrain, 16–17.

10. Maryanne Wolf, interview with the author, March 28, 2008.

11. Steven Johnson, Everything Bad Is Good for You: How Today’s Popular Culture Is Actually Making Us Smarter (New York: Riverhead Books, 2005), 19.

12. John Sweller, Instructional Design in Technical Areas (Camberwell, Australia: Australian Council for Educational Research, 1999), 4.

13. Ibid., 7.

14. Ibid.

15. Ibid., 11.

16. Ibid., 4–5. For a broad review of current thinking on the limits of working memory, see Nelson Cowan, Working Memory Capacity (New York: Psychology Press, 2005).

17. Klingberg, Overflowing Brain, 39 and 72–75.

18. Sweller, Instructional Design, 22.

 

 

 

 

I'll pause here to respond to the rest of your post.

Link to comment
Share on other sites

I disagree that "the single purpose of the hammer, which

defines the hammer...is that a hammer can put nails into a wall". I think what

actually defines a hammer is common purpose and not single purpose. Thinking

about it this way is more inclusive to how we actually use a hammer and still allows

for the distinction between a hammer, my skull and a rock.

 

That’s fine. We can agree that the lexical definition of a hammer is “a tool consisting

of a solid head, usually of metal, set crosswise on a handle, used for beating metals,

driving nails, etc.” That’s all the consensus we need. If you accept that as a dictionary

definition of a hammer, then I can move on to my next point.

 

I guess we'll have to agree to disagree on this one. Seems

to me like you are overstating things here.

 

I dunno if I need to point this out or not, but this is you just saying, “Well I disagree

without providing any logical or intelligible reason except wanting to disagree.”

 

I dunno what else to say except, “alright then,” and move on.

 

If you want to know the supreme role technological advancement has always had in

society you can start with the technology that started civilization, agriculture. Just

move forward through history from there.

 

You give a list of things (eternity, religion, and god—also

capitalism, individualism, liberty, fairness) and call them metaphors, but are they all

metaphors?

 

Of all these responses I’m giving THIS one is the important one. I'm going to dedicate

my next post to this.

 

 

 

I am interested in epistemology but I didn't mean to give

you the impression that it was only in the context of religion.

 

I never said you gave me the impression you were only into religious epistemology. I

said you were into epistemology AT LEAST in the context of religion.

 

One statement I noticed from you was "Researching on

the internet may seem faster than researching in a library but the information you

end up with when you use the internet is so inferior to book learning that its truly a

waste of fucking time." and this is so blatantly incorrect that I wanted to comment in

this thread.

 

OK, now let’s see how you proved my claim to be “blatantly incorrect.”

 

I let you know there is a ton of good information I find on

the internet thru google. You asked me what kind and I gave you one example, but

there are lots more we don't even need to bother getting into (meaning I've

successfully searched for things many times over).

 

Just because you think Google search results are good doesn’t mean the library isn’t

better—I mean vastly better. I'm not talking about personal experience, although

I agree with the findings. I'm talking about historical facts, scientific research into

brain function, human behaviors and the way people process information.

 

I think I would be sure to never slip up here and to

always keep distinction between how we actually "know" things internally vs. the

way we come to find new things externally. That's sloppy but it's open to critique

and revision.

 

I do not understand what this means or how you went from what I said to this,

which probably means you don’t understand what I meant by “technology as

epistemology.” If I need to explain this let me know.

 

Really? The post before the post you quoted.

 

 

Ok so I got the context, but your response still said I’m either a troll or I don’t know

what I’m talking about. Whichever way you look at it you were presumptuous and,

like you said, that’s bad. I don’t really care to continue this so moving on?

 

I know you aren't saying using computers directly causes

cancer. So it's not just like a warning on cigarette label. In other words, you don't

have to tell us what it's like you can just tell us what it is.

 

It’s exactly like a warning label on a cigarette box. “We’re not saying don’t smoke,

but here are some irrefutable facts.” Still don’t like it? Let’s just go back to this then,

 

Can you answer my question? How many people you

listed above come to the conclusion we should stop using computers?

 

Can I tell you how many of the authors above said stop using computers? Nope.


You give a list of things (eternity, religion, and god—also

capitalism, individualism, liberty, fairness) and call them metaphors, but are they all

metaphors?

 

I’m sorry I mixed terms. Metaphor wasn’t what I meant. These are all concepts,

ideas, things that only exist in the abstraction of language. Our understanding of

them and our relationship to them is only as deep as the media-metaphor through

which they are conveyed. An abstract concept like "politics" is understood differently

when it is conveyed visually through television, typographically through books, or

the jumbled mess that is the computer.

 

 

Now this is an idea that comes from Neil Postman so to avoid future responses of

“well I just don’t agree with you” I’m just going to post the fucking text.

 

Neil Postman

The Medium Is the Metaphor (about halfway through)

 

In studying the Bible

as a young man, I found intimations of the idea that forms of media

favor particular kinds of content and therefore are capable of taking

command of a culture. I refer specifically to the Decalogue, the Second

Commandment of which prohibits the Israelites from making concrete

images of anything. "Thou shalt not make unto thee any graven image, any

likeness of any thing that is in heaven above, or that is in the earth

beneath, or that is in the water beneath the earth." I wondered then, as

so many others have, as to why the God of these people would have

included instructions on how they were to symbolize, or not symbolize,

their experience. It is a strange injunction to include as part of an

ethical system unless its author assumed a connection between forms of

human communication and the quality of a culture. We may hazard a guess

that a people who are being asked to embrace an abstract, universal

deity would be rendered unfit to do so by the habit of drawing pictures

or making statues or depicting their ideas in any concrete,

iconographic forms. The God of the Jews was to exist in the Word and

through the Word, an unprecedented conception requiring the highest

order of abstract thinking. Iconography thus became blasphemy so that a

new kind of God could enter a culture. People like ourselves who are in

the process of converting their culture from word-centered to

image-centered might profit by reflecting on this Mosaic injunction. But

even if I am wrong in these conjectures, it is, I believe, a wise and

particularly relevant supposition that the media of communication

available to a culture are a dominant influence on the formation of the

culture's intellectual and social preoccupations. Speech, of course, is

the primal and indispensable medium. It made us human, keeps us human,

and in fact defines what human means. This is not to say that if there

were no other means of communication all humans would find it equally

convenient to speak about the same things in the same way. We know

enough about language to understand that variations in the

structures of languages will result in variations in what may be called

"world view." How people think about time and space, and about things

and processes, will be greatly influenced by the grammatical features of

their language. We dare not suppose therefore that all human minds are

unanimous in understanding how the world is put together. But how much

more divergence there is in world view among different cultures can be

imagined when we consider the great number and variety of tools for

conversation that go beyond speech. For although culture is a creation

of speech, it is recreated anew by every medium of communication--from

painting to hieroglyphs to the alphabet to television. Each medium,

like language itself, makes possible a unique mode of discourse by

providing a new orientation for thought, for expression, for

sensibility. Which, of course, is what McLuhan meant in saying the

medium is the message. His aphorism, however, is in need of amendment

because, as it stands, it may lead one to confuse a message with a

metaphor. A message denotes a specific, concrete statement about the

world. But the forms of our media, including the symbols through which

they permit conversation, do not make such statements. They are rather

like metaphors, working by unobtrusive but powerful implication to

enforce their special definitions of reality. Whether we are

experiencing the world through the lens of speech or the printed word or

the television camera, our media-metaphors classify the world for us,

sequence it, frame it, enlarge it, reduce it, color it, argue a case for

what the world is like. As Ernst Cassirer remarked:

Physical reality seems to recede in proportion as man's symbolic

activity advances. Instead of dealing with the things themselves man is

in a sense constantly conversing with himself. He has so enveloped

himself in linguistic forms, in artistic images, in mythical symbols or

religious rites that he cannot see or know anything except by the

interposition of [an] artificial medium.

What is peculiar about such interpositions of media is that their role

in directing what we will see or know is so rarely noticed. A person

who reads a book or who watches television or who glances at his watch

is not usually interested in how his mind is organized and controlled by

these events, still less in what idea of the world is suggested by a

book, television, or a watch. But there are men and women who have

noticed these things, especially in our own times. Lewis Mumford, for

example, has been one of our great noticers. He is not the sort of a

man who looks at a clock merely to see what time it is. Not that he

lacks interest in the content of clocks, which is of concern to everyone

from moment to moment, but he is far more interested in how a clock

creates the idea of "moment to moment." He attends to the philosophy of

clocks, to clocks as metaphor, about which our education has had little

to say and clock makers nothing at all. "The clock," Mumford has

concluded, "is a piece of power machinery whose 'product' is seconds and

minutes." In manufacturing such a product, the clock has the effect of

disassociating time from human events and thus nourishes the belief in

an independent world of mathematically measurable sequences. Moment to

moment, it turns out, is not God's conception, or nature's. It is man

conversing with himself about and through a piece of machinery he

created. In Mumford's great book Technics and Civilization, he shows

how, beginning in the fourteenth century, the clock made us into

time-keepers, and then time-savers, and now time-servers. In the

process, we have learned irreverence toward the sun and the seasons, for

in a world made up of seconds and minutes, the authority of nature is

superseded. Indeed, as Mumford points out, with the invention of the

clock, Eternity ceased to serve as the measure and focus of human

events. And thus, though few would have imagined the connection, the

inexorable ticking of the clock may have had more to do with the

weakening of God's supremacy than all the treatises produced by the

philosophers of the Enlightenment; that is to say, the clock introduced a

new form of conversation between man and God, in which God appears to

have been the loser. Perhaps Moses should have included another

Commandment: Thou shalt not make mechanical representations of time.

That the alphabet introduced a new form of conversation between man and

man is by now a commonplace among scholars. To be able to see one's

utterances rather than only to hear them is no small matter, though our

education, once again, has had little to say about this. Nonetheless,

it is clear that phonetic writing created a new conception of knowledge,

as well as a new sense of intelligence, of audience and of posterity,

all of which Plato recognized at an early stage in the development of

texts. "No man of intelligence," he wrote in his Seventh Letter, "will

venture to express his philosophical views in language, especially not

in language that is unchangeable, which is true of that which is set

down in written characters." This notwithstanding, he wrote voluminously

and understood better than anyone else that the setting down of views in

written characters would be the beginning of philosophy, not its end.

Philosophy cannot exist without criticism, and writing makes it possible

and convenient to subject thought to a continuous and concentrated

scrutiny. Writing freezes speech and in so doing gives birth to the

grammarian, the logician, the rhetorician, the historian, the

scientist--all those who must hold language before them so that they can

see what it means, where it errs, and where it is leading. Plato knew

all of this, which means that he knew that writing would bring about a

perceptual revolution: a shift from the ear to the eye as an organ of

language processing. Indeed, there is a legend that to encourage such a

shift Plato insisted that his students study geometry before entering

his Academy. If true, it was a sound idea, for as the great literary

critic Northrop Frye has remarked, "the written word is far more

powerful than simply a reminder: it re-creates the past in the present,

and gives

us, not the familiar remembered thing, but the glittering intensity of

the summoned-up hallucination." All that Plato surmised about the

consequences of writing is now well understood by anthropologists,

especially those who have studied cultures in which speech is the only

source of complex conversation. Anthropologists know that the written

word, as Northrop Frye meant to suggest, is not merely an echo of a

speaking voice. It is another kind of voice altogether, a conjurer's

trick of the first order. It must certainly have appeared that way to

those who invented it, and that is why we should not be surprised that

the Egyptian god Thoth, who is alleged to have brought writing to the

King Thamus, was also the god of magic. People like ourselves may see

nothing wondrous in writing, but our anthropologists know how strange

and magical it appears to a purely oral people--a conversation with no

one and yet with everyone. What could be stranger than the silence one

encounters when addressing a question to a text? What could be more

metaphysically puzzling than addressing an unseen audience, as every

writer of books must do? And correcting oneself because one knows that

an unknown reader will disapprove or misunderstand? I bring all of this

up because what my book is about is how our own tribe is undergoing a

vast and trembling shift from the magic of writing to the magic of

electronics. What I mean to point out here is that the introduction

into a culture of a technique such as writing or a clock is not merely

an extension of man's power to bind time but a transformation of his way

of thinking--and, of course, of the content of his culture. And that is

what I mean to say by calling a medium a metaphor. We are told in

school, quite correctly, that a metaphor suggests what a thing is like

by comparing it to something else. And by the power of its suggestion,

it so fixes a conception in our minds that we cannot imagine the one

thing without the other: Light is a wave; language, a tree; God, a wise

and venerable man; the mind, a dark cavern illuminated by knowledge. And

if these

metaphors no longer serve us, we must, in the nature of the matter, find

others that will. Light is a particle; language, a river; God (as

Bertrand Russell proclaimed), a differential equation; the mind, a

garden that yearns to be cultivated. But our media-metaphors are not so

explicit or so vivid as these, and they are far more complex. In

understanding their metaphorical function, we must take into account the

symbolic forms of their information, the source of their information,

the quantity and speed of their information, the context in which their

information is experienced. Thus, it takes some digging to get at them,

to grasp, for example, that a clock recreates time as an independent,

mathematically precise sequence; that writing recreates the mind as a

tablet on which experience is written; that the telegraph recreates news

as a commodity. And yet, such digging becomes easier if we start from

the assumption that in every tool we create, an idea is embedded that

goes beyond the function of the thing itself. It has been pointed out,

for example, that the invention of eyeglasses in the twelfth century not

only made it possible to improve defective vision but suggested the idea

that human beings need not accept as final either the endowments of

nature or the ravages of time. Eyeglasses refuted the belief that

anatomy is destiny by putting forward the idea that our bodies as well

as our minds are improvable. I do not think it goes too far to say that

there is a link between the invention of eyeglasses in the twelfth

century and gene-splitting research in the twentieth. Even such an

instrument as the microscope, hardly a tool of everyday use, had

embedded within it a quite astonishing idea, not about biology but about

psychology. By revealing a world hitherto hidden from view, the

microscope suggested a possibility about the structure of the mind. If

things are not what they seem, if microbes lurk, unseen, on and under

our skin, if the invisible controls the visible, then is it not possible

that ids and egos and superegos also lurk somewhere unseen? What else

is psychoanalysis but a microscope of

the mind? Where do our notions of mind come from if not from metaphors

generated by our tools? What does it mean to say that someone has an IQ

of 126? There are no numbers in people's heads. Intelligence does not

have quantity or magnitude, except as we believe that it does. And why

do we believe that it does? Because we have tools that imply that this

is what the mind is like. Indeed, our tools for thought suggest to us

what our bodies are like, as when someone refers to her "biological

clock," or when we talk of our "genetic codes," or when we read

someone's face like a book, or when our facial expressions telegraph our

intentions. When Galileo remarked that the language of nature is written

in mathematics, he meant it only as a metaphor. Nature itself does not

speak. Neither do our minds or our bodies or, more to the point of this

book, our bodies politic. Our conversations about nature and about

ourselves are conducted in whatever "languages" we find it possible and

convenient to employ. We do not see nature or intelligence or human

motivation or ideology as "it" is but only as our languages are. And

our languages are our media. Our media are our metaphors. Our

metaphors create the content of our culture.


"Claim: Books allow for deeper concentration, contemplation, and memorization

than any other format.

 

Warrant: The claim is evident through scientific research

 

Support:..... here:"

 

 

Seems like you could just use claim and support to get the point across.

 

I think this claim you made is not supported by what you supplied. (please don't take that as an invitation to post even more long passages.)

 

The passage you provided makes use of research by Gary Small, but that research does not support your claim. The findings by Small are that computer searches generate more activity in the dorsolateral prefrontal cortex. But I want you to pay attention to the way Small compares things: "The researchers found that when people search the Net they exhibit a

very different pattern of brain activity than they do when they read book-like text." Here there's a clear distinction being made between searching the Net and reading book-like text. Then Small says: "The good news here is that Web surfing, because it engages so many brain functions,

may help keep older people’s minds sharp." Again, so far, I don't see this as support for your claim: Books allow for deeper concentration, contemplation, and memorization

than any other format.

 

I'm not done but it's time to go to work. I know at the end of the passage Carr makes the claim: "Try reading a book

while doing a crossword puzzle; that’s the intellectual environment of the Internet" but I think he moves too fast here. There's nothing here to support that we are always doing these two things at once. Carr just asserts that.

 

Now if you would have said:

 

"Claim: Searching the Net and reading book-like text cause activity in different parts of the brain.

 

Warrant: The claim is evident through scientific research

 

Support:..... here:"

 

Then I would think your claim would get stronger support by the passage you quoted.


Excuse the choppiness of this post, I'm on an iPhone again.

 

To me anything under 10,000 words is a short passage. If you're going to read only the first four paragraphs and assume a response based on those four paragraphs can somehow be intelligible, you're going to have a hard time with future posts.

 

The prefrontal cortex is related to decision making, as in deciding if there's something to click on and whether you should click on it. You can completely lose your prefrontal cortex in a car accident and still retain memory, speech and motor skills. It's an unimportant part of reading, and remains largely inactive during book reading, comprehension, or assigning information to long-term memory. The activation of the prefrontal cortex is a sign that the brain is distracted by reading online, and has a harder time retaining anything it reads. If you continue to read past the fourth paragraph, he explains why what you just posted is wrong and goes on to show why reading online is inferior to books if your aim is to understand and retain information. If what you want is to exercise the prefrontal cortex, there are many other non-reading activities you can do. That's the point: most of what your brain is doing while reading online isn't reading.

 

Carr puts it like this: "Using the Net may, as Gary Small suggests, exercise the brain the

way solving crossword puzzles does. But such intensive exercise, when it becomes our

primary mode of thought, can impede deep learning and thinking. Try reading a book

while doing a crossword puzzle; that’s the intellectual environment of the Internet"

 

Furthermore, your inability to read anything of length is not supporting your case.

 

Also I need to point out that the passage is supported by eleven articles, not just one. I had already posted them under the passage. And finally, claims, warrants, and support are not subjective terms that you or I can redefine when we want to. They are identifiable components of any argument.


Soup, let's try and stay on this one topic of your claim in post #62 and your support for that claim.

 

Here is your claim: "Books allow for deeper concentration, contemplation, and memorization

than any other format."

 

Then you posted a passage from The Shallows.

 

Now staying within the context of what you said in post #62 can you please show me in clean, clear, and crisp examples of support for your claim: "Books allow for deeper concentration, contemplation, and memorization than any other format."

 

A couple of simple and hopefully non controversial examples of what I'm asking of you are as follows:

 

Example A

Claim #1, Soup is a mortal

 

Support #1, Soup is a man

Support #2, All men are mortal

 

Example B

Claim #2, Soup is 6 feet tall

 

Support #1, Soup is a man

Support #2, All men are mortal

 

 

Now claim #1 and claim #2 can both be true. However, only claim #1 follows from both support #1 and #2, whereas claim #2 receives no support.
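The gap between Example A and Example B can be made mechanical. Here is a minimal Python sketch of the same point (my own illustration, not from either poster; the names are just the premises restated as data):

```python
# Premises encoded as data.
men = {"Soup"}        # Support #1: Soup is a man
mortals = set(men)    # Support #2: all men are mortal

# Claim #1 ("Soup is a mortal") is derivable from the premises alone.
print("Soup" in mortals)  # True

# Claim #2 ("Soup is 6 feet tall") is not derivable: neither premise
# mentions height, so there is nothing here to conclude it from.
```

The point survives the translation: support only counts as support when the claim actually follows from it.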

 

 

Hopefully this question isn't asking too much and I think your answer will help me understand you more clearly. Again, please try and stay within the passage you quoted in post #62


Cool, now you're bringing up Socrates and syllogisms. I like that logic and understanding logical fallacies are becoming part of Crossfire instead of things going off the rails.

 

That said, my entire post was on point and supports my claim multiple times. You chose to read only the first four paragraphs, which is where the misunderstanding comes from—not some logical fallacy. I completely support you referencing your freshman English textbook, but if you want to refute something you have to read the whole thing. If you do choose to read the whole text and find a logical fallacy, feel free to point it out.


Soup, I did read the whole thing. And I said earlier that I didn't think your claim was supported by your quote.

 

Claim: The article I posted does NOT support your claim that "Books allow for deeper concentration, contemplation, and memorization than any other format."

 

Qualifier: The passage I provided makes use of research by Gary Small

 

Support:The findings by Small are that computer searches generate more activity in the dorsolateral prefrontal cortex.

Support: Gary Small says, "The good news here is that Web surfing, because it engages so many brain functions,

may help keep older people’s minds sharp."

 

Warrant: You don't see this as support for your claim: Books allow for deeper concentration, contemplation, and memorization

 

 

 

REFUTATION (which was already made in #66)

 

The prefrontal cortex is related to decision making, as in deciding if there's something to click on and whether you should click on it. You can completely lose your prefrontal cortex in a car accident and still retain memory, speech and motor skills. It's an unimportant part of reading, and remains largely inactive during book reading, comprehension, or assigning information to long-term memory. The activation of the prefrontal cortex is a sign that the brain is distracted by reading online, and has a harder time retaining anything it reads. If you continue to read past the fourth paragraph, he explains why what you just posted is wrong and goes on to show why reading online is inferior to books if your aim is to understand and retain information. If what you want is to exercise the prefrontal cortex, there are many other non-reading activities you can do. That's the point: most of what your brain is doing while reading online isn't reading.

 

Carr puts it like this: "Using the Net may, as Gary Small suggests, exercise the brain the

way solving crossword puzzles does. But such intensive exercise, when it becomes our

primary mode of thought, can impede deep learning and thinking. Try reading a book

while doing a crossword puzzle; that’s the intellectual environment of the Internet"

 

Furthermore, your inability to read anything of length is not supporting your case.

 

Also I need to point out that the passage is supported by eleven articles, not just one. I had already posted them under the passage. And finally, claims, warrants, and support are not subjective terms that you or I can redefine when we want to. They are identifiable components of any argument.

 

 

REBUTTAL: I'm not staying on topic of my claim in post #62 and my support for that claim.

 

Support: Recitation of how to form a logical argument using claims and supprot.

Support: ?

Support: ?

Qualifier: ?

Warrant: ?

 

Was this an intelligible rebuttal with a warranted claim and qualified support? No.

 

Now we've circled back around to you.

 

Claim: I don't think your claim was supported by your quote.

 

Support: None

Support: None

Support: none

I shouldn't have to explain to you how this works.

 

Is it too hard to ask you to show clear support for your claim?

 

I don't know, is it?


Soup, let's try and stay on this one topic of your claim in post #62 and your support for that claim.

 

Here is your claim: "Books allow for deeper concentration, contemplation, and memorization than any other format."

 

Then you posted a passage from The Shallows.

 

Now, staying within the context of what you said in post #62, can you please show me, in clean, clear, and crisp examples, the support for your claim: "Books allow for deeper concentration, contemplation, and memorization than any other format."

 

A couple of simple and hopefully non-controversial examples of what I'm asking of you are as follows:

 

Example A

Claim #1, Soup is a mortal

 

Support #1, Soup is a man

Support #2, All men are mortal

 

Example B

Claim #2, Soup is 6 feet tall

 

Support #1, Soup is a man

Support #2, All men are mortal

 

 

Now, claim #1 and claim #2 can both be true claims. However, only claim #1 follows from support #1 and #2, whereas claim #2 receives no support.

 

 

Hopefully this question isn't asking too much, and I think your answer will help me understand you more clearly. Again, please try and stay within the passage you quoted in post #62.

 

 

Soup, what I'm asking here of you is simple. Please show me in a clear way how the passage you say supports your claim actually supports your claim.


I've already qualified the support in fucking #66. Why are you stuck on this?

 

"using the Net may, as Gary Small suggests, exercise the brain the way solving crossword puzzles does. But such intensive exercise, when it becomes our primary mode of thought, can impede deep learning and thinking. Try reading a book while doing a crossword puzzle; that's the intellectual environment of the Internet"

 

This article supports my claim because the entire article is about the distractive nature of the internet, which inhibits concentration, contemplation, and memorization. I've said this NUMEROUS times already. You don't have to like my answer—we're not interested in likes or dislikes—but if you want to refute the statement you must provide your own support for your own claim.


Soup, I'm very sorry for my ignorance here. I'm sure you have a very strong case.

 

Can you please give your "evident thru scientific research support" (your words, post #62)? Maybe it would be easier for me if you could number each "support" in order and identify for me all the relevant connecting language. Also, could you please use exact quotes; it wouldn't be much support if you had to change Carr's quote to suit your claim. I want to be able to see each step you make in order.

 

I'm not refuting you, I'm trying to understand you.

 

 

I apologize for having to ask you this four times in a row. Seriously, you can take your time, I'm not going anywhere.


Can you please give your "evident thru scientific research support" (your words, post #62)?

 

7. Gary Small and Gigi Vorgan, iBrain: Surviving the Technological Alteration of the Modern Mind (New York: Collins, 2008), 1.

8. G. W. Small, T. D. Moody, P. Siddarth, and S. Y. Bookheimer, “Your Brain on Google: Patterns of Cerebral Activation during Internet Searching,” American Journal of Geriatric Psychiatry, 17, no. 2 (February 2009): 116–26. See also Rachel Champeau, “UCLA Study Finds That Searching the Internet Increases Brain Function,” UCLA Newsroom, October 14, 2008, http://newsroom.ucla.edu/portal/ucla/ucla-study-finds-that-searching-64348.aspx.

9. Small and Vorgan, iBrain, 16–17.

10. Maryanne Wolf, interview with the author, March 28, 2008.

11. Steven Johnson, Everything Bad Is Good for You: How Today’s Popular Culture Is Actually Making Us Smarter(New York: Riverhead Books, 2005), 19.

12. John Sweller, Instructional Design in Technical Areas(Camberwell, Australia: Australian Council for Educational Research, 1999), 4.

13. Ibid., 7.

14. Ibid.

15. Ibid., 11.

16. Ibid., 4–5. For a broad review of current thinking on the limits of working memory, see Nelson Cowan, Working Memory Capacity (New York: Psychology Press, 2005).

17. Klingberg, Overflowing Brain, 39 and 72–75.

18. Sweller, Instructional Design, 22.


I'm laughing at how long it's taking you to get this. You've lectured me twice already on the rules of logic, so I assume you understand the rules of debate. I actually answer the questions you ask when you ask them. If you don't like the answers, come up with better questions. That's the only advice I can give.

 

 

"Can you tell me how many authors have said we should stop using computers?"

Nope.

 

"Can you show support for your claim?"

Here's a whole passage that makes the same claim.

 

"That doesn't support your claim."

That wasn't a question.

 

"Can you show support for your claim?"

I already did.

 

"Can you qualify the support you've given?"

Yes I can and then I did.

 

"Can I give my "evident thru scientific research support"?"

First of all, "evident thru scientific research support" isn't even a noun phrase, so no, I can't give it.

Second, if what you meant was "How is this scientific research?", well, here's the list of scientific research that I already gave.

 

 

 

Bonus video, Lecture by Gary Small, your brain on Google:

http://www.pbs.org/wgbh/pages/frontline/digitalnation/living-faster/where-are-we-headed/your-brain-on-google.html


Soup, let's try and stay on this one topic of your claim in post #62 and your support for that claim.

 

Here is your claim: "Books allow for deeper concentration, contemplation, and memorization than any other format."

 

Then you posted a passage from The Shallows.

 

Now, staying within the context of what you said in post #62, can you please show me, in clean, clear, and crisp examples, the support for your claim: "Books allow for deeper concentration, contemplation, and memorization than any other format."

 

A couple of simple and hopefully non-controversial examples of what I'm asking of you are as follows:

 

Example A

Claim #1, Soup is a mortal

 

Support #1, Soup is a man

Support #2, All men are mortal

 

Example B

Claim #2, Soup is 6 feet tall

 

Support #1, Soup is a man

Support #2, All men are mortal

 

 

Now, claim #1 and claim #2 can both be true claims. However, only claim #1 follows from support #1 and #2, whereas claim #2 receives no support.

 

 

Hopefully this question isn't asking too much, and I think your answer will help me understand you more clearly. Again, please try and stay within the passage you quoted in post #62.

 

Sixth time's a charm?

 

Anyway, let's stay on one topic.

 

And I'm sorry, when I said,

 

"Can you please give your "evident thru scientific research support" (your words, post #62)?"

 

I put the closing quotation mark after the word "support" when it should have been placed after the word "research."

 

Let me fix it for you. Can you please give your "evident thru scientific research" (your words, post #62) support?

 

I don't want to hear someone else show the support for your claim. I want to see you show support for your claim using the passage in post #62

 

And it was kinda cute that on post #74 you listed footnotes as scientific support.

