
Soup forgot his password


Posts posted by Soup forgot his password

  1. Dopamine is directly related to what a lot of people see as a problem here.

     

    A couple of excerpts from "The Shallows" by Nicholas Carr:

     

    MICHAEL GREENBERG, IN a 2008 essay in the New York Review of Books,

    found the poetry in neuroplasticity. He observed that our neurological

    system, “with its branches and transmitters and ingeniously spanned gaps,

    has an improvised quality that seems to mirror the unpredictability of

    thought itself.” It’s “an ephemeral place that changes as our experience

    changes.”32 There are many reasons to be grateful that our mental

    hardware is able to adapt so readily to experience, that even old brains

    can be taught new tricks. The brain’s adaptability hasn’t just led to new

    treatments, and new hope, for those suffering from brain injury or illness.

    It provides all of us with a mental flexibility, an intellectual litheness, that

    allows us to adapt to new situations, learn new skills, and in general expand

    our horizons.

     

    But the news is not all good. Although neuroplasticity provides an escape from

    genetic determinism, a loophole for free thought and free will, it also imposes

    its own form of determinism on our behavior. As particular circuits in our brain

    strengthen through the repetition of a physical or mental activity, they begin

    to transform that activity into a habit. The paradox of neuroplasticity, observes

    Doidge, is that, for all the mental flexibility it grants us, it can end up locking

    us into “rigid behaviors.”33 The chemically triggered synapses that link our

    neurons program us, in effect, to want to keep exercising the circuits they’ve

    formed. Once we’ve wired new circuitry in our brain, Doidge writes, “we long

    to keep it activated.”34 That’s the way the brain fine-tunes its operations.

    Routine activities are carried out ever more quickly and efficiently, while unused

    circuits are pruned away.

     

    Plastic does not mean elastic, in other words. Our neural loops don’t snap back

    to their former state the way a rubber band does; they hold onto their changed

    state. And nothing says the new state has to be a desirable one. Bad habits can

    be ingrained in our neurons as easily as good ones. Pascual-Leone observes that

    “plastic changes may not necessarily represent a behavioral gain for a given

    subject.” In addition to being “the mechanism for development and learning,”

    plasticity can be “a cause of pathology.”35

    It comes as no surprise that neuroplasticity has been linked to mental afflictions

    ranging from depression to obsessive-compulsive disorder to tinnitus. The more a

    sufferer concentrates on his symptoms, the deeper those symptoms are etched

    into his neural circuits. In the worst cases, the mind essentially trains itself to

    be sick. Many addictions, too, are reinforced by the strengthening of plastic

    pathways in the brain. Even very small doses of addictive drugs can dramatically

    alter the flow of neurotransmitters in a person’s synapses, resulting in long-lasting

    alterations in brain circuitry and function. In some cases, the buildup of certain

    kinds of neurotransmitters, such as dopamine, a pleasure-producing cousin to

    adrenaline, seems to actually trigger the turning on or off of particular genes,

    bringing even stronger cravings for the drug. The vital paths turn deadly.

    The potential for unwelcome neuroplastic adaptations also exists in the everyday,

    normal functioning of our minds. Experiments show that just as the brain can build

    new or stronger circuits through physical or mental practice, those circuits can

    weaken or dissolve with neglect. “If we stop exercising our mental skills,” writes

    Doidge, “we do not just forget them: the brain map space for those skills is turned

    over to the skills we practice instead.”36 Jeffrey Schwartz, a professor of psychiatry

    at UCLA’s medical school, terms this process “survival of the busiest.”37 The mental

    skills we sacrifice may be as valuable, or even more valuable, than the ones we gain.

    When it comes to the quality of our thought, our neurons and synapses are entirely

    indifferent. The possibility of intellectual decay is inherent in the malleability of our

    brains.

    That doesn’t mean that we can’t, with concerted effort, once again redirect our neural

    signals and rebuild the skills we’ve lost. What it does mean is that the vital paths in

    our brains become, as Monsieur Dumont understood, the paths of least resistance.

    They are the paths that most of us will take most of the time, and the farther we

    proceed down them, the more difficult it becomes to turn back.

     

     

    WHAT DETERMINES WHAT we

    remember and what we forget? The key to memory consolidation is attentiveness.

    Storing explicit memories and, equally important, forming connections between

    them requires strong mental concentration, amplified by repetition or by intense

    intellectual or emotional engagement. The sharper the attention, the sharper the

    memory. “For a memory to persist,” writes Kandel, “the incoming information must

    be thoroughly and deeply processed. This is accomplished by attending to the

    information and associating it meaningfully and systematically with knowledge

    already well established in memory.”35 If we’re unable to attend to the information

    in our working memory, the information lasts only as long as the neurons that hold

    it maintain their electric charge—a few seconds at best. Then it’s gone, leaving

    little or no trace in the mind.

     

    Attention may seem ethereal—a “ghost inside the head,” as the developmental

    psychologist Bruce McCandliss says36—but it’s a genuine physical state, and it

    produces material effects throughout the brain. Recent experiments with mice

    indicate that the act of paying attention to an idea or an experience sets off a

    chain reaction that crisscrosses the brain. Conscious attention begins in the frontal

    lobes of the cerebral cortex, with the imposition of top-down, executive control

    over the mind’s focus. The establishment of attention leads the neurons of the

    cortex to send signals to neurons in the midbrain that produce the powerful

    neurotransmitter dopamine. The axons of these neurons reach all the way into the

    hippocampus, providing a distribution channel for the neurotransmitter. Once the

    dopamine is funneled into the synapses of the hippocampus, it jump-starts the

    consolidation of explicit memory, probably by activating genes that spur the

    synthesis of new proteins.37

    The influx of competing messages that we receive whenever we go online not

    only overloads our working memory; it makes it much harder for our frontal lobes

    to concentrate our attention on any one thing. The process of memory consolidation

    can’t even get started. And, thanks once again to the plasticity of our neuronal

    pathways, the more we use the Web, the more we train our brain to be distracted—

    to process information very quickly and very efficiently but without sustained

    attention. That helps explain why many of us find it hard to concentrate even

    when we’re away from our computers. Our brains become adept at forgetting,

    inept at remembering. Our growing dependence on the Web’s information stores

    may in fact be the product of a self-perpetuating, self-amplifying loop. As our

    use of the Web makes it harder for us to lock information into our biological

    memory, we’re forced to rely more and more on the Net’s capacious and easily

    searchable artificial memory, even if it makes us shallower thinkers.

     

    The changes in our brains happen automatically, outside the narrow compass of

    our consciousness, but that doesn’t absolve us from responsibility for the choices

    we make. One thing that sets us apart from other animals is the command we have

    been granted over our attention. “‘Learning how to think’ really means learning how

    to exercise some control over how and what you think,” said the novelist David Foster

    Wallace in a commencement address at Kenyon College in 2005. “It means being

    conscious and aware enough to choose what you pay attention to and to choose how

    you construct meaning from experience.” To give up that control is to be left with

    “the constant gnawing sense of having had and lost some infinite thing.”38 A

    mentally troubled man—he would hang himself two and a half years after the speech—

    Wallace knew with special urgency the stakes involved in how we choose, or fail to

    choose, to focus our mind. We cede control over our attention at our own peril.

    Everything that neuroscientists have discovered about the cellular and molecular

    workings of the human brain underscores that point.

     

    Socrates may have been mistaken about the effects of writing, but he was wise to warn

    us against taking memory’s treasures for granted. His prophecy of a tool that would

    “implant forgetfulness” in the mind, providing “a recipe not for memory, but for

    reminder,” has gained new currency with the coming of the Web. The prediction may

    turn out to have been merely premature, not wrong. Of all the sacrifices we make when

    we devote ourselves to the Internet as our universal medium, the greatest is likely to be

    the wealth of connections within our own minds. It’s true that the Web is itself a network

    of connections, but the hyperlinks that associate bits of online data are nothing like the

    synapses in our brain. The Web’s links are just addresses, simple software tags that direct

    a browser to load another discrete page of information. They have none of the organic

    richness or sensitivity of our synapses. The brain’s connections, writes Ari Schulman,

    “don’t merely provide access to a memory; they in many ways constitute memories.”39

    The Web’s connections are not our connections—and no matter how many hours we

    spend searching and surfing, they will never become our connections. When we outsource

    our memory to a machine, we also outsource a very important part of our intellect

    and even our identity. William James, in concluding his 1892 lecture on memory, said,

    “The connecting is the thinking.” To which could be added, “The connecting is the self.”

     

    As far as "integrating technology into our lives" When you look at every single piece of technology that has come along in America since the 18th century, it has always been a case of "integrating our lives into technology" In the 1800's, the printed word held a monopoly on public discourse. Then the telegraph did. Then the telephone. Then the radio. In the 1980's it was clearly the television. Today i would argue that the television still holds the throne of "public discourse monopolist" since the GOP primaries is still a tv show, and any media on the internet about our own presidential candidacy is owned and controlled by the same TV broadcasters that have controlled television from the beginning. That may sound like a conspiracy, NBC, CBS and their subsidiaries see themselves as a public utility service like the post office and electricity and I can find direct quotes from their CEO's of this if requested.

     

     

    Edit: Fora.tv is one of those sites that keep me on the internet. Check it out.

    "Has Malcolm Gladwell's Opinion on Social Media and the Arab Spring Changed?" -

  2. To that I say BULLSHIT. If you can't recognize your own limits, that was a problem from the jump... and computers are no different.

     

    It's not just knowing when to stop; you also have to be able to. There are biological and social limits to things. The problem with the computer, and technology in general, is that technology has superseded ALL ideology. There is nobody saying how much technology use is too much. Secondly, even if there were a reachable limit to computer/technology use, either social or physical, your argument assumes you have the willpower to change your behavior whenever you want.

     

    Drug addicts, like all human beings, need the assistance of a support group to help them lock in good or bad behaviors. If everyone thinks that computer usage is in no way detrimental, then there is no support group to help people stop using computers.

  3. Why would I leave when I can sit here and watch the suicide of American ideology? Freedom, individualism, rationalism... what do these terms mean on the internet? How absurd can we go with them?

     

    Look at you, taking advantage of this superficial sense of validation from the false sense of "community" this bulletin board has created for you. Humans are civic creatures, yet we can't tell the difference between real and fake citizenship. Or maybe we can tell the difference, but when it's so hard in real life not to be a coward, to introduce ourselves to our neighbors, to rejoin a REAL geographic community, it's just easier to play pretend and go for the lowest-hanging fruit.

     

     

    You feel empowered on a message board by design. You type a few words into a text box and the bulletin board system generates a post visually equivalent to everybody else's. To you this is an improvement over your other life, because here you feel a level of acceptance. Here you're not judged on physical appearance or intelligence. There's only one rule of anonymity: be as indifferent and uncaring as possible. Show any passion for anything and expect the wrath of Anonymous to wipe you from the internet. Make sure you're not the target by just going along with what everyone else is doing.

     

    Chimpanzees can be trained to perform the same trick. Lock a bunch of chimps in a cell, put a ladder in the cell, and on top of the ladder place a banana. If one chimp goes for the banana, turn a firehose on the entire cell. Soon the chimps will stop going for the banana. Take out one of the chimps and bring in an uninitiated new chimp. Watch him go for the banana. Watch every other chimp beat the piss out of him before he can get to it.

     

    One at a time, replace each original chimp with a new uninitiated one. Soon you have a bunch of chimps that have never even felt a firehose, but if one chimp goes for the banana they still beat him up.

     

    That's you, Cunt. You're as intelligent and self-aware as those chimpanzees. You're little Baby Bear from the Three Bears, lapping up his milk, trying to sit up at the table with the rest of the bears, just trying to fit in like a tiny, feeble, cowardly homunculus without a fucking clue why you even bother.

  4. Let me fix it for you. Can you please give your "evident thru scientific research" (your words, post #62) support?

    Already did. If you don't like an answer you receive, you don't get to demand I give a different one. Ask a better question.

     

    I don't want to hear someone else show the support for your claim. I want to see you show support for your claim using the passage in post #62

    You don't want me to support my argument with Harvard-published papers and a pretty fucking extensive list of research papers to go along with it? Well, again, too bad.

    Let's move on.

  5. I'm laughing at how long it's taking you to get this. You've lectured me twice already on the rules of logic, so I assume you understand the rules of debate. I actually answer the questions you ask when you ask them. If you don't like the answers, come up with better questions. That's the only advice I can give.

     

     

    "Can you tell me how many authors have said we should stop using computers?"

    Nope.

     

    "Can you show support for your claim?"

    Here's a whole passage that makes the same claim.

     

    "That doesn't support your claim."

    That wasn't a question.

     

    "Can you show support for your claim?"

    I already did.

     

    "Can you qualify the support you've given?"

    Yes, I can, and then I did.

     

    "Can I give my "evident thru scientific research support"?"

    First of all, "evident thru scientific research support" isn't even a noun phrase, so no, I can't give it.

    Second, if what you meant was "How is this scientific research?", well, here's the list of scientific research that I already gave.

     

     

     

    Bonus video, a lecture by Gary Small, "Your Brain on Google":

    http://www.pbs.org/wgbh/pages/frontline/digitalnation/living-faster/where-are-we-headed/your-brain-on-google.html

  6. Can you please give your "evident thru scientific research support" (your words, post #62)?

     

    7. Gary Small and Gigi Vorgan, iBrain: Surviving the Technological Alteration of the Modern Mind (New York: Collins, 2008), 1.

    8. G. W. Small, T. D. Moody, P. Siddarth, and S. Y. Bookheimer, “Your Brain on Google: Patterns of Cerebral Activation during Internet Searching,” American Journal of Geriatric Psychiatry, 17, no. 2 (February 2009): 116–26. See also Rachel Champeau, “UCLA Study Finds That Searching the Internet Increases Brain Function,” UCLA Newsroom, October 14, 2008, http://newsroom.ucla.edu/portal/ucla/ucla-study-finds-that-searching-64348.aspx.

    9. Small and Vorgan, iBrain, 16–17.

    10. Maryanne Wolf, interview with the author, March 28, 2008.

    11. Steven Johnson, Everything Bad Is Good for You: How Today’s Popular Culture Is Actually Making Us Smarter (New York: Riverhead Books, 2005), 19.

    12. John Sweller, Instructional Design in Technical Areas (Camberwell, Australia: Australian Council for Educational Research, 1999), 4.

    13. Ibid., 7.

    14. Ibid.

    15. Ibid., 11.

    16. Ibid., 4–5. For a broad review of current thinking on the limits of working memory, see Nelson Cowan, Working Memory Capacity (New York: Psychology Press, 2005).

    17. Klingberg, Overflowing Brain, 39 and 72–75.

    18. Sweller, Instructional Design, 22.

  7. I've already qualified the support in fucking #66. Why are you stuck on this?

     

    "using the Net may, as Gary Small suggests, exercise the brain the

    way solving crossword puzzles does. But such intensive exercise, when it becomes our

    primary mode of thought, can impede deep learning and thinking. Try reading a book

    while doing a crossword puzzle; that’s the intellectual environment of the Internet"

     

    This article supports my claim because the entire article is about the distracting nature of the internet, which inhibits concentration, contemplation, and memorization. I've said this NUMEROUS times already. You don't have to like my answer—we're not interested in likes or dislikes—but if you want to refute the statement you must provide your own support for your own claim.

  8. Soup, I did read the whole thing. And I said earlier that I didn't think your claim was supported by your quote.

     

    Claim: The article I posted does NOT support your claim that "Books allow for deeper concentration, contemplation, and memorization than any other format."

     

    Qualifier: The passage I provided makes use of research by Gary Small

     

    Support: The findings by Small are that computer searches generate more activity in the dorsolateral prefrontal cortex.

    Support: Gary Small says, "The good news here is that Web surfing, because it engages so many brain functions, may help keep older people’s minds sharp."

     

    Warrant: You don't see this as support for the claim that books allow for deeper concentration, contemplation, and memorization.

     

     

     

    REFUTATION (which was already made in #66)

     

    The prefrontal cortex is related to decision making, as in deciding if there's something to click on and whether you should click on it. You can completely lose your prefrontal cortex in a car accident and still retain memory, speech, and motor skills. It's an unimportant part of reading, and it remains largely inactive during book reading, comprehension, or assigning information to long-term memory. The activation of the prefrontal cortex is a sign that the brain is distracted by reading online and has a harder time retaining anything it reads. If you continue to read past the fourth paragraph, he explains why what you just posted is wrong and goes on to show why reading online is inferior to books if your aim is to understand and retain information. If what you want is to exercise the prefrontal cortex, there are many other non-reading activities you can do. That's the point: most of what your brain is doing while reading online isn't reading.

     

    Carr puts it like this. "using the Net may, as Gary Small suggests, exercise the brain the

    way solving crossword puzzles does. But such intensive exercise, when it becomes our

    primary mode of thought, can impede deep learning and thinking. Try reading a book

    while doing a crossword puzzle; that’s the intellectual environment of the Internet"

     

    Furthermore, your inability to read anything of length is not supporting your case.

     

    Also, I need to point out that the passage is supported by eleven articles, not just one. I had already posted them under the passage. And finally, claims, warrants, and support are not subjective terms that you or I can redefine when we want to. They are identifiable components of any argument.

     

     

    REBUTTAL: I'm not staying on the topic of my claim in post #62 and my support for that claim.

     

    Support: Recitation of how to form a logical argument using claims and support.

    Support: ?

    Support: ?

    Qualifier: ?

    Warrant: ?

     

    Was this an intelligible rebuttal with a warranted claim and qualified support? No.

     

    Now we've circled back around to you.

     

    Claim: I don't think your claim was supported by your quote.

     

    Support: None

    Support: None

    Support: none

    I shouldn't have to explain to you how this works.

     

    Is it too hard to ask you to show clear support for your claim?

     

    I don't know, is it?

  9. Cool, now you're bringing up Socrates and syllogisms. I like that logic and understanding logical fallacies are becoming part of crossfire instead of things going off the rails.

     

    That said, my entire post was on point and supports my claim multiple times. You chose to read only the first four paragraphs, which is where the misunderstanding comes from—not some logical fallacy. I completely support you referencing your freshman English textbook, but if you want to refute something you have to read the whole thing. If you do choose to read the whole text and find a logical fallacy, feel free to point it out.

  10. Excuse the choppiness of this post; I'm on an iPhone again.

     

    To me, anything under 10,000 words is a short passage. If you're going to read only the first four paragraphs and assume a response based on those four paragraphs can somehow be intelligible, you're going to have a hard time with future posts.

     

    The prefrontal cortex is related to decision making, as in deciding if there's something to click on and whether you should click on it. You can completely lose your prefrontal cortex in a car accident and still retain memory, speech, and motor skills. It's an unimportant part of reading, and it remains largely inactive during book reading, comprehension, or assigning information to long-term memory. The activation of the prefrontal cortex is a sign that the brain is distracted by reading online and has a harder time retaining anything it reads. If you continue to read past the fourth paragraph, he explains why what you just posted is wrong and goes on to show why reading online is inferior to books if your aim is to understand and retain information. If what you want is to exercise the prefrontal cortex, there are many other non-reading activities you can do. That's the point: most of what your brain is doing while reading online isn't reading.

     

    Carr puts it like this. "using the Net may, as Gary Small suggests, exercise the brain the

    way solving crossword puzzles does. But such intensive exercise, when it becomes our

    primary mode of thought, can impede deep learning and thinking. Try reading a book

    while doing a crossword puzzle; that’s the intellectual environment of the Internet"

     

    Furthermore, your inability to read anything of length is not supporting your case.

     

    Also, I need to point out that the passage is supported by eleven articles, not just one. I had already posted them under the passage. And finally, claims, warrants, and support are not subjective terms that you or I can redefine when we want to. They are identifiable components of any argument.

  11. You give a list of things (eternity, religion, and god—also

    capitalism, individualism, liberty, fairness) and call them metaphors, but are they all

    metaphors?

     

    I’m sorry I mixed terms. Metaphor wasn’t what I meant. These are all concepts,

    ideas, things that only exist in the abstraction of language. Our understanding of

    them and our relationship to them is only as deep as the media-metaphor through

    which they are conveyed. An abstract concept like "politics" is understood differently

    when it is conveyed visually through television, typographically through books, or

    through the jumbled mess that is the computer.

     

     

    Now this is an idea that comes from Neil Postman, so to avoid future responses of

    “well I just don’t agree with you” I’m just going to post the fucking text.

     

    Neil Postman

    "The Medium Is the Metaphor" (about halfway through):

     

    In studying the Bible

    as a young man, I found intimations of the idea that forms of media

    favor particular kinds of content and therefore are capable of taking

    command of a culture. I refer specifically to the Decalogue, the Second

    Commandment of which prohibits the Israelites from making concrete

    images of anything. "Thou shalt not make unto thee any graven image, any

    likeness of any thing that is in heaven above, or that is in the earth

    beneath, or that is in the water beneath the earth." I wondered then, as

    so many others have, as to why the God of these people would have

    included instructions on how they were to symbolize, or not symbolize,

    their experience. It is a strange injunction to include as part of an

    ethical system unless its author assumed a connection between forms of

    human communication and the quality of a culture. We may hazard a guess

    that a people who are being asked to embrace an abstract, universal

    deity would be rendered unfit to do so by the habit of drawing pictures

    or making statues or depicting their ideas in any concrete,

    iconographic forms. The God of the Jews was to exist in the Word and

    through the Word, an unprecedented conception requiring the highest

    order of abstract thinking. Iconography thus became blasphemy so that a

    new kind of God could enter a culture. People like ourselves who are in

    the process of converting their culture from word-centered to

    image-centered might profit by reflecting on this Mosaic injunction. But

    even if I am wrong in these conjectures, it is, I believe, a wise and

    particularly relevant supposition that the media of communication

    available to a culture are a dominant influence on the formation of the

    culture's intellectual and social preoccupations. Speech, of course, is

    the primal and indispensable medium. It made us human, keeps us human,

    and in fact defines what human means. This is not to say that if there

    were no other means of communication all humans would find it equally

    convenient to speak about the same things in the same way. We know

    enough about language to understand that variations in the

    structures of languages will result in variations in what may be called

    "world view." How people think about time and space, and about things

    and processes, will be greatly influenced by the grammatical features of

    their language. We dare not suppose therefore that all human minds are

    unanimous in understanding how the world is put together. But how much

    more divergence there is in world view among different cultures can be

    imagined when we consider the great number and variety of tools for

    conversation that go beyond speech. For although culture is a creation

    of speech, it is recreated anew by every medium of communication--from

    painting to hieroglyphs to the alphabet to television. Each medium,

    like language itself, makes possible a unique mode of discourse by

    providing a new orientation for thought, for expression, for

    sensibility. Which, of course, is what McLuhan meant in saying the

    medium is the message. His aphorism, however, is in need of amendment

    because, as it stands, it may lead one to confuse a message with a

    metaphor. A message denotes a specific, concrete statement about the

    world. But the forms of our media, including the symbols through which

    they permit conversation, do not make such statements. They are rather

    like metaphors, working by unobtrusive but powerful implication to

    enforce their special definitions of reality. Whether we are

    experiencing the world through the lens of speech or the printed word or

    the television camera, our media-metaphors classify the world for us,

    sequence it, frame it, enlarge it, reduce it, color it, argue a case for

    what the world is like. As Ernst Cassirer remarked:

    Physical reality seems to recede in proportion as man's symbolic

    activity advances. Instead of dealing with the things themselves man is

    in a sense constantly conversing with himself. He has so enveloped

    himself in linguistic forms, in artistic images, in mythical symbols or

    religious rites that he cannot see or know anything except by the

    interposition of [an] artificial medium.

    What is peculiar about such interpositions of media is that their role

    in directing what we will see or know is so rarely noticed. A person

    who reads a book or who watches television or who glances at his watch

    is not usually interested in how his mind is organized and controlled by

    these events, still less in what idea of the world is suggested by a

    book, television, or a watch. But there are men and women who have

    noticed these things, especially in our own times. Lewis Mumford, for

    example, has been one of our great noticers. He is not the sort of a

    man who looks at a clock merely to see what time it is. Not that he

    lacks interest in the content of clocks, which is of concern to everyone

    from moment to moment, but he is far more interested in how a clock

    creates the idea of "moment to moment." He attends to the philosophy of

    clocks, to clocks as metaphor, about which our education has had little

    to say and clock makers nothing at all. "The clock," Mumford has

    concluded, "is a piece of power machinery whose 'product' is seconds and

    minutes." In manufacturing such a product, the clock has the effect of

    disassociating time from human events and thus nourishes the belief in

    an independent world of mathematically measurable sequences. Moment to

    moment, it turns out, is not God's conception, or nature's. It is man

    conversing with himself about and through a piece of machinery he

    created. In Mumford's great book Technics and Civilization, he shows

    how, beginning in the fourteenth century, the clock made us into

    time-keepers, and then time-savers, and now time-servers. In the

    process, we have learned irreverence toward the sun and the seasons, for

    in a world made up of seconds and minutes, the authority of nature is

    superseded. Indeed, as Mumford points out, with the invention of the

    clock, Eternity ceased to serve as the measure and focus of human

    events. And thus, though few would have imagined the connection, the

    inexorable ticking of the clock may have had more to do with the

    weakening of God's supremacy than all the treatises produced by the

    philosophers of the Enlightenment; that is to say, the clock introduced a

    new form of conversation between man and God, in which God appears to

    have been the loser. Perhaps Moses should have included another

    Commandment: Thou shalt not make mechanical representations of time.

    That the alphabet introduced a new form of conversation between man and

    man is by now a commonplace among scholars. To be able to see one's

    utterances rather than only to hear them is no small matter, though our

    education, once again, has had little to say about this. Nonetheless,

    it is clear that phonetic writing created a new conception of knowledge,

    as well as a new sense of intelligence, of audience and of posterity,

    all of which Plato recognized at an early stage in the development of

    texts. "No man of intelligence," he wrote in his Seventh Letter, "will

    venture to express his philosophical views in language, especially not

    in language that is unchangeable, which is true of that which is set

    down in written characters." This notwithstanding, he wrote voluminously

    and understood better than anyone else that the setting down of views in

    written characters would be the beginning of philosophy, not its end.

    Philosophy cannot exist without criticism, and writing makes it possible

    and convenient to subject thought to a continuous and concentrated

    scrutiny. Writing freezes speech and in so doing gives birth to the

    grammarian, the logician, the rhetorician, the historian, the

    scientist--all those who must hold language before them so that they can

    see what it means, where it errs, and where it is leading. Plato knew

    all of this, which means that he knew that writing would bring about a

    perceptual revolution: a shift from the ear to the eye as an organ of

    language processing. Indeed, there is a legend that to encourage such a

    shift Plato insisted that his students study geometry before entering

    his Academy. If true, it was a sound idea, for as the great literary

    critic Northrop Frye has remarked, "the written word is far more

    powerful than simply a reminder: it re-creates the past in the present,

    and gives

    us, not the familiar remembered thing, but the glittering intensity of

    the summoned-up hallucination."3 All that Plato surmised about the

    consequences of writing is now well understood by anthropologists,

    especially those who have studied cultures in which speech is the only

    source of complex conversation. Anthropologists know that the written

    word, as Northrop Frye meant to suggest, is not merely an echo of a

    speaking voice. It is another kind of voice altogether, a conjurer's

    trick of the first order. It must certainly have appeared that way to

    those who invented it, and that is why we should not be surprised that

    the Egyptian god Thoth, who is alleged to have brought writing to the

    King Thamus, was also the god of magic. People like ourselves may see

    nothing wondrous in writing, but our anthropologists know how strange

    and magical it appears to a purely oral people--a conversation with no

    one and yet with everyone. What could be stranger than the silence one

    encounters when addressing a question to a text? What could be more

    metaphysically puzzling than addressing an unseen audience, as every

    writer of books must do? And correcting oneself because one knows that

    an unknown reader will disapprove or misunderstand? I bring all of this

    up because what my book is about is how our own tribe is undergoing a

    vast and trembling shift from the magic of writing to the magic of

    electronics. What I mean to point out here is that the introduction

    into a culture of a technique such as writing or a clock is not merely

    an extension of man's power to bind time but a transformation of his way

    of thinking--and, of course, of the content of his culture. And that is

    what I mean to say by calling a medium a metaphor. We are told in

    school, quite correctly, that a metaphor suggests what a thing is like

    by comparing it to something else. And by the power of its suggestion,

    it so fixes a conception in our minds that we cannot imagine the one

    thing without the other: Light is a wave; language, a tree; God, a wise

    and venerable man; the mind, a dark cavern illuminated by knowledge. And

    if these

    metaphors no longer serve us, we must, in the nature of the matter, find

    others that will. Light is a particle; language, a river; God (as

    Bertrand Russell proclaimed), a differential equation; the mind, a

    garden that yearns to be cultivated. But our media-metaphors are not so

    explicit or so vivid as these, and they are far more complex. In

    understanding their metaphorical function, we must take into account the

    symbolic forms of their information, the source of their information,

    the quantity and speed of their information, the context in which their

    information is experienced. Thus, it takes some digging to get at them,

    to grasp, for example, that a clock recreates time as an independent,

    mathematically precise sequence; that writing recreates the mind as a

    tablet on which experience is written; that the telegraph recreates news

    as a commodity. And yet, such digging becomes easier if we start from

    the assumption that in every tool we create, an idea is embedded that

    goes beyond the function of the thing itself. It has been pointed out,

    for example, that the invention of eyeglasses in the twelfth century not

    only made it possible to improve defective vision but suggested the idea

    that human beings need not accept as final either the endowments of

    nature or the ravages of time. Eyeglasses refuted the belief that

    anatomy is destiny by putting forward the idea that our bodies as well

    as our minds are improvable. I do not think it goes too far to say that

    there is a link between the invention of eyeglasses in the twelfth

    century and gene-splitting research in the twentieth. Even such an

    instrument as the microscope, hardly a tool of everyday use, had

    embedded within it a quite astonishing idea, not about biology but about

    psychology. By revealing a world hitherto hidden from view, the

    microscope suggested a possibility about the structure of the mind. If

    things are not what they seem, if microbes lurk, unseen, on and under

    our skin, if the invisible controls the visible, then is it not possible

    that ids and egos and superegos also lurk somewhere unseen? What else

    is psychoanalysis but a microscope of

    the mind? Where do our notions of mind come from if not from metaphors

    generated by our tools? What does it mean to say that someone has an IQ

    of 126? There are no numbers in people's heads. Intelligence does not

    have quantity or magnitude, except as we believe that it does. And why

    do we believe that it does? Because we have tools that imply that this

    is what the mind is like. Indeed, our tools for thought suggest to us

    what our bodies are like, as when someone refers to her "biological

    clock," or when we talk of our "genetic codes," or when we read

    someone's face like a book, or when our facial expressions telegraph our

    intentions. When Galileo remarked that the language of nature is written

    in mathematics, he meant it only as a metaphor. Nature itself does not

    speak. Neither do our minds or our bodies or, more to the point of this

    book, our bodies politic. Our conversations about nature and about

    ourselves are conducted in whatever "languages" we find it possible and

    convenient to employ. We do not see nature or intelligence or human

    motivation or ideology as "it" is but only as our languages are. And

    our languages are our media. Our media are our metaphors. Our

    metaphors create the content of our culture.

  12. I disagree that "the single purpose of the hammer, which

    defines the hammer...is that a hammer can put nails into a wall". I think what

    actually defines a hammer is common purpose and not single purpose. Thinking

    about it this way is more inclusive to how we actually use a hammer and still allows

    for the distinction between a hammer, my skull and a rock.

     

    That’s fine. We can agree that the lexical definition of a hammer is “a tool consisting

    of a solid head, usually of metal, set crosswise on a handle, used for beating metals,

    driving nails, etc.” That’s all the consensus we need. If you permit that’s a dictionary

    definition for a hammer then I can move onto my next point.

     

    I guess we'll have to agree to disagree on this one. Seems

    to me like you are overstating things here.

     

    I dunno if I need to point this out or not, but this is you just saying, “Well I disagree

    without providing any logical or intelligible reason except wanting to disagree.”

     

    I dunno what else to say except, "Alright then," and move on.

     

    If you want to know the supreme role technological advancement has always had in

    society, you can start with the technology that started civilization: agriculture. Just

    move forward through history from there.

     

    You give a list of things (eternity, religion, and god—also

    capitalism, individualism, liberty, fairness) and call them metaphors, but are they all

    metaphors?

     

    Of all these responses I'm giving, THIS one is the important one. I'm going to dedicate

    my next post to this.

     

     

     

    I am interested in epistemology but I didn't mean to give

    you the impression that it was only in the context of religion.

     

    I never said you gave me the impression you were only into religious epistemology. I

    said you were into epistemology AT LEAST in the context of religion.

     

    One statement I noticed from you was "Researching on

    the internet may seem faster than researching in a library but the information you

    end up with when you use the internet is so inferior to book learning that its truly a

    waste of fucking time." and this is so blatantly incorrect that I wanted to comment in

    this thread.

     

    OK, now let's see how you proved my claim to be "blatantly incorrect."

     

    I let you know there is a ton of good information I find on

    the internet thru google. You asked me what kind and I gave you one example, but

    there are lots more we don't even need to bother getting into (meaning I've

    successfully searched for things many times over).

     

    Just because you think Google search results are good doesn’t mean the library isn’t

    better—I mean vastly better. I'm not talking about personal experience, although

    I agree with the findings. I'm talking about historical facts, scientific research into

    brain function, human behaviors and the way people process information.

     

    I think I would be sure to never slip up here and to

    always keep distinction between how we actually "know" things internally vs. the

    way we come to find new things externally. That's sloppy but it's open to critique

    and revision.

     

    I do not understand what this means or how you went from what I said to this,

    which probably means you don’t understand what I meant by “technology as

    epistemology.” If I need to explain this let me know.

     

    Really? The post before the post you quoted.

     

     

    OK, so I got the context, but your response still said I'm either a troll or I don't know

    what I'm talking about. Whichever way you look at it, you were presumptuous and,

    like you said, that's bad. I don't really care to continue this, so moving on?

     

    I know you aren't saying using computers directly causes

    cancer. So it's not just like a warning on cigarette label. In other words, you don't

    have to tell us what it's like you can just tell us what it is.

     

    It’s exactly like a warning label on a cigarette box. “We’re not saying don’t smoke,

    but here’s some irrefutable facts.” Still don’t like it? Lets just go back to this then,

     

    Can you answer my question? How many people you

    listed above come to the conclusion we should stop using computers?

     

    Can I tell you how many of the authors above said stop using computers? Nope.

  13. I'm sort of used to using warrant in a particular way but

    recognize that people easily use it interchangeably with other words like

    justification. You're going to have to fill me in on how you use both those words for

    me to understand the point you're making here.

     

    Claim: What I want readers to believe.

     

    Support: What I will use to support the claim

     

    Warrant: A general principle that explains why I think my evidence is accurate and

    relevant to the claim.

     

     

     

    Claim: Books allow for deeper concentration, contemplation, and memorization

    than any other format.

     

    Warrant: The claim is evident through scientific research

     

    Support: Not given at the time because this is the internet and I don't feel like qualifying things unless someone asks me to, but here:

     

     

     

    Taken from "The Shallows" by Nicholas Carr

    Keep in mind that the entire book is devoted to this topic and this is only one section.

     

     

    GARY SMALL, A professor of psychiatry at UCLA and the director of its Memory and

    Aging Center, has been studying the physiological and neurological effects of the use of

    digital media, and what he’s discovered backs up Merzenich’s belief that the Net causes

    extensive brain changes. “The current explosion of digital technology not only is

    changing the way we live and communicate but is rapidly and profoundly altering our

    brains,” he says. The daily use of computers, smartphones, search engines, and other such

    tools “stimulates brain cell alteration and neurotransmitter release, gradually

    strengthening new neural pathways in our brains while weakening old ones.”7

     

    In 2008, Small and two of his colleagues carried out the first experiment that actually

    showed people’s brains changing in response to Internet use.8 The researchers recruited

    twenty-four volunteers—a dozen experienced Web surfers and a dozen novices—and

    scanned their brains as they performed searches on Google. (Since a computer won’t fit

    inside a magnetic resonance imager, the subjects were equipped with goggles onto which

    were projected images of Web pages, along with a small handheld touchpad to navigate

    the pages.) The scans revealed that the brain activity of the experienced Googlers was

    much broader than that of the novices. In particular, “the computer-savvy subjects used a

    specific network in the left front part of the brain, known as the dorsolateral prefrontal

    cortex, [while] the Internet-naïve subjects showed minimal, if any, activity in this area.”

    As a control for the test, the researchers also had the subjects read straight text in a

    simulation of book reading; in this case, scans revealed no significant difference in brain

    activity between the two groups. Clearly, the experienced Net users’ distinctive neural

    pathways had developed through their Internet use.

     

    The most remarkable part of the experiment came when the tests were repeated six days

    later. In the interim, the researchers had the novices spend an hour a day online, searching

    the Net. The new scans revealed that the area in their prefrontal cortex that had been

    largely dormant now showed extensive activity—just like the activity in the brains of the

    veteran surfers. “After just five days of practice, the exact same neural circuitry in the

    front part of the brain became active in the Internet-naïve subjects,” reports Small. “Five

    hours on the Internet, and the naïve subjects had already rewired their brains.” He goes

    on to ask, “If our brains are so sensitive to just an hour a day of computer exposure, what

    happens when we spend more time [online]?”9

     

    One other finding of the study sheds light on the differences between reading Web pages

    and reading books. The researchers found that when people search the Net they exhibit a

    very different pattern of brain activity than they do when they read book-like text. Book

    readers have a lot of activity in regions associated with language, memory, and visual

    processing, but they don’t display much activity in the prefrontal regions associated with

    decision making and problem solving. Experienced Net users, by contrast, display

    extensive activity across all those brain regions when they scan and search Web pages.

    The good news here is that Web surfing, because it engages so many brain functions,

    may help keep older people’s minds sharp. Searching and browsing seem to “exercise”

    the brain in a way similar to solving crossword puzzles, says Small.

     

    But the extensive activity in the brains of surfers also points to why deep reading and

    other acts of sustained concentration become so difficult online. The need to evaluate

    links and make related navigational choices, while also processing a multiplicity of

    fleeting sensory stimuli, requires constant mental coordination and decision making,

    distracting the brain from the work of interpreting text or other information. Whenever

    we, as readers, come upon a link, we have to pause, for at least a split second, to allow

    our prefrontal cortex to evaluate whether or not we should click on it. The redirection of

    our mental resources, from reading words to making judgments, may be imperceptible to

    us—our brains are quick—but it’s been shown to impede comprehension and retention,

    particularly when it’s repeated frequently. As the executive functions of the prefrontal

    cortex kick in, our brains become not only exercised but overtaxed. In a very real way,

    the Web returns us to the time of scriptura continua, when reading was a cognitively

    strenuous act. In reading online, Maryanne Wolf says, we sacrifice the facility that makes

    deep reading possible. We revert to being “mere decoders of information.”10 Our ability to

    make the rich mental connections that form when we read deeply and without distraction

    remains largely disengaged.

     

    Steven Johnson, in his 2005 book Everything Bad Is Good for You, contrasted the

    widespread, teeming neural activity seen in the brains of computer users with the much

    more muted activity evident in the brains of book readers. The comparison led him to

    suggest that computer use provides more intense mental stimulation than does book

    reading. The neural evidence could even, he wrote, lead a person to conclude that

    “reading books chronically understimulates the senses.”11 But while Johnson’s diagnosis

    is correct, his interpretation of the differing patterns of brain activity is misleading. It is

    the very fact that book reading “understimulates the senses” that makes the activity so

    intellectually rewarding. By allowing us to filter out distractions, to quiet the problem-

    solving functions of the frontal lobes, deep reading becomes a form of deep thinking. The

    mind of the experienced book reader is a calm mind, not a buzzing one. When it comes to

    the firing of our neurons, it’s a mistake to assume that more is better.

     

    John Sweller, an Australian educational psychologist, has spent three decades studying

    how our minds process information and, in particular, how we learn. His work

    illuminates how the Net and other media influence the style and the depth of our thinking.

    Our brains, he explains, incorporate two very different kinds of memory: short-term and

    long-term. We hold our immediate impressions, sensations, and thoughts as short-term

    memories, which tend to last only a matter of seconds. All the things we’ve learned about

    the world, whether consciously or unconsciously, are stored as long-term memories,

which can remain in our brains for a few days, a few years, or even a lifetime. One particular type of short-term memory, called working memory, plays an instrumental role in the transfer of information into long-term memory and hence in the creation of our personal store of knowledge. Working memory forms, in a very real sense, the contents of our consciousness at any given moment. “We are conscious of what is in working memory and not conscious of anything else,” says Sweller.12

If working memory is the mind’s scratch pad, then long-term memory is its filing system. The contents of our long-term memory lie mainly outside of our consciousness. In order for us to think about something we’ve previously learned or experienced, our brain has to transfer the memory from long-term memory back into working memory. “We are only aware that something was stored in long-term memory when it is brought down into working memory,” explains Sweller.13 It was once assumed that long-term memory served merely as a big warehouse of facts, impressions, and events, that it “played little part in complex cognitive processes such as thinking and problem-solving.”14 But brain scientists have come to realize that long-term memory is actually the seat of understanding. It stores not just facts but complex concepts, or “schemas.” By organizing scattered bits of information into patterns of knowledge, schemas give depth and richness to our thinking. “Our intellectual prowess is derived largely from the schemas we have acquired over long periods of time,” says Sweller. “We are able to understand concepts in our areas of expertise because we have schemas associated with those concepts.”15

The depth of our intelligence hinges on our ability to transfer information from working memory to long-term memory and weave it into conceptual schemas. But the passage from working memory to long-term memory also forms the major bottleneck in our brain. Unlike long-term memory, which has a vast capacity, working memory is able to hold only a very small amount of information. In a renowned 1956 paper, “The Magical Number Seven, Plus or Minus Two,” Princeton psychologist George Miller observed that working memory could typically hold just seven pieces, or “elements,” of information. Even that is now considered an overstatement. According to Sweller, current evidence suggests that “we can process no more than about two to four elements at any given time with the actual number probably being at the lower [rather] than the higher end of this scale.” Those elements that we are able to hold in working memory will, moreover, quickly vanish “unless we are able to refresh them by rehearsal.”16

Imagine filling a bathtub with a thimble; that’s the challenge involved in transferring information from working memory into long-term memory. By regulating the velocity and intensity of information flow, media exert a strong influence on this process. When we read a book, the information faucet provides a steady drip, which we can control by the pace of our reading. Through our single-minded concentration on the text, we can transfer all or most of the information, thimbleful by thimbleful, into long-term memory and forge the rich associations essential to the creation of schemas. With the Net, we face many information faucets, all going full blast. Our little thimble overflows as we rush from one faucet to the next. We’re able to transfer only a small portion of the information to long-term memory, and what we do transfer is a jumble of drops from different faucets, not a continuous, coherent stream from one source.

The information flowing into our working memory at any given moment is called our “cognitive load.” When the load exceeds our mind’s ability to store and process the information—when the water overflows the thimble—we’re unable to retain the information or to draw connections with the information already stored in our long-term memory. We can’t translate the new information into schemas. Our ability to learn suffers, and our understanding remains shallow. Because our ability to maintain our attention also depends on our working memory—“we have to remember what it is we are to concentrate on,” as Torkel Klingberg says—a high cognitive load amplifies the distractedness we experience. When our brain is overtaxed, we find “distractions more distracting.”17 (Some studies link attention deficit disorder, or ADD, to the overloading of working memory.) Experiments indicate that as we reach the limits of our working memory, it becomes harder to distinguish relevant information from irrelevant information, signal from noise. We become mindless consumers of data.

Difficulties in developing an understanding of a subject or a concept appear to be “heavily determined by working memory load,” writes Sweller, and the more complex the material we’re trying to learn, the greater the penalty exacted by an overloaded mind.18 There are many possible sources of cognitive overload, but two of the most important, according to Sweller, are “extraneous problem-solving” and “divided attention.” Those also happen to be two of the central features of the Net as an informational medium. Using the Net may, as Gary Small suggests, exercise the brain the way solving crossword puzzles does. But such intensive exercise, when it becomes our primary mode of thought, can impede deep learning and thinking. Try reading a book while doing a crossword puzzle; that’s the intellectual environment of the Internet.

     

     

     

     

     

    7. Gary Small and Gigi Vorgan, iBrain: Surviving the Technological Alteration of the Modern Mind (New York: Collins, 2008), 1.

    8. G. W. Small, T. D. Moody, P. Siddarth, and S. Y. Bookheimer, “Your Brain on Google: Patterns of Cerebral Activation during Internet Searching,” American Journal of Geriatric Psychiatry, 17, no. 2 (February 2009): 116–26. See also Rachel Champeau, “UCLA Study Finds That Searching the Internet Increases Brain Function,” UCLA Newsroom, October 14, 2008, http://newsroom.ucla.edu/portal/ucla/ucla-study-finds-that-searching-64348.aspx.

    9. Small and Vorgan, iBrain, 16–17.

    10. Maryanne Wolf, interview with the author, March 28, 2008.

11. Steven Johnson, Everything Bad Is Good for You: How Today’s Popular Culture Is Actually Making Us Smarter (New York: Riverhead Books, 2005), 19.

12. John Sweller, Instructional Design in Technical Areas (Camberwell, Australia: Australian Council for Educational Research, 1999), 4.

    13. Ibid., 7.

    14. Ibid.

    15. Ibid., 11.

    16. Ibid., 4–5. For a broad review of current thinking on the limits of working memory, see Nelson Cowan, Working Memory Capacity (New York: Psychology Press, 2005).

    17. Klingberg, Overflowing Brain, 39 and 72–75.

    18. Sweller, Instructional Design, 22.
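
Before I move on, a quick aside on the thimble-and-faucet idea above. Here is a toy model of my own (it is not from Carr or Sweller; the 4-slot capacity and the one-item-per-step consolidation rate are made-up numbers for illustration only): a small fixed-capacity buffer sitting between an incoming stream and a long-term store. Fed one slow stream, almost everything gets consolidated; fed several streams at once, most of it spills.

# Toy sketch (mine, not from the books): working memory as a small buffer
# between an incoming stream and a long-term store. The 4 slots and the
# consolidation rate of 1 item per step are illustrative assumptions.

def read_stream(items, arrivals_per_step, slots=4, consolidations_per_step=1):
    working, long_term, lost = [], [], []
    i = 0
    while i < len(items) or working:
        # new information arrives
        for item in items[i:i + arrivals_per_step]:
            if len(working) < slots:
                working.append(item)
            else:
                lost.append(item)              # the thimble overflows
        i += arrivals_per_step
        # only a little can be consolidated into long-term memory each step
        for _ in range(min(consolidations_per_step, len(working))):
            long_term.append(working.pop(0))
    return long_term, lost

book_items = [f"book-{n}" for n in range(20)]
net_items = [f"net-{n}" for n in range(20)]

kept, lost = read_stream(book_items, arrivals_per_step=1)   # one steady faucet
print(f"book: kept {len(kept)}, lost {len(lost)}")          # kept 20, lost 0

kept, lost = read_stream(net_items, arrivals_per_step=5)    # five faucets at once
print(f"net:  kept {len(kept)}, lost {len(lost)}")          # kept 7, lost 13

Obviously the brain isn't a Python list; the point is only that the loss comes from the rate mismatch at the bottleneck, not from any shortage of long-term capacity.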

     

     

     

     

    I'll pause here to respond to the rest of your post.

  14. Actually you did.

     

    There are only two ways to understand your last post.

     

    1) You don't know what you're talking about.

     

    2) You know what you're talking about and are misusing words and ideas intentionally. In other words, trolling.

And like I said, that's really just one way to understand my post. You just added some flavor to #2.

     

How many of those people have come to a conclusion at all? The question itself is what's important here.

15. I'm glad you brought up Aristotle's rules of logic. Your examples are a bit of a stretch, but I agree with what you're trying to say, although what you're applying them to misses the points I was making. That could've been my fault, but let me try explaining, since you're not really keen on asking questions and you jump to a fuck-ton of conclusions.

     

    "You shouldn't give out a list of books to read and think that we would understand you if only we would read books X, Y, and Z."

Because you went straight to being dismissive of my post instead of asking for clarification or posting anything constructive, I could only assume you were done with the discussion, so I left it at a list of books you could read instead.

     

    "If you understand the points in those books, just give us the points. "

I can definitely simplify those books into points. The question is why I would want to do that. What value is there in breaking down a bunch of books in this thread when there is no real discussion here?

     

    "You shouldn't say things like "it's been scientifically proven" to try and make a point."

You're confusing the warrant for a claim with its support. That's not an example of support. If you want the support, I can provide it.

     

    "You shouldn't appeal to authority, (just because someone says something doesn't make it true)."

True, but I haven't. If you think I have, let me know where. In fact, I just told Spambot to stop appealing to the authority of his Austrian economists.

     

    "You shouldn't contradict yourself, (post #1 you say Google has indexed .04% of the internet, post #37 you say Google has indexed between 4-12%)"

You're right; however, I have found research papers that say both. While researchers can't agree on a number, they do support my claim that Google indexes only a fraction of the Internet.

     

    "You shouldn't overstate your case, it makes it that much easier to falsify your claim, (e.g. post #48 "a hammer has only one use..." false, hammers have many uses including but not limited to: building, demolition, bottle opener, door stop etc."

Valid point, but not a great example of it. The list of uses you gave could apply to any number of things, including your skull, but we're talking about the hammer as a sophisticated tool and technology. The single purpose of the hammer, the purpose that defines it as a hammer and not your skull or a rock or whatever, is driving nails.

     

This technology changed people's entire relationship to the world. It created a new kind of building, a new way to think about nature, a new career and role in society, a new culture with a different value for everything around it. We see this with every new technology, including agriculture, animal husbandry, the printed word, and so on. THAT was the point I was making. The role of the hammer, and of all technology, in shaping modern society cannot be overstated.

     

    "This is just one simple example, but there are many like this you've made in this thread). You shouldn't stray off of your own topic, (this thread was about computer usage, no reason to bring in god, religion, eternity.

    Whether any of these are true or false they aren't on topic and they don't further your idea). "

     

I completely disagree with this. We are talking about the computer as a technology and a medium. We are attempting to weigh the benefits against the costs, and one part of this is to compare the computer to other technologies, to see how technology changes the metaphors and abstract concepts of our society. Since we see the world through abstract concepts like eternity, religion, and god (also capitalism, individualism, liberty, fairness, and so on), it's important to see how each new technology changes the meaning of these metaphors and our relationship to them. As we change everything over to the computer, we must ask ourselves how our relationship to these metaphors and to the world is altered. Since you're clearly interested in epistemology, at least in the context of religion, you might also be interested in looking at technology as epistemology: with every new technology, the epistemology changes.

     

    "When someone makes a point, try to understand their point before responding to it. "

    Sage wisdom, Mr. "I’m just gonna assume you don’t know what you’re talking about."

16. I reread the previous post, and if you haven’t read any of the books I’ve offered then admittedly you won’t have a clue what I’m talking about. If you don’t want to read the books because you're waiting for the movie, this is the best I can do for you.

     

     

     

    "Medium is the message"

     

    http://en.wikipedia.org/wiki/The_medium_is_the_message

     

Marshall McLuhan suggests that what you communicate is far less important than how you communicate it. It doesn't matter what you say to someone through a telephone; the fact that you're using a telephone says a lot more. “The medium is the message.”

     

     

     

Neil Postman revises this argument in Amusing Ourselves to Death, suggesting that "the medium is the metaphor."

     

     

Postman argues that what you can communicate through a medium is defined by the medium itself, which then redefines your entire relationship to people, society, culture, and the world. People who read books place more value on objectivity than people who watch TV, for instance.

     

Each new technology monopolizes communication and therefore monopolizes culture. We see this today in a decline of literacy and reading comprehension compared to the 18th century, when literacy was somewhere around 98%. When the book held a monopoly on public discourse, society demanded high-grade literacy to plug into the culture. We are guided by logic and abstraction when we read print, yet we are presented with concrete images and crave aesthetics when we watch television. This influences decisions like which president we choose. Media therefore influence the way we tell the truth; each technology comes with its own epistemology. A print-based culture was objective and rational, and the way it expressed ideas was logical and sequential.

     

A television-based epistemology turns politics, education, news, and the rest into channels of entertainment, focusing on aesthetics rather than on logic and rationale.

     

When print held a monopoly on discourse in America (until the 19th century), everything was discussed by everyone in a serious and thoughtful manner. It demanded high-grade literacy to plug into society. The Lyceum Movement started as a form of adult education; by 1835 there were more than 3,000 Lyceum lecture halls in 15 states, where intellectuals, writers, and humorists spoke to audiences of all classes for free.

     

Postman describes “Typographic Man” as “detached, analytical, devoted to logic, abhorring contradiction.” He was also classless, as reading was not regarded as an elitist activity in America.

     

The first printing press in the US arrived in 1638, at Harvard. From the 17th century on, knowledge began to be transferred to the printed page in America, and books were imported from Britain. Soon after, public libraries provided all these books free of charge to every American in the country. There has never been a more open loop of communication in this country. Then came the telegraph, penny papers, the telephone, radio, newspapers, and television, all of which became government-sponsored monopolies controlled by companies like AT&T and Bell.

     

     

     

     

Since “Amusing Ourselves to Death” was written in 1985, we have to go to “The Shallows” by Nicholas Carr, written in 2010, to compare TV and typography to computers.

     

     

    Chapter 5: A Medium of the Most General Nature

     

Much like the television tried to do, the computer is slowly replacing all the media that came before it. The way the Web has progressed as a medium replays, with the velocity of a time-lapse film, the entire history of modern media. Hundreds of years have been compressed into a couple of decades. The first information-processing machine that the Net replicated was Gutenberg's press. Because text is fairly simple to translate into software code and to share over networks — it doesn't require a lot of memory to store, a lot of bandwidth to transmit, or a lot of processing power to render on a screen — early Web sites were usually constructed entirely of typographical symbols. The very term we came to use to describe what we look at online — pages — emphasized the connection to printed documents.

     

Next, the Web began to take over the work of our traditional sound-processing equipment — radios and phonographs and tape decks. The earliest sounds to be heard online were spoken words, but soon snippets of music, and then entire songs and even symphonies, were streaming through sites at ever-higher levels of fidelity. The network's ability to handle audio streams was aided by the development of software algorithms, such as the one used to produce MP3 files, that erase from music and other recordings sounds that are hard for the human ear to hear. The algorithms allowed sound files to be compressed to much smaller sizes with only slight sacrifices in quality. Telephone calls also began to be routed over the fiber-optic cables of the Internet, bypassing traditional phone lines.
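
(A rough sketch of my own of the idea behind that kind of perceptual compression, nothing like the real MP3 psychoacoustic model: transform the audio into frequencies, discard the components too weak to be noticed, and rebuild the sound from what's left. The 1% threshold and the test signal are arbitrary illustrative choices.)

# Toy sketch of lossy, perception-style audio compression (not the actual
# MP3 algorithm): drop frequency components too weak to matter to the ear.
import numpy as np

rate = 8000                                   # samples per second
t = np.arange(rate) / rate                    # one second of "audio"
# a loud 440 Hz tone plus a very quiet 3000 Hz hiss
signal = np.sin(2 * np.pi * 440 * t) + 0.003 * np.sin(2 * np.pi * 3000 * t)

spectrum = np.fft.rfft(signal)
threshold = 0.01 * np.abs(spectrum).max()     # keep only relatively strong components
kept = np.abs(spectrum) >= threshold
compressed = np.where(kept, spectrum, 0)      # everything else is discarded

restored = np.fft.irfft(compressed, n=len(signal))
print(f"kept {int(kept.sum())} of {len(spectrum)} frequency components")
print(f"worst sample error after decompression: {np.abs(signal - restored).max():.4f}")

The compression works precisely because it forgets the parts we were least likely to notice, which is a fitting preview of the next chapter.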

     

    Chapter 7: The Juggler's Brain

     

Our use of the Internet involves many paradoxes, but the one that promises to have the greatest long-term influence over how we think is this one: the Net seizes our attention only to scatter it.

     

    Chapter 9: Search, Memory

     

...it's no longer terribly efficient for our brains to store information. Memory should now function like a simple index, pointing us to places on the Web where we can locate the information we need at the moment we need it. Why memorize the content of a single book when you could be using your brain to hold a quick guide to an entire library? Rather than memorize information, we now store it digitally and just remember what we stored. As the Web teaches us to think like it does, we'll end up keeping rather little deep knowledge in our own heads.

     

When a person fails to consolidate a fact, an idea, or an experience in long-term memory, he's not "freeing up" space in his brain for other functions. In contrast to working memory, with its constrained capacity, long-term memory expands and contracts with almost unlimited elasticity, thanks to the brain's ability to grow and prune synaptic terminals and continually adjust the strength of synaptic connections. The brain never reaches a point at which experiences can no longer be committed to memory; the brain cannot be full. The very act of remembering appears to modify the brain in a way that can make it easier to learn ideas and skills in the future.

     

We don't constrain our mental powers when we store new long-term memories. We strengthen them. With each expansion of our memory comes an enlargement of our intelligence. The Web provides a convenient and compelling supplement to personal memory, but when we start using the Web as a substitute for personal memory, bypassing the inner processes of consolidation, we risk emptying our minds of their riches.

     

In the 1970s, when schools began allowing students to use portable calculators, many parents objected. They worried that a reliance on the machines would weaken their children's grasp of mathematical concepts. The fears, subsequent studies showed, were largely unwarranted. No longer forced to spend a lot of time on routine calculations, many students gained a deeper understanding of the principles underlying their exercises. Today, in freeing us from the work of remembering, it's said, the Web allows us to devote more time to creative thought. As the experience of math students has shown, the calculator made it easier for the brain to transfer ideas from working memory to long-term memory and encode them in the conceptual schemas that are so important to building knowledge. The Web has a very different effect. It places more pressure on our working memory, not only diverting resources from our higher reasoning faculties but obstructing the consolidation of long-term memories and the development of schemas. The calculator, a powerful but highly specialized tool, turned out to be an aid to memory. The Web is a technology of forgetfulness.

     

The more we use the Web, the more we train our brain to be distracted — to process information very quickly and very efficiently but without sustained attention. That helps explain why many of us find it hard to concentrate even when we're away from our computers. Our brains become adept at forgetting, inept at remembering.

     

     

     

I’ll save the other half of those books for later. While so far this has been largely an existentialist argument, the rest of the books go into the more economically and politically motivated monopolies of information technology, and the pros and cons of bundling vs. unbundling information, monopolizing vs. democratizing information technology, and so on.

     

    Tim Wu - The Master Switch

Nicholas Carr - The Big Switch (couldn't find anything in particular that covers the whole book, so here) http://www.youtube.com/results?search_query=nicholas+carr+the+big+switch&oq=nicholas+carr+the+big+switch&gs_l=youtube.3..0.1961.5963.0.6181.16.5.0.11.11.1.139.464.4j1.5.0...0.0...1ac.tamzZKAvmes

    Neil Postman - Technopoly http://www.youtube.com/watch?v=KbAPtGYiRvg

    Neil Postman - Building a bridge to the 18th century http://www.youtube.com/watch?v=JovJr_LmAP8

     

    Couple more

     

Jaron Lanier - You Are Not a Gadget

    Sherry Turkle - Alone Together

    William Powers - Hamlet's Blackberry http://www.youtube.com/watch?v=5IFhmw9YdoY

17. Alright, I have a minute to use a real computer.

     

     

    This is what I'm talking about:

Marshall McLuhan's "Understanding Media,"

Aldous Huxley, "Brave New World,"

Neil Postman, "Amusing Ourselves to Death," "Technopoly," "Building a Bridge to the 18th Century"

    Tim Wu, "The Master Switch"

    Jaron Lanier "You Are Not A Gadget"

    Nicholas Carr "The Shallows" and "The Big Switch."

     

    Read up.
