DH Lecture 1


  • Roger Whitson. My work focuses on how literary studies is affected by new technologies. Unit on Digital Humanities.
    New/Old Field in English.
    Stresses Intersection b/t Literature and Digital Technology.
    Very Open Field. Many People Disagree w/ What It May Mean.
    What are some ideas you have about English from the earlier portions of the course?
  • Visualization of different definitions of DH by Elijah Meeks. Day of DH, held each spring, asks people to define the digital humanities. Meeks took a selection of these just to show the diversity.
    The data was processed w/ a form of distant reading called topic modeling.
    Talk about that later on in the unit.
    Now, see how DH is less a “field” in the traditional sense than a loose cluster of practices.
    Practices are related, but not in a systematic way.
  • This is roughly how our unit will work.
    Today, short discussion of the history of computing.
    Next week, we’ll look at how DH specifically affects English and Literature.
    Final week, we will look at some applications before playing around with some distant reading applications.
    DH is not just distant reading, but it is the most visible part of the discipline right now.
  • Today, talk about the early history of computing, starting with the 19th century.
    Kirk told me that you looked at Thoreau as nature writing and examined how his work went against industrial modes of clock time.
    Much about computation is about efficiency. Thoreau might not like it.
    Engineers and programmers are notoriously lazy. They create programs to solve problems and make stuff easier.
    The question of how or why literary scholars would use computational methodologies, we’ll leave for next week.
    How many of you know either coding or programming? What’s the difference?
    Coding is markup: marking up a text (see HTML). Programming is getting a computer to run a calculation based upon an input and an output. We start with programming.
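One way to make the markup/programming distinction concrete for students (a sketch of my own, not from the readings): markup labels a text without computing anything, while a program transforms an input into an output.

```python
# Markup "codes" a text: it labels structure but computes nothing.
markup = "<p>Walden, by <em>Henry David Thoreau</em></p>"

# A program maps an input to an output through computation.
def word_count(text):
    """Count whitespace-separated words in a string."""
    return len(text.split())

print(word_count("I went to the woods"))  # → 5
```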
  • Three figures are important here:
    1. Joseph Marie Jacquard (1752-1834): French weaver and merchant. Created the Jacquard loom, programmable form of textile manufacturing.
    2. Charles Babbage (1791-1871): British mathematician. Considered the “father of the computer,” the first person to create a diagram for a computational device. (The U.S. video game store chain Babbage’s was named after him.)
    3. Ada Lovelace (1815-1852): First computer programmer, worked w/ Babbage. Called herself a “poetical scientist.” Translated an article by the Italian military engineer Luigi Menabrea, adding a set of notes that went beyond the original article; these are considered to be the first computer program and conceptualized the very idea of software. Babbage always considered the computer a number-cruncher. Ada thought computers could execute actions, not just solve equations.
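Lovelace’s Note G described a method for computing Bernoulli numbers on the Analytical Engine. A modern sketch of that computation (my own implementation using the standard recurrence, not her actual algorithm) can show students what “the first program” was computing:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return exact Bernoulli numbers B_0..B_n via the recurrence
    sum_{k=0}^{m} C(m+1, k) * B_k = 0  (for m >= 1)."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-s / (m + 1))
    return B

print(bernoulli(4))  # → [1, -1/2, 1/6, 0, -1/30]
```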
  • Schematic for the Difference Engine.
    mechanically computes astronomical and mathematical tables more efficiently and correctly than most people.
    In 1823, the British government gave Babbage 17,000 pounds to build the machine.
    No one had yet built a device to such exacting standards before; the machine was much more expensive than Babbage originally predicted.
    Britain scuttled the project before it was completed; the Difference Engine was never finished in Babbage’s lifetime.
    A replica was completed in 1991 to celebrate the 200th anniversary of Babbage’s birth.
    Difference engine’s work w/ tables helped to conceptually create the idea common to computation now of the database (i.e. tabular data that could be computed simultaneously).
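The Difference Engine tabulated polynomials by the method of finite differences, replacing multiplication with repeated addition. A minimal sketch of that idea (my own illustration; the polynomial x² + x + 41 is often cited as Babbage’s demonstration example):

```python
def tabulate(initial_diffs, steps):
    """Tabulate a polynomial by repeated addition, as the Difference
    Engine did. initial_diffs = [p(0), Δp(0), Δ²p(0), ...]; each step
    adds every difference column into the one above it."""
    d = list(initial_diffs)
    table = [d[0]]
    for _ in range(steps):
        for i in range(len(d) - 1):
            d[i] += d[i + 1]
        table.append(d[0])
    return table

# p(x) = x^2 + x + 41: p(0)=41, Δp(0)=2, Δ²p=2
print(tabulate([41, 2, 2], 3))  # → [41, 43, 47, 53]
```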
  • Flash forward to 1945, when Vannevar Bush (1890-1974) coordinated wartime research w/ a number of other scientists:
    Albert Einstein, Enrico Fermi, Niels Bohr, Richard Feynman, John von Neumann.
    Picture of Fat Man (detonated over Nagasaki on 9 August 1945).
    The second wave of computational work was the mathematical modeling of atomic and hydrogen bomb explosions in places like Los Alamos.
    The so-called ENIAC computer, short for Electronic Numerical Integrator and Computer.
    After the explosions in Japan and the end of WWII, the question became what people would do next.
    Large amount of resources devoted to Manhattan Project were redirected to developing so-called “stored-memory computer.”
  • Bush’s article as it appeared in the September 1945 issue of The Atlantic.
    What were some themes Bush expresses about technology in the essay?
    How can a collective memory machine help solve these problems? Bush worried that technology would only be used for war. Emergence (as Eisenhower later said) of the military-industrial complex - anyone know what that means? The connection b/t war and private enterprise.
    Yes, early version of conceptual hypertext. What were your favorite of his projections?
    Ideas: storing texts by using photography,
    speech recording w/ stenography,
    mechanization of repetitive thought processes makes it easier to do more complicated problems. Machines can be used anywhere we can create systematic logic.
  • Diagram of the memex. Current indexing is alphabetical or numerical instead of associative (like the brain).
    memex as a device that could store and retrieve information based upon association (how do we do that today?)
    also pass items to another memex.
    trails of thought processes can also be published, like an encyclopedia. Making retrieval of information smarter.
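A toy sketch of the memex’s associative “trails” (my own illustration, not Bush’s design): items are linked to one another and retrieved by association rather than looked up in an alphabetical index.

```python
class Memex:
    """Toy associative store: items plus bidirectional 'trail' links."""

    def __init__(self):
        self.items = {}   # name -> stored text
        self.links = {}   # name -> list of associated names

    def add(self, name, text):
        self.items[name] = text
        self.links.setdefault(name, [])

    def tie(self, a, b):
        # Build a trail: association runs in both directions.
        self.links[a].append(b)
        self.links[b].append(a)

    def trail(self, name):
        # Retrieve by association, not by index position.
        return [(n, self.items[n]) for n in self.links[name]]

m = Memex()
m.add("as we may think", "Bush's 1945 essay")
m.add("memex", "associative desk for storing and linking documents")
m.tie("as we may think", "memex")
print(m.trail("memex"))  # → [('as we may think', "Bush's 1945 essay")]
```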
  • Going further, another big figure in history of computation is Alan Turing.
    Cryptanalyst during WWII. Worked for British intelligence. Anyone know what that is?
    father of artificial intelligence
    1952: acknowledged to police that he was homosexual, which was a crime in Britain.
    Was given the choice of imprisonment or hormone therapy (synthetic oestrogen to reduce libido).
    The treatment led to impotence and enlargement of breast glands.
    In 1954, found dead of cyanide poisoning, a half-eaten apple beside him.
    Suicide was ruled the cause.
  • Turing Test: test of a machine’s ability to exhibit intelligence indistinguishable from human int.
    can machines think?
    Not an answerable question, in the final analysis.
    Foundational to the field of artificial intelligence and (later) to natural language processing (i.e. teaching computers to process human language).
    Acc to Turing, machine intelligence could be understood as differing merely in degree (not kind) from human intelligence.
  • Women providing input to an ENIAC (1946). 150 feet long.
    Personal computers won’t come until the 1970s. Women often operated these machines b/c computation was considered menial labor.
    Human operators of these machines were called “computers.”
    See N. Katherine Hayles’s book My Mother Was a Computer.
  • Outputs were presented on punch-cards.
    More sophisticated form here. Can anyone point out what this might be?
    Scantron! Where you fill in your name. Make sure there aren’t any stray marks.
    You’d feed this into a machine, the machine would count where you filled in the correct oval, and this would be your grade.
    Punch-cards worked similarly. People would do their computations manually, feed their programs via punch cards into a teletype machine, that would communicate w/ a mainframe, which would send back the output on a punch card. You’d then interpret that information and formulate your theory.
  • What’s this?
    Known as computing on a command line. You can find the command line by opening “Terminal” on your MacBook. Some people still prefer it to graphical systems like Windows or macOS b/c they argue it gives them closer access to the computer.
    DOS is pre-Windows. This is what’s shown when you type: dir /w, which shows a list of files w/in the directory.
  • early, early, EARLY form of a modem. Probably 1980s.
    I used a 2400 baud modem to connect to BBS’s when I was in high school.
    Took up a landline. Couldn’t use the phone when you used the internet.
    Why did you have to put the receiver on the modem?
    The modem uses the phone line: it converts the computer’s electric signals to sound, then back to electric signals on the other end.
    Which is what you see translated into characters on the screen.
    Notice what’s happening here and how it approximates what Bush said about the Memex machine.
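The electric-to-sound conversion those acoustic couplers performed was frequency-shift keying: each bit becomes a tone at one of two audio frequencies. A minimal sketch (my own illustration; the 1270/1070 Hz tones are the Bell 103 originate-side pair):

```python
import math

MARK, SPACE = 1270, 1070  # Hz: Bell 103 originate-side tones for 1 and 0

def modulate(bits, rate=8000, baud=300):
    """Turn a bit sequence into audio samples: one tone per bit,
    sent down the phone line as sound the far modem decodes back."""
    samples = []
    per_bit = rate // baud  # samples per bit period
    for b in bits:
        f = MARK if b else SPACE
        samples.extend(math.sin(2 * math.pi * f * n / rate)
                       for n in range(per_bit))
    return samples

print(len(modulate([1, 0])))  # → 52 samples (26 per bit at 8 kHz / 300 baud)
```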
  • finally, early EARLY version of Xerox’s GUI from 1983.
    what’s a GUI? (Graphical User Interface). Yes.
    Xerox was the first company to design a GUI.
    Apple actually stole the idea from them, and used its superior market share to make it seem like it invented it.
    See Walter Isaacson’s biography of Steve Jobs (great book on Apple’s early days).
    1. Unit 4: The Digital Humanities
    2. Today: History of Computing
       • Major Assignment #3 Due
       • Vannevar Bush, “As We May Think.”
       10/31: No Class
       • Ignore the Video Assignment for This Day.
       11/5: Digital Humanities and Literary Studies
       • Matt Kirschenbaum, “What’s DH and What’s It Doing in English Depts?”
       • Stephen Ramsay, “High Performance Computing for English Majors.”
       11/7: Digital Humanities Applications
       • Lauren Klein, “‘A Report Has Come Here’: Social Network Analysis in Thomas Jefferson.”
       • Ted Underwood, “We Don’t Already Know the Broad Outlines of Literary History.”
       11/12: Distant Reading or Macroanalysis
       • Franco Moretti, “Conjectures on World Literature.”
       • Matthew Jockers, “On Distant Reading and Macroanalysis.”
       11/14: Voyant
       • My Voyant Tutorials and Play with Digital Applications!
    3. The Early Years of Computation