Four article assignment

Article #1
May 12, 2009
Plugging Holes in the Science of Forensics
By HENRY FOUNTAIN

It was time, the panel of experts said, to put more science in forensic science.

A report in February by a committee of the National Academy of Sciences found “serious problems” with much of the work performed by crime laboratories in the United States. Recent incidents of faulty evidence analysis — including the case of an Oregon lawyer who was arrested by the F.B.I. after the 2004 Madrid terrorist bombings based on fingerprint identification that turned out to be wrong — were just high-profile examples of wider deficiencies, the committee said. Crime labs were overworked, there were few certification programs for investigators and technicians, and the entire field suffered from a lack of oversight.

But perhaps the most damning conclusion was that many forensic disciplines — including analysis of fingerprints, bite marks and the striations and indentations left by a pry bar or a gun’s firing mechanism — were not grounded in the kind of rigorous, peer-reviewed research that is the hallmark of classic science. DNA analysis was an exception, the report noted, in that it had been studied extensively. But many other investigative tests, the report said, “have never been exposed to stringent scientific scrutiny.”

While some forensic experts took issue with that conclusion, many welcomed it. And some scientists are working on just the kind of research necessary to improve the field. They are refining software and studying human decision-making to improve an important aspect of much forensic science — the ability to recognize and compare patterns.

The report was “basically saying what many of us have been saying for a long time,” said Lawrence Kobilinsky, chairman of the department of sciences at John Jay College of Criminal Justice in New York. “There are a lot of areas in forensics that need improvement.”

Barry Fisher, a past president of the American Academy of Forensic Sciences and a former director of the crime laboratory at the Los Angeles County Sheriff’s Department, said he and others had been pushing for this kind of independent assessment for years. “There needs to be a demonstration that this stuff is reliable,” he said.

It’s not that there hasn’t been any research in forensic science. But over the years much of it has been done in crime labs themselves. “It hasn’t gotten to the level where they can state findings in a rigorous scientific way,” said Constantine Gatsonis, director of the Center for Statistical Sciences at Brown University and co-chairman of the National Academy of Sciences committee. And rather than being teased out in academic papers and debated at scientific conferences, “a lot of this forensic stuff is being argued in the courtroom,” Mr. Fisher said. “That’s not the place to validate any kind of scientific information.”

Much forensic research has been geared to improving technologies and techniques. These studies can result in the kinds of gee-whiz advances that may show up in the next episode of the “C.S.I.” series — a technique to obtain fingerprints from a grocery bag or other unlikely source, for example, or equipment that enables analyses of the tiniest bits of evidence.

This kind of work is useful, Dr. Kobilinsky said, “but it doesn’t solve the basic problem.”

DNA analysis came out of the biological sciences, and much money and time has been spent developing the field, resulting in a large body of peer-reviewed research. So when a DNA expert testifies in court that there is a certain probability that a sample comes from a suspect, that claim is grounded in science.

As evidence to be analyzed, DNA has certain advantages. “DNA has a particular structure, and can be digitized,” Dr. Gatsonis said. So scientists can agree, for example, on how many loci on a DNA strand to use in their analyses, and computers can do the necessary computations of probability.

“Fingerprints are a lot more complicated,” Dr. Gatsonis said. “There are a lot of different ways you can select features and make comparisons.” A smudged print may have only a few ridge endings or other points for comparison, while a clear print may have many more. And other factors can affect prints, including the material they were found on and the pressure of the fingers in making them.

Sargur N. Srihari, an expert in pattern recognition at the University at Buffalo, part of the New York state university system, is trying to quantify the uncertainty. His group did much of the research that led to postal systems that can recognize handwritten addresses on envelopes, and he works with databases of fingerprints to derive probabilities of random correspondence between two prints.

Most features on a print are usually represented by X and Y coordinates and by an angle that represents the orientation of the particular ridge where the feature is located. A single print can have 40 or more comparable features.

Dr. Srihari uses relatively small databases, including an extreme one that contains fingerprints from dozens of identical twins (so the probability of matches is high), and employs the results to further refine mathematical tools for comparison that would work with larger populations.
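The feature representation described here — each minutia as X and Y coordinates plus a ridge-orientation angle — lends itself to a simple comparison sketch. The tolerances and the greedy pairing below are illustrative assumptions for this article, not Dr. Srihari's actual method:

```python
import math

def match_minutiae(print_a, print_b, dist_tol=10.0, angle_tol=math.radians(15)):
    """Greedily pair minutiae (x, y, theta) from two prints.

    A pair matches when the locations lie within dist_tol units and the
    ridge orientations differ by less than angle_tol. Returns the number
    of matched features -- the raw quantity an examiner would point to.
    """
    unused = list(print_b)
    matched = 0
    for (xa, ya, ta) in print_a:
        for candidate in unused:
            xb, yb, tb = candidate
            close = math.hypot(xa - xb, ya - yb) <= dist_tol
            # Compare angles on a circle, so orientations near 0 and
            # near 2*pi still count as aligned.
            diff = abs(ta - tb) % (2 * math.pi)
            aligned = min(diff, 2 * math.pi - diff) <= angle_tol
            if close and aligned:
                unused.remove(candidate)
                matched += 1
                break
    return matched

# Two noisy views of the same three ridge features, plus one stray point.
left = [(10, 12, 0.50), (40, 80, 1.60), (75, 30, 3.00)]
right = [(12, 10, 0.45), (41, 78, 1.65), (74, 33, 2.95), (90, 90, 0.1)]
print(match_minutiae(left, right))  # 3 of the features pair up
```

Turning a raw count like this into a probability of random correspondence is the hard statistical step the research aims at; the count alone says nothing about how likely a chance match is.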
“These numbers are not easy to come by at this point,” he said. The goal is not individualization — matching two prints with absolute certainty — but coming up with firm probabilities that would be very useful in legal proceedings.

Other researchers are compiling databases of their own. Nicholas D. K. Petraco, an assistant professor at John Jay College, is studying microscopic tool marks of the kind made by a screwdriver when a burglar jimmies a window. It has been hypothesized that no two screwdrivers leave exactly the same pattern of marks, although that has never been proved. So Dr. Petraco is systematically making marks in jeweler’s wax and other materials, creating images of them under a stereo microscope and quantifying the details, assembling a database that can eventually be mined to determine probabilities that a mark matches a certain tool.

Dr. Petraco, a chemist with a strong background in computer science, looks to industry for ideas about pattern recognition — the tools that a company like Netflix uses, for example, to classify people by the kinds of movies they like. “A lot of computational machinery goes into making those kinds of decisions,” he said.

He figures that if something works for industry, it will work for forensic science. “You don’t want to invent anything new,” he said, because that raises legal issues of admissibility of evidence.

The work takes time, but the good news is that the data stays around forever. So as software improves, the probabilities should get more accurate. “Algorithms and data comparison evolve over time,” Dr. Petraco said.

But it may not be possible to develop useful databases in some disciplines — bite mark analysis, for example. “Using a screwdriver, that’s very straightforward and simple,” said Ira Titunik, a forensic odontologist and adjunct professor at John Jay College. But bites involve numerous teeth, and there are other factors, including condition of the skin, that may make it difficult to quantify them for purposes of determining probabilities.

A few researchers are looking at how errors creep into forensic analysis. The National Institute of Standards and Technology recently established a working group on fingerprints, with statisticians, psychologists and others, “to try to understand the circumstances that lead to human error,” said Mark Stolorow, director of the Office of Law Enforcement Standards at the institute.

In Britain, Itiel Dror, a psychologist who studies decision-making processes, is already looking at human factors. “I like to say the mind is not a camera, objectively and passively recording information,” said Dr. Dror, who has a consulting firm and is affiliated with University College London. “The brain is an active and dynamic device.”

He has conducted studies that show that when working on an identification, fingerprint examiners can be influenced by what else they know about a case. In one experiment, he found that the same examiner can come to different conclusions about the same fingerprint, if the context is changed over time.

The same kinds of contextual biases arise with other decision-makers, said Dr. Dror, who works with the military and with financial and medical professionals. He thinks one reason forensic examiners often do not acknowledge that they make errors is that in these other fields, the mistakes are obvious. “In forensics, they don’t really see it,” he said. “People go to jail.”

Forensics experts say the need for research like Dr. Dror’s and Dr. Srihari’s does not mean that disciplines like fingerprint analysis will turn out to be invalid. “I have no doubt that fingerprint evidence and firearms evidence, once looked into by the appropriate research entities, are going to be shown to be very reliable and good,” said Mr. Fisher, the former American Academy of Forensic Sciences president.

Dr. Kobilinsky said people should not jump to the conclusion that forensic science is bad science. “There’s a lot of experience and knowledge that goes into somebody’s expertise,” he said.

“It’s not junk science. But that doesn’t mean it shouldn’t be improved.”
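Dr. Petraco's preference for off-the-shelf pattern-recognition machinery over new inventions can be illustrated with a nearest-neighbor lookup, one of the most standard such tools. The feature vectors (imagined striation measurements) and tool labels below are invented for illustration, not his actual data:

```python
import math

# Toy reference database: quantified mark features paired with the tool
# that made them. Real entries would come from microscope images of
# marks made systematically in jeweler's wax.
database = [
    ((2.1, 0.8, 5.5), "screwdriver A"),
    ((2.0, 0.9, 5.3), "screwdriver A"),
    ((3.7, 1.5, 4.0), "screwdriver B"),
    ((3.9, 1.4, 4.2), "screwdriver B"),
]

def nearest_tool(mark):
    """Return the tool whose reference mark lies closest to the
    questioned mark in Euclidean feature space."""
    def distance(entry):
        features, _ = entry
        return math.dist(features, mark)
    _, label = min(database, key=distance)
    return label

print(nearest_tool((2.05, 0.85, 5.4)))  # closest to screwdriver A's marks
```

As the article notes, the payoff of reusing a standard method is legal as much as technical: a well-studied algorithm faces fewer admissibility challenges than a novel one, and the same database can be re-mined as the software improves.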
Article #3
Judging Honesty by Words, Not Fidgets
By BENEDICT CAREY

Before any interrogation, before the two-way mirrors or bargaining or good-cop, bad-cop routines, police officers investigating a crime have to make a very tricky determination: Is the person I’m interviewing being honest, or spinning fairy tales?

The answer is crucial, not only for identifying potential suspects and credible witnesses but also for the fate of the person being questioned. Those who come across poorly may become potential suspects and spend hours on the business end of a confrontational, life-changing interrogation — whether or not they are guilty.

Until recently, police departments have had little solid research to guide their instincts. But now forensic scientists have begun testing techniques they hope will give officers, interrogators and others a kind of honesty screen, an improved method of sorting doctored stories from truthful ones.

The new work focuses on what people say, not how they act. It has already changed police work in other countries, and some new techniques are making their way into interrogations in the United States.

In part, the work grows out of a frustration with other methods. Liars do not avert their eyes in an interview on average any more than people telling the truth do, researchers report; they do not fidget, sweat or slump in a chair any more often. They may produce distinct, fleeting changes in expression, experts say, but it is not clear yet how useful it is to analyze those.

Nor have technological advances proved very helpful. No brain-imaging machine can reliably distinguish a doctored story from the truthful one, for instance; ditto for polygraphs, which track changes in physiology as an indirect measure of lying.

“Focusing on content is a very good idea,” given the limitations of what is currently being done, said Saul Kassin, a professor of psychology at John Jay College of Criminal Justice.

One broad, straightforward principle has changed police work in Britain: seek information, not a confession. In the mid-1980s, following cases of false confessions, British courts prohibited officers from using some aggressive techniques, like lying about evidence to provoke suspects, and required that interrogations be taped. Officers now work to gather as much evidence as possible before interviewing a suspect, and they make no real distinction between this so-called investigative interview and an interrogation, said Ray Bull, a professor of forensic psychology at the University of Leicester.

“These interviews sound much more like a chat in a bar,” said Dr. Bull, who, with colleagues like Aldert Vrij at the University of Portsmouth, has pioneered much of the research in this area. “It’s a lot like the old ‘Columbo’ show, you know, where he pretends to be an idiot but he’s gathered a lot of evidence.”

Dr. Bull, who has analyzed scores of interrogation tapes, said the police had reported no drop-off in the number of confessions, nor major miscarriages of justice arising from false confessions. In one 2002 survey, researchers in Sweden found that less-confrontational interrogations were associated with a higher likelihood of confession.

Still, forensic researchers have not abandoned the search for verbal clues in interrogations. In analyses of what people say when they are lying and when they are telling the truth, they have found tantalizing differences.

Kevin Colwell, a psychologist at Southern Connecticut State University, has advised police departments, Pentagon officials and child protection workers, who need to check the veracity of conflicting accounts from parents and children. He says that people concocting a story prepare a script that is tight and lacking in detail.

“It’s like when your mom busted you as a kid, and you made really obvious mistakes,” Dr. Colwell said. “Well, now you’re working to avoid those.”

By contrast, people telling the truth have no script, and tend to recall more extraneous details and may even make mistakes. They are sloppier.

Psychologists have long studied methods for amplifying this contrast. Drawing on work by Dr. Vrij and Dr. Marcia K. Johnson of Yale, among others, Dr. Colwell and Dr. Cheryl Hiscock-Anisman of National University in La Jolla, Calif., have developed an interview technique that appears to help distinguish a tall tale from a true one.

The interview is low-key but demanding. First, the person recalls a vivid memory, like the first day at college, so researchers have a baseline reading for how the person communicates. The person then freely recounts the event being investigated, recalling all that happened. After several pointed questions (“Would a police officer say a crime was committed?” for example), the interviewee describes the event in question again, adding sounds, smells and other details. Several more stages follow, including one in which the person is asked to recall what happened in reverse.

In several studies, Dr. Colwell and Dr. Hiscock-Anisman have reported one consistent difference: People telling the truth tend to add 20 to 30 percent more external detail than do those who are lying. “This is how memory works, by association,” Dr. Hiscock-Anisman said. “If you’re telling the truth, this mental reinstatement of contexts triggers more and more external details.”
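The measurement behind that 20-to-30-percent finding — how much external detail a later retelling adds — can be sketched with a crude word-level proxy. Trained raters use a formal coding scheme, not word counts; everything below, including the sample accounts, is a simplified assumption:

```python
def added_detail_ratio(first_account, second_account):
    """Fraction of new content words in a retelling: a rough stand-in
    for the 'external detail' raters count across interview stages."""
    first_words = set(first_account.lower().split())
    second_words = set(second_account.lower().split())
    new_words = second_words - first_words
    return len(new_words) / max(len(first_words), 1)

# A free recall, then a second pass after the prompt to add sounds
# and smells -- the stage where truth-tellers tend to pull ahead.
free_recall = "i walked into the office and picked up the exam"
second_pass = ("i walked into the office heard the radiator hiss "
               "smelled coffee and picked up the exam")

ratio = added_detail_ratio(free_recall, second_pass)
print(f"{ratio:.0%} new detail words")
```

A liar sticking to a tight script would score near zero on a measure like this, while a truthful rememberer, triggered by context, keeps adding sensory specifics.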
Not so if you’ve got a concocted story and you’re sticking to it. “It’s the difference between a tree in full flower in the summer and a barren stick in winter,” said Dr. Charles Morgan, a psychiatrist at the National Center for Post-Traumatic Stress Disorder, who has tested it for trauma claims and among special-operations soldiers.

In one recent study, the psychologists had 38 undergraduates enter a professor’s office and either steal an exam or replace one that had been stolen. A week later, half told the truth in this structured interview, and the other half tried not to incriminate themselves by lying in the interview. A prize of $20 was offered to the most believable liars.

The researchers had four trained raters who did not know which students were lying analyze the transcripts for response length and richness of added detail, among other things. They correctly categorized 33 of the 38 stories as truthful or deceitful.

The study, whose co-authors were Amina Memon, Laura Taylor and Jessica Prewett, is one of several showing positive results of about 75 percent correct or higher.

This summer, Dr. Colwell and Dr. Hiscock-Anisman are scheduled to teach the technique at the San Diego Police Department, which has a force of some 2,000 officers. “You really develop your own antenna when interviewing people over the years,” said Chris Ellis, a lieutenant on the force who invited the researchers to give training. “But we’re very open to anything that will make our jobs easier and make us more accurate.”

This approach, as promising as it is, has limitations. It applies only to a person talking about what happened during a specific time — not to individual facts, like, “Did you see a red suitcase on the floor?” It may be poorly suited, too, for someone who has been traumatized and is not interested in talking, Dr. Morgan said. And it is not likely to flag the person who changes one small but crucial detail in a story — “Sure, I was there, I threw some punches, but I know nothing about no knife” — or, for that matter, the expert or pathological liar.

But the science is evolving fast. Dr. Bull, Dr. Vrij and Par-Anders Granhag at Goteborg University in Sweden are finding that challenging people with pieces of previously gathered evidence, gradually introduced throughout an investigative interview, increases the strain on liars.

And it all can be done without threats or abuse, which is easier on officers and suspects. Detective Columbo, it turns out, was not just made for TV.
Article #4
Tracking Cyberspies Through the Web Wilderness
By JOHN MARKOFF

For old-fashioned detectives, the problem was always acquiring information. For the cybersleuth, hunting evidence in the data tangle of the Internet, the problem is different.

“The holy grail is how can you distinguish between information which is garbage and information which is valuable?” said Rafal Rohozinski, a University of Cambridge-trained social scientist involved in computer security issues.

Beginning eight years ago he co-founded two groups, Information Warfare Monitor and Citizen Lab, which both have headquarters at the University of Toronto, with Ronald Deibert, a University of Toronto political scientist. The groups pursue that grail and strive to put investigative tools normally reserved for law enforcement agencies and computer security investigators at the service of groups that do not have such resources.

“We thought that civil society groups lacked an intelligence capacity,” Dr. Deibert said.

They have had some important successes. Last year Nart Villeneuve, 34, an international relations researcher who works for the two groups, found that a Chinese version of Skype software was being used for eavesdropping by one of China’s major wireless carriers, probably on behalf of Chinese government law enforcement agencies.

This year, he helped uncover a spy system, which he and his fellow researchers dubbed Ghostnet, which looked like a Chinese-government-run spying operation on mostly South Asian government-owned computers around the world.

Both discoveries were the result of a new genre of detective work, and they illustrate the strengths and the limits of detective work in cyberspace.

The Ghostnet case began when Greg Walton, the editor of Infowar Monitor and a member of the research team, was invited to audit the Dalai Lama’s office network in Dharamsala, India. Under constant attack — possibly from Chinese-government-sponsored computer hackers — the exiles had turned to the Canadian researchers to help combat the digital spies that had been planted in their communications system over several years.

Both at the Dalai Lama’s private office and at the headquarters of the exiled Tibetan government, Mr. Walton used a powerful software program known as Wireshark to capture the Internet traffic to and from the exile groups’ computers.

Wireshark is an open-source software program that is freely available to computer security investigators. It is distinguished by its ease of use and by its ability to sort out and decode hundreds of common Internet protocols that are used for different types of data communications. It is known as a sniffer, and such software programs are essential for the sleuths who track cybercriminals and spies on the Internet.

Wireshark makes it possible to watch an unencrypted Internet chat session while it is taking place, or in the case of Mr. Walton’s research in India, to watch as Internet attackers copied files from the Dalai Lama’s network.

In almost every case, when the Ghostnet system administrators took over a remote computer they would install a clandestine Chinese-designed software program called GhOst RAT — for Remote Administration Terminal. GhOst RAT permits the control of a distant computer via the Internet, to the extent of being able to turn on audio and video recording features and capture the resulting files. The operators of the system — whoever they were — in addition to stealing digital files and e-mail messages, could transform office PCs into remote listening posts.

The spying was of immediate concern to the Tibetans, because the documents that were being stolen were related to negotiating positions the Dalai Lama’s political representatives were planning to take in negotiations the group was engaged in.

After returning to Canada, Mr. Walton shared his captured data with Mr. Villeneuve and the two used a second tool to analyze the information. They uploaded the data into a visualization program that had been provided to the group by Palantir Technologies, a software company that has developed a program that allows investigators to “fuse” large data sets to look for correlations and connections that may otherwise go unnoticed.

The company was founded several years ago by a group of technologists who had pioneered fraud detection techniques at Paypal, the Silicon Valley online payment company. Palantir has developed a pattern recognition tool that is used both by intelligence agencies and financial services companies, and the Citizen Lab researchers have modified it by adding capabilities that are specific to Internet data.

Mr. Villeneuve was using this software to view these data files in a basement at the University of Toronto when he noticed a seemingly innocuous but puzzling string of 22 characters reappearing in different files. On a hunch, he entered the string into Google’s search engine and was instantly directed to similar files stored on a vast computerized surveillance system located on Hainan Island off the coast of China. The Tibetan files were being copied to these computers.

But the researchers were not able to determine with certainty who controlled the system. The system could have been created by so-called patriotic hackers, independent computer activists in China whose actions are closely aligned with, but independent from, the Chinese government. Or it could have been created and run by Internet spies in a third country.

Indeed, the discovery raised as many questions as it answered. Why was the powerful eavesdropping system not password-protected, a weakness that made it easy for Mr. Villeneuve to determine how the system worked? And why, among the more than 1,200 compromised government computers representing 103 countries, were there no United States government systems? These questions remain.

Cyberforensics presents immense technical challenges that are complicated by the fact that the Internet effortlessly spans both local and national government boundaries. It is possible for a criminal, for example, to conceal his or her activities by connecting to a target computer through a string of innocent computers, each connected to the Internet on different continents, making law enforcement investigations time consuming or even impossible.

The most vexing issue facing both law enforcement and other cyberspace investigators is this question of “attribution.” The famous New Yorker magazine cartoon in which a dog sits at a computer keyboard and points out to a companion, “On the Internet, nobody knows you’re a dog,” is no joke for cyberdetectives.

To deal with the challenge, the Toronto researchers are pursuing what they describe as a fusion methodology, in which they look at Internet data in the context of real-world events.

“We had a really good hunch that in order to understand what was going on in cyberspace we needed to collect two completely different sets of data,” Mr. Rohozinski said. “On one hand we needed technical data generated from Internet log files. The other component is trying to understand what is going on in cyberspace by interviewing people, and by understanding how institutions work.”

Veteran cybersecurity investigators agree that the best data detectives need to go beyond the Internet. They may even need to wear out some shoe leather.

“We can’t become myopic about our tools,” said Kent Anderson, a security investigator who is a member of the security management committee of the Information Systems Audit and Control Association. “I continually bump up against good technologists who know how to use tools, but who don’t understand how their tools fit into the bigger picture of the investigation.”

This article has been revised to reflect the following correction:

Correction: May 15, 2009
An article on Tuesday about the investigative tools being used in computer security operations misspelled the surname of a political scientist at the University of Toronto. He is Ronald Deibert, not Diebert. It also misstated the surname of the editor of Infowar Monitor. He is Greg Walton, not Watson.
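The hunch that broke the Ghostnet case open — a fixed-length string recurring across unrelated capture files — is the kind of pattern a short script can surface automatically. A sketch of that idea, with invented capture excerpts standing in for the real Tibetan data:

```python
from collections import defaultdict

def recurring_substrings(files, length=22):
    """Map each fixed-length substring to the set of files it appears
    in, then keep only substrings seen in more than one file."""
    seen = defaultdict(set)
    for name, data in files.items():
        for i in range(len(data) - length + 1):
            seen[data[i:i + length]].add(name)
    return {chunk: names for chunk, names in seen.items() if len(names) > 1}

# Invented excerpts from two captures that share one 22-character token,
# the sort of "seemingly innocuous but puzzling" repeat worth a search.
captures = {
    "tibet_office.log": "GET /upload?id=a1b2c3d4e5f6a7b8c9d0e1 HTTP/1.1",
    "embassy_host.log": "POST /sync token:a1b2c3d4e5f6a7b8c9d0e1;done",
}
for chunk, names in recurring_substrings(captures).items():
    print(repr(chunk), "->", sorted(names))
```

A brute-force scan like this is quadratic in data size and would need smarter indexing at real capture volumes, but it captures the principle: the machine finds the repetition, and the investigator supplies the hunch about what it means.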
