Linux and Forensic Discovery 260

Max Pyziur writes "Found this on cryptome.org, where Linux is cited in a DOJ document against Moussaoui (sometimes referred to as the '20th man'): FBI: Moussaoui E-mail Not Recoverable - January 1, 2003." An interesting read that gives some insight into how computer evidence is handled in court.
  • by craenor ( 623901 ) on Wednesday January 01, 2003 @04:09PM (#4994601) Homepage
    Of the fact that lawyers will argue over anything.

    Heh, this seems to be a discussion about whether they used "approved methods" of retrieving a deleted email. According to one person, the GNU/Linux dd routine was the only one approved by NIST (the National Institute of Standards and Technology). This, of course, is wrong... NIST doesn't "approve" software; they just test it and declare whether or not it works.
  • At least... (Score:3, Funny)

    by Ironica ( 124657 ) <pixel&boondock,org> on Wednesday January 01, 2003 @04:09PM (#4994603) Journal
    ...someone in the government seems to realize that Microsoft can't be trusted ;-)
  • Secure File Deletion (Score:5, Informative)

    by b1ng0 ( 7449 ) on Wednesday January 01, 2003 @04:10PM (#4994613)
    To anyone who is concerned about having their deleted files recovered, take a look at Wipe [sourceforge.net] - in its strongest mode it will make 37 passes over the data in order to be sure that electron microscopes cannot reconstruct the bit patterns.
    • Or of course there is shred.

      From the man page: "shred - delete a file securely, first overwriting it to hide its contents"

      It comes with the fileutils package (on Red Hat, anyway). I can't see any difference between wipe and shred, apart from the fact that one comes already installed. Is there any difference?
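      As a rough sketch of basic usage (GNU fileutils shred assumed; the filename is made up):

      # overwrite the file repeatedly with patterns and random data, finish with zeros, then unlink it
      shred --verbose --zero --remove secrets.txt

      The --zero pass writes zeros last so the file doesn't obviously look shredded, and --remove deletes it afterwards. As the replies below note, none of this helps on journaling or log-structured filesystems.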
      • shred obsolescence (Score:3, Informative)

        by radon28 ( 593565 )
        The shred utility is only effective on non-log-structured, non-journaling filesystems, e.g. ext2 but not ext3, JFS, ReiserFS, etc. See "man 1 shred" for more info.
        • by ahi ( 29259 )
          Nor will wipe, according to the author's page [sourceforge.net]. In fact, no user-space utility can.
        • If you're using ext3, you could always remount it as ext2 in order to run shred. Not practical to do it for each deletion, but if you only want to shred the occasional file, it's an option. (I don't know if there's a way to do something similar for other journaled filesystems.)
    • by Speare ( 84249 ) on Wednesday January 01, 2003 @04:41PM (#4994770) Homepage Journal

      It seems that journaling filesystems like ext3 cause hell for secure deletions, because changes aren't always committed as the application level assumes and requires. Has anyone suggested a kernel/filesystem hook to make secure media deletions possible?

        The hook is already there. The chattr utility can set a "secure delete" extended attribute on a file or directory, which will make a subsequent normal rm perform a secure delete. However, the man page says it's not implemented yet, and said man page hasn't been updated since kernel version 2.2.
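        For what it's worth, the flag in question looks like this (e2fsprogs chattr/lsattr assumed; as noted above, mainline kernels don't actually honor the flag, so this is only a request, and the filename is made up):

        # ask the filesystem to zero this file's blocks when it is deleted
        chattr +s secrets.txt
        # confirm the attribute is set
        lsattr secrets.txt
        # a later rm would then (in theory) trigger the secure delete
        rm secrets.txt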

    • Or, for FreeBSD, you could just do rm -P.
      Overwrite regular files before deleting them. Files are overwritten three times, first with the byte pattern 0xff, then 0x00, and then 0xff again, before they are deleted.
      • by Alsee ( 515537 )
        byte pattern 0xff, then 0x00

        A little knowledge is a dangerous thing :)
        0xff is the value for a string of all 1's and 0x00 is the value for a string of all 0's, but hard drives actually record entirely different bit sequences. And different hard drives use different encodings. Without knowing the specific encoding the current drive uses, your best bet is probably to write random values.

        • 0xff is the value for a string of all 1's and 0x00 is the value for a string of all 0's, but hard drives actually record entirely different bit sequences.
          Possibly even variable-length sequences, if a run-length-limited code is used. In which case writing random data a few dozen times could easily leave a big chunk of slack space untouched. Erase/write simply isn't good enough.

          The only way to be sure is to nuke the hard drive from orbit. ;-)

    • by Anonymous Coward
      You can't trust those tools anymore. Today's hard drives will physically move sectors around on disk to avoid areas that are bordering on causing media errors.
    • by bloxnet ( 637785 ) on Wednesday January 01, 2003 @05:01PM (#4994854)
      Wipe is a nice program, but it is simply overkill. Studies have shown that typically 3 passes of a data-wiping program will make your data non-recoverable by standard means (using popular forensics tools such as EnCase, Maresware, NTI's batch of programs, or disk editors on whatever platform you are interested in). As to how much U.S. government investigators are able to retrieve... well, that falls into the urban-legend category, I suppose. For the most part, DoJ guidelines suggest wiping your data 7 times as the norm. This is because of the imprecise manner in which hard drive read/write heads pass over the disk itself (more of a wobble than a perfect circular motion). I recently saw a whitepaper on EnCase's site covering Windows XP users where EFS (Encrypting File System) secure deletion (which does just 3 passes) made recovery of the deleted files impossible; this is the whitepaper [guidancesoftware.com].

      Just as the referenced article concludes, keep in mind that there are many places to look on Windows and Unix machines other than the files that were deleted. Perhaps the pictures of your latest porn stash or the Word document covering your NDA violations are gone, but registry settings, file slack (as was mentioned briefly in the parent article), pagefiles, memory dumps, and many other locations that track your activity on a given machine can be used as well.

      Wow, I did not mean to get so long-winded... I just really get into computer forensics. My personal advice for decent file security and deletion is encryption + multi-pass deletion. There are several encrypted filesystems out there for both Windows and *nix, and a few options that are viable with both (BestCrypt filesystem containers plus BCWipe for deletion [jetico.com] is a good example). I don't see the need to start advertising products, so check out the options for OS-level and OS-independent solutions.
    • Does anyone know of a "wipe"-style utility that can also wipe unused disk space (deleted inodes, etc.) on Linux?
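      One low-tech approach, sketched below, is to fill the free space with a junk file and then delete it (paths are made up; this is not a forensic-grade guarantee):

      # overwrite unallocated blocks by filling the filesystem with zeros; dd stops with an error once the disk is full
      dd if=/dev/zero of=/mnt/data/fill.tmp bs=1M
      sync
      rm /mnt/data/fill.tmp

      This overwrites most free data blocks once, but misses slack space inside allocated files, journal contents, and any remapped sectors.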
  • by chunkwhite86 ( 593696 ) on Wednesday January 01, 2003 @04:13PM (#4994632)
    Linux is used by humans outside of the Slashdot community! Stay Tuned!

  • Oh Please! (Score:5, Interesting)

    by Snowbeam ( 96416 ) on Wednesday January 01, 2003 @04:13PM (#4994636) Homepage
    How is this news? They are using "dd", a Linux utility. Seeing "Linux" in an article does not warrant a story about it. This demeans Linux by using every little scrap of news to attempt to show that it is in use. Instead we should be demonstrating its uses, rather than reporting that it is in use.
    • Re:Oh Please! (Score:3, Informative)

      To be honest, 'dd' is not a Linux utility. Various *nixes used it before Linux was even started.
      • To be honest, 'dd' is not a Linux utility. Various *nixes used it before Linux was even started.

        In fact dd is even overkill for this purpose. The same could be achieved by cat or something even simpler. This task is so simple that we shouldn't really care how they did it. I could have written a 42 line Turbo Pascal program under DOS that could do it.
    • Well said! (Score:3, Interesting)

      by disc-chord ( 232893 )
      Anyone who's even set foot in a "Computer Crimes" department (or whatever your local police call their Info Warriors) knows they have been using *nix since day 1 in forensics.

      This is not news, and the idea that we should be getting all excited over this suggests that *nix is such a desperately useless POS as to warrant mass praise whenever anyone actually finds a use. Is that really the message /. wants to convey?

  • by metatruk ( 315048 ) on Wednesday January 01, 2003 @04:17PM (#4994659)
    From the article:
    Before addressing the authentication for the four specific computers, an error in Mr. Allison's affidavit must be corrected. In his affidavit, Mr. Allison writes: "Many methods are available to create an exact duplicate; however, only one method - the GNU/Linux routine dd - has been approved by the National Institute of Standards and Technologies." Allison Affidavit at 3. This statement is simply wrong. The National Institute of Standards and Technologies (NIST) does not "approve" software, it merely tests it and then publishes the results of its tests.

    The test results are available here:
    http://www.ojp.usdoj.gov/nij/sciencetech/cftt.htm [usdoj.gov]
  • electron microscopes (Score:4, Interesting)

    by Alien54 ( 180860 ) on Wednesday January 01, 2003 @04:18PM (#4994663) Journal
    I am confused. (Yes, we all know this.)

    The document states that image files were generated for the contents of the hard drives. I do not have confidence that an image would also display latent data.

    I know myself that when I do a data recovery on a system, I can get many more megs of recovered data from file fragments, deleted folders, etc. than can fit on the drive. Most of this extra stuff is junk data, but you get the idea.

    There is no substitute for the original.

    Recovery can require a minimum of specialized software or be as complicated as looking at the platters under an electron microscope. I see nothing here that indicates use of such specialized technology, and yet this is supposed to be a national security matter.

    • by g4dget ( 579145 ) on Wednesday January 01, 2003 @04:34PM (#4994736)
      The document states that image files were generated for the contents of the hard drives. I do not have confidence that an image would also display latent data.

      It's pretty clear what "dd" images: the entire content of the hard disk drive as it is readable by its disk controller. It won't image residual data that has been erased.

      I know myself that when I do a data recovery on a system, I can get many more megs of recovered data from file fragments, deleted folders, etc. than can fit on the drive. Most of this extra stuff is junk data, but you get the idea.

      Unless your recovery efforts involve custom hardware, the disk image obtained with "dd", together with bad block information and drive geometry, contains every bit of information you are ever going to get out of that drive. Any software-based recovery working on that image is going to be equivalent to recovery working on the original drive.

      Trying to recover data that has been physically overwritten, using analog methods or imaging, is so expensive and time consuming that it is feasible only in special cases.
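      For reference, the kind of imaging being described boils down to something like the following rough sketch (the device and output paths here are hypothetical):

      # read every sector the controller will return; pad unreadable blocks so the image stays sector-aligned
      dd if=/dev/hda of=/evidence/suspect.img bs=512 conv=noerror,sync
      # hash the source device and the image; matching digests show the copy is faithful
      md5sum /dev/hda /evidence/suspect.img

      If any sectors had to be padded over read errors, the two digests will differ, which is itself useful information for the examiner.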

      • Unless your recovery efforts involve custom hardware, the disk image obtained with "dd", together with bad block information and drive geometry, contains every bit of information you are ever going to get out of that drive. Any software-based recovery working on that image is going to be equivalent to recovery working on the original drive.

        Not so! Remember, when you're using dd, you're still using a relatively high level protocol to talk to the drive. If you can get the drive into a "test" mode, where you can talk to the actual registers on the drive, there's a heck of a lot more you can do. For example, on some drives, you could tweak the positional calibration registers and move the head fractional tracks, reading the data at each step, and maybe pick up some data at the edges of the track that wouldn't be picked up in the center. (You're hoping that there was a slight positional drift from when the data was written to when the data was erased).

        Now actually getting the drive into "test" mode, talking to the registers, and knowing what the hell the registers actually do is very difficult; you're basically talking about documentation that only an engineer working at a drive manufacturer would have. (And of course, this stuff is all non-standard, since it's never supposed to be directly accessed... so each model or family of drives would have different capabilities.) This is pretty much the definition of "deep magic." But for the select few who have access to that documentation, some amazing tricks are possible.

    • by Alien54 ( 180860 )
      Cryptome has, on its front page, details on what the FBI is up against [cryptome.org]. Just scroll down a bit.
      • The Eagan, Minnesota Kinko's Computers

        19. The Initial September 2001 Inquiry at the Eagan, MN Kinko's: On October 17, 2002, I spoke with Minneapolis FBI Special Agent David Rapp. At that time, SA Rapp told me that, to the best of SA Rapp's unrefreshed recollection, on or about September 19, 2001, SA Rapp went to the Kinko's store in Eagan, Minnesota, to inquire about a receipt found on the person of Zacarias Moussaoui at the time of his arrest. At that time, SA Rapp met with a person who represented himself as a Kinko's employee responsible for managing and maintaining customer computer workstations. At that time, the Kinko's employee informed SA Rapp, in substance, as follows:

        (A) The Kinko's receipt did indicate that a computer workstation had been utilized;

        (B) It could not be determined from the copy of the Moussaoui receipt alone which computer workstation was used;

        (C) In response to SA Rapp's inquiry about the possibility of acquiring any information from the computer workstations regarding the use of the computers by Moussaoui, the Kinko's employee stated that, since the date of the receipt, all computers had been wiped clean/formatted and started with a fresh install; and,

        (D) The computer workstations were generally wiped weekly or bi-weekly approximately, even though Kinko's policy called for weekly wipings. At a minimum, the Eagan Kinko's store wiped the computers at least once per month.

        [....]

        21. Eagan Follow-up: On October 11, 2002, I requested that the Minneapolis FBI Field Office contact Kinko's personnel at the Eagan store and determine if, as alleged by the defense, the Kinko's computer could still maintain evidence of defendant Zacarias Moussaoui's use from August 2001. On or about October 15, 2002, Special Agents Brendan Hansen and Christopher Lester visited the Eagan Kinko's and interviewed Brian Fay, who, as of August 11, 2001, was one of two Kinko's employees who knew how to restore an image onto the six computers with internet access designated for customer use. Mr. Fay stated that the six computers presently at the store are the same computers (with the same hard drives) that were present in August of 2001. These six computers are leased and scheduled to be replaced at the end of this year.

        The computers are maintained by formatting the computers' hard drives and reloading an image using Norton Ghost whenever business is slow and time allows. There are no logs recording the dates or frequency of loading images onto the computers and Fay could not estimate how frequently they were imaged. Although Fay was not personally familiar with the exact details of the formatting and imaging process he administers to the computers, Fay had been advised by Kinko's that the formatting and restoration process destroyed all files associated with previous users.

      This would be rather thorough, it seems.

      ouch

  • CRC/SHA-1/MD5 (Score:1, Interesting)

    by MeanMF ( 631837 )
    If the hash value of the original prior to duplication matches identically the hash value after the duplication, one may conclude that the duplicate file accurately reflects the data on the original file. The fact that the hash values match is typically more important than the hash values themselves.

    Are they saying that two different files can't have the same hash value? That's a load of crap! It's not hard at all to modify data to create any hash value that you want, especially when you're including "deleted space" in the CRC calculations... It's good at telling you if there were any random modifications caused by errors during copying, but not that the files are identical.
    • It's not hard at all to modify data to create any hash value that you want, especially when you're including "deleted space" in the CRC calculations...

      That kind of depends on the strength of the hash algorithm, wouldn't you say?
    • Sure you can. But to be able to do it with something like MD5, you need to factor some very large prime numbers. Hence the security.
      • Oops...If they were prime, they would be easy to factor. You need to factor the products of some very large primes.

        (The last post wasn't a mistake--it was my intentional FUD to keep the terrorist from figuring out RSA. Shhhh!)
      • Sure you can. But to be able to do it with something like MD5, you need to factor some very large prime numbers. Hence the security.

        Sorry, not even close.

        MD5 has been compromised in a paper by Hans Dobbertin of the German Information Security Agency (BSI). The compromise is less than a total break, but it is also now 8 years old.

        MD5 uses only operations on 32-bit integers: addition, rotation and booleans. It does not use large integers or prime numbers.

    • Re:CRC/SHA-1/MD5 (Score:5, Informative)

      by metatruk ( 315048 ) on Wednesday January 01, 2003 @04:47PM (#4994796)
      Are they saying that two different files can't have the same hash value? That's a load of crap! It's not hard at all to modify data to create any hash value that you want

      From http://www.itl.nist.gov/fipspubs/fip180-1.htm [nist.gov]:

      The SHA-1 is called secure because it is computationally infeasible to find a message which corresponds to a given message digest, or to find two different messages which produce the same message digest. Any change to a message in transit will, with very high probability, result in a different message digest, and the signature will fail to verify.
      So yes, two different files can have the same hash, but it's computationally infeasible to find them. That's why hashing methods like SHA are used in cryptography; SHA-1 is used in DSA signatures.
    • That's a load of crap! It's not hard at all to modify data to create any hash value that you want, especially when you're including "deleted space" in the CRC calculations...

      CRC-32, sure. CRC is meant to check for small random transmission errors, not to function as a secure hash algorithm. But if you've figured out a way to force data to match a given SHA-1, you better get a press agent and a secretary because every crypto nut in the world is gonna call bullshit. And no, "trying lots of combinations" doesn't count.
    • No, what they are saying is that they copied a disc and the two discs had the same hash value.

      If you *don't care* what the contents of the original disc are, as is the case with forensic investigation, only that the dupe accurately reflects it, then checking the hash value of each against the other is a perfectly valid test.

      What they're testing for here *is* random errors in the copy process, not intentional tampering.

      KFG
    • Re:CRC/SHA-1/MD5 (Score:3, Informative)

      by bwt ( 68845 )
      Are they saying that two different files can't have the same hash value? That's a load of crap! It's not hard at all to modify data to create any hash value that you want, especially when you're including "deleted space" in the CRC calculations... It's good at telling you if there were any random modifications caused by errors during copying, but not that the files are identical.

      There are no known examples of two files that have the same MD5 (or SHA-1) hash values, so I think you should reevaluate your statement. While it certainly is true that such files do exist (2^128 MD5 values, > 2^128 possible files, pigeonhole principle, etc...), that does not mean that finding them is computationally easy or even possible.

      A brute force search would require ~2^128 files to be searched to find a match. If 2^32 computers each processed 2^16 files a second for a year (60*60*24*365 ≈ 2^25 seconds), it would take greater than 2^50 years to find a match. Equivalently, the odds that any of the files that have ever been produced by humans share the same MD5 are pretty slim.

      It might be possible that a cryptographic flaw in MD5 exists that could be exploited to reduce the number of files that need to be searched. I believe no such flaw is known. If one does exist, I'm quite sure it doesn't provide dramatic benefits.
      • Re:CRC/SHA-1/MD5 (Score:3, Interesting)

        by MeanMF ( 631837 )
        There are no known examples of two files that have the same MD5 (or SHA-1) hash values

        Sorry, my original message was kind of weak :)
        The programs that the government uses to do the copy use CRC32, which is very easy to get around. The CRC32 values are listed in section 13 of the expert's affidavit. The government says that this is enough to authenticate the data.

        SafeBack and the Logicube SFK-000A incorporate reliable internal CRC verification techniques, CART procedures do not require examiners to generate separate MD5 or SHA-1 hashes for computers imaged using SafeBack or Logicube SFK-000A disk duplicator.... All hard drives in this case were imaged by one of the three programs used by the FBI, all of which are recognized by the scientific community as reliable imaging programs. Thus, there should be no question about the authenticity of any of the hard drives.

        In terms of authenticating evidence for use in court, shouldn't the government be using something stronger than CRC? If I were on the defense's side, I would tear this apart - the MD5 hash that they eventually received was taken well after the original image was created, leaving plenty of time to alter any data. There was ample opportunity for somebody (whether as part of a "government conspiracy" or as an overzealous investigator/prosecutor) to alter both the image and the original hard drive before taking the MD5 hash, and before the image was delivered to the defense as part of discovery. There's no use in having an MD5 hash if all it is doing is verifying that you have an exact copy of data that has been tampered with. The government should, as standard practice, take the MD5 hash before they even make the first image, and preserve that record along with other evidence. This would make it much more difficult for the defense to claim that the data presented in discovery or at trial is not authentic.
      • that does not mean that finding them is computationally easy or even possible.

        Actually, there are well-known issues with MD5 that make it susceptible to collision searches; see:

        H. Dobbertin, "The Status of MD5 After a Recent Attack", RSA Labs' CryptoBytes, Vol. 2 No. 2, Summer 1996.
        http://www.rsa.com/rsalabs/pubs/cryptobytes.html

        I don't think that URL works anymore. This one does, in which Robshaw gives an overview of the problems:

        ftp://ftp.rsa.com/pub/pdfs/bulletn4.pdf

        Basically, it was demonstrated by Dobbertin in 1996 that data with a colliding hash can be found with 10-odd hours of processing on a (by now very low-powered) PC. Admittedly, this was only for the compression function of MD5, not for the full set of rounds specified by MD5; however, it is feared that existing techniques (i.e. those used to break MD4) can be applied to MD5 (indeed, this is what Dobbertin demonstrated). TTBOMK there is no known collision attack against the full MD5 algorithm. (Least not public knowledge anyway :) )

        So your assertion is incorrect with respect to MD5.

        SHA-1 is currently considered to be safe from hash collision attacks. However, that is not really relevant, as the FBI are specifically using CRC-32 and MD5.

        However, presuming that the question is not one of the FBI having deliberately modified the images, it does not /matter/ that MD5 is on shaky ground with respect to strength against collision attacks. The use of MD5 here is to verify that the copies are the same as the original images and that there weren't any errors introduced during copying. For this purpose MD5 is fine.
        • So your assertion is incorrect with respect to MD5.

          I disagree.

          Part of the MD5 algorithm is analytically weak, but that falls far short of an actual working attack for the whole thing. Researchers suspect that this weakness MIGHT eventually lead to an analytic attack against the whole of MD5, but as yet, no such attack exists. As Robshaw said: "While the existence of pseudo-collisions is significant on an analytical level, it is of less practical importance."

          Moreover...

          "Note that existing signatures that were generated using MD5 are likely to remain safe from compromise since it seems that current techniques used to cryptanalyze MD5 do not offer any advantage in finding a second preimage. Existing signatures should not be considered as being at risk of compromise at this point."
          • I did state that there was no publicly known attack against the full rounds of MD5. However, that was 7 years ago; it is not safe to assume that no one else has continued with this work.

  • Solitaire Forensics Kit, SFK-000A hand-held disk duplicator by Logicube, Inc. (hereafter "Logicube")


    I thought Solitaire only duplicated wasted work hours!
  • Ohhh, ohhhh.... (Score:2, Insightful)

    by evilviper ( 135110 )
    Oohhhhhh... Someone said the word "Linux"... Better put it on the front page...
    • Not only was the word "Linux" mentioned, but so were the words "computer evidence," and "court."
      Hey, this is Slashdot. News for Nerds. Stuff that matters.
      A lot of us are interested in things such as Linux and computer security. I found this document to be an interesting read, and I am glad it was posted on Slashdot.
  • by grub ( 11606 ) <slashdot@grub.net> on Wednesday January 01, 2003 @04:24PM (#4994682) Homepage Journal

    Sept. 10, 2001

    Zach,
    We're going off flying tomorrow, hope to see you on the other side. Last one there gets the 70 ugliest virgins!

    M. Atta
    • by Anonymous Coward
      Last one there gets the 70 ugliest virgins!

      So who did wind up getting the Slashdot editors?
  • by michaelmalak ( 91262 ) <michael@michaelmalak.com> on Wednesday January 01, 2003 @04:32PM (#4994720) Homepage
    See my Aug. 29, 2002 blog article FBI didn't get Moussaoui's e-mail despite having his laptop [underreported.com], which notes the irony that "the U.S. government is interested in the e-mail of all those in the U.S. except for alleged terrorists" and which links to an Aug. 29, 2002 Washington Post article [washingtonpost.com].

    (Recall that Moussaoui was already in jail before Sep. 11. These pre-Sep. 11 e-mail search requests were rebuffed, according to FBI whistleblower Colleen Rowley.)

    • by MacAndrew ( 463832 ) on Wednesday January 01, 2003 @05:00PM (#4994844) Homepage
      Note that the FBI, charged by so many with violating people's privacy in every way imaginable, here dropped the ball by being too cautious about someone's privacy.

      You can't win -- bungling cuts both ways.

      Anyone wonder why the heck the Minnesota FBI office went to Washington for a piddly search warrant, instead of their friendly local court? Because this was not an ordinary warrant, but a national security warrant designed to investigate suspected terrorists who might not have committed any crime to provide probable cause for a regular warrant. (You know, like Minority Report. OK, it's not that bad. :)

      It will be interesting to see who gets blamed once all of the finger-pointing is over.

      From NYT [missouri.edu] by James Risen*:
      According to Ms. Rowley's letter and other bureau officials, the Minneapolis field office believed that the French report on Mr. Moussaoui provided enough troubling information about his ties to Islamic extremism to go to court to obtain a search warrant under the federal law that allows the government to carry out searches and surveillance in espionage and terrorism cases. Under the statute, investigators do not have to show that a subject committed a crime, only that they have reason to believe the suspect is engaged in terrorist activity or espionage on behalf of a foreign power or a terrorist organization.

      * Another little note -- James Risen and Jeff Gerth were the NYT reporters blamed for stoking the fire over the Wen Ho Lee debacle. Of course, lots of people were blamed [fas.org] -- sound familiar?
      • Anyone wonder why the heck the Minnesota FBI office went to Washington for a piddly search warrant, instead of their friendly local court? Because this was not an ordinary warrant, but a national security warrant designed to investigate suspected terrorists who might not have committed any crime to provide probable cause for a regular warrant.
        I think you answered your own question pretty well. I live in Minneapolis, and I doubt the local district court [uscourts.gov] has the facilities for classified proceedings involving national security issues. Just the fact they had to check with French intelligence agencies was probably enough to warrant (no pun intended) going to Washington with the case.
        • Well, it was a rhetorical question. :)

          Applications for this special warrant are only granted by a special "secret" court that sits in Washington, per the Foreign Intelligence Surveillance Act, and apparently the FBI central office has veto authority.
      • It is universally agreed that privacy and security are in conflict with each other and must be balanced. But this is a case where a warrant was sought for an individual based on a reasonable suspicion. Contrast this with Carnivore and Total Information Awareness, which are warrantless fishing expeditions of entire populations. I'm a staunch privacy advocate [underreported.com], yet advocate reasonable searches of a very small number of suspected terrorists.

        You say that the FBI was "too cautious" -- do you have any evidence that that was the motive?

        I see no irony in being a privacy advocate while decrying FBI supervisors for denying the request to search Moussaoui's e-mail.

        P.S. In another related story [underreported.com], the FBI supervisor who thwarted Rowley's investigation recently got a big cash bonus.

        • I don't of course know whether they would have gotten the warrant had they been allowed to present the case to the intelligence court. Hindsight is always distorting. But the reason cited by the central office was concern they might not get it, and I think up to now they've gotten just about everything they asked for and are worried about wearing out their welcome.

          This will all be easier to judge once the 9/11 commission issues its report. What? There's no 9/11 commission? But it's been more than a year! How could that be? (shock, outrage) My point is that the facts are there for the taking, but a certain administration is actively resisting unearthing them. Not a conspiracy, just politics as usual.

          Irony -- I meant it is ironic they didn't search when they should have, whereas elsewhere they have searched where they should not.
        • It is universally agreed that privacy and security are in conflict with each other and must be balanced.

          My own personal security is not enhanced in the least by an organization representing millions of heavily armed enforcers watching my every move. Quite the opposite, really: if I do something that gets on the nerves of some frustrated jerk in the Department of Ugly Euphemisms, he can most likely direct some men with guns to emphatically worsen the state of my world.

          Government needs reasonable resource allocation first (I know, let's let murderers out early so we have more room to imprison pot smokers!), greater competence second, and maybe, just maybe, more investigative power last.

  • by wfmcwalter ( 124904 ) on Wednesday January 01, 2003 @04:39PM (#4994759) Homepage
    Neglecting the STEM/SQUID recovery issues mentioned above, it's rather disappointing to see the feds using only a generic imager like dd to image the disk, as it's not quite a full image of all the stuff on the disk.

    The contents of any LBA that is in the drive's remap table (i.e. blocks that the drive electronics have previously determined to be bad or going bad) aren't captured by dd - the drive instead sends the data payload corresponding to the LBA's remapped physical address. The bad/bad-ish block remains, and its data is quite possibly still valid (or perhaps valid but for a couple of localised errors). These blocks thus hold tiny slivers of data stored on the drive sometime in the past (the last thing written before the block went bad).

    Although this missed data represents a microscopic fraction of the total data on the disk, it could, at least in theory, contain recoverable data of an evidentiary nature. The only way to see this is a drive-vendor-specific low-level read - I don't know much about the other two tools the article describes, but it doesn't sound like they do that either.

    Given that there's only a handful of drive manufacturers left, and the (non-servo) parts of the firmware on their drives don't vary hugely between models, it really wouldn't be too hard for law-enforcement types to have proper physical-level imaging tools for any drive they're likely to encounter.

    • by wfmcwalter ( 124904 ) on Wednesday January 01, 2003 @04:58PM (#4994838) Homepage
      Hey, there's something else - they're doing checksum calculation not on the disk image (/dev/hda) but on the partition image (/dev/hda1) - which means they're not entirely capturing everything that's potentially on the disk (in particular: the boot sector, the MBR, and any other partitions).

      Now, the document says the examiner determined that there was only one partition, and that he used a "Linux Boot CD" - this implies (it's not terribly clear what that actually is) that he used Linux's fdisk command (or Disk Druid or something) to determine that there was indeed only one partition - by examining the current contents of the drive's partition table.

      Doing this doesn't capture any space not currently assigned to a partition - in particular, if another partition were present but was then deleted, or if the extant FAT32 partition were resized (say with Partition Magic).

      In fact, it's rather unusual for a Windows laptop to have only one FAT32 partition - many (most?) vendor-configured laptops ship with a sleep-to-disk partition as well (Dell seems to always do this on Windows systems).

      In a non-forensic setting, these gripes would be beyond pedantic, but given the seriousness of the crime concerned, and the alleged technical skill of the terrorist groups implicated, these omissions are not immaterial. I do hope that they're omissions only in this document and that the examiner's actual procedure did properly image, checksum and examine _all_ of the disk's contents.
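      To make the distinction concrete, here's a rough sketch (device and file names are hypothetical): hashing only the partition misses everything outside it.

      # image and hash only the first partition -- the MBR, partition table and any unpartitioned space are not captured
      dd if=/dev/hda1 of=partition1.img
      md5sum /dev/hda1 partition1.img

      # image and hash the whole device, including the boot sector and any unallocated or hidden regions
      dd if=/dev/hda of=wholedisk.img
      md5sum /dev/hda wholedisk.img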

  • by defile ( 1059 ) on Wednesday January 01, 2003 @05:28PM (#4994965) Homepage Journal

    Given the weight of the issue and the evidence that could be contained on the disks, and given that the US government has an unlimited budget whenever anyone says "terrorism", why they went with dd (or the equivalent) to copy a disk is beyond me.

    I've seen doughnut shops have their hard disks worked on with more advanced technology.

    Shouldn't they have taken the hard disk to a clean room, removed the platters from the disk and painstakingly recorded every nanometer of them? I wouldn't trust a suspect's hard disk to make a copy of itself.

  • by Kjella ( 173770 ) on Wednesday January 01, 2003 @05:47PM (#4995094) Homepage
    ...encrypting stuff in the first place using Bestcrypt / PGPdisk / whatever would make the entire wiping/recovery discussion (-1, Redundant) when it comes to collecting evidence.

    Kjella
  • by MoralHazard ( 447833 ) on Wednesday January 01, 2003 @05:59PM (#4995159)
    Call this off-topic if you must, but I've seen gazillions of posts in this and many other threads about forensics and data recovery that are terribly misinformed about the realities of the field. Here's the two cents of a real, live forensic examiner:

    First, it is NOT realistically possible to recover data that has been overwritten ONE time. Yes, yes--I've read all the white papers on magnetic force microscopy (MFM) and I understand that a theory exists about recovery of overwritten data. In practice, nobody actually does it. Maybe one time, six years ago, some dude at NASA or MIT actually made this work under lab conditions on an older disk with a lower bit density, but anyone telling you that old patterns can be read in the real world is full of shit. And yes, it's been tried. Millions have been spent on this, and nobody can do it. Anybody selling you software that claims to be "more secure" because it overwrites more than once is being silly. It's not even paranoia, just lacking a clue.

    That's why forensic examiners don't need to have the original media. In fact, one of the big tenets of the job is to never, ever, ever perform analysis on the originals. You make a bitstream copy of the perp's (excuse me, "client's") disk, and you work with that.

    Oh, and electron microscopes have nothing to do with this theorized recovery process. MFM is a related but very different technology.

    Second, Linux versus Windows versus LogicCube versus ImageMasster (another brand) is utterly beside the point. Forensic shops use what they find to be cost effective, fast, and convenient. The dd command is great, and all, and many examiners use it on Linux platforms for their disk imaging needs, but it's not an analytical tool.

    Let me put it this way: do you actually think that a forensic examiner sits down, opens /dev/hdX in vi, and starts paging through 5 GB of hex? Oh, god, no--that would take years. Making the bitstream image is the easy part, and your choices are virtually unlimited. For the actual analysis (what does it MEAN), you need something that can examine an allocation table, interpret the results, and display the contents in an easy-to-understand format. You need software that can quickly search across a drive for a particular keyword, regular expression, or file signature. You need something that can analyze data for randomness in order to re-assemble images that have been chunked out across virtual memory. Linux does NOT have basic utilities for all of this, and neither does Windows.
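    That said, the simplest analytical step -- a raw keyword search -- can be sketched with standard tools (the image name and search term here are made up):

    # pull printable strings out of a raw image, keeping decimal byte offsets, and search them
    strings -t d suspect.img | grep -i "kinkos"
    # or search the raw image directly for a byte sequence, printing offsets of the matches
    grep -a -b -o -i "kinkos" suspect.img

    Everything past that -- interpreting allocation tables, carving files out of slack and swap -- is where the dedicated forensic suites earn their keep.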

    Last, a good forensic examiner is less constrained by his/her knowledge of computers than by his/her investigative skills. I know more about operating systems, file allocation, and troubleshooting than any of the 30-50 year old former cops/feds/spooks that I work with, but they're capable of far more effective work than I am. Why? Because once you have a few basic computer operations taken care of, the work has as much to do with computers as Computer Science does.

    The folks that put the child pornographers, embezzlers, script kiddies, and the rest of the computer criminals in jail generally know much, much less than you about computers, Slashdotters. They also don't give a rat's ass about Linux, Windows, Bill Gates, RMS, or any of it.
    • by Zeinfeld ( 263942 ) on Wednesday January 01, 2003 @06:54PM (#4995459) Homepage
      Call this off-topic if you must, but I've seen gazillions of posts in this and many other threads about forensics and data recovery that are terribly misinformed about the realities of the field. Here's the two cents of a real, live forensic examiner:

      One reason why security software is overdesigned is that it has to deal with improvements in technology. To take your point about older low density drives, any drive more than five years old falls into that category.

      The other reason is that forensics rarely deals with information that is deliberately concealed, and information that may become available in 10 or 20 years' time is rarely relevant. This is not the case with intelligence, where the activities of ten or even twenty years ago might be of major interest.

      The folks that put the child pornographers, embezzlers, script kiddies, and the rest of the computer criminals in jail generally know much, much less than you about computers, Slashdotters. They also don't give a rat's ass about Linux, Windows, Bill Gates, RMS, or any of it.

      Probably right there, but they are not the main customer for the technology we provide, and even if they do buy it, it is not that likely to do them a major amount of good. The main customers for computer security are commercial interests, banks and major corporations. There are many documented instances of national security organizations being used for commercial espionage; the French openly boast about it. The people who commit major wire fraud are typically well funded and backed by significant organized crime; at the moment the Russian mafia are the main players.

      There aren't that many investigations into that type of crime because it is amazingly rare. But the level of attack is very sophisticated and very real.

      • Okay, I'll bite. I did make a disparaging comment about an entire line of software products, so I'll do what I can to back it up. I stand by my assertion that recovery of wiped data is snake oil, and here's why.

        The most often cited source of opinions on MFM-related data recovery techniques is a paper from 1996 entitled "Secure Deletion of Data from Magnetic and Solid-State Memory", by Peter Gutmann. It's pretty readable if you have a good grounding in physics and hard drive operation, so I'd recommend checking it out:

        http://www.usenix.org/publications/library/proceedings/sec96/gutmann.html

        Notice, though, that Gutmann isn't the actual first-person researcher. His paper is a compilation of data gleaned from other sources. I spent six weeks tracking down (among other things) his bibliography, and found out that MFM techniques had been used in laboratory tests to recover overwritten data, in the early 1990s. These tests were not field-usable. It amounted to "write a regular pattern on the disk, overwrite it with another regular pattern, and look for evidence of the first pattern." Furthermore, these papers all referred to disks which had been manufactured about 10 years ago.

        I'll bet that someone HAS used this to a practical effect, somewhere, but just try finding out who, where, and (most importantly) how. There are no commercial vendors of this kind of technology--just try calling up OnTrack, or any of their competitors, and you'll hear the same thing. Desperate people in lawsuits and other dire straits have thrown millions of dollars down this hole (and that's just in the last few years, that I'm aware of), and gotten nothing for it.

        To hear Gutmann describe it, though, any halfway competent lab technician could make this process work. Where are the papers describing those operations, done on actual post-1993 hard drives, describing their methodologies?

        I personally watched a not-so-reputable data recovery firm tell a judge and some attorneys that they could recover single-pass deleted data if they had $750,000 in R&D and six months. They came up empty handed.

        This kind of data recovery is PIXIE DUST. It's an urban legend of the tech industry, one that everybody knows is true but nobody can ever prove.

        Can I prove to you that some spook lab buried ten miles beneath Ft. Meade, MD hasn't done this, and isn't buying computers thrown out by French businesses and reading every old secret? No, I can't, I don't work for the government and don't plan to start. But last I checked, it wasn't considered good logic to require absolute proof of a negation, when no proof has been shown of the posited statement.

        So, sure. You can MAYBE read data from pre-1993 hard drives, and maybe in 10 years the examination technology will have advanced enough to read today's drives (if hard drive technology stands perfectly still, eh?). The only people who need protection, then, are folks whose adversaries are incredibly wealthy AND willing to spend gobs of money on getting to them, and who would still be harmed if their ten-year old data is read.

        This does not include businesses--who cares what your business plan was ten years ago? This does not include common criminals--the government won't spend millions of dollars just to recover one piece of evidence. It certainly does not include you and me.

        That leaves ONE type of entity: sovereign governments. Are you selling your disk wiping utilities to governments, or to businesses and consumers?
        • Can I prove to you that some spook lab buried ten miles beneath Ft. Meade, MD hasn't done this, and isn't buying computers thrown out by French businesses and reading every old secret? No, I can't, I don't work for the government and don't plan to start. But last I checked, it wasn't considered good logic to require absolute proof of a negation, when no proof has been shown of the posited statement.

          That is why you will stay on the recovery side while most people who want real security will go to people who think like I do and cover cases that are at the edge of the possible.

          In fact, the data wipe programs are pretty useless, but for a completely different reason: the wipe procedure can't work unless it is used before the disk is scrapped. The only reliable way to secure data is to use encryption. It is quite practical to completely wipe crypto keys from memory.

          That leaves ONE type of entity: sovereign governments. Are you selling your disk wiping utilities to governments, or to businesses and consumers?

          Both.

    • it is NOT realistically possible to recover data that has been overwritten ONE time

      It is the usual practice of law enforcement and governments to instill a sense of superpowers in their abilities, just to keep people in line. Computer crime fighters might not be able to recover overwritten data, but they don't at all mind that you think they can, and probably won't correct anybody's misconception about it. It's part of their "if you commit a crime, we'll always get you!" hubris. As long as most people think that even deleted & overwritten data can be retrieved, they'll be less inclined to wrongdoing.

      That is, they WANT you to think the big bro' is always watching ;))
  • Uh, September 11? (Score:3, Interesting)

    by netik ( 141046 ) on Wednesday January 01, 2003 @06:45PM (#4995419) Homepage
    Aside from the fact that 1) the Slashdot editor is stupid, and 2) just because it says Linux doesn't mean it warrants a story, this bit caught my eye:


    The Examination of Moussaoui's Laptop

    Standby counsel's fourth request questions whether the defendant's laptop was imaged before it lost power. The defendant's laptop was imaged on September 11, 2001, before the laptop lost power. Sewell Affidavit at 11. The BIOS settings for the laptop requested by standby counsel are set forth in SSA Sewell's affidavit. Sewell Affidavit at 11. Therefore, this request is now moot.


    Ask yourself: How the hell did they know to image his laptop on September 11th? This means they already knew he was part of the attack, and they were already on to him. Funny how we, the people, were never warned.
    • Re:Uh, September 11? (Score:3, Informative)

      by sheldon ( 2322 )
      Ask yourself: How the hell did they know to image his laptop on September 11th? This means they already knew he was part of the attack, and they were already on to him. Funny how we, the people, were never warned.

      Have you been living in a Cave for the past year?

      You've never heard of Moussaoui? [washingtonpost.com]

    • For more background, see this [opinionjournal.com]. It's an opinion piece, but the facts in the case are indisputable. Long story short, they had good cause to search his PC before 9/11, but judges brainwashed by that other "PC" wouldn't allow it. The FBI was like "lemme, Lemme, LEMME" and then when 3000 people got killed the judge finally said "OK".

    • What kind of a Moron are you??? Moussaoui was already in custody on unrelated (immigration) charges in August - the month BEFORE the attack.

      He was just another illegal alien at the time - I'm sure he didn't come out and tell them "Oh, BTW I'm a terrorist". It wasn't until Sept 11th that the FBI and CIA took interest in him, and of course they already had his possessions (including said laptop) confiscated by then.

      Do a bit of homework before posting, will ya?
  • by Krellan ( 107440 ) <krellan@NOspAm.krellan.com> on Wednesday January 01, 2003 @08:18PM (#4995869) Homepage Journal
    I read the NIST document and noticed they mentioned a limitation of dd.

    When copying, dd only copies entire blocks. If there is an incomplete block of information remaining at the end of the disk, for example, dd will not copy that last block at all.

    Since dd defaults to a block size of 1024 bytes, and PC hard drives use a sector size of 512 bytes, this could happen. In this case, dd will not copy the final sector of the hard disk, as it is an incomplete block.

    Because of a stupid decision made decades ago, traditional PC hard disk addressing uses 63 sectors per track, not 64. Therefore, odd total numbers of sectors are common. Modern addressing does away with CHS and just numbers all sectors from 0 to the end of the disk (many millions, in most cases). Still, because of the legacy of having 63 sectors per track, many disks have an odd total number of sectors.

    It would be nice if dd had an option to correctly copy a partial block at the end of the source. If there is an incomplete block, it should simply copy one byte at a time until there are no more bytes to copy.

    This would be easy to add to dd. Has it been done already? If so, it should be documented. Making it the default behaviour might break existing applications, so have it as an option that is highly recommended.
    • by delta407 ( 518868 ) <slashdot@l[ ]jhax.com ['erf' in gap]> on Thursday January 02, 2003 @12:53AM (#4996837) Homepage
      (-1, Wrong)

      dd does copy incomplete blocks. Try this:
      $ dd if=/dev/random of=test bs=1 count=1023
      1023+0 records in
      1023+0 records out

      $ dd if=test of=test2 bs=512
      1+1 records in
      1+1 records out

      $ ls -l test2
      -rw-r--r-- 1 delta407 delta407 1023 Jan 1 22:50 test2
      See that? We created a 1023-byte file (test), and then dd'ed it to test2 with a block size of 512. Guess what? dd copied the file in its entirety, even though it didn't line up on a block boundary.
  • Am I stupid, or.... (Score:2, Informative)

    by BigBadBri ( 595126 )
    did I read in all the legal bullshit that all the FBI uses for verification is a CRC sum?

    It's easy to defeat CRC - just add empty space to the end of each file until you get the result you want. SHA-1 or MD5 is safe(ish), but a straight CRC is too easy to forge.

    I wouldn't trust these disk copies with a bargepole.
