Monday, November 5, 2012

PFIC 2012 Slides & Bsides DFW

Hello Reader,
                      With another presentation done, here are my slides from PFIC, where I again presented on Anti-Anti Forensics. It is similar to the presentation I gave at BSides DFW, but with more detail on the actual structure of $LogFile records.

Slides can be found here: Slides

We are getting close to the official release of ANJP (Advanced NTFS Journal Parser) as we write up our official blog post for the SANS blog. Until then, if you would like a copy of the free version 1 tool, please email me so I can get you going. Our goal is to give the community access to our research as quickly as possible!

I'm looking for conferences to spread the good word on journaled file system forensics for next year, so if you are looking for advanced content please let me know!

Tuesday, October 2, 2012

Updates and DFIR Conferences

Hello Readers,
                        I know I've been silent; our workload and conferences have kept me quite busy. Updates for you:

Book News
Computer Forensics, A Beginner's Guide is out to copy edit, or will be soon. I'm looking at an early Q1 2013 release to bookstores. I've been working on this book for way too long, but having a child while writing a book will do that.

Hacking Exposed: Computer Forensics, Third Edition: we just signed the contract for this one. Look for a new edition in 2014 with a lot of new content and new sections. We really want to not only keep this series relevant but also expand its scope from the US legal system to the world.

Conference News
I spoke at Derbycon this past weekend, but not on forensics. I spoke on running a successful red team, which draws on both my professional past and the work I do at the National Collegiate Cyber Defense Competition. People seemed to enjoy the content, and here are my slides!

My derbycon slides with notes!

I'll be speaking next at BSides DFW on November 3, 2012 on Anti-Anti Forensics. I won't be staying very long after, as I have to catch a plane to Utah, but I do plan to go to the movie screening the night before, so hopefully I'll see you there!

My last planned presentation of the year is at Paraben's Forensic Innovations Conference, so if you're going I hope to see you there. I'll be doing my Anti-Anti Forensics talk again, but this time I'll be doing a live demonstration of the updated tool we've been showing here on the blog, which leads me to my next update.

NTFS $Logfile Parser

After a good response from our beta testers, we are feeling confident that we've eliminated the bugs in what we are getting ready to release as version 1.0. In addition, we got some great fixes after testing our parser on the NIST CFReDS project's deleted file recovery test images. If you are looking to validate a new tool or test a current one, the NIST CFReDS images are great and well documented as a control.

We've decided to call the parser ANJP, Advanced NTFS Journal Parser, to have a clear and distinct acronym from anything else. We plan to expand our research into Ext3 and HFS+ after this, and will release AEJP and AHJP parsers at a later date, to capture what we believe is a vital piece of information missing from your examinations. There is a lot of research around Ext3/HFS+ regarding recovering deleted files from the journal, but we can't find much focus on mapping out file creations, timestamp changes or file renames, all of which are of particular interest to the DFIR community. Our plan is to expand our research so you can take advantage of all the data available to you.

So what will be in version 1.0?
  • Identification of deleted files with full metadata; in our testing on the NIST CFReDS images we recovered all deleted file records with full metadata.
  • Identification of files being created, with full metadata.
  • Identification of files being renamed with metadata before and after the rename.
  • Log2timeline output
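To illustrate the Log2timeline output, here is a minimal sketch of emitting a parsed journal event in the classic l2t CSV layout. The column list is the standard log2timeline CSV format of the era, but the event dictionary, its field names and the sample record are invented for illustration; this is not ANJP's actual output.

```python
import csv
import io

# Classic log2timeline CSV column layout; ANJP's real columns may differ.
L2T_FIELDS = ["date", "time", "timezone", "MACB", "source", "sourcetype",
              "type", "user", "host", "short", "desc", "version",
              "filename", "inode", "notes", "format", "extra"]

def logfile_event_to_l2t(event):
    """Map a hypothetical parsed $LogFile event dict onto an l2t CSV row."""
    row = dict.fromkeys(L2T_FIELDS, "-")
    row.update({
        "date": event["date"], "time": event["time"], "timezone": "UTC",
        "MACB": "B" if event["action"] == "created" else ".",
        "source": "LOG", "sourcetype": "NTFS $LogFile",
        "type": event["action"],
        "short": f'{event["name"]} {event["action"]}',
        "desc": f'MFT record {event["mft_record"]}: {event["name"]} {event["action"]}',
        "filename": event["name"], "inode": str(event["mft_record"]),
        "format": "ANJP",
    })
    return row

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=L2T_FIELDS)
writer.writeheader()
writer.writerow(logfile_event_to_l2t({
    "date": "10/02/2012", "time": "14:03:22", "action": "created",
    "name": "secret.docx", "mft_record": 41234,
}))
print(buf.getvalue())
```

A row in this shape drops straight into an existing l2t supertimeline alongside events from other sources.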

But Dave, what about all the other cool things you've mentioned? 
There is much more we can determine from the NTFS $LogFile, but we've realized that understanding it isn't as simple as just reading the CSV the tool outputs. We don't want to release a tool that becomes a source of false positives and bad testimony, so we are going to follow the viaForensics model (thanks for thinking this up, guys!). We are going to offer a one day training class that explains NTFS, the MFT and, most importantly, the $LogFile. That class will explain how to parse the log, the event records and the redo/undo operation codes, and how to stitch those together to find the information we provide in version 1.0.
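For the curious, the log record header can be sketched in a few lines. This is a simplified illustration based on publicly documented $LogFile offsets: real records sit inside RCRD pages and require update sequence array fixups to be applied first, which this sketch skips, and the synthetic record at the end is invented purely for demonstration.

```python
import struct

# Simplified layout of an NTFS $LogFile client log record header, through
# the redo/undo operation fields (0x00-0x3b), per public documentation.
LOG_RECORD_HDR = struct.Struct("<QQQIIIIH6sHHHHHH")

FIELDS = ("this_lsn", "client_previous_lsn", "client_undo_next_lsn",
          "client_data_length", "client_id", "record_type", "transaction_id",
          "flags", "_align", "redo_op", "undo_op",
          "redo_offset", "redo_length", "undo_offset", "undo_length")

def parse_log_record(buf, offset=0):
    """Return a dict of header fields for one log record."""
    rec = dict(zip(FIELDS, LOG_RECORD_HDR.unpack_from(buf, offset)))
    del rec["_align"]  # reserved/alignment bytes, not meaningful
    return rec

# Demo on a synthetic record: redo 0x0E / undo 0x0F would be an
# AddIndexEntryAllocation paired with DeleteIndexEntryAllocation.
raw = LOG_RECORD_HDR.pack(100, 90, 0, 40, 0, 1, 7, 0, b"\x00" * 6,
                          0x0E, 0x0F, 0x28, 16, 0x38, 16)
rec = parse_log_record(raw)
print(hex(rec["redo_op"]), hex(rec["undo_op"]))  # → 0xe 0xf
```

Walking records by following `client_previous_lsn` chains within a transaction is what lets you stitch individual operations back into a single file system event.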

Extending beyond that, we will then explain how to take the Update Sequence Arrays, timestamp changes and file/directory IDs and tie them back into the MFT: recovering resident files, identifying the approximate number of external drive changes, determining how many systems an external drive was plugged into, and making good, reliable conclusions from them for use in your casework. At the end of the class you'll get a copy of the super duper version 1.0 that gives you way more information, which you will then be qualified to draw opinions from. There won't be a dongle or a license or any other such thing. If you decide to give a copy to someone, we just hope they don't testify to its results without taking our class.

In the future, as we continue our research, we may be able to reduce the possibility for error in the additional evidence sources, and as we do we will update the publicly released tool to include them. Until then we think everyone is best served by this model, which gets the most reliable evidence into everyone's hands ASAP while giving those who want to go deeper a chance to do so.

I hope to have version 1.0 released in the next week or two and I'll be posting it here when I do.

If you are running a conference and want us to do the ANJP training at your event, let us know; we want to get as many people using this as possible! When you see all that we can determine from the $LogFile, we think you'll agree.

What conferences do you get the most from?
I am planning my 2013 conference schedule. I've asked Twitter, and now I want to ask you, the reader: what conferences do you get the most from? I'm planning on CEIC, PFIC and possibly Black Hat, but otherwise I want to hear your suggestions! Leave a comment and let's talk.

Sunday, August 12, 2012

Time to find a fancy hat, I'm speaking at Derbycon

Hello Reader,
                      I've been trying to find more conferences to speak at lately (if you are running a conference, let me know) to let more people know about fun forensic artifacts. I've been selected to speak at Derbycon 2.0, but not for a forensic topic this time (though I did submit one). Instead, the fine folks at Derbycon liked my topic titled 'How to run a successful red team'. If you've been following the blog, you'll know that once a year I lead the national red team for the National Collegiate Cyber Defense Competition, and have been doing so for 5 years now. We've learned a lot about how to build and succeed as a competition red team, and I thought it would be a good idea to share what we've learned.

So if you are going to be at Derbycon and want to either:
 a) have a beer with me and talk forensics or
 b) find out how to be a lethal red team full of 'I love it when a plan comes together' moments

Then I'll see you there! Let me know you read the blog if you don't mind; it's always nice to know someone is on the other side of the screen.

Wednesday, August 8, 2012

Updates and status

Hello Reader!,
                       It's been a while since we've talked. Things here at G-C have been pretty busy; the legal sector at least appears to be in a full recovery (knock on wood). While I haven't had time to write up a full blog post on some of the new things we've found over the summer, I did want to take the time to show you how our NTFS $LogFile parser is coming along. For those of you who attended my CEIC session on 'anti-anti forensics', or who downloaded the labs I posted afterwards, you know that we previously had a rough parser and tests to recover the names of wiped files.

I'm happy to say we've come a long way since then. The initial proof of concept parser served to validate the artifact and divide up the pieces into something we could then further understand. We now have a parser, still in development, that goes even further, creating CSVs of human readable data extracted from those $LogFile segments.

What does that mean? Well it means:
1. We can recover the names of deleted files and their metadata, even if they have been purged out of the MFT. This includes the metadata associated with the file (directory, creation, modification and access times).
2. We can recover the complete rename operation, showing cleanly which file became which, including the directory, creation, modification and access times before and after the operation. This essentially allows you to undo what a wiper has done (except for recovering the contents of the file itself).
3. We can determine if files were written to other drives, and approximately how many. (This is not in the current version of the parser and will require its own blog post.)
4. We can recover the original metadata of a file from when it was created.
5. We should be able to recover timestamps that have been altered.
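To give a flavor of how journal records can be turned into events like these, here is a simplified sketch. The operation code names come from public NTFS documentation, but these pairing rules are a rough illustrative approximation, not our parser's actual logic.

```python
# A few NTFS $LogFile operation codes, per public documentation.
OPS = {
    0x02: "InitializeFileRecordSegment",
    0x03: "DeallocateFileRecordSegment",
    0x0E: "AddIndexEntryAllocation",
    0x0F: "DeleteIndexEntryAllocation",
}

def classify(redo_op, undo_op):
    """Guess the file system event behind a redo/undo op code pair.
    Illustrative only; real analysis needs the surrounding transaction."""
    pair = (redo_op, undo_op)
    if pair == (0x02, 0x03):
        return "file record created"
    if pair == (0x03, 0x02):
        return "file record deleted"
    if pair == (0x0E, 0x0F):
        return "index entry added (a new or renamed name appears)"
    if pair == (0x0F, 0x0E):
        return "index entry removed (an old name disappears)"
    return "other"

print(classify(0x03, 0x02))  # → file record deleted
```

A rename, for instance, shows up as an index entry removal for the old name paired in the same transaction with an index entry addition for the new one, which is what lets us show cleanly which file became which.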

It's all written in Perl (woo!) and we are going to release the source and documentation as soon as it's ready (tm). In the meantime, check out this awesome screenshot showing the parser recovering the metadata from 22 files that were wiped with Eraser:

If you are in need of this tool for a case immediately, drop me a line and I'll see what we can do to help you out!

Friday, June 1, 2012

CEIC 2012 - Anti Anti Forensics Materials

Hello Possible CEIC Attendee,
           I always put my materials up after I give a presentation. This time, since I also made a couple of labs to show how to perform this type of investigation into identifying, detecting and recovering from anti forensic tools, I am including those as well. There are 3 labs making up 10 GB of data compressed. The images are E01 and the cases are saved in EnCase v7.04, since this was a Guidance Software conference. There is a lab manual for each lab in the root directory as well, to walk you through what you are expected to find.

I'm putting this up on a Dropbox account, as they are the only file hosting service I could find without a max file size limit (that you couldn't pay to increase).

All three labs here:

The ppt slides are here:

As I've said in the prior post, I'm more of a talker than a PowerPoint slide maker. So if you have questions based on the presentation/labs, please leave them in the comments below and I'll do my best to answer them.

Also, Lab 3 contains a preview of our $LogFile research that we will hopefully be presenting at Black Hat (please pick me, Black Hat review board).

If this type of lab download/review thing is popular with you readers, I can put up more and we can do a forensic challenge style of blogging for a bit!

Sunday, May 27, 2012

New Project, Tool testing

One of the advantages of running a computer forensics company is that I get to buy lots of tools to use. When I was working for other companies I would have to wait for budget cycles and submit justifications for tool purchases, but for the last 7 years I've been able to buy them as I needed them. In those 7 years we've accumulated a lot of tools that we use for different specializations, and a body of knowledge related to them that I feel could be better shared with all of you.

With that in mind, I think it would be interesting to see how all these tools compare when working on the same forensic image. So I'm going to start making some test images to see how data is interpreted from the same disk in different image formats. I am going to start with the identification, not recovery, of deleted files and go from there.

My initial tool list to test includes:

EnCase v. 7.04

FTK v. 4.01

SMART 3-26-12

X-Ways Forensics v. 16.5

SIFT v. 2.13

Any other tool you want us to test? Let me know in the comments below.

I'll post my results as we finish a round of tests, and as always, a large case could easily distract me!