
About this Book

This book contains a selection of thoroughly refereed and revised papers from the Third International ICST Conference on Digital Forensics and Cyber Crime, ICDF2C 2011, held October 26-28 in Dublin, Ireland. The field of digital forensics is becoming increasingly important for law enforcement, network security, and information assurance. It is a multidisciplinary area that encompasses a number of fields, including law, computer science, finance, networking, data mining, and criminal justice. The 24 papers in this volume cover a variety of topics ranging from tactics of cyber crime investigations to digital forensic education, network forensics, and the use of formal methods in digital investigations. There is a large section addressing forensics of mobile digital devices.

Table of Contents

Frontmatter

Cybercrime Investigations

The Role of Perception in Age Estimation

Law enforcement is increasingly called upon to investigate child exploitation crimes, a task that involves estimating the age of depicted children. There is limited research into our ability to distinguish adults from children, and more specifically to estimate the age of a child, based upon an image. Few training programs are available, and there is no uniform methodology for child age estimation. A more stable foundation can be found through input from multidisciplinary fields in science and art. The results of surveys and a review of multidisciplinary literature indicate that the human ability to perceive the difference between juvenile and adult is not just a matter of common sense, but a hardwired, preconscious condition of human experience based upon perceptual cues, and further indicate a normative ability to make reasonably accurate age estimations based upon facial features and proportions when provided with an evaluative framework.

Cynthia A. Murphy

Internet Child Pornography, U.S. Sentencing Guidelines, and the Role of Internet Service Providers

The following review provides a historical recap of the United States' response to child pornography as it relates to the ever-evolving technological world. Specifically, a review of child pornography laws at the federal level, as well as the sentencing guidelines, reveals the delicate balance between criminalizing child pornography and upholding the United States Constitution. In addition, a discussion of the role of Internet Service Providers exposes a trend toward using the same technology that has fueled the proliferation of the child pornography industry to identify and censor illegal content on the Internet. Finally, the strengths and weaknesses of the current laws and regulation tactics, as well as suggested amendments, are discussed.

Kathryn C. Seigfried-Spellar, Gary R. Bertoline, Marcus K. Rogers

Law Enforcement 2.0: Regulating the Lawful Interception of Social Media

Lawful interception (LI) has evolved over the past few decades from target-based monitoring and interception of telecommunication conversations to the monitoring and interception of packet-switched (IP) communications. In spite of this evolution, however, the nature of the communication has remained linear: the initiator communicates with one recipient or a number of recipients. Initially, with telecommunications, all of the participants in a call were online, i.e., active participants at the time of the call; with the introduction of packet-switched (IP) traffic, some of the interaction between participants became turn-based, with recipients receiving the information from the initiator after an interval. Notwithstanding spam, the participants more often than not opted to receive the information.

Esti Peshin

Mobile Device Forensics

All Bot Net: A Need for Smartphone P2P Awareness

This paper is a call for law enforcement and other members of the digital forensic community to be aware of smartphones connecting to Peer-to-Peer networks. This paper also offers a review of botnet concepts and research surrounding smartphone malware.

Kelly A. Cole, Ramindu L. Silva, Richard P. Mislan

Results of Field Testing Mobile Phone Shielding Devices

This paper is based on the authors' thesis research. Mobile phones are increasingly a source of evidence in criminal investigations. The evidence on a phone is volatile and can easily be overwritten or deleted. Many devices claim to radio-isolate a phone in order to preserve evidence, yet despite the escalating importance of mobile phone forensics, little research has been published on how well these devices work in the field. The purpose of this study was to identify situations in which the devices used to protect evidence on mobile phones can fail. The devices were tested using mobile phones from three of the largest service providers in the U.S. Calls were made to the isolated phones using voice, SMS, and MMS at varying distances from the providers' towers. In the majority of the test cases, the phones were not isolated from their networks.

Eric Katz, Richard Mislan, Marcus Rogers, Anthony Smith

Windows Phone 7 from a Digital Forensics’ Perspective

Windows Phone 7 is a new smartphone operating system with the potential to become one of the major smartphone platforms in the near future. Phones based on Windows Phone 7 have only been available for a few months, so digital forensics of the new system is still in its infancy. This paper is a first look at Windows Phone 7 from a forensic perspective. It explains the main characteristics of the platform, the problems that forensic investigators face, methods to circumvent those problems, and a set of tools to get data from the phone. Data that can be acquired include the file system, the registry, and active tasks. Based on the file system, further information such as SMS messages, emails, and Facebook data can be extracted.

Thomas Schaefer, Hans Höfken, Marko Schuba

An Agent Based Tool for Windows Mobile Forensics

Mobile devices are very common in everyday life, and nowadays such devices offer many of the features of a desktop or laptop computer, so people can use them for diverse applications. Because the acceptability and usability of such devices are very high, there is a chance that they will be used for illegal activities, and the percentage of mobile phones and smartphones involved in cyber crimes is rising. It therefore becomes necessary to analyze such devices digitally using cyber forensics tools. This paper discusses the different types of digital evidence present in Microsoft's Windows Mobile smartphones and an agent-based approach for logically acquiring such devices. It also describes a tool developed for forensically acquiring and analyzing Windows Mobile devices and WinCE PDAs.

S. Satheesh Kumar, Bibin Thomas, K. L. Thomas

Forensic Extractions of Data from the Nokia N900

The Nokia N900 is a very powerful smartphone and offers great utility to users. As smartphones contain a wealth of information about the user, including the user's contacts, communications, and activities, investigators must have at their disposal the best possible methods for extracting important data from them. Unlike for other smartphones, knowledge of forensic acquisition from the N900 is extremely limited. Extractions of data from the N900 are categorized into limited triage extractions and full physical extractions. The imaging process of the phone, necessary for a full investigation, is explained. The types of data called for in a limited data extraction are identified, and the locations of these files on the N900 are detailed. A script was also created that can be used for a limited data extraction from a Nokia N900.
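A limited triage extraction of this kind can be sketched as a short script. The artifact paths below are illustrative assumptions about Maemo 5 file locations and should be verified against the device; this is not the authors' actual script, and the helper name triage_extract is hypothetical.

```python
import hashlib
import shutil
from pathlib import Path

# Illustrative Maemo 5 artifact locations (assumptions, verify on-device).
TRIAGE_PATHS = [
    "home/user/.rtcom-eventlogger/el.db",        # call and SMS event log
    "home/user/.osso-abook/db",                  # contacts database
    "home/user/.mozilla/microb/places.sqlite",   # browser history
]

def triage_extract(mount_point, dest):
    """Copy a fixed list of artifact files from a mounted N900 image and
    record an MD5 of each copy for integrity documentation."""
    dest = Path(dest)
    dest.mkdir(parents=True, exist_ok=True)
    manifest = {}
    for rel in TRIAGE_PATHS:
        src = Path(mount_point) / rel
        if not src.exists():          # skip artifacts absent on this device
            continue
        target = dest / src.name
        shutil.copy2(src, target)     # preserve timestamps where possible
        manifest[rel] = hashlib.md5(target.read_bytes()).hexdigest()
    return manifest
```

The returned manifest maps each extracted artifact path to the hash of its copy, which can be recorded in the examination notes.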

Mark Lohrum

New Developments in Digital Forensics

A Strategy for Testing Metadata Based Deleted File Recovery Tools

Deleted file recovery tools use residual metadata left behind after files are deleted to reconstruct deleted files. File systems use metadata to keep track of the location of user files, time stamps of file activity, file ownership and file access permissions. When a file is deleted, many file systems do not actually remove the file content, but mark the file blocks as available for reuse by future file allocations. This paper describes a strategy for testing forensic tools that recover deleted files from the residual metadata that can be found after a file has been deleted.
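The principle can be illustrated with a minimal sketch, assuming a FAT-style 32-byte directory entry where a deleted file is marked by overwriting the first name byte with 0xE5. This is a toy example of the class of tool being tested, not one of the tools under test.

```python
import struct

DELETED_MARKER = 0xE5  # first name byte of a deleted FAT directory entry

def parse_fat_entry(entry: bytes):
    """Parse a 32-byte FAT directory entry and report its residual metadata.

    Returns None for unused entries. For deleted entries the first name
    character is lost (overwritten by the 0xE5 marker)."""
    if len(entry) != 32 or entry[0] == 0x00:
        return None
    name = entry[0:8].rstrip(b" ")
    ext = entry[8:11].rstrip(b" ")
    deleted = entry[0] == DELETED_MARKER
    if deleted:
        name = b"?" + name[1:]                      # first char unrecoverable
    start_cluster = struct.unpack_from("<H", entry, 26)[0]
    size = struct.unpack_from("<I", entry, 28)[0]
    return {
        "name": (name + (b"." + ext if ext else b"")).decode("ascii", "replace"),
        "deleted": deleted,
        "start_cluster": start_cluster,
        "size": size,
    }

def recover_content(image: bytes, meta, cluster_size=512, data_offset=0):
    """Naive recovery: read `size` bytes starting at the first cluster.

    Correct only if the file was contiguous and its clusters have not been
    reallocated -- exactly the failure modes a test strategy must probe."""
    start = data_offset + (meta["start_cluster"] - 2) * cluster_size
    return image[start:start + meta["size"]]
```

The two failure conditions noted in recover_content (fragmentation, reallocation) are the kind of ground-truth variations a tool-testing strategy would deliberately construct.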

James R. Lyle

Finding Anomalous and Suspicious Files from Directory Metadata on a Large Corpus

We describe a tool, Dirim, for automatically finding files on a drive that are anomalous or suspicious, and thus worthy of focus during a digital-forensic investigation, based solely on their directory information. Anomalies are found both by comparing overall drive statistics and by comparing clusters of related files using a novel approach of "superclustering" of clusters. Suspicious-file detection looks for a set of specific clues. We discuss the results of experiments conducted on a representative corpus of 1467 drive images, in which we did find interesting anomalies but not much deception (as expected given the corpus). Cluster comparison performed best at providing useful information for an investigator, but the other methods provided unique additional information, albeit with a significant number of false alarms.
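A toy version of metadata-only anomaly detection conveys the idea: profile files by extension, flag statistical size outliers, and check a specific clue (a double extension). This is an illustrative simplification, not the Dirim tool; the function names and the single clue are assumptions.

```python
import statistics
from collections import defaultdict

def extension_profiles(files):
    """Group (path, size) pairs by extension; return per-extension
    (mean size, population std dev)."""
    groups = defaultdict(list)
    for path, size in files:
        ext = path.rsplit(".", 1)[-1].lower() if "." in path else ""
        groups[ext].append(size)
    return {ext: (statistics.mean(s), statistics.pstdev(s))
            for ext, s in groups.items()}

def anomalies(files, threshold=3.0):
    """Flag files whose size deviates strongly from the norm for their
    extension, plus a simple clue-based check for double extensions."""
    profiles = extension_profiles(files)
    flagged = []
    for path, size in files:
        ext = path.rsplit(".", 1)[-1].lower() if "." in path else ""
        mean, dev = profiles[ext]
        if dev > 0 and abs(size - mean) / dev > threshold:
            flagged.append((path, "size outlier"))
        name_parts = path.rsplit("/", 1)[-1].split(".")
        if len(name_parts) > 2 and name_parts[-1] in {"exe", "scr"}:
            flagged.append((path, "double extension"))
    return flagged
```

As the abstract's results suggest, clue checks of this kind are cheap but prone to false alarms; statistical comparison against peer files gives the investigator more context.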

Neil C. Rowe, Simson L. Garfinkel

A Novel Methodology for Malware Intrusion Attack Path Reconstruction

After an intrusion has propagated between hosts, or even between networks, determining the propagation path is critical to assess exploited network vulnerabilities, and also to determine the vector and intent of the initial intrusion. This work proposes a novel method for malware intrusion attack path reconstruction that extends post-mortem system state comparison methods with network-level correlation and timeline analysis. This work shows that intrusion-related events can be reconstructed at the host level and correlated between related hosts and networks to reconstruct the overall path of an attack. A case study is given that demonstrates the applicability of the attack path reconstruction technique.
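The network-level correlation step can be sketched naively: merge per-host event timelines into one stream, then order hosts by the first appearance of an intrusion indicator. This is an illustrative simplification of timeline correlation, not the authors' reconstruction method; the indicator string and function names are assumptions.

```python
def merge_timelines(host_events):
    """Merge per-host event lists into one chronological, host-tagged timeline.

    host_events maps host name -> list of (timestamp, event) pairs."""
    timeline = [(ts, host, event)
                for host, events in host_events.items()
                for ts, event in events]
    return sorted(timeline)  # sort by timestamp, then host, then event

def propagation_path(timeline, indicator="malware dropped"):
    """Order hosts by the first occurrence of an intrusion indicator,
    a crude stand-in for correlating host-level reconstructions."""
    path = []
    for ts, host, event in timeline:
        if indicator in event and host not in path:
            path.append(host)
    return path
```

A real reconstruction must also handle clock skew between hosts and indicators that differ per host, which is where the post-mortem state comparison in the paper comes in.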

Ahmed F. Shosha, Joshua I. James, Pavel Gladyshev

Performance Issues About Context-Triggered Piecewise Hashing

A hash function is a well-known method in computer science for mapping arbitrarily large data to bit strings of a fixed, short length. This property is used in computer forensics to identify known files based on their hash values. Today, hash values of files are generated in a preprocessing step and stored in a database; typically a cryptographic hash function like MD5 or SHA-1 is used. Later, the investigator computes hash values of the files he finds on a storage medium and performs lookups in his database. Due to the security properties of cryptographic hash functions, they cannot be used to identify similar files. Jesse Kornblum therefore proposed a similarity-preserving hash function to identify similar files. This paper discusses the efficiency of Kornblum's approach. We present some enhancements that increase the performance of his algorithm by 55% when applied to a real-life scenario. Furthermore, we discuss some characteristics of a sample Windows XP system that are relevant to the performance of Kornblum's approach.
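The core idea of context-triggered piecewise hashing can be shown with a toy version: a rolling hash over a small window decides piece boundaries, and each piece contributes one signature character, so files that differ only locally still share most of their signature. The rolling hash, FNV piece hash, and similarity measure below are simplified stand-ins, not ssdeep's actual algorithm.

```python
import difflib

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/"
WINDOW = 7

def ctph_signature(data: bytes, block_size: int = 8) -> str:
    """Toy context-triggered piecewise hash: trigger a piece boundary when
    the rolling hash hits a magic value, emit one character per piece."""
    sig = []
    piece_hash = 0x811C9DC5                       # FNV-1a offset basis
    for i, byte in enumerate(data):
        piece_hash = ((piece_hash ^ byte) * 0x01000193) & 0xFFFFFFFF
        rolling = sum(data[max(0, i - WINDOW + 1):i + 1])  # toy rolling hash
        if rolling % block_size == block_size - 1:         # boundary trigger
            sig.append(ALPHABET[piece_hash % 64])
            piece_hash = 0x811C9DC5               # reset for next piece
    sig.append(ALPHABET[piece_hash % 64])         # final partial piece
    return "".join(sig)

def similarity(a: str, b: str) -> float:
    """Crude signature similarity in [0, 1]."""
    return difflib.SequenceMatcher(None, a, b).ratio()
```

Because the piece hash resets at every trigger and the rolling hash depends only on the last few bytes, a local edit perturbs only nearby signature characters, which is the property Kornblum's scheme exploits; performance depends heavily on how often the trigger fires, which is what the paper's enhancements target.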

Frank Breitinger, Harald Baier

Short Papers

Formal Parameterization of Log Synchronization Events within a Distributed Forensic Compute Cloud Database Environment

Advances in virtual server internetworking and the Internet have been accompanied by increased incidences of computer-related crimes in these domains. At the same time, the number of sources of potential evidence in any particular cloud computing forensic investigation has grown considerably, as evidence of relevant events can potentially be drawn not only from multiple computers, networks, and electronic systems but also from disparate personal, organizational, and governmental contexts. This can lead to significant improvements in forensic outcomes, but it is accompanied by an increase in the complexity and scale of the event information, particularly since such information is treated as composite events. For digital investigators to effectively administer virtual machine (VM) environments, automated methods for correlating and synchronizing such event data are a critical concern. The contribution of this paper is a university case study of our ongoing work to automate the detection of a computer forensic scenario for virtual network server clouds, based upon facts derived from digital events synchronized within the VM environment. We use our preliminary case evaluations to present the formal parameterized context in which such VM log events are likely to occur, based on the event-condition-action (ECA) paradigm adopted from work done in [16][19].
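The ECA paradigm over synchronized VM logs might be sketched as follows: correct each node's clock skew, merge the logs into one ordered stream, then fire event-condition-action rules over it. The skew correction and the single brute-force rule are illustrative assumptions, not the paper's formal parameterization.

```python
def synchronize(logs, skew):
    """Merge per-node VM logs into one ordered stream after subtracting
    each node's known clock skew (seconds).

    logs maps node -> list of (timestamp, message); skew maps node -> offset."""
    corrected = [(ts - skew.get(node, 0), node, msg)
                 for node, entries in logs.items()
                 for ts, msg in entries]
    return sorted(corrected)

# ECA rules: (event predicate, condition over merged history, action label).
ECA_RULES = [
    (lambda e: "login failed" in e[2],
     lambda hist, e: sum("login failed" in h[2] and h[1] == e[1]
                         and e[0] - h[0] <= 60 for h in hist) >= 3,
     "raise_bruteforce_alert"),
]

def apply_rules(stream):
    """Walk the synchronized stream; when an event matches a rule's event
    predicate and its condition holds over the history, emit the action."""
    hist, actions = [], []
    for e in stream:
        hist.append(e)
        for event_pred, cond, action in ECA_RULES:
            if event_pred(e) and cond(hist, e):
                actions.append((action, e))
    return actions
```

The point of synchronizing first is visible in the rule's condition: the 60-second window is only meaningful once all nodes' timestamps are on a common clock.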

Sean Thorpe, Indrakshi Ray, Indrajit Ray, Tyrone Grandison, Abbie Barbir, Robert France

Yahoo! Messenger Forensics on Windows Vista and Windows 7

The purpose of this study is to identify several areas of forensic significance within the Yahoo! Messenger application. The study focuses on new areas of interest within the file structures of Windows Vista and Windows 7. One of the main issues with this topic is that little research has previously been conducted on the new Windows platforms; prior research documents the evidence found on older file structures, such as Windows XP, and on outdated versions of Yahoo! Messenger. Several differences were found in Yahoo! Messenger's registry keys and directory structure on Windows Vista and Windows 7 as compared to Windows XP.

Matthew Levendoski, Tejashree Datar, Marcus Rogers

Robust Hashing for Efficient Forensic Analysis of Image Sets

Forensic analysis of image sets today is most often done with the help of cryptographic hashes, due to their efficiency, their integration into forensic tools, and their excellent reliability with respect to false detection alarms. A drawback of these hash methods is their fragility to any image processing operation: even a simple re-compression with JPEG results in an image that can no longer be detected. A different approach is to apply image identification methods, which identify illegal images by means of, e.g., semantic models or face detection algorithms. Their common drawbacks are high computational complexity and significant false alarm rates. Robust hashing is a well-known approach sharing characteristics of both cryptographic hashes and image identification methods: it is fast, robust to common image processing, and features low false alarm rates. To verify its usability in forensic evaluation, in this work we discuss and evaluate the behavior of an optimized block-based hash.
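A minimal block-based robust hash illustrates the idea, though not the optimized hash evaluated in the paper: split a grayscale image into cells, set one bit per cell depending on whether its mean brightness exceeds the median of all cell means, and compare hashes by Hamming distance.

```python
def block_hash(pixels, blocks=8):
    """Toy robust hash: one bit per cell of a blocks x blocks grid,
    bit = cell mean brightness > median of all cell means."""
    h, w = len(pixels), len(pixels[0])
    means = []
    for by in range(blocks):
        for bx in range(blocks):
            cell = [pixels[y][x]
                    for y in range(by * h // blocks, (by + 1) * h // blocks)
                    for x in range(bx * w // blocks, (bx + 1) * w // blocks)]
            means.append(sum(cell) / len(cell))
    median = sorted(means)[len(means) // 2]
    return [1 if m > median else 0 for m in means]

def hamming(h1, h2):
    """Number of differing bits; a small distance suggests the same image."""
    return sum(a != b for a, b in zip(h1, h2))
```

Comparing against the median (rather than a fixed threshold) is what makes the hash survive global brightness changes and mild re-compression, while semantically different images still land far apart in Hamming distance.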

Martin Steinebach

Tracking User Activity on Personal Computers

The combination of low-cost digital storage and the tendency of the average computer user to keep files long after they have ceased to be useful has created such large stores of data on computer systems that the cost and time of even a preliminary examination pose new technical and operational challenges for forensic investigations. Popular operating systems for personal computers do not inherently provide services that track the user's activity in a way that would allow a simple personal audit, letting users see what they were doing, when they did it, and how long they spent on each activity. Such audit trails would assist forensic investigations in building timelines of activity so that suspects could be quickly eliminated (or not) from an investigation. This paper gives some insight into the advantages of a user activity tracking system and explores the difficulties in developing a generic third-party solution.

Anthony Keane, Stephen O’Shaughnessy

Digital Forensics Techniques

The Forensic Value of the Windows 7 Jump List

The Windows 7 Jump List is an aspect of the Windows 7 operating system that has the potential to contain data and artifacts of great interest to investigators, but it has yet to receive considerable attention or research. As of this writing, only one published work mentions jump lists at all, and no tools exist to automate their retrieval and analysis. The goal of this research is to provide an overview of the function and behavior of jump lists, and to examine their structure with the intention of proposing further research into using them in a forensic capacity.

Alexander G. Barnett

Finding Forensic Information on Creating a Folder in $LogFile of NTFS

The NTFS journaling file ($LogFile) is used to keep the file system clean in the event of a system crash or power failure. The log records operate on files or folders and leave large amounts of information in the $LogFile. This information can be used to reconstruct operations and can also serve as forensic evidence. In this research, we present methods for collecting forensic evidence of the timestamps and folder names relating to a folder's creation. Among the log records related to creating a folder, four records carrying timestamp and folder name information, with Redo/Undo operation codes 0x0E/0x0F, 0x02/0x00, 0x08/0x00, and 0x14/0x14, were analyzed. Unfortunately, the structure of the $LogFile is not well known or documented, so the researchers used reverse engineering to gain a better understanding of the log record structures. The study found that, using basic information contained in the $LogFile, a forensic reconstruction of timestamp events could be created.
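Since the real $LogFile record format is undocumented and was recovered here by reverse engineering, the sketch below parses a deliberately simplified, hypothetical record layout purely to illustrate filtering on the Redo/Undo code pairs listed above; it does not reflect the actual on-disk structure.

```python
import struct

# Hypothetical simplified layout (NOT the real $LogFile format):
#   <redo_op:2><undo_op:2><timestamp:8><name_len:2><name:utf-16-le>
FOLDER_CREATION_OPS = {(0x0E, 0x0F), (0x02, 0x00), (0x08, 0x00), (0x14, 0x14)}

def parse_record(buf, offset=0):
    """Parse one simplified log record at the given offset."""
    redo, undo, ts, name_len = struct.unpack_from("<HHQH", buf, offset)
    name = buf[offset + 14:offset + 14 + name_len].decode("utf-16-le")
    return {"ops": (redo, undo), "timestamp": ts, "name": name}

def folder_creation_events(buf):
    """Walk concatenated records, keeping those whose Redo/Undo op pair
    matches one of the folder-creation combinations."""
    events, offset = [], 0
    while offset + 14 <= len(buf):
        rec = parse_record(buf, offset)
        if rec["ops"] in FOLDER_CREATION_OPS:
            events.append(rec)
        offset += 14 + len(rec["name"]) * 2   # advance past name bytes
    return events
```

The filtering step mirrors the paper's method: out of the many records a folder creation produces, only the four op-code combinations above carry the timestamp and folder name needed for reconstruction.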

Gyu-Sang Cho, Marcus K. Rogers

Rescuing Digital Data from Submerged HDD

As an increasing number of personal computers become involved in criminal cases, the ability to extract crucial digital data from these devices is growing ever more important. In some criminal cases, digital devices are found in mud, water, or fire. Because such devices may store essential information that could contribute to solving a particular case, it is usually necessary to retrieve the data contained in the storage installed in personal computers by any means available, however damaged the hardware may appear to be. This study reports a best practice drawn from our successful experiment in extracting digital data from a damaged hard disk drive. The result is expected to help digital forensic practitioners deal effectively with similar cases in difficult situations.

Toshinobu Yasuhira, Kazuhiro Nishimura, Tomofumi Koida

Digital Forensics Education

Evaluating the Forensic Image Generator Generator

The Forensic Image Generator Generator (Forensig²) is a system for producing file system images for training in forensic computing. We report experiences from using Forensig² in a course on forensic computing. Apart from revealing the pitfalls of using artificially generated images in class, we argue that such images can be used to quantify the difficulty of an analysis problem and, in turn, help to understand misinterpretation issues in practice.

Christian Moch, Felix C. Freiling

Internet and Network Investigations


Formal Methods of Digital Forensics

A Forensic Framework for Incident Analysis Applied to the Insider Threat

We require a holistic forensic framework to analyze incidents within their complete context. Our framework organizes incidents into their main stages of access, use, and outcome to aid incident analysis, influenced by Howard and Longstaff's security incident classification. We also use eight incident questions, extending the six from Zachman's framework, to pose questions about the entire incident and each individual stage. Incident analysis using stage decomposition is combined with our three-layer incident architecture, comprising the social, logical, and physical levels, to analyze incidents in their entirety, including human and physical factors, rather than from a technical viewpoint alone. We illustrate the conjunction of our multilayered architectural structure and incident classification system with an insider threat case study, showing clearly the questions that must be answered to organize a successful investigation. The process of investigating extant incidents also applies to proactive analysis to avoid damaging incidents.

Clive Blackwell

Reasoning About a Simulated Printer Case Investigation with Forensic Lucid

In this work we model the ACME (a fictitious company name) "printer case incident" and specify it in Forensic Lucid, a Lucid- and intensional-logic-based programming language for cyberforensic analysis and event reconstruction specification. The printer case involves a dispute between two parties that was previously solved using a finite-state automata (FSA) approach and is now redone, in a more usable way, in Forensic Lucid. Our approach models the case by encoding concepts such as evidence and the related witness accounts as an evidential statement context in a Forensic Lucid "program". The evidential statement is an input to the transition function that models the possible deductions in the case. We then invoke the transition function (actually its reverse) with the evidential statement context to see whether the evidence we encoded agrees with a party's claims, and then attempt to reconstruct the sequence of events that may explain or disprove the claim.
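Absent a Forensic Lucid interpreter, the reverse-transition idea can be approximated by brute force: enumerate candidate event sequences and keep those whose resulting state is consistent with the observed evidence. The toy spool-slot model below is an illustrative stand-in for the actual printer case, and all names in it are hypothetical.

```python
from itertools import product

def reconstruct(initial, events, transition, consistent, max_len=4):
    """Enumerate event sequences up to max_len and keep those whose final
    state satisfies the evidence predicate -- a brute-force stand-in for
    inverting the transition function."""
    explanations = []
    for n in range(max_len + 1):
        for seq in product(events, repeat=n):
            state = initial
            for e in seq:
                state = transition(state, e)
            if consistent(state):
                explanations.append(seq)
    return explanations

def transition(state, event):
    """Toy printer model: state = (current job owner or None, residue of
    the last deleted job). Printing into a busy slot is ignored."""
    slot, residue = state
    if event.endswith("_print") and slot is None:
        return (event.split("_")[0], residue)   # job enters the empty slot
    if event == "take" and slot is not None:
        return (None, slot)                     # job taken, residue remains
    return state
```

Running the search against evidence such as "the slot is empty but bears the residue of bob's job" returns every event sequence that explains the finding, and excludes sequences that contradict it.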

Serguei A. Mokhov, Joey Paquet, Mourad Debbabi

Backmatter
