Research Article

Automated Assessment of Secure Search Systems

Published: 20 January 2015

Abstract

This work presents the results of a three-year project that assessed nine different privacy-preserving data search systems. We detail the design of a software assessment framework that focuses on low system footprint, repeatability, and reusability. A unique achievement of this project was the automation and integration of the entire test process, from the production and execution of tests to the generation of human-readable evaluation reports. We synthesize our experiences into a set of simple mantras that we recommend following in the design of any assessment framework.
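The abstract describes an end-to-end automated pipeline that produces tests, executes them against each system under assessment, and renders evaluation reports. The sketch below is only a rough illustration of that generate-execute-report flow; the names (TestCase, generate_tests, run_test, write_report) and the stubbed query cases are hypothetical and are not drawn from the project's actual framework.

```python
"""Minimal sketch of a generate -> execute -> report assessment pipeline.
All names and test cases here are illustrative placeholders, not the
framework described in the paper."""
from dataclasses import dataclass
import json
import time


@dataclass
class TestCase:
    name: str
    query: str          # query submitted to the system under test
    expected_rows: int  # ground-truth result size


def generate_tests() -> list[TestCase]:
    # Placeholder generator; a real harness would derive cases from a
    # dataset and a query grammar rather than hard-coding them.
    return [
        TestCase("equality-match", "SELECT * WHERE surname = 'SMITH'", 42),
        TestCase("range-match", "SELECT * WHERE age BETWEEN 30 AND 40", 17),
    ]


def run_test(case: TestCase) -> dict:
    # Placeholder executor; a real harness would drive the system under
    # test and check both latency and correctness of the returned rows.
    start = time.monotonic()
    observed_rows = case.expected_rows  # stub: assume a correct answer
    elapsed = time.monotonic() - start
    return {
        "test": case.name,
        "passed": observed_rows == case.expected_rows,
        "latency_s": round(elapsed, 6),
    }


def write_report(results: list[dict], path: str = "report.json") -> None:
    # Emit a machine-readable summary; a human-readable report could be
    # rendered from the same data.
    with open(path, "w") as fh:
        json.dump({"total": len(results),
                   "failures": sum(not r["passed"] for r in results),
                   "results": results}, fh, indent=2)


if __name__ == "__main__":
    results = [run_test(case) for case in generate_tests()]
    write_report(results)
```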
