Conference Presentations 2014

  • IASSIST 2014: Aligning Data and Research Infrastructure, Toronto
    Host Institutions: University of Toronto, Ryerson University, and York University

Posters (Thu, 2014-06-05)
Chair: Samantha Guss

  • Data management in the liberal arts: Current practices and attitudes at a Big 10 American University
    Alicia Hofelich Mohr (University of Minnesota)
    Thomas Lindsay (University of Minnesota)


    The diverse nature of liberal arts research makes identifying needs and providing support for data management a complex task. Attesting to this diversity, in our survey of 29 departments at Minnesota we found differing practices, attitudes, and awareness about managing data and research materials. Many respondents identified a need for data management support across the research lifecycle, with the largest needs in data security, preservation, and sharing. However, there are striking differences across the social sciences, arts, and humanities in attitudes and perceptions toward data management and what it entails, perhaps due to differing requirements and cultures in these fields. These findings demonstrate that a one-size-fits-all approach to supporting data management is not appropriate for a broadly diverse liberal arts college; rather, the services we develop should be sensitive to discipline-specific needs. To explore these possibilities further, we plan to administer our survey to other colleges within the University. As Minnesota is one of the few Big 10 universities to institutionally separate the liberal arts from the sciences, comparing our results with those of more discipline-specific colleges should allow us to evaluate the roles that institutional organization and disciplinary expectations may play in the emergence of data management needs and support.

  • Colectica for Excel: A free tool for increasing data accessibility using open standards
    Jeremy Iverson (Colectica)
    Dan Smith (Colectica)


    Traditionally, data in spreadsheets and plain-text formats lack rich documentation. Often, single-word column headers are the only hint given to data users, making it difficult to make sense of the data. Colectica for Microsoft Excel is a free tool for documenting spreadsheet data using DDI, the open standard for data documentation. With this Excel add-in, users can add extensive information about each column of data. Variables, code lists, and datasets can be globally identified and described in a standard format. This documentation is embedded within the spreadsheet, ensuring the information is available when data are shared. The add-in also adds support for SPSS and Stata formats to Excel: when opening an SPSS or Stata file in Excel, standard metadata is automatically created from the variable and value labels. Colectica for Excel can create print-ready reports based on the data documentation, and the information can also be exported to the DDI standard for ingest into other standards-based tools. This booth will include live demonstrations of the latest version of Colectica for Excel, showing how to document the contents of a spreadsheet, publish the information, and use the documentation to access data in an informed way.

  • A data librarian’s dream come true: Data access made effortless
    Jane Fry (Carleton University)
    Alexandra Cooper (Queen's University)


    A collaborative effort among the Ontario data community created ODESI (Ontario Data Documentation, Extraction Service and Infrastructure), a data portal. This new way of obtaining data has transformed access for researchers, empowering them by providing ready access to this vital data library service. This poster illustrates the cumbersome process required to obtain data 15 years ago and compares it with the quick, innovative way data are obtained today, freeing up the data librarian's time to educate researchers in the intricacies of data literacy.

  • Bringing data to the DANCE: Implementing a data acquisition model at the Federal Reserve Bank of Chicago
    Jim Obst (Federal Reserve Bank of Chicago)


    In August 2011, the Federal Reserve's librarians were asked to come together to manage purchased data acquisitions. The new initiative required one librarian from each of the 12 Reserve Banks and one from the Board of Governors to implement a common process for data procurement. It also formalized the use of a common data catalog. The overall goal was to avoid unnecessary duplication and to leverage purchasing power as a whole. In response to the mandate, the managing librarians of the Federal Reserve System formed a new, collaborative data work group made up of the 13 chosen data librarians. Together the group created and implemented innovative system-wide policies and workflows, including a unique online catalog. Each data librarian was in turn responsible for initiating new policies and workflows for the data management process within their own Reserve Bank. At the Chicago Fed, implementation of the new data acquisition regime has been successful, with local variations such as a District Data Management Group and educational initiatives about data use and procurement protocols. In the first two years, the need for collaborative tools has given way to workflows in SharePoint; other solutions are planned.

  • Data Seal of Approval
    Mary Vardigan (University of Michigan, ICPSR)


    The Data Seal of Approval (DSA) provides a mechanism for repositories to demonstrate their trustworthiness in a transparent way. The assessment criteria comprise 16 guidelines for which repositories supply evidence of compliance. Once the self-assessment is complete, a peer reviewer evaluates the evidence; if the repository is found to be in compliance with the 16 guidelines, the Data Seal of Approval is awarded and the seal is displayed on the repository's Web site. Twenty-four repositories have received the DSA so far, with more in the pipeline. This poster will provide more information about the DSA initiative to encourage new DSA applicants.

  • Data documentation and metadata use in research data management
    Christie Wiley (University of Illinois)


    The role of librarians in research data management has been a growing topic of discussion and participation within libraries and universities. Librarians and universities have formed many initiatives, committees, and groups to support data deposit and data management. The University of Illinois created an e-research implementation group to bring together subject specialists, research data librarians, and functional specialists to advance the library's data initiatives. To support the broader goal of educating others about the research data services available to them and offering tools to meet their data needs, a smaller group of librarians within the e-research implementation group began a project to update the data services website. This update provided information and education regarding the definition of data, intellectual property, data sharing, funding requirements, files and formats, and preservation and storage. This poster illustrates how a website can be used as an instructional model for data documentation and metadata, and aims to provide insight and a point of reference for librarians, researchers, data managers, curators, and scientists who meet individuals with data needs.

  • Deepening collaborative relationships in providing Research Data Management support
    Carol Perry (University of Guelph)
    Wayne Johnston (University of Guelph)


    This poster will trace new patterns of collaboration in establishing programs for research data management in a Canadian context. The University of Guelph Library Research Enterprise and Scholarly Communications team has broadened its relationships with other campus units as well as other institutions to strengthen training program development for graduate students and faculty. Our work includes partnership in a multi-university team creating training modules for graduate students in Ontario. We have worked with a provincial government ministry to create a data repository for agri-environmental research and are working with a non-profit group assisting in the development of a discipline-specific repository. These are just a few examples of initiatives we have undertaken over the past year.

  • Developing incentives for data stewardship and sharing: Library engagement beyond liaison relationships
    Heather Coates (Indiana University-Purdue University Indianapolis (IUPUI))
    Ted Polley (Indiana University-Purdue University Indianapolis (IUPUI))


    Many of the obstacles slowing the adoption of more democratic dissemination of scholarly products are cultural, not technological. While libraries have extended their technological capacity to new methods of dissemination, we have been less proactive in fostering the cultural change necessary for widespread adoption. The library profession has engaged particular groups of constituents and communities of practice, but personal contact between faculty and librarians at the institutional level is inconsistent and often hinges on liaison relationships. This poster will describe opportunities for librarians to engage with institutional units and research communities extending beyond institutional boundaries to advance incentives rewarding new forms of dissemination, including data as a valued community resource. Examples of relating changes in dissemination to various community missions will be provided.

  • Defining security requirements for a remote access system
    Katharina Kinder-Kurlanda (GESIS – Leibniz-Institute for the Social Sciences)
    Andreas Poller (GESIS – Leibniz-Institute for the Social Sciences)
    Philipp Holzinger (GESIS – Leibniz-Institute for the Social Sciences)
    Laura Kocksch (GESIS – Leibniz-Institute for the Social Sciences)
    Stefan Triller (GESIS – Leibniz-Institute for the Social Sciences)
    Sven Turpe (GESIS – Leibniz-Institute for the Social Sciences)


    This paper presents first results of the one-year project "Empirical Secure Software Engineering (ESSE)", which had two aims: (1) to define security requirements for a planned Secure Data Center remote access system at GESIS in Germany, and (2) to evaluate different threat modelling techniques. Such techniques are intended to assist software developers in defining and evaluating security risks for a system and in deducing the necessary requirements for its design, implementation, and operation. Using several different modelling techniques, a group of participating GESIS staff from various archiving and IT backgrounds generated a collection of threat models. We then interviewed participants about their viewpoints, aggregated the models, and discussed them in a group session. Through this process we defined security requirements and translated them into implementable technical and organizational security recommendations. Our approach also enabled us to evaluate the applied techniques' strengths and weaknesses. We will explain some of the security requirements we defined and show how our process made different stakeholders' viewpoints visible, supported meaningful discussion, and facilitated decision making. Our process can be useful for other archives looking for ways to define security requirements in the fields of archiving and data sharing.

  • Aila and Metka: FSD's new tools for the trade
    Matti Heinonen (Finnish Social Science Data Archive (FSD))
    Tuomas J. Alatera (Finnish Social Science Data Archive (FSD))


    This poster introduces Aila and Metka, two brand-new tools at the Finnish Social Science Data Archive (FSD). Aila is FSD's new web-based customer service portal and will be our main tool for data dissemination. Aila allows customers to search, browse, and review our data descriptions at the study and variable level and, after registration, to download data directly from the service portal. We will demonstrate the functionality of the portal and share the experiences gained during the building phase and the first few months of operation. Metka will be FSD's tool for managing metadata. Metadata will be entered into Metka, which in turn will feed other systems at FSD (e.g., Aila). This will greatly simplify building services based on our rich metadata, as it allows repurposing metadata from a single authoritative source. We will present Metka's features and show how it connects to FSD's services and other systems. The system will be operational in January 2015. Metka is open source and can be obtained from GitHub. With Aila and Metka, we have defined the software platform FSD will use to build new tools for the archive and services for our customers. For example, Shibboleth will be used consistently for user authentication.
