Category: Self Evaluation

Event Report: “Digital Notebooks – productivity tools for researchers” on 15.03.2018


Author: Yasemin Türkyilmaz-van der Velden

This report is also available in a pdf version on the Open Science Framework:

On the 15th and 16th of March 2018, two events dedicated to Electronic Lab Notebooks (ELNs) took place at TU Delft Library: “Digital Notebooks – productivity tools for researchers” and “Digital Notebooks – how to provide solutions for researchers?”. The events were organized by Research Data Services, TU Delft Library. Both attracted a lot of attention nationally and internationally, and the tickets quickly sold out. We were very happy to see the level of interest in these events and the inspiring discussions initiated by the participants. During my PhD study in molecular biology and genetics, I always felt the need for a digital tool to manage my research data. Now, as the Data Steward at the TU Delft Faculties of Applied Sciences and Mechanical, Maritime and Materials Engineering, my responsibility is to address the data management needs of the researchers at these faculties. It was therefore especially interesting for me to join these events and explore the currently available tools. Below is a report of the first day.

The need for digital notebooks

Many academic researchers use paper notebooks to document all sorts of experimental details, ranging from date, purpose, methodology and raw/analyzed data to conclusions. The main problem with paper-based notebooks is that they are not searchable, especially considering that each researcher typically leaves behind a shelf full of them. As a result, it often becomes very difficult to find the results and details of experiments performed by previous lab members, or even just to read and understand the related handwritten notes. Moreover, a paper notebook can usually hold only a printed copy of the finalized dataset, which is not reusable. Furthermore, in a paper notebook it is impossible to directly link the experimental details to the raw, intermediate and final datasets, which are mostly digital. Together, these issues not only decrease research efficiency but also present challenges to research reproducibility, a particularly important concern in the light of the current reproducibility crisis in science.

Digital notebooks provide a searchable alternative to traditional paper-based notebooks, and additionally offer many time-saving integrations – with various cloud storage platforms, calendars and project management tools.

Digital Notebooks – productivity tools for researchers on 15th of March

This full-day event was aimed at researchers, students, and supervisors who are interested in making their research digital, and research support staff who want to learn more about ELNs and how ELNs could meet the needs of researchers. All of the presentations from this event can be found here: DOI 10.5281/zenodo.1247390.


Image from the presentation by Esther Maes

Esther Maes from TU Delft Library opened the event by stressing the importance of archiving, which is required not only to minimize the risk of losing data but also to avoid fraud. She continued by asking intriguing questions: “What happens when you leave? How can people access the correct version of your data? Is it even easily accessible for you?”

Then Alastair Dunning, head of TU Delft Research Data Services and the 4TU.Centre for Research Data, took the lead and emphasized that data documentation is a time-consuming process involving many disjointed steps, such as experimenting, analyzing, indexing and publishing; there is therefore a need to make data documentation smoother. He concluded with the valuable remark that a new digital solution cannot have poorer usability than the existing paper ones.

The rest of the morning sessions focused on case studies from researchers who not only use digital notebooks in daily practice but also took the lead in implementing ELNs in their research groups and institutes.


Image from the presentation by Alastair Dunning

Case study 1: Let’s go digital; keeping track of your research using eLABJournal by Evelien Stouten from the Department of Biology, Utrecht University

Evelien Stouten explained that researchers expect an ELN to be not only well-organized and searchable but also suitable for integrating with other tools and software packages, adding literature references, and sharing data with collaborators. She also highlighted that an ELN is expected to provide safe data storage and be fraud-proof, meaning that everything that is documented remains traceable, even if it is deleted or changed.


Image from the presentation by Evelien Stouten

The Faculty of Science at Utrecht University started discussing ELNs in 2013, and researchers were invited to take part in a test phase from 2014. Her research group found that eLABJournal meets their expectations and provides an additional application suited to their needs, namely eLABInventory. This application enables digital documentation and categorization of samples such as strains, plasmids, cell lines, chemicals, antibodies, and RNA and DNA samples, and the linking of these samples to experimental data. She stressed that they are obligated by law to keep records of all genetically modified organisms (GMOs), and use of eLABInventory is currently obligatory for all Utrecht University labs working with GMOs. She also mentioned that they find the mobile app useful, since it allows researchers to use eLABJournal on their phones or tablets while working in the lab.

She concluded her talk by pointing out that some people are really attached to their paper lab journals, and it may take some effort to convince them to start using an ELN, even if it is made obligatory.

Case study 2: From paper to screen: What users really think about electronic lab notebooks by Katharina Hanika, Department of Plant Sciences, Wageningen University


Image from the presentation by Katharina Hanika

Katharina Hanika shared with the audience her experience with eLABJournal and her insights into using ELNs. She focused on why to switch from paper to screen by listing the pros and cons of ELNs. As pros, she indicated readable, structured and searchable information, digital storage of samples, and easy collaboration with colleagues, not only for sharing and discussing data but also for version control. As cons, she pointed out that the start-up phase was time-intensive, since it takes time to figure out how the program works. Moreover, a good internet connection is required, as eLABJournal is web-based. Although eLABJournal is still being improved, she sees that as an advantage, since the company provides support and adapts the product to the needs of researchers.

She then discussed how to achieve department-wide implementation of ELNs. She suggested starting with volunteers, since it is challenging to convince the “creatures of habit” to change their ways of working. She pointed out that researchers who try ELNs entirely on their own can get frustrated and give up; it is therefore a good idea to start with online demonstrations and hands-on exercises. It would also be beneficial to assign experienced ELN users as contact persons for questions. Moreover, creating an ELN user group would enable researchers to help each other.

She concluded her talk by stating that any (electronic) lab notebook is only as good as its user and what it takes is time, commitment and adaptability.

Case study 3: Enabling connectivity in electronic laboratory notekeeping – a pilot approach in biomedical sciences by Harald Kusch, University Medical Center Göttingen


Image from the presentation by Harald Kusch

Harald Kusch talked about the pilot implementation of RSpace at the CRC 1002 Research Data Platform. He highlighted that using an ELN enables linking of experimental data to other relevant elements, such as catalogs for cell lines, mouse lines and antibodies, as well as databases. He explained the possible ways of structuring data in an ELN: chronological, project-oriented and method-oriented. While it can be a challenge to decide which of these is best, a paper lab journal offers only the chronological option. He described how RSpace allows both structured and unstructured documentation. Structured documentation is very handy, especially for newcomers to the lab, as it allows the use of centralized protocols and facilitates metadata recording. Meanwhile, unstructured documentation offers room for creativity and is especially suitable for new lab protocols. He also stressed that all versions of each document are saved, which prevents fraud. He explained that data can be exported in different formats, such as PDF, HTML and XML. Moreover, RSpace offers interfaces for easy transfer of datasets to data repositories such as Dataverse. He concluded his talk by emphasizing that the start-up phase takes time.

Interactive questions from the audience

During this interactive session, the audience had the chance to put their questions to the presenters of the case studies. Most questions focused on the following topics:

Where is the data stored? Is institutional data storage an option?

  • Both eLABJournal and RSpace offer their users an institutional data storage option.

How to use an ELN in a lab environment without going up and down between the lab and the office to write down notes?

  • Katharina: There are fixed tablets available in the lab; some people type directly on the tablet, others make handwritten notes and type them up back at their PCs.
  • Harald: Not every lab can afford a tablet per lab member, but that may also not be necessary.
  • Evelien: Not everyone types right away; some prefer to make quick notes and then type them into the ELN in the office.

What happens to hyperlinks added in an ELN if folders are moved – do the links still work?

  • If the name or location is changed, the link would indeed break, but at least it is possible to trace back to the previous link. There is no direct solution available yet.

Does setting up an ELN in a department need a fully dedicated staff member?

  • Ideally, a lab member who knows the type of research and its needs takes the lead in implementing the ELN.

Keynote by Alastair Downie, University of Cambridge: Choosing an Electronic Lab Notebook

Alastair Downie explained that the first ELN appeared in 1997 and that industry was quick to adopt it, while this was not the case in academia. Industry has a variety of incentives to use ELNs, such as the requirement for absolutely consistent processes, protection of intellectual property, and other commercial and corporate responsibilities. He answered the question “What is holding universities back?” by noting that there are so many different types of ELNs, and so many different types of research and research needs, that it is difficult to find the ideal solution. To make it easier for researchers to choose an ELN, he prepared a valuable resource with an overview of the available solutions. This resource covers a variety of topics:

  • What is an electronic lab notebook and why should I use one?
  • A note about DIY systems
  • ELN vs LIMS
  • Disengagement – what if I want to change systems?
  • Narrowing the scope, creating a shortlist
  • Evaluating ELN products
  • Table of 25 current ELN products
  • Discussion forum

As an alternative to the available ELNs, he introduced Do-It-Yourself (DIY) ELNs, which can be built using tools such as Evernote, OneNote, Asana, Basecamp, Dropbox and OneDrive. He emphasized that using one of these tools as a DIY ELN still requires a very disciplined approach; without any ELN, however, one needs to be even more structured. He also stressed that these tools are not designed to be used as an ELN and therefore do not provide custom solutions.


Image from the presentation by Alastair Downie

He also addressed the question “What if you chose the wrong product?”. It is possible that, after implementation, the ELN software changes and no longer really suits the research needs of its users. If you stop using an ELN, in most cases all you can export is a set of PDF, HTML or XML files; on the other hand, at least such files are easily accessible and searchable, and can be backed up and securely stored.

Then he focused on creating a shortlist to find the ideal option:

  • Do you have a budget?
    • Free or a paid ELN? Is a paid ELN worth the money?
  • Will you use the software as an individual or a group?
    • Collaborative vs self-contained, comprehensive vs lightweight
  • Do you need team collaboration and supervisor features?
    • Group activity dashboard, commenting & discussions
    • Constant discussion, even if the group leader is away
  • Departmental or institutional deployment?
    • Please everyone? Or focus on stability, accessibility, and universal relevance?
  • Do you need multi-operating system (OS) compatibility?
    • Browser-based & OS agnostic, or application-based
  • What devices will be used to operate the software?
    • Tablets on bench? Voice recognition? Phones? Paper?
  • Data security and compliance requirements?
    • GDPR compliance? Local storage?

He further explained how to evaluate the shortlisted products:

  • Interface design: Look and feel user-friendly, intuitive and efficient?
  • Workflow suitability: Does ELN workflow match your own workflow?
  • Content creation tools: Writing, drawing, annotation, markup, equations, chemical structures…
  • Data management & storage features: Upload typical file types/sizes? Larger files? Display/operation? Backed-up?
  • Integration with other software and/or cloud services: Office apps, Statistics, Institutional storage, Community repositories…
  • Collaboration features: Share data and comments in a group? Invite external collaboration?
  • Group leader/Supervisor features: Sufficient oversight and feedback tools? Team/account management?
  • Export features: Pages, sections, entire ELN? Data in original formats?

More detailed information can be found at:

Info from ELN providers about afternoon workshops

There were four ELN providers present at the event:

Before the interactive demonstration sessions, each ELN provider was given the opportunity to give a pitch about their ELN product. The presentations in this session and the morning session can be found here: DOI 10.5281/zenodo.1247390.

Hands-on workshops and opportunity to test tools offered by various ELN providers

In this session, the participants were given the opportunity to try out the ELNs listed above and ask their questions directly to the providers. Here is the feedback that was given by the participants about each ELN at the end of the hands-on workshops:


After this event, we were contacted by researchers from various TU Delft departments to discuss the possibilities of implementing an ELN. Currently, we are in contact with researchers to determine what they expect and require from an ELN, and we are planning to start a pilot study afterwards.

I would like to finalize this report by sharing the feedback given by the participants about this event:




First of all, I would like to thank Research Data Services, TU Delft Library for organizing this very informative event. I also thank all the speakers for their informative presentations and all the participants for the fruitful discussions. Finally, special thanks to Marta Teperek for her critical reading and inspiring suggestions during the preparation of this report.

Do as you preach: results of 2017/2018 data management survey now published


Author: Jasper van Dijck, Data Steward at the Faculty of Electrical Engineering, Mathematics and Computer Science

Data. We advise researchers on how to manage theirs, but we are not averse to gathering and sharing some of our own.

The problem

As data stewards at TU Delft we were asked how we are going to keep track of our progress. After some discussion amongst ourselves, we concluded that we could count the number of researchers we helped with their data management (plans) and we would love to measure the number of data sets shared by TU Delft researchers in the public domain. Presumably, an increase in the former would lead to an increase in the latter. That did not seem quite enough though, since there is a time difference between our usual first point of contact with a researcher, at the beginning of a project, and the archiving/sharing of research data, usually at the end. We would have to be quite patient in finding out if our ventures had paid off since most research projects usually last a few years. So we felt we also needed to know how researchers were currently thinking about research data management (RDM) since one of the focus points of being a data steward at TU Delft is creating awareness and facilitating a change in culture.

The solution

That is why we set up a survey. Nothing fancy, just a simple survey asking researchers a couple of questions on their (attitude towards) research data management. If you are Dutch, this would be our infamous “nulmeting” (baseline measurement). It gives us a starting point for measuring the change in attitude and behaviour over time (yes, we are planning to re-run the survey regularly): it will give us insight into what effect our presence and actions have had.

The results

So, we would like to present to you the results of the TU Delft “Quantitative assessment of research data management practice 2017-2018,” or RDM survey 2017/2018 for short. This survey was set up in cooperation with EPFL and the University of Cambridge. EPFL has already finished its survey, and Cambridge is currently completing theirs. Our goal is to cross-compare the results between the institutions to see if we can learn from each other’s approach.

You can find a visualisation of the survey here:!/vizhome/20180809TUDelftResearchDataManagementSurvey2017-2018/TUDelftRDMsurvey2017-2018.

And yes, the anonymised(!) data is in the public domain. You can find it here: We practise what we preach.

Feel free to explore the results of the survey in the visualisation or download the data yourself. We will learn a lot from it and we are looking forward to finding out what has changed in the next survey.

If you are a researcher at TU Delft and you are reading this: we are counting on you to fill out the RDM survey 2018, somewhere near the end of the year. Until that time, if you have any questions, please contact us at

Invitation to collaborate

If you are interested in research data management and would like to run a similar survey at your institution, you are most welcome to join TU Delft, EPFL and the University of Cambridge in our efforts. The survey itself is available on the Open Science Framework:

So, just drop us an email at

Retrospect on Data Management Plan Support Work 2017


The year 2017 is drawing to a close, and it was a busy one for DMP (data management plan) support. The funding bodies NWO (Nederlandse Organisatie voor Wetenschappelijk Onderzoek) and the European Commission tightened their demands on research data management during the research and on the discoverability and accessibility of research outcomes. Since then, interaction with researchers receiving grants from these funding bodies has steadily increased. In addition to other research support services, such as the valorisation centre, referring researchers with help requests to us, the RDS (Research Data Services) team proactively contacts researchers to offer advice.

The first step is an introductory talk about the researcher’s project, their data and research data management. Next, the ICT solutions provided at TU Delft are explained, because it still appears that researchers are not aware of the full spectrum of available technical infrastructure. Subsequently, the DMP section on how to handle research data during the research is discussed. The last part covers preservation, archiving and data availability after the research, where the 4TU.Centre for Research Data is introduced and its benefits explained. Additionally, the use of the 4TU.Research Data instance of DataverseNL is described. With this complete range of support services in view, a possible data management and data deposit workflow is discussed.


For NWO, the first deadline for a DMP draft is 4 months after the official project start; for the H2020 programme of the European Commission, it is 6 months. The RDS team also offers to give feedback on the DMP between that first deadline and the final submission.


The ideal involvement of the RDS team, from start to finish, begins with feedback on the data section at the proposal stage. When the project has received funding, the researcher comes back to the RDS team, or the team contacts the researcher again to offer support. Besides helping with filling in the DMP, the RDS team offers training for the project team and the department, if the researcher is open to that. With the data DOI reservation service of the 4TU.Centre for Research Data, researchers are encouraged to deposit the data underlying their scientific publications into the archive as they go. With the collection creation feature, researchers are offered a great opportunity to represent their research output in the most suitable way, rather than waiting until the end of the project to ‘dump’ some data into the archive to comply with the open data demands of the funding body.

All of this is introduced and explained to the researcher in the first session and can lead to a close collaboration throughout the project, if the researcher is in favour of that.

So far, we have supported 20 NWO research projects and 3 H2020 (open data pilot) projects with their DMP drafting and first submission. We have not yet received any feedback from the funders about the quality of our DMP support. However, researchers at TU Delft appreciate our advice and help.


TU Delft Research Data Services – RISE Evaluation. RDA Tenth Plenary Meeting Presentation 2017.



If you are looking for the presentation slides of the TU Delft Research Data Services – RISE Evaluation presentation held by Wilma van Wezenbeek in Montreal on behalf of our team, here they are:



To read up on the previous work this presentation is based on, please have a look at the blog post from June this year.

Want to know what Wilma also experienced during the conference? Read her blog post on the TU Delft library weblog.

2017 Self Assessment of Research Data Services and 4TU.Centre for Research Data Services with RISE

The Research Infrastructure Self-Evaluation Framework (RISE) was published at the beginning of 2017 by the Digital Curation Centre (DCC). It is a way of determining how mature your institutional Research Data Services may be.

Version 1.1 provides a self-assessment framework with 10 categories covering, amongst others, RDM policies, business plans, advisory services, and training.

Our Goal

Here at Delft, we used RISE to assess our Research Data Services (RDS) and the 4TU.Centre for Research Data (4TU). The RISE model is very helpful in providing a fixed framework of categories for research data management services.

Our Evaluation Team

Four members of the RDS/4TU team of 10 people participated in the RISE evaluation. The group consisted of one front office person (i.e. talking to researchers), one back office person, one person responsible for training, and the team head.

The whole framework was used to determine the maturity levels of the services provided by Research Data Services (including 4TU) within the plethora of TU Delft Library services. For this first evaluation, the standard set of questions was used and no additions were made. At the time of the evaluation, future service provision was not taken into account. To streamline the evaluation process, a Google Form with tick boxes was set up. That helped to go through the questions quickly and, later, to determine the majority decisions.

After every team member had worked through the framework individually, the evaluation team came together to analyse the results and determine the final maturity level for each question.
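The majority step described above can be sketched in a few lines of Python. This is purely a hypothetical illustration, not the team's actual tooling (the scoring was done via a Google Form), and the tie-break rule of choosing the lower level is an assumption made for the sketch:

```python
from collections import Counter

def majority_level(responses):
    """Return the maturity level chosen by most evaluators.

    `responses` is a list of integer level scores, one per team member.
    Ties are broken by taking the LOWER level; this conservative
    tie-break rule is an assumption, not documented practice.
    """
    counts = Counter(responses)
    top = max(counts.values())
    return min(level for level, n in counts.items() if n == top)

# Hypothetical example: four evaluators score one RISE question.
scores = [3, 3, 2, 3]
print(majority_level(scores))  # 3
```

With four evaluators a 2-2 tie is possible, which is presumably why the team met afterwards to discuss and settle the final level for each question.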

List Overview of RDS / 4TU Level Scoring

RISE section: RDS / 4TU level score
1 RDM Policy and Strategy
1a) Policy Development: Level 0
1b) Awareness Raising and Stakeholder Engagement: Level 3
1c) RDM Implementation Roadmap: Level 3
2 Business Plans and Sustainability
2a) Staff Investment: Level 3
2b) Technology Investment: Level 2
2c) Cost Modelling: Level 3
3 Advisory Services
3a) Guidance Provision: Level 2
4 Training
4a) Online Training: Level 0
4b) Face-to-Face Training: Level 1
5 Data Management Planning
5a) DMP Provision: Level 2
6 Active Data Management
6a) Scalability and Synchronisation: Level 2
6b) Collaboration Support: Level 1-3
6c) Security Management: Level 1
7 Appraisal and Risk Assessment
7a) Data Collection Policy: Level 3
7b) Security, Legal and Ethical Risk Assessment: Level 1
7c) Metadata Collection to Inform Decision-making: Level 3
8 Preservation
8a) Preservation Planning and Action: Level 3
8b) Continuity Support: Level 3
9 Access and Publishing
9a) Monitoring Locally Produced Datasets: Level 3
9b) Data Publishing Mandate: Level 2
9c) Level of Data Curation: Level 2
10 Discovery
10a) Metadata Cataloguing Scope: Level 2

Extended Results

We are planning on annually self-evaluating our services.

The original evaluation is in table form, with commentary on our selections and a general comment about each section.

The second, shortened version highlights the individual sections with remarks: