...

Status of the production sites

Netarkivet


We are preparing a 2-day workshop for Netarchive curators on harvesting social media. We hope the outcome will be useful for our upcoming event harvest on the local and regional elections on 21 November. We also aim to use BCWeb with external partners for the election event harvest.

The developers are going to have a workshop in the middle of October. The curators' wishes are as follows (in order of priority):

  • Replay of https-pages in Wayback
  • Improvement of Heritrix and integration of supplementary collection tools (e.g. brozzler)
  • Introduction of a (technical) collection concept. This will give us the ability to integrate data collected before NAS was introduced as well as data collected outside NAS.
  • Improvement of Access
  • More automated QA

Most likely we will not be able to perform a full broad crawl in two steps this year (our last full broad crawl dates from the beginning of 2016), because of our problems with Heritrix 3 Remote Access. We expect to solve this problem with NAS 5.4, which will be implemented after the compression of the archive is finished at the beginning of 2018.

Since January 2017 we have only harvested about 25 TB.

At the beginning of September 2017, Netarchive was blocked by about 54,000 domains (out of 1.32 million domains).

The implementation of “Web Danica” (automated identification of Danish web content outside .dk) is ongoing.

The migration of documentation from the old “MediaWiki” to Jira is finished.

BnF

There have been several changes in the team over the summer. Pascal Tanésie has arrived as assistant head of the digital legal deposit team, and Vladimir Tybin has joined the team as digital curator. Sophie Derrot has left the BnF to take up a post at the Institut national d'histoire de l'art.

Our second test broad crawl, with the complete seed list, is nearly finished. The amount of data crawled in this test has proved to be higher than our budget estimates, mainly because there is no deduplication for this first broad crawl with H3. We will analyze the figures in detail and adapt the budget accordingly.

We are also using our new infrastructure for these tests: the crawlers are more powerful and faster, but they use more bandwidth. We will therefore need to reduce the number of crawlers from 40 to 35. We had set the duration of each job to 3 days, but this has proved to be too long; for the real crawl it will be between 2 and 2.5 days.

This week we aim to transfer all our crawls onto the new infrastructure, and next week the real broad crawl will start.

NB


BNE


Next meetings

...