...
- Mikis will establish the Skype conference at 13:00 (please do not connect yourselves):
- TDC tele-conference (fallback if the Skype conference cannot be established):
- Dial in number (+45) 70 26 50 45
- Dial in code 9064479#
- Bridgit: The Bridgit conference will be available about 5 min. before the start of the meeting. The Bridgit URL is konf01.statsbiblioteket.dk. The Bridgit password is sbview.
Participants
- BnF: Nicolas, Sara (wasn't able to participate).
- ONB: Michaela and Andreas
- KB: Tue, Søren and Nicholas
- SB: Mikis (Sabine and Colin weren't able to participate).
- Any other issues to be discussed on today's tele-conference?
...
- The week of 17 September: Søren will go to BnF on 20-21 September.
- Issue for planning: NAS-2066 Heritrix roadmap Workshop.
...
- Shared testing of WARC functionality?
Move source code to GitHub?
...
- Git is much more flexible than Subversion, see 3 Reasons to Switch to Git from Subversion, GitSvnComparison, svn - git vs Subversion - pros and cons, Why You Should Switch from Subversion to Git.
- Moving the code to a standard open source hosting site will increase accessibility.
- GitHub is great!
Have a look, see what you think.
Iteration 52 (3.21 development release) (Mikis)
Jira: NAS-2018 (server: SBForge)
Status of the production sites
- Netarkivet: TLR
The second broad crawl of 2012 (NR 15) was finished in early July.
The third broad crawl of 2012 (NR 16) was started this morning, August 14th, using 3.18.3. Step 1 is almost finished.
Version 3.20.* is currently being acceptance tested, and we are preparing for production in mid-October. We have found 2 issues, which Søren is looking into.
Our Wayback is now indexed up to July 2012 and I'm preparing/testing automatic indexing in production.
Thanks to Jon and his son, we have downloaded thousands of YouTube videos over the last month.

During the summer we had 2 production issues, neither with a big impact on the system:
1) The SB SAN pillar was down for one day without affecting any harvesting, because the KB site was running and all harvesters at SB were independent servers with their own disk storage.
2) We lost 1 day of harvesting because our admin server ran out of process resources. We are still investigating the logs for further explanation.

3 questions for BnF:
1) Can you show "Show comments" for harvested facebook.com sites?
2) If you harvest youtube and download videos, how do you link the youtube "metadata" page with the actual video URL?
3) Which PostgreSQL version are you using in production - 8.4?
- Netarkivet: SAS (from a month ago)
...
As to our selective crawls: “business as usual” – that is to say: analysis of “candidates” (new sites proposed for selective crawls), QA of selective crawls, monitoring of harvest jobs, and revision of harvest profiles.
- BNF:
On the 8th of August, the last harvest of the electoral project ended. Over a period of seven months, monthly, weekly, daily and single captures have been made of websites selected by librarians for their relation to the French presidential and parliamentary elections. The result is more than 350 million URLs, and 20.38 TB of data (compressed: 10.67 TB). We have focused our efforts on harvesting the social Web, especially Twitter and Facebook, but also Pinterest and Flickr. The well-known problem of the # in the URL has been an insurmountable obstacle to the harvest of some sites (Google+, Pearltree). But solutions were found for others. Thus Twitter was collected 4 times a day with a special harvest template: the crawler declared itself not as a browser, but as a robot. This allowed us to have access to the URL without the problematic <#!> sequence, and therefore to collect tweets. But now Twitter's URLs seem to work without this sequence, even in a normal browser, making them easier to collect. This project was also the occasion to see our new nomination tool (BCWeb) working with NAS on a large scale. It proved to be very useful, even though we sometimes had to adjust the frequency of certain captures (for example, to densify harvests for the electoral weekends).
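The `#!` problem described above corresponds to Google's AJAX-crawling convention of that era: content behind a hash-bang fragment could be requested by a crawler via an `_escaped_fragment_` query parameter instead. The minutes do not say exactly how BnF's harvest template worked, so the following is only an illustrative sketch of that rewriting rule (the function name and example URL are hypothetical, not BnF's actual configuration):

```python
# Hypothetical sketch of the "#!" -> "_escaped_fragment_" rewrite from
# Google's AJAX-crawling convention; not BnF's actual harvest template.
from urllib.parse import quote

def escaped_fragment_url(url: str) -> str:
    """Rewrite http://host/path#!state to the crawlable
    http://host/path?_escaped_fragment_=state form."""
    if "#!" not in url:
        return url  # nothing to rewrite
    base, fragment = url.split("#!", 1)
    sep = "&" if "?" in base else "?"
    return f"{base}{sep}_escaped_fragment_={quote(fragment)}"

print(escaped_fragment_url("http://twitter.com/#!/bnf"))
# http://twitter.com/?_escaped_fragment_=/bnf
```

A crawler fetching the rewritten URL receives a static HTML snapshot from servers that support the convention, which is why changing the user agent or the request form could make hash-bang content harvestable.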
- ONB:
Date for NAS workshop at SB
...