wikimedia-dumps Questions

9

Solved

For example, using this Wikipedia dump: http://en.wikipedia.org/w/api.php?action=query&prop=revisions&titles=lebron%20james&rvprop=content&redirects=true&format=xmlfm Is there ...
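A minimal sketch of one way to fetch the same revision content, assuming the query parameters from the URL above but with format=json for easier parsing (the rvslots parameter is only needed on newer MediaWiki versions):

    import requests

    # Sketch: fetch the current wikitext of a page through the MediaWiki API,
    # mirroring the query parameters in the URL above but with format=json.
    API_URL = "https://en.wikipedia.org/w/api.php"
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": "LeBron James",
        "rvprop": "content",
        "rvslots": "main",       # needed on current MediaWiki; older versions omit it
        "redirects": 1,
        "format": "json",
    }

    resp = requests.get(API_URL, params=params, timeout=30)
    resp.raise_for_status()
    for page in resp.json()["query"]["pages"].values():
        wikitext = page["revisions"][0]["slots"]["main"]["*"]
        print(wikitext[:500])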

2

I downloaded the German Wikipedia dump dewiki-20151102-pages-articles-multistream.xml. My short question is: what does 'multistream' mean in this case?
Moloch asked 11/11, 2015 at 0:14
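In that dump, 'multistream' refers to the .bz2 form of the file being made of many concatenated bzip2 streams (roughly 100 pages each) plus a companion index file, so single pages can be pulled out without decompressing the whole archive. A minimal sketch, assuming the compressed dump and its index file are both present; the page title is a placeholder:

    import bz2

    # Sketch: use the companion index (lines of "byte_offset:page_id:title") to
    # seek to the single bzip2 stream containing one page, then decompress only
    # that stream. Filenames and the page title are placeholders.
    DUMP = "dewiki-20151102-pages-articles-multistream.xml.bz2"
    INDEX = "dewiki-20151102-pages-articles-multistream-index.txt.bz2"
    TITLE = "Berlin"

    offset = None
    with bz2.open(INDEX, mode="rt", encoding="utf-8") as idx:
        for line in idx:
            off, page_id, title = line.rstrip("\n").split(":", 2)
            if title == TITLE:
                offset = int(off)
                break

    if offset is not None:
        with open(DUMP, "rb") as dump:
            dump.seek(offset)
            chunk = dump.read(20 * 1024 * 1024)        # enough to cover one stream
            xml_text = bz2.BZ2Decompressor().decompress(chunk)
            print(xml_text[:500].decode("utf-8"))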

1

I'm trying to extract interlanguage-related articles from the Wikidata dump. After searching on the internet, I found out there is a tool named Wikidata Toolkit that helps to work with this type of data...
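Wikidata Toolkit is a Java library; as an alternative sketch in Python, the interlanguage connections can also be read straight from the sitelinks field of the Wikidata JSON dump (the dump filename is an assumption):

    import bz2
    import json

    # Sketch: stream the Wikidata JSON dump (one entity per line inside a JSON
    # array) and print each item's sitelinks, which tie together the articles
    # on the different language Wikipedias. Filename is a placeholder.
    DUMP = "wikidata-20151109-all.json.bz2"

    with bz2.open(DUMP, mode="rt", encoding="utf-8") as f:
        for line in f:
            line = line.strip().rstrip(",")
            if not line or line in ("[", "]"):
                continue
            entity = json.loads(line)
            sitelinks = entity.get("sitelinks", {})
            if sitelinks:
                print(entity["id"], {site: link["title"] for site, link in sitelinks.items()})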

1

I see dumps.wikimedia.org/other/pagecounts-raw/, for example, but no country-specific data there...
Procurable asked 11/2, 2015 at 21:1
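A minimal parsing sketch for those files, assuming one downloaded hourly file; note that the first field in pagecounts-raw is a project/language code rather than a country, which is why no country-level breakdown appears there:

    import gzip
    from collections import Counter

    # Sketch: each pagecounts-raw line is "project page_title view_count bytes",
    # where "project" is a wiki/language code (en, de, en.m, ...), not a country.
    # The filename follows the hourly naming scheme and is a placeholder.
    FILE = "pagecounts-20150211-210000.gz"

    views_per_project = Counter()
    with gzip.open(FILE, mode="rt", encoding="utf-8", errors="replace") as f:
        for line in f:
            parts = line.split(" ")
            if len(parts) != 4:
                continue
            project, _title, views, _size = parts
            views_per_project[project] += int(views)

    print(views_per_project.most_common(10))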

0

I would like to know if it is possible to get the latest incremental n-triple dumps of Wikidata. I'm using Wikidata Toolkit to download the latest version of the dumps and convert them automatica...
Lindsaylindsey asked 12/1, 2015 at 15:48
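A small sketch independent of Wikidata Toolkit: since N-Triples is a line-oriented format, a full (non-incremental) dump can at least be filtered in a streaming pass; the filename and predicate IRI are assumptions for illustration:

    import gzip

    # Sketch: filter an N-Triples dump line by line ("<s> <p> <o> ."), keeping
    # only statements with a chosen predicate. Filename and predicate are
    # placeholders.
    DUMP = "wikidata-statements.nt.gz"
    PREDICATE = "<http://www.wikidata.org/prop/direct/P31>"    # "instance of"

    with gzip.open(DUMP, mode="rt", encoding="utf-8") as f:
        for line in f:
            parts = line.split(" ", 2)
            if len(parts) == 3 and parts[1] == PREDICATE:
                print(line.rstrip())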

2

Solved

I'm new to XML parsing and Python, so bear with me. I'm using lxml to parse a wiki dump, but I just want, for each page, its title and text. For now I've got this: from xml.etree import ElementTre...
Telangiectasis asked 6/12, 2013 at 23:36
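A minimal iterparse sketch along those lines, using the standard-library ElementTree that the snippet imports; the filename and namespace URI depend on the dump version and are assumptions:

    import bz2
    import xml.etree.ElementTree as ET

    # Sketch: stream a pages-articles dump and pull out each page's title and
    # text, clearing elements as they are processed so memory stays bounded.
    # Filename and namespace version are placeholders.
    DUMP = "enwiki-latest-pages-articles.xml.bz2"
    NS = "{http://www.mediawiki.org/xml/export-0.10/}"

    with bz2.open(DUMP, mode="rb") as f:
        for _event, elem in ET.iterparse(f, events=("end",)):
            if elem.tag == NS + "page":
                title = elem.findtext(NS + "title")
                text = elem.findtext(NS + "revision/" + NS + "text") or ""
                print(title, len(text))
                elem.clear()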
