enow.com Web Search

Search results

  1. Database dump - Wikipedia

    en.wikipedia.org/wiki/Database_dump

    A database dump contains a record of the table structure and/or the data from a database and is usually in the form of a list of SQL statements ("SQL dump"). A database dump is most often used for backing up a database so that its contents can be restored in the event of data loss. Corrupted databases can often be recovered by analysis of the ...
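
    To make the "list of SQL statements" concrete, here is a minimal sketch using Python's built-in sqlite3 module, whose iterdump() produces exactly such a dump; the users table is invented for illustration, and a MySQL or PostgreSQL dump looks similar in spirit.

    ```python
    import sqlite3

    # Build a tiny example database (hypothetical schema).
    src = sqlite3.connect(":memory:")
    src.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
    src.executemany("INSERT INTO users (name) VALUES (?)", [("Ada",), ("Grace",)])
    src.commit()

    # Dump: iterdump() yields the table structure and data as SQL statements.
    dump_sql = "\n".join(src.iterdump())

    # Restore into a fresh database by replaying those statements.
    restored = sqlite3.connect(":memory:")
    restored.executescript(dump_sql)
    assert restored.execute("SELECT COUNT(*) FROM users").fetchone()[0] == 2
    ```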

  2. Import and export of data - Wikipedia

    en.wikipedia.org/wiki/Import_and_export_of_data

    The import and export of data is the automated or semi-automated input and output of data sets between different software applications. It involves "translating" from the format used in one application into that used by another, where such translation is accomplished automatically via machine processes, such as transcoding, data transformation, and others.
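
    As a small illustration of such "translating" between formats, the Python sketch below reads a CSV export (one application's format) and emits JSON (another's); the column names are invented for the example.

    ```python
    import csv
    import io
    import json

    # Hypothetical export from application A, as CSV text.
    csv_text = "id,name,joined\n1,Ada,1843-01-01\n2,Grace,1952-05-01\n"

    # Automated translation: parse the CSV, then emit JSON for application B.
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    print(json.dumps(rows, indent=2))
    ```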

  3. Wikipedia:Database download - Wikipedia

    en.wikipedia.org/wiki/Wikipedia:Database_download

    Wikipedia SQL dump parser is a .NET library to read MySQL dumps without the need to use a MySQL database. WikiDumpParser is a .NET Core library to parse the database dumps. Dictionary Builder is a Rust program that can parse XML dumps and extract entries into files.
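
    Those parsers are .NET and Rust projects; as a rough Python equivalent of "parse XML dumps and extract entries", this sketch streams page titles out of a bzip2-compressed pages-articles dump using only the standard library (the filename is illustrative).

    ```python
    import bz2
    import xml.etree.ElementTree as ET

    def iter_titles(path):
        """Yield page titles from a MediaWiki XML dump without loading it whole."""
        with bz2.open(path, "rb") as f:
            for _event, elem in ET.iterparse(f):
                tag = elem.tag.rsplit("}", 1)[-1]  # drop the XML namespace prefix
                if tag == "title":
                    yield elem.text
                elif tag == "page":
                    elem.clear()  # free memory once a <page> element is finished

    for i, title in enumerate(iter_titles("pages-articles-multistream.xml.bz2")):
        print(title)
        if i >= 4:  # just show the first few entries
            break
    ```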

  4. Wikipedia talk:Database download/Archive 1 - Wikipedia

    en.wikipedia.org/wiki/Wikipedia_talk:Database...

    The database dump hasn't been updated in a month now (it says it's done twice a week on the meta-wiki). Image dumps have been broken for 2 weeks' time now. Image dumps use some strange compression that apparently can only be uncompressed using the right version of the right set of programs on the right platform (in other words, anything but ...

  5. Wikipedia:Database dump import problems - Wikipedia

    en.wikipedia.org/wiki/Wikipedia:Database_dump...

    It seems to have worked. There might be a better way to increase the size of the table here (the db won't import with the default limit of 4 GB). -- Mr. Jones 13:56, 16 Dec 2004 (UTC). Indeed, setting myisam_data_pointer_size to 7 works as well, which seems better. -- Brighterorange 8 July 2005 16:25 (UTC). Curl appears to also have the 2 GB download limit.
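
    The 2 GB ceiling was most likely a large-file limitation of old 32-bit curl builds; a present-day workaround, sketched below, is to stream the download and resume partial files with an HTTP Range header. The requests dependency and the exact dump URL are assumptions here.

    ```python
    import os

    import requests  # assumed third-party dependency

    def fetch_resumable(url, dest, chunk=1 << 20):
        """Download a large dump, resuming from a partial local file if present."""
        have = os.path.getsize(dest) if os.path.exists(dest) else 0
        headers = {"Range": f"bytes={have}-"} if have else {}
        with requests.get(url, headers=headers, stream=True, timeout=60) as r:
            r.raise_for_status()
            mode = "ab" if r.status_code == 206 else "wb"  # 206 = Range honoured
            with open(dest, mode) as f:
                for block in r.iter_content(chunk):
                    f.write(block)

    fetch_resumable(
        "https://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles-multistream.xml.bz2",
        "enwiki-latest-pages-articles-multistream.xml.bz2",
    )
    ```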

  6. Wikipedia talk:Database download - Wikipedia

    en.wikipedia.org/wiki/Wikipedia_talk:Database...

    Not done: The page doesn't say anything about the size of pages-meta-current.xml.bz2. The page mentions 19 GB in the context of a different download, pages-articles-multistream.xml.bz2. The latest dump index says that the 19 GB file is now about 22 GB. -- John of Reading 07:06, 15 October 2023 (UTC)

  7. Wikipedia:Size of Wikipedia - Wikipedia

    en.wikipedia.org/wiki/Wikipedia:Size_of_Wikipedia

    The size of the English Wikipedia can be measured in terms of the number of articles, number of words, number of pages, and the size of the database, among other ways. As of 25 December 2024, there are 6,929,782 articles in the English Wikipedia containing over 4.7 billion words (giving a mean of about 688 words per article).

  8. Data migration - Wikipedia

    en.wikipedia.org/wiki/Data_migration

    Data migration is the process of selecting, preparing, extracting, and transforming data and permanently transferring it from one computer storage system to another. Additionally, the validation of migrated data for completeness and the decommissioning of legacy data storage are considered part of the entire data migration process.
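
    The steps named here map naturally onto a small extract-transform-load pass. Below is a minimal sketch, assuming SQLite stores on both sides and an invented customers schema, with a completeness check before the legacy store would be decommissioned.

    ```python
    import sqlite3

    def migrate(src_path, dst_path):
        """Extract, transform, load, then validate completeness (hypothetical schema)."""
        src = sqlite3.connect(src_path)
        dst = sqlite3.connect(dst_path)
        dst.execute("CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, email TEXT)")

        # Extract + transform: normalise emails while copying rows across.
        rows = [(i, email.strip().lower())
                for i, email in src.execute("SELECT id, email FROM customers")]
        dst.executemany("INSERT OR REPLACE INTO customers VALUES (?, ?)", rows)
        dst.commit()

        # Validate before the legacy storage is decommissioned.
        n_src = src.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
        n_dst = dst.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
        assert n_src == n_dst, f"incomplete migration: {n_src} source rows, {n_dst} copied"
    ```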