enow.com Web Search

Search results

  1. Database dump - Wikipedia

    en.wikipedia.org/wiki/Database_dump

    A database dump contains a record of the table structure and/or the data from a database and is usually in the form of a list of SQL statements ("SQL dump"). A database dump is most often used for backing up a database so that its contents can be restored in the event of data loss. Corrupted databases can often be recovered by analysis of the ...
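
    To make the "list of SQL statements" idea concrete, here is a minimal Python sketch using the standard library's sqlite3 module (chosen purely for illustration; Wikipedia's own dumps are MySQL- and XML-based), with a table and data invented for the example.

    ```python
    import sqlite3

    # Build a small throwaway database (hypothetical example data).
    src = sqlite3.connect(":memory:")
    src.execute("CREATE TABLE page (id INTEGER PRIMARY KEY, title TEXT)")
    src.execute("INSERT INTO page (title) VALUES ('Database dump')")
    src.commit()

    # iterdump() yields the dump as a sequence of SQL statements,
    # i.e. the "SQL dump" form described in the snippet above.
    dump_sql = "\n".join(src.iterdump())
    print(dump_sql)

    # Restoring is just replaying those statements into a fresh database.
    restored = sqlite3.connect(":memory:")
    restored.executescript(dump_sql)
    print(restored.execute("SELECT title FROM page").fetchall())
    ```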

  2. Wikipedia:Database download - Wikipedia

    en.wikipedia.org/wiki/Wikipedia:Database_download

    Wikipedia SQL dump parser is a .NET library to read MySQL dumps without the need to use a MySQL database. WikiDumpParser is a .NET Core library to parse the database dumps. Dictionary Builder is a Rust program that can parse XML dumps and extract entries into files.
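
    As a rough sketch of what such a parser does, the following Python example (not one of the tools named above) streams a MediaWiki XML export with the standard library and prints page titles; the namespace version and the file name are assumptions.

    ```python
    import xml.etree.ElementTree as ET

    # Namespace used by MediaWiki export files; the exact version suffix
    # varies between dumps, so treat this value as an assumption.
    NS = "{http://www.mediawiki.org/xml/export-0.10/}"

    def iter_titles(path):
        """Stream a (possibly huge) XML dump and yield page titles."""
        for event, elem in ET.iterparse(path, events=("end",)):
            if elem.tag == NS + "page":
                title = elem.findtext(NS + "title")
                if title is not None:
                    yield title
                elem.clear()  # free memory as pages are processed

    if __name__ == "__main__":
        for t in iter_titles("pages-articles.xml"):  # hypothetical local file
            print(t)
    ```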

  3. Wikipedia:Database dump import problems - Wikipedia

    en.wikipedia.org/wiki/Wikipedia:Database_dump...

    It seems to have worked. There might be a better way to increase the size of the table here (the db won't import with the default limit of 4 GB). Mr. Jones 13:56, 16 Dec 2004 (UTC) Indeed, setting myisam_data_pointer_size to 7 works as well, which seems better. Brighterorange 8 July 2005 16:25 (UTC) Curl also appears to have the 2 GB download limit.
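
    The practical workaround for a client-side limit like the 2 GB curl issue mentioned above is to use a client that streams the file to disk in chunks; a minimal Python sketch with the standard library follows, with the URL and file name as placeholders. The MySQL-side limit still has to be raised separately (e.g. via myisam_data_pointer_size) before importing.

    ```python
    import shutil
    import urllib.request

    # Placeholder URL: substitute the actual dump file you need.
    DUMP_URL = ("https://dumps.wikimedia.org/enwiki/latest/"
                "enwiki-latest-pages-articles.xml.bz2")

    def download(url, dest):
        """Stream a large dump to disk without holding it all in memory."""
        with urllib.request.urlopen(url) as resp, open(dest, "wb") as out:
            shutil.copyfileobj(resp, out, length=1024 * 1024)  # 1 MiB chunks

    if __name__ == "__main__":
        download(DUMP_URL, "enwiki-latest-pages-articles.xml.bz2")
    ```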

  4. Import and export of data - Wikipedia

    en.wikipedia.org/wiki/Import_and_export_of_data

    The import and export of data is the automated or semi-automated input and output of data sets between different software applications. It involves "translating" from the format used in one application into that used by another, where such translation is accomplished automatically via machine processes, such as transcoding, data transformation, and others.
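
    As a toy illustration of such a translation, the Python sketch below converts a CSV export into JSON using only the standard library; the file names and the assumption that the first CSV row holds column headers are invented for the example.

    ```python
    import csv
    import json

    def csv_to_json(csv_path, json_path):
        """Translate a CSV export into a JSON document (one object per row)."""
        with open(csv_path, newline="", encoding="utf-8") as f:
            rows = list(csv.DictReader(f))  # header row supplies the keys
        with open(json_path, "w", encoding="utf-8") as f:
            json.dump(rows, f, indent=2)

    if __name__ == "__main__":
        csv_to_json("export.csv", "export.json")  # hypothetical file names
    ```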

  5. Wikipedia:Database dump import problems/sample my.cnf

    en.wikipedia.org/wiki/Wikipedia:Database_dump...


  6. Backup - Wikipedia

    en.wikipedia.org/wiki/Backup

    A backup strategy requires an information repository, "a secondary storage space for data" [7] that aggregates backups of data "sources". The repository could be as simple as a list of all backup media (DVDs, etc.) and the dates produced, or could include a computerized index, catalog, or relational database.
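
    Such a computerized index can be very small in practice; the sketch below keeps a catalog of backup sources, media, and dates in an SQLite table (the schema and file names are invented for the example).

    ```python
    import sqlite3
    from datetime import date

    # Minimal backup catalog: what was backed up, onto which medium, and when.
    cat = sqlite3.connect("backup_catalog.db")
    cat.execute(
        "CREATE TABLE IF NOT EXISTS backups (source TEXT, medium TEXT, made_on TEXT)"
    )
    cat.execute(
        "INSERT INTO backups VALUES (?, ?, ?)",
        ("/home/alice", "DVD-042", date.today().isoformat()),
    )
    cat.commit()

    for row in cat.execute("SELECT * FROM backups ORDER BY made_on"):
        print(row)
    ```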

  7. Help:Export - Wikipedia

    en.wikipedia.org/wiki/Help:Export

    You can also use regular expressions to directly process parts of the XML code. These run fast but are difficult to maintain. Please list methods and tools for processing XML export here: Parse::MediaWikiDump is a Perl module for processing the XML dump file. m:Processing MediaWiki XML with STX - stream-based XML transformation.
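
    The regular-expression approach mentioned above can be as short as the following Python sketch, which pulls <title> elements out of an export file; it is fragile compared with a real XML parser, and the file name is a placeholder.

    ```python
    import re

    # Fragile but fast: works only because <title> elements in the export
    # are short, unnested, and kept on a single line.
    TITLE_RE = re.compile(r"<title>(.*?)</title>")

    with open("Wikipedia-export.xml", encoding="utf-8") as f:  # placeholder name
        for line in f:
            m = TITLE_RE.search(line)
            if m:
                print(m.group(1))
    ```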

  8. Wikipedia talk:Database download - Wikipedia

    en.wikipedia.org/wiki/Wikipedia_talk:Database...

    Not done. The page doesn't say anything about the size of pages-meta-current.xml.bz2. The page mentions 19 GB in the context of a different download, pages-articles-multistream.xml.bz2. The latest dump index says that the 19 GB file is now about 22 GB. -- John of Reading 07:06, 15 October 2023 (UTC)