enow.com Web Search

Search results

  1. Wikipedia:Database download - Wikipedia

    en.wikipedia.org/wiki/Wikipedia:Database_download

    Start downloading a Wikipedia database dump file such as an English Wikipedia dump. It is best to use a download manager such as GetRight so you can resume downloading the file even if your computer crashes or is shut down during the download. Download XAMPPLITE from (you must get the 1.5.0 version for it to work). Make sure to pick the file ...
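
    A download manager matters here mainly because it can resume an interrupted transfer. As a rough sketch (not from the quoted page), the same effect can be had with an HTTP Range request in Python; the dump URL below is an assumed example following the usual dumps.wikimedia.org layout, and the server is assumed to honour Range requests.

      # Resumable download sketch using an HTTP Range request.
      # The URL is an assumed example; any large dump file works the same way.
      # If the file is already complete, the server may answer 416 and urlopen
      # will raise; a fuller tool would handle that case.
      import os
      import urllib.request

      URL = "https://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles-multistream.xml.bz2"
      DEST = "enwiki-latest-pages-articles-multistream.xml.bz2"

      def download_with_resume(url: str, dest: str, chunk_size: int = 1 << 20) -> None:
          # Start from however many bytes are already on disk.
          offset = os.path.getsize(dest) if os.path.exists(dest) else 0
          request = urllib.request.Request(url, headers={"Range": f"bytes={offset}-"})
          with urllib.request.urlopen(request) as response, open(dest, "ab") as out:
              while True:
                  chunk = response.read(chunk_size)
                  if not chunk:
                      break
                  out.write(chunk)

      if __name__ == "__main__":
          download_with_resume(URL, DEST)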

  2. Wikipedia talk:Database download/Archive 1 - Wikipedia

    en.wikipedia.org/wiki/Wikipedia_talk:Database...

    If one downloads the old database for English, and then imports it into a MySQL database, one finds that it is larger than the 4.1 GB limit most operating systems put on the table size. Of course, I can go in and edit the SQL myself, but it would be better if this were done at the source.
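
    The import step being discussed is essentially streaming the dump's SQL into the server. A minimal sketch, assuming a gzip-compressed dump and placeholder file, user, and database names, none of which come from the quoted thread:

      # Sketch of streaming a compressed SQL dump into the mysql client without
      # unpacking it to disk first. The file name, user, and database are
      # placeholders; credentials are assumed to live in an option file (~/.my.cnf).
      import gzip
      import shutil
      import subprocess

      DUMP = "enwiki-cur.sql.gz"   # hypothetical dump file
      DATABASE = "wikidb"          # hypothetical target database

      def import_dump(dump_path: str, database: str) -> None:
          mysql = subprocess.Popen(["mysql", "--user=wiki", database], stdin=subprocess.PIPE)
          with gzip.open(dump_path, "rb") as src:
              shutil.copyfileobj(src, mysql.stdin)   # pipe the decompressed SQL through
          mysql.stdin.close()
          if mysql.wait() != 0:
              raise RuntimeError("mysql exited with an error")

      if __name__ == "__main__":
          import_dump(DUMP, DATABASE)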

  3. Database dump - Wikipedia

    en.wikipedia.org/wiki/Database_dump

    A database dump contains a record of the table structure and/or the data from a database and is usually in the form of a list of SQL statements ("SQL dump"). A database dump is most often used for backing up a database so that its contents can be restored in the event of data loss. Corrupted databases can often be recovered by analysis of the ...
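
    As a toy picture of such a "list of SQL statements", the sketch below uses Python's built-in sqlite3 module, whose iterdump() writes a database out as SQL and whose executescript() replays it; the table and file names are invented for the example.

      # Toy illustration of a SQL dump: write a database out as SQL statements,
      # then rebuild another database from that dump. All names are invented.
      import sqlite3

      def dump_database(db_path: str, dump_path: str) -> None:
          with sqlite3.connect(db_path) as conn, open(dump_path, "w", encoding="utf-8") as out:
              for statement in conn.iterdump():       # yields CREATE/INSERT statements
                  out.write(statement + "\n")

      def restore_database(dump_path: str, db_path: str) -> None:
          with sqlite3.connect(db_path) as conn, open(dump_path, encoding="utf-8") as src:
              conn.executescript(src.read())          # replay the SQL dump

      if __name__ == "__main__":
          with sqlite3.connect("example.db") as conn:
              conn.execute("CREATE TABLE IF NOT EXISTS page (id INTEGER PRIMARY KEY, title TEXT)")
              conn.execute("INSERT INTO page (title) VALUES ('Database dump')")
          dump_database("example.db", "example.sql")
          restore_database("example.sql", "restored_copy.db")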

  4. Comparison of database administration tools - Wikipedia

    en.wikipedia.org/wiki/Comparison_of_database...

    Yes - TXT, CSV, HTML, XML, DBF, SQL script, RTF, MS Word, MS Excel, MS Access, MS Windows Clipboard, Paradox file, WK1, WQ1, SLK, DIF, LDIF (See link for limitations [16]) Yes No Navicat Data Modeler: No No Yes Yes - Import Database from server/ODBC Yes - Export SQL No No MySQL Workbench: Yes Yes Yes

  5. Help:Export - Wikipedia

    en.wikipedia.org/wiki/Help:Export

    Wiki pages can be exported in a special XML format to import into another MediaWiki installation, or used elsewhere, for instance for analysing the content. See also m:Syndication feeds for exporting all other information except pages, and see Help:Import on importing pages.
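
    One way to obtain that XML for a single page is the Special:Export endpoint. A small sketch, with the page title chosen only as an example:

      # Sketch: fetch the XML export of a single page via Special:Export.
      # The page title here is just an example.
      import urllib.parse
      import urllib.request

      def export_page_xml(title: str) -> str:
          url = "https://en.wikipedia.org/wiki/Special:Export/" + urllib.parse.quote(title)
          request = urllib.request.Request(url, headers={"User-Agent": "export-example/0.1"})
          with urllib.request.urlopen(request) as response:
              return response.read().decode("utf-8")

      if __name__ == "__main__":
          xml_text = export_page_xml("Help:Export")
          print(xml_text[:500])  # show the start of the <mediawiki> XML document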

  6. Wikipedia talk:Database download - Wikipedia

    en.wikipedia.org/wiki/Wikipedia_talk:Database...

    The page mentions 19 GB in the context of a different download, pages-articles-multistream.xml.bz2. The latest dump index says that the 19 GB file is now about 22 GB. -- John of Reading 07:06, 15 October 2023 (UTC)

  7. Wikipedia talk:Database download/Archive 2 - Wikipedia

    en.wikipedia.org/wiki/Wikipedia_talk:Database...

    It should be easy to run something like sed or a Python script on the database dump to strip all of that formatting out. — Preceding unsigned comment added by RaptorHunter (talk • contribs) 02:47, 11 March 2011 (UTC)
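
    A naive sketch of the kind of one-pass clean-up the comment has in mind follows; real wikitext needs a proper parser, and the file name and the few patterns here are only illustrative.

      # Naive sketch of stripping some wiki markup from a dump, read as a stream.
      # Real wikitext is far messier than these few patterns; this only shows the
      # "run a small script over the dump" idea from the comment above.
      import bz2
      import re

      DUMP = "enwiki-latest-pages-articles.xml.bz2"   # assumed file name

      LINK = re.compile(r"\[\[(?:[^|\]]*\|)?([^\]]*)\]\]")   # [[target|label]] -> label
      BOLD_ITALIC = re.compile(r"'{2,}")                      # '' and ''' markup
      TEMPLATE = re.compile(r"\{\{[^{}]*\}\}")                # simple {{templates}}

      def strip_markup(line: str) -> str:
          line = TEMPLATE.sub("", line)
          line = LINK.sub(r"\1", line)
          line = BOLD_ITALIC.sub("", line)
          return line

      if __name__ == "__main__":
          with bz2.open(DUMP, "rt", encoding="utf-8") as dump, \
               open("stripped.txt", "w", encoding="utf-8") as out:
              for line in dump:
                  out.write(strip_markup(line))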

  8. Wikipedia:AutoWikiBrowser/Database Scanner - Wikipedia

    en.wikipedia.org/.../Database_Scanner

    The database scanner is multi-threaded: it uses the main thread to read the database XML file from disk, and additional thread(s) to search the articles based on the user's search criteria, with the total number of threads equalling the number of CPU cores (e.g. a quad-core CPU without hyperthreading gives 1 main and 3 secondary threads). The ...
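
    Purely to illustrate the one-reader/many-searchers split described in that snippet (this is not AutoWikiBrowser's actual code), a producer-consumer sketch might look like the following; the input file and search string are placeholders.

      # Illustration of the one-reader / N-searcher split described above: one
      # thread streams the file from disk into a queue, the other threads scan it.
      # This mimics the described design in spirit only; it is not
      # AutoWikiBrowser's actual implementation.
      import os
      import queue
      import threading

      DUMP = "pages-articles.xml"     # placeholder input file
      NEEDLE = "database dump"        # placeholder search string (lower-case)

      tasks: queue.Queue = queue.Queue(maxsize=1000)
      matches = []
      matches_lock = threading.Lock()

      def reader(worker_count: int) -> None:
          # Producer: the "main" role -- read the file and queue each line.
          with open(DUMP, encoding="utf-8", errors="replace") as f:
              for line in f:
                  tasks.put(line)
          for _ in range(worker_count):
              tasks.put(None)          # one sentinel per worker signals "no more work"

      def searcher() -> None:
          # Consumer: apply the search criterion to queued lines.
          while True:
              line = tasks.get()
              if line is None:
                  break
              if NEEDLE in line.lower():
                  with matches_lock:
                      matches.append(line.strip())

      if __name__ == "__main__":
          # Total threads = number of CPU cores: this main thread reads,
          # the remaining cores each get a searcher thread.
          workers_needed = max(1, (os.cpu_count() or 2) - 1)
          workers = [threading.Thread(target=searcher) for _ in range(workers_needed)]
          for w in workers:
              w.start()
          reader(workers_needed)
          for w in workers:
              w.join()
          print(f"{len(matches)} matching lines found")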