Is there an alternative way to open a huge SQL file, other than the MySQL command line, EmEditor, and BigDump? - mysql

I am having a lot of trouble opening an SQL script. It's a database dump; I managed to see a little bit of it before everything I tried crashed.
This is the script file that I get : https://steam.internet.byu.edu/
It's quite a popular database.
The zipped file is 17 GB and the unzipped version is around 168 GB. I need the "steamid" values from the "Player_Summaries" table; that column is all I need. I've tried the following so far:
- I read a lot of "opening a huge SQL file" posts here and tried to source and open the file via the MySQL command line client. It ran almost all night but eventually crashed.
- I tried EmEditor: I installed the latest version, and it also crashed after loading 6-7% of the file. It reports an "unexpected crash" with a different error reason each time.
I am not even trying to run/execute the file; I was just going to copy the lines I need, that's all. Any help or advice is appreciated.
Sorry for my bad English.
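Since the goal is only one column, one option that avoids loading the whole 168 GB into any editor is to stream the dump line by line and pull the values out of the relevant INSERT statements. A minimal sketch in Python, under two assumptions worth verifying against a few lines of the actual dump first: the file uses standard mysqldump output (one `INSERT INTO \`Player_Summaries\` VALUES (...),(...);` per line), and `steamid` is the first, unquoted numeric column of each row.

```python
import re

def extract_steamids(dump_path, out_path, table="Player_Summaries"):
    """Stream a huge SQL dump line by line and write out the first value
    of each INSERT row for one table, without loading the file into memory.

    Assumes standard mysqldump output: one
    `INSERT INTO `table` VALUES (...),(...);` statement per line, and that
    the first column of each row is an unquoted number (the steamid)."""
    # Matches an opening parenthesis followed by digits; captures the digits.
    # Note: this is a heuristic -- "(123" inside a quoted string would match too.
    row_re = re.compile(r"\((\d+)")
    prefix = f"INSERT INTO `{table}`"
    with open(dump_path, encoding="utf-8", errors="replace") as src, \
         open(out_path, "w", encoding="utf-8") as dst:
        for line in src:
            if line.startswith(prefix):
                for steamid in row_re.findall(line):
                    dst.write(steamid + "\n")
```

Because the regex is a rough heuristic, treat the output as a first pass and spot-check it, rather than as an exact extraction.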

Related

Failing to import a CSV into PhpMyAdmin

I'm trying to get data from an SQL Server database into a phpMyAdmin database. Most of my attempts have failed in every way: writing a script, using MySQL Workbench, etc.
So I decided maybe I could just recreate a table in phpMyAdmin manually, dump the SQL Server data into a .csv file, and import it into my phpMyAdmin table. If that works, I could technically recreate the whole database manually.
Well, this is where my main problem lies. I'm sure some of you thought my question would be a duplicate of this one.
Well every result that comes up in my google search says something along the lines of:
"My .csv look like this: data, data, data, data, data, etc."
Well my .csv looks like this:
I'm slightly confused as to what exactly I'm doing wrong. Where should I put the separators? Am I looking at it wrong? Should I use another application to open the .csv file instead of Excel?
Because if I try to import the file in its current state, I get hit with the error "Invalid amount of columns in CSV-Import Row 1", which apparently indicates that there are no separators phpMyAdmin can recognize.
Does anybody see what I'm doing wrong?
Well, I received a downvote and assumed that if somebody was willing to downvote me this quickly, I just didn't look hard enough on Google.
So I went back to googling and found an article that explains the most common .csv problems.
One of them is that people open the file in Excel, see the data, and are confused as to why they don't seem to have the same file as everybody else.
So I had a lightbulb moment, opened the file in a plain text editor, and there my missing separators were.
Sorry for making an unnecessary post.
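The underlying lesson above, look at the raw text rather than Excel's rendered grid, can also be automated. A small sketch using Python's standard-library `csv.Sniffer` to guess which separator a file actually uses (the candidate set below is an assumption; extend it if your export uses something else):

```python
import csv

def detect_delimiter(path, candidates=",;\t|"):
    """Read a small sample of a CSV file as plain text (what a text editor
    shows, unlike Excel's rendered grid) and guess the delimiter with
    csv.Sniffer. `candidates` restricts the guess to common separators."""
    with open(path, newline="", encoding="utf-8") as f:
        sample = f.read(4096)
    dialect = csv.Sniffer().sniff(sample, delimiters=candidates)
    return dialect.delimiter
```

Knowing the real delimiter up front lets you set the matching "Columns separated with" value in phpMyAdmin's CSV import dialog instead of guessing.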

Writing many MySQL queries (in VS Code) in one file. How to avoid writing 'USE DBaseName' constantly?

I'm learning SQL / MySQL using WampServer and the MySQL extension for VS Code, so that I can write, comment, and keep the code in the VS Code editor instead of sending volatile queries from the command line.
This morning I created a database by right-clicking a database / New Query:
After creating it this way, I was able to run code without having to write USE DBaseName before the selected lines, as in the three selected lines you can see below:
Yet, after restarting my laptop at home, the same code will not run. It returns the usual
undefined
Error: ER_NO_DB_ERROR: No database selected
unless I write the USE statement, as in the last group (4 lines) of code. So I have to write that every time I want to try a query...
Why did it work this morning and not now? How can one run queries without having to constantly write USE DBaseName?
(links to explanations to further understand the underlying mechanism are also very welcome...)
EDIT: posting image of server in vsCode to answer comment:
I'm using a local server (pic below) which hasn't changed since this morning...
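Whatever reset the extension's session, there is a workaround that makes each statement independent of the session's default database: qualify every table name with the database name. A sketch in plain MySQL, with `mydb` and `customers` as placeholder names:

```sql
-- With USE: the default database is per-session state, so it has to be
-- set again whenever the extension opens a fresh connection.
USE mydb;
SELECT * FROM customers;

-- Without USE: a fully qualified name works in any session,
-- regardless of which default database (if any) is selected.
SELECT * FROM mydb.customers;
```

This is more typing per table reference, but the script then runs the same way no matter how or when the extension reconnects.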

How to view a .sql file that is too large to open in an editor?

I have got a .sql file as the result of backing up my whole website's data. All I want is to view the data as tables. The file is quite big: 700 MB. I have the MySQL software. When I try to open the file, it first asks whether to open or just run it, warning that the file is quite big to handle. Selecting Run makes the software freeze immediately; it eventually reports a problem and closes.
If I select Open, after a long time it opens, showing many lines of code with INSERTs and so on. If I choose to run the SQL from there, it gets stuck again, so it is impossible to view the tables that way either. Is there any alternative way to view the SQL file as tables, using any software or online tool?
I suggest you import your dump into a local database, then you will be able to navigate it and run queries against it.
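Concretely, the import can be done from the command line without opening the file in any editor. A sketch assuming a local MySQL server is running and the backup file is named `backup.sql` (both are placeholders; a real server and credentials are required, so this is not runnable stand-alone):

```shell
# Create an empty database for the backup, then stream the dump into it.
# The shell redirection reads the file incrementally, so the 700 MB file
# is never loaded into an editor.
mysql -u root -p -e "CREATE DATABASE site_backup"
mysql -u root -p site_backup < backup.sql

# Once imported, browse it with queries instead of scrolling the raw file:
mysql -u root -p site_backup -e "SHOW TABLES"
```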

Import a database to DataGrip (0xDBE)

How do I import a database in DataGrip, just like in phpMyAdmin?
I have the .sql exported from phpMyAdmin... but it is a lot of lines, so the IDE stops working when trying to run the whole .sql.
In DataGrip, go to File > Open and select your MySQL dump file. Then right-click the file's tab to get the context menu and select the "Run [your filename...]" option. It may ask you to select the schema to run against. This is how I imported a dump from phpMyAdmin using DataGrip.
The JetBrains documentation on running SQL scripts does not provide much information on processing large insert statements. There is a discussion in the DataGrip community forums, and apparently upcoming features will make working with large scripts easier.
Quote from thread:
Huge SQL files can be executed from Files view (use a context menu action).
I assume you are attempting to import a database export, i.e. a series of SQL statements saved to a file. There could be a memory issue if you attempt to run a large SQL file in memory. Try the following.
Insert commit statements in your SQL file in a text editor. This can even be done from within DataGrip. Every couple of hundred statements, place the line
commit;
which should flush the previous statements from memory. I strongly recommend saving the edited file separately from the original export script. This method is not applicable if you need an all-or-nothing import, meaning that if even one statement or block fails, you want all of the statements rolled back.
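Inserting those commit lines by hand is tedious in a large dump, so here is a short script that automates the edit described above. It is a sketch: it naively counts a statement whenever a line ends with `;`, which matches typical one-INSERT-per-line mysqldump output but not statements split across lines or semicolons inside string literals.

```python
def add_commits(src_path, dst_path, every=200):
    """Copy a SQL dump, inserting a `commit;` line after every `every`
    statements. A statement is naively counted whenever a line ends
    with ';' -- fine for typical one-INSERT-per-line mysqldump output,
    wrong for multi-line statements or semicolons inside strings."""
    count = 0
    with open(src_path, encoding="utf-8") as src, \
         open(dst_path, "w", encoding="utf-8") as dst:
        for line in src:
            dst.write(line)
            if line.rstrip().endswith(";"):
                count += 1
                if count % every == 0:
                    dst.write("commit;\n")
```

Writing to a separate destination file keeps the original export intact, per the advice above.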
1 - Go to View -> Tool Windows -> Files.
2 - Go to the schema folder and open it in Windows Explorer, then paste your dump file there; in my example I will paste MyDump.dmp.
3 - Right-click on MyDump.dmp and run it.
To import data from a script file, run the file as described in Run database code. In addition to script files, you can import a CSV, TSV, or any other text file that contains delimiter-separated values.
https://www.jetbrains.com/help/datagrip/import-data.html

Read data from an .mde database

I've got an .mde file, and all I need is to get the data from its tables in any format... I've tried everything I could find on the internet or come up with myself, but nothing helps. When I use the standard tools for extracting data in Access, it tells me that something is running operations on that database (I think there is some kind of macro or something else that starts when I open the file).
If you have any info on how to get data from this file, I will be SO thankful!
Thanks again for your attention, and I'm really sorry for my bad English (doing my best to learn to speak it better).
There are a few tricks you can try:
Press SHIFT while opening the file to prevent the AutoExec macro from running.
Just in case the file is corrupted, try to compact and repair it (make a backup copy before!).
You can use the command line option /compact to perform the compact and repair without first opening the file.
You can create a new database and create linked tables from the other database.
You'll have the option of copying the whole tables (you may need to rename the .mde to .mdb so you can see it in the dialog asking for the file to link to).