I am uploading Excel files (.xls) containing numeric values such as 884.557 and 731.0547 into a MySQL database using phpMyAdmin's built-in Import function. However, I am having horrible rounding/truncation issues: some values like 884.557 and 731.0547 are changed to 99.99999 or 9.99999, while other values like 127.0947 are imported correctly. Can anyone help? If possible, I would like to keep using the built-in phpMyAdmin Import function because it is convenient.
If you are familiar with HTML and PHP, you can build your own Excel-to-MySQL import using the SimpleXLSX library. It may take a few minutes to set up, but once you create it you can reuse it indefinitely.
First, create an HTML form to upload the Excel sheet.
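A minimal sketch of such a form: the file input is named Filedata because that is the $_FILES key the script below reads, and the action target import.php is an assumed file name.

<form action="import.php" method="post" enctype="multipart/form-data">
    <input type="file" name="Filedata">
    <button type="submit">Upload</button>
</form>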
Then create a PHP script like the one below:
require 'simplexlsx.class.php';
// assumes a MySQL connection has already been opened with mysql_connect()

if (isset($_FILES['Filedata'])) {
    $file = $_FILES['Filedata']['tmp_name']; // uploaded Excel file
    $xlsx = new SimpleXLSX($file);
    list($cols, $rows) = $xlsx->dimension();
    foreach ($xlsx->rows() as $k => $r) { // loop through the worksheet rows
        $q  = "INSERT INTO TABLENAME (COL1, COL2) VALUES (";
        $q .= "'" . mysql_escape_string($r[0]) . "', "; // first cell of the row
        $q .= "'" . mysql_escape_string($r[1]) . "'";   // second cell of the row
        $q .= ")";
        $sql = mysql_query($q);
    } // foreach ends here
} // if ends here
This is what I normally do:
1. Save the Excel file in CSV format.
2. Manually create the database table, specifying the data type for every column of interest.
3. Upload the CSV file into that table, skipping the "column names" row since the columns are already defined in step 2. Decimals get truncated because phpMyAdmin uses some unexplained heuristic to pick the data type and size of each column when it creates the table for you; creating the table yourself, as in step 2, prevents that (a sketch is shown below).
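For example, a minimal sketch of step 2, run in phpMyAdmin's SQL tab before importing; the table and column names are made up, and DECIMAL(10,4) is just an assumed precision wide enough for values such as 731.0547:

CREATE TABLE measurements (
    id INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
    reading_a DECIMAL(10,4) NOT NULL, -- keeps 884.5570 intact instead of letting the importer guess a type
    reading_b DECIMAL(10,4) NOT NULL  -- keeps 731.0547 intact
);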
Hope it helps!
I read SQL from a file with file_get_contents and pass it to the DB::unprepared function:
$path = public_path('sql/Store.sql');
$sql = file_get_contents($path);
DB::unprepared($sql);
The tables are created, but the triggers are not.
When I paste the same SQL into phpMyAdmin, both the tables and the triggers are created successfully. I use server version 10.3.28-MariaDB.
Do you have any idea how to solve this issue?
You should trim the contents read from the file to remove control characters such as \n at the end of the SQL. I've used trim($sql, "\x00..\x1F") to strip that range of characters from your SQL:
$path = public_path('sql/Store.sql');
$sql = trim(file_get_contents($path), "\x00..\x1F");
DB::unprepared($sql);
I have created a section in my functions.php file and registered a shortcode which is intended to echo some data from a custom MySQL database table.
An external system will insert data into this database.
My WordPress install should be able to read from this database, and I would like to echo some rows on a WordPress page via the shortcode.
The problem I am having is that the table data is not being echoed out. I can do echo "test"; and the word test gets displayed, but the table data never shows. Here's my code.
function vstp_feed_function(){
    //$x++;
    //$alternatingclass = ($x%2 == 0)? 'whiteBackground': 'graybackground';
    $mydb = new wpdb('root','cencored','testdb','localhost');
    $rows = $mydb->get_results("select * from services");
    echo "<ul>";
    foreach ($rows as $obj) :
        echo "<li>".$obj->headcode."</li>";
    endforeach;
    echo "</ul>";
}
add_shortcode('vstp_feed', 'vstp_feed_function');
So the code above creates a new wpdb instance, $mydb, with the database credentials, and the $rows variable stores the result of the SQL query run through it: I select all columns from the services table. After that I echo an unordered list and loop over the results, printing the contents of the headcode column as list items.
Once the query results have been looped over, the list is closed.
The shortcode is registered as [vstp_feed], which is tested and working with static HTML.
Any help would be greatly appreciated. I've worked with SQL before, but not with new WordPress db instances. Thanks.
Just to confirm, it's Apache on Ubuntu. It's also worth noting that my ##datadir is stored on a different disk, separate from my OS disk, for faster IO and the freedom to expand storage on the fly.
Place your connection code inside the wp-config.php file ($mydb = new wpdb('root','cencored','testdb','localhost');)
and then use it in your functions file with global $mydb;.
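A minimal sketch of that idea, with the connection defined once and reused via global (shown here at the top of functions.php, where the wpdb class is already loaded; the credentials are the ones from the question, and returning the markup rather than echoing it is the usual shortcode convention):

$mydb = new wpdb('root', 'cencored', 'testdb', 'localhost'); // created once, in global scope

function vstp_feed_function() {
    global $mydb; // reuse the shared connection inside the shortcode callback
    $rows = $mydb->get_results("SELECT * FROM services");
    $out = "<ul>";
    foreach ($rows as $obj) {
        $out .= "<li>" . $obj->headcode . "</li>";
    }
    return $out . "</ul>"; // shortcodes should return their output rather than echo it
}
add_shortcode('vstp_feed', 'vstp_feed_function');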
I've got 200k CSV files and I need to import them all into a single PostgreSQL table. They hold lists of parameters from various devices; each CSV's file name contains the device's serial number, and I need that serial number in one of the columns of each row.
To simplify, there are a few columns of data (no headers). Let's say the columns in each CSV file are Date, Variable, Value, and the file name looks like SERIALNUMBER_and_someOtherStuffIDontNeed.csv.
I'm trying to use Cygwin to write a bash script that iterates over the files and does this for me, but for some reason it won't work, showing 'syntax error at or near "as"'.
Here's my code:
#!/bin/bash
FILELIST=/cygdrive/c/devices/files/*
for INPUT_FILE in $FILELIST
do
psql -U postgres -d devices -c "copy devicelist
(
Date,
Variable,
Value,
SN as CURRENT_LOAD_SOURCE(),
)
from '$INPUT_FILE
delimiter ',' ;"
done
I'm learning SQL so it might be an obvious mistake, but I can't see it.
Also, I know that in this form I will get the full file name, not just the serial number part I want, but I can probably handle that somehow later.
Please advise.
Thanks.
I don't think there is a CURRENT_LOAD_SOURCE() function in Postgres. A workaround is to leave the name column NULL during the COPY and patch it to the desired value just after the copy. I prefer a shell here-document because that makes quoting inside the SQL body easier. (BTW: with this many files, the globbing needed to build FILELIST might exceed the shell's ARG_MAX...)
#!/bin/bash
# adjust the glob to point at your CSV directory
FILELIST="`ls /tmp/*.csv`"
for INPUT_FILE in $FILELIST
do
  echo "File:" $INPUT_FILE
  psql -U postgres -d devices <<OMG
-- I have a schema "tmp" for testing purposes
CREATE TABLE IF NOT EXISTS tmp.filelist(name text, content text);
COPY tmp.filelist (content)
FROM '$INPUT_FILE' delimiter ',' ;
-- patch the name column for the rows that were just copied
UPDATE tmp.filelist SET name = '$INPUT_FILE'
WHERE name IS NULL;
OMG
done
For anyone interested in an answer: I used a Python script to rename the files and then another script using psycopg2 to connect to the database and do everything in one connection. It took 10 minutes instead of 10 hours.
Here's the code:
Renaming the files (it also turns out that to import from CSV all the rows need to be filled, and the information I needed was in the first 4 columns anyway, so I generate whole new CSVs instead of just renaming them):
import os
import csv

path = 'C:/devices/files'
os.chdir(path)
i = 0
for file in os.listdir(path):
    try:
        i += 1
        if i % 10000 == 0:
            # just to see the progress
            print(i)
        serial_number = file[:8]
        creader = csv.reader(open(file))
        cwriter = csv.writer(open('processed_' + file, 'w'))
        for cline in creader:
            # drop the columns I don't need and prepend the serial number
            new_line = [val for col, val in enumerate(cline) if col not in (4, 5, 6, 7)]
            new_line.insert(0, serial_number)
            # print(new_line)
            cwriter.writerow(new_line)
    except:
        print('problem with file: ' + file)
        pass
Updating database:
import os
import psycopg2

path = "C:\\devices\\files"
directory_listing = os.listdir(path)

conn = psycopg2.connect("dbname='devices' user='postgres' host='localhost'")
cursor = conn.cursor()
print(len(directory_listing))

i = 100001
while i < 218792:
    current_file = directory_listing[i]
    i += 1
    full_path = "C:/devices/files/" + current_file
    with open(full_path) as f:
        cursor.copy_from(file=f, table='devicelistlive', sep=",")
conn.commit()
conn.close()
Don't mind the while loop and the odd numbers; I was just running it in portions for testing purposes. It can easily be replaced with a for loop (see the sketch below).
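A minimal sketch of that replacement, iterating directly over the directory listing instead of indexing with a counter (same assumed table, path, and connection string as above):

import os
import psycopg2

path = "C:\\devices\\files"
conn = psycopg2.connect("dbname='devices' user='postgres' host='localhost'")
cursor = conn.cursor()

# loop straight over the files instead of using a while loop with an index
for current_file in os.listdir(path):
    with open(os.path.join(path, current_file)) as f:
        cursor.copy_from(file=f, table='devicelistlive', sep=",")

conn.commit()
conn.close()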
I am very new to MySQL; I only started learning it today.
I have been given a task.
In this task I need to create a sample CSV file in Excel, upload it to a MySQL server, make some changes, and then write it back as a CSV file on the hard drive.
I looked up many links on the internet, but they are too complicated to understand.
Can somebody explain it to me in simple words, assuming that I am a beginner who only knows how to create a connection to MySQL?
Thank you.
Try this,
<?php
// NOTE: assumes a MySQL connection has already been opened with mysql_connect() / mysql_select_db()
$csvFile = 'test_data.csv';
$csv = readCSVFile($csvFile);
if (!empty($csv))
{
    foreach ($csv as $file)
    {
        if ($file === false) continue; // fgetcsv() returns false for blank/EOF lines
        // inserting into database
        $query_insert = "insert into csv_data_upload set
            name = '" . $file[0] . "',
            value = '" . $file[1] . "'";
        echo $query_insert;
        $insert = mysql_query($query_insert);
    }
} else {
    echo 'Csv is empty';
}

// Reading CSV File
function readCSVFile($csvFile)
{
    $file_handle = fopen($csvFile, 'r');
    while (!feof($file_handle))
    {
        $line_of_text[] = fgetcsv($file_handle, 1024);
    }
    fclose($file_handle);
    return $line_of_text;
}
?>
And for writing a CSV file, try this:
<?php
$list = array(
    array('aaa', 'bbb', 'ccc', 'dddd'),
    array('123', '456', '789'),
    array('"aaa"', '"bbb"')
);
$fp = fopen('file.csv', 'w');
foreach ($list as $fields)
{
    fputcsv($fp, $fields);
}
fclose($fp);
?>
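To complete the round trip the question asks about (reading the edited rows back out of MySQL and writing them to a CSV on disk), here is a minimal sketch; the csv_data_upload table and the old mysql_* API match the snippets above, and a connection is assumed to be open already:

<?php
// export the csv_data_upload table back to a CSV file on disk
$result = mysql_query("SELECT name, value FROM csv_data_upload");
$fp = fopen('export.csv', 'w');
while ($row = mysql_fetch_assoc($result))
{
    fputcsv($fp, $row); // one CSV line per table row
}
fclose($fp);
?>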
I would recommend uploading the data via Microsoft Access. Access, like Excel, is part of Microsoft's Office suite. You can easily import the CSV / Excel file via a wizard, then use the database wizard to upload everything to MySQL.
By the way, you could probably make the same changes to the data in MS Access; with some adaptations it runs the same SQL commands, and you could also use the GUI. I would recommend MySQL more for online purposes, not for work you can do offline. And let's be fair, a command line might not be the easiest point of entry to databases for everybody. There are more user-friendly alternatives for MySQL, for example MySQL Workbench and phpMyAdmin. You could use them to import directly into MySQL, but I have no experience with that option.
Good luck with your task.
You should include the code you have tried in your question, and you can also search Google before posting. Suggested link to check: http://www.php-guru.in/2013/import-csv-data-to-mysql-using-php/
I want to store huge content in the database; my sample text is 16129 characters long. When I try to execute this query, Firefox shows "error: The requested URL could not be retrieved" and Chrome shows "no data received".
I use LONGTEXT as the data type for the content column in the DB.
When I execute the same query directly in phpMyAdmin, it works correctly.
The code is shown below.
public function _getConnection($type = 'core_write') {
    return Mage::getSingleton('core/resource')->getConnection($type);
}

public function testdbAction(){
    $db = $this->_getConnection();
    $current_time = now();
    $text = "The European languages are members of the same family...... ...Europe uses the same vocabulary. The "; // text is 16129 characters in length
    $sql = "INSERT into test(`usercontent_id`,`app_id`,`module_id`,`customer_id`,`content`,`created_time`,`updated_time`,`item_id`,`index_id`,`position_id`) VALUES (NULL, 15, 9,2,'" . $text . "','" . $current_time . "','" . $current_time . "',1003,5,4)";
    $db->query($sql);
}
How do I handle this? Any suggestions or help would be appreciated.
Try using $db->exec($sql) instead of $db->query($sql)
Magento has a dedicated place for manipulating database structure and data: install / upgrade scripts. They were introduced to keep track of module versions and make updates easy, so you should use such a script to add new data.
Here's an example.
config.xml of your module:
<modules>
    <My_Module>
        <version>0.0.1</version>
    </My_Module>
</modules>
<global>
    <resources>
        <my_module_setup>
            <setup>
                <module>My_Module</module>
                <class>Mage_Core_Model_Resource_Setup</class>
            </setup>
        </my_module_setup>
    </resources>
</global>
Now you need to create the following file:
My_Module/sql/my_module_setup/install-0.0.1.php
Depending on your Magento version, the file may need a different name: if you're using Magento CE 1.4 or lower (EE 1.8), name it mysql4-install-0.0.1.php.
This script will be launched on the next website request. The Mage_Core_Model_Resource_Setup class will execute the code inside install-0.0.1.php, and from within the install script you have access to the setup class through the $this object.
So now you can write the following code in the script:
$this->startSetup();

$text = <<<TEXT
YOUR LONG TEXT GOES HERE
TEXT;

$this->getConnection()
    ->insert(
        $this->getTable('test'),
        array(
            'usercontent_id' => null,
            'app_id'         => 15,
            'content'        => $text,
            // other fields in the same fashion
        )
    );

$this->endSetup();
And that's it. It's a clean and appropriate way of adding custom data to the database in Magento.
If you want to save user input on a regular basis using forms, then I recommend creating a model, resource model, and collection, and defining the entities in config.xml (a rough sketch of those nodes is below). For more information please refer to Alan Storm's articles, like this one.
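A rough sketch of what those config.xml declarations might look like; every node name below (my_module, usercontent, the table name test) is just an assumed example, not something taken from the original question:

<global>
    <models>
        <my_module>
            <class>My_Module_Model</class>
            <resourceModel>my_module_resource</resourceModel>
        </my_module>
        <my_module_resource>
            <class>My_Module_Model_Resource</class>
            <entities>
                <usercontent>
                    <table>test</table>
                </usercontent>
            </entities>
        </my_module_resource>
    </models>
</global>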
I hope that I understood your question correctly.