Uploading a text file to MySQL on an Apache server - mysql

In my project, RFID tags are read by the reader and the info is stored in a text file. I want this text file to be uploaded (or somehow copied) into the MySQL database behind phpMyAdmin on the Apache server, so that I can do further processing and display the results on a website. How can I do it?

Read the text file into an array, then import the array into the database (visible in phpMyAdmin) using a loop:
<?php
// Connect first (the old mysql_* functions are deprecated; mysqli is the replacement)
$db = new mysqli('localhost', 'user', 'password', 'database');

$file_name = "filename.txt";
$lines = file($file_name, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

// A prepared statement keeps tag data from breaking the query
$stmt = $db->prepare("INSERT INTO table_name(column1) VALUES (?)");
foreach ($lines as $line) {
    $stmt->bind_param('s', $line);
    $stmt->execute();
}
?>
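Alternatively, MySQL can bulk-load the whole file in one query with LOAD DATA LOCAL INFILE, which is much faster than a row-by-row loop. A minimal sketch, assuming the same hypothetical table_name and column1, one tag per line in the file, and local_infile enabled on the server:
<?php
// Sketch only: bulk-load the whole file in a single query
$db = mysqli_init();
$db->options(MYSQLI_OPT_LOCAL_INFILE, true); // must be set before connecting
$db->real_connect('localhost', 'user', 'password', 'database');

$db->query("LOAD DATA LOCAL INFILE 'filename.txt'
            INTO TABLE table_name
            LINES TERMINATED BY '\\n'
            (column1)");
?>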

Related

Using wpdb to display non-WordPress database data not working

I have created a section in my functions.php file and registered a shortcode which is intended to echo some data from a custom MySQL database table.
I have an external system which will insert data into the database.
My WordPress install should be able to read from this database, and I would like to echo some rows on a WordPress page via the shortcode.
The problem I am having is that the table data is not being echoed out. I can do echo "test"; and the word test gets displayed, but the table data does not show. Here's my code.
function vstp_feed_function() {
    //$x++;
    //$alternatingclass = ($x%2 == 0) ? 'whiteBackground' : 'graybackground';
    $mydb = new wpdb('root', 'cencored', 'testdb', 'localhost');
    $rows = $mydb->get_results("select * from services");
    echo "<ul>";
    foreach ($rows as $obj) :
        echo "<li>" . $obj->headcode . "</li>";
    endforeach;
    echo "</ul>";
}
add_shortcode('vstp_feed', 'vstp_feed_function');
So the above code creates a new variable, $mydb, with the database credentials and connection information. A new variable called $rows stores the result of the SQL query run through $mydb; I get all columns from the services table. Thereafter I echo an unordered list, looping over the results to display the contents of the "headcode" column as list items.
Once the loop over the query results is done, the HTML list is closed.
The shortcode is registered as [vstp_feed], which is tested and working with static HTML.
Any help would be greatly appreciated. I've worked with SQL before, but not with new WordPress db instances. Thanks.
Just to confirm, it's Apache on Ubuntu. It's also worth noting that my datadir is stored on a different disk, separate from my OS disk, for faster IO and the freedom to expand storage on the fly.
Place your connection code inside the wp-config.php file ( $mydb = new wpdb('root','cencored','testdb','localhost'); )
and use it in your functions file as global $mydb;.
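A minimal sketch of that arrangement (the credentials are placeholders; the table and column names follow the question):
<?php
// In wp-config.php: create the connection once, globally.
// wpdb's constructor takes (user, password, database, host).
$mydb = new wpdb('db_user', 'db_password', 'testdb', 'localhost');

// In functions.php: pull the global connection into the shortcode handler.
function vstp_feed_function() {
    global $mydb;
    $rows = $mydb->get_results("SELECT * FROM services");
    $out = "<ul>";
    foreach ($rows as $obj) {
        $out .= "<li>" . esc_html($obj->headcode) . "</li>";
    }
    return $out . "</ul>"; // shortcodes should return markup, not echo it
}
add_shortcode('vstp_feed', 'vstp_feed_function');
?>
As a side note, shortcode handlers are expected to return their markup rather than echo it, so the output appears where the shortcode is placed on the page.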

Converting evtx log to csv error

I am trying to convert an evtx log file to csv using Log Parser 2.2. I just want to copy all of the data into a csv.
LogParser "Select * INTO C:\Users\IBM_ADMI
N\Desktop\sample.csv FROM C:\Users\IBM_ADMIN\Desktop\Event
Logs\sample.evtx" -i:EVTX -o:csv
But I am getting the error below.
Error: Syntax Error: extra token(s) after query: 'Logs\sample.evtx'
Please assist in solving this error.
I know this has been a year, but if you (or other people) still need it, and for the sake of reference, this is what I do:
LogParser "Select * INTO C:\Users\IBM_ADMIN\Desktop\sample.csv FROM 'C:\Users\IBM_ADMIN\Desktop\Event Logs\sample.evtx'" -i:evt -o:csv
The correct input type is evt, not evtx.
If there is a space in the path (as in the Event Logs folder), enclose it in single quotes.
The problem was due to the space in the folder name Event Logs. I changed the folder name to a single word and it worked.
You have to convert the .evtx file to .csv; then you can read from that .csv file, like this:
// Call PowerShell from Java code; backslashes in the path must be escaped in a Java string
String command = "powershell.exe Get-WinEvent -Path "
        + "C:\\windows\\System32\\winevt\\Logs\\System.evtx | Export-Csv system.csv";
Process powerShellProcess = Runtime.getRuntime().exec(command);
powerShellProcess.waitFor(); // wait for the export to finish
File csv = new File("system.csv");

How to import a CSV file into a MySQL server?

I am very new to MySQL; I started learning it just today.
I have been given a task.
In this task I need to create a sample CSV file in Excel, upload it to the MySQL server, make some changes, and then write it back to the hard drive as a CSV file.
I looked up many links available on the internet, but they are too complicated to understand.
Can somebody explain it to me in simple words, assuming that I am a beginner who only knows how to create a connection to MySQL?
Thank you.
Try this:
<?php
// Connect first (the old mysql_* functions are deprecated; mysqli is the replacement)
$db = new mysqli('localhost', 'user', 'password', 'database');

$csvFile = 'test_data.csv';
$csv = readCSVFile($csvFile);

if (!empty($csv)) {
    // A prepared statement escapes the CSV values safely
    $stmt = $db->prepare("insert into csv_data_upload set name = ?, value = ?");
    foreach ($csv as $file) {
        // inserting into database
        $stmt->bind_param('ss', $file[0], $file[1]);
        $stmt->execute();
    }
} else {
    echo 'Csv is empty';
}

// Reading the CSV file into an array of rows
function readCSVFile($csvFile)
{
    $line_of_text = array();
    $file_handle = fopen($csvFile, 'r');
    // fgetcsv returns false at end of file, which ends the loop
    // (looping on feof() would append a spurious false row)
    while (($line = fgetcsv($file_handle, 1024)) !== false) {
        $line_of_text[] = $line;
    }
    fclose($file_handle);
    return $line_of_text;
}
?>
And for writing a CSV file, try this:
<?php
$list = array(
    array('aaa', 'bbb', 'ccc', 'dddd'),
    array('123', '456', '789'),
    array('"aaa"', '"bbb"')
);

$fp = fopen('file.csv', 'w');
foreach ($list as $fields) {
    // fputcsv handles quoting and delimiters for each row
    fputcsv($fp, $fields);
}
fclose($fp);
?>
I would recommend uploading the data via Microsoft Access. Access, like Excel, is part of the Microsoft Office suite. You can easily import the CSV/Excel file via a wizard, then use the database wizard to upload everything to MySQL.
By the way, you can probably make the same changes to the data in MS Access; with some adaptations it runs the same SQL commands, and you can also use the GUI. I would recommend MySQL more for online purposes, not for work you can do offline. And let's be fair: a command line might not be the easiest point of entry to databases for everybody. There are more user-friendly alternatives to the MySQL command line, for example MySQL Workbench and phpMyAdmin. You could use them to import directly into MySQL, but I have no experience with that option.
Good luck with your task.
You should include the code you have tried; you can also search Google before posting a question. Suggested link to check: http://www.php-guru.in/2013/import-csv-data-to-mysql-using-php/

Pairing content on an external website with entries in a MySQL database

tl;dr: I'm looking for a way to find entries in our database which are missing information, get that information from a website, and add it to the database entry.
We have a media management program which uses a MySQL table to store the information. When employees download media (video files, images, audio files) and import it into the media manager, they are supposed to also copy the description of the media (from the source website) and add it to the description in the Media Manager. However, this has not been done for thousands of files.
The file name (e.g. file123.mov) is unique, and the details page for that file can be accessed by going to a URL on the source website:
website.com/content/file123
The information we want to scrape from that page has an element ID which is always the same.
In my mind the process would be:
1. Connect to the database and load the table
2. Filter: "format" is "Still Image (JPEG)"
3. Filter: "description" is NULL
4. Get the first result
5. Get "FILENAME" (without the extension)
6. Load the URL: website.com/content/FILENAME
7. Copy the contents of the "description" element (on the website)
8. Paste the contents into "description" (SQL entry)
9. Get the 2nd result
10. Rinse and repeat until the last result is reached
My question(s) are:
Is there software that could perform such a task or is this something that would need to be scripted?
If scripted, what would be the best type of script (e.g. could I achieve this using AppleScript, or would it need to be written in Java or PHP, etc.)?
Is there software that could perform such a task or is this something that would need to be scripted?
I'm not aware of anything that will do what you want out of the box (and even if there were, the configuration required wouldn't be much less work than the scripting involved in rolling your own solution).
If scripted, what would be the best type of script (e.g. could I achieve this using AppleScript, or would it need to be written in Java or PHP, etc.)?
AppleScript can't connect to databases, so you will definitely need to throw something else into the mix. If the choice is between Java and PHP (and you're equally familiar with both), I'd definitely recommend PHP for this purpose, as there will be considerably less code involved.
Your PHP script would look something like this:
$BASEURL = 'http://website.com/content/';

// connect to the database
$dbh = new PDO($DSN, $USERNAME, $PASSWORD);

// query for files without descriptions
$qry = $dbh->query("
    SELECT FILENAME FROM mytable
    WHERE format = 'Still Image (JPEG)' AND description IS NULL
");

// prepare an update statement
$update = $dbh->prepare('
    UPDATE mytable SET description = :d WHERE FILENAME = :f
');
$update->bindParam(':d', $DESCRIPTION);
$update->bindParam(':f', $FILENAME);

// suppress warnings from malformed real-world HTML
libxml_use_internal_errors(true);

// loop over the files
while ($FILENAME = $qry->fetchColumn()) {
    // construct the URL by stripping the file extension
    $i = strrpos($FILENAME, '.');
    $url = $BASEURL . (($i === false) ? $FILENAME : substr($FILENAME, 0, $i));

    // fetch the document
    $doc = new DOMDocument();
    $doc->loadHTMLFile($url);

    // get the description (the method is getElementById, singular)
    $DESCRIPTION = $doc->getElementById('description')->nodeValue;

    // update the database
    $update->execute();
}
I too am not aware of any existing software packages that will do everything you're looking for. However, Python can connect to your database, make web requests easily, and handle dirty html. Assuming you already have Python installed, you'll need three packages:
MySQLdb for connecting to the database.
Requests for easily making http web requests.
BeautifulSoup for robust parsing of html.
You can install these packages with pip commands or Windows installers. Appropriate instructions are on each site. The whole process won't take more than 10 minutes.
import MySQLdb as db
import os.path
import requests
from bs4 import BeautifulSoup

# Connect to the database. Fill in these fields as necessary.
con = db.connect(host='hostname', user='username', passwd='password',
                 db='dbname')

# Create and execute our SELECT sql statement.
# (MySQLdb uses %s placeholders, and NULL is matched with IS NULL.)
select = con.cursor()
select.execute('SELECT filename FROM table_name '
               'WHERE format = %s AND description IS NULL',
               ('Still Image (JPEG)',))

while True:
    # Fetch a row from the result of the SELECT statement.
    row = select.fetchone()
    if row is None:
        break

    # Use Python's built-in os.path.splitext to split the extension
    # and get the url_name.
    filename = row[0]
    url_name = os.path.splitext(filename)[0]
    url = 'http://www.website.com/content/' + url_name

    # Make the web request. You may want to rate-limit your requests
    # so that the website doesn't get angry. You can slow down the
    # rate by inserting a pause with:
    #
    #   import time   # put this at the top with the other imports
    #   time.sleep(1) # this waits 1 second
    response = requests.get(url)
    if response.status_code != 200:
        # Don't worry about skipped urls. Just re-run this script
        # on spurious or network-related errors.
        print('Error accessing: ' + url + ' SKIPPING')
        continue

    # Parse the result. BeautifulSoup does a great job handling
    # mal-formed input.
    soup = BeautifulSoup(response.content, 'html.parser')
    description = soup.find('div', {'id': 'description'}).get_text()

    # And finally, update the database with another query.
    update = con.cursor()
    update.execute('UPDATE table_name SET description = %s '
                   'WHERE filename = %s',
                   (description, filename))

# MySQLdb does not autocommit by default, so persist the updates.
con.commit()
I'll warn that I've made a good effort to make that code "look right", but I haven't actually tested it. You'll need to fill in the private details.
PHP is good for scraping. I have made a class that wraps PHP's cURL extension here:
http://semlabs.co.uk/journal/object-oriented-curl-class-with-multi-threading
You'll probably need to use some of the options:
http://www.php.net/manual/en/function.curl-setopt.php
To scrape HTML, I usually use regular expressions, but here is a class I made that should be able to query HTML without issues:
http://pastebin.com/Jm9jKjAU
Usage is:
$h = new HTMLQuery();
$h->load( $string_containing_html );
$h->getElements( 'p', 'id' ); // Returns all p tags with an id attribute
The best option for scraping would be XPath, but it can't handle dirty HTML. You can use it to do things like:
//div[@class = 'itm']/p[last() and text() = 'Hello World'] <- selects the last p in div elements that have the innerHTML 'Hello World'
You can use XPath in PHP with the DOM classes (built-in), for example:
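A minimal sketch with the built-in DOM and DOMXPath classes (assuming the HTML is already in a string, as with the HTMLQuery example above, and the id "description" from the question):
<?php
libxml_use_internal_errors(true); // tolerate dirty HTML quietly

$doc = new DOMDocument();
$doc->loadHTML($string_containing_html);

$xpath = new DOMXPath($doc);
// Query the element holding the description
foreach ($xpath->query("//div[@id = 'description']") as $node) {
    echo trim($node->textContent), "\n";
}
?>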

Help Importing an Excel File into MySQL using phpMyAdmin

I am uploading Excel files (.xls) containing numeric values such as 884.557 and 731.0547 into a MySQL database using phpMyAdmin's built-in Import function. However, I am having horrible rounding/truncation issues: some values like 884.557 and 731.0547 are changed to 99.99999 or 9.99999, while other values like 127.0947 are imported correctly. Can anyone help? If possible, I would still like to use the built-in phpMyAdmin Import function because it is useful.
If you are familiar with HTML and PHP, you can use the SimpleXLSX library to build your own Excel-to-MySQL import script. It may take a few minutes to create, but once you create it you can use it for a lifetime.
First create an HTML form to upload the Excel sheet, then create a PHP script like the one below.
require 'simplexlsx.class.php';

// connect first (the old mysql_* functions are deprecated; mysqli is the replacement)
$db = new mysqli('localhost', 'user', 'password', 'database');

if (isset($_FILES['Filedata'])) {
    $file = $_FILES['Filedata']['tmp_name']; // the uploaded Excel file
    $xlsx = new SimpleXLSX($file);
    list($cols, $rows) = $xlsx->dimension();

    foreach ($xlsx->rows() as $k => $r) { // loop through the Excel worksheet
        $q  = "INSERT INTO TABLENAME(COL1, COL2) VALUES(";
        $q .= "'" . $db->real_escape_string($r[0]) . "', "; // Excel data
        $q .= "'" . $db->real_escape_string($r[1]) . "'";   // Excel data
        $q .= ")";
        $db->query($q);
    } // foreach ends here
} // if ends here
This is what I normally do:
1. Save the Excel file in CSV format.
2. Manually create the database table, specifying the data type for every column of interest.
3. Upload the CSV file into the selected table, ignoring the "column names" row, since the columns were already defined in step 2.
Decimals are truncated because phpMyAdmin uses some unexplained algorithm to determine the data type and the size allocated to a column. To prevent that, create the table yourself as described in step 2, as in the sketch below.
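For example, a hypothetical table for the values in the question could declare the numeric column explicitly (the table and column names and the precision here are assumptions):
CREATE TABLE measurements (
    id INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
    -- DECIMAL(9,4) stores up to 99999.9999 exactly, so values such as
    -- 884.557 and 731.0547 survive the import unchanged
    reading DECIMAL(9,4) NOT NULL
);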
Hope it helps!