JSON formatting string to number

My JSON output generates:
[
{
"a1_id":"7847TK10",
"output2":"7847TK10",
"output4":"something",
"output5":"3stars.gif",
"output9": "269000",
...
etc. etc.
The Google Visualization API expects a number for the output9 element, e.g.
"output9": 269000 instead of "output9": "269000". How can I achieve this for this element?
My json.php generates the JSON output like this:
?>
{
"total": <?php echo $total ?>,
"success": true,
"rows": [
// Iterate over the rows
$nextRow= $result->nextRow();
$r = 1;
$info = array();
while ( $nextRow ) {
$nextColumn = $result->nextColumn();
// Has this column been printed already
if ( $unique )
{
$d = $result->getDataForField($unique);
if ( array_key_exists($d, $already) )
{
$nextRow= $result->nextRow();
continue;
}
$already[$d] = true;
}
echo '{';
// Iterate over the columns in each row
while ( $nextColumn )
{
// Get the variable
$variable = $result->getOutputVariable();
$name = $variable->getName(true);
$data = $result->getDataForField();
if ( !isset($info[$name]) ) {
$info[$name]['translate'] = $variable->shouldTranslate();
$info[$name]['type'] = $variable->getDataType();
$info[$name]['linkable'] = $variable->isLinkable();
}
// Translate the data if requested
if ( $info[$name]['translate'] ) {
$data = LQMTemplate::_($data);
}
$data = $variable->format($data, false);
$type = $info[$name]['type'];
if ( ($type == 'bool') or ($type == 'boolean') )
{
$data = $data ? '1' : '0';
echo "'$name':$data";
} elseif ( $encode ) {
// Can we use json_encode ?
// str_replace because some versions of PHP have a bug that will over escape forward slashes
echo "\"$name\":".str_replace('\\/', '/', json_encode($data));
} else {
$data = LQMUtility::jsonEscape($data, '"');
//echo "'$name':\"$data\"";
echo "\"$name\":\"$data\"";
}
// Conditionally print the next column
$nextColumn = $result->nextColumn();
if ( $nextColumn ) echo ",\n ";
}
// Conditionally print the next row
$nextRow = $result->nextRow();
echo $nextRow ? "},\n" : "}\n";
$r++;
}
unset($result);
echo ']}';
}
}

This depends on how you are generating your JSON.
For example, if you were using a Ruby backend, you could call:
"output9" => output9.to_i
There are various helper methods in different languages (e.g. Java has Integer.parseInt() and JavaScript has parseInt()) to turn a string into an integer.
Edit:
If your JSON is being generated by PHP, cast the string to an integer:
$json['output9'] = (int) $output9_value;
That should get rid of the quotation marks.
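If the JSON is built with echo statements, as in the generator above, there is no array to cast, so the cast has to happen where the value is printed. Here is a minimal sketch, assuming a hypothetical whitelist $numericFields is enough to identify the numeric columns:
// Hypothetical whitelist of columns that should be emitted as bare numbers.
$numericFields = array('output9');
// ...inside the column loop, in place of the quoted echo:
if (in_array($name, $numericFields, true) && is_numeric($data)) {
    // Numeric coercion drops the quotes: "output9": 269000
    echo '"' . $name . '":' . (0 + $data);
} else {
    echo '"' . $name . '":' . json_encode((string) $data);
}
Alternatively, building each row as a PHP array and calling json_encode() on it once would avoid hand-assembling the JSON string entirely.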

Related

convert csv file to json object with Laravel

I am doing a project with Laravel 7. I have to read a CSV file and, through a controller, pass the data in JSON format to a view.
Unfortunately I don't know how to do that.
These are my controller methods:
public function index($source)
{
$source = strtolower($source);
switch ($source) {
case "csv":
$file_csv = base_path('transactions.csv');
$transactions = $this->csvToJson($file_csv);
dd(gettype($transactions));
return view('transactions', ['source' => $source, 'transactions' => $transactions]);
break;
case "db":
$transactions = Transaction::all();
dd(gettype($transactions));
return view('transactions', ['source' => $source, 'transactions' => $transactions]);
break;
default:
abort(400, 'Bad syntax error.');
}
}
function csvToJson($filename = '', $delimiter = ',')
{
if (!file_exists($filename) || !is_readable($filename)) {
return false;
}
$header = null;
$data = array();
if (($handle = fopen($filename, 'r')) !== false)
{
while (($row = fgetcsv($handle, 1000, $delimiter)) !== false)
{
if (!$header)
$header = $row;
else
$data[] = array_combine($header, $row);
}
fclose($handle);
}
return $data;
}
As you can see, under the two cases I put a dd() with a gettype() call inside. In the first case I incorrectly get the response as an array; in the second one I correctly get it as an object.
The converted CSV file should have this format:
[{"id":1,"code":"T_218_ljydmgebx","amount":"8617.19","user_id":375,"created_at":"2020-01-19T16:08:59.000000Z","updated_at":"2020-01-19T16:08:59.000000Z"},
{"id":2,"code":"T_335_wmhrbjxld","amount":"6502.72","user_id":1847,"created_at":"2020-01-19T16:08:59.000000Z","updated_at":"2020-01-19T16:08:59.000000Z"}]
Do you know how to convert the $transactions array into a JSON object in the first case?
I don't know if there is a built-in solution in Laravel, but it can be done in plain PHP.
I didn't test my code, so it might not work or may contain some typos, but I'm sure it will give you some direction.
$cols = ["id","code","amount","user_id","created_at","updated_at"];
$csv = file('folder/name.csv');
$output = [];
foreach ($csv as $line_index => $line) {
if ($line_index > 0) { // I assume the first line contains the column names.
$newLine = [];
$values = explode(',', $line);
foreach ($values as $col_index => $value) {
$newLine[$cols[$col_index]] = $value;
}
$output[] = $newLine;
}
}
$json_output = json_encode($output);
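If the view expects objects (as Eloquent returns) rather than associative arrays, one option, sketched here without testing against the original controller, is to round-trip the result through JSON, or to return a JSON response directly:
// Convert the nested associative arrays into stdClass objects for the view
$transactions = json_decode(json_encode($output));
// ...or skip the view and return the data as a JSON response
return response()->json($output);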
You can just do this:
$csv = file_get_contents($file);
$array = array_map('str_getcsv', explode(PHP_EOL, $csv));
$json = json_encode($array);
If you want to return the JSON object:
return $json;
If you want to create a .json file:
// Pure PHP
$file = fopen('results.json', 'w');
fwrite($file, $json);
fclose($file);
// Laravel using Storage
Storage::disk('local')->put('public/result.json', $json);
I hope this helps someone.
I have an email marketing application for which I built bulk importers, so here is their CSV-to-JSON code.
function convert_csv_to_json($csv_data){
// CSV to JSON process 1
$context = array(
'http'=>array(
'follow_location' => false,
'max_redirects' => 1000000
)
);
$context = stream_context_create($context);
if (($handle = fopen($csv_data, "r", false, $context)) !== FALSE) {
$csvs = [];
while(! feof($handle)) {
$csvs[] = fgetcsv($handle);
}
$datas = [];
$column_names = [];
foreach ($csvs[0] as $single_csv) {
$column_names[] = $single_csv;
}
foreach ($csvs as $key => $csv) {
if ($key === 0) {
continue;
}
foreach ($column_names as $column_key => $column_name) {
$datas[$key-1][$column_name] = $csv[$column_key];
}
}
return $json = json_encode($datas);
}
// OR
// CSV to JSON process 2
$cols = ['id',
'owner_id',
'name',
'email',
'country_code',
'phone',
'favourites',
'blocked',
'trashed',
'is_subscribed',
'deleted_at',
'created_at',
'updated_at'];
$csv = file($csv_data);
$output = [];
foreach ($csv as $line_index => $line) {
if ($line_index > 0) { // I assume the first line contains the column names.
$newLine = [];
$values = explode(',', $line);
foreach ($values as $col_index => $value) {
$newLine[$cols[$col_index]] = $value;
}
$output[] = $newLine;
}
}
return $json_output = json_encode($output);
}
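A quick usage sketch (assuming $csv_data is a readable file path or stream URL, as the function above expects; the file name here is hypothetical):
$json = convert_csv_to_json('contacts.csv');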

How to retrieve composite column from Cassandra table in PHP

I have a CassandraHandler class that runs queries and returns the rows:
class CassandraHandler
{
private $keyspace = 'blabla'; //default is oyvent
private $cluster = NULL;
private $session = NULL;
function __construct(){
$this->cluster = \Cassandra::cluster()
->build(); // connects to localhost by default
$this->session = $this->cluster->connect($this->keyspace);
}
/**
* @return Rows
*/
public function execute($query){
$statement = new \Cassandra\SimpleStatement($query);
$result = $this->session->execute($statement);
return $result;
}
}
It works fine for normal columns, but I can't get my photos column in PHP.
I created the column like this
photos frozen<set<map<text,text>>>
My JSON example:
{{"urllarge": "1.jpg", "urlmedium": "2.jpg"},
{"urllarge": "3.jpg", "urlmedium": "4.jpg"}}
And how can I use PHP to retrieve this composite column?
$cassandraHandler = new CassandraHandler();
$rows = $cassandraHandler->fetchLatestPosts($placeids, $limit);
foreach ($rows as $row) {
$tmp = array();
$tmp["userid"] = doubleval($row["userid"]);
$tmp["fullname"] = $row["fullname"];
$tmp["photos"] = $row["photos"] //????????
}
I know there is documentation for the PHP driver at https://github.com/datastax/php-driver, but I am a little confused. I just need to get the JSON value like I get it in cqlsh.
You have two options to convert the composites into usable JSON:
Create a function to convert the deserialized/unmarshalled objects into JSON.
Retrieve the values from Cassandra as JSON.
Here is an example that demonstrates both options:
<?php
$KEYSPACE_NAME = "stackoverflow";
$TABLE_NAME = "retrieve_composites";
function print_rows_as_json($rows) {
foreach ($rows as $row) {
$set_count = 0;
echo "{\"photos\": [";
foreach ($photos = $row["photos"] as $photo) {
$map_count = 0;
echo "{";
foreach ($photo as $key => $value) {
echo "\"{$key}\": \"{$value}\"";
if (++$map_count < count($photo)) {
echo ", ";
}
}
echo "}";
if (++$set_count < count($photos)) {
echo ", ";
}
}
echo "]}" . PHP_EOL;
}
}
// Override default localhost contact point
$contact_points = "127.0.0.1";
if (php_sapi_name() == "cli") {
if (count($_SERVER['argv']) > 1) {
$contact_points = $_SERVER['argv'][1];
}
}
// Connect to the cluster
$cluster = Cassandra::cluster()
->withContactPoints($contact_points)
->build();
$session = $cluster->connect();
// Create the keyspace (drop if exists) and table
$session->execute("DROP KEYSPACE IF EXISTS {$KEYSPACE_NAME}");
$session->execute("CREATE KEYSPACE {$KEYSPACE_NAME} WITH replication = "
. "{ 'class': 'SimpleStrategy', 'replication_factor': 1 }"
);
$session->execute("CREATE TABLE ${KEYSPACE_NAME}.{$TABLE_NAME} ( "
. "id int PRIMARY KEY, "
. "photos frozen<set<map<text, text>>> )"
);
// Create multiple rows to retrieve
$session->execute("INSERT INTO ${KEYSPACE_NAME}.{$TABLE_NAME} (id, photos) VALUES ( "
. "1, "
. "{{'urllage': '1.jpg', 'urlmedium': '2.jpg'}, "
. "{'urllage': '3.jpg', 'urlmedium': '4.jpg'}}"
. ")");
$session->execute("INSERT INTO ${KEYSPACE_NAME}.{$TABLE_NAME} (id, photos) VALUES ( "
. "2, "
. "{{'urllage': '21.jpg', 'urlmedium': '22.jpg'}, "
. "{'urllage': '23.jpg', 'urlmedium': '24.jpg'}}"
. ")");
// Select and print the unmarshalled data as JSON
$rows = $session->execute("SELECT photos FROM ${KEYSPACE_NAME}.{$TABLE_NAME}");
print_rows_as_json($rows);
// Select the data as JSON and print the string
$rows = $session->execute("SELECT JSON photos FROM ${KEYSPACE_NAME}.{$TABLE_NAME}");
foreach ($rows as $row) {
echo $row["[json]"] . PHP_EOL;
}
From the above example you can see that selecting the data as JSON involves less code for your application while also moving the processing onto the server. This is probably the preferred choice for your application's needs.
NOTE: This example uses v1.3.0 of the DataStax PHP driver, which added support for passing query strings directly to Session::execute() and Session::executeAsync(). If you are using an earlier version you will need to convert all query strings to Cassandra\Statement objects before passing them to $session->execute(...).
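For example, on a pre-1.3.0 driver the final SELECT above would be wrapped in a statement first (same query, just built as a Cassandra\SimpleStatement, as the question's handler already does):
$statement = new Cassandra\SimpleStatement(
    "SELECT JSON photos FROM {$KEYSPACE_NAME}.{$TABLE_NAME}");
$rows = $session->execute($statement);
foreach ($rows as $row) {
    echo $row["[json]"] . PHP_EOL;
}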

.JSON file Retrieving Data - Look for a value and get related objects

<?php function getCurrencyFor($arr, $findCountry) {
foreach($arr as $country) {
if ($country->name->common == $findCountry) {
$currency = $country->currency[0];
$capital = $country->capital;
$region = $country->region;
break;
}
}
return $country();
}
$json = file_get_contents("https://raw.githubusercontent.com/mledoze/countries/master/countries.json");
$arr = json_decode($json);
// Call our function to extract the currency for Angola:
$currency = getCurrencyFor($arr, "Aruba");
echo $country('$capital');
echo $country('$currency');
echo $country('$region');
?>
I followed this post - https://stackoverflow.com/a/38906191/3939981
If I rewrite the code to echo the values inside the function, it works:
<?php function getCurrencyFor($arr, $findCountry) {
foreach($arr as $country) {
if ($country->name->common == $findCountry) {
$currency = $country->currency[0];
$capital = $country->capital;
$region = $country->region;
echo $capital;
echo $currency;
echo $region;
break;
}
}
return $currency;
}
$json = file_get_contents("https://raw.githubusercontent.com/mledoze/countries/master/countries.json");
$arr = json_decode($json);
// Call our function to extract the currency for Angola:
$currency = getCurrencyFor($arr, "Aruba");
?>
Maybe some parts of the code do not work. Any comments and thoughts?
You could use this code. Note that if you want a function to return three values, you should create an array with those values and return that array. I also renamed the function, since it returns more than just currency information:
function getCountryInfo($arr, $findCountry) {
foreach($arr as $country) {
if ($country->name->common == $findCountry) {
return array(
"currency" => $country->currency[0],
"capital" => $country->capital,
"region" => $country->region
);
}
}
}
$json = file_get_contents("https://raw.githubusercontent.com/mledoze/countries/master/countries.json");
$arr = json_decode($json);
// Call our function to extract the currency for Angola:
$country = getCountryInfo($arr, "Aruba");
echo $country['capital'] . "<br>";
echo $country['currency'] . "<br>";
echo $country['region'] . "<br>";

Find how many times every word is repeated in db

I am using Drupal to manage my content. I want to search all of my content titles and bodies and find how many times each word is repeated across all content.
It may be possible with an SQL query, but I have no experience with SQL.
Any ideas?
This code searches the body field and ALL fields of ANY content type for a specific string. You can run it from the command line. Say you save it as "fieldsearch.php"; you can then run it as:
php fieldsearch.php "myStringForWhichToSearch"
You need to fill in your connection data and database name. It outputs the array of matching nodes, but you can format that output into anything you'd like (I recommend CSV).
<?php
//Set Parameters here
$env = "dev"; // Options [dev|prod] - Defaults to dev
$prodConnection = array(
"host" => "",
"user" => "",
"pass" => ""
);
$devConnection = array(
"host" => "",
"user" => "",
"pass" => ""
);
//Use the selected env settings
if($env == "prod"){
$connection = $prodConnection;
} else {
$connection = $devConnection;
}
function getTables($con, $database){
//Get the set of field tables
$sql = "SHOW TABLES FROM $database";
$result = mysqli_query($con, $sql);
if (!$result) {
echo "DB Error, could not list tables\n";
echo 'MySQL Error: ' . mysqli_error($con);
exit;
}
$tables = array();
while ($row = mysqli_fetch_row($result)) {
$tables[] = $row[0];
}
mysqli_free_result($result);
return $tables;
}
function getFieldTables($con,$database){
$allTables = getTables($con, $database);
$fieldTables = array();
foreach($allTables as $key => $table){
if( isFieldTable($table) ){
$fieldTables[] = $table;
}
}
//add the common tables
$fieldTables[] = "field_data_body";
$fieldTables[] = "field_data_comment_body";
return $fieldTables;
}
function isFieldTable($table){
//echo $table . "\n";
if( stripos($table, "field_data_field") !== FALSE){
return TRUE;
}
}
//Set the search term here:
if (array_key_exists(1, $argv)) {
$searchString = $argv[1];
}
else {
die('usage: php fieldsearch.php "search string"' . "\n");
}
$databaseName = "myDatabaseName";
$outArray = array();
//Connect
$con = mysqli_connect($connection['host'], $connection['user'], $connection['pass'], $databaseName);
// Check connection
if (mysqli_connect_errno()) {
echo "Failed to connect to MySQL: " . mysqli_connect_error();
}
//getFieldTables
$fieldTables = getFieldTables($con, $databaseName);
//Query each field tables data for the string in question
foreach($fieldTables as $key => $table){
//get Field value column name
$valueCol = str_replace("field_data_field_", '', $table);
$result = mysqli_query($con,"SELECT
entity_id
FROM
$table
WHERE
field_" . $valueCol . "_value
LIKE
'%$searchString%';");
if($result){
while($row = mysqli_fetch_assoc($result)){
$dataArray[$table][$row['entity_id']]['nid'] = $row['entity_id'];
}
}
}
//Add the body table
$result = mysqli_query($con,"SELECT
entity_id
FROM
field_data_body
WHERE
body_value
LIKE
'%$searchString%';");
if($result){
while($row = mysqli_fetch_assoc($result)){
$dataArray['field_data_body'][$row['entity_id']]['nid'] = $row['entity_id'];
}
}
var_dump($dataArray);
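The script above only reports which nodes contain one specific string. To answer the original question (how many times every word is repeated), a rough sketch along the same lines, reusing the same $con connection and the field_data_body table (not tested, and title fields would need the same treatment), could tally the words in PHP:
$wordCounts = array();
$result = mysqli_query($con, "SELECT body_value FROM field_data_body");
if ($result) {
    while ($row = mysqli_fetch_assoc($result)) {
        // Strip markup, lowercase, then split the body into words
        $words = str_word_count(strtolower(strip_tags($row['body_value'])), 1);
        foreach ($words as $word) {
            if (!isset($wordCounts[$word])) {
                $wordCounts[$word] = 0;
            }
            $wordCounts[$word]++;
        }
    }
}
arsort($wordCounts); // most frequent words first
var_dump($wordCounts);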

File path into JSON data structure

I'm doing a disk space report that uses File::Find to collect cumulative sizing in a directory tree.
What I get (easily) from File::Find is the directory name.
e.g.:
/path/to/user/username/subdir/anothersubdir/etc
I'm running File::Find to collect sizes beneath:
/path/to/user/username
And build a cumulative size report of the directory and each of the subdirectories.
What I've currently got is:
while ( $dir_tree ) {
$results{$dir_tree} += $blocks * $block_size;
my @path_arr = split( "/", $dir_tree );
pop(@path_arr);
$dir_tree = join( "/", @path_arr );
}
(And yes, I know that's not very nice.)
The purpose of doing this is that when I stat each file, I add its size to the current node and each parent node in the tree.
This is sufficient to generate:
username,300M
username/documents,150M
username/documents/excel,50M
username/documents/word,40M
username/work,70M
username/fish,50M
username/some_other_stuff,30M
But I'd like to now turn that in to JSON more like this:
{
"name" : "username",
"size" : "307200",
"children" : [
{
"name" : "documents",
"size" : "153750",
"children" : [
{
"name" : "excel",
"size" : "51200"
},
{
"name" : "word",
"size" : "81920"
}
]
}
]
}
That's because I'm intending to do a D3 visualisation of this structure - loosely based on D3 Zoomable Circle Pack
So my question is this: what is the neatest way to collate my data such that I have cumulative (and ideally non-cumulative) sizing information, while populating the hash hierarchically?
I was thinking in terms of a 'cursor' approach (and using File::Spec this time):
use File::Spec;
my $data;
my $cursor = \$data;
foreach my $element ( File::Spec -> splitdir ( $File::Find::dir ) ) {
$cursor -> {size} += $blocks * $block_size;
$cursor = $cursor -> {$element}
}
Although... that's not quite creating the data structure I'm looking for, not least because we basically have to search by hash key to do the 'rolling up' part of the process.
Is there a better way of accomplishing this?
Edit - more complete example of what I have already:
#!/usr/bin/env perl
use strict;
use warnings;
use File::Find;
use Data::Dumper;
my $block_size = 1024;
sub collate_sizes {
my ( $results_ref, $starting_path ) = @_;
$starting_path =~ s,/\w+$,/,;
if ( -f $File::Find::name ) {
print "$File::Find::name isafile\n";
my ($dev, $ino, $mode, $nlink, $uid,
$gid, $rdev, $size, $atime, $mtime,
$ctime, $blksize, $blocks
) = stat($File::Find::name);
my $dir_tree = $File::Find::dir;
$dir_tree =~ s|^$starting_path||g;
while ($dir_tree) {
print "Updating $dir_tree\n";
$$results_ref{$dir_tree} += $blocks * $block_size;
my @path_arr = split( "/", $dir_tree );
pop(@path_arr);
$dir_tree = join( "/", @path_arr );
}
}
}
my @users = qw( user1 user2 );
foreach my $user (@users) {
my $path = "/home/$user";
print $path;
my %results;
File::Find::find(
{ wanted => sub { collate_sizes( \%results, $path ) },
no_chdir => 1
},
$path
);
print Dumper \%results;
#would print this to a file in the homedir - to STDOUT for convenience
foreach my $key ( sort { $results{$b} <=> $results{$a} } keys %results ) {
print "$key => $results{$key}\n";
}
}
And yes - I know this isn't portable, and does a few somewhat nasty things. Part of what I'm doing here is trying to improve on that. (But currently it's a Unix based homedir structure, so that's fine).
If you do your own dir scanning instead of using File::Find, you naturally get the right structure.
sub _scan {
my ($qfn, $fn) = @_;
my $node = { name => $fn };
lstat($qfn)
or die $!;
my $size = -s _;
my $is_dir = -d _;
if ($is_dir) {
my @child_fns = do {
opendir(my $dh, $qfn)
or die $!;
grep !/^\.\.?\z/, readdir($dh);
};
my @children;
for my $child_fn (@child_fns) {
my $child_node = _scan("$qfn/$child_fn", $child_fn);
$size += $child_node->{size};
push @children, $child_node;
}
$node->{children} = \@children;
}
$node->{size} = $size;
return $node;
}
Rest of the code:
#!/usr/bin/perl
use strict;
use warnings;
no warnings 'recursion';
use File::Basename qw( basename );
use JSON qw( encode_json );
...
sub scan { _scan($_[0], basename($_[0])) }
print(encode_json(scan($ARGV[0] // '.')));
In the end, I have done it like this:
In the File::Find wanted sub collate_sizes:
my $cursor = $data;
foreach my $element (
File::Spec->splitdir( $File::Find::dir =~ s/^$starting_path//r ) )
{
$cursor->{$element}->{name} = $element;
$cursor->{$element}->{size} += $blocks * $block_size;
$cursor = $cursor->{$element}->{children} //= {};
}
This generates a hash of nested directory names. (The name subelement is probably redundant, but whatever.)
And then post process it with (using JSON):
my $json_structure = {
'name' => $user,
'size' => $data->{$user}->{size},
'children' => [],
};
process_data_to_json( $json_structure, $data->{$user}->{children} );
open( my $json_out, '>', "homedir.json" ) or die $!;
print {$json_out} to_json( $json_structure, { pretty => 1 } );
close($json_out);
sub process_data_to_json {
my ( $json_cursor, $data_cursor ) = @_;
if ( ref $data_cursor eq "HASH" ) {
foreach my $key ( keys %$data_cursor ) {
print "Traversing $key\n";
my $newelt = {
'name' => $key,
'size' => $data_cursor->{$key}->{size},
};
push( @{ $json_cursor->{children} }, $newelt );
process_data_to_json( $newelt, $data_cursor->{$key}->{children} );
}
}
}