Is there a way to send gulp.src() some data instead of a glob?

I want to be able to edit JSON and then send it through the gulp stream. I know there's gulp-json-edit, but I want to understand how it's done and do it myself. In this case, I want to change the Basic authorization.
For example, something like this:
var data = JSON.parse(fs.readFileSync('./core-config.json'));
data.local.ENDPOINT.CORE.BASIC = "Basic Stuff";

gulp.src(data)
    .pipe(somestuff)
    .pipe(gulp.dest('./'));
However, this of course doesn't work because data isn't a glob. How can I manipulate data in a way that lets me pass it to gulp.src()?

A while ago I wrote a module that can turn a regular object stream into a vinyl stream: vinylize. It's mostly useful for static site generation, but if I understand your question correctly it should be able to handle your use case as well.
Your example code using vinylize() would look like this:
var fs = require('fs');
var gulp = require('gulp');
var vinylize = require('vinylize');

var data = JSON.parse(fs.readFileSync('./core-config.json'));
data.local.ENDPOINT.CORE.BASIC = "Basic Stuff";

vinylize([data], {
    path: 'core-config.json',
    contents: JSON.stringify(data),
    ignoreSourceProps: true
})
    .pipe(somestuff)
    .pipe(gulp.dest('./'));
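If you'd rather not add another dependency, you can build the vinyl stream yourself on top of the vinyl and stream modules that gulp already uses under the hood. This is only a minimal sketch, assuming vinyl is installed; the plugin pipeline from the question goes where the commented-out pipe is:
var fs = require('fs');
var stream = require('stream');
var nodePath = require('path');
var gulp = require('gulp');
var Vinyl = require('vinyl'); // gulp's virtual file object

var data = JSON.parse(fs.readFileSync('./core-config.json'));
data.local.ENDPOINT.CORE.BASIC = "Basic Stuff";

// Create an object-mode stream and push a single in-memory file into it.
var src = new stream.Readable({ objectMode: true });
src._read = function () {};
src.push(new Vinyl({
    base: process.cwd(),
    path: nodePath.join(process.cwd(), 'core-config.json'),
    contents: Buffer.from(JSON.stringify(data, null, 2))
}));
src.push(null); // signal end of stream

src
    // .pipe(somestuff) // the rest of the question's pipeline goes here
    .pipe(gulp.dest('./'));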


Getting JSON from HTTP Discord.Js

I'm making a Discord bot, and I have a URL here which has some raw JSON: link here. I want one of the values (hashrateString) to be put inside an embed, like:
hashrateString: 1GH
Is there a way to do that, and if so, how?
I never tried this with an external link, but it should work the same way.
First, add these lines somewhere high up in your code:
var fs = require('fs');
var data = JSON.parse(fs.readFileSync('http://ric.pikapools.com/api/stats', 'utf8'));
After that you can basically do whatever you want with your new object. There was no hashrateString: 1GH, but hashrateString: 4.68 GH should be accessible as data.algos.primesr.hashrateString (output: 4.68 GH).
If it doesn't accept a URL (fs.readFileSync reads from the filesystem, not over HTTP), just copy and paste the JSON into a local file if possible, and use the path to that file instead.
I was able to get this to work by using node-fetch to assign the JSON from the URL to a constant:
const ricp = await fetch('http://ric.pikapools.com/api/stats').then(response => response.json());
and then read a value from that JSON with
message.channel.send(ricp.algos.primesr.hashrateString)
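For context, here's roughly how that fits into a message handler. This is only a sketch: the !hashrate command name, the message event name (it's messageCreate on newer discord.js versions), and the plain-text reply instead of an embed are my assumptions, not part of the original answer:
const fetch = require('node-fetch');
const Discord = require('discord.js');
const client = new Discord.Client();

client.on('message', async (message) => {
    if (message.content === '!hashrate') {
        const ricp = await fetch('http://ric.pikapools.com/api/stats')
            .then(response => response.json());
        message.channel.send(`hashrateString: ${ricp.algos.primesr.hashrateString}`);
    }
});

client.login('your-bot-token'); // placeholder token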

JSON in cy.request body

In our web app, we have options that can be changed using a POST HTTP request. There are a good number of options, and I will be writing a new test for each one, so I don't want to use the UI to change each option, seeing as there are 150 of them. So my idea was to set up a custom command that I could feed arguments into (the arguments being which option I want to update and the new value for that option).
I put the list of options in a fixture, so it is in a JSON object. I was able to get to the point where I can find the key I'm looking for and update its value from the fixture, but I am running into an issue where my cy.request won't actually send any data. I've tried updating the headers, updating the body, setting json: true. Nothing works. So I'm hoping someone here will have some advice.
//fixture.json
{
    "option1": "on",
    "option2": "off",
    "option3": "off"
}
//commands.js
Cypress.Commands.add('update_options', (option, newValue) => {
    cy.fixture('fixture.json').then((oldBody) => {
        let newBody = Objects.assign({}, oldBody); // copy old options list into new object
        function replace(option, newBody) {
            newBody[option] = newValue;
        }
        replace(option, newValue);
        cy.request({
            method: 'POST',
            url: 'myURLwithParams',
            form: true,
            json: true,
            body: newBody
        });
    });
});
//spec.js
cy.update_options("options1", "off");
I can get the new object with the updated code and everything, so that all works. The only thing I can't figure out is how to get it to actually POST. The JSON just doesn't compile correctly. I tried JSON.stringify(newBody) - no luck. I've tried every combination of everything I've mentioned and can't get it to work.
I tried with the below code (with some hard-coded data) and it works for me:
cy.fixture("fixture").then((oldBody) => {
cy.log(oldBody);
let newBody = oldBody
newBody['option1'] = 'DUMMY_DATA';
cy.log(newBody);
cy.request({
method: "POST",
url: "myURLwithParams",
form: true,
json: true,
body: newBody
});
});
Notable changes:
Directly assigned the old JSON object to the new object (instead of using Object.assign)
Put some logs in to track the changes
For reference, the screenshots (not reproduced here) showed the new JSON data after substitution and the XHR request sending the updated JSON.
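Folding those changes back into the original parameterized command would look roughly like this; it's a sketch built on the question's own placeholders (fixture.json, myURLwithParams) rather than a verified drop-in:
//commands.js
Cypress.Commands.add('update_options', (option, newValue) => {
    cy.fixture('fixture.json').then((oldBody) => {
        const newBody = oldBody;      // work on the fixture object directly
        newBody[option] = newValue;   // no helper function needed
        cy.log(newBody);
        cy.request({
            method: 'POST',
            url: 'myURLwithParams',
            form: true,
            json: true,
            body: newBody
        });
    });
});

//spec.js
cy.update_options('option1', 'off');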

Read local JSON files when dynamically creating functional tests in Intern

I am creating functional tests dynamically using Intern v4 and dojo 1.7. To accomplish this I am assigning registerSuite to a variable and attaching each test to the tests property of registerSuite:
var registerSuite = intern.getInterface('object').registerSuite;
var assert = intern.getPlugin('chai').assert;
// ...........a bunch more code .........
registerSuite.tests['test_name'] = function() {
    // READ JSON FILE HERE
    var JSON = 'filename.json';
    // ....... a bunch more code ........
};
That part is working great. The challenge I am having is that I need to read information from a different JSON file for each test I am dynamically creating. I cannot seem to find a way to read a JSON file while the dojo javascript is running (I want to call it in the registerSuite.tests function where it says // READ JSON FILE HERE). I have tried dojo's xhr.get, node's fs, intern's this.remote.get, nothing seems to work.
I can get a static JSON file with define(['dojo/text!./generated_tests.json']), but this does not help me because there are an unknown number of JSON files with unknown filenames, so I don't have the information I would need to reference them in the define block.
Please let me know if my description is unclear. Any help would be greatly appreciated!
Since you're creating functional tests, they'll always run in Node, so you have access to the Node environment. That means you could do something like:
var registerSuite = intern.getPlugin('interface.object').registerSuite;
var assert = intern.getPlugin('chai').assert;

var tests = {};

tests['test_name'] = function () {
    var data = require('./filename.json');
    // or require.nodeRequire('./filename.json')
    // or JSON.parse(require('fs').readFileSync('filename.json', { encoding: 'utf8' }))
};

registerSuite('my suite', tests);
Another thing to keep in mind is that assigning values to registerSuite.tests won't (or shouldn't) actually do anything. You'll need to call registerSuite, passing it your suite name and tests object, to actually register the tests.
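Since the question mentions an unknown number of JSON files with unknown filenames, here's a minimal sketch of generating one test per file. The tests/data directory, the test-name prefix, and the suite name are assumptions for illustration:
var registerSuite = intern.getPlugin('interface.object').registerSuite;
var assert = intern.getPlugin('chai').assert;
var fs = require('fs');
var path = require('path');

var dataDir = 'tests/data'; // assumed location of the generated JSON files
var tests = {};

fs.readdirSync(dataDir)
    .filter(function (name) { return /\.json$/.test(name); })
    .forEach(function (name) {
        var data = JSON.parse(fs.readFileSync(path.join(dataDir, name), 'utf8'));
        tests['test_' + path.basename(name, '.json')] = function () {
            // drive the functional test with `data` here
            assert.isObject(data);
        };
    });

registerSuite('generated suite', tests);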

AngularJS service PUT to flat JSON file

This is perhaps a stupid question, in which case I apologize.
I know you can use $http.get to read flat JSON files, but is there any way to use a flat JSON file in an Angular service to mimic a database for the other CRUD operations? This would be very basic and only used in development. I plan on using Django REST, Firebase, or something similar, but wanted to focus on the front end first.
You can use the $httpBackend service from ngMockE2E to mock a complete backend with GET, POST, PUT, and so on.
I've included the example from the Angular documentation for the sake of completeness:
myAppDev = angular.module('myAppDev', ['myApp', 'ngMockE2E']);
myAppDev.run(function($httpBackend) {
    var phones = [{name: 'phone1'}, {name: 'phone2'}];

    // returns the current list of phones
    $httpBackend.whenGET('/phones').respond(phones);

    // adds a new phone to the phones array
    $httpBackend.whenPOST('/phones').respond(function(method, url, data) {
        var phone = angular.fromJson(data);
        phones.push(phone);
        return [200, phone, {}];
    });

    $httpBackend.whenGET(/^\/templates\//).passThrough();
    //...
});
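Since the question is about the other CRUD operations too, PUT (and DELETE) can be mocked the same way. A rough sketch that would sit inside the same run block, after the phones array above; the URL pattern and the lookup-by-name are my assumptions:
// updates an existing phone, e.g. PUT /phones/phone1
$httpBackend.whenPUT(/^\/phones\/.+$/).respond(function(method, url, data) {
    var name = url.split('/').pop();
    var updated = angular.fromJson(data);
    for (var i = 0; i < phones.length; i++) {
        if (phones[i].name === name) {
            phones[i] = updated;
            return [200, updated, {}];
        }
    }
    return [404, undefined, {}];
});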

Backbone multiple collections fetch from a single big JSON file

I would like to know if there is a better way to create multiple collections that fetch from a single big JSON file. I have a JSON file that looks like this:
{
    "Languages": [...],
    "ProductTypes": [...],
    "Menus": [...],
    "Submenus": [...],
    "SampleOne": [...],
    "SampleTwo": [...],
    "SampleMore": [...]
}
I am using url/fetch to create each collection from the corresponding node of the JSON above:
var source = 'data/sample.json';
Languages.url = source;
Languages.fetch();
ProductTypes.url = source;
ProductTypes.fetch();
Menus.url = source;
Menus.fetch();
Submenus.url = source;
Submenus.fetch();
SampleOne.url = source;
SampleOne.fetch();
SampleTwo.url = source;
SampleTwo.fetch();
SampleMore.url = source;
SampleMore.fetch();
Any better solution for this?
Backbone is great when your application fits the mold it provides, but don't be afraid to go around it when it makes sense for your application; it's a very small library. Making repetitive, duplicate GET requests just to fit Backbone's mold is probably prohibitively inefficient. Check out jQuery.getJSON or your favorite basic AJAX library, paired with some basic metaprogramming, as follows:
//Put your real collection constructors here. Just examples.
var collections = {
    Languages: Backbone.Collection.extend(),
    ProductTypes: Backbone.Collection.extend(),
    Menus: Backbone.Collection.extend()
};

function fetch() {
    $.getJSON("/url/to/your/big.json", function (response) {
        for (var name in collections) {
            //Grab the list of raw json objects by name out of the response,
            //pass it to your collection's constructor,
            //and store a reference to your now-populated collection instance
            //in your collection lookup object
            collections[name] = new collections[name](response[name]);
        }
    });
}

fetch();
Once you've called fetch() and the async callback has completed, you can do things like collections.Menus.at(0) to get at the loaded model instances.
Your current approach, in addition to being pretty long, risks retrieving the large file multiple times (browser caching won't always work here, especially if the first request hasn't completed by the time you make the next one).
I think the easiest option here is to go with straight jQuery, rather than Backbone, then use .reset() on your collections:
$.get('data/sample.json', function(data) {
    Languages.reset(data['Languages']);
    ProductTypes.reset(data['ProductTypes']);
    // etc
});
If you wanted to cut down on the redundant code, you can put your collections into a namespace like app and then do something like this (though it might be a bit too clever to be legible):
app.Languages = new LanguageCollection();
// etc

$.get('data/sample.json', function(data) {
    _(['Languages', 'ProductTypes', ... ]).each(function(collection) {
        app[collection].reset(data[collection]);
    });
});
I think you can solve this and still stay within the Backbone paradigm. An elegant solution, to my mind, is to create a Model that fetches the big JSON and uses it to populate all the Collections in its change event:
var App = Backbone.Model.extend({
    url: "http://myserver.com/data/sample.json",

    initialize: function( opts ){
        this.languages = new Languages();
        this.productTypes = new ProductTypes();
        // ...
        this.on( "change", this.fetchCollections, this );
    },

    fetchCollections: function(){
        this.languages.reset( this.get( "Languages" ) );
        this.productTypes.reset( this.get( "ProductTypes" ) );
        // ...
    }
});

var myApp = new App();
myApp.fetch();
You have access to all your collections through:
myApp.languages
myApp.productTypes
...
You can easily do this with a parse method. Set up a model and create an attribute for each collection. There's nothing saying your model attribute has to be a single piece of data and can't be a collection.
When you run your fetch, the entire response is handed to a parse method, which you can override by defining a parse function in your model. Something like:
parse: function(response) {
    var myResponse = {};
    _.each(response, function(value, key) {
        myResponse[key] = new Backbone.Collection(value);
    });
    return myResponse;
}
You could also create the new collections at the global level or in some other namespace if you'd rather not have them contained in a model, but that's up to you.
To get them from the model later you'd just have to do something like:
model.get('Languages');
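A short sketch of how that might be wired up end to end; the AppData name and the .done() callback are just for illustration:
var AppData = Backbone.Model.extend({
    url: 'data/sample.json',
    parse: function (response) {
        var parsed = {};
        _.each(response, function (value, key) {
            parsed[key] = new Backbone.Collection(value);
        });
        return parsed;
    }
});

var appData = new AppData();
appData.fetch().done(function () {
    var languages = appData.get('Languages'); // a Backbone.Collection
    console.log(languages.length);
});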
backbone-relational provides a solution within Backbone (without using jQuery.getJSON), which might make sense if you're already using it. There's a short answer at https://stackoverflow.com/a/11095675/70987, which I'd be happy to elaborate on if needed.