I'm using Angular 6, where we can create multiple applications and libraries alongside the default app module. I want to define an application-wide variable, such as a URL prefix or a cache expiry time, that can be used in all libraries and applications. But when I declare a variable in environment.ts, I can only reference it from that project's root directory, i.e. the src folder. The other projects where I want to use this variable are not created under that root directory, so at runtime it throws an error saying it cannot access variables from folders not declared in the project's root directory.
Can you please suggest something that would let me access an application variable across all applications and libraries.
I finally found a solution.
I created a folder named Config at the root level and added a file environment.ts to it.
I copied the environment variables from src/environments/environment.ts into Config/environment.ts, but with different values.
E.g. in src/environments/environment.ts I added a variable setTimeout: 3000:
export const environment = {
  production: false,
  setTimeout: 3000
};
and in Config/environment.ts I set setTimeout: 13000:
export const environment = {
  production: false,
  setTimeout: 13000
};
In app.component.ts I used this setTimeout variable:
// at the top of app.component.ts
import { environment } from '../environments/environment';

// inside the AppComponent class
constructor() {
  console.log('timeout is ' + environment.setTimeout);
}
I ran it in the dev environment and it logged 3000.
Now I replaced the following block in angular.json:
"configurations": {
"production": {
"fileReplacements": [
{
"replace": "src/environments/environment.ts",
"with": "src/environments/environment.prod.ts"
}
]
with
"configurations": {
"production": {
"fileReplacements": [
{
"replace": "src/environments/environment.ts",
"with": "Config/environment.ts"
}
]
Then I ran ng build MyAppName --prod.
I went to the dist/myapp folder and ran lite-server to deploy and run the app locally.
My setTimeout value from Config/environment.ts replaced the one from src/environments/environment.ts that had been used during development in app.component.ts. It logged 13000.
So in this way I can create a Config-like folder and use it in any library or app.
I'm consuming a .json file on my localhost and everything is working fine. However, now I would like to deploy and localhost will no longer work.
This is part of the code:
// BurgerForm.vue
methods: {
async getIngredients() {
const req = await fetch("http://localhost:3000/ingredients");
const data = await req.json();
this.breads = data.breads;
this.meats = data.meats;
},
…
}
To run it on localhost I start the following backend script from my package.json:
"scripts": {
"serve": "vue-cli-service serve",
"build": "vue-cli-service build",
"backend": "json-server --watch db/db.json"
},
The structure of the files I mentioned is like this:
/db/db.json
/src/components/BurgerForm.vue
So how do I keep querying and getting data from the JSON after I deploy to a server?
You may find Webpack Dev Server useful.
Assuming you don't mind making your API available under a path on the main server, add the following to your vue.config.js:
devServer: {
//...other options may be necessary
proxy: {
"/path/to/api/*": {
target: "http://localhost:3000",
secure: false,
changeOrigin: true
}
}
}
Once you start your front-end app using npm run serve, any request to /path/to/api/ingredients aimed at the same server, i.e. fetch("/path/to/api/ingredients"), will be proxied by the dev server to http://localhost:3000/ingredients automatically.
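For illustration, the getIngredients method from the question could then drop the hard-coded host; a minimal sketch, assuming /path/to/api is whatever prefix you chose in the proxy config:
// BurgerForm.vue (sketch)
methods: {
  async getIngredients() {
    // relative URL: the dev server proxies /path/to/api/* to http://localhost:3000/*
    const req = await fetch("/path/to/api/ingredients");
    const data = await req.json();
    this.breads = data.breads;
    this.meats = data.meats;
  },
}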
I have an existing Angular v5 app and have environment.json files for my environments (like DEV, Test, Production, etc.). The environments files are stored in the directory like so: src/Environments/DEV/environment.json.
Here is an example of a dev environment.json file:
{
"Comment": "Environment=DEV",
"API_ORIGIN": "https://myapp-dev",
"ORIGIN": "https://myapp-dev/index.html",
}
There is a root environment.json file in src folder that my app reads from. When I want to use a specific environment I just copy that environment content into the root and run the app.
Now, with Cucumber and Protractor, is there a way I can pass some command line argument to specify which environment.json file to use, based on my setup? I have URLs in these environment.json files, so I need a way to tell Cucumber and Protractor which environment to use. If I have to copy all of the environment.json files into the e2e folder, that is fine with me. Just in case the solution depends on the tools I am using, here is my tsconfig.e2e.json file. Please let me know if it is incorrect:
{
"extends": "../tsconfig.json",
"compilerOptions": {
"outDir": "../out-tsc/e2e",
"baseUrl": "./",
"module": "commonjs",
"target": "es5",
"types": [
"chai",
"cucumber",
"node"
]
}
}
Here is the protractor.conf.js file. Let me know if it is incorrect as well please:
// Protractor configuration file, see link for more information
// https://github.com/angular/protractor/blob/master/lib/config.ts
exports.config = {
allScriptsTimeout: 11000,
specs: [
'./e2e/features/**/*.feature'
],
capabilities: {
'browserName': 'chrome'
},
directConnect: true,
baseUrl: 'http://localhost:4200/',
framework: 'custom',
frameworkPath: require.resolve('protractor-cucumber-framework'),
cucumberOpts: {
// require step definition files before executing features
require: ['./e2e/steps/**/*.ts'],
// <string[]> (expression) only execute the features or scenarios with tags matching the expression
tags: [],
// <string[]> ("extension:module") require files with the given EXTENSION after requiring MODULE (repeatable)
compiler: []
},
// Enable TypeScript for the tests
onPrepare() {
require('ts-node').register({
project: 'e2e/tsconfig.e2e.json'
});
}
};
I'm also using npm, if that matters. I'm running these tests with the ng e2e command provided by Angular.
Sure, there are 2 ways:
1. Pass a parameter to Protractor, protractor conf.js --params.env="dev", and then refer to it as browser.params.env in your specs (see the sketch after this list). The downside is that it only becomes available once the config has been parsed and the browser has started, so you can't really use it in the config itself.
2. Run the process with an environment variable, MY_VAR=Dev protractor conf.js, and it will be available anywhere via process.env.MY_VAR.
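A minimal sketch of the first option inside a spec (browser.params.env holds whatever you pass on the command line):
// some.spec.ts (sketch for option 1)
import { browser } from 'protractor';

// populated by: protractor conf.js --params.env="dev"
const env = browser.params.env;
console.log('running against environment: ' + env);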
For reference
https://stackoverflow.com/a/58547994/9150146
https://stackoverflow.com/a/66111592/9150146
P.S.
How you implement it is up to you, but this approach is the most flexible:
conf.js
// pick the environment file based on the TEST_ENV variable
let environment = require('./src/Environments/' + process.env.TEST_ENV + '/environment.json');

exports.config = {
  baseUrl: environment.API_ORIGIN
};
and start your Protractor run like so:
TEST_ENV=DEV protractor conf.js
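Since the environment file is plain JSON, the same pattern works from a step definition as well; a sketch, assuming the usual layout where the e2e folder sits next to src (the file name is hypothetical):
// e2e/steps/env.ts (sketch)
// Re-reads the environment file selected via TEST_ENV so step definitions can use its URLs.
const env = process.env.TEST_ENV || 'DEV';
export const environment = require(`../../src/Environments/${env}/environment.json`);

// later, inside any step definition:
// await browser.get(environment.ORIGIN);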
The typescript compiler works fine when I import a json file using
const tasks = require('./tasks.json')
However, when I run tsc, the output directory does not contain the tasks.json file, causing a runtime error.
Is there a way to tell the compiler that it should copy all JSON files, or should I manually copy/paste all my JSON files into the dist directory?
My tsconfig compilerOptions currently reads:
"compilerOptions": {
"target": "es6",
"module": "commonjs",
"sourceMap": true,
"noImplicitAny": true,
"removeComments": false,
"outDir": "./dist/",
"sourceMap": true,
"pretty": true,
"noImplicitThis": true,
"strictNullChecks": true,
"sourceMap": true
},
Thanks!
Problem
For people wanting to copy all JSON files, it's really difficult in TypeScript. Even with "resolveJsonModule": true, tsc will only copy .json files which are directly referenced by an import.
Here is some example code that wants to do a dynamic runtime require(). This can only work if all the JSON files have been copied into the dist/ folder, which tsc refuses to do.
import * as fs from 'fs';

// Works
import * as config from './config.default.json';

const env = process.env.NODE_ENV || 'development';
const envConfigFile = `./config.${env}.json`;

// Does not work, because the file was not copied over
if (fs.existsSync(envConfigFile)) {
  const envConfig = require(envConfigFile);
  Object.assign(config, envConfig);
}
Solution 1: Keep json files outside the src tree (recommended)
Assuming you have /src/ and /dist/ folders, you could keep your JSON files in the project's / folder. Then a script located at /src/config/load-config.ts could do this at runtime:
const envConfig = require(`../../config.${env}.json`);
// Or you could read manually without using require
const envConfigFile = path.join(__dirname, '..', '..', `config.${env}.json`);
const envConfig = JSON.parse(fs.readFileSync(envConfigFile, 'utf-8'));
This is the simplest solution. You just need to make sure the necessary config files will be in place in the production environment.
The remaining solutions will deal with the case when you really want to keep the config files in your src/ folder, and have them appear in your dist/ folder.
Solution 2: Manually import all possible files
For the above example we could do:
import * as config from './config.default.json';
import * as testingConfig from './config.testing.json';
import * as stagingConfig from './config.staging.json';
import * as productionConfig from './config.production.json';
This should cause the specified json files to be copied into the dist/ folder, so our require() should now work.
Disadvantage: If someone wants to add a new .json file, then they must also add a new import line.
Solution 3: Copy json files using tsc-hooks plugin (recommended)
The tsc-hooks plugin allows you to copy all files from the src tree to the dist tree, and optionally exclude some.
// Install it into your project
$ yarn add tsc-hooks --dev
// Configure your tsconfig.json
{
"compilerOptions": {
"outDir": "dist"
},
// This tells tsc to run the hook during/after building
"hooks": [ "copy-files" ]
// Process everything except .txt files
"include": [ "src/**/*" ],
"exclude": [ "src/**/*.txt" ],
// Alternatively, process only the specified filetypes
"include": [ "src/**/*.{ts,js,json}" ],
}
I found tsc-hooks announced here.
Solution 4: Copy json files using an npm build script (recommended)
Before tsc-hooks, we could add a cpy-cli or copyfiles step to the npm build process to copy all .json files into the dist/ folder, after tsc has finished.
This assumes you do your builds with npm run build or something similar.
For example:
$ npm install --save-dev cpy-cli
// To copy just the json files, add this to package.json
"postbuild": "cpy --cwd=src --parents '**/*.json' ../dist/",
// Or to copy everything except TypeScript files
"postbuild": "cpy --cwd=src --parents '**/*' '!**/*.ts' ../dist/",
Now npm run build should run tsc, and afterwards run cpy.
Disadvantages: It requires an extra devDependency. And you must make this part of your build process.
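If you would rather avoid the extra dependency, a plain Node script can do the same copy; a sketch, not part of the original answer (file name and paths are assumptions):
// copy-json.js (sketch) — recursively copy every .json file from src/ into dist/
const fs = require('fs');
const path = require('path');

function copyJson(srcDir, destDir) {
  for (const entry of fs.readdirSync(srcDir, { withFileTypes: true })) {
    const srcPath = path.join(srcDir, entry.name);
    const destPath = path.join(destDir, entry.name);
    if (entry.isDirectory()) {
      copyJson(srcPath, destPath);
    } else if (entry.name.endsWith('.json')) {
      fs.mkdirSync(destDir, { recursive: true }); // create the mirrored folder on demand
      fs.copyFileSync(srcPath, destPath);
    }
  }
}

copyJson('src', 'dist');
Then "postbuild": "node copy-json.js" plays the same role as the cpy command above.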
Solution 5: Use js files instead of json files
Alternatively, don't use .json files. Move them into .js files instead, and enable "allowJs": true in your tsconfig.json. Then tsc will copy the files over for you.
Your new .js files will need to look like this: module.exports = { ... };
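For example, a sketch of what one of those config files might contain (the keys are placeholders, not from the original answer):
// config.development.js (sketch) — plain CommonJS, copied to dist/ by tsc once allowJs is on
module.exports = {
  apiUrl: 'http://localhost:3000',
  cacheTtlSeconds: 60,
};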
I found this idea recommended here.
Note: In order to enable "allowJs": true you might also need to add "esModuleInterop": true and "declaration": false, and maybe even "skipLibCheck": true. It depends on your existing setup.
And there is one other concern (sorry I didn't test this):
Will tsc transpile your config files if they are not all statically referenced by other files? Your files or their folders may need to be referenced explicitly in the files or include options of your tsconfig.json.
Solution 6: Use ts files instead of json files
Sounds easy, but there are still some concerns to consider:
Your config files will now look something like this: const config = { ... }; export default config; (see the sketch after this list).
See the note above about files / include options.
If you load the config files dynamically at runtime, don't forget they will have been transpiled into .js files. So don't go trying to require() .ts files because they won't be there!
If someone wants to change a config file, they should do a whole new tsc build. They could hack around with transpiled .js files in the dist folder, but this should be avoided because the changes may be overwritten by a future build.
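A small sketch tying those points together (file names and keys are illustrative only):
// config.development.ts (sketch)
const config = {
  apiUrl: 'http://localhost:3000',
  cacheTtlSeconds: 60,
};
export default config;

// load-config.ts (sketch) — at runtime this resolves to the transpiled .js file,
// and because of the default export the value sits on .default
const env = process.env.NODE_ENV || 'development';
const envConfig = require(`./config.${env}`).default;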
Testing
When experimenting with this, please be sure to clear your dist/ folder and tsconfig.tsbuildinfo file between builds, in order to properly test the process.
(tsc does not always clean the dist/ folder, sometimes it just adds new files to it. So if you don't remove them, old files left over from earlier experiments may produce misleading results!)
In tsconfig.json, add
{
"compilerOptions": {
"resolveJsonModule": true,
},
"include": [
"src/config/*.json"
]
}
Note that it won't copy JSON files that are only pulled in with require(). If you need to dynamically load some JSON files and have them copied to dist, then you need to change, for example,
return require("some.json") as YourType
to
return (await import("some.json")) as YourType.
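A short sketch of that dynamic-import pattern in context (the file name and TaskList type are placeholders):
// load-tasks.ts (sketch)
interface TaskList {
  tasks: string[];
}

export async function loadTasks(): Promise<TaskList> {
  // static path, so the compiler still sees the file; dynamic import, so it is loaded at runtime
  return (await import('./tasks.json')) as unknown as TaskList;
}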
In TypeScript 2.9+ you can use JSON files directly, and they are automatically copied to the dist directory.
This is tsconfig.json with minimum needed configuration:
{
"compilerOptions": {
"allowSyntheticDefaultImports": true,
"esModuleInterop" : true,
"module" : "commonjs",
"outDir" : "./dist",
"resolveJsonModule" : true,
"target" : "es6"
},
"exclude" : [
"node_modules"
]
}
Then you can create a json file.
{
"address": "127.0.0.1",
"port" : 8080
}
Sample usage:
import config from './config.json';
class Main {
public someMethod(): void {
console.log(config.port);
}
}
new Main().someMethod();
If you don't use the esModuleInterop property, you have to access your JSON properties through the default field, e.g. config.default.port.
The typescript compiler works fine when I import a json file using
const tasks = require('./tasks.json')
TypeScript wouldn't complain about this as long as you have a global require() function defined, for example using node.d.ts. With a vanilla setup you would actually get a compile error that require is not defined.
Even if you've told TypeScript about a global require function it just sees it as a function that's expected to return something, it doesn't make the compiler actually analyze what the function is requiring ("tasks.json") and do anything with that file. This is the job of a tool like Browserify or Webpack, which can parse your code base for require statements and load just about anything (JS, CSS, JSON, images, etc) into runtime bundles for distribution.
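If you don't want to pull in the full Node typings, the bare minimum the compiler needs is a global declaration along these lines (a sketch):
// globals.d.ts (sketch) — just enough for the compiler to accept require() calls
declare function require(moduleName: string): any;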
Taking this a little further, with TypeScript 2.0 you can even tell the TypeScript Compiler about module path patterns that will be resolved and loaded by a bundler (Browserify or Webpack) using wildcard (*) module name declarations:
declare module "*.json" {
const value: any;
export default value;
}
Now you can import your JSON in TypeScript using ES6 module syntax:
import tasks from "./tasks.json";
Which will not give any compile error and will transpile down to something like var tasks = require("./tasks.json"), and your bundler will be responsible for parsing out the require statements and building your bundle including the JSON contents.
You can append && ncp src/res build/res to your build script; it will copy the files directly to your outDir.
You can always get an absolute path to your project from TypeScript code. To do so, read the JSON file not with the require keyword but with the help of the fs module. In the file path, use process.cwd() to get the project directory:
import * as fs from 'fs';
const task: any = JSON.parse(fs.readFileSync(`${process.cwd()}/tasks.json`).toString());
To make it work correctly you may need to change your run script to node dist/src/index.js, where you specify the dist folder in the path.
I've built an Aurelia application, but I'm not sure what needs to be pushed to a production server. I've read up on Node and I'm starting to grasp it a little more. If we just push the dist folder (the bundled output), index.html, and package.json, does the server automatically use the JSON file to pull down the appropriate packages? Or do we have to run npm install on the server's CLI to pull down those packages? If we have to do that, then I'm assuming we must do the same thing with jspm.
Also, along with the JSON file, do we need to push config.js to production?
Edit
I just ran gulp export and it produced an export folder with the following:
dist folder
jspm_packages folder
config.js
index.html
favicon.ico
I copied all of those files and pushed them into production. The first error I'm getting is a 404 on main.js.
Here's my bundles.js file:
module.exports = {
"bundles": {
"dist/app-build": {
"includes": [
"[**/*.js]",
"**/*.html!text",
"**/*.css!text"
],
"options": {
"inject": true,
"minify": true,
"depCache": true,
"rev": false
}
},
"dist/aurelia": {
"includes": [
"aurelia-framework",
"aurelia-bootstrapper",
"aurelia-fetch-client",
"aurelia-router",
"aurelia-animator-css",
"aurelia-templating-binding",
"aurelia-polyfills",
"aurelia-templating-resources",
"aurelia-templating-router",
"aurelia-loader-default",
"aurelia-history-browser",
"aurelia-logging-console",
"bootstrap",
"bootstrap/css/bootstrap.css!text",
"fetch",
"jquery"
],
"options": {
"inject": true,
"minify": true,
"depCache": false,
"rev": false
}
}
}
};
I'm confused about why it's not loading my nprogress bar. I'm getting the 404 where it's looking for appName/jspm_packages/github/rstacruz-nprogress. Why doesn't it automatically configure this to be bundled/exported? How do I fix it so that it automatically includes all of the libraries I brought in?
Run the command gulp export. It will bundle the app and copy the necessary files (index.html, config.js, etc.) to an export folder. Then just copy the export folder to the server. There is no need to install packages in production.
EDIT
When you install a package, such as nprogress, you have to include it in one of the bundle files. The bundles are configured in build/bundles.js. The Aurelia navigation skeleton comes with 2 bundles configured: one for the Aurelia libraries and one for the rest of your application. You can also create more bundles if you want. To add a package to a bundle, you just have to add its name to that bundle's includes array, for example:
//...
"dist/aurelia": {
"includes": [
"aurelia-framework",
"aurelia-bootstrapper",
"aurelia-fetch-client",
"aurelia-router",
"aurelia-animator-css",
"aurelia-templating-binding",
"aurelia-polyfills",
"aurelia-templating-resources",
"aurelia-templating-router",
"aurelia-loader-default",
"aurelia-history-browser",
"aurelia-logging-console",
"bootstrap",
"bootstrap/css/bootstrap.css!text",
"fetch",
"jquery",
"nprogress"
],
//...
In the above example I am adding nprogress to the aurelia bundle. You could instead add it to the app-build bundle, or even create a separate bundle just for nprogress.
Now, when you run gulp export, nprogress will be bundled into the aurelia-######.js file, and it will be ready to work in production.
I'm having a problem using a namespace in Laravel 4.
We have built an API using Laravel 3, in which we created an entire namespaced directory called Components that the RESTful Laravel controllers accessed to perform the logic for each request. The Components namespace was created this way to allow us to re-use the logic across several applications and keep things DRY.
In Laravel 3, in the application/start.php file, it was a simple matter of adding:
Autoloader::namespaces(array(
'Components' => 'path\to\Components',
));
This allowed us to reference a static method in any of our RESTful controllers simply by:
$result = Components\Services\Common::method();
In Laravel 4, the approach is obviously different. I have added the following to the composer.json file:
"autoload": {
"classmap": [
"app/commands",
"app/controllers",
"app/models",
"app/database/migrations",
"app/database/seeds",
"app/tests/TestCase.php"
],
"psr-0": {
"Components": "path/to/API/Components"
}
},
and ran the composer dump-autoload command to add the namespace to the autoload_namespaces.php file.
However, I cannot reference the namespace in any of my new controllers in Laravel 4. I just get a "Class 'Components\Services\Common' not found" error in HomeController.php.
I have checked the autoload_real.php file and dumped the loader variable; my new namespace is listed under the 'C' element of the array, but no joy in using it.
I know the namespace works, as it is in constant use in our Laravel 3 application. I would rather not replicate the directory into our new Laravel 4 application, otherwise the reason we designed things this way would be negated and we'd end up maintaining two codebases. The namespace directory lives inside our web root directory but outside of both our Laravel 3 and Laravel 4 applications.
Thanks for the help guys
If your namespace is
Components
And your application is in
/var/www/application
And your namespaced classes are inside the subfolder
app/API
And this is an example of a class file name:
/var/www/application/app/API/Components/Services/Common.php
Then you have to add to your composer json:
"autoload": {
"psr-0": {
"Components": "app/API"
}
},
If you are loading from another base path and your namespaced classes are in /var/www/Components, you can:
"autoload": {
"psr-0": {
"Components": "/var/www"
}
},
But if they are in /var/www/components/Components, then you have to
"autoload": {
"psr-0": {
"Components": "/var/www/components"
}
},
Because "Components" is the base of your namespaces and will always be added to the path before Composer search files to autoload.