Minecraft Structure Datapack modification - json

I am attempting to create a Minecraft data pack to change the way that a certain structure spawns. This structure is new to the game, and has many nbt files that make it up.
To modify the generation, I created a few folders. In my case, I am using "ancientworld" as the namespace. Inside this folder I have the following path: data/ancientworld/worldgen/noise_settings/
I then have a file called overworld.json in which I specify the spawning characteristics for this Ancient City structure.
I have followed this guide https://minecraft.fandom.com/wiki/Custom_world_generation#Noise_settings , but I still cannot figure out how to get the structures to spawn. Most of the other structures use one NBT file, but this one has 4 different folders with many separate NBT files that make up the structure.
Basically, can anybody help me figure out how this overworld.json file needs to be written so that this complex structure spawns extremely frequently? Thanks in advance for looking at this and helping out.
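For reference, this is roughly the kind of file I think is involved, though it is only a hedged sketch: I am assuming the 1.18.2+/1.19 layout, where structure placement lives in worldgen/structure_set rather than in the noise settings, and the numeric values below are illustrative guesses. Overriding the vanilla behaviour would mean placing something like this at data/minecraft/worldgen/structure_set/ancient_cities.json:

    {
      "structures": [
        {
          "structure": "minecraft:ancient_city",
          "weight": 1
        }
      ],
      "placement": {
        "type": "minecraft:random_spread",
        "spacing": 6,
        "separation": 2,
        "salt": 20083232
      }
    }

If that assumption is right, smaller spacing and separation values (with separation kept below spacing) should make the structure attempt to place much more often.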

Related

Creating a database using JSON files

Hello, my question is simple, but I don't know if it's possible to do.
I've got JSON files structured like the one in the picture, and I'd like to use these files (actually, the information contained in them) to create a database. I want to understand whether this is possible without transcribing all the information again, simply by loading the files into a database that will save them automatically.
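To make the idea concrete, here is a minimal sketch of the kind of automated import I have in mind, assuming Python and SQLite purely for illustration; the folder, table, and column names are hypothetical.

    import json
    import sqlite3
    from pathlib import Path

    # Hypothetical layout: every JSON file becomes one row in a "documents" table,
    # keeping the raw JSON so nothing has to be re-typed.
    conn = sqlite3.connect("documents.db")
    conn.execute("CREATE TABLE IF NOT EXISTS documents (filename TEXT PRIMARY KEY, raw_json TEXT)")

    for path in Path("json_files").glob("*.json"):  # hypothetical folder name
        with open(path, encoding="utf-8") as f:
            data = json.load(f)
        conn.execute(
            "INSERT OR REPLACE INTO documents (filename, raw_json) VALUES (?, ?)",
            (path.name, json.dumps(data)),
        )

    conn.commit()
    conn.close()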

GIS Data Organization: Accessing files via shortcuts

I work at a company with three different departments and about 15 different GIS users (ESRI). Our work is in natural resources and we have hundreds of large orthophotos and lidar data that take up a lot of space.
Right now, the three departments "share" all the GIS data, but they store it on different servers (and have for 15 years). So at this point, even though most departments have the same data, it is stored in different places, organized differently, named with different conventions, and takes up 3 times the storage space (since everyone keeps a separate instance of the same data).
I have been tasked with consolidating all the data into one shared, organized folder. We want everything stored in this "master" folder and the departments using this master folder going forward.
We plan to have separate folders for orthophotos, lidar files, vector data, etc. Within those folders, we have agreed that every file should be named with some variation of the following metadata: year, source, and geography. Everyone has their own way of organizing their data, and has for 15 years. Some people want it organized by year_source_geography, some want geography_year_source, some want source_year_geography, etc.
QUESTION:
1) Let's say we all agree to sort the master folder by year_source_geography: is there a way to create other folders, with different naming conventions, that "shortcut" to the master data? The idea is that everyone can still have their files (shortcuts) organized the way they want, but without creating duplicate files. For example: in addition to the master folder sorted by year_source_geography, can we have another folder, sorted by geography_year_source, that "shortcuts" to the master list? (A rough sketch of what I mean is below, after question 2.)
2) Do you have any comments on how I'm organizing this data? I'm fairly new to GIS organization, so any suggestions or comments on how I should be organizing it are welcome.
Thanks!
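To illustrate what I mean by "shortcut", here is a rough sketch using Python and completely hypothetical paths; symbolic links are just my guess at a mechanism, and on Windows shares they need the appropriate permissions.

    import os

    # Hypothetical paths: the master copy is named year_source_geography,
    # while a "view" folder presents the same file as geography_year_source.
    master_file = r"\\server\gis_master\orthophotos\2014_NAIP_countyX.tif"
    view_dir = r"\\server\gis_views\by_geography\orthophotos"

    os.makedirs(view_dir, exist_ok=True)
    os.symlink(master_file, os.path.join(view_dir, "countyX_2014_NAIP.tif"))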
I would implement a DB rather than plain files.
Start from the following: http://geojson.org/
You can add your own properties, such as year, filenames, paths, etc.
Even if your users still want the data in plain files, I would still manage the links and paths in a NoSQL DB. This will give you a great deal of flexibility. Push the data into AWS or a similar platform.
You can start testing with a free MongoDB service: http://www.mlab.com
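As a hedged sketch of what that catalog could look like (Python and pymongo purely for illustration; the database, collection, field names, and paths below are hypothetical):

    from pymongo import MongoClient

    # Hypothetical connection string; point this at your mLab/MongoDB instance.
    client = MongoClient("mongodb://localhost:27017/")
    catalog = client["gis_catalog"]["rasters"]

    # One document per dataset: the file stays on disk, only metadata and the path live in the DB.
    catalog.insert_one({
        "year": 2014,
        "source": "NAIP",
        "geography": "countyX",
        "type": "orthophoto",
        "path": r"\\server\gis_master\orthophotos\2014_NAIP_countyX.tif",
    })

    # Each department can then "sort" the same data any way it likes with a query,
    # instead of keeping its own copy of the files.
    for doc in catalog.find({"geography": "countyX"}).sort("year"):
        print(doc["path"])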

How to load a directory of different files (Excel and CSV) into multiple database tables with Talend?

I need to load a directory of different files (Excel and CSV), with no relation between them, into multiple database tables; every file must be loaded into its own table without any transformation.
I tried to do this using tFileList ==> tFileInputExcel ==> tMysqlOutput, but it doesn't work because I would need a lot of outputs.
Your question is not very clear, but it seems like you want something generic enough that will work with just one flow for all your files.
You might be able to accomplish that using dynamic schemas. See here for further guidance: https://www.talendforge.org/forum/viewtopic.php?id=21887. You will probably need at least 2 flows, one for the CSV files and one for the XLS files. You can filter the files for each flow by their extension in the tFileList component.
But if you are new to Talend, I encourage you to avoid this approach: dynamic schemas can be very hard to understand and use. Instead, I would recommend having one flow for each file.
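If a scripted alternative outside Talend is acceptable, the same idea can be sketched in Python with pandas and SQLAlchemy (the input folder and connection string below are hypothetical):

    from pathlib import Path

    import pandas as pd
    from sqlalchemy import create_engine

    # Hypothetical MySQL connection string and input folder.
    engine = create_engine("mysql+pymysql://user:password@localhost/mydb")

    for path in Path("input_files").iterdir():
        if path.suffix.lower() == ".csv":
            df = pd.read_csv(path)
        elif path.suffix.lower() in (".xls", ".xlsx"):
            df = pd.read_excel(path)
        else:
            continue
        # One table per file, named after the file, loaded as-is with no transformation.
        df.to_sql(path.stem, engine, if_exists="replace", index=False)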

Rails model concept with multiple sources

I have a document management system. A data set can be run through a program (another kind of file) and turned into images, a different kind of data, or even a new data set. I have to keep track of this "lineage".
If I were thinking in MySQL terms directly, I would add a "source" column and link each file to the file it was created from.
I can't think of a logical way to do this within the confines of Ruby on Rails. Any ideas/hints/tips?
What you are looking for is a graph DB. You can try Neo4j: www.neo4j.org/
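As a hedged sketch of what that lineage could look like in Neo4j (shown with Cypher via the Python driver purely for illustration; the node label, relationship name, and file names are hypothetical, and the same model can be driven from a Rails app):

    from neo4j import GraphDatabase

    # Hypothetical connection details.
    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

    with driver.session() as session:
        # Each file is a node; a DERIVED_FROM edge records which file it was created from.
        session.run(
            "MERGE (src:File {name: $src}) "
            "MERGE (out:File {name: $out}) "
            "MERGE (out)-[:DERIVED_FROM]->(src)",
            src="dataset_v1.csv", out="plot.png",
        )

        # Walking the full lineage of a file is then a single query.
        result = session.run(
            "MATCH (f:File {name: $name})-[:DERIVED_FROM*]->(ancestor) RETURN ancestor.name",
            name="plot.png",
        )
        for record in result:
            print(record["ancestor.name"])

    driver.close()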

mysql filesystem

Is there a way to create a file system like this: whenever a new user is registered, a folder with a unique ID is created in the filesystem for storing that user's images. If he/she creates a new album for pictures, another new folder is created inside that user's unique folder.
Thank you.
You can do that with server-side scripting. For example, with PHP you can create a new directory using the mkdir() function: http://php.net/manual/en/function.mkdir.php
Beware of scalability issues. I'd recommend adding another level of indirection, that is, storing each user's directory path in a table. At first you can create the directory structure however you like, but once you have tens of thousands of directories at one level of the filesystem, you could run into filesystem-level performance issues. Then you can just reorganize your folders, move the files, and update the paths in the table.
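A minimal sketch of that indirection (using Python and SQLite instead of PHP/MySQL purely for illustration; the table, column, and folder names are hypothetical):

    import os
    import sqlite3
    import uuid

    conn = sqlite3.connect("app.db")
    conn.execute("CREATE TABLE IF NOT EXISTS users (id TEXT PRIMARY KEY, image_dir TEXT)")

    def register_user():
        # The folder name is a generated unique ID; its path is stored in the table,
        # so the directories can be reorganized later by just updating the table.
        user_id = uuid.uuid4().hex
        image_dir = os.path.join("uploads", user_id[:2], user_id)  # extra level to avoid one huge flat directory
        os.makedirs(image_dir, exist_ok=True)
        conn.execute("INSERT INTO users (id, image_dir) VALUES (?, ?)", (user_id, image_dir))
        conn.commit()
        return user_id

    def create_album(user_id, album_name):
        # Album folders are nested inside the user's folder, looked up through the table.
        row = conn.execute("SELECT image_dir FROM users WHERE id = ?", (user_id,)).fetchone()
        album_dir = os.path.join(row[0], album_name)
        os.makedirs(album_dir, exist_ok=True)
        return album_dir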