How to calculate the duration of a MIDI file - actionscript-3

I am reading MIDI files in AS3 (Flash CS5) with the help of a library called MIDAS
( http://code.google.com/p/midas3/) - the MIDI-AS3 library.
I am trying to figure out a simple way to calculate the total duration of the MIDI file (for example, 4 minutes or 6 minutes). I assume I could find the last note of each track, check the tempo, and work it out from there, but I was wondering:
Is the duration of the MIDI file written somewhere in the data, so that I could just pull it out and use it?
or
Is there an easy way to calculate it without running through the whole file and comparing last notes and tempos?

Nope, you need to read the entire file and determine the time of the last note as you go. MIDI files are essentially streaming data, so there is no "length" field in the file's header.
Edit: Upon further thought, "streaming" isn't exactly a great way to describe MIDI files. MIDI files do have a fixed length in bytes, which is stored in the IFF-style chunk headers. However, there is no field for the length of the file in seconds. But assuming you can read all of the events into a sequence (and don't forget to take tempo changes into account!), it should not be too difficult to determine the length of the file in seconds.
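For reference, here is a minimal sketch of that computation. The event shape is an assumption about what a parser such as MIDAS hands you (a flat, tick-sorted event list), and it assumes the header's division is ticks per quarter note rather than SMPTE:

    interface MidiEvent {
      tick: number;                 // absolute position in ticks
      microsecondsPerBeat?: number; // present only on tempo-change meta events
    }

    function durationSeconds(events: MidiEvent[], ticksPerBeat: number): number {
      let seconds = 0;
      let lastTick = 0;
      let usPerBeat = 500000; // MIDI default tempo: 120 BPM

      for (const ev of events) {
        // Convert the ticks elapsed since the previous event using the tempo
        // that was in force during that span.
        seconds += ((ev.tick - lastTick) / ticksPerBeat) * (usPerBeat / 1e6);
        lastTick = ev.tick;
        if (ev.microsecondsPerBeat !== undefined) {
          usPerBeat = ev.microsecondsPerBeat; // tempo change takes effect here
        }
      }
      return seconds; // time of the last event, i.e. the file's duration
    }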

Related

Writing TIFF pixel data last and making 8 GB TIFF files

Is there any reason not to write TIFF pixel data last? Typically a simple TIFF file starts with a header that describes endianness and contains the offset to the first IFD, then the pixel data, followed at the end of the file by the IFD and then the extra data that the IFD tags point to. All TIFF files I've seen are written in this order. However, the TIFF standard says nothing mandating such an order; in fact, it says that "Compressed or uncompressed image data can be stored almost anywhere in a TIFF file". And I can't imagine that any TIFF parser would mind the order, as parsers must follow the offset to the first IFD and then follow the StripOffsets (tag 273) values. I think it makes more sense to put the pixel data last, so that a reader moves through a TIFF file sequentially without jumping from the top to the bottom and back to the top for no good reason, yet I don't see anyone else doing this, which perplexes me slightly.
Part of the reason I'm asking is that a client of mine is trying to create TIFF files slightly over 4 GB, which doesn't work because the IFD offsets overflow. I'm thinking that even though the TIFF standard says TIFF files cannot exceed 2^32 bytes, there might be a way to create 8 GB TIFF files that most TIFF parsers would accept: put everything that isn't pixel data first, so it all has very small offsets, then point to two strips of pixel data, the second of which starts before offset 2^32 and is itself no larger than 2^32-1 bytes. That would give us TIFF files that can be up to 2*(2^32-1) bytes and still, in theory, be readable despite being limited to 32-bit offsets and sizes.
Please note that this is a question about the TIFF format, I'm not talking about what any third-party library would accept to write as I wrote my own TIFF writing code, nor am I asking about BigTIFF.
It's OK to write pixel data after the IFD structure and tag data; many TIFF-writing programs and libraries do that. Most notably, libtiff-based software writes image data first. As for writing image data in huge strips, possibly pushing the file past 4 GB, check with the software/libraries you intend to read the files with: software compiled for 32 bits, or other implementation details, might prevent reading such files. I found, for example, that modern libtiff-based software, Photoshop, and tifffile can read such files, while ImageJ, BioFormats, and Paint.NET cannot.
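To illustrate the arithmetic of the two-strip idea from the question, a quick sketch (the size of the metadata block is an arbitrary assumption):

    const MAX32 = 2 ** 32 - 1;

    const metadataBytes = 4096;               // header + IFD + tag data up front
    const strip1Offset = metadataBytes;       // tiny, easily encodable offset
    const strip1Bytes = MAX32 - strip1Offset; // first strip fills the low 4 GB
    const strip2Offset = strip1Offset + strip1Bytes; // == 2^32 - 1, still fits in a LONG
    const strip2Bytes = MAX32;                // second strip is itself < 2^32

    // Both StripOffsets values and both StripByteCounts values fit in 32-bit
    // LONGs, yet the file ends at strip2Offset + strip2Bytes = 2 * (2^32 - 1).
    console.log(strip2Offset + strip2Bytes);  // 8589934590

Whether any given reader accepts such a file is, as noted above, another matter entirely.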

How to create a Datasheet for Elden Ring Save-Files

I'd like to find out the offsets of certain information (level, equipped items, etc.) in the ER0000.sl2 save file.
It has been done for Dark Souls here.
And I found out some offset locations in this file
What would be a good approach to reverse-engineer an exhaustive list like the first one?
Change my save-state ingame, diff the save files and see what changes? Or is there an easier way?
I would use previous Souls-like games as a base (like the link you provided). You want to gather a lot of save data and then compare what stays the same and what differs, given that you know what is being saved, as you suggested.
In Elden Ring, all character slots are always the same size, but not all of the data within them is always at the same location within the slot's save data. For example, the death counter's location can differ by thousands of bytes between different character saves.
I found out where deaths are stored in the save (roughly; the statement above explains why) by dying and comparing where the value changed in a hex editor, knowing deaths are stored as 32-bit little-endian integers.
I haven't looked, but if you can find a structure for the save data in IDA Pro or similar, you could have an easier time too (if the game has an object for save data).
A good trick is to use a memory reader while playing the game: read the value you want to find, then search for it within the save file. For example, I made a new save and died until I could find the death count in Cheat Engine, then switched to my other save that had hundreds of deaths; I found the death count instantly, and it was then easy to locate in the save file.
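A minimal sketch of the diffing step, comparing two dumps of the same save byte by byte (the file names are just examples):

    import { readFileSync } from "fs";

    const before = readFileSync("ER0000_before.sl2"); // copied before dying once
    const after = readFileSync("ER0000_after.sl2");   // copied after dying once

    const len = Math.min(before.length, after.length);
    for (let i = 0; i < len; i++) {
      if (before[i] === after[i]) continue;
      // If four bytes fit, also decode both sides as 32-bit little-endian,
      // since counters like deaths are stored that way.
      const detail =
        i + 4 <= len
          ? ` (u32le ${before.readUInt32LE(i)} -> ${after.readUInt32LE(i)})`
          : "";
      console.log(`offset 0x${i.toString(16)}${detail}`);
    }

In practice many unrelated offsets (timestamps, checksums) will also differ, so diffing several before/after pairs and intersecting the results narrows things down faster.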

d3.json() to load large file

I have a 96 MB .json file
It has been filtered to only the content needed
There is no index
Binaries have been created where possible
The file needs to be served all at one time to calculate summary statistics from the start.
The site: https://3milychu.github.io/met-erials/
How could I improve performance and speed and/or convert the .json file to a compressed file that can be read client-side in javascript?
Most visitors will not hang around for the page to load -- I thought that the demo was broken when I first visited the site. A few ideas:
JSON is not a compact data format, as the property names are repeated in every record. CSV/TSV is much better in that respect, as the headers appear only once, at the top of the file.
On the other hand, repetitive data compresses well, so you could set up your server to compress your JSON data (e.g. using mod_deflate on Apache or compression on nginx) and serve it as a gzipped file that will be decompressed by the user's browser. You can experiment to see what combination of file format and compression works best.
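To make the format difference concrete, a small sketch using d3-dsv (the records are made up):

    import { csvFormat } from "d3-dsv";

    const rows = [
      { material: "wool", year: 1850 },
      { material: "silk", year: 1900 },
    ];

    // As JSON, "material" and "year" repeat in every record:
    //   [{"material":"wool","year":1850},{"material":"silk","year":1900}]
    console.log(csvFormat(rows));
    // material,year
    // wool,1850
    // silk,1900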
Do the summary stats need to be calculated every single time the page loads? When I have worked with huge datasets in the past, summary data was generated by a daily cron job so users didn't have to wait for the queries to be performed. From user feedback, and my own experience as a user, summary stats are only of passing interest, and you are likely to lose more users by making them wait for an interface to load than through not providing summary stats or serving stats that are slightly out of date.
Depending on how your interface / app is structured, it might also make sense to split your massive file into segments for each category / material type, and load the categories on demand, rather than making the user wait for the whole lot to download.
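A hypothetical sketch of that on-demand approach, with one small file per category (the file layout and function names are assumptions):

    import { json } from "d3-fetch";

    function render(data: unknown) {
      /* update the visualization with this category's records */
    }

    async function showCategory(category: string) {
      // d3.json returns a Promise in d3 v5+; earlier versions use a callback.
      const data = await json(`data/${category}.json`);
      render(data);
    }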
There are numerous other ways to improve the load time and (perceived) performance of the page -- e.g. bundle up your CSS and your JS files and serve them each as a single file; consider using image sprites to reduce the number of separate requests that the page makes; serve your resources compressed wherever possible; move the JS loading out of the document head and to the foot of the HTML page so it isn't blocking the page contents from loading; lazy-load JS libraries as required; etc., etc.
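For the lazy-loading point, a minimal sketch using a dynamic import() the first time a feature is used (the module name and selector are made up):

    const button = document.querySelector<HTMLButtonElement>("#show-chart")!;
    button.addEventListener("click", async () => {
      // The chart library is only fetched when the user actually asks for it.
      const { drawChart } = await import("./chart.js");
      drawChart();
    });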

"Play" sounds into a .wav

I'm trying to make a program that can convert ORG files into WAV files directly. The ORG format is similar to MIDI, in the sense that it is a list of "instructions" about when and how to play specific instruments, and a program plays these instruments for it to create the song.
However, as I said, I want to generate a WAV directly, instead of just playing the ORG. So, in a sense, I want to "play" the sounds into a WAV. I do know the WAV format and have created some files from raw PCM samples, but this isn't as simple.
The sounds generated by the ORG come from a bunch of files containing WAV samples I have. They're mono, 8-bit samples that should be played at 22050 Hz. They're all under a second long, and the largest are no more than 11 KB. I would assume that to play them one after another, I would simply put the samples into the WAV one after the other. It isn't that simple, though, as the ORG can have up to 16 different instruments playing at once, and each note of each instrument also has a pan (i.e. a balance, allowing stereo sound). What's more, each ORG has its own tempo (i.e. milliseconds between each point at which a sound can be played), and some sounds may be longer than this tempo, which means that two sounds on the same instrument can overlap. For instance, a note plays on an instrument, then 90 milliseconds later the same note plays on the same instrument, but the first note hasn't finished, so the first note plays into the second.
I just thought to explain all of that to be sure the situation is clear. In any case, I'd basically like to know how I would go about converting or "playing" an ORG (or if you like, a MIDI (since they're essentially the same)) into a WAV. As I mentioned each note does have a pan/balance, so the WAV would also need to be stereo.
If it matters at all, I'll be doing this in ActionScript 3.0 in FlashDevelop. I don't need any code (as that would be asking someone to do the work for me), but I just want to know how I would go about doing this correctly. An algorithm or two may be handy as well.
First let me say that AS3 is not the best language for this kind of thing. SuperCollider would be a better and easier choice.
But if you want to do it in AS3 here's a general approach. I haven't tested any of it, this is pure theory.
First, put all your sounds into an array, and then find a way of matching the notes from your midi file to a position in the array.
I don't know the MIDI format in depth, but I know the smallest unit is a tick, and the length of a tick depends on the BPM. Here's the formula to calculate a MIDI tick: Midi Ticks to Actual PlayBack Seconds !!! ( Midi Music)
Let's say your tick is 2ms in length. So now you have a base value. You can fill a Vector (like an Array but faster) with what happens at every tick. If nothing happens at a particular tick, then insert a null value.
Now the big problem is reading that Vector. It's a problem because the Timer class does not work reliably at intervals as small as 2 ms. But what you can do is check the elapsed time in milliseconds since the app started, using getTimer(). You can have a loop that checks the elapsed time and, whenever 2 ms more have passed, reads the next index in the Vector. If there are notes at that index, you play the sounds; if not, you wait for the next tick.
The problem with this is that if a loop runs for more than 15 seconds (I'm not sure of the exact value), Flash will decide the program is not responding and kill it. So you have to take care of that too, ending the loop and starting a new one before Flash kills your program.
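A sketch of that elapsed-time check (Date.now() stands in for AS3's getTimer(); in Flash you would call this from an ENTER_FRAME handler rather than a blocking loop, which also sidesteps the script-timeout issue):

    const TICK_MS = 2;
    const startTime = Date.now();
    let nextTick = 0;

    type Note = { instrument: number; pitch: number };
    function playNote(note: Note): void {
      // trigger the corresponding sound here
    }

    function onFrame(ticks: (Note[] | null)[]): void {
      // Catch up on every tick that has elapsed since the last frame, so
      // notes aren't skipped when a frame takes longer than one tick.
      while (nextTick < ticks.length && Date.now() - startTime >= nextTick * TICK_MS) {
        const notes = ticks[nextTick];
        if (notes) notes.forEach(playNote);
        nextTick++;
      }
    }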
OK, so now you have sounds playing. You can record the sounds that Flash is making (WAVs, MP3s, the mic) with a library called Standing Wave 3.
https://github.com/maxl0rd/standingwave3
This is very theoretical, and I'm quite sure that depending on the number of sounds you want to play at once you could freeze your program, but I hope it helps get you going.
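Alternatively, since the goal is a WAV file rather than live playback, you could skip the real-time route entirely and mix the samples offline into stereo buffers, then scale them to PCM and prepend a WAV header yourself. A minimal sketch under the question's assumptions (8-bit mono sources; all names are illustrative):

    const RATE = 22050; // sample rate of the source samples

    interface ScheduledNote {
      startMs: number;    // when the note begins on the song timeline
      pan: number;        // 0 = full left, 1 = full right
      sample: Uint8Array; // unsigned 8-bit mono PCM
    }

    function mix(notes: ScheduledNote[], totalMs: number) {
      const frames = Math.ceil((totalMs / 1000) * RATE);
      const left = new Float64Array(frames);
      const right = new Float64Array(frames);

      for (const n of notes) {
        const start = Math.floor((n.startMs / 1000) * RATE);
        for (let i = 0; i < n.sample.length && start + i < frames; i++) {
          const v = (n.sample[i] - 128) / 128; // unsigned 8-bit -> [-1, 1)
          left[start + i] += v * (1 - n.pan);  // overlapping notes simply add
          right[start + i] += v * n.pan;
        }
      }

      // Clamp (or normalize) so the sum of overlapping notes can't clip.
      for (let i = 0; i < frames; i++) {
        left[i] = Math.max(-1, Math.min(1, left[i]));
        right[i] = Math.max(-1, Math.min(1, right[i]));
      }
      return { left, right };
    }

Because overlap is just addition, the "note plays into the next note" case falls out for free; interleave the clamped buffers and scale to 16-bit to produce the stereo WAV data.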

Why do people always encourage a single js file for a website?

I read some website development materials on the Web, and every time someone asks how to organize a website's js, css, html and php files, people suggest a single js file for the whole website. The argument is speed.
I clearly understand that the fewer requests there are, the faster the page responds. But I never understood the single-js argument. Suppose you have 10 webpages and each needs a js function to manipulate its dom objects. If you put all 10 functions in a single js file and execute that js on every page, 9 out of 10 functions are doing useless work, and CPU time is wasted searching for non-existent dom objects.
I know that CPU time on an individual client machine is trivial compared to bandwidth on a single server machine. I am not saying you should have many js files on a single webpage. But I don't see anything wrong with every webpage referring to 1 to 3 js files that are cached on the client machine. There are many good ways to do caching: for example, you can use an expiry date, or you can include a version number in your js file name. Compared to lumping the functionality for many webpages into one big js file, I far prefer splitting the js code into smaller files.
Any criticism/agreement on my argument? Am I wrong? Thank you for your suggestions.
A function does no work unless it's called. So 9 uncalled functions do zero work; they just take up a little extra space.
A client only has to make one request to download one big JS file, which is then cached for every other page load. That's less work than making a small request on every single page.
I'll give you the answer I always give: it depends.
Combining everything into one file has many great benefits, including:
less network traffic - you might be retrieving one file, but you're sending/receiving multiple packets, and each transaction has a series of SYN, SYN-ACK, and ACK messages sent across TCP. A significant part of the transfer time goes to establishing the session, and there is a lot of overhead in the packet headers.
one location/manageability - although you may only have a few files, it's easy for functions (and class objects) to grow between versions. With the multiple-file approach, functions from one file sometimes call functions/objects from another file (e.g. ajax in one file, arithmetic functions in another; your arithmetic functions might grow to need to call the ajax and have a certain variable type returned). What ends up happening is that your set of files needs to be versioned as one unit, rather than each file having its own version. Things get hairy down the road if you don't have good management in place, and it's easy to fall out of line with JavaScript files, which are always changing. Having one file makes it easy to manage the version between each of your pages across your (1 to many) websites.
Other topics to consider:
dormant code - you might think that the uncalled functions reduce performance by taking up space in memory, and you'd be right, but the effect is so minuscule that it doesn't matter. Functions are indexed in memory, and while the index table may grow, it's trivial for small projects, especially given today's hardware.
memory leaks - this is probably the largest reason you wouldn't want to combine all the code, but it's a small issue given the amount of memory in today's systems and the better garbage collection browsers now have. It's also something that you, as a programmer, have the ability to control: quality code leads to fewer problems like this.
Why does it depend?
While it's easy to say "throw all your code into one file", that would be wrong. It depends on how large your code is, how many functions there are, who maintains it, and so on. Surely you wouldn't pack your locally written functions into the jQuery package, and you may have different programmers maintaining different blocks of code; it depends on your setup.
It also depends on size. Some programmers embed images as base64-encoded ASCII in their files to reduce the number of requests, and that can bloat files. Surely you don't want to package everything into one 50 MB file, especially if there are core functions needed for the page to load.
So to bring my response to a close: we'd need more information about your setup, because it depends. Surely 3 files is acceptable regardless of size, combining where you see fit; that probably won't really hurt network traffic, but 50 files is unreasonable. I use the hand rule (no more than 5), and surely you'll see a benefit from combining those five 1 KB files into one 5 KB file.
Two reasons that I can think of:
Less network latency. Each .js requires another request/response to the server it's downloaded from.
Fewer bytes on the wire and less memory. If it's a single file you can strip out unnecessary characters and minify the whole thing.
The Javascript should be designed so that the extra functions don't execute at all unless they're needed.
For example, you can define a set of functions in your script but only call them in (very short) inline <script> blocks in the pages themselves.
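An equivalent pattern uses a data attribute instead of inline <script> blocks; a sketch with made-up page names:

    // site.js: one bundled, cached file. Each page opts in to just the
    // behavior it needs, so the other initializers are defined but never run.
    const initializers: Record<string, () => void> = {
      gallery: () => { /* wire up the gallery DOM */ },
      checkout: () => { /* wire up the checkout form */ },
    };

    document.addEventListener("DOMContentLoaded", () => {
      const page = document.body.dataset.page; // e.g. <body data-page="gallery">
      if (page !== undefined && page in initializers) initializers[page]();
    });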
My line of thought is that you have fewer requests. When you make a request in the head of the page, it stalls the output of the rest of the page: the user agent cannot render the rest of the page until the javascript files have been fetched. Also, javascript files download synchronously; they queue up instead of downloading all at once (at least that is the theory).