I know that the maximum total size for Script Properties is 500 KB, but does anyone know the maximum size of a single Script Property? I am reading a spreadsheet, running Utilities.jsonStringify on it, and trying to save the result to a Script Property, but I get an 'Argument too large' error when I write to ScriptProperties. The data is about 8 KB, so I am nowhere near the 500 KB limit.
Thanks in advance
Chris
"Each property value has a maximum size limit of 9kB."
https://developers.google.com/apps-script/script_user_properties
You should really consider, as suggested by Srik and the documentation, using ScriptDb instead.
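If you have to stay with Script Properties, a common workaround is to split the JSON string across several properties, each under the per-value limit. A minimal sketch, assuming the same legacy ScriptProperties API used above; the key naming scheme and the 8,000-character chunk size are illustrative:

function saveLargeProperty(key, obj) {
  var json = Utilities.jsonStringify(obj);
  var chunkSize = 8000; // stay safely under the 9 kB per-value limit
  var count = 0;
  for (var i = 0; i < json.length; i += chunkSize) {
    ScriptProperties.setProperty(key + '_' + count, json.substring(i, i + chunkSize));
    count++;
  }
  ScriptProperties.setProperty(key + '_count', String(count));
}

function loadLargeProperty(key) {
  var count = Number(ScriptProperties.getProperty(key + '_count'));
  var parts = [];
  for (var i = 0; i < count; i++) {
    parts.push(ScriptProperties.getProperty(key + '_' + i));
  }
  return Utilities.jsonParse(parts.join(''));
}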
I have an item with the key vm.memory.size[used]; it returns the memory used, but this also includes the cache and the buffers.
I need to subtract vm.memory.size[cached] and vm.memory.size[buffers] from vm.memory.size[used] to get the value that I need.
How can I do this? I cannot find a way to do it; this is what I tried lately, but it does not work.
If you want to calculate it in a separate item, you must have used, cached and buffers already monitored as normal items. Once you have them, the calculated item formula would be last(vm.memory.size[used])-last(vm.memory.size[cached])-last(vm.memory.size[buffers]).
You can also calculate that directly in a trigger, removing the need for the calculated item.
And maybe even simpler than that - the vm.memory.size[available] and vm.memory.size[pavailable] item keys can give you the amount of available memory (raw and percentage, respectively) - already excluding cache & buffers - which you might be able to alert on directly.
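For the trigger approach mentioned above, a hedged sketch using the old-style Zabbix trigger expression syntax; the host name myhost, the vm.memory.size[total] key, and the 80% threshold are all illustrative:

{myhost:vm.memory.size[used].last()} - {myhost:vm.memory.size[cached].last()} - {myhost:vm.memory.size[buffers].last()} > 0.8 * {myhost:vm.memory.size[total].last()}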
I want to graph how much data is written to my MediaWiki per day, to visualize the activity. The exact number of bytes is not important; I just want to see the relative change per day/month/year.
I only found the Statistics Log extension, which has not been maintained since 1.15.
Any solution via extension/API/MySQL would be great. If I can get the value in bytes/chars or anything else by any method, I can do the rest.
Not an easy answer, but you can get started with the recentchanges table and the related API: https://www.mediawiki.org/wiki/API:RecentChanges
rc_old_len
This field stores the size, in bytes, of the previous revision's text.
rc_new_len
This field stores the size, in bytes, of the current revision's text.
Reference: https://www.mediawiki.org/wiki/Manual:Recentchanges_table#rc_new_len
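For example, a hedged SQL sketch of the per-day aggregation, assuming the default (unprefixed) recentchanges table and the standard yyyymmddhhmmss rc_timestamp format; rc_old_len is NULL for page creations, hence the COALESCE:

SELECT LEFT(rc_timestamp, 8) AS day,
       -- net change in stored text size per day (can be negative when pages shrink)
       SUM(rc_new_len - COALESCE(rc_old_len, 0)) AS net_bytes_written
FROM recentchanges
GROUP BY day
ORDER BY day;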
Is there a Rails method to return the data size, in bytes, of a record?
Let's say I have a table called Item. Is there a method something like @item.data_size that would return "xx bytes"?
I have a MySQL database.
Not sure if there's a native way of doing it like in C, but try this (note that it might include the size of the class, which is different from the size of the single SQL row):
require 'objspace'
ObjectSpace.memsize_of(@my_item)
First way:
require "rubygems"
require "knjrbfw"
analyzer = Knj::Memory_analyzer::Object_size_counter.new(my_hash_object)
puts "Size: #{analyzer.calculate_size}"
Second way:
require 'objspace'
h = {"a"=>1, "b"=>2}
p ObjectSpace.memsize_of(h)
Measure the memory taken by a Ruby object (by Robert Klemme)
Fortunately, no. As far as I know it's impossible to determine record size in MySQL, as well as in most databases. This is due to the following reasons; I'll list only the most obvious ones:
A record may include an association, i.e. a link to another record in another table, and it's completely unclear how to count this; moreover, it's unclear how to interpret the result of such a calculation.
A record has overhead such as indexes; should the calculation include it or not?
So any such record size will be very approximate and average by nature. If such a method existed, it could cause a lot of confusion. However, that doesn't mean this can't be done at all. Referring to this SO answer, it is possible to get the table size. You could seed your database with millions of typical records of fake data, e.g. using the ffaker gem, get the table size, and divide by the number of records. This should give a very good number for your particular situation.
As a next step, you could check whether the average record size correlates with the object size in memory. That may be pretty interesting.
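A hedged Ruby sketch of the same table-size-divided-by-rows estimate, reading MySQL's information_schema directly instead of seeding fake data; the items table name is illustrative, and InnoDB's table_rows statistic is itself approximate:

# Rough average-row-size estimate from MySQL table statistics.
row = ActiveRecord::Base.connection.select_one(<<-SQL)
  SELECT (data_length + index_length) / NULLIF(table_rows, 0) AS avg_row_bytes
  FROM information_schema.tables
  WHERE table_schema = DATABASE()
    AND table_name = 'items'
SQL
puts "~#{row['avg_row_bytes'].to_f.round} bytes per record (including index overhead)"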
Cheers!
Yes, you can count the total number of records by accessing your model. As an example, you can try this:
@items = Item.all
@items.size, @items.count, or @items.length will return the total number of records held by the @items variable. Or you can call count on the model directly: Item.count will return the total number of records in the database.
Using the DBCC SHOWCONTIG command we get the minimum, maximum, and average size of a row.
Just to make sure: the unit of measurement is bytes, right?
Yes, the unit of measurement is bytes.
I use it too, but I haven't found any official information about that.
I'll continue searching and will post a link if I find anything interesting.
EDIT:
Bytes is also the unit used here:
Row size overhead
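For reference, a hedged sketch of both the classic command and its modern replacement, whose column names spell the unit out explicitly; the table name 'MyTable' is illustrative:

-- Classic command; TABLERESULTS returns MinimumRecordSize, MaximumRecordSize,
-- and AverageRecordSize as result-set columns.
DBCC SHOWCONTIG ('MyTable') WITH TABLERESULTS;

-- Modern replacement (SQL Server 2005+); the *_in_bytes suffix confirms the unit.
SELECT min_record_size_in_bytes,
       max_record_size_in_bytes,
       avg_record_size_in_bytes
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('MyTable'), NULL, NULL, 'DETAILED');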
Since the end of August 2012, scripts that used to work on GAS have stopped working.
I am using JDBC to retrieve records from my MySQL database and show them in a sheet.
It was working, and then I started getting "We're sorry, we were unable to process the operation because it contains too much data."
But the dataset in question is 2400 rows * 35 columns, which is WAY below the 400,000-cell limit, and way below the 256-columns-per-sheet limit.
Any ideas? Has something changed recently?
I think the conclusion here is that the
"We're sorry, we were unable to process the operation because it contains too much data."
error can be caused not only by the number of rows and columns (cells) in a data set, but also by internal issues with the memory use of the objects being applied to the region.
Try adding a portion of the values at a time, with a SpreadsheetApp.flush() in between each setValues(). That solved my problem.
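A minimal sketch of that batching approach, assuming values is a rectangular 2-D array written starting at cell A1; the batch size of 500 rows is illustrative:

function writeInBatches(sheet, values) {
  var batchSize = 500; // rows per setValues() call; tune as needed
  for (var i = 0; i < values.length; i += batchSize) {
    var batch = values.slice(i, i + batchSize);
    sheet.getRange(i + 1, 1, batch.length, batch[0].length).setValues(batch);
    SpreadsheetApp.flush(); // commit pending changes before the next batch
  }
}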