I'm looking for a way to recursively generate links to the next pages on a website with a canonical structure. In essence, I'm trying to generate a link to each next page and then feed that result back into the process to find the following page, ad infinitum. However, I'm having problems automating this, as the macro seems to be trying to generate the result for cells that are empty (i.e. the result for an earlier cell hasn't been created/copied yet).
So I'd like to sequentialise the macro to start from A20, generate the result for that cell, copy that result to A21, then begin the macro again for A21, et cetera, without requiring constant human input.
The Google spreadsheet with the error can be seen here in cell C27 and the macro itself can be seen here.
I realise this may be quite a roundabout way to perform this task and am open to any suggestions that may be easier, more intuitive, or faster.
So, two suggestions: one is that with stuff that is a continuous scroll, it's very easy to find the JSON of the source, and either grab all the data you want in one go, or easily pick out the "next page" or pagination...
I personally use importdata() and importxml() more than any other functions, and when in Google Sheets I also use regexextract() and regexreplace() when needed.
For example, the JSON you're looking for is here: http://iconosquare.com/controller_nl.php?action=nlGetMethod&method=mediasTag&value=cricket&max_id=1145408330912313787
If you look at the top row, it tells you what the next min and max are, so technically you could just extract that piece to generate your URL.
The second option is to build the query so that it auto-increments the URLs. I can give you an example, but I would like to understand a little more about what you really want in the end result...
Are you just looking for the pagination URLs, or are you wanting to extract the actual data from them?
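If it is just the URLs, here is a rough Apps Script sketch of that first idea: fetch the JSON with UrlFetchApp and pull the "next max" value out of the response to build the following page's URL. The response key name (next_max_id) is an assumption; check the actual JSON from the endpoint to see what the top row really calls it.
// Rough sketch: fetch the JSON for one page and build the URL of the next page.
// The key "next_max_id" is assumed -- inspect the real response and adjust.
function getNextPageUrl(maxId) {
  var base = 'http://iconosquare.com/controller_nl.php?action=nlGetMethod&method=mediasTag&value=cricket&max_id=';
  var response = UrlFetchApp.fetch(base + maxId);
  var data = JSON.parse(response.getContentText());
  var nextMaxId = data.next_max_id; // assumed key
  return nextMaxId ? base + nextMaxId : null; // null when there is no further page
}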
Updated everything to include the ultimate goal of this spreadsheet and the issue I encountered during early testing, so that if someone with experience can identify a better way to do this, I can change my approach and do that instead.
Link to copy of sheet here
The Issue:
This macro uses paste special (values only) twice, but it only correctly pastes the first one and does nothing for the second. For example, the first instance copies data from C6 and pastes only the value to F6, which it does. The second copies from K6 and pastes only the value to M6, which it does not do. Everything else is functioning correctly except for this.
For context, this macro also copies a formula down in between the two copy/paste actions. You'll see the code for that below as well. It's working as intended, so you can ignore that.
With help from the answers below, we identified that the second paste-values-only action wasn't working because it was copying the cell before that cell could finish summing its formula.
For example, let's say K6 was =SUM(K4+K5) with K4=1 and K5=4. Before the sum could record the answer, "5", in that cell, the copy was grabbing the empty data and pasting it, thus pasting nothing.
I tried setting up a delay using sleep, but it was not a consistent solution; it only works about 50% of the time. If it's possible to set up a delay based around letting a formula finish its output, that would be a perfect solution. If anyone has experience with a similar issue or suggestions for optimization, please link below. I'll loop back if I find a solution.
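For reference, the sleep attempt was roughly along these lines (a rough sketch rather than the exact code, using the C6/F6 and K6/M6 ranges described above; the 2000 ms value was just a guess):
function TestCalcWithSleep() {
  var spreadsheet = SpreadsheetApp.getActive();
  // First paste special (values only) -- this one works.
  spreadsheet.getRange('C6').copyTo(spreadsheet.getRange('F6'), SpreadsheetApp.CopyPasteType.PASTE_VALUES, false);
  // Crude delay, hoping the =SUM() in K6 has finished recalculating by then.
  Utilities.sleep(2000);
  // Second paste special (values only) -- this one still pastes nothing about half the time.
  spreadsheet.getRange('K6').copyTo(spreadsheet.getRange('M6'), SpreadsheetApp.CopyPasteType.PASTE_VALUES, false);
}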
The Goal and Backstory:
This is a mock-up of an internal tracker we use to gauge our availability (in mins) against the project (in mins) to determine roughly when we will have the project complete. We use this tracker to determine if something will be late, if we have the availability from our team to take on more projects, etc. It's all manual, and we don't have anyone particularly strong with scripts.
Right now, our solution is to manually enter our availability in each cell with the assistance of basic sum formulas. This takes a lot of time and is prone to human error. So, with my very limited experience in scripts/macros, I'm trying to find a macro that will do the math for us.
In the "The Ideal" tab, I've manually created an example of what I'm ultimately working towards -- a macro that sums and enters in data based on availability vs. project mins. I'm obviously new to this, so there are inevitably inefficiencies and I may be completely on the wrong track. If so, please tell me and offer me an explanation as if you were speaking to a grandparent with basic computer knowledge.
Thanks everyone for your help!
Here's the code:
function TestCalc1() {
  var spreadsheet = SpreadsheetApp.getActive();
  // Autofill row 6 from row 5 (copies the formulas/series down one row).
  spreadsheet.getRange('D5:H5').activate();
  spreadsheet.getActiveRange().autoFill(spreadsheet.getRange('D5:H6'), SpreadsheetApp.AutoFillSeries.DEFAULT_SERIES);
  // Paste the value of H6 into I6 (values only).
  spreadsheet.getRange('I6').activate();
  spreadsheet.getRange('H6').copyTo(spreadsheet.getActiveRange(), SpreadsheetApp.CopyPasteType.PASTE_VALUES, false);
  spreadsheet.getRange('H6').activate();
}
Try this,
function TestCalc1() {
  var spreadsheet = SpreadsheetApp.getActive();
  // Copy the values of D5:H5 down into D6:H6.
  var cellsD5H5 = spreadsheet.getRange("D5:H5").getValues();
  spreadsheet.getRange("D6:H6").setValues(cellsD5H5);
  // Copy the value of H6 into I6.
  var cellH6 = spreadsheet.getRange("H6").getValues();
  spreadsheet.getRange("I6").setValues(cellH6);
  //spreadsheet.getRange("H6").activate();
}
getValues() - https://developers.google.com/apps-script/reference/spreadsheet/range#getValues()
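If the second copy is still grabbing a stale value because the =SUM() in the source cell hasn't recalculated yet, one thing worth trying is SpreadsheetApp.flush(), which applies all pending spreadsheet changes before the script continues. A sketch using the same ranges as above:
function TestCalc1() {
  var spreadsheet = SpreadsheetApp.getActive();
  var cellsD5H5 = spreadsheet.getRange("D5:H5").getValues();
  spreadsheet.getRange("D6:H6").setValues(cellsD5H5);
  // Apply pending changes (including recalculation) before reading H6,
  // so the copy doesn't pick up an empty/stale value.
  SpreadsheetApp.flush();
  var cellH6 = spreadsheet.getRange("H6").getValues();
  spreadsheet.getRange("I6").setValues(cellH6);
}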
This is one heck of a confusing question to ask, so here it goes. Firstly, I'm not asking you to write me any code; I just need help going in the right direction for what I'm trying to achieve here. Basically, the task is this: I want to scan a select area of a web page's source code for changes, and if something does change, I want to report it somewhere (like a console or something). However, I do not want just a notification of the change, I also want to know what the change is/was. I've been looking into things like Jsoup, but I am still struggling to even find out what this is called.
Any pointers would be insanely appreciated. Thanks, Optimistic.
Here are some steps assuming this is from a node.js project:
Get the URL for the specific script file you're looking for a change in.
Using the request() module, fetch that URL.
Break the data up into lines (probably using .split()).
Find the specific line you are looking for, either by counting line numbers or by searching for some representative text in that line.
Using some sort of search in that line (perhaps a regex), find the current value of the exact item in that line you are looking for.
Save the current value.
Then, at some future time, repeat this whole process and compare what you find to the previous value.
If this is being done from a browser instead of node.js, then use an Ajax call to retrieve the file. If the file is on another domain from your web page and that domain does not permit cross-origin requests, then you cannot solve this problem in an automated fashion from a browser in your own web page.
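A minimal node.js sketch of the steps above might look like this; the URL, the representative text, and the regex are placeholders you would replace with your own:
// npm install request
const request = require('request');

let previousValue = null;

function checkForChange(url) {
  request(url, function (err, response, body) {
    if (err) return console.error(err);
    // Break the source into lines and find the one we care about.
    const lines = body.split('\n');
    const line = lines.find(function (l) { return l.includes('someRepresentativeText'); }); // placeholder text
    if (!line) return;
    // Pull the exact value out of that line with a regex (placeholder pattern).
    const match = line.match(/someSetting\s*=\s*"([^"]*)"/);
    if (!match) return;
    const currentValue = match[1];
    if (previousValue !== null && currentValue !== previousValue) {
      console.log('Changed from "' + previousValue + '" to "' + currentValue + '"');
    }
    previousValue = currentValue;
  });
}

// Repeat the whole process at some interval and compare to the previous value.
setInterval(function () { checkForChange('http://example.com/some-script.js'); }, 60 * 1000);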
Here is how I would do it with Jsoup:
Document doc = Jsoup.connect(url).get();
String scriptCssQuery = "script"; // Tune this CSS query to find THE script you need.
Element script = doc.select(scriptCssQuery).first();
if (script != null) {
String scriptLines = script.html();
// Store the changing line somewhere and compare it to its previous value...
}
Hi, I have done thorough research and have come this far. All I am trying to do is extract an HTML table that spans many webpages.
I have to query the website sec.gov's database, and it then returns an appropriate number of results in a table (the size and number of pages vary with every query). For example:
Link: http://www.sec.gov/cgi-bin/srch-edgar
Inputs to be given:
Enter a Search string box: form-type=(8-k) AND filing-date=20140523
Start: 2014
End: 2014
How can I do this totally in R without even opening the browser?
I am sharing what I have done so far.
I tried many packages, and the closest I came was with the RCurl package. But with the getURL function, I opened the browser, ran the query in the browser, and pasted the resulting URL into getURL. It returned a very long character string, which contains the URLs that can be looped over to produce the output I want. All this information is in the "center" tag of the output.
Now I do not know how to get those URLs out of the middle of that character string.
Also, this is not what I wanted. I wanted to run a web query directly from R and get the varied HTML table outputs directly into R. Is this possible at all?
Thanks
Meena
Yes, it is possible. You will want to use a combination of the RCurl and XML packages. You will need to programmatically generate the query parameters in the URL (based on the HTML form) and then use getURL() or getURLContent(). Sometimes, the server will expect an HTTP POST, so there is postForm().
To parse the result, look up the XPath language, which the XML package supports with getNodeSet(). I think there is also a function in the XML package (readHTMLTable()) for parsing an HTML table into a data.frame.
You might want to invest in this book.
I want to know if there is a way to check if a file has been edited. I looked for methods that could do this for me in the Google Apps Script library, but I found nothing about it. I don't know if I searched incorrectly.
Basically, I need to take a file, take some measurable data (something like size) from this file, and store it in a variable. Then, I need to take that measurable data again, store it in another variable, and compare whether there was a change. I need a boolean return.
Anyone?
You could do the following (pseudo-steps, with links to documentation):
1. Get the file you want to check using the DocList Class.
2. Get the ID of that File once you have it using File.getId()
3. Get the last edit timestamp using File.getLastUpdated()
4. Store this value in a Spreadsheet, or maybe Script or User Properties.
5. When you want to check to see if the File was updated, simply fetch the File again with getFileById()
6. Repeat step 3.
7. Then compare the two last-edited timestamps with an operator like !=, or do more complex comparisons on the Dates if you want.
8. Depending on the result of step 7, return true or false.
Google's documentation is great, for all their services. You just need to read it a bit to understand what kind of power you have through scripting. Hopefully my pseudo-method helps in solving your problem!
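A minimal sketch of those steps in Apps Script (using DriveApp and PropertiesService, the newer equivalents of the classes mentioned above; the file ID is a placeholder):
// Returns true if the file has been edited since the last check.
// 'YOUR_FILE_ID_HERE' is a placeholder -- use your real file's ID.
function hasFileChanged() {
  var fileId = 'YOUR_FILE_ID_HERE';
  var lastUpdated = DriveApp.getFileById(fileId).getLastUpdated().getTime();

  var props = PropertiesService.getScriptProperties();
  var stored = props.getProperty('lastUpdated_' + fileId);

  // Save the current timestamp for the next check.
  props.setProperty('lastUpdated_' + fileId, String(lastUpdated));

  // Nothing stored yet: we can't say it changed.
  if (stored === null) return false;
  return Number(stored) !== lastUpdated;
}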
Look at the file update date: https://developers.google.com/apps-script/reference/drive/file#getLastUpdated() and, for storing data, look up the "storing data" section in the Apps Script help pages.
You could also use the GAT General Audit Tool http://goo.gl/hzZ2yf... which reports when files were edited, viewed, and much more.
I have MySQL and Zope talking to each other nicely.
I can insert values, and even view one record from the database and have it show up in the form fields (not that cool, really).
I have just been following tutorials, but now I need to search the Database and display the results.
I have been fiddling with this for a while but I haven't gotten far.
I know that using a Zope Page Template we can call the ZSQL method and show the results with TAL, but I need to somehow pass the search data from the search form, using a script and a redirect.
Can anyone help me get started?
Can anyone show me a way to do it with formgen and a custom script adapter?
Go into your ZMI, and add a "Z Search Interface" object in the same folder as your ZSQL method. Your ZSQL methods should show as "searchable objects". Select the ZSQL method you want, fill in the rest of the fields, and it will generate form and display page templates. Then modify the templates to taste.