nginx: location condition + secure_link - configuration

I'm having a hard time figuring out the correct configuration.
I have a URL, e.g.
http://example.com/[md5-checksum]/[num-value-1]/[num-value-2]/[file-name]
http://example.com/ac64392dba67d618ea6a76843c006708/123/56789/test.jpg
I want to make sure that the md5 checksum matches salt + num-value-2. The file name and num-value-1 should therefore be ignored when building the md5 checksum (they are only needed for the filename header).
The following configuration does not achieve this:
location ~* ^/download/([a-zA-Z0-9]+)/([0-9]+)/([0-9]+)/(.*)$ {
    secure_link $3;
    secure_link_md5 segredo$3;

    if ($secure_link = "") {
        return 500;
    }

    set $filename $4;
    add_header Content-Disposition "attachment; filename=$filename";

    rewrite ^/download/([a-zA-Z0-9]+)/([0-9]+)/([0-9]+)/(.*)$ /$2/$3 break;
}
I appreciate any help!

The problem is that

secure_link $3

should have been

secure_link $1

since $1 is the capture group that holds the md5 checksum from the URL.
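With that fix applied, the location block would look something like this (an untested sketch; note too that, per the nginx docs, secure_link_md5 is compared against the base64url encoding of the binary MD5 rather than a hex digest, so the checksum in the URL must be generated accordingly):

location ~* ^/download/([a-zA-Z0-9]+)/([0-9]+)/([0-9]+)/(.*)$ {
    # $1 = checksum from the URL, $3 = num-value-2
    secure_link $1;
    secure_link_md5 segredo$3;

    if ($secure_link = "") {
        return 500;
    }

    set $filename $4;
    add_header Content-Disposition "attachment; filename=$filename";

    rewrite ^/download/([a-zA-Z0-9]+)/([0-9]+)/([0-9]+)/(.*)$ /$2/$3 break;
}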

How can I prevent direct access to files using the .htaccess file?

I have a problem: I am trying to restrict direct access to files (e.g. www.domen.com/folder/subfolder/file.ext) so that they can only be accessed from HTML code like <img src='/folder/subfolder/file.ext'>...
I created an .htaccess file with the following lines:
# enable mod_rewrite
RewriteEngine On
# RewriteCond = define a rule condition
# HTTP_REFERER = check where the request originated
# ! = negate the match
# ^ = start of string
# [NC] = case-insensitive match
RewriteCond %{HTTP_REFERER} !^http://domen.com/folder/subfolder [NC]
# \. = literal dot (escaped)
# () = group of alternatives
# $ = end of string
# [F] = forbidden, 403
# [L] = stop processing further rules
RewriteRule \.(gif|jpg|jpeg|png|mp4|mov|mkv|flv)$ - [F,L]
My files have permission 0644 and my folders and subfolders have permission 0755.
The problem is this: with this code in the .htaccess file I do block direct access to the files, but at the same time I can no longer access them from my HTML code either.
<Directory platform/courses/*>
    Order Allow,Deny
    Allow from 1.1.1.1
    Deny from All
</Directory>
<Directory>
    Order Allow, Deny
    Allow from All
</Directory>
I tried something like this (with the IP address taken from my cPanel), but I get this result:
This is a cut-down version of what I use, so it may have syntax errors.
The code is laid out to demonstrate the process; it can be made more robust.
The user is shown a URL with a filename, served by download.php.
This can be posted.
Once the page is shown, it saves the filename in a cookie with an expiry of 1 hour. If it expires, the page has to be refreshed.
When the button is pressed, sendfile.php gets all the information from the cookie, validates the expiry and filename, and sends the file.
download.php
This is the landing page that the user can link to.
Show the user a link like /download/document.pdf
Use .htaccess to map it to /download?name=document.pdf
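A minimal rewrite for that mapping might look like this (my sketch, assuming download.php sits in the web root):

RewriteEngine On
# Map /download/document.pdf to /download.php?name=document.pdf
RewriteRule ^download/([^/]+)$ /download.php?name=$1 [L,QSA]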
<?php
// Returns the param by name; if not found, extracts it from the current URL.
// So this works regardless of how the .htaccess redirection has been done.
function Get_Param ($name, $current_url)
{ $par = filter_var ($_GET [$name] ?? '', FILTER_SANITIZE_STRING);
  if ($par)
    {return $par;}
  $pi = pathinfo ($current_url);
  $pi = $pi['filename'] ?? '';
  // accept the path segment only if it doesn't look like a query string
  if (strpos ($pi, '=') === false)
    {return $pi;}
  else
    {return '';}
}

// Save the filename in a cookie with an expiry of 1 hour
function Save_Info ($file)
{ $expiry = time() + 3610;
  setcookie ('Download', $file, $expiry, '/');
}

// Get the filename from the param or the URL
$current_url = $_SERVER['REQUEST_URI'] ?? '';
$file = Get_Param ('name', $current_url);
// Save it in the cookie
Save_Info ($file);
// Show page with filename and details
// Show button with link "/sendfile"
?>
sendfile.php
This basically pretends to be the file that's being downloaded, and then dies.
<?php
// Sends the file and dies
function Send_File ($file)
{ header ('Content-Description: File Transfer');
  header ('Content-Type: application/octet-stream');
  header ('Content-Disposition: attachment; filename="'.basename($file).'"');
  header ('Expires: 0');
  header ('Cache-Control: must-revalidate');
  header ('Pragma: public');
  header ('Content-Length: ' . filesize($file));
  flush ();
  if (readfile ($file)) {
    //LogIt ($file);
  }
  die ();
}

// Load the filename from the cookie and expire the cookie
function Load_Info ()
{ $file = $_COOKIE ['Download'] ?? '';
  setcookie ('Download', '', time() - 3600, '/'); // expire the cookie
  return $file;
}

// Validate the filename
function Is_Valid ($file, $path)
{ if (!$file) {
    return 'FileName is Blank';
  }
  $file = urldecode($file);
  if (!preg_match('/^[^.][-a-z0-9_.]+[a-z]$/i', $file)) {
    return 'Invalid FileName';
  }
  if (!file_exists($path.$file)) {
    return 'File does not exist';
  }
  return false;
}

// Your actual path to the file
$path = '';
// Load the filename from the cookie
$file = Load_Info ();
// Die if the filename is invalid or the cookie has expired
$error = Is_Valid ($file, $path);
if ($error) {
  // Show message and die
  die ($error);
}
Send_File ($path.$file);
?>

Is there a simple way to convert a CSV with 0-indexed paths as keys to JSON with Miller?

Consider the following CSV:
email/1,email/2
abc@xyz.org,bob@pass.com
You can easily convert it to JSON (taking into account the paths defined by the keys) with Miller:
mlr --icsv --ojson --jflatsep '/' cat file.csv
[ { "email": ["abc#xyz.org", "bob#pass.com"] } ]
Now, if the paths are 0-indexed in the CSV (which is surely more common):
email/0,email/1
abc@xyz.org,bob@pass.com
Then, without prior knowledge of the field names, it seems that you have to rewrite the whole conversion.
edit: replaced the hard-coded / with the FLATSEP built-in variable:
mlr --icsv --flatsep '/' put -q '
  begin { @labels = []; print "[" }
  # translate the original CSV header from 0-indexed to 1-indexed
  NR == 1 {
    i = 1;
    for (k in $*) {
      @labels[i] = joinv( apply( splita(k, FLATSEP), func(e) {
        return typeof(e) == "int" ? e+1 : e
      }), FLATSEP );
      i += 1;
    }
  }
  NR > 1 { print @object, "," }
  # create an object from the translated labels and the row values
  o = {};
  i = 1;
  for (k, v in $*) {
    o[@labels[i]] = v;
    i += 1;
  }
  @object = arrayify( unflatten(o, FLATSEP) );
  end { if (NR > 0) { print @object } print "]" }
' file.csv
I would like to know if I'm missing something obvious, like a command-line option or a way to rename the fields with the put verb, or maybe something else. You're also welcome to give your insights about the previous code, as I'm not really confident in my Miller programming skills.
Update:
With @aborruso's approach of pre-processing the CSV header, this could be reduced to the following (note: I didn't keep the regextract part, because it requires knowing the CSV header in advance):
mlr --csv -N --flatsep '/' put '
  NR == 1 {
    for (i, k in $*) {
      $[i] = joinv( apply( splita(k, FLATSEP), func(e) {
        return typeof(e) == "int" ? e+1 : e
      }), FLATSEP );
    }
  }
' file.csv |
mlr --icsv --flatsep '/' --ojson cat
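Assuming the header rewrite behaves as above, this two-pass pipeline should print the same JSON as the first example:
[ { "email": ["abc@xyz.org", "bob@pass.com"] } ]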
Even if there are workarounds like using the rename verb (when you know the header in advance) or pre-processing the CSV header, I still hope that Miller's author could add an extra command-line option to deal with this kind of 0-indexed external data; adding DSL functions like arrayify0 (and flatten0) could also prove useful in some cases.

I would like to know if I'm missing something obvious, like a command line option or a way to rename the fields with the put verb, or maybe something else?
Starting from this
email/0,email/1
abc@xyz.org,bob@pass.com
you can use an implicit CSV header and run
mlr --csv -N put '
  if (NR == 1) {
    for (k in $*) {
      $[k] = "email/" . string(int(regextract($[k], "[0-9]+")) + 1)
    }
  }
' input.csv
to get
email/1,email/2
abc@xyz.org,bob@pass.com
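Piping that output through the original conversion (mlr --icsv --ojson --jflatsep '/' cat) should then produce the desired JSON:
[ { "email": ["abc@xyz.org", "bob@pass.com"] } ]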

Regsub not manipulating string from URL

I am getting a strange TCL error when using this iRule. The error is:

<HTTP_REQUEST> - ERR_ARG (line 2) invoked from within "HTTP::uri [regsub "/3dnotification" [HTTP::uri] ""] "

This is F5 iRule code.
This is what I have tried:
when HTTP_REQUEST {
    if { not ( [string tolower [HTTP::uri]] starts_with "/socket.io" ) } then {
        HTTP::uri [regsub "/3dnotification" [HTTP::uri] ""]
        # need to strip the trailing slash on the URI, otherwise resources result in a 404...
        HTTP::uri [regsub "\/$" [HTTP::uri] ""]
    } elseif { [string tolower [HTTP::header Upgrade]] contains "websocket" } {
        ONECONNECT::reuse disable
        set oc_reuse_disable 1
    }
    HTTP::header replace "X-Forwarded-ContextPath" "/"
}
when SERVER_CONNECTED {
    if { [info exists oc_reuse_disable] } {
        # Optional; unnecessary if the HTTP profile is disabled (goes into passthrough mode).
        ONECONNECT::detach disable
        unset oc_reuse_disable
    }
}
Since the URI is either a full URI or the protocol-less part (I can't quite tell which from what you say, so I'll assume either is possible), removing a leading or trailing part is going to be a little tricky. What you need to do is first split the URI into its component parts, apply the transformation to the path part, and then reassemble it. The key to the splitting and reassembly is the uri package in Tcllib.
package require uri
# Split the URI and pick out the path from the parts
set uriParts [uri::split [HTTP::uri]]
set path [dict get $uriParts path]
# Do the transforms
set path [regsub "/3dnotification" $path ""]
set path [string trimright $path "/"]; # A different way to remove trailing slashes
# Reassemble and write back
dict set uriParts path $path
HTTP::uri [uri::join {*}$uriParts]
I'm assuming that you'd put the package require (or whatever else you need in order to get the code present) at the top of the script, and the rest inside the right when clause(s).
So that you can see what URI splitting actually does, here's your example URI split (in an interactive tclsh session):
% set uri "http://www.example.com:8080/main/index.jsp?user=test&login=check"
http://www.example.com:8080/main/index.jsp?user=test&login=check
% uri::split $uri
fragment {} port 8080 path main/index.jsp scheme http host www.example.com query user=test&login=check pbare 0 pwd {} user {}
As you can see, the path part is just main/index.jsp which is enormously easier to work with than the whole URI.
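Reassembly is just the reverse; assuming Tcllib behaves as in the session above, a split/join round trip should give back an equivalent URI (illustrative, from memory of the uri package):
% uri::join {*}[uri::split $uri]
http://www.example.com:8080/main/index.jsp?user=test&login=check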

ReportService2010: create parent directories

I wrote a PowerShell script to bulk upload RDL files to SSRS 2014. I'm using the SOAP API exposed by ReportService2010.aspx:
$ssrsProxy = New-WebServiceProxy -Uri $uri -Credential $cred
$itemType = $ssrsProxy.GetItemType("/$reportFolder")
if ($itemType -like "unknown")
{
    $ssrsProxy.CreateFolder($reportFolder, "/", $null)
}
This works if $reportFolder is "foo", but not if it's "foo/bar". The error is:
Exception calling "CreateFolder" with "3" argument(s): "The name of the item 'foo/bar' is not valid. The name must be less than 260 characters long. The name must not start with a slash character or contain a reserved character. Other restrictions apply. For more information on valid item names, see http://go.microsoft.com/fwlink/?LinkId=301650.
The URL in the message is invalid and redirects to a "future resource" page. The actual documentation for CreateFolder says:
You can use the forward slash character (/) to separate items in the full path name of the folder, but you cannot use it at the end of the folder name.
Am I interpreting this incorrectly, or does it not actually work as documented?
My quick and dirty solution with no error handling. Use at your own risk.
# Walk the path one segment at a time, creating each missing folder
$elements = $path.Split("/")
$parent = "/"
foreach ($element in $elements)
{
    # Build the full name of the current level
    $name = ""
    if ($parent.EndsWith("/"))
    {
        $name = "$parent$element"
    }
    else
    {
        $name = "$parent/$element"
    }
    # Create the folder only if it doesn't exist yet
    $type = $ssrsProxy.GetItemType($name)
    if ($type -like "unknown")
    {
        $ssrsProxy.CreateFolder($element, $parent, $null)
    }
    $parent = $name
}
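For example, with $path = "foo/bar" (a hypothetical value, and $ssrsProxy connected as in the question), the loop creates the hierarchy one level at a time:

$path = "foo/bar"
# iteration 1: $name = "/foo"     -> CreateFolder("foo", "/", $null) if missing
# iteration 2: $name = "/foo/bar" -> CreateFolder("bar", "/foo", $null) if missing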

Removing specific html tag

I would like to remove a specific set of HTML tags. Here is what I have tried:
$str_rep="<table></td></tr></table></td></tr></table></td></tr>";
local $^I = ""; # Enable in-place editing.
push(#files,"$report_file");
local #ARGV = #files; # Set files to operate on.
while (<>) {
s/(.*)$str_rep(.*)$/$1$2/g;
print;
}
The HTML file has only two lines: one is the page header, and the second has the full content, including a couple of tables. I am trying to remove some unwanted closing table tags, which lets me merge the tables together. Unfortunately it removes everything after the replacement string. Where am I going wrong?
You should escape the slashes (/), and simply replace the matched string with an empty string:
$str_rep="<table><\/td><\/tr><\/table><\/td><\/tr><\/table><\/td><\/tr>";
local $^I = ""; # Enable in-place editing.
push(#files,"$report_file");
local #ARGV = #files; # Set files to operate on.
while (<>) {
s/$str_rep//g;
print;
}
Here you are:
my $report_file = 'input.html';
# note: you forgot the / in the first </table> :)
my $str_rep = "<\/table><\/td><\/tr><\/table><\/td><\/tr><\/table><\/td><\/tr>";
local $^I = "";            # Enable in-place editing.
push(@files, "$report_file");
local @ARGV = @files;      # Set files to operate on.
while (<>) {
    s/$str_rep//g;
    print;
}
I used diff to compare input.html against target.html.
Everything works fine!