Find stops after the first file - json

Please help me understand why ncu causes a find operation to stop after the first file. I have 25 project folders, each with its own package.json and bower.json file (not all have bower.json).
Issuing this command with an echo works perfectly:
find ../ -name "package.json" -type f -exec echo '{}' +
... all files are printed to the screen.
However, this syntax stops after the first file when I use ncu:
find ../ -name "package.json" -type f -exec ncu -u -a --packageFile '{}' +
Here's the only output of the command:
$ find ../ -name "package.json" -type f -exec ncu -u -a --silent --packageFile '{}' +
Using /home/joeblow/projects/app01/package.json
[..................] - :
All dependencies match the latest package versions :)
The versions I'm using are:
bash version: 4.3.42(1)-release
find version: 4.7.0-git
node version: 6.9.4
npm version: 4.1.2
npm-check-updates version: 2.8.9

ncu (aka npm-check-updates) silently ignores all but the first file it is given. Use -exec ncu -u -a --packageFile '{}' \; instead, so that ncu is run once per file rather than once with all files.
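The difference between the two terminators is easy to see with a throwaway directory (the paths below are made up for the demo): + passes all matches to a single invocation, while \; runs the command once per match. Tools that only honor their first file argument therefore need the \; form.

```shell
# Set up two fake projects (demo-only paths)
mkdir -p /tmp/exec-demo/app01 /tmp/exec-demo/app02
touch /tmp/exec-demo/app01/package.json /tmp/exec-demo/app02/package.json

# "+" : one echo invocation receives both paths (one output line)
find /tmp/exec-demo -name package.json -exec echo batch: {} +

# "\;" : echo runs once per file (two output lines)
find /tmp/exec-demo -name package.json -exec echo single: {} \;
```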

Related

Troubleshooting Bash script to delete files

On my Synology NAS I've written a bash script that is supposed to delete the junk files that windows creates:
#/bin/ash
find /volume2 -type f -name "Thumbs.db" -exec rm -f {} \;
find /volume2 -type f -name "desktop.ini" -exec rm -f {} \;
#find /volume2 -type d -empty -delete \;
To the best of my knowledge this was working fine until I added the now commented-out line to remove the empty folders after the files are deleted. Now, despite not intentionally changing anything else, the script fails with:
find: missing argument to `-exec'
I'm sure I'm missing something incredibly obvious, but please help. I also don't know why the last line didn't work. I've read lots of threads on Stack Exchange with no joy.
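For reference, -exec must be terminated by \; or +, while -delete is a self-contained action that takes no terminator at all, so appending \; to it is a syntax error. A sketch against a throwaway directory (paths are demo-only):

```shell
mkdir -p /tmp/junk-demo/empty
touch /tmp/junk-demo/Thumbs.db

# -exec requires a terminator (here "\;"):
find /tmp/junk-demo -type f -name "Thumbs.db" -exec rm -f {} \;

# -delete takes no terminator; it also implies -depth,
# so empty directories are removed bottom-up:
find /tmp/junk-demo -type d -empty -delete
```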

Can I use wildcards in oc exec commands?

I am trying to run remote commands on OpenShift pods to delete some files in a certain directory, and the command below works:
oc exec mypod -i -t -- rm -f /tmp/mydir/1.txt
However, I am unable to use wildcards, e.g. *.txt, to remove all .txt files. The command with wildcards gives no errors but doesn't delete any files.
Any suggestions would be appreciated.
The following command worked:
oc exec mypod -i -t -- find /tmp/mydir -type f -name '*.txt' -delete
This works because find matches the *.txt pattern itself (the single quotes keep the local shell from expanding it). oc exec runs the command directly in the container, without a shell to perform glob expansion, which is why the wildcard version deleted nothing. Hopefully it will be useful to someone else.
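Another option (a sketch, assuming the same pod name and path as the question) is to start a shell inside the container so that it, rather than your local shell, expands the glob. The local demonstration below shows the principle:

```shell
# Local demonstration of the principle (demo-only path):
mkdir -p /tmp/glob-demo
touch /tmp/glob-demo/1.txt /tmp/glob-demo/2.txt

# The single quotes keep the outer shell from expanding the glob;
# the inner sh expands it instead.
sh -c 'rm -f /tmp/glob-demo/*.txt'

# Applied to the pod from the question (untested sketch):
# oc exec mypod -i -t -- sh -c 'rm -f /tmp/mydir/*.txt'
```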

Find all files with the .md extension and execute a command on each, generating a new file with a name derived from the .md file name

I'm trying to write a shell script to recursively find all files under a directory with the extension .md and execute a command on each .md file, generating a new file with the same name but a different extension.
Below is the command I have, but it appends .html to the file name instead of replacing .md with .html:
find . -name '*.md' -exec markdown-html {} -s resources/styles/common-custom.css -o {}.html \;
The above command generates a new file "home.md.html" from "home.md", but I want the .md removed. I've tried different solutions, but nothing worked.
You have to write a small script here; the comments in the code below describe how it works.
First create a shell script file, e.g. convertTohtml.sh, and add the following to it:
#!/bin/bash
# List all the .md files in a temporary file
find . -name '*.md' > filelist.dat
# Read each line from the file; each line is the location of a .md file
while read -r file
do
  # Strip the trailing .md, so 'home.md' is stored as 'home' in 'html_file';
  # hence "$html_file.html" equals 'home.html'
  html_file=$(echo "$file" | sed -e 's/\.md$//')
  markdown-html "$file" -s resources/styles/common-custom.css -o "$html_file.html"
done < filelist.dat
# Finally, delete the temporary file
rm filelist.dat
Give your script file execute permission:
chmod +x convertTohtml.sh
Then run it:
./convertTohtml.sh
The script below also solves the extension problem:
#!/bin/bash
# List all the .md files in a temporary file
find . -name '*.md' > filelist.dat
# Read each line from the file; each line is the location of a .md file
while read -r file
do
  # Strip the .md extension to build the output name,
  # keeping "$file" intact as the input file
  html_file=${file%.md}
  markdown-html "$file" -s resources/styles/common-custom.css -o "$html_file.html"
done < filelist.dat
# Finally, delete the temporary file
rm filelist.dat
If you want to use the output of find multiple times you could try something like this:
find . -name "*.md" -exec sh -c "echo {} && touch {}.foo" \;
Notice the:
sh -c "echo {} && touch {}.foo"
sh -c runs the commands it reads from the string, and {} is replaced with each path find produces: in this example it first does an echo {}, and if that succeeds (&&) it then touches {}.foo. In your case, passing the path as a positional parameter lets the inner shell strip the extension, so the output is home.html rather than home.md.html:
find . -name "*.md" -exec sh -c 'markdown-html "$1" -s resources/styles/common-custom.css -o "${1%.md}.html"' sh {} \;
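Since markdown-html may not be installed locally, the same find + sh -c pattern can be exercised with touch as a stand-in; passing the path as a positional parameter and stripping the suffix with ${1%.md} avoids the double-extension problem (demo-only directory):

```shell
mkdir -p /tmp/md-demo
touch /tmp/md-demo/home.md

# "$1" inside the inner shell is each path found by find;
# ${1%.md} strips the trailing .md before appending .html.
find /tmp/md-demo -name '*.md' -exec sh -c 'touch "${1%.md}.html"' sh {} \;

ls /tmp/md-demo   # home.html  home.md
```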

How to find the last MySQL dump file in a directory and import it

I just found Vagrant, and I am trying to make a script that fully sets up my development environment. I am using Ubuntu 14.04.01 as a server. With the script I now install LAMP, set a password for MySQL, and change the LAMP server's public directory. I also have a directory with many .sql dump files, and I want to find the newest one and import it into my database. I found a command that finds the last created file in a directory:
find /vagrant/VagrantFiles/DB/ -type f -exec stat -c "%n" {} + | sort -r | head -n1
But when I add the mysql command to import it, mysql --user=root --password=pass < {}, and execute the line like this:
find /vagrant/VagrantFiles/DB/ -type f -exec stat -c "%n" {} + | sort -r | head -n1 | mysql --user=root --password=pass < {}
I get this error in the terminal: -bash: {}: No such file or directory
How can I make this work?
After two days of searching, I found the solution.
I split the command into two parts: first find the newest file and store it in a variable, then run the mysql command with that variable. My inspiration for this solution was this post. Here is the solution:
last_dump=$(find /vagrant/VagrantFiles/DB/ -type f -exec stat -c "%n" {} + | sort -r | head -n1)
mysql --user=root --password=pass < "$last_dump"
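Note that sort -r on bare file names sorts reverse-alphabetically, which only picks the newest dump if the names happen to sort that way (e.g. timestamped names). A variant that sorts by modification time instead, assuming GNU find's -printf (the dump directory below is a demo stand-in for the one in the question):

```shell
# Demo directory standing in for /vagrant/VagrantFiles/DB/
mkdir -p /tmp/dump-demo
touch -d 2014-01-01 /tmp/dump-demo/old.sql
touch -d 2014-02-01 /tmp/dump-demo/new.sql

# %T@ prints the mtime as a Unix timestamp; sort numerically,
# newest first, then strip the timestamp, keeping only the path.
last_dump=$(find /tmp/dump-demo -type f -printf '%T@ %p\n' \
  | sort -nr | head -n1 | cut -d' ' -f2-)
echo "$last_dump"

# Then, as in the answer:
# mysql --user=root --password=pass < "$last_dump"
```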

Argument list too long - Apache

I'm following this tutorial on deploying a WordPress application inside an AWS instance, http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/hosting-wordpress.html, and I get an error when I do:
[ec2-user@ip-10-10-1-73 ]$ find /var/www -type f -exec sudo chmod 0664 {} +
sudo: unable to execute /bin/chmod: Argument list too long
sudo: unable to execute /bin/chmod: Argument list too long
What is the root problem of this error?
You are passing too many arguments to chmod: the expanded argument list exceeds the kernel's limit on the size of an exec argument list (ARG_MAX, which on Linux is tied to the stack size limit you can inspect with ulimit -s). Personally I would just modify the command:
find /var/www -type f -exec sudo chmod 0664 {} \;
The difference is that with + find passes as many files as possible to a single chmod, while with \; it runs chmod once per file.
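An alternative that keeps the batching is to pipe the file list to xargs, which splits the arguments into invocations that each stay under the kernel limit. A sketch against a demo directory (no sudo needed there):

```shell
# Demo directory standing in for /var/www
mkdir -p /tmp/chmod-demo
touch /tmp/chmod-demo/a.txt /tmp/chmod-demo/b.txt

# -print0 / -0 handle file names with spaces or newlines safely;
# xargs batches the arguments so each chmod call stays under ARG_MAX.
find /tmp/chmod-demo -type f -print0 | xargs -0 chmod 0664

# In the tutorial's case (sketch, with sudo as in the question):
# find /var/www -type f -print0 | xargs -0 sudo chmod 0664
```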