hg annotate -unl aFile
Shows:
jim 1519:477: a = 4
bob 1518:468: b = 5
joe 1496:402: return a
How do I get it to show
jim 401: a = 4
bob 402: b = 5
joe 403: return a
Where 401, 402, and 403 are the current line numbers. Better yet would be something like what git offers with git blame -L 401,403 aFile.
To show current line numbers:
hg annotate -u aFile | cat -n
To also select only a certain range of line numbers:
hg annotate -u aFile | cat -n | sed -n 401,403p
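If you want something closer to git blame -L, you could wrap that pipeline in a small shell function (a sketch only; hgblame is not a built-in command, just a name chosen here):
hgblame() {
    # usage: hgblame FILE START END, e.g. hgblame aFile 401 403
    file=$1; start=$2; end=$3
    hg annotate -u "$file" | cat -n | sed -n "${start},${end}p"
}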
I always use the annotate view in hg serve or from TortoiseHg. I find the command line annotate pretty poor since I cannot quickly jump to a parent revision. The hgweb annotate can be seen here:
https://www.mercurial-scm.org/repo/hg/annotate/tip/README
It includes (current) line numbers.
Related
I am trying to download the openjdk8 source code from the Mercurial repository using
hg clone http://hg.openjdk.java.net/jdk8/jdk8 openJDK8
I am getting the error below:
abort: error: node name or service name not known
If we add the IP address and hostname to the /etc/hosts file, will it get resolved?
But I don't know how to find the IP address and hostname of hg.openjdk.java.net.
From another S10 (Solaris 10) system I was able to download the source. I checked /etc/hosts and /etc/resolv.conf; both are the same. When I copy the downloaded source to my system and try to build it there, I get a timestamp error in hotspot:
WARNING: You are using cc version 5.13 and should be using version 5.10.
Set ENFORCE_CC_COMPILER_REV=5.13 to avoid this warning.
/opt/csw/bin//gmake: invalid option -- /
/opt/csw/bin//gmake: invalid option -- c
/opt/csw/bin//gmake: invalid option -- c
/opt/csw/bin//gmake: invalid option -- 8
/opt/csw/bin//gmake: invalid option -- /
/opt/csw/bin//gmake: invalid option -- a
/opt/csw/bin//gmake: invalid option -- /
/opt/csw/bin//gmake: invalid option -- c
Usage: gmake [options] [target] ...
Options:
-b, -m Ignored for compatibility.
-B, --always-make Unconditionally make all targets.
-C DIRECTORY, --directory=DIRECTORY
Change to DIRECTORY before doing anything.
-d Print lots of debugging information.
--debug[=FLAGS] Print various types of debugging information.
-e, --environment-overrides
Environment variables override makefiles.
-E STRING, --eval=STRING Evaluate STRING as a makefile statement.
-f FILE, --file=FILE, --makefile=FILE
Read FILE as a makefile.
-h, --help Print this message and exit.
-i, --ignore-errors Ignore errors from recipes.
-I DIRECTORY, --include-dir=DIRECTORY
Search DIRECTORY for included makefiles.
-j [N], --jobs[=N] Allow N jobs at once; infinite jobs with no arg.
-k, --keep-going Keep going when some targets can't be made.
-l [N], --load-average[=N], --max-load[=N]
Don't start multiple jobs unless load is below N.
-L, --check-symlink-times Use the latest mtime between symlinks and target.
-n, --just-print, --dry-run, --recon
Don't actually run any recipe; just print them.
-o FILE, --old-file=FILE, --assume-old=FILE
Consider FILE to be very old and don't remake it.
-O[TYPE], --output-sync[=TYPE]
Synchronize output of parallel jobs by TYPE.
-p, --print-data-base Print make's internal database.
-q, --question Run no recipe; exit status says if up to date.
-r, --no-builtin-rules Disable the built-in implicit rules.
-R, --no-builtin-variables Disable the built-in variable settings.
-s, --silent, --quiet Don't echo recipes.
--no-silent Echo recipes (disable --silent mode).
-S, --no-keep-going, --stop
Turns off -k.
-t, --touch Touch targets instead of remaking them.
--trace Print tracing information.
-v, --version Print the version number of make and exit.
-w, --print-directory Print the current directory.
--no-print-directory Turn off -w, even if it was turned on implicitly.
-W FILE, --what-if=FILE, --new-file=FILE, --assume-new=FILE
Consider FILE to be infinitely new.
--warn-undefined-variables Warn when an undefined variable is referenced.
This program built for i386-pc-solaris2.10
Report bugs to <bug-make@gnu.org>
gmake[5]: *** [/export/home/preethi/buildopenjdk/check8/hotspot/make/solaris/makefiles/top.make:84: ad_stuff] Error 2
gmake[4]: *** [/export/home/preethi/buildopenjdk/check8/hotspot/make/solaris/Makefile:225: product] Error 2
gmake[3]: *** [Makefile:217: generic_build2] Error 2
gmake[2]: *** [Makefile:167: product] Error 2
gmake[1]: *** [HotspotWrapper.gmk:45: /export/home/preethi/buildopenjdk/check8/build/solaris-x86-normal-server-release/hotspot/_hotspot.timestamp] Error 2
gmake: *** [/export/home/preethi/buildopenjdk/check8//make/Main.gmk:109: hotspot-only] Error 2
Following steps from:
https://hg.openjdk.java.net/jdk8u/jdk8u/raw-file/tip/README-builds.html
System spec:
SunOS pkg.oracle.com 5.10 Generic_150401-16 i86pc i386 i86pc
1) If we add the IP address and hostname to /etc/hosts, will the problem be resolved?
2) Why does the source copied from the other S10 system not build on my system?
I added 137.254.56.60 openjdk.java.net to /etc/hosts, but I get the same error. From my system I am not able to ping openjdk.java.net; there is no answer from 137.254.56.60. I am new to Solaris and not very familiar with proxy settings. Can anyone please help?
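If the machine can only reach the internet through a proxy, one thing to check is Mercurial's proxy configuration. A minimal sketch of ~/.hgrc follows, where proxy.example.com:8080 is a placeholder for your site's proxy, not a real address:
[http_proxy]
host = proxy.example.com:8080
# optional, only if your proxy requires credentials
# user = myuser
# passwd = mypassword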
I have a cypher script file and I would like to run it directly.
All answers I could find on SO, to the best of my knowledge, use the command neo4j-shell, which in my version (Neo4j server 3.5.5) seems to be deprecated and replaced by the command cypher-shell.
Using the command sudo ./neo4j-community-3.5.5/bin/cypher-shell --help I got the following instructions.
usage: cypher-shell [-h] [-a ADDRESS] [-u USERNAME] [-p PASSWORD]
[--encryption {true,false}]
[--format {auto,verbose,plain}] [--debug] [--non-interactive] [--sample-rows SAMPLE-ROWS]
[--wrap {true,false}] [-v] [--driver-version] [--fail-fast | --fail-at-end] [cypher]
A command line shell where you can execute Cypher against an
instance of Neo4j. By default the shell is interactive but you can
use it for scripting by passing cypher directly on the command
line or by piping a file with cypher statements (requires Powershell
on Windows).
My file is the following; it tries to create a graph from CSV files and comes from the book "Graph Algorithms".
WITH "https://github.com/neo4j-graph-analytics/book/raw/master/data" AS base
WITH base + "transport-nodes.csv" AS uri
LOAD CSV WITH HEADERS FROM uri AS row
MERGE (place:Place {id:row.id})
SET place.latitude = toFloat(row.latitude),
place.longitude = toFloat(row.latitude),
place.population = toInteger(row.population)
WITH "https://github.com/neo4j-graph-analytics/book/raw/master/data/" AS base
WITH base + "transport-relationships.csv" AS uri
LOAD CSV WITH HEADERS FROM uri AS row
MATCH (origin:Place {id: row.src})
MATCH (destination:Place {id: row.dst})
MERGE (origin)-[:EROAD {distance: toInteger(row.cost)}]->(destination)
When I try to pass the file directly with the command:
sudo ./neo4j-community-3.5.5/bin/cypher-shell neo_4.cypher
it first asks for a username and password, but after typing the correct password (a wrong password results in the error The client is unauthorized due to authentication failure.) I get the error:
Invalid input 'n': expected <init> (line 1, column 1 (offset: 0))
"neo_4.cypher"
^
When I try piping with the command:
sudo cat neo_4.cypher | sudo ./neo4j-community-3.5.5/bin/cypher-shell -u usr -p 'pwd'
no output is generated and no graph either.
How to run a cypher script file with the neo4j command cypher-shell?
Use cypher-shell -f yourscriptname. Check --help for more details.
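For example (a sketch, assuming your cypher-shell version supports -f/--file and the server is running locally with default settings):
./neo4j-community-3.5.5/bin/cypher-shell -u neo4j -p 'pwd' -f neo_4.cypher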
I think the key is here:
cypher-shell --help
... Stuff deleted
positional arguments:
cypher an optional string of cypher to execute and then exit
This means that the parameter is actual cypher code, not a file name. Thus, this works:
GMc@linux-ihon:~> cypher-shell "match(n) return n;"
username: neo4j
password: ****
+-----------------------------+
| n |
+-----------------------------+
| (:Job {jobName: "Job01"}) |
| (:Job {jobName: "Job02"}) |
But this doesn't, because the text "neo_4.cypher" isn't a valid cypher query:
cypher-shell neo_4.cypher
The help also says:
example of piping a file:
cat some-cypher.txt | cypher-shell
So:
cat neo_4.cypher | cypher-shell
should work. Possibly your problem is all of the sudos, specifically the cat ... | sudo cypher-shell; it is possible that sudo is shielding cypher-shell from the piped input (although it doesn't seem to do so on my system).
If you really need to use sudo to run cypher, try using the following:
sudo cypher-shell arguments_as_needed < neo_4.cypher
Oh, also, your script doesn't have a return, so it probably won't display any data, but you should still see the summary reports of records loaded.
Perhaps try something simpler first such as a simple match ... return ... query in your script.
Oh, and don't forget to terminate the cypher query with a semi-colon!
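For instance, a minimal test script (call it test.cypher; the name is just an example) could contain a single query and be piped in the same way:
MATCH (n) RETURN count(n);
cat test.cypher | cypher-shell -u neo4j -p 'pwd'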
The problem is in the cypher file: each statement should end with a semicolon (;). I still need sudo to run the program.
Actually, the file taken from the book seems to contain other errors as well.
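For reference, a corrected sketch of the script, with both statements terminated by semicolons, a trailing slash on the first base URL, and row.longitude used for the longitude property (the latter two look like the book's typos), would be:
WITH "https://github.com/neo4j-graph-analytics/book/raw/master/data/" AS base
WITH base + "transport-nodes.csv" AS uri
LOAD CSV WITH HEADERS FROM uri AS row
MERGE (place:Place {id: row.id})
SET place.latitude = toFloat(row.latitude),
place.longitude = toFloat(row.longitude),
place.population = toInteger(row.population);
WITH "https://github.com/neo4j-graph-analytics/book/raw/master/data/" AS base
WITH base + "transport-relationships.csv" AS uri
LOAD CSV WITH HEADERS FROM uri AS row
MATCH (origin:Place {id: row.src})
MATCH (destination:Place {id: row.dst})
MERGE (origin)-[:EROAD {distance: toInteger(row.cost)}]->(destination);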
I'm using the hg flow approach (similar to git flow).
I need all commits between the latest commit on my current branch "release/2.0.0" and the earlier "develop" commit at the point where "release/1.0.0" was started.
In short: I want all commits that will go into the new release candidate package.
Please take a look at the screenshot.
I want all commits inside the red line; they are all the changes since the previous app release.
This is actually very easy with mercurial: revsets to the rescue!
You basically want to include everything which happened prior to your current dev release, but exclude the stuff which had already happened prior to the last release:
hg log -r"ancestors(DEVREV) and not ancestors(RELEASE)"
where DEVREV is the revision of your current one and RELEASE the revision of the last release.
E.g.:
ingo@aeolus:~/hg-test$ hg log -G -T"{rev}: {desc}\n"
@ 8: New release
|
| o 7: New dev stuff
| |
o | 6: Merge 4
|\ \
| | o 5: Add cc
| |/
| o 4: Add bb
| |
o | 3: Add d
| |
o | 2: Add c
| |
o | 1: Add b
|/
o 0: Add a
ingo@aeolus:~/hg-test$ hg log -r"ancestors(7) and not ancestors(8)"
changeset: 5:ce0558751c5a
user: planetmaker <planetmaker@openttd.org>
date: Wed Aug 22 16:14:12 2018 +0200
summary: Add cc
changeset: 7:78f338d1c8fa
parent: 5:ce0558751c5a
user: planetmaker <planetmaker@openttd.org>
date: Tue Oct 09 13:02:20 2018 +0200
summary: New dev stuff
You might want to try --style=changelog with the appropriate revset though.
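Applied to the branch names from the question, that could look something like this (a sketch, assuming "release/2.0.0" and "release/1.0.0" resolve to the relevant revisions in your repository):
hg log --style=changelog -r 'ancestors("release/2.0.0") and not ancestors("release/1.0.0")'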
I'm trying to get a bitbake file to pull down the latest revision of a mercurial repo when it builds. It's unfortunately only sparsely documented (https://www.yoctoproject.org/docs/1.6/bitbake-user-manual/bitbake-user-manual.html#auto-revisions), though there are a few mailing list posts on the topic. None of my attempts to recreate their methods have panned out.
This is what the .bb file looks like:
###############################################################################
# Variables for locations.
###############################################################################
SRC_URI = "hg://foo//bar/foobar/test;rev=${SRCREV};protocol=ssh;branch=default;module=root"
SRCREV = "${AUTOREV}"
# Sources are downloaded to an hg subdirectory when pulling a repo.
S = "${WORKDIR}/hg"
###############################################################################
# The version of the library we're going to install.
###############################################################################
# Set PV to SRCPV so bitbake knows it should always check SRC_URI for a new
# version of the application.
PV = "${SRCPV}"
PR = "r0"
PE = "1"
And here's the relevant part of the output error:
/usr/bin/env hg up -C -r AUTOINC
| DEBUG: Python function base_do_unpack finished
| DEBUG: Python function do_unpack finished
| ERROR: Function failed: Fetcher failure: Fetch command failed with exit code 255, output:
| abort: unknown revision 'AUTOINC'!
|
NOTE: recipe test-1_AUTOINC+AUTOINC-r0: task do_unpack: Failed
DEBUG: Teardown for bitbake-worker
NOTE: Tasks Summary: Attempted 361 tasks of which 359 didn't need to be rerun and 1 failed.
Summary: 1 task failed:
/home/intern/git/poky/meta/recipes-core/test/test_0.0.1.bb, do_unpack
Summary: There was 1 ERROR message shown, returning a non-zero exit code.
The problem seems to be that AUTOREV is resolving to AUTOINC instead of a hash number. Any thoughts on what I can change to fix this?
You can use "tip" as source revision like:
SRCREV = "tip"
SRCMODULE = "myapp"
SRC_URI = "hg://hg_server_url;rev=${SRCREV};protocol=http;branch=${SRCBRANCH};module=${SRCMODULE}"
Just keep in mind (I found that out the hard way) that hg_server_url shouldn't include the target repository's subpath; include that as SRCMODULE instead!
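For the recipe in the question, that would mean moving the repository path out of the server part of the URL, roughly like this (a sketch only; exactly where the path splits between the server URL and the module depends on your hg server layout):
SRCREV = "tip"
SRCBRANCH = "default"
SRCMODULE = "bar/foobar/test"
SRC_URI = "hg://foo;rev=${SRCREV};protocol=ssh;branch=${SRCBRANCH};module=${SRCMODULE}"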
I'm using a BeagleBone Black running Linux kernel 3.17.4 on Fedora 21 ARM. If I consider kernel pin 8 (gpio0[8], or P8.35)...
$ sudo grep 'pin 8 ' /sys/kernel/debug/pinctrl/44e10800.pinmux/pinmux-pins yields:
pin 8 (44e10820.0): (MUX UNCLAIMED) (GPIO UNCLAIMED)
$ sudo grep 'pin 8 ' /sys/kernel/debug/pinctrl/44e10800.pinmux/pins yields:
pin 8 (44e10820.0) 00000027 pinctrl-single
So as far as I can tell, pin 8 is receive enabled with a pull down resistor in mode 7.
Then:
$ echo 8 | sudo tee -a /sys/class/gpio/export creates /sys/class/gpio/gpio8.
$ echo out | sudo tee -a /sys/class/gpio/gpio8/direction sets it to out.
$ echo 1 | sudo tee -a /sys/class/gpio/gpio8/value should set the pin high.
My observation is that although the value file reads high, the voltage from the gpio pin is low.
If I change "8" to "60", I am able to control the pin, but the filesystem starts going funky, presumably because that pin was being used for something. Notably, pins shows: pin 60 (44e108f0.0) 00000030 pinctrl-single.
So my question is -- Why isn't pin 8 (gpio0[8], or P8.35) working?
I incorrectly thought that kernel pins were calculated as 32 * N + M for gpioN[M]. The kernel pin number is actually determined by the register offset from the 44e10800 pinmux base (pin N's mux register is at 44e10800 + 4*N; note that pin 60 above sits at 44e108f0).
Pin 8 above is at 44e10820, which corresponds to gpio0[22] and P8.19. If you export 22 and check P8.19, the desired result is produced.
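In other words, the sequence from the question works once the right kernel GPIO number is used (same commands as above, just with 22):
$ echo 22 | sudo tee -a /sys/class/gpio/export
$ echo out | sudo tee -a /sys/class/gpio/gpio22/direction
$ echo 1 | sudo tee -a /sys/class/gpio/gpio22/value
After this, P8.19 should read high.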
Interestingly, much of the blogger documentation on this fact is incorrect. I will not link to those sites to prevent them from proliferating. On the other hand, this post was entirely accurate and helped me understand what was going on:
http://www.valvers.com/embedded-linux/beaglebone-black/step04-gpio/