I'm not well versed in web techniques and would like to know if there's a way (an idea would be to use setWebhook) to make a Telegram bot do simple stuff, like repeating the same message whenever someone sends it a message, without setting up a server.
I think there might be no way around it because I need to parse the JSON object to get the chat_id to be able to send messages... but I'm hoping someone here might know a way.
e.g.
https://api.telegram.org/bot<token>/setWebHook?url=https://api.telegram.org/bot<token>/sendMessage?text=Hello%26chat_id=<somehow get the chat_id>
I've tested it with a hard-coded chat id and it works... but of course it'll always only send messages to that same chat, regardless of where it received the message.
Here is a very simple Python bot example; you can run this on your PC, no server needed.
import requests
import json
from time import sleep
# This will mark the last update we've checked
last_update = 0
# Here, insert the token BotFather gave you for your bot.
token = 'YOUR_TOKEN_HERE'
# This is the url for communicating with your bot
url = 'https://api.telegram.org/bot%s/' % token
# We want to keep checking for updates. So this must be a never ending loop
while True:
    # My chat is up and running, I need to maintain it! Get me all chat updates
    get_updates = json.loads(requests.get(url + 'getUpdates').content)
    # Ok, I've got 'em. Let's iterate through each one
    for update in get_updates['result']:
        # First make sure I haven't read this update yet
        if last_update < update['update_id']:
            last_update = update['update_id']
            # I've got a new update. Let's see what it is.
            if 'message' in update:
                # It's a message! Let's send it back :D
                requests.get(url + 'sendMessage', params=dict(chat_id=update['message']['chat']['id'], text=update['message']['text']))
    # Let's wait a few seconds for new updates
    sleep(3)
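A small refinement you might try (not part of the original snippet): getUpdates accepts an offset parameter, so Telegram itself can skip updates you've already processed instead of you filtering them client-side. A minimal sketch of the same loop using it:

import requests
from time import sleep

token = 'YOUR_TOKEN_HERE'
url = 'https://api.telegram.org/bot%s/' % token
offset = 0  # id of the next update we want Telegram to send

while True:
    # Passing offset tells Telegram to discard all updates with a smaller id
    updates = requests.get(url + 'getUpdates', params={'offset': offset}).json()
    for update in updates['result']:
        offset = update['update_id'] + 1
        if 'message' in update and 'text' in update['message']:
            requests.get(url + 'sendMessage', params={
                'chat_id': update['message']['chat']['id'],
                'text': update['message']['text'],
            })
    # Wait a few seconds before polling again
    sleep(3)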
That's really interesting, but you'll definitely need a server to parse the JSON value and get the chat_id out of it.
I have failed miserably several times at using HttpService to draw information off of my server for use in the Roblox game I am working on. After a number of failed efforts, I discovered that the main problem is that I am failing to parse the JSON table to get the value I need, and that is, I think, why I can't get the Currency Handler script to accept the value.
I have an Ubuntu server and use PuTTY to access it.
I have done enough trial and error to believe that the handler script is working as intended except for not receiving the information from the next script correctly.
game.Players.PlayerAdded:Connect(function(player)
    local value = Instance.new("NumberValue", player)
    value.Name = "Currency"
    value.Value = 0
    local HttpService = game:GetService("HttpService")
    local userID = '0' --player.UserId
    local wallet = "http://x8/"
    local mining = wallet..userID
    local response = HttpService:GetAsync(mining, HttpService:JSONDecode(), Enum.HttpContentType.ApplicationJson)
    local a,b,c,d = table.remove(response)
    local function f()
        value.Value = 1
    end
end)
I also have a localscript running that looks like this.
local player = game.Players.LocalPlayer
player:WaitForChild("Currency").Changed:Connect(function(value)
    script.Parent.Currency.Text = "D "..value
end)
script.Parent.Currency.Text = "D "..player:WaitForChild("Currency").Value
This is after reading the various manuals from Piglet.
I am not a professional programmer or software engineer. I don't really know much about programming except what I have been able to glean from YouTube and a few online tutorials. I've come as far as I can with those resources. The irony is that I think this should be my last major hurdle to getting a working Roblox game. I've used the wiki and the Roblox developer forums, but I don't fully understand them.
I was able to generate a table using the code above, but table.remove was still not working. I attempted to correct that problem, and this is the code I came up with:
local mining = wallet..userID
local response = HttpService:GetAsync(mining)
-- JSONDecode only needs the response string; no content-type argument is required
local data = HttpService:JSONDecode(response)
-- data["result"]["amount"] is the number we want; no extra Instance is needed here
local a = data["result"]["amount"]
print(data["result"]["amount"])
print(a)
value.Value = a
print(value.Value)
return value
end)
Not sure if this was the right way to do things, but it worked.
I am trying to make a simple Python script that posts a text message to a Facebook page using requests.
I actually managed to accomplish this; however, when I add the same logic to a bigger project of mine, a certain request returns a different JSON response.
According to this page https://developers.facebook.com/docs/pages/access-tokens I can exchange the short-lived user token I generate in the Graph Explorer tool for a long-lived one that lasts 60 days. This worked for me until now. When I run the same functions with the same variables in another .py file that includes other logic as well, the request does not return this line:
"expires_in": SECONDS-UNTIL-TOKEN-EXPIRES
And of course, later on, if I continue the logic and use the token it returns (which is the same) for, let's say, a make_post function, the request prints:
{'error': {'message': '(#200) If posting to a group, requires app being installed in the group, and \\\n either publish_to_groups permission with user token, or both manage_pages \\\n and publish_pages permission with page token; If posting to a page, \\\n requires both manage_pages and publish_pages as an admin with \\\n sufficient administrative permission', 'type': 'OAuthException', 'code': 200, 'fbtrace_id': 'AqYMMeOcOniWAGgEEtsEURs'}}
Why does it not return successfully? The user token has not expired and it has the required rights. Furthermore, I tested this in a smaller .py file and it worked.
Another thing I found out here https://developers.facebook.com/support/bugs/523165725596520/?join_id=f1ff8392b49675c is that other people have actually reported the same issue, but it has been closed as 'intended by design'; however, there is no information about a solution.
Running the request in my browser also does not work correctly.
Do you have any ideas? I am completely clueless.
Thank you very much in advance
As @CBroe said in a comment, the expires_in didn't have anything to do with my error. The token it returns is valid. The issue I had later on had to do with the URL I was parsing.
I'm wondering if there is a Twitch app/website out there that will give me a list of all the VOD IDs for past broadcasts of a specified Twitch channel. I use ReChat to download chat logs so I can search for moments I want to revisit from past streams when I don't remember which stream they occurred on, but I'm finding it tedious to copy and paste each VOD ID one by one.
I'm not a dev myself, but I know there is something in the JSON API that makes this possible. I just don't know how to use it, so I'm wondering if someone else has set this up anywhere on the Internet. Thanks for everyone's help!
So this took me way too long to figure out. I still don't know how to do proper URL-redirect authentication for users of your application, but if you just want a local or server-to-server Python script, then here is how to do it with the "new" Twitch API. Hope it helps someone out there.
import requests
import json
## It's the name you see when you browse to the streamer's Twitch URL
USER_ID = "<USER_ID_NAME_YOU_WANT_THE_VIDEOS_FROM>"
## First setup your application on your dashboard.
## here: https://dev.twitch.tv/console
## then click "Register Your Application" on the right hand side.
## For the oauth redirect just write: http://localhost
## Make note of your Client ID
## Make note of your Client Secret
CLIENT_ID = "<YOUR_CLIENT_ID>"
SECRET = "<YOUR_CLIENT_SECRET_CODE>"
## First get an app access token (client credentials flow).
secretKeyURL = "https://id.twitch.tv/oauth2/token?client_id={}&client_secret={}&grant_type=client_credentials".format(CLIENT_ID, SECRET)
responseA = requests.post(secretKeyURL)
accessTokenData = responseA.json()
## Then figure out the user id.
userIDURL = "https://api.twitch.tv/helix/users?login=%s" % USER_ID
responseB = requests.get(userIDURL,
                         headers={"Client-ID": CLIENT_ID,
                                  "Authorization": "Bearer " + accessTokenData["access_token"]})
userID = responseB.json()["data"][0]["id"]
## Now you can request the video clip data.
findVideoURL = "https://api.twitch.tv/helix/videos?user_id=%s" % userID
responseC = requests.get(findVideoURL,
                         headers={"Client-ID": CLIENT_ID,
                                  "Authorization": "Bearer " + accessTokenData["access_token"]})
print(json.dumps(responseC.json(), indent=4))
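One thing the snippet above doesn't handle (an addition, not part of the original answer): Helix returns at most 100 videos per request, so channels with more VODs need the pagination cursor. A rough sketch of paging through everything, reusing CLIENT_ID, userID and accessTokenData from above:

## Sketch: collect every VOD id by following the pagination cursor.
videoIDs = []
cursor = None
while True:
    params = {"user_id": userID, "first": 100}
    if cursor:
        params["after"] = cursor
    page = requests.get("https://api.twitch.tv/helix/videos",
                        params=params,
                        headers={"Client-ID": CLIENT_ID,
                                 "Authorization": "Bearer " + accessTokenData["access_token"]}).json()
    videoIDs.extend(video["id"] for video in page["data"])
    ## Stop once Twitch no longer returns a cursor.
    cursor = page.get("pagination", {}).get("cursor")
    if not cursor:
        break
print(videoIDs)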
I know you can get 100 VODs at a time from GQL.
You could make a POST request to: https://gql.twitch.tv/gql
with this payload:
PostData = [{"operationName":"FilterableVideoTower_Videos","variables":{"limit":100,"channelOwnerLogin":"usernametogetvideos","broadcastType":null,"videoSort":"TIME","cursor":"MTQ1"},"extensions":{"persistedQuery":{"version":1,"sha256Hash":"2023a089fca2860c46dcdeb37b2ab2b60899b52cca1bfa4e720b260216ec2dc6"}}}]
You also need a Client-Id header. You can obtain one by going to Twitch in a browser and simply copying your own from the network tab of the developer tools.
It will respond with the entire VOD information for 100 videos.
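If it helps, here is a minimal Python sketch of that request (an illustration, not from the original answer; the Client-Id value is a placeholder for the one you copy out of your browser):

import requests

post_data = [{"operationName": "FilterableVideoTower_Videos",
              "variables": {"limit": 100,
                            "channelOwnerLogin": "usernametogetvideos",
                            "broadcastType": None,
                            "videoSort": "TIME",
                            "cursor": "MTQ1"},
              "extensions": {"persistedQuery": {"version": 1,
                                                "sha256Hash": "2023a089fca2860c46dcdeb37b2ab2b60899b52cca1bfa4e720b260216ec2dc6"}}}]

r = requests.post("https://gql.twitch.tv/gql",
                  json=post_data,
                  headers={"Client-Id": "<CLIENT_ID_COPIED_FROM_BROWSER>"})
print(r.json())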
This Python script will output the past-broadcast VOD IDs of a specific user (using the new Twitch Helix API).
import requests
import json
r = requests.get("https://api.twitch.tv/helix/videos?user_id=USERID&type=archive", headers={"Client-ID":"CLIENTID"})
j = json.loads(r.text)
for vod in j['data']:
    print(vod['id'])
You need to replace USERID with an actual user ID. To obtain the user ID of a streamer, an API call for a specific VOD will help: https://api.twitch.tv/helix/videos?id=VODID. The response will include a user_id.
CLIENTID also needs to be replaced. You can obtain it by registering your application at Twitch Developers.
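A quick sketch of that VOD lookup with requests (VODID and CLIENTID are placeholders; note that newer Helix deployments also require an OAuth Bearer token in addition to the Client-ID):

import requests

# Fetch one VOD's metadata and read the owning channel's user_id from it
r = requests.get("https://api.twitch.tv/helix/videos?id=VODID",
                 headers={"Client-ID": "CLIENTID"})
print(r.json()["data"][0]["user_id"])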
I know how to insert a new group via MySQL, and it works, to a degree. The problem is that the database changes are not loaded into memory if you insert the group manually. Sending a HUP signal to the process does work, but it is kludgy and a hack. I desire elegance :)
What I am looking to do, if possible, is to make changes (additions/deletions/modifications) to a group via MySQL, and then send an HTTP request to the Openfire server to read the new changes. Or, alternatively, to add/delete/modify groups similar to how the User Service works.
If anyone can help I would appreciate it.
It seems to me that if sending a HUP signal works for you, then that's actually quite a simple, elegant and efficient way to get Openfire to read your new group, particularly if you do it with the following command on the Openfire server (and assuming it's running a Linux/Unix OS):
pkill -f -HUP openfire
If you still want to send an HTTP request to prompt Openfire to re-read the groups, the following Python script should do the job. It is targeted at Openfire 3.8.2, and depends on Python's mechanize library, which in Ubuntu is installed with the python-mechanize package. The script logs into the Openfire server, pulls up the Cache Summary page, selects the Group and Group Metadata Cache options, enables the submit button and then submits the form to clear those two caches.
#!/usr/bin/python
import mechanize
import cookielib
# Customize to suit your setup
of_host = 'http://openfire.server:9090'
of_user = 'admin_username'
of_pass = 'admin_password'
# Initialize browser and cookie jar
br = mechanize.Browser()
br.set_cookiejar(cookielib.LWPCookieJar())
# Log into Openfire server
br.open(of_host + '/login.jsp')
br.select_form('loginForm')
br.form['username'] = of_user
br.form['password'] = of_pass
br.submit()
# Select which cache items to clear in the Cache Summary page
# On my server, 13 is Group and 14 is Group Metadata Cache
br.open(of_host + '/system-cache.jsp')
br.select_form('cacheForm')
br.form['cacheID'] = ['13','14']
# Activate the submit button and submit the form
c = br.form.find_control('clear')
c.readonly = False
c.disabled = False
r = br.submit()
# Uncomment the following line if you want to view results
#print r.read()
I (like most tech admins, I guess) have quite a lot of status emails from scheduled services in my inbox. However, when a service fails, there's obviously no email sent. So I simply want a service looking at my inbox saying, "Hey, this service did not send an email report yesterday - something's wrong!"
This should be a solved problem somewhere, I guess. Perhaps Gmail (or some other email provider) has a service of this kind; that would be great.
Wouldn't it be a better option to have a centralized monitoring solution like Nagios that you configure in such a way that it only sends out notifications when a service misses its heartbeat, reaches high watermarks, or runs out of fuel? And then, of course, a second monitoring solution that monitors the main monitoring solution...
http://www.nagios.org/documentation
I'm not aware of any service you describe but a manual routine might go like this:
Have a folder/tag structure like this:
Services\Hourly-[NumberOfServices] (or add a folder per service)
Services\Daily-[NumberOfServices]
Services\Weekly-[NumberOfServices]
Services\Monthly-[NumberOfServices]
Have rules for incoming mail to filter each specific service notification and move it to the right folder based on its expected timing.
Wake up every hour and check if there are unread messages in your Hourly folder. The number of unread messages should be the same as the NumberOfServices mentioned in the folder name. Read/process them and make sure to mark them all as Read. Any service that didn't e-mail gets spotted easily.
Wake up at 0:00 and check if there are unread messages in your Daily folder, etc.
Wake up at 0:00 on Saturday and check if there are unread messages in your Weekly folder, etc.
Wake up at 0:00 on the first of the month and check if there are unread messages in your Monthly folder, etc.
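If you'd rather script those checks than do them by hand, a very rough Python sketch of the hourly check over IMAP could look like this (the host, credentials, folder name and expected count are all assumptions to adjust):

import imaplib

EXPECTED = 12                 # the NumberOfServices for this folder (assumed)
FOLDER = '"Services/Hourly"'  # Gmail exposes labels as IMAP folders

imap = imaplib.IMAP4_SSL("imap.gmail.com", 993)
imap.login("you@example.com", "app-password")
imap.select(FOLDER, readonly=True)

# Count the unread messages in the folder
typ, data = imap.search(None, "UNSEEN")
unread = len(data[0].split())

if unread < EXPECTED:
    print("*** Only %d of %d services reported back!" % (unread, EXPECTED))
imap.logout()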
My advice would be to cut down the noise generated by the services.
If you still feel you need a service, I can only provide a very, very basic .NET implementation, roughly based on the above process, that works with Gmail...
This should also be portable to PowerShell...
static void Main(string[] args)
{
    var resolver = new XmlUrlResolver
    {
        Credentials = new NetworkCredential("yourgoogleaccount", "yourpassword")
    };
    var settings = new XmlReaderSettings();
    settings.XmlResolver = resolver;
    var xr = XmlReader
        .Create("https://mail.google.com/mail/feed/atom/[name of your filter]"
            , settings);
    var navigator = new XPathDocument(xr).CreateNavigator();
    var ns = new XmlNamespaceManager(new NameTable());
    ns.AddNamespace("fd", "http://purl.org/atom/ns#");
    var fullcountNode = navigator.SelectSingleNode(
        "/fd:feed/fd:fullcount"
        , ns);
    Console.WriteLine(fullcountNode.Value);
    int fullcount = Int32.Parse(fullcountNode.Value);
    int expectCount = 10;
    if (expectCount > fullcount)
    {
        Console.WriteLine("*** NOT EVERY ONE REPORTED BACK");
    }
}
You mentioned Gmail, so you may be interested in googlecl, which gives you command-line controls for things like Google Calendar and Docs. Unfortunately they do not yet support Gmail, but if your long-term preference is to use a Gmail account as the hub of your status reports, then googlecl may be your best option.
In the short run, you can try out googlecl right now using the commands for Calendar, Blogger, or Docs, all of which are already supported. For example, these commands add events to Google Calendar:
google calendar add --cal server1 "I'm still alive at 13:45 today"
google calendar add "Server 1 is still alive at 2011-02-08 19:43"
...and these commands query the calendar:
google calendar list --fields title,when,where --cal "commitments"
google calendar list -q party --cal ".*"
Come to think of it, you may even find that Calendar, Blogger, or Docs are a more appropriate place than Gmail for tracking status updates. For example, a spreadsheet or calendar format should make it easier to generate a graphical representation of when a given service was up or down.
You still need to write a little program which uses googlecl to query the calendar (or blog, or docs, or whatever), but once you have simple command lines at your disposal, the rest should be pretty straightforward. Here's a link to further information about googlecl:
http://code.google.com/p/googlecl/
If you really want to use Gmail, and use it right now, they offer an IMAP interface. Using IMAP, you can perform numerous simple operations, such as determining if a message exists which contains a specified subject line. Here's one good place to learn about the details:
http://mail.google.com/support/bin/answer.py?hl=en&answer=75725
Here's a quick example that uses IMAP and Python to list the ten most-recent emails which have a given Gmail "Label":
import getpass, imaplib
# These gmail_* utilities are from https://github.com/drewbuschhorn/gmail_imap
import gmail_mailboxes, gmail_messages, gmail_message

# Update these next lines manually, or turn them into parms or somesuch.
gmail_account_name = "your_user_name@gmail.com" # Your full gmail address.
mailbox_name = "StatusReports" # Use Gmail "labels" to tag the relevant msgs.

class gmail_imap:
    def __init__ (self, username, password):
        self.imap_server = imaplib.IMAP4_SSL("imap.gmail.com", 993)
        self.username = username
        self.password = password
        self.loggedIn = False
        self.mailboxes = gmail_mailboxes.gmail_mailboxes(self)
        self.messages = gmail_messages.gmail_messages(self)

    def login (self):
        self.imap_server.login(self.username, self.password)
        self.loggedIn = True

    def logout (self):
        self.imap_server.close()
        self.imap_server.logout()
        self.loggedIn = False

# Right now this prints a summary of the most-recent ten (or so) messages
# which have been labelled in Gmail with the string found in mailbox_name.
# It won't work unless you've used Gmail settings to allow IMAP access.
if __name__ == '__main__':
    gmail = gmail_imap(gmail_account_name, getpass.getpass())
    gmail.messages.process(mailbox_name)
    for next in gmail.messages:
        message = gmail.messages.getMessage(next.uid)
        # This is a good point in the code to insert some kind of search
        # of gmail.messages. Instead of unconditionally printing every
        # entry (which is what the code below does), issue some sort of
        # warning if the expected email (message.From and message.Subject)
        # did not arrive within the expected time frame (message.date).
        print message.date, message.From, message.Subject
    gmail.logout()
As noted in the code comments, you could adapt it to issue some sort of warning if the most-recent messages in that mailbox do not contain an expected message. Then just run the Python program once per day (or whatever time period you require) to see if the expected email message was never received.