Twitter API: updating statuses and databases - MySQL

I'm trying to learn how to work with the Twitter API, and I'm still a little confused. If I want users to tweet using the input text boxes on my site, do I still need a database for those tweets? Or does the API handle the storage of the tweets?
Thank you!

It all depends on your requirements and then on your design. In fact, you don't need any database to store tweets. In your case, where you just want to post tweets, you can do it without a DB. The Twitter REST API (https://dev.twitter.com/docs/api) is very good for this.
If you need to make frequent API calls from your app, you can use caching to avoid making too many calls. For reference: https://github.com/atsiddiqui/ReTweeted
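For a rough idea of what posting looks like in practice, here is a minimal sketch of sending a user's text from your own backend, assuming Node.js and the twit npm client (neither is required by the answer; the keys and tokens are placeholders):

```javascript
// Rough sketch only: posting a user's text-box input as a tweet from a
// Node.js backend, assuming the "twit" npm client. All keys/tokens below
// are placeholders you would obtain from Twitter's developer site.
const Twit = require('twit');

const client = new Twit({
  consumer_key:        'YOUR_CONSUMER_KEY',
  consumer_secret:     'YOUR_CONSUMER_SECRET',
  access_token:        'USER_ACCESS_TOKEN',        // per-user token from the OAuth flow
  access_token_secret: 'USER_ACCESS_TOKEN_SECRET'
});

// Send the text straight to Twitter; nothing has to be stored locally.
client.post('statuses/update', { status: 'Hello from my site!' }, (err, data) => {
  if (err) {
    console.error('Tweet failed:', err.message);
  } else {
    console.log('Tweet posted with id', data.id_str);
  }
});
```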

You do not need a database to store the tweets. Twitter stores them.

Once you use the Twitter API to send the tweet to Twitter.com, it is stored on Twitter's own storage system, so in a way it is handled by the API. It does not matter to Twitter whether you also store the tweets in your own database or not.
But it is good practice to store the information in your own database for record-keeping purposes. For example, you might want statistics: how many users use your service, or what its average usage is.
It may also help when investigating issues (sometimes legal ones), such as when a user complains that your site published an offensive tweet to their account. You can then go back and check your database.
There are many more cases where you will want to have those records in your own system.
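If you do decide to keep records, a minimal sketch of logging each tweet you send might look like this (assuming Node.js with mysql2; the tweet_log table and its columns are hypothetical):

```javascript
// Sketch of the "keep a record on your side" idea: after Twitter accepts the
// tweet, store a row in your own MySQL table for statistics and auditing.
// Table/column names are hypothetical; adjust to your schema.
const mysql = require('mysql2/promise');

async function recordTweet(siteUserId, tweetId, text) {
  const db = await mysql.createConnection({
    host: 'localhost', user: 'app', password: 'secret', database: 'mysite'
  });
  // Assumed table: tweet_log(id, site_user_id, tweet_id, text, created_at)
  await db.execute(
    'INSERT INTO tweet_log (site_user_id, tweet_id, text, created_at) VALUES (?, ?, ?, NOW())',
    [siteUserId, tweetId, text]
  );
  await db.end();
}
```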

Related

Creating individual user profiles

I am working on a project and one of the key components is creating customized user profiles. I already have a schema design for the user data that will generate said profile. But I am lost on how the technology works.
I am mostly front-end, so this has been sort of overwhelming. The goal is to allow multiple user profiles to be created, and so far I have only seen that this can be achieved via Node.js or PHP. I have not found any guides.
I am not sure if I am asking the right questions.
Any help is appreciated. Thank you.
Since you mention you already have a schema for the user table, I assume you are going to design your own database and a back-end Node.js API to handle user profiles. You may also want to build authentication functionality in the future. If you are not familiar with Node.js yet, I recommend starting with https://www.tutorialspoint.com/nodejs/index.htm. It's a good tutorial for beginners.
The whole purpose of a back-end Node.js API is to expose a number of services at specified routes. When an HTTP request is made to a particular path, the API takes the parameters and executes some script. In your case, the script will do something with the database containing the user profile data, for example add a row to your user table. That operation is equivalent to creating a new user. The API then sends a response back to the front-end.
Keep in mind that maintaining user profile data is no different from maintaining any other data. You should be able to pick it up with a couple of days of practice if you know JavaScript. But if you have to build authentication functionality, you will need a few more pieces.
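As a concrete illustration of the route-plus-database flow described above, here is a minimal sketch assuming Express and MySQL; the users table and column names are hypothetical:

```javascript
// Minimal sketch: one route that creates a user profile row, then responds
// to the front-end. Assumes Express and the mysql2 driver; the users table
// and its columns are hypothetical.
const express = require('express');
const mysql = require('mysql2/promise');

const app = express();
app.use(express.json());

const pool = mysql.createPool({
  host: 'localhost', user: 'app', password: 'secret', database: 'mysite'
});

// POST /users with a JSON body creates one user profile.
app.post('/users', async (req, res) => {
  const { username, displayName, bio } = req.body;
  try {
    const [result] = await pool.execute(
      'INSERT INTO users (username, display_name, bio) VALUES (?, ?, ?)',
      [username, displayName, bio]
    );
    res.status(201).json({ id: result.insertId });   // respond to the front-end
  } catch (err) {
    res.status(500).json({ error: err.message });
  }
});

app.listen(3000);
```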

Store data to be accessed on multiple devices without a server

Is it possible to create a client-side only app, with no server backend, that stores data in a way that one user can see things stored by another user on the app?
To give some context, I am trying to create a cross-platform phone application, preferably using HTML, that will allow users to log their hours in a punch-in, punch-out style and then have those hours become viewable by a supervisor. However, I will not have any server capacity to store the data.
I'm sure this is possible, perhaps using something like Google Spreadsheets or something similar to store the data, but I am at a loss as to how I would do this. Any help would be appreciated.
The short answer is "No."
However, you can use a service such as Firebase to host your data for you.
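A very rough sketch of that idea, assuming the Firebase Realtime Database web SDK (the config values are placeholders and the data layout is hypothetical):

```javascript
// Rough sketch: the client writes punch records directly to a hosted
// Realtime Database, and a supervisor's client reads them back; no server
// of your own is needed. Config values are placeholders.
import { initializeApp } from 'firebase/app';
import { getDatabase, ref, push, onValue } from 'firebase/database';

const app = initializeApp({ databaseURL: 'https://your-project.firebaseio.com' /* ...other config... */ });
const db = getDatabase(app);

// Worker punches in: append a record under their id.
function punchIn(workerId) {
  push(ref(db, `hours/${workerId}`), { type: 'in', at: Date.now() });
}

// Supervisor view: listen for a worker's records and render them.
function watchHours(workerId, render) {
  onValue(ref(db, `hours/${workerId}`), snapshot => render(snapshot.val()));
}
```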

How do I incorporate Node.js/Passport into my website?

I'm new to webdev and I'm trying to use passport for registration/authentication on a site I'm setting up. I'm also going to write an application in node later on that will be using some of the user data (users will need to provide an API key for an account on another site that I will use to pull data into the application).
At the moment, the main issue I'm having is figuring out what goes where. I've found plenty of resources that explain how to create an app using passport, but nothing shows how it would be incorporated into your website or where the files should be in relation to your website. I'm relatively new to Node.js, and while I've written a few small applications I have never hosted them anywhere.
Bonus question: I'm using MongoDB with passport and I was also planning to use it to store some JSON my application will be receiving from API calls. However, I wanted to use MySQL to store some data as well. More specifically, I'm planning to save the raw JSON then I'll create a relational database out of the data I need from the JSON and then keep the rest in MongoDB for easy access. Is this common/smart, or should I focus on keeping everything in my MongoDB? I'm relatively new to NoSQL.
Thanks in advance for any help.
I would reference this tutorial; I just recently used it to help myself with a new application. There is also an example of the same thing, but in SQL, here. I'm not sure what you mean by "where the files should be in relation to your website"; the information related to authentication should go in your database.
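To give a feel for "what goes where", here is a stripped-down sketch of how the Passport pieces typically sit inside an Express app. The linked tutorials are the real guide; findUser, findUserById, and checkPassword are hypothetical helpers backed by your database:

```javascript
// Stripped-down sketch of Passport (local strategy) wired into Express.
// findUser/findUserById/checkPassword are hypothetical database helpers.
const express = require('express');
const session = require('express-session');
const passport = require('passport');
const LocalStrategy = require('passport-local').Strategy;

passport.use(new LocalStrategy(async (username, password, done) => {
  const user = await findUser(username);                       // look up user in your DB
  if (!user || !(await checkPassword(user, password))) return done(null, false);
  return done(null, user);
}));
passport.serializeUser((user, done) => done(null, user.id));
passport.deserializeUser(async (id, done) => done(null, await findUserById(id)));

const app = express();
app.use(express.urlencoded({ extended: false }));
app.use(session({ secret: 'change-me', resave: false, saveUninitialized: false }));
app.use(passport.initialize());
app.use(passport.session());

// The login form posts here; Passport handles the credential check.
app.post('/login', passport.authenticate('local', {
  successRedirect: '/dashboard',
  failureRedirect: '/login'
}));
```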
To your "bonus question" you can use two databases. The key here is to ask yourself why and what are the true needs for data, and how is this data accessed and used. From ground up I would like one and stick with it. If at some point later you realize a certain type of data would be better in a different database then you can add it.
Side note: look into an IDE such as WebStorm to help you out.

To add another database or not to add another database, that is the question

One of my sites is a social networking site running on MySQL. I use postal code and country information to geolocate users using a webservice. This webservice also allows you to download all their many tables of information so that you can access it locally. My site has gotten big enough that I wish to do this now.
My question is, should I create a new database on my site for all of this postal code and country information and all its tables, or should I incorporate those tables into my existing database for my social networking site?
What are the pros/cons either way?
If you're thinking about scaling and want to know about other kinds of databases like NoSQL, you might find this article interesting: http://highscalability.com/blog/2010/12/6/what-the-heck-are-you-actually-using-nosql-for.html
I'd vote in favor of a separate database if you planned to use the data as read-only and put a web service in front of it to access it. Users would search it based on a small handful of parameters (e.g. address info to get lat/lon data).
I'd say put it in the existing database if you planned to JOIN it with other information in your current schema.
It will all live on the same disk, most likely, so disk space is not an issue. If you query the tables in a completely separate manner, there is no impact on the existing site; if you query things together, it is easier when everything is in one database. Overall, administering one database is easier than administering two. I think it's a no-brainer: they go in one DB.
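To illustrate the "query things together" point, a geolocation lookup becomes a plain JOIN once the imported tables sit next to your own (sketch only; table and column names are hypothetical):

```javascript
// Sketch: users joined against the imported postal-code data in one database.
// Table and column names are hypothetical.
const mysql = require('mysql2/promise');

async function usersNear(postalCode) {
  const db = await mysql.createConnection({
    host: 'localhost', user: 'app', password: 'secret', database: 'social'
  });
  const [rows] = await db.execute(
    `SELECT u.id, u.username, p.latitude, p.longitude
       FROM users u
       JOIN postal_codes p ON p.code = u.postal_code AND p.country = u.country
      WHERE p.code = ?`,
    [postalCode]
  );
  await db.end();
  return rows;
}
```

Note that even if you did put the imported tables in a separate database on the same MySQL server, you could still join across them by qualifying the table name (e.g. geodata.postal_codes), so the decision is mostly about administration and backups.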

Tracking data access

Backstory
I work for a company that has an online site that allows users to text personal information in for collection. We collect the data and make it available online. Users can choose to share the data with other users.
Going Forward
At some point, this may become classified as an FDA-governed medical tool. In anticipation, we'd like to have a logging system in place that records each time someone accesses our users' data, whether it is the user themselves, another authorized user, or a support person.
Current Architecture
We are currently running Ruby/Rails, and using a MySQL database. The personal information is encrypted in the database.
Data Access for Support
Today, support personnel can access data one of three ways:
Admin site - The admin site is limited to whatever screens we develop. While we don't currently log access, we could easily add logging to keep an audit trail of who accessed which data using the admin tool.
SQL client - I use MySQL Workbench to access production. However, when connected this way, all personal information (user name, cell number, etc.) is encrypted.
Ruby/Rails console - Finally, support can log into one of the production boxes and use the Rails console from the command line. Ruby will decrypt the data, so we can do simple things such as
u = User.find_all_by_state('active')
and it will return the record set of all users with state='active', with their personal information decrypted in the result set.
Holy Grail
logging
easy access for support
I'd love to have a way to allow easy support access to the data (once authenticated), but one that logs everything that is accessed (read or updated). That way, if I'm checking out my buddy's ex-wife's data, for example, it gets logged somewhere I can't get in and clean up the audit trail. (See Google firing a Gmail employee for an example of employees breaching data policies.)
Anyone have ideas, thoughts, experiences, suggestions with this issue?
Hey devguy. This was an issue for me a couple of months back. We ended up centralizing our MySQL queries so that we could start to track all information coming in and out. Unfortunately the class I wrote is in PHP, but the idea behind it could make it very easy to start logging.
https://code.google.com/p/php-centralized-mysql-controller/
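The linked class is PHP, but the centralization idea translates directly to other stacks: route every data access through one function and log who ran what. A rough Node.js sketch (the query_audit table and its columns are hypothetical):

```javascript
// Sketch of a centralized query wrapper: every query goes through runQuery,
// which writes an audit row first. The query_audit table is hypothetical.
const mysql = require('mysql2/promise');

const pool = mysql.createPool({
  host: 'localhost', user: 'app', password: 'secret', database: 'myapp'
});

async function runQuery(supportUser, sql, params = []) {
  // Write the audit record first so it exists even if the query then fails.
  await pool.execute(
    'INSERT INTO query_audit (support_user, query_text, params, run_at) VALUES (?, ?, ?, NOW())',
    [supportUser, sql, JSON.stringify(params)]
  );
  return pool.execute(sql, params);   // all data access funnels through here
}

// Usage: support tooling calls runQuery instead of touching the pool directly, e.g.
// runQuery('alice', 'SELECT * FROM users WHERE state = ?', ['active']);
```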
Try stored procedures. Make all code use stored procedures for CRUD activities. This defines an API that your developers can use while business rules are globally enforced (don't return entire SSN values, only the last 4 digits, etc.).
This serves as the basis for an external API as well.
If you want logging/auditing, you put it in the procedure.
This protects you from everyone except the DBAs.
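As a sketch of what that could look like in MySQL (driven from Node.js here only to keep the examples in one language; procedure, table, and column names are hypothetical):

```javascript
// Sketch: a stored procedure that both writes the audit row and enforces the
// "last 4 digits only" rule, then a call to it. Names are hypothetical.
const mysql = require('mysql2/promise');

const createProc = `
CREATE PROCEDURE get_user_for_support(IN p_user_id INT, IN p_support_user VARCHAR(64))
BEGIN
  -- Auditing lives inside the procedure, so every caller is logged.
  INSERT INTO data_access_log (user_id, accessed_by, accessed_at)
  VALUES (p_user_id, p_support_user, NOW());

  -- Business rule enforced globally: never return the full SSN.
  SELECT id, name, CONCAT('***-**-', RIGHT(ssn, 4)) AS ssn_last4
  FROM users
  WHERE id = p_user_id;
END`;

async function main() {
  const db = await mysql.createConnection({
    host: 'localhost', user: 'app', password: 'secret', database: 'myapp'
  });
  await db.query(createProc);                                            // one-time setup
  const [results] = await db.query('CALL get_user_for_support(?, ?)', [42, 'support_bob']);
  console.log(results[0]);                                               // masked row set
  await db.end();
}
main();
```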