I have created an AS3 application that captures the mic or web cam and saves the stream to a Red5 server.
I am also changing the file name by passing an argument to the publish() function.
How can I check whether another file with the same name already exists on the server?
I want to maintain a file name sequence on the server...
Example: if I send the file name "xyz.flv" through publish() and xyz.flv is already on the server, then it should take the next name in the sequence, like "xyz-1.flv" or "xyz-2.flv", and so on.
Why don't you use a timestamp for the naming?
Do something like:
var today_date:Date = new Date();
var date_str:String = today_date.getDate() + "_" + (today_date.getMonth() + 1) + "_" + today_date.getFullYear() + "_" + today_date.getTime();
newRecording = new Recording(); // create a record instance
newRecording.filename = "rec_" + date_str; // name the file according to the timestamp
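If you do need the sequential naming the question asks for ("xyz-1.flv", "xyz-2.flv", ...), the existence check has to happen where the files live, i.e. in the server-side application. Here is a minimal sketch of the sequencing logic in Python (Red5 applications are Java, so this is illustrative only; the directory path is an assumption):

```python
import os

def next_free_name(directory, base, ext=".flv"):
    """Return base.flv if unused; otherwise base-1.flv, base-2.flv, ..."""
    candidate = base + ext
    n = 1
    while os.path.exists(os.path.join(directory, candidate)):
        candidate = "%s-%d%s" % (base, n, ext)
        n += 1
    return candidate
```

The same loop translates directly to Java inside a Red5 application adapter, checking the stream storage directory before accepting the publish name.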
I have recorded a series of 5 HTTP requests in a thread group (say TG). The response value of one request has to be sent as a parameter in the next request, and so on until the last request is made.
To send the parameter in the first request, I have created a CSV file with unique values (say 1,2,3,4,5).
Now I want this TG to run for all the values read from the CSV file (in the above case, TG should run for value 1, then value 2, and so on through 5).
How do I do this?
Given your CSV file looks like:
1
2
3
4
5
In the Thread Group set Loop Count to "Forever"
Add CSV Data Set Config element under the Thread Group and configure it as follows:
Filename: if the file is in JMeter's bin folder, the file name only; if in another location, the full path to the CSV file
Variable Names: anything meaningful, e.g. parameter
Recycle on EOF - false
Stop thread on EOF - true
Sharing mode - according to your scenario
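Put together, the CSV Data Set Config would look something like this (the variable name parameter is the example from above; the file name is hypothetical):

```
CSV Data Set Config
  Filename:            data.csv   (full path if not in JMeter's bin folder)
  Variable Names:      parameter
  Recycle on EOF?:     False
  Stop thread on EOF?: True
  Sharing mode:        All threads
```

Each sampler can then reference the current value as ${parameter}.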
See Using CSV DATA SET CONFIG guide for more detailed explanation.
Another option is using the __CSVRead() function.
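For reference, the __CSVRead() function goes inline in a sampler field rather than being configured as a test element; the usage looks like this (file name hypothetical):

```
${__CSVRead(data.csv,0)}      read column 0 of the current row
${__CSVRead(data.csv,next)}   advance to the next row
```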
This method of creating an individual request for each record will not scale to many records. There is a more scalable solution here - Jmeter multiple executions for each record from CSV
In SSIS, I have created a Flat File Connection Manager (FFCM).
When creating it, I selected the file name using the Browse button.
But I want to pass this value using a variable, so I created a variable with a value,
and in the FFCM property window I set an Expression on the ConnectionString property with another file path.
But when I run the package, the FFCM does not use the variable value; it uses the file name I selected when creating it.
Thanks in advance.
I have a small piece of code to send a CSV file to a BigQuery table.
The CSV file is on a local HD (not on Google Cloud Storage).
Here is a simplified version of the C# code (using version 2.1.5.0.122 of the BigQuery C# API).
BigqueryService bq = someMethodToGetABigqueryServiceInstance();
JobConfigurationLoad loadJobCfg = new JobConfigurationLoad();
loadJobCfg.SourceFormat = "CSV";
// ...
Job job = new Job();
JobConfiguration config = new JobConfiguration();
config.Load = loadJobCfg;
job.Configuration = config;
FileStream fs = new FileStream(@"c:\temp\onecol.csv", FileMode.Open, FileAccess.Read, FileShare.Read);
JobsResource.InsertMediaUpload insert = bq.Jobs.Insert(job, projectId, fs, "application/octet-stream");
var progress = insert.Upload();
// wait for Google.Apis.Upload.UploadStatus.Completed
// ...
The problem is that when I receive the Completed status, the file has been uploaded, but the data is not in the target table yet (i.e. the job is still running).
Normally I should be able to wait for the job to be finished ('DONE') if I have its reference (or ID), but I can't find a way to get that reference in the C# API.
Is it possible to get the job reference from a JobsResource.InsertMediaUpload?
(insert.Body.JobReference is null)
Or is there another way to upload a local file to a BigQuery table?
We recommend for all load jobs that you pass your own job reference -- that way in the event of a network hiccup, you'll be able to tell the state of the job. The job reference must be unique within the project, but this is pretty easy to do by creating a random number or using the current time.
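The advice above is language-neutral: generate a job ID that is unique within the project, set it on the job before starting the load, and then poll that ID until the job reports DONE. A minimal sketch of the ID generation in Python (the prefix is illustrative):

```python
import time
import uuid

def make_job_id(prefix="load"):
    """Combine the current time with a random suffix so the job ID
    is effectively unique within the project."""
    return "%s_%d_%s" % (prefix, int(time.time()), uuid.uuid4().hex[:8])
```

In the C# client, the same idea should translate to assigning a JobReference (ProjectId plus JobId) on the Job before calling Jobs.Insert, after which Jobs.Get(projectId, jobId) can be polled for status "DONE".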
Visual Web Developer, Entity data source model. It creates the new database fine. Example: it
creates SAMPLE1.MDF and SAMPLE1.LDF.
When I run my app, it creates another file, SAMPLE1_LOG.LDF.
When I run CreateDatabase, is there a place I can specify _LOG.ldf for the log file? SQL 2008 R2.
It messes up when I run the DeleteDatabase functions... 2 log files...
Why does it not create the file SAMPLE1_Log.ldf to start with, if that is what it is looking for?
Thank you for your time,
Frank
// database or initial catalog produce same results...
// strip the .mdf off of newfile and see what happens?
// nope. this did not do anything... still not create the ldf file correctly!!!
// sample1.mdf, sample1.ldf... but when run, it creates sample1_log.LDF...
newfile = newfile.Substring(0, newfile.Length - 4);
String mfile = "Initial Catalog=" + newfile + ";data source=";
String connectionString = FT_EntityDataSource.ConnectionManager.GetConnectionString().Replace("data source=", mfile);
// String mexclude = @"attachdbfilename=" + "|" + "DataDirectory" + "|" + @"\" + newfile + ";";
// nope. must have attach to create the file in the App_Data, otherwise it goes to documents & settings, etc sqlexpress.
// connectionString = connectionString.Replace(mexclude, "");
Labeldebug2.Text = connectionString;
using (FTMAIN_DataEntities1 context = new FTMAIN_DataEntities1(connectionString))
{
    // try
    // {
    if (context.DatabaseExists())
    {
        Buttoncreatedb.Enabled = false;
        box.Checked = true;
        boxcreatedate.Text = DateTime.Now.ToString();
        Session["zusermdf"] = Session["zusermdfsave"];
        return;
        // Make sure the database instance is closed.
        // context.DeleteDatabase();
        // i have entire diff section for deletedatabase.. not here.
    }
    // View the database creation script.
    // Labeldebug.Text = Labeldebug.Text + " script ==> " + context.CreateDatabaseScript().ToString().Trim();
    // Console.WriteLine(context.CreateDatabaseScript());
    // Create the new database instance based on the storage (SSDL) section
    // of the .edmx file.
    context.CreateDatabaseScript();
    context.CreateDatabase();
}
I took out all the try/catch so I can see anything that might happen...
==========================================================================
Rough code while working out the kinks..
The connection string it creates:
metadata=res://*/FT_EDS1.csdl|res://*/FT_EDS1.ssdl|res://*/FT_EDS1.msl;provider=System.Data.SqlClient;provider connection string="Initial Catalog=data_bac100;data source=.\SQLEXPRESS;attachdbfilename=|DataDirectory|\data_bac100.mdf;integrated security=True;user instance=True;multipleactiveresultsets=True;App=EntityFramework"
in this example, the file to create is "data_bac100.mdf".
It creates the data_bac100.mdf and data_bac100.ldf
when I actually use this file and its tables at run time, it auto-creates data_bac100_log.LDF
1) I was trying just not to create the .ldf, so that when the system runs, it creates only the single one off the bat...
2) the Initial Catalog and/or Database keywords are ONLY added to the connection string to run CreateDatabase(); the regular connection strings created in web.config only have the attachdbfilename stuff, and work fine.
I have one connection string for unlimited databases, with the main database in web.config. I use an initialize section based on the user roles (visitor, member, admin, anonymous, or not authenticated), which sets the database correctly with an expression builder and a function that parses the connection string with the correct values for the database to operate on. This all runs fine.
The Entity Framework automatically generates the script. I have tried with and without the .mdf extension; it makes no difference... I thought maybe there is a setting somewhere that holds naming conventions for .ldf files...
Eventually all of this will be for naught when I start trying to deploy somewhere that doesn't use the App_Data folder anyway...
Here is an example of the connection string created when running the application:
metadata=res://*/FT_EDS1.csdl|res://*/FT_EDS1.ssdl|res://*/FT_EDS1.msl;provider=System.Data.SqlClient;provider connection string="data source=.\SQLEXPRESS;attachdbfilename=|DataDirectory|\TDSLLC_Data.mdf;integrated security=True;user instance=True;multipleactiveresultsets=True;App=EntityFramework"
in this case, it uses the TDSLLC_Data.mdf file...
04/01/2012... followup...
Entity Framework feature: Log files created by the ObjectContext.CreateDatabase method
Change: When the CreateDatabase method is called either directly or by using Code First with the SqlClient provider and an AttachDBFilename value in the connection string, it creates a log file named filename_log.ldf instead of filename.ldf (where filename is the name of the file specified by the AttachDBFilename value).
Impact: This change improves debugging by providing a log file named according to SQL Server specifications. It should have no unexpected side effects.
http://msdn.microsoft.com/en-us/library/hh367887(v=vs.110).aspx
I am on Windows XP with .NET 4 (not .NET 4.5)... I will hunt some more, but it looks like an issue that cannot be changed.
4/1/2012, 4:30...
OK, more hunting and searching explains some of the inconsistencies I have experienced with CreateDatabase and DatabaseExists... .NET 4.5 is supposed to add the _log.ldf, and not just .ldf files, so they must have addressed this for some reason...
I found others with the same issues, but a different server...
MySQL has a connector for EF4, the current version is 6.3.5 and its main functionalities are working fine but it still has issues with a few methods, e.g.
•System.Data.Objects.ObjectContext.CreateDatabase()
•System.Data.Objects.ObjectContext.DatabaseExists()
which makes it difficult to fully use the model-first approach. It's possible by manually editing the MySQL script (available with the CreateDatabaseScript method). The MySQL team doesn't seem eager to solve those bugs, I'm not sure what the commitment level actually is from their part but it certainly is lower than it once was.
That being said, the same methods fail with SQL CE too (they are not implemented, and I don't see the MS team as likely to tackle that soon).
I ran out of space below... it just becomes a problem when you create a database and it does not create the _log.ldf file, just the .ldf file; then you use the database and it creates a _log.ldf file... now you have 2 ldf files, and one becomes invalid. Then when you are done with the database, you delete it, then try to create a new one, and an .ldf already exists, so it will not work...
It turns out this is just the way it is with EF4; they changed it in the EF4.5 beta to create the _log.ldf file, to match what is created when the database is used.
Thanks for your time.
I've never used this "mdf attachment" feature myself and I don't know much about it, but according to the xcopy deployment documentation, you should not create a log file yourself because it will be automatically created when you attach the mdf. The docs also mention naming and say that the new log filename ends in _log.ldf. In other words, this behaviour appears to be by design and you can't change it.
Perhaps a more important question is, why do you care what the log file is called? Does it actually cause any problems for your application? If so, you should give details of that problem and see if someone has a solution.
I am new to SSIS and I'm working to a critical deadline; it would be great if someone could help me with this.
I have a shared folder at server location //share/source/files where one file is loaded on a daily basis and one more file is loaded on a monthly basis; both files have the same extension, .csv.
Can anyone help me with moving the files, say A.csv and B.csv, to the corresponding tables? More importantly, the file name on day 1 will be A 2011-09-10.csv and on day 2 the source file will be A 2011-09-11.csv. These files have to be loaded into table A, and file B.csv has to be loaded into the corresponding destination table, table B. After loading, each file has to be moved to an archive folder, and we also need to notify users that table A was loaded with 1000 rows successfully, and similarly that the table B load was successful, along with the date and time.
Note: source files are automatically updated in the folder every day at 5 AM.
First, create a variable that will hold the file path name.
Secondly, create a script task that checks to see if the file is available.
The script task will be as follows:
public void Main()
{
    // Expected file name pattern from the question, e.g. "A 2011-09-11.csv".
    string FileName = String.Format("A {0}.csv", DateTime.Now.ToString("yyyy-MM-dd"));
    if (System.IO.File.Exists(@"\\Shared\Path\" + FileName))
    {
        Dts.Variables["FileName"].Value = FileName;
        Dts.TaskResult = (int)ScriptResults.Success;
    }
    else
    {
        Dts.TaskResult = (int)ScriptResults.Failure;
    }
}
After the script task, create a Data Flow Task.
Create a connection in the Connection Manager at the bottom of the page which will point to the flat file. In the Flat File Connection Manager Editor popup, set the File name to a file you'd like to upload (this will be updated dynamically, so its actual value isn't relevant). In the properties of the new connection, open the Expressions popup. Select the ConnectionString property, and set the expression to the path plus the FileName variable: "\\\\Shared\\Path\\" + @[User::FileName].
Create a Flat File Source, and use the connection we just created as the Connection for the flat file.
Create a destination data flow item, and point it to the database you'd like to insert data into.
From here, create a SQL Server job that runs at the time you'd like it to run. This job should execute the package you have just created.