SAS: viewer results are the same as output - HTML

I am writing because I have a problem with my results viewer in SAS 9.4.
When the results load, they look the same as in the Output window: instead of a table with cells, I get a table drawn with -+| characters. Layout options do not work, and when exporting to Excel, for example, each line ends up in a single cell (with | where it should change cells).
Is there something I can do?
As it is, I am losing a lot of the options SAS offers.
(I already checked Tools > Options > Preferences > Results and ticked "Create HTML".)
Thank you in advance for your help!

I'm guessing you're copying from HTML into Excel? Here's an example of how you'd put results into Excel directly instead.
ods excel file='/home/fkhurshed/Demo/demo.xlsx';

proc means data=sashelp.class;
run;

/* sashelp.class has SEX and AGE (there is no GENDER variable) */
proc freq data=sashelp.class;
    tables sex * age / norow nocol nopercent;
run;

ods excel close;
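If the root cause is that only the plain-text LISTING destination is open (that is what draws tables with -+| characters), you can also switch destinations in code instead of through the preferences dialog. A minimal sketch, assuming a default SAS 9.4 windowing session:
/* close the plain-text LISTING destination and route results to HTML */
ods listing close;
ods html;

proc print data=sashelp.class;
run;

ods html close;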


How to type big table in LibreOffice Writer?

Here is what I use.
OS: Linux Mint 18
Editor: LibreOffice Writer 5.1.6.2
Situation
Consider the following foo.csv file (just an example; the real data contains hundreds of lines):
A,B,C
1,2,3
To create a table in Writer with the data from foo.csv, one usually creates the table via the toolbar and then types in the contents (possibly using TAB to navigate between cells).
Here is the result of the procedure above:
Goal: Since the whole foo.csv contains hundreds of lines, how should one proceed?
1st try: copying and pasting the data from foo.csv into the table does not work, as seen below.
2nd try: copying and pasting the data from foo.csv into the table with all cells selected does not work either, as seen below.
Question: Is it possible to edit an odt file in some way, writing some code (as we could do with tags in HTML), to produce such a table?
Embedding a Calc spreadsheet is not acceptable.
Just use the "Text to Table" feature:
Insert the csv as "plain text" into your writer document (not into a table, just anywhere else);
Select the inserted lines;
Select Menu "Table" -> "Convert" -> "Text to Table";
Adjust the conversion properties as needed (to set the separator to comma, select "Other" and enter a comma in the box to the right):
Hit OK - LO Writer will convert the text content of your CSV into a nice Writer table.
Please note that with this solution there is no "connection" between the Writer table and the csv data: changing the csv won't affect the Writer table. That would be possible only by embedding an object (but this won't result in a Writer table...).
If the csv data is the only content of the odt (Writer) file, there's another option: use LibreOffice Base to create a LO database from the csv file (dynamically updated when the csv changes), and use the Report feature to get tabular output of the csv data. LO Base will store the output layout as a report, making it easy to produce an up-to-date report.

SAS Export dataset as csv or excel preserving line break

I am trying to export my dataset from SAS to Excel in either csv or xls format; however, when I do this, the columns with line breaks mess up my Excel file. Is there a way to export a SAS dataset to Excel while preserving line breaks? I also need to display labels instead of column names, and the dataset is fairly large, approximately 150,000 rows.
Here is what I did:
proc export data=Final_w_label
outfile='work/ExtractExcel.csv'
dbms=csv label replace;
run; quit;
Thank you in advance.
See the bottom of the post for sample data.
One effective way to create an export that Excel will open easily and display embedded newlines is to use XML.
libname xmlout xmlv2 'c:\temp\want.xml';
data xmlout.want;
set have;
run;
libname xmlout;
In Excel (365), do File > Open, select the want.xml file, and then choose "As an XML table" in the secondary Open XML dialog that is raised.
Other ways
There are other ways to move SAS data into a form that Excel can parse. Proc EXPORT will create a text file with embedded carriage returns in the character variables (which Excel uses for in-cell newlines):
proc export dbms=csv data=have label replace file='c:\temp\want.csv';
run;
The problem with this export is that Excel will not import the data properly using its wizards. There might be a VBS solution for reading the export, but that is probably more trouble than it is worth.
Another form of export is dbms=excel, which creates .xlsx files:
proc export dbms=excel data=have label replace file='c:\temp\want.xlsx';
run;
This export can be opened by Excel and the columns will all be correct. However, in the initial view, cells with embedded carriage returns will not appear to contain newlines. Examining a cell in F2 edit mode will show that the embedded newlines are there, and pressing Enter (to accept the edit) will cause the cell to display them. You don't want to have to F2 every cell to make it render as expected.
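Not mentioned in the original answer, but another route worth sketching is the ODS EXCEL destination, which writes a native .xlsx file directly and can print variable labels as column headers; whether the embedded carriage returns render as in-cell line breaks may still depend on destination options and your Excel version. A minimal sketch (the path and sheet name are illustrative assumptions):
/* write a native .xlsx file via the ODS EXCEL destination */
ods excel file='c:\temp\want_ods.xlsx' options(sheet_name='have');

/* LABEL prints variable labels as column headers */
proc print data=have label noobs;
run;

ods excel close;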
Sample Data
data have (label="Lines within stanza are separated by newline character");
  attrib
    id               length=8    label='Identity'
    name             length=$50  label='Poem name'
    auth             length=$50  label='Author'
    stanza1-stanza20 length=$250
  ;
  array stz stanza:;

  id + 1;
  section = 1;

  infile cards eof=last;

  /* read the poem line by line until the '--' terminator */
  do while (1=1);
    linenum + 1;
    input;
    select;
      when (_infile_ = '--') leave;          /* end of poem            */
      when (linenum = 1) name = _infile_;    /* first line: poem name  */
      when (linenum = 2) auth = _infile_;    /* second line: author    */
      when (_infile_ = '') section + 1;      /* blank line: new stanza */
      /* append the line to the current stanza, separated by a carriage return */
      otherwise stz(section) = catx('0d'x, stz(section), _infile_);
    end;
  end;

last:
  output;
datalines4;
Trees
Joyce Kilmer
I think that I shall never see
A poem lovely as a tree.
A tree whose hungry mouth is prest
Against the earth’s sweet flowing breast;
A tree that looks at God all day,
And lifts her leafy arms to pray;
A tree that may in Summer wear
A nest of robins in her hair;
Upon whose bosom snow has lain;
Who intimately lives with rain.
Poems are made by fools like me,
But only God can make a tree.
--
;;;;
run;

Paste CSV or Tab-Delimited data to excel with NO formatting

I'm pasting tab-delimited data from Notepad++ into Excel (about 50k rows and 3 columns). No matter how many different ways I try it, Excel wants to merge everything from a cell containing one " up to the next instance of " into a single cell.
For Example, if my data looked like this:
"Apple 1.0 Store
Banana 1.3 Store
"Cherry" 2.5 Garden
Watermelon 4.0 Field
The Excel file looks like this:
Apple1.0StoreBanana1.3Store
Cherry 2.5GardenWatermelon4.0Field
One way to get around this is to open the file as a CSV in Excel; however, this leads to Excel reformatting the number values into simplified ones using its "General" format. So the data would look like the following:
"Apple 1 Store
Banana 1.3 Store
"Cherry" 2.5 Garden
Watermelon 4 Field
The data I'm getting comes from SQL Server Studio, so my options for file formats are:
.CSV
.Txt (Tab-delimited)
Copy/pasting from query results
The solution I'm looking for is to have the data represented in Excel with no Excel processing applied to the quotation marks, numbers, or any other cell contents.
Don't open the file directly in Excel. Instead, import it and control the data types and file layout.
Open a new Excel document:
Select the Data menu:
Select From Text in the Get External Data section.
Select the file to import.
On step 1 of the import wizard, select Delimited.
Click Next.
Select the Tab checkbox and change the text qualifier to {none}.
Click Next.
Set the column data types to General, Text, Text.
Click Finish.
Excel auto-imports the data as best it can when you open a file directly. You lose flexibility and control when that happens; it is better to import and control it yourself to get the fine adjustments you're looking for.
You end up with something like this:
By treating the numbers as text, the zeros don't get messed up.
By setting the text qualifier to none, the quotes don't get messed up.
Have you tried opening it via Text Import?
Go to the Data tab > From Text (third from the left by default).
You will get a window similar to Text to Columns.
Select the correct delimiter, remember to remove the quote sign from Text Qualifier, and mark all columns as text to avoid Excel autoformatting.
Step 1:
Step 2:
Step 3:
EXCEL TIP: TIME SAVING WHEN IMPORTING CSV FILES INTO EXCEL: If you pre-set your Text to Columns delimiter parameters correctly in Excel (e.g. specify tabs as the delimiter) and then copy and paste the CSV data, Excel will import the pasted CSV directly into the correct columns without you having to go through the Text to Columns rigmarole. This was particularly time-saving when I had to import hundreds of bank statements into Excel.
However, if your Text to Columns delimiters are pre-specified incorrectly as, say, comma and you are importing tab-delimited files, then Excel will dump all the data into one column, and you will have to go through the time-consuming process of converting Text to Columns for each statement.
Excel looks at the existing Text to Columns delimiters to see if it can use them to make your life easier when pasting data.
Hope that tip helps (it saved me several hours).

How to Get Data from CSV File and Send them to Excel Using Pentaho?

I have a tabular csv file that has seven columns and contains the following data:
ID,Gender,PatientPrefix,PatientFirstName,PatientLastName,PatientSuffix,PatientPrefName
2 ,M ,Mr ,Lawrence ,Harry , ,Larry
I am new to Pentaho and I want to design a transformation that moves the data (the values of the 7 columns) to an empty Excel sheet. The Excel sheet has different column names, but should carry the same data, as shown:
prefix_name,first_name,middle_name,last_name,maiden_name,suffix_name,Gender,ID
I tried to design a transformation using the following series of steps, but it gives me errors at the end that I could not interpret.
What is the proper design to move the data from the csv file to the excel sheet in this case? Any ideas to solve this problem?
As #Brian.D.Myers mentioned in the comment, you can use the Select values step. Here is a step-by-step explanation of how to do it.
Select all the fields in the CSV file input step.
Configure the Select values step as follows.
In the Content tab of the Excel writer step, click the Get fields button and fill in the fields. Alternatively, you can use the Excel output step as well.

Exporting BIRT report into CSV

I am using BIRT for reporting in my project.
The report shows the correct value for the amount (a string) as 123456789123, but when I try to export the same report to csv, the csv file shows the same amount as 1.234E11.
I want the value to be 123456789123 in the csv too.
Please help.
Thanks.
I imagine this is probably an issue with viewing it in Excel. Exporting data does not export format codes. Open the csv in Notepad and you will see the correct data. If you export the report to Excel, you can set a custom format code like #####0 in the Format Number property in the properties editor for the data item.
If the number is too large, Excel will display the value like that. If you widen the column and set it to Number under Format Cells, it will display correctly. You will need to save it as an Excel workbook.
We just had this problem with our BIRT reporting tool. When we opened the exported file in a text editor, the number was formatted in scientific notation. This was a bug in one of BIRT's custom formatters.
We had to look at org.eclipse.birt.report.engine.dataextraction.impl.CommonDataExtractionImpl and change the line
valueFormatters[i] = new NumberFormatter( patterns[i], this.locale );
to
String pattern = patterns[i] == null ? "Unformatted" : patterns[i];
valueFormatters[i] = new NumberFormatter( pattern, this.locale );
Setting the pattern to "Unformatted" made the default format stay as a plain integer rather than scientific notation (via a decimal formatter).