How can I insert a table (with rows and columns) in Word using a SikuliX command? - sikuli

How can I insert a table (with rows and columns) in Word using a SikuliX script? I can get as far as opening Insert Table, but I am not able to select the number of rows and columns. Neither taking a screenshot nor inserting an image works. Any help is really appreciated. Thanks
I always get stuck with:
[error] script [ Existing ] stopped with error in line 7
[error] FindFailed ( 1623931201601.png: (26x42) in R[0,0 1536x864]#S(0) ) though the image exists
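When find() keeps throwing FindFailed on small UI elements like the row/column grid picker, a common SikuliX workaround is to skip image matching entirely and drive Word by keyboard: Alt+N opens the Insert ribbon tab, T the Table menu, I the "Insert Table..." dialog, where the sizes can be typed. A minimal sketch of that keystroke plan, assuming the English Windows ribbon shortcuts (all key sequences here are assumptions to verify against your Word version); in a real SikuliX script each entry would be sent with type():

```python
def table_keystroke_plan(rows, cols):
    """Build the keystroke sequence for Word's Insert Table dialog.

    Assumed English-UI shortcuts: Alt+N (Insert tab), T (Table menu),
    I (Insert Table dialog). The dialog asks for columns first, then rows.
    """
    plan = ["ALT+n", "t", "i"]          # open the Insert Table dialog
    plan += [str(cols), "TAB",          # number of columns, move to next field
             str(rows), "ENTER"]        # number of rows, confirm
    return plan

print(table_keystroke_plan(3, 4))
# → ['ALT+n', 't', 'i', '4', 'TAB', '3', 'ENTER']
```

Because no screenshot matching is involved, this avoids the FindFailed entirely; the trade-off is that the shortcuts depend on the Word version and UI language.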

Related

Remove repeated lines from a text file in order for phpMyAdmin to successfully insert SQL statements into database

I have a large text file full of INSERT SQL statements that I want to run against a MySQL database through phpMyAdmin. The problem I am having is that many of the INSERT statements in this file are identical, resulting in a “Duplicate key” error.
Is there a way to make phpMyAdmin ignore the repeated SQL statements? I have tried running the file through a .vbs script that removes duplicate lines, but it failed to deliver.
The approaches I am considering so far are:
Run the file through a script that removes duplicate lines.
Find a solution in which phpMyAdmin ignores repeated lines.
Has anyone got any other ideas or suggestions on how I could solve this problem?
The easy way is to use an INSERT IGNORE statement, but then you will not know which records were duplicates.
Another way: create a new table, e.g. 'table2', with no primary key or unique key, insert all the data into it, then INSERT IGNORE into your main table, and compare to see which rows are duplicates. Alternatively, you can use the COUNT() function with GROUP BY to find the duplicate rows.
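The dedupe-script idea from the question amounts to only a few lines. A minimal sketch in Python (the file paths in the usage comment are placeholders) that keeps the first occurrence of each line and preserves order:

```python
def dedupe_lines(lines):
    """Keep the first occurrence of each line, preserving the original order."""
    seen = set()
    result = []
    for line in lines:
        if line not in seen:
            seen.add(line)
            result.append(line)
    return result

# Usage on a dump file (paths are placeholders):
# with open("dump.sql") as f:
#     unique = dedupe_lines(f)
# with open("dump_unique.sql", "w") as f:
#     f.writelines(unique)
```

Note this only works if each INSERT statement sits on exactly one line; multi-line statements would need a statement-aware split first.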

Insert consecutive IDs between other IDs

I have the following table :
id x y z
1 z
3
6 x
7 zy
....
10000
I need to add IDs in between the other IDs that are already there, without deleting the existing data. I can't seem to find any solution; I tried all sorts of things but ended up creating blank rows.
I'm fairly new to SQL altogether.
Using the information you provided in the comment...
I've got a backup that I need to import into the current db. It is made with UPDATE-only queries, so I need to create the rows first so that I can import them.
... and the fact that you are using MySQL, I think there is a simple solution for your problem.
Create a copy of your backup file (to have the original in case it doesn't work as expected), open it in a text editor and replace UPDATE <table_name> with INSERT INTO <table_name> (put the actual name of your table instead of <table_name>).
If some of the rows you want to import already exist in the table, you have the following options to resolve the conflicts:
use INSERT IGNORE INTO <table_name> as the replacement string to ignore the rows from the backup (the rows already existing in the table remain unmodified); technically, IGNORE doesn't ignore the rows you want to insert: it attempts to insert them, fails because they already exist, but treats the failures as warnings (they would normally be errors);
use REPLACE INTO <table_name> as the replacement string to replace the existing rows with the data from the backup; technically, REPLACE does a DELETE followed by an INSERT, so it is not the best solution if the rows you want to insert are not complete.
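The IGNORE-versus-REPLACE behaviour described above can be illustrated with a runnable sketch. This uses Python's sqlite3 module, whose INSERT OR IGNORE and INSERT OR REPLACE are close analogs of MySQL's INSERT IGNORE and REPLACE (the exact syntax differs between the two engines):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO t VALUES (1, 'old')")

# IGNORE: the conflicting insert is skipped, the existing row is kept.
conn.execute("INSERT OR IGNORE INTO t VALUES (1, 'new')")
print(conn.execute("SELECT name FROM t WHERE id = 1").fetchone())  # → ('old',)

# REPLACE: the existing row is deleted and re-inserted with the new data.
conn.execute("INSERT OR REPLACE INTO t VALUES (1, 'new')")
print(conn.execute("SELECT name FROM t WHERE id = 1").fetchone())  # → ('new',)
```

The DELETE-then-INSERT nature of REPLACE is exactly why it is risky when the incoming rows don't carry values for every column: the unspecified columns revert to their defaults rather than keeping the old values.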

Some weird values that I can't insert into my psql table

I downloaded a data file crawled from an auto information website.
Basically it looks like this:
After I import it using \i in psql, I get this error:
Invalid command \',. Try \? for help.
I tried the CREATE TABLE part without inserting any rows, and it creates the table successfully.
Then I manually inserted some rows from this file; some of them work, others don't.
So I guess there are some weird values among the data I am inserting.
Some weird values like '前● / 后●' can be inserted, but then I found this one:
'[u\'#70706E\', u\'#854C38\']'
When I insert this, it gives me the same error message.
Could someone tell me what this value means, why it can't be inserted, and how the crawler produced it when scraping the data from the website?
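Two things are going on here. First, the value itself is just the Python 2 repr() of a list of unicode colour strings, written out verbatim by the crawler. Second, psql's "Invalid command \'" error points at the backslash-escaped quotes (\') in the dump: by default psql treats a backslash as the start of a meta-command, while standard SQL escapes a single quote inside a literal by doubling it. A small sketch of the standard-conforming quoting (the helper name is my own):

```python
# The odd value is a Python 2 list repr written out by the crawler,
# e.g. repr([u'#70706E', u'#854C38']) - just text, not a database type.
value = "[u'#70706E', u'#854C38']"

def sql_quote(s):
    """Return s as a standard SQL string literal: embedded quotes doubled."""
    return "'" + s.replace("'", "''") + "'"

print(sql_quote(value))
# → '[u''#70706E'', u''#854C38'']'
```

Rewriting the dump's \' escapes into doubled quotes like this (or generating the dump with proper quoting in the first place) should let \i load the file cleanly.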

Update MySQL table after merging rows

After executing the following code:
SELECT
    industry,
    company,
    GROUP_CONCAT(symbol SEPARATOR ',') AS symbol
FROM
    table1
GROUP BY
    SOUNDEX(company);
I managed to merge some rows; however, the table isn't updating itself with the merged rows and cells.
I then tried using UPDATE / SET / WHERE, but I'm not quite sure what to SET and WHERE, as I need to update every merged row/cell. The "best" error that I get is Operand should contain 1 column(s).
I'm now quite stuck here; any hint at how to update the table would be much appreciated.
EDITED
Some more details:
I'm writing PHP code that feeds data from XML into the database, then looks for duplicate entries and merges them like this:
Here is a snapshot of my DB table:
When I run the code above through PHP, there are no changes to the DB table.
But when the code is executed directly in phpMyAdmin's SQL tab, the following changes can be seen:
But when I click back to "Browse", all the data stays as it was.
This leads me to think that the table is not being updated after the merge...
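The key point is that a SELECT ... GROUP BY never modifies the table it reads from; it only merges rows in its result set. To keep the merged rows you have to write them somewhere, for example into a new table. A runnable sketch using Python's sqlite3 (whose group_concat() mirrors MySQL's GROUP_CONCAT; sqlite lacks SOUNDEX, so this groups on the plain company name, and the sample data is invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE table1 (industry TEXT, company TEXT, symbol TEXT)")
conn.executemany("INSERT INTO table1 VALUES (?, ?, ?)",
                 [("tech", "Acme", "ACM"), ("tech", "Acme", "ACME")])

# The SELECT merges rows only in its result set...
rows = conn.execute("""SELECT industry, company,
                              group_concat(symbol, ',') AS symbol
                       FROM table1 GROUP BY company""").fetchall()
print(rows)
print(conn.execute("SELECT COUNT(*) FROM table1").fetchone())  # → (2,) unchanged

# ...so materialize the result into a new table to persist the merge.
conn.execute("""CREATE TABLE merged AS
                SELECT industry, company,
                       group_concat(symbol, ',') AS symbol
                FROM table1 GROUP BY company""")
print(conn.execute("SELECT COUNT(*) FROM merged").fetchone())  # → (1,)
```

In MySQL the equivalent materialization would be CREATE TABLE merged AS SELECT ..., after which you can drop or rename table1; running the SELECT through PHP or phpMyAdmin alone will never change the stored data.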

Unable to do an INSERT INTO using a SELECT where the SELECT is from `information_schema`.`COLUMNS`

This one really stumps me. I'm trying to create a MySQL query that I can populate information into a columns table to then export into SQLite. I can populate just about anything I want, except when the SELECT statement comes from information_schema.COLUMNS.
Here's a simple example to test. I created a table called TestTbl in a schema called tblinfoetc and gave it 2 columns. The first is an AUTO_INCREMENT INT id column, and the second is a VARCHAR(100) column called Namedb. I then run the following query.
INSERT INTO `tblinfoetc`.`testtbl` (Namedb)
SELECT
`COLUMNS`.`TABLE_SCHEMA` AS `Namedb`
FROM
`information_schema`.`COLUMNS`;
My message after doing this is:
Affected rows: 0
Time: 0.000s
If I run this without the INSERT INTO line, it returns the records without a hitch. Any ideas?
This is more of a workaround suggestion than a real solution:
Selecting from information_schema.COLUMNS on its own works fine, so try doing the operation in two steps: first fetch the rows with a client, then insert them into the destination table from the client side.
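The two-step workaround can be sketched with any client library. Here it is in Python with sqlite3 standing in for a MySQL driver (in the real case you would read information_schema.COLUMNS through something like mysql-connector-python; sqlite's sqlite_master plays the role of the metadata table in this illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE testtbl (id INTEGER PRIMARY KEY, Namedb TEXT)")

# Step 1: run the SELECT on its own and fetch the rows into the client.
rows = conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'").fetchall()

# Step 2: insert the fetched rows from the client side.
conn.executemany("INSERT INTO testtbl (Namedb) VALUES (?)", rows)
print(conn.execute("SELECT Namedb FROM testtbl").fetchall())  # → [('testtbl',)]
```

Splitting the statement this way sidesteps whatever prevents the server from combining INSERT ... SELECT with the metadata table, at the cost of one round trip through the client.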
I have encountered the same problem on a Windows machine running MySQL 5.6.13. The query used to work with MySQL 5.5. It might be related to this other issue.